
Reliability in Content Analysis
The main reliability concern in content analysis research is intercoder reliability, which The Content Analysis Guidebook defines as the amount of agreement or correspondence among two or more coders. Reliability is paramount in content analysis: it establishes the objectivity of the codebook and allows confident interpretation of results. Chapter 7 of The Content Analysis Guidebook discusses a variety of intercoder reliability coefficients and their formulas, programs that help calculate them, procedures for handling multiple coders, and the treatment of variables that do not reach acceptable levels of reliability. The purpose of this section of The Content Analysis Guidebook Online is to provide additional reliability resources to content analysis researchers and students.
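To make the idea of agreement among coders concrete, here is a minimal Python sketch of the simplest reliability index, percent agreement, for two coders who have judged the same set of units. The variable names and sample codes are invented for illustration.

    # Two coders' judgments of the same six units (illustrative data).
    coder_a = ["pos", "neg", "pos", "neu", "pos", "neg"]
    coder_b = ["pos", "neg", "neu", "neu", "pos", "pos"]

    # Percent agreement: the share of units on which the coders agree.
    agreements = sum(a == b for a, b in zip(coder_a, coder_b))
    percent_agreement = agreements / len(coder_a)
    print(f"Percent agreement: {percent_agreement:.2f}")  # 4 of 6 = 0.67

Percent agreement is intuitive but does not correct for agreement that would occur by chance; the chance-corrected coefficients discussed in Chapter 7 address that limitation.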
Programs that Calculate Reliability Coefficients
PRAM – PRAM is a computer program developed to simplify the calculation of intercoder reliability coefficients for two or more coders. It may be downloaded for free for academic use. PRAM requires input data in a specifically formatted Excel spreadsheet, and its output can be viewed immediately onscreen or saved as an .xls file. PRAM calculates percent agreement, Scott’s pi, Cohen’s kappa, Spearman’s rho, the Pearson correlation coefficient (r), and Lin’s concordance correlation coefficient (rc); an updated version also calculates Fleiss’ adaptation of Cohen’s kappa for multiple coders and Krippendorff’s alpha. (A short sketch of two of these chance-corrected coefficients follows the program descriptions below.)
ReCal – ReCal (Reliability Calculator) is an online utility that computes intercoder/interrater reliability coefficients for nominal content analysis data. It processes multiple variables simultaneously if there are only two coders, and a single variable at a time if there are three or more coders. It is compatible with Excel, SPSS, Stata, OpenOffice, Google Docs, and any other database, spreadsheet, or statistical application that can export CSV files (a minimal CSV-export sketch also appears below). ReCal was developed by Deen Freelon as a doctoral student in the Department of Communication at the University of Washington. (He is currently Professor at the Annenberg School for Communication, University of Pennsylvania.)
AgreeStat – “An Excel-based application for performing advanced statistical analysis of the extent of agreement among multiple raters. You may compute Chance-corrected Agreement Coefficients (CAC) as well as Intraclass Correlation Coefficients (ICC).”
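To show how two of the chance-corrected coefficients PRAM reports relate to one another, here is a hedged Python sketch of Scott’s pi and Cohen’s kappa for two coders and a nominal variable. Both take the form (Po − Pe) / (1 − Pe); they differ only in how the expected agreement Pe is estimated. The example data are invented.

    from collections import Counter

    coder_a = ["pos", "neg", "pos", "neu", "pos", "neg"]
    coder_b = ["pos", "neg", "neu", "neu", "pos", "pos"]
    n = len(coder_a)

    # Observed agreement: the proportion of units coded identically.
    po = sum(a == b for a, b in zip(coder_a, coder_b)) / n

    # Scott's pi estimates expected agreement from the pooled
    # distribution of codes across both coders.
    pooled = Counter(coder_a) + Counter(coder_b)
    pe_pi = sum((count / (2 * n)) ** 2 for count in pooled.values())
    scotts_pi = (po - pe_pi) / (1 - pe_pi)

    # Cohen's kappa estimates expected agreement from each coder's
    # own marginal distribution.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    pe_kappa = sum((freq_a[c] / n) * (freq_b[c] / n) for c in freq_a | freq_b)
    cohens_kappa = (po - pe_kappa) / (1 - pe_kappa)

    print(f"Scott's pi:    {scotts_pi:.3f}")
    print(f"Cohen's kappa: {cohens_kappa:.3f}")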
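Because ReCal takes CSV input, a small export step is often all that is needed. The snippet below writes two coders’ judgments to a CSV file; the one-column-per-coder, one-row-per-unit layout and the numeric category codes are assumptions made for illustration, so consult ReCal’s documentation for the exact format it expects.

    import csv

    # Nominal codes for six units (e.g., 1 = pos, 2 = neg, 3 = neu).
    coder_a = [1, 2, 1, 3, 1, 2]
    coder_b = [1, 2, 3, 3, 1, 1]

    # One row per coded unit, one column per coder (assumed layout).
    with open("reliability_data.csv", "w", newline="") as f:
        writer = csv.writer(f)
        for a, b in zip(coder_a, coder_b):
            writer.writerow([a, b])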
Links to other Reliability Resources
Computing Krippendorff’s alpha reliability (see KALPHA) by Andrew F. Hayes and Klaus Krippendorff. (A minimal sketch of the alpha computation for the simplest case appears after this list.)
Gwet’s Inter-Rater Reliability Repository by Kilem Gwet. Gwet provides free access to his published and unpublished articles, downloadable computer programs, and excerpts from his current and forthcoming books.
Practical Resources for Assessing and Reporting Intercoder Reliability in Content Analysis Research Projects by Matthew Lombard, Jennifer Snyder-Duch, and Cheryl Campanella Bracken.
SPSS macro for computing Krippendorff’s alpha by Matthew Lombard.
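For readers curious about what the KALPHA macro linked above computes, the following Python sketch implements Krippendorff’s alpha for the simplest case only: a nominal variable, two coders, and no missing data. Alpha is 1 − Do/De, observed over expected disagreement, computed here from a coincidence matrix; the macro itself handles more coders, missing values, and other levels of measurement. The example data are invented.

    from collections import Counter
    from itertools import permutations

    # Each unit holds the two coders' values for that unit.
    units = [("pos", "pos"), ("neg", "neg"), ("pos", "neu"),
             ("neu", "neu"), ("pos", "pos"), ("neg", "pos")]

    # Coincidence matrix: every ordered pair of values within a unit,
    # weighted by 1 / (m - 1), where m is the number of values per unit.
    coincidences = Counter()
    for values in units:
        for pair in permutations(values, 2):
            coincidences[pair] += 1 / (len(values) - 1)

    n = sum(coincidences.values())  # total pairable values
    marginals = Counter()
    for (c, _), count in coincidences.items():
        marginals[c] += count

    # Observed and expected disagreement for nominal data.
    do = sum(count for (c, k), count in coincidences.items() if c != k) / n
    de = sum(marginals[c] * marginals[k]
             for c, k in permutations(marginals, 2)) / (n * (n - 1))

    alpha = 1 - do / de
    print(f"Krippendorff's alpha: {alpha:.3f}")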