Measuring and reporting intercoder reliability in plan quality evaluation research

Title: Measuring and reporting intercoder reliability in plan quality evaluation research
Publication Type: Journal Article
Year of Publication: 2014
Authors: Stevens, M, Lyles, W, Berke, PR
Journal: Journal of Planning Education and Research
Volume: 34
Pagination: 77–93
Keywords: content analysis, intercoder reliability, Krippendorff’s alpha, percent agreement, plan evaluation, plan quality
Abstract:

Plan quality evaluation researchers typically evaluate plans in relation to whether they contain certain desirable features. Best practice dictates that plans be evaluated by at least two readers and that researchers report a measure of the extent to which the readers agree on whether the plans contain the desirable features. Established practice for assessing this agreement has been subject to criticism. We summarize this criticism, discuss an alternative approach to assessing agreement, and offer recommendations that plan quality evaluation researchers can follow to improve the quality of their data and the manner in which they assess and report that quality.

DOI: 10.1177/0739456X13513614
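
As a rough illustration of the two agreement measures named in the keywords (this sketch is not from the article, and the codings are hypothetical), the snippet below computes simple percent agreement and Krippendorff's alpha for two coders assigning a binary code — whether a plan contains a desirable feature — to ten plans, with no missing values:

```python
from collections import Counter

# Hypothetical binary codings: 1 = plan contains the feature, 0 = it does not.
coder_a = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1]
coder_b = [1, 1, 0, 1, 1, 1, 1, 1, 0, 1]

def percent_agreement(a, b):
    """Share of units on which the two coders assign the same code."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def krippendorff_alpha_nominal(a, b):
    """Krippendorff's alpha for two coders, nominal data, no missing values."""
    units = list(zip(a, b))
    n = 2 * len(units)  # total pairable values
    coincidences = Counter()  # coincidence matrix: both ordered pairs per unit
    totals = Counter()        # marginal frequency of each code value
    for x, y in units:
        coincidences[(x, y)] += 1
        coincidences[(y, x)] += 1
        totals[x] += 1
        totals[y] += 1
    # Observed disagreement: mismatched pairs as a share of all pairs.
    d_o = sum(v for (c, k), v in coincidences.items() if c != k) / n
    # Expected disagreement under chance pairing of all recorded values.
    d_e = sum(totals[c] * totals[k]
              for c in totals for k in totals if c != k) / (n * (n - 1))
    return 1 - d_o / d_e
```

With these skewed hypothetical data the coders agree on 9 of 10 plans (percent agreement 0.90), yet alpha is noticeably lower (about 0.75) because chance agreement is high when one code dominates — the kind of discrepancy behind the criticism of percent agreement that the abstract refers to.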