
Table 1 Example of how Cohen’s kappa is used to assess concordance

From: Quantifying implementation strategy and dissemination channel preferences and experiences for pain management in primary care: a novel implementer-reported outcome

| Dissemination channel | Current | Preferred |
|---|---|---|
| Colleagues | 1 | 0 |
| Your own clinical experience | 1 | 0 |
| Patients | 0 | 0 |
| Professional organizations | 0 | 1 |
| Researchers | 0 | 1 |
| Clinical experts | 1 | 0 |
| Pharmaceutical representatives | 0 | 1 |
| Primary peer-reviewed literature (e.g., PubMed) | 0 | 1 |
| Online peer-reviewed clinical resources (e.g., UpToDate) | 0 | 0 |
| Email listserv | 1 | 0 |
| Practice briefs or practice guidelines | 1 | 0 |
| Annual conferences | 0 | 1 |
| Seminars at my clinic/institution (e.g., grand rounds, case conference) | 0 | 0 |
| Web-based continuing education modules | 1 | 0 |
| Workshops on specific interventions (e.g., CBT, yoga) | 1 | 1 |
| Mainstream media (e.g., NPR, CNN, Fox News) | 0 | 0 |
| Blogs (e.g., Tumblr, WordPress) | 1 | 0 |
| Social media (e.g., Facebook, Twitter, Reddit) | 1 | 1 |
| Podcasts | 1 | 0 |
| Other | 0 | 1 |
| None of these | 0 | 0 |
| Agreement, n (%) | 7 (33) | |
| Disagreement, n (%) | 14 (66) | |
| Cohen's kappa (κ) | −.33 | |
  1. Table 1 shows an example of how the data were structured to assess the degree of concordance, or agreement, between what a fictional provider currently experiences and what they would prefer to experience in an ideal scenario, with 1 indicating yes and 0 indicating no. Agreement is the number and percentage of response options for which the fictional respondent gave the same answer for both current and preferred. Disagreement is the number and percentage of response options for which the respondent's answers differed between current and preferred. Cohen's kappa (κ) was calculated to describe the level of actual concordance for the fictional provider.
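
For readers who want to reproduce the statistics in Table 1, the sketch below recomputes percent agreement and Cohen's kappa, κ = (p_o − p_e)/(1 − p_e), from the fictional provider's binary responses. This is not the authors' code; it assumes the standard two-rater formulation of kappa, and the exact result may differ slightly from the rounded value reported in the table depending on rounding and marginal-probability conventions.

```python
# Minimal sketch (not the authors' code): recomputing the Table 1
# statistics from the fictional provider's binary responses.
# Order follows the table rows; 1 = endorsed, 0 = not endorsed.

current   = [1, 1, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 1, 0, 1, 1, 1, 0, 0]
preferred = [0, 0, 0, 1, 1, 0, 1, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 1, 0]

n = len(current)
agree = sum(c == p for c, p in zip(current, preferred))
p_o = agree / n  # observed agreement: 7/21, about 0.33

# Expected chance agreement from the marginal "yes" rates of each column
p_cur, p_pref = sum(current) / n, sum(preferred) / n
p_e = p_cur * p_pref + (1 - p_cur) * (1 - p_pref)

kappa = (p_o - p_e) / (1 - p_e)  # Cohen's kappa
print(f"agreement: {agree}/{n} ({p_o:.0%}); kappa = {kappa:.2f}")
```

Equivalently, `sklearn.metrics.cohen_kappa_score(current, preferred)` yields the same value. A negative kappa, as here, indicates that agreement between current and preferred channels is below what would be expected by chance alone.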