
Understanding Data Entry and Reliability

Feb 21, 2025

Lecture Notes: Data Entry and Inter-Rater Reliability

Introduction

  • Focus on data entry and calculating inter-rater reliability.
  • Types of Data
    • Categorical: Divided into nominal and ordinal.
    • Continuous (numerical): Divided into discrete and continuous.

Types of Data

Categorical Data

  • Nominal Data
    • Categories are named (e.g., gender, race).
    • No intrinsic order; numbers assigned are just labels.
  • Ordinal Data
    • Categories have a natural order (e.g., letter grades, education level).
    • Order matters; not interchangeable.

Continuous (Numerical) Data

  • Discrete Data
    • Whole numbers only (e.g., number of students, days in a week).
  • Continuous Data
    • Can include decimals/fractions (e.g., temperature, height).

Data Entry

  • Use Excel: each column represents a variable, each row a subject/participant.
  • Put the variable name in the first row; replace spaces with underscores.
  • Example: Gender as 1 (female), 0 (male).
  • Ensure variable names are a single word and avoid special characters.
  • Translate observation sheet data to Excel.
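The layout described above (variables in columns, one subject per row, single-word variable names in the first row) can be sketched outside Excel as a plain CSV. The participants and values below are invented for illustration only:

```python
# Sketch of the data-entry layout: one column per variable, one row per
# subject, variable names (single words, underscores for spaces) in row 1.
# All participant data below is made up for illustration.
import csv
import io

rows = [
    {"participant_id": 1, "gender": 1, "obs_minutes": 120},  # gender: 1 = female
    {"participant_id": 2, "gender": 0, "obs_minutes": 115},  # gender: 0 = male
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["participant_id", "gender", "obs_minutes"])
writer.writeheader()    # variable names in the first row
writer.writerows(rows)  # each subsequent row is one subject
print(buf.getvalue())
```

A file saved this way opens directly in Excel or SPSS with the coding scheme intact.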

Calculating Inter-Rater Reliability

  • Methods
    • Cohen’s Kappa: Two raters, categorical data.
    • Intraclass Correlation Coefficient (ICC): Two or more raters, continuous data.
    • Fleiss’ Kappa: Three or more raters, categorical data.

Cohen’s Kappa

  • Assumptions: Two raters, categorical data, independent ratings.
  • Categorical data must have the same number of categories.
  • Interpretation of scores:
    • 0 (agreement no better than chance) to 1 (perfect agreement); negative values indicate less-than-chance agreement.
    • Agreement categories: poor, fair, moderate, good, very good.
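The notes compute kappa in SPSS; as a cross-check, the underlying formula κ = (p_o − p_e)/(1 − p_e) is small enough to sketch directly. The two rating lists in the demo call are made up:

```python
# Minimal sketch of Cohen's kappa for two raters (pure Python).
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e)."""
    n = len(rater1)
    # Observed agreement: share of subjects both raters coded identically.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement: product of each rater's marginal proportions,
    # summed over the shared set of categories.
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum((c1[cat] / n) * (c2[cat] / n) for cat in set(rater1) | set(rater2))
    return (p_o - p_e) / (1 - p_e)

# Made-up ratings: 1 = behavior observed, 0 = not observed.
print(cohens_kappa([1, 1, 0, 1, 0, 0, 1, 0, 1, 1],
                   [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]))
```

Both lists must use the same category codes, matching the equal-categories assumption noted above.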

Fleiss’ Kappa

  • For three or more raters with categorical data.
  • Requires SPSS for calculation.
  • Interpreted similarly to Cohen’s Kappa.
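SPSS is the tool named above, but Fleiss' kappa can also be computed by hand from a subjects-by-categories table of rating counts. The 4-subject, 3-rater table below is invented:

```python
# Minimal sketch of Fleiss' kappa for three or more raters.
def fleiss_kappa(counts):
    """counts[i][j] = number of raters who put subject i in category j;
    every row must sum to the same number of raters n."""
    N = len(counts)     # subjects
    n = sum(counts[0])  # raters per subject
    k = len(counts[0])  # categories
    # Mean per-subject agreement P_bar.
    P_bar = sum((sum(c * c for c in row) - n) / (n * (n - 1))
                for row in counts) / N
    # Chance agreement P_e from overall category proportions.
    p = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
    P_e = sum(pj * pj for pj in p)
    return (P_bar - P_e) / (1 - P_e)

# Invented example: 4 subjects, 3 raters, 2 categories.
print(fleiss_kappa([[3, 0], [0, 3], [3, 0], [2, 1]]))
```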

Intraclass Correlation Coefficient (ICC)

  • For continuous data, two or more raters.
  • Measures consistency of ratings.
  • Interpretation:
    • Below 0.50: poor; 0.50–0.74: moderate; 0.75–0.86: good; above 0.86: excellent.
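The SPSS demonstration below uses the absolute-agreement option; assuming the single-measures variant ICC(2,1) is what is intended, it can be sketched from a two-way ANOVA decomposition. The rating matrix in the demo is invented:

```python
# Sketch of ICC(2,1): two-way random effects, absolute agreement,
# single measures. Assumes this is the variant meant by "absolute agreement".
def icc2_1(data):
    """data[i][j] = rating of subject i by rater j (continuous)."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
    # Mean squares for subjects (rows), raters (columns), and error.
    msr = k * sum((r - grand) ** 2 for r in row_means) / (n - 1)
    msc = n * sum((c - grand) ** 2 for c in col_means) / (k - 1)
    sse = sum((data[i][j] - row_means[i] - col_means[j] + grand) ** 2
              for i in range(n) for j in range(k))
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Invented ratings: 3 subjects, 2 raters.
print(icc2_1([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]))
```

The rater term k(MSC − MSE)/n in the denominator is what makes this an absolute-agreement coefficient: a rater who is consistently higher than the other lowers the ICC.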

SPSS Demonstrations

  • Cohen’s Kappa: Analyze -> Descriptive Statistics -> Crosstabs (select Kappa under Statistics).
  • Fleiss' Kappa: Analyze -> Scale -> Reliability Analysis.
  • ICC: Analyze -> Scale -> Reliability Analysis; choose absolute agreement.

Data Reporting

  • Use APA style for reporting inter-rater reliability results.
  • Include values for kappa/ICC, confidence intervals, and significance levels.
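One APA convention worth noting when reporting these values: statistics that cannot exceed 1 (kappa, ICC, p) are written without a leading zero. A small formatting helper, with placeholder numbers rather than real results:

```python
# Hedged sketch: APA-style formatting for bounded statistics.
def apa(x):
    """Format a non-negative statistic bounded by 1 without the leading
    zero, per APA style (e.g., 0.72 -> .72)."""
    return f"{x:.2f}".lstrip("0")

# Placeholder values for illustration only, not real results.
kappa, ci_low, ci_high = 0.72, 0.55, 0.89
report = f"κ = {apa(kappa)}, 95% CI [{apa(ci_low)}, {apa(ci_high)}], p < .001"
print(report)
```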

Practical Aspects

  • Use Excel for data entry initially.
  • Ensure proper formatting and entry of categorical and continuous data.
  • Practice SPSS operations for calculating reliability.

Reminders

  • Formal data collection begins this week and next.
  • Complete 120 minutes of observation.
  • Quiz #1 this week (25 points, open book).
  • Bring questions to the lab for further discussion.