ICC Calculator: Calculate Intraclass Correlation Coefficient (SPSS)



A tool to calculate ICC from ANOVA table values, often used after analysis in software like SPSS.



Calculator inputs:

  • Mean Square Between Subjects (MSB): also known as Mean Square for Rows (MSR); found in your ANOVA output. Must be a positive number.
  • Mean Square Within Subjects (MSW): also known as Mean Square for Error (MSE); found in your ANOVA output. Must be a positive number.
  • Number of Raters (k): the number of raters, judges, or repeated measurements for each subject. Must be an integer greater than 1.


[The calculator displays a visual representation of the ICC value here.]

What is the Intraclass Correlation Coefficient (ICC)?

The Intraclass Correlation Coefficient (ICC) is a descriptive statistic used to measure the reliability or consistency of quantitative measurements made by different observers (raters) on the same subjects. In essence, it describes how strongly units within the same group resemble each other. Unlike the Pearson correlation, which assesses the relationship between two different variables, the ICC assesses the agreement on the same variable across different raters or time points.

It’s widely used in clinical research, psychology, and other fields to determine inter-rater reliability (the degree of agreement among raters), test-retest reliability (consistency of a measure over time), and intrarater reliability (consistency of one rater over multiple ratings). The value of an ICC generally ranges from 0 to 1, where 0 indicates no reliability and 1 indicates perfect reliability.

ICC Formula and Explanation

This calculator computes the ICC for a one-way random-effects model, often denoted as ICC(1,1). This model is appropriate when each subject is rated by a different, random set of k raters. The formula uses values from a one-way Analysis of Variance (ANOVA) table. The general formula can be expressed as: ICC = (between-group variability) / (between-group variability + within-group variability).

The specific formula used here is:

ICC(1,1) = (MSB – MSW) / (MSB + (k – 1) * MSW)

Variables used in the ICC formula:

  • MSB (Mean Square Between subjects): the variance between the different subjects being measured. Carries the squared units of the measurement; any positive number.
  • MSW (Mean Square Within subjects): the variance within the ratings for the same subject, i.e., measurement error or rater disagreement. Carries the squared units of the measurement; any positive number.
  • k (number of raters or measurements per subject): an integer, 2 or greater.
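The formula translates directly into a few lines of code. A minimal sketch (the function name and input checks are ours, mirroring the calculator's validation rules):

```python
def icc_1_1(msb: float, msw: float, k: int) -> float:
    """One-way random-effects ICC for a single rating, ICC(1,1)."""
    if msb <= 0 or msw <= 0 or k < 2:
        raise ValueError("MSB and MSW must be positive; k must be an integer >= 2")
    return (msb - msw) / (msb + (k - 1) * msw)
```

For instance, `icc_1_1(150.5, 10.2, 3)` returns roughly 0.821.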

Practical Examples

Example 1: High Agreement

Imagine a study where 3 radiologists rate the size of tumors on a set of 20 MRI scans. After running an ANOVA in SPSS, they get the following values:

  • Inputs:
    • Mean Square Between (MSB): 150.5
    • Mean Square Within (MSW): 10.2
    • Number of Raters (k): 3
  • Calculation:
    • Numerator: 150.5 – 10.2 = 140.3
    • Denominator: 150.5 + (3 – 1) * 10.2 = 150.5 + 20.4 = 170.9
    • ICC = 140.3 / 170.9 = 0.821
  • Result: An ICC of 0.821 indicates “Good” reliability among the radiologists’ ratings.

Example 2: Poor Agreement

Consider a scenario where 4 judges at a talent show rate 50 contestants. The scores show a lot of disagreement. The ANOVA output is:

  • Inputs:
    • Mean Square Between (MSB): 45.0
    • Mean Square Within (MSW): 32.5
    • Number of Raters (k): 4
  • Calculation:
    • Numerator: 45.0 – 32.5 = 12.5
    • Denominator: 45.0 + (4 – 1) * 32.5 = 45.0 + 97.5 = 142.5
    • ICC = 12.5 / 142.5 = 0.088
  • Result: An ICC of 0.088 signifies “Poor” reliability, suggesting the judges lack a consistent standard for scoring. For more information on reliability analysis, see our guide on choosing a reliability statistic.
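Both worked examples reduce to one line of arithmetic each, so they are easy to double-check:

```python
# Example 1: 3 radiologists (MSB = 150.5, MSW = 10.2, k = 3)
print(round((150.5 - 10.2) / (150.5 + (3 - 1) * 10.2), 3))  # 0.821 -> "Good"

# Example 2: 4 talent-show judges (MSB = 45.0, MSW = 32.5, k = 4)
print(round((45.0 - 32.5) / (45.0 + (4 - 1) * 32.5), 3))    # 0.088 -> "Poor"
```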

How to Use This ICC Calculator with SPSS Output

This calculator is designed to be a bridge between your statistical software output (like SPSS) and a final ICC value. Here’s how to use it effectively:

  1. Run ANOVA in SPSS: First, perform a reliability analysis in SPSS by going to `Analyze > Scale > Reliability Analysis`. Move your rater/measurement variables into the ‘Items’ box.
  2. Open Statistics Options: In the `Statistics…` dialog box, check the box for “Intraclass correlation coefficient”. Ensure you select a “One-Way Random” model to match this calculator’s formula.
  3. Locate ANOVA Table: In the SPSS output, find the “Analysis of Variance” (ANOVA) source table. From this table, you will get the Mean Square values you need.
  4. Enter Values into Calculator:
    • Find the ‘Between People’ (or similar) row and enter its ‘Mean Square’ value into the Mean Square Between Subjects (MSB) field.
    • Find the ‘Within People’ or ‘Residual’ row and enter its ‘Mean Square’ value into the Mean Square Within Subjects (MSW) field.
  5. Enter Number of Raters: Input the number of raters or measurements you used (e.g., if you had 4 judges, enter ‘4’) into the k field.
  6. Interpret the Result: The calculator automatically computes the ICC value and provides a qualitative interpretation (Poor, Moderate, Good, Excellent) based on established guidelines. You can explore further with our advanced statistical tools.
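If your ratings are laid out as one row per subject and one column per rater, the two mean squares from step 4 can also be computed directly, without SPSS. A hypothetical sketch (the function name and data layout are our assumptions):

```python
def anova_mean_squares(ratings):
    """One-way ANOVA mean squares with subjects as groups.

    ratings: list of lists, one inner list of k ratings per subject.
    Returns (MSB, MSW).
    """
    n = len(ratings)            # number of subjects
    k = len(ratings[0])         # ratings per subject
    grand = sum(sum(row) for row in ratings) / (n * k)
    means = [sum(row) / k for row in ratings]
    ssb = k * sum((m - grand) ** 2 for m in means)          # between-subjects SS
    ssw = sum((x - m) ** 2                                  # within-subjects SS
              for row, m in zip(ratings, means) for x in row)
    return ssb / (n - 1), ssw / (n * (k - 1))

print(anova_mean_squares([[1, 2], [3, 4]]))  # (4.0, 0.5)
```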

Key Factors That Affect Intraclass Correlation Coefficient

The final ICC value is not just a number; it’s influenced by several characteristics of your data and study design. Understanding these can help you interpret your results more accurately.

  • Between-Subject Variability: If the subjects you are measuring are very similar to each other, there is low between-subject variability. This can mathematically lead to a lower ICC, even if raters are consistent. Conversely, a very heterogeneous group of subjects can inflate the ICC.
  • Rater Disagreement (Error): This is the core of what ICC measures. The higher the variability of scores *within* a single subject (high MSW), the lower the reliability and the lower the ICC will be. This reflects random error or systematic differences between raters.
  • Number of Raters (k): While this calculator uses ‘k’ for a single measure’s reliability (ICC(1,1)), reliability generally increases as you average the scores from more raters. An ICC for an average of ‘k’ measures (ICC(1,k)) will be higher than for a single measure.
  • Choice of ICC Model: This calculator uses a one-way random model. However, SPSS and other software offer two-way random or two-way mixed models (e.g., ICC(2,1), ICC(3,1)). Choosing the wrong model for your study design (e.g., using a one-way model when you should account for rater variance) will change the result. Our guide to interpreting statistical output can help clarify this.
  • Sample Size: A small sample of subjects may not provide a stable estimate of the true variance in the population, potentially leading to an ICC value that isn’t representative.
  • Outliers: Extreme or unusual ratings can significantly impact the mean squares calculated by the ANOVA, thereby distorting the final ICC value. It’s important to check your data for outliers before analysis.
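The single-measure vs. average-measure distinction above can be made concrete. The standard one-way average-measures formula is ICC(1,k) = (MSB − MSW) / MSB; the function names below are ours:

```python
def icc_single(msb, msw, k):
    """ICC(1,1): reliability of one individual rating."""
    return (msb - msw) / (msb + (k - 1) * msw)

def icc_average(msb, msw):
    """ICC(1,k): reliability of the mean of k ratings."""
    return (msb - msw) / msb

# Using the values from Example 1 (3 radiologists):
print(round(icc_single(150.5, 10.2, 3), 3))  # 0.821
print(round(icc_average(150.5, 10.2), 3))    # 0.932
```

Averaging over the three raters raises the reliability, as the text describes.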

Frequently Asked Questions about Calculating ICC Using SPSS

What is a good ICC value?

Generally accepted guidelines classify ICC values as follows: Below 0.50 is ‘Poor’, 0.50-0.75 is ‘Moderate’, 0.75-0.90 is ‘Good’, and above 0.90 is ‘Excellent’ reliability. However, context matters, and in some fields, a ‘Moderate’ ICC may be considered acceptable.
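A minimal helper encoding these bands (the function name is ours; the boundary values follow the guideline quoted above):

```python
def interpret_icc(icc: float) -> str:
    """Map an ICC value to the qualitative label used by this calculator."""
    if icc < 0.50:
        return "Poor"
    if icc < 0.75:
        return "Moderate"
    if icc < 0.90:
        return "Good"
    return "Excellent"

print(interpret_icc(0.821))  # Good
```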

Can the ICC be negative?

Yes, mathematically, the ICC can be negative if the within-group variance is greater than the between-group variance. A negative ICC should be interpreted as having no reliability (effectively zero), indicating that the measurements are essentially random noise.

What’s the difference between ICC and Pearson Correlation?

Pearson correlation measures the linear relationship between two different variables (e.g., height and weight). ICC measures the agreement or consistency of one variable measured multiple times (e.g., a patient’s blood pressure measured by three different nurses). Two raters could have a perfect Pearson correlation of 1 if one always scores exactly 5 points higher than the other, but their ICC would be less than 1 because their absolute values don’t agree.
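This distinction is easy to demonstrate with toy data (ours) in which rater B always scores exactly 5 points higher than rater A: the Pearson correlation is a perfect 1.0, while the one-way ICC is far lower. With this small constant-offset dataset the ICC is actually negative, because the within-subject variance exceeds the between-subject variance:

```python
import statistics as st

a = [1, 2, 3, 4, 5]        # rater A's scores
b = [x + 5 for x in a]     # rater B: always exactly 5 higher

# Pearson: the linear relationship between the two raters is perfect.
cov = sum((x - st.mean(a)) * (y - st.mean(b)) for x, y in zip(a, b))
pearson = cov / ((len(a) - 1) * st.stdev(a) * st.stdev(b))
print(round(pearson, 6))   # 1.0

# One-way ICC(1,1) from the same data: absolute agreement is poor.
ratings = [list(pair) for pair in zip(a, b)]    # one row per subject
n, k = len(ratings), 2
grand = sum(sum(r) for r in ratings) / (n * k)
means = [sum(r) / k for r in ratings]
msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
msw = sum((x - m) ** 2 for r, m in zip(ratings, means) for x in r) / (n * (k - 1))
icc = (msb - msw) / (msb + (k - 1) * msw)
print(round(icc, 3))       # -0.429
```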

How do I run this analysis directly in SPSS?

Go to `Analyze > Scale > Reliability Analysis…`. Add the columns representing your raters to the “Items” box. Click the “Statistics…” button and check the box for “Intraclass correlation coefficient”. Select the appropriate model (e.g., One-Way Random) and click Continue, then OK. Learn more in our SPSS tutorial series.

What are the different ICC models (e.g., two-way random)?

The models differ based on your assumptions. A ‘one-way’ model is used when each subject is rated by a different set of randomly selected raters. A ‘two-way’ model is used when the same set of raters evaluates all subjects. The effect can be ‘random’ (if raters are a random sample) or ‘mixed’ (if you are only interested in a specific, fixed set of raters).

Why is my ICC value so low?

A low ICC could be due to several reasons: high disagreement among raters, a lack of variability among the subjects you are rating (they are too similar), or simply a small number of subjects or raters in your study.

Where do I find MSB and MSW in my SPSS output?

In the ANOVA table generated by the Reliability Analysis procedure, MSB is the “Mean Square” value in the “Between-Subjects” (or “Between-People”) row. MSW is the “Mean Square” value in the “Within-Subjects” or “Residual” row.

Is this an “agreement” or “consistency” ICC?

The one-way random effects model, ICC(1,1), is a measure of absolute agreement because the within-subject variance component (MSW) includes both random error and systematic differences between raters.

© 2026 Statistical Calculators Inc. All rights reserved.


