Adjusted R-Squared Calculator Using SST and SSR

Determine the model’s goodness of fit, adjusted for the number of predictors, using SST and SSR values.
What is Adjusted R-Squared?

Adjusted R-Squared is a modified version of R-Squared (the coefficient of determination) that is used in regression analysis. While R-Squared measures the proportion of variance in the dependent variable that is explained by the independent variables, it has a key limitation: it always increases or stays the same when you add more predictors to the model, even if those predictors are not useful. This can be misleading.

Adjusted R-Squared solves this problem. It “adjusts” the R-Squared value by penalizing the model for including extra independent variables. If you add a useful variable, the Adjusted R-Squared will increase. If you add a variable that doesn’t meaningfully improve the model’s fit, the Adjusted R-Squared will decrease. This makes it a more reliable metric for comparing models with different numbers of predictors.

The Adjusted R-Squared Formula

To calculate Adjusted R-Squared, you first need the standard R-Squared. The formula for R-Squared using the Total Sum of Squares (SST) and the Sum of Squared Residuals (SSR, written SSE in some texts) is:

R² = 1 - (SSR / SST)

Once you have the R-Squared value, you can use the following formula to find the Adjusted R-Squared:

Adjusted R² = 1 - [(1 - R²) * (n - 1) / (n - p - 1)]
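The two formulas above can be combined into a small helper. The sketch below is plain Python; the function name `adjusted_r_squared` is illustrative, not from any particular library:

```python
def adjusted_r_squared(sst, ssr, n, p):
    """Return (R-squared, Adjusted R-squared) from SST and SSR.

    sst: Total Sum of Squares (> 0)
    ssr: Sum of Squared Residuals (0 <= ssr <= sst)
    n:   sample size (must exceed p + 1)
    p:   number of predictors
    """
    if sst <= 0 or ssr < 0 or ssr > sst:
        raise ValueError("require 0 <= SSR <= SST and SST > 0")
    if n <= p + 1:
        raise ValueError("require n > p + 1")
    r2 = 1 - ssr / sst
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)
    return r2, adj_r2

# e.g. SST = 2000, SSR = 400, n = 100, p = 4
r2, adj_r2 = adjusted_r_squared(2000, 400, 100, 4)
```

The guard clauses mirror the input constraints listed in the variables table: SSR can never exceed SST, and the denominator n - p - 1 must stay positive.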

Variables Explained

Variables used in the Adjusted R-Squared formula:

Variable | Meaning                                                                      | Unit     | Typical Range
SST      | Total Sum of Squares: the total variation in the data                        | Unitless | Positive number (> 0)
SSR      | Sum of Squared Residuals: the variation not explained by the model (error)   | Unitless | 0 to SST
n        | Sample Size: the total number of observations in your dataset                | Count    | Integer greater than p + 1
p        | Number of Predictors: the count of independent variables in your model       | Count    | Integer ≥ 0

Practical Examples

Let’s see how the Adjusted R-Squared calculation works in practice. We’ll explore two scenarios.

Example 1: A Well-Fitted Model

Suppose a research model has the following values:

  • SST: 2000
  • SSR: 400
  • Sample Size (n): 100
  • Number of Predictors (p): 4

First, we calculate R-Squared: 1 - (400 / 2000) = 0.80. Then, we find Adjusted R-Squared: 1 - [(1 - 0.80) * (100 - 1) / (100 - 4 - 1)] = 1 - [0.20 * 99 / 95] ≈ 0.7916. This shows a strong model fit.

Example 2: An Overfit Model

Now, let’s say a data scientist adds 10 more, largely irrelevant, predictors to the same model. The SSR might slightly decrease, but the number of predictors increases significantly.

  • SST: 2000
  • SSR: 380
  • Sample Size (n): 100
  • Number of Predictors (p): 14

R-Squared increases: 1 - (380 / 2000) = 0.81. However, Adjusted R-Squared is: 1 - [(1 - 0.81) * (100 - 1) / (100 - 14 - 1)] = 1 - [0.19 * 99 / 85] ≈ 0.7787. Notice that even though R-Squared went up, the Adjusted R-Squared went down, correctly signaling that the new model is worse due to the added complexity.
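Running both scenarios through the formula makes the penalty visible. This is a self-contained sketch; the helper name is illustrative:

```python
def adjusted_r2(sst, ssr, n, p):
    """Adjusted R-squared from SST, SSR, sample size n, and p predictors."""
    r2 = 1 - ssr / sst
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

base = adjusted_r2(2000, 400, 100, 4)     # Example 1: 4 predictors
bloat = adjusted_r2(2000, 380, 100, 14)   # Example 2: 10 extra predictors
# Plain R-squared rose from 0.80 to 0.81, yet the adjusted value falls:
print(round(base, 4), round(bloat, 4))  # 0.7916 0.7787
```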


How to Use This Adjusted R-Squared Calculator

  1. Enter Total Sum of Squares (SST): Input the total variability of your dependent variable. You can often find this in your statistical software’s output.
  2. Enter Sum of Squared Residuals (SSR): This is the sum of the squared errors of your model, also found in statistical outputs.
  3. Enter Sample Size (n): Provide the total number of observations used to build the model.
  4. Enter Number of Predictors (p): Input how many independent variables are in your regression model.
  5. Interpret the Results: The calculator instantly provides both the R-Squared and the more robust Adjusted R-Squared. A smaller gap between the two values generally indicates a more parsimonious model.
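If you have raw observations and model predictions rather than software output, SST and SSR for steps 1 and 2 can be computed directly. A minimal sketch (the function name `sums_of_squares` is hypothetical):

```python
def sums_of_squares(y, y_pred):
    """Return (SST, SSR) for observed values y and model predictions y_pred."""
    mean_y = sum(y) / len(y)
    sst = sum((yi - mean_y) ** 2 for yi in y)               # total variation
    ssr = sum((yi - fi) ** 2 for yi, fi in zip(y, y_pred))  # unexplained variation
    return sst, ssr

sst, ssr = sums_of_squares([1, 2, 3, 4], [1.1, 1.9, 3.2, 3.8])
print(sst, round(ssr, 6))  # 5.0 0.1
```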

Key Factors That Affect Adjusted R-Squared

  • Number of Predictors: This is the core of the adjustment. Adding predictors that don’t contribute significantly will lower the Adjusted R-Squared.
  • Goodness of Fit (R-Squared Value): A model that explains more variance (higher initial R-Squared) will have a higher starting point for its adjusted value.
  • Sample Size: With a very large sample size, the penalty for adding an extra predictor becomes smaller. In small datasets, the penalty is much more pronounced.
  • Relevance of Predictors: Including variables that are strongly correlated with the dependent variable will increase Adjusted R-Squared.
  • Model Complexity: The metric serves as a guard against overfitting by penalizing overly complex models.
  • Degrees of Freedom: The adjustment is based on the ratio of the degrees of freedom of the total variance (n - 1) to the degrees of freedom of the residual variance (n - p - 1).
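The degrees-of-freedom view gives an equivalent form of the formula: Adjusted R² = 1 - (SSR / (n - p - 1)) / (SST / (n - 1)), i.e. one minus the ratio of residual variance to total variance. The snippet below, using the numbers from Example 1, checks that the two forms agree:

```python
sst, ssr, n, p = 2000, 400, 100, 4  # values from Example 1

r2 = 1 - ssr / sst
via_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)        # textbook form
via_df = 1 - (ssr / (n - p - 1)) / (sst / (n - 1))   # mean-square (variance) ratio form

assert abs(via_r2 - via_df) < 1e-12
```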

Frequently Asked Questions (FAQ)

What is a good Adjusted R-Squared value?
It depends heavily on the field of study. In the social sciences, a value above 0.20 may be considered meaningful, while in physics or engineering you might expect values above 0.90. The key is to use it for comparing models fitted to the same data.
Can Adjusted R-Squared be negative?
Yes. If the model’s R-Squared is very low (close to zero), and you have several predictors, the penalty can be large enough to push the Adjusted R-Squared value below zero. This indicates your model is worse at predicting the outcome than simply using the mean of the dependent variable.
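A quick numeric illustration of this effect, with made-up values chosen to show the sign flip (weak fit, many predictors, small sample):

```python
def adjusted_r2(sst, ssr, n, p):
    """Adjusted R-squared from SST, SSR, sample size n, and p predictors."""
    r2 = 1 - ssr / sst
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# R-squared is only 0.05, with 8 predictors and just 20 observations:
adj = adjusted_r2(100, 95, 20, 8)
print(round(adj, 3))  # -0.641
```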
What’s the main difference between R-Squared and Adjusted R-Squared?
R-Squared measures the proportion of variance in the dependent variable explained by the model; Adjusted R-Squared measures the same thing but penalizes the model for extra predictors that don’t improve the fit.
How do I find SST and SSR?
Most statistical software (such as R, Python’s statsmodels, or SPSS) reports these values in the regression output, often under “ANOVA” or “Model Summary”. SST is the total sum of squares and SSR is the residual (error) sum of squares. Be aware that some textbooks instead use SSR for the regression sum of squares and SSE for the residual sum of squares, so check how your software labels them.
Why did my Adjusted R-Squared go down when I added a new variable?
This means the variable you added did not improve the model enough to justify its inclusion and the loss of a degree of freedom. It’s a sign that the variable is likely irrelevant.
Should I always use Adjusted R-Squared over R-Squared?
When comparing models with different numbers of predictors, Adjusted R-Squared is almost always superior. For a simple linear regression with only one predictor, both values will be very close.
What does it mean if R-Squared is high but Adjusted R-Squared is much lower?
This is a classic sign of an overfit model. It suggests you have included too many independent variables, and some of them are not contributing meaningfully to the model’s explanatory power.
Is a higher Adjusted R-Squared always better?
Generally, yes, when comparing models for the same dataset. However, it should be considered alongside other metrics such as coefficient p-values, residual plots, and the theoretical soundness of the model.

© 2026 Your Company. All Rights Reserved. This calculator is for educational purposes only.


