
Hierarchical Regression and Hierarchical Linear Modeling

Published Jul 13, 2018 · Last updated Aug 15, 2019

When planning your data analysis, you may have come across two similar-sounding statistical techniques: “Hierarchical Linear Modeling” and “Hierarchical Regression”. The first thought that comes to mind is that these are just two names for the same technique. However, they are completely different techniques, used on different types of data to answer different types of questions. Let us understand the difference between the two.

Hierarchical Linear Modeling is also known as ‘multilevel modeling’ and falls under the category of ‘mixed-effects modeling’. It is the analysis to use when there is a nested structure in the data. For example, suppose you are collecting data from students, and these students come from different schools: the students are nested within schools. Students from the same school share some common variance, so their observations cannot be considered independent of one another. Multiple linear regression assumes that observations are independent, so a different method is required to model nested data. Hierarchical linear modeling accounts for this nesting, and is therefore a more appropriate way to model nested data than multiple linear regression.
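The students-within-schools setup can be sketched with a random-intercept model. This is a minimal illustration using statsmodels' `MixedLM` on simulated data; the variable names (`score`, `hours`, `school`) and the simulated effect sizes are my own assumptions, not from the article.

```python
# Sketch: a multilevel (mixed-effects) model for students nested in schools.
# Each school gets a random intercept, capturing the variance its students share.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_schools, n_students = 20, 30

school = np.repeat(np.arange(n_schools), n_students)        # school ID per student
school_effect = rng.normal(0, 2, n_schools)[school]         # shared within-school variance
hours = rng.uniform(0, 10, n_schools * n_students)          # student-level predictor
score = 50 + 3 * hours + school_effect + rng.normal(0, 5, n_schools * n_students)

df = pd.DataFrame({"score": score, "hours": hours, "school": school})

# groups= declares the nesting: observations within a school are correlated
model = smf.mixedlm("score ~ hours", df, groups=df["school"])
result = model.fit()
print(result.summary())
```

The random intercept absorbs the school-level variance, so the fixed-effect slope for `hours` is estimated without treating the 600 students as 600 independent observations.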

Hierarchical regression analysis, on the other hand, is simply a way of deciding how the independent variables are selected and entered into the model. In simpler terms, hierarchical regression is the process of adding or removing independent variables from the regression model in steps. Let me explain this with an example that brings out the difference between hierarchical regression and hierarchical linear modeling. Suppose you want to predict a student's college GPA (the dependent variable) from their performance in school (the independent variables) while controlling for demographic characteristics (i.e., covariates). For this analysis, you can enter the demographic variables in the first step and then add school performance in the second step. This tells you what predictive capability school performance has beyond the demographic factors. Hierarchical regression also includes forward, backward, and stepwise regression, where independent variables are added or removed automatically based on some statistical criterion. These variants are particularly useful when you have a very large number of potential predictors and want to determine statistically which variables have the most predictive power.
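The two-step GPA example can be sketched by fitting two ordinary regressions and comparing their R² values. This is an illustrative sketch with simulated data; the specific covariates (`age`, `income`) and coefficients are assumptions for the demo, not from the article.

```python
# Sketch: hierarchical regression in two steps.
# Step 1 enters demographic covariates; step 2 adds school performance.
# The change in R-squared is the incremental predictive power of step 2.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
age = rng.normal(18, 1, n)                 # demographic covariate
income = rng.normal(50, 10, n)             # demographic covariate (family income)
hs_gpa = rng.normal(3.0, 0.4, n)           # school performance
college_gpa = 1.0 + 0.02 * income + 0.6 * hs_gpa + rng.normal(0, 0.3, n)

df = pd.DataFrame({"age": age, "income": income,
                   "hs_gpa": hs_gpa, "college_gpa": college_gpa})

step1 = smf.ols("college_gpa ~ age + income", df).fit()            # demographics only
step2 = smf.ols("college_gpa ~ age + income + hs_gpa", df).fit()   # add school performance

delta_r2 = step2.rsquared - step1.rsquared  # variance explained beyond demographics
print(f"Step 1 R^2 = {step1.rsquared:.3f}")
print(f"Step 2 R^2 = {step2.rsquared:.3f}")
print(f"Change in R^2 = {delta_r2:.3f}")
```

A meaningful increase in R² from step 1 to step 2 is the usual evidence that school performance predicts college GPA over and above the demographic covariates; in practice an F-test on the R² change is reported alongside it.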

To summarize: use hierarchical linear modeling when you have nested data, and use hierarchical regression when you want to add or remove variables from your model in multiple steps. I hope that understanding this difference between the two statistical methods will help you determine the appropriate analysis for your study.

Discover and read more posts from Ashish Soni