Multicollinearity, a common issue in regression analysis, occurs when predictor variables are highly correlated. This article navigates through the intricacies of multicollinearity, addressing its consequences, detection methods, and effective solutions. Understanding and managing multicollinearity is essential for accurate regression models and insightful data analysis. Explore the nuances of this challenge and enhance your statistical proficiency.
One crucial assumption in regression models is that independent variables should not correlate among themselves. This is essential for isolating the individual impact of each variable on the target variable, as indicated by regression coefficients. Multicollinearity arises when variables are correlated, making it challenging to discern their separate effects on the target variable.
Multicollinearity causes the following two primary issues (both are illustrated in the short simulation after this list):
- The regression coefficients become unstable: their estimates can change drastically, and even flip sign, when a correlated predictor is added to or removed from the model.
- The standard errors of the coefficients are inflated, which widens the confidence intervals and makes it harder to detect predictors that genuinely matter.
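To see both issues in action, here is a minimal simulation sketch in Python (synthetic data and statsmodels, not the housing data used later in this article): the same response is fitted once with two uncorrelated predictors and once with two highly correlated ones, and the standard errors in the correlated case come out noticeably larger.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200

x1 = rng.normal(size=n)
x2 = rng.normal(size=n)                   # uncorrelated with x1
x3 = x1 + rng.normal(scale=0.1, size=n)   # almost a copy of x1
y = 2 * x1 + rng.normal(size=n)           # only x1 truly drives y

for label, X in [("uncorrelated pair (x1, x2)", np.column_stack([x1, x2])),
                 ("correlated pair (x1, x3)", np.column_stack([x1, x3]))]:
    fit = sm.OLS(y, sm.add_constant(X)).fit()
    print(label)
    print("  coefficients:", fit.params[1:].round(2))
    print("  std. errors: ", fit.bse[1:].round(2))
```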
A very simple test known as the VIF test is used to assess multicollinearity in our regression model. The variance inflation factor (VIF) identifies the strength of correlation among the predictors.
Now we may wonder why we need VIFs at all and why we can't simply use pairwise correlations.
Since multicollinearity is correlation among the explanatory variables, it seems logical to use the pairwise correlations between all predictors in the model to assess the degree of correlation. However, consider a scenario with five predictors in which none of the pairwise correlations is exceptionally high, yet three of the predictors together could still explain a very high proportion of the variance in the fourth.
This sounds like a multiple regression model in itself, and that is exactly what VIFs rely on. Of course, the original model has a dependent variable (Y), but we don't need to worry about it while calculating multicollinearity. The formula for VIF is:
VIFj = 1 / (1 - Rj²)
Here, Rj² is the R-squared obtained by regressing one individual predictor on all the other predictors. The subscript j indexes the predictors, and each predictor gets its own VIF. So, more precisely, VIFs use a multiple regression model to calculate the degree of multicollinearity. Suppose we have four predictors: X1, X2, X3, and X4. To calculate the VIFs, each independent variable becomes the dependent variable in turn and is regressed on the remaining three. Each of these models produces an R-squared value indicating the proportion of the variance in that predictor that the set of other predictors explains.
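As a concrete illustration of this procedure, here is a rough sketch in Python (synthetic data, hypothetical variable names, scikit-learn for the auxiliary regressions) that regresses each predictor on the remaining three and plugs the resulting R-squared into the formula above. It also shows the point made earlier about pairwise correlations: they stay moderate even though x4 is almost entirely explained by the other three predictors.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n = 500

# Three independent predictors, plus a fourth that is roughly their sum
x1, x2, x3 = rng.normal(size=(3, n))
x4 = x1 + x2 + x3 + rng.normal(scale=0.5, size=n)
X = np.column_stack([x1, x2, x3, x4])

# Pairwise correlations stay moderate (roughly 0.5-0.6 between x4 and each of the others)
print(np.corrcoef(X, rowvar=False).round(2))

# ...yet the VIF of x4 is large, because x1, x2 and x3 jointly explain it very well
for j in range(X.shape[1]):
    others = np.delete(X, j, axis=1)
    r_squared = LinearRegression().fit(others, X[:, j]).score(others, X[:, j])
    print(f"VIF(x{j + 1}) = {1 / (1 - r_squared):.2f}")
```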
The term “variance inflation factor” (VIF) indicates the degree to which correlations among predictors inflate variance. For instance, a VIF of 10 means existing multicollinearity inflates coefficient variance tenfold compared to a model without multicollinearity. VIFs assess the precision of coefficient estimates, influencing the width of confidence intervals. Lower VIF values are preferable; values between 1 and 5 suggest manageable correlation, while those exceeding 5 indicate severe multicollinearity. Industry standards often recommend maintaining VIF below 5, although some texts consider VIF greater than 10 as severe, with judgment playing a role in deciding corrective measures.
The potential solutions include the following:
- Dropping one of the highly correlated predictors.
- Combining the correlated predictors into a single variable (for example, by summing or averaging them).
- Transforming the predictors with Principal Component Analysis (PCA), discussed below.
PCA transforms the original correlated predictors into new variables known as principal components, which are uncorrelated. So, if we have 10-dimensional data, a PCA transformation will give us 10 principal components, squeezing the maximum possible information into the first component, then the maximum remaining information into the second component, and so on. The primary limitation of this method is the interpretability of the results, as the original predictors lose their identity, and there is a chance of information loss. At the end of the day, it is a trade-off between accuracy and interpretability.
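A minimal sketch of that transformation, using scikit-learn's PCA on a small synthetic set of correlated predictors (not the housing data): after the rotation, the resulting components are uncorrelated, and the first component absorbs most of the variance.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n = 300

# Three correlated predictors: x2 and x3 are largely driven by x1
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + rng.normal(scale=0.3, size=n)
x3 = 0.8 * x1 + rng.normal(scale=0.4, size=n)
X = np.column_stack([x1, x2, x3])

# Standardize, then rotate into principal components
pca = PCA()
components = pca.fit_transform(StandardScaler().fit_transform(X))

print(np.corrcoef(components, rowvar=False).round(3))  # ~identity matrix: components are uncorrelated
print(pca.explained_variance_ratio_.round(3))          # first component carries most of the variance
```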
I am using a subset of the house price data from Kaggle. The dependent/target variable in this dataset is "SalePrice". There are around 80 predictors (both quantitative and qualitative) in the full dataset. For simplicity, I have selected 10 predictors that, based on my intuition, should be suitable predictors of the sale price of the houses. Please note that I did not apply any treatment, e.g., creating dummies for the qualitative variables; this example is for representation purposes only.
The following table lists the predictors I chose and their descriptions.
The code below shows how to calculate VIF in R. For this we need to install the 'car' package; other packages are available in R as well.
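A minimal sketch of this step, assuming the selected predictors and SalePrice live in a data frame named house_data (an illustrative name, not taken from the original script):

```r
# install.packages("car")  # provides the vif() function
library(car)

# Fit the regression of SalePrice on the selected predictors
model <- lm(SalePrice ~ ., data = house_data)

# One VIF per predictor
vif(model)
```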
As we can see in the output, most of the predictors have a VIF <= 5.
Now, if we want to do the same thing in Python, see the code below.
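A minimal sketch of the Python version, using statsmodels' variance_inflation_factor; the file name and column selection here are illustrative, not the exact ones from the original code.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Load the Kaggle training data and keep a few numeric predictors
# (substitute your own selection of columns here)
data = pd.read_csv("train.csv")
predictors = data[["GrLivArea", "GarageArea", "TotalBsmtSF", "YearBuilt"]]

# Add the intercept column explicitly (see the note below)
X = sm.add_constant(predictors)

vif = pd.DataFrame({
    "variable": X.columns,
    "VIF": [variance_inflation_factor(X.values, i) for i in range(X.shape[1])],
})
print(vif)
```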
Please note that in the Python code I have added a column for the intercept/constant to my data set before calculating the VIFs. This is because the variance_inflation_factor function in Python does not include the intercept by default when calculating the VIFs. Hence, we may often see very different results between the R and Python outputs. For details, please see this discussion here.
In conclusion, understanding and addressing multicollinearity are crucial for robust regression models. Detecting signs, interpreting VIF values, and implementing corrective measures are essential. Mastering these skills is paramount for data scientists. Elevate your expertise further with our Blackbelt course, offering in-depth knowledge and practical applications in regression analysis. Enroll now for a seamless learning experience!
Q1. What is multicollinearity, and why is it a problem?
A. Multicollinearity is the high correlation between independent variables in a regression model. It's problematic because it undermines the model's ability to distinguish the individual effects of predictors.

Q2. How does multicollinearity affect regression results?
A. Multicollinearity hampers the interpretability of regression coefficients by inflating standard errors, making it challenging to discern the unique impact of each variable on the dependent variable.

Q3. How do you interpret multicollinearity?
A. Multicollinearity is interpreted through variance inflation factor (VIF) values. High VIF values, typically above 5, indicate problematic multicollinearity, affecting the reliability of coefficient estimates.

Q4. What are the signs of multicollinearity?
A. Signs of multicollinearity include high pairwise correlations between predictors, coefficients changing signs when variables are added or removed, and inflated standard errors in regression results.