Abstract
The results of Morris and Lieberman (2012) were extended to more extreme multicollinearity conditions, with specific attention to how a wide range of multicollinearity levels affects the cross-validated classification accuracy of Ordinary Least Squares (OLS) and Logistic Regression (LR), and to the gain in prediction accuracy that Ridge Regression and Principal Components models afford as validity concentration increases. Even very extreme multicollinearity did not degrade OLS or LR prediction accuracy, but, given attendant validity concentration, it did enable Ridge Regression and Principal Components models to exceed the accuracy of these commonly used classification techniques. Clarification is offered on how these results bear on common admonitions about multicollinearity.
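As a rough illustration of the kind of comparison the abstract describes, the following is a minimal simulation sketch (not the authors' code or data) using scikit-learn: predictors share a dominant common factor (severe multicollinearity), the two-group criterion loads on that same factor (validity concentration), and cross-validated classification accuracy is compared for OLS, logistic regression, ridge regression, and a principal-components model. The sample size, collinearity structure, ridge penalty, and number of components are illustrative assumptions only.

```python
# Hypothetical sketch: cross-validated two-group classification accuracy under
# severe multicollinearity with validity concentration (simulated data).
import numpy as np
from sklearn.base import BaseEstimator, ClassifierMixin
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression, LogisticRegression, Ridge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n, p = 200, 10

# Severe multicollinearity: all predictors are noisy copies of one common factor.
factor = rng.normal(size=(n, 1))
X = factor + 0.05 * rng.normal(size=(n, p))

# Validity concentration: the dichotomous criterion depends mainly on that factor.
y = (factor[:, 0] + 0.5 * rng.normal(size=n) > 0).astype(int)


class ThresholdClassifier(BaseEstimator, ClassifierMixin):
    """Wraps a least-squares-style regressor as a two-group classifier (cutoff 0.5)."""

    def __init__(self, regressor):
        self.regressor = regressor

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.regressor.fit(X, y)
        return self

    def predict(self, X):
        return (self.regressor.predict(X) >= 0.5).astype(int)


models = {
    "OLS": ThresholdClassifier(LinearRegression()),
    "Logistic": LogisticRegression(max_iter=1000),
    "Ridge": ThresholdClassifier(Ridge(alpha=10.0)),  # penalty chosen arbitrarily here
    "PCR": ThresholdClassifier(make_pipeline(PCA(n_components=2), LinearRegression())),
}

for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name:>8}: mean CV accuracy = {acc:.3f}")
```

In a sketch like this, OLS and logistic regression typically remain serviceable classifiers despite the extreme collinearity, while the regularized and components-based models can match or exceed them when the predictive validity is concentrated in the shared factor.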

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Copyright (c) 2014 Mary G. Lieberman, John D. Morris (Author)