Abstract
Multicollinearity can result in imprecise estimation of regression coefficients, but small sample sizes and low model coefficients of determination can produce the same effects. Multicollinearity does not violate the assumptions that underlie statistical inference on linear models. This implies that a researcher wishing only to protect against Type I errors need not be concerned about high interpredictor correlations. The researcher concerned about the power of tests, however, should incorporate a consideration of multicollinearity into the planning of an analysis. It has been shown here that it is possible to compensate for any degree of multicollinearity by increasing sample size or model validity. The use of these methods ensures that the researcher will be able to benefit from the valuable sampling characteristics of least squares estimation; in particular, the sample coefficients are unbiased estimators of their respective parameters.
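The compensating role of sample size can be illustrated with a small Monte Carlo sketch (not from the original article; the correlation levels, sample sizes, and replication count below are illustrative assumptions). It estimates the sampling variability of an OLS slope when two predictors are highly correlated, and shows that a larger sample restores precision:

```python
import numpy as np

rng = np.random.default_rng(0)

def coef_se(n, rho, sigma=1.0, reps=500):
    """Empirical standard deviation of the OLS slope b1 across
    replications, with two predictors correlated at rho."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    estimates = []
    for _ in range(reps):
        X = rng.multivariate_normal([0.0, 0.0], cov, size=n)
        y = X @ np.array([1.0, 1.0]) + rng.normal(0.0, sigma, size=n)
        Xd = np.column_stack([np.ones(n), X])  # add intercept column
        b, *_ = np.linalg.lstsq(Xd, y, rcond=None)
        estimates.append(b[1])
    return np.std(estimates)

# High collinearity (rho = .9) at n = 50 vs. the same collinearity at n = 400,
# plus a low-collinearity baseline (rho = .1) at n = 50:
se_small = coef_se(n=50, rho=0.9)
se_large = coef_se(n=400, rho=0.9)
se_base = coef_se(n=50, rho=0.1)
```

Under these assumed settings, `se_small` exceeds `se_base` (the inflation due to collinearity), while `se_large` falls well below `se_small`, reflecting the abstract's point that a larger sample can offset any fixed degree of multicollinearity.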

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Copyright (c) 1983 John T. Pohlman (Author)