MONTE CARLO APPROACH FOR COMPARATIVE ANALYSIS OF REGRESSION TECHNIQUES IN THE PRESENCE OF MULTICOLLINEARITY AND AUTOCORRELATION PHENOMENA
Abstract
Multicollinearity and autocorrelation are two very common problems in regression analysis. As is well known, the presence of some degree of multicollinearity results in estimation instability and model mis-specification, while serially correlated errors lead to underestimation of the variance of parameter estimates and inefficient prediction. Because both conditions have adverse effects on estimation and prediction, a wide range of tests and remedies has been developed to reduce their impact. Invariably, most studies deal with the multicollinearity and autocorrelation problems separately. This study therefore explored the predictive ability of the proposed GLS-Ridge (GLS-R) regression under multicollinearity and autocorrelation simultaneously, using datasets simulated by the Monte Carlo method. In the application, 1000 repetitions were simulated for each of the sample sizes considered. The GLS-R model was proposed and its estimator derived. Least squares, ridge, LASSO and the GLS-R model were applied to the simulated datasets. Regression coefficients were computed for each estimator, and the statistical comparison criteria, Mean Square Error (MSE) and Akaike Information Criterion (AIC), were used to select the best model. For the simulated data, the GLS-R model had a smaller AIC value than the least squares, ridge regression and LASSO techniques for all sample sizes considered; among these four techniques, the GLS-R model gave the smallest AIC value. The study revealed that the GLS-R regression technique has better predictive ability in the presence of autocorrelation and multicollinearity, and is therefore preferred to the other three techniques.
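The Monte Carlo comparison described above can be sketched as follows. This is a minimal illustration, not the paper's exact procedure: it assumes regressors made collinear through a shared common factor, AR(1) errors with a known coefficient ρ, illustrative parameter values, and a GLS-Ridge estimator formed by whitening the data (a Cochrane-Orcutt transform) before applying ridge shrinkage; in practice ρ would be estimated from residuals, and the paper's derived estimator may differ. Coefficient MSE is used as the comparison criterion here.

```python
import numpy as np

rng = np.random.default_rng(42)
BETA = np.array([1.0, 2.0, 3.0])  # true coefficients (illustrative choice)

def simulate(n, rho=0.8, collin=0.99):
    """One dataset with collinear regressors and AR(1) errors."""
    # Collinearity: x2, x3 share the common factor x1.
    x1 = rng.normal(size=n)
    x2 = collin * x1 + np.sqrt(1 - collin**2) * rng.normal(size=n)
    x3 = collin * x1 + np.sqrt(1 - collin**2) * rng.normal(size=n)
    X = np.column_stack([x1, x2, x3])
    # Autocorrelation: e_t = rho * e_{t-1} + u_t.
    u = rng.normal(size=n)
    e = np.zeros(n)
    for t in range(n):
        e[t] = (rho * e[t - 1] if t else 0.0) + u[t]
    return X, X @ BETA + e

def ols(X, y):
    return np.linalg.solve(X.T @ X, X.T @ y)

def ridge(X, y, k):
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

def gls_ridge(X, y, rho, k):
    # Whiten the AR(1) error structure, then shrink the collinear directions.
    Xs, ys = X[1:] - rho * X[:-1], y[1:] - rho * y[:-1]
    return ridge(Xs, ys, k)

# Monte Carlo loop: average squared estimation error over repetitions.
n, reps, rho, k = 50, 300, 0.8, 1.0
sse = {"OLS": 0.0, "Ridge": 0.0, "GLS-Ridge": 0.0}
for _ in range(reps):
    X, y = simulate(n, rho=rho)
    for name, b in (("OLS", ols(X, y)),
                    ("Ridge", ridge(X, y, k)),
                    ("GLS-Ridge", gls_ridge(X, y, rho, k))):
        sse[name] += np.sum((b - BETA) ** 2)

mse = {name: s / reps for name, s in sse.items()}
for name, v in mse.items():
    print(f"{name}: coefficient MSE = {v:.3f}")
```

Under this setup the whitening step removes the serial correlation that inflates the error variance, while the ridge penalty stabilises the near-singular directions created by the collinear regressors, which is the same intuition behind the combined GLS-R estimator studied in the paper.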
FUDMA Journal of Sciences