Comparison of parameter estimation methods when multicollinearity and outlier exists / Aida Nurasikin Jamil ...[et al.]


Bibliographic Details
Main Authors: Jamil, Aida Nurasikin; Abdul Muluk, Muhammad Fahmi; Anuar, Nur Sabrina; Abu Bakar, Mohamad Suffian
Format: Thesis
Language: English
Published: 2019
Subjects:
Online Access: https://ir.uitm.edu.my/id/eprint/32559/1/32559.pdf
Other Bibliographic Details
Summary: The Ordinary Least Squares (OLS) estimator becomes worse in the presence of multicollinearity and outliers. Here, three methods are suggested to improve the model when multicollinearity and outliers exist: the first is Jackknife Regression (JR), based on the leave-one-out method; the second is Ridge Regression (RR), based on the addition of a shrinkage parameter; and the third is Latent Root Regression (LRR), which incorporates the latent roots and latent vectors. In the application, the model parameters, standard errors, lengths of confidence intervals (L.C.I.), coefficients of determination (R²), and mean square errors (MSE) of these methods are estimated. Next, the performance of these three methods is compared with OLS using MSE and R². Based on the analysis, the LRR method was the best among the methods compared, since its MSE is the lowest and its R² is the highest. LRR was not only the best method when multicollinearity exists, but also the best in the presence of both multicollinearity and outliers.
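For orientation only, below is a minimal Python sketch (not the thesis's code or data) of the kind of comparison the abstract describes: OLS, ridge regression with a shrinkage parameter, and a leave-one-out jackknife average of OLS fits, evaluated by MSE and R² on synthetic collinear data containing one outlier. The data-generating values, the shrinkage parameter alpha, and the omission of Latent Root Regression are all assumptions made for brevity.

```python
# Illustrative sketch only: synthetic data, assumed parameter values,
# and no Latent Root Regression; not the thesis's actual analysis.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
n = 60
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)      # nearly collinear with x1
X = np.column_stack([x1, x2])
y = 2.0 * x1 + 3.0 * x2 + rng.normal(scale=0.5, size=n)
y[0] += 15.0                                  # single outlier in the response

# OLS and ridge (alpha is the shrinkage parameter; value chosen arbitrarily)
ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)

# Jackknife-style estimate: average OLS coefficients over leave-one-out fits
coefs, intercepts = [], []
for i in range(n):
    mask = np.arange(n) != i
    fit = LinearRegression().fit(X[mask], y[mask])
    coefs.append(fit.coef_)
    intercepts.append(fit.intercept_)
jack_coef = np.mean(coefs, axis=0)
jack_intercept = np.mean(intercepts)

# Compare the fits by MSE and R², as in the abstract
for name, pred in [
    ("OLS", ols.predict(X)),
    ("Ridge", ridge.predict(X)),
    ("Jackknife", X @ jack_coef + jack_intercept),
]:
    print(f"{name:10s} MSE={mean_squared_error(y, pred):6.3f} "
          f"R2={r2_score(y, pred):5.3f}")
```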