M-Estimators in Regression Models

Muthukrishnan R, Radha Myilsamy


Regression analysis plays a vital role in many areas of science. Almost all regression analyses rely on the method of least squares for estimating the parameters of the model. But this method is constructed under specific assumptions, such as normality of the error distribution. When outliers are present in the data, this method of estimation yields parameter estimates that do not provide useful information about the majority of the data. Robust regression methods have been developed as an improvement over least squares estimation in the presence of outliers. The main purpose of robust regression analysis is to fit a model that represents the information in the majority of the data. Many researchers have worked in this field and developed methods for these problems. The most commonly used robust estimators are Huber's M-estimator, the Hampel estimator, Tukey's bisquare estimator, etc. In this paper, an attempt is made to review such estimators and to carry out a simulation study of them in regression models. R code has been written for this purpose and illustrations are provided.
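As context for the abstract, an M-estimator such as Huber's is typically computed by iteratively reweighted least squares (IRLS): residuals larger than a tuning constant k are downweighted so that outliers no longer dominate the fit. The paper's own implementation is in R (and is not reproduced here); the following is a minimal Python sketch of Huber M-estimation via IRLS, with an MAD-based scale estimate and the conventional tuning constant k = 1.345. All function names and defaults here are illustrative, not the authors' code.

```python
import numpy as np

def huber_weights(r, k=1.345):
    # Huber's weight function: weight 1 for |r| <= k, k/|r| beyond that,
    # so large residuals are progressively downweighted.
    a = np.abs(r)
    w = np.ones_like(a)
    mask = a > k
    w[mask] = k / a[mask]
    return w

def huber_m_estimate(X, y, k=1.345, tol=1e-8, max_iter=100):
    # Huber M-estimator for linear regression via IRLS.
    # X: (n, p) predictor matrix; y: (n,) response. Returns [intercept, slopes...].
    X1 = np.column_stack([np.ones(len(y)), np.asarray(X)])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)  # ordinary least squares start
    for _ in range(max_iter):
        r = y - X1 @ beta
        # Robust scale: median absolute deviation, scaled for consistency
        # with the normal distribution (MAD / 0.6745).
        s = np.median(np.abs(r - np.median(r))) / 0.6745
        if s == 0:
            break  # exact fit on the bulk of the data
        w = huber_weights(r / s, k)
        sw = np.sqrt(w)
        # Weighted least squares step: scale rows by sqrt of the weights.
        beta_new, *_ = np.linalg.lstsq(X1 * sw[:, None], y * sw, rcond=None)
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta
```

On data generated from a line with one gross outlier, this estimator recovers slope and intercept close to the truth, while ordinary least squares is pulled noticeably toward the outlier; that contrast is the kind of comparison the paper's simulation study makes.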

DOI: https://doi.org/10.5539/jmr.v2n4p23

This work is licensed under a Creative Commons Attribution 4.0 International License.

Journal of Mathematics Research   ISSN 1916-9795 (Print)   ISSN 1916-9809 (Online)

Copyright © Canadian Center of Science and Education
