Master's Thesis

A Study in Functional Errors-In-Variables Models

Errors-in-Variables (EIV) models are regression models in which both the explanatory and the response variables are measured with error. This seemingly small change leads to a myriad of issues that are not present in the classical model. In fact, widely used methods that perform well under the classical model become woefully inadequate under EIV. For instance, the least squares (LS) estimator of the slope parameter suffers from attenuation bias, while the maximum likelihood estimator (MLE) of the slope parameter has infinite moments. Accordingly, several approaches have been developed in the literature to produce better estimators. This thesis aims to develop new estimators through a different approach. Instead of minimizing an objective function derived from the likelihood principle, we consider a family of objective functions indexed by an unspecified weight function. This degree of freedom allows us to develop estimators with desirable statistical properties, such as efficiency and unbiasedness up to the fourth leading term. To derive such a weight function, a general form of the second-order bias is formulated with the aid of perturbation theory. This process produces a system of first-order linear partial differential equations whose solution gives a closed-form expression for the weight function. Our estimator is then obtained by minimizing the objective function associated with this weight using the Levenberg-Marquardt (LM) algorithm. The effectiveness and superiority of our method are assessed through a series of Monte Carlo simulations.
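As a minimal illustration of the attenuation bias mentioned above (not code from the thesis), the Monte Carlo sketch below fits ordinary least squares to data generated from a simple linear EIV model in which the covariate is observed with additive error. The parameter values (true slope, error standard deviations, sample size, number of replications) are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative settings, not values from the thesis
beta0, beta1 = 1.0, 2.0          # true intercept and slope
sigma_xi, sigma_u, sigma_eps = 1.0, 0.5, 0.3  # latent, measurement, and response error SDs
n, n_rep = 200, 2000             # sample size and number of Monte Carlo replications

slopes = np.empty(n_rep)
for r in range(n_rep):
    xi = rng.normal(0.0, sigma_xi, n)                        # latent true covariate
    x = xi + rng.normal(0.0, sigma_u, n)                     # covariate observed with error
    y = beta0 + beta1 * xi + rng.normal(0.0, sigma_eps, n)   # response observed with error
    # Ordinary LS slope of y on the error-contaminated x
    slopes[r] = np.cov(x, y, bias=True)[0, 1] / np.var(x)

reliability = sigma_xi**2 / (sigma_xi**2 + sigma_u**2)       # attenuation factor
print(f"mean LS slope    : {slopes.mean():.3f}")
print(f"true slope       : {beta1:.3f}")
print(f"attenuated slope : {beta1 * reliability:.3f}  (theory: beta1 * reliability)")
```

With these settings the average LS slope settles near beta1 * sigma_xi^2 / (sigma_xi^2 + sigma_u^2) = 1.6 rather than the true value 2, which is the attenuation bias the abstract refers to; the weighted estimators developed in the thesis are designed to avoid this kind of bias.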
