# Using kernel regression to estimate the coefficients of a regression model with autocorrelated random error terms, with a practical application

## Keywords:

autocorrelation, kernel regression, local linear regression, modified Mallows criterion

## Abstract

The regression model is one of the models used to interpret the effect of one or more phenomena on a particular response phenomenon by estimating the coefficients of the model; the estimated model can then be adopted for making future predictions of the effect of the explanatory phenomena on the response phenomenon. The regression model is built on several assumptions; when they hold, we obtain estimates with the desired properties. One of these assumptions concerns the random error terms: they are assumed to be mutually independent. When the error terms are instead autocorrelated, the least-squares estimates of the model coefficients are affected, leading to misleading conclusions about the effect of the explanatory phenomena as well as about future predictions. Several methods, both parametric and nonparametric, exist for estimating the coefficients of the regression model in the presence of this problem. One of the nonparametric methods is kernel regression, and this research aims to use local linear kernel regression to estimate the regression model with autocorrelated error terms, selecting the bandwidth parameter by the modified Mallows criterion. The bandwidth parameter has a clear influence on the estimation process, as it controls how closely the estimated curve is smoothed toward the real curve. A practical application was carried out on real data representing the money supply and some factors affecting it; using the Gaussian and Epanechnikov kernel functions, the regression curve representing the expected value of the money supply was estimated.
The estimation results revealed that the Gaussian kernel function is the best at smoothing the regression function according to the comparison criteria MAE, RMSE, and MAPE, although the estimation results of the two kernel functions were close.
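The local linear kernel estimator described above can be sketched as follows. This is a minimal illustration on synthetic data (the paper's money-supply data and its modified Mallows bandwidth selection are not reproduced here; the bandwidth `h = 0.5` is an arbitrary assumed value), showing the Gaussian and Epanechnikov kernels and the RMSE/MAE comparison criteria:

```python
import numpy as np

# --- kernel functions named in the abstract ---
def gaussian(u):
    """Gaussian kernel: (1/sqrt(2*pi)) * exp(-u^2 / 2)."""
    return np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)

def epanechnikov(u):
    """Epanechnikov kernel: 0.75 * (1 - u^2) on |u| <= 1, else 0."""
    return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u ** 2), 0.0)

def local_linear(x0, x, y, h, kernel):
    """Local linear estimate of m(x0): the intercept of a
    kernel-weighted least-squares fit of y on (x - x0)."""
    w = kernel((x - x0) / h)
    X = np.column_stack([np.ones_like(x), x - x0])
    WX = X * w[:, None]                       # apply weights row-wise
    beta = np.linalg.solve(X.T @ WX, WX.T @ y)
    return beta[0]

def fit_curve(x, y, h, kernel):
    """Evaluate the local linear estimator at every design point."""
    return np.array([local_linear(x0, x, y, h, kernel) for x0 in x])

# --- illustrative synthetic data (NOT the paper's real data) ---
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 2.0 * np.pi, 200))
y = np.sin(x) + rng.normal(scale=0.2, size=x.size)

for name, k in [("Gaussian", gaussian), ("Epanechnikov", epanechnikov)]:
    yhat = fit_curve(x, y, h=0.5, kernel=k)
    rmse = np.sqrt(np.mean((yhat - np.sin(x)) ** 2))
    mae = np.mean(np.abs(yhat - np.sin(x)))
    print(f"{name}: RMSE={rmse:.3f}, MAE={mae:.3f}")
```

In practice the bandwidth `h` would be chosen by a data-driven rule such as the modified Mallows criterion used in the paper, since too small an `h` undersmooths and too large an `h` oversmooths the estimated curve.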