# Testing and Goodness of Fit

In many situations, we examine data to get an estimate of unknown parameters. Examples of such parameters are the mean of a distribution, the variance of a distribution, or the weights $b$ that we apply in combining variables $x$ into a prediction of another variable $y$.

In this book, we mainly employ the maximum likelihood and the least-squares estimation principles. The maximum likelihood principle is described in Appendix A3. In least squares, we choose the estimate such that the squared differences between observed values and our predictions are minimized. As an illustration, consider the case where we want to estimate the mean $m$ of a sample of $N$ observations $x_i$. In the least-squares approach, our prediction for a single observation will be just the mean $m$ we are looking for, and so we minimize:

$$\sum_{i=1}^{N}(x_i - m)^2$$
We can solve this problem by taking the first derivative with respect to $m$ and setting it to zero:

$$\frac{d}{dm}\sum_{i=1}^{N}(x_i - m)^2 = -2\sum_{i=1}^{N}(x_i - m) = 0$$
Solving for $m$ yields the estimator:

$$\hat{m} = \frac{1}{N}\sum_{i=1}^{N} x_i$$
that is, the arithmetic average of the observed $x_i$.
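The derivation above can be checked numerically. The book works in Excel/VBA, but the following is a minimal Python sketch (the data values are made up for illustration): it evaluates the sum of squared errors at the arithmetic average and at nearby candidate values, showing that the average attains the smallest value.

```python
def sse(data, m):
    """Sum of squared differences between the observations and a candidate mean m."""
    return sum((x - m) ** 2 for x in data)

# Hypothetical sample of N = 5 observations.
data = [2.0, 4.0, 4.0, 5.0, 10.0]

# Least-squares estimator: the arithmetic average.
m_hat = sum(data) / len(data)

# SSE is smallest at m_hat; any nearby candidate gives a larger value.
for m in (m_hat - 0.5, m_hat, m_hat + 0.5):
    print(f"m = {m:.2f}, SSE = {sse(data, m):.4f}")
```

Because the objective is a quadratic in $m$ with positive curvature, the stationary point found by the first-derivative condition is indeed the global minimum.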

### Standard errors

Once we have arrived at some estimate b, we would like to know about the ...
