
Least squares

Least squares is an optimization paradigm for matching data ('fitting') to a parametrised model equation. A well-known example is linear regression, which finds the linear equation that best matches a given set of data points.
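As an illustration, the following is a minimal sketch of linear regression via least squares using NumPy; the data values are hypothetical.

```python
import numpy as np

# Hypothetical data points (x_i, y_i) roughly following y = 2x + 1
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Design matrix for the linear model y = a*x + b
A = np.column_stack([x, np.ones_like(x)])

# Solve the least squares problem min ||A @ p - y||^2 for p = (a, b)
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
print(f"slope a = {a:.3f}, intercept b = {b:.3f}")
```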

The least squares measure for the goodness-of-fit is $$\chi^2_{\mathrm{red}}=\frac{1}{N-n_p}\sum_{i=1}^{N} \left[\frac{D\left(\text{model parameters},t_i\right)-D_i^{exp}}{w_i}\right]^2$$

$(D_i^{exp}|t_i)$ is the $i$-th data point of an experimental data set consisting of $N$ data points, $D\left(\text{model parameters},t_i\right)$ is the model equation at the observed points $t_i$, and $n_p$ is the number of freely varying model parameters.

$w_i$ is a weighting factor describing the experimental uncertainty of each individual data point. For TCSPC data, $w_i$ is defined as

$$w_i=\sqrt{D_i^{exp}}$$
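For illustration, below is a minimal sketch of a weighted least squares fit and the resulting reduced chi-squared, assuming a hypothetical single-exponential decay model and simulated TCSPC-like counts; the model, parameter values, and time axis are illustrative, not prescribed by this glossary.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical single-exponential decay model D(A, tau; t) = A * exp(-t / tau)
def model(t, A, tau):
    return A * np.exp(-t / tau)

# Simulated TCSPC-like data: Poisson-distributed counts around the model
rng = np.random.default_rng(0)
t = np.linspace(0.1, 10.0, 200)                      # time axis (illustrative)
D_exp = rng.poisson(model(t, 1000.0, 2.5)).astype(float)

# Weights w_i = sqrt(D_i^exp); clip to avoid division by zero in empty channels
w = np.sqrt(np.clip(D_exp, 1.0, None))

# Weighted least squares fit; sigma passes the per-point uncertainties w_i
popt, pcov = curve_fit(model, t, D_exp, p0=(800.0, 2.0),
                       sigma=w, absolute_sigma=True)

# Reduced chi-squared: weighted squared residuals divided by (N - n_p)
N, n_p = len(t), len(popt)
chi2_red = np.sum(((model(t, *popt) - D_exp) / w) ** 2) / (N - n_p)
print(f"A = {popt[0]:.1f}, tau = {popt[1]:.3f}, chi2_red = {chi2_red:.3f}")
```

A value of $\chi^2_{\mathrm{red}}$ close to 1 indicates that the residuals are consistent with the assumed experimental uncertainties.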

Least squares is a maximum likelihood estimator if the following preconditions are met:

  • All data points $D_i^{exp}$ are independent observations.
  • The number of data points is sufficient (i.e. the model parameters are overdetermined).
  • There are no systematic errors, i.e. the model describes the data correctly.
  • The experimental noise along the time axis is negligible.