# Errors of NLLSF


Calculating confidence intervals for nonlinear least squares fits

The calculation of uncertainty bounds for linear least squares fits is well covered in textbooks. For nonlinear least squares fits, however, people usually rely on a statistics package. For example, MATLAB's fit() function can estimate the 68% confidence interval for a standard expression built from built-in functions, but if you want to fit your data with a custom function, you are out of luck in MATLAB and will need to write your own code to estimate the uncertainties. Even though the principles are taught in most statistics classes, I have found that when it comes to nonlinear least squares fitting, most people without a statistical background do not know how to actually estimate the errors. Here I give a very simple procedure that allows you to estimate your error bars.

y = f(a, b, ..., x) + error

a, b, ... are the fitting parameters, x is the independent variable, and y is the observable. In the procedure given below, we assume the errors follow a normal distribution independent of x and y.

Given a set of experimental data x' vs. y', we can use a least squares fit to find the fitting parameters a, b, ....
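As a concrete sketch of this step in Python with SciPy, assuming a hypothetical exponential-decay model and synthetic data in place of your own f and measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical model y = a * exp(-b * x); substitute your own f.
def f(x, a, b):
    return a * np.exp(-b * x)

# Synthetic "experimental" data with normally distributed errors.
rng = np.random.default_rng(0)
x_data = np.linspace(0.0, 5.0, 50)
y_data = f(x_data, 2.0, 1.3) + rng.normal(0.0, 0.05, x_data.size)

# Nonlinear least squares: find a, b minimizing the squared
# 2-norm of the residuals.
popt, pcov = curve_fit(f, x_data, y_data, p0=[1.0, 1.0])
a_fit, b_fit = popt
```

The choice of starting point p0 matters for nonlinear fits; a poor guess can land the optimizer in a local minimum.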

We then treat the best fit as the expectation value and estimate the errors from the deviations y'_i - f(a, b, ..., x'_i). These deviations are called residuals.

Then we calculate the reduced squared 2-norm of the residuals, dividing by the number of degrees of freedom n - p, where n is the number of data points and p the number of fitting parameters (for large n this is close to dividing by n):

Err2 = [sum_i (y'_i - f(a,b,...,x'_i))^2] / (n - p)
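In Python this step might look like the following (a sketch; the hypothetical model and synthetic data from the fitting step are repeated here so the snippet runs on its own):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical model y = a * exp(-b * x) and synthetic data.
def f(x, a, b):
    return a * np.exp(-b * x)

rng = np.random.default_rng(0)
x_data = np.linspace(0.0, 5.0, 50)
y_data = f(x_data, 2.0, 1.3) + rng.normal(0.0, 0.05, x_data.size)
popt, _ = curve_fit(f, x_data, y_data, p0=[1.0, 1.0])

# Residuals and their reduced squared 2-norm
# (n data points, p fitted parameters).
residuals = y_data - f(x_data, *popt)
n, p = x_data.size, len(popt)
err2 = np.sum(residuals**2) / (n - p)
```

Err2 is an estimate of the variance of the measurement noise, so its square root should be close to the true noise level.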

Then we define the reduced χ^2 (chi-square) for trial parameter values a', b', ...:

reduced χ^2 = [sum_i (f(a',b',...,x'_i) - f(a,b,...,x'_i))^2] / Err2
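As a sketch, the reduced χ^2 for a trial value a' of one parameter (with the other parameter held at its best-fit value) can be written as a small Python function; the model and data are the same hypothetical stand-ins as above:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical model and synthetic data, as in the fitting step.
def f(x, a, b):
    return a * np.exp(-b * x)

rng = np.random.default_rng(0)
x_data = np.linspace(0.0, 5.0, 50)
y_data = f(x_data, 2.0, 1.3) + rng.normal(0.0, 0.05, x_data.size)
popt, _ = curve_fit(f, x_data, y_data, p0=[1.0, 1.0])

residuals = y_data - f(x_data, *popt)
err2 = np.sum(residuals**2) / (x_data.size - len(popt))

def reduced_chi2(a_prime):
    """Reduced chi^2 with a varied and b held at its best-fit value."""
    return np.sum((f(x_data, a_prime, popt[1]) - f(x_data, *popt))**2) / err2
```

By construction, reduced_chi2 is zero at the best-fit value and grows as a' moves away from it.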

The 68% confidence interval of each parameter can be defined as the point where the reduced χ^2 reaches 1. Statistical software usually calculates the intervals from the derivatives of χ^2 with respect to a, b, ... (this is where the Jacobian matrix is useful). Here we simplify that by doing an optimization search on χ^2 over a', b', ..., i.e.,

a' = fminsearch(abs(reduced χ^2 - 1), a)

The minimization gives you one side of the bound; restart the search from the other side of the best-fit value to get the other side.
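Putting the steps together, here is a sketch of the bound search in Python; SciPy's Nelder-Mead simplex plays the role of MATLAB's fminsearch, and starting the search above or below the best-fit value picks out the upper or lower bound. The model and data remain hypothetical stand-ins:

```python
import numpy as np
from scipy.optimize import curve_fit, minimize

# Hypothetical model and synthetic data, as in the earlier steps.
def f(x, a, b):
    return a * np.exp(-b * x)

rng = np.random.default_rng(0)
x_data = np.linspace(0.0, 5.0, 50)
y_data = f(x_data, 2.0, 1.3) + rng.normal(0.0, 0.05, x_data.size)
popt, _ = curve_fit(f, x_data, y_data, p0=[1.0, 1.0])
a_fit, b_fit = popt

err2 = np.sum((y_data - f(x_data, *popt))**2) / (x_data.size - len(popt))

def reduced_chi2(a_prime):
    # Reduced chi^2 with a varied and b held at its best-fit value.
    return np.sum((f(x_data, a_prime, b_fit) - f(x_data, *popt))**2) / err2

# Nelder-Mead (the analogue of fminsearch) on |reduced chi^2 - 1|;
# starting above a_fit finds the upper bound, below it the lower one.
upper = minimize(lambda v: abs(reduced_chi2(v[0]) - 1.0),
                 x0=[a_fit * 1.02], method="Nelder-Mead").x[0]
lower = minimize(lambda v: abs(reduced_chi2(v[0]) - 1.0),
                 x0=[a_fit * 0.98], method="Nelder-Mead").x[0]
```

A root finder bracketing the crossing would also work here, but the simplex search stays closest to the fminsearch recipe above.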