Description
Signal-to-noise ratio (S/N) can be defined as the mean value of a signal (S) divided by the standard deviation of the background noise (N). The signal becomes impossible to distinguish from the noise when S/N drops below about 2 to 3.
In this simulation, a 1000-point background of Gaussian white noise is generated with a standard deviation of 1. The user-defined signal level is then added to a single point in the middle of the array, and the entire array is plotted. The actual signal-to-noise ratio is calculated and displayed as S/N = (original point value + signal level) / 1, where 1 is the standard deviation of the background noise.
(Reference: Skoog, Holler, and Crouch, Principles of Instrumental Analysis, 6th ed., Thomson Brooks/Cole, 2007)
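The following is a minimal sketch of the procedure described above, assuming Python with NumPy and Matplotlib; the original simulation's implementation language, variable names, and the example signal level used here are not specified in the source.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng()

n_points = 1000      # length of the background array
sigma = 1.0          # standard deviation of the Gaussian white noise
signal_level = 5.0   # hypothetical user-defined signal level

# Generate the 1000-point Gaussian white-noise background.
background = rng.normal(loc=0.0, scale=sigma, size=n_points)

# Add the signal level to a single point in the middle of the array.
mid = n_points // 2
original_value = background[mid]
background[mid] += signal_level

# Actual S/N: (original point value + signal level) divided by the
# noise standard deviation (which is 1 here).
snr = (original_value + signal_level) / sigma

# Plot the entire array and display the calculated S/N.
plt.plot(background)
plt.title(f"S/N = {snr:.2f}")
plt.xlabel("Point index")
plt.ylabel("Amplitude")
plt.show()
```

Re-running the sketch with different values of signal_level reproduces the exercise below: at low S/N the spike in the middle is indistinguishable from the noise, while at higher S/N it stands out clearly.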
Exercises and questions
At what S/N value do you see the signal begin to distinguish itself from the noisy background? (In other words, at what point would you begin to think there is more than just noise in the data?)
At what S/N value would you feel comfortable using the data for a quantitative determination of an unknown concentration? Why?