In analytical chemistry, a calibration curve, also known as a standard curve, is a general method for determining the concentration of a substance in an unknown sample by comparing the unknown to a set of standard samples of known concentration. A calibration curve is one approach to the problem of instrument calibration; other approaches mix a known amount of the standard into the unknown itself, giving an internal standard.
The calibration curve is a plot of how the instrumental response, the so-called analytical signal, changes with the concentration of the analyte (the substance to be measured). The operator prepares a series of standards across a range of concentrations near the expected concentration of analyte in the unknown. The concentrations of the standards must lie within the working range of the technique (instrumentation) being used. Analyzing each of these standards with the chosen technique produces a series of measurements. For most analyses, a plot of instrument response vs. concentration shows a linear relationship. The operator can then measure the response of the unknown and, using the calibration curve, interpolate to find the concentration of analyte.
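The workflow above can be sketched with invented numbers: a set of standards is measured, and the unknown's concentration is read off the curve by point-to-point interpolation between the calibration points. All values below are hypothetical and purely illustrative.

```python
import numpy as np

# Hypothetical standards bracketing the expected analyte concentration
# (units are illustrative, e.g. µg/mL).
standard_conc = np.array([1.0, 2.0, 3.0, 4.0, 5.0])

# Instrument response measured for each standard (arbitrary units).
standard_signal = np.array([0.11, 0.20, 0.31, 0.39, 0.50])

# Measure the unknown, then interpolate its concentration from the
# calibration points (np.interp expects the x-values, here the signals,
# to be increasing).
unknown_signal = 0.27
unknown_conc = np.interp(unknown_signal, standard_signal, standard_conc)
```

With these numbers the unknown's signal falls between the second and third standards, so the interpolated concentration lies between 2.0 and 3.0.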
In more general use, a calibration curve is a curve or table for a measuring instrument which measures some parameter indirectly, giving values for the desired quantity as a function of values of sensor output. For example, a calibration curve can be made for a particular pressure transducer to determine applied pressure from transducer output (a voltage). Such a curve is typically used when an instrument uses a sensor whose calibration varies from one sample to another, or changes with time or use; if sensor output is consistent the instrument would be marked directly in terms of the measured unit.
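In this more general sense, a calibration curve can be stored as a lookup table for a particular sensor and evaluated by piecewise-linear interpolation. The voltages and pressures below are invented for the sketch and do not correspond to any real transducer.

```python
import numpy as np

# Hypothetical calibration table for one pressure transducer:
# sensor output (V) recorded at known applied pressures (kPa).
cal_voltage = np.array([0.10, 0.52, 0.98, 1.45, 1.93])
cal_pressure = np.array([0.0, 25.0, 50.0, 75.0, 100.0])

def pressure_from_voltage(v):
    """Convert a transducer reading (V) to applied pressure (kPa)
    by piecewise-linear interpolation of the calibration table."""
    return float(np.interp(v, cal_voltage, cal_pressure))
```

Because the calibration varies from one transducer to another, each unit would carry its own table; a sensor with stable, consistent output could instead be marked directly in pressure units.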
The data (the concentration of the analyte and the instrument response for each standard) can be fitted to a straight line using linear regression analysis. This yields a model described by the equation y = mx + y0, where y is the instrument response, m represents the sensitivity (the slope), and y0 is a constant that describes the background (the intercept). The analyte concentration x of an unknown sample can then be calculated by inverting this equation: x = (y − y0)/m.
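This fit-and-invert procedure can be sketched with a least-squares fit. The concentrations and signals below are hypothetical example data, not measurements from any real instrument.

```python
import numpy as np

# Hypothetical calibration data: standard concentrations (e.g. mg/L)
# and the corresponding instrument responses (arbitrary units).
conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
signal = np.array([0.05, 1.02, 2.01, 2.98, 4.05, 4.99])

# Least-squares fit of the model signal = m * conc + y0.
m, y0 = np.polyfit(conc, signal, deg=1)

# Invert the model to find the concentration of an unknown sample
# from its measured signal: x = (y - y0) / m.
unknown_signal = 2.45
unknown_conc = (unknown_signal - y0) / m
```

Here m is the fitted sensitivity and y0 the fitted background; with these numbers the slope is close to 0.5, so a signal of 2.45 corresponds to a concentration near the middle of the calibrated range.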