The Sensitivity of a Hach method is defined as the change in concentration for a 0.010 change in absorbance.
A sensitivity value of 0.009 mg/L indicates greater sensitivity than a value of 0.022 mg/L, because the method can detect a smaller change in concentration for the same 0.010 change in absorbance.
For Hach methods using a spectrophotometer or colorimeter, the sensitivity value is derived from the calibration curve, plotted with absorbance on the x-axis and concentration on the y-axis.
- If the calibration is a line, the sensitivity is the slope of the line multiplied by 0.010.
- If the calibration is a curve, the sensitivity is the slope of the tangent line to the curve at the concentration of interest multiplied by 0.010.
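The two cases above can be sketched numerically. The calibration data, absorbance values, and fit degrees below are hypothetical, chosen only to illustrate the slope-times-0.010 calculation; they do not come from any actual Hach method.

```python
import numpy as np

# Hypothetical calibration data: absorbance (x) vs. concentration in mg/L (y).
absorbance = np.array([0.0, 0.1, 0.2, 0.3, 0.4])

# Case 1: linear calibration. Sensitivity = slope of the fitted line * 0.010.
conc_linear = np.array([0.0, 0.55, 1.10, 1.65, 2.20])  # mg/L
slope, intercept = np.polyfit(absorbance, conc_linear, 1)
sensitivity_linear = slope * 0.010
print(round(sensitivity_linear, 4))  # slope 5.5 mg/L per absorbance unit -> 0.055

# Case 2: curved calibration. Fit a polynomial, then take the slope of the
# tangent line (the derivative) at the absorbance corresponding to the
# concentration of interest, and multiply by 0.010.
conc_curved = np.array([0.0, 0.52, 1.08, 1.68, 2.32])  # mg/L
coeffs = np.polyfit(absorbance, conc_curved, 2)
abs_of_interest = 0.2  # hypothetical reading near the middle of the range
tangent_slope = np.polyval(np.polyder(coeffs), abs_of_interest)
sensitivity_curved = tangent_slope * 0.010
print(round(sensitivity_curved, 4))  # tangent slope 5.8 -> 0.058
```

Note that for the curved case the sensitivity is not a single number: it changes along the curve, so it must be evaluated near the reading of interest.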
The sensitivity value is often used as the Estimated Detection Limit (EDL).
For more information on detection limits, see the following article: