The *standard deviation* is a measure of how varied the values in a data set are. Standard deviation is calculated as the square root of the variance.
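The definition above can be sketched directly in code. This is a minimal example (the function name and sample data are illustrative, not from the source) computing the population standard deviation as the square root of the variance:

```python
import math

def std_dev(values):
    """Population standard deviation: the square root of the variance."""
    mean = sum(values) / len(values)
    # Variance is the average squared distance from the mean.
    variance = sum((x - mean) ** 2 for x in values) / len(values)
    return math.sqrt(variance)

print(std_dev([2, 4, 4, 4, 5, 5, 7, 9]))  # 2.0
```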

Standard deviation is often used as a boundary value in statistical analysis, where the phrase ‘within 1 (or 1.5, 2, etc.) standard deviations’ means that a given value lies within +/- (N × standard deviation) of the central point of the data.

### Example

Consider a data set with a mean of 100 and a standard deviation of 12.5. The following range checks apply:

- 105 is within 1 standard deviation of the mean –
**|100 – 105| < 12.5**
- 80 is within 2 standard deviations of the mean –
**|100 – 80| < 25**
- 80 is not within 1.5 standard deviations of the mean –
**|100 – 80| > (12.5 × 1.5)**
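The three range checks above can be expressed with a small helper function (a sketch; the function name is illustrative, not from the source):

```python
def within_n_std(value, mean, std_dev, n):
    """True if value lies within n standard deviations of the mean."""
    return abs(mean - value) <= n * std_dev

mean, sd = 100, 12.5
print(within_n_std(105, mean, sd, 1))    # True:  |100 - 105| = 5  <= 12.5
print(within_n_std(80, mean, sd, 2))     # True:  |100 - 80| = 20 <= 25
print(within_n_std(80, mean, sd, 1.5))   # False: |100 - 80| = 20 >  18.75
```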

Note that the number of standard deviations does not need to be a whole number, as demonstrated in the 3rd example.