What symbol is used for SD?

The symbol used for standard deviation (SD) is the Greek letter sigma (σ). Standard deviation is a measure of dispersion, or how spread out a set of data values is from the mean. It indicates how much variation there is in the data set.

Quick Answer

The Greek letter sigma (σ) is universally used to represent standard deviation in mathematical and statistical contexts.

What Does Standard Deviation Tell You?

Standard deviation measures how spread out the values in a data set are from the mean. A low standard deviation indicates that the values are clustered closely around the mean, while a high standard deviation indicates the values are dispersed widely from the mean.

For example, let’s say you have two sets of exam scores:

  • Set 1: 82, 78, 75, 81, 84
  • Set 2: 94, 63, 87, 100, 56

Set 1 has a mean of 80 and a sample standard deviation of 3.54. This low standard deviation tells us the scores are all clustered closely around 80.

Set 2 has the same mean of 80 but a much higher standard deviation of 19.43. This indicates there is a wider spread of scores in this data set.

So in summary, a low standard deviation indicates the data points are close to the mean, while a high standard deviation indicates the data are spread farther from the mean.
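Figures like these can be verified with Python's built-in statistics module. Note that stdev applies the sample formula (dividing by n − 1), while pstdev applies the population formula (dividing by n):

```python
import statistics

# Exam scores from Set 1 above
set1 = [82, 78, 75, 81, 84]

mean = statistics.mean(set1)     # arithmetic mean
s = statistics.stdev(set1)       # sample standard deviation (divides by n - 1)
sigma = statistics.pstdev(set1)  # population standard deviation (divides by n)

print(f"mean = {mean}")                # 80
print(f"sample SD = {s:.2f}")          # 3.54
print(f"population SD = {sigma:.2f}")  # 3.16
```

Which of the two values is "the" standard deviation depends on whether the five scores are treated as a complete population or as a sample from a larger one.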

Using Sigma for Standard Deviation

The Greek letter sigma (σ) is used to represent standard deviation because sigma is the Greek counterpart of the Latin letter s, the first letter of "standard deviation." Karl Pearson, who coined the term in the 1890s, adopted σ as its symbol, and the notation stuck.

Some key facts about using sigma for standard deviation:

  • Lowercase σ (sigma) represents the population standard deviation, while the Latin letter s represents the sample standard deviation.
  • Sigma can take a subscript to identify the data set: σX would represent the standard deviation of set X.
  • In reports and papers, the abbreviation SD is also common. For example: SD = 3.2.

So in practice you will generally see the population standard deviation written as σ, either on its own or with a subscript identifying the data set, and the sample standard deviation written as s or SD.

Standard Deviation Formula

The formula for calculating sample standard deviation is:

s = √[ Σ(xᵢ − x̄)² / (n − 1) ]

Where:

  • s = Sample standard deviation
  • Σ = Sum of
  • xᵢ = Each value in the data set
  • x̄ = Mean of the data set
  • n = Total number of values in the data set

This formula takes each value, subtracts the mean, squares the result, sums the squares, divides by n − 1, and finally takes the square root, yielding the sample standard deviation s. (The population standard deviation σ follows the same steps but divides by n instead of n − 1.)
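These steps translate directly into code. The sketch below implements the sample formula by hand and checks it against Python's built-in statistics.stdev:

```python
import math
import statistics

def sample_sd(values):
    """Sample standard deviation: sqrt of the sum of squared deviations over n - 1."""
    n = len(values)
    x_bar = sum(values) / n                            # mean of the data set
    squared_devs = [(x - x_bar) ** 2 for x in values]  # (xi - x_bar)^2 for each value
    return math.sqrt(sum(squared_devs) / (n - 1))      # divide by n - 1, take the root

data = [82, 78, 75, 81, 84]
print(sample_sd(data))         # ~3.5355
print(statistics.stdev(data))  # same result
```

The two printed values agree, confirming that statistics.stdev uses the n − 1 (sample) formula.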

Examples of Sigma Used for Standard Deviation

Here are some examples of sigma being used to represent standard deviation:

  • The standard deviation of exam scores for Mrs. Johnson’s class is σexam = 7.2.
  • The mean height of plants receiving fertilizer A is 62 cm, with a standard deviation of sA = 3.1 cm.
  • Student test scores improved from SD = 12.7 to SD = 9.4 after the new study program was implemented.

Statistical software also reports the standard deviation, typically labeled "std", "SD", or "Std. Dev." in its summary output.
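The labels below are illustrative (a minimal sketch, not the output of any particular statistics package), but they show the typical shape of such a summary:

```python
import statistics

scores = [82, 78, 75, 81, 84]

# Print a summary table in the style of common statistical software
print(f"{'N':>10} {'Mean':>10} {'Std. Dev.':>10}")
print(f"{len(scores):>10} {statistics.mean(scores):>10.2f} {statistics.stdev(scores):>10.2f}")
```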

So in formulas, software output, and statistical analyses alike, sigma (or s for samples) is the standard way to denote the standard deviation.

Why Sigma is Used

There are a few key reasons why sigma became the standard symbol used to represent standard deviation:

  • Alphabet – Sigma is the Greek counterpart of the Latin letter s, the first letter of "standard deviation."
  • Distinction from other statistics – Using sigma helps distinguish the standard deviation from other statistical values like the mean (typically denoted μ) and the variance (typically denoted σ²).
  • Convention – Karl Pearson, who coined the term in the 1890s, used sigma for it, and the convention stuck over time.

So in summary, sigma's link to the letter s, its clear distinction from other statistical symbols, and its long conventional use made it the standard choice to represent standard deviation in statistical contexts.

Similar Symbols and Notations

While sigma σ is by far the most common notation, there are a few other notations you may encounter for standard deviation:

  • Standard deviation – Writing out the words in full, sometimes abbreviated St. Dev.
  • SD – Using just the capital letters SD, typically followed by the data label in subscript. For example: SDtest1
  • s – Lowercase Latin s for the sample standard deviation, following the same conventions as sigma: sX = 4.5.

But the lowercase Greek sigma σ remains the most universally used and accepted notation for the population standard deviation in statistical applications, with s as its sample counterpart.

Confusion With Other Symbols

Because sigma resembles some other mathematical symbols, it can occasionally be confused with:

  • Summation sign – The uppercase Greek sigma Σ denotes summation, as in the Σ(xᵢ − x̄)² term of the standard deviation formula, so it should not be read as a standard deviation itself.
  • Variance – σ² denotes the variance; without the exponent, σ is the standard deviation.
  • s – The Latin letter s denotes the sample standard deviation specifically, not the population value σ.

In context, however, the standard deviation σ is almost always clearly distinguished from these other notations.

Origin of the Standard Deviation Concept

While the sigma σ notation became the standard later on, the origins of the standard deviation concept long predate the use of the Greek letter.

Some key developments were:

  • 1805 – Legendre published the method of least squares, built on sums of squared deviations.
  • 1809 – Gauss developed the normal law of errors, measuring dispersion with the "mean error."
  • 1893 – Karl Pearson introduced the term "standard deviation," replacing older names such as "error of mean square."

So by the end of the 19th century, the foundation for standard deviation as we know it was established, and Pearson's σ notation spread along with the term itself.

Standard Deviation vs. Variance

Standard deviation is closely related to variance, another measure of dispersion. While expressed differently, the formulas for variance and standard deviation are directly linked:

  • Variance = sigma squared (σ²)
  • Standard deviation = square root of the variance (σ = √σ²)

Because variance is represented by sigma squared, this further establishes sigma σ as the notation used for standard deviation specifically.
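This relationship is easy to verify numerically; Python's statistics module exposes both the sample variance and the sample standard deviation:

```python
import math
import statistics

data = [62, 65, 59, 70, 64]

var = statistics.variance(data)  # sample variance, s^2
sd = statistics.stdev(data)      # sample standard deviation, s

# The standard deviation is the square root of the variance
print(math.isclose(sd, math.sqrt(var)))  # True
```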

Use in Normal Distribution

Standard deviation is a key component of the normal distribution – one of the most common and useful probability distributions in statistics. The normal distribution forms a symmetric bell curve, with the mean at the center and the standard deviation controlling the spread.

About 68% of values fall within 1 standard deviation of the mean, about 95% within 2 standard deviations, and about 99.7% within 3 standard deviations (the 68–95–99.7 rule). So the standard deviation helps define the shape and properties of this distribution.
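This rule can be checked empirically by simulation. The sketch below draws values from a standard normal distribution (mean 0, SD 1) with Python's random module and counts how many land within 1, 2, and 3 standard deviations:

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Draw many values from a normal distribution with mean 0 and SD 1
samples = [random.gauss(0, 1) for _ in range(100_000)]

# Fraction of samples within k standard deviations of the mean;
# expect roughly 68%, 95%, and 99.7%
for k in (1, 2, 3):
    within = sum(1 for x in samples if abs(x) <= k) / len(samples)
    print(f"within {k} SD: {within:.1%}")
```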

Applications of Standard Deviation

Standard deviation has many applications in statistics and other domains including:

  • Measuring variability and dispersion in a dataset
  • Identifying outliers in a distribution
  • Statistical quality control and processes
  • Forecasting and predicting variability
  • Stock market volatility analysis
  • Weather pattern analysis
  • Error analysis in measurements

In all these cases, standard deviation provides a concise quantitative measure of how dispersed the data is around the mean value.
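As one sketch of the outlier use case, a common rule of thumb (an assumption here, not a universal standard) flags values more than k sample standard deviations from the mean:

```python
import statistics

def flag_outliers(values, k=2):
    """Return the values lying more than k sample standard deviations from the mean."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [x for x in values if abs(x - mean) > k * sd]

data = [10, 12, 11, 13, 12, 11, 40]  # 40 is far from the rest
print(flag_outliers(data))           # → [40]
```

Tightening the threshold to k = 3 flags only the most extreme values; the right cutoff depends on the data and the application.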

Conclusion

In summary, the Greek letter sigma σ is universally used to represent standard deviation in statistics and other technical fields. The notation took hold because sigma is the Greek counterpart of the letter s in "standard deviation," and because Karl Pearson, who coined the term, used it from the start.

While other symbols like s or the written-out words are also seen, sigma remains the standard notation when writing out formulas, conducting statistical analyses, or referring to standard deviation in technical writing. The origins of the concept trace back to work on least squares and the theory of errors in the 19th century, leading to the formal definition and sigma notation in the 1890s.

Standard deviation complements the closely related concept of variance, and plays a key role in probability distributions like the normal distribution. It has widespread applications in science, social science, finance, and many other fields. So the next time you see σ, know that it stands for the very important statistical value of standard deviation.