
Here's What Statisticians Mean By A Standard Deviation

Dec 2, 2014, 23:47 IST

Standard deviation is a concept that's thrown around frequently in finance.


So, what is it really?

It's More Than Average

When working with a quantitative data set, one of the first things we want to know is what the "typical" element of the set looks like, or where the middle of the set is. We do this by finding a mean or a median, or some other related measure of average.

But knowing the middle of the set doesn't tell us everything. We also want to know more about the overall shape of our data.

Standard deviation is a measure of how spread out a data set is. It's used in a huge number of applications. In finance, standard deviations of price data are frequently used as a measure of volatility. In opinion polling, standard deviations are a key part of calculating margins of error.


First, let's look at what a standard deviation is measuring. Consider two small businesses with four employees each. In one business, two employees make $19 per hour and the other two make $21 per hour. In the second business, two employees make $15 per hour, one makes $24, and the last makes $26:

(Chart: Business Insider/Andy Kiersz)

In both companies, the average wage is $20 per hour, but the distribution of hourly wages is clearly different. In company A, all four employees' wages are tightly bunched around that average, while at company B, there's a big spread between the two employees making $15 and the other two making $24 and $26.

Standard deviation is a measure of how far away individual measurements tend to be from the mean value of a data set. The standard deviation of company A's wages is $1 per hour, while the standard deviation of company B's wages is about $5 per hour. In general, the larger the standard deviation of a data set, the more spread out the individual points are in that set.
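Both figures are easy to check with Python's standard library, whose `statistics.pstdev` function computes this (population) standard deviation:

```python
# Checking the article's numbers: both companies average $20/hour,
# but their spreads differ widely.
import statistics

company_a = [19, 19, 21, 21]
company_b = [15, 15, 24, 26]

print(statistics.mean(company_a), statistics.pstdev(company_a))            # 20 1.0
print(statistics.mean(company_b), round(statistics.pstdev(company_b), 2))  # 20 5.05
```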

Technically, It's More Complicated

The technical definition of standard deviation is somewhat complicated. First, for each data value, find out how far the value is from the mean by taking the difference of the value and the mean. Then, square all of those differences. Then, take the average of those squared differences. Finally, take the square root of that average.
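Those four steps translate line for line into code. Here is a sketch in Python, using company B's wages from the example above:

```python
# Standard deviation computed exactly as the definition describes,
# using company B's hourly wages from the example above.
values = [15, 15, 24, 26]

mean = sum(values) / len(values)           # the mean: 20.0
diffs = [x - mean for x in values]         # step 1: difference from the mean
squares = [d ** 2 for d in diffs]          # step 2: square each difference
variance = sum(squares) / len(squares)     # step 3: average the squares
std_dev = variance ** 0.5                  # step 4: take the square root

print(round(std_dev, 2))  # 5.05
```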

The reason we go through such a complicated process to define standard deviation is that this measure appears as a parameter in a number of statistical and probabilistic formulas, most notably the normal distribution.


(Chart: economicshelp.org)

The normal distribution is an extremely important tool in statistics. The shape of a normal distribution is a bell-shaped curve, like the one in the image.

That curve shows, roughly speaking, how likely a random process following a normal distribution is to take on a particular value along the horizontal axis.

Values near the peak, where the curve is highest, are more likely than values further away, where the curve is closer to the horizontal axis.

Normal distributions appear in situations where there are a large number of independent but similar random events occurring. Things like heights of people in a particular population tend to roughly follow a normal distribution.


Standard deviations are important here because the shape of a normal curve is determined by its mean and standard deviation. The mean tells you where the middle, highest part of the curve should go. The standard deviation tells you how skinny or wide the curve will be. If you know these two numbers, you know everything you need to know about the shape of your curve.
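The claim can be seen directly in the normal curve's formula, which involves only the mean (usually written μ) and the standard deviation (σ). A minimal sketch:

```python
# The normal curve's height at x depends only on the mean (mu) and
# standard deviation (sigma), so those two numbers fix the whole shape.
import math

def normal_pdf(x, mu, sigma):
    coeff = 1 / (sigma * math.sqrt(2 * math.pi))
    return coeff * math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

# The peak sits at the mean; doubling sigma halves the peak and widens the curve.
print(round(normal_pdf(100, 100, 10), 4))  # 0.0399
print(round(normal_pdf(100, 100, 20), 4))  # 0.0199
```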

Flipping this idea around, normal distributions also give us a good way to interpret standard deviations. In any normal distribution, there are fixed probabilities for intervals around the mean based on multiples of the standard deviation of the distribution.

In particular, about 2/3 of measurements of a normally distributed quantity should fall within one standard deviation of the mean, 95% of measurements within two standard deviations of the mean, and 99.7% within three standard deviations of the mean. This illustration of the normal curve lists these values.
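Those fixed probabilities come straight from the normal distribution's cumulative distribution function. In Python they can be computed in closed form with `math.erf`:

```python
# The probability that a normally distributed value lands within k
# standard deviations of its mean, via the Gauss error function.
import math

def prob_within(k):
    return math.erf(k / math.sqrt(2))

for k in (1, 2, 3):
    print(f"within {k} SD: {prob_within(k):.2%}")
# within 1 SD: 68.27%
# within 2 SD: 95.45%
# within 3 SD: 99.73%
```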

Suppose there's a standardized test that hundreds of thousands of students take. If the test's questions are well designed, the students' scores should be roughly normally distributed. Say the mean score on the test is 100, with a standard deviation of 10 points. The rule mentioned above means that about 2/3 of the students should have scores between 90 and 110, 95% of students should score between 80 and 120, and nearly all the students, 99.7%, should have scores between 70 and 130, within three standard deviations of the mean.
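As an illustrative sanity check on this example (the simulation is not from the article), we can draw 100,000 scores from a normal distribution with mean 100 and standard deviation 10 and count how many land in each band:

```python
# Simulating the test-score example: random scores from a normal
# distribution with mean 100 and standard deviation 10.
import random

random.seed(0)  # fixed seed so the run is reproducible
scores = [random.gauss(100, 10) for _ in range(100_000)]

for k in (1, 2, 3):
    low, high = 100 - 10 * k, 100 + 10 * k
    share = sum(1 for s in scores if low <= s <= high) / len(scores)
    print(f"within {k} standard deviation(s) ({low}-{high}): {share:.1%}")
```

The printed shares should come out close to 68%, 95%, and 99.7%, up to sampling noise.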


Any questions?
