What Are Standard Deviation Percentiles?
Standard deviation percentiles are used to determine the percentage of occurrences that fall above or below an average. In statistical analysis, the average of all numerical scores or occurrences is known as the mean. Since not all of the collected data will equal the mean, the standard deviation reflects how far most of that data tends to sit from the average. In a normal distribution, half of the occurrences fall below the data set's average and half fall above it.
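As a concrete illustration, the short Python sketch below computes the mean and standard deviation of a small set of exam scores (the numbers are purely hypothetical) and confirms that, under a normal model, the mean itself sits at the 50th percentile.

```python
import statistics

# Hypothetical exam scores, used only for illustration
scores = [62, 70, 75, 68, 81, 59, 73, 77, 66, 69]

mean = statistics.mean(scores)       # the average score
std_dev = statistics.stdev(scores)   # sample standard deviation

# In a normal distribution, the mean sits at the 50th percentile:
# half of the occurrences fall below it and half fall above it.
dist = statistics.NormalDist(mu=mean, sigma=std_dev)
print(f"mean = {mean:.1f}, standard deviation = {std_dev:.1f}")
print(f"percentile of the mean: {dist.cdf(mean) * 100:.0f}%")  # 50%
```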
One useful way to think of standard deviation percentiles is as the number of occurrences that fall within a range of numerical scores. For example, consider the final exam scores earned by a group of college students in an economics course. The mean represents the average score and, in most cases, corresponds to the 50th percentile. Test scores that fall within one or two standard deviations of the mean are usually assigned different percentiles, higher or lower depending on which side of the mean they land.
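The usual way to map a score to a percentile is through its z-score, the number of standard deviations it sits from the mean. Below is a minimal sketch that assumes a hypothetical class with a mean of 70 and a standard deviation of 11, the figures implied by the exam example used throughout this article.

```python
from statistics import NormalDist

def percentile_of(score, mean, std_dev):
    """Percentile of a score under a normal distribution with the given
    mean and standard deviation (z-score passed through the normal CDF)."""
    z = (score - mean) / std_dev          # standard deviations from the mean
    return NormalDist().cdf(z) * 100

# Hypothetical exam statistics: mean 70, standard deviation 11
print(percentile_of(70, 70, 11))   # 50.0 -> the mean itself
print(percentile_of(81, 70, 11))   # ~84  -> one standard deviation above
print(percentile_of(59, 70, 11))   # ~16  -> one standard deviation below
```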
Standard deviation percentiles that fall below the mean in a normal distribution are less than the 50th; those that deviate higher, to the right of the mean, are greater than the 50th. For instance, if the average exam score is 70, scores in the range of 71 to 81 might be assigned to roughly the 75th percentile, while scores between 59 and 69 would most likely sit near the 25th percentile.
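Using the same hypothetical figures (mean 70, standard deviation 11), the sketch below shows which percentiles each of those score ranges spans under a normal model, and roughly what share of the class falls inside each range.

```python
from statistics import NormalDist

# Hypothetical class: mean 70, standard deviation 11, so one standard
# deviation spans the scores 59 through 81.
dist = NormalDist(mu=70, sigma=11)

for low, high in [(71, 81), (59, 69)]:
    lo_pct = dist.cdf(low) * 100     # percentile at the bottom of the range
    hi_pct = dist.cdf(high) * 100    # percentile at the top of the range
    share = hi_pct - lo_pct          # share of the class inside the range
    print(f"scores {low}-{high}: roughly the {lo_pct:.0f}th to "
          f"{hi_pct:.0f}th percentile, about {share:.0f}% of the class")
```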
Graphical displays of standard deviation percentiles are often used to judge the significance of a particular score. Individuals can use average salary statistics, for instance, to see whether a particular income is significantly higher or lower than the average. A salary at the 90th percentile of a normal distribution means the individual earns more than 90 percent of his or her peers. Standard deviation percentiles can also be grouped into spreads, or ranges, around the data set's average.
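Going the other direction, a percentile can be converted back into a dollar figure with the inverse of the normal CDF. The sketch below assumes a purely hypothetical salary distribution with a mean of $55,000 and a standard deviation of $9,000.

```python
from statistics import NormalDist

# Hypothetical salary distribution: mean $55,000, standard deviation $9,000
salaries = NormalDist(mu=55_000, sigma=9_000)

# The salary at the 90th percentile: higher than what 90% of peers earn
ninetieth = salaries.inv_cdf(0.90)
print(f"90th-percentile salary: ${ninetieth:,.0f}")   # roughly $66,500
```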
Using standard deviation percentiles, someone can easily determine whether a numerical score is extremely high or low. In a class where exam scores between 59 and 81 fall within one standard deviation of the average, roughly 68 percent of the students will most likely score somewhere between 59 and 81, since a normal distribution places about 68 percent of its values within one standard deviation of the mean. Scores below 59 or above 81 lie more than one standard deviation from the average, and may fall two or three standard deviations away.
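To check how unusual an individual score is, one can compute its distance from the mean in standard deviations and the share of a normal population that lies closer to the mean than it does. A minimal sketch, again assuming the hypothetical mean of 70 and standard deviation of 11:

```python
from statistics import NormalDist

def how_extreme(score, mean, std_dev):
    """Distance of a score from the mean in standard deviations, plus the
    share of a normal population that falls closer to the mean than it does."""
    z = abs(score - mean) / std_dev
    within = (NormalDist().cdf(z) - NormalDist().cdf(-z)) * 100
    return z, within

# Hypothetical class: mean 70, standard deviation 11
for score in (75, 48, 95):
    z, within = how_extreme(score, 70, 11)
    print(f"score {score}: {z:.1f} standard deviations out; "
          f"about {within:.0f}% of scores fall closer to the mean")
```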