FAQ

What is sigma (σ) and what does it mean?

σ ("sigma") is a single number that characterizes precision. In statistics σ represents standard deviation, which is a measure of dispersion, and which is a parameter for the normal distribution.

[Figure: Distribution of samples from a bivariate standard normal distribution (σ = 1)]

The most convenient statistical model for shooting precision uses a bivariate normal distribution to characterize the point of impact of shots on a target. In this model the same σ that characterizes the dispersion along each axis is also the parameter for the Rayleigh distribution, which describes how far we expect shots to fall from the center of impact on a target.
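To make the model concrete, here is a minimal simulation sketch in Python with NumPy (illustrative, not part of the original article): shots drawn from a bivariate normal distribution with dispersion σ on each axis have radial distances from center that follow a Rayleigh distribution with that same σ.

 import numpy as np
 
 rng = np.random.default_rng(0)
 sigma = 0.5                       # dispersion along each axis (arbitrary angular units)
 
 # Simulate shots: independent normal errors on each axis (bivariate normal).
 x = rng.normal(0.0, sigma, 100_000)
 y = rng.normal(0.0, sigma, 100_000)
 r = np.hypot(x, y)                # radial distance of each shot from center of impact
 
 # Under the Rayleigh model the mean radius is sigma * sqrt(pi/2).
 print(f"simulated mean radius: {r.mean():.4f}")
 print(f"Rayleigh prediction:   {sigma * np.sqrt(np.pi / 2):.4f}")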

Shooting precision is described using angular units, so typical values of σ are things like 0.1 mil or 0.5 MOA.

With respect to shooting precision the meaning of σ is analogous to the "68-95-99.7 rule" for standard deviation: a "39-86-99 rule." That is, we expect 39% of shots to fall within 1σ of the center of impact, 86% within 2σ, and 99% within 3σ. These values come from the Rayleigh distribution, under which the fraction of shots within kσ of center is 1 − exp(−k²/2). Other common values are listed in the following table (and reproduced in the sketch after it):

 Name                            Multiple of σ   Shots Covered
                                 1               39%
 CEP (circular error probable)   1.18            50%
 MR (mean radius)                1.25            54%
                                 2               86%
                                 3               99%
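This short Python sketch (illustrative, not from the article) reproduces the table directly from the Rayleigh coverage formula; note that the coverage fractions depend only on the multiple k, not on the actual value of σ.

 import math
 
 # Fraction of shots within k*sigma of center under the Rayleigh model:
 # coverage(k) = 1 - exp(-k**2 / 2)
 rows = [
     ("",    1.0),
     ("CEP", math.sqrt(2 * math.log(2))),  # median radius: coverage = 50%
     ("MR",  math.sqrt(math.pi / 2)),      # mean radius
     ("",    2.0),
     ("",    3.0),
 ]
 for name, k in rows:
     coverage = 1.0 - math.exp(-k**2 / 2)
     print(f"{name:<4} {k:4.2f}σ -> {coverage:5.1%}")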

So, for example, if σ = 0.5 MOA then 99% of shots should stay within a circle of radius 3σ = 1.5 MOA.

σ also tells us what to expect from other precision measures. For example, the average extreme spread of a five-shot group is about 3σ. So if σ = 0.5 SMOA (where 1 SMOA subtends exactly 1 inch at 100 yards) and we are shooting at a 100-yard target, we would expect the average extreme spread of a 5-shot group to be about 1.5".
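As a check on that rule of thumb, here is a Monte Carlo sketch (illustrative, assuming i.i.d. bivariate-normal shots; not part of the original article) that estimates the average extreme spread of a 5-shot group in units of σ:

 import numpy as np
 
 rng = np.random.default_rng(0)
 sigma, shots_per_group, n_groups = 1.0, 5, 20_000
 
 # Draw all groups at once: shape (n_groups, shots_per_group, 2).
 groups = rng.normal(0.0, sigma, size=(n_groups, shots_per_group, 2))
 
 # Extreme spread of a group = largest pairwise distance between its shots.
 diffs = groups[:, :, None, :] - groups[:, None, :, :]     # pairwise vectors
 spreads = np.sqrt((diffs ** 2).sum(-1)).max(axis=(1, 2))  # max distance per group
 
 print(f"mean extreme spread ~ {spreads.mean():.2f} * sigma")  # roughly 3σ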

How many shots do I need to sight in?

How do I tell whether A is more accurate than B?