User talk:Herb

Herb's talk page

I am interested in modeling the precision of shooting at a target. The general case is to assume that the underlying distribution is the bivariate normal distribution, with h and v being the horizontal and vertical axes of the target.
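For reference, the general bivariate normal density in these coordinates, with means <math>\mu_h, \mu_v</math>, standard deviations <math>\sigma_h, \sigma_v</math>, and correlation <math>\rho</math>, is:

<math>
f(h,v) = \frac{1}{2\pi\sigma_h\sigma_v\sqrt{1-\rho^2}} \exp\!\left(-\frac{1}{2(1-\rho^2)}\left[\frac{(h-\mu_h)^2}{\sigma_h^2} - \frac{2\rho(h-\mu_h)(v-\mu_v)}{\sigma_h\sigma_v} + \frac{(v-\mu_v)^2}{\sigma_v^2}\right]\right)
</math>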

This would allow for elliptical shot groups as well as round ones. However, assuming that there is no correlation between the axes, that the variances along the two axes are equal, and translating the coordinate system to the center of impact (the average shot position along each axis), the distribution of a shot's radial distance from that center reduces to the Rayleigh distribution.
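To make the reduction explicit: with <math>\rho = 0</math>, <math>\sigma_h = \sigma_v = \sigma</math>, and the origin at the center of impact, the density above becomes radially symmetric, and the radial distance of a shot follows the Rayleigh distribution:

<math>
f(r) = \frac{r}{\sigma^2}\exp\!\left(-\frac{r^2}{2\sigma^2}\right), \qquad r = \sqrt{h^2 + v^2} \ge 0
</math>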


As I understand it, the correlation between the two axes is measured using "ordinary" linear least squares. That is, one axis is taken as the independent variable and the residual error is measured perpendicular to that axis; in other words, all of the error is attributed to the dependent axis.
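A minimal sketch of that convention, using hypothetical shot coordinates and NumPy (the data and variable names here are only for illustration):

<pre>
import numpy as np

# Hypothetical shot coordinates (h, v) relative to the point of aim.
rng = np.random.default_rng(1)
h = rng.normal(0.0, 1.0, size=20)
v = rng.normal(0.0, 1.0, size=20)

# Ordinary least squares with h as the independent axis: all residual
# error is measured along v, i.e. perpendicular to the h axis.
slope_ols = np.cov(h, v, ddof=1)[0, 1] / np.var(h, ddof=1)
intercept_ols = v.mean() - slope_ols * h.mean()

# Pearson correlation between the two axes.
r_hv = np.corrcoef(h, v)[0, 1]
print(slope_ols, intercept_ols, r_hv)
</pre>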

Consider instead using the total least squares line. Imagine the shots marked on a transparent sheet with the center of impact at the origin. As the shot pattern is rotated around the origin, the fitted line would stay fixed relative to the shot pattern.
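A sketch of that rotation invariance, again with hypothetical data; here the total least squares line through the center of impact is taken as the major principal axis of the shot pattern:

<pre>
import numpy as np

def tls_angle(h, v):
    """Angle of the total least squares line through the center of impact:
    the leading eigenvector of the covariance matrix of the centered shots."""
    pts = np.column_stack([h - h.mean(), v - v.mean()])
    cov = pts.T @ pts / (len(pts) - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)
    d = eigvecs[:, np.argmax(eigvals)]          # major-axis direction
    return np.arctan2(d[1], d[0])               # angle of the fitted line

# Hypothetical shot pattern with some correlation between the axes.
rng = np.random.default_rng(2)
h = rng.normal(0.0, 1.0, size=30)
v = 0.5 * h + rng.normal(0.0, 0.7, size=30)

# Rotate the "transparent sheet" 25 degrees about the center of impact.
theta = np.deg2rad(25.0)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
hr, vr = R @ np.vstack([h - h.mean(), v - v.mean()])

# The fitted angle changes by exactly the rotation angle (mod 180 degrees),
# i.e. the line stays fixed relative to the shot pattern.
diff = np.rad2deg(tls_angle(hr, vr) - tls_angle(h, v))
print((diff + 90.0) % 180.0 - 90.0)             # ~25
</pre>

By contrast, the ordinary least squares line generally does not stay fixed relative to the pattern under such a rotation, which is the asymmetry the transparent-sheet picture is meant to highlight.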

If total least squares were used to fit the regression line, what would the consequences be for the reduction of the bivariate normal distribution to the Rayleigh distribution?