Got the Targets USA scope tracking device (http://www.targetsusa.com/scope-tools.html) last week and have been playing around with it.
Seems to me if you’re measuring the calibration of the *reticle* you need to measure from the target/ruler to the reticle/turret. That is the tracking error of the turret/reticle as measured on the ruler.
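To make that concrete, here's a quick sketch of the geometry (the small-angle/tangent relation, with made-up example numbers for the turret-to-muzzle offset - not anybody's actual rifle dimensions):

```python
import math

# The subtension seen on the ruler depends on the distance from the *reticle*
# (the pivot of the angular measurement) to the ruler, not from the muzzle.
def expected_subtension_m(dialed_mrad, reticle_to_ruler_m):
    """Linear distance covered on the ruler by a dialed angle, via tan."""
    return reticle_to_ruler_m * math.tan(dialed_mrad / 1000.0)

true_d = 100.0   # meters, reticle/turret to ruler
dialed = 10.0    # mils dialed on the turret

print(expected_subtension_m(dialed, true_d))       # ~1.000 m expected on the ruler

# If you instead measure from the muzzle (say the muzzle sits a hypothetical
# 0.65 m closer to the ruler), your expected subtension is biased low:
muzzle_d = true_d - 0.65
print(expected_subtension_m(dialed, muzzle_d))     # ~0.994 m, a ~0.65% bias
```

Same angle, different baseline, different expected subtension - which is the whole argument for measuring from the reticle/turret.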
I’ve seen others say to measure from the muzzle, and PRB says halfway between the muzzle and the turret:
http://precisionrifleblog.com/2014/0...rmance-part-1/
Now before anybody says that’s splitting hairs - well, of course it is, and that’s the point of checking my calibration. If you only want to exclude errors greater than 5% or something big, then it doesn’t matter. If you want to measure any tracking error as accurately as you can, it does matter. Which is why you need a surveyor’s tape or a Leica DISTO laser with accuracy of inches rather than yards to measure your distance to the ruler. If you don’t believe me, do the calcs with 100 yards and with 100 yards 2 feet and compare the results. That 2 feet is a 0.67% measurement error - the same magnitude as the tracking error I’m trying to measure, and your measurement error can’t be anywhere near the size of the quantity you’re trying to measure. It would be like trying to cut an inch off a board with a ruler that only reads to +/- 11/16”. My PLRF15 advertises 2 meter accuracy - which could introduce >2% error, and that ain’t good enough. But I digress.
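If you want to run the calcs yourself, here's a minimal sketch (assuming the small-angle approximation, so subtension scales linearly with range and the percent error in distance maps directly to percent error in apparent tracking):

```python
# How distance mismeasurement propagates into apparent tracking error.
def range_error_pct(true_range, measured_range):
    """Percent error caused by mismeasuring the distance to the ruler.
    Units just need to match between the two arguments."""
    return abs(measured_range - true_range) / true_range * 100.0

# 100 yd vs 100 yd 2 ft, both in feet: 2 ft off over 300 ft
print(round(range_error_pct(300.0, 302.0), 2))   # 0.67 -> same magnitude as the tracking error

# PLRF15's advertised +/- 2 m, ranging a target at 100 yd (~91.44 m)
print(round(range_error_pct(91.44, 93.44), 2))   # 2.19 -> worse than the error being measured
```

Point being: the 2-foot mistake alone eats the entire error budget, and a 2-meter rangefinder spec is a non-starter at this range.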
My logic on the question is: why the f$&@ would barrel length matter, and if it did, why doesn’t any of the ballistics programs include a dimension called “distance from turret to muzzle”?
It makes no sense to me that the distance from turret to muzzle would have anything to do with scope calibration. Right or wrong?