You're overthinking it. I'm an experimental scientist myself and I can tell you that reality and theory often don't match at the beginning. You want DATA. Our data suggests 0.5 mils / 1.8 MOA at 200 yards with a 100-yard zero. (For instance, your line of sight isn't parallel to the ground, so you are actually shooting slightly "up", i.e. your scope is aimed slightly down. The bullet "rises" a few inches relative to the ground but is actually always falling away from the line of sight. At this level of precision/calculation, details matter.) 1 G of acceleration is 32 feet per second per second, so 0.118 s of fall works out to d = ½gt² ≈ 0.22 ft, or about 2.7 inches of drop (assuming constant acceleration and that terminal velocity hasn't been reached).
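The back-of-the-envelope math above can be sketched in a few lines. This is just the ½gt² free-fall estimate plus the mil conversion (1 mil subtends 3.6 inches per 100 yards); the 0.118 s flight time is the assumed number from the post, not measured data.

```python
G_FTPS2 = 32.17  # gravitational acceleration, ft/s^2

def drop_ft(time_s: float) -> float:
    """Free-fall drop over a given time: d = 1/2 * g * t^2."""
    return 0.5 * G_FTPS2 * time_s ** 2

def drop_in_mils(drop_inches: float, range_yards: float) -> float:
    """Convert a linear drop to an angular correction in mils.
    One mil subtends 3.6 inches per 100 yards of range."""
    return drop_inches / (3.6 * range_yards / 100)

t = 0.118  # assumed time of fall, seconds
d_ft = drop_ft(t)
print(f"{d_ft:.2f} ft = {d_ft * 12:.1f} in")            # ~0.22 ft, ~2.7 in
print(f"{drop_in_mils(d_ft * 12, 200):.2f} mil at 200")  # ~0.37 mil
```

That's gravity alone, with no vacuum-vs-drag correction and no zero-offset geometry, so it only gives the right order of magnitude, which is the point.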
OK, thanks for the advice, appreciated
And there is a huge amount of data saying your drop should be 0.5 mils at 200, or very close (200 yards isn't enough flight time to see variance from BC or MV). Don't try to theorize what it should be. We've measured it. We KNOW what it SHOULD be.
Even at 1000 yards, if you come in and say your dope is 4 mils, we know something is wrong even though the spread in dope is larger at 1000 yards: you should be between 7 and 10 mils. All of our rifles shoot in that range (barring the idiot shooting a 33XC at 3200 fps). You are shooting a standard 6.5 Creedmoor between 2700 and 2900 fps. We know roughly what your dope should be. If it isn't there, there are two explanations: (1) something you are doing is wrong, or (2) all of us are wrong.
Thus I am assuming #1.
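The sanity filter above is trivial to write down. The bands below are illustrative assumptions in the spirit of the post (7 to 10 mils at 1000 for a standard 6.5 Creedmoor, roughly half a mil at 200), not a published table:

```python
# range (yd) -> (min, max) plausible dope in mils -- assumed ballpark values
EXPECTED_MILS = {200: (0.4, 0.7), 1000: (7.0, 10.0)}

def dope_plausible(range_yd: int, dope_mils: float) -> bool:
    """Return True if a dialed correction falls inside the known band."""
    lo, hi = EXPECTED_MILS[range_yd]
    return lo <= dope_mils <= hi

print(dope_plausible(1000, 4.0))  # False: outside the band, recheck your setup
print(dope_plausible(1000, 8.2))  # True: consistent with everyone else's rifles
```

If the check fails, explanation (1) is overwhelmingly the way to bet.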
In God we trust; all others bring data. It's a learning process, but too often I see people get caught up in minutiae. Practice, practice, practice your fundamentals at 100, 200, and 400 yards (or similar) and get your DATA. If your data is outside the trend, something is off.
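A crude "is my data on the trend?" check: dope should increase with range, and the increase per step should grow, since the trajectory steepens as the bullet slows. The numbers here are made up for illustration:

```python
# measured dope in mils at each range (yd), zeroed at 100 -- illustrative data
data = {100: 0.0, 200: 0.5, 400: 1.9}

ranges = sorted(data)
dopes = [data[r] for r in ranges]
steps = [b - a for a, b in zip(dopes, dopes[1:])]

increasing = all(b > a for a, b in zip(dopes, dopes[1:]))   # more range, more dope
steepening = all(b >= a for a, b in zip(steps, steps[1:]))  # curve gets steeper
print("data follows the trend" if increasing and steepening else "recheck your data")
```

It won't catch everything (the step sizes here span different distances), but it flags the gross errors, like dope that shrinks with range.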