Being a simple person, I don't have any problem accepting the "low velocity = more time in the barrel = higher point of impact" hypothesis (Theory #1). But being really simple, I also accept the "higher velocity = flatter trajectory = higher point of impact" theory (Theory #2). I just can't reconcile the two in my mind.
Example: sighting in my .45 Colt. I used slow, cast (250 gr.) bullets, ones I had loaded with a light charge of Unique for my SAAs, to get on paper. At 50 yards they hit really low. Matter of fact, I ran out of elevation getting them off the ground and onto the target.
Thinking of Theory #1, I had just about decided I was going to have to replace the front sight. Hi-ebber, and day always be a hi-ebber, I decided to give my 250 gr. hunting load a try. It carried a healthy charge of 2400.
When I cranked off the first one it almost went OVER the target!
Hence the faster load shot way higher than the slower load, à la Theory #2.
How can you predict who gonna do what?

Seems to me there needs to be a range-to-target factor figured into these theories.
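That range factor is exactly what reconciles the two theories, and a back-of-the-envelope model shows it. Muzzle rise during barrel time (Theory #1) shifts the point of impact by an angle, so its effect grows linearly with range; gravity drop (Theory #2) grows with range squared. Close in, the slow load's extra barrel time wins; farther out, the fast load's flatter trajectory wins. Here is a rough sketch of that trade-off. Every number in it is an illustrative assumption (made-up velocities, a guessed muzzle-rise rate, drop computed in a vacuum with no air drag), not data for any real load:

```python
# Toy model: point of impact = muzzle-rise shift minus gravity drop.
# ALL constants below are assumptions for illustration only.
G = 32.17           # ft/s^2, gravity
BARREL = 7.5 / 12   # ft, assumed barrel length
RISE_RATE = 2.0     # rad/s, assumed muzzle-rise rate during recoil (a guess)

def drop_in(v_fps, range_ft):
    """Gravity drop in inches for a flat-fired bullet (vacuum, no drag)."""
    t_flight = range_ft / v_fps
    return 0.5 * G * t_flight**2 * 12

def rise_in(v_fps, range_ft):
    """POI shift (inches) from the muzzle rising while the bullet is in the bore.
    Barrel time is approximated using an average in-bore velocity of v/2."""
    t_barrel = BARREL / (v_fps / 2)
    return RISE_RATE * t_barrel * range_ft * 12

def poi_in(v_fps, range_ft):
    """Net point of impact relative to the line of bore (inches, + is up)."""
    return rise_in(v_fps, range_ft) - drop_in(v_fps, range_ft)

# Assumed velocities: ~750 fps for the light Unique load, ~1200 fps for 2400.
for r_yd in (15, 25, 50):
    r_ft = r_yd * 3.0
    slow, fast = poi_in(750, r_ft), poi_in(1200, r_ft)
    print(f"{r_yd:>3} yd: slow {slow:+5.1f} in, fast {fast:+5.1f} in")
```

With these made-up numbers the slow load prints higher at 15 yards (Theory #1), the two are nearly even around 25 yards, and the fast load prints higher at 50 yards (Theory #2), which matches the sight-in experience above. The crossover distance depends entirely on the gun and loads, which is why neither theory predicts "who gonna do what" by itself.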