My team is trying to derive the correct equation for the optimal angle for our turret. We derived the equation shown on the graph. The constants are: v = initial velocity of our power cells leaving the shooter (inches/second), g = gravitational constant (inches/second^2), and h = the difference in height between the center of our target and the launcher (inches). In the graph, the angle is on the y-axis and distance is on the x-axis. When close to the target, the angle is close to 90 degrees, which makes sense: our turret has to aim almost directly upwards to reach the target. However, at farther distances, we expected the angle to also increase so the ball can travel farther without hitting the ground. Is there something wrong with the equation? The constants? Can someone please explain what is wrong with the graph?
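For comparison, here is the textbook drag-free launch-angle formula in terms of the same v, g, h, and a distance d; your derivation may differ, and the numbers below are purely hypothetical:

```python
import math

def launch_angle(v, g, h, d, high_arc=False):
    """Drag-free angle (radians) needed to pass through a target at
    horizontal distance d and height h, given exit speed v and gravity g.
    Derived from y(x) = x*tan(theta) - g*x**2 / (2*v**2*cos(theta)**2).
    Returns None if the target is out of range (negative discriminant)."""
    disc = v**4 - g * (g * d**2 + 2 * h * v**2)
    if disc < 0:
        return None  # target unreachable at this exit speed
    root = math.sqrt(disc)
    # '-' root gives the flatter (low) arc, '+' the steeper (high) arc
    num = v**2 + root if high_arc else v**2 - root
    return math.atan2(num, g * d)

# Hypothetical numbers: v = 500 in/s, g = 386 in/s^2,
# target 60 in above the launcher and 120 in away
theta = launch_angle(500, 386, 60, 120)
```

Note that in this idealized (no-drag) model the angle goes to 90 degrees as d approaches 0 and then *decreases* with distance, so a graph that falls at long range is expected behavior for the drag-free equation rather than a bug.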
There are a lot of things your model doesn’t take into account: drag, spin, and the lift that spin creates. The best advice I can offer is to stop looking for a magic equation and start collecting empirical data. Use that data to build a model of how your shooter affects the ball. The more data you have, the better your model will be. I was rather surprised by how much spin and drag affected the ball.
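One common way to turn that empirical data into a model is a distance-to-angle lookup table with linear interpolation between measured shots. A minimal sketch, with entirely made-up measurements:

```python
from bisect import bisect_left

# Hypothetical measured data: (distance to target in inches, hood angle in
# degrees that actually scored) -- replace with your own shooter's numbers.
MEASURED = [(60, 72.0), (90, 64.0), (120, 58.5), (150, 54.0), (180, 51.0)]

def angle_for_distance(d):
    """Linearly interpolate a launch angle from measured shots.
    Clamps to the nearest measurement outside the tested range."""
    xs = [p[0] for p in MEASURED]
    if d <= xs[0]:
        return MEASURED[0][1]
    if d >= xs[-1]:
        return MEASURED[-1][1]
    i = bisect_left(xs, d)
    (x0, y0), (x1, y1) = MEASURED[i - 1], MEASURED[i]
    t = (d - x0) / (x1 - x0)
    return y0 + t * (y1 - y0)
```

The nice property of a lookup table is that drag, spin, and lift are all baked into the measurements, so you never have to model them explicitly; you just need enough data points that interpolation between them is accurate.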
Yes, definitely. We find that even when the ball’s exit velocity is 45 m/s, the actual in-flight velocity drops to around 30 m/s almost immediately, over a distance of just 6 meters.
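A quadratic-drag model reproduces a drop of roughly that size. A rough sanity check, with assumed 2020 power cell parameters (7 in foam ball, ~0.14 kg, Cd ≈ 0.5 — guesses, not measurements from the post):

```python
import math

# Assumed parameters (not from the post)
rho = 1.225             # air density, kg/m^3
Cd = 0.5                # drag coefficient, rough guess for a foam ball
r = 0.178 / 2           # ball radius, m (7 in diameter)
A = math.pi * r**2      # cross-sectional area, m^2
m = 0.142               # mass, kg
k = 0.5 * rho * Cd * A  # drag force = k * v^2

# Integrate speed decay along the flight path: dv/ds = -(k/m) * v,
# which follows from dv/dt = -(k/m) * v**2 and ds = v * dt.
v = 45.0                # exit speed, m/s
s, ds = 0.0, 0.001
while s < 6.0:          # 6 m of path length
    v += -(k / m) * v * ds
    s += ds
# With these guesses, v ends up near 32-33 m/s after 6 m
```

This also has the closed form v(s) = v0 * exp(-k*s/m), so the speed loss compounds quickly with distance, which matches what you're seeing.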
I’m on my phone right now, so I can’t write a ton. Here is a calculator I made a while back that solves a lot of these problems.
As @wilsonmw04 said, the two major factors that your model (or mine) doesn’t include are drag and the Magnus effect. Drag is going to reduce your distance, and the Magnus effect (assuming you have backspin) is going to increase it.
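To see both effects qualitatively, here is a toy 2-D integration comparing the same shot with and without a backspin lift term. The lift is modeled as a force perpendicular to the velocity, and all coefficients (Cd, Cl, ball mass and size) are rough assumptions, not measured values:

```python
import math

def simulate(v0, angle_deg, cl=0.0, cd=0.5):
    """Euler-integrate a 2-D trajectory with quadratic drag and an optional
    Magnus-style lift force perpendicular to velocity (backspin => upward).
    Returns (range_m, flight_time_s). Coefficients are rough guesses."""
    rho, r, m_ball, g = 1.225, 0.089, 0.142, 9.81
    A = math.pi * r**2
    kd = 0.5 * rho * cd * A   # drag magnitude = kd * v^2
    kl = 0.5 * rho * cl * A   # lift magnitude = kl * v^2
    th = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(th), v0 * math.sin(th)
    t, dt = 0.0, 1e-4
    while y >= 0.0:
        v = math.hypot(vx, vy)
        # drag opposes velocity; lift is velocity rotated 90 deg (backspin up)
        ax = (-kd * v * vx - kl * v * vy) / m_ball
        ay = (-kd * v * vy + kl * v * vx) / m_ball - g
        vx += ax * dt; vy += ay * dt
        x += vx * dt;  y += vy * dt
        t += dt
    return x, t

no_spin = simulate(15.0, 45.0, cl=0.0)
backspin = simulate(15.0, 45.0, cl=0.2)
# With these guesses, backspin keeps the ball in the air noticeably longer
```

Even this crude model shows why a pure vacuum-projectile equation misses: drag shortens the shot relative to the ideal parabola, and backspin lift extends the hang time, partially buying that distance back.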