So I'm doing a physics lab where we drop a golf ball from a certain height and measure how long it takes to hit the ground. We dropped the ball three times from each of four different heights, for a total of twelve drops. The purpose is to use the data to figure out the acceleration due to gravity at our school, where we did the tests. Can someone tell me what equation I'm supposed to use (not do the work for me, just tell me how to do it)? Comment if you need the exact numbers.
I would use this equation:
Drop height = 0.5 * g * t^2
You know "t" (time) for each drop, so you can calculate "g". One you have the 12 estimates of "g", you could average them, to see if you get close to 9.81 m/s^2. It might be better to throw out the smallest and largest values, and average the rest.
Use the most basic of all the relationships: distance = average speed X time traveled; S = Vavg T.
We assume A = g = the acceleration due to gravity = constant. Since the ball is dropped from rest, U = 0, so Vavg = V/2 where V = U + AT = U + gT = gT is the impact speed. T is your measured drop time.
So S = Vavg T = (V/2) T = (gT)(T)/2 = 1/2 gT^2, where S is the height you dropped from. And there you are:
g = 2S/T^2 is the gravitational acceleration. You measure the T's and S's, find the average T and average S for each of the four heights, and then solve for g at each height.
Using the average T and S for each height minimizes the effect of measurement errors. You should get four different results for g, but they should agree in value within your margins of error.
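A short Python sketch of that per-height averaging, assuming your twelve times are grouped by height (the numbers here are placeholders, not your measurements):

```python
# Average the three measured times T at each height S, then compute g = 2S / T_avg^2.
# Times below are placeholders; replace them with your own data.

data = {
    1.0: [0.45, 0.46, 0.44],   # height in m : list of drop times in s
    1.5: [0.55, 0.56, 0.54],
    2.0: [0.64, 0.63, 0.65],
    2.5: [0.71, 0.72, 0.70],
}

for height, times in data.items():
    t_avg = sum(times) / len(times)
    g = 2 * height / t_avg**2
    print(f"height {height} m: average T = {t_avg:.3f} s, g = {g:.2f} m/s^2")
```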
You have the basic equation from kinematics {s = ut + ½at² if you use suvat symbols}, which, when u = 0, gives:
h = ½ *g * t²
Rearrange:
g = (2h / t²)
In a lab report, I'd produce a table with columns | height | 2 × height | average drop time | (average drop time)² | and then plot a graph with t² on the y-axis against 2h {the independent variable} on the x-axis.
The slope of the best straight-line fit through the points is then 1/g.
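If you go the graphing route, a least-squares fit does the same job as drawing the best straight line by eye. Here's a sketch using numpy with placeholder data (again, substitute your own averages); the slope of t² against 2h gives 1/g:

```python
import numpy as np

# Placeholder data: drop heights in m and average drop times in s.
heights = np.array([1.0, 1.5, 2.0, 2.5])
avg_times = np.array([0.452, 0.553, 0.639, 0.714])

x = 2 * heights          # 2h on the x-axis
y = avg_times ** 2       # t^2 on the y-axis

# Least-squares straight-line fit: y = slope * x + intercept
slope, intercept = np.polyfit(x, y, 1)

g = 1 / slope            # slope of t^2 vs 2h is 1/g
print(f"slope = {slope:.4f} s^2/m, so g = {g:.2f} m/s^2")
```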