A person drove 120 miles at 40 mph, then drove back the same 120 miles at 60 mph. What was their average speed?
The average of the speeds is $\frac{40\text{ mph} + 60\text{ mph}}{2} = 50\text{ mph}$, so the total trip time should be, by the definition of average speed, $\frac{240\text{ mi}}{50\text{ mph}} = 4.8$ hours.
However, this is wrong, because the trip actually took 3+2=5 hours.
What did I do wrong, and what is the correct way to calculate the average speed?
Answer
The reason is that the times taken for the two trips are different, so the average speed is not simply $\frac{v_1 + v_2}{2}$.
We should go back to the definition. The average speed is always (total distance) ÷ (total time). In your case, the total time can be calculated as
$$\text{time}_1 = \frac{120\text{ miles}}{40\text{ mph}}, \qquad \text{time}_2 = \frac{120\text{ miles}}{60\text{ mph}},$$
so the total time is $120\text{ miles} \times \left(\frac{1}{40\text{ mph}} + \frac{1}{60\text{ mph}}\right)$. The average speed is therefore:
$$\text{average speed} = \frac{2 \times 120\text{ miles}}{120\text{ miles} \times \left(\frac{1}{40\text{ mph}} + \frac{1}{60\text{ mph}}\right)} = \frac{2}{\frac{1}{40\text{ mph}} + \frac{1}{60\text{ mph}}} = 48\text{ mph}.$$
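As a quick numerical check, here is a minimal Python sketch that computes the trip times and the average speed directly from the definition (the variable names are my own, not from the problem):

```python
# Sanity check: average speed = total distance / total time.
distance = 120.0        # miles each way, from the problem
v1, v2 = 40.0, 60.0     # outbound and return speeds in mph

time1 = distance / v1   # 3.0 hours
time2 = distance / v2   # 2.0 hours

average_speed = 2 * distance / (time1 + time2)
print(average_speed)    # 48.0, not 50.0
```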
In general, when the lengths of the trips are the same, the average speed is the harmonic mean of the respective speeds:
$$\text{average speed} = \frac{2}{\frac{1}{v_1} + \frac{1}{v_2}}$$
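A small sketch of this harmonic-mean formula, generalized to $n$ equal-length legs (the function name and signature are illustrative, not standard):

```python
def average_speed(speeds):
    """Harmonic mean of the leg speeds; valid only when every leg covers the same distance."""
    return len(speeds) / sum(1.0 / v for v in speeds)

print(average_speed([40, 60]))       # 48.0, as above
print(average_speed([40, 60, 120]))  # 60.0 for three equal-length legs
```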