A person drove 120 miles at 40 mph, then drove back the same 120 miles at 60 mph. What was their average speed?
The average of the speeds is $$\frac{40\ \text{mph} +60\ \text{mph}}{2} = 50\ \text{mph}$$ so the total trip time should be, by the definition of average speed, $$\frac{240\ \text{mi}}{50\ \text{mph}} = 4.8 \ \text{hours}.$$
However, this is wrong, because the trip actually took $3 + 2 = 5$ hours.
What did I do wrong, and what is the correct way to calculate the average speed?
Answer
The reason is that the times taken for the two legs are different, so the average speed is not simply $\frac{v_1 + v_2}{2}$.
We should go back to the definition. The average speed is always (total distance) ÷ (total time). In your case, the two times are
\begin{align} \text{time}_1 &= \frac{120\ \mathrm{miles}}{40\ \mathrm{mph}} = 3\ \mathrm{hours} \\\\ \text{time}_2 &= \frac{120\ \mathrm{miles}}{60\ \mathrm{mph}} = 2\ \mathrm{hours} \end{align}
so the total time is $120\ \mathrm{miles} \times \left(\frac1{40\ \mathrm{mph}} + \frac1{60\ \mathrm{mph}}\right)$. The average speed is therefore:
\begin{align} \text{average speed} &= \frac{2 \times 120\ \mathrm{miles}}{120\ \mathrm{miles} \times \left(\frac1{40\ \mathrm{mph}} + \frac1{60\ \mathrm{mph}}\right)} \\\\ &= \frac{2}{\frac1{40\ \mathrm{mph}} + \frac1{60\ \mathrm{mph}}} \\\\ &= 48\ \mathrm{mph} \end{align}
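If you want to sanity-check the arithmetic, a few lines of Python reproduce the computation directly from the definition (the variable names here are just for illustration):

```python
distance = 120.0            # miles, each way
v1, v2 = 40.0, 60.0         # mph

time1 = distance / v1       # 3.0 hours
time2 = distance / v2       # 2.0 hours

# average speed = total distance / total time
average_speed = 2 * distance / (time1 + time2)
print(average_speed)        # 48.0, not 50
```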
In general, when the lengths of the trips are the same, the average speed is the harmonic mean of the respective speeds:
$$ \text{average speed} = \frac2{\frac1{v_1} + \frac1{v_2}} $$
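For equal-length legs, Python's standard library can compute this mean directly, as a quick check that it agrees with the derivation above:

```python
from statistics import harmonic_mean

# harmonic mean of the two leg speeds
print(harmonic_mean([40, 60]))  # 48.0
```

Since the harmonic mean is never greater than the arithmetic mean, the naive $\frac{v_1 + v_2}{2}$ answer always overestimates the average speed (unless $v_1 = v_2$).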