Here's the question. Suppose you have 75 miles to travel. For the first mile you travel at 75 mph. When you're 74 miles from your destination, you reduce your speed to 74 mph; when you're 73 miles from your destination, you reduce your speed to 73 mph; and so on, until, when you're 1 mile from your destination, you're traveling at 1 mph.
How long does it take to get there?
When I heard this riddle, I assumed that since I was traveling one mile at each speed from 1 to 75 mph, my average speed would be 38 mph (the mean of 1 through 75), so the trip would take about 2 hours (75 miles / 38 mph is roughly 1.97 hours).
This is horribly incorrect.
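In case anyone wants to check the arithmetic numerically, here's a quick Python sketch (my own check, not part of the original riddle): it totals the time for each one-mile leg, since one mile at n mph takes 1/n of an hour, and recomputes the naive 38-mph estimate for comparison.

```python
# Naive guess: average the speeds 1..75 (mean = 38 mph) and divide distance by it.
naive_hours = 75 / (sum(range(1, 76)) / 75)

# Mile-by-mile total: one mile driven at n mph takes 1/n hours, for n = 1..75.
actual_hours = sum(1 / n for n in range(1, 76))

print(f"naive estimate: {naive_hours:.2f} hours")  # about 1.97 hours
print(f"actual time:    {actual_hours:.2f} hours")  # about 4.90 hours
```

The two numbers don't even come close, which is what convinced me the averaging assumption must be badly wrong.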
Why? And what is the correct answer?
(AND NO FAIR RESPONDING TO THIS IF YOU HEARD THE PUZZLE ON NPR'S "CAR TALK")