The very title of this post
may surprise and/or offend some relativistically minded readers. I will try to
give a reason for this possible reaction while, at the same time, trying to narrow
the gap between relativists and more practically minded engineers. To set the
scene, I quote a piece from Einstein's 1905 paper on Special Relativity (1923
English translation), specifically on the definition of simultaneity.
"If at the point A
of space there is a clock, an observer at A can determine the time values of
events in the immediate proximity of A by finding the positions of the hands
which are simultaneous with these events. If there is at the point B of space
another clock in all respects resembling the one at A, it is possible for an
observer at B to determine the time values of events in the immediate
neighbourhood of B. But it is not possible without further assumption to
compare, in respect of time, an event at A with an event at B. We have so far
defined only an "A time" and a "B time." We have not
defined a common "time" for A and B, for the latter cannot be defined
at all unless we establish by definition that the "time" required by
light to travel from A to B equals the "time" it requires to travel
from B to A."
Einstein then described how
the two clocks A and B can be set to a "common time", i.e. be
synchronized in time readings. The method is simple in principle. Observers A
and B are both stationary in the same inertial frame in free space and they
measure the distance AB between them. They agree that A will send a light flash
when her clock reads precisely time T1. Observer B sets and holds
his clock at time T1 plus the time that the light pulse should take to reach him, i.e. AB/c. When he observes A's flash, he cancels the hold and lets his clock run freely from T1+AB/c. They
can now verify the process by B sending a light flash at a predefined time T2 and A simply checking that it arrives at time T2+AB/c. Their clocks
are now "Einstein-synchronized".
By Einstein-synchronizing the two clocks we have forced the one-way speed of light to be measured
as the same as its two-way speed, c. Once we demand that every inertial frame synchronize its clocks by this method, it is unremarkable that the one-way speed of light is observed to be c by all inertial observers, and the same in every direction. Likewise, the fact that the two-way speed of light is c in all
directions for all inertial frames is also unremarkable, provided that we
forget about a medium for light propagation (the 'luminiferous aether'). Here we
need only one clock to check the two-way speed in every direction possible - hence
clock synchronization does not enter the picture. Better still, we can use an interferometer,
just as Michelson and Morley did back in the 1880s. It boils down to checking
whether the round-trip time-of-flight for light over a given constant distance changes
for different directions in space.
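Since this round-trip check is the crux, it may help to see what the classical aether picture would have predicted - the directional difference that Michelson and Morley looked for and did not find. A short Python sketch, with an assumed drift speed and an arm length only roughly that of their apparatus:

import math

c = 299_792_458.0    # two-way speed of light, m/s
L = 11.0             # arm length in metres (roughly the MM effective path)
v = 30_000.0         # hypothetical aether drift: Earth's orbital speed, m/s

t_parallel = L / (c - v) + L / (c + v)           # arm along the drift
t_transverse = 2 * L / math.sqrt(c**2 - v**2)    # arm across the drift

print(f"parallel round trip:   {t_parallel:.18e} s")
print(f"transverse round trip: {t_transverse:.18e} s")
print(f"classical difference:  {t_parallel - t_transverse:.3e} s")
# The measured difference is zero: the round-trip time-of-flight is the
# same in every direction, and a single clock (or a fringe pattern) suffices.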
Even if the light signal
were a stream of perfectly elastic balls, shot at identical speeds to bounce back from reflectors, with their time-of-flight measured, we would not expect the direction to influence the round-trip time of the balls, irrespective of how
the apparatus was moving in free space (provided that it was not nudged or
accelerated, of course). As long as we leave the aether out of the picture,
there is absolutely nothing to relativity. Granted, the whole picture is not quite as simple
as that, but we do not need Lorentz-contracting arms of the apparatus (or time
dilation) to explain the Michelson-Morley experiment.
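A quick Galilean sanity check of the in-line case bears this out (the transverse case works out the same, since Newtonian mechanics is indifferent to uniform motion); the speeds and lengths below are hypothetical:

# Elastic balls shot at speed u relative to a drifting rig return after
# 2L/u in the rig's frame, whatever the drift speed V (ground-frame view).
u = 100.0    # ball speed relative to the apparatus, m/s
L = 5.0      # source-to-reflector distance, metres

for V in (0.0, 30.0, -75.0):       # assorted drift speeds of the rig
    t_out = L / ((u + V) - V)      # ball closes on the receding reflector at u
    t_back = L / (V - (V - u))     # source closes on the returning ball at u
    print(f"drift {V:+7.1f} m/s -> round trip {t_out + t_back:.3f} s")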
Yes, if some observer who is rapidly moving relative to our setup were silly enough to attempt a measurement of the lengths of the arms of our interferometer, he would get shorter arms than what we reckon them to be (in the direction of his motion). But that's his problem
and we know why - he has the difficulty of reading his measuring tape, on the fly, at both ends of the in-line interferometer arm simultaneously; simultaneously, that is, according to his own set of synchronized clocks. We know that his clocks will not appear synchronized
in our inertial frame, but this is perfectly predictable, using Einstein's
relativity theory. There is obviously no physical contraction in any of our interferometer
arms. They work just fine with the lengths that our workshop made them to have.
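The size of the effect he would report is easy to work out; a tiny sketch with an assumed fly-by speed (the numbers are hypothetical):

import math

c = 299_792_458.0
v = 0.6 * c    # assumed fly-by speed
L = 11.0       # the in-line arm's rest length in our frame, metres

gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)    # Lorentz factor, 1.25 here
print(f"the fly-by observer measures the arm as {L / gamma:.2f} m")
# He reports 8.80 m rather than 11 m - not because the arm shrank, but
# because "both ends at the same time" means something different judged by
# his Einstein-synchronized clocks than by ours.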
What about the tick rate of
the clocks of this 'fly-by' observer? Will they tick slower than ours? Not in
any absolute sense, but if we make careful observations, we may think that his
clocks are losing time. The problem is that if he makes careful observations,
he will think that our clocks are losing time. This is a scenario that I will
handle later, but for now, forget about time dilation for purely inertially
moving clocks. It is really a myth, just like the "luminiferous aether". Two such clocks cannot both lose time relative to one another - it is just another
observational issue. There are scenarios where the loss or gain of time is
real, but this will have to stand over until next time.
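The reciprocity itself is easy to exhibit; a minimal sketch, with the same assumed relative speed as before:

import math

c = 299_792_458.0
v = 0.6 * c    # hypothetical relative speed between the two inertial frames

gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)

# Each frame, timing the other's clock against its own pair of
# Einstein-synchronized clocks, finds the other running slow by the very
# same factor - neither clock is "really" losing time.
print(f"we reckon his clock runs slow by a factor of {gamma:.4f}")
print(f"he reckons our clock runs slow by the same factor, {gamma:.4f}")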
To conclude this post on a philosophical
note, why are some scientists hooked on the "myths" of relativity - like the invariance
of the one-way speed of light and Lorentz contraction, as if they were (sort of) physical realities? We have seen that they are both consequences of the convention that we
use to synchronize clocks - the Einstein method. It was a brilliant
masterstroke of the great man, because of all possible synchronization schemes,
it makes the most sense - it makes physics "as simple as possible, but not
simpler", as Einstein has once famously said.
Surprisingly, in years of
studying relativity, I realized that physicists in the academic
environment are not too dogmatic about these issues; a few of them actually convinced
me that we cannot measure the one-way speed of light in any way - it is an assumption that has been shown by many experiments to be a good one. Academics tend to look carefully at the origins of theories.
On the other hand, hands-on scientists using the one-way speed of
light in their work (e.g. particle physicists at accelerators) can be more
dogmatic. I think they are starting to forget the original assumptions made and
just press on and use it to their advantage, almost as if it is the only way to
view the world. It works, so no problem - why dwell on the origins of a successful theory?
This rigidity is perhaps also enhanced by the fact that the one-way speed of light is defined by international
standards authorities to be a constant (exactly 299 792 458 m/s). Even the meter is defined in terms of the speed of light: "The metre is the length of the path travelled by light in vacuum during a time interval of 1/299 792 458 of a second."
-J