## Is the Speed of Light Slowing Down?

© 1996 Frank Steiger; permission granted for retransmission.

A favorite creationist argument is the theory by Australian Barry Setterfield that the speed of light has been slowing down exponentially from the moment of creation. Based on this theory, light from the most distant galaxies would have covered most of its journey to earth in the recent past, because (according to the theory) at that time it was traveling at a velocity millions of times faster than at present. Thus, according to Setterfield's hypothesis, the light from the most distant stars actually left those stars only a few thousand years ago. This would support the creationist contention that the universe is only a few thousand years old.

However, there is a problem with this theory, independent of the appalling lack of experimental data to support it. Distances to remote galaxies are measured by correlating the observed shift of spectral lines towards longer wavelengths with measurements that can be made on closer star systems. This shift towards longer wavelengths ("red shift") is the result of the light source moving away from the observer, thus stretching the wavelengths in a manner similar to the drop in pitch of a train whistle as the train goes by. This phenomenon is known as the Doppler shift. The relationship between the recession velocity V of the galaxy and the speed of light is given by:

```
             V = c (wavelength shift / wavelength)    (1)

where:  c = velocity of light

(This equation must be modified for very large values of V; in that case
the wavelength shift / wavelength factor equals the square root of 1 + V/c
divided by the square root of 1 - V/c, minus 1.)
```
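Equation (1) and its relativistic correction can be sketched in a few lines of code. This is an illustration only; the function names and the sample 10% shift are mine, not the article's:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def v_nonrelativistic(z):
    """Equation (1): V = c * (wavelength shift / wavelength)."""
    return C * z

def v_relativistic(z):
    """Relativistic form: z = sqrt(1 + V/c) / sqrt(1 - V/c) - 1,
    solved here for V."""
    r = (1.0 + z) ** 2          # (1+z)^2 = (1 + V/c) / (1 - V/c)
    return C * (r - 1.0) / (r + 1.0)

z = 0.1  # a 10% shift toward longer wavelengths
print(v_nonrelativistic(z))  # about 3.0e7 m/s
print(v_relativistic(z))     # slightly less, about 2.8e7 m/s
```

For small shifts the two formulas agree closely, which is why equation (1) suffices for nearby galaxies.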

The more distant the galaxy, the greater the shift, indicating that the universe is expanding. In the case of remote galaxies, this "red shift" is the only means of measuring distances. (It should be remembered that the creationist "speed of light" hypothesis does not dispute the distances to the most remote galaxies, so galaxy distance is not an issue in this discussion.)

Equation (1) shows that if the velocity of light leaving (in the distant past) the most distant stars were millions of times faster than light leaving (in the recent past) the closest stars, it would require a universe expansion rate millions of times faster than presently indicated in order to produce the observed spectral shift of distant stars. This is because the wavelength shift/wavelength ratio is equal to V/c, and the presumed velocity c of the light would be so great to begin with that the velocity V of the receding galaxy would have to be correspondingly high to cause an appreciable shift in the wavelengths of spectral lines.
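The scaling argument above can be checked numerically. Under equation (1), for a fixed observed shift z the required recession velocity is V = c·z, so multiplying c by a million multiplies the required V by a million as well. The shift value below is illustrative:

```python
C_NOW = 299_792_458.0   # present-day speed of light, m/s
z = 0.05                # an illustrative observed wavelength shift

# Recession velocity needed to produce the shift z, via equation (1):
v_now = C_NOW * z                     # with today's c
v_fast_light = (1e6 * C_NOW) * z      # if c were a million times greater

print(v_fast_light / v_now)           # ratio of required expansion rates
```

The ratio is exactly the factor by which c was inflated, which is why a "fast light" past implies galaxies flying apart at absurd speeds.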

The creationist argument that the speed of light was once millions of times greater than it is at the present time mandates the conclusion that at the time of creation the galaxies were whizzing apart at unbelievable velocities. This is contradicted by the presence today of nearby galaxies.

Different galaxies are receding at different velocities, depending on their distance, and therefore the Doppler shift for different galaxies will also vary. This is perfectly logical, yet Setterfield believes that the variation in red shifts occurs because the red shifts are "quantized" and have no relationship to either distance or velocity. Since quantization applies only to atomic phenomena, Setterfield concludes that the shift towards longer wavelengths is due to an atomic effect. He would have us believe that all the stars in any given galaxy have atomic properties such that their spectral lines are shifted to the same degree, that this spectral shift varies from galaxy to galaxy, and that it is not related to the galaxy's distance or velocity away from earth.

Just why this would be so is a complete mystery, but there is no mystery about the motivation behind the argument: creationists believe that the galaxies were all created in place only a few thousand years ago, and that they are not expanding away from each other. The fact of the Doppler shift strongly contradicts this idea, so creationists have concocted bizarre explanations like the quantization of red shifts in a pathetic attempt to reject the overwhelming evidence that the universe is expanding. For example, Walter Brown of the Center for Scientific Creation states: "This is very strange if stars are moving away from us. It would be as if galaxies could travel only at specific speeds, jumping abruptly from one speed to another, without passing through intermediate speeds. If stars are not moving away from us at high speeds, the big bang theory will fall, along with most other beliefs in the field of cosmology."

The claim that light velocity is slowing down as time goes by is based on gross misinterpretations of inaccurate data, as we shall see.

The speed of light was first measured by Roemer in 1675, by measuring the variation of the observed (apparent) period of revolution of the satellites of Jupiter as the earth was moving either towards or away from Jupiter. This indicated that light takes about 16.5 minutes to cross the diameter of the earth's orbit. From this, its speed could be calculated. Because of unavoidable errors in measurement, the calculated velocity was not highly accurate. Since that time the speed of light has been determined with increasing accuracy.
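A rough Roemer-style calculation follows directly from the description above: speed equals the diameter of the earth's orbit divided by the light travel time across it. The 16.5-minute figure is the article's; the orbit diameter (2 astronomical units) is a standard value, not from the article:

```python
AU = 1.495978707e11        # one astronomical unit, in meters
diameter = 2 * AU          # diameter of earth's orbit
travel_time = 16.5 * 60    # 16.5 minutes, in seconds

c_estimate = diameter / travel_time
print(c_estimate)          # roughly 3.0e8 m/s, near the modern 2.998e8
```

Even with the crude timing available in 1675, this method lands within a few percent of the modern value.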

Walter Brown of the Center for Scientific Creation refers to Barry Setterfield's 1981 hypothesis that the speed of light is slowing down, and that therefore the light from the most distant galaxies began its journey towards earth a mere 6000 years ago. Setterfield based his belief on a plot he constructed of measured light velocity vs. the year the measurement was made. From this plot he concluded that the velocity of light increases exponentially as we go backward in time, becoming infinite at 4040 BC, which he describes as "the time of creation/fall."

Actually, none of the plotted points lay on the curve, yet he claimed a perfect correlation.

In fact, the more accurate determinations of the velocity of light made since 1960 do not support the conclusion that the speed of light is decreasing. Setterfield's alibi is that the speed of light had reached its minimum at that time and was constant thereafter. Although Setterfield's plotted curve shows that the speed of light was infinite at the "moment of creation," he arbitrarily flattens the curve for times before the end of the creation week, stating that "I will assume that this value held from the time of creation until the time of the fall, as in my opinion the Creator would not have allowed it to decay during His initial work."

Setterfield's hypothesis was so lacking in plausibility that even the Institute for Creation Research rejected it. (Acts and Facts, June 1988, G. Aardsma)

With respect to the fact that measurements made after 1960 do not show any decrease in the speed of light, Walt Brown has concocted his own misinformed "explanation" based on the assumption of two different systems of time:

By way of background, scientists at one point found it necessary to revise the length of the "standard" second. The standard second is defined as a fixed number of vibrations of a cesium atom, chosen so as to match the second derived from the time required for the earth to orbit the sun.

The cesium atom vibration frequency is extremely constant. Scientists have constructed instruments which can count these vibrations. By assigning a specific number of vibrations to a standard second, a super-accurate clock can be constructed. However, the cesium clock must be calibrated in order to correspond to the average period of revolution of the earth around the sun. In order to make the standard second (as defined by the cesium clock) precisely equal to the length of a second based on new and more accurate astronomical measurements, it was necessary to revise the previously selected number of vibrations corresponding to the standard second. The change was extremely minute.
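The counting scheme described above amounts to a simple conversion: the SI second is defined as 9,192,631,770 oscillations of cesium-133 radiation (a defined value, not from the article), so a counted number of oscillations translates directly into elapsed seconds. A minimal sketch:

```python
# Defined SI value: oscillations of cesium-133 radiation per second.
CESIUM_HZ = 9_192_631_770

def elapsed_seconds(oscillation_count):
    """Convert a count of cesium oscillations into elapsed time."""
    return oscillation_count / CESIUM_HZ

print(elapsed_seconds(CESIUM_HZ * 60))  # 60.0 (one minute of counts)
```

Recalibrating such a clock means adjusting the defined count by a handful of oscillations out of nine billion, which is why the change described above was "extremely minute."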

The CSC web site speaks of "orbital" time versus "atomic" time as if they were two different systems of time measurement. Because of the necessity to re-calibrate the cesium clock, Brown mistakenly concludes that "atomic" time is "slowing." He states: "If atomic frequencies are decreasing, then both the measured quantity (the speed of light) and the measuring tool (atomic clocks) are changing at the same rate. Naturally, no relative change would be detected, and the speed of light would be constant in atomic time-but not orbital time." Of course, this is complete nonsense.

Now the latest twist in the unending campaign of young earth creationists to lend credence to their bankrupt theory is based on the hypothesis by Joao Magueijo of Imperial College London that light may have traveled faster only during the first 10^-43 seconds (a decimal point followed by 42 zeros and a one) after the Big Bang. This creationist "evidence" is a typical example of their distortion of the facts. More information on the Magueijo theory can be found in the January 2001 issue of Scientific American.

Additional information on the constancy of the speed of light is available in the talk.origins FAQ "The Decay of c-decay."