The logic behind this common question has several hidden assumptions. Probably the most questionable assumption is that starlight has always traveled at the same speed. Has it? Has the speed of light always been 186,000 miles per second or, more precisely, 299,792.458 kilometers per second? One simple test is to compare the historic measurements of the speed of light.
Historical Measurements. During the last 300 years, at least 164 separate measurements of the speed of light have been published. Sixteen different measurement techniques were used. Astronomer Barry Setterfield of Australia has studied these measurements, especially their precision and experimental errors. His results show that the speed of light has apparently decreased so rapidly that experimental error cannot explain it! In the seven instances where the same scientists measured the speed of light with the same equipment years later, a decrease was always reported. The decreases were often several times greater than the reported experimental errors. This author has conducted other analyses that weight (or give significance to) each measurement according to its accuracy. Even after considering the wide range of accuracies, it is hard to see how anyone can claim, with any statistical rigor, that the speed of light has remained constant.
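The weighting idea described above can be sketched with a short computation. This is a minimal illustration only: the measurements below are invented placeholder values, not the actual published determinations, and the weighting (1/sigma squared) is one standard statistical choice.

```python
# Illustrative sketch: weighted least-squares trend test on HYPOTHETICAL
# historic speed-of-light measurements. These numbers are invented for
# illustration; they are NOT the actual published values.
measurements = [  # (year, c in km/s, reported standard error in km/s)
    (1740, 301000.0, 500.0),
    (1875, 299990.0, 200.0),
    (1900, 299900.0, 100.0),
    (1930, 299800.0, 30.0),
    (1950, 299793.0, 1.0),
]

def weighted_slope(data):
    """Weighted least-squares slope (km/s per year), weights = 1/sigma**2."""
    w = [1.0 / s ** 2 for (_, _, s) in data]
    sw = sum(w)
    xbar = sum(wi * x for wi, (x, _, _) in zip(w, data)) / sw
    ybar = sum(wi * y for wi, (_, y, _) in zip(w, data)) / sw
    num = sum(wi * (x - xbar) * (y - ybar) for wi, (x, y, _) in zip(w, data))
    den = sum(wi * (x - xbar) ** 2 for wi, (x, _, _) in zip(w, data))
    return num / den

slope = weighted_slope(measurements)
print(f"weighted trend: {slope:.4f} km/s per year")
```

A negative weighted slope indicates a downward trend; with real data, one would also compute the slope's standard error before claiming statistical significance.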
M. E. J. Gheury de Bray, writing in the official French astronomical journal in 1927, was probably the first to propose a decreasing speed of light. He based his conclusion on measurements spanning 75 years. Later, he became more convinced and twice published his results in Nature, possibly the most prestigious scientific journal in the world. He emphasized, "If the velocity of light is constant, how is it that, invariably, new determinations give values which are lower than the last one obtained . . . . There are twenty-two coincidences in favour of a decrease of the velocity of light, while there is not a single one against it." [emphasis in original]
Although the speed of light has decreased only a percent or so during the past three centuries, the decrease is statistically significant, since measurement techniques can detect changes thousands of times smaller. Of course, the older measurements have greater errors. However, the trend of the data is startling. The speed of light apparently increases the further back one looks in time. The rate of change is high. Several mathematical curves seem to fit these three centuries of data. When these curves are projected back in time, the speed of light becomes so fast that light from distant galaxies could conceivably reach Earth in several thousand years.
There is no physical reason why the speed of light must be constant. Most of us simply assumed that it is, and of course, changing old ways of thinking is sometimes difficult. Russian cosmologist V. S. Troitskii, at the Radiophysical Research Institute in Gorky, is also questioning some old beliefs. He concluded, independently of Setterfield, that the speed of light was ten billion times faster at time zero! Furthermore, he attributed the cosmic background radiation and most redshifts to this rapidly decreasing speed of light. Setterfield reached the same conclusion concerning redshifts by a completely different approach. If either Setterfield or Troitskii is correct, the big bang theory will fall (with a big bang).
Atomic vs. Orbital Time. Why would the speed of light decrease? T. C. Van Flandern, working at the U.S. Naval Observatory, showed that atomic clocks are apparently slowing relative to orbital clocks. Orbital clocks are based on orbiting astronomical bodies, especially Earth's one-year period about the sun. Before 1967, one second of time was defined by international agreement as 1/31,556,925.9747 of the time it takes Earth to orbit the sun. Atomic clocks are based on the vibrational period of the cesium-133 atom. In 1967, a second was redefined as 9,192,631,770 oscillations of the cesium-133 atom. Van Flandern showed that if atomic clocks are "correct," then the orbital speeds of Mercury, Venus, and Mars are increasing; consequently, the gravitational "constant" should be changing. However, he noted that if orbital clocks are "correct," then the gravitational constant is truly constant, but atomic vibrations and the speed of light are decreasing. The drift between the two types of clocks is only several parts per billion per year. But again, the precision of the measurements is so good that the discrepancy is probably real.
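The size of the drift described above can be put in concrete numbers. A minimal sketch, assuming a placeholder rate of 3 parts per billion per year (the text says only "several"):

```python
# Sketch of the clock-drift arithmetic described above. The drift rate is
# an ASSUMED placeholder of 3 parts per billion per year, since the text
# gives only "several parts per billion per year".
SECONDS_PER_YEAR = 31_556_925.9747   # pre-1967 orbital definition: seconds per year
CESIUM_HZ = 9_192_631_770            # 1967 definition: cesium-133 oscillations per second

drift_ppb_per_year = 3.0             # assumed illustrative rate
years = 100

# Accumulated divergence between the two time standards over a century,
# if one clock runs fast relative to the other at a constant rate.
offset_seconds = SECONDS_PER_YEAR * years * drift_ppb_per_year * 1e-9
print(f"accumulated offset after {years} years: {offset_seconds:.2f} s")
```

Even at a few parts per billion per year, the two standards would diverge by several seconds per century, which is well within the resolving power of modern timekeeping.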
There are four reasons why orbital clocks seem to be correct and why atomic frequencies are probably slowing very slightly.
If a planet's orbital speed increased (and all other orbital parameters remained the same), then its energy would increase. This would violate the law of conservation of mass-energy.
If atomic time is slowing, then clocks based on the radioactive decay of atoms should also be slowing. Radiometric dating techniques would give ages that are too old. This would bring radiometric clocks more in line with most other dating clocks. This would also explain why no primordial isotopes have half-lives less than 50 million years. Such isotopes simply decayed away when radioactive decay rates were much greater.
If atomic clocks and Van Flandern's study are correct, the gravitational "constant" should change. Statistical studies have not detected these variations.
If atomic frequencies are decreasing, then five "properties" of the atom, such as Planck's constant, should also be changing. Statistical studies of past measurements of four of these five "properties" support both the magnitude and direction of this change.
For these reasons, orbital clocks seem to be more accurate than the extremely precise atomic clocks.
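The half-life arithmetic behind the second reason above can be sketched briefly. Assuming ordinary exponential decay, an isotope with a 50-million-year half-life retains almost nothing of its original amount after billions of years of elapsed (atomic) time:

```python
# Sketch of the half-life arithmetic: the fraction of an isotope remaining
# after elapsed time t, given half-life h (same time units for both).
def fraction_remaining(t, h):
    """Exponential decay: N/N0 = 0.5 ** (t / h)."""
    return 0.5 ** (t / h)

# A primordial isotope with a 50-million-year half-life, after 4.5 billion
# years of decay, would retain an immeasurably small fraction:
f = fraction_remaining(4.5e9, 50e6)
print(f"fraction remaining: {f:.1e}")
```

Ninety half-lives reduce the original amount by a factor of 2**90, about 10**27, so such an isotope would be undetectable today whether the decay occurred slowly or, as the text argues, at a faster rate over a shorter span.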
Many of us were skeptical of Setterfield's initial claim, since the decrease in the speed of light apparently ceased in 1960. Large, one-time changes seldom occur in nature. The measurement techniques were precise enough to detect any decrease in the speed of light after 1960, if the trend of the prior three centuries had continued. Later, Setterfield realized that beginning in the 1960s, atomic clocks were used to measure the speed of light. If atomic frequencies are decreasing, then both the measured quantity (the speed of light) and the measuring tool (atomic clocks) are changing at the same rate. Naturally, no relative change would be detected, and the speed of light would be constant in atomic time--but not orbital time.
Misconceptions. Does the decrease in the speed of light conflict with the statement frequently attributed to Albert Einstein that the speed of light is constant? Not really. Einstein's theory of special relativity assumes that the speed of light is independent of the velocity of the light source. This is called Einstein's Second Postulate. Many have misinterpreted this to mean that "Einstein said that the speed of light is constant." Imagine spaceships A and B traveling away from each other. An astronaut in spaceship A suddenly shines a flashlight at spaceship B. Einstein claimed that the beam will strike spaceship B at the same speed as it would if the spaceships were traveling toward each other. This paradox has some experimental support. Setterfield, on the other hand, says that while the speed of light has decreased over time, at any instant all light beams travel at the same speed, regardless of the velocity and location of their sources.
Some people give another explanation for why we see distant stars in a young universe. They believe that when God created each star, He also created a beam of light already in place between that star and Earth. Of course, a creation would immediately produce completed things. Seconds later, they would look older than they really were. This is called "creation with the appearance of age." The concept is sound. However, for starlight, it is probably not an acceptable explanation, for two reasons:
Very bright, exploding stars are called "supernovas." If starlight, apparently from a supernova, were created en route to Earth and did not originate at the surface of the star, then what exploded? If the image of an explosion was only created on that beam of light, then the star never existed, and the explosion never happened. Only a relatively short beam would have been created near Earth. One finds this hard to accept.
Every hot gas radiates a unique set of precise colors, called its emission spectrum. The gaseous envelope around each star also emits specific colors that identify the chemical composition of the gas. Since all starlight has emission spectra, this strongly suggests that a star's light originated at the star--not in cold, empty space. Each beam of starlight also carries other information, such as the star's spin rate, magnetic field, surface temperature, and the chemical composition of the cold gases between the star and Earth. Of course, God could have created this beam of light with all this information in it. However, the real question is not, "Could God have done it?" but, "Did He?"
For these reasons, starlight seems to have originated at stellar surfaces, not in empty space.
Surprising Observations. Starlight from distant stars and galaxies is redshifted--meaning that the light is redder than it should be. (Most astronomers have interpreted the redshifted light to be a wave effect, similar to the pitch of a train's whistle that is lower when the train is going away from an observer. The greater the redshift, the faster stars and galaxies are supposedly moving away from us.) Since 1976, William Tifft, a University of Arizona astronomer, has found that the redshifts of distant stars and galaxies typically differ from each other by fixed amounts. This is very strange if stars are moving away from us. It would be as if galaxies could travel only at specific speeds, jumping abruptly from one speed to another, without passing through intermediate speeds. If stars are not moving away from us at high speeds, the big bang theory is incorrect, along with most other beliefs in the field of cosmology. Many other astronomers, not believing Tifft's results, have done similar work, only to reach the same conclusions as Tifft.
Atoms behave in a similar way. That is, they give off tiny bundles of energy (called quanta) of fixed amounts--and nothing in between. So Setterfield believes that the "quantization of redshifts," as many refer to the phenomenon, is an atomic effect, not a strange recessional velocity effect. If a property of space is slowly removing energy from all emitted light, it would do so in fixed increments. This would also redshift starlight, with the furthest star's light being redshifted the most. Furthermore, it would also slow the velocity of light and the vibrational frequency of the atom, all of which is observed. Setterfield is currently working on a theory to tie all of this together. PREDICTION 16: The redshifts of some specific, distant galaxies will undergo abrupt decreases.
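The conventional Doppler interpretation mentioned above can be sketched numerically. The redshift increment below is an assumed, invented value chosen only to show how evenly spaced redshifts would imply evenly spaced recession velocities:

```python
# Sketch of the standard Doppler interpretation: converting a redshift z
# into an implied recession velocity. The fixed redshift step below is an
# ASSUMED illustrative value, not Tifft's actual data.
C_KM_S = 299_792.458

def recession_velocity(z):
    """Relativistic Doppler velocity implied by redshift z, in km/s."""
    return C_KM_S * ((1 + z) ** 2 - 1) / ((1 + z) ** 2 + 1)

step = 0.00024                      # assumed fixed redshift increment
for n in range(1, 4):
    z = n * step
    print(f"z = {z:.5f} -> v = {recession_velocity(z):.1f} km/s")
```

Under this interpretation, redshifts occurring only at multiples of a fixed increment would force galaxies onto a ladder of discrete velocities, which is the oddity the quantization results present for the recessional picture.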
Another surprising observation is that most distant galaxies look remarkably similar to nearer galaxies. For example, galaxies are fully developed and show no signs of evolving. This puzzles astronomers. If the speed of light has decreased drastically, these distant, yet mature, galaxies no longer need explaining.
A Critical Test. How can we test whether the speed of light has decreased a millionfold? If it has, we should observe events in outer space in extreme slow motion. Here is why.
Consider a time in the distant past when the speed of light was, say, a million times faster than it is today. On a hypothetical planet, billions of light-years from Earth, a light started flashing toward Earth every second. Each flash then began a very long trip to Earth. Since the speed of light was a million times greater than it is today, those initial flashes were spaced a million times further apart in distance than they would have been at today's slower speed of light.
Thousands of years have now passed. Throughout the universe, the speed of light has slowed to today's speed, and the first of those flashes--strung out like beads sliding down a long string--are approaching Earth. The distances separating adjacent flashes have remained constant during these thousands of years, because the moving flashes slowed in unison. Since the first flashes to strike Earth are spaced so far apart, they will strike Earth every million seconds. In other words, we are seeing past events on that planet (the flashing of a light) in slow motion. If the speed of light has been decreasing since the creation, then the further out in space we look, the more extreme this slow motion becomes.
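The flash-spacing arithmetic above reduces to a one-line calculation. A minimal sketch, assuming the factor-of-a-million speed-up used in the example:

```python
# Sketch of the flash-spacing arithmetic. If the speed of light was once a
# factor F faster, flashes emitted one second apart were spaced F
# light-seconds (at today's speed) apart; once all light has slowed to
# today's speed, they arrive F seconds apart.
C_TODAY_KM_S = 299_792.458
factor = 1_000_000                  # past speed-up factor assumed in the text's example

c_past = factor * C_TODAY_KM_S      # past speed of light, km/s
spacing_km = c_past * 1.0           # distance between flashes emitted 1 s apart
arrival_interval_s = spacing_km / C_TODAY_KM_S   # time between arrivals today

print(f"flashes arrive every {arrival_interval_s:,.0f} s")
```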
As one example, galaxies would be seen in slow motion. Galaxies that appear to spin at a rate of once every 200 million years would be spinning much faster. This might explain the partial twist seen in all spiral galaxies. If the speed of light has not decreased, and there is no slow-motion effect, then why do billion-year-old spiral galaxies, at all distances, show about the same twist?
Most stars in our galaxy are binary; that is, they and a companion star are in a tight orbit around each other. If there is a "slow-motion effect," the orbital periods of binary stars should tend to increase with increasing distance from Earth. If the speed of light has been decreasing, the Hubble Space Telescope will find that binary stars at great distances have very long orbital periods, showing that they are in slow motion.
These calculations contain mathematical errors which, if corrected, would support the hypothesis that the speed of light has decreased. I have discussed these matters with each author. The following professional statisticians have verified my conclusions or have reached similar conclusions independently:
Michael Hasofer, University of New South Wales, Sydney 2033, Australia.
David J. Merkel, 11 Sunnybank Road, Aston, Pennsylvania 19014, U.S.A.
Alan Montgomery, 218 McCurdy Drive, Kanata, Ontario K2L 2L6, Canada.
No physical law prevents anything from exceeding the speed of light. In two published experiments, the speed of light was apparently exceeded by as much as a factor of 100! The first experiment involved radio signals, which, of course, are a type of light. Counterexplanations are being proposed for these surprising results, but so far, no one has repeated the experiment or shown it to be false. [Alexis Guy Obolensky, personal communication.] The second report referred to a theoretical derivation and a simple experiment that permitted electrical signals to greatly exceed the speed of light. This derivation follows directly from Maxwell's equations. The special conditions involved extremely thin electrical conductors with very low capacitance and inductance.
A strange quantum effect also causes light, in certain situations, to slightly exceed the normal speed of light.

Some who believe in an old universe give a different explanation for the missing short-lived isotopes: those isotopes are extinct simply because so much time has passed. However, this explanation raises a counterbalancing question: How did those isotopes, and 97 percent of all elements, form? The standard answer is that these elements appeared during supernova explosions. This is actually speculation, since essentially no supporting evidence has been found. Besides, all supernova remnants we see in our galaxy appear to be less than 10,000 years old, based on the well-established decay pattern of a supernova's light intensity in the radio-wave frequency range.

The speeds of light discussed above assume the light is traveling in a vacuum. Light travels at slightly slower speeds through any substance, such as air, water, or glass.
Another question concerns Einstein's well-known formula, E = mc², which supposedly gives the energy (E) released when a nuclear reaction annihilates a mass (m). If the speed of light (c) decreases, then one might think that either E must decrease or m must increase. Not necessarily.
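To see how strongly the formula depends on c, here is a minimal numeric sketch; the doubled value of c is a made-up illustration, not a figure from the text:

```python
# Numeric sketch of the sensitivity of E = m * c**2 to the value of c:
# energy released by converting one gram of mass, at today's c and at an
# ASSUMED past value of c twice as large (an invented factor for illustration).
C_TODAY = 299_792_458.0          # m/s
mass_kg = 0.001                  # one gram

def energy_joules(m, c):
    """E = m * c**2, in joules."""
    return m * c * c

e_now = energy_joules(mass_kg, C_TODAY)
e_past = energy_joules(mass_kg, 2 * C_TODAY)   # assumed doubled c
print(f"today: {e_now:.3e} J, doubled c: {e_past:.3e} J (x{e_past / e_now:.0f})")
```

Because c enters squared, even a modest change in c would change the computed energy dramatically, which is why the question of which clock standard defines c matters in the paragraphs that follow.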
In the universe, time could flow according to either atomic time or orbital time. Under which standard would E = mc² be a true statement? Mass-energy would be conserved under both; in other words, the energy or mass of an isolated system would not depend on how fast time passed. Obviously, E = mc² would be absolutely true in atomic time, where c is constant, but not in orbital time, where c decreases. Let's now see why E = mc² will be approximately correct even in orbital time.
Nuclear reactions convert mass to energy. Unfortunately, the extremely small mass lost and large energy produced cannot be measured precisely enough to test whether E = mc² is absolutely true. Even if mass and energy could be precisely measured, this formula has embedded in it an experimentally derived, unit-conversion factor that requires a time measurement by some clock. Which type of clock should be used: an orbital clock or an atomic clock? Again, we can see that E = mc² is "clock dependent."
If c has decreased (the orbital time standard), neither length, electrical charge, nor temperature standards would change. Therefore, chemical and nuclear reactions would not change. However, the speed of nuclear reactions, and to a slight extent chemical reactions, would change, since the vibrational frequencies of atoms would change. Also, radioactive decay rates, which depend on the vibrational frequency of the atom, would decrease if c decreased.