At that same moment, satellites on the other side of the Earth (the daylight side) detected X-rays coming from the sun, which signaled the beginning of a solar flare (Ibid.).
This was not the only evidence for such a change in the radioactive decay rate.
The most widely used tool to measure the age of the Earth is radioactive decay. The great scientist Ernest Rutherford was the first to define the concept of "half-life," that is, the time it takes for one half of the atoms in a given quantity of a radioactive element (such as plutonium) to decay into another element (such as uranium), or for one radioactive isotope (such as carbon-14) to decay into an isotope of a different element (such as nitrogen-14). Knowing about half-lives is important because it enables you to determine when a sample of radioactive material is safe to handle.

Because the decay rates in the two studies from the 1980s were altered by the seasons, physicists suspected that the sun was affecting the rates of decay, "possibly through some physical mechanism that had never before been observed" (Ibid.). But if the change is real, rather than an anomaly in the detector, it would challenge the entire concept of half-life and even force physicists to rewrite their nuclear physics textbooks (Ibid.).
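As a concrete illustration of the half-life concept described above (this sketch is not from the original article), the fraction of a sample remaining after a time t follows (1/2)^(t / half-life). The carbon-14 half-life of roughly 5,730 years is a well-established figure; the function name is my own:

```python
def remaining_fraction(t, half_life):
    """Fraction of the original atoms left after time t.

    t and half_life must be in the same units (e.g., years).
    """
    return 0.5 ** (t / half_life)

# Carbon-14 has a half-life of about 5,730 years.
c14_half_life = 5730.0

print(remaining_fraction(5730.0, c14_half_life))   # one half-life  -> 0.5
print(remaining_fraction(11460.0, c14_half_life))  # two half-lives -> 0.25
```

This exponential relation is what the dating methods mentioned above rely on: if the half-life were not constant, the computed ages would shift accordingly.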