The Atomic Clock:
An atomic clock is a clock that uses an electronic transition frequency in the microwave, optical, or ultraviolet region of the electromagnetic spectrum of atoms as a frequency standard for its timekeeping element. Atomic clocks are the most accurate time and frequency standards known, and are used as primary standards for international time distribution services, to control the wave frequency of television broadcasts, and in global navigation satellite systems such as GPS.
The principle of operation of an atomic clock is based not on nuclear physics but on atomic physics: it uses the microwave signal that electrons in atoms emit when they change energy levels. Early atomic clocks were based on masers at room temperature. Currently, the most accurate atomic clocks first cool the atoms to near absolute zero by slowing them with lasers, then probe them in atomic fountains inside a microwave-filled cavity. An example of this is the NIST-F1 atomic clock, the U.S. national primary time and frequency standard.
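The frequency standard underlying these clocks is exact by definition: the SI second is defined as 9,192,631,770 periods of the radiation from the hyperfine ground-state transition of caesium-133. A minimal sketch of how a cycle count converts to elapsed time (the function name is illustrative, not from any standard library):

```python
# The SI second is defined as exactly 9,192,631,770 periods of the
# caesium-133 hyperfine ground-state transition radiation.
CS133_HYPERFINE_HZ = 9_192_631_770  # Hz, exact by definition

def cycles_to_seconds(n_cycles: int) -> float:
    """Convert a count of caesium hyperfine oscillation cycles to seconds."""
    return n_cycles / CS133_HYPERFINE_HZ

# Counting exactly 9,192,631,770 cycles yields one second.
print(cycles_to_seconds(9_192_631_770))  # → 1.0
```

In a real clock, electronics lock a microwave oscillator to this transition and divide its output down to one-second ticks; the division above is the arithmetic behind that step.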
The accuracy of an atomic clock depends on the temperature of the sample atoms (colder atoms move much more slowly, allowing longer probe times and reducing collision rates) and on the frequency and intrinsic linewidth of the electronic transition. Higher frequencies and narrower lines increase the precision.
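The effect of temperature on atomic speed follows from kinetic theory: the root-mean-square thermal speed is v = sqrt(3kT/m). A small sketch (the caesium mass is an approximate value I am supplying, not from the text) shows why laser cooling to microkelvin temperatures matters:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K (exact SI value)
M_CS = 2.207e-25     # mass of a caesium-133 atom, kg (approximate)

def rms_speed(temp_kelvin: float, mass_kg: float = M_CS) -> float:
    """Root-mean-square thermal speed, v = sqrt(3 k T / m)."""
    return math.sqrt(3 * K_B * temp_kelvin / mass_kg)

# Room-temperature atoms race by at hundreds of m/s; laser-cooled atoms
# at microkelvin temperatures crawl at centimetres per second, so they
# can be probed for far longer.
print(f"300 K: {rms_speed(300):.0f} m/s")
print(f"1 uK:  {rms_speed(1e-6) * 100:.1f} cm/s")
```

The roughly four orders of magnitude drop in speed translates directly into longer interaction times with the microwave cavity, and hence a narrower measured line.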
National standards agencies maintain an accuracy of 10^-9 seconds per day (approximately 1 part in 10^14), and a precision set by the radio transmitter pumping the maser. These clocks collectively define a continuous and stable time scale, International Atomic Time (TAI). For civil time, another time scale is disseminated, Coordinated Universal Time (UTC). UTC is derived from TAI but kept approximately synchronized, by means of leap seconds, with UT1, which is based on the actual rotation of the Earth with respect to the Sun (solar time).
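Because leap seconds accumulate, TAI runs ahead of UTC by an integer number of seconds. A minimal sketch of the conversion, assuming the offset of 37 seconds in effect since 1 January 2017 (this constant changes whenever a new leap second is announced, so it is an assumption valid only for recent dates):

```python
from datetime import datetime, timedelta

# TAI leads UTC by an integer number of accumulated leap seconds.
# 37 s has been the offset since 2017-01-01; update when a new leap
# second is announced.
TAI_MINUS_UTC = timedelta(seconds=37)

def tai_to_utc(tai: datetime) -> datetime:
    """Convert a TAI timestamp to UTC, valid for dates after 2017-01-01."""
    return tai - TAI_MINUS_UTC

tai = datetime(2020, 6, 1, 12, 0, 37)
print(tai_to_utc(tai))  # → 2020-06-01 12:00:00
```

Note that `datetime` cannot represent the leap second itself (23:59:60); production timekeeping software consults a maintained leap-second table rather than a fixed constant.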