October 17, 2006
A Brief History of Timekeeping
As we approach the end of Daylight Saving Time, that biannual reminder of how important time has become to us, a brief overview of the development of clocks seems appropriate.
Ancient peoples probably measured time primarily by the motion of celestial bodies (sun, moon, planets, and stars) across the sky. It's likely that prehistoric civilizations understood the period of the lunar cycle, and were able to use it to predict seasonal changes tens of thousands of years ago. However, they have left us little evidence to confirm that.
Archeological evidence proves that as early as 4000 B.C., Egyptians were using a 365-day calendar, an amazing achievement considering the limited technology available to them. But they subsequently managed to build the pyramids with that technology, so perhaps we shouldn't be surprised. By the time Stonehenge was built, roughly 2000 B.C., several other civilizations, including the Sumerians, Babylonians, and Mayans, had developed calendars based on 12 lunar cycles or months.
By then, in addition to marking the passage of seasons to aid in agricultural and migratory decision-making, our ancestors had also found a need to more accurately measure the passage of time during the day. Perhaps this was driven by regularly scheduled religious observances or other formal gatherings, which arose with the advent of more structured and bureaucratic societies and governments.
Whatever the reason, primitive clocks were in use some 5000 years ago. The Egyptians constructed obelisks (tall, narrow, pointed monuments) at least in part to accurately measure the motion of the sun by tracking the progress of their shadows. The obelisk is in fact a prototype sundial. Smaller and more practical sundial designs were invented more than 3000 years ago. Such devices still exist today, though they are likely to be more ornamental than functional.
It was about this time, roughly 3000 years ago, that water clocks were invented. These clocks were simply vessels into which a steady drip of water was allowed to fall. Graduations on the inner surface of the vessel indicated the passage of time, as the water level rose. Alternatively, a vessel was filled with water, which was allowed to drip out through an aperture in the bottom, and time was measured by the lowering water level. The water clock was probably the first timepiece to use a technique other than the motion of celestial bodies.
Though sundials and water clocks were refined and improved, they remained the state of the art in timekeeping for more than 2000 years. Some impressive mechanized clocks utilizing gears, shafts, and weights were developed in Asia and Greece prior to 1000 A.D., but because of their large size and complexity, they were of limited practical value.
As Europe emerged from the technological stagnation of the Dark Ages, mechanized clocks were developed in the 14th and 15th centuries in Italy and Germany. These were still primarily large, bulky, spring-driven devices, and not very accurate. However, they laid the foundation for much more sophisticated clocks.
In the 17th century, Dutch scientist Christiaan Huygens built the first pendulum clock. This was a huge leap forward. The regular period of oscillation of the pendulum allowed clockmakers to build devices accurate to within a few seconds per day. By the 18th century, pendulum clocks accurate to one second per day were being produced. Today's best pendulum clocks can attain accuracies on the order of one second per year.
The 18th century also saw the invention of a spring and balance wheel design, which allowed precision timekeeping on a ship at sea. This was a tremendous aid to navigation, as a sailor needed to combine the positions of the stars with an accurate time to correctly determine his position. The spring and balance wheel escapement is still used today in many wristwatches.
The next quantum leap was the development of the quartz clock in the 1930s. These clocks are based on piezoelectricity, the property of certain crystalline materials to generate an electric charge when placed under physical stress (and, conversely, to deform when a voltage is applied). When placed into an electrical circuit, a quartz crystal vibrates at a very stable frequency and, via piezoelectric principles, emits a correspondingly regular electrical signal.
One problem with this technology is that the frequency of the signal changes slightly with variations in temperature. But because quartz clocks are sufficiently accurate (a second per day or better) for most modern uses and inexpensive to produce, they continue to dominate the market today. Some high-end quartz watches and clocks are programmed to regulate themselves by "learning" from being set, and thus become more accurate over time.
Finally, the atomic clock was developed in the 1950s, setting the modern standard for timekeeping accuracy. These clocks are based on the principle that all atoms resonate and emit radiation at very specific frequencies. The best atomic clocks now keep time to accuracies of 30 billionths of a second per year. This represents an inaccuracy of about one second in 30 million years. While such precise timekeeping doesn't mean much to most of us, it is critical to certain fields of scientific research.
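As a quick sanity check on that figure (a sketch of my own, not part of the original article): a drift rate of 30 billionths of a second per year accumulates one full second of error in about 33 million years, consistent with the rough "30 million years" quoted above.

```python
# Sanity check: how long does a clock that drifts 30 billionths of a
# second per year take to accumulate one full second of error?
drift_per_year = 30e-9  # seconds of error per year

years_to_one_second = 1 / drift_per_year
print(f"{years_to_one_second / 1e6:.1f} million years")  # 33.3 million years
```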
In 1967, the General Conference on Weights and Measures redefined the basic unit of time, the second, as 9,192,631,770 cycles of the radiation corresponding to a transition of the cesium-133 atom. This replaced the previous scientific definition, based on the rotation of the earth, after thousands of years of relying on that motion to measure the passage of time.
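Under that definition, measuring time amounts to counting oscillations. A minimal sketch of the idea (the constant is the official value from the definition; the function name is my own illustration):

```python
CESIUM_HZ = 9_192_631_770  # cycles per second, by definition of the SI second

def cycles_to_seconds(cycle_count):
    """Convert a count of cesium transition cycles to elapsed seconds."""
    return cycle_count / CESIUM_HZ

# One day (86,400 seconds) corresponds to roughly 7.94e14 cycles.
cycles_in_a_day = CESIUM_HZ * 86_400
print(cycles_to_seconds(cycles_in_a_day))  # 86400.0
```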
The TechKnow Guy utilized source material from about.com in preparing this article. For more information on the history of time measurement, see http://inventors.about.com/library/weekly/aa070701a.htm.