Greenwich Becomes Mean
June 22, 1675
King Charles II decrees the establishment of an observatory at Greenwich for the purpose of finding better ways of determining the longitudinal position of ships at sea. The prevailing theory at the time was that accurate star charts, combined with a table of the moon's position, would help navigators establish how far east or west of Greenwich they were located. This method never proved reliable enough, however, and a time-based method eventually won out after clockmaker John Harrison created spring-driven timepieces that could keep sufficiently accurate time at sea. It took nearly 100 years after the establishment of the Royal Observatory at Greenwich for Harrison's method to be accepted as a reliable standard. British mariners would set at least one chronometer on their ship to Greenwich Mean Time in order to calculate their precise longitude. By 1884, 72% of global commerce used nautical charts based on Greenwich, and in that year it was established as the Prime Meridian of the world, which also led to Greenwich Mean Time becoming the international time standard. In the 1970s, Coordinated Universal Time (UTC) became the world time standard, using GMT as its base time zone (UTC+00:00).
UNIX and UNIX-based operating systems keep time in UTC and apply an offset for the local time zone. Notably, Windows is the major operating system that instead assumes your computer's hardware clock is set to local time. In fact, UNIX-based operating systems define the current time as the number of seconds that have passed since 00:00:00 UTC on Thursday, 1 January 1970, otherwise known as the UNIX epoch. So for those of you following along, a decree by a king in 1675 is the basis for how much of our technology today keeps track of time.
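To see this in action, here is a small Python sketch: it reads the current UNIX timestamp and converts it to UTC, and it confirms that timestamp zero is indeed midnight on 1 January 1970 at the Greenwich meridian.

```python
import time
from datetime import datetime, timezone

# Seconds elapsed since the UNIX epoch (00:00:00 UTC, 1 January 1970).
now = time.time()

# The same instant expressed in UTC, the modern successor to GMT.
utc_now = datetime.fromtimestamp(now, tz=timezone.utc)
print(int(now), "seconds since the epoch =", utc_now.isoformat())

# Timestamp zero is the epoch itself.
epoch = datetime.fromtimestamp(0, tz=timezone.utc)
print(epoch.isoformat())  # 1970-01-01T00:00:00+00:00
```

Your local clock display is just this UTC count plus your time zone's offset, which is why the same timestamp renders differently in New York and Tokyo.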