Wednesday, September 7, 2011

The History of the Meter

In science, precise measurements are everything. To test even the simplest theories, you need systems of measurement that are both accurate enough for your purposes and translatable to other scientists, so people can actually collaborate. That's what the SI units are for; they're a universal system for measuring things. The meter, however stubbornly America refuses to use it, is actually the perfect ruler for science (we'll see why later). But I wondered today: what did the world do before the French intervened with their fancy “Système International d'unités”? How did we measure football fields?

Yeah, I went there.


Of course, each culture developed its own system of measurement according to what it had around it. For the Ancient Greeks, this meant their body parts. They measured things in daktyloi (fingers) and podes (feet). They also had units for the length of the palm, the length of the thumb, the length of a stadium, how much land a pair of oxen could plow in a day... there wasn't so much a 'standard' as an approximation of pretty much everything involved in daily Greek life, though the units were all loosely based on the daktylos and pous. The Ancient Egyptians had a similar system, adding knotted ropes for extra accuracy when plotting land and keeping the standards for these measurements in their temples.
The Romans then built on what the Greeks and Egyptians had done, standardizing it even further. Instead of a loose mix of digits and feet, the Roman system was based purely on the foot. A digit was now 1/16th of a foot, an inch was 1/12th, and a palm was 1/4th (a relative of the 'hand' still used to measure a horse's height today). The mile was defined as 5,000 feet, the distance covered in 1,000 two-step paces.
As you probably guessed, the present English system of units is based on the Roman one. After the Norman conquest of England in 1066, the French reintroduced the Roman system and sought to replace the earlier Anglo-Saxon measures like the barleycorn and the rod. Despite the merger, though, the English continued to use some really weird measurements, like:
furlong: how far a plow team could go without stopping to rest
league: an hour’s walk (around three miles)
fathom: the arm span of an adult from fingertip to fingertip
rod: 20 feet, as in twenty of your own feet, not the standardized foot
bovate: the amount of land one ox could plow in a single year, which was not to be confused with...
virgate: the amount of land two oxen could plow in a year
Oh the English. Clearly, some sort of standard was needed. 

Lessons from the Anglo-Saxons: Never work a pair of oxen past a virgate.
You do not want that kind of trouble on your hands. 
And so, Louis XVI commissioned a group of scientists to create a unit of measurement based on an absolute, something anyone could use. The first proposed meter, as it was called, was the length of a pendulum with a half-period of one second. However, this neglected the fact that gravity isn't quite the same everywhere on Earth, so that length came out a little different depending on where you measured it. Hmm.
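Here's a rough sketch of how big that problem actually is, using the standard small-angle pendulum formula and a few approximate gravity values I'm plugging in purely for illustration (they're not from the original proposal):

```python
import math

# Length of a simple pendulum with a given period (small-angle approximation):
# T = 2*pi*sqrt(L/g)  =>  L = g * (T / (2*pi))**2
def pendulum_length(g, period=2.0):
    # A half-period of one second means a full period of two seconds.
    return g * (period / (2 * math.pi)) ** 2

# Approximate local gravity in m/s^2 (illustrative values only).
for place, g in [("equator", 9.780), ("Paris", 9.809), ("poles", 9.832)]:
    print(f"{place}: L = {pendulum_length(g):.4f} m")

# Prints roughly 0.9909 m at the equator vs. 0.9962 m at the poles:
# about half a centimetre of disagreement, which is no good for a universal standard.
```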
In 1791, this group decided to base the meter on the size of the Earth instead, defining it as one ten-millionth of the distance from the equator to the North Pole along the meridian running through Paris. A few years later, a solid platinum bar was made to this exact length to act as the prototype meter. As time went on, the length of the meter was pinned down more and more precisely in terms of this hunk of metal: first it was simply the length of the platinum bar, then a platinum-iridium alloy bar at the freezing point of water, then that same alloy bar at freezing point and sea level, “supported on two cylinders of at least one centimetre diameter, symmetrically placed in the same horizontal plane at a distance of 571 millimetres from each other.” It was simply ridiculous, the lengths* they went to in order to narrow this down.
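For the curious, the arithmetic behind that "one ten-millionth" looks like this (the quadrant figure below is a modern estimate of roughly 10,002 km, my number rather than the 1790s survey value):

```python
# One ten-millionth of the equator-to-pole distance along the Paris meridian,
# using a modern estimate of roughly 10,002 km for that quadrant.
quadrant_m = 10_002_000                  # metres, approximate modern figure
metre_1791 = quadrant_m / 10_000_000     # the 1791 definition of the meter
print(f"{metre_1791:.4f} m")             # ~1.0002 m: the surveyed prototype came out about 0.2 mm short
```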
FINALLY, in 1983 (with a clarification added by the International Committee for Weights and Measures as recently as 2002), the meter was defined to be:
"...the base unit of length in the International System of Units that is equal to the distance traveled by light in a vacuum in 1/299,792,458th of a second or to about 39.37 inches" (Webster)
Having a unit of length defined in terms of the speed of light, instead of any physical object, made the meter the perfect universal ruler for science. And just think: it only took a little less than 5,000 years. That's the power of progress.
*Hey-yo! Sorry, those opportunities are just too good. 

1 comment:

  1. Nicely written. This belongs on Stuff You Should Know. It's interesting, informative, and entertaining
