Tuesday, August 9, 2011

PFE026: Physical Standards

The questions of "What is a meter?" or "What do we mean by a second?" often come up, mainly because the answers are rather complicated.

First, I will say that while I think about driving distances and how tall someone is in terms of inches, feet, and miles, I wish I didn't. Not only that, of course, but the rest of the world [Hello [rest of the] world!] prefers the metric system. More important, however, is that the scientific community prefers the metric system. As such, any fancy sciency definitions of mass or what-not are probably going to be metric-based.

Let's start with mass. I was taught as a kid that one gram is the mass of one milliliter, or one cubic centimeter, of water [note that liters are defined in terms of length in this way]. A more on-point definition also specifies the temperature [historically at $0^\circ$C, right above the melting point].

Unfortunately, impurities in water are fairly common. Moreover, one gram is not a particularly practical size for everyday things. So the standard was shifted up by a factor of 1,000, and we get the kilogram.
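
Just to spell out the bookkeeping [nothing official here, purely arithmetic]: a liter is a 10 cm cube, so scaling the gram up by a factor of 1,000 simply trades a cubic centimeter of water for a full liter of it:

$$1 \text{ kg} = 1,000 \text{ g} \;\longleftrightarrow\; 1,000 \text{ cm}^3 = (10 \text{ cm})^3 = 1 \text{ L of water}.$$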

For this, some French and Italian scientists fashioned the first formal kilogram in 1799: a platinum weight equal in mass to 1,000 cubic centimeters of water. But for the temperature, instead of choosing $0^\circ$C, they chose $4^\circ$C, which is a slightly more stable reference point [it's where water is densest]. Who really cares anyways? In 1875, a newer, fancier kilogram was commissioned, along with a number of duplicates.

These are locked up around the world and taken out once in a while for comparisons. As weird as it might sound, they seem to actually change a bit over time. That said, they are still extremely accurate, so don't worry about a redefinition of the kilogram causing your weight to go up - that's just because of that cheeseburger from last night.

The standard for mass, the kilogram, has historically been tied to the standard for length, the meter. Again, the French were behind this one, only their initial effort was less - precise. They decided that a useful way to define the meter was to declare it as one ten-millionth of the distance from the North Pole to the equator along the meridian through Paris.

It only took them four years to realize the silliness of this. There's no easy way to measure such a distance, not to mention that the earth is far from smooth or spherical.
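
For the record, here is the arithmetic behind that definition [the modern meridian figure is quoted from memory, so treat the last digit gently]:

$$1 \text{ m} \equiv \frac{\text{pole-to-equator distance}}{10,000,000} \quad\Longrightarrow\quad \text{pole-to-equator distance} = 10,000 \text{ km, by construction.}$$

The modern measured value along that meridian is closer to 10,002 km, so the original survey was only off by about 0.02%.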

They quickly replaced this idea with a metal rod, and then, another four years later, when the first kilogram was set, a similar platinum rod was declared to be one meter. This in turn was replaced some 90 years later by a newer, better bar to match the newer, better kilogram. This continues for a while [upgrades, tighter specifications of air pressure, temperature, breathiness of the observer, levelness of the rod, etc.] until the physicists get involved in the 60's. They cleverly noted that a particular line of radiation from krypton is incredibly uniform and, get this, declared that one meter is $1,650,763.73$ wavelengths of said radiation. That's easy to remember!

Krypton, not kryptonite.
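
If you divide that out, the wavelength itself comes to something sensible [just arithmetic on the number above; the line in question is an orange-red emission line of krypton-86]:

$$\lambda = \frac{1 \text{ m}}{1,650,763.73} \approx 6.0578\times10^{-7} \text{ m} \approx 605.78 \text{ nm}.$$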

Of course, we're not done yet, as the physicists decided to tie the meter directly to the speed of light which, as we all know, is exactly:
299,792,458 m/s        (duh)
The reason it is exactly this speed, with no trailing decimals, is that we [physicists, scientists, etc.] basically decided we were sick of it and redefined the meter to round off any extra decimal places. Don't worry about this changing your height, as you could be, at most, 0.0000002 inches shorter than you were before 1983.
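
To see where a number like that comes from [assuming a height of about 70 inches; this is only a bound from the rounding itself]: lopping off the leftover decimals changes the speed by at most 0.5 m/s, so

$$\frac{\Delta h}{h} \lesssim \frac{0.5}{299,792,458} \approx 1.7\times10^{-9}, \qquad \Delta h \lesssim 70 \text{ in} \times 1.7\times10^{-9} \approx 1\times10^{-7} \text{ in},$$

which is the same ballpark as the 0.0000002 inches quoted above.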

This is basically where we stand. If you want to measure exactly one meter, get a flashlight, a really fancy stopwatch, and some reflexes. Turn the light on and hit the stopwatch. When the stopwatch hits $0.00000000333564\;(1/299792458)$ seconds, measure how far the light has gone, and voila! One meter.
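
The check, for what it's worth, is a single multiplication:

$$d = c\,t = 299,792,458 \tfrac{\text{m}}{\text{s}} \times \frac{1}{299,792,458} \text{ s} = 1 \text{ m}.$$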

The main reason these definitions are nicer than the metal-rod option is that they are the same everywhere. Anyone with advanced enough equipment can measure one meter to very high precision.

All of the above, however, still requires an accurate definition of time. Let's take a look at the second.


The second had been casually defined in terms of successive subdivisions of a day - specifically, as the unit of time such that $60\times60\times24=86,400$ of them make up one day. But of course this isn't that easy to measure, not really. First, the factor of 86,400 isn't practical for everyday use. And then there's the fact that the sun doesn't rise at the same time each day, and that the length of a solar day shifts throughout the year.
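
Written as a definition rather than a product, that's simply

$$1 \text{ s} = \frac{1 \text{ day}}{24\times60\times60} = \frac{1 \text{ day}}{86,400}.$$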

In the 1960's, apparently, the best we could come up with was something like one part in 31.6 million of the (tropical) year 1900, worked out by referencing old astronomical data. How useless is that?
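
For the curious, the official 1960 figure [the so-called ephemeris second; I'm quoting the standard number, so treat the trailing decimals as from memory] was

$$1 \text{ s} = \frac{1}{31,556,925.9747} \text{ of the tropical year 1900},$$

which squares with the back-of-the-envelope count of $365.2422 \text{ days} \times 86,400 \text{ s/day} \approx 3.156\times10^7$ seconds in a year.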

The next step was the creation of the [reasonably?] well known atomic clock. Just as the uniquely uniform radiation from krypton was briefly used to define the meter, some clever physicists in the late 60's measured, to an extremely high accuracy [and calibrated against celestial motion so as to match the previous definition of the second], the wiggles of particular cesium atoms. In fact, said atom has to wiggle more than 9 BILLION times to make one second.
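
The exact count, for the record [this is the number written into the 1967 definition, counting periods of the radiation from the hyperfine transition of the cesium-133 ground state]:

$$1 \text{ s} = 9,192,631,770 \text{ periods} \quad\Longrightarrow\quad f_{\mathrm{Cs}} \approx 9.19 \text{ GHz}.$$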

This has essentially remained the same, except that every ten years or so someone comes along and specifies more conditions for the measurement [temperature, pressure, day of week, etc.] in an effort to lock in a prescription anyone can follow.

Thank goodness for fancy pants scientists. We used to have metal rods and fractions of days to understand what distance and time meant. Now we need to count more than 9 billion wiggles of a Cs-133 atom just to know that a second has passed. Geesh.

That's physical standards.
