
FROM UNITS TO UNITY

Author: Guy Snelling

Intercal, 581 Lupton Drive, Halfway House, Midrand. Email: [email protected]. Phone: +27 11 315 4321. Fax: +27 86 515 2344

Abstract

To the layman, metrologists may seem a strange lot indeed. They insist on precision and accuracy, not only in their measurements but in every part of their work. They must know the exact value of a measurement and then they calculate by how much they don’t know the exact value! To call them ‘pedantic’ would be like calling William Shakespeare ‘that bloke that wrote a bit’. But ask a metrologist about the origin of the measurements he or she is making and they will spout forth about famous scientists like Kelvin, Watt and Ampere. They’ll tell you all about the SI Units and the inter-relationship between the seven base units and how the French put together the metric system. But ask them to go further back in time and things start to get a little hazy. Ask them about some of the lesser known units of measurement, such as the ‘jiffy’, the ‘micromort’ and even the ‘quasihemidemisemiquaver’, and they will be looking for the nearest escape route or, at the very least, the nearest stiff drink. In every field of science, it is often good to go back to basics. We can get so wound up in all the theory that we sometimes need to sit back, open our minds and re-live the past in order to bring understanding to the present. This paper will take a look back over the eons to a time before metres, litres and miles; to a life dictated by the length of an arm and a foot; to food sold in baskets and beer sold in caskets; and to a period where you slept when it was dark and ate when you were hungry. We will discover how rudimentary measurements were made and how they were refined into the units and standards that we know (and love) today. Join me on a journey through time, itself only an arbitrary measurement, and discover the roots of our ancient profession.
The Ancients

Everyone knows that the Egyptian pyramids are built to a precision that is difficult to match even today, but at around the same time other cultures were also building vast monuments: the Mayans were building their version of the pyramids in Central America; the Nabataeans built the desert cliff city of Petra; the Neolithic inhabitants of early Britain fashioned slabs of rock into the 50-ton sarsen stones of Stonehenge; and further north the Scandinavians built their burial cairns. Whatever the size and shape of these vast monuments, they all have one thing in common: they needed precise measurements to build them. Precise measurements of course needed a system of measurement consisting of some kind of standard artefact as well as a divisible unit. A good ancient example of this is the Egyptian “cubit”. Traditionally this was based on the length of the prevailing Pharaoh’s forearm and was sub-divided into “palms”, “fingers” and “digits”. It should be noted that other cultures such as the Greeks, Romans and Hebrews had similar units of length also called the cubit, along with the Chinese chi, the Japanese shaku and the Indian hasta. Such standards of course are of little value unless their length can be transferred to secondary and working standards. After all, one cannot expect the great Pharaoh of Egypt to visit a half-completed pyramid to check the measurements with his arm. Thus calibration was invented, and the Egyptian workers were required by law to present their working cubit to their supervisor at each full moon for calibration against a standard cubit. Failure to do so was punishable by death, which may seem severe, but there are a number of QA managers in industry even now who wish that the law could be re-instated. Looking further into the measurement of length, we find more references to the human body. An obvious one is the “foot”, although this was not based on the length of the Pharaoh’s foot but is believed to be a multiple of fractions of the cubit.
It does, however, approximate the length of a human foot, and the Roman foot was around 296 mm long. This was divided into 12 unciae (inches) and also 16 digits. The reference to the mile also comes from the Roman mille, being 1000 paces with a pace being 5 feet. It seems that this would explain the success of the Roman Empire, since the Greek pace was only 2½ feet long and therefore the Romans must have been extremely tall. Of course this was not the case, and the Roman pace of 5 feet was actually the distance travelled by one foot when walking; what we in fact would refer to as two paces.

The Middle Ages

When the Romans occupied England, the 5000-foot mile was used; however, Queen Elizabeth I changed it to 5280 feet, this being exactly 8 furlongs. Maybe the furlong should be explained at this point. In the medieval strip farming system, each farmer had one or more strips of land 220 yards (40 rods) long and 22 yards (1 chain) wide, this being 1 acre in area. Since the farmer ploughed furrows the length of the field, the strip was “a furrow long”, hence the “furlong” for 220 yards. The yard itself initially had several possible lengths, such as the circumference of the waist or the distance from the finger-tip to the nose. It was divided by the binary method into 2, 4, 8 and 16 divisions, being the half-yard, span, finger and nail respectively. Other measurements of length existed, such as rods, poles and perches. Staying with agricultural terms, there is a measurement of area called a “barn”; however, it is unlikely that you could see this with the naked eye as it is the area of the uranium nucleus, being 10⁻²⁸ square metres. There is then also the kilobarn and megabarn, along with the millibarn and the microbarn, although the latter is more often called an “outhouse”. Getting even smaller is the yoctobarn, more commonly known as a “shed”.
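For readers who like to check the arithmetic, the chain of medieval length and area units above can be verified in a few lines; the values used here are the later, standardised imperial ones.

```python
# Medieval length and area relationships, in the later standardised
# imperial values. The base unit for this sketch is the yard.
CHAIN = 22            # yards; the width of a farming strip
FURLONG = 220         # yards; "a furrow long" (10 chains, or 40 rods)
MILE = 8 * FURLONG    # Elizabeth I's statute mile

# One farming strip, a furlong long by a chain wide, is one acre.
ACRE = FURLONG * CHAIN  # square yards

assert MILE == 1760       # yards in a statute mile
assert MILE * 3 == 5280   # feet in a statute mile
assert ACRE == 4840       # square yards in an acre
```

Note that the cricket pitch mentioned shortly is exactly one chain long, which is what makes the farming-strip connection so plausible.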
While sticking with terms for distance, physical exercise also came into the mix with such units as the “league”, which is traditionally the distance a man or a horse can walk in one hour. There were also variations on the length of the mile, with the Scottish mile being 5952 feet, the Irish mile being 6720 feet and the country mile being a generally vague distance, often longer than any measured mile. One should also be wary of the Irish distance of “a mile and a bit”, as the “bit” may be longer than the mile. Before we leave length, it is worth noting that we may have come across the origins of the length of a cricket pitch. This is of course 22 yards, which is the same as the width of a farming strip. It is not beyond the realms of probability to picture an early game of cricket being played across the middle of a number of farming strips, with the batsmen defending the gates or wickets between the strips; something for the workers to do after the harvest and before the planting season. Speaking of planting, we come to another method of providing a reasonably precise measurement. The volume of containers could be measured by filling them with seeds, which were then counted. It turns out that seeds are generally consistent in their size, so they were a useful and easily obtained standard. Seeds were also used in weighing systems, and one unit of that method survives today in the carat, the unit used to measure gold and gems, derived from the Greek word kerátion, which refers to the carob seed. Obviously it was easy to use a lump of metal to represent a certain number of grains, and such lumps were kept in temples as official standards. Copies of these lumps could then be made and checked against the standard using a precise balance. Unfortunately, history shows us that it is all too easy to deceive a customer about weight by adding or removing metal, and so it was necessary to implement an inspectorate of weights and measures.
This same system still exists today, with the standard lump (1 kg) kept in a temple (the BIPM) and regular comparisons and inspections taking place throughout the world. Other units of mass have been used over the centuries, and in some cases they are still used by some die-hards. The pound is equivalent to 0.453 kilograms, and the name comes from the Latin word pondus, meaning weight. Incidentally, the abbreviation (lb) comes from the Latin word “libra”, referring to a specific standard weight (around 325 g). The slug is a mass numerically equal, in pounds, to the acceleration of standard gravity of 32.174 feet per second per second. Thus a 1 slug mass (32.174 pounds) will accelerate at 1 foot per second per second when subjected to a force of 1 pound weight. The stone is another unit that may often be heard in England, being the equivalent of 14 pounds and decreed by King Edward III in 1340 when Flemish measures were adopted to aid England’s wool trade.

More Time Passes

Moving on, reliable and accurate time measurement has always been a practical necessity. In ancient times, it is probable that telling night from day and morning from afternoon was sufficiently accurate for most purposes. Eventually, though, more precise methods of pinpointing the same time each day were required, and instruments such as the sun-dial and the water clock were invented. Although an example of an Egyptian sun-dial from 800 BC has been found, the principle had been known by astronomers long before then. The shadow cast by a stick against a known mark would indicate the time of day, but complications arose with the changing of the angle of the sun throughout the year, especially in the higher latitudes. The water clock attempted to solve this, but maintaining a constant drip-flow of water proved tricky. The hour glass was more successful, and examples were used in 18th-century churches in an attempt to control the length of the sermon. As time passed, it became necessary to have a method of referring to a certain time or time period.
Thus the concept of the “hour” was brought into being. The day was divided into 12 “hours”, this number being easily divisible by 2, 3, 4 and 6. (Similarly, the Babylonians used a base of 60, since this could be easily divided by 2, 3, 4, 5, 6, 10 and 12.) Originally the hour was a variable time period, since the length of the day varied throughout the year. Traditionally the 6th hour would be noon, the 3rd hour would be mid-morning and the 9th hour would be mid-afternoon. Only when clockwork became available in the 14th century did the hour become a specific period of time, namely one 24th of the period between dawn and dawn, although even now clocks are manufactured to indicate the time in two sections of 12 hours each. After a while, it became necessary to divide the hour into something smaller than one quarter, and in the Middle Ages the Babylonian influence divided the hour into 60 parts. In medieval Latin, the phrase for 1/60th is pars minuta prima ('first very small part'), and a sixtieth of that is pars secunda ('second very small part'). Thus we get our “minutes” and “seconds”. By the way, if you tell someone that you will “be there in a jiffy” you will almost certainly be late. In scientific terms a “jiffy” is the amount of time it takes for light to travel a distance the size of an atomic nucleus. So bearing in mind that it takes a little over a second for light to travel to us from the moon, a “jiffy” is, well, pretty quick. By the same token, being there “in two shakes” would not improve your chances of being punctual. A “shake” describes the time it takes for one step in a nuclear reaction; about 10 billionths of a second. Any metrologist worth his salt knows that time is the most accurate measurement that we make at present. While voltage can be measured to fractions of a part per million, atomic clocks can maintain time to within 1 second in 100 million years. But why did it become necessary to measure time so accurately and consistently? The answer lies with the sea explorers of the 17th and 18th centuries.
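Before following those explorers to sea, the tiny time units above can be put into perspective with a short sketch; the nuclear diameter used for the jiffy is an assumed round figure of one femtometre, not a measured value.

```python
C = 299_792_458          # speed of light, m/s (exact by definition)
NUCLEUS = 1e-15          # assumed nuclear diameter, m (~1 femtometre)
MOON = 384_400_000       # mean Earth-Moon distance, m

jiffy = NUCLEUS / C      # light crossing a nucleus: around 3e-24 s
shake = 1e-8             # one step of a nuclear chain reaction, 10 ns
moon_delay = MOON / C    # light from the moon: a little over a second

assert jiffy < shake < moon_delay
```

Even the nuclear physicist's "shake" is some fifteen orders of magnitude slower than a jiffy, so neither promise will get you anywhere on time.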
It is relatively simple to find your latitude on the Earth using the positions of the stars, but in order to find your longitude you need to know the time of day. The more accurately this is known, the more accurately your location can be found. It was simple enough to install a clock in the map room of a ship, but in the 17th century the only available clocks needed to be adjusted daily in order to show the correct time. This wasn’t useful when your voyage could last several months. What was needed was a clock that would maintain the correct time over many months and in adverse and changeable weather conditions. Such a clock (scientifically known as a chronometer) was eventually invented by John Harrison. His design won him the £20 000 prize money offered by the British government, but it took him nearly fifty years of tinkering before one of his clocks managed the journey to Jamaica, losing just 5 seconds in the process. However, it was another 12 years before he was able to show the government that it wasn’t a fluke so that he could be awarded the prize money. A Frenchman by the name of Pierre Le Roy improved on Harrison’s design, and his clock was accurate to within just 8 seconds after a 46-day voyage. When the chronometer was used in conjunction with a sextant, it was now possible to precisely measure position at sea, and the information brought back by explorers was used to produce accurate maps and charts.

Getting Hot Under Pressure

So explorers could now find their position on the surface of the Earth, but what about their altitude? Going back to the 17th century finds the Italian Evangelista Torricelli hard at work, assisting Galileo in his later years and wondering why it is more difficult to pump water from a well when the water is far below ground level. He reasons that it is caused by the extra weight of the column of air above the water and devises an experiment using mercury in a tube to prove this.
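Torricelli's reasoning can be put into numbers: the mercury stops rising when the weight of the column balances the atmospheric pressure pressing on the reservoir, i.e. at a height h = P/(ρg). A quick sketch with standard reference values:

```python
P_ATM = 101_325      # standard atmospheric pressure, Pa
RHO_HG = 13_595.1    # density of mercury at 0 degrees C, kg/m^3
G = 9.80665          # standard gravity, m/s^2

h = P_ATM / (RHO_HG * G)   # height of the balanced mercury column, m
print(round(h * 1000))     # 760 mm: the familiar barometer reading
```

Mercury's high density is the whole point of the design; the equivalent water column would be over ten metres tall, which is exactly the pumping limit that puzzled Torricelli.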
He later notices that the height of the column of mercury varies with the weather and thus invents the barometer. Further reasoning suggested that the pressure measured with the mercury column would decrease as one ascends a mountain, but it was not until Blaise Pascal, the namesake of the pressure unit, that this experiment was conducted. In fact, Blaise was feeling poorly and sent his more robust brother-in-law up the 4000-foot Puy de Dôme to take the readings that allowed for the invention of the altimeter. History records the brother-in-law as one Florin Périer, but since the “périer” never caught on we shall have to stick with the pascal as the unit of pressure. Whilst ascending to 4000 feet, Pascal’s brother-in-law would undoubtedly have experienced not only a discernible drop in air pressure, but also a far more noticeable drop in air temperature. It was around the same time that a German glassblower, one Gabriel Daniel Fahrenheit, was experimenting with the use of mercury in his thermometers. Alcohol had been used for over half a century; in fact, in 1629 Joseph Solomon Delmedigo described in a book a sealed-glass thermometer which used brandy. However, due to the non-linear expansion properties of alcohol, such thermometers were not particularly accurate over a wide range. Herr Fahrenheit was able to manufacture glass tubes thin enough to allow for the lower, but more linear, rate of expansion of mercury, and his design eventually became the standard. The problem was the calibration of these thermometers. This would require the use of two known temperatures, with divisions between them. Rather than using Newton’s proposal for a 12-degree range between the freezing point of water and the temperature of the human body, Fahrenheit elected to use the freezing point of brine as the lower point. Using the convenient divisors of 2, 3 and 4 he then divided this scale into 12 sections of 8 divisions, giving a total of 96 degrees. The freezing and boiling points of water were then 32 °F and 212 °F respectively.
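Fahrenheit's calibration problem, fixing a linear scale from two known temperatures, is easy to express in code. The helper below is a sketch of the general idea rather than any historical procedure: it builds a conversion from two (reference temperature in degrees Celsius, scale reading) pairs.

```python
def make_scale(p1, p2):
    """Build a Celsius-to-X conversion from two (celsius, x) fixed points."""
    (c1, s1), (c2, s2) = p1, p2
    slope = (s2 - s1) / (c2 - c1)
    return lambda c: s1 + slope * (c - c1)

# Water freezes at 32 and boils at 212 on Fahrenheit's scale.
to_fahrenheit = make_scale((0, 32), (100, 212))

print(to_fahrenheit(100))           # 212.0
print(round(to_fahrenheit(37), 1))  # 98.6, close to Fahrenheit's 96-degree body point
```

Every scale in the next paragraphs (Celsius, Kelvin, Rankine, Réaumur) can be generated the same way from its own pair of fixed points, which is why converting between them is only ever a matter of a slope and an offset.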
Later in the century, in 1742, Anders Celsius proposed the use of the freezing and boiling temperatures of water as the calibration points, and in an early example of decimalisation he conveniently divided this range into 100 degrees. It is worth noting, though, that his original scale ran from 100 degrees at the lower end to 0 degrees at the upper. Two years later Carl Linnaeus suggested inverting the scale so that it started at 0 degrees and rose to 100. This 100-degree scale would seem a more logical solution than that of Fahrenheit, but in most English-speaking countries it would be another two centuries before the system took hold, and in some nations the Fahrenheit scale is still used. It was of course Lord Kelvin who deduced the concept of absolute zero, but thankfully the size of the divisions of his scale equates to that of Celsius, which makes the conversion relatively simple. Just to confuse things, though, 11 years later the Scottish scientist William Rankine proposed a similar system but using equivalent degrees of Fahrenheit in his scale. The Réaumur scale, proposed in 1730, uses 0 as the freezing point of water but 80 as the boiling point. This scale had widespread use in Europe, and even though the French adopted the Celsius scale for the metric system in the 1790s, the Réaumur scale was still in use in Russia until the early 20th century. In fact, it is still used today in cheese production in some Italian dairies.

A Light at the End of the Tunnel

It would seem that the reporting of light intensity in the 18th century could be placed into two distinct categories. These were qualitative (“the moon is very bright tonight”) and relative (“the moon is nearly as bright as the sun”). Even though there were two main fields that would have benefited from measuring the intensity of light, namely photography and astronomy, quantitative measurements only became commonplace in the 1930s.
One reason for this may have been the technical difficulties of the measurement itself, and then also the problem of allocating a numerical value to it. 19th-century photometers were generally of the comparison type, where an observer was able to view two light sources and could adjust one of them until both appeared equal. The lack of a suitable standard seems to have hindered any progression beyond this. The introduction of photosensitive elements paved the way to electronic devices that could measure light levels evenly over a wide spectrum, and the judicious use of filters allowed the sensitivity of these devices to emulate the spectral range of the human eye. Now of course it was necessary to find a unit of measurement, and the units chosen went back to basics by using candles as a reference. Candlepower became the unit for intensity at the source, equating it to the light intensity of a candle. The unit of illuminance was the foot-candle, being the illuminance produced on a surface one foot away from a source of one candlepower. In modern times the candela replaced candlepower, and the foot-candle has now been replaced by the lux, being one lumen of luminous flux falling on one square metre of surface. On a lighter note, as a marketing tool in the 1960s, General Electric distributed candles in the shape of a foot. These visual puns were used to promote GE’s range of fluorescent light tubes, and while they could be lit they were not “standard candles” and did not deliver a calibrated amount of illuminance. Another unit that characterises a light source is named after the physicist Karl Jansky. The jansky is used mainly by astronomers to describe light or radio sources that are tiny fractions of the intensity of the Sun. It is handy then that one jansky is just 10⁻²⁶ watts per square metre per hertz.
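Since the foot-candle and the lux differ only in the unit of area, the conversion between them is just the square of the foot-to-metre factor. A minimal sketch:

```python
FOOT = 0.3048    # metres, exact by definition

def footcandles_to_lux(fc):
    """1 foot-candle = 1 lumen/ft^2; 1 lux = 1 lumen/m^2."""
    return fc / FOOT ** 2

JANSKY = 1e-26   # W per square metre per hertz, for comparison

print(round(footcandles_to_lux(1), 4))   # 10.7639 lux per foot-candle
```

The awkward factor of roughly 10.76 is one more small argument for retiring the foot-candle altogether.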
The Tension is Electric

Now that we have moved into the realm of electrical units, it is worth noting that the majority of these units are named after physicists who were prominent in the field. Alessandro Volta, André-Marie Ampère, Georg Ohm, Michael Faraday and Charles-Augustin Coulomb were the main protagonists, but there were a few alternatives used before they were honoured with the naming of the units. For instance, an early unit of capacitance was the Jar, a shortened form of the Leyden Jar. Used mainly by the British Admiralty until the 1930s as a simple means of changing the transmission frequency of their spark transmitters, a Jar consisted of a standard Admiralty glass pint tankard covered with tin-foil as the outside electrode and filled with 1 pint of brine as the inside electrode, giving a capacitance of about 1 Jar (roughly a nanofarad). These were then swapped in or out of the “tank circuit” to adjust the frequency. The pint jar was also useful in case the radio operator needed a “quick one” in the middle of his shift. These days we know that the ampere is one of the base SI units of measurement; however, in the early C.G.S. (centimetre-gram-second) system of units the ohm was declared to be the primary unit of electrical measurement by the BAAS (British Association for the Advancement of Science).

Towards Unity

Having dealt with the origin and history of some of the well-known and some of the more obscure units of measure, we have seen that not only was there very little standardisation between units, but often the same unit had different values depending on the region in which it was used. In fact, this got so bad that in 1795 there were over 700 units of measure in France alone. Not only was this situation a source of error and fraud in commerce, it also held back the development of science. Harmonisation was needed, and it was needed quickly. What was also needed was a common standard that did not cater to any nation’s vanity and that could be easily realised.
Some form of “natural” standard was envisaged, and it was decided that the length of a quarter meridian (the distance from pole to equator on the Earth) would be the easiest to calculate and would be the most universal. The metre was then defined in 1791 as being equal to the ten-millionth part of one quarter of the terrestrial meridian. This definition realised the concept of a standard that was neither arbitrary nor related to any particular nation. The actual length of the metre still had to be determined, however, and it took seven years of triangulation measurements between Dunkirk and Barcelona by Pierre-François Méchain and Jean-Baptiste Delambre to fix the length of the quarter meridian, and hence of the metre. Subsequent measurements show that they didn’t quite get it right and were off by 0.0229 %, but this isn’t bad considering that they climbed mountains, crossed rivers and were even arrested during the French Revolution, their instruments confiscated under the suspicion of being weapons. Once determined, the metre became the base unit that allowed the units for volume and mass to also be realised. The litre is one cubic decimetre and, again keeping things simple and natural, the investigators defined the kilogram as the mass of one litre of distilled water. In 1799 the first standards of length and mass were manufactured and were deposited in the Archives of the Republic, dedicated to “all men and all times”. The metric system was both simple and universal, and it was soon adopted in many countries throughout Europe and Latin America. There were difficulties, since all countries were dependent on France whenever copies of the standards were required, which put the concept of unification in jeopardy. Thus the Metre Convention was signed by 17 member states on the 20th of May 1875, founding the Bureau International des Poids et Mesures (BIPM).
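To put the surveyors' 0.0229 % error in perspective, it can be scaled onto a single metre and onto the full quarter-meridian arc:

```python
ERROR = 0.0229 / 100                 # the surveyors' relative error, from above

error_per_metre_mm = ERROR * 1000    # shortfall per metre, in millimetres
error_total_km = ERROR * 10_000      # over the nominal 10 000 km arc

print(round(error_per_metre_mm, 3))  # 0.229 -- about a quarter of a millimetre per metre
print(round(error_total_km, 2))      # 2.29 -- kilometres over the whole quarter meridian
```

A quarter of a millimetre on a metre bar, achieved with 18th-century theodolites while dodging revolutionary patrols, is a remarkable result by any standard.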
The BIPM was initially intended to maintain and disseminate the metric system throughout the world by constructing and maintaining new prototypes of the metre and kilogram. In addition, the BIPM was charged to develop and refine new measurement methods in order to promote metrology in all fields. As industry developed, the BIPM moved its research into further fields such as temperature, electrical measurement, photometry and ionising radiation. In 1960 the International System of Units (SI Units) was officially founded. In this system, all units can be mathematically reduced to just seven base units. While this is convenient, it does show that the utmost care must be taken when realising the base units, and due diligence must also be taken to continually improve their definitions. This is the job of the national measurement laboratories world-wide. As greater precision is required, the development of a base unit may result in the definition being changed, sometimes not only mathematically but also physically, as in the case of the metre. While the original metre was defined as one ten-millionth of the quarter meridian of the earth, it was difficult to measure, and this is why the prototype was created and stored in the National Archives. This prototype of course was subject to degradation by environmental factors, and as greater precision was required a new definition had to be found. In 1960 it was defined as 1 650 763.73 times the wavelength of the orange radiation from the krypton-86 atom in vacuum. This definition brought the metre back to a natural, reproducible standard, both permanent and invariable, and offered an accuracy at least 50 times better than the original prototype. In 1983 it was redefined again, following important work on the laser and on atomic clocks, and is now "the distance travelled by light in vacuum in 1/299 792 458 of a second".
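Both modern definitions of the metre can be checked numerically. The krypton-86 wavelength below is the accepted value for that orange line (about 605.78 nm), and the 1983 definition simply fixes the speed of light:

```python
KR86_WAVELENGTH = 605.780210e-9    # metres; accepted value of the orange line
N_WAVES = 1_650_763.73             # wavelength count in the 1960 definition

metre_1960 = N_WAVES * KR86_WAVELENGTH
print(round(metre_1960, 6))        # 1.0 -- the definition closes on one metre

C = 299_792_458                    # m/s, exact since 1983
time_for_one_metre = 1 / C         # ~3.34 nanoseconds of light travel
```

Note how the 1983 wording inverts the logic: the speed of light is no longer something we measure in metres, but the constant from which the metre itself is derived.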
The Future

Work continues on the development of the base units, and indeed we are now at a point where we can define all of them, including the kilogram, in terms of physical constants. This will rid metrology of the last remaining mass prototypes, stored at the BIPM and other national laboratories for over 200 years, and will mean that all the base units are defined in terms of fundamental quantities of nature. This so-called “New SI” system is not without its controversy, and in fact the values of some of the required constants still need to be agreed upon; however, the International Committee for Weights and Measures has already accepted the new definitions and just awaits the accepted values of the constants to within acceptable uncertainties.

Stuck in the Past

While most nations have legally adopted the metric system and the SI units, it has taken time, and the mismatch of units has led to several international incidents, including the loss of the Mars Climate Orbiter in 1999, when the designers and builders of the thrusters confused pound-force seconds with newton seconds, causing the $US 125 million orbiter to make an unscheduled and rather sudden stop on the surface of Mars. Hopefully, now that the science and engineering communities of the world have settled on using the SI Units, such disasters are a thing of the past, but in everyday use it is often the layman who resists change. Members of the public in some countries are still wont to use imperial measures such as feet, inches, pounds and gallons, and some governments as well, despite accepting the metric system, have resisted the change to adopt it fully. The United Kingdom is an example, where goods are sold in supermarkets using SI units (kilograms and litres) while in a restaurant a pint of beer and a 16-ounce steak are ordered.
Surprisingly though, even in a country where petrol is sold by the litre while fuel consumption is calculated in miles-per-gallon, the UK public seem to be at ease with using a dual system. America also uses a dual system, as the scientific, engineering and medical communities use the SI Units, while “customary units” are for everyday use. This makes things even more complicated, since U.S. customary units are based on the British units in use before these were overhauled in 1824 to become the Imperial units. Thus the US pint and the US gallon are both smaller than their Imperial counterparts, while the US survey mile is slightly longer than the agreed “international” mile. While the SI units have gone a long way to providing unity in legal measurements, as long as there is national pride and resistance to conformity by the layman for everyday use, it seems that we will be forever stuck with having to use conversion tables, and unity may remain a pipe-dream.
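The US-versus-Imperial trap is easy to quantify. The litre equivalents below are the exact legal conversion values, and the pound-force factor is the one at the heart of the Mars Climate Orbiter loss:

```python
US_GALLON = 3.785411784    # litres, exact
IMP_GALLON = 4.54609       # litres, exact
US_PINT = US_GALLON / 8    # both systems put 8 pints in a gallon
IMP_PINT = IMP_GALLON / 8

assert US_PINT < IMP_PINT and US_GALLON < IMP_GALLON  # US measures are smaller

# Impulse reported in pound-force seconds but read as newton seconds is
# wrong by the weight of one pound under standard gravity:
LBF_TO_N = 4.448221615     # newtons per pound-force
print(round(LBF_TO_N, 2))  # 4.45 -- the factor behind the orbiter's demise
```

A factor of nearly four and a half in thruster impulse is more than enough to turn an orbital insertion into an impact, which is why mixed-unit interfaces remain the metrologist's cautionary tale.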

References & further reading

A History of Light and Colour Measurement: Science in the Shadows, Sean F. Johnston, University of Glasgow, Crichton Campus, UK; Institute of Physics Publishing, Bristol and Philadelphia
http://www.french-metrology.com/en/history/history-mesurement.asp
http://theweek.com/articles/456692/11-teeny-units-measurement-tiny-things
http://www.theledlight.com/lumens.html
https://en.wikipedia.org/wiki/Foot-candle
https://sizes.com/units/sys_cgs.htm
https://sizes.com/units/ampHist.htm
http://hemyockcastle.co.uk/measure.htm
https://en.wikipedia.org/wiki/Proposed_redefinition_of_SI_base_units
https://en.wikipedia.org/wiki/History_of_measurement
http://www.historyworld.net/wrldhis/PlainTextHistories.asp?historyid=ac07