Measurement Technology and Techniques


PART I: Measurement Technology and Techniques

www.newnespress.com

CHAPTER 1: Fundamentals of Measurement

G. M. S. de Silva

1.1 Introduction

Metrology, or the science of measurement, is a discipline that plays an important role in sustaining modern societies. It deals not only with the measurements that we make in day-to-day living, such as at the shop or the petrol station, but also with those made in industry, science, and technology. The technological advancement of the present-day world would not have been possible without the contribution made by metrologists all over the world to maintain accurate measurement systems.

The earliest metrological activity has been traced back to prehistoric times. For example, a beam balance dated to 5000 BC has been found in a tomb in Nagada in Egypt. It is well known that the Sumerians and Babylonians had well-developed systems of numbers. The very high level of astronomy and the advanced status of time measurement in these early Mesopotamian cultures contributed much to the development of science in later periods in the rest of the world. The colossal stupas (large hemispherical domes) of Anuradhapura and Polonnaruwa and the great tanks and canals of the hydraulic civilization bear ample testimony to the advanced system of linear and volume measurement that existed in ancient Sri Lanka. There is evidence that well-established measurement systems existed in the Indus Valley and Mohenjo-Daro civilizations. In fact, the number system we use today, known as the Indo-Arabic numbers, with positional notation for the symbols 1–9 and the concept of zero, was introduced into western societies by an English monk who translated the books of the Arab writer Al-Khwarizmi into Latin in the 12th century.

In the modern world metrology plays a vital role in protecting the consumer and in ensuring that manufactured products conform to prescribed dimensional and quality standards.
In many countries the implementation of a metrological system is carried out under three distinct headings or services, namely, scientific, industrial, and legal metrology. Industrial metrology is mainly concerned with the measurement of length, mass, volume, temperature, pressure, voltage, current, and a host of other physical and chemical parameters needed for industrial production and process control. The maintenance of accurate dimensional and other physical parameters of manufactured products, to ensure that they conform to prescribed quality standards, is another important function carried out by industrial metrology services. Industrial metrology thus plays a vital role in the economic and industrial development of a country. It is often said that the level of industrial development of a country can be judged by the status of its metrology.

1.2 Fundamental Concepts

The most important fundamental concepts of measurement, except the concept of uncertainty of measurement, are explained in this section.

1.2.1 Measurand and Influence Quantity

The specific quantity determined in a measurement process is known as the measurand. A complete statement of the measurand also requires specification of other quantities, for example, temperature, pressure, humidity, and the like, that may affect its value. These quantities are known as influence quantities. For example, in an experiment performed to determine the density of a sample of water at a specific temperature (say 20 °C), the measurand is the "density of water at 20 °C." In this instance the only influence quantity specified is the temperature, namely, 20 °C.

1.2.2 True Value (of a Quantity)

The true value of a quantity is defined as the value consistent with its definition. This implies that there are no measurement errors in the realization of the definition. For example, the density of a substance is defined as mass per unit volume.
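As a minimal numerical sketch of the measurand concept (all sample values below are hypothetical, chosen only for illustration), the density follows directly from a measured mass and volume, with the temperature recorded as the influence quantity:

```python
# Determining the measurand "density of water at 20 °C" from a measured
# mass and volume. The temperature is the specified influence quantity.
# All numerical values here are hypothetical, for illustration only.
temperature_c = 20.0        # influence quantity stated in the measurand
mass_kg = 0.99820           # measured mass of the water sample
volume_m3 = 1.000e-3        # measured volume of the sample at 20 °C

density_kg_per_m3 = mass_kg / volume_m3
print(f"density of water at {temperature_c:.0f} °C "
      f"= {density_kg_per_m3:.1f} kg/m³")
```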
If the mass and volume of the substance could be determined without making measurement errors, then the true value of the density could be obtained. Unfortunately, in practice, neither of these quantities can be determined without experimental error. Therefore the true value of a quantity cannot be determined experimentally.

1.2.3 Nominal Value and Conventional True Value

The nominal value is the approximate or rounded-off value of a material measure or characteristic of a measuring instrument. For example, when we refer to a resistor as 100 Ω or to a weight as 1 kg, we are using their nominal values. Their exact values, known as conventional true values, may be 99.98 Ω and 1.0001 kg, respectively. The conventional true value is obtained by comparing the test item with a higher-level measurement standard under defined conditions. If we take the example of the 1-kg weight, the conventional true value is the mass value of the weight as defined in the OIML (International Organization for Legal Metrology) International Recommendation R 33; that is, the apparent mass value of the weight, determined using weights of density 8000 kg/m³ in air of density 1.2 kg/m³ at 20 °C, with a specified uncertainty figure. The conventional value of a weight is usually expressed in the form 1.001 g ± 0.001 g.

1.2.4 Error and Relative Error of Measurement

The difference between the result of a measurement and its true value is known as the error of the measurement. Since a true value cannot be determined, the error, as defined, cannot be determined either. A conventional true value is therefore used in practice to determine an error. The relative error is obtained by dividing the error by the average of the measured value. When it is necessary to distinguish an error from a relative error, the former is sometimes called the absolute error of measurement.
As the error could be positive or negative, another term, the absolute value of the error, is used to express the magnitude (or modulus) of the error.

As an example, suppose we want to determine the error of a digital multimeter at a nominal voltage level of 10 V DC. The multimeter is connected to a DC voltage standard supplying a voltage of 10 V DC and the reading is noted down. The procedure is repeated several times, say five times. The mean of the five readings is calculated and found to be 10.2 V. The error is then calculated as 10.2 − 10.0 = +0.2 V. The relative error is obtained by dividing 0.2 V by 10.2 V, giving 0.02. The relative error as a percentage is obtained by multiplying the relative error (0.02) by 100; that is, the relative error is 2% of the reading. In this example a conventional true value, namely, the voltage of 10 V DC supplied by the voltage standard, is used to determine the error of the instrument.

1.2.5 Random Error

The random error of measurement arises from unpredictable variations of one or more influence quantities. The effects of such variations are known as random effects. For example, in determining the length of a bar or gauge block, the variation in temperature of the environment gives rise to an error in the measured value. This error is due to a random effect, namely, the unpredictable variation in the environmental temperature. It is not possible to compensate for random errors. However, the uncertainties arising from random effects can be quantified by repeating the experiment a number of times.

1.2.6 Systematic Error

An error that occurs due to a more or less constant effect is a systematic error. If the zero of a measuring instrument has been shifted by a constant amount, this would give rise to a systematic error. In measuring the voltage across a resistance using a voltmeter, the finite impedance of the voltmeter often causes a systematic error.
A correction can be computed if the impedance of the voltmeter and the value of the resistance are known. Often, measuring instruments and systems are adjusted or calibrated using measurement standards and reference materials to eliminate systematic effects. However, the uncertainties associated with the standard or the reference material are incorporated in the uncertainty of the calibration.

1.2.7 Accuracy and Precision

The terms accuracy and precision are often misunderstood or confused. The accuracy of a measurement is the degree of its closeness to the true value. The precision of a measurement is the degree of scatter of the measurement result when the measurement is repeated a number of times under specified conditions.

In Figure 1.1 the results obtained from a measurement experiment using a measuring instrument are plotted as a frequency distribution. The vertical axis represents the frequency of the measurement result and the horizontal axis represents the values of the results (X). The central vertical line represents the mean value of all the measurement results. The vertical line marked T represents the true value of the measurand. The difference between the mean value and the T line is the accuracy of the measurement.

Figure 1.1: Accuracy, precision, and true value. (Legend: T – true value; S – reference standard; Δ – accuracy; σx – precision; U – uncertainty of reference standard.)

The standard deviation (marked σx) of all the measurement results about the mean value is a quantitative measure of the precision of the measurement. Unfortunately, the accuracy defined in this manner cannot be determined, as the true value (T) of a measurement cannot be obtained due to errors prevalent in the measurement process.
The only way to obtain an estimate of accuracy is to use a higher-level measurement standard in place of the measuring instrument to perform the measurement and use the resulting mean value as the true value.
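Pulling the chapter's numbers together, the multimeter example of Section 1.2.4 can be sketched in a short script: the mean's offset from the conventional true value estimates the accuracy, and the sample standard deviation of the repeated readings estimates the precision. The five individual readings are hypothetical (the text gives only their mean, 10.2 V):

```python
import statistics

# Five hypothetical repeat readings of a 10 V DC reference standard
# (conventional true value), consistent with the mean of 10.2 V given
# in the worked example of Section 1.2.4.
reference_v = 10.0
readings_v = [10.1, 10.2, 10.3, 10.2, 10.2]

mean_v = statistics.mean(readings_v)          # 10.2 V
error_v = mean_v - reference_v                # +0.2 V, estimate of accuracy
relative_error = error_v / mean_v             # ~0.02, i.e. ~2% of reading
precision_v = statistics.stdev(readings_v)    # scatter about the mean (σx)

print(f"mean = {mean_v:.2f} V")
print(f"error = {error_v:+.2f} V ({relative_error:.1%} of reading)")
print(f"precision (sample std dev) = {precision_v:.3f} V")
```

Note that the standard deviation quantifies only the random (precision) component; the constant +0.2 V offset is a systematic effect that repetition alone cannot reveal.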