Beginner's Guide to Measurement in Mechanical Engineering
Total pages: 16
File type: PDF, size: 1020 KB
Recommended publications
-
11 Fluid Statics
CHAPTER 11 | FLUID STATICS

Figure 11.1 The fluid essential to all life has a beauty of its own. It also helps support the weight of this swimmer. (credit: Terren, Wikimedia Commons)

Learning Objectives
11.1. What Is a Fluid?
• State the common phases of matter.
• Explain the physical characteristics of solids, liquids, and gases.
• Describe the arrangement of atoms in solids, liquids, and gases.
11.2. Density
• Define density.
• Calculate the mass of a reservoir from its density.
• Compare and contrast the densities of various substances.
11.3. Pressure
• Define pressure.
• Explain the relationship between pressure and force.
• Calculate force given pressure and area.
11.4. Variation of Pressure with Depth in a Fluid
• Define pressure in terms of weight.
• Explain the variation of pressure with depth in a fluid.
• Calculate density given pressure and altitude.
11.5. Pascal’s Principle
• Define pressure.
• State Pascal’s principle.
• Understand applications of Pascal’s principle.
• Derive relationships between forces in a hydraulic system.
11.6. Gauge Pressure, Absolute Pressure, and Pressure Measurement
• Define gauge pressure and absolute pressure.
• Understand the working of aneroid and open-tube barometers.
11.7. Archimedes’ Principle
• Define buoyant force.
• State Archimedes’ principle.
• Understand why objects float or sink.
• Understand the relationship between density and Archimedes’ principle.
11.8. Cohesion and Adhesion in Liquids: Surface Tension and Capillary Action
• Understand cohesive and adhesive forces.
• Define surface tension.
• Understand capillary action.
11.9. Pressures in the Body
• Explain the concept of pressure in the human body.
• Explain systolic and diastolic blood pressures.
• Describe pressures in the eye, lungs, spinal column, bladder, and skeletal system.

-
Pressure, Its Units of Measure and Pressure References
White Paper
Pressure, Its Units of Measure and Pressure References

Viatran, 3829 Forest Parkway, Suite 500, Wheatfield, NY 14120
Phone: 1-716-629-3800, Fax: 1-716-693-9162, [email protected], www.viatran.com

This technical note is a summary reference on the nature of pressure, some common units of measure and pressure references. Read this and you won’t have to wait for the movie!

PRESSURE

Gas and liquid molecules are in constant, random motion called “Brownian” motion. The average speed of these molecules increases with increasing temperature. When a gas or liquid molecule collides with a surface, momentum is imparted into the surface. If the molecule is heavy or moving fast, more momentum is imparted. All of the collisions that occur over a given area combine to result in a force. The force per unit area defines the pressure of the gas or liquid.

If we add more gas or liquid to a constant volume, then the number of collisions must increase, and therefore pressure must increase. If the gas inside the chamber is heated, the gas molecules will speed up, impact with more momentum, and pressure increases. Pressure and temperature therefore are related (see table at right).

The lowest pressure possible in nature occurs when there are no molecules at all. At this point, no collisions exist. This condition is known as a pure vacuum, or the absence of all matter. It is also possible to cool a liquid or gas until all molecular motion ceases. This extremely cold temperature is called “absolute zero”, which is -459.67 °F.
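The excerpt defines pressure as force per unit area and notes that, at constant volume, pressure rises with temperature. The white paper's own tables are not reproduced here; as a minimal sketch (not from the source), the standard definition p = F/A and the ideal-gas relation p = nRT/V illustrate both statements numerically.

```python
# Minimal sketch (not from the Viatran white paper): pressure as force per area,
# and the pressure-temperature relationship for a gas held at constant volume.

R = 8.314  # J/(mol*K), universal gas constant

def pressure_from_force(force_n: float, area_m2: float) -> float:
    """Pressure in pascals from a normal force (N) acting on an area (m^2)."""
    return force_n / area_m2

def ideal_gas_pressure(n_mol: float, temp_k: float, volume_m3: float) -> float:
    """Ideal-gas pressure in pascals for n moles at temperature T in a fixed volume."""
    return n_mol * R * temp_k / volume_m3

# 100 N spread over 0.01 m^2 gives 10 kPa
print(pressure_from_force(100.0, 0.01))        # 10000.0 Pa

# Heating 1 mol of gas in a fixed 0.025 m^3 chamber from 300 K to 400 K raises pressure
print(ideal_gas_pressure(1.0, 300.0, 0.025))   # ~99768 Pa
print(ideal_gas_pressure(1.0, 400.0, 0.025))   # ~133024 Pa
```

-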
Introduction to Measurements & Error Analysis
Introduction to Measurements & Error Analysis

The Uncertainty of Measurements

Some numerical statements are exact: Mary has 3 brothers, and 2 + 2 = 4. However, all measurements have some degree of uncertainty that may come from a variety of sources. The process of evaluating this uncertainty associated with a measurement result is often called uncertainty analysis or error analysis.

The complete statement of a measured value should include an estimate of the level of confidence associated with the value. Properly reporting an experimental result along with its uncertainty allows other people to make judgments about the quality of the experiment, and it facilitates meaningful comparisons with other similar values or a theoretical prediction. Without an uncertainty estimate, it is impossible to answer the basic scientific question: “Does my result agree with a theoretical prediction or results from other experiments?” This question is fundamental for deciding if a scientific hypothesis is confirmed or refuted.

When we make a measurement, we generally assume that some exact or true value exists based on how we define what is being measured. While we may never know this true value exactly, we attempt to find this ideal quantity to the best of our ability with the time and resources available. As we make measurements by different methods, or even when making multiple measurements using the same method, we may obtain slightly different results. So how do we report our findings for our best estimate of this elusive true value? The most common way to show the range of values that we believe includes the true value is:

measurement = best estimate ± uncertainty (units)

Let’s take an example.
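The excerpt ends just before the guide's own worked example, which is not reproduced here. Purely as an illustration of the reporting form above (not the source's example), the sketch below computes a best estimate ± uncertainty from repeated readings, taking the mean as the best estimate and the standard error of the mean as the uncertainty.

```python
# Illustrative sketch (not from the guide): report repeated measurements as
# best estimate +/- uncertainty, using the mean and the standard error of the mean.
import statistics

readings_mm = [45.1, 45.3, 44.9, 45.2, 45.0]     # hypothetical length readings in mm

best_estimate = statistics.mean(readings_mm)
std_dev = statistics.stdev(readings_mm)           # sample standard deviation
uncertainty = std_dev / len(readings_mm) ** 0.5   # standard error of the mean

print(f"measurement = {best_estimate:.2f} ± {uncertainty:.2f} mm")
# prints: measurement = 45.10 ± 0.07 mm
```

-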
An Atomic Physics Perspective on the New Kilogram Defined by Planck's Constant
An atomic physics perspective on the new kilogram defined by Planck’s constant
(Wolfgang Ketterle and Alan O. Jamison, MIT)
(Manuscript submitted to Physics Today)

On May 20, the kilogram will no longer be defined by the artefact in Paris, but through the definition of Planck’s constant h = 6.626 070 15 × 10⁻³⁴ kg m²/s. This is the result of advances in metrology: the best two measurements of h, the Watt balance and the silicon spheres, have now reached an accuracy similar to the mass drift of the ur-kilogram in Paris over 130 years. At this point, the General Conference on Weights and Measures decided to use the precisely measured numerical value of h as the definition of h, which then defines the unit of the kilogram.

But how can we now explain in simple terms what exactly one kilogram is? How do fixed numerical values of h, the speed of light c and the Cs hyperfine frequency νCs define the kilogram? In this article we give a simple conceptual picture of the new kilogram and relate it to the practical realizations of the kilogram.

A similar change occurred in 1983 for the definition of the meter, when the speed of light was defined to be 299 792 458 m/s. Since the second was the time required for 9 192 631 770 oscillations of hyperfine radiation from a cesium atom, defining the speed of light defined the meter as the distance travelled by light in 1/299 792 458 of a second, or equivalently, as 9 192 631 770/299 792 458 times the wavelength of the cesium hyperfine radiation.
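The arithmetic behind that last sentence can be checked directly from the two fixed constants quoted in the text. The short sketch below (not part of the article) verifies that 9 192 631 770/299 792 458 cesium hyperfine wavelengths add up to exactly one meter.

```python
# Sketch (not from the article): reconstruct the meter from the cesium hyperfine
# radiation, using the exact defined constants quoted in the text.

c = 299_792_458          # m/s, speed of light (exact by definition)
nu_cs = 9_192_631_770    # Hz, cesium-133 hyperfine frequency (exact by definition)

wavelength_cs = c / nu_cs            # ~0.0326 m, wavelength of the hyperfine radiation
meter = (nu_cs / c) * wavelength_cs  # wavelengths per meter times the wavelength

print(f"Cs hyperfine wavelength: {wavelength_cs:.6f} m")   # 0.032612 m
print(f"Reconstructed meter:     {meter:.15f} m")            # 1.000000000000000 m
```

-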
Calibration Methods – Nomenclature and Classification
CHAPTER 8
CALIBRATION METHODS – NOMENCLATURE AND CLASSIFICATION

Paweł Kościelniak
Institute of Analytical Chemistry, Faculty of Chemistry, Jagiellonian University, R. Ingardena 3, 30-060 Kraków, Poland

ABSTRACT

Reviewing the analytical literature, including academic textbooks, one can notice that in fact there is no precise and clear terminology dealing with analytical calibration. An especially great confusion exists in the nomenclature related to calibration methods: not only are different names used with reference to a given method, but they often fail to express the principles and the nature of the different methods properly (e.g. "the set of standard method" or "the internal standard method").

The problem mentioned above is of great importance. A lack of good terminology can be a source of misunderstandings and, consequently, can even be a reason for carrying out an analytical treatment against the rules. Finally, an aspect of a rather psychological nature is worth stressing, namely that an analyst is (or at least should be) especially sensitive to such terms as "order" and "purity", irrespective of what analytical area is considered.

INTRODUCTION

Reading the professional literature, one is bound to arrive at the conclusion that in analytical chemistry there is a lack of clearly defined, current nomenclature relating to the problems of analytical calibration. It is characteristic that, among other things, in spite of the inevitable necessity of carrying out calibration in instrumental analysis and the common usage of the term ‘analytical calibration’ itself, it is not defined even in texts on nomenclature problems in chemistry [1,2], or otherwise the definitions are not connected with analytical practice [3].
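The chapter itself goes on to classify calibration methods; this excerpt stops at the terminology discussion. Purely as an illustration of the most common approach it names (the "set of standards", or external standard, method), the hedged sketch below fits a straight calibration line to standards and interpolates an unknown. None of the numbers or names come from the chapter.

```python
# Hedged illustration (not from the chapter): external-standard ("set of standards")
# calibration - fit a straight line to standard concentrations vs. instrument
# response, then invert it to estimate the concentration of an unknown sample.

concentrations = [0.0, 1.0, 2.0, 4.0, 8.0]       # hypothetical standards, mg/L
responses =      [0.02, 0.21, 0.39, 0.80, 1.58]  # hypothetical instrument signal

n = len(concentrations)
mean_x = sum(concentrations) / n
mean_y = sum(responses) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(concentrations, responses)) \
        / sum((x - mean_x) ** 2 for x in concentrations)
intercept = mean_y - slope * mean_x

unknown_response = 0.55
estimated_conc = (unknown_response - intercept) / slope
print(f"slope = {slope:.4f}, intercept = {intercept:.4f}, unknown ≈ {estimated_conc:.2f} mg/L")
```

-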
Pressure Measuring Instruments
We measure it.

Pressure measuring instruments for gas and water installers: testo 312-2, testo 312-3, testo 312-4
www.testo.com

Pressure meters for gas and water fitters

testo 312-2
Use the testo 312-2 fine pressure measuring instrument to check flue gas draught, differential pressure in the combustion chamber compared with ambient pressure, or gas flow pressure with high resolution. Fine pressures with a resolution of 0.01 hPa can be measured in the range from 0 to 40 hPa. DVGW approval according to TRGI for pressure settings and pressure tests on a gas boiler.
• Switchable precision range with a high resolution
• Compensation of measurement fluctuations caused by temperature
• Alarm display when user-defined limit values are exceeded
• Clear display with time
testo 312-2, fine pressure measuring instrument up to 40/200 hPa, DVGW approval, incl. alarm display, battery and calibration protocol. Part no. 0632 0313

testo 312-3
The versatile pressure measuring instrument testo 312-3 supports load and gas-tightness tests on gas and water pipelines up to 6000 hPa (6 bar) quickly and reliably. Everything you need to inspect gas and water pipe installations: with the electronic pressure measuring instrument testo 312-3, pressure- and gas-tightness can be tested.
testo 312-3, versatile pressure meter up to 300/600 hPa, DVGW approval, incl. alarm display, battery and calibration protocol.

-
Surface Tension Measurement
David B. Thiessen et al., "Surface Tension Measurement." Copyright 2000 CRC Press LLC. <http://www.engnetbase.com>.

Surface Tension Measurement
David B. Thiessen, California Institute of Technology
Kin F. Man, California Institute of Technology

31.1 Mechanics of Fluid Surfaces
31.2 Standard Methods and Instrumentation: Capillary Rise Method • Wilhelmy Plate and du Noüy Ring Methods • Maximum Bubble Pressure Method • Pendant Drop and Sessile Drop Methods • Drop Weight or Volume Method • Spinning Drop Method
31.3 Specialized Methods: Dynamic Surface Tension • Surface Viscoelasticity • Measurements at Extremes of Temperature and Pressure • Interfacial Tension

The effect of surface tension is observed in many everyday situations. For example, a slowly leaking faucet drips because the force of surface tension allows the water to cling to it until a sufficient mass of water is accumulated to break free. Surface tension can cause a steel needle to “float” on the surface of water although its density is much higher than that of water.

The surface of a liquid can be thought of as having a skin that is under tension. A liquid droplet is somewhat analogous to a balloon filled with air. The elastic skin of the balloon contains the air inside at a slightly higher pressure than the surrounding air. The surface of a liquid droplet likewise contains the liquid in the droplet at a pressure that is slightly higher than ambient. A clean liquid surface, however, is not elastic like a rubber skin. The tension in a piece of rubber increases as it is stretched and will eventually rupture. A clean liquid surface can be expanded indefinitely without changing the surface tension.
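The chapter's equations are not part of this excerpt. To make the balloon analogy concrete, the sketch below uses the standard Young-Laplace relation for a spherical droplet, Δp = 2γ/R (a textbook result, not quoted from the chapter), with an assumed surface tension for clean water at room temperature.

```python
# Sketch using the standard Young-Laplace relation (not quoted from the chapter):
# the pressure inside a spherical droplet exceeds ambient by delta_p = 2*gamma/R.

GAMMA_WATER = 0.072   # N/m, approximate surface tension of clean water near 25 C

def laplace_overpressure(radius_m: float, gamma: float = GAMMA_WATER) -> float:
    """Excess pressure (Pa) inside a spherical droplet of the given radius."""
    return 2.0 * gamma / radius_m

for radius_mm in (1.0, 0.1, 0.01):
    dp = laplace_overpressure(radius_mm * 1e-3)
    print(f"R = {radius_mm:>5.2f} mm -> excess pressure ≈ {dp:8.1f} Pa")
# The smaller the droplet, the larger the excess pressure, consistent with the
# text's point that the droplet's "skin" holds the liquid at above-ambient pressure.
```

-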
Chapter 5 Dimensional Analysis and Similarity
Chapter 5 Dimensional Analysis and Similarity

Motivation. In this chapter we discuss the planning, presentation, and interpretation of experimental data. We shall try to convince you that such data are best presented in dimensionless form. Experiments which might result in tables of output, or even multiple volumes of tables, might be reduced to a single set of curves—or even a single curve—when suitably nondimensionalized. The technique for doing this is dimensional analysis.

Chapter 3 presented gross control-volume balances of mass, momentum, and energy which led to estimates of global parameters: mass flow, force, torque, total heat transfer. Chapter 4 presented infinitesimal balances which led to the basic partial differential equations of fluid flow and some particular solutions. These two chapters covered analytical techniques, which are limited to fairly simple geometries and well-defined boundary conditions. Probably one-third of fluid-flow problems can be attacked in this analytical or theoretical manner.

The other two-thirds of all fluid problems are too complex, both geometrically and physically, to be solved analytically. They must be tested by experiment. Their behavior is reported as experimental data. Such data are much more useful if they are expressed in compact, economic form. Graphs are especially useful, since tabulated data cannot be absorbed, nor can the trends and rates of change be observed, by most engineering eyes. These are the motivations for dimensional analysis. The technique is traditional in fluid mechanics and is useful in all engineering and physical sciences, with notable uses also seen in the biological and social sciences.
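The chapter's worked examples are not included in this excerpt. As a hedged illustration of what "suitably nondimensionalized" means in practice, the sketch below collapses different (fluid, velocity, length) combinations onto a single dimensionless group, the Reynolds number Re = ρVL/μ, the classic product of dimensional analysis in fluid mechanics (standard material, not quoted from this page).

```python
# Hedged sketch (standard fluid-mechanics material, not quoted from the chapter):
# many dimensional combinations of density, velocity, length, and viscosity
# collapse onto one dimensionless group, the Reynolds number Re = rho*V*L/mu.

def reynolds(rho: float, velocity: float, length: float, mu: float) -> float:
    """Dimensionless Reynolds number from SI inputs (kg/m^3, m/s, m, Pa*s)."""
    return rho * velocity * length / mu

cases = [
    ("water, 1 m/s, 0.05 m pipe", 998.0, 1.0, 0.05, 1.0e-3),
    ("air, 15 m/s, 3.3 m wing",   1.20, 15.0, 3.3, 1.8e-5),
]
for label, rho, v, length, mu in cases:
    print(f"{label}: Re ≈ {reynolds(rho, v, length, mu):,.0f}")
# Experiments with very different dimensional inputs but the same Re plot onto
# the same dimensionless curve - the point the chapter is motivating.
```

-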
Software Testing
Software Testing

PURPOSE OF TESTING

CONTENTS
I. Software Testing Background
II. Software Error Case Studies
1. Disney Lion King
2. Intel Pentium Floating Point Division Bug
3. NASA Mars Polar Lander
4. Patriot Missile Defense System
5. Y2K Bug
III. What is a Bug?
1. Terms for Software Failure
2. Software Bug: A Formal Definition
3. Why do Bugs Occur? And the Cost of a Bug
4. What exactly does a Software Tester do?
5. What makes a good Software Tester?
IV. Software Development Process
1. Product Components
2. What Effort Goes into a Software Product?
3. What Parts Make up a Software Product?
4. Software Project Staff
V. Software Development Lifecycle Models
1. Big Bang Model
2. Code and Fix Model
3. Waterfall Model
4. Spiral Model
VI. The Realities of Software Testing
VII. Software Testing Terms and Definitions
1. Precision and Accuracy
2. Verification and Validation
3. Quality Assurance and Quality Control

I. Software Testing Background
1. Software is a set of instructions to perform some task.
2. Software is used in many applications of the real world.
3. Some examples are application software such as word processors, firmware in an embedded system, middleware which controls and coordinates distributed systems, system software such as operating systems, video games, and websites.
4. All of these applications need to run without any error and provide a quality service to the user of the application.
5. The software has to be tested for its accurate and correct working.

Software Testing: Testing can be defined in simple words as "performing verification and validation of the software product" for its correctness and accuracy of working.
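To make the verification idea concrete, here is a small illustrative test (not from this text): it checks a division helper against known results, in the spirit of the Pentium floating-point division bug listed among the case studies above. The function and values are hypothetical.

```python
# Illustrative only (not from the text): a minimal automated test that verifies
# a division helper against expected results - the kind of check that catches
# defects like the Pentium FDIV bug before release.
import unittest

def divide(numerator: float, denominator: float) -> float:
    if denominator == 0:
        raise ZeroDivisionError("denominator must be non-zero")
    return numerator / denominator

class DivideTests(unittest.TestCase):
    def test_known_quotients(self):
        # 4195835 / 3145727 is the classic operand pair that exposed the FDIV defect
        self.assertAlmostEqual(divide(4195835.0, 3145727.0), 1.333820449, places=6)
        self.assertEqual(divide(10.0, 4.0), 2.5)

    def test_zero_denominator_is_rejected(self):
        with self.assertRaises(ZeroDivisionError):
            divide(1.0, 0.0)

if __name__ == "__main__":
    unittest.main()
```

-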
A Measuring Instrument for Multipoint Soil Temperature Underground
A MEASURING INSTRUMENT FOR MULTIPOINT SOIL TEMPERATURE UNDERGROUND

Cheng Wang, Chunjiang Zhao*, Xiaojun Qiao, Zhilong Xu
National Engineering Research Center for Information Technology in Agriculture, Beijing, P. R. China, 100097
* Corresponding author. Address: Shuguang Huayuan Middle Road 11#, Beijing, 100097, P. R. China, Tel: +86-10-51503411, Fax: +86-10-51503449, Email: [email protected]

Abstract: A new instrument was designed to measure soil temperature at 10 points from 0 to 50 centimeters underground. The system was based on Silicon Laboratories’ MCU C8051F310, the single-chip digital temperature sensor DS18B20, and other peripheral circuits. It could simultaneously measure, store, and display readings, and also convey data to a computer via a standard RS232 interface.

Keywords: Multi-point Soil Temperature; Portable; DS18B20; C8051F310

1. INTRODUCTION

The temperature of soil is a vital environmental factor, which directly influences the activity of microorganisms and the decomposition of organic substances. It affects how roots absorb water and mineral elements, and it plays an important role in the growth rate and range of roots. Statistically, the roots of most plants lie within 50 centimeters of the surface, so it is very significant to measure the soil temperature at different depths within this layer.

The soil temperature measuring instruments used nowadays mainly fall into three types. The first type measures temperature by making use of the relationship between soil temperature and the resistance of a temperature-sensitive resistor. Before using this sort of instrument, the system parameters need to

Wang, C., Zhao, C., Qiao, X. and Xu, Z., 2008, in IFIP International Federation for Information Processing, Volume 259; Computer and Computing Technologies in Agriculture, Vol.
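The paper's serial data format is not given in this excerpt. Purely as a hypothetical host-side sketch, the code below shows how a PC might log the 10 temperature readings over the RS232 link using the pyserial package, assuming the instrument emits one comma-separated line per scan and one sensor every 5 cm of depth (both assumptions, not the paper's specification).

```python
# Hypothetical host-side logger (not from the paper): read one comma-separated
# line of 10 soil temperatures per scan from the instrument's RS232 port.
# Assumes pyserial is installed and an assumed line format like "23.1,22.8,...".
import serial  # pip install pyserial

PORT = "/dev/ttyUSB0"   # assumed serial port name
BAUD = 9600             # assumed baud rate; the excerpt does not state one

with serial.Serial(PORT, BAUD, timeout=2) as ser:
    line = ser.readline().decode("ascii", errors="ignore").strip()
    if line:
        temps = [float(v) for v in line.split(",")]
        # assumed sensor spacing: one reading every 5 cm from 5 cm to 50 cm depth
        for depth_cm, t in zip(range(5, 55, 5), temps):
            print(f"{depth_cm:>2} cm: {t:.1f} °C")
```

-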
Made to Measure. Practical Guide to Electrical Measurements in Low Voltage Switchboards
Made to measure. Practical guide to electrical measurements in low voltage switchboards

[Cover and back-cover figures: voltage and current waveforms plotted against time, including a 50 Hz trace; no further information is recoverable.]

Contact us
ABB SACE, Una divisione di ABB S.p.A., Apparecchi Modulari
Viale dell’Industria, 18, 20010 Vittuone (MI)
Tel.: 02 9034 1, Fax: 02 9034 7609
bol.it.abb.com, www.abb.com
The data and illustrations are not binding. We reserve the right to modify the contents of this document on the basis of technical development of the products, without prior notice. Copyright 2010 ABB. All rights reserved. 2CSC445012D0201 - 12/2010

Table of contents
1 Electrical measurements
1.1 Why is it important to measure? (p. 3)
1.2 Applicational contexts (p. 4)
1.3 Problems connected with energy networks (p. 4)
1.4 Reducing consumption (p. 7)
1.5 Table of charges (p. 8)
5.3.2 Current transformers (p. 37)
5.3.3 Voltage transformers (p. 38)
5.3.4 Shunts for direct current (p. 38)
6 The measurements
6.1 TRMS Measurements

-
Guide for the Use of the International System of Units (SI)
Guide for the Use of the International System of Units (SI)

NIST Special Publication 811, 2008 Edition
Ambler Thompson, Technology Services, and Barry N. Taylor, Physics Laboratory
National Institute of Standards and Technology, Gaithersburg, MD 20899
(Supersedes NIST Special Publication 811, 1995 Edition, April 1995)
March 2008

U.S. Department of Commerce, Carlos M. Gutierrez, Secretary
National Institute of Standards and Technology, James M. Turner, Acting Director

Natl. Inst. Stand. Technol. Spec. Publ. 811, 2008 Ed., 85 pages (March 2008; 2nd printing November 2008). CODEN: NSPUE3
Note on 2nd printing: This 2nd printing dated November 2008 of NIST SP 811 corrects a number of minor typographical errors present in the 1st printing dated March 2008.

Preface

The International System of Units, universally abbreviated SI (from the French Le Système International d’Unités), is the modern metric system of measurement. Long the dominant measurement system used in science, the SI is becoming the dominant measurement system used in international commerce. The Omnibus Trade and Competitiveness Act of August 1988 [Public Law (PL) 100-418] changed the name of the National Bureau of Standards (NBS) to the National Institute of Standards and Technology (NIST) and gave to NIST the added task of helping U.S.