The Progression of Technology Education in the U.S.A.

Pacific Rim International Conference on Technology Education
Technology Education Institute of Nanjing Normal University
October 16-18, 2013

William E. Dugger, Jr.
Senior Fellow, International Technology and Engineering Educators Association
Emeritus Professor of Technology Education, Virginia Tech

Technology is as old as the human race, and it is the foundation for how we as humans change, alter, or modify our natural world to satisfy many of our needs and wants (ITEA/ITEEA, 2000/2002/2007). Technology's roots as a discipline reach back more than two and a third million years, when our ancestors created stone tools that were used to kill and butcher animals for food and clothing (Leakey, 2012). For the first time in human prehistory, there is evidence that the toolmakers had a mental template of what they wanted to produce: they were intentionally imposing its shape on the raw material they used. The implement that suggests this is the so-called hand-axe, a teardrop-shaped tool that required remarkable skill and patience to make. During this formative period, humans also developed the abilities that led them to become right-handed or left-handed. In working with their hands, prehistoric humans left historians archeological artifacts that document the evolution of technology across periods of time that we now call the Stone Age, the Copper and Bronze Ages, and the Iron Age.

With the development of the alphabet (around the 27th century BC) and the base 10 numbering system (around 3000 BC), humans created the basis for languages and mathematics. Across the millennia, thousands of inventions and innovations stand out as significant to the development of the human race. Some of these were: the creation and manipulation of fire, the plow (about 8000 BC), the wheel (about 4000 BC), the abacus (2nd century BC), the clock (approximately 4000 BC), the bow and arrow (about 16,000 BC) and the crossbow (approximately 6th century BC), the compass (between the 2nd century BC and 1st century AD), gunpowder (9th century AD), papermaking (105 AD), and the moveable type printing press (around 1040 AD in China and about 1440 AD in Germany).

In the latter half of the last millennium, humans in many parts of the world refined agriculture to a point where food and fiber became relatively plentiful. During the Renaissance (14th to 17th centuries), a cultural rebirth took place and many new technological ideas and innovations emerged (the mass production of books, da Vinci's flying machine, and the establishment of the laws of linear perspective by Brunelleschi, to mention a few). During that same time the Age of Exploration was under way. Explorers such as Zheng He, Columbus, Cook, da Gama, Magellan, and others traversed the world in hopes of finding new lands, treasures, and cultures.

The Industrial Revolution began in England in the mid-18th century and was fueled by coal mining. The invention of the steam engine allowed steamboats and locomotives to transport people and goods more quickly. By the mid-19th century the Industrial Revolution had spread to North America and Continental Europe, and since then it has spread to most of the rest of the world. The Industrial Revolution is marked by mass production, broadcasting, the rise of the nation state, electric power, modern medicine, and running water.
While the burning of fossil fuels has contributed to global warming, and economic growth to ecological damage, the quality of human life has increased dramatically. Life expectancy worldwide today is more than twice what it was when the Industrial Revolution began.

In the 20th and 21st centuries, technology has escalated at an exponential rate. Some would refer to this period as the Information Age. While information or digital technologies hold a significant place in the overall spectrum of technology, they are not the totality of technology. Advances in medical technologies, agricultural and biotechnologies, energy and power technologies, transportation technologies, information and communication technologies, manufacturing technologies, and construction technologies have all been responsible for the technological world that we live in today.

The development of technology has helped satisfy our basic needs and wants. A human need, or the object of a human need, is something people must have in order to live a good life. A want, on the other hand, is something one desires to have, whether or not one needs it. These basic human needs and wants drive technology to help us improve our health; grow and process food and fiber better; harness and use energy more efficiently; communicate more effectively; process data faster and more accurately; move people and things more easily; make products that enhance our lives; and build structures that provide shelter and comfort (Dugger, 2011).

Comparing this description with some related disciplines: science is the study of the natural world; mathematics deals with patterns and relationships; and engineering is design under constraint. In simple terms, science deals with "what is" in the natural world, while technology deals with "what can be" in the invented and innovated world. We all need to know that technology is our content (what we teach), while technology education is the school subject that teaches about technology (to whom, where, when, why, and how we teach).

Unfortunately, a majority of people in the United States misunderstand technology today, as documented by two International Technology Education Association (ITEA) Gallup Polls conducted in 2001 and 2004 (Rose et al., 2001, 2004). In both polls, about two-thirds of those surveyed in the United States (68% in 2004 and 67% in 2001) answered the open-ended question "What is technology?" with the very narrow response that technology is "computers" or the "Internet." Furthermore, a majority (62% in 2004 and 59% in 2001) of respondents stated that technology was the same as science. It was encouraging, however, to find that almost all respondents (98% in 2004 and 97% in 2001) thought that technology should be included in the school curriculum as an area of study.

In the U.S., education is primarily the responsibility of state and local governments. The U.S. Department of Education has limited power and responsibility concerning education at the state or local level. Accordingly, there is no "national curriculum" for any subject, even though most subjects now have nationally developed standards for what each student must know and be able to do in order to be literate in that subject.
Many of these standards were developed by educational associations, like the International Technology and Engineering Educators Association, or by national agencies such as the National Research Council (NRC).

The progression of the study of technology as a formal school course in the United States began in the last half of the 1800s as "Manual Arts Education." This school course drew its philosophical foundation primarily from the "Educational Sloyd" system pioneered by Uno Cygnaeus in Finland and Otto Salomon in Sweden (Cygnaeus, 2010). The passage of the Morrill Act in the U.S. in 1862 established colleges and institutions in each state to educate people in agriculture, home economics, mechanical arts, and other professions. The term "industrial arts" was coined by Charles Richards in 1904 as the official name for our school subject. Then in 1923, Bonser and Mossman, in their book Industrial Arts in the Elementary School, defined industrial arts as "a study of the changes made by man in the forms of materials to increase their values, and of the problems of life related to these changes" (Bonser & Mossman, 1923, p. 5). This definition served well over the early years, until "technology" became the predominant term used in our teaching and writings.

The American Industrial Arts Association (AIAA) was founded in 1939 by Dr. William E. Warner, a professor at The Ohio State University. AIAA held its first national conference in 1947 in Columbus, Ohio, with the theme "A Curriculum to Reflect Technology."

In the 1960s and 1970s, there were a number of federal- and state-funded curriculum projects in industrial arts in the U.S. The most prominent was the Industrial Arts Curriculum Project (IACP) at The Ohio State University, directed by Drs. Donald Lux and Willis Ray, which from 1965 into the 1970s developed The World of Manufacturing and The World of Construction. The American Industry Project, directed by Professors Face and Flug at Stout State University in Wisconsin from 1966 to 1971, was also significant in that it provided the first true study of industry to students. Another curriculum project having a major impact on the teaching of industrial arts was The Maryland Plan, under the leadership of Dr. Donald Maley at the University of Maryland. Maley's work is significant in that it offered a curricular alternative to those industrial arts educators who were searching for a more student-centered approach to instruction (Herschbach, 1997).

In the late 1970s and the 1980s, the industrial arts profession in the United States slowly moved away from teaching the individual skills of industry (woodworking, metalworking, electricity, engineering and architectural drawing). Then, in the late 1980s and 1990s, industrial arts moved toward teaching larger clusters of technological content such as manufacturing, construction, energy and power, transportation, and communication. Today, the study of technology in the U.S. is an elective area in most states and localities, with approximately 150,000 students and about 28,000 teachers.