S. Turcott
November 23, 2020

Metallurgy During the Iron Age and its Legacy upon Modern Metallurgy

Figure 1: Photograph of an axe head found at the bottom of a lake. Its analysis began the journey of exploring the metallurgy of the past.

INTRODUCTION

No profession has been more affected by the Iron Age than that of metallurgy. Yet with decades of education and experience as a metallurgical engineer, never had its history been mentioned or made relevant to me - until one day I was handed an old axe head that began a fascinating journey into a realm I never appreciated existed (Figure 1). Now I realize that the metallurgical practices of the Iron Age literally forged the world of modern metallurgy.

Although this axe was probably made in the 1700s or early 1800s, it incorporated concepts that defined metallurgy in Europe for thousands of years. There are Roman, Viking and Medieval artifacts that exhibit comparable metallurgical features. This is because several of the core principles of the Iron Age remained the same until the invention of the Bessemer process in 1851, which began modern steelmaking.

As a disclaimer, a few months ago I had never even heard of the term "archaeometallurgy" - the metallurgical study of historical artifacts. As an amateur on the topic, I have written this to share concepts of interest from the books, articles and conversations experienced since. I will use this axe head as the foundation to demonstrate and weave through some of the core ideas of the Iron Age. I apologize in advance to the real experts if I oversimplify or overgeneralize. Yet I think some of the ideas introduced through the study of historical metallurgy will interest many.

IRON – THE BASICS

Archaeometallurgy uses the term "iron" very differently than we do in modern industry. Whereas the "iron" of today's industry (cast iron) contains 2-4wt% carbon, much of it as graphite, in archaeology "iron" means wrought iron: essentially iron with no carbon. Think of it as a low carbon steel with less than 0.1wt% carbon - soft, malleable and not very useful for cutting edges such as weapons or axes. Yet it still had many practical applications such as nails, hinges, etc.

For example, the wedge used to hold the axe head onto its wooden handle was made of wrought iron. Under a microscope, it can be seen to have an entirely ferritic structure with no notable carbon/carbides (Figure 2). You can also see the high density of slag inclusions, which identify it as having been made by reduction without the metal ever being entirely melted. This matched either the bloomery furnaces used until around the 1400s or the older-style blast furnaces used afterwards, both of which failed to fully separate the slag from the final iron product. Because of their smaller size, these furnaces could reduce ore into iron but did not have enough time and temperature to fully melt the metal or soak in carbon the way modern, massive blast furnaces do.

[Figure 2 panels: optical micrographs (50-1000x magnification) of the wedge structure at 100x and 400x; labels mark ferrite (white) and slag inclusions.]

Figure 2: Micrographs displaying the microstructure of the wrought iron wedge formerly used to fasten the axe head onto its wooden handle. Its predominantly ferritic structure with slag inclusions was classic for iron throughout the majority of the Iron Age. Incomplete melting during reduction in bloomery furnaces or smaller/older blast furnaces left in the slag inclusions. Etched using 3% nital.
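As a quick recap of the terminology, the rough carbon ranges quoted in this article can be summarized in a few lines of Python. This little sketch is my own addition, not part of the original analysis; the cutoffs are only the approximate figures given in the text (the 0.3-1.5wt% "steel" range is the Iron Age weapon/tool range discussed later in the STEEL section), not formal definitions.

```python
# Toy lookup mapping approximate carbon contents onto the names used in this
# article. The ranges are the rough figures quoted in the text, nothing more.
def ferrous_term(wt_pct_carbon: float) -> str:
    if wt_pct_carbon < 0.1:
        return "wrought iron (the 'iron' of archaeometallurgy)"
    if 0.3 <= wt_pct_carbon <= 1.5:
        return "steel (Iron Age weapon/tool grade)"
    if 2.0 <= wt_pct_carbon <= 4.0:
        return "cast iron (the 'iron' of modern industry)"
    return "between the rough ranges used in this article"

print(ferrous_term(0.05))  # wrought iron
print(ferrous_term(0.8))   # steel
print(ferrous_term(3.0))   # cast iron
```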
One of the interesting things about iron is that it remained essentially the same for thousands of years. Figure 3 compares the microstructure and slag inclusions within our axe wedge (1700-1851AD) with a Roman nail estimated to be 2000 years old (0-100AD). Other than luck associated with the quality of ore used, my understanding is that there really were no large improvements in the quality of European iron over thousands of years until 1851.

[Figure 3 panels: (a) cross-sectioned axe wedge, 1750-1851AD; (b) Roman nail, ~100AD; (c) wedge core at 100x; (d) nail core at 100x; labels mark ferrite (white) and slag inclusions.]

Figure 3: Micrographs comparing the structures of the (a,c) axe wedge produced sometime between 1750-1851AD and (b,d) a Roman nail from before 100AD. Because the structure of wrought iron had not changed for thousands of years, their ferritic structures with slag inclusions appear similar to one another. Only subtle differences in the inclusions gave hints of being produced from different ore sources. Etched using 3% nital.

To be honest, I bought this nail on eBay from a seller who claimed it was Roman from the 1st century, and I cannot verify its age. Yet the point remains the same: from a metallurgical perspective, iron had not changed for thousands of years.

Believe it or not, iron existed in reasonable abundance during the Bronze Age. Yet the problem with iron was that it was soft, weak, pliable and not very useful for cutting edges such as weapons or axes. Cold worked, tin-alloyed bronze could be made into swords and tools with hardness values between 200 and 220 HV (archaeometallurgy uses Vickers hardness for everything). Yet bloomery iron was typically between 80 and 130 HV - this axe head wedge had a hardness of 130 HV. So iron was not a militarily strategic metal until the invention of steel.

STEEL – ADVANTAGES, LIMITATIONS AND DESIGN

It's "ironic" (ha!) that the Iron Age actually began when steel was invented, long after iron had come into use. Yet the discovery of steel changed everything and quickly spread throughout Europe between 800BC and 600BC. Steel, strengthened by 0.3 to 1.5wt% carbon, provided the opportunity for weapons and tools to surpass the usefulness of bronze. Plus, iron ore was more available than the tin needed for weapon-grade bronze and, overall, steel was cheaper to produce. It quickly became the dominant metal. Established countries and empires whose militaries were built upon bronze weaponry quickly fell to once lesser competitors armed with steel if they did not adapt fast enough.

Throughout the Iron Age, there were two factors that greatly affected (and limited) how steel was used:

1) Steel was pricey - usually 3-5 times more expensive than iron.
2) The carbon content could not be controlled or measured (it ranged between 0.3wt% and 1.5wt% carbon).

There were several different ways to make steel. Most of the methods involved carburizing iron to increase its carbon content. The higher the carbon content, generally the better the steel performed. Yet the additional processing meant that steel was inherently much more expensive than iron, and carburizing limited the size of the steel pieces that could be produced.
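To get a feel for why carburizing limited the size of solid steel pieces, here is a rough back-of-the-envelope sketch. It is my own illustration, not from any referenced source: it uses simple Fick's-law scaling with approximate textbook values for carbon diffusing in austenite, and the furnace temperature and holding time are assumed purely for the sake of the example.

```python
import math

# Rough estimate of how deep carbon can diffuse into hot iron (carburizing).
# Approximate textbook values for carbon diffusing in austenite (gamma-iron):
D0 = 2.3e-5   # pre-exponential factor, m^2/s
Q = 148e3     # activation energy, J/mol
R = 8.314     # gas constant, J/(mol*K)

def case_depth_mm(temp_c: float, hours: float) -> float:
    """Characteristic diffusion depth sqrt(D*t), in millimetres."""
    T = temp_c + 273.15
    D = D0 * math.exp(-Q / (R * T))   # diffusivity at temperature T
    t = hours * 3600.0                # seconds
    return math.sqrt(D * t) * 1000.0  # metres -> millimetres

# An assumed charcoal fire held around 950 C for a full day only carburizes
# on the order of a millimetre below the surface:
print(f"{case_depth_mm(950, 24):.2f} mm")  # roughly 1 mm
```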
To keep costs down and to work with the small pieces of steel, the majority of tools and weapons were made of steel and iron fused together. Steel would be used selectively in areas where its properties were most needed, while the remainder would be iron. The quality of a part depended on the quality of the steel, the craftsmanship, and the design used by the blacksmith in its construction.

This axe head is a perfect example. The majority of its body was iron, yet a high carbon steel strip had been fused onto its cutting edge (Figure 4). The steel had a hardness of 427 HV (42 HRC), providing a wear resistant, hard cutting edge. Yet the design used the minimum amount of steel to keep the costs down.

[Figure 4 panels: photographs and a macrograph of the cross-sectioned axe head; labels mark the iron body (soft yet affordable) and the steel fused onto the cutting edge (hard, wear resistant but expensive).]

Figure 4: Photographs and macrograph displaying a cross-section taken of the axe head. The axe body was composed of soft iron. The blacksmith had fused steel onto the iron body at the cutting edge. This was a smart design that provided a hard, wear resistant steel that would hold a sharpened edge yet was also cost-effective. Etched using 3% nital.

Although this axe had been made in the 18th or 19th century, the concepts used in its design spanned thousands of years. Axes excavated around Europe and dating to different time periods used variants of this steel-iron design (Figure 5). Blacksmiths would selectively use steel only at the most critical portions of the tool. Labour and the availability of iron seemed less restrictive.

[Figure 5 panels: schematic axe cross-sections - a) 3rd century BC, Ireland; b) 5th-6th century AD, Ireland; c) 5th-6th century AD, Baltics (Vikings!!); labels mark the iron body and the steel insert or steel fused at the cutting edge.]

Figure 5: Schematics displaying different axe designs used in different places/time periods. The blacksmiths used different designs to minimize the use of steel, iron and skill.1,2

Taking an even closer look at our axe's cutting edge, Figure 6 displays optical micrographs of its steel. You will notice that its structure consists entirely of unresolvable, fine pearlite with carbides/cementite. Although this steel was hypereutectoid, having over 0.8wt% carbon and thus readily hardenable, its structure showed that it had not been quenched.
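As an aside on what "pearlite with carbides/cementite" implies, a standard lever-rule calculation on the iron-carbon diagram gives the expected phase balance for a slowly cooled hypereutectoid steel. The sketch below is only illustrative: the 1.0wt% carbon input is an assumed example (the actual carbon content of this edge was not measured), while the eutectoid (~0.76wt% C) and cementite (6.70wt% C) points are the usual textbook values.

```python
# Lever-rule estimate of phase fractions for a slowly cooled hypereutectoid
# steel, just below the eutectoid temperature (~727 C).
C_EUTECTOID = 0.76  # wt% C of the eutectoid (pearlite-forming) composition
C_CEMENTITE = 6.70  # wt% C of cementite (Fe3C)

def pearlite_and_cementite_fractions(c0: float) -> tuple:
    """Mass fractions of pearlite and proeutectoid cementite for a steel of
    c0 wt% C, valid for C_EUTECTOID < c0 < C_CEMENTITE."""
    w_pearlite = (C_CEMENTITE - c0) / (C_CEMENTITE - C_EUTECTOID)
    return w_pearlite, 1.0 - w_pearlite

# Assumed example composition of 1.0 wt% C:
p, cm = pearlite_and_cementite_fractions(1.0)
print(f"~{p:.0%} pearlite, ~{cm:.0%} proeutectoid cementite")  # ~96% / ~4%
```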