Quality Control, Robust Design, and the Taguchi Method
The Wadsworth & Brooks/Cole Statistics/Probability Series
Quality Control, Robust Design, and the Taguchi Method

The Wadsworth & Brooks/Cole Statistics/Probability Series

Series Editors
O. E. Barndorff-Nielsen, Aarhus University
Peter J. Bickel, University of California, Berkeley
William S. Cleveland, AT&T Bell Laboratories
Richard M. Dudley, Massachusetts Institute of Technology

R. Becker, J. Chambers, A. Wilks, The New S Language: A Programming Environment for Data Analysis and Graphics
P. Bickel, K. Doksum, J. Hodges, Jr., A Festschrift for Erich L. Lehmann
G. Box, The Collected Works of George E. P. Box, Volumes I and II, G. Tiao, editor-in-chief
L. Breiman, J. Friedman, R. Olshen, C. Stone, Classification and Regression Trees
J. Chambers, W. S. Cleveland, B. Kleiner, P. Tukey, Graphical Methods for Data Analysis
W. S. Cleveland, M. McGill, Dynamic Graphics for Statistics
K. Dehnad, Quality Control, Robust Design, and the Taguchi Method
R. Durrett, Lecture Notes on Particle Systems and Percolation
F. Graybill, Matrices with Applications in Statistics, Second Edition
L. Le Cam, R. Olshen, Proceedings of the Berkeley Conference in Honor of Jerzy Neyman and Jack Kiefer, Volumes I and II
P. Lewis, E. Orav, Simulation Methodology for Statisticians, Operations Analysts, and Engineers
H. J. Newton, TIMESLAB
J. Rawlings, Applied Regression Analysis: A Research Tool
J. Rice, Mathematical Statistics and Data Analysis
J. Romano, A. Siegel, Counterexamples in Probability and Statistics
J. Tanur, F. Mosteller, W. Kruskal, E. Lehmann, R. Link, R. Pieters, G. Rising, Statistics: A Guide to the Unknown, Third Edition
J. Tukey, The Collected Works of J. W. Tukey, W. S. Cleveland, editor-in-chief
    Volume I: Time Series: 1949-1964, edited by D. Brillinger
    Volume II: Time Series: 1965-1984, edited by D. Brillinger
    Volume III: Philosophy and Principles of Data Analysis: 1949-1964, edited by L. Jones
    Volume IV: Philosophy and Principles of Data Analysis: 1965-1986, edited by L. Jones
    Volume V: Graphics: 1965-1985, edited by W. S.
Cleveland

Quality Control, Robust Design, and the Taguchi Method

Edited by Khosrow Dehnad
AT&T Bell Laboratories

Wadsworth & Brooks/Cole Advanced Books & Software
Pacific Grove, California

© Copyright 1989 by AT&T. Softcover reprint of the hardcover 1st edition 1989.

All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transcribed, in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise) without the prior written permission of the publisher, Wadsworth & Brooks/Cole Advanced Books & Software, Pacific Grove, California 93950, a division of Wadsworth, Inc.

10 9 8 7 6 5 4 3 2

Library of Congress Cataloging-in-Publication Data
Quality control, robust design, and the Taguchi method.
1. Quality control. 2. Engineering design. 3. Experimental design. I. Dehnad, Khosrow, [date]
TS156.Q3625 1988 658.5'62 88-26220
ISBN-13: 978-1-4684-1474-5
e-ISBN-13: 978-1-4684-1472-1
DOI: 10.1007/978-1-4684-1472-1

Sponsoring Editor: John Kimmel
Editorial Assistant: Maria Tarantino
Production Editor: Nancy Shammas
Permissions Editor: Carline Haga
Cover Design: Vernon T. Boes
Cover Printing: New England Book Components, Inc., Hingham, Massachusetts
Printing and Binding: Arcata Graphics/Fairfield, Fairfield, Pennsylvania

The following institutions and organizations have kindly granted reprint permissions for articles from their publications: American Society for Quality Control, American Statistical Association, American Telephone and Telegraph Company, Cahners Publishing Company, and The Institute of Electrical and Electronics Engineers.

UNIX is a registered trademark of AT&T. VAX is a trademark of Digital Equipment Corporation.

FOREWORD

In 1980, I received a grant from Aoyama-gakuin University to come to the United States to help American industry improve the quality of its products.
In a small way this was to repay the help the US had given Japan after the war. In the summer of 1980, I visited the AT&T Bell Laboratories Quality Assurance Center, the organization that founded modern quality control. The result of my first summer at AT&T was an experiment with an orthogonal array design of size 18 (OA18) for optimization of an LSI fabrication process. The "signal-to-noise" ratio was the measure of quality to be optimized. Since then, this experimental approach has been named "robust design" and has attracted the attention of both engineers and statisticians.

My colleagues at Bell Laboratories have written several expository articles and a few theoretical papers on robust design from the viewpoint of statistics. Because so many people have asked for copies of these papers, it has been decided to publish them in book form. This anthology is the result of these efforts.

Although quality engineering borrows some technical words from traditional design of experiments, its goals are different from those of statistics. For example, suppose there are two vendors. One vendor supplies products whose quality characteristic has a normal distribution with the mean on target (the desired value) and a certain standard deviation. The other supplies products whose quality characteristic has a uniform distribution with the same mean and standard deviation. The distributions are different from a statistical viewpoint, but the products are the same from a quality engineering viewpoint. The reason is that quality engineering measures quality levels by the mean square deviation from the nominal or ideal value and relates it to the signal-to-noise ratio. The philosophy of quality engineering in using experiments with orthogonal array designs is quite different from that of traditional statistical experimental design.
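The two-vendor comparison can be checked numerically. The sketch below is an editorial illustration, not from the book: the target value, standard deviation, and sample size are invented for the demonstration. It simulates a normal and a uniform quality characteristic with the same mean and standard deviation and shows that both yield the same mean square deviation from target, the quantity on which the signal-to-noise ratio is based.

```python
import math
import random

random.seed(0)

TARGET = 10.0   # nominal (ideal) value, chosen for illustration
SD = 0.5        # common standard deviation for both vendors
N = 100_000     # sample size per vendor

def msd(samples, target):
    """Mean square deviation from the target value."""
    return sum((x - target) ** 2 for x in samples) / len(samples)

# Vendor A: normal distribution centered on the target.
a = [random.gauss(TARGET, SD) for _ in range(N)]

# Vendor B: uniform distribution with the same mean and standard deviation.
# A uniform on [m - s*sqrt(3), m + s*sqrt(3)] has mean m and std dev s.
half_width = SD * math.sqrt(3)
b = [random.uniform(TARGET - half_width, TARGET + half_width) for _ in range(N)]

print(f"MSD, normal vendor:  {msd(a, TARGET):.4f}")
print(f"MSD, uniform vendor: {msd(b, TARGET):.4f}")
# Both estimates are close to SD**2 = 0.25: identical quality levels by
# the mean-square-deviation measure, though the distributions differ.
```

Because both vendors are on target, the mean square deviation reduces to the variance in each case, which is why the two statistically different distributions score the same.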
Quality engineering uses orthogonal arrays to evaluate new designs for downstream manufacturing. It estimates only the main effects. When only the main effects are estimated using orthogonal arrays, these effects may be confounded with many interactions. When there are no interactions and, hence, no confounding, using other factorial designs is inefficient. Conversely, when many large interactions exist, the design is poor, even useless, because the optimum settings of the design parameters are different downstream.

We should assume that design parameters interact with various conditions: laboratory, downstream, large-scale manufacturing, and customer use. The most efficient way to find the degree of interaction is to conduct the experiments within the laboratory and then assume that the optimum settings of the design parameters in the lab are the same for downstream conditions. To determine whether large interactions exist among product or process design parameters, we compare the predicted output with the outcome of the experiments. When there are large interactions and the predicted value differs from the outcome, we can prevent poor designs from going downstream. When the predicted output and the experiments' outcome agree, we can be confident that the settings are optimum, except for possible adjustments. Orthogonal array experiments, then, can find poor designs just as inspections do.

Although I dropped statistical theory as the sole basis for quality engineering years ago, I always welcome any related statistical theory on how to measure and improve quality. I am sure this is the best book to begin the process of advancing quality engineering methodologies and mathematical foundations, including statistical theory. Perhaps the readers of this book will play an important role in this process.

Genichi Taguchi
Tokyo, Japan
December 25, 1986

PREFACE

A decade ago, Dr.
Genichi Taguchi was unknown in the United States. Today, he is frequently mentioned with such influential quality leaders as W. Edwards Deming, Joseph M. Juran, and Kaoru Ishikawa. This dramatic change is testimony to the success and the impact of his approach to quality engineering. By combining statistical methods with a deep understanding of engineering problems, Dr. Taguchi has created a powerful tool for quality improvement. The approach focuses on reducing the influence of factors, called noise, that degrade the performance of a product or process. This is in contrast to the conventional quality engineering practice of identifying important sources of noise and taking measures, often costly, to control them. Dr. Taguchi's "parameter design" achieves this robustness to noise through proper settings of certain parameters, called control factors. Since these parameters are generally easy to set, "macro quality is obtained with micro money." The improved settings of the parameters are obtained through statistically designed experiments that enable one to study a large number of factors with relatively few experiments. Within AT&T, these methods are collectively referred to as "robust design."

The Quality Theory and Technology Department at AT&T Bell Laboratories has worked with Dr. Taguchi on robust design since 1980. Bell Labs researchers have studied and applied this methodology in many cases. The number of requests for reprints of the articles written by these researchers has been growing, as has the popularity of Dr. Taguchi's approach. Moreover, it has become difficult to follow the evolution of this method because these articles have appeared in a wide variety of publications. This book removes that difficulty by collecting them in a single volume. The first test