

Robust Design

16.888 – Multidisciplinary System Design Optimization Robust Design

[Figure: response plotted against a control factor]

Prof. Dan Frey
Mechanical Engineering and Engineering Systems

Plan for the Session

• Basic concepts in probability and statistics
• Review design of experiments
• Basics of Robust Design
• Research topics
  – Model-based assessment of RD methods
  – Faster computer-based robust design
  – Robust invention

Ball and Ramp

[Figure: ball rolls down a ramp and into a funnel]

Response = the time the ball remains in the funnel
Causes of experimental error = ?

Probability Measure

• Axioms
  – For any event A, $P(A) \ge 0$
  – $P(U) = 1$
  – If $A \cap B = \emptyset$, then $P(A \cup B) = P(A) + P(B)$

Continuous Random Variables

• Can take values anywhere within continuous ranges
• Probability density function:
  $P(a \le x \le b) = \int_a^b f_x(x)\,dx$
  $f_x(x) \ge 0$ for all $x$
  $\int_{-\infty}^{\infty} f_x(x)\,dx = 1$

Histograms

• A graph of continuous data
• Approximates a pdf in the limit of large n

[Figure: histogram of crankpin diameters – frequency vs. diameter, pin #1]

Measures of Central Tendency

• Expected value: $E(g(x)) = \int_S g(x) f_x(x)\,dx$
• Mean: $\mu = E(x)$
• Arithmetic average: $\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i$

Measures of Dispersion

• Variance: $\mathrm{VAR}(x) = \sigma^2 = E\!\left((x - E(x))^2\right)$
• Standard deviation: $\sigma = \sqrt{E\!\left((x - E(x))^2\right)}$
• Sample variance: $S^2 = \frac{1}{n-1}\sum_{i=1}^{n} (x_i - \bar{x})^2$
• nth central moment: $E\!\left((x - E(x))^n\right)$
• nth moment about m: $E\!\left((x - m)^n\right)$

Sums of Random Variables

• The average of a sum is the sum of the averages (regardless of distribution and independence): $E(x + y) = E(x) + E(y)$
• Variances also sum, but only if the variables are independent: $\sigma^2(x + y) = \sigma^2(x) + \sigma^2(y)$
• This is the origin of the RSS (root sum of squares) rule
  – Beware of the independence restriction!

Concept Test

• A bracket holds a component as shown. The dimensions are independent random variables with standard deviations as noted (σ = 0.01" and σ = 0.001"). Approximately what is the standard deviation of the gap?
  A) 0.011"
  B) 0.01"
  C) 0.001"

[Figure: bracket and component with the two toleranced dimensions and the resulting gap]
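Under the RSS rule the gap standard deviation is $\sqrt{0.01^2 + 0.001^2} \approx 0.01005$ in. A minimal Monte Carlo sketch of that calculation, assuming (for illustration only) that the gap is the difference of two normally distributed dimensions with the stated standard deviations and hypothetical nominal values:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Hypothetical bracket dimensions; the nominal values are placeholders,
# only the standard deviations matter for the spread of the gap.
dim_a = rng.normal(loc=2.000, scale=0.010, size=n)   # sigma = 0.010 in
dim_b = rng.normal(loc=1.500, scale=0.001, size=n)   # sigma = 0.001 in

gap = dim_a - dim_b                                  # gap depends on both dimensions

print("Monte Carlo sigma of gap:", gap.std())        # ~0.01005 in
print("RSS prediction:          ", np.sqrt(0.010**2 + 0.001**2))
```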

Expectation Shift

Under utility theory, the expectation shift $S = E(y(x)) - y(E(x))$ is the only difference between probabilistic and deterministic design.

[Figure: a nonlinear response y(x) maps the input pdf $f_x(x)$ into an output pdf $f_y(y)$, so that $E(y(x))$ is shifted away from $y(E(x))$]

Probability Distribution of Sums

• If z is the sum of two random variables x and y: $z = x + y$
• Then the probability density function of z can be computed by convolution

Convolution

$p_z(z) = \int_{-\infty}^{\infty} p_x(z - \zeta)\, p_y(\zeta)\, d\zeta$
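A minimal numerical sketch of this convolution, using two uniform densities (whose sum has a triangular density) and comparing against a sampled histogram; the grid spacing and sample size are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two independent uniform random variables on [0, 1]; their sum has a
# triangular density, recovered here by discrete convolution of the pdfs.
dz = 0.001
z = np.arange(0.0, 1.0, dz)
px = np.ones_like(z)            # pdf of x ~ U(0, 1)
py = np.ones_like(z)            # pdf of y ~ U(0, 1)

pz = np.convolve(px, py) * dz   # discrete approximation of the convolution integral
z_sum = np.arange(len(pz)) * dz

# Empirical check: histogram of the sum of sampled pairs
samples = rng.uniform(size=(2, 1_000_000)).sum(axis=0)
hist, edges = np.histogram(samples, bins=50, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Largest discrepancy between convolved pdf and empirical density
print(np.max(np.abs(np.interp(centers, z_sum, pz) - hist)))  # small (sampling/binning error)
```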

Central Limit Theorem

The mean of a sequence of n iid random variables with
  – finite $\mu$
  – $E\!\left(\left|x_i - E(x_i)\right|^{2+\delta}\right) < \infty$ for some $\delta > 0$
approximates a normal distribution in the limit of large n.

Normal Distribution

$f_x(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}$

[Figure: normal curve marked at $\mu \pm 1\sigma$, $\pm 3\sigma$, $\pm 6\sigma$; about 68.3% of the probability lies within $\pm 1\sigma$, 99.7% within $\pm 3\sigma$, and all but roughly 1–2 ppb within $\pm 6\sigma$]

Engineering Tolerances

• Tolerance – the total amount by which a specified dimension is permitted to vary (ANSI Y14.5M)
• Every component within spec adds to the yield (Y)

[Figure: distribution p(q) of a quality characteristic q; the area between the lower and upper specification limits L and U is the yield Y]

Process Capability Indices

• Process capability index: $C_p \equiv \frac{(U - L)/2}{3\sigma}$
• Bias factor: $k \equiv \frac{\left|\mu - (U + L)/2\right|}{(U - L)/2}$
• Performance index: $C_{pk} \equiv C_p (1 - k)$

[Figure: distribution p(q) with lower limit L, midpoint (U+L)/2, and upper limit U marked on the q axis]

Concept Test

• Motorola's "6 sigma" programs suggest that we should strive for a Cp of 2.0. If this is achieved but the mean is off target so that k = 0.5, estimate the process yield.
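Assuming a normal process distribution, the yield follows directly from Cp and k; a short sketch:

```python
from scipy.stats import norm

Cp, k = 2.0, 0.5
half_width = 3 * Cp          # (U - L)/2 expressed in units of sigma (since Cp = (U-L)/(6*sigma))
shift = k * half_width       # |mu - (U + L)/2| in units of sigma

# Distances from the shifted mean to the two specification limits, in sigmas
z_near = half_width - shift  # 3 sigma
z_far = half_width + shift   # 9 sigma

yield_fraction = norm.cdf(z_near) - norm.cdf(-z_far)
print(f"estimated yield = {yield_fraction:.5%}")                 # about 99.865%
print(f"defects per million ~ {(1 - yield_fraction) * 1e6:.0f}")  # ~1350 ppm
```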

Plan for the Session

• Basic concepts in probability and statistics
• Review design of experiments
• Basics of Robust Design
• Research topics
  – Model-based assessment of RD methods
  – Faster computer-based robust design
  – Robust invention

Pop Quiz

• Assume we wish to estimate the effect of ball position on the ramp on swirl time. The experimental error causes σ = 1 sec in the response. We run the experiment 4 times. What is the error in our estimate of swirl time?
  A) σ = 1 sec
  B) σ = 1/2 sec
  C) σ = 1/4 sec

[Figure: ball, ramp, and funnel apparatus]

History of DoE

• 1926 – R. A. Fisher introduced the idea of factorial design
• 1950–70 – Response surface methods
• 1987 – G. Taguchi, System of Experimental Design

Full Factorial Design

• This is the 2^4 design
• All main effects and interactions can be resolved
• Scales very poorly with number of factors

 A   B   C   D   Response
+1  +1  +1  +1
+1  +1  +1  -1
+1  +1  -1  +1
+1  +1  -1  -1
+1  -1  +1  +1
+1  -1  +1  -1
+1  -1  -1  +1
+1  -1  -1  -1
-1  +1  +1  +1
-1  +1  +1  -1
-1  +1  -1  +1
-1  +1  -1  -1
-1  -1  +1  +1
-1  -1  +1  -1
-1  -1  -1  +1
-1  -1  -1  -1

… and Precision

(Referring to the same 2^4 design as above.) The average of trials 1 through 8 has a standard deviation 1/√8 that of a single trial (its variance is 1/8 that of a single trial) — "the same precision as if the whole … had been devoted to one single component" – Fisher

Resolution and Aliasing

Trial   A   B   C   D   E   F   G   FG=-A
  1    -1  -1  -1  -1  -1  -1  -1    +1
  2    -1  -1  -1  +1  +1  +1  +1    +1
  3    -1  +1  +1  -1  -1  +1  +1    +1
  4    -1  +1  +1  +1  +1  -1  -1    +1
  5    +1  -1  +1  -1  +1  -1  +1    -1
  6    +1  -1  +1  +1  -1  +1  -1    -1
  7    +1  +1  -1  -1  +1  +1  -1    -1
  8    +1  +1  -1  +1  -1  -1  +1    -1

2^(7-4) Design (aka "L8")
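A quick numerical check of the aliasing shown in the last column of the table, using the design matrix exactly as given (a sketch with numpy):

```python
import numpy as np

# The 2^(7-4) design from the table above (columns A..G, coded -1/+1)
design = np.array([
    [-1, -1, -1, -1, -1, -1, -1],
    [-1, -1, -1, +1, +1, +1, +1],
    [-1, +1, +1, -1, -1, +1, +1],
    [-1, +1, +1, +1, +1, -1, -1],
    [+1, -1, +1, -1, +1, -1, +1],
    [+1, -1, +1, +1, -1, +1, -1],
    [+1, +1, -1, -1, +1, +1, -1],
    [+1, +1, -1, +1, -1, -1, +1],
])
A, B, C, D, E, F, G = design.T

# The two-factor interaction column F*G equals -A in every run, so the
# FG interaction is aliased with the main effect of A (resolution III).
print(np.array_equal(F * G, -A))   # True
```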

Resolution III: Projective Property

[Figure: design points plotted on the A–B–C cube, illustrating the projective property]

Considered important for exploiting sparsity of effects.

DOE – Key Assumptions

• Pure experimental error – error in observations is random & independent
• Hierarchy – lower order effects are more likely to be significant than higher order effects
• Sparsity of effects – there are few important effects
• Effect heredity – for an interaction to be significant, at least one parent should be significant

Sparsity of Effects

• An experimenter may list several factors
• They usually affect the response to greatly varying degrees
• The drop-off is surprisingly steep (~1/n²)
• Not sparse if prior knowledge is used or if factors are screened

[Figure: factor effects plotted against Pareto-ordered factors, falling off steeply]

Hierarchy

• Main effects are usually more important than two-factor interactions
• Two-factor interactions are usually more important than three-factor interactions
• And so on
• Taylor's series seems to support the idea:
  $f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}(x - a)^n$

[Figure: lattice of effects – A, B, C, D; AB, AC, AD, BC, BD, CD; ABC, ABD, ACD, BCD; ABCD]

Inheritance

• Two-factor interactions are most likely when both participating factors (parents?) are strong
• Two-factor interactions are least likely when neither parent is strong
• And so on

Resolution

• II – Main effects are aliased with main effects
• III – Main effects are clear of other main effects but aliased with two-factor interactions
• IV – Main effects are clear of other main effects and clear of two-factor interactions, but main effects are aliased with three-factor interactions and two-factor interactions are aliased with other two-factor interactions
• V – Two-factor interactions are clear of other two-factor interactions but are aliased with three-factor interactions…

Discussion Point

• What are the four most important factors affecting swirl time?
• If you want to have sparsity of effects and hierarchy, how would you formulate the variables?

Important Concepts in DOE

• Resolution – the ability of an experiment to provide estimates of effects that are clear of other effects
• Sparsity of effects – factor effects are few
• Hierarchy – interactions are generally less significant than main effects
• Inheritance – if an interaction is significant, at least one of its "parents" is usually significant
• Efficiency – ability of an experiment to estimate effects with small error variance

Plan for the Session

• Basic concepts in probability and statistics
• Review design of experiments
• Basics of Robust Design
• Research topics
  – Model-based assessment of RD methods
  – Faster computer-based robust design
  – Robust invention

Major Concepts of Taguchi Method

• Variation causes quality loss
• Two-step optimization
• Parameter design via orthogonal arrays
• Inducing noise (outer arrays)
• Interactions and confirmation

Loss Function Concept

• Quantify the economic consequences of performance degradation due to variation

[Figure: blank loss function L(y) plotted against y]

What should the function be?

Fraction Defective Fallacy

• ANSI seems to imply a "goalpost" mentality
• But what is the difference between
  – parts 1 and 2?
  – parts 2 and 3?
• Isn't a continuous function more appropriate?

[Figure: goalpost loss function – L(y) = 0 between the limits m−Δo and m+Δo and Ao outside; parts 1, 2, 3 lie near the limit]

A Generic Loss Function

• Desired properties
  – Zero at nominal value
  – Equal to cost Ao at the specification limit
  – C1 continuous
• Taylor series:
  $f(x) \approx \sum_{n=0}^{\infty} \frac{1}{n!}(x - a)^n f^{(n)}(a)$

Nominal-the-best

• Defined as $L(y) = \frac{A_o}{\Delta_o^2}(y - m)^2$
• Average loss is proportional to the 2nd moment about m

[Figure: quadratic quality loss function compared with the "goal post" loss function, both reaching cost Ao at m−Δo and m+Δo]

Average Quality Loss

$E[L(y)] = \frac{A_o}{\Delta_o^2}\left[\sigma^2 + (\mu - m)^2\right]$

[Figure: quadratic quality loss function overlaid with the probability density function of y, centered at μ near the target m]

Other Loss Functions

• Smaller-the-better: $L(y) = \frac{A_o}{\Delta_o^2}\, y^2$
• Larger-the-better: $L(y) = A_o \Delta_o^2\, \frac{1}{y^2}$
• Asymmetric:
  $L(y) = \frac{A_o}{\Delta_{\mathrm{Upper}}^2}(y - m)^2$ if $y > m$
  $L(y) = \frac{A_o}{\Delta_{\mathrm{Lower}}^2}(y - m)^2$ if $y \le m$
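A minimal sketch of the nominal-the-best average loss computation, using hypothetical target, tolerance, and cost values (the m, Δo, Ao, and sample data below are placeholders, not values from the lecture):

```python
import numpy as np

def average_quality_loss(y, m, A_o, delta_o):
    """Average nominal-the-best quadratic loss: (A_o/delta_o^2) * (sigma^2 + (mu - m)^2)."""
    y = np.asarray(y, dtype=float)
    mu, sigma2 = y.mean(), y.var()
    return (A_o / delta_o**2) * (sigma2 + (mu - m) ** 2)

# Hypothetical example: target m = 10.0, tolerance half-width delta_o = 0.5,
# cost at the specification limit A_o = $4 per unit.
rng = np.random.default_rng(2)
y = rng.normal(loc=10.1, scale=0.2, size=10_000)   # slightly off target, some spread

print(f"average loss per unit = ${average_quality_loss(y, 10.0, 4.0, 0.5):.2f}")
# Direct check against the per-unit loss L(y) = (A_o/delta_o^2)(y - m)^2
print(f"mean of per-unit losses = ${np.mean(4.0 / 0.5**2 * (y - 10.0)**2):.2f}")
```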

Who is the better target shooter?

[Figure: two shot patterns, Sam's and John's]

Sam can just adjust his sights; John requires lengthy training.

The "P" Diagram

[Diagram: noise factors and control factors enter a product/process block, which produces the response]

There are usually more control factors than responses.

Exploiting Non-linearity

Use your extra "degrees of freedom" and search for robust set points.

[Figure: nonlinear response vs. control factor – the flat region of the curve gives a robust set point]

Inner and Outer (Crossed) Arrays

• Induce the same noise factor levels for each row in a balanced manner

Inner array (L9) – control factors:
Expt. No.   A B C D
   1        1 1 1 1
   2        1 2 2 2
   3        1 3 3 3
   4        2 1 2 3
   5        2 2 3 1
   6        2 3 1 2
   7        3 1 3 2
   8        3 2 1 3
   9        3 3 2 1

Outer array (L4) – noise factors:
N1: 1 1 2 2
N2: 1 2 1 2
N3: 1 2 2 1

inner × outer = L9 × L4 = 36 runs

Compounding Noise

• If the physics are understood qualitatively, worst case combinations may be identified a priori (e.g., L9 × 2 compounded noise conditions = 18 runs instead of 36)

Signal to Noise Ratio

• PERformance Measure Independent of Adjustment – PERMIA (two-step optimization)
• For each row of the inner array, take the average μ and standard deviation σ of the response over the outer array:
  $\eta = 10 \log_{10}\!\left(\frac{\mu^2}{\sigma^2}\right)$
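A minimal sketch of the row-wise S/N computation for a crossed array; the 9 × 4 response array below is randomly generated placeholder data, not the class data:

```python
import numpy as np

# Hypothetical responses: 9 inner-array rows (control settings) x 4 outer-array noise conditions
rng = np.random.default_rng(3)
y = rng.normal(loc=20.0, scale=2.0, size=(9, 4))   # placeholder data

mu = y.mean(axis=1)                   # row averages
sigma = y.std(axis=1, ddof=1)         # row standard deviations

eta = 10 * np.log10(mu**2 / sigma**2)  # S/N ratio per row

for row, (m, s, e) in enumerate(zip(mu, sigma, eta), start=1):
    print(f"expt {row}: mu={m:6.2f}  sigma={s:5.2f}  eta={e:6.2f} dB")

# Two-step optimization: pick control factor levels that maximize eta,
# then use a scaling factor to put the mean back on target.
```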

Factor Effect Plots

[Figure: factor effects on S/N ratio for levels A1–A3, B1–B3, C1–C3, D1–D3 – choose the best levels; one factor may serve as a scaling factor]

Prediction Equation

$\eta(A_i, B_j, C_k, D_l) = \mu + a_i + b_j + c_k + d_l + e$
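A sketch of how the additive prediction could be formed from per-row η values; the η values below are placeholders, not measured data, and the L9 layout matches the inner array shown earlier:

```python
import numpy as np

# L9 inner array (levels 1-3 for factors A, B, C, D) and hypothetical eta values per row
L9 = np.array([[1,1,1,1],[1,2,2,2],[1,3,3,3],[2,1,2,3],[2,2,3,1],
               [2,3,1,2],[3,1,3,2],[3,2,1,3],[3,3,2,1]])
eta = np.array([14.1, 12.8, 11.9, 13.0, 12.2, 12.7, 13.5, 11.8, 12.4])

grand_mean = eta.mean()

# Factor effect a_i = (average eta at level i of that factor) - grand mean
effects = {}
for col, name in enumerate("ABCD"):
    effects[name] = {lvl: eta[L9[:, col] == lvl].mean() - grand_mean
                     for lvl in (1, 2, 3)}

best = {name: max(levels, key=levels.get) for name, levels in effects.items()}
prediction = grand_mean + sum(effects[name][best[name]] for name in "ABCD")

print("best levels:", best)
print(f"predicted eta at best levels: {prediction:.2f} dB")
# The confirmation run checks this additive prediction against a real experiment.
```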

Confirmation

[Figure: factor effects on S/N ratio – "build the best plane"]

Check the result against the prediction:
$\eta(A_i, B_j, C_k, D_l) = \mu + a_i + b_j + c_k + d_l + e$

What is an Interaction?

• If I carry out this experiment, I will find that:

Control factors
Expt. No.   A B C D     η
   1        1 1 2 2   24.88
   2        1 2 2 2   21.78
   3        1 3 2 2   20.17
   4        2 1 2 2   21.38
   5        2 2 2 2   22.62
   6        2 3 2 2   22.02
   7        3 1 2 2   25.03
   8        3 2 2 2   19.93
   9        3 3 2 2   20.58

[Figure: interaction plot of η versus B level with separate lines for A1, A2, A3]

If there are significant interactions, the prediction may fail to confirm.
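Using the η values from the table above, the A×B interaction can be quantified as the departure of each cell from an additive (main-effects-only) model; a short sketch:

```python
import numpy as np

# eta values from the table above, indexed by (A level, B level); C = D = 2 throughout
eta = np.array([[24.88, 21.78, 20.17],   # A = 1
                [21.38, 22.62, 22.02],   # A = 2
                [25.03, 19.93, 20.58]])  # A = 3

grand = eta.mean()
a_eff = eta.mean(axis=1) - grand          # main effects of A
b_eff = eta.mean(axis=0) - grand          # main effects of B

# Interaction = cell mean minus what the additive model predicts
additive = grand + a_eff[:, None] + b_eff[None, :]
interaction = eta - additive

print("A main effects:", np.round(a_eff, 2))
print("B main effects:", np.round(b_eff, 2))
print("A x B interaction terms:\n", np.round(interaction, 2))
# Large interaction terms mean the factor-effect lines are not parallel,
# so the additive prediction equation may fail to confirm.
```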

Major Concepts of Taguchi Method

• Variation causes quality loss
• Two-step optimization
• Parameter design via orthogonal arrays
• Inducing noise (outer arrays)
• Interactions and confirmation

Some Concerns with the Taguchi Method

• Interactions can often cause failure to confirm
• Two-step optimization is not really needed
• Use of S/N is often not as useful as modeling the response explicitly
• Some experts consider crossed arrays less efficient than putting noise in the inner array

References

• Byrne, Diane M., and Shin Taguchi, "The Taguchi Approach to Parameter Design," Quality Progress, Dec. 1987.
• Phadke, Madhav S., 1989, Quality Engineering Using Robust Design, Prentice Hall, Englewood Cliffs, NJ.
• Logothetis and Wynn, 1994, Quality Through Design, Oxford Series on Advanced Manufacturing.
• Wu, C. F. J., and M. Hamada, 2000, Experiments: Planning, Analysis and Parameter Design Optimization, John Wiley & Sons, New York.

Plan for the Session

• Basic concepts in probability and statistics
• Review design of experiments
• Basics of Robust Design
• Research topics
  – Model-based assessment of RD methods
  – Faster computer-based robust design
  – Robust invention

A Model

$y(x_1, x_2, \ldots, x_n) = \sum_{i=1}^{n} \beta_i x_i + \sum_{i=1}^{n}\sum_{j>i}^{n} \beta_{ij} x_i x_j + \sum_{i=1}^{n}\sum_{j>i}^{n}\sum_{k>j}^{n} \beta_{ijk} x_i x_j x_k + \varepsilon$

$f(\beta_i \mid \delta_i) = \begin{cases} N(0,1) & \text{if } \delta_i = 0 \\ N(0,c^2) & \text{if } \delta_i = 1 \end{cases}$ — effects are normally distributed, in two classes: strong and weak

$\Pr(\delta_i = 1) = p$ — effect sparsity

$\Pr(\delta_{ij} = 1 \mid \delta_i, \delta_j) = \begin{cases} p_{00} & \text{if } \delta_i + \delta_j = 0 \\ p_{01} & \text{if } \delta_i + \delta_j = 1 \\ p_{11} & \text{if } \delta_i + \delta_j = 2 \end{cases}$ — effect hierarchy & inheritance
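A minimal sketch of how one might instantiate such a model (main effects and two-factor interactions only), using the "Basic WH" parameter values listed later in the variants table (c = 10, p = 0.25, p11 = 0.25, p01 = 0.1, p00 = 0). Drawing interaction coefficients from N(0,1) or N(0,c²) mirrors the main-effect assumption and is my reading of the slide, not something it states explicitly:

```python
import numpy as np

def sample_model(n=7, c=10.0, p=0.25, p11=0.25, p01=0.10, p00=0.0, rng=None):
    """Draw one instance of the hierarchical model (mains and two-way terms only)."""
    rng = rng or np.random.default_rng()
    delta = rng.random(n) < p                        # which main effects are "strong"
    beta = rng.normal(0.0, np.where(delta, c, 1.0))  # strong ~ N(0, c^2), weak ~ N(0, 1)

    beta2 = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            # inheritance: the chance of an active interaction depends on its parents
            prob = (p11, p01, p00)[2 - (delta[i] + delta[j])]
            active = rng.random() < prob
            beta2[i, j] = rng.normal(0.0, c if active else 1.0)
    return beta, beta2

beta, beta2 = sample_model(rng=np.random.default_rng(4))
print("indices of large sampled main effects:", np.flatnonzero(np.abs(beta) > 3))
```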

Chipman, H., M. Hamada, and C. F. J. Wu, 1997, "A Bayesian Variable Selection Approach for Analyzing Designed Experiments with Complex Aliasing," Technometrics 39(4):372–381.

Robust Design Method Evaluation Approach

1. Instantiate models of multiple "engineering systems"
2. For each system, simulate different robust design methods
3. For each system/method pair, perform a confirmation experiment
4. Analyze the data

Frey, D. D., and X. Li, 2004, "Validating Robust Design Methods," accepted for the ASME Design Engineering Technical Conference, September 28 – October 2, Salt Lake City, UT.

Including Noise Factors in the Model

$y(x_1, x_2, \ldots, x_n) = \sum_{i=1}^{n} \beta_i x_i + \sum_{i=1}^{n}\sum_{j>i}^{n} \beta_{ij} x_i x_j + \sum_{i=1}^{n}\sum_{j>i}^{n}\sum_{k>j}^{n} \beta_{ijk} x_i x_j x_k + \varepsilon$

$x_i \sim \mathrm{NID}(0, w_1^2), \quad i = 1, \ldots, m$ — the first m factors are noise factors
$x_i \in \{-1, +1\}, \quad i = m+1, \ldots, n$ — the rest are control factors with two levels
$\varepsilon \sim \mathrm{NID}(0, w_2^2)$ — observations of the response y are subject to experimental error

Confirmation

Using a polynomial response has the advantage that the response variance is easily computable:

$\sigma^2(x_{m+1}, \ldots, x_n) = w_1^2 \sum_{i=1}^{m}\Bigl(\beta_i + \sum_{j=m+1}^{n}\beta_{ij} x_j + \sum_{j=m+1}^{n}\sum_{k>j}^{n}\beta_{ijk} x_j x_k\Bigr)^2 + w_1^4 \sum_{i=1}^{m}\sum_{j>i}^{m}\Bigl(\beta_{ij} + \sum_{k=m+1}^{n}\beta_{ijk} x_k\Bigr)^2 + w_1^6 \sum_{i=1}^{m}\sum_{j>i}^{m}\sum_{k>j}^{m}\beta_{ijk}^2$
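A quick Monte Carlo sanity check of this decomposition for a small, randomly generated instance. The closed form being checked is the reconstruction above (with the noise factors' standard deviation equal to w1 and the observation error omitted), so treat both as a sketch:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(5)
n, m, w1 = 5, 2, 0.7            # 2 noise factors, 3 control factors (illustrative sizes)

# Random coefficients for main, two-way, and three-way terms
b1 = rng.normal(size=n)
b2 = {ij: rng.normal() for ij in combinations(range(n), 2)}
b3 = {ijk: rng.normal() for ijk in combinations(range(n), 3)}
xc = rng.choice([-1.0, 1.0], size=n)     # control factor settings (first m entries unused)

def y(noise):
    x = xc.copy(); x[:m] = noise
    return (b1 @ x
            + sum(b * x[i] * x[j] for (i, j), b in b2.items())
            + sum(b * x[i] * x[j] * x[k] for (i, j, k), b in b3.items()))

# Monte Carlo variance over the noise factors
sims = np.array([y(rng.normal(0, w1, size=m)) for _ in range(200_000)])

# Analytic variance from the decomposition above
lin = sum((b1[i] + sum(b2[tuple(sorted((i, j)))] * xc[j] for j in range(m, n))
           + sum(b3[tuple(sorted((i, j, k)))] * xc[j] * xc[k]
                 for j in range(m, n) for k in range(j + 1, n))) ** 2
          for i in range(m))
quad = sum((b2[(i, j)] + sum(b3[tuple(sorted((i, j, k)))] * xc[k]
                             for k in range(m, n))) ** 2
           for i in range(m) for j in range(i + 1, m))
cub = sum(b3[(i, j, k)] ** 2
          for i in range(m) for j in range(i + 1, m) for k in range(j + 1, m))
analytic = w1**2 * lin + w1**4 * quad + w1**6 * cub

print(f"Monte Carlo: {sims.var():.4f}   analytic: {analytic:.4f}")  # should agree closely
```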

Fitting the Model to Data

• Collect published full factorial data on various engineering systems – more than 100 data sets collected so far
• Use the Lenth method to sort "active" and "inactive" effects
• Estimate the probabilities in the model
• Use the other free parameters to make the model pdf fit the data

[Figure: "Distribution of Effects" – histograms (percentage vs. effect size) for the data and for the fitted model pdf]

Different Variants of the Model

                   c    s1    s2    w1    w2
Basic WH           10   1     1     1     1      ← the model that drives much of DOE & Robust Design
Basic low w        10   1     1     0.1   0.1
Basic 2nd order    10   1     0     1     1
Fitted WH          15   1/3   2/3   1     1
Fitted low w       15   1/3   2/3   0.1   0.1
Fitted 2nd order   15   1/3   0     1     1      ← the model I think is most realistic

                   p     p11   p01   p00   p111  p011  p001  p000
Basic WH           0.25  0.25  0.1   0     0.25  0.1   0     0
Basic low w        0.25  0.25  0.1   0     0.25  0.1   0     0
Basic 2nd order    0.25  0.25  0.1   0     N/A   N/A   N/A   N/A
Fitted WH          0.43  0.31  0.04  0     0.17  0.08  0.02  0
Fitted low w       0.43  0.31  0.04  0     0.17  0.08  0.02  0
Fitted 2nd order   0.43  0.31  0.04  0     N/A   N/A   N/A   N/A

Results

Method                      Experiments   Basic WH   Basic low w   Basic 2nd order   Fitted WH   Fitted low w   Fitted 2nd order
2^7 × 2^3                   1,024         60%        81%           58%               50%         58%            40%
2^7 × 2^(3-1) III           512           44%        80%           52%               45%         58%            40%
2^(10-4)                    64            8%         8%            56%               18%         9%             38%
2^(10-5)                    32            9%         3%            33%               16%         9%             17%
2^(7-4) III × 2^(3-1) III   32            12%        8%            51%               16%         25%            38%
OFAT × 2^(3-1) III          32            39%        56%           43%               36%         42%            35%
OFAT × OFAT                 32            31%        37%           41%               33%         31%            27%
2^(10-6)                    16            4%         4%            8%                4%          2%             0%

Take-aways:
• The single array is extremely effective if the typical modeling assumptions of DOE hold
• The single array is terribly ineffective if the more realistic assumptions are made
• Taguchi's crossed arrays are more effective than single arrays
• An adaptive approach is quite effective if the more realistic assumptions are made
• An adaptive approach is a solid choice (among the fast/frugal set) no matter what modeling assumptions are made

From "A Comparison of Taguchi's Product Array and the Combined Array in …": "We have run an experiment where we have done both designs simultaneously (product and combined). In our experiment, we found that the product array performed better for the identification of effects on the variance. An explanation for this might be that the combined array relies too much on the factor sparsity assumption." – Joachim Kunert, Universitaet Dortmund, The Eleventh Annual Spring Research Conference (SRC) on Statistics in Industry and Technology, May 19–21, 2004

Plan for the Session

• Basic concepts in probability and statistics
• Review design of experiments
• Basics of Robust Design
• Research topics
  – Model-based assessment of RD methods
  – Faster computer-based robust design
  – Robust invention

Sampling Techniques for Computer Experiments

[Figure: random sampling vs. stratified sampling of the input space]

Proposed Method

• Simply extend quadrature to many variables
• Will be exact if factor effects of up to 4th polynomial order linearly superpose
• Lacks projective property
• Poor divergence

[Figure: quadrature sampling points along each factor axis z1, z2, z3 at ±1.3556 and ±2.8750]
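A sketch of a one-factor-at-a-time quadrature estimate of the output standard deviation. The proposed method's exact node placement and weights are not given here, so standard 5-point Gauss–Hermite (probabilists') quadrature stands in; with the shared center node this costs 4d+1 runs, and it is exact when the factor effects (up to 4th order) superpose linearly:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

def quadrature_std(f, d):
    """One-factor-at-a-time quadrature estimate of the mean and standard deviation
    of f(z) when the d inputs are independent standard normals (nominal point at 0).
    Uses 5-point Gauss-Hermite nodes; the shared center node gives 4d + 1 runs."""
    nodes, weights = hermegauss(5)          # nodes/weights for weight exp(-z^2/2)
    weights = weights / weights.sum()       # normalize to a probability measure
    z0 = np.zeros(d)
    y0 = f(z0)                              # shared center evaluation
    mean, var = y0, 0.0
    for i in range(d):
        yi = np.array([y0 if abs(node) < 1e-12
                       else f(np.where(np.arange(d) == i, node, z0))
                       for node in nodes])
        mi = weights @ yi
        mean += mi - y0                     # valid when factor effects superpose linearly
        var += weights @ (yi - mi) ** 2
    return mean, np.sqrt(var)

# Illustrative additive response in d = 7 factors (cubic in each factor, no interactions)
beta = np.linspace(1.0, 0.2, 7)
f = lambda z: float(np.sum(beta * z + 0.3 * beta * z**2 + 0.05 * beta * z**3))

m, s = quadrature_std(f, d=7)
# Exact variance of a*z + b*z^2 + c*z^3 for z ~ N(0,1): a^2 + 2b^2 + 6ac + 15c^2
exact = np.sqrt(np.sum(beta**2 + 2*(0.3*beta)**2 + 6*beta*(0.05*beta) + 15*(0.05*beta)**2))
print(f"quadrature sigma = {s:.6f}, exact sigma = {exact:.6f}")  # agree to roundoff
```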

Why Neglect Interactions?

If the response is polynomial,
$\eta(\mathbf{z}) = \beta_0 + \sum_{i} \beta_i z_i + \sum_{i \le j} \beta_{ij} z_i z_j + \sum_{i \le j \le k} \beta_{ijk} z_i z_j z_k + \sum_{i \le j \le k \le l} \beta_{ijkl} z_i z_j z_k z_l$

then the variance decomposes so that the single-factor terms,
$\sum_{i=1}^{n}\left(\beta_i^2 + 2\beta_{ii}^2 + 6\beta_i\beta_{iii} + 15\beta_{iii}^2 + 24\beta_{ii}\beta_{iiii} + 96\beta_{iiii}^2\right),$
carry larger coefficients than the double, triple, and quadruple sums over the mixed coefficients ($\beta_{ij}$, $\beta_{ijk}$, $\beta_{ijkl}$, …). The effects of single factors therefore make larger contributions to $\sigma^2$ than the mixed terms.

Fourth Order – RWH Model Fit to Data

[Figure: cumulative probability of % error in estimating the standard deviation, for d = 7 factors – quadrature (4d+1 = 29 samples), cubature (d²+3d+3 = 73 samples), HSS (29 and 290 samples), and LHS (29 and 290 samples)]

Continuous-Stirred Tank Reactor

• Objective is to generate chemical species B at a rate of 60 mol/min

[Figure: CSTR with feed F, T_i, C_Ai, C_Bi; heat duty Q; outlet F, T, C_A, C_B]

$Q = F \rho C_p (T - T_i) - V\left(r_A \Delta H_{RA} + r_B \Delta H_{RB}\right)$

$C_A = \frac{C_{Ai}}{1 + k_A^0 e^{-E_A/RT}\,\tau}$

$C_B = \frac{C_{Bi} + k_A^0 e^{-E_A/RT}\,\tau\, C_A}{1 + k_B^0 e^{-E_B/RT}\,\tau}$

$-r_A = k_A^0 e^{-E_A/RT} C_A$

$-r_B = k_B^0 e^{-E_B/RT} C_B - k_A^0 e^{-E_A/RT} C_A$

Adapted from Kalagnanam, J. R., and U. M. Diwekar, 1997, "An Efficient Sampling Technique for Off-Line Quality Control," Technometrics 39(3):308–319.

Comparing HSS and Quadrature

Hammersley Sequence Sampling:
• Required ~150 points
• 1% accuracy in σ²
• σ² reduced from 1,638 to 232
• Nominally on target
• Mean 15% off target

Quadrature:
• Used 25 points
• 0.3% accuracy in μ
• 9% accuracy in (y−60)² far from the optimum
• 0.8% accuracy in (y−60)² near to the optimum
• Better optimum, on target and slightly lower variance
• E(L(y)) = 208.458

[Figure: probability density (min/mol) of production rate (mol/min) for the quadrature and HSS optima]

Plan for the Session

• Basic concepts in probability and statistics • Review design of experiments • Basics of Robust Design • Research topics – Model-based assessment of RD methods – Faster computer-based robust design – Robust invention Percentage of total An opportunity 100 25 50 75

Problem definition

Concept Robust parameterdesign

Lifecycle phase design

Detail design

Manufacture Design flexibility costs committed Quality determined&

Use Philip Barkan Ford and Russell B. Source: Defining “Robustness Invention”

• A "robustness invention" is a technical or design innovation whose primary purpose is to make performance more consistent despite the influence of noise factors
• The patent summary and prior art sections usually provide clues

Example – A Pendulum Robust to Temperature Variations

• Period of the swing is affected by length
• Length is affected by temperature
• Consistency is a key to accurate timekeeping
• Using materials with different thermal expansion coefficients, the length can be made insensitive to temperature

Theory of Inventive Problem Solving (TRIZ)

• Genrich Altshuller sought to identify patterns in the patent literature
• Defined problems as contradictions
• Provided a large database of solutions
• Stimulates designers' creativity by presenting past designs appropriate to their current challenge

Searching for Robustness Inventions

• Keyword search in USPTO database
• There seem to be several thousand

Search Term             Hits      Search Term            Hits
Insensitive             35,708    Independent            114,201
Less sensitive          12,253    Uncoupling             2,189
Robust                  27,913    Decoupling             6,505
Accurate                221,600   Noise compensation     22,092
Reliable                211,533   Noise control          142,138
Repeatable              16,458    Noise conditioning     10,787
Tolerant                13,765    Resistant              3,535
Despite changes         1,323     Acclimation            712
Regardless of changes   1,147     Desensitize            447
Independent of          20,521    Sweet spot             1,317
Self compensating       1,269     Operating window       728
Force cancellation      59        TOTAL                  867,472

Classifying Inventions via the P-Diagram

[Figure: P-diagram (signal, noise, control factors, response) annotated with example patents – Patent #4,487,333, "Fluid Dispensing System"; Patent #5,483,840, "System for Measuring Flow"; Patent #5,024,105, viscosity-insensitive variable-area flowmeter]

Courtesy of the United States Patent and Trademark Office, http://www.uspto.gov.

Discussion Point

[Figure: ball, ramp, and funnel apparatus]

Response = the time the ball remains in the funnel
Noise factor = 2 types of ball

Name some ways that you might modify the ball and ramp equipment or procedure to make the system robust to ball type.

Conclusions So Far

• Effective strategies for experimentation should be adaptive (not always, but under a broad range of scenarios)
• Resolution is not always required for reliable improvement
• Simulating the process of experimentation provides insights I can't get from deduction alone

Questions?

Dan Frey Assistant Professor of Mechanical Engineering and Engineering Systems