A Study on the Quantitative Evaluation for the Software Included in Digital Systems of Nuclear Power Plants
DISCLAIMER: Portions of this document may be illegible in electronic image products. Images are produced from the best available original document.

2002. 3.

Summary

Recently developed nuclear power plants (NPPs) adopt digital instrumentation and control (I&C) systems because conventional analog systems suffer from limitations in correctness, maintainability, operational reliability, and complexity. In currently operating NPPs as well, the tendency to adopt digital I&C systems is increasing because it is difficult to procure spare parts for the installed analog I&C systems. In general, probabilistic safety analysis (PSA) has been one of the most important methods for evaluating the safety of NPPs. Because most NPPs have been equipped with analog I&C systems, PSA has been performed from a hardware perspective. As digital I&C systems, which include software, increasingly replace analog I&C systems, the need for quantitative evaluation methods that support PSA is also growing. Nevertheless, several factors make such a PSA difficult: software does not age, and estimating a software failure rate is complicated by the non-linear behavior of software. In this study, in order to perform PSA including software more efficiently, test-based software reliability estimation methods are reviewed and a preliminary procedure is suggested that provides reasonable guidance for quantifying the software failure rate. Research activities required to enhance the applicability of the suggested procedure are also discussed.
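The three steps of the suggested procedure ('Select inputs', 'Perform tests', 'Collect failure data to obtain software reliability') can be sketched as follows. This is an illustrative outline, not code from the report; all names are hypothetical, and simple random sampling is only one of the input-selection methods the report reviews:

```python
import random

def select_inputs(input_domain, n_tests, seed=0):
    """Step 1, 'Select inputs': draw test inputs from the operational
    input domain (simple random sampling over numeric ranges)."""
    rng = random.Random(seed)
    return [{name: rng.uniform(lo, hi) for name, (lo, hi) in input_domain.items()}
            for _ in range(n_tests)]

def perform_tests(software_under_test, oracle, inputs):
    """Step 2, 'Perform tests': run every input through the software and
    compare each output with the correct output produced by an oracle."""
    return [software_under_test(x) == oracle(x) for x in inputs]

def collect_failure_data(results):
    """Step 3, 'Collect failure data': count failures so that a failure
    rate (and hence software reliability) can be estimated."""
    return len(results), results.count(False)

# Example with a trivial software under test and an exact oracle
inputs = select_inputs({"a": (0.0, 20.0)}, n_tests=100)
results = perform_tests(lambda x: 2 * x["a"], lambda x: 2 * x["a"], inputs)
n_tests, n_failures = collect_failure_data(results)  # 100 tests, 0 failures
```

The observed pair (number of tests, number of failures) is what feeds the quantification step discussed in chapter 3.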
Contents [reconstructed; only entries legible in the scanned original are retained]

Chapter 3, Section 1: The 'Select inputs' step
Chapter 3, Section 2: The 'Perform tests' step
Chapter 3, Section 3: The 'Collect failure data to obtain software reliability' step
Chapter 4: Application of the suggested software reliability estimation procedure to the PSA of a digitalized plant protection system
Chapter 5: Conclusions
References

List of Figures [legible captions only]
Fig. 3-1. Work flow of the suggested test-based software reliability estimation procedure
Fig. 3-7. An oracle generating correct outputs for given input data
Fig. 3-8. Comparison of test coverage and input coverage of software
Fig. 4-1. Conventional reactor trip system
Fig. 4-2. Digitalized reactor trip system
Issues to be addressed in the PSA of digital systems include:
- modeling the multi-tasking of digital systems
- modeling the features of phased mission systems
- digital system induced initiating events, including human errors

(Test-based software reliability estimation methods)

ANSI (American National Standards Institute) defines software reliability as "the probability of failure-free software operation for a specific period of time or demand in a specified environment" [ANSI91]. A software failure, in this sense, is distinguished from an error, a fault, and a defect [Pfleeger].

Section 1. The 'Select inputs' step

[Table: candidate methods for estimating the required number of tests - Laplace point estimation (rule of succession), the binomial distribution assuming independent trials, and the t-distribution method.]

The Laplace point estimate of the failure probability after N failure-free tests is

    \hat{\theta} = \frac{1}{N+2}

The upper bound \theta demonstrated at confidence level C after observing F failures in N tests satisfies

    \sum_{j=0}^{F} \binom{N}{j} (1-\theta)^{N-j}\theta^{j} \le 1-C, \quad F = 0, 1, 2, \ldots

and, for N+1 tests,

    \sum_{j=0}^{F} \binom{N+1}{j} (1-\theta)^{N+1-j}\theta^{j} \le 1-C, \quad F = 0, 1, 2, \ldots

The number of tests N required to demonstrate a failure probability \theta at confidence level C is

    N \ge 1 + \frac{\ln(1-C) - \ln(1-\theta)}{\ln(1-\theta+p\theta)} \qquad (4)

where p accounts for imperfect failure detection during testing; with p = 0, equation (4) reduces to N \ge \ln(1-C)/\ln(1-\theta).

The adequacy of the number of tests can also be checked with an entropy-based criterion:

    2N\,\Delta H = \chi^{2}_{k}(1-C), \qquad \Delta H = H_{\max} - H, \qquad k = n - m

where, for n observed outcomes,

    \hat{f}_i = \text{frequency of the occurrence of the } i\text{-th outcome} = \frac{x_i}{N}, \qquad
    H = -\sum_{i=1}^{n} \hat{f}_i \ln \hat{f}_i, \qquad \sum_{i=1}^{n} \hat{f}_i = 1.0 \qquad (5)

For example, with n = 6 outcomes (k = 5 degrees of freedom) and C = 0.95,

    N = \frac{\chi^{2}_{5}(0.05)}{2\,\Delta H} = \frac{11.07}{2\,\Delta H}

[Figure: input selection methods - random sampling, bin (urn) sampling of the input domain, and selection based on execution coverage.]

White-box examples of decision points affected by the input data:

    a is integer, b = a/2;   if (b >= 2.5) then output 1 else output 2
    a is integer, b = a + 1; if (b >= 5)   then output 1 else output 2

[Figure: flow chart with decision points A, B, C, D affected by the input data; depending on the decisions taken, either segment 1 or segment 2 is performed.]

Example input domain for bin sampling: A: integer, 0 < A < 20; B: integer, -20 < B < 0; the domain is partitioned into bins (Bin1: 2, Bin2: 3, Bin3: 5).

Section 2. The 'Perform tests' step

[Text illegible in the scanned original.]

Section 3. The 'Collect failure data to obtain software reliability' step

[Fig. 3-7: an oracle generating correct outputs for given input data.]
[Fig. 3-8: comparison of test coverage and input coverage of software, for a bug included in the software.]

[Figure 4-1 shows the conventional reactor trip system: an input module checks the plant parameters, an analog/digital comparator generates the trip signal, and an output module drives the control element driving mechanism; a manual trip signal comes from the manual trip buttons.]
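The relations above are reconstructed from a damaged scan, so the following sketch (hypothetical function names) only checks my reconstruction numerically; the 11.07 value is the report's own \chi^{2}_{5}(0.05) example:

```python
import math

def laplace_estimate(n_tests):
    """Laplace point estimate of the failure probability after N
    failure-free tests: theta_hat = 1 / (N + 2)."""
    return 1.0 / (n_tests + 2)

def required_tests(confidence, theta, p=0.0):
    """Failure-free tests required by the reconstructed equation (4):
    N >= 1 + (ln(1-C) - ln(1-theta)) / ln(1 - theta + p*theta).
    With p = 0 this reduces to N >= ln(1-C) / ln(1-theta)."""
    n = 1.0 + (math.log(1.0 - confidence) - math.log(1.0 - theta)) \
        / math.log(1.0 - theta + p * theta)
    return math.ceil(n)

def entropy_sample_size(counts, chi2_critical):
    """Entropy-based sample size of equation (5): with observed outcome
    counts x_i, f_i = x_i/N, H = -sum f_i ln f_i, Delta_H = ln(n) - H;
    solving 2*N*Delta_H = chi2 gives N = chi2 / (2*Delta_H)."""
    total = sum(counts)
    freqs = [c / total for c in counts if c > 0]
    entropy = -sum(f * math.log(f) for f in freqs)
    delta_h = math.log(len(counts)) - entropy
    return chi2_critical / (2.0 * delta_h)

# Example: demonstrating theta = 1e-3 at 95% confidence (perfect detection)
n_required = required_tests(0.95, 1e-3)   # 2995 failure-free tests
theta_hat = laplace_estimate(n_required)  # Laplace estimate afterwards
```

Note that the entropy criterion only tells how many samples are needed before the observed outcome frequencies can be considered representative; it is complementary to, not a substitute for, the confidence bound of equation (4).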
Fig. 4-1. Conventional reactor trip system

[Figure 4-2 shows the digitalized reactor trip system with the same structure: an input module to check the plant parameters, a digital comparison part generating the trip signal, and an output module driving the control element driving mechanism.]

Fig. 4-2. Digitalized reactor trip system

[Intervening pages are illegible in the scanned original.]

The suggested procedure (Fig. 4-3) proceeds as follows:
- Define a relative target reliability, considering the 'safety' perspective of software reliability and the certification test
- Select inputs: select test inputs based on the 'safety' perspective and on a combination with the white-box approach, assuming independent trials between successive tests and considering the confidence level
- Perform tests
- Quantify software reliability: define the failure modes of the processor module, assuming that all failures of the processor module were due to software faults and that there are no masking effects

References

[Cannon01] R. M. Cannon. Sense and sensitivity - designing surveys based on an imperfect test. Preventive Veterinary Medicine 2001;49. p. 141-163.
[Chen96] S. Chen and S. Mills. A binary Markov process model for random testing. IEEE Transactions on Software Engineering 1996;22(3). p. 218-223.
[Choi01] J. G. Choi and P. H. Seong. Dependability estimation of a digital system with consideration of software masking effects on hardware faults. Reliability Engineering and System Safety 2001;71. p. 45-55.
[Choi98] J. K. Choi and P. H. Seong. Software dependability models under memory faults with application to a digital system in nuclear power plants. Reliability Engineering and System Safety 1998;59. p. 321-329.
[COOPRA97] COOPRA working document. What PRA needs from a digital I&C systems