KAERI/AR-565/00

A technical survey on issues of the quantitative
evaluation of software reliability

Korea Atomic Energy Research Institute (KAERI)
2000. 4.
Summary
Recently developed nuclear power plants, including Ulchin 5/6, have adopted digital
instrumentation and control (I&C) systems because conventional analog systems suffer
from limitations in correctness, maintainability, operational reliability, and complexity.
In addition, currently operating nuclear power plants increasingly adopt digital I&C
systems because spare parts for the installed analog I&C systems are difficult to procure.
One of the most significant differences between conventional analog and digital I&C
systems is the use of computer systems in which various kinds of software are embedded.
However, in contrast to hardware, it is very difficult to obtain the software reliability
data (or software failure rates) required for the PSA of nuclear power plants, because
several problems complicate the evaluation of software reliability (i.e., no aging,
non-linearity and resource sharing).
Therefore, as a preliminary phase, the following research was performed to determine
an appropriate methodology for evaluating software reliability:
o Categorizing the methodologies that have been proposed for evaluating software
reliability in the software reliability engineering field.
o Analyzing the strong and weak points of each category to determine the most
appropriate methodologies that can be applied to digital I&C systems.
As a result, no methodology was found that can be directly applied to the evaluation
of the software reliability of digital I&C systems. Thus, additional research to combine
the most appropriate methodologies/techniques from existing ones is needed to evaluate
the software reliability.
Contents

Chapter 1  Introduction
Chapter 2  PSA of Nuclear Power Plants
  Section 1  Overview of PSA
  Section 2  Application of PSA to digital I&C systems
Chapter 3  Software Reliability Evaluation Methods
  Section 1  Indirect estimation methods
    1. Implementation Quality (evaluation based on the software itself)
      A. Category 1 - Evaluation based on the physical size of the software
      B. Category 2 - Evaluation based on the functional size of the software
      C. Category 3 - Evaluation based on the structure of the software
      D. Category 4 - Evaluation based on programmer psychology
    2. Category 5 - Requirement Quality (evaluation of software requirements)
    3. Category 6 - Design Quality (evaluation of the software design process)
    4. Testing Quality (evaluation of the software testing process)
      A. Category 7 - Test Coverage evaluation
      B. Category 8 - Test Effort evaluation
      C. Category 9 - Test Team Availability/Capability evaluation
    5. Category 10 - Multivariate approach
  Section 2  Direct estimation methods
    1. Category 11 - Software reliability growth models based on testing and
       statistical techniques
    2. Category 12 - Reliability evaluation based on software modeling
    3. Category 13 - Reliability evaluation based on models integrating the
       development process and the end product
      A. RADC model
      B. BBN model
  Section 3  Issues in software reliability evaluation methods
Chapter 4  Conclusions and Recommendations
References
Appendix
  A. Software complexity evaluation methods
    A.1 Operational or Functional Complexity
    A.2 BVA Model
    A.3 Number of Entries and Exits per Module
    A.4 Design Structure
    A.5 Data Flow Complexity
    A.6 Cohesion/Coupling
    A.7 Functional Complexity
    A.8 Data Structure Metrics
    A.9 System Design Complexity
    A.10 Entropy Measure
  B. Requirement Quality evaluation methods
    B.1 Cause and Effect Graphing
    B.2 Requirement Traceability
    B.3 Number of Conflicting Requirements
    B.4 Requirements Compliance
    B.5 Completeness
    B.6 Requirements Specification Change Requests
  C. Software development process quality evaluation methods
    C.1 Software Process Capability Determination
    C.2 Cost and Schedule
    C.3 Reliability Prediction as a Function of Development Environment
  D. Test coverage evaluation methods
    D.1 Functional Test Coverage
    D.2 Minimal Unit Case Determination
    D.3 Test Coverage
    D.4 Test Sufficiency
    D.5 Testability Analysis
  E. Test Effort evaluation methods
    E.1 Fault Number Days
    E.2 Man-hours per Major Defect Detected
  F. Test Team Availability/Capability evaluation methods
    F.1 Number of Faults Remaining
List of Tables

  Table 2.1  Main contents of each PSA level
  Table 3.1  Classification of software reliability evaluation methods
  Table 3.2  Evaluation methods based on bugs per line of code
  Table 3.3  Example of a cyclomatic complexity calculation
  Table 3.4  Evaluation methods based on software structure
  Table 3.5  Example of a Halstead effort calculation
  Table 3.6  Evaluation methods for software requirement quality
  Table 3.7  Evaluation methods for software design quality
  Table 3.8  Test Coverage evaluation methods
  Table 3.9  Test Effort evaluation methods
  Table 3.10 Test Team Availability/Capability evaluation methods
  Table 3.11 Software reliability growth models (1/2), (2/2)
  Table 3.12 Reliability evaluation methods based on software modeling (1/2), (2/2)
  Table 3.13 Example conditional probability table, p(Loses | Sick & Dry)
  Table 3.14 Representative issues of the software reliability evaluation methods
List of Figures

  Figure 2.1  Overall framework of the PSA of nuclear power plants
  Figure 3.1  Relationship among human errors, software faults, defects and
              system failures
  Figure 3.2  Software life cycle
  Figure 3.3  Life cycle phases covered by direct estimation methods
  Figure 3.4  Estimation of the failure rate and MTTF from failures observed
              during testing
  Figure 3.5  Software life cycle coverage of the RADC and BBN models
  Figure 3.6  Structure of the RADC evaluation model
  Figure 3.7  Example BBN with the nodes Sick, Dry and Loses
  Figure 3.8  Example BBN for software reliability evaluation
Chapter 1  Introduction

Recently developed nuclear power plants, including Ulchin 5/6, have adopted digital
instrumentation and control (I&C) systems because conventional analog systems suffer
from limitations in correctness, maintainability, operational reliability and complexity
[1, 2], and the same tendency is observed overseas [3-7]. Analog equipment fails
mainly through physical mechanisms such as drift, whereas digital systems are built
around CPUs and shared resources, so common cause failures (Common Cause
Failure; CCF) arising from shared software and hardware must be considered [8-10].

However, in contrast to hardware, it is very difficult to obtain the software
reliability data (software failure rates) required for the Probabilistic Safety Analysis
(PSA) of nuclear power plants. Hardware failures can be treated as random
processes, but software failures cannot; together with properties such as
non-linearity, this makes the direct application of conventional PSA techniques
difficult [9, 11, 12].

Therefore, to obtain the failure probability needed for PSA, this report surveys and
categorizes the methodologies that have been proposed in the software reliability
engineering field.
Chapter 2  PSA of Nuclear Power Plants

Section 1  Overview of PSA

The safety of nuclear power plants is evaluated both by deterministic safety
analysis (design basis accident analysis) and by PSA. Figure 2.1 shows the overall
framework of PSA, and Table 2.1 summarizes the main contents of each PSA level.

Figure 2.1  Overall framework of the PSA of nuclear power plants
Table 2.1  Main contents of each PSA level (Levels 1 ~ 3): each level is
characterized by its scope, by its use of event trees and fault trees, and by the
bottom-line numbers it produces for decision-making.
In PSA, accident sequences are modeled with event trees (ET) and fault trees (FT),
which are quantified from the probabilities of basic events. Because the failure of
one component can affect others, dependencies among components must be modeled
explicitly; for example, redundant trains of the High Pressure Safety Injection
(HPSI) pumps actuated by the same signal can fail together from a shared cause,
which undermines the benefit of redundancy. Table 2.2 summarizes the types of
dependency treated in PSA.

Table 2.2  Types of dependency treated in PSA

o Functional dependency
o Spatial dependency
o Human dependency
o CCF (Common Cause Failure): simultaneous failure of redundant components
  from a shared root cause
[9, 11, 12].
O Aging
PSA
(random failure)
PSA
S^o i f e agingo] «Hi4*l ^7]
non-linearity
ioo°c~2oo°ctflSfl
six:}.
(Windows
1 ^ A ^ BB
- 6 -
3.7)1 «*•§•
O ifcSS dependency £
£TT I/O
(error) iEfe ^^> (fault)
& 71-
PSA
V&V (Validation and
Verification) 7}
A S 41-3-*!-[1, 2. 12].
^ 7 )
- 7 -
fe 410"7RyS.
6\]
14]. 2-, i S S
10-4~10-3/demand^ [1, 2, 9, 13,
^ ^ # PSA
PSA
^ (decision-making)
71
o)-g-
[15],
71
; 2)
3)
ANSI (American National Standards Institute) defines software reliability as "the
probability of failure-free software operation for a specific period of time in a
specified environment" [16]. Here, a software failure is distinguished from the
related terms error, fault and defect as follows [17]:

o Error: a human mistake (human error) made by the programmer or analyst.
o Fault: the manifestation of an error in the software, introduced as an anomaly
  in the coding or in the requirements.
o Failure: the inability of the software to perform its required function during
  operation.
Figure 3.1  Relationship among human errors, software faults, defects and system
failures (programmer human errors lead to errors in the software requirements and
design; these become software faults or, if irrelevant to the requirements or
design, defects; faults in turn lead to system failures)
Chapter 3  Software Reliability Evaluation Methods

The software reliability evaluation methods proposed in the software reliability
engineering field [17, 18] can be classified as shown in Table 3.1.

Table 3.1  Classification of software reliability evaluation methods

o Indirect estimation (predicting software reliability using development process
  quality)
  - Implementation quality
    . Software size
      : Physical size (based on the number of lines of code) - Category 1
      : Functional size (based on the number of software functions) - Category 2
    . Software complexity
      : Based on software structure or architecture - Category 3
      : Based on psychology - Category 4
  - Requirement quality: evaluating software requirement quality - Category 5
  - Design process quality: evaluating the quality of software design, design
    organization, design environment and implementation - Category 6
  - Testing process quality
    . Evaluating test coverage - Category 7
    . Evaluating test effort - Category 8
    . Evaluating availability/capability of the software testing team - Category 9
  - Miscellaneous: multivariate approach - Category 10
o Direct estimation (estimating software reliability by predicting the software
  failure rate through analytical or mathematical methods)
  - Concerning the end product only
    . Based on software testing + statistical techniques - Category 11
    . Based on software modeling - Category 12
  - Including development processes + end product
    . Based on models which can synthesize/integrate each development process
      and the end product (software) - Category 13
In the following sections, the indirect estimation methods (categories 1 ~ 10) and
the direct estimation methods (categories 11 ~ 13) are reviewed in turn.
Section 1  Indirect estimation methods

Indirect estimation methods predict software reliability from the quality of the
development process over the software life cycle [19]. Figure 3.2 shows the
software life cycle assumed throughout this report.

Figure 3.2  Software life cycle (Software Requirement -> Software Design ->
Implementation (Coding) -> Testing -> User), with verification and validation
(V&V) activities performed at each phase

1. Implementation Quality (evaluation based on the software itself)

Methods in this group evaluate the implemented software itself, typically through
complexity measures; complexity affects both the number of faults introduced by
the programmer and the burden on maintenance personnel.
The number of failures to be expected in the software can be predicted from its
size or complexity. The methods are grouped into evaluation based on the physical
size of the software (category 1), on its functional size (category 2), on its
structure (category 3), and on programmer psychology (category 4).

A. Category 1 - Evaluation based on the physical size of the software

Category 1 methods predict the number of faults from the physical size of the
source code. Examples include 1) bugs per line of code, 2) fault density, and
3) design defect density. Representative evaluation methods based on bugs per
line of code are summarized in Table 3.2 [20, 21].
Table 3.2  Evaluation methods based on bugs per line of code

o Rough estimation: industry estimate of 5 ~ 30 failures per thousand executable
  lines of code.
o Crude estimation (of the number of failures in a program per line of code):

    F_i = 4.2 + 0.0015 * (S_i)^(4/3),    F = sum of F_i over i = 1 .. N

  where
    S_i = the number of executable source lines in the ith module
    F_i = estimated number of failures in the ith module
    F   = estimated number of failures in the whole program
    N   = the number of modules contained in the program
Because these relationships were derived from data on programs written in older
languages such as FORTRAN, care must be taken when applying them to
object-oriented programs or other high-level languages.
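The crude estimation of Table 3.2 can be illustrated with a minimal Python sketch;
the module sizes used here are illustrative assumptions, not data from the report:

    # Crude estimation from Table 3.2: F_i = 4.2 + 0.0015 * S_i**(4/3),
    # summed over all modules of the program.

    def estimated_failures(module_sizes):
        """Estimate total program failures from executable lines per module."""
        return sum(4.2 + 0.0015 * s ** (4.0 / 3.0) for s in module_sizes)

    sizes = [120, 450, 900]  # hypothetical executable source lines per module
    print(f"Estimated failures: {estimated_failures(sizes):.1f}")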
B. Category 2 - Evaluation based on the functional size of the software

Category 2 methods measure the size of the software not by its code but by the
number of functions it provides, typically through function point analysis (or
feature point analysis). Function points are counted from the following five kinds
of functions:
o The number of functions that are related to internal logical files (logical groups
  of data maintained in an application)
o The number of functions that are related to external interface files (logical
  groups of data used by one application but maintained by another application)
o The number of functions that are related to external inputs (maintained by
  internal logical files)
o The number of functions that are related to external outputs (reports and data
  leaving the application)
o The number of functions that are related to external inquiries (combination of
  a data request and data retrieval)
Functional size measures are also used together with the testing quality measures
discussed later (e.g., functional test coverage).
C. Category 3 - Evaluation based on the structure of the software

Unlike categories 1 and 2, this category evaluates the architecture or structure of
the software, largely on the basis of graph theory [22]. The most widely used
measure is cyclomatic complexity, which is computed from the control flow graph
of the source code, as illustrated in Table 3.3.
Table 3.3  Example of a cyclomatic complexity calculation

Textual program (source code), divided into basic blocks:

    ********** BLOCK NO 1 **********
          NTOT = (NCHARS + 7)*NWORDS
    ********** BLOCK NO 2 **********
          IF(...) GOTO 900
    ********** BLOCK NO 3 **********
          ...
    ********** BLOCK NO 4 **********
      900 PRINT NTOT
    ********** BLOCK NO 5 **********
      990 STOP
          END

Control flow graph: 5 nodes (one per block) connected by 5 edges.

    The number of edges = e = 5
    The number of nodes = n = 5
    Cyclomatic complexity = e - n + 2 = 2
As the example shows, cyclomatic complexity is computed only from the numbers
of edges and nodes of the flow chart (flow graph), i.e., from an abstract
representation of the program, so it can be applied regardless of the programming
language.
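The calculation of Table 3.3 can be reproduced with a short Python sketch; the
edge list below is an assumption reconstructed from the five-block flow graph of
the table:

    # McCabe's cyclomatic complexity: v = e - n + 2 for a connected flow graph.

    def cyclomatic_complexity(edges, nodes):
        return len(edges) - len(nodes) + 2

    nodes = [1, 2, 3, 4, 5]                           # basic blocks
    edges = [(1, 2), (2, 3), (2, 4), (3, 4), (4, 5)]  # control-flow transfers
    print(cyclomatic_complexity(edges, nodes))        # -> 2, as in Table 3.3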
Table 3.4  Evaluation methods based on software structure

o Operational or functional complexity (interface/micro/operator complexity)
o BVA model
o Number of entries & exits per module
o Design structure
o Data flow complexity
o Cohesion/coupling
o Functional complexity
o Data structure metrics
o System design complexity
o Entropy measure

(Each method is described in Appendix A.)
D. Category 4 - Evaluation based on programmer psychology

Unlike the preceding categories, category 4 methods are rooted in engineering
psychology: they model the mental effort a programmer must spend to write or
understand the software. The representative measure is Halstead's effort measure
(E measure), illustrated in Table 3.5.
Table 3.5  Example of a Halstead effort calculation

    n1 = the number of unique operators
    n2 = the number of unique operands
    N1 = total frequency of operators
    N2 = total frequency of operands
    E  = average effort for programming

    E = n1 * N2 * (N1 + N2) * log2(n1 + n2) / (2 * n2)

Example program:

    IF (A=0) THEN
      A = B;
    ELSE
      A = C;

    Operators and frequencies: "=" (3), "( )" (1), "THEN" (1), "ELSE" (1),
    "IF" (1), ";" (2)
    Operands and frequencies: A (3), B (1), C (1)

    n1 = 6, n2 = 3, N1 = 9, N2 = 5
    E = 221.90
The measure is computed only from operator and operand counts, so it, too, can
be applied regardless of the programming language.
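The example of Table 3.5 can be checked with a minimal Python sketch of the
effort formula:

    # Halstead's effort measure from Table 3.5:
    # E = n1 * N2 * (N1 + N2) * log2(n1 + n2) / (2 * n2)

    import math

    def halstead_effort(n1, n2, N1, N2):
        """n1/n2: unique operators/operands, N1/N2: total occurrences."""
        return n1 * N2 * (N1 + N2) * math.log2(n1 + n2) / (2 * n2)

    print(round(halstead_effort(n1=6, n2=3, N1=9, N2=5), 2))  # -> 221.89,
    # i.e. the 221.90 of Table 3.5 up to rounding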
2. Category 5 - Requirement Quality (evaluation of software requirements)

Many software faults originate in the requirements phase rather than in coding
[17], so the quality of the software requirements is itself an indicator of software
reliability. Table 3.6 lists representative evaluation methods; each is described in
Appendix B.

Table 3.6  Evaluation methods for software requirement quality
o Cause & effect graphing
o Requirements traceability
o Number of conflicting requirements
o Requirements compliance
o Completeness
o Requirements specification change requests

These methods evaluate, for example, whether the requirements are complete and
consistent, and how stable they remain during development (requirement stability).
3. Category 6 - Design Quality (evaluation of the software design process)

In contrast to category 5, which examines the requirements themselves, category 6
evaluates the quality of the process by which the software is designed and
developed, including the capability of the development organization, the
development environment, and the cost and schedule performance. Table 3.7 lists
representative methods; several are described in Appendix C.
Table 3.7  Evaluation methods for software design quality

o Software process capability determination (SPICE)
o Cost/schedule
o Reliability prediction as a function of development environment
o Reliability prediction for the operational environment
o Reliability prediction as a function of SW characteristics
4. Testing Quality (evaluation of the software testing process)

The quality of software testing is evaluated from three viewpoints: "category 7 -
how thoroughly the software is tested (test coverage)," "category 8 - how much
effort is spent on testing (test effort)," and "category 9 - how available and
capable the testing team is (test team availability or capability)."

A. Category 7 - Test Coverage evaluation

Test coverage measures how thoroughly the tests exercise the software. Table 3.8
lists representative methods; each is described in Appendix D.

Table 3.8  Test Coverage evaluation methods

o Functional test coverage: the ratio of the number of functions tested to the
  number of functions the software is required to provide.
o Minimal unit test case determination: the number of test cases needed to cover
  the independent paths of each module.
o Test coverage: the fraction of the program (functions or primitives) exercised
  by testing.
o Testing sufficiency: compares the number of faults predicted for the integrated
  software against the number actually found.
o PIE (Propagation, Infection and Execution) technique: estimates, for each part
  of the software, the probability that a fault there is executed, infects the data
  state, and propagates to the output.

B. Category 8 - Test Effort evaluation

This category evaluates how much effort is devoted to testing. Table 3.9 lists
representative methods; each is described in Appendix E.

Table 3.9  Test Effort evaluation methods

o Fault number days: weights each fault by how long it stays in the system
  before removal.
o Man-hours per major defect detected: the inspection effort spent per
  non-trivial fault found.
C. Category 9 - Test Team Availability/Capability evaluation

This category evaluates the testing team itself, using techniques such as error
seeding or sampling. Table 3.10 lists representative methods; the first is described
in Appendix F.

Table 3.10  Test Team Availability/Capability evaluation methods

o Number of faults remaining: estimated by seeding known faults and observing
  how many the team finds.
o Test accuracy: evaluated with techniques such as mutation testing.
5. Category 10 - Multivariate approach

The multivariate approach combines several of the measures from categories 1 ~ 9
into a single indicator:

    Score = sum over i of (w_i * M_i)

where M_i is an individual measure and w_i its weight; the weights are usually
determined by statistical techniques such as factor analysis or principal component
analysis. A representative example is the software release readiness measure,
which combines measures such as functional test coverage (category 2), software
document source listings, and the software capability maturity model (category 6).
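A minimal Python sketch of the weighted combination follows; the measures, their
normalization to [0, 1], and the weights are illustrative assumptions (in practice
the weights would come from factor analysis or principal component analysis):

    # Multivariate combination of several normalized measures into one score.

    measures = {                      # hypothetical normalized measures
        "functional_test_coverage": 0.92,
        "document_source_listing": 0.75,
        "capability_maturity": 0.60,
    }
    weights = {                       # assumed weights, summing to 1.0
        "functional_test_coverage": 0.5,
        "document_source_listing": 0.2,
        "capability_maturity": 0.3,
    }

    release_readiness = sum(weights[k] * measures[k] for k in measures)
    print(round(release_readiness, 3))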
Section 2  Direct estimation methods

Direct estimation methods evaluate the software product itself rather than the
process that produced it. A well-known argument for doing so is the "Pareto
Principle" of software faults [23]: a small portion of the modules accounts for
most of the faults, so process-based measures do not necessarily reflect where the
faults actually are. Two approaches are distinguished: 1) testing the software and
treating the results statistically (category 11), and 2) modeling the software and
computing reliability from the model (categories 12 and 13).

Figure 3.3  Software life cycle, with the phases covered by direct estimation
(Implementation (Coding) and Testing) highlighted
1. Category 11 - Software reliability growth models based on testing and
statistical techniques

These methods test the software according to an operational profile, record the
failures observed during the test period, and fit a statistical model to estimate
quantities such as the failure rate or the MTTF. Representative software
reliability growth models are summarized in Table 3.11 [19, 24-30].

Figure 3.4  Estimation of the failure rate and MTTF from failures observed
during testing
Table 3.11  Software reliability growth models (1/2)

o Jelinski & Moranda model
  - Proposed in 1972, based on a McDonnell Douglas Navy project.
  - Assumes a constant failure rate between failures, so that the MTTF follows
    an exponential distribution.
o Non-homogeneous Poisson process (NHPP) model
  - Proposed in 1979 by Amrit Goel and Kazu Okumoto.
  - The cumulative number of failures follows a Poisson process whose rate
    changes with time.
o Schneidewind's model
  - Proposed in 1975 by Schneidewind and applied to IBM space shuttle flight
    software; one of the four models recommended by the AIAA (American
    Institute of Aeronautics and Astronautics) for software reliability modeling.
  - Uses the failure counts observed in successive test intervals of equal
    length; assumes a constant failure rate within each interval and a Poisson
    process for the counts.
Table 3.11  Software reliability growth models (2/2)

o Musa's basic execution model
  - Proposed in 1975 by John Musa.
  - Expresses reliability as a function of execution time rather than calendar
    time; assumes a constant failure rate per fault and a Poisson process, and
    is a variant of the non-homogeneous Poisson process (NHPP) model.
o Hyper-exponential model
  - Partitions the software into classes (e.g., by coding characteristics) and
    applies an exponential model with its own constant rate to each class,
    using the failure counts observed in successive test intervals.
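A minimal Python sketch of the NHPP (Goel-Okumoto) model of Table 3.11 is
shown below; the parameter values a (total expected faults) and b (fault detection
rate) are illustrative assumptions, which in practice would be fitted to observed
failure data, e.g. by maximum likelihood:

    # Goel-Okumoto NHPP: expected cumulative failures m(t) = a * (1 - exp(-b*t));
    # the failure intensity is its derivative, a * b * exp(-b*t).

    import math

    def mean_failures(t, a, b):
        return a * (1.0 - math.exp(-b * t))

    def failure_intensity(t, a, b):
        return a * b * math.exp(-b * t)

    a, b = 100.0, 0.02            # assumed parameter values
    for t in (0, 50, 100, 200):   # test time (e.g. CPU hours)
        print(t, round(mean_failures(t, a, b), 1),
              round(failure_intensity(t, a, b), 3))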
The statistical treatment of test results can be divided into three approaches:
1) the time domain approach, 2) the error seeding & tagging approach, and 3) the
input domain approach. Since error seeding was already discussed under the test
team availability/capability evaluation of category 9, only the time domain
approach and the input domain approach are compared here [17].

The time domain approach observes the times between successive failures during
the test period and fits a reliability growth model to estimate the failure rate or
the MTTF (mean time to failure); the models of Table 3.11 belong to this
approach. Its validity depends on how closely the test conditions reproduce the
operational conditions.

The input domain approach instead partitions the input range of the software,
samples test cases from it according to the expected operational profile, and
estimates reliability as the fraction of test cases executed successfully [29, 30].
For example, if 1000 test cases are sampled according to the operational profile
and 999 of them succeed, the estimated reliability is 0.999. Unlike the time
domain approach, no assumption about the shape of the failure process is needed,
but a representative operational profile and a large number of test cases are
required.

In summary, the time domain approach and the input domain approach differ in
the data they use (inter-failure times versus success counts over sampled inputs)
and in the assumptions they impose.
2. Category 12 - Reliability evaluation based on software modeling

This category models the structure or behavior of the software and computes its
reliability from the model. Representative methods are summarized in Table 3.12
[20, 21].
Table 3.12  Reliability evaluation methods based on software modeling (1/2)

o System performance reliability
  - Models the system as a QNM (Queueing Network Model) and defines
    System Reliability = p(R < R0), where R is the system response time and
    R0 is the design goal or requirement.
o Independent process reliability
  - Models the software as a set of processes without loops or local
    dependencies (e.g., hardware-like serial processes) and combines their
    reliabilities.
o Markov modeling
  - Models the transitions among software modules (or states) as a Markov
    process and computes system reliability from the transition probabilities.
o System operational availability
  - Models the availability of software whose components have dependencies.
Table 3.12  Reliability evaluation methods based on software modeling (2/2)

o Reliability block diagrams (RBD)
o k-out-of-n model
o Fault tree modeling

Notes:
o For each module, the reliability is assumed to decrease exponentially with time.
o In the RBD (k-out-of-n) model, the system operates if at least k of its n
  modules operate.
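The k-out-of-n model of Table 3.12 can be sketched in a few lines of Python;
the failure rate lambda and the mission time are illustrative assumptions, and the
exponential module reliability follows the note above:

    # k-out-of-n reliability: the system works if at least k of n independent,
    # identical modules work; module reliability R(t) = exp(-lambda * t).

    import math
    from math import comb

    def module_reliability(t, lam):
        return math.exp(-lam * t)

    def k_out_of_n(k, n, r):
        """P(at least k of n modules with reliability r survive)."""
        return sum(comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))

    r = module_reliability(t=1000.0, lam=1e-4)   # single-module reliability
    print(round(k_out_of_n(k=2, n=3, r=r), 4))   # 2-out-of-3 system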
3. Category 13 - Reliability evaluation based on models integrating the
development process and the end product

This category includes the RADC model, developed by the Rome Air Development
Center (Rome Laboratory), and the BBN (Bayesian Belief Network) model, which
has been studied at organizations such as LLNL (Lawrence Livermore National
Laboratory). Unlike categories 11 and 12, which concern the end product only,
these models cover the whole software life cycle, as shown in Figure 3.5.

Figure 3.5  Software life cycle coverage of the RADC and BBN models (Software
Requirement -> Software Design -> Implementation (Coding) -> Testing -> User)
A. RADC model

The RADC model resembles the multivariate approach of category 10 in that it
combines many individual measures, but it differs in covering the whole software
life cycle: measures already introduced in categories 1 ~ 9 are collected for each
life cycle phase and combined into one reliability figure. Figure 3.6 outlines the
model.

Figure 3.6  Structure of the RADC evaluation model:

o Base reliability for the software itself (failure rate, reliability prediction for
  the operating environment, run reliability) -> A (base reliability)
o Design quality (cause & effect graphing, fault number days, design defect
  density, man-hours per major defect, function point analysis, requirements
  traceability) -> D
o Implementation quality (bugs per line of code, code defect density, man-hours
  per major defect, function point analysis, cyclomatic complexity, graph
  theoretic architecture)
  -> S1 = product over the 6 measures of (0.5 - 97.997*f_i + 99.5*f_i^2)
o Test quality (cumulative failure profile, code defect density, design defect
  density, man-hours per major defect, modular test coverage, reviews/
  inspections/walkthroughs, capability maturity model, requirements change
  requests)
  -> S2 = product over the 8 measures of (0.5 - 97.997*f_i + 99.5*f_i^2)
o Overall reliability: R = A * D * S1 * S2;  failure rate: R_F = 10.6 * R
B. BBN model

Unlike the RADC model, which combines measures through fixed formulas, the
BBN (Bayesian Belief Network) model expresses the causal relationships among
the factors that affect software reliability as a network and quantifies them with
Bayesian theory. BBNs are widely used for diagnosis and decision-making under
uncertainty. A BBN consists of a network of nodes, each with a set of states,
together with conditional probability tables relating each node to its parents;
Table 3.13 shows an example conditional probability table.
Table 3.13  Example conditional probability table, p(Loses | Sick & Dry)

                        Sick = "yes"                 Sick = "no"
                  Dry = "yes"   Dry = "no"     Dry = "yes"   Dry = "no"
  Loses = "yes"      0.95          0.85           0.90          0.02
  Loses = "no"       0.05          0.15           0.10          0.98
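Inference over this small network can be sketched in Python; the conditional
probabilities come from Table 3.13, while the priors for Sick and Dry are
illustrative assumptions (the report does not give them):

    # Marginal p(Loses = "yes") by summing the CPT over the parent states.

    p_loses_yes = {              # p(Loses="yes" | Sick, Dry) from Table 3.13
        ("yes", "yes"): 0.95,
        ("yes", "no"): 0.85,
        ("no", "yes"): 0.90,
        ("no", "no"): 0.02,
    }
    p_sick = {"yes": 0.10, "no": 0.90}  # assumed prior
    p_dry = {"yes": 0.10, "no": 0.90}   # assumed prior

    marginal = sum(
        p_loses_yes[(s, d)] * p_sick[s] * p_dry[d]
        for s in ("yes", "no") for d in ("yes", "no")
    )
    print(round(marginal, 4))  # p(Loses = "yes")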
In this example (Figure 3.7), the network contains three nodes (Sick, Dry and
Loses), each with the two states "yes" and "no"; the state of "Loses" depends on
the states of its parents "Sick" and "Dry" through the conditional probabilities
of Table 3.13.

Figure 3.7  Example BBN with the nodes Sick, Dry and Loses

When a BBN is applied to software reliability evaluation, the nodes represent
quantities such as the quality of each development process (development process
quality), the number of faults introduced and removed, and the resulting
reliability, as illustrated in Figure 3.8.

Figure 3.8  Example BBN for software reliability evaluation (including a
"Development Process Quality" node)
Section 3  Issues in software reliability evaluation methods

1. Issues of the indirect estimation methods

The indirect estimation methods have the advantage of covering the whole
software life cycle, and the multivariate approach (category 10) can combine many
individual measures, with weighting factors derived through factor analysis or
principal component analysis. Their fundamental weakness is that the link between
process quality and the reliability of the delivered software is only indirect: none
of the measures, from requirement quality onward, has an established quantitative
relationship to a failure rate. Table 3.14 summarizes representative issues.

Table 3.14  Representative issues of the software reliability evaluation methods
(e.g., dependence on the quality of testing, on the size measures used, and on
individual characteristics of the programmers and evaluators)

2. Issues of the direct estimation methods

For the direct estimation methods, including the integrated models of category 13,
the main issues are the quality of the failure data and of the operational profile
used during testing, the sensitivity of the result to the model fitted to the data,
and the assumptions each model imposes.
A more fundamental limitation is the number of tests needed to demonstrate a
very small failure probability [31]. The number n of consecutive failure-free test
cases required to demonstrate a failure probability p per demand with confidence
C is

    n = ln(1 - C) / ln(1 - p)

where
    C = confidence level
    p = target failure probability per demand

For example, demonstrating a failure probability of 10^-7 with 90% confidence
requires about 23,000,000 failure-free test cases.
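The 23,000,000 figure can be checked directly with a short Python sketch of the
formula above:

    # Failure-free tests needed: n = ln(1 - C) / ln(1 - p).

    import math

    def required_tests(confidence, p_failure):
        return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_failure))

    # About 23 million tests for p = 10^-7 at 90% confidence, as stated above.
    print(required_tests(confidence=0.90, p_failure=1e-7))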
Besides the RADC and BBN models of category 13, software reliability evaluation
methods based on artificial neural networks have also been proposed, but none of
them resolves the limitations discussed above.

The RADC model evaluates the whole software life cycle, but, like the multivariate
approach of category 10 from which it derives, the formulas that combine the
individual measures are fixed and their empirical basis is dated. The BBN model
is more flexible, but its conditional probability tables must be calibrated before
the results can be trusted, and such calibration data are scarce [32].
Chapter 4  Conclusions and Recommendations

The categories of software reliability evaluation methods surveyed in this report
are compared in Table 4.1.

Table 4.1  Comparison of the software reliability evaluation categories

o Categories 1 ~ 10 (indirect estimation): cover the software life cycle, but the
  link between process quality and the reliability of the end product is indirect.
o Categories 11 ~ 12 (direct estimation): estimate reliability from the end
  product itself, but cover only part of the life cycle and require large amounts
  of test data.
o Category 13 (integrated models): cover the whole life cycle, but require
  calibration before the results can be used.
As Table 4.1 and the discussion of Chapter 3 show [17], methodologies that can
be directly applied to the evaluation of the software reliability of digital I&C
systems do not exist: the indirect methods cover the life cycle but do not yield
the failure probabilities needed for PSA, while the direct methods cannot, in
practice, demonstrate the very small failure probabilities required of
safety-critical software. Regulatory practice, e.g., at the NRC, accordingly still
relies on deterministic, qualitative criteria [33].

Therefore, additional research that combines the most appropriate
methodologies/techniques from the existing categories (1 ~ 13) is needed before
the software reliability of digital I&C systems can be quantified for PSA.
[1] D. L. Parnas, G. J. K. Asmis and J. Madey, Assessment of Safety-Critical
Software in Nuclear Power Plants, Nuclear Safety Vol. 32, No. 2, pp. 189-198,
1991.
[2] D. Welbourne, Safety Critical Software in Nuclear Power, The GEC Journal of
Technology, Vol. 14, No. 1, pp. 33-40, 1997.
[3] J. P. Burel, The Use of Digital Technology for Protection and Safety
Applications at French Nuclear Reactors, Kerntechnik, Vol. 60, No. 5-6, pp.
220-224, 1995.
[4] T. Graae and L. Engdahl, The Reliability of the Software of the Digital Control
System Nuclear Advantage, Kerntechnik, Vol. 61, No. 5-6, pp. 236-238, 1996.
[5] H. W. Bock and A. Graf, Reliability Aspects of Computer-based Safety Systems
for Nuclear Power Plants, Kerntechnik, Vol. 60, No. 5-6, pp. 207-214, 1995.
[6] H. Hofmann and H. J. Sauer, Effect of Fieldbus Technology on Digital
Instrumentation and Control for Nuclear Power Plants, Kerntechnik, Vol. 60, No.
5-6, pp. 245-247, 1995.
[7] U. Kunze and V. Streicher, Advanced Monitoring Systems for Preventive
Maintenance of Mechanical Systems and Components, Kerntechnik, Vol. 60, No.
5-6, pp. 238-241, 1995.
[8] (In Korean), pp. 58-67, 1999.
[9] (In Korean), pp. 68-75, 1999.
[10] (In Korean), pp. 76-83, 1999.
[11] USNRC, Digital Instrumentation and Control Systems in Nuclear Power
Plants-Safety and Reliability Issues, Final Report, National Academy Press,
Washington D.C., 1997.
[12] COOPRA working document, What PRA Needs From A Digital I&C Systems
Analysis: An Opinion, www.coopra.org, 1999.
[13] NEA/CSNI/R(97)23, Operating and Maintenance Experience with Computer-based
Systems in Nuclear Power Plants, Nuclear Energy Agency, Committee on the
Safety of Nuclear Installations, September, 1998.
[14] N. M. Ichiyen and P. K. Joannou, Safety Critical Software Design Approaches
Developed for Canadian Nuclear Power Plants, Kerntechnik, Vol. 60, No. 5-6,
pp. 232-237, 1995.
[15] Bev Littlewood and Lorenzo Strigini, Validation of Ultrahigh Dependability for
Software-based Systems, Communications of the ACM, Vol. 36, No. 11, pp.
69-80, 1993.
[16] ANSI/IEEE, Standard Glossary of Software Engineering Terminology,
STD-729-1991, ANSI/IEEE, 1991.
[17] S. L. Pfleeger, Measuring Software Reliability, IEEE Spectrum, Vol. 29, Iss. 8,
pp. 56-60, 1992.
[18] IEEE, Charter and Organization of the Software Reliability Engineering
Committee, 1995.
[19] Michael R. Lyu, et al., Handbook of Software Reliability Engineering, IEEE
Computer Society Press, 1995.
[20] IEEE Std. 982.1, IEEE Standard Dictionary of Measures to Produce Reliable
Software, 1988.
[21] IEEE Std. 982.2, IEEE Guide for the Use of IEEE Standard Dictionary of
Measures to Produce Reliable Software, 1988.
[22] Ronald E. Prather, Design and analysis of hierarchical software metrics, ACM
Computing Surveys, Vol. 27, No. 4, pp. 497-517, 1995.
[23] Norman Fenton and Martin Neil, A Critique of Software Defect Prediction
Models, to appear IEEE Transactions on Software Engineering, 1999
(http://www.csr.city.ac.uk/papers/1999.html).
[24] Alan Wood, Predicting Software Reliability, IEEE Computer, pp. 69-77,
November 1996.
[25] Norman F. Schneidewind, Software Reliability Model with Optimal Selection of
Failure Data, IEEE Transactions on Software Engineering, Vol. 19, No. 11, pp.
1095-1104, 1993.
[26] F. Popentiu and D. N. Boros, Software Reliability Growth Supermodels,
Microelectronics and Reliability, Vol. 36, No. 4, pp. 485-491, 1996.
[27] Bev Littlewood and David Wright, Some Conservative Stopping Rules for the
Operational Testing of Safety-Critical Software, IEEE Transactions on Software
Engineering, Vol. 23, No. 11, pp. 673-683, 1997.
[28] Keith W. Miller, Larry J. Morell, Robert E. Noonan, Stephen K. Park, David M.
Nicol, Branson W. Murrill and Jeffrey M. Voas, Estimating the Probability of
Failure when Testing Reveals No Failures, IEEE Transactions on Software
Engineering, Vol. 18, No. 1, pp. 33-42, 1992.
[29] Jeff Tian, Integrating Time Domain and Input Domain Analyses of Software
Reliability using Tree-Based Models, IEEE Transactions on Software Engineering,
Vol. 21, No. 12, pp. 945-958, 1995.
[30] Walter J. Gutjahr, Optimal Test Distributions for Software Failure Cost
Estimation, IEEE Transactions on Software Engineering, Vol. 21, No. 3, pp.
219-228, 1995.
[31] Paul E. Ammann, Susan S. Brilliant and John C. Knight, "The effect of
Imperfect Error Detection on Reliability Assessment via Life Testing," IEEE
Transactions on Software Engineering, Vol. 20, No. 2, pp. 142-148, 1994.
[32] Marc Bouissou, Assessment of a Safety-Critical System Including Software: A
Bayesian Belief Network for Evidence Sources, EdF, 1999.
[33] Jeffrey M. Voas and Keith W. Miller, Software Testability: The New
Verification, IEEE Software, Vol. 12, No. 3, pp. 17-28, 1995.
[34] Han S. Son and Poong H. Seong, Quantitative evaluation of safety-critical
software at the early development stage: an interposing logic system software
example, Reliability Engineering and System Safety, Vol. 50, No. 3, pp.
261-269, 1995.
[35] An entropy-based measure of software complexity, IEEE Transactions on Software
Engineering, Vol.18, No. 11, pp. 1025-1029, 1992.
[36] G. J. Myers, Reliable software through composite design, Petrocelli, 1975.
[37] Barry W. Boehm, Software engineering economics, Prentice-Hall, 1981.
[38] Wendy W. Peng and Dolores R. Wallace, Software error analysis, NIST Special
Publication 500-209, April 1993.
[39] David N. Card and Robert L. Glass, Measuring software design quality,
Prentice-Hall, 1990.
[40] A. Mowshowitz, Entropy and the complexity of graphs: I. An index of the
relative complexity of a graph, Bulletin of Mathematical Biophysics, Vol. 30, pp.
175-204, 1968.
[41] K. S. Lew et al., Software complexity and its impact on software reliability,
IEEE Transactions on Software Engineering, Vol. 14, No. 11, pp. 1645-1655,
1988.
[42] M. R. Woodward et al., A measure of control flow complexity in program text,
IEEE Transactions on Software Engineering, Vol. SE-5, No. 1, pp. 45-50, 1979.
[43] J. McCall et al., Methodology for software reliability prediction,
RADC-TR-87-171, Volume I and II, Rome Air Development Center, 1987.
[44] George F. Watson, MIL Reliability: A New Approach, IEEE Spectrum, Vol. 29,
Iss. 8, pp. 46-49, 1992.
[45] Jeffrey M. Voas, PIE: A Dynamic Failure-Based Technique, IEEE Transactions
on Software Engineering, Vol. 18, No. 8, pp. 717-727, 1992.
Appendix

Unless noted otherwise, the evaluation methods described below are summarized
from [20].
A. Software complexity evaluation methods

A.1 Operational or Functional Complexity

The complexity of software can be measured as an entropy, based on Shannon's
information theory. Four entropy measures of this kind have been proposed
(functional complexity, interface complexity, micro complexity and operator
complexity) [34, 35]; Table A.1 shows their common form.

Table A.1  Entropy-based complexity measures

    Functional complexity = - sum over i=1..k of p_i * log2(p_i)
    Interface complexity  = - sum over the module couplings of p * log2(p)
    Micro complexity      = - sum over the module cohesion elements of
                              p * log2(p)
    Operator complexity   = - sum over i=1..n of (f_i / N) * log2(f_i / N)

where
    k   = the number of functions in the software
    p_i = the probability that the ith function is executed
    n   = the number of operator types used
    f_i = the frequency of the ith operator
    N   = the total number of operators used
A.2 BVA Model

The BVA model measures complexity from the data types that the software uses:
each data type is assigned the number of boundary values that must be checked
when an algorithm manipulates it, and the counts are summed over the software.
Table A.2 shows the boundary value counts assigned to common data types.

Table A.2  Boundary value counts of the BVA model

    Data type          Language   BVA count
    integer            C, Ada     5
    float              C, Ada     5
    double             C, Ada     5
    long (integer)     C, Ada     5
    Boolean            C, Ada     2
    Natural            Ada        3
    character          C, Ada     2
    unsigned integer   C          ...
A.3 Number of Entries and Exits per Module

In a well-structured architecture, each module has a single entry point and a
single exit point. This measure counts, for each module i, the number of entry
points e_i and exit points x_i:

    m_i = e_i + x_i

Modules with a large m_i (many ways in and out) are considered more complex
and more fault-prone.
A.4 Design Structure

The design structure measure combines the following primitives:

    P1 = total number of modules in the program
    P2 = number of modules dependent on the input or output
    P3 = number of modules dependent on prior processing
    P4 = number of database elements
    P5 = number of non-unique database elements
    P6 = number of database segments
    P7 = number of modules not having a single entry/exit point

into the derivatives

    D1 = design organized top-down (1 if yes, 0 otherwise)
    D2 = module dependence = P2/P1
    D3 = module dependent on prior process = P3/P1
    D4 = database size = P5/P4
    D5 = database compartmentalization = P6/P4
    D6 = module single entry/exit = P7/P1

and the design structure measure (DSM) is

    DSM = sum over i=1..6 of w_i * D_i,    with sum of w_i = 1.0
A.5 Data Flow Complexity

    lfi     = local data flows into a module
    datain  = data structures from which the module retrieves data
    lfo     = local data flows out of a module
    dataout = data structures that the module updates
    length  = size of the module (e.g., in source statements)

Using these, the information flow complexity (IFC) is

    IFC = (fanin * fanout)^2 = ((lfi + datain) * (lfo + dataout))^2

optionally weighted by the module length (weighted IFC = length * IFC).
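A short Python sketch of the IFC calculation follows; the counts for the example
module are illustrative assumptions:

    # Information flow complexity: IFC = ((lfi + datain) * (lfo + dataout))**2,
    # optionally weighted by module length.

    def information_flow_complexity(lfi, datain, lfo, dataout, length=1):
        fanin = lfi + datain
        fanout = lfo + dataout
        return length * (fanin * fanout) ** 2

    # hypothetical module: 2 local flows in, 1 structure read, 1 flow out, 2 writes
    print(information_flow_complexity(lfi=2, datain=1, lfo=1, dataout=2, length=40))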
A.6 Cohesion/Coupling

"Cohesion" expresses how strongly the elements within one module belong
together (ideally, a module performs a single function), while "coupling" expresses
how strongly different modules depend on each other (ideally, modules interact
only through simple, explicit interfaces). Software with high cohesion and low
coupling is considered less complex and more reliable. Table A.3 lists the criteria
used to judge cohesion and coupling.

Table A.3  Criteria for judging cohesion and coupling
Cohesion:
o Difficult to describe the module function
o Module performs more than one function
o Only one function performed per invocation
o Each function has an entry point
o Module performs related class of functions
o Functions are related to problem procedure
o All of the functions use the same data

Coupling:
o Direct reference between modules
o Modules packed together
o Some interface data external or global
o Some interface data control information
o Some interface data in data structure
A.7 Functional Complexity

Boehm [37] rates the functional complexity of software on a scale from "very
low" to "extra high" according to the kinds of operations it performs. Table A.4
shows the rating criteria.
Table A.4  Functional complexity rating criteria (1/2)

Rating: Very low
  Control operations: straight-line code with a few nonnested SP operators (DOs,
    CASEs, IF-THEN-ELSEs); simple predicates.
  Computational operations: evaluation of simple expressions, such as
    A = B + C * (D - E).
  Device-dependent operations: simple read, write statements with simple formats.
  Data management operations: simple arrays in main memory.

Rating: Low
  Control operations: straightforward nesting of SP operators; mostly simple
    predicates.
  Computational operations: evaluation of moderate-level expressions.
  Device-dependent operations: no cognizance needed of particular processor or
    I/O device characteristics; I/O done at GET/PUT level; no cognizance of
    overlap.
  Data management operations: single file subsetting with no data structure
    changes, no edits, no intermediate files.

Rating: Nominal
  Control operations: mostly simple nesting; some inter-module control; decision
    tables.
  Computational operations: use of standard math and statistical routines; basic
    matrix/vector operations.
  Device-dependent operations: I/O processing includes device selection, status
    checking and error processing.
  Data management operations: multi-file input and single file output; simple
    structural changes, simple edits.
Table A.4  Functional complexity rating criteria (2/2)

Rating: High
  Control operations: highly nested SP operators with many compound predicates;
    queue and stack control; considerable intermodule control.
  Computational operations: basic numerical analysis (multivariate interpolation,
    ordinary differential equations); basic truncation, round-off concerns.
  Device-dependent operations: operations at physical I/O level (physical storage
    address translations; seeks, reads, etc.); optimized I/O overlap.
  Data management operations: special purpose subroutines activated by data
    stream contents; complex data restructuring at record level.

Rating: Very high
  Control operations: reentrant and recursive coding; fixed-priority interrupt
    handling.
  Computational operations: difficult but structured numerical analysis
    (near-singular matrix equations, partial differential equations).
  Device-dependent operations: routines for interrupt diagnosis, servicing,
    masking; communication line handling.
  Data management operations: a generalized, parameter-driven file structuring
    routine; file building, command processing, search optimization.

Rating: Extra high
  Control operations: multiple resource scheduling with dynamically changing
    priorities; microcode-level control.
  Computational operations: difficult and unstructured numerical analysis (highly
    accurate analysis of noisy, stochastic data).
  Device-dependent operations: device timing-dependent coding,
    micro-programmed operations.
  Data management operations: highly coupled, dynamic relational structures;
    natural language data management.
A.8 Data Structure Metrics [38]

    L_i   = the number of executable lines in the ith module
    N     = the number of modules
    LV_i  = the number of live variables in the ith line
    VS_j  = the variable span of the jth variable: the number of lines of code
            between the first and last reference of the variable
    SP_jk = the number of lines of code between the kth and (k+1)st references
            to the jth variable

From these primitives, averages such as the average number of live variables per
line are computed, e.g.

    LV = (sum over lines of LV_i) / LOC
A.9 System Design Complexity [39]

    n    = the number of modules in the system
    f(i) = the number of modules called by the ith module (fanout)
    v(i) = the number of I/O variables referenced by the ith module

    S(i): structural complexity of the ith module -> S(i) = [f(i)]^2
    D(i): data complexity of the ith module -> D(i) = v(i) / (f(i) + 1)
    St: program structural complexity -> St = (1/n) * sum over i of [f(i)]^2
    Dt: program data complexity -> Dt = (1/n) * sum over i of v(i) / (f(i) + 1)
    C: program design complexity -> C = St + Dt
    Fr: expected failure rate (per 1000 lines) -> Fr = 0.4 * C - 5.2
A.10 Entropy Measure [40, 41, 42]

An entropy can also be defined over the program control graph or the data
structure graph. The first such measure was proposed by Mowshowitz: the nodes
of the graph are partitioned into classes according to the regularity and symmetry
of the graph, and the "first-order entropy" is computed over the classes (a
"second-order" entropy refines the classes further):

    H = - sum over i=1..h of p(A_i) * log2[p(A_i)]

    A_i    = identified classes in a graph
    h      = the number of identified classes
    p(A_i) = estimated probability of A_i
           = (the number of nodes included in A_i) /
             (the total number of nodes in the graph)
As an example, consider two seven-node control graphs G and G' (nodes a ~ g)
that differ only in their edge structure. In the first-order measure, the nodes are
grouped into classes by their (in-degree, out-degree) pair; Table A.6 shows the
resulting node classes, and Table A.7 the entropy computation.

Table A.6  Node classes of the graphs G and G'

  Graph G:
    class I    {a}            in 0, out 2
    class II   {b, c, e, f}   in 1, out 1
    class III  {d}            in 2, out 2
    class IV   {g}            in 2, out 0

  Graph G':
    class I    {a}            in 0, out 2
    class II   {b, d, e}      in 1, out 1
    class III  {c}            in 1, out 2
    class IV   {f}            in 2, out 1
    class V    {g}            in 2, out 0

Table A.7  First-order entropy of the graphs G and G'

For graph G:

    h = the number of identified classes = 4
    p(A_I) = 1/7,  p(A_II) = 4/7,  p(A_III) = 1/7,  p(A_IV) = 1/7
    H_G = - sum of p(A_i) * log2[p(A_i)] = 1.664

For graph G':

    h = the number of identified classes = 5
    p(A_I) = 1/7,  p(A_II) = 3/7,  p(A_III) = 1/7,  p(A_IV) = 1/7,  p(A_V) = 1/7

    H_G' = - sum of p(A_i) * log2[p(A_i)] = 2.128
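The value for graph G can be reproduced with a short Python sketch; the
(in-degree, out-degree) pairs below follow Table A.6:

    # First-order graph entropy: group nodes by (in-degree, out-degree) and
    # compute H = -sum p(A_i) * log2 p(A_i) over the classes.

    import math
    from collections import Counter

    def first_order_entropy(degree_pairs):
        classes = Counter(degree_pairs)      # class -> number of nodes
        total = sum(classes.values())
        return -sum((n / total) * math.log2(n / total)
                    for n in classes.values())

    # (in, out) per node a..g of graph G: classes {a}, {b,c,e,f}, {d}, {g}
    degrees = [(0, 2), (1, 1), (1, 1), (2, 2), (1, 1), (1, 1), (2, 0)]
    print(round(first_order_entropy(degrees), 3))  # -> 1.664, as in Table A.7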
Thus the first-order entropy of G' is larger than that of G, reflecting its less
regular structure. The second-order entropy refines the classification by also
taking the classes of the neighborhood nodes into account.
Table A.8  Second-order node classes of the graphs G and G' (each node is
classified by the set of its neighborhood nodes)

In the second-order measure, the class of a node is determined not only by its
degrees but also by the classes of its neighboring nodes, so the entropy grows
with the size of the graph and distinguishes structures that the first-order
measure cannot. The same construction applies to the data structure graph.
B. Requirement Quality evaluation methods

B.1 Cause and Effect Graphing

The input and output conditions stated in the requirements are drawn as a cause
and effect graph, and the requirements are evaluated from the graph (Table A.9):

1. Each cause (input condition) is represented as a true/false node.
2. Each effect (output condition) is represented as a node connected to the
   causes on which it depends, together with any constraints among them.
3. Ambiguities, missing links and contradictory constraints found while building
   the graph indicate defects in the requirements.
B.2 Requirement Traceability

Requirement traceability measures the fraction of the requirements that can be
traced into the design and the implementation.
B.3 Number of Conflicting Requirements

This measure counts the requirements that conflict with each other, e.g., pairs of
requirements (or functions) that specify inconsistent behavior for the same
function.
B.4 Requirements Compliance

Requirements compliance is evaluated with a system verification diagram (SVD),
which maps each requirement to the elements that implement it and exposes
inconsistency, incompleteness and misinterpretation.
B.5 Completeness

The completeness measure (CM) quantifies how complete the requirements are,
using the primitive counts B_1 ~ B_18 of Table A.10 and the derived ratios
D_1 ~ D_10 of Table A.11.
Table A.10  Primitives of the completeness measure (CM)

    B1  = number of functions not satisfactorily defined
    B2  = number of functions
    B3  = number of data references not having an origin
    B4  = number of data references
    B5  = number of defined functions not used
    B6  = number of defined functions
    B7  = number of referenced functions not defined
    B8  = number of referenced functions
    B9  = number of decision points not using all conditions or options or both
    B10 = number of decision points
    B11 = number of condition options without processing
    B12 = number of condition options
    B13 = number of calling routines with parameters not agreeing with defined
          parameters
    B14 = number of calling routines
    B15 = number of condition options not set
    B16 = number of set condition options having no processing
    B17 = number of set condition options
    B18 = number of data references having no destination
Table A.11  Derivatives of the completeness measure (CM)

    D1  = (B2 - B1) / B2     functions satisfactorily defined
    D2  = (B4 - B3) / B4     data references having an origin
    D3  = (B6 - B5) / B6     defined functions used
    D4  = (B8 - B7) / B8     referenced functions defined
    D5  = (B10 - B9) / B10   all condition options at decision points are used
    D6  = (B12 - B11) / B12  all condition options with processing at decision
                             points are used
    D7  = (B14 - B13) / B14  calling routine parameters agree with the called
                             routine's defined parameters
    D8  = (B12 - B15) / B12  all condition options that are set
    D9  = (B17 - B16) / B17  processing follows set condition options
    D10 = (B4 - B18) / B4    data references have a destination

    CM = sum over i=1..10 of w_i * D_i,    with sum of w_i = 1.0

where the weighting factors w_i reflect the relative importance assigned to each
derivative.
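A minimal Python sketch of the CM calculation follows; the counts and weights
are illustrative assumptions:

    # Completeness measure: CM = sum(w_i * D_i), weights summing to 1.0.

    def completeness(derivatives, weights):
        assert abs(sum(weights) - 1.0) < 1e-9
        return sum(w * d for w, d in zip(weights, derivatives))

    # e.g. D1 = (B2 - B1) / B2: fraction of functions satisfactorily defined
    B1, B2 = 3, 40          # hypothetical: 3 of 40 functions not well defined
    D1 = (B2 - B1) / B2
    derivatives = [D1] + [0.9] * 9   # remaining D2..D10 assumed for illustration
    weights = [0.1] * 10             # equal weights assumed
    print(round(completeness(derivatives, weights), 3))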
B.6 Requirements Specification Change Requests

This measure tracks the number of change requests made against the software
requirements specification; frequent changes indicate unstable requirements.
C. Software development process quality evaluation methods

C.1 Software Process Capability Determination

ISO/IEC Draft International Standard (DIS) 15504 defines a framework for
determining the capability of software development processes. Each process is
assessed against a set of capability attributes, and each attribute is rated on the
four-point scale N (not achieved), P (partially achieved), L (largely achieved) and
F (fully achieved).

Assessed processes: acquire software; manage customer needs; supply software;
operate software; provide customer service; develop system requirements and
design; develop software requirements; develop software design; implement
software design; integrate and test software; integrate and test system; maintain
system and software; develop documentation; perform configuration management;
perform quality assurance; perform work product verification; perform work
product validation; perform joint reviews; perform audits; perform problem
resolution; manage the project; manage quality; manage risks; manage
subcontractors; engineer the business; define the process; improve the process;
provide skilled human resources; provide software engineering infrastructure.

Capability attributes: process performance; performance management; work product
management; process definition; process resource; process measurement; process
control; process change; continuous improvement.
C.2 Cost and Schedule

The predictability of the development is measured by a cost index (CI) and a
schedule index (SI) computed over the life cycle phases, from the primitives of
Table A.13.

Table A.13  Primitives of the cost (CI) and schedule (SI) indices

    C_i = estimated costs for life cycle phase i
    T_i = actual cost of life cycle phase i
    S_i = estimated elapsed time to be spent on life cycle phase i
    V_i = actual elapsed time for life cycle phase i
    w_i = weighting factor for life cycle phase i costs,  sum of w_i = 1.0
    z_i = weighting factor for life cycle phase i elapsed time
C.3 Reliability Prediction as a Function of Development Environment [43, 44]

The RADC (Rome Air Development Center) prediction starts from a base failure
rate A determined by the application type and multiplies it by a development
environment factor D:

    R_P = A * D

Table A.14  Base failure rate A by application type

    Application type     A
    Airborne             0.0128
    Strategic            0.0092
    Tactical             0.0078
    Process Control      0.0018
    Production Center    0.0085
    Developmental        0.0123
    Total/Average        0.0094

Development environment factor D:

o Organic (D = 0.76): software that is being developed by a group that is
  responsible for the overall application (for example, reactor protection system
  software that is being developed by a nuclear reactor vendor).
o Semi-detached (D = 1.0): the software developer has specialized knowledge of
  the application area, but is not part of the sponsoring organization (e.g.,
  network control software that is being developed by a communications company
  that does not operate the target network).
o Embedded (D = 1.3): software that frequently has very tight performance
  constraints, being developed by a specialist software organization that is
  directly connected with the application (e.g., surveillance radar software being
  developed by a group within the radar manufacturer, but not organizationally
  tied to the user of the surveillance information).

The related measures "Reliability Prediction for the Operational Environment" and
"Reliability Prediction as a Function of Software Characteristics" are similar
extensions of this prediction.
D. Test coverage evaluation methods

D.1 Functional Test Coverage

    FE = the number of functions tested by the test plan
    FT = the total number of functions the software is required to provide

    Functional Test Coverage Index = FE / FT
D.2 Minimal Unit Case Determination

The minimal number of unit test cases for a module is determined from its
cyclomatic complexity: the cyclomatic complexity equals the number of minimal
independent paths through the module, and one test case is required for each.
D.3 Test Coverage

Test coverage (TC) measures the completeness of testing from both the
requirements side and the code side:

    RC (Requirement Capabilities)   = the number of requirement capabilities
                                      exercised by the tests
    IC (Implemented Capabilities)   = the total number of implemented capabilities
    TPP (Total Program Primitives)  = the total number of program primitives
                                      (e.g., modules or paths)
    PPT (Program Primitives Tested) = the number of program primitives
                                      exercised by the tests

    TC = (IC / RC) * (PPT / TPP)
D.4 Test Sufficiency

Test sufficiency compares the number of faults predicted for the integrated
software against the number actually found during integration testing:

    NF     = the predicted number of faults in the software
    F_pit  = the number of faults detected prior to integration testing
    NF_rem = the estimated number of faults remaining at integration testing

    NF_rem = (M_it / M_T) * (NF - F_pit)

where M_it is the number of modules integrated and M_T the total number of
modules.
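A minimal Python sketch of this estimate follows; note that the M_it/M_T scaling
is an assumption about the formula above, and the counts are illustrative:

    # Test sufficiency: scale the predicted-minus-found faults by the fraction
    # of modules integrated (assumed interpretation of the formula above).

    def faults_remaining(nf_predicted, f_prior, m_integrated, m_total):
        return (m_integrated / m_total) * (nf_predicted - f_prior)

    print(round(faults_remaining(nf_predicted=50, f_prior=35,
                                 m_integrated=8, m_total=10), 1))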
D.5 Testability Analysis [45]

The PIE (Propagation, Infection and Execution) technique estimates, for each
location of the software, three probabilities under a given input distribution:

    Execution (E):   the probability that the location is executed
    Infection (I):   the probability that, when executed, a fault at the location
                     corrupts the internal data state
    Propagation (P): the probability that a corrupted internal data state
                     propagates to the output

The minimum of the products E * I * P over the locations bounds the
detectability of faults; with c the desired confidence level, the number of tests
needed follows from

    n >= ln(1 - c) / ln(1 - min over locations of (E * I * P))
E. Test Effort evaluation methods

E.1 Fault Number Days

The fault days number (FD) measures how efficiently the testing effort is applied
by weighting each fault by the number of days it remains in the system from its
introduction until its removal:

    FD = sum over all faults i of (days from introduction of fault i to its removal)

A large FD indicates that faults are found late.

E.2 Man-hours per Major Defect Detected

    T_i = the man-hours spent by the team on the ith inspection
    S_i = the number of non-trivial faults found in the ith inspection

    Man-hours per major fault = M = (sum of T_i) / (sum of S_i)
F. Test Team Availability/Capability evaluation methods

F.1 Number of Faults Remaining

The number of faults remaining is estimated with a fault seeding (sampling)
technique: N_S known faults are deliberately seeded into the software before
testing; the team then finds n_S of the seeded faults and n_F indigenous (real)
faults. The total number of real faults and the number remaining are estimated as

    Total number of "real" faults (estimated) = N_F = (n_F / n_S) * N_S
    N_F_rem = N_F - n_F

Related measures such as mutation testing and test accuracy evaluate the number
of faults remaining in a similar way.
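The estimator can be sketched in a few lines of Python; the counts are
illustrative assumptions:

    # Fault seeding estimate: NF = (nf / ns) * Ns, remaining = NF - nf.

    def estimate_faults(Ns, ns, nf):
        NF = (nf / ns) * Ns          # estimated total indigenous faults
        return NF, NF - nf           # total and remaining

    NF, remaining = estimate_faults(Ns=20, ns=16, nf=40)
    print(NF, remaining)             # -> 50.0 total, 10.0 remaining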
BIBLIOGRAPHIC INFORMATION SHEET

Performing Org. Report No.: KAERI/AR-565/00
Sponsoring Org. Report No.: -
Standard Report No.: -
INIS Subject Code: -

Title / Subtitle: A technical survey on issues of the quantitative evaluation of
software reliability

Project Manager and Department: J. K. Park (Integrated Safety Assessment team)

Researchers and Departments: T. Y. Sung (ISA team), H. S. Eom (ISA team),
H. S. Jeong (Hanaro), J. H. Park (ISA team), H. G. Kang (ISA team),
K. Y. Lee (ARTD team), J. K. Park (ARTD team)

Publication Place: Taejon    Publisher: KAERI    Publication Date: 2000. 4.
Page: 67 p.    Ill. & Tab.: Yes ( O ), No ( )    Size: 21 x 29.7 cm
Note: -
Classified: Open ( O ), Restricted ( ) Class Document
Report Type: Analysis Report
Sponsoring Org.: -    Contract No.: -

Abstract (15-20 Lines):

To develop a methodology for evaluating the reliability of the software included
in digital instrumentation and control (I&C) systems, the many kinds of
methodologies/techniques that have been proposed in the software reliability
engineering field are analyzed to identify their strong and weak points.

According to the analysis results, methodologies/techniques that can be directly
applied to the evaluation of the software reliability do not exist. Thus, additional
research to combine the most appropriate methodologies/techniques from the
existing ones is needed to evaluate the software reliability.

Subject Keywords (About 10 words): digital I&C system, probabilistic safety
assessment, software, reliability, quantitative evaluation