
Notes content copyright © 2004 Ian Sommerville. NU-specific content copyright © 2004 M. E. Kabay. All rights reserved.

Critical Systems Specification
IS301 – Software Engineering
Lecture # 9 – 2004-09-20

M. E. Kabay, PhD, CISSP
Assoc. Prof. Information Assurance
Division of Business & Management, Norwich University
mailto:[email protected] V: 802.479.7937


Topics covered

Risk-driven specification
Safety specification
Security specification
Software reliability specification

Today we will use 18 of Prof. Sommerville’s slides in class


Dependability requirements

Functional requirements to define error checking and recovery facilities and protection against system failures.

Non-functional requirements defining the required reliability and availability of the system.

Excluding requirements that define states and conditions that must not arise.


Risk-driven specification

[Figure: the risk-driven specification process — risk identification, risk analysis and classification, risk decomposition, and risk reduction assessment, producing in turn a risk description, a risk assessment, a root cause analysis, and the dependability requirements.]


Levels of risk

[Figure: levels of risk — an unacceptable region, where the risk cannot be tolerated; an ALARP (as low as reasonably practicable) region, where the risk is tolerated only if risk reduction is impractical or grossly expensive; and an acceptable region of negligible risk.]


Risk assessment - insulin pump

Identified hazard | Hazard probability | Hazard severity | Estimated risk | Acceptability
1. Insulin overdose | Medium | High | High | Intolerable
2. Insulin underdose | Medium | Low | Low | Acceptable
3. Power failure | High | Low | Low | Acceptable
4. Machine incorrectly fitted | High | High | High | Intolerable
5. Machine breaks in patient | Low | High | Medium | ALARP
6. Machine causes infection | Medium | Medium | Medium | ALARP
7. Electrical interference | Low | High | Medium | ALARP
8. Allergic reaction | Low | Low | Low | Acceptable
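As a worked illustration of how such a table might be produced, here is a minimal sketch, not taken from the lecture: a lookup from (probability, severity) to an estimated risk, and from risk to an acceptability class. The particular matrix values and function names are assumptions chosen to reproduce the rows above, not a published standard.

```python
# Illustrative risk matrix (assumed values chosen to match the insulin-pump table).
RISK_MATRIX = {
    ("Low", "Low"): "Low",       ("Low", "Medium"): "Low",       ("Low", "High"): "Medium",
    ("Medium", "Low"): "Low",    ("Medium", "Medium"): "Medium", ("Medium", "High"): "High",
    ("High", "Low"): "Low",      ("High", "Medium"): "Medium",   ("High", "High"): "High",
}

ACCEPTABILITY = {
    "Low": "Acceptable",
    "Medium": "ALARP",        # tolerated only if risk reduction is impractical or grossly expensive
    "High": "Intolerable",
}

def assess(probability: str, severity: str) -> tuple[str, str]:
    """Return (estimated risk, acceptability) for one identified hazard."""
    risk = RISK_MATRIX[(probability, severity)]
    return risk, ACCEPTABILITY[risk]

print(assess("Medium", "High"))   # ('High', 'Intolerable') -- insulin overdose
print(assess("Low", "High"))      # ('Medium', 'ALARP')     -- machine breaks in patient
```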


Fault-tree analysis

A deductive top-down technique.

Put the risk or hazard at the root of the tree and identify the system states that could lead to that hazard.

Where appropriate, link these with ‘and’ or ‘or’ conditions.

A goal should be to minimize the number of single causes of system failure.
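A fault tree maps naturally onto a small tree data structure. Below is a minimal sketch, not from the lecture: every node is either a basic event or an ‘and’/‘or’ gate over its children, and single_points_of_failure() lists the basic events that on their own trigger the top-level hazard — the single causes the slide says should be minimized. The Node class, the helper names, and the small insulin-pump fragment in the demo are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    gate: str | None = None                     # "and", "or", or None for a basic event
    children: list["Node"] = field(default_factory=list)

    def occurs(self, true_events: set[str]) -> bool:
        """Does this event occur, given the basic events that are currently true?"""
        if not self.children:                   # basic event (leaf)
            return self.name in true_events
        results = [c.occurs(true_events) for c in self.children]
        return all(results) if self.gate == "and" else any(results)

def basic_events(node: Node) -> list[str]:
    if not node.children:
        return [node.name]
    return [e for c in node.children for e in basic_events(c)]

def single_points_of_failure(root: Node) -> list[str]:
    """Basic events that alone cause the top-level hazard."""
    return [e for e in basic_events(root) if root.occurs({e})]

# A fragment of the insulin-pump tree from the next slide (structure simplified).
root = Node("Incorrect insulin dose administered", "or", [
    Node("Incorrect sugar level measured", "or", [
        Node("Sensor failure"),
        Node("Sugar computation error", "or", [Node("Arithmetic error"), Node("Algorithm error")]),
    ]),
    Node("Correct dose delivered at wrong time", "or", [Node("Timer failure")]),
    Node("Delivery system failure"),
])

print(single_points_of_failure(root))
```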


Insulin pump fault tree

[Figure: insulin pump fault tree. The top event, incorrect insulin dose administered, is an ‘or’ of incorrect sugar level measured, correct dose delivered at wrong time, and delivery system failure. Incorrect sugar level measured is an ‘or’ of sensor failure and sugar computation error; the wrong-time branch involves timer failure and incorrect pump signals, which trace back to an incorrect insulin computation; both computation errors are in turn an ‘or’ of arithmetic error and algorithm error.]


Control system safety requirements

[Figure: control system safety requirements — a control system and a separate protection system both act on the equipment; the system requirements include safety requirements, which divide into functional safety requirements and safety integrity requirements.]


The safety life-cycle

[Figure: the safety life-cycle — concept and scope definition, hazard and risk analysis, safety requirements derivation, and safety requirements allocation; then planning (for safety validation, installation and commissioning, and operation and maintenance) alongside development of the safety-related systems and external risk reduction facilities; followed by installation and commissioning, safety validation, operation and maintenance, and finally system decommissioning.]


The security specification process

[Figure: the security specification process — asset identification (producing a system asset list), threat analysis and risk assessment (a threat and risk matrix), threat assignment (an asset and threat description), security technology analysis (a technology analysis), and security requirements specification (the security requirements).]


Types of security requirement

Identification requirements.
Authentication requirements.
Authorization requirements.
Immunity requirements.
Integrity requirements.
Intrusion detection requirements.
Non-repudiation requirements.
Privacy requirements.
Security auditing requirements.
System maintenance security requirements.


Reliability metrics

Reliability metrics are units of measurement of system reliability.

System reliability is measured by counting the number of operational failures and, where appropriate, relating these to the demands made on the system and the time that the system has been operational.

A long-term measurement program is required to assess the reliability of critical systems.


Reliability metrics

Metric | Explanation
POFOD (probability of failure on demand) | The likelihood that the system will fail when a service request is made. A POFOD of 0.001 means that 1 out of a thousand service requests may result in failure.
ROCOF (rate of failure occurrence) | The frequency of occurrence with which unexpected behaviour is likely to occur. A ROCOF of 2/100 means that 2 failures are likely to occur in each 100 operational time units. This metric is sometimes called the failure intensity.
MTTF (mean time to failure) | The average time between observed system failures. An MTTF of 500 means that 1 failure can be expected every 500 time units.
AVAIL (availability) | The probability that the system is available for use at a given time. Availability of 0.998 means that in every 1000 time units, the system is likely to be available for 998 of these.
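As a supplement to the table, here is a minimal sketch, not part of the original slides, of how each metric could be computed from simple operational measurements. The function names, input format, and example figures are illustrative assumptions chosen to match the values in the table.

```python
def pofod(failed_requests: int, total_requests: int) -> float:
    """Probability of failure on demand: failed service requests / all requests."""
    return failed_requests / total_requests

def rocof(failures: int, operational_time_units: float) -> float:
    """Rate of failure occurrence (failure intensity): failures per operational time unit."""
    return failures / operational_time_units

def mttf(inter_failure_times: list[float]) -> float:
    """Mean time to failure: average of the observed times between failures."""
    return sum(inter_failure_times) / len(inter_failure_times)

def availability(uptime: float, downtime: float) -> float:
    """Fraction of time the system is available, including repair/restart time."""
    return uptime / (uptime + downtime)

# Example: 2 failed demands out of 2000; 2 failures in 1000 hours of operation;
# inter-failure times of 480 and 520 hours; up for 998 of 1000 hours.
print(pofod(2, 2000))            # 0.001
print(rocof(2, 1000))            # 0.002
print(mttf([480.0, 520.0]))      # 500.0
print(availability(998.0, 2.0))  # 0.998
```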


Probability of failure on demand

This is the probability that the system will fail when a service request is made. Useful when demands for service are intermittent and relatively infrequent.

Appropriate for protection systems where services are demanded occasionally and where there are serious consequences if the service is not delivered.

Relevant for many safety-critical systems with exception management components, e.g. the emergency shutdown system in a chemical plant.


Rate of fault occurrence (ROCOF)

Reflects the rate of occurrence of failure in the system.

ROCOF of 0.002 means 2 failures are likely in each 1000 operational time units, e.g. 2 failures per 1000 hours of operation.

Relevant for operating systems and transaction processing systems where the system has to process a large number of similar requests that are relatively frequent, e.g. a credit card processing system or an airline booking system.


Mean time to failure

Measure of the time between observed failures of the system. It is the reciprocal of ROCOF for stable systems.

MTTF of 500 means that the mean time between failures is 500 time units.

Relevant for systems with long transactions, i.e. where system processing takes a long time; the MTTF should be longer than the transaction length. Examples: computer-aided design systems, where a designer works on a design for several hours, and word processor systems.


Availability

Measure of the fraction of the time that the system is available for use.

Takes repair and restart time into account.

Availability of 0.998 means the software is available for 998 out of 1000 time units.

Relevant for non-stop, continuously running systems, e.g. telephone switching systems and railway signaling systems.
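As a supplement, not stated on the slide: when repair and restart time is expressed as a mean time to repair (MTTR), availability is commonly taken as MTTF / (MTTF + MTTR). A minimal sketch under that assumption:

```python
# Steady-state availability from mean time to failure and mean time to repair;
# a standard relationship, offered here as a supplement to the slide.
def steady_state_availability(mttf: float, mttr: float) -> float:
    return mttf / (mttf + mttr)

print(steady_state_availability(499.0, 1.0))  # 0.998 -> available 998 of every 1000 time units
```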


Homework

Required – by Mon 27 Sep 2004:
Write an essay of 100-200 words discussing question 9.1 for 15 points.
Answer questions 9.2, 9.3 & 9.6 for 10 points each.
For 20 points, answer question 9.9 in detail. Note the phrase “Giving reasons…”

Optional – by Monday 4 Oct 2004:
Q. 9.8 for 12 points, with reasons.
Q. 9.10 and/or 9.12 @ 2 points each.


REMINDER

QUIZ #1 IN CLASS

WED 22 SEP 2004

Chapters 1 - 5


DISCUSSION