PSM System Architecture Measurement

1

System Architecture Measurement

2

Continuation of NDIA Measurements Task

• Goal of last year’s task was to:

• Identify a set of leading indicators that provide insight into technical performance at major decision points for managing programs quantitatively across their life cycle, with emphasis on the Technology Development (TD) and Engineering and Manufacturing Development (EMD) phases.

• Build upon objective measures in common practice in industry, government, and accepted standards. Do not define new measures unless currently available measures are inadequate to address the information needs.

• Select objective measures based on essential attributes (e.g., relevance, completeness, timeliness, simplicity, cost effectiveness, repeatability, and accuracy).

• Measures should be commonly and readily available, with minimal additional effort needed for data collection and analysis.

3

Architecture Measurement

• Architecture was a high-priority area but …

• No measures met the criteria

• Architectures must be complete, consistent, and correct

• Complete – All the elements required to describe the solution are present and all the requirements/needs are addressed

• Consistent – The artifacts that describe the architecture are internally consistent and consistent with external constraints (e.g., external interfaces)

• Correct – The architecture satisfies all requirements within the program constraints (cost, schedule, etc.) and is the architecture that does so to the greatest extent

• This results in a need for multiple types of measurement

4

Architecture Measurement

• Two types of measurement needed

• Completeness/Maturity and Consistency

- Are all the required elements present at the current program phase?

- Are all requirements accounted for? (see the trace-coverage sketch below)

- Does it tie together? Within an architecture level? Between levels? Between artifact types?

• Correct = Solution Quality

- Does it meet the stakeholder needs?

- Does it avoid known architecture deficiencies?

- Does it do so better than alternatives?

• Traditionally this was determined at the milestone reviews and was a lagging indicator

• Model-based architecting (or architecture modeling) makes the evaluation of completeness and consistency feasible as a leading indicator
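
A minimal sketch of the trace-coverage question above, assuming hypothetical requirement IDs and trace links exported from a modeling tool (not any particular tool’s API):

```python
# Minimal sketch: check that every system-level requirement is traced to at
# least one lower-level architecture element. IDs and links are hypothetical.

system_requirements = {"SYS-001", "SYS-002", "SYS-003", "SYS-004"}

# Trace links exported from the architecture model:
# lower-level element -> system requirements it satisfies
trace_links = {
    "SUBSYS-A": {"SYS-001", "SYS-002"},
    "SUBSYS-B": {"SYS-002"},
}

traced = set().union(*trace_links.values()) if trace_links else set()
untraced = sorted(system_requirements - traced)

coverage = len(system_requirements & traced) / len(system_requirements)
print(f"Requirements trace coverage: {coverage:.0%}")  # 50%
print(f"Untraced requirements: {untraced}")            # ['SYS-003', 'SYS-004']
```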

5

Quantitative Measurement

• Two types of measures required

- Quantitative

- Qualitative

• Goal is to measure whether an architecture is complete and consistent

• Easier with model-based architecting

• Anticipated artifacts / completed artifacts

• Internal reports showing missing data and inconsistencies between artifacts

• Supported by many of the architecture tools, but requires effort on the part of the program to create and customize

• Models help visualize heuristics as well

• Examples (a sketch of two of these follows the list)

- Progress chart

- Requirements trace reports (SELI)

- TBx closure rate and TBx counts (SELI)

- Empty data field counts

- Visual reviews of artifacts

- Other reports from the modeling tool database that address consistency
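
As one illustration, a minimal sketch of two of the examples above (empty data field counts and TBx closure rate), assuming a hypothetical model export format:

```python
# Minimal sketch of two quantitative indicators computed from a model export.
# The export structure, field names, and item IDs are hypothetical.

model_elements = [
    {"id": "FN-01", "description": "Acquire target", "allocated_to": "SUBSYS-A"},
    {"id": "FN-02", "description": "",               "allocated_to": "SUBSYS-B"},
    {"id": "FN-03", "description": "Report status",  "allocated_to": ""},
]

# Empty data field count: fields still blank across all exported elements
empty_fields = sum(1 for e in model_elements for v in e.values() if v == "")
print(f"Empty data fields: {empty_fields}")  # 2

# TBx closure rate: fraction of TBD/TBR items that have been resolved
tbx_items = {"TBD-01": "closed", "TBD-02": "open", "TBR-01": "closed"}
closed = sum(1 for status in tbx_items.values() if status == "closed")
print(f"TBx closure rate: {closed / len(tbx_items):.0%}")  # 67%
```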

6

Qualitative Measurement

• Goal is to ensure the architecture is correct and coherent

- Does it meet stakeholder needs within the program constraints?

- Is it better than the alternative architectures in satisfying stakeholder needs?

• Still somewhat subjective but has aspects that can be measured

• Can only be determined in comparison to the alternatives

- TPMs and MOE/KPP satisfaction compared

• Examples (a comparison sketch follows the list)

- TPM/MOE radar charts

- Est. at Completion vs TPM/MOE

- Architecture design trade study records
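
As one way to put numbers on the comparison, a minimal sketch that checks each candidate’s TPM estimates at completion against MOE/KPP thresholds; all TPM names and values here are hypothetical:

```python
# Minimal sketch: compare each candidate architecture's estimate at completion
# (EAC) for its TPMs against MOE/KPP thresholds. Names and numbers are hypothetical.

thresholds = {"range_km": 500, "availability": 0.95}  # minimum acceptable values

candidates = {
    "Architecture A": {"range_km": 520, "availability": 0.93},
    "Architecture B": {"range_km": 505, "availability": 0.97},
}

for name, eac in candidates.items():
    shortfalls = [tpm for tpm, floor in thresholds.items() if eac[tpm] < floor]
    status = "meets all thresholds" if not shortfalls else f"short on {shortfalls}"
    print(f"{name}: {status}")
# Architecture A: short on ['availability']
# Architecture B: meets all thresholds
```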

7

Internal Measurements/Heuristics

• Additional ways to measure architecture quality

• Heuristics – “Does it look right?”

- Review of the model artifacts can sometimes indicate whether an architecture exhibits good or bad characteristics such as low cohesion or high levels of coupling

• Internal metrics (sketched below)

- Number of internal interfaces

- Number of requirements per architecture element can indicate an imbalance
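
A minimal sketch of the internal metrics above; the element names and allocation counts are hypothetical, and the imbalance threshold (twice the average) is only an illustrative choice:

```python
# Minimal sketch of the two internal metrics above. Allocation data is hypothetical.
from statistics import mean

# Requirements allocated to each architecture element
reqs_per_element = {"SUBSYS-A": 12, "SUBSYS-B": 95, "SUBSYS-C": 10}

# Internal interfaces between elements (unordered pairs)
internal_interfaces = {("SUBSYS-A", "SUBSYS-B"), ("SUBSYS-B", "SUBSYS-C")}
print(f"Internal interface count: {len(internal_interfaces)}")  # 2

# Flag elements carrying more than twice the average requirement load
avg = mean(reqs_per_element.values())
overloaded = {e: n for e, n in reqs_per_element.items() if n > 2 * avg}
print(f"Average requirements per element: {avg:.0f}")  # 39
print(f"Possibly imbalanced elements: {overloaded}")   # {'SUBSYS-B': 95}
```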

8

Example Progress Table/Chart

Diagram Type                 | Estimated # of diagrams | Started Definition TEM Complete | Drawn | Inspected | ERBed | % Complete
System Behavior Diagrams     | 26                      | 26                              | 26    | 26        | 26    | 100%
Subsystem Behavior Diagrams  | 175                     | 175                             | 170   | 160       | 150   | 86%
Component Behavior Diagrams  | 300                     | 25                              | 25    | 20        | 15    | 5%
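
A minimal sketch of how the % Complete column can be derived, assuming, as the table’s numbers suggest, that it is the ERBed count divided by the estimated number of diagrams:

```python
# Minimal sketch: % complete as diagrams through the final (ERBed) step divided
# by the estimated total, which reproduces the percentages in the example table.

progress = {
    "System Behavior Diagrams":    {"estimated": 26,  "erbed": 26},
    "Subsystem Behavior Diagrams": {"estimated": 175, "erbed": 150},
    "Component Behavior Diagrams": {"estimated": 300, "erbed": 15},
}

for diagram_type, counts in progress.items():
    pct = counts["erbed"] / counts["estimated"]
    print(f"{diagram_type}: {pct:.0%}")
# System Behavior Diagrams: 100%
# Subsystem Behavior Diagrams: 86%
# Component Behavior Diagrams: 5%
```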

9

Example Architecture “Radar” Chart / Table

Attribute     | Weight | Value | Weighted Value
Flexibility   | 25%    | 75%   | 19%
Adaptability  | 10%    | 80%   | 8%
Modular       | 10%    | 25%   | 3%
Simplicity    | 10%    | 75%   | 8%
Completeness  | 10%    | 25%   | 3%
Usability     | 10%    | 75%   | 8%
Performance   | 25%    | 100%  | 25%
Total         | 100%   |       | 72%

“Utility Function” for the architecture assessment is a simple weighted sum of the assessed attribute values…repeat for each candidate architecture!

[Radar chart: one spoke per assessed attribute (Attribute 1 … Attribute N)]
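
A minimal sketch of the weighted-sum utility function, using the weights and assessed values from the example table; repeating it per candidate gives the totals to compare:

```python
# Minimal sketch: weighted-sum "utility function" over the assessed attribute
# values, using the weights and values from the example table above.

weights = {"Flexibility": 0.25, "Adaptability": 0.10, "Modular": 0.10,
           "Simplicity": 0.10, "Completeness": 0.10, "Usability": 0.10,
           "Performance": 0.25}

assessed = {"Flexibility": 0.75, "Adaptability": 0.80, "Modular": 0.25,
            "Simplicity": 0.75, "Completeness": 0.25, "Usability": 0.75,
            "Performance": 1.00}

utility = sum(weights[a] * assessed[a] for a in weights)
print(f"Candidate architecture utility: {utility:.0%}")  # 72%
# Repeat for each candidate architecture and compare the totals.
```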

10

Structural Heuristics

“The eye is a fine architect. Believe it.”
– Wernher von Braun, 1950

“A good solution somehow looks nice.”
– Robert Spinrad, 1991

11

Heuristics

[Diagram: two candidate partitionings of the same system, one with high external complexity and one with low external complexity]

Which Partitioning is Better? Why?
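
One way to make the heuristic measurable, shown as a minimal sketch with hypothetical components and partitionings: count the interfaces that cross partition boundaries, so the partitioning with fewer cross-boundary interfaces has the lower external complexity.

```python
# Minimal sketch: quantify "external complexity" as the number of interfaces
# that cross partition boundaries. Components, interfaces, and partitionings
# are hypothetical.

interfaces = [("C1", "C2"), ("C1", "C3"), ("C2", "C4"), ("C3", "C4"), ("C4", "C5")]

partitionings = {
    "Option A": {"C1": "P1", "C2": "P2", "C3": "P1", "C4": "P2", "C5": "P1"},
    "Option B": {"C1": "P1", "C2": "P1", "C3": "P1", "C4": "P2", "C5": "P2"},
}

for name, partition_of in partitionings.items():
    external = sum(1 for a, b in interfaces if partition_of[a] != partition_of[b])
    print(f"{name}: {external} external interfaces")
# Option A: 3 external interfaces
# Option B: 2 external interfaces
```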