Learnings from Process Assessments


Transcript of Learnings from Process Assessments

Page 1: Learnings from Process Assessments

© 2020 The MathWorks, Inc.

Learnings from Process Assessments

Govind Malleichervu – Industry Marketing

John Lee – Consulting Services

June 23, 2020

Model-Based Design Maturity Framework™

Page 2: Learnings from Process Assessments


First Digital Transformation: Adding Embedded Software to Everything

[Chart: design complexity rising steadily from 1980 through 2030]

Page 3: Learnings from Process Assessments


Second Digital Transformation in Automotive

▪ Digital transformation is raising new questions about tools and processes

– Vehicle Electrification

– Automated Driving

– Connected Vehicles

Page 4: Learnings from Process Assessments


Questions to Answer

How capable is your verification and validation process of meeting ISO 26262?

Why is integration level testing not being leveraged?

Are models leveraged for validation and analysis beyond code generation? How are they being maintained?

Are processes and tools being evolved to meet the application needs?

Page 5: Learnings from Process Assessments


Objective for Today

▪ Over the years, MathWorks has conducted numerous Process Assessments across various industries around the world.

▪ Today we share our findings, including general trends and their correlation with opportunities for tool and process improvements, with a focus on the automotive industry.

Page 6: Learnings from Process Assessments


Assessment Is Based on the Model-Based Design Maturity Framework™

▪ Key Features

– Comprehensive measurement of capabilities

– Independently measures each capability

– Applies to any level of expertise

Core Technical Competency ("Capability")

Supporting Competency ("Scalability")

Page 7: Learnings from Process Assessments

Model-Based Design Maturity Framework™ – Modeling Pillar Example

▪ Maturity determined by rating:

– 6 Pillars

– 28 Key Process Groups

– 200+ Attributes

▪ Example for the Modeling pillar:

– Process Groups: Requirements, Modeling Language Selection, Algorithm Modeling, Environment/Plant Modeling, Architecture, Modeling Standards

– Attributes (example set): Size, Hierarchy, Interfaces, Partitioning, Data Management, Scheduling, Testability, Data Modeling
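To make the rating structure concrete, here is a minimal sketch of how attribute ratings might roll up into process-group and pillar scores. The 0–3 scale and every value below are illustrative assumptions, not the framework's actual scoring rules.

% Minimal sketch of a maturity roll-up, assuming a hypothetical 0-3
% rating scale; all values are illustrative, not framework data.
attrRatings = [2 1 2 1 2 1 0 1];              % e.g., Size, Hierarchy, ...
processGroupScore = mean(attrRatings);        % score for one process group
otherGroupScores  = [1.4 0.9 1.1 1.7 1.2];    % hypothetical sibling groups
pillarScore = mean([processGroupScore otherGroupScores]);  % Modeling pillar
fprintf('Modeling pillar maturity: %.2f\n', pillarScore);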

Page 8: Learnings from Process Assessments


Process Assessment Execution

Review workflow and tool usage

Determine gaps against MBD Maturity Framework

List pros/cons of workflow and usage

Develop recommendations

Create Implementation Plan

Page 9: Learnings from Process Assessments


How Did We Review the Data? A Simple Example

▪ Observation

– Strong in Implementation (code generation)

– Verification and Validation, and Simulation and Analysis, not fully leveraged

– Model-Based Design focuses on software creation

Page 10: Learnings from Process Assessments


Process Assessment – Data Analysis

▪ Quantitative analysis

– Plots of Process Assessment data

▪ Qualitative analysis

– Analysis of the detailed report

Page 11: Learnings from Process Assessments


How capable is your verification and validation process of meeting ISO 26262?

Page 12: Learnings from Process Assessments

[Chart: average maturity ratings (normalized, 0–1) for Unit Level Testing, Integration Level Testing, HIL Testing, Vehicle Testing, and Regression Testing]

Verification and Validation Pillar – Mapping with ISO 26262-6

*Average maturity rating of the V&V pillar, normalized against Vehicle Testing

Mapping of the V&V pillar to ISO 26262:

– Unit Level Testing – ISO 26262-6 Clause 9, Software Unit Verification

– Integration Level Testing – ISO 26262-6 Clause 10, Software Integration and Verification

– Vehicle and HIL Testing – ISO 26262-6 Clause 11, Testing of Embedded Software

– Regression Testing – ISO 26262-8 Clause 9, Verification
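The footnote's normalization against Vehicle Testing can be sketched in a few lines; the raw averages below are invented placeholders that only mirror the chart's general shape, not assessment data.

% Minimal sketch of normalizing maturity ratings against Vehicle
% Testing; the raw averages are hypothetical, not assessment data.
groups  = {'Unit Level','Integration Level','HIL','Vehicle','Regression'};
ratings = [1.2 0.7 2.3 2.6 1.0];              % hypothetical raw averages
normalized = ratings / ratings(strcmp(groups,'Vehicle'));  % Vehicle = 1.0
bar(categorical(groups), normalized)
ylabel('Maturity (normalized to Vehicle Testing)')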

Page 13: Learnings from Process Assessments


Verification and Validation Pillar – Status and Trends

▪ Vehicle and HIL Testing are strong for legacy reasons

▪ Low maturity in the other three process groups indicates difficulty in meeting safety standards such as ISO 26262

*Average maturity rating of the V&V pillar, normalized against Vehicle Testing


Page 14: Learnings from Process Assessments


Verification and Validation Pillar – Status and Trends

*Average maturity rating of the V&V pillar, normalized against Vehicle Testing

▪ Positive trend:

– Increasing rigor in Unit Testing and Regression Testing

– Causes: increasing system complexity, and standards such as ISO 26262 and ASPICE

– Maturity is expected to increase further with complex applications (AV/ADAS/AD)

▪ Puzzle: what about Integration Level Testing?

Page 15: Learnings from Process Assessments


Verification and Validation Pillar – Status and Trends

*Average maturity rating of the V&V pillar, normalized against Vehicle Testing

[Chart: Trends in Integration Level Testing – average maturity level (absolute scale) by year, 2010–2020]

Page 16: Learnings from Process Assessments


Why is integration level testing not being leveraged?

Page 17: Learnings from Process Assessments


Why Is Integration Level Testing Important?

▪ Safety standards

– Detailed unit-level verification

– Ensuring units work together through integration testing

▪ Requirement validation

– Top-down design approach

– Ensuring requirements are correct

50% of defects originate from requirements

38% of software issues were system-related

A "digital" car contains 100 million lines of code

Page 18: Learnings from Process Assessments


Key Components for Building an Integration-Level Model

▪ Automatic integration of unit-level models (see the sketch below)

▪ Stronger integration with the software/model repository

▪ Out-of-the-box plant model solutions

▪ Scalable simulation

▪ Matching software constructs, such as the scheduler

▪ etc.
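As a rough illustration of the first item, unit models can be assembled programmatically as Model reference blocks. This is a minimal sketch: 'unitA' and 'unitB' are hypothetical model names assumed to be on the MATLAB path, and signal wiring is omitted.

% Minimal sketch: assemble an integration model from unit models via
% Model reference blocks. Model names are hypothetical; wiring omitted.
units = {'unitA','unitB'};
new_system('integrationModel');
for k = 1:numel(units)
    blk = sprintf('integrationModel/%s', units{k});
    add_block('simulink/Ports & Subsystems/Model', blk, ...
        'Position', [100, 120*k, 280, 120*k + 80]);
    set_param(blk, 'ModelName', units{k});   % point the block at the unit
end
open_system('integrationModel')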

Page 19: Learnings from Process Assessments


Development for Integration Level Testing

Create Vehicle

Integrate Software

Author Scenarios

Simulate & Analyze

Deploy Simulation

R2016b release – Powertrain Blockset

R2018a release – Vehicle Dynamics Blockset
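As a sketch of how the Author Scenarios and Simulate & Analyze steps might be scripted: the model name, the scenarioFile variable, and the logged speed signal below are hypothetical placeholders for illustration, not a shipped API.

% Minimal sketch of scripted scenario simulation; 'vehicleModel',
% 'scenarioFile', and the logged 'speed' signal are hypothetical.
scenarios = {'scenario_city.mat','scenario_highway.mat'};
for k = 1:numel(scenarios)
    in  = Simulink.SimulationInput('vehicleModel');
    in  = in.setVariable('scenarioFile', scenarios{k});
    out = sim(in);                           % run one scenario
    plot(out.logsout.get('speed').Values)    % inspect a logged signal
    hold on
end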

Page 20: Learnings from Process Assessments


Enable Integration Level Testing

– Out-of-the-box capability

– Custom virtual vehicle solution (MathWorks Consulting Services)


Page 21: Learnings from Process Assessments


Are models leveraged for validation and analysis beyond code generation? How are they being maintained?

Page 22: Learnings from Process Assessments


Model Usage

▪ Most companies develop models for algorithm development.

▪ Leading companies also develop models for:

– Requirement validation

– Performance optimization

▪ Leading companies maintain models by:

– Gathering metrics and reports

– Optimizing simulation speed

▪ Model usage is an area that exhibits wide differences between leaders and laggards.

* Part of the Model Pillar

** Part of the Simulation Pillar

Page 23: Learnings from Process Assessments


Model Lifecycle Management Through Metrics

▪ Out-of-the-Box

– Model Metric Dashboard

Page 24: Learnings from Process Assessments


Model Lifecycle Management Through Metrics

▪ Out-of-the-Box

– Model Metric Dashboard

▪ Industry partnership

– Model Quality Objectives

Page 25: Learnings from Process Assessments


Model Lifecycle Management Through Metrics

▪ Out-of-the-Box

– Model Metric Dashboard

▪ Partnering with industry

– Model Quality Objectives

▪ Customized solution – Consulting (see the sketch below)

– Custom report with integration to Jenkins

– Embedded pass/fail thresholds
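A metrics gate with embedded thresholds might look like the following sketch, using the slmetric.Engine API that backed the Metrics Dashboard at the time of this talk; the model name and threshold are hypothetical.

% Minimal sketch of a CI metrics gate using slmetric.Engine.
% 'myModel' and the 500-block threshold are hypothetical.
me = slmetric.Engine();
setAnalysisRoot(me, 'Root', 'myModel', 'RootType', 'Model');
execute(me);
res = getMetrics(me, 'mathworks.metrics.SimulinkBlockCount');
blockCount = res(1).Results(1).AggregatedValue;  % root model, incl. children
threshold  = 500;                                % hypothetical limit
if blockCount > threshold
    % error() returns a nonzero exit in batch mode, failing the Jenkins build
    error('Metrics gate failed: %d blocks (limit %d)', blockCount, threshold);
end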

Page 26: Learnings from Process Assessments


Are processes and tools being evolved to meet the application needs?

Page 27: Learnings from Process Assessments


Version Upgrade – MathWorks Advisory Board (MAB) Survey

Page 28: Learnings from Process Assessments


Effect of Supporting Competency

EM ➔ Management Sponsorship

PTI ➔ Process/Tool Investment

[Scatter plots: Core Competency (normalized) vs. PTI (normalized), titled "Investment in Dedicated Process / Tool Support"; and Core Competency (normalized) vs. EM (normalized), titled "Enterprise / Management Sponsorship"]

Page 29: Learnings from Process Assessments


Summary

Improve the verification and validation process to meet safety standards

Leverage integration-level models for requirement validation

Ensure models are used for analysis and are maintained using leading-indicator metrics

Invest in processes and tools

Page 30: Learnings from Process Assessments


Thank You for Your Attention!

▪ Poll Questions

– Are you interested in learning more about Process Assessment?

▪ Yes

▪ No

– I am interested in Process Assessment for this reason:

▪ Gap analysis against standards such as ISO 26262, ASPICE, etc.

▪ Optimize an existing process

▪ Establish a new process

▪ None

Please contact Govind Malleichervu / John Lee with questions:

Industry Marketing – [email protected]

Consulting Services – [email protected]