Model Driven Performance Analysis


Page 1: Model Driven Performance Analysis

University College London

James Skene – [email protected]

Page 2: Outline

• Requirements for the analysis method, as I see them
• Overview of the model driven performance approach chosen
  – Rationale related to the requirements
• Future work

Page 3: Performance Analysis Functional Requirements

• Assuming the existence of the TAPAS platform…
• Reason about the compositionality of service level agreements
• Predict application capacity
  – Over-provisioning or under-provisioning with respect to SLAs incurs a cost
  – Targeted at ASP technologies
• Enable design-time performance prediction
  – Select architecture

Page 4: Non-functional requirements

• Be usable:
  – Performance analysis is outside the usual software engineering competence
• Must be integrated with standard software engineering practice
  – Minimise the cost of performance analysis
• Be used:
  – Performance analysis is currently not performed, despite its benefits

Page 5: Approach

• Mappings from analysis to design models within the Model Driven Architecture (MDA)
• Qualitatively:
  – Includes UML, so is integrated with standard software engineering practice
  – Is tool supported, so:
    • Can integrate the technique
    • Can provide assistance with the technique
    • Can automate the technique
• Also meets the functional requirements!

Page 6: The Model Driven Architecture (MDA)

• Family of specifications:
  – UML – The Unified Modelling Language
  – MOF – The Meta-Object Facility
  – CWM – The Common Warehouse Meta-model
  – Also: CORBA – The Common Object Request Broker Architecture
• Not really an architecture
• Software designs are captured as UML models

Page 7: PIMs and PSMs

• Problem: Technical infrastructure changes independently of business rules, but these are strongly coupled in designs.

• Solution: Decouple them

[Diagram: a Platform Specific Model (PSM) «realize»s a Platform Independent Model (PIM)]

Page 8: Semantic domains

• PIMs and PSMs relate to different types of thing
• It is convenient to describe these designs using different languages
  – E.g. EJB implementation details
• UML can describe object-oriented designs
• UML contains extension mechanisms to provide additional interpretations for model elements

Page 9: Metamodels

[Diagram: at the meta-model level, UML extended by a Profile forms a Virtual Metamodel; the PIM and PSM sit at the model level]

Page 10: Profiles

• The lightweight extension mechanisms:
  – Stereotypes extend the meaning of UML model elements
  – Tagged values associate qualities with model elements
  – Constraints govern the form of models, enforcing domain semantics; they act at the meta-model level
  – Profiles group stereotypes, tagged values and constraints
• Freedom through constraint
• Opportunity for standardisation
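
To make these mechanisms concrete, here is a minimal sketch, in Python, of how a tool might represent stereotypes, tagged values and profile constraints; the class and function names are hypothetical, not taken from any actual UML tool.

```python
# A minimal, hypothetical representation of UML profile concepts: stereotypes
# extend model elements, tagged values attach data to them, and constraints
# are predicates checked against a whole model.
from dataclasses import dataclass, field

@dataclass
class Element:
    name: str
    stereotypes: set = field(default_factory=set)  # e.g. {"queue"}
    tags: dict = field(default_factory=dict)       # e.g. {"serviceRate": 0.1}

@dataclass
class Profile:
    name: str
    stereotypes: set
    constraints: list  # callables: model -> list of violation messages

    def check(self, model):
        """Run every constraint and collect the violations it reports."""
        errors = []
        for constraint in self.constraints:
            errors.extend(constraint(model))
        return errors

def queues_need_service_rate(model):
    """Example constraint: every «queue» must carry a serviceRate tag."""
    return [f"{e.name}: missing serviceRate" for e in model
            if "queue" in e.stereotypes and "serviceRate" not in e.tags]

qn = Profile("QN", {"client", "queue", "delay"}, [queues_need_service_rate])
model = [Element("CPU", {"queue"}, {"serviceRate": 1000}), Element("Disk1", {"queue"})]
print(qn.check(model))  # ['Disk1: missing serviceRate']
```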

Page 11: Mappings

[Diagram: mappings relating PIMs, PSMs, source code, and analysis models that produce results]

Page 12: How are mappings described?

• Imperative mappings specify an algorithm

• Declarative mappings specify pair-wise constraints

• Declarative mappings can be captured in a profile using constraints.

[Diagram: a «profile» Mapping relating a «profile» Design to a «profile» QN]
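
To illustrate the declarative style, the sketch below expresses one pair-wise constraint between a design model and a QN model and merely checks it, rather than computing the QN imperatively. The representation and the particular constraint are hypothetical, chosen only for illustration.

```python
# A declarative mapping is a set of pair-wise constraints between two models.
# Models here are lists of dicts carrying a name and a set of stereotypes.

def resources_map_to_queues(design, qn):
    """Every «resource» in the design must have a same-named «queue» in the QN."""
    queues = {e["name"] for e in qn if "queue" in e["stereotypes"]}
    return [f'resource {e["name"]} has no corresponding queue' for e in design
            if "resource" in e["stereotypes"] and e["name"] not in queues]

# Because the mapping is declarative it can be partially specified: a checker
# simply runs whichever constraints are present and reports violations.
design = [{"name": "CPU", "stereotypes": {"resource"}},
          {"name": "Disk1", "stereotypes": {"resource"}}]
qn = [{"name": "CPU", "stereotypes": {"queue"}}]
print(resources_map_to_queues(design, qn))  # ['resource Disk1 has no corresponding queue']
```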

Page 13: Benefits of mappings

• They can be checked, providing assistance to modellers
• Declarative mappings only need to be partially specified
  – This flexibility addresses the difficulty of producing feasible analysis models
• The mappings define a semantics for the design domain in terms of analysis-domain concepts
• The declarative mappings provide guidance for subsequent automated mappings
• Can capture expert modelling techniques

Page 14: Design domain: A soft-real-time profile

Based on the ‘UML Profile for Schedulability, Performance, and Time Specification’

• Stereotypes to:
  – Identify workload classes
  – Identify resources accessed under mutual exclusion
  – Identify actions having resource demands

Page 15: A soft-real-time profile 2

• Tagged values to:
  – Specify workload parameters (e.g. population, think time, or arrival rate)
  – Specify resource demands for actions/procedures
  – Specify probabilities for choices, and the average number of iterations
• Constraints:
  – An object containing an action with a resource demand must be deployed in a context where the resource is available (a sketch of this check follows)
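
The deployment constraint above can be phrased as a simple structural check. The following Python sketch is a hypothetical rendering of it, not part of the profile; the data layout is assumed for illustration.

```python
# Hypothetical check: an object whose actions demand a resource must be
# deployed on a node that actually provides that resource.

def check_deployment(demands, deployment, node_resources):
    """
    demands:        {object: set of resources its actions demand}
    deployment:     {object: node the object is deployed on}
    node_resources: {node: set of resources the node provides}
    """
    errors = []
    for obj, needed in demands.items():
        node = deployment.get(obj)
        available = node_resources.get(node, set())
        for resource in sorted(needed - available):
            errors.append(f"{obj} demands {resource}, unavailable on {node}")
    return errors

# Using figures from the example models that follow:
print(check_deployment({"EmployeeBean": {"cpu", "disk1"}},
                       {"EmployeeBean": "Platform"},
                       {"Platform": {"cpu", "disk1", "disk2"}}))  # []
```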

Page 16: Example design model - sequence

[Sequence diagram over the lifelines :UpdateBean, :ManagerBean and :EmployeeBean, with the messages 1:update(), 2:ejbCreate() and 3:ejbCreate(), annotated with the tagged values {repetitions = 100, demand = {cpu: 10000}}, {p = 0.5, demand = {cpu: 100, disk1: 5}} and {demand = {cpu: 100, disk1: 5}}]

Page 17: Example design model – deployment

[Deployment diagram: a Platform node «deploys» :UpdateBean, :ManagerBean and :EmployeeBean and contains the «resource» nodes CPU, Disk1 and Disk2; the tagged values give one resource {serviceRate = 0.001s} and the other two {serviceRate = 0.1s}]

Page 18: A performance analysis domain profile

• Queuing networks
• Stereotypes:
  – Identify instances as queues, delays or populations
• Tagged values:
  – Specify service intervals and probabilities on links
• Constraints:
  – Ensure that the network is connected (a sketch of this check follows)
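
Connectedness is easy to check mechanically. The sketch below is a plain Python traversal over the network, treating links as undirected; the function name and data layout are assumptions for illustration.

```python
# Hypothetical QN-profile constraint check: the network is connected if every
# node is reachable from every other when links are treated as undirected.

def is_connected(nodes, links):
    """nodes: iterable of names; links: iterable of (src, dst) pairs."""
    nodes = set(nodes)
    if not nodes:
        return True
    adjacency = {n: set() for n in nodes}
    for src, dst in links:
        adjacency[src].add(dst)
        adjacency[dst].add(src)
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(adjacency[n] - seen)
    return seen == nodes

print(is_connected({"Workload", "CPU", "Disk1", "Disk2"},
                   [("Workload", "CPU"), ("CPU", "Disk1"), ("CPU", "Disk2")]))  # True
```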

Page 19: Example QN Collaboration

[Collaboration diagram: a «client» Workload {thinkTime = 5 sec, population = 15} connected to the «queue» instances CPU, Disk1 and Disk2; the tagged values give one serviceRate of 1000 and two of 0.1, and the links carry the probabilities {p = 0.05}, {p = 0.05} and {p = 0.02}]
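
A closed network of this shape can be solved exactly with mean value analysis (MVA). The sketch below is a textbook single-class MVA in Python; the slide does not state the units of serviceRate or the visit counts, so the service demands passed in at the bottom are placeholders, not values derived from the model.

```python
# Exact mean value analysis for a closed queueing network with one delay
# (think-time) station and several load-independent queueing stations.

def mva(demands, think_time, population):
    """
    demands:    {station: total service demand per interaction, seconds}
    think_time: client delay between interactions, seconds
    population: number of concurrent clients
    Returns (throughput, mean queue length per station).
    """
    queue = {k: 0.0 for k in demands}
    throughput = 0.0
    for n in range(1, population + 1):
        # Residence time at each station, given the queues left by n-1 clients.
        residence = {k: d * (1.0 + queue[k]) for k, d in demands.items()}
        throughput = n / (think_time + sum(residence.values()))
        queue = {k: throughput * r for k, r in residence.items()}
    return throughput, queue

# Placeholder demands; thinkTime = 5 s and population = 15 are from the slide.
X, Q = mva({"CPU": 0.05, "Disk1": 0.1, "Disk2": 0.04}, think_time=5.0, population=15)
print(f"throughput = {X:.2f} interactions/s, queues = {Q}")
```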

Page 20: Mapping from design to analysis domain

• Resources correspond to queues
• Resource demands translate to probabilities or demand vectors (see the sketch below)
• Much more complicated mappings will be required to capture infrastructure details (e.g. the performance of containers)

[Diagram: a «model» Design related to a «model» QN by a «DesignToQN» mapping]
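
One step of such a mapping can be made concrete: folding the per-action tagged values from the earlier sequence diagram into a demand vector per resource is a weighted sum over the annotated actions. The Python sketch below is hypothetical; in particular it assumes each annotation weights its own demand independently, which the slides do not specify.

```python
# Hypothetical design-to-QN mapping step: aggregate per-action annotations
# (repetitions, probability, demand vector) into total demand per resource.

def total_demands(actions):
    """actions: list of {'repetitions': int, 'p': float, 'demand': {resource: units}}"""
    totals = {}
    for action in actions:
        weight = action.get("repetitions", 1) * action.get("p", 1.0)
        for resource, units in action["demand"].items():
            totals[resource] = totals.get(resource, 0.0) + weight * units
    return totals

# The annotations from the example sequence diagram:
actions = [
    {"repetitions": 100, "demand": {"cpu": 10000}},
    {"p": 0.5, "demand": {"cpu": 100, "disk1": 5}},
    {"demand": {"cpu": 100, "disk1": 5}},
]
print(total_demands(actions))  # {'cpu': 1000150.0, 'disk1': 7.5}
```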

Page 21: Requirements?

[Diagram: the envisaged tool spans the lifecycle, relating a PIM, PSMs for EJB, MQ server and Oracle, and analysis models in QN, SPA and SPN, all associated with SLAng]

Page 22: Progress

• SLAng identifies relevant scenarios and technologies
• Assembling a toolset:
  – Poseidon UML
  – MDR plug-in for NetBeans
  – LUI OCL checker, from the NEPTUNE project
  – Libraries and tools for performance analysis

Page 23: Future work

• Define profiles

• Associate with SLAng constructs

• Create tool to automate analysis

• Integrate into single IDE

• Automate mappings