Compuware ASEAN APM User Conference 2013 - APM Performance Journey Presentation

Performance Journey℠ An Introduction



Page 1

Performance Journey℠

An Introduction

Page 2

What Does it Take to be Great?

• Dribbling

• Passing

• Tackling

• Shooting

• Heading

• Teamwork

Page 3

What Does it Take to be Great at APM?

• Performance Measurement: The ability to measure application performance from the end-user’s perspective across the entire application delivery chain

• Problem Resolution: The ability to identify, isolate the fault domain, determine root cause, and resolve application performance problems

• Performance Improvement: The ability to continuously identify, prioritize, implement, and measure the results of application improvement opportunities

• Production Readiness: The ability to ensure the user experience can scale with load prior to launching new applications or deploying infrastructure changes

• Performance Reporting: The ability to provide role-specific insight using common metrics, enabling superior business-oriented IT decision-making

Page 4

A Model of APM Maturity

The model rates five core APM aspects (Performance Measurement, Problem Resolution, Performance Improvement, Production Readiness, Performance Reporting) across five maturity levels:

Level 1 REACTIVE
• Performance Measurement: No awareness of user experience
• Problem Resolution: Reactive problem resolution
• Performance Improvement: Ad hoc “gut feel” approach to improvements
• Production Readiness: “Test in Production” approach to new technologies
• Performance Reporting: Few SLAs / reporting on technology

Level 2 AWARE
• Performance Measurement: Basic awareness and ownership
• Problem Resolution: Reactive resolution but can validate
• Performance Improvement: Some improvements using baseline
• Production Readiness: Best effort focused on code / infrastructure
• Performance Reporting: SLAs / reporting have basic end-user metrics

Level 3 EFFECTIVE
• Performance Measurement: Own user experience across the chain
• Problem Resolution: Increasingly proactive awareness
• Performance Improvement: Can pinpoint specific causes of issues
• Production Readiness: Load testing focused on the application
• Performance Reporting: SLAs / reporting on end-to-end performance

Level 4 OPTIMIZED
• Performance Measurement: Deeper level of understanding tied to business
• Problem Resolution: Automation of issue analysis and diagnostics
• Performance Improvement: Improvement based on Six Sigma or ITIL
• Production Readiness: Load testing focused on user experience
• Performance Reporting: SLAs / reporting tied to business metrics

Level 5 PERVASIVE (Best Practice)
• Performance Measurement: Real-time visibility drives service delivery
• Problem Resolution: Automation of resolution before business impact
• Performance Improvement: Improvement processes & auto implementation
• Production Readiness: Designed with performance in mind
• Performance Reporting: SLAs / reporting are a competitive advantage

Page 5

Improving APM Maturity

The approach drills down from core APM aspects, to the key capabilities for each aspect, to best practices for each capability.

Core APM aspects: Performance Measurement, Problem Resolution, Performance Improvement, Production Readiness, and Performance Reporting.

Key capabilities for Performance Measurement:

• Measure end-user experience based on key repeatable “control” transactions

• Measure the end-user experience based on real end-user response time

• Measure transaction-level performance across all tiers of the application delivery chain

Best practice for the “control” transactions capability: apply synthetic or robotic monitoring using scripted performance and availability measurements to identify problems specific to geographies, to alert on availability problems, and to provide controlled, repeatable end-user experience measurements ideal for response-time SLAs.
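As a concrete illustration of that synthetic monitoring best practice, the following is a minimal sketch of a scripted probe; it is not from the deck or from any Compuware product, and the target URL, SLA threshold, and location tag are hypothetical. A real deployment would run such scripts on a schedule from multiple geographies and feed the results into a monitoring and alerting backend.

```python
# Minimal sketch of a synthetic (robotic) monitoring probe: one scripted
# "control" transaction measured for availability and response time.
# TARGET_URL, the SLA threshold, and PROBE_LOCATION are hypothetical values.
import time
import requests

TARGET_URL = "https://example.com/checkout"  # hypothetical control transaction
RESPONSE_TIME_SLA_SECONDS = 2.0              # hypothetical response-time SLA
PROBE_LOCATION = "Singapore"                 # geography tag for this probe


def run_probe():
    """Execute one measurement and return availability plus response time."""
    start = time.perf_counter()
    try:
        response = requests.get(TARGET_URL, timeout=10)
        available = response.status_code < 500
    except requests.RequestException:
        available = False
    elapsed = time.perf_counter() - start
    return {
        "location": PROBE_LOCATION,
        "available": available,
        "response_time_s": round(elapsed, 3),
        "sla_breached": (not available) or elapsed > RESPONSE_TIME_SLA_SECONDS,
    }


if __name__ == "__main__":
    # In practice this would run repeatedly from several locations and raise
    # an alert whenever "sla_breached" is True.
    print(run_probe())
```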

Page 6

Application Performance Management Challenges

1. Technology

2. People and Approach

[Diagram: the application lifecycle from Business through Development and Test to Production, spanning diverse technologies such as C/C++]

Page 7

Teamwork to ensure success

Customer, Partner, and Compuware

Page 8

Performance Journey℠ Assessment

Method and Structure

Page 9

“If you want to build a ship,

don't drum up people together to collect wood

and don't assign them tasks and work,

but rather teach them to long for the endless immensity of the sea”

Antoine de Saint-Exupéry

Performance Journey℠ Vision

An approach to achieving application performance excellence and developing a roadmap based on organizational capabilities and goals

Page 10

Performance Journey℠ Assessment

• Interactive series of workshops with key stakeholders

• Identifies current capability baseline, maps business and IT goals, and analyzes gaps

[Diagram: for each APM aspect, the current capability baseline and the goal, with the gap (GAP) between them highlighted]

Page 11

Performance Journey℠ Assessment - Process

The assessment runs through five phases: Plan, Discovery, Analyse, Present, and Follow-up, delivered through a mix of onsite and remote activities.

• Preliminary Meeting (Plan): refine scope; identify stakeholders; agree business goals

• Interview Workshop (Discovery): review current practice; score capabilities, both current and 18-month aspiration

• Report (Analyse): analyse findings; benchmark; identify improvement options

• Follow-up Workshop (Present): present the report

• Follow-up: next-steps discussions; discussion of additional coaching offerings or consulting support


Page 12

Performance Journey℠ Assessment Results

The results are plotted on the APM maturity model (Level 1 REACTIVE, Level 2 AWARE, Level 3 EFFECTIVE, Level 4 OPTIMIZED, Level 5 PERVASIVE), marking each aspect’s current baseline and its 18-month goal, with the gap between them:

• Performance Measurement: from “Basic awareness and ownership” (Level 2) to “Deeper level of understanding tied to business” (Level 4)

• Problem Resolution: from “Reactive problem resolution” (Level 1) to “Automation of issue analysis and diagnostics” (Level 4)

• Performance Improvement: from “Some improvements using baseline” (Level 2) to “Improvement based on Six Sigma or ITIL” (Level 4)

• Production Readiness: from the “Test in Production” approach to new technologies (Level 1) to “Load testing focused on user experience” (Level 4)

• Performance Reporting: from “Few SLAs / reporting on technology” (Level 1) to “SLAs / reporting on end-to-end performance” (Level 3)
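To make the gap analysis concrete, here is a minimal scoring sketch (not from the deck): it encodes the example baseline and 18-month aspiration levels listed above on the 1 to 5 maturity scale and reports the gap per aspect, largest first. The scoring code itself is illustrative, not Compuware’s actual assessment tooling.

```python
# Minimal sketch of the gap analysis behind the assessment results: each APM
# aspect gets a baseline level and an 18-month aspiration level on the 1-5
# maturity scale, and the gap is simply the difference. The levels below are
# read off the example results above; the code is illustrative only.
LEVEL_NAMES = {1: "REACTIVE", 2: "AWARE", 3: "EFFECTIVE", 4: "OPTIMIZED", 5: "PERVASIVE"}

# (baseline level, 18-month aspiration level) per aspect, from the example
ASSESSMENT = {
    "Performance Measurement": (2, 4),
    "Problem Resolution": (1, 4),
    "Performance Improvement": (2, 4),
    "Production Readiness": (1, 4),
    "Performance Reporting": (1, 3),
}


def report_gaps(assessment):
    """Print each aspect's maturity gap, largest gap first."""
    by_gap = sorted(assessment.items(),
                    key=lambda item: item[1][1] - item[1][0],
                    reverse=True)
    for aspect, (baseline, goal) in by_gap:
        print(f"{aspect}: Level {baseline} {LEVEL_NAMES[baseline]} -> "
              f"Level {goal} {LEVEL_NAMES[goal]} (gap {goal - baseline})")


if __name__ == "__main__":
    report_gaps(ASSESSMENT)
```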

Typical Challenges Identified

• Gaps in APM capability adoption

• Aspiration/strategy differences

• Process deficiencies

• Organizational blockers

• Skills deficiencies

• Gaps in available APM toolset

Page 13
