Performance Measurements of IMS Systems: Concepts
© Copyright 2008, SoftWell Performance AB
Background to this presentation and proposal
Background

A key to continued success (IMS): quality needs to be defined and measured.

The involved standards (IMS) should include definitions of:
- Quality metrics
- Measurements of quality metrics
- Evaluation and verification of measured quality figures

Metrics should include:
- Product characteristics
- Production characteristics

This was our message in a panel discussion at VON Rome.
Background

Why should performance testing have a standard?

General reasons:
- We gain a general understanding of what should be regarded as performance
- We get a standard vocabulary for performance-related issues
- We get one interpretation of every metric
- We get a standard notation for performance test cases
- The "Babylon" of performance testing will come to an end

Practical reasons:
- Different tests performed according to the standard can be compared
- Performance test tools can be certified
- Performance test cases will be transferable between test tools that comply with the standard
- Standardized performance test case descriptions will be easier to maintain
- It becomes possible to define and certify performance test suites
- Regression testing is simplified (set-up and comparison of results)

The goal: to expand the notation of TTCN-3 to cover performance testing.

This presentation is a short introduction to performance testing in general, with extensions to IMS and its architecture. We hope to initiate a performance task force.
Performance Objectives & Performance Metrics
Performance Objectives

Three classes of performance objectives:
- Powerful: can the test object deliver what is required?
- Reliable: can the test object maintain services as required?
- Efficient: is the test object designed to be powerful and reliable?

[Diagram: the measured object characterized along three axes: Powerful, Reliable, Efficient.]
Performance Metrics

Metrics describing powerfulness (delivery aspects):

- Capacity: performance measurements of
  - Output (throughput)
  - Input capacity (peak load)
  - Concurrency (transactions concurrently being processed)
- Speed of operation: performance measurements of time for tested services
  - Response time (time to deliver a response to a request)
  - Latency time (time to process or react)
  - Roundtrip time
- Scalability: performance measurements of improvements by
  - Adding more HW
  - Adding faster HW
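The capacity and speed-of-operation metrics above can be derived from raw timestamp recordings. A minimal sketch in Python, not taken from the slides; the record layout (start/end timestamp pairs) and function names are assumptions:

```python
# Sketch: deriving response-time percentiles and throughput from
# recorded (start, end) timestamps of completed requests.
from statistics import quantiles

def response_times(records):
    """records: list of (start_s, end_s) tuples for completed requests."""
    return [end - start for start, end in records]

def percentile(samples, p):
    """p-th percentile (1-100) using inclusive interpolation."""
    qs = quantiles(sorted(samples), n=100, method="inclusive")
    return qs[int(p) - 1]

def throughput(records):
    """Completed requests per second over the measured interval."""
    first = min(start for start, _ in records)
    last = max(end for _, end in records)
    return len(records) / (last - first)

records = [(0.0, 0.1), (0.5, 0.7), (1.0, 1.4)]
rts = response_times(records)
```

With the three sample requests above, the median response time is the middle duration and the throughput is three requests over the 1.4 s measured window.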
Performance Metrics (cont.)

Metrics describing reliability (carrier-grade aspects):

- Stability: does the tested system show trends or patterns in powerfulness and/or efficiency over time?
- Availability: unavailability in service delivery over time due to
  - Physical problems (platform)
  - Logical problems (application)
- Robustness: service levels during extreme conditions
  - External (DoS attacks, peak loads)
  - Internal (partial HW outage)
- Recovery: time to recover from
  - Partial or full restart of the system
  - Update of system HW/SW
- Correctness: does the tested system deliver correct results under load?
Performance Metrics (cont.)

Metrics describing efficiency (design aspects):

- Resource usage: performance limits due to
  - Bottlenecks (design)
  - Overly dynamic resource allocation (peak-on-peak)
- Resource utilization: performance limits due to
  - Load distribution
  - Resource access (queues)
- Resource balance: performance limits due to shortage of one resource while plenty of other resources remain
- Linearity & scalability: performance limits due to
  - Non-linear processing
  - Multi-CPU / multi-core design limits
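The linearity & scalability metric can be quantified by comparing measured throughput before and after hardware is added. A sketch under the assumption of a simple two-point model; the function names are illustrative:

```python
# Sketch: speedup and parallel efficiency when hardware is added.
# Efficiency 1.0 means linear scaling; values below 1.0 indicate
# non-linear processing or multi-CPU/core design limits.
def speedup(base_throughput, scaled_throughput):
    return scaled_throughput / base_throughput

def parallel_efficiency(base_throughput, scaled_throughput, hw_factor):
    """hw_factor: e.g. 2.0 when the number of CPUs is doubled."""
    return speedup(base_throughput, scaled_throughput) / hw_factor
```

For example, doubling the CPUs while throughput rises from 100 to 180 requests/s gives a speedup of 1.8 and a parallel efficiency of 0.9.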
Performance Metrics (cont.)

Requirements on performance metrics. Performance metrics must be:
- Uniquely identified (metrics identifiers)
- Understandable (metrics specifications)
- Comparable (recording conditions and methods)
- Repeatable (test case specs, test tool specs and test object specs)
- Accurate (data recording methods and configurations)
Performance Metrics (cont.)

Performance metrics descriptions. A measurement value (or values) is described by:
- Metrics identifiers: what, where, when
- Metrics format: value type, unit, accuracy (+/- %)
- Measurement conditions: how
- Metrics types: raw data, normalized data, derived data from …
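The description elements above can be carried as one record per measurement value. A sketch of such a record; all field names are assumptions, not from the slides:

```python
# Sketch: a metrics record carrying identifiers (what/where/when),
# format (unit, accuracy), conditions (how) and metrics type.
from dataclasses import dataclass
from enum import Enum

class MetricsType(Enum):
    RAW = "raw"
    NORMALIZED = "normalized"
    DERIVED = "derived"

@dataclass
class Metric:
    what: str            # e.g. "response_time"
    where: str           # measurement point, e.g. "P-CSCF:Gm"
    when: float          # recording timestamp (epoch seconds)
    value: float
    unit: str            # e.g. "ms"
    accuracy_pct: float  # +/- percent
    kind: MetricsType = MetricsType.RAW
    how: str = ""        # measurement conditions

m = Metric("response_time", "P-CSCF:Gm", 1.2e9, 41.5, "ms", 0.5)
```

Keeping identifiers, format and conditions with the value is what makes measurements comparable and repeatable across test runs and tools.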
Performance Test Bench Overview
Test Bench

Performance Test Bench overview

Performance test bench surroundings:

[Diagram: the performance test bench surrounds the measured unit. A performance test specification (covering the powerful, reliable and efficient objectives) drives the test; measurement data and evaluation come out.]
Test Bench

Performance Test Bench overview

Performance test bench components:

[Diagram: Test Tool Clients and Test Tool Servers exchange traffic with the measured unit and record external performance data; probes record internal performance data inside the measured unit; all data feeds measurement evaluation against the powerful, reliable and efficient objectives.]
Performance Test Bench overview

Performance test specifications. A performance test specification consists of:
- Test sizing
- Test bench configuration
- Test application (services)
- Test data specification
- Measurement specifications
- Test evaluation
- Test reporting
Performance Test Bench overview

Performance test specifications (cont.): test sizing specifications
- Specify test duration: how long will the test run?
- Specify load: load levels, load patterns, load level variations
- Specify simulated volumes: number of simulated users, amounts of test data
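A test sizing specification can be expressed as plain data plus a rule for the load level over time. A sketch, assuming a simple linear ramp-up followed by steady load; the field names and the ramp/steady pattern are illustrative choices, not from the slides:

```python
# Sketch: a test-sizing spec (duration, load, simulated volumes) and a
# helper that yields the target load level at a given time into the test.
from dataclasses import dataclass

@dataclass
class SizingSpec:
    duration_s: int        # how long the test runs
    base_rate: float       # requests/s at steady state
    ramp_up_s: int         # linear ramp to base_rate
    simulated_users: int
    test_data_rows: int

def target_rate(spec, t):
    """Target request rate at time t (seconds since test start)."""
    if t < 0 or t > spec.duration_s:
        return 0.0
    if t < spec.ramp_up_s:
        return spec.base_rate * t / spec.ramp_up_s
    return spec.base_rate

spec = SizingSpec(duration_s=3600, base_rate=200.0, ramp_up_s=60,
                  simulated_users=10_000, test_data_rows=50_000)
```

More elaborate load patterns (steps, bursts, sinusoidal variations) would replace `target_rate` while the rest of the spec stays unchanged.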
Performance Test Bench overview

Performance test specifications (cont.): test bench specifications
- Test bench components (physical & logical): test object components (SUT), test tool components
- Test bench transmission specs: used protocols, addresses and ports for traffic, load level variations
- Test bench traffic mapping: mapping test tool components to test object components, addresses and ports for traffic
Performance Test Bench overview

Performance test specifications (cont.): test application specifications (client side)
- Test application content (service profile): what services should be requested
- Test application flow (traffic profile): in what order services shall be requested, intelligent handling of different results of a service request, randomized order of service requests
- Test application customization: service request formats (multiple formats per service type), individually modified service request messages
- Test application processing specifications: specification of the processing flow of a service, timer settings
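A client-side traffic profile with a randomized order of service requests can be sketched as a weighted draw over the service profile. The service names and weights below are invented for illustration:

```python
# Sketch: a service profile as weights, and a traffic profile that
# requests services in a weighted random order.
import random

SERVICE_PROFILE = {"REGISTER": 0.1, "INVITE": 0.6, "MESSAGE": 0.3}

def next_service(rng):
    """Draw the next service to request, honoring the profile weights."""
    services = list(SERVICE_PROFILE)
    weights = [SERVICE_PROFILE[s] for s in services]
    return rng.choices(services, weights=weights, k=1)[0]

rng = random.Random(42)  # fixed seed makes the test run repeatable
sample = [next_service(rng) for _ in range(1000)]
```

Seeding the generator is one way to meet the repeatability requirement on metrics: the same seed reproduces the same request sequence.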
Performance Test Bench overview

Performance test specifications (cont.): test application specifications (server side)
- Test application flow (response profile): normal response mode, negative response mode (errors, time-outs, disconnects etc.)
- Test application customization: service response formats (multiple formats per service type), individually modified service response messages
- Test application processing specifications: specification of the processing flow of a service, timer settings
Performance Test Bench overview

Performance test specifications (cont.): test data specifications
- User name specification: user names to be inserted in messages (client user name spaces, server user name spaces)
- User data specification: user data variables to be inserted in messages
Performance Test Bench overview

Performance test specifications (cont.): test measurement specifications
- Measurement outside the test object: what should be measured at the test tool interfaces (traffic rates, traffic volumes, response time, latency time etc.)
- Measurement inside the test object: what should be measured inside the test object (resources; measurement points: per server, per process, …)
Performance Measurement Methods
Performance Measurement Methods

Three performance measurement methods applied to a measured unit:

Method      | When            | Why                     | How
Load        | Pre-deployment  | Measuring limits        | Isolated functions, simple traffic
Simulation  | Pre-deployment  | Verify production req.  | Complex traffic, what-if testing
Monitoring  | Post-deployment | Verifying requirements  | Active monitoring, passive monitoring
IMS Performance Test Objects
Test Bench

IMS Performance Test Bench

Performance test bench components:

[Diagram: the test bench with one or more IMS functions as the measured unit. Test Tool Clients and Test Tool Servers record external performance data; probes record internal performance data; all data feeds measurement evaluation against the powerful, reliable and efficient objectives.]
IMS Performance Test Bench

The IMS architecture (simplified):

[Diagram: the IMS signaling plane connects the UE over SIP/Gm (01) to the P-CSCF; P-CSCF, I-CSCF and S-CSCF interconnect over SIP/Mw (02). Control services (BGCF, MGCF, MRF-C) interconnect over SIP/Mr (04), SIP/Mi (05), SIP/Mj (06), SIP/Mg (07) and ISUP/IP. AAA services (HSS, SLF) are reached over Diameter/Cx and Diameter/Dx (01). Application services (SIP-AS, OSA-SCS, IM-SSF) attach over SIP/ISC (03), Diameter/Sh (03) and HTTP/Ut (03). The IMS media plane (SGW, MGW, MRF-P) carries RTP and is controlled over H.248/Mn and H.248/Mp.]
IMS is an architecture containing a large number of interacting functions, where each function performs its tasks based on traffic over connected protocols.

Overall performance is never better than that of the weakest function in the service chain, i.e. any function that does not match the performance of adjacent functions is a bottleneck or a risk.
IMS Performance Test Bench
Performance testing IMS functions
An integrated IMS system must be performance tested at several levels: from individual functions, via sets of interacting functions, to entire systems and interconnected systems.
A test bed for a single function may be quite complex to configure and set up.
[Diagram: example test bed for the control functions (P-CSCF, I-CSCF, S-CSCF, BGCF). Test tools with traffic servers connect over SIP/Gm/Mw/ISC and SIP/Mg/Mi/Mj/Mr toward the control functions, over Diameter/Cx/Dx toward the AAA services (HSS, SLF), and over SIP/ISC, Diameter/Sh and HTTP/XCAP/Ut toward the application services, such as a PTT server, a Presence server, an SBC, an MRF-C, or others.]
IMS Performance Test Bench
IMS function interfaces and requirements

An IMS function's interfaces use:
- Application protocols: SIP, Diameter, XCAP, others
- Transmission protocols: TCP, UDP, SCTP, others
- Port usage: multiplexed, single user per port, or individual IP address
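A test tool traffic unit that exercises such an interface must build messages in the function's application protocol. As an illustration, a minimal SIP OPTIONS request (a standard SIP method often used to probe reachability); all addresses, tags and branch values below are placeholders, and a real test tool would also handle transport, retransmission and responses:

```python
# Sketch: building a minimal SIP OPTIONS request for a test-tool client.
# Field values are placeholders; sending it over UDP/TCP/SCTP is omitted.
def sip_options(target, source, call_id, branch, tag):
    lines = [
        f"OPTIONS sip:{target} SIP/2.0",
        f"Via: SIP/2.0/UDP {source};branch={branch}",
        f"From: <sip:{source}>;tag={tag}",
        f"To: <sip:{target}>",
        f"Call-ID: {call_id}",
        "CSeq: 1 OPTIONS",
        "Max-Forwards: 70",
        "Content-Length: 0",
    ]
    # SIP messages use CRLF line endings and a blank line before the body.
    return "\r\n".join(lines) + "\r\n\r\n"

msg = sip_options("pcscf.example.net", "client.example.net:5060",
                  "abc123", "z9hG4bKtest1", "tag1")
```

The same traffic unit would need sibling builders for Diameter and XCAP to cover the other application protocols listed above.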
IMS Performance Test Bench

IMS function Test Tool interfaces

Test tools must have traffic units that match the interfaces of all IMS components:
- Test Tool Clients (client side toward the IMS function)
- Test Tool Servers (server side toward the IMS function)
- Test Tool Peers (both server side and client side)
- Test Tool Bridges (server side in, client side out)
IMS Performance Test Bench

IMS function Test Tool interfaces

A test bed may be set up with several IMS functions interconnected by the test tool.

[Diagram: several IMS functions chained together via Test Tool Clients, Test Tool Servers, Test Tool Peers and Test Tool Bridges, each tool unit presenting a matching server side or client side toward the IMS functions.]
What is next: suggestions
What is next: suggestions

The track will contain a number of tasks:
- Create a proposal for performance-related definitions and methods
- Create an abstract but precise notation for performance test cases
- Create a standard for test tool components in a test bench
- Create a proposal for language extensions to TTCN-3

First of all, some practical activities:
- Investigate the interest in this, and if there is interest
- Set up a task force
Thank you for your time
A standard for performance testing
“To measure is to know”
- Lord Kelvin