3G KPI Guidelines


Appendix U12

UMTS Network KPI

NOTES:

This document can be used as an example of defining KPIs in a contract. It is in Ericsson’s interest that KPIs are well defined, including:

a KPI target values (keep as few with targets as possible)

b Formula and measurement definitions (for drives and stats)

c Where and how KPI will be measured (locations, setup, tools, drive, statistics)

d Conditions – prerequisites, exclusions

e Statistical considerations

As much of this as practical should be defined in the contract, particularly anything that will limit Ericsson’s risks and costs.

Some of the material in this document may best be included in other parts of the contract, including the Acceptance Procedure and associated documents.

KPI target values have very little meaning until the conditions of measurement are defined as above. Committing to a target without defining how, when and where it will be measured leaves Ericsson at great risk.

N.B. The KPIs and their targets included in this document may not be the ones recommended by Ericsson. Refer to the KPI Technical Guidelines and KPI database for current recommendations.


Table of Contents

UMTS NETWORK KPI ... 1
1 INTRODUCTION ... 3
1.1 SUMMARY OF LEVEL-1 KPI AND TARGETS ... 3
2 PERFORMANCE COMMITMENTS AND REQUIREMENTS ... 6
2.1 MILESTONE DEFINITION ... 6
2.2 SCOPE ... 6
2.3 GENERAL REQUIREMENTS ... 6
2.4 CONTRACTUAL KPI COMMITMENTS ... 7
3 UTRAN LEVEL-1 KEY PERFORMANCE INDICATORS ... 7
3.1 CIRCUIT SWITCHED LEVEL-1 KEY PERFORMANCE INDICATORS ... 7
3.1.1 Circuit Switched Call Setup ... 7
3.1.2 RRC Establishment Causes ... 9
3.1.3 CSV Access Failure ... 9
3.1.4 CSV Drop ... 11
3.1.5 CSV Quality ... 12
3.1.6 CSV Cell Capacity ... 14
3.1.7 CSV Soft/Softer Handover Overhead ... 15
3.1.8 CSV Inter-Radio Access Technology Handover Failure ... 16
3.1.9 CSV Call Setup Time ... 18
3.1.10 CSD Access Failure ... 19
3.1.11 CSD Drop ... 20
3.1.12 CSD Quality ... 21
3.1.13 CSD Call Setup time ... 22
3.2 PACKET SWITCHED LEVEL-1 KEY PERFORMANCE INDICATORS ... 23
3.2.1 Packet Switched Call Setup ... 23
3.2.2 PSD Access Failure ... 24
3.2.3 PSD Drop ... 25
3.2.4 PSD Latency ... 26
3.2.5 PSD Throughput ... 26
3.2.6 PSD Call Setup Time ... 28
3.2.7 PSD Inter-Radio Access Technology Handover Failure ... 29
3.2.8 PSD IRAT Interruption time ... 30
3.2.9 HSDPA Access Failure ... 31
3.2.10 HSDPA Drop ... 32
3.2.11 HSDPA Latency ... 34
3.2.12 HSDPA Throughput ... 34
3.2.13 HSUPA Throughput ... 36
3.2.14 HSDPA Data Session Setup Time ... 37
3.3 SYSTEM AVAILABILITY ... 37
3.3.1 Average Cell Availability ... 38
3.3.2 Average Cell Non-Maintenance Availability ... 38
3.4 COMPARISON OF UMTS AND GSM NETWORK PERFORMANCE ... 39
3.5 IMPACT OF UMTS ON GSM NETWORK PERFORMANCE ... 39
4 UTRAN Level-2 Key Performance Indicators ... 40


1 Introduction

The purpose of this Appendix U12 is to define Network Key Performance Indicators (KPI) for UMTS drive-test and UTRAN counter-based measurements, together with performance targets and formulas for each KPI. All capitalized terms, acronyms, and definitions used in this Appendix U12 are listed either in the UMTS Acronyms and Definitions (Appendix U18) or in the Original Agreement.

The UTRAN KPI definition and development effort is performed at the two levels as defined below:

Level-1 KPI are the high-level metrics used for overall service quality measurement and for monitoring the health of Purchaser’s UTRAN Network. These metric definitions shall be agreed by Seller, and Seller’s implementation of the Level-1 metrics shall comply with Purchaser’s definitions in this Appendix U12.

Level-2 KPI are the detailed engineering-level metrics that shall be used for engineering and dimensioning the Network, and for the investigation and troubleshooting of problem areas in the UMTS Network.

1.1 Summary of Level-1 KPI and Targets

NOTE: The selection of KPI and the target values should use the KPI Technical Guidelines and KPI Database. The following is an example from a real contract; it should NOT be seen as an Ericsson recommendation and must not be used in any contract in this form.

KPI Parameter | Table Reference | Pre-Launch (Milestone 1) | Launch (Milestone 2) | Post Launch (Milestone 3)
CSV Access Failure Rate | Table 3 | ≤ 2.0 % | ≤ 2.0 % | ≤ 2.0 %
CSV Drop Rate | Table 5 | ≤ 2.0 % | ≤ 2.0 % | ≤ 2.0 %
CSV Quality (DL) | Table 8 | 95th percentile of samples ≤ 2.0 % BLER | 95th percentile of samples ≤ 2.0 % BLER | 95th percentile of samples ≤ 2.0 % BLER
CSV Quality (UL) | Table 8 | 95th percentile of samples ≤ 2.0 % BLER | 95th percentile of samples ≤ 2.0 % BLER | 95th percentile of samples ≤ 2.0 % BLER
Cell Capacity | Table 10 | ≥ 40 UE AMR 12.2k (provided it is achieved in RF design) | ≥ 40 UE AMR 12.2k (provided it is achieved in RF design) | Measure and report
Soft/Softer Handover Overhead | Table 12 | ≤ 1.6 | ≤ 1.6 | ≤ 1.7
CSV IRAT Failure Rate | Table 14 | N/A | ≤ 5.0 % | ≤ 5.0 %


*Voice Call Setup time (Mobile to PSTN) | Table 15 | 95th percentile ≤ 6 seconds | 95th percentile ≤ 6 seconds | 95th percentile ≤ 6 seconds
*Voice Call Setup time (Mobile to Mobile) | Table 15 | 95th percentile ≤ 9 seconds | 95th percentile ≤ 9 seconds | 95th percentile ≤ 9 seconds
CSD Access Failure Rate | Table 17 | ≤ 3.0 % | ≤ 2.0 % | ≤ 2.0 %
CSD Drop Rate | Table 19 | ≤ 2.5 % | ≤ 2.5 % | ≤ 3.0 % (Counter); ≤ 2.5 % (Drive Test)
CSD Quality (DL) | Table 21 | 95th percentile of samples ≤ 1.0 % BLER | 95th percentile of samples ≤ 1.0 % BLER | 95th percentile of samples ≤ 1.0 % BLER
CSD Quality (UL) | Table 21 | 95th percentile of samples ≤ 1.0 % BLER | 95th percentile of samples ≤ 1.0 % BLER | 95th percentile of samples ≤ 1.0 % BLER
*CSD Call Setup time | Table 22 | N.A | 95th percentile ≤ 9 seconds | N.A
PSD Access Failure Rate | Table 24 | ≤ 2.0 % | ≤ 2.0 % | ≤ 2.0 %
PSD Drop Rate | Table 26 | ≤ 2.5 % | ≤ 2.0 % | ≤ 2.0 %
*PSD Latency (any R99 RAB) | Table 28 | N.A | 95th percentile ≤ 200 ms | N.A
PSD R99 Average DL throughput (kbps) (Unloaded) | Table 31 | ≥ 240 | ≥ 240 | ≥ 150 (Counter); ≥ 240 (Drive Test)
PSD R99 Average DL throughput (kbps) (Loaded) | Table 31 | ≥ 210 | ≥ 210 | ≥ 150 (Counter); ≥ 210 (Drive Test)
PSD R99 Average UL throughput (kbps) (Unloaded) | Table 32 | ≥ 200 | ≥ 200 | ≥ 150 (Counter); ≥ 200 (Drive Test)
PSD R99 Average UL throughput (kbps) (Loaded) | Table 32 | ≥ 180 | ≥ 180 | ≥ 150 (Counter); ≥ 180 (Drive Test)


*PSD Call Setup time | Table 34 | N.A | 95th percentile sessions ≤ 5 seconds | N.A
PSD IRAT Failure Rate | Table 36 | N/A | ≤ 5.0 % | ≤ 5.0 %
PSD IRAT Interruption time | Table 39 | N.A | 95th percentile ≤ 12 seconds | N.A
PSD IRAT User Data Interruption time | Table 39 | N.A | Measure and Report | N.A
HSDPA Access Failure Rate | Table 41 | ≤ 2.0 % | ≤ 2.0 % | ≤ 2.0 %
HSDPA Drop Rate | Table 43 | ≤ 3.0 % | ≤ 2.0 % | ≤ 2.0 %
*HSDPA Latency | Table 45 | N.A | 95th percentile sessions ≤ 100 ms | N.A
Stationary Maximum DL HSDPA Bit Rate (kbps) under no load (UE Category 12) | Table 47 | ≥ 1300 | ≥ 1300 | ≥ 1300 (at RLC Layer using counter)
Stationary Maximum DL HSDPA Bit Rate (kbps) with 50% DL loading (UE Category 12) | Table 47 | ≥ 1100 | ≥ 1100 | N.A
HSDPA Average DL Throughput (kbps) for UE Category 12 under no load | Table 48 | ≥ 700 | ≥ 700 | ≥ 700 (at off peak)
HSDPA Average DL Throughput (kbps) for UE Category 12 under 50% DL loading | Table 48 | ≥ 600 | ≥ 600 | N.A
HSUPA Stationary Peak throughput (kbps) for UE Category 3 (1.45 Mbps) | Table 50 | ≥ 1100 | ≥ 1100 | Measure and Report
HSUPA Stationary Average throughput (kbps) for UE Category 3 (1.45 Mbps) | Table 50 | ≥ 500 | ≥ 500 | Measure and Report
*HSDPA Session Activation time | Table 51 | N.A | 95th percentile sessions ≤ 5 seconds | N.A


*Average Cell Availability KPI | Table 53 | N.A | N.A | N.A**
*Average Cell Non-Maintenance Availability KPI | Table 55 | N.A | N.A | N.A**
GSM Performance Degradation due to Introduction of UMTS | Page 39 | Cell ≤ 10 %, Cluster ≤ 7.5 %, Market ≤ 5 % | Cell ≤ 10 %, Cluster ≤ 7.5 %, Market ≤ 5 % | Cell ≤ 10 %, Cluster ≤ 7.5 %, Market ≤ 5 %

* Seller shall commit to these KPI during the life of the Network as described in Section 2.4 of this document

** KPI Applicable one (1) year after UMTS Network commercial Launch.

2 Performance Commitments and Requirements

2.1 Milestone Definition

Milestone description is provided in the UMTS System Acceptance (Appendix U16).

Three milestones are defined in this Appendix U12 as follows:

1. Milestone-1 (MS1): Pre-launch. Pre-launch is defined as the completion of Cluster Acceptance testing. The KPI target levels will be based on drive tests and not OSS statistics.

2. Milestone-2 (MS2): Launch. Launch is defined as the completion of Market Acceptance testing of the initial Network configuration. The KPI target levels shall be based on drive tests and not OSS statistics.

3. Milestone-3 (MS3): Post-Launch. Post-Launch is defined as six (6) months after Launch. The KPI target levels shall be based on OSS statistics at Market Level or Drive Test data if there is not sufficient traffic, as defined in UMTS System Acceptance (Appendix U16) in the UMTS Network.

2.2 Scope

The scope in this Appendix U12 is to define applicable UMTS Network KPIs for Systems and Equipment supplied by Seller.

2.3 General Requirements

Performance Management Measurement Entity shall be available at Cell, Node-B, Cluster, RNC, Market, Region and Network level for every KPI unless otherwise stated in this Appendix U12. RNC Measurement Granularity shall be a minimum period of fifteen (15) minutes unless otherwise stated in this Appendix U12.

All tests for KPI purposes shall be performed in Mobile (vehicle drive) environment unless otherwise specified.

IRAT Hard Handover Feature shall be switched off during KPI verification within the UMTS Service Area at Pre-Launch. At the Launch milestone, the IRAT HHO failure rate at the 3G-2G RF Service boundary shall be verified with one dedicated drive per Market unit.

The tests shall be performed under a multi-supplier environment, in which nodes from different suppliers may be used (e.g. SGSN, GGSN, MSC, MGW etc). In the event that a particular test is not passed and Seller claims the problem is with the other supplier, Seller shall clearly demonstrate that the element in question is the source of the problem. If the problem is confirmed to be in a third-party supplier’s Equipment then Seller shall re-schedule the individual test cases only after the problem has been resolved by the third-party supplier.

The initial UTRAN parameter set shall be mutually agreed. No UTRAN parameter shall be changed during Cluster/Market Acceptance tests without prior approval of Purchaser as defined in the UMTS System Acceptance (Appendix U16). As terminal performance may also have an impact on the performance, the brand and model of terminal to be used for the tests shall be mutually agreed to by Seller and Purchaser.

Counter based statistics for Level-1 KPI shall cover all traffic classes and include all causes of failures/successes to indicate end to end performance as viewed by UTRAN.

2.4 Contractual KPI Commitments

The following KPIs and targets shall be valid for the lifetime of the Network, provided that there is a Support Agreement in place for the Seller provided Systems.

Table 2: KPI Parameter Commitment for the lifetime of the UMTS Network

SN KPI Parameter Target

1 CSV Call Setup Time (Mobile to PSTN) ≤ 6 seconds

2 CSV Call Setup Time (Mobile to Mobile) ≤ 9 seconds

3 CSD Call Setup time (Mobile to Mobile) ≤ 9 seconds

4 PSD Latency - any R99 RAB ≤ 200 ms

5 PSD Call Setup Time ≤ 5 seconds

6 HSDPA Latency ≤ 100ms

7 HSDPA Data Session Setup Time ≤ 5 seconds

8 Cell Availability As per Table 53

9 Cell Availability (Non-Maintenance) As per Table 55

The UMTS Network is expected to meet or exceed the KPI targets specified in this Appendix U12 within the UMTS Service Area with or without loading the UMTS Network.

Seller commitment applies to Level-1 KPI for achieving target values at the milestone applicable to KPI.

KPI counter metrics shall be based on UTRAN view of Network Performance. Seller shall provide all Level-1 KPI counters used in the formulas defined in this Appendix U12 at Milestone-2 with the exception of SRB counters for Access Failure. Seller shall provide all Level-1 KPI reports for KPI Acceptance.

Purchaser shall define KPI Level-2 requirements as outlined in Section 4.

3 UTRAN Level-1 Key Performance Indicators

This Section describes the UTRAN Level-1 KPI, or high-level service metrics. Level-1 KPI are sub-divided into two basic service domains: the Circuit Switched (CS) domain, which includes Voice and Video calls, and the Packet Switched (PS) domain, which includes R99 Packet Data and HSDPA/HSUPA Data Sessions.

3.1 Circuit Switched Level-1 Key Performance Indicators

The two (2) types of services in the CS domain are Circuit Switched Voice (CSV) and Circuit Switched Data (CSD).


3.1.1 Circuit Switched Call Setup

The setup of a CS service is comprised of three basic steps (see Figure 1). First, the UE must access the UTRAN and establish an RRC connection. Once this connection is completed, Non Access Stratum (NAS) messages are exchanged between the UE and the Core Network (CN), e.g. CM Service Request, Authentication, Security, etc. The last step of the call setup is the establishment of a Radio Access Bearer (RAB) between the CN and the UE.

Figure 1: Call Setup Block Diagram

Figure 2 illustrates a call setup flow diagram for a CS mobile-originated call. If a problem occurs between the RRC Connection Request (step 1) and the RANAP: RAB Assignment Response (step 27), or the Alerting/Connect message (step 29) in the case of a drive test, it is considered an Access Failure. Any RAB abnormal release after the RAB Assignment Response (Counter Formula) or the Alerting/Connect message (Drive Test Formula) is considered a dropped call.


Figure 2: Circuit Switched Call Setup Flow Diagram

3.1.2 RRC Establishment Causes

Table 1 provides a list of RRC establishment causes. When the UE attempts to set up an RRC Connection with the UTRAN, the UE sends an RRC Connection Request message. Embedded within this message is an information element that communicates the establishment cause for the connection request. This establishment cause shall be utilized to differentiate the CS and PS call KPI related to Access Failure (an illustrative grouping is sketched after Table 1). Separate counters for each traffic class for every RRC Connection Request and RRC Connection Complete (as per 3GPP) shall be made available in the Software release at the time of commercial launch (Milestone 2).

Table 1: RRC Establishment Causes

RRC Establishment Cause Description

0 Originating Conversational Call

1 Originating Streaming Call

2 Originating Interactive Call

3 Originating Background Call

4 Originating Subscribed traffic Call

5 Terminating Conversational Call

6 Terminating Streaming Call

7 Terminating Interactive Call

8 Terminating Background Call

9 Emergency Call

10 Inter-RAT Cell re-selection

11 Inter-RAT Cell change order

12 Registration

13 Detach

14 Originating High Priority Signaling

15 Originating Low Priority Signaling

16 Call re-establishment

17 Terminating High Priority Signaling

18 Terminating Low Priority Signaling

19 Terminating – cause unknown
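The following is an illustrative, non-contractual sketch (in Python) of how the establishment causes in Table 1 could be grouped to split RRC Connection Requests between CS and PS KPI streams. The grouping of conversational/streaming causes to CS and interactive/background causes to PS is an assumption for illustration only.

    # Illustrative only: grouping of RRC establishment causes (Table 1) for KPI filtering.
    # The CS/PS grouping below is an assumption, not a contractual definition.
    RRC_ESTABLISHMENT_CAUSES = {
        0: "Originating Conversational Call", 1: "Originating Streaming Call",
        2: "Originating Interactive Call", 3: "Originating Background Call",
        4: "Originating Subscribed traffic Call", 5: "Terminating Conversational Call",
        6: "Terminating Streaming Call", 7: "Terminating Interactive Call",
        8: "Terminating Background Call", 9: "Emergency Call",
        10: "Inter-RAT Cell re-selection", 11: "Inter-RAT Cell change order",
        12: "Registration", 13: "Detach",
        14: "Originating High Priority Signaling", 15: "Originating Low Priority Signaling",
        16: "Call re-establishment", 17: "Terminating High Priority Signaling",
        18: "Terminating Low Priority Signaling", 19: "Terminating - cause unknown",
    }
    CS_CAUSES = {0, 1, 5, 6}   # conversational/streaming: assumed to lead to CS RABs
    PS_CAUSES = {2, 3, 7, 8}   # interactive/background: assumed to lead to PS RABs

    def domain_for_cause(cause):
        """Classify an RRC establishment cause for KPI filtering."""
        if cause in CS_CAUSES:
            return "CS"
        if cause in PS_CAUSES:
            return "PS"
        return "OTHER"  # signalling, registration, emergency, IRAT, etc.

    for cause in (0, 3, 12):
        print(cause, RRC_ESTABLISHMENT_CAUSES[cause], "->", domain_for_cause(cause))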

3.1.3 CSV Access Failure

Access failure is the product of the RRC Connection Failure, NAS Setup Failure and the RAB Establishment Failure. RRC Connection Success is counted when the RNC receives an RRC Setup Complete from the UE. NAS Setup is considered successful when the signaling messages in the call setup flow are successfully completed by the relevant Network Elements. A RAB is considered successfully established when the RAB Assignment Response is sent by the RNC to the CN.


Equation 1: CSV Access Failure Rate (Counter Formula)

CSV Access Failure Rate (%) = [1 – (#RRC Connection Success (CSV) / #RRC Connection Attempt (CSV)) × (#SRB Success / #SRB Attempt) × (#RAB Assignment Response (CSV) / #RAB Assignment Request (CSV))] × 100

The above formula definition shall be met with the corresponding Seller counters.

Seller shall develop SRB counters by RAN Release P8. Seller and Purchaser agreed to use 99.93% SRB success rate until SRB counters per CS and PS domain are made available at P8.

As an interim solution, assuming a drop rate of 2% over an average call length of 90 seconds, the probability of dropping the SRB during the direct transfer phase (approximately 3 seconds) is 3 s / 90 s × 2% = 0.07%. If fault rates higher than 0.1% are observed during deployment, Seller shall develop counters in P7.

Measurement Condition:

In case of multiple RRC connection requests only the first RRC connection request shall be taken into consideration for KPI calculation.

Table 2: CSV Access Term Definition

Key Performance Indicator Term Definition

#RRC Connection Success (CSV) The number of successful RRC Connection Completes with any establishment cause that leads to a CSV RAB.

#RRC Connection Attempt (CSV) The number of RRC Connection Request Messages received with any establishment cause that leads to a CSV RAB.

#RAB Assignment Response (CSV) The number of RANAP: RAB Assignment Response messages sent from the RNC to the MSC for a voice call.

#RAB Assignment Request (CSV) The number of RANAP: RAB Assignment Request messages sent from the MSC to the RNC to establish a voice call.

# SRB Attempt The number of RRC Connection Completes that will lead to an Initial Direct Transfer to the CS domain and will also lead to a CSV RAB Assignment Request.

# SRB Success The number of CSV RAB Assignment Requests

Equation 2: CSV Access Failure Rate (Drive Test Formula)


Measurement Condition:

In the event that there are several consecutive RRC Connection Requests, only the first RRC Connection Request shall be taken into account for the KPI calculation.

Calls are considered failed during access when a connection is attempted and the UE does not receive the Alerting/Connect message, in the case of drive-test measurement. For OSS statistics, the RANAP: RAB Assignment Response message from the RNC to the Core Network is considered a successful call setup.

Table 3: CSV Access Failure Rate KPI

Pre-Launch Launch Post-Launch

≤ 2.0 % ≤ 2.0 % ≤ 2.0 %
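As a minimal, non-contractual sketch, the counter-based CSV Access Failure Rate described above can be computed as one minus the product of the RRC, SRB and RAB success ratios; the function and the input values below are illustrative only, and the 99.93% SRB success rate may be assumed until SRB counters are available.

    def csv_access_failure_rate(rrc_success, rrc_attempt,
                                srb_success, srb_attempt,
                                rab_response, rab_request):
        """CSV Access Failure Rate in percent (sketch of the counter formula in 3.1.3)."""
        rrc_ratio = rrc_success / rrc_attempt
        srb_ratio = srb_success / srb_attempt   # 99.93 % may be assumed until SRB counters exist
        rab_ratio = rab_response / rab_request
        return (1.0 - rrc_ratio * srb_ratio * rab_ratio) * 100.0

    # Illustrative numbers: 9,900/10,000 RRC successes, assumed SRB ratio, 9,880/9,900 RABs
    print(round(csv_access_failure_rate(9900, 10000, 9993, 10000, 9880, 9900), 2))  # ~1.27 %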

3.1.4 CSV Drop

Circuit Switched Voice Drop measures the Network’s inability to maintain a call. CSV Drop is defined as the ratio of abnormal speech disconnects, relative to all speech disconnects (both normal and abnormal).

A normal disconnect is initiated by a RAB Disconnect RANAP message from the MSC at the completion of the call. An abnormal RAB disconnect includes Radio Link Failures, UL or DL interference or other reason and can be initiated by either UTRAN or CN.

Equation 3: CSV Drop Rate (Counter Formula)

CSV Drop Rate (%) = #RABAbnormalRelease(CSV) / (#RABAbnormalRelease(CSV) + #RABNormalRelease(CSV)) × 100

Using Seller’s counters:

CSV Drop Rate (%) = pmNoSystemRabReleaseSpeech / (pmNoSystemRabReleaseSpeech + pmNoNormalRabReleaseSpeech) × 100

Counter | Description
pmNoSystemRabReleaseSpeech | Number of system RAB releases (Speech 12.2k and AMR Narrow Band) for the best cell in the Active Set.
pmNoNormalRabReleaseSpeech | Number of normal RAB releases (Speech 12.2k and AMR Narrow Band) for the best cell in the Active Set.

Table 4: CSV Drop Term Definition

Key Performance Indicator Term Definition

#RABNormalRelease(CSV) Number of voice RAB normally released

#RABAbnormalRelease(CSV) Number of voice RAB abnormally released

Equation 4: CSV Drop Rate (Drive Test Formula)


Measurement Condition:

Any CSV call successfully handed over from the 3G to the 2G network within the CSV Service Area shall be considered a dropped call for Milestone 2.

Table 5: CSV Drop Rate KPI

Pre-Launch Launch Post-Launch

≤ 2.0 % ≤ 2.0 % ≤ 2.0%
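A minimal sketch of the counter-based CSV Drop Rate in Section 3.1.4, assuming abnormal (system) speech RAB releases divided by all speech RAB releases; the counter values below are invented for illustration.

    def csv_drop_rate(pm_no_system_rab_release_speech, pm_no_normal_rab_release_speech):
        """CSV Drop Rate in percent for one cell/aggregation period."""
        total = pm_no_system_rab_release_speech + pm_no_normal_rab_release_speech
        return 100.0 * pm_no_system_rab_release_speech / total

    print(csv_drop_rate(15, 985))  # 1.5 %, against the <= 2.0 % target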

3.1.5 CSV Quality

Voice quality shall be measured by BLER.

Equation 5: CSV UL Quality (Counter Formula)

CSV UL BLER (%) = #Faulty Transport Blocks in Uplink after Selection Combining (Speech) / #Total Transport Blocks in Uplink after Selection Combining (Speech) × 100

UL BLER is also available as an average over the whole Reporting period (15 minutes) from RNC-based counters for the following Radio Connection configurations (UeRc):

i (value of UeRc) | Radio Connection Configuration
2 | Conversational CS Speech (12.2/12.2 kbps)
33 | Conversational CS Speech AMR (7.95/7.95 kbps)
34 | Conversational CS Speech AMR (5.9/5.9 kbps)
35 | Conversational CS Speech AMR (4.75/4.75 kbps)

Measurement results are available every 15 minutes.

Measurement Condition:

UL quality shall be measured in all three (3) Milestones (Pre-Launch, Launch and Post Launch)

This KPI shall be measured at 1% BLER operating point set in UTRAN for voice call.

UL BLER is collected and presented in the MRR-W (Measurement Results Recording WCDMA) feature in OSS-RC. Reporting period can be set to between 2 to 64 seconds according to standard. Counters are presented in a PDF format in MRR-W. In 2 seconds there are 100 blocks. Resolution for the PDF is 0.5%. Note that using very short sampling intervals for BLER measurements will result in low accuracy for each sample.

Table 6: CSV UL Quality Terms Definition

Key Performance Indicator Term Definition

#Faulty Transport Blocks in Uplink after SelectionCombining (Speech)

Number of faulty Uplink DCH transport blocks for speech after selection and combining. Sampling period shall be every ten (10) seconds.

#Total Transport Blocks in Uplink after SelectionCombining (Speech)

Total number of Uplink DCH transport blocks for speech after selection and combining. Sampling period shall be every ten (10) seconds.

Equation 6: CSV DL Quality (Counter Formula)


CSV DL BLER (%) = #Faulty Transport Blocks in Downlink after Combining (Speech) / #Total Transport Blocks in Downlink after Combining (Speech) × 100

CPI for RES is found under RAN Performance Management, Radio Environment Statistics: User Description 106/1553-HSD 101 02/5 Uen B.

Measurement Condition:

DL quality shall be measured in all three (3) Milestones (Pre-Launch, Launch and Post Launch)

This KPI shall be measured at 1% BLER operating point set in UTRAN for voice call.

BLER shall be available using counters with Cell Level granularity. KPI for MS3 shall be calculated at Market Level. For MS3, DL BLER shall be measured using RES counters or drive test as required.

DL BLER is reported by the mobile according to 3GPP specification. Measurement reports are collected and presented in the MRR-W feature in OSS-RC. Reporting period can be set to between 2 to 64 seconds according to standard. Counters are presented in a PDF format in MRR-W. In 2 seconds there are 100 blocks.

Resolution for the PDF is 0.5%. Note that using very short sampling intervals for BLER measurements will result in low accuracy for each sample.

Table 7: CSV DL Quality Terms Definition

Key Performance Indicator Term Definition

#Faulty Transport Blocks in Downlink after Combining (Speech)

Number of faulty Downlink DCH transport blocks for speech after combining. Sampling period shall be every two (2) seconds or less.

#Total Transport Blocks in Downlink after Combining (Speech)

Total number of Downlink DCH transport blocks for speech after combining. Sampling period shall be every two (2) seconds or less.

Equation 7: CSV DL Quality (Drive Test Formula)

Measurement Condition:

DL quality shall be measured in all three (3) milestones (Pre-Launch, Launch and Post Launch)

This KPI shall be measured at 1% BLER operating point set in UTRAN for voice call. The downlink BLER value shall be collected from the UE having reporting period of 2 seconds and aggregated over a period of 10 seconds in Post Processing Tool for calculation of KPI. Any fraction of 10 seconds shall also be included in this KPI calculation. The BLER calculation shall be made from the time when Connect message is received.

Table 8: CSV Quality KPI

KPI | Pre-Launch | Launch | Post Launch
CSV Quality (DL) | 95th percentile of samples ≤ 2.0 % BLER | 95th percentile of samples ≤ 2.0 % BLER | 95th percentile of samples ≤ 2.0 % BLER
CSV Quality (UL) | 95th percentile of samples ≤ 2.0 % BLER | 95th percentile of samples ≤ 2.0 % BLER | 95th percentile of samples ≤ 2.0 % BLER
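The 95th-percentile BLER criterion can be evaluated from an MRR-W style PDF (histogram) with 0.5 % bin resolution. The sketch below is illustrative, with invented bin counts, and uses one simple percentile convention.

    def percentile_from_pdf(bin_counts, bin_width=0.5, percentile=95.0):
        """BLER value (upper bin edge) below which `percentile` % of the samples fall."""
        total = sum(bin_counts)
        threshold = total * percentile / 100.0
        cumulative = 0
        for i, count in enumerate(bin_counts):
            cumulative += count
            if cumulative >= threshold:
                return (i + 1) * bin_width
        return len(bin_counts) * bin_width

    # Bins: [0-0.5), [0.5-1.0), [1.0-1.5), [1.5-2.0), [2.0-2.5), [2.5-3.0) % BLER
    samples_per_bin = [8200, 1100, 400, 200, 70, 30]
    p95 = percentile_from_pdf(samples_per_bin)
    print(p95, "% BLER ->", "PASS" if p95 <= 2.0 else "FAIL")  # 1.5 % -> PASS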

3.1.6 CSV Cell Capacity

Equation 8: CSV Cell Capacity (Counter Formula)

Note the above is an approximation as Soft/Softer Handover Overhead includes both CS and PS connections.

Measurement Condition:

CSV Cell Capacity shall be calculated based on average DL power allocated to AMR 12.2 RB for this KPI. The above formula is for the purpose of calculating Cell Capacity not a guideline for RF Planning/Design purposes.

Table 9: Cell Capacity Definition

Key Performance Indicator Term Definition

CSV CellCapacity Average number of Voice Calls (AMR 12.2) that can be supported by cell

CellTrafficPower Net DL Power available in a cell to carry user traffic and is equal to total cell carrier RF power (at system reference point) minus Common/signaling channel power as per RF Planning Guideline/Link Budget

AvgRL PerCall Average number of Radio Link used to support a Voice Call (AMR 12.2)

AvgPowerPerRL Average RF Power used for a Radio Link to support Voice Call (AMR 12.2)

Equation 9: Cell Capacity (Drive Test Formula)

Table 10: Cell Capacity KPI

Pre-Launch | Launch | Post-Launch | Units
≥ 40 (provided it is achieved in RF design) | ≥ 40 (provided it is achieved in RF design) | Measure and report | # of AMR 12.2 UEs

Measurement Condition:

CSV Cell Capacity shall be calculated based on average DL power allocated to AMR 12.2 RB for this KPI.

The whole of Cluster/Market drive data shall be used to calculate this KPI.
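A minimal sketch of the Cell Capacity estimate, assuming (per Table 9) that capacity is the DL traffic power divided by the average power consumed per AMR 12.2 call (average power per radio link times average radio links per call); the power values below are illustrative assumptions, not planning guidance.

    def csv_cell_capacity_ue(cell_traffic_power_w, avg_power_per_rl_w, avg_rl_per_call):
        """Approximate number of simultaneous AMR 12.2 calls a cell can carry."""
        power_per_call_w = avg_power_per_rl_w * avg_rl_per_call
        return cell_traffic_power_w / power_per_call_w

    # Example: 16 W of DL traffic power, 0.25 W per radio link, 1.6 radio links per call
    print(round(csv_cell_capacity_ue(16.0, 0.25, 1.6), 1))  # 40.0 UE, matching the >= 40 target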

3.1.7 CSV Soft/Softer Handover Overhead

Soft/Softer Handover Overhead KPI provides an indication of how many Cells or Sectors were in the active set during the call on an average basis.

Equation 10: Soft/Softer Handover Overhead (Counter Formula)

The counters for this KPI measure Soft/Softer Handover Overhead not only for CS connections but also for PS connections.

Measurement Condition:

The counters have a sampling rate of 1 minute and the counters should be aggregated over a sufficiently long period for statistical validity.

Table 11: Soft/Softer Handover Overhead definition

Key Performance Indicator Term Definition

M Maximum Active Set size as defined in RNC

AS Active Set size

Duration(AS) Duration of the call in Active Set size of AS (AS=1 to M)

Equation 11: Soft/Softer Handover Overhead (Drive Test Formula)

Table 12: Soft/Softer Handover Overhead KPI


Pre-Launch Launch Post-Launch

≤ 1.6 ≤ 1.6 ≤ 1.7
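A minimal sketch of the Soft/Softer Handover Overhead, assuming (per Table 11) that it is the duration-weighted average Active Set size over the measurement period; the durations below are invented.

    def sho_overhead(duration_per_as):
        """Average Active Set size; duration_per_as maps AS size (1..M) to seconds spent."""
        total_time = sum(duration_per_as.values())
        weighted = sum(as_size * t for as_size, t in duration_per_as.items())
        return weighted / total_time

    # 50 % of the time 1 cell, 40 % of the time 2 cells, 10 % of the time 3 cells
    print(sho_overhead({1: 500.0, 2: 400.0, 3: 100.0}))  # 1.6, at the <= 1.6 target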

3.1.8 CSV Inter-Radio Access Technology Handover Failure

Inter-Radio Access Technology (IRAT) Hard Handover Failure rate from the UMTS to the GSM System for voice calls.

Equation 12: CSV IRAT Failure Rate (Counter Formula)

Figure 3: IRAT HO Call Flow – CSV Call


pmNoSuccessOutIratHoSpeech(GsmRelation): the trigger is when IU RELEASE COMMAND is received with cause ‘Normal release’ or ‘Successful relocation’ and based on the CS RAB state.

pmNoAttOutIratHoSpeech(GsmRelation): this counter is increased when RNC sends HANDOVER FROM UTRAN COMMAND.

The above formula captures speech 12.2k and AMR narrow band.

Measurement Condition:

In order to verify CSV IRAT HO performance, some prerequisites have to be fulfilled, such as definition of the IRAT strategy and setting up service priorities. Both 3G and 2G Networks shall have IRAT neighbors defined. The GSM Network shall have available resources without showing congestion. A mutually agreed test UE (with the latest available Software, Firmware and Equipment) shall be used for the IRAT KPI verification. The 2G and 3G Networks shall be properly configured in accordance with Purchaser’s Network design, including the definition of the routing tables through the Core Network. PURCHASER shall inform Seller about major changes in the configuration of the GSM Network (frequency re-plan, Cell parameter changes, etc.) that will degrade the 3G IRAT performance. Seller may review Purchaser’s GSM Network changes/planned activities before, during or after the IRAT KPI verification drive.

CSV IRAT Hard Handover Failure rate shall be measured by a specific drive per Market Unit. The specific drive shall be around UMTS and GSM service boundary. CSV IRAT HHO failure KPI does not include HHO Preparation failure. The HHO Preparation failure accounts for configuration mismatch and availability of resources in the GSM System.

Table 13: CSV IRAT Term Definition

Key Performance Indicator Term Definition

CSV3G2GHandoverFailureRate Hard Handover failure rate when a Voice Call fail to handover from UMTS network to GSM network

#HandoverFromUTRAN failure The number of Handover from UTRAN Failure messages sent from UE to RNC

#HandoverFromUtranCommand The number of Handover from UTRAN command sent by RNC to UE

Equation 13: CSV IRAT Failure Rate (Drive Test Formula)

Trigger for Handover from UTRAN Command is when the Command is sent to the UE.

Trigger for Handover from UTRAN Failure is when it is received from the UE.

Measurement Condition:

CSV IRAT HHO failure KPI does not include HHO Preparation failure. Specific drive route shall be identified at the UMTS RF Service Area boundary to verify this KPI.

Table 14: CSV IRAT Failure Rate KPI

Pre-Launch Launch Post-Launch

N/A ≤ 5.0 % ≤ 5.0%
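A minimal sketch of the drive-test CSV IRAT failure rate per Table 13 (Handover from UTRAN Failure messages over Handover from UTRAN Commands); the counter-based variant using pmNoSuccessOutIratHoSpeech and pmNoAttOutIratHoSpeech follows the same pattern. The numbers are illustrative.

    def csv_irat_failure_rate(handover_failures, handover_commands):
        """IRAT HHO failure rate in percent (HHO preparation failures excluded)."""
        return 100.0 * handover_failures / handover_commands

    print(csv_irat_failure_rate(3, 80))  # 3.75 %, against the <= 5.0 % Launch target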

3.1.9 CSV Call Setup Time

Voice call set up time indicates Network response time to a user request for a voice service. This KPI is applicable only for drive test.

Equation 14: CSV Call Setup Time (Drive Test Formula)

Voice Call setup time = CC_Alerting(MOC) – RRC Connection Request(MOC)

Measurement Condition:

For mobile to mobile call, calling UE shall be mobile and called UE shall be stationary.

Table 15: CSV Call Setup time KPI

KPI | Pre-Launch | Launch | Post Launch
Voice Call Setup time (Mobile to PSTN) | 95th percentile ≤ 6 seconds | 95th percentile ≤ 6 seconds | 95th percentile ≤ 6 seconds
Voice Call Setup time (Mobile to Mobile) | 95th percentile ≤ 9 seconds | 95th percentile ≤ 9 seconds | 95th percentile ≤ 9 seconds

Seller agrees to the 95th percentile provided the impact of non UTRAN Equipment is within the industry typical value range.
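A minimal sketch of the drive-test evaluation of CSV Call Setup Time: per-call setup time is CC Alerting minus RRC Connection Request, and the KPI is the 95th percentile over all test calls. The timestamps below are invented, and the nearest-rank percentile is one convention among several.

    import math

    def setup_times(calls):
        """calls: iterable of (t_rrc_connection_request, t_cc_alerting) pairs in seconds."""
        return [t_alert - t_rrc for t_rrc, t_alert in calls]

    def percentile(values, pct=95.0):
        """Nearest-rank percentile."""
        ordered = sorted(values)
        rank = math.ceil(pct / 100.0 * len(ordered))
        return ordered[rank - 1]

    calls = [(0.0, 4.8), (10.0, 15.1), (20.0, 25.9), (30.0, 34.7), (40.0, 45.5)]
    p95 = percentile(setup_times(calls))
    print(p95, "s ->", "PASS" if p95 <= 6.0 else "FAIL")  # Mobile-to-PSTN target: <= 6 s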

3.1.10 CSD Access Failure

Access failure is the product of the RRC Connection Failure, NAS Setup Failure and the RAB Establishment Failure. RRC Connection Success is counted when the RNC receives an RRC Setup Complete from the UE. NAS Setup is considered successful when the signaling messages in the call setup flow are successfully completed by the relevant Network Elements. A RAB is considered successfully established when the RAB Assignment Response is sent by the RNC to the CN.

Equation 15: CSD Access Failure Rate (Counter Formula)

CSD Access Failure Rate (%) = [1 – (#RRCConnectionSuccess / #RRCConnectionAttempt) × (#SRBSuccess / #SRBAttempt) × (#RABAssignmentResponse (CSD) / #RABAssignmentRequest (CSD))] × 100

The counter-based formula includes both Conversational CS 64/64 kbps and Streaming CS 57.6/57.6 kbps.

Seller shall develop SRB counters by RAN Release P8. Seller and Purchaser agreed to use 99.93% SRB success rate until SRB counters per CS and PS domain are made available at P8.

As an interim solution, assuming a drop rate of 2% over an average call length of 90 seconds, the probability of dropping the SRB during the direct transfer phase (approximately 3 seconds) is 3 s / 90 s × 2% = 0.07%. If fault rates higher than 0.1% are observed during deployment, Seller shall develop counters in P7.

Table 16: CSD Access Term Definition

Key Performance Indicator Term Definition

#RRCConnectionSuccess The number of successful RRC Connection Setups with Conversational and Streaming Call Establishment Causes (both originating and terminating).

#RRCConnectionAttempt The number of RRC Connection Request Messages received with Conversational and Streaming Call Establishment Causes (both originating and terminating).

#RABAssignmentResponse (CSD) The number of RANAP: RAB Assignment Response messages sent from the RNC to the MSC for CSD.

#RABAssignmentRequest(CSD) The number of RANAP: RAB Assignment Request messages sent from the MSC to the RNC to establish a CSD call.


# SRBAttempt The number of RRC Connection Complete that will lead to an Initial Direct Transfer to the CS domain and will also lead to a CSD RAB Assignment Request.

# SRBSuccess The number of CSD RAB Assignment Requests

Equation 16: CSD Access Failure Rate (Drive Test Formula)

Measurement Condition:

In case there are several consecutive RRC Connection Requests only the first RRC connection request will be taken into account for the KPI calculation.

Table 17: CSD Access Failure Rate KPI

Pre-Launch Launch Post-Launch

≤ 3.0% ≤ 2.0 % ≤ 2.0 %

3.1.11 CSD Drop

Equation 17: CSD Drop (Counter Formula)

CSD Drop Rate (%) = #RABAbnormalRelease(CSD) / (#RABAbnormalRelease(CSD) + #RABNormalRelease(CSD)) × 100

The counter-based formula includes both Conversational CS 64/64 kbps and Streaming CS 57.6/57.6 kbps.

Table 18: CSD Drop Term Definition

Key Performance Indicator Term Definition

#RABNormalRelease(CSD) Number of video RABs (Conversational and streaming) normally released

#RABAbnormalRelease(CSD) Number of video RABs (Conversational and streaming) abnormally released

Equation 18: CSD Drop (Drive Test Formula)


Table 19: CSD Drop Rate KPI

Pre-Launch | Launch | Post-Launch
≤ 2.5 % | ≤ 2.5 % | ≤ 3.0 % (Counter method); ≤ 2.5 % (Drive Test method)

3.1.12 CSD Quality

This equation is similar to that of CSV; however it is specific for CSD RAB.

Equation 19: CSD Quality (UL)

CSD UL BLER (%) = #FaultyTransportBlocksInUplinkAfterSelectionCombining(CSD) / #TotalTransportBlocksInUplinkAfterSelectionCombining(CSD) × 100

UL BLER is also available from RNC-based counters for the following Radio Connection configurations (UeRc):

i (value of UeRc) | Radio Connection Configuration
3 | Conversational CS 64/64 kbps
8 | Streaming CS 57.6 kbps

Measurement Condition:

UL quality shall be measured in all three (3) Milestones (Pre-Launch, Launch and Post Launch)

Measurement results are available every 15 minutes. BLER shall be available using counters with Cell Level granularity.

UL BLER is collected and presented in the MRR-W (Measurement Results Recording WCDMA) feature in OSS-RC. Reporting period can be set to between 2 to 64 seconds according to standard. Counters are presented in a PDF format in MRR-W. In 2 seconds there are 100 blocks. Resolution for the PDF is 0.5%. Note that using very short sampling intervals for BLER measurements will result in low accuracy for each sample.

Equation 20: CSD DL Quality (Counter Formula)

CSD DL BLER (%) = #FaultyTransportBlocksInDownlinkAfterCombining(CSD) / #TotalTransportBlocksInDownlinkAfterCombining(CSD) × 100

DL BLER is reported by the mobile according to 3GPP specification. Measurement reports are collected and presented in the MRR-W feature in OSS-RC. Reporting period can be set to between 2 to 64 seconds according to standard. Counters are presented in a PDF format in MRR-W. In 2 seconds there are 100 blocks.

Resolution for the PDF is 0.5%. Note that using very short sampling intervals for BLER measurements will result in low accuracy for each sample.

CPI for RES is found under RAN Performance Management, Radio Environment Statistics.

Measurement Condition:

DL quality shall be measured in all three (3) Milestones (Pre-Launch, Launch and Post Launch with optional feature for UE to report DL BLER measurement)

BLER shall be available using counters with Cell Level granularity. KPIs at MS3 shall be calculated at Market Level.

Table 20: CSD Quality Terms Definition


Key Performance Indicator Term Definition

#FaultyTransportBlocksInUplink AfterSelectionCombining(CSD)

Number of faulty Uplink DCH transport blocks for CSD after selection and combining. Sampling period shall be every ten (10) seconds

#TotalTransportBlocksInUplinkAfterSelectionCombining(CSD)

Total number of Uplink DCH transport blocks for CSD after selection and combining. Sampling period shall be every ten (10) seconds .

#FaultyTransportBlocksInDownlinkAfterCombining(CSD)

Number of faulty Downlink DCH transport blocks for CSD after combining. Sampling period shall be every ten (10) seconds

#TotalTransportBlocksInDownlinkAfterCombining(CSD)

Total number of Downlink DCH transport blocks for CSD after combining. Sampling period shall be every ten (10) seconds

Equation 21: CSD Quality DL (Drive Test Formula)

Measurement Condition:

DL quality shall be measured in all three (3) Milestones (Pre-Launch, Launch and Post Launch)

Table 21: CSD Quality KPI

KPI | Pre-Launch | Launch | Post Launch
CSD Quality (DL) | 95th percentile of samples ≤ 1 % BLER | 95th percentile of samples ≤ 1 % BLER | 95th percentile of samples ≤ 1 % BLER
CSD Quality (UL) | 95th percentile of samples ≤ 1 % BLER | 95th percentile of samples ≤ 1 % BLER | 95th percentile of samples ≤ 1 % BLER

This KPI will be measured at 0.3 % BLER operating point set in UTRAN for CS Video Call.

3.1.13 CSD Call Setup time

Circuit Switched Data call set up time indicates Network response time to a user request for a video service. The test shall be initiated by making a video call from one UMTS UE to another UMTS UE. In case of multiple RRC connection requests the first RRC connection request will be taken into account for KPI calculation.

Equation 22: Mobile to Mobile Video call (Drive Test Formula)

Video Call setup time = CC_Alerting (MOC) – RRC Connection Request (MOC)

Table 22: CSD Call Setup time KPI

Parameter Pre-Launch Launch

Video Call Setup time N.A 95th percentile ≤ 9 seconds

Seller agrees to the 95th percentile and 9 seconds provided the impact of non UTRAN Equipment is within the industry typical value range.

3.2 Packet Switched Level-1 Key Performance Indicators

R99 DL RAB allowed: 384 kbps, 128 kbps, and 64 kbps only.
R99 UL RAB allowed: 384 kbps and 64 kbps only.
HSDPA DL HS-DSCH allowed: UE Category 12 (1800 kbps).
HSUPA UL allowed: UE Category 3 (1.450 Mbps).

3.2.1 Packet Switched Call Setup

Figure 4 is a flow diagram for a PS call. If a problem occurs between RRC Connection Request (step 1) and the RAB Assignment Response (step 27), the result is considered Access Failure. Any RAB abnormal release after the RAB Assignment Response is considered a call Drop.

Figure 4: Packet Switched Setup Flow Diagram

3.2.2 PSD Access Failure

The aim is to measure Packet Switch Data access failure from a user perspective.

Equation 23: PSD Access Failure Rate (Counter Formula)


PSD Access Failure Rate (%) = [1 – (#RRCConnectionComplete / #RRCConnectionRequest) × (#SRBSuccess / #SRBAttempt) × (#RABAssignmentResponse (PSD) / #RABAssignmentRequest (PSD))] × 100

The following formula includes Packet Interactive and Packet Background. Please refer to Section 6 of the CPI document User Description Connection Handling 4/1553-HSD 101 02/5 Uen.


R99InteractiveRABEstablishSuccessRate equals:

and,

HS2_HardHO_Flow(UtranCell) = pmOutgoingHsHardHoAttempt(UtranCell) – pmNoHsHardHoReturnOldChSource(UtranCell) – pmIncominHsHardHoAttempt(UtranCell)

Seller shall develop SRB counters by RAN Release P8. Seller and Purchaser agreed to use 99.93% SRB success rate until SRB counters per CS and PS domain are made available at P8.

As an interim solution, assuming a drop rate of 2% over an average call length of 90 seconds, the probability of dropping the SRB during the direct transfer phase (approximately 3 seconds) is 3 s / 90 s × 2% = 0.07%. If fault rates higher than 0.1% are observed during deployment, Seller shall develop counters in P7.

Measurement Condition:

Both Interactive and Background traffic are included in the counter based formula above.

Table 23: PSD Access Failure Term Definition

Key Performance Indicator Term Definition

#RRCConnectionComplete The number of successful RRC Connection Setups with PSD Establishment Causes (both originating and terminating).

#RRCConnectionRequest The number of RRC Connection Request Messages received with PSD Establishment Causes (both originating and terminating).

#RABAssignmentResponse (PSD) The number of RANAP: RAB Assignment Response messages sent from the RNC to the SGSN PS service.

#RABAssignmentRequest(PSD) The number of RANAP: RAB Assignment Request messages sent from the SGSN to the RNC to establish PS Service.

#SRBAttempt The number of RRC Connection Completes that will lead to an Initial Direct Transfer to the PS domain and will also lead to a PS (R99) RAB Assignment Request.

#SRBSuccess The number of PS R99 RAB Assignment Requests.


Equation 24: PSD Access Failure Rate (Drive Test Formula)

Measurement Condition:

Access Failure Rate shall be measured by data session activation (PS) followed by download. In case of multiple RRC connection requests the first RRC connection request will be considered for KPI calculation.

Table 24: PSD Access Failure Rate KPI

Pre-Launch Launch Post-Launch

≤ 2.0 % ≤ 2.0 % ≤ 2.0 %

3.2.3 PSD Drop

Packet session is considered as dropped when associated RAB has been released abnormally by either UTRAN or CN. Any drop after RANAP: RAB Assignment Response is considered as PS drop call.

Equation 25: PSD Drop Rate (Counter Formula)

PSD Drop Rate (%) = #RABDrop / #RABSetupComplete × 100

The formula above calculates drop rate for Packet Interactive and Packet Background (including HSDPA).

Measurement Condition:

Traffic classes Background, and Interactive (both originating and terminating) shall be considered in counter-based metrics.

Table 25: PSD Drop Term Definition

Key Performance Indicator Term Definition

#RABSetupComplete The number of completed RAB setup phase for PS Data, when the RNC sends RANAP: RAB Assignment Responses to Core Network after a successful RAB establishment

#RABDrop The number of RAB Drop for PS data calls

Equation 26: PSD Drop Rate (Drive Test Formula)

Measurement Condition:


PSD Drop Rate shall be measured after the start of an FTP download. A PSD call is considered dropped when the FTP session is manually or abnormally disconnected for any reason without completing the file transfer.

Table 26: PSD Drop Rate KPI

Pre-Launch Launch Post-Launch

≤ 2.5 % ≤ 2.0% ≤ 2.0 %

3.2.4 PSD Latency

The following is extracted from 3GPP Specification 23.107, Section 6.4.3.1 for reference:

Transfer delay (ms)

Definition: Indicates maximum delay for 95th percentile of the distribution of delay for all delivered SDUs during the lifetime of a bearer service, where delay for an SDU is defined as the time from a request to transfer an SDU at one SAP to its delivery at the other SAP.

NOTE 3: Transfer delay of an arbitrary SDU is not meaningful for a bursty source, since the last SDUs of a burst may have long delay due to queuing, whereas the meaningful response delay perceived by the user is the delay of the first SDU of the burst.

Table 27: PSD Latency Term Definition

Key Performance Indicator Term Definition

PSD Latency Round trip time for a 32 Bytes ping for any R99 PS RAB

Measurement Condition:

Latency will be measured with the destination server for ping connected directly to the GGSN (i.e. the server on the same Intranet domain as GGSN). RTT will be measured in stationary position. This test will be done with a single UE with R99 RAB PS call with Ping application.

Table 28: PSD Latency KPI

Latency at Launch | RAB
95th percentile ≤ 200 ms | R99 RAB (64 kbps, 128 kbps or 384 kbps)

Seller agrees to the 95th percentile and the 200ms provided the impact of non UTRAN Equipment is within the industry typical value range.

3.2.5 PSD Throughput

Total number of RLC blocks sent over the observation window including re-transmission per transport type.

Equation 27: PSD Average Throughput (Counter Formula)

The above formula is applicable for both Downlink and Uplink


Measurement Condition:

This metric shall be measured only for Packet Switch Traffic Class (Interactive and Background). KPI Measurement shall be based on RLC (SDU) layer throughput with five percent (5%) target for DL/UL BLER for R99 RAB.

Table 29: PSD Average Throughput Term Definition

Key Performance Indicator Term Definition

AM_RLC_SDU_Data(kb) Total AM RLC SDUs (kb) transferred excluding re-transmission in the downlink or uplink

DlUserThroughput Downlink Average Packet data throughput (kbps)

AM_RLC_SDU_Duration The total RLC SDU transmission duration in seconds. For DL/UL, this excludes the period when the DL/UL transmission buffer for the RLC entity is empty

Equation 28: PSD Average Throughput (Drive Test Formula)

Formula is valid for uplink and downlink.

Measurement Condition:

KPI Measurement shall be for application layer throughput with five-percent (5%) target DL/UL BLER operating point for R99 RAB. Throughput shall be calculated using data from the entire Cluster/ Market drive test for Milestone 1 and Milestone 2.

The download/upload file shall be compressed type.

Table 30: PSD Average Throughput Definition

Key Performance Indicator Term Definition

UserDataTransferred(kb) FTP download in kilo-bit during one session.

SessionDuration(sec) Time duration (seconds) to download/upload a file.

PSD_Throughput(kbps) Packet data throughput in kbps using R99 RAB measured at application layer.

Table 31: PSD Average DL Throughput KPI

KPI | Pre-Launch | Launch | Post Launch
Average DL throughput (Unloaded) kbps | ≥ 240 | ≥ 240 | ≥ 150 (Counter); ≥ 240 (Drive Test)
Average DL Throughput (Loaded) kbps | ≥ 210 | ≥ 210 | ≥ 150 (Counter); ≥ 210 (Drive Test)

Table 32: PSD Average UL Throughput KPI

KPI | Pre-Launch | Launch | Post Launch
Average UL throughput (Unloaded) kbps | ≥ 200 | ≥ 200 | ≥ 150 (Counter); ≥ 200 (Drive Test)
Average UL Throughput (Loaded) kbps | ≥ 180 | ≥ 180 | ≥ 150 (Counter); ≥ 180 (Drive Test)

Measurement Condition:

KPI Measurement shall be for application layer throughput with five-percent (5%) BLER operating point for both UL and DL. Throughput shall be calculated using data from the entire Cluster/ Market drive test for Milestone 1 and Milestone 2 using all available R99 Radio Bearer.

Table 33: PSD Average Throughput Definition

Key Performance Indicator Term Definition

UserDataTransferred(kb) Total ftp size download in kilo-bit in one session

SessionDuration(sec) Total time (seconds) to download a single file: SessionDuration(sec) = (time stamp of Session End or Session Error – time stamp of Session Start)

PSDAvgThroughput Packet Switched Data average throughput using R99 RAB measured at application layer
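A minimal sketch of the drive-test throughput calculation per Tables 30 and 33: kilobits transferred divided by session duration (Session Start to Session End or Session Error); the FTP session records below are invented.

    def session_throughput_kbps(user_data_kbit, session_duration_s):
        """Average application-layer throughput of one FTP session in kbps."""
        return user_data_kbit / session_duration_s

    sessions = [(8000.0, 31.0), (16000.0, 64.0), (4000.0, 15.5)]  # (kbit, seconds)
    rates = [session_throughput_kbps(kb, s) for kb, s in sessions]
    print([round(r, 1) for r in rates])               # per-session kbps
    print(round(sum(rates) / len(rates), 1), "kbps")  # simple average across sessions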

3.2.6 PSD Call Setup Time

PSD call setup time indicates Network response time to a user request for a packet data service. In case of multiple RRC connection requests the first RRC connection request will be considered for KPI calculation.

Equation 29: PSD Call Setup Time (Drive Test Formula)

PS Call setup time = PDP Context Activation accept (MOC) - RRC Connection Request (MOC)

Measurement Condition:

The UE is already attached to the UTRAN network.

Table 34: PSD Call Setup time KPI

KPI | Pre-Launch | Launch | Post Launch
PSD Session Activation time | N.A | 95th percentile sessions ≤ 5 seconds | N.A

Seller agrees to the 95th percentile and the 5 seconds provided the impact of non-UTRAN Equipment is within the industry typical value range.

3.2.7 PSD Inter-Radio Access Technology Handover Failure

Inter-Radio Access Technology (IRAT) is a hard handover between UMTS and GSM.

For Packet Switched handover, the RNC sends a Cell Change Order from UTRAN command to the UE. A successful handover can be monitored by the RNC Iu Release message from the CN.

Equation 30: PSD IRAT Failure Rate (counter formula)


pmNoOutIratCcReturnOldCh is increased when the CELL CHANGE ORDER FROM UTRAN FAILURE (RRC) message is received from the UE.
pmNoOutIratCcAtt is increased when the CELL CHANGE ORDER FROM UTRAN (RRC) message has been sent to the UE.

Measurement Condition:

PSD IRAT HHO failure KPI does not include HHO Preparation failure.

In order to verify PSD IRAT HO performance, some prerequisites have to be fulfilled such as: definition of IRAT strategy, setting up services priorities. Both 3G and 2G Networks shall have IRAT neighbors defined. The GSM Network shall have available resources without showing congestion. Mutually agreed test UE (with the latest available Software, Firmware and Equipment) shall be used for the IRAT KPI verification. The 2G and 3G Networks shall be properly configured in accordance with Purchaser’s Network design including the definition of the routing tables through the Core Network. PURCHASER shall inform Seller about major changes in the configuration of the GSM Network (frequency re-plan, Cell parameter changes, etc) that will degrade the 3G IRAT performance. Seller may review Purchaser’s GSM Network changes/planned activities before, during or after IRAT KPI verification drive.

Table 35: PSD IRAT Failure Term Definition

Key Performance Indicator Term Definition

3G2GPSHandoverFailureRate Hard Handover failure rate when a PS Data call (R99) fail to handover from UMTS network to GSM network

#CellChangeOrderFromUTRAN Failure(PS) RRC: The number of Cell Change Order from UTRAN Failure messages from UE to RNC

#CellChangeOrderFromUTRAN Command(PS) RRC: The number of Cell Change Order Messages from RNC to UE

Equation 31: PSD IRAT Failure Rate (Drive Test Formula)

RRCCellChangeOrderFromUTRANFailure_UE message is sent by the UE.
RRCCellChangeOrderFromUTRANCommand_RNC message is sent by the RNC.

Measurement Condition:

PSD IRAT HHO failure KPI does not include HHO Preparation failure. Specific drive route shall be identified at the UMTS RF Service Area boundary to verify this KPI.

Table 36: PSD IRAT Failure Rate KPI

Pre-Launch Launch Post-Launch

N/A ≤ 5.0 % ≤ 5.0 %

3.2.8 PSD IRAT Interruption time


This KPI is an indicator of interruption time for the packet switch data during Inter-Radio Access Technology hard handover.

Equation 32: PSD IRAT Interruption time (Drive Test Formula)

PSD_IRATInterruptionTime = (TimeRAUpdateComplete_UE – TimeCellChangeOrder_RNC)

Equation 33: PSD IRAT User Data Interruption time (Drive Test Formula)

PSD_IRATUserDataInterruptionTime = (TimeFirstPacketDataReceivedIn2G – TimeLastPacketDataReceivedIn3G)

Measurement Condition:

This KPI is applicable for dedicated drive test route on 3G-2G RF service border area during Market Acceptance.

The interruption time shall be measured during 3G to 2G PSD HHO.

Table 37: PSD IRAT Interruption Definition

Key Performance Indicator Term Definition

PSD_IRATInterruptionTime Duration of interruption to Packet data service during Hard Handover

TimeRAUpdateComplete_UE The timestamp in drive test Tool when UE sends Routing Area Update complete message in the uplink to the 2G SGSN

TimeCellChangeOrder_RNC The timestamp in drive test Tool when UE receives Cell Change Order message from RNC in the downlink

Table 38: PSD IRAT User Data Interruption Definition

Key Performance Indicator Term Definition

PSD_IRATUserDataInterruptionTime Duration in seconds to interruption of Packet data service from User perspective during IRAT Hard Handover from 3G to 2G

TimeFirstPacketDataReceivedIn2G Timestamp in drive test Tool when first data is Received in 2G System after successful IRAT Handover.

TimeLastPacketDataReceivedIn3G Timestamp in drive test Tool when UE receives last Packet Data in 3G System before IRAT Handover.

Table 39: PSD IRAT Interruption time KPI

KPI Term At Launch

PSD IRAT Interruption time 95th percentile ≤ 12 seconds

PSD IRAT User Data Interruption time Measure and Report

Seller agrees to the 95th percentile and the 12 seconds provided the impact of non-UTRAN Equipment is within the industry-typical value range.

3.2.9 HSDPA Access Failure


This test shall be done with a single UE performing an HSDPA call in the Cell under test with a 384/64 kbps associated DCH on the uplink.

Equation 34: HSDPA Access Failure Rate (Counter Formula)


Seller shall develop SRB counters by RAN Release P8. Seller and Purchaser have agreed to use a 99.93% SRB success rate until SRB counters per CS and PS domain are made available at P8.

As an interim solution, the SRB success rate shall be derived from an assumed drop rate of 2% over an average call length of 90 seconds: the probability of dropping the SRB during the direct transfer phase (approximately 3 s) is 3 s / 90 s * 2% ≈ 0.07%, i.e. a 99.93% SRB success rate. If fault rates higher than 0.1% are observed during deployment, Seller shall develop the counters in P7.
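A one-line check of the interim figure (the 3-second direct transfer duration is taken from the text above):

# Interim SRB assumption: 2 % drop rate over a 90 s average call,
# with the direct transfer phase lasting about 3 s.
srb_drop_probability = (3.0 / 90.0) * 0.02   # ~0.000667
srb_success_rate = 1.0 - srb_drop_probability
print(f"{srb_drop_probability:.2%}  {srb_success_rate:.2%}")  # 0.07 %  99.93 %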

Table 40: HSDPA Access Failure Term Definition

Key Performance Indicator Term Definition

#RRCConnectionComplete The number of successful RRC Connection Setups with Packet Establishment Causes (both originating and terminating).

#RRCConnectionRequest The number of RRC Connection Request Messages received with Packet Establishment Causes (both originating and terminating).

#RABAssignmentResponse (HS-DSCH)

The number of RANAP: RAB Assignment Response messages sent from the RNC to the SGSN for the PS service.

#RABAssignmentRequest(HS-DSCH) The number of RANAP: RAB Assignment Request messages sent from the SGSN to the RNC to establish HSDPA Service.

#SRBAttempt The number of RRC Connection Completes that will lead to an Initial Direct Transfer to the PS domain and will also lead to a HSDPA RAB Assignment Request.

#SRBSuccess The number of HSDPA RAB Assignment Requests

Equation 35: HSDPA Access Failure Rate (Drive Test Formula)

Measurement Condition:

In case of multiple RRC connection requests the first RRC connection request will be considered for KPI calculation.


Access Failure Rate shall be measured by data session activation (HSDPA) followed by download.
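The exact expression of Equation 35 is not reproduced above; the sketch below assumes the common structure of multiplying the per-stage success ratios built from the Table 40 terms, and should be read as an illustration rather than the contractual formula.

def hsdpa_access_failure_rate(rrc_conn_complete, rrc_conn_request,
                              srb_success, srb_attempt,
                              rab_resp_hs, rab_req_hs):
    # Assumed structure: an access succeeds only if the RRC connection, the
    # SRB phase and the HS-DSCH RAB assignment all succeed, so the failure
    # rate is 100 * (1 - product of the three success ratios).
    success_ratio = ((rrc_conn_complete / rrc_conn_request)
                     * (srb_success / srb_attempt)
                     * (rab_resp_hs / rab_req_hs))
    return 100.0 * (1.0 - success_ratio)

# Example drive counts: result is ~1.2 %, i.e. within the <= 2.0 % target of Table 41.
print(hsdpa_access_failure_rate(995, 1000, 990, 992, 985, 990))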

Table 41: HSDPA Access Failure Rate KPI

Pre-Launch Launch Post-Launch

≤ 2.0 % ≤ 2.0 % ≤ 2.0 %

3.2.10 HSDPA Drop

An HSDPA session is considered dropped when the associated HS-DSCH has been released abnormally by either the UTRAN or the CN. This test will be done with a single UE performing an HSDPA call in the Cell under test with a 384/64 kbps associated DCH on the uplink and HS-DSCH in the DL, with downgrade/upgrade to/from an R99 RAB.

Equation 36: HSDPA Drop Rate (Counter Formula)

HSDPA Drop Rate = 100 * pmNoSystemRbReleaseHs / (pmNoRabEstablishSuccessPacketInteractiveHs + pmDlUpswitchSuccessHs + pmNoRabEstablishSuccessPacketInteractiveEul + pmUlUpswitchSuccessEul)

pmNoSystemRbReleaseHs: Number of successful system releases of packet RABs mapped on HS-DSCH in the Serving HS-DSCH cell. The counter is stepped for the Serving HS-DSCH cell at RAB/RB combination transition from PS Interactive 64/HS - HS-DSCH to SRB-DCH or to Idle mode due to the same reasons as for stepping the existing counter pmNoSystemRabReleasePacket. pmNoSystemRabReleasePacket is only increased due to a RANAP Iu Release Command or RAB Assignment Request message with "release cause" = anything except 'Normal Release', 'Successful Relocation', 'Resource Optimisation Relocation', 'User Inactivity' or 'release-due-to-UE-generated-signalling-connection-release'. This counter is incremented for the best cell in the Active Set in the SRNC and, when releasing an HS RAB, this counter is stepped for the Serving HS-DSCH cell.

pmNoRabEstablishSuccessPacketInteractiveHs: The number of successful RAB establishments for PS Interactive RAB mapped on HS-DSCH. The counter is stepped for the selected Serving HS-DSCH cell at RAB establishment after the successful transition from SRB-DCH to PS Interactive 64/HS - HS-DSCH.

pmDlUpswitchSuccessHs: Number of successful DL upswitches to any HS state. The counter is stepped for successful DL upswitch to a RB combination containing HS. The counter is incremented in all cells of the active set.

pmNoRabEstablishSuccessPacketInteractiveEul: The number of successful RAB establishments for PS Interactive RAB mapped on E-DCH/HSDPA. Counter is stepped for the Serving E-DCH cell at successful RAB/RB combination transition to PS Interactive E-DCH/HS - HS-DSCH due to RAB establishment. The counter is triggered after sending of RAB Assignment Response (successful).

pmUlUpswitchSuccessEul: Number of successful up-switches, triggered by UL user activity, to a target RB combination E-DCH/HSDPA. The counter is stepped in the target cell at a successful up-switch triggered by UL user activity.
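A minimal sketch of the counter-based drop rate as reconstructed in Equation 36 above; the per-counter values are invented for illustration.

def hsdpa_drop_rate(pm_no_system_rb_release_hs,
                    pm_no_rab_establish_success_packet_interactive_hs,
                    pm_dl_upswitch_success_hs,
                    pm_no_rab_establish_success_packet_interactive_eul,
                    pm_ul_upswitch_success_eul):
    # Abnormal HS releases divided by all successful entries into an HS state,
    # following the counter descriptions above.
    hs_entries = (pm_no_rab_establish_success_packet_interactive_hs
                  + pm_dl_upswitch_success_hs
                  + pm_no_rab_establish_success_packet_interactive_eul
                  + pm_ul_upswitch_success_eul)
    return 100.0 * pm_no_system_rb_release_hs / hs_entries if hs_entries else 0.0

print(hsdpa_drop_rate(14, 600, 250, 120, 30))  # -> 1.4 %, within the Launch target of Table 43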

Measurement Condition:


The Seller counter-based metrics formula shall be updated as new functionality is introduced in the System.

Table 42: HSDPA Drop Term Definition

Key Performance Indicator Term Definition

HS-DSCH_ReleaseDueToFailure Number of HS-DSCH allocation releases due to radio link and other failures

HS-DSCH_AllocationSuccess Number of allocations when the RNC has received RRC: Radio Bearer Re-Configuration Complete message from the UE after successful HS-DSCH MAC-d flow setup

Equation 36: HSDPA Drop Rate (Drive Test Formula)

HSDPA Drop Rate = 100 * HS-DSCH_ReleaseDueToFailure / HS-DSCH_AllocationSuccess

Measurement Condition:

HSDPA Drop Rate shall be measured by data session activation (HSDPA) followed by FTP download.

Table 43: HSDPA Drop Rate KPI

Pre-Launch Launch Post-Launch

≤ 3.0 % ≤ 2.0 % ≤ 2.0 %

3.2.11 HSDPA Latency

Table 44: HSDPA Latency Term Definition

Key Performance Indicator Term Definition

HSDPA Latency Round trip time of a 32-byte ping for an HSDPA NRT RAB

Measurement Condition:

Latency shall be measured with the destination server for the ping connected directly to the GGSN (i.e. the server is on the same Intranet domain as the GGSN). RTT shall be measured in a stationary test. This test shall be done with a single UE performing an HSDPA call in the Cell under test.
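A sketch of checking the Table 45 target from a set of ping round-trip times; the sample values are hypothetical and the nearest-rank percentile method is an assumption, since the contract does not specify one.

# Hypothetical 32-byte ping round-trip times (ms) from the stationary test.
rtt_ms = [62, 58, 71, 64, 90, 66, 59, 73, 68, 61,
          80, 65, 70, 63, 77, 69, 72, 60, 95, 67]

def percentile_95(samples):
    # Nearest-rank 95th percentile of the sample set.
    ordered = sorted(samples)
    rank = max(1, round(0.95 * len(ordered)))
    return ordered[rank - 1]

print(percentile_95(rtt_ms), percentile_95(rtt_ms) <= 100)  # 90 True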

Table 45: HSDPA Latency KPI

KPI Latency at Launch

HSDPA Latency 95th percentile sessions ≤ 100 ms

Seller agrees to the 95th percentile and the 100 ms provided the impact of non-UTRAN Equipment is within the industry-typical value range.

3.2.12 HSDPA Throughput

Total number of RLC blocks sent over the observation window including re-transmission.

Equation 39: HSDPA Throughput (Counter Formula)


Key Performance Indicator Term Definition

AM_RLC_PDU_Data(kb) Total AM RLC PDUs (kilo-bit) transferred excluding re-transmission in the uplink

AM_RLC_PDU_Duration The total RLC PDU transmission duration (in seconds)

Table 46: HSDPA Throughput Term Definition

Key Performance Indicator Term Definition

AM_RLC_PDU_Data(kb) Total AM RLC PDUs (kilo-bit) transferred excluding re-transmission in the downlink or uplink

AM_RLC_PDU_Duration The total RLC PDU transmission duration (in seconds). For downlink this duration excludes the period when the downlink transmission buffer for the RLC entity is empty

Equation 40: HSDPA Throughput (Drive Test Formula)

Measurement Condition:

Measurement shall be for Application Layer bit rate. The HSDPA throughput is only applicable when the UE is on HS-DSCH.
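A minimal sketch of the throughput computation implied by the Table 46 terms (kilobits transferred divided by active transmission time); the example numbers are invented.

def am_rlc_throughput_kbps(am_rlc_pdu_data_kb, am_rlc_pdu_duration_s):
    # Kilobits transferred (excluding retransmissions) over the active
    # RLC transmission duration in seconds gives kbps.
    return am_rlc_pdu_data_kb / am_rlc_pdu_duration_s if am_rlc_pdu_duration_s else 0.0

print(am_rlc_throughput_kbps(84_000, 60.0))  # -> 1400.0 kbps, above the 1300 kbps stationary target of Table 47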

Table 47: Stationary Maximum DL HSDPA Bit Rate (kbps) KPI

Pre-Launch Launch Post-Launch Condition (1800 kbps using UE category 12)

≥ 1300 ≥ 1300 ≥ 1300 Under no load condition (at RLC Layer using counter)

≥ 1100 ≥ 1100 N.A 50% of Available Power at System Reference Point allocated to HSDPA

Measurement Condition:

UE shall be in Stationary location under excellent RF condition (CPICH RSCP ≥ -80 dBm and CPICH Ec/No ≥ - 8 dB).


The download file shall be of a compressed type.

Measurement shall be based on FTP file download of minimum ten (10) MB file to reduce the impact of TCP slow start.

Measurement shall be at application layer throughout the HSDPA Service Area.

Table 48: Average DL Throughput (kbps) KPI

Pre-Launch Launch Post-Launch Condition (1.8 Mbps using UE category 12)

≥ 700 ≥ 700 ≥ 700 Under no load condition (at off peak)

≥ 600 ≥ 600 N.A 50% of Available Power at System Reference Point allocated to HSDPA

Measurement Condition:

UE shall be in a mobile environment within the designed HSDPA Service Area. The supported DL HS-DSCH rate during the drive test is 1800 kbps for UE Category 12. The Cell shall be loaded in such a way that HSDPA is allocated fifty percent (50%) of the available power at the System Reference Point. The remaining 50% of available DL power at the System Reference Point shall include common channel power, the load due to one drive test UE (Voice) and the load simulated by OCNS speech mode. There shall be one UE making short Voice Calls (AMR 12.2k) in the same cell as the HSDPA call to monitor the impact of HSDPA on the voice user. Seller shall provide a measurement report on the short voice calls for Setup Failure and Call Drop Rate as defined in Section 3.1.3 and Section 3.1.4 respectively. The average DL throughput measured during the entire drive test route within the Cluster/Market Area shall be based on HS-DSCH Application Layer throughput. Measurement shall be based on drive test with FTP file download of a minimum ten (10) MB file to reduce the impact of TCP slow start.

HSDPA Service Area shall be same as PSD 64k Service Area.

3.2.13 HSUPA Throughput

Total number of RLC blocks sent over the observation window excluding re-transmission.

Equation 39: HSUPA Throughput (Counter Formula)

pmSumAckedBitsCellEul:    The number of Media Access Control Enhanced Uplink (Eul) bits received and acknowledged by the RBS.

pmNoActive10msFramesEul:    The number of 10ms frames containing enhanced uplink data transmitted by the UE
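A hedged sketch of how the two counters above might combine into a cell throughput figure: acknowledged EUL bits divided by the active transmission time, with each active frame lasting 10 ms. The combination and the example values are assumptions.

def hsupa_cell_throughput_kbps(pm_sum_acked_bits_cell_eul, pm_no_active_10ms_frames_eul):
    # Acknowledged MAC EUL bits over the active transmission time (10 ms frames).
    active_seconds = pm_no_active_10ms_frames_eul * 0.010
    return (pm_sum_acked_bits_cell_eul / active_seconds) / 1000.0 if active_seconds else 0.0

print(hsupa_cell_throughput_kbps(720_000_000, 60_000))  # -> 1200.0 kbps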

Table 49: HSUPA Throughput Term Definition


Key Performance Indicator Term Definition

AM_RLC_PDU_Data(kb) Total AM RLC PDUs (kilo-bit) transferred excluding re-transmission in the uplink

AM_RLC_PDU_Duration The total RLC PDU transmission duration (in seconds).

Equation 40: HSUPA Throughput (Drive Test Formula)

Measurement Condition:

Measurement shall be at Application Layer bit rate. The upload file shall be of a compressed type.

HSUPA Service Area shall be same as PSD 64k Service Area.

UE shall be in Stationary location under excellent RF condition (CPICH RSCP ≥ -80 dBm and CPICH Ec/No ≥ -8 dB).

Table 50: HSUPA Stationary UL Bit Rate (kbps) KPI

KPI Pre-Launch Launch Post-Launch (UE Category 3, 1.45 Mbps)

Stationary Peak Throughput (kbps) under no load ≥ 1100 ≥ 1100 Measure and Report

Average Throughput (kbps) under no load ≥ 500 ≥ 500 Measure and Report

3.2.14 HSDPA Data Session Setup Time

HSDPA Data Session setup time indicates Network response time to a user request for an HSDPA data service.

Equation 41: HSDPA session definition (Drive Test Formula)

HSDPA Data Session setup time = PDP Context Activation accept (MOC) - RRC Connection Request (MOC)

Measurement Condition:

This test shall be done with a single UE performing an HSDPA call in the Cell under test. The UE shall already be attached to the UTRAN Network.
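A short sketch of checking the Table 51 Launch target; the setup time samples (PDP Context Activation Accept minus RRC Connection Request, per Equation 41) are hypothetical, as is the nearest-rank percentile choice.

# Hypothetical HSDPA data session setup times in seconds.
setup_times_s = [2.1, 2.4, 1.9, 3.0, 2.2, 2.8, 2.5, 2.0, 4.1, 2.3,
                 2.6, 1.8, 2.7, 2.9, 3.3, 2.2, 2.4, 2.1, 3.8, 2.5]

ordered = sorted(setup_times_s)
p95 = ordered[max(1, round(0.95 * len(ordered))) - 1]  # nearest-rank 95th percentile
print(p95, p95 <= 5.0)  # 3.8 True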

Table 51: HSDPA Data Session Setup time KPI

KPI Pre-Launch Launch

HSDPA Session Activation time N.A 95th percentile sessions ≤ 5 seconds

Seller agrees to the 95th percentile and the 5 seconds provided the impact of non-UTRAN Equipment is within the industry-typical value range.

3.3 System Availability

System Availability is defined as the percentage of time the Network can handle one hundred per-cent (100%) of the traffic it is designed for as measured at Cell Level.


The purpose of this metric is to calculate the total amount of time (in percentage) out of the total operating time the RNC and Node-B are available to carry commercial traffic. Minimum granularity for KPI purposes is total loss of traffic at Cell level.

Seller is responsible only for the Sub-System supplied to Purchaser: RNC and Node-B. This excludes Antenna Systems, Transport Systems, Power/Battery Backup, non-Seller Core Network.

The loss of traffic at Cell Level can be due to one or more of the following reasons:

1. Equipment failures

2. Software failures

3. Seller originated (accidental, misuse, reset, etc.)

4. Planned events authorized by Seller (Software upgrade, Equipment upgrade, Parameter change, etc.)

For Average Cell Availability, Seller shall be responsible for service degradation due to 1, 2, 3 (caused by Seller’s personnel or sub-contractors of Seller) and 4.

For Average Cell Non-Maintenance Availability, Seller shall be responsible for service degradation due to 1, 2 and 3 (caused by Seller’s personnel or sub-contractors of Seller).

The level of aggregation for this metric is Purchaser’s entire UMTS Network for which Seller has supplied the Equipment and Software (RNC and Node-B). The alarm aggregation for this metric shall be performed on a daily basis. The System Availability KPI for Seller shall be calculated on a yearly basis for the purpose of achieving the KPI target, starting from the Network commercial Launch, for RNC and Node-B Equipment. Seller shall provide all the necessary alarm details to assist Purchaser in realizing the System Availability KPI metrics for the Sub-Systems supplied by Seller (RNC and Node-B). Seller shall recommend to Purchaser which alarms to use to measure these KPIs.

Daily System Availability shall be measured as follows:

3.3.1 Average Cell Availability

Equation 37: Average Cell Availability formula

Using Alarms and System notifications, the following formula shall apply:

Table 52: Average Cell Availability Term Definition

Key Performance Indicator Term Definition

TotalCellDowntime(sec) Total duration (in seconds) of Node-B Cells within the RNC unable to carry traffic due to planned or non-planned events caused by the RNC or Node-B. The duration shall be calculated as the time when the Alarm/System notification clears minus the time when the Alarm/System notification triggers.

TotalSectorCount Total number of active Node-B Cells within RNC

AvgCellAvailability Average Node-B Cells availability to carry user traffic calculated as a percentage of total time
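The body of Equation 37 is not reproduced above; the sketch below assumes the usual normalisation of total Cell downtime against total Cell-time over the daily aggregation period (24 hours), which is an assumption rather than the contractual definition.

def avg_cell_availability(total_cell_downtime_sec, total_sector_count,
                          period_sec=24 * 3600):
    # Assumed form: downtime summed over all Cells in the RNC, normalised by
    # the total Cell-time available in the daily aggregation period.
    total_cell_time = total_sector_count * period_sec
    return 100.0 * (1.0 - total_cell_downtime_sec / total_cell_time)

# Example: 900 Cells and 380 Cell-seconds of downtime in one day -> ~99.9995 %
print(avg_cell_availability(380, 900))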

Table 53: Average Cell Availability KPI

Average Cell Availability KPI Sub-System RAN Release

≥ 99.95 (99.7) % RNC + Node-B P5


≥ 99.95 (99.8) % RNC + Node-B P6

≥ 99.95 (99.9) % RNC+ Node-B P7

≥ 99.95 (>99.9)% RNC+ Node-B ≥P8

3.3.2 Average Cell Non-Maintenance Availability

Equation 38: Average Cell Non-Maintenance Availability Formula

Table 54: Average Cell Non-Maintenance Availability Term Definition

Key Performance Indicator Term Definition

TotalCellNonMaintenanceDowntime(sec) Total duration (in seconds) of Node-B Cells within an RNC not able to carry any traffic during normal operation (excluding planned events) due to an RNC or Node-B Equipment reason. The duration shall be calculated as the time when the Alarm/System notification clears minus the time when the Alarm/System notification triggers.

TotalSectorCount Total number of active Node-B Cells within RNC

AvgCellNonMaintenanceAvailability Average Node-B Cell availability calculated as a percentage of available time at Cell level

Table 55: Average Cell Non-Maintenance Availability KPI

Average Cell Non-Maintenance Availability KPI Sub-System RAN Release

≥ 99.995 (99.7) % RNC + Node-B P5

≥ 99.995 (99.8) % RNC + Node-B P6

≥ 99.995 (99.9) % RNC+ Node-B P7

≥ 99.995 (>99.9) % RNC+ Node-B ≥P8

3.4 Comparison of UMTS and GSM Network Performance

Seller shall collect some 2G measurements defined below at the same time as 3G measurements for all Cluster Acceptance and all Market Acceptance drives.

The 2G measurements collected by Seller from a 2G terminal in the vehicle executing a pre-defined call sequence are defined below:

a. Voice Access Failure rate

b. Voice Drop Rate

c. Voice Quality; MOS

The 2G terminal used shall be specified by Purchaser and shall be a commercially available terminal used by Purchaser’s 2G Customers. Call sequences used by the 2G terminals shall be identical to the call sequences used by the 3G terminals. To the extent possible, the number of 2G calls made shall be the same as the number of 3G calls made.


Test cases for 2G terminals executed to derive the above stated KPI must be executed at the same time within the test vehicle as the test cases to derive the same KPI for the 3G terminals.

It is a requirement for Cluster Acceptance and Market Acceptance that the 3G KPIs are met. If the collected 2G KPIs do not meet the 3G KPI targets, then the 3G results shall be better than or equal to the obtained 2G results, using the same call profiles and call sequences. If the measured 2G KPIs exceed the measured 3G KPIs, the Seller shall only have to meet the 3G targets.

3.5 Impact of UMTS on GSM Network Performance

The deployment of the UMTS Network shall not degrade Purchaser’s existing GSM Network performance. Excluding the IRAT Hard Handover Feature, both UMTS and GSM Networks are considered independent of each other. This section addresses any concern related to UMTS Equipment installation on GSM Sites and related faults for which Seller has responsibility that could degrade GSM performance.

The KPIs shall be collected for the entire underlying 2G Site(s) relevant to the 3G Node-B:

MOU/drop for Voice Calls

Access failure rate for voice calls

Daily (24 hour) Voice Erlang Traffic

Seller shall ensure that the introduction of the UMTS Network does not degrade Purchaser’s existing GSM Network performance by more than ten percent (10%) at Cell Level, seven and a half percent (7.5%) at Cluster Level, or five percent (5%) at Market Level. The method of calculation shall be as described in the UMTS Systems Acceptance (Appendix U16).

4 UTRAN Level-2 Key Performance Indicators

Purchaser shall provide Seller a document describing the Level-2 counter-based KPI requirements. Seller shall provide Purchaser all the available counters down to the lowest cause level for RNC and Node-B. Seller shall provide (with a collaborative effort by Purchaser) a document that describes the Level-2 KPIs (formulas and counters) within ten (10) weeks after receipt of the Level-2 requirements from Purchaser. Any deviations from the above mentioned time schedule shall be mutually agreed.

For counters that are not likely to be available at the time of commercial launch (Milestone 2), Seller shall mutually agree with Purchaser (within three (3) months of execution of the Fifth (5th) Amendment) on a reasonable roadmap to develop the required counters.

Performance Management Measurement Entity shall be available at Cell, Node-B, Cluster, RNC, Market, Region and Network level for every KPI unless otherwise stated in Level-2 KPIs.

RNC Measurement Granularity shall be a minimum period of fifteen (15) minutes unless otherwise stated in the Level-2 KPIs.
