Post on 25-Jun-2015
Performance Evaluation
Chris Blondia, Nick Vercammen
Performance Evaluation Cycle

- System under Evaluation / Performance measures
- System Model: environment, system
- Evaluation Model: input (e.g. traces), system model, evaluation method (analytical, simulation or experimental)
- Derive Performance Measures: compute, simulate or measure
- Relate results to the System Model
- Use evaluation results in the system
Example: Influence of Network Congestion on Video Quality

- Streaming video is transmitted through a congested network
- Packets of the video get lost due to buffer overflow
- What is the influence of the packet loss on the video quality?
Performance Evaluation Cycle: System under Evaluation
System under Evaluation / Performance Measures

Performance measures:
- Network: packet loss, number of consecutive packets lost, number of admitted streams
- Content: type of packet lost, synchronization
- User perception

[Diagram: video server → congested network (buffer overflow) → end user]
Performance Evaluation Cycle: System Model
System Model

[Diagram: a tagged video source and background traffic share a router queue; packet loss on overflow]

Congestion can be modelled by:
- Background traffic
- Variable service time in the router
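The congestion model above can be exercised with a small slot-based simulation. This is a minimal sketch under assumed parameters (Bernoulli arrivals for both the tagged source and the background traffic, one packet served per slot, a finite buffer); none of the numbers come from the deck:

```python
import random

def simulate_loss(slots=100_000, buffer_size=20, p_tagged=0.2, p_bg=0.7, seed=1):
    """Slot-based sketch: a tagged source and background traffic share one
    finite buffer served at one packet per slot; return the tagged-packet
    loss ratio due to buffer overflow."""
    random.seed(seed)
    queue = 0          # packets currently buffered
    sent = lost = 0    # tagged-packet counters
    for _ in range(slots):
        # background arrival first, then the tagged arrival
        if random.random() < p_bg:
            queue = min(queue + 1, buffer_size)   # background overflow dropped silently
        if random.random() < p_tagged:
            sent += 1
            if queue < buffer_size:
                queue += 1
            else:
                lost += 1                          # buffer overflow: tagged loss
        if queue > 0:
            queue -= 1                             # serve one packet per slot
    return lost / sent

print(simulate_loss())
```

Varying `p_bg` or `buffer_size` reproduces the qualitative effect the deck studies: loss grows with the background load and shrinks with the buffer.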
Performance Evaluation Cycle: Evaluation Model and Determine Performance Measures
Analytical Approach

- Superposition of X sources, approximated by a simpler arrival process
- Finite buffer (B < ∞)
- Derive the packet loss probability

[Figure: frame-size trace of the "bond" sequence, bits/frame vs frame number, distinguishing I-, P- and B-frames; Markov source model of the GOP structure (I P B B P B B …) with transition probabilities 1/N and h_{i,j}]
Use of Results: Admission Control

[Figure: 1e-4 admission boundary for a buffer of 100 packets; number of "asterix" sources vs number of "bond" sources, comparing theoretical and experimental results]
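In use, a boundary like the one plotted can back a simple admission test. A sketch with an illustrative lookup table (the values below are made up for the example, not the measured 1e-4 curve):

```python
# Hypothetical boundary table: boundary[n_bond] = max admissible "asterix"
# sources for that many "bond" sources (NOT the deck's measured data).
boundary = {0: 60, 5: 52, 10: 44, 15: 36, 20: 28, 25: 20, 30: 12, 35: 4, 40: 0}

def admissible(n_bond: int, n_asterix: int) -> bool:
    """Admit the source mix only if it stays under the boundary; between
    table points, use the nearest entry at or above n_bond (conservative)."""
    keys = [k for k in sorted(boundary) if k >= n_bond]
    if not keys:
        return False              # more bond sources than the table covers
    return n_asterix <= boundary[keys[0]]

print(admissible(10, 40))  # inside the boundary: admitted
```

A real controller would interpolate or recompute the boundary from the analytical model rather than hard-code a table.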
Experimental Set-up and Results
See demonstration
Performance Evaluation Cycle: Interpretation of Results
Use Performance Evaluation Results

Use the IBBT competences to suggest improvements. Examples:
- Network groups: intelligent buffer management schemes (e.g. using thresholds), congestion control mechanisms (e.g. explicit congestion notification), efficient admission control algorithms
- Video groups: codecs using feedback, layered video coding schemes
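The threshold idea mentioned for buffer management can be made concrete as a two-threshold drop policy. A sketch, where the 0.8 threshold and the priority mapping (e.g. I-frames high, B-frames low) are illustrative choices, not values from the deck:

```python
def should_drop(queue_len: int, buffer_size: int,
                high_priority: bool, threshold: float = 0.8) -> bool:
    """Two-threshold buffer management sketch: low-priority packets
    (e.g. B-frames) are dropped once occupancy reaches the threshold;
    high-priority packets (e.g. I-frames) only when the buffer is full."""
    if high_priority:
        return queue_len >= buffer_size
    return queue_len >= int(threshold * buffer_size)
```

Dropping less important packets early protects the packets whose loss hurts video quality most, which ties back to the "type of packet lost" measure.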
Use of Performance Evaluation Competence

- Application and Content
- Network Protocols (Layer L2 – L4)
- Physical Layer
Application and Content

- Layer: Traffic Analysis
  System: video, (aggregated) applications, web traffic
  Measures: burstiness, peak bitrate, average bitrate, mobility
- Layer: Video and Audio Content
  System: analysis of different codecs, error resilience mechanisms, …
  Measures: coding efficiency, error resilience and concealment, error types and frequencies, synchronization
- Layer: Application
  System: content distribution, web access, parallel file processing
  Measures: responsiveness, scalability, stability
Network Protocols L2 – L4

- Layer: Advanced and large-scale experimental testbeds (fixed and wireless)
  System: protocols for mesh, ad hoc and wireless sensor networks
  Measures: scalability, robustness, reliability, throughput
- Layer: Protocol test suites
  System: testing of HGW, modem/mux interaction
- Layer: Protocol performance evaluation
  System: scheduling mechanism, MAC protocol, routing protocol, auto-configuration protocol
  Measures: responsiveness, scalability, stability, functionality, throughput
Physical Layer

- Layer: Signal propagation
  System: WiMAX, Body Area Networks
  Measures: signal propagation characteristics
Generic Test and Measurement Equipment

Use of advanced test equipment, e.g. Spirent-Smartbits, Spirent-Avalanche, Agilent-NX2, Opticom-Opera, Tracespan, Fluke-Optiview, …
Testbeds - Example: Wireless testbed