o‘zapft is: Tap Your Networking Algorithm‘s Big Data!


Transcript of o‘zapft is: Tap Your Networking Algorithm‘s Big Data!

Page 1: o‘zapft is: Tap Your Networking Algorithm‘s Big Data!

Chair of Communication Networks, Department of Electrical and Computer Engineering, Technical University of Munich

©2017 Technical University of Munich

o‘zapft is: Tap Your Networking Algorithm‘s Big Data!

Andreas Blenk

Patrick Kalmbach

[email protected]

Wolfgang Kellerer

Stefan Schmid

[email protected]

Page 2: o‘zapft is: Tap Your Networking Algorithm‘s Big Data!


Network Algorithms are Ubiquitous


Routing, Cache/Content Placement, Congestion Control

Two classes of optimization problems: Packing and Covering problems

Page 3: o‘zapft is: Tap Your Networking Algorithm‘s Big Data!


The Limitation – Fire and Forget


[Figure: the same cache placement problem is solved over and over.]

The Opportunity – Tap into your Algorithm’s Big Data


Algorithms repeatedly solve similar problems from scratch. This is not only boring for the algorithm but also a waste of information and resources.

Page 4: o‘zapft is: Tap Your Networking Algorithm‘s Big Data!


Traditional vs. Proposed System


[Diagram: Traditional System: problem instances are fed to an optimization algorithm, which produces problem solutions. o’zapft is: the same pipeline, but a machine learning component additionally learns offline from the solution information produced by the optimization algorithm.]

Data available at: Patrick Kalmbach, Johannes Zerwas, Michael Manhart, Andreas Blenk, Stefan Schmid, and Wolfgang Kellerer. 2017. Data on "o’zapft is: Tap Your Network Algorithm’s Big Data!". (2017). https://doi.org/10.14459/2017md1361589

Page 5: o‘zapft is: Tap Your Networking Algorithm‘s Big Data!


Potentials


Predict Value of Objective Function

Search Space Reduction / Initial Solutions

Page 6: o‘zapft is: Tap Your Networking Algorithm‘s Big Data!


Facility Location Use-Case


Facility Location (Controller Placement) – Guess Initial Solutions

Problem: Given a network and a number of controllers, where should the controllers be placed?
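A compact way to state this placement problem (essentially the k-center problem; the notation here is mine, not from the slides): given a graph G = (V, E) with shortest-path latencies d(u, v) and a budget of k controllers, choose the controller set S that minimizes the worst node-to-controller latency:

    \min_{S \subseteq V,\; |S| = k} \;\; \max_{v \in V} \;\; \min_{c \in S} \; d(v, c)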

Page 7: o‘zapft is: Tap Your Networking Algorithm‘s Big Data!


Data Generation – Facility Location


Optimization Algorithms:

• Random Placement (Rnd)

• Mixed Integer Program (MIP)

• Greedy (Gdy)

Substrates:

• Barabasi-Albert (BA) graphs [2], 40 nodes

• Whole Topology Zoo (TZ) [1], 20-800 nodes

Objective: Minimize the maximum latency between a node and its controller.

[Figures: the Kentucky Data Link topology and a Barabasi-Albert graph [3].]

[1] Knight et al., The Internet Topology Zoo. IEEE Journal on Selected Areas in Communications 29, 9 (2011).

[2] Saino et al., A Toolchain for Simplifying Network Simulation Setup. In Proc. SIMUTOOLS '13, Cannes, France, March 2013.

[3] Picture taken from http://graphstream-project.org/media/img/generator_overview_barabasi_albert.png
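A minimal sketch of how such problem-solution pairs could be generated, assuming networkx for the Barabasi-Albert substrates and a simple greedy heuristic standing in for the Gdy solver; the parameters (m=2, k=3, 100 instances) and the hop-count latency proxy are illustrative, not the authors' setup:

    import networkx as nx

    def greedy_placement(graph, k):
        """Greedily add the facility node that most reduces the maximum
        node-to-facility shortest-path distance (hop count as latency proxy)."""
        dist = dict(nx.all_pairs_shortest_path_length(graph))
        placed = []
        for _ in range(k):
            best_node, best_cost = None, float("inf")
            for cand in graph.nodes:
                if cand in placed:
                    continue
                cost = max(min(dist[v][f] for f in placed + [cand]) for v in graph.nodes)
                if cost < best_cost:
                    best_node, best_cost = cand, cost
            placed.append(best_node)
        return placed

    # Collect problem-solution pairs on 40-node Barabasi-Albert substrates.
    dataset = []
    for seed in range(100):
        g = nx.barabasi_albert_graph(n=40, m=2, seed=seed)
        solution = greedy_placement(g, k=3)
        labels = {v: int(v in solution) for v in g.nodes}  # 1 = node hosts a facility
        dataset.append((g, labels))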

Page 8: o‘zapft is: Tap Your Networking Algorithm‘s Big Data!


Learning to Place


Features:

• Node degree

• Closeness

• Betweenness

• Spectral Features

Classifier:

• Logistic Regression

• Support Vector Machine

• Random Forest

• Extra Trees

• AdaBoost

Model Training and Selection:

[Diagram: The labeled nodes are split into an 80% training set and a 20% test set; features are extracted and min-max scaled on both splits, the classes are balanced, hyperparameters are selected with 2-fold stratified cross validation on the training set, and the best parameters yield the final results on the test set.]

[1] Scikit-learn: Machine Learning in Python, Pedregosa et al., JMLR 12, pp. 2825-2830, 2011.

Library:

• scikit-learn [1]


Measures:

• F1 Score
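A minimal sketch of this feature extraction and training pipeline, assuming scikit-learn and networkx and reusing the dataset list from the data-generation sketch above; the spectral features are omitted and all hyperparameters are placeholders, not the values used in the paper:

    import networkx as nx
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import f1_score
    from sklearn.model_selection import GridSearchCV, StratifiedKFold, train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import MinMaxScaler

    def node_features(graph):
        """Per-node features: degree, closeness, and betweenness centrality."""
        deg = dict(graph.degree())
        clo = nx.closeness_centrality(graph)
        bet = nx.betweenness_centrality(graph)
        return np.array([[deg[v], clo[v], bet[v]] for v in graph.nodes])

    # One row per node over all training graphs; label 1 if the solver picked the node.
    X = np.vstack([node_features(g) for g, _ in dataset])
    y = np.concatenate([[labels[v] for v in g.nodes] for g, labels in dataset])

    # 80/20 split, min-max scaling, class balancing, 2-fold stratified CV, F1 on the test set.
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)
    pipe = make_pipeline(MinMaxScaler(),
                         RandomForestClassifier(class_weight="balanced", random_state=0))
    search = GridSearchCV(pipe,
                          param_grid={"randomforestclassifier__n_estimators": [50, 100]},
                          cv=StratifiedKFold(n_splits=2, shuffle=True, random_state=0),
                          scoring="f1")
    search.fit(X_tr, y_tr)
    print("test F1:", f1_score(y_te, search.predict(X_te)))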

Page 9: o‘zapft is: Tap Your Networking Algorithm‘s Big Data!


Use Case: Facility Location


Problem: Place facilities on network nodes such that the maximum latency is minimized.

[Diagram: For each node n_i, node features are calculated to obtain a feature vector x_i; the machine learning model outputs P(n_i = 1 | x_i), the probability that node n_i belongs to an optimal placement. Three variants build on this: o’zapft is-Exact places the facilities directly at the nodes maximizing P(n_i = 1 | x_i), i.e. argmax_{n_i} P(n_i = 1 | x_i); o’zapft is-MIP0.25 and o’zapft is-MIP0.5 restrict the MIP to the top 25% and 50% of nodes ranked by this probability.]
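One reading of the diagram above, sketched in code (reusing node_features from the training sketch): rank nodes by the predicted probability, then either place the facilities directly at the top-ranked nodes (o’zapft is-Exact) or hand only the top 25% or 50% of nodes to the MIP as candidate locations (o’zapft is-MIP0.25/0.5). The helper solve_facility_location_mip is a hypothetical placeholder, not the authors' code:

    import numpy as np

    def rank_nodes(graph, clf):
        """Sort nodes by the classifier's probability P(n_i = 1 | x_i), highest first."""
        probs = clf.predict_proba(node_features(graph))[:, 1]
        nodes = list(graph.nodes)
        return [nodes[i] for i in np.argsort(probs)[::-1]]

    def ozapft_exact(graph, clf, k):
        """o'zapft is-Exact: place the k facilities at the k highest-probability nodes."""
        return rank_nodes(graph, clf)[:k]

    def ozapft_mip(graph, clf, k, fraction):
        """o'zapft is-MIP: hand only the top `fraction` of nodes to the MIP as candidates."""
        n_keep = max(k, int(fraction * graph.number_of_nodes()))
        candidates = rank_nodes(graph, clf)[:n_keep]
        return solve_facility_location_mip(graph, k, candidates)  # hypothetical MIP solver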

Page 10: o‘zapft is: Tap Your Networking Algorithm‘s Big Data!


Use Case: Facility Location


Large reduction of solution space with only small performance degradation

Page 11: o‘zapft is: Tap Your Networking Algorithm‘s Big Data!


Search Space Reduction


[Figure: remaining search space for MIP, o’zapft is-MIP0.5, and o’zapft is-MIP0.25; the values 9.2e-4 and 7.7e-7 are annotated for o’zapft is-MIP0.5 and o’zapft is-MIP0.25, respectively.]

Setting:
Substrate: Kentucky Data Link
Number of Nodes: 734
Number of Facilities: 10
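Assuming the annotated values are the fraction of the full placement space that remains after the reduction (my reading of the figure, not stated on the slide), they are consistent with simple counting: with 734 candidate nodes and 10 facilities there are \binom{734}{10} possible placements, and keeping only the top 50% (367 nodes) or 25% (183 nodes) of candidates leaves

    \frac{\binom{367}{10}}{\binom{734}{10}} \approx 9.2 \cdot 10^{-4}
    \qquad\text{and}\qquad
    \frac{\binom{183}{10}}{\binom{734}{10}} \approx 7.7 \cdot 10^{-7}.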

Page 12: o‘zapft is: Tap Your Networking Algorithm‘s Big Data!


Impact of Complex Features


Good classification performance is achieved already with low-complexity features

Page 13: o‘zapft is: Tap Your Networking Algorithm‘s Big Data!


Conclusion


It is possible to predict the behavior of networking algorithms from past problem-solution pairs.

Search Space Reduction / Initial Solutions

Predict Value of Objective Function

Page 14: o‘zapft is: Tap Your Networking Algorithm‘s Big Data!


Future Work


• Transfer Learning

• Investigate the size of the minimal search space

• Investigate whether heuristics improve on reduced search space

• Investigate the applicability of Deep Learning