Multi-objective combinatorial optimization: From methods to problems,
from the Earth to (almost) the Moon
Nicolas Jozefowiez
Maître de conférences in computer science, INSA, LAAS-CNRS, Université de Toulouse
Tuesday, December 3, 2013
Outline
I. Curriculum Vitæ
II. Multi-objective optimization
III. Multi-objective search tree
IV. Multi-objective column generation
V. Multi-objective genetic algorithms
VI. Conclusions and perspectives
Nicolas Jozefowiez 2 / 51
Part I
Curriculum vitæ
Short CV
2001–2004  Ph.D. in Computer Science, Université de Lille 1, France. Title: Modelling and approximate solution of multi-objective vehicle routing problems.
2004–2005  Temporary assistant professor, Université de Lille 1, France.
2005–2006  Fulbright visiting scholar, CU Boulder, CO, USA.
2006–2007  Temporary assistant professor, Université de Valenciennes et du Hainaut-Cambrésis, France.
2007–2008  Postdoctoral position, ESG-UQAM / CIRRELT, Montréal, Canada.
2008–      Assistant professor in Computer Science, INSA / LAAS-CNRS, Toulouse, France.
Supervisions: Ph.D.
2009–2013  Panwadee Tangpattanakul, Multi-objective optimization of Earth observing satellites, French-Thai grant, co-director: P. Lopez.
2009–2013  Boadu Mensah Sarpong, Column generation for bi-objective integer programs: Application to vehicle routing problems, Ministry grant, co-director: C. Artigues.
2012–      Leonardo Malta, Transportation problems for door-to-door services, ANR RESPET, co-director: F. Semet.
2013–      Leticia Gloria Vargas Suarez, Multi-objective cumulative vehicle routing problems for humanitarian logistics, Erasmus grant, co-director: S. U. Ngueveu.
Supervisions: Postdoc & Master
2008–2009  H. Murat Afsar, Optimization of intelligent waste collecting routing, Midi-Pyrénées grant, co-supervisor: P. Lopez.
2009–2010  Rodrigo Acuna-Agost, Methods for integrated aircraft-passenger recovery systems, Amadeus S.A., co-supervisor: C. Mancel.
2010       Oussama Ben Ammar, Bi-objective scheduling on a single machine, Université de Toulouse 2.
2010       Myriam Foucras, Multi-modal traveling salesman problem, Université Paul Sabatier.
Project participations
2008–2009  GEDEON, Région Midi-Pyrénées project.
2009–2010  Disruption management in air transportation, Amadeus SA.
2010       Embeddable LP algorithm for orbital rendezvous, CNES.
2012–2013  Flexible mission planning, CNES R&T.
2012–2015  Energy Aware feeding system, ECO-INNOVERA.
2012–2015  RESPET, ANR Sustainable Ground Transportation ("Transports Terrestres Durables").
2013–2017  ATHENA, ANR Blanc.
Activities
2010       Fulbright jury, French scholar program.
2010       ROADEF 2010, organizing committee member.
2011       GdR RO GT2L 7th meeting, organizer.
2008–2013  ROADEF WG PM2O, co-animator.
2011–2013  LAAS laboratory council, elected member.
2014–2015  ROADEF board ("Bureau"), member (treasurer).
2015       ODYSSEUS 2015, organizing committee member.
Research area

Combinatorial optimization:
• Multi-objective optimization
• Meta-heuristics
• Vehicle routing problems
• Branch-and-cut algorithms
• Column generation
• Scheduling, air transportation and space

Applications: route balancing, facultative visits, labels, disruption management, EOS (Earth observing satellites).
Part II
Multi-objective optimization
Multi-objective optimization problem

(MOP) = minimize F(x) = (f1(x), f2(x), . . . , fn(x)), subject to x ∈ Ω

• n ≥ 2: number of objectives
• F: vector of objective functions to optimize
• Ω ⊆ R^m: feasible solution set (solution space)
• x: a solution
• Y = F(Ω): objective space
• y = (y1, y2, . . . , yn) ∈ Y with yi = fi(x): a point in the objective space
Pareto dominance

x ≺ y ⇔ fi(x) ≤ fi(y) ∀i ∈ [1, . . . , n] and ∃i ∈ [1, . . . , n] : fi(x) < fi(y)

Efficient/Pareto-optimal solution → efficient/Pareto-optimal set
Non-dominated point → non-dominated set

[Figure: points A–H in the (f1, f2) objective space illustrating dominance between points]
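The dominance test above translates directly into code; a minimal sketch (minimization, with a hypothetical list of objective points):

```python
def dominates(x, y):
    """True if x Pareto-dominates y (minimization): x is no worse in
    every objective and strictly better in at least one."""
    return (all(xi <= yi for xi, yi in zip(x, y))
            and any(xi < yi for xi, yi in zip(x, y)))

def non_dominated(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical points in the (f1, f2) objective space
points = [(1, 9), (2, 7), (3, 8), (4, 4), (6, 5), (7, 2), (9, 1)]
print(non_dominated(points))  # → [(1, 9), (2, 7), (4, 4), (7, 2), (9, 1)]
```

Here (3, 8) and (6, 5) are filtered out because (2, 7) and (4, 4) dominate them.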
Solution approach

A priori approach
• A decision-maker's choice set is given in advance
• Return one solution that is optimal (or an approximation) with respect to this choice set

Interactive approach
• The choice set is updated during the solution process

A posteriori approach
• Compute the efficient set (or an approximation)
• The decision-maker then chooses among the efficient set
Usefulness

Can every problem be reduced to a single objective? No.

Example: fairness between drivers in the CVRP

             Taburoute            Prins' GA
Instance   Distance  Fairness   Distance  Fairness
E51-05e      524.61     20.07     524.61     20.07
E76-10e      835.32     78.10     835.26     91.08
E101-08e     826.14     97.88     826.14     97.88
E151-12c    1031.17     98.24    1031.63    100.34
E200-17c    1311.35    106.70    1300.23     82.31
E121-07c    1042.11    146.67    1042.11    146.67
E101-10c     819.56     93.43     819.56     93.43
MOP as a decision tool

Example: Cumulative Capacitated Vehicle Routing Problem

[Figure: trade-off between the number of vehicles (5–50) and the cumulative length (1000–4500)]
Scalarization methods

Weighted sum method

min (f1(x), . . . , fn(x)), x ∈ Ω  →  min Σ_{i=1}^n λi fi(x), x ∈ Ω, with Σ_{i=1}^n λi = 1

ε-constraint method

min (f1(x), . . . , fn(x)), x ∈ Ω  →  min fk(x), x ∈ Ω, fi(x) ≤ εi (i ∈ [1, n], i ≠ k)
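Over a finite candidate set, both scalarizations reduce to a single-objective minimization; a sketch with hypothetical data (the list of points stands in for the feasible set Ω):

```python
# Hypothetical finite feasible set, given directly by its objective values
solutions = [(1, 9), (2, 7), (4, 4), (7, 2), (9, 1)]

def weighted_sum(points, lambdas):
    """min sum_i lambda_i * f_i(x); finds only supported solutions."""
    return min(points, key=lambda p: sum(l * f for l, f in zip(lambdas, p)))

def eps_constraint(points, k, eps):
    """min f_k(x) subject to f_i(x) <= eps_i for all i != k
    (None means no bound on that objective)."""
    feasible = [p for p in points
                if all(p[i] <= e for i, e in enumerate(eps)
                       if i != k and e is not None)]
    return min(feasible, key=lambda p: p[k]) if feasible else None

print(weighted_sum(solutions, (0.5, 0.5)))      # → (4, 4)
print(eps_constraint(solutions, 0, (None, 3)))  # → (7, 2): best f1 with f2 <= 3
```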
Two-phase method [Ulungu & Teghem, 1993]

Phase 1
• Dichotomic search
• Weighted sum objective
• Only the convex hull
• Supported solutions

Phase 2
• Enumerative search
• Bounded by phase 1 solutions
• Non-supported solutions

[Figure: supported points on the convex hull in the (f1, f2) plane; non-supported points are found in phase 2 between consecutive supported points]
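Phase 1 can be sketched as a recursion: solve a weighted sum whose weights are normal to the segment joining two known supported points, and recurse on each side whenever a new point is found. In this sketch, an exact single-objective solver is replaced by a minimum over a hypothetical finite point set:

```python
def solve_weighted(points, l1, l2):
    """Stand-in for an exact solver of min l1*f1 + l2*f2."""
    return min(points, key=lambda p: l1 * p[0] + l2 * p[1])

def dichotomic(points, a, b, out):
    """Search, with weights normal to segment (a, b), for a new
    supported point strictly below that segment."""
    l1, l2 = a[1] - b[1], b[0] - a[0]
    c = solve_weighted(points, l1, l2)
    if c != a and c != b and l1 * c[0] + l2 * c[1] < l1 * a[0] + l2 * a[1]:
        dichotomic(points, a, c, out)
        out.append(c)
        dichotomic(points, c, b, out)

def phase1(points):
    """Return the supported points, from the f1-optimum to the f2-optimum."""
    a = min(points, key=lambda p: (p[0], p[1]))  # lexicographic optimum for f1
    b = min(points, key=lambda p: (p[1], p[0]))  # lexicographic optimum for f2
    out = [a]
    dichotomic(points, a, b, out)
    out.append(b)
    return out

# (3, 6) is non-dominated but not supported, so phase 1 never returns it
print(phase1([(1, 9), (2, 7), (3, 6), (4, 4), (7, 2), (9, 1)]))
# → [(1, 9), (2, 7), (4, 4), (7, 2), (9, 1)]
```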
Iterative ε-constraint method

[Figure: grid of points in the (f1, f2) objective space; successive constraint levels ε0, ε1, . . . , ε5 sweep the space, each iteration producing the next non-dominated point]
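The iteration sketched in the figure is a loop that tightens ε after each solve. Over a hypothetical finite point set (and with delta=1 assuming integer objective values), it looks like this:

```python
def eps_constraint_front(points, delta=1):
    """Enumerate the non-dominated set of a finite point set by repeatedly
    solving min f1 s.t. f2 <= eps, then tightening eps just below the
    f2-value obtained (delta=1 assumes integer objective values)."""
    front = []
    eps = max(p[1] for p in points)          # eps_0: constraint inactive
    while True:
        feasible = [p for p in points if p[1] <= eps]
        if not feasible:
            return front
        # lexicographic min (f1, then f2) avoids weakly dominated points
        best = min(feasible, key=lambda p: (p[0], p[1]))
        front.append(best)
        eps = best[1] - delta                # eps_{k+1}

pts = [(1, 9), (2, 7), (4, 4), (7, 2), (9, 1), (3, 8)]
print(eps_constraint_front(pts))  # → [(1, 9), (2, 7), (4, 4), (7, 2), (9, 1)]
```

Unlike phase 1 of the two-phase method, this loop also returns non-supported points, at the cost of one solve per point.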
Intensification / Diversification

[Figure: goals in the (f1, f2) plane: intensification pushes the approximation toward the front, diversification spreads it along the front; two example fronts illustrate good intensification and good diversification]

• Usually associated with multi-objective evolutionary algorithms
• What does it mean for methods such as the two-phase method?
• What does it mean for exact methods?
• It should hold throughout the search
Multi-objective anytime algorithm

Multi-objective evolutionary algorithms
• A population P + mechanisms
• Set-based optimization [Zitzler et al., 2010]: Ψ ∈ P = {ψ ⊆ Ω : ∄x, y ∈ ψ, x ≺ y}
• Multi-objective decoders

Integer programming methods
• A single program, a scalarization method
• Avoid repeatedly solving NP-hard problems
• Search tree, lower bound

Design
• Representativity (diversity)
• Uniformity (convergence)
• Factorization (efficiency)
Representativity

Limit: 3 iterations

[Figure: with a budget of 3 iterations, the subsets of the non-dominated set returned by the ε-constraint method, the two-phase method, and an ideal representative set]
Uniformity

[Figure: approximation points spread over the (f1, f2) objective space]

The search/computational effort should be spread over the complete objective space.
Each operation should have an impact on the whole approximation.
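One common way to quantify how evenly an approximation covers the front is Schott's spacing metric (lower is more uniform); a sketch over hypothetical fronts:

```python
from math import sqrt

def spacing(front):
    """Schott's spacing metric: standard deviation of nearest-neighbor
    L1 distances in objective space (0 = perfectly even spacing;
    assumes at least two distinct points)."""
    d = [min(sum(abs(a - b) for a, b in zip(p, q))
             for q in front if q != p)
         for p in front]
    mean = sum(d) / len(d)
    return sqrt(sum((mean - di) ** 2 for di in d) / (len(d) - 1))

even = [(0, 4), (1, 3), (2, 2), (3, 1), (4, 0)]   # equally spaced front
uneven = [(0, 4), (0.1, 3.9), (2, 2), (4, 0)]     # clustered near (0, 4)
print(spacing(even) < spacing(uneven))  # True
```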
Efficiency

[Figure: points S, A, B, C, D in the (f1, f2) objective space]

Two-phase method: 20 iterations / Best search: 10 iterations
Part III
Multi-objective search tree
Upper and lower bounds

Upper bound (ub)
A set ub ⊆ Ω of mutually non-dominated feasible solutions: ∄x, y ∈ ub, y ≺ x.

Lower bound (lb) [Villareal & Karwan, 1981]
A set lb ⊆ R^n such that ∄x, y ∈ lb, y ≺ x, and ∀y ∈ Ω, ∃x ∈ lb, x ⪯ y.

[Figure: cases (1), (2), (3) comparing lower and upper bound sets]
Computation of the lower bound

• A single multi-objective integer program
• Lower bound:
  • A set of subproblems Φ
  • A subproblem φ ∈ Φ = linear relaxation + scalarization technique
• Computation:
  • Solve a subset Φ′ ⊆ Φ
  • Advantage: each φ ∈ Φ′ is polynomially solvable
  • |Φ| should be kept polynomial or pseudo-polynomial
• The branch-and-cut flowchart is not modified
Example

Φ = {φε, ε ∈ {0, 1, 2}}

minimize −1.00 x1 − 0.64 x2
minimize x3
s.t.  50 x1 + 31 x2 ≤ 250
      3 x1 − 2 x2 ≥ −4
      x1 + x3 ≤ 2
      x1, x2 ≥ 0 and integer
      x3 ∈ {0, 1, 2}
Example (subproblem φε)

minimize −1.00 x1 − 0.64 x2
s.t.  50 x1 + 31 x2 ≤ 250
      3 x1 − 2 x2 ≥ −4
      x1 + x3 ≤ 2
      x3 = ε
      x1, x2 ≥ 0
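The search-tree values shown next can be sanity-checked by brute force. This plain-Python enumeration (no LP solver; the variable ranges are read off the first constraint) recovers the integer optimum of each subproblem φε:

```python
# Brute-force check of the integer subproblems phi_eps: for each fixed
# x3 = eps, minimize -1.00*x1 - 0.64*x2 over the integer feasible set.
# 50*x1 <= 250 gives x1 <= 5; 31*x2 <= 250 gives x2 <= 8.
best = {}
for eps in (0, 1, 2):
    for x1 in range(0, 6):
        for x2 in range(0, 9):
            if (50 * x1 + 31 * x2 <= 250
                    and 3 * x1 - 2 * x2 >= -4
                    and x1 + eps <= 2):
                f1 = -1.00 * x1 - 0.64 * x2
                if eps not in best or f1 < best[eps][0]:
                    best[eps] = (f1, x1, x2)

for eps, (f1, x1, x2) in best.items():
    print(f"eps={eps}: f1={f1:.2f} at x1={x1}, x2={x2}")
# eps=0: f1=-4.56 at x1=2, x2=4
# eps=1: f1=-2.92 at x1=1, x2=3
# eps=2: f1=-1.28 at x1=0, x2=2
```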
Search tree

Root:  ε = 0: x1 = 1.94, x2 = 4.92 | ε = 1: x1 = 1, x2 = 3.5 | ε = 2: x1 = 0, x2 = 2
Node:  ε = 0: x1 = 2, x2 = 3 | ε = 1: x1 = 1, x2 = 3 | ε = 2: x1 = 0, x2 = 2
Node:  ε = 0: x1 = 1.94, x2 = 4.92 | ε = 1: infeasible | ε = 2: infeasible
Node:  ε = 0: x1 = 2, x2 = 4 | ε = 1: infeasible | ε = 2: infeasible
Node:  ε = 0: infeasible | ε = 1: infeasible | ε = 2: infeasible

Number of LP solutions: 15
Partial pruning

Root:  ε = 0: x1 = 1.94, x2 = 4.92 | ε = 1: x1 = 1, x2 = 3.5 | ε = 2: x1 = 0, x2 = 2
Node:  ε = 0: x1 = 2, x2 = 3 | ε = 1: x1 = 1, x2 = 3 | ε = 2: not solved
Node:  ε = 0: x1 = 1.94, x2 = 4.92 | ε = 1: infeasible | ε = 2: not solved
Node:  ε = 0: x1 = 2, x2 = 4 | ε = 1: not solved | ε = 2: not solved
Node:  ε = 0: infeasible | ε = 1: not solved | ε = 2: not solved

Number of LP solutions: 9
Parallel branching

Root:  ε = 0: x1 = 1.94, x2 = 4.92 | ε = 1: x1 = 1, x2 = 3.5 | ε = 2: x1 = 0, x2 = 2
Node:  ε = 0: x1 = 2, x2 = 4 | ε = 1: x1 = 1, x2 = 3 | ε = 2: not solved
Node:  ε = 0: infeasible | ε = 1: infeasible | ε = 2: not solved

Number of LP solutions: 7
The multilabel traveling salesman problem

• G = (V, E)
• Cost function c on E
• A set of labels L (shown as colors on the slide)
• Each edge e ∈ E carries a label δe ∈ L (data)
• Minimize the total length
• Minimize the number of labels used
• IP: based on [Dantzig et al., 54] + valid inequalities
• Lower bound: ε-constraint method on the number of labels used (at most |L| LPs solved)
• Cuts are searched after each LP solution
Computational results (I)

• Comparison with an iterative ε-constraint method (εCM)
• Same underlying branch-and-cut algorithm

                    MOB&C                          εCM
|L| |V|  #Par   #Nodes  Seconds  Seconds*    #Nodes  Seconds
40  20   12.1    606.8      4.2      3.1     1571.0      5.0
40  30   17.8   1913.0     58.7     42.7     5806.0     67.2
40  40   21.7   4406.6    503.0    349.8    17462.0    665.8
40  50   26.6  15360.6   1845.9   1374.5    45306.6   3334.5
50  20   12.4    718.9      4.4      3.4     2296.6      6.8
50  30   18.8   3248.3    144.0    110.2    12687.6    224.9
50  40   23.9   8722.7   1374.4   1097.7    36339.4   1636.9
50  50   27.7  20680.3   4094.0   2902.5    74336.6   5938.4

Nicolas Jozefowiez 32 / 51
Computational results (II)

• Use of the method as a heuristic
• Stop after a percentage of the search tree has been explored
• %: percentage of Pareto solutions found
• Gap: averaged over all non-efficient solutions

              25%          50%          75%
|L| |V|   %     Gap    %     Gap    %     Gap   Seconds
40  20   58.7  1.011  76.0  1.005  87.6  1.002      2.7
40  30   41.6  1.010  62.9  1.005  83.7  1.002     30.0
40  40   31.3  1.011  43.8  1.007  80.2  1.002    200.3
40  50   34.2  1.009  51.9  1.006  71.8  1.003    708.0
50  20   59.7  1.011  69.4  1.009  84.7  1.004      2.9
50  30   41.0  1.012  63.8  1.005  86.2  1.002     75.4
50  40   34.3  1.011  51.9  1.005  82.0  1.002    601.8
50  50   24.5  1.012  40.8  1.007  69.7  1.003   1679.9

Nicolas Jozefowiez 33 / 51
Part IV
Multi-objective
column generation
Column generation & bi-objective IP

Master problem (MP):
  minimize  ∑_{j∈J} c_j^r θ_j   (r = 1, 2)
  s.t.      ∑_{j∈J} a_ij θ_j ≥ b_i   (i ∈ I)
            θ_j ∈ N   (j ∈ J)

Linear relaxation of the MP (LMP): same model with θ_j ∈ R_+ (j ∈ J)

Scalarizations: weighted sum / ε-constraint method
[Figures: nondominated point clouds in the (f1, f2) objective space for the two scalarizations]

Restricted master problem (RMP): J′ ⊂ J, |J′| ≪ |J|
→ Generate columns for the …

Nicolas Jozefowiez 35 / 51
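The gap between the two scalarizations can be seen on a toy nondominated point cloud (made-up data): minimizing a weighted sum w·f1 + (1−w)·f2 only ever returns supported points on the convex hull of the front, while the ε-constraint method also reaches the unsupported points:

```python
# Toy bi-objective outcomes (hypothetical data); (5, 4) is nondominated
# but lies above the lower convex hull, i.e. it is unsupported.
points = [(1, 10), (2, 6), (3, 5), (5, 4), (8, 2), (10, 1)]

def weighted_sum_points(pts, steps=1000):
    """Points reachable by minimizing w*f1 + (1-w)*f2 over sampled weights."""
    found = set()
    for k in range(steps + 1):
        w = k / steps
        found.add(min(pts, key=lambda p: w * p[0] + (1 - w) * p[1]))
    return found

def eps_constraint_points(pts):
    """Points reachable by minimizing f1 subject to f2 <= eps."""
    found = set()
    for eps in sorted({p[1] for p in pts}):
        found.add(min(p for p in pts if p[1] <= eps))
    return found
```

Here the weighted sum misses (5, 4) for every weight, while the ε-constraint method recovers all six points.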
Point-by-point search (PPS)

Scalarization technique = ε-constraint method

• Iterative ε-constraint method
• Full column generation algorithm at each iteration
• Possible improvements
  • Column storage
  • Blind ad-hoc heuristics (Improved PPS)
  • ...

[Figure: objective-space point cloud explored point by point]

Problems: may be caught in a tailing effect, no uniform convergence, no factorization, not good as a heuristic, ...
⇒ column search strategies

Nicolas Jozefowiez 36 / 51
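The PPS loop itself is simple; a minimal driver is sketched below, where the stand-in `solve` should be read as one full column-generation run per call (here it just scans a made-up set of feasible outcomes with integer objective values):

```python
def point_by_point_search(solve, eps0):
    """Iterative eps-constraint driver: each call to solve(eps) is a full
    column-generation run returning a point (f1, f2), or None if infeasible."""
    front, eps = [], eps0
    while True:
        point = solve(eps)
        if point is None:
            return front
        front.append(point)
        eps = point[1] - 1  # tighten the bound just below the last f2 value

# Stand-in for the exact solver, on a hypothetical set of feasible outcomes.
cloud = [(1, 9), (2, 7), (4, 4), (7, 2), (9, 1)]
def solve(eps):
    feasible = [p for p in cloud if p[1] <= eps]
    return min(feasible) if feasible else None
```

Each call restarts column generation from scratch, which is exactly the cost that column storage and the column search strategies above try to amortize.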
Solve once, generate for all (SOGA)

Scalarization technique = ε-constraint method

Main computational cost: solution of a subproblem
The subproblem is similar for several values of ε

• At each iteration
  • Select a value ε1
  • Solve the LRMP for ε1
  • Search for a column set J1
  • For several εk, solve the LRMP → π*_k
  • Heuristically build columns using J1 and π*_k

[Figure: objective-space point cloud]

Nicolas Jozefowiez 37 / 51
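A hypothetical sketch of SOGA's reuse step: columns priced out for ε1 are recycled for other ε values using the duals π*_k of those LRMPs, instead of running one full pricing pass per ε. The column representation and all numbers below are made up for illustration:

```python
# A column j is (cost c_j, set of covered rows, second-objective value rho_j).
columns_J1 = [
    (10.0, {0, 1}, 5),
    (7.0, {1, 2}, 3),
    (4.0, {2}, 8),
]

def reduced_cost(col, duals):
    """Reduced cost of a covering column under dual prices `duals`."""
    c, rows, _rho = col
    return c - sum(duals[i] for i in rows)

def recycle_columns(cols, eps_k, duals_k, tol=1e-9):
    """Keep columns that remain feasible for eps_k (rho_j <= eps_k)
    and still price out negatively under the duals pi*_k."""
    return [col for col in cols
            if col[2] <= eps_k and reduced_cost(col, duals_k) < -tol]
```

With duals {0: 6.0, 1: 5.0, 2: 3.0} and ε_k = 4, only the second column survives: the first prices out negatively but violates the ρ bound, and the third has nonnegative reduced cost.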
Bi-obj. multi-vehicle covering tour problem

• G = (V ∪ W, E, d), p: max # of nodes in a tour
• A solution = a set of tours on V′ ⊆ V + an assignment of W to V′
• Objectives: i) minimize the total length; ii) minimize max_{wi∈W} min_{vj∈V′} d_ij

[Figure: depot; V: nodes that can be visited; W: nodes to cover]

Nicolas Jozefowiez 38 / 51
A model for the BOMCTP

minimize  ∑_{ωk∈R} c_k θ_k
minimize  Γmax
s.t.      ∑_{ωk∈R} a_ik θ_k ≥ 1   (wi ∈ W)
          Γmax ≥ ρ_k θ_k   (ωk ∈ R)
          θ_k ∈ {0, 1}   (ωk ∈ R)

• ωk ∈ R: a tour on V′ ⊆ V together with a covered set W′ ⊆ W
• c_k: the tour length
• a_ik = 1 if wi ∈ W′, 0 otherwise
• ρ_k = max_{wi∈W′} min_{vj∈V′} d_ij

Nicolas Jozefowiez 39 / 51
Reformulation

minimize  ∑_{ωk∈R} c_k θ_k
s.t.      ∑_{ωk∈R} a_ik θ_k ≥ 1   (wi ∈ W)
          θ_k ∈ {0, 1}   (ωk ∈ R)

• The master problem is a single-objective problem
• The subproblem is a single-objective problem
• Well-suited for an ε-constraint method [Berube et al., 2009]
  • Rε = {ωk ∈ R : ρk ≤ ε}
  • No weakening of the linear relaxation for a given ε value
  • Difference with the mono-objective model: a_ik is to be decided
• Large variety of problems
  1. A global objective on the complete solution
  2. An objective on the components → minimizing the worst case

Nicolas Jozefowiez 40 / 51
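The restriction Rε can be sketched on a tiny covering instance (all tours and values below are hypothetical, and brute force stands in for the branch-and-price solve): for each ε, only columns with ρ_k ≤ ε enter the set-covering master, and solving once per ε traces the bi-objective front.

```python
from itertools import combinations

# Toy BOMCTP master data: each column is a tour
# (cost c_k, set of covered nodes W', rho_k = max-min covering distance).
W = {0, 1, 2}
tours = [
    (6.0, {0, 1, 2}, 9),   # cheap overall, but poor covering distance
    (4.0, {0, 1}, 4),
    (5.0, {1, 2}, 4),
    (3.0, {0}, 2),
    (4.5, {1}, 2),
    (5.5, {2}, 2),
]

def best_cover(cols):
    """Cheapest subset of columns covering W (brute force on the toy data)."""
    best = None
    for r in range(1, len(cols) + 1):
        for sub in combinations(cols, r):
            if set().union(*(t[1] for t in sub)) >= W:
                cost = sum(t[0] for t in sub)
                best = cost if best is None else min(best, cost)
    return best

def eps_constraint_front():
    front = {}
    for eps in sorted({t[2] for t in tours}):
        r_eps = [t for t in tours if t[2] <= eps]   # the restricted set R_eps
        cost = best_cover(r_eps)
        if cost is not None:
            front[eps] = cost
    return front
```

Tightening ε improves the covering distance at the price of a more expensive cover: here the total cost grows from 6.0 to 13.0 as ε drops from 9 to 2.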
Computational results

                 PPS              SOGA
p  |V| |W|   Seconds    #SP   Seconds    %    #SP    %     |R*|
5  30   60     18.03  184.6     13.00  .72  112.2  .61   1229.4
5  30   90     15.38  163.4     12.28  .79  105.6  .64   1041.2
5  40   80     49.50  228.0     36.68  .74  141.4  .62   1609.8
5  40  120    126.75  330.4     86.39  .68  173.8  .53   2347.6
5  50  100    205.79  390.4    154.51  .75  212.6  .54   2871.2
5  50  150    392.54  486.4    268.06  .68  223.6  .46   3087.4
8  30   60     49.76  226.8     25.46  .51  136.0  .60   1657.2
8  30   90     31.26  215.2     18.68  .60  121.8  .56   1284.0
8  40   80    113.41  302.0     86.81  .76  182.8  .60   2252.6
8  40  120    511.23  480.6    326.63  .63  243.2  .51   3652.4
8  50  100   1343.66  522.0    821.23  .61  289.0  .55   4392.8
8  50  150   1525.19  672.0   1049.09  .68  305.6  .45   4781.8

Nicolas Jozefowiez 41 / 51
Part V
Multi-objective
genetic algorithms
Multi-objective meta-heuristics

Main focus of research on
• Selection
• Mechanisms for diversification
• Mechanisms for intensification

Less focus on
• Operators (crossover), neighborhoods
• Encoding
• Usually inspired by a close single-objective problem

Nicolas Jozefowiez 43 / 51
Set-based optimization [Zitzler et al., 2010]

Standard approach / Set-based approach
[Figures: a population as single points in the (f1, f2) objective space vs. a population whose individuals are sets of points]

• How to manipulate and define operators?
• Proto-solution
• Multi-objective decoder: a proto-solution → several solutions

Nicolas Jozefowiez 44 / 51
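Once an individual is a whole set of solutions, its fitness must compare sets rather than single points; a standard choice is the 2-D hypervolume indicator. A minimal sketch on made-up point sets:

```python
def hypervolume_2d(points, ref):
    """Area dominated by `points` up to the reference point (minimization)."""
    pts = sorted(set(points))
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        if f2 < prev_f2:              # skip dominated points
            hv += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return hv

# Two decoded individuals (hypothetical objective vectors).
set_a = [(1, 5), (3, 3), (5, 1)]
set_b = [(2, 4), (4, 2)]
```

With reference point (6, 6), `set_a` dominates an area of 13.0 against 12.0 for `set_b`, so selection would prefer the individual whose proto-solution decoded to `set_a`.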
Agile Earth Observation Satellite

[Figure: satellite direction over the Earth surface, captured photograph, candidate photographs]

Problem
• Select and schedule acquisitions
• Operational constraints, multiple customers
• Maximize profit / fairness between customers

Method
• Biased random-key genetic algorithms [Goncalves & Resende, 2010]
• Proto-solution: order in which to consider the acquisitions
• Decoder: two different heuristics
• Strategies to combine the solutions
• Strict improvement on computational results

Nicolas Jozefowiez 45 / 51
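The random-key decoding step can be sketched as follows (the instance, the single non-overlap constraint, and the greedy rule are all simplified stand-ins for the operational constraints and the two heuristics of the actual method): a chromosome is a vector of keys in [0, 1), sorting acquisition indices by key gives the order in which a greedy pass tries to insert them into the schedule.

```python
# Hypothetical acquisitions: (name, start, end, profit),
# with non-overlap of time windows as the only constraint.
acquisitions = [
    ("a", 0, 3, 10), ("b", 2, 5, 8), ("c", 4, 7, 6), ("d", 6, 9, 9),
]

def decode(keys):
    """Greedy decoder: visit acquisitions in key order, keep the compatible ones."""
    order = sorted(range(len(keys)), key=lambda i: keys[i])
    taken, profit = [], 0
    for i in order:
        name, s, e, p = acquisitions[i]
        if all(e <= s2 or s >= e2 for _, s2, e2, _ in taken):
            taken.append((name, s, e, p))
            profit += p
    return profit, sorted(t[0] for t in taken)
```

Different key vectors decode to different schedules, which is what lets the genetic operators search in the simple key space while feasibility stays inside the decoder.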
Vehicle routing problems

Proto-solution
• A giant tour (TSP solution)
• Example: CVRP → ignore the capacity constraint

SPLIT operator [Prins, 2004]
[Figure: giant tour with customer demands, and the costs of the candidate trips forming the auxiliary graph]

Decoder
• Multi-objective Shortest Path Problem with Resource Constraints
• Dynamic programming [Feillet et al., 2003] [Reinhardt & Pisinger, 2011]
• Minimal modification: label, dominance, and extension rules
• Indicator-based evaluation

Nicolas Jozefowiez 46 / 51
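The single-objective SPLIT operator of [Prins, 2004] can be sketched as a shortest path in an auxiliary DAG: an arc (i, j) exists for every capacity-feasible subsequence tour[i..j-1] served as one round trip from the depot, and the cheapest path gives the optimal segmentation of the giant tour. The instance below is a made-up toy (depot 0, customers on a line):

```python
import math

def split(tour, demand, dist, capacity):
    """Optimal segmentation cost of a giant tour (Bellman over split points)."""
    n = len(tour)
    best = [math.inf] * (n + 1)      # best[i]: min cost to serve tour[:i]
    best[0] = 0.0
    for i in range(n):
        load, trip = 0, 0.0
        for j in range(i, n):
            load += demand[tour[j]]
            if load > capacity:
                break                 # trip tour[i..j] exceeds vehicle capacity
            if j == i:
                trip = dist[0][tour[j]] + dist[tour[j]][0]
            else:                     # extend the trip by one customer
                trip += (dist[tour[j-1]][tour[j]] + dist[tour[j]][0]
                         - dist[tour[j-1]][0])
            best[j + 1] = min(best[j + 1], best[i] + trip)
    return best[n]

# Toy instance: nodes at coordinates 0..4, unit demands of 2, capacity 4.
coords = [0, 1, 2, 3, 4]
dist = [[abs(a - b) for b in coords] for a in coords]
demand = [0, 2, 2, 2, 2]
```

On this instance the giant tour [1, 2, 3, 4] splits optimally into trips (1, 2) and (3, 4) with total cost 12.0; the multi-objective decoder above replaces this scalar Bellman recursion with label-correcting dynamic programming.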
Part VI
Conclusions and perspectives
Conclusions and short term perspectives
Research area
• Mono- and multi-objective optimization
• Exact algorithms and heuristics
• Land transportation, air transportation, space
Contributions
• Problem modeling and studies
• Proposition of new multi-objective metaheuristics
• Proposition of new multi-objective exact algorithms
• Investigation of lower bound computation for multi-objective combinatorial optimization
Short-term perspectives
• Heuristics (matheuristics, multi-objective decoder ...)
• Vehicle routing problems (balancing, generalization of problems with optional visits ...)
• Study of other families of problems to validate the methods
Nicolas Jozefowiez 48 / 51
Multi-objective search tree
• Computation of lower bounds
• Additional research for column generation
• Multi-objective cutting-plane algorithm
• Branching mechanisms
• Pruning mechanisms
• Decision space (variables) / objective space
• Branch-and-price algorithm
• Use of parallelism
• More than two objectives
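Objective-space pruning hinges on comparing bound sets: a node can be discarded when every point of its lower bound set is already weakly dominated by the incumbent upper bound set, so the subtree cannot contribute a new nondominated point. A simplified sketch (minimization, finite point sets; not the exact pruning rule of the thesis):

```python
def weakly_dominates(a, b):
    """a is no worse than b on every objective (minimization)."""
    return all(x <= y for x, y in zip(a, b))

def can_prune(lower_bound_set, upper_bound_set):
    """Prune a node when each point of its lower bound set is weakly
    dominated by some incumbent point of the upper bound set."""
    return all(any(weakly_dominates(u, l) for u in upper_bound_set)
               for l in lower_bound_set)
```

For instance, a node with lower bound set `{(3, 3)}` is pruned by the incumbent `{(2, 2)}`, while `{(1, 5), (4, 4)}` is not, since `(1, 5)` could still improve the first objective.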
Nicolas Jozefowiez 49 / 51
Collaborative logistics
• Recent trend in logistics
• Supply chain resource pooling between actors
• Flow consolidation (massification), resource sharing
• A natural ground for multi-objective optimization: cost, customer service, environmental impact, resource management, fairness in the consortium ...
• New models / new methods
• ANR RESPET on door-to-door service network
Nicolas Jozefowiez 50 / 51
Uncertainty in MOCO
Stochastic programming
\min_{x \in X} \{\, (c^1 x + Q^1(x),\; c^2 x + Q^2(x)) : Ax = b \,\}

Q^i(x) = \mathbb{E}_{\omega}\big[ v^i(h^i(\omega) - T^i(\omega)\, x) \big], \qquad v^i(s) = \min_{y \in Y^i} \{\, q^i(\omega)\, y : W^i y = s \,\}
• Study of the interaction of the recourse functions
• Adaptation of methods such as Integer L-shaped Method
• Network design
Discrete robust optimization
• Scenarios / regret function
• Regret will not be an objective
• Robust efficient solutions / set
• Study of the interaction objective / scenario
• Adaptation of scenario relaxation methods
Nicolas Jozefowiez 51 / 51