
Introduction to Stochastic Local Search

Thomas Stützle, IRIDIA, CoDE, Université Libre de Bruxelles, Brussels, Belgium
iridia.ulb.ac.be/~stuetzle (tutorial slides: iridia.ulb.ac.be/ants2006/tutorial_slides/stuetzle_tutorial_slides.pdf)

Outline

1 Stochastic local search
2 SLS Techniques
  - Iterative improvement
  - Simulated Annealing
  - Tabu Search
  - Dynamic Local Search
  - Iterated local search
  - Ant Colony Optimization
  - Evolutionary algorithms
3 Empirical analysis
  - Run-time distributions
4 Engineering SLS algorithms
5 Recent trends

Stochastic local search

enormous research efforts
• hundreds / thousands of publications
• conference series (MIC); 190 submissions in 2005
• sub-areas become established fields (evolutionary algorithms, swarm intelligence)

significant successes and developments
• excellent results in many application areas
• new algorithmic ideas (MAs, ACO, VLSN, etc.)
• large flexibility and robustness
• sophisticated data structures
• sufficient computational power nowadays available

Combinatorial problems

examples
• finding minimum cost schedule to deliver goods
• finding optimal sequence of jobs in production line
• finding best allocation of flight crews to airplanes
• finding a best routing for Internet data packets
• ... and many more

about the problems we tackle
• arise in many real-world applications
• many have high computational complexity (NP-hard)
• in research, often abstract versions of real-world problems are treated

Traveling Salesman Problem (TSP)

• given: fully connected, weighted graph G = (V, E, d)
• goal: find shortest Hamiltonian cycle
• hardness: NP-hard
• interest: standard benchmark problem for algorithmic ideas

Graph Coloring Problem (GCP)

• given: graph G = (V, E)
• decision version: given k colours, is there a proper colouring?
• optimisation version: what is the smallest k for which a proper colouring exists?
• hardness: NP-hard
• interest: paradigmatic constrained problem

Search paradigms (1)

systematic search
• traverse search space of instances in systematic manner
• complete: guaranteed to find optimal solution in finite amount of time

local search
• start at some initial solution
• iteratively move from search position to neighbouring one
• incomplete: not guaranteed to find optimal solutions

Search paradigms (2)

constructive local search
• search space = partial candidate solutions
• search step = extension with solution components
• example: sequential algorithms for GCP

perturbative local search
• search space = complete candidate solutions
• search step = modification of solution components
• example: 2-exchange

Stochastic local search

Many prominent local search algorithms use randomised choices in generating and modifying candidate solutions. These stochastic local search (SLS) algorithms are one of the most successful and widely used approaches for solving hard combinatorial problems.

Some well-known SLS methods and algorithms:
• Lin-Kernighan Algorithm for the TSP
• Simulated Annealing
• Ant Colony Optimisation

Stochastic local search: global view

• vertices: candidate solutions (search positions)
• edges: connect neighbouring positions
• s: (optimal) solution
• c: current search position

[figure: neighbourhood graph with current search position c and optimal solution s]

Stochastic local search: local view

[figure: local neighbourhood of the current search position c, with solution s]

• next search position is selected from local neighbourhood based on local information, e.g., heuristic values.

Definition: Stochastic Local Search Algorithm (1)

For given problem instance π:
• search space S(π) (e.g., for GCP: set of all complete assignments of k colours to nodes)
• solution set S′(π) ⊆ S(π) (e.g., for GCP: proper colourings of the given graph)
• neighbourhood relation N(π) ⊆ S(π) × S(π) (e.g., for GCP: neighbouring colour assignments differ in the colour assignment of exactly one node)

Definition: Stochastic Local Search Algorithm (2)

• set of memory states M(π) (may consist of a single state, for SLS algorithms that do not use memory)
• initialisation function init: ∅ ↦ D(S(π) × M(π)) (specifies probability distribution over initial search positions and memory states)
• step function step: S(π) × M(π) ↦ D(S(π) × M(π)) (maps each search position and memory state onto a probability distribution over subsequent, neighbouring search positions and memory states)
• termination predicate terminate: S(π) × M(π) ↦ D({⊤, ⊥}) (determines the termination probability for each search position and memory state)

procedure SLS-Decision(π)
  input: problem instance π ∈ Π
  output: solution s ∈ S′(π) or ∅

  (s, m) := init(π)
  while not terminate(π, s, m) do
    (s, m) := step(π, s, m)
  end
  if s ∈ S′(π) then
    return s
  else
    return ∅
  end
end SLS-Decision

procedure SLS-Minimisation(π′)
  input: problem instance π′ ∈ Π′
  output: solution ŝ ∈ S′(π′) or ∅

  (s, m) := init(π′)
  ŝ := s
  while not terminate(π′, s, m) do
    (s, m) := step(π′, s, m)
    if f(π′, s) < f(π′, ŝ) then
      ŝ := s
    end
  end
  if ŝ ∈ S′(π′) then
    return ŝ
  else
    return ∅
  end
end SLS-Minimisation
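Expressed in code, the minimisation scheme above looks roughly as follows. This is a minimal illustrative sketch, not from the slides; `init_fn`, `step_fn`, `terminate_fn` and the objective `f` stand for problem-specific callables, and all names are assumptions.

```python
def sls_minimisation(init_fn, step_fn, terminate_fn, f):
    """Generic SLS minimisation loop that tracks the incumbent (best-so-far) solution.

    init_fn()          -> (s, m): initial candidate solution and memory state
    step_fn(s, m)      -> (s, m): one randomised search step
    terminate_fn(s, m) -> bool:   whether to stop
    f(s)               -> float:  objective value to be minimised
    """
    s, m = init_fn()
    best = s
    while not terminate_fn(s, m):
        s, m = step_fn(s, m)
        if f(s) < f(best):
            best = s  # update incumbent solution
    return best
```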

Note
• procedural versions of init, step and terminate implement sampling from the respective probability distributions.
• memory state m can consist of multiple independent attributes, i.e., M(π) := M1 × M2 × ... × Ml(π).

Evaluation function
• function g(π): S(π) ↦ ℝ that maps candidate solutions of a given problem instance π onto real numbers, such that global optima correspond to solutions of π;
• used for ranking or assessing neighbours of the current search position to provide guidance to the search process.

Iterative Improvement (II)

• does not use memory
• init: uniform random choice from S
• step: uniform random choice from improving neighbours, i.e., step(s)(s′) := 1/#I(s) if s′ ∈ I(s), and 0 otherwise, where I(s) := {s′ ∈ S | N(s, s′) ∧ g(s′) < g(s)}
• terminates when no improving neighbour is available, that is, in a local minimum
• different variants through modifications of the step function

Note: II is also known as iterative descent or hill-climbing.

Example: iterative improvement for GCP

• search space S: set of all colour assignments of k colours to nodes in the given graph G
• solution set S′: set of all proper k-colourings
• neighbourhood relation N: 1-exchange neighbourhood
• memory: not used, i.e., M := {0}
• initialisation: uniform random choice from S, i.e., init()(a′) := 1/#S for all assignments a′
• evaluation function: g(a) := number of edges with both end-nodes assigned the same colour
• step function: uniform random choice from most improving neighbours
• termination: when no improving neighbour is available
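A small, illustrative sketch of this scheme (not from the slides): a candidate solution is a dict mapping vertices to colours, g counts conflicting edges, and a step recolours a single vertex (1-exchange), chosen uniformly among the most improving moves.

```python
import random

def g(edges, colouring):
    """Evaluation function: number of edges whose end-nodes share a colour."""
    return sum(1 for u, v in edges if colouring[u] == colouring[v])

def ii_gcp(vertices, edges, k, rng=random.Random(0)):
    """Iterative improvement for the GCP with 1-exchange moves (best improvement)."""
    col = {v: rng.randrange(k) for v in vertices}  # uniform random initialisation
    while True:
        current = g(edges, col)
        # evaluate all 1-exchange neighbours (recolour one vertex)
        moves = [((v, c), g(edges, {**col, v: c}))
                 for v in vertices for c in range(k) if c != col[v]]
        if not moves:
            return col
        best_val = min(val for _, val in moves)
        if best_val >= current:  # no improving neighbour: local minimum reached
            return col
        # uniform random choice among the most improving neighbours
        v, c = rng.choice([mv for mv, val in moves if val == best_val])
        col[v] = c
```

The repeated full evaluation of g costs O(|E|) per neighbour; in practice this would be combined with the incremental updates discussed on a later slide.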

Pivoting rules

In II, various mechanisms can be used for choosing an improving neighbour in each step:
• Best Improvement (aka gradient descent, greedy hill-climbing): choose a maximally improving neighbour, i.e., randomly select from I*(s) := {s′ ∈ N(s) | g(s′) = g*}, where g* := min{g(s′) | s′ ∈ N(s)}.
  Note: requires evaluation of all neighbours in each step.
• First Improvement: evaluate neighbours in fixed order, choose first improving step encountered.
  Note: can be much more efficient than Best Improvement; the order of evaluation can have significant impact on performance.
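The two pivoting rules can be sketched as follows (illustrative code, not from the slides); `neighbours(s)` is assumed to enumerate N(s) and `g` is the evaluation function.

```python
import random

def best_improvement_step(s, neighbours, g, rng=random):
    """Evaluate all neighbours; pick uniformly among the maximally improving ones."""
    nbrs = list(neighbours(s))
    if not nbrs:
        return None
    g_star = min(g(n) for n in nbrs)
    if g_star >= g(s):
        return None  # local minimum: no improving neighbour
    return rng.choice([n for n in nbrs if g(n) == g_star])

def first_improvement_step(s, neighbours, g):
    """Scan neighbours in a fixed order; return the first improving one."""
    g_s = g(s)
    for n in neighbours(s):
        if g(n) < g_s:
            return n
    return None  # local minimum
```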

Incremental updates (aka delta evaluations)

• Key idea: calculate effects of differences between current search position s and neighbours s′ on the evaluation function value.
• evaluation function values often consist of independent contributions of solution components; hence, g(s′) can be efficiently calculated from g(s) by the differences between s and s′ in terms of solution components.
• typically crucial for the efficient implementation of II algorithms (and other SLS techniques).
• sometimes requires the use of "complex" data structures (see research on the TSP)
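As an illustration (not from the slides), in the symmetric TSP the effect of a 2-exchange move can be computed from the four edges involved instead of re-summing the whole tour:

```python
def two_exchange_delta(tour, d, i, j):
    """Length change if the segment tour[i+1 .. j] is reversed (0 <= i < j < len(tour)).

    Only the two removed and the two added edges enter the delta:
    remove (a, b) and (c, e), add (a, c) and (b, e).
    """
    n = len(tour)
    a, b = tour[i], tour[(i + 1) % n]
    c, e = tour[j], tour[(j + 1) % n]
    return (d[a][c] + d[b][e]) - (d[a][b] + d[c][e])
```

A move is improving exactly when the returned delta is negative; the new tour length is the old one plus the delta.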

Neighbourhood pruning

• reduce size of neighbourhoods by excluding neighbours that are likely (or guaranteed) not to yield improvements in g.
• crucial for large neighbourhoods, but can also be very useful for small neighbourhoods (e.g., linear in instance size).

Example: candidate lists for the TSP

• intuition: high-quality solutions likely include short edges.
• candidate list of vertex v: list of v's nearest neighbours (limited number), sorted according to increasing edge weights.
• search steps (e.g., 2-exchange moves) always involve edges to elements of candidate lists.
• significant impact on performance of SLS algorithms for the TSP.
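A possible way to precompute such candidate lists (illustrative sketch, not from the slides):

```python
def build_candidate_lists(d, k=10):
    """For each vertex, keep its k nearest neighbours, sorted by increasing edge weight.

    d: symmetric distance matrix (list of lists); returns a dict vertex -> list of vertices.
    """
    n = len(d)
    return {v: sorted((u for u in range(n) if u != v), key=lambda u: d[v][u])[:k]
            for v in range(n)}
```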

Local minima and large neighbourhoods

local minima
• local minima depend on g and the neighbourhood relation N.
• larger neighbourhoods N(s) induce
  - neighbourhood graphs with smaller diameter;
  - fewer local minima.

more "powerful" II techniques
• variable neighbourhood descent
• variable depth search, ejection chains
• dynasearch
• very large scale neighbourhood search

Main problem remains: Getting trapped in local optima

SLS methods

modify neighbourhoods
• variable neighbourhood search

occasionally accept worse neighbours
• simulated annealing
• tabu search

modify evaluation function
• dynamic local search

generate new starting solutions (for local search)
• iterated local search
• ant colony optimization
• memetic algorithms / EAs

Simulated Annealing

Key idea: probabilistically accept worse neighbours, depending on a time-varying parameter called temperature and on the evaluation function difference.

Inspired by the physical annealing process:
• candidate solutions ≈ states of the physical system
• evaluation function ≈ thermodynamic energy
• globally optimal solutions ≈ ground states
• parameter T ≈ physical temperature

In the physical process (e.g., annealing of metals), perfect ground states are achieved by very slow lowering of the temperature.

Simulated Annealing (SA):
  determine initial candidate solution s
  set initial temperature T according to annealing schedule
  while termination condition is not satisfied:
    probabilistically choose a neighbour s′ of s using proposal mechanism
    if s′ satisfies probabilistic acceptance criterion (depending on T):
      s := s′
    update T according to annealing schedule

• 2-stage step function based on
  - proposal mechanism (often uniform random choice from N(s))
  - acceptance criterion (often Metropolis condition)
• annealing schedule (function mapping run-time t onto temperature T(t)):
  - initial temperature T0 (may depend on properties of the given problem instance)
  - temperature update scheme (e.g., geometric cooling: T := α · T)
  - number of search steps to be performed at each temperature (often a multiple of the neighbourhood size)
• termination predicate: often based on acceptance ratio, i.e., the ratio of accepted to proposed steps.

Example: Simulated Annealing for the TSP

• proposal mechanism: uniform random choice from 2-exchange neighbourhood;
• acceptance criterion: Metropolis condition (always accept improving steps, accept worsening steps with probability exp[(f(s) − f(s′))/T]);
• annealing schedule: geometric cooling T := 0.95 · T with n · (n − 1) steps at each temperature (n = number of vertices in the given graph), T0 chosen such that 97% of proposed steps are accepted;
• termination: when no improvement in solution quality has been found for five successive temperature values and the acceptance ratio is < 2%.
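A compact, illustrative sketch of such an SA algorithm for the TSP (not from the slides): it uses a fixed initial temperature and a fixed number of temperature levels instead of the adaptive T0 and termination rules described above, and re-evaluates tours from scratch for clarity.

```python
import math, random

def tour_length(tour, d):
    return sum(d[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def sa_tsp(d, t0=100.0, alpha=0.95, steps_per_temp=None, temp_levels=100,
           rng=random.Random(0)):
    n = len(d)
    steps_per_temp = steps_per_temp or n * (n - 1)
    tour = list(range(n))
    rng.shuffle(tour)                         # random initial tour
    best, t = list(tour), t0
    for _ in range(temp_levels):
        for _ in range(steps_per_temp):
            # proposal: uniform random 2-exchange (reverse a segment of the tour)
            i, j = sorted(rng.sample(range(n), 2))
            cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
            delta = tour_length(cand, d) - tour_length(tour, d)
            # Metropolis condition: always accept improvements,
            # accept worsening moves with probability exp(-delta / t)
            if delta <= 0 or rng.random() < math.exp(-delta / t):
                tour = cand
                if tour_length(tour, d) < tour_length(best, d):
                    best = list(tour)
        t *= alpha                            # geometric cooling
    return best
```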

Improvements:
• neighbourhood pruning (e.g., candidate lists for the TSP)
• greedy initialisation (e.g., by using the NNH for the TSP)
• low temperature starts (to prevent good initial candidate solutions from being too easily destroyed by worsening steps)
• look-up tables for acceptance probabilities: instead of computing the exponential function exp(Δ/T) for each step, with Δ := f(s) − f(s′) (expensive!), use a precomputed table for a range of argument values Δ/T.
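A minimal sketch of such a look-up table (illustrative, not from the slides); here `delta` denotes the worsening amount f(s′) − f(s), so the acceptance probability of a worsening step is exp(−delta/T), discretised over a grid of argument values.

```python
import math

class AcceptanceTable:
    """Precomputed values of exp(-x) for x = delta / T on a discretised grid [0, x_max]."""

    def __init__(self, x_max=10.0, resolution=1000):
        self.step = x_max / resolution
        self.table = [math.exp(-i * self.step) for i in range(resolution + 1)]

    def prob(self, delta, t):
        if delta <= 0:
            return 1.0  # improving (or equal) moves are always accepted
        idx = int(delta / t / self.step)
        return self.table[idx] if idx < len(self.table) else 0.0
```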

Remarks on SA

• historically important
• easy to implement
• has interesting theoretical properties (convergence), but these are of very limited practical relevance
• achieves good performance, often at the cost of substantial run-times

Tabu Search

Key idea: use aspects of search history (memory) to escape from local minima.

Simple Tabu Search:
• associate tabu attributes with candidate solutions or solution components.
• forbid steps to search positions recently visited by the underlying iterative best improvement procedure, based on tabu attributes.

Tabu Search (TS):
  determine initial candidate solution s
  while termination criterion is not satisfied:
    determine set N′ of non-tabu neighbours of s
    choose a best improving candidate solution s′ in N′
    update tabu attributes based on s′
    s := s′

• non-tabu search positions in N(s) are called admissible neighbours of s.
• after a search step, the current search position or the solution components just added/removed from it are declared tabu for a fixed number of subsequent search steps (tabu tenure).
• often, an additional aspiration criterion is used: this specifies conditions under which tabu status may be overridden (e.g., if the considered step leads to an improvement in the incumbent solution).

Example: Tabu Search for GCP - Tabucol (1)

• search space S: set of all colour assignments of k colours to nodes in the given graph G
• solution set S′: set of all proper k-colourings
• neighbourhood relation N: 1-exchange neighbourhood
• memory: associate tabu status with individual colour assignments to nodes, that is, forbid assignments (v, c).
• initialisation: uniform random choice from S, i.e., init()(a′) := 1/#S for all assignments a′
• evaluation function: g(a) := number of edges with both end-nodes assigned the same colour

Example: Tabu Search for GCP - Tabucol (2)

• search steps:
  - colour assignments (v, c) are tabu iff they have been changed in the last tt steps;
  - neighbouring assignments are admissible iff they can be reached by a non-tabu move or have a better evaluation function value than the best colour assignment seen so far (aspiration criterion);
  - choose uniformly at random a best improving admissible colour assignment
• termination: upon finding a proper colouring of G with k colours or after a given bound on the number of search steps.
• tabu tenure: depends on the number of vertices in conflict, tt := random(10) + δ · |V^c|, where V^c is the set of vertices currently involved in a conflict.
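A condensed, illustrative sketch of a Tabucol-style search (not from the slides; it re-evaluates g from scratch instead of using incremental updates, and simplifies the bookkeeping):

```python
import random

def tabucol(vertices, edges, k, max_steps=10000, delta=0.6, rng=random.Random(0)):
    """Tabu search for k-colouring in the spirit of Tabucol (illustrative sketch)."""
    def conflicts(c):
        return sum(1 for u, w in edges if c[u] == c[w])

    col = {v: rng.randrange(k) for v in vertices}
    best_val = conflicts(col)
    tabu = {}  # (vertex, colour) -> last step in which this assignment is tabu
    for step in range(max_steps):
        if conflicts(col) == 0:
            return col  # proper k-colouring found
        conflicted = {x for u, w in edges if col[u] == col[w] for x in (u, w)}
        moves = []
        for v in conflicted:
            for c in range(k):
                if c == col[v]:
                    continue
                val = conflicts({**col, v: c})
                is_tabu = tabu.get((v, c), -1) >= step
                # aspiration criterion: a tabu move is admissible if it beats best_val
                if not is_tabu or val < best_val:
                    moves.append(((v, c), val))
        if not moves:
            continue
        low = min(val for _, val in moves)
        (v, c), _ = rng.choice([m for m in moves if m[1] == low])
        # tabu tenure: tt = random(10) + delta * |V^c| (vertices currently in conflict)
        tt = rng.randrange(10) + int(delta * len(conflicted))
        tabu[(v, col[v])] = step + tt  # forbid restoring the old colour of v
        col[v] = c
        best_val = min(best_val, low)
    return None  # no proper colouring found within max_steps
```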

Further improvements can be achieved by using intermediate-term or long-term memory to achieve additional intensification or diversification.

Examples:
• occasionally backtrack to elite candidate solutions, i.e., high-quality search positions encountered earlier in the search; when doing this, all associated tabu attributes are cleared.
• freeze certain solution components and keep them fixed for long periods of the search.
• occasionally force rarely used solution components to be introduced into the current candidate solution.
• extend the evaluation function to capture the frequency of use of candidate solutions or solution components.

Tabu search algorithms are state of the art for solving many combinatorial problems, including:
• SAT and MAX-SAT
• the Constraint Satisfaction Problem (CSP)
• many scheduling problems

Crucial factors in many applications:
• efficient evaluation of candidate solutions (caching and incremental updating mechanisms)
• proper parameter tuning

Dynamic Local Search

• Key idea: modify the evaluation function whenever a local optimum is encountered, in such a way that further improvement steps become possible.
• associate penalty weights (penalties) with solution components; these determine the impact of the components on the evaluation function value.
• perform Iterative Improvement; when in a local minimum, increase the penalties of some solution components until improving steps become available.

Dynamic Local Search (DLS):
  determine initial candidate solution s
  initialise penalties
  while termination criterion is not satisfied:
    compute modified evaluation function g′ from g based on penalties
    perform subsidiary local search on s using evaluation function g′
    update penalties based on s

Dynamic Local Search (continued)

• modified evaluation function: g′(π′, s) := g(π′, s) + Σ_{i ∈ SC(π′, s)} penalty(i), where SC(π′, s) is the set of solution components of problem instance π′ used in candidate solution s.
• penalty initialisation: for all i: penalty(i) := 0.
• penalty update in local minimum s: typically involves a penalty increase for some or all solution components of s; often also occasional penalty decreases or penalty smoothing.
• subsidiary local search: often Iterative Improvement.

Potential problem: solution components required for (optimal) solutions may also be present in many local minima.

Possible solutions:
A: occasional decreases/smoothing of penalties.
B: only increase penalties of solution components that are least likely to occur in (optimal) solutions.

Implementation of B: only increase penalties of solution components i with maximal utility

  util(s′, i) := f_i(π′, s′) / (1 + penalty(i))

where f_i(π′, s′) is the solution quality contribution of i in s′. [Voudouris and Tsang, 1995]

Example: Guided Local Search (GLS) for the TSP

• given: TSP instance G
• search space: Hamiltonian cycles in G with n vertices; use standard 2-exchange neighbourhood.
• penalty initialisation: set all edge penalties to zero.
• subsidiary local search: Iterative First Improvement.
• penalty update: increment penalties for all edges with maximal utility by

  λ := 0.3 · w(s_2opt) / n

  where s_2opt is a 2-optimal tour.
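A sketch of the GLS penalty machinery for the TSP (illustrative, not from the slides): penalties live on undirected edges, the augmented evaluation function adds the penalties of the edges used by a tour, and the tour edges of maximal utility get their penalty raised by λ.

```python
def augmented_length(tour, d, penalty):
    """Modified evaluation function g': tour length plus penalties of the edges used."""
    n = len(tour)
    total = 0.0
    for i in range(n):
        u, v = tour[i], tour[(i + 1) % n]
        e = (u, v) if u < v else (v, u)
        total += d[u][v] + penalty.get(e, 0.0)
    return total

def gls_penalty_update(tour, d, penalty, lam):
    """Raise penalties of the tour edges with maximal utility util(e) = d(e) / (1 + penalty(e)).

    penalty maps undirected edges (u, v) with u < v to their current penalty;
    lam is the increment, e.g. 0.3 * w(s_2opt) / n as described above.
    """
    n = len(tour)
    edges = [tuple(sorted((tour[i], tour[(i + 1) % n]))) for i in range(n)]
    util = {e: d[e[0]][e[1]] / (1.0 + penalty.get(e, 0.0)) for e in edges}
    u_max = max(util.values())
    for e, u in util.items():
        if u == u_max:
            penalty[e] = penalty.get(e, 0.0) + lam
```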

Earlier, closely related methods:
• Breakout Method [Morris, 1993]
• GENET [Davenport et al., 1994]
• clause weighting methods for SAT [Selman and Kautz, 1993; Cha and Iwama, 1996; Frank, 1997]

Dynamic local search algorithms are state of the art for many problems, including:
• SAT
• MAX-CLIQUE

Iterated Local Search

Key idea: use two types of SLS steps:
• subsidiary local search steps for reaching local optima as efficiently as possible (intensification)
• perturbation steps for effectively escaping from local optima (diversification).

also: use an acceptance criterion to control diversification vs intensification behaviour.

Iterated Local Search (ILS):
  determine initial candidate solution s
  perform subsidiary local search on s
  while termination criterion is not satisfied:
    r := s
    perform perturbation on s
    perform subsidiary local search on s
    based on acceptance criterion, keep s or revert to s := r
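As a sketch (illustrative, not from the slides), the ILS control loop can be written with the four components as callables:

```python
def iterated_local_search(init_fn, local_search, perturb, accept, terminate):
    """Generic ILS skeleton; accept(old, new) returns the solution to continue from."""
    s = local_search(init_fn())
    while not terminate(s):
        r = s                          # remember the current local optimum
        s_new = local_search(perturb(s))
        s = accept(r, s_new)           # keep s_new or revert to r
    return s
```

For instance, accept = lambda old, new: min(old, new, key=f) realises the "better of the two" acceptance criterion discussed below.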

Note:
• subsidiary local search results in a local minimum.
• ILS trajectories can be seen as walks in the space of local minima of the given evaluation function.
• perturbation phase and acceptance criterion may use aspects of the search history (i.e., limited memory).
• in a high-performance ILS algorithm, subsidiary local search, perturbation mechanism and acceptance criterion need to complement each other well.

Subsidiary local search

• more effective subsidiary local search procedures lead to better ILS performance.
  Example: 2-opt vs 3-opt vs LK for the TSP.
• often, subsidiary local search = iterative improvement, but more sophisticated SLS methods can be used (e.g., Tabu Search).

Perturbation mechanism

• needs to be chosen such that its effect cannot be easily undone by the subsequent local search phase (often achieved by search steps in a larger neighbourhood).
  Example: local search = 3-opt, perturbation = 4-exchange steps in ILS for the TSP.
• weak perturbation ⇒ short subsequent local search phase; but: risk of revisiting the current local minimum.
• strong perturbation ⇒ more effective escape from local minima; but: may have similar drawbacks as random restart.
• advanced ILS algorithms may change nature and/or strength of the perturbation adaptively during the search.

Acceptance criteria

• always accept the better of the two candidate solutions ⇒ ILS performs Iterative Improvement in the space of local optima reached by the subsidiary local search.
• always accept the more recent of the two candidate solutions ⇒ ILS performs a random walk in the space of local optima reached by the subsidiary local search.
• intermediate behaviour: select between the two candidate solutions based on the Metropolis criterion (e.g., used in Large Step Markov Chains [Martin et al., 1991]).
• advanced acceptance criteria take into account the search history, e.g., by occasionally reverting to the incumbent solution.

Example: Iterated Local Search for the TSP (1)

• given: TSP instance G.
• search space: Hamiltonian cycles in G
• subsidiary local search: Lin-Kernighan variable depth search algorithm
• acceptance criterion: always return the better of the two given candidate round trips.

Example: Iterated Local Search for the TSP (2)

• perturbation mechanism: 'double-bridge move' = particular 4-exchange step:

[figure: double-bridge move, cutting the tour into four segments A, B, C, D and reconnecting them]

note:
• cannot be directly reversed by a sequence of 2-exchange steps as performed by "usual" LK implementations.
• empirically shown to be effective independent of instance size.
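A possible implementation of the double-bridge perturbation (illustrative, not from the slides):

```python
import random

def double_bridge(tour, rng=random.Random(0)):
    """Double-bridge move: cut the tour into four segments A, B, C, D at three random
    cut points and reconnect them in the order A, C, B, D (a particular 4-exchange)."""
    n = len(tour)
    i, j, k = sorted(rng.sample(range(1, n), 3))
    a, b, c, d = tour[:i], tour[i:j], tour[j:k], tour[k:]
    return a + c + b + d
```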

ILS: example results for the TSP

solution quality for iterated 3-opt, CLK-ABCC, and ILK-H; given is the percentage deviation from the optimum over 10 trials of 600 seconds each on an Athlon 1.2 GHz CPU, 1 GB RAM

           | Iterated-3-opt        | CLK-ABCC              | ILK-H
Instance   | Δmin   Δavg   Δmax    | Δmin   Δavg   Δmax    | Δmin   Δavg   Δmax
fnl4461    | 0.20   0.28   0.35    | 0.03   0.08   0.11    | 0.0    <0.01  <0.01
pla7397    | 0.22   0.35   0.48    | 0.04   0.19   0.33    | 0.01   0.03   0.05
rl11849    | 0.44   0.66   0.76    | 0.16   0.25   0.39    | 0.04   0.06   0.10
usa13509   | 0.39   0.51   0.58    | 0.10   0.13   0.17    | 0.03   0.04   0.06

Iterated local search algorithms . . .

• are typically rather easy to implement (given existing implementations of subsidiary simple SLS algorithms);
• achieve state-of-the-art performance on many combinatorial problems, including the TSP.

there are many SLS approaches that are closely related to ILS, including:
• Large Step Markov Chains [Martin et al., 1991]
• Chained Local Search [Martin and Otto, 1996]
• Variants of Variable Neighbourhood Search (VNS) [Hansen and Mladenović, 2002]

Ant Colony Optimization

I guess I do not have to explain this one here ;-)

Evolutionary Algorithms

Key idea: iteratively apply the genetic operators mutation, recombination and selection to a population of candidate solutions.

inspired by a simple model of biological evolution:
• mutation introduces random variation in the genetic material of individuals.
• recombination of genetic material during sexual reproduction produces offspring that combine features inherited from both parents.
• differences in evolutionary fitness lead to selection of genetic traits ('survival of the fittest').

Evolutionary Algorithm (EA):
  determine initial population sp
  while termination criterion is not satisfied:
    generate set spr of new candidate solutions by recombination
    generate set spm of new candidate solutions from spr and sp by mutation
    select new population sp from candidate solutions in sp, spr, and spm

problem: pure evolutionary algorithms often lack the capability of sufficient search intensification.
solution: apply subsidiary local search after initialisation, mutation and recombination.

⇒ Memetic Algorithms (aka Genetic Local Search)

Memetic Algorithm (MA):
  determine initial population sp
  perform subsidiary local search on sp
  while termination criterion is not satisfied:
    generate set spr of new candidate solutions by recombination
    perform subsidiary local search on spr
    generate set spm of new candidate solutions from spr and sp by mutation
    perform subsidiary local search on spm
    select new population sp from candidate solutions in sp, spr, and spm

Remarks on ACO, EAs / MAs
• use of populations leads to a "natural" increase in search space exploration
• allow for a very large number of different implementation choices
• this makes these SLS methods very flexible
• but also results in more complex algorithm design and parameter tuning
• typically reach best performance when using implementation choices that take into account problem characteristics
• achieve very good performance on a wide range of problems

Characteristics of SLS methods

Feature                  SA   TS   DLS  ILS  MA   ACO
single trajectory        +    +    +    -    -    -
population               -    -    -    -    +    +
memory                   -    +    +    ±    ±    ±
multiple neighborhoods   -    -    -    +    ±    -
sol. construction        -    -    -    -    -    +
dynamic g(x)             -    ±    +    -    -    -
nature-inspired          +    -    -    -    +    +

+: feature present, ±: partially present, -: not present

SLS: empirical analysis

why empirical analysis?
• SLS algorithms are very complex systems
• theoretical analysis is very hard already for rather simplistic variants of the real algorithms, and the practical applicability of theoretical results is rather limited
• empirical analysis has proven successful in many other sciences (medicine, physics, chemistry, biology, ...)

Empirical analysis

Aspects of stochastic search performance
• variability due to randomisation
• robustness w.r.t. parameter settings
• robustness across different instance types
• scaling with problem size

Insights into algorithmic performance...
• help assessing suitability for applications
• provide basis for comparing algorithms
• characterise algorithm behaviour
• facilitate improvements of algorithms

Run-time distributions

traditional empirical approach (worst case?)

- run the SLS algorithm n times with time limit tmax
- compute averages, standard deviations

more detailed analysis (on individual instances)

- the behaviour of an SLS algorithm is described by the bivariate random variable (RunTime, SolutionQuality)
- empirically analyse these stochastic aspects
- often: analyse marginal distributions
- the run-time needed to reach solution cost c is described by (qualified) run-time distributions (RTDs)
- the cost value obtained after wall-clock time tmax is described by solution-quality distributions (SQDs)
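In compact notation (a sketch of the standard definitions; the symbols are chosen here for illustration): for a quality bound c, the qualified RTD is QRTD_c(t) = P(RT <= t and SQ <= c), and for a time cutoff tmax the SQD is SQD_tmax(q) = P(SQ <= q at time tmax), where (RT, SQ) is the bivariate random variable above.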


Measuring RTDs

- run the SLS algorithm n times on an individual instance with a large time limit tmax
- estimate RTDs for several bounds on solution quality (a code sketch for this follows the figure below)

[Figure: left, solution-quality distributions P(solve) vs. relative solution quality [%] for time limits 0.1s, 0.3s, 1s, 3.2s, 10s; right, qualified RTDs P(solve) vs. run-time [CPU sec] for quality bounds 0.8%, 0.6%, 0.4%, 0.2% and opt.]
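A minimal sketch of how such empirical qualified RTDs and SQDs could be computed from recorded run traces (the data layout and function names are assumptions for illustration, not taken from the slides):

    from bisect import bisect_right

    # Each run trace is assumed to be a list of (time, best_cost_so_far) pairs,
    # sorted by time, recorded whenever the incumbent of one SLS run improves.

    def qualified_rtd(traces, cost_bound, time_grid):
        """Empirical P(solve): fraction of runs reaching cost <= cost_bound by time t."""
        hit_times = sorted(
            next((t for t, c in trace if c <= cost_bound), float("inf"))
            for trace in traces
        )
        return [bisect_right(hit_times, t) / len(traces) for t in time_grid]

    def sqd(traces, t_max, cost_grid):
        """Empirical solution-quality distribution at cutoff time t_max."""
        final_costs = sorted(
            min((c for t, c in trace if t <= t_max), default=float("inf"))
            for trace in traces
        )
        return [bisect_right(final_costs, q) / len(traces) for q in cost_grid]

Plotting qualified_rtd for several cost bounds against a logarithmic time axis yields curves of the kind shown above.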


RTDs - advantages

- graphical illustration of algorithm behaviour
- comparison between algorithms
- empirical analysis of asymptotic behaviour
- characterisation of run-time behaviour
- hints on improvements to algorithm performance


RTDs - hints on asymptotic behaviour

[Figure: P(solve) vs. run-time [CPU sec] for MMAS and MMAS*.]

two ACO algorithms: MMAS (provably convergent) and MMAS* (essentially incomplete)


Functional characterisation of LVA (Las Vegas algorithm) behaviour (1)

- Empirical RTDs are step functions that approximate the underlying theoretical RTDs.
- For reasonably large sample sizes (numbers of runs), empirical RTDs can often be approximated well using much simpler continuous mathematical functions.
- Such functional approximations are useful for summarising and mathematically modelling empirically observed behaviour, which often provides deeper insights into LVA behaviour.
- Approximations with parameterised families of continuous distribution functions known from statistics, such as exponential or normal distributions, are particularly useful.


Approximation of an empirical RTD with an exponential distribution ed[m](x) := 1 - 2^(-x/m):

[Figure: empirical RLD and fitted exponential ed[61081.5] (P(solve) vs. run-time [search steps]), and chi-squared values vs. median run-time [search steps] with 0.01 and 0.05 acceptance thresholds.]


Functional characterisation of LVA behaviour (2)

- Model fitting techniques can be used to approximate empirical RTDs with parameterised cumulative distribution functions.
- The quality of the approximations can be assessed using statistical goodness-of-fit tests.
- This approach can be easily generalised to ensembles of problem instances.
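A minimal sketch of such a fit-and-test step (the use of numpy/scipy, the binning and the function names are assumptions for illustration, not the exact procedure behind the slides):

    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import chisquare

    def ed(x, m):
        """Exponential distribution parameterised by its median m: ed[m](x) = 1 - 2**(-x/m)."""
        return 1.0 - 2.0 ** (-x / m)

    def fit_and_test(hit_times, n_bins=10):
        """Fit ed[m] to an empirical RTD and run a chi-squared goodness-of-fit test."""
        hit_times = np.sort(np.asarray(hit_times, dtype=float))
        n = len(hit_times)
        p_solve = np.arange(1, n + 1) / n                 # empirical CDF values

        # least-squares fit of the CDF; curve_fit's default method is Levenberg-Marquardt
        (m_hat,), _ = curve_fit(ed, hit_times, p_solve, p0=[np.median(hit_times)])

        # chi-squared test: observed vs. expected bin counts under the fitted ed[m_hat]
        interior = np.quantile(hit_times, np.linspace(0.0, 1.0, n_bins + 1)[1:-1])
        edges = np.concatenate(([0.0], interior, [np.inf]))
        observed, _ = np.histogram(hit_times, bins=edges)
        expected = n * np.diff(ed(edges, m_hat))          # sums to n, since ed(0)=0 and ed(inf)=1
        stat, p_value = chisquare(observed, f_exp=expected, ddof=1)  # one fitted parameter
        return m_hat, stat, p_value

With a significance level of 0.05, the exponential model is rejected when p_value < 0.05; m_hat estimates the median run-time of the approximating distribution (the value 61081.5 plays this role in the figures shown here).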


Approximation of an empirical RTD with an exponential distribution ed[m](x) := 1 - 2^(-x/m):

[Figure (as above): empirical RLD with the fitted exponential ed[61081.5], and chi-squared values vs. median run-time [search steps] with 0.01 and 0.05 acceptance thresholds.]

The optimal-fit exponential distribution obtained from the Marquardt-Levenberg algorithm passes the χ² goodness-of-fit test at α = 0.05.


RTDs - hints on improvements

- special role of the exponential distribution (a consequence of its memory-less property)
- if the empirical RTD stagnates below an exponential fit for long run-times, restarting the search can improve P(solve), as the following plots illustrate

[Figure: P(solve) vs. run-time [CPU sec]; left, ILS compared with the exponential distribution ed[18]; right, ILS with and without dynamic restart.]


improvement through static restarts

[Figure (as above): P(solve) vs. run-time [CPU sec] for ILS vs. ed[18] and for ILS with and without restart.]
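A minimal sketch of a static-restart wrapper around a single SLS run (run_once, the cutoff value and the budget handling are assumptions for illustration):

    def with_static_restarts(run_once, cutoff, total_budget, rng):
        """Run an SLS algorithm repeatedly from scratch, cutoff seconds per run.

        run_once(time_limit, rng) is assumed to perform one independent run and
        return (solution, cost); restarts pay off when the single-run RTD stagnates
        below an exponential distribution for long run-times.
        """
        best, best_cost = None, float("inf")
        remaining = total_budget
        while remaining > 0:
            sol, cost = run_once(time_limit=min(cutoff, remaining), rng=rng)
            if cost < best_cost:
                best, best_cost = sol, cost
            remaining -= cutoff
        return best, best_cost

The cutoff would typically be chosen from the empirical RTD, e.g. around the point where the measured curve starts to fall below the fitted exponential.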


RTD analysis of ILS-TSP

[Figure: four panels of empirical solution probability vs. log CPU time for ILS on TSP, for quality bounds 0.05%, 0.5%, 0.5% and 0.25% above the optimum (and opt), each with a fitted distribution f(x).]


Improved ILS algorithm for TSP

- apply fitness-distance diversification after a fixed multiple of n steps without improvement (ILS-FDD); the diversification works as follows (a code sketch follows the list):

  1. generate a set P comprising p copies of s.
  2. apply one perturbation step followed by a local search to each candidate solution in P.
  3. let Q be the set of the q highest-quality candidate solutions from P (where 1 <= q <= p).
  4. let s~ be a candidate solution from Q with maximal distance to s. If d(s~, s) < dmin and a given maximal number of iterations has not been exceeded, go to step 2; otherwise return s~.
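A minimal sketch of this diversification procedure (the perturb, local_search, cost and distance functions and all parameter values are placeholders for illustration):

    import random

    def fdd_diversify(s, perturb, local_search, cost, distance,
                      p=10, q=3, d_min=20, max_iters=50, rng=random):
        """Fitness-distance diversification: move away from s without losing much quality."""
        s_div = s
        P = [s] * p                                    # step 1: p copies of s
        for _ in range(max_iters):
            # step 2: one perturbation + local search applied to every candidate in P
            P = [local_search(perturb(x, rng)) for x in P]
            # step 3: the q highest-quality candidates
            Q = sorted(P, key=cost)[:q]
            # step 4: among those, the candidate farthest from the original solution s
            s_div = max(Q, key=lambda x: distance(x, s))
            if distance(s_div, s) >= d_min:
                break
        return s_div

In ILS-FDD the returned solution would then replace the current search position, forcing the subsequent ILS iterations into a different region of the search space.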


ILS-FDD - example results

Example: performance results for ILS-Descent and ILS-FDD. Given are the frequency of finding optimal tours (fopt), the average percentage deviation (Δavg) and the average computation times (tavg), measured across 10 to 100 trials per instance on a 266 MHz Pentium II CPU with 512 MB RAM running Red Hat Linux.

                 ILS-Descent               ILS-FDD
Instance    fopt    Δavg    tavg       fopt    Δavg    tavg
rat783      0.71    0.03     238.8     1.0     0        159.5
pcb1173     0       0.26     461.4     0.56    0.01     652.9
d1291       0.08    0.29     191.2     1.0     0        245.4
fl1577      0.12    0.52     494.6     1.0     0        294.1
pr2392      0       0.23    1538.5     0.3     0.03    2909.1
pcb3038     0       0.22    3687.6     0       0.10    5535.9
fl3795      0.2     0.36    3601.9     0.9     <0.01   3506.7


SLS methods

enormous research efforts

- hundreds / thousands of publications
- conference series (MIC); 190 submissions in 2005
- sub-areas have become established fields (evolutionary algorithms, swarm intelligence)

significant successes and developments

- excellent results in many application areas
- new algorithmic ideas (MAs, ACO, VLSN, etc.)
- sophisticated data structures
- sufficient computational power nowadays available


SLS methods

current deficiencies

- no general guidelines for how to design efficient SLS algorithms; application is often considered an art
- high development times and expert knowledge required
- relationship between problem or search space characteristics and performance not well understood
- shortcomings in experimental methodology


SLS methods

insights from the Metaheuristics Network

- success with SLS algorithms is due to
  - the level of expertise of developer and implementer
  - the time invested in designing and tuning the SLS algorithm
  - creative use of insights into algorithm behaviour and its interplay with problem characteristics
- fundamental are issues like the choice of the underlying neighbourhoods, efficient data structures, and the creative use of algorithm components; to a lesser extent, strict adherence to the rules of a specific SLS method


SLS algorithm engineering

Algorithm engineering (AE)

- is conceived to fill the gap between traditional (often theoretical) research in algorithmics and practice
- process of designing, analysing, implementing, tuning, and experimentally evaluating algorithms [Demetrescu et al. 2003]

SLS algorithm engineering

- analogous high-level process to AE
- but much more difficult because
  - the problems tackled are highly complex (NP-hard)
  - the stochasticity of the algorithms makes analysis harder
  - there are many degrees of freedom


SLS algorithm engineering (2)

[Diagram: SLS algorithm engineering at the intersection of Applications, Computer Science, Operations Research, and Statistics.]


SLS algorithm engineering (3)

- complexity of the field requires strong knowledge in all areas
- better organisation of current knowledge on SLS
- tentative step-wise engineering procedure (strongly assisted by sound experimental methodologies):
  - get insight into the problem being tackled
  - implement basic constructive and local search procedures
  - starting from these, add complexity (simple SLS methods)
  - add advanced concepts like perturbations, populations
  - if needed: iterate through these steps

bottom-up approach

- add complexity step-by-step, starting from simple algorithms (see the sketch below)
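A minimal sketch of what such a bottom-up build-up could look like in code (all function names are placeholders, not from the slides):

    def build_sls(construct, improve, perturb, cost, iterations, rng):
        """Bottom-up skeleton: constructive start, iterative improvement, then an ILS-style loop."""
        s = improve(construct(rng))           # stages 1-2: construction + local search
        best = s
        for _ in range(iterations):           # stage 3: add a simple perturbation loop
            s_new = improve(perturb(s, rng))
            if cost(s_new) <= cost(s):        # "better or equal" acceptance, a natural first choice
                s = s_new
            if cost(s) < cost(best):
                best = s
        return best

Each stage can be evaluated empirically (e.g. via RTDs) before the next layer of complexity, such as a population or an adaptive acceptance criterion, is added.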


Trends

SLS methods

- study of more complex local search algorithms (complex neighbourhoods, etc.)
- combinations with exact algorithms
- hybrid SLS methods that combine several basic techniques

principled SLS algorithm design

- differences between SLS methods become increasingly blurred
- adoption of engineering methodologies for SLS implementation


Trends (2)

applications

- increasingly complex problems tackled
- transfer to "new" problem domains (dynamic, stochastic, multi-objective)

applicability of SLS algorithms

- tools for automated configuration and parameter tuning
- development of local search frameworks or class libraries (Hotframe, COMET, etc.)
- application of SLS methods to general problem formulations (mainly IP, CP)


Trends (3)

empirical analysis

- research on SLS algorithms is in the early stages of adopting thorough empirical analysis
- systematic usage of statistical analysis tools, machine learning techniques, etc.

problem characteristics, search space, SLS behaviour

- interplay of these elements not clearly understood
- insights important for the principled design of SLS algorithms

theoretical results

- analytical characterisation of performance
- theoretical models of SLS behaviour


Take-home message

- stochastic local search is one of the most successful and most widely used methods for solving combinatorial problems (but certainly no panacea)
- many open problems, significant research and practical potential
- SLS research is fun!