Mentales habitudes


Sunday, December 10, 2006

Metaheuristics and experimental research

Springer has just published a book on "Experimental Research in Evolutionary Computation", written by Thomas Bartz-Beielstein.

Thomas Bartz-Beielstein works on the statistical analysis of the behaviour of metaheuristics (see his tutorials at GECCO and CEC), and the publication of his book is really great news. I haven't read it yet, but the table of contents looks really promising. There is a true need for such work in the metaheuristics community, and in stochastic optimization in general.

A friend told me that the lack of experimental culture in the computer science community was a form of consensus, perhaps because the theoretical aspects of mathematics were seen as the "only way to make true science". This is a real problem when you deal with stochastic algorithms applied to real-world problems. Despite the fact that several early papers called for more rigorous experimental studies of metaheuristics (E.D. Taillard wrote papers on this problem several years ago, for example), the community does not seem to react quickly.

Yet, things are changing: following the series of CEC special sessions on benchmarks for metaheuristics, there are more and more papers on how to test stochastic optimization algorithms and present the results. I think this book comes at the right time... the next step will be to promote the dissemination of the result data (and code!), in an open format, along with the papers.

Saturday, October 28, 2006

Frameworks for metaheuristics

Note that the descriptions are taken from the projects' web sites.

As one can see, most of these packages are designed for evolutionary algorithms, but I recommend trying out some of the generic frameworks, because "genetic" algorithms are not always the best choice for solving an optimization problem, despite how widespread they are.

Here are the frameworks I would recommend. They are free software: you can use them, study the code, modify it and redistribute it (precious qualities for frameworks).

I would also recommend C or C++, which make it possible to implement fast programs while using object-oriented programming. C++ compilers are also available for a wide choice of platforms (with a special mention for GCC, which is free software). A fast program is crucial for testing algorithms on real problems, and using a well-known language is a good idea.

The main idea behind the design of each framework is indicated by one of the following keywords:

  • template: designing a new algorithm consists in extending a base class. This is perhaps the simplest object model to understand, but it can be difficult to reuse existing code.
  • component: designing a new algorithm consists in selecting its components from the available operators. This makes it easy to implement algorithms, but the underlying model can be quite difficult to understand, and it can be hard to add a new algorithmic paradigm (generally used for evolutionary algorithms).
  • function: designing a new algorithm consists in using the framework's primitives. Simple to understand, but one must learn the primitives and still code a lot by hand.
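To make the "template" model concrete, here is a minimal sketch in Python (chosen for brevity, even though the frameworks below are mostly C++). All class and method names are invented for the example, not taken from any listed framework: the framework owns the generic loop, and a new algorithm is just one subclass overriding a hook.

```python
import random

class Metaheuristic:
    """Hypothetical framework base class: owns the generic iteration loop."""
    def __init__(self, objective):
        self.objective = objective

    def optimize(self, start, iterations=1000):
        current = best = start
        for _ in range(iterations):
            current = self.step(current)
            if self.objective(current) < self.objective(best):
                best = current
        return best

    def step(self, current):
        raise NotImplementedError  # the hook each new algorithm must extend

class RandomDescent(Metaheuristic):
    """A new algorithm = one subclass: accept a random move only if it improves."""
    def step(self, current):
        candidate = current + random.uniform(-0.5, 0.5)
        if self.objective(candidate) < self.objective(current):
            return candidate
        return current

random.seed(42)
solver = RandomDescent(lambda x: x * x)
best = solver.optimize(10.0)
```

The drawback the list mentions is visible here: reusing the descent rule in another algorithm means untangling it from the class hierarchy.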


Here is my list:

  • Open Metaheuristics: a library aimed at the design of metaheuristics (i.e. genetic/evolutionary algorithms, tabu search, simulated annealing, ant colony algorithms, etc.). One of the main goals of oMetah is to permit rigorous empirical tests of metaheuristics, through a statistical approach. (C++, LGPL, template)
  • *EO: a set of paradigm-free Evolutionary Computation libraries dedicated to the flexible design of EAs through evolving objects superseding the most common dialects (Genetic Algorithms, Evolution Strategies, Evolutionary Programming and Genetic Programming). (C++, GPL, component)
  • Sferes: a framework that gathers an evolution framework with a simulation framework. It is dedicated to experiments involving the design of artificial agents through artificial evolutionary algorithms, like Genetic Algorithms or Evolution Strategies. (C++, CeCILL, component)
  • openBEAGLE: Evolutionary Computation (EC) framework. (C++, LGPL, component).
  • PISA: Platform and Programming Language Independent Interface for Search Algorithms. PISA is mainly dedicated to multi-objective search, where the optimization problem is characterized by a set of conflicting goals and not just one criterion that needs to be optimized. (C, BSD, function)
  • GAlib: defines classes for using genetic algorithms to do optimization in any C++ program, using any representation and genetic operators. The distribution includes extensive documentation and many examples. (C++, MIT, component)
  • MOMHLib++: a library of C++ classes that implements a number of multiple objective metaheuristics. It has been developed on the basis of former implementations of Pareto simulated annealing (PSA) and multiple objective genetic local search (MOGLS). (C++, GPL, template).
  • ECJ: Java Evolutionary Computation Research System. (Java, AFL, component)
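Several of the frameworks above (PISA, MOMHLib++) target multi-objective problems, where solutions are compared by Pareto dominance rather than by a single criterion. A minimal Python sketch of that comparison, with function names of my own invention (not the API of any listed library):

```python
def dominates(a, b):
    """True if objective vector a is at least as good as b everywhere
    (minimization) and strictly better somewhere."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Keep only the non-dominated objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Two conflicting goals: no single point is best on both criteria at once.
front = pareto_front([(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)])
```

Here (3, 3) and (4, 4) are dominated by (2, 2), so the front keeps the three trade-off solutions (1, 5), (2, 2) and (5, 1).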


The following frameworks are not the ones I would recommend, but they have some properties that could be interesting:

If you want to find even more frameworks, try searching for "genetic" on SourceForge.

Many of these projects are dead; be careful if you need a maintained framework. Take a close look at the project activity: generally, the number of developers and file releases gives you a good idea of a framework's vitality. Also check whether it is a student project for a training course or whether it is made by professional researchers.

Sunday, September 10, 2006

Confirmation of the biological foundations of particle swarm optimization

Particle swarm optimization (PSO) is based on biological (and theoretical physics) work concerning self-organization in animal groups. Up to now, theory has explained that animals must adjust their direction in order to form a group. PSO uses this concept to build a set of vectors that explore the search space of an optimization problem while converging on an optimum.

One key prediction of the theory is a transition between the recruitment of individuals and collective motion. This transition "from disorder to order" has now been demonstrated in situ by biologists studying locusts. They filmed groups of 5 to 120 desert locusts for 8 hours in a circular arena and analysed the motion data. The study shows that at low density (under 25 individuals/m²), the animals move independently. Between 25 and 60 locusts/m², they form collective groups whose direction can change abruptly. Beyond 75 locusts/m², the coordinated marching is homogeneous.

While this should not change the use of PSO, which is a simplified model, it is always interesting to consider work on this transition between order and chaos in self-organized systems. Indeed, this transition can also occur in metaheuristics, and is perhaps interesting for further research, for example in dynamic optimization.
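For readers unfamiliar with the method, here is a minimal one-dimensional PSO sketch in Python, showing the "set of vectors converging on an optimum" idea. The coefficient values (inertia w, cognitive c1, social c2) are common textbook choices, not those of any particular study:

```python
import random

def pso(objective, n_particles=20, iterations=200,
        w=0.7, c1=1.5, c2=1.5, bounds=(-10.0, 10.0)):
    lo, hi = bounds
    pos = [random.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]                    # each particle's best known position
    gbest = min(pos, key=objective)   # best position found by the swarm
    for _ in range(iterations):
        for i in range(n_particles):
            r1, r2 = random.random(), random.random()
            # velocity update: inertia + pull toward personal and global bests
            vel[i] = (w * vel[i]
                      + c1 * r1 * (pbest[i] - pos[i])
                      + c2 * r2 * (gbest - pos[i]))
            pos[i] += vel[i]
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i]
            if objective(pos[i]) < objective(gbest):
                gbest = pos[i]
    return gbest

random.seed(0)
best = pso(lambda x: (x - 3.0) ** 2)
```

The "adjust their direction to form a group" observation corresponds to the velocity update: each particle aligns its motion with the group's best-known positions.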

From Disorder to Order in Marching Locusts, Buhl et al., Science, Vol. 312, No. 5778, pp. 1402-1406, 2 June 2006.

Friday, September 1, 2006

Finding references about metaheuristics (and computer science)

The Internet is rapidly becoming the first place to search for scientific papers, but the number of places gathering resources has become really large. Here is a list of web sites with free access. These are places where you can find material about metaheuristics; I have not included all the available databases (nor journal web pages), despite the fact that metaheuristics are often applied to a wide range of fields.

  • Online collective bibliography managers are really useful for picking up references while browsing journal sites: they can automatically gather a reference's information, tag it, share your list with others and export it to your local reference manager.
  • digital libraries or paper databases:
    • CiteSeer: searches for citations and papers, shows citations inside each paper, permits corrections on items; really interesting for computer science.
    • ScienceDirect: database, requires registration, offers a watchlist based on email alerts.
    • arXiv: an e-print service, well-formatted resources, RSS syndication.
    • SpringerLink: books & papers database, RSS syndication, watchlist for registered users.
    • ACM Portal: digital library, no syndication, no free watchlist for registered users.
    • IEEE Xplore: database, no syndication, no watchlist, email alerts at field level only.
    • The Collection of Computer Science Bibliographies: database, RSS syndication.
    • PubMed: database, email alerts for registered users.
    • Blackwell Synergy: database, RSS/Atom syndication, email alerts.
    • Optimization Online: e-print service about optimization, monthly email alerts.
    • Scitation: database, no syndication, no free email alerts.
    • Wiley InterScience: database, no syndication, email alerts for registered users.
  • general-purpose search engines:

Wednesday, August 30, 2006

Metaheuristics & benchmarks at CEC

The IEEE Congress on Evolutionary Computation (CEC) is a well-known event that takes place every year.

Since 2005, there has been an interesting group of special sessions, organized by Ponnuthurai Nagaratnam Suganthan:

What is really interesting in these sessions is the systematic presence of an implemented, general benchmark, built after discussion between researchers.

This is an extremely necessary practice which is, unfortunately, not yet widespread. Indeed, it is the first step toward a rigorous performance assessment of metaheuristics (the second one being a true statistical approach, and the third one a careful data presentation).
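As a sketch of what "a true statistical approach" implies in practice: since the algorithms are stochastic, a single run means little, and results should be reported as statistics over many independent runs. The random-search solver below is only a hypothetical stand-in for whatever algorithm is under test:

```python
import random
import statistics

def random_search(objective, iterations=100, bounds=(-5.0, 5.0)):
    """Stand-in stochastic optimizer: returns the best objective value found."""
    best = random.uniform(*bounds)
    for _ in range(iterations):
        x = random.uniform(*bounds)
        if objective(x) < objective(best):
            best = x
    return objective(best)

# Benchmark protocol: many independent runs, then summary statistics,
# never a single "lucky" run.
random.seed(1)
results = [random_search(lambda x: x * x) for _ in range(30)]
summary = {
    "mean": statistics.mean(results),
    "std": statistics.stdev(results),
    "median": statistics.median(results),
}
```

Publishing the raw `results` list alongside the summary is exactly the "dissemination of the result data" advocated elsewhere on this blog.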

Friday, August 25, 2006

Some blogs about metaheuristics

As this blog is just starting, I have searched for people writing about metaheuristics, and found some blogs that are really interesting.

  • IlliGAL Blogging: the blog of David E. Goldberg... well, he is a great researcher and everybody knows his work. His blog is really interesting.
  • Pensive Pondering: an awesome blog where Jason Brownlee talks about biologically inspired computation. Surprisingly, I found an old post on his blog about the definition of metaheuristics, where one can see that we share a lot of ideas on the subject (see the previous post).
  • Adaptare: Julián García writes on "Evolutionary computation, evolutionary dynamics, cultural evolution and the like", an interesting and active blog.

These are the best blogs I have found, you will find some other (less interesting) blogs in my OPML feedlist.

Wednesday, August 23, 2006

What are metaheuristics?

Despite the title of this blog, the term metaheuristic is not really well defined.

One of the first occurrences of the term can (of course) be found in a paper by Fred Glover[1]: Future Paths for Integer Programming and Links to Artificial Intelligence[2]. In the section concerning tabu search, he talks about a "meta-heuristic":

Tabu search may be viewed as a "meta-heuristic" superimposed on another heuristic. The approach undertakes to transcend local optimality by a strategy of forbidding (or, more broadly, penalizing) certain moves.

In the AI field, a heuristic is a specific method that helps solve a problem (from the Greek for "to find"), but how should we understand the word meta? In Greek, it means "after", "beyond" (as in metaphysics) or "about" (as in metadata). Reading Glover, metaheuristics seem to be heuristics beyond heuristics, which is a fine original definition, but what is the definition nowadays? The literature is really prolific on this subject, and the definitions are numerous.

There are at least three tendencies:

  1. one that considers that the most important aspect of metaheuristics is the combination of several heuristics,
  2. another that promotes the fact that metaheuristics are designed as general methods that can tackle several problems without major changes in their design,
  3. a last one that uses the term only for evolutionary algorithms hybridized with local searches (methods that are called memetic algorithms from the other points of view).

The last tendency is quite minor in the general literature; it can mainly be found in the field of evolutionary computation. Separating the two other tendencies is more difficult.

Here are some definitions gathered from more or less general papers:

"iterative generation process which guides a subordinate heuristic by combining intelligently different concepts for exploring and exploiting the search space" (Osman and Laporte, 1996[3])

"(metaheuristics) combine basic heuristic methods in higher level frameworks aimed at efficiently and effectively exploring a search space" (Blum and Roli, 2003[4])

"a metaheuristic can be seen as a general-purpose heuristic method designed to guide an underlying problem-specific heuristic (...) A metaheuristic is therefore a general algorithmic framework which can be applied to different optimization problems with relative few modifications to make them adapted to a specific problem." (Dorigo and Stützle, 2004[5])

"(metaheuristics) apply to all kinds of problems (...) are, at least to some extent, stochastic (...) direct, i.e. they do not resort to the calculation of the gradients of the objective function (...) inspired by analogies: with physics, biology or ethology" (Dréo, Siarry, Petrowski and Taillard, 2006[6])

One can summarize by enumerating the expected characteristics:

  • optimization algorithms,
  • with an iterative design,
  • combining low level heuristics,
  • aiming to tackle a wide range of "hard" problems.

As pointed out by the last reference, a large majority of metaheuristics (not to say all of them) use at least one stochastic (probabilistic) process and do not use more information than the solution and the associated value(s) of the objective function.

Talking about combining heuristics seems appropriate for Ant Colony Optimization, which specifically needs one (following Dorigo's point of view), but it is less obvious for Evolutionary Algorithms. One can consider that mutation, or even the method's strategy itself, is a heuristic, but isn't it too general to be called a heuristic?

If we set aside the difficulty of demarcating what can be called a heuristic and what the scope of the term meta is, one can simply look at how specialists use the term. Although the definition could apply to several fields (data mining, machine learning, etc.), the term is used for optimization algorithms. This is perhaps the best reason of all: the term permits separating a research field from others, thus adding a little bit of marketing...

I would thus use this definition:

Metaheuristics are algorithms designed to tackle "hard" optimization problems with the help of iterative stochastic processes. These methods manipulate direct samples of the objective function, and can be applied to several problems without major changes in their design.
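As an illustration of this definition, here is a simulated annealing sketch in Python: iterative, stochastic, and touching the problem only through sampled objective values, with no gradients. The geometric cooling schedule and parameter values are one common choice among others, not a canonical setting:

```python
import math
import random

def simulated_annealing(objective, start, iterations=2000,
                        t0=10.0, cooling=0.995, step=1.0):
    current, f_current = start, objective(start)
    best, f_best = current, f_current
    t = t0
    for _ in range(iterations):
        candidate = current + random.uniform(-step, step)
        f_candidate = objective(candidate)
        # stochastic acceptance: worse moves pass with a
        # temperature-dependent probability, allowing escapes from local optima
        if (f_candidate < f_current
                or random.random() < math.exp((f_current - f_candidate) / t)):
            current, f_current = candidate, f_candidate
        if f_current < f_best:
            best, f_best = current, f_current
        t *= cooling  # geometric cooling
    return best

# A multimodal test function: only its sampled values are ever used.
random.seed(7)
best = simulated_annealing(lambda x: x * x + 10 * math.sin(x), start=8.0)
```

Note that nothing here is specific to the test function: the same loop applies to any objective, which is the "several problems without major changes" part of the definition.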


[1] A recurrent joke says that whatever your new idea is, it has already been written down by Glover

[2] Comput. & Ops. Res., Vol. 13, No. 5, pp. 533-549, 1986

[3] Metaheuristics: A bibliography, Annals of Operations Research, vol. 63, pp. 513-623, 1996

[4] Metaheuristics in combinatorial optimization: Overview and conceptual comparison, ACM Computing Surveys, vol. 35, issue 3, 2003

[5] Ant Colony Optimization, MIT Press, 2004

[6] Metaheuristics for Hard Optimization, Springer, 2006

Tuesday, August 22, 2006

A new book about "stigmergic optimization"

Springer has published a new book in their Studies in Computational Intelligence series.

Edited by Abraham, Grosan and Ramos, it covers "Stigmergic Optimization". Looking at the table of contents, one can see that it mainly deals with Ant Colony Optimization.

Of course, it also touches on so-called swarm intelligence and self-organization, but also on Particle Swarm Optimization, a method that is gathering more and more researchers these months.

Wednesday, August 2, 2006

Introductory bibliography to metaheuristics for hard optimization

Note that this bibliography is quite old (2003) and is aimed at French students.

If you need only one reference, it is (of course, because I'm one of the authors) this one:

  • Dréo, J. ; Petrowski, A. ; Taillard, E. ; Siarry, P. ; Metaheuristics for Hard Optimization: Methods and Case Studies, Springer, 2006, XII, 369 p., 140 illus., hardcover. ISBN 3-540-23022-X



  • Glover, F. W. ; Kochenberger, G. A. ; 2003 : Handbook of Metaheuristics, Kluwer Academic Publishers, International series in operations research and management science, Boston Hardbound.
  • Teghem, J. ; Pirlot, M. ; 2002 : Optimisation approchée en recherche opérationnelle. Recherches locales, réseaux neuronaux et satisfaction de contraintes, Hermès.
  • Pham, D.T. ; Karaboga, D. ; 2000 : Intelligent optimisation techniques. Genetic Algorithms, Tabu Search, Simulated Annealing and Neural Networks, Springer.
  • Saït, S.M. ; Youssef, H. ; 1999 : Iterative computer algorithms with applications in engineering, IEEE Computer Society Press.
  • Reeves, C.R., 1995 : Modern Heuristic Techniques for Combinatorial Problems, McGraw-Hill, Advanced Topics in Computer Science.


Simulated Annealing

  • Siarry, P.; Dreyfus, G. ; 1989 : La méthode du recuit simulé : théorie et applications, ESPCI -- IDSET, 10 rue Vauquelin, 75005 Paris.

Tabu Search

  • Glover, F. ; Laguna, M. ; 1997 : Tabu search, Kluwer Academic Publishers, Dordrecht.
  • Glover, F. ; 1989 : Tabu search --- part I, ORSA Journal on Computing, vol. 1, 190--206.
  • Glover, F. ; 1990 : Tabu search --- part II, ORSA Journal on Computing, vol. 2, 4--32.

Evolutionary Algorithms (aka Genetic Algorithms)

  • Baeck, T. ; Fogel, D. B. ; Michalewicz, Z. ; 2000 : Evolutionary computation 1: basic algorithms and operators, Institute of Physics Publishing.
  • Baeck, T. ; Fogel, D. B. ; Michalewicz, Z. ; 2000 : Evolutionary computation 2: advanced algorithms and operators, Institute of Physics Publishing.
  • Goldberg, D. E. ; 1994 : Algorithmes génétiques. exploration, optimisation et apprentissage automatique, Addison-Wesley France.
  • Koza, J. R. ; 1992 : Genetic programming I: on the programming of computers by means of natural selection, MIT Press.
  • Koza, J. R. ; 1994 : Genetic programming II: automatic discovery of reusable programs, MIT Press.

Ant Colony Algorithms

  • Bonabeau, E. ; Dorigo, M. ; Theraulaz, G. ; 1999 : Swarm Intelligence, From Natural to Artificial Systems, Oxford University Press.

Greedy Randomized Adaptive Search Procedure (GRASP)

Tech Reports
  • Resende, M.G.C. ; 2000 : Greedy randomized adaptive search procedures (GRASP), AT&T Labs-Research, TR 98.41.1.

Particle Swarm Optimization

  • Eberhart, R.C. ; Kennedy, J. ; Shi, Y. ; 2001 : Swarm Intelligence, Morgan Kaufmann, Evolutionary Computation.

Estimation of Distribution Algorithms

  • Larranaga, P. ; Lozano, J.A. ; 2002 : Estimation of Distribution Algorithms, A New Tool for Evolutionary Computation, Kluwer Academic Publishers, Genetic Algorithms and Evolutionary Computation.

Related Topics

Multi-Objective Optimization

  • Collette, Y. ; Siarry, P. ; 2002 : Optimisation multiobjectif, Eyrolles.
  • Deb, K. ; 2001 : Multi-objective optimization using evolutionary algorithms, John Wiley and sons.

Constrained Optimization

  • Michalewicz, Z. ; 1996 : Genetic algorithms + data structures = evolution programs, Springer Verlag, third revised edition.


  • Camazine, S. ; Deneubourg, J.L. ; Franks, N. ; Sneyd, J. ; Theraulaz, G. ; Bonabeau, E. ; 2000 : Self-Organization in Biological Systems, Princeton University Press.

Tuesday, August 1, 2006

About this blog

This blog is an attempt to publish thoughts about metaheuristics and to share them with others. Indeed, blogs are fun, blogs are popular, ok... but most of all, blogs can be very useful for researchers, who constantly need to communicate and share ideas and information.

Metaheuristics are (well, that's one definition among others, but in my opinion the best one) iterative (stochastic) algorithms for "hard" optimization. Well-known metaheuristics are the so-called "genetic algorithms" (let's call them evolutionary algorithms), but they are not the only class: don't forget simulated annealing, tabu search, ant colony algorithms, estimation of distribution, etc.

This blog will try to focus on the theory, design, understanding, application, implementation and use of metaheuristics. I hope it will be useful to other people (researchers as well as users), and will be a place to share thoughts.

Welcome aboard, and let's sleep with metaheuristics.
