Mentales habitudes - Tag: evolutionary computation
<h3>The ultimate metaheuristic? (2008-09-11, nojhan)</h3>
<p>There exist many different algorithm families that can be called
"metaheuristics"; strictly speaking, there is a very, very large number
of <a href="http://metah.nojhan.net/tag/metaheuristic">metaheuristic</a> <em>instances</em>.</p>
<p>Defining what a metaheuristic "family" is, is a difficult problem: when may I
call this or that algorithm an evolutionary one? Is estimation of
distribution a sub-family of genetic algorithms? What is the difference between
ant colony optimization and stochastic gradient ascent? Etc.</p>
<p>Despite the <a href="http://metah.nojhan.net/post/2007/10/12/Classification-of-metaheuristics">difficulty of classifying
metaheuristics</a>, there are some interesting characteristics shared by
stochastic metaheuristics. Indeed, they all iteratively manipulate a
sample of the objective function<sup>[<a href="http://metah.nojhan.net/post/2008/09/11/#pnote-276667-1" id="rev-pnote-276667-1" name="rev-pnote-276667-1">1</a>]</sup>.</p>
<p>For example, <a href="http://metah.nojhan.net/tag/simulated%20annealing">simulated annealing</a> is
often depicted as a probabilistic <a href="http://metah.nojhan.net/tag/descent%20algorithm">descent
algorithm</a>, but it is more than that. Indeed, simulated annealing is based
on the <a href="http://metah.nojhan.net/tag/Metropolis-Hastings%20algorithm">Metropolis-Hastings
algorithm</a>, which is a way of sampling any probability distribution, as
long as you can calculate its density at any point. Thus, <strong>simulated
annealing uses an approximation of the objective function as a probability
density function to generate a <a href="http://metah.nojhan.net/tag/sampling">sampling</a></strong>.
It is even more obvious if you consider a step-by-step decrease of the
temperature. <a href="http://metah.nojhan.net/tag/estimation%20of%20distribution">Estimation of
distribution</a> algorithms are another obvious example: they explicitly manipulate
samplings, but one can have the same thoughts about <a href="http://metah.nojhan.net/tag/evolutionary%20computation">evolutionary algorithms</a>, even if they
manipulate the sampling rather implicitly.</p>
<p><img src="http://upload.wikimedia.org/wikipedia/commons/thumb/f/f9/Metaheuristic_parcours-population.png/250px-Metaheuristic_parcours-population.png" alt="" /></p>
<p>The diagram tries to illustrate this idea: (a) a descent algorithm can have
the same sampling behaviour as an iteration of (b) a "population" method.</p>
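<p>To make the link concrete, here is a minimal Python sketch of this view of simulated annealing (the function names and parameter values are mine, chosen for illustration): at a fixed temperature T, the accepted points are exactly a Metropolis-Hastings sample of the density proportional to exp(-f(x)/T), and the annealing schedule only changes which density is being sampled.</p>

```python
import math
import random

def simulated_annealing(f, x0, temp=10.0, cooling=0.95, steps_per_temp=50, t_min=1e-3):
    """Minimize f by Metropolis-Hastings sampling of exp(-f(x)/T).

    At a fixed temperature T, the accepted points form a sample of the
    density proportional to exp(-f(x)/T); lowering T step by step
    concentrates that density around the minima of f.
    """
    x, fx = x0, f(x0)
    best, fbest = x, fx
    T = temp
    while T > t_min:
        for _ in range(steps_per_temp):
            y = x + random.gauss(0.0, 1.0)  # symmetric random-walk proposal
            fy = f(y)
            # Metropolis rule: always accept improvements, and accept
            # degradations with probability exp(-(fy - fx) / T).
            if fy <= fx or random.random() < math.exp(-(fy - fx) / T):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
        T *= cooling  # step-by-step decrease of the temperature
    return best, fbest

random.seed(42)
x, fx = simulated_annealing(lambda v: (v - 3.0) ** 2, x0=-5.0)
```

<p>Note that the inner loop never looks at the gradient of f: it only needs to evaluate the density (here, the objective) at the proposed point, which is what makes the sampling interpretation work.</p>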
<p>Given these common processes, is it possible to design a kind of "universal"
metaheuristic? Theoretically, the answer is yes. For example, in the
continuous domain, consider an estimation of distribution algorithm using a
<a href="http://metah.nojhan.net/tag/mixture%20of%20gaussian%20kernel">mixture of Gaussian kernels</a>:
it can learn any probability density function (possibly needing an infinite
number of kernels). Thus, by carefully choosing the function to use at each
iteration and the selection operator, <strong>one can reproduce the behaviour
of any stochastic metaheuristic</strong>.</p>
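<p>As a sketch of such a kernel-based estimation of distribution algorithm (a hypothetical minimal instance written for this post; the bandwidth, population sizes and search domain are arbitrary choices, not prescriptions): at each iteration the selected points define a uniform mixture of Gaussian kernels, and sampling that mixture produces the next "population".</p>

```python
import random

def kernel_eda(f, dim=2, pop_size=40, n_select=10, bandwidth=0.3, iterations=60):
    """Estimation of distribution with a mixture of Gaussian kernels.

    Each iteration: evaluate the sample, keep the best points, then
    build the next sample from a mixture that puts one Gaussian kernel
    on each selected point (a simple kernel density estimate).
    """
    # Initial sample drawn uniformly in [-5, 5]^dim.
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(iterations):
        # Selection: keep the n_select best points of the sample.
        pop.sort(key=f)
        selected = pop[:n_select]
        # Sampling: draw each new point from a kernel centred on a
        # randomly chosen selected point (uniform mixture weights).
        pop = [
            [random.gauss(c, bandwidth) for c in random.choice(selected)]
            for _ in range(pop_size)
        ]
    return min(pop, key=f)

random.seed(1)
best = kernel_eda(lambda x: sum(xi ** 2 for xi in x))
```

<p>With a fixed bandwidth the sample cannot concentrate below the kernel width; making the bandwidth, the selection operator and the mixture weights adaptive is precisely the "configuration problem" discussed below.</p>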
<p>Of course, choosing the correct mixture (and the other parameters) is a very
difficult problem in practice. But I find it interesting that <strong>the
problem of designing a metaheuristic can be reduced to a configuration
problem</strong>.</p>
<div class="footnotes">
<h4>Notes</h4>
<p>[<a href="http://metah.nojhan.net/post/2008/09/11/#rev-pnote-276667-1" id="pnote-276667-1" name="pnote-276667-1">1</a>] Johann Dréo, Patrick Siarry, "<a href="http://www.nojhan.net/pro/spip.php?article26" hreflang="en">Stochastic
metaheuristics as sampling techniques using swarm intelligence</a>", in
"Swarm Intelligence: Focus on Ant and Particle Swarm Optimization", Felix T. S.
Chan, Manoj Kumar Tiwari (Eds.), Advanced Robotic Systems International, I-Tech
Education and Publishing, Vienna, Austria, ISBN 978-3-902613-09-7, December
2008</p>
</div>
<h3>The problem with spreading new metaheuristics (2008-03-03, nojhan)</h3>
<p><a href="http://www.blogger.com/profile/16939449739489587189">Marcelo De
Brito</a> had interesting thoughts about what he calls the <a href="http://geneticargonaut.blogspot.com/2008/02/evolving-grid-computing-optimization.html">
New Wave Of Genetic Algorithms</a>. He is surprised that when "evolutionary
computation" is applied to a new problem, the first algorithm used is the good
old canonical genetic algorithm, despite active research on
Estimation of Distribution Algorithms. <a href="http://www.blogger.com/profile/09333191187316058782">Julian Togelius</a> writes
that it may be because people do not understand other algorithms, or do not even
know that anything else exists.</p>
<p>I think that is definitely true. This subject is a kind of hobby for me.
Indeed, as I came from ecology to applied mathematics, I feel like a kind
of generalist researcher, not able to be the best somewhere, but trying
to be as good as possible in several fields. Concerning the field of what
Marcelo calls NWOGA, I would like to emphasize some other things.</p>
<p>As David E. Goldberg says in <a href="http://entrepreneurialengineer.blogspot.com/2007/05/genetic-algorithm-course-intro-lecture.html">
his courses</a>, "genetic algorithm" is the term everybody uses. For specialists, a
GA is just a kind of "evolutionary algorithm", with specific rules that are
defined more by <em>history</em> than by anything else.</p>
<p>The literature on evolutionary computation is quite large, the first
algorithms having been designed in 1965 (evolution strategies, followed by
evolutionary programming in 1966), making it difficult to spread deep changes
to basic concepts.</p>
<p>There exist a lot more stochastic algorithms for global optimization than
just evolutionary ones. I prefer to call them stochastic metaheuristics, or
simply metaheuristics, because this leads to far less bias than metaphoric
naming (cf. the previous post on the <a href="http://metah.nojhan.net/post/2007/10/12/Classification-of-metaheuristics">classification
of metaheuristics</a>).</p>
<p>For example, during my PhD thesis, I became convinced that some Ant Colony
Optimization algorithms were just equivalent to Estimation of Distribution
Algorithms when applied to continuous problems. Moreover, I am now convinced
that a lot of metaheuristics share some common stochastic sampling
processes that are not specifically related to evolution. For example,
mathematically, Simulated Annealing is just a kind of EDA using an
approximation of the objective function as a model (or the other way around, of
course).</p>
<p>As Julian says: <q>I know roughly what an EDA does, but I couldn't sit down
and implement one on the spot</q>. This is, in my humble opinion, one of the
most important things to keep in mind. Indeed, there are more and more papers
claiming that a correct parameter setting of a metaheuristic can match the
performance of any competing metaheuristic.</p>
<p>Thus, the true discriminatory criterion is not the fantasised intrinsic
capability, but the ease of implementation and parameter setting <em>on a
specific problem</em>. In other words, choose the algorithm you like, but be
aware that there exist a lot of other ones.</p>
<h3>Evolving Objects 1.0 is out (2007-01-07, nojhan)</h3>
<p>The EO framework has just reached <a href="http://sourceforge.net/project/showfiles.php?group_id=9775" hreflang="en">the
1.0 version</a>. This is one of the most interesting libraries for
metaheuristics.</p>
<p>It is a templatized C++ framework with a component-based architecture. EO
is focused on "evolutionary computing" (a synonym of metaheuristics, imho) and
can be used for any population-based metaheuristic. There exist versions for
local search, multi-objective optimization or parallel architectures... a
really good piece of software :-)</p>
<h3>Metaheuristics and machine-learning (2006-12-19, nojhan)</h3>
<p>Metaheuristics and machine-learning algorithms share a large number of
characteristics, like stochastic processes, manipulation of probability density
functions, etc.</p>
<p>One of the interesting evolutions of research on metaheuristics in recent
years is the increasing bridge-building with machine learning. I see at least
two interesting pathways: the use of metaheuristics in machine learning and the
use of machine learning in metaheuristics.</p>
<p>The first point is not really new: machine learning heavily uses
optimization, and it was natural to try stochastic algorithms where local
search or exact algorithms failed. Nevertheless, there is now enough
literature to organize special sessions at conferences. For 2007,
there will be a <em><a href="http://seal.tst.adfa.edu.au/~alar/gbml2007/" hreflang="en">special session on Genetics-Based Machine Learning</a></em> at
CEC, and a <em><a href="http://www.sigevo.org/gecco-2007/organizers-tracks.html" hreflang="en">track
on Genetics-Based Machine Learning and Learning Classifier Systems</a></em> at
GECCO. These events are centered around "genetic" algorithms (see the posts on
the IlliGAL blog: <a href="http://www-illigal.ge.uiuc.edu/system/components/com_jd-wp/wp-trackback.php?p=745" hreflang="en">1</a>, <a href="http://www-illigal.ge.uiuc.edu/system/components/com_jd-wp/wp-trackback.php?p=746" hreflang="en">2</a>), even though several papers use
other metaheuristics, like simulated annealing; but this is a common drawback,
and it does not affect the interest of the subject.</p>
<p>The second point is less exploited, but I find it of great interest. A
simple example of what can be done with machine learning inside a metaheuristic
is given by estimation of distribution algorithms. In these
metaheuristics, a probability density function is used to explicitly build a
new sample of the objective function (a "population", in the evolutionary
computation terminology) at each iteration. It is then crucial to build a
probability density function that is related to the structure of the objective
function (the "fitness landscape"). There, it would be really interesting to
build the model of the pdf itself from a selected sample, using a
machine-learning algorithm. There are some interesting papers talking about
that.</p>
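<p>A minimal sketch of that loop (my own illustrative code, not taken from a specific paper): the "learning" step fits a probability model to the selected sample, here an independent Gaussian per variable estimated by maximum likelihood, in the spirit of EMNA-style algorithms, and the model is then sampled to build the next population. Any density estimator (mixture model, kernel estimate...) could be swapped in at that step.</p>

```python
import random
import statistics

def learn_and_sample(selected, pop_size):
    """Learn a pdf model from the selected sample, then draw a new sample.

    The "machine learning" step here is a maximum-likelihood fit of an
    independent (diagonal) Gaussian per variable; any density estimator
    could replace it.
    """
    dim = len(selected[0])
    means = [statistics.fmean(p[i] for p in selected) for i in range(dim)]
    stdevs = [statistics.stdev([p[i] for p in selected]) + 1e-12 for i in range(dim)]
    return [[random.gauss(means[i], stdevs[i]) for i in range(dim)]
            for _ in range(pop_size)]

def eda(f, dim=2, pop_size=50, n_select=15, iterations=40):
    # Initial population drawn uniformly in [-5, 5]^dim.
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(iterations):
        pop.sort(key=f)  # evaluate and rank the sample
        pop = learn_and_sample(pop[:n_select], pop_size)
    return min(pop, key=f)

random.seed(7)
best = eda(lambda x: sum((xi - 1.0) ** 2 for xi in x))
```

<p>The richer the learned model, the better it can capture the structure of the fitness landscape; the diagonal Gaussian above ignores correlations between variables, which is exactly what more powerful machine-learning estimators would add.</p>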
<p>If you mix these approaches with the problem of estimating a Boltzmann
distribution (the basis of simulated annealing), you should have an awesome
research field...</p>
<h3>Metaheuristics and experimental research (2006-12-10, nojhan)</h3>
<p>Springer has just published a book on "<a href="http://www.springer.com/east/home/computer/foundations?SGWID=5-156-22-142872140-detailsPage=ppmmedia" hreflang="en">Experimental Research in Evolutionary Computation</a>", written
by <a href="http://ls11-www.cs.uni-dortmund.de/people/tom/" hreflang="en">Thomas Bartz-Beielstein</a>.</p>
<p>Thomas Bartz-Beielstein works on the statistical analysis of the
behaviour of metaheuristics (see <a href="http://ls11-www.cs.uni-dortmund.de/people/tom/#Talks_and_Tutorials" hreflang="en">his tutorials</a> at GECCO and CEC), and the publication of his book is a
really great thing. I haven't read it yet, but the table of contents seems
really promising. There is a true need for such work in the metaheuristics
community, and in stochastic optimization in general.</p>
<p>A friend told me that the lack of experimental culture in the computer
science community was a form of consensus, perhaps because theoretical
mathematics was seen as the "only way to make true science". This is a real problem
when you deal with stochastic algorithms applied to real-world problems. Despite
the fact that several early papers called for more rigorous experimental studies
of metaheuristics (<a href="http://mistic.heig-vd.ch/taillard/" hreflang="en">E.D. Taillard</a> wrote papers on this problem several years ago,
for example), the community does not seem to react quickly.</p>
<p>Yet, things are changing: after the series of CEC special sessions on
benchmarks for metaheuristics, there are more and more papers on how to test
stochastic optimization algorithms and report the results. I think this book
comes timely... the next step will be to promote the dissemination of the
result data (and code!), in an open format, along with the papers.</p>
<h3>What are metaheuristics? (2006-08-23, nojhan)</h3>
<p>Despite the title of this blog, the term <em>metaheuristic</em> is not
really well defined.</p>
<p>One of the first occurrences of the term can (of course) be found in a paper
by Fred Glover<sup>[<a href="http://metah.nojhan.net/post/2006/08/23/#pnote-188116-1" id="rev-pnote-188116-1" name="rev-pnote-188116-1">1</a>]</sup>: <em>Future Paths for Integer Programming and
Links to Artificial Intelligence</em><sup>[<a href="http://metah.nojhan.net/post/2006/08/23/#pnote-188116-2" id="rev-pnote-188116-2" name="rev-pnote-188116-2">2</a>]</sup>. In the section
concerning <em>tabu search</em>, he talks about a <em>meta-heuristic</em>:</p>
<blockquote>
<p>Tabu search may be viewed as a "meta-heuristic" superimposed on another
heuristic. The approach undertakes to transcend local optimality by a strategy
of forbidding (or, more broadly, penalizing) certain moves.</p>
</blockquote>
<p>In the AI field, a <em>heuristic</em> is a specific method that helps solve
a problem (from the Greek for <em>to find</em>), but how must we understand the
word <em>meta</em>? Well, in Greek, it means "after", "beyond" (like in
<em>metaphysics</em>) or "about" (like in <em>metadata</em>). Reading Glover,
<em>metaheuristics</em> seem to be <em>heuristics beyond heuristics</em>,
which seems to be a good old definition, but what is the definition nowadays?
The literature is really prolific on this subject, and the definitions are
numerous.</p>
<p>There are at least three tendencies:</p>
<ol>
<li>one that considers that the most important part of <em>metaheuristics</em> is
the gathering of several heuristics,</li>
<li>another that promotes the fact that <em>meta</em>heuristics are designed
as general methods that can tackle several problems without major changes
in their design,</li>
<li>a last one that uses the term only for evolutionary algorithms when they
are hybridized with local searches (methods that are called <em>memetic
algorithms</em> from the other points of view).</li>
</ol>
<p>The last tendency is quite minor in the general literature; it can mainly
be found in the field of evolutionary computation. Separating out the two other
tendencies is more difficult.</p>
<p>Here are some definitions gathered in more or less generalistic papers:</p>
<blockquote>
<p>"iterative generation process which guides a subordinate heuristic by
combining intelligently different concepts for exploring and exploiting the
search space" (Osman and Laporte, 1996<sup>[<a href="http://metah.nojhan.net/post/2006/08/23/#pnote-188116-3" id="rev-pnote-188116-3" name="rev-pnote-188116-3">3</a>]</sup>)</p>
</blockquote>
<blockquote>
<p>"(metaheuristics) combine basic heuristic methods in higher level frameworks
aimed at efficiently and effectively exploring a search space" (Blum and Roli,
2003<sup>[<a href="http://metah.nojhan.net/post/2006/08/23/#pnote-188116-4" id="rev-pnote-188116-4" name="rev-pnote-188116-4">4</a>]</sup>)</p>
</blockquote>
<blockquote>
<p>"a metaheuristic can be seen as a general-purpose heuristic method designed
to guide an underlying problem-specific heuristic (...) A metaheuristic is
therefore a general algorithmic framework which can be applied to different
optimization problems with relatively few modifications to make them adapted to a
specific problem." (Dorigo and Stützle, 2004<sup>[<a href="http://metah.nojhan.net/post/2006/08/23/#pnote-188116-5" id="rev-pnote-188116-5" name="rev-pnote-188116-5">5</a>]</sup>)</p>
</blockquote>
<blockquote>
<p>"(metaheuristics) apply to all kinds of problems (...) are, at least to some
extent, <em>stochastic</em> (...) direct, i.e. they do not resort to the
calculation of the gradients of the objective function (...) inspired by
<em>analogies</em>: with physics, biology or ethology" (Dréo, Siarry, Petrowski
and Taillard, 2006<sup>[<a href="http://metah.nojhan.net/post/2006/08/23/#pnote-188116-6" id="rev-pnote-188116-6" name="rev-pnote-188116-6">6</a>]</sup>)</p>
</blockquote>
<p>One can summarize by enumerating the expected characteristics:</p>
<ul>
<li>optimization algorithms,</li>
<li>with an iterative design,</li>
<li>combining low level heuristics,</li>
<li>aiming to tackle a large range of "hard" problems.</li>
</ul>
<p>As pointed out by the last reference, a large majority of
metaheuristics (well, not to say <em>all</em>) use at least one stochastic
(probabilistic) process and do not use more information than the solution and
the associated value(s) of the objective function.</p>
<p>Talking about <em>combining</em> heuristics seems appropriate for
<em>Ant Colony Optimization</em>, which specifically needs one (following
Dorigo's point of view), but it is less obvious for <em>Evolutionary
Algorithms</em>. One can consider that <em>mutation</em>, or even the method's
strategy itself, is a heuristic, but isn't it too general to be called a
<em>heuristic</em>?</p>
<p>If we forget the difficulty of demarcating what can be called a
<em>heuristic</em> and what the scope of the term <em>meta</em> is, one can
simply look at the use of the term among specialists. Despite the fact that the
definition could be used in several fields (data mining, machine learning, etc.),
the term is used for optimization algorithms. This is perhaps the best reason
among others: the term makes it possible to separate a research field from others, thus
adding a little bit of marketing...</p>
<p>I would thus use this definition:</p>
<blockquote>
<p>Metaheuristics are algorithms designed to tackle "hard" optimization
problems, with the help of iterative stochastic processes. These methods
manipulate direct samples of the objective function, and can be applied to
several problems without major changes in their design.</p>
</blockquote>
<div class="footnotes">
<h4>Notes</h4>
<p>[<a href="http://metah.nojhan.net/post/2006/08/23/#rev-pnote-188116-1" id="pnote-188116-1" name="pnote-188116-1">1</a>] A recurrent joke says that whatever your new idea is,
it has already been written down by Glover</p>
<p>[<a href="http://metah.nojhan.net/post/2006/08/23/#rev-pnote-188116-2" id="pnote-188116-2" name="pnote-188116-2">2</a>] Comput. & Ops. Res., Vol. 13, No. 5, pp. 533-549,
1986</p>
<p>[<a href="http://metah.nojhan.net/post/2006/08/23/#rev-pnote-188116-3" id="pnote-188116-3" name="pnote-188116-3">3</a>] <em>Metaheuristic: A bibliography</em>, Annals of
Operations Research, vol. 63, pp. 513-623, 1996</p>
<p>[<a href="http://metah.nojhan.net/post/2006/08/23/#rev-pnote-188116-4" id="pnote-188116-4" name="pnote-188116-4">4</a>] <em>Metaheuristics in combinatorial optimization:
Overview and conceptual comparison</em>, ACM Computing Surveys, vol. 35, issue
3, 2003</p>
<p>[<a href="http://metah.nojhan.net/post/2006/08/23/#rev-pnote-188116-5" id="pnote-188116-5" name="pnote-188116-5">5</a>] <em>Ant Colony Optimization</em>, MIT Press, 2004</p>
<p>[<a href="http://metah.nojhan.net/post/2006/08/23/#rev-pnote-188116-6" id="pnote-188116-6" name="pnote-188116-6">6</a>] <em>Metaheuristics for Hard Optimization</em>,
Springer, 2006</p>
</div>
<h3>About this blog (2006-08-01, nojhan)</h3>
<p>This blog is an attempt to publish thoughts about metaheuristics and to
share them with others. Indeed, blogs are fun, blogs are popular, ok... but
most of all, blogs can be very useful for researchers, who constantly need to
communicate and share ideas and information.</p>
<p>Metaheuristics are (well, that's one definition among others, but in my
opinion the best one) iterative (stochastic) algorithms for "hard"
optimization. Well known metaheuristics are the so-called "genetic algorithms"
(let's call them <em>evolutionary</em> ones), but they are not the only class:
don't forget simulated annealing, tabu search, ant colony algorithms, estimation
of distribution, etc.</p>
<p>This blog will try to focus on the <em>theory</em>, the <em>design</em>,
the <em>understanding</em>, the <em>application</em>, the
<em>implementation</em> and the <em>use</em> of metaheuristics. I hope this
blog will be useful to other people (researchers as well as users), and
will be a place to share thoughts.</p>
<p>Welcome aboard, and let's sleep with metaheuristics.</p>