Mentales habitudes


Thursday 16 December 2010

Evolving Objects C++ framework -- release 1.1.0

After several years without a release, the EO team is proud to announce that a new version of the "Evolving Objects" framework is available.

EO is a template-based C++ evolutionary computation library which helps you write your own stochastic optimization algorithms insanely fast.

Learn more about EO on the official website:

You will find the release 1.1.0 at the following address:

Here is a summary of the change log:

  • massive documentation cleaning, now far more browsable!
  • provide cmake build system, remove the old autotools one
  • GCC 4.3 compatibility
  • new versatile log system with several nested verbose levels
  • an evaluator that throws an exception once a maximum number of evaluations has been reached, independently of the number of generations
  • a new monitor that can write to any ostream
  • a new continuator that can catch POSIX user signals
  • a dual fitness class to handle the feasibility of individuals, with the guarantee that a feasible fitness will always compare better than an unfeasible one
  • more statistics: feasible fitness ratio, fitness interquartile range, average size of individuals
  • new website, switch from CVS to GIT, and a cool logo!
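To give an idea of what the dual fitness class provides, here is a small Python sketch of the concept (this is an illustration only, not EO's actual C++ API; the `DualFitness` name is invented):

```python
from functools import total_ordering

@total_ordering
class DualFitness:
    """A fitness value paired with a feasibility flag: any feasible
    fitness compares better (greater) than any unfeasible one."""
    def __init__(self, value, feasible):
        self.value = value
        self.feasible = feasible

    def __eq__(self, other):
        return (self.feasible, self.value) == (other.feasible, other.value)

    def __lt__(self, other):
        # tuples compare feasibility first: True sorts above False,
        # so a feasible individual always wins against an unfeasible one
        return (self.feasible, self.value) < (other.feasible, other.value)

# a feasible individual beats an unfeasible one,
# even with a much worse raw fitness value
assert DualFitness(0.1, True) > DualFitness(99.0, False)
```

Ranking a population by such a fitness automatically pushes feasible individuals to the top, which is the guarantee mentioned in the change log.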

See a more detailed change log at:

Do not hesitate to report any bugs you encounter:

I chose EO a few years ago to code metaheuristics, after having tested (and even coded) several other frameworks, and I have not regretted this choice. For a few months now, I have been the maintainer of EO, and its main developer, along with my intern, Caner Candan, and we have great plans for improving EO!

Happy evolutionary hacking :)

Monday 23 June 2008

An Estimation of Distribution Algorithm in a few lines of Python

In an interesting post on which I have already commented, Julian Togelius said: "I know roughly what an EDA does, but I couldn't sit down and implement one on the spot [...]"

I personally think that estimation of distribution algorithms (EDA) are among the most elegant and easy-to-use metaheuristics. Obviously, this highly depends on one's mindset :-) Anyway, as a piece of code is worth more than a long speech, I've written a very simple and small EDA in Python to illustrate my point.

This is a simple continuous EDA using a (also simple) normal probability density function to optimize a (once more, simple) function of two variables. As you can see, the code is (guess what?) simple: only a few lines with some numpy functions, and that's it.

# numpy provides all the functions used below
from numpy import random, argsort, floor, mean, std, zeros

# The problem to optimize
def x2y2( x ):
        return x[0]**2 + x[1]**2

class eda:
        def __init__(self, of):
                # Algorithm parameters
                self.iterations = 100
                self.sample_size = 100
                self.select_ratio = 0.5
                self.epsilon = 10e-6

                # class members
                self.objective_function = of
                self.dimensions = 2
                self.sample = []
                self.means = []
                self.stdevs = []        
                self.debug = False

        def run(self):
                # uniform initialization
                self.sample = random.rand( self.sample_size, self.dimensions+1 )
                # cosmetic
                self.sample = self.sample * 200 - 100

                # main loop: evaluate, select, estimate, resample
                i = 0
                while i < self.iterations:
                        i += 1
                        self.evaluate()
                        self.sample_sort()
                        self.dispersion_reduction()
                        self.estimate_parameters()
                        self.draw_sample()

                # evaluate and sort the final sample
                self.evaluate()
                self.sample_sort()

                # output the optimum
                print( "#[ x y f(x,y) ]" )
                print( self.sample[0] )

        def sample_sort(self):
                # sort rows on the last column
                self.sample = self.sample[ argsort( self.sample[:,-1], 0 ) ]

        def dispersion_reduction(self):

                # number of points to select
                nb = int( floor( self.sample_size * self.select_ratio ) )

                # selection
                self.sample = self.sample[:nb]

        def estimate_parameters( self ):
                # points sub array (without values)
                mat = self.sample[:,:self.dimensions]
                # row means (axis 0 in scipy)
                self.means = mean( mat, 0 )
                # row standard deviation
                self.stdevs = std( mat, 0 )

        def draw_sample(self):
                # for each variable to optimize
                for i in range(self.dimensions):
                        # if the dispersion is null
                        if self.stdevs[i] == 0.0:
                                # set it to a minimal value
                                self.stdevs[i] = self.epsilon
                # empty sample
                self.sample = zeros( (self.sample_size, self.dimensions+1) )
                # for each point
                for i in range( self.sample_size ):
                        # draw in random normal
                        p = random.normal( self.means, self.stdevs )
                        # put it into the sample
                        self.sample[i][:self.dimensions] = p

        def evaluate(self):
                # for each point
                for i in range( self.sample_size ):
                        d = self.dimensions
                        # call the objective function
                        #   the third element is the result of the objective function call
                        #   taking the first two elements as variables
                        self.sample[i][-1]  = self.objective_function( self.sample[i][:d] )

if __name__=="__main__":
        a = eda( x2y2 )
        a.run()

See also the standalone file with a debug mode, to see how it works in detail.
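For comparison, the same evaluate/select/estimate/resample loop can be condensed even further with vectorized numpy (a sketch with invented parameter defaults, using the modern `numpy.random.Generator` API):

```python
import numpy as np

def eda(f, dim=2, pop=100, keep=50, iters=100, eps=1e-6, rng=None):
    """Minimal continuous EDA with a per-coordinate normal model."""
    if rng is None:
        rng = np.random.default_rng(0)
    # uniform initialization in [-100, 100]
    x = rng.uniform(-100, 100, (pop, dim))
    for _ in range(iters):
        fit = np.apply_along_axis(f, 1, x)            # evaluate
        x = x[np.argsort(fit)][:keep]                 # truncation selection
        mu = x.mean(axis=0)                           # estimate the mean
        sigma = np.maximum(x.std(axis=0), eps)        # avoid a null dispersion
        x = rng.normal(mu, sigma, (pop, dim))         # draw the new sample
    fit = np.apply_along_axis(f, 1, x)
    return x[np.argmin(fit)]

best = eda(lambda p: p[0]**2 + p[1]**2)
print(best)  # should be close to (0, 0)
```

The logic is exactly the one of the class above: only the bookkeeping (sorting rows, clamping the standard deviation) is delegated to numpy's array operations.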

Sunday 7 January 2007

Evolving Objects 1.0 is out

The EO framework has just reached version 1.0. This is one of the most interesting libraries for metaheuristics.

It is a templatized C++ framework with a component-based architecture. EO is focused on "evolutionary computing" (a synonym of metaheuristics, imho) and can be used for any population-based metaheuristic. Versions exist for local search, multi-objective optimization, and parallel architectures... a really good piece of software :-)

Saturday 28 October 2006

Frameworks for metaheuristics

Note that the descriptions are taken from the projects' websites.

As one can see, most of these packages are designed for evolutionary algorithms, but I recommend trying out some of the generic frameworks, because "genetic" algorithms are not always the best choice for solving an optimization problem, despite their popularity.

Here are the frameworks I would recommend. These frameworks are free software: you can use them, look at the code, modify it, and redistribute it (precious qualities for frameworks).

I would also recommend C or C++, which permit implementing fast programs while using object-oriented programming. C++ compilers are also available for a large choice of platforms (with a special distinction for GCC, which is free software). A fast program is crucial for testing algorithms on real problems, and using a well-known language is a good idea.

The main idea behind the design of each framework is specified with one of the following keywords:

  • template: designing a new algorithm consists in extending a base class. This is perhaps the simplest object model to understand, but it can make reusing existing code difficult.
  • component: designing a new algorithm consists in selecting its components from the available operators. This makes algorithms easy to implement, but the underlying model can be quite difficult to understand, and adding a new algorithmic paradigm can be hard (generally used for evolutionary algorithms).
  • function: designing a new algorithm consists in using the framework's primitives. Simple to understand, but one must learn the primitives and code a lot by hand.
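To make these three styles concrete, here is a toy Python sketch (all class, function, and operator names are invented for illustration; real frameworks are of course far richer):

```python
# template style: a new algorithm extends a base class and overrides a hook.
class Algorithm:
    def run(self, population, iterations=3):
        for _ in range(iterations):
            population = self.step(population)
        return population
    def step(self, population):          # the hook to override
        raise NotImplementedError

class KeepBest(Algorithm):
    def step(self, population):
        # toy "improvement" step: keep the best half
        return sorted(population)[: max(1, len(population) // 2)]

# component style: an algorithm is assembled from interchangeable operators.
def make_algorithm(select, vary):
    def step(population):
        return vary(select(population))
    return step

halve = lambda pop: sorted(pop)[: max(1, len(pop) // 2)]
jitter = lambda pop: [x + 0 for x in pop]   # identity "variation" operator
component_step = make_algorithm(halve, jitter)

# function style: the user calls the framework's primitives directly.
def minimize(f, candidates):
    return min(candidates, key=f)

print(KeepBest().run([5, 3, 8, 1]))    # template style  → [1]
print(component_step([5, 3, 8, 1]))    # component style → [1, 3]
print(minimize(abs, [-2, 4, 1]))       # function style  → 1
```

The trade-off is visible even at this scale: the template style hides the control flow, the component style exposes the operators, and the function style leaves everything in the user's hands.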


Here is my list:

  • Open Metaheuristics: a library aimed at the design of metaheuristics (i.e. genetic/evolutionary algorithms, tabu search, simulated annealing, ant colony algorithms, etc.). One of the main goals of oMetah is to permit rigorous empirical tests of metaheuristics, through a statistical approach. (C++, LGPL, template)
  • *EO: a set of paradigm-free Evolutionary Computation libraries dedicated to the flexible design of EAs through evolving objects superseding the most common dialects (Genetic Algorithms, Evolution Strategies, Evolutionary Programming and Genetic Programming). (C++, GPL, component)
  • Sferes: a framework that gathers an evolution framework with a simulation framework. It is dedicated to experiments involving the design of artificial agents through artificial evolutionary algorithms, like Genetic Algorithms or Evolution Strategies. (C++, CeCILL, component)
  • openBEAGLE: Evolutionary Computation (EC) framework. (C++, LGPL, component).
  • PISA: Platform and Programming Language Independent Interface for Search Algorithms. PISA is mainly dedicated to multi-objective search, where the optimization problem is characterized by a set of conflicting goals and not just one criterion that needs to be optimized. (C, BSD, function)
  • GAlib: defines classes for using genetic algorithms to do optimization in any C++ program using any representation and genetic operators. The distribution includes extensive documentation and many examples. (C++, MIT, component)
  • MOMHLib++: a library of C++ classes that implements a number of multiple objective metaheuristics. It has been developed on the basis of former implementations of Pareto simulated annealing (PSA) and multiple objective genetic local search (MOGLS). (C++, GPL, template).
  • ECJ Java Evolutionary Computation Research System. (Java, AFL, component)


The following frameworks are not the ones I would recommend first, but they have some properties that could be interesting:

If you want to find even more frameworks, try searching for "genetic" on SourceForge.

Many of these projects are dead, so be careful if you need a maintained framework. Take a close look at the project activity: generally, the number of developers and file releases gives you a good idea of the framework's vitality. Also check whether it is a student's project for a training course or whether it is made by professional researchers.