Tag: Genetic Algorithms

Genetic Algorithms, News, Science

Health and Genetic Algorithms

From R&D Mag – Developing a potential life-saving mathematical tool:

Math and medicine are coming together to help people who have suffered an abdominal aortic aneurysm, which, with 15,000 deaths, is the 13th-leading cause of death in the United States.

At the heart of the effort are genetic algorithms written by Oak Ridge National Laboratory researchers that allow physicians to more efficiently assess and organize the often vast amounts of information contained in patient reports. Ultimately, with this tool—a sophisticated way to quickly extract key phrases—doctors will be able to characterize features and findings in reports and provide better patient care.

(…)

This work builds on previous studies involving genetic algorithms developed for mammography. That system allows doctors to quickly identify trends specific to an individual patient and match images and text to a database of known cancerous and pre-cancerous conditions.

Read the full article here.


Article, Genetic Algorithms, genetic programming, News, Science

Darwin on the track

From The Economist article:

While watching the finale of the Formula One grand-prix season on television last weekend, your correspondent could not help thinking how Darwinian motor racing has become. Each year, the FIA, the international motor sport’s governing body, sets new design rules in a bid to slow the cars down, so as to increase the amount of overtaking during a race—and thereby make the event more interesting to spectators and television viewers alike. The aim, of course, is to keep the admission and television fees rolling in. Over the course of a season, Formula One racing attracts a bigger audience around the world than any other sport.

Read the full article here.

Genetic Algorithms, News, Science

Meanwhile, at the Hall of Justice!

UPDATE 05/10: there is an article on Physorg too.

We sometimes see new applications for EC, but this one I was not expecting. From Eurekalert:

WASHINGTON, Oct. 5 — Criminals are having a harder time hiding their faces, thanks to new software that helps witnesses recreate and recognize suspects using principles borrowed from the fields of optics and genetics.

(…)

His software generates its own faces that progressively evolve to match the witness’ memories. The witness starts with a general description such as “I remember a young white male with dark hair.” Nine different computer-generated faces that roughly fit the description are generated, and the witness identifies the best and worst matches. The software uses the best fit as a template to automatically generate nine new faces with slightly tweaked features, based on what it learned from the rejected faces.

“Over a number of generations, the computer can learn what face you’re looking for,” says Solomon.
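
In essence this is an interactive genetic algorithm where the witness plays the role of the fitness function. Just to illustrate the select-and-tweak loop described above, here is a rough, hypothetical sketch of my own (not the actual software), using a plain vector of facial parameters in place of real face images and two stand-in functions for the witness’s choices:

import random

NUM_FACES  = 9     # faces shown to the witness per generation
NUM_PARAMS = 20    # hypothetical facial parameters per face

def random_face():
   # In this toy sketch a "face" is just a vector of parameters
   return [random.uniform(-1.0, 1.0) for _ in range(NUM_PARAMS)]

def mutate(face, step=0.15):
   # Slightly tweak every parameter of the parent face
   return [p + random.gauss(0.0, step) for p in face]

def next_generation(best, worst):
   # Keep the best face and derive new variations from it, nudging
   # each parameter a little further away from the rejected face
   children = [best]
   for _ in range(NUM_FACES - 1):
      child = mutate(best)
      child = [c + 0.1 * (c - w) for c, w in zip(child, worst)]
      children.append(child)
   return children

def evolve(pick_best, pick_worst, generations=10):
   faces = [random_face() for _ in range(NUM_FACES)]
   for _ in range(generations):
      best  = faces[pick_best(faces)]
      worst = faces[pick_worst(faces)]
      faces = next_generation(best, worst)
   return faces

if __name__ == "__main__":
   # Stand-in for the witness: pretend the remembered face is all zeros
   target = [0.0] * NUM_PARAMS
   def dist(face):
      return sum((p - t) ** 2 for p, t in zip(face, target))
   final = evolve(pick_best=lambda fs: min(range(NUM_FACES), key=lambda i: dist(fs[i])),
                  pick_worst=lambda fs: max(range(NUM_FACES), key=lambda i: dist(fs[i])))
   print("Distance of the best evolved face: %.4f" % min(dist(f) for f in final))

In the real system a human looks at rendered faces and does the picking; the two lambdas above only simulate that choice so the sketch can run on its own.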

Read the full article here.

Genetic Algorithms, Pyevolve, Python

n-queens problem using Pyevolve

Last night I read a post on Reddit by Matthew Rollings showing Python code to solve the Eight Queens puzzle using an EA. So I decided to implement it again in Python, this time using Pyevolve; here is the code:

from pyevolve import *
from random import shuffle

BOARD_SIZE = 64

def queens_eval(genome):
   # The genome is a permutation: the index is the column and the
   # value is the row of the queen placed in that column
   collisions = 0
   # Genomes that are not valid permutations score zero
   for i in xrange(0, BOARD_SIZE):
      if i not in genome: return 0
   # Count the queens that share a diagonal with another queen
   for i in xrange(0, BOARD_SIZE):
      col = False
      for j in xrange(0, BOARD_SIZE):
         if (i != j) and (abs(i-j) == abs(genome[j]-genome[i])):
            col = True
      if col == True: collisions +=1
   return BOARD_SIZE-collisions

def queens_init(genome, **args):
   # Start from a random permutation of the rows
   genome.genomeList = range(0, BOARD_SIZE)
   shuffle(genome.genomeList)

def run_main():
   genome = G1DList.G1DList(BOARD_SIZE)
   genome.setParams(bestrawscore=BOARD_SIZE, rounddecimal=2)
   genome.initializator.set(queens_init)
   genome.mutator.set(Mutators.G1DListMutatorSwap)
   genome.crossover.set(Crossovers.G1DListCrossoverCutCrossfill)
   genome.evaluator.set(queens_eval)

   ga = GSimpleGA.GSimpleGA(genome)
   ga.terminationCriteria.set(GSimpleGA.RawScoreCriteria)
   ga.setMinimax(Consts.minimaxType["maximize"])

   ga.setPopulationSize(100)
   ga.setGenerations(5000)
   ga.setMutationRate(0.02)
   ga.setCrossoverRate(1.0)

   # This DBAdapter is to create graphs later, it'll store statistics in
   # a SQLite db file
   sqlite_adapter = DBAdapters.DBSQLite(identify="queens")
   ga.setDBAdapter(sqlite_adapter)

   ga.evolve(freq_stats=10)

   best = ga.bestIndividual()
   print best
   print "\nBest individual score: %.2f\n" % (best.score,)

if __name__ == "__main__":
   run_main()

It took 49 generations to solve the puzzle on a 64×64 board (4,096 squares); here is the output:

Gen. 0 (0.00%): Max/Min/Avg Fitness(Raw) [20.83(27.00)/13.63(7.00)/17.36(17.36)]
Gen. 10 (0.20%): Max/Min/Avg Fitness(Raw) [55.10(50.00)/39.35(43.00)/45.92(45.92)]
Gen. 20 (0.40%): Max/Min/Avg Fitness(Raw) [52.51(55.00)/28.37(24.00)/43.76(43.76)]
Gen. 30 (0.60%): Max/Min/Avg Fitness(Raw) [67.45(62.00)/51.92(54.00)/56.21(56.21)]
Gen. 40 (0.80%): Max/Min/Avg Fitness(Raw) [65.50(62.00)/19.89(31.00)/54.58(54.58)]

        Evolution stopped by Termination Criteria function !

Gen. 49 (0.98%): Max/Min/Avg Fitness(Raw) [69.67(64.00)/54.03(56.00)/58.06(58.06)]
Total time elapsed: 39.141 seconds.

And here are the plots generated by the Graph Plot Tool of Pyevolve:

[Plots generated by the Pyevolve Graph Plot Tool: fig1, fig3, fig5, fig8]

genetic programming, Pyevolve, Python

Genetic Programming meets Python

I’m proud to announce that the new versions of Pyevolve will have Genetic Programming support; after some time fighting with these evil syntax trees, I think I have a very easy and flexible implementation of GP in Python. I was tired of seeing people give up while trying to learn how to implement a simple GP using the hermetic C/C++ and Java libraries (unfortunately, I’m a Java web developer myself hehe).

The implementation is still undergoing tests and optimization, but it’s working nicely. Here are some details about it:

The implementation is done in pure Python, so we keep all the benefits that come with that, though unfortunately we lose some performance.

The GP core is very flexible because it compiles the GP trees into Python bytecode to speed up the execution of the evolved function. So you can even use Python objects as terminals, or any possible Python expression. Any Python function can be used too, and you can use all the power of Python to create those functions, which will be automatically detected by the framework using a name prefix =)

As you can see in the source code, you don’t need to bind variables when calling the syntax tree of the individual; you simply use the “getCompiledCode” method, which returns the compiled Python code ready to be executed.

Here is a source-code example:

from pyevolve import *
import math

error_accum = Util.ErrorAccumulator()

# These are the functions used by the GP core;
# Pyevolve will automatically detect them
# and their number of arguments
def gp_add(a, b): return a+b
def gp_sub(a, b): return a-b
def gp_mul(a, b): return a*b
def gp_sqrt(a):   return math.sqrt(abs(a))

def eval_func(chromosome):
   global error_accum
   error_accum.reset()
   code_comp = chromosome.getCompiledCode()

   for a in xrange(0, 5):
      for b in xrange(0, 5):
         # The eval will execute a pre-compiled syntax tree
         # as a Python expression, and will automatically use
         # the "a" and "b" variables (the terminals defined)
         evaluated     = eval(code_comp)
         target        = math.sqrt((a*a)+(b*b))
         error_accum += (target, evaluated)
   return error_accum.getRMSE()

def main_run():
   genome = GTree.GTreeGP()
   genome.setParams(max_depth=5, method="ramped")
   genome.evaluator.set(eval_func)

   ga = GSimpleGA.GSimpleGA(genome)
   # This method will catch and use every function that
   # begins with "gp", but you can also add them manually.
   # The terminals are Python variables, you can use the
   # ephemeral random consts too, using ephemeral:random.randint(0,2)
   # for example.
   ga.setParams(gp_terminals       = ['a', 'b'],
                gp_function_prefix = "gp")
   # You can even use a function call as terminal, like "func()"
   # and Pyevolve will use the result of the call as terminal
   ga.setMinimax(Consts.minimaxType["minimize"])
   ga.setGenerations(1000)
   ga.setMutationRate(0.08)
   ga.setCrossoverRate(1.0)
   ga.setPopulationSize(2000)
   ga.evolve(freq_stats=5)

   print ga.bestIndividual()

if __name__ == "__main__":
   main_run()

I’m very happy with this GP implementation in Python and am still exploring its possibilities.

And of course, everything in Pyevolve can be visualized any time you want (click to enlarge):

[GP tree visualizations: ramped_small, ramped_big]

The visualization is very flexible too: if you use Python decorators to set how functions will be graphically represented, you can create many interesting visualization patterns. If I change the function “gp_add” to:

@GTree.gpdec(representation="+", color="red")
def gp_add(a, b): return a+b

We’ll get the following visualization (click to enlarge):

[GP tree visualization: full]

I hope you enjoyed it. I’m currently fixing some bugs, implementing new features and docs, and preparing the next release of Pyevolve, which will still take some time =)

Genetic Algorithms, News, Science

Digital Archeology Reveals Dinosaur Details using Genetic Algorithms

From the LiveScience.com article:

The pick and shovel can go only so far in digging up details about dinosaurs. Now supercomputers are revealing knowledge about their anatomy otherwise lost to history.

(…)

For example, if the muscles connected to the thigh bone of a Tyrannosaurus rex were short, that would suggest it was angled vertically as in humans. However, if they were very long, it could have been angled horizontally as in birds.


Initial attempts to randomly decipher which pattern of muscle activation works best almost always result in the animal falling on its face, explained computer paleontologist Peter Falkingham at the University of Manchester. But the scientists employ “genetic algorithms,” or computer programs that can alter themselves and evolve, and so run pattern after pattern until they get improvements.

Eventually, they evolve a pattern of muscle activation with a stable gait and the dinosaur can walk, run, chase or graze, Falkingham said. Assuming natural selection evolves the best possible solution as well, the modeled animal should move similar to its now extinct counterpart. Indeed, they have achieved similar top speeds and gaits with computer versions of humans, emus and ostriches as in reality.
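
The evolutionary part is a plain GA over muscle-activation parameters, with the physics simulation acting as the fitness function. Here is a toy sketch of my own of that loop (the walking_distance function below is just a made-up stand-in for their musculoskeletal simulation, so the example can run on its own):

import random

POP_SIZE    = 30    # candidate activation patterns per generation
PATTERN_LEN = 12    # hypothetical activation values per pattern
GENERATIONS = 50

def walking_distance(pattern):
   # Stand-in for the physics simulation: the real work would run the
   # musculoskeletal model and measure how far the animal walks before
   # falling. Here we just reward closeness to an arbitrary "stable"
   # target so the example runs on its own.
   target = [0.5] * PATTERN_LEN
   return -sum((p - t) ** 2 for p, t in zip(pattern, target))

def random_pattern():
   return [random.random() for _ in range(PATTERN_LEN)]

def mutate(pattern, rate=0.1, step=0.1):
   return [p + random.gauss(0.0, step) if random.random() < rate else p
           for p in pattern]

def crossover(mom, dad):
   # Single-point crossover between two activation patterns
   cut = random.randint(1, PATTERN_LEN - 1)
   return mom[:cut] + dad[cut:]

def evolve():
   pop = [random_pattern() for _ in range(POP_SIZE)]
   for _ in range(GENERATIONS):
      # Keep the best walkers and breed the rest from them
      pop.sort(key=walking_distance, reverse=True)
      parents  = pop[:POP_SIZE // 2]
      children = [mutate(crossover(random.choice(parents),
                                   random.choice(parents)))
                  for _ in range(POP_SIZE - len(parents))]
      pop = parents + children
   return max(pop, key=walking_distance)

if __name__ == "__main__":
   best = evolve()
   print("Best simulated walking score: %.4f" % walking_distance(best))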

Read the full article.
