Neuroevolution test

In the process of developing/discovering a new approach to evolutionary AI, I have to explore the current approaches to build the background knowledge I need to keep maturing my ideas… so… this time I created a classic and simple implementation of neuroevolution to make a car learn how to traverse a track…

next stop… NEAT….

Genetic Algorithms

‘Be advised: risk of really bad English’

I will do my best so that, by the end of this post, we have a basic understanding of how to create an algorithm that can evolve its agents using a simple model based on natural selection. Sounds cool?? I hope so….

Let’s get to it!!

 

The survival of the fittest

Our model is going to use a few simple concepts; the first one is the concept of fitness.

Fitness is simply the variable that represents how good your agent is, and of course, the characteristic represented by this variable will vary depending on your problem.


If you are evolving some digital giraffes, the fitness may represent the length of the neck, because if the neck is too short they cannot feed, soooo… longer neck equals better agent, which equals higher fitness.
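To make that concrete with the car example from the intro, the fitness can be as simple as how far along the track the car got. A minimal sketch in Python (the Car object and its distance_along_track attribute are hypothetical, just to illustrate):

    # hypothetical fitness for the car example from the intro:
    # the further the car gets along the track before crashing, the fitter it is
    def evaluate_fitness(car):
        return car.distance_along_track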

 

Natural selection

To evolve a population, we need to select some agents that will reproduce and create the next generation. This concept is about how we select the best candidates, the ones that will improve the overall fitness of the population. There are a lot of different ways of doing this; here I will cover one of the simplest and most used techniques, called roulette-wheel selection.

The basic idea is to pick a random agent from the population, but the chance of picking a certain agent is proportional to its fitness, so agents with higher fitness have a correspondingly higher chance of being selected.

Here is how it can be done:

  • total = Sum of all the fitness
  • slice = Random value between 0 and total
  • fitnessSoFar = 0
  • loop from i = 0 to population size
    • fitnessSoFar = fitnessSoFar + population[ i ].fitness
    • if fitnessSoFar >= slice
      • theChosenOne = population[ i ]
      • exit the loop
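A minimal sketch of this in Python (it assumes each agent in the population exposes a numeric fitness attribute; adapt the names to your own structures):

    import random

    def roulette_wheel_selection(population):
        # population is assumed to be a list of agents with a numeric `fitness` attribute
        total = sum(agent.fitness for agent in population)
        slice_value = random.uniform(0, total)
        fitness_so_far = 0.0
        for agent in population:
            fitness_so_far += agent.fitness
            if fitness_so_far >= slice_value:
                return agent
        return population[-1]  # fallback for floating-point rounding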

To avoid losing good agents, we also use something called elitism, but I will talk about that later on.

 

Gimme sum good genes

You probably heard a story about dad giving a seed to mum and, 9 months later… well… you know… Now we are going to see how that seed carries a bunch of characteristics to the next generation.

Mum and dad carry a thing called a chromosome ( for our model we will consider only one chromosome per agent ), which is a string of information ( genes ) about them; in our case it can be a string of floats, doubles, bits and so on. When a baby is made, the chromosomes of both parents are combined in a process called crossover.

The first thing we need to do is select a random position to serve as a pivot, then we iterate over the chromosomes to create the new ones, taking from mum’s genes before the pivot and from dad’s genes after the pivot, something like this:

[Image: the crossover operation, with the chromosomes represented as arrays of colored squares]

 

The algorithm is quite simple:

  • Select two agents from the population using the roulette-wheel selection
  • Select a random pivot from 0 to chromosome size
  • loop from i=0  to pivot
    • baby1 [ i ] = mum[ i ]
    • baby2 [ i ] = dad[ i ]
  • loop from i=pivot to chromosome size
    • baby1 [ i ] = dad[ i ]
    • baby2 [ i ] = mum[ i ]
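In Python, a single-point crossover like the one described above could look roughly like this (equal-length gene lists are assumed):

    import random

    def crossover(mum, dad):
        # mum and dad are assumed to be equal-length lists of genes ( floats, bits, ... )
        pivot = random.randint(0, len(mum))  # random pivot from 0 to chromosome size
        baby1 = mum[:pivot] + dad[pivot:]
        baby2 = dad[:pivot] + mum[pivot:]
        return baby1, baby2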

Be aware that there are some variations of the crossover technique that may or may not perform better for your particular problem.

 

Mutation

Now that we know how to select agents and how to mix those precious genes, the only concept left for now is mutation. Mutation allows us to insert subtle changes into the agent’s chromosome that, in the long run, may improve the overall fitness.

It is as simple as this:

  • loop from i = 0 to chromosome size
    • if we are going to mutate this gene ( small chance )
      • mutation = random value between minimum and maximum perturbation
      • chromosome [ i ] = chromosome [ i ] + mutation
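A small sketch of that in Python (the mutation chance and the perturbation range here are just example values to tune for your problem):

    import random

    def mutate(chromosome, mutation_chance=0.1, max_perturbation=0.3):
        # roll the dice for every gene; most genes stay untouched
        for i in range(len(chromosome)):
            if random.random() < mutation_chance:
                # nudge the gene by a small random amount
                chromosome[i] += random.uniform(-max_perturbation, max_perturbation)
        return chromosome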

 

Putting it all together

Let’s see how everything works together… but first, let’s talk about elitism.

You probably remember how we mix the genes at every generation to make, hopefully, better chromosomes, but in this process there is a considerable risk of mixing away really good genes and ending up losing a great solution to our problem. To mitigate that, we use the so-called elitism, which simply takes a percentage of the best agents of the population and puts them directly into the new generation, without mixing or mutating anything, preserving those good chromosomes.

Enough talk… let’s see how it works:

  1. First we create an initial population with random chromosomes
  2. Loop for x amount of generations
    1. Sort them by fitness so we can easily pick the best ones for the elitism
    2. Sum all the fitness and store it
    3. Create a new list or array to store the new population
    4. Using elitism, we pick a percentage of the population (let’s say 20%) with the best fitness and put them into the new population
    5. While the size of the new population is smaller than the size of the old population,
      1. select mum with the roulette-wheel selection
      2. select dad with the roulette-wheel selection
      3. If it falls into crossover chance ( something around 70% )
        1. create baby1 and baby2 through crossover
      4. Else
        1. baby1 = mum
        2. baby2 = dad
      5. If it falls into mutation chance ( something between 5% and 30% ) for baby1
        1. mutate baby1
      6. If it falls into mutation chance ( something between 5% and 30% ) for baby2
        1. mutate baby2
      7. Put the babies into the new population
    6. increase generation counter
    7. current population = new population
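Here is a rough sketch of that loop in Python, reusing the roulette_wheel_selection, crossover and mutate functions sketched above (the Chromosome class and the rate values are placeholders to adapt to your own setup):

    import random

    # assumes the roulette_wheel_selection, crossover and mutate sketches from above,
    # plus a simple Chromosome class holding `genes` (a list) and `fitness` (a number)

    def evolve(population, elitism_rate=0.2, crossover_chance=0.7, mutation_chance=0.1):
        # 1. sort by fitness so the best agents come first (needed for elitism)
        population.sort(key=lambda c: c.fitness, reverse=True)

        # 2. elitism: copy the top slice straight into the new generation
        elite_count = int(len(population) * elitism_rate)
        new_population = population[:elite_count]

        # 3. fill the rest of the new generation with babies
        while len(new_population) < len(population):
            mum = roulette_wheel_selection(population)
            dad = roulette_wheel_selection(population)

            if random.random() < crossover_chance:
                baby1_genes, baby2_genes = crossover(mum.genes, dad.genes)
            else:
                baby1_genes, baby2_genes = list(mum.genes), list(dad.genes)

            if random.random() < mutation_chance:
                baby1_genes = mutate(baby1_genes)
            if random.random() < mutation_chance:
                baby2_genes = mutate(baby2_genes)

            # the babies' fitness is evaluated later, when they are tested in the environment
            new_population.append(Chromosome(genes=baby1_genes))
            new_population.append(Chromosome(genes=baby2_genes))

        return new_population[:len(population)]

A typical run then just calls evolve for a fixed number of generations, re-evaluating the fitness of every chromosome in between (step 2 of the list above).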

 

 

Alright…. now that’s the basic stuff, I suggest you create:

  • Chromosome class
    • Properties:
      • fitness (usually a double)
      • a list or array of genes. Genes can be of any type you need ( usually double )
  • Genetic algorithm class
    • Properties:
      • list or array of chromosomes ( population )
      • sum of all fitness ( usually a double )
      • chance of mutation ( float ), best between 0.05 and 0.3
      • crossover chance ( float ), 0.7 is a good choice
      • generation counter (int)
      • max mutation or perturbation ( float )
      • elitism rate (float)
    • Methods
      • roulette-wheel selection ( )
      • crossover ( mum, dad, baby1, baby2 )
      • mutate ( chromosome )
      • evolve()

 

Genetic algorithms are really fun to use and can solve a whole bunch of problems, from function optimization to neural network training…. there is a lot you can do… experiment with it and let me know if this was helpful to you in any way…. I’m kind of new to this, and any feedback will help me produce better content….

Thank you…

Now get out of here and go program some stuff….

Graduation is coming!!

Before we continue, be advised: my English really sucks sometimes…

I’m graduating at the end of this year, and I am starting to develop my graduation project. The idea is to create a machine learning algorithm that uses a neural network (neuroevolution) to generate a behavior tree that can be plugged into an NPC in a game. This may sound quite confusing, but putting it in simple words, I want to create emergent behavior in NPCs by having them experiment with the space they are in.

My next couple of posts will be about this project…. and since it is a bit big, and also far from done, I will be posting about some of the subjects that are used in the project… and hopefully, we can glue everything together by the end of it.

I’m not intending to create tutorials, but high-level explanations of those subjects in the most direct and clear way that I possibly can, so you can implement them in any language you may desire.

If you really want a tutorial, with code and all that stuff, please let me know…