Understanding Genetic Algorithms

With each successive generation, we retain only the highest performers and discard the weaker chromosomes.

By carrying over elites unchanged, we guarantee that the peak fitness of each generation can only stay the same or increase; it never decreases.
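As a minimal sketch, elitism can be as simple as copying the best rows of the population forward untouched. The bit-array chromosome representation, the fitness function (fraction of genes matching the target) and the helper name select_elites are illustrative assumptions rather than the article's exact code; the 20% elite rate matches the hyper-parameters quoted later.

```python
import numpy as np

def select_elites(population, target, elite_rate=0.2):
    """Copy the top-scoring chromosomes unchanged into the next generation."""
    fitness = (population == target).mean(axis=1)          # fraction of matching genes
    n_elites = max(1, int(elite_rate * len(population)))   # e.g. 2 elites out of 10
    best = np.argsort(fitness)[-n_elites:]                 # indices of the fittest rows
    return population[best].copy()
```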

After 155 generations, a chromosome reaches the solution with 100% fitness!

Fitness of 2 Elitism, 2 Splice Pair and 2 Mutation Chromosomes in Generation 155

Converging to the Genetic Solution

We have fixed the size of each generation to be 10 chromosomes, but the genetic algorithm incorporates randomness in many areas, from our random chromosomes to the crossover genes to the bit flip rate.
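The sketch below shows one way the generational loop might be organised and where that randomness enters: the choice of parents, the crossover cut point and the per-gene bit flips. It assumes the rates reported later in the article (20% elites, 20% crossover, 60% mutation, 1% bit flips); the parent-selection scheme (drawing from the fitter half) and the helper name next_generation are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng()

def next_generation(population, target, elite_rate=0.2, crossover_rate=0.2,
                    bit_flip_rate=0.01):
    """Assemble a fixed-size generation from elites, crossover children and mutants."""
    size, n_genes = population.shape
    fitness = (population == target).mean(axis=1)   # fraction of matching genes
    order = np.argsort(fitness)[::-1]               # fittest chromosomes first

    # Elitism: carry the top 20% over unchanged.
    children = [population[i].copy() for i in order[: int(elite_rate * size)]]

    # Crossover: splice two of the fitter parents at a random cut point.
    for _ in range(int(crossover_rate * size)):
        i, j = rng.choice(order[: size // 2], size=2, replace=False)
        cut = rng.integers(1, n_genes)
        children.append(np.concatenate([population[i][:cut], population[j][cut:]]))

    # Mutation: fill the remaining slots (60% here) with copies of fit parents,
    # flipping each gene independently with a small probability.
    while len(children) < size:
        child = population[rng.choice(order[: size // 2])].copy()
        child[rng.random(n_genes) < bit_flip_rate] ^= 1
        children.append(child)

    return np.stack(children)
```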

Statistics of the first 10 generations show how natural selection gradually improves the population fitness, while dispersion is highest in the first random generation.

Fitness Distribution Statistics for Generations 1 to 10

Statistics of the last 10 generations show that each population is extremely fit, with the least fit chromosome scoring 95%.
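As a sketch, the per-generation statistics behind these tables could be computed as follows, assuming the fitness scores of every generation have been kept in a list; the names below are illustrative rather than the article's exact code.

```python
import numpy as np

def fitness_summary(fitness_history):
    """Summarise each generation's fitness distribution as (gen, min, mean, max, std).

    `fitness_history` is assumed to be a list where entry g holds the fitness
    scores of every chromosome in generation g, each between 0 and 1.
    """
    rows = []
    for g, scores in enumerate(fitness_history, start=1):
        scores = np.asarray(scores)
        rows.append((g, scores.min(), scores.mean(), scores.max(), scores.std()))
    return rows

# The spread (std) should be largest for the random first generation and shrink
# as the population converges towards the solution.
```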

As a high-fitness chromosome approaches the genetic solution, it becomes increasingly likely that its fitness can only improve by mutating a single gene.

Thus, high mutation rates and low bit flip rates give fit generations the best chance of achieving the target solution.
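To make that concrete, here is a back-of-the-envelope calculation, assuming a 100-gene chromosome (consistent with the 2^100 ≈ 1.27e+30 search space quoted later) and the 1% bit flip rate used in this article.

```python
# Probability that a mutant of a 99%-fit, 100-gene chromosome becomes the exact
# solution: the one wrong gene must flip while the 99 correct genes must not.
p = 0.01                                   # per-gene bit flip rate
prob_exact_solution = p * (1 - p) ** 99    # ≈ 0.0037 per mutant
print(prob_exact_solution)
```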

Fitness Distribution Statistics for Generations 146 to 155

We can visualize the performance of the genetic algorithm by plotting fitness statistics across generations.
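One way such a plot could be produced with matplotlib is sketched below, assuming the per-generation fitness scores have been collected in a list; the function name and styling are illustrative.

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_fitness(fitness_history):
    """Plot the min, mean and max fitness of every generation."""
    gens = np.arange(1, len(fitness_history) + 1)
    stats = np.array([(np.min(f), np.mean(f), np.max(f)) for f in fitness_history])

    plt.plot(gens, stats[:, 2], label="max fitness")
    plt.plot(gens, stats[:, 1], label="mean fitness")
    plt.plot(gens, stats[:, 0], label="min fitness")
    plt.xlabel("Generation")
    plt.ylabel("Fitness")
    plt.legend()
    plt.show()
```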

Because each gene of a random chromosome matches the target with probability 1/2, the first generation starts with an expected fitness of about 50%.
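A quick sanity check of that expectation, assuming 100-gene chromosomes and a uniformly random target:

```python
import numpy as np

rng = np.random.default_rng(0)
target = rng.integers(0, 2, size=100)              # hidden solution
population = rng.integers(0, 2, size=(10, 100))    # random first generation
fitness = (population == target).mean(axis=1)
print(fitness.mean())   # clusters around 0.5, since each gene matches with probability 1/2
```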

Elitism guarantees that generational peak performance is monotonically non-decreasing, while crossover drives the large early improvements and mutation provides the final push to the optimal solution.

Convergence of the Gene Pool Towards the Optimal Solution Over 155 Generations

Tuning the Model Parameters

The randomness of the genetic algorithm means that its performance and efficiency will vary over a series of repeated trials.

By solving 1,000 games and plotting the distribution of total chromosomes created over all generations, we see that our genetic algorithm will typically find the optimal solution after creating only about 1,700 chromosomes out of 1.27e+30 possibilities!

This performance distribution is specific to the hyper-parameters of our model, which are 10 chromosomes per generation, a 20% elite rate, a 20% crossover rate, a 60% mutation rate and a 1% bit flip rate.
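A sketch of how this experiment could be reproduced is shown below. It assumes a solve_game() function that runs the genetic algorithm once to 100% fitness and returns the number of chromosomes it created; that function name and return value are hypothetical stand-ins for the actual implementation.

```python
import numpy as np

# A 100-gene chromosome has 2**100 possible values, consistent with the
# 1.27e+30 search-space size quoted above.
print(f"{2**100:.2e}")

def benchmark(solve_game, trials=1_000):
    """Repeat the experiment and summarise how many chromosomes each run created."""
    counts = np.array([solve_game() for _ in range(trials)])
    print(f"median chromosomes created: {np.median(counts):.0f}")   # typically ~1,700
    return counts
```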

In the future, we could optimize the efficiency of our genetic algorithm for this problem by minimizing the number of chromosomes required for convergence with respect to these hyper-parameters.

Performance Metric of a Genetic Algorithm with Fixed Model Parameters Over 1,000 Samples

The source code written for this analysis is available on GitHub, which includes a random Battleship board generator, an implementation of the genetic algorithm and the examples presented in this article.

While mastering genetic algorithms involves many more details, we can feel confident that we understand the fundamentals of this optimization technique.
