Hands-On Neuroevolution with Python

Product type: Book
Published in: Dec 2019
Publisher: Packt
ISBN-13: 9781838824914
Pages: 368
Edition: 1st
Author: Iaroslav Omelianenko

Table of Contents (18 chapters)

Preface
Section 1: Fundamentals of Evolutionary Computation Algorithms and Neuroevolution Methods
  Overview of Neuroevolution Methods
  Python Libraries and Environment Setup
Section 2: Applying Neuroevolution Methods to Solve Classic Computer Science Problems
  Using NEAT for XOR Solver Optimization
  Pole-Balancing Experiments
  Autonomous Maze Navigation
  Novelty Search Optimization Method
Section 3: Advanced Neuroevolution Methods
  Hypercube-Based NEAT for Visual Discrimination
  ES-HyperNEAT and the Retina Problem
  Co-Evolution and the SAFE Method
  Deep Neuroevolution
Section 4: Discussion and Concluding Remarks
  Best Practices, Tips, and Tricks
  Concluding Remarks
Other Books You May Enjoy

Using NEAT for XOR Solver Optimization

In this chapter, you will learn about one of the classic computer science experiments used to demonstrate that the NEAT algorithm works and can evolve a proper network topology. You will get first-hand experience of writing an objective function to guide the XOR problem solver, and you will learn how to select the NEAT hyperparameters that assist in solving the XOR problem. The aim of this chapter is to introduce you to the basic techniques for applying the NEAT algorithm to classic computer science problems.

After completing the experiment and exercises described in this chapter, you will have a solid understanding of the XOR experiment's particulars and get the practical skills you need to write the relevant Python source code using the NEAT-Python library. You will also gain experience in setting...

Technical requirements

XOR problem basics

A classic multilayer perceptron (MLP), or artificial neural network (ANN), without any hidden units in its topology is only capable of correctly solving linearly separable problems. As a result, such ANN configurations cannot be used for pattern recognition or for control and optimization tasks. However, with a more complex MLP architecture that includes hidden units with a non-linear activation function (such as the sigmoid), it is possible to approximate any function to a given accuracy. Thus, a non-linearly separable problem can be used to study whether a neuroevolution process can grow the necessary number of hidden units in the ANN of the solver phenotype.
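This point can be sketched by hand: a tiny feed-forward network with two step-activation hidden units solves XOR, while no single-layer perceptron can. The weights below are hand-picked for illustration; they are not produced by NEAT or taken from the book's code:

```python
def step(x):
    """Heaviside step activation: 1 when the weighted sum is positive."""
    return 1.0 if x > 0 else 0.0

def xor_net(x1, x2):
    """A minimal 2-2-1 network that computes XOR with hand-picked weights."""
    h1 = step(x1 + x2 - 0.5)    # OR detector: fires when at least one input is on
    h2 = step(x1 + x2 - 1.5)    # AND detector: fires only when both inputs are on
    return step(h1 - h2 - 0.5)  # "at least one, but not both" = XOR
```

Removing either hidden unit collapses the network to a single linear decision boundary, which is exactly why XOR is a useful minimal test for topology-growing algorithms such as NEAT.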

The XOR problem solver is a classic computer science experiment in the field of reinforcement learning that cannot be solved without introducing non-linearity into the solver algorithm...

The objective function for the XOR experiment

In the XOR experiment, the fitness of an organism in the population is derived from the distance between the correct answers and the outputs the organism generates for all four XOR input patterns. It is computed as follows:

  1. The phenotype ANN is activated against all four XOR input patterns.
  2. The output values are subtracted from the correct answers for each pattern, and the absolute values of the results are summed to produce the error.
  3. The error value found in the previous step is subtracted from the maximal fitness value (4) to calculate the organism's fitness. A higher fitness value means better solver performance.
  4. The calculated fitness is then squared to give proportionally more fitness to organisms whose ANNs produce answers closer to the correct solution. This approach makes the evolutionary pressure...
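The steps above can be sketched as a plain Python function. Here `activate` stands in for the phenotype ANN (in NEAT-Python, this role is played by `neat.nn.FeedForwardNetwork.create(genome, config).activate`); the function is an illustrative sketch of the scheme, not the book's exact source code:

```python
def xor_fitness(activate):
    """Fitness per the four-step scheme: (4 - summed absolute error) squared.

    `activate` maps a 2-tuple of inputs to a single output value.
    """
    xor_inputs = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]
    xor_targets = [0.0, 1.0, 1.0, 0.0]
    # Steps 1-2: activate the net on all four patterns, sum the absolute errors
    error = sum(abs(activate(i) - t) for i, t in zip(xor_inputs, xor_targets))
    # Steps 3-4: subtract the error from the maximal value (4), then square
    return (4.0 - error) ** 2
```

Under this scheme, a perfect solver has zero error and reaches the maximum fitness of (4 - 0)² = 16, so the squaring rewards near-perfect solvers disproportionately.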

Hyperparameter selection

The XOR experiment we will discuss in this chapter uses the NEAT-Python library as a framework. The NEAT-Python library defines a set of hyperparameters that are used to control the execution and performance of the NEAT algorithm. The configuration file is stored in a format similar to Windows .INI files; each section starts with a name in square brackets ([section]), followed by key-value pairs that are delimited by an equals sign (=).

In this section, we will discuss some hyperparameters of the NEAT-Python library that can be found in each section of the configuration file.

A full list of the hyperparameters in the NEAT-Python library can be found at https://neat-python.readthedocs.io/en/latest/config_file.html.
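As an illustration of the file format, the fragment below shows what the [NEAT] section of such a configuration file might look like. The values are illustrative placeholders, not the book's exact settings:

```ini
[NEAT]
# stop the evolution once any genome's fitness reaches the threshold
fitness_criterion     = max
fitness_threshold     = 15.5
# number of genomes in each generation
pop_size              = 150
# whether to create a new random population if all species go extinct
reset_on_extinction   = False
```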

NEAT section

...

Running the XOR experiment

Before we start working on the XOR experiment, we need to set up our Python environment correctly according to the requirements of the NEAT-Python library, which we chose as the framework for writing our code. The NEAT-Python library is available from PyPI, so we can use the pip command to install it into the virtual environment of the XOR experiment.

Environment setup

Before we start writing the code for the XOR experiment, we need to create an appropriate Python environment and install all the dependencies into it. Follow these steps to set up the working environment properly:

  1. A Python 3.5 virtual environment for the XOR experiment is created using the conda command from the...

Exercises

Now that we have the source code of the neuroevolutionary-based XOR solver, try to experiment by changing NEAT's hyperparameters, which control the evolutionary process.

One of the parameters of particular interest is compatibility_threshold, which can be found in the DefaultSpeciesSet section of the configuration file:

  • Try to increase its value and monitor the speciation of the population. Compare the performance of the algorithm with the new value against the default one (3.0). Does it get any better?
  • What happens if you decrease the value of this parameter? Compare its performance against the default value.

Another essential parameter that controls the evolutionary process is min_species_size, which can be found in the DefaultReproduction section. By changing the values of this parameter, you can directly control the minimum number of individuals per species...
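For reference, the two parameters discussed in these exercises live in the following sections of the configuration file. The `compatibility_threshold` value is the default mentioned above; the `min_species_size` value shown is NEAT-Python's documented default:

```ini
[DefaultSpeciesSet]
# genomic distance below which two genomes are grouped into the same species
compatibility_threshold = 3.0

[DefaultReproduction]
# minimum number of individuals kept per species each generation
min_species_size = 2
```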

Summary

In this chapter, we introduced a classic computer science problem related to the creation of an optimal XOR solver. We discussed the basics of the XOR problem and demonstrated its importance as a first experiment with neuroevolution: it allows you to check whether the NEAT algorithm can evolve a more complex ANN topology, starting from the most straightforward ANN configuration. Then, we defined the objective function for the optimal XOR solver and gave a detailed description of the relevant NEAT hyperparameters. After that, we used the NEAT-Python library to write the source code of the XOR solver against the defined objective function, and then experimented with it.

The results of the experiment we carried out allowed us to conclude the relationship between the number of species in the population, the minimum size of each species, and the performance of the algorithm, as well as the...
