You're reading from  Hands-On Neuroevolution with Python.

Product type: Book
Published in: Dec 2019
Reading level: Expert
Publisher: Packt
ISBN-13: 9781838824914
Edition: 1st
Author (1)
Iaroslav Omelianenko

Iaroslav Omelianenko occupied the position of CTO and research director for more than a decade. He is an active member of the research community and has published several research papers on arXiv, ResearchGate, Preprints, and elsewhere. He started working with applied machine learning more than a decade ago, developing autonomous agents for mobile games. For the last five years, he has actively participated in research related to applying deep machine learning methods to authentication, personal traits recognition, cooperative robotics, synthetic intelligence, and more. He is an active software developer who creates open source neuroevolution algorithm implementations in the Go language.

ES-HyperNEAT and the Retina Problem

In this chapter, you will learn about ES-HyperNEAT, an extension of the HyperNEAT method that we discussed in the previous chapter. As you learned there, the HyperNEAT method allows the encoding of large-scale artificial neural network (ANN) topologies, which is essential in areas where the input data has a large number of dimensions, such as computer vision. However, despite all its power, the HyperNEAT method has a significant drawback: the configuration of the ANN substrate must be designed beforehand by a human architect. The ES-HyperNEAT method was invented to address this issue by introducing the concept of an evolvable substrate, which allows the appropriate configuration of the substrate to be produced automatically during evolution.

After familiarizing yourself with the basics of the ES-HyperNEAT...

Technical requirements

Manual versus evolution-based configuration of the topography of neural nodes

The HyperNEAT method, which we discussed in Chapter 7, Hypercube-Based NEAT for Visual Discrimination, allows us to apply neuroevolution to a broad class of problems that require large-scale ANN structures to find a solution. This class of problems spans multiple practical domains, including visual pattern recognition. The main distinguishing feature of all these problems is the high dimensionality of the input/output data.

In the previous chapter, you learned how to define the configuration of the substrate of the discriminator ANN to solve a visual discrimination task. You also learned that it is crucial to use an appropriate substrate configuration that is aligned with the geometric features of the search space of the target problem. With the HyperNEAT method, you, as an...

Quadtree information extraction and ES-HyperNEAT basics

For the effective calculation of the information density within the connectivity patterns of the substrate, we need a data structure that allows an efficient search through the two-dimensional substrate space at different levels of granularity. In computer science, there is a data structure that perfectly fits these requirements: the quadtree.

The quadtree is a data structure that organizes an efficient search through two-dimensional space by splitting any area of interest into four subareas. Each of these subareas becomes a child node of the tree, with the root node representing the initial region; any subarea can, in turn, be subdivided further until the desired level of granularity is reached.
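As a minimal sketch (not the library's actual implementation), the core quadtree subdivision step described above can be expressed as follows; the class and field names here are illustrative:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class QuadNode:
    """A square region of the 2D substrate, centered at (x, y) with half-width w."""
    x: float
    y: float
    w: float
    depth: int = 0
    children: List["QuadNode"] = field(default_factory=list)

    def subdivide(self) -> List["QuadNode"]:
        """Split this region into four equal subareas (the quadtree step)."""
        hw = self.w / 2.0
        for dx in (-hw, hw):
            for dy in (-hw, hw):
                self.children.append(
                    QuadNode(self.x + dx, self.y + dy, hw, self.depth + 1)
                )
        return self.children

# The root node covers the whole substrate area, here [-1, 1] x [-1, 1].
root = QuadNode(0.0, 0.0, 1.0)
quadrants = root.subdivide()
```

Calling `subdivide` on any of the four resulting child nodes repeats the split at the next level of granularity, which is exactly the recursive refinement the algorithm relies on.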

ES-HyperNEAT employs the quadtree data structure to iteratively look for the new connections and nodes in the substrate, starting from...

Modular retina problem basics

Hierarchical modular structures are an essential part of complex biological organisms and play an indispensable role in their evolution. Modularity enhances evolvability by allowing various modules to be recombined during the evolution process. The evolved hierarchy of modular components bootstraps the evolutionary process by allowing operations over a collection of complex structures rather than over basic genes. As a result, the neuroevolutionary process does not need to spend time evolving similar functionality from scratch again; instead, ready-to-use modular components can serve as building blocks for very complex neural networks.

In this chapter, we will implement a solution to the retina problem using the ES-HyperNEAT algorithm. The retina problem is about the simultaneous identification of valid 2x2 patterns on the left...
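To make the task concrete, the following hedged sketch enumerates all sixteen possible 2x2 binary patterns and defines the target output of a solver that must recognize valid patterns on both sides simultaneously. The particular sets of valid patterns shown are placeholder assumptions for illustration only, not the book's actual sets:

```python
from itertools import product

# All 16 possible 2x2 binary patterns, each flattened to a 4-tuple.
ALL_PATTERNS = list(product((0, 1), repeat=4))

# Hypothetical sets of valid patterns for each side of the retina.
# These are illustrative placeholders, not the sets used in the book.
VALID_LEFT = {(1, 1, 1, 1), (0, 1, 0, 1)}
VALID_RIGHT = {(1, 1, 1, 1), (1, 0, 1, 0)}

def retina_target(left, right):
    """Target output: 1 only when both sides show valid patterns at once."""
    return 1 if (left in VALID_LEFT and right in VALID_RIGHT) else 0
```

A complete benchmark would iterate over all left/right pattern combinations and compare the phenotype ANN's output against `retina_target` for each pair.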

Modular retina experiment setup

In this section, we discuss the details of an experiment aimed at creating a successful solver of the modular retina problem. In our experiment, we use this problem as a benchmark to test the ability of the ES-HyperNEAT method to discover modular topologies in the phenotype ANN.

The initial substrate configuration

As described earlier in the chapter, the retina has dimensions of 4x2, with two 2x2 areas: one on the left side and one on the right side. These particulars of the retina geometry must be reflected in the geometry of the initial substrate configuration. In our experiment, we use a three-dimensional substrate, as shown in the following diagram:

The initial substrate configuration

As...
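The following sketch shows one way such a three-dimensional input layer could be laid out in code, with the 4x2 retina split into left and right 2x2 blocks. All coordinate values and the function name are illustrative assumptions, not the exact values used in the experiment:

```python
# Hypothetical 3D coordinates for the input layer of the initial substrate.
# The 4x2 retina splits into a left and a right 2x2 block; exact values
# in the book's substrate may differ -- these are for illustration only.
def retina_input_layer(z=-1.0):
    nodes = []
    for side, x_offset in (("left", -0.5), ("right", 0.5)):
        for dy in (-0.25, 0.25):      # two rows of the 2x2 block
            for dx in (-0.25, 0.25):  # two columns of the 2x2 block
                nodes.append((side, (x_offset + dx, dy, z)))
    return nodes

inputs = retina_input_layer()
```

Grouping each 2x2 block around its own x-offset mirrors the left/right symmetry of the retina, giving the evolutionary process a geometric hint that the two sides are structurally alike.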

Modular retina experiment

Now we are ready to start experimenting with the test environment that simulates the modular retina problem space. In the following subsections, you will learn how to select appropriate hyperparameters and how to set up the environment and run the experiment. After that, we discuss the experiment results.

Hyperparameter selection

The hyperparameters are defined in a Parameters Python class, which the MultiNEAT library consults for the necessary configuration options. In the source code of the experiment runner script, we define a specialized function called create_hyperparameters, which encapsulates the logic of hyperparameter initialization. Hereafter, we describe the most critical hyperparameters...
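As a hedged sketch of what such a function might look like, the following uses only hyperparameter names mentioned in this chapter (PopulationSize, ActivationFunction_SignedGauss_Prob, ActivationFunction_SignedStep_Prob). The numeric values are illustrative assumptions rather than the book's settings, and a minimal stand-in class is substituted when MultiNEAT is not installed:

```python
try:
    import MultiNEAT as NEAT  # the library used by the experiment runner
except ImportError:
    # Minimal stand-in so this sketch runs without MultiNEAT installed.
    class NEAT:
        class Parameters:
            pass

def create_hyperparameters():
    """Build the Parameters object; values shown are illustrative only."""
    params = NEAT.Parameters()
    params.PopulationSize = 300
    # Probabilities of selecting particular activation function types:
    params.ActivationFunction_SignedGauss_Prob = 1.0
    params.ActivationFunction_SignedStep_Prob = 1.0
    return params

params = create_hyperparameters()
```

Keeping all initialization inside one function makes it easy to tweak a single value, such as `params.PopulationSize`, when running the exercises at the end of the chapter.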

Exercises

  1. Try running the experiment with different values of the random seed, which can be changed in line 101 of the retina_experiment.py script. See whether you can find successful solutions with other values.
  2. Try increasing the initial population size to 1,000 by adjusting the value of the params.PopulationSize hyperparameter. How does this affect the performance of the algorithm?
  3. Try changing the number of activation function types used during evolution by setting the probability of their selection to 0. It is especially interesting to see what happens when you exclude the ActivationFunction_SignedGauss_Prob and ActivationFunction_SignedStep_Prob activation types from selection.

Summary

In this chapter, we learned about a neuroevolution method that allows the substrate configuration to evolve while the solution to the problem is being sought. This approach frees the human designer from the burden of specifying a suitable substrate configuration down to the smallest detail, allowing us to define only its primary outlines. The algorithm automatically learns the remaining details of the substrate configuration during evolution.

You also learned about modular ANN structures, which can be used to solve various problems, including the modular retina problem. Modular ANN topologies are a powerful concept that allows a successful phenotype ANN module to be reused multiple times to build a complex hierarchical topology. Furthermore, you had the chance to hone your skills with the Python programming language by implementing the corresponding...

