Hands-On Neuroevolution with Python

Product type: Book
Published in: Dec 2019
Publisher: Packt
ISBN-13: 9781838824914
Pages: 368
Edition: 1st
Author: Iaroslav Omelianenko

Table of Contents (18 chapters)

1. Preface
2. Section 1: Fundamentals of Evolutionary Computation Algorithms and Neuroevolution Methods
3. Overview of Neuroevolution Methods
4. Python Libraries and Environment Setup
5. Section 2: Applying Neuroevolution Methods to Solve Classic Computer Science Problems
6. Using NEAT for XOR Solver Optimization
7. Pole-Balancing Experiments
8. Autonomous Maze Navigation
9. Novelty Search Optimization Method
10. Section 3: Advanced Neuroevolution Methods
11. Hypercube-Based NEAT for Visual Discrimination
12. ES-HyperNEAT and the Retina Problem
13. Co-Evolution and the SAFE Method
14. Deep Neuroevolution
15. Section 4: Discussion and Concluding Remarks
16. Best Practices, Tips, and Tricks
17. Concluding Remarks
18. Other Books You May Enjoy

Best Practices, Tips, and Tricks

In this chapter, we provide some advice on best practices, tips, and tricks for writing and analyzing neuroevolution algorithms. By the end of this chapter, you will know how to start working with the problem at hand, how to tune the hyperparameters of the neuroevolution algorithm, how to use advanced visualization tools, and what metrics can be used to analyze the algorithm's performance. You will also learn about the best coding practices for Python, which will help you implement your projects.

In this chapter, we will cover the following topics:

  • Starting with problem analysis
  • Selecting the optimal search optimization method
  • Using advanced visualization tools
  • Tuning hyperparameters and knowing what should be tuned
  • Understanding which performance metrics to collect
  • Python coding tips and tricks
...

Starting with problem analysis

Starting with a proper analysis of the problem space is a recipe for success. Neuroevolution is lenient with programmer errors: such mistakes become part of the environment, and the evolutionary process can often adapt to them. However, one category of mistakes can prevent the evolutionary process from finding a successful solution: those affecting the numerical stability of the evolution. Most activation functions are designed to operate on inputs in the range between zero and one. Values far outside this range saturate the activation function, so variations in them have little influence on the network's output and, consequently, on the evolution process.

Thus, you may need to preprocess the input data to avoid these numeric issues. Do not skip the analysis of the input data samples and data preprocessing steps.
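One common preprocessing step of this kind is scaling each input feature into the [0, 1] range the activation functions expect. Below is a minimal sketch using NumPy min-max scaling; the raw feature matrix is a hypothetical example, not data from the book's experiments:

```python
import numpy as np

def minmax_scale(samples):
    """Scale each column (feature) of `samples` into the [0, 1] range.

    Guards against zero-variance features to keep the division
    numerically stable.
    """
    samples = np.asarray(samples, dtype=float)
    lo = samples.min(axis=0)
    hi = samples.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)  # avoid division by zero
    return (samples - lo) / span

# Hypothetical raw sensor readings with very different ranges.
raw = [[-500.0, 0.001],
       [   0.0, 0.002],
       [ 500.0, 0.003]]
scaled = minmax_scale(raw)
```

After scaling, both features live in [0, 1] and contribute comparably to the network's activations, regardless of their original magnitudes.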

Next, we discuss how to preprocess the input data.

...

Selecting the optimal search optimization method

In this book, we have presented you with two basic search optimization methods: goal-oriented search and Novelty Search. The former method is more straightforward to implement and easier to understand. However, Novelty Search is handy in cases where the fitness function has a deceptive landscape with many local optima traps.

In the next section, we briefly discuss both methods to remind you of the details and to help you choose which one to use in a given situation. We start with goal-oriented search.

Goal-oriented search optimization

Goal-oriented search optimization is based on measuring the proximity of the solution to the ultimate goal. To calculate the average distance...
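A goal-oriented fitness function of this kind can be sketched as follows. This is a minimal illustration, not the book's exact implementation; the goal coordinates and agent positions are hypothetical:

```python
import math

def distance_to_goal(position, goal):
    """Euclidean distance between the agent's final position and the goal."""
    return math.hypot(position[0] - goal[0], position[1] - goal[1])

def goal_oriented_fitness(position, goal, max_distance):
    """Map proximity to the goal into a fitness score in [0, 1].

    An agent that reaches the goal exactly scores 1.0; fitness decays
    linearly with distance and is floored at 0.
    """
    d = distance_to_goal(position, goal)
    return max(0.0, 1.0 - d / max_distance)

# Hypothetical maze: goal at (10, 10), maze diagonal as the max distance.
goal = (10.0, 10.0)
fit_far = goal_oriented_fitness((0.0, 0.0), goal, max_distance=20.0)
fit_near = goal_oriented_fitness((9.0, 10.0), goal, max_distance=20.0)
```

The closer the agent finishes to the goal, the higher its fitness, which is exactly the property that makes this scheme vulnerable to deceptive landscapes: a dead end near the goal scores better than an open corridor far from it.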

Advanced visualization

Almost always, proper visualization of inputs and results is crucial to the success of your experiment. With proper visualization, you will get intuitive insights about what has gone wrong and what needs to be fixed.

Always try to visualize the simulator execution environment. Such visualization can save you hours of debugging when you get an unexpected result. Usually, with adequate visualization, you can see at a glance that something has gone wrong, such as a maze solver that got stuck in a corner.

With neuroevolution algorithms, you also need to visualize the performance of the genetic algorithm execution per generation. You need to visualize speciation from generation to generation to see whether the evolutionary process has stagnated. Stagnated evolution fails to create enough species to maintain healthy diversity among solvers. On the other hand...
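A lightweight way to inspect speciation without a full plotting stack is to log the species sizes per generation as a small text chart. The sketch below uses hypothetical per-generation data; in a real run, you would collect these counts from your NEAT library's statistics reporter:

```python
def speciation_summary(history):
    """Render per-generation species sizes as a small text chart.

    `history` maps generation -> {species_id: size}. A healthy run keeps
    several species alive; a collapse to one species signals stagnation.
    """
    lines = []
    for gen in sorted(history):
        species = history[gen]
        total = sum(species.values())
        bar = " ".join(f"s{sid}:{'#' * size}"
                       for sid, size in sorted(species.items()))
        lines.append(f"gen {gen:3d} ({len(species)} species, pop {total}): {bar}")
    return "\n".join(lines)

# Hypothetical run: diversity collapses by generation 2.
history = {
    0: {1: 5, 2: 4, 3: 3},
    1: {1: 7, 2: 5},
    2: {1: 12},
}
print(speciation_summary(history))
```

Even this crude chart makes a diversity collapse immediately visible, which is the signal you are looking for when deciding whether to adjust the speciation hyperparameters.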

Tuning hyperparameters

With proper tuning of the hyperparameters, you can make tremendous improvements in the training speed and efficiency of the neuroevolution process. Here are some practical tips:

  • Do short runs with different seed values of the random number generator and note how the algorithm performance changes. After that, choose the seed value that gives the best performance and use it for the long runs.
  • You can increase the number of species in the population by decreasing the compatibility threshold and by slightly increasing the value of the disjoint/excess weight coefficient.
  • If the neuroevolution process has stalled while trying to find a solution, try decreasing the value of the NEAT survival threshold. This coefficient determines the fraction of the best organisms within a population that gets the chance to reproduce. By doing this, you increase the quality of...
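The first tip, trying several random seeds on short runs before committing to a long one, can be sketched like this. The `short_run` function below is a hypothetical stand-in; in a real experiment, it would seed your neuroevolution library's random number generator, evolve for a few generations, and report the best fitness found:

```python
import random

def short_run(seed, generations=10):
    """Stand-in for a short neuroevolution run: returns best fitness found.

    A real version would configure the library's RNG with `seed` and
    evolve the population for a handful of generations.
    """
    rng = random.Random(seed)
    return max(rng.random() for _ in range(generations))

def pick_best_seed(candidate_seeds):
    """Run one short trial per seed and return the most promising seed."""
    return max(candidate_seeds, key=short_run)

best = pick_best_seed([13, 42, 97, 1337])
# `best` is the seed worth reusing for the long run.
```

Because the runs are seeded, the chosen trial is reproducible: rerunning the long experiment with `best` replays the same random decisions.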

Performance metrics

After a successful solution is found, it is crucial to compare it with other solutions to estimate how good it is. There are many important statistical metrics that compare different models.

Become familiar with concepts such as precision score, recall score, F1 score, ROC AUC, and accuracy. Understanding these metrics will help you to compare the results produced by different models in various classification tasks. Next, we give a brief overview of these metrics.

Precision score

The precision score attempts to answer the question of how many among the positive identifications are actually correct. The precision score can be calculated as follows:

Precision = TP / (TP + FP)

Here, TP is the true positives, and FP is the false positives...
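The formula above can be verified in a few lines of plain Python (in practice, you would typically reach for sklearn.metrics.precision_score instead; the labels below are an invented example):

```python
def precision_score(y_true, y_pred):
    """Precision = TP / (TP + FP), for binary labels (0/1)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if p == 1 and t == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if p == 1 and t == 0)
    return tp / (tp + fp) if (tp + fp) else 0.0

# Four positive predictions, three of them correct -> precision 0.75.
y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 1, 1, 1, 0, 0]
print(precision_score(y_true, y_pred))  # 0.75
```

Note that precision ignores false negatives entirely, which is why it is usually reported together with recall.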

Python coding tips and tricks

Having decided to work with Python, it is vital to learn the best coding practices of the language. Here, we provide you with some tips and give you directions to continue with self-learning.

Coding tips and tricks

The following coding tips and tricks will help you master Python:

  • You should learn how to use the popular machine learning libraries, such as NumPy (https://numpy.org), pandas (https://pandas.pydata.org), and Scikit-learn (https://scikit-learn.org/stable/). Mastering these libraries will give you tremendous power in data manipulation and analysis. This will help you to avoid many mistakes, and enable easy debugging of the results collected from the experiments.
  • Learn about the object...

Summary

In this chapter, we provided you with practical tips that we hope will make your life easier. You learned about the standard methods of data preprocessing and about conventional statistical metrics that can be used to evaluate the performance of the models you created. Finally, you learned how to improve your coding skills and where to look for additional information about Python and machine learning topics.

In the next chapter, we will offer a few concluding remarks based on what we have learned in this book and discuss where these concepts can be applied in the future.
