In this chapter we will cover the following recipes:
- Perceptron classifier
- Neural network – multilayer perceptron
- Stacking with a neural network
Neural networks and deep learning have been incredibly popular recently, as they have solved tough problems and become a significant part of the public face of artificial intelligence. Let's explore the feed-forward neural networks available in scikit-learn.
With scikit-learn, you can explore the perceptron classifier and relate it to other classification procedures within scikit-learn. Additionally, perceptrons are the building blocks of neural networks, a very prominent part of machine learning, particularly in computer vision.
Let's get started. The process is as follows:
Load the UCI diabetes dataset:
import numpy as np
import pandas as pd
data_web_address = "https://archive.ics...
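The URL is truncated in the source. Assuming data_web_address points at the full CSV of the diabetes data, a minimal sketch of the remaining steps might look like this (the column layout and parameters are illustrative assumptions, not from the source):
from sklearn.linear_model import Perceptron
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Assumed layout: numeric feature columns, with the binary outcome last.
all_data = pd.read_csv(data_web_address, header=None)
X = all_data.iloc[:, :-1].values
y = all_data.iloc[:, -1].values

# Perceptrons are sensitive to feature scale, so standardize the inputs.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, random_state=7)
scaler = StandardScaler().fit(X_train)

per = Perceptron()
per.fit(scaler.transform(X_train), y_train)
print(per.score(scaler.transform(X_test), y_test))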
Using a neural network in scikit-learn is straightforward and proceeds as follows:
Load the medium-sized California housing dataset that we used in Chapter 9, Tree Algorithms and Ensembles:
%matplotlib inline
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.datasets import fetch_california_housing

# Download (or load from the scikit-learn cache) the California housing data.
cali_housing = fetch_california_housing()
X = cali_housing.data    # eight numeric features per block group
y = cali_housing.target  # median house value
Bin the target variable so that the target distributions in the train set and test set are a bit more similar. Then use a stratified train/test split on those bins:
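Here is a minimal sketch of this step, followed by fitting scikit-learn's MLPRegressor on the resulting split; the quartile binning and the hyperparameters are illustrative choices, not the exact settings from the recipe:
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Bin the continuous target at its quartiles (the bin count is an assumed
# choice) so that train and test sets have similar target distributions.
bins = np.digitize(y, np.percentile(y, [25, 50, 75]))
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=bins, random_state=7)

# Scale the inputs before the MLP; the layer sizes are illustrative.
mlp = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(50, 50),
                                 max_iter=500, random_state=7))
mlp.fit(X_train, y_train)
print(mlp.score(X_test, y_test))  # R^2 on the held-out set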
The two most common meta-learning methods are bagging and boosting. Stacking is less widely used, yet it is powerful because it can combine models of different types. All three methods build a stronger estimator from a set of not-so-strong estimators. We tried the stacking procedure in Chapter 9, Tree Algorithms and Ensembles. Here, we try it with a neural network mixed with other models, as sketched below.
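As a condensed sketch of the idea, recent scikit-learn versions (0.22 and later) ship a built-in StackingRegressor that can combine a neural network with other model types. The base estimators and settings below are illustrative assumptions, not the exact procedure from the recipe:
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Mix models of different types: a scaled MLP and a random forest,
# blended by a ridge regression meta-learner.
base_learners = [
    ('mlp', make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(50, 50),
                                       max_iter=500, random_state=7))),
    ('forest', RandomForestRegressor(n_estimators=100, random_state=7)),
]
stacker = StackingRegressor(estimators=base_learners,
                            final_estimator=Ridge())
stacker.fit(X_train, y_train)
print(stacker.score(X_test, y_test))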
The process for stacking is as follows: