Early Stopping Callback to Improve Neural Network Performance with Python

Tuning neural network hyper-parameters to improve performance

Amit Chauhan
5 min read · May 31, 2024

Early stopping is one of the techniques through which we can improve the performance of a neural network. In Keras, many hyper-parameters must be set properly to get good accuracy: the number of hidden layers, the number of neurons per layer, the learning rate, the optimizer, the batch size, the activation functions, and the number of epochs.

Early stopping is an intelligent mechanism that halts training automatically, implemented with the callback feature of the Keras library. The number of epochs sets how many passes over the data are used to update the weights, depending on the type of gradient descent; with mini-batch gradient descent, the batch size is a further hyper-parameter that sets the number of samples per weight update. This article shows how to use early stopping alongside these hyper-parameters.

If we set the number of epochs too high, the model may over-fit, i.e. perform well on the training set but poorly on the test set.
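The rule that early stopping applies can be sketched in plain Python, without Keras: track the validation loss after each epoch and stop once it has not improved for `patience` consecutive epochs. The helper name and the loss values below are illustrative, not part of any library.

```python
def early_stopping_epoch(val_losses, patience=3):
    """Return the epoch index at which training would stop,
    or len(val_losses) if it runs to completion."""
    best = float("inf")
    wait = 0  # epochs since the last improvement
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best = loss
            wait = 0
        else:
            wait += 1
            if wait >= patience:  # no improvement for `patience` epochs
                return epoch
    return len(val_losses)

# validation loss improves, then plateaus and rises (over-fitting)
losses = [0.9, 0.7, 0.6, 0.55, 0.56, 0.58, 0.60, 0.62]
print(early_stopping_epoch(losses, patience=3))  # → 6
```

Here the best loss occurs at epoch 3, so training stops three epochs later, at epoch 6, instead of running through all eight.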

Python Example:

# avoid the warnings
import warnings
warnings.filterwarnings("ignore")

# used to see the decision boundary in 2D space
from mlxtend.plotting import plot_decision_regions

# using the TensorFlow framework
from tensorflow.keras.models import Sequential
from…
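The original listing is cut off above, so here is a hedged, self-contained completion: the dataset (`make_moons`), the layer sizes, and the callback settings are illustrative assumptions, not the author's original code. It wires a `tf.keras.callbacks.EarlyStopping` callback into `model.fit` so training halts once the validation loss stops improving.

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.callbacks import EarlyStopping

# toy 2-D classification data, so the decision boundary can be plotted
X, y = make_moons(n_samples=500, noise=0.25, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

model = Sequential([
    Dense(16, activation="relu", input_shape=(2,)),
    Dense(16, activation="relu"),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# stop once val_loss has not improved for 10 consecutive epochs,
# and roll the weights back to the best epoch seen so far
early_stop = EarlyStopping(monitor="val_loss", patience=10,
                           restore_best_weights=True)

history = model.fit(X_train, y_train,
                    validation_split=0.2,
                    epochs=500,        # an upper bound; training stops earlier
                    batch_size=32,
                    callbacks=[early_stop],
                    verbose=0)

print("trained for", len(history.history["val_loss"]), "epochs")
```

With `restore_best_weights=True`, the model that remains after `fit` returns is the one from the best validation epoch, not the last one, so the large `epochs` value acts only as a safety ceiling.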



Written by Amit Chauhan

Data Scientist, AI/ML/DL, Azure Cloud
