Building Custom Callbacks with Keras and TensorFlow 2

December 28, 2020 by systems

Similarly, we can implement a custom callback TestingCallback() for testing.

from tensorflow.keras.callbacks import Callback

class TestingCallback(Callback):

    def on_test_begin(self, logs=None):
        print("Starting testing ...")

    def on_test_batch_begin(self, batch, logs=None):
        print(f"Testing: Starting batch {batch}")

    def on_test_batch_end(self, batch, logs=None):
        print(f"Testing: Finished batch {batch}")

    def on_test_end(self, logs=None):
        print("Finished testing")

To use it for testing, pass it to the callbacks argument in the model.evaluate() method.

model.evaluate(
    X_test,
    y_test,
    verbose=False,
    callbacks=[TestingCallback()],
    batch_size=2000,  # A large value for demo purposes
)

Executing this statement should produce output like the following:

Starting testing ...
Testing: Starting batch 0
Testing: Finished batch 0
Testing: Starting batch 1
Testing: Finished batch 1
Testing: Starting batch 2
Testing: Finished batch 2
Testing: Starting batch 3
Testing: Finished batch 3
Testing: Starting batch 4
Testing: Finished batch 4
Finished testing
[85.61210479736329, 0.8063]

Similarly, we can implement a custom callback PredictionCallback() for prediction.

class PredictionCallback(Callback):

    def on_predict_begin(self, logs=None):
        print("Starting prediction ...")

    def on_predict_batch_begin(self, batch, logs=None):
        print(f"Prediction: Starting batch {batch}")

    def on_predict_batch_end(self, batch, logs=None):
        print(f"Prediction: Finish batch {batch}")

    def on_predict_end(self, logs=None):
        print("Finished prediction")

To use it for prediction, we just need to pass it to the callbacks argument in the model.predict() method.

model.predict(
    X_test,
    verbose=False,
    callbacks=[PredictionCallback()],
    batch_size=2000,  # A large value for demo purposes
)

Executing this statement should produce output like the following:

Starting prediction ...
Prediction: Starting batch 0
Prediction: Finish batch 0
Prediction: Starting batch 1
Prediction: Finish batch 1
Prediction: Starting batch 2
Prediction: Finish batch 2
Prediction: Starting batch 3
Prediction: Finish batch 3
Prediction: Starting batch 4
Prediction: Finish batch 4
Finished prediction
array([......])

One of the main applications of callbacks is to perform actions that depend on performance metrics, for example:

  • Real-time plotting during training
  • Stop training when a metric has stopped improving
  • Save the model at the end of every epoch (a short sketch follows this list)
  • Adjust the learning rate (or other hyperparameters) according to a defined schedule
  • etc.
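
For example, saving the model at the end of every epoch takes only a few lines in a custom callback. A minimal sketch (the file name pattern is illustrative only) could look like this:

import tensorflow as tf

class SaveModelAtEpochEnd(tf.keras.callbacks.Callback):

    def on_epoch_end(self, epoch, logs=None):
        # Save the full model after every epoch; the path is purely illustrative
        self.model.save(f"model_epoch_{epoch}.h5")

Note that Keras also ships a built-in ModelCheckpoint callback for this task; the sketch simply shows how the same idea looks as a custom callback.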

In this section, we are going to show you some examples of Keras custom callback applications.

  • Real-time plotting during training
  • Early stopping at minimum loss
  • Learning rate scheduling

4.1 Real-time plot during training

This first example creates a Callback that plots a live, real-time update of the loss as training progresses.

import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt

# Create a figure and a plot inside the figure
fig = plt.figure(figsize=(12, 4))
ax = fig.add_subplot()
ax.set_xlabel('Epoch #')
ax.set_ylabel('loss')

class TrainingPlot(tf.keras.callbacks.Callback):

    def on_train_begin(self, logs={}):
        # Initialize the list for holding the losses
        self.losses = []

    def on_epoch_end(self, epoch, logs={}):
        # Append the loss of this epoch to the list
        self.losses.append(logs['loss'])

        # Plot all losses collected so far
        epochs = np.arange(0, len(self.losses))
        ax.plot(epochs, self.losses, "b-")
        fig.canvas.draw()

self.losses = [] is initialized in on_train_begin() to hold the losses. The real-time plot is drawn at the end of each epoch in on_epoch_end() by calling ax.plot(epochs, self.losses, "b-") and fig.canvas.draw().

To use it for training, we just need to pass it to the callbacks argument in the model.fit() method.

model = create_model()
history = model.fit(
    X_train,
    y_train,
    epochs=20,
    validation_split=0.20,
    batch_size=64,
    verbose=2,
    callbacks=[TrainingPlot()]
)

Executing this statement should produce a live, real-time plot like the one below:

Image by author: live plot of the training loss per epoch

4.2 Early stopping at minimum loss

Note: this example is originally from the Keras guide “Writing your own callbacks”; please check out the official documentation for details.

This example shows the creation of a Callback that stops training when the minimum loss has been reached, by setting the attribute self.model.stop_training (boolean). Optionally, you can provide an argument patience to specify how many epochs to wait before stopping after a local minimum has been reached.
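
A sketch of such a callback, following the example in the Keras guide (it tracks the best loss seen so far and restores the best weights when training stops), could look like this:

import numpy as np
import tensorflow as tf

class EarlyStoppingAtMinLoss(tf.keras.callbacks.Callback):
    """Stop training when the loss is at its minimum, i.e. when it stops decreasing.

    Arguments:
        patience: number of epochs to wait after the minimum has been hit.
    """

    def __init__(self, patience=0):
        super().__init__()
        self.patience = patience
        self.best_weights = None

    def on_train_begin(self, logs=None):
        self.wait = 0           # epochs waited since the last improvement
        self.stopped_epoch = 0  # epoch at which training was stopped
        self.best = np.inf      # best (lowest) loss seen so far

    def on_epoch_end(self, epoch, logs=None):
        current = logs.get("loss")
        if np.less(current, self.best):
            self.best = current
            self.wait = 0
            # Remember the weights of the best epoch so far
            self.best_weights = self.model.get_weights()
        else:
            self.wait += 1
            if self.wait >= self.patience:
                self.stopped_epoch = epoch
                self.model.stop_training = True
                print("Restoring model weights from the end of the best epoch.")
                self.model.set_weights(self.best_weights)

    def on_train_end(self, logs=None):
        if self.stopped_epoch > 0:
            print(f"Epoch {self.stopped_epoch + 1}: early stopping")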

To use it for training:

model = create_model()
history = model.fit(
    X_train,
    y_train,
    epochs=50,
    validation_split=0.20,
    batch_size=64,
    verbose=2,
    callbacks=[EarlyStoppingAtMinLoss()]
)

In this run, training was stopped at epoch 28, as the minimum loss had been reached.

Image by author: training output showing early stopping

4.3 Learning rate scheduler

Note: this example is originally from the Keras guide “Writing your own callbacks”; please check out the official documentation for details.

This example shows how a custom Callback can be used to dynamically change the learning rate of the optimizer during the course of training.
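
A sketch of the callback, together with a matching lr_schedule function, following the example in the Keras guide (the schedule values below are illustrative only), could look like this:

import tensorflow as tf

class CustomLearningRateScheduler(tf.keras.callbacks.Callback):
    """Set the learning rate according to a schedule at the start of each epoch.

    Arguments:
        schedule: a function taking the epoch index and the current learning rate
            and returning the learning rate to use for that epoch.
    """

    def __init__(self, schedule):
        super().__init__()
        self.schedule = schedule

    def on_epoch_begin(self, epoch, logs=None):
        if not hasattr(self.model.optimizer, "lr"):
            raise ValueError('Optimizer must have a "lr" attribute.')
        # Get the current learning rate from the model's optimizer
        lr = float(tf.keras.backend.get_value(self.model.optimizer.lr))
        # Ask the schedule function for the learning rate to use this epoch
        scheduled_lr = self.schedule(epoch, lr)
        # Set the new learning rate on the optimizer before the epoch starts
        tf.keras.backend.set_value(self.model.optimizer.lr, scheduled_lr)
        print(f"Epoch {epoch}: learning rate is {scheduled_lr}.")

# (epoch to start, learning rate) pairs -- illustrative values only
LR_SCHEDULE = [(3, 0.05), (6, 0.01), (9, 0.005), (12, 0.001)]

def lr_schedule(epoch, lr):
    """Return the scheduled learning rate for the given epoch, otherwise keep the current one."""
    if epoch < LR_SCHEDULE[0][0] or epoch > LR_SCHEDULE[-1][0]:
        return lr
    for start_epoch, new_lr in LR_SCHEDULE:
        if epoch == start_epoch:
            return new_lr
    return lr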

To use it for training:

model = create_model()
history = model.fit(
    X_train,
    y_train,
    epochs=15,
    validation_split=0.20,
    batch_size=64,
    verbose=2,
    callbacks=[CustomLearningRateScheduler(lr_schedule)]
)

Keras provides a base class called Callback, which we can subclass to create our own callbacks. It is really useful for debugging and for performing actions that depend on performance metrics.

I hope this article helps you save time in creating your own custom callbacks and performing custom actions. I recommend checking out the documentation for the Callbacks API to learn about other things you can do.

Thanks for reading. Please check out the notebook for the source code, and stay tuned if you are interested in the practical aspects of machine learning.

You may be interested in some of my other TensorFlow articles; more can be found on my GitHub.
