How to Use the Online Neural Network (ONN) in Python

Jan 7, 2022 | Data Science

The Online Neural Network (ONN) brings deep learning to the online-learning setting, where a model must learn from data one example at a time. This post walks through installing and using the ONN implementation, and introduces its companion method, the Non-linear Contextual Bandit Algorithm (ONN_THS). By the end of this article, you will be able to use both algorithms effectively.

What is ONN?

ONN is a PyTorch implementation of the research paper Online Deep Learning: Learning Deep Neural Networks on the Fly. The algorithm uses a backpropagation mechanism called Hedge Backpropagation, which adapts the network during learning by reweighting the contribution of each hidden layer's output according to its recent performance. Think of ONN as a switchboard operator who decides how much signal to route through each circuit (hidden layer) based on how reliable that circuit has proven to be.
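To make the idea concrete, here is a minimal standalone sketch of the Hedge weighting rule from the paper: each layer's weight is multiplicatively discounted by its loss and then renormalized. This is illustrative only, not the library's internal code; the beta value and the per-layer losses below are made-up examples.

```python
import numpy as np

def hedge_update(alpha, losses, beta=0.99):
    """Discount each layer's weight by beta**loss, then renormalize.

    alpha  -- current weight of each hidden layer's output classifier
    losses -- per-layer loss on the latest example (illustrative values)
    beta   -- discount factor in (0, 1); smaller beta punishes errors harder
    """
    alpha = alpha * beta ** np.asarray(losses)
    return alpha / alpha.sum()

# Three layers start with equal weight; layer 1 keeps making large errors,
# so its share of the final prediction shrinks over time.
alpha = np.ones(3) / 3
for _ in range(50):
    alpha = hedge_update(alpha, losses=[0.1, 0.9, 0.2])
```

After repeated updates, the consistently worse layer ends up with the smallest weight, which is exactly the mechanism that lets ONN favor shallower or deeper parts of the network as the data stream evolves.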

Installation

To get started with ONN, you first need to install the package. You can do this swiftly by running the following command in your terminal:

pip install onn

How to Use ONN

Once installed, you can use ONN by following these steps:

Step 1: Importing the Library

import numpy as np
from onn.OnlineNeuralNetwork import ONN

Step 2: Starting a Neural Network

Initialize the network, specifying the input feature size, the maximum number of hidden layers, the number of neurons per hidden layer, and the number of classes:

onn_network = ONN(features_size=2, max_num_hidden_layers=5, qtd_neuron_per_hidden_layer=10, n_classes=2)

Step 3: Partial Training

Conduct partial training with sample data:

onn_network.partial_fit(np.asarray([[0.1, 0.2]]), np.asarray([0]))
onn_network.partial_fit(np.asarray([[0.8, 0.5]]), np.asarray([1]))
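In practice, these one-example-at-a-time calls sit inside a loop over a data stream. The sketch below illustrates that loop shape with a plain NumPy perceptron as a stand-in model (not the ONN class, so it runs without the onn package); with the real library you would call onn_network.partial_fit and onn_network.predict at the same points.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stream: class 1 when the feature sum exceeds 1, else class 0.
X = rng.random((200, 2))
y = (X.sum(axis=1) > 1.0).astype(int)

# Stand-in online model: a perceptron with the same fit-one/predict cycle.
w, b = np.zeros(2), 0.0
correct = 0
for xi, yi in zip(X, y):
    pred = int(xi @ w + b > 0)   # predict before seeing the label
    correct += (pred == yi)
    if pred != yi:               # perceptron update on mistakes only
        sign = 1 if yi == 1 else -1
        w += sign * xi
        b += sign

accuracy = correct / len(X)      # prequential (test-then-train) accuracy
```

Evaluating each prediction before training on the label, as above, is the standard "prequential" protocol for online learners and applies equally to ONN.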

Step 4: Making Predictions

Utilize the trained model to predict classes:

predictions = onn_network.predict(np.asarray([[0.1, 0.2], [0.8, 0.5]]))

With enough training, the output should look like this (after only the two updates above, the exact values may differ):

Predictions -- array([0, 1])

New Features

  • The algorithm now supports batch processing (mainly for experimentation, as it is not recommended for online approaches).
  • It can utilize CUDA for enhanced performance, though for smaller networks, CPU processing may be quicker.

Using the ONN_THS Algorithm

ONN_THS works as a non-linear contextual bandit, balancing exploitation and exploration by means of Thompson Sampling:

from onn.OnlineNeuralNetwork import ONN_THS

onn_network = ONN_THS(features_size=2, max_num_hidden_layers=5, qtd_neuron_per_hidden_layer=10, n_classes=2)
arm_selected, exploration_factor = onn_network.predict(np.asarray([[0.1, 0.2]]))
onn_network.partial_fit(np.asarray([[0.1, 0.2]]), np.asarray([arm_selected]), exploration_factor)
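The select-arm / observe-reward / update cycle that ONN_THS automates is easiest to see in the classic Beta-Bernoulli form of Thompson Sampling. The standalone sketch below (pure NumPy, no onn package, and no neural network) shows only that cycle; the arm probabilities are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Two arms with hidden success probabilities; the learner must discover
# that arm 1 pays off more often.
true_p = [0.3, 0.7]
wins = np.ones(2)    # Beta prior: successes (starts at Beta(1, 1))
fails = np.ones(2)   # Beta prior: failures

pulls = np.zeros(2, dtype=int)
for _ in range(2000):
    # Sample a plausible success rate per arm, play the best sample.
    samples = rng.beta(wins, fails)
    arm = int(np.argmax(samples))
    reward = rng.random() < true_p[arm]   # observe a Bernoulli reward
    wins[arm] += reward                   # posterior update
    fails[arm] += 1 - reward
    pulls[arm] += 1
```

Early on, the wide posteriors make both arms plausible (exploration); as evidence accumulates, the sampler concentrates on the better arm (exploitation). ONN_THS applies the same principle with the network's outputs as the arms.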

Troubleshooting

If you encounter any issues while using ONN or ONN_THS, consider the following:

  • Check if the package is installed correctly. Rerun pip install onn if necessary.
  • Ensure your input data is shaped correctly. Input data should be NumPy arrays.
  • Make sure to check for compatibility if using CUDA.
  • For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.

Conclusion

At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.
