Welcome to the universe of N-BEATS, where neural basis expansion analysis meets interpretable time series forecasting! In this blog, we’ll explore how to implement N-BEATS using both TensorFlow and PyTorch backends. You might be wondering, what does all of this mean? Don’t worry; by the end of this article, you’ll be ready to dive in!
Understanding N-BEATS
N-BEATS (Neural Basis Expansion Analysis for interpretable Time Series forecasting) is essentially a mathematical magician: a deep, fully connected architecture that tackles time series forecasting while providing interpretable results. Imagine you are looking for a needle in a haystack (your forecasting objective); N-BEATS is the strategic approach, equipped with tools that help you sift through the hay. Each of its stacked blocks produces a backcast (its explanation of the input window) and a partial forecast, and when the stacks are specialized for trend and seasonality, the decomposition itself identifies the patterns that lead directly to the needle – the answer to your forecasting dilemma.
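What does "interpretable" look like in code? Instead of generic stacks, you can build the model from a trend stack and a seasonality stack, so each stack's partial forecast has a direct meaning. Here is a minimal sketch using the nbeats_keras backend from later in this article; the TREND_BLOCK and SEASONALITY_BLOCK constants mirror the GENERIC_BLOCK constant used in the full example, and the thetas_dim values are illustrative, not prescriptive:

from nbeats_keras.model import NBeatsNet as NBeatsKeras

# Interpretable variant: a trend stack followed by a seasonality stack,
# so each stack's contribution to the forecast can be inspected separately.
model = NBeatsKeras(
    backcast_length=10, forecast_length=1,
    stack_types=(NBeatsKeras.TREND_BLOCK, NBeatsKeras.SEASONALITY_BLOCK),
    nb_blocks_per_stack=2, thetas_dim=(4, 8), share_weights_in_stack=True,
    hidden_layer_units=64
)
model.compile(loss='mae', optimizer='adam')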
Installation
Before you start, let’s get N-BEATS up and running on your machine. You can easily install both TensorFlow and PyTorch backends.
From PyPI
- To install the TensorFlow/Keras backend, run:
pip install nbeats-keras
- To install the PyTorch backend, run:
pip install nbeats-pytorch
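Once installed, a quick import check confirms both backends are available (a sanity check only, no modeling involved):

# Both imports should succeed after a successful installation.
from nbeats_keras.model import NBeatsNet as NBeatsKeras
from nbeats_pytorch.model import NBeatsNet as NBeatsPytorch

print(NBeatsKeras.__name__, NBeatsPytorch.__name__)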
From the Sources
For those who prefer a more hands-on approach:
- Install N-BEATS with Keras using:
make install-keras
- Install N-BEATS with PyTorch using:
make install-pytorch
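Building from source requires a local checkout first. Assuming the packages come from the philipperemy/n-beats repository (the canonical upstream, to our knowledge), the full sequence looks like:

git clone https://github.com/philipperemy/n-beats.git
cd n-beats
make install-keras
make install-pytorch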
Run on the GPU
If you want to harness the power of the GPU with the Keras backend, you can force GPU utilization by running:
pip uninstall -y tensorflow
pip install tensorflow-gpu
Note that on recent TensorFlow releases (2.1 and later) this swap is usually unnecessary: the standard tensorflow package already ships with GPU support, and the separate tensorflow-gpu package has been deprecated.
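After switching packages, you can confirm that TensorFlow actually sees a GPU with this quick check:

import tensorflow as tf

# Lists the GPUs visible to TensorFlow; an empty list means CPU-only execution.
print(tf.config.list_physical_devices('GPU'))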
Example Implementation
Here’s how you can get familiar with both backends:
import warnings

import numpy as np

from nbeats_keras.model import NBeatsNet as NBeatsKeras
from nbeats_pytorch.model import NBeatsNet as NBeatsPytorch

warnings.filterwarnings(action='ignore', message='Setting attributes')

def main():
    # Problem dimensions: 50,000 windows of 10 time steps, univariate in and out.
    num_samples, time_steps, input_dim, output_dim = 50_000, 10, 1, 1

    # The same example runs against both backends; in practice, pick one.
    for BackendType in [NBeatsKeras, NBeatsPytorch]:
        backend = BackendType(
            backcast_length=time_steps, forecast_length=output_dim,
            stack_types=(NBeatsKeras.GENERIC_BLOCK, NBeatsKeras.GENERIC_BLOCK),
            nb_blocks_per_stack=2, thetas_dim=(4, 4), share_weights_in_stack=True,
            hidden_layer_units=64
        )

        # Objective function and optimizer.
        backend.compile(loss='mae', optimizer='adam')

        # Toy data: the target y is the mean of each input window, so the model
        # must learn a function f such that |f(x) - y| -> 0, with f = np.mean.
        x = np.random.uniform(size=(num_samples, time_steps, input_dim))
        y = np.mean(x, axis=1, keepdims=True)

        # Hold out the first 10% of the samples for testing.
        c = num_samples // 10
        x_train, y_train, x_test, y_test = x[c:], y[c:], x[:c], y[:c]
        test_size = len(x_test)

        print('Training...')
        backend.fit(x_train, y_train, validation_data=(x_test, y_test), epochs=20, batch_size=128)

        # Save the model for later reuse.
        backend.save('n_beats_model.h5')

        # Predict on the test set: the forecast (future values) and the
        # backcast (the model's reconstruction of the input window).
        predictions_forecast = backend.predict(x_test)
        np.testing.assert_equal(predictions_forecast.shape, (test_size, backend.forecast_length, output_dim))
        predictions_backcast = backend.predict(x_test, return_backcast=True)
        np.testing.assert_equal(predictions_backcast.shape, (test_size, backend.backcast_length, output_dim))

        # Reload the saved model and check that it predicts identically.
        model_2 = BackendType.load('n_beats_model.h5')
        np.testing.assert_almost_equal(predictions_forecast, model_2.predict(x_test))

if __name__ == '__main__':
    main()
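Once training has finished, the saved file can be reloaded in a later session to forecast new windows without retraining. A minimal sketch with the Keras backend, reusing the file name and dimensions from the example above:

import numpy as np
from nbeats_keras.model import NBeatsNet as NBeatsKeras

# Reload the model saved by the example above.
model = NBeatsKeras.load('n_beats_model.h5')

# Forecast one new window: shape (1, backcast_length, input_dim) = (1, 10, 1).
new_window = np.random.uniform(size=(1, 10, 1))
forecast = model.predict(new_window)
print(forecast.shape)  # expected: (1, forecast_length, output_dim) = (1, 1, 1)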
Breaking Down the Code: An Analogy
Think of the above code block as a recipe for making a delicious multi-layer cake!
- The ingredients: num_samples, time_steps, input_dim, output_dim represent the quantities of different cake ingredients.
- Mixing the ingredients: The loop iterates over different types of cakes (Keras and PyTorch), much like deciding whether to make chocolate or vanilla layers.
- Baking: When you call backend.fit(), you’re baking the cake in the oven, ensuring that all ingredients blend perfectly over the specified time (epochs).
- Testing: Just like tasting your cake with a toothpick, backend.predict() lets you check whether the cake is ready – both the forecast (predicted future values) and the backcast (the model's reconstruction of its input window)!
Troubleshooting
If you encounter issues during installation or implementation, consider the following troubleshooting tips:
- Ensure you have the necessary dependencies installed, including Python and relevant libraries.
- Double-check backend compatibility; for instance, only the Keras backend currently supports input_dim > 1, while the PyTorch backend expects input_dim=1.
- If your model is not training well, check the data shapes and ensure they align with the model's expectations (see the sketch after this list).
- For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.
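To make the shape check from the list above concrete, here is a minimal standalone sketch of the array layout the model expects, with the dimensions taken from the example earlier in this article:

import numpy as np

backcast_length, forecast_length, input_dim, output_dim = 10, 1, 1, 1

# x must be (samples, backcast_length, input_dim);
# y must be (samples, forecast_length, output_dim).
x = np.random.uniform(size=(1000, backcast_length, input_dim))
y = np.mean(x, axis=1, keepdims=True)  # toy target, as in the example above

print(x.shape)  # (1000, 10, 1)
print(y.shape)  # (1000, 1, 1)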
Conclusion
At fxis.ai, we believe that advancements like N-BEATS are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.