Welcome to the world of artificial intelligence with MicroMLP, a compact yet powerful multilayer perceptron designed to run seamlessly on devices like ESP32 and Pycom modules. Whether you’re looking to process signals, images, or even explore reinforcement learning, this guide is here to help you navigate MicroMLP with ease.
Getting Started with MicroMLP
MicroMLP is incredibly user-friendly: the whole library is a single file, microMLP.py. Despite that simplicity, it lets you modify the network's structure, manage learning examples, and choose from a variety of activation functions.
Features of MicroMLP
Here’s what makes MicroMLP stand out:
- Modifiable multilayer and connections structure
- Integrated bias on neurons
- Plasticity of the connections included
- Activation functions by layer
- Parameters Alpha, Eta, and Gain
- Managing examples and learning sets
- QLearning functions available for reinforcement learning
- Ability to save and load structures from JSON files
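The Eta and Alpha parameters listed above play the roles of the classic learning-rate and momentum terms in backpropagation. As a rough, MicroMLP-independent sketch of how they interact in a single weight update (the function and variable names below are illustrative, not MicroMLP's internals):

```python
# Sketch of a momentum-style weight update: eta scales the error
# gradient, alpha carries over a fraction of the previous change.
# Illustrative only -- this is not MicroMLP's internal code.

def update_weight(weight, gradient, prev_delta, eta=0.30, alpha=0.75):
    delta = -eta * gradient + alpha * prev_delta
    return weight + delta, delta

w, d = 0.5, 0.0
for _ in range(3):
    w, d = update_weight(w, gradient=0.2, prev_delta=d)
print(w)
```

Because alpha re-applies part of the previous step, repeated updates in the same direction accelerate, which helps the network cross flat regions of the error surface.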
Activation Functions Offered
MicroMLP supports various activation functions:
- Heaviside binary step
- Logistic (sigmoid)
- Hyperbolic tangent
- SoftPlus rectifier
- ReLU (rectified linear unit)
- Gaussian function
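To make the list above concrete, here is a plain-Python sketch of the math behind each function, written from the standard textbook definitions rather than copied from microMLP.py:

```python
import math

def heaviside(x):   # binary step: 0 below the threshold, 1 at or above it
    return 1.0 if x >= 0 else 0.0

def sigmoid(x):     # logistic function, squashes any input into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):        # hyperbolic tangent, squashes into (-1, 1)
    return math.tanh(x)

def softplus(x):    # smooth approximation of the rectifier
    return math.log(1.0 + math.exp(x))

def relu(x):        # rectified linear unit: identity for x > 0, else 0
    return max(0.0, x)

def gaussian(x):    # bell curve, peaks at 1 when x = 0
    return math.exp(-x * x)
```

Bounded functions like sigmoid and tanh suit classification-style outputs, while ReLU and SoftPlus are common choices for hidden layers.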
How to Create a Neural Network Using MicroMLP
Creating a neural network with MicroMLP is refreshingly simple. Think of it as building a Lego model: each block is a neuron, and the connections between blocks are how they communicate. By choosing how many blocks to use and which kinds (activation functions) to mix, you can assemble a model tailored to your needs.
```python
from microMLP import MicroMLP

mlp = MicroMLP.Create([3, 10, 2], MicroMLP.ACTFUNC_SIGMOID, MicroMLP.LayersFullConnect)
```
In this example, the neural network consists of 3 neurons in the input layer, 10 in the hidden layer, and 2 in the output layer, using the sigmoid activation function with all layers fully connected.
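Under the hood, a fully connected [3, 10, 2] network is just two weighted passes with an activation applied at each layer. The following pure-Python sketch mirrors that forward pass with random weights and a sigmoid activation; it illustrates the idea only and does not use MicroMLP's actual classes:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def make_layer(n_in, n_out):
    # One weight per (input, output) pair, plus one bias per output neuron
    return {
        "w": [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)],
        "b": [random.uniform(-1, 1) for _ in range(n_out)],
    }

def forward(layer, inputs):
    # Weighted sum of inputs plus bias, passed through the activation
    return [sigmoid(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(layer["w"], layer["b"])]

random.seed(42)
hidden = make_layer(3, 10)    # 3 inputs  -> 10 hidden neurons
output = make_layer(10, 2)    # 10 hidden -> 2 output neurons

result = forward(output, forward(hidden, [0.5, 0.1, 0.9]))
print(result)  # two output values, each squashed into (0, 1)
```

Training then consists of nudging those weights and biases until the outputs match the examples, which is what MicroMLP's learning functions do for you.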
Learning and Predicting with MicroMLP
Once you’ve set up your neural network, you can train and make predictions. Here’s how to tackle the famous XOr problem:
```python
from microMLP import MicroMLP

mlp = MicroMLP.Create(
    neuronsByLayers=[2, 2, 1],
    activationFuncName=MicroMLP.ACTFUNC_TANH,
    layersAutoConnectFunction=MicroMLP.LayersFullConnect
)

nnFalse = MicroMLP.NNValue.FromBool(False)
nnTrue  = MicroMLP.NNValue.FromBool(True)

# The four rows of the XOR truth table as training examples
mlp.AddExample([nnFalse, nnFalse], [nnFalse])
mlp.AddExample([nnFalse, nnTrue],  [nnTrue])
mlp.AddExample([nnTrue,  nnTrue],  [nnFalse])
mlp.AddExample([nnTrue,  nnFalse], [nnTrue])

learnCount = mlp.LearnExamples()

print('LEARNED:')
print(' - False xor False = %s' % mlp.Predict([nnFalse, nnFalse])[0].AsBool)
print(' - False xor True  = %s' % mlp.Predict([nnFalse, nnTrue])[0].AsBool)
print(' - True  xor True  = %s' % mlp.Predict([nnTrue, nnTrue])[0].AsBool)
print(' - True  xor False = %s' % mlp.Predict([nnTrue, nnFalse])[0].AsBool)

if mlp.SaveToFile('mlp.json'):
    print("MicroMLP structure saved!")
```
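SaveToFile serializes the whole structure (layers, connections, weights) to JSON so it can be restored later with MicroMLP.LoadFromFile. The round-trip idea can be sketched with the standard json module; note that the toy dictionary below is just a stand-in, not MicroMLP's real file schema:

```python
import io
import json

# Toy stand-in for a saved network -- NOT MicroMLP's actual schema
net = {"layers": [2, 2, 1], "weights": [[0.1, -0.4], [0.7]]}

# Serialize to JSON (an in-memory buffer here; MicroMLP writes a file)
buf = io.StringIO()
json.dump(net, buf)

# Deserialize and recover an equal structure
buf.seek(0)
restored = json.load(buf)
print(restored == net)
```

Persisting the trained structure this way means the slow learning phase can happen once, with the resulting network reloaded instantly on a resource-constrained board.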
Troubleshooting Common Issues
As with any programming, issues can arise. Here are some common scenarios you might encounter with MicroMLP and how to fix them:
- Network not learning: Ensure you're training with a sufficient number of examples and learning passes. If necessary, increase the size of your training set.
- Slow predictions: Check if the device you’re running MicroMLP on has adequate processing power.
- Saving/loading problems: If you encounter issues while attempting to save or load files, verify that the path and file names are correctly specified.
- No output: Make sure that the activation functions and neurons are appropriately set up. Misconfiguration can lead to no results from predictions.
For additional support, feel free to reach out for collaboration or insights on AI development projects by connecting with **fxis.ai**.
Conclusion
At **fxis.ai**, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.

