Unlock the power of gaze detection with machine learning in JavaScript! This guide will help you create interactive experiences that respond to eye movements, allowing users to engage without the need for traditional input devices.
Demo
Curious to see it in action? Visit the gaze demo (works well on mobile too!) and explore the possibilities!

What is Gaze Detection?
This tool detects when users look in different directions—right, left, up, and straight ahead. It’s inspired by the Android application Look to Speak and utilizes TensorFlow.js’s face landmark detection model.
How to Use
Installation
To get started, you’ll need to install the gaze-detection module. Use the following command:
npm install gaze-detection --save
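The module expects a video element on the page to display the webcam feed, so add one before wiring things up. A minimal page might look like the sketch below (the index.js filename is an assumption; use whatever entry point your bundler produces):

<video></video>
<script type="module" src="index.js"></script>

The code sample that follows grabs this element with document.querySelector('video').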
Code Sample
Now that you have the module installed, let’s break down the code required to implement gaze detection.
Think of this process as setting up a wise owl that watches the user and reacts to where they are looking. In our analogy:
- The wise owl is the gaze detection tool, observing the user through the webcam.
- The directions the user can look (right, left, up, straight ahead) are the predictions the model can return.
- The actions we take based on the owl’s observations are our code responding to each prediction.
Here’s how you can implement it:
import gaze from 'gaze-detection';

const videoElement = document.querySelector('video');

const init = async () => {
  // Using the default webcam
  await gaze.setUpCamera(videoElement);

  // Or, to use another camera input device instead, find it by label
  // among the available devices and pass its deviceId:
  // const mediaDevices = await navigator.mediaDevices.enumerateDevices();
  // const camera = mediaDevices.find(
  //   (device) => device.kind === 'videoinput' && device.label.includes('YourLabel') // The label from the list of available devices
  // );
  // await gaze.setUpCamera(videoElement, camera.deviceId);
};
let raf;

const predict = async () => {
  const gazePrediction = await gaze.getGazePrediction();
  console.log('Gaze direction: ', gazePrediction); // Will return RIGHT, LEFT, STRAIGHT or TOP
  if (gazePrediction === 'RIGHT') {
    // Do something when the user looks to the right
  }
  raf = requestAnimationFrame(predict);
};

// Set up the camera, then start the prediction loop
init().then(predict);

// To stop the detection:
cancelAnimationFrame(raf);

Note that raf is declared outside predict so that cancelAnimationFrame can reach it, and that init() must finish setting up the camera before the prediction loop starts.
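The body of the if statement is where your application logic goes. As a small illustrative sketch in the spirit of Look to Speak (everything here except gazePrediction is hypothetical and not part of the gaze-detection API), you could map each direction to an action, using the browser’s built-in speech synthesis:

const phrases = ['Yes', 'No', 'Thank you'];
let selected = 0;

// Hypothetical handlers for each gaze direction
const actions = {
  RIGHT: () => { selected = (selected + 1) % phrases.length; }, // next phrase
  LEFT: () => { selected = (selected - 1 + phrases.length) % phrases.length; }, // previous phrase
  TOP: () => window.speechSynthesis.speak(new SpeechSynthesisUtterance(phrases[selected])), // speak it
  STRAIGHT: () => {}, // no action while looking straight ahead
};

Calling actions[gazePrediction]?.() inside predict would then cycle through the phrases as the user looks left or right, and read the selected phrase aloud when they look up.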
Troubleshooting
If you encounter issues during setup or while running predictions, consider the following troubleshooting tips:
- Ensure that your webcam is connected and accessible by the browser.
- Check your browser’s permissions to make sure it has access to the webcam.
- Verify that you are using the correct device label for the camera if you are specifying one; labels vary across devices and browsers (the sketch after this list shows how to print the available ones).
- Ensure that TensorFlow.js and gaze-detection are properly installed and up to date.
- Look for any errors in the browser console for more specific guidance.
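To find the right device label, you can print every available video input with the standard Media Capture API. Note that device labels are usually blank until the page has been granted camera permission, which is why this minimal sketch requests access first:

(async () => {
  // Labels are typically empty until camera permission is granted
  await navigator.mediaDevices.getUserMedia({ video: true });
  const devices = await navigator.mediaDevices.enumerateDevices();
  devices
    .filter((device) => device.kind === 'videoinput')
    .forEach((device) => console.log(device.label, device.deviceId));
})();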
If you need further assistance, don’t hesitate to reach out. For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.
Conclusion
At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.