PromethAI is an open-source framework designed to help you navigate decision-making, set personalized goals, and execute them with the assistance of AI agents. In this blog, we will walk through how to set it up, explore its features, and troubleshoot common issues. Let’s dive in!
What is PromethAI?
PromethAI is a Python-based AGI (Artificial General Intelligence) project that tailors recommendations based on user goals and feedback. Currently focused on the food industry, it’s highly extensible, allowing application in various domains.
Features of PromethAI
- Optimized for Autonomous Agents
- Personalized for each user
- Decision trees to aid navigation and solution finding
- Asynchronous operation
- Supports multiple Vector DBs through Langchain
- Low latency and easy deployment
Setting Up PromethAI
Follow these simple steps to install and configure PromethAI:
- Clone the repository:

```shell
git clone https://github.com/topoteretes/PromethAI-Backend.git
```

- Navigate to the directory:

```shell
cd PromethAI-Backend
```

- Create a copy of the .env.template and rename it to .env:

```shell
cp .env.template .env
```

- Enter your OpenAI API key, Pinecone API key, Google API key, and Custom Search Engine ID into the .env file. To obtain these keys:
- OpenAI API Key: Visit the OpenAI developer dashboard to create an API key.
- Pinecone API Key: Sign up at Pinecone.io.
- Google API Key: Create a project in the Google Cloud Console.
- Custom Search Engine ID: Set up at Google Programmable Search Engine.
- Ensure you have Docker and Docker Compose installed. If not, you can download them from the Docker website.
- Once that’s set up, run the command to start the application:

```shell
docker-compose up promethai --build
```

- Open a browser and go to localhost:3000 to see PromethAI running!
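Once filled in, the .env file might look roughly like the sketch below. Note that the variable names here are illustrative assumptions based on the keys this guide asks for; use the exact names found in your .env.template.

```
# .env (variable names are illustrative; match them to .env.template)
OPENAI_API_KEY=your-openai-key
PINECONE_API_KEY=your-pinecone-key
GOOGLE_API_KEY=your-google-key
CUSTOM_SEARCH_ENGINE_ID=your-cse-id
```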
Understanding How PromethAI Works
Now, let’s break down how the AI operates:
- Imagine PromethAI as a detective trying to solve a case. When a user queries the AI, it first “collects evidence” by vectorizing the query and storing it in a virtual memory bank (the Pinecone Vector Database).
- Just like a detective pulls out the relevant clues from memory, the AI looks through its past queries to find any helpful information.
- Then, the detective (in this case, the AI) thinks about the next steps to take, stores the plan (thought), and takes action based on its findings and the user’s current query.
- Finally, it answers the question and adds this new evidence to its memory for future reference.
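The detective loop above can be sketched in Python. This is a toy stand-in, not PromethAI's actual implementation: the `MemoryBank` class mimics the role of the Pinecone vector database, and `embed` is a character-frequency placeholder for a real embedding model. All names here are hypothetical.

```python
# Hypothetical sketch of PromethAI's query -> retrieve -> plan -> remember loop.
from math import sqrt

def embed(text):
    # Toy embedding: normalized character-frequency vector
    # (a stand-in for a real embedding model).
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord('a')] += 1.0
    norm = sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a, b):
    return sum(x * y for x, y in zip(a, b))

class MemoryBank:
    """In-memory stand-in for the Pinecone vector database."""
    def __init__(self):
        self.records = []  # (vector, text) pairs

    def add(self, text):
        self.records.append((embed(text), text))

    def search(self, query, k=2):
        # Pull the most similar past queries ("relevant clues").
        scored = sorted(self.records,
                        key=lambda r: cosine(r[0], embed(query)),
                        reverse=True)
        return [text for _, text in scored[:k]]

def answer(memory, query):
    # 1. "Collect evidence": vectorize the query and look up related memories.
    clues = memory.search(query)
    # 2. "Plan": a placeholder thought combining the retrieved clues.
    thought = f"Answering {query!r} using {len(clues)} past clue(s)"
    # 3. "Act" and remember: store the new query for future reference.
    memory.add(query)
    return thought, clues

memory = MemoryBank()
memory.add("healthy chicken recipes")
thought, clues = answer(memory, "chicken meal ideas")
```

The real system replaces the toy embedding with a model-generated vector and the in-memory list with Pinecone, but the collect/retrieve/plan/store shape is the same.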
Using PromethAI
To use PromethAI, follow these commands:
```shell
docker-compose build promethai
```
And then access the API with CURL requests. Here’s an example:
```shell
curl -X POST http://0.0.0.0:8000/recipe-request \
  -H "Content-Type: application/json" \
  --data-raw '{
    "user_id": 659,
    "session_id": 459,
    "model_speed": "slow",
    "prompt": "I would like a healthy chicken meal over 125$"
  }'
```
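If you prefer Python to curl, the same request can be built with the standard library. The endpoint path and payload fields below are taken from the curl example; `build_recipe_request` is a hypothetical helper, not part of PromethAI's API.

```python
# Build (without sending) the same POST request the curl example makes.
import json
import urllib.request

def build_recipe_request(user_id, session_id, prompt, model_speed="slow",
                         base_url="http://0.0.0.0:8000"):
    url = f"{base_url}/recipe-request"
    payload = {
        "user_id": user_id,
        "session_id": session_id,
        "model_speed": model_speed,
        "prompt": prompt,
    }
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    return req, payload

req, payload = build_recipe_request(
    659, 459, "I would like a healthy chicken meal over 125$")
# Once the container is up, urllib.request.urlopen(req) sends the request.
```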
Troubleshooting Common Issues
If you encounter issues while setting up or using PromethAI, consider these troubleshooting steps:
- Check that you have all the required API keys entered correctly in the .env file.
- Ensure Docker and Docker Compose are installed and running properly.
- Double-check your terminal commands for typos.
- If the app doesn’t load at localhost:3000, make sure the Docker container is running.
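For the first checklist item, a small script can flag missing or empty keys before you start the container. The variable names in `REQUIRED` are assumptions based on the keys this guide asks for, so adjust them to match your actual .env.template.

```python
# Sanity-check a .env file for required, non-empty keys.
# The names in REQUIRED are illustrative; match them to .env.template.
REQUIRED = ["OPENAI_API_KEY", "PINECONE_API_KEY",
            "GOOGLE_API_KEY", "CUSTOM_SEARCH_ENGINE_ID"]

def missing_keys(env_text):
    present = set()
    for line in env_text.splitlines():
        line = line.strip()
        # Skip comments; count a key only if it has a non-empty value.
        if "=" in line and not line.startswith("#"):
            name, value = line.split("=", 1)
            if value.strip():
                present.add(name.strip())
    return [k for k in REQUIRED if k not in present]

# Example: one key set, one empty, one commented out, one missing.
sample = "OPENAI_API_KEY=sk-test\nPINECONE_API_KEY=\n# GOOGLE_API_KEY=x\n"
print(missing_keys(sample))
```

Running it against the sample above reports the empty, commented, and absent keys; run `missing_keys(open(".env").read())` against your real file.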
- For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.
At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.
Conclusion
Setting up and utilizing PromethAI can be an exciting journey into the world of autonomous AI decisions. By following this guide, you can seamlessly get started and explore the vast possibilities this framework offers.

