Welcome to your one-stop guide on how to harness the power of TaskingAI! This Backend as a Service (BaaS) platform simplifies the development and deployment of LLM-based agents while providing a multitude of built-in tools to enhance performance. From Docker setup to API interaction, we’ve got you covered!
Key Features of TaskingAI
- All-In-One LLM Platform: Access hundreds of AI models through unified APIs.
- Abundant Enhancement: Improve agent performance with customizable tools and an advanced RAG system.
- BaaS-Inspired Workflow: Decouple AI logic from product development using RESTful APIs and SDKs.
- One-Click to Production: Easily deploy your agents with just a click.
- Asynchronous Efficiency: Leverage Python FastAPI for high-performance concurrency.
- Intuitive UI Console: Simplify project management and workflow testing.
Quickstart with Docker
Setting up TaskingAI with Docker is straightforward. Follow these steps to get started:
Prerequisites
- Docker and Docker Compose installed on your machine.
- Git installed for cloning the repository.
- Python environment (version 3.8 or above) for running the client SDK.
Installation Steps
- Clone the TaskingAI repository from GitHub:

  ```bash
  git clone https://github.com/taskingai/taskingai.git
  ```

- Navigate to the Docker directory:

  ```bash
  cd taskingai/docker
  ```

- Copy the environment configuration:

  ```bash
  cp .env.example .env
  ```

- Edit the .env file to set your configurations.
- Start Docker Compose:

  ```bash
  docker-compose -p taskingai --env-file .env up -d
  ```
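The `--env-file` flag hands Docker Compose a plain file of `KEY=VALUE` lines. As a purely illustrative sketch of that format (the keys below are hypothetical examples, not TaskingAI's actual settings), here is how such a file is parsed:

```python
# Minimal sketch of KEY=VALUE .env parsing; keys shown are hypothetical,
# not TaskingAI's actual configuration variables.
def parse_env(text: str) -> dict:
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

sample = """
# example settings (hypothetical keys)
HOST_PORT=8080
LOG_LEVEL=info
"""
print(parse_env(sample))  # {'HOST_PORT': '8080', 'LOG_LEVEL': 'info'}
```

Anything you change in .env only takes effect after restarting the Compose project, so re-run the `up -d` command after editing.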
Access the TaskingAI console through your browser at `http://localhost:8080`. The default credentials are username `admin` and password `TaskingAI321`.
Upgrading TaskingAI
If you have a previous version and want to upgrade, follow these steps:
- Pull the latest changes:

  ```bash
  git pull origin master
  ```

- Stop the current Docker service:

  ```bash
  cd docker
  docker-compose -p taskingai down
  ```

- Upgrade the Docker image and restart the service:

  ```bash
  docker-compose -p taskingai pull
  docker-compose -p taskingai --env-file .env up -d
  ```
Your data will be automatically migrated, ensuring no data loss.
Using the TaskingAI Client SDK
Once your console is up, you can interact with TaskingAI programmatically using its Client SDK. Below is a simple example:
```python
import taskingai

# Initialize the client with the API key generated in your console
taskingai.init(api_key="YOUR_API_KEY", host="http://localhost:8080")

# Create an assistant backed by your chosen model, with simple ("naive") memory
assistant = taskingai.assistant.create_assistant(model_id="YOUR_MODEL_ID", memory="naive")

# Open a chat session and send a user message
chat = taskingai.assistant.create_chat(assistant_id=assistant.assistant_id)
taskingai.assistant.create_message(assistant_id=assistant.assistant_id, chat_id=chat.chat_id, text="Hello!")

# Ask the assistant to generate its reply
assistant_message = taskingai.assistant.generate_message(assistant_id=assistant.assistant_id, chat_id=chat.chat_id)
print(assistant_message)
```
Make sure to replace `YOUR_API_KEY` and `YOUR_MODEL_ID` with the actual values from your console; leaving them unquoted or unset is a common source of errors.
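A related slip is passing `init` a host without the URL scheme. A small sanity check (purely illustrative, not part of the TaskingAI SDK) can catch that before any request is made:

```python
from urllib.parse import urlparse

def check_host(host: str) -> str:
    """Hypothetical helper: sanity-check a host URL before calling taskingai.init."""
    parts = urlparse(host)
    if parts.scheme not in ("http", "https"):
        raise ValueError(f"host must start with http:// or https://, got {host!r}")
    if not parts.hostname:
        raise ValueError(f"host is missing a hostname: {host!r}")
    return host

print(check_host("http://localhost:8080"))  # http://localhost:8080
```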
Troubleshooting
If you encounter issues while setting up or using TaskingAI, consider the following:
- Docker Issues: Ensure that Docker and Docker Compose are properly installed and running.
- Environment Variables: Double-check your .env file for accurate settings.
- Version Conflicts: Confirm you are running supported versions of Python and other dependencies.
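For the last point, the prerequisites call for Python 3.8 or above; a quick check like the following fails fast with a clear message on older interpreters:

```python
import sys

# The client SDK requires Python 3.8+ (see Prerequisites).
REQUIRED = (3, 8)

def python_ok(version=sys.version_info, required=REQUIRED) -> bool:
    """Return True if the interpreter meets the minimum (major, minor) version."""
    return tuple(version[:2]) >= required

if not python_ok():
    raise SystemExit(f"TaskingAI SDK needs Python {REQUIRED[0]}.{REQUIRED[1]}+, "
                     f"found {sys.version_info[0]}.{sys.version_info[1]}")
print("Python version OK")
```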
For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.
Conclusion
With TaskingAI, the complexity of LLM-based application development is greatly reduced. Thanks to its robust features and user-friendly interface, creating intelligent agents is now accessible to all.
At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.

