In the world of artificial intelligence, effective communication between large language models (LLMs) and production environments is vital. Enter **BricksLLM**—your cloud-native AI gateway that streamlines this integration process. Written in Go, it supports a variety of leading AI models like OpenAI, Anthropic, and Azure OpenAI. Let’s embark on a journey to discover how to set it up and troubleshoot common issues you may face along the way!
## Getting Started with BricksLLM
To successfully launch BricksLLM, you’ll need to follow a simple series of steps. Think of it as assembling a LEGO model where each step is a crucial block that supports the structure of your AI solution.
### Step-by-Step Setup
1. **Clone the repository.** Copy the BricksLLM-Docker repository:

   ```bash
   git clone https://github.com/bricks-cloud/BricksLLM-Docker
   ```

2. **Navigate to the directory.** Change into BricksLLM-Docker:

   ```bash
   cd BricksLLM-Docker
   ```

3. **Deploy BricksLLM.** Bring up the system along with PostgreSQL and Redis:

   ```bash
   docker compose up
   ```

4. **Create a provider setting.** Register your OpenAI API key with a cURL request:

   ```bash
   curl -X PUT http://localhost:8001/api/provider-settings \
     -H "Content-Type: application/json" \
     -d '{"provider":"openai","setting":{"apikey":"YOUR_OPENAI_KEY"}}'
   ```

5. **Create a Bricks API key.** Create a key with specific rate and spend limits (here, 2 requests per minute and a $0.25 cost cap):

   ```bash
   curl -X PUT http://localhost:8001/api/key-management/keys \
     -H "Content-Type: application/json" \
     -d '{"name":"My Secret Key","key":"my-secret-key","tags":["mykey"],"settingIds":["ID_FROM_STEP_FOUR"],"rateLimitOverTime":2,"rateLimitUnit":"m","costLimitInUsd":0.25}'
   ```

6. **Redirect requests.** Send requests through BricksLLM using your newly created key:

   ```bash
   curl -X POST http://localhost:8002/api/providers/openai/v1/chat/completions \
     -H "Authorization: Bearer my-secret-key" \
     -H "Content-Type: application/json" \
     -d '{"model":"gpt-3.5-turbo","messages":[{"role":"system","content":"hi"}]}'
   ```
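One loose end in the steps above: step 5 needs the id returned by step 4's provider-settings call. A minimal sketch of extracting that id from the JSON response with plain POSIX tools — the sample response body here is illustrative only, so check the actual shape BricksLLM returns:

```shell
# Illustrative response body from PUT /api/provider-settings
# (field names assumed for this sketch; inspect your real response).
response='{"id":"98daa3ae-961d-4253-bf6a-322a32fdca13","provider":"openai"}'

# Pull out the id so it can be passed as settingIds in step 5:
setting_id=$(printf '%s' "$response" | sed -n 's/.*"id":"\([^"]*\)".*/\1/p')
echo "$setting_id"
```

In a real script you would capture the curl output into `response` instead of hard-coding it, then substitute `$setting_id` into the key-creation payload.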
## Features of BricksLLM
BricksLLM is outfitted with a plethora of features to enhance your production experience:
- PII detection and masking
- Rate limits and cost control
- Request retries and caching
- Granular access control for both model and endpoint
- Native support for various AI models and custom deployments
## Troubleshooting Common Issues
When diving into the waters of AI integration, you might encounter a few waves. Here are some common issues and their solutions:
- Deployment Issues: Ensure your Docker is running correctly. You can try restarting the Docker service.
- API Key Errors: Double-check that the API key was inserted correctly and matches the one generated with your OpenAI account.
- Authentication Failures: Make sure your `Authorization` header uses the required format: the literal word `Bearer`, a space, then your key (for example, `Authorization: Bearer my-secret-key`).
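The last item is the most common tripwire. A quick local sanity check for the header format, using a small hypothetical helper (not part of BricksLLM):

```shell
# Check that an Authorization header value has the "Bearer <key>" shape.
check_auth_header() {
  case "$1" in
    "Bearer "?*) echo "ok" ;;   # literal "Bearer ", then at least one character
    *)           echo "bad" ;;
  esac
}

check_auth_header "Bearer my-secret-key"   # prints "ok"
check_auth_header "my-secret-key"          # prints "bad" (prefix missing)
```

Note that the scheme name is case-sensitive here: `bearer my-secret-key` would also fail the check.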
For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.
## Continuous Improvement
Keep your setup up-to-date with the following commands:
- To update to the latest version:

  ```bash
  docker pull luyuanxin1995/bricksllm:latest
  ```

- To pull a particular version:

  ```bash
  docker pull luyuanxin1995/bricksllm:1.4.0
  ```
At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.

