Open-Source Documentation Assistant
DocsGPT is a cutting-edge open-source solution that streamlines the process of finding information in project documentation. With its integration of powerful GPT models, developers can easily ask questions about a project and receive accurate answers. Say goodbye to time-consuming manual searches, and let DocsGPT help you quickly find the information you need. Try it out and see how it revolutionizes your project documentation experience. Contribute to its development and be a part of the future of AI-powered assistance.
Production Support
Help for Companies: We’re eager to provide personalized assistance when deploying your DocsGPT to a live environment.
![video-example-of-docs-gpt](https://d3dg1063dc54p9.cloudfront.net/videos/demov3.gif)
Roadmap
You can find our roadmap here. Please don’t hesitate to contribute or create issues; it helps us improve DocsGPT!
Our Open-Source Models Optimized for DocsGPT
| Name | Base Model | Requirements (or similar) |
|---|---|---|
| Docsgpt-7b-mistral | Mistral-7b | 1x A10G GPU |
| Docsgpt-14b | Llama-2-14b | 2x A10 GPUs |
| Docsgpt-40b-falcon | Falcon-40b | 8x A10G GPUs |
If you don’t have enough resources to run these models, you can use bitsandbytes to quantize them.
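As a minimal sketch of what that could look like with Hugging Face Transformers and bitsandbytes (the repository id below is an assumption, so substitute the actual model you intend to run):

```python
# Sketch: load a DocsGPT model with 4-bit quantization via bitsandbytes.
# The repo id is an assumption; replace it with the model you intend to run.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "Arc53/docsgpt-7b-mistral"  # assumed Hugging Face repo id

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                     # quantize weights to 4 bits
    bnb_4bit_quant_type="nf4",             # normal-float 4-bit quantization
    bnb_4bit_compute_dtype=torch.float16,  # run compute in fp16
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",                     # spread layers across available devices
)
```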
End to End AI Framework for Information Retrieval
![Architecture chart](https://github.com/user-attachments/assets/fc6a7841-ddfc-45e6-b5a0-d05fe648cbe2)
Useful Links
- Cloud Version
- Join our Discord
- Guides
- Interested in contributing?
- How to use any other documentation
- How to host it locally (so all data will stay on-premises)
Project Structure
- Application – Flask app (main application)
- Extensions – Chrome extension
- Scripts – Script that creates a similarity search index for other libraries
- Frontend – Frontend uses Vite and React
QuickStart
Note: Make sure you have Docker installed.

On Mac OS or Linux, run:

./setup.sh

It will install all the dependencies and allow you to download the local model, use OpenAI, or use our LLM API.

Otherwise, refer to this Guide for Windows:

- Download and open this repository with:
  git clone https://github.com/arc53/DocsGPT.git
- Create a .env file in your root directory and set the env variables, including VITE_API_STREAMING set to true or false, depending on whether you want streaming answers or not. It should look like this inside:
  LLM_NAME=[docsgpt or openai or others]
  VITE_API_STREAMING=true
  API_KEY=[if LLM_NAME is openai]
  See optional environment variables in the .env-template and application/.env_sample files.
- Run run-with-docker-compose.sh.
- Navigate to http://localhost:5173.
- To stop, just press Ctrl + C.
Development Environments
Spin up Mongo and Redis
For development, only two containers are used from docker-compose.yaml (by deleting all services except for Redis and Mongo). See the file docker-compose-dev.yaml.
Run:
docker compose -f docker-compose-dev.yaml build
docker compose -f docker-compose-dev.yaml up -d
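As a quick sanity check (a sketch, assuming docker-compose-dev.yaml maps the default ports 27017 and 6379 to localhost), you can confirm both services are reachable from Python:

```python
# Sketch: confirm the dev Mongo and Redis containers are reachable.
# Assumes the default ports are exposed on localhost by docker-compose-dev.yaml.
import redis
from pymongo import MongoClient

mongo = MongoClient("mongodb://localhost:27017/", serverSelectionTimeoutMS=2000)
print("Mongo ping:", mongo.admin.command("ping"))   # {'ok': 1.0} when healthy

r = redis.Redis(host="localhost", port=6379, socket_connect_timeout=2)
print("Redis ping:", r.ping())                      # True when healthy
```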
Run the Backend
Note: Make sure you have Python 3.10 or 3.11 installed.
- Export required environment variables or prepare a .env file in the project folder: copy .env_sample and create .env (check out application/core/settings.py if you want to see more config options).
- (optional) Create a Python virtual environment:
  - On Mac OS and Linux:
    python -m venv venv
    . venv/bin/activate
  - On Windows:
    python -m venv venv
    venv\Scripts\activate
- Download the embedding model and save it in the model folder. You can use the script below, or download it manually from here, unzip it, and save it in the model folder:
  wget https://d3dg1063dc54p9.cloudfront.net/models/embeddings/mpnet-base-v2.zip
  unzip mpnet-base-v2.zip -d model
  rm mpnet-base-v2.zip
- Install dependencies for the backend:
  pip install -r application/requirements.txt
- Run the app using:
  flask --app application/app.py run --host=0.0.0.0 --port=7091
- Start the worker with:
  celery -A application.app.celery worker -l INFO
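Once the Flask app and the Celery worker are running, you can send a test request to the backend. This is only a sketch: the /api/answer route and the payload fields shown here are assumptions based on a typical DocsGPT setup, so check the routes under application/ for the exact contract.

```python
# Sketch: send a test question to a locally running DocsGPT backend.
# The endpoint path and JSON fields are assumptions; verify them in application/.
import requests

payload = {
    "question": "How do I configure streaming answers?",
    "history": [],    # previous conversation turns, if any
    "api_key": "",    # only needed when LLM_NAME is openai
}

resp = requests.post("http://localhost:7091/api/answer", json=payload, timeout=60)
resp.raise_for_status()
print(resp.json())
```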
Start Frontend
Note: Make sure you have Node version 16 or higher.
- Navigate to the frontend folder.
- Install the required packages husky and vite (ignore if already installed):
  npm install husky -g
  npm install vite -g
- Install dependencies by running:
  npm install --include=dev
- Run the app using:
  npm run dev
Contributing
Please refer to the CONTRIBUTING.md file for information about how to get involved. We welcome issues, questions, and pull requests.
Code Of Conduct
We as members, contributors, and leaders pledge to make participation in our community a harassment-free experience for everyone, regardless of age, body size, visible or invisible disability, ethnicity, sex characteristics, gender identity and expression, level of experience, education, socio-economic status, nationality, personal appearance, race, religion, or sexual identity and orientation. Please refer to the CODE_OF_CONDUCT.md file for more information about contributing.
Many Thanks To Our Contributors
License
The source code license is MIT, as described in the LICENSE file.
Troubleshooting
If you encounter issues while setting up or using DocsGPT, here are some common troubleshooting tips:
- Ensure Docker is properly installed and running.
- Check if all environment variables are correctly set in your .env file.
- Verify that your dependencies are properly installed.
- If running locally, ensure no other applications are using the same ports.
By embracing DocsGPT, you're stepping into a world where documentation becomes more accessible and manageable. Happy coding!