How to Use Local LLMs in Notebooks with Langchain

Have you ever wanted to test your locally hosted language models in a notebook environment? Well, you’re in luck! This guide will walk you through the steps to easily load your local language models using Langchain. Whether you’re using APIs or not, we’ve got you covered.

Overview of the Project

The goal of this project is to help you seamlessly integrate locally hosted language models into notebooks for testing. Three notebooks are available:

  • Two notebooks that use APIs to create custom Langchain LLM wrappers (one for oobabooga's text-generation-webui and another for KoboldAI).
  • One notebook that loads models without an API by leveraging oobabooga's text-generation-webui virtual environment and modules for model loading.

In the end, you’ll be able to create an instance of the Custom LLM Wrapper and generate text by calling it directly, for example llm("your prompt here"). You can use this in place of the OpenAI LLM class commonly seen in various guides.
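To give a feel for what such a wrapper looks like, here is a minimal sketch assuming a local text-generation API that accepts a JSON prompt and returns generated text. The endpoint path, port, class name, and payload fields are assumptions; match them to your own API's documentation:

```python
from typing import List, Optional

import requests
from langchain.llms.base import LLM


class LocalWebUILLM(LLM):
    """Custom Langchain wrapper that forwards prompts to a locally hosted API."""

    # Assumption: the endpoint and port depend on how your web UI exposes its API.
    url: str = "http://127.0.0.1:5000/api/v1/generate"

    @property
    def _llm_type(self) -> str:
        return "local-webui"

    def _call(self, prompt: str, stop: Optional[List[str]] = None) -> str:
        # Assumption: payload and response fields vary between API versions.
        payload = {"prompt": prompt, "max_new_tokens": 200}
        response = requests.post(self.url, json=payload, timeout=120)
        response.raise_for_status()
        return response.json()["results"][0]["text"]


llm = LocalWebUILLM()
print(llm("Write one sentence about running language models locally."))
```

Once created, the instance drops into any Langchain chain exactly where an OpenAI LLM would otherwise go.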

Getting Started

To begin your adventure with local LLMs, first set up the API by following the instructions in the relevant repository, then complete these steps in the notebook:

  1. Update the url variable with your API URL.
  2. Run the cells to create an instance of the Custom LLM Wrapper, as shown in the sketch below.
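For example, reusing the wrapper class sketched above, the url variable might be filled in like this (the default ports shown are assumptions; use whatever address your backend reports):

```python
# Point the wrapper at whichever backend you are running.
url = "http://127.0.0.1:5000/api/v1/generate"    # text-generation-webui API (assumed default port)
# url = "http://127.0.0.1:5001/api/v1/generate"  # KoboldAI API (assumed default port)

llm = LocalWebUILLM(url=url)
print(llm("Hello from a local model!"))
```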

Roadmap

Currently, the preferred method for loading the models is through the API. Future improvements are planned for the API classes, but for now, both API and non-API notebooks will continue to work. However, please note that the non-API options may not be actively maintained going forward, so proceed with caution.

Non-API Notebook Instructions

If you prefer to use the non-API option, follow these steps:

  1. Activate your Python or Conda environment.
  2. Install Jupyter Notebook by running pip install jupyter in your preferred command prompt or terminal.
  3. Restart your command prompt or terminal to ensure the installation is properly configured.
  4. Activate your Python or Conda environment again, then run jupyter notebook in the command prompt or terminal to launch the Jupyter interface.
  5. Navigate to the directory where Non-API-Notebook.ipynb is located. If you’re an ooba user, place it in the text-generation-webui directory, then open the notebook in the Jupyter interface (a loading sketch follows this list).
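The notebook itself loads models through text-generation-webui's own modules. As an illustration of the same idea (loading a local model directly instead of calling an API), here is a minimal sketch using Hugging Face transformers with Langchain's HuggingFacePipeline wrapper; the model path is an assumption, so point it at a model you have downloaded locally:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
from langchain.llms import HuggingFacePipeline

# Assumption: path to a locally downloaded model; adjust to your setup.
model_path = "models/your-local-model"

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path, device_map="auto")

# Build a text-generation pipeline and hand it to Langchain.
generate = pipeline("text-generation", model=model, tokenizer=tokenizer, max_new_tokens=200)
llm = HuggingFacePipeline(pipeline=generate)

print(llm("Summarize the benefits of running models locally."))
```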

Understanding the Code: An Analogy

Imagine your local language model is like a chef in a restaurant. The restaurant (your local machine) is equipped with all necessary tools and ingredients (the environment and libraries needed). Your goal is to direct this chef to prepare customized meals (generate text) based on the specific requests from customers (prompts you provide). The notebooks serve as the order form, guiding your chef on how to create the perfect dish (text output) using either direct instructions (non-API) or a detailed menu (API).

Troubleshooting

In your journey, you may run into some challenges. Here are a few troubleshooting tips:

  • If you’re encountering issues with Jupyter Notebook, ensure that your Python environment is activated correctly and that you’ve restarted your terminal after installation.
  • If the API doesn’t connect, double-check the URL you provided to ensure it’s accurate (a quick connectivity check is sketched after this list).
  • For errors related to dependency issues, verify that all necessary packages are installed in your environment.
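As a quick sanity check for the API connection, you can send a small request directly; the URL and payload here are assumptions, so use the same values your notebook uses:

```python
import requests

# Assumption: replace with the URL you set in the notebook.
url = "http://127.0.0.1:5000/api/v1/generate"

try:
    r = requests.post(url, json={"prompt": "ping", "max_new_tokens": 1}, timeout=10)
    print("Status:", r.status_code)
except requests.exceptions.ConnectionError:
    print("Could not reach the API; check that the web UI was started with its API enabled.")
```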

For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.

Conclusion

At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.
