How to Get Started with PAR LLAMA: A User-Friendly Guide

Sep 9, 2021 | Educational

PAR LLAMA is an incredible TUI application designed to help users effectively manage and utilize Ollama-based LLMs across multiple operating systems, including Windows, Windows WSL, Mac, and Linux. In this blog, we’ll walk you through the prerequisites, installation procedures, and how to use the application to its fullest potential. So, let’s dive in!

About

PAR LLAMA is a TUI application that simplifies the management and utilization of Ollama-based large language models (LLMs). Built using Textual and Rich, it features both dark and light modes, as well as custom themes, catering to different user preferences.

Prerequisites for Running

  • Install and run Ollama.
  • Install Python 3.11 or newer. You can find the installers at python.org.
  • On Windows, use Scoop to easily manage Python installation by running: scoop install python.
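Before installing, you can sanity-check both prerequisites from a terminal. This is a minimal sketch that assumes python3 is on your PATH and Ollama is listening on its default port:

```shell
# Verify Python meets the 3.11 minimum (assumes python3 is on PATH)
python3 -c 'import sys; sys.exit(0 if sys.version_info >= (3, 11) else 1)' \
  && echo "Python version OK" \
  || echo "Python 3.11+ required"

# Verify the Ollama server answers on its default port
curl -s http://localhost:11434/ >/dev/null \
  && echo "Ollama is running" \
  || echo "Ollama not reachable at http://localhost:11434"
```

If either check fails, revisit the corresponding prerequisite before continuing.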

Installing from PyPI Using pipx

If you don’t have pipx installed, you can install it using:

pip install pipx
pipx ensurepath

Once pipx is installed, run the following command:

pipx install parllama

To upgrade an existing installation, use the --force flag:

pipx install parllama --force

Installing from PyPI Using pip

Create a virtual environment and install PAR LLAMA with the following commands:

mkdir parllama
cd parllama
python -m venv venv
source venv/bin/activate   # Mac/Linux; on Windows use venv\Scripts\activate
pip install parllama

Command Line Arguments

PAR LLAMA supports several command line arguments which allow customization:

usage: parllama [-h] [-d DATA_DIR] [-u OLLAMA_URL] [-t THEME_NAME] [-m dark,light]
                [-s local,site,chat,prompts,tools,create,options,logs] ...

Some notable flags include:

  • -h: show help message
  • -v: show version information
  • -d DATA_DIR: Data Directory (defaults to ~/.parllama)
  • -u OLLAMA_URL: URL of your Ollama instance (defaults to http://localhost:11434)
  • -t THEME_NAME: Set the theme name
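These flags can be combined in one invocation. The data directory, remote host URL, and theme name below are placeholders, not defaults:

```shell
# Illustrative only: custom data directory, a remote Ollama host, and a theme
parllama -d ~/llama-data -u http://192.168.1.50:11434 -t mytheme
```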

Running PAR LLAMA

With pipx Installation

To run PAR LLAMA from anywhere, simply use:

parllama

With pip Installation

Activate your virtual environment and execute:

source venv/bin/activate   # Mac/Linux; on Windows use venv\Scripts\activate
parllama

Quick Start Chat Workflow

Here’s a basic workflow to follow:

  1. Start PAR LLAMA.
  2. Click the “Site” tab and use ^R to fetch the latest models.
  3. Filter models by typing “llama3”.
  4. Select the llama3 model and press ^P to pull it to your local machine.
  5. Once downloaded, you can jump to the chat tab and type a message.
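The pull step can also be done outside the TUI with the Ollama CLI; PAR LLAMA will then see the model locally. A small guard avoids a confusing error if the CLI is missing:

```shell
# Pull llama3 from the shell instead of the TUI (requires the ollama CLI)
if command -v ollama >/dev/null 2>&1; then
  ollama pull llama3
else
  echo "ollama CLI not found on PATH"
fi
```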

Custom Prompts

Users can create a library of custom prompts for quick chat initiation. A prompt can include a system prompt and user messages that are sent immediately when the prompt is loaded.

Themes

Themes are JSON files stored in the themes folder and offer options for both dark and light modes. A theme can be selected at startup with the -t THEME_NAME option.

FAQ

  • Do I need Docker? Docker is required only for quantizing models from Hugging Face.
  • Does PAR LLAMA require internet access? By default, no internet access is needed unless updates are enabled.
  • Can PAR LLAMA run on ARM? Yes, it can run wherever Python is compatible.

What’s New

Discover the latest updates in PAR LLAMA:

  • v0.3.7: Fixes potential crashes with multiple models.
  • v0.3.6: Added options for chat input history saving.
  • v0.3.5: First launch welcome feature added.

Troubleshooting

If you encounter issues while using PAR LLAMA, consider the following troubleshooting steps:

  • Ensure that all prerequisites are adequately installed.
  • Verify that your Ollama server is running properly.
  • Check environment variables have the correct values set.
  • If you face connectivity issues, confirm the URL specified matches your Ollama instance.
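For the connectivity check, a healthy Ollama server answers its /api/tags endpoint with a JSON list of local models. The default URL is assumed here; adjust it to match your instance:

```shell
# Probe the Ollama API; -f makes curl fail on HTTP errors
url="http://localhost:11434"
if curl -sf "$url/api/tags" >/dev/null; then
  echo "Ollama reachable at $url"
else
  echo "Ollama not reachable at $url -- is 'ollama serve' running?"
fi
```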

For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.

At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.
