Elia: Your Keyboard-Centric Chat Companion for LLMs

Sep 10, 2021 | Educational

A snappy, keyboard-centric terminal user interface for interacting with large language models.

Chat with Claude 3, ChatGPT, and local models like Llama 3, Phi 3, Mistral, and Gemma.

[Screenshot collage of the Elia interface]

Introduction

Welcome to the future of AI interaction! Elia is an innovative application that allows you to chat with multiple large language models (LLMs) directly through your terminal. Imagine Elia as your fast and friendly librarian, helping you find answers and guiding you through meticulously organized bookshelves filled with knowledge!

This tool is designed to be keyboard-focused, efficient, and above all, fun to use! It stores your conversations in a local SQLite database and grants you access to models like ChatGPT and Claude, along with local installations managed through tools like ollama and LocalAI.

Installation

Ready to dive in? Here’s how to get Elia up and running:

  • Install Elia using pipx:
pipx install elia-chat

Depending on your chosen model, you may need to set up a few environment variables (for instance, OPENAI_API_KEY, ANTHROPIC_API_KEY, GEMINI_API_KEY, etc.).
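
Depending on which providers you plan to use, this might look like the following in your shell startup file (the key values here are placeholders, not real credentials):

```shell
# Placeholder values -- substitute your own keys.
export OPENAI_API_KEY="sk-your-openai-key"   # for ChatGPT / GPT-4o models
export ANTHROPIC_API_KEY="sk-ant-your-key"   # for Claude models
export GEMINI_API_KEY="your-gemini-key"      # for Gemini models
```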

Quickstart

Now that Elia is installed, it’s time to meet your new conversationalist! Launch Elia straight from the command line:

elia

Want to chat inline? Use the option -i or --inline:

elia -i "What is the Zen of Python?"

Prefer full-screen mode? Express that desire with:

elia 'Tell me a cool fact about lizards!'

Need to specify a model? No problem!

elia -m gpt-4o

You can combine options to customize your experience. For example, to use the Gemini 1.5 Flash model in inline mode:

elia -i -m gemini/gemini-1.5-flash-latest "How do I call Rust code from Python?"

Running Local Models

If you’re looking to work with local models, you’ll need to follow a few steps:

  • First, install ollama.
  • Next, pull the model you need, for example:

ollama pull llama3

  • Now, start the local ollama server:

ollama serve

  • Finally, don’t forget to add the model to your configuration file!
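
The final step above — registering the pulled model — can be sketched like this in the configuration file (the name must match the model you pulled):

```toml
[[models]]
name = "ollama/llama3"
```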

Configuration

The configuration file is your custom garden where you can plant and nurture your preferences and models. Here’s how to set it up:

The config file’s location is shown at the bottom of the options window (press ctrl+o).

Here’s an example setup that indicates available options:

default_model = "gpt-4o"
system_prompt = "You are a helpful assistant who talks like a pirate."
theme = "galaxy"
message_code_theme = "dracula"

[[models]]
name = "ollama/llama3"

[[models]]
name = "openai/some-model"
api_base = "http://localhost:8080/v1"
api_key = "api-key-if-required"

Custom Themes

Elevate your chatting experience with custom themes! Add a theme YAML file to the themes directory, which you can locate by pressing ctrl+o and checking the Themes directory line. Here’s a sample theme:

name: example
primary: "#4e78c4"
secondary: "#f39c12"
accent: "#e74c3c"
background: "#0e1726"
surface: "#17202a"
error: "#e74c3c"
success: "#2ecc71"
warning: "#f1c40f"

Changing Keybindings

Currently, Elia does not support changing its keybindings directly, because terminals restrict which key combinations applications can detect. As a workaround, map your preferred key combination at the terminal emulator level so you can send messages seamlessly. For example:

[Screenshot: iTerm key mapping example]

With this configuration in place, pressing Cmd + Enter will send a message, while pressing Enter alone will create a new line.

Importing and Managing Conversations

You can easily export your conversations from ChatGPT as a JSON file and then import them into Elia using:

elia import 'path/to/conversations.json'

If you want to start fresh, wipe the database with:

elia reset

To uninstall Elia, simply run:

pipx uninstall elia-chat

Troubleshooting

If you encounter issues while using Elia, consider the following troubleshooting steps:

  • Ensure your environment variables are set correctly.
  • Check if the server for local models is actively running.
  • Review the configuration file for any syntax errors.
  • Attempt resetting or uninstalling/reinstalling Elia to refresh its setup.
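
One of the checks above — scanning the configuration file for syntax errors — can be partly automated. Here is a small helper, assuming python3 on your PATH is Python 3.11 or newer (whose standard library includes a TOML parser); the path in the usage example is a guess, so check the options window (ctrl+o) for the real location:

```shell
# check_toml: report whether a file parses as valid TOML.
# Assumes python3 is Python 3.11+ (tomllib entered the stdlib in 3.11).
check_toml() {
  python3 -c 'import tomllib, sys; tomllib.load(open(sys.argv[1], "rb"))' "$1" \
    && echo "OK: $1" \
    || echo "Syntax error in $1"
}

# Hypothetical usage -- substitute the path shown in Elia's options window:
# check_toml ~/.config/elia/config.toml
```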

For any persistent problems, don’t hesitate to reach out for support or consult the documentation.

For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.

Conclusion

At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.
