How to Let Two Local LLMs Have a Conversation: A Simple Experiment

Feb 11, 2021 | Educational

Dive into the world of local Large Language Models (LLMs) by conducting a straightforward experiment where two LLMs chat about anything! This article provides step-by-step guidance on installing the necessary tools, using the TwoAI framework, and troubleshooting common issues. Let’s embark on this journey!

Installation of Required Tools

Before we can let the AIs converse, we need to set up the environment. Follow these steps for a smooth installation:

  • First, install Ollama by downloading the executable and following the installation instructions provided on the website.
  • Once installed, ensure that Ollama is running in the background. You can check its status via your system tray.
  • Next, find the model you wish to use by browsing the Ollama library.
  • Install the desired model using the command below, replacing model-name with your chosen model name:

ollama pull model-name
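
For example, the script later in this article uses llama3, so pulling that model looks like this:

ollama pull llama3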

Using the TwoAI Framework

Now that we have our models set up, it’s time to run the conversational experiment. Let’s dive into the code snippet for this.

from twoai import TWOAI, AgentDetails

BASE_MODEL = "llama3"  # make sure this model is installed (ollama pull llama3)

# current_name and other_name are placeholders the framework is expected to substitute per agent.
sys_prompt = (
    "You are a very intelligent AI chatbot and your name is current_name. "
    "You will have a conversation with another AI called other_name. "
    "Keep each message short and concise, and say DONE! ONLY once you have "
    "both agreed that the discussion is over."
)

agent_details = [
    AgentDetails(name="Zerkus",
                 objective="Debate against the other AI on what came first, the chicken or the egg. You think the chicken came first.",
                 model=BASE_MODEL, host="http://localhost:11434"),  # default local Ollama endpoint
    AgentDetails(name="Nina",
                 objective="Debate against the other AI on what came first, the chicken or the egg. You think the egg came first."),
]

# exit_word marks the end of the chat; max_exit_words=2 means both agents must say it before stopping.
twoai = TWOAI(model=BASE_MODEL, agent_details=agent_details,
              system_prompt=sys_prompt, exit_word="DONE!", max_exit_words=2)

In this code:

  • We’re setting up two agents, Zerkus and Nina, each with their own perspective on the classic chicken-or-egg debate.
  • Just like a debate team, these AIs take turns presenting their points. Zerkus believes the chicken came first, while Nina argues for the egg.

Now, to run the conversation, you can either generate one response at a time:

twoai.next_response(show_output=True)

Or start an open-ended conversation loop that runs until both agents agree to stop:

twoai.start_conversation()
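
If you prefer a bounded exchange over the open-ended loop, you can drive the turns yourself. Here is a minimal sketch, assuming next_response returns the newly generated message as a string (check the TwoAI docs for the exact return type):

# Run at most ten turns, stopping early once an agent says the exit word.
for _ in range(10):
    reply = twoai.next_response(show_output=True)
    if reply and "DONE!" in reply:
        break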

Throughout this process, both AIs respond based solely on the data they were trained on; they do not pull information from the internet.

Troubleshooting Ideas

If you encounter any issues, consider the following troubleshooting techniques:

  • Ensure Ollama is actively running in your system tray; the sketch after this list shows one way to verify this programmatically.
  • Double-check that the model was installed correctly by re-running the pull command.
  • Verify your installation of the TwoAI framework — check for any dependency issues or missing requirements.
  • If your AIs aren’t conversing as expected, try adjusting their objectives or system prompts for clarity.
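
If you are unsure whether Ollama is actually reachable, you can query its local HTTP API directly. Below is a minimal sketch using only the Python standard library; /api/tags is Ollama's endpoint for listing locally installed models:

import json
import urllib.request

# Query Ollama's default local endpoint; /api/tags lists the models you have pulled.
with urllib.request.urlopen("http://localhost:11434/api/tags", timeout=5) as resp:
    models = [m["name"] for m in json.load(resp)["models"]]

print("Installed models:", models)  # the model you pulled (e.g. llama3) should appear here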

For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.

Conclusion

At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.

Preview

For a visual reference, please visit the following link for a preview of the experiment: Preview Link.
