How to Get Started with LangGraph Studio (Beta)

Nov 30, 2022 | Educational

Welcome to LangGraph Studio! This desktop app lets you develop large language model (LLM) applications with ease, giving you the power to visualize, interact with, and debug complex agentic applications. While still in beta, LangGraph Studio is free for all users on any LangSmith plan tier. In this post, we will walk you step by step through setting up and using LangGraph Studio efficiently. Let’s dive in!

Step 1: Download LangGraph Studio

The first step is to download the latest version of LangGraph Studio. Currently, it is available only for macOS. To get started:

  • Download the .dmg file of LangGraph Studio here.
  • You can also visit the releases page for more options.

Step 2: Install Dependencies

Before you can use LangGraph Studio, make sure Docker Engine is installed and running, and that you have docker-compose version 2.22.0 or higher.
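If you're unsure whether your setup meets these requirements, a quick terminal check can help. This is a minimal sketch: the version comparison uses `sort -V`, and the `current` value is a placeholder you would replace with the output of `docker compose version --short`:

```shell
# Confirm the Docker daemon is reachable.
docker info > /dev/null 2>&1 && echo "Docker Engine is running" || echo "Docker Engine is NOT running"

# Compare a docker-compose version string against the 2.22.0 minimum.
required="2.22.0"
current="2.24.5"   # placeholder; substitute the output of: docker compose version --short
if [ "$(printf '%s\n%s\n' "$required" "$current" | sort -V | head -n 1)" = "$required" ]; then
  echo "docker-compose version OK"
else
  echo "docker-compose is too old; upgrade to $required or newer"
fi
```

The `sort -V` trick works because version sort places the smaller version first, so if the minimum comes out on top, your installed version is new enough.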

Step 3: Set Up Your Project

Next, you’ll need a project that includes a LangGraph application:

  • Clone the example repository: git clone https://github.com/langchain-ai/langgraph-example.git
  • If you prefer managing dependencies with a pyproject.toml file, clone this repository instead: git clone https://github.com/langchain-ai/langgraph-example-pyproject.git
  • Create a .env file by copying the provided example: cp .env.example .env
  • Fill in the .env file with your OpenAI, Anthropic, and Tavily API keys. Do not set LANGSMITH_API_KEY; LangGraph Studio handles it automatically.
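After copying, your .env should look something like the sketch below. The key values are placeholders to replace with your real keys, and LANGSMITH_API_KEY is deliberately absent, since LangGraph Studio supplies it for you:

```shell
# Write a sample .env with placeholder values; replace each with your real key.
cat > .env <<'EOF'
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
TAVILY_API_KEY=tvly-...
EOF
```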

Step 4: Open a Project in LangGraph Studio

When you launch the LangGraph Studio desktop app, you’ll first be prompted to log in via LangSmith. Then:

  • Select your LangGraph application folder that contains a correctly configured langgraph.json file.

Once authenticated, you’ll be greeted with a visually rendered graph!
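For reference, the langgraph.json in the example repository follows a shape like the sketch below. The "agent" name and "./agent.py:graph" path are illustrative and must match the module and graph variable in your own project:

```shell
# Write a minimal langgraph.json; adjust the graph path to your own module.
cat > langgraph.json <<'EOF'
{
  "dependencies": ["."],
  "graphs": {
    "agent": "./agent.py:graph"
  },
  "env": ".env"
}
EOF
cat langgraph.json
```

The "graphs" mapping tells LangGraph Studio which compiled graph objects to load, and "env" points it at the environment file you created in Step 3.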

Step 5: Run Your Graph

Now it’s time to run the graph:

  • Select a graph from the dropdown menu at the top left.
  • Edit your input parameters and click Submit to see the outputs.

This interactive approach makes iterating and debugging fast: change the input, resubmit, and watch the results flow through your graph. Think of it like navigating a maze with both a map and a live feedback loop!

Troubleshooting

If you encounter any issues while setting up or using LangGraph Studio, consider the following troubleshooting tips:

  • Ensure that your Docker service is running and correctly installed.
  • Make sure your .env file is correctly configured with valid API keys.
  • Double-check that the langgraph.json file is present and correctly set up in your project directory.
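To make these checks quick to repeat, here is a small diagnostic sketch in shell. The `check_file` helper is ours, not part of LangGraph; run it from your project directory:

```shell
# check_file: report whether a required project file exists.
check_file() {
  if [ -f "$1" ]; then
    echo "$1 present"
  else
    echo "$1 missing"
  fi
}

check_file langgraph.json
check_file .env
# Verify the Docker daemon is reachable.
docker info > /dev/null 2>&1 && echo "Docker running" || echo "Docker not running (or not installed)"
```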

For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.

Conclusion

At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.
