Welcome to waggledance.ai!

May 15, 2024 | Data Science


Quick Start

Try the cloud preview ↗
Join the Discord, and help the algorithm by starring this repo.

You can also build and deploy it yourself! However, you must first configure your environment (see Running Locally and Development below).

What is waggledance.ai?

waggledance.ai is an experimental application focused on achieving user-specified goals. It provides a friendly but opinionated user interface for building agent-based systems. The project emphasizes explainability, observability, concurrent generation, and exploration. It is currently in pre-alpha, and its development philosophy favors experimentation over stability, since goal-solving and agent systems are evolving rapidly.

How waggledance.ai Works

Think of waggledance.ai as a well-organized orchestra where each musician (agent) specializes in their instrument (sub-task). The conductor (planner agent) sets the tempo and gives out the tasks, ensuring that musicians play their parts together for a beautiful symphony of solution crafting.

  • Planner Agent: The conductor, it streams an execution graph for sub-tasks.
  • Execution Agents: The musicians, they execute sub-tasks as concurrently as possible.
  • Criticism Agents: The audience, they review the music being played (sub-results) to avoid missing notes (poor results).
  • Human in the Loop: You, the curious listener, can provide feedback and course corrections to the orchestra as the performance unfolds!

Highlighted Features

  • High concurrency execution graph planning.
  • Adversarial agents that review results.
  • Vector database for long-term memory.
  • Explainable results and responsive UI: Graph visualizer, sub-task (agent) results, agent logs, and events.

Tech Stack

The project runs on Node.js with pnpm and Turborepo, uses Prisma over Postgres for persistence, and can be deployed locally with Docker (details in the sections below).

Running Locally and Development

waggledance.ai can be run locally using Docker or manually with Node.js. In either case, configuring your .env variables is required.

Docker

bash
docker-compose up --build

Dependencies

  • Required: Node.js ≥ v18.17.0 (LTS recommended).
  • pnpm is used in the examples below, but npm or yarn may also work.
  • Recommended: Turbo – install it globally with pnpm add turbo --global, or use pnpx turbo in place of turbo (see the install sketch after this list).
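
If you're starting from a clean machine, the following is a minimal sketch of getting these tools in place; the exact pnpm installation method (Corepack, npm, or a standalone installer) is your choice, and the version shown simply reflects the requirement above.

bash
node --version              # should print v18.17.0 or newer
npm install -g pnpm         # one way to install pnpm if it isn't present
pnpm add turbo --global     # optional: global turbo (pnpx turbo also works)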

Configure Your Environment

Copy .env.example to .env and set your environment variables. For help, please reach out on Discord. See env-schema.mjs for explicit requirements.
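
A minimal sketch of that step, assuming a Unix-like shell; the comments are placeholders, and env-schema.mjs remains the authoritative list of required variables.

bash
cp .env.example .env
# edit .env and fill in the values required by env-schema.mjs,
# e.g. your database connection string and any model-provider API keys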

Setting up Postgres

Refer to .env.example and env-schema.mjs for the required environment variables. The project currently only supports Postgres via Prisma. Use a local Postgres instance or a cloud provider like Supabase.
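
If you don't already have a Postgres instance, one quick option is a throwaway Docker container; the container name, password, and connection string below are illustrative assumptions, so match whatever env-schema.mjs actually expects.

bash
docker run --name waggledance-postgres \
  -e POSTGRES_PASSWORD=postgres -p 5432:5432 -d postgres:15
# then point the database connection string in .env at it, e.g.:
# postgresql://postgres:postgres@localhost:5432/postgres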

bash
pnpm db:generate
pnpm db:push

After setting up the Postgres database, run the above commands where:

  • db:generate creates local typings and DB info from the schema.
  • db:push pushes the schema to the database provider.

Run Development

bash
turbo dev # or
pnpm dev

Troubleshooting

  • If you encounter problems, first check for missing environment variables.
  • Run turbo lint to identify linting issues before starting a feature.
  • If builds fail, run turbo lint:fix to apply auto-fixes where possible; a quick check sequence follows below.
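
A quick check sequence along those lines, assuming the lint and lint:fix scripts mentioned above and running them through turbo (or pnpm):

bash
turbo lint        # surface lint issues
turbo lint:fix    # apply automatic fixes where possible
turbo dev         # retry once env vars and lint issues are resolved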

For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.

Contribute and Help

Developers can check CONTRIBUTING.md for guidelines, star the project, or join our Discord community.

Closing Remarks

At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.
