Agile Diffusers Inference (ADI)

Jul 14, 2021 | Data Science





**Agile Diffusers Inference (ADI)** is a **C++ library** with a **CLI tool** designed to harness the acceleration capabilities of ONNXRuntime and the broad compatibility of the .onnx model format. Its primary goal is to make the engineering deployment of Stable Diffusion convenient, with a compact package size and high performance.

Why Choose ONNXRuntime as Our Inference Engine?

  • Open Source: ONNXRuntime is open-source, allowing easy use and modification to suit various application scenarios.
  • Scalability: It supports custom operators and optimizations for tailored performance.
  • High Performance: Optimized for quick inference speeds, ideal for real-time applications.
  • Strong Compatibility: Facilitates model conversion from multiple deep learning frameworks such as PyTorch and TensorFlow.
  • Cross-Platform Support: Runs efficiently on CPU, GPU, and other accelerators through ONNXRuntime execution providers.
  • Community and Enterprise Support: Developed by Microsoft, it offers continuous updates and maintenance through an active community.

How to Install (CLI)?

Method 1: Install the Command Line Tool Using a Package Manager

macOS (Homebrew):

brew tap windsander/adi-stable-diffusion
brew install adi

Windows (git-Bash + Chocolatey):

curl -L -o adi.1.0.1.nupkg https://raw.githubusercontent.com/Windsander/ADI-Stable-Diffusion/deploy/adi.1.0.1.nupkg
choco install adi.1.0.1.nupkg -y

Method 2: Download from the Released Version

You can find the latest available version from the Release Assets. The file tree will appear as follows:

├── bin
│   └── adi
├── lib
│   ├── [platform-specific ADI library, e.g., libadi.a]
│   └── [platform-specific ORT library, e.g., libonnxruntime.dylib]
├── include
│   └── adi.h
├── CHANGELOG.md
├── README.md
└── LICENSE

After unzipping, install the bin and lib directories to your system paths, or simply navigate to the unzipped bin directory to start using ADI.
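The copy step above can be sketched as a small POSIX shell helper. This is a hypothetical sketch, not a script shipped with ADI: `RELEASE_DIR`, `PREFIX`, and `install_adi_release` are invented names, and the layout it checks is simply the file tree shown above.

```shell
#!/bin/sh
# Hypothetical install helper (not shipped with ADI): copies an unzipped
# release tree (bin/, lib/, include/) into an install prefix.
RELEASE_DIR="${RELEASE_DIR:-./adi-release}"   # where the archive was unzipped
PREFIX="${PREFIX:-/usr/local}"                # install destination

install_adi_release() {
    # Refuse to run if the tree does not match the layout shown above.
    [ -d "$RELEASE_DIR/bin" ] || { echo "no bin/ in $RELEASE_DIR" >&2; return 1; }
    mkdir -p "$PREFIX/bin" "$PREFIX/lib" "$PREFIX/include"
    cp -R "$RELEASE_DIR/bin/." "$PREFIX/bin/"
    if [ -d "$RELEASE_DIR/lib" ]; then cp -R "$RELEASE_DIR/lib/." "$PREFIX/lib/"; fi
    if [ -d "$RELEASE_DIR/include" ]; then cp -R "$RELEASE_DIR/include/." "$PREFIX/include/"; fi
}

# Only attempt the install when the release directory actually exists.
if [ -d "$RELEASE_DIR" ]; then
    install_adi_release
fi
```

Installing into `/usr/local` typically requires root; point `PREFIX` at a user-writable location (e.g., `$HOME/.local`) to avoid `sudo`.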

Method 3: Build adi-lib & adi-cli Locally

An automated script to compile ADI on your device is provided. Simply execute the script auto_build.sh:

# If you do not pass the BUILD_TYPE parameter, it defaults to Debug.
# If you do not enable a specific ORTProvider via options, the script selects the default ORTProvider.
./auto_build.sh

For example:

# macOS example:
./auto_build.sh --platform macos --build-type debug

# Windows example:
./auto_build.sh --platform windows --build-type debug

# Linux (Ubuntu) example:
./auto_build.sh --platform linux --build-type debug

# Android example:
./auto_build.sh --platform android --build-type debug --android-ndk /Volumes/AL-Data-W04/WorkingEnv/Android/sdk/ndk/26.1.10909125 --android-ver 27

How to Use?

Example: 1-step Euler_A img2img latent space visualized

The figure below shows what actually happens during the 1-step img2img inference example in latent space:

1-step img2img inference in Latent Space

You can use the command-line tool generated by CMake as follows:

# Optional (if using a local build that is not installed):
cd ./[your_adi_path]/bin
# or, from the project root:
cd ./cmake-build-debug/bin

# Example command:
adi -p "A cat in the water at sunset" -m img2img -i "../../sd/io-test/input-test.png" -o "../../sd/io-test/output.png" -w 512 -h 512 -c 3 --seed 15.0 ...
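For repeated runs it can help to keep the flags in one place. The wrapper below is a sketch under assumptions: `run_adi_img2img` is an invented name, and the flag set simply mirrors the example command above; with `DRY_RUN=1` it prints the command instead of executing it, which is handy when `adi` is not yet installed.

```shell
#!/bin/sh
# Hypothetical helper mirroring the example flags above; invented name.
# With DRY_RUN=1 it prints the command line instead of running it.
run_adi_img2img() {
    prompt="$1"; input="$2"; output="$3"
    set -- adi -p "$prompt" -m img2img -i "$input" -o "$output" \
        -w 512 -h 512 -c 3 --seed 15.0
    if [ "${DRY_RUN:-0}" = "1" ]; then
        printf '%s ' "$@"; printf '\n'
    else
        "$@"
    fi
}
```

Usage: `run_adi_img2img "A cat in the water at sunset" input-test.png output.png`.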

Troubleshooting Ideas

  • Ensure you have all the necessary dependencies installed before running the installation commands.
  • If you encounter errors during the build process, check the logs for missing libraries or incorrect paths.
  • Make sure you are using compatible versions of the C++ compiler and ONNXRuntime.
  • For any issues, refer to the project’s issues on GitHub for help.
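The first troubleshooting step can be partly automated. The helper below is a hypothetical sketch (`missing_tools` is an invented name), and the tool list is an assumption based on a typical CMake/C++ build rather than a documented requirement list.

```shell
#!/bin/sh
# Hypothetical pre-flight check: report which of the given tools are
# missing from PATH before attempting a build.
missing_tools() {
    m=""
    for t in "$@"; do
        command -v "$t" >/dev/null 2>&1 || m="$m $t"
    done
    printf '%s' "$m"
}

# Assumed toolchain for a local build with auto_build.sh.
missing=$(missing_tools cmake git)
if [ -n "$missing" ]; then
    echo "missing tools:$missing" >&2
fi
```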

For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.

At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.
