If you’re venturing into the realm of artificial intelligence and image generation, you’ve likely heard of Stable Diffusion. With the help of NixOS, you can make the most of this technology. In this article, we’ll guide you through the process of getting up and running with nix-stable-diffusion, making it as user-friendly as possible!
What’s Done
- A Nix flake has been created capable of running InvokeAI and stable-diffusion-webui flavors of Stable Diffusion.
- No need for pip or conda, plus AMD ROCM support is included.
- And yes, there are profits to be had!
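Before diving in, you can inspect what the flake provides. Assuming a flakes-enabled Nix and that you are inside the cloned repository, listing the flake outputs is a quick sanity check:
nix flake show .
This should list the invokeai and webui packages described below.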
How to Use It?
Let’s break down the steps to get you started with InvokeAI and stable-diffusion-webui.
InvokeAI
- Clone the repository.
- Run the following command:
nix run .#invokeai.{default,amd} -- --web --root_dir folder_for_configs_and_models
- Wait for the package to build.
- .#invokeai.default builds with the default torch-bin, which has CUDA support.
- .#invokeai.amd overrides the torch packages with ROCM-enabled versions (both variants are spelled out after this list).
- Downloading model weights:
- Built-in CLI Way: Upon first launch, InvokeAI will suggest running a built-in TUI startup configuration script to download default models or supply existing ones.
- Built-in GUI Way: A recent version of InvokeAI added a GUI for model management. Check the upstream documentation for details.
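For reference, the two variants of the nix run command above look like this when spelled out (the folder name is just a placeholder):
nix run .#invokeai.default -- --web --root_dir folder_for_configs_and_models
nix run .#invokeai.amd -- --web --root_dir folder_for_configs_and_models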
stable-diffusion-webui (a.k.a. AUTOMATIC1111 Fork)
- Clone the repository.
- Run the following command:
nix run .#webui.{default,amd} -- --data-dir runtime_folder --ckpt-dir folder_with_pre-downloaded_models
- Wait for the packages to build.
- Remember, webui isn’t a typical python package, so a multi-layered wrapper script is needed to set the required environment and arguments:
- bin/flake-launch sets the default arguments and is the entry point that runs by default.
- bin/launch.py is a thin wrapper that sets PYTHONPATH with the required packages.
- If you encounter any issues with image generation related to paths, check the settings tab inside the web UI.
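As a concrete sketch, an AMD user with some checkpoints already on disk might launch it like this (both directories are placeholders of your choosing):
nix run .#webui.amd -- --data-dir ~/sd-webui-runtime --ckpt-dir ~/sd-models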
Hardware Quirks
For AMD Users
If you see the error:
hipErrorNoBinaryForGpu: Unable to find code object for all current devices!
then it’s likely that your GPU is not fully supported by ROCM. To resolve this, set the environment variable with the following command:
export HSA_OVERRIDE_GFX_VERSION=10.3.0
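You can also apply the override for a single invocation only, for example (the folder name is a placeholder):
HSA_OVERRIDE_GFX_VERSION=10.3.0 nix run .#invokeai.amd -- --web --root_dir folder_for_configs_and_models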
For Nvidia Users
At the moment there is no Nvidia GPU available for testing the CUDA functionality. If CUDA doesn’t work for you, please open an issue or submit a pull request with a proposed fix.
Troubleshooting
Here are some tips for common issues you might encounter:
- If you run into difficulties during the package build, ensure your Nix environment is correctly set up; in particular, flakes must be enabled (see the snippet after this list).
- For improper paths or read-only errors, recheck the output paths in the web UI settings.
- If using older hardware or incompatible GPUs, ensure you have the right environment variables set as mentioned.
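One frequent culprit is that flakes are not enabled at all. On a non-NixOS installation, add the following line to ~/.config/nix/nix.conf (on NixOS, the equivalent is the nix.settings.experimental-features option):
experimental-features = nix-command flakes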
For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.
What Needs to be Done?
Some enhancements to consider include:
- Submissions for missing package definitions to Nixpkgs.
- Ensuring webui uses the same paths and filenames for weights as InvokeAI.
- Creating a PR for pynixify to enable skip-errors mode.
- Increasing reproducibility by replacing runtime-downloaded models with proper flake inputs.
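As a rough illustration of that last point, models could be pinned by hash instead of fetched at runtime; a command like the following (the URL is purely a placeholder) prints the hash a flake input would need:
nix-prefetch-url --name model.ckpt https://example.com/path/to/model.ckpt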
Current Versions
- InvokeAI: 2.3.5.post2
- stable-diffusion-webui: 12.03.2023
Meta
Contributions to this project are always welcome. The aim is to keep pace with the development of InvokeAI and related applications.
Acknowledgements
Special thanks to the following:
- cript0nauta for pynixify, which generated the boilerplate for missing python packages.
- colemickens and skogsbrus for inspiration and useful code snippets.
Similar Projects
Take a look at Nixified-AI, which aims to support a broader range of AI models on NixOS.
At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.

