Opyrator

May 28, 2022 | Data Science

Turns your Python functions into microservices with web API, interactive GUI, and more.


Instantly turn your Python functions into production-ready microservices. Deploy and access your services via HTTP API or interactive UI. Seamlessly export your services into portable, shareable, and executable files or Docker images. Opyrator builds on open standards – OpenAPI, JSON Schema, and Python type hints – and is powered by FastAPI, Streamlit, and Pydantic. It cuts out all the pain of productizing and sharing your Python code – or anything you can wrap into a single Python function.

Alpha Version: Only suggested for experimental usage.

Highlights

  • Turn functions into production-ready services within seconds.
  • Auto-generated HTTP API based on FastAPI.
  • Auto-generated Web UI based on Streamlit.
  • Save and share as self-contained executable file or Docker image.
  • Reuse pre-defined components and combine with existing Opyrators.
  • Instantly deploy and scale for production usage.
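Because the service contract is expressed as Pydantic models, the machine-readable JSON Schema that powers the auto-generated API and UI comes for free. A minimal sketch (the model name is illustrative, using Pydantic's schema export, which FastAPI builds on):

```python
from pydantic import BaseModel

class Input(BaseModel):
    message: str

# Pydantic exports the JSON Schema that feeds the OpenAPI spec
# and drives the auto-generated input forms:
print(Input.schema())
```

This is the same schema you will see under the generated API's OpenAPI documentation.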

Getting Started

Installation

Requirements: Python 3.6+.

pip install opyrator

Usage

A simple Opyrator-compatible function could look like this:

from pydantic import BaseModel

class Input(BaseModel):
    message: str

class Output(BaseModel):
    message: str

def hello_world(input: Input) -> Output:
    """Returns the message of the input data."""
    return Output(message=input.message)

This structure can be compared to a vending machine: the input is the order you place (the message), and the output is the item the machine dispenses (the response). Just as the machine needs a well-defined order to dispense the right item, an Opyrator-compatible function must declare exactly what it accepts and returns: a single parameter and a return value, both type-hinted as Pydantic models.
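Before wiring the function into any server, it can be exercised like ordinary typed Python, which makes it easy to unit-test. A quick sketch reusing the definitions above:

```python
from pydantic import BaseModel

class Input(BaseModel):
    message: str

class Output(BaseModel):
    message: str

def hello_world(input: Input) -> Output:
    """Returns the message of the input data."""
    return Output(message=input.message)

# Call it directly, no server needed:
result = hello_world(Input(message="Hello!"))
print(result.message)  # prints: Hello!
```

Pydantic validates the input on construction, so malformed data fails before your function logic ever runs.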

Steps to run your Opyrator:

  1. Copy the function code to a file (e.g., my_opyrator.py).
  2. Run the UI server from the command line:

     opyrator launch-ui my_opyrator:hello_world

  3. Run the HTTP API server from the command line:

     opyrator launch-api my_opyrator:hello_world

  4. The command-line output will show where your web app and service are being served on your local machine.
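Both servers exchange the Pydantic models as JSON. A small sketch of the request body the generated HTTP endpoint expects (the exact route and port are printed in the launch-api startup log and shown in the interactive API docs):

```python
import json
from pydantic import BaseModel

class Input(BaseModel):
    message: str

# The JSON body sent to the generated endpoint is simply the
# serialized Input model:
payload = Input(message="Hello, API!").json()
print(payload)
assert json.loads(payload)["message"] == "Hello, API!"
```

The response body is, symmetrically, the serialized Output model.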

Examples

Opyrator supports various tasks and use-cases. All examples are bundled into a demo playground which you can also deploy on your own machine via Docker:

docker run -p 8080:8080 mltooling/opyrator-playground:latest

Support & Feedback

This project is maintained by Benjamin Räthlein, Lukas Masuch, and Jan Kalkan. Please understand that we won’t be able to provide individual support via email. We also believe that help is much more valuable if it’s shared publicly so that more people can benefit from it.

For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.

Troubleshooting Ideas

If you encounter any issues during the installation or while running your Opyrator, consider the following troubleshooting methods:

  • Ensure that you are using Python 3.6 or higher.
  • Confirm that all required dependencies are installed in your environment.
  • Check if your function strictly follows Opyrator’s compatibility requirements.
  • Look out for error messages in the command line for guidance on resolving issues.
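For the compatibility check in particular, a quick self-test can catch the most common mistake (missing or non-Pydantic type hints) before launching a server. This is not an official Opyrator tool, just a sketch of the core rules: one typed parameter and a typed return value, both Pydantic models.

```python
from typing import get_type_hints
from pydantic import BaseModel

class Input(BaseModel):
    message: str

class Output(BaseModel):
    message: str

def hello_world(input: Input) -> Output:
    return Output(message=input.message)

# Inspect the declared type hints of the function:
hints = get_type_hints(hello_world)
assert issubclass(hints["input"], BaseModel), "parameter must be a Pydantic model"
assert issubclass(hints["return"], BaseModel), "return type must be a Pydantic model"
print("function looks Opyrator-compatible")
```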

If problems persist, you can refer to the Opyrator GitHub page for more help or create an issue.

At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.
