AI Workflow Platform
Built for Developers

No new concepts to learn: write functions and reference libraries the way you always do to create blocks. Drag and drop them into workflows to build new features efficiently. Publish them as APIs, MCP tools, or automations to turn ideas into products instantly.

Four Core Features

Boosting efficiency across the entire development and deployment pipeline

01|Node as Function: No New Framework to Learn

In OOMOL, a node is a function. Every line of code you write can become a capability component.

• Clear input/output definitions; calling a node from outside works like calling an API
• Reference any open-source library, with zero bottlenecks for feature expansion
• Native VS Code experience
• AI auto-generates nodes, and Vibe Coding Block accelerates development
Familiar approach, unlimited capabilities
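
The "node is a function" idea can be sketched in plain Python. This is an illustrative example, not the actual oocana SDK: the function name and input/output field names are made up, but the shape (a typed dict of inputs in, a typed dict of outputs out) mirrors the convention above.

```python
import typing

# Aliases mirroring the Inputs/Outputs convention; the node itself is ordinary Python.
Inputs = typing.Dict[str, typing.Any]
Outputs = typing.Dict[str, typing.Any]

def word_count_node(params: Inputs) -> Outputs:
    """Illustrative node: count the words in a text input and expose results as outputs."""
    words = params["text"].split()
    return {"count": len(words), "longest": max(words, key=len)}
```

Because a node is a plain function, any library you can import is usable inside it, and other nodes consume its outputs the same way they would an API response.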

02|Local Containerized Development: Production-Ready Deployment

IDE with built-in container environment, ready to use anytime.

• No environment setup required, works out of the box
• Strong isolation that doesn't pollute the host machine
• Local environment identical to cloud environment
Write locally, deploy to the cloud immediately, and stop debugging "environment inconsistency" issues

03|Device as Server: Put Idle Computing Power to Work

Container and reverse proxy technology combine to assign domain names directly to your devices.

• Personal computers and NAS devices become servers instantly
• GPU value is no longer wasted (NVIDIA GPUs supported via WSL2)
• Private deployment + remote access anytime
Turn all your devices into development resources

04|One-Click Publishing: API / MCP / Automation

Both workflows and nodes can be published as callable capabilities:

• Serverless API services
• MCP tools, directly integrated with AI
• Timer- or trigger-based automation processes
Create capabilities that can be used by any system
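
To make "callable capability" concrete, here is a hedged sketch of invoking a workflow published as a Serverless API, using only the Python standard library. The endpoint URL, request shape, and token are placeholders, not OOMOL's actual API contract; substitute the values shown after publishing.

```python
import json
import urllib.request

# Placeholders: use the real URL and token shown after publishing your workflow.
API_URL = "https://example.com/v1/workflows/my-flow/run"
API_TOKEN = "YOUR_OOMOL_API_TOKEN"

def build_request(inputs: dict) -> urllib.request.Request:
    """Build an authenticated JSON POST for a published workflow (hypothetical shape)."""
    return urllib.request.Request(
        API_URL,
        data=json.dumps({"inputs": inputs}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# To actually invoke it:
# response = urllib.request.urlopen(build_request({"text": "hello"}))
```

Any system that can send an HTTP request can consume a capability published this way.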

Publish and Share

Join our vibrant community to share your creations, discover amazing tools, and collaborate with fellow creators.


Choose Your Deployment Mode

From local development to enterprise deployment, OOMOL offers flexible product formats for different scenarios: three modes, one experience.

🖥️ Studio: Software Deployment
Core Capability: Comprehensive & Convenient
Use Cases: Personal project debugging, rapid team sharing, making full use of idle GPUs (e.g., a PC with an NVIDIA RTX 4090)
Technical Features: Installs locally and works out of the box, built-in container environment, one-click connectivity to run as an online server

🐳 Headless: Image Deployment
Core Capability: Freedom & Security for Enterprises
Use Cases: Highly customizable API services, automated workflow deployment, private cloud environments
Technical Features: Runs on Docker and open-source infrastructure; deeply customizable, flexible to integrate, with security under your control

☁️ Cloud: Serverless Deployment
Core Capability: Officially Provided Service, Optimized for Performance
Use Cases: Rapid launch for SMB teams with no infrastructure to manage; simple, with zero mental overhead
Technical Features: On-demand scaling, zero operations overhead, instant production deployment

Fusion APIs & LLM Access

Use a single OOMOL API token to quickly access common APIs and mainstream LLMs, for a simpler, more unified development experience.

from oocana import Context

#region generated meta
import typing
Inputs = typing.Dict[str, typing.Any]
Outputs = typing.Dict[str, typing.Any]
#endregion

async def main(params: Inputs, context: Context) -> Outputs | None:
    # Fusion API base URL
    api_url = "https://fusion-api.oomol.com/v1/fal-nano-banana-edit/submit"
    # LLM base URL; works with the OpenAI SDK
    llm_base_url = "https://llm.oomol.com/v1"
    # Get the OOMOL token from the context (no manual API key input needed)
    api_token = await context.oomol_token()
    return {
        # "output": "output_value"
    }
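
Since the snippet notes the LLM endpoint works with the OpenAI SDK, a chat request can also be built with nothing but the standard library. The sketch below assumes the standard OpenAI-style /chat/completions path; the model name and token are placeholders (inside a node, the token comes from context.oomol_token()).

```python
import json
import urllib.request

LLM_BASE_URL = "https://llm.oomol.com/v1"
API_TOKEN = "YOUR_OOMOL_API_TOKEN"  # placeholder; in a node, use await context.oomol_token()

def chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request (path assumed, not verified)."""
    payload = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        f"{LLM_BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

The same token authorizes both the Fusion APIs and the LLM endpoint, which is what keeps the development experience unified.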

Open Source Project Built on OOMOL: PDF-Craft

Experience the power of local computing with our flagship open-source project built on OOMOL.


PDF-Craft is OOMOL's official open-source PDF-to-eBook conversion tool, with 3000+ GitHub stars. Run it locally for free, share it with friends via OOMOL, or use the official API service; there's always an option that fits you.

• MIT licensed, fully transparent
• Built on DeepSeek OCR, an advanced model
• Maintained by a professional team, with open source and official service developed in parallel
Local Deployment

Local Computing, Free & Private

  • Prepare a PC with an RTX 3060 or better GPU
  • Install OOMOL Studio for free local deployment
  • Use Mini App or Flow to run
Official Service

Pay-as-you-go, Ready to Use

  • Official maintenance, quality guaranteed
  • Available on PDF-Craft website
  • API service for developers

Start Using OOMOL Today

Download OOMOL Studio and turn ideas into running products instantly.