Simple, Transparent Pricing
Build locally for free. Subscribe only when you want OOMOL to host delivery and managed usage.
Local
Build and validate in Studio for free

Use the full local authoring environment, test with your own dependencies, and bring your own model service when needed.

Online
Subscribe when you want to ship and host

Use Pro or Max when you want OOMOL to take on the online delivery layer, exposing the same validated function as an API, MCP tool, or automation task.

Choose How You Use OOMOL
Free covers local development and hosted trials. Pro and Max are for online publishing, hosted runs, and predictable Cloud Task costs.
Free
$0
For local development, validation, and everyday work with your own model services
200 min/month · 20 packages
Apple Silicon Mac only
Full OOMOL Studio local development environment
Bring your own model service when you do not want managed AI billing
200 Cloud Task minutes included each month
Publish up to 20 packages total, public and private combined
No charge for local runs inside Studio
1 sign-up credit to try LLM and Fusion API usage
Max
$79/month
For steady online traffic that needs a larger monthly allowance of hosted run minutes
14,400 min/month · 300 packages
All Pro features, with higher limits
14,400 Cloud Task minutes included each month
$8 in general credits included each month
Publish up to 300 packages total, public and private combined
Better fit for steady high-volume hosted runs
Top up at the standard rate after the included allowance
Usage Pricing
When you use managed AI, Cloud Task, or hosted service calls, usage is billed in credits with the same rates shown in OOMOL Console.
LLM pricing is fetched live from the OOMOL Console. Cloud Task and Fusion API rows reflect the current console table. Pro is designed to give you included usage plus better rates as you grow.
Channel | Model | Input Price (Credits/M Token) | Cache Price (Credits/M Token) | Output Price (Credits/M Token)
No LLM pricing data is available right now.

Start with your next function and let AI use it

Download OOMOL Studio, write and validate your function locally, and publish it to OOMOL Cloud. Then sign in with the OOMOL CLI to use it from Codex or Claude Code, or keep delivering it as an API, MCP tool, or automation task.
Apple Silicon Mac only