Ai2 Brings Open AI Models to Enterprises With Cirrascale Tie-Up
OLMo 2, Molmo, and Tülu 3 are now available via API, marking Ai2’s first step toward commercializing its open-source models
Non-profit AI lab Ai2 is partnering with infrastructure platform provider Cirrascale in a move to commercialize its open-source models. OLMo 2, Molmo, and Tülu 3 can now be run via API on Cirrascale’s Inference Platform, sparing companies the need to deploy the models themselves or bear the cost of in-house infrastructure and expertise.
“Since launching our family of truly open models last year, the AI community has been asking for API access. Today, in partnership with Cirrascale, we’re excited to deliver just that—an API to enable scalable, flexible, and cost-efficient integration,” Sophie Lebrecht, Ai2’s chief operating officer, remarks in a statement. “Our fully open models have already changed the way the AI community thinks about language models, and API access makes it that much easier for them to get into the hands of builders, developers, and researchers everywhere.”
Similar to AWS SageMaker, Microsoft Azure Machine Learning, Google Vertex AI, and CoreWeave, Cirrascale’s Inference Platform provides cloud-based infrastructure for training and running AI models, along with Inference-as-a-Service and other managed services. These capabilities can help companies leverage AI models more effectively without investing in servers, GPUs, and related hardware, or hiring machine learning specialists.
Until now, Cirrascale’s Inference Platform supported pre-compiled foundation models like Meta’s Llama 3.1 Instruct and DeepSeek R1. With this launch, developers can access Ai2’s state-of-the-art OLMo, an open language model providing full transparency, open weights, training data, and Apache 2.0-licensed code. Also available are Molmo, Ai2’s new multimodal model, and the latest version of Tülu, its high-performing instruction-following model family.
Ai2’s models were readily available for developers to use long before the Cirrascale partnership. But getting OLMo, Molmo, and Tülu to work required more time and resources. By accessing them through Cirrascale’s platform, developers can streamline the process and focus on building their AI-powered apps.
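In practice, calling a hosted model through an inference API typically looks something like the sketch below. To be clear, this is illustrative only: the endpoint URL, model identifier, and request schema are assumptions modeled on common OpenAI-style chat-completion interfaces, not Cirrascale’s documented API.

```python
import json
import urllib.request

# Hypothetical sketch: the URL, model ID, and payload shape are assumptions
# patterned on common OpenAI-style chat-completion APIs, since Cirrascale's
# actual API schema is not documented here.
API_URL = "https://inference.example-cirrascale.com/v1/chat/completions"  # placeholder

def build_request(model: str, prompt: str, max_tokens: int = 256) -> dict:
    """Assemble a chat-style request body for a hosted open model."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

def send(payload: dict, api_key: str) -> dict:
    """POST the payload to the (assumed) inference endpoint."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Build a request for a hosted OLMo 2 model (model name is illustrative).
payload = build_request("olmo-2", "Summarize what makes OLMo 2 fully open.")
```

The point is the division of labor: the developer writes a few lines like these instead of provisioning GPUs, downloading weights, and standing up a serving stack.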
“Our new Inference Platform is designed for two core audiences: developers building differentiated models and needing an endpoint offering in order to commercialize quickly, and enterprise customers with customized or fine-tuned models looking to deploy them at scale,” Dave Driggers, Cirrascale’s chief executive, states.
Importantly, Cirrascale customers will access Ai2’s models via an API, but Cirrascale is not independently hosting OLMo, Molmo, or Tülu. Ai2 remains directly involved in how its models are offered on the Inference Platform. Today’s announcement is an endorsement of Cirrascale as the exclusive infrastructure partner, providing managed, enterprise-ready access—at least for now.
Some may see this move as unconventional for a nonprofit lab, but it gives Ai2 another opportunity to raise the visibility of its models. Amid growing competition from both open- and closed-model providers, the AI house that Paul Allen built must broaden its reach to make the case that its technology is the way forward.