OpenRouter, a startup based in New York, has raised $40 million in combined seed and Series A funding to support its mission of simplifying access to AI models. The funding round was led by Andreessen Horowitz and Menlo Ventures, with additional backing from Sequoia Capital and several prominent angel investors from OpenAI, DeepMind, and Hugging Face. This investment values the company at around $500 million, a testament to the growing centrality of its platform in the expanding AI infrastructure market.
At its core, OpenRouter provides a single API that enables developers to access over 400 large language and vision models from leading AI companies, including OpenAI, Anthropic, Meta, and Mistral. Instead of having to work with separate APIs and payment systems for each provider, developers can route their requests through OpenRouter. The platform connects to major cloud providers like Amazon Web Services and Microsoft Azure, ensuring flexibility and speed. OpenRouter charges a modest 5 percent fee on usage, and by May 2025, it was processing over $8 million in monthly developer spend, ten times what it handled just seven months earlier.
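To make the "single API" idea concrete, here is a minimal sketch of what calling OpenRouter looks like from a developer's side. It assumes OpenRouter's OpenAI-compatible chat-completions endpoint; the model ids and API key shown are illustrative placeholders, not values from this article. The point is that switching providers is just a change of the model string, with the endpoint, headers, and payload shape staying the same.

```python
import json
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_chat_request(model: str, prompt: str, api_key: str) -> urllib.request.Request:
    """Build a chat-completion request for OpenRouter's OpenAI-compatible endpoint.

    The same request shape works for every provider behind the router;
    only the `model` identifier changes.
    """
    payload = {
        # Illustrative model ids, e.g. "openai/gpt-4o" or "anthropic/claude-3.5-sonnet"
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        OPENROUTER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",  # placeholder key, not a real secret
            "Content-Type": "application/json",
        },
        method="POST",
    )

# The same code path serves two different providers -- only the model id differs.
req_a = build_chat_request("openai/gpt-4o", "Hello", "sk-or-placeholder")
req_b = build_chat_request("anthropic/claude-3.5-sonnet", "Hello", "sk-or-placeholder")
```

Because the interface is uniform, a developer who wants to compare Meta's and Mistral's models against OpenAI's can do so without touching billing, authentication, or request code for each vendor separately.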
What makes OpenRouter particularly attractive is how it helps developers and companies manage the rising complexity and cost of working with multiple AI models. The platform handles everything from billing and model switching to performance tracking and automatic failover. It supports over 60 cloud hosting services and offers edge-optimised infrastructure with average latency below 25 milliseconds. OpenRouter claims to process trillions of tokens each week, evidence that it can handle production-scale workloads efficiently.
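The failover idea can be sketched in a few lines. This is a generic client-side pattern, not OpenRouter's internal implementation: given an ordered list of model ids and a caller function (whatever client actually sends the request), try each model in turn and fall back to the next when one fails. The function names here are hypothetical.

```python
def complete_with_failover(models, call_model, prompt):
    """Try each model id in order, falling back to the next on failure.

    `call_model(model, prompt)` is whatever function actually sends the
    request; any exception it raises (provider outage, rate limit, etc.)
    triggers a fallback to the next model in the list.
    """
    last_error = None
    for model in models:
        try:
            return model, call_model(model, prompt)
        except Exception as err:
            last_error = err  # remember why this model failed, then move on
    raise RuntimeError(f"all models failed: {models}") from last_error

# Usage with a stand-in caller that simulates the first provider being down.
def flaky_caller(model, prompt):
    if model == "openai/gpt-4o":
        raise ConnectionError("provider unavailable")
    return f"response from {model}"

used, reply = complete_with_failover(
    ["openai/gpt-4o", "anthropic/claude-3.5-sonnet"], flaky_caller, "Hello"
)
```

A routing layer that does this on the server side is what spares each developer from writing and tuning this loop per provider.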
With the new funding, the company plans to expand its product offering. This includes introducing enterprise-grade controls, offering a wider variety of models, and building deeper integrations with popular platforms like VS Code, Zapier, Cloudflare, and Make. It is also working with AI labs by providing them with real-world usage data. Notably, OpenRouter was used during the quiet rollout of OpenAI’s GPT-4.1, suggesting even the biggest players in AI are finding value in its routing system.
The need for this kind of tool is growing. As companies rely more on AI, managing the cost of running these models has become a serious concern, so much so that some teams now consider inference costs a bigger line item than salaries. OpenRouter’s centralised model access and usage optimisation tools position it as a valuable partner for any business trying to scale its use of AI without losing control over spending or performance.
In a fast-moving AI landscape filled with competing models and providers, OpenRouter is carving out a role as the bridge that connects them all. Its mission is simple but powerful: make it easy for developers and organisations to experiment, deploy, and manage the best AI models available, without being locked into a single ecosystem. As more businesses build AI-powered applications, OpenRouter’s platform could become one of the key pieces of infrastructure that enables this shift.