Today, we’re excited to announce that we have raised $80M in Series A funding to accelerate our mission: to change the way the world works through AI. We launched the company a little over a year ago and have since built a world-class team that is both enterprise-native and AI-native, deployed our enterprise AI platform in production with leading Fortune 500 companies, and published state-of-the-art research, including our work on RAG 2.0.

The Contextual AI Platform: End-to-End, Optimized for Enterprise

One area where AI will profoundly reshape the world is in the workplace. Enterprise AI applications will automate complex knowledge-intensive tasks, freeing up knowledge workers to focus on the highest-value activities.

Contextual AI’s turnkey platform lets enterprises build state-of-the-art RAG 2.0 applications in minutes, outperforming “Frankenstein’s RAG” out of the box, while being fully customizable using end-to-end machine learning (ML). Enterprises can deploy the platform on our secure SaaS infrastructure or on-prem within their own environment. With on-prem deployment, there’s no need to send your data to external APIs — we bring the model to your data, ensuring your security boundaries are maintained.

While developing RAG prototypes is relatively easy today, building production-grade AI systems at scale remains a massive challenge for enterprises. This is especially true in high-stakes settings that demand high accuracy and leave little tolerance for hallucinations. Prompt engineering is no match for the power of end-to-end ML: it is too inflexible, too brittle, and too cumbersome.
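
To illustrate why prototyping is deceptively easy, here is a minimal, generic sketch of the kind of naive RAG pipeline described above: a keyword retriever bolted onto a hand-written prompt template in front of a frozen model. The documents, function names, and the commented-out model call are illustrative placeholders, not part of the Contextual AI Platform.

    # A toy "Frankenstein's RAG" prototype: a keyword retriever stitched onto a
    # prompt template in front of a frozen, off-the-shelf model. Easy to assemble,
    # but accuracy hinges entirely on brittle heuristics and prompt wording.
    from collections import Counter

    documents = [
        "The X100 modem supports carrier aggregation across bands 2, 4, and 12.",
        "Firmware 3.2 deprecates the legacy power-save mode on the X100.",
    ]

    def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
        """Rank documents by naive keyword overlap with the query."""
        q_terms = Counter(query.lower().split())
        scored = sorted(docs,
                        key=lambda d: sum(q_terms[w] for w in d.lower().split()),
                        reverse=True)
        return scored[:k]

    def build_prompt(query: str, context: list[str]) -> str:
        """Stuff the retrieved passages into a hand-written prompt template."""
        return ("Answer using only the context below.\n\nContext:\n"
                + "\n".join(context)
                + f"\n\nQuestion: {query}\nAnswer:")

    query = "Which bands does the X100 support for carrier aggregation?"
    prompt = build_prompt(query, retrieve(query, documents))
    # answer = some_frozen_llm.generate(prompt)  # placeholder model call
    print(prompt)

Every component here is frozen and tuned in isolation; improving accuracy means endlessly tweaking the prompt or the retrieval heuristic rather than optimizing the system as a whole.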

The Contextual Language Model Approach

Our CEO Douwe Kiela spent many years pushing the boundaries of foundation models in both industry (Microsoft Research, Meta/Facebook AI Research, Hugging Face) and academia (Cambridge, Stanford). Among many other contributions, he is best known as one of the pioneers of RAG. Together with CTO Amanpreet Singh, a close collaborator at both Meta and Hugging Face, he co-founded Contextual AI to bring production-grade AI to the enterprise. The company’s approach centers on two core tenets:

  • Systems over models: Generative language models are only a small piece of a much bigger system. The key to delivering effective enterprise solutions lies in contextualizing the language model within this broader setting. An end-to-end optimized system easily outperforms a poorly designed one, even when the latter is built around an excellent language model.
  • Specialization over AGI: One might argue that current AGI pursuits are fundamentally about consumer products – AGI is designed for generalist capabilities, where you do not know what the user wants. In contrast, enterprises know exactly what they want from a system in a given use case. In such settings, specialization will always be more impactful than one-size-fits-all generalist solutions.

Using this approach, we have created Contextual Language Models (CLMs), trained using RAG 2.0. CLMs power the Contextual AI Platform, delivering applications that are more accurate and have better grounding than is possible with naive RAG implementations.
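
To make “end-to-end” concrete, the sketch below shows the general idea of jointly optimizing a retriever and a generator by marginalizing the generation loss over retrieved passages, so that gradients flow into both components. It is a minimal, generic illustration with toy tensors standing in for real encoders and a real language model; it is not Contextual AI’s implementation of RAG 2.0.

    # Generic sketch of end-to-end retriever + generator training: the generation
    # loss is marginalized over retrieved passages, so gradients reach the
    # retriever's scores as well as the generator. All tensors are toy stand-ins.
    import torch
    import torch.nn.functional as F

    torch.manual_seed(0)
    dim, n_docs, vocab = 64, 8, 100

    query_encoder = torch.nn.Linear(dim, dim)                       # stand-in retriever query encoder
    doc_embeddings = torch.nn.Parameter(torch.randn(n_docs, dim))   # stand-in document index
    generator = torch.nn.Linear(2 * dim, vocab)                     # stand-in generator head

    params = list(query_encoder.parameters()) + [doc_embeddings] + list(generator.parameters())
    optimizer = torch.optim.Adam(params, lr=1e-3)

    query = torch.randn(1, dim)    # toy query features
    target_token = 42              # toy target token for the grounded answer

    # Retrieval scores become a differentiable distribution over passages.
    scores = query_encoder(query) @ doc_embeddings.T                # (1, n_docs)
    log_p_docs = F.log_softmax(scores, dim=-1).squeeze(0)           # log p(doc | query)

    # Per-passage log-likelihood of the target under the generator.
    gen_in = torch.cat([query.expand(n_docs, -1), doc_embeddings], dim=-1)   # (n_docs, 2*dim)
    log_p_tok = F.log_softmax(generator(gen_in), dim=-1)[:, target_token]    # (n_docs,)

    # Marginalize over passages; one loss trains retriever and generator together.
    loss = -torch.logsumexp(log_p_docs + log_p_tok, dim=0)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

In contrast to the frozen prototype above, every component in this loop is trained against the same objective, which is what lets the whole system, not just the language model, be specialized for a given use case.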

Unlocking RAG 2.0 for Enterprises in Production

Our platform supports a wide range of use cases, from technical customer support to investment research and information discovery. We focus on enhancing high-value knowledge workers in specialized tasks that deliver high ROI.

Qualcomm Uses CLMs for its Customer Engineering Team

Recently, Qualcomm signed a multi-year contract with us to deploy a CLM-powered customer engineering application. Their initial prototypes plateaued at low accuracy, insufficient for production use. They needed a specialized solution to retrieve complex technical documentation and assist their engineers in resolving customer issues more efficiently.

By building on RAG 2.0 with Contextual AI’s platform, Qualcomm substantially improved accuracy and achieved the results needed to move forward with deployment. In production, the CLM-powered application will be used by customer engineers globally, retrieving information over tens of thousands of highly technical documents across business units and product lines.

“With their RAG 2.0 technology, deep expertise and proven results to date, Contextual AI gives me confidence that we can leverage generative AI to support our team, help our customers design and develop products efficiently, and set new standards for performance and quality,” said Yogi Chiniga, VP of Customer Engineering at Qualcomm.

HSBC is Partnering with Contextual AI for AI-Assisted Knowledge Management

HSBC plans to leverage Contextual AI to deliver research insights and process guidance by retrieving and synthesizing relevant market outlooks, financial news, and operational documents. In Contextual AI, HSBC found a generative AI partner that offers both cutting-edge technology and the extensive enterprise-grade security and compliance capabilities that are critical for the safe adoption of AI in financial services.

Ian Glasner, Group Head of Emerging Technology, Innovation, and Ventures at HSBC, said, “Contextual AI’s language models, fine tuning, and feedback learning capabilities combine to deliver a product that will create value in many industries. In financial services, it will enable a deeper understanding of customer behavior and automate complex digital workflows. By building reliable, enterprise grade AI systems, Contextual AI helps enterprises like HSBC drive better customer outcomes.”

Scaling Operations and Welcoming New Partners

Our Series A funding will help us scale operations, accelerate our go-to-market, and meet the surging demand for AI in enterprises worldwide. The round was led by existing investor Greycroft, together with our other seed investors Bain Capital Ventures (BCV), Lightspeed, Lip-Bu Tan, Conviction/Sarah Guo, and Recall Capital. We’re excited to welcome new investors Bezos Expeditions, NVentures (Nvidia), HSBC Ventures, and Snowflake Ventures. We are also proud to add Marcie Vu to our board. Marcie was instrumental in taking both Google and Facebook public and brings a wealth of knowledge about building best-in-class, generational technology companies.

“We are thrilled to deepen our partnership with Contextual AI and founders Douwe and Amanpreet to seize a generational market opportunity,” said Marcie Vu, Greycroft Partner. “When Greycroft invested in their seed financing, we foresaw the potential of RAG technology to revolutionize AI in the enterprise. CEO Douwe Kiela, who pioneered RAG at Meta in 2020, and his team have since garnered recognition from customers and top talent for their unparalleled expertise.”

Ready to Change the Way You Work?

Join our team

Our team is growing rapidly. We are customer-obsessed, operate with excellence and agility, and focus on depth and mastery in our space. We spend most of our time building together in-office in Mountain View and have a presence in New York and London. We are committed to changing the way the world works, striving to be the best example of the future of work. Interested? We are hiring for many different roles — check out our careers page.

Start building with Contextual AI

Discover how Contextual AI can transform your enterprise with generative AI and state-of-the-art RAG 2.0 applications. Contact us at info@contextual.ai.