Mistral AI’s Codestral launches as a service, first on Vertex AI

Today, we’re announcing that Google Cloud is the first hyperscaler to introduce Codestral – Mistral AI’s first open-weight generative AI model explicitly designed for code generation tasks – as a fully-managed service. Codestral helps developers write and interact with code through a shared instruction and completion API endpoint. You can get started with it today in Vertex AI Model Garden.
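To give a sense of what using that endpoint might look like, here is a minimal Python sketch that calls Codestral through Vertex AI's rawPredict interface using application default credentials. The project ID, region, and model ID are placeholders, and the exact request and response shapes should be confirmed against the Model Garden documentation before relying on this.

```python
# A minimal, illustrative sketch of calling Codestral through the Vertex AI MaaS
# endpoint. PROJECT_ID, REGION, and MODEL are placeholders; confirm the exact model
# identifier and request schema in Vertex AI Model Garden.
import google.auth
import google.auth.transport.requests
import requests

PROJECT_ID = "your-project-id"   # placeholder: your Google Cloud project
REGION = "us-central1"           # placeholder: a region where the model is offered
MODEL = "codestral"              # placeholder: check Model Garden for the exact ID

def get_token() -> str:
    """Return an OAuth access token from application default credentials."""
    creds, _ = google.auth.default(
        scopes=["https://www.googleapis.com/auth/cloud-platform"]
    )
    creds.refresh(google.auth.transport.requests.Request())
    return creds.token

def chat(messages: list[dict]) -> str:
    """Send a chat-style request to the model and return the reply text."""
    url = (
        f"https://{REGION}-aiplatform.googleapis.com/v1/projects/{PROJECT_ID}"
        f"/locations/{REGION}/publishers/mistralai/models/{MODEL}:rawPredict"
    )
    payload = {"model": MODEL, "messages": messages, "temperature": 0.2}
    resp = requests.post(
        url, headers={"Authorization": f"Bearer {get_token()}"}, json=payload
    )
    resp.raise_for_status()
    # Assumes a Mistral chat-completions-style response body.
    return resp.json()["choices"][0]["message"]["content"]

print(chat([{"role": "user",
             "content": "Write a Python function that reverses a linked list."}]))
```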

Additionally, we’re thrilled to announce the addition of Mistral AI’s latest large language models (LLMs) to Vertex AI Model Garden, generally available today via our Model-as-a-Service (MaaS) endpoints:

  • Mistral Large 2: Mistral AI’s flagship model, offering the best performance and versatility of any of the company’s models to date.

  • Mistral Nemo: This 12B model delivers exceptional performance at a fraction of the cost. 

The new models excel in coding, mathematics, and multilingual capabilities (including English, French, German, Italian, and Spanish), making them ideal for a range of downstream tasks, from content localization to software development. Notably, Codestral is optimized for tasks such as code completion, documentation, and test generation. You can access the new models in just a few clicks using Model-as-a-Service, without any setup or infrastructure hassles.
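To make the test-generation use case concrete, the short sketch below reuses the chat() helper from the Codestral example above to ask for unit tests for a small function. The prompt wording is illustrative rather than a documented recipe.

```python
# Illustrative only: prompt Codestral (via the chat() helper defined earlier) to
# generate pytest tests for a small function. The prompt phrasing is an example,
# not a prescribed format.
source = '''
def slugify(title: str) -> str:
    """Lowercase a title and join its words with hyphens."""
    return "-".join(title.lower().split())
'''

prompt = (
    "Write pytest unit tests for the following function. Cover normal titles, "
    "empty strings, and extra whitespace:\n\n" + source
)
print(chat([{"role": "user", "content": prompt}]))
```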

These additions continue Google Cloud’s commitment to open and flexible AI ecosystems that help you build the solutions best suited to your needs. Our collaboration with Mistral AI is a testament to that open approach, delivered within a unified, enterprise-ready environment. Vertex AI provides a curated collection of first-party, open-source, and third-party models, many of which — including the new Mistral AI models — can be delivered as a fully-managed Model-as-a-Service (MaaS) offering. With MaaS, you can choose the foundation model that fits your requirements, access it via a simple API, and tailor it with robust development tools — all with the simplicity of a single bill and enterprise-grade security on our fully-managed infrastructure.

“We are excited to announce the expansion of our partnership with Google Cloud, which marks an important milestone in our mission to put AI in everyone’s hands. As the first hyperscaler to support our new Codestral model, Google Cloud will enable developers worldwide to leverage the power of Mistral AI’s proprietary models on Vertex AI. Together, we are democratizing access to state-of-the-art AI technology, empowering developers to build differentiated gen AI applications with ease. With this collaboration, we are committed to driving together meaningful innovation in AI and delivering unparalleled value to our customers and partners.”
—Arthur Mensch, Co-Founder and CEO, Mistral AI

Trying and adopting Mistral AI models using Google Cloud

Google Cloud’s Vertex AI is a comprehensive AI platform for experimenting with, customizing, and deploying foundation models. Mistral AI’s new models join over 150 models already available on Vertex AI Model Garden, further expanding your flexibility to choose the best models for your needs and budget, and to keep up with the rapid pace of innovation.

By accessing Mistral AI models on Vertex AI, you can:

  • Experiment with confidence: Explore Mistral AI models through simple API calls and comprehensive side-by-side evaluations within our intuitive environment (a sketch of this kind of comparison follows this list). We handle the deployment and infrastructure complexities for you.

  • Tune the models to your advantage: Fine-tune Mistral AI’s foundation models to create bespoke solutions with your unique data and domain knowledge. The ability to fine-tune Mistral AI models using MaaS will be available soon.

  • Craft intelligent agents: Create and orchestrate agents powered by Mistral AI models, using Vertex AI’s comprehensive set of tools, including LangChain on Vertex AI. Integrate Mistral AI models into your production-ready AI experiences with Genkit’s Vertex AI plugin.

  • Move from experiments to practical application: Deploy your Mistral AI models at scale, using pay-as-you-go pricing and without having to manage any infrastructure. You can also maintain consistent performance and capacity with Provisioned Throughput, available in the coming weeks. And of course, leverage world-class infrastructure, purpose-built for AI workloads.

  • Deploy with confidence: Benefit from Google Cloud’s robust security, privacy, and compliance measures, safeguarding your data and models at every step.
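As a concrete example of the side-by-side experimentation mentioned in the first bullet above, the sketch below sends one multilingual prompt to two of the new models and prints both replies. It reuses PROJECT_ID, REGION, and get_token() from the earlier Codestral sketch, and the model IDs shown are assumptions; check Model Garden for the identifiers it actually lists.

```python
# Illustrative side-by-side comparison of two Mistral MaaS models on Vertex AI.
# Reuses PROJECT_ID, REGION, and get_token() from the earlier sketch; the model
# IDs below are placeholders to be verified in Model Garden.
import requests

def ask(model: str, prompt: str) -> str:
    """Send a single-turn prompt to the given model and return the reply text."""
    url = (
        f"https://{REGION}-aiplatform.googleapis.com/v1/projects/{PROJECT_ID}"
        f"/locations/{REGION}/publishers/mistralai/models/{model}:rawPredict"
    )
    payload = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    resp = requests.post(
        url, headers={"Authorization": f"Bearer {get_token()}"}, json=payload
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

prompt = (
    "Translate 'All deliveries arrive within three business days.' "
    "into French, German, Italian, and Spanish."
)
for model in ("mistral-large", "mistral-nemo"):  # placeholder model IDs
    print(f"--- {model} ---")
    print(ask(model, prompt))
```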

“We had the opportunity to preview Mistral AI models in Vertex AI Model Garden. For our banking app Sumeria, they stood out from the competition in multiple customer service use cases such as conversation categorization and customer sentiment analysis. Mistral AI is a competitive and high-performing French alternative to other models on the market. Its integration to Vertex AI is a great advantage for rapid adoption and productization.”
—William Brulin, Senior Vice President, Sumeria / Lydia Solutions

“The BlaBlaCar team previewed Mistral AI models in Google Cloud. The Vertex AI Model Garden integration makes experimentation super efficient. The results met our expectations across twenty languages spoken by our members, and convinced us to work with Mistral AI in the future.”
—Emmanuel Martin-Chave, VP Data, BlaBlaCar

Get started with Mistral AI models on Google Cloud

We’re committed to providing developers with easy access to the most advanced AI capabilities. Our partnership with Mistral AI is a testament to that commitment from both organizations to provide you with world-class AI innovation, supported by an open and accessible AI ecosystem. We’ll continue to work closely with Mistral AI and other partners to keep our customers at the forefront of AI capabilities.

You can access the Mistral AI models in Vertex AI Model Garden later today. To learn more about them, check out Mistral AI’s announcement.