Vertex AI Search adds new generative AI capabilities and enterprise-ready features

Our vision for Vertex AI Search, unveiled earlier this year and made generally available in August, is to leverage our deep experience in information retrieval and generative AI to help enterprises enable their customers, employees, and other stakeholders to discover critical information at speed, uncover hidden insights across data, and improve productivity.

The easy setup and out-of-the-box capabilities of Vertex AI Search reduce the time it takes to build search applications from weeks and months to mere days or hours. Moreover, with the ability to add custom embeddings and leverage large language models (LLMs), Vertex AI Search provides customers with a tunable Retrieval Augmented Generation (RAG) system for information discovery. Since its launch, we have seen customers use Vertex AI Search with Vertex AI Conversation for a wide range of applications that combine generative AI and semantic search, from intranet and website search to digital assistants.

For example, Forbes recently announced the beta launch of Adelaide, a purpose-built news search tool, created using Vertex AI Search and Conversation, that offers visitors AI-driven personalized recommendations and insights from Forbes’ trusted journalism. Adelaide’s search- and conversation-based approach makes content discovery easier and more intuitive for Forbes’ global audience, combing through Forbes’ trusted content archive from the past twelve months and continuously learning and adapting to individual reader preferences.

“As we look to the future, we are enabling our audiences to better understand how AI can be a tool for good and enhance their lives,” said Vadim Supitskiy, Chief Digital and Information Officer, Forbes. “Adelaide is poised to revolutionize how Forbes audiences engage with news and media content, offering a more personalized and insightful experience from start to finish.”

GE Appliances uses Vertex AI Search to power their SmartHQ Assistant experience for their appliances. “At GE Appliances, we constantly strive to deliver the products and experiences that resonate with our consumers. Earlier this year, we rolled out the SmartHQ Assistant experience which enables our consumers to seamlessly interact with their appliances. By leveraging Vertex AI Search, we were able to deliver this uniquely personalized experience with unprecedented speed and accuracy,” said Adam Jones, Senior Director, Cloud Products and IoT Services at GE Appliances.

Today, we’re thrilled to build on this momentum by announcing additional customization and expanded grounding and compliance capabilities for customers to develop even more powerful and secure search, chat, and personalized recommendation applications.

Tailored search to fit business needs

Our new generative AI features address the needs of organizations, especially large enterprises, that want to more deeply customize AI-driven search:

Customizable answers: With customizable answers, now in preview, Vertex AI Search lets developers design the prompt used to generate summarization/answers, giving them control over the style, tone, length, and format of information presented. In addition, for some use cases, developers might also expose part of the prompt to end users, generally as drop-down choices such as “Short,” “Verbose,” “Casual,” or “Formal” answer styles.

For example, if a user asks “What is Vertex AI Model Garden?” a standard response might look like this:

Vertex AI Model Garden is a managed service for discovering, managing, and deploying machine learning models. It provides a unified interface to pre-trained models from Google and third-party partners, as well as tools for customizing, tuning, and deploying models to production.

A response that’s been prompted to be “simple,” in contrast, might output something like this:

Vertex AI Model Garden is a place to find and use machine learning models for tasks like image classification, natural language processing, and translation.

Vertex AI Search lets developers design the prompt used to generate summarization/answers, giving them control over the style, tone, length, and format of information presented.
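
For developers setting this up in code rather than the console, the answer prompt is configured on the search request itself. Below is a minimal sketch using the Python client for the Discovery Engine API that backs Vertex AI Search; the project, location, and data store IDs are placeholders, and the prompt preamble is a preview capability whose exact field name may vary by client library version.

```python
from google.cloud import discoveryengine_v1beta as discoveryengine

# Placeholder identifiers; replace with your own project, location, and data store.
PROJECT_ID = "my-project"
LOCATION = "global"
DATA_STORE_ID = "my-data-store"

client = discoveryengine.SearchServiceClient()
serving_config = client.serving_config_path(
    project=PROJECT_ID,
    location=LOCATION,
    data_store=DATA_STORE_ID,
    serving_config="default_config",
)

# SummarySpec controls the generated answer; the (preview) prompt preamble
# steers style, tone, length, and format (here, a short plain-language answer).
content_search_spec = discoveryengine.SearchRequest.ContentSearchSpec(
    summary_spec=discoveryengine.SearchRequest.ContentSearchSpec.SummarySpec(
        summary_result_count=5,
        include_citations=True,
        model_prompt_spec=discoveryengine.SearchRequest.ContentSearchSpec.SummarySpec.ModelPromptSpec(
            preamble="Answer in one or two short, plain-language sentences."
        ),
    )
)

request = discoveryengine.SearchRequest(
    serving_config=serving_config,
    query="What is Vertex AI Model Garden?",
    page_size=5,
    content_search_spec=content_search_spec,
)

response = client.search(request)
print(response.summary.summary_text)
```

The same request with a different preamble, or with the preamble surfaced to end users as a drop-down choice, is what produces the “standard” versus “simple” answers shown above.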

Search tuning: Beyond letting developers build apps that search through an organization’s documents and data, Vertex AI Search will also let organizations use their own data to tune document ranking, in preview later this month, helping to provide even more accurate results. Even a small training set, for example 50 to 100 questions paired with answers from relevant document segments, is enough for Vertex AI Search to refine its rankings and deliver better search experiences.
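
To make the shape and size of such a training set concrete, here is a small illustrative sketch in Python that pairs questions with the document segments that answer them and writes them out as simple corpus, query, and label files. The file and field names are hypothetical; consult the search tuning documentation for the exact schema and Cloud Storage layout the tuning pipeline expects.

```python
import json

# Illustrative training examples: each question is paired with the document
# segment that answers it. Roughly 50 to 100 such pairs are typically enough.
examples = [
    {
        "query": "How do I register a new appliance?",
        "segment_id": "owner-guide-014",
        "segment_text": "To register a new appliance, open the app and ...",
    },
    {
        "query": "What does the limited warranty cover?",
        "segment_id": "warranty-doc-003",
        "segment_text": "The one-year limited warranty covers defects in ...",
    },
]

# Hypothetical layout: a corpus file, a query file, and relevance labels.
with open("corpus.jsonl", "w") as corpus, \
     open("queries.jsonl", "w") as queries, \
     open("labels.tsv", "w") as labels:
    for i, ex in enumerate(examples):
        corpus.write(json.dumps({"_id": ex["segment_id"], "text": ex["segment_text"]}) + "\n")
        queries.write(json.dumps({"_id": f"q{i}", "text": ex["query"]}) + "\n")
        labels.write(f"q{i}\t{ex['segment_id']}\t1\n")
```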

DIY search engines with vector search and Vertex AI Embeddings: For customers who want to use embeddings to build their own search, recommendation, and other gen AI applications for more complex use cases like e-commerce or ad serving, we offer vector search and Vertex AI Embeddings.

  • Vector search: Formerly known as Vertex AI Matching Engine, vector search indexes data as vector embeddings and finds the most relevant embeddings, at scale, blazingly fast. It uses an approximate nearest neighbor (ANN) search algorithm that can handle high throughput while providing high recall at low latency. To make vector search more accessible, we’ve updated the UI so developers can create and deploy their indexes without any coding. We’ve also reduced indexing time for smaller indexes from hours to minutes and improved filtering capabilities and documentation. Read more about vector search and all the enhancements here.
  • Vertex AI Embeddings: Vertex AI offers a set of embedding models across data modalities to support use cases including outlier detection, classification, content moderation, semantic search, and recommendations. Vertex AI’s Text Embeddings and Multimodal Embeddings (supporting text and image) models are both generally available. We are excited to introduce in preview a new Multimodal Embeddings model that supports text, image, and now video. All three input types share the same semantic space, unlocking new use cases that involve video files. These embeddings can be uploaded to vector search and paired with other Vertex AI services and foundation models to power predictive and generative AI applications; a sketch of generating embeddings and querying a vector search index follows this list.
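
As a rough illustration of how these pieces fit together, the sketch below uses the Vertex AI SDK for Python to embed a text query and look up its nearest neighbors in an already-deployed vector search index. The project, region, index endpoint, and deployed index ID are placeholders, and the multimodal and video-capable embedding models follow the same pattern.

```python
from google.cloud import aiplatform
from vertexai.language_models import TextEmbeddingModel

# Placeholder project and region; the index endpoint below must already exist.
aiplatform.init(project="my-project", location="us-central1")

# Generate a 768-dimensional embedding for the query text.
model = TextEmbeddingModel.from_pretrained("textembedding-gecko@003")
query_embedding = model.get_embeddings(["energy-efficient washing machines"])[0].values

# Query a vector search index that has been deployed to an index endpoint.
index_endpoint = aiplatform.MatchingEngineIndexEndpoint(
    index_endpoint_name="projects/my-project/locations/us-central1/indexEndpoints/1234567890"
)
neighbors = index_endpoint.find_neighbors(
    deployed_index_id="my_deployed_index",
    queries=[query_embedding],
    num_neighbors=10,
)

# Each neighbor carries the stored datapoint ID and its distance from the query.
for neighbor in neighbors[0]:
    print(neighbor.id, neighbor.distance)
```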

Rooting results in data: new options for grounding

One of the concerns most enterprises have with generative AI is that foundation models can be prone to hallucinations. How do organizations build user confidence in the results presented in their generative apps? Vertex AI Search offers multiple options to ground results in information the organization is comfortable with.

  • Grounding in enterprise data. By grounding results in their own data and including summaries and citations alongside the results, organizations can help users verify and validate results across disparate data sources. This feature is a core building block of Vertex AI Search and is generally available; a short example of requesting a cited summary follows this list. Through Vertex AI Connectors, developers can expand data sources to many leading enterprise applications.
  • Grounding in selected public datasets. For organizations where a number of employees might want to regularly access third-party public datasets like Wikipedia, we are also testing grounding with publicly available datasets. This new option lets developers leverage a wider set of information sources for employees and stakeholders’ data discovery needs, saving both the time and effort of having to search multiple sources individually for the same data.
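
Requesting citations alongside the generated summary is a request-time option on the same search API shown earlier. The snippet below is again a sketch against the Discovery Engine Python client with placeholder IDs; it asks for a cited summary and prints the documents returned alongside it so users can verify the answer against its sources.

```python
from google.cloud import discoveryengine_v1beta as discoveryengine

client = discoveryengine.SearchServiceClient()
serving_config = client.serving_config_path(
    project="my-project",
    location="global",
    data_store="my-data-store",
    serving_config="default_config",
)

request = discoveryengine.SearchRequest(
    serving_config=serving_config,
    query="What is our parental leave policy?",
    content_search_spec=discoveryengine.SearchRequest.ContentSearchSpec(
        summary_spec=discoveryengine.SearchRequest.ContentSearchSpec.SummarySpec(
            summary_result_count=5,
            include_citations=True,  # attach citation markers to the summary
        )
    ),
)

response = client.search(request)

# The summary text carries citation markers; the cited documents come back
# as ordinary search results that users can open to verify the answer.
print(response.summary.summary_text)
for result in response:
    print(result.document.id)
```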

Compliance-first search for the modern enterprise

When developers create applications that search across enterprise data, the security and privacy of that data is always top of mind. Enterprise data — which includes any customer data stored on Google Cloud, the input prompt, the model output, and tuning data — is stored in the organization’s own Google Cloud instance, and Google does not access this data without permission or use it to train our models. We also support a range of compliance and security standards, including HIPAA, ISO 27000-series, and SOC-1/2/3. These standards help to ensure the transparency, accountability, confidentiality, and integrity of our customers’ data.

We are delighted to share that we are expanding support for Access Transparency to give customers visibility into Googler administrative access to their data. Virtual Private Cloud Service Controls (VPC-SC) help prevent the infiltration or exfiltration of customer data. We are also offering customer-managed encryption keys (CMEK) in preview, allowing customers to encrypt their core content with their own encryption keys.

Google Cloud is committed to helping our customers leverage the full potential of generative AI with best-in-class privacy, security, and compliance capabilities. Our goal is to build trust by protecting systems, enabling transparency, and offering flexible, always-available infrastructure, all while grounding efforts in our AI principles.

Whether an organization wants to power its public website search with Google-quality results, help employees find information faster, or deliver personalized recommendations to users, Vertex AI Search makes it easy to build highly customizable generative search and recommendation applications, leveraging the best of Google’s information retrieval technology combined with state-of-the-art generative AI models. Visit our Vertex AI Search web page or reach out to the Google Cloud Sales team for more information.