How partners can augment solution development with gen AI

Meeting client timelines and quality standards is a long-standing challenge for Google Cloud Partners, especially when navigating large legacy codebases, unfamiliar languages, and multiple tools. 

Partner projects frequently involve code integration across multiple languages — including legacy ones — posing challenges for manual analysis and limiting the effectiveness of automation tools. Further, ensuring reliable error detection is crucial for partners assisting Google Cloud developers. 

And despite using DevOps practices for faster solution development, partners rarely have holistic visibility into customer environments, making it harder to minimize errors and expedite delivery.

Today, generative AI (gen AI) is reshaping software development for Google Cloud partners, offering a strategic solution to balance speed, quality, and security. This blog post explores how Google Cloud partners can use Google’s gen AI alongside other Google Cloud services to redefine and enhance DevOps practices.

The Google Cloud Partners gen AI solution: A flexible unified platform

At Google Cloud, we understand that partners need powerful tools with the flexibility to integrate into existing workflows. Our gen AI is integrated across the Google Cloud platform so you can use it to enhance your DevOps practices quickly and efficiently.

For example, Gemini Code Assist completes your code as you write and generates whole code blocks or functions on demand. Code assistance is available in multiple IDEs, including Visual Studio Code, JetBrains IDEs (IntelliJ, PyCharm, GoLand, WebStorm, and more), Cloud Workstations, and Cloud Shell Editor, and it supports 20+ programming languages, including Go, Java, JavaScript, Python, and SQL.
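
Code Assist itself runs inside the IDE, but the same Gemini models behind it are also reachable programmatically through the Vertex AI SDK. Here is a minimal sketch; the project ID, region, model name, and prompt are placeholders you would adjust:

```python
# Sketch: asking a Gemini model on Vertex AI to generate a whole function.
# Assumes google-cloud-aiplatform is installed and the caller is authenticated
# (for example, via `gcloud auth application-default login`).
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="your-project-id", location="us-central1")  # placeholders

model = GenerativeModel("gemini-1.0-pro")  # model name may vary by region and release
response = model.generate_content(
    "Write a Python function that validates an email address, with a docstring."
)
print(response.text)
```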

We always encourage our partners to review our Responsible AI guidelines as they use Google gen AI to enhance DevOps practices.

Now, let’s take a look at how to incorporate generative AI large language models (LLMs) with partner development operations so you can accelerate and transform how you develop solutions for end customers.

Redefining DevOps with gen AI 

Most partners have sophisticated in-house development operations tools and processes for their customers. With our new reference architecture, you can modernize software delivery operations further by integrating Google gen AI products into those tools and processes. 

The reference architecture below shows gen AI products working in conjunction with Google Cloud products to augment solution development operations for partner engagements.


Reference architecture – a gen AI-powered DevOps automation solution

Partners can use this reference architecture to build a solution to streamline DevOps software delivery. 

First, Google gen AI LLMs integrate with Continuous Integration and Continuous Deployment (CI/CD) pipelines, enabling automated testing and risk checks against your evolving codebase. 
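
As an illustration, a lightweight review step like the following could run inside a Cloud Build or other CI pipeline. The diff file name, model, prompt, and pass/fail heuristic are assumptions you would tailor to your own pipeline:

```python
# Sketch: a CI step that asks Gemini to flag risky changes in a diff.
# The diff file, model name, and failure heuristic are illustrative only.
import sys
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="your-project-id", location="us-central1")
model = GenerativeModel("gemini-1.5-pro")

with open("change.diff") as f:  # produced earlier, e.g. `git diff > change.diff`
    diff = f.read()

review = model.generate_content(
    "Review this diff for security issues, missing tests, and breaking changes. "
    "Start your answer with RISK: LOW, MEDIUM, or HIGH.\n\n" + diff
).text
print(review)

# Fail the build on a high-risk verdict so a human reviews it first.
if review.strip().upper().startswith("RISK: HIGH"):
    sys.exit(1)
```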

Then, Google’s monitoring tools provide the visibility needed to spot production issues faster and centralize troubleshooting, while generated code suggestions from Gemini Code Assist can be pushed into a Git repository for CI/CD deployments into client environments.
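
The hand-off from a generated suggestion to the pipeline can be as simple as committing it to a branch and letting the existing CI/CD trigger take over. A sketch using plain git commands; the repository path, file, and branch name are hypothetical:

```python
# Sketch: write a generated suggestion into a cloned repo and push it,
# letting the existing CI/CD trigger take over. Paths and branch are examples.
import pathlib
import subprocess

REPO = pathlib.Path("/workspace/client-repo")           # already cloned
SUGGESTION = "def healthcheck():\n    return 'ok'\n"     # e.g. output from Gemini Code Assist

(REPO / "app" / "healthcheck.py").write_text(SUGGESTION)

subprocess.run(["git", "checkout", "-b", "genai/healthcheck"], cwd=REPO, check=True)
subprocess.run(["git", "add", "app/healthcheck.py"], cwd=REPO, check=True)
subprocess.run(["git", "commit", "-m", "Add generated healthcheck endpoint"], cwd=REPO, check=True)
subprocess.run(["git", "push", "origin", "genai/healthcheck"], cwd=REPO, check=True)
```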

Partners can run this solution in their own environments to deploy and manage client environments, or deploy it directly into customer environments.

Solution components

This reference architecture provides a comprehensive solution for automating code generation, review, and deployment; its key components are as follows:

1. Vertex AI: The heart of the gen AI platform

  • Gen AI LLMs: Vertex AI lets developers interact with cutting-edge generative AI models for code generation, completion, translation, debugging, and fine-tuning. Developers can automate repetitive tasks, improve their productivity, and create innovative solutions. The following generative LLM variants work together within this solution to help developers (see the sketch after this component's bullets):

    1. Code-Gecko: Code-Gecko is an intelligent code-completion model that integrates with Gemini 1.0 Pro. It provides real-time suggestions, leveraging Gemini’s understanding of context to generate relevant code for developers as they type.

    2. Gemini Code Assist & Gemini 1.0 Pro: Gemini Code Assist bridges the IDE and Gemini 1.0 Pro, allowing developers to interact with Gemini. It facilitates user queries, provides contextual information, and generates code snippets based on the user’s requirements within the solution.

    3. Gemini 1.5 Pro: Gemini 1.5 Pro is an upgrade to Gemini 1.0 Pro, performing more complex tasks and comprehensive code analyses. It supports coherent and contextually aware responses, multi-turn interaction, and code generation with documentation. In this design, Gemini 1.5 Pro provides a powerful AI assistant to help with complex coding use cases.

  • Vertex AI Search: Vertex AI Search leverages semantic search and ML models to index code snippets and documentation. In this architecture, it enables fast retrieval of code created by LLMs and facilitates various task-automation processes, such as code generation, testing, and documentation creation.

  • Vertex AI Agents: Vertex AI Agents is a cutting-edge agent-building and conversational AI service that automates risk checks and generates fresh code suggestions. It integrates with CI/CD pipelines and allows developers to interact with the codebase through a chat-like interface.
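
To make the division of labor concrete, here is a minimal sketch of how the completion and chat models above can be called through the Vertex AI SDK. The model identifiers and prompts are illustrative and may differ by region and release:

```python
# Sketch: code completion (code-gecko) and multi-turn code analysis (Gemini 1.5 Pro)
# via the Vertex AI SDK. Model names are illustrative; check current availability.
import vertexai
from vertexai.language_models import CodeGenerationModel
from vertexai.generative_models import GenerativeModel

vertexai.init(project="your-project-id", location="us-central1")

# 1) Real-time completion: send a prefix, receive a suggested continuation.
completion_model = CodeGenerationModel.from_pretrained("code-gecko")
completion = completion_model.predict(prefix="def parse_csv(path):")
print(completion.text)

# 2) Multi-turn analysis: chat with Gemini about a larger piece of code.
chat = GenerativeModel("gemini-1.5-pro").start_chat()
print(chat.send_message("Explain what this function does:\n" + completion.text).text)
print(chat.send_message("Now write unit tests for it.").text)
```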

2. Cloud Storage: The code repository

  • CodeBase: In this design, Cloud Storage serves as the central repository for all project artifacts, including codebases, unit tests, documentation, and model training data, as well as code generated by the LLMs.

  • Triggers and build processes: Cloud Storage integrates with Cloud Build, so builds and CI/CD pipelines are triggered automatically when the codebase changes, testing and validating generated code.
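
For instance, landing a generated artifact in the bucket that your build pipeline watches can be a few lines with the Cloud Storage client library; the bucket and object names here are hypothetical, and the build trigger itself (for example, via a Pub/Sub notification on the bucket) is configured separately:

```python
# Sketch: store generated code in Cloud Storage, where a separately configured
# build trigger can pick it up. Bucket and object names are examples.
from google.cloud import storage

generated_test_code = "def test_placeholder():\n    assert True\n"  # e.g. output of a Gemini call

client = storage.Client(project="your-project-id")
bucket = client.bucket("partner-genai-codebase")
blob = bucket.blob("generated/unit_tests/test_placeholder.py")
blob.upload_from_string(generated_test_code, content_type="text/x-python")
print(f"Uploaded gs://{bucket.name}/{blob.name}")
```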

3. BigQuery: Enhancing insights

  • Document metadata: BigQuery stores structured metadata about code artifacts in Cloud Storage, providing insights into code documentation, relationships between source code files and generated tests, and code authorship.

  • Analysis and insights: BigQuery enables analysis of this metadata along with logs and metrics collected from other components, providing insights into model usage patterns and areas for improvement.
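
A hedged example of the kind of metadata query this enables; the dataset, table, and column names are hypothetical and would mirror however you catalog your artifacts:

```python
# Sketch: query code-artifact metadata in BigQuery. The dataset, table, and
# columns are hypothetical; substitute the schema of your own catalog.
from google.cloud import bigquery

client = bigquery.Client(project="your-project-id")

query = """
    SELECT source_file, COUNT(*) AS generated_tests
    FROM `your-project-id.devops_metadata.code_artifacts`
    WHERE artifact_type = 'unit_test'
    GROUP BY source_file
    ORDER BY generated_tests DESC
    LIMIT 10
"""
for row in client.query(query).result():
    print(row.source_file, row.generated_tests)
```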

4. Security and observability

  • IAM (Identity and Access Management): IAM provides granular role-based access control, ensuring appropriate permissions for developers, services, and the gen AI models themselves.

  • VPC (Virtual Private Cloud): VPC isolates the development environment, enhancing security by limiting access points and defining network controls and firewalls.

  • Cloud Logging and Cloud Monitoring: These services work together to collect logs from various components and track key performance indicators, aiding in troubleshooting, monitoring model performance, and detecting potential issues.
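
To close the loop, each generation or review step can emit a structured log entry that Cloud Monitoring dashboards and alerts can build on. A minimal sketch; the logger name and fields are illustrative:

```python
# Sketch: emit a structured log for each gen AI review step so it can be
# filtered, charted, and alerted on. Logger name and fields are illustrative.
from google.cloud import logging as cloud_logging

client = cloud_logging.Client(project="your-project-id")
logger = client.logger("genai-devops-pipeline")

logger.log_struct(
    {
        "event": "code_review_completed",
        "model": "gemini-1.5-pro",
        "risk_verdict": "LOW",
        "repo": "client-repo",
        "branch": "genai/healthcheck",
    },
    severity="INFO",
)
```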

Expedite solution development

Our expertise in Search, and in keeping results grounded and factual, translates into more effective solutions. With this architecture, you can use Google gen AI offerings to expedite solution development and streamline workflows while automating tasks like coding and code review. The result is faster, more reliable delivery to clients and improved software development efficiency. By using this gen AI solution, you can differentiate your organization with innovative, ready-to-use tools, demonstrating your commitment to adding value for your clients with cutting-edge technology.

We encourage you, our partners, to broaden your competencies in Google gen AI and integrate it across your workflows. Reach out to your Partner account team to learn about the training options available to you. If you are interested in becoming a Google Cloud Partner, reach out to us here.