This year, the annual Google Cloud Next conference brought us a whirlwind of product announcements (161 to be precise) that covered a broad range of cloud and collaboration products. From infrastructure and big data to APIs, Google had a myriad of updates to talk about across its entire product suite. 

The uniting theme, however, was AI. And more specifically, generative AI (AKA "GenAI"). Google made no attempt to shy away from the fact that nearly every enhancement to its tech stack was focused on generative AI. Google even referred to this year as the beginning of the age of generative AI. But what exactly does this mean and what makes GenAI different? In this post, we'll examine the new GenAI wave, highlight the most important announcements from the conference, and discuss the impact these new releases will have on the enterprise.

Why GenAI is different

So why the hyper-focus on generative AI? There's no question that right now GenAI is at the very top of the hype cycle. However, it was clear from the keynote delivered by Google CEO, Sundar Pichai, that GenAI is not simply the latest tech fad. Google believes GenAI will soon be everywhere as it makes its way into every major product across all industries. Just dwell on that for a moment. What gives this technology the potential to be so pervasive? Google gave us two key reasons.

  1. By its nature, GenAI is able to create something new that didn't exist before. Think of everything that business workflows produce today: emails, calendar events, meeting notes, chat reminders, presentations, and more. Augmenting collaboration content like this seems straightforward enough (Google even showed off its AI's ability to attend a meeting for you and take detailed notes!). But there's a further vision here. Code, change logs, security alerts, firewall rules, infrastructure definitions: so much of the tech stack is now defined as text in standardized formats, and GenAI happens to be extremely adept at understanding and creating exactly this kind of content.
  2. Google pointed out that every company can realize immediate benefits from GenAI internally. Consider the widespread adoption of the Internet: one key to its success was that companies connecting their various office locations saw immediate, intrinsic benefits to internal workflows. GenAI has the potential to offer every company the same kind of transformative benefit without having to wait for market opportunities to drive adoption.

Vertex AI 

Keeping Google's tremendous focus on GenAI technology in mind, let's dive into the product announcements. When it comes to building AI solutions on Google Cloud, everything revolves around Vertex AI. If you aren't familiar, Vertex AI is not a single product, but rather a platform that groups together an expansive set of AI tools and APIs. At the conference, Google announced a long list of enhancements to Vertex AI, including:

PaLM 2 

  • Now available in 38 languages.
  • 4x input token increase: The text-bison-32k model can now handle up to 32k tokens in a single input prompt (a token is approximately 4 characters, so think 128k characters or an 85-page document).
  • 8k tokens outbound: Output now allows for up to 8k tokens (see the sketch after this list for how these limits show up in an API call).
  • Adapter tuning (generally available).
  • Reinforcement Learning with Human Feedback (or RLHF, in preview).
  • 15% price decrease for using the PaLM API.
  • Grounding service: The ability to ground AI responses in your specific data, reducing hallucinations.
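
To make the new token limits concrete, here is a minimal sketch of calling the updated PaLM text model through the Vertex AI Python SDK. The project ID and prompt are placeholders, and parameter defaults may differ slightly between SDK versions.

```python
# Minimal sketch: calling the PaLM 2 text model via the Vertex AI Python SDK.
# Assumes the google-cloud-aiplatform package is installed and application
# credentials are configured; "your-project-id" is a placeholder.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="your-project-id", location="us-central1")

# text-bison-32k accepts up to ~32k input tokens (a token is roughly 4 characters).
model = TextGenerationModel.from_pretrained("text-bison-32k")

response = model.predict(
    "Summarize the following requirements document into a one-page brief: ...",
    max_output_tokens=8192,  # output is capped at 8k tokens
    temperature=0.2,
)
print(response.text)
```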

Model Garden 

  • Llama 2 is now available (from Meta).
  • Claude 2 is now available (from Anthropic).
  • Sec-PaLM (from Google, model tuned to security use cases).
  • Med-PaLM (from Google, model tuned to medical use cases).

Gen AI App Builder 

  • Vertex AI Conversation: Now generally available; easily create chatbots with little to no code or ML expertise.
  • Vertex AI Search: Now generally available; quickly create an interactive search of a website or other data source with little to no coding needed.

Vertex AI Extensions

  • Connect to and retrieve data from other sources (an HR system, sales data, a project management system, etc.) and use this data in generated responses to users; the sketch below shows the general pattern.
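
Under the hood this is essentially a retrieve-then-prompt pattern: fetch the relevant record from the external system, then hand it to the model as context. The sketch below illustrates the idea with a hypothetical HR lookup; it is not the Extensions API itself, and fetch_pto_balance and its fields are stand-ins.

```python
# Sketch of the pattern behind extensions: pull data from another system and
# ground the model's answer in it. Assumes vertexai.init() has already been
# called as in the earlier sketch; fetch_pto_balance is a hypothetical stand-in
# for a real HR-system API call.
from vertexai.language_models import TextGenerationModel

def fetch_pto_balance(employee_id: str) -> dict:
    # Placeholder for a call to an HR system; returns structured data.
    return {"employee_id": employee_id, "pto_days_remaining": 12}

def answer_with_context(question: str, employee_id: str) -> str:
    record = fetch_pto_balance(employee_id)
    prompt = (
        "Answer the question using only the data below.\n"
        f"Data: {record}\n"
        f"Question: {question}"
    )
    model = TextGenerationModel.from_pretrained("text-bison")
    return model.predict(prompt, max_output_tokens=256).text

print(answer_with_context("How many vacation days do I have left?", "E-1001"))
```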

Key takeaway on Vertex AI

Vertex AI aims to be a one-stop shop for everything AI, from truly custom model creation, to model tuning, to using existing third-party models via the Model Garden. It provides tools for building everything from the ground up, or for relying on pre-built solutions, like GenAI App Builder, that significantly reduce the time and cost of bringing an AI solution to market.

Duet AI 

Another key focus of Google Cloud Next was Duet AI. Previously, this product was only integrated into Google Workspace and was focused on augmenting collaboration tasks. At Next, it was announced that Duet AI will now be integrated into a multitude of Google Cloud products, as well. For starters, these include: 

  • Cloud Code
  • Cloud Console
  • Logs Explorer
  • BigQuery
  • Cloud Spanner
  • AlloyDB AI – new product; lets you build GenAI apps on top of a fully managed PostgreSQL-compatible database.
  • Colab Enterprise – new product; Colab Notebooks with enterprise controls and security.
  • Security Command Center
  • Mandiant Threat Intelligence
  • Chronicle Security Operations

Key takeaway on Duet AI 

Expect to see Duet AI everywhere in GCP. It will be integrated into existing products, letting users rely on natural language to accomplish increasingly complex tasks, check their work, and navigate product documentation.

Commitment to the AI Ecosystem 

Finally, we should mention Google's focus on the AI ecosystem. While Google is developing a significant amount of first-party AI products, it is clear they want GCP to be friendly to third-party hardware, tools and datasets. 

Google announced a big expansion of its partnership with NVIDIA, and NVIDIA CEO Jensen Huang even made an appearance during the keynote. The partnership includes NVIDIA bringing DGX (its ML research supercomputer) to GCP. For GCP customers, the partnership means continuing to get the best NVIDIA hardware available to run AI workloads, such as the new A3 VMs powered by the NVIDIA H100.

As mentioned above, we also saw continued commitment to the Model Garden with the addition of many more third-party models, such as Llama 2 from Meta and Claude 2 from Anthropic. Additionally, products like Vertex AI Connectors open up broad possibilities for developers to build integrations so their products work seamlessly with Vertex AI on GCP.

With so many possibilities for how to build your AI solution, Google also pointed to the importance of working with a services partner that has experience in AI and can therefore help you navigate these products and create the right approach for your business. 

Key takeaway on AI ecosystem

Google wants the AI tools of GCP to integrate well with existing solutions and data, so they are developing their tools with this goal in mind.

Responsible AI 

It's also worth noting that Google spent a significant amount of time talking about safe and responsible AI. The grounding features rolled out to PaLM 2 are a huge improvement for reducing hallucinations and directing a model to prioritize specific content as the source of truth. 

Several sessions also mentioned that there's more to come for data controls, including tech safeguards, a content moderation API, and tools to check content sources and reduce response bias. 
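
As a small illustration of what's already exposed today, the Vertex AI SDK returns safety metadata alongside generated text. The field names below (is_blocked, safety_attributes) reflect the 2023-era Python SDK and are an assumption about your particular version; check the SDK reference before relying on them.

```python
# Sketch: inspecting safety metadata on a PaLM response via the Vertex AI SDK.
# Assumes vertexai.init() has already been called; the is_blocked and
# safety_attributes fields reflect the 2023-era SDK and may differ by version.
from vertexai.language_models import TextGenerationModel

model = TextGenerationModel.from_pretrained("text-bison")
response = model.predict("Draft a polite reminder email about overdue invoices.")

if response.is_blocked:
    print("Response was blocked by content filters.")
else:
    print(response.text)
    # safety_attributes maps category names to confidence scores.
    for category, score in response.safety_attributes.items():
        print(f"{category}: {score:.2f}")
```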

Other important launches

The conference wasn't all AI, mind you. There were noteworthy launches in other categories as well. Here are the highlights:

Infrastructure 

  • Cloud TPU v5e: The next evolution of Google's Cloud TPUs.
  • GKE Enterprise: An expansion on Anthos, offering managed Kubernetes across multiple environments.
  • A3 VMs, powered by NVIDIA H100: New VM type offering tremendous compute power for HPC workloads.
  • Titanium system: Google's new tiered offload architecture for cloud infrastructure.
  • Google Distributed Cloud (next-gen):  The next generation of GDC with significant improvements and controls available.
  • Cross-cloud Network: A new low-overhead method for connecting clouds or on-premises resources directly to Google Cloud.

Data and Analytics 

  • BigQuery Studio: A new collaborative workspace designed to accelerate data-to-AI workflows.
  • Looker Studio Pro: Self-service analytics platform with full enterprise controls and support.
  • AlloyDB AI: A new tool for building GenAI apps on top of fully managed PostgreSQL-compatible databases.

Storage 

  • Cloud Storage FUSE: Mount and access cloud storage buckets as a local filesystem.
  • Parallelstore: A high-performance, managed parallel file service based on Intel DAOS.
  • NetApp Volumes: Native support for NetApp storage in GCP.

Security 

  • Mandiant Hunt for Chronicle Security Operations: An AI tool that proactively searches for security threats.

In conclusion

Google announced many more product enhancements at the conference than will fit in a single post, but the above should give you an excellent overview of the noteworthy highlights.

Google continues to invest heavily in building generative AI solutions for its customers to use in the cloud. We are likely only at the beginning stages of these products and their capabilities. It's an exciting time to be at the forefront of this emergence and continue to help companies navigate this rapidly evolving space.
