Google Cloud Announces Dozens of Generative AI Solutions and Reveals an Evolving Strategy
A breakdown of 15 announcements and 4 conspicuously absent products
The generative AI foundation model wars are proxy battles in the ongoing cloud wars. AWS, Microsoft Azure, and Google Cloud are the center of gravity in generative AI, and Google Cloud's event today offered a clear demonstration of this reality, with dozens of significant announcements about generative AI foundation models, applications, and hardware.
Today's update will focus on the notable foundation model announcements, software solutions, and product absences. Hardware announcements, several of which are significant, will be covered in a future post; the software and models will drive the immediate activity.
Google Cloud and AI Foundation Models
PaLM 2 - The upgraded PaLM 2 model now offers a 32k context window, quadrupling the current 8k limit. Google says this is sufficient to maintain about 80 pages of text in memory and significantly expands the use cases the LLM can support.
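Google's "80 pages" figure checks out as back-of-envelope arithmetic, assuming the common rules of thumb of roughly 0.75 English words per token and about 300 words per page (both are my assumptions, not Google's stated numbers):

```python
# Back-of-envelope check of the "80 pages" claim for a 32k context window.
# Assumptions (not from Google): ~0.75 words per token, ~300 words per page.
context_tokens = 32_000
words_per_token = 0.75
words_per_page = 300

words = context_tokens * words_per_token   # 24,000 words
pages = words / words_per_page             # 80 pages

print(f"{words:.0f} words = about {pages:.0f} pages")
```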
Med-PaLM 2 - The fine-tuned (i.e., domain-specific or domain-optimized) LLM for medical questions will be generally available next month. Google also said that Bayer, HCA, and MEDITECH are already users and outlined more detail around use cases for drug discovery, medical note summarization, and medical records search.
Imagen - The vague “visual appeal” improvement was complemented by a style-tuning feature for Imagen. The style feature enables developers to set a default style for Imagen, which should enable it to maintain brand consistency in its outputs. And Imagen is getting a digital pixel-based watermark feature that Google says does not degrade image quality.
Llama 2 - The highest profile open-source LLM is now available on Google Cloud. Meta appears to be facing no distribution barriers.
Code Llama - The Llama 2 fine-tuned coding assistant is also available on Google Cloud.
Anthropic Claude 2 - Claude 2 is not yet available on Google Cloud, but Google pre-announced its forthcoming launch.
TII Falcon - In a surprise move, Google Cloud is also offering access to TII’s Falcon open-source LLM. This is a savvy move to have another open-source model available that is not tied to a tech giant.
Generative AI Applications and Features
Duet - Google Duet showed up in many places and is only available in preview today. It fills a product role similar to what Microsoft calls a copilot. Some noteworthy examples are provided below.
Duet in Google Workspace - There was a demo of Google Workspace employing Duet to create a Google Slides presentation from a natural language prompt and some source documents.
We’re working on enhancing Duet so that we can go from a prompt-based interaction to a much richer, contextual interaction. Duet takes into account what you’re working on, whether it’s an email, a document, a spreadsheet, meeting, and offers you proactive help like generating summaries or suggesting creative ideas; and soon even taking action on your behalf.
We have already seen Microsoft’s vision of its Copilot for productivity, so this demo was not exactly mind-bending. It did, however, look useful and simple. I also noted that Duet may “soon” take action on our behalf. That sounds like a personalized intelligent agent that might fulfill the promise that never materialized with Google Duplex.
Duet for Data Science - The copilot enables you to attach a data document and then perform analysis using natural language prompts.
Duet for Security - Google also promoted Duet as a tool to help analyze and remediate cybersecurity threats.
Duet AI helps security professionals prevent threats, reduce toil in their security workflows, and uplevel security talent — and it’s now integrated into our security products such as Chronicle Security Operations, Mandiant Threat Intelligence and Security Command Center. Duet AI can quickly summarize and classify threat information, translate natural language searches into queries, and provide suggested next steps to remediate issues, which can reduce time for detection and response and make overworked security professionals more productive.
Duet AI - This is the cloud assistant for developers, and the branding may replace Codey, as there is significant feature overlap. Technically, Codey is the suite of foundation models that enables applications such as Duet, and Google claims it improved by 25%. From a practical standpoint, Google also describes Codey as having application features similar to Duet AI.
Duet AI features include:
Code completion and code generation in your IDEs. You get recommendations as you type for full functions and code blocks based on comments, fixes for errors found in the code, and generation of unit tests for code directly in your IDE.
Chat assistance so you can use natural language to ask questions about code bases and APIs, and retrieve coding best practices. Chat assistance is available across many Google Cloud products, such as in the Cloud Console, Cloud Workstations, BigQuery, Spanner, and Apigee.
Vertex AI Search - Google has taken some features of its Search Generative Experience and packaged them as enterprise search. It is multimodal and supports multi-turn conversational search, along with conversation and document summarization. In addition, it has extensions to connect with external data sources and grounding to provide citations and guide the LLM to prioritize results from a specific data set. Essentially, Google is making it easy to set up a retrieval model to increase accuracy and reduce (but not eliminate) hallucinations. The demo that showed it plugging into an existing website suggests this may quickly become a popular solution.
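The grounding pattern described above can be sketched in a few lines: embed documents, retrieve the closest matches to a query, and hand only those to the LLM as context along with their identifiers for citation. A minimal illustration using toy bag-of-words vectors and cosine similarity (all names here are hypothetical stand-ins, not the Vertex AI Search API):

```python
import math
from collections import Counter

def embed(text):
    """Toy bag-of-words 'embedding' -- stands in for a real embedding model."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    """Return the k documents closest to the query, with ids usable as citations."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d["text"])), reverse=True)
    return ranked[:k]

docs = [
    {"id": "doc-1", "text": "PaLM 2 now supports a 32k token context window"},
    {"id": "doc-2", "text": "Imagen adds style tuning and digital watermarks"},
]
top = retrieve("what is the PaLM 2 context window", docs)
# The retrieved text (plus its id as a citation) would be prepended to the
# LLM prompt so answers stay grounded in the chosen data set.
print(top[0]["id"])  # doc-1
```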
Vertex AI Conversation - This is a new chatbot builder that can leverage LLMs, NLU, search, grounding, and vector databases. You can even start building the bot using a natural language prompt. Google says:
Vertex AI Conversation facilitates the creation of natural-sounding, human-like chatbots and voicebots, powered by foundation models with support for both audio and text. With it, developers can build a chatbot based on a website or collection of documents with just a few clicks. For further customizations, Vertex AI lets developers combine deterministic workflows with generative outputs, combining rules-based processes with dynamic AI to create apps that are engaging but reliable—including transaction abilities so users can prompt AI agents to, for example, book appointments or make purchases. Organizations can tune chats with a variety of data from websites, documents, FAQs, emails, and agent conversation histories, and they can generate interaction summaries, citations, and other data to facilitate handoffs between AI apps and human agents.
A brief demo suggested a Duet-like assistant will enable a no-code chatbot creation using a natural language prompt. We may learn more about this in the developer keynote on Wednesday.
AlloyDB AI - Google says, “Build generative AI applications with the familiar PostgreSQL interface and open, standard technologies like pgvector and LangChain.” The company also claims AlloyDB AI is 10x faster than standard PostgreSQL, though there was no comparison with other popular vector databases.
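The pgvector pattern Google references stores embeddings in a vector column and orders rows by their distance to a query vector. A sketch of the equivalent nearest-neighbor logic in plain Python (pgvector's `<->` operator computes Euclidean distance; the table, column names, and vectors below are made up for illustration):

```python
import math

# Mimics the pgvector query pattern, roughly:
#   SELECT id FROM items ORDER BY embedding <-> :query LIMIT 2;
# Table name, column names, and vectors are illustrative only.

def l2(a, b):
    """Euclidean distance, the metric behind pgvector's <-> operator."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

items = [
    {"id": 1, "embedding": [0.9, 0.1, 0.0]},
    {"id": 2, "embedding": [0.1, 0.8, 0.1]},
    {"id": 3, "embedding": [0.0, 0.2, 0.9]},
]
query = [1.0, 0.0, 0.0]

nearest = sorted(items, key=lambda row: l2(row["embedding"], query))[:2]
print([row["id"] for row in nearest])  # [1, 2]
```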
Generative AI Hardware
I will cover more on Google’s AI hardware announcements in a future post. If you want the Google summary, you can read this blog post.
What was Missing
Google covered a lot of ground on generative AI technology and application features. However, some notable Google generative AI solutions were conspicuously absent from the discussion.
Bard - Google’s answer to ChatGPT was never mentioned. Granted, it is not a Cloud service, but Google would have been happy to talk about it if there were something interesting to say. I expect Bard, or its successor, to receive more attention at Google’s fall product launch event. It would be a bad sign if it were shunted aside again. Let’s see.
LaMDA - It was not surprising that LaMDA went unmentioned. Synthedia predicted its demise in favor of PaLM long ago. Of course, at this time last year, and even in February of 2023, LaMDA was considered Google’s favored LLM for chatbot experiences. PaLM now rules, and LaMDA’s history is being expunged from all Google discussions. Is it the Voldemort of LLMs?
Dialogflow - The world’s most widely used conversational AI platform was not mentioned. It is available in Google Cloud today. However, Vertex AI Conversation looks like it may be Dialogflow’s successor application. Dialogflow functionality will not go away because NLU-based capabilities are still needed, and its copious integrations and customer base are valuable. Then again, Vertex AI Conversation may take over the feature footprint at some point.
Google Assistant - The most widely used AI chatbot in Google’s ecosystem and the vehicle that should have been used for generative AI was also ignored. I suspect Bard will eventually be subsumed into Google Assistant, which may appear in the fall product launch event. Stay tuned.
What it Means
Google is finally offering a coherent and comprehensive (maybe too comprehensive) generative AI strategy. The strategy was not explicitly stated, as Microsoft’s Satya Nadella has done. However, the new announcements and the shape of the emerging product portfolio indicate an implicit strategy spanning foundation models, copilots, software features, services, and computing infrastructure.
It took eight months and the context of Google Cloud to bring it all together. Today’s announcements also confirmed that Google had a lot of very capable generative AI solutions as works-in-progress when the ChatGPT moment arrived. They just weren’t ready for market.
Google thought it had time to develop these solutions deliberately over many years and was surprised at how quickly the market dynamics changed. The ChatGPT moment forced Google to rapidly transform many years of research into enterprise-ready products.
A prominent software developer told me, “I think once they let Googlers use the technology they built, folks really ran with it.”
Google stressed many times that its services have “zero data leakage” and that cloud user data won’t be used to train its models. This is obviously a key concern for corporate buyers of generative AI services. However, it is only one of many challenges. Fierce competition from other cloud providers and software developers will be a persistent obstacle for Google as generative AI adoption accelerates.
What is clear is that this is not the Google of the February Bard announcement. Duet is the real answer to Microsoft’s challenge. Google may not have a technology moat, as one internal researcher memo said, but it does have a lot of technology and business assets that will enable it to compete effectively and potentially lead the market. Microsoft and Amazon surely took note of today’s announcement frenzy. You should expect them to react.
ChatGPT Enterprise seems so quaint compared to what we witnessed today. Game on!