Thomas Kurian, CEO of Google Cloud, speaking about AI at the company’s Executive Forum in Mountain View, CA

Google Cloud held an Executive Forum in June to discuss its latest AI advancements with customers and partners. The company’s CEO, Thomas Kurian, shared how the cloud market is undergoing a transformative shift, and at the core of this reinvention is the evolution of AI, particularly generative AI. Generative AI helps businesses create content, synthesize and organize information, automate business processes, and build engaging customer experiences. To achieve these goals, Kurian said the company would focus on five key priorities: world-class AI infrastructure, a choice of multiple foundation models, deep integration with Google Workspace and Google Cloud Platform, AI-powered collaboration, and a broad ecosystem of partners.

At the heart of Google Cloud’s generative AI offerings are foundation models. Foundation models are large AI models trained on enormous quantities of unlabeled data. Google employs self-supervised learning to optimize these models. These generalized models can perform various tasks, such as recognizing, predicting, and generating text, images, software code, audio, and video.

Google said its customers can access over 60 models from Google and its partners. To make AI accessible to a broader range of organizations, Google makes its foundation models, such as PaLM 2, available in smaller, cost-efficient configurations tailored for different applications and use cases. Companies can customize PaLM 2 through adaptation, transfer learning, and distillation techniques. These models come in different sizes, each with its own performance, latency, cost, and memory requirements, ensuring flexibility for different scenarios. PaLM 2’s latest enhancements improve its ability to perform advanced reasoning tasks, including code and math, classification and question answering, translation and multilingual proficiency, and natural language generation.

Additionally, Google offers Vertex AI, an end-to-end AI platform that streamlines the deployment of AI and generative models, allowing companies to customize AI models for their specific applications. Vertex AI offers three essential generative AI capabilities: model discovery and tuning, customization tools, and control over data and IP. With Vertex AI, customers can discover and tune foundation models from Google and its partners. The platform enables seamless model scaling, automation of reinforcement learning feedback loops, synthetic data generation for testing, and management of deployment locations and costs. Businesses can refine Google’s models to improve performance on specific tasks using prompt engineering, fine-tuning, parameter-efficient tuning, and reinforcement learning.

One crucial area of concern for organizations is corporate data protection when using large foundation models from third parties. Google said its enterprise services ensure complete control and segregation of data, IP protection, and compliance with regulatory requirements when customizing models. Customer data remains confidential and isn’t shared with other customers or with Google’s foundation models.

Companies want to leverage Google’s expertise and lessons learned in designing and scaling AI models. But you may find yourself wondering what, if anything, real companies are doing with these solutions. Google provided several examples of how customers and partners are using Vertex AI. For instance, an aerospace company, GA Telesis, developed a data extraction solution that automates customer calls by interpreting and correlating customer orders. Snorkel AI, a data science company, uses Vertex AI to generate high-quality training data and train models to customize patient treatments. Organizations like YouCite and Behavior Box utilize generative AI models on the Vertex AI platform to enhance employee interactions and identify insider threats.

Google featured speakers from the Mayo Foundation, Priceline, Wayfair, and Wendy’s at the executive forum. These Google Cloud customers shared their insights on AI’s challenges, opportunities, and practical applications. Each organization sees tremendous opportunities in AI and has been working with Google on AI solutions for many years.

The combination of search and chat has become a hot topic in the customer experience world. The panel discussed how companies are looking for ways that search and chat can work together seamlessly using generative AI. Unsurprisingly, travel companies have been attempting to create more personalized customer experiences. Priceline shared that generative AI would help it create a more connected travel experience that not only incorporates all its products but also surfaces additional relevant information for its customers when they travel.

Like many healthcare organizations, Mayo Foundation seeks ways to minimize employee burnout. It uses generative AI and search to surface complex clinical and medical information to clinicians and administrators. Ideally, it would like to automate some of the clinicians’ and administrators’ routine tasks to allow its employees more time to focus on what matters. Mayo spoke of the importance of having a human in the loop at these early stages of any new AI development. In multiple discussions with various companies, Lopez Research has learned that having a person review the model’s output for intent and accuracy is crucial because AI models frequently don’t perform as expected. AI requires fine-tuning.
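The human-in-the-loop pattern described above can be sketched in a few lines. This is a minimal, hypothetical illustration (the queue class and sample outputs are invented for this article, not Mayo’s or Google’s actual system): model outputs are held for a human reviewer’s approval before they reach any downstream system.

```python
# Minimal human-in-the-loop sketch. All names and sample outputs are
# hypothetical illustrations, not any company's production pipeline.
from dataclasses import dataclass, field


@dataclass
class ReviewQueue:
    pending: list = field(default_factory=list)
    approved: list = field(default_factory=list)

    def submit(self, model_output: str) -> None:
        """Hold every model output for human review instead of auto-publishing."""
        self.pending.append(model_output)

    def review(self, reviewer_ok) -> None:
        """A human reviewer checks each output for intent and accuracy."""
        still_pending = []
        for output in self.pending:
            if reviewer_ok(output):
                self.approved.append(output)
            else:
                still_pending.append(output)  # held back for correction or fine-tuning
        self.pending = still_pending


queue = ReviewQueue()
queue.submit("Summary: patient is due for an annual wellness visit.")
queue.submit("Summary: [low-confidence draft]")
# In practice the reviewer is a person; a callable stands in for them here.
queue.review(lambda text: "[low-confidence" not in text)
print(len(queue.approved), len(queue.pending))  # 1 approved, 1 held back
```

The point of the pattern is that nothing the model produces is published automatically; flagged outputs become feedback for the fine-tuning Mayo described.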

Wayfair spoke about the challenges of the home category and the need to match AI with search capabilities. For example, looking for the perfect couch for your living room is very different from finding out the score of the latest NBA game. In retail, there’s not one right answer; it is a journey. Wayfair spoke about how it employed machine learning capabilities over time to understand customer intent and match customers to the perfect product.

Wayfair has a framework for generative AI and a small cross-functional team that reviews which AI use cases deliver value but still require a human in the loop to audit outputs and minimize enterprise risk. Wayfair has fast-tracked use cases for customer service, sales, and code-generation assistants. Going forward, Wayfair wants to focus on more intentional and differentiated features for its customers, such as delivering generative imagery.

Wendy’s wanted to improve the speed, accuracy, and consistency of its pickup window process. Wendy’s has over 30 menu products that can be customized. Additionally, its customers may order products in ways that differ from the menu board. Wendy’s uses AI technology, specifically tools such as Dialogflow, to help it automate the order-taking process. In June, Wendy’s launched its first pilot using new generative AI offerings, such as Vertex AI, to have conversations with customers, including the ability to understand made-to-order requests and generate responses to frequently asked questions. The goal is to use generative AI to remove complexity from the ordering process. Wendy’s will use Google’s foundation LLMs with data from Wendy’s menu, along with established business rules and logic as conversation guardrails. The AI system is integrated with Wendy’s restaurant hardware and point-of-sale system.
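To see why customers ordering “in ways that differ from the menu board” is hard, consider a toy sketch of the matching problem. Wendy’s actual pipeline is built on Dialogflow and Vertex AI; the menu, synonyms, and modifiers below are entirely hypothetical stand-ins used only to illustrate normalizing free-form phrasing against a menu of customizable items.

```python
# Toy sketch of mapping free-form order phrases to canonical menu items plus
# customizations. Menu, prices, synonyms, and modifiers are hypothetical,
# not Wendy's real data; a production system would use an LLM/Dialogflow.
import re

MENU = {"cheeseburger": 4.99, "fries": 2.49, "frosty": 1.99}
SYNONYMS = {"burger": "cheeseburger", "shake": "frosty"}
MODIFIERS = {"no onions", "extra cheese", "large"}


def parse_order(utterance: str):
    """Return (item, modifiers) pairs recognized in a customer's phrasing."""
    text = utterance.lower()
    # Normalize off-menu phrasing ("burger", "shake") to menu-board names.
    for word, canonical in SYNONYMS.items():
        text = re.sub(rf"\b{word}\b", canonical, text)
    order = []
    for item in MENU:
        if item in text:
            # Naively attaches every modifier to every item; a real system
            # must scope modifiers to the item they actually refer to.
            mods = {m for m in MODIFIERS if m in text}
            order.append((item, mods))
    return order


print(parse_order("A burger with no onions and a shake, please"))
```

Even this toy version shows the gap generative AI is meant to close: rule-based matching can’t reliably decide which item a modifier belongs to, which is exactly the kind of made-to-order understanding Wendy’s is piloting.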

Instead of solving a single problem, Wendy’s said it wants a platform approach that solves problems across its various channels, extending beyond the drive-thru to mobile applications. Wendy’s says it has seen early success using language models for order-taking. It also noted that employees aren’t worried about job displacement because order-taking is only one of several jobs they do in the restaurant. Wendy’s noted the benefits of engaging staff and franchisees early in the technical design phase by providing proofs of concept and gathering feedback.

Priceline shared that a developer’s mindset must shift from building features to designing experiences. The developer must design an experience that optimizes the inputs and outputs of a large language model, which requires a completely different skill set than proficiency in writing JavaScript. For example, in prompt engineering, the developer must decide how prompts are weighted, such as toward price or other factors the customer values. Prompts are crucial in communicating with and directing the behavior of large language models (LLMs). Prompts are the inputs or queries people use to get answers from a model. Priceline also shared that companies underestimate the need to purchase real-time data infrastructure solutions to enable generative AI. A company needs to be able to measure and monitor AI models within its ecosystem.
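As a minimal sketch of what that prompt-engineering work looks like in practice, a developer might assemble customer context and weighting into a structured prompt rather than hard-coding feature logic. The template, field names, and weighting scheme below are hypothetical illustrations, not Priceline’s production prompts.

```python
# Toy prompt-engineering sketch: the template, fields, and weighting hint
# are hypothetical, illustrating how customer context shapes LLM inputs.
def build_travel_prompt(query: str, budget_weight: float, preferences: list[str]) -> str:
    """Compose an LLM prompt that encodes what the customer values."""
    pref_lines = "\n".join(f"- {p}" for p in preferences)
    return (
        "You are a travel-booking assistant.\n"
        f"Customer request: {query}\n"
        f"Weight price sensitivity at {budget_weight:.0%} when ranking options.\n"
        "Customer preferences:\n"
        f"{pref_lines}\n"
        "Answer with three ranked suggestions and a one-line rationale each."
    )


prompt = build_travel_prompt(
    "weekend trip to Chicago in October",
    budget_weight=0.7,
    preferences=["nonstop flights", "hotel near the lake"],
)
print(prompt)
```

Designing, versioning, and evaluating templates like this, and measuring how the model’s outputs change as the weights and wording change, is the new skill set Priceline described.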

Wayfair noted that even digitally native companies have legacy digital solutions. It shared how Google helped it move from a monolithic code base and database to a cloud-native architecture. It also uses Vertex AI and Gen App Builder to make it easier for data scientists and machine learning engineers to quickly get onboarded to a platform so the teams can focus on building and experimenting without worrying about designing ML infrastructure. It’s using BERT to help it understand its customers’ intentions within search.

Mayo said it’s been using large language models for some time, with generative AI as the latest iteration in its AI toolkit. Healthcare has always struggled with searching for and finding information. Mayo said an average patient has 7,000 to 8,000 data points. The average physician sees 10 to 15 patients a day. Like Wayfair, the Mayo Foundation has been working with Google for some time. It used Google’s natural language processing (NLP) capabilities before generative AI to gather and synthesize unstructured data more easily, and generative AI furthers these efforts. Cris Ross, Mayo Clinic’s Chief Information Officer, shared the following in a press release about the collaboration between the two companies: “Google Cloud’s tools have the potential to unlock sources of information that typically aren’t searchable in a conventional manner, or are difficult to access or interpret, from a patient’s complex medical history to their imaging, genomics, and labs.”

Obviously, the Mayo Foundation is also concerned with patient data security. It spoke of the benefits of a secure enclave with private customer keys to ensure data privacy, control over where the data resides with an auditable view, and encryption at rest. It also wants to provide its innovators with safe, secure sandboxes for research and development tests. The Mayo Foundation shared that no model will be 100% perfect but said what’s more important is how it measures a model’s accuracy and provides confidence in what the model is good at. It believes you don’t need to go directly to patient diagnosis to make AI successful because there’s so much low-hanging fruit in reducing paperwork and administrative burdens. For example, giving clinicians a summary of a patient’s records before the visit can change the patient experience.

Wayfair noted that one of the biggest differences in AI today is that all companies have access to foundational capabilities as a service to jumpstart their AI efforts. Previously, companies had to hire highly skilled people, build AI infrastructure, and adopt AI software tools. The availability of AI-specific services from multiple providers is a game changer for organizations looking to deploy AI.

Adaire Fox-Martin, President of Google Cloud GTM & Head of Google Ireland, closed the panel with an appropriate line that sums up the opportunity ahead of us when she said, “You can reimagine, not just a process, but an entire industry with AI’s capabilities. It’s a formative time for generative AI across the industries.”