Sachin Gupta, Google | Google Cloud Next ’24

[Savannah Peterson]

Welcome to Google Cloud Next

Good afternoon, cloud community, and welcome back to fabulous Las Vegas, Nevada. We’re here midway through day one of Google Cloud Next. We’ve got three days of coverage, power-packed with over 35 different segments.

Introducing Savannah Peterson and Rob Strachey

My name is Savannah Peterson, joined by analyst Rob Strachey for this one. Rob, this is a fun one. Is this your second Google Cloud Next?

[Rob Strachey]

Rob Strachey’s experience with Google Cloud Next

It’s my third.

[Savannah Peterson]

Your third?

[Rob Strachey]

Yes, yes. But this is the second one with theCUBE, and it seems like we were just here yesterday, yet the number of announcements since the last one last fall is just massive.

Sachin Gupta joins the show

It’s just unbelievable.

[Savannah Peterson]

It absolutely is massive, 30,000 people here. Our next guest is one of the folks behind the infrastructure at Google. Sachin, welcome to the show. Must be a thrilling day for you.

[Sachin Gupta]

It absolutely is, and I'm super excited to have this time with you guys.

[Savannah Peterson]

Yes, we're grateful you could make the time.

Google Cloud announcements

There's a big series of announcements coming out of your division today and over the course of the week.

[00:01:04]

Can you give us some of the highlights?

[Sachin Gupta]

Google Distributed Cloud

Yes. I just want to echo a little bit more on some of the things that Thomas also started mentioning in the keynote. The best place to run AI is clearly in our public cloud regions, but there are reasons why sometimes customers can't use those regions. That could be a regulatory need, a compliance need, or a survivability or latency-driven need that forces the deployment to remain on-prem or at the edge, and that's why we introduced Google Distributed Cloud. We're seeing tremendous momentum with customers both in completely air-gapped environments, where the data must stay on-prem and the operations are also air-gapped, with no connectivity to the internet ever, as well as in connected environments like retail stores that may have hundreds or thousands of locations but still need continuity of critical services on-premises. So, super excited with the momentum there, and also super excited with what we announced last night, which is cross-cloud networking, helping customers simply and securely get the best out of AI models and out of data, regardless of where that data resides today.

[00:02:14]

Cross-cloud networking

So, you know, very, very excited about those.

[Savannah Peterson]

How has that cross-cloud networking evolved since that announcement was made eight months ago-ish?

[Sachin Gupta]

Yeah, so cross-cloud networking is really about how we help you build distributed applications and how you secure your workforce, so how can you connect every single one of your locations into our backbone, leverage the power of our backbone, but bring your security stack of choice. That's where we partner with Palo Alto and many of the other players so that customers don't have to compromise on security at all. There are maybe three things that I'd like to highlight in cross-cloud network that we're announcing here.

Cross-cloud networking enhancements

One is, cross-cloud networking is now service-centric. What I mean by that is, we had a technology called Private Service Connect that allowed you to put what you can think of as a proxy in front of your Google Cloud services and applications.

[00:03:04]

But now we extend that to other clouds and to applications and services running on-premises. So customers can have one consistent way for NetOps and SecOps to set this up, and DevOps can move much more quickly. That's number one. The other thing we're doing is, you may have your data in one hyperscaler and may have it on-prem, but you may want to be leveraging Gemini as the model. So how do you connect that cost-effectively with the right SLAs? Cross-cloud network helps customers with that. And then the third one is, you're trying to do inferencing, you're serving these models for all those wonderful enterprise use cases that we talked about, and you want to make sure that you're using that ML infrastructure, GPUs and TPUs, very effectively, and that's where we help you with our load balancer. It's now AI-aware and can help you optimize the experience for customers, as well as help lower costs. So lots of great innovations in CCN.
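
As a rough illustration of the service-centric pattern described here, the sketch below publishes an internal service behind Private Service Connect and then consumes it from another VPC with a PSC endpoint, using the google-cloud-compute Python client. The project, region, network, subnet, and service names are hypothetical, and exact fields and operation handling can vary by client release, so treat this as a sketch of the pattern rather than a production recipe.

```python
# Sketch: publish an internal service with Private Service Connect and
# consume it from another VPC. All names are hypothetical; verify fields
# against the current google-cloud-compute client before relying on this.
from google.cloud import compute_v1

PRODUCER_PROJECT = "my-producer-project"  # hypothetical
REGION = "us-central1"


def publish_service() -> None:
    """Producer side: expose an internal load balancer as a service attachment."""
    client = compute_v1.ServiceAttachmentsClient()
    attachment = compute_v1.ServiceAttachment(
        name="payments-svc-attachment",
        # Forwarding rule of the producer's internal load balancer (hypothetical).
        target_service=f"projects/{PRODUCER_PROJECT}/regions/{REGION}/forwardingRules/payments-ilb",
        connection_preference="ACCEPT_AUTOMATIC",
        nat_subnets=[f"projects/{PRODUCER_PROJECT}/regions/{REGION}/subnetworks/psc-nat-subnet"],
    )
    op = client.insert(
        project=PRODUCER_PROJECT, region=REGION, service_attachment_resource=attachment
    )
    op.result()  # wait for the long-running operation (newer client versions)


def consume_service(consumer_project: str) -> None:
    """Consumer side: a forwarding rule in another VPC pointing at the attachment."""
    client = compute_v1.ForwardingRulesClient()
    rule = compute_v1.ForwardingRule(
        name="payments-endpoint",
        network=f"projects/{consumer_project}/global/networks/app-vpc",
        subnetwork=f"projects/{consumer_project}/regions/{REGION}/subnetworks/app-subnet",
        # The PSC endpoint simply targets the producer's service attachment.
        target=f"projects/{PRODUCER_PROJECT}/regions/{REGION}/serviceAttachments/payments-svc-attachment",
        load_balancing_scheme="",  # left empty for a PSC endpoint (an assumption worth verifying)
    )
    client.insert(
        project=consumer_project, region=REGION, forwarding_rule_resource=rule
    ).result()


if __name__ == "__main__":
    publish_service()
    consume_service("my-consumer-project")  # hypothetical consumer project
```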

[Rob Strachey]

Unpacking service-centric approach

Yeah, I think let’s unpack that a little bit even more, because I think the service-centric just went GA, right?

[00:04:06]

And I think it’s when you start to look at how organizations are building out their applications, they either want to bring the AI to the app or to the data, or the data to the AI. And this would seem that you’re helping them do both, and choose where data lives, what kind of lakes it lives on, and things like that, and bringing the AI to it in certain circumstances, and then with the distributed cloud portion as well. But also making those connections across there, because most people are multi-cloud or have multiple different places, or have SaaS, or something of that nature.

[Sachin Gupta]

Scotiabank customer example

So maybe let’s unpack that using a customer example. We’ve been working very closely with Scotiabank, and Scotiabank is trying to connect on-prem and other cloud environments using interconnect, or something we call cross-cloud interconnect, into Google Cloud. And now they’re trying to represent every one of the services they have using this whole server-service-centric approach we have.

[00:05:04]

But it’s not just our services. They have wallet services, credit card services that they must integrate with. So when you think about AI applications and bringing the data together, those AI apps have to integrate with all of their existing services. And so the new enhancements and the new capabilities in cross-cloud network thinks about how you provide that connectivity fabric, how you provide that security, and how do you provide that proxy and help create a mesh for your services and your application tiers that may sit anywhere. And so Scotiabank is able to get consistency, better security, lower cost, all at the same time.

[Rob Strachey]

Partnering with Palo Alto Networks

And it seems like a good place where you're bringing together not just the power of Google, but the power of your partners as well. Yes. And like you mentioned Palo Alto, who we had on earlier in the day, how does that really play into this to help customers get a full end-to-end solution?

[Sachin Gupta]

Very often, customers are forced to make a hard decision where they have to compromise on their security stack of choice.

[00:06:07]

Because they may go to a cloud provider who says, use my firewall. And that firewall may not be the best, and their security needs may be something different. So first of all, we partnered with Palo Alto Networks to create our cloud next-generation firewall, and it has 20 times higher efficacy than firewalls from other cloud providers. But at the same time, if you don't want to use our firewall, we've made it very easy to bring your own firewall and integrate it as part of our backbone. And so Palo Alto, Broadcom, there are just so many different partners that we've onboarded, so that customers do not have to compromise. And this goes back to our strategy of remaining completely open. We want to make sure that that experience is great, and customers can bring their security stack of choice.

[Savannah Peterson]

Collaboration between hyperscalers

It’s so important, and it is really exciting to see the different big players and hyperscalers all coming together, to your point, to play nice and to create these solutions.

[00:07:01]

20x more efficacy on security is really compelling. It’s no wonder that you just got authorization from one of the most secure entities, in theory, on the planet, the US government.

US government authorization for Google Distributed Cloud

Tell us a little bit more about that.

[Sachin Gupta]

Yeah, let me talk a little bit about that. So we're super excited. We announced this today, and this is now shifting gears to our Google Distributed Cloud product, which has now received authorization to operate in Top Secret and Secret use cases with the US government. And that doesn't happen just by chance. It's our fundamentals in zero trust, building the platform from the ground up with a zero-trust framework in mind, and then going through that accreditation process, working with the US government. So super excited about that. We're seeing great traction with Google Distributed Cloud in many countries and for many, many different use cases.

[Savannah Peterson]

Congratulations. That is a huge deal. I'm just curious, how long was that process?

[Sachin Gupta]

More than a year.

[Savannah Peterson]

How about that?

[Rob Strachey]

I’ll leave it there.

[00:08:00]

Yeah, it is definitely a tough thing.

[Sachin Gupta]

Is that top secret? Yeah, yeah, yeah. Something like that. But we've been working on it for a long time, and on the product side, we've been working for several years for the product to be able to serve these kinds of needs.

[Rob Strachey]

Partnering with ISVs in Google Distributed Cloud

Yeah, and I think also, within the Distributed Cloud, you're bringing partners to bear there as well, and ISVs in particular, not just packaged ISVs like SAP and Citrix, but also things like Starburst with Trino, and service mesh and data mesh capabilities of that nature, where you also have your own. So it seems like you're bringing, again, that choice to the edge and to where it needs to be actually used.

[Sachin Gupta]

Generative AI search capability in Google Distributed Cloud

Yeah, so let me just provide one more example on that. We’re introducing a generative AI search capability in Google Distributed Cloud. So it’s running fully air-gapped, no connectivity to the internet, no connectivity to Google. You can feed your most sensitive data into this thing and search it, multi-modal input.

[00:09:04]

So the way it works is, we take that input and we apply Vertex APIs to do optical character recognition, speech-to-text conversion, and translation. We then use an open source model to create embeddings. We then take those embeddings and store them in a vector database, which is AlloyDB, and then we create a chatbot interface and use a large language model, which is Gemma, the Google open source model, in order to interact with this. So imagine you've got all these private data sets that you could not gain insights from because it was way too complicated, and we now give you a complete solution where you don't have to worry about your data ever leaving your premises. Now, I just talked about a whole bunch of Google components that pull this together, but if you want to use somebody else's translation API, a different embeddings model, a different vector database like Elastic, for example, or if you want to use Llama instead of Gemma as the LLM, every component there you can swap out for whatever you want as a customer.

[00:10:04]

So we give you this entire solution, very attractive, very useful, but completely open, where every part of it can be swapped out for your choice. It's like a little Lego kit. It's a solution kit where we give you the entire solution framework, we show you how it's all integrated, you can take the entire repository and deploy it, and if you say, no, no, no, I actually want to use something different in this box, it's very easy to do.
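
To make that flow concrete, here is a minimal sketch of a retrieval-augmented pipeline along the lines Sachin describes: chunks of data are embedded with an open source model, the embeddings are stored in a PostgreSQL-compatible vector table (AlloyDB supports the pgvector interface), and a locally hosted open model such as Gemma answers questions over the retrieved context. The connection string, table layout, model choices, and helper names are illustrative assumptions, not the actual Google Distributed Cloud implementation, and, as he notes, every piece can be swapped out.

```python
# Sketch of the air-gapped search flow described above. All names, models,
# and connection details are illustrative; any component can be replaced.
import psycopg2
from sentence_transformers import SentenceTransformer
from transformers import pipeline

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # any local embedding model (384-dim)
generator = pipeline("text-generation", model="google/gemma-2b-it")  # or Llama, etc.

# Hypothetical AlloyDB / PostgreSQL connection inside the air-gapped environment.
conn = psycopg2.connect("dbname=search user=app host=alloydb.internal")


def index_documents(chunks: list[str]) -> None:
    """Embed text chunks and store them in a pgvector table."""
    with conn, conn.cursor() as cur:
        cur.execute("CREATE EXTENSION IF NOT EXISTS vector")
        cur.execute(
            "CREATE TABLE IF NOT EXISTS docs "
            "(id serial PRIMARY KEY, body text, embedding vector(384))"
        )
        for chunk in chunks:
            vec = embedder.encode(chunk).tolist()
            cur.execute(
                "INSERT INTO docs (body, embedding) VALUES (%s, %s)",
                (chunk, str(vec)),
            )


def answer(question: str, k: int = 4) -> str:
    """Retrieve the k nearest chunks and ask the local model to answer from them."""
    qvec = embedder.encode(question).tolist()
    with conn, conn.cursor() as cur:
        cur.execute(
            "SELECT body FROM docs ORDER BY embedding <-> %s::vector LIMIT %s",
            (str(qvec), k),
        )
        context = "\n".join(row[0] for row in cur.fetchall())
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}\nAnswer:"
    # The pipeline returns the prompt plus completion in "generated_text".
    return generator(prompt, max_new_tokens=200)[0]["generated_text"]
```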

[Savannah Peterson]

Shortening the adoption curve

That's wonderful. It shortens the adoption curve, I would assume, for a lot of these customers.

[Sachin Gupta]

It absolutely does. I mean, it's a Kubernetes-based service, very easy to load, very easy to run. You know, in the gen AI world, you can't be talking about years. I mean, you have to be talking about days and weeks in terms of how quickly innovation is moving, and so we want to make sure that if you need to update any one of those components, you can do it easily.

[Savannah Peterson]

Yeah, absolutely. I think it’s awesome. You mentioned customers. I know you’re in the spotlight this afternoon sharing some customer examples. Company I did not expect to be talking about as much this week as we already have today is McDonald’s.

[00:11:04]

McDonald’s use case

Yeah. Tell us a little bit about that use case.

[Sachin Gupta]

So McDonald’s, you know, we’ve had a great partnership with them. I’ve been working with Brian and Steve from there for a long time now, and it was great to sort of understand what they are trying to do as they re-imagine their store experience. For them, it really starts with the customer and crew experience, and so when they thought about modernizing the infrastructure in those restaurants, it was a great conversation on, hey, what should be in the cloud? What should remain in the restaurant? What kind of reliability do they expect? You know, the restaurants must continue running. The fryers must run. The menu systems must run. And it was great to sort of design with them what this cloud future looks like in those restaurants, and then, with Google Distributed Cloud with our connected offer, using, you know, 1RU servers, three of them in a cluster, in every single restaurant globally, it’s just so powerful.

[00:12:01]

And I think there’s two things that I just wanted to highlight there.

Operating and AI applications in Google Distributed Cloud

One is, operating that environment, I mean, it’s still quite legacy in many, many places, but we can bring a complete DevOps model to this. What’s your blueprint? What’s your policy? How do you segment your markets? How do you roll things out? So a lot of our customers love that about Google Distributed Cloud. Tens of thousands of locations, it has to be structured very, very simply for how you roll out, how you monitor. And then secondly, it’s about the AI applications we can enable. And so now, how do you do automated order taking? How can we recognize on your tray, did you actually place all the right food items? Because you want to prevent loss, but you also want to make sure that the customer is satisfied because they got everything they ordered correctly. And so a ton of applications that we can provide, and these are restaurant examples, but it extends to many, many verticals.

[Rob Strachey]

And some other things you were talking about, or that Thomas was talking about this morning, are some new form factors coming to this as well.

[00:13:04]

New form factors with NVIDIA

And again, with everybody, if you talk about AI and don't talk about NVIDIA, something seems wrong, but there are some form factors coming with NVIDIA as well.

[Sachin Gupta]

Yes. So we announced support for multiple things: L4 GPUs and H100 GPUs from NVIDIA. At the same time, we have 1RU servers and we have multi-rack support. And for the air-gapped environment, we also announced a tactical appliance. It's a small appliance, I think about 100 pounds or so, that can go on a vehicle, for example. And it's a full cloud inside this appliance. We also announced something called a CONEX design, which is racks that go in a container. So with your shipping container, you can actually deploy the container and bring up an entire cloud to service an area. We're really working with our customers on what they need, and we're going to continue delivering the right form factors for the different locations they need it at.

[Savannah Peterson]

Yeah, different environmental factors there, even.

Sustainability and AI

It may make sense in certain environments to have a shipping container as your server rack.

[00:14:03]

And on sustainability, I was walking the show floor earlier, and I saw that there was a big presentation on AI for sustainability, from the food chain to everything that we do that's so wasteful. I can imagine in the McDonald's instance you're reducing a lot of food waste as well. Yes. I would imagine that's always a challenge.

[Sachin Gupta]

Use cases for AI in various industries

Streamlining operations, increasing efficiency, reducing loss, and reducing cost are super important. Another example I'll give you is that a lot of the manufacturing companies are trying to use video vision detection to identify errors as products go through the line. And so if you can pull that off and not ship it to an end customer, rework the part, and do it automatically, it's extremely powerful; it improves customer satisfaction and reduces loss as well. A ton of use cases. I mean, fraud detection for financial customers. Speaking of sustainability, a lot of the energy companies are prone to cybersecurity attacks. Everybody is sort of after that critical national infrastructure, and so they tend to not share their data and have fully air-gapped systems.

[00:15:03]

But now we can give them data capabilities and AI capabilities that are fully air-gapped, so they can optimize how they're delivering energy and meet their own sustainability goals as an energy provider. So very excited about helping them in that journey.

[Rob Strachey]

Supportability aspect of Google Distributed Cloud

And I would assume this also has a supportability aspect to that, that is really helpful to these organizations, because even though we have historically low unemployment here in the States and everything like that, there’s still a difference between what it takes to actually do AI on-premise versus doing it in the cloud, and this must help that as well.

[Sachin Gupta]

Customer focus on data insights

I think, Robby, you bring up a great point. So many of my customer conversations are about, I want to get out of the business of managing infrastructure, managing databases, managing solution integration. And what I want to focus on is, what is the data that I care about, how do I securely gain insights from it that can help me innovate faster, improve my customer experience, lower my cost?

[00:16:05]

And I tell them, you know what my business is? It's to deliver them that complete architecture and solution, infrastructure, PaaS services, database services, AI services, and make it a fully managed solution, so they don't need to worry about that. That's not what they want to be in the business of. They want to focus on their own core businesses, and we're super happy to be able to help them there.

[Rob Strachey]

And I would say that that probably hits both sides of the fence that you’re talking here, on the networking side as well as the distributed cloud.

[Sachin Gupta]

Networking and distributed cloud

Yes, yes, absolutely. You know, sort of going back to the networking conversation, so often that is the hardest part when you're thinking about on-prem migration or connecting different data silos securely. It can sometimes be the hardest, and it can take the longest. Removing that, reducing that pain significantly, showing that we're open to working with other hyperscalers, showing leadership.

Data transfer upon cloud exit

For example, on data transfer upon cloud exit, we're the first hyperscaler to make that free. So showing that we're going to lead here has been super, super helpful to our customers.

[00:17:09]

[Rob Strachey]

Yeah, it just seems like you continue to have all of these things, and that really, I mean, it's paying off with the customers.

Orange customer example

I was going to say that Orange was talked about by Thomas as well this morning, and that was a really interesting use case.

[Sachin Gupta]

Yeah, I mean, Orange, again, 26 countries, everyone has their own jurisdiction, and nobody wants the data, call data and so on, to leave their country, but they need to apply AI to that data, they need to enhance customer experience. Orange is in the business of infrastructure, but telecom infrastructure, mobile infrastructure; they don't want to worry about cloud infrastructure, and so we're giving them Google Distributed Cloud in each of those 26 countries to provide those services so easily, and now they can focus on analytics and AI instead of worrying about the rest.

[00:18:00]

[Savannah Peterson]

That user experience is so much more enjoyable for them. I think that’s such a good thing.

Biggest risk to AI hype

Sachin, since you see quite a swath of the market across verticals, across nations, across the entire industry, what do you think is the biggest risk to the hype and excitement that we’re experiencing right now in AI?

[Sachin Gupta]

I think, that's a great question, and you're probably going to get lots of different answers to that. I think customers are trying to go from experimentation to real enterprise deployments, production use cases, and we talked about so many customers in the keynote who are already deploying applications this way. Thinking about security: is my data still secure? Am I making sure that I'm providing results that are completely grounded in reality? If you're using this to engage with your end customers, that engagement has to be high quality, and it needs to be correct. You cannot have hallucinations. Making sure we're taking care of data privacy and copyright, I think, is super important.

[00:19:06]

Addressing security, privacy, and performance concerns

Google makes this a huge priority, and we want to make sure that as you go from experimentation to production in enterprise applications, all the things you care about, compliance, security, privacy, performance, cost, we can help you with. I think the risk is that people may try to skip some steps there, maybe working with other players as well, and I'm really hoping that they follow a methodical approach. Our partners can help, and we can help with our own experiences here. We're also here to learn about what the challenges are, to actually make that vision a true reality with AI.

[Savannah Peterson]

Importance of methodical approach to AI adoption

I think that’s very well stated, and I think you’re right, skipping a step or taking a shortcut now has really big, massive impacts in the long term, especially with the size of these data sets and the infrastructure that we’re dealing with.

[Sachin Gupta]

Google’s AI innovations and expertise

I think what we’re saying is, we’re taking so many of our AI innovations and bringing them to our own applications at massive scale, like Workspace, and we can bring the knowledge

[00:20:02]

of how to do that securely and safely, with privacy, copyright, and performance at scale, plus the AI supercomputer that we talked about to get the best price performance, and sustainability, how you do this in a carbon-free or sustainable way. We can bring that expertise and really partner with our customers to help them with their objectives.

[Savannah Peterson]

I love that. Last big question for you, and then I’ve got one little lightning round.

Future hopes for Google Cloud

What do you hope you can say next time you're seated here as a CUBE alum with us, that you can't say yet, with the current ecosystem?

[Sachin Gupta]

That one's easy. For me, it's always about taking the innovation and technology we're bringing and demonstrating how we've helped customers. For the customers we haven't delivered solutions to or enabled yet, I want to be able to come back here and talk about more customers where we really helped them solve their own use cases, their own problems, and innovate faster with the innovation that Google Cloud is bringing. It starts with customers, and I'm hoping to talk about many more success stories when I come back.

[00:21:01]

[Savannah Peterson]

Well, we absolutely love that. We have a chair for them next time. We can even do a hot seat. We’ll get them running through.

[Sachin Gupta]

We would absolutely love to have that.

[Savannah Peterson]

McDonald’s order

Final question, just because I’m curious, and you’ve been working with them for a little while. What’s your McDonald’s order?

[Sachin Gupta]

What’s the? What’s your McDonald’s order? I’m vegetarian. Okay. I’m a little boring on this, but I really like their fries.

[Savannah Peterson]

I was going to ask how many fries were consumed during this duration of the partnership.

[Sachin Gupta]

We actually celebrated some of this with McDonald’s for everybody, so super, super excited about that. One of the interesting things is, if you go to their Chicago store, their primary store, every week it does a rotating menu of different countries in the world. Oh, cool. It is so cool. That’s so fun. I also travel to India, and the McDonald’s there, you get local, spicy, great options too. It’s wonderful.

Favorite McDonald’s orders

That’s great.

[Savannah Peterson]

That’s great. Rob, what’s your favorite McD’s order?

[Rob Strachey]

I’m Chicken McNuggets.

[Savannah Peterson]

You’re a nugget man.

[Rob Strachey]

I’m a nugget man. That is.

[Savannah Peterson]

You are a nugget man. I’m a nugget man. I’m a chicken sando girl with mustard and extra pickles.

[00:22:01]

Closing remarks and thank yous

In case anyone’s wondering, in case anyone wants to bring us this snack, Sachin, thank you so much for being here. Your insights are absolutely fantastic. I learned a lot. Rob, always a pleasure to have you on my left, and thank all of you for tuning in from home for our three days of live coverage here at Google Cloud Next in beautiful Las Vegas, Nevada. My name’s Savannah Peterson. You’re watching theCUBE, the leading source for enterprise tech news.