Google Cloud earns defense contract win for Anthos multi-cloud management tool

Google dropped out of the Pentagon’s JEDI cloud contract battle fairly early in the game, saying the project conflicted with its “AI Principles.” Today, however, the company announced a new seven-figure contract with the DoD’s Defense Innovation Unit (DIU), a big win for the cloud unit and CEO Thomas Kurian.

While the company would not get specific about the dollar figure, the new contract involves using Anthos, the multi-cloud management tool the company announced last year, to secure DIU’s multi-cloud environment. While the JEDI contract involves a single vendor, the DoD has always used solutions from all three major cloud vendors — Amazon, Microsoft and Google — and this solution will provide a way to monitor security across all three environments, according to the company.

“Multi-cloud is the future. The majority of commercial businesses run multi-cloud environments securely and seamlessly, and this is now coming to the federal government as well,” Mike Daniels, VP of Global Public Sector at Google Cloud told TechCrunch.

The idea is to manage security across the three environments with help from cloud security vendor Netskope, which is also part of the deal. “The multi-cloud solution will be built on Anthos, allowing DIU to run web services and applications across Google Cloud, Amazon Web Services and Microsoft Azure — while being centrally managed from the Google Cloud Console,” the company wrote in a statement.

Daniels says that while this is a deal with DIU, he could see it expanding to other parts of DoD. “This is a contract with the DIU, but our expectation is that the DoD will look at the project as a model for how to implement their own security posture.”

Google Cloud Platform remains way back in the cloud infrastructure pack in third place with around 8% market share. For context, AWS has around 33% market share and Microsoft has around 18%.

While JEDI, a $10 billion, winner-take-all prize, remains mired in controversy and an ongoing battle among the Pentagon, Amazon and Microsoft, this deal shows that the Defense Department is looking at advanced technology like Anthos to help it manage a multi-cloud world regardless of what happens with JEDI.


By Ron Miller

In spite of pandemic (or maybe because of it), cloud infrastructure revenue soars

It’s fair to say that even before the impact of COVID-19, companies had begun a steady march to the cloud. Maybe it wasn’t fast enough for AWS, as Andy Jassy made clear in his 2019 re:Invent keynote, but it was happening all the same, and the steady revenue increases across the cloud infrastructure market bore that out.

As we look at the most recent quarter’s earnings reports for the main players in the market, it seems the pandemic and the economic fallout have done little to slow that down. In fact, they may be contributing to its growth.

According to numbers supplied by Synergy Research, the cloud infrastructure market totaled $29 billion in revenue for Q1 2020.

Image Credit: Synergy Research

Synergy’s John Dinsdale, who has been watching this market for a long time, says the pandemic could be contributing modestly to some of that growth. He doesn’t necessarily see these companies getting through this unscathed either, but as companies shift operations out of offices, that shift could be part of the reason for the increased demand we saw in the first quarter.

“For sure, the pandemic is causing some issues for cloud providers, but in uncertain times, the public cloud is providing flexibility and a safe haven for enterprises that are struggling to maintain normal operations. Cloud provider revenues continue to grow at truly impressive rates, with AWS and Azure in aggregate now having an annual revenue run rate of well over $60 billion,” Dinsdale said in a statement.

AWS led the way with a third of the market, or more than $10 billion in quarterly revenue, as it continues to hold a substantial lead in market share. Microsoft was in second place, growing at a brisker 59% for 18% of the market. While Microsoft doesn’t break out its numbers, using Synergy’s figures, that would work out to around $5.2 billion in Azure revenue. Meanwhile, Google came in third with $2.78 billion.

If you’re keeping track of market share at home, it comes out to 32% for AWS, 18% for Microsoft and 8% for Google. This split has remained fairly steady, although Microsoft has managed to gain a few percentage points over the last several quarters as its overall growth rate outpaces Amazon’s.
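If you want to check the back-of-the-envelope math yourself, it's straightforward. The sketch below uses Synergy's ~$29 billion Q1 2020 total and the share estimates above; note the shares are rounded, so the implied dollar figures only roughly match the reported ones.

```python
# Back-of-the-envelope check of the Q1 2020 market-share math.
# Shares are rounded estimates, so implied revenue only roughly
# matches each vendor's reported figures.
total_q1 = 29.0  # $B, whole cloud infrastructure market (Synergy estimate)

shares = {"AWS": 0.32, "Microsoft": 0.18, "Google": 0.08}

# Implied quarterly revenue per vendor at those shares.
implied = {vendor: round(total_q1 * share, 2) for vendor, share in shares.items()}
print(implied)  # Microsoft's implied ~$5.2B matches the Azure estimate above

# Annualized run rate for AWS + Azure: quarterly revenue x 4.
run_rate = (10.0 + implied["Microsoft"]) * 4  # AWS reported >$10B quarterly
print(run_rate)  # comfortably "well over $60 billion", as Dinsdale says
```

Annualizing AWS's reported $10 billion-plus quarter together with Azure's implied $5.2 billion lands just above $60 billion, consistent with Dinsdale's run-rate comment.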


By Ron Miller

Google Cloud’s fully-managed Anthos is now generally available for AWS

A year ago, back in the days of in-person conferences, Google officially announced the launch of its Anthos multi-cloud application modernization platform at its Cloud Next conference. The promise of Anthos was always that it would allow enterprises to write their applications once, package them into containers and then manage their multi-cloud deployments across GCP, AWS, Azure and their on-prem data centers.

Until now, support for AWS and Azure was only available in preview, but today, the company is making support for AWS and on-premises generally available. Microsoft Azure support remains in preview, though.

“As an AWS customer now, or a GCP customer, or a multi-cloud customer, […] you can now run Anthos on those environments in a consistent way, so you don’t have to learn any proprietary APIs and be locked in,” Eyal Manor, the VP of engineering in charge of Anthos, told me. “And for the first time, we enable the portability between different infrastructure environments as opposed to what has happened in the past where you were locked into a set of API’s.”

Manor stressed that Anthos was designed to be multi-cloud from day one. As for why AWS support is launching ahead of Azure, Manor said that there was simply more demand for it. “We surveyed the customers and they said, hey, we want, in addition to GCP, we want AWS,” he said. But support for Azure will come later this year and the company already has a number of preview customers for it. In addition, Anthos will also come to bare metal servers in the future.

Looking even further ahead, Manor also noted that better support for machine learning workloads is on the way. Many businesses, after all, want to be able to update and run their models right where their data resides, no matter what cloud that may be. There, too, the promise of Anthos is that developers can write the application once and then run it anywhere.

“I think a lot of the initial response and excitement was from the developer audiences,” Jennifer Lin, Google Cloud’s VP of product management, told me. “Eric Brewer had led a white paper that we did to say that a lot of the Anthos architecture sort of decouples the developer and the operator stakeholder concerns. There hadn’t been a multi-cloud shared software architecture where we could do that and still drive emerging and existing applications with a common shared software stack.”

She also noted that a lot of Google Cloud’s ecosystem partners endorsed the overall Anthos architecture early on because they, too, wanted to be able to write once and run anywhere — and so do their customers.

Plaid is one of the launch partners for these new capabilities. “Our customers rely on us to be always available and as a result we have very high reliability requirements,” said Naohiko Takemura, Plaid’s head of engineering. “We pursued a multi-cloud strategy to ensure redundancy for our critical KARTE service. Google Cloud’s Anthos works seamlessly across GCP and our other cloud providers preventing any business disruption. Thanks to Anthos, we prevent vendor lock-in, avoid managing cloud-specific infrastructure, and our developers are not constrained by cloud providers.”

With this release, Google Cloud is also bringing deeper support for virtual machines to Anthos, as well as improved policy and configuration management.

Over the next few months, the Anthos Service Mesh will also add support for applications that run in traditional virtual machines. As Lin told me, “a lot of this is about driving better agility and taking the complexity out of it so that we have abstractions that work across any environment, whether it’s legacy or new or on-prem or AWS or GCP.”


By Frederic Lardinois

Etsy’s 2-year migration to the cloud brought flexibility to the online marketplace

Founded in 2005, Etsy was born before cloud infrastructure was even a thing.

As the company expanded, it managed all of its operations in the same way startups did in those days — using private data centers. But a couple of years ago, the online marketplace for crafts and vintage items decided to modernize and began its journey to the cloud.

That decision coincided with the arrival of CTO Mike Fisher in July 2017. He was originally brought in as a consultant to look at the impact of running data centers on Etsy’s ability to innovate. As you might expect, he concluded that it was having an adverse impact and began a process that would lead to him being hired to lead a long-term migration to the cloud.

That process concluded last month. This is the story of how a company born in data centers made the switch to the cloud, and the lessons it offers.

Stuck in a hardware refresh loop

When Fisher walked through the door, Etsy operated out of private data centers. It was not even taking advantage of a virtualization layer to maximize the capacity of each machine. The approach meant IT spent an inordinate amount of time on resource planning.


By Ron Miller

Google Cloud’s newest data center opens in Salt Lake City

Google Cloud announced today that its new data center in Salt Lake City has opened, making it the 22nd such center the company has opened to date.

The Salt Lake City data center marks the third in the company’s western U.S. region, joining Los Angeles and The Dalles, Oregon, with the goal of providing lower-latency compute power across the region.

“We’re committed to building the most secure, high-performance and scalable public cloud, and we continue to make critical infrastructure investments that deliver our cloud services closer to customers that need them the most,” Jennifer Chason, director of Google Cloud Enterprise for the Western States and Southern California, said in a statement.

Cloud vendors in general are trying to open more locations closer to potential customers. This is similar to the approach AWS took when it announced its LA Local Zone at AWS re:Invent last year. The idea is to reduce latency by moving compute resources closer to the companies that need them, or to spread workloads across a set of regional resources.

Google also announced that PayPal, a company that was already a customer, has signed a multi-year contract, and will be moving parts of its payment systems into the western region. It’s worth noting that Salt Lake City is also home to a thriving startup scene that could benefit from having a data center located close by.

Alphabet, Google Cloud’s parent company, recently shared the cloud division’s quarterly earnings for the first time, indicating that it is on a run rate of more than $10 billion. While it still has a long way to go to catch rivals Microsoft and Amazon, expanding its reach in this fashion could help grow that market share.


By Ron Miller

Google Cloud opens its Seoul region

Google Cloud today announced that its new Seoul region, its first in Korea, is now open for business. The region, which it first talked about last April, will feature three availability zones and support for virtually all of Google Cloud’s standard services, ranging from Compute Engine to BigQuery, Bigtable and Cloud Spanner.

With this, Google Cloud now has a presence in 16 countries and offers 21 regions with a total of 64 zones. The Seoul region (with the memorable name of asia-northeast3) will complement Google’s other regions in the area, including two in Japan, as well as regions in Hong Kong and Taiwan, but the obvious focus here is on serving Korean companies with low-latency access to its cloud services.

“As South Korea’s largest gaming company, we’re partnering with Google Cloud for game development, infrastructure management, and to infuse our operations with business intelligence,” said Chang-Whan Sul, the CTO of Netmarble. “Google Cloud’s region in Seoul reinforces its commitment to the region and we welcome the opportunities this initiative offers our business.”

Over the course of this year, Google Cloud also plans to open more zones and regions in Salt Lake City, Las Vegas and Jakarta, Indonesia.


By Frederic Lardinois

Google closes $2.6B Looker acquisition

When Google announced that it was acquiring data analytics startup Looker for $2.6 billion, it was a big deal on a couple of levels. It was a lot of money and it represented the first large deal under the leadership of Thomas Kurian. Today, the company announced that deal has officially closed and Looker is part of the Google Cloud Platform.

While Kurian was happy to announce that Looker was officially part of the Google family, he made it clear in a blog post that the analytics arm would continue to support multiple cloud vendors beyond Google.

“Google Cloud and Looker share a common philosophy around delivering open solutions and supporting customers wherever they are—be it on Google Cloud, in other public clouds, or on premises. As more organizations adopt a multi-cloud strategy, Looker customers and partners can expect continued support of all cloud data management systems like Amazon Redshift, Azure SQL, Snowflake, Oracle, Microsoft SQL Server and Teradata,” Kurian wrote.

As is typical in a deal like this, Looker CEO Frank Bien sees the much larger Google giving his company the resources to grow much faster than it could have on its own. “Joining Google Cloud provides us better reach, strengthens our resources, and brings together some of the best minds in both analytics and cloud infrastructure to build an exciting path forward for our customers and partners. The mission that we undertook seven years ago as Looker takes a significant step forward beginning today,” Bien wrote in his post.

At the time the deal was announced in June, Google shared a slide showing where Looker fits into what it calls its “Smart Analytics Platform,” which provides ways to process, understand, analyze and visualize data. Looker fills a spot in the visualization layer of that stack while continuing to support other clouds.

Slide: Google

Looker was founded in 2011 and raised more than $280 million, according to Crunchbase. Investors included Redpoint, Meritech Capital Partners, First Round Capital, Kleiner Perkins, CapitalG and PremjiInvest. The last deal before the acquisition was a $103 million Series E investment on a $1.6 billion valuation in December 2018.


By Ron Miller

Google Cloud gets a Secret Manager

Google Cloud today announced Secret Manager, a new tool that helps its users securely store their API keys, passwords, certificates and other data. With this, Google Cloud is giving its users a single tool to manage this kind of data and a centralized source of truth, something that even sophisticated enterprise organizations often lack.

“Many applications require credentials to connect to a database, API keys to invoke a service, or certificates for authentication,” Google developer advocate Seth Vargo and product manager Matt Driscoll wrote in today’s announcement. “Managing and securing access to these secrets is often complicated by secret sprawl, poor visibility, or lack of integrations.”

With Berglas, Google already offered an open-source command-line tool for managing secrets. Secret Manager and Berglas will play well together and users will be able to move their secrets from the open-source tool into Secret Manager and use Berglas to create and access secrets from the cloud-based tool as well.

With KMS, Google also offers a fully managed key management system (as do Google Cloud’s competitors). The two tools are very much complementary. As Google notes, KMS does not actually store the secrets — it encrypts the secrets you store elsewhere. Secret Manager provides a way to easily store (and manage) these secrets in Google Cloud.

Secret Manager includes the necessary tools for managing secret versions and audit logging, for example. Secrets in Secret Manager are also project-based global resources, the company stresses, while competing tools often manage secrets on a regional basis.
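To make the division of labor concrete: KMS encrypts bytes you store elsewhere, while Secret Manager is the storage layer itself, with named secrets, immutable numbered versions and audit logging. The sketch below is a toy, in-memory illustration of that versioned-secret model; it is not the actual Secret Manager API, and names like `SecretStore` are invented for illustration.

```python
# Toy illustration of a versioned secret store, loosely modeled on the
# "named secret -> immutable numbered versions" shape described above.
# This is NOT the real google-cloud-secretmanager client API.

class SecretStore:
    def __init__(self):
        self._secrets = {}   # secret name -> append-only list of versions
        self.audit_log = []  # every add and access is recorded

    def add_version(self, name, payload: bytes) -> int:
        """Rotating a secret never overwrites it; it adds a new version."""
        versions = self._secrets.setdefault(name, [])
        versions.append(payload)
        self.audit_log.append(("add", name, len(versions)))
        return len(versions)  # version numbers start at 1

    def access(self, name, version="latest") -> bytes:
        versions = self._secrets[name]
        idx = len(versions) - 1 if version == "latest" else version - 1
        self.audit_log.append(("access", name, idx + 1))
        return versions[idx]

store = SecretStore()
store.add_version("db-password", b"hunter2")
store.add_version("db-password", b"correct-horse")  # rotation = new version
print(store.access("db-password"))             # latest version
print(store.access("db-password", version=1))  # old version still readable
```

The point of the model is that rotation is non-destructive (old versions remain addressable for rollback) and every read leaves an audit trail, which is what makes a central secret store more manageable than credentials scattered across config files.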

The new tool is now in beta and available to all Google Cloud customers.


By Frederic Lardinois

Google acquires AppSheet to bring no-code development to Google Cloud

Google announced today that it is buying AppSheet, an eight-year-old no-code mobile application building platform. The startup had raised over $17 million on a $60 million valuation, according to PitchBook data. The companies did not share the purchase price.

With AppSheet, Google gets a simple way for companies to build mobile apps without having to write a line of code. It works by pulling data from a spreadsheet, database or form, and using the field or column names as the basis for building an app.
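The core trick described here, deriving an app's input form from the column names of an existing data source, is simple to sketch. The snippet below is a hypothetical illustration of the idea, not AppSheet's actual implementation.

```python
# Hypothetical sketch of the "column names become app fields" idea.
# This illustrates the concept only; it is not AppSheet's code.
import csv
import io

# A spreadsheet-like source: the header row names the columns.
sheet = io.StringIO(
    "customer,order_date,quantity,notes\n"
    "Acme,2020-01-15,3,rush order\n"
)

rows = list(csv.DictReader(sheet))
columns = list(rows[0].keys())

# Derive a (very naive) form schema: one input field per column,
# with a human-friendly label generated from the column name.
form_fields = [{"label": col.replace("_", " ").title(), "key": col}
               for col in columns]
for field in form_fields:
    print(field)
# Each data row then becomes one record editable through that form.
```

In a real no-code platform there is far more on top of this (type inference, validation, relationships between sheets), but the starting point is the same: the data's own structure bootstraps the app's UI.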

AppSheet already integrates with Google Cloud products, including Google Sheets and Google Forms, but it also works with other tools, including AWS DynamoDB, Salesforce, Office 365, Box and others. Google says it will continue to support these other platforms, even after the deal closes.

As Amit Zavery wrote in a blog post announcing the acquisition, it’s about giving everyone a chance to build mobile applications, even companies lacking traditional developer resources to build a mobile presence. “This acquisition helps enterprises empower millions of citizen developers to more easily create and extend applications without the need for professional coding skills,” he wrote.

In a story we hear repeatedly from startup founders, Praveen Seshadri, co-founder and CEO at AppSheet, sees an opportunity to expand his platform and market reach under Google in ways he couldn’t as an independent company.

“There is great potential to leverage and integrate more deeply with many of Google’s amazing assets like G Suite and Android to improve the functionality, scale, and performance of AppSheet. Moving forward, we expect to combine AppSheet’s core strengths with Google Cloud’s deep industry expertise in verticals like financial services, retail, and media and entertainment,” he wrote.

Google sees this acquisition as extending its development philosophy with no-code working alongside workflow automation, application integration and API management.

No-code tools like AppSheet are not going to replace sophisticated development environments, but they will give companies that might not otherwise have a mobile app the ability to put something decent out there.


By Ron Miller

Google makes converting VMs to containers easier with the GA of Migrate for Anthos

At its Cloud Next event in London, Google today announced a number of product updates around its managed Anthos platform, as well as Apigee and its Cloud Code tools for building modern applications that can then be deployed to Google Cloud or any Kubernetes cluster.

Anthos is one of the most important recent launches for Google, as it expands the company’s reach outside of Google Cloud and into its customers’ data centers and, increasingly, edge deployments. At today’s event, the company announced that it is taking Anthos Migrate out of beta and into general availability. The overall idea behind Migrate is that it allows enterprises to take their existing, VM-based workloads and convert them into containers. Those machines could come from on-prem environments, AWS, Azure or Google’s Compute Engine, and — once converted — can then run in Anthos GKE, the Kubernetes service that’s part of the platform.

“That really helps customers think about a leapfrog strategy, where they can maintain the existing VMs but benefit from the operational model of Kubernetes,” Google VP of product management Jennifer Lin told me. “So even though you may not get all of the benefits of a cloud-native container day one, what you do get is consistency in the operational paradigm.”

As for Anthos itself, Lin tells me that Google is seeing some good momentum. The company is highlighting a number of customers at today’s event, including Germany’s Kaeser Kompressoren and Turkey’s Denizbank.

Lin noted that a lot of financial institutions are interested in Anthos. “A lot of the need to do data-driven applications, that’s where Kubernetes has really hit that sweet spot because now you have a number of distributed datasets and you need to put a web or mobile front end on [them],” she explained. “You can’t do it as a monolithic app, you really do need to tap into a number of datasets — you need to do real-time analytics and then present it through a web or mobile front end. This really is a sweet spot for us.”

Also new today is the general availability of Cloud Code, Google’s set of extensions for IDEs like Visual Studio Code and IntelliJ that helps developers build, deploy and debug their cloud-native applications more quickly. The idea here, of course, is to remove friction from building containers and deploying them to Kubernetes.

In addition, Apigee hybrid is now also generally available. This tool makes it easier for developers and operators to manage their APIs across hybrid and multi-cloud environments, a challenge that is becoming increasingly common for enterprises. With it, teams can deploy Apigee’s API runtimes in hybrid environments and still get the benefits of Apigee’s monitoring and analytics tools in the cloud. Apigee hybrid, of course, can also be deployed to Anthos.


By Frederic Lardinois

Google will soon open a cloud region in Poland

Google today announced its plans to open a new cloud region in Warsaw, Poland to better serve its customers in Central and Eastern Europe.

This move is part of Google’s overall investment in expanding the physical footprint of its data centers. Only a few days ago, after all, the company announced that, in the next two years, it would spend $3.3 billion on its data center presence in Europe alone.

Google Cloud currently operates 20 different regions with 61 availability zones. Warsaw, like most of Google’s regions, will feature three availability zones and launch with all the standard core Google Cloud services, including Compute Engine, App Engine, Google Kubernetes Engine, Cloud Bigtable, Cloud Spanner, and BigQuery.

To launch the new region in Poland, Google is partnering with Domestic Cloud Provider (a.k.a. Chmury Krajowej, which itself is a joint venture of the Polish Development Fund and PKO Bank Polski). Domestic Cloud Provider (DCP) will become a Google Cloud reseller in the country and build managed services on top of Google’s infrastructure.

“Poland is in a period of rapid growth, is accelerating its digital transformation, and has become an international software engineering hub,” writes Google Cloud CEO Thomas Kurian. “The strategic partnership with DCP and the new Google Cloud region in Warsaw align with our commitment to boost Poland’s digital economy and will make it easier for Polish companies to build highly available, meaningful applications for their customers.”



By Frederic Lardinois

Google Cloud makes it easier to set up continuous delivery with Spinnaker

Google Cloud today announced Spinnaker for Google Cloud Platform, a new solution that makes it easier to install and run the Spinnaker continuous delivery (CD) service on Google’s cloud.

Spinnaker was created inside Netflix and is now jointly developed by Netflix and Google. Netflix open-sourced it back in 2015 and over the course of the last few years, it became the open-source CD platform of choice for many enterprises. Today, companies like Adobe, Box, Cisco, Daimler, Samsung and others use it to speed up their development process.

With Spinnaker for Google Cloud Platform, which runs on the Google Kubernetes Engine, Google is making the install process for the service as easy as a few clicks. Once up and running, the Spinnaker install includes all of the core tools, as well as Deck, the user interface for the service. Users pay for the resources used by the Google Kubernetes Engine, as well as Cloud Memorystore for Redis, Google Cloud Load Balancing and potentially other resources they use in the Google Cloud.


The company has pre-configured Spinnaker for testing and deploying code on Google Kubernetes Engine, Compute Engine and App Engine, though it will also work with any other public or on-prem cloud. It’s also integrated with Cloud Build, Google’s recently launched continuous integration service, and features support for automatic backups and integrated auditing and monitoring with Google’s Stackdriver.

“We want to make sure that the solution is great both for developers and DevOps or SRE teams,” says Matt Duftler, tech lead for Google’s Spinnaker effort, in today’s announcement. “Developers want to get moving fast with the minimum of overhead. Platform teams can allow them to do that safely by encoding their recommended practice into Spinnaker, using Spinnaker for GCP to get up and running quickly and start onboarding development teams.”



By Frederic Lardinois

We’ll talk even more Kubernetes at TC Sessions: Enterprise with Microsoft’s Brendan Burns and Google’s Tim Hockin

You can’t go to an enterprise conference these days without talking containers — and specifically the Kubernetes container management system. It’s no surprise, then, that we’ll do the same at our inaugural TC Sessions: Enterprise event on September 5 in San Francisco. As we already announced last week, Kubernetes co-founder Craig McLuckie and Aparna Sinha, Google’s director of product management for Kubernetes, will join us to talk about the past, present and future of containers in the enterprise.

In addition, we can now announce that two other Kubernetes co-founders will join us: Google principal software engineer Tim Hockin, who currently works on Kubernetes and the Google Container Engine, and Microsoft distinguished engineer Brendan Burns, who was the lead engineer for Kubernetes during his time at Google.

With this, we’ll have three of the four Kubernetes co-founders onstage to talk about the five-year-old project.

Before joining the Kubernetes effort, Hockin worked on internal Google projects like Borg and Omega, as well as the Linux kernel. On the Kubernetes project, he worked on core features and early design decisions involving networking, storage, node, multi-cluster, resource isolation and cluster sharing.

While his colleagues Craig McLuckie and Joe Beda decided to parlay their work on Kubernetes into a startup, Heptio, which they then successfully sold to VMware for about $550 million, Burns took a different route and joined the Microsoft Azure team three years ago.

I can’t think of a better group of experts to talk about the role that Kubernetes is playing in reshaping how enterprises build software.

If you want a bit of a preview, here is my conversation with McLuckie, Hockin and Microsoft’s Gabe Monroy about the history of the Kubernetes project.

Early-Bird tickets are now on sale for $249; students can grab a ticket for just $75. Book your tickets here before prices go up.


By Frederic Lardinois

AWS remains in firm control of the cloud infrastructure market

It has to be a bit depressing to be in the cloud infrastructure business if your name isn’t Amazon. Sure, there’s a huge, growing market, and the companies behind Amazon are growing even faster. Yet it seems no matter how fast they grow, Amazon remains a dot on the horizon.

It seems inconceivable that AWS can continue to hold sway over such a large market for so long, but as we’ve pointed out before, it has been able to maintain its position through a true first-mover advantage. The other players didn’t even show up until several years after Amazon launched its first service in 2006, and they are paying the price for failing to see, as Amazon did, how computing would change.

They certainly see it now, whether it’s IBM, Microsoft or Google, or Tencent and Alibaba, both of which are growing fast in the China/Asia markets. All of these companies are trying to find the formula to help differentiate themselves from AWS and give them some additional market traction.

Cloud market growth

Interestingly, even though companies have begun to move with increasing urgency to the cloud, the pace of growth slowed a bit in the first quarter to a 42 percent rate, according to data from Synergy Research, but that doesn’t mean the end of this growth cycle is anywhere close.


By Ron Miller

Google Cloud makes some strong moves to differentiate itself from AWS and Microsoft

Google Cloud held its annual customer conference, Google Cloud Next, this week in San Francisco. It had a couple of purposes. For starters, it could introduce customers to new CEO Thomas Kurian for the first time since his hiring at the end of last year. And second, and perhaps more importantly, it could demonstrate a value proposition distinct from those of AWS and Microsoft.

Kurian’s predecessor, Diane Greene, was fond of saying that it was still early days for the cloud market, and she’s still right, but while the pie has continued to grow substantially, Google’s share of the market has stayed stubbornly in single digits. It needed to use this week’s conference as at least a springboard to showcase its strengths.

Its lack of commercial cloud market clout has always been a bit of a puzzler. This is Google after all. It runs Google Search and YouTube and Google Maps and Google Docs. These are massive services that rarely go down. You would think being able to run these massive services would translate into massive commercial success, but so far it hasn’t.

Missing ingredients

Even though Greene brought her own considerable enterprise cred to GCP, having been a co-founder at VMware, the company that really made the cloud possible by popularizing the virtual machine, she wasn’t able to significantly change the company’s commercial cloud fortunes.

In a conversation with TechCrunch’s Frederic Lardinois, Kurian talked about missing ingredients like having people to talk to (or maybe a throat to choke). “A number of customers told us ‘we just need more people from you to help us.’ So that’s what we’ll do,” Kurian told Lardinois.

But of course, it’s never just one thing when it comes to a market as complex as cloud infrastructure. Sure, you can add more bodies in customer support or sales, more aggressively pursue high-value enterprise customers, or address whatever else Kurian has identified as holes in GCP’s approach up until now, but it still requires a compelling story, and Google took a big step toward having the ingredients for a new one this week.

Changing position

Google is trying to position itself in the same way as any cloud vendor going after AWS: selling itself as the hybrid cloud company that can help with your digital transformation. It’s a common strategy, but Google did more than throw out the usual talking points this week. It walked the walk, too.

For starters, it introduced Anthos, a single tool to manage your workloads wherever they live, even in a rival cloud. This is a big deal, and if it works as described it does give that new beefed-up sales team at Google Cloud a stronger story to tell around integration. As my colleague, Frederic Lardinois described it:

“So with Anthos, Google will offer a single managed service that will let you manage and deploy workloads across clouds, all without having to worry about the different environments and APIs. That’s a big deal and one that clearly delineates Google’s approach from its competitors’. This is Google, after all, managing your applications for you on AWS and Azure,” he wrote.

AWS hasn’t made many friends in the open-source community of late, and Google reiterated that it is going to be the platform that is friendly to open-source projects. To that end, it announced a number of major partnerships.

Finally, the company took a serious look at verticals, putting together packages of Google Cloud services designed specifically for a given industry. As an example, it assembled a package for retailers that includes special services to help keep sites up and running during peak demand, recommendation tools (“if you like this, you might be interested in these items”), contact center AI and other tools specifically geared toward the retail market. You can expect the company to do more of this to make the platform more attractive to a given market space.

Photo: Michael Short/Bloomberg via Getty Images

All of this and more (way too much to summarize in one article) was exactly what Google Cloud needed to do this week. Now comes the hard part: the company has come up with some good ideas, and it has to go out and sell them.

Nobody has ever claimed that Google lacked good technology. That has always been an obvious strength, but the company has struggled to translate it into substantial market share. That is Kurian’s challenge. As Greene used to say, in baseball terms, it’s still early innings. And it really still is, but the game is starting to move along, and Kurian needs to get the team moving in the right direction if it expects to be competitive.


By Ron Miller