Equinix is buying 13 data centers from Bell Canada for $750M

Equinix, the data center company, has the distinction of recently recording its 69th straight positive quarter. One way that it has achieved that kind of revenue consistency is through strategic acquisitions. Today, the company announced that it’s purchasing 13 data centers from Bell Canada for $750 million, greatly expanding its footprint in the country.

Equinix detailed the deal’s financials along two axes: what the data centers cost relative to revenue, and relative to adjusted profit. On the revenue side, Equinix notes that it is paying $750 million for what it estimates to be $105 million in “annualized revenue,” calculated by multiplying the most recent quarter’s results by four. That gives the purchase a revenue multiple of a little over 7x.

Equinix also provided an adjusted profit multiple, saying that the 13 data center locations “[represent] a purchase multiple of approximately 15x EV / adjusted EBITDA.” Unpacking that, the company is saying that the asset’s enterprise value (similar to market capitalization, a popular valuation metric for public companies) is worth about 15 times its earnings before interest, taxes, depreciation and amortization (EBITDA). That seems a healthy price, but not an outrageous one.
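As a back-of-the-envelope check, both multiples fall out of simple arithmetic. A quick sketch (the implied EBITDA figure is derived from the 15x multiple, not a number Equinix disclosed):

```python
# Sanity-check the multiples Equinix disclosed for the Bell deal.
price = 750_000_000               # purchase price, USD
annualized_revenue = 105_000_000  # most recent quarter's revenue x 4

revenue_multiple = price / annualized_revenue
print(f"Revenue multiple: {revenue_multiple:.2f}x")  # -> 7.14x

# Working the 15x EV/adjusted-EBITDA multiple backwards gives the
# implied annual adjusted EBITDA of the acquired data centers.
implied_ebitda = price / 15
print(f"Implied adjusted EBITDA: ${implied_ebitda / 1e6:.0f}M")  # -> $50M
```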

Global reach of Equinix including expanded Canadian operations shown in left panel. Image: Equinix

The acquisition not only gives the company that additional revenue and a stronger foothold in the 10th largest economy in the world, it also brings 600 customers that use the Bell data centers, 500 of which are net new to Equinix.

As much of the world is attempting to digitally transform in the midst of the pandemic and current economic crisis, Equinix sees this as an opportunity to help more Canadian customers go digital more quickly.

“Equinix has been serving the Canadian market in Toronto for more than a decade. This expansion and scale gives the Canadian market a clear and rapid migration path to digital transformation. We’re looking forward to deepening our relationships with our existing Canada-based customers and helping new companies throughout the country position themselves for digital success,” Jon Lin, Equinix President, Americas told TechCrunch.

This is not the first time that Equinix has taken a batch of data centers off a telco’s hands. In fact, three years ago, the company bought 29 centers from Verizon (which owns TechCrunch) for $3.6 billion.

As telcos move away from the data center business, companies like Equinix are able to come in, expand into new markets and increase revenue. It’s one of the ways the company keeps growing revenue quarter after quarter.

Today’s deal is just part of that strategy of expanding into new markets and finding new ways to generate additional revenue as more companies use its services. Equinix rents space in its data centers and provides all the services companies need without their having to run facilities of their own, including things like heating, cooling, racks and wiring.

Even though public cloud companies like Amazon, Microsoft and Google are generating headlines with growing revenues, plenty of companies still want to run their own equipment without going to the expense of actually owning the building where the equipment resides.

Today’s deal is expected to close in the second half of the year, assuming it clears all of the regulatory scrutiny required in a purchase like this one.


By Ron Miller

Equinix just recorded its 69th straight positive quarter

There’s something to be said for consistency through good times and bad, and one company that has had a staggeringly consistent track record is international data center vendor, Equinix. It just recorded its 69th straight positive quarter, according to the company.

That’s an astonishing record, and covers over 17 years of positive returns. That means this streak goes back to 2003. Not too shabby.

The company had a decent quarter, too. Even in the middle of an economic mess, revenue was still up 6% year over year to $1.445 billion, and up 2% over last quarter. The company runs data centers where companies can rent space for their servers. Equinix handles all of the infrastructure, providing racks, wiring and cooling, and customers can purchase as many racks as they need.

If you’re managing your own servers for even part of your workload, it can be much more cost-effective to rent space from a vendor like Equinix than trying to run a facility on your own.

Among its new customers this quarter are Zoom, which is buying capacity all over the place, having also announced a partnership with Oracle earlier this month, and TikTok. Both of those companies deal in video and require lots of different types of resources to keep things running.

This report comes against a backdrop of a huge increase in resource demand for certain sectors like streaming video and video conferencing, with millions of people working and studying at home or looking for distractions.

And if you’re wondering whether the company can keep it going, it believes it can. Its guidance calls for 2020 revenue of between $5.877 billion and $5.985 billion, a 6-8% increase over the previous year.
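As a quick consistency check, the two ends of the guidance range should imply roughly the same prior-year base. A sketch (the roughly $5.54 billion implied 2019 figure is derived from the guidance, not quoted by the company):

```python
# Check that the 2020 guidance range and the stated 6-8% growth agree.
guidance_low, guidance_high = 5.877e9, 5.985e9
growth_low, growth_high = 0.06, 0.08

implied_base_low = guidance_low / (1 + growth_low)
implied_base_high = guidance_high / (1 + growth_high)

# Both endpoints should point back at nearly the same 2019 revenue.
print(f"Implied 2019 revenue: ${implied_base_low / 1e9:.2f}B "
      f"and ${implied_base_high / 1e9:.2f}B")
```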

You could call Equinix the anti-IBM. At one point, Big Blue recorded 22 straight quarters of declining revenue, an ignominious streak that stretched from 2012 to 2018 before it found a way to stop the bleeding.

When you consider that Equinix’s streak includes 2008-2010, the last time the economy hit the skids, the record becomes even more impressive, and certainly one worth pointing out.


By Ron Miller

Microsoft to open first data center in New Zealand as cloud usage grows

In spite of being in the midst of a pandemic sowing economic uncertainty, one area that continues to thrive is cloud computing. Perhaps that explains why Microsoft, which saw Azure grow 59% in its most recent earnings report, announced plans to open a new data center in New Zealand once it receives approval from the Overseas Investment Office.

“This significant investment in New Zealand’s digital infrastructure is a testament to the remarkable spirit of New Zealand’s innovation and reflects how we’re pushing the boundaries of what is possible as a nation,” Vanessa Sorenson, general manager at Microsoft New Zealand said in a statement.

The company sees the project against a backdrop of accelerating digital transformation, as the pandemic forces companies to move to the cloud more quickly while employees are spread out and unable to work in offices around the world.

As CEO Satya Nadella noted on Twitter, this should help companies in New Zealand that are in the midst of this transformation. “Now more than ever, we’re seeing the power of digital transformation, and today we’re announcing a new datacenter region in New Zealand to help every organization in the country build their own digital capability,” Nadella tweeted.

The company wants to do more than simply build a data center. It will make this part of a broader investment across the country, including skills training and reducing the environmental footprint of the data center.

Once New Zealand comes on board, the company will boast 60 regions covering 140 countries around the world. The new data center won’t just be about Azure, either. It will help fuel usage of Office 365 and the Dynamics 365 back-office products, as well.


By Ron Miller

Google Cloud’s newest data center opens in Salt Lake City

Google Cloud announced today that its new data center in Salt Lake City has opened, making it the 22nd such center the company has opened to date.

The Salt Lake City data center marks the third in the western region, joining Los Angeles and The Dalles, Oregon, with the goal of providing lower-latency compute power across the region.

“We’re committed to building the most secure, high-performance and scalable public cloud, and we continue to make critical infrastructure investments that deliver our cloud services closer to customers that need them the most,” Jennifer Chason, director of Google Cloud Enterprise for the Western States and Southern California, said in a statement.

Cloud vendors in general are trying to open more locations closer to potential customers. AWS took a similar approach when it announced its LA Local Zone at AWS re:Invent last year. The idea is to reduce latency by moving compute resources closer to the companies that need them, or to spread workloads across a set of regional resources.

Google also announced that PayPal, a company that was already a customer, has signed a multi-year contract, and will be moving parts of its payment systems into the western region. It’s worth noting that Salt Lake City is also home to a thriving startup scene that could benefit from having a data center located close by.

Google Cloud’s parent company Alphabet recently shared the cloud division’s quarterly earnings for the first time, indicating that it is on a run rate of more than $10 billion. While Google Cloud still has a long way to go to catch rivals Microsoft and Amazon, expanding its reach in this fashion could help grow its market share.


By Ron Miller

Edge computing startup Pensando comes out of stealth mode with a total of $278 million in funding

Pensando, an edge computing startup founded by former Cisco engineers, came out of stealth mode today with an announcement that it has raised a $145 million Series C. The company’s software and hardware technology, created to give data centers more of the flexibility of cloud computing servers, is being positioned as a competitor to Amazon Web Services Nitro.

The round was led by Hewlett Packard Enterprise and Lightspeed Venture Partners and brings Pensando’s total raised so far to $278 million. HPE chief technology officer Mark Potter and Lightspeed Venture partner Barry Eggers will join Pensando’s board of directors. The company’s chairman is former Cisco CEO John Chambers, who is also one of Pensando’s investors through JC2 Ventures.

Pensando was founded in 2017 by Mario Mazzola, Prem Jain, Luca Cafiero and Soni Jiandani, a team of engineers who spearheaded the development of several of Cisco’s key technologies and founded four startups that were acquired by Cisco, including Insieme Networks. (In an interview with Reuters, Pensando chief financial officer Randy Pond, a former Cisco executive vice president, said it isn’t clear if Cisco is interested in acquiring the startup, adding “our aspirations at this point would be to IPO. But, you know, there’s always other possibilities for monetization events.”)

The startup claims its edge computing platform performs five to nine times better than AWS Nitro in terms of productivity and scale. Pensando prepares data center infrastructure for edge computing, better equipping it to handle data from 5G, artificial intelligence and Internet of Things applications. While in stealth mode, Pensando acquired customers including HPE, Goldman Sachs, NetApp and Equinix.

In a press statement, Potter said “Today’s rapidly transforming, hyper-connected world requires enterprises to operate with even greater flexibility and choices than ever before. HPE’s expanding relationship with Pensando Systems stems from our shared understanding of enterprises and the cloud. We are proud to announce our investment and solution partnership with Pensando and will continue to drive solutions that anticipate our customers’ needs together.”


By Catherine Shu

Google is investing $3.3B to build clean data centers in Europe

Google announced today that it is investing 3 billion euros (approximately $3.3 billion) to expand its data center presence in Europe. What’s more, the company pledged that the data centers would be environmentally friendly.

This new investment comes on top of the $7 billion the company has invested in the EU since 2007, but today’s announcement was as much about Google’s commitment to running its data centers on clean energy as about the data centers themselves.

In a blog post announcing the new investment, CEO Sundar Pichai made it clear that the company is focused on running these data centers on carbon-free fuels, pointing out that he was in Finland today to discuss building sustainable economic development, in conjunction with a carbon-free future, with Prime Minister Antti Rinne.

Of the 3 billion euros the company plans to spend, it will invest 600 million to expand its presence in Hamina, Finland, which Pichai wrote “serves as a model of sustainability and energy efficiency for all of our data centers.” The company also announced 18 new renewable energy deals earlier this week, encompassing a total of 1,600 megawatts in the US, South America and Europe.

In the blog post, Pichai outlined how the new data center projects in Europe would include some of these previously announced projects:

Today I’m announcing that nearly half of the megawatts produced will be here in Europe, through the launch of 10 renewable energy projects. These agreements will spur the construction of more than 1 billion euros in new energy infrastructure in the EU, ranging from a new offshore wind project in Belgium, to five solar energy projects in Denmark, and two wind energy projects in Sweden. In Finland, we are committing to two new wind energy projects that will more than double our renewable energy capacity in the country, and ensure we continue to match almost all of the electricity consumption at our Finnish data center with local carbon-free sources, even as we grow our operations.

The company is also investing in skills training, so people have the tools to handle the new types of jobs these data centers and other high tech employers will require. The company claims it has previously trained 5 million people in Europe for free in crucial digital skills, and recently opened a Google skills hub in Helsinki.

It’s obviously not a coincidence that the company is making an announcement related to clean energy on Global Climate Strike Day, when people around the world are walking out of schools and off their jobs to encourage world leaders and businesses to take action on the climate crisis. Google is attempting to answer the call with these announcements.


By Ron Miller

Why AWS gains big storage efficiencies with E8 acquisition

AWS is already the clear market leader in the cloud infrastructure market, but it’s never been an organization that rests on its past successes, whether that means a flurry of new product announcements and enhancements every year or strategic acquisitions.

When it bought Israeli storage startup E8 yesterday, it might have felt like a minor move on its face, but AWS was looking, as it always does, to find an edge and reduce the costs of operations in its data centers. It was also very likely looking forward to the next phase of cloud computing. Reports have pegged the deal at between $50 and $60 million.

What E8 gives AWS for relatively little money is highly advanced storage capability, says Steve McDowell, senior storage analyst at Moor Insights & Strategy. “E8 built a system that delivers extremely high-performance/low-latency flash (and Optane) in a shared-storage environment,” McDowell told TechCrunch.


By Ron Miller

Fungible raises $200 million led by SoftBank Vision Fund to help companies handle increasingly massive amounts of data

Fungible, a startup that wants to help data centers cope with the increasingly massive amounts of data produced by new technologies, has raised a $200 million Series C led by SoftBank Vision Fund, with participation from Norwest Venture Partners and its existing investors. As part of the round, SoftBank Investment Advisers senior managing partner Deep Nishar will join Fungible’s board of directors.

Founded in 2015, Fungible now counts about 200 employees and has raised more than $300 million in total funding. Its other investors include Battery Ventures, Mayfield Fund, Redline Capital and Walden Riverwood Ventures. Its new capital will be used to speed up product development. The company’s founders, CEO Pradeep Sindhu and Bertrand Serlet, say Fungible will release more information later this year about when its data processing units will be available and their on-boarding process, which they say will not require clients to change their existing applications, networking or server design.

Sindhu previously founded Juniper Networks, where he held roles as chief scientist and CEO. Serlet was senior vice president of software engineering at Apple before leaving in 2011 and founding Upthere, a storage startup that was acquired by Western Digital in 2017. Sindhu and Serlet describe Fungible’s objective as pivoting data centers from a “compute-centric” model to a data-centric one. While the company is often asked whether it considers Intel and Nvidia competitors, the founders say Fungible’s Data Processing Units (DPUs) complement technology, including central and graphics processing units, from other chip makers.

Sindhu describes Fungible’s DPUs as a new building block in data center infrastructure, allowing data centers to handle larger amounts of data more efficiently and potentially enabling new kinds of applications. The DPUs are fully programmable and connect with standard IPs over Ethernet local area networks and over local buses, like PCI Express, which in turn connect to CPUs, GPUs and storage. Placed between the two, the DPUs act like a “super-charged data traffic controller,” performing computations offloaded by the CPUs and GPUs, as well as converting the IP connection into high-speed data center fabric.

This better prepares data centers for the enormous amounts of data generated by new technologies, such as self-driving cars, and by industries such as personalized healthcare, financial services, cloud gaming, agriculture, call centers and manufacturing, says Sindhu.

In a press statement, Nishar said “As the global data explosion and AI revolution unfold, global computing, storage and networking infrastructure are undergoing a fundamental transformation. Fungible’s products enable data centers to leverage their existing hardware infrastructure and benefit from these new technology paradigms. We look forward to partnering with the company’s visionary and accomplished management team as they power the next generation of data centers.”


By Catherine Shu

Google says it’ll invest $13B in U.S. data centers and offices this year

Google today announced that it will invest $13 billion in data centers and offices across the U.S. in 2019. That’s up from $9 billion in investments last year. Many of these investments will go to states like Nebraska, Nevada, Ohio, Texas, Oklahoma, South Carolina and Virginia, where Google plans new or expanded data centers. Though like most years, it’ll also continue to expand many of its existing offices in Seattle, Chicago and New York, as well as in its home state of California.

Given Google’s push for more cloud customers, it’s also interesting to see that the company continues to expand its data center presence across the country. Google will soon open its first data centers in Nevada, Nebraska, Ohio and Texas, for example, and it will expand its Oklahoma, South Carolina and Virginia data centers. Google clearly isn’t slowing down in its race to compete with AWS and Azure.

“These new investments will give us the capacity to hire tens of thousands of employees, and enable the creation of more than 10,000 new construction jobs in Nebraska, Nevada, Ohio, Texas, Oklahoma, South Carolina and Virginia,” Google CEO Sundar Pichai writes today. “With this new investment, Google will now have a home in 24 total states, including data centers in 13 communities. 2019 marks the second year in a row we’ll be growing faster outside of the Bay Area than in it.”

Given the current backlash against many tech companies and automation in general, it’s probably no surprise that Google wants to emphasize the number of jobs it is creating (and especially jobs in Middle America). The construction jobs are obviously temporary, though, and data centers don’t need many employees once they are up and running. Still, Google promises that this will give it the “capacity to hire tens of thousands of employees.”


By Frederic Lardinois

New Synergy Research report finds enterprise data center market is strong for now

Conventional wisdom would suggest that in 2019, the public cloud dominates and enterprise data centers are becoming an anachronism of a bygone era, but new data from Synergy Research finds that the enterprise data center market had a growth spurt last year.

In fact, Synergy reported that overall spending on enterprise infrastructure, which includes elements like servers, switches and routers, and network security, grew 13 percent last year and represents a $125 billion business — not too shabby for a market that is supposedly on its deathbed.

Overall, these numbers show the market is still growing, although certainly not nearly as fast as the public cloud. Synergy was kind enough to provide a separate report on the cloud market, which grew 32 percent last year to $250 billion annually.
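For context, those growth rates let you back out the implied prior-year size of each market. A sketch (the implied base figures are derived from the growth rates, not from Synergy’s report):

```python
# Back out the implied prior-year size of each market from Synergy's figures.
enterprise_now, enterprise_growth = 125e9, 0.13
cloud_now, cloud_growth = 250e9, 0.32

enterprise_prior = enterprise_now / (1 + enterprise_growth)  # ~ $110.6B
cloud_prior = cloud_now / (1 + cloud_growth)                 # ~ $189.4B

print(f"Enterprise infrastructure, prior year: ${enterprise_prior / 1e9:.1f}B")
print(f"Cloud market, prior year:              ${cloud_prior / 1e9:.1f}B")
```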

As Synergy analyst John Dinsdale pointed out, private data centers are not the only buyers here. A good percentage of sales is likely going to public cloud providers, which are building data centers at a rapid rate these days. “In terms of applications and levels of usage, I’d characterize it more like there being a ton of growth in the overall market, but cloud is sucking up most of the growth, while enterprise or on-prem is relatively flat,” Dinsdale told TechCrunch.


Perhaps the most surprising data nugget in the report is that Cisco remains the dominant vendor in this market, with 23 percent share over the last four quarters. That’s true even as it tries to pivot to being more of a software and services vendor, spending billions in recent years on companies such as AppDynamics, Jasper Technologies and Duo Security. Yet the data shows it still dominates the traditional hardware sector.

Cisco remains the top vendor in the category despite losing a couple of percentage points of market share over the last year, primarily because it doesn’t do great in the server part of the market, which happens to be the biggest overall slice. The next vendor, HPE, is far behind at just 11 percent across the six segments.

While these numbers show that companies are continuing to invest in new hardware, the growth is probably not sustainable long term. At AWS re:Invent in November, AWS CEO Andy Jassy pointed out that a vast majority of data remains in private data centers, but that we can expect it to begin moving more briskly to the public cloud over the next five years. And web-scale companies like Amazon often don’t buy hardware off the shelf, opting instead to develop custom gear they can understand and configure at a highly granular level.

Jassy said that outside the US, companies are one to three years behind this trend, depending on the market, so the shift is still going on, as the much bigger growth in the public cloud numbers indicates.


By Ron Miller

Amazon isn’t the only tech company getting tax breaks

Amazon has a big target on its back these days, and because of its size, scope and impact on local business, critics are right to look closely at the tax breaks and other subsidies it receives. There is nothing wrong with digging into these breaks to see if they reach the goals governments set in terms of net new jobs. But Amazon isn’t alone here by any means. Many states have a big tech subsidy story to tell, and it isn’t always a tale that ends well for the subsidizing government.

In fact, a recent study by the watchdog group Good Jobs First found states are willing to throw millions at high tech companies to lure them into building in their communities. The report cited three examples: Tesla’s $1.25 billion, 20-year deal to build a battery factory in Nevada; Foxconn’s $3 billion break to build a display factory in Wisconsin; and Apple’s data center deal in Iowa, which resulted in a $214 million tax break.

Good Jobs First executive director Greg LeRoy doesn’t think these subsidies are justifiable, arguing they take away business development dollars from smaller businesses that tend to build more sustainable jobs in a community.

“The ‘lots of eggs in one basket’ strategy is especially ill-suited. But many public leaders haven’t switched gears yet, often putting taxpayers at great risk, especially because some tech companies have become very aggressive about demanding big tax breaks. Companies with famous names are even more irresistible to politicians who want to look active on jobs,” LeRoy and his colleague Maryann Feldman wrote in a Guardian commentary last month.

It doesn’t always work the way you hope

While these deals are designed to attract the company to an area and generate jobs, that doesn’t always happen. The Apple-Iowa deal, for example, involved 550 construction jobs to build the $1.3 billion state-of-the-art facility, but will ultimately generate only 50 full-time jobs. It’s worth noting that in this case, Apple further sweetened the pot by contributing “up to $100 million” to a local public improvement fund, according to information supplied by the company.

One thing many lay people don’t realize, however, is that in spite of the size, cost and amount of real estate of these mega data centers, they are highly automated and don’t require a whole lot of people to run. While Apple is giving back to the community around the data center, in the end, if the goal of the subsidy is permanent high-paying jobs, there aren’t very many involved in running a data center.

It’s not hard to find projects that didn’t work out. A $2 million tax subsidy deal between Massachusetts and Nortel Networks in 2008, meant to keep 2,200 jobs in place and add 800 more, failed miserably. By 2010 there were just 145 jobs left at the facility, and the tax incentive lasted another four years, according to a Boston.com report.

More recent deals come at a much higher price. The $3 billion Foxconn deal in Wisconsin was expected to generate 3,000 direct jobs (and another 22,000 related ones). That comes out to an estimated cost of between $15,000 and $19,000 per job annually, much higher than the typical cost of $2,457 per job, according to data in the New York Times.
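The raw per-job arithmetic behind that comparison is straightforward. A sketch (the Times’ annualized $15,000-$19,000 range additionally amortizes the subsidy over the deal’s multi-year life, which this sketch does not attempt):

```python
# Per-job cost of the Wisconsin Foxconn subsidy, before any amortization.
subsidy = 3_000_000_000
direct_jobs = 3_000
related_jobs = 22_000

cost_per_direct_job = subsidy / direct_jobs
cost_per_job_overall = subsidy / (direct_jobs + related_jobs)

print(f"Per direct job:  ${cost_per_direct_job:,.0f}")   # -> $1,000,000
print(f"Per job overall: ${cost_per_job_overall:,.0f}")  # -> $120,000
```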

Be careful what you wish for

Meanwhile, states are falling all over themselves with billions in subsidies to give Amazon whatever its little heart desires to build HQ2, which could generate up to 50,000 jobs over a decade if all goes according to plan. The question, as with the Foxconn deal, is whether the states can truly justify the cost per job, and the impact on infrastructure and housing, to make it worth it.

What’s more, how do you ensure that you get at least a modest return on that investment? In the case of the Nortel example in Massachusetts, shouldn’t the Commonwealth have protected itself against a catastrophic failure instead of continuing to give the tax break for years after it was clear Nortel wasn’t living up to its side of the agreement?

Not every deal needs to be a home run, but you want to at least ensure you get a decent number of net new jobs out of it, and that there is some fairness in the end, regardless of the outcome. States also need to figure out the impact of any subsidy on other economic development plans, and not simply fall for name recognition over common sense.

These are questions every state needs to be considering as it pours money into these companies. It’s understandable in post-industrial America, where many factory jobs have been automated away, that states want to lure high-paying high tech jobs to their communities, but it’s still incumbent upon officials to do due diligence on the total impact of the deal, to be certain the cost is justified in the end.


By Ron Miller

Azure’s availability zones are now generally available

No matter what cloud you build on, if you want to build something that’s highly available, you’re always going to opt to put your applications and data in at least two physically separated regions. Otherwise, if a region goes down, your app goes down, too. All of the big clouds also offer a concept called ‘availability zones’ in their regions, giving developers the option to host their applications in two separate data centers in the same region for a bit of extra resilience. All big clouds, that is, except Azure, which is only launching its availability zones feature into general availability today, after first announcing a beta last September.

Ahead of today’s launch, Julia White, Microsoft’s corporate VP for Azure, told me that the company’s design philosophy behind its data center network was always about servicing commercial customers with the widest possible range of regions to allow them to be close to their customers and to comply with local data sovereignty and privacy laws. That’s one of the reasons why Azure today offers more regions than any of its competitors, with 38 generally available regions and 12 announced ones.

“Microsoft started its infrastructure approach focused on enterprise organizations and built lots of regions because of that,” White said. “We didn’t pick this regional approach because it’s easy or because it’s simple, but because we believe this is what our customers really want.”

Every availability zone has its own network connection and power backup, so if one zone in a region goes down, the others should remain unaffected. A regional disaster could still shut down all of the zones in a single region, though, so most businesses will surely want to keep their data in at least one additional region.
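The resilience math behind spreading an app across zones is simple: if zones fail independently, the probability that every replica is down at once is the product of the individual failure probabilities. A minimal sketch under that independence assumption (which, as noted, a regional disaster violates; the 99.9% figure is an illustrative number, not an Azure SLA):

```python
def composite_availability(zone_availability: float, zones: int) -> float:
    """Probability that at least one of `zones` independent zones is up."""
    p_all_down = (1 - zone_availability) ** zones
    return 1 - p_all_down

# One 99.9%-available zone vs. the same app replicated across two or three.
for n in (1, 2, 3):
    print(f"{n} zone(s): {composite_availability(0.999, n):.7f} available")
```

Two zones at 99.9% each already push the chance that at least one is up to roughly 99.9999%, which is why a second zone in the same region buys so much resilience for so little architectural change.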