Google Cloud launches Confidential VMs

At its virtual Cloud Next ’20 event, Google Cloud today announced Confidential VMs, a new type of virtual machine that makes use of the company’s work around confidential computing to ensure that data isn’t just encrypted at rest but also while it is in memory.

“We already employ a variety of isolation and sandboxing techniques as part of our cloud infrastructure to help make our multi-tenant architecture secure,” the company notes in today’s announcement. “Confidential VMs take this to the next level by offering memory encryption so that you can further isolate your workloads in the cloud. Confidential VMs can help all our customers protect sensitive data, but we think it will be especially interesting to those in regulated industries.”

On the backend, Confidential VMs make use of AMD’s Secure Encrypted Virtualization feature, available in its second-generation EPYC CPUs. With that, the data stays encrypted while in use, and the encryption keys that make this happen are automatically generated in hardware and can’t be exported — as a result, even Google doesn’t have access to the keys.

Developers who want to shift their existing VMs to a Confidential VM can do so with just a few clicks. Google notes that it built Confidential VMs on top of its Shielded VMs, which already provide protection against rootkits and other exploits.

“With built-in secure encrypted virtualization, 2nd Gen AMD EPYC processors provide an innovative hardware-based security feature that helps secure data in a virtualized environment,” said Raghu Nambiar, corporate vice president, Data Center Ecosystem, AMD. “For the new Google Compute Engine Confidential VMs in the N2D series, we worked with Google to help customers both secure their data and achieve performance of their workloads.”

That last part is obviously important, given that the extra encryption and decryption steps do incur at least a minor performance penalty. Google says it worked with AMD and developed new open-source drivers to ensure that “the performance metrics of Confidential VMs are close to those of non-confidential VMs.” At least according to the benchmarks Google itself has disclosed so far, both startup times and memory read and write throughput are virtually the same for regular VMs and Confidential VMs.


By Frederic Lardinois

Zoom consultant Alex Stamos weighs in on Keybase acquisition

When Zoom started having security issues in March, it turned to former Facebook and Yahoo security executive Alex Stamos, who signed on as a consultant to work directly with CEO Eric Yuan.

The goal was to build a more cohesive security strategy for the fast-growing company. One of the recommendations that came out of those meetings was building end-to-end encryption into the paid tier of the product. Those discussions led to the company buying Keybase this morning.

Stamos says that in the big build-versus-buy debate companies tend to go through when evaluating options, this deal fell somewhere in the middle. While Zoom bought a company with a lot of expertise, it will still require Keybase engineers working with counterparts from Zoom and consultants like Stamos to build a final encrypted product.

“The truth is that what Zoom wants to do with end-to-end encryption, nobody’s really done, so there’s no product that you could just slap onto Zoom to turn it into key encryption. That’s going to have to be thought out from the beginning for the specific needs of an enterprise,” Stamos told TechCrunch.

But what Zoom liked about Keybase in particular is that its team has already thought through similar problems with file encryption and encrypted chat, and Zoom wants to turn the Keybase engineers loose on this problem.

“The design is going to be something that’s totally new. The great thing about Keybase is that they have already been through this process of thinking through and then crafting a design that is usable by normal people and that provides functionality while being somewhat invisible,” he said.

Because it’s a work in progress, it’s not possible to say when that final integration will happen, but Stamos did say that the company intends to publish a paper on May 22nd outlining its cryptographic plan moving forward, and then will have a period of public discussion before finalizing the design and moving into the integration phase.

He says that the first goal is to come up with a more highly secure version of Zoom meetings with end-to-end encryption enabled. At least initially, this will only be available for people using the Zoom client or Zoom-enabled hardware. You won’t be able to encrypt someone calling in, for instance.

As for folks who may be worried about Keybase being owned by Zoom, Stamos says, “The whole point of the Keybase design is that you don’t have to trust who owns their servers.”


By Ron Miller

Zoom acquires Keybase to get end-to-end encryption expertise

Zoom announced this morning that it has acquired Keybase, a startup with encryption expertise. It did not reveal the purchase price.

Keybase, which has been building encryption products for several years, including secure file sharing and collaboration tools, should give Zoom some security credibility as it works through pandemic-driven growing pains.

The company has faced a number of security issues in the last couple of months, as soaring demand exposed some weaknesses in the platform. As it moves to address these issues, having a team of encryption experts on staff should help it build a more secure product.

In a blog post announcing the deal, CEO Eric Yuan said Zoom acquired Keybase to give customers a higher level of security, something that’s increasingly important to enterprise customers as more of their operations rely on the platform while employees work from home during the pandemic.

“This acquisition marks a key step for Zoom as we attempt to accomplish the creation of a truly private video communications platform that can scale to hundreds of millions of participants, while also having the flexibility to support Zoom’s wide variety of uses,” Yuan wrote.

He added that the tools will be available to all paying customers as soon as they are incorporated into the product. “Zoom will offer an end-to-end encrypted meeting mode to all paid accounts. Logged-in users will generate public cryptographic identities that are stored in a repository on Zoom’s network and can be used to establish trust relationships between meeting attendees,” he wrote.
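To make the repository idea concrete, here is a toy Python sketch (not Zoom’s actual design; every name in it is hypothetical) of how a server-side store of public identities can be checked against fingerprints exchanged over a trusted channel:

```python
import hashlib

def fingerprint(public_key: bytes) -> str:
    """Short, human-comparable digest of a public identity."""
    return hashlib.sha256(public_key).hexdigest()[:16]

class IdentityRepository:
    """Server-side store of public identities; it never holds private keys."""
    def __init__(self):
        self._keys = {}

    def publish(self, user: str, public_key: bytes) -> None:
        self._keys[user] = public_key

    def lookup(self, user: str) -> bytes:
        return self._keys[user]

def verify_attendee(repo: IdentityRepository, user: str,
                    expected_fingerprint: str) -> bool:
    """Trust a peer only if the repository's copy of their identity matches
    a fingerprint obtained through a trusted out-of-band channel."""
    return fingerprint(repo.lookup(user)) == expected_fingerprint
```

The point of such a design is that the repository only ever holds public material; if the server swaps out a key, the fingerprint check fails.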

Under the terms of the deal, Keybase will become a subsidiary of Zoom, and co-founder Max Krohn will lead the Zoom security engineering team, reporting directly to Yuan to help build the security product. The other almost two dozen employees, the vast majority of them security engineers, will become Zoom employees.

It’s not clear what will happen to Keybase’s products, but the company did say Zoom is working with Keybase to figure that out.

Keybase was founded in 2014 and has raised almost $11 million according to Crunchbase data.


By Ron Miller

GitHub gets a built-in IDE with Codespaces, discussion forums and more

Under different circumstances, GitHub would be hosting its Satellite conference in Paris this week. Like so many other events, GitHub decided to switch Satellite to a virtual event, but that isn’t stopping the Microsoft-owned company from announcing quite a bit of news this week.

The highlight of GitHub’s announcement is surely the launch of GitHub Codespaces, which gives developers a full cloud-hosted development environment based on Microsoft’s VS Code editor. If that name sounds familiar, that’s likely because Microsoft itself rebranded Visual Studio Code Online to Visual Studio Codespaces a week ago — and GitHub is essentially taking the same concepts and technology and is now integrating it directly inside its service. If you’ve seen VS Online/Codespaces before, the GitHub environment will look very similar.

“Contributing code to a community can be hard. Every repository has its own way of configuring a dev environment, which often requires dozens of steps before you can write any code,” writes Shanku Niyogi, GitHub’s SVP of Product, in today’s announcement. “Even worse, sometimes the environment of two projects you are working on conflict with one another. GitHub Codespaces gives you a fully-featured cloud-hosted dev environment that spins up in seconds, directly within GitHub, so you can start contributing to a project right away.”

Currently, GitHub Codespaces is in beta and available for free. The company hasn’t set any pricing for the service once it goes live, but Niyogi says the pricing will look similar to that of GitHub Actions, where it charges for computationally intensive tasks like builds. Microsoft currently charges VS Codespaces users by the hour and depending on the kind of virtual machine they are using.

The other major new feature the company is announcing today is GitHub Discussions. These are essentially discussion forums for a given project. While GitHub already allowed for some degree of conversation around code through issues and pull requests, Discussions are meant to enable unstructured threaded conversations. They also lend themselves to Q&As, and GitHub notes that they can be a good place for maintaining FAQs and other documents.

Currently, Discussions are in beta for open-source communities and will be available for other projects soon.

On the security front, GitHub is also announcing two new features: code scanning and secret scanning. Code scanning checks your code for potential security vulnerabilities. It’s powered by CodeQL and free for open-source projects. Secret scanning is now available for private repositories (a similar feature has been available for public projects since 2018). Both of these features are part of GitHub Advanced Security.

As for GitHub’s enterprise customers, the company today announced the launch of Private Instances, a new fully managed service for enterprise customers that want to use GitHub in the cloud but need to know that their code is fully isolated from the rest of the company’s users. “Private Instances provides enhanced security, compliance, and policy features including bring-your-own-key encryption, backup archiving, and compliance with regional data sovereignty requirements,” GitHub explains in today’s announcement.


By Frederic Lardinois

Tozny introduces encrypted identity tool as part of security service platform

Tozny, a Portland, Oregon startup that wants to help companies more easily incorporate encryption into their programs and processes, introduced TozID today. It is an identity and access control tool that can work independently or in conjunction with the company’s other encryption tools.

“Basically we have a Security as a Service platform, and it’s designed to help developers and IT departments add defense in depth by [combining] centralized user management with an end-to-end encryption platform,” Tozny CEO and founder Isaac Potoczny-Jones told TechCrunch.

The company is introducing an identity and access solution today with the hope of moving beyond its core developer and government audience to a broader enterprise customer base.

Under the hood, TozID uses standard identity constructs like single sign-on, SAML and OpenID, and can plug into any existing identity framework, but the key here is that it’s encryption-based and uses zero-knowledge identification. This allows a user (or application) to control information with a password, while reducing the risk of sharing data because Tozny does not store passwords or send them over the network.

In this tool, the password acts as the encryption key, which enables users or applications to control access to data in a very granular way, only unlocking information for people or applications they want to be able to access that information — and nobody else.
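As a rough illustration of the password-as-key idea (a generic sketch, not Tozny’s implementation), a client can derive two independent secrets from one password: an encryption key that never leaves the device, and a login verifier that is the only thing the server stores:

```python
import hashlib
import hmac
import os

def derive(password: str, salt: bytes, purpose: bytes) -> bytes:
    """Stretch the password into a purpose-specific 32-byte secret."""
    return hashlib.pbkdf2_hmac("sha256", password.encode() + purpose,
                               salt, 200_000)

def enroll(password: str):
    salt = os.urandom(16)
    enc_key = derive(password, salt, b"encryption")  # stays on the client
    verifier = derive(password, salt, b"login")      # only this goes to the server
    return salt, enc_key, verifier

def check_login(password: str, salt: bytes, stored_verifier: bytes) -> bool:
    """Server-side check that never sees the password or the encryption key."""
    return hmac.compare_digest(derive(password, salt, b"login"),
                               stored_verifier)
```

Because the two derivations use different context strings, knowing the verifier reveals nothing about the encryption key, and the password itself is never transmitted.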

As Potoczny-Jones points out, this can be as simple as one-to-one communication in an encrypted messaging app, but it can be more complex at the application layer, depending on how it’s set up. “It’s really powerful to have a user make that decision, but that’s not the only use case. There are many different ways to enable who gets access to data, and this tool enforces those kinds of decisions with encryption,” he explained.

Regardless of how this is implemented, the user never has to understand encryption, or even know that encryption is in play in the application. All they need to do is enter a password as they always have, and Tozny deals with the complex parts under the hood, using standard open source encryption algorithms.

The company also has a data privacy tool geared toward developers that lets them build end-to-end encryption into applications — web, mobile, server and so forth. Developers can use the Tozny SDK to add encryption to their applications without a lot of encryption knowledge.

The company has been around since 2013 and hasn’t taken any private investment. Instead, it has developed an encryption toolkit for government agencies, including NIST and DARPA, that has acted as a de facto funding mechanism.

“This is an open source toolkit on the client side, so that folks can vet it for security — cryptographers like that — and on the server side it’s a SaaS-type platform,” he said. The latter is how the company makes money, by selling the service.

“Our goal really here is to bring the kind of cybersecurity that we’ve been building for government agencies into the commercial market, so this is really work on our side to try to, you might say, bring it down market as the threat landscape moves up market,” he said.


By Ron Miller

Google Cloud gets a Secret Manager

Google Cloud today announced Secret Manager, a new tool that helps its users securely store their API keys, passwords, certificates and other data. With this, Google Cloud is giving its users a single tool to manage this kind of data and a centralized source of truth, something that even sophisticated enterprise organizations often lack.

“Many applications require credentials to connect to a database, API keys to invoke a service, or certificates for authentication,” Google developer advocate Seth Vargo and product manager Matt Driscoll wrote in today’s announcement. “Managing and securing access to these secrets is often complicated by secret sprawl, poor visibility, or lack of integrations.”

With Berglas, Google already offered an open-source command-line tool for managing secrets. Secret Manager and Berglas will play well together and users will be able to move their secrets from the open-source tool into Secret Manager and use Berglas to create and access secrets from the cloud-based tool as well.

With KMS, Google also offers a fully managed key management system (as do Google Cloud’s competitors). The two tools are very much complementary. As Google notes, KMS does not actually store the secrets — it encrypts the secrets you store elsewhere. Secret Manager provides a way to easily store (and manage) these secrets in Google Cloud.

Secret Manager includes the necessary tools for managing secret versions and audit logging, for example. Secrets in Secret Manager are also project-based global resources, the company stresses, while competing tools often manage secrets on a regional basis.
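A minimal sketch of those two features, versioned secrets plus an audit trail, might look like the following (illustrative only; this is not Google’s API, and all names are hypothetical):

```python
import datetime

class SecretStore:
    """Toy versioned secret store with an append-only audit log."""
    def __init__(self):
        self._secrets = {}   # name -> list of secret values, oldest first
        self.audit_log = []  # (timestamp, action, name, version)

    def _log(self, action: str, name: str, version: int) -> None:
        self.audit_log.append(
            (datetime.datetime.utcnow().isoformat(), action, name, version)
        )

    def add_version(self, name: str, value: str) -> int:
        versions = self._secrets.setdefault(name, [])
        versions.append(value)
        self._log("add", name, len(versions))
        return len(versions)  # 1-indexed version number

    def access(self, name: str, version="latest") -> str:
        versions = self._secrets[name]
        idx = len(versions) if version == "latest" else version
        self._log("access", name, idx)
        return versions[idx - 1]
```

The real service layers IAM, encryption at rest and replication on top of this, but the version-plus-audit-log shape is the core idea.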

The new tool is now in beta and available to all Google Cloud customers.


By Frederic Lardinois

Messaging app Wire confirms $8.2M raise, responds to privacy concerns after moving holding company to the US

Big changes are afoot for Wire, an enterprise-focused end-to-end encrypted messaging app and service that advertises itself as “the most secure collaboration platform”. In February, Wire quietly raised $8.2 million from Morpheus Ventures and others, we’ve confirmed — the first funding amount it has ever disclosed — and alongside that external financing, it moved its holding company in the same month to the US from Luxembourg, a switch that Wire’s CEO Morten Brogger described in an interview as “simple and pragmatic.”

He also said that Wire is planning to introduce a freemium tier to its existing consumer service — which itself has half a million users — while working on a larger round of funding to fuel more growth of its enterprise business — a key reason for moving to the US, he added: There is more money to be raised there.

“We knew we needed this funding and additional [capital] to support continued growth. We made the decision that at some point in time it will be easier to get funding in North America, where there’s six times the amount of venture capital,” he said.

While Wire has moved its holding company to the US, it is keeping the rest of its operations as is. Customers are licensed and serviced from Wire Switzerland; the software development team is in Berlin, Germany; and hosting remains in Europe.

The news of Wire’s US move and the basics of its February funding — sans value, date or backers — came out this week via a blog post that raises questions about whether a company that trades on the idea of data privacy should itself be more transparent about its activities.

The changes to Wire’s financing and legal structure had not been communicated to users until news started to leak out, which raises questions not just about transparency, but about how well Wire’s privacy policy will hold up now that the company’s ownership is on US soil.

It was an issue picked up and amplified by NSA whistleblower Edward Snowden. Via Twitter, he described the move to the US as “not appropriate for a company claiming to provide a secure messenger — claims a large number of human rights defenders relied on.”

The key question is whether Wire’s shift to the US puts users’ data at risk — a question that Brogger claims is straightforward to answer: “We are in Switzerland, which has the best privacy laws in the world” — it’s subject to Europe’s General Data Protection Regulation framework (GDPR) on top of its own local laws — “and Wire now belongs to a new group holding, but there [is] no change in control.”

In its blog post published in the wake of blowback from privacy advocates, Wire also claims it “stands by its mission to best protect communication data with state-of-the-art technology and practice” — listing several items in its defence:

  • All source code has been and will be available for inspection on GitHub (github.com/wireapp).
  • All communication through Wire is secured with end-to-end encryption — messages, conference calls, files. The decryption keys are only stored on user devices, not on our servers. It also gives companies the option to deploy their own instances of Wire in their own data centers.
  • Wire has started working on a federated protocol to connect on-premise installations and make messaging and collaboration more ubiquitous.
  • Wire believes that data protection is best achieved through state-of-the-art encryption and continues to innovate in that space with Messaging Layer Security (MLS).

But where data privacy and US law are concerned, it’s complicated. Snowden famously leaked scores of classified documents disclosing the extent of US government mass surveillance programs in 2013, including how data-harvesting was embedded in US-based messaging and technology platforms.

Six years on, the political and legal ramifications of that disclosure are still playing out — with a key judgement pending from Europe’s top court which could yet unseat the current data transfer arrangement between the EU and the US.

Privacy versus security

Wire launched at a time when interest in messaging apps was at a high watermark. The company made its debut in the middle of February 2014, and it was only one week later that Facebook acquired WhatsApp for the princely sum of $19 billion. We described Wire’s primary selling point at the time as a “reimagining of how a communications tool like Skype should operate had it been built today” rather than in 2003.

That meant encryption and privacy protection, but also better audio tools, file compression and more. It was a pitch that seemed especially compelling considering the background of the company. Skype co-founder Janus Friis and funds connected to him were the startup’s first backers (and they remain the largest shareholders); Wire was co-founded by Skype alums Jonathan Christensen and Alan Duric (no longer with the company); and even new investor Morpheus has Skype roots.

Even with the Skype pedigree, the strategy faced a big challenge.

“The consumer messaging market is lost to the Facebooks of the world, which dominate it,” Brogger said today. “However, we made a clear insight, which is the core strength of Wire: security and privacy.”

That, combined with the trend around the consumerization of IT that’s brought new tools to business users, is what led Wire to the enterprise market in 2017.

But fast forward to today, and it seems that even as security and privacy are two sides of the same coin, deciding which to optimise for in terms of features and future development may not be so simple. That tension is part of what critics are concerned with now.

“Wire was always for profit and planned to follow the typical venture backed route of raising rounds to accelerate growth,” one source familiar with the company told us. “However, it took time to find its niche (B2B, enterprise secure comms).

“It needed money to keep the operations going and growing. [But] the new CEO, who joined late 2017, didn’t really care about the free users, and the way I read it now, the transformation is complete: ‘If Wire works for you, fine, but we don’t really care about what you think about our ownership or funding structure as our corporate clients care about security, not about privacy.’”

And that is the message you get from Brogger, too, who describes individual consumers as “not part of our strategy”, but also not entirely removed from it, either, as the focus shifts to enterprises and their security needs.

Brogger said there are still half a million individuals on the platform, and the company will come up with ways to continue to serve them under the same privacy policies and with the same kind of service as the enterprise users. “We want to give them all the same features with no limits,” he added. “We are looking to switch it into a freemium model.”

On the other side, “We are having a lot of inbound requests on how Wire can replace Skype for Business,” he said. “We are the only one who can do that with our level of security. It’s become a very interesting journey and we are super excited.”

Part of the company’s push into enterprise has also seen it make a number of hires. This has included bringing in two former Huddle C-suite execs, Brogger as CEO and Rasmus Holst as chief revenue officer — a bench that Wire expanded this week with three new hires from three other B2B businesses: a VP of EMEA sales from New Relic, a VP of finance from Contentful and a VP of Americas sales from Xeebi.

Such growth comes with a price-tag attached to it, clearly. Which is why Wire is opening itself to more funding and more exposure in the US, but also more scrutiny and questions from those who counted on its services before the change.

Brogger said inbound interest has been strong and he expects the startup’s next round to close in the next two to three months.


By Ingrid Lunden

Battlefield vets StrongSalt (formerly OverNest) announces $3M seed round

StrongSalt, then known as OverNest, appeared at the TechCrunch Disrupt NYC Battlefield in 2016 and announced a product for searching encrypted code, something that remains unusual to this day. Today, the company announced a $3 million seed round led by Valley Capital Partners.

StrongSalt founder and CEO Ed Yu says encryption remains a difficult proposition, and that when you look at the majority of breaches, encryption wasn’t used. He said his company wants to simplify adding encryption to applications, and came up with a new service that lets developers add encryption in the form of an API. “We decided to come up with what we call an API platform. It’s like infrastructure that allows you to integrate our solution into any existing or any new applications,” he said.

The company’s original idea was to create a product to search encrypted code, but Yu says the tech has much more utility as an API that’s applicable across applications, and that’s why they decided to package it as a service. It’s not unlike Twilio for communications or Stripe for payments, except in this case you can build in searchable encryption.

The searchable part is actually a pretty big deal because, as Yu points out, when you encrypt data it is no longer searchable. “If you encrypt all your data, you cannot search within it, and if you cannot search within it, you cannot find the data you’re looking for, and obviously you can’t really use the data. So we actually solved that problem,” he said.

Developers can add searchable encryption as part of their applications. For customers already using a commercial product, the company’s API actually integrates with popular services, enabling customers to encrypt the data stored there, while keeping it searchable.
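A toy version of searchable encryption (not StrongSalt’s API; the keystream here is a deliberately simplified stand-in for real authenticated encryption) shows how a keyed index lets you search without decrypting:

```python
import hashlib
import hmac

def token(key: bytes, word: str) -> str:
    """Deterministic, keyed search token for one word."""
    return hmac.new(key, word.lower().encode(), hashlib.sha256).hexdigest()

class SearchableStore:
    """Toy searchable symmetric encryption over in-memory documents."""
    def __init__(self, key: bytes):
        self._key = key
        self._docs = {}    # doc_id -> ciphertext bytes
        self._index = {}   # search token -> set of doc_ids

    def _keystream(self, doc_id: str, length: int) -> bytes:
        # Simplified per-document keystream; a real system would use AEAD.
        out, counter = b"", 0
        while len(out) < length:
            out += hmac.new(self._key, f"{doc_id}:{counter}".encode(),
                            hashlib.sha256).digest()
            counter += 1
        return out[:length]

    def put(self, doc_id: str, text: str) -> None:
        data = text.encode()
        ks = self._keystream(doc_id, len(data))
        self._docs[doc_id] = bytes(a ^ b for a, b in zip(data, ks))
        for word in set(text.split()):
            self._index.setdefault(token(self._key, word), set()).add(doc_id)

    def search(self, word: str):
        return sorted(self._index.get(token(self._key, word), set()))

    def get(self, doc_id: str) -> str:
        ct = self._docs[doc_id]
        ks = self._keystream(doc_id, len(ct))
        return bytes(a ^ b for a, b in zip(ct, ks)).decode()
```

Whoever stores `_docs` and `_index` sees only ciphertext and opaque HMAC tokens; only a holder of the key can generate the token for a query word.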

“We will offer a storage API on top of Box, AWS S3, Google cloud, Azure — depending on what the customer has or wants. If the customer already has AWS S3 storage, for example, then when they use our API, and after encrypting the data, it will be stored in their AWS repository,” Yu explained.

For those companies who don’t have a storage service, the company is offering one. What’s more, they are using the blockchain to provide a mechanism for the sharing, auditing and managing encrypted data. “We also use the blockchain for sharing data by recording the authorization by the sender, so the receiver can retrieve the information needed to reconstruct the keys in order to retrieve the data. This simplifies key management in the case of sharing and ensures auditability and revocability of the sharing by the sender,” Yu said.

If you’re wondering how the company has been surviving since 2016, while only getting its seed round today, it had a couple of small seed rounds prior to this, and a contract with the US Department of Defense, which replaced the need for substantial earlier funding.

“The DOD was looking for a solution to have secure communication between computers, and they needed to have a way to securely store data, and so we were providing a solution for them,” he said. In fact, this work was what led them to build the commercial API platform they are offering today.

The company, which was founded in 2015, currently has 12 employees spread across the globe.


By Ron Miller

The mainframe business is alive and well, as IBM announces new Z15

It’s easy to think about mainframes as some technology dinosaur, but the fact is these machines remain a key component of many large organizations’ computing strategies. Today, IBM announced the latest in its line of mainframe computers, the Z15.

For starters, as you would probably expect, these are big and powerful machines capable of handling enormous workloads. For example, this baby can process up to 1 trillion web transactions a day and handle 2.4 million Docker containers, while offering unparalleled security to go with that performance. This includes the ability to encrypt data once, and it stays encrypted, even when it leaves the system, a huge advantage for companies with a hybrid strategy.

Speaking of which, you may recall that IBM bought Red Hat last year for $34 billion. That deal closed in July and the companies have been working to incorporate Red Hat technology across the IBM business including the z line of mainframes.

IBM announced last month that it was making OpenShift, Red Hat’s Kubernetes-based cloud-native tools, available on the mainframe running Linux. This should enable developers who have been working with OpenShift on other systems to move seamlessly to the mainframe without special training.

IBM sees the mainframe as a bridge for hybrid computing environments, offering a highly secure place for data that when combined with Red Hat’s tools, can enable companies to have a single control plane for applications and data wherever it lives.

While it could be tough to justify the cost of these machines in the age of cloud computing, Ray Wang, founder and principal analyst at Constellation Research, says it could be more cost-effective than the cloud for certain customers. “If you are a new customer, and currently in the cloud and develop on Linux, then in the long run the economics are there to be cheaper than public cloud if you have a lot of IO, and need to get to a high degree of encryption and security,” he said.

He added, “The main point is that if you are worried about being held hostage by public cloud vendors on pricing, in the long run the Z is a cost-effective and secure option for owning compute power and working in a multi-cloud, hybrid cloud world.”

Companies like airlines and financial services companies continue to use mainframes, and while they need the power these massive machines provide, they need to do so in a more modern context. The z15 is designed to provide that link to the future, while giving these companies the power they need.


By Ron Miller

IBM’s quantum-resistant magnetic tape storage is not actually snake oil

Usually when someone in tech says the word “quantum,” I put my hands on my ears and sing until they go away. But while IBM’s “quantum computing safe tape drive” nearly drove me to song, when I thought about it, it actually made a lot of sense.

First of all, it’s a bit of a misleading lede. The tape is not resistant to quantum computing at all. The problem isn’t that qubits are going to escape their cryogenic prisons and go interfere with tape drives in the basement of some datacenter or HQ. The problem is what these quantum computers may be able to accomplish when they’re finally put to use.

Without going too deep down the quantum rabbit hole, it’s generally acknowledged that quantum computers and classical computers (like the one you’re using) are good at different things — to the point where in some cases, a problem that might take incalculable time on a traditional supercomputer could be done in a flash on quantum. Don’t ask me how — I said we’re not going down the hole!

One of the things quantum is potentially very good at is certain types of cryptography: It’s theorized that quantum computers could absolutely smash through many currently used encryption techniques. In the worst case scenario, that means that if someone got hold of a large cache of encrypted data that today would be useless without the key, a future adversary may be able to force the lock. Considering how many breaches there have been where the only reason your entire life wasn’t stolen was because it was encrypted, this is a serious threat.

IBM and others are thinking ahead. Quantum computing isn’t a threat right now, right? It isn’t being seriously used by anyone, let alone hackers. But what if you buy a tape drive for long-term data storage today, and then a decade from now a hack hits and everything is exposed because it was using “industry standard” encryption?

To prevent that from happening, IBM is migrating its tape storage over to encryption algorithms that are resistant to state-of-the-art quantum decryption techniques — specifically lattice cryptography (another rabbit hole — go ahead), because these devices are meant to be used for decades if possible, during which time the entire computing landscape can change. It will be hard to predict exactly what quantum methods will emerge in the future, but at the very least you can try not to be among the low-hanging fruit favored by hackers.
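For intuition about what lattice cryptography looks like, here is a toy learning-with-errors (LWE) scheme, wildly simplified and insecure, that encrypts a single bit; the hardness of recovering the secret from the noisy samples is what the real algorithms rely on:

```python
import random

Q, N, SAMPLES = 257, 10, 30   # tiny toy parameters; real schemes are far larger
rng = random.Random(42)

def keygen():
    """Secret vector s, plus public noisy inner products b = a.s + e (mod Q)."""
    s = [rng.randrange(Q) for _ in range(N)]
    pub = []
    for _ in range(SAMPLES):
        a = [rng.randrange(Q) for _ in range(N)]
        e = rng.choice([-1, 0, 1])  # small noise is what makes recovery hard
        b = (sum(x * y for x, y in zip(a, s)) + e) % Q
        pub.append((a, b))
    return s, pub

def encrypt(pub, bit):
    """Sum a random subset of public samples; hide the bit in the high half."""
    subset = [row for row in pub if rng.random() < 0.5] or pub[:1]
    a_sum = [sum(col) % Q for col in zip(*[a for a, _ in subset])]
    b_sum = (sum(b for _, b in subset) + bit * (Q // 2)) % Q
    return a_sum, b_sum

def decrypt(s, ct):
    """Strip the a.s part; the residue is near 0 for a 0 bit, near Q/2 for a 1."""
    a_sum, b_sum = ct
    d = (b_sum - sum(x * y for x, y in zip(a_sum, s))) % Q
    return 1 if Q // 4 < d < 3 * Q // 4 else 0
```

Decryption works because the accumulated noise (at most SAMPLES in magnitude) stays well below Q/4, so a 0 bit lands near 0 and a 1 bit lands near Q/2.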

The tape itself is just regular tape. In fact, the whole system is pretty much the same as the one you’d have bought a week ago. All the changes are in the firmware, meaning earlier drives can be retrofitted with this quantum-resistant tech.

Quantum computing may not be relevant to many applications today, but next year who knows? And in ten years, it might be commonplace. So it behooves companies like IBM, which plan to be part of the enterprise world for decades to come, to plan for it today.


By Devin Coldewey

Quantum computing is coming to TC Sessions: Enterprise on Sept. 5

Here at TechCrunch, we like to think about what’s next, and there are few technologies quite as exotic and futuristic as quantum computing. After what felt like decades of being “almost there,” we now have working quantum computers that are able to run basic algorithms, even if only for a very short time. As those times increase, we’ll slowly but surely get to the point where we can realize the full potential of quantum computing.

For our TechCrunch Sessions: Enterprise event in San Francisco on September 5, we’re bringing together some of the sharpest minds from some of the leading companies in quantum computing to talk about what this technology will mean for enterprises (p.s. early-bird ticket sales end this Friday). This could, after all, be one of those technologies where early movers will gain a massive advantage over their competitors. But how do you prepare yourself for this future today, while many aspects of quantum computing are still in development?

IBM’s quantum computer demonstrated at Disrupt SF 2018

Joining us onstage will be Microsoft’s Krysta Svore, who leads the company’s quantum efforts; IBM’s Jay Gambetta, the principal theoretical scientist behind IBM’s quantum computing effort; and Jim Clarke, the director of quantum hardware at Intel Labs.

That’s pretty much a Who’s Who of the current state of quantum computing, even though these companies are at different stages of their quantum journeys. IBM already has working quantum computers; Intel has built a quantum processor and is investing heavily in the technology; and Microsoft is taking a very different approach that may lead to a breakthrough in the long run but that currently keeps it from having a working machine. Microsoft has instead invested heavily in building the software tools for writing quantum applications.

During the panel, we’ll discuss the current state of the industry, where quantum computing can already help enterprises today and what they can do to prepare for the future. The implications of this new technology also go well beyond faster computing (for some use cases); there are also the security issues that will arise once quantum computers become widely available and current encryption methodologies become easily breakable.

The early-bird ticket discount ends this Friday, August 9. Be sure to grab your tickets to get the max $100 savings before prices go up. If you’re a startup in the enterprise space, we still have some startup demo tables available! Each demo table comes with four tickets to the show and a high-visibility exhibit space to showcase your company to attendees — learn more here.


By Frederic Lardinois

Liberty’s challenge to UK state surveillance powers reveals shocking failures

A legal challenge to the UK’s controversial mass surveillance regime has revealed shocking failures in how the main state intelligence agency, which has broad powers to hack computers and phones and intercept digital communications, handles people’s information.

The challenge, by rights group Liberty, led last month to an initial finding that MI5 had systematically breached safeguards in the UK’s Investigatory Powers Act (IPA) — breaches the Home Secretary, Sajid Javid, euphemistically couched as “compliance risks” in a carefully worded written statement that was quietly released to parliament.

Today Liberty has put more meat on the bones of the finding of serious legal breaches in how MI5 handles personal data, culled from newly released (but redacted) documents that it says describe the “undoubtedly unlawful” conduct of the UK’s main security service, which has been retaining innocent people’s data for years.

The series of 10 documents and letters from MI5 and the Investigatory Powers Commissioner’s Office (IPCO), the body charged with overseeing the intelligence agencies’ use of surveillance powers, show that the spy agency has failed to meet its legal duties for as long as the IPA has been law, according to Liberty.

The controversial surveillance legislation passed into UK law in November 2016 — enshrining a system of mass surveillance of digital communications which includes a provision that logs of all Internet users’ browsing activity be retained for a full year, accessible to a wide range of government agencies (not just law enforcement and/or spy agencies).

The law also allows the intelligence agencies to maintain large databases of personal information on UK citizens, even if they are not under suspicion of any crime, and sanctions state hacking of devices, networks and services, including bulk hacking on foreign soil. It also gives UK authorities the power to require a company to remove encryption, or limit the rollout of end-to-end encryption on a future service.

The IPA has faced a series of legal challenges since making it onto the statute books, and the government has been forced to amend certain aspects of it on court order — including beefing up restrictions on access to web activity data. Other challenges to the controversial surveillance regime, including Liberty’s, remain ongoing.

The newly released court documents include damning comments on MI5’s handling of data by the IPCO, which writes: “Without seeking to be emotive, I consider that MI5’s use of warranted data… is currently, in effect, in ‘special measures’ and the historical lack of compliance… is of such gravity that IPCO will need to be satisfied to a greater degree than usual that it is ‘fit for purpose’.”

Liberty also says MI5 knew for three years of failures to maintain key safeguards — such as the timely destruction of material, and the protection of legally privileged material — before informing the IPCO.

Yet a key part of the government’s sales pitch for passing the legislation was its claim of a ‘world class’ double-lock authorization and oversight regime to safeguard the intelligence agencies’ powers to intercept and retain data.

So the latest revelations stemming from Liberty’s legal challenge represent a major embarrassment for the government.

“It is of course paramount that UK intelligence agencies demonstrate full compliance with the law,” the home secretary wrote in the statement last month, before adding his own political spin: “In that context, the interchange between the Commissioner and MI5 on this issue demonstrates that the world leading system of oversight established by the Act is working as it should.”

Liberty comes to the opposite conclusion on that point — emphasizing that warrants for bulk surveillance were issued by senior judges “on the understanding that MI5’s data handling obligations under the IPA were being met — when they were not”.

“The Commissioner has pointed out that warrants would not have been issued if breaches were known,” it goes on. “The Commissioner states that ‘it is impossible to sensibly reconcile the explanation of the handling of arrangements the Judicial Commissioners [senior judges] were given in briefings…with what MI5 knew over a protracted period of time was happening’.”

So, basically, it’s saying that MI5 — having at best misled judges, whose sole job it is to oversee its legal access to data, about its systematic failures to lawfully handle data — has rather made a sham of the entire ‘world class’ oversight regime.

Liberty also flags what it calls “a remarkable admission to the Commissioner” — made by MI5’s deputy director general — who it says acknowledges that personal data collected by MI5 is being stored in “ungoverned spaces”. It adds that the MI5 legal team claims there is “a high likelihood [of material] being discovered when it should have been deleted, in a disclosure exercise leading to substantial legal or oversight failure”.

“Ungoverned spaces” is not a phrase that made it into Javid’s statement last month on MI5’s “compliance risks”.

But the home secretary did acknowledge: “A report of the Investigatory Powers Commissioner’s Office suggests that MI5 may not have had sufficient assurance of compliance with these safeguards within one of its technology environments.”

Javid also said he had set up “an independent review to consider and report back to me on what lessons can be learned for the future”. Though it’s unclear whether that report will be made public. 

We reached out to the Home Office for comment on the latest revelations from Liberty’s litigation. But a spokesman just pointed us to Javid’s prior statement. 

In a statement, Liberty’s lawyer, Megan Goulding, said: “These shocking revelations expose how MI5 has been illegally mishandling our data for years, storing it when they have no legal basis to do so. This could include our most deeply sensitive information – our calls and messages, our location data, our web browsing history.

“It is unacceptable that the public is only learning now about these serious breaches after the Government has been forced into revealing them in the course of Liberty’s legal challenge. In addition to showing a flagrant disregard for our rights, MI5 has attempted to hide its mistakes by providing misinformation to the Investigatory Powers Commissioner, who oversees the Government’s surveillance regime.

“And, despite a light being shone on this deplorable violation of our rights, the Government is still trying to keep us in the dark over further examples of MI5 seriously breaching the law.”


By Natasha Lomas

Microsoft makes a push for service mesh interoperability

Service meshes. They are the hot new thing in the cloud-native computing world. At KubeCon, the biannual festival of all things cloud native, Microsoft today announced that it is teaming up with a number of companies in this space to create a generic service mesh interface. This will make it easier for developers to adopt the concept without locking them into a specific technology.

In a world where the number of network endpoints continues to increase as developers launch new microservices, containers and other systems at a rapid clip, service meshes make the network smarter again by handling encryption, traffic management and other functions so that the actual applications don’t have to worry about them. With a number of competing service mesh technologies, though, including the likes of Istio and Linkerd, developers currently have to choose which one of these to support.

“I’m really thrilled to see that we were able to pull together a pretty broad consortium of folks from across the industry to help us drive some interoperability in the service mesh space,” Gabe Monroy, Microsoft’s lead product manager for containers and the former CTO of Deis, told me. “This is obviously hot technology — and for good reasons. The cloud-native ecosystem is driving the need for smarter networks and smarter pipes and service mesh technology provides answers.”

The partners here include Buoyant, HashiCorp, Solo.io, Red Hat, AspenMesh, Weaveworks, Docker, Rancher, Pivotal, Kinvolk and VMware. That’s a pretty broad coalition, though it notably doesn’t include cloud heavyweights like Google, the company behind Istio, and AWS.

“In a rapidly evolving ecosystem, having a set of common standards is critical to preserving the best possible end-user experience,” said Idit Levine, founder and CEO of Solo.io. “This was the vision behind SuperGloo – to create an abstraction layer for consistency across different meshes, which led us to the release of Service Mesh Hub last week. We are excited to see service mesh adoption evolve into an industry level initiative with the SMI specification.”

For the time being, the interoperability features focus on traffic policy, telemetry and traffic management. Monroy argues that these are the most pressing problems right now. He stressed that this common interface still allows the different service mesh tools to innovate, and that developers can always work directly with their APIs when needed. He also noted that the Service Mesh Interface (SMI), as this new specification is called, does not provide any implementations of these features itself. It only defines a common set of APIs.
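For illustration, a traffic-management resource in the spirit of SMI might look like the following Kubernetes manifest. The API group, version and field names are based on the early v1alpha1 drafts of the specification and may have changed since; treat this as a sketch, not a definitive schema:

```yaml
# Hypothetical SMI TrafficSplit: route 90% of traffic addressed to
# my-service to the v1 backend and 10% to a v2 canary. Any conformant
# mesh (Istio, Linkerd, etc.) interprets the same resource through its
# own implementation.
apiVersion: split.smi-spec.io/v1alpha1
kind: TrafficSplit
metadata:
  name: canary-rollout
spec:
  service: my-service         # the root service that clients address
  backends:
    - service: my-service-v1
      weight: 900m            # ~90% of traffic
    - service: my-service-v2
      weight: 100m            # ~10% canary
```

The point of the spec is exactly that this one resource could be applied unchanged whichever conformant mesh is installed in the cluster.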

Currently, the most well-known service mesh is probably Istio, which Google, IBM and Lyft launched about two years ago. SMI may just bring a bit more competition to this market since it will allow developers to bet on the overall idea of a service mesh instead of a specific implementation.

In addition to SMI, Microsoft also today announced a couple of other updates around its cloud-native and Kubernetes services. It announced the first alpha of the Helm 3 package manager, for example, as well as the 1.0 release of its Kubernetes extension for Visual Studio Code and the general availability of its AKS virtual nodes, using the open source Virtual Kubelet project.


By Frederic Lardinois

Slack hands over control of encryption keys to regulated customers

Slack announced today that it is launching Enterprise Key Management (EKM) for Slack, a new tool that enables customers to control their encryption keys in the enterprise version of the communications app. The keys are managed in the AWS KMS key management tool.

Geoff Belknap, chief security officer (CSO) at Slack, says that the new tool should appeal to customers in regulated industries, who might need tighter control over security. “Markets like financial services, health care and government are typically underserved in terms of which collaboration tools they can use, so we wanted to design an experience that catered to their particular security needs,” Belknap told TechCrunch.

Slack currently encrypts data in transit and at rest, but the new tool augments this by giving customers greater control over the encryption keys that Slack uses to encrypt messages and files being shared inside the app.

He said that regulated industries in particular have been requesting the ability to control their own encryption keys including the ability to revoke them if it was required for security reasons. “EKM is a key requirement for growing enterprise companies of all sizes, and was a requested feature from many of our Enterprise Grid customers. We wanted to give these customers full control over their encryption keys, and when or if they want to revoke them,” he said.
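The pattern behind this kind of customer-held-key design is usually envelope encryption: each message gets its own data key, which is stored only in wrapped (encrypted) form under the customer’s master key, so revoking the master key renders every envelope unreadable at once. The sketch below illustrates that flow only; it is not Slack’s implementation, and the toy XOR keystream stands in for a real cipher such as AES-GCM managed by AWS KMS:

```python
import hashlib
import os

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR with a SHA-256 counter keystream."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

class ToyKMS:
    """Stand-in for a customer-controlled key service; revoke() models
    the customer cutting off access to their master key."""
    def __init__(self):
        self.master_keys = {}
    def create_key(self, key_id):
        self.master_keys[key_id] = os.urandom(32)
    def wrap(self, key_id, data_key):
        return keystream_xor(self.master_keys[key_id], data_key)
    def unwrap(self, key_id, wrapped):
        if key_id not in self.master_keys:
            raise PermissionError("master key revoked")
        return keystream_xor(self.master_keys[key_id], wrapped)
    def revoke(self, key_id):
        del self.master_keys[key_id]

def encrypt_message(kms, key_id, plaintext: bytes):
    data_key = os.urandom(32)                  # fresh key per message
    return kms.wrap(key_id, data_key), keystream_xor(data_key, plaintext)

def decrypt_message(kms, key_id, wrapped_key, ciphertext):
    data_key = kms.unwrap(key_id, wrapped_key) # fails after revocation
    return keystream_xor(data_key, ciphertext)
```

The design choice that matters here is that the service stores only wrapped data keys, so the customer’s single revocation instantly makes all previously encrypted content inaccessible without re-encrypting anything.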

Screenshot: Slack

Belknap says that this is especially important when customers involve people outside the organization such as contractors, partners or vendors in Slack communications. “A big benefit of EKM is that in the event of a security threat or if you ever experience suspicious activity, your security team can cut off access to the content at any time if necessary,” Belknap explained.

In addition to controlling the encryption keys, customers can gain greater visibility into activity inside of Slack via the Audit Logs API. “Detailed activity logs tell customers exactly when and where their data is being accessed, so they can be alerted of risks and anomalies immediately,” he said. If a customer finds suspicious activity, it can cut off access.
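As a purely hypothetical illustration of the kind of check such logs enable (the field names below are invented, not the schema of Slack’s actual Audit Logs API), a security team might flag the first time a user appears from a previously unseen location:

```python
# Toy anomaly check over audit-log entries. Each entry is assumed to be
# a dict with "ts" (timestamp), "user" and "location" fields; real audit
# APIs return richer, differently named records.

def flag_anomalies(entries):
    seen = {}        # user -> set of locations observed so far
    flagged = []
    for e in sorted(entries, key=lambda e: e["ts"]):
        known = seen.setdefault(e["user"], set())
        # Flag only when the user has history and this location is new.
        if known and e["location"] not in known:
            flagged.append(e)
        known.add(e["location"])
    return flagged
```

A real pipeline would stream entries from the API and feed hits into an alerting system, but the shape of the logic is the same.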

EKM for Slack is generally available today for Enterprise Grid customers for an additional fee. Slack, which announced plans to go public last month, has raised over $1 billion on a $7 billion valuation.


By Ron Miller

Why Daimler moved its big data platform to the cloud

Like virtually every big enterprise company, a few years ago, the German auto giant Daimler decided to invest in its own on-premises data centers. And while those aren’t going away anytime soon, the company today announced that it has successfully moved its on-premises big data platform to Microsoft’s Azure cloud. This new platform, which the company calls eXtollo, is Daimler’s first major service to run outside of its own data centers, though it’ll probably not be the last.

As Guido Vetter, the head of Daimler’s corporate center of excellence for advanced analytics and big data, told me, the company started getting interested in big data about five years ago. “We invested in technology — the classical way, on-premise — and got a couple of people on it. And we were investigating what we could do with data because data is transforming our whole business as well,” he said.

By 2016, the size of the organization had grown to the point where a more formal structure was needed to enable the company to handle its data at a global scale. At the time, the buzzword was ‘data lakes’ and the company started building its own in order to build out its analytics capacities.

Electric Line-Up, Daimler AG

“Sooner or later, we hit the limits as it’s not our core business to run these big environments,” Vetter said. “Flexibility and scalability are what you need for AI and advanced analytics and our whole operations are not set up for that. Our backend operations are set up for keeping a plant running and keeping everything safe and secure.” But in this new world of enterprise IT, companies need to be able to be flexible and experiment — and, if necessary, throw out failed experiments quickly.

So about a year and a half ago, Vetter’s team started the eXtollo project to bring all the company’s activities around advanced analytics, big data and artificial intelligence into the Azure cloud, and just over two weeks ago the team shut down its last on-premises servers after slowly turning on its solutions in Microsoft’s data centers in Europe, the U.S. and Asia. All in all, the actual transition between the on-premises data centers and the Azure cloud took about nine months. That may not seem fast, but for an enterprise project like this, that’s about as fast as it gets (and for a while, the company fed all new data into both its on-premises data lake and Azure).

If you work for a startup, then all of this probably doesn’t seem like a big deal, but for a more traditional enterprise like Daimler, even just giving up control over the physical hardware where your data resides was a major culture change and something that took quite a bit of convincing. In the end, the solution came down to encryption.

“We needed the means to secure the data in the Microsoft data center with our own means that ensure that only we have access to the raw data and work with the data,” explained Vetter. In the end, the company decided to use the Azure Key Vault to manage and rotate its encryption keys. Indeed, Vetter noted that knowing that the company had full control over its own data was what allowed this project to move forward.

Vetter tells me that the company obviously looked at Microsoft’s competitors as well, but he noted that his team didn’t find a compelling offer from other vendors in terms of functionality and the security features that it needed.

Today, Daimler’s big data unit uses tools like HDInsight and Azure Databricks, which cover more than 90 percent of the company’s current use cases. In the future, Vetter also wants to make it easier for less experienced users to launch AI and analytics services using self-service tools.

While cost is often a factor that counts against the cloud, since renting server capacity isn’t cheap, Vetter argues that this move will actually save the company money, and that storage costs, especially, are going to be cheaper in the cloud than in its on-premises data center (and chances are that Daimler, given its size and prestige as a customer, isn’t exactly paying the same rack rate that others are paying for Azure services).

As with so many big data AI projects, predictions are the focus of much of what Daimler is doing. That may mean looking at a car’s data and error code and helping the technician diagnose an issue or doing predictive maintenance on a commercial vehicle. Interestingly, the company isn’t currently bringing any of its own IoT data from its plants to the cloud. That’s all managed in the company’s on-premises data centers because it wants to avoid the risk of having to shut down a plant because its tools lost the connection to a data center, for example.


By Frederic Lardinois