Nadella warns government conference not to betray user trust

Microsoft CEO Satya Nadella, delivering the keynote at the Microsoft Government Leaders Summit in Washington, DC today, had a message for attendees: maintain user trust in their tools and technologies above all else.

He said it is essential to earn user trust, regardless of your business. “Now, of course, the power law here is all around trust because one of the keys for us, as providers of platforms and tools, trust is everything,” he said. But he said it doesn’t stop with platform providers like Microsoft. Institutions using those tools also have to keep trust top of mind or risk alienating their users.

“That means you need to also ensure that there is trust in the technology that you adopt, and the technology that you create, and that’s what’s going to really define the power law on this equation. If you have trust, you will have exponential benefit. If you erode trust it will exponentially decay,” he said.

He says Microsoft sees trust along three dimensions: privacy, security and ethical use of artificial intelligence. All of these come together in his view to build a basis of trust with your customers.

Nadella said he sees privacy as a human right, pure and simple, and it’s up to vendors to ensure that privacy or lose the trust of their customers. “The investments around data governance is what’s going to define whether you’re serious about privacy or not,” he said. For Microsoft, that means looking at how transparent it is about how it uses the data, what its terms of service say, and how it uses technology to ensure all of that is actually carried out at runtime.

He reiterated the call he made last year for a federal privacy law. With GDPR in Europe and California’s CCPA coming online in January, he sees a centralized federal law as a way to streamline regulations for business.

As for security, as you might expect, he defined it in terms of how Microsoft is implementing it, but the message was clear: security has to be part of your approach to trust, regardless of how you implement it. He asked several key questions of attendees.

“Cyber is the second area where we not only have to do our work, but you have to [ask], what’s your operational security posture, how have you thought about having the best security technology deployed across the entire chain, whether it’s on the application side, the infrastructure side or on the endpoint side, and most importantly, around identity,” Nadella said.

The final piece, one which he said was just coming into play, was how you use artificial intelligence ethically, a sensitive topic for a government audience, but one he wasn’t afraid to broach. “One of the things people say is, ‘Oh, this AI thing is so unexplainable, especially deep learning.’ But guess what, you created that deep learning [model]. In fact, the data on top of which you train the model, the parameters and the number of parameters you use — a lot of things are in your control. So we should not abdicate our responsibility when creating AI,” he said.

Whether Microsoft or the US government can adhere to these lofty goals is unclear, but Nadella was careful to outline them both for his company’s benefit and this particular audience. It’s up to both of them to follow through.


By Ron Miller

Satya Nadella looks to the future with edge computing

Speaking today at the Microsoft Government Leaders Summit in Washington DC, Microsoft CEO Satya Nadella made the case for edge computing, even while pushing the Azure cloud as what he called “the world’s computer.”

While Amazon, Google and other competitors may have something to say about that, marketing hype aside, many companies are still in the midst of transitioning to the cloud. Nadella says the future of computing could actually be at the edge, where computing is done locally before data is transferred to the cloud for AI and machine learning purposes. What goes around comes around.

But as Nadella sees it, this is not going to be about either edge or cloud. It’s going to be the two technologies working in tandem. “Now, all this is being driven by this new tech paradigm that we describe as the intelligent cloud and the intelligent edge,” he said today.

He said that to truly understand the impact the edge is going to have on computing, you have to look at research, which predicts there will be 50 billion connected devices in the world by 2030, a number even he finds astonishing. “I mean this is pretty stunning. We think about a billion Windows machines or a couple of billion smartphones. This is 50 billion [devices], and that’s the scope,” he said.

The key here is that these 50 billion devices, whether you call them edge devices or the Internet of Things, will be generating tons of data. That means you will have to develop entirely new ways of thinking about how all this flows together. “The capacity at the edge, that ubiquity is going to be transformative in how we think about computation in any business process of ours,” he said. As we generate ever-increasing amounts of data, whether we are talking about public sector use cases or any business need, it’s going to be the fuel for artificial intelligence, and he sees the sheer amount of that data driving new AI use cases.

“Of course when you have that rich computational fabric, one of the things that you can do is create this new asset, which is data and AI. There is not going to be a single application, a single experience that you are going to build, that is not going to be driven by AI, and that means you have to really have the ability to reason over large amounts of data to create that AI,” he said.

Nadella would be more than happy to have his audience take care of all that using Microsoft products, whether Azure compute, database, AI tools or edge computers like the Data Box Edge it introduced in 2018. While Nadella is probably right about the future of computing, all of this could apply to any cloud, not just Microsoft’s.

As computing shifts to the edge, it’s going to have a profound impact on the way we think about technology in general, but it’s probably not going to involve being tied to a single vendor, regardless of how comprehensive their offerings may be.


By Ron Miller

Microsoft’s Windows Virtual Desktop service is now generally available

Microsoft today announced that Windows Virtual Desktop (WVD), the Azure-based system for virtualizing the Windows and Office user experience that it announced last September, is now generally available. Using WVD, enterprises can give their employees access to virtualized applications and remote desktops, including the ability to provide multi-session Windows 10 experiences, something that sets Microsoft’s offering apart from those of other vendors that offer virtualized Windows desktops and applications.

In addition to making the service generally available, Microsoft is also rolling it out globally; the preview was U.S.-only, and the original plan was to expand it to other regions only gradually. Scott Manchester, the principal engineering lead for WVD, told me that over 20,000 companies signed up for the preview. He also noted that Microsoft Teams is getting enhanced support in WVD, with a significantly improved video conferencing experience.

Shortly after announcing the preview of WVD, Microsoft acquired a company called FSLogix, which specialized in provisioning the same kind of virtualized Windows environments that Microsoft offers through WVD. As Brad Anderson, Microsoft’s corporate VP for Microsoft 365, told me ahead of today’s announcement, the company took a lot of the know-how from FSLogix to ensure that the user experience on WVD is as smooth as possible.

Anderson noted that just as enterprises are getting more comfortable with moving some of their infrastructure to the cloud (and letting others worry about managing it), there is now also growing demand from organizations that want the same for their desktops. “They look at the cloud as a way of saying, ‘listen, let the experts manage the infrastructure. They can optimize it; they can fine-tune it; they can make sure that it’s all done right.’ And then I’ll just have a first-party service — in this case Microsoft — that I can leverage to simplify my life and enable me to spin up and down capacity on demand,” Anderson said. He also noted, though, that making sure that these services are always available is maybe even more critical than for other workloads that have moved to the cloud. If your desktop stops working, you can’t get much done, after all.

Anderson also stressed that if a customer wants a multi-session Windows 10 environment in the cloud, WVD is the only way to go because that is the only way to get a license to do so. “We’ve built the operating system, we built the public cloud, so that combination is going to be unique and this gives us the ability to make sure that that Windows 10 experience is the absolute best on top of that public cloud,” he noted.

He also stressed that the FSLogix acquisition enabled his team to work with the Office team to optimize the user experience there. Thanks to this, when you spin up a new virtualized version of Outlook, for example, it’ll just take a second or two to load instead of almost a minute.

A number of companies are also still looking to upgrade their old Windows 7 deployments. Microsoft will stop providing free security patches for them very soon, but on WVD, these users will still be able to get access to virtualized Windows 7 desktops with free extended security updates until January 2023.  Anderson does not believe that this will be a major driver for WVD adoption, but he does see “pockets of customers who are working on their transition.”

Enterprises can access Windows 10 Enterprise and Windows 7 Enterprise on WVD at no additional licensing cost (though, of course, the Azure resources they consume will cost them) if they have an eligible Windows 10 Enterprise or Microsoft 365 license.

 


By Frederic Lardinois

Symantec’s Sheila Jordan named to Slack’s board of directors

Workplace collaboration software business Slack (NYSE: WORK) has added Sheila Jordan, senior vice president and chief information officer of Symantec, as an independent member of its board of directors. The appointment comes three months after the company completed a direct listing on the New York Stock Exchange.

Jordan, who is responsible for driving information technology strategy and operations at Symantec, brings significant cybersecurity expertise to Slack’s board. Prior to joining Symantec in 2014, she was a senior vice president of IT at Cisco and, before that, spent nearly 15 years as an executive at Disney Destinations.

With the new appointment, Slack appears to be doubling down on security. In addition to the board announcement, Slack recently published a blog post outlining the company’s latest security strategy, likely part of a broader attempt to sway potential customers — particularly those in highly regulated industries — wary of the company’s security processes. The post introduced new features, including the ability to allow teams to work remotely while maintaining compliance with industry and company-specific requirements.

Jordan joins Slack co-founder and chief executive officer Stewart Butterfield, former Goldman Sachs executive Edith Cooper, Accel general partner Andrew Braccia, Nextdoor CEO Sarah Friar, Andreessen Horowitz general partner John O’Farrell, Social Capital CEO Chamath Palihapitiya and former Salesforce chief financial officer Graham Smith on Slack’s board of directors.

“I believe there is nothing more critical than driving organizational alignment and agility within enterprises today,” Jordan said in a statement. “Slack has developed a new category of enterprise software to help unlock this potential and I’m thrilled to now be a part of their story.”

Slack closed up nearly 50% on its first day of trading in June but has since stumbled amid reports of increased competition from Microsoft, which operates a Slack-like product called Teams.

Slack co-founder and chief technology officer Cal Henderson will join us onstage at TechCrunch Disrupt San Francisco next week to discuss the company’s founding, road to the public markets and path forward. Buy tickets here.


By Kate Clark

QC Ware Forge will give developers access to quantum hardware and simulators across vendors

Quantum computing is almost ready for prime time, and, according to most experts, now is the time to start learning how to best develop for this new and less than intuitive technology. With multiple vendors like D-Wave, Google, IBM, Microsoft and Rigetti offering commercial and open-source hardware solutions, simulators and other tools, there’s already a lot of fragmentation in this business. QC Ware, which is launching its Forge cloud platform into beta today, wants to become the go-to middleman for accessing the quantum computing hardware and simulators of these vendors.

Forge, which like the rest of QC Ware’s efforts is aimed at enterprise users, will give developers the ability to run their algorithms on a variety of hardware platforms and simulators. The company argues that developers won’t need to have any previous expertise in quantum computing, though having a bit of background surely isn’t going to hurt. From Forge’s user interface, developers will be able to run algorithms for binary optimization, chemistry simulation and machine learning.
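To make that concrete — the client and method names below are hypothetical, a rough sketch of the kind of workflow a multi-backend platform like Forge promises rather than QC Ware’s actual API — submitting a small binary optimization problem and picking a backend might look something like this:

```python
# Hypothetical sketch of a multi-backend quantum platform workflow.
# Names are illustrative; this is not QC Ware's actual API.
from dataclasses import dataclass

@dataclass
class BinaryOptimizationProblem:
    # QUBO-style problem: minimize sum of q[i, j] * x[i] * x[j] over binary x.
    qubo: dict  # maps (i, j) index pairs to coefficients

class QuantumPlatformClient:
    """Toy stand-in for a client that routes jobs to hardware or simulators."""

    def __init__(self, api_key: str):
        self.api_key = api_key

    def solve(self, problem: BinaryOptimizationProblem, backend: str) -> dict:
        # A real service would dispatch to D-Wave hardware or a cloud simulator;
        # this sketch just brute-forces the QUBO locally for illustration.
        n = max(max(pair) for pair in problem.qubo) + 1
        best_bits, best_energy = None, float("inf")
        for assignment in range(2 ** n):
            bits = [(assignment >> i) & 1 for i in range(n)]
            energy = sum(c * bits[i] * bits[j] for (i, j), c in problem.qubo.items())
            if energy < best_energy:
                best_bits, best_energy = bits, energy
        return {"backend": backend, "solution": best_bits, "energy": best_energy}

client = QuantumPlatformClient(api_key="...")
problem = BinaryOptimizationProblem(qubo={(0, 0): -1.0, (1, 1): -1.0, (0, 1): 2.0})
print(client.solve(problem, backend="dwave_hardware"))
print(client.solve(problem, backend="cloud_simulator"))
```

The point of a middleman platform is that the last argument — which backend actually runs the job — should be the only thing a developer has to change as the hardware landscape shifts.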


“Practical quantum advantage will occur. Most experts agree that it’s a matter of ‘when’ not ‘if.’ The way to pull that horizon closer is by having the user community fully engaged in quantum computing application discovery. The objective of Forge is to allow those users to access the full range of quantum computing resources through a single platform,” said Matt Johnson, CEO of QC Ware. “To assist our customers in that exploration, we are spending all of our cycles working on ways to squeeze as much power as possible out of near-term quantum computers, and to bake those methods into Forge.”

Currently, QC Ware Forge offers access to hardware from D-Wave, as well as open-source simulators running on Google’s and IBM’s clouds, with plans to support a wider variety of platforms in the near future.

Initially, QC Ware also told me that it offered direct access to IBM’s hardware, but that’s not yet the case. “We currently have the integration complete and actively utilized by QC Ware developers and quantum experts,”  QC Ware’s head of business development Yianni Gamvros told me. “However, we are still working with IBM to put an agreement in place in order for our end-users to directly access IBM hardware. We expect that to be available in our next major release. For users, this makes it easier for them to deal with the churn. We expect different hardware vendors will lead at different times and that will keep changing every six months. And for our quantum computing hardware vendors, they have a channel partner they can sell through.”

Users who sign up for the beta will receive 30 days of access to the platform and one minute of actual quantum computing time to evaluate it.


By Frederic Lardinois

Windows 10 now runs on over 900M devices

So you thought there were 800 million Windows 10 devices that would get Microsoft’s most recent out-of-band emergency patch? Think again. As the company announced on Twitter today, Windows 10 now runs on over 900 million devices.

That’s a bit of bad timing, but current security issues aside, the momentum for Windows 10 clearly remains steady. Last September, Microsoft said Windows 10 was running on 700 million devices and by March of this year, that number had gone up to 800 million. That number includes standard Windows 10 desktops and laptops, as well as the Xbox and niche devices like the Surface Hub and Microsoft’s HoloLens.

As Yusuf Mehdi, Microsoft’s Corporate Vice President of its ‘Modern Life, Search and Devices’ group, also noted, the company added more Windows 10 devices in the last twelve months than ever before.

Come January 2020, Windows 7 is hitting the end of its (supported) life, which is likely pushing at least some users to move over to a more modern (and supported) operating system.

While those numbers for Windows 10 are clearly ticking up, Microsoft itself famously thought that Windows 10 would get to 1 billion devices by the middle of 2018. At this rate, Windows 10 will likely hit 1 billion sometime in 2020.
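As a quick sanity check on that back-of-the-envelope math, using only the milestone figures cited above (and assuming, as a rough approximation, that growth stays linear):

```python
# Rough extrapolation from the device counts Microsoft has cited publicly.
milestones = {"Sept 2018": 700_000_000, "March 2019": 800_000_000, "Sept 2019": 900_000_000}
growth_per_half_year = 100_000_000  # roughly constant between the cited milestones
projected_next_half_year = milestones["Sept 2019"] + growth_per_half_year
print(projected_next_half_year >= 1_000_000_000)  # True -> the 1 billion mark lands sometime in 2020
```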


By Frederic Lardinois

Walt Disney Studios partners with Microsoft Azure on cloud innovation lab

Seems like everything is going to the cloud these days, so why should movie-making be left out? Today, Walt Disney Studios announced a five-year partnership with Microsoft around an innovation lab to find ways to shift content production to the Azure cloud.

The project involves the Walt Disney StudioLab, an innovation work space where Disney personnel can experiment with moving different workflows to the cloud. Avid, the movie production software company, is also involved.

The hope is that by working together, the three parties can come up with creative, cloud-based workflows that can accelerate the innovation cycle at the prestigious movie maker. Every big company is looking for ways to innovate, regardless of their core business, and Disney is no different.

As movie making involves ever greater amounts of computing resources, the cloud is a perfect model for it, allowing studios to scale resources up and down as needed, whether they are rendering scenes or adding special effects. As Disney’s CTO Jamie Voris sees it, this could make these processes more efficient, helping to lower both cost and time to production.

“Through this innovation partnership with Microsoft, we’re able to streamline many of our processes so our talented filmmakers can focus on what they do best,” Voris said in a statement. It’s the same kind of cloud value proposition that many large organizations are seeking. They want to speed time to market, while letting technology handle some of the more mundane tasks.

The partnership builds on an existing one that Microsoft already had with Avid where the two companies have been working together to build cloud-based workflows for the film industry using Avid software solutions on Azure. Disney will add its unique requirements to the mix, and over the five years of the partnership, hopes to streamline some of its workflows in a more modern cloud context.


By Ron Miller

HashiCorp announces fully managed service mesh on Azure

Service mesh is just beginning to take hold in the cloud native world, and as it does, vendors are looking for ways to help customers understand it. One way to simplify the complexity of dealing with the growing number of service mesh products out there is to package it as a service. Today, HashiCorp announced a new service on Azure to address that need, building it into the Consul product.

HashiCorp co-founder and CTO Armon Dadgar says it’s a fully managed service. “We’ve partnered closely with Microsoft to offer a native Consul [service mesh] service. At the highest level, the goal here is, how do we make it basically push button,” Dadgar told TechCrunch.

He adds that there is extremely tight integration in terms of billing and permissions, as well as other management functions, as you would expect with a managed service in the public cloud. Brendan Burns, one of the original Kubernetes developers, who is now a distinguished engineer at Microsoft, says the HashiCorp solution really strips away a lot of the complexity associated with running a service mesh.

“In this case, HashiCorp is using some integration into the Azure control plane to run Consul for you. So you just consume the service mesh. You don’t have to worry about the operations of the service mesh,” Burns said. He added, “This is really turning it into a service instead of a do-it-yourself exercise.”

Service meshes are tools used in conjunction with containers and Kubernetes in dynamic cloud-native environments to help microservices communicate and interoperate with one another. There is a growing number of them, including Istio, Envoy and Linkerd, all jockeying for position right now.
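To picture how that works in practice: application code in a mesh never calls a remote service directly; it calls a sidecar proxy on localhost, and the mesh’s proxies handle discovery, mutual TLS and routing between services. Here is a minimal sketch of that pattern — the service names, path and port are hypothetical, and this illustrates the sidecar idea rather than Consul’s actual API:

```python
# Illustrative sketch of the sidecar pattern used by service meshes such as Consul.
# The service name, path and port below are hypothetical.
import urllib.request

# The "orders" service never dials the "payments" service directly. It talks to a
# proxy listening on localhost; the mesh's proxies handle service discovery,
# mutual TLS and routing, so application code never needs to know where
# "payments" actually runs.
LOCAL_SIDECAR_UPSTREAM = "http://127.0.0.1:9191"  # hypothetical local listener for "payments"

def charge_order(order_id: str) -> bytes:
    # Plain HTTP to the local proxy; the proxy encrypts the hop to the remote
    # sidecar and enforces whatever routing and access policies the mesh defines.
    url = f"{LOCAL_SIDECAR_UPSTREAM}/charge?order_id={order_id}"
    with urllib.request.urlopen(url) as resp:
        return resp.read()
```

A managed offering like the one HashiCorp and Microsoft announced is about running and wiring up those proxies for you, so teams only have to write the application side of this picture.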

Burns makes it clear that while Microsoft is working closely with HashiCorp on this project, it’s also working with other vendors, as well. “Our goal with the service mesh interface specification was really to let a lot of partners be successful on the platform. You know, there’s a bunch of different service meshes. It’s a place where we feel like there’s a lot of evolution and experimentation happening, so we want to make sure that our customers can find the right solution for them,” Burns explained.

The HashiCorp Consul service is currently in private beta.


By Ron Miller

Top VCs on the changing landscape for enterprise startups

Yesterday at TechCrunch’s Enterprise event in San Francisco, we sat down with three venture capitalists who spend a lot of their time thinking about enterprise startups. We wanted to ask what trends they are seeing, what concerns they might have about the state of the market, and of course, how startups might persuade them to write a check.

We covered a lot of ground with the investors — Jason Green of Emergence Capital, Rebecca Lynn of Canvas Ventures, and Maha Ibrahim of Canaan Partners — who told us, among other things, that startups shouldn’t expect a big M&A event right now, that there’s no first-mover advantage in the enterprise realm, and why grit may be the quality that ends up keeping a startup afloat.

On the growth of enterprise startups:

Jason Green: When we started Emergence 15 years ago, we saw maybe a few hundred startups a year, and we funded about five or six. Today, we see over 1,000 a year; we probably do deep diligence on 25.


By Connie Loizos

Atlassian launches free tiers for all its cloud products, extends premium pricing plan

At our TC Sessions: Enterprise event, Atlassian co-CEO Scott Farquhar today announced a number of updates to how the company will sell its cloud-based services. These include the launch of new premium plans for more of its products, as well as the addition of a free tier for all of the company’s services that didn’t already offer one. Atlassian now also offers discounted cloud pricing for academic institutions and nonprofit organizations.

The company previously announced its premium plans for Jira Software Cloud and Confluence Cloud. Now, it is adding Jira Service Desk to this lineup, and chances are it’ll add more of its services over time. The premium plan adds a 99.9% uptime SLA, unlimited storage and additional support. Until now, Atlassian sold these products solely based on the number of users, but didn’t offer a specific enterprise plan.

As Harsh Jawharkar, the head of go-to-market for Cloud Platform at Atlassian, told me, many of its larger customers, who often ran the company’s products on their own servers before, are now looking to move to the cloud and hand over to Atlassian the day-to-day operations of these services. That’s in part because they are more comfortable with the idea of moving to the cloud at this point — and because Atlassian probably knows how to run its own services better than anybody else. 

For these companies, Atlassian is also introducing a number of new features today. Those include soon-to-launch data residency controls for companies that need to ensure that their data stays in a certain geographic region, as well as the ability to run Jira and Confluence Cloud behind customized URLs that align with a company’s brand, which will launch in early access in 2020. Maybe more important, though, are new features for Atlassian Access, the company’s command center that helps enterprises manage Atlassian’s cloud products. Access now supports single sign-on with Google Cloud Identity and Microsoft Active Directory Federation Services, for example. The company is also partnering with McAfee and Bitglass to offer additional advanced security features and launch a cross-product audit log. Enterprise admins will also soon get access to a new dashboard that will help them understand how Atlassian’s tools are being used across the organization.

But that’s not all. The company is also launching new tools to make customer migration to its cloud products easier, with initial support for Confluence and Jira coming later this year. There are also new extended cloud trial licenses, which a lot of customers have asked for, Jawharkar told me, because the relatively short trial periods the company previously offered weren’t quite long enough for companies to fully understand their needs.

This is a big slew of updates for Atlassian — maybe its biggest enterprise-centric release since the company’s launch. It has clearly reached a point where it had to start offering these enterprise features if it wanted to grow its market and bring more of these large companies on board. In its early days, Atlassian mostly grew by selling directly to teams within a company. These days, it has to focus a bit more on selling to executives as it tries to bring more enterprises on board — and those companies have very specific needs that the company didn’t have to address before. Today’s launches clearly show that it is now doing so — at least for its cloud-based products.

The company isn’t forgetting about other users either, though. It’ll still offer entry-level plans for smaller teams and it’s now adding free tiers to products like Jira Software, Confluence, Jira Service Desk and Jira Core. They’ll join Trello, Bitbucket and Opsgenie, which already feature free versions. Going forward, academic institutions will receive 50% off their cloud subscriptions and nonprofits will receive 75% off.

It’s obvious that Atlassian is putting a lot of emphasis on its cloud services. It’s not doing away with its self-hosted products anytime soon, but its focus is clearly elsewhere. The company itself started this process a few years ago and a lot of this work is now coming to fruition. As Anu Bharadwaj, the head of Cloud Platform at Atlassian, told me, this move to a fully cloud-native stack enabled many of today’s announcements, and she expects that it’ll bring a lot of new customers to its cloud-based services.


By Frederic Lardinois

Why now is the time to get ready for quantum computing

For the longest time, even while scientists were working to make it a reality, quantum computing seemed like science fiction. It’s hard enough to make any sense out of quantum physics to begin with, let alone the practical applications of this less than intuitive theory. But we’ve now arrived at a point where companies like D-Wave, Rigetti, IBM and others actually produce real quantum computers.

They are still in their infancy and nowhere near as powerful as necessary to compute anything but very basic programs, simply because they can’t run long enough before the quantum states decohere, but virtually all experts say that these are solvable problems and that now is the time to prepare for the advent of quantum computing. Indeed, Gartner just launched a Quantum Volume metric, based on IBM’s research, that looks to help CIOs prepare for the impact of quantum computing.

To discuss the state of the industry and why now is the time to get ready, I sat down with IBM’s Jay Gambetta, who will also join us for a panel on quantum computing at our TC Sessions: Enterprise event in San Francisco on September 5, together with Microsoft’s Krysta Svore and Intel’s Jim Clarke.


By Frederic Lardinois

Five great reasons to attend TechCrunch’s Enterprise show Sept. 5 in SF

The vast enterprise tech category is Silicon Valley’s richest, and today it’s poised to change faster than ever before. That’s probably the biggest reason to come to TechCrunch’s first-ever show focused entirely on enterprise. But here are five more reasons to commit to joining TechCrunch’s editors on September 5 at San Francisco’s Yerba Buena Center for an outstanding day (agenda here) addressing the tech tsunami sweeping through enterprise. 

No. 1: Artificial intelligence
At once the most consequential and most hyped technology, no one doubts that AI will change business software and increase productivity like few, if any, technologies before it. To peek ahead into that future, TechCrunch will interview Andrew Ng, arguably the world’s most experienced AI practitioner at huge companies (Baidu, Google) as well as at startups. AI will be a theme across every session, but we’ll address it again head-on in a panel with investor Jocelyn Goldfein (Zetta), founder Bindu Reddy (Reality Engines) and executive John Ball (Salesforce / Einstein). 

No. 2: Data, the cloud and Kubernetes
If AI is at the dawn of tomorrow, cloud transformation is the high noon of today. Indeed, 90% of the world’s data was created in the past two years, and no enterprise can keep its data hoard on-prem forever. Azure CTO Mark Russinovich will discuss Microsoft’s vision for the cloud. Leaders in the open-source Kubernetes revolution — Joe Beda (VMware), Aparna Sinha (Google) and others — will dig into what Kubernetes means to companies making the move to cloud. And last, there is the question of how to find signal in all the data — which will bring three visionary founders to the stage: Benoit Dageville (Snowflake), Ali Ghodsi (Databricks) and Murli Thirumale (Portworx).

No. 3: Everything else on the main stage!
Let’s start with a fireside chat with SAP CEO Bill McDermott and Qualtrics Chief Experience Officer Julie Larson-Green. We have top investors talking about where they are making their bets, and security experts talking data and privacy. And then there is quantum computing, the technology revolution waiting on the other side of AI: Jay Gambetta, the principal theoretical scientist behind IBM’s quantum computing effort, Jim Clarke, the director of quantum hardware at Intel Labs, and Krysta Svore, who leads Microsoft’s quantum effort.

All told, there are 21 programming sessions.

No. 4: Network and get your questions answered
There will be two Q&A breakout sessions with top enterprise investors; this is for founders (and anyone else) to query investors directly. Plus, TechCrunch’s unbeatable CrunchMatch app makes it really easy to set up meetings with the other attendees, an incredible array of folks, plus the 20 early-stage startups exhibiting on the expo floor.

No. 5: SAP
Enterprise giant SAP is our sponsor for the show, and they are not only bringing a squad of top executives, they are producing four parallel track sessions featuring SAP Chief Innovation Officer Max Wessel, SAP Chief Designer and Futurist Martin Wezowski and SAP.iO managing director Ram Jambunathan, in sessions covering how to scale up an enterprise startup, how startups win large enterprise customers, and what the enterprise future looks like.

Check out the complete agenda. Don’t miss this show! This line-up is a view into the future like none other. 

Grab your $349 tickets today, and don’t wait til the day of to book because prices go up at the door!

We still have two Startup Demo Tables left. Each table comes with four tickets and a prime location to demo your startup on the expo floor. Book your demo table now before they’re all gone!


By Robert Frawley

Ally raises $8M Series A for its OKR solution

OKRs, or Objectives and Key Results, are a popular planning method in Silicon Valley. As with most of those methods that make you fill in some form once every quarter, I’m pretty sure employees find them rather annoying and a waste of their time. Ally wants to change that and make the process more useful. The company today announced that it has raised an $8 million Series A round led by Accel Partners, with participation from Vulcan Capital, Founders Co-op and Lee Fixel. The company, which launched in 2018, previously raised a $3 million seed round.

Ally founder and CEO Vetri Vellore tells me that he learned his management lessons and the value of OKR at his last startup, Chronus. After years of managing large teams at enterprises like Microsoft, he found himself challenged to manage a small team at a startup. “I went and looked for new models of running a business execution. And OKRs were one of those things I stumbled upon. And it worked phenomenally well for us,” Vellore said. That’s where the idea of Ally was born, which Vellore pursued after selling his last startup.

Most companies that adopt this methodology, though, tend to work with spreadsheets and Google Docs. Over time, that simply doesn’t work, especially as companies get larger. Ally, then, is meant to replace these other tools. The service is currently in use at “hundreds” of companies in more than 70 countries, Vellore tells me.
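For those unfamiliar with the mechanics these tools track: an objective is paired with a handful of measurable key results, and their individual progress rolls up into a single score. A minimal sketch of that structure — purely illustrative, not Ally’s actual data model — looks something like this:

```python
# Minimal, illustrative model of OKR tracking -- not Ally's actual data model.
from dataclasses import dataclass, field

@dataclass
class KeyResult:
    description: str
    target: float
    current: float = 0.0

    def progress(self) -> float:
        # Progress toward the target, capped at 100%.
        return min(self.current / self.target, 1.0) if self.target else 0.0

@dataclass
class Objective:
    title: str
    key_results: list = field(default_factory=list)

    def score(self) -> float:
        # A common convention: the objective's score is the average of its key results.
        if not self.key_results:
            return 0.0
        return sum(kr.progress() for kr in self.key_results) / len(self.key_results)

okr = Objective(
    title="Improve onboarding",
    key_results=[
        KeyResult("Ship self-serve setup to 10 teams", target=10, current=4),
        KeyResult("Raise weekly active teams to 60", target=60, current=48),
    ],
)
print(okr.score())  # ~0.6, i.e. the average of 40% and 80% progress
```

Whether that lives in a spreadsheet or a dedicated service, the structure is the same; Ally’s pitch is keeping it updated and visible where teams already work.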

One of its early adopters was Remitly. “We began by using shared documents to align around OKRs at Remitly. When it came time to roll out OKRs to everyone in the company, Ally was by far the best tool we evaluated. OKRs deployed using Ally have helped our teams align around the right goals and have ultimately driven growth,” said Josh Hug, COO of Remitly.


Vellore tells me that he has seen teams go from annual or bi-annual OKRs to more frequently updated goals, too, which is something that’s easier to do when you have a more accessible tool for it. Nobody wants to use yet another tool, though, so Ally features deep integration with Slack, with other integrations in the works (something Ally will use this new funding for).

Since adopting OKRs isn’t always easy for companies that previously used other methodologies (or nothing at all), Ally also offers training and consulting services with online and on-site coaching.

Pricing for Ally starts at $7 per month per user for a basic plan, but the company also offers a flat $29 per month plan for teams with up to 10 users, as well as an enterprise plan, which includes some more advanced features and single sign-on integrations.


By Frederic Lardinois

Microsoft acquires jClarity, an open source Java performance tuning tool

Microsoft announced this morning that it was acquiring jClarity, the company behind an open source tool designed to tune the performance of Java applications — work that will now be focused on Azure. jClarity has also been offering a flavor of Java called AdoptOpenJDK, which it bills as a free alternative to Oracle Java. The companies did not disclose the terms of the deal.

As Microsoft pointed out in a blog post announcing the acquisition, they are seeing increasing use of large-scale Java installations on Azure, both internally with platforms like Minecraft and externally with large customers including Daimler and Adobe.

The company believes that by adding the jClarity team and its toolset, it can help service these Java customers better. “The team, formed by Java champions and data scientists with proven expertise in data driven Java Virtual Machine (JVM) optimizations, will help teams at Microsoft to leverage advancements in the Java platform,” the company wrote in the blog.

Microsoft has actually been part of the AdoptOpenJDK project along with a Who’s Who of other enterprise companies including Amazon, IBM, Pivotal, Red Hat and SAP.

jClarity co-founder and CEO Martijn Verburg, writing in a company blog post announcing the deal, unsurprisingly spoke in glowing terms about the company he was about to become a part of. “Microsoft leads the world in backing developers and their communities, and after speaking to their engineering and programme leadership, it was a no brainer to enter formal discussions. With the passion and deep expertise of Microsoft’s people, we’ll be able to support the Java ecosystem better than ever before,” he wrote.

Verburg also took the time to thank the employees, customers and community who have supported the open source project on top of which his company was built. His new title will be Principal Engineering Group Manager (Java) at Microsoft.

It is unclear how the community will react to another flavor of Java being absorbed by another large vendor, or how the other big vendors involved in the project will feel about it, but regardless, jClarity is part of Microsoft now.


By Ron Miller

Microsoft Azure CTO Mark Russinovich will join us for TC Sessions: Enterprise on September 5

Being the CTO for one of the three major hypercloud providers may seem like enough of a job for most people, but Mark Russinovich, the CTO of Microsoft Azure, has a few other talents in his back pocket. Russinovich, who will join us for a fireside chat at our TechCrunch Sessions: Enterprise event in San Francisco on September 5 (p.s. early-bird sale ends Friday), is also an accomplished novelist who has published four novels, all of which center around tech and cybersecurity.

At our event, though, we won’t focus on his literary accomplishments (except for maybe his books about Windows Server) as much as on the trends he’s seeing in enterprise cloud adoption. Microsoft, maybe more so than its competitors, has always made enterprise customers and their needs the focus of its cloud initiatives. Today, as the majority of enterprises are looking to move at least some of their legacy workloads into the cloud, they are often stumped by the sheer complexity of that undertaking.

In our fireside chat, we’ll talk about what Microsoft is doing to reduce this complexity and how enterprises can maximize their current investments into the cloud, both for running new cloud-native applications and for bringing legacy applications into the future. We’ll also talk about new technologies that can make the move to the cloud more attractive to enterprises, including the current buzz around edge computing, IoT, AI and more.

Before joining Microsoft, Russinovich, who has a Ph.D. in computer engineering from Carnegie Mellon, was the co-founder and chief architect of Winternals Software, which Microsoft acquired in 2006. During his time at Winternals, Russinovich discovered the infamous Sony rootkit. Over his 13 years at Microsoft, he moved from Technical Fellow up to the CTO position for Azure, which continues to grow at a rapid clip as it looks to challenge AWS’s leadership in total cloud revenue.

Tomorrow, Friday, August 16 is your last day to save $100 on tickets before prices go up. Book your early-bird tickets now and keep that Benjamin in your pocket.

If you’re an early-stage startup, we only have three demo table packages left! Each demo package comes with four tickets and a great location for your company to get in front of attendees. Book your demo package today before we sell out!


By Frederic Lardinois