Walt Disney Studios partners with Microsoft Azure on cloud innovation lab

Seems like everything is going to the cloud these days, so why should movie-making be left out? Today, Walt Disney Studios announced a five-year partnership with Microsoft around an innovation lab to find ways to shift content production to the Azure cloud.

The project involves the Walt Disney StudioLab, an innovation workspace where Disney personnel can experiment with moving different workflows to the cloud. Movie production software company Avid is also involved.

The hope is that by working together, the three parties can come up with creative, cloud-based workflows that can accelerate the innovation cycle at the prestigious movie maker. Every big company is looking for ways to innovate, regardless of their core business, and Disney is no different.

As movie-making involves ever greater amounts of computing resources, the cloud is a natural fit, allowing studios to scale resources up and down as needed, whether rendering scenes or adding special effects. As Disney’s CTO Jamie Voris sees it, this could make these processes more efficient, helping to lower cost and time to production.

“Through this innovation partnership with Microsoft, we’re able to streamline many of our processes so our talented filmmakers can focus on what they do best,” Voris said in a statement. It’s the same kind of cloud value proposition that many large organizations are seeking. They want to speed time to market, while letting technology handle some of the more mundane tasks.

The partnership builds on an existing one between Microsoft and Avid, in which the two companies have been working together to build cloud-based workflows for the film industry using Avid software solutions on Azure. Disney will add its unique requirements to the mix and, over the five years of the partnership, hopes to streamline some of its workflows in a more modern cloud context.


By Ron Miller

HashiCorp announces fully managed service mesh on Azure

Service mesh is just beginning to take hold in the cloud native world, and as it does, vendors are looking for ways to help customers understand it. One way to simplify the complexity of dealing with the growing number of service mesh products out there is to package the technology as a service. Today, HashiCorp announced a new service on Azure to address that need, built around its Consul product.

HashiCorp co-founder and CTO Armon Dadgar says it’s a fully managed service. “We’ve partnered closely with Microsoft to offer a native Consul [service mesh] service. At the highest level, the goal here is, how do we make it basically push button,” Dadgar told TechCrunch.

He adds that there is extremely tight integration in terms of billing and permissions, as well as other management functions, as you would expect with a managed service in the public cloud. Brendan Burns, one of the original Kubernetes developers, who is now a distinguished engineer at Microsoft, says the HashiCorp solution really strips away a lot of the complexity associated with running a service mesh.

“In this case, HashiCorp is using some integration into the Azure control plane to run Consul for you. So you just consume the service mesh. You don’t have to worry about the operations of the service mesh,” Burns said. He added, “This is really turning it into a service instead of a do-it-yourself exercise.”

Service meshes are tools used in conjunction with containers and Kubernetes in dynamic cloud native environments to help microservices communicate and interoperate with one another. A growing number of them, including Istio, Envoy and Linkerd, are jockeying for position right now.
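For a sense of the plumbing a managed mesh hides, here is a minimal sketch of service registration and discovery using the third-party python-consul client against a local Consul agent; the service names and addresses are made up for illustration:

```python
import consul

# Connect to a local Consul agent (assumes one is running on the default port).
c = consul.Consul(host="127.0.0.1", port=8500)

# Register a service instance so other services can discover it.
c.agent.service.register(
    name="payments",
    service_id="payments-1",
    address="10.0.0.12",
    port=8080,
    tags=["v1"],
)

# Look up healthy instances of the service. In a mesh, a sidecar proxy does
# the equivalent lookup (plus mTLS, retries and so on) on your behalf.
index, nodes = c.health.service("payments", passing=True)
for node in nodes:
    svc = node["Service"]
    print(f'{svc["ID"]} -> {svc["Address"]}:{svc["Port"]}')
```

The pitch of the managed offering is that none of this plumbing (running the Consul servers, wiring up the proxies) is the user’s problem anymore.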

Burns makes it clear that while Microsoft is working closely with HashiCorp on this project, it’s also working with other vendors, as well. “Our goal with the service mesh interface specification was really to let a lot of partners be successful on the platform. You know, there’s a bunch of different service meshes. It’s a place where we feel like there’s a lot of evolution and experimentation happening, so we want to make sure that our customers can find the right solution for them,” Burns explained.

The HashiCorp Consul service is currently in private beta.


By Ron Miller

Five great reasons to attend TechCrunch’s Enterprise show Sept. 5 in SF

The vast enterprise tech category is Silicon Valley’s richest, and today it’s poised to change faster than ever before. That’s probably the biggest reason to come to TechCrunch’s first-ever show focused entirely on enterprise. But here are five more reasons to commit to joining TechCrunch’s editors on September 5 at San Francisco’s Yerba Buena Center for an outstanding day (agenda here) addressing the tech tsunami sweeping through enterprise. 

No. 1: Artificial intelligence
AI is at once the most consequential and most hyped technology, and no one doubts it will change business software and increase productivity like few, if any, technologies before it. To peek ahead into that future, TechCrunch will interview Andrew Ng, arguably the world’s most experienced AI practitioner at huge companies (Baidu, Google) as well as at startups. AI will be a theme across every session, but we’ll address it again head-on in a panel with investor Jocelyn Goldfein (Zetta), founder Bindu Reddy (Reality Engines) and executive John Ball (Salesforce / Einstein).

No. 2: Data, the cloud and Kubernetes
If AI is at the dawn of tomorrow, cloud transformation is the high noon of today. Indeed, 90% of the world’s data was created in the past two years, and no enterprise can keep its data hoard on-prem forever. Azure CTO Mark Russinovich will discuss Microsoft’s vision for the cloud. Leaders in the open-source Kubernetes revolution — Joe Beda (VMware), Aparna Sinha (Google) and others — will dig into what Kubernetes means to companies making the move to cloud. And last, there is the question of how to find signal in all the data — which will bring three visionary founders to the stage: Benoit Dageville (Snowflake), Ali Ghodsi (Databricks) and Murli Thirumale (Portworx).

No. 3: Everything else on the main stage!
Let’s start with a fireside chat with SAP CEO Bill McDermott and Qualtrics Chief Experience Officer Julie Larson-Green. We have top investors talking about where they are making their bets, and security experts talking data and privacy. And then there is quantum computing, the technology revolution waiting on the other side of AI: we’ll hear from Jay Gambetta, the principal theoretical scientist behind IBM’s quantum computing effort; Jim Clarke, the director of quantum hardware at Intel Labs; and Krysta Svore, who leads Microsoft’s quantum effort.

All told, there are 21 programming sessions.

No. 4: Network and get your questions answered
There will be two Q&A breakout sessions with top enterprise investors; this is for founders (and anyone else) to query investors directly. Plus, TechCrunch’s unbeatable CrunchMatch app makes it really easy to set up meetings with the other attendees, an incredible array of folks, plus the 20 early-stage startups exhibiting on the expo floor.

No. 5: SAP
Enterprise giant SAP is our sponsor for the show, and it is not only bringing a squad of top executives, it is producing four parallel track sessions featuring key executives such as SAP Chief Innovation Officer Max Wessel, SAP Chief Designer and Futurist Martin Wezowski and SAP.iO managing director Ram Jambunathan, in sessions covering how to scale up an enterprise startup, how startups win large enterprise customers, and what the enterprise future looks like.

Check out the complete agenda. Don’t miss this show! This line-up is a view into the future like none other. 

Grab your $349 tickets today, and don’t wait until the day of the show to book, because prices go up at the door!

We still have two Startup Demo Tables left. Each table comes with four tickets and a prime location to demo your startup on the expo floor. Book your demo table now before they’re all gone!


By Robert Frawley

Microsoft acquires jClarity, an open source Java performance tuning tool

Microsoft announced this morning that it is acquiring jClarity, maker of an open source toolset designed to tune the performance of Java applications, work it will now be doing on Azure. In addition, jClarity has been offering a flavor of Java called AdoptOpenJDK, which it bills as a free alternative to Oracle Java. The companies did not disclose the terms of the deal.

As Microsoft pointed out in a blog post announcing the acquisition, they are seeing increasing use of large-scale Java installations on Azure, both internally with platforms like Minecraft and externally with large customers including Daimler and Adobe.

The company believes that by adding the jClarity team and its toolset, it can serve these Java customers better. “The team, formed by Java champions and data scientists with proven expertise in data driven Java Virtual Machine (JVM) optimizations, will help teams at Microsoft to leverage advancements in the Java platform,” the company wrote in the blog.
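jClarity’s own tooling is proprietary, but to make “data driven JVM optimizations” a little more concrete, here is a minimal sketch of the raw material such tools work from: it pulls garbage collection pause times out of a unified JVM log (produced with the standard -Xlog:gc flag on JDK 9 and later) and summarizes them. The log format assumptions and the regex are mine:

```python
import re
import statistics
import sys

# Matches pause lines in unified JVM GC logs (java -Xlog:gc:file=gc.log ...),
# e.g. "[0.123s][info][gc] GC(0) Pause Young (Normal) ... 3.456ms"
PAUSE_RE = re.compile(r"Pause.*?(\d+\.\d+)ms")

def summarize(path):
    pauses = []
    with open(path) as log:
        for line in log:
            match = PAUSE_RE.search(line)
            if match:
                pauses.append(float(match.group(1)))
    if not pauses:
        return "no GC pauses found"
    return (f"{len(pauses)} pauses, "
            f"mean {statistics.mean(pauses):.2f}ms, "
            f"max {max(pauses):.2f}ms")

if __name__ == "__main__":
    print(summarize(sys.argv[1]))  # e.g. python gc_summary.py gc.log
```

Feed enough of this kind of data into a model and you can start making tuning recommendations automatically, which is roughly the territory jClarity plays in.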

Microsoft has actually been part of the AdoptOpenJDK project along with a Who’s Who of other enterprise companies including Amazon, IBM, Pivotal, Red Hat and SAP.

Co-founder and CEO Martijn Verburg, writing in a company blog post announcing the deal, unsurprisingly spoke in glowing terms about the company he was about to become a part of. “Microsoft leads the world in backing developers and their communities, and after speaking to their engineering and programme leadership, it was a no brainer to enter formal discussions. With the passion and deep expertise of Microsoft’s people, we’ll be able to support the Java ecosystem better than ever before,” he wrote.

Verburg also took the time to thank the employees, customers and community who have supported the open source project on top of which his company was built. His new title will be Principal Engineering Group Manager (Java) at Microsoft.

It is unclear how the community will react to another flavor of Java being absorbed by another large vendor, or how the other big vendors involved in the project will feel about it, but regardless, jClarity is part of Microsoft now.


By Ron Miller

In spite of slowing growth, Microsoft has been flexing its cloud muscles

When Microsoft reported its FY19 Q4 earnings last week, the numbers were mostly positive, but as we pointed out, Azure revenue growth has slowed. Productivity and Business Processes, the segment that includes Office 365, has also mostly flattened out. But slowing growth is not always as bad as it may seem. In fact, it’s an inevitability that once you reach Microsoft’s level of market maturity, it gets harder to maintain large growth numbers.

Consider the timeline: AWS launched the first cloud infrastructure service, Amazon Elastic Compute Cloud, in August 2006. Microsoft came much later to the cloud, launching Azure in February 2010, yet other established companies remain in its market-share rearview mirror. What did Microsoft do differently to achieve this success that the companies chasing it — Google, IBM and Oracle — failed to do? It’s a key question.

Let’s look at some numbers

For starters, let’s look at the most recent numbers for Productivity & Business Processes this year. This category includes all of Microsoft’s commercial and consumer SaaS products, including Office 365 commercial and consumer, Dynamics 365, LinkedIn and others. The percentage growth started FY19 at 19% but ended at 14%.


When you look at just Office 365 commercial growth, it started the year at 36% and dropped to 31% by Q4.


By Ron Miller

Red Hat and Microsoft are cozying up some more with Azure Red Hat OpenShift

It won’t be long before Red Hat becomes part of IBM, the result of the $34 billion acquisition last year that is still making its way to completion. For now, Red Hat continues as a stand-alone company, and as if to flex its independence muscles, it announced its second agreement in two days with Microsoft Azure, Redmond’s public cloud infrastructure offering. This one involves running Red Hat OpenShift on Azure.

OpenShift is Red Hat’s Kubernetes offering. The thinking is that you can start with OpenShift in your data center, then as you begin to shift to the cloud, you can move to Azure Red Hat OpenShift — such a catchy name — without any fuss, because you keep the same management tools you are used to.

As Red Hat becomes part of IBM, and as it holds its final customer conference as an independent company, it sees that it’s more important than ever to maintain its sense of autonomy in the eyes of developers and operations customers. Paul Cormier, Red Hat’s executive vice president and president of products and technologies, certainly sees it that way. “I think [the partnership] is a testament to, even with moving to IBM at some point soon, that we are going to be separate and really keep our Switzerland status and give the same experience for developers and operators across anyone’s cloud,” he told TechCrunch.

It’s essential to see this announcement in the context of both IBM’s and Microsoft’s increasing focus on the hybrid cloud, and also in the continuing requirement for cloud companies to find ways to work together, even when it doesn’t always seem to make sense, because as Microsoft CEO Satya Nadella has said, customers will demand it. Red Hat has a big enterprise customer presence and so does Microsoft. If you put them together, it could be the beginning of a beautiful friendship.

Scott Guthrie, executive vice president for the cloud and AI group at Microsoft, understands that. “Microsoft and Red Hat share a common goal of empowering enterprises to create a hybrid cloud environment that meets their current and future business needs. Azure Red Hat OpenShift combines the enterprise leadership of Azure with the power of Red Hat OpenShift to simplify container management on Kubernetes and help customers innovate on their cloud journeys,” he said in a statement.

This news comes on the heels of yesterday’s announcement, also involving Kubernetes. TechCrunch’s own Frederic Lardinois described it this way:

What’s most interesting here, however, is KEDA, a new open-source collaboration between Red Hat and Microsoft that helps developers deploy serverless, event-driven containers. Kubernetes-based event-driven autoscaling, or KEDA, as the tool is called, allows users to build their own event-driven applications on top of Kubernetes. KEDA handles the triggers to respond to events that happen in other services and scales workloads as needed.
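For a flavor of what building on KEDA looks like, the sketch below creates a ScaledObject custom resource with the official Kubernetes Python client. The deployment name and RabbitMQ trigger are hypothetical, and the API group/version shown depends on which KEDA release is installed in the cluster (early releases used keda.k8s.io rather than keda.sh):

```python
from kubernetes import client, config

# Assumes KEDA is installed in the cluster and a local kubeconfig is available.
config.load_kube_config()
api = client.CustomObjectsApi()

scaled_object = {
    "apiVersion": "keda.sh/v1alpha1",
    "kind": "ScaledObject",
    "metadata": {"name": "queue-worker-scaler", "namespace": "default"},
    "spec": {
        # The deployment whose replica count KEDA should drive (hypothetical).
        "scaleTargetRef": {"name": "queue-worker"},
        "minReplicaCount": 0,   # scale to zero when there is no work
        "maxReplicaCount": 20,
        "triggers": [{
            "type": "rabbitmq",
            "metadata": {"queueName": "orders", "queueLength": "10"},
        }],
    },
}

api.create_namespaced_custom_object(
    group="keda.sh", version="v1alpha1",
    namespace="default", plural="scaledobjects",
    body=scaled_object,
)
```

Once the object is applied, KEDA watches the queue and scales the deployment between zero and twenty replicas based on its depth.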

Azure Red Hat OpenShift is available now on Azure. The companies are working on some other integrations, too, including Red Hat Enterprise Linux (RHEL) running on Azure and Red Hat Enterprise Linux 8 support in Microsoft SQL Server 2019.


By Ron Miller

Microsoft and GitHub grow closer

Microsoft’s $7.5 billion acquisition of GitHub closed last October. Today, at its annual Build developer conference, Microsoft announced a number of new integrations between its existing services and GitHub. None of these are earth-shattering or change the nature of any of GitHub’s fundamental features, but they do show how Microsoft is starting to bring GitHub closer into the fold.

It’s worth noting that Microsoft isn’t announcing any major GitHub features at Build, though it was only a few weeks ago that the company made a major change by giving GitHub Free users access to unlimited private repositories. For major feature releases, GitHub has its own conference anyway.

So what are the new integrations? Most of them center around identity management. That means GitHub Enterprise users can now use Azure Active Directory to access GitHub. Developers will also be able to use their existing GitHub accounts to log into Azure features like the Azure Portal and Azure DevOps. “This update enables GitHub developers to go from repository to deployment with just their GitHub account,” Microsoft argues in its release announcement.

As far as selling GitHub goes, Microsoft also today announced a new Visual Studio subscription with access to GitHub Enterprise for Microsoft’s Enterprise Agreement customers. Given that there is surely a lot of overlap between Visual Studio’s enterprise customers and GitHub Enterprise users, this move makes sense. Chances are, it’ll also make moving to GitHub Enterprise more enticing for current Visual Studio subscribers.

Lastly, the Azure Boards app, which offers features like Kanban boards and sprint planning tools, is now also available in the GitHub Marketplace.


By Frederic Lardinois

Google’s hybrid cloud platform is coming to AWS and Azure

Google’s Cloud Services Platform, its tool for managing hybrid clouds that span on-premise data centers and the Google cloud, is coming out of beta today. The company is also changing the product’s name to Anthos, a name that refers to either a lost Greek tragedy, an obscure god in the Marvel universe or rosemary. That by itself would be interesting but minor news. What makes this interesting is that Google also announced today that Anthos will run on third-party clouds as well, including AWS and Azure.

“We will support Anthos and AWS and Azure as well, so people get one way to manage their application and that one way works across their on-premise environments and all other clouds,” Google’s senior VP for its technical infrastructure, Urs Hölzle, explained in a press conference ahead of today’s announcement.

So with Anthos, Google will offer a single managed service that will let you manage and deploy workloads across clouds, all without having to worry about the different environments and APIs. That’s a big deal and one that clearly delineates Google’s approach from its competitors’. This is Google, after all, managing your applications for you on AWS and Azure.

“You can use one consistent approach — one open-source based approach — across all environments,” Hölzle said. “I can’t really stress how big a change that is in the industry, because this is really the stack for the next 20 years, meaning that it’s not really about the three different clouds that are all randomly different in small ways. This is the way that makes these three clouds — and actually on-premise environments, too — look the same.”

Anthos/Google Cloud Services Platform is based on the Google Kubernetes Engine, as well as other open source projects like the Istio service mesh. It’s also hardware agnostic, meaning that users can take their current hardware and run the service on top of that without having to immediately invest in new servers.

Why is Google doing this? “We hear from our customers that multi-cloud and hybrid is really an acute pain point,” Hölzle said. He noted that containers are the enabling technology for this but that few enterprises have developed a unifying strategy to manage these deployments and that it takes expertise in all major clouds to get the most out of them.

Enterprises already have major investments in their infrastructure and established relationships with their vendors, though, so it’s no surprise that Google is launching Anthos with over 30 major hardware and software partners, ranging from Cisco to Dell EMC, HPE and VMware, as well as application vendors like Confluent, Datastax, Elastic, Portworx, Tigera, Splunk, GitLab, MongoDB and others.

Anthos is a subscription-based service, with list prices starting at $10,000 per month per 100 vCPU block. Enterprise prices tend to be up for negotiation, though, so many customers will likely pay less.

It’s one thing to use a service like this for new applications, but many enterprises already have plenty of line-of-business tools that they would like to bring to the cloud as well. For them, Google is launching the first beta of Anthos Migrate today. This service will auto-migrate VMs from on-premises or other clouds into containers in the Google Kubernetes Engine. The promise here is that this is essentially an automatic process and once the container is on Google’s platform, you’ll be able to use all of the other features that come with the Anthos platform, too.

Google’s Hölzle noted that the emphasis here was on making this migration as easy as possible. “There’s no manual effort there,” he said.


By Frederic Lardinois

Microsoft gives 500 patents to startups

Microsoft today announced a major expansion of its Azure IP Advantage program, which provides its Azure users with protection against patent trolls. This program now also provides customers who are building IoT solutions that connect to Azure with access to 10,000 patents to defend themselves against intellectual property lawsuits.

What’s maybe most interesting here, though, is that Microsoft is also donating 500 patents to startups in the LOT Network. This organization, which counts companies like Amazon, Facebook, Google, Microsoft, Netflix, SAP, Epic Games, Ford, GM, Lyft and Uber among its well over 150 members, is designed to protect companies against patent trolls by giving them access to a wide library of patents from its member companies and other sources.

“The LOT Network is really committed to helping address the proliferation of intellectual property losses, especially ones that are brought by non-practicing entities, or so-called trolls,” Microsoft  CVP and Deputy General Counsel Erich Andersen told me. 

This new program goes well beyond basic protection from patent trolls, though. Qualified startups that join the LOT Network can acquire Microsoft patents as part of their free membership and, as Andersen stressed, the startups will own them outright. The LOT Network will be able to provide its startup members with up to three patents from this collection.

There’s one additional requirement here: to qualify for the patents, these startups also have to meet a $1,000 per month Azure spend. As Andersen told me, though, they don’t have to make any kind of forward pledge. The company will simply look at a startup’s last three monthly Azure bills.

“We want to help the LOT Network grow its network of startups,” Andersen said. “To provide an incentive, we are going to provide these patents to them.” He noted that startups are obviously interested in getting access to patents as a foundation of their companies, but also to raise capital and to defend themselves against trolls.

The patents we’re talking about here cover a wide range of technologies as well as geographies. Andersen noted that we’re talking about U.S. patents as well as European and Chinese patents, for example.

“The idea is that these startups come from a diverse set of industry sectors,” he said. “The hope we have is that when they approach LOT, they’ll find patents among those 500 that are going to be interesting to basically almost any company that might want a foundational set of patents for their business.”

As for the extended Azure IP Advantage program, it’s worth noting that every Azure customer who has spent more than $1,000 per month over the past three months and hasn’t filed a patent infringement lawsuit against another Azure customer in the last two years can automatically pick one of the patents in the program’s portfolio to protect itself against frivolous patent lawsuits from trolls (and that’s a different library of patents from the one Microsoft is donating to the LOT Network as part of the startup program).
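Pulling those qualification rules into one place, here is a toy sketch of the eligibility check; the rule itself comes from the announcement, but the function and field names are mine:

```python
from datetime import date, timedelta

def qualifies_for_patent_pick(monthly_azure_bills, last_suit_date, today=None):
    """Toy model of the Azure IP Advantage patent-pick rule: more than
    $1,000/month of Azure spend over the last three months, and no patent
    suit filed against another Azure customer in the past two years."""
    today = today or date.today()
    last_three = monthly_azure_bills[-3:]
    spend_ok = len(last_three) == 3 and all(bill > 1000 for bill in last_three)
    no_recent_suit = (last_suit_date is None
                      or today - last_suit_date > timedelta(days=2 * 365))
    return spend_ok and no_recent_suit

# Three qualifying bills, no suits on record: eligible.
print(qualifies_for_patent_pick([1500.0, 2200.0, 1800.0], None))  # True
```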

As Andersen noted, the team looked at how it could enhance the IP program by focusing on a number of specific areas. Microsoft is obviously investing a lot into IoT, so extending the program to this area makes sense. “What we’re basically saying is that if the customer is using IoT technology — regardless of whether it’s Microsoft technology or not — and it’s connected to Azure, then we’re going to provide this patent pick right to help customers defend themselves against patent suits,” Andersen said.

In addition, for those who do choose to use Microsoft IoT technology across the board, Microsoft will provide indemnification, too.

Patent trolls have lately started acquiring IoT patents, so chances are they are getting ready to make use of them, and we’ll see quite a bit of patent litigation in this space in the future. “The early signs we’re seeing indicate that this is something that customers are going to care about in the future,” said Andersen.


By Frederic Lardinois

Microsoft announces an Azure-powered Kinect camera for enterprise

Today’s Mobile World Congress kickoff event was all about the next HoloLens, but Microsoft still had some surprises up its sleeve. One of the more interesting additions is the Azure Kinect, a new enterprise camera system that leverages the company’s long-running 3D imaging technology to create a 3D camera for enterprises.

The device is actually a kind of companion hardware piece for HoloLens in the enterprise, giving businesses a way to capture depth-sensing data and leverage Azure solutions to collect and process it.

“Azure Kinect is an intelligent edge device that doesn’t just see and hear but understands the people, the environment, the objects and their actions,” Azure VP Julia White said at the kickoff of today’s event. “The level of accuracy you can achieve is unprecedented.”

What started as a gesture-based gaming peripheral for the Xbox 360 has since grown to be an incredibly useful tool across a variety of different fields, so it tracks that the company would seek to develop a product for business. And unlike some of the more far-off HoloLens applications, the Azure Kinect is the sort of product that could be instantly useful, right off the shelf.

A number of enterprise partners have already begun testing the technology, including Datamesh, Ocuvera and Ava, representing an interesting cross-section of companies. The system goes up for pre-order today, priced at $399. 


By Brian Heater

Microsoft Azure sets its sights on more analytics workloads

Enterprises now amass huge amounts of data, both from their own tools and applications, as well as from the SaaS applications they use. For a long time, that data was basically exhaust. Maybe it was stored for a while to fulfill some legal requirements, but then it was discarded. Now, data is what drives machine learning models, and the more data you have, the better. It’s maybe no surprise, then, that the big cloud vendors started investing in data warehouses and lakes early on. But that’s just a first step. After that, you also need the analytics tools to make all of this data useful.

Today, it’s Microsoft’s turn to shine the spotlight on its data analytics services. The actual news here is pretty straightforward. Two of these services are moving into general availability: the second generation of Azure Data Lake Storage, for big data analytics workloads, and Azure Data Explorer, a managed service that makes ad-hoc analysis of massive data volumes easier. Microsoft is also previewing a new feature in Azure Data Factory, its graphical no-code service for building data transformations: the ability to map data flows.
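For a sense of what working with Data Lake Storage Gen2 looks like from code, here is a minimal sketch using Microsoft’s azure-storage-file-datalake Python package; the account name, file system and credentials are placeholders:

```python
import os
from azure.storage.filedatalake import DataLakeServiceClient

# ADLS Gen2 accounts expose a "dfs" endpoint; the account name is a placeholder.
service = DataLakeServiceClient(
    account_url="https://myaccount.dfs.core.windows.net",
    credential=os.environ["AZURE_STORAGE_KEY"],
)

# File systems play the role that containers do in plain Blob storage.
fs = service.get_file_system_client("analytics")

# Upload a raw CSV. Structured and unstructured data land the same way, and
# engines like Spark or Hadoop can then read the files in place.
with open("events.csv", "rb") as data:
    fs.get_file_client("raw/events.csv").upload_data(data, overwrite=True)
```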

Those individual news pieces are interesting if you are a user or are considering Azure for your big data workloads, but what’s maybe more important here is that Microsoft is trying to offer a comprehensive set of tools for managing and storing this data — and then using it for building analytics and AI services.


“AI is a top priority for every company around the globe,” Julia White, Microsoft’s corporate VP for Azure, told me. “And as we are working with our customers on AI, it becomes clear that their analytics often aren’t good enough for building an AI platform.” These companies are generating plenty of data, which then has to be pulled into analytics systems. She stressed that she couldn’t remember a customer conversation in recent months that didn’t focus on AI. “There is urgency to get to the AI dream,” White said, but the growth and variety of data presents a major challenge for many enterprises. “They thought this was a technology that was separate from their core systems. Now it’s expected for both customer-facing and line-of-business applications.”

Data Lake Storage helps with managing this variety of data since it can handle both structured and unstructured data (and is optimized for the Spark and Hadoop analytics engines). The service can ingest any kind of data — yet Microsoft still promises that it will be very fast. “The world of analytics tended to be defined by having to decide upfront and then building rigid structures around it to get the performance you wanted,” explained White. Data Lake Storage, on the other hand, wants to offer the best of both worlds.

Likewise, White argued that while many enterprises used to keep these workloads on their own on-premises servers, many of those systems are still appliance-based. But she believes the cloud has now reached the point where the price/performance calculations are in its favor. It took a while to get to this point, though, and to convince enterprises. White noted that for the longest time, enterprises looked at analytics as $300 million projects that took forever, tied up lots of people and were frankly a bit scary. “But also, what we had to offer in the cloud hasn’t been amazing until some of the recent work,” she said. “We’ve been on a journey — as well as the other cloud vendors — and the price performance is now compelling.” And it sure helps that if enterprises want to meet their AI goals, they’ll now have to tackle these workloads, too.


By Frederic Lardinois

Microsoft acquires FSLogix to enhance Office 365 virtual desktop experience

Back in September, Microsoft announced a virtual desktop solution that lets customers run Office 365 and Windows 10 in the cloud. It mentioned several partners in the announcement who were working on solutions with it. One of those was FSLogix, a Georgia-based virtual desktop startup. Today, Microsoft announced it has acquired FSLogix. It did not share the purchase price.

“FSLogix is a next-generation app-provisioning platform that reduces the resources, time and labor required to support virtualization,” Brad Anderson, corporate VP for Microsoft Office 365, and Julia White, corporate VP for Microsoft Azure, wrote in a joint blog post today.

When Microsoft made the virtual desktop announcement in September, it named Citrix, CloudJumper, Lakeside Software, Liquidware, People Tech Group, ThinPrint and FSLogix as partners working on solutions. Apparently, the company decided it wanted to own one of those experiences and acquired FSLogix.

Microsoft believes that by incorporating the FSLogix solution, it will provide a better virtual desktop experience for its customers by enabling better performance and faster load times, especially for Office 365 ProPlus customers.

Randy Cook, founder and CTO at FSLogix, said the acquisition made sense given how well the two companies have worked together over the years. “From the beginning, in working closely with several teams at Microsoft, we recognized that our missions were completely aligned. Both FSLogix and Microsoft are dedicated to providing the absolute best experience for companies choosing to deploy virtual desktops,” Cook wrote in a blog post announcing the acquisition.

Lots of companies give employees what are essentially dumb terminals running just the tools each person needs, rather than fully functioning stand-alone PCs. Citrix has made a living offering these services. When employees come in to start the day, they sign in with their credentials and get a virtual desktop with the tools they need to do their jobs. Microsoft’s version of this involves Office 365 and Windows 10 running on Azure.

FSLogix was founded in 2013 and has raised over $10 million, according to data on Crunchbase. Today’s acquisition, which has already closed according to Microsoft, comes on the heels of last week’s announcement that Microsoft is buying Xoxco, an Austin-based developer shop with experience building conversational bots.


By Ron Miller

Chef launches deeper integration with Microsoft Azure

DevOps automation service Chef today announced a number of new integrations with Microsoft Azure. The news, which was announced at the Microsoft Ignite conference in Orlando, Florida, focuses on helping enterprises bring their legacy applications to Azure and ranges from the public preview of Chef Automate Managed Service for Azure to the integration of Chef’s InSpec compliance product with Microsoft’s cloud platform.

Chef Automate provides ops teams with a single tool for managing and monitoring their compliance and infrastructure configurations, and with it available as a managed service on Azure, developers can now easily deploy and manage Chef Automate and the Chef Server from the Azure Portal. It’s a fully managed service, and the company promises that businesses can get started with using it in as little as thirty minutes (though I’d take those numbers with a grain of salt).

When those configurations need to change, Chef users on Azure can also now use the Chef Workstation with Azure Cloud Shell, Azure’s command line interface. Workstation is one of Chef’s newest products and focuses on making ad-hoc configuration changes, no matter whether the node is managed by Chef or not.

And to remain in compliance, Chef is also launching an integration of its InSpec security and compliance tools with Azure. InSpec works hand in hand with Microsoft’s new Azure Policy Guest Configuration (who comes up with these names?) and allows users to automatically audit all of their applications on Azure.
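InSpec profiles themselves are written in a Ruby DSL, but driving a compliance scan from an automation pipeline can be as simple as shelling out to the CLI. Here is a rough sketch; the profile path is a placeholder, and the azure:// target assumes the inspec-azure resource pack and local Azure credentials are set up:

```python
import json
import subprocess

# Run an InSpec profile against an Azure subscription and capture JSON output.
result = subprocess.run(
    ["inspec", "exec", "./azure-baseline-profile",
     "-t", "azure://", "--reporter", "json"],
    capture_output=True, text=True,
)

# Walk the report and flag any control with a failed result.
report = json.loads(result.stdout)
for profile in report.get("profiles", []):
    for control in profile.get("controls", []):
        statuses = {r["status"] for r in control.get("results", [])}
        print(control["id"], "FAILED" if "failed" in statuses else "passed")
```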

“Chef gives companies the tools they need to confidently migrate to Microsoft Azure so users don’t just move their problems when migrating to the cloud, but have an understanding of the state of their assets before the migration occurs,” said Corey Scobie, the senior vice president of products and engineering at Chef, in today’s announcement. “Being able to detect and correct configuration and security issues to ensure success after migrations gives our customers the power to migrate at the right pace for their organization.”



By Frederic Lardinois

Microsoft wants to put your data in a box

AWS has its Snowball (and Snowmobile truck), Google Cloud has its data transfer appliance and Microsoft has its Azure Data Box. All of these are physical appliances that allow enterprises to ship lots of data to the cloud by uploading it into these machines and then shipping them to the cloud. Microsoft’s Azure Data Box launched into preview about a year ago and today, the company is announcing a number of updates and adding a few new boxes, too.

First of all, the standard 50-pound, 100-terabyte Data Box is now generally available. If you’ve got a lot of data to transfer to the cloud — or maybe collect a lot of offline data — then FedEx will happily pick this one up and Microsoft will upload the data to Azure and charge you for your storage allotment.

If you’ve got a lot more data, though, then Microsoft now also offers the Azure Data Box Heavy. This new box, which is now in preview, can hold up to one petabyte of data. Microsoft did not say how heavy the Data Box Heavy is, though.

Also new is the Azure Data Box Edge, which is now also in preview. In many ways, this is the most interesting of the additions since it goes well beyond transporting data. As the name implies, Data Box Edge is meant for edge deployments where a company collects data. What makes this version stand out is that it’s basically a small data center rack that lets you process data as it comes in. It even includes an FPGA to run AI algorithms at the edge.

Using this box, enterprises can collect the data, transform and analyze it on the box, and then send it to Azure over the network (and not in a truck). Using this, users can cut back on bandwidth cost and don’t have to send all of their data to the cloud for processing.
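Microsoft hasn’t detailed the Data Box Edge programming model, but the general pattern it enables (reduce the data locally, then ship only the results over the network) is easy to sketch with the standard azure-storage-blob Python SDK; the connection string, container and readings below are placeholders:

```python
import json
import os
import statistics
from azure.storage.blob import BlobServiceClient

def summarize_readings(readings):
    """Stand-in for the local processing step (e.g. an on-box model or
    filter): collapse raw sensor readings into a small summary record."""
    return {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "max": max(readings),
    }

raw = [21.3, 21.7, 35.9, 22.0, 21.8]  # imagine gigabytes of these per day
summary = summarize_readings(raw)

# Only the reduced result crosses the network to Azure.
service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"])
blob = service.get_blob_client(container="telemetry", blob="site-7/summary.json")
blob.upload_blob(json.dumps(summary), overwrite=True)
```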

Also part of the same Data Box family is the Data Box Gateway. This is a virtual appliance, however, that runs on Hyper-V and VMware and lets users create a data transfer gateway for importing data into Azure. That’s not quite as interesting as a hardware appliance, but useful nonetheless.



By Frederic Lardinois

Microsoft Azure gets new high-performance storage options

Microsoft Azure is getting a number of new storage options today that mostly focus on use cases where disk performance matters.

The first of these is Azure Ultra SSD Managed Disks, which are now in public preview. Microsoft says that these drives will offer “sub-millisecond latency,” which unsurprisingly makes them ideal for workloads where latency matters.

Earlier this year, Microsoft launched its Premium and Standard SSD Managed Disks offerings for Azure into preview. As far as we can tell, these ‘ultra’ SSDs represent the next tier up from the Premium SSDs with even lower latency and higher throughput.

Speaking of Standard SSD Managed Disks, that service is now generally available after only three months in preview. To top things off, all of Azure’s storage tiers (Premium and Standard SSD, as well as Standard HDD) now offer 8, 16 and 32 TB storage capacity.

Also new today is Azure Premium Files, which is now in preview. This, too, is an SSD-based service. Azure Files itself isn’t new, though. It offers users access to cloud storage using the standard SMB protocol. This new premium offering promises higher throughput and lower latency for these kinds of SMB operations.
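Because Azure Files speaks standard SMB, no special SDK is needed on a machine that has mounted the share; ordinary file APIs just work. A trivial sketch, assuming the share is already mounted at /mnt/azfiles on a Linux box:

```python
from pathlib import Path

# The share behaves like any local directory once mounted over SMB.
share = Path("/mnt/azfiles")

report = share / "reports" / "latency-test.txt"
report.parent.mkdir(parents=True, exist_ok=True)
report.write_text("Premium Files: same SMB semantics, lower latency.\n")

print(report.read_text())
```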



By Frederic Lardinois