Dell dumps another big asset, moving Boomi to Francisco Partners and TPG for $4B

It’s widely known that Dell has a debt problem left over from its massive acquisition of EMC in 2016, and it seems to be moving this year to eliminate part of it in multi-billion chunks. The first step was spinning out VMware as a separate company last month, a move expected to net close to $10 billion.

The second, long expected, finally dropped last night when the company announced it was selling Boomi to a couple of private equity firms for $4 billion. Francisco Partners is joining forces with TPG to make the deal to buy the integration platform.

Boomi is not unlike MuleSoft, a company that Salesforce purchased in 2018 for $6.5 billion, although a bit longer in the tooth. They both help companies with integration problems by creating connections between disparate systems. With so many pieces in place from various acquisitions over the years, Boomi seems like a highly useful asset for Dell to help pull those pieces together and make them work, but the need for cash is trumping that.

Providing integration services is a growing requirement as companies look for ways to make better use of data locked in siloed systems. Boomi could help and that’s one of the primary reasons for the acquisition, according to Francisco executives.

“The ability to integrate and connect data and workflows across any combination of applications or domains is a critical business capability, and we strongly believe that Boomi is well positioned to help companies of all sizes turn data into their most valuable asset,” Francisco CEO Dipanjan Deb and partner Brian Decker said in a statement.

As you would expect, Boomi’s CEO Chris McNabb put a positive spin on the deal, saying his new bosses would fuel growth for his company. “By partnering with two tier-one investment firms like Francisco Partners and TPG, we can accelerate our ability for our customers to use data to drive competitive advantage. In this next phase of growth, Boomi will be in a position of strength to further advance our innovation and market trajectory while delivering even more value to our customers,” McNabb said in a statement.

All of this may have some truth to it, but the company goes from being part of a large amorphous corporation to getting absorbed in the machinery of two private equity firms. What happens next is hard to say.

The company was founded in 2000, and sold to Dell in 2010. Today, it has 15,000 customers, but Dell’s debt has been well documented, and when you string together a couple of multi-billion-dollar deals as Dell has recently, pretty soon you’re talking real money. While the company has not explicitly stated it will use the proceeds of this deal to pay off debt, as it did with the VMware announcement, it stands to reason that this will be the case.

The deal is expected to close later this year, although it will have to pass the typical regulatory scrutiny prior to that.


By Ron Miller

Google Drive adds workflow integrations with DocuSign, K2 and Nintex

Google today announced a few new workflow integrations for its Drive file storage service that’ll bring support for some features from DocuSign and process automation platforms K2 and Nintex to the service.

None of these new integrations are all that unusual, but if you use a combination of Drive and the newly supported tools, they will undoubtedly make your daily work a little bit easier.

For DocuSign, the new integration lets you prepare, sign and store your documents right in Google Drive, as well as trigger actions like billing, account activation and payments after an agreement has been signed.

The K2 integration is a bit different and focuses on that company’s machine learning tools. It’ll allow users to train models on a workflow (using Google machine learning tools) and then, for example, determine whether a loan should be automatically approved or denied, with all of the information about those requests and the approval process stored in a Google Sheet. The integration also supports more pedestrian use cases, though, including the ability to make lots of documents in Drive more easily discoverable.

“K2 is committed to simplifying the way in which our customers connect and manage their information whether it resides on-premise or in the cloud,” said Eyal Inbar, Vice President of Global Technology Alliances at K2. “By integrating with Google Drive, we are able to put the next-generation of content management services in the hands of our customers so they can build and implement powerful workflows into their applications.”

Nintex’s solution seems to be a bit more specialized, with a focus on contract management lifecycles for HR, legal and sales use cases. There’s nothing exciting about managing contracts, but that’s probably a good thing and ideally, adding more automation will help to keep it that way.


By Frederic Lardinois

With MuleSoft in the fold, Salesforce gains access to data wherever it lives

When Salesforce bought MuleSoft last spring for the tidy sum of $6.5 billion, it looked like money well spent for the CRM giant. After all, it was providing a bridge between the cloud and the on-prem data center, and that was a huge missing link for a company with big ambitions like Salesforce.

When you want to rule the enterprise, you can’t be limited by where data lives and you need to be able to share information across disparate systems. Partly that’s a simple story of enterprise integration, but on another level it’s purely about data. Salesforce introduced its intelligence layer, dubbed Einstein, at Dreamforce in 2016.

With MuleSoft in the fold, it’s got access to data across systems wherever it lives, in the cloud or on-prem. Data is the fuel of artificial intelligence, and Salesforce has been trying desperately to get more data for Einstein since its inception.

It lost out on LinkedIn to Microsoft, which flexed its financial muscles and reeled in the business social network for $26.5 billion a couple of years ago. It’s undoubtedly a rich source of data that the company longed for. Next, it set its sights on Twitter, but after board and stockholder concerns, the company walked away (Twitter was ultimately never sold, of course).

Each of these forays was all about the data, and, frustrated, Salesforce went back to the drawing board. While MuleSoft did not supply the direct cache of data that a social network would have, it did provide a neat way for the company to get at back-end data sources, the very type of data that matters most to its enterprise customers.

Today, the company has extended that notion beyond pure data access to a graph. You can probably see where this is going. The idea of a graph — the connections between, say, a buyer and the things they tend to buy, or a person on a social network and the people they tend to interact with — can be extended even to the network/API level, and that is precisely the story that Salesforce is trying to tell this week at the Dreamforce customer conference in San Francisco.

Visualizing connections in a data integration network in Mulesoft. Screenshot: Salesforce/Mulesoft

Maureen Fleming, program vice president for integration and process automation research at IDC says that it is imperative that organizations view data as a strategic asset and act accordingly. “Very few companies are getting all the value from their data as they should be, as it is locked up in various applications and systems that aren’t designed to talk to each other. Companies who are truly digitally capable will be able to connect these disparate data sources, pull critical business-level data from these connections, and make informed business decisions in a way that delivers competitive advantage,” Fleming explained in a statement.

Configuring data connections on Mulesoft Anypoint Platform. Gif: Salesforce/Mulesoft

It’s hard to overestimate the value of this type of data to Salesforce, which has already put MuleSoft to work internally to help build the new Customer 360 product announced today. It can point to how it’s providing this very type of data integration to which Fleming is referring on its own product set.

Bret Taylor, president and chief product officer at Salesforce, says that for his company all of this is ultimately about enhancing the customer experience. You need to be able to stitch together these different computing environments and data silos to make that happen.

“In the short term, [customer] infrastructure is often fragmented. They often have some legacy applications on premise, they’ll have some cloud applications like Salesforce, but some infrastructure on Amazon or Google and Azure, and to actually transform the customer experience, they need to bring all this data together. And so it’s really a unique time for integration technologies like MuleSoft, because it enables you to create a seamless customer experience, no matter where that data lives, and that means you don’t need to wait for infrastructure to be perfect before you can transform your customer experience.”


By Ron Miller

Salesforce introduces Integration Cloud on heels of MuleSoft acquisition

Salesforce hasn’t wasted any time turning the MuleSoft acquisition into a product of its own, announcing the Salesforce Integration Cloud this morning.

While in reality it’s too soon to really take advantage of the MuleSoft product set, the company is laying the groundwork for the eventual integration into the Salesforce family with this announcement, which really showcases why Salesforce was so interested that it was willing to fork over $6.5 billion.

The company has decided to put its shiny new bauble front and center in the Integration Cloud announcement, so that when MuleSoft is in the fold, it will have a place to hit the ground running.

The Integration Cloud itself consists of three broad pieces: the Integration Platform, which will eventually be based on MuleSoft; Integration Builder, a tool that lets you bring together a complete picture of a customer from Salesforce tools, as well as across other enterprise data repositories; and finally Integration Experiences, which is designed to help brands build customized experiences based on all the information you’ve learned from the other tools.

For now, it involves a few pieces that are independent of MuleSoft including a workflow tool called Lightning Flow, a new service that is designed to let Salesforce customers build workflows using the customer data in Salesforce CRM.

It also includes a dash of Einstein, Salesforce’s catch-all brand for the intelligence layer that underlies the platform, to build Einstein intelligence into any app.

Salesforce also threw in some Trailhead education components to help customers understand how to best make use of these tools.

But make no mistake, this is a typical Salesforce launch. It is probably earlier than it should be, but it puts the idea of integration out there in the minds of its customers and lays a foundation for a much deeper set of products and services down the road when MuleSoft is more fully integrated into the Salesforce toolset.

For now, it’s important to understand that this deal is about using data to fuel the various pieces of the Salesforce platform and provide the Einstein intelligence layer with information from across the enterprise wherever it happens to live, whether that’s in Salesforce, another cloud application or some on-prem legacy systems.

This should sound familiar to folks attending the Adobe Summit this week in Las Vegas, since it’s eerily similar to what Adobe announced on stage yesterday at the Summit keynote. Adobe is calling it a customer experience system of record, but the end game is pretty much the same: bringing together data about a customer from a variety of sources, building a single view of that customer, and then turning that insight into a customized experience.

That they chose to make this announcement during the Adobe Summit, where Adobe has announced some data integration components of its own, could be a coincidence, but probably not.

GitLab adds support for GitHub

Here is an interesting twist: GitLab, which in many ways competes with GitHub as a shared code repository service for teams, is bringing its continuous integration and delivery (CI/CD) features to GitHub.

The new service is launching today as part of GitLab’s hosted service. It will remain free to developers until March 22, 2019. After that, it’s moving to GitLab.com’s paid Silver tier.

GitHub itself offers some basic project and task management services on top of its core tools, but for the most part, it leaves the rest of the DevOps lifecycle to partners. GitLab offers a more complete CI/CD solution with integrated code repositories, but while GitLab has grown in popularity, GitHub is surely better known among developers and businesses. With this move, GitLab hopes to gain new users — and especially enterprise users — who are currently storing their code on GitHub but are looking for a CI/CD solution.

The new GitHub integration allows developers to set up their projects in GitLab and connect them to a GitHub repository. So whenever developers push code to their GitHub repository, GitLab will kick off that project’s CI/CD pipeline with automated builds, tests and deployments.
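For a sense of what that pipeline looks like on the GitLab side, CI/CD jobs are defined in a `.gitlab-ci.yml` file checked into the repository. The following is a minimal sketch of such a file for a hypothetical project — the `make` targets and `deploy.sh` script are placeholders, not anything specific to this integration:

```yaml
# Sketch of a .gitlab-ci.yml defining a three-stage pipeline.
# With the GitHub integration, a push to the connected GitHub
# repository triggers these jobs in order on GitLab.
stages:
  - build
  - test
  - deploy

build_job:
  stage: build
  script:
    - make build        # compile the project (placeholder target)

test_job:
  stage: test
  script:
    - make test         # run the automated test suite (placeholder target)

deploy_job:
  stage: deploy
  script:
    - ./deploy.sh       # ship the build (placeholder script)
  only:
    - master            # deploy only from the default branch
```

Jobs in the same stage run in parallel, and each stage starts only after the previous one succeeds, which is how the automated build-test-deploy sequence described above is enforced.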

“Continuous integration and deployment form the backbone of modern DevOps,” said Sid Sijbrandij, CEO and co-founder of GitLab. “With this new offering, businesses and open source projects that use GitHub as a code repository will have access to GitLab’s industry leading CI/CD capabilities.”

It’s worth noting that GitLab offers a very similar integration with Atlassian’s Bitbucket, too.