Health APIs usher in the patient revolution we have been waiting for

If you’ve ever been stuck using a health provider’s clunky online patient portal or had to make multiple calls to transfer medical records, you know how difficult it is to access your health data.

In an era when control over personal data is more important than ever before, the healthcare industry has notably lagged behind — but that’s about to change. This past month, the U.S. Department of Health and Human Services (HHS) published two final rules around patient data access and interoperability that will require providers and payers to create APIs that can be used by third-party applications to let patients access their health data.

This means you will soon have consumer apps that will plug into your clinic’s health records and make them viewable to you on your smartphone.
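The rules standardize that access on HL7's FHIR standard (the final rules specify FHIR R4). As a minimal sketch of what a third-party app's read request could look like — the endpoint, patient ID, and token here are all hypothetical — fetching a patient record is just an authorized HTTP GET:

```python
import urllib.request

# Hypothetical FHIR endpoint; a real app would discover the provider's base URL.
FHIR_BASE = "https://fhir.example-clinic.com/r4"

def build_patient_request(patient_id: str, access_token: str) -> urllib.request.Request:
    """Build an authorized FHIR R4 read of a Patient resource."""
    url = f"{FHIR_BASE}/Patient/{patient_id}"
    return urllib.request.Request(url, headers={
        "Authorization": f"Bearer {access_token}",  # OAuth2 token from the patient's login
        "Accept": "application/fhir+json",
    })

# urllib.request.urlopen(build_patient_request("12345", token)) would return
# the patient's demographics as FHIR JSON.
```

In practice the app would first obtain the access token through the SMART on FHIR OAuth2 flow, with the patient authorizing access through their provider's login.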

Critics of the new rulings have voiced privacy concerns over patient health data leaving internal electronic health record (EHR) systems and being surfaced to the front lines of smartphone apps. Vendors such as Epic and many health providers have publicly opposed the HHS rulings, while others, such as Cerner, have been supportive.

While that debate has been heated, the new HHS rulings represent a final decision that follows initial rules proposed a year ago. It’s a multi-year win for advocates of greater data access and control by patients.

The scope of what this could lead to — more control over your health records, and apps on top of them — is immense. Apple has been making progress with its Health Records app for some time now, and other technology companies, including Microsoft and Amazon, have undertaken healthcare initiatives with both new apps and cloud services.

It’s not just big tech that is getting in on the action: startups are emerging as well, such as Commure and Particle Health, which help developers work with patient health data. The unlocking of patient health data could be as influential as the unlocking of banking data by Plaid, which powered the growth of multiple fintech startups, including Robinhood, Venmo and Betterment.

What’s clear is that the HHS rulings are here to stay. In fact, many of the provisions require providers and payers to provide partial data access within the next 6-12 months. With this new market opening up, though, it’s time for more health entrepreneurs to take a deeper look at what patient data may offer in terms of clinical and consumer innovation.

The incredible complexity of today’s patient data systems


By Walter Thompson

Salesforce announces new tools to boost developer experience on Commerce Cloud

Salesforce announced some new developer tools today, designed to make it easier for programmers to build applications on top of Commerce Cloud in what is known in industry parlance as a “headless” system.

What that means is that developers can separate the content from the design and management of the site, allowing companies to change either component independently.

To help with this goal, Salesforce announced some new and enhanced APIs that enable developers to take advantage of features built into the Commerce Cloud platform without having to build them from scratch. For instance, they could take advantage of Einstein, Salesforce’s artificial intelligence platform, to add elements like next-best actions to the site, the kind of intelligent functionality that would typically be out of reach of most developers.

Developers also often need to connect to other enterprise systems from their eCommerce site to share data with these tools. To fill that need, Salesforce is taking advantage of Mulesoft, the company it purchased almost two years ago for $6.5 billion. Using Mulesoft’s integration technology, Salesforce can help connect to other systems like ERP financial systems or product management tools and exchange information between the two systems.

Brent Leary, founder at CRM Essentials, whose experience with Salesforce goes back to its earliest days, says this is about giving developers the tools they need to create the same kind of integrated shopping experiences consumers have grown to expect from Amazon.

“These tools give developers real-time insights delivered at the ‘moment of truth’ to optimize conversion opportunities, and automate processes to improve ordering and fulfillment efficiencies. This should give developers in the Salesforce ecosystem what they need to deliver Amazon-like experiences while having to compete with them,” he said.

To help get customers comfortable with these tools, the company also announced a new Commerce Cloud Development Center, which gives developers access to a community where they can discuss and share solutions with one another, an SDK with code samples, and Trailhead education resources.

Salesforce made these announcements as part of the National Retail Federation (NRF) conference taking place in New York City this week.


By Ron Miller

Battlefield vets StrongSalt (formerly OverNest) announces $3M seed round

StrongSalt, then known as OverNest, appeared at the TechCrunch Disrupt NYC Battlefield in 2016 and announced a product for searching encrypted code, a capability that remains unusual to this day. Today, the company announced a $3 million seed round led by Valley Capital Partners.

StrongSalt founder and CEO Ed Yu says encryption remains a difficult proposition, and that when you look at the majority of breaches, encryption wasn’t used. He said his company wants to simplify adding encryption to applications, and came up with a new service to let developers add encryption in the form of an API. “We decided to come up with what we call an API platform. It’s like infrastructure that allows you to integrate our solution into any existing or any new applications,” he said.

The company’s original idea was to create a product to search encrypted code, but Yu says the tech has much more utility as an API that’s applicable across applications, and that’s why they decided to package it as a service. It’s not unlike Twilio for communications or Stripe for payments, except in this case you can build in searchable encryption.

The searchable part is actually a pretty big deal because, as Yu points out, when you encrypt data it is no longer searchable. “If you encrypt all your data, you cannot search within it, and if you cannot search within it, you cannot find the data you’re looking for, and obviously you can’t really use the data. So we actually solved that problem,” he said.
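StrongSalt hasn’t published the internals of its scheme, but one common way to make encrypted data searchable — shown here purely as a toy illustration, not StrongSalt’s actual design — is a “blind index”: store a keyed hash of each keyword next to the ciphertext, and search by hashing the query with the same key:

```python
import hashlib
import hmac

def blind_index(index_key: bytes, keyword: str) -> str:
    """Deterministic keyed hash of a keyword; reveals nothing without the key."""
    return hmac.new(index_key, keyword.lower().encode(), hashlib.sha256).hexdigest()

class SearchableStore:
    """Toy store of (ciphertext, blind-index tags) pairs."""

    def __init__(self, index_key: bytes):
        self.index_key = index_key
        self.records: list[tuple[bytes, set[str]]] = []

    def put(self, ciphertext: bytes, keywords: list[str]) -> None:
        tags = {blind_index(self.index_key, w) for w in keywords}
        self.records.append((ciphertext, tags))

    def search(self, keyword: str) -> list[bytes]:
        tag = blind_index(self.index_key, keyword)
        return [ct for ct, tags in self.records if tag in tags]
```

The server storing the records never sees plaintext keywords, only HMAC tags that are meaningless without the index key, yet exact-match search still works.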

Developers can add searchable encryption as part of their applications. For customers already using a commercial product, the company’s API actually integrates with popular services, enabling customers to encrypt the data stored there, while keeping it searchable.

“We will offer a storage API on top of Box, AWS S3, Google cloud, Azure — depending on what the customer has or wants. If the customer already has AWS S3 storage, for example, then when they use our API, and after encrypting the data, it will be stored in their AWS repository,” Yu explained.

For those companies who don’t have a storage service, the company is offering one. What’s more, they are using the blockchain to provide a mechanism for the sharing, auditing and managing encrypted data. “We also use the blockchain for sharing data by recording the authorization by the sender, so the receiver can retrieve the information needed to reconstruct the keys in order to retrieve the data. This simplifies key management in the case of sharing and ensures auditability and revocability of the sharing by the sender,” Yu said.

If you’re wondering how the company has been surviving since 2016 while only announcing its seed round today, it had a couple of small seed rounds prior to this, plus a contract with the U.S. Department of Defense, which replaced the need for substantial earlier funding.

“The DOD was looking for a solution to have secure communication between computers, and they needed to have a way to securely store data, and so we were providing a solution for them,” he said. In fact, this work was what led them to build the commercial API platform they are offering today.

The company, which was founded in 2015, currently has 12 employees spread across the globe.


By Ron Miller

APIs are the next big SaaS wave

While the software revolution started out slowly, over the past few years it has exploded, and the fastest-growing segment to date has been the shift toward software as a service, or SaaS.

SaaS has dramatically lowered the intrinsic total cost of ownership for adopting software, solved scaling challenges and taken away the burden of issues with local hardware. In short, it has allowed a business to focus primarily on just that — its business — while simultaneously reducing the burden of IT operations.

Today, SaaS adoption is increasingly ubiquitous. According to IDG’s 2018 Cloud Computing Survey, 73% of organizations have at least one application or a portion of their computing infrastructure already in the cloud. While this software explosion has created a whole range of downstream impacts, it has also caused software developers to become more and more valuable.

The increasing value of developers has meant that, like traditional SaaS buyers before them, they better intuit the value of their time and increasingly prefer businesses that can help alleviate the hassles of procurement, integration, management and operations. The needs developers have around those hassles, however, are specialized.

They are looking to deeply integrate products into their own applications and to do so, they need access to an Application Programming Interface, or API. Best practices for API onboarding include technical documentation, examples, and sandbox environments to test.

APIs tend to also offer metered billing upfront. For these and other reasons, APIs are a distinct subset of SaaS.

For fast-moving developers building at global scale, APIs are no longer a stop-gap to the future—they’re a critical part of their strategy. Why would you dedicate precious resources to recreating something in-house that’s done better elsewhere when you can instead focus your efforts on creating a differentiated product?

Thanks to this mindset shift, APIs are on track to create another SaaS-sized impact across all industries and at a much faster pace. By exposing often complex services as simplified code, API-first products are far more extensible, easier for customers to integrate into, and have the ability to foster a greater community around potential use cases.


Graphics courtesy of Accel

Billion-dollar businesses building APIs

Whether you realize it or not, chances are that your favorite consumer and enterprise apps—Uber, Airbnb, PayPal, and countless more—have a number of third-party APIs and developer services running in the background. Just like most modern enterprises have invested in SaaS technologies for all the above reasons, many of today’s multi-billion dollar companies have built their businesses on the backs of these scalable developer services that let them abstract everything from SMS and email to payments, location-based data, search and more.

Simultaneously, the entrepreneurs behind these API-first companies like Twilio, Segment, Scale and many others are building sustainable, independent—and big—businesses.

Valued today at over $22 billion, Stripe is the biggest independent API-first company. Stripe took off because of its initial laser focus on the developer experience of setting up and taking payments. It was even initially known as /dev/payments!

Stripe spent extra time building the right, idiomatic SDKs for each language platform, along with beautiful documentation. But it wasn’t just those things: the company rebuilt an entire business process around being API-first.

Companies using Stripe didn’t need to fill out a PDF and set up a separate merchant account before getting started. Once sign-up was complete, users could immediately test the API with a sandbox and integrate it directly into their application. Even pricing was different.

Stripe chose to simplify pricing dramatically by starting with a single, simple price for all cards and not breaking out cards by type even though the costs for AmEx cards versus Visa can differ. Stripe also did away with a monthly minimum fee that competitors had.
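The arithmetic of flat pricing is part of the appeal: it is trivial to reason about. Using 2.9% + 30¢ — the flat per-card rate Stripe has long advertised in the U.S., taken here only as illustrative numbers — the fee works out the same for every card brand:

```python
def flat_card_fee(amount_cents: int, pct: float = 0.029, fixed_cents: int = 30) -> int:
    """Flat per-transaction fee: the same formula whether the card is Visa or AmEx."""
    return round(amount_cents * pct) + fixed_cents

# A $100.00 charge incurs 290 + 30 = 320 cents in fees, regardless of card type.
```

Compare that with brand-by-brand interchange tables, where a merchant can’t predict the fee on a sale without knowing which card the buyer will present.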

Many competitors used the monthly minimum to offset the high cost of supporting new customers who weren’t necessarily processing payments yet. Stripe flipped that on its head. Developers now integrate Stripe earlier than they would have integrated payments before, and while that costs Stripe a lot in setup and support, it pays off in brand and loyalty.

Checkr is another excellent example of an API-first company vastly simplifying a massive yet slow-moving industry. Very little had changed over the last few decades in how businesses ran background checks on their employees and contractors: a process involving manual paperwork and the help of third-party services that spent days verifying an individual.

Checkr’s API gives companies immediate access to a variety of disparate verification sources and allows these companies to plug Checkr into their existing onboarding and HR workflows. It’s used today by more than 10,000 businesses, including Uber, Instacart, Zenefits and more.

Like Checkr and Stripe, Plaid provides a similar value prop to applications in need of banking data and connections, abstracting away banking relationships and complexities brought upon by a lack of tech in a category dominated by hundred-year-old banks. Plaid has shown an incredible ramp these past three years, from closing a $12 million Series A in 2015 to reaching a valuation over $2.5 billion this year.

Today the company is fueling an entire generation of financial applications, all on the back of their well-built API.
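Plaid’s core flow, per its public API docs, trades the short-lived public_token produced by its Link front end for a long-lived access_token an app uses for all later data requests. A simplified sketch of building that exchange call (sandbox host; no network call is made here, and the credentials are placeholders):

```python
import json

PLAID_HOST = "https://sandbox.plaid.com"  # Plaid's test environment

def exchange_request(client_id: str, secret: str, public_token: str) -> tuple[str, bytes]:
    """Build the POST for /item/public_token/exchange, whose JSON response
    carries the access_token used to query accounts and transactions."""
    url = f"{PLAID_HOST}/item/public_token/exchange"
    body = json.dumps({
        "client_id": client_id,
        "secret": secret,
        "public_token": public_token,
    }).encode()
    return url, body
```

From the app developer’s perspective, that one exchange replaces maintaining direct integrations with thousands of individual banks.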


Graphics courtesy of Accel

Then and now

Accel’s first API investment was in Braintree, a mobile and web payments system for e-commerce companies, in 2011. Braintree eventually sold to, and became an integral part of, PayPal as it spun out from eBay and grew to be worth more than $100 billion. Unsurprisingly, it was shortly thereafter that our team decided it was time to go big on the category. By the end of 2014 we had led the Series As in Segment and Checkr and followed those investments with our first APX conference in 2015.

At the time, Plaid, Segment, Auth0 and Checkr had only raised seed or Series A financings! Today, we are even more excited and bullish on the space. To convey just how much API-first businesses have grown in such a short period of time, we thought it would be useful to share some metrics from the past five years, which we’ve broken out in the visuals included in this article.

While SaaS may have pioneered the idea that the best way to do business isn’t to build everything in-house, today we’re seeing APIs amplify this theme. At Accel, we firmly believe that APIs are the next big SaaS wave — having as much impact as their predecessor, if not more, thanks to developers at today’s fastest-growing startups and their preference for API-first products. We’ve actively continued to invest in the space (in companies like Scale, mentioned above).

And much like how a robust ecosystem developed around SaaS, we believe that one will continue to develop around APIs. Given the amount of progress that has happened in just a few short years, Accel is hosting our second APX conference to once again bring together this remarkable community and continue to facilitate discussion and innovation.


Graphics courtesy of Accel


By Arman Tabatabai

ReadMe scores $9M Series A to help firms customize API docs

Software APIs help different tools communicate with one another, let developers access essential services without having to build them themselves, and are critical components for driving a platform strategy. Yet they require solid documentation for people to make the best use of them. ReadMe, a startup that helps companies customize their API documentation, announced a $9 million Series A today led by Accel with help from Y Combinator. The company was part of the Y Combinator Winter 2015 cohort.

Prior to today’s funding announcement, the company had taken just a $1.2 million seed round in 2014. Today, it reports 3,000 paying customers and says it has been profitable for the last several years, an unusual position for a startup. In spite of this success, co-founder and CEO Gregory Koberger said that as the company has taken on larger customers, they have brought more sophisticated requirements, which prompted the company to take this round of funding.

In addition, it has expanded the platform to use a company’s API logs to help create more dynamic documentation and improve customer support scenarios. But by taking on data from other companies, it needs to make sure that data is secure, and today’s funding will help in that regard.

“We’re going to still build the company traditionally by hiring more engineers, more support people, more designers, the obvious stuff, but the main impetus for doing this was that we started working with bigger companies with more secure data. So a lot of the money is going to help make sure that we handle that right,” Koberger explained.


Image: ReadMe

He says this ability to make use of the API logs has opened up all kinds of possibilities for the company, as the data provides a valuable window into how people use the APIs. “It’s amazing how much you get by just actually seeing what the server sees. When people are having problems with an API, they can debug it themselves because they can actually see the problems, and the support team can see it as well,” Koberger said.
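ReadMe hasn’t detailed its log pipeline, but the kind of rollup that makes “seeing what the server sees” useful is straightforward to picture. A sketch, with a made-up log-entry shape, that groups requests by endpoint and counts status codes so a spike of errors on one endpoint stands out:

```python
from collections import Counter

def summarize_logs(entries: list[dict]) -> dict[str, Counter]:
    """Group API request logs by endpoint and tally response status codes."""
    summary: dict[str, Counter] = {}
    for entry in entries:
        summary.setdefault(entry["endpoint"], Counter())[entry["status"]] += 1
    return summary
```

A developer debugging their own integration could scan such a summary for the 401s or 422s their calls are producing, which is exactly the self-service debugging Koberger describes.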

Accel’s Dan Levine, whose firm is leading the investment, believes that having good documentation is the difference between making and breaking an API. “APIs don’t just create technical integration, they create ecosystems around core services and underpin corporate partnerships that generate billions of dollars. ReadMe is as much a strategy as it is a service for businesses. Providing clean, interactive, data-driven API documentation to make developers love working with you can be the difference between 100 partnerships or 1,000 partnerships,” Levine said.

ReadMe was founded in 2014. It has 22 employees in its San Francisco office, a number that should increase with today’s funding.


By Ron Miller

Postman raises $50 million to grow its API development platform

Postman, a five-year-old startup that is attempting to simplify the development, testing and management of APIs through its platform, has raised $50 million in a new round to scale its business.

The Series B for the startup, which began its journey in India, was led by CRV and included participation from existing investor Nexus Venture Partners. The startup, with offices in India and San Francisco, closed its Series A financing round four years ago and has raised $57 million to date.

Postman offers a development environment that a developer or a firm can use to build, publish, document, design, monitor, test and debug their APIs. Postman, like some other startups such as RapidAPI, also maintains a marketplace offering APIs for quick integration with other popular services.

The startup was co-founded by Abhinav Asthana, a former intern at Yahoo. Asthana was frustrated with how APIs were an afterthought for many developers, who usually got around to building them at the eleventh hour. Additionally, developers were relying on their own workflows, and there was no organized platform that could be used by many, he explained in an interview with TechCrunch.

Even big software firms have not looked into this space yet, and many have instead become a customer of Postman. “We are solving a fundamental problem for the technology landscape. Big companies tend to be slower as they have many other things on their plate,” said Asthana.

Five years later, Postman has grown significantly. More than 7 million users and 300,000 companies, including Microsoft, Twitter, Best Buy, AMC Theatres, PayPal, Shopify, BigCommerce and DocuSign, use Postman’s platform today.

Modern software development relies heavily on APIs as more businesses begin to talk with one another. According to research firm Gartner, more than 65% of global infrastructure service providers’ revenue will be generated through services enabled by APIs by 2023, up from 15% in 2018.

Asthana said Postman intends to use the fresh capital to scale the business and its products and to grow its team. “We are scaling rapidly across all dimensions. There are many use cases that we still want to address over the coming months. We will also experiment with sales and invest in improving user experience,” he added.

Postman offers some of its services in limited capacity for free to users. For the rest, it charges between $8 and $18 per user; that’s how the company generates revenue. Asthana declined to share the startup’s financial performance, but said its customer base was “growing phenomenally.”

Postman said CRV General Partner Devdutt Yellurkar has joined its board of directors.


By Manish Singh

Apigee jumps on hybrid bandwagon with new API for hybrid environments

This year at Google Cloud Next, the theme is all about supporting hybrid environments, so it shouldn’t come as a surprise that Apigee, the API company it bought in 2016 for $265 million, is also getting into the act. Today, Apigee announced the Beta of Apigee Hybrid, a new product designed for hybrid environments.

Amit Zavery, who recently joined Google Cloud after many years at Oracle, and Nandan Sridhar describe the new product in a joint blog post as “a new deployment option for the Apigee API management platform that lets you host your runtime anywhere—in your data center or the public cloud of your choice.”

As with Anthos, the company’s approach to hybrid management announced earlier today, the idea is to have a single way to manage your APIs no matter where you choose to run them.

“With Apigee hybrid, you get a single, full-featured API management solution across all your environments, while giving you control over your APIs and the data they expose and ensuring a unified strategy across all APIs in your enterprise,” Zavery and Sridhar wrote in the blog post announcing the new approach.

The announcement is part of an overall strategy by the company to support a customer’s approach to computing across a range of environments, often referred to as hybrid cloud. In the cloud-native world, the idea is to present a single fabric to manage your deployments, regardless of location.

This appears to be an extension of that idea, which makes sense given that Google was the first company to develop and open-source Kubernetes, which is at the forefront of containerization and cloud-native computing. While this isn’t pure cloud-native computing, it is keeping true to that ethos, and it fits the scope of Google Cloud’s approach to computing in general, especially as it is being defined at this year’s conference.


By Ron Miller

Mailgun changes hands again as Thoma Bravo buys majority stake

Mailgun, an email API delivery service, announced today that it was selling a majority stake in the company to private equity firm Thoma Bravo. The companies did not share terms, but this is the second time ownership has changed hands in the company’s 8+ year history.

Mailgun provides API services for building email functionality into applications. It has over 150,000 customers today using its APIs, according to data provided by the company.
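That email functionality boils down to a single HTTP call against Mailgun’s public v3 API: a form-encoded POST authenticated with basic auth (username `api`, password the API key). A sketch that builds, but does not send, such a request — the domain and key below are placeholders:

```python
import base64
import urllib.parse
import urllib.request

def build_send_request(domain: str, api_key: str, sender: str, to: str,
                       subject: str, text: str) -> urllib.request.Request:
    """Build a Mailgun v3 send-message POST (form-encoded, basic auth as user 'api')."""
    url = f"https://api.mailgun.net/v3/{domain}/messages"
    data = urllib.parse.urlencode({
        "from": sender,
        "to": to,
        "subject": subject,
        "text": text,
    }).encode()
    auth = base64.b64encode(f"api:{api_key}".encode()).decode()
    return urllib.request.Request(url, data=data,
                                  headers={"Authorization": f"Basic {auth}"})

# urllib.request.urlopen(req) on this request would queue the message for delivery.
```

The abstraction is the product: the caller never touches SMTP, bounce handling or reputation management.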

In a blog post announcing the investment, CEO William Conway said the new money should help the company expand its capabilities and accelerate the product roadmap, a common refrain from companies about to be acquired.

“We will be investing millions in the development of products you can use to enhance your deliverability, gain more insights into your emails and deliver an unparalleled experience for your customers. We’re also doubling down on customer success and enablement to ensure our customers have exactly what they need to scale their communications,” Conway wrote in the blog post.

The company, which was founded in 2010 and was part of the Y Combinator Winter 2011 cohort, has had a complex history. Rackspace acquired it in 2012 and held onto it until 2017, when it spun out as a private company. At that point, Turn/River, another private equity firm, invested $50 million in the company. After today’s deal, Turn/River will maintain a minority ownership stake in Mailgun.

Mailgun typically competes with companies like MailChimp and SendGrid. Thoma Bravo has a history of buying enterprise software companies. Most recently, it bought a majority stake in enterprise software company Apttus. It also has investments in SolarWinds, SailPoint and Blue Point Systems.

Thoma Bravo did not respond to a request for comment before publishing.


By Ron Miller

Salesforce Commerce Cloud updates keep us shopping with AI-fueled APIs

As people increasingly use their mobile phones and other devices to shop, it has become imperative for vendors to improve the shopping experience, making it as simple as possible, given the small footprint. One way to do that is using artificial intelligence. Today, Salesforce announced some AI-enhanced APIs designed to keep us engaged as shoppers.

For starters, the company wants to keep you shopping. That means providing an intelligent recommendation engine. If you searched for a particular jacket, you might like these similar styles, or this scarf and gloves. That’s fairly basic as shopping experiences go, but Salesforce didn’t stop there. It’s letting developers embed this ability to recommend products in any app whether that’s maps, social or mobile.
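Salesforce’s engine runs on its proprietary Einstein platform, but the underlying idea — recommend what tends to co-occur with items a shopper has shown interest in — can be sketched with a toy co-occurrence counter (all names and data here are illustrative):

```python
from collections import Counter

def recommend(history: list[str], baskets: list[list[str]], k: int = 3) -> list[str]:
    """Score items by how often they appeared in past baskets alongside
    anything in the shopper's history; return the top k."""
    scores: Counter = Counter()
    for basket in baskets:
        if set(history) & set(basket):       # basket shares an item with the history
            for item in basket:
                if item not in history:      # don't recommend what they already have
                    scores[item] += 1
    return [item for item, _ in scores.most_common(k)]
```

A production engine layers on embeddings, recency and business rules, but the ranking intuition — jackets travel with scarves more often than with gloves — is the same.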

That means shopping recommendations could pop up anywhere developers think it makes sense, like in your maps app. Whether or not consumers see this as a positive, Salesforce says that when you add intelligence to the shopping experience, it increases sales anywhere from 7 to 16 percent, so however you feel about it, it seems to be working.

The company also wants to make it simple to shop. Instead of entering a long faceted search — footwear, men’s, sneakers, red — you can take a picture of a sneaker (or anything you like), and the visual search algorithm should recognize it and make recommendations based on that picture. That reduces data entry for users, which is typically a pain on a mobile device, even when simplified by checkboxes.

Salesforce has also turned inventory availability into a service, allowing shoppers to know exactly where in the world the item they want is available. If they want to pick it up in-store that day, it shows where the store is on a map and could even embed that into a ride-sharing app to indicate exactly where you want to go. The idea is to create a seamless experience between consumer desire and purchase.

Finally, Salesforce has added some goodies to make developers happy, too, including the ability to browse the Salesforce API library and find the APIs that make the most sense for what they are creating, complete with code snippets to get started. It may not seem like a big deal, but as companies the size of Salesforce increase their API capabilities (especially with the Mulesoft acquisition), it’s harder to know what’s available. The company has also created a sandboxing capability to let developers experiment and build with these APIs in a safe way.

The basis of Commerce Cloud is Demandware, the company Salesforce acquired two years ago for $2.8 billion. Salesforce’s intelligence platform is called Einstein. In spite of its attempt to personify the technology, it’s really about bringing artificial intelligence across the Salesforce platform of products, as it has with today’s API announcements.


By Ron Miller

With Mulesoft in fold, Salesforce gains access to data wherever it lives

When Salesforce bought Mulesoft last spring for the tidy sum of $6.5 billion, it looked like money well spent for the CRM giant. After all, it was providing a bridge between the cloud and the on-prem data center and that was a huge missing link for a company with big ambitions like Salesforce.

When you want to rule the enterprise, you can’t be limited by where data lives and you need to be able to share information across disparate systems. Partly that’s a simple story of enterprise integration, but on another level it’s purely about data. Salesforce introduced its intelligence layer, dubbed Einstein, at Dreamforce in 2016.

With Mulesoft in the fold, it has access to data across systems wherever it lives, in the cloud or on-prem. Data is the fuel of artificial intelligence, and Salesforce has been trying desperately to get more data for Einstein since its inception.

It lost out on LinkedIn to Microsoft, which flexed its financial muscles and reeled in the business social network for $26.5 billion a couple of years ago. It’s undoubtedly a rich source of data that the company longed for. Next, it set its sights on Twitter, but after board and stockholder concerns, the company walked away (Twitter was ultimately never sold, of course).

Each of these forays was all about the data, and, frustrated, Salesforce went back to the drawing board. While Mulesoft did not supply the direct cache of data that a social network would have, it did provide a neat way to get at backend data sources, the very type of data that matters most to its enterprise customers.

Today, it has extended that notion beyond pure data access to a graph. You can probably see where this is going. The idea of a graph (the connections between, say, a buyer and the things they tend to buy, or between a person on a social network and the people they tend to interact with) can be extended even to the network/API level, and that is precisely the story Salesforce is trying to tell this week at the Dreamforce customer conference in San Francisco.

Visualizing connections in a data integration network in Mulesoft. Screenshot: Salesforce/Mulesoft

Maureen Fleming, program vice president for integration and process automation research at IDC says that it is imperative that organizations view data as a strategic asset and act accordingly. “Very few companies are getting all the value from their data as they should be, as it is locked up in various applications and systems that aren’t designed to talk to each other. Companies who are truly digitally capable will be able to connect these disparate data sources, pull critical business-level data from these connections, and make informed business decisions in a way that delivers competitive advantage,” Fleming explained in a statement.

Configuring data connections on Mulesoft Anypoint Platform. Gif: Salesforce/Mulesoft

It’s hard to overestimate how valuable this type of data is to Salesforce, which has already put Mulesoft to work internally to help build the new Customer 360 product announced today. It can point to how it’s providing the very type of data integration Fleming is referring to on its own product set.

Bret Taylor, president and chief product officer at Salesforce, says that for his company all of this is ultimately about enhancing the customer experience. You need to be able to stitch together these different computing environments and data silos to make that happen.

“In the short term, [customer] infrastructure is often fragmented. They often have some legacy applications on premise, they’ll have some cloud applications like Salesforce, but some infrastructure on Amazon or Google and Azure, and to actually transform the customer experience, they need to bring all this data together. And so it’s really a unique time for integration technologies like Mulesoft, because it enables you to create a seamless customer experience, no matter where that data lives, and that means you don’t need to wait for infrastructure to be perfect before you can transform your customer experience.”


By Ron Miller

Twilio’s contact center products just got more analytical with Ytica acquisition

Twilio, a company best known for supplying communications APIs to developers, has a product called Twilio Flex for building sophisticated customer service applications on top of Twilio’s APIs. Today, it announced it was acquiring Ytica (pronounced Why-tica) to provide an operational and analytical layer on top of the customer service solution.

The companies would not discuss the purchase price, but Twilio indicated it does not expect the acquisition to have a material impact on its “results, operations or financial condition.” In other words, it probably didn’t cost much.

Ytica, which is based in Prague, has been a Twilio partner for some time, so coming together in this fashion made a lot of sense, especially as Twilio has been developing Flex.

Twilio Flex is an app platform for contact centers that offers a full stack of applications and allows users to deliver customer support over multiple channels, explained Al Cook, general manager of Twilio Flex. “Flex deploys like SaaS, but because it’s built on top of APIs, you can reach in and change how Flex works,” he said. That is very appealing, especially for larger operations looking for a flexible, cloud-based solution without the baggage of on-prem legacy products.

What the product was lacking, however, was a native way to manage customer service representatives from within the application and to understand, through analytics and dashboards, how well or poorly the team was doing. The ability to measure the team’s effectiveness becomes even more critical the larger the group becomes, and Cook indicated some Flex users are managing enormous groups of 10,000-20,000 employees.

Ytica provides a way to measure the performance of customer service staff, allowing management to monitor and intervene and coach when necessary. “It made so much sense to join together as one team. They have huge experience in the contact center, and a similar philosophy to build something customizable and programmable in the cloud,” Cook said.

While Ytica works with other vendors beyond Twilio, CEO Simon Vostrý says that they will continue to support those customers, even as they join the Twilio family. “We can run Flex and can continue to run this separately. We have customers running on other SaaS platforms, and we will continue to support them,” he said.

The company will remain in Prague and become a Twilio satellite office. All 14 employees are expected to join the Twilio team and Cook says plans are already in the works to expand the Prague team.


By Ron Miller

Ping Identity acquires stealthy API security startup Elastic Beam

At the Identiverse conference in Boston today, Ping Identity announced that it has acquired Elastic Beam, a pre-Series A startup that uses artificial intelligence to monitor APIs and help understand when they have been compromised.

Ping also announced a new product, PingIntelligence for APIs, based on the Elastic Beam technology. The companies did not disclose the sale price.

The product itself is a pretty nifty piece of technology. It automatically detects all the API IP addresses and URLs running inside a customer’s environment. It then uses artificial intelligence to search for anomalous behavior and reports back when it finds it (or it can automatically shut down access, depending on how it’s configured).
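The detection loop described here can be sketched in miniature: flag any endpoint whose current request rate sits far outside its historical baseline. The Python fragment below is a toy statistical stand-in, not Elastic Beam's actual AI; the endpoint names, sample data and threshold are all invented for illustration:

```python
import statistics

def anomalous_endpoints(history, current, threshold=3.0):
    """Flag endpoints whose current rate exceeds mean + threshold * stdev.

    history: {endpoint: list of requests-per-minute samples}
    current: {endpoint: latest requests-per-minute reading}
    """
    flagged = []
    for endpoint, samples in history.items():
        mean = statistics.mean(samples)
        stdev = statistics.pstdev(samples)  # population stdev of the baseline
        if current.get(endpoint, 0) > mean + threshold * stdev:
            flagged.append(endpoint)
    return flagged

history = {"/v1/users": [100, 110, 95, 105], "/v1/orders": [50, 55, 48, 52]}
current = {"/v1/users": 104, "/v1/orders": 900}   # sudden spike on /v1/orders
print(anomalous_endpoints(history, current))      # → ['/v1/orders']
```

A real product models far richer behavior (payload shapes, call sequences, credential reuse), but the shape of the idea is the same: learn a baseline per API, then alert or block on deviation.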

“APIs are defined either in the API gateway, because that facilitates creation, or implemented on an application server like Node.js. We created a platform that could bring a level of protection to both,” company founder Bernard Harguindeguy told TechCrunch.

It may seem like an odd match for Ping, which, after all, is an enterprise identity company, but there are reasonable connections here. Perhaps the biggest is that CEO Andre Durand wants to see his company making increasing use of AI and machine learning for identity security in general. It’s also worth noting that his company has had an API security product in its portfolio for over five years, so it’s not a huge stretch to buy Elastic Beam.

With this purchase, Ping has not only acquired some advanced technology, it has also acqui-hired a team of AI and machine learning experts that could help inject the entire Ping product line with AI and machine learning smarts. “Nobody who has been watching should be surprised that Ping will drive machine learning, AI and general intelligence into our identity platform,” Durand said.

Harguindeguy certainly sees the potential here. “I think we can over time bring a high level of monitoring and intelligence to Ping to understand whether an identity may have been used by someone else or is being misused somehow,” he said.

Elastic Beam interface. Photo: Elastic Beam website

Harguindeguy will join Ping Identity as Senior Vice President of Intelligence along with his entire team. Neither company would divulge the exact number of employees, but Durand did acknowledge it fell somewhere between the 11 and 50 listed in the company’s Crunchbase profile. The original team consisted of around 10, according to Harguindeguy, and they have been hiring for some time, so it’s fair to say more than 11 but fewer than 50.

Harguindeguy says they were pursued by more than one company (although he wouldn’t say who those other companies were), but he felt that Ping provided a good cultural match for his company and could take them where they wanted to go faster than they could on their own, even with Series A money.

“We realized this is going to be really big. How do we go after the market really strongly, really fast? We saw that we could fuse this really fast with Ping and have a strong go-to-market with them,” he said.

Durand acknowledged that Ping, which was itself acquired by Vista Equity Partners for $600 million two years ago, couldn’t have made such an acquisition without the backing of a larger firm like this. “There was no chance we could have done either UnboundID (which the company acquired in August 2016) or Elastic Beam on our own. This was purely an artifact of being part of the Vista family portfolio,” he said.

PingIntelligence for APIs, the product based on Elastic Beam’s technology, is currently in private preview. It should be generally available sometime later this year.


By Ron Miller

Auth0 snags $55M Series D, seeks international expansion

Auth0, a startup based in Seattle, has been helping developers with a set of APIs to build authentication into their applications for the last five years. It’s raised a fair bit of money along the way to help extend that mission, and today the company announced a $55 million Series D.

This round was led by Sapphire Ventures with help from World Innovation Lab and existing investors Bessemer Venture Partners, Trinity Ventures, Meritech Capital and K9 Ventures. Today’s investment brings the total raised to $110 million. The company did not want to share its valuation.

CEO Eugenio Pace said the investment should help the company expand further internationally. In fact, one of the investors, World Innovation Lab, is based in Japan and should help with its presence there. “Japan is an important market for us and they should help us understand how the market works there,” he said.

The company offers developers an easy way to build authentication services into their applications, also known as Identity as a Service (IDaaS). It’s a lot like Stripe for payments or Twilio for messaging: instead of building the authentication layer from scratch, developers simply add a few lines of code and can take advantage of the services available on the Auth0 platform.

That platform includes a range of services such as single sign-on, two-factor authentication, passwordless login and breached-password detection.
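To make the "few lines of code" idea concrete, here is a drastically simplified, standard-library-only sketch of the core job an authentication service does: mint a signed, expiring token for a user, then verify it on later requests. Everything here (the signing key, function names and token format) is invented for illustration; a real IDaaS platform like Auth0 issues standards-based JWTs and manages the keys for you:

```python
import base64, hashlib, hmac, json, time

SECRET = b"demo-signing-key"  # in practice the identity provider manages keys

def issue_token(user_id, ttl=3600):
    """Mint a signed token: base64(payload) + '.' + HMAC-SHA256 signature.
    A heavily simplified stand-in for the JWTs an IDaaS platform issues."""
    payload = base64.urlsafe_b64encode(
        json.dumps({"sub": user_id, "exp": time.time() + ttl}).encode())
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest().encode()
    return payload + b"." + sig

def verify_token(token):
    """Return the user id if the signature checks out and the token is fresh."""
    payload, sig = token.rsplit(b".", 1)
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(sig, expected):  # constant-time comparison
        return None
    claims = json.loads(base64.urlsafe_b64decode(payload))
    if claims["exp"] < time.time():  # reject expired tokens
        return None
    return claims["sub"]

token = issue_token("user-42")
print(verify_token(token))  # → user-42
tampered = token[:-1] + (b"0" if not token.endswith(b"0") else b"1")
print(verify_token(tampered))  # → None
```

You would never hand-roll this in production; the point of an IDaaS API is that these details, plus SSO, MFA and breach detection, are handled on the provider's side.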

They have a free tier, which doesn’t even require a credit card, and paid tiers based on the type of user (regular versus enterprise) along with the number of users. They also charge for machine-to-machine authentication. Pace reports they have 3,500 paying customers and tens of thousands of users on the free tier.

All of that has added up to a pretty decent business. While Pace would not share specific numbers, he did indicate the company doubled its revenue last year and expected to do so again this year.

With a cadence of getting funding every year for the last three years, Pace says this round may mark the end of that fundraising cycle for a time. He wasn’t ready to commit to the idea of an IPO, saying that is likely a couple of years away, but he says the company is close to profitability.

With the new influx of money, the company plans to expand its workforce as it moves into markets across the world. It currently has 300 employees, but within a year Pace expects that to be between 400 and 450 worldwide.

The company’s last round was a $30 million Series C last June led by Meritech Capital Partners.


By Ron Miller

Adobe CTO leads company’s broad AI bet

There isn’t a software company out there worth its salt that doesn’t have some kind of artificial intelligence initiative in progress right now. These organizations understand that AI is going to be a game-changer, even if they might not have a full understanding of how that’s going to work just yet.

In March at the Adobe Summit, I sat down with Adobe executive vice president and CTO Abhay Parasnis, and talked about a range of subjects with him including the company’s goal to build a cloud platform for the next decade — and how AI is a big part of that.

Parasnis told me that he has a broad set of responsibilities, starting with the typical CTO role of setting the tone for the company’s technology strategy, but it doesn’t stop there by any means. He is also in charge of operational execution for the core cloud platform and all the engineering building out that platform, including AI and Sensei. That includes managing a multi-thousand-person engineering team. Finally, he’s in charge of all the digital infrastructure and the IT organization. Just a bit on his plate.

Ten years down the road

The company’s transition from selling boxed software to a subscription-based cloud company began in 2013, long before Parasnis came on board. It has been a highly successful one, but Adobe knew it would take more than simply shedding boxed software to survive long-term. When Parasnis arrived, the next step was to rearchitect the base platform in a way that was flexible enough to last for at least a decade — yes, a decade.

“When we first started thinking about the next generation platform, we had to think about what do we want to build for. It’s a massive lift and we have to architect to last a decade,” he said. There’s a huge challenge because so much can change over time, especially right now when technology is shifting so rapidly.

That meant that they had to build in flexibility to allow for these kinds of changes over time, maybe even ones they can’t anticipate just yet. The company certainly sees immersive technology like AR and VR, as well as voice, as something it needs to start thinking about as a future bet, and its base platform had to be adaptable enough to support that.

Making Sensei of it all

But Adobe also needed to get its ducks in a row around AI. That’s why, around 18 months ago, the company made another strategic decision to develop AI as a core part of the new platform. They saw a lot of companies looking at a more general AI for developers, but they had a different vision, one tightly focused on Adobe’s core functionality. Parasnis sees this as the key part of the company’s cloud platform strategy. “AI will be the single most transformational force in technology,” he said, adding that Sensei is by far the thing he is spending the most time on.

Photo: Ron Miller

The company began thinking about the new cloud platform with the larger artificial intelligence goal in mind, building AI-fueled algorithms to handle core platform functionality. Once they refined them for use in-house, the next step was to open up these algorithms to third-party developers to build their own applications using Adobe’s AI tools.

It’s actually a classic software platform play, whether the service involves AI or not. Every cloud company from Box to Salesforce has been exposing its services for years, letting developers take advantage of that expertise so they can concentrate on their own core competencies. They don’t have to worry about building something like storage or security from scratch, because they can grab those features from a platform that has built-in expertise and provides a way to easily incorporate them into applications.

The difference here is that it involves Adobe’s core functions, so it may be intelligent auto cropping and smart tagging in Adobe Experience Manager or AI-fueled visual stock search in Creative Cloud. These are features that are essential to the Adobe software experience, which the company is packaging as an API and delivering to developers to use in their own software.

Whether or not Sensei can be the technology that drives the Adobe cloud platform for the next 10 years, Parasnis and the company at large are very much committed to that vision. We should see more announcements from Adobe in the coming months and years as they build more AI-powered algorithms into the platform and expose them to developers for use in their own software.

Parasnis certainly recognizes this as an ongoing process. “We still have a lot of work to do, but we are off in an extremely good architectural direction, and AI will be a crucial part,” he said.


By Ron Miller

Salesforce introduces Integration Cloud on heels of MuleSoft acquisition

Salesforce hasn’t wasted any time turning the MuleSoft acquisition into a product of its own, announcing the Salesforce Integration Cloud this morning.

While it’s admittedly too soon to really take advantage of the MuleSoft product set, the company is laying the groundwork for MuleSoft’s eventual integration into the Salesforce family with this announcement, which showcases why Salesforce was so interested that it was willing to fork over $6.5 billion.

The company has decided to put its shiny new bauble front and center in the Integration Cloud announcement, so that when MuleSoft is in the fold, it will have a place to hit the ground running.

The Integration Cloud itself consists of three broad pieces: the Integration Platform, which will eventually be based on MuleSoft; Integration Builder, a tool that lets you bring together a complete picture of a customer from Salesforce tools as well as other enterprise data repositories; and finally Integration Experiences, which is designed to help brands build customized experiences based on all the information learned from the other tools.

For now, it involves a few pieces that are independent of MuleSoft including a workflow tool called Lightning Flow, a new service that is designed to let Salesforce customers build workflows using the customer data in Salesforce CRM.

It also includes a dash of Einstein, Salesforce’s catch-all brand for the intelligence layer that underlies the platform, to build Einstein intelligence into any app.

Salesforce also threw in some Trailhead education components to help customers understand how to best make use of these tools.

But make no mistake, this is a typical Salesforce launch. It is probably earlier than it should be, but it puts the idea of integration out there in the minds of its customers and lays a foundation for a much deeper set of products and services down the road when MuleSoft is more fully integrated into the Salesforce toolset.

For now, it’s important to understand that this deal is about using data to fuel the various pieces of the Salesforce platform and provide the Einstein intelligence layer with information from across the enterprise wherever it happens to live, whether that’s in Salesforce, another cloud application or some on-prem legacy systems.

This should sound familiar to folks attending the Adobe Summit this week in Las Vegas, since it’s eerily similar to what Adobe announced on stage yesterday at the Summit keynote. Adobe is calling it a customer experience system of record, but the end game is pretty much the same: bringing together data about a customer from a variety of sources, building a single view of that customer, and then turning that insight into a customized experience.

That they chose to make this announcement during the Adobe Summit, where Adobe has announced some data integration components of its own, could be a coincidence. But probably not.