FintechOS nabs $60M for a low-code approach to modernizing legacy banking and insurance services

“Challenger” startups in banking and insurance have upended their industries, and picked up significant business, by building more customer-friendly tools and services — more personalized, easier to access and usually competitively priced — than those typically provided by their bigger, incumbent rivals. Now, a startup out of Romania that is building tools to help the incumbents respond with better services of their own is announcing a significant round of funding as its business grows.

FintechOS, which has built a low-code platform aimed at larger (older) banking and insurance companies to help them build new services and analytics on top of and around their existing infrastructure, has raised €51 million ($61.5 million at today’s rates, but $60 million at the time of the deal closing) in a Series B round of funding.

FintechOS’s opportunity has been to target a wave of incumbents in the insurance and banking industries that have been watching as newer players like Lemonade (in insurance) and a plethora of challenger banks (Revolut, N26, Monzo and many others) swoop in and pick up customers, especially among younger demographics, while the incumbents have been unable to respond, largely because their infrastructure is too old and unwieldy. Turning a huge ship around, as we have seen, is no small task — a situation that has only become more apparent in the last year of pandemic living and the big shift to digital interactions that resulted from it.

“When we launched FintechOS in 2017, we could already see existing solutions to digital transformation would struggle to deliver tangible results. By contrast, our unique approach has quickly inspired a sea-change in how financial institutions address digitization and engage with their customers,” said Teodor Blidarus, co-founder and CEO at FintechOS, in a statement. “Events over the last year have only increased pressure on our industry to evolve and as a result we’re seeing growing demand for our powerful platforms. Our latest round of funding will help us grow at the pace needed to improve outcomes for financial institutions and their customers globally.”

(It is not the only one. Others out of Europe in the space of bringing new tools to incumbent banks to help them make more modern and competitive products include 10x, Thought Machine, Temenos, Mambu and many more.)

The Series B round of funding is being led by Draper Esprit, with Earlybird, Gapminder Ventures, Launchub, and OTB Ventures (which all participated in its Series A in December 2019) also participating. There are other backers in the round that are not being disclosed at this time, the startup added. FintechOS is also not disclosing its valuation. The company, based out of Bucharest, has raised just under $80 million to date.

FintechOS is active today in the UK and Europe — where it has been growing at a CAGR of 200% and says its services touch “millions” of people, with some of its key customers including banking giants Societe Generale and IdeaBank, and international insurance broker Howden. The plan will be to continue investing in those markets, as well as expanding internationally.

And it will be adding more services. Today, the banking platform is designed to help banks launch more retail services for consumers and small and medium business customers, and to help insurance companies build new health, life and general insurance products. (There are a lot of synergies in how insurance and financial services companies have been built over the years, so it’s a natural pairing when it comes to building tools for those industries.)

In the financial sector, FintechOS lets banks build in new digital onboarding flows, credit cards and loan products, savings and mortgage products. Insurance products include new approaches to generating and handling quotes, customer onboarding and management and claims automation — which may well bring FintechOS into closer contact and collaboration with the most successful startup to come out of its home country to date, the RPA juggernaut UiPath. In all cases, it helps stitch together data from a bank’s own systems with more modern tooling, and to link that up with yet more modern tools to help process that data more easily.

This is “low code,” but in practice it typically means the company works with third parties to enable all of this. Partners include systems integrators and other global services firms, such as Microsoft, Deloitte, Capgemini, KPMG and so on. (And the founders of the startup themselves come from consulting backgrounds, so they well understand the role these companies play in the process of bringing technology into big businesses.)

FintechOS is tapping into two trends that have arguably been the biggest in the financial and related insurance industries.

The first is that core services around things like credit and loans, current accounts, deposits and savings are not just very complex to build but have largely become commoditized — similar to digital payments — so packaging them up and turning them into services that can be integrated by way of an API makes them more easily accessible without the heavy lifting needed to build them from scratch. This lets companies focus instead on customer service, or on building more interesting tools around those basic services to customise them (for example, AI-based personalization). Disintermediating basic functions from the services built around them is arguably a bigger trend, but it has been especially prevalent in the enterprise, which has long been a slow-moving space when it comes to innovation in both the back end and the front end.
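
The pattern described above — a commoditized core service exposed behind an API, with the differentiating logic layered on top — can be sketched in a few lines of Python. Everything here (the `CoreBanking` service and the personalization layer) is hypothetical and purely illustrative, not FintechOS’s actual product:

```python
# Hypothetical sketch: a commoditized core service behind a simple API,
# with the differentiating logic (personalization) layered on top.

class CoreBanking:
    """Stands in for a commoditized, API-accessible core service."""

    def savings_rate(self, balance: float) -> float:
        # Flat, commodity pricing that any provider could offer.
        return 0.01


class PersonalizedSavings:
    """The layer a bank actually competes on: tailoring the commodity."""

    def __init__(self, core: CoreBanking):
        self.core = core

    def quote(self, balance: float, loyalty_years: int) -> float:
        base = self.core.savings_rate(balance)
        # Toy "personalization": reward long-standing customers.
        bonus = min(loyalty_years, 5) * 0.001
        return round(base + bonus, 4)


product = PersonalizedSavings(CoreBanking())
print(product.quote(balance=10_000, loyalty_years=3))  # → 0.013
```

The point of the sketch is the separation: the commodity layer can be bought or integrated via API, while the bank invests its effort only in the layer customers actually notice.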

The second of these is the big swing towards using no-code and low-code tools to empower more people within organizations to get stuck in when they can see something not working as efficiently as it could, and building the workflows themselves to improve that. This also applies to trying out and testing new products — again something that typically has not been done in financial and insurance services but can now be possible with low-code and no-code tools.

“Not only is our technology helping financial institutions become customer centric, but it’s also helping them provide products and services to more people and businesses,” said Sergiu Negut, the other co-founder, who is FintechOS’s CFO and COO, in a separate statement. “With so many markets still underserved, the ability to tailor offerings to a segment of one offers the opportunity to increase financial inclusion and adheres to our ideal that easy access to financial services is essential. We’re delighted to be working with investors who share our views on how fintech should be transforming the financial services industry.”

Notably, Draper Esprit also has backed Thought Machine, another big player in the world of fintech that is taking some of the learnings and models that have helped new entrants disrupt incumbents, and is packaging them up as services for incumbents, too. It takes a different approach, though, built not on low code but on smart contracts, which could be one reason why the VC doesn’t see the investments as a conflict of interest. They are also tackling an enormous market, so at least for now there is room for both of them, and for many others in the space, such as 10x, Temenos, Mambu, Rapyd and more.

“When we met Teo and Sergiu, we were immediately convinced of their vision: a data led, end-to-end platform, facilitated with a low-code/no-code infrastructure,” Vinoth Jayakumar, partner at Draper Esprit, said in a statement. “Incumbent financial services firms have cost-to-income ratios up to 90%, so we see a huge and increasing need for infrastructure software that allows digitisation at speed, ease and lower cost. Draper Esprit builds enduring partnerships; with the team at FintechOS we hope to build an enduring fintech company that will dramatically change financial services experiences for people all over the world.”

By Ingrid Lunden

Drama and quirk aren’t necessary for startup success

Many of the stories in our EC-1 series tell tales of startups in the wilderness hacking out greenfield opportunities. Klaviyo is a different breed of company: one that went into an established market and challenged powerful incumbents, ultimately finding success with a new, more data-oriented generation of email marketers.

As such, the lessons it offers are perhaps more subtle, its insights bordering on common sense.

But as the saying goes, common sense to an uncommon degree becomes wisdom. Here are four pieces of wisdom I’ve gleaned from Klaviyo’s story:

Lesson 1: Drama and quirk aren’t necessary for startup success

Silicon Valley has become a showcase for oddity. Ironically, we all enjoy “Silicon Valley” (the show) or “The Social Network.” Unironically, we toss around phrases like “the hustle” and “sweat equity.” Hot companies often stand out with stories of intense struggle and failure, a larger-than-life founder or a chaotic (and often toxic) management structure.

Drama and sizzle help companies stand out, undoubtedly. But are they necessary for success? Klaviyo’s story suggests otherwise.


By Danny Crichton

Marketing in 2021 is emotional and not just transactional

Brands are emotions made physical. The clothes we wear, the media we consume, the devices we use — all signal to others what we value and see in ourselves, and they are also a way to construct our very identities. Experimenting to deepen that bond has been at the core of the marketing profession for a century, its origins rooted in Freudian psychoanalysis.

There had always been one critical limitation, though: Marketers had to appeal to the masses. Radio, television and print media allowed brands to deliver only one message to everyone, no matter if their product conferred luxury or smart cost-consciousness.

On the internet, the masses have been shattered into ever smaller shards, shifting that marketing calculus toward targeted audiences and social network interest groups. Today, niche brands, large corporations and every business in between are reaching ever-narrower audiences.

Yet, advertising and social networks are competitive marketplaces. Over time, prices to reach niche audiences rise, and strategies that once worked become unviable. In 2021, these perpetual challenges are joined by two new factors: a fresh influx of new e-commerce brands and changing privacy policies on third-party platforms.

Klaviyo benefits from these secular trends. While the cost or difficulty of acquiring new customers may increase, as we looked at in the second part of this EC-1, the cost of emailing an existing one remains much the same. Marketers who become expert at personalization, especially for existing customers through owned marketing platforms like email, will hold an edge over their competitors. It’s no longer about marketing to narrow slices of audiences — it’s about building an emotional bond with an audience of one.

To a booming economy, now ad inflation

While 2020 was a banner year for e-commerce in the wake of the COVID-19 pandemic, the early months of 2021 have brought about a new problem: Customer acquisition costs are rising, sometimes to a worrying degree. For instance, one company interviewed by TechCrunch that did not wish to be named said it has seen its return on investment for Facebook ads fall by nearly half in the first months of 2021. Such inflation has also been predicted by firms like ECI Media Management.

There are two possible reasons for this increase. First, an unprecedented number of companies are moving online, spurred by COVID-19 and worldwide lockdowns.


By Danny Crichton

How Klaviyo used data and no-code to transform owned marketing

Email is the communication medium that refuses to die.

“Eventually, every technology is trumped by something new and better. And I feel that email is ready to be trumped. But by what?” wrote the venture capitalist Fred Wilson in 2007. Three years later, he updated readers that other forms of messaging had outgrown email. “It looks like email’s reign as the king of communication is ending and social networking is now supreme,” he said. (To be fair to Wilson, his view was nuanced enough to continue investing in email tech.)

Investors weren’t alone — marketers have also spent years anticipating the next big thing.

“It was SMS, it was YouTube, it was Instagram. Before that it was Facebook, then it was Snapchat and TikTok. I kinda feel like individually all those things are fleeting. I think people found: You know what? Everyone still opens their emails every day,” says Darin Hager, a former sneaker entrepreneur who is now an email marketing manager at Adjust Media.

Email has an estimated four billion users today and continues to grow steadily even as mature social networks plateau. Estimates of the number of nonspam messages sent each day range from 25 billion to over 300 billion.

Unsurprisingly for a marketing channel with so much volume, there’s voluminous competition to send and program those emails. Yet, despite the competition, Klaviyo didn’t just break into the market — it has also achieved an unusual level of excitement and loyalty among marketers in its relatively short history.

“If you’re not using Klaviyo and you’re in e-commerce, then it’s not very professional. If you see ‘Sent by Constant Contact or Mailchimp’ at the bottom of an email by a brand, it makes it look like they’re not really there yet,” Hager said.

How did Klaviyo become the standard solution among email marketers?

In Klaviyo’s origin story, we delved into part of the answer: The company began life as an e-commerce analytics service. Once it matured to compete as an email service provider, Klaviyo benefited from the edge given by its deeper, more comprehensive focus on data.

However, that leaves several questions unanswered. Why is email so important to e-commerce? What are the substantive differences between Klaviyo’s feature set and those of its competitors? And why did several large, well-funded incumbents fail to capitalize on building an advantage in data first?

In this section, we’ll answer those questions — as well as lay out the significance of COVID-19 for the e-commerce market, and how newsletters and AI figure into the company’s future.

A positive Outlook on email’s longevity

Email is one of the oldest tech verticals: Constant Contact, one of the most venerable email service providers (ESPs), was founded in 1995, went public in 2007 and was taken private in 2015 for $1 billion. By the time Klaviyo started in 2012, the space was well served by numerous incumbents.


By Danny Crichton

The Klaviyo EC-1

E-commerce is booming as retailers race to transform their brick-and-mortar footprints into online storefronts. By some counts, the market grew an astonishing 42% in 2020 in the wake of the COVID-19 pandemic, and estimates show that online spending in the U.S. will surpass $1 trillion by 2022. It’s a bonanza, and everyone is figuring out this new terrain.

Consumers are likely familiar with the front-end brands for these storefronts — with companies like Amazon, Shopify, Square, and Stripe owning attention — but it’s the tooling behind the curtain that is increasingly determining the competitiveness of individual stores.

Klaviyo may not be a household name to consumers (at least, not yet), but in many ways, this startup has become the standard by which email marketers are judged today, triangulating against veterans Mailchimp and Constant Contact and riding the e-commerce wave to new heights.

Founded in 2012, this Boston-based company helps marketers personalize and automate their email messaging to customers. By now, most people are intimately familiar with these kinds of emails; if you’ve ever given your email address to an online store, the entreaties to come back to your abandoned cart or browse the latest sale are Klaviyo’s bread and butter.

It may seem obvious in retrospect that email would grow to become a premier platform for marketing, but this wasn’t the case even a few years ago when social ads and search engine marketing were the dominant paradigm. Today, owned marketing and customer experience management are white-hot trends, and Klaviyo has surged from a lifestyle business to a multi-billion dollar behemoth in just a few short years. Its story is at the heart of the internet economy today, and the future.

TechCrunch’s writer and analyst for this EC-1 is Chris Morrison. Morrison, who previously wrote our EC-1 on Roblox, has been a writer and independent game developer covering the video game industry and the marketing challenges that come with publishing. As an analyst and a potential user, he’s in a unique position to explain the Klaviyo story. The lead editor for this package was Danny Crichton, the assistant editor was Ram Iyer, the copy editor was Richard Dal Porto and illustrations were created by Nigel Sussman.

Klaviyo had no say in the content of this analysis and did not get advance access to it. Morrison has no financial ties to Klaviyo or other conflicts of interest to disclose.

The Klaviyo EC-1 comprises four main articles, totaling 9,700 words with a reading time of 43 minutes. Let’s take a look:

  • Part 1: Origin story — “How Klaviyo transformed from a lifestyle business into a $4.15B email titan” (2,600 words/10 minutes) — Explores the rise of Klaviyo from a database for e-commerce data into a modern email powerhouse as it successively learned from customers and bootstrapped in the absence of funding from accelerators and early VCs.
  • Part 2: Business and growth — “How Klaviyo used data and no-code to transform owned marketing” (3,000 words/12 minutes) — Analyzes Klaviyo’s recent growth and how marketers increasingly focus on owned marketing channels and customer experience management.
  • Part 3: Dynamics of e-commerce marketing — “Marketing in 2021 is emotional and not just transactional” (2,200 words/9 minutes) — To fully understand Klaviyo and this new world of martech, this article contextualizes how and why marketers are increasingly trying to personalize and build deeper emotional bonds with their customers outside of social media channels.
  • Part 4: Lessons on startup growth — “Drama and quirk aren’t necessary for startup success” (1,900 words/8 minutes) — Founders shouldn’t have to keep learning the same lessons over and over again. Klaviyo offers a number of tried-and-true tutorials to understand how to build a competitive startup and not get bogged down in finding product-market fit and scaling.

We’re always iterating on the EC-1 format. If you have questions, comments or ideas, please send an email to TechCrunch Managing Editor Danny Crichton at [email protected].


By Danny Crichton

Should Dell have pursued a more aggressive debt-reduction move with VMware?

When Dell announced it was spinning out VMware yesterday, the move itself wasn’t surprising: There had been public speculation for some time. But Dell could have gone a number of ways in this deal, despite its choice to spin VMware out as a separate company with a special dividend instead of an outright sale.

The dividend route, which involves a payment to shareholders between $11.5 and $12 billion, has the advantage of being tax-free (or at least that’s what Dell hopes as it petitions the IRS). For Dell, which owns 81% of VMware, the dividend translates to somewhere between $9.3 and $9.7 billion in cash, which the company plans to use to pay down a portion of the huge debt it still holds from its $58 billion EMC purchase in 2016.

VMware was the crown jewel in that transaction, giving Dell an inroad to the cloud it had lacked prior to the deal. For context, VMware popularized the notion of the virtual machine, a concept that led to the development of cloud computing as we know it today. It has since expanded much more broadly beyond that, giving Dell a solid foothold in cloud native computing.

Dell hopes to have its cake and eat it too with this deal: It generates a large slug of cash to pay down its own debt while securing a five-year commercial agreement that should keep the two companies closely aligned. Dell CEO Michael Dell will remain chairman of the VMware board, which should help smooth the post-spinout relationship.

But could Dell have extracted more cash out of the deal?

Doing what’s best for everyone

Patrick Moorhead, principal analyst at Moor Insights & Strategy, says that beyond the cash transaction, the deal provides a way for the companies to continue working closely together with the least amount of disruption.

“In the end, this move is more about maximizing the Dell and VMware stock price [in a way that] doesn’t impact customers, ISVs or the channel. Wall Street wasn’t valuing the two companies together nearly as [strongly] as I believe it will as separate entities,” Moorhead said.


By Ron Miller

Tecton teams with founder of Feast open source machine learning feature store

Tecton, the company that pioneered the notion of the machine learning feature store, has teamed up with the founder of the open source feature store project called Feast. Today the company announced the release of version 0.10 of the open source tool.

The feature store is a concept that the Tecton founders came up with when they were engineers at Uber. Shortly thereafter, an engineer named Willem Pienaar read the founders’ Uber blog posts on building a feature store and went to work building Feast as an open source version of the concept.

“The idea of Tecton [involved bringing] feature stores to the industry, so we build basically the best in class, enterprise feature store. […] Feast is something that Willem created, which I think was inspired by some of the early designs that we published at Uber. And he built Feast and it evolved as kind of like the standard for open source feature stores, and it’s now part of the Linux Foundation,” Tecton co-founder and CEO Mike Del Balso explained.

Tecton later hired Pienaar, who is today an engineer at the company, where he leads its open source team. While the company did not originally start off with a plan to build an open source product, the two products are closely aligned, and it made sense to bring Pienaar on board.

“The products are very similar in a lot of ways. So I think there’s a similarity there that makes this somewhat symbiotic, and there is no explicit convergence necessary. The Tecton product is a superset of what Feast has. So it’s an enterprise version with a lot more advanced functionality, but at Feast we have a battle-tested feature store that’s open source,” Pienaar said.

As we wrote in a December 2020 story on the company’s $35 million Series B, it describes a feature store as “an end-to-end machine learning management system that includes the pipelines to transform the data into what are called feature values, then it stores and manages all of that feature data and finally it serves a consistent set of data.”
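The three-part description above — pipelines that transform raw data into feature values, a store that manages them, and a serving layer that hands back a consistent set — can be sketched as a toy in-memory feature store. This is a conceptual illustration only, not Tecton’s or Feast’s actual API:

```python
from collections import defaultdict

class ToyFeatureStore:
    """Toy sketch of the feature-store idea: transform, store, serve."""

    def __init__(self):
        self._features = defaultdict(dict)  # entity_id -> {name: value}

    def ingest(self, entity_id: str, raw_events: list[float]) -> None:
        # "Pipeline": transform raw data into named feature values.
        self._features[entity_id]["event_count"] = len(raw_events)
        self._features[entity_id]["event_mean"] = (
            sum(raw_events) / len(raw_events) if raw_events else 0.0
        )

    def serve(self, entity_id: str, names: list[str]) -> dict:
        # Serving: return a consistent set of features for one entity --
        # the same values that both training and inference would see.
        return {n: self._features[entity_id].get(n) for n in names}


store = ToyFeatureStore()
store.ingest("user_42", [3.0, 5.0, 7.0])
print(store.serve("user_42", ["event_count", "event_mean"]))
# → {'event_count': 3, 'event_mean': 5.0}
```

The value of a real feature store lies in doing this at scale with freshness guarantees; the sketch only shows the shape of the ingest/serve contract.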

Del Balso says that from a business perspective, contributing to the open source feature store exposes his company to a different group of users, and the commercial and open source products can feed off one another as they build the two products.

“What we really like, and what we feel is very powerful here, is that we’re deeply in the Feast community and get to learn from all of the interesting use cases […] to improve the Tecton product. And similarly, we can use the feedback that we’re hearing from our enterprise customers to improve the open source project. That’s the kind of cross learning, and ideally that feedback loop involved there,” he said.

The plan is for Tecton to continue being a primary contributor with a team inside Tecton dedicated to working on Feast. Today, the company is releasing version 0.10 of the project.


By Ron Miller

Zoho launches new low code workflow automation product

Workflow automation has been one of the key trends this year so far, and Zoho, a company known for its suite of affordable business tools, has joined the parade with a new low-code workflow product called Qntrl (pronounced “control”).

Zoho’s Rodrigo Vaca, who is in charge of Qntrl’s marketing, says that most of the solutions we’ve been seeing are built for larger enterprise customers. Zoho is aiming for the mid-market with a product that requires less technical expertise than traditional business process management tools.

“We enable customers to design their workflows visually without the need for any particular kind of prior knowledge of business process management notation or any kind of that esoteric modeling or discipline,” Vaca told me.

While Vaca says Qntrl could require some technical help to connect a workflow to more complex backend systems like CRM or ERP, it allows a less technical end user to drag and drop the components and then get help to finish the rest.

“We certainly expect that when you need to connect to NetSuite or SAP you’re going to need a developer. If nothing else, the IT guys are going to ask questions, and they will need to provide access,” Vaca said.

He believes this product is putting this kind of tooling in reach of companies that may have been left out of workflow automation for the most part, or which have been using spreadsheets or other tools to create crude workflows. With Qntrl, you drag and drop components, and then select each component and configure what happens before, during and after each step.
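The configure-before/during/after model described above can be sketched as a simple step abstraction. This is a hypothetical illustration of the general pattern, not Qntrl’s actual data model:

```python
# Hypothetical sketch of a workflow step with before/during/after hooks,
# in the spirit of the drag-and-drop model described above.

class Step:
    def __init__(self, name, action, before=None, after=None):
        self.name = name
        self.action = action        # what happens "during" the step
        self.before = before or []  # hooks run before the action
        self.after = after or []    # hooks run after the action

    def run(self, ctx: dict) -> dict:
        for hook in self.before:
            hook(ctx)
        self.action(ctx)
        for hook in self.after:
            hook(ctx)
        return ctx


# A two-step "approval" flow assembled like dropped components.
steps = [
    Step("submit", action=lambda c: c.update(status="submitted")),
    Step(
        "review",
        action=lambda c: c.update(status="approved"),
        before=[lambda c: c.update(assigned_to="reviewer")],
        after=[lambda c: c.update(notified=True)],
    ),
]

ctx = {}
for step in steps:
    ctx = step.run(ctx)
print(ctx)  # → {'status': 'approved', 'assigned_to': 'reviewer', 'notified': True}
```

A visual builder essentially generates a structure like `steps` from what the user drags onto the canvas, which is why a non-developer can assemble the flow while IT only gets involved for the backend connections.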

What’s more, Qntrl provides a central place for processing and understanding what’s happening within each workflow at any given time, and who is responsible for completing it.

We’ve seen bigger companies like Microsoft, SAP, ServiceNow and others offering this type of functionality over the last year as low code workflow automation has taken center stage in business.

This has become a more pronounced need during the pandemic when so many workers could not be in the office. It made moving work in a more automated workflow more imperative, and we have seen companies moving to add more of this kind of functionality as a result.

Brent Leary, principal analyst at CRM Essentials, says that Zoho is attempting to remove some of the complexity from this kind of tool.

“It handles the security pieces to make sure the right people have access to the data and processes used in the workflows in the background, so regular users can drag and drop to build their flows and processes without having to worry about that stuff,” Leary told me.

Zoho Qntrl is available starting today at $7 per user per month.


By Ron Miller

Docugami’s new model for understanding documents cuts its teeth on NASA archives

You hear so much about data these days that you might forget that a huge amount of the world runs on documents: a veritable menagerie of heterogeneous files and formats holding enormous value yet incompatible with the new era of clean, structured databases. Docugami plans to change that with a system that intuitively understands any set of documents and intelligently indexes their contents — and NASA is already on board.

If Docugami’s product works as planned, anyone will be able to take piles of documents accumulated over the years and near-instantly convert them to the kind of data that’s actually useful to people.

Because it turns out that running just about any business ends up producing a ton of documents. Contracts and briefs in legal work, leases and agreements in real estate, proposals and releases in marketing, medical charts, etc, etc. Not to mention the various formats: Word docs, PDFs, scans of paper printouts of PDFs exported from Word docs, and so on.

Over the last decade there’s been an effort to corral this problem, but movement has largely been on the organizational side: put all your documents in one place, share and edit them collaboratively. Understanding the document itself has pretty much been left to the people who handle them, and for good reason — understanding documents is hard!

Think of a rental contract. We humans understand when the renter is named as Jill Jackson, that later on, “the renter” also refers to that person. Furthermore, in any of a hundred other contracts, we understand that the renters in those documents are the same type of person or concept in the context of the document, but not the same actual person. These are surprisingly difficult concepts for machine learning and natural language understanding systems to grasp and apply. Yet if they could be mastered, an enormous amount of useful information could be extracted from the millions of documents squirreled away around the world.
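The distinction drawn above — “the renter” is the same role across contracts but a different actual person in each document — can be made concrete with a toy lookup. This is purely illustrative (the second name and the document IDs are invented); Docugami’s actual system is not public:

```python
# Toy illustration: the same role label ("renter") binds to a different
# person in each document, so extraction must track role vs. instance.

docs = {
    "lease_001.pdf": {"renter": "Jill Jackson"},
    "lease_002.pdf": {"renter": "Tom Okafor"},
}

def resolve(doc_id: str, phrase: str) -> str:
    # Resolve a role-referring phrase like "the renter" within one document.
    role = phrase.removeprefix("the ").strip()
    return docs[doc_id][role]

print(resolve("lease_001.pdf", "the renter"))  # → Jill Jackson
print(resolve("lease_002.pdf", "the renter"))  # → Tom Okafor
```

The hard part, of course, is building the per-document role table automatically from free text — that is the machine-understanding problem the paragraph describes.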

What’s up, .docx?

Docugami founder Jean Paoli says they’ve cracked the problem wide open, and while it’s a major claim, he’s one of the few people who could credibly make it. Paoli was a major figure at Microsoft for decades, and among other things helped create the XML format — you know all those files that end in x, like .docx and .xlsx? Paoli is at least partly to thank for them.

“Data and documents aren’t the same thing,” he told me. “There’s a thing you understand, called documents, and there’s something that computers understand, called data. Why are they not the same thing? So my first job [at Microsoft] was to create a format that can represent documents as data. I created XML with friends in the industry, and Bill accepted it.” (Yes, that Bill.)

The formats became ubiquitous, yet 20 years later the same problem persists, having grown in scale with the digitization of industry after industry. But for Paoli the solution is the same. At the core of XML was the idea that a document should be structured almost like a webpage: boxes within boxes, each clearly defined by metadata — a hierarchical model more easily understood by computers.
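The “boxes within boxes” model is exactly what XML encodes. A minimal example — an invented lease fragment, parsed with Python’s standard library — shows how named nesting lets a machine navigate the hierarchy rather than raw text:

```python
import xml.etree.ElementTree as ET

# An invented fragment: each box of content is named and nested.
doc = ET.fromstring("""
<lease>
  <parties>
    <renter>Jill Jackson</renter>
    <landlord>Acme Property LLC</landlord>
  </parties>
  <terms>
    <rent currency="USD">1500</rent>
  </terms>
</lease>
""")

print(doc.find("parties/renter").text)         # → Jill Jackson
print(doc.find("terms/rent").get("currency"))  # → USD
```

Queries address content by its place in the hierarchy ("the renter inside the parties box"), which is the structural understanding Paoli argues mainstream AI models have lacked.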

Illustration showing a document corresponding to pieces of another document.

Image Credits: Docugami

“A few years ago I drank the AI kool-aid, got the idea to transform documents into data. I needed an algorithm that navigates the hierarchical model, and they told me that the algorithm you want does not exist,” he explained. “The XML model, where every piece is inside another, and each has a different name to represent the data it contains — that has not been married to the AI model we have today. That’s just a fact. I hoped the AI people would go and jump on it, but it didn’t happen.” (“I was busy doing something else,” he added, to excuse himself.)

The lack of compatibility with this new model of computing shouldn’t come as a surprise — every emerging technology carries with it certain assumptions and limitations, and AI has focused on a few other, equally crucial areas like speech understanding and computer vision. The approach taken there doesn’t match the needs of systematically understanding a document.

“Many people think that documents are like cats. You train the AI to look for their eyes, for their tails… documents are not like cats,” he said.

It sounds obvious, but it’s a real limitation: advanced AI methods like segmentation, scene understanding, multimodal context, and such are all a sort of hyper-advanced cat detection that has moved beyond cats to detect dogs, car types, facial expressions, locations, etc. Documents are too different from one another, or in other ways too similar, for these approaches to do much more than roughly categorize them.

And as for language understanding, it’s good in some ways but not in the ways Paoli needed. “They’re working sort of at the English language level,” he said. “They look at the text but they disconnect it from the document where they found it. I love NLP people, half my team is NLP people — but NLP people don’t think about business processes. You need to mix them with XML people, people who understand computer vision, then you start looking at the document at a different level.”

Docugami in action

Illustration showing a person interacting with a digital document.

Image Credits: Docugami

Paoli’s goal couldn’t be reached by adapting existing tools (beyond mature primitives like optical character recognition), so he assembled his own private AI lab, where a multi-disciplinary team has been tinkering away for about two years.

“We did core science, self-funded, in stealth mode, and we sent a bunch of patents to the patent office,” he said. “Then we went to see the VCs, and SignalFire basically volunteered to lead the seed round at $10 million.”

Coverage of the round didn’t really get into the actual experience of using Docugami, but Paoli walked me through the platform with some live documents. I wasn’t given access myself and the company wouldn’t provide screenshots or video, saying it is still working on the integrations and UI, so you’ll have to use your imagination… but if you picture pretty much any enterprise SaaS service, you’re 90 percent of the way there.

As the user, you upload any number of documents to Docugami, from a couple dozen to hundreds or thousands. These enter a machine understanding workflow that parses the documents, whether they’re scanned PDFs, Word files, or something else, into an XML-esque hierarchical organization unique to the contents.

“Say you’ve got 500 documents, we try to categorize it in document sets, these 30 look the same, those 20 look the same, those 5 together. We group them with a mix of hints coming from how the document looked, what it’s talking about, what we think people are using it for, etc,” said Paoli. Other services might be able to tell the difference between a lease and an NDA, but documents are too diverse to slot into pre-trained ideas of categories and expect it to work out. Every set of documents is potentially unique, and so Docugami trains itself anew every time, even for a set of one. “Once we group them, we understand the overall structure and hierarchy of that particular set of documents, because that’s how documents become useful: together.”
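Docugami hasn't published how this grouping works, but the idea of clustering documents by how similar they look can be sketched with something as simple as word-overlap (Jaccard) similarity; the documents, threshold, and greedy strategy below are all illustrative:

```python
def tokens(doc: str) -> set[str]:
    return set(doc.lower().split())

def jaccard(a: set[str], b: set[str]) -> float:
    """Word-overlap similarity: shared words over total distinct words."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

def group_documents(docs: list[str], threshold: float = 0.5) -> list[list[int]]:
    """Greedy grouping: each document joins the first existing set whose
    representative it resembles enough, otherwise it starts a new set."""
    groups: list[list[int]] = []
    for i, doc in enumerate(docs):
        for group in groups:
            if jaccard(tokens(doc), tokens(docs[group[0]])) >= threshold:
                group.append(i)
                break
        else:
            groups.append([i])
    return groups

docs = [
    "lease agreement between landlord and tenant for rent",
    "lease agreement between landlord and tenant for monthly rent",
    "mutual nondisclosure agreement between two parties",
]
print(group_documents(docs))  # [[0, 1], [2]]
```

The two leases land in one set and the NDA in another, the "these 30 look the same, those 20 look the same" behavior Paoli describes, though a production system would mix in many more signals than word overlap.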

Illustration showing a document being turned into a report and a spreadsheet.

Image Credits: Docugami

That doesn’t just mean it picks up on header text and creates an index, or lets you search for words. The data in the document (for example, who is paying whom, how much and when, and under what conditions) becomes structured and editable within the context of similar documents. (It asks for a little input to double-check what it has deduced.)

It can be a little hard to picture, but now just imagine that you want to put together a report on your company’s active loans. All you need to do is highlight the information that’s important to you in an example document — literally, you just click “Jane Roe” and “$20,000” and “5 years” anywhere they occur — and then select the other documents you want to pull corresponding information from. A few seconds later you have an ordered spreadsheet with names, amounts, dates, anything you wanted out of that set of documents.
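The mechanics behind that click are proprietary, but the extract-by-example idea can be sketched naively: take the words just before the highlighted value as an anchor, then find whatever follows the same anchor in sibling documents. The documents and helper below are hypothetical:

```python
import re

def make_pattern(example_doc: str, example_value: str) -> re.Pattern:
    """Capture the two words before the highlighted value as an anchor,
    then match whatever token follows that anchor elsewhere."""
    idx = example_doc.index(example_value)
    prefix = example_doc[:idx].split()[-2:]
    anchor = r"\s+".join(re.escape(word) for word in prefix)
    return re.compile(anchor + r"\s+(\S+)")

docs = [
    "This loan of $20,000 is issued to Jane Roe for 5 years.",
    "This loan of $7,500 is issued to John Doe for 3 years.",
]

# "Highlight" $20,000 once in the first document, then pull the
# corresponding amount out of every document in the set.
amount = make_pattern(docs[0], "$20,000")
column = [amount.search(doc).group(1) for doc in docs]
print(column)  # ['$20,000', '$7,500']
```

One highlighted example yields a whole spreadsheet column; Docugami's version clearly generalizes across layouts and wordings in ways this two-word anchor cannot.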

All this data is meant to be portable too, of course — there are integrations planned with various other common pipes and services in business, allowing for automatic reports, alerts if certain conditions are reached, automated creation of templates and standard documents (no more keeping an old one around with underscores where the principals go).

Remember, this is all half an hour after you uploaded the documents in the first place, with no labeling, pre-processing or cleaning required. And the AI isn’t working from some preconceived notion or format of what a lease document looks like. It has learned all it needs to know from the actual docs you uploaded — how they’re structured, where things like names and dates figure relative to one another, and so on. And it works across verticals, with an interface anyone can figure out in a few minutes. Whether you’re in healthcare data entry or construction contract management, the tool should make sense.

The web interface where you ingest and create new documents is one of the main tools, while the other lives inside Word. There Docugami acts as a sort of assistant that’s fully aware of every other document of whatever type you’re in, so you can create new ones, fill in standard information, comply with regulations, and so on.

Okay, so processing legal documents isn’t exactly the most exciting application of machine learning in the world. But I wouldn’t be writing this (at all, let alone at this length) if I didn’t think this was a big deal. This sort of deep understanding of document types can be found here and there in established industries with standard document types (such as police or medical reports), but have fun waiting until someone trains a bespoke model for your kayak rental service. Yet small businesses have just as much value locked up in documents as large enterprises, and they can’t afford to hire a team of data scientists. Even the big organizations can’t do it all manually.

NASA’s treasure trove

Image Credits: NASA

The problem is extremely difficult, yet to humans it seems almost trivial. You or I could glance through 20 similar documents and pull out a list of names and amounts easily, perhaps even in less time than it takes for Docugami to crawl them and train itself.

But AI, after all, is meant to imitate and exceed human capacity, and it’s one thing for an account manager to do monthly reports on 20 contracts — quite another to do a daily report on a thousand. Docugami handles both equally easily, which is why it fits both the enterprise, where scaling this kind of operation is crucial, and NASA, which is buried under a backlog of documentation from which it hopes to glean clean data and insights.

If there’s one thing NASA’s got a lot of, it’s documents. Its reasonably well-maintained archives go back to its founding, and many important ones are available by various means — I’ve spent many a pleasant hour perusing its cache of historical documents.

But NASA isn’t looking for new insights into Apollo 11. Through its many past and present programs, solicitations, grant programs, budgets, and of course engineering projects, it generates a huge amount of documents — being, after all, very much a part of the federal bureaucracy. And as with any large organization with its paperwork spread over decades, NASA’s document stash represents untapped potential.

Expert opinions, research precursors, engineering solutions, and a dozen more categories of important information are sitting in files searchable perhaps by basic word matching but otherwise unstructured. Wouldn’t it be nice for someone at JPL to get it in their head to look at the evolution of nozzle design, and within a few minutes have a complete and current list of documents on that topic, organized by type, date, author, and status? What about the patent advisor who needs to provide a NIAC grant recipient information on prior art — shouldn’t they be able to pull up those old patents and applications with more specificity than a keyword search allows?

The NASA SBIR grant, awarded last summer, isn’t for any specific work, like collecting all the documents of such and such a type from Johnson Space Center or something. It’s an exploratory or investigative agreement, as many of these grants are, and Docugami is working with NASA scientists on the best ways to apply the technology to their archives. (One of the best applications may be to the SBIR and other small business funding programs themselves.)

Another SBIR grant with the NSF differs in that, while at NASA the team is looking into better organizing tons of disparate types of documents with some overlapping information, at NSF they’re aiming to better identify “small data.” “We are looking at the tiny things, the tiny details,” said Paoli. “For instance, if you have a name, is it the lender or the borrower? The doctor or the patient name? When you read a patient record, penicillin is mentioned, is it prescribed or prohibited? If there’s a section called allergies and another called prescriptions, we can make that connection.”
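Paoli's penicillin example describes context-dependent labeling, which can be caricatured in a few lines: the same drug name gets a different role depending on the section heading it appears under. The section names and role labels here are invented for illustration:

```python
# Invented section-to-role mapping: the same drug name means something
# different under "Allergies" than under "Prescriptions".
SECTION_ROLE = {
    "allergies": "contraindicated",
    "prescriptions": "prescribed",
}

def label_mentions(record: dict[str, list[str]]) -> dict[str, str]:
    """record maps a section heading to the drug names found under it."""
    labels = {}
    for section, drugs in record.items():
        role = SECTION_ROLE.get(section.lower(), "mentioned")
        for drug in drugs:
            labels[drug] = role
    return labels

record = {"Allergies": ["penicillin"], "Prescriptions": ["amoxicillin"]}
print(label_mentions(record))
```

The point of the sketch is the lookup on the enclosing section: the "tiny detail" only becomes data once the document's structure says which box it sits in.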

“Maybe it’s because I’m French”

When I pointed out the rather small budgets involved with SBIR grants and how his company couldn’t possibly survive on these, he laughed.

“Oh, we’re not running on grants! This isn’t our business. For me, this is a way to work with scientists, with the best labs in the world,” he said, while noting many more grant projects were in the offing. “Science for me is a fuel. The business model is very simple – a service that you subscribe to, like Docusign or Dropbox.”

The company is only just now beginning its real business operations, having made a few connections with integration partners and testers. But over the next year it will expand its private beta and eventually open it up — though there’s no timeline on that just yet.

“We’re very young. A year ago we were like five, six people, now we went and got this $10M seed round and boom,” said Paoli. But he’s certain that this is a business that will be not just lucrative but will represent an important change in how companies work.

“People love documents. Maybe it’s because I’m French,” he said, “but I think text and books and writing are critical — that’s just how humans work. We really think people can help machines think better, and machines can help people think better.”


By Devin Coldewey

Microsoft goes all in on healthcare with $19.7B Nuance acquisition

When Microsoft announced it was acquiring Nuance Communications this morning for $19.7 billion, you could be excused for doing a Monday morning double take at the hefty price tag.

That’s surely a lot of money for a company on a $1.4 billion run rate, but Microsoft, which has already partnered with the speech-to-text market leader on several products over the last couple of years, saw a company firmly embedded in healthcare and it decided to go all in.

And $20 billion is certainly all in, even for a company the size of Microsoft. But 2020 forced everyone from restaurants to retailers to doctors to change the way they do business. The pandemic in particular changed the way we interact with our medical providers. We learned very quickly that you don’t have to drive to an office, wait in a waiting room, then in an exam room, all to see the doctor for a few minutes.

Instead, we can get on the line, have a quick chat and be on our way. It won’t work for every condition of course — there will always be times the physician needs to see you — but for many meetings such as reviewing test results or for talk therapy, telehealth could suffice.

Microsoft CEO Satya Nadella says that Nuance is at the center of this shift, especially with its use of cloud and artificial intelligence, and that’s why the company was willing to pay the amount it did to get it.

“AI is technology’s most important priority, and healthcare is its most urgent application. Together, with our partner ecosystem, we will put advanced AI solutions into the hands of professionals everywhere to drive better decision-making and create more meaningful connections, as we accelerate growth of Microsoft Cloud in Healthcare and Nuance,” Nadella said in a post announcing the deal.

Microsoft sees this deal doubling what was already a considerable total addressable market to nearly $500 billion. While TAMs always tend to run high, that is still a substantial number.

It also fits with Gartner data, which found that by 2022, 75% of healthcare organizations will have a formal cloud strategy in place. The AI component only adds to that number and Nuance brings 10,000 existing customers to Microsoft including some of the biggest healthcare organizations in the world.

Brent Leary, founder and principal analyst at CRM Essentials, says the deal could provide Microsoft with a ton of health data to help feed the underlying machine learning models and make them more accurate over time.

“There is going to be a ton of health data being captured by interactions coming through telemedicine, and this could create a whole new level of health intelligence,” Leary told me.

That of course could drive a lot of privacy concerns where health data is involved, and it will be up to Microsoft, which just experienced a major breach on its Exchange email server products last month, to assure the public that their sensitive health data is being protected.

Leary says that ensuring data privacy is going to be absolutely key to the success of the deal. “The potential this move has is pretty powerful, but it will only be realized if the data and insights that could come from it are protected and secure — not only protected from hackers but also from unethical use. Either could derail what could be a game changing move,” he said.

Microsoft also seemed to recognize that when it wrote, “Nuance and Microsoft will deepen their existing commitments to the extended partner ecosystem, as well as the highest standards of data privacy, security and compliance.”

We are clearly on the edge of a sea change when it comes to how we interact with our medical providers in the future. COVID pushed medicine deeper into the digital realm in 2020 out of simple necessity. It wasn’t safe to go into the office unless absolutely necessary.

The Nuance acquisition, which is expected to close some time later this year, could help Microsoft shift deeper into the market. It could even bring Teams into it as a meeting tool, but it’s all going to depend on the trust level people have with this approach, and it will be up to the company to make sure that both healthcare providers and the people they serve have that.


By Ron Miller

Microsoft is acquiring Nuance Communications for $19.7B

Microsoft agreed today to acquire Nuance Communications, a leader in speech-to-text software, for $19.7 billion. Bloomberg broke the story over the weekend that the two companies were in talks.

In a post announcing the deal, the company said this was about increasing its presence in the healthcare vertical, a place where Nuance has done well in recent years. In fact, the company announced the Microsoft Cloud for Healthcare last year, and this deal is about accelerating its presence there. Nuance’s products in this area include Dragon Ambient eXperience, Dragon Medical One and PowerScribe One for radiology reporting.

“Today’s acquisition announcement represents the latest step in Microsoft’s industry-specific cloud strategy,” the company wrote. The acquisition also builds on several integrations and partnerships the two companies have made in the last couple of years.

The company boasts 10,000 healthcare customers, according to information on the website. Those include AthenaHealth, Johns Hopkins, Mass General Brigham and Cleveland Clinic to name but a few, and it was that customer base that attracted Microsoft to pay the price it did to bring Nuance into the fold.

Nuance CEO Mark Benjamin will remain with the company and report to Scott Guthrie, Microsoft’s EVP in charge of the cloud and AI group.

Nuance has a complex history. It went public in 2000 and began buying speech recognition products, including Dragon Dictate from Lernout & Hauspie, in 2001. It merged with a company called ScanSoft in 2005. That company began life in 1992 as Visioneer, a scanning company.

Today, the company has a number of products, including Dragon Dictate, a consumer and business speech-to-text product that dates back to the early 1990s. It’s also involved in speech recognition, chatbots and natural language processing, particularly in healthcare and other verticals.

The company has 6,000 employees spread across 27 countries. In its most recent earnings report from November 2020, which was for Q4 2020, the company reported $352.9 million in revenue compared to $387.6 million in the same period a year prior. That’s not the direction a company wants to go in, but it still amounts to a run rate of over $1.4 billion.

At the time of that earnings call, the company also announced it was selling its medical transcription and electronic health record (EHR) Go-Live services to Assured Healthcare Partners and Aeries Technology Group. Company CEO Benjamin said this was about helping the company concentrate on its core speech services.

“With this sale, we will reach an important milestone in our journey towards a more focused strategy of advancing our Conversational AI, natural language understanding and ambient clinical intelligence solutions,” Benjamin said in a statement at the time.

It’s worth noting that Microsoft already has a number of speech recognition and chatbot products of its own, including desktop speech-to-text services in Windows and on Azure, but it took the chance to buy a market leader and go deeper into the healthcare vertical.

The transaction has already been approved by both company boards and Microsoft reports it expects the deal to close by the end of this year, subject to standard regulatory oversight and approval by Nuance shareholders.

This would mark the second largest purchase by Microsoft ever, only surpassed by the $26.2 billion the company paid for LinkedIn in 2016.


By Ron Miller

Immersion cooling to offset data centers’ massive power demands gains a big booster in Microsoft

LiquidStack does it. So does Submer. They’re both dropping servers carrying sensitive data into goop in an effort to save the planet. Now they’re joined by one of the biggest tech companies in the world in their efforts to improve the energy efficiency of data centers, because Microsoft is getting into the liquid-immersion cooling market.

Microsoft is using a liquid it developed in-house that’s engineered to boil at 122 degrees Fahrenheit (lower than the boiling point of water) to act as a heat sink, reducing the temperature inside the servers so they can operate at full power without any risks from overheating.

The vapor from the boiling fluid is converted back into a liquid through contact with a cooled condenser in the lid of the tank that stores the servers.
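As a rough sanity check on why boil-off works as a heat sink, here is my own back-of-the-envelope arithmetic, not Microsoft's; the latent heat figure is an assumed, typical value for engineered fluorocarbons:

```python
# Assumed latent heat of vaporization, ~100 kJ/kg. Real engineered
# fluids vary, and Microsoft has not published a figure here.
LATENT_HEAT_J_PER_KG = 1.0e5

def heat_removed_watts(boil_off_kg_per_s: float) -> float:
    """Power carried away by the fluid boiling at the chip surface."""
    return boil_off_kg_per_s * LATENT_HEAT_J_PER_KG

# Under that assumption, boiling off 7 grams of fluid per second
# absorbs about 700 W, roughly one high-end GPU's heat output.
print(heat_removed_watts(0.007))
```

The condenser in the tank lid then dumps that same heat as the vapor turns back into liquid, closing the loop without pumps pushing coolant through every server.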

“We are the first cloud provider that is running two-phase immersion cooling in a production environment,” said Husam Alissa, a principal hardware engineer on Microsoft’s team for datacenter advanced development in Redmond, Washington, in a statement on the company’s internal blog. 

While that claim may be true, liquid cooling is a well-known approach to moving heat around to keep systems working. Cars use liquid cooling to keep their motors humming as they head out on the highway.

As technology companies confront the physical limits of Moore’s Law, the demand for faster, higher-performance processors means designing new architectures that can handle more power, the company wrote in a blog post. Power flowing through central processing units has increased from 150 watts to more than 300 watts per chip, and the GPUs responsible for much of Bitcoin mining, artificial intelligence applications and high-end graphics each consume more than 700 watts per chip.

It’s worth noting that Microsoft isn’t the first tech company to apply liquid cooling to data centers and the distinction that the company uses of being the first “cloud provider” is doing a lot of work. That’s because bitcoin mining operations have been using the tech for years. Indeed, LiquidStack was spun out from a bitcoin miner to commercialize its liquid immersion cooling tech and bring it to the masses.

“Air cooling is not enough”

More power flowing through the processors means hotter chips, which means the need for better cooling or the chips will malfunction.

“Air cooling is not enough,” said Christian Belady, vice president of Microsoft’s datacenter advanced development group in Redmond, in an interview for the company’s internal blog. “That’s what’s driving us to immersion cooling, where we can directly boil off the surfaces of the chip.”

For Belady, the use of liquid cooling technology brings the density and compression of Moore’s Law up to the datacenter level.

The results, from an energy consumption perspective, are impressive. Microsoft investigated liquid immersion as a cooling solution for high-performance computing applications such as AI, and the investigation revealed that two-phase immersion cooling reduced power consumption for any given server by 5% to 15% (every little bit helps).

Meanwhile, companies like Submer claim they reduce energy consumption by 50%, water use by 99%, and take up 85% less space.

For cloud computing companies, the ability to keep these servers up and running even during spikes in demand, when they’d consume even more power, adds flexibility and ensures uptime even when servers are overtaxed, according to Microsoft.

“[We] know that with Teams when you get to 1 o’clock or 2 o’clock, there is a huge spike because people are joining meetings at the same time,” Marcus Fontoura, a vice president on Microsoft’s Azure team, said on the company’s internal blog. “Immersion cooling gives us more flexibility to deal with these burst-y workloads.”

At this point, data centers are a critical component of the internet infrastructure that much of the world relies on for… well… pretty much every tech-enabled service. That reliance however has come at a significant environmental cost.

“Data centers power human advancement. Their role as a core infrastructure has become more apparent than ever and emerging technologies such as AI and IoT will continue to drive computing needs. However, the environmental footprint of the industry is growing at an alarming rate,” Alexander Danielsson, an investment manager at Norrsken VC noted last year when discussing that firm’s investment in Submer.

Solutions under the sea

If submerging servers in experimental liquids offers one potential solution to the problem, then sinking them in the ocean is another way companies are trying to cool data centers without expending too much power.

Microsoft has already been operating an undersea data center for the past two years. The company trotted out the tech as part of a push to aid in the search for a COVID-19 vaccine last year.

These pre-packed, shipping container-sized data centers can be spun up on demand and run deep under the ocean’s surface for sustainable, high-efficiency and powerful compute operations, the company said.

The liquid cooling project shares the most similarity with Microsoft’s Project Natick, which is exploring the potential of underwater datacenters that are quick to deploy and can operate for years on the seabed, sealed inside submarine-like tubes, without any onsite maintenance by people.

In those data centers, a nitrogen atmosphere replaces the engineered fluid, and the servers are cooled with fans and a heat exchanger that pumps seawater through a sealed tube.

Startups are also staking claims to cool data centers out on the ocean (the seaweed is always greener in somebody else’s lake).

Nautilus Data Technologies, for instance, has raised over $100 million (according to Crunchbase) to develop data centers dotting the surface of Davy Jones’ Locker. The company is currently developing a data center project co-located with a sustainable energy project in a tributary near Stockton, Calif.

With its two-phase immersion cooling tech, Microsoft is hoping to bring the benefits of ocean cooling onto the shore. “We brought the sea to the servers rather than put the datacenter under the sea,” Microsoft’s Alissa said in a company statement.

Ioannis Manousakis, a principal software engineer with Azure (left), and Husam Alissa, a principal hardware engineer on Microsoft’s team for datacenter advanced development (right), walk past a container at a Microsoft datacenter where computer servers in a two-phase immersion cooling tank are processing workloads. Photo by Gene Twedt for Microsoft.


By Jonathan Shieber

Industry experts bullish on $500M KKR investment in Box, but stock market remains skeptical

When Box announced it was getting a $500 million investment from private equity firm KKR this morning, it was hard not to see it as a positive move for the company. It has been operating under the shadow of Starboard Value, and this influx of cash could give it a way forward independent of the activist investors.

Industry experts we spoke to were all optimistic about the deal, seeing it as a way for the company to regain control while giving it a bushel of cash to make some moves. Early returns from the stock market were not as upbeat, however, with the stock price falling this morning.

Alan Pelz-Sharpe, principal analyst at Deep Analysis, a firm that follows the content management market closely, says that it’s a significant move for Box and opens up a path to expanding through acquisition.

“The KKR move is probably the most important strategic move Box has made since it IPO’d. KKR doesn’t just bring a lot of money to the deal, it gives Box the ability to shake off some naysayers and invest in further acquisitions,” Pelz-Sharpe told me, adding, “Box is no longer a startup, it’s a rapidly maturing company, and organic growth will only take you so far. Inorganic growth is what will take Box to the next level.”

Dion Hinchcliffe, an analyst at Constellation Research, who covers the work from home trend and the digital workplace, sees it similarly, saying the investment allows the company to focus longer term again.

“Box very much needs to expand in new markets beyond its increasingly commoditized core business. The KKR investment will give them the opportunity to realize loftier ambitions long term so they can turn their established market presence into a growth story,” he said.

Pelz-Sharpe says that it also changes the power dynamic after a couple of years of having Starboard pushing the direction of the company.

“In short, as a public company there are investors who want a quick flip and others that want to grow this company substantially before an exit. This move with KKR potentially changes the dynamic at Box and may well put Aaron Levie back in the driver’s seat.”

Josh Stein, a partner at DFJ, an early investor in Box and a longtime board member, says the deal shows that Box is moving in the right direction.

“I think it makes a ton of sense. Management has done a great job growing the business and taking it to profitability. With KKR’s new investment, you have two of the top technology investors in the world putting significant capital into going long on Box,” Stein said.

Perhaps Stein’s optimism is warranted. In its most recent earnings report last month, the company announced revenue of $198.9 million, up 8% year-over-year, with FY2021 revenue closing at $771 million, up 11%. What’s more, the company is cash-flow positive and has predicted an optimistic future outlook.

“As previously announced, Box is committed to achieving a revenue growth rate between 12-16%, with operating margins of between 23-27%, by fiscal 2024,” the company reiterated in a statement this morning.

Investors remain skeptical, however, with the company’s stock price getting hammered this morning. As of publication the share price was down over 9%. At this point, market investors may be waiting for the next earnings report to see if the company is headed in the right direction. For now, the $500 million certainly gives the company options, regardless of what Wall Street thinks in the short term.


By Ron Miller

KKR hands Box a $500M lifeline

Box announced this morning that private equity firm KKR is investing $500 million in the company, a move that could help the struggling cloud content management vendor get out from under pressure from activist investor Starboard Value.

The company plans to use the proceeds in a “Dutch auction”-style sale to buy back shares from certain investors at the price determined by the auction, an activity that should take place after the company announces its next earnings report in May. This would presumably involve buying out Starboard, which took a 7.5% stake in the company in 2019.
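For readers unfamiliar with the mechanism, a modified Dutch auction buyback works roughly like this: holders tender shares at prices within a range, and the company takes the lowest price at which the tendered shares absorb its budget, paying all accepted shares that single price. The tender book below is invented and proration details are omitted:

```python
def clearing_price(offers: list[tuple[float, int]], budget: float) -> float:
    """offers: (price per share, shares tendered at that price).
    Returns the lowest price at which the shares tendered at or below
    it, all paid that single price, would absorb the buyback budget."""
    cumulative_shares = 0
    for price, shares in sorted(offers):
        cumulative_shares += shares
        if cumulative_shares * price >= budget:
            return price
    raise ValueError("budget exceeds the value of all shares offered")

# Invented tender book: 10M shares at $24, 8M at $25, 12M at $26.
offers = [(24.0, 10_000_000), (25.0, 8_000_000), (26.0, 12_000_000)]
print(clearing_price(offers, budget=500_000_000))  # 26.0
```

The appeal for the seller is a single market-set exit price; the appeal for the company is buying back a targeted block, such as Starboard's stake, without negotiating share by share.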

Last month Reuters reported that Starboard could be looking to take over a majority of the board seats when the company board meets in June. That could have set them up to take some action, most likely forcing a sale.

While it’s not clear what will happen now, it seems likely that with this cash, Box will be able to stave off action from Starboard and, with KKR in the picture, take a longer-term view. Box CEO Aaron Levie sees the move as a vote of confidence from KKR in Box’s approach.

“KKR is one of the world’s leading technology investors with a deep understanding of our market and a proven track record of partnering successfully with companies to create value and drive growth. With their support, we will be even better positioned to build on Box’s leadership in cloud content management as we continue to deliver value for our customers around the world,” Levie said in a statement.

Under the terms of the deal, John Park, Head of Americas Technology Private Equity at KKR, will be joining the Box board of directors. The company also announced that independent board member Bethany Mayer will be appointed chairman of the board, effective on May 1st.

Earlier this year, the company bought e-signature startup SignRequest, which could help open up a new set of workflows for the company as it tries to expand its market. With KKR’s backing, it’s not unreasonable to expect that Box, which is cash flow positive, could be taking additional steps to expand the platform in the future.

Box stock was down over 8% premarket, a signal that perhaps Wall Street isn’t thrilled with the announcement, but the cash influx should give Box some breathing room to reset and push forward.


By Ron Miller

Okta expands into privileged access management and identity governance reporting

Okta today announced it was expanding its platform into a couple of new areas. Up to this point, the company has been known for its identity access management product, which gives companies the ability to sign on to multiple cloud products with a single sign-on. Today, the company is moving into two new areas: privileged access and identity governance.

Privileged access gives companies the ability to grant a limited number of people access, on an as-needed basis, to key administrative services inside a company. This could be your database, your servers or any highly sensitive part of your technology stack where you want to tightly control who gets in.

Okta CEO Todd McKinnon says that Okta has always been good at locking down the general user population’s access to cloud services like Salesforce, Office 365 and Gmail. What these cloud services have in common is that you access them via a web interface.

Administrators access these specialty accounts using different protocols. “It’s something like secure shell, or you’re using a terminal on your computer to connect to a server in the cloud, or it’s a database connection where you’re actually logging in with a SQL connection, or you’re connecting to a container, which is the Kubernetes protocol, to actually manage the container,” McKinnon explained.

Privileged access offers a couple of key features including the ability to limit access to a given time window and to record a video of the session so there is an audit trail of exactly what happened while someone was accessing the system. McKinnon says that these features provide additional layers of protection for these sensitive accounts.

He says that it will be fairly trivial to carve out these accounts because Okta already has divided users into groups and can give these special privileges to only those people in the administrative access group. The challenge was figuring out how to get access to these other kinds of protocols.

The governance piece provides a way for security operations teams to run detailed reports and look for issues related to identity. “Governance provides exception reporting so you can give that to your auditors, and more importantly you can give that to your security team to make sure that you figure out what’s going on and why there is this deviation from your stated policy,” he said.

All of this when combined with the $6.5 billion acquisition of Auth0 last month is part of a larger plan by the company to be what McKinnon calls the identity cloud. He sees a market with several strategic clouds and he believes identity is going to be one of them.

“Because identity is so strategic for everything, it’s unlocking your customer, access, it’s unlocking your employee access, it’s keeping everything secure. And so this expansion, whether it’s customer identity with zero trust or whether it’s doing more on the workforce identity with not just access, but privileged access and identity governance. It’s about identity evolving in this primary cloud,” he said.

While both of these new products were announced today at the company’s virtual Oktane customer conference, they won’t be generally available until the first quarter of next year.


By Ron Miller