Workflow automation has been one of the key trends this year so far, and Zoho, a company known for its suite of affordable business tools, has joined the parade with a new low-code workflow product called Qntrl (pronounced "control").
Zoho’s Rodrigo Vaca, who is in charge of Qntrl’s marketing, says that most of the solutions we’ve been seeing are built for larger enterprise customers. Zoho is aiming for the mid-market with a product that requires less technical expertise than traditional business process management tools.
“We enable customers to design their workflows visually without the need for any particular kind of prior knowledge of business process management notation or any kind of that esoteric modeling or discipline,” Vaca told me.
While Vaca says Qntrl could require some technical help to connect a workflow to more complex backend systems like CRM or ERP, it allows a less technical end user to drag and drop the components and then get help to finish the rest.
“We certainly expect that when you need to connect to NetSuite or SAP you’re going to need a developer. If nothing else, the IT guys are going to ask questions, and they will need to provide access,” Vaca said.
He believes this product is putting this kind of tooling in reach of companies that may have been left out of workflow automation for the most part, or which have been using spreadsheets or other tools to create crude workflows. With Qntrl, you drag and drop components, and then select each component and configure what happens before, during and after each step.
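To make the before/during/after configuration concrete, here is a minimal sketch of how a visual workflow tool might model its building blocks internally. The step names and hook structure are hypothetical; Qntrl's actual data model is not public.

```python
# Hypothetical model of a workflow step with configurable
# before/during/after hooks, as a low-code tool might represent it.

class Step:
    def __init__(self, name, before=None, during=None, after=None):
        self.name = name
        # Each hook is an optional callable run at that phase of the step.
        self.before = before
        self.during = during
        self.after = after

    def run(self, record):
        # Execute the configured hooks in order, threading the record through.
        for hook in (self.before, self.during, self.after):
            if hook:
                record = hook(record)
        return record


class Workflow:
    def __init__(self, steps):
        self.steps = steps

    def run(self, record):
        for step in self.steps:
            record = step.run(record)
        return record


# Example: a two-step purchase-approval flow.
flow = Workflow([
    Step("submit", during=lambda r: {**r, "status": "submitted"}),
    Step("approve",
         before=lambda r: {**r, "checked": True},
         during=lambda r: {**r, "status": "approved"}),
])

result = flow.run({"item": "laptop"})
print(result["status"])  # approved
```

The point of the sketch is that the visual editor only has to emit data like this; the "low code" part is that end users arrange steps while the hooks do the work.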
What’s more, Qntrl provides a central place for processing and understanding what’s happening within each workflow at any given time, and who is responsible for completing it.
We’ve seen bigger companies like Microsoft, SAP, ServiceNow and others offering this type of functionality over the last year as low code workflow automation has taken center stage in business.
This need became more pronounced during the pandemic, when so many workers could not be in the office. It made moving work into more automated workflows more imperative, and we have seen companies adding more of this kind of functionality as a result.
Brent Leary, principal analyst at CRM Essentials, says that Zoho is attempting to remove some of the complexity from this kind of tool.
“It handles the security pieces to make sure the right people have access to the data and processes used in the workflows in the background, so regular users can drag and drop to build their flows and processes without having to worry about that stuff,” Leary told me.
Zoho Qntrl is available starting today, priced at $7 per user, per month.
By Ron Miller
You hear so much about data these days that you might forget that a huge amount of the world runs on documents: a veritable menagerie of heterogeneous files and formats holding enormous value yet incompatible with the new era of clean, structured databases. Docugami plans to change that with a system that intuitively understands any set of documents and intelligently indexes their contents — and NASA is already on board.
If Docugami’s product works as planned, anyone will be able to take piles of documents accumulated over the years and near-instantly convert them to the kind of data that’s actually useful to people.
Because it turns out that running just about any business ends up producing a ton of documents. Contracts and briefs in legal work, leases and agreements in real estate, proposals and releases in marketing, medical charts, and the list goes on. Not to mention the various formats: Word docs, PDFs, scans of paper printouts of PDFs exported from Word docs, and so on.
Over the last decade there’s been an effort to corral this problem, but movement has largely been on the organizational side: put all your documents in one place, share and edit them collaboratively. Understanding the document itself has pretty much been left to the people who handle them, and for good reason — understanding documents is hard!
Think of a rental contract. We humans understand when the renter is named as Jill Jackson, that later on, “the renter” also refers to that person. Furthermore, in any of a hundred other contracts, we understand that the renters in those documents are the same type of person or concept in the context of the document, but not the same actual person. These are surprisingly difficult concepts for machine learning and natural language understanding systems to grasp and apply. Yet if they could be mastered, an enormous amount of useful information could be extracted from the millions of documents squirreled away around the world.
What’s up, .docx?
Docugami founder Jean Paoli says they’ve cracked the problem wide open, and while it’s a major claim, he’s one of few people who could credibly make it. Paoli was a major figure at Microsoft for decades, and among other things helped create the XML format — you know all those files that end in x, like .docx and .xlsx? Paoli is at least partly to thank for them.
“Data and documents aren’t the same thing,” he told me. “There’s a thing you understand, called documents, and there’s something that computers understand, called data. Why are they not the same thing? So my first job [at Microsoft] was to create a format that can represent documents as data. I created XML with friends in the industry, and Bill accepted it.” (Yes, that Bill.)
The formats became ubiquitous, yet 20 years later the same problem persists, having grown in scale with the digitization of industry after industry. But for Paoli the solution is the same. At the core of XML was the idea that a document should be structured almost like a webpage: boxes within boxes, each clearly defined by metadata — a hierarchical model more easily understood by computers.
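The "boxes within boxes" idea can be shown in miniature: the same lease fact expressed as XML, where each piece of data sits inside a named, nested element that a machine can address directly. The element names below are invented for the example.

```python
# A tiny XML document illustrating the hierarchical "boxes within boxes"
# model: every value lives inside a named, nested element.
import xml.etree.ElementTree as ET

doc = ET.fromstring("""
<lease>
  <parties>
    <renter>Jill Jackson</renter>
    <landlord>Acme Properties</landlord>
  </parties>
  <terms>
    <rent currency="USD">1500</rent>
  </terms>
</lease>
""")

# Nested structure means unambiguous addressing: this is *the renter*,
# not just a name that happens to appear somewhere in the text.
renter = doc.findtext("parties/renter")
rent = int(doc.findtext("terms/rent"))
print(renter, rent)  # Jill Jackson 1500
```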
“A few years ago I drank the AI kool-aid, got the idea to transform documents into data. I needed an algorithm that navigates the hierarchical model, and they told me that the algorithm you want does not exist,” he explained. “The XML model, where every piece is inside another, and each has a different name to represent the data it contains — that has not been married to the AI model we have today. That’s just a fact. I hoped the AI people would go and jump on it, but it didn’t happen.” (“I was busy doing something else,” he added, to excuse himself.)
The lack of compatibility with this new model of computing shouldn’t come as a surprise — every emerging technology carries with it certain assumptions and limitations, and AI has focused on a few other, equally crucial areas like speech understanding and computer vision. The approach taken there doesn’t match the needs of systematically understanding a document.
“Many people think that documents are like cats. You train the AI to look for their eyes, for their tails… documents are not like cats,” he said.
It sounds obvious, but it’s a real limitation: advanced AI methods like segmentation, scene understanding, multimodal context, and such are all a sort of hyper-advanced cat detection that has moved beyond cats to detect dogs, car types, facial expressions, locations, etc. Documents are too different from one another, or in other ways too similar, for these approaches to do much more than roughly categorize them.
And as for language understanding, it’s good in some ways but not in the ways Paoli needed. “They’re working sort of at the English language level,” he said. “They look at the text but they disconnect it from the document where they found it. I love NLP people, half my team is NLP people — but NLP people don’t think about business processes. You need to mix them with XML people, people who understand computer vision, then you start looking at the document at a different level.”
Docugami in action
Paoli’s goal couldn’t be reached by adapting existing tools (beyond mature primitives like optical character recognition), so he assembled his own private AI lab, where a multi-disciplinary team has been tinkering away for about two years.
“We did core science, self-funded, in stealth mode, and we sent a bunch of patents to the patent office,” he said. “Then we went to see the VCs, and Signalfire basically volunteered to lead the seed round at $10 million.”
Coverage of the round didn’t really get into the actual experience of using Docugami, but Paoli walked me through the platform with some live documents. I wasn’t given access myself and the company wouldn’t provide screenshots or video, saying it is still working on the integrations and UI, so you’ll have to use your imagination… but if you picture pretty much any enterprise SaaS service, you’re 90 percent of the way there.
As the user, you upload any number of documents to Docugami, from a couple dozen to hundreds or thousands. These enter a machine understanding workflow that parses the documents, whether they’re scanned PDFs, Word files, or something else, into an XML-esque hierarchical organization unique to the contents.
“Say you’ve got 500 documents, we try to categorize it in document sets, these 30 look the same, those 20 look the same, those 5 together. We group them with a mix of hints coming from how the document looked, what it’s talking about, what we think people are using it for, etc,” said Paoli. Other services might be able to tell the difference between a lease and an NDA, but documents are too diverse to slot into pre-trained ideas of categories and expect it to work out. Every set of documents is potentially unique, and so Docugami trains itself anew every time, even for a set of one. “Once we group them, we understand the overall structure and hierarchy of that particular set of documents, because that’s how documents become useful: together.”
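As a toy illustration of that grouping step, documents whose vocabularies overlap can be clustered with something as simple as a Jaccard-similarity threshold. Docugami's real pipeline is far more sophisticated (it mixes layout, usage and semantic hints); this sketch shows only the word-overlap idea.

```python
# Greedy grouping of documents by word-set overlap -- a crude stand-in
# for the "these 30 look the same, those 20 look the same" step.

def jaccard(a, b):
    """Similarity of two token collections: |intersection| / |union|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def group_documents(docs, threshold=0.5):
    """Assign each document to the first existing group it resembles."""
    groups = []
    for doc in docs:
        tokens = doc.lower().split()
        for group in groups:
            if jaccard(tokens, group[0]) >= threshold:
                group.append(tokens)
                break
        else:
            groups.append([tokens])
    return groups

docs = [
    "this lease agreement is between landlord and tenant",
    "this lease agreement is between landlord and renter",
    "mutual non disclosure agreement between the parties",
]
groups = group_documents(docs)
print(len(groups))  # 2 -- the two leases cluster together; the NDA stands alone
```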
That doesn’t just mean it picks up on header text and creates an index, or lets you search for words. The data that is in the document, for example who is paying whom, how much and when, and under what conditions, all that becomes structured and editable within the context of similar documents. (It asks for a little input to double check what it has deduced.)
It can be a little hard to picture, but now just imagine that you want to put together a report on your company’s active loans. All you need to do is highlight the information that’s important to you in an example document — literally, you just click “Jane Roe” and “$20,000” and “5 years” anywhere they occur — and then select the other documents you want to pull corresponding information from. A few seconds later you have an ordered spreadsheet with names, amounts, dates, anything you wanted out of that set of documents.
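The "highlight once, extract everywhere" idea boils down to this: once the shape of one document in a set is known, the same pattern pulls the corresponding fields out of its siblings. Docugami learns these patterns from the user's clicks; in the sketch below the pattern is a hand-written regex and the loan documents are invented, purely for illustration.

```python
# Extract the same named fields from every document in a set of
# similarly-shaped loan agreements (all text here is made up).
import re

PATTERN = re.compile(
    r"(?P<name>[A-Z][a-z]+ [A-Z][a-z]+) borrows "
    r"\$(?P<amount>[\d,]+) over (?P<years>\d+) years"
)

def extract(docs):
    rows = []
    for doc in docs:
        m = PATTERN.search(doc)
        if m:
            rows.append((m["name"], m["amount"], m["years"]))
    return rows

loans = [
    "Loan agreement: Jane Roe borrows $20,000 over 5 years at 4% interest.",
    "Loan agreement: John Doe borrows $7,500 over 3 years at 6% interest.",
]
for row in extract(loans):
    print(row)
```

The result is exactly the kind of ordered, spreadsheet-ready rows the product promises; the hard part Docugami claims to solve is inferring the pattern itself across messy, heterogeneous files.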
All this data is meant to be portable too, of course — there are integrations planned with various other common pipes and services in business, allowing for automatic reports, alerts if certain conditions are reached, automated creation of templates and standard documents (no more keeping an old one around with underscores where the principals go).
Remember, this is all half an hour after you uploaded them in the first place, no labeling or pre-processing or cleaning required. And the AI isn’t working from some preconceived notion or format of what a lease document looks like. It’s learned all it needs to know from the actual docs you uploaded — how they’re structured, where things like names and dates figure relative to one another, and so on. And it works across verticals and uses an interface anyone can figure out in a few minutes. Whether you’re in healthcare data entry or construction contract management, the tool should make sense.
The web interface where you ingest and create new documents is one of the main tools, while the other lives inside Word. There Docugami acts as a sort of assistant that’s fully aware of every other document of whatever type you’re in, so you can create new ones, fill in standard information, comply with regulations, and so on.
Okay, so processing legal documents isn’t exactly the most exciting application of machine learning in the world. But I wouldn’t be writing this (at all, let alone at this length) if I didn’t think this was a big deal. This sort of deep understanding of document types can be found here and there among established industries with standard document types (such as police or medical reports), but have fun waiting until someone trains a bespoke model for your kayak rental service. But small businesses have just as much value locked up in documents as large enterprises — and they can’t afford to hire a team of data scientists. And even the big organizations can’t do it all manually.
NASA’s treasure trove
The problem is extremely difficult, yet to humans it seems almost trivial. You or I could glance through 20 similar documents and pull out a list of names and amounts easily, perhaps even in less time than it takes Docugami to crawl them and train itself.
But AI, after all, is meant to imitate and exceed human capacity, and it’s one thing for an account manager to do monthly reports on 20 contracts — quite another to do a daily report on a thousand. Yet Docugami handles both equally easily — which is where it fits in, both for enterprises, where scaling this kind of operation is crucial, and for NASA, which is buried under a backlog of documentation from which it hopes to glean clean data and insights.
If there’s one thing NASA’s got a lot of, it’s documents. Its reasonably well-maintained archives go back to its founding, and many important ones are available by various means — I’ve spent many a pleasant hour perusing its cache of historical documents.
But NASA isn’t looking for new insights into Apollo 11. Through its many past and present programs, solicitations, grant programs, budgets, and of course engineering projects, it generates a huge amount of documents — being, after all, very much a part of the federal bureaucracy. And as with any large organization with its paperwork spread over decades, NASA’s document stash represents untapped potential.
Expert opinions, research precursors, engineering solutions, and a dozen more categories of important information are sitting in files searchable perhaps by basic word matching but otherwise unstructured. Wouldn’t it be nice for someone at JPL to get it in their head to look at the evolution of nozzle design, and within a few minutes have a complete and current list of documents on that topic, organized by type, date, author, and status? What about the patent advisor who needs to provide a NIAC grant recipient information on prior art — shouldn’t they be able to pull up those old patents and applications with more precision than a simple keyword search allows?
The NASA SBIR grant, awarded last summer, isn’t for any specific work, like collecting all the documents of such and such a type from Johnson Space Center or something. It’s an exploratory or investigative agreement, as many of these grants are, and Docugami is working with NASA scientists on the best ways to apply the technology to their archives. (One of the best applications may be to the SBIR and other small business funding programs themselves.)
Another SBIR grant with the NSF differs in that, while at NASA the team is looking into better organizing tons of disparate types of documents with some overlapping information, at NSF they’re aiming to better identify “small data.” “We are looking at the tiny things, the tiny details,” said Paoli. “For instance, if you have a name, is it the lender or the borrower? The doctor or the patient name? When you read a patient record, penicillin is mentioned, is it prescribed or prohibited? If there’s a section called allergies and another called prescriptions, we can make that connection.”
“Maybe it’s because I’m French”
When I pointed out the rather small budgets involved with SBIR grants and how his company couldn’t possibly survive on these, he laughed.
“Oh, we’re not running on grants! This isn’t our business. For me, this is a way to work with scientists, with the best labs in the world,” he said, while noting many more grant projects were in the offing. “Science for me is a fuel. The business model is very simple – a service that you subscribe to, like Docusign or Dropbox.”
The company is only just now beginning its real business operations, having made a few connections with integration partners and testers. But over the next year it will expand its private beta and eventually open it up — though there’s no timeline on that just yet.
“We’re very young. A year ago we were like five, six people, now we went and got this $10M seed round and boom,” said Paoli. But he’s certain that this is a business that will be not just lucrative but will represent an important change in how companies work.
“People love documents. Maybe it’s because I’m French,” he said, “but I think text and books and writing are critical — that’s just how humans work. We really think people can help machines think better, and machines can help people think better.”
By Devin Coldewey
When Microsoft announced it was acquiring Nuance Communications this morning for $19.7 billion, you could be excused for doing a Monday morning double take at the hefty price tag.
That’s surely a lot of money for a company on a $1.4 billion run rate, but Microsoft, which has already partnered with the speech-to-text market leader on several products over the last couple of years, saw a company firmly embedded in healthcare and it decided to go all in.
And $20 billion is certainly all in, even for a company the size of Microsoft. But 2020 forced everyone, from restaurants to retailers to doctors, to change the way they do business, and the pandemic in particular changed the way we interact with our medical providers. We learned very quickly that you don’t have to drive to an office, wait in a waiting room and then an exam room, all to see the doctor for a few minutes.
Instead, we can get on the line, have a quick chat and be on our way. It won’t work for every condition of course — there will always be times the physician needs to see you — but for many meetings such as reviewing test results or for talk therapy, telehealth could suffice.
Microsoft CEO Satya Nadella says that Nuance is at the center of this shift, especially with its use of cloud and artificial intelligence, and that’s why the company was willing to pay the amount it did to get it.
“AI is technology’s most important priority, and healthcare is its most urgent application. Together, with our partner ecosystem, we will put advanced AI solutions into the hands of professionals everywhere to drive better decision-making and create more meaningful connections, as we accelerate growth of Microsoft Cloud in Healthcare and Nuance,” Nadella said in a post announcing the deal.
Microsoft sees this deal doubling what was already a considerable total addressable market to nearly $500 billion. While TAMs always tend to run high, that is still a substantial number.
It also fits with Gartner data, which found that by 2022, 75% of healthcare organizations will have a formal cloud strategy in place. The AI component only adds to that number and Nuance brings 10,000 existing customers to Microsoft including some of the biggest healthcare organizations in the world.
Brent Leary, founder and principal analyst at CRM Essentials, says the deal could provide Microsoft with a ton of health data to help feed the underlying machine learning models and make them more accurate over time.
“There is going to be a ton of health data captured through telemedicine interactions, and this could create a whole new level of health intelligence,” Leary told me.
That of course could drive a lot of privacy concerns where health data is involved, and it will be up to Microsoft, which just experienced a major breach on its Exchange email server products last month, to assure the public that their sensitive health data is being protected.
Leary says that ensuring data privacy is going to be absolutely key to the success of the deal. “The potential this move has is pretty powerful, but it will only be realized if the data and insights that could come from it are protected and secure — not only protected from hackers but also from unethical use. Either could derail what could be a game changing move,” he said.
Microsoft also seemed to recognize that when it wrote, “Nuance and Microsoft will deepen their existing commitments to the extended partner ecosystem, as well as the highest standards of data privacy, security and compliance.”
We are clearly on the edge of a sea change when it comes to how we interact with our medical providers in the future. COVID pushed medicine deeper into the digital realm in 2020 out of simple necessity. It wasn’t safe to go into the office unless absolutely necessary.
The Nuance acquisition, which is expected to close some time later this year, could help Microsoft shift deeper into the market. It could even bring Teams into it as a meeting tool, but it’s all going to depend on the trust level people have with this approach, and it will be up to the company to make sure that both healthcare providers and the people they serve have that.
By Ron Miller
In a post announcing the deal, the company said this was about increasing its presence in the healthcare vertical, a place where Nuance has done well in recent years. In fact, the company announced the Microsoft Cloud for Healthcare last year, and this deal is about accelerating its presence there. Nuance’s products in this area include Dragon Ambient eXperience, Dragon Medical One and PowerScribe One for radiology reporting.
“Today’s acquisition announcement represents the latest step in Microsoft’s industry-specific cloud strategy,” the company wrote. The acquisition also builds on several integrations and partnerships the two companies have made in the last couple of years.
The company boasts 10,000 healthcare customers, according to information on the website. Those include AthenaHealth, Johns Hopkins, Mass General Brigham and Cleveland Clinic to name but a few, and it was that customer base that attracted Microsoft to pay the price it did to bring Nuance into the fold.
Nuance CEO Mark Benjamin will remain with the company and report to Scott Guthrie, Microsoft’s EVP in charge of the cloud and AI group.
Nuance has a complex history. It went public in 2000 and began buying speech recognition products, including Dragon Dictate from Lernout & Hauspie, in 2001. It merged with a company called ScanSoft in 2005. That company began life in 1992 as Visioneer, a scanning company.
Today, the company has a number of products, including Dragon Dictate, a consumer and business speech-to-text product that dates back to the early 1990s. It’s also involved in speech recognition, chatbots and natural language processing, particularly in healthcare and other verticals.
The company has 6,000 employees spread across 27 countries. In its most recent earnings report from November 2020, which was for Q4 2020, the company reported $352.9 million in revenue, compared to $387.6 million in the same period a year prior. That’s not the direction a company wants to go in, but it is still a run rate of over $1.4 billion.
At the time of that earnings call, the company also announced it was selling its medical transcription and electronic health record (EHR) Go-Live services to Assured Healthcare Partners and Aeries Technology Group. Company CEO Benjamin said this was about helping the company concentrate on its core speech services.
“With this sale, we will reach an important milestone in our journey towards a more focused strategy of advancing our Conversational AI, natural language understanding and ambient clinical intelligence solutions,” Benjamin said in a statement at the time.
It’s worth noting that Microsoft already has a number of speech recognition and chatbot products of its own, including desktop speech-to-text services in Windows and on Azure, but it took the chance to buy a market leader and go deeper into the healthcare vertical.
The transaction has already been approved by both company boards and Microsoft reports it expects the deal to close by the end of this year, subject to standard regulatory oversight and approval by Nuance shareholders.
This would mark the second largest purchase by Microsoft ever, only surpassed by the $26.2 billion the company paid for LinkedIn in 2016.
By Ron Miller
Box gets some financial ammunition against an activist investor, Samsung launches the Galaxy SmartTag+ and we look at the history of CryptoPunks. This is your Daily Crunch for April 8, 2021.
The big story: KKR invests $500M into Box
Private equity firm KKR is making an investment into Box that should help the cloud content management company buy back shares from activist investor Starboard Value, which might otherwise have claimed a majority of board seats and forced a sale.
After the investment, Aaron Levie will remain with Box as its CEO, but independent board member Bethany Mayer will become the chair, while KKR’s John Park is joining the board as well.
“The KKR move is probably the most important strategic move Box has made since it IPO’d,” said Alan Pelz-Sharpe of Deep Analysis. “KKR doesn’t just bring a lot of money to the deal, it gives Box the ability to shake off some naysayers and invest in further acquisitions.”
The tech giants
Samsung’s AirTags rival, the Galaxy SmartTag+, arrives to help you find lost items via AR — This is a version of Samsung’s lost-item finder that supports Bluetooth Low Energy and ultra-wideband technology.
Spotify stays quiet about launch of its voice command ‘Hey Spotify’ on mobile — Access to the “Hey Spotify” voice feature is rolling out more broadly, but Spotify isn’t saying anything officially.
Verizon and Honda want to use 5G and edge computing to make driving safer — The two companies are piloting different safety scenarios at the University of Michigan’s Mcity, a test bed for connected and autonomous vehicles.
Startups, funding and venture capital
Norway’s Kolonial rebrands as Oda, bags $265M on a $900M valuation to grow its online grocery delivery business in Europe — Oda’s aim is to provide “a weekly shop” for prices that compete against those of traditional supermarkets.
Tines raises $26M Series B for its no-code security automation platform — Tines co-founders Eoin Hinchy and Thomas Kinsella were both in senior security roles at DocuSign before they left to start their own company in 2018.
Yext co-founder unveils Dynascore, which dynamically synchronizes music and video — This is the first product from Howard Lerman’s new startup Wonder Inventions.
Advice and analysis from Extra Crunch
Four strategies for getting attention from investors — MaC Venture Capital founder Marlon Nichols joined us at TechCrunch Early Stage to discuss his strategies for early-stage investing, and how those lessons can translate into a successful launch for budding entrepreneurs.
How to get into a startup accelerator — Neal Sáles-Griffin, managing director of Techstars Chicago, explains when and how to apply to a startup accelerator.
Understanding how fundraising terms can affect early-stage startups — Fenwick & West partner Dawn Belt breaks down some of the terms that trip up first-time entrepreneurs.
(Extra Crunch is our membership program, which helps founders and startup teams get ahead. You can sign up here.)
The Cult of CryptoPunks — Ethereum’s “oldest NFT project” may not actually be the first, but it’s the wildest.
Biden proposes gun control reforms to go after ‘ghost guns’ and close loopholes — President Joe Biden has announced a new set of initiatives by which he hopes to curb the gun violence he described as “an epidemic” and “an international embarrassment.”
Apply to Startup Battlefield at TechCrunch Disrupt 2021 — All you need is a killer pitch, an MVP, nerves of steel and the drive and determination to take on all comers to claim the coveted Disrupt Cup.
The Daily Crunch is TechCrunch’s roundup of our biggest and most important stories. If you’d like to get this delivered to your inbox every day at around 3pm Pacific, you can subscribe here.
By Anthony Ha
LiquidStack does it. So does Submer. They’re both dropping servers carrying sensitive data into goop in an effort to save the planet. Now they’re joined by one of the biggest tech companies in the world in their efforts to improve the energy efficiency of data centers, because Microsoft is getting into the liquid-immersion cooling market.
Microsoft is using a liquid it developed in-house that’s engineered to boil at 122 degrees Fahrenheit (lower than the boiling point of water) to act as a heat sink, reducing the temperature inside the servers so they can operate at full power without any risks from overheating.
The vapor from the boiling fluid is converted back into a liquid through contact with a cooled condenser in the lid of the tank that stores the servers.
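For readers who think in Celsius, the cited boiling point converts straightforwardly:

```python
# Convert the fluid's cited boiling point from Fahrenheit to Celsius.
def f_to_c(f):
    return (f - 32) * 5 / 9

print(f_to_c(122))  # 50.0 -- well below water's 100 degree C boiling point
```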
“We are the first cloud provider that is running two-phase immersion cooling in a production environment,” said Husam Alissa, a principal hardware engineer on Microsoft’s team for datacenter advanced development in Redmond, Washington, in a statement on the company’s internal blog.
While that claim may be true, liquid cooling is a well-known approach to dealing with moving heat around to keep systems working. Cars use liquid cooling to keep their motors humming as they head out on the highway.
As technology companies confront the physical limits of Moore’s Law, the demand for faster, higher-performance processors means designing new architectures that can handle more power, the company wrote in a blog post. Power flowing through central processing units has increased from 150 watts to more than 300 watts per chip, and the GPUs responsible for much of bitcoin mining, artificial intelligence applications and high-end graphics each consume more than 700 watts per chip.
It’s worth noting that Microsoft isn’t the first tech company to apply liquid cooling to data centers and the distinction that the company uses of being the first “cloud provider” is doing a lot of work. That’s because bitcoin mining operations have been using the tech for years. Indeed, LiquidStack was spun out from a bitcoin miner to commercialize its liquid immersion cooling tech and bring it to the masses.
“Air cooling is not enough”
More power flowing through the processors means hotter chips, which means the need for better cooling or the chips will malfunction.
“Air cooling is not enough,” said Christian Belady, vice president of Microsoft’s datacenter advanced development group in Redmond, in an interview for the company’s internal blog. “That’s what’s driving us to immersion cooling, where we can directly boil off the surfaces of the chip.”
For Belady, the use of liquid cooling technology brings the density and compression of Moore’s Law up to the datacenter level.
The results, from an energy consumption perspective, are impressive. Microsoft investigated liquid immersion as a cooling solution for high-performance computing applications such as AI, and found that two-phase immersion cooling reduced power consumption for a server by anywhere from 5 percent to 15 percent (every little bit helps).
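For a sense of scale, here is back-of-the-envelope arithmetic on what a 5 to 15 percent saving means in energy terms. The fleet size and per-server draw below are invented round numbers, not Microsoft figures; only the percentage range comes from the company.

```python
# Rough annual energy savings for a hypothetical server fleet.
SERVERS = 10_000          # hypothetical fleet size
WATTS_PER_SERVER = 400    # hypothetical average draw, CPU + overhead
HOURS_PER_YEAR = 24 * 365

baseline_kwh = SERVERS * WATTS_PER_SERVER * HOURS_PER_YEAR / 1000

for saving in (0.05, 0.15):
    saved_kwh = baseline_kwh * saving
    print(f"{saving:.0%} saving: {saved_kwh:,.0f} kWh/year")
```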
Meanwhile, companies like Submer claim they reduce energy consumption by 50%, water use by 99%, and take up 85% less space.
For cloud computing companies, the ability to keep these servers up and running even during spikes in demand, when they’d consume even more power, adds flexibility and ensures uptime even when servers are overtaxed, according to Microsoft.
“[We] know that with Teams when you get to 1 o’clock or 2 o’clock, there is a huge spike because people are joining meetings at the same time,” Marcus Fontoura, a vice president on Microsoft’s Azure team, said on the company’s internal blog. “Immersion cooling gives us more flexibility to deal with these burst-y workloads.”
At this point, data centers are a critical component of the internet infrastructure that much of the world relies on for… well… pretty much every tech-enabled service. That reliance however has come at a significant environmental cost.
“Data centers power human advancement. Their role as a core infrastructure has become more apparent than ever and emerging technologies such as AI and IoT will continue to drive computing needs. However, the environmental footprint of the industry is growing at an alarming rate,” Alexander Danielsson, an investment manager at Norrsken VC noted last year when discussing that firm’s investment in Submer.
Solutions under the sea
If submerging servers in engineered fluids offers one potential solution to the problem, then sinking them in the ocean is another way companies are trying to cool data centers without expending too much power.
Microsoft has already been operating an undersea data center for the past two years. The company trotted out the tech last year as part of a push to aid in the search for a COVID-19 vaccine.
These pre-packed, shipping container-sized data centers can be spun up on demand and run deep under the ocean’s surface for sustainable, high-efficiency and powerful compute operations, the company said.
The liquid cooling project shares the most similarity with Microsoft's Project Natick, which is exploring the potential of underwater datacenters that are quick to deploy and can operate for years on the seabed, sealed inside submarine-like tubes, without any onsite maintenance by people.
In those data centers, nitrogen gas replaces an engineered fluid, and the servers are cooled with fans and a heat exchanger that pumps seawater through a sealed tube.
Startups are also staking claims to cool data centers out on the ocean (the seaweed is always greener in somebody else’s lake).
Nautilus Data Technologies, for instance, has raised over $100 million (according to Crunchbase) to develop data centers dotting the surface of Davy Jones' locker. The company is currently developing a data center project co-located with a sustainable energy project in a tributary near Stockton, Calif.
With the two-phase immersion cooling tech, Microsoft is hoping to bring the benefits of ocean cooling onto the shore. "We brought the sea to the servers rather than put the datacenter under the sea," Microsoft's Alissa said in a company statement.
By Jonathan Shieber
When Box announced it was getting a $500 million investment from private equity firm KKR this morning, it was hard not to see it as a positive move for the company. It has been operating under the shadow of Starboard Value, and this influx of cash could give it a way forward independent of the activist investors.
Industry experts we spoke to were all optimistic about the deal, seeing it as a way for the company to regain control while giving it a bushel of cash to make some moves. However, early returns from the stock market were not as upbeat: the stock price was plunging this morning.
Alan Pelz-Sharpe, principal analyst at Deep Analysis, a firm that follows the content management market closely, says that it’s a significant move for Box and opens up a path to expanding through acquisition.
"The KKR move is probably the most important strategic move Box has made since it IPO'd. KKR doesn't just bring a lot of money to the deal, it gives Box the ability to shake off some naysayers and invest in further acquisitions," Pelz-Sharpe told me, adding, "Box is no longer a startup; it's a rapidly maturing company, and organic growth will only take you so far. Inorganic growth is what will take Box to the next level."
Dion Hinchcliffe, an analyst at Constellation Research, who covers the work from home trend and the digital workplace, sees it similarly, saying the investment allows the company to focus longer term again.
“Box very much needs to expand in new markets beyond its increasingly commoditized core business. The KKR investment will give them the opportunity to realize loftier ambitions long term so they can turn their established market presence into a growth story,” he said.
Pelz-Sharpe says that it also changes the power dynamic after a couple of years of having Starboard pushing the direction of the company.
“In short, as a public company there are investors who want a quick flip and others that want to grow this company substantially before an exit. This move with KKR potentially changes the dynamic at Box and may well put Aaron Levie back in the driver’s seat.”
Josh Stein, a partner at DFJ, an early investor in Box and a longtime board member, says the deal shows that Box is moving in the right direction.
“I think it makes a ton of sense. Management has done a great job growing the business and taking it to profitability. With KKR’s new investment, you have two of the top technology investors in the world putting significant capital into going long on Box,” Stein said.
Perhaps Stein's optimism is warranted. In its most recent earnings report last month, the company announced revenue of $198.9 million, up 8% year over year, with FY2021 revenue closing at $771 million, up 11%. What's more, the company is cash-flow positive and has issued an optimistic outlook.
“As previously announced, Box is committed to achieving a revenue growth rate between 12-16%, with operating margins of between 23-27%, by fiscal 2024,” the company reiterated in a statement this morning.
Investors remain skeptical, however, with the company's stock price getting hammered this morning. As of publication, the share price was down over 9%. At this point, market investors may be waiting for the next earnings report to see if the company is headed in the right direction. For now, the $500 million certainly gives the company options, regardless of what Wall Street thinks in the short term.
By Ron Miller
At first glance, Quiq and Snaps might sound like similar startups — they both help businesses talk to their customers via text messaging and other messaging apps. But Snaps CEO Christian Brucculeri said “there’s almost no overlap in what we do” and that the companies are “almost complete complements.”
That’s why Quiq (based in Bozeman, Montana) is acquiring Snaps (based in New York). The entire Snaps team is joining Quiq, with Brucculeri becoming senior vice president of sales and customer success for the combined organization.
Quiq CEO Mike Myer echoed Brucculeri's point, comparing the situation to dumping two pieces of a jigsaw puzzle on the floor and discovering "the two pieces fit perfectly." More specifically, he told me that Quiq has generally focused on customer service messaging, with a "do it yourself, toolset approach." After all, the company was founded by two technical co-founders, and Myer joked, "We can't understand why [a customer] can't just call an API."
Snaps, meanwhile, has focused more on marketing conversations, and on a managed service approach where it handles all of the technical work for its customers. In addition, Myer said that while Quiq has “really focused on platform aspect from beginning” — building integrations with more than a dozen messaging channels including Apple Business Chat, Google’s Business Messages, Instagram, Facebook Messenger and WhatsApp — it doesn’t have “a deep natural language or conversational AI capability” the way Snaps does.
Myer added that demand for Quiq’s offering has been growing dramatically, with revenue up 300% year-over-year in the last six months of 2020. At the same time, he suggested that the divisions between marketing and customer service are beginning to dissolve, with service teams increasingly given sales goals, and “at younger, more commerce-focused organizations, they don’t have this differentiation between marketing and customer service” at all.
Apparently the two companies were already working together to create a combined offering for direct messaging on Instagram, which prompted broader discussions about how to bring the two products together. Moving forward, they will offer a combined platform for a variety of customers under the Quiq brand. (Quiq's customers include Overstock.com, West Elm, Men's Wearhouse and Brinks Home Security, while Snaps' customers include Lane Bryant, Live Nation, General Assembly, Clairol and Nioxin.) Brucculeri said this will give businesses one product to manage their conversations across "the full customer journey."
“The key term you’re hearing is conversation,” Myer added. “It’s not about a ticket or a case or a question […] it’s an ongoing conversation.”
Snaps had raised $11.3 million in total funding from investors including Signal Peak Ventures. The financial terms of the acquisition were not disclosed.
By Anthony Ha
Box announced this morning that private equity firm KKR is investing $500 million in the company, a move that could help the struggling cloud content management vendor get out from under pressure from activist investor Starboard Value.
The company plans to use the proceeds in what's called a "Dutch auction"-style sale to buy back shares from certain investors at the price determined by the auction, an activity that should take place after the company announces its next earnings report in May. This would presumably involve buying out Starboard, which took a 7.5% stake in the company in 2019.
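The mechanics of a Dutch auction tender offer can be sketched simply: shareholders tender shares at prices of their choosing, and the buyer pays the lowest single per-share price at which the tendered shares cover its target spend. The sketch below is a simplified illustration (it ignores proration and price-range limits), and the bids are hypothetical; only the $500 million figure comes from the article.

```python
def clearing_price(tenders, target_dollars):
    """tenders: list of (price, shares) offered by shareholders.

    Returns the lowest price at which accumulated tenders, all paid
    that single price, meet the target spend; None if supply is short.
    """
    accepted_shares = 0
    for price, shares in sorted(tenders):  # cheapest tenders first
        accepted_shares += shares
        if accepted_shares * price >= target_dollars:
            return price  # every accepted tender is paid this price
    return None  # not enough shares tendered

# Hypothetical tenders against a $500M buyback target:
tenders = [(22.0, 10_000_000), (23.0, 8_000_000), (24.0, 15_000_000)]
print(clearing_price(tenders, 500_000_000))  # → 24.0
```

In practice the company also sets a price range up front and prorates if more shares are tendered at the clearing price than it needs, but the core single-clearing-price idea is as above.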
Last month Reuters reported that Starboard could be looking to take over a majority of the board seats when the company board meets in June. That could have set them up to take some action, most likely forcing a sale.
While it's not clear what will happen now, it seems likely that, with this cash, Box will be able to stave off action from Starboard and, with KKR in the picture, take a longer-term view. Box CEO Aaron Levie sees the move as a vote of confidence from KKR in Box's approach.
“KKR is one of the world’s leading technology investors with a deep understanding of our market and a proven track record of partnering successfully with companies to create value and drive growth. With their support, we will be even better positioned to build on Box’s leadership in cloud content management as we continue to deliver value for our customers around the world,” Levie said in a statement.
Under the terms of the deal, John Park, Head of Americas Technology Private Equity at KKR, will be joining the Box board of directors. The company also announced that independent board member Bethany Mayer will be appointed chairman of the board, effective on May 1st.
Earlier this year, the company bought e-signature startup SignRequest, which could help open up a new set of workflows for the company as it tries to expand its market. With KKR’s backing, it’s not unreasonable to expect that Box, which is cash flow positive, could be taking additional steps to expand the platform in the future.
Box stock was down over 8% premarket, a signal that perhaps Wall Street isn’t thrilled with the announcement, but the cash influx should give Box some breathing room to reset and push forward.
By Ron Miller
Okta today announced it was expanding its platform into a couple of new areas. Up to this point, the company has been known for its identity and access management product, giving companies the ability to sign on to multiple cloud products with a single sign-on. Today, the company is moving into two new areas: privileged access and identity governance.
Privileged access gives companies the ability to provide access on an as-needed basis to a limited number of people to key administrative services inside a company. This could be your database or your servers or any part of your technology stack that is highly sensitive and where you want to tightly control who can access these systems.
Okta CEO Todd McKinnon says Okta has always been good at locking down general user access to cloud services like Salesforce, Office 365 and Gmail. What these cloud services have in common is that you access them via a web interface.
Administrators access these specialty accounts using different protocols. "It's something like secure shell, or you're using a terminal on your computer to connect to a server in the cloud, or it's a database connection where you're actually logging in with a SQL connection, or you're connecting to a container which is the Kubernetes protocol to actually manage the container," McKinnon explained.
Privileged access offers a couple of key features including the ability to limit access to a given time window and to record a video of the session so there is an audit trail of exactly what happened while someone was accessing the system. McKinnon says that these features provide additional layers of protection for these sensitive accounts.
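The time-window idea is the core of just-in-time privileged access: a grant is only valid inside an explicit start/end window, so standing admin credentials never exist. A minimal generic sketch of that concept follows; this is illustrative only, and the class and names are hypothetical, not Okta's actual API.

```python
from datetime import datetime, timedelta, timezone

class PrivilegedGrant:
    """A time-boxed grant of admin access to one resource."""

    def __init__(self, user, resource, duration: timedelta):
        self.user = user
        self.resource = resource
        self.start = datetime.now(timezone.utc)
        self.end = self.start + duration  # access expires automatically

    def is_active(self, now=None):
        """True only while the current time falls inside the window."""
        now = now or datetime.now(timezone.utc)
        return self.start <= now <= self.end

# Grant one hour of access to a hypothetical production database:
grant = PrivilegedGrant("alice", "prod-db", timedelta(hours=1))
print(grant.is_active())  # → True (inside the one-hour window)
```

A real system would layer session recording and an audit log on top of the window check, as the article describes, but expiry-by-default is what distinguishes this from a permanent admin account.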
He says that it will be fairly trivial to carve out these accounts because Okta already has divided users into groups and can give these special privileges to only those people in the administrative access group. The challenge was figuring out how to get access to these other kinds of protocols.
The governance piece provides a way for security operations teams to run detailed reports and look for issues related to identity. “Governance provides exception reporting so you can give that to your auditors, and more importantly you can give that to your security team to make sure that you figure out what’s going on and why there is this deviation from your stated policy,” he said.
All of this, combined with the $6.5 billion acquisition of Auth0 last month, is part of a larger plan by the company to become what McKinnon calls the identity cloud. He sees a market with several strategic clouds, and he believes identity is going to be one of them.
"Because identity is so strategic for everything, it's unlocking your customer access, it's unlocking your employee access, it's keeping everything secure. And so this expansion, whether it's customer identity with zero trust or whether it's doing more on the workforce identity with not just access, but privileged access and identity governance. It's about identity evolving in this primary cloud," he said.
While both of these new products were announced today at the company’s virtual Oktane customer conference, they won’t be generally available until the first quarter of next year.
By Ron Miller