Lawyers hate timekeeping. Ping raises $13M to fix it with AI

Counting billable time in six-minute increments is the most annoying part of being a lawyer. It’s a distracting waste. It leads law firms to conservatively under-bill. And it leaves lawyers stuck manually filling out timesheets after a long day, when they want to go home to their families.

Life is already short, as Ping CEO and co-founder Ryan Alshak knows too well. The former lawyer spent years caring for his mother as she battled a brain tumor before her passing. “One minute laughing with her was worth a million doing anything else,” he tells me. “I became obsessed with the idea that we spend too much of our lives on things we have no need to do — especially at work.”

That’s motivated him as he’s built his startup Ping, which uses artificial intelligence to automatically track lawyers’ work and fill out timesheets for them. There’s a massive opportunity to eliminate a core cause of burnout, lift law firm revenue by around 10% and give firms fresh insights into labor allocation.

Ping co-founder and CEO Ryan Alshak. Image Credit: Margot Duane

That’s why today Ping is announcing a $13.2 million Series A led by Upfront Ventures, along with BoxGroup, First Round, Initialized, and Ulu Ventures. Adding to Ping’s quiet $3.7 million seed led by First Round last year, the startup will spend the cash to scale up enterprise distribution and become the new timekeeping standard.

“I was a corporate litigator at Manatt Phelps down in LA, and I joke that I was voted the world’s worst timekeeper,” Alshak tells me. “I could either get better at doing something I dreaded or I could try and build technology that did it for me.”

The promise of eliminating the hassle could make any lawyer who hears about Ping an advocate for the firm buying the startup’s software, like how Dropbox grew as workers demanded easier file sharing. “I’ve experienced first-hand the grind of filling out timesheets,” writes Initialized partner and former attorney Alda Leu Dennis. “Ping takes away the drudgery of manual timekeeping and gives lawyers back all those precious hours.”

Traditionally, lawyers have to keep track of their time by themselves down to the tenth of an hour — reviewing documents for the Johnson case, preparing a motion to dismiss for the Lee case, a client phone call for the Sriram case. There are timesheets built into legal software suites like MyCase, legal billing software like TimeSolv, and one-off tools like Time Miner and iTimeKeep. They typically offer timers that lawyers can manually start and stop on different devices, with some providing tracking of scheduled appointments, call and text logging, and integration with billing systems.

Ping goes a big step further. It uses AI and machine learning to figure out whether an activity is billable, for which client, a description of the activity, and its codification beyond just how long it lasted. Instead of merely filling in the minutes, it completes all the logs automatically with entries like “Writing up a deposition – Jenkins Case – 18 minutes”. Then it presents the timesheet to the user for review before they send it to billing.
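To make the mechanics concrete, here is a minimal Python sketch of what turning a captured activity into a reviewable time entry could look like. Ping has not published its pipeline, so every name and rule below is hypothetical; a real system would use trained models rather than keyword matching.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical sketch only: Ping has not published its pipeline. A real
# system would classify activities with trained models, not keyword rules.

@dataclass
class ActivityEvent:
    title: str          # window or document title captured by the tracker
    start: datetime
    end: datetime

KNOWN_MATTERS = ["Jenkins", "Johnson", "Lee", "Sriram"]  # invented examples

def to_time_entry(event: ActivityEvent) -> str:
    """Turn a raw activity into a reviewable, client-coded entry."""
    matter = next((m for m in KNOWN_MATTERS if m.lower() in event.title.lower()),
                  "Unassigned")
    minutes = int((event.end - event.start).total_seconds() // 60)
    tag = "billable" if matter != "Unassigned" else "non-billable"
    return f"{event.title} – {matter} Case – {minutes} minutes ({tag})"

print(to_time_entry(ActivityEvent(
    title="Writing up a deposition – Jenkins",
    start=datetime(2019, 11, 4, 14, 0),
    end=datetime(2019, 11, 4, 14, 18),
)))
# Writing up a deposition – Jenkins – Jenkins Case – 18 minutes (billable)
```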

The big challenge now for Alshak and the team he’s assembled is to grow up. They need to go from cat-in-sunglasses-logo Ping to mature-wordmark Ping. “We have to graduate from being a startup to being an enterprise software company,” the CEO tells me. That means learning to sell to C-suites and IT teams, rather than just building a solid product. In the relationship-driven world of law, that’s a very different skill set. Ping will have to convince clients it’s worth switching to not just for the time savings and revenue boost, but for deep data on how they could run a more efficient firm.

Along the way, Ping has to avoid any embarrassing data breaches or concerns about how its scanning technology could violate attorney-client privilege. If it can win this lucrative first business in legal, it could barge into the consulting and accounting verticals next to grow truly huge.

With eager customers, a massive market, a weak status quo, and a driven founder, Ping just needs to avoid getting in over its head with all its new cash. Spent well, the startup could leap ahead of the less tech-savvy competition.

Alshak seems determined to get it right. “We have an opportunity to build a company that gives people back their most valuable resource — time — to spend more time with their loved ones because they spent less time working,” he tells me. “My mom will live forever because she taught me the value of time. I am deeply motivated to build something that lasts . . . and do so in her name.”


By Josh Constine

How Microsoft is trying to become more innovative

Microsoft Research is a globally distributed playground for people interested in solving fundamental science problems.

These projects often focus on machine learning and artificial intelligence, and since Microsoft is on a mission to infuse all of its products with more AI smarts, it’s no surprise that it’s also seeking ways to integrate Microsoft Research’s innovations into the rest of the company.

Across the board, the company is trying to find ways to become more innovative, especially around its work in AI, and it’s putting processes in place to do so. Microsoft is unusually open about this process, too, and actually made it somewhat of a focus this week at Ignite, a yearly conference that typically focuses more on technical IT management topics.

At Ignite, Microsoft will for the first time present these projects externally at a dedicated keynote. That feels similar to what Google used to do with its ATAP group at its I/O events and is obviously meant to showcase the cutting-edge innovation that happens inside of Microsoft (outside of making Excel smarter).

To manage its AI innovation efforts, Microsoft created the Microsoft AI group led by VP Mitra Azizirad, who’s tasked with establishing thought leadership in this space internally and externally, and helping the company itself innovate faster (Microsoft’s AI for Good projects also fall under this group’s purview). I sat down with Azizirad to get a better idea of what her team is doing and how she approaches getting companies to innovate around AI and bring research projects out of the lab.

“We began to put together a narrative for the company of what it really means to be in an AI-driven world and what we look at from a differentiated perspective,” Azizirad said. “What we’ve done in this area is something that has resonated and landed well. And now we’re including AI, but we’re expanding beyond it to other paradigm shifts like human-machine interaction, future of computing and digital responsibility, as more than just a set of principles and practices but an area of innovation in and of itself.”

Currently, Microsoft is doing a very good job at talking and thinking about horizon one opportunities, as well as horizon three projects that are still years out, she said. “Horizon two, we need to get better at, and that’s what we’re doing.”

It’s worth stressing that Microsoft AI, which launched about two years ago, marks the first time there’s a business, marketing and product management team associated with Microsoft Research, so the team does get a lot of insights into upcoming technologies. Just in the last couple of years, Microsoft has published more than 6,000 research papers on AI, some of which clearly have a future in the company’s products.


By Frederic Lardinois

Coveo raises $227M at $1B+ valuation for AI-based enterprise search and personalization

Search and personalization services continue to be a major area of investment among enterprises, both to make their products and services more discoverable (and used) by customers and to help their own workers get their jobs done, with the market estimated to be worth some $100 billion annually. Today, one of the big startups building services in this area raised a large round of growth funding to continue tapping that opportunity. Coveo, a Canadian company that builds AI-powered search and personalization services — used by its enterprise customers by way of a cloud-based, software-as-a-service model — has closed a $227 million round, which CEO Louis Tetu tells me values the company at “well above” $1 billion, “Canadian or US dollars.”

The round is being led by Omers Capital Private Growth Equity Group, the investing arm of the Canadian pensions giant that makes large, later-stage bets (the company has been stepping up the pace of investments lately), with participation also from Evergreen Coast Capital, FSTQ, and IQ Ventures. Evergreen led the company’s last round of $100 million in April 2018, and in total the company has now raised just over $402 million with this round.

The $1 billion+ valuation appears to be a huge leap in the context of Coveo’s funding history: in that last round, it had a post-money valuation of about $370 million, according to PitchBook data.

Part of the reason for that is because of Coveo’s business trajectory, and part is due to the heat of the overall market.

Coveo’s round is coming about two weeks after another company that builds enterprise search solutions, Algolia, raised $110 million. The two aim at slightly different ends of the market, Tetu tells me, not directly competing in terms of target customers, and even services. “Algolia is in a different zip code,” he said. Good thing, too, if that’s the case: Salesforce — which is one of Coveo’s biggest partners and customers — was also a strategic investor in the Algolia round. Even if these two do not compete, there are plenty of others vying for the same end of the enterprise search and personalization continuum — they include Google, Microsoft, Elastic, IBM, Lucidworks, and many more. That, again, underscores the size of the market opportunity.

In terms of Coveo’s own business, the company works with some 500 customers today and says SaaS subscription revenues grew more than 55 percent year-over-year this year. Five hundred may sound like a small number, but it covers a lot of very large enterprises spanning web-facing businesses, commerce-based organizations, service-facing companies, and enterprise solutions.

In addition to Salesforce, it includes Visa, Tableau (also Salesforce now!), Honeywell, a Fortune 50 healthcare company (whose name is not getting disclosed), and what Tetu described to me as an Amazon competitor that does $21 billion in sales annually but doesn’t want to be named.

Coveo’s basic selling point is that the better discoverability and personalization it provides helps its customers avoid call-center interactions (reducing operating expenditures), improve sales (boosting conversions and reducing cart abandonment), and simply work faster.

“We believe that Coveo is the market leader in leveraging data and AI to personalize at scale,” said Mark Shulgan, Managing Director and Head of Growth Equity at Omers, in a statement. “Coveo fits our investment thesis precisely: an A-plus leadership team with deep expertise in enterprise SaaS, a Fortune 1000 customer base who deeply love the product, and a track record of high growth in a market worth over $100 billion. This makes Coveo a highly-coveted asset. We are glad to be partnering to scale this business.”

Alongside business development on its own steam, the company is going to be using this funding for acquisitions. Tetu notes that Coveo still has a lot of money in the bank from previous rounds.

“We are a real company with real positive economics,” he said. “This round is mostly to have dry powder to invest in a way that is commensurate in the AI space, and within commerce in particular.” To get the ball rolling on that, this past July, Coveo acquired Tooso, a specialist in AI-based digital commerce technology.


By Ingrid Lunden

Microsoft’s Azure Synapse Analytics bridges the gap between data lakes and warehouses

At its annual Ignite conference in Orlando, Fla., Microsoft today announced a major new Azure service for enterprises: Azure Synapse Analytics, which Microsoft describes as “the next evolution of Azure SQL Data Warehouse.” Like SQL Data Warehouse, it aims to bridge the gap between data warehouses and data lakes, which are often completely separate. Synapse also taps into a wide variety of other Microsoft services, including Power BI and Azure Machine Learning, as well as a partner ecosystem that includes Databricks, Informatica, Accenture, Talend, Attunity, Pragmatic Works and Adatis. It’s also integrated with Apache Spark.

The idea here is that Synapse allows anybody working with data in those disparate places to manage and analyze it from within a single service. It can be used to analyze relational and unstructured data, using standard SQL.
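As a rough illustration of what “standard SQL over disparate data” means in practice, here is a hedged Python sketch that queries a Synapse SQL endpoint over ODBC, joining a warehouse table with an external table backed by a data lake. The server, credentials and table names are all invented for the example.

```python
import pyodbc  # standard ODBC route to Azure SQL-family endpoints

# Hypothetical sketch: server, database and table names are illustrative.
# The point is that warehouse tables and data-lake-backed external tables
# can be queried together with ordinary SQL from a single endpoint.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myworkspace.sql.azuresynapse.net;"
    "DATABASE=sales;UID=analyst;PWD=..."
)
cursor = conn.cursor()
cursor.execute("""
    SELECT o.region, COUNT(*) AS orders, AVG(c.sentiment) AS avg_sentiment
    FROM dbo.Orders AS o                 -- relational warehouse table
    JOIN ext.ClickstreamSentiment AS c   -- external table over the data lake
      ON o.order_id = c.order_id
    GROUP BY o.region
""")
for region, orders, avg_sentiment in cursor.fetchall():
    print(region, orders, avg_sentiment)
```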


Microsoft also highlights Synapse’s integration with Power BI, its easy-to-use business intelligence and reporting tool, as well as with Azure Machine Learning for building models.

With the Azure Synapse studio, the service provides data professionals with a single workspace for prepping and managing their data, as well as for their big data and AI tasks. There’s also a code-free environment for managing data pipelines.

As Microsoft stresses, businesses that want to adopt Synapse can continue to use their existing workloads in production with Synapse and automatically get all of the benefits of the service. “Businesses can put their data to work much more quickly, productively, and securely, pulling together insights from all data sources, data warehouses, and big data analytics systems,” writes Microsoft CVP of Azure Data, Rohan Kumar.

In a demo at Ignite, Kumar also benchmarked Synapse against Google’s BigQuery. Synapse ran the same query over a petabyte of data in 75% less time. He also noted that Synapse can handle thousands of concurrent users — unlike some of Microsoft’s competitors.


By Frederic Lardinois

Microsoft launches Power Virtual Agents, its no-code bot builder

Microsoft today announced the public preview of Power Virtual Agents, a new no-code tool for building chatbots. It is part of the company’s Power Platform, which also includes Power BI and the Microsoft Flow automation tool, which is being renamed Power Automate today.

Built on top of Azure’s existing AI smarts and tools for building bots, Power Virtual Agents promises to make building a chatbot almost as easy as writing a Word document. With this, anybody within an organization could build a bot that walks a new employee through the onboarding experience, for example.

“Power virtual agent is the newest addition to the Power Platform family,” said Microsoft’s Charles Lamanna in an interview ahead of today’s announcement. “Power Virtual Agent is very much focused on the same type of low code, accessible to anybody, no matter whether they’re a business user or business analyst or professional developer, to go build a conversational agent that’s AI-driven and can actually solve problems for your employees, for your customers, for your partners, in a very natural way.”

Power Virtual Agents handles the full lifecycle of the bot building experience, from the creation of the dialog to making it available in chat systems that include Teams, Slack, Facebook Messenger and others. Using Microsoft’s AI smarts, users don’t have to spend a lot of time defining every possible question and answer, but can instead rely on the tool to understand intentions and trigger the right action. “We do intent understanding, as well as entity extraction, to go and find the best topic for you to go down,” explained Lamanna. Like similar AI systems, the service also learns over time, based on feedback it receives from users.
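Microsoft has not published how Power Virtual Agents’ models work, but a toy version of the two steps Lamanna describes — intent understanding and entity extraction — might look like the following Python sketch. The topics, keywords and date pattern are all invented for illustration.

```python
# Illustrative only: this is not Microsoft's implementation. It shows the
# two steps described above -- picking the best "topic" for an utterance
# (intent) and pulling out the values it needs (entities).
import re

TOPICS = {
    "onboarding": {"badge", "laptop", "first", "day", "onboarding"},
    "pto": {"vacation", "leave", "pto", "days", "off"},
}

def best_topic(utterance: str) -> str:
    words = set(re.findall(r"[a-z]+", utterance.lower()))
    # score each topic by keyword overlap; a real system uses a trained model
    return max(TOPICS, key=lambda t: len(words & TOPICS[t]))

def extract_dates(utterance: str):
    # naive entity extraction: pull anything that looks like a date
    return re.findall(r"\b\d{1,2}/\d{1,2}\b", utterance)

msg = "I want to take vacation days off 12/24 to 12/31"
print(best_topic(msg), extract_dates(msg))  # pto ['12/24', '12/31']
```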

One nice feature here is that if your setup outgrows the no-code/low-code stage and you need to get to the actual code, you’ll be able to convert the bot to Azure resources since that’s what’s powering the bot anyway. Once you’ve edited the code, you obviously can’t take it back into the no-code environment. “We have an expression for Power Platform, which is ‘no cliffs.’ […] The idea of ‘no cliffs’ is that the most common problem with a low-code platform is that, at some point, you want more control, you want code. And that’s frequently where low-code platforms run out of gas and you really have issues because you can’t have the pro dev take it over, you can’t make it mission-critical.”

The service is also integrated with tools like Power Automate/Microsoft Flow to allow users to trigger actions on other services based on the information the chatbot gathers.

Lamanna stressed that the service also generates lots of advanced analytics for those who are building bots with it. With this, users can see what topics are being asked about and where the system fails to provide answers, for example. It also visualizes the different text inputs that people provide so that bot builders can react to that.

Over the course of the last two or three years, we went from a lot of hype around chatbots to deep disillusionment with the experience they actually delivered. Lamanna isn’t fazed by that. In part, those earlier efforts failed because the developers weren’t close enough to the users. They weren’t product experts or part of the HR team inside a company. By using a low-code/no-code tool, he argues, the actual topic experts can build these bots. “If you hand it over to a developer or an AI specialist, they’re geniuses when it comes to developing code, but they won’t know the details and ins and outs of, say, the shoe business – and vice versa. So it actually changes how development happens.”


By Frederic Lardinois

Cortana wants to be your personal executive assistant and read your emails to you, too

Only a few years ago, Microsoft hoped that Cortana could become a viable competitor to the Google Assistant, Alexa and Siri. Over time, as Cortana failed to make a dent in the marketplace (do you even remember that Cortana is built into your Windows 10 machine?), the company’s ambitions shrank a bit. Today, Microsoft wants Cortana to be your personal productivity assistant — and to be fair, given the overall Microsoft ecosystem, Cortana may be better suited to that than to telling you about the weather.

At its Ignite conference, Microsoft today announced a number of new features that help Cortana to become even more useful in your day-to-day work, all of which fit into the company’s overall vision of AI as a tool that is helpful and augments human intelligence.


The first of these is a new feature in Outlook for iOS that uses Microsoft text-to-speech features to read your emails to you (using both a male and female voice). Cortana can also now help you schedule meetings and coordinate participants, something the company first demoed at previous conferences.

Starting next month, Cortana will also be able to send you a daily email that summarizes all of your meetings, presents you with relevant documents and reminders to “follow up on commitments you’ve made in email.” This last part, especially, should be interesting as it seems to go beyond the basic (and annoying) nudges to reply to emails in Google’s Gmail.



By Frederic Lardinois

Google launches TensorFlow Enterprise with long-term support and managed services

Google open-sourced its TensorFlow machine learning framework back in 2015 and it quickly became one of the most popular platforms of its kind. Enterprises that wanted to use it, however, had to either work with third parties or do it themselves. To help these companies — and capture some of this lucrative market itself — Google is launching TensorFlow Enterprise, which includes hands-on, enterprise-grade support and optimized managed services on Google Cloud.

One of the most important features of TensorFlow Enterprise is that it will offer long-term support. For some versions of the framework, Google will offer patches for up to three years. For what looks to be an additional fee, Google will also offer engineering assistance from its Google Cloud and TensorFlow teams to companies that are building AI models.

All of this, of course, is deeply integrated with Google’s own cloud services. “Because Google created and open-sourced TensorFlow, Google Cloud is uniquely positioned to offer support and insights directly from the TensorFlow team itself,” the company writes in today’s announcement. “Combined with our deep expertise in AI and machine learning, this makes TensorFlow Enterprise the best way to run TensorFlow.”

Google also includes Deep Learning VMs and Deep Learning Containers to make getting started with TensorFlow easier, and the company has optimized the enterprise version for Nvidia GPUs and Google’s own Cloud TPUs.
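For context, the workloads TensorFlow Enterprise targets are ordinary TensorFlow programs; the value-add is the supported, patched distribution and the tuned infrastructure underneath. Below is a minimal sketch of such a workload — nothing in the code itself is Enterprise-specific, and the version pinning only illustrates the long-term-support idea.

```python
# Minimal sketch of the kind of workload TensorFlow Enterprise targets:
# a standard tf.keras model, pinned to a supported TensorFlow release
# (the long-term-support promise is about patching such pinned versions).
import tensorflow as tf

print(tf.__version__)  # e.g. a release covered by the LTS patch window

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, batch_size=64)
```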

Today’s launch is yet another example of Google Cloud’s focus on enterprises, a move the company accelerated when it hired Thomas Kurian to run the Cloud businesses. After years of mostly ignoring the enterprise, the company is now clearly looking at what enterprises are struggling with and how it can adapt its products for them.


By Frederic Lardinois

Aurora Insight emerges from stealth with $18M and a new take on measuring wireless spectrum

Aurora Insight, a startup that provides a “dynamic” global map of wireless connectivity, built and monitored in real time using AI combined with data from sensors on satellites, vehicles, buildings, aircraft and other objects, is emerging from stealth today with the launch of its first publicly available product. The platform provides insights on wireless signal and quality across a range of spectrum bands, offered as a cloud-based, data-as-a-service product.

“Our objective is to map the entire planet, charting the radio waves used for communications,” said Brian Mengwasser, the co-founder and CEO. “It’s a daunting task.” He said that to do this the company first “built a bunker” to test the system before rolling it out at scale.

With it, Aurora Insight is also announcing that it has raised $18 million in funding — an aggregate amount that reaches back to its founding in 2016 and covering both a seed round and Series A — from an impressive list of investors. Led by Alsop Louie Partners and True Ventures, backers also include Tippet Venture Partners, Revolution’s Rise of the Rest Seed Fund, Promus Ventures, Alumni Ventures Group, ValueStream Ventures, and Intellectus Partners.

Measuring wireless spectrum and figuring out where it might not be working well (in order to fix it) may sound like an arcane area, but it’s a fairly essential one.

Mobile technology — specifically, new devices and the use of wireless networks to connect people, objects and services — continues to be the defining activity of our time, with more than 5 billion mobile users on the planet (out of 7.5 billion people) today and the proportion continuing to grow. With that, we’re seeing a big spike in mobile internet usage, too, with more than 5 billion people, and 25.2 billion objects, expected to be using mobile data by 2025, according to the GSMA.

The catch to all this is that wireless spectrum — which enables the operation of mobile services — is inherently finite, and its reliability is subject to interference. That in turn creates a need for a better way of measuring how it is working, and how to fix it when it is not.

“Wireless spectrum is one of the most critical and valuable parts of the communications ecosystem worldwide,” said Rohit Sharma, partner at True Ventures and Aurora Insight board member, in a statement. “To date, it’s been a massive challenge to accurately measure and dynamically monitor the wireless spectrum in a way that enables the best use of this scarce commodity. Aurora’s proprietary approach gives businesses a unique way to analyze, predict, and rapidly enable the next-generation of wireless-enabled applications.”

If you follow the world of wireless technology and telcos, you’ll know that wireless network testing and measurement is an established field, about as old as the existence of wireless networks themselves (which says something about the general reliability of wireless networks). Aurora aims to disrupt this on a number of levels.

Mengwasser — who co-founded the company with Jennifer Alvarez, the CTO who you can see presenting on the company here — tells me that a lot of the traditional testing and measurement has been geared at telecoms operators, who own the radio towers, and tend to focus on more narrow bands of spectrum and technologies.

The rise of 5G and other wireless technologies, however, has come with a completely new playing field and set of challenges for the industry.

Essentially, we are now in a market where a number of different technologies coexist: alongside 5G there are earlier network technologies (4G, LTE, Wi-Fi) and a potential set of new ones. And a new breed of companies is building services that need close knowledge of how networks are working to make sure they remain up and reliable.

Mengwasser said Aurora is currently one of the few trying to tackle this opportunity by developing a network that measures multiple kinds of spectrum simultaneously, and it aims to provide that information not just to telcos (some of whom have been working with Aurora while still in stealth) but also to the other kinds of application and service developers that are building businesses based on those new networks.

“There is a pretty big difference between us and performance measurement, which typically operates from the back of a phone and tells you when you have a phone in a particular location,” he said. “We care about more than this, more than just homes, but all smart devices. Eventually, everything will be connected to the network, so we are aiming to provide intelligence on that.”

One example is drone operators building delivery networks: Aurora has been working with at least one while in stealth to help develop a service, Mengwasser said, although he declined to say which one. (He also, incidentally, specifically declined to say whether the company had talked with Amazon.)

5G is a particularly tricky area of mobile network spectrum and services to monitor and tackle, one reason why Aurora Insight has caught the attention of investors.

“The reality of massive MIMO beamforming, high frequencies, and dynamic access techniques employed by 5G networks means it’s both more difficult and more important to quantify the radio spectrum,” said Gilman Louie of Alsop Louie Partners, in a statement. “Having the accurate and near-real-time feedback on the radio spectrum that Aurora’s technology offers could be the difference between building a 5G network right the first time, or having to build it twice.” Louie is also sitting on the board of the startup.


By Ingrid Lunden

Descartes Labs snaps up $20M more for its AI-based geospatial imagery analytics platform

Satellite imagery holds a wealth of information that could be useful for industries, science and humanitarian causes, but one big and persistent challenge with it has been a lack of effective ways to tap that disparate data for specific ends.

That’s created a demand for better analytics, and now, one of the startups that has been building solutions to do just that is announcing a round of funding as it gears up for expansion. Descartes Labs, a geospatial imagery analytics startup out of Santa Fe, New Mexico, is today announcing that it has closed a $20 million round of funding, money that CEO and founder Mark Johnson described to me as a bridge round ahead of the startup closing and announcing a larger growth round.

The funding is being led by Union Grove Venture Partners, with Ajax Strategies, Crosslink Capital, and March Capital Partners (which led its previous round) also participating. It brings the total raised by Descartes Labs to $60 million, and while Johnson said the startup would not be disclosing its valuation, PitchBook notes that it is $220 million ($200 million pre-money in this round).

As a point of comparison, another startup in the area of geospatial analytics, Orbital Insight, is reportedly now raising money at a $430 million valuation (that data is from January of this year, and we’ve contacted the company to see if it ever closed).

Santa Fe — a city popular with retirees that counts tourism as its biggest industry — is an unlikely place to find a tech startup. Descartes Labs’ presence there is a result of the fact that it is a spinoff from the Los Alamos National Laboratory near the city.

Johnson — who had lived in San Francisco before coming to Santa Fe to help create Descartes Labs (his previous experience building Zite for media, he said, led the Los Alamos scientists to first conceive of the Descartes Labs IP as the basis of a kind of search engine) — admitted that he never thought the company would stay headquartered there beyond a short initial phase of growth of six months.

However, it turned out that the trends around more distributed workforces (and cloud computing to enable that), engineers looking for employment alternatives to living in pricey San Francisco, plus the heated competition for talent you get in the Valley all came together in a perfect storm that helped Descartes Labs establish and thrive on its home turf.

Descartes Labs — named after the seminal philosopher/mathematician Rene Descartes — describes itself as a “data refinery”. By this, it means it ingests a lot of imagery and unstructured data related to the earth that is picked up primarily by satellites but also by other sensors (Johnson notes that its sources include data from publicly available satellites, data from NASA and the European Space Agency, and data from the companies themselves); applies AI-based techniques, including computer vision analysis and machine learning, to make sense of the sometimes-grainy imagery; and distills and orders it to create insights into what is going on down below, and how that is likely to evolve.
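Descartes Labs’ actual pipeline is proprietary, but one classic example of the kind of distillation step a “data refinery” performs is computing a vegetation index (NDVI) from a satellite image’s red and near-infrared bands. The standard formula is sketched below in Python, with fake tiles standing in for real sensor data.

```python
# Illustrative only: one classic "refinery" step on raw satellite imagery
# is computing a vegetation index (NDVI) from the red and near-infrared
# bands, turning pixels into a crop signal that models can reason over.
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    red = red.astype("float64")
    nir = nir.astype("float64")
    # NDVI = (NIR - Red) / (NIR + Red), in [-1, 1]; higher = denser vegetation
    return (nir - red) / np.clip(nir + red, 1e-9, None)

# fake 2x2 tiles standing in for real sensor bands
red = np.array([[30, 120], [25, 110]])
nir = np.array([[90, 130], [95, 115]])
print(ndvi(red, nir).round(2))  # [[0.5  0.04] [0.58 0.02]]
```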


This includes not just what is happening on the surface of the earth, but also in the air above it: Descartes Labs has worked on projects to detect levels of methane gas in oil fields, track the spread of wildfires, forecast how crops might grow in a particular area, and gauge the impact of weather patterns on it all.

It has produced work for a range of clients that have included governments (the methane detection was commissioned as part of New Mexico’s effort to reduce greenhouse gas emissions), energy giants, industrial agribusinesses, and traders.

“The idea is to help them take advantage of all the new data going online,” Johnson said, noting that this can help, for example, bankers forecast how much a commodity will trade for, or the effect of a change in soil composition on a crop.

The fact that Descartes Labs’ work has connected it with the energy industry gives an interesting twist to the use of the phrase “data refinery”. But in case you were wondering, Johnson said that the company goes through a process of vetting potential customers to determine if the data Descartes Labs provides to them is for a positive end, or not.

“We have a deep belief that we can help them become more efficient,” he said. “Those looking at earth data are doing so because they care about the planet and are working to try to become more sustainable.”

Johnson also said (in answer to my question about it) that so far, there haven’t been any instances where the startup has been prohibited from working with any customers or countries, but you could imagine how — in this day of data being ‘the new oil’ and the fulcrum of power — that could potentially be an issue. (Related to this: Orbital Insight counts In-Q-Tel, the CIA’s venture arm, as one of its backers.)

Looking ahead, the company is building what it describes as a “digital twin” of the earth, the idea being that in doing so it can better model the imagery that it ingests and link up data from different regions more seamlessly (since, after all, a climatic event in one part of the world inevitably impacts another). Notably, “digital twinning” is a common concept that we see applied in other AI-based enterprises to better predict activity: this is the approach that, for example, Forward Networks takes when building models of an enterprise’s network to determine how apps will behave and identify the reasons behind an outage.

In addition to the funding round, Descartes Labs named Phil Fraher its new CFO, and it is announcing Veery Maxwell, Director for Energy Innovation, and Patrick Cairns, who co-founded UGVP, as new board observers.


By Ingrid Lunden

Clari snags $60M Series D on valuation of around $500M

Clari uses AI to help companies find key information like the customers most likely to convert, the state of orders in the sales process or the next big sources of revenue. As its revenue management system continues to flourish, the company announced a $60 million Series D investment today.

Sapphire Ventures led the round with help from newcomer Madrona Venture Group and existing investors Sequoia Capital, Bain Capital Ventures and Tenaya Capital. Today’s investment brings the total raised to $135 million, according to the company.

The valuation, which CEO and co-founder Andy Byrne pegged at around half a billion, appears to be a hefty jump from what the company was likely valued at in 2018 after its $35 million Series C. As TechCrunch’s Ingrid Lunden wrote at the time:

“For some context, Clari, according to Pitchbook, had a relatively modest post-money valuation of $83.5 million in its last round in 2014, so my guess is that it’s now comfortably into hundred-million territory, once you add in this latest $35 million,” Lunden wrote.

Byrne says the company wasn’t even really looking for a new round, but when investors came knocking, he couldn’t refuse. “On the fundraise side, what’s really interesting is how this whole thing went down. We weren’t out looking, but we had a massive amount of interest from a lot of firms. We decided to engage, and we got it done in less than three weeks, which the board was kind of blown away by,” Byrne told TechCrunch.

What’s motivating these investors is that Clari is helping to define this revenue operations category, and it has attracted companies like Okta, Zoom and Qualtrics as customers. What it provides is an AI-fueled way to see where the best sales opportunities are to drive revenue, and that’s what every company is looking for. At the same time, Byrne says he’s moving companies away from spreadsheet-driven record keeping and enabling them to see all of the data in one place.

“Clari is allowing a rep to really understand where they should spend time, automating a lot of things for them to close deals faster, while giving managers new insights they’ve never had before to allow them to drive more revenue. And then we’re getting them out of ‘Excel hell.’ They’re no longer in these spreadsheets. They’re in Clari, and have more predictability in their forecasting,” he said.

Clari was founded in 2012 and is headquartered in Sunnyvale, CA. It has over 300 customers and just passed the 200 employee mark, a number that should increase as the company uses this money to begin to accelerate growth and expand the product’s capabilities.


By Ron Miller

Nadella warns government conference not to betray user trust

Microsoft CEO Satya Nadella, delivering the keynote at the Microsoft Government Leaders Summit in Washington, DC today, had a message for attendees: maintain user trust in their tools and technologies above all else.

He said it is essential to earn user trust, regardless of your business. “Now, of course, the power law here is all around trust because one of the keys for us, as providers of platforms and tools, trust is everything,” he said today. But he says it doesn’t stop with the platform providers like Microsoft. Institutions using those tools also have to keep trust top of mind or risk alienating their users.

“That means you need to also ensure that there is trust in the technology that you adopt, and the technology that you create, and that’s what’s going to really define the power law on this equation. If you have trust, you will have exponential benefit. If you erode trust it will exponentially decay,” he said.

He says Microsoft sees trust along three dimensions: privacy, security and ethical use of artificial intelligence. All of these come together in his view to build a basis of trust with your customers.

Nadella said he sees privacy as a human right, pure and simple, and it’s up to vendors to ensure that privacy or lose the trust of their customers. “The investments around data governance is what’s going to define whether you’re serious about privacy or not,” he said. Microsoft, he said, looks at how transparent it is about how it uses data, at its terms of service, and at how it uses technology to ensure all of that is being carried out at runtime.

He reiterated the call he made last year for a federal privacy law. With GDPR in Europe and California’s CCPA coming online in January, he sees a centralized federal law as a way to streamline regulations for business.

As for security, as you might expect, he defined it in terms of how Microsoft was implementing it, but the message was clear that you needed security as part of your approach to trust, regardless of how you implement that. He asked several key questions of attendees.

“Cyber is the second area where we not only have to do our work, but you have to [ask], what’s your operational security posture, how have you thought about having the best security technology deployed across the entire chain, whether it’s on the application side, the infrastructure side or on the endpoint side, and most importantly, around identity,” Nadella said.

The final piece, one which he said was just coming into play was how you use artificial intelligence ethically, a sensitive topic for a government audience, but one he wasn’t afraid to broach. “One of the things people say is, ‘Oh, this AI thing is so unexplainable, especially deep learning.’ But guess what, you created that deep learning [model]. In fact, the data on top of which you train the model, the parameters and the number of parameters you use — a lot of things are in your control. So we should not abdicate our responsibility when creating AI,” he said.

Whether Microsoft or the US government can adhere to these lofty goals is unclear, but Nadella was careful to outline them both for his company’s benefit and this particular audience. It’s up to both of them to follow through.


By Ron Miller

Satya Nadella looks to the future with edge computing

Speaking today at the Microsoft Government Leaders Summit in Washington, DC, Microsoft CEO Satya Nadella made the case for edge computing, even while pushing the Azure cloud as what he called “the world’s computer.”

While Amazon, Google and other competitors may have something to say about that, marketing hype aside, many companies are still in the midst of transitioning to the cloud. Nadella says the future of computing could actually be at the edge where computing is done locally before data is then transferred to the cloud for AI and machine learning purposes. What goes around, comes around.

But as Nadella sees it, this is not going to be about either edge or cloud. It’s going to be the two technologies working in tandem. “Now, all this is being driven by this new tech paradigm that we describe as the intelligent cloud and the intelligent edge,” he said today.

He said that to truly understand the impact the edge is going to have on computing, you have to look at research, which predicts there will be 50 billion connected devices in the world by 2030, a number even he finds astonishing. “I mean this is pretty stunning. We think about a billion Windows machines or a couple of billion smartphones. This is 50 billion [devices], and that’s the scope,” he said.

The key here is that these 50 billion devices, whether you call them edge devices or the Internet of Things, will be generating tons of data. That means you will have to develop entirely new ways of thinking about how all this flows together. “The capacity at the edge, that ubiquity is going to be transformative in how we think about computation in any business process of ours,” he said. As we generate ever-increasing amounts of data, whether we are talking about public-sector use cases or any business need, it’s going to be the fuel for artificial intelligence, and he sees the sheer amount of that data driving new AI use cases.

“Of course when you have that rich computational fabric, one of the things that you can do is create this new asset, which is data and AI. There is not going to be a single application, a single experience that you are going to build, that is not going to be driven by AI, and that means you have to really have the ability to reason over large amounts of data to create that AI,” he said.

Nadella would be more than happy to have his audience take care of all that using Microsoft products, whether Azure compute, database, AI tools or edge computers like the Data Box Edge it introduced in 2018. While Nadella is probably right about the future of computing, all of this could apply to any cloud, not just Microsoft.

As computing shifts to the edge, it’s going to have a profound impact on the way we think about technology in general, but it’s probably not going to involve being tied to a single vendor, regardless of how comprehensive their offerings may be.


By Ron Miller

T4 wants to transform market research data with a combination of AI and humans

When T4 co-founder and CEO Maks Khurgin was working at Bain and Company, he ran into a common problem for analysts looking for market data: he spent way too much time searching for it and felt there had to be a better way. He decided to build a centralized market data platform himself, and T4 was born. This week the company competes in the TechCrunch Disrupt SF Startup Battlefield.

What he created with the help of his long-time friend and CTO, Yev Spektor, was built on a couple of key components. The first is an industry classification system, a taxonomy, that organizes markets by industries and sub-industries. Using search and aggregation tools powered by artificial intelligence, it scours the web looking for information sources that match their taxonomy labels.

As they researched the tool, the founders realized that AI could only get them so far; there were always pieces it missed. So they built a second component that gives human indexers a way to fill in those missing pieces, to offer as comprehensive a list of sources as possible.

“AI alone cannot solve this problem. If we bring people into this and avoid the last mile delivery problem, then you can actually start organizing this information in a much better way than anyone else had ever done,” Khurgin explained.
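A hypothetical Python sketch of that division of labor: an automated pass tags sources against the market taxonomy, and anything it scores with low confidence is routed to a human indexer. The taxonomy, keywords and threshold below are invented for illustration; T4 has not disclosed how its classifiers work.

```python
# Hypothetical sketch of the two-part approach described above: an automated
# pass tags sources against a market taxonomy, and low-confidence items go
# to a queue for human indexers. Taxonomy and threshold are invented.
TAXONOMY = {
    "cloud computing": ["iaas", "paas", "cloud", "serverless"],
    "cybersecurity": ["malware", "firewall", "zero trust", "breach"],
    "esports": ["tournament", "streaming", "league"],
}

def auto_tag(text: str, threshold: int = 2):
    text = text.lower()
    scores = {label: sum(kw in text for kw in kws)
              for label, kws in TAXONOMY.items()}
    label, score = max(scores.items(), key=lambda kv: kv[1])
    if score >= threshold:
        return label, "auto"
    return label, "needs_human_review"   # the "last mile" goes to indexers

print(auto_tag("2019 cloud and serverless IaaS spending report"))
# ('cloud computing', 'auto')
```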

It seems simple enough, but it’s a problem that well-heeled companies like Bain have been trying to solve for years, and there was a lot of skepticism when Khurgin told his superiors he was leaving to build a product to solve it. “I had a partner at Bain and Company actually tell me, ‘You know, every consulting firm has tried to do something like this — and they failed. Why do you think you can do this?’”

He knew that figuring out the nature of the problem and why the other attempts had failed was the key to solving the puzzle. He decided to take the challenge, and on his 30th birthday, he quit his job at Bain and started T4 the next day — without a product yet, mind you.

This was not the first time he had left a high-paying job to try something unconventional. “Last time I left a high paying job, actually after undergrad, I was a commodities derivatives trader for a financial [services company]. I left that to pursue a lifelong dream of being in the Marine Corps,” Khurgin said.


T4 was probably a less risky proposition, but it still took the kind of leap of faith that only a startup founder who believes in his idea can understand. “I felt the problem first-hand, and the big realization I had was that there is actually a finite amount of information out there. Market research is created by humans, and you don’t necessarily have to take a pure AI approach,” he said.

The product searches for all of the related information on a topic, finds all of the data related to a category and places it in an index. Users can search by topic and find all of the free and paid reports related to that search. The product shows which reports are free and which will cost you money, and like Google, you get a title and a brief summary.

The company is just getting started, with five main market categories so far: cloud computing, cybersecurity, networking, data centers and eSports. The founders plan to add more categories over time, and they have a bold goal for the future.

“Our long-term vision is that we become your one-stop shop to find market research in the same way that if you need to buy something, you go to Amazon, or you need financial data, you go on Bloomberg or Thomson. If you need market research, our vision is that T4 is the place that you go,” Khurgin said.



By Ron Miller

India’s Fyle bags $4.5M to expand its expense management platform in US, other international markets

Fyle, a Bangalore-headquartered startup that operates an expense management platform, has extended its previous financing round to add $4.5 million of new investment as it looks to court more clients in overseas markets.

The additional $4.5 million tranche of investment was led by U.S.-based hedge fund Steadview Capital, the startup said. Tiger Global, Freshworks, and Pravega Ventures also participated in the round. The new tranche, dubbed Series A1, means that the three-and-a-half-year-old startup has raised $8.7 million as part of its Series A financing round, and $10.5 million to date.

The SaaS startup offers an expense management platform that makes it easier for employees of a firm to report their business expenses. The eponymous service supports a range of popular email providers including G Suite and Office 365, and uses a proprietary technology to scan and fetch details from emails, Yash Madhusudhan, co-founder and CEO of Fyle, demonstrated to TechCrunch last week.

A user, for instance, could open a flight ticket email and click on Fyle’s Chrome extension to fetch all the details and report the expense in a single click, in real time. As part of today’s announcement, Madhusudhan unveiled an integration with WhatsApp. Users will now be able to take pictures of their tickets and other receipts and forward them to Fyle, which will quickly scan and report the expense filings for them.
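Fyle’s scanning technology is proprietary, so the following Python sketch is only a toy illustration of the general idea: pulling structured expense fields out of a booking email so the filing requires no manual data entry. The field patterns and sample email are invented.

```python
# Illustrative only: Fyle's scanning technology is proprietary. This toy
# extractor shows the basic idea -- pulling structured expense fields out
# of a booking email so the filing needs no manual data entry.
import re

def extract_expense(email_body: str) -> dict:
    amount = re.search(r"(?:USD|\$)\s?([\d,]+\.?\d*)", email_body)
    date = re.search(r"\b(\d{1,2} \w{3} \d{4})\b", email_body)
    merchant = re.search(r"^From:\s*(.+)$", email_body, re.MULTILINE)
    return {
        "merchant": merchant.group(1) if merchant else None,
        "date": date.group(1) if date else None,
        "amount": float(amount.group(1).replace(",", "")) if amount else None,
        "category": "Travel",  # a real system would classify this too
    }

sample = """From: Acme Airlines
Your e-ticket for 12 Nov 2019 is confirmed.
Total charged: USD 1,249.00"""
print(extract_expense(sample))
# {'merchant': 'Acme Airlines', 'date': '12 Nov 2019', 'amount': 1249.0, ...}
```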

These integrations come in handy for users. “80%-90% of a user’s spending patterns land on their email and messaging clients, and traditionally it has been a pain point for them to get done with their expense filings. So we built a platform that looks at the challenges faced by them. At the same time, our platform understands fraud and works with a company’s compliance policies to ensure that the filings are legitimate,” he said.

“Every company today could make use of an intelligent expense platform like Fyle. Major giants already subscribe to ERP services that offer similar capabilities as part of their offerings. But as a company or startup grows beyond 50 to 100 people, it becomes tedious to manage expense filings,” he added.

Fyle maintains a web application and a mobile app, and users are free to use them. But the rationale behind introducing integrations with popular services is to make it easier than ever to report filings. The startup retrains its algorithms each month to improve their scanning abilities. “The idea is to extend expense filing to a service that people already use,” he said.

International expansion

Until late last year, Fyle was serving customers in India. Earlier this year, it began searching for clients outside the nation. “Our philosophy was if we are able to sell in India remotely and get people to use the product without any training, we should be able to replicate this in any part of the world,” he said.

And that bet has worked. Fyle has amassed more than 300 clients, more than 250 of which are from outside India. Today, the startup says it has customers in 17 nations, including the U.S. and the U.K. Furthermore, Fyle’s revenue has grown fivefold in the last five months, said Madhusudhan, without disclosing exact figures.

To accelerate its momentum, the startup is today also launching an enterprise version of Fyle to serve the needs of major companies. The enterprise version supports a range of additional security features, such as IP restriction and a single sign-in option.

Fyle will use the new capital to develop more product solutions and integrations and to expand its footprint in international markets, Madhusudhan said. The startup, which only recently set up its sales and marketing team, will also expand its headcount, he said.

Moving forward, Madhusudhan said the startup would also explore tie-ups with ERP providers and other ways to extend the reach of Fyle.

In a statement, Ravi Mehta, MD at Steadview Capital, said: “Intelligent and automated systems will empower businesses to be more efficient in the coming decade. We are excited to partner with Fyle to transform one of the core business processes of expense management through intelligence and automation.”


By Manish Singh

Why is Dropbox reinventing itself?

According to Dropbox CEO Drew Houston, 80% of the product’s users rely on it, at least partially, for work.

It makes sense, then, that the company is refocusing to try and cement its spot in the workplace; to shed its image as “just” a file storage company (in a time when just about every big company has its own cloud storage offering) and evolve into something more immutably core to daily operations.

Earlier this week, Dropbox announced that the “new Dropbox” would be rolling out to all users. It takes the simple, shared folders that Dropbox is known for and turns them into what the company calls “Spaces” — little mini collaboration hubs for your team, complete with comment streams, AI for highlighting files you might need mid-meeting, and integrations into things like Slack, Trello and G Suite. With an overhauled interface that brings much of Dropbox’s functionality out of the OS and into its own dedicated app, it’s by far the biggest user-facing change the product has seen since launching 12 years ago.

Shortly after the announcement, I sat down with Dropbox VP of Product Adam Nash and CTO Quentin Clark. We chatted about why the company is changing things up, why they’re building this on top of the existing Dropbox product, and the things they know they just can’t change.

You can find these interviews below, edited for brevity and clarity.

Greg Kumparak: Can you explain the new focus a bit?

Adam Nash: Sure! I think you know this already, but I run products and growth, so I’m gonna have a bit of a product bias to this whole thing. But Dropbox… one of its differentiating characteristics is really that when we built this utility, this “magic folder”, it kind of went everywhere.


By Greg Kumparak