Extra Crunch roundup: 2 VC surveys, Tesla’s melt up, The Roblox Gambit, more

This has been quite a week.

Instead of walking backward through the last few days of chaos and uncertainty, here are three good things that happened:

  • Google employee Sara Robinson combined her interest in machine learning and baking to create AI-generated hybrid treats.
  • A breakthrough could make water desalination 30%-40% more effective.
  • Bianca Smith will become the first Black woman to coach a professional baseball team.

Despite many distractions in our first full week of the new year, we published a full slate of stories exploring different aspects of entrepreneurship, fundraising and investing.

We’ve already gotten feedback on this overview of subscription pricing models. A look back at 2020 funding rounds and exits among Israel’s security startups was aimed at our new members who live and work there, as well as at international investors seeking new opportunities.

Plus, don’t miss our first investor surveys of 2021: one by Lucas Matney on social gaming, and another by Mike Butcher that gathered responses from Portugal-based investors on a wide variety of topics.

Thanks very much for reading Extra Crunch this week. I hope we can all look forward to a nice, boring weekend with no breaking news alerts.

Walter Thompson
Senior Editor, TechCrunch
@yourprotagonist


Full Extra Crunch articles are only available to members
Use discount code ECFriday to save 20% off a one- or two-year subscription


The Roblox Gambit

In February 2020, gaming platform Roblox was valued at $4 billion, but after announcing a $520 million Series H this week, it’s now worth $29.5 billion.

“Sure, you could argue that Roblox enjoyed an epic 2020, thanks in part to COVID-19,” writes Alex Wilhelm this morning. “That helped its valuation. But there’s a lot of space between $4 billion and $29.5 billion.”

Alex suggests that Roblox’s decision to delay its IPO and raise an enormous Series H was a grandmaster move that could influence how other unicorns will take themselves to market. “A big thanks to the gaming company for running this experiment for us.”

I asked him what inspired the headline; like most good ideas, it came to him while he was trying to get to sleep.

“I think that I had ‘The Queen’s Gambit’ somewhere in my head, so that formed the root of a little joke with myself. Roblox is making a strategic wager on its method of going public. So, ‘gambit’ seems to fit!”

8 investors discuss social gaming’s biggest opportunities

girl playing games on desktop computer

Image Credits: Erik Von Weber / Getty Images

For our first investor survey of the year, Lucas Matney interviewed eight VCs who invest in massively multiplayer online games to discuss 2021 trends and opportunities:

  • Hope Cochran, Madrona Venture Group
  • Daniel Li, Madrona Venture Group
  • Niko Bonatsos, General Catalyst
  • Ethan Kurzweil, Bessemer Venture Partners
  • Sakib Dadi, Bessemer Venture Partners
  • Jacob Mullins, Shasta Ventures
  • Alice Lloyd George, Rogue
  • Gigi Levy-Weiss, NFX

Having moved far beyond shooters and sims, platforms like Twitch, Discord and Fortnite are “where culture is created,” said Daniel Li of Madrona.

Rep. Alexandria Ocasio-Cortez uses Twitch to explain policy positions, major musicians regularly perform in-game concerts on Fortnite and in-game purchases generated tens of billions of dollars last year.

“Gaming is a unique combination of science and art, left and right brain,” said Gigi Levy-Weiss of NFX. “It’s never just science (i.e., software and data), which is why many investors find it hard.”

How to convert customers with subscription pricing

Giant hand and magnet picking up office and workers

Image Credits: C.J. Burton / Getty Images

Startups that lack insight into their sales funnel have high churn, low conversion rates and an inability to adapt or leverage changes in customer behavior.

If you’re hoping to convert and retain customers, “reinforcing your value proposition should play a big part in every level of your customer funnel,” says Joe Procopio, founder of Teaching Startup.

What is up with Tesla’s value?

Elon Musk, founder of SpaceX and chief executive officer of Tesla Inc., arrives at the Axel Springer Award ceremony in Berlin, Germany, on Dec. 1, 2020.

Image Credits: Bloomberg / Getty Images

Alex Wilhelm followed up his regular Friday column with another story that tries to find a well-grounded rationale for Tesla’s sky-high valuation of approximately $822 billion.

Meanwhile, GM just unveiled a new logo and tagline.

As ever, I learned something new while editing: A “melt up” occurs when investors start clamoring for a particular company because of acute FOMO (the fear of missing out).

Delivering 500,000 cars in 2020 was “impressive,” says Alex, who also acknowledged the company’s ability to turn GAAP profits, but “pride cometh before the fall, as does a melt up, I think.”

Note: This story has Alex’s original headline, but I told him I would replace the featured image with a photo of someone who had very “richest man in the world” face.

How Segment redesigned its core systems to solve an existential scaling crisis

Abstract glowing grid and particles

Image Credits: piranka / Getty Images

On Tuesday, enterprise reporter Ron Miller covered a major engineering project at customer data platform Segment called “Centrifuge.”

“Its purpose was to move data through Segment’s data pipes to wherever customers needed it quickly and efficiently at the lowest operating cost,” but as Ron reports, it was also meant to solve “an existential crisis for the young business,” which needed a more resilient platform.

Dear Sophie: Banging my head against the wall understanding the US immigration system

Image Credits: Sophie Alcorn

Dear Sophie:

Now that the U.S. has a new president coming in whose policies are more welcoming to immigrants, I am considering coming to the U.S. to expand my company after COVID-19. However, I’m struggling with the morass of information online that has bits and pieces of visa types and processes.

Can you please share an overview of the U.S. immigration system and how it works so I can get the big picture and understand what I’m navigating?

— Resilient in Romania

The first “Dear Sophie” column of each month is available on TechCrunch without a paywall.

Revenue-based financing: The next step for private equity and early-stage investment

Shot of a group of people holding plants growing out of soil

Image Credits: Hiraman / Getty Images

For founders who aren’t interested in angel investment or seeking validation from a VC, revenue-based investing is growing in popularity.

To gain a deeper understanding of the U.S. RBI landscape, we published an industry report on Wednesday that studied data from 134 companies, 57 funds and 32 investment firms before breaking out “specific verticals and business models … and the typical profile of companies that access this form of capital.”

Lisbon’s startup scene rises as Portugal gears up to be a European tech tiger

Man using laptop at 25th of April Bridge in Lisbon, Portugal

Image Credits: Westend61 / Getty Images

Mike Butcher continues his series of European investor surveys with his latest dispatch from Lisbon, where a nascent startup ecosystem may get a Brexit boost.

Here are the Portugal-based VCs he interviewed:

  • Cristina Fonseca, partner, Indico Capital Partners
  • Pedro Ribeiro Santos, partner, Armilar Venture Partners
  • Tocha, partner, Olisipo Way
  • Adão Oliveira, investment manager, Portugal Ventures
  • Alexandre Barbosa, partner, Faber
  • António Miguel, partner, Mustard Seed MAZE
  • Jaime Parodi Bardón, partner, impACT NOW Capital
  • Stephan Morais, partner, Indico Capital Partners
  • Gavin Goldblatt, managing partner, Portugal Gateway

How late-stage edtech companies are thinking about tutoring marketplaces

Life Rings flying out beneath storm clouds are a metaphor for rescue, help and aid.

Image Credits: John Lund / Getty Images

How do you scale online tutoring, particularly when demand exceeds the supply of human instructors?

This month, Chegg is replacing its seven-year-old marketplace, which paired students with tutors, with a live chatbot.

A spokesperson said the move will “dramatically differentiate our offerings from our competitors and better service students,” but Natasha Mascarenhas identified two challenges to edtech automation.

“A chatbot won’t work for a student with special needs or someone who needs to be handheld a bit more,” she says. “Second, speed tutoring can only work for a specific set of subjects.”

Decrypted: How bad was the US Capitol breach for cybersecurity?

Image Credits: Treedeo / Getty Images

While I watched insurrectionists invade and vandalize the U.S. Capitol on live TV, I noticed that staffers evacuated so quickly, some hadn’t had time to shut down their computers.

Looters even made off with a laptop from Senator Jeff Merkley’s office, but according to security reporter Zack Whittaker, the damage to information security wasn’t as bad as it looked.

Even so, “the breach will likely present a major task for Congress’ IT departments, which will have to figure out what’s been stolen and what security risks could still pose a threat to the Capitol’s network.”

Extra Crunch’s top 10 stories of 2020

On New Year’s Eve, I made a list of the 10 “best” Extra Crunch stories from the previous 12 months.

My methodology was personal: From hundreds of posts, these were the 10 I found most useful, which is my key metric for business journalism.

Some readers are skeptical about paywalls, but without being boastful, Extra Crunch is a premium product, just like Netflix or Disney+. I know, we’re not as entertaining as a historical drama about the reign of Queen Elizabeth II or a space western about a bounty hunter. But, speaking as someone who’s worked at several startups, Extra Crunch stories contain actionable information you can use to build a company and/or look smart in meetings — and that’s worth something.


By Walter Thompson

How artificial intelligence will be used in 2021

Scale AI CEO Alexandr Wang doesn’t need a crystal ball to see where artificial intelligence will be used in the future. He just looks at his customer list.

The four-year-old startup, which recently hit a valuation of more than $3.5 billion, got its start supplying autonomous vehicle companies with the labeled data needed to train machine learning models to develop and eventually commercialize robotaxis, self-driving trucks and automated bots used in warehouses and on-demand delivery.

The wider adoption of AI across industries has been a bit of a slow burn over the past several years as company founders and executives begin to understand what the technology could do for their businesses.

In 2020, that changed as e-commerce, enterprise automation, government, insurance, real estate and robotics companies turned to Scale’s visual data labeling platform to develop and apply artificial intelligence to their respective businesses. Now, the company is preparing for the customer list to grow and become more varied.

How 2020 shaped up for AI

Scale AI’s customer list has included an array of autonomous vehicle companies including Alphabet, Voyage, nuTonomy, Embark, Nuro and Zoox. While it began to diversify with additions like Airbnb, DoorDash and Pinterest, there were still sectors that had yet to jump on board. That changed in 2020, Wang said.

Scale began to see incredible use cases of AI within the government as well as enterprise automation, according to Wang. Scale AI began working more closely with government agencies this year and added enterprise automation customers like States Title, a residential real estate company.

Wang also saw an increase in uses around conversational AI, in both consumer and enterprise applications, as well as growth in e-commerce as companies sought out ways to use AI to provide personalized recommendations for their customers that were on par with Amazon’s.

Robotics continued to expand as well in 2020, although it spread to use cases beyond robotaxis, autonomous delivery and self-driving trucks, Wang said.

“A lot of the innovations that have happened within the self-driving industry, we’re starting to see trickle out throughout a lot of other robotics problems,” Wang said. “And so it’s been super exciting to see the breadth of AI continue to broaden and serve our ability to support all these use cases.”

That slow burn, Wang said, was sustained by advancements in natural language processing of text, improved offerings from cloud companies like AWS, Azure and Google Cloud, and greater access to datasets.

“We’re finally getting to the point where we can help with computational AI, which has been this thing that’s been pitched for forever,” he said.

That slow burn heated up with the COVID-19 pandemic, said Wang, noting that interest has been particularly strong within government and enterprise automation as these entities looked for ways to operate more efficiently.

“There was this big reckoning,” Wang said of 2020 and the effect that COVID-19 had on traditional business enterprises.

If the future is mostly remote, with consumers buying online instead of in person, companies started to ask, “How do we start building for that?” according to Wang.

The push for operational efficiency, coupled with the capabilities of the technology, is only going to accelerate the use of AI for automating processes like mortgage applications or customer loans at banks, said Wang, who noted that outside of the tech world there are industries that still rely on a lot of paper and manual processes.


By Kirsten Korosec

Arthur.ai snags $15M Series A to grow machine learning monitoring tool

At a time when more companies are building machine learning models, Arthur.ai wants to help by ensuring that model accuracy doesn’t slip over time, which would erode a model’s ability to measure what it was designed to measure. As demand for this type of tool has increased this year, in spite of the pandemic, the startup announced a $15 million Series A today.

The investment was led by Index Ventures with help from newcomers Acrew and Plexo Capital, along with previous investors Homebrew, AME Ventures and Work-Bench. The round comes almost exactly a year after its $3.3 million seed round.

As CEO and co-founder Adam Wenchel explains, data scientists build and test machine learning models in the lab under ideal conditions, but as these models are put into production, the performance can begin to deteriorate under real world scrutiny. Arthur.AI is designed to root out when that happens.
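
To make the drift problem concrete, here is a minimal sketch (not Arthur’s actual method) of one common check: compare a feature’s live distribution against its training baseline with a two-sample test and flag large shifts. The data below is synthetic.

```python
# Minimal, illustrative drift check: compare a feature's production
# distribution against its training baseline. Not Arthur.ai's actual method.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
train_feature = rng.normal(loc=0.0, scale=1.0, size=5_000)   # "lab" data
prod_feature = rng.normal(loc=0.4, scale=1.2, size=5_000)    # shifted live data

stat, p_value = ks_2samp(train_feature, prod_feature)
if p_value < 0.01:
    print(f"Possible drift detected (KS={stat:.3f}, p={p_value:.4f}) -- investigate")
else:
    print("Feature distribution looks stable")
```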

Even as COVID wreaked havoc throughout much of this year, the company grew revenue 300% in the last six months. “Over the course of 2020, we have begun to open up more and talk to [more] customers. And so we are starting to get some really nice initial customer traction, both in traditional enterprises as well as digital tech companies,” Wenchel told me. With 15 customers, the company is finding that the solution resonates with both groups.

It’s interesting to note that AWS announced a similar tool yesterday at re:Invent called SageMaker Clarify, but Wenchel sees this as more of a validation of what his startup has been trying to do, rather than an existential threat. “I think it helps create awareness, and because this is our 100% focus, our tools go well beyond what the major cloud providers provide,” he said.

Investor Mike Volpi from Index certainly sees the value proposition of this company. “One of the most critical aspects of the AI stack is in the area of performance monitoring and risk mitigation. Simply put, is the AI system behaving like it’s supposed to?” he wrote in a blog post announcing the funding.

When we spoke a year ago, the company had 8 employees. Today it has 17, and it expects to double again by the end of next year. Wenchel says that as a company whose product looks for different types of bias, it’s especially important to have a diverse workforce. He says that starts with having a diverse investment team and board makeup, which he has been able to achieve, and goes from there.

“We’ve sponsored and work with groups that focus on both general sort of coding for different underrepresented groups as well as specifically AI, and that’s something that we’ll continue to do. And actually I think when we can get together for in person events again, we will really go out there and support great organizations like AI for All and Black Girls Code,” he said. He believes that by working with these groups, it will give the startup a pipeline to underrepresented groups, which they can draw upon for hiring as the needs arise.

Wenchel says that when he can go back to the office, he wants to bring employees back, at least for part of the week for certain kinds of work that will benefit from being in the same space.


By Ron Miller

AWS announces SageMaker Clarify to help reduce bias in machine learning models

As companies rely increasingly on machine learning models to run their businesses, it’s imperative to include anti-bias measures to ensure these models are not making false or misleading assumptions. Today at AWS re:Invent, AWS introduced Amazon SageMaker Clarify to help reduce bias in machine learning models.

“We are launching Amazon SageMaker Clarify. And what that does is it allows you to have insight into your data and models throughout your machine learning lifecycle,” Bratin Saha, Amazon VP and general manager of machine learning, told TechCrunch.

He says that it is designed to analyze the data for bias before you start data prep, so you can find these kinds of problems before you even start building your model.

“Once I have my training data set, I can [look at things like if I have] an equal number of various classes, like do I have equal numbers of males and females or do I have equal numbers of other kinds of classes, and we have a set of several metrics that you can use for the statistical analysis so you get real insight into easier data set balance,” Saha explained.

After you build your model, you can run SageMaker Clarify again to look for similar factors that might have crept into your model as you built it. “So you start off by doing statistical bias analysis on your data, and then post training you can again do analysis on the model,” he said.
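
For a sense of what a pre-training balance check can look like, here is an illustrative snippet in plain pandas; the facet column, labels and the simple imbalance ratio are hypothetical stand-ins rather than the SageMaker Clarify API itself.

```python
# Illustrative pre-training balance check in the spirit of what's described
# above; this is plain pandas, not the SageMaker Clarify API.
import pandas as pd

df = pd.DataFrame({
    "gender": ["male"] * 700 + ["female"] * 300,   # hypothetical facet column
    "label":  [1, 0] * 350 + [1, 0, 0] * 100,      # hypothetical outcome labels
})

counts = df["gender"].value_counts()
n_a, n_d = counts.get("male", 0), counts.get("female", 0)
imbalance = (n_a - n_d) / (n_a + n_d)              # ranges from -1 to 1; 0 is balanced
print(f"facet counts:\n{counts}\nimbalance ratio: {imbalance:+.2f}")

# Positive-label rates per group hint at label imbalance before training.
print(df.groupby("gender")["label"].mean())
```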

There are multiple types of bias that can enter a model due to the background of the data scientists building it, the nature of the data and how the data scientists interpret that data through the model they built. While this can be problematic in general, it can also lead to racial stereotypes being extended to algorithms. As an example, facial recognition systems have proven quite accurate at identifying white faces, but much less so when it comes to recognizing people of color.

It may be difficult to identify these kinds of biases with software as it often has to do with team makeup and other factors outside the purview of a software analysis tool, but Saha says they are trying to make that software approach as comprehensive as possible.

“If you look at SageMaker Clarify, it gives you data bias analysis, it gives you model bias analysis, it gives you model explainability, it gives you per-inference explainability and it gives you global explainability,” Saha said.

Saha says that Amazon is aware of the bias problem, which is why it created this tool to help, but he recognizes that the tool alone won’t eliminate all of the bias issues that can crop up in machine learning models, so the company offers other kinds of help as well.

“We are also working with our customers in various ways. So we have documentation, best practices, and we point our customers to how to be able to architect their systems and work with the system so they get the desired results,” he said.

SageMaker Clarify is available starting today in multiple regions.


By Ron Miller

Tecton.ai nabs $35M Series B as it releases machine learning feature store

Tecton.ai, the startup founded by three former Uber engineers who wanted to bring the machine learning feature store idea to the masses, announced a $35 million Series B today, just seven months after announcing its $20 million Series A.

When we spoke to the company in April, it was working with early customers on a beta version of the product. Today, in addition to the funding, it is also announcing the general availability of the platform.

As with the Series A, this round has Andreessen Horowitz and Sequoia Capital coming back to co-lead the investment. The company has now raised $60 million.

The reason these two firms are so committed to Tecton is the specific problem around machine learning the company is trying to solve. “We help organizations put machine learning into production. That’s the whole goal of our company, helping someone build an operational machine learning application, meaning an application that’s powering their fraud system or something real for them […] and making it easy for them to build and deploy and maintain,” company CEO and co-founder Mike Del Balso explained.

They do this by providing the concept of a feature store, an idea they came up with and which is becoming a machine learning category unto itself. Just last week, AWS announced the SageMaker Feature Store, which the company saw as major validation of its idea.

As Tecton defines it, a feature store is an end-to-end machine learning management system: it includes the pipelines that transform raw data into what are called feature values, it then stores and manages all of that feature data, and finally it serves a consistent set of data.

Del Balso says this works hand-in-hand with the other layers of a machine learning stack. “When you build a machine learning application, you use a machine learning stack that could include a model training system, maybe a model serving system or an MLOps kind of layer that does all the model management, and then you have a feature management layer, a feature store which is us — and so we’re an end-to-end lifecycle for the data pipelines,” he said.
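
To make the feature store idea concrete, here is a toy sketch of the three steps described above: transform raw events into feature values, store them per entity, and serve the same values consistently. It is illustrative only, not Tecton’s API, and all names and data are made up.

```python
# A toy "feature store": transform raw events into feature values, store them
# per entity, and serve the same values to both training and online inference.
# A sketch of the concept, not Tecton's product.
from collections import defaultdict
from datetime import datetime

raw_events = [
    {"user_id": "u1", "amount": 25.0, "ts": datetime(2020, 12, 1)},
    {"user_id": "u1", "amount": 90.0, "ts": datetime(2020, 12, 3)},
    {"user_id": "u2", "amount": 10.0, "ts": datetime(2020, 12, 2)},
]

def transform(events):
    """Pipeline step: turn raw transactions into per-user feature values."""
    totals, counts = defaultdict(float), defaultdict(int)
    for e in events:
        totals[e["user_id"]] += e["amount"]
        counts[e["user_id"]] += 1
    return {uid: {"txn_count": counts[uid], "txn_total": totals[uid]}
            for uid in totals}

feature_store = transform(raw_events)          # "materialize" the features

def get_features(user_id):
    """Serving step: return the same values for offline training and online scoring."""
    return feature_store.get(user_id, {"txn_count": 0, "txn_total": 0.0})

print(get_features("u1"))   # {'txn_count': 2, 'txn_total': 115.0}
```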

With so much money behind the company it is growing fast, going from 17 employees to 26 since we spoke in April with plans to more than double that number by the end of next year. Del Balso says he and his co-founders are committed to building a diverse and inclusive company, but he acknowledges it’s not easy to do.

“It’s actually something that we have a primary recruiting initiative on. It’s very hard, and it takes a lot of effort, it’s not something that you can just make like a second priority and not take it seriously,” he said. To that end, the company has sponsored and attended diversity hiring conferences and has focused its recruiting efforts on finding a diverse set of candidates, he said.

Unlike a lot of startups we’ve spoken to, Del Balso wants to return to an office setup as soon as it is feasible to do so, seeing it as a way to build more personal connections between employees.


By Ron Miller

AWS adds natural language search service for business intelligence from its data sets

When Amazon Web Services launched QuickSight, its business intelligence service, back in 2016, the company wanted to provide product information and customer information for business users — not just developers.

At the time, the natural language processing technologies available weren’t robust enough to give customers the tools to search databases effectively using queries in plain speech.

Now, as those technologies have matured, Amazon is coming back with a significant upgrade called QuickSight Q, which allows users to just ask a simple question and get the answers they need, according to Andy Jassy’s keynote at AWS re:Invent.

“We will provide natural language to provide what we think the key learning is,” said Jassy. “I don’t like that our users have to know which databases to access or where data is stored. I want them to be able to type into a search bar and get the answer to a natural language question.”

That’s what QuickSight Q aims to do. It’s a direct challenge to a number of business intelligence startups and another instance of the way machine learning and natural language processing are changing business processes across multiple industries.

“The way Q works. Type in a question in natural language [like]… ‘Give me the trailing twelve month sales of product X?’… You get an answer in seconds. You don’t have to know tables or have to know data stores.”

It’s a compelling use case and gets at the way AWS is integrating machine learning to provide more no-code services to customers. “Customers didn’t hire us to do machine learning,” Jassy said. “They hired us to answer the questions.”


By Jonathan Shieber

Abacus.AI raises another $22M and launches new AI modules

AI startup RealityEngines.AI changed its name to Abacus.AI in July. At the same time, it announced a $13 million Series A round. Today, only a few months later, it is not changing its name again, but it is announcing a $22 million Series B round, led by Coatue, with Decibel Ventures and Index Partners participating as well. With this, the company, which was co-founded by former AWS and Google exec Bindu Reddy, has now raised a total of $40.3 million.

Abacus.AI co-founders Bindu Reddy, Arvind Sundararajan and Siddartha Naidu. Image Credits: Abacus.AI

In addition to the new funding, Abacus.AI is also launching a new product today, which it calls Abacus.AI Deconstructed. Originally, the idea behind RealityEngines/Abacus.AI was to provide its users with a platform that would simplify building AI models by using AI to automatically train and optimize them. That hasn’t changed, but as it turns out, a lot of (potential) customers had already invested in their own workflows for building and training deep learning models but were looking for help in putting them into production and managing them throughout their lifecycle.

“One of the big pain points [businesses] had was, ‘look, I have data scientists and I have my models that I’ve built in-house. My data scientists have built them on laptops, but I don’t know how to push them to production. I don’t know how to maintain and keep models in production.’ I think pretty much every startup now is thinking of that problem,” Reddy said.

Image Credits: Abacus.AI

Since Abacus.AI had already built those tools anyway, the company decided to now also break its service down into three parts that users can adapt without relying on the full platform. That means you can now bring your model to the service and have the company host and monitor the model for you, for example. The service will manage the model in production and, for example, monitor for model drift.

Another area Abacus.AI has long focused on is model explainability and de-biasing, so it’s making that available as a module as well, along with its real-time machine learning feature store that helps organizations create, store and share their machine learning features and deploy them into production.

As for the funding, Reddy tells me the company didn’t really have to raise a new round at this point. After the company announced its first round earlier this year, there was quite a lot of interest from others to also invest. “So we decided that we may as well raise the next round because we were seeing adoption, we felt we were ready product-wise. But we didn’t have a large enough sales team. And raising a little early made sense to build up the sales team,” she said.

Reddy also stressed that unlike some of the company’s competitors, Abacus.AI is trying to build a full-stack self-service solution that can essentially compete with the offerings of the big cloud vendors. That — and the engineering talent to build it — doesn’t come cheap.

Image Credits: Abacus.AI

It’s no surprise then that Abacus.AI plans to use the new funding to increase its R&D team, but it will also increase its go-to-market team from two to ten in the coming months. While the company is betting on a self-service model — and is seeing good traction with small- and medium-sized companies — you still need a sales team to work with large enterprises.

Come January, the company also plans to launch support for more languages and more machine vision use cases.

“We are proud to be leading the Series B investment in Abacus.AI, because we think that Abacus.AI’s unique cloud service now makes state-of-the-art AI easily accessible for organizations of all sizes, including start-ups. Abacus.AI’s end-to-end autonomous AI service powered by their Neural Architecture Search invention helps organizations with no ML expertise easily deploy deep learning systems in production.”

 


By Frederic Lardinois

Computer vision startup Chooch.ai scores $20M Series A

Chooch.ai, a startup that hopes to bring computer vision more broadly to companies to help them identify and tag elements at high speed, announced a $20 million Series A today.

Vickers Venture Partners led the round with participation from 212, Streamlined Ventures, Alumni Ventures Group, Waterman Ventures and several other unnamed investors. Today’s investment brings the total raised to $25.8 million, according to the company.

“Basically we set out to copy human visual intelligence in machines. That’s really what this whole journey is about,” CEO and co-founder Emrah Gultekin explained. As the company describes it, “Chooch AI can rapidly ingest and process visual data from any spectrum, generating AI models in hours that can detect objects, actions, processes, coordinates, states, and more.”

Chooch is trying to differentiate itself from other AI startups by taking a broader approach that could work in any setting, rather than concentrating on specific vertical applications. Using the pandemic as an example, Gultekin says you could use his company’s software to identify everyone who is not wearing a mask in a building or everyone who is not wearing a hard hat at a construction site.

 

With 22 employees spread across the U.S., India and Turkey, Chooch is building a diverse company just by virtue of its geography, but as it doubles the workforce in the coming year, it wants to continue to build on that.

“We’re immigrants. We’ve been through a lot of different things, and we recognize some of the issues and are very sensitive to them. One of our senior members is a person of color and we are very cognizant of the fact that we need to develop that part of our company,” he said. At a recent company meeting, he said that they were discussing how to build diversity into the policies and values of the company as they move forward.

The company currently has 18 enterprise clients and hopes to use the money to add engineers, data scientists and begin to build out a worldwide sales team to continue to build the product and expand its go-to-market effort.

Gultekin says that the company’s unusual name comes from a mix of the words choose and search. He says that it is also an old Italian insult. “It means dummy or idiot, which is what artificial intelligence is today. It’s a poor reflection of humanity or human intelligence in humans,” he said. His startup aims to change that.


By Ron Miller

Databricks launches SQL Analytics

AI and data analytics company Databricks today announced the launch of SQL Analytics, a new service that makes it easier for data analysts to run their standard SQL queries directly on data lakes. And with that, enterprises can now easily connect their business intelligence tools like Tableau and Microsoft’s Power BI to these data repositories as well.

SQL Analytics will be available in public preview on November 18.

In many ways, SQL Analytics is the product Databricks has long been looking to build and that brings its concept of a ‘lake house’ to life. It combines the performance of a data warehouse, where you store data after it has already been transformed and cleaned, with a data lake, where you store all of your data in its raw form. The data in the data lake, a concept that Databricks co-founder and CEO Ali Ghodsi has long championed, is typically only transformed when it gets used. That makes data lakes cheaper, but also a bit harder to handle for users.

Image Credits: Databricks

“We’ve been saying Unified Data Analytics, which means unify the data with the analytics. So data processing and analytics, those two should be merged. But no one picked that up,” Ghodsi told me. But ‘lake house’ caught on as a term.

“Databricks has always offered data science, machine learning. We’ve talked about that for years. And with Spark, we provide the data processing capability. You can do [extract, transform, load]. That has always been possible. SQL Analytics enables you to now do the data warehousing workloads directly, and concretely, the business intelligence and reporting workloads, directly on the data lake.”

The general idea here is that with just one copy of the data, you can enable both traditional data analyst use cases (think BI) and the data science workloads (think AI) Databricks was already known for. Ideally, that makes both use cases cheaper and simpler.

The service sits on top of an optimized version of Databricks’ open-source Delta Lake storage layer to enable the service to quickly complete queries. In addition, Delta Lake also provides auto-scaling endpoints to keep the query latency consistent, even under high loads.
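
As a rough illustration of the underlying “SQL directly on the lake” pattern, here is a plain PySpark sketch that registers raw Parquet files and queries them with standard SQL. The path and schema are hypothetical; Databricks’ service layers its optimized Delta engine, auto-scaling endpoints and BI connectors on top of this kind of workflow.

```python
# Hedged illustration of querying a data lake with SQL using plain PySpark;
# not Databricks SQL Analytics itself.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("lake-sql-sketch").getOrCreate()

# Raw event files sitting in the lake (path and columns are hypothetical).
events = spark.read.parquet("/data/lake/raw/events/")
events.createOrReplaceTempView("events")

# Analysts run ordinary SQL on the lake data without copying it into a warehouse.
daily_revenue = spark.sql("""
    SELECT date(event_ts) AS day, SUM(amount) AS revenue
    FROM events
    WHERE event_type = 'purchase'
    GROUP BY date(event_ts)
    ORDER BY day
""")
daily_revenue.show()
```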

While data analysts can query these data sets directly, using standard SQL, the company also built a set of connectors to BI tools. Its BI partners include Tableau, Qlik, Looker and Thoughtspot, as well as ingest partners like Fivetran, Fishtown Analytics, Talend and Matillion.

Image Credits: Databricks

“Now more than ever, organizations need a data strategy that enables speed and agility to be adaptable,” said Francois Ajenstat, Chief Product Officer at Tableau. “As organizations are rapidly moving their data to the cloud, we’re seeing growing interest in doing analytics on the data lake. The introduction of SQL Analytics delivers an entirely new experience for customers to tap into insights from massive volumes of data with the performance, reliability and scale they need.”

In a demo, Ghodsi showed me what the new SQL Analytics workspace looks like. It’s essentially a stripped-down version of the standard code-heavy experience that Databricks users are familiar with. Unsurprisingly, SQL Analytics provides a more graphical experience that focuses more on visualizations and not Python code.

While there are already some data analysts on the Databricks platform, this obviously opens up a large new market for the company — something that would surely bolster its plans for an IPO next year.


By Frederic Lardinois

Qualcomm Ventures invests in four 5G startups

Qualcomm Ventures, Qualcomm’s investment arm, today announced four new strategic investments in 5G-related startups. These companies are private mobile network specialist Celona, mobile network automation platform Cellwize, the edge computing platform Azion and Pensando, another edge computing platform that combines its software stack with custom hardware.

The overall goal here is obviously to help jumpstart 5G use cases in the enterprise and — by extension — for consumers by investing in a wide range of companies that can build the necessary infrastructure to enable these.

“We invest globally in the wireless mobile ecosystem, with a goal of expanding our base of customers and partners — and one of the areas we’re particularly excited about is the area of 5G,” Quinn Li, a Senior VP at Qualcomm and the global head of Qualcomm Ventures, told me. “Within 5G, there are three buckets of areas we look to invest in: one is in use cases, second is in network transformation, third is applying 5G technology in enterprises.”

So far, Qualcomm Ventures has invested over $170 million in the 5G ecosystem, including this new batch. The firm did not disclose how much it invested in these four new startups, though.

Overall, this new set of companies touches upon the core areas Qualcomm Ventures is looking at, Li explained. Celona, for example, aims to make it as easy for enterprises to deploy private cellular infrastructure as it is to deploy Wi-Fi today.

“They built this platform with a cloud-based controller that leverages the available spectrum — CBRS — to be able to take the cellular technology, whether it’s LTE or 5G, into enterprises,” Li explained. “And then these enterprise use cases could be in manufacturing settings could be in schools, could be to be in hospitals, or it could be on campus for universities.”

Cellwize, meanwhile, helps automate wireless networks to make them more flexible and manageable, in part by using machine learning to tune the network based on the data it collects. One of the main investment theses for this fund, Li told me, is that wireless technology will become increasingly software-defined and Cellwize fits right into this trend. The potential customer here isn’t necessarily an individual enterprise, though, but wireless and mobile operators.

Edge computing, where Azion and Pensando play, is obviously also a hot category right now and one where 5G has some obvious advantages, so it’s maybe no surprise that Qualcomm Ventures is putting a bit of a focus on it with these two investments.

“As we move forward, [you will] see a lot of the compute moving from the cloud into the edge of the network, which allows for processing happening at the edge of the network, which allows for low latency applications to run much faster and much more efficiently,” Li said.

In total, Qualcomm Ventures has deployed $1.5 billion and made 360 investments since its launch in 2000. Some of the more successful companies the firm has invested in include unicorns like Zoom, Cloudflare, Xiaomi, Cruise Automation and Fitbit.


By Frederic Lardinois

DataFleets keeps private data useful, and useful data private, with federated learning and $4.5M seed

As you may already know, there’s a lot of data out there, and some of it could actually be pretty useful. But privacy and security considerations often put strict limitations on how it can be used or analyzed. DataFleets promises a new approach by which databases can be safely accessed and analyzed without the possibility of privacy breaches or abuse — and has raised a $4.5 million seed round to scale it up.

To work with data, you need to have access to it. If you’re a bank, that means transactions and accounts; if you’re a retailer, that means inventories and supply chains, and so on. There are lots of insights and actionable patterns buried in all that data, and it’s the job of data scientists and their ilk to draw them out.

But what if you can’t access the data? After all, there are many industries where it is not advised or even illegal to do so, such as in health care. You can’t exactly take a whole hospital’s medical records, give them to a data analysis firm, and say “sift through that and tell me if there’s anything good.” These, like many other data sets, are too private or sensitive to allow anyone unfettered access. The slightest mistake — let alone abuse — could have serious repercussions.

In recent years a few technologies have emerged that allow for something better, though: analyzing data without ever actually exposing it. It sounds impossible, but there are computational techniques for allowing data to be manipulated without the user ever actually having access to any of it. The most widely used one is called homomorphic encryption, which unfortunately produces an enormous, orders-of-magnitude reduction in efficiency — and big data is all about efficiency.

This is where DataFleets steps in. It hasn’t reinvented homomorphic encryption, but has sort of sidestepped it. It uses an approach called federated learning, where instead of bringing the data to the model, they bring the model to the data.

DataFleets integrates with both sides of a secure gap between a private database and people who want to access that data, acting as a trusted agent to shuttle information between them without ever disclosing a single byte of actual raw data.

Illustration showing how a model can be created without exposing data.

Image Credits: DataFleets

Here’s an example. Say a pharmaceutical company wants to develop a machine learning model that looks at a patient’s history and predicts whether they’ll have side effects with a new drug. A medical research facility’s private database of patient data is the perfect thing to train it. But access is highly restricted.

The pharma company’s analyst creates a machine learning training program and drops it into DataFleets, which contracts with both them and the facility. DataFleets translates the model to its own proprietary runtime and distributes it to the servers where the medical data resides; within that sandboxed environment, it runs and grows into a strapping young ML agent, which when finished is translated back into the analyst’s preferred format or platform. The analyst never sees the actual data, but has all the benefits of it.
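
Here is a stripped-down sketch of the federated pattern described above, with synthetic data: the model is trained inside each silo and only parameter updates are averaged centrally, so raw records never leave. It is purely illustrative and not DataFleets’ runtime.

```python
# Minimal federated-averaging sketch: the model goes to each private silo,
# trains locally, and only parameter updates return. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)

def make_silo(n):
    """Simulate a silo's private records: features X and a binary outcome y."""
    X = rng.normal(size=(n, 3))
    y = (X @ np.array([0.8, -0.5, 0.3]) + rng.normal(scale=0.1, size=n) > 0).astype(float)
    return X, y

silos = [make_silo(200), make_silo(150)]          # this data never leaves

def local_update(weights, X, y, lr=0.1, steps=50):
    """Train inside the silo; only the updated weights are shared."""
    w = weights.copy()
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))        # logistic regression gradient step
        w -= lr * X.T @ (p - y) / len(y)
    return w

global_w = np.zeros(3)
for _ in range(5):
    updates = [local_update(global_w, X, y) for X, y in silos]
    global_w = np.mean(updates, axis=0)           # federated averaging on the server

print("trained weights (no raw records ever shared):", np.round(global_w, 3))
```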

Screenshot of the DataFleets interface. Look, it’s the applications that are meant to be exciting.

It’s simple enough, right? DataFleets acts as a sort of trusted messenger between the platforms, undertaking the analysis on behalf of others and never retaining or transferring any sensitive data.

Plenty of folks are looking into federated learning; the hard part is building out the infrastructure for a wide-ranging enterprise-level service. You need to cover a huge amount of use cases and accept an enormous variety of languages, platforms, and techniques, and of course do it all totally securely.

“We pride ourselves on enterprise readiness, with policy management, identity access management, and our pending SOC 2 certification,” said DataFleets COO and co-founder Nick Elledge. “You can build anything on top of DataFleets and plug in your own tools, which banks and hospitals will tell you was not true of prior privacy software.”

But once federated learning is set up, all of a sudden the benefits are enormous. For instance, one of the big issues today in combating COVID-19 is that hospitals, health authorities, and other organizations around the world are having difficulty, despite their willingness, in securely sharing data relating to the virus.

Everyone wants to share, but who sends whom what, where is it kept, and under whose authority and liability? With old methods, it’s a confusing mess. With homomorphic encryption it’s useful but slow. With federated learning, theoretically, it’s as easy as toggling someone’s access.

Because the data never leaves its “home,” this approach is essentially anonymous and thus highly compliant with regulations like HIPAA and GDPR, another big advantage. Elledge notes: “We’re being used by leading healthcare institutions who recognize that HIPAA doesn’t give them enough protection when they are making a data set available for third parties.”

Of course there are less noble, but no less viable, examples in other industries: wireless carriers could make subscriber metadata available without selling out individuals; banks could sell consumer data without violating anyone in particular’s privacy; bulky datasets like video can sit where they are instead of being duplicated and maintained at great expense.

The company’s $4.5M seed round is seemingly evidence of confidence from a variety of investors (as summarized by Elledge): AME Cloud Ventures (Jerry Yang of Yahoo!) and Morado Ventures, Lightspeed Venture Partners, Peterson Ventures, Mark Cuban, LG, Marty Chavez (President of the Board of Overseers of Harvard), Stanford-StartX fund, and three unicorn founders (Rappi, Quora, and Lucid).

With only 11 full-time employees, DataFleets appears to be doing a lot with very little, and the seed round should enable rapid scaling and maturation of its flagship product. “We’ve had to turn away or postpone new customer demand to focus on our work with our lighthouse customers,” Elledge said. They’ll be hiring engineers in the U.S. and Europe to help launch the planned self-service product next year.

“We’re moving from a data ownership to a data access economy, where information can be useful without transferring ownership,” said Elledge. If his company’s bet is on target, federated learning is likely to be a big part of that going forward.


By Devin Coldewey

Freshworks (re-)launches its CRM service

Freshworks, the customer and employee engagement company that offers a range of products, from call center and customer support software to HR tools and marketing automation services, today announced the launch of its newest product: Freshworks CRM. The new service, which the company built on top of its new Freshworks Neo platform, is meant to give sales and marketing teams all of the tools they need to get a better view of their customers — with a bit of machine learning thrown in for better predictions.

Freshworks CRM is essentially a rebrand of the company’s Freshsales service, combined with the capabilities of its Freshmarketer marketing automation tool.

“Freshworks CRM unites Freshsales and Freshmarketer capabilities into one solution, which leverages an embedded customer data platform for an unprecedented and 360-degree view of the customer throughout their entire journey,” a company spokesperson told me.

The promise here is that this improved CRM solution is able to provide teams with a more complete view of their (potential) customers thanks to the unified view — and aggregated data — that the company’s Neo platform provides.

The company argues that the majority of CRM users quickly become disillusioned with their CRM service of choice — and the reason for that is because the data is poor. That’s where Freshworks thinks it can make a difference.

“Freshworks CRM delivers upon the original promise of CRM: a single solution that combines AI-driven data, insights and intelligence and puts the customer front and center of business goals,” said Prakash Ramamurthy, the company’s chief product officer. “We built Freshworks CRM to harness the power of data and create immediate value, challenging legacy CRM solutions that have failed sales teams with clunky interfaces and incomplete data.”

The idea here is to provide teams with all of their marketing and sales data in a single dashboard and provide AI-assisted insights to them to help drive their decision making, which in turn should lead to a better customer experience — and more sales. The service offers predictive lead scoring and qualification, based on a host of signals users can customize to their needs, as well as Slack and Teams integrations, built-in telephony with call recording to reach out to prospects and more. A lot of these features were already available in Freshsales, too.
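
As a hedged illustration of what predictive lead scoring generally involves, here is a tiny example that fits a model on historical lead signals and scores new ones; the signals and model are hypothetical, not Freshworks’ implementation.

```python
# Illustrative lead scoring: fit a simple model on past leads' signals
# and score new ones. Not Freshworks' actual approach.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: email opens, site visits, demo requested (0/1) for historical leads.
X_hist = np.array([[5, 12, 1], [0, 1, 0], [3, 8, 1], [1, 2, 0], [7, 20, 1], [0, 0, 0]])
converted = np.array([1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X_hist, converted)

new_leads = np.array([[4, 10, 1], [1, 1, 0]])
scores = model.predict_proba(new_leads)[:, 1]      # probability of conversion
for lead, score in zip(new_leads, scores):
    print(f"signals={lead.tolist()} -> lead score {score:.0%}")
```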

“The challenge for online education is the ‘completion rate’. To increase this, we need to understand the ‘Why’ aspect for a student to attend a course and design ‘What’ & ‘How’ to meet the personalized needs of our students so they can achieve their individual goals,” said Mamnoon Hadi Khan, the chief analytics officer at Shaw Academy. “With Freshworks CRM, Shaw Academy can track the entire student customer journey to better engage with them through our dedicated Student Success Managers and leverage AI to personalize their learning experience — meeting their objectives.”

Pricing for Freshworks CRM starts at $29 per user/month and goes up to $125 per user/month for the full enterprise plan with more advanced features.


By Frederic Lardinois

Wrike launches new AI tools to keep your projects on track

Project management service Wrike today announced a major update to its platform at its user conference that includes a lot of new AI smarts for keeping individual projects on track and on time, as well as new solutions for marketers and project management offices in large corporations. In addition, the company also launched a new budgeting feature and tweaks to the overall user experience.

The highlight of the launch, though, is without doubt the new AI and machine learning capabilities in Wrike. With more than 20,000 customers and over 2 million users on the platform, Wrike has collected a trove of data about projects that it can use to power these machine learning models.

Image Credits: Wrike

The way Wrike is now using AI falls into three categories: project risk prediction, task prioritization and tools for speeding up the overall project management workflow.

Figuring out the status of a project and knowing where delays could impact the overall project is often half the job. Wrike can now predict potential delays and alert project and team leaders when it sees events that signal potential issues. To do this, it uses basic information like start and end dates, but more importantly, it looks at the prior outcomes of similar projects to assess risks. Those predictions can then be fed into Wrike’s automation engine to trigger actions that could mitigate the risk to the project.
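
Here is a small, hypothetical sketch of that kind of risk prediction: train a classifier on prior projects’ attributes and outcomes, then score a current project. The features are invented for illustration and this is not Wrike’s model.

```python
# Hedged sketch: learn from prior projects' outcomes to flag delay risk.
# Features and model are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Past projects: planned duration (days), share of tasks overdue, team size -> delayed?
past = np.array([[30, 0.05, 4], [60, 0.30, 8], [45, 0.10, 5], [90, 0.40, 12], [20, 0.02, 3]])
delayed = np.array([0, 1, 0, 1, 0])

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(past, delayed)

current_project = np.array([[50, 0.25, 7]])
risk = clf.predict_proba(current_project)[0, 1]
if risk > 0.5:
    print(f"Delay risk {risk:.0%}: alert the project lead and trigger mitigation")
else:
    print(f"Delay risk {risk:.0%}: on track")
```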

Task prioritization does what you would expect and helps you figure out what you should focus on right now to help a project move forward. No surprises there.

What is maybe more surprising is that the team is also launching voice commands (through Siri on iOS) and Gmail-like smart replies (in English for iOS and Android). Those aren’t exactly core features of a project management tool, but as the company notes, these features help remove overall friction and reduce latencies. Another new feature that falls into this category is support for optical character recognition, which lets you scan printed and handwritten notes from your phone and attach them to tasks (iOS only).

“With more employees working from home, work and personal life are becoming intertwined,” the company argues. “As workers use AI in their personal lives, team managers and everyday users expect the smarts they’re accustomed to in consumer devices and apps to help them manage their work as well. Wrike Work Intelligence is the most comprehensive machine learning foundation that taps into tens of millions of work-related user engagements to power cross-functional collaboration to help organizations achieve operational efficiency, create new opportunities and accelerate digital transformation. Teams can focus on the work that matters most, predict and minimize delays, and cut communication latencies.”

Image Credits: Wrike

The other major new feature — at least if you’re in digital marketing — is Wrike’s new ability to pull in data about your campaigns from about 50 advertising, marketing automation and social media tools, which is then displayed inside the Wrike experience. In a fast-moving field, having all that data at your fingertips and right inside the tool where you think about how to manage these projects seems like a smart idea.

Image Credits: Wrike

Somewhat related, Wrike’s new budgeting feature also now makes it easier for teams to keep their projects within budget, using a new built-in rate card to manage project pricing and update their financials.

“We use Wrike for an extensive project management and performance metrics system,” said Shannon Buerk, the CEO of engage2learn, which tested this new budgeting tool. “We have tried other PM systems and have found Wrike to be the best of all worlds: easy to use for everyone and savvy enough to provide valuable reporting to inform our work. Converting all inefficiencies into productive time that moves your mission forward is one of the keys to a culture of engagement and ownership within an organization, even remotely. Wrike has helped us get there.”


By Frederic Lardinois

Egnyte introduces new features to help deal with security/governance during pandemic

The pandemic has put stress on companies dealing with a workforce that is mostly — and sometimes suddenly — working from home. That has led to rising needs for security and governance tooling, something that Egnyte is looking to meet with new features aimed at helping companies cope with file management during the pandemic.

Egnyte is an enterprise file storage and sharing (EFSS) company, though it has added security services and other tools over the years.

“It’s no surprise that there’s been a rapid shift to remote work, which has I believe led to mass adoption of multiple applications running on multiple clouds, and tied to that has been a nonlinear reaction of exponential growth in data security and governance concerns,” Vineet Jain, co-founder and CEO at Egnyte, explained.

There’s a lot of data at stake.

Egnyte’s announcements today are in part a reaction to the changes that COVID has brought, a mix of net-new features and capabilities that were on its road map, but accelerated to meet the needs of the changing technology landscape.

What’s new?

The company is introducing a new feature called Smart Cache to make sure that content (wherever it lives) that an individual user accesses most will be ready whenever they need it.

“Smart Cache uses machine learning to predict the content most likely to be accessed at any given site, so administrators don’t have to anticipate usage patterns. The elegance of the solution lies in that it is invisible to the end users,” Jain said. The end result of this capability could be lower storage and bandwidth costs, because the system can make this content available in an automated way only when it’s needed.
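
Here is a toy version of the idea, assuming simple access telemetry: score each file at a site by recency-weighted frequency and pre-stage the top scorers. Egnyte’s actual models aren’t public; this is only a sketch.

```python
# Illustrative caching heuristic: score files by recency and frequency of
# access at a site and pre-stage the most likely ones. Not Egnyte's model.
from datetime import datetime, timedelta
from collections import Counter

now = datetime(2020, 12, 1)
access_log = [                     # (file, site, when) -- hypothetical telemetry
    ("q4_budget.xlsx", "berlin", now - timedelta(hours=2)),
    ("q4_budget.xlsx", "berlin", now - timedelta(days=1)),
    ("brand_logo.png", "berlin", now - timedelta(days=20)),
    ("contract_v2.docx", "berlin", now - timedelta(hours=5)),
]

def score(events):
    """Frequency weighted by recency: recent hits count more."""
    scores = Counter()
    for f, _, ts in events:
        age_days = (now - ts).total_seconds() / 86400
        scores[f] += 1.0 / (1.0 + age_days)
    return scores

top = score(access_log).most_common(2)
print("pre-cache at berlin site:", [f for f, _ in top])
```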

Another new feature is email scanning and governance. As Jain points out, email is often a company’s largest data store, but it’s also a conduit for phishing attacks and malware. So Egnyte is introducing an email governance tool that keeps an eye on this content, scanning it for known malware and ransomware and blocking files from being put into distribution when it identifies something that could be harmful.

As companies move more files around it’s important that security and governance policies travel with the document, so that policies can be enforced on the file wherever it goes. This was true before COVID-19, but has only become more true as more folks work from home.

Finally, Egnyte is using machine learning for auto-classification of documents to apply policies to documents without humans having to touch them. By identifying the document type automatically, whether it has personally identifying information or it’s a budget or planning document, Egnyte can help customers auto-classify and apply policies about viewing and sharing to protect sensitive materials.
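
To show the general shape of auto-classification and policy application, here is a deliberately simple stand-in that flags documents containing likely personally identifying information and attaches a sharing policy; Egnyte uses machine learning rather than the regex rules sketched here.

```python
# Toy auto-classification pass: flag documents that look like they contain PII
# and attach a sharing policy. A stand-in for Egnyte's ML-based classification.
import re

PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def classify(text):
    hits = [name for name, pat in PII_PATTERNS.items() if pat.search(text)]
    return {"labels": hits, "policy": "restricted-sharing" if hits else "default"}

doc = "Contact jane.doe@example.com, SSN 123-45-6789, re: 2021 planning budget."
print(classify(doc))   # {'labels': ['ssn', 'email'], 'policy': 'restricted-sharing'}
```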

Egnyte is reacting to the market needs as it makes changes to the platform. While the pandemic has pushed this along, these are features that companies with documents spread out across various locations can benefit from regardless of the times.

The company is over $100 million in ARR today and grew 22% in the first half of 2020. Whether the company can accelerate that growth rate in H2 2020 is not yet clear. Regardless, Egnyte is a budding IPO candidate for 2021 if market conditions hold.


By Ron Miller

Atlassian Smarts adds machine learning layer across the company’s platform of services

Atlassian has been offering collaboration tools, often favored by developers and IT, for some time, with such stalwarts as Jira for help desk tickets, Confluence to organize your work and Bitbucket to organize your development deliverables. What it lacked was a machine learning layer across the platform to help users work smarter within and across the applications in the Atlassian family.

That changed today, when Atlassian announced it has been building that machine learning layer, called Atlassian Smarts, and is releasing several tools that take advantage of it. It’s worth noting that unlike Salesforce, which calls its intelligence layer Einstein, or Adobe, which calls its own Sensei, Atlassian chose to forgo the cutesy marketing terms and just let the technology stand on its own.

Shihab Hamid, the founder of the Smarts and Machine Learning Team at Atlassian, who has been with the company 14 years, says that they avoided a marketing name by design. “I think one of the things that we’re trying to focus on is actually the user experience and so rather than packaging or branding the technology, we’re really about optimizing teamwork,” Hamid told TechCrunch.

Hamid says that the goal of the machine learning layer is to remove the complexity involved with organizing people and information across the platform.

“Simple tasks like finding the right person or the right document becomes a challenge, or at least they slow down productivity and take time away from the creative high-value work that everyone wants to be doing, and teamwork itself is super messy and collaboration is complicated. These are human challenges that don’t really have one right solution,” he said.

He says that Atlassian has decided to solve these problems using machine learning with the goal of speeding up repetitive, time-intensive tasks. Much like Adobe or Salesforce, Atlassian has built this underlying layer of machine smarts, for lack of a better term, that can be distributed across their platform to deliver this kind of machine learning-based functionality wherever it makes sense for the particular product or service.

“We’ve invested in building this functionality directly into the Atlassian platform to bring together IT and development teams to unify work, so the Atlassian flagship products like JIRA and Confluence sit on top of this common platform and benefit from that common functionality across products. And so the idea is if we can build that common predictive capability at the platform layer we can actually proliferate smarts and benefit from the data that we gather across our products,” Hamid said.

The first pieces fit into this vision. For starters, Atlassian is offering a smart search tool that helps users find content across Atlassian tools faster by understanding who you are and how you work. “So by knowing where users work and what they work on, we’re able to proactively provide access to the right documents and accelerate work,” he said.

The second piece is more about collaboration and building teams with the best personnel for a given task. A new tool called predictive user mentions helps Jira and Confluence users find the right people for the job.

“What we’ve done with the Atlassian platform is actually baked in that intelligence, because we know what you work on and who you collaborate with, so we can predict who should be involved and brought into the conversation,” Hamid explained.

Finally, the company announced a tool specifically for Jira users that bundles together similar help requests, which should lead to faster resolution than handling them manually one at a time.

“We’re soon launching a feature in JIRA Service Desk that allows users to cluster similar tickets together, and operate on them to accelerate IT workflows, and this is done in the background using ML techniques to calculate the similarity of tickets, based on the summary and description, and so on.”
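
For a rough sense of how clustering tickets by text similarity works, here is an illustrative sketch using TF-IDF vectors and k-means; Atlassian hasn’t published its production approach, so this only demonstrates the general technique with made-up tickets.

```python
# Rough sketch of "cluster similar tickets by summary text": TF-IDF vectors
# plus a simple clustering step. Not Atlassian's production method.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

tickets = [
    "Cannot connect to VPN from home office",
    "VPN connection drops every few minutes",
    "Need new laptop for onboarding hire",
    "Laptop request for new starter in marketing",
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(tickets)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for cluster in set(labels):
    group = [t for t, l in zip(tickets, labels) if l == cluster]
    print(f"cluster {cluster}: {group}")
```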

All of this was made possible by the company’s previous shift from mostly on-premises to the cloud and the flexibility that gave it to build new tooling that crosses the entire platform.

Today’s announcements are just the start of what Atlassian hopes will be a slew of new machine learning-fueled features being added to the platform in the coming months and years.


By Ron Miller