CockroachDB, the database that just won’t die

There is an art to engineering, and sometimes engineering can transform art. For Spencer Kimball and Peter Mattis, those two worlds collided when they created the wildly successful open-source graphics program GIMP as college students at Berkeley.

That project was so successful that when the two joined Google in 2002, Sergey Brin and Larry Page personally stopped by to tell the new hires how much they liked it and explained how they used the program to create the first Google logo.

In terms of good fortune in the corporate hierarchy, when you get this type of recognition in a company such as Google, there’s only one way you can go — up. They went from rising stars to stars at Google, becoming the go-to guys on the Infrastructure Team. They could easily have looked forward to a lifetime of lucrative employment.

But Kimball, Mattis and another Google employee, Ben Darnell, wanted more — a company of their own. To realize their ambitions, they created Cockroach Labs, the business entity behind their ambitious open-source database CockroachDB. Can some of the smartest former engineers in Google’s arsenal upend the world of databases in a market spotted with the gravesites of storage dreams past? That’s what we are here to find out.

Berkeley Software Distribution

Mattis and Kimball were roommates at Berkeley majoring in computer science in the early-to-mid-1990s. In addition to their usual studies, they also became involved with the eXperimental Computing Facility (XCF), an organization of undergraduates with a keen, almost obsessive interest in CS.


By Danny Crichton

Databricks launches SQL Analytics

AI and data analytics company Databricks today announced the launch of SQL Analytics, a new service that makes it easier for data analysts to run their standard SQL queries directly on data lakes. And with that, enterprises can now easily connect their business intelligence tools like Tableau and Microsoft’s Power BI to these data repositories as well.

SQL Analytics will be available in public preview on November 18.

In many ways, SQL Analytics is the product Databricks has long been looking to build, and one that brings its concept of a ‘lake house’ to life. It combines the performance of a data warehouse, where you store data after it has already been transformed and cleaned, with a data lake, where you store all of your data in its raw form. The data in the data lake, a concept that Databricks’ co-founder and CEO Ali Ghodsi has long championed, is typically only transformed when it gets used. That makes data lakes cheaper, but also a bit harder for users to handle.

Image Credits: Databricks

“We’ve been saying Unified Data Analytics, which means unify the data with the analytics. So data processing and analytics, those two should be merged. But no one picked that up,” Ghodsi told me. But ‘lake house’ caught on as a term.

“Databricks has always offered data science, machine learning. We’ve talked about that for years. And with Spark, we provide the data processing capability. You can do [extract, transform, load]. That has always been possible. SQL Analytics enables you to now do the data warehousing workloads directly, and concretely, the business intelligence and reporting workloads, directly on the data lake.”

The general idea here is that with just one copy of the data, you can enable both traditional data analyst use cases (think BI) and the data science workloads (think AI) Databricks was already known for. Ideally, that makes both use cases cheaper and simpler.
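The economics of that idea are easy to sketch: one raw copy of the data serves both workloads. Below is a minimal, illustrative sketch in plain Python, with an in-memory SQLite table standing in for the SQL engine and a list of dicts standing in for the data lake; none of the names here are Databricks APIs.

```python
import sqlite3

# Toy "data lake": raw, untransformed event records (illustrative only).
raw_events = [
    {"region": "us", "product": "shoes", "revenue": 120.0},
    {"region": "us", "product": "hats", "revenue": 80.0},
    {"region": "eu", "product": "shoes", "revenue": 200.0},
]

# BI-style workload: a standard SQL aggregate over the same records.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (region TEXT, product TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO events VALUES (:region, :product, :revenue)", raw_events
)
revenue_by_region = dict(
    conn.execute("SELECT region, SUM(revenue) FROM events GROUP BY region")
)

# Data-science-style workload: compute a feature directly from the raw copy,
# with no second, warehouse-shaped copy of the data to maintain.
mean_revenue = sum(e["revenue"] for e in raw_events) / len(raw_events)

print(revenue_by_region)  # {'eu': 200.0, 'us': 200.0}
```

The point of the sketch is only that both consumers read the same records; in the real product, Spark and Delta Lake play the roles that the list and SQLite play here.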

The service sits on top of an optimized version of Databricks’ open-source Delta Lake storage layer, which enables it to complete queries quickly. In addition, Delta Lake provides auto-scaling endpoints to keep query latency consistent, even under high loads.

While data analysts can query these data sets directly, using standard SQL, the company also built a set of connectors to BI tools. Its BI partners include Tableau, Qlik, Looker and ThoughtSpot, as well as ingest partners like Fivetran, Fishtown Analytics, Talend and Matillion.

Image Credits: Databricks

“Now more than ever, organizations need a data strategy that enables speed and agility to be adaptable,” said Francois Ajenstat, Chief Product Officer at Tableau. “As organizations are rapidly moving their data to the cloud, we’re seeing growing interest in doing analytics on the data lake. The introduction of SQL Analytics delivers an entirely new experience for customers to tap into insights from massive volumes of data with the performance, reliability and scale they need.”

In a demo, Ghodsi showed me what the new SQL Analytics workspace looks like. It’s essentially a stripped-down version of the standard code-heavy experience that Databricks users are familiar with. Instead, SQL Analytics provides a more graphical experience that focuses on visualizations rather than Python code.

While there are already some data analysts on the Databricks platform, this obviously opens up a large new market for the company — something that would surely bolster its plans for an IPO next year.


By Frederic Lardinois

ThoughtSpot hauls in $248M Series D on $1.95B valuation

ThoughtSpot was started by a bunch of ex-Googlers looking to bring the power of search to data. Seven years later, the company is growing fast, sporting a fat valuation of almost $2 billion and looking ahead to a possible IPO. Today it announced a hefty $248 million Series D round as it continues on its journey.

Investors include Silver Lake Waterman, Silver Lake’s late-stage growth capital fund along with existing investors Lightspeed Venture Partners, Sapphire Ventures and Geodesic Capital. Today’s funding brings the total raised to $554 million, according to the company.

The company wants to help customers bring speed to data analysis by answering natural language questions about the data without having to understand how to formulate a SQL query. As a person enters a question, ThoughtSpot translates it into SQL, then displays a chart with data related to the question, all almost instantly (at least in the demo).

It doesn’t stop there, though. It also uses artificial intelligence to understand intent and help come up with the exact correct answer. ThoughtSpot CEO Sudheesh Nair says that this artificial intelligence underpinning is key to the product. As he explained, if you are looking for the answer to a specific question like “What is the profit margin of red shoes in Portland?” there won’t be multiple answers. There is only one answer, and that’s where artificial intelligence really comes into play.

“The bar on delivering that kind of answer is very high and because of that, understanding intent is critical. We use AI for that. You could ask, ‘How did we do with red shoes in Portland?’ I could ask, ‘What is the profit margin of red shoes in Portland?’ The system needs to know that we both are asking the same question. So there’s a lot of AI that goes behind it to understand the intent,” Nair explained.
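Nair’s example can be sketched in a few lines. The code below is a deliberately crude illustration of the intent problem, not ThoughtSpot’s actual algorithm; the entity catalog, the schema and the SQL are all invented for the example.

```python
# A tiny "catalog" of entities the hypothetical system knows about.
KNOWN_ENTITIES = {"red", "shoes", "portland"}

def intent_key(question: str) -> frozenset:
    """Reduce a question to the set of catalog entities it mentions."""
    words = question.lower().rstrip("?").split()
    return frozenset(w for w in words if w in KNOWN_ENTITIES)

def to_sql(question: str) -> str:
    """Map a recognized intent to one canonical SQL query (invented schema)."""
    if intent_key(question) == frozenset({"red", "shoes", "portland"}):
        return ("SELECT SUM(profit) / SUM(revenue) AS margin FROM sales "
                "WHERE product = 'red shoes' AND city = 'Portland'")
    raise ValueError("intent not recognized")

# The two differently worded questions from the article resolve to the
# same SQL, which is exactly the behavior Nair describes.
q1 = "What is the profit margin of red shoes in Portland?"
q2 = "How did we do with red shoes in Portland?"
assert to_sql(q1) == to_sql(q2)
```

A real system obviously needs far more than keyword matching — models of synonyms, metrics and context — which is where the AI Nair describes comes in.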

Image: ThoughtSpot

ThoughtSpot gets answers to queries by connecting to a variety of internal systems like HR, CRM and ERP, and it uses all of this data to answer the question as best it can. So far, it appears to be working. The company has almost 250 large-company customers and is on a run rate of close to $100 million.

Nair said that the company didn’t necessarily need the money with $100 million still in the bank, but he saw an opportunity, and he seized it. He says the money gives him a great deal of flexibility moving forward including the possibility of acquiring companies to fill in missing pieces or to expand the platform’s capabilities. It also will allow him to accelerate growth. Plus, he sees the capital markets possibly tightening next year and he wanted to strike while the opportunity was in front of him.

Nair definitely sees the company going public at some point. “With these kind of resources behind us, it actually opens up an opportunity for us to do any sort of IPO that we want. I do think that a company like this will benefit from going public because Global 2000 kind of customers, where we have most of our business, appreciate the transparency and the stability represented by public companies,” he said.

He added, “And with $350 million in the bank, it’s totally [possible to] IPO, which means that a year and a half from now if we are ready to take the company public, we can actually have all options open including a direct listing, potentially. I’m not saying we will do that, but I’m saying that this kind of funding behind us, we have all those options open.”


By Ron Miller

Clari platform aims to unify go-to-market operations data

Clari started out as a company that wanted to give sales teams more information about their sales process than could be found in the CRM database. Today, the company announced a much broader platform, one that can provide insight across sales, marketing and customer service to give a more unified view of a company’s go-to-market operations, all enhanced by AI.

Company co-founder and CEO Andy Byrne says this involves pulling together a variety of data and giving each department the insight to improve their mission. “We are analyzing large volumes of data found in various revenue systems — sales, marketing, customer success, etc. — and we’re using that data to provide a new platform that’s connecting up all of the different revenue departments,” Byrne told TechCrunch.

For sales, that would mean driving more revenue. For marketing, it would involve more targeted plans to drive more sales, and for customer success it would be about increasing customer retention and reducing churn.

Screenshot: Clari

The company’s original idea when it launched in 2012 was to look at a range of data that touched the sales process, such as email, calendars and the CRM database, to bring together a broader view of sales than you could get by looking at the basic customer data stored in the CRM alone. The Clari data could tell reps things like which deals were most likely to close and which were at risk.

“We were taking all of these signals that had been historically disconnected from each other and we were connecting it all into a new interface for sales teams that’s very different than a CRM,” Byrne said.

Over time, that involved using AI and machine learning to make connections in the data that humans might not have been seeing. The company also found that customers were using the product to look at processes adjacent to sales, and they decided to formalize that and build connectors to relevant parts of the go-to-market system like marketing automation tools from Marketo or Eloqua and customer tools such as Dialpad, Gong.io and Salesloft.

With Clari’s approach, companies can get a unified view without manually pulling all this data together. The goal is to provide customers with a broad view of the go-to-market operation that isn’t possible when looking at siloed systems.

The company has experienced tremendous growth over the last year, leaping from 80 customers to 250. These include Okta and Alteryx, two companies that went public in recent years. Clari is based in the Bay Area and has around 120 employees. It has raised over $60 million; the most recent round was a $35 million Series C last May, led by Tenaya Capital.


By Ron Miller

Rockset launches out of stealth with $21.5M investment

Rockset, a startup that came out of stealth today, announced $21.5 million in previous funding and the launch of its new data platform, which is designed to simplify much of the data processing that normally stands between raw data and fast querying and application building.

As for the funding, it includes $3 million in seed money they got when they started the company, and a more recent $18.5 million Series A, which was led by Sequoia with participation from Greylock.

Jerry Chen, a partner at Greylock, sees a team that understands the needs of modern developers and data scientists, one that was born in the cloud and can handle a lot of the activities that data scientists have traditionally had to handle manually. “Rockset can ingest any data from anywhere and let developers and data scientists query it using standard SQL. No pipelines. No glue. Just real time operational apps,” he said.

Company co-founder and CEO Venkat Venkataramani is a former Facebook engineer, a role in which he learned a bit about processing data at scale. He wanted to start a company that would help data scientists get to insights more quickly.

Data typically requires a lot of massaging before data scientists and developers can make use of it, and Rockset has been designed to bypass much of that hard work, which can take days, weeks or even months to complete.

“We’re building out our service with innovative architecture and unique capabilities that allows full-featured fast SQL directly on raw data. And we’re offering this as a service. So developers and data scientists can go from useful data in any shape, any form to useful applications in a matter of minutes. And it would take months today,” Venkataramani explained.

To do this, you simply connect your data set, wherever it lives, to your AWS account, and Rockset handles the data ingestion, schema building and data cleaning. It also makes sure you have the right amount of infrastructure to manage the level of data you are working with. In other words, it can potentially simplify highly complex data processing tasks so you can start working with the raw data almost immediately using SQL queries.

To achieve the speed, Venkataramani says they use a number of indexing techniques. “Our indexing technology essentially tries to bring the best of search engines and columnar databases into one. When we index the data, we build more than one type of index behind the scenes so that a wide spectrum of pre-processing can be automatically fast out of the box,” he said. That takes the burden of processing and building data pipelines off of the user.
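The idea of maintaining several index types over one ingested data set can be sketched in a few lines. The toy illustration below is not Rockset’s implementation; the record layout and field names are invented, and real converged indexes are far more sophisticated.

```python
from collections import defaultdict

# One batch of ingested records (invented for illustration).
records = [
    {"id": 1, "city": "Portland", "sales": 120},
    {"id": 2, "city": "Seattle", "sales": 80},
    {"id": 3, "city": "Portland", "sales": 50},
]

# Search-engine-style inverted index: (field, value) -> matching record ids,
# good for selective point lookups.
inverted = defaultdict(set)
for rec in records:
    for field, value in rec.items():
        inverted[(field, value)].add(rec["id"])

# Columnar-database-style layout: field -> column of values,
# good for scanning one field across all records.
columns = defaultdict(list)
for rec in records:
    for field, value in rec.items():
        columns[field].append(value)

# A point lookup uses the inverted index...
portland_ids = inverted[("city", "Portland")]  # {1, 3}
# ...while an aggregate scans a single column.
total_sales = sum(columns["sales"])            # 250
```

Building both structures at ingest time is what lets very different query shapes be fast "out of the box," at the cost of extra work and storage during indexing.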

The company was founded in 2016. Chen and Sequoia partner Mike Vernal joined the Rockset board under the terms of the Series A funding, which closed last August.


By Ron Miller

Anaplan hits the ground running with strong stock market debut up over 42 percent

You might think that Anaplan CEO Frank Calderoni would have had a few sleepless nights this week. His company picked a bad week to go public, as market instability rocked tech stocks. Still, he wasn’t worried, and today the company had, by any measure, a successful debut, with the stock soaring over 42 percent. As of 4 p.m. ET, it hit $24.18, up from the IPO price of $17. Not a bad way to launch your company.

Stock Chart: Yahoo Finance

“I feel good because it really shows the quality of the company, the business model that we have and how we’ve been able to build a growing successful business, and I think it provides us with a tremendous amount of opportunity going forward,” Calderoni told TechCrunch.

Calderoni joined the company a couple of years ago, and seemed to emerge from Silicon Valley central casting: a former CFO at Red Hat and Cisco, with stints at IBM and SanDisk. He said he often wished a tool like Anaplan had been around when he was in charge of a several-thousand-person planning operation at Cisco. He indicated that while they were successful, it could have been even more so with a tool like Anaplan.

“The planning phase has not had much change in several decades. I’ve been part of it and I’ve dealt with a lot of the pain. And so having something like Anaplan, I see it’s really being a disrupter in the planning space because of the breadth of the platform that we have. And then it goes across organizations to sales, supply chain, HR and finance, and as we say, really connects the data, the people and the plan to make for better decision making as a result of all that,” he said.

Calderoni describes Anaplan as a planning and data analysis tool. In his previous jobs, he says, he spent a ton of time just gathering data and making sure they had the right data, but precious little time on analysis. In his view, Anaplan lets companies concentrate more on the crucial analysis phase.

“Anaplan allows customers to really spend their time on what I call forward planning where they can start to run different scenarios and be much more predictive, and hopefully be able to, as we’ve seen a lot of our customers do, forecast more accurately,” he said.

Anaplan was founded in 2006 and raised almost $300 million along the way. It achieved a lofty valuation of $1.5 billion in its last round, a $60 million raise in 2017. The company has just under 1,000 customers, including Del Monte, VMware, Box and United.

Calderoni says although the company has 40 percent of its business outside the US, there are plenty of markets left to conquer and they hope to use today’s cash infusion in part to continue to expand into a worldwide company.


By Ron Miller

Tableau gets new pricing plans and a data preparation tool

Data analytics platform Tableau today announced the launch of both a new data preparation product and a new subscription pricing plan.

Currently, Tableau offers desktop plans for users who want to analyze their data locally, a server plan for businesses that want to deploy the service on-premises or on a cloud platform, and a fully hosted online plan. Prices for these range from $35 to $70 per user per month. The new pricing plans don’t focus so much on where the data is analyzed but on the analyst’s role. The new Creator, Explorer and Viewer plans are tailored toward these different user experiences. They all include access to the new Tableau Prep data preparation tool, Tableau Desktop and new web authoring capabilities, and they are available both on-premises and in the cloud.

Existing users can switch their server or desktop subscriptions to the new release today and then assign each user a creator, explorer or viewer role. As the name indicates, the new viewer role is meant for users who mostly consume dashboards and visualizations but don’t create their own. The explorer role is for those who need access to a pre-defined data set, and the creator role is for analysts and power users who need access to all of Tableau’s capabilities.

“Organizations are facing the urgent need to empower their entire workforce to help drive more revenue, reduce costs, provide better service, increase productivity, discover the next scientific breakthrough and even save lives,” said Adam Selipsky, CEO at Tableau, in today’s announcement. “Our new offerings will help entire organizations make analytics ubiquitous, enabling them to tailor the capabilities required for every employee.”

As for the new data preparation tool, the general idea here is to give users a visual way to shape and clean their data, something that’s especially important as businesses now often pull in data from a variety of sources. Tableau Prep can automate some of this, but the most important aspect of the service is that it gives users a visual interface for creating these kinds of workflows. Prep includes support for all the standard Tableau data connectors and lets users perform calculations, too.
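The kind of workflow Prep builds visually can be approximated in code. The sketch below is plain Python, not a Tableau API; the field names and cleaning rules are invented to show the typical shape-then-calculate steps such a tool automates.

```python
# Raw rows as they might arrive from a messy source (invented data).
raw_rows = [
    {"name": " Alice ", "units": "3", "price": "9.50"},
    {"name": "Bob", "units": "", "price": "4.00"},  # missing units value
]

def clean(row):
    """Shape and clean one row: trim text, coerce types, fill gaps."""
    return {
        "name": row["name"].strip(),
        "units": int(row["units"] or 0),  # treat missing units as 0
        "price": float(row["price"]),
    }

def add_revenue(row):
    """Calculated field, analogous to a calculation step in a prep tool."""
    return {**row, "revenue": row["units"] * row["price"]}

# The "workflow": each step feeds the next, for every row.
prepared = [add_revenue(clean(r)) for r in raw_rows]
```

A visual prep tool lets analysts assemble exactly this kind of pipeline by dragging steps together instead of writing the code.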

“Our customers often tell us that they love working with Tableau, but struggle when data is in the wrong shape for analysis,” said Francois Ajenstat, Chief Product Officer at Tableau. “We believe data prep and data analysis are two sides of the same coin that should be deeply integrated and look forward to bringing fun, easy data prep to everyone regardless of technical skill set.”


By Frederic Lardinois