Amazon S3 Storage Lens gives IT visibility into complex S3 usage

As your S3 storage requirements grow, it gets harder to understand exactly what you have, and that is especially true when your storage spans multiple regions. This has broad implications for administrators, who have been forced to build their own solutions to get that missing visibility. AWS changed that this week when it announced Amazon S3 Storage Lens, a new product for understanding highly complex S3 storage environments.

The tool provides analytics that help you understand what’s happening across your S3 object storage installations, and take action when needed. “This is the first cloud storage analytics solution to give you organization-wide visibility into object storage, with point-in-time metrics and trend lines as well as actionable recommendations,” the company wrote in a blog post announcing the service.

Amazon S3 Storage Lens Console

Image Credits: Amazon

The idea is to present a set of 29 metrics in a dashboard that help you “discover anomalies, identify cost efficiencies and apply data protection best practices,” according to the company. IT administrators can get a view of their storage landscape and can drill down into specific instances when necessary, such as if there is a problem that requires attention. The product comes out of the box with a default dashboard, but admins can also create their own customized dashboards, and even export S3 Lens data to other Amazon tools.
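For administrators who prefer to manage these dashboards as code, configurations can also be created programmatically through the S3 Control API. Here is a minimal sketch using boto3; the account ID and configuration name are placeholders, and a real configuration would typically add filters and a metrics export.

```python
import boto3

# S3 Storage Lens configurations are managed through the S3 Control API.
# The account ID and config name below are placeholders.
ACCOUNT_ID = "111122223333"
CONFIG_ID = "org-wide-dashboard"

client = boto3.client("s3control")

# Define a minimal dashboard with activity metrics enabled at both the
# account and bucket level; filters and exports can be added later.
client.put_storage_lens_configuration(
    ConfigId=CONFIG_ID,
    AccountId=ACCOUNT_ID,
    StorageLensConfiguration={
        "Id": CONFIG_ID,
        "IsEnabled": True,
        "AccountLevel": {
            "ActivityMetrics": {"IsEnabled": True},
            "BucketLevel": {"ActivityMetrics": {"IsEnabled": True}},
        },
    },
)

# Read the configuration back to confirm it was stored.
resp = client.get_storage_lens_configuration(
    ConfigId=CONFIG_ID, AccountId=ACCOUNT_ID
)
print(resp["StorageLensConfiguration"]["IsEnabled"])
```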

For companies with complex storage requirements, as in thousands or even tens of thousands of S3 buckets, that have had to kludge together ways to understand what’s happening across those systems, this gives them a single view across it all.

S3 Storage Lens is now available in all AWS regions, according to the company.


By Ron Miller

Databricks launches SQL Analytics

AI and data analytics company Databricks today announced the launch of SQL Analytics, a new service that makes it easier for data analysts to run their standard SQL queries directly on data lakes. And with that, enterprises can now easily connect their business intelligence tools like Tableau and Microsoft’s Power BI to these data repositories as well.

SQL Analytics will be available in public preview on November 18.

In many ways, SQL Analytics is the product Databricks has long been looking to build, one that brings its concept of a ‘lake house’ to life. It combines the performance of a data warehouse, where you store data after it has been transformed and cleaned, with a data lake, where you store all of your data in its raw form. The data in the data lake, a concept that Databricks’ co-founder and CEO Ali Ghodsi has long championed, is typically only transformed when it gets used. That makes data lakes cheaper, but also a bit harder to handle for users.

Image Credits: Databricks

“We’ve been saying Unified Data Analytics, which means unify the data with the analytics. So data processing and analytics, those two should be merged. But no one picked that up,” Ghodsi told me. But ‘lake house’ caught on as a term.

“Databricks has always offered data science, machine learning. We’ve talked about that for years. And with Spark, we provide the data processing capability. You can do [extract, transform, load]. That has always been possible. SQL Analytics enables you to now do the data warehousing workloads directly, and concretely, the business intelligence and reporting workloads, directly on the data lake.”

The general idea here is that with just one copy of the data, you can enable both traditional data analyst use cases (think BI) and the data science workloads (think AI) Databricks was already known for. Ideally, that makes both use cases cheaper and simpler.

The service sits on top of an optimized version of Databricks’ open-source Delta Lake storage layer, which lets it complete queries quickly. It also provides auto-scaling endpoints to keep query latency consistent, even under high loads.
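To make the idea concrete, here is a minimal PySpark sketch of the kind of warehouse-style SQL that can now run directly against Delta Lake data; the table path and column names are hypothetical.

```python
from pyspark.sql import SparkSession

# On Databricks a SparkSession already exists as `spark`; building one
# here just keeps the sketch self-contained. The Delta table path and
# column names are hypothetical.
spark = SparkSession.builder.appName("sql-on-the-lake").getOrCreate()

# Register raw Delta Lake data as a queryable view...
spark.read.format("delta").load("/mnt/lake/raw/orders") \
     .createOrReplaceTempView("orders")

# ...then run an ordinary warehouse-style SQL query directly against it.
top_regions = spark.sql("""
    SELECT region, SUM(amount) AS revenue
    FROM orders
    GROUP BY region
    ORDER BY revenue DESC
    LIMIT 10
""")
top_regions.show()
```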

While data analysts can query these data sets directly, using standard SQL, the company also built a set of connectors to BI tools. Its BI partners include Tableau, Qlik, Looker and Thoughtspot, as well as ingest partners like Fivetran, Fishtown Analytics, Talend and Matillion.

Image Credits: Databricks

“Now more than ever, organizations need a data strategy that enables speed and agility to be adaptable,” said Francois Ajenstat, Chief Product Officer at Tableau. “As organizations are rapidly moving their data to the cloud, we’re seeing growing interest in doing analytics on the data lake. The introduction of SQL Analytics delivers an entirely new experience for customers to tap into insights from massive volumes of data with the performance, reliability and scale they need.”

In a demo, Ghodsi showed me what the new SQL Analytics workspace looks like. It’s essentially a stripped-down version of the standard code-heavy experience that Databricks users are familiar with. Unsurprisingly, SQL Analytics provides a more graphical experience, focused on visualizations rather than Python code.

While there are already some data analysts on the Databricks platform, this obviously opens up a large new market for the company — something that would surely bolster its plans for an IPO next year.


By Frederic Lardinois

Rockset announces $40M Series B as data analytics solution gains momentum

Rockset, a cloud-native analytics company, announced a $40 million Series B investment today led by Sequoia with help from Greylock, the same two firms that financed its Series A. The startup has now raised a total of $61.5 million, according to the company.

As co-founder and CEO Venkat Venkataramani told me at the time of the Series A in 2018, there is a lot of manual work involved in getting data ready to use and it acts as a roadblock to getting to real insight. He hoped to change that with Rockset.

“We’re building out our service with innovative architecture and unique capabilities that allows full-featured fast SQL directly on raw data. And we’re offering this as a service. So developers and data scientists can go from useful data in any shape, any form to useful applications in a matter of minutes. And it would take months today,” he told me in 2018.

In fact, “Rockset automatically builds a converged index on any data — including structured, semi-structured, geographical and time series data — for high-performance search and analytics at scale,” the company explained.
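As a rough sketch of what “fast SQL directly on raw data, offered as a service” looks like from a developer’s seat, here is ordinary SQL posted to a hosted query endpoint. The URL, payload shape and collection name are assumptions for illustration, not Rockset’s documented API.

```python
import requests

# Hypothetical endpoint and payload shape for a hosted SQL-over-REST
# query API; consult the vendor's docs for the real contract.
API_URL = "https://api.example-analytics-service.com/v1/orgs/self/queries"
API_KEY = "YOUR_API_KEY"

# Ordinary SQL over semi-structured raw data: a converged index is what
# lets the filter and the aggregate both run without pre-built ETL.
sql = """
    SELECT city, COUNT(*) AS events
    FROM events_collection
    WHERE _event_time > CURRENT_TIMESTAMP() - INTERVAL 1 HOUR
    GROUP BY city
    ORDER BY events DESC
"""

resp = requests.post(
    API_URL,
    headers={"Authorization": f"ApiKey {API_KEY}"},
    json={"sql": {"query": sql}},
    timeout=30,
)
resp.raise_for_status()
for row in resp.json().get("results", []):
    print(row)
```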

It seems to be resonating with investors and customers alike, as the company raised a healthy B round and business is booming. Rockset supplied a few metrics to illustrate this. For starters, revenue grew 290% in the last quarter. While the company didn’t provide any baseline numbers for that percentage growth, it is obviously substantial.

In addition, the startup reports adding hundreds of new users, again not nailing down any specific numbers, and queries on the platform are up 313%. Without specifics, it’s hard to know what that means, but that seems like healthy growth for an early stage startup, especially in this economy.

Mike Vernal, a partner at Sequoia, sees a company helping to get data to work faster than other solutions, which require a lot of handling first. “Rockset, with its innovative new approach to indexing data, has quickly emerged as a true leader for real-time analytics in the cloud. I’m thrilled to partner with the company through its next phase of growth,” Vernal said in a statement.

The company was founded in 2016 by the creators of RocksDB. The startup had previously raised a $3 million seed round when it launched and the $18.5 million A round in 2018.


By Ron Miller

SimilarWeb raises $120M for its AI-based market intelligence platform for sites and apps

Israeli startup SimilarWeb has made a name for itself with an AI-based platform that lets sites and apps track and understand traffic, not just on their own properties but on those of their competitors. Now it’s taking the next step in its growth. The startup has raised $120 million, funding it will use to continue expanding its platform both through acquisitions and through investment in its own R&D, with a focus on providing more analytics services to larger enterprises alongside its current base of individuals and companies of all sizes that do business on the web.

Co-led by ION Crossover Partners and Viola Growth, the round doubles the total amount that the startup has raised to date to $240 million. Or Offer, SimilarWeb’s founder and CEO, said in an interview that it was not disclosing its valuation this time around except to say that his company is now “playing in the big pool.” It counts more than half of the Fortune 100 as customers, with Walmart, P&G, Adidas and Google, among them.

For some context, it hit an $800 million valuation in its last equity round, in 2017.

SimilarWeb’s technology competes with other analytics and market intelligence providers ranging from the likes of Nielsen and ComScore through to the Apptopias of the world in that, at its most basic level, it provides a dashboard to users that provides insights into where people are going on desktop and mobile. Where it differs, Offer said, is in how it gets to its information, and what else it’s doing in the process.

For starters, it focuses not just on how many people are visiting, but also on what is triggering the activity, the “why” behind it. Using a host of AI techniques such as machine learning algorithms and deep learning (like a lot of tech out of Israel, it’s built by people with deep expertise in this area), Offer says that SimilarWeb crunches data from a number of different sources to extrapolate its insights.

He declined to give much detail on those sources, but told me that he cheered the arrival of privacy gates and cookie lists for helping ferret out, expose and sometimes eradicate some of the more nefarious “analytics” services out there. He said that SimilarWeb has not been affected at all by that swing to more data protection, since it’s not an analytics service, strictly speaking, and doesn’t sniff data on sites in the same way. It’s also exploring widening its data pool, he added:

“We are always thinking about what new signals we could use,” he said. “Maybe they will include CDNs. But it’s like Google with its rankings in search. It’s a never ending story to try to get the highest accuracy in the world.”

The global health pandemic has driven a huge amount of activity on the web this year, with people turning to sites and apps not just for leisure — something to do while staying indoors, to offset all the usual activities that have been cancelled — but for business, whether it be consumers using e-commerce services for shopping, or workers taking everything online and to the cloud to continue operating.

That has also seen a boost of business for all the various companies that help the wheels turn on that machine, SimilarWeb included.

“Consumer behavior is changing dramatically, and all companies need better visibility,” said Offer. “It started with toilet paper and hand sanitizer, then moved to desks and office chairs, but now it’s not just e-commerce but everything. Think about big banks, whose business was 70% offline and is now 70-80% online. Companies are building and undergoing a digital transformation.”

That in turn is driving more people to understand how well their web presence is working, he said, with the basic big question being: “What is my market share, and how does that compare to my competition?” Everything is about digital visibility, especially in times of change.

Like many other companies, SimilarWeb did see an initial dip in business, Offer said, and to that end the company has taken on some debt under Israel’s equivalent of the Paycheck Protection Program, to help safeguard jobs that had to be furloughed. But he added that most of the customers it had prior to the pandemic kicking off are now back, along with customers from new categories that hadn’t been active much before, like automotive portals.

That change in customer composition is also opening some doors of opportunity for the company. Offer noted that in recent months, a lot of large enterprises — which might have previously used SimilarWeb’s technology indirectly, via a consultancy, for example — have been coming to the company direct.

“We’ve started a new advisory service [where] our own expert works with a big customer that might have more deep and complex questions about the behaviour we are observing. They are questions all big businesses have right now.” The service sounds like a partly educational effort, teaching companies that are not necessarily digital-first to be more proactive, and partly a consulting one.

New customer segments, and new priorities in the world of business, are two of the things that drove this round, say investors.

“SimilarWeb was always an incredible tool for any digital professional,” said Gili Iohan of ION Crossover Partners, in a statement. “But over the last few months it has become apparent that traffic intelligence — the unparalleled data and digital insight that SimilarWeb offers — is an absolute essential for any company that wants to win in the digital world.”

As for acquisitions, SimilarWeb has historically made these to accelerate its technical march. For example, in 2015 it acquired Quettra to move deeper into mobile analytics, and it acquired Swayy to move into content discovery insights (key for e-commerce intelligence). Offer would not go into much detail about what it has identified as further targets, but given how many companies are currently building tech in this area, there might be a case for some consolidation around bigger platforms that combine some of those features and functionality. Offer said that it was looking at “companies with great data and digital intelligence, with a good product. There are a lot of opportunities right now on the table.”

The company will also be doing some hiring, with plans to add 200 more people globally by January (it has around 600 employees today).

“Since we joined the company three years ago, SimilarWeb has executed a strategic transformation from a general-purpose measurement platform to vertical-based solutions, which has significantly expanded its market opportunity and generated immense customer value,” said Harel Beit-On, Founder and General Partner at Viola Growth, in a statement. “With a stellar management team of accomplished executives, we believe this round positions the company to own the digital intelligence category, and capitalize on the acceleration of the digital era.”


By Ingrid Lunden

New Zendesk dashboard delivers customer service data in real time

Zendesk has been offering customers the ability to track customer service statistics for some time, but it has always been a look back. Today, the company announced a new product called Explorer Enterprise that lets customers capture that valuable info in real time, and share it with anyone in the organization, whether they have a Zendesk license or not.

While it has had Explorer in place for a couple of years now, Jon Aniano, senior VP of product at Zendesk, says the new enterprise product is in response to growing customer data requirements. “We now have a way to deliver what we call Live Team Dashboards, which delivers real time analytics directly to Zendesk users,” Aniano told TechCrunch.

In the days before COVID that meant displaying these on big monitors throughout the customer service center. Today, as we deal with the pandemic, and customer service reps are just as likely to be working from home, it means giving management the tools they need to understand what’s happening in real time, a growing requirement for Zendesk customers as they scale, regardless of the pandemic.

“What we’ve found over the last few years is that our customers’ appetite for operational analytics is insatiable, and as customers grow, as customer service needs get more complex, the demands on a contact center operator or customer service team are higher and higher, and teams really need new sets of tools and new types of capabilities to meet what they’re trying to do in delivering customer service at scale in the world,” Aniano told TechCrunch.

One of the reasons for this is the shift from phone and email as the primary ways of accessing customer service to messaging tools like WhatsApp. “With the shift to messaging, there are new demands on contact centers to be able to handle real-time interactions at scale with their customers,” he said.

Meeting that kind of demand requires the real-time analytics Zendesk is providing with this announcement, arming managers with the data they need to put their customer service resources where they are needed most in the moment.

But Zendesk is also giving customers the ability to share these statistics with anyone in the company. “Users can share a dashboard or historical report with anybody in the company regardless of whether they have access to Zendesk. They can share it in Slack, or they can embed a dashboard anywhere where other people in the company would like to have access to those metrics,” Aniano explained.

The new service will be available starting on August 31st for $19 per user per month.


By Ron Miller

Mode raises $33M to supercharge its analytics platform for data scientists

Data science is the name of the game these days for companies that want to improve their decision making by tapping the information they are already amassing in their apps and other systems. And today, a startup called Mode Analytics, which has built a platform incorporating machine learning, business intelligence and big data analytics to help data scientists fulfil that task, is announcing $33 million in funding to continue making its platform ever more sophisticated.

Most recently, for example, the company has started to introduce tools (including SQL and Python tutorials) for less technical users, specifically those in product teams, so that they can structure queries that data scientists can subsequently execute faster and with more complete responses — important for the many follow-up questions that arise when a business intelligence process has been run. Mode claims that its tools can help produce answers to data queries in minutes.

This Series D is being led by SaaS specialist investor H.I.G. Growth Partners, with previous investors Valor Equity Partners, Foundation Capital, REV Venture Partners, and Switch Ventures all participating. Valor led Mode’s Series C in February 2019, while Foundation and REV respectively led its A and B rounds.

Mode is not disclosing its valuation, but co-founder and CEO Derek Steer confirmed in an interview that it was “absolutely” an up-round.

For some context, PitchBook notes that last year its valuation was $106 million. The company now has a customer list that it says covers 52% of the Forbes 500, including Anheuser-Busch, Zillow, Lyft, Bloomberg, Capital One, VMware, and Conde Nast. It says that to date it has processed 830 million query runs and 170 million notebook cell runs for 300,000 users. (Pricing is based on a freemium model, with a free “Studio” tier and Business and Enterprise tiers priced based on size and use.)

Mode has been around since 2013, when it was co-founded by Steer, Benn Stancil (Mode’s current president) and Josh Ferguson (initially the CTO and now chief architect).

Steer said the impetus for the startup came out of gaps in the market that the three had found through years of experience at other companies.

Specifically, when all three were working together at Yammer (they were early employees and stayed on after the Microsoft acquisition), they were part of a larger team building custom data analytics tools for Yammer. At the time, Steer said Yammer was paying $1 million per year to subscribe to Vertica (acquired by HP in 2011) to run it.

They saw an opportunity to build a platform that could provide similar kinds of tools — encompassing things like SQL Editors, Notebooks, and reporting tools and dashboards — to a wider set of users.

“We and other companies like Facebook and Google were building analytics internally,” Steer recalled, “and we knew that the world wanted to work more like these tech companies. That’s why we started Mode.”

All the same, he added, “people were not clear exactly about what a data scientist even was.”

Indeed, Mode’s growth so far has mirrored the rise of data science overall: the discipline, and the business case for employing data scientists to figure out what is “going on” beyond the day-to-day by tapping all the data that’s being amassed in the process of just doing business, have both expanded. That means Mode’s addressable market has also been growing.

But even as the trove of potential buyers of Mode’s products has been growing, so has the number of companies chasing the same opportunity. There has been a big swing toward data science and big data analytics in the last several years, with a number of tech companies building tools to help those who are less technical “become data scientists” by introducing more intuitive interfaces like drag-and-drop features and natural language queries.

They include the likes of Sisense (which has been growing its analytics power with acquisitions like Periscope Data), Eigen (focusing on specific verticals like financial and legal queries), Looker (acquired by Google) and Tableau (acquired by Salesforce).

Mode’s approach up to now has been closer to that of another competitor, Alteryx, focusing on building tools that are still aimed primarily at helping data scientists themselves. You have any number of database tools on the market today, Steer noted: “Snowflake, Redshift, BigQuery, Databricks, take your pick.” The key now is in providing tools to those using those databases to do their work faster and better.

That pitch, and how well the company executes on it, is what has given Mode success with both customers and investors.

“Mode goes beyond traditional Business Intelligence by making data faster, more flexible and more customized,” said Scott Hilleboe, MD, H.I.G. Growth Partners, in a statement. “The Mode data platform speeds up answers to complex business problems and makes the process more collaborative, so that everyone can build on the work of data analysts. We believe the company’s innovations in data analytics uniquely position it to take the lead in the Decision Science marketplace.”

Steer said that the fundraise had been planned, long before the coronavirus outbreak, to start in February, which meant that it was timed as badly as it could have been. Mode still raised what it wanted to in a couple of months — “a good raise by any standard,” he noted — even if it’s likely that the valuation suffered a bit in the process. “Pitching while the stock market is tanking was terrifying and not something I would repeat,” he added.

Given how many acquisitions there have been in this space, Steer confirmed that Mode too has been approached a number of times, but it’s staying put for now. (And no, he wouldn’t tell me who has been knocking, except to say that it’s large companies for whom analytics is an “adjacency” to bigger businesses, which is to say, the very large tech companies have approached Mode.)

“The reason we haven’t considered any acquisition offers is because there is just so much room,” Steer said. “I feel like this market is just getting started, and I would only consider an exit if I felt like we were handicapped by being on our own. But I think we have a lot more growing to do.”


By Ingrid Lunden

Google Cloud’s new BigQuery Omni will let developers query data in GCP, AWS and Azure

At its virtual Cloud Next ’20 event, Google today announced a number of updates to its cloud portfolio, but the public alpha launch of BigQuery Omni is probably the highlight of this year’s event. Powered by Google Cloud’s Anthos hybrid-cloud platform, BigQuery Omni allows developers to use the BigQuery engine to analyze data that sits in multiple clouds, including those of Google Cloud competitors like AWS and Microsoft Azure — though for now, the service only supports AWS, with Azure support coming later.

Using a unified interface, developers can analyze this data locally without having to move data sets between platforms.
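In practice, the developer-facing surface is the ordinary BigQuery client and standard SQL; what changes with Omni is that the dataset lives in another cloud’s region. A minimal sketch with the google-cloud-bigquery Python client, where the project, dataset and table names are hypothetical:

```python
from google.cloud import bigquery

# The project, dataset and table names are hypothetical; with BigQuery
# Omni the dataset is simply created in an AWS region (e.g.
# aws-us-east-1), and the same client and SQL work unchanged.
client = bigquery.Client()

query = """
    SELECT device_type, COUNT(*) AS sessions
    FROM `my-project.aws_dataset.clickstream`
    GROUP BY device_type
    ORDER BY sessions DESC
"""

# The query runs where the data sits; no cross-cloud copy is needed.
for row in client.query(query).result():
    print(row.device_type, row.sessions)
```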

“Our customers store petabytes of information in BigQuery, with the knowledge that it is safe and that it’s protected,” said Debanjan Saha, the GM and VP of Engineering for Data Analytics at Google Cloud, in a press conference ahead of today’s announcement. “A lot of our customers do many different types of analytics in BigQuery. For example, they use the built-in machine learning capabilities to run real-time analytics and predictive analytics. […] A lot of our customers who are very excited about using BigQuery in GCP are also asking, ‘how can they extend the use of BigQuery to other clouds?’ ”

Image Credits: Google

Google has long said that it believes that multi-cloud is the future — something that most of its competitors would probably agree with, though they all would obviously like you to use their tools, even if the data sits in other clouds or is generated off-platform. It’s the tools and services that help businesses to make use of all of this data, after all, where the different vendors can differentiate themselves from each other. Maybe it’s no surprise then, given Google Cloud’s expertise in data analytics, that BigQuery is now joining the multi-cloud fray.

“With BigQuery Omni customers get what they wanted,” Saha said. “They wanted to analyze their data no matter where the data sits and they get it today with BigQuery Omni.”

Image Credits: Google

He noted that Google Cloud believes that this will help enterprises break down their data silos and gain new insights into their data, all while allowing developers and analysts to use a standard SQL interface.

Today’s announcement is also a good example of how Google’s bet on Anthos is paying off by making it easier for the company to not just allow its customers to manage their multi-cloud deployments but also to extend the reach of its own products across clouds. This also explains why BigQuery Omni isn’t available for Azure yet, given that Anthos for Azure is still in preview, while AWS support became generally available in April.


By Frederic Lardinois

Fishtown Analytics raises $12.9M Series A for its open-source analytics engineering tool

Philadelphia-based Fishtown Analytics, the company behind the popular open-source data engineering tool dbt, today announced that it has raised a $12.9 million Series A round led by Andreessen Horowitz, with the firm’s general partner Martin Casado joining the company’s board.

“I wrote this blog post in early 2016, essentially saying that analysts needed to work in a fundamentally different way,” Fishtown founder and CEO Tristan Handy told me, when I asked him about how the product came to be. “They needed to work in a way that much more closely mirrored the way the software engineers work and software engineers have been figuring this shit out for years and data analysts are still like sending each other Microsoft Excel docs over email.”

The dbt open-source project forms the basis of this. It allows anyone who can write SQL queries to transform data and then load it into their preferred analytics tools. As such, it sits in-between data warehouses and the tools that load data into them on one end, and specialized analytics tools on the other.
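As a hedged sketch of that workflow (the model and source names here are invented): a dbt model is just a SQL file, with dbt’s ref() macro wiring up dependencies, that the dbt CLI compiles and executes inside the warehouse.

```python
import pathlib
import subprocess

# A minimal, hypothetical dbt model: plain SQL (plus dbt's ref() macro
# for dependencies) that dbt materializes as a table or view in the
# warehouse. Model and source names are made up for illustration.
model_sql = """
SELECT
    order_id,
    customer_id,
    amount / 100.0 AS amount_usd   -- cents-to-dollars transformation
FROM {{ ref('raw_orders') }}
WHERE status = 'completed'
"""

models_dir = pathlib.Path("models")
models_dir.mkdir(exist_ok=True)
(models_dir / "clean_orders.sql").write_text(model_sql)

# dbt compiles the SQL, resolves dependencies and runs it in-warehouse:
# the "T" of ELT happens after the data has already been loaded.
subprocess.run(["dbt", "run", "--models", "clean_orders"], check=True)
```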

As Casado noted when I talked to him about the investment, data warehouses have now made it affordable for businesses to store all of their data before it is transformed. So what was traditionally “extract, transform, load” (ETL) has now become “extract, load, transform” (ELT). Andreessen Horowitz is already invested in Fivetran, which helps businesses move their data into their warehouses, so it makes sense for the firm to also tackle the other side of this business.

“Dbt is, as far as we can tell, the leading community for transformation and it’s a company we’ve been tracking for at least a year,” Casado said. He also argued that data analysts — unlike data scientists — are not really catered to as a group.

Before this round, Fishtown hadn’t raised a lot of money, aside from a small SAFE round from Amplify, even though it has been around for a few years now.

But Handy argued that the company needed this time to prove that it was on to something and build a community. That community now consists of more than 1,700 companies that use the dbt project in some form and over 5,000 people in the dbt Slack community. Fishtown also now has over 250 dbt Cloud customers and the company signed up a number of big enterprise clients earlier this year. With that, the company needed to raise money to expand and also better service its current list of customers.

“We live in Philadelphia. The cost of living is low here and none of us really care to make a quadro-billion dollars, but we do want to answer the question of how do we best serve the community,” Handy said. “And for the first time, in the early part of the year, we were like, holy shit, we can’t keep up with all of the stuff that people need from us.”

The company plans to expand the team from 25 to 50 employees in 2020, and with those hires it plans to improve and expand the product, especially its IDE for data analysts, which Handy admitted could use a bit more polish.


By Frederic Lardinois

Oribi brings its web analytics platform to the U.S.

Oribi, an Israeli startup promising to democratize web analytics, is now launching in the United States.

While we’ve written about a wide range of new or new-ish analytics companies, founder and CEO Iris Shoor said that most of them aren’t built for Oribi’s customers.

“A lot of companies are more focused on the high end,” Shoor told me. “Usually these solutions are very much based on a lot of technical resources and integrations — these are the Mixpanels and Heap Analytics and Adobe Marketing Clouds.”

She said that Oribi, on the other hand, is designed for small and medium businesses that don’t have large technical teams: “They have digital marketing strategies that are worth a few hundred thousand dollars a month, they have very large activity, but they don’t have a team for it. And I would say that all of them are using Google Analytics.”

Shoor described Oribi as designed specifically “to compete with Google Analytics” by allowing everyone on the team to get the data they need without requiring developers to write new code for every event they want to track.

Event Correlations

In fact, if you use Oribi’s plugins for platforms like WordPress and Shopify, there’s no coding at all involved in the process. Apparently, that’s because Oribi is already tracking every major event in the customer journey. It also allows anyone on the team to define the conversion goals that they want to focus on — again, with no coding required.

Shoor contrasted Oribi with analytics platforms that simply provide “more and more data” but don’t help customers understand what to do with that data.

“We’ve created something that is much more clean,” she said. “We give them insights of what’s working; in the background, we create all these different queries and correlations about which part of the funnels are broken and where they can optimize.”

There are big businesses using Oribi already — including Audi, Sony and Crowne Plaza — but the company is now turning its attention to U.S. customers. Shoor said Oribi isn’t opening an office in the United States right away, but there are plans to do so in the next year.


By Anthony Ha

Tableau update uses AI to increase speed to insight

Tableau was acquired by Salesforce earlier this year for $15.7 billion, but long before that, the company had been working on its Fall update, and today it announced several new tools including a new feature called ‘Explain Data’ that uses AI to get to insight quickly.

“What Explain Data does is it moves users from understanding what happened to why it might have happened by automatically uncovering and explaining what’s going on in your data. So what we’ve done is we’ve embedded a sophisticated statistical engine in Tableau, that when launched automatically analyzes all the data on behalf of the user, and brings up possible explanations of the most relevant factors that are driving a particular data point,” Tableau chief product officer, Francois Ajenstat explained.

He added that what this really means is that it saves users time by automatically doing the analysis for them, and it should help them do better analysis by removing biases and helping them dive deep into the data in an automated fashion.

Explain Data Superstore extreme value

Image: Tableau

Ajenstat says this is a major improvement, in that previously users would have had to do all of this work manually. “So a human would have to go through every possible combination, and people would find incredible insights, but it was manually driven. Now with this engine, they are able to essentially drive automation to find those insights automatically for the users,” he said.

He says this has two major advantages. First of all, because it’s AI-driven it can deliver meaningful insight much faster, but it also gives a more rigorous perspective on the data.
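As a toy illustration of the underlying idea, not Tableau’s actual statistical engine, the sketch below scans each categorical column of a tiny, made-up dataset to see which segment most over-indexes inside an anomalous data point:

```python
import pandas as pd

# Toy data: California's sales total looks unusually high, and we want
# to know which factor best "explains" it. All values are invented.
df = pd.DataFrame({
    "state":    ["CA", "CA", "CA", "CA", "TX", "TX"],
    "category": ["Chairs", "Chairs", "Phones", "Paper", "Paper", "Phones"],
    "segment":  ["Corporate", "Corporate", "Consumer", "Consumer", "Consumer", "Corporate"],
    "sales":    [9000, 8500, 300, 200, 250, 400],
})

anomaly = df[df["state"] == "CA"]    # the data point to explain
baseline = df[df["state"] != "CA"]   # everything else

for col in ["category", "segment"]:
    # Compare each value's share of sales inside vs. outside the anomaly.
    inside = anomaly.groupby(col)["sales"].sum() / anomaly["sales"].sum()
    outside = baseline.groupby(col)["sales"].sum() / baseline["sales"].sum()
    lift = (inside - outside.reindex(inside.index).fillna(0)).sort_values(ascending=False)
    print(f"{col}: '{lift.index[0]}' over-indexes by {lift.iloc[0]:.0%}")
```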

In addition, the company announced a new Catalog feature, which provides bread crumbs showing the source of the data, so users know where it came from and whether it’s relevant or trustworthy.

Finally, the company announced a new server management tool that helps companies with broad Tableau deployment across a large organization to manage those deployments in a more centralized way.

All of these features are available starting today for Tableau customers.


By Ron Miller

Video platform Kaltura adds advanced analytics

You may not be familiar with Kaltura‘s name, but chances are you’ve used the company’s video platform at some point or another, given that it offers a variety of video services for enterprises, educational institutions and video-on-demand platforms, including HBO, Philips, SAP, Stanford and others. Today, the company announced the launch of an advanced analytics platform for its enterprise and educational users.

This new platform, dubbed Kaltura Analytics for Admins, will provide its users with features like user-level reports. This may sound like a minor feature, since you probably don’t care about the exact details of a given user’s interactions with your video, but it will allow businesses to link this kind of behavior to other metrics. With this, you could measure the ROI of a given video by linking video watch time and sales, for example. This kind of granularity wasn’t possible with the company’s existing analytics systems. Companies and schools using the product will also get access to time period comparisons to help admins identify trends, deeper technology and geolocation reports, as well as real-time analytics for live events.

eCDN QoS dashboard
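As a hypothetical sketch of the kind of analysis user-level reports enable, the following joins per-user watch time with sales figures from another system to approximate the per-video ROI the company describes; all names and numbers are invented.

```python
import pandas as pd

# Invented per-user watch-time data (the kind user-level reports expose)
# and sales data from a separate system, keyed by the same user IDs.
watch = pd.DataFrame({
    "user_id": [1, 2, 3, 4],
    "video_id": ["demo-a", "demo-a", "demo-b", "demo-b"],
    "minutes_watched": [12.0, 3.5, 8.0, 0.5],
})
sales = pd.DataFrame({
    "user_id": [1, 2, 3, 4],
    "purchase_usd": [250.0, 0.0, 120.0, 0.0],
})

# User-level granularity is what makes this join possible at all.
joined = watch.merge(sales, on="user_id")

# Rough per-video "ROI" signal: revenue attributed per minute watched.
per_video = joined.groupby("video_id").agg(
    minutes=("minutes_watched", "sum"),
    revenue=("purchase_usd", "sum"),
)
per_video["revenue_per_minute"] = per_video["revenue"] / per_video["minutes"]
print(per_video)
```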

“Video is a unique data type in that it has deep engagement indicators for measurement, both around video creation – what types of content are being created by whom, as well as around video consumption and engagement with content – what languages were selected for subtitles, what hot-spots were clicked upon in video,” said Michal Tsur, President & General Manager of Enterprise and Learning at Kaltura. “Analytics is a very strategic area for our customers. Both for tech companies who are building on our VPaaS, as well as for large organizations and universities that use our video products for learning, communication, collaboration, knowledge management, marketing and sales.”

Tsur also tells me that the company is looking at how to best use machine learning to give its customers even deeper insights into how people watch videos — and potentially even offer predictive analytics in the long run.


By Frederic Lardinois

Google continues to preach multi-cloud approach with Looker acquisition

When Google announced yesterday morning that it was buying Looker for $2.6 billion, you couldn’t blame some of the company’s 1,600 customers if they worried a bit about whether Looker would continue its multi-cloud approach. But Google Cloud chief Thomas Kurian made clear that the company will continue to support an open approach to its latest purchase when it joins the fold later this year.

It’s consistent with the messaging from Google Next, the company’s cloud conference in April. It was looking to portray itself as the more open cloud. It was going to be friendlier to open source projects, running them directly on Google Cloud. It was going to provide a way to manage your workloads wherever they live with Anthos.

Ray Wang, founder and principal analyst at Constellation Research, says that in a multi-cloud world, Looker represented one of the best choices, and that could be why Google went after it. “Looker’s strengths include its centralized data-modeling and governance, which promotes consistency and reuse. It runs on top of modern cloud databases including Google BigQuery, AWS Redshift and Snowflake,” Wang told TechCrunch. He added, “They wanted to acquire a tool that is as easy to use as Microsoft Power BI and as deep as Tableau.”

Patrick Moorhead, founder and principal analyst at Moor Insights & Strategy, also sees this deal as part of a consistent multi-cloud message from Google. “I do think it is in alignment with its latest strategy outlined at Google Next. It has talked about rich analytics tools that could pull data from disparate sources,” he said.

Kurian pushing the multi-cloud message

Google Cloud CEO Thomas Kurian, who took over from Diane Greene at the end of last year, was careful to emphasize the company’s commitment to multi-cloud and multi-database support in comments to media and analysts yesterday. “We first want to reiterate, we’re very committed to maintaining local support for other clouds, as well as to serve data from multiple databases because customers want a single analytics foundation for their organization, and they want to be able to in the analytics foundation, look at data from multiple data sources. So we’re very committed to that,” Kurian said yesterday.

From a broader customer perspective, Kurian sees Looker providing customers with a single way to access and visualize data. “One of the things that is challenging for organizations in operationalizing business intelligence, that we feel that Looker has done really well, is it gives you a single place to model your data, define your data definitions — like what’s revenue, who’s a gold customer or how many service tickets are open — and allows you then to blend data across individual data silos, so that as an organization, you’re working off a consistent set of metrics,” Kurian explained.

In a blog post announcing the deal, Looker CEO Frank Bien sought to ease concerns that the company might move away from the multi-cloud, multi-database support. “For customers and partners, it’s important to know that today’s announcement solidifies ours as well as Google Cloud’s commitment to multi-cloud. Looker customers can expect continuing support of all cloud databases like Amazon Redshift, Azure SQL, Snowflake, Oracle, Microsoft SQL Server, Teradata and more,” Bien wrote in the post.

No anti-trust concerns

Kurian also emphasized that this deal shouldn’t attract the attention of anti-trust regulators, who have been sniffing around the big tech companies like Google/Alphabet, Apple and Amazon as of late. “We’re not buying any data along with this transaction. So it does not introduce any concentration risk in terms of concentrating data. Secondly, there are a large number of analytic tools in the market. So by just acquiring Looker, we’re not further concentrating the market in any sense. And lastly, all the other cloud players also have their own analytic tools. So it represents a further strengthening of our competitive position relative to the other players in the market,” he explained. Not to mention its pledge to uphold the multi-cloud and multi-database support, which should show it is not doing this strictly to benefit Google or to draw customers specifically to GCP.

Just this week, the company announced a partnership with Snowflake, the cloud data warehouse startup that has raised almost a billion dollars, to run on Google Cloud Platform. It already runs on AWS and Microsoft Azure. In fact, Wang suggested that Snowflake could be next on Google’s radar as it tries to build a multi-cloud soup-to-nuts analytics offering.

Regardless, with Looker the company has a data analytics tool to complement its data processing tools, and together the two companies should provide a fairly comprehensive data solution. If they truly keep it multi-cloud, that should keep current customers happy, especially those who work with tools outside of the Google Cloud ecosystem or simply want to maintain their flexibility.


By Ron Miller

Google to acquire analytics startup Looker for $2.6 billion

Google made a big splash this morning when it announced it’s going to acquire Looker, a hot analytics startup that’s raised over $280 million. It’s paying $2.6 billion for the privilege and adding the company to Google Cloud.

Thomas Kurian, the man who was handed the reins to Google Cloud at the end of last year, sees the crucial role data plays today for organizations, especially as they move to the cloud. “The combination of Google Cloud and Looker will enable customers to harness data in new ways to drive their digital transformation,” Kurian said in a statement.

Google Cloud has been mired in third place in the cloud infrastructure market, and grabbing Looker gives it an analytics company with a solid track record. The last time I spoke to Looker, it was grabbing a hefty $103 million in funding on a $1.6 billion valuation. Today’s price is a nice even billion over that.

As I wrote at the time, Looker’s CEO Frank Bien wasn’t all that interested in bragging about valuations though. “He reported that the company has 1,600 customers now and just crossed the $100 million revenue run rate, a significant milestone for any enterprise SaaS company. What’s more, Bien reports revenue is still growing 70 percent year over year, so there’s plenty of room to keep this going.”

Perhaps it’s not a coincidence that Google went after Looker, as the two companies had a strong existing partnership and 350 common customers, according to Google.

Per usual, the deal is subject to regulatory approval, but it is expected to close later this year if all goes well.


By Ron Miller

Sumo Logic announces $110M Series G investment on valuation over $1B

Sumo Logic, a cloud data analytics and log analysis company, announced a $110 million Series G investment today. The company indicated that its valuation was “north of a billion dollars,” but wouldn’t give an exact figure.

Today’s round was led by Battery Ventures with participation from new investors Tiger Global Management and Franklin Templeton. Other unnamed existing investors also participated, according to the company. Today’s investment brings the total raised to $340 million.

When we spoke to Sumo Logic CEO Ramin Sayer at the time of its $75 million Series F in 2017, he indicated the company was on its way to becoming a public company. While that hasn’t happened yet, he says it is still the goal for the company, and investors wanted in on that before it happened.

“We don’t need the capital. We had plenty of capital already, but when you bring on crossover investors and others at this stage of a company, they have minimum check sizes and they have a lot of appetite to help you as you get ready to address a lot of the challenges and opportunities as you become a public company,” he said.

He says the company will be investing the money in continuing to develop the platform, whether that’s through acquisitions, which of course the money would help with, or through the company’s own engineering efforts.

The IPO idea remains a goal, but Sayer was not willing or able to commit to when that might happen. The company clearly has plenty of runway now to last for quite some time.

“We could go out now if we wanted to, but we made a decision that that’s not what we’re going to do, and we’re going to continue to double down and invest, and therefore bring some more capital in to give us more optionality for strategic tuck-ins and product IP expansion, international expansion — and then look to the public markets [after] we do that,” he said.

Dharmesh Thakker, general partner at Battery Ventures, says his firm likes Sumo Logic’s approach and sees a big opportunity ahead with this investment. “We have been tracking the Sumo Logic team for some time, and admire the company’s early understanding of the massive cloud-native opportunity and the rise of new, modern application architectures,” he said in a statement.

The company crossed the $100 million revenue mark last year and has 2,000 customers, including Airbnb, Anheuser-Busch and Samsung. It competes with companies like Splunk, Scalyr and Loggly.


By Ron Miller

Why Daimler moved its big data platform to the cloud

Like virtually every big enterprise company, a few years ago, the German auto giant Daimler decided to invest in its own on-premises data centers. And while those aren’t going away anytime soon, the company today announced that it has successfully moved its on-premises big data platform to Microsoft’s Azure cloud. This new platform, which the company calls eXtollo, is Daimler’s first major service to run outside of its own data centers, though it’ll probably not be the last.

As Guido Vetter, Daimler’s head of the corporate center of excellence for advanced analytics and big data, told me, the company started getting interested in big data about five years ago. “We invested in technology — the classical way, on-premise — and got a couple of people on it. And we were investigating what we could do with data because data is transforming our whole business as well,” he said.

By 2016, the size of the organization had grown to the point where a more formal structure was needed to enable the company to handle its data at a global scale. At the time, the buzzword was ‘data lakes’ and the company started building its own in order to build out its analytics capacities.

Electric Line-Up, Daimler AG

“Sooner or later, we hit the limits as it’s not our core business to run these big environments,” Vetter said. “Flexibility and scalability are what you need for AI and advanced analytics and our whole operations are not set up for that. Our backend operations are set up for keeping a plant running and keeping everything safe and secure.” But in this new world of enterprise IT, companies need to be able to be flexible and experiment — and, if necessary, throw out failed experiments quickly.

So about a year and a half ago, Vetter’s team started the eXtollo project to bring all the company’s activities around advanced analytics, big data and artificial intelligence into the Azure Cloud and just over two weeks ago, the team shut down its last on-premises servers after slowly turning on its solutions in Microsoft’s data centers in Europe, the U.S. and Asia. All in all, the actual transition between the on-premises data centers and the Azure cloud took about nine months. That may not seem fast, but for an enterprise project like this, that’s about as fast as it gets (and for a while, it fed all new data into both its on-premises data lake and Azure).

If you work for a startup, then all of this probably doesn’t seem like a big deal, but for a more traditional enterprise like Daimler, even just giving up control over the physical hardware where your data resides was a major culture change and something that took quite a bit of convincing. In the end, the solution came down to encryption.

“We needed the means to secure the data in the Microsoft data center with our own means that ensure that only we have access to the raw data and work with the data,” explained Vetter. In the end, the company decided to use the Azure Key Vault to manage and rotate its encryption keys. Indeed, Vetter noted that knowing that the company had full control over its own data was what allowed this project to move forward.
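For a sense of what that looks like in practice, here is a minimal sketch using the azure-identity and azure-keyvault-keys Python clients; the vault URL and key name are placeholders, and rotation is modeled here as creating a new version of the key under the same name.

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.keys import KeyClient

# Placeholder vault URL; the key name below is also invented.
VAULT_URL = "https://my-vault.vault.azure.net"

client = KeyClient(vault_url=VAULT_URL, credential=DefaultAzureCredential())

# Create a customer-managed RSA key: only principals granted vault
# access can use it, which keeps control of the raw data with the owner.
key = client.create_rsa_key("extollo-data-key", size=2048)
print(key.name, key.properties.version)

# "Rotating" the key means creating a new version under the same name;
# services referencing the key by name pick up the latest version.
rotated = client.create_rsa_key("extollo-data-key", size=2048)
print(rotated.properties.version)
```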

Vetter tells me that the company obviously looked at Microsoft’s competitors as well, but he noted that his team didn’t find a compelling offer from other vendors in terms of functionality and the security features that it needed.

Today, Daimler’s big data unit uses tools like HDInsight and Azure Databricks, which cover more than 90 percent of the company’s current use cases. In the future, Vetter also wants to make it easier for less experienced users to use self-service tools to launch AI and analytics services.

While cost is often a factor that counts against the cloud, since renting server capacity isn’t cheap, Vetter argues that this move will actually save the company money, and that storage costs, especially, are going to be cheaper in the cloud than in its on-premises data center (and chances are that Daimler, given its size and prestige as a customer, isn’t exactly paying the same rack rate that others are paying for the Azure services).

As with so many big data AI projects, predictions are the focus of much of what Daimler is doing. That may mean looking at a car’s data and error code and helping the technician diagnose an issue or doing predictive maintenance on a commercial vehicle. Interestingly, the company isn’t currently bringing any of its own IoT data from its plants to the cloud. That’s all managed in the company’s on-premises data centers because it wants to avoid the risk of having to shut down a plant because its tools lost the connection to a data center, for example.


By Frederic Lardinois