DREAMTECH NEWS

Vianai emerges with $50M seed and a mission to simplify machine learning tech

You don’t see a startup get a $50 million seed round all that often, but such was the case with Vianai, an early-stage startup launched by Vishal Sikka, former Infosys managing director and SAP executive. The company launched recently with a big check and a vision to transform machine learning.

Just this week, the startup had a coming out party at Oracle Open World where Sikka delivered one of the keynotes and demoed the product for attendees. Over the last couple of years, since he left Infosys, Sikka has been thinking about the impact of AI and machine learning on society and the way it is being delivered today. He didn’t much like what he saw.

It’s worth noting that Sikka got his Ph.D. from Stanford with a specialty in AI in 1996, so this isn’t something that’s new to him. What’s changed, as he points out, is the growing compute power and increasing amounts of data, all fueling the current AI push inside business. What he saw when he began exploring how companies implement AI and machine learning today was a lot of tooling that, in his view, was far more complex than it needed to be.

He saw dense Jupyter notebooks filled with code. He said that if you looked at a typical machine learning model and stripped away all of the code, what you found was a series of mathematical expressions underlying the model. He had a vision of making model-building more about the math, while building a highly visual data science platform from the ground up.

The company has been iterating on a solution over the last year with two core principles in mind: explorability and explainability, which involve interacting with the data and presenting it in a way that helps the user reach their goal faster than the current crop of model-building tools.

“It is about making the system reactive to what the user is doing, making it completely explorable, while making it possible for the developer to experiment with what’s happening in a way that is incredibly easy. To make it explainable means being able to go back and forth with the data and the model, using the model to understand the phenomenon that you’re trying to capture in the data,” Sikka told TechCrunch.

He says the tool isn’t just aimed at data scientists; it’s about business users and data scientists sitting down and iterating together to get the answers they are seeking, whether that’s finding a way to reduce user churn or discovering fraud. These models do not live in a data science vacuum. They all have a business purpose, and he believes the only way to be successful with AI in the enterprise is to have both groups at the same table, working with the software to solve a specific problem while taking advantage of one another’s expertise.

For Sikka, this means refining the actual problem you are trying to solve. “AI is about problem solving, but before you do the problem solving, there is also a [challenge around] finding and articulating a business problem that is relevant to businesses and that has a value to the organization,” he said.

He is very clear that he isn’t looking to replace humans, but instead wants to use AI to augment human intelligence to solve actual human problems. He points out that this product is not automated machine learning (AutoML), which he considers a deeply flawed idea. “We are not here to automate the jobs of data science practitioners. We are here to augment them,” he said.

As for that massive seed round, Sikka knew it would take a big investment to build a vision like this, and with his reputation and connections, he felt it would be better to get one big investment up front, and he could concentrate on building the product and the company. He says that he was fortunate enough to have investors who believe in the vision, even though as he says, no early business plan survives the test of reality.

For now, the company has a new product and plenty of money in the bank to get to profitability, which he states is his ultimate goal. Sikka could have taken a job running a large organization, but like many startup founders, he saw a problem, and he had an idea how to solve it. That was a challenge he couldn’t resist pursuing.


By Ron Miller

Google is investing $3.3B to build clean data centers in Europe

Google announced today that it was investing 3 billion euro (approximately $3.3 billion USD) to expand its data center presence in Europe. What’s more, the company pledged the data centers would be environmentally friendly.

This new investment is in addition to the $7 billion the company has invested in the EU since 2007, but today’s announcement focused as much on Google’s commitment to running its data centers on clean energy as on the data centers themselves.

In a blog post announcing the new investment, CEO Sundar Pichai made it clear that the company is focused on running these data centers on carbon-free energy, pointing out that he was in Finland today to discuss sustainable economic development in conjunction with a carbon-free future with Prime Minister Antti Rinne.

Of the 3 billion euros the company plans to spend, it will invest 600 million to expand its presence in Hamina, Finland, which he wrote “serves as a model of sustainability and energy efficiency for all of our data centers.” Further, the company announced 18 new renewable energy deals earlier this week, encompassing a total of 1,600 megawatts in the US, South America and Europe.

In the blog post, Pichai outlined how the new data center projects in Europe would include some of these previously announced projects:

Today I’m announcing that nearly half of the megawatts produced will be here in Europe, through the launch of 10 renewable energy projects. These agreements will spur the construction of more than 1 billion euros in new energy infrastructure in the EU, ranging from a new offshore wind project in Belgium, to five solar energy projects in Denmark, and two wind energy projects in Sweden. In Finland, we are committing to two new wind energy projects that will more than double our renewable energy capacity in the country, and ensure we continue to match almost all of the electricity consumption at our Finnish data center with local carbon-free sources, even as we grow our operations.

The company is also investing in new skills training, so people will have the tools to handle the jobs these data centers and other high-tech employers will require. The company claims it has previously trained 5 million people in Europe in crucial digital skills, free of charge, and recently opened a Google skills hub in Helsinki.

It’s obviously not a coincidence that the company is making an announcement related to clean energy on Global Climate Strike Day, a day when people from around the world are walking out of schools and off their jobs to encourage world leaders and businesses to take action on the climate crisis. Google is attempting to answer the call with these announcements.


By Ron Miller

New Relic launches platform for developers to build custom apps

When Salesforce launched Force.com in 2007 as a place for developers to build applications on top of Salesforce, it was a pivotal moment for the concept of SaaS platforms. Since then, it’s been said that every enterprise SaaS company wants to be a platform play. Today, New Relic achieved that goal when it announced the New Relic One Observability Platform at the company’s FutureStack conference in New York City.

Company co-founder and CEO Lew Cirne explained that a platform, by definition, is something that other people can build software on. “What we are shipping is a set of capabilities to enable our customers and partners to build their own observability applications on the very same platform that we’ve built our product,” Cirne told TechCrunch.

He sees these third-party developers building applications to enable additional innovations on top of the New Relic platform that perhaps New Relic’s engineers couldn’t because of time and resource constraints. “There are so many use cases for this data, far more than the engineers that we have at our company could ever do, but a community of people who can do this together can totally unlock the power of this data,” Cirne said.

Like many platform companies, New Relic found that as it expanded its own offering, it needed a common set of internal services for its developers to build against. Once that platform was in place, it became possible to open it up so that external developers could access the same services as the New Relic engineering team.

“What we have is metrics, logs, events and traces coming from our customers’ digital software. So they have access to all that data in real time to build applications, measure the health of their digital business and build applications on top of that. Just as Force.com was the thing that really transformed Salesforce as a company into being a strategic vendor, we think the same thing will happen for us with what we’re offering,” he said.

As a proof point for the platform, the company is releasing a dozen open source tools built on top of the New Relic platform today in conjunction with the announcement. One example is an application to help identify where companies could be over-spending on their AWS bills. “We’re actually finding 30-40% savings opportunities for them where they’re provisioning larger servers than they need for the workload. Based on the data that we’re analyzing, we’re recommending what the right size deployment should be,” Cirne said.
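
The mechanics of such a recommendation can be sketched with a simple heuristic: compare observed peak utilization against instance capacity and pick the cheapest size that still leaves headroom. The instance specs, prices and 30% headroom factor below are illustrative assumptions, not New Relic’s actual logic.

```python
# Hypothetical right-sizing heuristic in the spirit of the app described above.
# Catalog numbers and the headroom factor are made up for illustration.

# Simplified catalog: instance type -> (vCPUs, memory in GiB, hourly cost in USD)
CATALOG = {
    "m5.4xlarge": (16, 64, 0.768),
    "m5.2xlarge": (8, 32, 0.384),
    "m5.xlarge": (4, 16, 0.192),
}

def recommend(current_type, peak_cpu_pct, peak_mem_gib):
    """Suggest the cheapest instance that covers observed peaks with ~30%
    headroom; fall back to the current type if nothing cheaper fits."""
    vcpus, _, cost = CATALOG[current_type]
    needed_cpus = vcpus * (peak_cpu_pct / 100) * 1.3
    needed_mem = peak_mem_gib * 1.3
    best_type, best_cost = current_type, cost
    for itype, (cpus, mem, price) in CATALOG.items():
        if cpus >= needed_cpus and mem >= needed_mem and price < best_cost:
            best_type, best_cost = itype, price
    return best_type
```

Under these invented numbers, a workload peaking at 20% CPU on an m5.4xlarge would be flagged for an m5.2xlarge, a 50% cost reduction, roughly in line with the 30-40% savings Cirne describes.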

The New Relic One Observability Platform and the 12 free apps will be available starting today.


By Ron Miller

Quilt Data launches from stealth with free portal to access petabytes of public data

Quilt Data’s founders, Kevin Moore and Aneesh Karve, have been hard at work for the last four years building a platform to search for data quickly across vast repositories on AWS S3 storage. The idea is to give data scientists a way to find data in S3 buckets, and then package that data in forms that a business can use. Today, the company launched out of stealth with a free data search portal that not only proves what they can do, but also provides valuable access to 3.7 petabytes of public data across 23 S3 repositories.

The public data repository includes publicly available Amazon review data along with satellite images and other high-value public information. The product works like any search engine, where you enter a query, but instead of searching the web or an enterprise repository, it finds the results in S3 storage on AWS.

The results not only include the data you are looking for, they also include all of the information around the data, such as Jupyter notebooks, the standard workspace data scientists use to build machine learning models. Data scientists can then use these as the basis for building their own models.
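
The core idea (answer queries from an index of object keys and metadata rather than scanning the buckets themselves) can be illustrated with a toy in-memory version; the tokenizer and sample data below are invented, and Quilt’s actual engine operates at petabyte scale:

```python
# Toy stand-in for searching an index of S3 object keys and metadata
# instead of scanning buckets. Tokenizer and data are invented.

def build_index(objects):
    """objects: list of dicts with 'key' (S3 object key) and 'meta' text."""
    index = {}
    for obj in objects:
        tokens = obj["key"].lower().replace("/", " ").split()
        tokens += obj.get("meta", "").lower().split()
        for token in set(tokens):
            index.setdefault(token, []).append(obj["key"])
    return index

def search(index, query):
    """Return the keys that match every token in the query."""
    results = None
    for token in query.lower().split():
        hits = set(index.get(token, []))
        results = hits if results is None else results & hits
    return sorted(results or [])
```

A query like "amazon reviews" then resolves against the index in memory, which is what makes search over billions of objects feel instantaneous.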

The public data, which includes over 10 billion objects, is a resource that data scientists should greatly appreciate, but the company is offering access to it out of more than pure altruism. It’s doing so to show what the platform is capable of, and in the process it hopes to get companies to use the commercial version of the product.


Quilt Data search results with data about the data found. Image: Quilt Data

Customers can try Quilt Data for free or subscribe to the product in the Amazon Marketplace. The company charges a flat rate of $550 per month for each S3 bucket. It also offers an enterprise version with priority support, custom features and education and on-boarding for $999 per month for each S3 bucket.

The company was founded in 2015 and was a member of the Y Combinator Summer 2017 cohort. It has received $4.2 million in seed money so far from Y Combinator, Vertex Ventures, Fuel Capital and Streamlined Ventures, along with other unnamed investors.


By Ron Miller

Tableau update uses AI to increase speed to insight

Tableau was acquired by Salesforce earlier this year for $15.7 billion, but long before that, the company had been working on its Fall update, and today it announced several new tools including a new feature called ‘Explain Data’ that uses AI to get to insight quickly.

“What Explain Data does is it moves users from understanding what happened to why it might have happened by automatically uncovering and explaining what’s going on in your data. So what we’ve done is we’ve embedded a sophisticated statistical engine in Tableau, that when launched automatically analyzes all the data on behalf of the user, and brings up possible explanations of the most relevant factors that are driving a particular data point,” Tableau chief product officer Francois Ajenstat explained.

He added that what this really means is that it saves users time by automatically doing the analysis for them, and it should help them do better analysis by removing biases and helping them dive deep into the data in an automated fashion.

Explain Data flagging an extreme value in Tableau’s Superstore sample data. Image: Tableau

Ajenstat says this is a major improvement, in that previously users would have had to do all of this work manually. “So a human would have to go through every possible combination, and people would find incredible insights, but it was manually driven. Now with this engine, they are able to essentially drive automation to find those insights automatically for the users,” he said.
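
The brute-force search Ajenstat describes can be sketched in a few lines: scan every dimension, compare each segment’s mean to the overall mean, and rank the deviations. Tableau’s statistical engine is proprietary and doubtless more rigorous; this sketch only shows the shape of the automation.

```python
# Simplified "explain an extreme value" search: which single dimension
# value produces the segment that deviates most from the overall mean?
from statistics import mean

def explain(records, metric, dimensions):
    """Return (deviation, dimension, value) for the most explanatory segment."""
    overall = mean(r[metric] for r in records)
    candidates = []
    for dim in dimensions:
        for value in {r[dim] for r in records}:
            segment = [r[metric] for r in records if r[dim] == value]
            candidates.append((abs(mean(segment) - overall), dim, value))
    return max(candidates)
```

Run over sales records with dimensions like region and ship mode, it surfaces the segment driving an outlier, exactly the combinatorial sweep a human analyst would otherwise do by hand.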

He says this has two major advantages. First, because it’s AI-driven, it can deliver meaningful insight much faster; it also gives a more rigorous perspective on the data.

In addition, the company announced a new Catalog feature, which provides breadcrumbs showing the source of the data, so users can know where the data came from and whether it’s relevant or trustworthy.

Finally, the company announced a new server management tool that helps companies with broad Tableau deployments across a large organization manage them in a more centralized way.

All of these features are available starting today for Tableau customers.


By Ron Miller

Salesforce brings AI power to its search tool

Enterprise search tools have always suffered from the success of Google. Users wanted to find the content they needed internally in the same way they found it on the web. Enterprise search has never been able to meet those lofty expectations, but today Salesforce announced Einstein Search, an AI-powered search tool for Salesforce users that is designed to point them to the exact information they are looking for.

Will Breetz, VP of product management at Salesforce, says that enterprise search has suffered over the years for a variety of reasons. “Enterprise search has gotten a bad rap, but deservedly so. Part of that is because in many ways it is more difficult than consumer search, and there’s a lot of headwinds,” Breetz explained.

To solve these issues, the company decided to bring the power of its Einstein artificial intelligence engine to bear on the problem. For starters, it might not know the popularity of a given topic the way Google does, but it can learn the behaviors of an individual and deliver the right answer based on a person’s profile, including geography and past activity, to deliver a more meaningful result.

Einstein Search personalizing results. Image: Salesforce

Next, it allows you to enter natural language search phrasing to find the exact information you need, and the search tool understands and delivers the results. For instance, you could enter, “my open opportunities in Boston” and, using natural language understanding, the tool can translate that into the exact set of results you are looking for: your open opportunities in Boston. You could use conventional search and click a series of check boxes to narrow the list of results to only Boston, but this is faster and more efficient.
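
As a toy illustration of that translation step, a phrase like “my open opportunities in Boston” can be mapped onto structured filters. The field names and vocabulary below are invented for illustration; Salesforce’s actual natural language understanding is far more sophisticated.

```python
# Invented sketch of mapping a natural language query onto structured
# search filters; not Salesforce's NLU, just the concept.
import re

def parse_query(text, current_user):
    filters = {}
    t = text.lower()
    if t.startswith("my "):
        filters["owner"] = current_user       # "my" scopes to the searcher
    if "open" in t.split():
        filters["status"] = "Open"
    for obj in ("opportunities", "accounts", "contacts"):
        if obj in t:
            filters["object"] = obj
    m = re.search(r"\bin ([a-z ]+)$", t)      # trailing "in <place>"
    if m:
        filters["city"] = m.group(1).title()
    return filters
```

The structured output is what a conventional search UI would have required several check-box clicks to assemble.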

Finally, based on what the intelligence engine knows about you, and on your search parameters, it can predict the most likely actions you want to take and provide quick action buttons in the results to help you do that, reducing the time to action. It may not seem like much, but each reduced workflow adds up throughout a day, and the idea is to anticipate your requirements and help you get your work done more quickly.

Salesforce appears to have flipped the enterprise search problem. Instead of having a limited set of data being a handicap for enterprise search, it is taking advantage of that and applying AI to help deliver more meaningful results. It works for a limited set of objects for now, such as accounts, contacts and opportunities, but the company plans to add additional options over time.


By Ron Miller

Aliro comes out of stealth with $2.7M to ‘democratize’ quantum computing with developer tools

It’s still early days for quantum computing, but we’re nonetheless seeing an interesting group of startups emerging that are helping the world take advantage of the new technology now. Aliro Technologies, a Harvard startup that has built a platform for developers to code more easily for quantum environments — “write once, run anywhere” is one of the startup’s mottos — is today coming out of stealth and announcing its first funding of $2.7 million to get it off the ground.

The seed round is being led by Flybridge Capital Partners, with participation also from Crosslink Ventures and Samsung NEXT’s Q Fund, a fund the corporate investor launched last year dedicated specifically to emerging areas like quantum computing and AI.

Aliro is wading into the market at a key moment in the development of quantum computing.

Vendors continue to build new quantum hardware to tackle the kinds of complex calculations that current binary-based machines cannot handle, for example around drug discovery or multi-variable forecasting; just today, IBM announced plans for a 53-qubit device. Even so, it’s widely acknowledged that the computers built so far face a number of critical problems that will hamper wide adoption.

The interesting development of recent times is the emergence of startups that are tackling these specific critical problems, dovetailing that progress with that of building the hardware itself. Take the fact that quantum machines so far have been too prone to error when used for extended amounts of time: last week, I wrote about a startup called Q-CTRL that has built firmware that sits on top of the machines to identify when errors are creeping in and provide fixes to stave off crashes.

The specific area that Aliro is addressing is the fact that quantum hardware is still very fragmented: each machine has its own proprietary language and operating techniques and sometimes even purpose for which it’s been optimised. It’s a landscape that is challenging for specialists to engage in, let alone the wider world of developers.

“We’re at the early stage of the hardware, where quantum computers have no standardisation; even those based on the same technology have different qubits (the basic building block of quantum activity) and connectivity. It’s like digital computing in the 1940s,” said CEO and chairman Jim Ricotta. (The company was co-founded by Harvard computational materials science professor Prineha Narang along with Michael Cubeddu and Will Finegan, who are still undergraduate students at the university.)

“Because it’s a different style of computing, software developers are not used to quantum circuits,” and engaging with them is “not the same as using procedural languages. There is a steep on-ramp from high-performance classical computing to quantum computing.”

While Aliro is coming out of stealth, it appears that the company is not being specific with details about how its platform actually works. But the basic idea is that Aliro’s platform will essentially be an engine that will let developers work in the languages that they know, and identify problems that they would like to solve; it will then assess the code and provide a channel for how to optimise that code and put it into quantum-ready language, and suggest the best machine to process the task.
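
A toy rendering of the “write once, run anywhere” idea: express a circuit once in an abstract vocabulary, then translate it into each backend’s own gate names. The backend dialects below are invented, since Aliro has not published how its engine works; real backends also differ in connectivity, native gate sets and calibration, not just naming.

```python
# Invented backend "dialects" mapping an abstract gate vocabulary to
# per-vendor gate names; only the translation concept is real here.
DIALECTS = {
    "vendor_a": {"H": "h", "CNOT": "cx"},
    "vendor_b": {"H": "hadamard", "CNOT": "controlled_not"},
}

def transpile(circuit, backend):
    """circuit: list of (gate, qubits) tuples in the abstract vocabulary."""
    table = DIALECTS[backend]
    return [(table[gate], qubits) for gate, qubits in circuit]
```

A Bell-pair circuit written once can then target either backend; `transpile(bell, "vendor_a")` and `transpile(bell, "vendor_b")` emit the same program in two dialects.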

The development points to an interesting way that we may well see quantum computing develop, at least in its early stages. Today, we have a handful of companies building and working on quantum computers, but there is still a question mark over whether these kinds of machines will ever be widely deployed, or if — like cloud computing — they will exist among a smaller number of providers who will offer access to them on demand, SaaS-style. Such a model would seem to fit with how much of computing is sold today in the form of instances, and would open the door to large cloud names like Amazon, Google and Microsoft playing a big role in how this would be disseminated.

Such questions are still theoretical, of course, given some of the underlying problems that have yet to be fixed, but the march of progress seems inevitable, with forecasts predicting that quantum computing is likely to be a $2.2 billion industry by 2025, and if this is a route that is taken, the middlemen like Aliro could play an important role.

“I have been working with the Aliro team for the past year and could not be more excited about the opportunity to help them build a foundational company in quantum computing software,” said David Aronoff, general partner at Flybridge, in a statement. “Their innovative approach and unique combination of leading quantum researchers and a world-class proven executive team make Aliro a formidable player in this exciting new sector.”

“At Samsung NEXT we are focused on what the world will look like in the future, helping to make that a reality,” said Ajay Singh of Samsung NEXT’s Q Fund, in a statement. “We were drawn to Prineha and her team by their impressive backgrounds and extent of research into quantum computing. We believe that Aliro’s unique software products will revolutionize the entire category, by speeding up the inflection point where quantum becomes as accessible as classical computing. This could have implications on anything from drug discovery, materials development or chemistry. Aliro’s ability to map quantum circuits to heterogeneous hardware in an efficient way will be truly transformative and we’re thrilled to be on this journey with them.”


By Ingrid Lunden

Salesforce is developing an app to help build a sustainable company

Salesforce has always tried to be a socially responsible company, encouraging employees to work in the community, giving 1% of its profits to different causes and building and productizing the 1-1-1 philanthropic model. The company now wants to help other organizations become more sustainable and reduce their carbon footprint, and today it announced it is working on a product to help.

Patrick Flynn, VP of sustainability at Salesforce, says that it sees sustainability as a key issue, and one that requires action right now. The question was how Salesforce could help. As a highly successful software company, it decided to put that particular set of skills to work on the problem.

“We’ve been thinking about how can Salesforce really take action in the face of climate change. Climate change is the biggest, most important and most complex challenge humans have ever faced, and we know right now, every individual, every company needs to step forward and do everything it can,” Flynn told TechCrunch.

And to that end, the company is developing the Salesforce Sustainability Cloud to help track a company’s sustainability efforts. The tool should look familiar to Salesforce customers, but instead of tracking customers or sales, it tracks carbon emissions, renewable energy usage and how well a company is meeting its sustainability goals.

Sustainability Cloud dashboards. Image: Salesforce

The tool works with internal data and third-party data as needed, and is subject to both an internal audit by the Sustainability team and third-party organizations to be sure that Salesforce (and Sustainability Cloud customers) are meeting their goals.

Salesforce has been using this product internally to measure its own sustainability efforts, which Flynn leads. “We use the product to measure our footprint across all sorts of different aspects of our operations from data centers, public cloud, real estate — and we work with third-party providers everywhere we can to have them make their operations cleaner, and more powered by renewable energy and less carbon intensive,” he said. When there is carbon generated, the company uses carbon offsets to finance sustainability projects such as clean cookstoves or helping preserve the Amazon rainforest.
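
The accounting underneath such a tool reduces to activity data multiplied by emission factors, with purchased offsets netted out. The factors below are rough placeholders for illustration, not Salesforce’s figures; real factors vary by grid, route and provider.

```python
# Placeholder emission factors in kg CO2e per unit of activity.
EMISSION_FACTORS = {
    "grid_kwh": 0.5,      # per kWh of grid electricity
    "flight_km": 0.25,    # per passenger-km flown
    "cloud_kwh": 0.125,   # per kWh of public-cloud compute
}

def footprint(activities, offsets_kg=0.0):
    """activities: dict of activity name -> quantity. Returns net kg CO2e
    after subtracting purchased offsets (floored at zero)."""
    gross = sum(EMISSION_FACTORS[name] * qty for name, qty in activities.items())
    return max(gross - offsets_kg, 0.0)
```

Tracking footprint this way, per data center, per cloud provider, per quarter, is what makes the resulting dashboards auditable rather than anecdotal.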

Flynn says the investor community is increasingly looking for proof that companies are building a real, verifiable sustainability program, and the Sustainability Cloud is an effort to provide that information, both for Salesforce and for other companies in a similar position.

The product is in beta now and is expected to be ready next year. Flynn could not yet say how much the company plans to charge for the service, but he said the goal of the product is positive social impact.


By Ron Miller

IEX’s Katsuyama is no flash in the pan

When you watch a commercial for one of the major stock exchanges, you are welcomed into a world of fast-moving, slick images full of glistening buildings, lush crops and happy people. They are typically interspersed with shots of intrepid executives veering out over the horizon as if to say, “I’ve got a long-term vision, and the exchange where my stock is listed is a valuable partner in achieving my goals.” It’s all very reassuring and stylish. But there’s another side to the story.

I have been educated about the realities of today’s stock exchange universe through recent visits with Brad Katsuyama, co-founder and CEO of IEX (a.k.a. The Investors Exchange). If Katsuyama’s name rings a bell, and you don’t work on Wall Street, it’s likely because you remember him as the protagonist of Michael Lewis’s 2014 best-seller, Flash Boys: A Wall Street Revolt, which explored high-frequency trading (HFT) and made the case that the stock market was rigged, really badly.

Five years later, some of the worst practices Lewis highlighted are things of the past, and there are several attributes of the American equity markets that are widely admired around the world. In many ways, though, the realities of stock trading have gotten more unseemly, thanks to sophisticated trading technologies (e.g., microwave radio transmissions that can carry information at almost the speed of light), and pitched battles among the exchanges, investors and regulators over issues including the rebates stock exchanges pay to attract investors’ orders and the price of market data charged by the exchanges.

I don’t claim to be an expert on the inner workings of the stock market, but I do know this: Likening the life cycle of a trade to sausage-making is an insult to kielbasa. More than ever, trading is an arcane, highly technical and bewildering part of our broader economic infrastructure, which is just the way many industry participants like it: Nothing to see here, folks.

Meanwhile, Katsuyama, company president Ronan Ryan and the IEX team have turned IEX into the eighth largest stock exchange company, globally, by notional value traded, and have transformed the concept of a “speed bump” into a mainstream exchange feature.


Brad Katsuyama. Image via IEX Trading

Despite these and other accomplishments, IEX finds itself in the middle of a vicious battle with powerful incumbents that seem increasingly emboldened to use their muscle in Washington, D.C. What’s more, new entrants, such as The Long-Term Stock Exchange and Members Exchange, are gearing up to enter the fray in US equities, while global exchanges such as the Hong Kong Stock Exchange seek to bulk up by making audacious moves like attempting to acquire the venerable London Stock Exchange.

But when you sell such distinct advantages to one group that is the only one positioned to benefit from them, it raises the question of why anyone else would want to trade on that market. It’s like walking onto a playing field where you know the deck is stacked against you.

As my discussion with Katsuyama reveals, IEX may have taken some punches in carving out a position for itself in this high-stakes war characterized by cutting-edge technology and size. However, the IEX team remains girded for battle and confident that it can continue to make headway in offering a fair and transparent option for market participants over the long term.

Gregg Schoenberg: Given Flash Boys and the attention it generated for you on Main Street, I’d like to establish something upfront. Does IEX exist for the asset manager, the individual, or both?

Brad Katsuyama: We exist primarily for the asset manager, and helping them helps the individual. We’re one step removed from the individual, and part of that is due to regulation. Only brokers can connect to exchanges, and the asset manager connects to the broker.

Schoenberg: To put a finer point on it, you believe in fairness and being the good guy. But you are not Robinhood. You are a capitalist.

Katsuyama: Yes, but we want to make money fairly. Actually, we thought initially about starting the business as a nonprofit. But once we laid out all the people we would need to convince to work for us, we realized it would’ve been hard for us to attract the skill sets needed as a nonprofit.

Schoenberg: Do you believe that the US equity market today primarily serves investors or traders?


By Gregg Schoenberg

Boston-based DataRobot raises $206M Series E to bring AI to enterprise

Artificial intelligence is playing an increasingly large role in enterprise software, and Boston’s DataRobot has been helping companies build, manage and deploy machine learning models for some time now. Today, the company announced a $206 million Series E investment led by Sapphire Ventures.

Other participants in this round included new investors Tiger Global Management, World Innovation Lab, Alliance Bernstein PCI, and EDBI along with existing investors DFJ Growth, Geodesic Capital, Intel Capital, Sands Capital, NEA and Meritech.

Today’s investment brings the total raised to $431 million, according to the company. It has a pre-money valuation of $1 billion, according to PitchBook. DataRobot would not confirm this number.

The company has been catching the attention of these investors by offering a machine learning platform aimed at analysts, developers and data scientists to help build predictive models much more quickly than it typically takes using traditional methodologies. Once built, the company provides a way to deliver the model in the form of an API, simplifying deployment.
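
The deliver-as-an-API pattern can be sketched as a JSON-in/JSON-out wrapper around a model’s predict function. The toy churn “model” and its field names below are invented; DataRobot’s hosted prediction API is of course far richer.

```python
# Minimal shape of serving a model behind a JSON prediction endpoint.
import json

def make_handler(model):
    """Wrap a predict function as a JSON-in/JSON-out request handler."""
    def handle(body):
        features = json.loads(body)["features"]
        return json.dumps({"prediction": model(features)})
    return handle

# Invented stand-in "model": score churn risk from two usage features.
def churn_model(f):
    inactivity = f["inactivity_days"] / 30
    engagement = f["logins_per_week"] / 7
    return round(0.7 * inactivity + 0.3 * (1 - engagement), 2)
```

Once a model is wrapped this way, any service that can POST JSON can consume predictions, which is the deployment simplification described above.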

The late-stage startup plans to use the money to continue building out its product line, while looking for acquisition opportunities where it makes sense. The company also announced the availability of a new product today, DataRobot MLOps, a tool to manage, monitor and deploy machine learning models across a large organization.

The company, which was founded in 2012, claims it has had triple-digit recurring revenue growth dating back to 2015, as well as a billion models built on the platform to date. Customers contributing to that number include a broad range of companies such as Humana, United Airlines, Harvard Business School and Deloitte.


By Ron Miller