DREAMTECH NEWS

Allegro.AI nabs $11M for a platform that helps businesses build computer vision-based services

Artificial intelligence and the application of it across nearly every aspect of our lives is shaping up to be one of the major step changes of our modern society. Today, a startup that wants to help other companies capitalise on AI’s advances is announcing funding and emerging from stealth mode.

Allegro.AI, which has built a deep learning platform that companies can use to build and train computer-vision-based technologies — from self-driving car systems through to security, medical and any other services that require a system to read and parse visual data — is today announcing that it has raised $11 million in funding, as it prepares for a full-scale launch of its commercial services later this year after running pilots and working with early users in a closed beta.

The round may not be huge by today’s startup standards, but the presence of strategic investors speaks to the interest the startup has sparked and the gap in the market for what it is offering. It includes MizMaa Ventures, a Chinese fund focused on investing in Israeli startups, along with participation from Robert Bosch Venture Capital GmbH (RBVC), Samsung Catalyst Fund and Israeli fund Dynamic Loop Capital. Other investors (the $11 million actually covers more than one round) are not being disclosed.

Nir Bar-Lev, the CEO and cofounder (Moses Guttmann, another cofounder, is the company’s CTO), started Allegro.AI first as Seematics in 2016 after he left Google, where he had worked in various senior roles for over 10 years. It was partly that experience that led him to the idea that with the rise of AI, there would be an opportunity for companies that could build a platform to help other less AI-savvy companies build AI-based products.

“We’re addressing a gap in the industry,” he said in an interview. Although there are a number of services, for example Rekognition from Amazon’s AWS, which allow a developer to ping a database by way of an API to provide analytics and some identification of a video or image, these are relatively basic and couldn’t be used to build and “teach” full-scale navigation systems, for example.

“An ecosystem doesn’t exist for anything deep-learning based.” Every company that wants to build something would have to invest 80-90 percent of their total R&D resources on infrastructure before getting to the many other aspects of building a product, he said, which might also include the hardware and applications themselves. “We’re providing this so that the companies don’t need to build it.”

Instead, the research scientists who buy into the Allegro.AI platform — it’s not intended for non-technical users (not now, at least) — can concentrate on overseeing projects and considering strategic applications and other aspects of the projects. He says that currently, its direct target customers are tech companies and others that rely heavily on tech, “but are not the Googles and Amazons of the world.”

Indeed, companies like Google, AWS, Microsoft, Apple and Facebook have all made major inroads into AI, and in one way or another each has a strong interest in enterprise services and may already be hosting a lot of data in their clouds. But Bar-Lev believes that companies ultimately will be wary to work with them on large-scale AI projects:

“A lot of the data that’s already on their cloud is data from before the AI revolution, before companies realized that the asset today is data,” he said. “If it’s there, it’s there and a lot of it is transactional and relational data.

“But what’s not there is all the signal-based data, all of the data coming from computer vision. That is not on these clouds. We haven’t spoken to a single automotive company that is sharing that with these cloud providers. They are not even sharing it with their OEMs. I’ve worked at Google, and I know how companies are afraid of them. These companies are terrified of tech companies like Amazon and so on eating them up, so if they can now stop and control their assets they will do that.”

Customers have the option of working with Allegro either as a cloud or on-premise product, or a combination of the two, and this brings up the third reason that Allegro believes it has a strong opportunity. The quantity of data that is collected for image-based neural networks is massive, and in some regards it’s not practical to rely on cloud systems to process that. Allegro’s emphasis is on building computing at the edge to work with the data more efficiently, which is one of the reasons investors were also interested.

“AI and machine learning will transform the way we interact with all the devices in our lives, by enabling them to process what they’re seeing in real time,” said David Goldschmidt, VP and MD at Samsung Catalyst Fund, in a statement. “By advancing deep learning at the edge, Allegro.AI will help companies in a diverse range of fields—from robotics to mobility—develop devices that are more intelligent, robust, and responsive to their environment. We’re particularly excited about this investment because, like Samsung, Allegro.AI is committed not just to developing this foundational technology, but also to building the open, collaborative ecosystem that is necessary to bring it to consumers in a meaningful way.”

Allegro.AI is not the first company with hopes of providing AI and deep learning as a service to the enterprise world: Element.AI out of Canada is another startup that is being built on the premise that most companies know they will need to consider how to use AI in their businesses, but lack the in-house expertise or budget (or both) to do that. Until the wider field matures and AI know-how becomes something anyone can buy off-the-shelf, it’s going to present an interesting opportunity for the likes of Allegro and others to step in.


By Ingrid Lunden

Google Cloud expands its bet on managed database services

Google announced a number of updates to its cloud-based database services today. For the most part, we’re not talking about any groundbreaking new products here, but all of these updates address specific pain points that enterprises suffer when they move to the cloud.

As Google Director of Product Management Dominic Preuss told me ahead of today’s announcements, Google has long seen itself as a thought leader in the database space. For the longest time, though, that thought leadership was all about things like the Bigtable paper and didn’t really manifest itself in the form of products. Projects like the globally distributed Cloud Spanner database are now allowing Google Cloud to put its stamp on this market.

Preuss also noted that many of Google’s enterprise users often start with lifting and shifting their existing workloads to the cloud. Once they have done that, though, they are also looking to launch new applications in the cloud — and at that point, they typically want managed services that free them from having to do the grunt work of managing their own infrastructure.

Today’s announcements mostly fit into this mold of offering enterprises the kind of managed database services they are asking for.

The first of these is the beta launch of Cloud Memorystore for Redis, a fully managed in-memory data store for users who need in-memory caching for capacity buffering and similar use cases.
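The caching use case Memorystore targets follows the familiar cache-aside pattern: check the in-memory store first, fall back to the backing database on a miss, then populate the cache for the next request. A minimal sketch of the pattern in Python (a plain dict stands in for the Redis client, and all names are illustrative; a real client's get/set calls would slot into the same places):

```python
import time

cache = {}  # stands in for a Redis client; real GET/SET calls work analogously
TTL_SECONDS = 60

def slow_database_lookup(key):
    # placeholder for an expensive query against the system of record
    return f"value-for-{key}"

def get_with_cache(key):
    entry = cache.get(key)
    if entry is not None:
        value, stored_at = entry
        if time.time() - stored_at < TTL_SECONDS:
            return value  # cache hit: served from memory
    value = slow_database_lookup(key)   # cache miss: hit the backing store
    cache[key] = (value, time.time())   # populate the cache with a timestamp
    return value

print(get_with_cache("user:42"))  # miss, falls through to the database
print(get_with_cache("user:42"))  # hit, served from memory
```

A managed service's value here is operating the `cache` layer itself (replication, eviction, failover) so the application only ever writes logic like the above.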

Google is also launching a new feature for Cloud Bigtable, the company’s NoSQL database service for big data workloads. Bigtable now features regional replication (or at least it will, once this has rolled out to all users within the next week or so). The general idea here is to give enterprises that previously used Cassandra for their on-premises workloads an alternative in the Google Cloud portfolio, and these cross-zone replications increase the availability and durability of the data they store in the service.

With this update, Google is also making Cloud SQL for PostgreSQL generally available with a 99.95 percent SLA and it’s adding commit timestamps to Cloud Spanner.

What’s next for Google’s database portfolio? Unsurprisingly, Preuss wouldn’t say, but he did note that the company wants to help enterprises move as many of their workloads to the cloud as they can — and for the most part, that means managed services.


By Frederic Lardinois

Rocketrip raises $15 million to reward cost-saving employees

If your company lets you expense the nicest hotel when you travel, why wouldn’t you?

But what if you got to split the savings with your employer by selecting a less expensive hotel?

A New York-based startup called Rocketrip believes most employees will opt to save companies money if they are incentivized to do so. It’s built an enterprise platform that rewards employees with gift cards if they go under budget on travel and transportation.

After five years of signing up business clients like Twitter and Pandora, Rocketrip is raising $15 million in Series C funding led by GV (Google Ventures) to keep expanding. Existing investors Bessemer Venture Partners and Canaan Partners are also in the round.

Inspired by Google’s internal travel system, Rocketrip CEO Dan Ruch calls his solution a “behavioral change platform.” Employees “always optimize for self-preservation, self-interest” and are likely to book a cheaper flight if it means a gift card at a place like Amazon, Bloomingdale’s or Home Depot, Ruch claims. He said that the average business trip booked by Rocketrip saves companies $208.

Ruch believes that Rocketrip has built a currency that motivates teams. He says some employees even gift Rocketrip points to congratulate colleagues on birthdays and promotions.

When it comes to enterprise platforms, Rocketrip is “one of those unique situations where everyone is really excited to use it,” said Canaan Partners’ Michael Gilroy, who holds a board seat.

Yet Rocketrip is not the only startup looking to help employees make money by cutting costs. TripActions and TravelBank have also created similar businesses.

Gilroy insists that “Rocketrip was first” and that he views the others as “validation of the model.”

Rocketrip hopes to someday expand beyond travel to incentivize healthcare choices like quitting smoking. It also thinks companies will use Rocketrip points to reward employees for community service. “Any time we can motivate an employee,” there’s an opportunity for Rocketrip, Ruch believes.


By Katie Roof

Drew Houston to upload his thoughts at TC Disrupt SF in September

Dropbox is a critically important tool for more than 500 million people.

The company launched back in 2007 and founder and CEO Drew Houston has spent the last decade growing Dropbox to the behemoth it is today.

During that time, Houston has made some tough decisions.

A few years ago, Houston decided to move the Dropbox infrastructure off of AWS. In 2014, Houston chose to raise $500 million in debt financing to keep pace with Box, which was considering an IPO at the time. And in March 2017, Dropbox took another $600 million in debt financing from JP Morgan.

Houston also reportedly turned down a nine-figure acquisition offer from Apple.

All the while, Houston led Dropbox to be cash-flow positive and grew the company to see a $1 billion revenue run rate as of last year.

And, of course, we can’t forget the decision to go public early this year.

Dropbox is now one of the biggest tech companies in the world, with 1,800 employees across 12 global offices.

Interestingly, Houston first told his story to a TechCrunch audience at TC50 in 2008 as part of the Startup Battlefield.

At Disrupt SF in September, we’re excited to sit down with Houston to discuss his journey thus far, the process of going public, and the future of Dropbox.

The show runs from September 5 to September 7, and for the next week, our super early bird tickets are still available.


By Jordan Crook

Aion launches first public blockchain network

If you believe blockchains will proliferate in the coming years, it stands to reason that you will need some sort of mechanism to move information between them, a network of blockchains with bridges and processes for sharing information between entities. That is exactly what The Aion Network is providing with a new blockchain network released today.

The company wants to be the underlying infrastructure for a network of blockchains, much as TCP/IP drove the proliferation of the internet. To that end, the company, which originally began as a for-profit startup called Nuco, has decided to become a not-for-profit organization with the goal of setting up protocols for a set of interconnected blockchains. It now sees its role as something akin to the Linux Foundation’s, helping third-party companies build products and creating an ecosystem around its base technology.

Graphic: Aion Networks

“The core design of network we have been building is to connect various networks, and route data and transactions through a public network. We are launching that network today. It allows you to build bridges to other blockchain networks. That public network acts as relayer between blockchains,” Matthew Spoke, CEO and co-founder at Aion Networks told TechCrunch.

While there clearly could be security concerns with a public byway for blockchain data moving between systems, Spoke says those can be minimized. Instead of transmitting a medical record between a hospital and insurance company, you send a proof that the person had an operation, which the insurance company can check against the coverage rules it has created for that individual, and vice versa.
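The proof-instead-of-record idea can be illustrated with a simple hash commitment, a toy stand-in for the Merkle proofs and signatures a real network would use: only a digest of the record is anchored publicly, and a counterparty who is privately shown the record can verify it against that digest. All names and data below are hypothetical.

```python
import hashlib

def commit(record: bytes, salt: bytes) -> str:
    # the hospital anchors only this digest on the public network
    return hashlib.sha256(salt + record).hexdigest()

def verify(record: bytes, salt: bytes, digest: str) -> bool:
    # the insurer checks a privately disclosed record against the public digest
    return commit(record, salt) == digest

record = b"patient:1234 procedure:appendectomy date:2018-04-01"
salt = b"random-nonce"  # keeps the digest from being guessable by brute force

digest = commit(record, salt)                  # goes on the public chain
assert verify(record, salt, digest)            # insurer accepts the real record
assert not verify(b"tampered", salt, digest)   # an altered record is rejected
```

The sensitive record itself never crosses the public network, which is the property Spoke is describing.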

The idea behind this venture is to provide the underlying plumbing to encourage more highly scalable blockchain use cases. Spoke and his team once ran the blockchain practice at Deloitte before starting this venture, and they saw roadblocks to scaling first-hand. “When we were doing enterprise projects, our biggest realization was that the plumbing wasn’t sophisticated enough. The scaling wasn’t meeting specs that enterprise companies would need long term. Because of that, we were not seeing anyone moving beyond proof of concept projects. What we are doing is trying to mature the possible use cases,” he said.

In order to drive adoption, the company is introducing a token or cryptocurrency to be used to move data across the network and build in a level of trust. Spoke believes if the users have skin in the game in the form of tokens, that could create a higher level of trust on the system.

“Instead of paying for infrastructure, you are going to pay to be part of a common trusted protocol. It comes down to the mechanism of consensus and being incentivized to do business in an honest way,” Spoke said.

This is probably not something that will be adopted widely overnight. Even though the network is built, it still requires a level of utilization to really take off, and that will require more blockchain projects. “We still need a few years of pure focus on infrastructure to make sure we are getting these layers right. Every time you move data of any kind there are security vulnerabilities, and we need to make sure there are good specs and comfort in using it,” he said.


By Ron Miller

Catalyst brothers find capital success with $2.4M from True

Over the past few years, the old language of “customer support” has been supplanted by the new language of “customer success.” In the old model, companies would essentially disappear following the conclusion of a sale, merely handling customer problems when they arose. Now, companies are actively reaching out to customers, engaging them with education and training and monitoring them with analytics to ensure they have the best possible experience with the product.

What’s changing is the nature of products and services today: subscription. Customers no longer just make a single buying decision about a product, but instead must actively commit to using the product, or else they churn.

New York-based Catalyst, founded by brothers Edward and Kevin Chiu, wants to rebuild customer success from the ground up with an integrated software platform. They have received some capital success of their own, securing $2.4 million in venture capital from Phil Black of True Ventures with participation from Ludlow Ventures and Compound.

New York has seen something of an increase in founder mafias, as TechCrunch reported this weekend. Catalyst is no exception to this trend, with the Chiu brothers both having worked at DigitalOcean, one of New York’s many high-flying enterprise startups. Edward Chiu was director of customer success at the company for a number of years, and had a background in both sales and coding before that.

Kevin Chiu was head of inside sales at DigitalOcean. “I brought my brother on to do sales at DigitalOcean,” Edward Chiu explains. “We always knew that we wanted to start a company together, but wanted to see if we would kill each other.” The two worked together, and lo and behold, they didn’t kill each other.

Edward Chiu wanted to match the product experience of using DigitalOcean with the experience of using its internal customer success tools. Nothing on the market fit. “Given that DigitalOcean was a very technical product,” Chiu explained, “we decided to build our own tool.” Chiu thought of customer success at DigitalOcean as its own product, and his team built up the platform to improve its functionality and scalability. “We just used the tool and we loved it,” he said, and “started to show this tool to a bunch of other customer success leaders I am connected with.”

Other customer success leaders said they wanted the platform, and “after the 20th person told me that,” he and his brother spun out of DigitalOcean to go on their own. Unlike enterprise startups in New York a couple of years ago that often struggled to find any investors, Catalyst found cash quickly. “Two weeks in we had more offers than we knew what to do with,” Chiu explained. The two said they had originally targeted a fundraise of $750,000, but ended up at $2.4 million.

Catalyst is a platform that integrates between a number of other major SaaS services such as Salesforce, Zendesk, Mixpanel and others to create a unified dashboard for data around customer success. From there, customer success managers have a set of automated tools to handle engagement, such as customer segmentation and email campaigns.

A major challenge in the customer success world is that these managers often don’t have the skills required to do advanced data analytics, so they often rely on their friends in engineering to run scripts or perform database lookups. The hope is that Catalyst’s feature set is powerful enough that these sorts of ad hoc tasks become a thing of the past. “Because we aggregate all this data, you can run queries,” Chiu explains.
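As a rough illustration of that aggregation idea (not Catalyst’s actual API; every name and field below is hypothetical), merging per-customer records from each integration into one profile makes this kind of ad hoc query a one-liner:

```python
# hypothetical records as they might arrive from each integration's API
salesforce = [{"customer": "acme", "arr": 50000}]
zendesk = [{"customer": "acme", "open_tickets": 3}]
mixpanel = [{"customer": "acme", "weekly_logins": 12}]

def unify(*sources):
    # merge every source's per-customer records into a single profile dict
    profiles = {}
    for source in sources:
        for record in source:
            profiles.setdefault(record["customer"], {}).update(record)
    return profiles

profiles = unify(salesforce, zendesk, mixpanel)

# the kind of query a success manager would otherwise ask engineering to run:
# customers with mounting support tickets but falling product usage
at_risk = [name for name, p in profiles.items()
           if p.get("open_tickets", 0) > 2 and p.get("weekly_logins", 0) < 20]
print(at_risk)  # → ['acme']
```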

Chiu says that Catalyst doesn’t just want to be a software platform, but rather a movement that pushes every company to think about how they can make their customers successful. “There are so many companies that are starting to understand that it is not something that you do once you raise a Series A, but something you do from day one,” Chiu said. “If you take care of your very first customer, they will constantly promote you and constantly promote your business.”

The company is based in Flatiron, and has eight employees.


By Danny Crichton

Etleap scores $1.5 million seed to transform how we ingest data

Etleap is a play on words for a common set of data practices: extract, transform and load. The startup is trying to place these activities in a modern context, automating what they can and in general speeding up what has been a tedious and highly technical practice. Today, they announced a $1.5 million seed round.
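The three steps the name puns on can be sketched in a few lines of Python: pull raw records out of a source, reshape them to the warehouse schema, and load them into a destination store. This is a generic illustration, not Etleap’s implementation; sqlite stands in for a cloud warehouse, and all field names are invented.

```python
import sqlite3

# extract: raw rows as they might come from a source API or log export
raw_events = [
    {"user": "alice", "amount_cents": "1250", "ts": "2018-04-11"},
    {"user": "bob", "amount_cents": "800", "ts": "2018-04-11"},
]

# transform: coerce string fields into typed columns for the warehouse schema
rows = [(e["user"], int(e["amount_cents"]) / 100, e["ts"]) for e in raw_events]

# load: write into the destination (sqlite stands in for a cloud warehouse)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE purchases (user TEXT, amount_usd REAL, day TEXT)")
conn.executemany("INSERT INTO purchases VALUES (?, ?, ?)", rows)

total = conn.execute("SELECT SUM(amount_usd) FROM purchases").fetchone()[0]
print(total)  # 20.5
```

The hard parts a platform automates sit around this skeleton: scheduling, retries, schema changes and broken upstream sources.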

Investors include First Round Capital, SV Angel, Liquid2, BoxGroup and other unnamed investors. The startup launched five years ago as a Y Combinator company. It spent a good 2.5 years building out the product, says CEO and founder Christian Romming. They haven’t required additional funding up until now because they have been working with actual customers. Those include Okta, PagerDuty and Mode, among others.

Romming started out at ad tech startup VigLink, and while there he encountered a problem that was hard to solve. “Our analysts and scientists were frustrated. Integration of the data sources wasn’t always a priority and when something broke, they couldn’t get it fixed until a developer looked at it.” That lack of control slowed things down and made it hard to keep the data warehouse up to date.

He saw an opportunity in solving that problem and started Etleap. While there were (and continue to be) legacy solutions like Informatica, Talend and Microsoft SQL Server Integration Services, he said that when he studied these at a deeply technical level, he found they required a great deal of help to implement. He wanted to simplify ETL as much as possible, putting data integration into the hands of much less technical end users, rather than relying on IT and consultants.

One of the problems with traditional ETL is that the data analysts who make use of the data tend to get involved very late after the tools have already been chosen and Romming says his company wants to change that. “They get to consume whatever IT has created for them. You end up with a bread line where analysts are at the mercy of IT to get their jobs done. That’s one of the things we are trying to solve. We don’t think there should be any engineering at all to set up ETL pipeline,” he said.

Etleap is delivered as managed SaaS or you can run it within your company’s AWS accounts. Regardless of the method, it handles all of the managing, monitoring and operations for the customer.

Romming emphasizes that the product is really built for cloud data warehouses. For now, the company is concentrating on the AWS ecosystem, but it has plans to expand beyond that down the road. “We want to help more enterprise companies make better use of their data, while modernizing data warehousing infrastructure and making use of cloud data warehouses,” he explained.

The company currently has 15 employees, but Romming plans to at least double that in the next 12-18 months, mostly increasing the engineering team to help further build out the product and create more connectors.


By Ron Miller

Tableau gets new pricing plans and a data preparation tool

Data analytics platform Tableau today announced the launch of both a new data preparation product and a new subscription pricing plan.

Currently, Tableau offers desktop plans for users who want to analyze their data locally, a server plan for businesses that want to deploy the service on-premises or on a cloud platform, and a fully hosted online plan. Prices for these range from $35 to $70 per user per month. The new pricing plans don’t focus so much on where the data is analyzed but on the analyst’s role. The new Creator, Explorer and Viewer plans are tailored toward the different user experiences. They all include access to the new Tableau Prep data preparation tool, Tableau Desktop and new web authoring capabilities — and they are available both on-premises and in the cloud.

Existing users can switch their server or desktop subscriptions to the new release today and then assign each user either a creator, explorer or viewer role. As the name indicates, the new viewer role is meant for users who mostly consume dashboards and visualizations but don’t create their own. The explorer role is for those who need access to a pre-defined data set, and the creator role is for analysts and power users who need access to all of Tableau’s capabilities.

“Organizations are facing the urgent need to empower their entire workforce to help drive more revenue, reduce costs, provide better service, increase productivity, discover the next scientific breakthrough and even save lives,” said Adam Selipsky, CEO at Tableau, in today’s announcement. “Our new offerings will help entire organizations make analytics ubiquitous, enabling them to tailor the capabilities required for every employee.”

As for the new data preparation tool, the general idea here is to give users a visual way to shape and clean their data, something that’s especially important as businesses now often pull in data from a variety of sources. Tableau Prep can automate some of this, but the most important aspect of the service is that it gives users a visual interface for creating these kinds of workflows. Prep includes support for all the standard Tableau data connectors and lets users perform calculations, too.

“Our customers often tell us that they love working with Tableau, but struggle when data is in the wrong shape for analysis,” said Francois Ajenstat, Chief Product Officer at Tableau. “We believe data prep and data analysis are two sides of the same coin that should be deeply integrated and look forward to bringing fun, easy data prep to everyone regardless of technical skill set.”


By Frederic Lardinois

Heptio launches an open source load balancer for Kubernetes and OpenStack

Heptio is one of the more interesting companies in the container ecosystem. In part, that’s due to the simple fact that it was founded by Craig McLuckie and Joe Beda, two of the three engineers behind the original Kubernetes project, but also because of the technology it’s developing and the large amount of funding it has raised to date.

As the company announced today, it saw its revenue grow 140 percent from the last quarter of 2017 to the first quarter of 2018. In addition, Heptio says its headcount quadrupled since the beginning of 2017. Without any actual numbers, that kind of data doesn’t mean all that much. It’s easy to achieve high growth numbers if you’re starting out from zero, after all. But it looks like things are going well at the company and that the team is finding its place in the fast-growing Kubernetes ecosystem.

In addition to announcing these numbers, the team also today launched a new open source project that will join the company’s existing stable of tools like the cluster recovery tool Ark and the Kubernetes cluster monitoring tool Sonobuoy.

This new tool, Heptio Gimbal, has a very specific use case that is probably only of interest to a relatively small number of users — but for them, it’ll be a lifeline. Gimbal, which Heptio developed together with Yahoo Japan subsidiary Actapio, helps enterprises route traffic into both Kubernetes clusters and OpenStack deployments. Many enterprises now run these technologies in parallel, and while some are now moving beyond OpenStack and toward a more Kubernetes-centric architecture, they aren’t likely to do away with their OpenStack investments anytime soon.

“We approached Heptio to help us modernize our infrastructure with Kubernetes without ripping out legacy investments in OpenStack and other back-end systems,” said Norifumi Matsuya, CEO and President at Actapio. “Application delivery at scale is key to our business. We needed faster service discovery and canary deployment capability that provides instant rollback and performance measurement. Gimbal enables our developers to address these challenges, which at the macro-level helps them increase their productivity and optimize system performance.”

Gimbal uses many of Heptio’s existing open source tools, as well as the Envoy proxy, which is part of the Cloud Native Computing Foundation’s stable of cloud-native projects. For now, Gimbal only supports one specific OpenStack release (the ‘Mitaka’ release from 2016), but the team is looking at adding support for VMware and EC2 in the future.


By Frederic Lardinois

DoD clarifies winner-take-all cloud contract

When the Department of Defense announced in March a 10-year winner-take-all cloud contract that could be worth up to $10 billion, it raised a few eyebrows. Last week, the department clarified some of the conditions, and it turns out that, much like a modern baseball free agent contract, there are a couple of points where the DoD can opt out of the deal.

In a press conference last week, chief Pentagon spokesperson Dana W. White indicated that the original contract award is for just two years. After that there are two additional options for five years and three years. The department can opt out after the first two years if it’s not working out, or after seven years if it accepts the second option. Obviously, if it takes the final option, that would add up to a full 10-year commitment.

Leigh Madden, who heads up Microsoft’s defense effort, says he believes Microsoft can win such a contract, but it isn’t necessarily the best approach for the DoD. “If the DoD goes with a single award path, we are in it to win, but having said that it’s counter to what we are seeing across the globe where 80 percent of customers are adopting a multi-cloud solution,” Madden told TechCrunch.

White indicated that 46 companies have responded to the request for proposal, but it seems clear that there are only a handful of companies that could handle a project of this scope. For starters, we have Amazon and Microsoft along with Google, IBM and Oracle.

That said, White indicated that companies can band together and form a partnership, which means you could see some extremely strange bedfellows trying to form the equivalent of a rock supergroup, with multiple players coming together to win the deal.

This development certainly opens up some interesting options for the vendors involved and creates a level of competition and alliances the likes of which the tech industry may never have seen. Whoever gets the contract will have two years to prove they can do this, and will then be evaluated before getting a shot at the second, five-year window.


By Ron Miller