New Relic launches platform for developers to build custom apps

When Salesforce launched Force.com in 2007 as a place for developers to build applications on top of Salesforce, it was a pivotal moment for the concept of SaaS platforms. Since then, it’s been said that every enterprise SaaS company wants to be a platform play. Today, New Relic achieved that goal when it announced the New Relic One Observability Platform at the company’s FutureStack conference in New York City.

Company co-founder and CEO Lew Cirne explained that a platform, by definition, is something that other people can build software on. “What we are shipping is a set of capabilities to enable our customers and partners to build their own observability applications on the very same platform that we’ve built our product,” Cirne told TechCrunch.

He sees these third-party developers building applications to enable additional innovations on top of the New Relic platform that perhaps New Relic’s engineers couldn’t because of time and resource constraints. “There are so many use cases for this data, far more than the engineers that we have at our company could ever do, but a community of people who can do this together can totally unlock the power of this data,” Cirne said.

Like many platform companies, New Relic found that as it expanded its own offering, its developers needed a common set of services to build against. As it built out that internal platform, it became possible to open it up, giving external developers access to the same set of services as the New Relic engineering team.

“What we have is metrics, logs, events and traces coming from our customers’ digital software. So they have access to all that data in real time to build applications, measure the health of their digital business and build applications on top of that. Just as Force.com was the thing that really transformed Salesforce as a company into being a strategic vendor, we think the same thing will happen for us with what we’re offering,” he said.

As a proof point for the platform, the company is releasing a dozen open source tools built on top of the New Relic platform in conjunction with today’s announcement. One example is an application to help identify where companies could be overspending on their AWS bills. “We’re actually finding 30-40% savings opportunities for them where they’re provisioning larger servers than they need for the workload. Based on the data that we’re analyzing, we’re recommending what the right size deployment should be,” Cirne said.

The New Relic One Observability Platform and the 12 free apps will be available starting today.


By Ron Miller

Salesforce brings AI power to its search tool

Enterprise search tools have always suffered from the success of Google. Users wanted to find the content they needed internally in the same way they found it on the web. Enterprise search has never been able to meet those lofty expectations, but today Salesforce announced Einstein Search, an AI-powered search tool for Salesforce users that is designed to point them to the exact information they are looking for.

Will Breetz, VP of product management at Salesforce, says that enterprise search has suffered over the years for a variety of reasons. “Enterprise search has gotten a bad rap, but deservedly so. Part of that is because in many ways it is more difficult than consumer search, and there’s a lot of headwinds,” Breetz explained.

To solve these issues, the company decided to bring the power of its Einstein artificial intelligence engine to bear on the problem. For starters, while it can’t rank results by global popularity the way Google does, it can learn the behaviors of an individual and deliver the right answer based on a person’s profile, including geography and past activity.

Einstein Search Personal

Image: Salesforce

Next, it lets you use natural language search phrasing to find the exact information you need, and the search tool understands and delivers the results. For instance, you could enter “my open opportunities in Boston” and, using natural language understanding, the tool translates that into the exact set of results you are looking for: your open opportunities in Boston. With conventional search, you would click a series of check boxes to narrow the results to only Boston; this is faster and more efficient.
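The translation step described here can be imagined as mapping query tokens to structured filters. Below is a minimal, hypothetical sketch of that idea; Salesforce has not published Einstein Search’s internals, so all names and logic here are illustrative:

```python
# Hypothetical sketch: mapping a query like "my open opportunities in Boston"
# onto structured CRM filters. Illustrative only, not Salesforce's actual code.

def parse_query(query: str, current_user: str) -> dict:
    """Translate a natural-language query into structured search filters."""
    filters = {}
    tokens = query.lower().split()

    # "my" scopes results to the current user's records
    if "my" in tokens:
        filters["owner"] = current_user

    # Recognize a record-status keyword
    if "open" in tokens:
        filters["status"] = "Open"

    # Recognize the object type being searched
    for obj in ("opportunities", "accounts", "contacts"):
        if obj in tokens:
            filters["object"] = obj

    # Naive location extraction: the token following "in"
    if "in" in tokens:
        idx = tokens.index("in")
        if idx + 1 < len(tokens):
            filters["city"] = tokens[idx + 1].title()

    return filters

print(parse_query("my open opportunities in Boston", "jsmith"))
# {'owner': 'jsmith', 'status': 'Open', 'object': 'opportunities', 'city': 'Boston'}
```

A production system would replace the keyword matching with a trained language model, but the output, a set of structured filters applied to the user’s own data, is the same idea.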

Finally, based on what the intelligence engine knows about you, and on your search parameters, it can predict the most likely actions you want to take and provide quick action buttons in the results to help you do that, reducing the time to action. It may not seem like much, but each reduced workflow adds up throughout a day, and the idea is to anticipate your requirements and help you get your work done more quickly.

Salesforce appears to have flipped the enterprise search problem. Instead of treating a limited set of data as a handicap, it is taking advantage of it, applying AI to deliver more meaningful results. It works with a limited set of record types for now, such as accounts, contacts and opportunities, but the company plans to add more over time.


By Ron Miller

Salesforce is developing an app to help build a sustainable company

Salesforce has always tried to be a socially responsible company, encouraging employees to work in the community, giving 1% of its profits to different causes and building and productizing the 1-1-1 philanthropic model. The company now wants to help other organizations be more sustainable to reduce their carbon footprint, and today it announced it is working on a product to help.

Patrick Flynn, VP of sustainability at Salesforce, says the company sees sustainability as a key issue, and one that requires action right now. The question was how Salesforce could help. As a highly successful software company, it decided to put that particular set of skills to work on the problem.

“We’ve been thinking about how can Salesforce really take action in the face of climate change. Climate change is the biggest, most important and most complex challenge humans have ever faced, and we know right now, every individual, every company needs to step forward and do everything it can,” Flynn told TechCrunch.

And to that end, the company is developing the Salesforce Sustainability Cloud, to help track a company’s sustainability efforts. The tool should look familiar to Salesforce customers, but instead of tracking customers or sales, this tool tracks carbon emissions, renewable energy usage and how well a company is meeting its sustainability goals.

Dashboards

Image: Salesforce

The tool works with internal data and third-party data as needed, and is subject to both an internal audit by the Sustainability team and third-party organizations to be sure that Salesforce (and Sustainability Cloud customers) are meeting their goals.

Salesforce has been using this product internally to measure its own sustainability efforts, which Flynn leads. “We use the product to measure our footprint across all sorts of different aspects of our operations from data centers, public cloud, real estate — and we work with third-party providers everywhere we can to have them make their operations cleaner, and more powered by renewable energy and less carbon intensive,” he said. When there is carbon generated, the company uses carbon offsets to finance sustainability projects such as clean cookstoves or helping preserve the Amazon rainforest.

Flynn says the investor community is increasingly looking for proof that companies are building a real, verifiable sustainability program, and the Sustainability Cloud is an effort to provide that information both for Salesforce and for other companies in a similar position.

The product is in beta now and is expected to be ready next year. Flynn could not yet say how much the company plans to charge for the service, but he said the goal of the product is positive social impact.


By Ron Miller

Ten years after Adobe bought Omniture, the deal comes into clearer focus

Ten years ago this week, Adobe acquired Omniture for $1.8 billion. At the time, Adobe was a software company selling boxed software like Dreamweaver, Flash and Photoshop to creatives. Many people were baffled by the move, not realizing that purchasing a web analytics company was really the first volley in a full company transformation to the cloud and a shift in focus from consumer to enterprise.

It would take many years for the full vision to unfold, so you can forgive people for not recognizing the implications of the acquisition at the time, but CEO Shantanu Narayen seemed to give an inkling of what he had in mind. “This is a game-changer for both Adobe and our customers. We will enable advertisers, media companies and e-tailers to realize the full value of their digital assets,” he said in a statement after the acquisition became public.

While most people thought that perhaps this move involved some sort of link between design and data, it would turn out to be more complex than that. Tony Byrne, founder and principal analyst at Real Story Group, tried to figure out the thinking behind the deal in an EContent column published a couple of months after it was announced.

“Going forward, I think the real action will continue to revolve around integrating management and metrics, less so than integrating design and metrics. And that’s why I also think that Adobe isn’t done acquiring yet.” It was pure speculation on Byrne’s part, but it proved prescient.

There’s something happening here


By Ron Miller

Daily Crunch: Salesforce launches vertical clouds

The Daily Crunch is TechCrunch’s roundup of our biggest and most important stories. If you’d like to get this delivered to your inbox every day at around 9am Pacific, you can subscribe here.

1. Salesforce doubles down on verticals, launches Manufacturing and Consumer Goods Clouds

Salesforce unveiled two new business units today as part of its strategy to build specialized solutions for specific industries.

For example, with its Manufacturing Cloud, Salesforce says it has built a way for sales agreements to link up with a company’s ERP and forecasting software, allowing for improved demand prediction.

2. Samsung’s Galaxy Tab S6 combines creative flexibility with great design

Darrell Etherington says the new Galaxy Tab S6 (with pricing starting at $649.99) expands the definition of what a tablet can be.

3. Facebook rolls out new video tools, plus Instagram and IGTV scheduling feature

The highlights include better ways to prep for and simulcast live broadcasts, ways to take better advantage of Watch Party events, new metrics to track video performance and a much-anticipated option to schedule Instagram/IGTV content for up to six months in advance.

4. Hear how to build a billion-dollar SaaS company at TechCrunch Disrupt

This year we’ll welcome three people to the Extra Crunch stage who know first-hand what it takes to join the billion-dollar club: Battery Ventures partner Neeraj Agrawal, HelloSign COO Whitney Bouck and Harness CEO Jyoti Bansal.


5. Beekeeper raises $45M Series B to become the ‘Slack for non-desk employees’

Beekeeper has built a mobile-first communications platform for employers who need to communicate with blue-collar and service-oriented workers.

6. How to get people to open your emails

We tackle the obvious stuff that can help with low open rates, as well as bigger challenges: Let’s say 60% of your audience opens your email — how can you get the remaining 40% to open and read it too? (Extra Crunch membership required.)

7. This week’s TechCrunch podcasts

The Equity team has some thoughts on the latest WeWork drama, and how it shows that valuations are essentially meaningless. And on Original Content, we review the Netflix documentary series “The Family.”


By Anthony Ha

Walt Disney Studios partners with Microsoft Azure on cloud innovation lab

Seems like everything is going to the cloud these days, so why should movie-making be left out? Today, Walt Disney Studios announced a five-year partnership with Microsoft around an innovation lab to find ways to shift content production to the Azure cloud.

The project involves the Walt Disney StudioLab, an innovation work space where Disney personnel can experiment with moving different workflows to the cloud. Movie production software company Avid is also involved.

The hope is that by working together, the three parties can come up with creative, cloud-based workflows that can accelerate the innovation cycle at the prestigious movie maker. Every big company is looking for ways to innovate, regardless of their core business, and Disney is no different.

As movie-making involves ever greater amounts of computing resources, the cloud is a perfect model for it, allowing studios to scale resources up and down as needed, whether rendering scenes or adding special effects. As Disney’s CTO Jamie Voris sees it, this could make these processes more efficient, which could help lower cost and time to production.

“Through this innovation partnership with Microsoft, we’re able to streamline many of our processes so our talented filmmakers can focus on what they do best,” Voris said in a statement. It’s the same kind of cloud value proposition that many large organizations are seeking. They want to speed time to market, while letting technology handle some of the more mundane tasks.

The partnership builds on an existing one that Microsoft already had with Avid where the two companies have been working together to build cloud-based workflows for the film industry using Avid software solutions on Azure. Disney will add its unique requirements to the mix, and over the five years of the partnership, hopes to streamline some of its workflows in a more modern cloud context.


By Ron Miller

The mainframe business is alive and well, as IBM announces new z15

It’s easy to think of mainframes as technology dinosaurs, but the fact is these machines remain a key component of many large organizations’ computing strategies. Today, IBM announced the latest in its line of mainframe computers, the z15.

For starters, as you would probably expect, these are big and powerful machines capable of handling enormous workloads. For example, this baby can process up to 1 trillion web transactions a day and handle 2.4 million Docker containers, while offering unparalleled security to go with that performance. This includes the ability to encrypt data once, and it stays encrypted, even when it leaves the system, a huge advantage for companies with a hybrid strategy.

Speaking of which, you may recall that IBM bought Red Hat last year for $34 billion. That deal closed in July and the companies have been working to incorporate Red Hat technology across the IBM business including the z line of mainframes.

IBM announced last month that it was making OpenShift, Red Hat’s Kubernetes-based cloud-native tools, available on the mainframe running Linux. This should enable developers who have been working with OpenShift on other systems to move seamlessly to the mainframe without special training.

IBM sees the mainframe as a bridge for hybrid computing environments, offering a highly secure place for data that when combined with Red Hat’s tools, can enable companies to have a single control plane for applications and data wherever it lives.

While it could be tough to justify the cost of these machines in the age of cloud computing, Ray Wang, founder and principal analyst at Constellation Research, says it could be more cost-effective than the cloud for certain customers. “If you are a new customer, and currently in the cloud and develop on Linux, then in the long run the economics are there to be cheaper than public cloud if you have a lot of IO, and need to get to a high degree of encryption and security,” he said.

He added, “The main point is that if you are worried about being held hostage by public cloud vendors on pricing, in the long run the Z is a cost-effective and secure option for owning compute power and working in a multi-cloud, hybrid cloud world.”

Companies like airlines and financial services companies continue to use mainframes, and while they need the power these massive machines provide, they need to do so in a more modern context. The z15 is designed to provide that link to the future, while giving these companies the power they need.


By Ron Miller

IBM brings Cloud Foundry and Red Hat OpenShift together

At the Cloud Foundry Summit in The Hague, IBM today showcased its Cloud Foundry Enterprise Environment on Red Hat’s OpenShift container platform.

For the longest time, the open-source Cloud Foundry Platform-as-a-Service ecosystem and Red Hat’s Kubernetes-centric OpenShift were mostly seen as competitors, with both tools vying for enterprise customers who want to modernize their application development and delivery platforms. But a lot of things have changed in recent times. On the technical side, Cloud Foundry started adopting Kubernetes as an option for application deployments and as a way of containerizing and running Cloud Foundry itself.

On the business side, IBM’s acquisition of Red Hat has brought along some change, too. IBM long backed Cloud Foundry as a top-level foundation member, while Red Hat bet on its own platform instead. Now that the acquisition has closed, it’s maybe no surprise that IBM is working on bringing Cloud Foundry to Red Hat’s platform.

For now, this work is still officially a technology experiment, but our understanding is that IBM plans to turn it into a fully supported project that will give Cloud Foundry users the option to deploy their applications right to OpenShift, while OpenShift customers will be able to offer their developers the Cloud Foundry experience.

“It’s another proof point that these things really work well together,” Cloud Foundry Foundation CTO Chip Childers told me ahead of today’s announcement. “That’s the developer experience that the CF community brings and in the case of IBM, that’s a great commercialization story for them.”

While Cloud Foundry isn’t seeing the same hype as in some of its earlier years, it remains one of the most widely used development platforms in large enterprises. According to the Cloud Foundry Foundation’s latest user survey, the companies that are already using it continue to move more of their development work onto the platform, and according to code analysis from source{d}, the project continues to see over 50,000 commits per month.

“As businesses navigate digital transformation and developers drive innovation across cloud native environments, one thing is very clear: they are turning to Cloud Foundry as a proven, agile, and flexible platform — not to mention fast — for building into the future,” said Abby Kearns, executive director at the Cloud Foundry Foundation. “The survey also underscores the anchor Cloud Foundry provides across the enterprise, enabling developers to build, support, and maximize emerging technologies.”

Also at this week’s Summit, Pivotal (which is in the process of being acquired by VMware) is launching the alpha version of the Pivotal Application Service (PAS) on Kubernetes, while Swisscom, an early Cloud Foundry backer, is launching a major update to its Cloud Foundry-based Application Cloud.


By Frederic Lardinois

Kubernetes co-founder Craig McLuckie is as tired of talking about Kubernetes as you are

“I’m so tired of talking about Kubernetes. I want to talk about something else,” joked Kubernetes co-founder and VP of R&D at VMware Craig McLuckie during a keynote interview at this week’s Cloud Foundry Summit in The Hague. “I feel like that 80s band that had like one hit song — Cherry Pie.”

He doesn’t quite mean it that way, of course (though it makes for a good headline, see above), but the underlying theme of the conversation he had with Cloud Foundry executive director Abby Kearns was that infrastructure should be boring and fade into the background, while enabling developers to do their best work. “We still have a lot of work to do as an industry to make the infrastructure technology fade into the background and bring forwards the technologies that developers interface with, that enable them to develop the code that drives the business, etc. […] Let’s make that infrastructure technology really, really boring. ”


What McLuckie wants to talk about is developer experience, and with VMware’s intent to acquire Pivotal, it’s placing a strong bet on Cloud Foundry as one of the premier development platforms for cloud-native applications. For the longest time, the Cloud Foundry and Kubernetes ecosystems, which share an organizational parent in the Linux Foundation, have been getting closer, but that move has accelerated in recent months as the Cloud Foundry ecosystem has finished work on some of its Kubernetes integrations.

McLuckie argues that the Cloud Native Computing Foundation, the home of Kubernetes and other cloud-native open-source projects, was always meant to be a kind of open-ended organization that focuses on driving innovation. And that created a large set of technologies that vendors can choose from. “But when you start to assemble that, I tend to think about you building up this cake which is your development stack, you discover that some of those layers of the cake, like Kubernetes, have a really good bake. They are done to perfection,” said McLuckie, who is clearly a fan of the Great British Baking show. “And other layers, you look at it and you think, wow, that could use a little more bake, it’s not quite ready yet. […] And we haven’t done a great job of pulling it all together and providing a recipe that delivers an entirely consumable experience for everyday developers.”


He argues that Cloud Foundry, on the other hand, has always focused on building that highly opinionated, consistent developer experience. “Bringing those two communities together, I think, is going to have incredibly powerful results for both communities as we start to bring these technologies together,” he said.

With the Pivotal acquisition still in the works, McLuckie didn’t really comment on what exactly this means for the path forward for Cloud Foundry and Kubernetes (which he still talked about with a lot of energy, despite being tired of it), but it’s clear that he’s looking to Cloud Foundry to enable that developer experience on top of Kubernetes that abstracts all of the infrastructure away for developers and makes deploying an application a matter of a single CLI command.

Bonus: Cherry Pie.


By Frederic Lardinois

Explorium reveals $19.1M in total funding for machine learning data discovery platform

Explorium, a data discovery platform for machine learning models, received a couple of unannounced funding rounds over the last year — a $3.6 million seed round last September and a $15.5 million Series A round in March. Today, it made both of these rounds public.

The seed round was led by Emerge with participation from F2 Capital. The Series A was led by Zeev Ventures with participation from the seed investors. The total raised is $19.1 million.

The company founders, who have a data science background, found that it was problematic to find the right data to build a machine learning model. Like most good startup founders confronted with a problem, they decided to solve it themselves by building a data discovery platform for data scientists.

CEO and co-founder Maor Shlomo says the company wanted to focus on the quality of the data because not much work has been done there. “A lot of work has been invested on the algorithmic part of machine learning, but the algorithms themselves have very much become commodities. The challenge now is really finding the right data to feed into those algorithms,” Shlomo told TechCrunch.

It’s a hard problem to solve, so they built a kind of search engine that can go out and find the best data wherever it happens to live, whether it’s internal or in an open data set, public data or premium databases. The company has partnered with thousands of data sources, according to Shlomo, to help data scientist customers find the best data for their particular model.

“We developed a new type of search engine that’s capable of looking at the customers data, connecting and enriching it with literally thousands of data sources, while automatically selecting what are the best pieces of data, and what are the best variables or features, which could actually generate the best performing machine learning model,” he explained.

Shlomo sees a big role for partnerships, whether that involves data sources or consulting firms, who can help push Explorium into more companies.

Explorium has 63 employees spread across offices in Tel Aviv, Kiev and San Francisco. It’s still early days, but Shlomo reports “tens of customers.” As more companies try to bring data science in-house, especially with a shortage of data scientists, a tool like Explorium could help fill that gap.


By Ron Miller

HashiCorp announces fully managed service mesh on Azure

Service mesh is just beginning to take hold in the cloud native world, and as it does, vendors are looking for ways to help customers understand it. One way to simplify the complexity of dealing with the growing number of service mesh products out there is to package it as a service. Today, HashiCorp announced a new service on Azure to address that need, building it into the Consul product.

HashiCorp co-founder and CTO Armon Dadgar says it’s a fully managed service. “We’ve partnered closely with Microsoft to offer a native Consul [service mesh] service. At the highest level, the goal here is, how do we make it basically push button,” Dadgar told TechCrunch.

He adds that there is extremely tight integration in terms of billing and permissions, as well as other management functions, as you would expect with a managed service in the public cloud. Brendan Burns, one of the original Kubernetes developers, who is now a distinguished engineer at Microsoft, says the HashiCorp solution really strips away a lot of the complexity associated with running a service mesh.

“In this case, HashiCorp is using some integration into the Azure control plane to run Consul for you. So you just consume the service mesh. You don’t have to worry about the operations of the service mesh,” Burns said. He added, “This is really turning it into a service instead of a do-it-yourself exercise.”

Service meshes are tools used in conjunction with containers and Kubernetes in a dynamic cloud native environment to help microservices communicate and interoperate with one another. A growing number of them, including Istio, Envoy and Linkerd, are jockeying for position right now.

Burns makes it clear that while Microsoft is working closely with HashiCorp on this project, it’s also working with other vendors. “Our goal with the service mesh interface specification was really to let a lot of partners be successful on the platform. You know, there’s a bunch of different service meshes. It’s a place where we feel like there’s a lot of evolution and experimentation happening, so we want to make sure that our customers can find the right solution for them,” Burns explained.

The HashiCorp Consul service is currently in private beta.


By Ron Miller

HashiCorp expands Terraform free version, adds paid tier for SMBs

HashiCorp has had a free tier for its Terraform product in the past, but it was basically for a single user. Today, the company announced it was expanding that free tier to allow up to five users, while also increasing the range of functions that are available before you have to pay.

“We’re announcing a pretty large expansion of the Terraform Cloud free tier. So many of the capabilities that used to be exclusively in our Terraform enterprise product, we’re now bringing down into the Terraform free tier. It allows you to do central actual execution of Terraform and apply the full lifecycle as part of the free tier,” HashiCorp co-founder and CTO Armon Dadgar explained.

In addition, the company announced a middle tier aimed at SMBs. Dadgar says the new pricing tier addresses some obvious gaps in the pricing catalogue for a large set of users who outgrew the free product yet weren’t ready for the enterprise version.

“What we were seeing was a lot of friction with our SMB customers trying to figure out how to go from one-user Terraform to a team of five people or a team of 20 people. And I think the challenge was that we had the enterprise product, which in terms of deployment and pricing, is really geared toward Global 2000 kinds of companies,” Dadgar told TechCrunch.

He said this left a huge gap for smaller teams of between five and 100 users, which forced those teams to kludge together solutions to fit their requirements. The company thought it would make more sense to have a paid tier specifically geared for this group that would create a logical path for all users on the platform, while solving a known problem.

“It’s a logical path, but it also just answers the constant questions on forums and mailing lists regarding how to collaborate [with smaller teams]. Before, we didn’t have a prescriptive answer, and so there was a lot of DIY, and this is our attempt at a prescriptive answer of how you should do this,” he said.

Terraform is the company’s tool for defining, deploying and managing infrastructure as code. There is an open source product, an on-prem version and a SaaS version.
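The declarative model behind infrastructure as code boils down to diffing desired state (declared in configuration) against actual state and computing the changes needed. Here is a toy illustration of that plan step; it is not Terraform’s real engine, and the resource names are made up:

```python
# Toy illustration of the declarative plan/apply model Terraform popularized:
# compare desired resources against actual state and derive the actions needed.

def plan(desired: dict, actual: dict) -> dict:
    """Diff desired vs. actual state into create/update/delete actions."""
    return {
        "create": [name for name in desired if name not in actual],
        "update": [name for name in desired
                   if name in actual and desired[name] != actual[name]],
        "delete": [name for name in actual if name not in desired],
    }

# Hypothetical example: one resource to resize, one to add, one to remove.
desired = {"web": {"size": "t3.small"}, "db": {"size": "t3.medium"}}
actual = {"web": {"size": "t3.micro"}, "cache": {"size": "t3.micro"}}
print(plan(desired, actual))
# {'create': ['db'], 'update': ['web'], 'delete': ['cache']}
```

Because the configuration fully describes the target state, running the same plan twice against a reconciled environment produces no actions, which is what makes the model safe to collaborate on within a team.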


By Ron Miller

Snyk grabs $70M more to detect security vulnerabilities in open source code and containers

A growing number of IT breaches has led to security becoming a critical and central aspect of how computing systems are run and maintained. Today, a startup that focuses on one specific area — developing security tools aimed at developers and the work they do — has closed a major funding round that underscores the growth of that area.

Snyk — a London and Boston-based company that got its start identifying and developing security solutions for developers working on open source code — is today announcing that it has raised $70 million, funding that it will be using to continue expanding its capabilities and overall business. For example, the company has more recently expanded to building security solutions to help developers identify and fix vulnerabilities around containers, an increasingly standard unit of software used to package up and run code across different computing environments.

Open source — Snyk works as an integration into existing developer workflows, compatible with the likes of GitHub, Bitbucket and GitLab, as well as CI/CD pipelines — was an easy target to hit. It’s used in 95% of all enterprises, with up to 77% of open source components liable to have vulnerabilities, by Snyk’s estimates. Containers are a different issue.
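At its core, this kind of open source scanning matches a project’s pinned dependency versions against a database of known advisories. The sketch below is a simplification with made-up data, not Snyk’s actual database or matching logic (real scanners handle version ranges, transitive dependencies and fix suggestions):

```python
# Simplified sketch of dependency scanning: match a project's pinned
# dependencies against a known-vulnerabilities database. Data is illustrative.

VULN_DB = {
    # package name -> versions with known advisories (made-up entries)
    "left-pad": {"1.0.0"},
    "lodash": {"4.17.10", "4.17.11"},
}

def scan(dependencies: dict) -> list:
    """Return (package, version) pairs with known vulnerabilities."""
    return [(pkg, ver) for pkg, ver in dependencies.items()
            if ver in VULN_DB.get(pkg, set())]

deps = {"lodash": "4.17.11", "express": "4.17.1"}
print(scan(deps))  # [('lodash', '4.17.11')]
```

Hooking a check like this into GitHub, Bitbucket, GitLab or a CI/CD pipeline, as the article describes, means every commit gets screened before vulnerable code ships.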

“The security concerns around containers are almost more about ownership than technology,” Guy Podjarny, the president who co-founded the company with Assaf Hefetz and Danny Grander, explained in an interview. “They are in a twilight zone between infrastructure and code. They look like virtual machines and suffer many of the same concerns such as being unpatched or having permissions that are too permissive.”

While containers are present in fewer than 30% of computing environments today, their growth is on the rise, according to Gartner, which forecasts that by 2022, over 75% of global organizations will run containerized applications. Snyk estimates that a full 44% of Docker image scans (Docker being one of the major container vendors) have known vulnerabilities.

This latest round is being led by Accel with participation from existing investors GV and Boldstart Ventures. These three, along with a fourth investor (Heavybit) also put $22 million into the company as recently as September 2018. That round was made at a valuation of $100 million, and from what we understand from a source close to the startup, it’s now in the “range” of $500 million.

“Accel has a long history in the security market and we believe Snyk is bringing a truly unique, developer-first approach to security in the enterprise,” said Matt Weigand of Accel in a statement. “The strength of Snyk’s customer base, rapidly growing free user community, leadership team and innovative product development prove the company is ready for this next exciting phase of growth and execution.”

Indeed, the company has hit some big milestones in the last year that could explain that hike. It now has some 300,000 developers using it around the globe, with its customer base growing some 200% this year and including the likes of Google, Microsoft, Salesforce and ASOS (sidenote: when developers at developer-centric companies working at the vanguard of computing, like Google and Microsoft, are using your product, that is a good sign). Notably, that has largely come by word of mouth — inbound interest.

The company in July of this year took on a new CEO, Peter McKay, who replaced Podjarny. McKay was the company’s first investor and has a track record in helping to grow large enterprise security businesses, a sign of the trajectory that Snyk is hoping to follow.

“Today, every business, from manufacturing to retail and finance, is becoming a software business,” said McKay. “There is an immediate and fast-growing need for software security solutions that scale at the same pace as software development. This investment helps us continue to bring Snyk’s product-led and developer-focused solutions to more companies across the globe, helping them stay secure as they embrace digital innovation – without slowing down.”

By Ingrid Lunden

Latest Adobe tool helps marketers work directly with customer journey data

Adobe has a lot going on with Analytics and the Customer Experience Platform, a place to gather data to understand customers better. Today, it announced a new analytics tool that enables employees to work directly with customer journey data to help deliver a better customer experience.

The customer journey involves a lot of different systems from a company data lake to CRM to point of sale. This tool pulls all of that data together from across multiple systems and various channels and brings it into the data analysis workspace, announced in July.
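As a rough illustration of what “pulling all of that data together” can mean in practice — merging events from separate systems into one ordered per-customer timeline — here is a hedged sketch. The system names, record shapes and shared customer ID below are invented for illustration; Adobe’s actual schema is not described in the article.

```python
# Illustrative sketch: stitch customer-journey events from several
# systems (e.g. CRM, point of sale, web analytics) into one timeline
# per customer, keyed by a shared customer ID. Record shapes invented.

from collections import defaultdict

crm_events = [
    {"customer_id": "c42", "ts": 1001, "event": "email_opened"},
]
pos_events = [
    {"customer_id": "c42", "ts": 1005, "event": "in_store_purchase"},
]
web_events = [
    {"customer_id": "c42", "ts": 1003, "event": "product_page_view"},
    {"customer_id": "c99", "ts": 1002, "event": "cart_abandoned"},
]

def build_timelines(*sources):
    """Merge event streams and sort each customer's events by timestamp."""
    timelines = defaultdict(list)
    for source in sources:
        for record in source:
            timelines[record["customer_id"]].append((record["ts"], record["event"]))
    for events in timelines.values():
        events.sort()  # chronological order within each customer's journey
    return dict(timelines)

timelines = build_timelines(crm_events, pos_events, web_events)
print(timelines["c42"])
```

The hard part in a real deployment is not the merge itself but agreeing on the shared identifier and a standardized event model across systems — which is what the workspace's common data model is meant to provide.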

Nate Smith, group manager for product marketing for Adobe Analytics, says the idea is to give access to this data in a standard way across the organization, whether it’s a data scientist, an analyst with SQL skills or a marketing pro simply looking for insight.

“When you think about organizations that are trying to do omni-channel analysis or trying to get that next channel of data in, they now have the platform to do that, where the data can come in and we standardize it on an academic model,” he said. On top of that sits the ability to continuously query the data in a visual way, surfacing additional insight they might not have seen otherwise.

[Screenshot: Adobe]

Adobe is trying to be as flexible as possible in every step of the process, and openness was a guiding principle here, Smith said. That means that data can come from any source, and users can visualize it using Adobe tools or an external tool like Tableau or Looker. What’s more, they can get data in or out as needed, or even use their own models, Smith said.

“We recognize that as much as we’d love to have everyone go all in on the Adobe stack, we understand that there is existing significant investment in other tech and that integration and interoperability really needs to happen, as well,” he said.

Ultimately this is about giving marketers access to a full picture of the customer data to deliver the best experience possible based on what they know about each customer. “Being able to have insight and engagement points to help with the moments that matter and provide great experience is really what we’re aiming to do with this,” he said.

This product will be generally available next month.


By Ron Miller

With its Kubernetes bet paying off, Cloud Foundry doubles down on developer experience

More than fifty percent of the Fortune 500 companies are now using the open-source Cloud Foundry Platform-as-a-Service project — either directly or through vendors like Pivotal — to build, test and deploy their applications. Like so many other projects, including the likes of OpenStack, Cloud Foundry went through a bit of a transition in recent years as more and more developers started looking to containers — and especially the Kubernetes project — as a platform to develop on. Now, however, the project is ready to focus on what always differentiated it from its closed- and open-source competitors: the developer experience.

Long before Docker popularized containers for application deployment, Cloud Foundry had already bet on containers and written its own orchestration service. With all of the momentum behind Kubernetes, though, it’s no surprise that many in the Cloud Foundry community started to look at this new project as a replacement for the existing container technology.


By Frederic Lardinois