DREAMTECH NEWS

Putting the Pentagon $10B JEDI cloud contract into perspective

Sometimes $10 billion isn’t as much as you think.

It’s true that when you look at the bottom line number of the $10 billion Joint Enterprise Defense Infrastructure (JEDI) cloud contract, it’s easy to get lost in the sheer size of it, and the fact that it’s a one-vendor deal. The key thing to remember as you think about this deal is that while it’s obviously a really big number, it’s spread out over a long period of time and involves a huge and growing market.

It’s also important to remember that the Pentagon has given itself lots of out clauses in the way the contract is structured. This could be important for those who are worried about one vendor having too much power in a deal like this. “This is a two-year contract, with three option periods: one for three years, another for three years, and a final one for two years,” Heather Babb, a Pentagon spokeswoman, told TechCrunch.

The contract itself has been set up to define the department’s cloud strategy for the next decade. The thinking is that by establishing a relationship with a single vendor, it will improve security and simplify overall management of the system. It’s also part of a broader view of setting technology policy for the next decade and preparing the military for more modern requirements like Internet of Things and artificial intelligence applications.

Many vendors have publicly expressed unhappiness at the winner-take-all, single vendor approach, which they believe might be unfairly tilted toward market leader Amazon. Still, the DOD, which has stated that the process is open and fair, seems determined to take this path, much to the chagrin of most vendors, who believe that a multi-vendor strategy makes more sense.

John Dinsdale, chief analyst at Synergy Research Group, a firm that keeps close tabs on the cloud market, says it’s also important to keep the figure in perspective compared to the potential size of the overall market.

“The current worldwide market run rate is equivalent to approximately $60 billion per year and that will double in less than three years. So in very short order you’re going to see a market that is valued at greater than $100 billion per year – and is continuing to grow rapidly,” he said.

Put in those terms, $10 billion over a decade, while surely a significant figure, isn’t quite market altering if the market size numbers are right. “If the contract is truly worth $10 billion that is clearly a very big number. It would presumably be spread over many years which then puts it at only a very small share of the total market,” he said.

He also acknowledges that it would be a big feather in the cap of whichever company wins the business, and it could open the door for other business in the government and private sector. After all, if you can handle the DOD, chances are you can handle just about any business where a high level of security and governance would be required.

Final RFPs are now due on October 12th with a projected award date of April 2019, but even at $10 billion, an astronomical sum of money to be sure, it ultimately might not shift the market in the way you think.


By Ron Miller

Instana raises $30M for its application performance monitoring service

Instana, an application performance monitoring (APM) service with a focus on modern containerized services, today announced that it has raised a $30 million Series C funding round. The round was led by Meritech Capital, with participation from existing investor Accel. This brings Instana’s total funding to $57 million.

The company, which counts the likes of Audi, Edmunds.com, Yahoo Japan and Franklin American Mortgage as its customers, considers itself an APM 3.0 player. It argues that its solution is far lighter than those of older players like New Relic and AppDynamics (which sold to Cisco hours before it was supposed to go public). Those solutions, the company says, weren’t built for modern software organizations (though I’m sure they would dispute that).

What really makes Instana stand out is its ability to automatically discover and monitor the ever-changing infrastructure that makes up a modern application, especially when it comes to running containerized microservices. The service automatically catalogs all of the endpoints that make up a service’s infrastructure, and then monitors them. It’s also worth noting that the company says it can offer far more granular metrics than its competitors.
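
To make the idea concrete, here is a toy sketch of the discover-then-monitor pattern the company describes; the endpoint list is a placeholder, and this is not Instana’s agent code.

```python
# Toy sketch of the discover-then-monitor idea described above (not Instana's code):
# take a list of endpoints a real agent would discover from the container runtime,
# then poll each one and record a simple health/latency metric.
import time
import urllib.request

DISCOVERED_ENDPOINTS = [
    "http://orders.internal:8080/health",    # placeholder endpoints standing in for
    "http://payments.internal:8080/health",  # what auto-discovery would catalog
]


def poll(url: str) -> dict:
    start = time.time()
    try:
        with urllib.request.urlopen(url, timeout=2) as resp:
            status = resp.status
    except Exception:
        status = None  # endpoint unreachable
    return {"endpoint": url, "status": status, "latency_s": round(time.time() - start, 3)}


for endpoint in DISCOVERED_ENDPOINTS:
    print(poll(endpoint))
```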

Instana says that its annual sales grew 600 percent over the course of the last year, something that surely attracted this new investment.

“Monitoring containerized microservice applications has become a critical requirement for today’s digital enterprises,” said Meritech Capital’s Alex Kurland. “Instana is packed with industry veterans who understand the APM industry, as well as the paradigm shifts now occurring in agile software development. Meritech is excited to partner with Instana as they continue to disrupt one of the largest and most important markets with their automated APM experience.”

The company plans to use the new funding to fulfill the demand for its service and expand its product line.


By Frederic Lardinois

With Mulesoft in fold, Salesforce gains access to data wherever it lives

When Salesforce bought Mulesoft last spring for the tidy sum of $6.5 billion, it looked like money well spent for the CRM giant. After all, it was providing a bridge between the cloud and the on-prem data center and that was a huge missing link for a company with big ambitions like Salesforce.

When you want to rule the enterprise, you can’t be limited by where data lives and you need to be able to share information across disparate systems. Partly that’s a simple story of enterprise integration, but on another level it’s purely about data. Salesforce introduced its intelligence layer, dubbed Einstein, at Dreamforce in 2016.

With Mulesoft in the fold, Salesforce has access to data across systems wherever it lives, in the cloud or on-prem. Data is the fuel of artificial intelligence, and Salesforce has been trying desperately to get more data for Einstein since its inception.

It lost out on LinkedIn to Microsoft, which flexed its financial muscles and reeled in the business social network for $26.5 billion a couple of years ago. It was undoubtedly a rich source of data that the company longed for. Next, Salesforce set its sights on Twitter, but after board and stockholder concerns, it walked away (Twitter, of course, was ultimately never sold).

Each of these forays was all about the data, and, frustrated, Salesforce went back to the drawing board. While Mulesoft did not supply the direct cache of data that a social network would have, it did provide a neat way to get at back-end data sources, the very type of data that matters most to Salesforce’s enterprise customers.

Today, the company has extended that notion beyond pure data access to a graph. You can probably see where this is going. The idea of a graph, the connections between, say, a buyer and the things they tend to buy, or a person on a social network and the people they tend to interact with, can be extended even to the network/API level, and that is precisely the story Salesforce is trying to tell this week at its Dreamforce customer conference in San Francisco.

Visualizing connections in a data integration network in Mulesoft. Screenshot: Salesforce/Mulesoft

Maureen Fleming, program vice president for integration and process automation research at IDC, says that it is imperative that organizations view data as a strategic asset and act accordingly. “Very few companies are getting all the value from their data as they should be, as it is locked up in various applications and systems that aren’t designed to talk to each other. Companies who are truly digitally capable will be able to connect these disparate data sources, pull critical business-level data from these connections, and make informed business decisions in a way that delivers competitive advantage,” Fleming explained in a statement.

Configuring data connections on Mulesoft Anypoint Platform. Gif: Salesforce/Mulesoft

It’s hard to overstate how valuable this type of data is to Salesforce, which has already put Mulesoft to work internally to help build the new Customer 360 product announced today. The company can point to how it’s providing the very type of data integration Fleming is referring to across its own product set.

Bret Taylor, president and chief product officer at Salesforce, says that for his company all of this is ultimately about enhancing the customer experience. You need to be able to stitch together these different computing environments and data silos to make that happen.

“In the short term, [customer] infrastructure is often fragmented. They often have some legacy applications on premise, they’ll have some cloud applications like Salesforce, but some infrastructure on Amazon or Google and Azure, and to actually transform the customer experience, they need to bring all this data together. And so it’s really a unique time for integration technologies like Mulesoft, because it enables you to create a seamless customer experience, no matter where that data lives, and that means you don’t need to wait for infrastructure to be perfect before you can transform your customer experience.”


By Ron Miller

Salesforce wants to end customer service frustration with Customer 360

How many times have you called into a company, answered a bunch of preliminary questions about the purpose of your call, and then found that those answers didn’t make their way to the CSR who ultimately took your call?

This usually happens because System A can’t talk to System B, and it’s frustrating for the caller, who is already angry about having to repeat the same information. Salesforce wants to help bring an end to that problem with its new Customer 360 product, announced today at Dreamforce, the company’s customer conference taking place this week in San Francisco.

What’s interesting about Customer 360 from a product development perspective is that Salesforce took the technology from the $6.5 billion Mulesoft acquisition and didn’t just turn it into a product; it also used the same technology internally to pull the various pieces together into a more unified view of the Salesforce product family. This should, in theory, allow the customer service representative talking to you on the phone to get the total picture of your interactions with the company, reducing the need to repeat yourself because the information wasn’t passed on.

Screenshot: Salesforce

The idea here is to bring all of the different products — sales, service, community, commerce and marketing — into a single unified view of the customer. And you can do this without actually writing any code, according to the company.

Adding a data source to Customer 360. Gif: Salesforce

This allows anyone who interacts with the customer to see the whole picture, something that has eluded many companies and upset many customers. The customer record in Salesforce CRM is only part of the story, as are the marketing pitches and the e-commerce records. Together they tell a story about that customer, but if the data is trapped in silos, nobody can see it. That’s what Customer 360 is supposed to solve.

While Bret Taylor, Salesforce’s president and chief product officer, says there were ways to make this happen before in Salesforce, the company has never offered a product that does so in such a direct way. He says that big brands like Apple, Amazon and Google have changed our expectations of how we are treated when we connect with a brand. Customer 360 is focused on helping companies meet that expectation.

“Now, when people don’t get that experience, where the company that you’re interacting with doesn’t know who you are, it’s gone from a pleasant experience to an expectation, and that’s what we hear time and time again from our customers. And that’s why we’re so focused on integration, that single view of the customer is the ultimate value proposition of these experiences,” Taylor explained.

This product is aimed at the Salesforce admins who have been responsible in the past for configuring and customizing Salesforce products for the unique needs of each department or the overall organization. They can configure Customer 360 to pull data from Salesforce and other products, too.

Customer 360 is being piloted in North America right now and should become generally available sometime next year.


By Ron Miller

Chef launches deeper integration with Microsoft Azure

DevOps automation service Chef today announced a number of new integrations with Microsoft Azure. The news, announced at the Microsoft Ignite conference in Orlando, Florida, focuses on helping enterprises bring their legacy applications to Azure and ranges from the public preview of Chef Automate Managed Service for Azure to the integration of Chef’s InSpec compliance product with Microsoft’s cloud platform.

With Chef Automate as a managed service on Azure, which provides ops teams with a single tool for managing and monitoring their compliance and infrastructure configurations, developers can now easily deploy and manage Chef Automate and the Chef Server from the Azure Portal. It’s a fully managed service and the company promises that businesses can get started with using it in as little as thirty minutes (though I’d take those numbers with a grain of salt).

When those configurations need to change, Chef users on Azure can also now use the Chef Workstation with Azure Cloud Shell, Azure’s command line interface. Workstation is one of Chef’s newest products and focuses on making ad-hoc configuration changes, no matter whether the node is managed by Chef or not.

And to remain in compliance, Chef is also launching an integration of its InSpec security and compliance tools with Azure. InSpec works hand in hand with Microsoft’s new Azure Policy Guest Configuration (who comes up with these names?) and allows users to automatically audit all of their applications on Azure.

“Chef gives companies the tools they need to confidently migrate to Microsoft Azure so users don’t just move their problems when migrating to the cloud, but have an understanding of the state of their assets before the migration occurs,” said Corey Scobie, the senior vice president of products and engineering at Chef, in today’s announcement. “Being able to detect and correct configuration and security issues to ensure success after migrations gives our customers the power to migrate at the right pace for their organization.”



By Frederic Lardinois

LinkedIn steps into business intelligence with the launch of Talent Insights

LinkedIn may be best known as a place where people and organizations keep public pages of their professional profiles, using that as a starting point for networking, recruitment and more — a service that today has racked up more than 575 million users, 20 million companies and 15 million active job listings. But now, under the ownership of Microsoft, the company has increasingly started to build a number of other services; today sees the latest of these, the launch of a new feature called Talent Insights.

Talent Insights is significant in part because it is LinkedIn’s first foray into business intelligence, that branch of enterprise analytics aimed at helping execs and other corporate end users make more informed business decisions.

Talent Insights is also notable because it’s part of a trend in which LinkedIn has been launching a number of other services that take it beyond being a straight social network and toward being more of an IT productivity tool. These have included a way for users to look at and plan commutes to potential jobs (or other businesses); several integrations with Microsoft software, including resume building in Word and Outlook integrations; and more CRM tools in its Sales Navigator product.

Interestingly, it has been nearly a year between LinkedIn first announcing Talent Insights and actually launching it today. The company says part of the reason for the gap is that it has been tinkering with the product to get it right: it has been testing it with a number of customers — there are now 100 using Talent Insights — in departments like human resources, recruitment and marketing.

The product that’s launching today is largely similar to what the company previewed a year ago: there are two parts to it, one focused on people at a company, called “Talent Pool,” and another focused on data about a company, “Company Report.”

The first of these will let businesses run searches across the LinkedIn database to discover talent with characteristics similar to those of the people a business is already hiring, figure out where those people are at the moment (in terms of location and company affiliation) and where they are moving, what skills they have in common, and how to better spot those who might be on the way up based on all of this.

The second set of data tools (Company Report) provides a similar analytics profile, but about your own organisation and those you would like to compare it against, in areas like the relative education levels and schools of the respective workforces; which skills employees have or don’t have; and so on.

Dan Francis, a senior product manager running Talent Insights, said in an interview that for now the data powering Talent Insights comes primarily from LinkedIn itself, although other data sources are also added in, such as material from the Bureau of Labor Statistics. (And indeed, even some of LinkedIn’s other data troves, for example its recruitment listings and its news/content play, are populated by material that comes from third parties.)

He also added that letting companies feed in their own data to use that in number crunching — either for their own reports or those of other companies — “is on our roadmap,” an indication that LinkedIn sees some mileage in this product.

Adding in more data sources could also help the company appear more impartial and accurate: although LinkedIn is huge and the biggest repository of information of its kind when it comes to professional profiles, it’s not always accurate and in some cases can be completely out of date or intentionally misleading.

(Related: LinkedIn has yet to launch any “verified”-style profiles for people, such as you get on Facebook or Twitter, to prove they are who they say they are, that they work where they claim to work, and that their backgrounds are what they claim them to be. My guess as to why that has not been rolled out is that it would be very hard, if not impossible, to verify everything in a clear way, and so LinkedIn relies on the power of public scrutiny to keep people mostly honest.)

“We’re pretty transparent about this,” said Francis. “We don’t position this product as comprehensive, but as a representative sample. Ensuring data quality is good is something that we are careful about. We know sometimes data is not perfect. In some cases it is directional.”


By Ingrid Lunden

Salesforce, AWS expand partnership with secure data sharing between platforms

Salesforce and Amazon’s cloud arm, AWS, have had a pretty close relationship for some time, signing a $400 million deal for infrastructure cloud services in 2016, but today at Dreamforce, Salesforce’s massive customer conference taking place this week in San Francisco, they took it to another level. The two companies announced they were offering a new set of data integration services between the two cloud platforms for common customers.

Matt Garman, vice president of Amazon Elastic Compute Cloud, says customers looking to transform digitally are still primarily concerned about security when moving data between cloud vendors. More specifically, they have been asking for a way to move data more securely between the Salesforce and Amazon platforms. “Customers talked to us about sensitive data in Salesforce and using deep analytics and data processing on AWS and moving them back and forth in a secure way,” he said. Today’s announcements let them do that.

In practice, Salesforce customers can set up a direct connection using AWS PrivateLink to connect directly to private Salesforce APIs and move data from Salesforce to an Amazon service such as Redshift, the company’s data warehouse product, without ever exposing the data to the open internet.
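
For a rough sense of what that looks like in practice, here is a minimal boto3 sketch of creating a PrivateLink interface endpoint; the endpoint service name and network IDs are placeholders rather than real Salesforce values.

```python
# Minimal sketch: create an AWS PrivateLink interface endpoint with boto3 so traffic
# to a partner API stays on the AWS network instead of the open internet.
# The service name, VPC, subnet and security group IDs are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",                       # your VPC (placeholder)
    ServiceName="com.amazonaws.vpce.us-east-1.example",  # partner endpoint service (placeholder)
    SubnetIds=["subnet-0123456789abcdef0"],
    SecurityGroupIds=["sg-0123456789abcdef0"],
    PrivateDnsEnabled=False,
)

print(response["VpcEndpoint"]["VpcEndpointId"])
```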

Further, Salesforce customers can set up Lambda functions so that when certain conditions are met in Salesforce, it triggers an action such as moving data (or vice versa). This is commonly known as serverless computing and developers are increasingly using event triggers to drive business processes.
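
As a simple illustration of that event-driven pattern (not the actual Salesforce-AWS integration code), a Lambda handler that receives an event and lands the payload in S3 might look something like this; the event shape and bucket name are assumptions for the example.

```python
# Illustrative sketch of the serverless pattern described above: a Lambda function
# fires when an upstream event arrives and stores the record in S3 for later analytics.
# The event shape and bucket name are assumptions, not a real schema.
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "example-crm-events"  # placeholder bucket name


def handler(event, context):
    # e.g. a record forwarded by an upstream integration (hypothetical shape)
    record_id = event.get("record_id", "unknown")
    s3.put_object(
        Bucket=BUCKET,
        Key=f"salesforce/opportunities/{record_id}.json",
        Body=json.dumps(event).encode("utf-8"),
        ContentType="application/json",
    )
    return {"status": "stored", "record_id": record_id}
```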

Finally, the two companies are integrating more directly with Amazon Connect, the Amazon contact center software it launched in 2017. This is where it gets more interesting, because of course Salesforce offers its own contact center services with Salesforce Service Cloud. The two companies have found a way to work together here, helping common customers build what they are calling AI-driven self-service applications using Amazon Connect on the Salesforce mobile Lightning development platform.

This could involve, among other things, building mobile applications that take advantage of Amazon Lex, AWS’s bot-building service, and Salesforce Einstein, Salesforce’s artificial intelligence platform. Common customers can download the Amazon Connect CTI Adapter on the Salesforce AppExchange.

Make no mistake, this is a significant announcement in that it involves two of the most successful cloud companies on the planet working directly together to offer products and services that benefit their common customers. This was not lost on Bret Taylor, president and chief product officer at Salesforce. “We’re enabling something that wouldn’t have been possible. It’s really exciting because it’s something unique in the marketplace,” he said.

What’s more, it comes on the heels of yesterday’s partnership news with Apple, giving Salesforce two powerful partners to work with moving forward.

While the level of today’s news is unprecedented between the two companies, they have been working together for some time. As Garman points out, Heroku, which Salesforce bought in 2010, and Quip, which it bought in 2016, were both built on AWS from the get-go. Salesforce, which mostly runs its own data centers in the U.S., runs most of its public cloud workloads on AWS, especially outside the U.S. Conversely, Amazon uses Salesforce tools internally.


By Ron Miller

Snyk raises $22M on a $100M valuation to detect security vulnerabilities in open source code

Open source software is now a $14 billion+ market and growing fast, in use in one way or another in 95 percent of all enterprises. But that expansion comes with a shadow: open source components can come with vulnerabilities, and their widespread use in apps becomes a liability to a company’s cybersecurity.

Now, a startup out of the UK called Snyk, which has built a way to detect when those apps or components are compromised, is announcing a $22 million round of funding to meet the demand from enterprises wanting to tackle the issue head on.

Led by Accel, with participation from GV plus previous investors Boldstart Ventures and Heavybit, this Series B notably is the second round raised by Snyk within seven months — it raised a $7 million Series A in March. That’s a measure of how the company is growing (and how enthusiastic investors are about what it has built so far). The startup is not disclosing its valuation but a source close to the deal says it is around $100 million now (it’s raised about $33 million to date).

As a measure of Snyk’s growth, the company says it now has over 200 paying customers and 150,000 users, with revenues growing five-fold in the last nine months. In March, it had 130 paying customers.

(Current clients include ASOS, Digital Ocean, New Relic and Skyscanner, the company said.)

Snyk plays squarely in the middle of how the landscape for enterprise services exists today. It provides options for organisations to use it on-premises, via the cloud, or in a hybrid version of the two, with a range of paid and free tiers to get users acquainted with the service.

Guy Podjarny, the company’s CEO who co-founded Snyk with Assaf Hefetz and Danny Grander, explained that Snyk works in two parts. First, the startup has built a threat intelligence system “that listens to open source activity.” Tapping into open-conversation platforms — for example, GitHub commits and forum chatter — Snyk uses machine learning to detect potential mentions of vulnerabilities. It then funnels these to a team of human analysts, “who verify and curate the real ones in our vulnerability DB.”

Second, the company analyses source code repositories — including, again, GitHub as well as BitBucket — “to understand which open source components each one uses, flag the ones that are vulnerable, and then auto-fix them by proposing the right dependency version to use and through patches our security team builds.”
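
As a toy illustration of that second step, matching a project’s declared dependencies against a curated vulnerability database and proposing patched versions can be sketched like this; the packages, versions and database contents are invented, and this is not Snyk’s actual tooling.

```python
# Toy sketch of the technique described above: compare a project's declared
# dependencies against a curated vulnerability database and propose fixed versions.
# The database contents and manifest below are invented examples, not Snyk's data.

# vulnerability DB: package -> (vulnerable versions, minimum fixed version)
VULN_DB = {
    "left-pad-ish": {"vulnerable": ["1.0.0", "1.0.1"], "fixed_in": "1.0.2"},
    "yaml-parser-x": {"vulnerable": ["3.1.0"], "fixed_in": "3.2.0"},
}

# dependencies parsed from a manifest (e.g. package.json or requirements.txt)
manifest = {"left-pad-ish": "1.0.1", "http-client-y": "2.4.0"}


def audit(dependencies: dict) -> list:
    """Return a list of (package, current version, suggested version) upgrade proposals."""
    proposals = []
    for name, version in dependencies.items():
        entry = VULN_DB.get(name)
        if entry and version in entry["vulnerable"]:
            proposals.append((name, version, entry["fixed_in"]))
    return proposals


for name, current, fixed in audit(manifest):
    print(f"{name} {current} is vulnerable; upgrade to {fixed}")
```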

Open source components don’t have more vulnerabilities than closed source ones, he added, “but their heavy reuse makes those vulnerabilities more impactful.” Components can be used in thousands of applications, and by Snyk’s estimation, some 77 percent of those applications will end up with components that have security vulnerabilities. “As a result, the chances of an organisation being breached through a vulnerable open source component are far greater than a security flaw purely in their code.”

Podjarny says there are no plans to tackle proprietary code longer term; rather, the plan is to expand how Snyk can monitor apps built on open source.

“Our focus is on two fronts – building security tools developers love, and fixing open source security,” he said. “We believe the risk from insecure use of open source code is far greater than that of your own code, and is poorly addressed in the industry. We do intend to expand our protection from fixing known vulnerabilities in open source components to monitoring and securing them in runtime, flagging and containing malicious and compromised components.”

While this is a relatively new area for security teams to monitor and address, he added that the Equifax breach highlighted what might happen in the worst-case scenario if such issues go undetected. Snyk is not the only company that has identified the gap in the market. Black Duck focuses on flagging non-compliant open source licences, and offers some security features as well.

However, it is Snyk — whose name derives from a play on the word “sneak”, combined with the acronym meaning “so now you know” — that seems to be catching the most attention at the moment.

“Some of the largest data breaches in recent years were the result of unfixed vulnerabilities in open source dependencies; as a result, we’ve seen the adoption of tools to monitor and remediate such vulnerabilities grow exponentially,” said Philippe Botteri, partner at Accel, who is joining the board with this round. “We’ve also seen the ownership of application security shifting towards developers. We feel that Snyk is uniquely positioned in the market given the team’s deep security domain knowledge and developer-centric mindset, and are thrilled to join them on this mission of bringing security tools to developers.”


By Ingrid Lunden

The 7 most important announcements from Microsoft Ignite today

Microsoft is hosting its Ignite conference in Orlando, Florida this week. And although Ignite isn’t the household name that Microsoft’s Build conference has become over the course of the last few years, it’s a massive event with over 30,000 attendees and plenty of news. Indeed, there was so much news this year that Microsoft provided the press with a 27-page booklet with all of it.

We wrote about quite a few of these today, but here are the most important announcements, including one that wasn’t in Microsoft’s booklet but was featured prominently on stage.

1. Microsoft, SAP and Adobe take on Salesforce with their new Open Data Initiative for customer data

What was announced: Microsoft is teaming up with Adobe and SAP to create a single model for representing customer data that businesses will be able to move between systems.

Why it matters: Moving customer data between different enterprise systems is hard, especially because there isn’t a standardized way to represent this information. Microsoft, Adobe and SAP say they want to make it easier for this data to flow between systems. But it’s also a shot across the bow of Salesforce, the leader in the CRM space. It also represents a chance for these three companies to enable new tools that can extract value from this data — and Microsoft obviously hopes that these businesses will choose its Azure platform for analyzing the data.


2. Microsoft wants to do away with more passwords

What was announced: Businesses that use Microsoft Azure Active Directory (AD) will now be able to use the Microsoft Authenticator app on iOS and Android in place of a password to log into their business applications.

Why it matters: Passwords are annoying and they aren’t very secure. Many enterprises are starting to push their employees to use a second factor to authenticate. With this, Microsoft now replaces the password/second factor combination with a single tap on your phone — ideally without compromising security.


3. Microsoft’s new Windows Virtual Desktop lets you run Windows 10 in the cloud

What was announced: Microsoft now lets businesses rent a virtual Windows 10 desktop in Azure.

Why it matters: Until now, virtual Windows 10 desktops were the domain of third-party service providers. Now, Microsoft itself will offer these desktops. The company argues that this is the first time you can get a multiuser virtualized Windows 10 desktop in the cloud. As employees become more mobile and don’t necessarily always work from the same desktop or laptop, this virtualized solution will allow organizations to offer them a full Windows 10 desktop in the cloud, with all the Office apps they know, without the cost of having to provision and manage a physical machine.


4. Microsoft Office gets smarter

What was announced: Microsoft is adding a number of new AI tools to its Office productivity suite. Those include Ideas, which aims to take some of the hassle out of using these tools. Ideas may suggest a layout for your PowerPoint presentation or help you find interesting data in your spreadsheets, for example. Excel is also getting a couple of new tools for pulling in rich data from third-party sources. Microsoft is also building a new unified search tool for finding data across an organization’s network.

Why it matters: Microsoft Office remains the most widely used suite of productivity applications. That makes it the ideal surface for highlighting Microsoft’s AI chops, and anything that can improve employee productivity will surely drive a lot of value to businesses. If that means sitting through fewer badly designed PowerPoint slides, then this whole AI thing will have been worth it.


5. Microsoft’s massive Surface Hub 2 whiteboards will launch in Q2 2019

What was announced: The next version of the Surface Hub, Microsoft’s massive whiteboard displays, will launch in Q2 2019. The Surface Hub 2 is both lighter and thinner than the original version. Then, in 2020, an updated version, the Surface Hub 2X, will launch that will offer features like tiling and rotation.

Why it matters: We’re talking about a 50-inch touchscreen display here. You probably won’t buy one, but you’ll want one. It’s a disappointment to hear that the Surface Hub 2 won’t launch until next year and that some of the advanced features most users are waiting for won’t arrive until the refresh in 2020.


6. Microsoft Teams gets bokeh and meeting recordings with transcripts

What was announced: Microsoft Teams, its Slack competitor, can now blur the background when you are in a video meeting and it’ll automatically create transcripts of your meetings.

Why it matters: Teams has emerged as a competent Slack competitor that’s quite popular with companies that are already betting on Microsoft’s productivity tools. Microsoft is now bringing many of its machine learning smarts to Teams to offer features that most of its competitors can’t match.


7. Microsoft launches Azure Digital Twins

What was announced: Azure Digital Twins allows enterprises to model their real-world IoT deployments in the cloud.

Why it matters: IoT presents a massive new market for cloud services like Azure. Many businesses were already building their own version of Digital Twins on top of Azure, but those homegrown solutions didn’t always scale. Now, Microsoft is offering this capability out of the box, and for many businesses, this may just be the killer feature that will make them decide on standardizing their IoT workloads on Azure. And as they use Azure Digital Twins, they’ll also want to use the rest of Azure’s many IoT tools.



By Frederic Lardinois

Walmart is betting on the blockchain to improve food safety

Walmart has been working with IBM on a food safety blockchain solution, and today it announced it is requiring that all suppliers of leafy green vegetables for Sam’s Club and Walmart upload their data to the blockchain by September 2019.

Most supply chains are bogged down in manual processes. This makes it difficult and time-consuming to track down an issue should one like last spring’s E. coli romaine lettuce problem rear its head. Placing a supply chain on the blockchain makes the process more traceable, transparent and fully digital. Each node on the blockchain can represent an entity that has handled the food on the way to the store, making it much easier and faster to pinpoint, with far greater precision, whether an affected farm sold contaminated supply to a particular location.
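
As a simple illustration of the traceability idea (a toy hash-chained log, not IBM Food Trust or Hyperledger Fabric code), each handler of a lot appends a record that links to the previous one, so the path from farm to store can be replayed:

```python
# Toy illustration of the traceability concept: a hash-chained log where each handler
# of a shipment appends a record that points at the previous record's hash.
# The entities and lot IDs are invented examples.
import hashlib
import json


def append_record(chain: list, entity: str, lot_id: str, action: str) -> None:
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {"entity": entity, "lot_id": lot_id, "action": action, "prev_hash": prev_hash}
    record["hash"] = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
    chain.append(record)


chain: list = []
append_record(chain, "Green Valley Farm", "LOT-42", "harvested")
append_record(chain, "FreshCo Distribution", "LOT-42", "received")
append_record(chain, "Store #1138", "LOT-42", "stocked")

# Trace the lot from store back to farm
for record in reversed(chain):
    print(record["entity"], "->", record["action"])
```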

Walmart has been working with IBM for over a year on using the blockchain to digitize the food supply chain process. In fact, supply chain is one of the premier business use cases for blockchain (beyond digital currency). Walmart is using the IBM Food Trust Solution, specifically developed for this use case.

“We built the IBM Food Trust solution using IBM Blockchain Platform, which is a tool or capability that IBM has built to help companies build, govern and run blockchain networks. It’s built using Hyperledger Fabric (the open source digital ledger technology) and it runs on IBM Cloud,” explained Bridget van Kralingen, IBM’s senior VP for Global Industries, Platforms and Blockchain.

Before moving the process to the blockchain, it typically took approximately 7 days to trace the source of food. With the blockchain, that has been reduced to 2.2 seconds. That substantially reduces the likelihood that infected food will reach the consumer.

Photo: Shana Novak/Getty Images

One of the issues in requiring suppliers to put their information on the blockchain is that there will be a range of approaches, from paper to Excel spreadsheets to sophisticated ERP systems, all feeding data to the blockchain. Walmart spokesperson Molly Blakeman says this is something the company worked hard on with IBM to account for. Suppliers don’t have to be blockchain experts by any means. They simply have to know how to upload data to the blockchain application.

“IBM will offer an onboarding system that orients users with the service easily. Think about when you get a new iPhone – the instructions are easy to understand and you’re quickly up and running. That’s the aim here. Essentially, suppliers will need a smart device and internet to participate,” she said.

After working with the technology for a year, the company thinks it’s ready for broader implementation, with the ultimate goal of making sure that the food sold at Walmart is safe for consumption and, if there is a problem, making auditing the supply chain a trivial activity.

“Our customers deserve a more transparent supply chain. We felt the one-step-up and one-step-back model of food traceability was outdated for the 21st century. This is a smart, technology-supported move that will greatly benefit our customers and transform the food system, benefitting all stakeholders,” Frank Yiannas, vice president of food safety for Walmart, said in a statement.

In addition to the blockchain requirement, the company says it will also require that suppliers adhere to one of the Global Food Safety Initiative (GFSI) standards, which are internationally recognized food safety standards.


By Ron Miller