IBM confirms layoffs are happening, but won’t provide details

IBM confirmed overnight reports that it is conducting layoffs, but wouldn’t provide details about the locations, departments or number of employees involved. The company framed the move as swapping out workers for people with more in-demand skills as it tries to regroup under new CEO Arvind Krishna.

“IBM’s work in a highly competitive marketplace requires flexibility to constantly remix to high-value skills, and our workforce decisions are made in the long-term interests of our business,” an IBM spokesperson told TechCrunch.

Patrick Moorhead, principal analyst at Moor Insights & Strategy, says he’s hearing the layoffs are hitting across the business. “I’m hearing it’s a balancing act between business units. IBM is moving as many resources as it can to the cloud. Essentially, you lay off some of the people without the skills you need and who can’t be re-educated and you bring in people with certain skill sets. So not a net reduction in headcount,” Moorhead said.

It’s worth noting that IBM used a similar argument back in 2015 when it reportedly had layoffs. While there is no official number, Bloomberg is reporting that today’s number is in the thousands.

Holger Mueller, an analyst at Constellation Research, says that IBM is in a tough spot. “The bets of the past have not paid off. IBM Cloud as IaaS is gone, Watson did not deliver and Blockchain is too slow to keep thousands of consultants occupied,” he said.

Mueller adds that the company could also be feeling the impact of having workers at home instead of in the field. “Enterprises do not know and have not learnt how to do large software projects remotely. […] And for now enterprises are slowing down on projects as they are busy with reopening plans,” he said.

The news comes against the backdrop of companies large and small laying off large numbers of employees as the pandemic takes its toll on the workforce. IBM was probably due for a workforce reduction regardless of the current macro situation, as Krishna tries to right the financial ship.

The company has struggled in recent years, and with the acquisition of Red Hat for $34 billion in 2018, it is hoping to find its way as a more open hybrid cloud option. It apparently wants to focus on skills that can help it get there.

The company indicated that it would continue to subsidize medical expenses for laid-off employees through June 2021, so there is that.


By Ron Miller

IBM and Red Hat expand their telco, edge and AI enterprise offerings

At its Think Digital conference, IBM and Red Hat today announced a number of new services that all center around 5G, edge and AI. The fact that the company is focusing on these two areas doesn’t come as a surprise, given that edge and AI are two of the fastest-growing businesses in enterprise computing. Virtually every telecom company is now looking at how to best capitalize on the upcoming 5G rollouts, and most forward-looking enterprises are trying to figure out how to best plan around this for their own needs.

As IBM’s recently minted president Jim Whitehurst told me ahead of today’s announcement, he believes that IBM (in combination with Red Hat) is able to offer enterprises a very differentiated service because, unlike the large hyperscale clouds, IBM isn’t interested in locking these companies into a homogeneous cloud.

“Where IBM is competitively differentiated, is around how we think about helping clients on a journey to what we call hybrid cloud,” said Whitehurst, who hasn’t done a lot of media interviews since he took the new role, which still includes managing Red Hat. “Honestly, everybody has hybrid clouds. I wish we had a more differentiated term. One of the things that’s different is how we’re talking about how you think about an application portfolio that, by necessity, you’re going to have in multiple ways. If you’re a large enterprise, you probably have a mainframe running a set of transactional workloads that probably are going to stay there for a long time because there’s not a great alternative. And there’s going to be a set of applications you’re going to want to run in a distributed environment that need to access that data — all the way out to you running a factory floor and you want to make sure that the paint sprayer doesn’t have any defects while it’s painting a door.”

BARCELONA, CATALONIA, SPAIN – 2019/02/25: The IBM logo is seen during MWC 2019. (Photo by Paco Freire/SOPA Images/LightRocket via Getty Images)

He argues that IBM, at its core, is all about helping enterprises think about how to best run their workloads from a software, hardware and services perspective. “Public clouds are phenomenal, but they are exposing a set of services in a homogeneous way to enterprises,” he noted, while he argues that IBM is trying to weave all of these different pieces together.

Later in our discussion, he argued that the large public clouds essentially force enterprises to fit their workloads to those clouds’ services. “The public clouds do extraordinary things and they’re great partners of ours, but their primary business is creating these homogeneous services, at massive volumes, and saying ‘if your workloads fit into this, we can run it better, faster, cheaper etc.’ And they have obviously expanded out. They’ve added services. They are now saying we can put a box on-premises, but you’re still fitting into their model.”

On the news side, IBM is launching new services to automate business planning, budgeting and forecasting, for example, as well as new AI-driven tools for building and running automation apps that can handle routine tasks either autonomously or with the help of a human counterpart. The company is also launching new tools for call-center automation.

The most important AI announcement is surely Watson AIOps, though, which is meant to help enterprises detect, diagnose and respond to IT anomalies in order to reduce the effects of incidents and outages for a company.

On the telco side, IBM is launching new tools such as the Edge Application Manager, which makes it easier to run AI, analytics and IoT workloads on the edge, powered by IBM’s open-source Open Horizon edge computing project. The company is also launching a new Telco Network Cloud manager built on top of Red Hat OpenShift, with the ability to also leverage the Red Hat OpenStack Platform (which remains an important platform for telcos and represents a growing business for IBM/Red Hat). In addition, IBM is launching a new dedicated IBM Services team for edge computing and telco cloud to help these customers build out their 5G and edge-enabled solutions.

Telcos are also betting big on a lot of different open-source technologies that often form the core of their 5G and edge deployments. Red Hat was already a major player in this space, but the acquisition has only accelerated this, Whitehurst argued. “Since the acquisition […] telcos have a lot more confidence in IBM’s capabilities to serve them long term and be able to serve them in mission-critical context. But importantly, IBM also has the capability to actually make it real now.”

A lot of the new telco edge and hybrid cloud deployments, he also noted, are built on Red Hat technologies but built out by IBM, and neither company alone could have brought them to fruition in the same way. Red Hat never had the size, breadth and skills to pull off some of these projects, Whitehurst argued.

Whitehurst also argued that part of the Red Hat DNA that he’s bringing to the table now is helping IBM to think more in terms of ecosystems. “The DNA that I think matters a lot that Red Hat brings to the table with IBM — and I think IBM is adopting and we’re running with it — is the importance of ecosystems,” he said. “All of Red Hat’s software is open source. And so really, what you’re bringing to the table is ecosystems.”

It’s maybe no surprise then that the telco initiatives are backed by partners like Cisco, Dell Technologies, Juniper, Intel, Nvidia, Samsung, Packet, Equinix, Hazelcast, Sysdig, Turbonomics, Portworx, Humio, Indra Minsait, EuroTech, Arrow, ADLINK, Acromove, Geniatech, SmartCone, CloudHedge, Altiostar, Metaswitch, F5 Networks and ADVA.

In many ways, Red Hat pioneered the open-source business model and Whitehurst argued that having Red Hat as part of the IBM family means it’s now easier for the company to make the decision to invest even more in open source. “As we accelerate into this hybrid cloud world, we’re going to do our best to leverage open-source technologies to make them real,” he added.


By Frederic Lardinois

New Red Hat CEO Paul Cormier faces a slew of challenges in the midst of pandemic

When former Red Hat CEO Jim Whitehurst moved on to become president at parent company IBM earlier this month, the logical person to take his place was long-time executive Paul Cormier. As he takes over in the most turbulent of times, he still sees a company that is in the right place to help customers modernize their approach to development as they move more workloads to the cloud.

We spoke to Cormier yesterday via video conference, and he appeared to be a man comfortable in his new position. We talked about the changes his new role has brought him personally, how he is helping his company navigate the current situation and how his relationship with IBM works.

One thing he stressed was that even as part of the IBM family, his company is running completely independently, and that includes no special treatment for IBM. It’s just another customer, an approach he says is absolutely essential.

Taking over

He says that he felt fully prepared for the role having run the gamut of jobs over the years from engineering to business units to CTO. The big difference for him as CEO is that in all of his previous roles he could be the technical guy speaking a certain engineering language with his colleagues. As CEO, things have changed, especially during a time where communication has become paramount.

This has been an even bigger challenge in the midst of the pandemic. Instead of traveling to offices for meetings, chatting over informal coffees and having more serendipitous encounters, he has had to be much more deliberate in his communication to make sure his employees feel in the loop, even when they are out of the office.

“I have a company-wide meeting every two weeks. You can’t over communicate right now because it just doesn’t happen [naturally in the course of work]. I’ve got to consciously do it now, and that’s probably the biggest thing,” he said.

Go-to-market challenges

While Cormier sees little change on the engineering side, where many folks have been working remotely for some time, the go-to-market team could face more serious hurdles as they try to engage with customers.

“The go-to-market and sales side is going to be the challenge because we don’t know how our customers will come out of this. Everybody’s going to have different strategies on how they’re coming out of this, and that will drive a lot,” he said.

This week was Cormier’s first Red Hat Summit as CEO, one that, like so many conferences, had to pivot from a live event to a virtual one fairly quickly. Customers have been nervous, and this was the first chance to really reconnect with them since things shut down. He says he was pleasantly surprised by how well it worked, with the virtual format even allowing more people to attend than would likely have paid to travel to a live event.

Conferences are a place for the sales team to really shine and lay the groundwork for future sales. Not being there in person had to be a big change for them, but he says this week went better than he expected, and they learned a ton about running virtual events that they will carry forth into the future.

“We all miss the face-to-face for sure, but I think we’ve learned new things, and I think our team did an amazing job in pulling this off,” he said.

No favorites for IBM

As he navigates his role inside the IBM family, he says that new CEO Arvind Krishna has effectively become his board of directors, now that the company has gone private. When IBM paid $34 billion for Red Hat in 2018, it was looking for a way to modernize the company and to become a real player in the hybrid cloud market.

Hybrid involves finding a way to manage infrastructure that lives on premises as well as in the cloud without having to use two sets of tools. While IBM is all in on Red Hat, Cormier says it’s absolutely essential to their relationship with customers that they don’t show them any favoritism, and that includes no special pricing deals.

Not only that, he says that he has the freedom to run the company the way he sees fit. “IBM doesn’t set our product strategy. They don’t set our priorities. They know that over time our open source products could eat into what they are doing with their proprietary products, and they are okay with that. They understand that,” he said.

He says that doing it any other way could begin to erode the reason that IBM spent all that money in the first place, and it’s up to Cormier to make sure that they continue to do what they were doing and keep customers comfortable with that. So far, the company seems to be heading in the same upward trajectory it was on as a public company.

In the most recent earnings report in January, IBM reported Red Hat income of $1.07 billion, up from $863 million the previous year when it was still a private company. That’s a run rate of over $4 billion, putting it well within reach of the $5 billion goal Whitehurst set a few years ago.
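The arithmetic behind that run rate is simple annualization; a quick sketch using the figures reported above:

```python
# Annualized run rate: the most recent quarterly Red Hat figure times four.
quarterly = 1.07    # $ billions, most recent quarter
prior_year = 0.863  # $ billions, same quarter a year earlier

run_rate = quarterly * 4
growth = (quarterly - prior_year) / prior_year

print(round(run_rate, 2))  # 4.28, i.e. "over $4 billion"
print(f"{growth:.0%}")     # roughly 24% year over year
```

At that growth rate, the $5 billion target Whitehurst set is within reach in a year or two, which is the point the paragraph above is making.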

Now it’s Cormier’s job to get them there and beyond. The pandemic certainly makes it more challenging, but he’s ready to lead the company to that next level, all while walking the line as the CEO of a company that lives under the IBM family umbrella and all that entails.


By Ron Miller

Incoming IBM CEO Arvind Krishna faces monumental challenges on multiple fronts

Arvind Krishna is not the only CEO to step into a new job this week, but he is the only one charged with helping turn around one of the world’s most iconic companies. Adding to the degree of difficulty, he took the role in the midst of a global pandemic and economic crisis. No pressure or anything.

IBM has struggled in recent years to find its identity as technology has evolved rapidly. While Krishna’s predecessor Ginni Rometty left a complex legacy as she worked to bring IBM into the modern age, she presided over a dreadful string of 22 straight quarters of declining revenue, a record Krishna surely hopes to avoid.

Strong headwinds

To her credit, under Rometty the company tried hard to pivot to more modern customer requirements, like cloud, artificial intelligence, blockchain and security. While the results weren’t always there, Krishna acknowledged in an email employees received on his first day that she left something to build on.

“IBM has already built enduring platforms in mainframe, services and middleware. All three continue to serve our clients. I believe now is the time to build a fourth platform in hybrid cloud. An essential, ubiquitous hybrid cloud platform our clients will rely on to do their most critical work in this century. A platform that can last even longer than the others,” he wrote.

But Ray Wang, founder and principal analyst at Constellation Research, says the market headwinds the company faces are real, and it’s going to take some strong leadership to get customers to choose IBM over its primary cloud infrastructure competitors.

“His top challenge is to restore the trust of clients that IBM has the latest technology and solutions and is reinvesting enough in innovation that clients want to see. He has to show that IBM has the same level of innovation and engineering talent as the hyperscalers Google, Microsoft and Amazon,” Wang explained.

Cultural transformation


By Ron Miller

Paul Cormier takes over as Red Hat CEO, as Jim Whitehurst moves to IBM

When Ginni Rometty indicated that she was stepping down as IBM CEO at the end of January, the company announced that Arvind Krishna would be taking over, while Red Hat CEO Jim Whitehurst would become president. To fill his role, Red Hat announced today that long-time executive Paul Cormier has been named president and CEO.

Cormier would seem to be a logical choice to run Red Hat, having been with the company since 2001. He joined as its VP of engineering and has seen the company grow from a small startup to a multi-billion dollar company.

Cormier spoke about the historical arc he has witnessed in his years at Red Hat. “Looking back to when I joined, we were in a different position and facing different issues, but the spirit was the same. We were on a mission to convince the world that open source was real, safe and enterprise-grade,” Cormier said in an email to employees about his promotion.

Former CEO Whitehurst certainly sees this as a sensible transition. “After working with him closely for more than a decade, I can confidently say that Paul was the natural choice to lead Red Hat. Having been the driving force behind Red Hat’s product strategy for nearly two decades, he’s been intimately involved in setting the company’s direction and uniquely understands how to help customers and partners make the most out of their cloud strategy,” he said in a statement.

In a Q&A with Cormier on the company website, he talked about the kind of changes he expects to see under his leadership in the next five years of the company. “There’s a term that we use today, ‘applications run the business.’ In five years, I see it becoming the case for the majority of enterprises. And with that, the infrastructure underpinning these applications will be even more critical. Management and security are paramount — and this isn’t just one environment. It’s bare metal and hypervisors to public and private clouds. It’s Linux, VMs, containers, microservices and more,” he said.

When IBM bought Red Hat in 2018 for $34 billion, there was widespread speculation that Whitehurst would eventually take over in an executive position there. Now that that has happened, Cormier will step in to run Red Hat.

While Red Hat is under the IBM umbrella, it continues to operate as a separate company with its own executive structure. The vision Cormier outlined is in line with how it will fit within the IBM family as it tries to make its mark on the shifting cloud and enterprise open-source markets.


By Ron Miller

AWS, IBM launch programs to encourage developers solving COVID-19 problems

As society comes to grips with the growing worldwide crisis related to the COVID-19 virus, many companies are stepping up in different ways. Today, two major tech companies — Amazon and IBM — each announced programs to encourage developers to find solutions to a variety of problems related to the pandemic.

For starters, AWS, Amazon’s cloud arm, announced the AWS Diagnostic Development Initiative. It has set aside $20 million, which it will distribute in the form of AWS credits and technical support. The program is designed to assist and encourage teams working on COVID-19 diagnostic issues with the goal of developing better diagnostic tooling.

“In our Amazon Web Services (AWS) business, one area where we have heard an urgent need is in the research and development of diagnostics, which consist of rapid, accurate detection and testing of COVID-19. Better diagnostics will help accelerate treatment and containment, and in time, shorten the course of this epidemic,” Teresa Carlson wrote in the company’s Day One blog today.

The program aims to help customers who are building diagnostics solutions bring their products to market more quickly, and to encourage teams working on related problems to collaborate.

The company also announced that it is forming an advisory group of scientists and health policy experts to assist companies involved with the initiative.

Meanwhile, IBM is refocusing its 2020 Call for Code Global Challenge developer contest on not only solving problems related to global climate change, which was this year’s original charter, but also addressing issues around the growing virus crisis by building open-source tooling.

“In a very short period of time, COVID-19 has revealed the limits of the systems we take for granted. The 2020 Call for Code Global Challenge will arm you with resources […] to build open source technology solutions that address three main COVID-19 areas: crisis communication during an emergency, ways to improve remote learning, and how to inspire cooperative local communities,” the company wrote in a blog post.

All of these areas are being taxed as more people are forced to stay indoors as we try to contain the virus. The company hopes to incentivize developers working on these issues to help solve some of these problems.

During a time of extreme social and economic upheaval, when all aspects of society are being affected, businesses, academia and governments need to work together to solve a myriad of problems related to the virus. These are just a couple of examples of that.


By Ron Miller

How the information system industry became enterprise software

If you were a software company employee or venture capitalist in Silicon Valley before 1993, chances are you were talking about “Information Systems Software” and not “Enterprise Software.” How and why did the industry change its name?

The obvious, but perplexing answer is simple — “Star Trek: The Next Generation.”

As befuddling and mind-numbingly satisfying as it is to your local office Trekkie, the industry rebranded itself thanks to a marketing campaign from the original venture-backed system software company, Boole & Babbage (now BMC software).

While the term “Enterprise” was used to describe complex systems for years before 1993, everything changed when Boole & Babbage signed a two-year licensing agreement with the then-highest-rated show in syndication history to produce an infomercial.

Star Trek fans have been talking about this crazy marketing agreement for years, and you can read the full details about how it was executed in TrekCore. But even Trekkies don’t appreciate its long-term impacts on our industry. In this license agreement with Paramount, Boole & Babbage had unlimited rights to create and distribute as much Star Trek content as they could. They physically mailed VHS cassettes to customers, ran magazine ads and even dressed their employees as members of Starfleet at trade shows. Boole & Babbage used this push to market itself as the “Enterprise Automation Company.”

Commander Riker says in the infomercial, “just as the bridge centralizes the functions necessary to control the USS Enterprise, Boole’s products centralize data processing information to allow centralized control of today’s complex information systems.” This seemed to scratch an itch that other systems companies didn’t realize needed scratching.

Not to be outdone, IBM in 1994 rebranded their OS/2 operating system “OS/2 Warp,” referring to Star Trek’s “warp drive.” They also tried to replicate Boole & Babbage’s licensing agreement with Paramount by hiring the Enterprise’s Captain Picard (played by actor Patrick Stewart) to emcee the product launch. Unfortunately, Paramount wouldn’t play ball, and IBM hired Captain Janeway (played by actress Kate Mulgrew) from Star Trek: Voyager instead. The licensing issues didn’t stop IBM from also hiring Star Trek’s Mr. Spock (played by actor Leonard Nimoy) to tape a five-minute intro to the event.

Outside of OS/2, IBM’s 1994 announcement list included 13 other “enterprise” initiatives. Soon, leading software companies began to rebrand themselves and release products using the term “enterprise software” as a valuable identifier. MRP software makers like SAP and Baan began embracing the new “Enterprise” moniker after 1993, and in 1995 Lotus rebranded itself as an “Enterprise Software Company.”

“Enterprise” was officially the coolest new vernacular, and after industry behemoth IBM bought Lotus in 1995, it incorporated “Enterprise” across all of its products. And while Gartner’s 1990 paper “ERP: A Vision of the Next-Generation MRP II” by Wylie is the technical birth of ERP software, no one cared until Commander Riker told Harold to “monitor your entire Enterprise from a single point of control.” The ngram numbers don’t lie.

Almost 30 years later, we live in a world in which business is run on enterprise software and the use of the term is ubiquitous. Whenever I see a software business plan come across my desk or read an article on enterprise software, I can’t help but give Commander Riker a little due credit.


By Walter Thompson

Honeywell says it will soon launch the world’s most powerful quantum computer

“The best-kept secret in quantum computing.” That’s what Cambridge Quantum Computing (CQC) CEO Ilyas Khan called Honeywell‘s efforts in building the world’s most powerful quantum computer. In a race where most of the major players are vying for attention, Honeywell has quietly worked on its efforts for the last few years (and under strict NDAs, it seems). But today, the company announced a major breakthrough that it claims will allow it to launch the world’s most powerful quantum computer within the next three months.

In addition, Honeywell also today announced that it has made strategic investments in CQC and Zapata Computing, both of which focus on the software side of quantum computing. The company has also partnered with JPMorgan Chase to develop quantum algorithms using Honeywell’s quantum computer. The company also recently announced a partnership with Microsoft.

Honeywell has long built the kind of complex control systems that power many of the world’s largest industrial sites. It’s that kind of experience that has now allowed it to build the advanced ion trap that is at the core of its efforts.

This ion trap, the company claims in a paper that accompanies today’s announcement, has allowed the team to achieve decoherence times that are significantly longer than those of its competitors.

“It starts really with the heritage that Honeywell had to work from,” Tony Uttley, the president of Honeywell Quantum Solutions, told me. “And we, because of our businesses within aerospace and defense and our business in oil and gas — with solutions that have to do with the integration of complex control systems because of our chemicals and materials businesses — we had all of the underlying pieces for quantum computing, which are just fabulously different from classical computing. You need to have ultra-high vacuum system capabilities. You need to have cryogenic capabilities. You need to have precision control. You need to have lasers and photonic capabilities. You have to have magnetic and vibrational stability capabilities. And for us, we had our own foundry and so we are able to literally design our architecture from the trap up.”

The result of this is a quantum computer that promises to achieve a Quantum Volume of 64. Quantum Volume (QV), it’s worth mentioning, is a metric that takes into account both the number of qubits in a system as well as decoherence times. IBM and others have championed this metric as a way to, at least for now, compare the power of various quantum computers.

So far, IBM’s own machines have achieved QV 32, which would make Honeywell’s machine significantly more powerful.
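For a rough sense of what those numbers mean (this is a simplified reading, not the full Quantum Volume protocol, which involves running random square circuits and checking heavy-output probabilities), QV is reported as a power of two, so the headline figures compare on a log scale:

```python
import math

def effective_circuit_size(qv: int) -> int:
    """Quantum Volume is reported as 2**n, where n is the width and depth
    of the largest 'square' random circuit the machine can run reliably.
    log2(QV) therefore gives a rough effective circuit size."""
    return int(math.log2(qv))

honeywell_qv = 64  # what Honeywell says its machine will achieve
ibm_qv = 32        # the best QV IBM has reported so far

print(effective_circuit_size(honeywell_qv))  # 6
print(effective_circuit_size(ibm_qv))        # 5
```

Note that doubling QV from 32 to 64 corresponds to one additional effective qubit and circuit layer, not twice the qubit count, which is why QV rewards qubit quality over raw numbers.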

Khan, whose company provides software tools for quantum computing and was one of the first to work with Honeywell on this project, also noted that the focus on the ion trap is giving Honeywell a bit of an advantage. “I think that the choice of the ion trap approach by Honeywell is a reflection of a very deliberate focus on the quality of qubit rather than the number of qubits, which I think is fairly sophisticated,” he said. “Until recently, the headline was always growth, the number of qubits running.”

The Honeywell team noted that many of its current customers are also likely users of its quantum solutions. These customers, after all, are working on exactly the kind of problems in chemistry or material science that quantum computing, at least in its earliest forms, is uniquely suited for.

Currently, Honeywell has about 100 scientists, engineers and developers dedicated to its quantum project.


By Frederic Lardinois

Daily Crunch: IBM names new CEO

The Daily Crunch is TechCrunch’s roundup of our biggest and most important stories. If you’d like to get this delivered to your inbox every day at around 9am Pacific, you can subscribe here.

1. Arvind Krishna will replace Ginni Rometty as IBM CEO in April

Krishna, IBM’s senior vice president for cloud and cognitive software, will take over on April 6 after a couple months of transition. Rometty will remain with the company as chairman of the board.

Krishna reportedly drove the massive $34 billion acquisition of Red Hat at the end of 2018, and there was some speculation at the time that Red Hat CEO Jim Whitehurst was the heir apparent. Instead, the board went with a more seasoned IBM insider for the job, while naming Whitehurst as president.

2. Apple’s redesigned Maps app is available across the US, adds real-time transit for Miami

The redesigned app will include more accurate information overall as well as comprehensive views of roads, buildings, parks, airports, malls and other public places. It will also bring Look Around to more cities and real-time transit to Miami.

3. Social media boosting service exposed thousands of Instagram passwords

The company, Social Captain, says it helps thousands of users to grow their Instagram follower counts by connecting their accounts to its platform. But TechCrunch learned this week Social Captain was storing the passwords of linked Instagram accounts in unencrypted plaintext.

4. Elon Musk just dropped an EDM track on SoundCloud

That is a real headline and I probably don’t need to say much else. Listen to the track, or don’t.

5. Being a child actress prepared me for a career in venture capital

Crystal McKellar played Becky Slater on “The Wonder Years,” and she writes about how that experience prepared her to be a managing partner at Anathem Ventures. (Extra Crunch membership required.)

6. Moda Operandi, an online marketplace for high-end fashion, raises $100M led by NEA and Apax

High-end fashion might not be the first thing that comes to mind when you think about online shopping, but it has actually been a ripe market for the e-commerce industry.

7. Why Sony’s PlayStation Vue failed

Vue launched in March 2015, offering live and on-demand content from more than 85 channels, including many local broadcast stations. But it failed to catch on with a broader audience, despite — or perhaps, because of — its integration with Sony’s PS3 and PS4 devices, and it shut down this week. (Extra Crunch membership required.)


By Anthony Ha

In latest $10B JEDI contract twist, Defense Secretary recuses himself

The JEDI drama never stops. The $10 billion, decade-long cloud contract has produced a series of twists and turns since the project was announced in 2018, from court challenges to the president getting involved to accusations of bias and conflict of interest. Today, in the latest plot twist, Secretary of Defense Mark Esper recused himself from the selection process because one of his children works at a company that was involved earlier in the process.

Several reports name his son, Luke Esper, who has worked at IBM since February as a digital strategy consultant, according to his LinkedIn page. The RFP closed in April, but given the persistent controversy around this deal, his father apparently wanted to remove even a hint of impropriety from the selection and review process.

Chief Pentagon Spokesperson Jonathan Rath Hoffman issued an official DoD Cloud update earlier today:

“As you all know, soon after becoming Secretary of Defense in July, Secretary Esper initiated a review of the Department’s cloud computing plans and to the JEDI procurement program. As part of this review process he attended informational briefings to ensure he had a full understanding of the JEDI program and the universe of options available to DoD to meet its cloud computing needs. Although not legally required to, he has removed himself from participating in any decision making following the information meetings, due to his adult son’s employment with one of the original contract applicants. Out of an abundance of caution to avoid any concerns regarding his impartiality, Secretary Esper has delegated decision making concerning the JEDI Cloud program to Deputy Secretary Norquist. The JEDI procurement will continue to move to selection through the normal acquisition process run by career acquisition professionals.”

Perhaps the biggest beef around this contract, which was supposed to be decided in August, has been the winner-take-all nature of the deal. Only one company will eventually walk away a winner, and there was a persistent belief in some quarters that the deal was designed specifically with Amazon in mind. Oracle’s Co-CEO Safra Catz took that concern directly to the president in 2018.

The DoD has repeatedly denied there was any vendor in mind when it created the RFP, and internal Pentagon reviews, courts and a government watchdog agency repeatedly found the procurement process was fair, but the complaints continue. The president got involved in August when he named his then newly appointed defense secretary to look into the JEDI contract procurement process. Now Esper is withdrawing from leading that investigation, and it will be up to others, including his deputy secretary, to finally bring this project over the finish line.

Last April, the DoD named Microsoft and Amazon as the two finalists. It’s worth pointing out that both are leaders in Infrastructure as a Service (IaaS) market share, with around 16% and 33%, respectively.

It’s also worth noting that while $10 billion feels like a lot of money, it’s spread out over a 10-year period with lots of possible out clauses built into the deal. To put this deal size into perspective, a September report from Synergy Research found that worldwide combined infrastructure and software service spending in the cloud had already reached $150 billion, a number that is only expected to continue to rise over the next several years as more companies and government agencies like the DoD move more of their workloads to the cloud.

For complete TechCrunch JEDI coverage, see the Pentagon JEDI Contract.


By Ron Miller

QC Ware Forge will give developers access to quantum hardware and simulators across vendors

Quantum computing is almost ready for prime time, and, according to most experts, now is the time to start learning how to best develop for this new and less than intuitive technology. With multiple vendors like D-Wave, Google, IBM, Microsoft and Rigetti offering commercial and open-source hardware solutions, simulators and other tools, there’s already a lot of fragmentation in this business. QC Ware, which is launching its Forge cloud platform into beta today, wants to become the go-to middleman for accessing the quantum computing hardware and simulators of these vendors.

Forge, which like the rest of QC Ware’s efforts is aimed at enterprise users, will give developers the ability to run their algorithms on a variety of hardware platforms and simulators. The company argues that developers won’t need to have any previous expertise in quantum computing, though having a bit of background surely isn’t going to hurt. From Forge’s user interface, developers will be able to run algorithms for binary optimization, chemistry simulation and machine learning.
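To make the “binary optimization” bucket concrete: such problems are typically posed as QUBOs (quadratic unconstrained binary optimization), the native input format of quantum annealers like D-Wave’s. The sketch below is a plain brute-force solver, not QC Ware’s actual API (Forge’s client library isn’t shown in the announcement); it only illustrates the kind of problem a platform like this hands off to quantum hardware.

```python
from itertools import product

def solve_qubo(Q):
    """Brute-force the binary vector x minimizing sum of Q[i, j] * x[i] * x[j].

    Q is a dict mapping (i, j) index pairs to coefficients. Exhaustive
    search is fine for a handful of variables and hopeless at scale,
    which is precisely the regime quantum annealers target.
    """
    n = 1 + max(max(i, j) for i, j in Q)
    best_x, best_energy = None, float("inf")
    for bits in product((0, 1), repeat=n):
        energy = sum(c * bits[i] * bits[j] for (i, j), c in Q.items())
        if energy < best_energy:
            best_x, best_energy = bits, energy
    return best_x, best_energy

# Toy instance: "pick exactly one of three options", encoded as penalties.
# Each chosen variable earns -1; choosing any pair incurs a +2 penalty.
Q = {(0, 0): -1, (1, 1): -1, (2, 2): -1,
     (0, 1): 2, (0, 2): 2, (1, 2): 2}
x, energy = solve_qubo(Q)  # an optimal solution sets exactly one bit
```

The names and the penalty encoding above are illustrative assumptions; real formulations are built with vendor tooling rather than by hand.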


“Practical quantum advantage will occur. Most experts agree that it’s a matter of ‘when’ not ‘if.’ The way to pull that horizon closer is by having the user community fully engaged in quantum computing application discovery. The objective of Forge is to allow those users to access the full range of quantum computing resources through a single platform,” said Matt Johnson, CEO, QC Ware. “To assist our customers in that exploration, we are spending all of our cycles working on ways to squeeze as much power as possible out of near-term quantum computers, and to bake those methods into Forge.”

Currently, QC Ware Forge offers access to hardware from D-Wave, as well as open-source simulators running on Google’s and IBM’s clouds, with plans to support a wider variety of platforms in the near future.

Initially, QC Ware also told me that it offered direct access to IBM’s hardware, but that’s not yet the case. “We currently have the integration complete and actively utilized by QC Ware developers and quantum experts,” QC Ware’s head of business development, Yianni Gamvros, told me. “However, we are still working with IBM to put an agreement in place in order for our end-users to directly access IBM hardware. We expect that to be available in our next major release. For users, this makes it easier for them to deal with the churn. We expect different hardware vendors will lead at different times and that will keep changing every six months. And for our quantum computing hardware vendors, they have a channel partner they can sell through.”

Users who sign up for the beta will receive 30 days of access to the platform and one minute of actual quantum computing time to evaluate it.


By Frederic Lardinois

The mainframe business is alive and well, as IBM announces new Z15

It’s easy to think of mainframes as technology dinosaurs, but the fact is these machines remain a key component of many large organizations’ computing strategies. Today, IBM announced the latest in its line of mainframe computers, the Z15.

For starters, as you would probably expect, these are big and powerful machines capable of handling enormous workloads. For example, this baby can process up to 1 trillion web transactions a day and handle 2.4 million Docker containers, while offering unparalleled security to go with that performance. That includes the ability to encrypt data once and have it stay encrypted even when it leaves the system, a huge advantage for companies with a hybrid strategy.

Speaking of which, you may recall that IBM bought Red Hat last year for $34 billion. That deal closed in July and the companies have been working to incorporate Red Hat technology across the IBM business including the z line of mainframes.

IBM announced last month that it was making OpenShift, Red Hat’s Kubernetes-based cloud-native tools, available on the mainframe running Linux. This should enable developers who have been working with OpenShift on other systems to move seamlessly to the mainframe without special training.

IBM sees the mainframe as a bridge for hybrid computing environments, offering a highly secure place for data that, when combined with Red Hat’s tools, can give companies a single control plane for applications and data wherever they live.

While it could be tough to justify the cost of these machines in the age of cloud computing, Ray Wang, founder and principal analyst at Constellation Research, says the mainframe could be more cost-effective than the cloud for certain customers. “If you are a new customer, and currently in the cloud and develop on Linux, then in the long run the economics are there to be cheaper than public cloud if you have a lot of IO, and need to get to a high degree of encryption and security,” he said.

He added, “The main point is that if you are worried about being held hostage by public cloud vendors on pricing, in the long run the Z is a cost-effective and secure option for owning compute power and working in a multi-cloud, hybrid cloud world.”

Airlines, financial services firms and other large enterprises continue to use mainframes, and while they need the power these massive machines provide, they need it in a more modern context. The Z15 is designed to provide that link to the future, while giving these companies the power they need.


By Ron Miller

IBM’s quantum-resistant magnetic tape storage is not actually snake oil

Usually when someone in tech says the word “quantum,” I put my hands over my ears and sing until they go away. But while IBM’s “quantum computing safe tape drive” nearly drove me to song, when I thought about it, it actually made a lot of sense.

First of all, it’s a bit of a misleading lede. The tape is not resistant to quantum computing at all. The problem isn’t that qubits are going to escape their cryogenic prisons and go interfere with tape drives in the basement of some datacenter or HQ. The problem is what these quantum computers may be able to accomplish when they’re finally put to use.

Without going too deep down the quantum rabbit hole, it’s generally acknowledged that quantum computers and classical computers (like the one you’re using) are good at different things — to the point where in some cases, a problem that might take incalculable time on a traditional supercomputer could be done in a flash on quantum. Don’t ask me how — I said we’re not going down the hole!

One of the things quantum is potentially very good at is certain types of cryptography: It’s theorized that quantum computers could absolutely smash through many currently used encryption techniques. In the worst-case scenario, that means that if someone got hold of a large cache of encrypted data that today would be useless without the key, a future adversary may be able to force the lock. Considering how many breaches there have been where the only reason your entire life wasn’t stolen was because it was encrypted, this is a serious threat.

IBM and others are thinking ahead. Quantum computing isn’t a threat right now, right? It isn’t being seriously used by anyone, let alone hackers. But what if you buy a tape drive for long-term data storage today, and then a decade from now a hack hits and everything is exposed because it was using “industry standard” encryption?

To prevent that from happening, IBM is migrating its tape storage to encryption algorithms that are resistant to state-of-the-art quantum decryption techniques, specifically lattice cryptography (another rabbit hole — go ahead). These devices are meant to be used for decades if possible, during which time the entire computing landscape can change. It will be hard to predict exactly what quantum methods will emerge in the future, but at the very least you can try not to be among the low-hanging fruit favored by hackers.
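Lattice cryptography resists quantum attacks because its underlying problem, solving systems of noisy linear equations, has no known quantum shortcut the way factoring does. Purely for illustration, here is a toy, single-bit version of Regev-style learning-with-errors (LWE) encryption; the tiny parameters and the scheme itself are pedagogical assumptions, not IBM’s actual tape-drive algorithm.

```python
import random

q, n, m = 257, 8, 20  # toy parameters; real schemes use far larger values

def keygen():
    # Secret vector s; public key is random matrix A and b = A*s + noise (mod q).
    # The small errors e are what make recovering s hard.
    s = [random.randrange(q) for _ in range(n)]
    A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
    e = [random.choice((-1, 0, 1)) for _ in range(m)]
    b = [(sum(a * si for a, si in zip(row, s)) + ei) % q
         for row, ei in zip(A, e)]
    return s, (A, b)

def encrypt(pk, bit):
    # Sum a random subset of public-key rows; hide the bit in the high "half" of q.
    A, b = pk
    subset = [i for i in range(m) if random.random() < 0.5]
    u = [sum(A[i][j] for i in subset) % q for j in range(n)]
    v = (sum(b[i] for i in subset) + bit * (q // 2)) % q
    return u, v

def decrypt(s, ct):
    # v - u.s leaves bit*(q//2) plus a small accumulated error; round to 0 or q//2.
    u, v = ct
    d = (v - sum(ui * si for ui, si in zip(u, s))) % q
    return 1 if q // 4 < d < 3 * q // 4 else 0
```

With these parameters the accumulated error is at most 20, comfortably below q/4, so decryption always recovers the bit; production schemes make the same trade-off with carefully analyzed error distributions.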

The tape itself is just regular tape. In fact, the whole system is pretty much the same as one you’d have bought a week ago. All the changes are in the firmware, meaning earlier drives can be retrofitted with this quantum-resistant tech.

Quantum computing may not be relevant to many applications today, but next year who knows? And in ten years, it might be commonplace. So it behooves companies like IBM, which plan to be part of the enterprise world for decades to come, to plan for it today.


By Devin Coldewey

Why now is the time to get ready for quantum computing

For the longest time, even while scientists were working to make it a reality, quantum computing seemed like science fiction. It’s hard enough to make any sense of quantum physics to begin with, let alone the practical applications of this less-than-intuitive theory. But we’ve now arrived at a point where companies like D-Wave, Rigetti and IBM actually produce real quantum computers.

They are still in their infancy and nowhere near powerful enough to run anything but very basic programs, simply because they can’t run long enough before the quantum states decohere. But virtually all experts say these are solvable problems, and that now is the time to prepare for the advent of quantum computing. Indeed, Gartner just launched a Quantum Volume metric, based on IBM’s research, that aims to help CIOs prepare for the impact of quantum computing.
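IBM defines Quantum Volume as 2 raised to the size of the largest “square” model circuit (equal width and depth) a machine can run with acceptable fidelity, which is why decoherence, not just qubit count, caps the score. Here is a minimal sketch of that bookkeeping with made-up benchmark numbers; the real protocol derives the achievable depth from heavy-output statistics rather than taking it as given.

```python
def quantum_volume(achievable_depth):
    """Compute Quantum Volume from a map of circuit width (qubits) to the
    deepest model circuit the machine runs reliably at that width.

    QV = 2 ** n, where n is the largest square (width == depth) circuit
    the device can handle, per IBM's definition.
    """
    best = max(min(width, depth) for width, depth in achievable_depth.items())
    return 2 ** best

# Hypothetical device: 16 qubits, but noise limits usable depth at wider circuits.
benchmarks = {4: 10, 8: 6, 16: 3}  # width -> max reliable depth (invented numbers)
qv = quantum_volume(benchmarks)    # the 8-qubit, depth-6 case wins: 2 ** 6 = 64
```

Note how adding qubits (width 16) buys nothing here because depth collapses to 3; that interplay is exactly what the metric is meant to surface for CIOs.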

To discuss the state of the industry and why now is the time to get ready, I sat down with IBM’s Jay Gambetta, who will also join us for a panel on Quantum Computing at our TC Sessions: Enterprise event in San Francisco on September 5, together with Microsoft’s Krysta Svore and Intel’s Jim Clark.


By Frederic Lardinois

IBM is moving OpenPower Foundation to The Linux Foundation

IBM makes the Power Series chips and, as part of that effort, has open-sourced some of the underlying technologies to encourage wider use of these chips. The open-source pieces have been part of the OpenPower Foundation. Today, the company announced it is moving the foundation under The Linux Foundation and, while it was at it, open-sourcing several other important pieces of the technology.

Ken King, general manager for OpenPower at IBM, says that at this point in his organization’s evolution, they wanted to move it under the auspices of The Linux Foundation. “We are taking the OpenPower Foundation, and we are putting it as an entity or project underneath The Linux Foundation with the mindset that we are now bringing more of an open governance approach and open governance principles to the foundation,” King told TechCrunch.

But IBM didn’t stop there. It also announced that it was open sourcing some of the technical underpinnings of the Power Series chip to make it easier for developers and engineers to build on top of the technology. Perhaps most importantly, the company is open sourcing the Power Instruction Set Architecture (ISA). These are “the definitions developers use for ensuring hardware and software work together on Power,” the company explained.

King sees open sourcing this technology as an important step for a number of reasons around licensing and governance. “The first thing is that we are taking the ability to be able to implement what we’re licensing, the ISA instruction set architecture, for others to be able to implement on top of that instruction set royalty free with patent rights,” he explained.

The company is also putting this under an open governance workgroup at the OpenPower Foundation. This matters to open source community members because it provides a layer of transparency that might otherwise be lacking. What that means in practice is that any changes will be subject to a majority vote, so long as the changes meet compatibility requirements, King said.

Jim Zemlin, executive director at the Linux Foundation, says that making all of this part of the Linux Foundation open source community could drive more innovation. “Instead of a very, very long cycle of building an application and working separately with hardware and chip designers, because all of this is open, you’re able to quickly build your application, prototype it with hardware folks, and then work with a service provider or a company like IBM to take it to market. So there’s not tons of layers in between the actual innovation and value captured by industry in that cycle,” Zemlin explained.

In addition, IBM made several other announcements around open sourcing other Power Chip technologies designed to help developers and engineers customize and control their implementations of Power chip technology. “IBM will also contribute multiple other technologies including a softcore implementation of the Power ISA, as well as reference designs for the architecture-agnostic Open Coherent Accelerator Processor Interface (OpenCAPI) and the Open Memory Interface (OMI). The OpenCAPI and OMI technologies help maximize memory bandwidth between processors and attached devices, critical to overcoming performance bottlenecks for emerging workloads like AI,” the company said in a statement.

The softcore implementation of the Power ISA, in particular, should give developers more control and even enable them to build their own instruction sets, Hugh Blemings, executive director of the OpenPower Foundation, explained. “They can now actually try crafting their own instruction sets, and try out new ways of the accelerated data processes and so forth at a lower level than previously possible,” he said.

The company is announcing all of this today at The Linux Foundation Open Source Summit and OpenPower Summit in San Diego.


By Ron Miller