NetApp transformation goes far beyond product: CTO Bregman

Mark Bregman, NetApp’s CTO, sits down for a detailed talk about NetApp’s technology strategy going forward, and why the company is convinced it has made the difficult transition from a vendor of yesterday to a vendor of tomorrow.

Mark Bregman, NetApp CTO

NetApp has been focused over the past several years on reinventing itself, evolving from a company centered on on-prem filers to one that operates several portfolios, with an emphasis on the hybrid cloud and newer technologies like flash. Mark Bregman, who was brought into the company as Chief Technology Officer a year ago as a key part of this transformation, emphasized, however, that the biggest changes were not to the products, but to the way the company plans and executes its strategies.

“I was brought in to help transform NetApp in two major ways,” Bregman said. “First, for 20 years, we were basically a one-product company. We had to learn how to operate different portfolios that had to be managed in different ways. That involved decisions about resource allocation and investment that we had to think very hard about.”

The second change was getting the company to think like a software company. This has been a recurring theme of CEO George Kurian’s: that NetApp is a software company, with the software being tin-wrapped. Being a real software company, though, also required changing fundamental processes, Bregman said.

“Way, way too much of the discussion had been about inventory of physical stuff,” he stated. “And while much of our IP was in software, it was embedded software, which is not the way software companies think. We worked at the tempo of a systems company. The tempo at which a software company can deliver solutions is much faster. Now, the tempo at which we can deliver solutions is much faster. We are much better there.”

Bregman, who was CTO at Symantec earlier in his career, drew on examples from his time there to illustrate the vast differences between how software and hardware companies think.

“When we worked with Intel, our engineers would sit down with them and talk about security elements, and Intel would say ‘that’s great, and we will incorporate it in the next edition of our chips in 2020’ – when we were thinking more like in October,” he said. “With software companies, the tempo of innovation is so critical.”

Bregman also referenced Symantec’s failed acquisition of Veritas – a deal virtually everyone in the industry thought would be a can’t-miss success when it was announced – as stemming from the same problem, one that paralleled Intel’s failed McAfee acquisition.

“With Symantec, the security world always operated at a different tempo from the storage world,” he said. “We never actually really integrated the two businesses. We kept them separate, and as a result, we missed the opportunity to really integrate data security in storage. With NetApp, we still aren’t completely there, in terms of operating like a software company, but we are getting closer.”

Bregman also pointed to NetApp’s ongoing integration of SolidFire as the other side of the same coin, in which the underlying storage technologies are managed separately because they are fundamentally different. At the same time, the data itself is managed in a unified way through the data fabric.

“The difference between SolidFire and NetApp storage is that NetApp is used in environments where you actively manage your storage, while the vision behind SolidFire is zero touch, where management is very much cookie-cutter style. They are very different technologies, but they are very complementary, in that they work best in different environments.

“We have to think about them differently, and that’s why the data fabric is so important,” Bregman added. “Through the data fabric, we can manage the data in a unified way, even though the storage itself can’t be managed in a unified way. Moving as a company from handling just the storage layer to the data layer works because the data is what companies care about. That’s what’s of value to them. As we move from being just a storage company to a data management company, the other capabilities and services at the data management level also become more important as differentiators.”

Extending the data fabric far beyond NetApp – a requirement for a software company – is a core part of the company’s strategy, and a major focus of its presentations this week at the Insight event.

“This vision won’t be unveiled all at once, and will be a continually evolving story, to provide that data management capability for customers regardless of where they store their data,” Bregman said. “A lot of this is available now. We have Cloud ONTAP in AWS. You can do that today. It’s being extended to Azure. You can take on-prem data and back it up to the cloud with AltaVault. We have a fair amount of integration now with others, and we will extend it with better and better data management capabilities as we evolve the data fabric. You will see better solutions here over time. It’s an ongoing process. Two years ago, George [Kurian] just talked about the data fabric. Last year here, we showed demos of what’s possible. This year, some of those are available as products and we will show some new ones, and next year some of those will be available as products.”

Bregman said that NetApp’s late entry into all-flash – a lateness that was always overstated anyway – has now been completely overcome.

“We were late in all-flash arrays as a main thrust,” he said. “But we were first to actually put flash in the array. What may have worked to our advantage is that we realized it’s not just about flash. It’s about storage using flash. We built flash into ONTAP and got all those benefits right away. Now we are number two in flash arrays according to IDC, and growing faster than anyone else. We have passed all the startups, even though we arrived late.”

The transition to flash is not, however, a game-changing event like the transition to the cloud, which Bregman said is much more significant. Flash is simply an evolution to a better type of storage, one that will soon become ubiquitous.

“Right now, we are in that transition between hard drives and flash,” he said. “Three or four years from now, we won’t be talking about this at all. It’s like the debate between bipolar and CMOS chips back in the mainframe era, where IBM was committed to bipolar and defended it, but CMOS was just so much better. It took over, and now there’s almost no bipolar anywhere. Spinning disks are like bipolar. We will soon be equating flash with storage. That’s why the bigger issue now isn’t about flash. It’s determining what data is going to go into the cloud, or to SaaS providers, or be stored on-prem.”

Unlike with the transition to flash, Bregman said, there’s no clear and easy answer here, and the vision of some that everything can eventually be managed in an inexpensive private cloud isn’t realistic.

“I’ve seen this movie several times before,” he said. “A new kind of platform comes out, and people see everything moving, like from mainframes to minicomputers to client-server. And then people realize there was some value in the old way after all, and some capabilities stay there forever, and some even move back to the old platform. We have never seen everything move and consolidate on a new platform.

“The challenge today is figuring out the best place to put each workload, and that’s the challenge we are focusing on,” Bregman stated.