Dell’s Futuresville 2.0 office looks into crystal ball

Futuresville 2.0 – Dell’s Chief Technology Office for Enterprise – discussed its work in identifying key trends and co-ordinating responses within Dell, and talked about some of the trends it deems vitally important going forward.

ROUND ROCK – Futuresville 2.0 is the term by which Dell’s Chief Technology Office for Enterprise is known locally on the Dell Campus. While the office’s role is rather more tangible than simply projecting the future, assessing the likely direction of future trends – or strategic moves – is a major part of its job.

“Competitiveness is a major focus of the CTO,” said Robert Hormuth, who has been Interim CTO since the surprise departure of Paul Perez approximately a month ago, and who prior to that had been Dell Fellow and Executive Director of Platform Architecture and Technology. “We work across lines of business to make sure Dell has competitive roadmaps, even though we don’t own the roadmaps. We make sure Dell doesn’t miss the next technology turn, whether from something like OpenStack or another new technology we need to adopt. We look at customer usage models and trends to make sure we don’t miss trends like ARM servers.”

Hormuth indicated that they also look at specific problems in the industry that they can tackle in a way that has high value to Dell’s customers.

“We have prototyped innovation projects and got them onto product road maps,” he said. “We take part in conference-facing activities, and are heavily involved in university programs. We also do a lot of the due diligence around mergers and acquisitions for Dell Ventures, and make recommendations there. We are heavily engaged in startup communities around the world.”

“We also mine our customers, and spend significant time in front of them at industry events to do this,” said Byron Blunk, Director, Software and Automation at Dell.

“What we do there with them is very different from what Dell would do at a sales engagement,” Blunk added. “We talk about directionality and big trends and see what resonates with the customers and what doesn’t, and we see where their pain points are. This doesn’t give us the answers, but gives us another piece of input.”

One key active project the CTO team is working on is benchmarking different infrastructures around containerization.

“We look at things like how many sockets and what type of services are optimal, and translate this directly to the product group about what they should be working on,” Hormuth said. “There’s no standard benchmark for this, so we have to devise our own.”

“There’s no one place Dell can go to align to a single standard for container management, so this means that we do a lot of prediction,” Blunk said. “In the dev-ops world, we have seen an increasing trend of systems interacting with systems, rather than individuals interacting with systems, and from there we can start doing proofs of concept and having what-if conversations. This lets us predict characteristics such as that scale is going to increase, and that this means that any of our products in this space have to understand scale in a way they didn’t historically. If we want a product to understand the container on top of it, it expands the knowledge it has to have by orders of magnitude. This applies even for traditional 2.0 technologies, because hybrid is likely for some time and we can’t have siloed technologies. Sorting out those things is a real opportunity for us.”

“We knit together efforts through the whole company working on similar things to make sure all of them are working together, doing things like putting together a Container Council where they all share information,” said Barton George, a Senior Principal Engineer in the CTO’s Office. “We also have to consider things like the advent of unikernels. Will those displace containers, or will things wind up being an amalgam of the two? That’s what’s hard to predict going forward. All technology is transient, and which ones will carry on is tough to predict.”

Another key issue is determining how the pace of innovation is likely to impact customer preferences.

“The industry is looking to see open innovation and a faster pace of innovation, seeing that continuing on the evolutionary path we are on is getting harder, with sockets under pressure,” Hormuth said. “Customers want to see more third party innovation, and they want and believe in an open world, but whether dealing with ARM or POWER, when or if one gets to meaningful TCO, customers will not flip for a two per cent TCO gain. There has to be measurable improvement for them to make that leap.”

Blunk agreed that while enterprises won’t take big risks and won’t change existing systems for a small TCO gain, the decision could well be different when new applications are involved.

“I think it’s highly likely they would pick a new architecture for a new set of capabilities where they don’t have an existing estate,” he said. “It’s all about the new sets of applications. If, over time, IDC is right about the growing dominance of the third platform and all these applications are rewritten, that’s how you achieve that affordability.”

“There’s not a lot of legacy code in machine learning,” Hormuth said. “We don’t have to worry about 40 years of code to deploy there. It’s these new areas that could possibly be ripe for adoption in a different architecture.”

Don Walker, an Enterprise Storage Product Architect in the CTO Office who leads Dell’s open source cloud effort, stressed that this opportunity for new processor technologies to make an impact is being driven now by the confluence of two technologies.

“You have operating system virtualization and containerization, together with the rapid evolution of persistent memory, and because of their confluence this is rapidly reaching a point where there will be an opportunity for processor vendors to come up with new processors built on a completely different platform technology,” he said. “Vendors understand this, but have yet to see a case where they can put an investment in to address those areas. Whoever gets there first could completely change the game, in my opinion.”

The difficulty the vendors face here is that while some elements of this future tech are here today, some are not, at least not completely.

“We don’t have the IO memory fabric buses to support that future tech today,” Hormuth said. “Do customers want to trap that next generation of non-volatile persistent memory in a server that can’t be composed out? To get to true composability, to this next tier of memory, the industry will have to invent a new IO memory bus, because having a lot of trapped expensive memory that can’t be composed doesn’t make IT very exciting. There is good work in the industry going on towards that path, but we can’t have resources trapped in a box that can’t be accessed by anyone else. There’s a real need for leaders in the industry to push this.”

“We lack a model the industry can use today for a really composable system, much less the mechanics and semantics of how to go about doing it,” Blunk said. “We’ve seen inroads like overlay technologies in software-defined networking. We’ve seen some of these things starting for managing hardware at a broader systems-level scale. So we aren’t talking ‘future future,’ but more mid-term, and that’s where we are putting a lot of our thought, and that’s what a lot of our competition is doing as well.”

The CTO team has also been heavily involved in the EMC integration with Dell, and in the longer term tech strategy that will unfold from it.

“At EMC World, EMC had a session where they pushed for a memory-centric architecture, which is very aligned with our vision of the future as well,” Hormuth said. “We also see the merger as moving things from where IT kept the lights on, to IT being a disruptive engine. At the very least, IT cannot be in the way. It has to move fast, or be disrupted like Blockbuster. From that standpoint, we can focus on running the data centre so our customers can focus on the apps. The merger brings us an unmatched portfolio in the industry on an end-to-end scale, so customers who want to disrupt the market can focus on modernized apps.”

While there are some obvious overlap areas in the storage space, the CTO execs consider the overall amount of duplication to be relatively small.

“There is very little overlap in what we do,” Walker said. “They also have things like Cloud Foundry, where we have nothing similar.”