At its conference in Las Vegas last week, the company’s HP Labs division took a higher-than-usual profile, showing off a concept it calls “The Machine” – a new computing model that it says will decouple memory from processors, and allow significant advances in both.
The idea is to rethink how a computing engine is built, decoupling RAM from CPU and building upon the idea of system-on-a-chip that HP debuted with its Moonshot servers, which use vast numbers of cheap processors to tackle big jobs.
“We’ve been on Moore’s Law since I was a small child, and some of the device physics we’ve depended on for that are starting to plateau. The end of free upgrades is upon us,” said John Sontag, vice president of systems research for HP Labs. “We propose to build something with specialty purpose cores, and to put those purpose-built cores up against the data they’re trying to transform.”
The biggest advance in The Machine idea seems to be the use of photonics to connect memory to compute, eliminating what is today seen as a bottleneck between RAM and cache, the actual part of the memory that the processor(s) are acting upon. Using photonics as the interconnect allows processor and memory to be physically separated from each other, and lets processors in “The Machine” directly address memory “at the Exabyte scale.”
The memory to be used in The Machine is the “memristor” technology that HP previewed at its Discover conference in Barcelona late last year: a high-performance, low-cost non-volatile memory/storage hybrid designed to be a building block for the next generation of Big Data applications, essentially extending the in-memory model that has come to the forefront with SAP’s HANA database architecture to the whole of an enterprise’s data set.
“The Machine” is the name for HP’s vision of any combination of these small system-on-chip machines with some quantity of memristor for storage, both long-term and short-term. That could be a comparatively small quantity of both on board a cell phone, or a massive bank of processors connected via photonics for an enterprise data centre. At Discover, the company showed off a 3D printed model of a processor/memristor bundle.
The combination will require a rethinking of the modern operating system, Sontag said, as current operating systems all “spend their time shuffling data between the cache and storage,” a distinction which is not made in HP’s new design. The project will require HP and the community it’s hoping to build around The Machine to “revisit millions of lines of OS code,” said Kirk Bresniker, chief architect at HP Labs.
The payoff of the new architecture could be very compelling – HP models a computer based on The Machine being able to put together 160 giga-updates per second (GUPS) over eight racks. By comparison, the current Fujitsu K supercomputer manages 28.8 GUPS over 73,000 SPARC nodes. HP believes The Machine will deliver that computing power at one-eightieth the power required to run the K, as well, Bresniker said.
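Taken at face value, HP’s quoted figures imply a striking efficiency gap. A back-of-envelope sketch (using only the numbers reported above, which are HP’s claims rather than independent measurements):

```python
# Back-of-envelope comparison of HP's claimed figures for The Machine
# against the Fujitsu K supercomputer, as reported above.
machine_gups = 160.0   # HP's modelled figure, over eight racks
k_gups = 28.8          # Fujitsu K, over 73,000 SPARC nodes

# Raw throughput advantage in giga-updates per second
throughput_ratio = machine_gups / k_gups   # roughly 5.6x

# HP also claims one-eightieth the power draw, so the implied
# updates-per-watt advantage compounds the two claims
power_ratio = 80.0
perf_per_watt_ratio = throughput_ratio * power_ratio   # roughly 444x

print(f"Raw GUPS advantage: {throughput_ratio:.1f}x")
print(f"Implied GUPS-per-watt advantage: {perf_per_watt_ratio:.0f}x")
```

In other words, a roughly 5.6-fold raw throughput claim combined with the one-eightieth power claim works out to an implied efficiency advantage of several hundred times, which is the scale of improvement HP is asking its audience to believe.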
Ultimately, HP envisions a “mesh” of Machine-type computing and storage with networking even across industries. For example, Bresniker presented a situation where a Boeing 777 in-flight could not only store and transmit each bit of the gigabytes of data typically collected but ultimately deleted on a given flight, but also communicate appropriate details with other planes around it, such as weather conditions in real time, even to competitors’ planes.
From the main stage at Discover, the company said it expects to put its new purpose-built Machine OS into public beta by 2017, with that followed a year later by the debut of edge devices for the Machine’s mesh network, scale availability of core parts for the Machine, and a full release of the operating system.
The company chose the conference as the debut of The Machine because it’s ready to start “building the community” that will build the components, operating environment, and ultimately applications that will run on the new computer, Sontag said. HP Labs is currently working on “moving from the lab to fab” for components, and then will be looking to engage with other parts of HP to bring the whole package together.
“We’ve done simulations, emulations, and studies of this, but we’ve got a lot of hard work to do to deliver it,” Sontag said. “It’s going to take a couple of years.”
There are also still some naming issues to be solved between now and general availability of HP’s new supercomputer for everyone.
“We called it [The Machine] because HP Labs doesn’t have a marketing department,” admitted Martin Fink, CTO and director of HP Labs.
While HP Labs said that building the community to power the new technology is the main reason for going public at this time in such a big way, it does not seem to be lost on HP that rival IBM recently created a new business unit around its own next-generation computing architecture, Watson.
“This is not just to design a machine to win a game show, it’s about being able to enable scientists and engineers to solve the problems that are facing us as we look to provide education that befits the eight to ten billion other people on our planet,” Bresniker said in a statement that was part mission statement, part dig at its rival in Armonk.