Another blog post on the HP Labs website, this one from researcher Kirk Bresniker, HP Labs Chief Architect and HP Fellow:
We met over a weekend at Stanford University in the hope of
pioneering new approaches to computing in the face of the impending end of
semiconductor scaling and guided by the notion that we can't rely on our past
trajectory to plot our future course. To paraphrase Dr. Paolo Gargini, founder
and past chairman of the ITRS (International Technology Roadmap for Semiconductors):
"We used up twenty years' worth of scaling techniques in ten years chasing
after server performance, and we knew we were doing it at the time, and so now
we have to ask ourselves: what do we do next?"
The presentations offered some fascinating perspectives on how
things might change. Key topics and trends discussed included innovations in
machine and natural learning, data-intensive computing, trust and security, and
memory-driven computing. But we were all prompted by the same question: what
will happen when transistors stop getting smaller and faster?
My talk, titled “Memory-Driven Computing,” explored a phenomenon
our research and engineering teams at HP have been observing: conventional
computing is not getting faster, essentially because the technologies we've been
optimizing for the past 60 years to build general-purpose processors (copper interconnects,
tiers of memory and storage, and relational databases) are all at their limits.
It's not just a matter of the physics of CMOS devices; we need far more
fundamental change than that.
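To make those limits concrete, here is a rough back-of-envelope sketch of today's memory and storage tiers. The latency and clock figures below are commonly quoted ballpark values, not measurements of any particular HP system; the point is how many processor cycles are wasted waiting on each tier.

```python
# Rough, illustrative latencies for today's memory/storage tiers.
# Ballpark figures only -- not measurements of any specific system.
CPU_CLOCK_HZ = 3e9  # assume a ~3 GHz core

tiers_ns = {
    "L1 cache":          1,
    "DRAM":            100,
    "NVMe SSD":    100_000,      # ~100 microseconds
    "Hard disk": 10_000_000,     # ~10 milliseconds
}

for tier, latency_ns in tiers_ns.items():
    cycles_stalled = latency_ns * 1e-9 * CPU_CLOCK_HZ
    print(f"{tier:>9}: ~{latency_ns:>12,} ns = ~{cycles_stalled:,.0f} cycles stalled")
```

Every trip down that hierarchy, over copper, costs orders of magnitude more time than the computation itself, which is why faster transistors alone no longer help.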
Additionally, I noted, just as technology improvement curves are
flattening out, data volumes are set to explode. Human-generated data will soon
be too big to move even with photonics, and will thus be all but impossible to
securely analyze in real time. Fundamentally, we’re reaching an inflection
point. It may not happen tomorrow, but it’s only a matter of time before we reach it.
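As a rough illustration of the "too big to move" point: even over a very fast optical link, simply copying an exabyte takes months. The link speed below is an assumption chosen for illustration, not a claim about any particular photonic interconnect.

```python
# Back-of-envelope: time to move one exabyte over a single fast optical link.
# The 400 Gb/s link speed is an illustrative assumption.
link_gbps = 400
data_bits = 1e18 * 8                      # one exabyte, in bits
seconds = data_bits / (link_gbps * 1e9)
print(f"~{seconds / 86_400:.0f} days to move 1 EB at {link_gbps} Gb/s")
# => roughly 230 days -- far too long for real-time analysis.
```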
In response to this challenge, HP has been collaborating with IEEE
Rebooting Computing and the ITRS, looking at the problem from a holistic
perspective and taking into account both the evolutionary and the revolutionary
approaches we need to accelerate scientific discovery and insights in the
field. Our vision is that the different constituencies we represent can
change the physics in order to change the economics of information technology, catalyzing innovation
in successively larger circles.
Just bringing together the leading minds of semiconductor physics
and computer architecture isn't sufficient, however. We need to bring an even
broader perspective to the table, with more engineering and scientific
disciplines represented, because there are no more simple answers.
As I shared in my talk, HP Labs’ major research initiative, The Machine, has been examining these questions for
several years now with the ambitious goal of reinventing the fundamental
architecture of computers to enable a quantum leap in performance and
efficiency, while also lowering costs over the long term and improving
security. Almost every team within HP Labs is contributing to this effort,
along with engineering teams from across the company’s business units. Central
to our work is a shift from computation to memory as the heart of information
technology.
The Machine will fuse memory and storage, flatten complex data
hierarchies, bring processing closer to the data, embed security control points
throughout the hardware and software stacks, and enable management and
assurance of the system at scale. It may seem counter-intuitive, but by concentrating
on massive pools of non-volatile memory, we expect to spur innovation in
computation by allowing many different models of computation to work on the
same massive data sets. Quantum, deep neural net, carbon-nanotube, non-linear
analog: all of these models could work in concert, connected to
petabytes and exabytes of information derived from a world of intelligent
devices.
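One way to picture that shift in software terms is the sketch below. It is not The Machine's actual programming model; it simply uses a memory-mapped file as a stand-in for a large pool of byte-addressable non-volatile memory, which several different "compute engines" then analyze in place rather than copying the data through storage tiers. The file name and engines are hypothetical.

```python
import mmap
import numpy as np

# Stand-in for a shared pool of byte-addressable non-volatile memory:
# a memory-mapped file that outlives any single process.
# (Illustrative only; fabric-attached memory is not literally a file.)
POOL_PATH = "sensor_pool.bin"   # hypothetical data set
N_SAMPLES = 1_000_000

# One-time setup: populate the "persistent memory pool" with sample data.
with open(POOL_PATH, "wb") as f:
    f.write(np.random.rand(N_SAMPLES).astype(np.float64).tobytes())

# Any number of compute engines can now map the same pool and work on it
# in place -- no per-engine copies through a storage hierarchy.
with open(POOL_PATH, "r+b") as f:
    pool = mmap.mmap(f.fileno(), 0)

data = np.frombuffer(pool, dtype=np.float64)

# "Engine" 1: a simple statistical summary over the shared pool.
print("mean:", data.mean())

# "Engine" 2: a different model of computation over the same bytes --
# here a crude threshold detector standing in for, say, a neural net.
print("outliers:", int((data > 0.999).sum()))
```

The data never moves; only small results do. That is the essence of putting memory, rather than the processor, at the center.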
By collaborating with the proven leadership of the ITRS and IEEE,
we believe we can broaden, as we must, the impact of the technical innovations
that we're developing with The Machine. The result, we hope, will be new ways
to extract knowledge and insights from large, complex collections of digital
data at unprecedented scale and speed, allowing us collectively to help solve
some of the world's most pressing technical, economic, and social challenges.