Monday, October 26, 2015

Cloudy weather ahead?

Just when you were hopeful that the ship was "righting" -- that's a nautical term, one that Ellison's boys know -- comes word that Meg and team are abandoning yet another promising area.

You probably saw the announcement that they sold their network security business, TippingPoint, to Trend Micro last week for a pittance: $300M.

Now comes a bigger shock -- HP is abandoning "the public cloud." Source: InfoWorld, 10/23/15.


Hewlett-Packard is pulling out of the public cloud business, it revealed this week. The HP Helion Public Cloud goes dark on Jan. 31, 2016.
Although this is big news at HP, I’m not sure the cloud computing world is surprised. HP’s public cloud has long struggled for adoption, eclipsed by Amazon Web Services and surpassed by Google, Microsoft, and IBM. As I wrote in 2011, HP's public cloud offered nothing distinctive or compelling -- and it hasn't changed since 2011.
To HP's credit, it recognizes the reality and is not only cutting its losses but also preventing more customers from wasting resources getting onto Helion, then having to leave later.
Instead of continuing to invest in an unsuccessful public cloud, HP is focusing on private and hybrid cloud computing. For large enterprise hardware and software providers like HP, that's a natural evolution. By sticking to the data center history with modern tech twists, HP can play to its strengths in technology, marketing, sales -- and its internal culture. HP isn't a cloud company, and it struggled to behave like one.
That said, I’m not sure that private clouds are viable over the long term, as public clouds get better, cheaper, and more secure. It will be hard to argue for private and hybrid clouds when they cost four times as much but deliver much less.
For some years at least, private and hybrid clouds will be areas in which enterprise IT invests, so it makes sense for HP to take advantage of them for as long as possible.
HP is not alone in facing this sobering trend. IBM, Oracle, and other large providers face the same challenge of staying fat and happy as more enterprises move to the public cloud. The growth of the cloud means the death of a thousand cuts for large hardware and software companies.

Yet another HP Labs Fellow weighs in

Another blog post on the HP Labs website, this one from researcher Kirk Bresniker, HP Labs Chief Architect and HP Fellow:


Just recently, I was invited to present a paper at a joint meeting of the International Technology Roadmap for Semiconductors (ITRS) and The Institute of Electrical and Electronics Engineers (IEEE) Rebooting Computing group. Each plays a critical role in bringing together academics, industry researchers, and government agencies to help shape the development of, respectively, semiconductors and computer architecture.

We met over a weekend at Stanford University in the hope of pioneering new approaches to computing in the face of the impending end of semiconductor scaling, guided by the notion that we can't rely on our past trajectory to plot our future course. To paraphrase Dr. Paolo Gargini, founder and past chair of the ITRS: "We used up twenty years' worth of scaling techniques in ten years chasing after server performance, and we knew we were doing it at the time, and so now we have to ask ourselves, what do we do next?"


The presentations offered some fascinating perspectives on how things might change. Key topics and trends included innovations in machine and natural learning, data-intensive computing, trust and security, and memory-driven computing. But we were all prompted by the same question: what will happen when transistors stop getting smaller and faster?

My talk, titled "Memory-Driven Computing," explored a phenomenon our research and engineering teams at HP have been observing: conventional compute is not getting faster, essentially because the technologies we've been optimizing for the past 60 years to create general-purpose processors (copper connectors, tiers of memory and storage, and relational databases) are all at their limits. It's not just a matter of the physics of CMOS devices; we need far more fundamental change than that.

Additionally, I noted, just as technology improvement curves are flattening out, data volumes are set to explode. Human-generated data will soon be too big to move even with photonics, and will thus be all but impossible to analyze securely in real time. Fundamentally, we're reaching an inflection point. It may not happen tomorrow, but it's only a matter of time before we reach it.

In response to this challenge, HP has been collaborating with IEEE Rebooting Computing and the ITRS, looking at the problem from a holistic perspective and taking into account both the evolutionary and the revolutionary approaches we need to accelerate scientific discovery and insights in the field. Our vision is that the different constituencies we represent can change physics to change the economics of information technology, catalyzing innovation in successively larger circles.

Just bringing together the leading minds of semiconductor physics and computer architecture isn't sufficient, however. We need to bring an even broader perspective to the table, with more engineering and scientific disciplines represented because there are no more simple answers.

As I shared in my talk, HP Labs' major research initiative, The Machine, has been examining these questions for several years now, with the ambitious goal of reinventing the fundamental architecture of computers to enable a quantum leap in performance and efficiency while also lowering costs over the long term and improving security. Almost every team within HP Labs is contributing to this effort, along with engineering teams from across the company's business units. Central to our work is a shift from computation to memory as the heart of information technology.

The Machine will fuse memory and storage, flatten complex data hierarchies, bring processing closer to the data, embed security control points throughout the hardware and software stacks, and enable management and assurance of the system at scale. It may seem counter-intuitive, but by concentrating on massive pools of non-volatile memory, we expect to spur innovation in computation by allowing many different models of computation to work on the same massive data sets. Quantum, deep neural net, carbon-nanotube, non-linear analog – all of these models could be working in concert, connected to petabytes and exabytes of information derived from a world of intelligent devices.

By collaborating with the proven leadership of the ITRS and IEEE, we believe we can broaden, as we must, the impact of the technical innovations that we’re developing with The Machine. The result, we hope, will be the development of new ways to extract knowledge and insights from large, complex collections of digital data with unprecedented scale and speed, allowing us to collectively help solve some of the world’s most pressing technical, economic, and social challenges.
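
If you're wondering what "fusing memory and storage" and "flattening complex data hierarchies" would actually mean to a programmer, here is a rough sketch of the contrast Bresniker is drawing. This is my own illustration, not HP code: it uses a plain memory-mapped file to stand in for a byte-addressable pool of non-volatile memory, and the file name ("records.dat") and record layout are made up for the example.

/* Sketch: conventional storage-tier I/O vs. direct load/store access
 * to a byte-addressable persistent region (here faked with mmap() on
 * an existing file of at least 16 bytes). Illustrative only. */
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <sys/mman.h>
#include <unistd.h>

struct record { uint64_t key; uint64_t value; };

int main(void) {
    int fd = open("records.dat", O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    /* Conventional model: copy bytes from storage into a DRAM buffer,
     * mutate the copy, then write it back through the I/O stack. */
    struct record buf;
    pread(fd, &buf, sizeof buf, 0);   /* storage -> DRAM */
    buf.value += 1;
    pwrite(fd, &buf, sizeof buf, 0);  /* DRAM -> storage */

    /* "Memory-driven" model: the same bytes are mapped into the address
     * space and updated in place with ordinary loads and stores. */
    struct record *r = mmap(NULL, sizeof *r, PROT_READ | PROT_WRITE,
                            MAP_SHARED, fd, 0);
    if (r == MAP_FAILED) { perror("mmap"); close(fd); return 1; }
    r->value += 1;                    /* no explicit read/write calls */
    msync(r, sizeof *r, MS_SYNC);     /* persistence point */

    munmap(r, sizeof *r);
    close(fd);
    return 0;
}

The second half is the point: when persistent data is directly addressable, the copy-in/copy-out choreography (and much of the software layered on top of it) simply isn't needed, which is what would let many different computational models share the same massive data sets.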

The HP Labs website boasts about THE MACHINE

I hadn't visited the 'official' HP Labs website recently; I invite you to take a look as well.

See http://www.hpl.hp.com/research/systems-research/themachine/

What you might read there -- pretty impressive claims -- is as follows:


By 2020, 30 billion connected devices will generate unprecedented amounts of data. The infrastructure required to collect, process, store, and analyze this data requires transformational changes in the foundations of computing. Bottom line: current systems can’t handle where we are headed and we need a new solution.

HP has that solution in The Machine. By discarding a computing model that has stood unchallenged for sixty years, we are poised to leave sixty years of compromises and inefficiencies behind. We’re pushing the boundaries of the physics behind IT, using electrons for computation, photons for communication, and ions for storage.

The Machine will fuse memory and storage, flatten complex data hierarchies, bring processing closer to the data, embed security control points throughout the hardware and software stacks, and enable management and assurance of the system at scale.

The Machine will reinvent the fundamental architecture of computers to enable a quantum leap in performance and efficiency, while lowering costs over the long term and improving security.

The industry is at a technology inflection point that HP is uniquely positioned to take advantage of going forward. The Machine demonstrates the innovation agenda that will drive our company, and the world, forward.


What you can also find are references to blog postings.  Here's one, asserting true progress:

The Next Iteration of Memory for The Machine

http://h30499.www3.hp.com/t5/Behind-the-scenes-Labs/The-Next-Iteration-of-Memory-for-The-Machine/ba-p/6800656#.Vi5FFGSrRR6


Today HP and SanDisk announced a new partnership with exciting implications for The Machine. The partnership unites HP’s Memristor technology and know-how and SanDisk’s ReRAM technology and manufacturing and design expertise.

Working together with SanDisk, we’re aiming to create a new Storage Class Memory technology. We hope to use this new technology in The Machine and we’ll also be working together on near-term enterprise-wide solutions for our customers.

The Machine will reinvent computing, knocking the current model, which has stood unchallenged for sixty years, back on its heels. It is Hewlett Packard Labs’ most ambitious project and will bring together breakthrough improvements to technology ranging from non-volatile memory to photonically-enabled fabrics; from workload-specific processing to a full suite of open source developer tools.

Because memory is the heart of The Machine, we’re excited about this new partnership and what it holds for the future of Memory-driven computing at HP.