A few weeks ago, I posted "Cloudy weather ahead?" citing HP's abandonment of the Public Cloud. This was days before the HPE/HPI separation, and the reaction of many readers was, to say the least, confusion.
"How could HP be abandoning the Cloud?" was the most popular refrain.
And of course they weren't, in terms of hybrid clouds, etc.
But this cloud business is ... indeed cloudy.
In today's SF Chronicle, I published a small article entitled "Last mile: the home stretch for Hybrid Clouds." You can see it at http://goo.gl/1JLunW
The observation, essentially, is that Cloud technology and deployment today stand roughly where network interconnection stood in about 1985, before widespread adoption and deployment of routers.
Having done some work re Cisco history, I was struck by how few firms understood the power and ease of connecting multiple disparate networks. It took nearly a decade for most of the Fortune 500 to 'get on board,' astonishing as that might seem today.
IT directors, recall, are not paid to be risk-takers, but instead to be fully risk-averse. We met with many of them when I ran R&D/Product Marketing for Informix circa 1991. After one dinner with a number of CIOs, my wife on the way home said, "WHO ARE THESE PEOPLE? DULLER THAN...."
It fit then, and still fits today. The issues today, of course, are complicated by cyber-security concerns, contending vendors, and costly revamping of the core stack. But similar issues existed back then too....
Anyway, I hope you enjoy the article.
Friday, December 18, 2015
Silicon Photonics
One candidate for considerably improved computing performance is SILICON PHOTONICS.
To the question of what DARPA or other government agencies have done for us lately, the news from the Obama administration on this front seems encouraging. To wit, the formation of AIM Photonics last year. Michael Liehr, at SUNY, is the CEO/Director. I don't know him. The deputy director is John Bowers, an incredibly capable researcher based at UCSB (Univ of Calif at Santa Barbara). He is assisted by Rod Alferness, also from UCSB. I've worked with them in the past, and have the highest regard for them.
The task though is a tough one. It is housed at SUNY Albany, with strong goals and capable leaders. But recall SEMATECH, and its abortive start in the mid-1980s, even though DARPA and Craig Fields did monumental work to aid U.S. firms in their fight vis-a-vis Asian chip manufacturers.
SEMATECH eventually morphed, and moved to SUNY Albany, creating the NanoTech Complex, a key reason that the Si Photonics labs are being located there as well.
Also recall MCC in Austin, again launched with much fanfare (Fields eventually ran it) to 'save American leadership in software.'
Let's hope the lessons of those approaches are embodied in the AIM Photonics effort.
Below is the first slide of a quasi-public document describing the new facility.
There are two categories of support beyond the government sponsorship -- industry, with three tiers of backing, and academia, with an analogous three tiers. Below are the logos of the first tier of industry -- clearly some big names (recall, for example, that Intel stood aside from SEMATECH until Bob Noyce was willing to head it up). Keysight (our old instrument friends from HP) is in Tier 3. Cisco, Juniper, and Texas Instruments have all said "we'll play" but haven't anted up as yet, so are in no tier.
Herewith the lead schools (note, Caltech and Stanford are in Tier 3, Berkeley and UC San Diego in Tier 2).
I'd be interested in any thoughts you have on this topic.
Tuesday, December 15, 2015
Looking forward
We're well into "the new HP era," right? Six weeks in, in fact. And the word on The Machine is muted, but worth discussion. As are a lot of other "what now" topics.
One topic that hasn't gotten much mention from HP in a while is Quantum Computing, a worthy (if long theoretical) contender to HP's approach.
Last week, a gaggle of Google researchers announced some interesting results from their analysis with a D-Wave machine -- including a result that sped up operations by some 100 million times. Wow! We should get one of those machines.
And then, an MIT researcher, Scott Aaronson, gave an ACM interview re 'what does this all mean?' You might find it interesting reading at: http://cacm.acm.org/careers/195454-scott-aaronson-on-googles-new-quantum-computing-paper/fulltext
Net net--we're all still waiting.
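As I understand the result, the 100-million-times figure came from comparing the D-Wave annealer against classical simulated annealing on carefully constructed problem instances. For the curious, here's a toy sketch of that classical baseline -- the objective function and parameters are made up for illustration, not Google's benchmark:

```python
# Purely illustrative: a toy simulated-annealing loop, the kind of classical
# baseline (as I understand it) that the 10^8x claim was measured against.
# The energy function is a made-up stand-in, not Google's benchmark problem.
import math
import random

def energy(bits):
    # Hypothetical objective: count disagreements between neighboring bits.
    return sum(bits[i] != bits[i + 1] for i in range(len(bits) - 1))

def simulated_annealing(n=64, steps=20_000, t_start=2.0, t_end=0.01):
    rng = random.Random(0)
    state = [rng.randint(0, 1) for _ in range(n)]
    e = energy(state)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling
        i = rng.randrange(n)
        state[i] ^= 1                       # propose flipping one bit
        e_new = energy(state)
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new                       # accept the move
        else:
            state[i] ^= 1                   # reject: flip the bit back
    return state, e

print("final energy:", simulated_annealing()[1])
```

Part of the debate Aaronson addresses, if I read him right, is whether beating this sort of baseline on hand-picked instances reflects genuine quantum advantage or just the choice of comparison.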
Monday, October 26, 2015
Cloudy weather ahead?
Just when you were hopeful that the ship is "righting" -- that's a nautical term, one that Ellison's boys know -- comes word that Meg and team are abandoning yet another promising area.
You probably saw the announcement that they sold their Network Security business, TippingPoint, to Trend Micro last week for a pittance: $300M.
Now comes a bigger shock -- HP is abandoning "the public cloud." (Source: InfoWorld, 10/23/15)
Hewlett-Packard is pulling out of the public cloud business, it revealed this week. The HP Helion Public Cloud goes dark on Jan. 31, 2016.
Although this is big news at HP, I’m not sure the cloud computing world is surprised. HP’s public cloud has long struggled for adoption, eclipsed by Amazon Web Services and surpassed by Google, Microsoft, and IBM. As I wrote in 2011, HP's public cloud offered nothing distinctive or compelling -- and it hasn't changed since 2011.
To HP's credit, it recognizes the reality and is not only cutting its losses but also preventing more customers from wasting resources getting onto Helion, then having to leave later.
Instead of continuing to invest in an unsuccessful public cloud, HP is focusing on private and hybrid cloud computing. For large enterprise hardware and software providers like HP, that's a natural evolution. By sticking to its data center heritage, with modern tech twists, HP can play to its strengths in technology, marketing, sales -- and its internal culture. HP isn't a cloud company, and it struggled to behave like one.
That said, I’m not sure that private clouds are viable over the long term, as public clouds get better, cheaper, and more secure. It will be hard to argue for private and hybrid clouds when they cost four times as much but deliver much less.
For some years at least, private and hybrid clouds will be areas in which enterprise IT invests, so it makes sense that HP takes advantage of it for as long as possible.
HP is not alone in facing this sobering trend. IBM, Oracle, and other large providers face the same challenge of staying fat and happy as more enterprises move to the public cloud. The growth of the cloud means the death of a thousand cuts for large hardware and software companies.
Yet another HP Labs Fellow weighs in
Another blog post on the HP Labs website, this one from researcher Kirk Bresniker, HP Labs Chief Architect and HP Fellow:
Just recently, I was invited to present a paper at a joint meeting of the International Technology Roadmap for Semiconductors (ITRS) and The Institute of Electrical and Electronics Engineers (IEEE) Rebooting Computing group. Each plays a critical role in bringing together academics, industry researchers, and government agencies to help shape the development of, respectively, semiconductors and computer architecture.
We met over a weekend at Stanford University in the hope of pioneering new approaches to computing in the face of the impending end of semiconductor scaling, guided by the notion that we can't rely on our past trajectory to plot our future course. To paraphrase Dr. Paolo Gargini, founder and past chair of the ITRS: "We used up twenty years' worth of scaling techniques in ten years chasing after server performance, and we knew we were doing it at the time, and so now we have to ask ourselves what do we do next?"
The presentations offered some fascinating perspectives on how things might change. Key topics and trends discussed included innovations in machine and natural learning, data-intensive computing, trust and security, and memory-driven computing. But we were all prompted by the same question: what will happen when transistors stop getting smaller and faster?
My talk, titled "Memory-Driven Computing," explored a phenomenon our research and engineering teams at HP have been observing: that conventional compute is not getting faster, essentially because the technologies we've been optimizing for the past 60 years to create general-purpose processors (copper connectors, tiers of memory and storage, and relational databases) are all at their limits. It's not just a matter of the physics of CMOS devices; we need far more fundamental change than that.
Additionally, I noted, just as technology improvement curves are flattening out, data volumes are set to explode. Human-generated data will soon be too big to move even with photonics, and will thus be all but impossible to securely analyze in real time. Fundamentally, we're reaching an inflection point. It may not happen tomorrow, but it's only a matter of time before we reach it.
In response to this challenge, HP has been collaborating with IEEE Rebooting Computing and the ITRS, looking at the problem from a holistic perspective and taking into account both the evolutionary and the revolutionary approaches we need to accelerate scientific discovery and insights in the field. Our vision is that the different constituencies we represent can change physics to change the economics of information technology, catalyzing innovation in successively larger circles.
Just bringing together the leading minds of semiconductor physics and computer architecture isn't sufficient, however. We need to bring an even broader perspective to the table, with more engineering and scientific disciplines represented, because there are no more simple answers.
As I shared in my talk, HP Labs' major research initiative, The Machine, has been examining these questions for several years now, with the ambitious goal of reinventing the fundamental architecture of computers to enable a quantum leap in performance and efficiency, while also lowering costs over the long term and improving security. Almost every team within HP Labs is contributing to this effort, along with engineering teams from across the company's business units. Central to our work is a shift from computation to memory as the heart of information technology.
The Machine will fuse memory and storage, flatten complex data hierarchies, bring processing closer to the data, embed security control points throughout the hardware and software stacks, and enable management and assurance of the system at scale. It may seem counter-intuitive, but by concentrating on massive pools of non-volatile memory, we expect to spur innovation in computation by allowing many different models of computation to work on the same massive data sets. Quantum, deep neural net, carbon-nanotube, non-linear analog -- all of these models could be working in concert, connected to petabytes and exabytes of information derived from a world of intelligent devices.
By collaborating with the proven leadership of the ITRS and IEEE, we believe we can broaden, as we must, the impact of the technical innovations that we're developing with The Machine. The result, we hope, will be the development of new ways to extract knowledge and insights from large, complex collections of digital data with unprecedented scale and speed, allowing us to collectively help solve some of the world's most pressing technical, economic, and social challenges.
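To make the memory-driven idea a bit more concrete, here is a minimal, purely illustrative Python sketch of the core notion -- several independent compute passes sharing one large memory-mapped data pool rather than each getting its own copy. The file name, data shape, and the two "passes" are my own hypothetical stand-ins, not anything from HP's design:

```python
# Illustrative sketch only: emulating a "memory-driven" shape on stock
# hardware by letting several independent compute passes share one large
# memory-mapped data pool instead of copying the data to each consumer.
import numpy as np

POOL = "shared_pool.dat"   # hypothetical stand-in for a non-volatile memory pool
N = 1_000_000

# Create the pool once. In The Machine's vision this would be persistent,
# byte-addressable memory rather than a file on disk.
data = np.memmap(POOL, dtype=np.float64, mode="w+", shape=(N,))
data[:] = np.random.default_rng(0).random(N)
data.flush()

# Two different "models of computation" open the same pool read-only and
# work in place -- no per-model copy of the data set is ever made.
view_a = np.memmap(POOL, dtype=np.float64, mode="r", shape=(N,))
view_b = np.memmap(POOL, dtype=np.float64, mode="r", shape=(N,))

print("statistical pass :", float(view_a.mean()))
print("threshold pass   :", int((view_b > 0.99).sum()))
```

The sketch only captures the programming-model shift -- many computations moving to one resident data set -- not the photonic interconnect, security control points, or scale that the HP Labs post describes.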