While detailing Intel's multi-core future, Rattner said, "You think you'd have to be some kind of freak to program for multi core, but you don't have to be a ninja programmer," adding, "our goal at Intel is to banish ninja programmers forever."
At our Multi-core Challenge workshop in Bristol on 5 September, Ian Phillips of ARM said, "There are problems that need scientists to solve. And there are problems that don't," implying that multi-core still needs sophisticated programmers. That's more pragmatic than Rattner's position, isn't it?
Read more here.
Tuesday 27 September 2011
Has the industry already addressed some of the multi-core problems found in 2006?
Gordon Haff, in his blog post What became of multi-core programming problems?, says that some gaming titles have managed to genuinely tax desktop hardware by exploiting modern multi-core architectures.
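The way games exploit multiple cores is typically task parallelism: independent chunks of per-frame work (physics for separate objects, say) are fanned out across all available cores. A minimal sketch of the idea is below; the function names are illustrative, not from any real engine, and real engines use native threads rather than Python's GIL-limited thread pool.

```python
# Illustrative sketch of per-frame task parallelism, as used by
# multi-core game engines. Names and workload are invented.
from concurrent.futures import ThreadPoolExecutor
import os

def update_physics(object_id):
    # Stand-in for per-object physics work; returns one result per object.
    return object_id * object_id

def run_frame(object_ids):
    # Fan the independent per-object updates out over the cores the OS reports.
    with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
        return list(pool.map(update_physics, object_ids))

if __name__ == "__main__":
    results = run_frame(range(8))
```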
I can say that Gordon is one of a rare talent pool.
I will try to get him for a webinar.
NVIDIA white paper on variable SMP
Have you read about NVIDIA's Project Kal-El processor? It implements a novel variable Symmetric Multiprocessing technology.
Essentially, there are five processor cores in this architecture: four designed to run at high frequencies and a fifth at a much lower frequency. The low-frequency core handles light workloads such as multimedia rendering. Based on the workload, these cores are enabled and disabled. The fifth CPU, called the companion core, operates transparently to the OS.
My question would be: who decides the enabling and disabling strategy, and at what level? If multiple programs act on the enable/disable settings, would there be middleware that makes sensible, real-time decisions?
What's it all about? Would it save energy without compromising on performance?
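One plausible answer to the middleware question is a single governor that maps the current load to a core configuration, so individual programs never toggle cores themselves. Here is a hypothetical sketch of such a policy; the thresholds and names are invented for illustration and are not taken from NVIDIA's paper.

```python
# Hypothetical enable/disable policy for a variable-SMP system:
# one companion core for light loads, up to four performance cores
# for heavy loads. Thresholds are assumed, not from the whitepaper.

LOW_LOAD, HIGH_LOAD = 0.2, 0.7  # assumed utilisation thresholds

def select_cores(utilisation):
    """Return (companion_on, main_cores) for a utilisation in 0..1."""
    if utilisation < LOW_LOAD:
        # Light work (e.g. video playback): companion core only.
        return True, 0
    if utilisation > HIGH_LOAD:
        # Heavy work: all four performance cores, companion off.
        return False, 4
    # Moderate work: scale the performance cores with load.
    return False, max(1, min(4, round(utilisation * 4)))
```

Centralising the decision this way would also answer the "at what level" question: the policy lives in one place (kernel or firmware), and applications only generate load, never flip cores.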
Read the whitepaper below for more information
SMP: A Multicore CPU Architecture for Low Power and High Performance
A Conversation with Intel’s John Hengeveld at IDF 2011
Here is a link to Greg Pfister's website containing a transcript of his conversation with John Hengeveld of Intel.
The discussion spans several themes: the use of MIC for streaming, Intel attributing performance to its programming model, MIC versus SoC/SCC, and whether it would support CUDA and OpenCL.
An interesting read.
http://perilsofparallel.blogspot.com/2011/09/at-recent-intel-developer-forum-idf-i.html
Friday 1 October 2010
The Charity Cloud
ThinkGrid and appiChar are aiming to offer cloud services such as SaaS and Hosted Virtual Desktops to charitable organisations. The cloud suits such organisations, as they operate with variable headcounts and low budgets.
Read more here.
How is Uncle Sam using the cloud?
We heard Ian Osborne revealing the latest on the UK’s G-Cloud project.
This made me curious to check how Uncle Sam is using the cloud. The US is said to have saved $1.7 million by moving its services site onto the cloud. The US Army has adopted a cloud-based CRM tool, cutting costs by 90%, and the Department of the Interior (DoI) is on track to save 67% of its costs. And there is more.
Check this article: http://www.ecommercetimes.com/rsstory/70924.html
Lack of cloud computing vision is hurting most enterprises
This is the view expressed by David Linthicum in his cloud computing blog. So it is really back to basics: enterprises must look at their business problems and evaluate the technology options that solve them, instead of jumping on the private cloud bandwagon.
We agree. In fact, we are formulating a step-by-step process for creating a cloud computing strategy, which we are sharing in our regional cloud workshops and webinars.
Join us to learn more about this procedural approach on 6 October between 2:00 and 2:30 PM BST.
Register here: https://www1.gotomeeting.com/register/463153273