I’m miles behind on site admin, mainly because of actual software work. I’ve got presentations lined up for late November and the demo cases don’t all quite work yet. But there have been a few interesting things on InsideHPC in the last month or so that I’ve been meaning to comment on.
Here’s the first one: http://insidehpc.com/2010/10/19/why-the-hpc-growth-equation-hasn%E2%80%99t-added-up/
This is dead on the money. Most of HPC does have a relatively high cost and low access. In some ways HPC is at the stage where software and hardware were in the 80s: it’s capable of some amazing things, but only for a select few experts. It needs that order-of-magnitude reduction in cost and corresponding increase in accessibility. I think that’s going to come from software; the hardware is already there. The bleeding edge of HPC is continually leaving a long tail of value behind it for companies motivated enough to get in there and deliver.
The second one is this: http://insidehpc.com/2010/10/22/idc-reports-recommends-europe-invest-heavily-in-hpc/
IDC consistently recommends that everyone invest heavily in HPC! One part of this report that I really liked is that IDC identified that HPC software needs to be exciting to use. It’s about time someone with some audience reach stepped up and said that. HPC can be a bit dry sometimes. Technologically amazing, but it looks kind of beige to the general public. Google made being very smart and very capable exciting to nearly everyone. Google became a verb. HPC needs the same level of excitement to drive growth and access.
In the report, the EU is advised to become a magnet for HPC talent and development. The same advice has been given to the governments of the US and Canada, Asia-Pacific, you name it, by IDC, the Council on Competitiveness, Compute Canada and several other organisations. If everyone executes on the recommendations, that will drive a skills shortage which will in turn drive educational and vocational choices and build the critical mass of skills needed to make HPC mainstream and capitalise on that long tail of value. Whoever nails it first is going to be in a strong leadership position and it’s not guaranteed to be anyone allied with the Old West. China just did to the Top-500 what Japan did with the Earth Simulator in 2002, but they did it with GPUs.
Which brings me to my third item: http://insidehpc.com/2010/11/02/ncsa-director-gpu-is-the-future-of-supercomputing/
I’m not buying it. China is about to prove that GPUs will put you at the top of the Top-500, and I expect a GPU-based competition for the Top-500 in the near future. I think GPUs can provide some amazing speedups in almost every area, but I don’t think they’re the answer to everything. I’m thinking manycore CPUs for tasks, and manycore GPUs for data. I could be wrong. The great majority of our code is cross-platform C++ and Nvidia is doing a great job with their C++ support, so if I am wrong we have a foot in the door, and hey, it’s our code and we’ll make it work. We’re going to start using GPUs deep down to speed up the linear solvers anyway; it’s a great place to start and easily isolated from the rest of the codebase.