High Performance Computing Networking

High Performance Computing (HPC) used to be the exclusive domain of supercomputing national labs and advanced researchers.

This is no longer the case.

Costs have come down, complexity has been reduced, and off-the-shelf solutions are being built to exploit multiple processors these days.  This means that users with large compute needs — which, in an information-rich world, are becoming quite common — can now use techniques pioneered by the HPC community to solve their everyday problems.

Sure, there’s still the bleeding edge of HPC — my Grandma isn’t using a petascale computer (yet).  All the national labs and advanced researchers are still hanging out at the high end of HPC, pushing the state of the art to get faster and bigger results that simply weren’t possible before.

Vendors now make black-box solutions that hide the parallelism and complexity from the user.  Users now press a button, a miracle occurs, and the results pop out.  Users can now run larger jobs than they ever could before.  And these results take far less time to compute.

This is a very good thing.

It means that the science of parallel computing is being commoditized.  It has become commonplace enough to be useful to a (much) larger segment of the population than ever before.  High performance computing is no longer the exclusive domain of scientists and engineers.

Now the weatherman is using HPC to show you Doppler radar fly-throughs.  Financial advisors use HPC to forecast your retirement plans.  Interior designers use HPC to render what that new couch will look like in your family room.

Heck, even your cell phone has two processor cores these days.

This is a very good thing.

It means that we HPC old-timers are seeing a new type of marketing creep into our industry: it’s no longer directed at us.  It’s directed at the weatherman, the financial advisor, and the interior designer.

The new customers of HPC are not in the Top500; they’re in the bottom 500,000.  They don’t see their work as anything extraordinary in terms of computing — they’re just trying to get their jobs done.  The IT industry has now provided them the tools to do so.

This is a very good thing.

Some HPC old-timers scoff at such low-end computing, saying “That’s not HPC!”  That’s fine; call it what you will.  Perhaps we need to invent some new terminology to separate “high end HPC” from “low end HPC.”  Whatever.

But it doesn’t really matter to me; I find it both personally and professionally rewarding that the techniques of HPC are trickling down to the masses and becoming genuinely useful in everyday life.


4 Comments.


  1. I think the point isn’t “HPC has come to the mass market” but “parallel computing has come to the mass market”.


    • I definitely agree that parallel computing has come to the mass market.

      But I also argue that, potentially as a side-effect, HPC has also come to the mass market. “Normal” job computational requirements are definitely what used to be considered “HPC” only a short time ago (weather simulations, financial projections, high-detail rendering, etc.). That’s what I was trying to refer to in the last few lines of the post: perhaps we need new terminology such as “low end HPC” and “high end HPC,” or somesuch.

      Additionally, if you look at what companies like Microsoft are doing, they really are trying to bring more than just “simple” multi-threaded/multi-core parallelism to market for the common user. They’re trying to bring cluster-powered computation to Excel spreadsheets (and the like). Perhaps the line between “parallelism” and “HPC” is getting grey, or perhaps the definition of “HPC” itself is pretty grey…


  2. The thing is, this is hardly new. I mean, you are grouping the ideas of parallel computing with “HPC”, but that is hardly a precise definition of HPC. HPC used to be vectors… And every desktop out there has a vector unit. We used to need Unix to manage the giant gobs of computing at our disposal; now a consumer-oriented company (Apple) is the biggest Unix distributor, and Linux runs everything from phones to microwaves. Pick your computing technology, and it came from one of two places: HPC or gaming. HPC is like the F1 racing of the computer world: ideas developed there almost always end up in consumer-grade products, and this relationship is definitely in everyone’s best interest.


    • (sorry for taking so long to approve Kyle’s response — the notice got lost in my overflowing inbox!)

      I agree with most of what Kyle said. The problem is that there is no precise definition of “HPC”. I just watched the “Analyst Crossfire” video from ISC (http://www.youtube.com/watch?v=AVP0IdiT3iI); the 4 panelists didn’t even agree on exactly what “HPC” means. My point is that what used to be considered “HPC” is now mainstream — so is that no longer “HPC”? I’m sure different people have different answers to that.

      The overall theme of my post is: I’m seeing trickle-down of traditional / bleeding-edge “HPC” technologies into everyday usage. They’re parallel forms of computation, in one way or another — not just having the compiler automatically take advantage of vector processors, but actual parallel programming. Sometimes it’s just threads on a single machine, sometimes it’s spread out across a cluster (see the sketch at the end of this reply). But Joe and Jane User are starting to use techniques that used to be the sole domain of the bleeding edge of high performance computing.

      And that’s what I think is a Good Thing.
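
      To make the “threads on a single machine vs. spread out across a cluster” distinction concrete, here is a minimal sketch in C using MPI (it assumes an MPI implementation such as Open MPI is installed; the file name hello.c and the host names node1 and node2 are made up for illustration).  The same source code runs on the cores of one machine or across a whole cluster; only the launch command changes:

      /* hello.c: a minimal sketch.  Assumes an MPI implementation
       * (e.g., Open MPI) is installed; the file and host names used
       * here are hypothetical. */
      #include <mpi.h>
      #include <stdio.h>

      int main(int argc, char **argv)
      {
          int rank, size, len;
          char host[MPI_MAX_PROCESSOR_NAME];

          MPI_Init(&argc, &argv);
          MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* which process is this one?  */
          MPI_Comm_size(MPI_COMM_WORLD, &size);  /* how many processes in total */
          MPI_Get_processor_name(host, &len);    /* which machine is it on?     */

          printf("Process %d of %d running on %s\n", rank, size, host);

          MPI_Finalize();
          return 0;
      }

      Compile with “mpicc hello.c -o hello”.  Then “mpirun -np 4 ./hello” runs 4 processes on one machine’s cores, while “mpirun -np 4 --host node1,node2 ./hello” spreads the same 4 processes across two (hypothetical) machines.  Same code, same technique; only the scale changes.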

