A STORM is Coming

April 05 2015

Have you ever used a computer that cost $200 million? Did you ever
envision something so complex that, despite appearing crystal clear in
your mind, it escaped you every time you tried to put it into words?
This is my job. I work with supercomputers.

My favorite Listserve stories are those where people share insights
into their daily lives. I am pursuing a PhD in computer science. A
common question for me is: "Could you tell me what you're working on?
Can I comprehend this at all?" -- which really is more depressing than
flattering. Right now I'm sitting on a couch in Baton Rouge. I'm
collaborating with scientists from LSU (hey there!) on a project to
improve the fidelity of storm surge prediction. Not complicated at
all. Do you remember Hurricane Katrina?

But let's start at the very beginning. What is a supercomputer? For
starters, it's not the new MacBook. Supercomputers are essentially
tens of thousands of networked computers. They are so huge that the
largest ones get their own buildings. And they are pricey. The fastest
American machine is "Titan" (at ORNL). BTW: the difference between
these machines and the data centers of Google, Facebook, and the like
is that a supercomputer is built to tackle gigantic, monolithic
compute problems. Data centers process myriads of relatively small and
independent computations.

But why do we build these insanely expensive machines? Mostly for
simulations. Supercomputers are the reason we no longer nuke South Sea
islands to test new weapon designs. The car you're driving likely
completed its first crash tests in a supercomputer. The project we're
working on seeks to accurately model coastal inundation caused by
heavy storms. Evacuating an area as large and populated as the New
Orleans metropolitan area costs hundreds of millions of dollars. Not
evacuating it may cost thousands of lives. Not a decision to make
lightly.

Most movies don't get science. Disruptive results are rarely achieved
by a lone Sheldon Cooper, but by interdisciplinary teams -- with
specialists for every aspect of the endeavor. On our team we have
coastal engineers, mathematicians, and computer scientists. My job is
to shield the others from the complexity of supercomputers.

Now, I wrote something about abstract thinking and how it's sometimes
hard to pin ideas down. That's got to do with programming. Computer
programs are very much like recipes. A recipe is a set of instructions
that describes how to transform ingredients into a desired output
(e.g. pizza). Instead of ingredients, a program operates on data. The
problem: a computer operates like a very daft, yet accurate person.
You need to describe every action in great detail, and it will follow
all of your instructions right down to the letter -- but without
understanding the greater goal, and without ever correcting any of
your errors. Supercomputers add quite a deal of complexity to this
job. To keep the picture of the kitchen intact: imagine you're not
instructing one cook to make one pizza, nor thousands of chefs to
produce thousands of pizzas; your goal is to make ten thousand chefs
prepare a single pizza in 0.01% of the usual time -- in perfect
coordination.
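For the curious: the "many chefs, one pizza" trick has a name, domain
decomposition -- one big job is chopped into slices and each worker
handles one slice. Here's a toy sketch in Python (not code from the
STORM project; real simulation codes distribute work across thousands
of machines, e.g. with MPI, while this just uses a few local threads
to show the idea):

```python
# Domain decomposition in miniature: split ONE computation (a big sum)
# into chunks and hand each chunk to a separate worker. The workers
# here are local threads standing in for compute nodes; the function
# names are made up for this example.
from concurrent.futures import ThreadPoolExecutor


def partial_sum(bounds):
    """One 'chef' handles only its slice of the full problem."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))


def parallel_sum_of_squares(n, workers=4):
    # Chop the range [0, n) into one contiguous chunk per worker.
    step = n // workers
    chunks = [(w * step, (w + 1) * step if w < workers - 1 else n)
              for w in range(workers)]
    # Each worker computes its partial result; combining them
    # (here: a final sum) reassembles the single answer.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))


if __name__ == "__main__":
    n = 10_000
    assert parallel_sum_of_squares(n) == sum(i * i for i in range(n))
    print("parallel and serial results agree")
```

The hard part on a real supercomputer is exactly what the kitchen
picture suggests: the chunks are rarely independent (a wave crossing
one chef's slice of the coast spills into the neighbor's), so the
workers must constantly exchange data at the boundaries -- and that
coordination is where most of the complexity lives.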

E-mail me if you've got questions on supercomputers or would like to
know more about the STORM project (alternatively google for "STORM ADCIRC Stellar").


Andreas Schäfer
[email protected]
Baton Rouge, LA
