Parallel Programming: Heeding History

Given that parallel systems have been in existence for decades, it is worth asking why they have caused so much fuss over the past few years. Many argue that this is due to the end of Moore's-Law-induced frequency scaling, while others note that the business models of some corporations would suffer if people no longer felt the need to buy new computers every few years.

Although there is no doubt some validity to both of these arguments, the real reason is economics. Sure, parallel systems have been commercially available for some decades, but how many people could afford to splash out $1M in the 1980s? $100K in the early 1990s? $10K in the late 1990s? In contrast, how about a few hundred of 2009's sadly inflated US dollars?

As the price of parallel systems has plummeted, the number of situations where it makes economic sense to use them has increased exponentially. This in turn means that the demand for parallel software has also grown suddenly, outstripping the supply of developers with parallel-programming experience. Voilà, a parallel-software crisis.

But this is most definitely not the first software crisis. A very similar crisis arose in the late 1970s, with very similar history. A computer cost millions of dollars in the 1960s, tens of thousands with the advent of the minicomputer in the early 1970s, and mere thousands with the advent of the microcomputer and the personal computer in the late 1970s and early 1980s. Then as now, as the price of computer systems plummeted, the number of situations where it made economic sense to use them increased exponentially. Then as now, the demand for computer software grew suddenly, outstripping the supply of programmers. Then as now, a software crisis was proclaimed.

Many new programming languages were put forward to deal with this crisis, and these can be categorized, not into the good, the bad, and the ugly, but rather into the good, the fad, and the ugly.

The programming languages in the “ugly” category are still with us, though the fraction of code written using them has decreased. We still use various flavors of shell, sed, awk, and C, as well as holdovers from earlier times, including FORTRAN and COBOL. I myself used the Bourne shell and C for production software in the early 1980s, and would never have guessed that I would still be using them more than a quarter century later. They are simply too ugly — and too useful — to die.

The programming languages in the “fad” category include darlings such as PASCAL, MODULA, Scheme, Eiffel, Smalltalk, and CLU. There are no doubt a few developers still playing with these toys, but these darlings never were able to deliver on the promises made by their proponents, and never managed to gain a large developer base. (And given that I designed, coded, and put a 50,000-line PASCAL program into production, I know whereof I speak.)

So what does it mean for a programming language to be “good”? Given that the goal is to solve a software crisis, the only reasonable measures of goodness are: (1) an increase in productivity of existing developers by orders of magnitude, (2) an increase in the fraction of the population who can use computers, again by orders of magnitude, or preferably (3) both.

So, what were the “good” programming languages that solved the Great Software Crisis of the late 1970s and early 1980s?

And what lessons should we draw from the Great Software Crisis to help us deal with the Great Parallel Software Crisis?

Dec. 8th, 2009 12:04 pm (UTC)
Modern Languages
What do you think about modern languages such as Python and Ruby? I know they don't (yet?) do much for parallel programming, but they certainly make coding much easier and more productive, as well as greatly increasing the population of coders.

On the parallel side, some of the things going on with Scala and Erlang are interesting. It is important for the nextgen language not to punish existing devs too much, and I think Scala does pretty well there.
Dec. 8th, 2009 09:09 pm (UTC)
Languages for parallelism
I will repeat a question from an earlier posting: what computer language, used heavily in production for decades, permits developers who know nothing of threads, locking, or messaging to nevertheless keep a large parallel computer usefully busy?

Do you know the answer?

Of course, I do hope that both Python and Ruby grow their parallel capabilities. Only time will tell how well Erlang and Scala will do, though both of these do have some interesting properties. That said, do you believe that either Erlang or Scala offers order-of-magnitude advantages to sequential developers wishing to develop new parallel code? To developers wishing to introduce parallelism into an existing sequential application? If so, why?