Data and Perspective

Even genuinely newsworthy topics can get distorted when commentators exaggerate or use data selectively. Here are two recent examples I noticed.

“This is the worst financial crisis since the Great Depression.” It’s true that it’s bad and even historic, and to its credit this sound bite doesn’t actually claim it’s as bad as the Depression. I hope it doesn’t turn out to be in the same league as that; back then, people were lining up at soup kitchens. For now, however, Apple is still on track to sell 10 million iPhone 3Gs this year, which says something.

“Yesterday [Monday, September 29, 2008] saw the worst single-day plunge in Dow Jones history.” “It’s a new Black Monday.” Well, these got my attention, because I remember Black Monday on October 19, 1987 very well. I was working in IT at a major bank, doing software application support for traders and related departments. When I went up to the trading floor that day, I immediately knew something was badly wrong because of the eerie sound as the elevator doors opened — a sound you never hear during trading hours, and certainly not from a room full of traders standing at their desks: silence.

Yesterday’s loss of 777 points was stunning as the largest single-day point loss in Dow history. But as a percentage loss that’s not even in the top 10 Bad Dow Days, all but two of which occurred before 1935. Those two since the Depression occurred on October 19 and 26, 1987, when the Dow lost 22.6% and then another 8% of its total value in single sessions, respectively #2 and #9 on the All-Time Bad Dow Days list. For perspective, as of this writing the Dow is down 19.8% so far this entire year, and it surely hasn’t been a good year. Today’s crisis is already historic and could well get worse yet, of course, but some of us do remember some pretty bad ones in the past.

As good old Sam Clemens said (approximately), there are lies, darned lies, and statistics. Even when the statistics are true, always cross-check them for perspective. Even the best news and the worst news can be overstated, and viewing the same data from multiple angles helps ensure we understand it properly.


[Edited 9/30 to add: At 22.6%, Black Monday in 1987 was actually the worst Dow Day ever if you don’t count “reopening days” after unusual market closures, when the markets catch up with events that happened while they were closed. So when was the all-time worst Dow Day ever? Perhaps surprisingly, the answer is not in 1929, though several of the all-time top 10 were in that year. Rather, it was December 12, 1914, when the Dow dropped 24.4% as the markets reopened after being closed entirely for over four months due to the outbreak of World War I. (The markets closed on July 30, two days after Austria-Hungary declared war and a day before Germany did.) That helps put in perspective just how bad October 1987 was, and of course that today’s crisis is also pretty bad even if it hasn’t beaten those prior records, yet.]

Ralph Johnson on Parallel Programming Patterns

A few days ago at UIUC, Ralph Johnson gave a very nice talk on “Parallel Programming Patterns.” It’s now online, and here’s the abstract:

Parallel programming is hard. One proposed solution is to provide a standard set of patterns. Learning the patterns would help people to become expert parallel programmers. The patterns would provide a vocabulary that would let programmers think about their programs at a higher level than the programming language. The patterns could steer programmers away from common errors and towards good design principles.

There have been a number of papers about parallel patterns, and one book, Patterns for Parallel Programming. None of them have become popular. I think the problem is that parallel programming is diverse and requires more design expertise than traditional software design. Thus, parallel programming experts use more patterns than object-oriented design experts. I’ll critique the existing patterns and explain what I think should be done to make a set of patterns that can be as effective for parallel programming as patterns have been for object-oriented design.

If Johnson’s name sounds familiar, it should: He’s one of the “Gang of Four” authors of the seminal book Design Patterns.

Recommended viewing.

Effective Concurrency Course: Sep 22-24, 2008

The first offering of the three-day Effective Concurrency course in May went very well. We’re doing it again later this month — this will be the last offering this year.

Here’s the brief information (more details below):

3-Day Seminar: Effective Concurrency

September 22-24, 2008
Bellevue, WA, USA
Developed and taught by Herb Sutter

This course covers the fundamental tools that software developers need to write effective concurrent software for both single-core and multi-core/many-core machines. To use concurrency effectively, we must identify and solve four key challenges:

  • Leverage the ability to perform and manage work asynchronously
  • Build applications that naturally run faster on new hardware having more and more cores
  • Manage shared objects in memory effectively to avoid races and deadlocks
  • Engineer specifically for high performance

This seminar will equip attendees to reason correctly about concurrency requirements and tradeoffs, to migrate existing code bases to be concurrency-enabled, and to achieve key success factors for a concurrent programming project. Most code examples in the course can be directly translated to popular platforms and concurrency libraries, including Linux, Windows, Java, .NET, pthreads, and the forthcoming ISO C++0x standard.

Here’s a summary of what we’ll cover during the three days:


Fundamentals

  • Define basic concurrency goals and requirements
  • Understand applications’ scalability needs
  • Key concurrency patterns

Isolation: Keep Work Separate

  • Running tasks in isolation and communicating via async messages
  • Integrating multiple messaging systems, including GUIs and sockets
  • Building responsive applications using background workers
  • Threads vs. thread pools

Scalability: Re-enable the Free Lunch

  • When and how to use more cores 
  • Exploiting parallelism in algorithms 
  • Exploiting parallelism in data structures 
  • Breaking the scalability barrier

Consistency: Don’t Corrupt Shared State

  • The many pitfalls of locks: deadlock, convoys, etc.
  • Locking best practices
  • Reducing the need for locking shared data
  • Safe lock-free coding patterns
  • Avoiding the pitfalls of general lock-free coding
  • Races and race-related effects

Migrating Existing Code Bases to Use Concurrency

Near-Future Tools and Features

High Performance Concurrency

  • Machine architecture and concurrency
  • Costs of fundamental operations, including locks, context switches, and system calls
  • Memory and cache effects
  • Data structures that support and undermine concurrency
  • Enabling linear and superlinear scaling

I hope to get to meet some of you in the Seattle area!