On a trip to Europe last month, I was scheduled to give three talks about my research on deterministic parallelism. The talks all had the same topic, but wildly varying lengths: a one-minute talk at PLMW, which was co-located with POPL in Rome; a ten-minute talk at the POPL student session; and an hour-or-so-including-questions talk at MPI-SWS in Saarbrücken, Germany.
Here’s the talk I ended up giving, more or less:
When programmers have to contend with subtle, hard-to-reproduce bugs in their parallel programs, they’re entering a world of pain. Fortunately, deterministic parallel programming models promise to alleviate that pain. When we look at the existing deterministic parallel models, though, a pattern emerges: the determinism tends to be based on some notion of monotonicity. By taking monotonicity as our starting point, then, we can generalize and unify existing models. In particular, we can generalize single-assignment languages, which are deterministic, to allow multiple assignments that are monotonically increasing with respect to a user-specified lattice. We maintain determinism by pairing these monotonic writes with threshold reads that block until the state reaches a specified lower bound. With monotonic writes and threshold reads together, in our model, determinism abides.
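To make that concrete, here’s a minimal sketch of the idea in Python. This is not the actual implementation from my research; the class and method names are mine, invented for illustration. The cell’s state can only move up a user-supplied lattice (here given by a join and an ordering), and reads block until a threshold is reached, then return the threshold itself rather than the exact state, so no observer can see the nondeterministic intermediate values.

```python
import threading

class LVar:
    """Sketch of a monotonic cell: state only grows with respect
    to a user-supplied lattice (a join plus an ordering).
    Illustrative only; names here are not a real library's API."""

    def __init__(self, bottom, join, leq):
        self.state = bottom     # start at the lattice's bottom element
        self.join = join        # least upper bound of two elements
        self.leq = leq          # the lattice's ordering relation
        self.cond = threading.Condition()

    def put(self, v):
        # Monotonic write: the state moves up the lattice (or stays put).
        with self.cond:
            self.state = self.join(self.state, v)
            self.cond.notify_all()

    def get(self, threshold):
        # Threshold read: block until the state is at least `threshold`,
        # then return the threshold itself -- not the exact state -- so
        # the value observed is the same in every interleaving.
        with self.cond:
            while not self.leq(threshold, self.state):
                self.cond.wait()
            return threshold
```

For example, with the max lattice over the naturals, a reader calling `get(5)` blocks until some writer has pushed the state to at least 5, and then sees exactly 5, no matter how the writes were interleaved.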
All this had been partly inspired by finding a subtle movie reference on Derek Dreyer’s website while I was getting ready for my MPI-SWS visit. Still, I wasn’t sure if I should attempt the same joke again for that talk. But once I had gotten to Saarbrücken and had a dinner conversation with Derek, Aaron, and Jessica during which it became obvious that Derek was a Coen brothers fan, I couldn’t not go for it.
And that’s how I ended up traveling Europe last month giving a series of talks in which I explained my Ph.D. research in terms of Big Lebowski jokes. (If you just want to see the jokes in the above video, they’re at 0:56, 28:42, and 45:45. Enjoy!)