Survivorship Bias, Survivor Guilt, and Opportunity Cost

N.B. For best results, try to get Destiny’s Child’s “Survivor” going in your head before proceeding.

Two recent blog posts by Larry Cebula and Holger Syme highlight the deep divide that separates the pessimists from the optimists in academia. Cebula explains why he steers his students away from pursuing a career as a professor, essentially arguing that the odds are simply too stacked against them even under the best of conditions. Syme, in contrast, suggests that Cebula's dream-crushing advice is short-sighted and ultimately dangerous to the long-term viability of the profession. While Cebula's reasoning will be familiar to many (he's working the same rich vein as William Pannapacker, who writes under the nom de doom Thomas H. Benton in the Chronicle of Higher Education), I suspect he remains greatly outnumbered among academics at large, thanks in no small part to survivorship bias and an unwillingness to grapple with the unforgiving calculus of opportunity cost.

“Survivorship bias,” you ask? Let’s roll Wikipedia:

Survivorship bias can lead to overly optimistic beliefs because failures are ignored […] It can also lead to the false belief that the successes in a group have some special property, rather than being just lucky.

Survivorship bias can lead to all sorts of hilarious situations, like your grandparents advising you not to bother wearing a seatbelt because they never did, and look, they're still around! Unfortunately, it can also drive you to imagine that there's a reason for your success: you certainly deserved to win, and others who deserve it will succeed, too!

In contrast, Cebula, Pannapacker, et al. appear at least partially motivated by survivorship bias’s evil twin, survivor’s guilt. They just know that they didn’t do anything particularly special, and they’re mortified by the thought that others have failed and are failing while they haven’t. They blame themselves for the gross imbalance in the job market and believe that by discouraging students from beginning the process, they can mitigate some of its deleterious effects. I’ll admit that I find this position rather appealing, in part because it’s self-deprecating, but mostly because in my case it’s actually quite easy to calculate what’s at stake.

The two core elements of the argument against pursuing a career in academia are the poor prospects for employment at the end of the road and the opportunity cost along the way. Few professors would deny the existence of the first challenge: we’ve all run the gauntlet of the job market, often multiple times, and we’ve seen our friends and colleagues and rivals do the same. Periodic reports on the state of the job market confirm what we instinctively know: there are too many candidates chasing too few jobs. And even among those available positions, fewer still conform to the ideal of a 2-2 tenure-track post.

The point about opportunity cost is far more contentious because it's less easily quantifiable (historians, at least, are explicitly trained to avoid indulging in such hypotheticals), and possibly because so many of us are bad at math. Cebula was (rightly) called out by my colleague Zach Schrag for advancing the figure of "a million dollars" as the income forgone by students who pursue graduate study instead of some more remunerative career (here everyone seems oddly obsessed with Hooters). But the opportunity costs can still be enormous, as my own experience shows.

For three years between undergraduate and graduate study, I worked at IBM, first as a contractor and then as a direct-hire employee. When I left in August 1999 for the University of Michigan, my salary was $72,000, excluding bonuses, awards, and the like, which were frequent and not particularly difficult to obtain. I then spent the next six years as a graduate student, eking out a modest existence on teaching fellowships and research grants. So what did I give up?

At IBM, ten percent raises were the norm for reasonably competent employees; I had received a 13% increase the year before I left, and far larger ones in the years before that. To be very conservative, let's assume 10% per year and no promotions, bonuses, or awards.

Year    IBM         Michigan
2000    $72,000     $14,000
2001    $79,200     $14,000
2002    $87,120     $13,000
2003    $95,832     $27,000
2004    $105,415    $21,000
2005    $115,957    $6,000
Total   $555,524    $95,000
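
For anyone who wants to check the arithmetic, here's a minimal sketch that reproduces the table above. It's my own back-of-the-envelope script, not anything from IBM's payroll: it compounds the $72,000 base at a flat 10% annual raise and hard-codes the stipend figures, rounding to whole dollars only when printing.

```python
# A minimal sketch reproducing the table above: a $72,000 base salary
# compounding at a flat 10% annual raise, versus actual graduate stipends.

base, raise_rate = 72_000, 0.10
stipends = [14_000, 14_000, 13_000, 27_000, 21_000, 6_000]  # 2000-2005

salary, ibm_total = base, 0.0
for year, stipend in zip(range(2000, 2006), stipends):
    print(f"{year}  ${salary:>10,.0f}  ${stipend:>7,}")
    ibm_total += salary          # accumulate forgone IBM salary
    salary *= 1 + raise_rate     # apply next year's raise

print(f"Total  ${ibm_total:,.0f} vs ${sum(stipends):,}")
print(f"Net opportunity cost: ${ibm_total - sum(stipends):,.0f}")
```

Run as written, the last line comes out to roughly $460,524, which is the "approaching half a million dollars" figure below.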

In contrast, during my six years of graduate study, I earned a total of around $95,000 in stipends and research grants, making for a net opportunity cost approaching half a million dollars. Not Cebula's full million, but still nothing to sneeze at. Had I remained in graduate school longer than six years, the gulf would of course have widened even faster, since I would have been trading the compounding effect of raises for the dwindling handouts granted to lingering ABDs.

What critics often fail to remember is that opportunity cost doesn't magically vanish even if one is lucky enough to land a good tenure-track position, as I did. Even with the relatively minor inflation we've experienced over the last decade, 1999's $72,000 is equivalent to just over $98,000 today. Not exactly the average assistant professor salary in the humanities, is it? Indeed, the opportunity cost associated with academia effectively continues to mount indefinitely.
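
As a sanity check on that inflation figure, a one-liner suffices. The cumulative CPI factor of roughly 1.36 from 1999 to the time of writing is my assumption, not an official number; the exact result depends on which index and endpoints you use.

```python
# Back-of-the-envelope inflation adjustment. The cumulative CPI factor
# (1999 -> time of writing) is an assumption, not an official figure;
# exact results depend on the index and endpoints chosen.
cpi_factor = 1.362  # assumed cumulative inflation since 1999
print(f"${72_000 * cpi_factor:,.0f}")  # -> $98,064, i.e. just over $98,000
```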

When we tell our students that earning a PhD is essentially a very costly way to purchase a lottery ticket (with less astronomical odds, but also a vastly diminished payout), we're already making a good case for what might not be there at the end of the road. But we also need to explain to them what absolutely won't be there: the earnings forgone along the way. Syme oddly claims that Cebula is cynical to address these financial concerns, continuing:

Has anyone ever been under the illusion that working as an academic in the humanities was a quick way to wealth, homeownership, and a stable nuclear family existence?

Well no, probably not. But unless I’m mistaken, what animates Cebula’s argument isn’t regret that working as an academic in the humanities is not a quick path to these goals — which, aside from “wealth,” are hardly grandiose aspirations — it’s the understanding that working as an academic in the humanities more often than not rules out these goals entirely.
