Survivorship Bias, Survivor Guilt, and Opportunity Cost

N.B. For best results, try to get Destiny’s Child’s “Survivor” going in your head before proceeding.

Two recent blog posts by Larry Cebula and Holger Syme highlight the deep divide that separates the pessimists from the optimists in academia. Cebula explains why he steers his students away from pursuing a career as a professor, essentially arguing that the odds are simply too stacked against them even under the best of conditions. Syme, in contrast, suggests that Cebula’s dream-crushing advice is short-sighted and ultimately dangerous to the long-term viability of the profession. While Cebula’s reasoning will be familiar to many — he’s working the same rich vein as William Pannapacker, writing under his nom de doom Thomas H. Benton in the Chronicle of Higher Education — I suspect he’s still greatly outnumbered among the broader population of academics, thanks in no small part to survivorship bias and an unwillingness to grapple with the unforgiving calculus of opportunity cost.

“Survivorship bias,” you ask? Let’s roll Wikipedia:

Survivorship bias can lead to overly optimistic beliefs because failures are ignored […] It can also lead to the false belief that the successes in a group have some special property, rather than being just lucky.

Survivorship bias can lead to all sorts of hilarious situations, like your grandparents advising you not to bother wearing a seatbelt because they never did, and look, they’re still around! Unfortunately, it can also drive you to imagine that there’s a reason for your success: you certainly deserved to win, and others who deserve it will succeed, too!

In contrast, Cebula, Pannapacker, et al. appear at least partially motivated by survivorship bias’s evil twin, survivor’s guilt. They just know that they didn’t do anything particularly special, and they’re mortified by the thought that others have failed and are failing while they haven’t. They blame themselves for the gross imbalance in the job market and believe that by discouraging students from beginning the process, they can mitigate some of its deleterious effects. I’ll admit that I find this position rather appealing, in part because it’s self-deprecating, but mostly because in my case it’s actually quite easy to calculate what’s at stake.

The two core elements of the argument against pursuing a career in academia are the poor prospects for employment at the end of the road and the opportunity cost along the way. Few professors would deny the existence of the first challenge: we’ve all run the gauntlet of the job market, often multiple times, and we’ve seen our friends and colleagues and rivals do the same. Periodic reports on the state of the job market confirm what we instinctively know: there are too many candidates chasing too few jobs. And even among those available positions, fewer still conform to the ideal of a 2–2 tenure-track post.

The point about opportunity cost is far more contentious because it’s less easily quantifiable (and historians at least are of course explicitly trained to avoid indulging in such hypotheticals), and possibly because so many of us are bad at math. Cebula was (rightly) called out by my colleague Zach Schrag for advancing the figure of “a million dollars” as an indication of the income forgone by students pursuing graduate study as opposed to some more remunerative career (and here everyone seems oddly obsessed with Hooters). But the opportunity costs can still be enormous, as my own experience shows.

For three years between undergraduate and graduate study, I worked at IBM, first as a contractor and then as a direct-hire employee. When I left in August 1999 for the University of Michigan, my salary was $72,000, excluding bonuses, awards, etc., which were frequent and not particularly difficult to obtain. I then spent the next six years as a graduate student, eking out a modest existence on teaching fellowships and research grants. So what did I give up?

At IBM, ten percent raises were the norm for reasonably competent employees; I had received a 13% increase the previous year and far larger increases in the years before that. To be very conservative, let’s assume 10% per year and no promotions, bonuses, or awards.

Year    IBM         Michigan
2000    $72,000     $14,000
2001    $79,200     $14,000
2002    $87,120     $13,000
2003    $95,832     $27,000
2004    $105,415    $21,000
2005    $115,957    $6,000
Total   $555,524    $95,000

In contrast, during my six years of graduate study, I earned a total of around $95,000 in stipends and research grants, making for a net opportunity cost approaching half a million dollars. Not Cebula’s full million, but still nothing to sneeze at. Had I remained in graduate school longer than six years, the gulf would of course have widened even more rapidly, since I would have traded the compounding effect of raises for the rapidly diminishing handouts granted to lingering ABDs.
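The arithmetic above is easy to sanity-check. A minimal sketch, assuming (as stated) a flat 10% annual raise on the 1999 base of $72,000 and the stipend figures from the table:

```python
# Opportunity-cost arithmetic: flat 10% raises vs. graduate stipends.
base, raise_rate, years = 72_000, 0.10, 6
ibm = [base * (1 + raise_rate) ** i for i in range(years)]
michigan = [14_000, 14_000, 13_000, 27_000, 21_000, 6_000]  # per-year stipends/grants

ibm_total = round(sum(ibm))           # 555,524
stipend_total = sum(michigan)         # 95,000
net_cost = ibm_total - stipend_total  # 460,524 -- "approaching half a million"
```

Note that the compounding is what does the damage: the forgone salary in year six alone exceeds the stipend income for all six years combined.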

What critics often fail to remember is that opportunity cost doesn’t magically vanish even if one is lucky enough to land a good tenure-track position as I did. Even with the relatively minor inflation we’ve experienced over the last decade, 1999’s $72,000 is equivalent to just over $98,000 today. Not exactly the average assistant professor salary in the humanities, is it? Indeed, the opportunity cost associated with academia effectively continues to mount indefinitely.
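That inflation adjustment can be back-solved for an implied annual rate. A rough sketch, under my own assumption that roughly thirteen years elapsed between 1999 and the writing of this post (a precise figure would require actual CPI data):

```python
# Implied average annual inflation rate behind the $72,000 -> $98,000
# comparison, assuming ~13 years elapsed since 1999 (a guess at the
# post's date, which the text does not state).
salary_1999, salary_now, years = 72_000, 98_000, 13
implied_rate = (salary_now / salary_1999) ** (1 / years) - 1
# works out to roughly 2.4% per year -- "relatively minor inflation" indeed
```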

When we tell our students that earning a PhD is essentially a very costly way to purchase a lottery ticket (albeit with less astronomical odds but also a vastly diminished payout), we’re already making a good case for what might not be there at the end of the road. But we also need to explain to them what absolutely won’t be there: the earnings forgone along the way. Syme oddly claims that Cebula is cynical to address these financial concerns, continuing:

Has anyone ever been under the illusion that working as an academic in the humanities was a quick way to wealth, homeownership, and a stable nuclear family existence?

Well no, probably not. But unless I’m mistaken, what animates Cebula’s argument isn’t regret that working as an academic in the humanities is not a quick path to these goals — which, aside from “wealth,” are hardly grandiose aspirations — it’s the understanding that working as an academic in the humanities more often than not rules out these goals entirely.

  • Holger

    Hi Sean,

    I still don’t quite get the opportunity cost argument. It basically assumes that a high-paying job such as the one you had would be the obvious — or at least the average — alternative to doing a humanities PhD. That doesn’t strike me as particularly realistic. I would presume that many of the kinds of students that are drawn to academia would be attracted to the kinds of jobs that are often less secure and less lucrative than working for IBM. Personally, before I started my PhD, I temped for two years while working in film and theatre, and that’s what I would have continued to do if I hadn’t gone back to grad school. (I was making a tiny bit more money in that “career” than as a grad student, but I also had higher living expenses.)

    But I also don’t understand why opportunity cost is an argument against going to grad school at all — which is how Cebula seems to use it. It may be something we should alert our students to and it’s another argument for being selective in choosing where to do one’s grad work. But do you regret having given up your IBM job? Would you regret it if you hadn’t got a tenure track position? Would you ever consider going back to your old career? (All of which is to say, I don’t quite get why the “cost” part is more important than the “opportunity” part.)

  • I generally fall on the “don’t go to grad school and expect to become a tenure track professor” side of things. But I’m also wary of the opportunity costs argument. It is often based on subjective experience, just as survivorship bias is. As a balance to your IBM example, I could talk about myself and my decision to leave my job to go to graduate school. I was a fully certified teacher, teaching at a private high school (this already had opportunity costs, since private high schools paid less than public schools). My graduate stipend was only a few thousand dollars less than my teaching salary. Throw in the summer courses I taught, some grants I worked on, and a few other enterprises, and I made more money in graduate school than out — at least compared to staying at that particular school.

    Another point to consider is that whatever we’re doing before graduate school (if we do go to graduate school) itself has opportunity costs. Teaching high school had opportunity costs. IBM, I’m sure, had opportunity costs. But these costs are often non-monetary. Social costs. Emotional costs. Intellectual costs. Sanity costs. These hidden costs can eventually turn into financial costs as well (years of counseling to get over the meth habit I picked up while teaching high school chemistry, for example).*

    In the end, the best argument to me for advising students against grad school remains the scarcity of tenure track jobs. There are plenty of numbers to support this fact and we need to share them with students. But more crucially, and more systemically, we need to retrain the way both we and our students think about graduate school. The end result of a Ph.D. in the humanities need not be a tenure track position. I’m thinking of Roger Whitson’s recent post on how graduate students and programs should reform themselves to address this new reality.

    * Purely hypothetical. I never required counseling.

  • Pingback: Grad school and opportunity cost | One piece at a time

  • I was going to add a few thoughts to Mark’s comment, but they spun out of control, so I’ve posted them here:

    Short version: Mark is absolutely right.

  • Gary

    Many years ago, in the 1980s, I calculated the then-current opportunity cost of my PhD in History. I reckoned it up at about $200,000.

    I would just add to this discussion that for a fair lifetime comparison, you need to consider the realistic alternative jobs for people who choose a PhD in History. It is not “manager at a Hooters” but similar careers requiring post-graduate education, like law. (I have often talked with students torn between law school and grad school. I have never talked with a student torn between grad school and managing a Hooters.) These careers have starting pay much higher than academia and lifetime earnings far above those for all but a tiny, tiny handful of academic stars.

  • Mark and I park our cars in the same garage: the scarcity of tenure-track positions ought to be the single greatest deterrent to graduate study. And I’m virtually certain that this is the kind of “professor” Larry tells his students they’re not going to become. My interest in supplementing the scarcity argument with opportunity cost is that it forces students (and us) to think a little more imaginatively about the real costs of graduate study. I didn’t walk into a high-paying job directly out of college. I temped for something like $12 an hour, which turned into a $2,000/month trial period doing more substantive technical work on contract at IBM, which turned into a $4,000/month “normal” contract job at IBM, which turned into a higher-paying direct position with IBM, which then brought further raises.

    Unless BA-holders are going into finance, they’re almost certainly going to bounce around in some low-paying jobs. Perhaps even more disheartening is that these will be low-responsibility jobs, too. Because students have a difficult time imagining how they’ll ever make the transition from temping to something more meaningful, it’s tempting to latch onto the idea of a vocational or credential-building course of study: law school, MA in education, etc. What we need to make clear is that a PhD in the humanities is not a sound path to a specific career. The opportunity cost argument is a way to demonstrate that there aren’t just risks, there are costs.

    I don’t buy the idea that my current academic colleagues would have spent those six or eight years of their 20s and 30s temping or working at Starbucks or whatever. Intelligence aside, we’re talking about people who are highly motivated and organized. If grad school in the humanities is your “only option,” you haven’t really considered your options.

    Mark, you must know that I’m sympathetic to Whitson’s argument. Thanks in large part to CHNM, my department is one of the few places where non-traditional career trajectories in public history and digital history are not second-rate consolation prizes for our graduate students, who are deeply involved in grant writing and project management. It remains to be seen how transferable or replicable this environment will be at other institutions.

    I would also like to address Holger’s point about selectivity. If we encourage students only to attend the most selective programs, then they’re more likely to get a good job. We already know this, anecdotally from looking around our departments, and of course from the excellent work of people at the AHA, etc. But who gets into elite graduate programs? As the same researchers have shown, people from elite undergraduate programs, of course. And who attends elite undergraduate programs? And now we’re back to Anthony Grafton’s updated version of the Aryan from Darien.

  • Pingback: free archaeology, part one: volunteering, training and crowdfunding | conflict antiquities

  • Pingback: free archaeology: volunteering, training and crowdfunding | (un)free archaeology

  • AZ

    I have been following the trail of posts here with keen interest; I’ve even read the fourth one by Sydney Bufkin, and as far as I know the trail ended there. (Before this post came the Cebula and Syme pieces.) OK, who the hell am I? I am considering applying to grad school. I did a major in history because even though I wanted to travel and teach English, and had signed up for an English major, the lit classes I was in were just not stimulating enough. I had always really enjoyed reading history, even heavyweight material, and was so impressed by a history class in my first semester at UofM Dearborn that I went and changed my major to history and dropped all my English classes, replacing them with whatever history classes were available. That wasn’t the only change: I was originally set to go into nursing and then anesthesia. I think I could’ve done it with a 4.0 in all the science prereqs, but I realized I wasn’t ready to study for as long as all that, and was turned off by the few years I would have to spend as a nurse… so that’s how I made the move to English and then History.

    But I really enjoyed it. So much so that now, after doing the traveling and the English teaching and landing an administrative position making over 60K with room for growth, I am/was considering going back to grad school. Here’s what works for me: I genuinely love teaching as well as research. I’ve been into that for a while, can write well, and have linguistic and historical expertise in a subfield that would be, I think, very valuable to the field. But I absolutely have to make good money. I have a wife and three kids, and when I think about what one of the professors went through at UofM, it doesn’t incite warm and fuzzy feelings. A very nice, intelligent professor specializing in East Asian history and some other areas (the only one in the former, for sure, at our campus), after spending quite some years teaching there, was finally rejected for tenure and made the move to another university. I can only imagine his applying here and there, the kids’ school arrangements, his wife’s job, etc.… and who knows how many times he had to/will be packing up and leaving? It seems totally reasonable to me to expect something similar to happen to me. He was clearly a smart, capable, and well-qualified professor; whatever reasons existed for not giving him tenure could not be ascertained by a student or anyone on the outside.

    So the bottom line is, I feel right now the way I felt when I received an email from a nursing website I subscribed to way back when I made that first switch. It was a regular digest of nursing posts, info, etc., but this one had someone giving tips on how to bathe a patient. That included tips on not using the wash rag you used on their bottom on their other body parts, and so on. Something happened to me immediately after reading that, and it ended with me scrapping the whole plan. A CRNA makes great money, has amazing job flexibility, and I love science and the challenge of anesthesia and all the expertise required. But at least there things were clear-cut — if you get the grades, the certifications, and work experience in a Class A Emergency unit, preferably Cardiac, you will get accepted to a Master’s in Anesthesia in the end. And then you’re made. So, to stretch the metaphor thin, with history I won’t just be wiping some asses, but doing it indefinitely as well… f^&%!!!!

    I’m not an opportunist. I just am good at a lot of things and have even more interests. I feel I could genuinely find two totally different and seemingly opposite careers rewarding as those I mentioned. But it seems as if I should abandon the second career track just as I did the first. Six years, then to pick up and move who knows where how many times before settling into an academic career somewhere, very possibly as an adjunct professor for life… it’s not that I’m not up to the challenge. The biggest thing that stands out to me here is that tenure has nothing to do with any particular achievements, and I’m not exactly a centerfield type of person, politically and intellectually speaking, so I wouldn’t be surprised at getting wary receptions.

    Anyway if anyone has some comments on this massive comment I would appreciate your thoughts. This is a big decision and I can’t afford to rush it. Maybe I’ll still at least take the GRE and apply to the most dream-come-true universities… but even imagining getting a fellowship at Yale just doesn’t seem that attractive anymore to be honest.

  • Pingback: The Arbitrariness of Academic Success | Mauled by Academia