Time Shifting and Historical Research


About ten thousand years ago, we were introduced to the phrase “time shifting” by a decade-long lawsuit over the right to use VCRs to tape TV shows for later viewing. Today’s DVR has of course made this process far easier and probably more widespread, but the idea remains the same: rather than watch something right now, with no snack breaks, we instead put it off until some later time. Other than the occasional self-serving gripe about having “a lot of TiVo to catch up on,” time shifting is a settled and dead issue, a non-story. Or it would be, if it were not for the troubling case of historical research.

In a recent post I fretted about how shifting research practices might affect the significance and allure of historical fields. Here I want to examine those shifting practices in a bit more detail. The benefits of compressing a research agenda or of greatly expanding the amount of material that can be gathered (or both) have encouraged a wholesale transformation in the way that researchers now use archives. The point of visiting the archives hasn’t changed — people still go there to gather evidence — but before the widespread use of digital photography the collection of evidence was limited by what could be read, and then summarized in notes or transcribed. All of this activity necessarily had to occur on-site, during the limited hours and days of operation, further constrained of course by strikes, holidays, and hangovers.

With digital photography, a far greater number of documents can now be processed in a much shorter period. This isn’t really news to anyone who has visited an archive in the last five years. And here, Robert Darnton’s recent defense of the analog rings especially hollow.1 In dispelling “5 Myths About the ‘Information Age’”, Darnton claims,

“All information is now available online.” The absurdity of this claim is obvious to anyone who has ever done research in archives. Only a tiny fraction of archival material has ever been read, much less digitized.

This is certainly an accurate statement, so long as we only look backward. What Darnton is ignoring, of course, is that essentially all archival material consulted today is being digitized, whether in the form of transcription or photography. What’s missing is the ability to access and mine these innumerable rich individual silos of data. Zotero is one step toward realizing this vast meta-archive, but however outrageously ambitious such a project might seem, it is trivial when compared to the massive amount of labor that has already been deployed to digitize at the individual, cottage-industry level.

What’s especially interesting, I think, is how this new practice might qualitatively affect research. In particular, I wonder how the creation and population of individual research queues, time-shifted for later consultation, will influence how scholars approach the gathering and analysis of evidence. Take, for example, the remarkable transformation in the area of pre-1923 printed materials. Whenever I encounter any reference to any printed source, the first thing that I now do is to consult Google Books or Gallica to see whether there is a digitized edition available. If I find a digital version — and most times I do — I add a copy to my Zotero library to be read later (and if it’s small format, octavo or duodecimo, usually on my Kindle).
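(For the curious, here is a rough sketch of what the lookup half of that habit could look like as a script. It leans on the public Google Books volumes API; the function name, the example title, and the decision to stop short of the Zotero and Kindle steps are purely illustrative, not a description of my actual setup.)

```python
# A minimal, illustrative sketch: given a title, ask Google Books whether a
# freely viewable (i.e., fully digitized, public-domain) edition exists.
# The endpoint and the filter=full-view parameter are part of the public
# Google Books API; everything else here is assumption for the example.
import requests

def find_digitized_editions(title, max_results=5):
    """Return full-view matches for a title from the Google Books volumes API."""
    resp = requests.get(
        "https://www.googleapis.com/books/v1/volumes",
        params={
            "q": f'intitle:"{title}"',
            "filter": "full-view",   # only editions readable in full
            "maxResults": max_results,
        },
        timeout=30,
    )
    resp.raise_for_status()
    hits = []
    for item in resp.json().get("items", []):
        info = item.get("volumeInfo", {})
        hits.append({
            "title": info.get("title"),
            "published": info.get("publishedDate"),
            "link": info.get("infoLink"),
        })
    return hits

if __name__ == "__main__":
    # Hypothetical example query; any pre-1923 title would do.
    for edition in find_digitized_editions("Encyclopédie méthodique"):
        print(edition["published"], edition["title"], edition["link"])
```

From there, a found edition could be filed into a Zotero library by hand or through its API and queued for later reading; that half of the workflow is left out of the sketch.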

This workflow has dramatically reduced the amount of time that I spend on-site in research libraries like the Bibliothèque nationale de France, which is increasingly becoming just another nice quiet place to do work, a purpose much like that served by my neighborhood branch library when I was in grade school, only with RFID cards and a smoking lounge. But it has also hugely increased the amount of time that I now spend reading and “doing research” at home, at night, and on weekends. Moreover, it’s incredibly easy to amass a massive queue of digitized documents and feel like one has “performed research,” even though a good percentage (most?) of those materials might prove useless. So in a sense, we’re not just time shifting an amount of research equivalent to, say, 1998 levels; we’re simultaneously escalating the evidentiary basis for any research project.

Mike O’Malley and I have written about the changing landscape of historical research in the face of abundant evidence.2 We agree that finding, as part of the research process, will inevitably decline as a valued skill as associated costs continue to fall. In contrast, synthesis and contextualization, always valuable, will become even more important differentiating qualities. Yet I wonder whether time shifting, and the risks it necessarily introduces, won’t so overburden researchers that they fail to advance to the stage of the research cycle where they can begin to perform meaningful analysis. How is time shifting affecting your research?

  1. Robert Darnton, “5 Myths About the ‘Information Age’,” The Chronicle of Higher Education, April 17, 2011, sec. The Chronicle Review, http://chronicle.com/article/5-Myths-About-the-Information/127105/. 

  2. Michael O’Malley, “Evidence and Scarcity,” The Aporetic, October 2, 2010, http://theaporetic.com/?p=176 and Sean Takats, “Evidence and Abundance,” The Quintessence of Ham, October 18, 2010, http://quintessenceofham.org/2010/10/18/evidence-and-abundance/. 
