The Frailty of Historical Truth: Learning Why Historians Inevitably Err
David Lowenthal, March 2013
"How tiresome are the endless anecdotes about [William Best] Hesseltine, his seminar, and his students," wrote Wisconsin editor Paul Hass.1 Yet the unsung historiographical lessons that seminar imparted to me and others richly merit recording. By scanning our mentors' publications in skeptical depth, students learned that hidden bias always skews evidence, that secondary sources are ipso facto unreliable, and that myriad minor errors betoken major sins. Still more, they learned that even paragons do not have enough time, patience, or probity to prevent all such lapses and avoid their egregious epistemic consequences. Historians ever stumble on feet of clay.
At the University of Wisconsin from 1932 until his death in 1963, Hesseltine was a renowned chronicler of the Civil War and its aftermath, whose "commandments" on historical writing are still often cited. His anathemas forbade the passive voice, the present tense, designating persons by their last names only, and quoting from secondary sources. He inveighed against the rising tide of pompous impedimenta: "do not discuss thy methodology"; "write about thy subject and not about the documents concerning thy subject"; "fight all thy battles in the footnotes." And––pertinent in today's Wiedergutmachung spate of apology, tempting historians to turn moralist—"thou shalt not pass judgments on mankind in general nor … pardon anyone for anything."2 A retrospective celebrant prized Hesseltine's "strange blend of pacifism, anarchism, Menckenism, Calvinism, and sheer naked perversity."3
Skepticism was the cardinal rule in Hesseltine's graduate methods seminar. One learned to "doubt every document and assume that every witness was a damned liar," recalled Richard Current. In 1950, seminars began with a "firm injunction to distrust each witness; to view every document as designed to deceive." Just as barrelmakers learned their skill, Hesseltine's students would master the historian's craft. Adversarial brutality was routine. The "best seminar was like a dog-fight," wrote Current. To Frank Byrne the atmosphere, in which Hesseltine "sacrificed each student in turn for the education of the other neophytes, [was] reminiscent of the Roman arena."4
But Hesseltine's colleagues—Howard Beale, Merle Curti, Chester Easum, Merrill Jensen, Paul Knaplund, Vernon Carstensen—suffered more than his students. They themselves were the guinea pigs whose publications evinced lamentable error. Our first assignment was to vet an essay in a major journal by a tenured member of staff for simple accuracy––every quotation, name, place, title, date, publisher in text and notes, and pagination. Each secondary source was checked for conformity with the original. Finally, we tabulated mistakes, compiling rates of errancy. We were astounded––and in scrutinizing trusted mentors, appalled––to find error rates seldom below 50 percent, and often as high as 80 percent.
Our next task was to determine how and why the faults had arisen. In many cases, reliance on a secondary source meant repeating an original omission or error. But the majority of defects seemed due to sheer carelessness. Slipshod authors had erroneously or incompletely transcribed archival or library notes, and––this proved crucial––had not taken the precaution, prior to submission or proofreading, of rechecking sources. We were shaken and aghast by our elders' and betters' manifest shortcomings. But they gave us cautionary reminders. We realized that transcription error was not the exception but the rule, that no scholar, however painstaking, was immune to such lapses, and that cutting corners by relying on secondary sources compounded misjudgments and foreclosed fruitful insights.
The most consequential lesson went to the heart of historical integrity. Our next assignment, demanding a month of research and analysis, was to supply a reasoned judgment about whether (and if so, why) these errors really mattered. However regrettably numerous, most mistakes were, after all, small details––a wrong page, a misspelled name, a misdated text––minor points that did not materially affect the author's conclusions or vex most readers. They were easily rectified with little harm done. Given the constraints on scholars' creative time and energy, was it therefore not best to forgive these lapses? Cursory apology by the busy perpetrator might suffice, along the lines of art historian Ernst Gombrich's response, when caught out in a trivial transgression, "mea culpa, mea minima culpa."5
Alas, this proved a specious conclusion. As we delved further into the essays, their peccadilloes served, like coal miners' canaries, to alert us to fundamental faults. Failure to consult an original source left the author at the mercy of an intermediary's imperfect or biased reading. To avoid being unconsciously influenced by the second-hand user's slant, it was essential to see the full original. Moreover, original sources often revealed pertinent data unseen by and unknown to the lazy secondary borrower.
Even more detrimental were failures to recheck sources before publication. For the resultant errors went far beyond simple transcription mistakes. Reviewing their sources showed that our authors often misconstrued their meaning or import. They had adopted material supporting their own conclusions, ignoring or slighting contrary evidence and alternative viewpoints in the same source, often in the same sentence. Only by rereading sources before going to press would our authors have avoided the selectivity trap. In writing and rewriting, we are ever prone to single out and pervert evidence for the sake of coherence, consistency, and credibility.
Absent a final close reading of source materials, such deformations, we concluded, were inescapable. Every historian makes things up while writing—selecting, omitting, and reshaping data to make an argument clear, a point vivid, a conclusion indubitable. We had been schooled to abhor deliberate bias, knowing nonetheless that objectivity was at best a noble dream. But we had not realized quite how far unconscious bias suffused the process of gathering and using sources, let alone how important it was––and how much work it took––to minimize that bias.
These findings affected us in three ways. First, they warned us to view with extra caution the veracity and conclusions of historians given to manifold, even if seemingly minor, carelessness. Second, they were invaluable reminders for our own doctoral research, time-consuming and costly as adhering to them proved to be. It took me an additional week in the National Archives, not only to check my initial transcripts and synopses of my biographee George Perkins Marsh's 1,500-odd diplomatic despatches from the Ottoman Empire and Italy, but also to reread those despatches in their entirety, so as to gauge what my penultimate thesis draft had omitted, scanted, or misinterpreted.6
The third lesson was the most sobering. However much we took these cautionary principles to heart, however ardently we vowed to adhere scrupulously to their tenets, we came to realize that we could never unfailingly do so. Indeed, our lapses, like our mentors', were bound to become more numerous the busier our careers. How many historians take the time, even given the resources, to recheck every source before publication? Who faithfully ferrets out every original source from its secondary citation, especially when the sought-for "original" turns out to be another dubious secondary? The scholarly task would never end. So we knowingly fall inexcusably short.
This mortifying knowledge should fortify humility, much extolled by scripturally minded historians who bid us wash our disciplinary disciples' feet of clay. "All history should be a lesson in humility to us historians," declared Charles McIlwain in his 1936 AHA presidential address. "What we all most need is a … sense of humility," echoed Allan Nevins 23 years later, "because however hard we search for Truth we shall not quite find it." And yet, "how could they be expected to practice humility," queried Theodore Hamerow, "amid the deference" widely accorded historians at the mid-century? "The temptation to play the seer was simply too great."7
Deferred-to seers no more, historians have lost public credibility.8 It is salutary to be reminded that we are perforce fallible not only epistemically but also personally, subjugated not only to our slippery subject matter but to our slippery selves. To the genre's own insuperable limitations––data that are always selective and never complete; the unbridgeable gulf between actual pasts and any accounts of them; bias stemming from temporal distance, from hindsight, and from narrative needs––we must add, and keep in mind, human frailty. Hence we rightly accede to perpetual revision of our work. Continual correction is mandatory not only because new data keep coming to light, new insights keep arising, and the passage of time outdates earlier judgments, but also because we recognize that we never wholly live up to the demanding tenets of our trade.
We ought not be chagrined, therefore, that some successor is apt to disclose our unwitting mistakes and lay bare their sorry historical consequences. We are duty-bound only to minimize such lapses so far as our brief years render reasonably possible. And to impart to our own students the humbling lessons bequeathed to us of the frailty of historical truth.
David Lowenthal is emeritus professor and honorary research fellow at University College London. Among his books are West Indian Societies, Geographies of the Mind, The Past Is a Foreign Country, The Heritage Crusade and the Spoils of History, and George Perkins Marsh, Prophet of Conservation. The Past Is a Foreign Country—Revisited is now in press.
2. Horace S. Merrill, "Basic Rules for Writing History," Maryland Historian 1 (1980): 10; Anthony Gene Carey, Politics, Slavery, and the Union in Antebellum Georgia (Athens: Univ. of Georgia Press, 1997), xxi; Ralph E. Luker, "Wherein I Descend from Mt. Memphis with the Ten Commandments," History News Network, November 7, 2004.