Why we need history #1: “Erasers are ‘an instrument of the Devil’…”
When I was at Kalamazoo, several people groused at me for not blogging more. This morning, as I procrastinate over other things, it occurred to me that many of my comments on the book of face are probably worthy of blog posts. This post is meant to be the beginning of a series of posts on how better historical literacy might improve work in many fields. And daily life, for that matter.
The first thing I heard on the radio this morning was a BBC headline referring to this piece in The Telegraph. Guy Claxton, a cognitive scientist currently a visiting professor at King’s College (unclear whether it’s London or Cambridge), claims that erasers (when did UK English abandon “rubbers”?) “encourage children to feel ashamed about mistakes” and should therefore be banned from classrooms. Claxton sees the erasure of mistakes as part of a culture of shame that causes dishonesty and an inability (or refusal) to take responsibility for one’s own errors. He contrasts this culture with a more desirable one in which students learn from their mistakes and constantly attempt to improve. Plausible on the surface, perhaps, but a little historical knowledge and training show the problems with Claxton’s argument.
One of the aspects of historical training that I and many others find most valuable is the development of what some call historical imagination. By this I don’t mean the blithe imaginings of people who wish to live in a past reframed by their own values. Instead, I mean the kind of imagination that requires accepting that people of the past didn’t necessarily think like we do: they had different worldviews, different value systems, different concerns. The artifacts and documents that they produced make that pretty clear. While it is true that historical interpretations of these objects change, and that sometimes different interpretations remain open to debate, the evidence for cultural difference is unmistakable. That is especially true when dealing with pre-Modern and non-Western history.
It’s one thing for those of us in the modern West to try to imagine the lives of people who lived a hundred years ago within our own cultures, but another, very different, thing to get our heads around why people did the things they did five hundred years ago. I have friends and colleagues who knew people who served in the First World War. I have pictures of my father’s great-aunt, who served in the Signal Corps. In other words, we are still connected to what most people think of as a distant past. We know that our parents and grandparents grew up in a different time, so we extrapolate backwards for their ancestors. The problem here is often that, because our parents and grandparents also died in our time, with all mod cons, we forget that most of their lives might not have been that way. For example, I have a mobile phone that probably has more computing power than all of the giant computer banks involved in the first moon landing. My using the same technology as my students — not to mention that I am often much more knowledgeable about it than they are — places me in the ‘now’. They find it hard to believe that I would have been a school contemporary of Jackie in That ’70s Show, or that when The Brady Bunch originally aired, the kids were the same age as my friends, their older siblings, and me. Nevertheless, the connection makes historical imagination relatively easy. It’s like trying to imagine the life of someone raised in a different part of the country. By contrast, trying to figure out pre-Modern and/or non-Western history requires us to learn multiple languages and often draw on scholarship from other disciplines.
Learning and using different languages forces us to think differently. I’m not arguing for linguistic determinism, only linguistic relativity. In other words, learning to express one’s thoughts (or translate another’s words) idiomatically requires us to consider differences in concepts and how they are expressed. Trying to translate literally seldom works well, while idiomatic translation requires imagination and understanding of cultural analogues. The need for such analogues is also one of the things that prompt us to look to other disciplines when our sources indicate rituals or behaviors that are not explained within the sources themselves. Although (or perhaps because?) the work of literary theorists is often one of the first things people associate with “interdisciplinary studies”, I think we sometimes forget that the social sciences often supply us with the best analogues. There are lots of reasons that those analogues don’t end up working for us, but the process of finding and even rejecting the analogues requires us to approach our subject with a different mind-set. I see this as something that contributes to a better historical imagination. It also bleeds over into what I consider the most important effect of good historical training: cultural empathy and imagination. The training that helps the best historians recognize and set aside their own ethnocentric/presentist assumptions to get their heads round an alien past is equally useful for understanding cultures that are part of our present, yet seem equally alien to us.
Claxton’s assertions about the eraser (at least as presented in the media) reflect such presentist and ethnocentric assumptions, exacerbated by apparent ignorance of historical fact and context for the practice of erasure. Perhaps there is some a priori evidence for a culture of shame and the subsequent dishonesty of trying to hide mistakes that has been omitted from the original piece. I hope so. Otherwise, there are some very simple historical reasons I find his assertions as reported problematic, and why we need history.
Imagine living in a world where writing materials are much harder to come by than they are for most of us living in the industrialized world today. That could be in poorer parts of the world now, but let’s just assume a much more distant past. People wrote on clay tablets, on papyrus, on paper, on parchment or vellum, on wood, on stone, and on cloth to record things worth recording. We don’t see a lot of mistakes on such recordings. We also don’t always see evidence of erasures. Why? There are many reasons, but the ones that come to mind first are purpose and process. Something important enough to record is something worth getting right. Getting something right might require multiple drafts — something very much like the process of constant improvement Claxton wants. Wax tablets like the one in the picture here originated in Greece and were used by the Romans and by residents of their (ex-)Empire throughout the Middle Ages. They were used in lessons, for drafts of work to be later recorded more permanently, and for things that needed to be written but not necessarily kept. Sources tell us of Japanese and Chinese scholars who practiced their calligraphy with water, sometimes on stone or pottery, because they could not afford to practice on paper. Even in the modern period, children and their teachers used slates that could be erased, not to hide the incorrect answers, but because they allowed materials to be re-used. Today, I use a whiteboard for the same purpose.
When I was at school and at university (until the last year, when my university installed two computer labs, one with Apple machines and one with DOS-based PCs, both of which used 5 ¼” floppy disks), we were required to turn in typed versions of our papers. When I did a course in the UK, my papers had to be handwritten. The common requirement was that they be proofread and free from errors, something that had not changed in decades, if not since the first time that university students were obliged to turn in written work to be marked! This meant that I, like my predecessors, drafted everything longhand, made many corrections on the original and subsequent drafts, and then either copied them out in my best writing, or typed them up very carefully. A mistake meant re-doing the entire page — at least until the invention of correction tape. Imagine how much harder it was for the medieval copyist, carefully copying out his or her text on parchment: of course the parchment could be treated and scraped to erase an occasional error, but the point was to produce something without them. The book or document wasn’t meant to show process, because it was a product, a final and definitive copy, often commissioned and paid for in the same way people paid for the works of many other artisans. Erasure isn’t meant to hide the error — it’s merely a means to correct the error. The shame lay not in the error itself, but in allowing the error to remain. The expectation that the final product be without error actually drives a process of revision and correction. The eraser makes it possible to go back and make the corrections. Whence, then, the shame and dishonesty?
It doesn’t come from the eraser. Perhaps he should look to the very recent past and the rhetoric of accountability and league tables in education, and to the general malaise of comparing “results” that are almost always measured against some arbitrary, often monetized, standard. Isn’t there some sort of cognitive dissonance in expecting people to care about process when they are measured only by results in a game where the stakes are incredibly high? But perhaps the shame is not in the error itself? Claxton suggests that it is also connected to a belief that people should get things right the first time. What he doesn’t address is where such unrealistic expectations come from, and how they might be connected to the same sorts of measures of success. For example, at present a company’s success is generally measured by sales, and by how much the executives and stockholders earn. Was that true fifty years ago? A hundred? Is it true in every country? Factoring in things like dependence on public subsidies (whether outright or indirect), savings gained by paying below a living wage, and re-investment in better technology and facilities (which also may have an impact on public expense, e.g., when refusing such improvements has a negative impact on the health of the local community) shows a much closer relationship between success and effort. Similarly, looking at how much time and effort the most successful (and least successful) students put into their work — and for some, how much extra money their parents may have spent on private lessons, private transportation, tutors, etc. — and what other demands there are on their time, might help to re-set people’s perceptions of normality. It’s far harder to feel shame for not meeting the low bar when you realize that the low bar is exceptionally high.
There are doubtless many other factors I haven’t mentioned, and it’s likely that the ones I have mentioned might be cause for disagreement. My intention here is merely to show the sorts of issues and questions that anyone with a sound background in historical thinking might raise. Distrusting the deceptively simple demonization of the eraser: it’s one of the reasons we need history.