
What *are* sabbaticals for, anyway?

6 April, 2022

What is a sabbatical for? It depends on where one is, I think. Some places give pre-tenure sabbaticals, some give semi-regular sabbaticals, some make them competitive, and some give nothing at all. Some declare that sabbaticals are the same as research leave — usually the places that don’t give research leave, or give it grudgingly as leave without pay, telling you that you should be grateful that your academic institutional employer is continuing to pay your benefits while they get nothing from it — although they are happy to tout that Fulbright/NEH/whatever in recruiting materials to impress the parents. A few say that a sabbatical should be a time of rest and regeneration, a time for faculty to re-connect, to take the time to think, and maybe even to do something unrelated to one’s teaching and scholarship. Some of those even mean it, and then, unthinkingly, undermine it. I am on sabbatical. It’s for a year. Yup. Took a pay cut to half my salary because dammit, I am tired. Before you say anything, yes, I know I am lucky. I know lots of academics who aren’t in this position, and might never be. It’s not fair. But that’s not this post. Actually, it’s part of the post, or at least in some ways related. I’ll get to that.

In the meantime, what am I doing with my sabbatical? My sabbatical that got postponed because of a pandemic, which also shattered the financial plan for funding the sabbatical, is definitely not the one I applied for. I had research plans, carefully defined. I had regenerative plans, less well-defined, but ones that included seeing family I’d not seen in over a decade (and in the case of one parent, closer to two). I had a plan for a public history project, a very specific one, that would continue something I had started a couple of years ago. I am … kinda doing those things? But what I am doing is way cooler.

I am currently sitting in my shared office at a German university, being a Fellow in an amazing research group. I meant to start blogging about this a month ago, when I got here, but it has all been so much! First, I am being paid to read, and research, and think, and talk to colleagues. How awesome is that? And omg, there is so much to catch up on. I got here a month ago, and it has been difficult to get on a roll, mostly because getting out of COVID teaching survival mode has taken longer than I expected. I knew I needed an external kick, so I volunteered to speak at our weekly colloquium. I mean, I would need to at some point, right? Then I panicked. Then I remembered that my goal was to introduce my research specifically so I could find out how I fit in, and to get feedback. Not a conference paper. Just scary smart people.

It did not go without a hitch. As I was pulling the talk together, somewhat closer to the wire than planned, I managed to kill the external drive holding pretty much all of my recent backups. I’ve been trying to make sense of the horrendous mess that results when you don’t file things correctly, and when you transfer info from an old computer to a new one and Apple creates a new archive every time, so there are multiple versions of the same files all over the place — because you opened and worked on one, or just opened one to see what it was, and you did that several times, so you don’t know which is the actual current version… Anyway, I’d pulled everything from the cloud to the backup drive so that I could weed out extras, set up a new Dropbox, and have room to run Time Machine on the other external drive, which was way too full of backups with too many files.

This was not the sort of excitement that I was hoping for.

The immediate issue became not “do I have time to run this thing through a translator and then tweak it and make sure that the German works?”, but rather, “Oh shit — do I have any backups of the images I needed for slides?” Reader, I had one. I am sure that there are backups somewhere. Just not accessible when I needed them. Also, I had had the good sense to put my database, which I have been using since I built version 1 for my PhD thesis, onto my computer before I started the cleanup. Go me! But I digress.

I pulled something together. It was not edited as well as I’d have liked, because I’d spent my 90 minutes of last-run-through time trying to get the external drive to work. Some of you have watched me fly by the seat of my pants before, so you might not be surprised. I’m happy to say that this was not the case yesterday. I may have had a draft that had to be re-patched a bit, but I knew what was in it. I knew where I had meant to make cuts, and made them on the fly, albeit a bit awkwardly. My colleagues learned some things, and asked good questions, hard questions, kind questions. I tried to give them good answers, and I think some were. And some were me thinking out loud and rambling till I ended up with “those are things I think might help answer your question, but honestly, I’m not sure.”

At the end, I had my usual panic over whether I had embarrassed myself, as you do. My colleagues, especially my friends amongst them, assured me that it was fine, and I had not.

I woke up this morning and felt more mentally alive and connected to my field than I have in years. Years.

It’s a good feeling.

Where is this going? I learned a thing!

It’s been a minute

6 April, 2022

So, according to my last post, I was going to do a series on important things. I still have lots to say, but honestly, I like my mental health to be on the healthier side. It’s still time to re-start blogging, though — what else are sabbaticals for?

The State of Medieval Studies — a series

13 October, 2019

This is the first in a series of blog posts over two years in the making. I am hoping to address many of the issues currently being discussed, however ineffectively, amongst scholars of medieval studies. I’ve been mulling over many of these thoughts since the 2017 Leeds IMC, and I still have trouble articulating them the way I would like. That said, some of the things I want to address in this series include:

  • the responsibility of academics, especially academic medievalists, to engage in public political activism
  • the complexities of the term “Medieval Studies”
  • the role of regional hegemony and privilege in framing conversations about race and gender within the field writ large
  • the conflation of anti-racist goals aimed at making the human face of the field less white with the ideas of decolonization and globalization of medieval studies
  • the ways that social media, especially Twitter and Fb, both democratize and create an illusion of democratization, in ways that are potentially bad for ECRs and senior academics alike
  • being a medievalist in the age of Trump
  • the rhetoric of social justice and its inadequacies when it comes to dealing with individuals within a relatively small community
  • the distraction of loud voices, and how they become the issue, rather than the issue being the issue

And probably lots of other things.

I don’t like to erase or edit comments, unless they represent actual threats or are personally abusive. I won’t be moderating ahead of time, because I don’t know how much time and energy I will have for this. If I do remove a comment, I will replace it with something that explains why, providing I can figure that out. I would like this to be a place for dialog, not just the same shouting seen elsewhere. I don’t doubt that, if anybody is paying attention, there will be nasty back-channeled conversations, circulation of screenshots, and the same sort of generally repugnant and hypocritical behavior that has characterized every discussion on the topic since 2017, if not since ill-advised callouts of some of the field’s more embarrassing right-wing martyrs began sometime before that. That’s been the norm for a couple of years now, and is happening even as I type. Everybody has receipts, and no one, were their “private” social media conversations made public, should be throwing more than nerf balls from their glass houses. That said, we need to talk.


Dear Dr. Brown

20 September, 2017

Just stop.

Stop punching down.

Stop your incessant attention-seeking.

Stop dog-whistling to your alt-right fan club.

Stop using racist and anti-Semitic references that you think are clever. They aren’t. 

Stop posting screenshots from private Facebook pages, even if people have chosen to break faith with the individual or community owners of those pages.

Stop bringing my profession into disrepute. 

Stop twisting and misrepresenting other people’s words.

Stop. Just. Stop.

Seriously, this summer has been hard enough without having to put up with your shit. It’s not about you, or it wasn’t until you decided to hijack a conversation and make it about you. That post of Dorothy Kim’s at In The Middle? It was about Charlottesville, and Leeds, and any number of other things that aren’t you. And even, even if it were about you, what you are doing is just wrong. It’s wrong to go after a junior colleague, especially one without tenure. Period.

Beyond that, I just don’t get how you can consistently refer to your conversion to Catholicism, and yet behave in ways that are about as far from Christ-like as possible — or at least not in any way I learnt in catechism. 

So please… just stop.

Another Damned Medievalist

Professor of History

Carolingianist, diplomaticist, translator, gender historian, fan of Fulda, member of SMFS and the Medieval Academy.

Medievalists should care about this

19 July, 2017

I take my title here from this tweet by Joshua R Eyler, posted earlier this morning. The tweet is part of a longer conversation that has been going on since this year’s International Medieval Congress at Leeds, UK. This morning, @Punctum_Books posted a thread on Twitter that encouraged meaningful conversation, and also noted that many voices were missing, including those of people involved in organizing Leeds, and other medievalists normally on Twitter. Part of the post calls for more “good, intellectual histories of the discipline.” In partial response, I posted my own thread of comments, which has so far drawn a couple of thoughtful comments, and, ultimately, Josh’s comment. That comment raises a question for me, and not just because I have a really hard time following Twitter threads, especially when many people are being @-ed, and lots of people begin their comments with, “You are/You said/Are you…” Which “you”? But back to my question.

Among all the points raised in the post-Leeds Twitter storms, which is the “this” that “medievalists” should care about? Some? All?

I take it to mean something along the lines of, “There are real problems in Medieval Studies. The scholars are overwhelmingly of solely European descent (in other words, “white”, but that’s something I will unpack later, in this post or a subsequent one — there will be a lot of unpacking), cis-het, and at least outwardly able (again, this needs unpacking, and the hierarchy of disabilities, especially including invisible, non-neurotypical disabilities, is its own post(s)). At the same time, the scholarship is predominantly Eurocentric, and most of the scholarly traditions have erased and/or ignored the existence of non-white people in the sources, reinforcing an image of a white Middle Ages. Moreover, the effects of that image, and the scholarly traditions that created it, reach far beyond scholarship: they have been perpetuated in popular and political culture in ways that have allowed them to be actively coopted by racist, ultra-nationalist Alt-Right groups, and passively incorporated into the structural racism that exists in the Modern West*.”

If that’s what “this” is, then it’s a no-brainer, in my opinion. The most important parts of that statement are simple to demonstrate, because the evidence is right in front of us. It’s a statement that focuses on the structure of the field: whatever sort of medieval stuff we do, the statement applies. More importantly, I think it is recognized far more widely than the post-Leeds conversation on Twitter would suggest. But, as @Punctum_Books said, many voices are not engaging on Twitter, and as someone else pointed out in an earlier series of tweets, those conversations are also not happening in other venues where they might be seen. Nevertheless, they happen, and in some cases, it’s probably a good thing that they aren’t happening on Twitter, because sometimes they include explanations of how racism is so structurally embedded that it’s not just a matter of, “well, if a POC wants to be a medievalist, then they should study it — there’s nothing to stop them.” Leeds is an international conference. If a person is from a country that has very little ethnic diversity in the first place, they are less likely to notice a lack of diversity amongst their colleagues. That’s not a defense — I’m just saying that we need to recognize that, even for such an important issue, we aren’t all starting at the same point.

We are also not starting from the same place. On Twitter, I sketched a differentiation between “Medieval Studies” and “medieval studies.” I admit that it has problems, but until I come up with something better (or someone else suggests it), that’s what I am going to use. As I see it, based on my own experiences as well as conversations with a wide range of medievalists from many countries, one of the biggest barriers to working effectively on the “this” that we should care about is that we don’t all mean the same thing by “Medieval Studies/medieval studies.” For example, there are people who study medieval history, medieval art history, etc., who borrow from and whose work is informed by work in other disciplines, but they think of themselves primarily as specialists in medieval discipline X. And then, for example, historians might divide themselves into Carolingian, Anglo-Norman, Early, Late, whatever — and still see themselves as primarily medieval historians. For them, “medieval studies” is a catch-all term that is used for anybody who does something medieval: it’s a way of grouping very unlike things that have a couple of basic things in common — mostly similar geography and timelines. So “medieval studies” is not a field — it’s a nod to the fact that medievalists do different things, but are still medievalists. One of those different things is “Medieval Studies.” Another is “Medievalism.”

In this sense, “Medieval Studies” is something that you can get a degree in. There is an implication that it is interdisciplinary, although in my experience, that varies widely, depending on how one defines the disciplines involved. Also in my experience, and I gather from historian and archaeologist colleagues that I am by no means alone, “Medieval Studies” people at the Ph.D. level are most often rooted in literature training. They also seem more often to be from the USA and other Anglophone countries, rather than from Continental Europe. One of the crucial differences here is that, for the people who lean towards “medieval studies”, it isn’t necessary to think about what we mean by “medieval” or “the Middle Ages”: they are implicitly European.

<cue hair-pulling and screams of, “BUT THAT’S THE PROBLEM!!!”>

I get that. But bear with me, because communication doesn’t happen without trying to see other viewpoints, and communication is not happening. For the “medieval studies”, “I am a medievalist” scholar, the definition is akin to “Americanist”, “Early Modernist,” etc. [And yes, those terms are also extremely problematic, but that’s also another bunch of posts.] Hell, for many medievalists, the term also conveys things like, “no, actually, I had to learn all sorts of languages and specialized skills to do my work, and it’s not especially esoteric — tell me again how you specialize in a period of less than a hundred years in just one country?” I understand this mind-set. I especially understand it in terms of how many academic departments have historically seen and judged the work of scholars who choose to go beyond the scope of their set academic duties by, for example, discussing the use and abuse of the Middle Ages in film or contemporary fiction, or by engaging in “political” speech. That seems to be changing, but those changes are not happening at the same pace everywhere, or in every discipline where we might find medievalists. Readings that might be essential for “Medieval Studies” (and especially for people working in Medievalism) cannot be assumed to be basic readings for medievalists, because not everybody who is a medievalist considers their field to be “Medieval Studies.” Assuming so pits colleagues against each other in a way that can only be read as “why is this person in another field, who doesn’t have expertise in what I have been studying for decades, telling me that I have been doing it wrong for decades?” This is especially true, I think, for those who have been challenging misappropriations and misrepresentations of the Middle Ages for years. It is hard to do so without running headlong into a big wall of 18th and 19th Century invention that is not borne out by recent scholarship.

Even the most hidebound traditionalist will have to deal with a lot of cognitive dissonance to ignore that. If we are good at our jobs, and I think most of us are, then maybe holes in our reading lists (which might not actually be “our” reading lists — yet!), or discussing the same concepts in ways that aren’t immediately clear because they don’t reflect the language of our colleagues’ (perhaps “Medieval Studies”?) specialties, are not the best measure of whether a colleague is engaged in the “this” that medievalists should care about.


The insularity of medieval studies: literally and figuratively oceans apart

10 July, 2017

(Note: this is in part a response to recent conversations on Twitter and elsewhere concerning the Leeds IMC in general, and session 1414 in particular. I said on Twitter that I would say more elsewhere, because I find Twitter to be a difficult format. It is not so much the character limit as it is the immediacy, the ways people begin to reply and argue before reading a thread all the way through, and the fact that it can be hard to follow an entire Twitter conversation and respond to the right tweets effectively. The immediacy, especially if one has notifications set in a particular way, can be stressful for people who need time to process thoughts (perhaps even more so for those of us who are not neuro-typical) before responding. I am trying to avoid engaging in any specific refutations or repetitions of arguments or comments made elsewhere, primarily because the conversation thus far hasn’t been entirely productive, but also because I am trying to avoid the breakdown into a “good guys / bad guys, who’s silencing whom” conversation that has already begun on some fronts. This has become more difficult since I started writing this, so who knows if I will be successful?)

I. Ethnicity, Nationality, and the History/Historians of the Early Middle Ages as I have encountered them over the past three decades

When I was writing the prospectus for my Ph.D. thesis, one of the things that struck me was how alien much of the scholarship was: my education as a medievalist had been pretty normal for someone from the US, focussed mainly on the early and high Middle Ages west of the Rhein, and this was different. The periodization was different; there was no discussion of feudalism, of the central or late Middle Ages, nor of any number of other things I thought were pretty much settled. Granted, there were some things that were familiar, too — the constant references to a German tradition and ethnicity set in opposition to the Romans, for example. But I was used to seeing the Germans as barbarians to a Roman norm, and this was … new. Yes, we’d discussed barbarian law codes — German law codes, the codes of the Lombards, the Franks, the Burgundians — and I’d been working with the MGH for years, but I hadn’t really thought of its connection to a core Germanness. I mean, yes, I knew the Franks were one of the peoples normally called Germanic, or barbarian, but that meant that they were neither French nor German, in a modern sense. They were German in the sense of Tacitus, sort of, which is to say that they were German because they were people who lived in and came from the Roman Germania, which was so named because at one point, the Romans had encountered some people who called themselves Germans and hailed from that general area. But it struck me that, to the people responsible for the MGH, Charlemagne was a German. Not French, which made sense, because there was no France at that point. But German.

As I worked deeper into the historiography, into Landesgeschichte, Verfassungsgeschichte, Prosopographie, Verwaltungsgeschichte, Rechtsgeschichte, Personen- und Ortsnamengeschichte, and tried to get my head around Grundherrschaft, Adelsherrschaft, Grundgesellschaft, and a number of other sorts of study (strangely, not Diplomatik, although I work with charters), it hit me over and over again that there was a similar thread running through it all: when all was said and done, one of the explanations for why things were the way they were in the Early Middle Ages, at least in “Germany”, was that that’s how Germans had always done things, and you could tell a German because they did things that way. That’s an oversimplification, but for the scholarship from the 19th century all the way through the 1970s, at least, that seemed to be pretty consistent, despite the efforts to purge at least some of the universities of faculty tied to the Nazi era after the Second World War. When I moved to Germany, I noticed some of the same issues, which I think still exist in the historiographies of many different European nations. I was, and am, hardly alone in this, and challenging these assumptions about ethnicity and nation, whether implicitly or explicitly, has become fairly well embedded in the work of Early Medieval historians working on subjects in the areas that would become Germany. I don’t think this is as much of an issue for colleagues working on the areas that became France and England, although I gather there are similar challenges for those who focus on Wales, Ireland, and Scotland.

But, for example, as scholars move (gradually) to seeing charters and formulae as descriptive texts that probably reveal more about society than prescriptive leges, they are questioning the once sacrosanct pronouncements of long-dead legal scholars who believed in something called “Germanic law.” More importantly, and explicitly, scholars of Late Antiquity and the Early Middle Ages have spent much of the last forty or so years exploring questions of identity and ethnicity, rejecting the notion that there is anything inherent in those concepts in favor of seeing them as social constructs. This may not seem important to some readers — in fact, it might seem like I’m stating the obvious — but it’s important to remember that there was a time when these things were not obvious at all. It’s also important to consider the sorts of source material we have at our disposal.

Although archaeology plays an increasing role, the majority of sources are textual. Some are narrative, e.g., annals, hagiographies, the occasional history, as well as sermons, commentaries and exegesis, and letters, primarily written by clergy. Others, like charters, are non-narrative (mostly). In general (and I am intentionally generalizing, because one of the issues we deal with is that, the more research is done, the clearer it is that we can only make broad generalizations), we can say that our sources, especially the narrative ones, indicate that the people we study did identify themselves in many ways, and that they identified themselves against others. Innate physical characteristics are not usually markers of identity; more often, the markers were particular forms of clothing and jewelry, hairstyles, cranial deformation, names and naming conventions — in other words, they were things that were adoptable and mutable. What we don’t have evidence for is a concept of race, at least not in the modern sense. There are also (and again, remember I am talking primarily about the eastern part of what would become the Carolingian Empire) relatively few references to non-Europeans (depending on how one sees the Byzantine Empire). If there is one thing that eventually defines the Franks against a non-European other, it is Christianity. Not surprising, then, that in the Byzantine Empire and the Dar al-Islam, “Franks” eventually became a synonym for Western European Christians. That might as well be a description of the people who study them as well.

Most of the people studying the European Early Middle Ages, either insular or continental, are white. Not surprising, given that most of the historians of the European Middle Ages are white, although that seems to be changing very slowly. Even more, though, the people who work on things related to my research, the people I am talking about when I refer to my subfield, are people who work comfortably in German. Why is this important? Because most Americans of my generation, when preparing to go off to grad school and become medieval historians, prepped by honing their French and Latin, and only learnt enough German to pass a reading exam, and Old English if they wanted to be Anglo-Saxonists. There weren’t that many faculty in the US teaching “the German stuff” at that point — and those few were not necessarily well plugged into the established Medievalist network at the time. To give you an idea, Geary and Noble had only been teaching for just over a decade when I graduated with my BA. Similarly, it was about the same time, the mid-1970s, that a critical mass of UK scholars started to focus on Eastern Francia and the areas that would become Germany. I don’t know why the shift in the US occurred, but in the UK, it seems at least partially connected to German academics who taught at Oxbridge. In any case, there are still relatively few of us in the US, even including the people who work on the Ottonians. We are also not particularly visible, even to each other, in part because of the nature of academic jobs in the US, and the distances that make regular research seminars, etc., prohibitive. By the nature of jobs, I mean that most US academics are not employed at top research universities, and in fact are probably not employed at PhD-granting universities, full stop. There are a lot of medieval historians working as generalists, and it can be difficult to break out of that isolation (but that’s any number of other posts).

One of the few benefits of this set-up is that academics in the US, and I think medievalists more than most, have a lot of leeway when it comes to their research, as long as it relates somehow to their position. The US is also big, and although there is certainly gatekeeping, the old system of patronage governing who got which job by way of a phone call is gone.

This is not the case everywhere. I cannot speak to the rest of Europe, but in the UK, Germany, and Austria, at least, old hierarchies and patronage networks remain strong. Funding is different, too, as is the very nature of university education at both undergraduate and postgraduate levels. It is much narrower, and people do not tend to work far beyond that narrow training. I could not say whether it is actively discouraged for those in permanent positions (and indeed, I know people who have remade themselves over a number of years, but they have not kept the same positions, and have generally had to change universities — and they are seldom medievalists, let alone people who do the sort of stuff I do). It’s a pretty conservative world, and very high-pressure. This is not to say that things have not changed — in fact, they have changed tremendously even in the last twenty years, but some of those changes are ones that came to other areas of Medieval Studies at least a decade before.

II. More Specific Thoughts on My Current Life as a Medievalist

As many of you might be aware, I recently attended the Leeds IMC. It is normally the high point of my academic year. It is a time for me to connect with others in my field, with people I might never see otherwise. Kalamazoo is the conference I attend to catch up with friends and go to panels on the profession and on subjects outside my specialty. It’s easy to do that at Kalamazoo, because despite a gradual expansion of offerings over the last decade or so, it is still a conference that is literature-heavy and late-leaning. Until very recently, “early” at Kalamazoo meant Late Antique (which generally focused on either Roman-Barbarian relations or Early Christian themes) or Anglo-Saxon stuff (mostly monastic, when history). Panels on Carolingian Europe have been few and far between, and as at most conferences, often conflicted with each other or with other early panels. Not so at Leeds, where panels on the Early Middle Ages are central to the program. Also central are strands organized by formal working groups, often housed in German and Austrian universities. These groups have produced years-long strands on subjects like “Texts and Identities” and “The Transformation of the Carolingian Age.” The scholars who present in these strands are more likely to publish in journals like Early Medieval Europe, Viator, and Francia — when they publish in English — than in Speculum or the Journal of Medieval History. Not that I manage to keep up with even a quarter of what I would like to: like many other medieval historians in the US (I can’t speak for the Lit people, but this does seem to be true for Art Historians, too), I teach a broad range of courses that might not include courses in my research field more than once every couple of years. So Leeds is vital to helping me focus my time and energies on the major trends in my field.

This seems to be true for a lot of fields. I can go to Leeds and never run into colleagues who specialize in other areas of history, let alone in other disciplines. There are other panels I’d like to go to, especially when they pique my interest for teaching, or for new approaches. But honestly, my hierarchy of panels is usually this: Carolingians, Merovingians, charters, Anglo-Normans, Anglo-Saxons, Late Antique stuff, especially patristics and monasticism. Somewhere in there, I try to work in prosopography and teaching, not to mention panels where friends or their postgraduates are presenting. It doesn’t help that I seldom get around to reading a program until just before the conference, and then try to get through the corrigenda before heading off to a panel. But honestly, I read through, generally look for the key words of my interests, and then thumb through the index to make sure I haven’t missed anybody whose work I really want to see. Inevitably, I will have to decide between two or three interesting panels. And sometimes, my decisions are finalized for reasons of location, proximity to the next panel, and which friend I ran into at the coffee break. I don’t think that makes me too different from others — otherwise, I’d run into more people who aren’t in my own field. Although I wish we medievalists were more aware of the trends outside our specialties, it doesn’t surprise me that we — and I mean all of us — aren’t. There’s an awful lot to keep up with in our own areas and those most closely connected to them, and the demands of the job often make it hard to move beyond our own little bubbles.

From within that bubble, and having an idea of the bubble that many of the theme organizers of this year’s IMC occupied, I have to admit that the concept of Otherness as outlined in the Call For Papers seemed perfectly normal for a Leeds CFP. It was very general, written in a way that encouraged people to think about all sorts of ways of defining the other. In other words, I read it as an invitation for people in fields that might not normally think of differentiations between Self and Other to explore what those things look like in their fields. I felt the same way when I read the description of session 1414: the organizer had chosen to start with a discussion that was rooted in the discussions of ethnicity and identity that have been so important in his own area of specialization, with the intention of opening up discussion to the audience for comparison and contrast — in fact, to give people the opportunity to move beyond their bubbles. I am not saying this was the intent of either, but it is how I read both. What I did not read was a specific call to discuss theories of Otherness, nor would it have occurred to me to do so, since, for example, my source materials do not seem to support analysis through Critical Race- or Post-Colonial Theory: as I’ve indicated above, the sources tend to show that identity was fluid, adaptable, mutable, and to 21st century eyes, it’s about white people dividing themselves from other white people. Occasionally, there are glimpses — I am intrigued by a smattering of Old Testament names in a handful of 9th C charters, but there is not enough to tell us whether those are adopted names in the style of Alcuin’s circle, names of recent converts or perhaps names adopted upon entering a monastery, or names of actual Jewish people who, from the very scant evidence of a cartulary entry, seem to have had the same ability to alienate property as their neighbors.

But in general, I don’t believe that people saw it that way (white people defining themselves against other white people) in the 7th or 8th or 9th century, and I don’t think modern ideas of race are valid or helpful in terms of my subjects and source material.

But, that is not true for the historiographic tradition. It makes a lot of sense to discuss the effects of that tradition on modern ideas of race and identity. After all, 18th and 19th century ideas of Romantic Nationalism are at the root of the search for “Germanness” (or Englishness, Frenchness, but I am most familiar with the German stuff, so…) I mentioned above. Some of our translations still in use go back to that period, as do many, perhaps a preponderance, of our editions and their commentaries. Thus, much of our scholarship is influenced by the very institutions that Critical Race Theory and Post-Colonial Theory militate against. But in a field where so much of our work is focused on picking apart primary sources (often less than a paragraph, sometimes only comparison of the use of a few words by one or more authors, etc.), and where deeper discussions of historiography tend to take place in the footnotes (except, for example, in papers on MS transmission, where there can be an awful lot of who argued for X date of Y recension), I am not sure where those theories should be applied. This is very different from many other areas of Medieval Studies, especially those focusing on Literature and later periods — and I gather that Lit people have their own periodizations? At any rate, where I find those approaches useful is in how they have affected the ways views of the Middle Ages have been perpetuated, and it is there where I see the most relevance to today, to my teaching, and to living in the world.

III. Medieval Scholarship in My Teaching and Daily Life

In some ways, I’m luckier than most: I live in the US and have a fairly secure job, in that I am no longer worried about promotions. I am expected to teach a very broad range of courses, including modern topics that are informed by a variety of theoretical approaches, but are not taught as Theory per se, because 1) that’s not my specialty, and 2) those courses tend to be part of the General Education curriculum, and/or requirements for elementary education. I also teach in a slightly weird environment: I am in a pretty red area of a swing state; my students tend to see their education as either professional training or as the equivalent of a license to a well-paying job; the students themselves range from Lost Causers to very woke performing arts people. Primarily, though, they are white and not particularly liberal. One thing they all have in common is that they are generally pretty ignorant of history and geography. Most of them find the history of anywhere other than the US pretty inaccessible, and conversations that focus on gender, race, and religion challenging at best. When I teach these students, especially when dealing with pre-modern history, I use approaches informed by a number of theories, but my goal is to use them as tools for getting through to them that race, gender, and religion are real issues, and that we can see them in the scholarship, often more than we see them in the sources themselves (or not — it very much depends on the sources). I want them to understand that those things connect to how they see the world now, sometimes implicitly, sometimes explicitly, again depending on topic and sources. I am lucky in that my Gen Ed area focuses on trying to gain global perspectives AND understanding cultural difference, because it helps to shield me from complaints about promoting a left-wing agenda. Those complaints exist, and in a tuition-driven, red state environment, they do matter.
(Having said that, the most recent elections in the US and abroad have been almost helpful, because basic critical thinking can be accused of being left-wing). 

Nevertheless, even in my survey classes focusing on earlier periods, I bring out what I can, and what seems appropriate to the topics, e.g., introducing the concept of Orientalism when discussing Herodotus, or how the Roman concept of virtus is related to the default assumption of a particular sort of masculine norm. But even concepts that have been in common parlance in the discourse of gender and gender relations for close to a century, e.g., Patriarchy (which of course has existed as a term much longer, but I am thinking here of it in the super-simplified “underlying structure against which feminism militates” way), are things I have to break down in ways I never expected. Terms like patriarchy and privilege are almost immediately threatening to the majority of my students, and it’s a challenge to get them to engage with the concepts, especially when their previous education has emphasized gaming a series of standardized tests. So, for example, when I use the phrase ‘patriarchal society,’ many students seem to translate it as, “Ok, Prof. Medievalist is a FEMINIST, so I need to point out all the ways that the source we’re reading shows that women were oppressed.” One of the ironies of teaching World History is that students tend to want to take away the idea that historically, everybody was as awful as everyone else. For example, a selection of texts showing that, for millennia, “nice” women (that’s a whole lesson in itself) in many parts of the world, most importantly the ones where Abrahamic religions took hold, were expected to cover their heads, and that it is only in the fairly recent past that this expectation has pretty much died out in Europe and the Americas, can just as easily bolster confirmation bias about modern Western values as show that even things students associate with Islam, or with a “backward” culture, are not what they thought.

It gets more complicated, but in some ways easier, to teach texts and show how they’ve been appropriated, especially in terms of modern national identity. And, of course, to show how those appropriated and re-fitted concepts, like the cultural inferiority of barbarians/savages, have been used to create a world of systematized racism, new sorts of power dynamics, and world views that privilege “Modern/Western” values. Where it gets stickiest is trying to explain how people who all read as equally “white” in a US context can see themselves as entirely different peoples in comparison to each other (so, for example, the English and the Irish, or the various groups that made up the former Yugoslavia, or the xenophobia of Brexiters towards Eastern Europeans), and simultaneously see themselves as more like each other than like the people forced to acculturate under their colonial empires. Except, of course, when they don’t.

Do my European and UK colleagues (whether other Early Medieval Historians or some other form of Medievalist) have the same goals or challenges? I don’t know. My guess is that at least the Medieval historians generally don’t, because by the time students reach university, they have already narrowed their fields far more than we do in the States as postgraduates. There are no Gen Eds, and Western- or World Civ surveys are almost non-existent. To a large extent, the students are white, taught by a primarily white faculty (although I think there are still variants of how one would define white — census forms are different in different countries). I have been party to any number of conversations where medieval historians on both sides of the Atlantic talk about the need to get more students of color into the pipeline, but one thing missing from that discussion is that the pipeline isn’t an equal-access one. For example, how many children of immigrants in Germany are tracked into Hauptschule rather than Gymnasium at age ten or eleven? I can tell you that when I lived in Bavaria, almost all of the kids in my daughter’s class, and grade, were first- or second-generation immigrants. Many of their parents did not know that that simple choice would make it very hard for them to ever enter a university. In the US and UK, wealthy (and disproportionately white) people send their children to private schools and/or pay for extra tuition that will improve their chances. Students from poor backgrounds, especially first-generation students, the majority of whom are people of color, don’t generally go into the Humanities: their families want a better chance at a return on investment. I’ve heard colleagues attribute this attitude to cultural background, but I think that ignores the history of accessibility to university education. At least anecdotally, I don’t see a lot of difference between first-gen white students and students of color when choosing a major.
And beyond that, there is abundant evidence that our school systems privilege white students, whether through funding, districting, admissions requirements, or some other criteria.

IV. Where is this all going?

To be honest, I haven’t a fucking clue. This is longer than I wanted, and needs editing in a bad way. But for a variety of reasons, I wanted to get this out sooner rather than later. I know it won’t please everyone. It may not please anyone. I’m sure I’ve managed to fall into at least a couple of rhetorical pitfalls, even as I’ve tried to avoid going into a lot of them. I know that some people will be annoyed that I haven’t directly addressed things said on Twitter, and if you are, I apologize. This is the best way I know how, at this point in time, to say that I see a direct connection between what we do and study and the racism (especially), and also the homophobia, transphobia, ableism, and misogyny that permeates not just our society, but also the corner of it that is academia. I hope that others do, too, and I will keep trying, in my own way, to help those who don’t, see.

In lieu of an essay…

20 October, 2016

Something I wrote four years ago today, and forgot all about. I have too many half-finished posts in the queue, and hope that posting something will remind me to post more.

Inner Chorus:

O that we now had here
But one one-thousand dollar Dyson
That do but sit in shops!


What’s she that wishes so?
My Inner Chorus? No, my fair voices:
If we are marked to sneeze, we are enow
to use a Kleenex box; and if to breathe,
the Bissell work’d, and I have hope of savings.
God’s will! I pray thee, wish not Dyson more.
By Jove, I am not covetous for tech,
Nor care I if cats leave grit upon my floor;
It yearns me not if cats dust bunnies leave;
Such outward things dwell not in my food.
But if it be a sin to detest sneezing,
Mine is the most offending nose alive.
No, faith, old self, wish not a better vacuum
To clean the floor, to hold all dust bunnies
In momentary sway. O do not wish for more!
Rather proclaim it, Chorus mine, to the cats,
That they that hath no stomach to this fight,
let them depart; their dinner be delayed
and place for hiding be under the bed:
We would fight against dread dust bunnies
And grit that kitty cats have left behind.
Today is called the feast of Adalbert.
She that shall clean this day, and essays mark,
Will sit in bathtub when the job is done,
And rouse cats from their tidy hidey-holes.
She that shall clean this day, shall bite her tongue
For surely when the vacuum starts the kittehs
Will cry, “The Monkey is Mad, so say we:”
Then like unto a cartoon of Chuck Jones
Will lose control of limbs and silly seem.
I bite my tongue: for mocking cats is cruel,
And they’ll remember not the ravages
Against the dust and hair: only the noise.
Familiar in cat minds as Monkey mad
driving the beast, killing the dust bunnies,
Singing and cleaning, Boots-cat and Rosie
Left in their hidey-holes swearing all cat-swears.
This story shall the poor cats tell to all;
And Bishop Adalbert shall ne’er go by,
From this day to the ending of the world
But they in it shall yet remember:
Those cats, those fearful cats, those martyred kittehs;
For those who hide and plot revenge on me
They are my kittehs; be they ne’er hungry,
This day still tempers plots of vengeance:
Wrought carefully in closets deep
Plans for hairballs and poo unburiéd,
To make their hatred clear lest Monkey dare
To clean the house upon a Saturday.

Why we need history #1: “Erasers are ‘an instrument of the Devil’…”

27 May, 2015

When I was at Kalamazoo, several people groused at me for not blogging more. This morning, as I procrastinate over other things, it occurred to me that many of my comments on the book of face are probably worthy of blog posts. This post is meant to be the beginning of a series of posts on how better historical literacy might improve work in many fields. And daily life, for that matter.

The first thing I heard on the radio this morning was a BBC headline referring to this piece in The Telegraph. Guy Claxton, a cognitive scientist currently a visiting professor at King’s College (unclear whether it’s London or Cambridge), claims that erasers (when did UK English abandon “rubbers”?) “encourage children to feel ashamed about mistakes” and should therefore be banned from classrooms. Claxton sees the erasure of mistakes as part of a culture of shame that causes dishonesty and inability (or refusal) to take responsibility for one’s own errors. He contrasts this culture with a more desirable one in which students learn from their mistakes and constantly attempt to improve. Plausible on the surface, perhaps, but a little historical knowledge and training shows the problems with Claxton’s argument.

One of the aspects of historical training that I and many others find most valuable is the development of what some call historical imagination. By this I don’t mean the blithe imaginings of people who wish to live in a past reframed by their own values. Instead, I mean the kind of imagination that requires accepting that people of the past didn’t necessarily think like we do: they had different worldviews, different value systems, different concerns. The artifacts and documents that they produced make that pretty clear. While it is true that historical interpretations of these objects change, and that sometimes different interpretations remain open to debate, the evidence for cultural difference is pretty clear. That is especially true when dealing with pre-Modern and non-Western history.

It’s one thing for those of us in the modern West to try to imagine the lives of people who lived a hundred years ago within our own cultures, but another, very different, thing to get our heads around why people did the things they did five hundred years ago. I have friends and colleagues who knew people who served in the First World War. I have pictures of my father’s great aunt, who served in the Signal Corps. In other words, we are still connected to what most people think of as a distant past. We know that our parents and grandparents grew up in a different time, so we extrapolate backwards for their ancestors. The problem here is often that, because our parents and grandparents also died in our time, with all mod cons, we forget that most of their lives might not have been that way. For example, I have a mobile phone that probably has more computing power than all of the giant computer banks involved in the first moon landing. My using the same technology as my students — not to mention that I am often much more knowledgeable about it than they are — places me in the ‘now’. They find it hard to believe that I would have been a school contemporary of Jackie in That ’70s Show, or that when The Brady Bunch originally aired, the kids were the same age as my friends, their older siblings, and me. Nevertheless, the connection makes historical imagination relatively easy. It’s like trying to imagine the life of someone raised in a different part of the country. By contrast, trying to figure out pre-Modern and/or non-Western history requires us to learn multiple languages and often draw on scholarship from other disciplines.

Learning and using different languages forces us to think differently. I’m not arguing for linguistic determinism, only linguistic relativity. In other words, learning to express one’s thoughts (or translate another’s words) idiomatically requires us to consider differences in concepts and how they are expressed. Trying to translate literally seldom works well, while idiomatic translation requires imagination and understanding of cultural analogues. The need for such analogues is also one of the things that prompt us to look to other disciplines when our sources indicate rituals or behaviors that are not explained within the sources themselves. Although (or perhaps because?) the work of literary theorists is often one of the first things people associate with “interdisciplinary studies”, I think we sometimes forget that the social sciences often supply us with the best analogues. There are lots of reasons that those analogues don’t end up working for us, but the process of finding and even rejecting the analogues requires us to approach our subject with a different mind-set. I see this as something that contributes to a better historical imagination. It also bleeds over into what I consider the most important effect of good historical training: cultural empathy and imagination. The training that helps the best historians recognize and set aside their own ethnocentric/presentist assumptions to get their heads round an alien past is equally useful for understanding cultures that are part of our present, yet seem equally alien to us.

Claxton’s assertions about the eraser (at least as presented in the media) reflect such presentist and ethnocentric assumptions, exacerbated by apparent ignorance of historical fact and context for the practice of erasure. Perhaps there is some a priori evidence for a culture of shame and the subsequent dishonesty of trying to hide mistakes that has been omitted from the original piece. I hope so. Otherwise, there are some very simple historical reasons I find his assertions as reported problematic, and why we need history.

Imagine living in a world where writing materials are much harder to come by than they are for most of us living in the industrialized world today. That could be in poorer parts of the world now, but let’s just assume a much more distant past. People wrote on clay tablets, on papyrus, on paper, on parchment or vellum, on wood, on stone, and on cloth to record things worth recording. We don’t see a lot of mistakes on such recordings. We also don’t always see evidence of erasures. Why? There are many reasons, but the ones that come to mind first are purpose and process. Something important enough to record is something worth getting right. Getting something right might require multiple drafts — something very much like the process of constant improvement Claxton wants. Wax tablets like the one in the picture here originated in Greece and were used by the Romans and by residents of their (ex-)Empire throughout the Middle Ages. They were used in lessons, for drafts of work to be later recorded more permanently, and for things that needed to be written, but not necessarily kept. Sources tell us of Japanese and Chinese scholars who practiced their calligraphy with water, sometimes on stone or pottery, because they could not afford to practice on paper. Even in the modern period, children and their teachers used slates that could be erased, not to hide the incorrect answers, but because they allowed materials to be re-used. Today, I use a whiteboard for the same purpose.

When I was at school and at university (until the last year, when my university installed two computer labs, one Apple and one DOS-based PCs, both of which used 5 ¼” floppy disks), we were required to turn in typed versions of our papers. When I did a course in the UK, my papers had to be handwritten. The common requirement was that they be proofread and free from errors, something that had not changed in decades, if not since the first time that university students were obliged to turn in written work to be marked! This meant that I, like my predecessors, drafted everything longhand, made many corrections on the original and subsequent drafts, and then either copied them out in my best writing, or typed them up very carefully. A mistake meant re-doing the entire page — at least till the invention of correction tape. Imagine how much harder it was for the medieval copyist, carefully copying out his or her text on parchment: of course the parchment could be treated and scraped to erase an occasional error, but the point was to produce something without them. The book or document wasn’t meant to show process, because it was a product, a final and definitive copy, often commissioned and paid for in the same way people paid for the works of many other artisans. Erasure isn’t meant to hide the error — it’s merely a means to correct the error. The shame lay not in the error itself, but in allowing the error to remain. The expectation that the final product be without error actually drives a process of revision and correction. The eraser makes it possible to go back and make the corrections. Whence, then, the shame and dishonesty?

It doesn’t come from the eraser. Perhaps he should look closer to home, at the very recent past and the rhetoric of accountability and league tables in education, and at the general malaise of comparing “results” that are almost always measured against some arbitrary, often monetized, standard. Isn’t there some sort of cognitive dissonance in expecting people to care about process when they are measured only by results in a game where the stakes are incredibly high? But perhaps the shame is not in the error itself? Claxton suggests that it is also connected to a belief that people should get things right the first time. What he doesn’t address is where such unrealistic expectations come from, and how they might be connected to the same sorts of measures of success. For example, at present a company’s success is generally measured by sales, and by how much the executives and stockholders earn. Was that true fifty years ago? a hundred? is it true in every country? Factoring in things like dependence on public subsidies (whether outright or indirect), savings gained by paying below a living wage, re-investment in better technology and facilities (which also may have an impact on public expense, e.g., when refusing such improvements has a negative impact on the health of the local community), etc., shows a much closer relationship between success and effort. Similarly, looking at how much time and effort the most successful (and least successful) students put into their work — and for some, how much extra money their parents may have spent on private lessons, private transportation, tutors, etc. — and what other demands there are on their time, might help to re-set people’s perceptions of normality. It’s far harder to feel shame for not clearing the low bar when you realize that the supposedly low bar is actually exceptionally high.

There are doubtless many other factors I haven’t mentioned, and it’s likely that the ones I have mentioned might be cause for disagreement. My intention here is merely to show the sorts of issues and questions that anyone with a sound background in historical thinking might raise. Distrusting the deceptively simple demonization of the eraser: it’s one of the reasons we need history.

Super-late notice about Kalamazoo blogger meet-up

12 May, 2015

Bloggers meeting at 7:00 Thursday in whatever Valley is Eldridge-Fox-registration-book room. Ask at the main info desk for the room number. Donations of food and beverage greatly appreciated!

Invisible Things

22 December, 2014

It appears I have forgotten how to blog. I also have a paper draft ostensibly due tomorrow, and I have every kind of mental block going. I’ve got three blog post drafts, too, all unfinished. So now, I am writing, to prove that I can. Still. Write.

I have been thinking a lot about invisibility lately. Sort of. Part of it has to do with the higher visibility of excessive force used by law enforcement when dealing with disabled people. Part of it is that I have a few friends and acquaintances who have invisible physical ailments. Coeliac, fibromyalgia, migraines, encephalomyelitis/CFS… all sorts of things. I know these things are real, and I do my best to disabuse people of the notion that such illnesses are imaginary, or not serious, or not debilitating. I also know an awful lot of people with various mental illnesses, some more serious than others. You may recall that there have been a bunch of articles over the last few months about depression and the effects of stress on academics, etc. Lots of my acquaintances have linked to these pieces and more. Ironically, perhaps, but not entirely surprising, some of the people who are most outspoken about their invisible physical illnesses seem to be far less sympathetic about mental illnesses. I’m always slightly surprised at the amount of victim blaming that goes on, even as we claim to understand that mental illness is real, and comes in lots of different sorts, some of which are treatable and/or temporary, and some that are not. It also interests me how we treat various disorders, labeling some ‘serious mental illnesses’ and others ‘conditions that normal people have,’ or ‘disabilities.’ By ‘we’, I mean people in general, the media, etc. So, for example, ADHD is a disorder. It can also be a disability, and as such, is covered by the Americans with Disabilities Act. But it’s not that common to hear people talking about ADHD as if it were some sort of illness. And somehow, perhaps because it’s more often diagnosed in children, it doesn’t seem to attract labels like ‘crazy’.

That doesn’t seem to be the case for many of the other sorts of pretty common forms of disorders that come under the umbrella of mental illness. Depression, anxiety, and stress disorders come in many different types. One can look a lot like another: diagnosis can require a lot of tests, and generally a lot of time. As far as I can tell, and this is just via personal observation and anecdata from friends and colleagues, it seems easier to diagnose more severe cases of things like depression than something like generalized anxiety disorder. It’s not surprising; after all, there are an awful lot of things out there that can cause a person to have trouble concentrating and affect short-term memory, including dehydration and lack of sleep. A couple of years ago, at what was close to a climax of a very stressful several years, I was fortunate enough to undergo all sorts of invasive tests and massively nasty medications as the doctors tried to figure out why my digestive system had gone to hell. They ruled out everything scary and still couldn’t figure out what was wrong with me. It was only by chance that I happened on an article about sleep requirements, and asked the doctor if she thought my inability to get more than about two hours of undisturbed sleep at a time might have something to do with it. Three weeks of sleep meds later, and all the symptoms were gone. Stress-related, or so they said. Get away from the stress, or learn to deal with it. Easy.

But what if it’s not easy? As at least one of the pieces I linked above notes, we tend to normalize stress and bad work or personal environments to a point where not being able to deal with it is seen as abnormal or weak. Normalizing stress also makes it difficult to think outside that framework. In other words, if an environment or relationship is known to be stressful, then it’s easy to assume that what is going on is just … stress. Most people aren’t trained to make psychiatric or psychological diagnoses, after all. We hear a lot, and are familiar with terminology, but words that seem synonyms for stress to a layperson might mean something else to a clinician, and vice versa. It might not even occur to a person to wonder if their inability to handle ‘normal’ stress is itself normal. They might indulge in some self-blame and try to hold it together. After all, everybody else seems to be doing so. Not everyone feels that way, though. Imagine the person knows both that the stressful environment is not normal AND that their reaction is something more than not handling the stress. That something more? is OMG mental illness. Slip a disc, and no one expects you to help with the heavy lifting. Diagnosed with some sort of mental disorder that makes it hard to handle certain situations? New can of worms, that is.

For those few people not on some form of social media beyond the blogosphere, take my word for it that not a week goes by without at least a few stories in your various timelines that are focused on enlightening people about what depression is like, and how it can’t be cured by Moar Willpower! or how disorder X is on the rise, or that there’s a new drug available for anxiety, or whatever. On a societal level, we seem much more willing to accept that these things are illnesses that can be mild or severe, and can be treated, and sometimes ‘cured’. But as individuals, we aren’t so good at it. People, and maybe even especially people who work with people and do thinky work, who are willing to talk about their mental health issues often take the risk of being blamed for a stressful environment or relationship — after all, we all knew it was sort of crazy, so all of the crazy must be the crazy person’s fault. Not surprising that many people try to make that part of their lives invisible to others. And face it, it’s fairly easy for most people. Everybody has a hard time coping with stress, right? As long as a person copes most of the time, it’s the stress that’s the problem. It’s sort of like my migraines: people know I have them, but they also know that most of the time I can take meds and keep going. For a long time, I was so good at hiding all but the worst of the migraines that even people who knew I got them, and knew that there were certain triggers, like strobe lights or rapid temperature changes, would regularly ask if I wanted to go clubbing. There are times we are complicit in hiding our illnesses. After many years, and meeting many other migraine sufferers, I finally stopped trying to hide them. Migraines may be invisible, but they are also Real.

It’s not the same with mental illness. It seems that for many people, Real = “so crazy anybody can see it.” The more invisible, the less real. Cope fairly well? Prepare for a well-meaning friend or family member to challenge the diagnosis. This is not actually surprising, given how many people seem to think a five-minute test on a website can correctly identify anything! Nevertheless, even when the diagnosis has been made by an expert, an awful lot of people who aren’t experts are willing to ignore or contradict the expert and the person who has consulted the expert. People I know with coeliac or allergies or diabetes often face similar attitudes, but ignoring those sorts of invisible illnesses can result in very visible physical illness, and even death. Once that is made clear, only serious asshats will not keep the illness in mind and act accordingly, asking about acceptable foods, etc. – it’s amazing how epi-pens and insulin pumps can change a person’s attention to detail. In contrast, a person who has an invisible mental illness, especially if they have been receiving treatment for years, may be every bit as aware of things that will make them worse, or that might put them at risk of a panic attack, or send them into a depressive state. They may be very articulate about it, and even try to explain what’s going on, and how others can help to minimize potential setbacks (if you’ve seen As Good As It Gets, you’re on the right track, although that’s orders of magnitude beyond what I’m talking about). But despite that groundwork, the people who seem to be coping despite their invisible mental illness aren’t likely to go into sudden shock, or die, if people ignore their needs.
Couple that with general suspicions regarding the authenticity of illness in people who seem more or less fine, if stressed or a little down — not to mention that thinking about others and how their experiences and illness might shape their reactions to certain types of situations can feel like catering to someone who is just being difficult — and it’s not too hard to make the person as invisible as the illness.

I’ve got no real conclusion here. It’s just something that’s been rolling around in my head for a while. Inconsistencies and weird hierarchies of privilege will do that.