Something I wrote four years ago today, and forgot all about. I have too many half-finished posts in the queue, and hope that posting something will remind me to post more.
O that we now had here
But one one-thousand dollar Dyson
That do but sit in shops!
What’s she that wishes so?
My Inner Chorus? No, my fair voices:
If we are marked to sneeze, we are enow
to use a Kleenex box; and if to breathe,
the Bissell work’d, and I have hope of savings.
God’s will! I pray thee, wish not Dyson more.
By Jove, I am not covetous for tech,
Nor care I if cats leave grit upon my floor;
It yearns me not if cats dust bunnies leave;
Such outward things dwell not in my food.
But if it be a sin to detest sneezing,
Mine is the most offending nose alive.
No, faith, old self, wish not a better vacuum
To clean the floor, to hold all dust bunnies
In momentary sway. O do not wish for more!
Rather proclaim it, Chorus mine, to the cats,
That they that hath no stomach to this fight,
let them depart; their dinner be delayed
and place for hiding be under the bed:
We would fight against dread dust bunnies
And grit that kitty cats have left behind.
Today is called the feast of Adalbert.
She that shall clean this day, and essays mark,
Will sit in bathtub when the job is done,
And rouse cats from their tidy hidey-holes.
She that shall clean this day, shall bite her tongue
For surely when the vacuum starts the kittehs
Will cry, “The Monkey is Mad, so say we:”
Then like unto a cartoon of Chuck Jones
Will lose control of limbs and silly seem.
I bite my tongue: for mocking cats is cruel,
And they’ll remember not the ravages
Against the dust and hair: only the noise.
Familiar in cat minds as Monkey mad
driving the beast, killing the dust bunnies,
Singing and cleaning, Boots-cat and Rosie
Left in their hidey-holes swearing all cat-swears.
This story shall the poor cats tell to all;
And Bishop Adalbert shall ne’er go by,
From this day to the ending of the world
But they in it shall yet remember:
Those cats, those fearful cats, those martyred kittehs;
For those who hide and plot revenge on me
They are my kittehs; be they ne’er hungry,
This day still tempers plots of vengeance:
Wrought carefully in closets deep
Plans for hairballs and poo unburiéd,
To make their hatred clear lest Monkey dare
To clean the house upon a Saturday.
When I was at Kalamazoo, several people groused at me for not blogging more. This morning, as I procrastinate over other things, it occurred to me that many of my comments on the book of face are probably worthy of blog posts. This post is meant to be the beginning of a series of posts on how better historical literacy might improve work in many fields. And daily life, for that matter.
The first thing I heard on the radio this morning was a BBC headline referring to this piece in The Telegraph. Guy Claxton, a cognitive scientist currently a visiting professor at King’s College (unclear whether it’s London or Cambridge), claims that erasers (when did UK English abandon “rubbers”?) “encourage children to feel ashamed about mistakes” and should therefore be banned from classrooms. Claxton sees the erasure of mistakes as part of a culture of shame that causes dishonesty and an inability (or refusal) to take responsibility for one’s own errors. He contrasts this culture with a more desirable one in which students learn from their mistakes and constantly attempt to improve. Plausible on the surface, perhaps, but a little historical knowledge and training shows the problems with Claxton’s argument.
One of the aspects of historical training that I and many others find most valuable is the development of what some call historical imagination. By this I don’t mean the blithe imaginings of people who wish to live in a past reframed by their own values. Instead, I mean the kind of imagination that requires accepting that people of the past didn’t necessarily think like we do: they had different worldviews, different value systems, different concerns. The artifacts and documents that they produced make that pretty clear. While it is true that historical interpretations of these objects change, and that sometimes different interpretations remain open to debate, the evidence for cultural difference is pretty clear. That is especially true when dealing with pre-Modern and non-Western history.
It’s one thing for those of us in the modern West to try to imagine the lives of people who lived a hundred years ago within our own cultures, but another, very different, thing to get our heads around why people did the things they did five hundred years ago. I have friends and colleagues who knew people who served in the First World War. I have pictures of my father’s great-aunt, who served in the Signal Corps. In other words, we are still connected to what most people think of as a distant past. We know that our parents and grandparents grew up in a different time, so we extrapolate backwards for their ancestors. The problem here is often that, because our parents and grandparents also died in our time, with all mod cons, we forget that most of their lives might not have been that way. For example, I have a mobile phone that probably has more computing power than all of the giant computer banks involved in the first moon landing. My using the same technology as my students — not to mention that I am often much more knowledgeable about it than they are — places me in the ‘now’. They find it hard to believe that I would have been a school contemporary of Jackie in That ’70s Show, or that when The Brady Bunch originally aired, the kids were the same age as my friends, their older siblings, and me. Nevertheless, the connection makes historical imagination relatively easy. It’s like trying to imagine the life of someone raised in a different part of the country. By contrast, trying to figure out pre-Modern and/or non-Western history requires us to learn multiple languages and often draw on scholarship from other disciplines.
Learning and using different languages forces us to think differently. I’m not arguing for linguistic determinism, only linguistic relativity. In other words, learning to express one’s thoughts (or translate another’s words) idiomatically requires us to consider differences in concepts and how they are expressed. Trying to translate literally seldom works well, while idiomatic translation requires imagination and understanding of cultural analogues. The need for such analogues is also one of the things that prompt us to look to other disciplines when our sources indicate rituals or behaviors that are not explained within the sources themselves. Although (or perhaps because?) the work of literary theorists is often one of the first things people associate with “interdisciplinary studies”, I think we sometimes forget that the social sciences often supply us with the best analogues. There are lots of reasons that those analogues don’t end up working for us, but the process of finding and even rejecting the analogues requires us to approach our subject with a different mind-set. I see this as something that contributes to a better historical imagination. It also bleeds over into what I consider the most important effect of good historical training: cultural empathy and imagination. The training that helps the best historians recognize and set aside their own ethnocentric/presentist assumptions to get their heads round an alien past is equally useful for understanding cultures that are part of our present, yet seem equally alien to us.
Claxton’s assertions about the eraser (at least as presented in the media) reflect such presentist and ethnocentric assumptions, exacerbated by apparent ignorance of historical fact and context for the practice of erasure. Perhaps there is some a priori evidence for a culture of shame and the subsequent dishonesty of trying to hide mistakes that has been omitted from the original piece. I hope so. Otherwise, there are some very simple historical reasons I find his assertions as reported problematic, and why we need history.
Imagine living in a world where writing materials are much harder to come by than they are for most of us living in the industrialized world today. That could be in poorer parts of the world now, but let’s just assume a much more distant past. People wrote on clay tablets, on papyrus, on paper, on parchment or vellum, on wood, on stone, and on cloth to record things worth recording. We don’t see a lot of mistakes on such recordings. We also don’t always see evidence of erasures. Why? There are many reasons, but the ones that come to mind first are purpose and process. Something important enough to record is something worth getting right. Getting something right might require multiple drafts — something very much like the process of constant improvement Claxton wants. Wax tablets like the one in the picture here originated in Greece and were used by the Romans and by residents of their (ex-)Empire throughout the Middle Ages. They were used in lessons, for drafts of work to be later recorded more permanently, and for things that needed to be written but not necessarily kept. Sources tell us of Japanese and Chinese scholars who practiced their calligraphy with water, sometimes on stone or pottery, because they could not afford to practice on paper. Even in the modern period, children and their teachers used slates that could be erased, not to hide the incorrect answers, but because they allowed materials to be re-used. Today, I use a whiteboard for the same purpose.
When I was at school and at university (until the last year, when my university installed two computer labs, one Apple and one DOS-based PCs, both of which used 5 ¼” floppy disks), we were required to turn in typed versions of our papers. When I did a course in the UK, my papers had to be handwritten. The common requirement was that they be proofread and free from errors, something that had not changed in decades, if not since the first time that university students were obliged to turn in written work to be marked! This meant that I, like my predecessors, drafted everything longhand, made many corrections on the original and subsequent drafts, and then either copied them out in my best writing, or typed them up very carefully. A mistake meant re-doing the entire page — at least till the invention of correction tape. Imagine how much harder it was for the medieval copyist, carefully copying out his or her text on parchment: of course the parchment could be treated and scraped to erase an occasional error, but the point was to produce something without them. The book or document wasn’t meant to show process, because it was a product, a final and definitive copy, often commissioned and paid for in the same way people paid for the works of many other artisans. Erasure isn’t meant to hide the error — it’s merely a means to correct the error. The shame lay not in the error itself, but in allowing the error to remain. The expectation that the final product be without error actually drives a process of revision and correction. The eraser makes it possible to go back and make the corrections. Whence, then, the shame and dishonesty?
It doesn’t come from the eraser. Perhaps he should look closer to the very recent past and the rhetoric of accountability and league tables in education, and to the general malaise of comparing “results” that are almost always measured against some arbitrary, often monetized, standard. Isn’t there some sort of cognitive dissonance in expecting people to care about process when they are measured only by results in a game where the stakes are incredibly high? But perhaps the shame is not in the error itself? Claxton suggests that it is also connected to a belief that people should get things right the first time. What he doesn’t address is where such unrealistic expectations come from, and how they might be connected to the same sorts of measures of success. For example, at present a company’s success is generally measured by sales, and by how much the executives and stockholders earn. Was that true fifty years ago? a hundred? is it true in every country? Factoring in things like dependence on public subsidies (whether outright or indirect), savings gained by paying below a living wage, re-investment in better technology and facilities (which also may have an impact on public expense, e.g., when refusing such improvements has a negative impact on the health of the local community), etc., shows a much closer relationship between success and effort. Similarly, looking at how much time and effort the most successful (and least successful) students put into their work — and for some, how much extra money their parents may have spent on private lessons, private transportation, tutors, etc. — and what other demands there are on their time, might help to re-set people’s perceptions of normality. It’s far harder to feel shame for not meeting the low bar when you realize that the low bar is exceptionally high.
There are doubtless many other factors I haven’t mentioned, and it’s likely that the ones I have mentioned might be cause for disagreement. My intention here is merely to show the sorts of issues and questions that anyone with a sound background in historical thinking might raise. Distrusting the deceptively simple demonization of the eraser: it’s one of the reasons we need history.
Bloggers meeting at 7:00 Thursday in whichever Valley houses the Eldridge-Fox registration-book room. Ask at the main info desk for the room number. Donations of food and beverage greatly appreciated!
It appears I have forgotten how to blog. I also have a paper draft ostensibly due tomorrow, and I have every kind of mental block going. I’ve got three blog post drafts, too, all unfinished. So now, I am writing, to prove that I can. Still. Write.
I have been thinking a lot about invisibility lately. Sort of. Part of it has to do with the higher visibility of excessive force used by law enforcement when dealing with disabled people. Part of it is that I have a few friends and acquaintances who have invisible physical ailments. Coeliac, fibromyalgia, migraines, encephalomyelitis/CFS… all sorts of things. I know these things are real, and I do my best to disabuse people of the notion that such illnesses are imaginary, or not serious, or not debilitating. I also know an awful lot of people with various mental illnesses, some more serious than others. You may recall that there have been a bunch of articles over the last few months about depression and the effects of stress on academics, etc. Lots of my acquaintance have linked to these pieces and more. Ironically, perhaps, but not entirely surprisingly, some of the people who are most outspoken about their invisible physical illnesses seem to be far less sympathetic about mental illnesses. I’m always slightly surprised at the amount of victim blaming that goes on, even as we claim to understand that mental illness is real, and comes in lots of different sorts, some of which are treatable and/or temporary, and some that are not. It also interests me how we treat various disorders, labeling some ‘serious mental illnesses’ and others ‘conditions that normal people have,’ or ‘disabilities’. By ‘we’, I mean people in general, the media, etc. So, for example, ADHD is a disorder. It can also be a disability, and as such, is covered by the Americans with Disabilities Act. But it’s not that common to hear people talking about ADHD as if it were some sort of illness. And somehow, perhaps because it’s more often diagnosed in children, it doesn’t seem to attract labels like ‘crazy’.
That doesn’t seem to be the case for many of the other sorts of pretty common forms of disorders that come under the umbrella of mental illness. Depression, anxiety, and stress disorders come in many different types. One can look a lot like another: diagnosis can require a lot of tests, and generally a lot of time. As far as I can tell, and this is just via personal observation and anecdata from friends and colleagues, it seems easier to diagnose more severe cases of things like depression than it is something like generalized anxiety disorder. It’s not surprising; after all, there are an awful lot of things out there that can cause a person to have trouble concentrating and affect short-term memory, including dehydration and lack of sleep. A couple of years ago, at what was close to a climax of a very stressful several years, I was fortunate enough to undergo all sorts of invasive tests and massively nasty medications as the doctors tried to figure out why my digestive system had gone to hell. They ruled out everything scary and still couldn’t figure out what was wrong with me. It was only by chance that I happened on an article about sleep requirements, and asked the doctor if she thought my inability to get more than about two hours of undisturbed sleep at a time might have something to do with it. Three weeks of sleep meds later, and all the symptoms were gone. Stress-related, or so they said. Get away from the stress, or learn to deal with it. Easy.
But what if it’s not easy? As at least one of the pieces I linked above notes, we tend to normalize stress and bad work or personal environments to a point where not being able to deal with it is seen as abnormal or weak. Normalizing stress also makes it difficult to think outside that framework. In other words, if an environment or relationship is known to be stressful, then it’s easy to assume that what is going on is just … stress. Most people aren’t trained to make psychiatric or psychological diagnoses, after all. We hear a lot, and are familiar with terminology, but words that seem synonyms for stress to a layperson might mean something else to a clinician, and vice versa. It might not even occur to a person to wonder if their inability to handle ‘normal’ stress is itself normal. They might indulge in some self-blame and try to hold it together. After all, everybody else seems to be doing so. Not everyone feels that way, though. Imagine the person knows both that the stressful environment is not normal AND that their reaction is something more than not handling the stress. That something more? is OMG mental illness. Slip a disc, and no one expects you to help with the heavy lifting. Diagnosed with some sort of mental disorder that makes it hard to handle certain situations? New can of worms, that is.
For those few people not on some form of social media beyond the blogosphere, take my word for it that not a week goes by without at least a few stories in your various timelines that are focused on enlightening people about what depression is like, and how it can’t be cured by Moar Willpower! or how disorder X is on the rise, or that there’s a new drug available for anxiety, or whatever. On a societal level, we seem much more willing to accept that these things are illnesses that can be mild or severe, and can be treated, and sometimes ‘cured’. But as individuals, we aren’t so good at it. People, and maybe even especially people who work with people and do thinky work, who are willing to talk about their mental health issues often take the risk of being blamed for a stressful environment or relationship — after all, we all knew it was sort of crazy, so all of the crazy must be the crazy person’s fault. Not surprising that many people try to make that part of their lives invisible to others. And face it, it’s fairly easy for most people. Everybody has a hard time coping with stress, right? as long as a person copes most of the time, it’s the stress that’s the problem. It’s sort of like my migraines: people know I have them, but they also know that most of the time I can take meds and keep going. For a long time, I was so good at hiding all but the worst of the migraines that even people who knew I got them, and knew that there were certain triggers, like strobe lights or rapid temperature changes, would regularly ask if I wanted to go clubbing. There are times we are complicit in hiding our illnesses. After many years, and meeting many other migraine sufferers, I finally stopped trying to hide them. Migraines may be invisible, but they are also Real.
It’s not the same with mental illness. It seems that for many people, Real = “so crazy anybody can see it.” The more invisible, the less real. Cope fairly well? prepare for a well-meaning friend or family member to challenge the diagnoses. This is not actually surprising, given how many people seem to think a five-minute test on a website can correctly identify anything! Nevertheless, even when the diagnosis has been made by an expert, an awful lot of people who aren’t experts are willing to ignore or contradict the expert and the person who has consulted the expert. People I know with coeliac or allergies or diabetes often face similar attitudes, but ignoring those sorts of invisible illnesses can result in very visible physical illness, and even death. Once that is made clear, only serious asshats will not keep the illness in mind and act accordingly, asking about acceptable foods, etc. – it’s amazing how epi-pens and insulin pumps can change a person’s attention to detail. In contrast, a person who has an invisible mental illness, especially if they have been receiving treatment for years, may be every bit as aware of things that will make them worse, or that might put them at risk of a panic attack, or send them into a depressive state. They may be very articulate about it, and even try to explain what’s going on, and how others can help to minimize potential setbacks (if you’ve seen As Good As It Gets, you’re on the right track, although that’s orders of magnitude beyond what I’m talking about). But despite that groundwork, the people who seem to be coping despite their invisible mental illness aren’t likely to go into sudden shock, or die, if people ignore their needs.
Couple that with general suspicions regarding the authenticity of illness in people who seem more or less fine, if stressed or a little down — not to mention that thinking about others and how their experiences and illness might shape their reactions to certain types of situations can feel like catering to someone who is just being difficult — and it’s not too hard to make the person as invisible as the illness.
I’ve got no real conclusion here. It’s just something that’s been rolling around in my head for a while. Inconsistencies and weird hierarchies of privilege will do that.
I realized this morning that I’ve somehow gone wildly astray in following one of my sabbatical plans. It’s not just astray in the sense of “not doing”, mind you; rather, it’s astray in the sense of completely forgetting and misunderstanding. Two of the things that I’d really hoped to do were to get back in the habits of reading and blogging. More precisely, I meant to read things I wanted to that I’ve not allowed myself the time for, whether they be in my field, blogs, or fiction. And I meant to blog more regularly. Instead, I’ve found myself feeling guilty about not reading the right stuff and writing the right stuff. In fact, I’ve felt so guilty I’ve started clenching my jaw and hiding from the world and doing things not at all related to work since giving the last presentation a couple of weeks ago.
So this is just a short reminder to myself that I am not only allowed to read things that make me think, but I should also be doing that. I should be blogging, because it allows me to engage my brain and compose. I write more effectively when I also blog. I know: you’ve all read this before. It’s just so easy for me to forget. But dammit, it’s June. I have only about ten weeks left. I need to get things done, but there is no reason I can’t decide what order to work on them, as long as I get finished those things I mean to have finished. Nor, apart from schedules imposed by institutional opening times, do I need to keep to anyone else’s schedule. So now, I’m going to read some stuff on gender in the MA. If I want to run before going to the library, I will run, even if it’s going to make me late to the library. If I want to follow a link to an article in the Grauniad, I will do that.
In short, I am going to try to allow myself to get sidetracked by things that are productive and healthy in the end, rather than finding distractions to keep from feeling bad about not being the right sort of productive.
Also… I’m working on a post that sort of ties together a lot of the various conversational threads happening on the internets since I’ve been on sabbatical, e.g., misogyny, safe places, SF/F, trigger warnings, etc. It’s turning out to be problematic on many levels, but I think I need to get it out of my system.
If you happen to hear of a case of spontaneous combustion on the news from London in the next 24 hours, it will be me.
If you happen to see me between now and tomorrow at about 15:00, please make sure I am awake and possibly offer chocolate.
I’m a little late to the party on this, having recently been at the Zoo and now madly working on what could be the most scary presentation in my life to this point; however, I want to take a little time to address a series of interesting posts, one of which really and truly pisses me off. I can’t be bothered to look back and find any number of other posts that talk about how academia screws graduate students, or how nobody warned them and now they’ve wasted their lives and money. I’m just… PEOPLE, what the FUCK are you thinking??
In case you’re a first-time visitor (it happens), I have spent time as an adjunct and VAP. Several years, in fact. I now have a full-time job. By many people’s estimations, my own included, I probably shouldn’t. I did a lot of things wrong during my (very long) time as a postgraduate, and I know a lot of people who have great qualifications, are great teachers, and still don’t have full-time and/or T-T academic positions after years on the market. I also know people who have left the profession after having got jobs many people would envy. I come from a family that several generations ago was largely made up of tradesmen and artisans who owned their own businesses, but for the last couple of generations has been pretty solidly clinging to the white-collar end of blue-collar labour. When I was a kid in the sixties, we were poor enough that I got free breakfast and lunch at school, and we waited in line to get government-issue fake spam (yes — generic SPAM), legumes, powdered milk, grains, etc. The neighborhoods we lived in during the sixties and seventies were full of other people on welfare, and we were one of only a couple of white families: our neighbors were African-American or Mexican-American, depending on the town.
And you know what? I have a PhD. I have a job doing what I love. And yeah, I’m in more debt than I’d like. I’m still paying off my student loans (although as someone who received Cal and Pell grants, plus a full ride for my MA/PhD program, plus two fellowships and some adjuncting after the university funding ran out, my total student loan debt was about $18k for three degrees). And I have not now, nor has anyone I know who has been an adjunct, been anything like a slave.
That’s right. Not at all like a slave. Nothing. Also, it’s nothing like indenture (although grad school? there’s an argument for that being indenture). It’s also not wage slavery. And if you think it is, you need to check your fucking privilege and join the real world, because you are no better, and no smarter than the undergrads who think their first jobs will pay $40k a year, and that they will have secretaries to make up for the fact that they can’t spell or write decent sentences. So as someone who has been there, I have a few comments on this whole “adjuncts are today’s slaves” metaphor and how it insults all of us, and worse — far, far worse — denies the suffering and humanity of those who were (or are) enslaved, especially in the slave systems of the modern world. And yes, much of this has been said before, but this is the internet, and multiple rants on the same topic are typical of internet discourse.
The reality of slavery
- Slaves are owned. Completely and utterly. They are property, chattel. Their lives and bodies are subject to the will and whims of their owners. They have no legal right to their families, to their relationships, to anything. They have no legal agency, and any agency they can exercise depends entirely on personal dynamics and contingent circumstances.
- Slaves have no other options. Period. They cannot choose not to be slaves. Their freedom, should they somehow acquire it, is entirely dependent upon the agreement of their owners. It is also, as in the case of race-based slavery as it existed in the US, entirely dependent upon the willingness of free white people to accept that they were free. Freed slaves could be, and sometimes were, re-enslaved, and there was very little that could be done about it, unless there was an advocate to ensure that the law was respected.
- Slaves are on call 24/7. Their time is owned and at the discretion of their owners.
- Slaves do manual labour, often of the backbreaking kind. There is no workman’s comp.
- Slaves have no realistic hope of getting a better gig with any sort of contractual protections or benefits. Face it: slaves know that the hope for a life that isn’t actually slavery is unrealistic.
What is an Adjunct?
- Adjuncts choose to be adjuncts. They do. It’s completely, entirely their choice.
- Adjuncts have agency. They can leave. They can reject a gig. They can, if it’s too much of a strain, change fields.
- If an adjunct leaves, neither their freedom nor their lives are in danger. No one can force them to go back to being an adjunct.
- Adjuncts can marry who they wish. They can have children. Their bodies belong to them, and to them alone. When they have spare time, they can use it as they like.
- Adjuncts can maintain some degree of hope — and even realistic hope — that they will some day be employed in benefited T-T positions. It happens, and not just as some sort of urban legend.
- Adjuncts have advanced degrees. That’s presumably more knowledge, skills, and experience than people with undergraduate degrees or no higher education at all have. They know how to learn, and they can read, write, do experiments, run labs, and all sorts of things that are needed outside academia — and can pay as much or more than academic jobs. Adjuncts have options.
They just don’t like those options.
That sucks, doesn’t it?
Being an adjunct is not like being a slave. Being an adjunct is like having to have a single scoop of vanilla in a cup when what you thought you were going to get was a full-on coffee-Heath Bar fudge sundae fixed just like you wanted. It’s still ice cream, and if you don’t want vanilla, you’ve got a really good chance of getting pie or cake or tiramisu with a nice glass of cognac and coffee somewhere else. Hell, even if being an adjunct means you’re eating ramen at every meal, the worst alternative if you leave the field still means you get to eat in chain restaurants with silverware.
I accept that most of the time, it is not what an adjunct wants, except in the sense of wanting to stay in academia and taking a job of any kind just to do so. It’s often a desperate choice. It’s often a choice people make thinking that there is no other viable option. That’s part of why this whole thing sucks, really. The overall culture of traditional academe is absolute shit at letting people know they have options. But look across campus: do you see the students in the professional schools freaking out about getting full-time jobs? No. More to the point, unlike our professors and mentors, theirs have convinced them that they can carry a full-time faculty member’s workload AND bring in a nice second income with consulting gigs, running their own businesses and practices, etc. Of course, they often have lower teaching loads… but I digress.
The thing is, the people who go into professional schools may be training for a profession, but their training also opens the door for academic jobs, if they want them. Professional students learn to judge their future value in the working world. I’m not talking on an individual basis, mind you; I’m just contrasting what I see as a major difference between academic postgraduate studies and professional programs. Moreover, there’s a sort of pervasive sense of disdain for people who leave academia — despite all evidence to the contrary, survivor guilt makes those of us who have positions want to believe that we somehow deserved them. The idea of meritocracy survives because so many of us suffer from impostor syndrome. It doesn’t help that we all know better qualified people who are still out there looking.
Yep. There are lots of excellent people who don’t have jobs. It’s been that way for as long as I can remember. I sometimes wonder if one of the reasons all of my Doktorvater’s PhD students who wanted jobs have them is that he’s frankly a little weird for an academic. He never seemed to hold it against students who decided to leave. He never criticized us for making the sorts of life choices that other advisors warned against. For DV, it was pretty much always, “it’s your life — you’re an adult and you aren’t here to please me, except as far as I’m advocating for you in the department and on the market.” I realize that’s a rare thing. But anyway, I’m fully aware that there are people who are better teachers, better researchers, better writers, who generally have their shit together more than I do, and they don’t have jobs. There are also plenty of people out there proving daily that sometimes, all it takes is being glib, super-self-confident, good at interviews, and well-pedigreed to get a very good job, despite being less than stellar at any of the things we are supposed to do. But the adjunct situation isn’t a result of the wrong people getting jobs. It just isn’t.
Adjuncts have a choice. It’s not a great choice. But to be honest, even those of us who managed to get full faculty positions didn’t choose between job offers.
In any given year, we compete for positions against a lot of very well-qualified peers. There are always fewer jobs than candidates. A lot of those jobs aren’t in places we want to be, or don’t have teaching loads we like, or they force us to negotiate difficult family decisions. It’s often the choice between a job and no job. For adjuncts, it’s much worse. The lack of choice is magnified, and the consequences are, too. It’s wearing, and it’s hard. It’s demoralizing. Adjuncts desperately cobble together teaching gigs and often other part-time work to make ends meet and scrabble with torn and bloody fingernails to stay on the fringes of academia, never wholly included, yet never able to let go.
IT IS NOT SLAVERY. IT IS NOTHING LIKE SLAVERY.
It’s also not like indentured servitude, wage slavery, or sharecropping. Seriously. No matter where you came from, if you have a PhD, then you need to recognize your privilege. (Let’s just take it as read that I am not saying that it’s an equalizer across the board. It’s an intersectionality thing, ok?). You went to a university and got a good enough education to be accepted for postgraduate work. You got to spend even more time doing what you love. You learned stuff — and probably didn’t pay your whole way for it. No one can take that away from you. You have resources to draw on that will give you advantages in any pursuit you choose. But you have a choice, and you have agency. No one owns you, and no one is forcing you.
So is there a better metaphor? Because being an adjunct sucks, and adjuncts are treated badly and oppressed, and the system is really unethical, and I need a metaphor that can convey just how shitty I feel without acting like a complete dick about other people’s truly horrific experiences.
Actually, yeah. I’ve got it covered. How about …
Being an adjunct is like being trapped in a really bad marriage. You’ve invested your youth, your energy, you’ve made personal sacrifices and probably are stuck in a deeper financial hole than you ever imagined. If you leave, you may be cutting off ties with family, friends — at the very least, you may have to say goodbye to a part of your life that you truly love. Or you may not. Some people manage to leave the teaching track and still stay connected, just like some people manage to keep their in-laws. It’s scary to think of leaving, but staying is absolutely miserable. Every day hurts, and you keep thinking it will get better. And you know? you can continue to tell yourself that at least your marriage is still together, and you’re not single. You’ve got presentations and publications, which you can show off like pictures of your kids to remind yourself the time wasn’t wasted. And who knows? maybe things will get better, or you’ll meet someone new.
Being an adjunct is like having worked your way up in a company you joined right out of high school, gradually taking on more responsibility, getting more power, decent raises, learning all the ins and outs of the business. You’ve got a mortgage, kids getting ready for college, but every year you get a raise, and you’ve even got a union contract that looks like it will keep you secure through old age, even if you retire at 62. Of course, you’ve only ever worked in that one industry, that one company. And then the economy went to shit, and the company fell to a hostile takeover, and in order to keep younger people on, the union was forced to agree to a new contract and a couple of plant closures. There you are, in your 40s, and you have no job, no way to pay the mortgage because unemployment is way too little, and you need to re-train, but the government says you have to take an almost entry-level job to qualify for any benefits. Everything you thought you knew about your life is gone.
Oh wait. Being an adjunct isn’t quite like that, because not being able to get steady work doing what you want and what you were trained for isn’t really the same as being laid off and having to find a new career. Trust me, I’ve done that. Being an adjunct is a choice. Now, if you end up changing careers because you can’t find work in your field? yeah, it’s kind of like being laid off and starting again.
So do you have an even better metaphor?
How about this one?
Being an adjunct is like being any other person who has trained and invested a big chunk of their life in a career, especially a career that a person sets their heart on having, only to find out that there are just not enough jobs for everybody in your line of work. So like millions of other people who do this in many fields, you find yourself trying to stay in the same career, even if it means part-time work, no benefits, and crap pay. Like millions of others, you know that every day is a gamble: will you get that permanent job with benefits? or will you just dig yourself into a deeper hole? You know it’s taking a toll on you and your family. You don’t want to give up, but you don’t know how you can go on.
You know, since the last big economic crash, this has been the story of many, many people. I’m not saying adjuncts should be happy with their lot. The contingent faculty situation is appalling, and all faculty have a duty to fight to fix the system. It is a system that is morally and ethically bankrupt, and is detrimental to the university community and to higher education as a whole. The only people who should be adjuncts are those who really do choose to teach part-time, because they are retired or have another full-time job.
But frankly, when I see essays like the one that riled me up enough to post this, I lose some of the will to fight. There’s something about comparing being an adjunct to slavery, when being an adjunct is literally no worse than the employment situation faced by millions of qualified people who are out of work and trying to find something comparable to their last job, and far better than the situation of those millions who have lost jobs and are grateful to have managed to get part-time work at Walmart and the local donut shop, that makes me want to say, “You know, it’s a PhD, not a guarantee. Also? you should probably go watch The Princess Bride again. Because…
UPDATE: I meant to include this Very Useful Post by David Perry. Also fixed a link above.