Barking Abbey

Interesting article by Eleanor Parker on History Today:

The Cultured Women of Essex

We should take more notice of the work of those once despised and disregarded.

‘It is asked of all who hear this work that they do not revile it because a woman translated it. That is no reason to despise it, nor to disregard the good in it.’ Many female writers have probably said, or wanted to say, something very like these words. They were written in the 12th century, around 1170, by a woman who composed one of the earliest texts from England known to be by a female author. She was a nun of Barking Abbey in Essex and, though we do not know her name, her words – and her work – demand attention.

The work she asks us not to disregard is a narrative of the life of Edward the Confessor, written in Anglo-Norman French (‘the false French of England’, the nun modestly calls it). Its author was an educated woman, able to turn a Latin source into engagingly chatty French verse, and Barking Abbey must have been a congenial environment for her. Founded in the seventh century, Barking was one of the foremost nunneries in the country, a wealthy abbey which was home to many well-connected aristocratic and royal women. Its abbesses were frequently appointed from the sisters and daughters of kings and, around the time our nun wrote her Vie d’Edouard le Confesseur, Thomas Becket’s sister Mary – herself a woman of literary interests – was made abbess of Barking in compensation for her brother’s murder.

Across its long history of more than 850 years, Barking Abbey was a centre for women’s learning. It has been described as ‘perhaps the longest-lived … institutional centre of literary culture for women in British history’ and it had a strong literary and scholarly tradition that spanned the Middle Ages. In the early medieval period, authors such as Aldhelm and Goscelin of St Bertin wrote learned Latin works for the nuns of Barking; later, several nuns composed their own poetry and prose – even their own plays. In the 12th century, when women were increasingly becoming patrons, readers and, in some cases, authors of literary texts, Barking produced more than one talented writer. The first female author in England whose name we know, Clemence of Barking, was a nun there; she wrote an accomplished Life of St Catherine of Alexandria, a saint associated with female learning.

Read the whole thing, and a follow-up blog post about it. A choice excerpt:

I’m a UK academic writing primarily for UK audiences (not that I’m not glad to have other readers too!), but online those distinctions are blurred; other academics will pass judgement, from half a world away, on conversations they only half understand, and some of them are very resistant to the idea that in different contexts it might be necessary to speak in different languages, to ask and answer different questions. Even the basic idea that words have different connotations in different varieties of English seems to surprise them. In their particular cultural context, medieval history intersects with questions of identity and exclusion in very different ways, and they won’t listen to anyone who tries to tell them things don’t operate like that everywhere in the world. We all have to do what seems right to us in our own context, and I’m sure they are trying to do that; I only wish they were prepared to consider that the rest of us are trying to do the same, just not in the same way. Some feel entitled to demand that every discussion which touches on ‘their’ subject should address their own immediate social and political concerns – not those of (for instance) the people of Barking, of whose existence they are so loftily unconscious. Some of these people also display a deeply exclusionary view of academic status and the privileges it confers on them, and an attitude little better than contempt for the public at large; if you don’t have a doctorate, you’re not worthy of their time or attention. I’ve been observing this tendency for several years, but it’s particularly noticeable at the moment. Since these academics don’t follow British and Irish politics, they really can’t see why this is such an especially bad time to be making pronouncements on how to use words like ‘English’ and ‘British’, without any understanding of the contemporary sensitivities surrounding those terms, and they seem completely unaware of the wider social context in which UK medievalists have to consider the issue of public engagement. I think some of them truly would prefer it if they could stop the public taking any interest in medieval history at all, because that interest is, to them, always inherently problematic; but while they can decide for themselves if that’s the case in their own countries, it’s absolutely out of the question here. 

Majoring in History

From The Conversation:

Don’t despair if your teen wants to major in history instead of science

It might be your worst nightmare. Your child, sitting at the kitchen table, slides you a brochure from the local university.

“I’ve been thinking of majoring in history.”

Before you panic and begin calling the nearest computer science department, or worse, begin to crack those tired barista jokes, hear me out. This might just be the thing that your child, and our society, needs.

Choosing to become a history major is a future-friendly investment. A history degree teaches skills that are in short supply today: the ability to interpret context and — crucially — to understand where we’ve been, so as to better understand the world around us today and tomorrow.

We’ve never needed knowledge of history and the skills that come with the discipline more than we do now. Not only is it a good choice of major for all the usual selfish reasons — you’ll likely get a good job, even if it takes a bit longer than in the STEM disciplines, and, more importantly, you’ll probably be very happy with it.

But for our society more generally, we need a generation with deep capacities to acknowledge context and ambiguity. This idea of ambiguity not only pertains to interpreting the past based on a diverse body of incomplete sources, voices and outcomes, but also to how our contemporary judgements of that record shape our choices today.

Our whole society hurts when students turn their back on history. A sense of history — where we have come from, the shared anchors of democratic society, the why and how of our current moment in time — is critical.

Read the whole thing.

History In and Out of the Classroom

From The Federalist:

Americans have a hunger to understand, explore, and connect with their history. Richly sourced, intellectually demanding accounts of the country’s defining moments and characters do more than break through the noise.

Indeed, historians are probably the scholars most celebrated outside the confines of the academy. They are among the few who shape our cultural landscape—from a place of learning. As though to prove the point, [Ron] Chernow’s 832-page 2005 biography of Alexander Hamilton, also a New York Times best-seller, inspired the most talked-about Broadway musical in a generation. Only on the American college campus is American history in retreat.

How strange it is that U.S. colleges and universities are abandoning the study of American history and, at some institutions, the study of history altogether. The American Council of Trustees and Alumni evaluates the general education programs of more than 1,100 colleges and universities every year. The 2018–19 report found that only 17 percent of them required any kind of foundational course in American history or government. As of 2016, only four out of the top 25 national universities (as ranked by U.S. News and World Report) required a course in U.S. history for their history majors.

In this light, it is perhaps unsurprising that history programs in the United States are struggling to generate student interest. The American Historical Association drew attention to cratering undergraduate degree production last year: the number of history degrees awarded annually has fallen almost 34 percent since 2011, more steeply than in any other discipline in the liberal arts.

This is true even at my alma mater, Dartmouth College, where I attended my 25th reunion last weekend.

Previous thoughts on the matter.

UPDATE: This is the 1000th post published on FFT since the blog’s inception in September 2014!

History is History

Here is another article on a recent theme:

A recent study confirms a disturbing trend: American college students are abandoning the study of history. Since 2008, the number of students majoring in history in U.S. universities has dropped 30 percent, and history now accounts for a smaller share of all U.S. bachelor’s degrees than at any time since 1950. Although all humanities disciplines have suffered declining enrollments since 2008, none has fallen as far as history. And this decline in majors has been even steeper at elite, private universities — the very institutions that act as standard bearers and gate-keepers for the discipline. The study of history, it seems, is itself becoming a relic of the past.

It is tempting to blame this decline on relatively recent factors from outside the historical profession. There are more majors to choose from than in the past. As a broader segment of American society has pursued higher education, promising job prospects offered by other fields, from engineering to business, have no doubt played a role in history’s decline. Women have moved in disproportionate numbers away from the humanities and towards the social sciences. The lingering consequences of the Great Recession and the growing emphasis on STEM education have had their effects, as well.

Yet a deeper dive into the statistics reveals that history’s fortunes have worsened not over a period of years, but over decades. In the late 1960s, over six percent of male undergraduates and almost five percent of female undergraduates majored in history. Today, those numbers are less than two percent and one percent. History’s collapse began well before the financial crash.

This fact underscores the sad truth of history’s predicament: The discipline mostly has itself to blame for its current woes. In recent decades, the academic historical profession has become steadily less accessible to students and the general public — and steadily less relevant to addressing critical matters of politics, diplomacy, and war and peace. It is not surprising that students are fleeing history, for the historical discipline has long been fleeing its twin responsibilities to interact with the outside world and engage some of the most fundamental issues confronting the United States.

More at the link.

I’m not quite sure that I agree with his critique. First, it’s important to note that just because the number of history majors has declined does not mean that the study of history itself has declined – history remains part of most general education requirements, and of course people can minor in history while majoring in something else. And that something else, as noted, has to start paying off immediately.

But I don’t think that the popularity of the history major has much to do with academic historians’ engagement with the public sphere. My impression is that most high school and/or college students, if they are interested in history, are interested in the history itself. They don’t know who the big names are, or whether these people actually hold academic appointments (I certainly didn’t). There is still lots of popular history out there, whether in the form of books, television shows, movies, or video games (some of which, I feel compelled to state, is even produced by real academics, and most of which is based on work that they’ve done).

Or is the idea that popular and academic history have diverged too much in their concerns? Picture a young man who is interested in World War II on account of watching Inglourious Basterds and playing a lot of D-Day, who comes to university only to discover that the only course offered on World War II focuses heavily on Rosie the Riveter, race relations in southern factories, and Japanese internment. These are legitimate topics, of course, but you can see why some students might consider them of secondary importance to the actual war itself. I don’t think that this is as much of a problem as people claim, but I admit that it can be a problem. History ought to be a house of many mansions, and any substantial history department should (and generally does) try very hard to recruit members who specialize in different times and places… and approaches, including diplomatic and military history. But what happens if there are no applicants in those latter fields? This is a function of competitive convergence – there are so many would-be academics chasing so few jobs that everyone specializes in something they think will win them employment, something designated “hot,” “up to date,” and “relevant.” No one is willing to take a risk on specializing in a niche interest. This is a shame.*

But I don’t know what can be done about it, except for all AHA member departments to make a pact to severely limit the number of Ph.D. students they take on, in the interests of loosening up and (paradoxically) diversifying the job market – even if it means that some of the professors will have to do some of their own grunt work, which means that it will never happen. Failing that, I do think that an expansive spirit is required, a deeply ingrained (and enforced, if need be) principle that the same right that you have to do your thing guarantees someone else’s right to do her thing, i.e. a spirit of genuine diversity, whereby no one’s approach is better than anyone else’s. Alas, if team intersectional gets its way, very soon there will be a sort of Bechdel test for the acceptance of scholarship in Speculum or panels at Kalamazoo, because these people require constant validation; it’s never enough that they get to do what they want to do – you have to do it too. The totalitarianism behind this impulse has always annoyed me. Live and let live, eh?

* Robert Hughes (in his Culture of Complaint) quotes the critic Louis Menand:

most of the academic world is a vast sea of conformity, and every time a new wave of theory and methodology rolls through, all the fish try to swim in its direction. Twenty years ago every academic critic of literature was talking about the self, its autonomy and its terrible isolation. Today not a single respectable academic would be caught dead anywhere near the word, for the ‘self’ is now the ‘subject’ and the subject, everyone heartily agrees, is a contingent construction… what ought to be most distressing to everyone is the utter predictability of the great majority of the academic criticism that gets published.

He’s talking about literary study, but it applies in a lesser way to history too.

Our Appeal Has Become More Selective

Some sobering news from IHE:

History has seen the steepest decline in majors of all disciplines since the 2008 recession, according to a new analysis published in the American Historical Association’s Perspectives on History.

“The drop in history’s share of undergraduate majors in the last decade has put us below the discipline’s previous low point in the 1980s,” reads the analysis, written by Benjamin M. Schmidt, an assistant professor of history at Northeastern University.

Some numbers: there were 34,642 history degrees conferred in 2008, according to federal data. In 2017, the most recent year for which data are available, there were 24,266. Between 2016 and 2017 alone, there was a drop-off of 1,500 majors. And even as overall university enrollments have grown, “history has seen its raw numbers erode heavily,” Schmidt wrote, especially since 2011-12.

“Of all the fields I’ve looked at, history has fallen more than any other in the last six years,” he says. The 2012 time frame is significant, according to the analysis, because it’s the first period in which students who experienced the financial crisis could easily change their majors.

The data represent a “new low” for the history major, Schmidt wrote. While a 66 percent drop in history’s share of majors from 1969 to 1985 remains the “most bruising” period in the discipline’s history, that drop followed a period of rapid enrollment expansion. The more recent drop is worse than history’s previous low point, in the 1980s.

I think that one of the main reasons for the decline in the history major is that university tuition fees have risen continually, far beyond the rate of inflation, so that students, of necessity, must see university as a financial investment that needs to start paying off immediately, rather than as an incubator of cultural literacy, informed citizenship, and a personal life philosophy, as it may once have been. I am not saying that history majors can’t perform well in a wide variety of jobs – they can, precisely because they can conduct research and present it coherently – it’s just that they have to overcome certain hurdles before they can convince people to hire them. I would not discount the politicization of the discipline, although this is not nearly as bad as some commentators would like to suggest (the profession as a whole might lean to the left, but you can always find professors who keep their politics to themselves, or who are even conservative). But I take consolation in the fact that our appeal really is selective: to do history properly you need intelligence and motivation, literacy and hard work. These qualities are less common than you might imagine.

“Whitesplaining”

From the Chronicle of Higher Education:

The Whitesplaining of History Is Over

When the academy was the exclusive playground of white men, it produced the theories of race, gender, and Western cultural superiority that underwrote imperialism abroad and inequality at home. In recent decades, women and people of color have been critical to producing new knowledge breaking down those long-dominant narratives. Sociological research confirms that greater diversity improves scholarship.

Yet the struggle to diversify the academy remains an uphill battle; institutional biases are deeply ingrained, and change evokes nostalgia for times past. Both of these obstacles were fully in evidence at a recent Applied History conference at the Hoover Institution at Stanford University. Although history is a discipline with a growing number of nonwhite faculty members, and a healthy percentage of female scholars — indeed, women constitute more than a third of the faculty in Stanford’s own history department, across the bike lane from the Hoover Institution — the Hoover conference was made up of 30 white men (and one woman, who chaired a panel).

Etc.

This sort of critique is becoming all the more common in my profession (the article above was approvingly linked by two friends on Facebook), and I hate it. I hate the jargon (“whitesplaining”) and glibness (“exclusive playground”) – but most of all I hate the Jacobinism of it, how anything produced by “white males” in the olden days is necessarily tainted, while anything “diverse” is necessarily better (the link goes to a book entitled The Diversity Bonus: How Great Teams Pay Off in the Knowledge Economy, which, as anyone who has spent time in the world of work can attest, is no more true than its opposite*). Before we learned to care about the identity of the author as a cue for how we should respond to his ideas, it was possible for an idea to be considered largely on its merits, and I sure wish we could return to that dispensation. The suggestion that all those bad old white males produced scholarship to justify Western imperialism, etc., is contradicted by the author’s own example of E.P. Thompson, a white male who said things that she apparently agrees with (one can think of any number of others, like Charles Beard or Marc Bloch). But more importantly, how dare Priya Satia dismiss the work of (almost) everyone who came before her because they weren’t diverse enough for her tastes? Presumably they were men of integrity, who investigated the past to the best of their ability and who opened up new vistas in human understanding. Just because their race and gender are distasteful to her is no reason to preemptively dismiss their entire body of work.

But if present trends continue, this essentially adolescent pose will be with us for some time to come.

* As I wrote once: “You definitely need something in common – intelligence and a sense of modesty come to mind. Furthermore, it all depends on the purpose of your organization. Sometimes when everyone’s on the same page, sharing the same background assumptions, then you can achieve your goals much more efficiently. The notion that different people with different opinions really have something special to offer could have value, but what is the nature of those opinions? So often “diversity” just boils down to “skin color,” “configuration of genitals,” or “direction of erotic desire,” with any “opinions” that derive from these things being completely irrelevant to the vast majority of problems to be solved or tasks to be completed in the wonderful world of work; worse, there is a very real possibility that the people concerned can be indifferently competent but have massive chips on their shoulders about how allegedly oppressed they are, and will interpret every difficulty as proceeding from some amorphous but entrenched prejudice arrayed against them. This is not conducive to getting anything done.”

UPDATE: Turns out the Hoover Institution conference was organized by the great Niall Ferguson, who responds:

“Masculinity, not ideology, drives extremist groups,” was another recent headline that caught my eye, this time in The Washington Post.

Got it.

I have had to listen to a variation on this theme rather too much in recent weeks. Last month I organized a small conference of historians who I knew shared my interest in trying to apply historical knowledge to contemporary policy problems. Five of the people I invited to give papers were women, but none was able to attend. I should have tried harder to find other female speakers, no doubt. But my failure to do so elicited a disproportionately vitriolic response.

Under a headline that included the words “Too white and too male,” The New York Times published photographs of all the speakers, as if to shame them for having participated. Around a dozen academics — male as well as female — took to social media to call the conference a “StanfordSausageFest.”

So outraged were Stanford historians Allyson Hobbs and Priya Satia that they demanded “greater university oversight” of the Hoover Institution, where I work, as it was “an ivory tower in the most literal sense.”

The most literal sense?

Now let’s be clear. I was raised to believe in the equal rights of all people, regardless of sex, race, creed, or any other difference. That the human past was characterized by discrimination of many kinds is not news to me. But does it really constitute progress if the proponents of diversity resort to the behavior that was previously the preserve of sexists and racists?

Publishing the names and mugshots of conference speakers is the kind of thing anti-Semites once did to condemn the “over-representation” of Jewish people in academia. Terms such as “SausageFest” belong not in civil academic discourse but on urinal walls.

What we see here is the sexism of the anti-sexists; the racism of the anti-racists. In this “Through the Looking Glass” world, diversity means ideological homogeneity. “The whitesplaining of history is over,” declared another heated article by Satia last week. Hideous Newspeak terms such as “whitesplaining” and “mansplaining” are symptoms of the degeneration of the humanities in the modern university. Never mind the facts and reason, so the argument runs, all we need to know — if we don’t like what we hear — are the sex and race of the author.

The process of indoctrination starts early. My six-year-old son stunned his parents the other day when we asked what he had been studying at school. He replied that they had been finding out about the life of Martin Luther King Jr. “What did you learn?” I asked. “That most white people are bad,” he replied.

This is America in 2018.

Linkage

• From IHE: History is Hot! Although the author does praise the sort of activism that I disparage in this post, it is heartening to read paragraphs like this:

One obvious way is the rise in visibility. Many young Americans may, for the first time, be hearing from historians and be seeing them on a regular basis in major news media outlets. Historians certainly appear in the press all the time, but the difference now is the stage. During a presidential election, nearly all of America is paying attention to media, and particularly in such a divisive and unusual election as this one. It is an especially good time to be visible.

While being visible, we also can demonstrate the core values of our profession. We can continue to showcase the dispassionate wisdom and clarity of thought that is treasured by those of us in the discipline and sought by those outside it. In a climate of constant shouting and bickering, contemplative thought may not be everyone’s cup of tea. But it can offer a refreshing alternative and inspire younger folks that they, too, can be an impactful voice of reason when America needs it most.

• From the Guardian (originally the Chronicle of Higher Education): “Uncovering the brutal truth about the British empire” – an article on Caroline Elkins’s heroic investigation of the British fight against the Mau Mau insurgency in Kenya – yes, it involved detention camps and torture, contrary to the official line (although be sure to check out the section on criticism of Elkins’s work).

Historians Against Trump

Ron Radosh (on PJ Media) says something I happen to agree with. Major excerpts:

Big Surprise? There Is Now a ‘Historians Against Trump’ Group

As they once did in their protests against the Vietnam War, American academic historians are now trying to use their positions in academia to present “scholarly” reasons why Donald Trump must not be president of the United States. They have formed a group called “Historians Against Trump” (HAT), since obviously “Historians Against War” was not appropriate for this salvo.

Their “Open Letter to the American People,” published on their website, is one of the most arrogant, pretentious pieces of claptrap they could possibly have written. Why have they written this letter? This is their reason:

“Historians understand the impact these phenomena have upon society’s most vulnerable and upon a nation’s conscience. The lessons of history compel us to speak out against a movement rooted in fear and authoritarianism. The lessons of history compel us to speak out against Trump.”

I am no fan of Donald Trump, and I am not going to vote for him this election, but their argument does not stand up. First, what if there was a large group of conservative historians in the academy who decided to write an open letter about the election, claiming “the lessons of history” as their reason for arguing we should vote Republican? The HAT would no doubt loudly condemn them for using the fact that they are professional historians with Ph.D.s as the reason they should be listened to….

In response to their letter, Professor of Law Stanley Fish has written a column in this Sunday’s New York Times. Mercilessly slashing all of their arguments, he boils them down to noting that, in effect, all they are saying is “We’re historians and you’re not,” and hence they are obliged to inform Americans that the lessons of history tell us Trump should not be elected. Their having Ph.D.s is no warrant for equating “an advanced degree with virtue.” Fish writes:

“By dressing up their obviously partisan views as ‘the lessons of history,’ the signatories to the letter present themselves as the impersonal transmitters of a truth that just happens to flow through them. In fact they are merely people with history degrees… [which] does not qualify them to be our leaders and guides as we prepare to exercise our franchise in a general election. Academic expertise is not a qualification for delivering political wisdom.”

As a historian who has been fighting this good fight for too long a time, I fully agree with Fish that historians should not as historians be making “political pronouncements of any kind.” In trying to “invest their remarks with the authority of their academic credentials,” as Fish puts it, they are forfeiting the very sine qua non of what being a historian means. The long years of study and the skills they acquired, which earned them advanced degrees, do not come with the right to use those degrees to tell Americans how to vote.

Why does this not go absolutely without saying? Of course, people have the right to oppose Trump, as vociferously as they want. Like Radosh and Fish, though, I am chary of historians pretending that their profession gives them special insight into current politics – or rather, I am amazed that these wise, Olympian understandings always seem to be “liberal” in nature when, like all political positions, they are often no more valuable or true than their opposites. And I especially dislike it when these groups manage to get the American Historical Association or other ostensibly nonpartisan, professional organizations to endorse their points of view. We saw this ten years ago at the annual meeting of the AHA in Atlanta. As I wrote at the time:

Even during the Vietnam war the AHA would not pass an anti-war resolution, but now, be it resolved:

“that the American Historical Association urge its members through publication of this resolution in Perspectives and other appropriate outlets:

  1. To take a public stand as citizens on behalf of the values necessary to the practice of our profession; and
  2. To do whatever they can to bring the Iraq war to a speedy conclusion.”

Perhaps it passed because it doesn’t actually say that “The AHA condemns this war,” but still… it’s annoying when you discover that you’re still in college, with the student government earnestly passing sophomoric resolutions on your behalf. Wankers.

(See Radosh’s article about Eugene Genovese’s successful opposition to an anti-war resolution in the 1960s.)

Later on in 2007, still steamed, I elaborated:

Here is the resolution, in all its inanity:

“Whereas the American Historical Association’s Professional Standards emphasize the importance of open inquiry to the pursuit of historical knowledge;

“Whereas the American Historical Association adopted a resolution in January 2004 re-affirming the principles of free speech, open debate of foreign policy, and open access to government records in furthering the work of the historical profession;

“Whereas during the war in Iraq and the so-called war on terror, the current Administration has violated the above-mentioned standards and principles through the following practices:

“excluding well-recognized foreign scholars;

“condemning as “revisionism” the search for truth about pre-war intelligence;

“re-classifying previously unclassified government documents;

“suspending in certain cases the centuries-old writ of habeas corpus and substituting indefinite administrative detention without specified criminal charges or access to a court of law;

“using interrogation techniques at Guantanamo, Abu-Ghraib, Bagram, and other locations incompatible with respect for the dignity of all persons required by a civilized society;

“Whereas a free society and the unfettered intellectual inquiry essential to the practice of historical research, writing, and teaching are imperiled by the practices described above; and

“Whereas, the foregoing practices are inextricably linked to the war in which the United States is presently engaged in Iraq; now, therefore, be it

“Resolved, That the American Historical Association urges its members through publication of this resolution in Perspectives and other appropriate outlets:

  1. To take a public stand as citizens on behalf of the values necessary to the practice of our profession; and
  2. To do whatever they can to bring the Iraq war to a speedy conclusion.”

Of course I have no problem with people who oppose the war, but I would really appreciate it if they would speak for themselves, or form groups for the specific purpose of opposing the war, rather than trying to shanghai the rest of us into taking their position. Yes, the price of liberty is constant vigilance, and I, and as many people as I could have mustered, should have gone to the business meeting and spoken out and voted against this resolution. But it would be really nice if I could take it for granted that I didn’t have to do such a thing.

In the world in which I would like to live, people know their place, and would be deeply ashamed of the bloody rudeness of taking a group that is ostensibly a professional association for historians, and trying to turn it into an activist group opposed to the war in Iraq. “Oh, but this issue is too important for such considerations of bourgeois propriety!” they claim. No, it isn’t. Despite the deepest, most self-dramatizing desires of these people, we are not facing the imminent fall of the Constitution and the imposition of martial law in favor of some neo-Nazi regime. Oppose the war by all means, but leave the rest of us out of it!

This really is college-sophomore stuff – like “jeans day,” when you are to show your solidarity with homosexual rights by wearing jeans, or so proclaim the few posters here and there about campus, put up the day before the event. So you wear jeans like you do all the time, and no matter how you may feel about gay people, you find that you are cast as supporting them! (Ha ha, caught you!)

And no, I don’t find the logic of this resolution very compelling. The attempt to link opposition to the war with the practice of history is about as true as a resolution reading: “Whereas we are distracted because we don’t know where the terrorists are going to strike next, and whereas the violent homophobia and misogyny of Wahhabi Islam are deeply offensive to us, Be It Resolved That the AHA supports President George W. Bush in the Global War on Terror.” Something tells me that a resolution like that is not going to pass any time soon, because we are dealing with American myopia.

Some people can’t get visas to come to the United States, and some documents are being reclassified, and some prisoners were abused at Guantanamo and Abu Ghraib! How terrible! OK, but why not compare these things to, for example, the sort of things enumerated in this article. Opening paragraph:

“Academics who study China, which includes the author, habitually please the Chinese Communist Party, sometimes consciously, and often unconsciously. Our incentives are to conform, and we do so in numerous ways: through the research questions we ask or don’t ask, through the facts we report or ignore, through our use of language, and through what and how we teach.”

Here is a perfect example of government policy specifically curtailing the practice of history. Will the AHA pass a resolution condemning this? (And, while we’re at it, condemning China’s atrocious human rights record as being “incompatible with respect for the dignity of all persons required by a civilized society”?) Fat chance: what we do is evil, what they do is “their culture.”

What really gets me, though, is when my fellow professors can’t keep their damned liberal opinions to themselves, and shout them in socially inappropriate venues, and are then surprised when state legislatures want to cut their funding, or propose affirmative action programs for conservatives. They simply have no idea where such things came from! Help, help, we’re being oppressed! (Forget college sophomores – these are high school sophomores! The Holy Grail of being a teenager – being yourself, and being accepted for being yourself. But if you remember from high school, very few people actually got to do this; the rest of us had to choose between compromising our “selves” to fit in, or adhering to them and being ostracized. But to demand the right to spout your ideology while being cherished and affirmed for it… what wankery!)

Notes About Notes

Last year I quoted the Hamilton College History Department guide to writing good history papers. It contains a solid justification for why historians cite their sources in footnotes and not in parenthetical citations.

Your professor may allow parenthetical citations in a short paper with one or two sources, but you should use footnotes for any research paper in history. Parenthetical citations are unaesthetic; they scar the text and break the flow of reading. Worse still, they are simply inadequate to capture the richness of historical sources. Historians take justifiable pride in the immense variety of their sources. Parenthetical citations such as (Jones 1994) may be fine for most of the social sciences and humanities, where the source base is usually limited to recent books and articles in English. Historians, however, need the flexibility of the full footnote. Try to imagine this typical footnote (pulled at random from a classic work of German history) squeezed into parentheses in the body of the text: DZA Potsdam, RdI, Frieden 5, Erzgebiet von Longwy-Briey, Bd. I, Nr. 19305, gedruckte Denkschrift für OHL und Reichsleitung, Dezember 1917, und in RWA, Frieden Frankreich Nr. 1883. The abbreviations are already in this footnote; its information cannot be further reduced. For footnotes and bibliography, historians usually use Chicago style. (The Chicago Manual of Style. 15th edition. Chicago: University of Chicago Press, 2003.)

I fully concur with this; my only wish is that footnotes (and not endnotes) were more standard in history publishing. At one point it was technically easier to publish a book with endnotes rather than with footnotes, but I should think that current software can set footnotes easily enough. What keeps the notes as back matter, however, is the widespread idea that footnotes turn off the General Reader. I have never understood this. You don’t need to read them if you don’t want to! But if you do, and they’re hanging out back with the bibliography and the index, you need to make some effort to follow them, keeping your thumb (or a second bookmark) in the back where they have been placed.

We should make it as easy as possible for “all those who wish to follow into the deep woods, green pastures, and rewarding byways which lie on either side of the motorway of the text.”

• If we simply must have endnotes, though, it would really help if we could always have the running header “Notes to pages x to y” at the top of each page of notes. And here is a technique that might act as a palliative for people annoyed at having to go note-hunting. Richard Herrnstein and Charles Murray’s The Bell Curve (1994) features reference numbers sometimes enclosed in square brackets, e.g.:

The implication is that something in the rural Georgia environment was depressing the scores of black children as they grew older.[98]

When you see the square brackets, you know that there is some discussion in the notes, in this case:

98. Some other studies suggest a systematic sibling difference for the national population, but it goes the other way: elder siblings outscore younger siblings in some data sets. However, this “birth-order” effect, when it occurs at all, is much smaller than the effect Jensen observed.

Unbracketed numbers are simply citations (e.g. “97. Jensen 1977”) – thus does the reader know which references might be more fruitful to follow.

(Yes, yes, I am fully aware how controversial this book is. I am endorsing its typography, not its contents.)

• But if your notes are largely citations, and not extra discussion, a good way to save space, and to make the notes look good to boot, is to run them all together without carriage returns, boldfacing the reference numbers, somewhat like this:

12. Wheeler, Cultivating Regionalism, 43. 13. Marx, Das Kapital, 3: 465. 14. Jonathan Good, The Cult of St. George in Medieval England (Woodbridge: Boydell Press, 2009), 65. 15. Schama, Citizens, 234. 16. Eamon Duffy, The Stripping of the Altars: Traditional Religion in England, 1400-1580 (New Haven and London: Yale University Press, 1992), 34, 54. 17. Beatrix Potter, The Tale of Jemima Puddle-Duck (London: Frederick Warne, 1908), 4-10.

I saw this format in a book once, but I can’t remember which.
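(For what it’s worth, the format is trivial to generate mechanically. Here is a minimal sketch in Python – the function name, the starting note number, and the HTML bold tags are all my own assumptions, not anyone’s established tool:)

# A hypothetical sketch: render a list of citations as one run-in block,
# boldfacing each note number. HTML output is assumed; a print workflow
# would mark up the bold numbers in its own way.
def run_in_notes(notes, start=12):
    """Join numbered notes into a single paragraph with bold numbers."""
    return " ".join(
        f"<b>{n}.</b> {note}"
        for n, note in enumerate(notes, start=start)
    )

print(run_in_notes([
    "Wheeler, Cultivating Regionalism, 43.",
    "Marx, Das Kapital, 3: 465.",
]))
# -> <b>12.</b> Wheeler, Cultivating Regionalism, 43. <b>13.</b> Marx, Das Kapital, 3: 465.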

• You’ll note that some of the references in the example above are full ones, while others are abbreviated. That is, Eamon Duffy’s book gets the full title, plus city of publication, publisher, and year of publication, while we don’t even know Wheeler’s first name, the full title of his book, or any of the publication information. This is because Wheeler’s book must have already been cited in the first eleven notes in our hypothetical piece. There, we would have read:

Kenneth Wheeler, Cultivating Regionalism: Higher Education and the Making of the American Midwest (DeKalb, Ill.: Northern Illinois University Press, 2011), 42.

In a similar fashion, a citation for Duffy’s book following note 17 would read simply:

Duffy, Stripping, 45.

The idea is that you’re actually following along with the notes as you read the article, so you don’t need to give all the publication information every time you cite the same piece. “Oh yeah,” you say to yourself when you read footnote 12 – that was the Wheeler book that NIU Press published in 2011, previously cited in note 4. But what if you can’t remember every piece that’s cited – or only check the notes occasionally? Should we give full information in every note? By no means! That would take too much space. We could give only abbreviated information in every note, with a bibliography at the end, but a bibliography takes up space, too. This is not much of a problem for books, but it is for journal articles. Thus, we can do what Viator has done: give a helpful reminder of where the full information appeared in the first place. From an article I recently read (boldface added):

93 Otto Demus, The Church of San Marco in Venice: History, Architecture, Sculpture (Washington, D.C., 1960), 30.

94 PL 129.724–726.

95 Demus, Church of San Marco (n. 93 above), 128-135.
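(This device is purely mechanical, and so easy to sketch. Assuming, hypothetically, that each work is tracked by a key with a full citation and a short title – the function and field names below are my inventions, not Viator’s actual workflow – the back-references generate themselves:)

# Hypothetical sketch of the device: full citation on first appearance,
# short title plus "(n. X above)" on every later appearance.
def viator_notes(entries):
    """entries: list of (key, full_citation, short_title, pages)."""
    first_note = {}  # key -> number of the note carrying the full citation
    out = []
    for n, (key, full, short, pages) in enumerate(entries, start=1):
        if key not in first_note:
            first_note[key] = n
            out.append(f"{n} {full}")  # pages are included in the full form
        else:
            out.append(f"{n} {short} (n. {first_note[key]} above), {pages}.")
    return out

print("\n".join(viator_notes([
    ("demus", "Otto Demus, The Church of San Marco in Venice: History, "
              "Architecture, Sculpture (Washington, D.C., 1960), 30.",
     "Demus, Church of San Marco", ""),
    ("pl", "PL 129.724–726.", "PL", ""),
    ("demus", "", "Demus, Church of San Marco", "128–135"),
])))
# 1 Otto Demus, The Church of San Marco in Venice: ... (Washington, D.C., 1960), 30.
# 2 PL 129.724–726.
# 3 Demus, Church of San Marco (n. 1 above), 128–135.

The point is simply that the first occurrence of each work is remembered, and every later note points back to it.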

• But note the publication information in n. 93: “Washington, D.C., 1960.” This is old-school: nowadays, the Chicago Manual of Style recommends City of Publication: Publisher, Year of Publication, so that the reference should read: “Washington, D.C.: Dumbarton Oaks Library, 1960.” This was a necessary change, although many publishers, especially those based in the U.K., still adhere to the old City, Year model. I suppose that historically, listing the city would imply something about the book, as though the national feeling of the publisher, or the legal regime under which the book was printed, would shape the contents. But most publishers are now multinational operations. Boydell has only two cities that it calls home; University of Toronto Press has three, and Penguin has several dozen. Listing any more than one is probably a waste of space – and even naming one city is not nearly as important now as simply naming the publisher. For it’s the publisher that determines the quality or political orientation of the work – Yale University Press being more trustworthy than Edwin Mellen, for instance. You can sometimes guess the publisher based on the city (there aren’t any other publishers besides Boydell in Woodbridge, Suffolk), but why not come right out and say it? Especially given how many publishers are based in London or New York.

So I propose that we should get rid of the city entirely, and have the note read simply “Dumbarton Oaks Library, 1960” – this tells you everything you need to know!

(If we simply must indicate cities, though, let us, when noting American ones, use the older, irregular-length state abbreviations, like Ala., Okla., or Calif., and not AL, OK, or CA. These should really be reserved for postal addresses only.)

• But who reads books anymore? Aren’t they a dead medium? Isn’t everything we need to know on the Internet?

Well, yes, there’s a lot of information out there in digital form, accessible through the World Wide Web, and perhaps we should prepare ourselves for the day when no information will be communicated otherwise. In the meantime, however, many people are still composing text to be printed with ink on paper, and if you’re doing so yourself, and you’ve found something on the Internet that you want to cite, make sure that you don’t just print its URL in your note. I read a book once in which every one of the 700+ endnotes was nothing more than a URL! (Clearly, it had been originally composed as a web document, and the author rigged up an algorithm to convert the links into endnotes.) Even the books were cited as links to Google Books, or Amazon. (And none of the links had “accessed [date]” after it, so that one could check the references in the Internet Wayback Machine on the approximate date the author did, in case the link should rot).
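(Such a conversion needn’t be so crude, either. Here is a minimal sketch, in standard-library Python – the names are mine, and this is emphatically not that author’s actual algorithm – which at least harvests each link’s text as a label and stamps on an access date:)

# Hypothetical sketch of a less silly link-to-endnote converter: keep the
# link text as a label, give the URL, and record the date of access.
from datetime import date
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Gather (link text, URL) pairs from an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []    # finished (text, url) pairs
        self._url = None   # href of the <a> tag we are currently inside
        self._text = []    # text accumulated inside that tag

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._url = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._url is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._url is not None:
            self.links.append(("".join(self._text).strip(), self._url))
            self._url = None

def links_to_endnotes(html):
    """Emit one endnote per link: label, URL, and an access date."""
    parser = LinkCollector()
    parser.feed(html)
    today = date.today()
    accessed = f"{today:%B} {today.day}, {today.year}"
    return [
        f'{n}. "{text}," at {url} (accessed {accessed}).'
        for n, (text, url) in enumerate(parser.links, start=1)
    ]

print("\n".join(links_to_endnotes(
    '<p>See <a href="http://www.civilwarhome.com/forrest.htm">the report</a>.</p>'
)))
# -> 1. "the report," at http://www.civilwarhome.com/forrest.htm (accessed ...).

Even a script that simple would have produced notes one could still follow after the links rot.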

I assume that the author of the book was in some kind of a rush to get it out in print form, but his citation protocol is just plain silly. I would much rather read:

David M’Clure and Elijah Parish, Memoirs of the Rev. Eleazar Wheelock, D.D., Founder and President of Dartmouth College and Moor’s Charity School (Newburyport, Mass.: Edward Little, 1811), 57.

than:

http://books.google.com/books?id=QTYFAAAAYAAJ&dq=memoirs%20%22eleazar%20wheelock%22&client=safari&pg=PA57#v=onepage&q=&f=false

Not only is the first note easier on the eyes, it allows one to look the book up in the library if one has access to a library, or on Google Books if one has access to that. (I don’t believe that we need to specifically cite Google Books if we have found something there. I trust that their scanners are working properly, and that the book we see online is the same one that we find in the library.)

As for purely online sources, there is a proper format for citing them too, e.g.:

Nathan Bedford Forrest, “Report of Maj. Gen. Nathan B. Forrest, C. S. Army, Commanding Cavalry, of the Capture of Fort Pillow,” Shotgun’s Home of the American Civil War, at http://www.civilwarhome.com/forrest.htm (accessed June 9, 2016).

Note that the author and title of the document, and the title of the website where it appears, have been made manifest, in addition to the actual Internet address where you can find it all.

Note, also, that the format already tells you that this was found on the web, so there is utterly no reason to append “Web” to the end of the note. A few years ago students started doing this for Internet sources (and appending “Print” for paper-and-ink ones). I don’t know what genius came up with this custom, but it needs immediate deprecation! The format itself speaks!

• But if we are writing for the web: something I haven’t seen addressed anywhere else (although someone must have at some point) is the question of, stylistically, how aware online prose ought to be of its own hyperlinks. Ideally it should not be aware at all. Links are parallel to footnotes (or perhaps quod vide), and just as you can reprint an article omitting them, so also can you reprint a piece of web text without hyperlinks with no essential damage to the prose. However, a sentence reading:

Here is the original article, and here are some reactions

would not transfer well, given that the links must be there for the sentence to make sense (the two instances of “here” need them as referents). I suppose we should train ourselves to write like this:

Benedikt’s article appeared on Slate on August 29, and immediately sparked a number of reactions.

Without the links the sentence would require a bit of Googling on the part of the reader, but at least it would make sense grammatically.