“Whitesplaining”

From the Chronicle of Higher Education:

The Whitesplaining of History Is Over

When the academy was the exclusive playground of white men, it produced the theories of race, gender, and Western cultural superiority that underwrote imperialism abroad and inequality at home. In recent decades, women and people of color have been critical to producing new knowledge breaking down those long-dominant narratives. Sociological research confirms that greater diversity improves scholarship.

Yet the struggle to diversify the academy remains an uphill battle; institutional biases are deeply ingrained, and change evokes nostalgia for times past. Both of these obstacles were fully in evidence at a recent Applied History conference at the Hoover Institution at Stanford University. Although history is a discipline with a growing number of nonwhite faculty members, and a healthy percentage of female scholars — indeed, women constitute more than a third of the faculty in Stanford’s own history department, across the bike lane from the Hoover Institution — the Hoover conference was made up of 30 white men (and one woman, who chaired a panel).

Etc.

This sort of critique is becoming all the more common in my profession (the article above was approvingly linked by two friends on Facebook), and I hate it. I hate the jargon (“whitesplaining”) and glibness (“exclusive playground”) – but most of all I hate the Jacobinism of it, how anything produced by “white males” in the olden days is necessarily tainted, while anything “diverse” is necessarily better (the link goes to a book entitled The Diversity Bonus: How Great Teams Pay Off in the Knowledge Economy, which, as anyone who has spent time in the world of work can attest, is no more true than its opposite*). Before we learned to look to the identity of the author for cues about how we should respond to his ideas, it was possible for an idea to be considered largely on its merits, and I sure wish we could return to that dispensation. The suggestion that all those bad old white males produced scholarship to justify Western imperialism, etc., is contradicted by the author’s own examples of Edward Thompson and E.P. Thompson, white males both, who said things that she apparently agrees with (one can think of any number of others, like Charles Beard or Marc Bloch). But more importantly, how dare Priya Satia dismiss the work of (almost) everyone who came before her because they weren’t diverse enough for her tastes? Presumably they were men of integrity, who investigated the past to the best of their ability and who opened up new vistas in human understanding. Just because their race and gender are distasteful to her is no reason to preemptively dismiss their entire body of work.

But if present trends continue, this essentially adolescent pose will be with us for some time to come.

* As I wrote once: “You definitely need something in common – intelligence and a sense of modesty come to mind. Furthermore, it all depends on the purpose of your organization. Sometimes when everyone’s on the same page, sharing the same background assumptions, then you can achieve your goals much more efficiently. The notion that different people with different opinions really have something special to offer could have value, but what is the nature of those opinions? So often “diversity” just boils down to “skin color,” “configuration of genitals,” or “direction of erotic desire,” with any “opinions” that derive from these things being completely irrelevant to the vast majority of problems to be solved or tasks to be completed in the wonderful world of work; worse, there is a very real possibility that the people concerned can be indifferently competent but have massive chips on their shoulders about how allegedly oppressed they are, and will interpret every difficulty as proceeding from some amorphous but entrenched prejudice arrayed against them. This is not conducive to getting anything done.”

UPDATE: Turns out the Hoover Institution conference was organized by the great Niall Ferguson, who responds:

“Masculinity, not ideology, drives extremist groups,” was another recent headline that caught my eye, this time in The Washington Post.

Got it.

I have had to listen to a variation on this theme rather too much in recent weeks. Last month I organized a small conference of historians who I knew shared my interest in trying to apply historical knowledge to contemporary policy problems. Five of the people I invited to give papers were women, but none was able to attend. I should have tried harder to find other female speakers, no doubt. But my failure to do so elicited a disproportionately vitriolic response.

Under a headline that included the words “Too white and too male,” The New York Times published photographs of all the speakers, as if to shame them for having participated. Around a dozen academics — male as well as female — took to social media to call the conference a “StanfordSausageFest.”

So outraged were Stanford historians Allyson Hobbs and Priya Satia that they demanded “greater university oversight” of the Hoover Institution, where I work, as it was “an ivory tower in the most literal sense.”

The most literal sense?

Now let’s be clear. I was raised to believe in the equal rights of all people, regardless of sex, race, creed, or any other difference. That the human past was characterized by discrimination of many kinds is not news to me. But does it really constitute progress if the proponents of diversity resort to the behavior that was previously the preserve of sexists and racists?

Publishing the names and mugshots of conference speakers is the kind of thing anti-Semites once did to condemn the “over-representation” of Jewish people in academia. Terms such as “SausageFest” belong not in civil academic discourse but on urinal walls.

What we see here is the sexism of the anti-sexists; the racism of the anti-racists. In this “Through the Looking Glass” world, diversity means ideological homogeneity. “The whitesplaining of history is over,” declared another heated article by Satia last week. Hideous Newspeak terms such as “whitesplaining” and “mansplaining” are symptoms of the degeneration of the humanities in the modern university. Never mind the facts and reason, so the argument runs, all we need to know — if we don’t like what we hear — are the sex and race of the author.

The process of indoctrination starts early. My six-year-old son stunned his parents the other day when we asked what he had been studying at school. He replied that they had been finding out about the life of Martin Luther King Jr. “What did you learn?” I asked. “That most white people are bad,” he replied.

This is America in 2018.

Linkage

• From IHE: History is Hot! Although the author does praise the sort of activism that I disparage in this post, it is heartening to read paragraphs like this:

One obvious way is the rise in visibility. Many young Americans may, for the first time, be hearing from historians and be seeing them on a regular basis in major news media outlets. Historians certainly appear in the press all the time, but the difference now is the stage. During a presidential election, nearly all of America is paying attention to media, and particularly in such a divisive and unusual election as this one. It is an especially good time to be visible.

While being visible, we also can demonstrate the core values of our profession. We can continue to showcase the dispassionate wisdom and clarity of thought that is treasured by those of us in the discipline and sought by those outside it. In a climate of constant shouting and bickering, contemplative thought may not be everyone’s cup of tea. But it can offer a refreshing alternative and inspire younger folks that they, too, can be an impactful voice of reason when America needs it most.

• From the Guardian (originally the Chronicle of Higher Education): “Uncovering the brutal truth about the British empire” – an article on Caroline Elkins’s heroic investigation of the British fight against the Mau Mau insurgency in Kenya – yes, it involved detention camps and torture, contrary to the official line (although be sure to check out the section on criticism of Elkins’s work).

Historians Against Trump

Ron Radosh (on PJ Media) says something I happen to agree with. Major excerpts:

Big Surprise? There Is Now a ‘Historians Against Trump’ Group

As they once did in their protests against the Vietnam War, American academic historians are now trying to use their positions in academia to present “scholarly” reasons why Donald Trump must not be president of the United States. They have formed a group called “Historians Against Trump” (HAT), since obviously “Historians Against War” was not appropriate for this salvo.

Their “Open Letter to the American People,” published on their website, is one of the most arrogant, pretentious pieces of claptrap they could possibly have written. Why have they written this letter? This is their reason:

“Historians understand the impact these phenomena have upon society’s most vulnerable and upon a nation’s conscience. The lessons of history compel us to speak out against a movement rooted in fear and authoritarianism. The lessons of history compel us to speak out against Trump.”

I am no fan of Donald Trump, and I am not going to vote for him this election, but their argument does not stand up. First, what if there was a large group of conservative historians in the academy who decided to write an open letter about the election, claiming “the lessons of history” as their reason for arguing we should vote Republican? The HAT would no doubt loudly condemn them for using the fact that they are professional historians with Ph.D.s as the reason they should be listened to….

In response to their letter, Professor of Law Stanley Fish has written a column in this Sunday’s New York Times. Mercilessly slashing all of their arguments, he boils them down to noting that in effect, all they are saying is “We’re historians and you’re not,” and hence they are obliged to inform Americans that the lessons of history tell us Trump should not be elected. That they have Ph.D.s is not proof that they can equate “an advanced degree with virtue.” Fish writes:

“By dressing up their obviously partisan views as “the lessons of history,” the signatories to the letter present themselves as the impersonal transmitters of a truth that just happens to flow through them. In fact they are merely people with history degrees….[which] does not qualify them to be our leaders and guides as we prepare to exercise our franchise in a general election. Academic expertise is not a qualification for delivering political wisdom.”

As a historian who has been fighting this good fight for too long a time, I fully agree with Fish that historians should not as historians be making “political pronouncements of any kind.” In trying to “invest their remarks with the authority of their academic credentials,” as Fish puts it, they are forfeiting the very sine qua non of what being a historian means. The long years of study and the skills they acquired, which earned them advanced degrees, do not come with the right to use those degrees to tell Americans how to vote.

Why does this not go absolutely without saying? Of course, people have the right to oppose Trump, as vociferously as they want. Like Radosh and Fish, though, I am chary of historians pretending that their profession gives them special insight into current politics – or rather, I am amazed that these wise, Olympian understandings always seem to be “liberal” in nature when, like all political positions, they are often no more valuable or true than their opposites. And I especially dislike it when these groups manage to get the American Historical Association or other ostensibly nonpartisan, professional organizations to endorse their points of view. We saw this ten years ago at the annual meeting of the AHA in Atlanta. As I wrote at the time:

Even during the Vietnam war the AHA would not pass an anti-war resolution, but now, be it resolved:

“that the American Historical Association urge its members through publication of this resolution in Perspectives and other appropriate outlets:

  1. To take a public stand as citizens on behalf of the values necessary to the practice of our profession; and
  2. To do whatever they can to bring the Iraq war to a speedy conclusion.”

Perhaps it passed because it doesn’t actually say that “The AHA condemns this war,” but still… it’s annoying when you discover that you’re still in college, with the student government earnestly passing sophomoric resolutions on your behalf. Wankers.

(See Radosh’s article about Eugene Genovese’s successful opposition to an anti-war resolution in the 1960s.)

Later on in 2007, still steamed, I elaborated:

Here is the resolution, in all its inanity:

“Whereas the American Historical Association’s Professional Standards emphasize the importance of open inquiry to the pursuit of historical knowledge;

“Whereas the American Historical Association adopted a resolution in January 2004 re-affirming the principles of free speech, open debate of foreign policy, and open access to government records in furthering the work of the historical profession;

“Whereas during the war in Iraq and the so-called war on terror, the current Administration has violated the above-mentioned standards and principles through the following practices:

“excluding well-recognized foreign scholars;

“condemning as “revisionism” the search for truth about pre-war intelligence;

“re-classifying previously unclassified government documents;

“suspending in certain cases the centuries-old writ of habeas corpus and substituting indefinite administrative detention without specified criminal charges or access to a court of law;

“using interrogation techniques at Guantanamo, Abu-Ghraib, Bagram, and other locations incompatible with respect for the dignity of all persons required by a civilized society;

“Whereas a free society and the unfettered intellectual inquiry essential to the practice of historical research, writing, and teaching are imperiled by the practices described above; and

“Whereas, the foregoing practices are inextricably linked to the war in which the United States is presently engaged in Iraq; now, therefore, be it

“Resolved, That the American Historical Association urges its members through publication of this resolution in Perspectives and other appropriate outlets:

  1. To take a public stand as citizens on behalf of the values necessary to the practice of our profession; and
  2. To do whatever they can to bring the Iraq war to a speedy conclusion.”

Of course I have no problem with people who oppose the war, but I would really appreciate it if they would speak for themselves, or form groups for the specific purpose of opposing the war, rather than trying to shanghai the rest of us into taking their position. Yes, the price of liberty is constant vigilance, and I, and as many people as I could have mustered, should have gone to the business meeting and spoken out and voted against this resolution. But it would be really nice if I could take it for granted that I didn’t have to do such a thing. In the world in which I would like to live, people know their place, and would be deeply ashamed of the bloody rudeness of taking a group that is ostensibly a professional association for historians, and trying to turn it into an activist group opposed to the war in Iraq. “Oh, but this issue is too important for such considerations of bourgeois propriety!” they claim. No, it isn’t. Despite the deepest, most self-dramatizing desires of these people, we are not facing the imminent fall of the Constitution and the imposition of martial law in favor of some neo-Nazi regime. Oppose the war by all means, but leave the rest of us out of it!

This really is college-sophomore stuff – like “jeans day,” when you are to show your solidarity with homosexual rights by wearing jeans, or so proclaim the few posters here and there about campus, put up the day before the event. So you wear jeans like you do all the time, and no matter how you may feel about gay people, you find that you are cast as supporting them! (Ha ha, caught you!)

And no, I don’t find the logic of this resolution very compelling. The attempt to link opposition to the war with the practice of history is about as true as a resolution reading: “Whereas we are distracted because we don’t know where the terrorists are going to strike next, and whereas the violent homophobia and misogyny of Wahhabi Islam are deeply offensive to us, Be It Resolved That the AHA supports President George W. Bush in the Global War on Terror.” Something tells me that a resolution like that is not going to pass any time soon, because we are dealing with American myopia. Some people can’t get visas to come to the United States, and some documents are being reclassified, and some prisoners were abused at Guantanamo and Abu Ghraib! How terrible! OK, but why not compare these things to, for example, the sort of things enumerated in this article? Opening paragraph:

“Academics who study China, which includes the author, habitually please the Chinese Communist Party, sometimes consciously, and often unconsciously. Our incentives are to conform, and we do so in numerous ways: through the research questions we ask or don’t ask, through the facts we report or ignore, through our use of language, and through what and how we teach.”

Here is a perfect example of government policy specifically curtailing the practice of history. Will the AHA pass a resolution condemning this? (And, while we’re at it, condemning China’s atrocious human rights record as being “incompatible with respect for the dignity of all persons required by a civilized society”?) Fat chance: what we do is evil, what they do is “their culture.”

What really gets me, though, is when my fellow professors can’t keep their damned liberal opinions to themselves, and shout them in socially inappropriate venues, and are then surprised when state legislatures want to cut their funding, or propose affirmative action programs for conservatives. They simply have no idea where such things came from! Help, help, we’re being oppressed! (Forget college sophomores – these are high school sophomores! The Holy Grail of being a teenager – being yourself, and being accepted for being yourself. But if you remember from high school, very few people actually got to do this; the rest of us had to choose between compromising our “selves” to fit in, or adhering to them and being ostracized. But to demand the right to spout your ideology while being cherished and affirmed for it… what wankery!)

Notes About Notes

Last year I quoted the Hamilton College History Department guide to writing good history papers. It contains a solid justification for why historians cite their sources in footnotes and not in parenthetical citations.

Your professor may allow parenthetical citations in a short paper with one or two sources, but you should use footnotes for any research paper in history. Parenthetical citations are unaesthetic; they scar the text and break the flow of reading. Worse still, they are simply inadequate to capture the richness of historical sources. Historians take justifiable pride in the immense variety of their sources. Parenthetical citations such as (Jones 1994) may be fine for most of the social sciences and humanities, where the source base is usually limited to recent books and articles in English. Historians, however, need the flexibility of the full footnote. Try to imagine this typical footnote (pulled at random from a classic work of German history) squeezed into parentheses in the body of the text: DZA Potsdam, RdI, Frieden 5, Erzgebiet von Longwy-Briey, Bd. I, Nr. 19305, gedruckte Denkschrift für OHL und Reichsleitung, Dezember 1917, und in RWA, Frieden Frankreich Nr. 1883. The abbreviations are already in this footnote; its information cannot be further reduced. For footnotes and bibliography, historians usually use Chicago style. (The Chicago Manual of Style. 15th edition. Chicago: University of Chicago Press, 2003.)

I fully concur with this; my only wish is that footnotes (and not endnotes) were more standard in history publishing. At one point it was technically easier to publish a book with endnotes rather than with footnotes, but I should think that current software can exhibit footnotes easily enough. What keeps the notes as back matter, however, is the widespread idea that footnotes turn off General Reader. I have never understood this. You don’t need to read them, if you don’t want to! But if you do, and they’re hanging out back with the bibliography and the index, you need to make some effort to follow them, keeping your thumb (or a second bookmark) in the back where they have been placed.

We should make it as easy as possible for “all those who wish to follow into the deep woods, green pastures, and rewarding byways which lie on either side of the motorway of the text.”

• If we simply must have endnotes, though, it would really help if we could always have the running header “Notes to pages x to y” at the top of each page of them. And here is a technique that might act as a palliative for people annoyed at having to go note-hunting. Richard Herrnstein and Charles Murray’s The Bell Curve (1994) features reference numbers sometimes enclosed in square brackets, e.g.:

The implication is that something in the rural Georgia environment was depressing the scores of black children as they grew older.[98]

When you see the square brackets, you know that there is some discussion in the notes, in this case:

98. Some other studies suggest a systematic sibling difference for national population, but it goes the other way: Elder siblings outscore younger siblings in some data sets. However, this “birth-order” effect, when it occurs at all, is much smaller than the effect Jensen observed.

Unbracketed numbers are simply citations (e.g. “97. Jensen 1977”) – thus does the reader know what references might be more fruitful to follow.

(Yes, yes, I am fully aware how controversial this book is. I am endorsing its typography, not its contents.)

• But if your notes are largely citations, and not extra discussion, a good way to save space with them, and to make them look good to boot, is to run them all together without carriage returns, boldfacing the reference numbers, somewhat like this:

12. Wheeler, Cultivating Regionalism, 43. 13. Marx, Das Kapital, 3: 465. 14. Jonathan Good, The Cult of St. George in Medieval England (Woodbridge: Boydell Press, 2009), 65. 15. Schama, Citizens, 234. 16. Eamon Duffy, The Stripping of the Altars: Traditional Religion in England, 1400-1580 (New Haven and London: Yale University Press, 1992), 34, 54. 17. Beatrix Potter, The Tale of Jemima Puddle-Duck (London: Frederick Warne, 1908), 4-10.

I saw this format in a book once but I can’t remember which.

• You’ll note that some of the references in the example above are full ones, while others are abbreviated. That is, Eamon Duffy’s book gets the full title, plus city of publication, publisher, and year of publication, while we don’t even know Wheeler’s first name, the full title of his book, or any of the publication information. This is because Wheeler’s book must have already been cited in the first eleven notes in our hypothetical piece. There, we would have read:

Kenneth Wheeler, Cultivating Regionalism: Higher Education and the Making of the American Midwest (DeKalb, Ill.: Northern Illinois University Press, 2011), 42.

In a similar fashion, a citation for Duffy’s book following note 17 would read simply:

Duffy, Stripping, 45.

The idea is that you’re actually following along with the notes as you read the article, so you don’t need to give all the publication information every time you cite the same piece. “Oh yeah,” you say to yourself when you read footnote 12 – that was the Wheeler book that NIU Press published in 2011, previously cited in note 4. But what if you can’t remember every piece that’s cited – or only check the notes occasionally? Should we give full information in every note? By no means! That would take too much space. We could only give abbreviated information in every note, with a bibliography at the end, but a bibliography takes up space, too. This is not much of a problem for books, but it is for journal articles. Thus, we can do what Viator has done: give a helpful reminder of where the full information appeared in the first place. From an article I recently read (boldface added):

93 Otto Demus, The Church of San Marco in Venice: History, Architecture, Sculpture (Washington, D.C., 1960), 30.

94 PL 129.724–726.

95 Demus, Church of San Marco (n. 93 above), 128-135.

• But note the publication information in n. 93: “Washington, D.C., 1960.” This is old-school: nowadays, the Chicago Manual of Style recommends City of Publication: Publisher, Year of Publication, so that the reference should read: “Washington, D.C.: Dumbarton Oaks Library, 1960.” This was a necessary change, although many publishers, especially those based in the U.K., still adhere to the old City, Year model. I suppose that historically, listing the city would imply something about the book, as though the national feeling of the publishers, or the legal regime under which the book was printed, would shape the contents. But most publishers are now multinational operations. Boydell has only two cities that it calls home; University of Toronto Press has three, and Penguin has several dozen. Listing any more than one is probably a waste of space – and even naming one city is not nearly as important now as simply naming the publisher. For it’s the publisher that determines the quality or political orientation of the work – Yale University Press being more trustworthy than Edwin Mellen, for instance. You can sometimes guess the publisher based on the city (there aren’t any other publishers besides Boydell in Woodbridge, Suffolk), but why not come right out and say it? Especially given how many publishers are in London or New York.

So I propose that we should get rid of the city entirely, and have the note read simply “Dumbarton Oaks Library, 1960” – this tells you everything you need to know!

(If we simply must indicate cities, though, let us, when noting American ones, use the older, irregular-length state abbreviations, like Ala., Okla., or Calif., and not AL, OK, or CA. These should really be reserved for postal addresses only.)

• But who reads books anymore? Aren’t they a dead medium? Isn’t everything we need to know on the Internet?

Well, yes, there’s a lot of information out there in digital form, accessible through the World Wide Web, and perhaps we should prepare ourselves for the day when no information will be communicated otherwise. In the meantime, however, many people are still composing text to be printed with ink on paper, and if you’re doing so yourself, and you’ve found something on the Internet that you want to cite, make sure that you don’t just print its URL in your note. I read a book once in which every one of the 700+ endnotes was nothing more than a URL! (Clearly, it had been originally composed as a web document, and the author rigged up an algorithm to convert the links into endnotes.) Even the books were cited as links to Google Books, or Amazon. (And none of the links had “accessed [date]” after it, which would have allowed one to check the references in the Internet Wayback Machine on the approximate date the author consulted them, in case the link should rot.)

I assume that the author of the book was in some kind of a rush to get it out in print form, but his citation protocol is just plain silly. I would much rather read:

David M’Clure and Elijah Parish, Memoirs of the Rev. Eleazar Wheelock, D.D., Founder and President of Dartmouth College and Moor’s Charity School (Newburyport, Mass.: Edward Little, 1811), 57.

than:

http://books.google.com/books?id=QTYFAAAAYAAJ&dq=memoirs%20%22eleazar%20wheelock%22&client=safari&pg=PA57#v=onepage&q=&f=false

Not only is the first note easier on the eyes, it allows one to look the book up in the library if one has access to a library, or on Google Books if one has access to that. (I don’t believe that we need to specifically cite Google Books if we have found something there. I trust that their scanners are working properly, and that the book we see online is the same one that we find in the library.)

As for purely online sources, there is a proper format for citing them too, e.g.:

Nathan Bedford Forrest, “Report of Maj. Gen. Nathan B. Forrest, C. S. Army, Commanding Cavalry, of the Capture of Fort Pillow,” Shotgun’s Home of the American Civil War, at http://www.civilwarhome.com/forrest.htm (accessed June 9, 2016).

Note that the author and title of the document, and the title of the website where it appears, have been made manifest, in addition to the actual Internet address where you can find it all.

Note, also, that the format already says that this was found on the web, and there is utterly no reason to append “Web” to the end of the note. A few years ago students started doing this for Internet sources (and appending “Print” for paper-and-ink ones). I don’t know what genius came up with this custom, but it needs immediate deprecation! The format itself speaks!
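If it helps to see just how mechanical the format is, here is a throwaway sketch in Python that assembles such a citation from its parts. The function and field names are my own invention for illustration, not any official standard:

def format_web_citation(author, doc_title, site_title, url, accessed):
    # Assemble author, document title, website title, URL, and access date
    # into one citation string. Nothing here appends "Web": the URL and the
    # access date already tell the reader where the thing was found.
    return f'{author}, "{doc_title}," {site_title}, at {url} (accessed {accessed}).'

print(format_web_citation(
    "Nathan Bedford Forrest",
    "Report of Maj. Gen. Nathan B. Forrest, C. S. Army, Commanding Cavalry, "
    "of the Capture of Fort Pillow",
    "Shotgun's Home of the American Civil War",
    "http://www.civilwarhome.com/forrest.htm",
    "June 9, 2016",
))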

• But if we are writing for the web: something I haven’t seen addressed anywhere else (although someone must have at some point) is the question of, stylistically, how aware online prose ought to be of its own hyperlinks. Ideally it should not be aware at all. Links are parallel to footnotes (or perhaps quod vide), and just as you can reprint an article omitting them, so also can you reprint a piece of web text without hyperlinks with no essential damage to the prose. However, a sentence reading:

Here is the original article, and here are some reactions

would not transfer well, given that the links must be there for the sentence to make sense (the two instances of “here” need them as referents). I suppose we should train ourselves to write like this:

Benedikt’s article appeared on Slate on August 29, and immediately sparked a number of reactions.

Without the links the sentence would require a bit of Googling on the part of the reader, but at least it would make sense grammatically.
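A crude way to check your own prose, if you write in raw HTML, is to strip out the anchor tags and see whether the sentences still stand on their own. Here is a quick sketch in Python, using only the standard re module (the href values are placeholders, not the real links):

import re

def strip_links(html):
    # Remove <a ...> and </a> tags but keep the link text, so we can see
    # how the prose reads once the hyperlinks are gone.
    return re.sub(r'</?a\b[^>]*>', '', html)

link_dependent = ('<a href="https://example.com/article">Here</a> is the original '
                  'article, and <a href="https://example.com/reactions">here</a> '
                  'are some reactions')
link_independent = ('<a href="https://example.com/article">Benedikt\'s article</a> '
                    'appeared on Slate on August 29, and immediately sparked a number '
                    'of <a href="https://example.com/reactions">reactions</a>.')

print(strip_links(link_dependent))    # the two "here"s now point at nothing
print(strip_links(link_independent))  # still reads as a complete sentence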

And Now, Some Sobering News…

From the AHA’s Perspectives newsletter. I would add another reason: when you love history as such, as all history professors do, you simply can’t understand why others might not, and it’s difficult to convince them of the subject’s value.

The Decline in History Majors: What Is to Be Done?
by Julia Brookins, May 2016

As the March 2016 issue of Perspectives on History reported, the number of people earning a US bachelor’s degree in history dropped sharply in 2014 from a year earlier and may continue to fall over the next few years. We solicited reports and impressions from department chairs and talked to history faculty members at different types of colleges and universities, asking how things looked in their programs and what the implications for other history departments might be….

[Some reasons include:]

Gender breakdown of the undergraduate student population: Nationally, there are three male history majors for every two female history majors. Declining proportions of male college students will continue to affect program cohorts negatively unless more women choose history.

The history program’s reputation for rigor: This may be a factor in certain campus climates, but departments must think carefully about how prospective students encounter the expectations that their program has for its graduates. Rigor and lack of student success are not the same things. Departments may need to embrace changes that reinforce learning and be able to provide good evidence to entering students that the history program is structured to support their academic achievement.

Heavy reliance on required introductory courses to recruit students: Conventional wisdom among history faculty members has been that introductory courses are the best recruitment tool for attracting history majors. As college-going has undergone big changes over the past few decades, however, the students in the lower-division campus courses are a smaller pool to draw from. Today, more students who enter a bachelor’s degree program have previous course credits from community colleges, AP courses, dual or concurrent enrollment courses, and other sources. Departments that expand their recruitment beyond on-campus introductory courses may be able to find more of today’s college students. Direct communication and coordination between faculty peers at two-year and four-year academic units becomes more important to discovering and launching students who want to major in history.

Creation of new majors: As choices for students expand, history departments that actively engage with faculty colleagues in new and existing programs may be more likely to retain students. Facilitating double majors and developing or promoting a history minor goes along with this approach.

Changes to general education curricula: History faculty members and department chairs with whom I spoke agreed that changes to history’s place within general education have played a role in the number of undergraduate majors. Students can now often navigate the breadth of institutional offerings and choose among baskets of courses. While there are risks as designated distribution requirements disappear, there are also tremendous opportunities for more engaged classrooms and for moving introductory history courses beyond surveys of broad topical areas toward an emphasis on effectively training undergraduates in core concepts and competencies of historical thinking.

Declining career prospects in fields traditionally associated with history: Some states implemented hiring and/or salary freezes for K–12 teachers in the wake of budget crises precipitated by the recession. The past few years have also seen a significant constriction of early-career employment opportunities in the legal profession. If law school no longer seems like a good bet, the history curriculum that was great preparation for that path might lose its attraction for some students. Historians need to practice communicating that history skills are foundational for many career paths and be able to outline that range of occupations to undergraduates.

Regional demography: The number of traditional college-age or younger residents of certain metropolitan areas and entire states has fallen, while it has increased in other regions. Overall enrollments may be down at institutions in aging regions, and faculty may need to take different steps to reach students.

Historians and Light Bulbs

A good one from Laurentian University Professor David Leeson:

Q: How many historians does it take to change a light bulb?

A: There is a great deal of debate on this issue. Up until the mid-20th century, the accepted answer was ‘one’: and this Whiggish narrative underpinned a number of works that celebrated electrification and the march of progress in light-bulb changing. Beginning in the 1960s, however, social historians increasingly rejected the ‘Great Man’ school and produced revisionist narratives that stressed the contributions of research assistants and custodial staff. This new consensus was challenged, in turn, by women’s historians, who criticized the social interpretation for marginalizing women, and who argued that light bulbs are actually changed by department secretaries. Since the 1980s, however, postmodernist scholars have deconstructed what they characterize as a repressive hegemonic discourse of light-bulb changing, with its implicit binary opposition between ‘light’ and ‘darkness,’ and its phallogocentric privileging of the bulb over the socket, which they see as colonialist, sexist, and racist. Finally, a new generation of neo-conservative historians have concluded that the light never needed changing in the first place, and have praised political leaders like Ronald Reagan and Margaret Thatcher for bringing back the old bulb. Clearly, much additional research remains to be done.

With a response from Matthew Lavine of Mississippi State University:

Dear Dr. Leeson,

We regret that we cannot accept your historian joke in its present form. However, a panel of anonymous reviewers (well, anonymous to YOU, anyway) have reviewed it and made dozens of mutually contradictory suggestions for its… improvement. Please consider them carefully, except for the ones made by a man we all consider to be a dangerous crackpot but who is the only one who actually returns comments in a timely fashion.

1. This joke is unnecessarily narrow. Why not consider other sources of light? The sun lights department offices; so too do lights that aren’t bulbs (e.g. fluorescents). These are rarely “changed” and never by historians. Consider moving beyond your internalist approach.

2. The joke is funny, but fails to demonstrate familiarity with the most important works on the topic. I would go so far as to say that Leeson’s omission is either an unprofessional snub, or reveals troubling lacunae in his basic knowledge of the field. The works in question are Brown (1988), Brown (1992), Brown (1994a), Brown (1994b), Brown and Smith (1999), Brown (2001), Brown et al (2003), and Brown (2006).

3. Inestimably excellent and scarcely in need of revision. I have only two minor suggestions: instead of a joke, make it a haiku, and instead of light bulbs, make the subject daffodils.

4. This is a promising start, but the joke fails to address important aspects of the topic, like (a) the standard Whig answer of “one,” current through the 1950s; (b) the rejection of this “Great Man” approach by the subsequent generation of social historians; (c) the approach favored by women’s historians; (d) postmodernism’s critique of the light bulb as discursive object which obscured the contributions of subaltern actors, and (e) the neoconservative reaction to the above. When these are included, the joke should work, but it’s unacceptable in its present form.

5. I cannot find any serious fault with this joke. Leeson is fully qualified to make it, and has done so carefully and thoroughly. The joke is funny and of comparable quality to jokes found in peer journals. I score it 3/10 and recommend rejection.

“History of the World in Sixteen Piles of Crap”

A facetious (faecetious?) book proposal from a grad school friend:

Great Rift Valley Coprolite: Cooking and the evolution of human physiology
Mammoth Chunks in Fossilized Turd, North America: Homo sapiens peoples the earth
Cow Dung, Anatolian Plateau: Agriculture and the secondary products revolution
Toilet, Mohenjo-Daro: The invention of urbanization
Night Soil, Huang He River Valley: The creation of imperial economies
Road Apples, Transoxania: The impact of pastoralists on Afro-Eurasia
Unholy Shit, Levant, Mesopotamia, South Asia: Cleanliness and the development of “world” religions
Monkey Poo Marginalia: Culture in the Middle Ages
Nitrate Beds, Central Europe and the Ganges Valley: Gunpowder and global empires
Slave Ship Bilges, Atlantic Ocean: Early Modern global economies
Guano Mountain, Chincha Islands: Industrial revolution
Sewage System, London: Cholera and Modern Medicine
Fake Poop: The Haber-Bosch process, modernity, and the industrialization of war
Gandhi’s Enemas, South Asia: Decolonization
Disposable Diapers, USA: Twentieth century gender and identity in the core
Feces in Ground Beef: Globalization and the 21st-century Environment

Interesting Blog Post

From Historista:

***

Civil War Military Historians are Freaking Out

A couple of years ago, Civil War military historians woke up and determined that their field was dying. Why then? This is unclear. Perhaps it was part of a larger anxiety about the “crisis in the humanities.” Perhaps it was because many military historians are nearing retirement, and are therefore worried about their legacies.

Whatever the reason, we know that some of them chose option two as a response because we now have in our possession not one but two (simultaneous!) special issues on military history in our journals of record: Civil War History, and The Journal of the Civil War Era.

Each issue contains a manifesto-as-introduction—one written by Earl Hess (CWH) and the other by Gary Gallagher and Katy Shively Meier (JCWE)—bemoaning the state of the field, and arguing that traditional military historians (those who write about “warfare and the relationship between military institutions and the societies from which they sprang,” according to Gallagher and Meier (490)) are in danger of “losing the Civil War.” This will not do, they argue. As Hess writes, “understanding the real battlefield of 1861-65 is essential to understanding everything else about the Civil War. The experience of organized military forces, their impact on the course of a war effort and on the course of their nation’s history, is fundamental to any true understanding of war” (393).

Now, let me say I am on board with this argument—except for those problematic terms, “real” and “true.” Of course the battlefield is important; of course logistics and strategy and the lived experiences of combat are important. They were important to Civil War Americans, and so they are important to those who study them. I don’t think I know any historians in the field who would disagree with these assertions.

But clearly Gallagher, Meier, and Hess believe that everyone (everyone!) in fact does disagree with these assertions. And they feel they are besieged—and from two directions, no less.

First, by amateur historians who write popular military histories (and the commercial presses that aid and abet them). Academic historians have always had a somewhat fraught relationship with the producers of “popular histories” who write both inside and outside the academy. Just ask any historian what she/he thinks about Jill Lepore or Doris Kearns Goodwin and you will see what I mean. Hess embraces this group a bit more than Gallagher and Meier do (perhaps because he sees himself as one of them), although they all view popular authors as competition for readers. If we don’t write more and better military histories, they argue, the amateurs will determine what most of the American public knows about the Civil War.

Although I am always for more and better histories, I’m skeptical about this prediction. Academic historians consult at National Park Service sites and serve as experts in the making of documentaries. They appear regularly on C-Span. There are also increasing numbers of academic historians writing for blogs and other online sites. And as Carole Emberton pointed out during the recent kerfuffle over what constitutes a “public intellectual,” “what the American public knows” about history is often conveyed in college classrooms—which are, the last time I checked, the domain of academic historians.

But Gallagher, Meier, and Hess save most of their ire for the second set of besiegers: social and cultural historians of the Civil War, whom they depict as (variously) misinformed about, condescending toward, terrified by, and dismissive of military history. These extraordinarily powerful individuals have taken funding and jobs away from traditional military historians, and they have discouraged graduate students from writing in the field. What proof do Gallagher, Meier, and Hess have for these complaints? Well, unfortunately, most of it is anecdotal, vague, or nonexistent.

Gallagher and Meier, for example, write that, “the few academic scholars who do work on such topics are under pressure to pull away from investigating the waging of the war itself” (489). The footnote for the paragraph references Allen Guelzo’s Lincoln Prize for Gettysburg: The Last Invasion. This is confusing at best. Almost all of Hess’ evidence for the dastardly deeds of social and cultural historians comes from 33 responses (some anonymous, some not) to a survey he sent out to 129 friends in the profession. That’s a pretty thin data set, produced from what appears to be a completely un-vetted questionnaire.

Such unfounded arguments start to read like conspiracy theories, which completely undermine any kind of reasonable points that Civil War military historians can make about the importance of their own approach.

But they carry on regardless and embrace option three, throwing shade at a number of historians doing work in various fields of Civil War history. Do you study war memory? Well, it’s clearly “a substitute for genuine history” (Sutherland in Hess, 391). Interested in the war’s “dark histories” of trauma? So presentist! Obvi. <eye roll> Doing research on guerrilla warfare in the border region? Don’t bother; such actions were so anomalous as to be inconsequential. Anyone who talks about emancipation without reference to military occupation is clearly an idiot. And don’t even begin to suggest that there was a “long Civil War” that extended beyond 1861 and 1865; this diverts attention from the war itself.

These attacks on colleagues are befuddling; both Gallagher and Hess have done research in aspects of the war beyond the battlefield, and Gallagher has even published pieces on the war in popular culture (gasp!). Their graduate students (and undergraduates who have gone on to other graduate programs) have produced important social and cultural studies of warfare.

Furthermore, the essays contained in these two special issues of Civil War History and The Journal of the Civil War Era are actually “war and society” or “war studies” pieces. All of them are excellent, and they prove that scholars are still researching and writing compelling studies in Civil War military history. It is these essays—and not the manifestos—that will encourage a rational conversation about the field, and how its shifts and changes have produced different kinds of knowledge about the past.

As we have daily proof on Twitter, dismissive snark is not critique. Can’t we argue for the strength and viability of our own field without denigrating the work of others? As Jennifer Weber argues in her response to Hess’s manifesto in Civil War History, “considering the war and its elements from multiple angles gives us a richer, more accurate, and interesting view of the past” (406).

A Code of Conduct for Historians

From my friend Richard Raiswell, an interesting article by Suzannah Lipscomb in History Today. This is her proposed code of conduct for historians:

  • Use evidence to support your interpretation and seek to understand that evidence correctly.
  • Do not wilfully present evidence out of context, especially not in such a way that the lack of context will render the meaning of the evidence different, unclear or manipulable.
  • Do not cite evidence from sources that you elsewhere discount.
  • At best, do not waste a reader’s time on unsubstantiated sources.
  • At least flag up evidence that is drawn from such sources; do not use it silently.
  • Triangulate; search ardently for evidence that might undermine, as well as corroborate, your hypothesis.
  • Avoid assumption creep: do not allow assertions to move from ‘possibly’ to ‘probably’ to ‘definitely’; do not build more elaborate layers of interpretation on a foundation that is rocky.
  • Do not rely on the secondary assertions of other historians; ad fontes! Go back to the original sources.
  • Guard against confirmation bias; interrogate the ‘facts’ anew and bring a fresh analysis to them; do not mould the facts to your interpretation.
  • Root out and resolve any internal inconsistencies in your argument.
  • Cite sources so that they can be traced, with page numbers, archival call numbers and publication details.

The biggest historical scandal in recent years has been the book Arming America by Michael Bellesiles, which got a lot of attention as it touched on a hot-button cultural issue. (I haven’t read it, but I understand that he claims that American gun culture was the creation of arms manufacturers for the purpose of selling off their surplus stock after the Civil War, and that prior to this time guns were simply not important to your average American civilian.) For his overturning of received wisdom (and, let us admit, in a direction that liberal academics really wanted to believe), he received the prestigious Bancroft Prize. To the NRA, his book was a challenge they could not afford to ignore, and they went after him personally and professionally. Unfortunately, it turns out that he really did make quite a lot of stuff up – he claimed to have consulted archives that disappeared in the San Francisco earthquake, for instance – and then kept making up excuses about why he could not produce his notes. For his blatant and systematic fraud, he lost his job at Emory. I once met someone from the department there who claimed that the only possible excuse for his behavior was some form of mental illness. Fine, but one wonders how many other violations of Lipscomb’s code are out there – fraud that is not as blatant, or does not provoke the scrutiny of interested parties. Who has the time to comb through every footnote?

This is obvious, but bears repeating: if we demand respect for our professional credentials, then we must practice with integrity.

UPDATE: From an Atlantic article by Benjamin Schwartz. Don’t be like “the sycophantic courtier [Arthur] Schlesinger [Jr.], whose histories “repeatedly manipulated and obscured the facts” and whose accounts—“profoundly misleading if not out-and-out deceptive”—were written to serve not scholarship but the Kennedys.”