Military History

From Max Hastings on Bloomberg (hat tip: Tim Furnish):

American Universities Declare War on Military History

Academics seem to have forgotten that the best way to avoid conflict is to study it.

The world applauds the scientists who have created vaccines to deliver humanity from Covid-19. One certainty about our future: There will be no funding shortfall for medical research into pandemics.

Now, notice a contradiction. War is also a curse, responsible for untold deaths. Humans should do everything possible to mitigate it. And even if scientists cannot promise a vaccine, the obvious place to start working against future conflicts is by researching the causes and courses of past ones.

Yet in centers of learning across North America, the study of the past in general, and of wars in particular, is in spectacular eclipse. History now accounts for a smaller share of undergraduate degrees than at any time since 1950. Whereas in 1970, 6% of American male students and 5% of female students were history majors, those figures are now less than 2% and less than 1%, respectively.

Fredrik Logevall, a distinguished Harvard historian and author of seminal works on Vietnam, along with a new biography of John F. Kennedy, remarked to me on the strangeness of this, given that the U.S. is overwhelmingly the most powerful, biggest-spending military nation on earth. “How this came to be and what it has meant for America and the world is surely of surpassing historical importance,” he said. “Yet it’s not at the forefront of research among academic historians in this country.”

The revulsion from war history may derive not so much from students’ unwillingness to explore the violent past as from academics’ reluctance to teach, or even allow their universities to host, such courses. Some dub the subject “warnography,” and the aversion can extend to the study of international relations. Fewer than half of all history departments now employ a diplomatic historian, against 85% in 1975. As for war, as elderly scholars retire from posts in which they have studied it, many are not replaced: the roles are redefined.

An eminent historian recently told me of a young man majoring in science at Harvard who wanted to take humanities courses on history, including the U.S. Civil War. He was offered only one course — which addressed the history of humans and their pets.

Paul Kennedy of Yale, author of one of the best-selling history books of all time, “The Rise and Fall of the Great Powers,” is among many historians who deplore what is, or rather is not, going on. He observed to me that while some public universities, such as Ohio State and Kansas State, have strong programs in the history of war, “It’s in the elite universities that the subject has gone.”

“Can you imagine Chicago, or Berkeley, or Princeton having War Studies departments?” he asked. “Military history is the most noxious of the ‘dead white male’ subjects, and there’s also a great falling away in the teaching of diplomatic, colonial and European political history.”

Kennedy notes that war studies are highly popular with students, alumni and donors, “but the sticking point is with the faculty — where perhaps only a small group are openly hostile, but a larger group don’t think the area is important enough.” 

Read the whole thing.

The Practice of History

A friend writes:

In practice it seems to have become a norm to say “historians agree” when that is not true, or sometimes just on the basis of some recent article or paperback that is popular with a certain crowd.

There is almost a tendency to present history as an activity in which a set of “findings” has been established, each with a fixed meaning.

Such a picture is not accurate even in many experimental social sciences, which at least use statistical significance as a guide to reliability. It is even less the case with history, where the nature of the profession precludes such statements. The very historians who do the most detailed archival, philological, and novel work are often so overwhelmed by it that they do not have a very good grasp of other areas of history, and still less of the findings of the social, psychological, and natural sciences. Whereas those who speak most generally, because they come across well on camera, often simply do not have enough scholarly depth.

And for people on the left who really, really feel that some piece of historiography has totally transformed historical understanding, so that they know something that hoi polloi don’t, always keep in mind the sad case of Michael A. Bellesiles [link added].

My favorite example of this phenomenon is provided by Noel Ignatiev’s book, How the Irish Became White (1995). Ignatiev used “white” metaphorically to mean “part of the dominant group.” But since then it has become conventional wisdom that “the Irish weren’t even considered white in the nineteenth century!” One must use metaphors carefully. 

Happy New Year

To start the new year, a Facebook friend makes a resolution, with which I heartily agree:

Many years ago I took a course on Late Antiquity and we read a book by Scholar A and then one by Scholar B. Scholar B totally opened by mocking the outrageous claims of Scholar A, except here’s the thing: Scholar A never made those claims. Scholar B set up what is known as a strawman to attack and make his own position seem less questionable. I took this to heart and have found myself saying “Don’t be a [Scholar B]!” before I think about re-sharing or reacting.

A lot of times on social media I see people share posts about how “so many people with belief X are posting this awful sentiment,” and internally I’m like, “that’s weird – I haven’t seen anything that sounds like that, or I saw one comment.” And here’s the thing: they aren’t, or it’s one person, but articles are written as if a whole group rose up and said one thing.

There are many outlets (media and otherwise) that want you to think that the other side is so outrageous that it isn’t worth doing your due diligence – but it’s always worth doing due diligence. And don’t let one comment replace the thoughts of many.

I fail at this a lot – it’s super easy to be Scholar B. It’s comfortable to be Scholar B. But we shouldn’t be. In 2020, let’s all agree to be skeptical when something seems too outrageous to believe and let’s also ask ourselves who benefits when we do believe it?

And then weigh that before we share an article or comment or post.

Related, from Vox.com:

Intellectual humility: the importance of knowing you might be wrong

Why it’s so hard to see our own ignorance, and what to do about it.

I’ve come to appreciate what a crucial tool it is for learning, especially in an increasingly interconnected and complicated world. As technology makes it easier to lie and spread false information incredibly quickly, we need intellectually humble, curious people.

I’ve also realized how difficult it is to foster intellectual humility. In my reporting on this, I’ve learned there are three main challenges on the path to humility:

  1. In order for us to acquire more intellectual humility, we all, even the smartest among us, need to better appreciate our cognitive blind spots. Our minds are more imperfect and imprecise than we’d often like to admit. Our ignorance can be invisible.
  2. Even when we overcome that immense challenge and figure out our errors, we need to remember we won’t necessarily be punished for saying, “I was wrong.” And we need to be braver about saying it. We need a culture that celebrates those words.
  3. We’ll never achieve perfect intellectual humility. So we need to choose our convictions thoughtfully.

This is all to say: Intellectual humility isn’t easy. But damn, it’s a virtue worth striving for, and failing for, in this new year.

Read the whole thing. All I would like to add is that opponents of Trump, not just Trump himself, can fall victim to false and unearned confidence…

Book Reviews

In the discipline of history, the single-authored monograph is the basic unit of scholarship, and an academic’s prestige is usually gauged by the number of such books that he can produce, and by the influence that these books have had. No one has time to read every book that gets published, though, so reviewing is an important service to the profession. Thus, a given issue of the average historical journal will contain about five articles (often preliminary studies that will later appear as chapters in books), and some seventy book reviews. A review is typically about 1,000 words long; it summarizes the book in question and gives a judgment of its quality. The idea is that a scholar will get the journal, look through the reviews to see what’s new, read the ones that are relevant to her interests, and if anything looks really compelling, check the books out from the library, order them through interlibrary loan, or even buy them from the publisher (although academic books do tend to be rather pricey).

In her turn, she will be expected to produce book reviews herself. Summarizing a book is time-consuming, but not too difficult. It’s an exercise in the art of précis – of making the book’s message as simple as possible, but no simpler. The tricky part is judging the book’s quality. For that, you need to know what else has been published on the same topic, and a really good review will cite those works as proof – it will “situate the work in the historiography,” as the jargon has it. The temptation is always there to give the book the benefit of the doubt, on the assumption that the author has more expertise than you do, and that nothing would have gotten published if it weren’t pretty good in the first place. However, a reviewer owes it to his readers to seek out and mention any errors of fact or overall weaknesses. Moreover, some books just aren’t very good, for various reasons. I know someone who refuses to review such books; she’ll just send them back to the journal editor, since writing an honest review might alienate the author, and “you never know who might be on a fellowship selection committee.” This move is better than just lying about a book’s quality, I suppose, but to me it’s cowardly, and I don’t like it. We get tenure for a reason: the idea is that we are licensed to speak truth to power without having to fear for our livelihoods. Such guaranteed job security is not just about opposing Trump, the alt-right, evangelical Christians, Wall Street, or the State of Israel from our perch in the Ivory Tower, but also about calling out the members of our own profession on their mistakes, as uncomfortable as that might make things for us at our next big conference. (At the same time, I don’t believe in being gratuitously mean, like one grad school professor who gave me a C- on a book review that I had written for his seminar, with the comment “you are too nice!” – I had criticized the book, just not forcefully enough for his liking.)

So when you’re reviewing a book, you try to be fair, and if it has any problems, to find a middle ground between turd-polishing and being a big jerk. And you definitely try to review the book that was written, not the book that you wanted to read (another thing that too many academics like to do).

I myself have managed to publish a number of book reviews over the course of my career – by my count 25, some of which have been referenced on this blog. My most interesting experience with book reviewing began in the autumn of 2009, when I received, from the editor of Reviews in History, a draft of a review, by Sam Riches, of my own recently published book on St. George; Riches had herself written a book about the saint. She liked some aspects of my book, but had some reservations about it, and concluded that “it is not the definitive work on St. George in the English tradition – that has yet to be written.” I received the review because one of the features of Reviews in History is that it gives the book’s author a chance to respond. Now, at the time, I believed that one should never respond to a review. This was the message of Paul Fussell’s “Being Reviewed: The A.B.M. and its Theory,” republished in The Boy Scout Handbook and Other Observations. The “A.B.M.,” to Fussell, is the “Author’s Big Mistake,” that is, a “letter from an aggrieved author complaining about a review,” which “generally delivers the most naked view of the author’s wounded vanity” and reads “as if some puling adolescent, cut from the high school basketball team, has published a letter about how good he really is, and written it not very well.” So I let it slide, and the review was published in January 2010, with the notice that “the author of this book has not responded to this review.”

But I came to revise my opinion over the next couple of years. Fussell’s examples of A.B.M.s were taken mostly from the New York Review of Books and the Times Literary Supplement, and mostly involved works of fiction or popular non-fiction. Academic books, I came to perceive, are somewhat different: they make arguments, which can be defended, and as long as one sticks to the facts without getting testy, it’s all part of the conversation. I read a number of responses to reviews that were in this vein, and I figured that there would be no harm in my doing it too. So in the summer of 2012, I took some time out of my life to pen a response to her review, which I’m pleased to say the editor of Reviews in History posted, even at that late date. (For the record, I was never upset that Dr. Riches didn’t give my book fulsome praise. It is better to be talked about than not talked about!) Read both and decide for yourself who makes the better case.

As a result of my contact with the editor, he asked if I would care to review a book that he had just received. I said that I would be happy to, and he sent it to me. I read it twice, as is my habit – it’s one thing to dash off a review for a graduate seminar the night before the assignment is due; it’s quite another to write for publication, when you want to make sure that you have really understood what the author is trying to say, because you want to be fair and you don’t want to appear sloppy in print. This operation took a little longer than I had hoped, and I got the review to him two weeks after the deadline he had given me. In response, I received an email with the subject line “Terribly sorry” and a message saying, “I forgot that I had two copies of that book, and I gave the other copy to someone else for review.”

“So I take it that review was better than mine?” I responded jokingly.

“It wasn’t, actually,” he replied. “But I’ve already sent it out for the author’s response!”

I had a good laugh about this. If nothing else, it shows the importance of meeting deadlines! As it happens, I easily placed my review somewhere else. Generally editors won’t accept unsolicited reviews – they have no idea about the agenda of the would-be reviewer – but after I explained the situation, they were happy to publish it, and they didn’t even ask me to shorten it. The editor of Reviews in History asked if there was anything he could do for me. I said that there was a book I was interested in reviewing, and he arranged to have it sent to me. This one I ended up reading three times, because I found it difficult and I wanted to do it justice, even though I didn’t like it very much (something attributable largely to a disciplinary divide – I am a historian; the author was a literary critic). The review was published in the fall of 2013; the author, as I had originally, chose not to respond.

That summer, I had further contact with the editor of Reviews in History on account of another review of my book that I discovered in the Journal of English and Germanic Philology. By that point my book had been reviewed about a dozen times in various venues – again, some people liked it, others didn’t, and that’s fine. This review, however, sounded strangely familiar. As I read it, I realized that a large chunk of it was simply plagiarized from Sam Riches’s review in Reviews in History! Like a good citizen, I immediately informed Dr. Riches and the editors of Reviews in History and the JEGP about this gross violation of scholarly protocol, and the JEGP withdrew the review in its next number. The author, one Giovanni Narabito of the University of Messina, did not seem to have much of an institutional presence there, and a bit of googling eventually revealed a plausible explanation. The Wikipedia entry for Italian crime boss Giuseppe Narabito explains that his clan “established a cell in Messina on Sicily,” where they “exercise considerable power up to the present. The clan turned the University of Messina into their private fiefdom, ordering that degrees, academic posts, and influence be awarded to favored associates.”

It could be that “Narabito” is the Italian equivalent of Smith or Jones, but it sure looks like someone was promoted for non-academic reasons here. But why not just stick to protection rackets and drug smuggling, I wonder? Those activities are a lot more lucrative!

Barking Abbey

Interesting article by Eleanor Parker on History Today:

The Cultured Women of Essex

We should take more notice of the work of those once despised and disregarded.

‘It is asked of all who hear this work that they do not revile it because a woman translated it. That is no reason to despise it, nor to disregard the good in it.’ Many female writers have probably said, or wanted to say, something very like these words. They were written in the 12th century, around 1170, by a woman who composed one of the earliest texts from England known to be by a female author. She was a nun of Barking Abbey in Essex and, though we do not know her name, her words – and her work – demand attention.

The work she asks us not to disregard is a narrative of the life of Edward the Confessor, written in Anglo-Norman French (‘the false French of England’, the nun modestly calls it). Its author was an educated woman, able to turn a Latin source into engagingly chatty French verse, and Barking Abbey must have been a congenial environment for her. Founded in the seventh century, Barking was one of the foremost nunneries in the country, a wealthy abbey which was home to many well-connected aristocratic and royal women. Its abbesses were frequently appointed from the sisters and daughters of kings and, around the time our nun wrote her Vie d’Edouard le Confesseur, Thomas Becket’s sister Mary – herself a woman of literary interests – was made abbess of Barking in compensation for her brother’s murder.

Across its long history of more than 850 years, Barking Abbey was a centre for women’s learning. It has been described as ‘perhaps the longest-lived … institutional centre of literary culture for women in British history’ and it had a strong literary and scholarly tradition that spanned the Middle Ages. In the early medieval period, authors such as Aldhelm and Goscelin of St Bertin wrote learned Latin works for the nuns of Barking; later, several nuns composed their own poetry and prose – even their own plays. In the 12th century, when women were increasingly becoming patrons, readers and, in some cases, authors of literary texts, Barking produced more than one talented writer. The first female author in England whose name we know, Clemence of Barking, was a nun there; she wrote an accomplished Life of St Catherine of Alexandria, a saint associated with female learning.

Read the whole thing, and a follow-up blog post about it. A choice excerpt:

I’m a UK academic writing primarily for UK audiences (not that I’m not glad to have other readers too!), but online those distinctions are blurred; other academics will pass judgement, from half a world away, on conversations they only half understand, and some of them are very resistant to the idea that in different contexts it might be necessary to speak in different languages, to ask and answer different questions. Even the basic idea that words have different connotations in different varieties of English seems to surprise them. In their particular cultural context, medieval history intersects with questions of identity and exclusion in very different ways, and they won’t listen to anyone who tries to tell them things don’t operate like that everywhere in the world. We all have to do what seems right to us in our own context, and I’m sure they are trying to do that; I only wish they were prepared to consider that the rest of us are trying to do the same, just not in the same way. Some feel entitled to demand that every discussion which touches on ‘their’ subject should address their own immediate social and political concerns – not those of (for instance) the people of Barking, of whose existence they are so loftily unconscious. Some of these people also display a deeply exclusionary view of academic status and the privileges it confers on them, and an attitude little better than contempt for the public at large; if you don’t have a doctorate, you’re not worthy of their time or attention. I’ve been observing this tendency for several years, but it’s particularly noticeable at the moment. Since these academics don’t follow British and Irish politics, they really can’t see why this is such an especially bad time to be making pronouncements on how to use words like ‘English’ and ‘British’, without any understanding of the contemporary sensitivities surrounding those terms, and they seem completely unaware of the wider social context in which UK medievalists have to consider the issue of public engagement. I think some of them truly would prefer it if they could stop the public taking any interest in medieval history at all, because that interest is, to them, always inherently problematic; but while they can decide for themselves if that’s the case in their own countries, it’s absolutely out of the question here. 

Majoring in History

From The Conversation:

Don’t despair if your teen wants to major in history instead of science

It might be your worst nightmare. Your child, sitting at the kitchen table, slides you a brochure from the local university.

“I’ve been thinking of majoring in history.”

Before you panic and begin calling the nearest computer science department, or worse, begin to crack those tired barista jokes, hear me out. This might just be the thing that your child, and our society, needs.

Choosing to become a history major is a future-friendly investment. A history degree teaches skills that are in short supply today: the ability to interpret context, and — crucially — where we’ve been, so as to better understand the world around us today and tomorrow.

We’ve never needed knowledge of history and the skills that come with the discipline more than we do now. Not only is it a good choice of major for all the usual selfish reasons — you’ll likely get a good job, even if it takes a bit longer than in the STEM disciplines, and, more importantly, you’ll probably be very happy with it.

But for our society more generally, we need a generation with deep capacities to acknowledge context and ambiguity. This idea of ambiguity pertains not only to interpreting the past based on a diverse body of incomplete sources, voices and outcomes, but also to how our contemporary judgements of that record shape our choices today.

Our whole society hurts when students turn their back on history. A sense of history — where we have come from, the shared anchors of democratic society, the why and how of our current moment in time — is critical.

Read the whole thing.

History In and Out of the Classroom

From The Federalist:

Americans have a hunger to understand, explore, and connect with their history. Richly sourced, intellectually demanding accounts of the country’s defining moments and characters do more than break through the noise.

Indeed, historians are probably the scholars most celebrated outside the confines of the academy. They are among the few who shape our cultural landscape—from a place of learning. As though to prove the point, Chernow’s 832-page 2005 biography of Alexander Hamilton, also a New York Times best-seller, inspired the most talked-about Broadway musical in a generation. Only on the American college campus is American history in retreat.

How strange it is that U.S. colleges and universities are abandoning the study of American history and, at some institutions, the study of history altogether. The American Council of Trustees and Alumni evaluates the general education programs of more than 1,100 colleges and universities every year. The 2018–19 report found that only 17 percent of them required any kind of foundational course in American history or government. As of 2016, only four of the top 25 national universities (as ranked by U.S. News and World Report) required their history majors to take a course in U.S. history.

In this light, it is perhaps unsurprising that history programs in the United States are struggling to generate student interest. The American Historical Association drew attention to cratering undergraduate degree production last year: the number of history degrees awarded annually has fallen almost 34 percent since 2011, more steeply than in any other discipline in the liberal arts.

This is true even at my alma mater, Dartmouth College, where I attended my 25th reunion last weekend.

Previous thoughts on the matter.

UPDATE: This is the 1000th post published on FFT since the blog’s inception in September 2014!

History is History

Here is another article on a recent theme:

A recent study confirms a disturbing trend: American college students are abandoning the study of history. Since 2008, the number of students majoring in history in U.S. universities has dropped 30 percent, and history now accounts for a smaller share of all U.S. bachelor’s degrees than at any time since 1950. Although all humanities disciplines have suffered declining enrollments since 2008, none has fallen as far as history. And this decline in majors has been even steeper at elite, private universities — the very institutions that act as standard bearers and gate-keepers for the discipline. The study of history, it seems, is itself becoming a relic of the past.

It is tempting to blame this decline on relatively recent factors from outside the historical profession. There are more majors to choose from than in the past. As a broader segment of American society has pursued higher education, the promising job prospects offered by other fields, from engineering to business, have no doubt played a role in history’s decline. Women have moved in disproportionate numbers away from the humanities and towards the social sciences. The lingering consequences of the Great Recession and the growing emphasis on STEM education have had their effects as well.

Yet a deeper dive into the statistics reveals that history’s fortunes have worsened not over a period of years, but over decades. In the late 1960s, over six percent of male undergraduates and almost five percent of female undergraduates majored in history. Today, those numbers are less than 2 percent and 1 percent. History’s collapse began well before the financial crash.

This fact underscores the sad truth of history’s predicament: The discipline mostly has itself to blame for its current woes. In recent decades, the academic historical profession has become steadily less accessible to students and the general public — and steadily less relevant to addressing critical matters of politics, diplomacy, and war and peace. It is not surprising that students are fleeing history, for the historical discipline has long been fleeing its twin responsibilities to interact with the outside world and engage some of the most fundamental issues confronting the United States.

More at the link.

I’m not quite sure that I agree with his critique. First, it’s important to note that just because the number of history majors has declined does not mean that the study of history itself has declined – history remains part of most general education requirements, and of course people can minor in history while majoring in something else. And that something else, as noted, has to start paying off immediately.

But I don’t think that the popularity of the history major has much to do with academic historians’ engagement with the public sphere. My impression is that most high school and/or college students, if they are interested in history, are interested in the history itself. They don’t know who the big names are, or whether these people actually hold academic appointments (I certainly didn’t). There is still lots of popular history out there, whether in the form of books, television shows, movies, or video games (some of which, I feel compelled to state, is even produced by real academics, and most of which is based on work that they’ve done).

Or is the idea that popular and academic history have diverged too much in their concerns? Picture a young man who is interested in World War II on account of watching Inglourious Basterds and playing a lot of D-Day, who comes to university only to discover that the only course offered on World War II focusses heavily on Rosie the Riveter, race relations in southern factories, and Japanese internment. These are legitimate topics, of course, but you can see why some students might consider them of secondary importance to the actual war itself. I don’t think that this is as much of a problem as people think, but I admit that it can be a problem. History ought to be a house of many mansions, and any substantial history department should (and generally does) try very hard to recruit members who specialize in different times and places… and approaches, including diplomatic and military history. But what happens if there are no applicants in those latter fields? This is a function of competitive convergence – there are so many would-be academics chasing so few jobs that everyone specializes in something they think will win them employment, something designated “hot,” “up to date,” and “relevant.” No one is willing to take a risk on specializing in a niche interest. This is a shame.*

But I don’t know what can be done about it, except for all AHA member departments to make a pact to severely limit the number of Ph.D. students they take on, in the interests of loosening up and (paradoxically) diversifying the job market – even if it means that some of the professors will have to do some of their own grunt work, which means that it will never happen. Failing that, I do think that an expansive spirit is required, a deeply ingrained (and enforced, if need be) principle that the same right that you have to do your thing guarantees someone else’s right to do her thing, i.e. a spirit of genuine diversity, whereby no one’s approach is better than anyone else’s. Alas, if team intersectional gets its way, very soon there will be a sort of Bechdel test for the acceptance of scholarship in Speculum or panels at Kalamazoo, because these people require constant validation, and it’s never enough that they get to do what they want to do: you have to do it too. The totalitarianism behind this impulse has always annoyed me. Live and let live, eh?

* Robert Hughes (in his Culture of Complaint) quotes the critic Louis Menand:

most of the academic world is a vast sea of conformity, and every time a new wave of theory and methodology rolls through, all the fish try to swim in its direction. Twenty years ago every academic critic of literature was talking about the self, its autonomy and its terrible isolation. Today not a single respectable academic would be caught dead anywhere near the word, for the “self” is now the “subject” and the subject, everyone heartily agrees, is a contingent construction… what ought to be most distressing to everyone is the utter predictability of the great majority of the academic criticism that gets published.

He’s talking about literary study, but it applies in a lesser way to history too.

Our Appeal Has Become More Selective

Some sobering news from IHE:

History has seen the steepest decline in majors of all disciplines since the 2008 recession, according to a new analysis published in the American Historical Association’s Perspectives on History.

“The drop in history’s share of undergraduate majors in the last decade has put us below the discipline’s previous low point in the 1980s,” reads the analysis, written by Benjamin M. Schmidt, an assistant professor of history at Northeastern University.

Some numbers: there were 34,642 history degrees conferred in 2008, according to federal data. In 2017, the most recent year for which data are available, there were 24,266. Between 2016 and 2017 alone, there was a drop-off of 1,500 majors. And even as overall university enrollments have grown, “history has seen its raw numbers erode heavily,” Schmidt wrote, especially since 2011-12.

“Of all the fields I’ve looked at, history has fallen more than any other in the last six years,” he says. The 2012 time frame is significant, according to the analysis, because it’s the first period in which students who experienced the financial crisis could easily change their majors.

The data represent a “new low” for the history major, Schmidt wrote. While the 66 percent drop in history’s share of majors from 1969 to 1985 remains the “most bruising” period in the discipline’s history, that drop followed a period of rapid enrollment expansion. The more recent drop is worse than history’s previous low point, in the 1980s.

I think that one of the main reasons for the decline in the history major is that university tuition fees keep rising far beyond the rate of inflation, so that students, of necessity, must see university as a financial investment that needs to start paying off immediately, rather than as an incubator of cultural literacy, informed citizenship, and a personal life philosophy, as it may once have been. I am not saying that history majors can’t perform well in a wide variety of jobs, precisely because they can conduct research and present it coherently; it’s just that they have to overcome certain hurdles before they can convince people to hire them. Nor would I discount the politicization of the discipline, although this is not nearly as bad as some commentators would like to suggest (the profession as a whole might lean to the left, but you can always find professors who keep their politics to themselves, or who are even conservative). But I take consolation in the fact that our appeal really is selective: to do history properly you need intelligence and motivation, literacy and hard work. These qualities are less common than you might imagine.