History is History

Here is another article on a recent theme:

A recent study confirms a disturbing trend: American college students are abandoning the study of history. Since 2008, the number of students majoring in history in U.S. universities has dropped 30 percent, and history now accounts for a smaller share of all U.S. bachelor’s degrees than at any time since 1950. Although all humanities disciplines have suffered declining enrollments since 2008, none has fallen as far as history. And this decline in majors has been even steeper at elite, private universities — the very institutions that act as standard bearers and gate-keepers for the discipline. The study of history, it seems, is itself becoming a relic of the past.

It is tempting to blame this decline on relatively recent factors from outside the historical profession. There are more majors to choose from than in the past. As a broader segment of American society has pursued higher education, the promising job prospects offered by other fields, from engineering to business, have no doubt played a role in history’s decline. Women have moved in disproportionate numbers away from the humanities and towards the social sciences. The lingering consequences of the Great Recession and the growing emphasis on STEM education have had their effects, as well.

Yet a deeper dive into the statistics reveals that history’s fortunes have worsened not over a period of years, but over decades. In the late 1960s, over six percent of male undergraduates and almost five percent of female undergraduates majored in history. Today, those numbers are less than 2 percent and 1 percent. History’s collapse began well before the financial crash.

This fact underscores the sad truth of history’s predicament: The discipline mostly has itself to blame for its current woes. In recent decades, the academic historical profession has become steadily less accessible to students and the general public — and steadily less relevant to addressing critical matters of politics, diplomacy, and war and peace. It is not surprising that students are fleeing history, for the historical discipline has long been fleeing its twin responsibilities to interact with the outside world and engage some of the most fundamental issues confronting the United States.

More at the link.

I’m not quite sure that I agree with his critique. First, it’s important to note that just because the number of history majors has declined, it does not mean that the study of history itself has declined – history remains part of most general education requirements, and of course people can minor in history while majoring in something else. And that something else, as noted, has to start paying off immediately.

But I don’t think that the popularity of the history major has much to do with academic historians’ engagement with the public sphere. My impression is that most high school and/or college students, if they are interested in history, are interested in the history itself. They don’t know who the big names are, or whether these people actually hold academic appointments (I certainly didn’t). There is still lots of popular history out there, whether in the form of books, television shows, movies, or video games (some of which, I feel compelled to state, is even produced by real academics, and most of which is based on work that they’ve done).

Or is the idea that popular and academic history have diverged too much in their concerns? Picture a young man who is interested in World War II on account of watching Inglourious Basterds and playing a lot of D-Day, who comes to university only to discover that the only course offered on World War II focusses heavily on Rosie the Riveter, race relations in southern factories, and Japanese internment. These are legitimate topics, of course, but you can see why some students might consider them of secondary importance to the actual war itself. I don’t think that this is as much of a problem as people think, but I admit that it can be a problem. History ought to be a house of many mansions, and any substantial history department should (and generally does) try very hard to recruit members who specialize in different times and places… and approaches, including diplomatic and military history. But what happens if there are no applicants in those latter fields? This is a function of competitive convergence – there are so many would-be academics chasing so few jobs that everyone specializes in something they think will win them employment, something designated “hot,” “up to date,” and “relevant.” No one is willing to take a risk on specializing in a niche interest. This is a shame.*

But I don’t know what can be done about it, except for all AHA member departments to make a pact to severely limit the number of Ph.D. students they take on, in the interests of loosening up and (paradoxically) diversifying the job market – even if it means that some of the professors will have to do some of their own grunt work, which means that it will never happen. Failing that, I do think that an expansive spirit is required, a deeply ingrained (and enforced, if need be) principle that the same right that you have to do your thing guarantees someone else’s right to do her thing, i.e. a spirit of genuine diversity, whereby no one’s approach is better than anyone else’s. Alas, if team intersectional gets its way, very soon there will be a sort of Bechdel test for the acceptance of scholarship in Speculum or panels at Kalamazoo, because these people require constant validation, and it’s never enough that they get to do what they want to do; you have to do it too. The totalitarianism behind this impulse has always annoyed me. Live and let live, eh?

* Robert Hughes (in his Culture of Complaint) quotes the critic Louis Menand:

most of the academic world is a vast sea of conformity, and every time a new wave of theory and methodology rolls through, all the fish try to swim in its direction. Twenty years ago every academic critic of literature was talking about the self, its autonomy and its terrible isolation. Today not a single respectable academic would be caught dead anywhere near the word, for the ‘self’ is now the ‘subject’ and the subject, everyone heartily agrees, is a contingent construction… what ought to be most distressing to everyone is the utter predictability of the great majority of the academic criticism that gets published.

He’s talking about literary study, but it applies in a lesser way to history too.

A Plague of Plague

From Science News, courtesy my friend William Campbell:

A 5,000-year-old mass grave harbors the oldest plague bacteria ever found

A long-dead Scandinavian woman has yielded bacterial DNA showing that she contracted the earliest known case of the plague in humans.

DNA extracted from the woman’s teeth comes from a newly identified ancient strain of Yersinia pestis, the bacterium that causes plague, the oldest ever found. The woman’s bones, which date from 5,040 to 4,867 years ago, were found nearly 20 years ago in a mass grave at an ancient farming site in Sweden.

Teeth from an adult male in the same grave contain traces of the same plague variant, say evolutionary geneticist Simon Rasmussen of the University of Copenhagen and colleagues. But plague DNA from the woman is better preserved, the team reports online December 6 in Cell.

Comparisons of the newly found Y. pestis strain with other ancient and modern strains suggest that a plague epidemic emerged more than 5,000 years ago in densely populated farming communities in southeastern Europe. Then the plague spread elsewhere, including to Scandinavia, via trade routes, Rasmussen’s team concludes. That ancient epidemic apparently contributed to sharp population declines in Europe that began as early as 8,000 years ago.

In particular, the scientists suspect that an early form of plague developed among southeastern Europe’s Trypillia culture between 6,100 and 5,400 years ago. Trypillia settlements were the first to bring enough people into close contact to enable the evolution of a highly infectious version of Y. pestis, the team suggests. Trading networks then transmitted the plague from Trypillia population centers, home to as many as 10,000 to 20,000 people, to West Asian herders known as the Yamnaya, the researchers argue. In this scenario, herders infected by the Trypillia people probably spread what had become a new strain of the plague both eastward to Siberia and westward to the rest of Europe, including Scandinavia. Yamnaya migrations to Europe roughly coincided with the rapid abandonment and burning of large Trypillia settlements, which probably occurred as a result of plague outbreaks, the scientists say.

More at the link.

Bronze Age Collapse

Richard Fernandez on PJMedia:

Surprise Collapse

One of the biggest mysteries in history is the late Bronze Age Collapse. There’s no good explanation for why an early globalized civilization should suddenly disappear at around 1177 BC. “Within a period of forty to fifty years at the end of the thirteenth and the beginning of the twelfth century almost every significant city in the eastern Mediterranean world was destroyed, many of them never to be occupied again.”

Modern archaeologists have advanced a number of theories to explain this catastrophe, several of which will sound familiar to modern ears. Climate change — not the anthropogenic kind, since “fossil fuels” had not yet been developed — might have caused drought and starvation. A technological revolution caused by the replacement of bronze with iron could have destabilized the international system. Perhaps the most modern-sounding of all explanations is “complexity.” The interdependence fostered by trade left the linked empires open to a general systems collapse as the failure in one place unleashed a cascade of effects in others:

The growing complexity and specialization of the Late Bronze Age political, economic, and social organization, in Carol Thomas and Craig Conant’s phrase, together made the organization of civilization too intricate to reestablish piecewise when disrupted. That could explain why the collapse was so widespread and able to render the Bronze Age civilizations incapable of recovery.

The critical flaws of the Late Bronze Age are its centralization, specialization, complexity, and top-heavy political structure. These flaws then were exposed by sociopolitical events (revolt of peasantry and defection of mercenaries), fragility of all kingdoms (Mycenaean, Hittite, Ugaritic, and Egyptian), demographic crises (overpopulation), and wars between states. Other factors that could have placed increasing pressure on the fragile kingdoms include piracy by the Sea Peoples interrupting maritime trade, as well as drought, crop failure, famine, or the Dorian migration or invasion.

Eric Cline, professor of ancient history at The George Washington University, believes the collapse was caused not by a single factor but by all of the above. Cline called it “the perfect storm” in his YouTube lecture. The published summary of his book 1177 BC on Amazon puts it this way:

The end was brought about by multiple interconnected failures, ranging from invasion and revolt to earthquakes, drought, and the cutting of international trade routes. Bringing to life the vibrant multicultural world of these great civilizations, he draws a sweeping panorama of the empires and globalized peoples of the Late Bronze Age and shows that it was their very interdependence that hastened their dramatic collapse and ushered in a dark age that lasted centuries.

Interest in the Bronze Age collapse is fueled no doubt by fears that our own civilization may meet the same fate. A 2017 BBC article exploring ways our current civilization could fail warns against dangers roughly analogous to those which brought down the world of Troy.

Read them at the link.

Rowan Williams on King Arthur

The former Archbishop of Canterbury writes in the New Statesman (hat tip: Chris Berard):

Our once and future king

No indisputable evidence exists for a “real” King Arthur, but, fictional or not, Britain has always needed him.

BY ROWAN WILLIAMS

Does anyone now read the historical novels of Henry Treece? A minor poet associated with the postwar “New Apocalyptic” group, he produced in the 1950s and 1960s a steady stream of fiction for the adult and young adult market, set mostly in early Britain and in the Viking age. The books are characterised by vivid, simple and sometimes repetitive plotting, ample bloodshed, a well-judged mixture of the cynical and the romantic, and plenty of gloomy Celtic and Nordic atmospherics. Several of the novels feature an “historical” King Arthur – a sixth-century warlord, co-ordinating resistance to the invading Saxons. Treece portrays with some skill the ways in which such a figure might have manipulated vague memories of Roman power and cultural identity to shore up his dominance in a chaotic post-Roman Britain.

The picture Treece outlines (a picture that can be found in rather less highly coloured narratives by writers such as Rosemary Sutcliff and Meriol Trevor) is in fact not too far away from what a substantial number of professional historians of the mid-20th century had come to take for granted. The withdrawal of Roman military presence from Britain in the first quarter of the fifth century must have left the native population at the mercy of rapidly increasing swarms of settlers from north-western Europe, who pushed across lowland Britain, sacking Roman settlements and killing a substantial proportion of the population. Archaeology seemed to support this picture: Roman towns had been ruined and abandoned, British hill settlements were reoccupied and refortified. There appeared to be a bit of a hiatus in “Saxon” settlement in the first half of the sixth century, however, and some historians saw this as the result of a concerted campaign of British resistance.

There was an obvious gap for an “Arthurian” figure to fill, a military leader with nationwide authority, leaving a legacy in popular memory strong enough ultimately to generate the familiar legends of a great British hero and king. We are on our way to the Round Table and the Holy Grail and all the other riches of the “Matter of Britain”, as the medieval authors called the jungle of legendary traditions that grew around the name of Arthur.

More at the link.

It’s interesting how some figures who resisted invasion are later appropriated by the invaders themselves. King Arthur is one such; if he ever existed, he would have been a Romanized Briton defending an (unwillingly) independent Britannia from Anglian and Saxon invasion. The Britons lost, of course, and were pushed into Wales and Cornwall, where they consoled themselves that some day Arthur would return and vindicate their claim to their ancestral homeland. Geoffrey of Monmouth (c. 1095-c. 1155), a man of Welsh background who entered the church and who ended up at Oxford, wrote the History of the Kings of Britain, which included a long chapter on Arthur. From that point on, Arthur ceased to be a Welsh figure and became an English one, since he defended the island from invaders.

Novi Georgii Sancti

My thanks to everyone who sends me images of St. George. Here are some newly-acquired ones:

From Arkadi monastery in Crete, courtesy of my friend Christina Heckman: a seventeenth-century “Hagios Georgios o Kephalophoros,” that is, St. George the Cephalophore. I have never heard of St. George as a cephalophore (own-head-carrier) – and note that he has sprouted a new head.

Also from Christina Heckman at Arkadi: St. George the Trophy-Bearer, complete with the pitcher-bearing boy.

From my friend Daniel Holmes at the British Museum. My guess is that this one is fifteenth-century and German.

My friend Kevin Harty enjoyed a trip to Spain and Portugal over Thanksgiving break, which included a visit to Casa Botines, a modernist building by Antoni Gaudí in the city of León, Spain.

Over the main entrance, a St. George killing what looks like a Komodo dragon.

From Ronald Good: a classic Orthodox dragon-killing icon, reproduced on a funeral card.

Another prayer card from Ronald Good, this one designated “Hl. Georg Das Drachenwunder – Ikonen-Museum, Recklinghausen.”

George Bush

(I refer to him as he was referred to at the time of his presidency. Having to insert “H.W.” as his middle initials is proleptic.)

In our culture, one does not speak ill of the dead, but public figures usually merit some sort of even-handed evaluation. However, most of the obituaries I have read about George Bush, 41st President of the United States (1989-1993), have been rather hagiographic in tone, praising Bush’s class, civility, and devotion to public service. This is, of course, a deliberate and pointed jab at the current administration, whose leader is the cultural antithesis of the patrician Bush, and who has made a lot of enemies through his abrasive boorishness. But by no means was Bush praised for his class when he was in office! Back then, he was the Skull and Bones son of privilege, out of touch with how ordinary Americans actually lived. I thought of this as recently as July, when during one of his rallies President Trump said:

You know all the rhetoric you see here, the “thousand points of light” – what the hell was that, by the way? The “thousand points of light.” What did that mean? Does anyone know? I know one thing: “Make America Great Again” we understand. “Thousand points of light” – I never quite got that one. What the hell is that? Has anyone ever figured that one out? Ay. And it was put out by a Republican.

Some earnest CNN talking heads took issue with that, saying that it was about volunteerism and civic mindedness, obviously, and who could have a problem with these most American of values? They were shocked that Trump would run down a fellow Republican and war hero. And I was reminded how, when in power, Republicans are evil incarnate, but when they’re no longer in power, they become respected elder statesmen. For I remember the “Thousand Points of Light,” and how, to Doonesbury at least, it was a disturbing abdication of responsibility. Since Republicans hate poor people, you see, they gut social programs and then offload the function to private charity, which is a weak substitute with no guarantee that anything will be delivered. But the CNN folks apparently forgot that critique.

(It’s obvious to me what Trump was doing: signaling that it’s not the Bushes’ party anymore! In addition to pointing out that his slogan is more straightforward, and thus more inspiring, than Bush’s “poetic” one, Trump was simply playing to the base that elected him, and that had been disaffected by establishment Republicanism, most notably over the issue of illegal immigration.)

So I must say that I appreciated this article on The Intercept, shared by a couple of Facebook friends, about Bush’s legacy, even if I disagree with some of it. For instance, I fail to comprehend what was so bad about the Willie Horton ad. But his actual role in the Iran-Contra scandal, his pardoning of some of the perpetrators, and the dishonest case his administration made for the war against Iraq, all deserved to be remembered. (Along with the ADA and NAFTA of course.)

I do like revisiting the time when he overcame the “wimp factor” with Dan Rather in 1988:

I want to talk about why I want to be president, why those 41% of the people are supporting me. And I don’t think it’s fair to judge a whole career by a rehash on Iran. How would you like it if I judged your career by those seven minutes when you walked off the set in New York. Would you like that? I have respect for you but I don’t have respect for what you’re doing here tonight.

But he wasn’t always so deadly with his words. Everyone knows about George W. Bush’s “they misunderestimated me” or “Is our children learning?”; people tend to forget that Bush himself committed a few verbal infelicities, e.g.:

“For seven and a half years I’ve worked alongside President Reagan. We’ve had triumphs. Made some mistakes. We’ve had some sex – uh – setbacks.” —in 1988

“We’re enjoying sluggish times, and not enjoying them very much.” —in 1992

“I just am not one who – who flamboyantly believes in throwing a lot of words around.” —in 1990

“Please don’t ask me to do that which I’ve just said I’m not going to do, because you’re burning up time. The meter is running through the sand on you, and I am now filibustering.” —in 1989

“I put confidence in the American people, in their ability to sort through what is fair and what is unfair, what is ugly and what is unugly.” —in 1989

“You cannot be President of the United States if you don’t have faith. Remember Lincoln, going to his knees in times of trial and the Civil War and all that stuff. You can’t be. And we are blessed. So don’t feel sorry for – don’t cry for me, Argentina. Message: I care.” —speaking to employees of an insurance company during the 1992 New Hampshire primary

“I’m not the most articulate emotionalist.” —in 1989

“It has been said by some cynic, maybe it was a former president, ‘If you want a friend in Washington, get a dog.’ Well, we took them literally—that advice—as you know. But I didn’t need that because I have Barbara Bush.” —in 1989

“Please just don’t look at the part of the glass, the part that is only less than half full.” —in 1991

In a Stable, ‘Tis a Fable

From Psephizo (hat tip: Cory Schantz):

*****

Once more: Jesus was not born in a stable

December 3, 2018 by Ian Paul

I am sorry to spoil your preparations for Christmas before the Christmas lights have even gone up—though perhaps it is better to do this now than the week before Christmas, when everything has been carefully prepared. But Jesus wasn’t born in a stable, and, curiously, the New Testament hardly even hints that this might have been the case.

So where has the idea come from? I would track the source to three things: traditional elaboration; issues of grammar and meaning; and ignorance of first-century Palestinian culture.

The elaboration has come about from reading the story through a ‘messianic’ understanding of Is 1.3:

The ox knows its master, the donkey its owner’s manger, but Israel does not know, my people do not understand.

The mention of a ‘manger’ in Luke’s nativity story, suggesting animals, led mediaeval illustrators to depict the ox and the ass recognising the baby Jesus, so the natural setting was a stable—after all, isn’t that where animals are kept? (Answer: not necessarily!)

The second issue, and perhaps the heart of the matter, is the meaning of the Greek word kataluma in Luke 2.7. Older versions translate this as ‘inn’:

And she brought forth her firstborn son, and wrapped him in swaddling clothes, and laid him in a manger; because there was no room for them in the inn. (AV).

There is some reason for doing this; the word is used in the Greek Old Testament (the Septuagint, LXX) to translate a term for a public place of hospitality (eg in Ex 4.24 and 1 Samuel 9.22). And the etymology of the word is quite general. It comes from kataluo meaning to unloose or untie, that is, to unsaddle one’s horses and untie one’s pack. But some fairly decisive evidence in the opposite direction comes from its use elsewhere. It is the term for the private ‘upper’ room where Jesus and the disciples eat the ‘last supper’ (Mark 14.14 and Luke 22.11; Matthew does not mention the room). This is clearly a reception room in a private home. And when Luke does mention an ‘inn’, in the parable of the man who fell among thieves (Luke 10.34), he uses the more general term pandocheion, meaning a place in which all (travellers) are received, a caravanserai.

The difference is made clear in this pair of definitions:

Kataluma (Gr.) – “the spare or upper room in a private house or in a village […] where travelers received hospitality and where no payment was expected” (ISBE 2004). A private lodging which is distinct from that in a public inn, i.e. caravanserai, or khan.

Pandocheion, pandokeion, pandokian (Gr.) – in 5th C. BC Greece, an inn used for the shelter of strangers (pandokian = ‘all receiving’). The pandokeion had a common refectory and dormitory, with no separate rooms allotted to individual travelers (Firebaugh 1928).

The third issue relates to our understanding of (you guessed it) the historical and social context of the story. In the first place, it would be unthinkable that Joseph, returning to his place of ancestral origins, would not have been received by family members, even if they were not close relatives. Kenneth Bailey, who is renowned for his studies of first-century Palestinian culture, comments:

Even if he has never been there before he can appear suddenly at the home of a distant cousin, recite his genealogy, and he is among friends. Joseph had only to say, “I am Joseph, son of Jacob, son of Matthan, son of Eleazar, the son of Eliud,” and the immediate response must have been, “You are welcome. What can we do for you?” If Joseph did have some member of the extended family resident in the village, he was honor-bound to seek them out. Furthermore, if he did not have family or friends in the village, as a member of the famous house of David, for the “sake of David,” he would still be welcomed into almost any village home.

Moreover, the actual design of Palestinian homes (even to the present day) makes sense of the whole story. As Bailey explores in his Jesus Through Middle-Eastern Eyes, most families would live in a single-room house, with a lower compartment for animals to be brought in at night, and either a room at the back for visitors, or space on the roof. The family living area would usually have hollows in the ground, filled with hay, from which the animals would feed.

This kind of one-room living with animals in the house at night is evident in a couple of places in the gospels. In Matt 5.15, Jesus comments:

Neither do people light a lamp and put it under a bowl. Instead they put it on its stand, and it gives light to everyone in the house.

This makes no sense unless everyone lives in the one room! And in Luke’s account of Jesus healing a woman on the sabbath (Luke 13.10–17), Jesus comments:

Doesn’t each of you on the Sabbath untie your ox or donkey from the manger [same word as Luke 2.7] and lead it out to give it water?

Interestingly, none of Jesus’ critics respond, ‘No I don’t touch animals on the Sabbath’ because they all would have had to lead their animals from the house. In fact, one late manuscript variant reads ‘lead it out from the house and give it water.’

*****

More at the link.

“Lazy, Arrogant Cowards”

From the Telegraph (hat tip: Chris Berard):

Lazy, arrogant cowards: how English saw French in 12th century

A twelfth-century poem newly translated into English casts fresh light on the origin of today’s Francophobic stereotypes.

Although it is meant to be an ‘entente cordiale’, the relationship between the English and the French has been anything but neighbourly.

When the two nations have not been clashing on the battlefield or the sporting pitch they have been trading insults from ‘frogs’ to ‘rosbifs’.

Now the translation of the poem has shown just how deep-rooted in history the rivalry and name-calling really is.

Written between 1180 and 1194, a century after the Norman Conquest united England and Normandy against a common enemy in France, the 396-line poem was part of a propaganda war between London and Paris.

Poet Andrew de Coutances, an Anglo-Norman cleric, describes the French as godless, arrogant and lazy dogs. Even more stingingly, he accuses French people of being cowardly, and calls them heretics and rapists.

It has taken David Crouch, a professor of medieval history at Hull University, months to complete the translation of what is one of the earliest examples of anti-French diatribe.

The poem was written at a time when Philip II of France was launching repeated attacks on Normandy, taking advantage of in-fighting within the English royal family.

Prof Crouch says that the poem is of great interest to historians because of its “racial rhetoric”, which was deployed by Anglo-Norman intellectuals in support of their kings’ bitter political and military struggle.

Extracts from the poem may be read at the link. I have enjoyed hearing Prof. Crouch present at Kalamazoo. It’s interesting how this is an example of the antiquity of ethnic animus; it’s not as if it was invented yesterday and then projected onto the past.

Our Appeal Has Become More Selective

Some sobering news from IHE:

History has seen the steepest decline in majors of all disciplines since the 2008 recession, according to a new analysis published in the American Historical Association’s Perspectives on History.

“The drop in history’s share of undergraduate majors in the last decade has put us below the discipline’s previous low point in the 1980s,” reads the analysis, written by Benjamin M. Schmidt, an assistant professor of history at Northeastern University.

Some numbers: there were 34,642 history degrees conferred in 2008, according to federal data. In 2017, the most recent year for which data are available, there were 24,266. Between 2016 and 2017 alone, there was a 1,500 major drop-off. And even as overall university enrollments have grown, “history has seen its raw numbers erode heavily,” Schmidt wrote, especially since 2011-12.

“Of all the fields I’ve looked at, history has fallen more than any other in the last six years,” he says. The 2012 time frame is significant, according to the analysis, because it’s the first period in which students who experienced the financial crisis could easily change their majors.

The data represent a “new low” for the history major, Schmidt wrote. While a 66 percent drop in history’s share of majors from 1969 to 1985 remains the “most bruising” period in the discipline’s history, that drop followed a period of rapid enrollment expansion. The more recent drop is worse than history’s previous low point, in the 1980s.
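As a quick sanity check (my own arithmetic, not part of the IHE piece), the raw degree counts quoted above do square with the “30 percent” decline cited at the top of this post:

```python
# Federal degree counts as quoted in the IHE piece above
degrees_2008 = 34_642
degrees_2017 = 24_266

# Fractional decline in history degrees conferred, 2008 to 2017
decline = (degrees_2008 - degrees_2017) / degrees_2008
print(f"{decline:.1%}")  # prints 30.0%
```

So the two sets of figures are consistent with each other.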

I think that one of the main reasons for the decline in the history major is that university tuition fees have continually risen far beyond the rate of inflation, so that students, of necessity, must see university as a financial investment that needs to start paying off immediately, rather than as an incubator of cultural literacy, informed citizenship, and a personal life philosophy, as it may once have been. I am not saying that history majors can’t perform well in a wide variety of jobs (precisely because they can conduct research and present it coherently); it’s just that they have to overcome certain hurdles before they can convince people to hire them. I would not discount the politicization of the discipline, although this is not nearly as bad as some commentators would like to suggest (the profession as a whole might lean to the left, but you can always find professors who keep their politics to themselves, or who are even conservative). But I take consolation in the fact that our appeal really is selective: to do history properly you need intelligence and motivation, literacy and hard work. These qualities are less common than you might imagine.