The Hearts of Reformers

Wikipedia.

A well-known symbol of Lutheranism is the so-called Luther Rose, which features a black cross on a red heart at its center. It was devised for Luther in 1530 and carries multivalent symbolism. Luther claimed that:

my seal is a symbol of my theology. The first should be a black cross in a heart, so that I myself would be reminded that faith in the Crucified saves us. Although it is indeed a black cross, which mortifies and which should also cause pain, it leaves the heart in its natural color. Such a heart should stand in the middle of a white rose, to show that faith gives joy, comfort, and peace. Such a rose should stand in a sky-blue field, symbolizing that such joy in spirit and faith is a beginning of the heavenly future joy, which begins already, but is grasped in hope, not yet revealed. And around this field is a golden ring, symbolizing that such blessedness in Heaven lasts forever and has no end. 

“I wonder what symbol Calvin used?” I mused to my wife at dinner last night. “Probably a tulip,” she replied with eminent good sense. TULIP, of course, is an acronym for the five points of Calvinist theology, viz.:

Total depravity
Unconditional election
Limited atonement
Irresistible grace
Perseverance of the saints

The problem is that this acronym doesn’t work in French or Latin, the two languages that Calvin operated in. Plus, the tulip may not have been introduced into Europe before Calvin’s death in 1564. 

As it turns out, Calvin used not a flower but a heart, held in a hand, illustrating the motto “Cor meum tibi offero, Domine, prompte et sincere,” that is, “My heart I offer to you, O Lord, promptly and sincerely.”


I’m not sure who drew this but I found it at The Josh Link

Calvin University.

This seventeenth-century medal was struck in memory of Calvin, and the image can be found at the Calvin University (Grand Rapids) website

Calvin University.

Calvin University itself uses a version of the emblem and motto. 

The more you know! Personal emblems, especially if properly heraldic, ought to make a comeback.

Ernst Kantorowicz

From the Chronicle of Higher Education (hat tip: Paul Halsall), a remembrance of a mildly famous mid-century episode:

The Right-Wing Medievalist Who Refused the Loyalty Oath: On Ernst Kantorowicz, academic freedom, and “the secret university.”

In 1950, Ernst Kantorowicz, a distinguished professor of medieval history, was fired from the University of California at Berkeley for refusing to sign an oath of loyalty, which had been mandated, in a fit of Cold War panic, by the University of California’s Board of Regents. Kantorowicz principally objected to the Board of Regents’ requirement that all professors with U.S. citizenship declare in writing that they upheld the Constitution and were not members of any organization advocating the government’s overthrow.

Kantorowicz was by no means alone in his refusal to sign. Across the UC system, another 36 tenured professors lost their jobs alongside him. As it turned out, California’s Supreme Court overturned the sackings. By then it didn’t matter much for Kantorowicz. He had already found a job at the Institute for Advanced Study in Princeton.

Looking back, this incident may seem trivial enough: just another display of Cold War paranoia, just another demonstration of supine conciliation on the part of university authorities.

But we shouldn’t let Kantorowicz’s firing fall out of institutional memory. If anything, his act has become more rather than less significant, because, paradoxically, the reasons he gave for his refusal were so peculiar, so out of touch. They were remote from ordinary ways of thinking about the professoriate’s role and status then. They are even more remote now. This very remoteness can suggest new ways for professors to relate to the university system today, as it becomes unmoored from centuries-old traditions and legitimations and as the empire of obsolescence expands.

In refusing to sign the loyalty oath, Kantorowicz did not appeal primarily to the notion of “academic freedom” as articulated by John Dewey and others earlier in the century. Nor did he refuse to sign because he was any kind of leftist. To the contrary, he was (as he put it in the pamphlet he wrote about the affair) a “conservative” who, as a volunteer fighter against the Munich 1919 uprising, had actually killed Communists.

His reasons appealed to a different conceptual or institutional tradition than any acknowledged either in modern politics or by modern academic administration. He believed that a professor is “entrusted with” an office in a particular “body corporate,” or corpus mysticum, i.e., a university. That status was defined in medieval Europe when universities were established as a universitas magistrorum et scholarium — as bodies made up of students and professors and nobody else.

As a corporation, the university had a particular legal status. It could not be identified with the sum of its members; it was rather a disembodied entity, permanent and immortal. What enabled the scholar to participate in the university was professorial office, which endowed its bearer with “dignity.” Dignity, thus conceived, is not a personal comportment but a quality essential to office. Or rather: In a permanent, mystical institution, dignity fuses office to the private personality, as Kantorowicz put it in his most famous book, The King’s Two Bodies (1957).

As a corpus mysticum, the university is a corporation in a different sense than the modern business enterprise. Because students and professors were the embodied corpus mysticum, regents and janitors did not themselves belong to the university proper. They were attached outsiders: janitors merely kept the campus clean, while regents ensured that formal university procedures mandated by the state were observed. But as members of the university’s body corporate, professors were not employees at all.

In other words, for Kantorowicz, a professorship was a public trust. No one had control over professors. No one measured their performance. The dignity of the professorial office called upon its bearers to act according to their “conscience,” which was held to be inseparable from the professor’s “genuine duties as member of the academic body corporate.” Furthermore, dignity required them to enact their conscience with “passion” and “love.” It involved a willingness to sacrifice their embodied self for the sake of the office: a concept of sacrifice whose historical origins included God’s sacrifice of Christ’s humanity.

Yeah, I’d say that sounds out of touch! For more on this episode read the whole thing, and the chapter on “The Nazi Twins” in Norman F. Cantor’s Inventing the Middle Ages (1991).

Heraldic Seal for Dartmouth – A Proposal

Earlier this summer I wrote a letter to the president of Dartmouth College, which I reproduce below, with added links. (No reply as of yet.)

***

Dear Sir,

When I was a student at Dartmouth in the early 1990s, some Indian figures remained in the official symbolism of the College, most notably on the Dartmouth seal, the Dartmouth shield, and the Baker Library weathervane. Two years ago, the College deprecated the shield in favor of the new “D-Pine” logo, and I heard the announcement this week that the Baker weathervane is to be taken down as soon as possible.

That leaves the seal:

Please know that I am not writing to defend it. There are, indeed, a number of problems with it. Like the Dartmouth shield, the seal depicts Native people being drawn out of the woods to receive the light of the Gospel, or at the very least a European-style education. Such a scene now strikes us as offensive, and in fact it was false propaganda to begin with, an aspect of Eleazar Wheelock’s PR efforts to keep donations coming. Wheelock designed a seal for Dartmouth College that specifically referenced the seal of the Society for the Propagation of the Gospel in Foreign Parts, a missionary society founded in 1701 in London:

Similarities between the Dartmouth seal and the SPG seal include the natives on the one side, the larger European technology on the other, writing in the air, and an irradiated object over the whole thing.

The Dartmouth seal is also religious in other ways. One of the supporters carries a Christian cross:

And the Hebrew at the top reads “El Shaddai,” meaning “God Almighty”:

Note that it’s on a triangle, referencing the Christian concept of the Holy Trinity (as though to say: “We know Hebrew! But please don’t confuse us with the original Hebrews.”)

Such details are not appropriate for a secular college.

I do not know if there has been a movement on campus against the Dartmouth seal. Given recent trends I can certainly understand if people might want to devise a new seal that accurately reflects Dartmouth’s history and is more in accord with its current values. If it is to be changed, allow me to propose that Dartmouth adopt a seal featuring a properly heraldic coat of arms. An example might look like this:

When well-designed, a heraldic shield is simple and recognizable, and neatly encapsulates an organization’s identity. In this case, the coat of arms says “I am an academic foundation (building, books) named after the Earl of Dartmouth (stag’s head) and located in rural New Hampshire (also stag’s head)” – with no references to Natives or Christianity. Heraldry places a university in a long tradition stretching back to the thirteenth century and suggests that it is dignified and deserves to be taken seriously. One does not need to use a coat of arms on a daily basis to express one’s identity (the D-Pine logo, as far as I’m concerned, does this quite well), but it is nice to have a coat of arms should the need arise – for instance, on those occasions when all the Ivy League coats of arms are displayed together.

I repeat that I am unfamiliar with the campus climate. I do not know whether anyone has said anything about the seal. I can understand why people might want it changed, but I can also understand why they might want to retain it too, for historic or sentimental reasons. If it is to be deprecated, however, please consider replacing it with an appropriate, well-designed, and dignified coat of arms.

Sincerely,

Jonathan Good ’94

The Practice of History

A friend writes:

In practice it seems to have become a norm to say “historians agree” when that is not true, or sometimes just on the basis of some recent article or paperback that is popular with a certain crowd.

There is almost a tendency to present history as an activity in which some set “findings” have been made that have a fixed meaning.

Such fixity does not exist even in many experimental social sciences, which at least use statistical significance as a guide to reliability. It is even less the case with history, where the nature of the profession precludes such statements. The very historians who do the most detailed archival, philological, and novel work are often so absorbed in it that they do not have a very good grasp of other areas of history, still less of the findings of the social, psychological, and natural sciences. Whereas those who speak most generally, often because they come across well on camera, simply do not have enough scholarly depth.

And for people on the left who really really feel that some piece of historiography has totally transformed historical understanding so that they know something that hoi polloi don’t, always keep in mind the sad case of Michael A. Bellesiles [link added].

My favorite example of this phenomenon is provided by Noel Ignatiev’s book, How the Irish Became White (1995). Ignatiev used “white” metaphorically to mean “part of the dominant group.” But since then it has become conventional wisdom that “the Irish weren’t even considered white in the nineteenth century!” One must use metaphors carefully. 

The Auto-Icon

One of the stranger items on display at University College London is the stuffed remains of its spiritual founder, utilitarian philosopher Jeremy Bentham (1748-1832), who believed that education should be free of church influence (unlike Oxford and Cambridge, which at the time were restricted to members of the Church of England). Bentham called this his “auto-icon” (i.e. “self-image”). The auto-icon:

was inscribed in the late philosopher’s will, which requested that a number of fixtures be put in place to preserve his remains, that they be dressed in the clothes he wore in life, and that they occasionally be brought into meetings involving his still-living friends, so that what’s left of Bentham might enjoy their company.

You might be inclined to think that this was an elaborate joke on Bentham’s part, but he doesn’t strike me as the joking type. The auto-icon, according to the linked article in Atlas Obscura, has found a new and much more public home at UCL: in a glass case in the student center. (Previously it was in a closet that was only opened on request.) 

Bentham might have been an atheist, but it is interesting to note how the preservation of human remains is a custom that extends beyond religion. 

Books

Some of our books. This is the medieval section.

My wife and I, over the course of our careers as historians, have amassed over 4400 books (I catalogued them a few years ago). Our children are well on their way to replicating our habit with their own tastes in literature. We have 27 bookcases of various sizes lining the walls of six rooms of our house. I wouldn’t say that we are hoarders – we’re actually somewhat selective about what we acquire, and the shelves have their own pleasant aesthetic. But when you both have academic specialties, teach a lot of topics through survey courses, and have any number of secondary interests – and when there is a potentially unlimited number of books published on every topic under the sun – well, you end up acquiring a lot of books! In fact, our habit has become a bit compulsive, almost like an addiction. Like all addictions, it has enablers:

I hate shopping, but there are few things I enjoy more than visiting a used book store! Usually it doesn’t take much for me to find an excuse to buy something. Some possibilities:

• This looks interesting.

• I don’t have a book on this topic, and I might need one for a lecture some day.

• I have a book on this topic but this one is more recent/provides a different point of view.

• I have heard of this author and I should have some of his books.

• I have read something by this author and would like to read more.

• I have nearly all this author’s works; all I need is this one to complete my collection.

• It is important to support small bookshops.

And so on. So out we come with an armful. (I do have an Excel spreadsheet of our collection on my phone, so that we don’t end up buying the same books over again.)
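For the technically inclined, a catalogue like ours makes that duplicate check easy to script. Here is a minimal sketch in Python, assuming (hypothetically) that the spreadsheet has been exported as a CSV file called books.csv with “Author” and “Title” columns:

import csv

def load_catalogue(path):
    # Read the exported catalogue into a set of (author, title) pairs.
    with open(path, newline="", encoding="utf-8") as f:
        return {
            (row["Author"].strip().lower(), row["Title"].strip().lower())
            for row in csv.DictReader(f)
        }

def already_owned(catalogue, author, title):
    # Case-insensitive membership test against the collection.
    return (author.strip().lower(), title.strip().lower()) in catalogue

# Standing in the stacks, wondering about a candidate purchase:
catalogue = load_catalogue("books.csv")
if already_owned(catalogue, "Norman F. Cantor", "Inventing the Middle Ages"):
    print("Put it back - you already have it.")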

Bookshops, however, at least provide you with plenty of books that you do not want to buy. Romance novels, self-help books, celebrity biographies… all so very much beneath the notice of this academic. You find the history section, and then the selection of books that you might want, and then choose the best ones among them. It’s a chase, a filtration process – the aspect of collecting that makes it addictive. The trouble comes when you’re spoiled for choice, like at the book exhibits at the annual meeting of the AHA, where just about every academic publisher operating in America shows up with every historical title they have currently in print. Then you realize just how pathetic your addiction is. My friend Scott claims he fell out of love with stamp collecting when he realized that there were companies out there from whom you could order just about any stamp ever printed. Where’s the fun in that? Similarly, why buy a book on ancient Greece that “looks interesting,” when there have been twenty such books published this year alone that are brimming with current scholarship and are not available in Barnes and Noble, i.e. they are the sort of books that actually command academic respect? Oh, the pain!

But I don’t get to the AHA much. Instead, the normal situation prevails when we visit McKay’s in Tennessee or 2nd & Charles on Barrett Parkway. Joseph Epstein once called such stores “the pool halls of academe” and lately I have come to believe that our habit is somewhat self-congratulatory and illustrates a lack of discipline – or at least a distraction from doing real work. Having walls full of books certainly signifies you as Educated and a Professional Academic, but it also represents what one friend called a “security blanket.” After all, when are you going to read them all? I will say that I do read – last year I read 42 books, most of which were in our collection. But this represents less than 1% of our holdings, and at this rate it will take a century to read everything we’ve got. I could say that they’re there for the sake of reference – “reading” in the academic sense of skimming for information, and then keeping the book on the shelf in case you need to return to it some day, which may be never, but at least it’s there. But I really don’t like reading books in this way (what one author called “book breaking“) – it shortchanges the author and encourages intellectual superficiality. 
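For anyone checking my arithmetic on that “century” figure above, a trivial Python sketch using the numbers already given:

holdings = 4400      # approximate size of the collection
books_per_year = 42  # books read last year

print(f"Share read last year: {books_per_year / holdings:.2%}")  # 0.95% - less than 1%
print(f"Years to read it all: {holdings / books_per_year:.0f}")  # 105 - about a century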

“Have you never heard of libraries?” a friend once asked me, to which I replied, “I’ve gotten to the point where I don’t trust libraries.” And I guess I can say that this is a good reason to keep an extensive collection. It’s always convenient to have a book at home when you need it, rather than having to go to the library the next day, only to discover that it’s missing from the shelves, or that someone else has already checked it out. Having to order a book through interlibrary loan takes even longer, and there’s no guarantee that it will even arrive. And lately libraries are deaccessioning their codices because “everything’s online anyway,” but I am suspicious of this movement, for a number of reasons:

• it remains (for me) more difficult to read longer works from a screen than from a page.

• you need computer equipment, an internet connection, and a power source to be able to read electronic documents. What if any of these is down? Sometimes they’re behind a paywall or require a subscription for added annoyance. 

• putting things online allows your reading habits to be tracked, and for changes to be made to texts without ever being acknowledged, in the mode of George Lucas monkeying with the original Star Wars trilogy. (Han shot first!) And don’t forget the books that somehow disappear without notice from your Kindle “library.” 

Reinhardt’s librarian Joel Langford once pointed out to me that with music or video recordings, you always need some sort of playback equipment, but with text, all you need to know is how to read. Thus books will never quite go out of style, unlike CDs or VHS tapes – you don’t need any special equipment to read them, except for a light source. Furthermore, the tactility of books gives them an appeal that computer files lack. Malcolm Gladwell once wrote an essay about the persistence of paper. An excerpt:

Computer technology was supposed to replace paper. But that hasn’t happened…. This is generally taken as evidence of how hard it is to eradicate old, wasteful habits and of how stubbornly resistant we are to the efficiencies offered by computerization. A number of cognitive psychologists and ergonomics experts, however, don’t agree. Paper has persisted, they argue, for very good reasons: when it comes to performing certain kinds of cognitive tasks, paper has many advantages over computers. The dismay people feel at the sight of a messy desk—or the spectacle of air-traffic controllers tracking flights through notes scribbled on paper strips—arises from a fundamental confusion about the role that paper plays in our lives.

There’s much more at the link. Gladwell is talking about the use of paper in offices, but some of what he says applies to books as well – holding them in one’s hand, marking pages with sticky notes, scribbling in the margins, shelving them by topic – these things actually help us to remember what’s in the book. (This is a drawback of literacy, of course – we have outsourced remembering to the text, so anything that allows us more efficient access to that information is to be cherished.)

While we’re on the subject of tactility, it is good to remember that some books, as objects, are better than others. One is not supposed to judge a book by its cover (or, presumably, other physical attributes), but you really can’t help it. Some of the qualities I appreciate:

• The paper should be smooth to the touch, strong (not prone to disintegration), and should not yellow with age.

• The pages should be well laid out with spacious margins. Fonts should be attractive and appropriate, with competent leading, kerning, justification and characters per line. The ink should be solid in tone and color, and the letters well defined. 

• The illustrations and graphical flourishes should be attractive and appropriate, and not clash with the typeface.

• Whether hard or soft cover, the binding should not crack or come apart yet should be supple enough to handle with ease. (I don’t particularly care for Folio Society-style leather bindings and gilded page edges, though – that is a step too far.)

• Softcover books should be made so that the cover doesn’t curl up in the slightest humidity, and the cover shouldn’t easily retain and display the grease stains from one’s fingers. Also, it’s nice when that thin film of cellophane that covers some softcover books doesn’t bubble and start peeling off.

• Last but not least, there is that lovely scent. One of the appeals of a book store is the smell of all the old books! A book should certainly not reek of the oil used to print it. 

So I’m not about to get rid of all my books any time soon. I’m certainly not going to adopt the habits of a person I read about in the Chronicle once, who prided himself on keeping no books. If he was working on something, he would get whatever books he needed from the library or interlibrary loan, and after he was done he would return them, and put the topic out of his mind as he moved on to his next project. To my mind this is somewhat anti-intellectual, but it’s likely more conducive to academic success.

Still, a good cull is probably in order….

Happy New Year

To start the new year, a Facebook friend makes a resolution, with which I heartily agree:

Many years ago I took a course on Late Antiquity and we read a book by scholar A and then one by Scholar B. Scholar B totally opened by mocking the outrageous claims of Scholar A except here’s the thing: Scholar A never made those claims. Scholar B set up what is known as a strawman to attack and make his position seem less questionable. I took this to heart and have found myself saying “Don’t be a [Scholar B]!” before I think about re-sharing or reacting.

A lot of times on social media I see people share posts about how “so many people with belief X are posting this awful sentiment” and internally I’m like, “that’s weird – I haven’t seen anything that sounds like that or I saw one comment.” And here’s the thing: they aren’t or it’s one person but articles are written as if a whole group rose up and said one thing.

There are many outlets (media and otherwise) that want you to think that the other side is so outrageous that it isn’t worth doing your due diligence – but it’s always worth doing due diligence. And don’t let one comment replace the thoughts of many.

I fail at this a lot – it’s super easy to be Scholar B. It’s comfortable to be Scholar B. But we shouldn’t be. In 2020, let’s all agree to be skeptical when something seems too outrageous to believe and let’s also ask ourselves who benefits when we do believe it?

And then weigh that before we share an article or comment or post.

Related, from Vox.com:

Intellectual humility: the importance of knowing you might be wrong

Why it’s so hard to see our own ignorance, and what to do about it.

I’ve come to appreciate what a crucial tool it is for learning, especially in an increasingly interconnected and complicated world. As technology makes it easier to lie and spread false information incredibly quickly, we need intellectually humble, curious people.

I’ve also realized how difficult it is to foster intellectual humility. In my reporting on this, I’ve learned there are three main challenges on the path to humility:

  1. In order for us to acquire more intellectual humility, we all, even the smartest among us, need to better appreciate our cognitive blind spots. Our minds are more imperfect and imprecise than we’d often like to admit. Our ignorance can be invisible.
  2. Even when we overcome that immense challenge and figure out our errors, we need to remember we won’t necessarily be punished for saying, “I was wrong.” And we need to be braver about saying it. We need a culture that celebrates those words.
  3. We’ll never achieve perfect intellectual humility. So we need to choose our convictions thoughtfully.

This is all to say: Intellectual humility isn’t easy. But damn, it’s a virtue worth striving for, and failing for, in this new year.

Read the whole thing. All I would like to add is that opponents of Trump, not just Trump himself, can fall victim to false and unearned confidence…

Abu Bakr al-Baghdadi

Thomas MacMaster writes, in reference to a recent death in Syria:

Am I the only one startled to see the lack of discussion over the death of a world famous scholar of the medieval world (BA, MA, & PhD as well as numerous publications) who probably did more to weaponize medieval studies in the past decade than anyone else?

We should also acknowledge his leadership of one of the largest medieval re-enactment groups (with a serious commitment to using the digital humanities for outreach).

He makes a good point…

Some Good Advice

From Chronicle Vitae (hat tip: Richard Utz): 

How Not to Be a Jackass at Your Next Academic Conference

If you’ve spent any time at an academic conference, you know the scene: A stage full of scholars have just finished presenting their papers. As the Q&A session begins, a woman rises from the audience and prefaces her remarks by saying, in so many words, that she hadn’t been invited to appear on the panel. But here, anyway, are the highlights of her paper—and her credentials and biography, too.

Or maybe a senior professor speaks up. He barks at a graduate student on the panel, embarrassing the student by ripping his paper to pieces. Another professor steps forward and asks the panelists a series of multipart questions she already seems to know the answers to.

Perhaps a guy raises his hand to comment and quotes verbatim from Jürgen Habermas’s The Structural Transformation of the Public Sphere. Or he decides to show off his French by citing Frantz Fanon’s manifesto Les Damnés de la Terre, when he could have kept it simple by using the English title, The Wretched of the Earth.

Some of these moments may be byproducts of social awkwardness; others are signs of bad manners. Some might not even bother you. But they’re all fairly common. I witnessed several of them earlier this month—including the Habermas and Fanon name-checks—at the American Historical Association meeting.

Why do so many academics risk coming off like jackasses at conference Q&A sessions? Some scholars say it’s because those sessions are more about pageantry than conversation: Showing other scholars how much you know is often more important than actually listening and learning.

There’s another reason, too: Developing good conference manners—and social skills in general—just isn’t part of graduate school training. I gathered a list of behaviors, both comical and aggravating, from a few dozen academics. As I read through them, I wondered: What would Emily Post, the famous etiquette author, do?

I decided to call up someone who would know. Emily Post’s great-great-granddaughter, Anna Post, keeps the flame alive, conducting business-etiquette seminars across the country as an etiquette guru at the Emily Post Institute. She carved out some time to chat with me about academic disorders and how to cure them.

Click the link to read some excellent advice.

Liberal Arts Degrees

Here is the text of my piece promised earlier this summer, which has appeared in Canton Family Life.

***

The engineering major asks “How does it work?”
The business major asks “How much will it cost?”
The liberal arts major asks “Do you want fries with that?”

I’ve heard some version of this joke many times, and it’s always annoying – in part because it’s somewhat true. The joke points to the dual purpose of higher education: does it exist to preserve “the best that has been thought and said” in our culture? To teach young people how to think and about what it means to be human? To open new vistas in human understanding?

Or does it exist to prepare people for paid employment?

At one point you could have both – a bachelor’s degree in any subject signaled that its holder was diligent and intelligent, and thus suitable for white-collar work.

Unfortunately, at some point in the twentieth century, politicians noticed that university graduates enjoyed a higher status and income level – so they figured that if everyone went to university, then everyone could enjoy a higher status and income level! They sponsored a vast expansion in higher education, both in terms of the number of university campuses built, and in the number of people who were able to attend through grants and loans to help cover their tuition bills.

Universities were happy to play along. In fact, this is a major reason why university tuition fees have risen at twice the rate of inflation for the past forty years or so. Universities are not charities, they are businesses, and even though they are not-for-profit, they hate leaving money on the table. If you get a student loan, the university will make sure it captures every penny of that loan, on top of what it would have charged you in the first place. Someone has to pay for the president’s new office suite!

Alas, for the graduates themselves, the law of diminishing returns kicked in. Once bachelor’s degrees became both more common and more expensive, it meant that students could not afford to spend their undergraduate years developing a personal life philosophy and still expect that their degree would be worth something on the job market. Instead, their degree had to start paying off immediately. Thus did technical or professional majors, which prepare graduates for specific fields like business management or information technology, really start to take off. Even people who were interested in the liberal arts felt they had to major in something “practical,” out of fear for their livelihood.

Now, it should be said that universities have not completely abandoned their other, cultural purpose. They will generally require students to take a few liberal arts courses in such subjects as history, English, philosophy, or religion for the sake of polish or breadth. People who actually want to major in these subjects, however, are regularly condescended to. One guest speaker at Reinhardt told us recently, “What’s the difference between a liberal arts major and a pizza? A pizza can feed a family of four!”

But I have taught and kept in touch with fifteen years’ worth of history majors, and I can safely say that this view is not accurate.

For one, it does no one any good to major in a subject he hates. Better to pursue something that you’re really interested in and actually graduate, than to drop out on account of tedium.

For two, the skills acquired in the pursuit of a liberal arts degree are transferable to a wide range of careers. Chief among these is the ability to pull information from a variety of sources, to synthesize it, and to present it in a coherent and eloquent manner. One of our history graduates, a project manager at Prosys Information Systems, says that his literacy and communication skills are “superior to almost everyone I work with,” and credits the history program for preparing him for his job. Another worked as a property analyst in Atlanta. His employers were glad to hear that he was a history major because they knew he could think through problems and analyze situations. As he says: “every day I craft proposals and analyses that need to be articulate and persuasive.”

Of course, success in the job market still depends on exercising a certain amount of initiative. Holding an internship in a field you’d like to enter, developing contacts there, and marketing yourself through LinkedIn are all useful. Minoring in something technical can also be a good idea. And in fairness, I should point out that discretion is advised when choosing liberal arts courses, some of which, I am ashamed to admit, serve up great helpings of impenetrable, jargon-encrusted prose in the service of entirely predictable political positions.

But knowing how to think and knowing how to write will stand you in good stead wherever you end up – whether that’s in business, higher education, law enforcement, public administration, teaching, ministry, or health care, to name a few of the fields our graduates have found careers in. Long after this year’s hot programming language has been made obsolete, liberal arts graduates will still have the ability to “see around corners,” in the words of Kevin Reinhart, professor of religion at Dartmouth College. Blogger Joe Asch concurs, saying that “Over the years, whether in dealing with managers or lawyers or even architects and other professionals, I have found that folks with a liberal arts background understand larger issues which people with only technical training just can’t comprehend.”

It might take some effort to find your first job as a liberal arts major, but chances are you’ll end up performing very well in it!