QWERTY

From Smithsonian Magazine (2013), an exploration of the idea that the standard North American keyboard layout was deliberately designed to be inefficient. Jared Diamond mentions this in Guns, Germs, and Steel (1997):

“Unbelievable as it may now sound, that keyboard layout was designed in 1873 as a feat of anti-engineering. It employs a whole series of perverse tricks designed to force typists to type as slowly as possible, such as scattering the commonest letters over all keyboard rows and concentrating them on the left side (where right-handed people have to use their weaker hand). The reason behind all of those seemingly counterproductive features is that the typewriters of 1873 jammed if adjacent keys were struck in quick succession, so that manufacturers had to slow down typists. When improvements in typewriters eliminated the problem of jamming, trials in 1932 with an efficiently laid-out keyboard showed that it would let us double our typing speed and reduce our typing effort by 95 percent. But QWERTY keyboards were solidly entrenched by then. The vested interests of hundreds of millions of QWERTY typists, typing teachers, typewriter and computer salespeople, and manufacturers have crushed all moves toward keyboard efficiency for over 60 years.”

However, Jimmy Stamp relates:

While it can’t be argued that the deal with Remington helped popularize the QWERTY system, its development as a response to mechanical error has been questioned by Kyoto University researchers Koichi Yasuoka and Motoko Yasuoka. In a 2011 paper, the researchers tracked the evolution of the typewriter keyboard alongside a record of its early professional users. They conclude that the mechanics of the typewriter did not influence the keyboard design. Rather, the QWERTY system emerged as a result of how the first typewriters were being used. Early adopters and beta-testers included telegraph operators who needed to quickly transcribe messages. However, the operators found the alphabetical arrangement to be confusing and inefficient for translating Morse code. The Kyoto paper suggests that the typewriter keyboard evolved over several years as a direct result of input provided by these telegraph operators. For example:

“The code represents Z as ‘· · · ·’ which is often confused with the digram SE, more frequently-used than Z. Sometimes Morse receivers in United States cannot determine whether Z or SE is applicable, especially in the first letter(s) of a word, before they receive following letters. Thus S ought to be placed near by both Z and E on the keyboard for Morse receivers to type them quickly (by the same reason C ought to be placed near by IE. But, in fact, C was more often confused with S).”
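The ambiguity the paper describes can be sketched in a few lines of Python. The dot-and-space encoding below is a simplified illustration of American Morse (not taken from the paper): Z was sent as four dots with an internal space, which on the wire is identical to S followed by E.

```python
# Simplified American Morse signals: "." is a dot, " " is a short space.
AMERICAN_MORSE = {
    "S": "...",
    "E": ".",
    "Z": "... .",  # dot-dot-dot, space, dot -- the ambiguous pattern
}

def on_the_wire(text):
    """Join each letter's signal with a single inter-letter space."""
    return " ".join(AMERICAN_MORSE[ch] for ch in text)

# The raw signals for "Z" and "SE" are indistinguishable, so a receiver
# cannot tell them apart until later letters supply context:
print(on_the_wire("Z"))                       # ... .
print(on_the_wire("SE"))                      # ... .
print(on_the_wire("Z") == on_the_wire("SE"))  # True
```

Hence the paper's point: a receiver typing in real time benefits from having S, Z, and E close together on the keyboard, since any of them might turn out to be the right transcription.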

In this scenario, the typist came before the keyboard. The Kyoto paper also cites the Morse lineage to further debunk the theory that Sholes wanted to protect his machine from jamming by rearranging the keys with the specific intent to slow down typists:

“The speed of Morse receiver should be equal to the Morse sender, of course. If Sholes really arranged the keyboard to slow down the operator, the operator became unable to catch up the Morse sender. We don’t believe that Sholes had such a nonsense intention during his development of Type-Writer.”

Interesting, if true. But it still might be good to promote the more efficient Dvorak layout for beginning typists. I see that I can select it in the System Preferences for this computer.

The Avro Arrow

From BBC Future (hat tip: David Winter):

The record-breaking jet which still haunts a country
A decade after the end of World War Two, Canada built a jet which pushed technology to its limits. But its demise showed why smaller nations found it difficult to compete in the Jet Age.

In the early years of the Cold War, Canada decided to design and build the most advanced fighter aircraft in the world.

Canada is well known for its rugged bush planes, capable of rough landings and hair-raising take-offs in the wilderness. From the late 1930s, the North American country had also started to manufacture British-designed planes for the Allied war effort. Many of these planes were iconic wartime designs like the Hawker Hurricane fighter and Avro Lancaster bomber.

Ambitious Canadian politicians and engineers weren’t satisfied with this. They decided to forge a world-leading aircraft manufacturing industry out of the factories and skilled workforce built up during the war. Tired of manufacturing aircraft designed by others, this new generation of Canadian leaders were determined to produce Canadian designs. Avro Aircraft, the Canadian airplane maker created after the war, was the company that would deliver their dream.

Freed from the set ways-of-thinking of Avro’s more established rivals, the firm’s engineers were able to work on revolutionary jet fighters, commercial airliners, flying saucers and even a space plane. They placed Canada at the technological cutting edge of the new Jet Age.

In so doing, these engineers challenged notions of what small countries like Canada could achieve in the hi-tech industries of the day, even if convincing politicians to stump up the cash for them was an altogether trickier business.

Then came the Arrow. On 4 October 1957, 14,000 people watched a large hangar on the outskirts of Toronto open to reveal a beautiful, large, white, delta-wing aircraft. The plane was the Avro Arrow interceptor. A third longer and broader than today’s Eurofighter Typhoon, the Arrow could fly close to Mach 2.0 (1,500 mph, or the maximum speed of Concorde), and had the potential to fly even faster. It was Canada’s Can$250m (US$1.58bn today) bid to become an aviation superpower.

The project was genuinely ground-breaking. Avro’s engineers had been allowed to build a record-breaker without compromise. But Canadians would soon discover that the supersonic age had made aviation projects so expensive that only a handful of countries could carry them out – and Canada, unfortunately, wasn’t one of them.

Read the whole thing. Prime Minister John Diefenbaker canceled the Arrow in 1959 for genuine reasons of cost, but it was a huge blow to national pride, and the ordered destruction of everything to do with the project (for reasons of security) seemed an added insult. Fifteen thousand people lost their jobs as a result, although NASA did cherry-pick 33 Canadian engineers and put them to good use. They might not have been as important as von Braun’s German team, or even the Hidden Figures ladies, but they made some genuine contributions to the moon shot, including the Lunar Orbit Rendezvous concept, the design of the Lunar Module, and the design of the heat shield to protect the Command Module upon its return to Earth (see this CTV News article for more).

As it happens, we saw an episode of the Canadian television series Murdoch Mysteries this week. Set in Toronto in the 1890s, the episode (entitled “Murdoch Air”) featured the fictional inventor James Pendrick and his prototype heavier-than-air aircraft, which he called the Pendrick Arrow, a clear reference to the Avro Arrow. It too gets deliberately destroyed.

The “Flax Age”

From Literary Hub (hat tip: David Winter), an interesting proposition, excerpted from The Golden Thread by Kassia St Clair:

What If We Called It the ‘Flax Age’ Instead of the ‘Iron Age’? Correcting the Historical Bias Against Domestic Materials

Archaeology has traditionally had a fundamental bias against fabric. Fabrics are, after all, highly perishable, withering away within months or years, and only rarely leaving traces behind for those coming millennia later to find. Archaeologists—predominantly male—gave ancient ages names like “Iron” and “Bronze,” rather than “Pottery” or “Flax.” This implies that metal objects were the principal features of these times, when they are simply often the most visible and long-lasting remnants. Technologies using perishable materials, such as wood and textiles, may well have been more pivotal in the daily lives of the people who lived through them, but evidence of their existence has, for the most part, been absorbed back into the earth.

There are exceptions, of course, and traces can and do survive, usually thanks to an unusual climate: freezing, damp anaerobic conditions or extremely dry ones. The climate in Egypt, for example, is ideal for preserving all manner of usually perishable things and we subsequently know far more about ancient Egyptian textiles than those from most other regions. As archaeology has matured and diversified, scholars have increasingly looked for—and found—evidence of fine, complex textiles stretching farther back than anyone would have guessed. Their beauty and the skill needed to make them suggest a very different image of our earliest forebears than the club-wielding, simpleminded thugs of popular imagination.

Read the whole thing.

Paleoichthyology

It’s a real word, referring to the study of ancient fish, as detailed in a recent Atlantic article:

The Medieval Practices that Reshaped Europe’s Fish

In Europe, aquatic animals have been traded at least since the days of the Roman Empire. But it was during the early Middle Ages, with the arrival of widespread Christianity, that the animals became a popular source of protein. That’s partially due to the roughly 130 days a year when the faithful were exhorted not to eat meat, because fish didn’t count in that category.

At the same time, expanding agrarian populations were cutting down forests to create fields and diverting rivers to fill defensive moats around castles and towns, Hoffmann writes in one paper. From the ninth century a.d. to the 11th, the number of grain mills built along rivers in England exploded from about 200 to 5,624. Species that came into fresh water to spawn, such as salmon and sturgeon, began declining. New regulations, such as King Philip’s, were put into place to manage fish populations. A Scottish statute from 1214 required all dams to include an opening for fish and barrier nets to be lifted every Saturday, for instance. Soon highly sophisticated aquaculture ponds stocked with carp also provided regular access to fish for the landed elite.

This decline in freshwater populations coincided with a sudden, commercial-scale boom in sea fishing, which began around a.d. 1000 and is known as the “fish event horizon.” In one study, archaeologists collected cod bones in London from 95 Roman, medieval, and postmedieval sites. The number of bones jumped circa the year 1000, and isotopic sampling showed that in the following centuries, fish came from farther and farther away, indicating long-distance trade. In the southern English town of Southampton, the remains of marine species (such as cod) began to outnumber freshwater species (such as eel) by 1030.

That “fish event horizon” could have been caused by a number of forces. It came at a time of population growth, urbanism, new ship technology, and increased trade, says the archaeologist James Barrett, from the University of Cambridge. But, he adds, “I’ve argued consistently that this must also be about human impacts on freshwater and migratory fishes. The degree of vulnerability of fishes depends on how bounded the ecosystem they occupy is.”

In other words, because their habitat was smaller, freshwater fish were more likely to respond to human pressures sooner. When the reliable stocks of freshwater fish began dwindling, hungry Europeans turned to the much larger oceans. And while those populations had larger ranges, humans still had an impact.

Viking Tar

From Smithsonian.com:

Was the Vikings’ Secret to Success Industrial-Scale Tar Production?

Evidence suggests that the ability to mass-produce tar bolstered their trade repertoire and allowed them to waterproof and seal their iconic longships

The Vikings are often viewed as brutish, destructive village-pillagers, but their knack for innovation is perhaps overlooked. Viking-age Scandinavia was kind of the Silicon Valley of shipbuilding in the early medieval period. Their iconic longboat designs, advanced navigational skills, and perhaps even legendary sunstones gave them the ability to raid, trade and establish settlements as far away as Russia, Italy and North Africa. A new study adds another bit of technology to the list of things that gave Vikings a leg up on their adversaries: they may have been capable of making industrial scale quantities of tar, according to a new paper published in the journal Antiquity.

Tar was probably essential to the Vikings’ lifestyle since each longship would have required about 130 gallons of tar to coat all of its wooden elements, the study suggests. Tar was also needed to coat the ships’ wool sails, and the boats would need to be regularly re-tarred between voyages as well. Multiply all that to fit the needs of a fleet and we’re talking about a lot of tar here.

However, little was previously hypothesized about how they would have been able to produce the sticky substance en masse. The new study, authored by Andreas Hennius, an archaeologist from Uppsala University in Sweden, proposes a possible outline of how small scale tar production in the early centuries of the first millennium gave rise to potentially industrial use of tar by Vikings.

“I suggest that tar production in eastern Sweden developed from a small-scale household activity in the Roman Iron Age to large-scale production that relocated to the forested outlands during the Vendel/Viking Period,” Hennius writes in the paper. “This change, I propose, resulted from the increasing demand for tar driven by an evolving maritime culture.”

Read the whole thing. It’s interesting how rarely historians consider technology like this; thank goodness there are people who are willing to.

And Third Prize is You’re Fired!

Ever since the movie Glengarry Glen Ross (link NSFW), I haven’t been able to hear the words “steak knives” without smiling. So I was amused to discover this interesting article on Popular Mechanics:

The Secret History of Steak Knives

Sharp knives disappeared from the dining room table, only to return, centuries later, in steak knife form. Kings, cardinals, and factories are involved.

By Ernie Smith
Sep 28, 2017

Obviously, knives, with their sharp blades for cutting through things, have been around forever—they’re a key ingredient of any horror film, slasher flick, or murder mystery that’s ever been created.

But here’s a question that I don’t think a lot of people have pondered, mainly because they aren’t expected to, like I am: Why do steaks get their own dedicated knives, and why do we shove them into giant blocks of wood for storage? And what about butter knives? What’s up with them?

It turns out that it’s a story with a lot of edges—some sharp, some dull.

Before there was the steak knife, there was the table knife, or the butter knife. As blade designs go, it’s pretty weak-sauce, and intentionally so.

The reason for this goes back nearly 400 years, and involves an annoyed French clergyman. Cardinal Armand Jean du Plessis, the Duke of Richelieu and Fronsac—or Cardinal Richelieu for short—became annoyed by the table manners of those eating with pointed knives, which were also used as a way of picking teeth.

He had his knife edges rounded, the legend goes, in an effort to discourage bad behavior by his guests.

This broke tradition around knife use. See, knife blades were long the primary way that people ate food—unlike napkins, which weren’t always a given, they were always a key element of the meal. Often, medieval cultures would eat meals using a single knife—their own, which they brought with them to dinner—and their hands. The introduction of the fork into European culture changed the way we interacted with knives, just as it did with napkins.

Cardinal Richelieu was a powerful, influential man, and his knife-dulling approach gained enough currency that in 1669, 27 years after he died, King Louis XIV issued a decree making pointed knives illegal in France, whether inside the home or out in public. Suddenly, a lot of sharp knives got pretty dull.

Read the whole thing.