Rainbow

One thing I like about the Pride flag is that it shows a properly stylized rainbow – the three primary colors red, yellow, and blue, and the three secondary colors orange, green, and purple, producing a flag with six horizontal stripes (preferable to vertical stripes since that’s the way a flag flies). 

Of course, the spectrum contains an infinitude of colors, but showing six of them is a logical and visually appealing abstraction. But didn’t we learn in school that there were seven? Doesn’t the mnemonic “Roy G. Biv” stand for Red, Orange, Yellow, Green, Blue, Indigo, Violet? Where does this “Indigo” come from? Wikipedia informs us that:

Indigo was defined as a spectral color by Sir Isaac Newton when he divided up the optical spectrum, which has a continuum of wavelengths. He specifically named seven colors primarily to match the seven notes of a western major scale, because he believed sound and light were physically similar, and also to link colors with the (known) planets, the days of the week, and other lists that had seven items. It is accordingly counted as one of the traditional colors of the rainbow.

Ah, Newton. One of the great geniuses of the previous millennium, but still not entirely a man of science as we now understand that term. The designation of “indigo” as a color of the rainbow simply to get to the number seven seems similar to a hypothetical situation in which we decided that five is a special number, and so imagined five cardinal directions – North, East, South-East, South, and West. 

But I wonder whether this seven-color Newtonian version – the product of a man who believed that the “Supreme God is a Being eternal, infinite, absolutely perfect… and from his true dominion it follows that the true God is a living, intelligent, and powerful Being,” as he says in the Principia – might not be a way to distinguish the rainbow for those who wish to “take it back” from the Gay Pride movement. It would certainly fit with the Christian idea that six is the number of “man” or “imperfection,” and seven the “totality of perfection” or “completeness.” So a seven-striped rainbow could be “Christian,” and a six-striped rainbow could be “gay.” 

In the meantime we do wish everyone a happy Pride Month!

This is the Modern World

Solar eclipse of 29 May 1919, used for the Eddington experiment. Wikipedia.

One hundred years ago, as Paul Johnson writes in the opening chapter of Modern Times, a significant event took place.

***

The modern world began on 29 May 1919 when photographs of a solar eclipse, taken on the island of Principe off West Africa and at Sobral in Brazil, confirmed the truth of a new theory of the universe. It had been apparent for half a century that the Newtonian cosmology, based upon the straight lines of Euclidean geometry and Galileo’s notions of absolute time, was in need of serious modification. It had stood for more than two hundred years. It was the framework within which the European Enlightenment, the Industrial Revolution, and the vast expansion of human knowledge, freedom and prosperity which characterized the nineteenth century, had taken place. But increasingly powerful telescopes were revealing anomalies. In particular, the motions of the planet Mercury deviated by forty-three seconds of arc a century from its predictable behaviour under Newtonian laws of physics. Why?

In 1905, a twenty-six-year-old German Jew, Albert Einstein, then working in the Swiss patent office in Berne, had published a paper, ‘On the electrodynamics of moving bodies’, which became known as the Special Theory of Relativity. Einstein’s observations on the way in which, in certain circumstances, lengths appeared to contract and clocks to slow down, are analogous to the effects of perspective in painting. In fact the discovery that space and time are relative rather than absolute terms of measurement is comparable, in its effect on our perception of the world, to the first use of perspective in art, which occurred in Greece in the two decades c. 500-480 BC.

The originality of Einstein, amounting to a form of genius, and the curious elegance of his lines of argument, which colleagues compared to a kind of art, aroused growing, world-wide interest. In 1907 he published a demonstration that all mass has energy, encapsulated in the equation E = mc², which a later age saw as the starting point in the race for the A-bomb. Not even the onset of the European war prevented scientists from following his quest for an all-embracing General Theory of Relativity which would cover gravitational fields and provide a comprehensive revision of Newtonian physics. In 1915 news reached London that he had done it. The following spring, as the British were preparing their vast and catastrophic offensive on the Somme, the key paper was smuggled through the Netherlands and reached Cambridge, where it was received by Arthur Eddington, Professor of Astronomy and Secretary of the Royal Astronomical Society.

Eddington publicized Einstein’s achievement in a 1918 paper for the Physical Society called ‘Gravitation and the Principle of Relativity’. But it was of the essence of Einstein’s methodology that he insisted his equations must be verified by empirical observation and he himself devised three specific tests for this purpose. The key one was that a ray of light just grazing the surface of the sun must be bent by 1.745 seconds of arc — twice the amount of gravitational deflection provided for by classical Newtonian theory. The experiment involved photographing a solar eclipse. The next was due on 29 May 1919. Before the end of the war, the Astronomer Royal, Sir Frank Dyson, had secured from a harassed government the promise of £1,000 to finance an expedition to take observations from Principe and Sobral.

Early in March 1919, the evening before the expedition sailed, the astronomers talked late into the night in Dyson’s study at the Royal Observatory, Greenwich, designed by Wren in 1675-6, while Newton was still working on his general theory of gravitation. E.T. Cottingham, Eddington’s assistant, who was to accompany him, asked the awful question: what would happen if measurement of the eclipse photographs showed not Newton’s, nor Einstein’s, but twice Einstein’s deflection? Dyson said, ‘Then Eddington will go mad and you will have to come home alone.’ Eddington’s notebook records that on the morning of 29 May there was a tremendous thunderstorm in Principe. The clouds cleared just in time for the eclipse at 1.30 pm. Eddington had only eight minutes in which to operate. ‘I did not see the eclipse, being too busy changing plates . . . We took sixteen photographs.’ Thereafter, for six nights he developed the plates at the rate of two a night. On the evening of 3 June, having spent the whole day measuring the developed prints, he turned to his colleague, ‘Cottingham, you won’t have to go home alone.’ Einstein had been right.

The expedition satisfied two of Einstein’s tests, which were reconfirmed by W.W. Campbell during the September 1922 eclipse. It was a measure of Einstein’s scientific rigour that he refused to accept that his own theory was valid until the third test (the ‘red shift’) was met. ‘If it were proved that this effect does not exist in nature’, he wrote to Eddington on 15 December 1919, ‘then the whole theory would have to be abandoned’. In fact the ‘red shift’ was confirmed by the Mount Wilson observatory in 1923, and thereafter empirical proof of relativity theory accumulated steadily, one of the most striking instances being the gravitational lensing system of quasars, identified in 1979-80. At the time, Einstein’s professional heroism did not go unappreciated. To the young philosopher Karl Popper and his friends at Vienna University, ‘it was a great experience for us, and one which had a lasting influence on my intellectual development’. ‘What impressed me most’, Popper wrote later, ‘was Einstein’s own clear statement that he would regard his theory as untenable if it should fail in certain tests …. Here was an attitude utterly different from the dogmatism of Marx, Freud, Adler and even more so that of their followers. Einstein was looking for crucial experiments whose agreement with his predictions would by no means establish his theory; while a disagreement, as he was the first to stress, would show his theory to be untenable. This, I felt, was the true scientific attitude.’

Einstein’s theory, and Eddington’s much publicized expedition to test it, aroused enormous interest throughout the world in 1919. No exercise in scientific verification, before or since, has ever attracted so many headlines or become a topic of universal conversation. The tension mounted steadily between June and the actual announcement at a packed meeting of the Royal Society in London in September that the theory had been confirmed. To A.N. Whitehead, who was present, it was like a Greek drama:

We were the chorus commenting on the decree of destiny as disclosed in the development of a supreme incident. There was dramatic quality in the very staging: the traditional ceremonial, and in the background the picture of Newton to remind us that the greatest of scientific generalizations was now, after more than two centuries, to receive its first modification … a great adventure in thought had at last come home to shore.

From that point onward, Einstein was a global hero, in demand at every great university in the world, mobbed wherever he went, his wistful features familiar to hundreds of millions, the archetype of the abstracted natural philosopher. The impact of his theory was immediate, and cumulatively immeasurable. But it was to illustrate what Karl Popper was later to term ‘the law of unintended consequence’. Innumerable books sought to explain clearly how the General Theory had altered the Newtonian concepts which, for ordinary men and women, formed their understanding of the world about them, and how it worked. Einstein himself summed it up thus: ‘The “Principle of Relativity” in its widest sense is contained in the statement: The totality of physical phenomena is of such a character that it gives no basis for the introduction of the concept of “absolute motion”; or, shorter but less precise: There is no absolute motion.’ Years later, R. Buckminster Fuller was to send a famous cable to the Japanese artist Isamu Noguchi explaining Einstein’s key equation in exactly 249 words, a masterpiece of compression.

But for most people, to whom Newtonian physics, with their straight lines and right angles, were perfectly comprehensible, relativity never became more than a vague source of unease. It was grasped that absolute time and absolute length had been dethroned; that motion was curvilinear. All at once, nothing seemed certain in the movements of the spheres. ‘The world is out of joint’, as Hamlet sadly observed. It was as though the spinning globe had been taken off its axis and cast adrift in a universe which no longer conformed to accustomed standards of measurement. At the beginning of the 1920s the belief began to circulate, for the first time at a popular level, that there were no longer any absolutes: of time and space, of good and evil, of knowledge, above all of value. Mistakenly but perhaps inevitably, relativity became confused with relativism.

***

If you’d like to keep reading, see the Internet Archive, from which this excerpt was taken. The American Thinker also published a piece on the Eddington experiment.
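Incidentally, the deflection figure Johnson quotes is easy to check for yourself. General relativity predicts that a light ray grazing the Sun is bent by 4GM/(c²R), exactly twice the “classical” Newtonian value. Here is a quick back-of-the-envelope sketch in Python, using standard modern values for the constants (these are my own figures, not taken from the excerpt):

```python
import math

# Standard physical constants (modern reference values, not from the excerpt)
G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30      # mass of the Sun, kg
R_SUN = 6.957e8       # radius of the Sun, m
C = 2.998e8           # speed of light, m/s
ARCSEC_PER_RAD = 206265

# Einstein's prediction for light grazing the solar limb: 4GM / (c^2 R)
einstein = 4 * G * M_SUN / (C**2 * R_SUN) * ARCSEC_PER_RAD
# The Newtonian (corpuscular) value is exactly half of that
newtonian = einstein / 2

print(f"Einstein: {einstein:.2f} arcsec")
print(f"Newton:   {newtonian:.2f} arcsec")
```

Plugging in the numbers gives about 1.75 seconds of arc for Einstein and 0.88 for Newton – which is why Eddington’s photographic plates could discriminate between the two theories.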

Renaissance

One thing that I like about Renaissance humanists is that they never slavishly copied ancient Rome. They weren’t LARPers – they never wore togas, or revived gladiatorial combat, or made sacrifices to Jupiter, or discerned the will of the gods from the flight patterns of birds. No, they generally cherry-picked what they most admired (the form of the Latin language, fonts to set it in, and certain architectural details are the three that come most easily to mind). Most importantly, what they revived was a principle, that life was no longer to be a vale of tears, with one’s reward coming in the afterlife, but was meant to be lived – not in a hedonistic way, but a self-actualizing one: God gave us talents, and we honor God when we develop those talents. Since pagans didn’t have much of an afterlife to look forward to, their earthly life was all they had, and they were to use it for self-improvement and the gaining of personal glory. (Whether Romans actually lived by this principle is another question, but certain influential fifteenth-century Florentines certainly believed that they did.)

So in many ways the Renaissance was simply a “naissance,” a birth of something new, as people operated on the principle that they could do anything, because no one said they couldn’t. Mathematical perspective, for instance, was not something that the Romans ever invented; it was devised by Renaissance artists. (In other ways, of course, the Renaissance was simply a continuation of the Middle Ages, or so I am compelled to state by virtue of my membership in the medievalists’ guild.)

But speaking of art, I do think it’s a shame that art was such a dominant mode of self-expression in the Renaissance. The paintings and sculptures of Leonardo, Michelangelo, Raphael, and everyone else mentioned in Vasari’s Lives of the Artists are usually the first thing that comes to mind when one hears the word “Renaissance.” Don’t get me wrong, I think that Renaissance art is wonderful, but one regrets that science was not equally fashionable among the humanists. (I read something once that claimed that scientific enquiry took a step backward in the Renaissance, overshadowed as it was by all the art and literature.) The Renaissance would have been even better if, say, more people had taken up Leonardo’s engineering projects. As it stands, contemporaries had to wait until 1590 before they could read Galileo’s De Motu (On Motion), part of what was now no longer the Renaissance, but the Scientific Revolution.

Fake Miniatures

From Aeon:

One popular image floating around Facebook and Pinterest has worm-like demons cavorting inside a molar. It claims to illustrate the Ottoman conception of dental cavities, a rendition of which has now entered Oxford’s Bodleian Library as part of its collection on ‘Masterpieces of the non-Western book’. Another shows a physician treating a man with what appears to be smallpox. These contemporary images are in fact not ‘reproductions’ but ‘productions’ and even fakes – made to appeal to a contemporary audience by claiming to depict the science of a distant Islamic past.

From Istanbul’s tourist shops, these works have ventured far afield. They have found their way into conference posters, education websites, and museum and library collections. The problem goes beyond gullible tourists and the occasional academic being duped: many of those who study and publicly present the history of Islamic science have committed themselves to a similar sort of fakery. There now exist entire museums filled with reimagined objects, fashioned in the past 20 years but intended to represent the venerable scientific traditions of the Islamic world.

The irony is that these fake miniatures and objects are the product of a well-intentioned desire: a desire to integrate Muslims into a global political community through the universal narrative of science. That wish seems all the more pressing in the face of a rising tide of Islamophobia. But what happens when we start fabricating objects for the tales we want to tell? Why do we reject the real material remnants of the Islamic past for their confected counterparts? What exactly is the picture of science in Islam that we are hoping to find? These fakes reveal more than just a preference for fiction over truth. Instead, they point to a larger problem about the expectations that scholars and the public alike saddle upon the Islamic past and its scientific legacy.

Read the whole thing. It’s amazing how much fake stuff there is out there.

Pirates and the Metric System

From Taking Measure, a blog of the National Institute of Standards and Technology (via Slate Star Codex), an interesting historical anecdote offering a reason why the United States did not adopt the metric system.

Pirates of the Caribbean (Metric Edition)

September 19, 2017
by Keith Martin

To save his own life, Joseph Dombey had an idea. As two pirate ships surrounded the ship he was on in the Caribbean Sea in 1794, Dombey scrambled below deck, disrobing as he went. He appropriated the outfit of one of the ship’s many Spanish sailors and prayed that he had picked up enough of their language during his trips to South America to blend in. Dombey shouldn’t have been in this position. In fact, he shouldn’t have been in the Caribbean at all. None other than Thomas Jefferson himself was expecting to meet with Dombey in Philadelphia at that very moment.

Dombey’s fate that day arguably delayed the adoption of the metric system in the United States by almost a century and left us as one of the few countries in the world still using non-metric units for our everyday measurements.

The marauders now swarming Dombey’s ship were a particular breed of pirate: British privateers – the state-sponsored terrorists of the 18th century. These waterborne gangs had the tacit approval of the government in London to harass and plunder other countries’ maritime commerce and keep part of the spoils as their profit.

After seizing control of the ship, the pirates came across a sailor speaking Spanish with a curiously French accent—Joseph Dombey. A French physician and botanist acting under orders from the French government, Dombey had left the port city of Le Havre, France, weeks earlier for Philadelphia and the meeting with Jefferson, the United States’ first secretary of state and future president. But storms had pushed Dombey’s ship off course and deep into pirate territory.

France had supported the United States against the British in the War of Independence, and now they intended to build closer economic ties with the new American nation. Dombey was to negotiate with Jefferson for grain exports to France and to deliver two new French measurement standards: a standard of length (the meter) and a standard of mass called, rather ominously, a grave, to be considered by the U.S. for adoption. (The grave would be renamed the kilogram a year later in 1795.)

In many ways, Dombey was an excellent choice for this mission. Having already been on several trips to South America to collect botanical specimens, he was an experienced trans-Atlantic traveler. His knowledge of plants would also be of help in his agricultural trade negotiations with Jefferson. And Dombey’s scientific training as a physician and botanist gave him an understanding of the importance of accurate weights and measures, so it was highly likely that he would be able to convince Congress to adopt the new French standards, which would later come to be known as the metric system.

Despite his qualifications, Dombey lacked one important attribute: luck. His previous trips had all ended in failure. He had spent two years in Peru collecting plants that could be usefully cultivated in France, only to have the shipment captured by the British. A second collecting trip, this time in Chile and in collaboration with Spain, fell apart over a business dispute, with Spain keeping all the valuable specimens. But Dombey’s voyage to Philadelphia would turn out to be his most disastrous.

Upon learning his true identity, the pirates imprisoned Dombey on the Caribbean island of Montserrat. Unfortunately, Dombey died before they were able to ransom him to the French, and the units of measure in his charge never made it into Jefferson’s hands.

Some historians view this event as a tragic missed opportunity whose consequences we are still living with today. When the U.S. became an independent nation, it inherited an inconsistent collection of traditional British weights and measures. Congress was aware of the flaws with its British measures, and a congressional committee was formed to recommend solutions. Thomas Jefferson, an admirer of French scientific ideas, lobbied for a measurement system similar to that of France. But Congress didn’t adopt it, and the British-influenced system took hold in the U.S. instead. However, if pirates hadn’t intercepted Dombey on his way to Philadelphia, the situation might be very different today.

More at the link.

Science

The usual case study to illustrate the Scientific Revolution is the triumph of heliocentrism. As you are no doubt aware, at one point learned opinion held that the Earth sat immobile at the center of the universe, with the Moon, the Sun, all the planets, and all the stars moving around it. This was a view endorsed by Aristotle, and by Ptolemy (AD 100-170), and it fit well with medieval theology: it’s not that we placed ourselves self-importantly at the center of all creation, but that we are sitting at the bottom of a sewer. We’re still one level up from Hell, but on Earth, the place where the four elements interact, things change and decay. One level up from the earth, the Lunar Sphere, is where things are perfect, formed as they are from the fifth element, quintessence. Keep on ascending and eventually you get to the realm of the angels and God himself. And anyway, the Earth appears immobile. There’s no great rushing wind, and no observable parallax either – if we were moving around the Sun, the stars would be moving in relation to each other (that they are so far away that there could be no observable parallax did not occur to anyone).
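The parallax objection can actually be put in numbers. Even the nearest star, Proxima Centauri, is so distant that the Earth’s orbit subtends an angle of well under one second of arc as seen from there. Here is a rough sketch in Python, using modern figures that were of course unavailable to pre-modern astronomers:

```python
import math

AU = 1.496e11                   # Earth-Sun distance, m
LIGHT_YEAR = 9.461e15           # m
d_proxima = 4.246 * LIGHT_YEAR  # distance to Proxima Centauri (modern value)

# Annual parallax: the angle subtended by 1 AU at the star's distance
parallax_rad = math.atan(AU / d_proxima)
parallax_arcsec = parallax_rad * 206265

print(f"Parallax of the nearest star: {parallax_arcsec:.2f} arcsec")
```

That comes out to about three-quarters of an arcsecond. The best naked-eye observations (Tycho Brahe’s) were accurate only to about one arcminute, i.e. 60 arcseconds, so the shift was hopelessly invisible; stellar parallax was not actually measured until Friedrich Bessel managed it in 1838.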

The trouble is that Ptolemy’s model didn’t quite work. The planets were never exactly where they should have been. Astronomers assumed that this was a result of faulty manuscript transmission, but with the Renaissance and its mania for uncovering original texts, people discovered that Ptolemy was the originator of the bad data. (Another thing that they discovered is that not everyone was a geocentrist – the ancient Greek philosopher Aristarchus had proposed a heliocentric universe in the third century BC.) Finally, the discovery of the New World threw Ptolemy’s model even further into question. Ptolemy had proposed that the earthly realm consisted of four concentric spheres of earth, water, air, and fire. The first two weren’t quite aligned, however: the Earth should be completely covered by water, but part of it poked out above the water. This was the land – or rather, the three continents of Europe, Asia, and Africa, which themselves were in balance with each other. The discovery of the Americas illustrated that this theory was completely wrong. And if Ptolemy was wrong about that, what else was he wrong about?

Thus did the Polish astronomer Nicholas Copernicus publish De revolutionibus orbium coelestium in 1543, proposing an immobile sun at the center of the universe, with the apparent daily rotation of the stars the product of the Earth’s own axial rotation. All the planets, including the Earth, revolved around the Sun, with the Moon revolving around the Earth. De revolutionibus featured a preface claiming that it was “only a model,” but we now think that Copernicus himself was a genuine heliocentrist. This model didn’t quite work either, but it was simpler in many ways.

Heliocentrism thus became the scientific Big Idea of the sixteenth century, shared among certain scholars and derided as crazy by others – not only Catholics, but also Protestants like Luther and Calvin. After all, did Joshua not command the sun to stand still, and it stood still? Does not Chronicles state that “the world stands firm, never to be moved”? You can’t treat scripture as the foundation of your faith and not heed verses like these.

Galileo (1564-1642) wasn’t buying it. He was a heliocentrist anyway, and his use of the telescope to gaze at the heavens provided further evidence for a Sun-centered universe. Most famously, he discovered the four largest “Galilean” moons of Jupiter, proving that one could have “nested” revolutions (that the Moon went around the Earth, while the Earth itself went around the Sun, was a stumbling block to some people). He also observed sunspots on the Sun and craters on the Moon – in other words, “out there” was not perfect, but apparently made of the same stuff found “down here.” Galileo famously got into hot water with the Inquisition – no, he was not persecuted primarily for his belief in heliocentrism, but for his intemperate attacks on the Pope and the Church (which was rather touchy anyway on account of all the Protestants running around). But Galileo was forced to publicly recant his heliocentrism and spent the rest of his life under house arrest.

This was not enough to discourage further investigation. Tycho Brahe (1546-1601) was a Danish nobleman who took detailed and accurate observations of the heavens over a twenty-year period, and his German student Johannes Kepler (1571-1630) used the data to formulate the laws of planetary motion. Kepler discovered that planetary orbits are elliptical (with the sun at one of the “foci”) and that planets vary in speed as they travel. Again, this violates the principle of the “perfect” heavens: orbits are supposed to be perfectly circular, and speeds uniform. Once Kepler accurately described planetary orbits, astronomers could get rid of the epicycle – an orbit within an orbit invented to describe the apparent backward motion of some planets at times. Now, they realized, it was merely a function of variable speeds as the Earth “overtook” some other planet.

The capstone of this narrative is Sir Isaac Newton (1642-1726), who provided mathematical proof that gravity, the same force drawing things towards the Earth, is the exact same force keeping the planets going around the Sun. In one fell swoop, Newton explained all the motion in the universe, and got God out of it at the same time. Not that Newton was an atheist – he believed that only a divine mind could come up with something so elegant. But the planets no longer needed angels to make them move. Newton’s work was so culturally significant that it launched the Enlightenment – people believed that, using their reason, they could find other immutable laws that underlay other phenomena.
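Newton’s unification can be illustrated with a one-formula check: his law of gravitation yields Kepler’s third law, T = 2π√(a³/GM), and plugging in the Earth’s distance from the Sun recovers the length of the year. A minimal sketch in Python (the constants are standard modern values, not anything Newton had to hand):

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # mass of the Sun, kg
a = 1.496e11       # semi-major axis of Earth's orbit, m (1 AU)

# Kepler's third law as derived from Newtonian gravity:
# T^2 = 4 pi^2 a^3 / (G M)
T_seconds = 2 * math.pi * math.sqrt(a**3 / (G * M_SUN))
T_days = T_seconds / 86400

print(f"Orbital period: {T_days:.1f} days")
```

The result is about 365 days: the same force that pulls an apple to the ground fixes the length of the calendar year, which is exactly the sort of elegance that, for Newton, pointed to a divine mind.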

This, in a nutshell, is my lecture on the Scientific Revolution. Of course, I try to emphasize that it’s not a continuous narrative of Progress, that the whole thing was never foreordained, that there were all sorts of blind alleys to explore, that these scientists were human and prone to error and pettiness, etc. Furthermore, the heliocentric revolution did not even involve experimentation, an essential component of the scientific method, although it did involve accurate data collection and testable hypotheses. But one idea really did lead to another, and now we have a pretty accurate picture of the solar system (we can no longer claim that it is the entire universe). We can send spacecraft to explore these celestial bodies, and they seem to arrive and send useful data back to us.

I feel compelled to write this post today because this issue is still with us. We all hail Galileo as a martyr for the truth, and the whole episode remains deeply embarrassing for the RC Church. But the conflict between scripture and science remains when it comes to explaining the origins of life. The theory of evolution through natural selection, first formulated by Charles Darwin and refined ever since, is one of the great ideas that shaped the modern world – and unlike, say, Marxism and Freudianism, it retains its utility. It is, indeed, the foundation of the modern discipline of biology. Unfortunately, this theory’s explanation for the diversity of life on Earth contradicts the accounts given in Genesis 1 and 2, the opening chapters of the Jewish Torah, which Christians retain as part of their scriptures. Many Christians, especially around these parts, insist on the literal truth of scripture – certainly of the opening chapters, which explain the origin of everything, spelled out in a certain amount of detail. Thus has a certain type of Christian invested a great deal of mental energy in saving the appearances, of shoehorning all physical evidence into an explanatory theory that is in accord with the book of Genesis (including not only Creation, but also the idea that “in those days there were giants” and the Flood – did you know that the Grand Canyon came about as a result of this?). The most recent example, and one that was breathlessly recommended to me by several people, is the movie Is Genesis History? which was shown last week in select cinemas. It was so successful that two more dates have been announced. Act now!

Part of me respects how Christians (and/or conservatives) have created this parallel media universe to get around the liberal possession of the commanding heights of culture. But I really wish they’d focus on stuff that’s true – or at least useful. Needless to say, this deliberate obtuseness is one of the worst advertisements for Christianity right now. One might understand evangelical opposition to same-sex marriage or abortion as matters of opinion – but to a falsifiable scientific theory, in accord with 150 years of data, in favor of some iron-age mythology? Is it any wonder that coastal Americans look down on the denizens of flyover country?

All I can say is that I’m really glad that the Bible does not overtly contradict other modern scientific discoveries, like the circulation of blood, the existence of microbes, Boyle’s Law, or the periodic table.

Medieval Fighting

Stumbled on this reddit page about medieval fighting techniques, in particular “busting myths” about them. Excerpts:

***

Swords

THE BIG MYTH: Swords are the main weapon of a medieval/ancient society.

THE TRUTH: Swords are always a secondary or tertiary weapon for warriors, meaning that you would only use your sword if your main weapon was lost/broken/inappropriate. Main weapons would almost exclusively be pole based weapons (lance, spear, polearm, javelin, pike etc) or a missile weapon (bow, crossbow, sling, firearm etc).

Swords are at a big disadvantage against pole weapons in most situations, in both battlefield formations and 1v1 fights.

Spears

Spears were THE medieval and ancient weapon. They were used in some format by every army from the beginning of history to modern day – even professional soldiers have bayonets, turning their gun into a spear. They are so underrepresented in fantasy that the only notable wielders that spring to mind are Kaladin in Way of Kings and Oberyn Martell in Game of Thrones. That’s not many for such an important weapon.

An inexperienced spearman will often beat an experienced swordsman because the spear has a huge advantage in reach over the sword.

A spear thrust could penetrate mail but will not penetrate plate.

Swinging the whole pole around your head is a totally legitimate historical technique.

Axes

The one handed axe is a weapon that is used because it is cheap and easy to obtain, not because it is an especially good weapon.

Axe heads need to be pretty small. The huge axe that Gimli has in the LOTR films is far, far too large to be used practically (and must require him to have enormous strength to wield). An axe made specifically for war should be far smaller because in combat, speed is what matters.

If someone chooses to use an axe over a sword for non-armoured fighting then they need a very strange reason to do so. A sword has huge advantages over any single handed axe.

Axes were seldom favoured but they did have their uses. The Vikings made good use of the Dane Axe, a huge double handed weapon – but its purpose was to hew at enemy shield walls more than it was for personal combat.

Shields

If you don’t have plate armour, you want a shield. Shields are awesome.

If you do have plate armour, shields become redundant and you’re better off with a two handed pole type weapon.

Shields are also very inconvenient to carry around with you.

Unlike a sword, a medium-sized shield is actually pretty heavy. Training for 2 hours with a Viking-style shield will leave your shield arm tired. Since shields varied between little bucklers that just protect the sword hand and massive tower shields that covered the whole body, it’s not really possible to give a ‘standard weight.’

***

More at the link.

The Paris Pneumatique

An interesting post appeared today on Dartblog on the:

Paris Pneumatique mail delivery service, the fastest way to send correspondence up until the arrival of fax machines. Underground high-pressure lines (think of the air-powered tubes that are used today at drive-through bank tellers) covered Paris better than the metro (in the 1930s at the service’s peak there were 240 miles of Pneumatique tubes, mostly in the sewers, vs. 213 miles of metro today), and they shot their torpedo-like carriers across the city far more quickly than any express rider could travel. Delivery was guaranteed between any two points in Paris within two hours.

As recently as the mid-1970s, the Pneumatique system carried millions of pieces of urgent mail, but on March 30, 1984, after 117 years of operation (the first line entered service in 1867), the network was shut down for good — a sad ending for a technology that had been immortalized in François Truffaut’s 1968 film Baisers Volés (Stolen Kisses).

The technology that killed the Pneu, the fax machine — remember them? — lasted barely 25 years.

Another redundant French technology: the Minitel, a “Videotex online service accessible through telephone lines, considered one of the world’s most successful pre-World Wide Web online services.” My French host mother had one when I studied in France in 1992. I was impressed. You could look up phone numbers, order train tickets, and buy things! The service was discontinued in 2012 in the face of the Internet as we know it today.

Textiles

Interesting article on Aeon magazine: “Losing the Thread: How Textiles Repeatedly Revolutionized Technology,” by Virginia Postrel. Excerpts:

…textiles are technology, more ancient than bronze and as contemporary as nanowires. We hairless apes co-evolved with our apparel. But, to reverse Arthur C Clarke’s adage, any sufficiently familiar technology is indistinguishable from nature. It seems intuitive, obvious – so woven into the fabric of our lives that we take it for granted.

The story of technology is in fact the story of textiles. From the most ancient times to the present, so too is the story of economic development and global trade. The origins of chemistry lie in the colouring and finishing of cloth. The textile business funded the Italian Renaissance and the Mughal Empire; it left us double-entry bookkeeping and letters of credit, Michelangelo’s David and the Taj Mahal. As much as spices or gold, the quest for fabrics and dyestuffs drew sailors across strange seas. In ways both subtle and obvious, textiles made our world.

Most conspicuously, the Industrial Revolution started with the spinning jenny, the water frame, and the thread-producing mills in northern England that installed them. Before railroads or automobiles or steel mills, fortunes were made in textile technology. The new mills altered where people lived and how they worked. And the inexpensive fabrics they produced changed the way ordinary people looked.

Then, a second conspicuous wave of textile innovation began with the purple shade that francophile marketers named mauve. The invention of aniline dyes in the mid-19th century made a full spectrum of colour – including newly intense blacks – universally available. The synthetic-dye business gave rise to the modern chemical industry, and yet more technology-based fortunes.

As understandable as it might be, forgetting about textiles sacrifices an important part of our cultural heritage. It cuts us off from essential aspects of the human past, including the lives and work of women. It deprives us of valuable analogies for understanding how technology and trade transform economies and culture. It blinds us to some of today’s most pervasive innovations – and some of tomorrow’s most intriguing.

Read the whole thing.

The Middle Ages Vindicated!

Tim O’Neill shows how the notion that “science made little progress in the Middle Ages” is false, a product of Renaissance and Enlightenment prejudice:

The standard view of the Middle Ages as a scientific wasteland has persisted for so long and is so entrenched in the popular mind largely because it has deep cultural and sectarian roots, but not because it has any real basis in fact.  It is partly based on anti-Catholic prejudices in the Protestant tradition, that saw the Middle Ages purely as a benighted period of Church oppression.  It was also promulgated by Enlightenment scholars like Voltaire and Condorcet who had an axe to grind with Christianity in their own time and projected this onto the past in their polemical anti-clerical writings. By the later Nineteenth Century the “fact” that the Church suppressed science in the Middle Ages was generally unquestioned even though it had never been properly and objectively examined.

It was the early historian of science, the French physicist and mathematician Pierre Duhem, who first began to debunk this polemically-driven view of history.  While researching the history of statics and classical mechanics in physics, Duhem looked at the work of the scientists of the Scientific Revolution, such as Newton, Bernoulli and Galileo.  But in reading their work he was surprised to find some references to earlier scholars, ones working in the supposedly science-free zone of the Middle Ages.  When he did what no historian before him had done and actually read the work of Medieval physicists like Roger Bacon (1214-1294), Jean Buridan (c. 1300 – c. 1358), and Nicholas Oresme (c. 1320-1382), he was amazed at their sophistication, and he began a systematic study of the until then ignored Medieval scientific flowering of the Twelfth to Fifteenth Centuries.

What he and later modern historians of early science found is that the Enlightenment myths of the Middle Ages as a scientific dark age suppressed by the dead hand of an oppressive Church were nonsense.  Duhem was a meticulous historical researcher and fluent in Latin, meaning he could read Medieval scientific works that had been ignored for centuries.  And as one of the most renowned physicists of his day, he was also in a unique position to assess the sophistication of the works he was rediscovering and to recognise that these Medieval scholars had actually discovered elements in physics and mechanics that had long been attributed to much later scientists like Galileo and Newton.  This did not sit well with anti-clerical elements in the intellectual elite of his time, and his publishers were pressured not to publish the later volumes of his Système du Monde: Histoire des Doctrines cosmologiques de Platon à Copernic – the establishment of the time was not comfortable with the idea of the Middle Ages as a scientific dark age being overturned.  Duhem died in 1916 with his painstaking work largely unpublished, and it was only through his daughter Hélène’s 30-year struggle for her father’s opus to see the light of day that the whole 10-volume work was finally released in 1959.

Read the whole thing. One of the Renaissance books I recently read (can’t remember which one) suggested that scientific enquiry (like the status of women) actually abated in the Renaissance, which was obsessed with art and literature, to the impoverishment of natural philosophy.