If you like it, why don't you marry it?

Many of us, I’m sure, remember this elementary school taunt. Often you would be unknowingly baited in some way, e.g., “What do you think of Cheetos?” and then declare that you thought Cheetos were pretty great. The “Then why don’t you marry it?!” response is of course colossally dumb, but I admit that it often had its intended unsettling effect on me. Be careful of declaring that you like something! I believe C.S. Lewis pointed out in The Four Loves that loving anything at all, even a plant or a sunset, opens oneself up to pain and loss.

Historians of a traditional mindset such as myself often express admiration for the past. We may even pine for a return of the past in some way, and this naturally opens us up to the old school taunt: if you like knights and cathedrals and guilds so much, why don’t you marry medieval society? It is easy to “date” a civilization and pick out just the things you like. But all of what you like about medieval civilization, for example, also came with a near total lack of indoor plumbing, and no mouthwash either. You have to accept everything, and if you are not willing to do so, your admiration is stupid fancy at best, dangerous idealism at worst.

This charge has some of the same flaws as the old schoolyard taunt. The past surely can offer some salutary guidance even if reliving it remains obviously impossible. Aren’t we allowed to like things? But I acknowledge the point: one must not selectively pick, choose, and romanticize. We must “marry” the civilizations we study.

Books on the Middle Ages almost always fall into one of three camps:

  • Look at how dumb, superstitious, and oppressive they were. Aren’t you glad you didn’t live then?*
  • Look at how smart, chivalrous, beautiful they were. Don’t you wish you lived then?
  • Look at this culture. I examine it thoroughly, and discover that they did things, upon which I pronounce no judgments whatsoever.

Of the three, most books fall into the first two camps, but I like the last the least. The first two types of authors at least strike me as human beings with something to say. The temptation one must try to avoid is that of swinging entirely into the opposite camp. Henry Charles Lea’s The Ordeal, written in an era when the progressive ascent of democratic modernism seemed the only future, falls into the first camp. He examines the medieval practice of trial by various ordeals to illuminate the progress we have made since then. He comes not to praise, but to bury.

We can admire much about this book. It is not an uninformed screed, nor is it a hit-piece on the Middle Ages as such, for Lea notes that trial by ordeal happened in many other ancient cultures. He draws on a wealth of primary source texts and reports things with some air of detachment. His overall point is plain enough, but as I said earlier, at least he has a point. Like Chesterton, Lewis, and others among my literary heroes, I like the Middle Ages, but I need to contend with the fact that they did have trials by ordeal. Do I really want to substitute a hot piece of iron for a jury?

In what follows, then, I hope to fall into none of the three aforementioned camps.

I appreciated that Lea took time to show that other cultures also used trial by ordeal, such as Hindu and Islamic civilizations, as well as many ancient cultures. Lea also used a lot of primary sources; indeed, most of his book involves simply recounting the sources and commenting on them briefly. I also admired the fact that he included a section on the eucharist as an ordeal, for every other treatment I have seen ignores this aspect of medieval life, focusing on the more sensational ordeals by fire, water, and so on. Lea buries his treatment of the eucharist towards the back, but I feel this is where one should start if we want some understanding of the practice of ordeals in general.

If the central aspect of medieval life was the church, then the pearl within the oyster was the eucharist, where the faithful feed upon God Himself. Certainly I make no attempt here to develop any theology of the eucharist. But we may gain more insight if we pan out further to the Last Judgment. Many today have the idea that God’s final judgment involves Him declaring some fit and others unfit, and then banishing the unfit. The picture the early church gives us, rather, is that God’s love (and the presence of God is the love of God) both saves and condemns. God showers His love upon all, but His love is so strong that it resembles a refining fire. For some, made strong and made holy, the love of God warms and comforts. For others, who reject the love of God, God’s love leads to their further destruction, for they hate the love of God, and it burns them. As St. Isaac the Syrian stated,

. . .those who find themselves in Gehenna will be chastised with the scourge of love. It is not right to say that sinners in hell are deprived of the love of God . . . But love acts in two different ways, as suffering in the reproved, and as joy in the blessed.

An icon of the Last Judgment shows forth this same idea.

Salvation in a Christian context means that one is not so much declared righteous as made righteous through the grace of God–made able to receive the love of God as blessing and not as curse (I acknowledge that both terms have their place, however, in discussing the meaning of salvation).

So too communion, when thought about for even a moment, takes on monumental proportions. As Fr. Schmemann stated in his classic For the Life of the World, Feuerbach’s quip that “you are what you eat,” meant as a materialistic taunt, actually expresses a profound religious truth. To eat anything means to take the life of the fruit, meat, etc. into oneself. So too, in the eucharist God offers us the chance to take His life into our own. But this free gift does not come cheap. Scripture warns us about taking communion unworthily. We must realize that the presence of God can heal and transform us, or destroy us. As one prayer from perhaps the 8th century states,

Though I am hindered by so many and such great evils, I now add to them by approaching holy mysteries so heavenly and divine that even the angels desire to understand them. . . . Because of my unworthiness, I fear that, rather than receive divine enlightenment and a share of grace, I will be condemned . . . What am I to do? By partaking of the awesome mysteries, I subject myself to these and greater punishments. By abstaining from them, I shall fall into greater evils . . .

Mother of the Light, pp. 27-28

Lea’s work has many merits, but his leaving this background out of the discussion can lead one to a more superstitious understanding of the practice than is warranted. As an example we can take the ordeal of boiling water. Before the ordeal, a priest would pray over the water:

O creature of water, I adjure thee by the living God, by the holy God, who in the beginning separated thee from the dry land; I adjure thee by the living God who led thee from the fountain of Paradise, and in four rivers commanded thee to encompass the world; I adjure thee by Him who in Cana of Galilee by His will changed thee to wine, who trod on thee with His holy feet . . . water which washes away the dust and sins of the world, I adjure thee . . . to make manifest and bring to light all truth . . .

This prayer, quite similar to the prayers said at baptism, asks that God make the water a revealer of truth in the same way that water was used to fashion the world. That is, water must serve truth, which is a manifestation of God Himself, who is Truth.** The 3rd century bishop St. Gregory the Wonderworker stated, “The Lord, Who has come upon the Jordan River, through its streams transmitted sanctification to all streams (of water),” with Christ imparting to all water “a sign of heavenly streams” of grace.

For the early medievals, the same held true for the ordeal of fire and the hot iron. Prayers recalled how fire revealed much in Scripture: fire found Sodom guilty but Shadrach, Meshach, and Abednego innocent, and the burning bush of Horeb revealed God Himself. Again, I don’t think we should see the verdicts rendered by the fire ordeal as merely forensic. The fire, the water, and the other types of ordeals manifest God to men. Some by their holiness and innocence are able to stand; some by their sin cannot. Lea writes with a seemingly exclusive legal bent, and so misses the theological import.

Lea stated that “The History of Jurisprudence is the History of Civilization.” The sentiment has nobility but is misplaced. One must go deeper, at least to culture and preferably to religion, to see what shapes jurisprudence. This also means that he overemphasizes the technical matters of the law and misses some important caveats to the use of ordeals, two of which are worth noting.

First, Lea gives the impression that the medievals used ordeals willy-nilly. Rather, I believe they usually used ordeals as a last resort, when they had exhausted other means of determining the truth of the matter. Perhaps it is easier for a modern, depersonalized society to let matters such as hung juries or mistrials stand. For those in a pre-modern, more personal and local context, an unresolved verdict on a matter of great importance might put an unbearable strain on the community.

Second, Lea misses something of the “objectivity” of the ordeal. With no such measure, justice might tend toward the “justice” of the strong and powerful. It was not always the case that ordeals vindicated the weak against the strong, but it seems to me that it happened much more often than Lea cared to admit or notice.^

Lea’s anti-religious cards are on full display in certain choice vocabulary words like “superstition” and “fetish.” Indeed, when the Catholic church issued a general condemnation of ordeals in 1215, Lea sees not the triumph of a more reasonable religion but a political power play. So Lea blames the church for fostering and encouraging ordeals (including a quip about how it preferred the ordeal of fire, no doubt for its impressive aesthetic qualities), then fails to credit the church for dramatically curtailing the practice.

By now the reader may assume that, in seeking to explain ordeals more fully and expressing guarded appreciation, I should now “marry” them. I object to such a burden being placed on the historian. A practice may have been less onerous than some suppose, but that does not mean the practice has no issues. No church today (with the exception of the snake-handling cults), and indeed no churchman I am aware of for basically the last 500 years, has recommended the practice. I feel no need to do so either.

Historians usually come in absolutist or relativist garb. The absolutist would say, “If ordeals are wrong now, they were wrong then. The stories of people emerging unscathed from ordeals are either lies, exaggerations, or works of the devil, for no good can come from such an unjust practice.” A relativist might tell us that we should not judge the past, and indeed cannot judge anyone ever for anything. The historian should work for “understanding” and should avoid “judgment.”

One should draw from both perspectives to a degree, but embracing either one in its totality leads to incoherence. Will Durant proposed a generous means of interpreting people and cultures from the past. If a man shares the vices of his age, that is unfortunate, but does he have virtues that cut against the grain of his society? How does a culture compare to other cultures of its time? I find the medievals did not do so badly on the relative scale, but on the absolute scale, I would not want to bring them back.

I have the feeling that Lea would dismiss all of the accounts of God working through the ordeals as fabrications and propaganda. I will not so glibly dismiss numerous testimonies, and so that leaves me in the position of believing that God used an imperfect and “arbitrary” means to achieve His ends. But this is hardly a problem; He has done so since the beginning of time.

The Catholic Church’s Fourth Lateran Council of 1215, which attempted to ban trial by ordeal, gave as one reason the fact that ordeals “tested God.” That is, God pledges Himself to act in certain ways in the sacraments of the Church, but we cannot take this pledge and extrapolate it to any sticky situation we face. We have not the power to call God down and demand He reveal Himself when we are stuck. As C.S. Lewis famously noted regarding Aslan–“He is not a tame Lion.” It may be, then, that the story of trial by ordeal involves not so much the folly of men, but the humility of God, who accommodated Himself to our weakness patiently for a time.

Dave

*It is interesting that no one really writes about the ancient Babylonians, Chinese, Mayans, etc. in the way that we write about the Middle Ages.

**To his credit, Lea cites several instances from saints’ lives of people putting their arms in boiling cauldrons, either to test obedience or to settle some other point of dispute, and emerging unscathed.

^As an example, see Eric Jager’s book The Last Duel, which chronicles the plight of a woman who accused a prominent nobleman of raping her. The issue could not be definitively resolved at trial, and her husband agreed to fight the accused to the death to determine the verdict. He won, and the accused was pronounced guilty.

We should pause for a moment and flip the script, putting jury trials under a touch of scrutiny. One can read online a plethora of articles about the fairness of juries, the random nature of verdicts, and so forth. Again, I would not suggest replacing jury trials with medieval ordeals, but for someone like Lea, who believed that ordeals were entirely arbitrary, modern evidence about juries does not give us as much separation from the past as we might wish. And yet, we too have to invest the jury trial with a kind of sacredness if we are to have any kind of society at all.

The Idea of an “Empire of Liberty”

How we label things, or how we construct their meaning and place in history, obviously will say a lot about us.

The Constitution serves as a good example.

The Constitution has its flaws, its oversights, and some ungainliness about it. It is obviously not perfect. But it surely has worked on some level, having lasted this long. Because it has worked (or at least we assume it has — more thoughts on this later), we think of it as a very “modern” and forward-looking document. This matches how we think of ourselves. We are a “progressive” people, so the documents that define us must have the same character.

But in his two books Empire of Liberty and The Idea of America, acclaimed historian Gordon Wood makes the point that the Constitution in fact tried to stem the rising tide of “liberty” and change unleashed by the “spirit of ’76.” The Constitution may not have been a completely “reactionary” document, but it was a response to a quick erosion in society of what many elite revolutionaries like Adams and Madison held dear.

Through various quotes and citations, Wood lists the changes seen and feared by such men in the 1780s . . .

  • A loosening of traditional relationships between men and women — parents had much less control over who their children married.
  • Riots and protests against professors at the few (and elite) colleges by many students demanding curriculum changes, attitude changes, and the like
  • A decided turn against the virtues of the ancient Romans and a great movement toward the ideas of tolerance and conviviality as the means to hold society together
  • Democratization of religion, which became much less authority driven and much more ‘touchy-feely.’
  • Extreme partisan politics on local levels, with stories of violent behavior in state legislatures rampant.  The rise of the “party-spirit” in politics bewildered many.
  • Both “free love” and drugs going mainstream into the culture

Ok — the last is not true, but the list looks quite familiar to us, calling to mind the 1920s, the 1960s, or today. Maybe we must face the fact that this is what America was, is, and ever shall be. As Alfred North Whitehead once said, “The major advances in civilization are processes which all but wreck the societies in which they occur.”

Wood asserts that while the Constitution succeeded in establishing a structure that put up some barriers to change, overall the idea of liberty and “the people” triumphed over the Constitution’s conservative aspirations. In the end, the idea of liberty and the reality of the voice of the people remade the Constitution in their own image. The Anti-Federalists, those who opposed the Constitution’s ratification, lost the battle but won the war. The best the ideals of Washington and Adams can do now is fight rear-guard actions against the overwhelming power of the “people.”

Though initially shocking to our sensibilities, the idea of a reactionary Constitution helps make sense of much of American history down to the present day. It also partially explains why appeals to the vision or the intent of the founders* fall on deaf ears. For one, Madison and others probably believed that the factional “evils” they saw in state governments would not transfer to the national legislature. Perhaps Madison wanted a stronger national government because he thought that only the “better sort” would get elected to the national legislature and thereby elevate discourse. This “better sort” would not fall prey to party politics. He sought, then, to give more power not so much to the national government as to the “better men.” However strong Madison’s hopes on this score, they quickly proved illusory. Madison and others like him either misinterpreted or remained ignorant of exactly what their revolution had wrought.**

At a deeper level, other questions arise. Can the structures of organizations curtail the underlying principles that formed them? On this one I tend to side with the Jurassic Park dictum, “Life finds a way.” The early “conservatives” could not call upon the spirit of self-determination in 1776 and then expect to put the toothpaste back in the tube once the Revolution had accomplished what they wanted it to. The population at large could rightly retort, “What about us?” Bowing to the “will of the people” became a necessity, and eventually a foreordained, positive good. Even a Constitutional “literalist” or “minimalist” like Jefferson gladly dispensed with his principles with the Louisiana Purchase, among other instances. Like a nuclear blast, the concept of “liberty” leveled everything in its path. What mattered to most was not the past but the future. The founders had done their part, but our vehement abhorrence of anything smacking of aristocracy quickly made us resistant to anything resembling the determining “tyranny of the past.” John Adams, among others, tried to protest, “Wait! That’s not what we meant!” Most responded with some form of “I don’t care!” Within a generation of the Constitution’s ratification, “egghead” professions like that of lawyer were already viewed as “elitist” by many, especially towards the frontier. The seemingly radical nature of the “Jacksonian Revolution” actually had its roots laid years prior.

Wood deals with the slavery question in relation to these political questions, but I found his analysis of the relationship between Americans and Native Americans more intriguing. He writes,

Conceiving itself as a composite of different peoples, the British Empire could somehow accommodate the existence of Indians within its territory.  But the new American Republic was different: it contained only citizens who were presumably equal with one another.  Since the United States could scarcely imagine the Indians as citizens equal to all other American citizens, it had to regard Native Americans as members of foreign nations with which treaties had to be negotiated.  Of course, most of the Indians themselves had no desire to become citizens of the American Republic.

While the 17th century colonists did fight with the Indians, little doubt exists that American independence proved a disaster for Native Americans. Problems began years before the war itself; one of the driving issues behind the Sugar Act and Stamp Act involved keeping colonists off Native American land. Wood’s reasoning fits with de Tocqueville’s thoughts on equality, and the problem persists today. We have yet to work out the tension between liberty and equality. When the “people” speak (however we measure this), we cannot tolerate deviation from the norm. The example of Bruce/Caitlyn Jenner speaks to this. How many ESPN commentators could keep their jobs and declare that Jenner is tragically mistaken in his actions? To be fair, had ESPN existed 50 years ago, could anyone have applauded his actions and kept his job then?

One unsaid implication of Wood’s book concerns pride of place between the American and French Revolutions. Most see the American Revolution as giving birth to the French Revolution, with the French Revolution as the bastardization of all that went well in America, withering on the vine as Napoleon took over. But we might instead see the French Revolution as the real victor, with its sense of the power and authority of the “people” in more or less full swing in America by the early 19th century. He who laughs last laughs loudest. Or perhaps both of these positions wrongly presume an essential difference between the two events. Maybe the American Revolution came to resemble the French Revolution because they had the same origins — fraternal if not identical twins. If we consider this option, then we may need to reevaluate American history as a whole — an exciting if daunting task.

“A man’s worst difficulties begin when he is able to do as he likes.” — T.H. Huxley

Dave

*We should note that those at the Constitutional Convention had different ideas among themselves, and that others whom we might consider “founders,” like Patrick Henry, John Adams, and Thomas Jefferson, were not at the Convention at all.

**Wood cites a variety of sources to demonstrate that the traditional understanding of the need for the Constitutional Convention, namely the weaknesses of the Articles of Confederation, is false. Many had perfect awareness of these weaknesses, and their fix remained relatively simple. The real concerns of men like Madison and Washington lay rather in their observations of the petty bickering of state legislatures, and in the fact that seemingly any man Jack could get elected to them.

Authenticity, Man

Having flamed out on Charles Taylor’s A Secular Age, I was happy to find a more bite-sized chunk of his thought in The Ethics of Authenticity. I admit that I approached the book, like others of a traditional bent, leaning against the very idea of authenticity. “Get over yourself, already.” The search for meaning within the self can never go anywhere and remains something of an illusion. What, after all, is there really to “experience”?

The whole concept of “authenticity,” born out of the 1960s (or so I thought), has given rise to a whole host of modern problems. All of the issues with sex and gender have their roots here, as does a great deal of spiritual innovation within the church, along with the Trump presidency. Many inclined to read this book of Taylor’s might hope for a thorough denunciation from the venerable professor.

Of course, boring conservatives such as myself may not have always been such. We may remember the days of our youth, when it seemed we had to break free from our surroundings to see what we were made of. Taylor taps into this, and so, while he criticizes much of what “Authenticity” stands for, and finds it ultimately self-defeating, he reminds us that a kernel of something like the truth remains within this (in my view) unpleasant husk. Taylor writes,

The picture I am offering is rather that of an ideal that has been degraded . . . So what we need is neither root-and-branch condemnation nor uncritical praise; and not a carefully balanced tradeoff. What we need is a work of retrieval . . .

Taylor demonstrates that what we term “authenticity” has its roots in Christianity. In the ancient world nearly every person received their identity from what lay wholly outside their control, be it birth, race, family, etc. The triumph of Christianity meant believing that an entirely other world lay outside of our normal lives, the Kingdom of God, in which there existed “neither Jew nor Gentile, neither slave nor free . . .” (Gal. 3:28). The book of Revelation tells us that God “will also give that person a white stone, with a new name written on it, known only to the one who receives it” (Rev. 2:17). St. Augustine’s magnum opus told us that the City of God lies nestled, in some ways, within the City of Man, to be discovered by anyone willing to walk through the wardrobe.

18th century Romantic thinkers, primarily Jean-Jacques Rousseau, picked up this dormant thread and then sprinted with it in ways that St. Augustine would abhor. I find the 18th century an absolute disaster for the Church, but still, Taylor calls me to at least a degree of balance. There was something quite ridiculous and artificial about the French aristocracy, for example, ca. 1770, accurately portrayed (I think) in a memorable scene from the John Adams miniseries.

Perhaps St. Augustine did see “the road to God as passing through a reflexive awareness of ourselves” (p. 27). Many have pointed out how psychologically oriented was Martin Luther’s view of salvation. John Calvin began his Institutes by asking his readers to heed the Socratic dictum to “know thyself,” for knowledge of God and knowledge of ourselves have an inextricable link. But Taylor sees Rousseau as the main originator of the modern view of authenticity. For Rousseau, “Our moral salvation comes from recovering authentic moral contact with ourselves,” and “Self-determining freedom demands that I break free from external impositions and decide for myself alone” (p. 27).

Well . . . ok. I don’t like the concept of authenticity, but perhaps the difference lies in the “time of day.” What I mean is this: bacon and eggs smell wonderful on the skillet when you’re hungry before breakfast, but that same smell hits one very differently on a full belly after lunch. The early Romantic movement, just like the early Enlightenment, had good points to make. Rousseau championed, among other things, mothers actually breast-feeding their own children–something dramatically out of fashion in his time. Connecting with “nature” and the “self,” I suppose, could lead to more morally responsible living. You cannot blame your birth or other circumstances for who you are. But let it linger too long and you get the ridiculous movie Titanic, riddled through with romantic and “authentic” ideology–the smell of bacon and eggs at 10 am after you have already eaten. Still, Taylor asks the reader to see the promise of the morning before jumping straight to the twilight.

As for today, Taylor points out a variety of ways in which the “authenticity” narrative has gone astray.

Rousseau and his followers helped fuel democratic movements at home and abroad, but having created democracies in part through dignifying the self, these same democracies would make a mockery of the original golden thread. Liberated from tradition, democratic man seemed to attain authenticity not via stern moral struggle against tradition, but as a birthright. If we are all authentic, then we are all special, and thus we all need to be recognized and regarded by others.

Taylor’s insights show us why those like Caitlyn Jenner, for example, can be granted moral weight. The Authenticity narrative tells us that such people have attained the status of “real” humans because of their “courage” to make war even on biology itself–the final frontier–in order to achieve their version of true personhood. And while I believe that those who alter their sex (if such a thing is truly possible) make terrible and tragic decisions, Taylor hints at why those who make these decisions often find them so empowering. Seeking a “genuine” connection with the self is the modern version of a transcendent experience. We grant large amounts of authority to those who have such experiences, like the mystics of old.

Taylor also points out that the endgame in store for “authenticity” lay implicit in its origins. If the self is to be the guide, and self-actualization has the ultimate authority, then we have a contradiction. For the self can never be absolute, certainly not over others. Telling someone about your “experience” is nearly as bad as telling someone about the dream you had last night. In the end, we require an outside reference.

Alas, logical contradictions will likely not derail the Authenticity movement. But it is possible that time may take care of this in ways that logic cannot.

I mentioned above the analogy of the smell of bacon before and after breakfast, and the analogy holds true in other aspects of life. In his War and Civilization compilation, Toynbee admits the allure of the “morning” of the military outlook when reading the Iliad. Homer’s battle scenes have a dramatically bracing effect. Then fast-forward to the 19th century, where Prussian militarists like Helmuth von Moltke give one an entirely different impression of essentially the same thing Homer described:

Perpetual Peace is a dream–and not even a beautiful dream–and War is an integral part of God’s ordering of the Universe. In War, Man’s noblest virtues come into play: courage and renunciation, fidelity to duty and a readiness to sacrifice that does not stop short of offering up Life itself. Without War the World would be swamped in materialism.

Toynbee comments that “there is a note of passion, of anxiety, and of rancor” here that takes us far away from the Greek poets. Moltke continues, perhaps even aware that he sails too close to the wind:

It is when an institution no longer appears necessary that fantastic reasons are sought or invented for satisfying the instinctive prejudice in its favor, which its long persistence has created.

If the modern Liberal order was created in part on the back of Authenticity, then surely we might say that those who still champion the idea copy Moltke and indeed invent “fantastic reasons” for the path they take. Perhaps Authenticity has run its course and can go into hibernation. We will wake it up when the scales have tipped too far in the other direction.

Dave

A Method to the Madness

In the heady days of youth, many a man in my position (i.e., newly engaged) has allowed himself to watch a whole host of Jane Austen movies with his literarily inclined fiancée. Depending on our taste and level of courage, some of us liked the movies, while others pretended to like them to one degree or another. But as I watched them I recall having a thought (one that I most definitely did not voice at the time) that I think most people have when exposed to Austen’s world: “What exactly did these women do all day?”

Enter Norbert Elias to answer this and other perplexing questions about European aristocratic life in the age of Louis XIV and beyond. His book The Court Society sets out to recover the context in which the European aristocracy lived. They had reasons for their actions, reasons that made at least some sense in their world. And like any other system, the seeds of its destruction embedded themselves right within the virtues the aristocracy practiced.

Early on in the book one realizes that, yes, the aristocracy did have “jobs.” Of course menial, “blue collar” labor remained beneath them, but each member of an aristocratic household had charge of the family name, and of advancing that name. Americans have little concept of this, but once we understand this idea, most everything else about the aristocracy falls into place.

While Elias did not deal with Austen’s period, I couldn’t help but reference her work when thinking of what Elias described. In the Austen movies the women spend a great deal of time visiting one another, and Elias points out how this practice allowed for a display of rank and honor. Thus, these meetings between aristocrats rarely had a “purely social” character. Some may recall Elizabeth’s surprise visit to Darcy’s estate in Pride and Prejudice. Darcy quickly puts on his “Sunday best” to receive visitors. Of course it is polite in any society not to receive visitors in the equivalent of pajamas, but it is important to Darcy as well to reflect the dignity of his house to others. This may be why his house (like other aristocratic houses) remained open to the public, which seems quite strange to modern Americans. How can one just show up uninvited? But the aristocracy generally welcomed such visits, as an actor welcomes a chance to perform. Proper dress and decorum went beyond mere politeness — they served as a means of displaying and advancing status. Being a good host or guest was “work” for the aristocracy. Advancing the family name meant advancing the family fortunes. One might even imagine the members of the family often “on campaign” to advance or defend the family honor, as this note from the Duchess of Orleans to the Duchess of Hanover makes clear:

I must really tell you how just the King is. The Duchesse de Bourgogne’s ladies, who are called Ladies of the Palace, tried to arrogate the rank and take the place of my ladies everywhere. Such a thing was never done either in the time of the Queen or of the Dauphiness. They got the King’s Guards to keep their places and push back the chairs belonging to my ladies. I complained first of all to the Duc de Noailles, who replied that it was the King’s order. Then I went immediately to the King and said to him, “May I ask your Majesty if it is by your orders that my ladies have now no place or rank as they used to have? If it is your desire, I have nothing more to say, because I only wish to obey you, but your Majesty knows that formerly when the Queen and the Dauphiness were alive the Ladies of the Palace had no rank, and my Maids of Honour, Gentlemen of Honour, and Ladies of the Robe had their places like those of the Queen and the Dauphiness. I do not know why the Ladies of the Palace should pretend to anything else.” The King became quite red, and replied, “I have given no such order, who said that I had?” “The Maréchal de Noailles,” I replied. The King asked him why he had said such a thing, and he denied it entirely. “I am willing to believe, since you say so,” I replied, “that my lackey misunderstood you, but as the King has given no such orders, see that your Guards don’t keep places for those ladies and hinder my servants from carrying chairs for my service,” as we say here. Although these ladies are high in favour, the King, nevertheless, sent the majordomo to find out how things should be done. I told him, and it will not happen again. These women are becoming far too insolent now that they are in favour, and they imagined that I would not have the courage to report the matter to the King. But I shall not lose my rank nor prerogatives on account of the favour they enjoy. The King is too just for that.

The greatness of the “House” depended on the greatness of the family, which explains why Darcy would have hesitated to be in the company of Elizabeth’s family. A man of Darcy’s status would naturally hesitate to confer “honor” on her family by visiting, or especially by dancing, which would have conferred an extra measure of approval on their “low status” behavior. And with the family’s status teetering on the brink, one can see how potentially damaging Lydia’s behavior would be later in the book.

Elias points out that the aristocracy needed to visit others not only to forge connections and to give and receive honor, but also to understand their place in the social hierarchy. Take fashion, for example. One should always dress appropriately to one’s station, never above it or below. But the appropriate dress might shift over time depending on how others dressed and what approval they received from those above them. A lord “goes for broke” and wears a cravat a bit frillier than he might normally wear while visiting a duke. The duke gives his tacit approval by wearing an even more outlandish cravat, and now everyone must level up their cravats. Suddenly, the “normal” cravat another lord wears is out of fashion — he now dresses as a boor. If he had been invited to more places and been busier with his “job,” he would have known this. His family’s status declines. Hence the aristocracy’s near obsession with visiting and being visited. It was the only way to have “information,” to use a word frequently uttered in Austen’s Emma.

Family status often had little to do with money. No aristocrat worth his salt would stoop to such vulgar behavior as actually caring about money. I believe Saint-Simon relates a story of one baron who gave his son some money to spend on the town. When the son returned with money left over, he received harsh criticism from his father, who threw the remaining money out the window. In returning with money the son showed not prudence but foolishness. Anyone who looked like he counted his money might look like he cared about money, and that stigma would hurt his reputation severely.

Americans often get accused, and rightly so, of focusing far too much on money, which proves our essential boorishness as a nation. We have to see this malady in some ways as a by-product of equality. Americans for the most part have no built-in social framework for support, no “society” (to use another term from Emma) in which we can claim membership. Money, therefore, more so than family or connections, becomes our primary, if not our only, tool to keep us afloat. The charge against us is just, but the charge is easier to avoid in aristocratic societies.

Many aristocrats got their names inscribed in stone by risking vast sums on throws of dice and turns of cards. One might go broke with such games, but even an incredible loss had glory in it, and at least proved one’s cavalier approach to money. Far better a spendthrift than a miser, but this half-virtue ruined many families. For of course, they did need money just as anyone else did. Tradition militated against their developing a trade, speculating, or becoming merchants. They hoped for appointments to the high-ranking government or military posts that traditionally went to high-ranking aristocrats. The only way to prove oneself worthy of this honor was not only to have impeccable taste and sense within the pecking order, but also to demonstrate that one never needed to ask the price of anything. They played a dangerous game, one that Louis XIV must have been only too delighted to see them play. The more the fortunes of the aristocracy ebbed and flowed unpredictably, the greater his power.

So a method did exist. And we see that, yes, they did work of a kind and had many constraints on their existence. They were not free in the sense we might imagine. I had students watch a video about how aristocrats dressed in the 18th century.

As one might expect, the students thought these habits pointless, wasteful, and weird (so much makeup for the man!)*, and so on. But we must seek to understand.

  • Fundamentally, they sought to dress in ways in which commoners could not possibly dress.  They needed to reflect their proper status, for their own benefit, of course.  But it went beyond that–it was for the good of society too (at least in their minds).  To reflect their station was to give witness to the great chain of being.
  • Most of us dress in rather plain ways.  I think they might say of us that, “You have nothing in your society to lift you above the mundane and ordinary.  You have no higher goal than your base entertainment.  Should there be no glory, nothing to strive for?”

I think this last point has some merit.  But I’m not wearing makeup.

Perhaps one might think the life of the king free from constraint, but not so. Louis XIV put before himself a tremendous task: to become the state. While apparently he never actually uttered the phrase “L’état, c’est moi,” he did say,

“The interests of the state come first. When one gives these priority, one labours for one’s own good. The advantage to the state redounds to one’s glory.”

So, while Louis did get to set the rules of fashion (as the top aristocrat, all matters of taste and decorum flowed down from him), he had to organize his use of power methodically. In order to effectively display the glory of France, and of himself, and to set the rules, he had to be “on call” all the time. This lends more sympathy, perhaps, to the comical and bizarre rituals of various select noblemen watching Louis dress, undress, and eat. I had always focused on the prison the nobles had allowed themselves to enter, but to keep the nobles beholden to himself, Louis had to keep himself beholden to them. He too faced severe constraints on his behavior.

This element of control had to be extended at Versailles to nature itself.


With Louis XIV one gets a glimpse of what may have been the final apogee of the medieval idea of the Great Chain of Being, where happiness consisted in knowing who you were by knowing your place in the universe, and how that place related to the redemption of all things. But in what could be called its culmination, the egg goes bad instead of hatching. No wonder so many aristocrats supported the French Revolution, and even supported abolishing feudal titles. One must always take caution when using one’s own culture and experience to judge the past, but perhaps the aristocracy simply got tired of playing a game no one had any real chance of winning. One can make a good argument for the real usefulness of the aristocracy during the medieval period, but that time had long since passed, and one wonders if the French nobility somehow, deep down, knew it.

Dave

*Yes, I too am disturbed by the use of makeup. But we must be careful . . . not so long ago, a woman wearing pants would have been considered a form of cross-dressing. Men wearing earrings takes on different meanings at different times, and so on.

Risky Business

In Goodbye to All That Robert Graves wrote that in W.W. I, young officers went through distinct stages in their acclimatization to trench life. For the first couple of months, the new man was a danger to himself. Ignorant of how things worked, he easily could walk in the wrong place at the wrong time and be injured or killed. Then, from around month three to month six, the officer functioned well, armed with more knowledge but maintaining a healthy fear. The real problems came after the sixth month, when the officer became a danger to others. The abnormal risks of trench life had become normal life for him, which led him to take stupid chances that foolishly cost lives–chances he would not recognize as abnormally risky at all. Graves, of course, experienced this firsthand as a young officer himself.

Diane Vaughan’s excellent The Challenger Launch Decision made me think of Graves’ observation. Ultimately I lacked the patience and technical familiarity to benefit fully from her magnificent analysis of the decision to launch the Challenger in January 1986. A sociologist by training, she nevertheless demonstrates a solid command of the technical information. But she focuses not on what happened but on the working cultures of engineers, managers, and NASA in general. The end result lends a great deal of complexity to our standard understanding.

After the tragedy, the standard understanding of the event went something like . . .

  • NASA faced unusual pressure to launch as part of a two-pronged effort to 1) showcase civilians in space (the teacher Christa McAuliffe), and 2) be included in the President’s State of the Union address.
  • The unusually low temperatures created the possibility of the failure of the ‘O-Ring’ to seal. Design engineers noted this possibility and passed their concerns up the chain of command.
  • Design engineers indeed did initially recommend canceling or delaying the launch, but faced with strong pressure from NASA managers, changed their minds and approved the launch.

Events seemingly fit a Hollywood script–plucky engineers, lone dissenters faced down by The Man, with tragic results.

But Vaughan painstakingly points out that this narrative dramatically oversimplifies what happened.

First, the idea of risk . . .

Vaughan gives a helpful picture regarding the perception of risk. Imagine a butcher shop, with its variety of saws, knives, etc. A two-year-old around said saws and knives would have no perception of any risk whatsoever. His mother would have an entirely different perspective. The butcher himself would have yet a third perspective. He understands the equipment, knows that it can be dangerous, but accommodates his life to work around and manage these risks. Without the willingness to do so, he could not be a butcher at all.

So too, Vaughan reminds us that space flight remains extremely risky, at least relative to almost anything else our government undertakes. When the general public, unfamiliar with such risks, sees engineering reports that describe “possible” malfunctions (or some other such phrase), we react in ways that NASA personnel would not. To attempt to fly into space at all meant assuming a great deal of risk to begin with. That something might go wrong was perfectly obvious; the more relevant questions for NASA routinely ran along the lines of:

  • What is the likelihood of something going wrong?
  • Does this likelihood exceed the threshold of “acceptable risk?”
  • Why do you think it would go wrong? Is this a theory, or do you have testing data to back it up? If you have data, is that data conclusive or conjectural?

NASA had known for years about the problems with the solid rocket booster design and performance. They had known that what happened to the Challenger might possibly happen. The Solid Rocket Boosters (SRBs) always worried NASA. They launched in spite of these concerns because “we are in the business of launching rockets”–they routinely “pushed the envelope.” If you don’t want risks, then you don’t want to go to space. No one thought the shuttle design perfect–far from it–but the shuttle was what they had to work with.

Vaughan suggests that NASA made the important decisions about the SRBs years before the Challenger launch, when they accepted a design that all knew had flaws, and then began a process of systematically normalizing those flaws. Bigger flaws then became smaller flaws, because the starting point itself had flaws built in.

Most Americans accept, however, that space travel has risks. What seemed abnormal and negligent were the events leading up to the launch. We had a launch in unusually low temperatures, a launch to which the SRB design engineers from Thiokol objected the night before. They feared that at any temperature below 53 degrees the rubber O-rings might not seal properly and would allow too much explosive gas to escape. Of course this is exactly what happened.

But NASA managers strongly criticized Thiokol’s assessment and (seemingly) put pressure on them to change their opinion. Larry Mulloy stated, “My God, Thiokol–when do you want me to launch? Next April?” Another NASA manager, George Hardy, stated that he was “appalled” by their recommendation. Thiokol went back to confer among themselves and then reversed their position. The rest is tragedy, and on the face of it NASA seems horribly guilty.

Vaughan has plenty of criticism for NASA throughout her lengthy book. The book gives copious details about the budgetary issues, design choices, and engineering culture that was NASA between the early days of shuttle development and January 1986. I will focus on the launch decision itself, because what made The Challenger Launch Decision such a great read for me is that by the end of it, I perceived the seemingly damning comments given above entirely differently. Vaughan devotes at least half of the book toward this end. NASA bears ultimate responsibility for the disaster, but Thiokol bears much of the blame as well. Events proved them right, but those who objected to the launch were not right for the right reasons, or at least not for reasons that the engineering culture of NASA would be able to hear and act upon.

Of course NASA had known of the O-ring problems for years.

Thiokol’s stated position [was] that the design flaws [of the SRB] are not desirable but acceptable.  Neither NASA nor Thiokol expected the rubber O-ring sealing joints to be touched by hot gases of motor ignition, much less be partially burned.  However, as tests and flights confirmed damage to the sealing rings, the reaction was to consider the amount of damage “acceptable.” At no time did management recommend a redesign of the SRB . . . 

Presidential Commission Report on Challenger

From the beginning, a certain understanding of risk developed within NASA.

As in all previous space programs, certain residual risks have been accepted by management.  These residual, or acceptable risks, which remain after all feasible corrective actions have been taken, have been thoroughly reviewed . . . 

The conclusion of this review is that there is no single hazard nor combination of hazards that should be considered a constraint to launch.  All phases of Shuttle development and operations are continually being monitored to ensure that the aggregate risk remains acceptable.

Space Shuttle Safety Assessment Report, 1981

Before the Challenger, NASA had 24 successful launches in different kinds of weather. Yes, the O-rings would always be damaged, but that damage stayed within the bounds of “acceptable” wear and tear.

Vaughan collected a great deal of documentation and firsthand testimony to describe what happened on the fateful eve of the disaster. NASA had several critiques of Thiokol’s recommendation to postpone the Challenger launch.

The 53 Degree Limit

As mentioned above, on the night before the launch, Thiokol, the SRB designers, declared that they could not recommend any launch when the outside temperature dropped below 53 degrees.

For NASA, this posed some terrible problems.

Whether [NASA’s Larry Mulloy’s] choice of words was as precise as perhaps it could have been, it was certainly a valid point, because the vehicle was designed to launch year-round.  Thiokol was proposing significant changes to the whole shuttle program on the eve of launch.  

NASA Engineer Larry Wear

The implications of trying to live with the 53 degree [limit] were incredible.  And coming in the night before the launch on such a weak basis was just–I couldn’t understand it.  

NASA Engineer Bill Riehl

And from Larry Mulloy, of the “launch next April!?” comment:

There are currently no launch criteria for joint temperature.  What you are proposing to do is to create a new launch criteria [not backed up by data], after we have successfully flown 24 launches with the existing criteria.  With the new criteria we may not be able to launch until April.  

I was frustrated.  Their analysis was dumb.  The data said one thing, the recommendation another.  Their 53 degree limit did not solve the technical issue, which was–what temperature did the joint need to be, [not what temperature it was outside]?

I find Mulloy’s point about joint temperature rather than outside temperature the most crucial. Also, the Challenger launch had already been delayed several times, and on some of the other proposed launch days the temperature was below 53 degrees. None of those postponements was based on temperature, and Thiokol never raised the objection. The launch scheduled for January 27, when it was 37 degrees out, was postponed, but not for cold, and again Thiokol raised no temperature objection. Thiokol engineer Jack Kapp admitted that,

Most of the concerns we presented . . . we had a very difficult time having enough data to quantify the effects we had talked about.  A lot was based on engineering “feel.”

It seems unreasonable to me to ask that NASA cancel a launch and completely revise launch criteria based on “feel.” In fact, in the various O-ring tests, the worst damage the equipment ever sustained came on a day of high outside temperature. Thus, no data existed to show that low temperatures had a conclusive impact on the ability of the O-rings to seal. And yet NASA said they would have canceled the launch had Thiokol stuck to their guns.

George Hardy, of the “appalled” comment, also stated, according to many witnesses, that he would not launch against designer advice, saying, “For God’s sake, don’t let me make a dumb mistake.” If Thiokol truly thought they were on to something, they failed to state their case in a way that could convince NASA, or even convince themselves beyond the standard nerves everyone has before a launch. They had no data. They had a “feeling.”

Now of course, they were right about this feeling! This adds to the Challenger tragedy. They had a hunch, but could not translate that hunch into a form that could lead to meaningful action. NASA would have listened to them, but Thiokol could not speak in a way that NASA, or even they themselves, could understand. Canceling would have meant that

  • New launch protocols would have been introduced without any real data to back them up
  • The number of shuttle launches planned would have to be drastically reduced
  • NASA management would have had to give their bosses a reason for the cancellation, and would have been asked to base it on a “feeling”

To cancel the launch would have essentially upended the entirety of NASA’s culture. This is exactly what should have happened. But Vaughan wants us to see that, all things considered, it is not reasonable to have expected either Thiokol (which did reverse its recommendation to cancel) or NASA to do this. We do not have a Hollywood script with heroes and villains. The Challenger astronauts died not at the hands of craven management, but as part of something much larger. Vaughan’s analysis shows us that the questions we should ask in the aftermath of the tragedy are much more complicated than we might have thought. Once an organization establishes a culture, some decisions almost seem to get made automatically.

Vaughan wrote her book in 1996, and if anyone at NASA read it, it had no impact. We may remember the destruction of the Columbia upon re-entry in 2003. Its heat shielding was very likely damaged during liftoff, when insulation foam from the external tank dislodged and struck the left wing. The tank needed insulation to keep the fuel cold enough, but foam routinely dislodged during launch, as we might expect given so much thrust and vibration. Unfortunately, this time a large piece struck the shuttle in a vulnerable spot under the left wing. Columbia lost control and disintegrated upon re-entry. The official investigation into the crash stated that,

Cultural traits and organizational practices were allowed to develop . . . and reliance on past successes [served] as a substitute for sound engineering practices. Organizational barriers prevented communication of critical safety information.

Some might take solace in the fact that NASA’s culture of risk appears to have changed. In their partnership with SpaceX, for example, their criteria hold SpaceX to a standard of no more than 1 fatality for every 230 launches. Some applaud this reform. But others find it impossible: how can NASA hold SpaceX to a standard that no space program at any time or place has ever been accountable to? How much risk must we accept to make progress?

Dave

A Can of Corn

I was never a great baseball player, but I had my moments. Somehow, though I am not tall and was never very fast or in possession of a strong arm, I finagled my way into playing the outfield. Compared to the infield, one saw less action, but the action was superior and more intense. The stakes were higher. Muff a grounder and no one really notices, but not so a fly ball. Of course chasing down a long moon shot had its pleasures, but my favorite moments were always the high, lazy fly balls, the “cans of corn” of baseball parlance. You knew you would catch these, and so you could just stand under them serenely, watching the ball spin against the blue sky. Time stood still; one needn’t worry about Republicans or Democrats, the past or the future. It was enormously satisfying.

This will sound weird, but Odell Shepard’s The Lore of the Unicorn, an examination of various arguments for and against the existence of the fabled beast, struck me in just this way. There were so many ways this book could have gone wrong. We would be disinclined to believe a medieval writer. In the 17th century the book would have been too technical. In the 18th century it would have had way too many commas and semi-colons. A 19th century treatment would have been too emotional and romantic. Bill Bryson on this topic would be too jocular and snarky. But Shepard brings a light writing style combined with proper reverence for the sources pro and con.

Why not unplug for a bit and consider the unicorn?

When I began the book I thought the foundation for belief in the unicorn’s existence in the pre-modern west rested on a few old Greek guys, and that is true. But, it is only partially true, and true in more complex ways than I expected:

  • Ctesias wrote about the unicorn around 400 B.C., but he lived much of his life in Persia in service to the Persian kings. Xenophon writes that Ctesias healed the wound of Artaxerxes II after the Battle of Cunaxa. Seeing as Ctesias kicks this question off, we’ll quote him in full.

“There are in India certain wild asses which are as large as horses, and larger.  Their bodies are white, their heads dark red, and their eyes blue. They have a horn on their forehead which is about 1 ½ feet long.  The dust ground from this horn is made into a potion that protects against poisons. The base of the horn is pure white, but the top is the purest crimson, and the remainder is black.  Those who drink from this horn, they say, are not subject to epilepsy.

The animal is exceedingly swift–powerful and fierce, so that nothing can overtake it.”

  • Aristotle thought Ctesias untrustworthy overall, but he agreed with him that the unicorn did exist.

“We have never seen an animal with a solid hoof (i.e., not cloven) and two horns, and there are only a few with a solid hoof and one horn, as the Indian ass [unicorn] and the oryx.  Of all animals with a solid hoof, the Indian ass alone has a talus [the large bone in the ankle that articulates with the tibia of the leg and the calcaneum and navicular bone of the foot].

Animalium Book 3, Chapter 41

Perhaps second only to Aristotle in authority on such questions would have been

  • Pliny the Elder, ca. 60 A.D.

The Orsean Indians hunt an exceedingly wild beast called the monoceros, which has a stag’s head, elephant’s feet, and a boar’s tail, the rest of its body being like that of a horse.  It makes a deep lowing noise, and one black horn two cubits long projects from the middle of its forehead. This animal, they say, cannot be taken alive.

Natural History, Book 8, Chapter 33

Some of what we read here may perplex us, such as the multi-colored horn (did he see painted or decorated horns?) and the fact that the unicorn is not white. If we take also the testimony of Apollonius of Tyana and Aelian, we get some basic agreement, but more disagreement than I expected. Pliny introduces the question of whether or not we should be thinking of a rhinoceros. All in all, the ancient sources appear to me to operate basically independently of one another.

If you have a King James Bible, you will note that several passages mention a unicorn (Num. 23:22, Deut. 33:17, Ps. 29:6, Is. 34:7, Job 39:9-10, etc.). Some of these passages could possibly refer to a rhinoceros, but others, not so much–in Psalm 29:6 the unicorn is said to “skip like a calf,” and rhinos don’t skip. Also, different passages speak of being “exalted” like the horn of a unicorn, and a rhinoceros horn doesn’t quite fit this.

For some, the fact that the Bible mentions the unicorn is proof that it never existed, since for them the Bible contains so much fanciful gobbledygook. Others assert that the unicorn can’t exist because they haven’t seen it and don’t know anyone who has. These silly attitudes merit little attention. But I have also seen Christians who say, “The Bible mentions unicorns, so if you believe in the authority of the Bible, you must believe in unicorns.”

The question has more complexity, however. It mainly involves the translation of two key words: “re’em” in Hebrew and “monoceros” in Greek. St. Justin Martyr, St. Irenaeus, and St. Basil the Great all seem to profess belief in the unicorn based on how they translate the Greek, along with other factors. But St. Jerome, St. Ambrose, and St. Gregory the Great all believed that the passages quoted above speak of a rhinoceros, and may have denied the existence of unicorns altogether. We cannot say that such men denied the authority of Scripture.

Still, for medievals the case for the unicorn remained stronger than the case against. Even Albert the Great, teacher to St. Thomas and one of the best scientific minds of that era, believed in its existence (though he doubted the horn’s medicinal effects). Interestingly, belief in the unicorn may have increased in the Renaissance, as the ancient Roman arts of poisoning found a new home among the classically obsessed Italians. Various dukes traveled with unicorn horns (so called) in hopes of having them ward off poisons.

But in time belief in the unicorn ebbed away; why this happened deserves attention as well, and more on it later. All throughout the history of unicorn belief, skeptics have weighed in with alternate theories.

Theory 1: The Unicorn as Rhinoceros

We have touched on this briefly already in connection with the Bible, but a few other points of interest merit mention:

  • With its very tough hide, the rhino cannot be hunted in the standard way other beasts can
  • Like the ancient descriptions say, the rhino is very strong
  • Some even today believe in the medicinal powers of its horn
  • The “elephant’s feet” from Pliny’s description match those of a rhino.

For me, however, this stretches things a bit too far. More persuasive, in my view, is

Theory 2: The Unicorn as the African Oryx

  • Like the descriptions of the unicorn, it is tall, fast, and powerful
  • It somewhat matches the colors mentioned by some ancient authors
  • Its location in Africa matches that of many ancient sightings
  • The oryx was known as difficult to hunt and rare even in the time of Oppian (ca. 160 A.D.).
  • As for the two horns, there are two possibilities: 1) African natives testify that when two oryxes fight, a horn can sometimes break off, or 2) some naturalists suppose the possibility of a genetic anomaly occurring and an oryx being born with just one horn.

And one rare form of the species, the Arabian oryx, is actually white.

Some also think that Aristotle believed the unicorn was, in fact, the oryx.

I much prefer this theory to the unicorn as rhinoceros. I am very nearly convinced, but still . . . two horns is not one horn, and the ancients and medievals could count.

Theory 3: The Unicorn as a Transmuted Eastern/Christian Myth

St. Isidore of Seville (7th century) did believe in the unicorn, and he had a strong influence on the formation of medieval bestiaries. He writes,

“Rhinoceron” in Greek means “Horn in the nose,” and “Monoceros” is a Unicorn, and it is a right cruel beast.  And he has that name for he has a horn in the middle of his forehead of 4 feet long. And that horn is so sharp and strong that he throweth down all, and all he rests upon.  And this beast fights oft with the elephant and wounds him and sticks him in the belly, and throws him down to the ground. And a unicorn is so strong that he cannot be taken with the might of hunters.  But men that write of such things say that if you set out a maid [i.e., a virgin] he shall come. And she opens her lap [or possibly, her breast], and he lays his head thereon, and leaves all of his fire, and sleeps thereon.

In the ancient Persian capital of Persepolis there is a curious image of what appears to be a unicorn and a lion.

Lions represent masculine and kingly power. Some see in this image, then, either 1) that the power of the king was mighty enough even to hunt and kill a unicorn, or 2) that the masculine sun triumphs over the feminine moon, day over night (it seems even in Persia unicorns may have been thought of as white in color).

I agree with Shepard that we should view this image mostly in mythological rather than historical or political terms. But Shepard makes nothing of the violent depiction here, or of its contrast with the medieval version of the similar story.

We have already noted the power of the unicorn, and that no one can capture or contain him. The medieval versions of the story go deeper into the archetypal patterns. First, the singular horn. Shepard cites numerous stories of how single-horned beasts held a position of great honor. For example,

  • Plutarch relates that Pericles’ farmhands presented him with a one-horned bull (the horns had merged into one) as a mark of thanks and honor.
  • One-horned cattle are seen as bending towards the king in Ethiopian carvings.
  • In the Jewish Talmud, Adam offers to God a one-horned bull after his exile from Paradise, the most precious thing he owned.

The singularity of the horn unites all these instances. In the standard bestiary of the Middle Ages, the author writes as a postscript,

The unicorn signifies Christ, who was made incarnate in Mary’s womb, was captured by the Jews, and was put to death. The unicorn’s fierce wildness shows the inability of hell to hold Christ. The single horn represents the unity of God and Christ. The small size of the unicorn [relatively, one must assume–an elephant is certainly bigger], is a symbol of Christ’s humility in becoming human.

Even as far back as The Epic of Gilgamesh, the feminine has always humanized or tamed the masculine. This pattern finds its ultimate expression in the Incarnation itself, where the Virgin Mary contains the uncontainable God, and, dare we say, “humanizes” God? The medievals went on to say that through the virtues of the spotless Virgin Mary, humanity “wooed” God–the so-called “Holy Hunt.”

So, it should not surprise us that in the famous Unicorn Tapestry, the unicorn is captured within a circular fence, reminding us of a wedding ring–God binding Himself to humanity.

Lions make their way into this tapestry as well, though in a different way than in ancient Persia.

So, some argue that maybe the medievals never really thought the unicorn was a real beast, but simply a helpful story to convey spiritual truth. Or, if they did believe in a real unicorn, they did so only as a mistake, a pleasing and helpful tale incarnated too far in their fertile imaginations.

My one beef with Shepard’s marvelous book is that he refuses to pick a side in this debate, but I will do so.

I can accept that the theory of the unicorn as rhinoceros has merit, but I am not convinced. The rhino does have one horn, which many regarded as salutary. But the horn is not “exalted,” and the rhino simply lacks the grace, dignity, and metaphorical heft that history has placed upon the unicorn.

The oryx theory very nearly convinces me. The speed, elusiveness, and necessary “dignity” of the beast are present. Imagine a genetic ‘mistake’–an Arabian oryx born with one horn–and the problem is nearly solved stem to stern. But oryxes have two horns, and as we have seen, the singular horn stands as a crucial fact in the case. True, in a pure profile one would see only one horn of the oryx. But again, oryxes do move, and people can count to two.

In fact, Shepard mentions many sightings of the unicorn throughout the centuries. Yes, hypothetically all could be mistaken, exaggerating, or lying. Maybe some saw the Arabian oryx. And yes, it seems strange that in the era of iPhones no one would have a picture of it if it existed. Possibly it did exist and went extinct some centuries ago.

What I can’t abide are those who say that because the medievals allegorized at length with the unicorn, they must have been easily fooled, unable to tell the difference between fact and fiction. This also minimizes the importance of the patterns laid down throughout all the ages–as if isolated “facts” that have no meaning had greater importance than all of our stories. Undeniably certain myths existed around the unicorn, but myth is not a synonym for “falsehood.”

Which brings us to why belief in the unicorn has declined so sharply over the last few centuries, and especially in our day. Belief in dragons declined rather markedly after the Middle Ages, if they were ever literally believed in at all. But clearly many ancient and medieval people believed literally in unicorns, and unlike belief in other so-called fanciful beasts, it persisted after the medieval era, into the Renaissance and beyond. Even in the 17th century some still believed in the unicorn, as did some British explorers into the 19th century–a Major Latter wrote in 1820 that he had definitely seen a unicorn in Tibet. None of this happened with dragons.

I think the reason for the decline, regardless of whether the unicorn ever existed or not, is that we have lost the stories; we have lost the reasons for anything being anything in the first place. True, if the unicorn had not existed, medieval people might have made him up–he fits that well into their symbolic world, just as he did for other cultures. I suppose this could be a slight critique against them, if one really felt the need for it. But we, on the other hand, have no need for anything to exist for any particular reason, including ourselves. Many of us are, as Walker Percy brilliantly deduced some 40 years ago, Lost in the Cosmos.

I think a discussion on cable news over whether or not the unicorn existed would reveal a lot about us–about how we regard tradition, science, and the sexes. I say we should get at our major worldview questions not through Twitter, CNN, Fox, or the National Review, but through pleasant cans of corn like the one Odell Shepard has given us. These moments that stop time are likely the most important of all.

Dave

Dueling for your Health

In Mere Christianity C.S. Lewis makes a provocative point about the modern mind.  In discussing love and marriage, he observes that we have a hard time talking about degrees of good and bad.  We can only discuss absolutes and never relative goods.  This leads to a narrowing of societal discourse.  So he writes about duels that,

They ask you what you think of dueling.  If you reply that it is far better to forgive a man than to fight a duel with him, but that even a duel might be better than a lifelong enmity which leads to continuous secret efforts to ‘do the man down,’ they complain that you have not given them a straight answer.

V.G. Kiernan’s book The Duel in European History has certain strengths but lacks some of the necessary subtlety that Lewis urges.  It has a lot of juicy gems and some incisive points.  Kiernan searches for a unified field theory of dueling, which I admire.  But he seems to think that dueling’s best explanation lies in a quasi-Marxist theory of maintaining class dominance, which fails in my view for a few reasons.  Of course dueling had something to do with class, but not always. Of course dueling is wrong, but . . . maybe not always?

Some personal examples . . .

I had a good friend growing up and we did various things together.  Around our freshman year we decided to add some spice to our games of ping-pong, poker, H-O-R-S-E, and video games.  We invented consequences for the loser of these contests.  These consequences either brought great discomfort (put hot pepper on your tongue for five minutes, run barefoot in the snow, eat a spoonful of mustard, etc.), or great embarrassment (fall down dramatically in a restaurant, sing loudly in the middle of the street, etc.).  Looking back, many of these things were essentially harmless and created some good memories.  I should say too that losing brought no shame, but to back out of the “consequence” would have been unthinkable and damaging to the friendship.  You made a pledge; now see it through.

But . . . I think a lot of our motivation stemmed from boredom.  No longer could we play “just for fun.”  The game itself no longer satisfied.  As you might imagine, with this motivation the consequences themselves inevitably intensified over time.  It also seemed that we both came to find great enjoyment in the suffering of the other person–what the Germans call “schadenfreude.”  So perhaps on balance this was “primitive” or “destructive.”

Another example . . .

In college I remember walking into my dorm room one day and seeing my roommate and another guy on the hall wrestling.  It was not purely play, but neither were they “fighting” in any real sense of the word.  They engaged in something in between those two.  Some sort of personal disagreement lay at the heart of this–I have no idea what.

I stayed to watch.  Kiernan might want to ascribe the fact that I watched to some sort of love of destructive spectacle.  Obviously I preferred watching the “match” to opening my biology textbook, so Kiernan has a point.  But I also stayed to act as a kind of “second” for my roommate should the level of fighting go too far.  Soon enough a few others came and watched, much for the same reasons, I’m sure.

After several minutes one of them agreed to say “uncle” and they stopped.  Commendations for both participants flowed from the audience.  It seemed entirely natural that now we should all go to dinner, and the first 15 minutes of conversation had most of us laughing about this or that moment in their match.  The two participants seemed entirely reconciled and never again had another such incident.  One of them had “lost,” but that carried no consequence.

I would love to know what Kiernan would think about this “duel.”  Can duels ever be good for you or society, and if so, why?  To answer this question we need to think about why duels happen in the first place.

Before we think about anything possibly positive about duels, we should note that Kiernan deals well with their obvious problems:

  • Most duels are inextricably bound up with the sin of pride.  Perhaps this, even more so than the violence, explains their consistent condemnation by the Church.
  • Many duels brought death or grave physical harm that had no relation to the nature of the “offense” that caused the duel in the first place.  For example, towards the end of the era of dueling, poets and musicians fought over particular points of artistic criticism.
  • At certain points in history duels happened not to settle disputes, but to prove manhood or courage.  Duels might then morph almost into a way of life–a way of life that can only end in death.
  • And yes, Kiernan has a point about the “social-control” aspect of dueling and its link to aristocracies.  Democratic peoples resort to dueling at a vastly lower rate than aristocratic nations, and this tells us something.

None of this surprises the reader.  But Kiernan’s book has more interesting parts.

From his tour through the history of the duel, we may guess at the times when duels tended to emerge more than at others.

First, it appears that the number of duels rose in times of significant cultural and political shift.  Two main examples hint at this possibility.  Dueling increased in the 17th century as the power of monarchs increased.  Increased power for the king meant perhaps that aristocrats felt the need to “strut their stuff” and duel more often.  They may also have had the political motive of settling disputes outside of royal courts–an act of survival.

In time the power of the state grew and aristocracies declined.  Duels faded gradually through the 18th century.  But the coming of the Industrial Revolution revived them.  Here we have part two of the aristocracy’s attempt at survival, as the Industrial Revolution made mincemeat of the aristocratic class. This time, however, the dueling had no obvious political purpose.  Also–as to how they thought dueling would ensure their survival . . . ?  Maybe they thought they needed to leave the stage in dramatic and pointless fashion?  I don’t buy the “irrational” motif Kiernan may favor, but he can put this one in his corner.*

In his eyewitness account of the English Civil War, Edward Hyde, the Earl of Clarendon, spends his first chapter criticizing the government of Charles I.  One might suppose that certain policies impoverished England and this led to rebellion.  In fact, as Hyde and other historians point out, England enjoyed relative prosperity during the long years Charles ruled without Parliament.  The problem lay not in the suffering of the country, but in part in its lack of suffering.  At length Hyde argues that Charles’ chief error lay in not giving England’s political class anything to do for several years.  They had nothing to do in part because times were good in most respects.  In other words, boredom and restlessness helped lead to the Civil War.

Kiernan mentions this as well at certain points in his narrative, and it rings true with my own experience mentioned above.  At some point, things got stale and we wanted to liven them up.  But I keep coming back to the question of the possible validity of some kinds of duels.

I had a long talk with my wife about this and she brought up several interesting questions about my experiences.  “Couldn’t we have had mercy on one another and forgiven the consequence?”  I answered that would not have been possible.

“But why not?”

True, many duelists had “mercy” on their opponents by firing in the air or some other such method.  But this was possible because they had already “won” by showing up and standing for the contest.  Victory was a side benefit.  They had already proven themselves.

My friend and I, however, could only prove ourselves by going through with the consequence.  That was the whole point.  When reminiscing about what happened we never said, “Remember that time you made that shot and won at H-O-R-S-E?”  Instead we reflected, “Remember that time when your feet bled from running in the snow, or when I had to sing the Police’s “Roxanne” in the middle of my street?”  Going through with the consequence, not winning the contest, is what gained us fame.

To “forgive” a consequence in our case would have made the whole process pointless.**

So on the one hand we “proved ourselves” as “men” without doing any real harm to ourselves or others.  We bonded over this.

But on the other hand, it had all the negatives I listed above.

I still wonder about the possible ancillary benefits of duels.

Amidst the many reasons for duels–obscene pride, class control, the destructive impulse, etc.–what stands out to me most is boredom.  In some way, shape, or form, deep down we know that we need to suffer to be who we need to be.  Democracies don’t encourage suffering in any way.  We are told to gratify our desires.  Most modern American manifestations of Protestantism have no concept of voluntary suffering and many churches do all they can to accommodate, not challenge, the modern man.

I think if we can recover the true purpose and place of suffering, we may get closer than Kiernan to understanding duels.  And it is here that I must demur, for I have been a somewhat silly teenager, but I am not a saint.

-Dave

 

*I generally disagree with Marxist interpretations of history, but they sometimes have merit.  Kiernan’s class emphasis makes historical sense, but not logical sense–at least to me.  Aristocrats have power because of their birth.  They do not need to “earn” it in the modern sense of the word.  Clearly dueling at times served the purpose of validating their status as aristocrats.  But why feel this need?  Again, they never had to earn their status in the first place.  Perhaps the duel represented for some a kind of atonement-oriented suffering for their societal position?  Perhaps it allowed them to feel that they had “earned” their role?

I wonder why democracies eschew the duel.  After all, in theory all of their citizens are born equal and must distinguish themselves in some way from their fellow man.

**In fact I believe this happened once and only once in our years of performing “consequences,” and I was the lucky recipient.  If memory serves, we were playing some kind of basketball video game and I had lost multiple times, which meant I had to drink a concoction consisting (I think) of raw egg, Tabasco sauce, and mustard.

But my friend did not simply “forgive” this consequence.  Rather, he had to back out of plans we had made for the following day, and in compensation he released me from drinking the miserable concoction.

Needless to say, however grave and disappointed I made myself sound when he told me this, I accepted his offer quite readily!