Toward a phenomenology of television

I admit that I’ve lost anything but a passing interest in contemporary film and television. I’m not entirely in the dark on such matters; I browse YouTube occasionally, and I have a store nearby where I can find used DVDs for as little as a buck. A year ago, I went on a jag there, buying and binge-watching police procedurals from the first decade of the present century. But in general, I don’t watch television, and I stay away from special-effects spectaculars. (Although the last film I actually went to a theater to see was Godzilla 2014; but then I have a soft spot for Big Greeny from my childhood, and just wanted to make sure they treated him with respect. I doubt I’ll go to any of the proposed sequels, though.) And 3-D doesn’t interest me. Harpo Marx was once asked about Lenny Bruce, who was achieving notoriety at the time; he replied, “I have nothing against the comedy of today; it is just not my comedy.”

However, having had to study the phenomenon of television in grad school, and having invested considerable time in thinking and talking about, watching, and even, in my youth, making film, I do have some general remarks that may be useful here.

First, never lose sight of the economic background here. Both commercial cinema and television are primarily business enterprises. The purpose of film production is to provide entertainment enough to attract audiences willing to spend money on it. This has caused considerable friction between those who provide capital for production and those who come to filmmaking with a particular vision that they are hoping to realize.

The purpose of a television show is to produce enough of an audience to sell to advertisers. (This is obviously less true of the secondary markets, DVDs and on-demand viewing; technology has changed that dynamic, although it is still in effect on most of cable.) This is actually a considerably lower bar than selling tickets at the theater, since the audience needs only enough incentive to watch at a particular time for possible advertisers. A show needs only to be less uninteresting than competing programs in the same time slot in order to achieve this.

With these economic concerns noted, we can get into the phenomenology of the two media. The most important thing to grasp here – both easily recognized and yet easily forgotten – is that what distinguishes these media from all others, and differentiates them from each other, is their relationships to time, and how the makers of these media handle those relationships. Of course every medium establishes a relationship to time, and this relationship largely defines the medium. But each medium does this in a unique way, as opposed to all other media. * Yet one of the problems we have in distinguishing film and television as each a distinct medium is that fictional television seems to have a relationship to time similar to that of theatrical drama, or at least of film. This is not the case.
The true structural principle of television did not become recognizable until the late ’70s, when television began broadcasting 24 hours a day. By the late ’90s, when cable television was multiplying into literally hundreds of channels, it should have been obvious to all; but part of the success of television is that it depends on, and manipulates, our attention to the particular. Most people do not think of themselves as ‘watching television.’ They see themselves watching Seinfeld or Mad Men or The Tonight Show or ‘some documentary about the North Pole.’ In the immediate existential sense, they are quite right; it is the individual program to which they attend. The trouble is, when the Seinfeld rerun ends, many of them do not get up to do something more interesting in their lives; they sit there and watch Mad Men. Or at least let it play on while they discuss what it was like to live in the ‘60s, and then The Tonight Show… and if they can’t get to sleep, it’s that documentary about the North Pole on the Nature Channel, or an old movie on AMC (does it really matter which?), or an old Bewitched rerun….

Now it sounds like I’m painting a bleak portrait of the average television viewer. But such a viewer is what television is all about. And we should note that this says nothing against such viewers. They are presented with an existential dilemma: What to do with free time in a culture with little social cohesion and diminishing institutions that once provided that cohesion?

So, whereas film is about how to manage visuals and audio and story and acting in the compacted period of a couple of hours, television is about how to provide endless hours of possible viewing. It is not about this or that particular show – trends at any given moment are more telling. That CSI and NCIS and The Closer and The Mentalist and Criminal Minds, etc., etc., all appear in the same decade tells us more about what people found interesting on television that decade than any one of these shows, and certainly more than any one episode.

Which brings me to my real point. Although there are still some decent films being made on the margins and in other countries, the history of the cinema I knew and loved is at an end. Despite the fact that the basic premise of both films is that a group of talented warriors gathers to defend the good against overwhelming force, there is no way to get from The Seven Samurai to The Avengers. That there is a core narrative conflict they share only means that there are core narratives shared across cultures, and we’ve known that for a long time.

But while the aesthetics of The Avengers is substantially different from that of The Seven Samurai, there is certainly an aesthetic at work in it. I am not willing to grant that television has any aesthetic at all. We can certainly discuss how aesthetic values are deployed in individual shows and individual episodes. But these are almost always borrowed from other media, primarily film. Television, simply as television, has no aesthetic value. And that cannot be said of film.

One way to note this is by admitting that ‘talking heads’ television is what television does best. That, and talk-overs (as in sports), or banter, playful or violent, as on reality TV shows. Fictional shows can deploy aesthetic values, true; but only to get the viewer to the talk show, the next commercial, the next episode. Anything that accomplishes that will do.

Of course, what we end up discussing is the individual show, or the individual episode. And because television lacks aesthetic value of its own, it can fill endless hours deploying a multitude of aesthetic values from other media – poetry recitals, staged plays, documentaries, thrillers, old films, old television, various sports, news and commentary – perhaps ad infinitum. That’s what makes commenting on individual shows so interesting – and yet undercuts any conclusion reached in such discussion. All the shows we find interesting today will be forgotten in the wave of the next trend tomorrow. But don’t worry – there will always be reruns and DVDs. As long as there is a market for them, that is.
My general point here is such that it cannot be disconfirmed by any show, or group of shows, or discussion of these; any such appeal would only confirm part of my point: television’s dependence on our attention to particulars.

One way to think of the general problem is to imagine rowing a boat on a river; upstream someone has tossed in a flower – perhaps it is even a paper flower, and we’ll allow it to be quite lovely. So it drifts by us, and we remark on its loveliness, while not addressing the many rotten pine cones that surround it. Now do either the flower or the pine cones get us an aesthetic of the river? No. So ‘bad’ television tells us no more about the aesthetics of television than does ‘good’ television.

And the river trope has another use for us here. We know the flower was tossed into the river only recently; but the pine cones have been floating about us for some time. Yet to us, now rowing past these, the pine cones are contemporary with the flower.

We think of an old TV show, say Seinfeld, as if it were a phenomenon of the past; it isn’t. Reruns are still playing in major markets, making it a viable competitor to Mad Men or even Game of Thrones. It is still contemporary television. (Television does not develop ahistorically, but the history of its development has been somewhat different from that of media where individual works are the primary product.) So an ‘aesthetics of television’ would need to account for that phenomenon as well – not just the aesthetics of Seinfeld or of Game of Thrones, but why it is that these aesthetics are received by their differing audiences at the same moment in history – allowing, even, that many will watch both. And I suggest it would also have to address the aesthetics deployed in ‘non-fiction’ television (scare-quotes because I’m not sure there is any such thing). I suggest this cannot be done. What television as television presents us is grist for the mills of sociology, semiotics, cultural history; but an aesthetics?

That doesn’t mean that we shouldn’t have criticism of individual episodes or discussion of favorite programs. In fact, most of us who have watched television, or are still watching it, are doomed to this. But we should be aware that, reaching for the flower, we may end up with a rotten pine cone – or, what is most likely, simply a handful of water, slipping through our fingers, returning to a river we merely float along.

—–

* On the time issue: The art of cinema – that is, the cinema I know, which I admit is no longer of interest, except on the margins – is defined by the control of time. This is also true of music and drama, but in a different way, since the filmmaker has a tool neither of the other two has: editing. Films were made on the editing board.

But this technique could be accomplished – at least to some extent – in the camera itself. Thus even amateur filmmakers, making home movies, deployed the aesthetic of the medium – a particular control of time that photography could not emulate. Thus, picking up a movie camera and operating it immediately engages an aesthetic, however poorly realized and however unrecognized, even by the one using the camera.

Stories are inevitable in every medium; exactly because of this, each medium must define itself in terms of its approach to and presentation of stories, not the stories themselves, since stories will occur inevitably – and when they do not, the audience will invent and impose one.

To be less elliptical then, film’s dominant concern was – and still is, although in a way I no longer recognize – vision, in both the literal and figurative senses of that term, as we experience it through time.

While such considerations are understood by producers of television, that’s not what television is about. Television is about filling time with whatever, and getting the viewer to the next block of time (as defined by producers and advertisers). If a talking head can do this, there’s your television.

Again: We viewers are not the consumers of television – that would be the advertisers. We are the commodity that television sells to them.

That changes everything.


A reply to

“Medium, Message, and Effect” by David Ottlinger:  https://theelectricagora.com/2017/05/30/medium-message-and-effect/

Politics and song

Now, the whole business of Irish nationalism can get very serious if you’re not careful.

– Liam Clancy [1]

My father, Joseph Connelly, abandoned his family when I was two years of age. I probably should have hated him and been done with it; but that’s not how children respond to their abandonment. There’s a lot of self-questioning – ‘was I the cause of his leaving?’ – and attempts to prove worthy of a love that will never be acknowledged.

So, up to his death from a heart attack in 1989, I went through periods when I tried to adopt Irish culture as somehow my own, as my inheritance. In the long run, these efforts failed, and they left me realizing that I had no cultural inheritance beyond the common culture of the United States. When people ask me where my family came from, I answer without hesitation, “Brooklyn” [2].

Nonetheless, the efforts to identify with an Irish heritage left me with considerable sympathy for a people that had long suffered the most miserable oppression as a colony of the British Empire.  (The British long maintained that Ireland was a willingly subservient kingdom, aligned to Britain in the laughable pretense of a “United Kingdom,” but this was believed only by British colonialists stealing farmland from the Irish and putting them to work as, in effect, serfs.)  The oppression really began with Cromwell’s bloody conquest of the Catholic Irish, whom he called “barbarous wretches”; the massacres were bad enough – and the Irish were no saints in these engagements – but the immediate aftermath really established the Anglo-Irish relationship that followed:  the policy of suppression “included the wholesale burning of crops, forced population movement, and killing of civilians” [3].  It cut the population by nearly half.

Difficulties, including the occasional Irish rebellion, continued throughout the history of this “union” of Ireland and England, but reached a turning point with the notorious Potato Famine of 1845. The potato had become a staple, because it could be grown in private gardens. When a serious blight struck, the Irish faced starvation. Cash crops in Ireland were routinely sent to England for wholesale, and if they returned to Ireland for retail sale, they were priced well beyond the ability of the Irish peasantry to pay. These practices went unaddressed by the British government for some five years [4]. By the end of the famine, roughly 1852, the Irish population was estimated to have lost more than 2 million, half to starvation, half to emigration. The British – many of whom agreed with Cromwell’s assessment of the Irish character as barbarous and wretched (and shameless Catholics to boot) – thought that with the famine ended, markets would naturally stabilize, and relations with the Irish could be restored to the way they were under the Acts of Union of 1801. They were wrong. Survivors of the Famine and their heirs remembered what they had gone through and who had put them through it. Irish political activists were no longer interested in “protesting” impoverished economic conditions that the British colonialists could exploit. They knew that any such conditions would inevitably recur as long as the colonialists controlled the economy. So began the long hard struggle that would lead to Irish independence.

Irish rebel songs had been recorded since at least the 17th century (“Seán Ó Duibhir a’Ghleanna,” on the Battle of Aughrim during the Williamite War, 1691). Indeed, there are so many of them that they form a genre of their own. (Going by Wikipedia, they seem to comprise about a third of all catalogued folk songs of Ireland [5].) However, they truly embedded themselves in Irish culture in the decades leading up to the War of Independence (1919-21). They include exhortations to fight for “dear old Ireland,” reports of battles, like “Foggy Dew” (Easter Rebellion, 1916), and elegies for slain soldiers, as well as opinions on the differing perspectives on the politics of the era, especially those that erupted into violence during the Civil War of 1922.

One might object that I haven’t remarked on “the Troubles” in Northern Ireland, so I will. There have been political songs on both sides of that conflict, as well as, in recent decades, admonitions to peace. [6] They are all Irish. Because as much as some citizens of Northern Ireland like to think of themselves as somehow British, no one else does – not even the British, who in signing the accords that brought peace to Ulster (1998), effectively agreed to the right of all the Irish to self-determination.

One can no more remove politics from Irish song than one could remove the Guinness Brewery from Dublin [7]. But the matter goes much deeper. In fact, throughout the years of occupation, pretty much whatever the Irish sang about was political in nature. They sang of the success of their gardens – that violated British economics. They sang of their children – they weren’t supposed to have so many, those damned Catholics! They sang out their love of their God – in the 17th century, this got them killed; in the 18th, matters improved: it only sent them to prison. They sang of the beauty of their countryside – and were kicked off it left and right. They sang of their trades – which they couldn’t independently practice, without a British-approved overseer. All they had to do was warble a note in Gaelic, and they were suspected of some dark satanic plot against the crown. In other words, the very existence of Irish song, the very singing of it, was a politically rebellious act against British domination.

It must be kept in mind here that for 400 years, the British were engaged in what might be called genocide-by-attrition of the Irish people.  This is difficult to discuss in America, where the media has such a fascination for the health and marital antics of the ‘royal family’.  I suppose the long-range plan was to have the Irish simply die off, but since most of them were Catholics, that wasn’t going to happen.  So the British settled for total suppression of the Irish way of life and domination of its economy. They reduced the Irish to something less than serfs, since serfs were recognized as being a part of the land they worked.  The Irish were not recognized as belonging to the land, they were seen as somehow an annoying infection, needing to be cauterized.  The British did worse than destroy Irish culture, they stripped the Irish of the resources needed to produce culture.

But the body is a resource, and it can only be stripped from the possessor through death.  As Hitler realized, the only way you can completely erase a culture is through complete eradication of the targeted people.  But the British, although cruel and destructive, had a peculiar image of themselves as fundamentally “decent,” so all their crimes needed to be rationally explicable and moderated with some sense of “mercy” (and with some sense of moral superiority).   Goering once declared in a speech, “Yes, we (Nazis) are barbarians!”  A British politician would never admit such a thing.  So the Irish were allowed to starve to death, but there were no death camps to be found in, say, County Clare.

That may have been a mistake.  Song is of the body.  One feels it singing. It reverberates deeply in the lungs and shakes the innards.  It rises up with every breath (Latin: spiritus).  Sing a song and one is that song.  Sing a song for others, and one produces culture.  The British could take everything from the Irish, but they could not take away their breath; they could not stop them singing.

There are actually two ways to listen to a song. One is to hear the voice simply as a part of the music itself. One doesn’t actually pay attention to the words; perhaps one doesn’t understand the words. This is how we listen to songs in languages we do not speak. But the practice extends beyond that. Where I work, my older colleagues and clients generally tend to be political and social conservatives. Yet the public-address radio is set to a “classic rock” station. So I find myself frequently bemused, watching these conservatives hum along to songs promoting recreational drug use (“White Rabbit”), sexual promiscuity (every other song by the Rolling Stones), political revolution or anti-war resistance (Steppenwolf’s “Monster”), non-Christian religious belief (a George Harrison song extolling Hare Krishna), or even a song of anti-American hostility (“American Woman”). They listen to something like the Chambers Brothers’ burst of outrage, “Time Has Come Today,” and don’t seem to have any idea that they are the targets of that outrage. The words are meaningless to them, because they’re not listening to the words. The voice they hear and hum along with, that’s just part of the music.

I have a suspicion that this is how most of us listen to songs in our own language, especially songs we have been hearing since very young.  My colleagues and clients don’t want to be reminded of the ’60s with all that era’s political turbulence.  They want to be reminded of their own youth.

What the British did in their aggressive disenfranchisement of the Irish on their own soil was to force the Irish to listen to their own songs, to pay attention to the words as well as to the melodies.  Because we listen to the words of a song when they are touching us directly in our immediate circumstances.  So even ancient songs can be made meaningful again if the events they refer to are replicated in the events of the current day: they are recognized as contemporary as a newspaper or a political broadside.

The British thus made the rebel song the touchstone, the embodiment of Irish culture. One can see how this plays out in the Irish ‘cheer’ (that’s its technical genre), “Óró Sé do Bheatha ‘Bhaile.” [8] This probably originated as a shanty, welcoming sailors home from voyage (its structure is quite similar to “Drunken Sailor,” with which it probably shares a common original). In the Jacobite era, it transformed into a plea for Bonnie Prince Charlie to reclaim the throne and set conditions aright for the Irish. In the early 20th century, it was slightly revised by Patrick Pearse, who some say was murdered – or, as others would have it, executed – by the British for participation in the Easter ’16 Proclamation of the Irish Republic. [9] The song is in Gaelic, and less than a third of the Irish report using Gaelic. That may be less among today’s young Irish, and perhaps they don’t quite understand the full meaning of this song. But anyone in Ireland forty years or older does. A call for heroes to oust the “foreigners” (British) from Ireland, it was used as a marching song during the War of Independence. Even if one doesn’t understand the words, the historical context reveals the meaning, a context remembered and passed on through generations.

Let’s clarify that.  Obviously, however moving the music, and however well known the context, the words technically have no meaning, until they’re explained.  So imagine a young person, unable to speak Gaelic, yet hearing his parents and their friends singing this song and noting their attitudes of pride and determination.  Such a one would feel impelled to ask after the song’s meaning.  And here’s where attempts to suppress a language and its song swing back to bite the oppressor’s hand.  The young person now pays closer attention to the meaning of the song during and following the explanation than he or she would if it were sung in a language already understood.  In other words, the effort to suppress Gaelic song actually backfired:  Rebel songs in Gaelic achieved greater respect as audiences struggled to place them meaningfully within the context of the Irish revolution and take possession of them as their own.

In fact, the problem for any empire is that colonization, oppression, slavery, and mass slaughter do not make friends. Empires generate hatreds and enmities that last for generations. The good-natured Irish tend to adopt a “live and let live” pragmatic attitude even towards those they have battled in the past. But they also tend to carry a grudge.

The British are a very proud people. Writing this in America, I know it is expected of me to continue, “and they have every right to be.” But I don’t believe that. The history of England includes important eddies of remarkable writers and scientists. But these appear at the sides of a great river of blood, clogged with the remains of slaughtered natives of colonized lands. And for every one of those dead, whole families are left behind to this day, battling to redefine the wretched political and economic confusion the British Empire left behind in its collapse – a collapse that the British still won’t admit or deal with honestly.

I write this in America, the nation that long acted as inheritor of that collapsed empire, while flattering the British ego, by pretending we are all somehow the same people because of a common language.  By functioning in a more paternalistic, “caring” fashion, acknowledging the sovereignty of other countries, spreading around aid programs, enlisting allies (as long as they didn’t threaten our hegemony and wealth), Americans have deluded themselves into believing they are not imperialists and have made no enemies.  But they are and they have, and this will continue to haunt and befuddle their foreign affairs for many generations to come.

But America has another problem. There is no such thing as “the American people.” America is a collection of many peoples from around the world. Some of these have been historically oppressed, although later assimilated into the mainstream. Others have not been able or allowed to assimilate. And others may feel themselves oppressed where there is no empirical evidence that this is so, beyond their own disappointment, given the nature of the economy or the nature of constitutional government. Consequently, there are an awful lot of people here who have, or who have had, or who believe they have, reason to speak out. And when the means for doing so are blocked, or when speaking seems unlikely to convince others – they can always sing about it. [10] That’s what song is for. Politics is not an add-on to song; song is an inevitable expression in politics.

Mark English wrote here recently of the dangers of relying on mythical thinking in matters political. [11]  The desire for respect, for the ability to live without oppression or risk of theft or murder, for the opportunity to realize one’s full potential unhindered by stigma – are these mythical aspirations?  Quite probably.  The world is a cold home to a lonely, anxious species of over-developed hominids.  But I would not be the one to reassure those starving in a famine that, rationally, their deaths would (in the words of Scrooge) “decrease the surplus population.”   Some myths are worth living for, even fighting for; and worth singing about.

Notes

[1] https://www.youtube.com/watch?v=b3zOVi0C5X4

[2] My oldest sister never quite got over it, and became obsessed with developing a family tree.  She traced the Irish roots back to an 18th century poet, Thomas Dermody, aka Dead-Drunk Dermody, who, as his nickname would suggest, drank himself to death at an early age. https://en.wikipedia.org/wiki/Thomas_Dermody

The first stanza from his “On a Dead Negro” (https://www.poemhunter.com/poem/on-a-dead-negro/):

AT length the tyrant stays his iron rod,
At length the iron rod can hurt no more;
The slave soft slumbers ‘neath this verdant sod,
And all his years of misery are o’er.

[3] https://en.wikipedia.org/wiki/Cromwellian_conquest_of_Ireland

[4] The British response to the famine – heartless indifference – was a purely rational one.  Remember that this was the age of Malthus, who once wrote, however ironically:

“(W)e should facilitate, instead of foolishly and vainly endeavouring to impede, the operations of nature in producing this mortality [of the poor]; and if we dread the too frequent visitation of the horrid form of famine, we should sedulously encourage the other forms of destruction, which we compel nature to use” Essay on the Principle of Population, 1798.

Lest any think this was not in the minds of the British during the Famine, consider the following:

“Ireland is like a half-starved rat that crosses the path of an elephant. What must the elephant do? Squelch it – by heavens – squelch it.” – Thomas Carlyle, British essayist, 1840s

“The judgement of God sent the calamity to teach the Irish a lesson, that calamity must not be too much mitigated. …The real evil with which we have to contend is not the physical evil of the Famine, but the moral evil of the selfish, perverse and turbulent character of the people.” – Charles Trevelyan, head of administration for famine relief, 1840s

“[Existing policies] will not kill more than one million Irish in 1848 and that will scarcely be enough to do much good.” – Queen Victoria’s economist, Nassau Senior

“A Celt will soon be as rare on the banks of the Shannon as the red man on the banks of Manhattan.” – The Times, editorial, 1848

Source of additional quotes: http://www.politics.ie/forum/history/22143-anti-irish-quotes-throughout-history.html

[5] https://en.wikipedia.org/wiki/List_of_Irish_ballads

[6] For instance: U2: “Sunday Bloody Sunday,” Simple Minds: “Belfast Child,” The Cranberries: “Zombie.”

[7] Until Guinness bought out the brewery building recently, they held a 9,000 year lease on it.

[8] https://www.youtube.com/watch?v=4Sje2VYw99A

About the song: https://en.wikipedia.org/wiki/%C3%93r%C3%B3_s%C3%A9_do_bheatha_abhaile

Translation in English: http://songsinirish.com/oro-se-do-bheatha-bhaile-lyrics/

Revisions author: https://en.wikipedia.org/wiki/Patrick_Pearse

[9] The execution of the leaders of Easter ‘16 was perhaps the most profound mistake the British could have made.  Initially, they sentenced 89 men and a woman to death; but the first 15 executions were staggered over 9 days, as crowds stood outside the prison weeping, and politicians both Irish and British protested.  Author James Stephens described it as “like watching blood oozing from under a door.”  https://en.wikipedia.org/wiki/James_Stephens_(author)  The sentences of the other 75 sentenced to death were commuted.  But the damage was done.  The effect was to galvanize the Irish people in support of independence.  https://en.wikipedia.org/wiki/Easter_Rising

[10] https://www.youtube.com/watch?v=h4ZyuULy9zs

[11] https://theelectricagora.com/2017/02/22/nationalism-and-mythical-thinking/

 

This essay originally appeared at: https://theelectricagora.com/2017/03/03/politics-and-song/

The trolley problem and the complexities of history

This was originally a response to a discussion concerning the so-called trolley problem – a supposed ethical dilemma involving a choice to allow a trolley to speed toward five innocent people; or hit a switch that may re-direct it toward another innocent person on another track; or simply throw a person in front of the trolley in order to save the lives of the other five. Basically, a choice between deontological and utilitarian ethics. I can’t remember whether it was devised by psychologists but is used by some philosophers as a thought experiment, or the other way around. It is, from my perspective, utterly useless.

Ethics can get very complicated. Or actually, it always is complicated, but when we make our actual decisions, we do so by focusing on specific details in the context in which the decisions are made.

Do we begin an understanding of ethics in Germany, by studying the behavior of the Germans and the Nazis in the ’30s and ’40s? Of course, but how could it be otherwise? And in such study our purpose is not to justify that behavior, but to understand it, and to derive principles, both positive and negative, according to which we have greater purchase over our own behavior in the future.

Having written a study on Hitler, I had to confront a wide range of behaviors in Germany in that era. In that confrontation, I had to ask some painful questions. What made highly intelligent and otherwise ethical doctors engage in crude and cruel ‘experiments’? Why did supposedly decent truck drivers willingly deliver Zyklon B to the death camps, knowing what it was intended for? If one had asked a young soldier whether it was right to beat an infant to death, he would not only have rejected the suggestion, he would have been appalled. Yet the next day he would beat an infant to death, persuaded that the infant’s Jewish descent, or the presumed wisdom of the officer ordering him to do this, effectively excused him from responsibility.

After ordering the police to form what were effectively death squads, to ‘clean up’ Jewish villages in Poland in the wake of the invasion, Himmler decided it was his duty to witness one of these mass executions. He came, he saw, he promptly threw up, sick with horror. Then he just as promptly reassured the men involved that they were engaging in terrible acts for the greater glory of Germany, and that they would be well remembered for their ‘moral’ sacrifice. (By the way, the notion that these special police had to follow orders in performing mass murders happens to be a lie. If any of them felt they could not in good conscience participate, they were re-assigned to desk jobs back in Germany. Partly for this reason they were eventually replaced by the more dedicated SS.)

It is little known, but the Supreme Court of Germany, at least up to the time of my study, had never ruled Hitler’s dictatorship or the laws made under him illegitimate; it held instead that they were completely constitutional for their time, merely superseded by the post-war constitution. That should give us pause.

Other odd facts raise troubling questions. Himmler was a school teacher who believed stars were ice crystals. The Nazis condemned contemporary physics as “Jewish science,” except of course when it could be used to build weapons. Goebbels had a doctorate in German philology – along with some 40,000 Nazis holding graduate degrees in various fields, including half the medical doctors in Germany.

A right-wing influence on the young in the ’20s and ’30s was a major folk music revival. One of the most popular poets in this era was Walt Whitman in translation. Germany was peppered with pagan-revival religious cults, a movement dating back a century previous. The concentration camps were modeled in part on relocation camps for American Indians in the previous century.

Although homosexuals were oppressed and sent to camps in the later ’30s, the leadership of the Nazi SA (Brownshirts) was notorious for its homosexual orgies (which led the Army’s General Staff to demand their execution, carried out in the Night of the Long Knives).

The Marxists in the Reichstag voted for Hitler’s chancellorship, thinking that would position them to better negotiate with the Nazis.

Sociological analysis indicates that a third of Germany’s population actively supported Hitler, another third decided to go along with him, because what the heck, what did they have to lose? The final third were opposed to Hitler, but after all, they were Germans, and respected his legitimate election. Given the brutal totalitarianism of the Nazis, by the time they thought to resist, they were stuck.

Hitler himself was a vegetarian, something of an ascetic who indulged only by pouring sugar in his wine; he ended up addicted to pain pills. He banned modern artists, but in his youth had hoped to become one. He was fond of Mickey Mouse cartoons. Once the war started he found himself losing interest in Wagner’s operas. He told his architect Speer that he wanted buildings that would make ‘beautiful ruins.’ He refused to marry his lover Eva Braun until the moment he determined that they both needed to die. In the bunker he admitted bitterly that Schopenhauer had been right that the way of ‘Will’ was an exercise in futility, and that the Germans had proven the weaker race after all.

Historical facts like these present a wide array of ethical and political problems that aren’t going to be solved by simplistic reduction to binary choices, readily determined by psychologists or moral absolutists.

What next, the ‘five-year old Hitler dilemma’? – ‘if you could go back in time and shoot Hitler at age five, would you do so?’ Yes; double tap – and always put one in the brain.

Who are those five people the trolley is racing towards? Answer that question and the problem might be easier to solve.

 

Violence and identity

“I wouldn’t have it any other way”

The Wild Bunch is a 1969 film directed by Sam Peckinpah (written by Peckinpah and Walon Green) [1]. Nominally a Western, it tells the story of a gang of aging outlaws in the days leading up to their last gun battle.

After a failed payroll robbery, in which more innocents are killed than combatants, five surviving outlaws make their way into Mexico, broke and dispirited. The lead outlaw, Pike Bishop, remarks to his colleague Dutch that he wants to make one last big haul and then “back off.” “Back off to what?” Dutch asks, to which there is no answer. Finally Dutch reminds Bishop, “they’ll be waiting for us,” and Bishop, the eternal adventurer, replies, “I wouldn’t have it any other way.”

In Mexico, the Bunch – including the two Gorch brothers, Lyle and Tector, and Sykes, an old man who rides with them – visit the home town of their youngest member, Angel, which has recently suffered a visit by Federal troops under General Mapache, during which anti-Huerta rebel sympathizers were rooted out and murdered. The Bunch forms an odd bond with the townsfolk, but they’re outlaws and they’re broke. Eventually they make a deal with Mapache (who is advised by Germans, eager to see Mexico allied with them in the impending war in Europe) to rob a US arms train across the border. The robbery is successful, and they return to Mexico with the stolen arms (including a machine gun), pursued, however, by a group of bounty hunters led by Deke Thornton, a former outlaw whom Bishop once abandoned during a police raid on a bordello. Later, the bounty hunters will wound Sykes, whom the Bunch will abandon to his fate.

Along the trail, Angel, a rebel sympathizer himself, has some Indian friends carry away a case of guns and another of ammunition. Angel, however, has been betrayed by the mother of a young woman he killed in a fit of anger for having run off to join Mapache’s camp followers. The outlaws complete their deal with Mapache, but hand Angel over to him. Deciding to let Mapache deal with the bounty hunters, they return to the Army headquarters in the ruins of an old winery. However, their betrayal of Angel haunts them. After a brief period of whoring and drinking, they decide to confront Mapache and demand the return of their colleague. Mapache cuts Angel’s throat, and without hesitation Pike and Dutch shoot him down. At this point, the Bunch probably could take hostages and back off – but to what? Instead they throw themselves gleefully into a gun battle with some 200 Federales, and by taking control of the machine gun do quite a bit of damage. Eventually, however, the inevitable happens, and they end up dead, Pike shot by a young boy with a rifle.

As the surviving Federales limp out from the Army HQ, Thornton shows up. He sends the bounty hunters home with the outlaws’ bodies, but remains to mourn the loss of his former friends. Sykes rides up with the rebel Indians who have saved him, and suggests Thornton join them. “It ain’t like it used to be, but it’ll do.” Laughing in the face of fate, they ride off to join the revolution.

The thematic power of the film hinges on two apposite recognitions. The first is that the outlaws are bad men. They rob, they cheat, they lie, they kill without compunction. They seem to hold nothing sacred and have no respect for any ethical code.

The second recognition is that this judgment is not entirely complete or correct. They have a sense of humor and an undeniable intelligence. They are able to sympathize with the oppressed villagers in Mexico. They have a sense of being bound together, and this is what leads them to their final gun battle.

The Bunch have lived largely wretched lives. As professional outlaws, they are dedicated to acquiring wealth by criminal means, but throughout the film it is clear that wealth offers them only two things: prostitutes and liquor. Although Pike was once in love and thinking of settling down, and (the asexual) Dutch speaks wistfully of buying a small ranch, they are just as committed to the outlaw lifestyle as the unrepentant Gorches; they would just rather believe otherwise.

This is because they are committed to a life of violence, to the thrills of dangerous heists, of chases across the landscape of the Southwest, and of gun fights. They rob largely to support that lifestyle, not the other way around.

The finale of the film has two major points of decision, the first determining the second. The first is when Pike, dressing after sex with a prostitute, sits on the bed finishing off a bottle of tequila.  That’s his life; and with the wealth gotten from the Mapache deal, he could continue it indefinitely. In the next room, the Gorch brothers, also drunk, argue with another prostitute over the price of her services. That’s their life, too. Meanwhile, Angel is getting tortured to death for being an outlaw with a conscience. Pike slams the empty bottle to the floor, and the march into battle begins.

The second point of decision has already been remarked on.  The moment after shooting Mapache, when they might have escaped, the Bunch choose to fight instead. Why do they do it? It’s not for the money, the drinking or the prostitutes.  Is it for revenge?  No, it’s because they live for the violence, and they do so as a team, and they have reached the moment at which they can live it to its logical conclusion.

Peckinpah remarked that, for that moment to carry any weight, the outlaws needed to be humanized to the extent that the audience could sympathize with them. He was, I think, largely successful. But the film has been controversial, not only because of its portrayal of violence, but because in the climactic battle Peckinpah pushes our sympathies for the Bunch beyond mere recognition of their humanity. They become heroic, larger-than-life, almost epic figures, challenging fate itself in order to realize themselves, like Achilles on the field before Troy. And oddly, while not really acting heroically, they become heroes nonetheless, remembered by the revolutionaries who benefit from their sacrifice.

As a side remark, let’s note that Peckinpah was raised in a conservative Calvinist, Presbyterian household. But, like Herman Melville a century before, he was a Calvinist who could not believe in God.  In such a universe, some are damned, but no one is saved. We only realize our destiny by not having any. The Bunch destroy any future for themselves and thus, paradoxically, achieve their destiny. The fault is not in our stars, but in ourselves.

A Soldier’s Story

The Wild Bunch is set in the last months of the Huerta dictatorship (spring of 1914), a phase of the series of rebellions, coups d’état, and civil wars known collectively as the Mexican Revolution. [2] Officially, this revolution began with the fall of the Diaz regime and ended with the success of the Institutional Revolutionary Party (PRI), but rebellions and bloodshed had already permeated the Diaz regime and continued a few years after the PRI came to power. In the official period of the revolution, casualties numbered approximately 1,000,000. When one discovers that the Federal Army only had about 200,000 men at any time, and that rebel armies counted their soldiers in the hundreds, one realizes that the majority of these casualties had to be non-combatants. Not surprisingly: the Federal Army, and some of the rebels, pursued a policy (advocated by our current US president) of family reprisal – once a rebel or a terrorist is identified, but cannot be captured or killed, his family is wiped out instead. Whole villages were massacred. Dozens of bodies would be tossed into a ditch and left to rot.

As I’ve said elsewhere, I’ve nothing against thought-experiments that raise ethical questions, only those that limit the possible answers unjustifiably. So let us now imagine ourselves in the mind of a young Federal soldier, whose commandant has ordered him to shoot a family composed of a grandmother, a sister, a brother – the latter having atrophied legs due to polio – and the sister’s six-year-old daughter. The relevant question here is not whether or not he will do this. He will. The question is why.

This is a kind of question that rarely, if ever, appears in ethical philosophy in the Analytic tradition. It is, however, taken quite seriously in Continental philosophy. There’s a good, if uncomfortable, reason for this. Continental thinkers write in a Europe that survived the devastation of World War II, and they live among both the survivors of the Holocaust and the perpetrators of it. Analytic philosophers decided not to bother raising too many questions concerning Nazism or the Holocaust. Indeed, in the US, the general academic approach to events in Germany in the 1930’s and ’40s has been that they constituted an aberration. Thus, even in studies of social psychology, the Nazi participants in the Holocaust are treated as examples of some sort of abnormality, or as test cases in extremities of assumed psychological, social, or moral norms. This is utter nonsense. If that were true, then such slaughters would have been confined to Europe. And yet very similar things went on in the Pacific Theater: during the Japanese invasion of China, the number of casualties is estimated as running into the tens of millions.

There were a million casualties resulting from the Turkish mass killing of the Armenians, long before the Holocaust. There were several million victims of the Khmer Rouge in Cambodia, decades after the Holocaust. Far from being some psycho-social aberration, human beings have a facility for organized cruelty and mass slaughter.

At any rate, assuming that our young Mexican soldier is not suffering from some abnormal psychology, what normative thoughts might be going through his mind as he is about to pull the trigger on the family lined up before him?

For the sake of argument, we’ll allow that he has moral intuitions, however he got them, that tell him that killing innocent people is simply wrong. But some process of thought leads him to judge otherwise; to act despite his intuition. Note that we are not engaging in psychology here and need not reflect on motivations beyond the ethical explanations he gives for his own behavior.

While not a complete listing, here are some probable thoughts he might be able to relay to us in such an explanation:

For the good of the country I joined the Army, and must obey the orders of my commanding officer.

I would be broke without the Army, and they pay me to obey such orders.

These people are Yaqui Indians, and as such are sub-human, so strictures against killing innocents do not apply.

I enjoy killing, and the current insurrection gives me a chance to do so legally.

So far, all that is explained is why the soldier either thinks personal circumstances impel him to commit the massacre or believes doing so is allowable within the context. But here are some judgments that make the matter a bit more complicated:

This is the family of a rebel, who must be taught a lesson.

Anyone contemplating rebellion must be shown where it will lead.

This family could become rebels later on. They must be stopped before that can happen.

All enemies of General Huerta/ the State/ Mexico (etc.) must be killed.

Must, must, must. One of the ethical problems of violence is that there exist a great many reasons for it within certain circumstances, although precisely which circumstances differ considerably from culture to culture, social group to social group, and generation to generation. In fact, there has never been a politically developed society for which this has not been the case. Most obviously, we find discussions among Christians and the inheritors of Christian culture concerning what would constitute a “just war” (which translates into “jihad” in Islamic cultures). But we need not get into the specifics of that. All states, regardless of religion, hold to two basic principles concerning the use of violence in the interests of the State: first, obviously, the right to maintain the State against external opposition; but also, second, the right of the State to use lethal force against perceived internal threats to the peace and stability of the community. We would like to believe that our liberal heritage has reduced or eliminated adherence to the latter principle, but we are lying to ourselves. Capital punishment is legal in the United States, and 31 states still employ it. The basic theory underlying it is quite clear: forget revenge, or protection of the community, or questions of the convicted person’s responsibility – the State reserves the right to end a life deemed too troublesome to continue.

But any conception of necessary violence seriously complicates ethical consideration of violence per se. Because such conceptions are found in every culture and permeate every society – by way of teaching, the arts, laws, political debates, propaganda during wartime, etc. – it is likely that each of us has, somewhere in the back of our minds, some idea, some species of reasoning, some set of acceptable responses, cued to the notion that some circumstances, somewhere, at some time, justify the use of force, even lethal force. Indeed, even committed pacifists have to undertake a great deal of soul-searching and study to recognize these reasons and uproot them, and they are unlikely ever to get them all.

Many more simply will never bother to make the effort. They are either persuaded by the arguments for necessary force, or they have been so indoctrinated into such an idea that they simply take it for granted.

Because there are several and diverse conceptions and principles of necessary violence floating around in different cultures, one can expect that this indoctrination occurs to various degrees and by various means. One problem this creates is that regardless of its origin, a given conception or principle can be extended by any given individual. So today I might believe violence is only necessary when someone attempts to rape my spouse, but tomorrow I might think it necessary if someone looks at my spouse the wrong way.

The wide variance in possible indoctrination also means a wide variety in the way such a principle can be recognized or articulated. This is especially problematic given differences in education among those of differing social classes. So among some, the indoctrination occurs largely through friends and families, and may be articulated only in the crude assertion of right – “I just had to beat her!” “I couldn’t let him disrespect me!” – while those who go through schools may express this indoctrination through well thought-out, one might say philosophical, reasoning: “Of a just war, Aquinas says…” or “Nietzsche remarks of the Ubermensch…” and so on. But we need to avoid letting such expressions, either crude or sophisticated, distract us from what is really going on here. The idea that some violence is necessary has become part of the thought process of the individual. Consequently, when the relevant presumed – and prepared-for – circumstances arise, not only will violence be enacted, but the perpetrator will have no sense of transgression in doing so. As far as he is concerned, he is not doing anything wrong, even should the violent act appear to contradict some other moral interdiction. The necessary violence has become a moral intuition and overrides other concerns. “I shouldn’t kill an innocent, but in this case, I must.”

Again, this is not psychology. After more than a century of pacifist rhetoric and institutionalized efforts to find non-violent means of “conflict resolution,” we want to say that we can take this soldier and “cure” him of his violent instincts. But what general wants us to do that? What prosecutor, seeking the death penalty, wishes that of a juror?

The rhetoric of pacifism and the institutionalization of reasoning for non-violence is a good thing, don’t misunderstand me. But don’t let it lead us to misunderstand ourselves. There is nothing psychologically aberrant in the reasoning that leads people to justify violence, and in all societies such reasoning is inevitable. It’s part of our cultural identity.  Strangely enough, it actually strengthens our social ties, as yet another deep point of agreement between us.

Being Violent

I’m certain that, given the present intellectual climate, some readers will insist that what we have been discussing is psychology; that Evolutionary Psychology or genetics can explain this; that neuroscience can pin-point the exact location in the brain for it; that some form of psychiatry can cure us. All of which may be true (assuming that our current culture holds values closer to “the truth” than other cultures, which I doubt), but is nonetheless irrelevant. It should be clear that I’m trying to engage in a form of social ontology or what might be called historically-contingent ontology. And ethics really begins in ontology, as Aristotle understood.  We are social animals, not simply by some ethnological observation, but in the very core of our being. We just have a difficult time getting along with each other.

It’s possible to change. Beating other people up is just another way to bang our own heads against the wall; this can be recognized, and changed, so the situation isn’t hopeless. As a Buddhist, I accept the violence of my nature, but have certain means of reducing it, limiting it, and letting it go. There are other paths to that. But they can only be followed by individuals. And only individuals can effect change in their communities.

This means we have to accept the possibility that human ontology is not an a-temporal absolute. I know there is a long bias against that; but if we are stuck with what we have always been, we are doomed.

Nonetheless, the struggle to change a society takes many years, even generations, and it is never complete. Humans are an indefinitely diverse species, with a remarkable capacity to find excuses for the most execrable and self-destructive behavior. There may come a time that humans no longer have or seek justifications for killing each other; but historically, the only universal claim we can make about violence is that we are violent by virtue of being human, and because we live in human society.

Notes

  1. http://www.imdb.com/title/tt0065214/
  2. https://en.wikipedia.org/wiki/Mexican_Revolution

Reprinted from: https://theelectricagora.com/2017/02/11/violence-and-identity/

Reasoning, evidence, and/or not miracles

This week at Plato’s Footnote, Massimo Pigliucci posted a brief discussion of how probability reasoning, especially of the Bayesian variety, can be used to dispel contemporary myths such as anti-vaccination paranoia, trutherism concerning the events of 9/11/01, and birtherism concerning former President Obama.

https://platofootnote.wordpress.com/2017/01/16/anatomy-of-a-frustrating-conversation/

 

The comments thread became an object lesson in just how difficult it is to discuss such matters with those who hold mythic beliefs – every silly conspiracy theory was given vent on it. I myself felt it useful to engage briefly with an apologist for miracle belief, someone misrepresenting the argument against such belief put forth by David Hume, referenced in Pigliucci’s article. I would like to present and preserve that conversation here, not only because it is representative of the discussions on the comment thread, but also because it is representative of the kinds of discussions reasonable people generally have with those so committed to their beliefs that they are open to neither reasoning nor evidence against them.

 

Asserting that Hume begins by declaring miracles simply impossible (and thus pursuing a circular argument), a commenter with the handle jbonnicerenoreg writes:

 

“The possibility of something should be the first step in a n argument, since of something is impossible there is no need to argue about it. For example, Hume says that miracles are impossible so it is not necessary to look at a particular miracle probability. I believe Hume’s argument does more than the reasoning warrants. ”

 

My reply:

That isn’t Hume’s argument at all. Hume argues that since miracles violate the laws of nature, the standard of evidence for claims of their occurrence is considerably higher than for claims of even infrequent but natural events (such as someone suddenly dying from seemingly unknown causes – causes we now know to include aneurysms, strokes, heart failure, etc.). Further, the number of people historically who have never experienced a miracle far outweighs the number who claim they have, which raises questions about the motivations behind such reports. Finally, Hume remarks that all religions have miracle claims, and there is no justification for accepting the claims of one religion over any other, in which case we would be left with having to accept all religions as equally justified – which would be absurd, given that each religion is embedded with claims against all other religions.

 

Hume doesn’t make a probability argument, but his argument suggests a couple; for instance, given the lack of empirical evidence, and the infrequency of eye-witness accounts (with unknown motivations), the probability of miracles occurring would seem to be low. At any rate, I don’t remember Hume disputing the logical possibility of miracles, but he does demand that claims made for them conform to reason and empirical experience.

 

jbonnicerenoreg: “If you witness Lazurus rise from the dead, and if you know he was correctly entombed, then your evidence is sense experience–the same as seeing a live person. Hume’s standard of evidence is always about historical occurrences.”

 

My reply:

If such an experience were to occur, it might be considered ’empirical’ to the one who has the experience; but the report of such an experience is not empirical evidence of the occurrence, it is mere hearsay.

 

Unless you want to claim that you were there at the supposed raising of Mr. Lazarus, I’m afraid all we have of it is a verbal report in a document lacking further evidentiary justification, for a possible occurrence that supposedly happened 2000 years ago – which I think makes it an historical occurrence.

 

And no, Hume’s standard of evidence is clearly not simply about historical occurrences, although these did concern him, since his bread-and-butter publications were in history. But if miracles are claimed in the present day, then they must be documented in such a way that a reasonable skeptic can be persuaded to consider them. And it would help even more if they were repeatable by anyone who followed the appropriate ritual of supplication. Otherwise, I feel I have a right to ask, why do these never happen when I’m around?

 

7+ billion people on the planet right now, and I can’t think of a single credible report, with supporting evidence, of anyone seeing someone raised from the dead. Apparently the art of it has been lost?

 

Look, I have a friend whose mother died much too young, in a car crash, 25 years ago. Could you send someone over to raise her from the dead? I suppose bodily decomposition may make it a little difficult, but surely, if the dead can be raised they should be raised whole. Zombies with their skin falling off are difficult to appreciate, aesthetically.

 

jbonnicerenoreg: “I suggest that if you can get over yourself, please read Hume carefully and comment with quotes. I will be glad to answer any questions you may have about the logic of the argument.”

 

My reply:

Well, that you’ve lowered yourself to cheap ad hominem once your argument falls apart does not speak much for your faith in your position.

 

However, I will give you one quote from Hume’s An Enquiry Concerning Human Understanding, Section X, “On Miracles”:

 

A wise man, therefore, proportions his belief to the evidence. In such conclusions as are founded on an infallible experience, he expects the event with the last degree of assurance, and regards his past experience as a full proof of the future existence of that event. In other cases, he proceeds with more caution: he weighs the opposite experiments: he considers which side is supported by the greater number of experiments: to that side he inclines, with doubt and hesitation; and when at last he fixes his judgement, the evidence exceeds not what we properly call probability. All probability, then, supposes an opposition of experiments and observations, where the one side is found to overbalance the other, and to produce a degree of evidence, proportioned to the superiority. A hundred instances or experiments on one side, and fifty on another, afford a doubtful expectation of any event; though a hundred uniform experiments, with only one that is contradictory, reasonably beget a pretty strong degree of assurance. In all cases, we must balance the opposite experiments, where they are opposite, and deduct the smaller number from the greater, in order to know the exact force of the superior evidence.

( http://www.bartleby.com/37/3/14.html )

 

I think Massimo and I are reading such a remark rather fairly, whereas you preferred to bull in with something you may have found on some apologist’s web-site, or made up out of whole cloth. It was you who needed to provide quotes and reasoning, BTW, since your counter-claim is opposed to the experience of those of us who have actually read Hume.

 

By the way, I admit I did make a mistake in my memory of Hume – he actually is making a probability argument, quite overtly.

 

jbonnicerenoreg: “A beautiful quote and one which I hope we all take seriously put into practise.

Hume is arguing against those who at that time would say something like “miracles prove Christianity is true”. You can see that his argument is very strong against that POV. However, he never takes up the case of a person witnessing a miracle. Of course, that is because “observations and experiments” are impossible in history since the past is gone and all we have is symbolic reports which you call “hearsay”. My congratlations for taking the high road and only complaining that I never read Hume!”

 

My reply:

Thank you for the congratulations, I’m glad we could part on a high note after reaching mutual understanding.

 

Notice that jbonnicerenoreg really begins with a confusion between the possible and the probable. One aspect of a belief in myths is the odd presumption that all things possible are equally probable, and hence ‘reasonable.’ I suppose one reason I had forgotten Hume’s directly probabilistic argument is that probabilistic reasoning now seems to me a wholly necessary part of reasoning, to the point that it doesn’t need remarking. But, alas, it does need remarking, time and again, because those who cling to myth always also cling to the hope – nay, insistence – that if there is something possible about their precious myth, then it ought to be given equal consideration along with what is probable, given the nature and weight of available evidence. Notice also that jbonnicerenoreg tries to sneak in, sub rosa as it were, the implicit claim that eye-witnesses to miracles – such as the supposed authors of the Bible – ought to be given credence as reporting an experience, rather than as reporting a hallucination, or fabricating an experience for rhetorical or other purposes. Finally, notice that when I play on and against this implicit claim, jbonnicerenoreg tries an interesting tactic – he surrenders the problem of historical reportage, while continuing to insist that witnessing miracles is still possible (which, if verified, would mean we would need to give greater weight to those historic reports after all!). But there again we see the confusion – the possible must be probable, if one believes the myth strongly enough.

 

And if we believe in fairies strongly enough, Tinker Bell will be saved from Captain Hook.

 

This won’t do at all.  The bare possibility means nothing.  Anything is possible as long as it doesn’t violate the principle of non-contradiction.  A squared circle is impossible; but given the nature of the space-time continuum posited by Einstein, a spherical cube may not only be possible but probable, presuming a finite universe.  But the probability of my constructing or finding an object I can grasp in my hand, that is both a sphere and a cube is not very high, given that we exist in a very small fragment of Einstein’s universe, and Newtonian physics and Euclidean geometry suit it better than applied Relativity on a universal scale.  All things in their proper measure, in their proper time and place. 

 

But the problem with miracles is that they are never in their proper time and place, to the extent that one wonders what their proper time and place might be, other than in works of fiction.  Why raise Lazarus from the dead if he’s just going to die all over again?  Why raise Lazarus instead of the guy in the grave next to his?  Why do this in an era and in a place lacking any sophisticated means of documentary recording?  And why would a divine being need to make such a show of power?  Wouldn’t raw faith be enough for him; must he have eye-witnesses as well?

 

And of course that’s the real problem for jbonnicerenoreg.  For miracles to achieve anything that looks like a probability, one first has to believe in god (or in whatever supernatural forces are capable of producing such miracles).  There’s no other way around it.  Without that belief, a miracle is bare possibility and hardly any probability at all.  And I do not share that belief.

 

The known unknown on the internet

This was written after reading an interesting article by Firmin deBrabander, “Shame on You,” at the Aeon website. *

 

deBrabander uses the perspective of French sociologist and philosopher Michel Foucault to discuss some current cultural formations arising in and because of the internet and its ‘social media.’

 

Foucault was concerned with the nature of power in modern capitalist society.  But he held that power is diffuse and not centralized.  We learn to regulate ourselves in a society that forms our personalities, a society in which even our darkest or most cherished secrets are actually available for view and review in particular circumstances.  This creates a web of relations throughout which power, as the effort to control behavior (of ourselves and others), is disseminated through language and shared interests.  One essential aspect of such power relationships has to do with how we seek to be seen, and how we seek to see others.

 

We may be watched by the state (probably are), but first we are watched by parents, peers, total strangers – your neighbors, the people you meet in a shop or on a bus, your congregation at church (if you attend), etc., etc. However, society has a hierarchical structure, so naturally those who benefit most from social strictures on behavior will be those with money, influence, or authority.

 

So what deBrabander is asking is how the internet has affected the diffusion of power, normalizing this interplay with what one might call socialized privacy, and how that has generated echo chambers leading to a disunity of communication in society as a whole: “The result,” deBrabander remarks, “is a growing conformity within camps, as well as a narrowing of the shared space for understanding and dialogue between them.” And this seems clearly to benefit those with money, influence, or authority.

 

Self-regulation is essential to any society; however, in the current environment, you are almost guaranteed to reveal some, perhaps all, of what you keep hidden to someone; if you do so on the internet – which is always a public forum, no matter how we pretend otherwise – that creates problems, some of which deBrabander discusses. (Although I think there are more as well.)

 

In some sense everything about us is ‘shameful,’ yet everything must be ‘confessed.’ And we seem to be constructing a culture around this double imperative.

 

Shame exists as a social function, helping to generate a sense of self with the agency to determine seemingly hidden values and revealed values. However, the sense of shame is indoctrinated by parents and peers, and differing social groups will determine the shamefulness of differing values. Thus anything about an individual may prove shameful in some circumstance. However, in the globalized social media, small groups appear to form around what the participants may think are private revelations that are in fact entirely public. If we take the presumed privacy as a means of protecting the hidden, then everything hidden in the many different groups becomes an object of potential shame. However, in order to participate in any group, one has to reveal what is hidden, even what the person feels ought to be hidden, and so confess. And since there is no real privacy on the internet, what is confessed is confessed publicly. This creates a web of what is hidden from some groups but revealed in others, available to all in most circumstances, and in other circumstances available to those with the proper technology. This web supports the social status quo, and in a hierarchical society it especially supports those at the top of the hierarchy with the wherewithal to leverage technological access to all the information in the web.

 

It’s pointless to get paranoid in this situation; however it helps, in learning to live with it, to recognize that it is, and what it is.

 

To see this more concretely, imagine a professional football player; last year he signed a lucrative ten-year contract, this despite his knowledge (known only to his family) that his mother died of Huntington’s chorea, which means there is a 50% chance that he will not be able to fulfill that contract.

 

So, he doesn’t want to confess this to his team. But at some point, reluctantly, he confesses to a doctor, to receive proper diagnosis. It’s positive. So he secretly joins a support group with fellow sufferers, which is primarily concerned with confessing the kinds of physical and emotional suffering the condition causes.

 

Meanwhile, on his off-hours he pursues an interest in gardening, particularly flowers. But he doesn’t want his teammates to know this, because they all say such an interest is gay. That isn’t true, of course; but just as it happens, he is gay – and he doesn’t want his teammates to know this either. However, he certainly wants those who attend his favorite gay bar to know this, since that’s the only way he can make relationships at that bar, to which he goes after spending time at a local horticulture club. But he doesn’t mention this at the bar, because it’s a leather bar, and flowers are considered fey there.

 

Meanwhile, his alcoholic brother has sobered up thanks to the intervention of a fundamentalist church, and insists they attend some meetings there together, which he does to support his brother (who doesn’t know he’s gay), despite the fact that he’s an atheist, which only his gay friends and his fellow horticulturalists know about him.

 

Now it might be said to him that these various social groups in which he participates put him in a tense and precarious situation, which could be ameliorated considerably if he would only confess all of his issues to everyone involved. But of course, while his sense of shame in certain groups would be alleviated somewhat, he would effectively be making himself a focus of attention, some of which he would rather not have (especially if his team decides that his Huntington’s chorea invalidates his contract).

 

But here’s the problem. On the internet, under various pseudonyms, he begins participating on sports sites; on sites for sufferers of Huntington’s chorea; on gay sites; on horticulture sites; on Christian sites for the support of families with someone suffering alcoholism; on atheist sites. On each site he confesses some aspect of himself and his situation he thinks he’s keeping hidden from others – from different others in the different groups in which he participates.

 

But he’s not. That myth is maintained by the acceptance of the pseudonyms he uses, and the fact that most of these sites do not communicate with each other. But in fact all his pseudonyms can be traced back to him; everything about him can be known.

 

The ease of access to the internet, the rapidity with which we can post on it, the ‘friending’ and ‘liking’ on many sites, the seemingly protective allowance for using pseudonyms, ‘handles’ and the like, have misled us into believing we have control over our presence on the web. That’s not true. To socialize at all we surrender something of ourselves to the groups we address. But on the internet, we may end up surrendering everything about ourselves to people we don’t know, and don’t even know exist. Remember, even without posting on the ‘net, our browsing is tracked to provide us with advertisement ‘recommendations.’ These are provided by programs; but the information can be accessed by the advertisers themselves. So there is no invisible presence on the ‘net. We enter it revealed, already ‘confessed’ by the websites we visit.

 

And as the construction of the surveillance state continues apace, there may come a time when everything we’ve revealed on the ‘net will be registered in a database in some government agency’s mainframe.

 

Again, there’s no point in getting paranoid, because in contemporary society, there’s no way to avoid these interactions. But one should always post on the ‘net prepared for the consequences of public exposure.

 

—–

 

* https://aeon.co/essays/how-baring-and-sharing-online-increases-social-conformity

I noted this article through a posting at Plato’s Footnote.*  The above includes a comment made there; since posting this, I’ve felt impelled to write another comment, which I expand on here, discussing some of the possible motivations for this problem:

 

In a society with few naturally formed communities, such as one used to find in homogenous small towns, we are ever trying to find communities of interest to join.  These can be support groups, hobby-interest groups, religious groups, fan clubs, sports clubs, or just the neighborhood bar.  In the process of becoming a member of such a community, one chooses what to reveal and what to conceal about one’s life as a whole.  This will often take on something of the nature of a confession, while involving anxiety, something in the nature of a sense of shame, concerning what is not revealed, although this is always a matter of degree.  An alcoholic in AA is certainly confessing, but in a presumably safe environment.  A recovering alcoholic attending a book club ‘confesses,’ even professes, a love of books, but may feel too much anxiety about his/her alcoholism to reveal anything about that.  However, in the process of attending AA he or she might discover someone who likes books; attending the book club might lead to the discovery of someone else with a similar issue, and friendships are formed; each community grows tighter together.

But on the internet, while the communities we join still call for professions, confessions, and silence on secrets, social interactions necessarily change.  Our recovering alcoholic begins posting on an AA-oriented website.  The conversations involved are there for all to see, not just recovering alcoholics.  The other participants in the discussion are unknown to our poster.  Some of them may not even be recovering alcoholics; they may be trolls trying to attract attention to their own site to accumulate ‘clicks’ for sale to advertisers.  Meanwhile, at the book-club site, where the participants are required to provide a list of their favorite books, our recovering alcoholic unthinkingly includes the Big Book as a favored text.  Soon it goes the rounds: ‘Are you an alcoholic?’  ‘I think Fakename21 is an alcoholic!’  ‘My father was a drunk, I hated him!’  ‘Why don’t you show some will-power?’ etc., etc.  If our protagonist wishes to remain in the online book club, suddenly we see a confession concerning his/her alcoholism.  It might be made angrily, or sorrowfully, or, if done with rhetorical finesse, will earn responses of approbation: ‘good thing you joined AA, keep it up!’

But the fact remains that what seemed to be a secret has now become a confession in an entirely different community than the one it was intended for.  And further, both the AA site postings and the book-club postings are now public property.

Such issues are magnified ten-fold on ‘social media’ sites like Facebook.  There, the communities are shallower and less grounded in shared interests, and the public access more open, less controlled, yet frequently unnoticed by those posting to their pages.  They think they’re sharing with family and ‘friends’ (whom they’ve often never met or actually talked with).  But their audience may include trolls, their employers, sex predators, government agencies, and certainly includes advertisers tracking their browsers.

So I don’t think it’s largely fame or attention such people are looking for, although that may be part of it.  Frankly, I think loneliness is what drives most of them to the internet.  It is ever harder to find real communities to join in one’s vicinity, and of course joining those requires the effort to get out, drive the car or take a bus, get jostled in a crowd, etc. – all the unpleasantness of real human contact – and the internet is so much more convenient.

That tells me that something has changed, and is still changing, here.  I can’t say that it’s a bad thing; I may be a grumpy old man concerning such matters.  But it doesn’t look like much of a good thing overall.


Misadventures in the dialectic

Or, a nasty thing happened on the way to the forum

Originally published at: https://theelectricagora.com/2016/11/04/misadventures-in-the-dialectic-or-a-nasty-thing-happened-on-the-way-to-the-forum/

Thus precisely in labour where there seemed to be merely some outsider’s mind and ideas involved, the bondsman becomes aware, through this re-discovery of himself by himself, of having and being a ‘mind of his own.’ [1]

When Hegel, in the Phenomenology of Mind, makes an abrupt transition from epistemology per se (how we know about anything at all) into an historicized social epistemology (how knowledge is socially and historically conditioned), he begins at an odd point in history, with an analysis of the relationship between lords and bondsmen; or, as it is better known, the Master-Slave dialectic.  What the Master learns in this dialectic is that he not only commands things, but does so through the mediation of commanding his slaves.  It is the Slave, however, who turns out to be the real protagonist in this narrative – what he learns is the necessity of living for others, and through that, his own independence from “things”; that is, from the material.

In a series of important lectures in the 1930’s, the Master-Slave Dialectic received an interpretation by the Russian emigre to France, Alexandre Kojeve, which had enormous impact on French intellectual history, especially on Existentialist thinkers like Sartre, as well as on the development of Lacanian psychoanalysis.  [2]  Although written more than ten years after Kojeve’s lectures, Albert Camus’ The Rebel (1951), a text widely popular among those who have never even heard of Kojeve, is in fact a response to Kojeve.

Within Existentialism itself (and in French philosophy generally), an ongoing debate over the Marxist implications of Kojeve’s lectures emerged.  Indeed, the Marxian narrative of the historical development of a Materialist Dialectic arriving at Modern capitalism (in preparation of a future communism),  depends on the Master-Slave Dialectic, because it assumes that the economy of the Roman Empire was principally a “slave economy” [3]; that is, slaves provided the primary means of production, as well as the central market (in the exchange of slaves) and the essential social structure, of the empire – there were slave owners, there were slaves, and there were cast-off slaves who, scrounging for work where they could find it, formed a nascent proletariat.

A reasonable interpretation of the Phenomenology (given Hegel’s own historical interests and biases) suggests that Hegel’s writing here arose as a meditation on the introduction of Christianity into the culture of Rome [4].  When Hegel wrote this, scholars believed – as they did until quite recently – that Christianity spread through the Empire by appealing to the poor; i.e., to slaves and former slaves [5].  Recent scholarship, however, has proven this untrue, and it appears that Christianity’s greatest appeal in Rome was to the middle classes – businessmen, lawyers, tradesmen [6].  (Only a middle class could afford the charitable social work that Christians engaged in.)  This does not really threaten Hegel (who, after all, is talking about ideas, and in a most general way), but it doomed Marxist historiography.

Evidence has been piling up that the economy of the Roman Empire was not primarily a slave economy, but a sophisticated capitalist one, based on international trade [7].  Even without the accumulating evidence, one realizes that it couldn’t have been otherwise.  The Roman Empire not only conducted trade with client states in the Mediterranean, but with co-existing empires over which they had no direct control, including those in India and China, as well as with cultures in Africa, which they had no desire to control.  Such trade could not be centered around a market for slaves – beyond precious metals or mere commodity exchange, there had to be negotiable systems of exchange of wealth with symbolic representation of equivalent value, namely money.  And where there is money, there is capitalism [8].

However, it is with some degree of irony that we can see that long before the archaeological evidence was unearthed and pieced together showing that Rome was in fact a capitalist society, there actually existed documentary evidence of this (since Nero), which has been available to literati since the 17th century.   I don’t mean accounting records, some Roman economist’s commentary or remarks made by some court historian.  I’m referring to a work of prose fiction; indeed, one of the funniest, most incisive, and, surprisingly, most realistic texts ever written:  The Satyricon, attributed to Petronius Arbiter [9].

We don’t really know who wrote Satyricon.  We don’t even know the original shape of it.  All we have are fragments, preserved in monastic libraries, until the 17th century, when secular book collectors got their hands on it thanks to Protestant looting of those libraries [10].   Some evidence suggests that the fragments are mere slivers from a much longer work, but internal evidence from the text itself shows a remarkable thematic consistency, suggesting that the fragments we do have at least form a narrative sequence within any larger whole. [11]

Satyricon is a wild ride through the underbelly of Roman society of its time.  The narrative is what later would be called a picaresque, a disjointed series of adventures of social outcasts, whose main interests in life are sex (primarily homoerotic) and food and finding some way to acquire the capital with which to procure them.   The narrator and protagonist of the story, Encolpius, has just dropped out of the Roman equivalent of an undergraduate course in literature, in order to compete with a former lover (Ascyltus) for the affections of a young boy, Giton. [12]  Being an educated lowlife, Encolpius isn’t interested in finding suitable employment, but instead tries attaching himself to well-to-do patrons.  This leads to bizarre sexual experiences, meetings with failed poets, tasteless feasts put on by Roman tradesmen, fake religious rites (always good for initiating orgies), and capture by pirates at sea.  As the fragments close, the story doesn’t appear to be going well, as Eumolpus, an aging poet and tutor to whom Encolpius has attached himself, fails to realize an inheritance, which effectively condemns him to death among those who had been supporting him.

The most famous sequence of the narrative is Encolpius’ attendance at a banquet thrown by a successful tradesman, Trimalchio.  The sequence is a fairly complete, unified set-piece.  We first find Trimalchio at a recreation center, playing ball.  When he has to urinate, a slave rushes up with a bucket so that Trimalchio can relieve himself while still playing.  Meanwhile, another slave counts the balls that Trimalchio recurrently loses in play (to recover later), so that his master can toss out a new ball with every flub, as if he hadn’t lost any.   The tone is thus set for one of the most outrageous displays of conspicuous consumption – and conspicuous waste – in the history of Western literature.

At length some slaves came in who spread upon the couches some coverlets upon which were embroidered nets and hunters stalking their game with boar-spears, and all the paraphernalia of the chase.  We knew not what to look for next, until a hideous uproar commenced, just outside the dining-room door, and some Spartan hounds commenced to run around the table all of a sudden.  A tray followed them, upon which was served a wild boar of immense size, wearing a liberty cap upon its head, and from its tusks hung two little baskets of woven palm fibre, one of which contained Syrian dates, the other, Theban.  Around it hung little suckling pigs made from pastry, signifying that this was a brood-sow with her pigs at suck.  It turned out that these were souvenirs intended to be taken home.  When it came to carving the boar, our old friend Carver, who had carved the capons, did not appear, but in his place a great bearded giant, with bands around his legs, and wearing a short hunting cape in which a design was woven.  Drawing his hunting-knife, he plunged it fiercely into the boar’s side, and some thrushes flew out of the gash.  Fowlers, ready with their rods, caught them in a moment, as they fluttered around the room and Trimalchio ordered one to each guest, remarking, “Notice what fine acorns this forest-bred boar fed on,” and as he spoke, some slaves removed the little baskets from the tusks and divided the Syrian and Theban dates equally among the diners. [13]

This would seem to support Marxian analysis of the culture of a slave-based economy; but there’s a problem with this.  Trimalchio’s biography has to be pieced together from his own remarks, those of his guests, as well as portraiture found on the walls of the hall leading to the banquet room.   But it amounts to this:  Trimalchio had been born a slave to a wealthy merchant.  He had proven so good at his chores that he rose to the position of steward of the estate of the merchant, who provided him with an allowance.  This he saved and invested until he could buy his freedom and position himself as inheritor of the merchant’s business [14].  Trimalchio has since spent his life acquiring greater wealth and rubbing it in the noses of failed businessmen whom he turns into his personal court of sycophants.

The banquet seems to be winding down, probably intended to end at dawn [15] (like Plato’s Symposium, which it somewhat parodies), when Trimalchio (always one to sing his own praises) reveals the intended epitaph on his tomb:

Here Rests G Pompeius Trimalchio

Freedman Of Maecenas

Decreed Augustal, Sevir In His Absence

He Could Have Been A Member Of

 Every Decuria Of Rome But Would Not

Conscientious Brave Loyal

He Grew Rich From Little And Left

Thirty Million Sesterces Behind

He Never Heard A Philosopher

 Farewell Trimalchio

 Farewell Passerby [16]

Well, that’s his story, and he’s sticking with it, even after death: a dash of truth in a swill of self-admiration.

After a violent argument with his wife (formerly a prostitute) over his bisexual promiscuity, Trimalchio then returns to this theme, by effectively staging his own funeral; whereat he eulogizes himself in the crudest manner possible, boasting of his use of sex, investments, and shady business practices to build a financial empire.  “So your humble servant, who was a frog, is now a king.”  [17]

So much for the slave coming to self-consciousness by realizing the importance of working for others!

The Satyricon is the rotten apple in the bushel, not only of literary history, but of the literature of history.  Besides being unabashedly pornographic, unrepentantly cynical in the nastiest way, and thoroughly disrespectful of social manners while dismissive of any aspiration toward decency and good fellowship, the Satyricon paints an unnervingly realistic portrait of the people of ancient Rome and of their social environment.   It’s not a pretty picture, and it fails to conform to any of the expectations into which we have long been indoctrinated, by traditional historical narratives or the works of art that disseminated these.  Rome was not just monumental architecture and statues in the forum.  It was an ugly, over-populated metropolis, with tenement slums, a criminal underworld, and thriving markets riddled with unethical business practices.  Alcoholism and drug abuse were rampant, and the working classes found their greatest distraction in public displays of cruelty, in the arena.   But more importantly, the people, as we find them in the Satyricon, are completely like ourselves.  We’ve met these people; we see them all around us.  Donald Trump is just a variant Trimalchio.  And who hasn’t encountered a pedantic professor pummeling students with bloated jargon that even he doesn’t understand?   I myself knew someone rather like a straight Encolpius in college; a bright mathematics student, he went through seven different sexual relationships in one semester (his general attitude toward women was best expressed in his parody of a classic song: “nothing could be finah than to wake in some vagina in the mo-o-orning…”).  There was never a day I met him when he wasn’t drunk or hung-over.

Moral improvement, political progress, aspirations toward a greater enlightenment and a brighter future; fables we tell ourselves to bring order to our lives and provide our children with hope.  To all such pretense the Satyricon raises a middle finger (as occasionally do its characters in the text).

What has really changed in human nature since Petronius?  We claim to know more about the world, but apparently we still do not know ourselves.  For two thousand years, Europe was able to mask this lack of self-recognition with a powerful ideological machine, supported by a monumental institutional structure with intimidating influence among political leaders.  As this began to fall apart, scientists, philosophers, poets and political revolutionaries sought to develop a similarly powerful ideology with an equal ability to suppress self-recognition.  But these are only stories, after all – told in mathematics sometimes, more often in heated rhetoric, but all just fables that we hope are true.  The only real change Modernity brought us has been new technology.   And all the new technology has accomplished is providing new commodities for thriving markets riddled with unethical business practices and war-mongers.

Marx is dead, but Hegel survives, as one of the grand fables of Modernity’s explanations for why we have any ideology at all and why we feel satisfied with our supposed progress [18].  Reading Hegel helps us to understand how we wish to think of ourselves, and of the history that we believe created us.  But the Satyricon shows us people as they are, at least in any complex, mercantile culture that we care to call a civilization.  Not all people, but enough that we should be more aware of – see with greater clarity – our own social environment, which hasn’t really improved so much in three thousand years.

Notes

[1] Hegel, The Phenomenology of Mind; B. Self-consciousness, IV. The true nature of Self-Certainty, A. Independence and Dependence of Self-Consciousness: Lordship and Bondage.  J. B. Baillie translation, 1910.

[2] Kojeve, who served in the French government after WWII, always claimed to be a Marxist, even a Stalinist, but slathered insults on the Soviet Union, and remained friends to the end with conservative political philosopher (and former student of Heidegger’s) Leo Strauss, whose best known student is Allan Bloom.  Bloom was the editor of the English translation of Kojeve’s lectures, 1969:

https://u.osu.edu/dialecticseastandwest/files/2016/02/KOJEVE-introduction-to-the-reading-of-hegel-zg6tm7.pdf.   (Camus’ response to Kojeve, The Rebel, is also online: https://libcom.org/files/The-Rebel-Albert-Camus.pdf.)   Bloom’s best known student is Francis Fukuyama, who acted as de facto philosophic counsel to the George W. Bush administration; his best known text: The End of History and the Last Man, 1992, expanded from his 1989 essay; essay prospectus: http://www.wesjones.com/eoh.htm

[3] See: http://www.marxist.com/historical-materialism-study-guide.htm.

[4] The Master-Slave Dialectic actually precedes a discussion of the Roman philosophies of Stoicism and Skepticism.  For Hegel, Christianity found its natural intellectual home in Rome, because Rome had produced the individualization of consciousness that Christianity requires, while exhausting all the reasonable expression of it possible within Roman culture itself.  (Per Hegel, Jewish culture, wherein Christianity originated, had found itself in a cul-de-sac of rigid, written “divine” law and inherited custom.)  By now, it should be obvious that we see in Hegel, not a theological explanation of history, but an historical explanation of theology, at least given the assumptions and accepted scholarly knowledge available to Hegel.

[5] Thus, for instance, Nietzsche’s claim that Christianity represented a “slave morality.”

[6] See: the review of scholarly opinion at: http://christianthinktank.com/urbxctt.html, especially section 13, Christianity was mostly made up of ‘middling-plus’ class folks: merchants, tradesmen, craftsmen.

[7] See:  https://en.wikipedia.org/wiki/Roman_economy

[8] Even Marx understood this, which is one reason he hated the very idea of money. See: https://www.marxists.org/archive/marx/works/1844/manuscripts/power.htm. He just hoped that money had been a recent invention.  Nope; it’s been here throughout most of recorded history.  See:  https://en.wikipedia.org/wiki/History_of_money.  I warn the reader that in this instance, the Wiki article is flawed, since it concentrates entirely on the history of money in the West.  In fact, there is evidence that the Chinese developed money at roughly the same time as the West, but paper currency much earlier.  See: http://www.nbbmuseum.be/en/2007/09/chinese-invention.htm.

[9] Our translation is that of W. C. Firebaugh (1922), which includes fascinating, if dated, scholarly notes:  http://onlinebooks.library.upenn.edu/webbin/gutbook/lookup?num=5225.

[10] See:  http://bookmendc.blogspot.com/2010/10/transmission-of-text-of-petronius.html.  My suspicion – but this is only a guess – is that clergy believed the text worthy of preservation, despite its scandalous material, because it included necessary keys to colloquial Latin.  Some Roman slang is only preserved in the Satyricon.  Besides, as Augustine argued in Civitas Dei, not only was the Roman Empire a dung heap, but secular history, as opposed to Sacred History – i.e., the relationship between Man and God – was entirely a waste of time.  See: http://sacs-stvi.org/augustine-on-the-concept-of-history.

[11] For instance:  Early in the text we get a discussion of the cannibalism performed on their children by mothers in besieged cities; and the existing text ends with Eumolpus demanding that those who would inherit from him eat his body.

[12] A requirement in the study of rhetoric, which tells us that Encolpius – like Augustine, two centuries later – was intended by his family for a career in law.

[13] Satyricon, Chapter Fortieth.

[14] And it certainly helped that he was the merchant’s lover, or “mistress,” as he remarks with drunken pride.

[15] It should end at dawn, but when Trimalchio hears the cock crow, he immediately orders it caught and cooked.

[16] Satyricon, Chapter Seventy-First.  “Decreed Augustal, Sevir In His Absence/ He Could Have Been A Member Of/ Every Decuria Of Rome But Would Not” – Trimalchio claims that he was appointed to the Priests of Augustus, and would have been welcome in any of the officially recognized cults of Rome; but (he implies) his modesty prohibited acceptance of such honors.

[17] Satyricon, Chapter Seventy-Seventh.

[18] In order to have an ideology, we must confront external disagreements with and internal contradictions to our beliefs, which are then resolved and appropriated, negated and cancelled, or marginalized and ignored.  We thus arrive at generalities that we comfortably assume are necessary and superior to those that came before.  Hegel’s is not the only description of this process, but it is in many ways the most powerful.  My argument here has been that the evidence of the Satyricon is that the margins keep coming back, the contradictions are rarely resolved, and it is an inevitable human trait to be thoroughly disagreeable.