End Game in the Humanities?

E. John Winner

I can’t speak to the Human Sciences, because their core missions seem to remain intact, despite the exhaustion and narrowing of their research and practices. But I think the Humanities have simply lost their way, probably without hope of recovery. My doctorate is actually in English, but what does that mean, to have a discipline called “English”? Is it the study of the language and its history? The study of the many uses of English over the centuries, especially in literary texts? The study of the values English-speaking peoples have expressed through their language? Perhaps, at a bare minimum, the study of the proper usage of the language, grammatically and in writing? When I first went to school we had answers to these questions, and the answers dovetailed into and supported each other. Now, not so much.
The knee-jerk response to this phenomenon is usually to blame one or another party for introducing ideological conflict into the discipline. But anyone aware of the many political quarrels in the discipline dating back to the initiation of contemporary language studies in the 17th Century knows that this cannot be the whole story. Another part certainly has to do with the development of mass media in the 19th Century and electronic media in the 20th. But television achieved media dominance in the 1960s, at the same time as America realized its greatest expansion of sophisticated literacy, so that’s only part of the picture as well. Yet undeniably the arrival of the internet and similar technology has had an enormous negative impact on the study — even the public practices — of what we broadly call “English.” The internet is populated with people who have no interest in traditional literacy, and even the most conservative among them have no interest in the history of the language or the texts produced therein. At most, cultural savvy, which should be the basis of reflection and shared conversation, is used as a battering ram for opposing points of view on topics having nothing to do with the traditions manifest in the archives of the English language. We often debate the value of different archival texts and the dangers of revising them to placate short-term political interests, and this is certainly an argument worth having for those of us comfortable with these texts; but one worries, what is the point in a post-literate culture, where cultural references are really only used as cudgels, to be screamed about rather than discussed? How do we clear away the miasma of self-appointed “influencers,” untrained by anything other than their personal preferences and their preferred social or political bubble, in order to renew the conversations wherein we can civilly disagree, yet also learn and perhaps change our minds?
But let’s also be honest. The decision to establish publication as the standard measure of academic success has always been problematic. It puts an awful lot of pressure on individual scholars and teachers, but it also puts pressure on the discipline itself. Just how many essays on Wordsworth’s “Tintern Abbey” did the world ever need? How many books about Jane Austen can we endure? And I love that poem, and I love Jane Austen, and I even find myself occasionally entertained by BBC documentaries on Austen that get posted on YouTube. But there you go: it is not just my age — the fact that I find it more and more difficult to commit myself to reading longer texts about texts — but the age itself, where questions concerning archival texts and their authors can be quickly and painlessly answered by a visit to a social media platform and a few views of videos, some professionally made, but others: “In My Basement channel: Jack Sprat reviews Pride and Prejudice; 39 views.”
So the academic text-mill had already said pretty much all it had to say about the archival texts, just a couple decades before the development of a medium that would make that text-mill socially superfluous. Of course the initial response was to expand the archive, broader and broader, until its boundaries simply disappeared. It was no longer the archive, it was simply whatever texts might cause a social buzz among English teachers and their students. But doesn’t that clearly fit well into the contemporary web and “social media”? Of course a case should be made that the teachers could still instruct their students in English literacy, but one doesn’t need to be truly literate to navigate the web. And given how profoundly dependent good English usage is on writing, on print, the loss of literacy strips an expression like “good English usage” of any necessary reference. How about “good emoji usage”? Or “proper trolling grammar”? “Self-expression thru sexting”?
So the study of English, as an academic discipline, effectively lost its core mission (partially dissolved, partially exhausted) just in time to enter a social environment populated for the most part by those who no longer had any interest in whatever that core mission had once been, and little interest in rebuilding a new core mission beyond the evident inertia of academic professionalism as a financed institution. (In other words, English departments exist simply because they have existed, and some people and agencies are willing to pay for their continued existence.) People fret over the political conflicts that still erupt in English departments, but can we not see that without these there would be nothing happening in them at all? Controversies excite interest, and excitement produces texts: publications, which remain the standard by which professors are hired or receive tenure.
It is my suggestion that this situation obtains across the Humanities disciplines in the academy; in different ways in the somewhat more practical humanities of, say, art or music, but in very similar ways in the study of philosophy. Because of his rather journalistic prose, enriched with a kind of smug irony, I think a lot of people (among those who cared) didn’t really understand the broad picture of the history of philosophy that Richard Rorty was painting in his later career. Traditional philosophy (first playing follow-up to theology and then playing catch-up to science) had exhausted itself; the culture that had once valorized that philosophy was itself exhausted, replaced by a culture committed to self-definition through the reading of novels, poetry, and other literary texts; in which culture philosophy could only be recognized as simply another literary genre. It was itself an irony of history that Rorty began elaborating this narrative at precisely the time when the kind of literature he held to be paradigmatic of contemporary culture was itself becoming outdated, by the now all-pervasive connectivity of the internet.
But there is still much truth to learn from Rorty, especially in the loss of a core mission for the disciplined study of philosophy. For quite some time, it has been assumed that philosophy is a continuing, highly trained study into some perennial set of stupefying questions about the very nature of human life and the ontology involved in that; a set of questions initially set up by Plato and Aristotle, and then revised through the framework of theology. But eventually, theology was effectively undone during the Reformation and discarded as a source of practical wisdom or insight into humans and their ontology, so philosophy could really only continue as a kind of mulling over of ancient texts from the Mediterranean. But then, Modern thinkers began to create elaborate systems in response to the new world that was opening up through discoveries coming from the new sciences, and the texts they wrote began gathering into an archive — a canon of important texts that researchers in various fields needed to study, to accommodate, or, if in disagreement, to criticize and correct — or even attempt to replace, either through development of superior systems, or through effective deconstruction of the impulse to systematization itself. One can see the many, many discussions, debates, controversies and innovations this might initiate. But one can also see the inevitable limitations. Whatever could be said of a canonical text and its ideas would be said, after which speaking of it would prove simply repetitious, avoiding redundancy through inventive jargon. And eventually, the jargon itself would become the very object of the study and its debates and controversies. And eventually, whatever could be said about the jargon would itself be said. And so on. Meanwhile, the supposed “perennial” questions began to shimmer and blur like streetlights left on during a hot sunny day. 
Taught in the schools, these would be perceived as the core mission of philosophy, but as the study of the archive gave way to debates about the jargon, I think it would become obvious to many professional philosophers that the perennial questions really weren’t so interesting and perhaps had never been perennial to begin with.
Some might think that philosophy can survive the dissolution or exhaustion of much of its previously held core mission, by at least teaching the clarity of thinking and the proper formation of questions to think about, “perennial” or not. But that move didn’t work for English, and it’s not working for philosophy, and much the same problems beset it in the age of the internet: the rise, for instance, of the amateur philosophers flooding the net with their assurances that difficult questions can be levelled with the easy adoption of principles advanced by some obscure book or website rant. And of course there’s the host of “philosophy made easy” sites and regurgitated essays-for-sale, similar to the CliffsNotes that took the place of actual study in English for students long ago. Except now there are no standards by which the use of such sites can be held accountable. Back in the day, a teacher could call out a student for plagiarism or for expounding bad ideas. Now plagiarism is hard to recognize — there’s so much of it — and criticism of bad ideas risks “triggering” a sensitive student.
Again, similar trends are sweeping across the Humanities spectrum. I suspect it is somewhat different in the practical Humanities like art and music, because there are real jobs to be had outside of academia in these disciplines. After all, get a degree in music, and one could get a job in an orchestra — or one could skip school entirely and join a band, hire a good manager, work playing studio sessions — well, that’s part of the problem. There are trends and phenomena beyond the ivied walls of the academy that academics must play against, whether they like it or not. Speaking of music, think of all the “viral” music “stars” that acquired their audiences (and their careers) thanks to YouTube. And if getting a job is really what the disciplined study of music comes down to — its “bottom line,” so to speak — then what was its core mission to begin with? This is what’s been lost.
The only Humanities discipline that still seems to hold onto a core mission, and it’s a study that many in it would prefer be considered a human science, is History. The reason for this is worth considering, if rather odd, because it is so simple (and simplistic). However one approaches the study of the past and its artifacts, whatever perspective through which one wishes to interpret the past, in order to develop a credible narrative concerning it, one is committed to the study of history. In other words, the goal generates the motivation and determines the resources with which the practice must work, in a way unseen in other disciplines. One cannot study the Reformation by simply looking at Queen Elizabeth and recognizing she is the head of the Church of England. “Oh, now we understand Protestantism!” There’s no denying that such remarks can be found on the internet; but these are rather like beans spat at a tank. Those who really want a deeper understanding of Protestantism as a social phenomenon — or more generally, of the place of Christianity in the world today — will have to study the history of the Reformation and its lasting effects. And once committed to that, one is committed to the study of history whether one likes it or not. Otherwise, one might, as all too many do, simply forego any understanding of the world deeper than a tweet or a post on Facebook. One can study history outside of the academy — and I wish more would — but one cannot study History in the academy without studying history. QED. Of course, there will be bad scholars, poor teachers, useless research publications, even repetitious research vaguely renewed through clever jargon. I fear this is all in the nature of the academy. And there will be ideologically driven narratives and controversies; that’s just in the nature of the world we live in currently. But the core mission of History just is the study of history, and that’s actually something one cannot say of English or Philosophy.
Which is why, some time ago, while weighing one of the endless conflicts between Analytic Philosophy and Phenomenology (or as it’s often called, “Continental Philosophy”), I realized that both of these schools of thought were pretty much exhausted; that no new ideas were getting developed, but a whole host of old ideas were getting regurgitated in their inevitable (and predictable) permutations; that this very regurgitation suggested that no new ideas would be developing for some time to come (since the discipline was stuck in ‘rinse-repeat’ mode on both sides of the Atlantic Ocean); and that given this, the immediate future of Philosophy would actually be a renewed study of the History of Philosophy, a preservation through narratives of different perspectives in the human endeavor to find wisdom, which is really only a sense of security that “I know what I know.” And this study would not be elaborated through research-based publication (for we surely saw enough “History of Philosophy” texts published in the 20th Century), and what does publication really amount to in a web-based post-literate society anyway? No, the principal practice of this study would be teaching and shared conversation between those trained to it and those who really want to learn it.
Well, that would be my suggestion; but of course I’m no longer in the academy, so no one there is going to pay attention to it. That’s frustrating, not because I feel ignored (because, really, what do I care?), but because no such suggestion is going to change the awful inertia of the academy’s chosen processes of self-mutilation and enervated uselessness. When outsiders ask for the core missions of the Humanities, the usual responses are self-righteous bluster about the preservation of the values of Western Civilization; or equally self-righteous bluster about the need to transform society; or middle-of-the-road pap about literacy and clear thinking, citizenship and well-rounded personhood for students, blah blah blah. Hundreds of millions in financing (and student loans) to make students feel more comfortable attending the opera? Wiser in their selection of politicians? Healthier social relationships in their leisure hours? Of course not; the structure of most colleges and universities was determined long ago: they are research facilities — publication mills — not teaching institutions. University teaching was conceived as a kind of noblesse oblige gift to potential future colleagues. Most students figure that out by their junior year. In their first two years they try to decide what career they’d like, to be manifested in their major; by junior year, now caught in the net, they worry about what job they can get with their earned degree. Once they graduate, jobs are scarce and the joke’s not funny anymore. For many, it actually gets worse if they think they can salvage their personal and intellectual integrity by entering graduate school. 
For the lucky few, however, the graduate degree may get them a niche in the academy itself, where they can be profoundly bored with administrative nonsense; or profoundly boring if they cannot find new interests to pursue in their chosen discipline; or fascinated by some exciting controversy that also generates new jargon with which to achieve publication. Until they are at last ready to retire. And I have never met any of my former professors who, having retired, were sorry they did.
But that doesn’t mean that going to college is entirely a waste of time. College is a great place to get drunk and have sex. There’s also sport, and the fun of political activism. Musicians form rock bands, fratboys play Risk, and debates about celebrities or even about “perennial” questions can carry on ’til the early morning hours. And for students who really wish to learn, who wish exposure to new ideas and difficult-to-acquire facts and theories, there’s no better environment. Graduate school is even better for that. During my years earning the doctorate, I could spend all day in the library, reading texts from the archives, with which I would never have become familiar otherwise. And I actually did have a number of excellent teaching professors who would challenge me and demand greater clarity of thought, pushing and prodding me to develop skills in research, writing, and critical thinking. (I was also lucky, living on a stipend; I took my degree without any student loans, more than I can say for some friends of mine.) None of this should be laughed at or denigrated. There are worse wastes of one’s time than reading Aquinas or Hegel, Joyce or Laurence Sterne. One could join an obscure church or become hooked on a conspiracy theory, fretting away the hours worried about saving souls or nations; at the end of that day, people and world look pretty much the same as they ever have. I’d rather read a good book.
These days, I generally only read histories and mystery novels. The only “classic” literature I return to is Shakespeare, Austen, Whitman, Melville, and Twain. I just no longer find that the old voices sing to me the way they once did, and I attribute that to my age and to my jaded sensibilities. I still gnaw on Kant and Hegel, Dewey, Wittgenstein, and Heidegger, on occasion; but I confess I no longer have the sense that I once had of discovering new worlds or new perspectives on the current world. To most philosophic questions that I once found puzzling, I have discovered answers that satisfy me, and of those that remain, I have given them up as probably unanswerable.
But I confess that my days attending college and university, while lying in a now-distant past, loaded with disappointments and failed expectations, remain the happiest and most fulfilling of my life and in some ways, the most meaningful. It is just in the nature of things that life, which begins as a run across an open field in sunlight, inevitably ends as a meditation by a still pool in a dark forest. The academy was once that open field, for me at least; now, by many reports it’s become a dark forest, and the still pool comes alive only with the breaking of swamp gas to the stagnant surface. If true, then we really have lost something from this culture; and it’s doubtful that we can ever get it back.
“Often from a word or a surviving image I could recognize what the work had been. When I found, in time, other copies of those books, I studied them with love, as if destiny had left me this bequest, as if having identified the destroyed copy were a clear sign from heaven that said to me: Tolle et lege. At the end of my patient reconstruction, I had before me a kind of lesser library, a symbol of the greater, vanished one: a library made up of fragments, quotations, unfinished sentences, amputated stumps of books. The more I reread this list the more I am convinced it is the result of chance and contains no message. (…) stat rosa pristina nomine, nomina nuda tenemus.”
–Umberto Eco, The Name of the Rose

The Two Christianities and Their Problems

E. John Winner

“You cannot petition the Lord with prayer!” – Jim Morrison

I admit my language here is somewhat over-broad and imprecise; I am merely trying to raise a point or two which I think worthy of consideration in these dark political times.

Religion and politics
All religions are totalitarian, and all religions are authoritarian, although this latter quality is enhanced in theistic religions, since of course their founding premise is that there is a super being that literally authored the universe in the first instance and authored the religion in the next, presumably to correct something that went wrong in the initial offering; and of course the super being (because he is super) can’t possibly be held accountable for anything that goes wrong in his creation. So, when something does go wrong it must be the fault of the humans who are assigned caretakers of at least their part in creation. However, in order to hold the humans accountable, this super being cannot simply snuff them out and start anew – that would essentially admit that the creation of the humans was itself a mistake on the part of the super being, and a super being cannot make mistakes. So the humans have to be given a good lashing and made to feel guilty; but to remind them of the super being’s power, his majesty, how good he is in having created them and not snuffing them out, the unquestionable author also generates the fundamental principles of his creation’s religion. These fundamental principles structure the totalitarian control over their lives. They tell you what to do, what to think (“good thoughts”), what not to think (“the bad thoughts”), they tell you how to get up in the morning, what time to get up in the morning, when to go to bed, how to go to bed (‘always say your prayers before you get into it’), they tell you what to do with your hands, how to wash, how to speak, how and when to light a candle, or not. And of course, what to do with your genitals.
They tell you what to do with your children, what to tell your children, what your children are to do with you; and then, there’s all those foods you’re supposed to eat, foods you can’t eat (or feel terribly guilty about eating), and foods you’re to get violently angry about when you see others eat them. Commandments from cradle to grave: that is the very essence of the religious life. This is certainly totalitarianism, the very definition of it. All attempts at modernist fascist or communistic totalitarian states (however supposedly secular) take this as their model for imitation.

A choice of totalitarianisms?
I am going to discuss the two basic interpretations of Christianity being promulgated in the United States today (with considerable historiography behind them). I do not mean the ages-old distinction between Roman Catholicism and Protestant congregations, although that distinction (and its history) certainly figures into the discussion. Part of the shape of American Christianity certainly has to do with the resolutions (or failures thereof) to the violent confrontation between Catholicism and Protestantism beginning some 500 years ago. As I like to point out to those suffering from myopia of history, the Reformation never really ended. (If one thinks it did, vacation in Belfast.) But in the current political constellation of religious forces in America, strange alliances have formed, and odd distinctions need to be attended to, especially since conservative Catholics have made a ‘strange bedfellows’ alliance with right-wing Protestants; both seem to be struggling to achieve some kind of theocracy, without clear awareness that such efforts, if successful, would necessitate legal – and possibly violent – confrontations between them (again). Liberal Catholics and liberal Protestants have come to essential consensus on certain matters as well, although in very different ways and with very different potential consequences. Nonetheless, it would seem, just from this thumbnail sketch, that it would be well to determine the real distinctions between conservative and right-wing Christianity on the one hand, and the liberal or progressive forms on the other.
They aren’t what they may seem to be, given that we are indoctrinated in the most charitable readings of any Christianity in a nation that considers itself heir to Christian history, and where all politicians, and many educators and bureaucrats as well, are effectively forced by peer-pressure (exerted through the media) to profess some such Christian faith, even to the point of imprinting it on our currency and our Pledge of Allegiance – the articulation of a nascent totalitarian “Americanism” if there ever was such. No; let us go down to the base assumptions, in their most primitive forms, upon which these Christianities are founded.

The Schizoid God
All Christianities circle around a supposedly historical person (despite the extremely scant evidence for his possible existence) known as Jesus of Nazareth, often called “the Christ” or even “Messiah” per the (mis)appropriated Jewish history texts – or, as I like to refer to him, that suicidal Jewish carpenter’s son. (Not to be confused with Karen Carpenter, who starved herself to death back in the ’70s.) Like Herschel Walker, the wholly unfit Republican candidate for the US Senate, Jesus suffers from Multiple Personality Disorder; or, in its classic common-language formulation, he has two separate identities. Those who believe that he is a god or at least an avatar of such won’t see that in him, because to be a Christian of any kind is to have faith in one or the other of his personalities, but not both (which will become clearer as we go along). Indeed, much of Christian apologetics exists to “explain” away the differences between the two. But these differences make up the tension between differing factions of Christianity.

In the spirit of contemporary American media, determined to reduce all serious disagreements to the level of a color-coded horse race, we will designate one Jesus “Jesus Red” and the other “Jesus Blue.” Jesus Red is the right-wing Jesus whose current avatar is failed reality-TV star Donald Trump, while that of the more liberal Jesus Blue is the kinder, gentler Joseph Biden. I write that with tongue in cheek, but the distinction is not without importance, since Jesus Red believers in the US have lately committed themselves to the rise of an authoritarian theocracy, while Jesus Blue believers tend to favor a representative democracy and a culture of personal responsibility and individual liberty. These remarks are not to be taken as absolutes, but, after all, this is the world of horse-race politics and troll-driven media.

Jesus Red
The god of the pre-Talmudic Jewish law, and of the history books of what Christians refer to as the “Old Testament” (a term that loosely implies that the Jews somehow don’t quite “get” their own religion, or they would soon accept the so-called “New Testament” as the article of faith between humans and the Divine), is essentially egomaniacal (“jealous,” he says), judgmental, violent, cruel, murderous (even genocidally so). He hates sexual variance, dislikes women, and is not even particularly charitable to his own “chosen people.” Remember, he begins the long narrative of the earthly relationship with them by damning them to labor, sweat, hardship and hunger, then decides to spend the remainder of the relationship putting them through a series of morally questionable “tests of faith” (really, can any gesture be less moral than what he does to Abraham vis-à-vis his son?), makes demands of them (some quite impossible to fulfill), and then snubs them whenever they beg for aid or face real calamity. During this period, he went by the name of Yahweh, although he also liked to be called Alpha and Omega, He Who Must Be Obeyed, and Burning Bush (although he dropped this since it sounded too much like the name of a tavern). My understanding of Jesus Red belief is that Jesus Red is essentially Yahweh; the implication is that for some reason he lost interest in the Jews, the tribal “Chosen People” to whom he first appeared, and decided he would expand the requirements for joining his worshipful club. But only in the sense that there is no longer any ethnic requirement for membership. Ethnicity would be reintroduced when later European sects encountered non-Europeans. But even after that, to be among the “Chosen People” is now simply a matter of the divinity’s rather random choice of “graced” or “saved” – in short, whim.
Then one gets access to all the mercy and forgiveness and close intimacy with the divine the Gospels seem to promise – which are reserved to believers, and really only to the “saved.” These “saved” are usually found at the top of the social pyramid, and this position is often itself considered a sign of their salvation. But what about the rest of us? We’re supposed to follow the moral commands anyway – after all, this is a totalitarianism – and if we do so, we may live contented lives, and if we don’t, we must be punished – in this life as well as the next, if there is one. (Actually, for those not saved, extinction is the best we can hope for.)

Jesus Blue
Jesus Blue is still Yahweh, but a Yahweh who has decided to forgive his wayward human creations. His apparent suicide is an act of self-sacrifice, presumably to spare humans having to sacrifice their own lives through overly rigid adherence to the by-then outdated Jewish laws. Again the Jews are effectively held as somehow failing to fulfill their “Chosen” status, and once again membership in the “chosen club” – the “saved” – is opened to people of no fixed ethnicity. But Jesus Blue is an ambitious savior – we are taught by Jesus Blue proselytizers that his salvation is no longer intended just for the few, but for all those who “come to him,” seeking to be saved and submitting their will to his worship. Actually, anyone familiar with much of Christian theology, early as well as late, knows that isn’t really true. Jesus Blue chooses his “saved” as whimsically as Jesus Red. Yet he certainly has greater compassion, greater mercy for the lost lambs of his believing flock – and even for non-believers who might still be salvageable in the afterlife. The way Jesus Blue theorists handle this issue is to assert the inscrutable nature of the divinity’s mind. There is no earthly sign that one is “chosen.” Thus, regardless of social status, we will never know until the afterlife whether we, or anyone else, are truly “saved.” (Of course recognized saints have a leg up, but their recognition has to be granted by some institution.) Thus we should all act as though we are saved, and as though fellow believers are saved, and as if non-believers could all be saved. This raises certain interesting political and social possibilities. It suggests, for instance, that we show others greater tolerance for their following their own conscience. It suggests maintaining a still essentially totalitarian ideology within a secular liberal state. But that’s not a necessity – Jesus Blue believers can be just as strident about their concerns as Jesus Red believers.
But those concerns will be different from those of Jesus Red believers. I think this has to do with the presumed sources of authority to which the differing faiths hold. I don’t mean the agent but the agency – the means by which the will of the divine is to be ultimately interpreted.

Jesus Red believers derive their sense of divine authority from a book, and from proper interpretation of this book. Granted, Jesus Blue believers read the same book, and assume that it is the guide to the ultimate reality in which they believe. But for Jesus Blue believers the interpretation of the book is far more flexible, far more open to tropological variance, far more dependent on the individual; to the point that Jesus Blue believers have come to assume that it is the sense of the interpretation from which they derive authority. This sense is indeed a feeling, and the strength of it assures them that they have spirit, that they are spirits. The “spirit” thus grants the authority of their belief.

Politics without Jesus
I've been struggling to find a solid resolution to these thoughts the past few days, but it hasn't been easy. First, take "synthesis" right off the table. There is no way to find common ground between these two opposed religious perspectives on the divine. Because religious language tends to be vague, generalizing, broadly sweeping – as is to be expected from totalitarian ideologies – the signs, symbols, and terminology of Christianity swirl all around us as interchangeable tokens, seeming to come together, then drifting apart – or torn apart for individual or collective interests. Are we the conservators of the planet given us by the Almighty? Or was it given to be enjoyed to the fullest, and damn the consequences (since it all gets destroyed on Judgment Day)? Although we debate the science of climate change and resource management (or destruction), sadly this question will likely decide how most Americans are moved to support whatever attitude we take to the issues. The Jesus Red team claims they are the true conservators, since the enjoyment of the planet's exhaustion cannot be fulfilled without them, while the Jesus Blue team insists the planet cannot be enjoyed unless it is managed wisely and preserved. Well, what if the planet is not a gift from god? What if it is just this hunk of rock whirling through space, and our lives here were entirely accidental? What if the issues had nothing to do with whether we were following the Almighty's commands, or fulfilling the ideals of our "spirits"?

All Western fascisms have a Jesus Red component to their nationalist ideology. Obviously, that "messiah" has ready justifications for violence and domination. One way we know that Putin, Soviet apologist though he is, is a fascist is that he has let the Orthodox Church write laws on social issues, such as the criminalization of homosexuality; just as we know that his American fans have taken their cue from this regression to end women's rights to their own reproductive health, and will, no doubt, end same sex marriage nationally and resurrect anti-sodomy (i.e., anti-gay) laws state by state. 'God wills it!' (And no fascism is as strenuously violent as Islamic fascisms have proven – but that is a different discussion. Fascists lie in bed with intolerant gods and call that salvation – 'or else!')

Obviously, I am rather more sympathetic to Team Jesus Blue! And Jesus Blue believers have done wonderful work alleviating misery and righting wrongs and increasing the availability of justice for all. No one who has listened to Martin Luther King's "I Have a Dream" speech, or has read his "Letter From Birmingham Jail," could doubt that for a moment. Nonetheless, Jesus Blue does not have an unmixed legacy. The cause of abolition was just, but John Brown's massacre of the farmers at Pottawatomie Creek was just murder. There were a lot of Jesus Blue hippies in the '60s, engaged in occasionally questionable (and certainly rather childish) political behavior – which they never outgrew, to my mind. Because Jesus Blue authority derives from the "soul," one believing in it can get very smug very quickly. Which tends to turn off some voters not so convinced of that form of salvation. I have long noted that aggressive and insistent trans-activism and other left identitarian politics suffer from a kind of Neo-Platonism, discernible in their insistence that identities are determined solely within, having little to do with the material being that is the body. Neo-Platonism would appear to be an old and rather dead cultic belief from ancient Greece. Except it's not – it's a vital component of Jesus Blue ideology; it just doesn't get referred back to its originator, Plotinus. And this sort of thing causes all kinds of problems socially and politically.

So I can't really recommend either Jesus as a welcome figure on the political scene, at least not here in America. I'm afraid I'll have to remain an anti-Christ. I don't think anyone ought to believe in either Jesus; or if they choose to do so, keep it as far away from public discourse as possible! Profess your faith to your own mirror in the bathroom – read the Gospel while taking a crap. Sing hymns in the shower. If you want to pray, find the nearest closet, lock yourself up in the dark, and have at it. Keep your fucking god out of my public market.

Both Jesus adherents tend to whine about their victimization by others. The feeling of righteous victimhood is part and parcel of all Christian ideology; the whole religion begins as the celebration of the victimization of the idiot carpenter's son who couldn't get a good lawyer in a Roman court. The mythology in the Abrahamic religions seems to work like this: If you're born Jewish, you're guilty of something; if you accept Islam, it's because you suspect somebody else of being guilty; and if you're raised a Christian, you learn that somebody or something has made a victim of you. Damned Adam, anyway! – and yes, he is. We all are.

So right now – as I write this shortly before the Congressional elections of 2022 – one of my complaints is that all I hear, from my wishy-washy, pusillanimous friends in the Democratic Party almost as much as from my delusional, near-psychotic enemies in the Trump-fascist Republican Party, is whining about their victimhood – from getting profiled due to color to living in the same neighborhood as people who own guns; from paying a few cents more at the gas station to being “replaced” by immigrants.

Well, maybe we should be replaced – maybe we should all be replaced. Or maybe we can free ourselves from Jesus and the sense of victimhood, and from the urges toward totalitarian authoritarianism these seem to inspire. Because soon these two Jesus factions may be in violent confrontation (again); and it may be that we will have to crucify Jesus (again), if we are to survive as a nation.

Sometimes one has to drive the stake through two hearts before the undead go to the grave a final time.


“Killest us, lest we die from Thee:”

Notes on the Guilt of God
E. John Winner

Prologue: The sense of guilt, allied as it is with such other responses as shame and remorse, is an important emotional response to our own behavior in a given moral context, one that aids in controlling future behaviors in similar contexts. Assuming that we ought to have a personal relationship with a divine being who both demands the best behavior from us and also wishes us well, and who is all-knowing and all-powerful to boot, it would seem reasonable that such a being ought to have a clear and complete understanding of the human experience of this sense of guilt, especially since it most often arises when our behaviors fail to meet his demands. But such an understanding could never be complete, since the sense of guilt is a subjective experience, and the divine being – let us call such God – is also all good and consequently could never experience the feeling of guilt. Lacking such an understanding, God's knowledge cannot be absolute; and were He to experience such a feeling, His own behavior must have failed to meet his own demands. Not so almighty after all….

  1. Let us begin here by reminding ourselves that the feeling of guilt is not itself guilt. "Guilt" identifies a state of being, an objectively verifiable condition of having erred according to an accepted standard. The primary usages of the term in our culture relate to religious morality and to secular law. In the realm of secular law, it should be noted, guilt may, but does not always or necessarily, evoke within the transgressor a guilty feeling. We certainly want certain transgressors to feel it, and judges will rhetorically lay it on thick in pronouncing sentence, reminding the convicted transgressor of his or her heinous crime, and how thoroughly despicable a human being they've become by causing injury or death or other harm. However, we should remember that the scofflaw fined a hundred bucks for a traffic infraction – say, speeding – is deemed just as guilty of her crime as the serial killer is of his. The degree of transgression or its harm has nothing to do with any degree of guilt before the law, because there is none. (At least per criminal law; the matter is somewhat different in courts of equity, but that involves a different discussion.) Indeed, we measure out degrees in a number of transgressions, such as murder – second degree manslaughter, first degree homicide, etc.; but if one is judged guilty of the crime, regardless of its degree, the guilt is held absolute. We allow mitigating factors – emotional or mental states, planned transgression vs. impulsive acting out, ability to make proper judgments impaired by age or, say, inebriation. But while all this may bear upon the charges brought, the mercy of a jury or of a judge, the sentence enacted, it doesn't change the basic fact of the matter: the transgressor transgressed – the transgressor is thus guilty. But does the weight of this fact necessarily impel a feeling of guilt in the mind of the transgressor?
I’ve gotten three speeding tickets in my life; I confess I never ‘felt guilty’ after any one of them. (I did regret getting caught.)
  2. When we move from the realm of the secular to that of religion, the problems of guilt and the feeling of guilt get considerably more complicated. Let's consider a concrete example. In Augustine's Confessions, Book II, Chapter 3, Christianity's master theologian discusses a return home from school in his sixteenth year. [1] He spends several paragraphs bemoaning his guilt for engaging in what can only be sexual behavior. It begins when his father witnesses what is probably Augustine's erection at the baths. It should be noted that some Roman baths were not much different from the lower Manhattan baths of the 1970s – basically, gay playgrounds. In Rome, bisexuality among men was not only legal, it was expected. I have always suspected that Augustine's first sexual experiences were homosexual. This would explain why Augustine's language in these paragraphs is both flowery and obfuscatory: Augustine is torn with feelings not only of guilt but of shame. Notably, Augustine deploys the idea of shame against itself – shame is the motivation which leads him to accept the peer-pressure of his friends; peer-pressure shames him into shaming himself. Augustine then redirects the discussion into a somewhat oblique report of an argument between his parents concerning his possible future marriage.

The next chapter opens with his famous narrative about how he and his friends stole pears from a neighbor's tree, just for the fun of it. By comparison with his discussion of his sexual experiences, the narration is clear, distinct, and covers a single paragraph, although it does open the door to a general discussion about the temptations of the material world. It begins: "Theft is punished by Thy law, O Lord, and the law written in the hearts of men, which iniquity itself effaces not." That last clause refers to the fact that thieves themselves do not like other thieves to steal from them – there appears to be no "Categorical Imperative" for thieves. But more to our point, there is no remark here concerning secular law. And there should be. Male bisexuality was legal in Roman culture, but theft certainly was not. Yet Augustine exhibits less shame for the theft than he does for the sexual experiences. That is because his standard of reference is a moral code laid out in a sacred book; secular law be damned.

  3. Most theistic religions adhere to some sort of either Divine Command Morality or Moral Realism (they are not identical), or some mixture of the two. So we first note that the standard of behavior that the guilty are held to transgress is absolute. Often this means that the believer must juggle priorities, weighing which transgressions may be engaged against those standards that ought never be transgressed. The popular example of this is that of the Gestapo knocking at the door in search of hidden Jews. For most believers (I would hope), lying to the Gestapo is a lesser transgression than handing over the hidden Jew, which would amount to accessory to murder. If this occurs in 1930s Germany, then lying to the Gestapo would be breaking the law, while handing over the Jew would be law-abiding; but here we can note that discussing guilt in a religious context has nothing to do with matters of secular law. (This is actually an important point that has caused all sorts of havoc with practical politics and legislation, which unfortunately we cannot go into very deeply here.) For one thing, there are in religion, unlike in secular law, degrees of guilt. Lying to the Gestapo, one is not terribly guilty, especially since a life has been saved thereby. A quick "Hail Mary" will absolve one. Lying to a priest in a confession booth, on the other hand, is a profound betrayal of faith, since speaking to the priest is supposedly an indirect communication with the divine Himself. Yet the act has the same status: it is intentionally telling an untruth, regardless of motivation. Ah, but that's the point – in religion, motivations matter. This has formed part of the subject of intense psychological analysis throughout the history of Christian theology. (Here, let me admit that I am largely discussing Christian theology, since unraveling similar issues in other theistic religions would take us far afield; although it may help to reference what little I know of them.)
We see it in such questions as: How guilty can one be if one masturbates at an age too early to know how dreadful a sin this really is? What level of guilt can one assign to heathens who have never encountered the true faith? What "works," or how many acts of repentance, are needed to acquire redemption from venial versus mortal sins? And don't think I am getting frivolous – this has practical application when applied by a priest determining whether repentance requires two or twenty Our Fathers; or how King Henry II must humble himself for having incited the murder of Thomas à Becket.

But the matter is further complicated for most mainstream Christian sects – certainly the Catholic, the Calvinist, the Lutheran – by the fundamental cosmogenesis of their moral universe: the so-called "Fall of Man." Humans (all humans, just as human) are born in a condition of guilt. They didn't have to do anything at all. Adam and Eve did that, and so predetermined human nature, not only as sinful but as sinning, right from birth. One of the more important debates, around the time of Augustine, was whether baptism at birth was enough to save an infant, or whether the soul of an infant dying shortly after birth without baptism would be sent directly to Hell. Augustine held that it would. After all, God's foreknowledge effectively sealed the fate of the infant's soul – it was created to burn in hell. God does not love individuals. God loves a species, "Man," from which he chooses individuals for special grace, alleviating them from the damnation their birthright assures them. Exactly how He makes his choice of individuals is unknown; presumably His absolute knowledge includes those He wishes to worship him for the duration of His existence. (I was going to write "throughout eternity"; but as Augustine points out, "eternity" is a measure of time, and God exists outside of time; and indeed even "eternity" eventually comes to an end.) However, it is possible, and acceptable to theologians, that God chooses completely by whim. Being all powerful, he can do, and will do, anything he chooses. The Islamic philosopher al-Ghazali argued that it is undeniably possible that God re-creates the entire universe every second. You think you have been reading this article for the past ten minutes, but really that very thought was created only a moment ago, along with all your accompanying memories. Al-Ghazali was not engaging in a thought experiment: he realized that this is what "all powerful" actually means.
A good deal of our understanding of the divine in the West has been stabilized, but to some extent trivialized, by the latent presumption that there is some sort of contract between Man and God, that God doesn’t exercise all the power He can with us because of some sort of agreement with Moses and his inheritors. But an all powerful being can make and break contracts as He pleases. The relationship between Man and God forms a kind of cosmic psycho-drama – but it is scripted by God for His own entertainment.

We should remark a problem that lingers: Plato's famous Euthyphro dilemma. Is God good because he maintains a standard of Good, separable conceptually from His being? Or is the Good good because God wills it? I'm not going into this deeply. I only note that the mainstream theologies of the Abrahamic religions in the West hold to the latter view. Many assume that the former view is the more reasonable. But despite the fact that more liberal religious congregations do tend to lean this way, their theological core remains a trust in God's will. Without God's will, there is no grace; without grace, no salvation.

  4. Of course Christianity enjoys a further complication not shared with Judaism or Islam: the Christ. The Christ is held to be the Messiah seemingly promised by earlier Judaic texts; as such he is at once human and divine, fulfillment of the Law of the so-called "Old Testament" and the Law itself; God as self-sacrifice in order to redeem the whole of the Human race. Put in such all-encompassing terms, it's easy to see why Rice and Webber deemed him the world's first "Superstar." Yet each of these terms, taken in detail, conflicts with other terms in the same set. If Christ has redeemed the whole of Humankind, why bother with baptism? Who now is not born saved? By designing His own Crucifixion ahead of time, hasn't He engaged in a kind of symbolic suicide – can He still be God after that? And it is wrong to say that Jesus is completely free of sin: "Eloi Eloi lama sabachthani?!" is a cry of Despair, which is the inverse of the sin of Pride – the Original Sin, the rebellion of eating the Forbidden Fruit. [2] Jesus then submits to God's will, thus achieving repentance and redemption at once. Jesus thus becomes the New Adam, through whom Paradise can be regained. But then why does the cosmic psycho-drama continue? Why, if the Law has been fulfilled, is there still the Law?
  5. There is something else that Jesus brings to the stage of world history, embodied primarily in the Sermon on the Mount: the insistence on personal responsibility, the assumption that the Good, the Right, the Moral, can only be realized by the individual. When Jesus provides directions for proper behavior, he is more often addressing individuals rather than whole communities; even when he addresses a collective, he is speaking to the individual: "Let he who is without sin cast the first stone." This is arguably Jesus' greatest contribution to the otherwise loose assortment of societies and cultures we refer to as Western Civilization. But by emphasizing the individual's moral responsibility – well beyond the demands of adherence to community law – he generated a fissure between the individual and the community that the institutionalization of his following into a church only partly and temporarily healed. Because the Law – the Moral Reality – is now unhinged from community expectations, its definition requires individual interpretation collectively agreed within the institution. Where there is serious disagreement, the individual is on his or her own. Some standard must be found by which the individual can assert responsibility in opposition to the community. This standard may be determined through reason, but it may simply be a feeling, an emotional response. "Here I stand," proclaims Luther, "I can do no other!" But why not? Surely the Brothers of his monastic order could. But despite his arguments, many of his 95 Theses are pure assertion – he feels them, he feels impelled to hold to them in the face of institutional threat. 'I feel this is right; I would feel guilty for not doing it.' And after Augustine, Luther is one of the 'guiltiest' of moral theologians. The history of the Reformation begins here, but so does the Modernity that arises in response to it. Each feels the guilt, but each feels his or her own way to redemption.
  6. Evil exists because God wills it. It therefore serves a purpose of his desire in the grand psycho-drama of the relationship between Himself and Man. It is therefore merely an inversion of Good, meant to incite Man to greater worship of the divine. (It may also help determine who will receive grace and who will be damned, but theologians disagree as to whether this matters.)
  7. As noted, that God knows the feeling of guilt means precisely that he doesn't need to experience it. Yet we have also seen that it is possible for God to have chosen to sin and thus to feel guilt, as one interpretation of the Passion on the Cross would have it. But of course he did this performatively rather than with any emotional investment. There was a part to play in the drama He Himself had scripted. He played the part. The believer thanks Him and does not question.
  8. Christian theologians have long made the distinction between Divine love and Human love – between Agape and Eros, in the modern resurrection of Greek terminology. [3] This distinction is also found in certain Sufi texts and Jewish texts (Buber's "I and Thou"). Eros, human love, does require embodiment; but Agape is a condition of wholly giving one's self up to the other. Yet as I have suggested, God's 'love' is for a species He has created; even should He engage in Agape, He probably would not do so for any particular individual. The Abrahamic God is a narcissist – He loves Himself.

Epilogue: It is discussions like this that make me glad I'm a Buddhist. Having suffered all that can be suffered as a boy raised a Catholic, I haven't felt the kind of 'guilt' discussed above in the thirty years since being persuaded of the Four Noble Truths and their Eight Fold Path. The Dhammapada, attributed to the Buddha's own authorship (but probably a later codification), is one of the most rigorous (one might even say rigid) ethical texts in world literature; but failing to live up to its injunctions evokes not a sense of guilt, only one of disappointment – rather like an addict recovering from a relapse. That's because the world is filled with disappointment. Moral Realist demands, and their enforcement through the feeling of guilt, come close to sado-masochism. That is why we ought to establish secular laws and their secular enforcement. The secular state exists to deny any force to Divine Command Morality or any Moral Realism. We don't know what is Right and Wrong in any absolute sense. We can know, and do know, what it is we agree to, in order to live peacefully with others within a pluralistic society. Thus I leave the last word to one of the great philosophers of the modern liberal state, John Stuart Mill:

“I will call no being good, who is not what I mean when I apply that epithet to my fellow-creatures; and if such a being can sentence me to hell for not so calling him, to hell I will go.” [4]

[1] E. B. Pusey, trans.; https://faculty.georgetown.edu/jod/augustine/Pusey/book02; the quote in the title of the present article is from here.
[2] See, for instance, Bryan Threlkeld, “A Cry of Dereliction;” https://faithalone.org/grace-in-focus-articles/eloi-eloi-lama-sabachthani/
[3] https://en.wikipedia.org/wiki/Agape_and_Eros
[4] Mill, An Examination of Sir William Hamilton's Philosophy

Death and the Icon


E. John Winner

Let us begin with a technical clarification, by way of a discussion of the semiotic status of icons. An icon is a powerful representation, not only of an object or a person, but of the concept of that object or person; a reminder of what it is the perceiver of the icon may wish to know or learn from that object, that person, that idea.
"An Icon is a sign which refers to the Object that it denotes merely by virtue of characters of its own, and which it possesses, just the same, whether such Object actually exists or not. (…) Anything whatever, be it quality, existent individual, or law, is an Icon of anything, in so far as it is like that thing and used as a sign of it." [1]
An icon communicates information about what it signifies, not by pointing to it, nor by elaborating the concept of it, nor by referring to it, but by “showing” it; by assuming the likeness of the signified when the signified itself is not present. Thus, obviously a statue – even of a mythological personage – can be the icon of that personage, if it is carved to give us all the likeness of the character expected of the personage. Thus we expect a statue of Hercules to have well developed muscles, reminding us of his strength. But the kind and amount of information an icon can convey can actually vary quite a bit. An aerial photograph of a road would certainly be iconic of the road, but a map drawn on the basis of the photograph would be more so, because it could include greater details, for instance measurement, roadside attractions, etc. Unfortunately, icons are not univocal. They can represent exactly what we don’t want from their signified.
"It may be questioned whether all icons are likenesses or not. For example, if a drunken man is exhibited to show, by contrast, the excellence of temperance, this is certainly an icon, but whether it is a likeness or not may be doubted. The question seems somewhat trivial." [2]
We want Peirce to have said that the drunken man is made an icon of drunkenness, or of drunken men per se. But that is not what he is saying. Rather, he is contemplating the way in which the mere presence of a drunken man is iconic representation "by contrast" of what the temperate do not want from drunkenness. A sign may be iconic exactly in contrast to an object that is different from any form it may immediately represent. To put it simply: the drunken man stands in iconic re-presentation for "temperance" (continued non-drunkenness), because whatever he is or presents will not be found among the temperate. He is not the picture. He's the frame.
The sign can be an icon when it frames the interpretant with another interpretant. The interpretant of the drunken man includes his being drunk; the interpretant of temperance includes not being drunk. But there is no not-being-drunk unless there is also the possibility of being drunk. More technically: The definition of the class “drunken humans” stands as limit to a definition of all that is not to be defined as “drunken” but still defined as human, which satisfies the definition of the class “temperate humans.” Peirce called this trivial. It is silly because it is so obvious.
What is not obvious is the way this transforms the drunken man into an icon for temperance. Consider a person looking at a drunken man: at his messed hair and watery eyes with dilated, unfocused pupils; his reddened, runny nose; the cut on his cheek from trying to shave with an unsteady hand; the bruise on his chin from when he stumbled and fell; the dried saliva at the corners of his lips; his soiled clothing and inability to stand straight; and then, of course, the stench of stale drink and of urine.
The temperate person can see in this man all the signs of drunkenness as they look to an outsider, and might imagine what it must be like to live among others in such a manner: the insults; the unwanted pity and condescension; the incessant nagging of temperance missionaries. All of this is present in such a projection, except the actual experience of being drunk. It is the man himself who stands as an icon for his own condition. But it’s more than that. By inversion, if I do not want to be a drunken man, then he stands iconically for my condition of temperance. “That,” I say to myself, looking at the man (and smelling him), “is what I do not want. I want what he hasn’t got.”
So now we see that an icon can really prove a difficult signifier to interpret. Since it is representation in likeness of its signified, it raises all the questions we might have of the signified, including whatever ideas or responses we could have of it. It can even represent, by contrast, the signified’s opposite.
Think of the vampire; or more specifically, the mighty "king of vampires" of literary legend, Count Dracula. The first thing we notice about him is that he is a Count, an aristocrat, an ancient nobleman, and has been for some 500 years. He has control of a part of Transylvania, his home country, living in a huge, highly decorated castle commanding the farmlands below it. We have to remember that he's a feudal lord; the farmers and peasants who live around his castle are beholden to him in one way or another, which explains why they allow him to feed on their livestock. Occasionally, they even allow him to steal away their daughters, something feudal lords did quite frequently during the Middle Ages. He has aristocratic tastes of a sort, and with aristocratic self-assurance he expects respect. Really, given the power and authority he seems to represent, who wouldn't want to be such a nobleman and aristocrat? Of course, the 19th century is not the Middle Ages, and it is quite possible to imagine people in that century who would not have wanted to be aristocrats. Nobility had not survived very well through the terrors of the French Revolution, and even in other nations with more evolutionary political developments, its power had been receding greatly over the previous several centuries. Nobles were losing control over vast areas of land and wealth, and political power was passing into the hands of elected officials and civil servants (who came out of the propertied upper classes), and of commercial and industrial nouveaux riches. So it's entirely possible to imagine many outside of Transylvania who might look on the aristocracy with fear and loathing. We can see how this might happen in the case of Dracula in particular, not only in the novel written by Bram Stoker but in many films derived from that novel or considered pastiche sequels to it. Dracula's opponents in that narrative – the Sewards, the Harkers, Dr. Van Helsing – are all from the upper middle class: doctors and lawyers, even real estate executives, so to speak; all living fairly comfortable lives. Dracula himself is confined to a castle that's gloomy, drafty, and crumbling from disrepair. The lands he commands are considerably depopulated, and farm land in a mountainous region is not very wealth-producing anyway. Of course, Dracula has no friends or family and no way to communicate with others, because few wish to communicate with an out-of-date aristocrat one can only meet at night. In appearance, he is a decaying old man with fetid breath, a grizzled mustache, and cold hands, and he dresses to impress rather than to stay warm. The English people who confront him, on the other hand, are dressed comfortably. They live in comfortable homes with improvements in heating and lighting – and, towards the end of the 19th century, decent sewage systems – that Dracula could never have dreamed of in that dank Carpathian castle of his. We forget how much modern concern for sanitation – really the origin of modern environmentalism – owes itself to bourgeois desires for clean living conditions, both inside the home and out in the streets. But Dracula's opponents also enjoy rich social lives. They have close intimate relationships with family and friends and are recognized as part of the community. No one owes them fealty, yet everyone who knows them respects them, because they are decent hardworking people who have earned that respect, something Dracula cannot do. And in place of the feudal power Dracula enjoys in his depopulated farmlands, his opponents enjoy the power of the vote. They can remove their "lords" through election.
Dracula as an icon represents loneliness, decaying old age, a reference to a bloody past. His presumed power and arrogance are really cold-heartedness, selfishness, and cruelty. This is precisely what the original, primarily middle-class readers of Stoker’s Dracula did not want, and it is what many of the audiences of the pastiche films and books and comic books and so on still do not want. Dracula as an icon represents something that looks on the surface to be the ideal of desirable, powerful, wealthy, sexy individualism. But, discovered in the dark crypt he inhabits, it also represents what we do not want – the loneliness and isolation that come with separation from the laws of community and from ourselves as human beings. Of course, he also is dead – perhaps more accurately, “not alive” – and won’t ever be found lying around on a beach soaking up sunlight. Yet being not quite dead, but not really alive, is precisely his curse. He is caught forever in the grip of a past that he cannot escape and a future he cannot embrace.
It is not surprising, then, that the icon Dracula himself fears most of all would be a representation of everything well-trained Christians would believe to be the very apotheosis of human being. It was Aristotle in the Metaphysics who argued that belief in God arises from the desire for perfection and for eternal life, and the crucifix not only stands iconically for Jesus-on-the-Cross, it also stands for something that is not on the cross. If we see in it the icon of a self-sacrificing god who redeems us, it may well also prove an icon of something about ourselves that is ungodly, and possibly unredeemable.
The crucifix bends those who hold it a sacred icon downward in a convolution towards sorrow and pity, manifested physically in the bending of the knee and the bowing of the head. It’s a low moment for humanity, yet ideologically elevated to the very apotheosis of human being. But there is a positive turn in all this sorrow and pity, of course, or else there would never have been a Christianity. Christians think they have a good take on things, one that leads them to the hope of joy and comfort, after all. And our previous discussion concerning iconology reminds us that an icon not only stands in for what it represents, but also for the opposite of what it represents. The crucifixion of Jesus is held to be the painful death of a single man who also happens to be a god. Pain, death, the isolation of the individual on the cross, all redeemed only by belief in God. Thus the Christians (or at least the Catholics) read into the crucifix the hope for potential joy, unending life, the union of the human with the divine, and the redemption of human being itself. After death, which is key. Following the crucifixion is, of course, the resurrection and all that means to Christians. Jesus suffers the Passion and dies; on the third day he rises from the dead, gives a pep talk to his followers, and then ascends to heaven. All of this assures his followers (a) that pain is an impermanent state, and once beyond it, contentment and happiness ensue; and (b) that death is itself a transitory moment for the believer, who is then guaranteed eternal life of the soul, ascension into heaven, and joyful union with the divine. Who could refuse such an offer?
The full import of this iconic representation through the crucifix struck me while I was watching Horror of Dracula from Hammer Films (1958), starring Christopher Lee as the Count and Peter Cushing as Dracula’s nemesis, Dr. Van Helsing. In the climactic showdown between the two, Van Helsing crosses two candlesticks. The vampire, intimidated by Christian symbols, is forced into the sunlight coming in through an open window, and perishes. But there’s something wrong here. The film is extremely well-made, well-written, well-acted, and brilliantly paced. It rushes by like a ride in a coach drawn by a runaway horse. Consequently some problems are not only easy to overlook, but are hardly noticeable. It took me several viewings over some twenty years before I realized that the crossing of the candlesticks actually posed no threat to Dracula at all. In fact the gesture is downright silly. The candlesticks appear before Dracula as intersecting in the shape of a cross, “+,” much as we should expect of a symbol of the instrument of the death of Jesus. But only as long as Dracula remains aligned with the end of one of the candlesticks. All he really has to do to mitigate the effect of the symbol is change perspective – say, step to the right – so that the crossed candlesticks would then form, for him, the harmless letter “x.” Is Dracula stupid or what? Think of all the moments in our lives when we have “+” images and “x” images surrounding us – crossed window panes, fallen twigs, cross-walk signs, etc. Even at night, Dracula would surely find himself visually bombarded with threatening symbols. Why, he could hardly leave his coffin to get out for a cool drink. So what sort of mistake has been made by the producers of Horror of Dracula and many others like it? And why is this almost never noticed by audiences?
The British Horror of Dracula is a film produced in a Protestant culture for consumption by a majority-Protestant audience. Protestant churches largely suppressed such iconography, replacing it with a (rather watered-down, in my opinion) symbology. But according to the old legends, as they arose among rural Catholics in predominantly Catholic cultures, no mere “cross” could have stopped the vampire; what one needed was a crucifix.
“Although fears of animated corpses sucking the life out of victims has a long history and can be found in many cultures, the term ‘vampire’ was not popularized until the early 18th century, after an influx of vampire superstition into Western Europe from areas where vampire legends were frequent, such as the Balkans and Eastern Europe. Local variants were also known by different names, such as vrykolakas in Greece and strigoi in Romania. This increased level of vampire superstition in Europe led to what can only be called mass hysteria and in some cases resulted in corpses actually being staked and people being accused of vampirism.” [3]
The importance of this is that, despite more than a hundred years of Reformation upheavals, southeastern Europe was still dominated by conservative Orthodox and Roman Catholic beliefs. We would rightly expect the cultural drift of the vampire legends to carry along the full baggage of the creature’s meaning, as well as the spiritual armament necessary to combat it. But precisely because of the Reformation, this could not be so:
Some of the Protestant reformers, in particular Andreas Karlstadt, Huldrych Zwingli and John Calvin, encouraged the removal of religious images by invoking the Decalogue’s prohibition of idolatry and the manufacture of graven (sculpted) images of God. [4]
Eventually the Reformers softened their stance – “Now if it is not sinful for me to have Christ’s picture in my heart, why should it be sinful to have it before my eyes?” noted Martin Luther – but the damage had been done. Protestants could accept symbolic portraiture of the Christ, but could no longer accept any such portrait as a sacred icon. So the vampire came west and north out of Catholic and Orthodox Europe and found a home in Protestant Europe, but without the necessary apotropaic: the crucifix as talisman to ward off evil. Protestants could no longer use the iconic crucifix to ward off the undead, but perhaps a symbolic cross would do as well.
The old legends of the vampires in southeastern Europe don’t really make sense, unless the crucifix functions iconically as a direct representative of the crucified Christ. As I’ve shown before, a symbolic cross will not do, because it can be reconfigured in perspective by moving to one side of it (and, as suggested before, the vampire has to be able to do this, simply to survive in a world filled with cross-like objects). For the crucifix to be a crucifix, there must not only be the cross, but also, as though nailed to it, a representation of a man, in suffering, presumed to present the very likeness of Jesus of Nazareth, assumed crucified in that manner. The vampire cannot walk around the crucifix and change perspective, for the importance is not in the shape, but in the iconic representation of Jesus on the cross and all that this was assumed to mean in the cultures that gave the (modern) vampire birth.
But perhaps that’s what Protestants found disturbing about it: the continued reiteration of the death of God. While there are images and symbols of the Resurrection and of the Ascension, iconographically Jesus is always dying on the crucifix. He hangs there, head bowed in despair, lacerated and bleeding. This may not be the way Protestants wish to imagine him, but for Catholics, there may be no other way. Eternal life is a tricky business. Get stuck under a curse, and one haunts the night alone, possibly forever. But die and be reborn, and perhaps heaven awaits; or even union with the Divine. In the Eucharist, Catholics share in his death, through the transubstantiation of bread into his flesh and of wine into his blood. “You will be flesh of my flesh, blood of my blood.” Oh, wait – that’s a line from Dracula. [5]
All this leaves some unsettling questions. Generated in conservative Catholic cultures, and assumed to have very physical manifestations, both the crucified Christ and the modern vampire are representations of death: death-before-afterlife on the cross and afterlife as animated death beyond the grave. The Christ dies to save all souls; the vampire’s death cannot even save his own. Both god-as-man and man-as-demon seem condemned to a body that will not continue, and a soul that will not let go.
[1] Charles S. Peirce, Collected Papers (C.P.), volume 2, section 247.
[2] Peirce, C.P., volume 2, section 282.
[3] http://en.wikipedia.org/wiki/Vampire
[4] http://en.wikipedia.org/wiki/Iconoclasm#Protestant_Reformation
[5] Bram Stoker, Dracula, 1897, chapter 21.

Farewell to Esoterix

I am truly saddened to learn of the recent demise of the blogger of Esoterix, https://wordpress.com/read/feeds/6608250/posts/4504115622. I always found his posts both amusing and enlightening. I feel the world has grown a bit emptier with his loss. I wish I had met him somehow, somewhere. My heart goes out to his surviving family.

“If Monsters Don’t Exist, Why Are They Out To Get Me?” – they’re after us all, but we had you to comfort us a little – perhaps a bit more than a little. Thank you and farewell.

Violence and Identity

E. John Winner

  1. “I wouldn’t have it any other way”

The Wild Bunch is a 1969 film directed by Sam Peckinpah (written by Peckinpah and Walon Green) [1]. Nominally a Western, it tells the story of a gang of aging outlaws in the days leading up to their last gun battle.

After a failed payroll robbery, in which more town citizens are killed than combatants, five surviving outlaws make their way into Mexico, broke and dispirited. The lead outlaw, Pike Bishop, remarks to his colleague Dutch that he wants to make one last big haul and then “back off.” “Back off to what?” Dutch asks, to which there is no answer. Finally Dutch reminds Bishop, “they’ll be waiting for us.” Bishop, the eternal adventurer, replies, “I wouldn’t have it any other way.”

In Mexico, the Bunch (including the two Gorch brothers, and now also Sykes, an old man who rides with them) visit the home town of their youngest member, Angel, which has recently suffered a visit by Federal troops under General Mapache, during which anti-Huerta rebel sympathizers were rooted out and murdered. The Bunch forms an odd bond with the townsfolk, but they’re outlaws and they’re broke. Eventually they make a deal with Mapache (who is advised by Germans eager to see Mexico allied with them in the war all Europe is preparing for) to rob a US arms train across the border. The robbery is successful, and they return to Mexico with the stolen arms (including a machine gun) – pursued, however, by a group of bounty hunters led by Deke Thornton, a former outlaw whom Bishop once abandoned during a police ambush in a bordello. Later the bounty hunters will wound Sykes, whom the Bunch will abandon to his fate.

Along the trail, Angel, a rebel sympathizer himself, has some Indian friends of his carry away a case of guns and another of ammunition. Angel, however, has been betrayed by the mother of a young woman he killed, in a fit of anger, for having run off to join Mapache’s camp followers. The outlaws complete their deal with Mapache, but surrender Angel to him. Deciding to let Mapache deal with the bounty hunters, they return to the Army headquarters in the ruins of an old winery. However, their betrayal of Angel haunts them. After a brief period of whoring and drinking, they decide to confront Mapache and demand the return of their colleague. Mapache cuts Angel’s throat, and without hesitation Pike and Dutch shoot Mapache down. At this point, the Bunch probably could take hostages and back off – but back off to what? Instead they throw themselves gleefully into a gun battle with some 200 Federales and, by taking control of the machine gun, do quite a bit of damage. But the inevitable happens: they end up dead – Pike himself shot by a young boy with a rifle.

As the surviving Federales limp out from the Army HQ, Thornton shows up. He sends the bounty hunters home with the outlaws’ bodies, but remains to mourn the loss of his former friends. Sykes rides up with the rebel Indians who have saved him and suggests Thornton join them. “It ain’t like it used to be, but it’ll do.” Laughing in the face of fate, off they ride to join the revolution….

The thematic power of the film hinges on two apposite recognitions. The first is that the outlaws are very bad men. They rob, they cheat, they lie, they kill without compunction. They seem to hold nothing sacred and have no respect for any ethical code.

The second recognition is that this judgment is not entirely complete or correct. They have a sense of humor and an undeniable intelligence. They are able to sympathize with the oppressed villagers in Mexico. They have a sense of being bound together, and this bond is what leads them to their final gun battle.

The Bunch have lived largely wretched lives. As professional outlaws, they are dedicated to the acquisition of wealth by criminal means, but throughout the film, it is clear that wealth means only two things to them: the procurement of prostitutes and heavy drinking. Although Pike was once in love and thinking of settling down, and (the asexual) Dutch speaks wistfully of buying a small ranch eventually, they are just as committed to the outlaw lifestyle as the unrepentant Gorches; they would just rather believe otherwise.

That’s because they clearly are not outlaws simply to acquire wealth. They are committed to a life of violence, to the thrills of dangerous heists, of chases across the landscape of the southwest, and of gun fights. They rob largely to support that lifestyle, not the other way around.

The finale of the film has two major points of decision, the first determining the second. The first is when Pike, dressing after sex with a prostitute, sits on the bed finishing off a bottle of tequila. There it is, that’s his life; and with the wealth gotten from the Mapache deal, he could continue it indefinitely. In the next room the Gorch brothers, also drunk, argue with their prostitute over the price of her services. Yep, that’s their life, too. Meanwhile, Angel is getting tortured to death for being an outlaw with a conscience. Pike slams the empty bottle to the floor, and the march into battle begins.

The second point of decision has already been noted: the moment after shooting Mapache when they possibly could have escaped, but choose battle instead. Why do they? Petty revenge? No, it’s because they don’t live for the money, the drinking, the prostitutes. They live for the violence; and they do so as a team. And here is the moment they can live that to its completion.

Peckinpah remarked that, for that moment to carry any weight, the outlaws needed to be humanized to the extent that the audience could sympathize with them. He was, I think, largely successful. But the film has been controversial, not only because of its portrayal of violence, but because in the climactic battle Peckinpah pushes our sympathies for the Bunch beyond mere recognition of their humanity: they become heroic, larger than life, almost epic figures, challenging fate itself in order to realize themselves – Achilles on the field before Troy.

And oddly enough, while not really acting heroically, they become heroes nonetheless – remembered by the revolutionaries who benefit from their sacrifice.

As a side remark, let’s note that Peckinpah was raised in a conservative Calvinistic Presbyterian household. But (like Herman Melville a century before), he was really a Calvinist who could not believe in god. In such a universe, some are damned, but no one is saved. We only realize our destiny by not having any. The Bunch destroy any future for themselves and thus, paradoxically, achieve their destiny. The fault is not in our stars, but in ourselves.

  2. A soldier’s story

The Wild Bunch is set in the last months of the Huerta dictatorship (say, the spring of 1914), a phase of the series of rebellions, coups d’état, and civil wars known collectively as the Mexican Revolution [2]. Officially, this revolution began with the fall of the Diaz regime and ended with the success of the PRI (Institutional Revolutionary Party), but in fact rebellion and bloodshed had permeated the Diaz regime and continued for a few years after the PRI came to power. In the official period of the revolution, casualties numbered approximately 1,000,000. When one discovers that the Federal Army had only about 200,000 men at any time, and that rebel armies counted their soldiers in the hundreds, one realizes that the majority of these casualties had to be non-combatants. Not surprisingly: the Federal Army, and some of the rebels, pursued a policy (advocated by a former US president) of family reprisal – once a rebel or a terrorist is identified but cannot be got at, his family is wiped out instead. Whole villages were massacred; dozens of bodies would be tossed into a ditch and left to rot.

As I’ve said elsewhere, I’ve nothing against thought experiments that raise questions (only those that limit answers unjustifiably). So let us now imagine ourselves in the mind of a young Federal soldier whose commandant has ordered him to shoot a family composed of a grandmother; a sister; a brother whose legs are atrophied from polio; and the sister’s six-year-old daughter. The proper question here is not whether or not he will do this; he will do this. The question is why.

That’s a question that rarely, if ever, appears in ethical philosophy in the Analytic tradition. It is, however, considerably worked over in the Continental tradition. There’s a good (if uncomfortable) reason for this. Continentalists write in a Europe that survived devastation during World War II, and they live among survivors of the Holocaust – and among those who may have participated in it. Analytic philosophers decided not to bother raising any questions concerning Nazism or the Holocaust. Indeed, in the US the general academic approach to events in Germany in the ’30s and ’40s has been that they constituted an aberration. Thus even in studies of social psychology, the Nazi participants in the Holocaust are treated as examples of some sort of abnormality, or as test cases in the extremities of assumed psychological, social, or moral norms. That’s utter baloney. If it were true, then such slaughters would have been confined to Europe in that era. Yet during the Japanese invasion of China, casualties, though difficult to estimate, certainly ran into the tens of millions.

There were a million casualties of the Turkish massacres of the Armenians, long before the Holocaust; there were a million victims of the Khmer Rouge in Cambodia, long after the Holocaust. Humans have a surprising capacity for organized cruelty and mass slaughter.

At any rate, assuming that our young Mexican soldier is not suffering from some abnormal psychology, what normative thoughts might be going through his mind as he is about to pull the trigger on the family lined up before him?

For the sake of argument, we’ll allow that he has moral intuitions, however he got them, that tell him that killing innocent people is simply wrong. But some process of thought leads him to judge otherwise, to act despite his intuition. Note, we are not engaging in psychology here, and we need not reflect on motivations beyond the ethical explanations he gives for his own behavior.

While not a complete listing, here are some probable thoughts he might be able to relay to us in such an explanation:

· For the good of the country I joined the Army, and must obey the orders of my commanding officer.
· I would be broke without the Army, and they pay me to obey such orders.
· These people are Yaqui Indians, and as such may be sub-human, so strictures against killing innocents do not apply.
· I enjoy killing, and the current insurrection gives me a chance to do so legally.

So far, all that is explained is why the soldier either thinks personal circumstance impels him to commit the massacre, or believes doing so is allowable within the context. But here are some judgments that make the matter a little more complicated:

· This is the family of a rebel and he must be taught a lesson.
· Anyone contemplating rebellion must be shown where that will lead to.
· This family could become rebels later on; therefore, they must be stopped from doing so, and killing them is the most certain means of accomplishing this.
· All enemies of General Huerta/ the State/ Mexico (etc.) must be killed.

Must, must, must. One of the ethical problems of violence is that there exist a great many reasonings promoting necessary violence within certain circumstances, although precisely which circumstances differs considerably from culture to culture, social group to social group, generation to generation. In fact there has never been a politically developed society that lacked such a theory, and most have had several. Most obviously we find discussions among Christians and the inheritors of Christian culture concerning what would constitute a “just war” (with analogous discussions of “jihad” in Islamic cultures). But we need not get into the specifics of that. All states, regardless of religion, hold to two basic principles concerning the use of violence in the interests of the state: first, obviously, the right to maintain the state against external opposition; but also, second, the right of the state to use lethal force against perceived internal threats to the peace and stability of the community as a whole. We would like to believe that our liberal heritage has reduced adherence to the latter principle nearly to extinction, but that’s nonsense. Capital punishment is legal in the United States, and 31 states have it on their books. The basic theory underlying it is quite clear: forget ‘revenge,’ or protection of the community, or questions of the convicted person’s responsibility – the state reserves the right to end a life deemed too troublesome to continue. We can define ‘troublesome’ here any way we wish; the state still holds that an overly troublesome life should be ended.

But any theory of necessary violence seriously complicates ethical consideration of violence per se, because such theories are found in every culture – indeed they permeate every society, through teaching, the arts, laws, political debates, propaganda during wartime, and so on. It is most likely that every one of us has, somewhere in the back of our brains, some idea, some species of reasoning, some set of acceptable responses cued to the notion that some circumstance, somewhere, at some time, justifies the use of force, even lethal force. Indeed, active pacifists have to undertake a great deal of soul-searching and study to recognize these and uproot them; and they are unlikely to get them all.

Many more simply do not bother to make that effort. They are either persuaded by the arguments they find for necessary force, or they have been so indoctrinated in such an idea that they simply take it for granted.

Because there are several and diverse theories and principles of necessary violence floating around in different cultures, one can expect that this indoctrination occurs in various degrees by various means. One problem this creates is that regardless of its origin, a given theory or principle can be extended by any given individual. So today I might believe violence is only necessary when someone attempts to rape my spouse; tomorrow it may be necessary if someone looks at my spouse cross-eyed.

The wide variance in possible indoctrination also means a wide variety in the ways such a principle can be recognized or articulated. This is especially problematic given differences in education among those of differing social classes. Among some, the indoctrination occurs largely through friends and family, and may be articulated only in the crude assertion of right – ‘I just had to beat her!’ ‘I couldn’t let him disrespect me!’ Whereas those who go through schools may express this indoctrination through well-thought-out, one might say philosophical, reasoning: ‘Of a just war, Aquinas says…’; or ‘Nietzsche remarks of the Übermensch…,’ and so on. But we need to avoid letting such expressions, either crude or sophisticated, distract us from what is really going on here. The idea that violence is in some circumstance justified has become part of the thought process of the individual. Consequently, when that presumed – and prepared-for – circumstance does arise, not only will violence be enacted, but the perpetrators will have no sense of transgressing by doing so. As far as they are concerned, they are not doing anything wrong, even should the violent act appear to contradict some other moral interdiction. The necessary violence has itself become a moral intuition, and in the judgment preparatory to an act, it overrides other concerns. ‘I shouldn’t kill an innocent, but in the circumstances I must kill this innocent.’ ‘I must,’ because the necessity of the violence requires an agent, and this principle is ‘mine’; therefore, the agent must be ‘me.’

Again, this is not psychology. After more than a century of pacifist, ‘non-violent’ rhetoric and institutionalized efforts to find non-violent means of ‘conflict resolution,’ we want it to be psychology; we want to say that we can take this soldier and ‘cure him’ of his ‘violent tendencies.’

Now, what general wants us to do that? What prosecutor seeking the death penalty wants that for a juror?

The rhetoric of pacifism and the institutionalization of reasoning for non-violence is a good thing, don’t misunderstand me. But don’t let it lead us to misunderstand ourselves. There is nothing psychologically aberrant in the reasoning that leads people to justify violence, and in all societies such reasoning is inevitable. It’s part of our cultural identity – strangely enough, it actually strengthens our social ties as yet another deep point of agreement between us.

  3. Being violent

I’m certain that, given the present intellectual climate, some readers will insist that what we have been discussing is psychology; that Evolutionary Psychology or genetics can explain this, that the neurosciences can pinpoint the exact location in the brain for this, that some form of psychiatry can yet cure us of this. All of that may be true (assuming that our current culture holds values closer to ‘the truth’ than any other culture, which I doubt), but all of that’s irrelevant. It should be clear that I’m trying to engage in a form of social ontology, or what might be called historically contingent ontology. And ethics really begins in ontology, as Aristotle presumed – we are social animals, not simply by some ethnological observation, but in the very core of our being. We just have a difficult time getting along with each other.

It’s possible to change. Beating other people up is just another way to bang our own heads against the wall; this can be recognized, and changed, so the situation isn’t hopeless. As a Buddhist, I accept the violence of my nature, but have means of reducing it, limiting it, letting it go. There are other paths to that. But they can only be followed by individuals. Yet only individuals can effect change in their communities.

This means we have to accept the possibility that human ontology is not an atemporal absolute, and I know there is a long bias against that; but if we are stuck with what we have always been, we’re doomed.

Nonetheless, the struggle to change a society takes many years, even generations, and it is never complete. Humans are an indefinitely diverse species, with a remarkable facility for finding excuses for the most execrable and self-destructive behavior. There may come a time that humans no longer have or seek justifications for killing each other; but historically, the only universal claim we can make about violence is that we are violent by virtue of being human, and because we live in human society.

[1] http://www.imdb.com/title/tt0065214/
[2] https://en.wikipedia.org/wiki/Mexican_Revolution

Politics and Song

E. John Winner

“Now, the whole business of Irish nationalism can get very serious if you’re not careful.” – Liam Clancy [1].

My father, Joseph Connelly, abandoned his family when I was two years of age. I probably should have hated him and been done with it; but that’s not how children respond to their abandonment. There’s a lot of self-questioning – ‘was I the cause of his leaving?’ – and attempts to prove worthy of a love that will never be acknowledged.

So up to his death of a heart attack in 1989 I went through periods when I tried to adopt Irish culture as somehow my own, as somehow my inheritance. In the long run, these efforts failed; and they left me realizing that I had no cultural inheritance beyond the common culture of the United States. When people ask me where my family came from, I answer without hesitation, “Brooklyn” [2].

Nonetheless, the efforts to identify with an Irish heritage left me with considerable sympathy for a people that had long suffered the most miserable oppression as a colony of the British Empire. (The British long maintained that Ireland was a willingly subservient kingdom aligned to Britain in the laughable pretense of a “United Kingdom,” but this was believed only by British colonialists stealing farmland from the Irish and putting them to work as, in effect, serfs.) The oppression really began with Cromwell’s bloody conquest of the Catholic Irish (whom he called “barbarous wretches”); the massacres were bad enough (and the Irish were no saints in these engagements), but it was the immediate aftermath that really set the pattern for Anglo-Irish relations to follow: the policy of suppression “included the wholesale burning of crops, forced population movement, and killing of civilians” [3]. It cut the population by nearly half.

Difficulties, including the occasional Irish rebellion, continued throughout the history of this ‘union’ of Ireland and England, but reached a turning point with the notorious Potato Famine of 1845. The potato had become a staple because it could be grown in private gardens. When a serious blight struck, the Irish faced starvation. Cash crops in Ireland were routinely sent to England for wholesale, and if they returned to Ireland for retail sale, they were priced well beyond the ability of the Irish peasantry to pay. These practices went unaddressed by the British government for some five years [4]. By the end of the famine, roughly 1852, the Irish population was estimated to have lost more than 2 million, half to starvation, half to emigration. The British – many of whom agreed with Cromwell’s assessment of the Irish character as barbarous and wretched (and shameless Catholics to boot) – thought that with the famine ended, markets would naturally stabilize, and relations with the Irish could be restored to what had been established by the Acts of Union of 1801. They were wrong. Survivors of the Famine and their heirs remembered what they had gone through, and who had put them through it. Irish political activists were no longer interested in ‘protesting’ impoverished economic conditions that the British colonialists could exploit. They knew that any such conditions would inevitably recur as long as the colonialists controlled the economy. So began the long hard struggle that would lead to Irish independence.

Irish rebel songs had been recorded since at least the 17th century (“Seán Ó Duibhir a’Ghleanna” on the Battle of Aughrim during the Williamite War, 1691). Indeed, there are so many of them that they form a genre of their own. (Going by Wikipedia, they seem to comprise about a third of all catalogued folk songs of Ireland [5].) However, they truly embedded themselves in Irish culture in the decades leading up to the War of Independence (1919-21). They include exhortations to fight for “dear old Ireland,” reports of battles, like “Foggy Dew” (Easter Rebellion, 1916), and elegies for slain soldiers, as well as opinions on the differing politics of the era, especially those that erupted into violence during the Civil War of 1922.

One might object that I haven’t remarked “the Troubles” in Northern Ireland. So I will. There have been political songs on both sides of that conflict, as well as, in recent decades, admonitions to peace [6]; they are all Irish. Because as much as some citizens of Northern Ireland wish to think of themselves as somehow British, no one else does – not even the British, who in signing the accords that brought peace to Ulster (1998), effectively agreed to the right of all the Irish to self-determination.

One can no more remove politics from Irish song than one could remove the Guinness Brewery from Dublin [7]. But the matter goes much deeper. In fact, throughout the years of occupation pretty much whatever the Irish sang about was political in nature. They sang of the success of their gardens – that violated British economics. They sang of their children – they weren’t supposed to have so many, those dam’ Catholics! They sang out their love of their God; in the 17th Century, this got them killed; in the 18th matters improved: it only sent them to prison. They sang of the beauty of their countryside – and were kicked off it left and right. They sang of their trades – which they couldn’t independently practice without a British-approved overseer. Heck, all they had to do was warble a note in Gaelic, and they were suspected of some dark satanic plot against the crown. In other words, the very existence of Irish song, the very singing of it, was a politically rebellious act against British domination.

It must be borne in mind here that for 400 years, the British were engaged in what might be called genocide-by-attrition of the Irish people. (This is difficult to discuss in America, where the media has such a fascination with the health and marital antics of the ‘royal family’). I suppose the long-range plan was to have the Irish simply die off; but since most of them were Catholics, that wasn’t going to happen. So the British settled for total suppression of the Irish way of life, and domination of its economy. They reduced the Irish to something less than serfs, since serfs were recognized as being a part of the land they worked. The Irish were not recognized as belonging to the land; they were seen as somehow an annoying infection, needing to be cauterized. The British did worse than destroy Irish culture: they stripped the Irish of the resources needed to produce culture.

But the body is a resource, and it can only be ‘stripped’ from the possessor through death. As Hitler realized, the only way you can completely erase a culture is through complete eradication of the targeted people. But the British, although cruel and destructive, had a peculiar image of themselves as fundamentally ‘decent’ – so all their crimes needed to be rationally explicable and moderated with some sense of ‘mercy,’ with some sense of moral superiority. Goering once declared in a speech, “yes, we (Nazis) are barbarians!” A British politician would never make that claim. So the Irish were allowed to starve to death, but there were no death camps to be found in, say, County Clare.

That may have been a mistake. Song is of the body. One feels it singing, it reverberates deep in the lungs, and shakes the innards. It rises up with every breath (Latin: spiritus). Sing a song and one is that song. Sing a song for others, and one produces culture. The British could take everything from the Irish, but they could not take away their breath; they could not stop them singing.

There are actually two ways to listen to a song. One is to hear the voice simply as a part of the music itself. One doesn’t actually pay attention to the words; perhaps one doesn’t understand the words. This is how we listen to songs in languages we do not speak. But the practice extends beyond that. Where I work, my older colleagues and clients generally tend to be political and social conservatives. Yet the public address radio is set to a ‘classic rock’ station. So I find myself frequently bemused watching these conservatives hum along to songs promoting recreational drug use (“White Rabbit”), sexual promiscuity (every other song by the Rolling Stones), political revolution or anti-war resistance (Steppenwolf’s “Monster”), non-Christian religious belief (a George Harrison song extolling Hare Krishna), or even a song of anti-American hostility (“American Woman”). They listen to something like the Chambers Brothers’ burst of outrage, “Time Has Come Today,” and don’t seem to have any idea that they are the targets of that outrage. The words are meaningless to them, because they’re not listening to the words. The voice they hear and hum along with, that’s just part of the music.

I have a suspicion that this is how most of us listen to songs in our own language, especially songs we have been hearing since very young. My colleagues and clients don’t want to be reminded of the ’60s with all that era’s political turbulence. They want to be reminded of their own youth.

What the British did in their aggressive disenfranchisement of the Irish on their own soil was to force the Irish to listen to their own songs, to pay attention to the words as well as to the melodies. Because we listen to the words of a song when they are touching us directly in our immediate circumstances. So even ancient songs can be made meaningful again if the events they refer to are replicated in the events of the current day: they are recognized as being as contemporary as a newspaper or a political broadside.

The British thus made the rebel song the touchstone, the embodiment of Irish culture. One can see how this plays out in the Irish ‘cheer’ (that’s its technical genre), “Óró Sé do Bheatha ‘Bhaile” [8]. This probably originated as a shanty, welcoming sailors home from voyage (its structure is quite similar to “Drunken Sailor,” with which it probably shares a common original). During the Jacobite era, it was transformed into a plea for Bonnie Prince Charlie to reclaim the throne and set conditions aright for the Irish. In the early 20th Century, it was slightly revised by Patrick Pearse, who some say was murdered – or as others would have it, executed – by the British for participation in the Easter ’16 Proclamation of the Irish Republic [9]. The song is in Gaelic, and less than a third of the Irish report using Gaelic. The share may be smaller among today’s young Irish, and perhaps they don’t quite understand the full meaning of this song. But anyone in Ireland 40 years or older does. A call for heroes to oust the “foreigners” (British) from Ireland, it was used as a marching song during the War of Independence. Even if one doesn’t understand the words, the historical context reveals the meaning, a context remembered and passed on through generations.

Let’s clarify that. Obviously, however moving the music, and however well known the context, the words technically have no meaning until they’re explained. So imagine a young person, unable to speak Gaelic, yet hearing his parents and their friends singing this song, and noting their attitudes of pride and determination. Such a one would feel impelled to ask after the song’s meaning. And here’s where attempts to suppress a language and its song swing back to bite the oppressor’s hand. The young person now pays more, and more intense, attention to the meaning of the song during and following the explanation than he or she would if it were sung in a language already understood. In other words, the effort to suppress Gaelic song actually backfired: rebel songs in Gaelic achieved greater respect as audiences struggled to place them meaningfully within the context of the Irish revolution and take possession of them as their own.

In fact, the problem for any empire is that colonization, oppression, slavery, and mass slaughter do not make friends. Empires generate hatreds and enmities that last for generations. The good-natured Irish tend to adopt a ‘live and let live’ pragmatic attitude even towards those they have battled in the past. But they also tend to carry a grudge.

The British are a very proud people. Writing this in America, I know it is expected of me to continue, ‘and they have every right to be.’ But I don’t believe that. The history of England includes important eddies of remarkable writers and scientists. But these appear at the edges of a great river of blood, clogged with the remains of slaughtered natives of colonized lands. And for every one of those dead, whole families are left behind to this day, battling to redefine the wretched political and economic confusion the British Empire left behind in its collapse – a collapse that the British still won’t admit or deal with honestly.

I write this in America, the nation that long acted as inheritor of that collapsed empire (while flattering the British ego by pretending we are all somehow the same people because of a common language). By functioning in a more paternalistic, ‘caring’ fashion, acknowledging the sovereignty of other countries, spreading around aid programs, enlisting allies (all as long as they didn’t threaten our hegemony and wealth), Americans have deluded themselves into believing they are not imperialists and have made no enemies. But they are and they have, and this will continue to haunt and befuddle their foreign affairs for many generations to come.

But America has another problem. There is no such thing as “the American people.” America is a collection of many peoples from around the world. Some of these have historically been oppressed, although later assimilated into the mainstream. Others have not been able to, or allowed to, assimilate. And others may feel themselves oppressed where there is no empirical evidence that this is so, beyond their own disappointment in unfulfilled expectations, given the nature of the economy, or the nature of the constitutional government. Consequently, there are an awful lot of people here who have, or who have had, or who believe they have, reason to speak out. And when means for doing so are blocked, or when speaking seems unable to convince others – they can always sing about it [10]. That’s what song is for. Politics is not an add-on to song; song is an inevitable expression in politics.

Conservative critic Mark English once wrote of the dangers of relying on mythical thinking in matters political [11]. Desires for respect, for the ability to live without oppression or risk of legitimized theft or murder, for the opportunity to realize one’s full potential unhindered by stigma – are these mythical aspirations? Quite probably. The world is a cold home to a lonely, anxious species of over-developed hominids. But I would not be the one to reassure those starving in a famine that, rationally, their deaths would (in the words of Scrooge) “decrease the surplus population.” Some myths are worth living for, even fighting for; and worth singing about.

[1] https://www.youtube.com/watch?v=b3zOVi0C5X4

[2] My oldest sister never quite got over it, and became obsessed with developing a family tree. She traced the Irish roots back to an 18th century poet, Thomas Dermody, aka Dead-Drunk Dermody, who, as his nickname would suggest, drank himself to death at an early age. https://en.wikipedia.org/wiki/Thomas_Dermody

The first stanza from his “On a Dead Negro” (https://www.poemhunter.com/poem/on-a-dead-negro/):

AT length the tyrant stays his iron rod,
At length the iron rod can hurt no more;
The slave soft slumbers ‘neath this verdant sod,
And all his years of misery are o’er.

[3] https://en.wikipedia.org/wiki/Cromwellian_conquest_of_Ireland

[4] The British response to the famine – heartless indifference – was a purely rational one. Remember that this was the age of Malthus, who once wrote, however ironically:

“(W)e should facilitate, instead of foolishly and vainly endeavouring to impede, the operations of nature in producing this mortality [of the poor]; and if we dread the too frequent visitation of the horrid form of famine, we should sedulously encourage the other forms of destruction, which we compel nature to use.” Essay on the Principle of Population, 1798.

Lest any think this was not in the minds of the British during the Famine, consider the following:

“Ireland is like a half-starved rat that crosses the path of an elephant. What must the elephant do? Squelch it – by heavens – squelch it.” – Thomas Carlyle, British essayist, 1840s

“The judgement of God sent the calamity to teach the Irish a lesson, that calamity must not be too much mitigated. …The real evil with which we have to contend is not the physical evil of the Famine, but the moral evil of the selfish, perverse and turbulent character of the people.” – Charles Trevelyan, head of administration for famine relief, 1840s

“[Existing policies] will not kill more than one million Irish in 1848 and that will scarcely be enough to do much good.” – Queen Victoria’s economist, Nassau Senior

“A Celt will soon be as rare on the banks of the Shannon as the red man on the banks of Manhattan.” – The Times, editorial, 1848

Source of additional quotes: http://www.politics.ie/forum/history/22143-anti-irish-quotes-throughout-history.html

[5] https://en.wikipedia.org/wiki/List_of_Irish_ballads

[6] For instance: U2: “Sunday Bloody Sunday,” Simple Minds: “Belfast Child,” The Cranberries: “Zombie.”

[7] Until Guinness recently bought the brewery property outright, the company held a 9,000-year lease on it.

[8] https://www.youtube.com/watch?v=4Sje2VYw99A

About the song: https://en.wikipedia.org/wiki/%C3%93r%C3%B3_s%C3%A9_do_bheatha_abhaile

Translation in English: http://songsinirish.com/oro-se-do-bheatha-bhaile-lyrics/

Revisions author: https://en.wikipedia.org/wiki/Patrick_Pearse

[9] The execution of the leaders of Easter ‘16 was perhaps the most profound mistake the British could have made. Initially, they sentenced 89 men and a woman to death; but the first 15 executions were staggered over 9 days, as crowds stood outside the prison weeping, and politicians both Irish and British protested. Author James Stephens described it as “like watching blood oozing from under a door.” https://en.wikipedia.org/wiki/James_Stephens_(author) The remaining 75 death sentences were commuted. But the damage was done. The effect was to galvanize the Irish people in support of independence. https://en.wikipedia.org/wiki/Easter_Rising

[10] Billie Holiday, “Strange Fruit”: https://www.youtube.com/watch?v=wHGAMjwr_j8

[11] https://theelectricagora.com/2017/02/22/nationalism-and-mythical-thinking/

Raging Pain as Musical Comic Book: Ramones

E. John Winner

Popular music has become a swamp, with sink holes and tangled underbrush. Every musical form can now be considered a ‘niche’ music – not a shading along a spectrum, but a patch in a crazy quilt of vaguely related or wholly unrelated sound and lyrical stylings. Strangely, it was once possible to imagine a single history of popular music. It would lead from carnival rants and pub songs, sailor shanties and cowboy plaints, through minstrel shows and Tin Pan Alley throwaways, then receiving a massively regenerative historical diremption with the popularization of music made by descendants of Africans brought to America as slaves, occurring simultaneously with, and in large part due to, the technological innovations of recorded music performance. This diremption initiated all the major genres of American music played in the 1960s: jazz and swing, country and folk, blues and soul – and of course rock and roll, engulfing everything else. These genres seemed to play off each other dialectically, to such an extent that by the late ’60s the future of pop music seemed utterly predictable, and indeed evolved predictably into the ’70s with occasional side-steps and expanding inclusion of already existing music forms. When Emerson, Lake and Palmer performed Mussorgsky’s “Pictures at an Exhibition,” it seemed as though Rock could achieve the perfect synthesis of all musical forms. Although music specifically made by and for Black Americans (“R&B” – rhythm and blues, although no one any longer remarked it as such) was developing some alternative forms (most notably disco), it was generally assumed that such developments were running parallel to those in Rock, and that R&B and Rock would play against each other dialectically until achieving a final synthesis – probably in a Rock idiom, since Rock had already sucked much of Black music into itself. So far, so Hegelian.
But there’s a reason why Hegel remarked the arrival of Romantic poetry as the end of poetry, or why, reading Danto in a certain way, one can say of Post-Modern art that it is really the end of art – the end of any history of art that would make sense of it qua historical narrative. It is that such a narrative ought to end ‘happily ever after.’ As prophesied by Whitman (surely the exemplar of the ‘post-poetry’ poet), everyone would become his or her own poet. Well, that’s actually how it does end; but that can only occur if there is no cultural center, no shared standards of values – not ‘happily ever after.’ If some teen-agers record the scraping of chalk across a blackboard and they can get other teen-agers to dance to it, we’ll need a new category for Billboard Magazine: Chalk-Pop.

Because the final synthesis of the dialectical history of pop music did not arrive in a Rock idiom, but in rap and hip-hop in the early ’80s. Not by embracing all other music forms, but by smashing them; then scratching the samples together on turntables, while lyricists recited lines rather than singing. Everyone his/her own lyricist, every re-composition of old sounds a new composition. Originally derived from elements of disco and reggae, rap and hip-hop adamantly refused every interest or value of the pop musics that had come before, except that of dance. (I’ve seen intellectualist defenses of hip-hop as ‘African this’ and even ‘blues that’ but it’s all crap. Shake yer booty or take it off the dance floor.) Lyrically, nuance and subtlety – previously the qualities most admired in the work of pop music lyricists – were tossed to the wind, replaced with explicit expressions of desire, anger, or fear. Surely, everyone wants sex, everyone wants to get high somehow, everyone wants power. Why hint when you can state outright? That doesn’t mean we can’t get good music or interesting lyrics anymore; but these no longer arise as progressive developments out of pop music’s history: that story is done. Of course we still have young lyricists imitating Bob Dylan or Joni Mitchell, or hoping to achieve the influence of Laura Nyro or Stevie Wonder. That’s inevitable in a fragmented culture of niche musics – the yearnings of the past never fully lose themselves in the satisfactions of the present. Not so long ago, buskers in Boston could still be heard warbling the old Scots ballad “Twa Corbies.” Any music will find its audience; no particular music moves its audience towards any future other than ‘more of the same.’

It’s a sound that is now quite familiar, almost tiresomely so – at last brought to mainstream pop respectability by Nirvana and Green Day, punk rock has blared from movie soundtracks, television commercials, cartoons, even children’s records (remember Chipmunk Punk?) for nearly four decades. But there was a time when it was truly fresh – so fresh that post-hippie boomers from the 1960s found it vile, threatening, or simply ugly. In the current Post-Modern swamp of niche pop music, I find it difficult to articulate what it once meant to hear, to listen to, a music seemingly so new that initially one could not remark hearing any music like it before. Of course, in retrospect, one scrambles to find precursors and lineage, some form of artistic or intellectual genealogy, some tradition to which the new most likely belongs. That doesn’t obviate the initial impact: one listens transfixed, one feels transformed. That’s what happened on a summer’s night in 1976, when, having already visited CBGB’s in the Bowery to sit at the feet of Talking Heads, Television, and Blondie, I at last acquired the first album by the band everyone admitted the CBGB scene was really all about – what everyone meant by the new rubric, “punk rock” – Ramones. [1]

The Ramones did have ‘precursors,’ most of which they were willing to acknowledge – Iggy and the Stooges, the New York Dolls, the Velvet Underground (first two albums), as well as singles from Britain, primarily Bowie’s “Rebel, Rebel,” and Black Sabbath’s “Paranoid” – not to mention the myriad ‘one-hit wonder’ garage bands of the early ’60s. But locally, when they first appeared in New York City in late 1974, and later nationally, when their first album blasted into record stores in 1976, it was quite clear that something new was happening in pop music, something that had little to do with the folksy flowers and love-beads of the counter-culture of the 1960s, or the bloated orchestral flourishes of the ‘progressive rock’ of the early 1970s. Clocking in at less than a half-hour, the fourteen tracks of the first Ramones album were brief, minimalist sonic booms of three-chord chainsaw guitar; rumbling, rapidly fingered bass; thundering, metronomic drums; quavering, choked-out vocals reciting as few lyrics as could convey an idea and suggest (not necessarily tell) a story. This ‘suggestion of narrative’ is what really concerns us; it explains how the Ramones were able to get away with delving into the darkest, dirtiest depths of their collective consciousness and yet sound like one of the lightest, most entertaining throwaway pantomimes of what some teen-ager might imagine a rock band to be. The Ramones are the comic book version of Lou Reed’s meth-driven “White Light/ White Heat,” the living cartoon versions of the sexually angst-ridden narrators of “Rebel, Rebel” and “Paranoid;” the Warner Bros. Animation version of the Stooges’ “No Fun.” Which doesn’t mean that there isn’t a real pain achieving expression in their songs. There is, perhaps, even too much of that. Perhaps that is the very reason their songs lurch – unintentionally, yet hypnotically – into the realm of caricature.
The Ramones hurt so much that reality reduces into comedy – true at every level: in their minimalist composition, their ‘thought balloon’ lyrics, the overall undeniably humorous-ironic presentation in concert (“Gabba Gabba Hey!”). Nobody could take the Ramones seriously; what better way to hide the outrage and pain of their existence?

To get at this properly, we need to begin with, and later return to, the personalities. The original Ramones might be said to be culturally and politically schizoid. Guitarist Johnny (Cummings) Ramone and bassist Dee Dee (Colvin) Ramone were of Celtic (Irish, Scottish) Christian origins, although Dee Dee also had a German mother; both were firmly to the political right. Drummer Tommy (Erdelyi) Ramone and singer Joey (Hyman) Ramone were of East European Jewish descent, and were to the political left. On the other hand, as far as personal stability is concerned, the schizoid cleft followed a different line: Both Tommy and Johnny were cool, businesslike (at first), professionally ambitious. Johnny especially could be tough-minded about “the job” of playing rock music (perhaps too tough-minded). Dee Dee and Joey were both shy and withdrawn. In their later teens, both had been kicked out of their homes; for Dee Dee, physically abused by his alcoholic father (a soldier stationed in Germany), finding himself on the streets of New York was just another moment in a seemingly endless trauma. He briefly turned male prostitute in order to survive. Joey’s experience was almost as bad, in a different way: bullied at school, he developed a punishing Obsessive-Compulsive Disorder – think Tony Shalhoub’s Mr. Monk, only without the sympathetic irony and occasional laughs. What Dee Dee and Joey shared, though, what kept them going, was a certain Romanticism – both had artistic, not business, ambitions, and sought recognition for their art, not just financial success.

These personality differences would have a profound impact on their professional relationships. By the 1980s, after Tommy had been replaced by Marky (Bell) Ramone, the Ramones entered a phase during which they hardly spoke to each other, even while sharing the same van during touring. Actually, calling it a “phase” is being too cautious – it lasted the remainder of their lives. Despite the ‘legend of the Ramones’ as perpetuated by favorable critics, fans, and occasionally laced into their own songs, the Ramones did not originate as a group of friends hanging out in the Forest Hills neighborhood of Queens. They didn’t like each other; they weren’t even all that musically compatible. The Ramones were a concept band – they developed as the embodiment, the sound, of what, in some other world, an alternate historical time-line, a rock and roll band could have been – should have sounded like. As though the ’60s had been largely skipped over, and the Kingsmen’s “Louie, Louie” (1963) had been followed the very next year by Sabbath’s “Paranoid.” Woodstock never happened in the Ramones universe; and Mussorgsky – what was he, some commie? Isn’t Emerson, Lake and Palmer some ambulance chasing law firm? Part of the Ramones schtick was feigning ignorance; but this feint hid an embarrassment. The Ramones lacked any real formal education. Whatever they knew they had picked up listening to others. In later years, Joey (beyond an encyclopedic knowledge of his record collection) developed as an autodidact, with the spotty knowledge one could expect from such. Dee Dee painted obsessively, but without any training or exposure to artists beyond the hot-house of the chic NYC art scene, the best one can say of his work is that it’s ‘original’ – not necessarily a compliment. As for Johnny, he didn’t care; and Tommy was more concerned with the business of the music business. The Ramones were the poets of those who drop out of eighth grade.
“Now I wanna sniff some glue,” because having too many brain cells is for losers.

Calling the Ramones a ‘concept band’ opens the door to unnecessary criticism – as though we were entering the territory of the Monkees or the Archies, or more recently the Spice Girls or K-Pop. But this is to think in critical absolutes. There’s nothing wrong with being a concept band any more than there is with constructing ‘concept art,’ like Warhol’s Brillo Box. The Crucifix is concept art – dropped into a culture with no familiarity with Christianity, it would simply be the representation of an executed criminal. The question is always, does the concept work within its cultural context – does it evoke the intended emotional and intellectual responses? And does it adequately embody the concept? Is it, aesthetically, what the audience expects and desires from the conceptual representation? If the answers to these questions are positive, then we’re confronting a work of art (and so much for Heidegger, who should have paid closer attention to Hegel); any lapse only produces kitsch – worthy of plastic representation on a dash-board or lawn ornament, but hardly worthy of critical comment. Raphael’s “Madonna and Child” good; plastic Jesus not so much. And here, of course, we find the real genius of the Ramones: they are plastic Jesus filling the space of “Madonna and Child” as pop music devolves into the niche-genre swamp. Except that such hadn’t happened yet. They are thus among the important precursors of the music scene in which we find ourselves today. They established a niche – ‘Ramones music’ – that only they could fill.

Ramones, the band’s first album, is not really a concept album (although it has a frame in the first and last songs, which we’ll get back to), but there are themes that pop out at us and recur. Perhaps the principal of these is that the apparent love songs on the album are about yearning for love, or about a violent breakdown of a relationship, or about the memory of a failed relationship; there are no songs celebrating being in love, being in a relationship at a present moment. They’re not ‘love songs’ so much as they ask the musical question, ‘could there ever be any such thing as love?’ And the answer seems to be no; although you and your significant other can go to ‘Frisco, join the Symbionese Liberation Army (like Patty Hearst), and perhaps die. Probably the best outcome for all concerned. Yet these ‘anti-love songs’ are written with the same simplicity and primitive poetics as the love songs of the great garage-bands of the early ’60s. One of the band’s most important creative decisions (apparently shared between principal songwriters Joey and Dee Dee) was to push garage-band formulas to the border of self-parody (and frequently beyond). That raises interesting questions. ’60s garage band songs were never intended to be taken seriously. Their writers were not aspiring, misunderstood artistes. They were teenagers with too much testosterone, celebrating their first relationship on the dance floor. Their songs were thus primitive, nearly parodic imitations of the more sophisticated love songs of jazz or swing, the more adult, more openly sexual love songs of the blues and country-western. So how does one parody a near-parody? Actually, Harvey Kurtzman figured that out back in the 1950s, exhibiting his solution regularly in the nascent Mad Magazine; and even after he left it, the magazine continued in his tradition, and does so largely to this day. Mock the mockery as though it took itself seriously; turn every image into a caricature.
Reduce personality to stereotype, reduce stereotype to stick-figure. Dialogue dumbs down to unreflective malapropisms. The imitation of life requiring suspension of disbelief becomes cartoon utterly detached from the problem of belief. One paradigmatic instance of this is Kurtzman’s brilliant rip of the old Batman comic book, “Batboy and Rubin,” recently immortalized in an episode of (appropriately enough) Batman: the Brave and the Bold (“Bat-Mite Presents Batman’s Strangest Cases”) [2]. The story ends when Rubin discovers the real master-criminal is Batboy – who then beats Rubin to death. The parody of near-parody almost always leads to violence: the problem of belief or its suspension has to be shattered in no uncertain terms. The audience is reminded not to sympathize with any such characters. The shadow play only comes to life when heads literally explode. Perhaps this partially explains another recurrent theme on the album: When the experience of adolescence is confronted directly, it is presented in horrific terms – “Beat on the Brat” (supposedly based on a real incident Joey witnessed); “I Don’t Wanna Go Down To The Basement” (“something down there” – possibly Dee Dee’s sexually abusive father); “53rd and 3rd” (Dee Dee’s autobiographical reflection on life as a male prostitute). The only quality of youth that makes it barely livable is the very absurdity of it all – “Basement” seems to reference an old z-grade horror movie, and the seriousness of “53rd and 3rd” is undercut by the singer’s insistence that “I’m no sissy.” “Sissy”?! Good heavens; monetized sexual deviance and murder reduced to school-yard boasting.

Which finally brings us to the most controversial theme embedded in the album, specifically in its framing first and last songs (“Blitzkrieg Bop” and “Today Your Love, Tomorrow the World”): Nazism. The resurgence of right-wing populism, which conservative politicians long encouraged, and which has achieved its apotheosis in the US with the Trump presidency, inevitably included the resurgence of anti-Semitism. Many conservatives didn’t see that coming, especially since there are Jews on the Far Right, like Trump adviser Stephen Miller; these may be racist, but surely they couldn’t be anti-Semites. Nonetheless, that anti-Semites would enjoy increased legitimacy in the current era was, as noted, inevitable. The fears populist demagogues play upon are actually vague, amorphous; it is up to the demagogue to give them substance and identity: any will play as well as any other. Anti-Semitism has a long and durable history, with many motivations accumulating over the centuries; it is easily resurrected. Consequently, it is now difficult to recall a time, in the early ’70s, when the horrors of WWII and Nazism and the Holocaust were widely understood as so aberrant, their initial motivations so sick, that only loonies on the margins of the Far Right could ever imagine their viability as political choices in an increasingly pluralistic American culture. But I remember that time, when as an undergraduate in Westchester County, I could hear Jewish classmates tell a joke that now appears so reprehensibly anti-Semitic, I can only remark that it involved pizza. Why would they do that? Because at the time they – we – were certain that the Holocaust was ancient history, that its motivations would always remain buried at the margins, that Nazism was merely the ghost that haunted a once defeated, now long rehabilitated, Germany. Ah, well; the naivete of youth.

At any rate, we can now see how Dee Dee is using the first and last songs on the album to reference his own youth on a US military base in Germany, and why the Jewish Joey would be willing to sing these songs. Interpersonal violence (“shoot ’em in the back,” the line Dee Dee contributed to Tommy’s “Blitzkrieg Bop,” along with the title) and world war take the place of realizable relationships with others; empty boasts take the place of vulnerably honest communication. Douglas Colvin and Jeffrey Hyman were real human beings, with real fears and hopes; Dee Dee and Joey Ramone are Mad Magazine caricatures: script by Harvey Kurtzman, art by Wally Wood. Johnny gave them the sound that numbed the pain; Tommy gave them the back-beat to bring it to the dance floor. Pain becomes pose, pose becomes life-style, life-style becomes comic book, comic book becomes music. “Hey ho, let’s go!”

The only instrumental solo on the whole album is guest Craig Leon’s electric roller-rink organ riff on the cover of Chris Montez’s “Let’s Dance” – the only song on the album that is truly celebratory. Notably, the solo is buried in the mix; one has to listen carefully for it. And listening carefully to Ramones music is not easy. It’s more fun when allowed to rush over one in waves of sound while bouncing up and down in a pogo. For some, that’s all to be hoped for from life. But perhaps that’s enough.

[1] https://en.wikipedia.org/wiki/Ramones; https://en.wikipedia.org/wiki/Ramones_(album); 2016 remaster: https://www.youtube.com/watch?v=p00fxoDpaIo

[2] https://en.wikipedia.org/wiki/Harvey_Kurtzman; https://jeffoverturf.blogspot.com/2011/07/bat-boy-and-rubin-wally-wood-mad.html; Batman: The Brave and the Bold, “Bat-Mite Presents: Batman’s Strangest Cases”: https://www.youtube.com/watch?v=vx4xpzvwRzo

The ‘60s Gave Us Head

by E. John Winner

“The tragedy of your time, my young friends, is that you may get exactly what you want.” [1]

I. Monkees, the Signifier

“The Monkees”: this verbal sign signifies a number of phenomena, certainly related, yet not identical. For instance:

–A television show, broadcast in the mid-1960s; the musical group invented for that show.

–A fictional musical group, fronted by four vocalists who were appearing in the television show, backed by a loose collective of session musicians known as the “Wrecking Crew.” This collaboration produced two albums for record mogul Don Kirshner, The Monkees and More of the Monkees.

–An actual professional musical group (from the album Headquarters onward) composed of the performers on the television show.

–The four performers themselves, an identification that would follow them the rest of their lives.

–A commercial brand for media and merchandise, from television specials to records, comic books to tee-shirts, plastic lunchboxes to toy “Monkeemobile” cars.

II. Monkees, the TV Show

In 1966, Bob Rafelson and Bert Schneider put together a situation comedy about a struggling pop-rock band, The Monkees. Television and music historians usually credit the inspiration to the Beatles’ A Hard Day’s Night, but that film’s director, Richard Lester, had carefully balanced a fictional plot about a randy old conman related to Paul McCartney with reflections on the very real working-class Liverpool origins of the band. The TV series Rafelson and Schneider developed owes much more to the Beatles’ second film, Help!, a parody of the James Bond films with mild but obvious satirical jabs at British bureaucracy, colonialism, religious cults, and the comfortably middle-class lifestyle the members of the band seemed to settle into after achieving commercial success. Rafelson and Schneider thus invented a band supposedly struggling for such success, but one that is clearly not struggling very hard. Despite spending every show trying to find gainful employment, they appear to eat well, dress well, spend plenty of time at leisure, and always come up with the rent for their apartment. They clearly have money coming from somewhere. And the performers chosen to play this band had already experienced some success in life: Micky Dolenz had starred in a minor hit TV show; Davy Jones had appeared in a popular Broadway musical; and Mike Nesmith was the son of the woman who became wealthy as the developer of Liquid Paper. Only Peter Tork came from a modest middle-class home of respected academics. There was no way these performers could ever project themselves as anything other than economically privileged.
Eventually, they would find themselves singing songs protesting middle-class conformity or the Vietnam War, but they could never sing, with John Fogerty, “I ain’t no fortunate son!” (Of the four, only Nesmith and Tork had credentials as musicians – Dolenz and Jones were hired as actors – and this played a crucial role in the development of the group dynamics ever after.)

With the Rafelson/Schneider invented rock band lacking any anchorage in a recognizable social class, the plots of the episodes became a free-form effort to throw ideas at the screen to see what worked. Usually, the narrative referred not to situations anyone would actually experience in life, but to other TV shows and old films, poking fun at genre conventions and audience expectations rather than developing comic personality or well-rehearsed schtick, the identifiers of successful comedy teams from Laurel and Hardy to Monty Python. Even the slapstick – of which there was plenty – was singularly unconvincing. The Monkees did not perform as a comedy team; they performed as young people pretending to be a comedy team. To cover for any deficiencies, the production crew deployed techniques more common to animated films: flat cinematography; rapidly paced jump-cuts; two-dimensional characterizations; absurdly exaggerated gestures and bits of business; and the occasional pop song. The show became a kind of “live-action cartoon.” That kind of comedy could only find an audience among those wholly unfamiliar with what professional comedy might look or sound like, which meant early pubescent adolescents aged 11–14; i.e., those experiencing hormonal changes but uncertain how to respond to them. This was historically and commercially important, as it represented the same youthful audience that had greeted the Beatles in their first appearances in the US in 1964. But by 1966, the older members of that audience had settled into their sexual maturity. They were no longer dreaming of a perfect romance with an imagined mop-top singing “Love Me Do.” They were having disappointing relationships and mooning over “Norwegian Wood.” But their younger siblings were now passing through that early pubescent phase and needed their own group of mop-tops singing “I Wanna Be Free.” (What teenager doesn’t?) Enter the Monkees.

III. Monkees, the Band

In casting the show, Rafelson and Schneider made an interesting misstep, one that would effectively determine all that was to follow. As the musicians of the cast, Nesmith and Tork not only aspired to become professional songwriters and performers (viewing the TV show as simply a means of gaining exposure), but they had formed friendships with members of the developing Los Angeles music scene. The musicians on the Don Kirshner albums included some of those friends. When Nesmith saw the credits on the first album, which listed none of these actual musicians but instead assigned instruments to the members of the fictional TV “band,” he went ballistic. He not only demanded proper recognition for the real musicians, he initiated an idea for taking the whole matter to another level. The fictional band would become a real band: Nesmith, Dolenz, Jones and Tork would play their own instruments on record. Instead of the brief public appearances where they sang before a backing band, in promotion of the TV series, they would set out to play concerts themselves.

At first, their contracts with the production companies involved prohibited this, but Kirshner, confident of his position, made the simple mistake of releasing a 45 RPM single that had not been approved by the production companies, thus breaking his own contract. He was let go, and with music for the show needing to be made, the producers gave Nesmith the green light to form a real band out of his TV show colleagues. Not as easy as it sounds: Dolenz was a surprisingly good singer, but couldn’t play drums; Jones was a pretty good drummer, but as lead singer his voice had limited range. Nonetheless, the performers decided to adapt to the band’s television image and rehearsed accordingly.

The band probably saw themselves at a crossroads but found themselves in a pop-culture dilemma instead. By the time they at last began working as a band of musicians, everyone knew that they had not been the primary musicians on the first two albums released under the Monkees brand. Further, pop-culture itself was changing rapidly, in ways only hinted at in the early ’60s. Older viewers of the first season of the TV show had grown up, and by 1967 – the Summer of Love; Haight-Ashbury; Sgt. Pepper; Monterey Pop – they were suddenly finding the show tawdry, superficial, childish, and banal. To the post-mod, post-British Invasion audiences of 1966, The Monkees TV series looked witty and possibly even nostalgic, a throwback to the innocence of ‘64. To the hippie drop-outs of the new Counterculture, both the TV series and the band looked plastic; a corporate puppet-show; a cynical comment on the manipulation of America’s youth.

Members of the band were increasingly aware of this and annoyed by it. Having decided to enter the community of professional musicians, they yearned for whatever respect their real talents could earn, when judged on their own merits. Instead, the jury had already found them guilty of having been created for a television show – and a not-very-good one at that. They were judged as pretending to be Countercultural while acting as shills for the status quo. In truth, it wasn’t quite that bad, and the prejudice against them was never quite that vicious. But the fact remains that the musical innovations appearing on Headquarters; Pisces, Aquarius, Capricorn & Jones Ltd.; and The Birds, The Bees & The Monkees all began disappearing into bargain-bin oblivion, while Sgt. Pepper’s Lonely Hearts Club Band swept over pop music like a tidal wave.

IV. The Frodis Caper

The band attempted to use the TV series to reconstruct their image. Having acquired complete creative control, they tried to re-direct the series to subvert the medium itself. Most notorious was an episode co-written and directed by Dolenz, “The Frodis Caper.” An evil wizard kidnaps a powerfully hypnotic alien, intending to broadcast its appearance over television, and thus take over the world. The boys in the band must send the alien home while releasing its hypnotic power freely into the atmosphere, thus bringing peace and calm to all. The episode ends with the performance of the anti-war song “Zor and Zam.” (‘Frodis’ was a codeword with Dolenz’s friends for marijuana, and veiled allusions to the drug abound.)

“The Frodis Caper” is not very well made, and the performances throughout are annoyingly hammy, but it does move along well, and the very idea of it makes for interesting television. But the band members really didn’t understand the corporate politics of television. Their second season ran because they were contracted for it, and the first season was already booked for syndication. But the production companies could see the writing on the wall. The Monkees were becoming unmanageable. Ratings were slipping fast. The show’s core audience still consisted of early pubescent adolescents, and the Monkees were now playing well beyond their cultural understanding. As for the young adult hipsters the Monkees hoped to attract, they had switched off long before. The series was canceled.

V. Head, the Movie

There are interesting stories surrounding the making of Head: the Monkees, Bob Rafelson and Jack Nicholson spending a weekend smoking copious amounts of marijuana while spewing ideas into a tape-recorder; communications theorist John Brockman becoming the “head” appearing on the promotional advertisements, for no good reason. There is also much to be said about the film itself: its evident influences from experimental film-maker Kenneth Anger; its relationship with a now-forgotten Roger Corman film, The Trip, and with Dennis Hopper’s Easy Rider; its attempt at non-linear circularity (ending where it begins, but with nothing in-between actually leading there); its obvious yet disturbing symbolism (they never escape the box). I will here merely quote a brief review of the film (revised for corrections) that I wrote for the Internet Movie Database [2]:

“Accidental masterpiece: Almost laugh as the Monkees reduce their entire career to a one-minute TV commercial about dandruff! See the 50-foot Victor Mature try to figure out what the heck he’s doing in this film! Hear Frank Zappa (with his pet cow on leash) tell Davy Jones “Your music is awfully white”! Experience the Monkees’ live performance as a real rock band playing the proto-punk “Circle Sky”! Listen as Davy Jones sings a Harry Nilsson song about having a transsexual father! Be very confused, as confused as Teri Garr is when Micky Dolenz makes sexual innuendos about her in her film debut! Witness futile protests against the Vietnam War leap out of nowhere and just as quickly disappear! Watch Mike Nesmith spit on Christmas while wearing a velvet Victorian smoking jacket in a cobwebbed Gothic horror-movie soundstage! Let yourself drift into karmic bliss with Peter Tork, inspired by a comic-book version of Indian mysticism delivered by a hammy character-actor! Discover Academy Award winning director Bob Rafelson’s first feature length film, attacking the television phenomenon he himself had invented, as written by Academy Award winning actor Jack Nicholson! Pretend it’s not happening, when the Monkees commit group suicide by jumping off a bridge! Take as many drugs as the cast and crew evidently did while making this film!

With Head, the Monkees revealed themselves as the angriest, snottiest entertainers in Hollywood history, bar none. It is bewildering to discover that they blamed the failure of this film on bad promotion. To be sure, it was virtually non-existent, but did they not recognize how angry, how depressing, how self-destructive this film actually is? Head is a bad trip on acid to the suicide ward of a mental hospital. This film reveals why life in the later 20th Century was almost unbearable … if you were lucky. It’s not simply that Western culture was suffering from serious information-overload, but the information itself was bad. In fact, it was the overload effect itself that kept people going, since this allowed people to keep distracting themselves with one crisis or another. If news from Vietnam became too much to bear, you could turn the channel and watch a documentary on the rising unemployment rate instead.

The “positive” response to the reality revealed in Head was Woodstock: three days of mud and bugs and bad food and bad acid. All taking place behind a steel fence, under the lovingly watchful eyes of a veritable army of New York State Troopers, which meant that the “freedom” of Woodstock Nation was as illusory as the song John Sebastian thought he was singing while so strung out he could barely speak.

The one good thing occurring there was Jimi Hendrix’s stunning improvisation on “the Star Spangled Banner.” A year or so previously, the Jimi Hendrix Experience had gone on their first national tour of America, as the band opening for the Monkees. See, it’s all connected somehow.”

VI. Head, the Soundtrack

With the digitization of music in the current era and the consequent collapse of physical recorded music, the very notion of a “concept album” has become outdated. It’s difficult to convey how important the phenomenon was to the music of the 1960’s and the Counterculture that enjoyed it. The basic model of the concept album goes back to the song cycle of the early 19th century. It was originally codified in the 1950’s by Frank Sinatra, as a sequence of songs developing a mood or idea. The first album in the rock idiom to be widely recognized as a concept album was the Beach Boys’ Pet Sounds, a sequence of songs about a doomed summertime love affair, ending with the recording of a dog barking at a train disappearing down the tracks. But the concept album has an interesting problem: since the songs can be played individually, what really holds the album together beyond its physical formatting? By the end of the ’60s various composers answered with the development of the so-called “rock opera,” the most cohesive of which was probably Webber and Rice’s Jesus Christ Superstar, since it had a linear narrative. Yet there were a number of artists who scorned linearity and still tried to elevate their albums to the point where the experience of the whole would be greater than the sum of its parts.

Head includes probably the strongest music the Monkees committed to record. It’s certainly the best sounding. Of the six traditionally structured songs on the album, three are basically in the sub-genre of what we now call “garage rock”: fast guitar-driven shouts at the dance floor, reminiscent of, say, the Electric Prunes’ “I Had Too Much to Dream (Last Night).” But even these songs sound remarkably fleshed out, with layered instrumentation and subtly pumped reverb. The psychedelic thunder of the organ-driven “Porpoise Song” is lush in arrangement and recording, and in their own ways so too are the moody folk ballad “As We Go Along” and the mock show-tune “Daddy’s Song.” After more than fifty years, the album still sounds good. Even Sgt. Pepper sounds dated by comparison.

But the non-traditional material is what really holds the album together. Primarily, there are all the sound-effects and bits of dialogue from the movie. They have rhythms of their own and link the traditional songs by leading into them or away from them with something of a musical timing. They comment on the songs and are commented on by the songs; e.g., the pseudo-Eastern “Can You Dig It?” rattles to its conclusion, whereupon Davy Jones is heard to remark “I’d like a cup of cold gravy with a hair in it.” The pretensions of the song are thus undercut by a crude joke about consumption. The album concludes with an untitled orchestral string composition by Ken Thorne, snapped into the last real track of the album, “Swami – Plus Strings, Etc.,” a miniature aural collage. It opens with a pedantic but vapid recitation of some vaguely Hinduistic aphorisms, over repeated dialogue and effects from the film, bleeding into a reprise of “Porpoise Song.” Then we hear the truck that in the film carries the band (trapped in a fishbowl) back into the studio; and then, suddenly, the cheerful arpeggio of violins with orchestral backing, a much-needed tonic to all that has come before. Because what has come before is a sardonically tinged expression of despair. The key (and it’s rather obviously such) is the non-traditional a cappella “Ditty Diego – War Chant,” a self-lacerating parody of the theme song to The Monkees TV show:

The money’s in, we’re made of tin,
we’re here to give you more
The money’s in, we’re made of tin
we’re here to give you
(- gunshot -)
Davy: give us a ‘w’
Peter: give us an ‘a’
Micky: give us an ‘r’
Mike: what does it spell?
(- explosion -)

The Monkees thus contextualize themselves as a trivial distraction from the horrors of the real world then unfolding in Vietnam – a commodity for sale, having no integrity or higher aspirations. But knowing this is exactly the cause of their despair:

A face, a voice
An overdub has no choice
An image cannot rejoice
Wanting to feel
To know what is real
Living is a, is a lie [3]

The non-linear circularity that I noted in the film’s structure is an artistic mannerism. But in the album, it becomes a deadly trap. Some of the songs explicitly complain of it, especially “Long Title: Do I Have to Do This All Over Again.” But it lingers suggestively throughout, even in the melancholic “As We Go Along,” where there is a developing sense of having nowhere special to go, but “we’ll make up our story as we go along.” The trap is simple: whatever there is to do, do it again; and again; and so on, regardless of whether there is any meaning or purpose to it. As “the Swami” says: “For where there is clarity there is no choice/ And where there is choice, there is misery.” Be a Monkee or be damned.

A living fiction, the band had no reality, only the squared circle of the television world that created them; the box in which they were trapped. But approaching reality meant confronting the threat of non-existence – psychologically, the loss of identity, of self. In the film, escape suggests suicide. In the album, it is the final dive into a fantasy world where “the porpoise is waiting good-bye, good-bye.”

The Counterculture of the 1960’s could not last. It was merely a personality-crisis masquerading as a celebration of freedom. It ultimately required various emetics, almost none of which were wholly effective or satisfying. The Monkees may have been “A manufactured image/ With no philosophies,” as they admitted. But in their particular self-destruction in Head, they provided their own emetic, and the soundtrack to beguile our nostalgia for it.

[1] Head; Columbia Pictures, 1968; written by Jack Nicholson and Bob Rafelson, produced by Nicholson, Rafelson and Bert Schneider, directed by Rafelson.
[2] https://www.imdb.com/review/rw1431632/?ref_=tt_urv – The misstatements made in the original review were the product of a long-time bias against the Monkees, accepting the anti-Monkees myth of the ’60s Counterculture.
[3] “Porpoise Song,” Gerry Goffin and Carole King.

Marxism for Dummies Like Me

by E. John Winner


When I was 17, I fell in love with three older men. Indeed, they were so much older that two of them were dead at the time. They were everything I wasn’t. They were short and thin, and though not athletes, they moved swiftly and with agility. One couldn’t say that they were good looking (except that one was admittedly so beautiful that it was hard not to fall in love with him). But this didn’t stop them from making an impression on others. They were extroverted and quick-witted. Unlike the tall, fat, awkward, and self-conscious young man that I was, they moved easily among people; knew what others wanted; and found interesting ways of providing it, which they practiced without hesitancy or doubt. Remarkably, they could enter a room full of people and immediately dominate the social environment, not by drawing attention to their wealth or higher social status, but by loping in and out of the margins, attracting attention only when doing so would get them something they wanted or simply was fun. And because it was fun, it was not about all the serious things going on in the world, but about the joy of debunking them.

What in the world would need debunking? Of course, I’m talking about the social world; the world that we inhabit solely by virtue of our status as articulate, community-engaged social beings. I’m talking about an interwoven web of customs, beliefs, even ideological commitments shared with others, from humble practices of “good manners” to grand rituals and celebratory rites. We live these beliefs and practices; they permeate everything we think, say, or do. They only function because we believe in them; or if we cannot believe, we observe them with respect, perhaps with fear (there are repercussions for not doing so, some quite grave), perhaps sometimes with compassion. But for believer and non-believer alike, they are all to some extent restrictive of our capacity for joy and enjoyment. Sermons in church and lectures at school are dull, often repetitious, and waste time that could be spent otherwise. Passing the salt (when asked “please”) to a doughty old aunt at the dinner table is an imposition, and then one has to listen to her gossip about neighbors about whom one couldn’t care less. And then there’s the retailer who drones on and on about political issues that he doesn’t understand. If he’s not shown deference, will he refuse to wrap up the shirt you just bought? The restaurateur wants us to wear shoes? Let the Health Department provide them. I mean, it’s their bloody rule after all.

For who would bear the whips and scorns of time,

Th’ oppressor’s wrong, the proud man’s contumely,

The pangs of despis’d love, the law’s delay,

The insolence of office, and the spurns

That patient merit of th’ unworthy takes,

Who indeed? There’s the rub. It is not the fear of some unknown country that we endure, but the words, practices, and beliefs that bind us into community; however unhappily that might be.

Wouldn’t it be fun to fart loudly during the sermon? To lie down on the teacher’s desk and fall asleep? Throw the salt at the aunt? Pull the shirt over the shopkeeper’s head after poking him in the eye? The social world only exists because we agree that it does. This is not a bad thing and certainly not a reason in itself to tear everything down. (Since there is no “starting over,” what would we be left with?) But we sometimes need to remind ourselves of this; to poke the world and puncture its inflated importance, even if only just a little.

It is not that the Marx Brothers respected nothing. They respected children and the working poor. They respected young lovers and the pleasures of life. They certainly respected good food and a good cigar. They also respected music, with no restrictions with regard to genre or popularity. When, in A Night at the Opera, they seem to savage Il Trovatore, it is obvious that it is not Verdi’s opera itself that has called down their scorn, but the institution of the operatic theater, which had drawn a curtain of wealth and social status around the pleasure of listening to the music itself. [1] When the young lovers are at last allowed to sing their duet, and the forces of social control have been conned and crushed by an aggressive “Marxist” assault, the opera is liberated and opened to the audience, who can then enjoy the music as music, and appreciate the talent of the singers without the imposition of ego or moneyed influence.

What the Marx Brothers did not respect were social institutions that endeavor to determine in advance what could be enjoyed; what should be considered talent; what direction our lives should take in any walk of life. The trouble is that the normal and the normative don’t merely socialize, but stultify. One of the reasons that I’ve never thought of the academic world as truly determinative of knowledge or value is because I took the lesson of Horse Feathers to heart. [2] I don’t understand how one can watch that film and continue to think that stuffy professors with an overweening sense of self-importance have anything to tell us that we couldn’t learn for ourselves. At best they may provide us with facts of which we might have been unaware, or challenge ideas and help us to see the world in a different way than we had previously. At worst, they yammer on and on about themselves and what they and their colleagues have decided is interesting.

Horse Feathers makes clear that what holds the Academy together is largely acquiescence; a willingness to agree with the status quo without serious question. Well, that and … college football. Why football? Two reasons: First, American football is one of the silliest sports one can imagine, and we also know, now, that it is one of the most dangerous (which makes it even stupider). Twenty-two guys running across a field to knock each other down over a piece of pigskin? The absurdity is self-evident. And yet, hundreds of millions of dollars pour into college coffers so as to provide the opportunity for college students to do this (and get serious concussions), despite it having nothing to do with what is done in the classroom, which is the raison d’être for colleges and universities in the first place. Meanwhile, the professors, who should know better, agree that the football show must keep going, despite the fact that a lot of their colleagues are doing shoddy research and publishing shoddy books, and their students are all falling asleep in their classes. Who can take any of this seriously? Well, most of us do, especially if we’ve been to college. But it’s good for us, occasionally, to hear Groucho sing “Whatever it is, I’m against it,” in his reply to questions from his faculty concerning his plans for Huxley College.

I first saw Horse Feathers at age 17. I had already fallen in love with the Marx Brothers by attending a screening of Duck Soup, their anarchic blast against government, war, fascism and greedy peanut vendors. [3] However, the revival of their films was not yet in full swing. When I chanced upon a TV Guide listing showing that Horse Feathers would be broadcast on a local station in a neighboring city (Syracuse), I pulled together what cash I had in hand, hopped a bus to that city and rented a room in a cheap hotel just to watch it. The print was in rough condition (and still is, as the television print is all we have of it), but the film was everything I hoped it would be: part Bugs Bunny; part Midsummer Night’s Dream; and part Godzilla stomping Tokyo. A weird blend of the comically chaotic with expertly executed satire. A blend the Marxes perfected in vaudeville and on Broadway, and which they managed to maintain through all of the first seven of their feature films. That night in Syracuse remains one of my happiest memories.

Horse Feathers is an interesting film for Marx Brothers fans, because it includes remnants of their first successful vaudeville sketch, Fun in Hi Skule, one of which is the scene where Groucho gives a lecture on the cardiovascular system, and Harpo and Chico pelt him with spitballs. (“The blood rushes to the feet, gets a look at those feet, and rushes back to the head again.”) The brothers had started out in musical specialty acts, but this skit would not only establish them as comedians; it would define their comedy – not in the skit as written, but in its development. It may have been the skit they were performing in Nacogdoches, Texas, when the whole audience ran out to watch a local farmer deal with a mule that had broken its leg on Main Street. When the audience returned, the Marxes abandoned the script to assault the town with a battery of ad-libbed insults. For whatever reason, the audience loved it, and Marxist anarchy was born.

The skit also helped to define the characters they would develop over the years. For one thing, the characters were all ethnic stereotypes. Everyone knows that Chico was not really an Italian, but many people do not know that Harpo had red hair (via a wig) because his character was supposed to be Irish, an identification that never really developed as Harpo was given fewer and fewer lines to speak over the years, transforming into an almost entirely physical comedian. Groucho, meanwhile, in Fun in Hi Skule, was supposed to play a stuffy German immigrant teacher, but during WWI, sensitivities were such that he began playing the part with a Yiddish, rather than a strictly German, accent.

The Marx Brothers were, of course, Jewish. They were not practicing Jews, and neither Groucho nor Chico were believers; only Harpo occasionally voiced a faith, however vague, in a divine something-out-there. But Judaism is not just a religion; it is also a community, a culture, a tradition. Their father was a largely unsuccessful tailor in New York City, but their mother, Minnie – born the daughter of a magician back in Germany, and with a brother, Al Shean, already on his way to vaudeville stardom – always had theatrical aspirations, which she strove to instill in her sons. I've seen no evidence that Minnie ever urged her sons into the Yiddish theater scene, which was thriving at the time, although it's impossible to believe that, living on East 92nd Street, they didn't wander down to the Yiddish theater district running along 2nd Avenue into the East Village. But Minnie clearly tried to direct her sons into mainstream Vaudeville (at least those sons who could be directed, initially Gummo and Groucho; Chico and Harpo spent their late teens playing piano in bordellos, since these frequently also served alcohol in an adjacent barroom). This tells us that her aspirations were truly national in character, since the Yiddish theater circuit reached only into major cities, and not very many of them. Unsurprisingly, the Marxes, in their various evolving acts in Vaudeville, traveled throughout the Midwest and as far into the South as was safe for Jews to go. (Groucho would later sometimes reject a writer's offered joke with a disdainful "Will it play in Peoria?") The strategy worked, and the Marxes finally broke out of Vaudeville into "legitimate theater" – i.e. Broadway – and then into cinema, just as synchronized sound filmmaking was being perfected and becoming all the rage.

Nonetheless, in their best work the Marxes retained a real sense of their experience as children of Jewish immigrants. Notably, stowaway immigration to America figures as an important plot device in two of their major films, Monkey Business and Night at the Opera. [4] One important function of Chico’s ersatz characterization of an Italian immigrant was that it allowed jokes about immigrants to be made throughout their careers. The characterization is commonly noted to be so transparently, obviously fake as to make it easy to deliver these jokes with tongue firmly in cheek. On some level, they really weren’t about Chico qua immigrant so much as about Chico as a fake immigrant. When in Animal Crackers, Chico’s Ravelli reveals that he knows art collector Roscoe Chandler is really former fish peddler Abie from Czechoslovakia, Chandler complains “Say, how did you get to be Italian?” [5] “Abie the Fishman” was apparently a standard stereotypical Jewish trope in American humor at the time. [6]

This is noteworthy, because when Chico engages in banter with Groucho (who's supposedly wittier than he is, but whom Chico manages to befuddle every time), what we hear, if we listen carefully, is a Jewish comedian playing an Italian immigrant effectively getting the better of a Jewish comedian playing a character that the Italian recognizes as a Jewish immigrant. It's not so much a competition to see who's more Jewish as to see who's more of an immigrant. Groucho, after all, has, by the time the Brothers reach the cinema, dropped any overt Jewish (Yiddish) accent. In his shabby tuxedo and with his constant wooing of Margaret Dumont's stuffy matriarchal dowager, he's clearly trying to find some way out of his ethnic identity, and Chico won't let him get away with it.

We can see this especially in the famous routine from The Cocoanuts, known by its most famous line, "Why a Duck," and even more specifically in that line itself. [7] The setup is that Groucho is trying to get Chico to act as a shill for him at an auction of some questionable property he has on his hands; that is, to raise the bidding by topping each bid until the desired amount is offered, not by Chico, who's penniless, but by a monied investor pressured by Chico's escalating bids to pay a higher price. (Of course this leads to disaster, since Chico doesn't know when to quit, and outbids all the legitimate investors.) But first, Groucho has to direct Chico to the site of the auction, which happens to be across a river.

Hammer: Now, all along here, this is the river front. And all along the river…all along the river, those are all levees.

Chico: That’s the Jewish neighborhood?

Hammer: (pause) Well, we’ll pass over that…You’re a peach, boy. Now, here is a little peninsula, and, eh, here is a viaduct leading over to the mainland.

Chico: Why a duck?

It is notable that in George Kaufman's draft of the original play, Groucho's character was not named Hammer but Schlemmer. It is unclear exactly how the joke about levees and Passover would have played in Peoria. But "Why a duck?" has enjoyed nearly a hundred years of recitation, even by those unfamiliar with the work of the Marxes or its import.

Because of Chico's broken imitation-Italian accent, because his character often pretends to be slower, mentally, than he actually is, and because the character has scant regard for "proper English" in any event (and possibly because Chico did have difficulty remembering his lines), Chico often seems to intentionally misunderstand what Groucho has to say to him, not just semantically but syntactically as well. A famous instance, in Night at the Opera, comes when Groucho tries to explain the sanity clause in a contract, to which Chico laughs derisively, "There ain't no Santy Claus!" So I think most people hear "Why a duck?" in much the same way: Chico acting dumb and misreading Groucho to make a rather tortured pun. And that's what it is, but its source is not a petulant refusal to hear what is clearly spoken and ask, "Well, what's a viaduct?" Surely even he can hear the 'v' instead of the 'w' in 'viaduct'. He can, but it's not his own mistake he's remarking upon. It doesn't take a genius to remember that 'w' is often pronounced 'v' in middle and eastern European languages. What Chico is doing is listening to Groucho as a fellow immigrant. Herr Schlemmer appears to be answering a riddle he hasn't fully expressed: "Oy, it's a vy a duck leading over to the mainland?" "Why a duck, why no chicken?" Two hapless rabbis debating an obscure line in the Talmud. The humor of the line really depends on one immigrant refusing another immigrant's wish to be accepted as not an immigrant, as no longer speaking the old language or with the old accent. Chico is not letting him get away with it any more than he lets Roscoe Chandler be anyone other than Abie the Fishman in Animal Crackers.

If American music can be said to have been largely determined by Africans who were imported here against their will (and a strong case can be made for that), it might also be said that American humor has been strongly shaped by the humor of Jews who felt pressured to "export" themselves from Europe to America due to the ongoing threat of unjust laws, social discrimination and even pogroms in homelands where they were never allowed to fully settle or even feel welcome. African American music can be said to constitute an effort to define a culture in opposition to an oppressive social order. Jewish American humor constitutes a legacy of immigrant memory and experience, an effort to salvage a culture forced into a nomadic existence by suspicious Christians and their Modernist heirs who barely understood it. I mention these efforts together because of the enormous impact they have had in shaping the America in which we find ourselves today: an America that has never truly defined an "American Culture" as such, and never will until it embraces all the many different cultures that have contributed to the rich, diverse opportunities for expression and creativity of which the American people have revealed themselves capable in their brighter hours.

It’s not “the better angels of our nature” we really need to listen to. It’s the Marx Brothers.


[1] MGM, 1935. Director: Sam Wood. Written by: George S. Kaufman, Morrie Ryskind, James Kevin McGuinness (story), Al Boasberg.

[2] Paramount, 1932. Director: Norman Z. McLeod. Written by: S.J. Perelman, Will B. Johnstone, Bert Kalmar, Harry Ruby.

[3] Paramount, 1933. Director: Leo McCarey. Written by: Bert Kalmar, Harry Ruby, Nat Perrin (additional dialogue), Arthur Sheekman.

[4] Paramount, 1931. Director: Norman Z. McLeod. Written by: S.J. Perelman, Will B. Johnstone, Arthur Sheekman.

[5] Paramount, 1930. Director: Victor Heerman. Written by: George S. Kaufman, Morrie Ryskind.

[6] https://www.jewishchronicle.org/2013/04/28/abie-the-fishman-embodies-diasporas-at-conference/

[7] Paramount, 1929. Director: Joseph Santley, Robert Florey. Written by: George S. Kaufman, Morrie Ryskind.