Recent readings at other blogs have persuaded me that the state of philosophy is both chaotic and yet strangely moribund. Here I approach the problem from three directions: 1. Where is professional – academic – philosophy now, in the broadest sense? 2. What is the status of non-professional philosophers, given the current electronic media access that allows presentation of just about any philosophic musing on the web? 3. How can we promote a deeper public philosophy than the dominant public philosophy in the era of triumphant capitalism? These reflections, beginning as comments elsewhere, are not intended to be conclusive. Yet they do seem to intersect at a strange but interesting point.
It seems that right now, as an inheritance of the dominance of Logical Positivism, the overwhelmingly Analytic philosophy we find among academic professionals is defended by the simple thesis:
‘An important function of philosophy is the clarification of concepts; scientists engaged in the clarification of concepts are really engaging in philosophic reflection; scientific classification involves the clarification of concepts; philosophy is thus a beneficial complement to scientific classification.’ Or in short, philosophy studies the ground from which science develops.
Well, I’m not going to argue that point directly here. Frankly, my own distrust of the effort to resolve philosophy into science has been discussed before; strangely enough, I have just as little interest in somehow boot-strapping science into philosophy. What I really prefer is that scientists pursue science, philosophers pursue philosophy, artists pursue art, bakers pursue baking. In other words, as Mao wrote, “let a hundred flowers bloom.” The world is large enough for all manner of intellectual reflection and exploration.
But unfortunately, this tethering of philosophy to science has had a deleterious effect on the methodology of philosophy and its teaching.
While propositional logic (‘if p, then q; p, therefore q‘) was first developed (in the West) back in ancient Greece, it has, in the past 200 years, developed into the principal logical methodology of academic philosophy. That’s because, unlike classical syllogistic (‘All X are also Y, Z is an instance of X; Z is therefore also Y‘), propositional logic is amenable to two developments that syllogistic only fits very awkwardly – symbolic logic (flowering into first order logic on the one hand, and algorithmic – computational – logics on the other), and analysis of grammar. So there are all manner of subtleties and systems one can construct out of propositional logic; it has proven resiliently ‘text-productive,’ and professional academic philosophy is all about the production of texts.
However, as a means of spreading interest in philosophy beyond the ivied tower, emphasis on propositional logic has been an enormous failure. Indeed, taught as the primary and necessary methodology of academic philosophy, it has proven the single greatest impediment to the development of interest in philosophy as a whole among young people.
The problem with propositional logic is that it comes with a car-load of technical details necessary to uncover and test implicit premises, without which it dissolves, in common discourse, into classical enthymeme. Most of the students in my undergraduate symbolic logic course, which taught us the basics of propositional logic, came away as cynical logic-choppers who felt they could pretty much twist any argument to their purposes.
“1a. If we have hair, then the food we eat must have hair in it.
1b. If the food we eat has hair in it, then we have hair inside of us.
Therefore, 1. If we have hair, then we have hair inside of us.
2.We have hair.
Therefore, we have hair inside of us.”
This classic example may be valid, but it is also very silly: 1a is poorly structured and, as structured, untrue; and 1b is only trivially true *sometimes* (not all food we eat has hair in it). Thus 1 remains unsupported, and the conclusion does not follow. Yet it is a valid argument.
So is this:
“If (young, black, male), then likely criminally motivated.
If a car has a criminally motivated person in it, then it is likely stolen.
If a car has a young black male at the wheel, it is likely driven by one criminally motivated.
This car has a young black male at the wheel.
Therefore this car is (likely) stolen.”
There are far too many white cops in America reasoning precisely in this fashion; and far too many politicians arguing in this fashion (see Trump’s recent remarks on Mexican immigrants). And one can’t really unravel such reasoning without going into a host of issues that have nothing to do with validity, but do have to do with hidden, uninformed premises. (Like, “blacks have the same opportunities and culture as whites, so if there are increased crime rates among blacks, then this indicates inherent criminal motivation” – exactly the kind of premise accepted by ‘sociobiologists’ and ‘bio-criminologists’ and Ev-Psych and the like, BTW.)
The problem with propositional logic is that to get it to work meaningfully, it has to be applied analytically – one has to be able to deconstruct a discourse into its propositions in order to run these through truth-tables; then, having done that, one has to check the propositions as reasonable or empirical claims, in order to determine justified true belief in the conclusions. That’s not really what we do in real life.
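The validity/soundness split at work in the hair argument can be shown mechanically. This Python fragment (a toy illustration; the proposition letters are my own hypothetical labels, not anything from a logic library) brute-forces the truth-table and confirms that the form is valid – no row makes all the premises true and the conclusion false – while saying nothing at all about whether the premises are actually true:

```python
from itertools import product

def implies(a, b):
    """Material conditional: 'if a then b' is false only when a is true and b is false."""
    return (not a) or b

# Hypothetical proposition letters for the hair argument:
# p: "we have hair"; q: "the food we eat has hair in it"; r: "we have hair inside of us"
valid = True
for p, q, r in product([True, False], repeat=3):
    premises = [implies(p, q),   # 1a
                implies(q, r),   # 1b
                p]               # 2
    if all(premises) and not r:  # a row where the premises hold but the conclusion fails
        valid = False

print(valid)  # True -- the form is valid, whatever the premises' actual truth
```

And that is exactly the limit of the truth-table: it certifies the form, while the real work – deciding whether ‘if we have hair, then the food we eat must have hair in it’ is a reasonable empirical claim – lies entirely outside the calculus.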
There’s a reason why the foundation of logic was, for many centuries, the classical syllogism: this is how we really reason – through a process of intuitively driven universals linking into the particulars of experience.
In teaching the logic section of my basic composition courses, over the twelve years I taught, I did what successful teachers of logic did long before the birth of Frege: I taught classical syllogistic. I focused especially on the logical square (of opposition) and the problem of induction. The first emphasis was necessary to surface students’ hidden biases – “If (young black male) then criminally motivated” translates into “all young black males are criminally motivated,” which is deductively negated by “some young black males are not criminally motivated” (empirically demonstrable), which can only be answered with “some young black males are criminally motivated” – which proves nothing deductively (“some are, some are not”).
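The contradictory relations on the square of opposition can be put in miniature. In this sketch (toy sets with hypothetical names, purely for illustration – the classic white-swan example stands in for any universal claim), the A-proposition ‘All S are P’ is defeated by a single counterexample, while the I-proposition ‘Some S are P’ leaves the universal entirely unsettled:

```python
def A(S, P):
    """A-proposition: 'All S are P'."""
    return all(x in P for x in S)

def I(S, P):
    """I-proposition: 'Some S are P'."""
    return any(x in P for x in S)

def O(S, P):
    """O-proposition: 'Some S are not P' -- the contradictory of A."""
    return any(x not in P for x in S)

# Toy sets with hypothetical member names
swans = {"s1", "s2", "s3"}
white = {"s1", "s2"}          # s3 is the lone counterexample

print(A(swans, white))  # False -- one counterexample defeats the universal
print(O(swans, white))  # True  -- 'some swans are not white'
print(I(swans, white))  # True  -- yet this alone settles nothing about A
```

On any non-empty subject set, A and O cannot both be true – which is why surfacing the universal hidden inside a prejudice, and then producing one counterexample, is enough to break it deductively.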
The reason for emphasizing the problem of induction was, first, to link the learning of good composition with the popular conception of the ‘scientific method,’ but also to blast through the ‘anecdotal fallacy’ (if that is the technical term). The anecdote is an important source of personal reasoning in common discourse – and every anecdote is an instance of inductive reasoning. It just happens to be generally faulty; yet the science that demonstrates its faults also happens to depend on inductive reasoning. So it is important to stress the difference between individual experiences and repeatability – what anyone can experience if they perform the same operation.
As to philosophy (beyond logic): I was fortunate, in the year after receiving my doctorate, to teach at a school permitting me to teach two undergraduate courses related to philosophy, Literary Theory and Theory of Rhetoric. Designing the Literary Theory course, I thought it pointless to try to teach Derrida and Stanley Fish without the historic background on which contemporary theory depended; so I taught Hegel and Kant. Gosh, that was a lively course, with some great students! The Rhetoric course included Aristotle, Augustine and Nietzsche. It wasn’t so exciting, but it did get us where we wanted to go – development of language as a useful tool for communication.
The introduction to philosophy just has to include an introduction to its history. I admit I was an exceptional high school student – my introduction to philosophy was in 9th grade, through reading Voltaire, who introduced me to “Saint Socrates” (as Voltaire called him). But my experience as a teacher confirmed that the brighter students – those who want to learn (and nothing can be taught to anyone who doesn’t want to learn) – are open to anything that is rich in possibility and open to further discovery. History does this every time.
So too do the big questions. “Why is there something rather than nothing?” may seem silly to Lawrence Krauss – it seems silly to a lot of people. But it may be just the right question to pose to a young Heidegger – or a young Lawrence Krauss, for that matter.
A major problem for current professional philosophy is that it seems to trivialize our real thinking and aspirations. For instance, ‘thought experiments’ about ‘philosophical zombies’ we will, in the long run, see as trivial: problems about consciousness are about us, not hypothetical entities on other planets in other universes. Similarly, John Searle’s Chinese room ‘thought experiment’ is not itself trivial, but the hope for conscious computers, which it addresses, may well be.
Philosophy, despite its emphasis on reason, should first appeal to our hearts – to what is most dear to us. If it doesn’t do that, it wastes our time.
This brings us to another interesting problem, that of the inspired – frequently passionate (often all too passionate) – non-professional, pursuing philosophy out of a love for it. Occasionally this love is driven by a desire to ‘save the world’ intellectually – to advance a ‘grand theory’ that will at last solve all our philosophical problems and thus lay the foundation for the advancement of knowledge and society. I admit I find such philosophizing embarrassingly narcissistic. Having engaged in intellectual debates since 1970, I have seen world-saving theories and proselytizing come and go; fervent, even fanatical young people become disappointed, cynical manipulators; disillusioned old men and women who look back on their seemingly heroic pasts and wonder, ‘what happened? why did we never get to the promised land?’ So shrill promises of Ultimate Theories of Everything – even when they come from famous scientists – tend to find my ears deaf and my eyes looking elsewhere. The one thing I’ve learned from the aging process is that life is surprisingly short, trusted knowledge often surprisingly false, ambition ephemeral and misplaced, ideals inevitably unrealizable. It should be remembered that human intelligence is a biological aberration – it does not make us the superior animal, it makes us the abnormal animal, the freak of the animal kingdom, the disease in the ecological body, unable to control our impulses and able to inflict enormous harm on other species, the planet – and of course ourselves. Dinosaurs walked this earth for tens of millions of years; if we make two million, we’ll be lucky.
So enough with the world-saving Grand Theories of Everything, already! They have their history, a kind of shadow history to the development of human thought – Just as the history of sailing has a shadow history of ship-wrecks, sinkings, beachings, storms and becalming, and the occasional disappearances of whole crews….
But there are other independent, non-aligned thinkers (and I consider myself one of these) who philosophize because they wonder. They look about themselves, or reflect on their own responses and thinking, and they are annoyed by questions that referral to books or science or classes they take at local colleges never quite satisfactorily answers. Sometimes they simply continue to question. Other times they try to propose an answer themselves, and if they are careful enough, and determined enough, they will produce interesting theories and insights of their own that cannot be realized in an academic setting. If they are bold enough, they may attract attention; if they have engaged in the study and reasoning their questions demand, they can be taken seriously – and they ought to be. Since its various historical initiations, philosophy has usually become entrenched in some school or other. But the initiation itself never takes place in any school. Should we not remember that modern philosophy was initiated by thinkers like Locke, Hobbes and Descartes, who were complete outsiders to the ‘professional philosophy’ of their day (university theology)?
I think a serious topic worth consideration is the place that can be found in philosophy for the non-professional who philosophizes. Before the internet, the topic fit a small niche referred to (sometimes in whispers in the academy) as ‘independent scholarship.’ (It was so called because, when these writers published, the strictures of good scholarship were still insisted on.) Such independent scholarship could be found in a host of fields, from mathematics to literary criticism, and certainly included philosophy. The internet, which allows anyone to ‘publish’ anything, has so overloaded us with ‘independent’ theorizing that it’s become difficult to determine the worth of theories through ‘editorial standards’ (and this I think has affected print publishing as well, sadly). Nonetheless, there is no doubt in my mind that there are some independent – non-professional – theorists and critics worth reading on the web. But what is the place they have – could have, should have – in larger conversations that might include professionals?
It should be remembered here that this question is asked at a moment in history when a fractured educational system and an electronic media environment that seems to make any opinion viable regardless of foundation have together lowered the interest of the general populace in any deep learning at all. Indeed, the majority of Americans (I can’t speak for other countries) have nurtured an old distrust of education and reflective thought to the extent that institutions of ‘higher education’ are considered little more than necessary evils along the path to acquiring resume essentials – in short, an unpleasant necessity for getting a better job.
What we need is a public conversation – and really a public philosophy – that promotes learning as such, as an inherently good thing. Promoting science or the humanities as a source of pleasure or greater health or greater wealth or greater power over others merely re-enforces the notion that knowledge can be, not an end in itself, but a means to greater control – over nature, over others, over reality itself. The political dangers involved in such a notion should be obvious: a culture of all against all, with winners forming a governing elite with powers of coercion previously only dreamed of.
Indeed we’re close to that now. For much of the 20th century, America had a public philosophy – it was called Pragmatism. It had a ‘left’ (like Dewey) and a ‘right’ (like O. W. Holmes), and was more or less espoused by artists, aestheticians, scientists (even ‘scientismists’), businessmen and politicians. Pragmatism did tend to emphasize the need to further our interests and promote useful values – but these values included education and knowledge across a broad spectrum of possible interests, including of necessity public interests, shared social interests that superseded pure self-interest, because humans are social animals, and human self-interests are as nothing without a society in which these can be pursued (and shared with others).
The 1980s did bring forth a cultural revolution, the Reagan Revolution, which is properly so-called (although Reagan himself was mentally unbalanced and incompetent at anything other than delivering speeches). What this revolution did was focus, enforce, and then disseminate ideological traces leading back to a business ‘philosophy’ (the shared beliefs of a certain commercial class) existing prior to Pragmatism (it was born of the triumph of Northern industrialism during the Civil War), which Pragmatism attempted to account for and accommodate, but which had been held in check during the efforts to recover from the Great Depression.
So let us call this business philosophy Reaganism, and recognize that it extends well beyond the restructuring of America’s economy. Reaganism involves, at the level of public discourse, the re-interpretation of all values into the language of the marketplace: ‘cost-benefit analysis’ becomes the gold standard by which we determine the utility of our interests and our policies.
This language cannot sustain a broad defense of even the sciences – nor of any research interest that is not amenable to such cost-benefit analysis. When Pragmatism was the public philosophy, we had a language by which to argue that discovery was a good in itself, that knowledge could be of myriad uses without need to specify any of these as reason to fund (or defund) any research.
We can’t go back (and I am certainly not talking about any kind of golden age, anyway). But we do need to find some way to argue the funding of research in non-commercial terms – or the discourse will grow narrower over time, until nothing is funded that hasn’t been reviewed by an accountant for its potential profitability.
As these lines of thought grow closer together, their intersection surely achieves greater clarity. It may sound like a bit of hubris – indeed, it may resound like a bit of a ‘Grand Theory’ in a way. Yet the conclusion seems unavoidable.
The place of the independent thinker in the contemporary situation is to maintain a public discourse that promotes learning and thinking for its own sake, in a manner of which the professional academic is no longer capable. (Not because they are unintelligent, or lacking in reflection or insight or even passion for their field; but because the accumulated history of their professions, coupled with the professional demands on their language and methodology, preclude them from the adventurous and innovative questioning of the conventional.)
We non-professionals do not have the access to technology or libraries or publishable debates and conference presentations that the professionals in our fields enjoy. What we have is precisely the lack of stricture and regulation, the freedom to explore, discover, invent; the ability to write new grammars and explore differing logics. We have what Socrates, and then Augustine, and then Descartes, all cherished most – the moment when we can choose what of the past is truly worth preserving, and toss the rest into the rubbish bin; the moment of initiation into the future potential of human thought.
I don’t suggest this as a glorification of ‘the new,’ but as a heavy burden of responsibility. Some two hundred years ago, Immanuel Kant declared that the meaning of ‘Aufklärung’ – Enlightenment – could be summed up in three words: “Think for yourself!” – a message that resonates well with the lessons of the Buddha, from some 2,500 years ago. It is the one truth, the one absolute, the one freedom, and the greatest responsibility we can achieve.