Fare well to Logic?
“There is the well-known mountain-climber’s cliché: when asked why they climb the mountain, the climbers reply, “because it’s there”. Do the climbers mean that the mountain called to them? Did it whisper in an inhuman language heard only in silence, insisting they attempt the climb?
Of course we all know it is the climbers who make the decision – they want to climb: they will themselves to climb. Thus we say, “they will climb”.
But what are we talking about, when we say “the mountain will be climbed”? Is this just “bad grammar”? Or do we really mean the mountain is willing the being of the climb which some climber must therefore perform?”
The present issue is one of grammar. The above three paragraphs are from a text I wrote inquiring into the thought of Martin Heidegger, “Signing Heidegger and the Buddha.” While preparing the present text, I came across that passage and realized that I had constructed a very good, clear articulation of precisely what I mean when addressing the differences between grammar and logic.
Traditional grammarians know that the sentence “the mountain will be climbed” is not the best grammar (really, ‘the mountain shall be climbed’), but it is tolerable within the English spoken in America. And their standardized reasoning here would be something like, “the mountain will be climbed” can be translated into the logical predication “the mountain is that which shall be climbed”. But here, do not Modern logicians feel just a twinge of anxiety? For the sentence “the mountain is that which shall be climbed” is indeed a predication; but it is not properly a logical statement.
I will here avoid a century of tortuous debate that unfolded in the construction of contemporary symbolic logic. After all, most of us know that all of the early claims for symbolic logic have proven utterly false. The only concrete use discovered for it was during the design of computer programming languages. But there is of course the manner in which it has been deployed in academic jargon, in certain fields, in order to close off any intelligent discussion between professionals in those fields, and professionals in other fields using language derived from (educated, ‘elevated’) common speech. Indeed, part of the embarrassment here is that logicians have convinced grammarians to teach predication and argumentation along the lines of classical syllogistics, just so they won’t be bothered having to discuss the matter with them anymore.
So I will be brief, and, at the risk of my text seeming either illogical or ungrammatical, plain.
“The mountain is that which shall be climbed.”
The terms of this sentence are unspecified. The sentence contains no claim that any entities named actually or really exist. (Indeed, grammarians have a difficult time determining the difference between the “real” and the “actual” any more, don’t they? Logicians, however, are no longer interested in the issue).
Although the terms of the sentence are undefined, they are (and this is paradoxical) too specific for the sentence to be making any general (hence theoretical) claim. Further, the claim of the sentence, in so far as there is any, is of a temporal, specifically futural, variety; to attain the status of a logical statement, conditions would need to be posited such as would ascertain the inevitability (or, properly, the necessity) of the claimed event. None such is offered in the sentence. A grammarian would here note that to argue this is, in effect, to take the sentence “out of context,” with the understanding that a context to any such sentence is inevitable, as necessary for the generation of meaning, and hence of communication between the utterer of the sentence and any audience. They would be absolutely correct, but only grammatically. Modern logic has effectively divorced the truth of the sentence from any possible context. The most audacious effort to accomplish this is to be found in Wittgenstein’s “Tractatus Logico-Philosophicus.” Although Wittgenstein later realized what a misstep he had made in that text, and spent the rest of his career as a thinker meditating over the necessary configurations of linguistic context (which he called “language games”), many Modern Logicians are still pursuing the Tractatus project.
To make a long story short, the sentence, as it stands, is not a logical statement at all; thus, to use the term devised by many British logicians of the Analytic variety, strictly speaking, the sentence is “nonsense”.
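To see the gap concretely, here is a rough sketch in standard first-order notation, none of which the original sentence itself supplies; every predicate, quantifier, and operator below is an assumption imported from outside the sentence:

```latex
% The grammatical predication, as it stands, says only:
%   "the mountain is that which shall be climbed"
% A logical reconstruction must add what the sentence never asserts:
% (1) existence of the named terms, (2) quantification, and
% (3) the modal/temporal force of "shall":
\exists m \, \bigl( \mathrm{Mountain}(m) \wedge
  \exists c \, ( \mathrm{Climber}(c) \wedge \mathbf{F}\,\mathrm{Climbs}(c,m) ) \bigr)
% Here F is a future-tense operator borrowed from temporal logic.
% Even so, nothing secures the *necessity* of the claimed event;
% that would require a further modal operator (\Box) together with
% posited conditions justifying it -- exactly what the sentence lacks.
```

That is precisely the sense in which the sentence remains a perfectly grammatical predication while failing to be a logical statement.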
Yet what makes this problem particularly embarrassing to Modern logic is that the grammarian (who must, by now, feel considerable bewilderment at what appear to be simple confabulations of terminology) is entirely correct that, as a translation of “the mountain will be climbed”, the sentence “the mountain is that which shall be climbed” is the proper predicative form of the original expression. And in the present context, what this suggests is that a predication, just as such, is not inherently, necessarily, a logical function, although it most assuredly is a grammatical structure.
The grammar of predication is quite clear: a predication is a claim concerning the nature of some entity, idea, term, person, or other substantive.
However, the use of the term “substantive” here should raise alarm, for it is, grammatically, a qualifier, or modifier, of a noun, and not itself a noun; the term, therefore, itself does not name any entity. Yet, apparently, in my above deployment of it, it quite clearly stands in for some noun naming some entity, and the sentence would not be comprehensible otherwise.
And one can really make this sort of maneuver (although at some risk), but logically it is, again, quite nonsensical.
Ever since Descartes, logicians have (successfully) struggled to convince grammarians — and practically everyone else — that logic exists pre-ordinate to, and in domination of, grammar; supposedly, it can be defined as the collective term for the “laws of thought”, and thought, supposedly, is pre-ordinate to language. Thus, the effort to construct standardized languages has been defined, in part, as the effort to construct grammar according to the rigorous rules of logic; hence the logician’s embarrassment in the discovery that this can never be accomplished.
It is really wholly unclear what form of thought pre-ordinates any other. Thus it cannot be said that what we have long called “logic” could ever be defined as the “laws of thought” per se. On the contrary: in discovering that predication is a structure of grammar, not a function of logic, what we find is that, whatever form of thought pre-ordinates any other, if both grammar and logic are regulatory of language, then grammar pre-ordinates logic. A predicative sentence is necessarily grammatical; but while a logical claim must, ultimately, be discovered to entail a predication, not all predication is necessarily a logical claim. This means that all logic is grammatical, but not all grammar need be logical. And this means that the structures of the class “logical statements” are wholly enclosed, as members, within the class “grammatical sentences.” In traditional terms, then, we may as well put it bluntly: Grammar must be predicated of logic, in order for there to be logic, but logic cannot be predicated of grammar absolutely.
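The containment claimed here can be put in elementary set notation; this is only a sketch of the relation as argued above, not a formal proof, and the symbols G and L are my own labels:

```latex
% Let G be the class of grammatical sentences and L the class of
% logical statements. The argument asserts proper inclusion:
L \subsetneq G
% That is: every logical statement is a grammatical sentence,
\forall s \, ( s \in L \rightarrow s \in G )
% but some grammatical sentences (e.g. "the mountain is that which
% shall be climbed") are not logical statements:
\exists s \, ( s \in G \wedge s \notin L )
```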
Before the truth of a verbal expression can be determined, it must be first constructed as some form of recognizable verbal expression. That construction is under the regulation of grammar, not of logic. And the logic of this is very plain.
Then why, in heaven’s name, has this been ignored? Answering that question would involve an elaborate historical narrative. But I suggest that the heart of the narrative would ultimately reduce to the political relations between later Medieval Roman Christians, Reformation Protestant Christians, and the Deists (or, as they have lately been called, Secular Humanists) of the Enlightenment. It is well here to remember that Medieval Roman Christianity is a thing of the past, the current Roman Catholic church being merely something of a ghost and a tombstone; that Protestant Christianity has so far determined Modern politics and economics; and that Enlightenment Deism can still be found at the ground of Modern science and technology. All of this, I think, could be drawn out from an intensive analysis of Kant’s famous dictum, that the sentence “God exists” is not a true predication. This move was historically decisive in philosophy, because the Medieval Latin logicians had long held that the sentence “God exists” was the only logical claim that would be true whenever and wherever it was uttered by anyone capable of uttering it; that it was, indeed, somehow true were it never to be uttered at all. And yet Kant rejects it as possibly having any logical truth at all, since, not being a predication, it actually makes no claim (on any “substantive”) that can be determined as true or false.
However, as our previous discussion suggests, the sentence “God exists” may very well prove, on analysis, to be a proper predication — within the domain of grammar, not of logic. The Medieval logicians would not have very much trouble comprehending my point here, regardless how else they might argue the matter. But of course, Modern logicians feel entirely free to dismiss my suggestion as somehow confused, as standing debunked by two hundred years of development in the study of logic. This is really to miss my point.
Kant’s move, regardless of whether correct or misguided, must first be recognized as politically motivated. Establishing the primacy of logic over grammar effectively overturns the hegemony of Medieval philosophy once and for all, since the latter recognized the equivalent authority, in different domains, of logic and grammar. The Medieval logicians held that the sentence “God exists” is logically true precisely because it is a grammatically correct expression of an absolute condition of being (and a necessary pre-condition for any being at all). It is (at least to Medieval thought) a grammatically correct predication, which then establishes the first condition necessary for determining as logically true any predication whatsoever.
The Medievals were not wrong and Kant right, nor even the reverse; in a plain and accurate assessment, it can be said that the Medievals really inhabited a different universe, and communicated concerning it by way of a different language, than that of Kant; indeed, radically different from any Kant could ever have imagined.
Before leaving behind comparisons between Logic and Grammar, I want to address a seemingly off-topic issue, concerning the problem of a grammatical usage that has long been found to reduce many efforts to articulate the structures of logic to just so much meaningless babble (perhaps most notoriously in some of the more obscure texts by Hegel). This problem has to do with the indefinite, or “neuter”, pronouns, such as “it,” “this,” “that.” In respect of certain variants of this usage in both the German and English languages (where the problems of the usage become most notable), I have come to call this problematic that of “the Dis Factor”.
The Dis Factor
Let us discuss a “subject” and its “predicate”. But the subject we will sometimes refer to as “dis”, and the predicate we will sometimes refer to as “dat”. Occasionally, as must always be the case in any theoretical discussion, we will need to refer to the predicate as “dis” (when it is used, grammatically, as the subject of a sentence), and occasionally we must refer to the subject as “dat” (when it is used as the grammatical object of a sentence).
So: We begin with dis; and, deploying the copula (“to be”, or “is”) quite properly (according to the regulations of both logic and grammar in the Indo-European language traditions), we can say of it (by way of logical predication, we must say of it) “dis is dat”.
Dis is dat. What a thing to say. Yet, as indicated, a thing that must logically be said, if true. Anything else to say of dat would be inappropriate to dis. Then such could not be dat. Yet, dat may be said of dis, grammatically. No avoiding our usage here, we must say of dat, that: dis is dat, and thus dis may be spoken as dat. Spoken is dis, claiming to be dat. Dis is true, dis is truly dat. But what of dat? Can dis be said to be dat? No, obviously not. The predicate never suborns the subject, but remains subordinate to dat — dis being dat under the guise of predication. Thus, dis is dat, and this is all dis can be, except insofar as dis is something other than dat. Is dis? No, and yet yes. Dis must be dat, but dis also must be some other dat, or dis is dat (qua dis to dat), and dis cannot be “dat”, as Modern Logicians have long pointed out. The distinction between the “copula of identity” and the “copula of predication” is now to be wholly discarded — and quite correctly, don’t get me wrong. Indeed here we might say, the distinction forms the dat as subject of the sentence, “dis is no longer dat,” except, of course, the dis qua dat disappears into the absence of distinction between the two. Thus we must say “dis is dat”, and dis is dat whenever we say such a sentence, no matter what the sentence concerns, unless we are referring to the sentence itself as a “dat” concerning a dis that would be another sentence, such as “dis is not dat”. Disincorporated, logic from reality as grammar of thought, language discombobulates.
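The distinction between the copula of identity and the copula of predication is standardly rendered in modern notation as follows; this is a textbook sketch for orientation, and the symbols (a term constant and a predicate letter) are my own, not anything the “dis/dat” passage itself formalizes:

```latex
% Copula of identity: "dis is dat" read as the identity of two terms
d_1 = d_2
% Copula of predication: "dis is dat" read as ascribing a property
D(d_1)
% The play of the passage turns on the fact that surface grammar
% writes both with the single word "is", so each occurrence of
% "dis is dat" is ambiguous between the two readings.
```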
Disturbing, is the topic under discussion. Or, as sometimes called, the subject. The subject under discussion is “under” because “sub-ject”, subjectum (Latin, that placed under), discussion. Yet, as topic — “topos” (Greek, the place) of any discussion at all.
I know the distinction now; now it is not so clear.
If logic pre-ordinates grammar, the above discussion should not mean anything at all. Yet, grammatically, it is clear; then why would many readers have such difficulty understanding it? Because the relationship between logic and grammar is not merely structural, and not merely functional, and never univocal. The relationship between the two is deep, and irrevocable. It has everything to do with why there is any such being we might consider a “rational animal,” why there is any being at all; and yet why there are so many, and so various, human languages to be found on the planet Earth.
This effectively puts an end not only to the Reformation, but to the Enlightenment as well. Both historic trends produced gains, but both produced losses. The next turn in human history may hopefully retain the gains, and recapture the losses. But perhaps not. If not, then the human species is doomed. That is a claim on the future, but it is logically true, nonetheless. The whole history of Modernity in the West assures its certainty.
Logic is not, as the Aufklärer held, the law of thought. The Enlightenment created an epistemology predicated on the assumption that truth, properly thought, would not be thought by a strictly human intelligence. Rather, the question became: how could any intelligence know anything at all? Which of course leads — quite logically — into the question, how can an absolute intelligence, lacking any and all human contingencies of concern, know certain truth absolutely? Those familiar with German philosophy will recognize Hegel behind such a question, but they ought to recognize Husserl and Frege, too, since they were definitely in on this project. One can easily toss in Russell and Carnap here, and the early Wittgenstein. This notion grounds all Artificial Intelligence research (obviously), but most of what passes for “Cognitive Science” as well (the word “epistem-ology”, fashioned in the manner of ancient Greek, translates into the Latinate “Cognitive Science” perfectly; it’s amazing university administrators and other academics allowed themselves to be conned by this simple linguistic sham into thinking “Cognitive Science” could be anything new). Now, because we are familiar with theology as an inquiry into whatever it is we humans think we know as “God”, it is easy to miss the fact that modern epistemology, an inquiry into the perfect knowledge of a perfect intelligence, must also be a kind of theology, one inquiring into whatever it might be that “God” might be able to know about us, and how He might go about doing this. Of course, our God is a machine, but He (now It) is still All-Mighty nonetheless. The point is, since Kant, the development of logic, as a “law of thought,” has had very little to do with the actual human practice of thinking.
A handful of rebels have pointed out how ridiculous is this effort — e.g., Peirce, Heidegger, the later Wittgenstein. But in writing these names, I hear the snickering of contempt from so-called professionals in the study of logic, for whom these rebels are rank outsiders. And it must be noted that they each have weakened their positions by evading the decisive question, which is not, as they have accepted the problem, about how true objectivity of research and theory-construction can be accomplished by living humans; the real problematic necessarily involves psychology.
This study was effectively outlawed from epistemology by Kant, and has since then become resurrected in the most dubious fashion, by the appropriation of the term for the analysis of errors-of-thought-in-concreto. “Abnormal” psychology — the term seems to separate one field of research, into the nature of bizarre thought patterns, from another, which would be the study of normative thought patterns; however, all research in “psychology” has been, since the mid-19th century, research into “abnormal psychology”. And by the end of the 20th century, professional psychologists (regardless of other theoretical differences) seem to have agreed to resolve this ambiguity by setting aside research into human thought altogether, by defining “psychology” as “the science of human behavior”.
Well, but if epistemology/cognitive science is not about human thinking, and psychology is not about human thinking, what form of inquiry into human thinking has developed in the modern era, in fulfillment of the promise of the Enlightenment to enlighten us on all matters?
To be sure, there have been rudimentary attempts to open up such a field of inquiry. But all these efforts have stumbled over the same obstacle: The logos of thought does not, as is apparently assumed, develop in response to any “other” (inertly material, or actively social) that stands objectively apart from thought and to which thought must somehow adapt or which it must overcome. It is an internal reflection, ordering sensations and internal sensational responses, in an organism that is not only aware of the world but aware of itself. The thought of an individual human itself thus constitutes its own objectivity. This is the only explanation that can accommodate both Einstein’s development of the Theory of Relativity and mass murderer Charles Manson’s sincere belief that he is both Hitler and God incarnate. This is the only way a written text can be composed, the only way Mozart can “hear” an entire symphony in his head and Euclid can determine what a “square of the hypotenuse” might really measure. It also explains why “I know what I’m doing,” and yet why it is “I don’t understand what you’re talking about.” This also explains why there are any individuals at all, among the human species; were our thoughts truly subjective, it is doubtful we would know any other humans existed at all. (People don’t feel a “subjective” hunger; if they did, they could deny the claim — “it’s only my stomach’s opinion, after all” — and thus starve to death. Of course, occasionally a person does starve him/herself to death; but only by denying identity between their thinking and their physical selves — which is unfortunately quite easy; indeed, part of the pathology of selfhood — “am” I self or “have” I a self?)
Objectivity of thought is the ontological condition of the rational animal.
Of course the use of the word “objective” may seem a little misleading, unless we define some sort of “subjectivity” in relation to it, so I suggest we think on this in the following manner: My thinking is my objectivity, “subjective” is my reference for you and your expressions of your thought. I.e., if you say, “I feel pain,” since I obviously cannot think this objectively, I say you have just made an expression of a “subjective” sensation or thought.
On close examination, this may seem very obvious, yet I remind the reader (if my text has any reader) that there has yet been no successful effort to unravel these relationships. Consider Descartes — in order to construct a “thinking thing” (res cogitans) capable of thinking “objectively”, he argues that it is necessary to remove any sensation that might be considered “subjective”; but since he is in the first instance debating this matter entirely in his own head, for whom does he make this argument and this hypothetical model, if not himself? In other words, what he actually does is construct an “objective” model of consciousness to which his actual consciousness stands opposed as “subjective” other. Once he makes this wholly illegitimate move, it becomes impossible to recognize the fundamental and necessary objectivity of actual human thought in an individual mind. We Moderns now seem utterly convinced that only this hypothetical model, the cogitating ego, is objective, to which we all stand, as individual humans in the material universe of our experience, as somehow flawed subjectivities. Of course we have developed philosophies and even political programs through which we hope to gain a perfected subjectivity — Marxism, Freudianism, certain forms of Feminism, not to mention the whole range of “New Age” quasi-religious mystery cults — and of course, there has long existed the similar promises of Reformation Christianity and hedonistic Capitalism — but it should be obvious from the above that such efforts are doomed from the start: under careful analysis, these various forms of perfected subjectivity read an awful lot like the Cartesian Ego Cogito (only, one might say, “with heart”) — in other words, merely another Modernist denial of the real objectivity of our thought in favor of the perfected-model objectivity of a super-human, divine, or otherwise non-human (which is as much as to say, machine-like) consciousness which we can never attain.
This is the result of what Kant announced as “the Copernican revolution” in philosophy; supposedly, this was to set the human consciousness at the center of all possible knowledge; instead, it has divorced us from ourselves. We live like non-rational animals, only able to build more complicated tools, and utterly convinced that this in and of itself, since it preserves our individuality from involvement with the material world, assures that our thus untainted souls will achieve immortality.
Before anybody remarks that this is far too general a theory to be of any practical value, I merely point out that this theory is but a starting place for further inquiry. In fact there are fully developed theories that have pursued the problem without any reference to Descartes, Kant, or the era of European-American Modernity: for instance, the psychology of Aristotle and Thomas Aquinas, or the theory of mind developed among Tibetan Buddhists about 700 years ago. Such thinking had historically determined problems — lack of knowledge of the workings of the brain, or the need for some divine Subjective intelligence knowing us. But these thinkers knew something we Moderns have forgotten. Apparently the “Copernican revolution” in philosophy began a deconstruction of human knowledge — into a dumb mechanics of something pre-human, or post-human.
I really believe that in order to reclaim ourselves we will need to rethink knowledge as deriving from a psychological ordering of experience — a construction arising from the organism’s effort to maintain homeostasis within itself so as to interact with the external environment.
So much for removing psychology from logic. Let’s move on….