Introducing Logical Thinking: Why Study Logic?
Stephen Hawking has declared that philosophy is dead. In this, Hawking is dead wrong. Our present culture willfully ignores philosophy, but its importance has never been greater.
What is logic, and why should anyone be interested in it? I suppose those ought to be our initial questions. This being the opening article in this new and somewhat experimental series, I will tackle the second question here and the first one in Part II.
Charles Sanders Peirce, easily the greatest philosopher the U.S. will produce (since, at the rate things are going, the U.S. won’t be producing any more great philosophers), began one of his most important essays, “The Fixation of Belief,” with the following:
“Few persons care to study logic, because everybody conceives himself to be proficient enough in the art of reasoning already. But I observe that this satisfaction is limited to one’s own ratiocination, and does not extend to that of other men.”
In other words, most people are comfortable with their own assumptions and whatever reasons they have for them—if any. Their assumptions are the “logical” ones; those of the other guy are “illogical.” Most people have a mental comfort zone and get uncomfortable if it is violated.
Logic, if it does anything, ought to supply us with rules for reasoning that apply to ourselves and to “other men.” They would apply equally to the liberal and to the conservative; to the theist, agnostic, and atheist alike; to the Westerner and the non-Westerner. Sometimes the results will force us out of our mental comfort zone. What if we discover that one or more of our assumptions is wrong?
At one time, it would have seemed obvious that principles of logic ought to be taught in schools, including to children. Logical reasoning had its place in the traditional trivium—the triad of grammar (the structure of language: at one time, Latin), logic (the structure of reasoning and the analysis of language) and rhetoric (the structure of instruction and rational persuasion, before the word rhetoric had the negative connotation it has now). Once, eons ago, the trivium was the staple of the first phase of a sound liberal arts education; today, of course, it is dead as a doornail. The average government school teacher probably hasn’t heard of it. For what it’s worth, the second phase was the quadrivium: geometry, arithmetic, astronomy and music. Not exactly preparation to work in a bureaucracy, live and work in a mass-consumption culture fueled by debt, or vote for two major candidates whose agendas differ in rhetoric only (in the negative sense of that term).
Many intellectuals today doubt that there could be a universal logic. Marxists developed the idea, to put it roughly, that the bourgeoisie operate with one kind of “logic” and the proletariat with a different “logic,” based on their different class consciousness. If there are opposed, incommensurable “logics” in this sense, any project of universal logic is hopeless. We can note that Marx and his disciples clearly believed that proletarian “logic” was superior to bourgeois “logic.” They certainly believed the Marxian arguments themselves were valid and those of Marx’s predecessors and opponents, invalid. What standard were they presuming? They certainly weren’t following the implication of Marxian “polylogism” (as the economist Ludwig von Mises called it) that the validity of their own position was, at best, local and class-determined. They saw it as part of a grand narrative that was true. As does anyone who believes he has found, and communicated, something true and essential.
For our purposes we will assume logic can do this, and that disagreements are over details. Inquiry would otherwise be pointless; it would be as likely to deliver falsehood as truth. The situation is worse, however. If “polylogism” or any other sort of intellectual relativism were true (whatever this would mean), we would be unable to explain science’s progress or how technology has been able to make our lives better. Technology has taken us from the wheel and the irrigation of crops to electronic messages sent thousands of miles in seconds. Surely there are principles that are part of a determinate, structured world whose structure exists independently of our beliefs about it (the first premise of what philosophers call realism)! Surely, too, these principles are discoverable by the human mind! We will assume, that is, that there are universally valid logical principles, that our minds can identify them, and that our understanding of the world is—in some sense of the term—an understanding of something “logically” structured in the broadest sense of that term and capable of being mastered through proper combinations of human cognition and action.
This probably oversimplifies, of course. The story gets complicated rather quickly. What about, for example, “black swan” events no one could have predicted even in principle (the term deriving from Nassim Nicholas Taleb’s important book The Black Swan)? This is a good question, and we should deal in detail with the sense some writers have developed that reality isn’t as rational as its philosophical images, which are therefore useful fictions at best—stories or narratives we’ve told ourselves. The best answer I have now, in this first installment, is that our understanding of our surroundings doesn’t have to be perfect and complete, whatever that would amount to. It just has to be sufficient to obtain desired results. If we fail at this, or if our expectations are violated by something we didn’t anticipate, then we need to review our assumptions and premises. One of them is false.
Logic I will define as the study of the rules of correct reasoning, for the purpose of evaluating inferences, arguments, and other passages (written or verbal) of various sorts. While this is the core of the logical enterprise, other areas of interest include the relationship between logic, language, and the world—after all, there are philosophers who might read the above paragraph and declare it unintelligible, especially in these postmodernist times when grand narratives and “metanarratives” have all been declared dead! Plus there are many special areas, such as the logic of scientific research, that of human action, the matter of whether or not it is logical to believe in the Christian God, and more besides.
Logic is worth studying if only because the public capacity for its use appears to have reached an all-time low. College and university courses in the subject have become jokes. This is because one-time institutions of higher learning have admitted so many students whose government-school educations have left them unequipped to study logic at the university level. Their instructor these days is frequently an adjunct who is held hostage to “student evaluations”; thus the course degenerates into strained efforts to ensure some learning while keeping students entertained. The students graduate from universities with no more logical competence than when they went in, and it shows as they go to work in institutions ranging from governments to universities to corporations—and in their voting patterns. Those in government believe they can continue spending money indefinitely without consequences, and fail to see a difference between a small reduction in the rate of spending increases and a cut in spending. Universities promote political correctness to protect policies favoring some groups over others while calling themselves “equal opportunity employers”; those foisting these policies don’t see the logical contradiction between preferential treatment and equal opportunity. Corporations believe they can outsource jobs and that those laid off as a result will somehow be able to continue indefinitely buying their products at home. Many events of the past year, culminating in the reelection of Barack Obama despite his utter failure with the economy (except for Wall Street!), demonstrate conclusively how the aggregate logical acumen of the public has gone into free fall.
A final reason for interest in logic is surely implied in all of the above: it can be a manual of intellectual self-defense. George Orwell warned about how language is manipulated. In “Politics and the English Language” he wrote, “Political language—and with variations this is true of all political parties, from Conservatives to Anarchists—is designed to make lies sound truthful and murder respectable, and to give an appearance of solidity to pure wind.”
In this light, we should note that logic also studies and classifies species of mistakes in reasoning called fallacies. Some involve specific mistakes in inference; others involve sometimes purposeful misuses of language, such as ambiguous usages or shifts in meaning (words like democracy are particularly prone to abuse). Moreover, premises that are false but used as if they are true render reasoning useless, although it can look very convincing—especially if propounded by someone highly visible and touted as an authority. Finally, some phrases in the current vernacular operate as excellent discussion-stoppers, as I will call them (example: “that’s a conspiracy theory”).
In future installments of this series we will sharpen our account of how logic works by examining patterns of reasoning in detail. We will take a close look at fallacies, and at the analysis of language and why it matters. There are people who dismiss such concerns as trivial. The tools used in the logical analysis of language, however, can be quite powerful—they should be more than capable of uncovering falsehood and deception. I hope no one sees this project just as an end in itself. The question is, how can these exercises improve our lives and our societies? What kind of society do we wish to live in, and what actions do we need to take to build that kind of society? In this case, how can philosophical analysis contribute? How can it be made better, and more relevant? Hopefully, somewhere in this series, is a defense of the philosophical enterprise itself—recently declared “dead” by none other than Stephen Hawking, easily the preeminent theoretical physicist of our time.
Image Credit: CC BY-NC-ND 2.0 (Flickr)/BabyDinosaur