Logic, Language and Life

In Rationality and Logic (MIT Press, A Bradford Book, January 2009), Robert Hanna argues for an innate logical capacity, what he calls a “proto-logic” underlying all actual forms and methods of logical expression. He models it on Chomsky’s innate “universal grammar” (UG), the functional capacity to arrange information in the mind in ways that enable language to happen, which underpins the various and diverse grammatical mechanisms every language uses to make sounds into sense. Of course, our brains have all sorts of capacities, but the one which distinguishes us from other animals, especially other primates, is language. For Chomsky, this requires an underlying capacity to forge meanings through the syntactical methods speakers use to arrange their sounds, signs or gestures.

Ours is not the largest brain in the animal world nor the most complex, yet our brains have what other animal brains lack. We speak and are spoken to, making something more than noise out of our utterances. Chomsky argues that this facility appears so abruptly and develops so rapidly in human children, and involves so much complexity in grammatical expression (which differs dramatically from language to language), that no child could possibly learn it all, with its seemingly infinite possibilities of expression, unless something more than mere experiential learning were at work. The human child, he posits, already possesses the mechanism that enables speech and the various grammars used in different languages.

Hanna finds this Chomskyian thesis compelling and proposes that logic, i.e., the rules of inference which languages rely on to make meaningful statements about things, must have such a basis, too. After all, one cannot presume to rely on logical deduction to validate, and thus “prove,” the reliability of such deduction. The circularity (and the infinite regress of further justifications it invites) is, itself, inherently illogical. But if we are to believe in the truth logic purports to deliver, we must first show that logic, itself, is reliable. So, something else must underlie and validate it. Chomsky proposes that language is underwritten by a computational function called “recursion” (in which assertions of various sorts can be meaningfully nested within other assertions, e.g., “I am talking about what I was talking about”) or a related computational function called “merge” (in which we put different self-contained statements together to create new meaning). It’s the capacity to do these things, Chomsky argues, that allows human language to occur, a capacity we do not have to learn because it’s already present in our brains. Hanna aims to show that logic, like language on Chomsky’s account, rests on a similar innate foundation.
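Very loosely, and only as an illustration (Chomsky’s actual Merge is defined over syntactic objects within a technical theory, and nothing below is drawn from Hanna’s book), the two capacities can be sketched in a few lines of Python: “merge” combines two self-contained pieces into a new labeled unit, and because the output can itself serve as an input, structures nest recursively.

```python
# A loose, informal sketch of "merge" and recursion, not Chomsky's formalism:
# merge() combines two syntactic objects into a new labeled one, and since the
# result can itself be merged again, clauses can nest inside clauses.

from dataclasses import dataclass
from typing import Union

@dataclass
class Phrase:
    label: str
    left: Union["Phrase", str]
    right: Union["Phrase", str]

def merge(label: str, left, right) -> Phrase:
    """Combine two self-contained pieces into one new, larger unit."""
    return Phrase(label, left, right)

# "I am talking about [what I was talking about]": one clause nested
# inside another clause of the same kind -- recursion in action.
inner = merge("clause", "I", merge("VP", "was talking about", "it"))
outer = merge("clause", "I", merge("VP", "am talking about", inner))
print(outer)
```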

As Ludwig Wittgenstein suggested in the 1940s, language is something of an enigma. It consists of various verbal practices relying on certain rules, learned beginning in childhood and deployed in concert with other speakers. These rules of “grammar”—not the grade school sort we learned for clearer or more socially acceptable writing—enable us to give sounds meaning. It’s much more than just getting the niceties of speech right. Grammar takes center stage here as a way of explicating human speech. But does it rest on a Chomskyian posit of a faculty unique to human brains, or are these linguistic guard rails, which allow us to turn sounds into words (when we stay within their bounds), something else? And are these guard rails fabricated by the humans using them, or do they express an inherited ability only humans have?

The rules of logic, like those of grammar, turn sounds into sense, making reasoning possible. Reason, like language, is an apparently unique capacity of the human animal, but it’s hard to pin down just what it is, other than to say it is knowing how to recognize and respond to the implications of our utterances. The study of logic may be thought of as the study of inference, of how saying certain things makes saying other things a requirement if we are to stay within those guard rails. What we say is tightly woven with what we do, of course, for we cannot say things without acting in accord with what we say, or we will be taken to have failed to grasp the meanings of our own words. “Logic” denotes a framework of saying and doing when what we say involves talk of what we mean to do, have done, or think we should do. It’s the scaffolding on which our words are hung in order to turn them into statements, assertions. Without logic, the rules of inference, we have no meaning, and without meaning no language. But “logic” applies only to the assertoric part of language; other parts rely on other types of rules which allow us to signal our fellows, express our condition, etc.

The study of logic turns out to be the study of the structure of those relations between verbalizations (or their gestural or sign-based equivalents) and what we do in response to the world. If language consists of following grammatical rules, then logic is that subset of rules which makes assertoric statements possible.
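To make the idea that “saying certain things makes saying other things a requirement” concrete, here is a minimal sketch of one such inference rule, modus ponens, treated as a rule of commitment; the names and the commitment-closure framing are my own toy illustration, not anything taken from Hanna’s apparatus.

```python
# A toy illustration (not Hanna's): modus ponens as a rule that, given what a
# speaker has already asserted and the conditionals they accept, obliges them
# to assert further things.

def modus_ponens_closure(assertions: set, conditionals: list) -> set:
    """Add Q whenever P is asserted and 'if P then Q' is accepted; repeat to a fixed point."""
    derived = set(assertions)
    changed = True
    while changed:
        changed = False
        for p, q in conditionals:
            if p in derived and q not in derived:
                derived.add(q)
                changed = True
    return derived

said = {"it is raining"}
accepted = [("it is raining", "the ground is wet"),
            ("the ground is wet", "the path will be slippery")]

# Having said "it is raining", the speaker is committed to the rest.
print(modus_ponens_closure(said, accepted))
```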

Chomsky suggests a fortuitous human mutation some 100,000 to 150,000 years ago resulted in one lucky human ancestor being born with a genetic change that enabled that individual to think in a way its fellows could not. The progeny of this fortunate creature thus had what their neighbors lacked: a capacity to think about things in a newly complex way, a way that would eventually find expression through their grunts, squeals, squawks and assorted other vocalizations. In time, these descendants would effectively join thought to sound to produce language which, once achieved, provided them an incomparable advantage over other hominids. Cooperation is made so much easier and more efficient when you can talk to others rather than relying on inarticulate grunts and gestures, after all. As language made its appearance among those blessed with the new genetic advantage (the recursive capacity Chomsky posits), they eventually left the non-language-enabled in the dust. Hanna finds this account compelling. After all, if we cannot explain logic by justifying it (because such a justification already assumes logic, and you can’t justify a claim by appeal to itself), then the only other solution must be that logic is, like Chomsky’s posited computational brain algorithm, encoded in brains. This thesis may be highly speculative and unscientific on its face (how would we disconfirm it, after all?), but it’s not unimaginable, nor impossible. But there’s another explanation for language’s occurrence and rapid manifestation in human young, and it applies to logic as well.

Chomsky ascribes the wide variance in human languages to differences in surface structures, reflecting various contingently generated methods of putting our thoughts together in sounds and signs. He argues that translatability, the sharing of meanings between languages and indeed between individuals, requires such commonality, an Ur-“language” into which all the disparate surface expressions of our speech can be “translated.” Because children’s experience is too thin to supply knowledge of all the variant possibilities of a given language, and because they develop language more rapidly than an experience-based, trial-and-error method would predict, Chomsky supposes that a deeper language must already be present.

Logic, the rules of inference, can explain how assertoric language works, but it, too, must come from somewhere. If the rules that enable language cannot arise without Chomsky’s UG, then logic, the framework of assertoric language, presumably cannot arise without some such innate basis either.

Reasoning, the practice of applying logic to judgments, works in English quite as well as it does in ancient Greek, the language in which the rules of logic were first noticed and categorized. Is the logic that underlies reasoning, then, universal in the same way Chomsky’s UG is posited to be? And, given the fact that language is just using sounds, signs and gestures to capture and convey meaning, what else shall we think logic is but those rules, albeit in abstract form (devoid of the conventions of particular communities)? Where Chomsky and Hanna would lodge the idea of rules in a brain function of as yet undetermined form, Wittgenstein’s work suggests an alternative. His notion of learning through emulation and repetition does not hinge on brain maturation as Chomsky’s does, although arguably an adequate brain is needed, and human brains do have a great many faculties beyond those of other creatures. What it demands is a general capacity that makes it possible for humans to do what their primate cousins cannot: capturing and retaining large mental repositories of information, creating a “library” of old information to connect with new, and retaining extensive sensory inputs, including the proprioceptive; not just our perceptions concerning the world around us but also the sounds we ourselves utter, how others react to them, and our reactions to others’ utterances.

Language is a function of how we relate to our world, our large retentive capacity, courtesy of inherited brains, enabling us to remember past utterances and their effects. To make it all work there must be rules of practice we follow, too, and these must be learned and remembered. Do we need to suppose they’re inherited, that they pre-exist in embryonic form in the developing brain? If logic is just rules, learned and used to make our utterances meaningful, why can’t they be learned afresh by each new human, supplied with the proper sort of brain?

Can’t we explain language sufficiently as a system for communicating, built on more basic signaling capacities (shared with other creatures on the planet) and resting on brain capacity, not some heritable algorithm? All that’s needed is enough brain to learn and follow the relevant algorithms; we don’t need to have them gifted to us by inheritance as well. If logic is just those rules of inference that make possible an assertoric system, built from the raw material of sounds, signs and gestures in order to picture a world (framing our sensory inputs as distinct objects of reference), what more is required?

The signaling capacities on which our linguistic abilities rest can be seen as the foundation of a communicating system enabled by greater brain capacity to hold and utilize more information than other brains in the animal kingdom, providing us a better “tool” for getting about in the world. Language is developed, learned, and retained because it works, and here the rules of logic can be seen to be just those norms that work best in drawing inferences, giving their users an advantage in the world. Our large and complex primate brains make language (and so conceptual framing of our inputs) possible because they enable us to retain, for organizing purposes, large amounts of received information. We have better memories for a broader range of phenomena (from the concrete to the abstract) than competitor species. But this should not be taken as a claim that we each invent our language for ourselves any more than it should be taken to suppose we must have a specialized inherited brain algorithm. Language is a communal enterprise, a function of multiple speakers in a shared world.

To speak a language, we must learn the rules of the community we’re in. Logic, those rules of language that make sounds into meaningful utterances about things, is universal because it’s part of a development which transcends particular speakers, reflecting the common world in which we all stand. It’s not transcendent or embedded in the brain but “universal” because language speakers inhabit the same world — even if they may come to have different understandings of it.
