As a professional writer and English teacher for the past decade, I’ve found myself thinking frequently about the role of language in life. One of the recurring themes in my thoughts — occasioned at least in part by some of my grad school studies in philosophy, anthropology, and sociolinguistics, and also by my being confronted at my job every day by extremely rough and problematic uses of the English language that are damned difficult to address — is the question of “correct” language. Is the very idea of correctness in this area just a culturally imperialistic metanarrative? Is it just arbitrary in the grand scheme of things? Or does it really get at a crucial truth?
And beyond mere technical correctness — grammar etc. — what about matters of rhetoric, style, and syntactical choices? How important are they not just to academic matters but to life in general, and not just in a utilitarian sense but a deeply human one?
A recent essay in The New York Review of Books offers some real fodder for reflection on all of these things. In “Words” (July 15), British academic Tony Judt talks about the vast significance of language in both his own personal life and the life of human culture at large. The essay is fascinating and poignant — fascinating because of the insight Judt brings to bear on the relationship between the clear and skillful deployment of language (in both print and speech) and the achievement of a general clarity of life and thought, and poignant because he caps the whole thing off by talking about a progressive neurological disorder from which he suffers, and which will inevitably rob him of speech. “Translating being into thought,” he says, “thought into words, and words into communication will soon be beyond me and I shall be confined to the rhetorical landscape of my interior reflections.”
He explains that he was brought up in a family where talking and debating were centrally important, and was processed through the British elementary school system of the 1950s, when “‘Good’ English was at its peak” and “We were instructed in the unacceptability of even the most minor syntactical transgression.”
The heart of the essay appears in his comments about the close connection between clarity of language and clarity of thought, and the way this connection has been devalued over the past half century of public life:
Sheer rhetorical facility, whatever its appeal, need not denote originality and depth of content.
All the same, inarticulacy surely suggests a shortcoming of thought. This idea will sound odd to a generation praised for what they are trying to say rather than the thing said. Articulacy itself became an object of suspicion in the 1970s: the retreat from “form” favored uncritical approbation of mere “self-expression,” above all in the classroom. But it is one thing to encourage students to express their opinions freely and to take care not to crush these under the weight of prematurely imposed authority. It is quite another for teachers to retreat from formal criticism in the hope that the freedom thereby accorded will favor independent thought: “Don’t worry how you say it, it’s the ideas that count.”
Forty years on from the 1960s, there are not many instructors left with the self-confidence (or the training) to pounce on infelicitous expression and explain clearly just why it inhibits intelligent reflection. The revolution of my generation played an important role in this unraveling: the priority accorded the autonomous individual in every sphere of life should not be underestimated — “doing your own thing” took protean form.
Today “natural” expression — in language as in art — is preferred to artifice. We unreflectively suppose that truth no less than beauty is conveyed more effectively thereby. Alexander Pope knew better. For many centuries in the Western tradition, how well you expressed a position corresponded closely to the credibility of your argument. Rhetorical styles might vary from the spartan to the baroque, but style itself was never a matter of indifference. And “style” was not just a well-turned sentence: poor expression belied poor thought. Confused words suggested confused ideas at best, dissimulation at worst.
He goes on from this to observe that in the modern social media milieu of Facebook, Twitter, MySpace, and texting, “pithy allusion substitutes for exposition,” and people who live under the reign of an overweening consumerism begin to talk like text messages.
The prognosis he offers is unequivocal:
This ought to worry us. When words lose their integrity so do the ideas they express. If we privilege personal expression over formal convention, then we are privatizing language no less than we have privatized so much else. “When I use a word,” Humpty Dumpty said, in rather a scornful tone, “it means just what I choose it to mean—neither more nor less.” “The question is,” said Alice, “whether you can make words mean so many different things.” Alice was right: the outcome is anarchy.
As I said, this all hits home because of my personal and professional positions as a writer and teacher. And also because of my philosophical and spiritual proclivities. I’m deeply influenced by a loose Zen-Christian nondual school of thinking, seeing, and knowing, and of course this involves the recognition that reality in itself is fundamentally unspeakable, fundamentally a matter of pure being-ness and first-person apprehension. “The menu isn’t the meal.” “The map isn’t the territory.” Don’t get so distracted by the finger pointing to the moon that you miss the moon itself, the “finger” being words and concepts and the “moon” being the living realities they symbolize. And so on.
For years I struggled with the question of whether this semi-existentialist recognition of the abstraction of language and thought from real being, while valid and crucial, might not entail the necessary conclusion that language is unimportant. That’s one of the major reasons, among all the others, that Judt’s insights are so gripping: because he with his neurological disorder is faced with the imminent loss of his ability to communicate in words. And this really and truly does strike him — and me — as a loss.
In point of fact, reality’s transcendence of language means that the real world and life in general should be infinitely expressible in words. No matter that the words and concepts are relative realities instead of absolute ones, and symbolic realities instead of existential ones. This very fact means a person should ideally be able to describe his or her thoughts and experiences in a literally endless variety of linguistic variations, all of them circling around and pointing toward the realities themselves, and recreating in the mind and affect of the equally linguistically astute listener or reader an approximation of those very realities, thus encouraging a “see for yourself” transition to direct looking. Not to be able to do this, to lack the skills and sensibility to state and restate our experience, is to be locked away in a prison of muteness.
I recall being exhilarated as an undergraduate when I read Robert Anton Wilson’s The Widow’s Son and came to the fantastic philosophical passage in which — as I recall (it’s been a few years) — Wilson presents a hypothetical scene of humanity’s first explosion of self-consciousness, wherein an early human spontaneously develops the first-ever capacity for self-conscious reflection, and is thus able to recognize the beauty of a flower or sunset for the first time, and exclaims to another human with gasping wonder and delight, “Oh, look! Look at this!” Writes Wilson, “And beauty was created in a world that had been flat and dead and meaningless until that moment.”
The entire history of language proceeds from that delightful leap in self-consciousness, from that titanically freeing and empowering ability to step back from life and really see it, and to symbolize it in some form that’s communicable to others, so that they, too, can see for themselves by using the symbol for its proper purpose: as the Taoist’s “finger pointing toward the moon,” which directs attention away from itself and toward reality, serving only as a bridge. (See my “The Evolution of Consciousness and the Alchemy of Language” for more along these lines.)
I finished reading Colin Wilson’s The Philosopher’s Stone recently, and the entire thrust of that ecstatically philosophical novel is the value of being able to step back from immediate experience and grasp wider meanings. Wilson writes, “So poets, philosophers, scientists are always having these moments in which they grasp enormous meanings.” He even deliberately presents an instance in which a dull and prosaic-minded character suffers a head wound that accidentally endows him with the ability to induce “value experiences” (the novel’s fictionalized version of Maslow’s “peak experiences”) at will, simply so that he (Wilson) can make this very point about the importance of linguistic expression: “We had found someone who could plunge into ecstasy at a moment’s notice. Here was a Wordsworth without the power of self-expression, a Traherne who could only say ‘Gor, ain’t it pretty.’”
So all of this is just a longish and rambling rumination to get around to saying this: that Judt is right. The power to use language with self-conscious correctness, and not just that, but with rhetorical beauty, is a real power with real value because it really does allow “the translation of being into thought, thought into words, and words into communication” — which means your and my subjectivity becomes sharable. Our walled-off world of interiority becomes something we can communicate to someone else, and they can communicate theirs to us. There may be, in fact there truly are, wordless ways of doing the same thing — but words are one of the finest and most effective means we have of doing this. (See yesterday’s post about fictional entertainments and their power to cultivate empathy.)
Even more: Words, like self-consciousness, can actually enhance primary experience. The capacity for self-consciousness and the capacity for language being inextricably interlinked, it’s simply the case that the better your ability to reflect upon and express your experience consciously and linguistically, the more fully you know that experience. The very act of reflection creates the reflector. It’s bound up with the fact of individual subjecthood itself, as any student of the Western intellectual, philosophical, political, and social tradition, not to mention any student of Buddhism, can tell you. And the achievement and refinement of that ego self, despite the undeniable and enormous problems it has created — everything having to do with the “nightmare” of recorded/civilized history from which Joyce was struggling to awake — is one of the greatest quantum leaps in the history of the universe’s evolution. It’s the universe becoming awake to itself, and our purpose lies not in fleeing from the ego but in fulfilling the purpose for which it arose. See the pre/trans fallacy famously articulated by Ken Wilber. See the biblical Jesus: “I come not to destroy the law but to fulfill it.”
Our culture now presents us with a choice: to rise to, and even above, the opportunity embodied in words and language, or to sink below it. This is what I and every other writer and teacher are charged with addressing. We’re not just trying to enhance students’ communication skills in order to improve their employment prospects. We’re helping to focus their being, to focus Being itself, for the ultimate fulfillment of its purpose, by helping them to develop their linguistic capacities and conscious interior sensibilities to the greatest possible extent.