This week’s (exceptionally long and varied) offering of intellectual enrichment includes: an argument that the likely death of economic growth is the underlying theme of the current U.S. presidential election; thoughts on the rise of a real-life dystopia of universal algorithmic automation; an account of how the founder of TED became disgusted with the direction of the iconic ideas conference and created a new one to fulfill his original vision; reflections by a prominent physicist on science, religion, the Higgs boson, and the cultural politics of scientific research; a new examination of the definition, meaning, and cultural place of “pseudoscience”; a deeply absorbing warning about the possible death of the liberal arts, enhanced by the author’s first-person account of his own undergraduate liberal arts education at the University of Chicago; thoughts on the humanizing effects of deep literary reading in an age of proliferating psychopathy; a question/warning about the possible fate of literature as a central cultural-historical force in the grip of a publishing environment convulsed by epochal changes; an interview with Eric Kandel on the convergence of art, biology, and psychology; a new report from a psychiatric research project on the demonstrable link between creativity and mental illness; a career-spanning look at the works and themes of Ray Bradbury; and a spirited (pun intended) statement of the supreme value of supernatural and metaphysical literature from Michael Dirda.
No More Industrial Revolutions?
Thomas B. Edsall, Campaign Stops, The New York Times, October 15, 2012
[NOTE: You need to read this if you care about the links between large-scale, long-term economic collapse and the rhetoric we’re all seeing and hearing in the current American campaign season. Is the very real prospect of no more economic growth directly influencing the sociopolitical future of the United States? Does it portend a dystopian future of epic, entrenched, institutionalized, and ever-widening economic inequality? More immediately, for anybody who knows how to listen, is this all evident in the platforms and rhetorical self-positioning of the major political parties and their candidates? Much to consider here.]
The American economy is running on empty. That's the hypothesis put forward by Robert J. Gordon, an economist at Northwestern University. Let's assume for a moment that he's right. The political consequences would be enormous. In his widely discussed National Bureau of Economic Research paper, “Is U.S. Economic Growth Over?” Gordon predicts a dark future of “epochal decline in growth from the U.S. record of the last 150 years.” The greatest innovations, Gordon argues, are behind us, with little prospect for transformative change along the lines of the three previous industrial revolutions … In essence, Gordon is saying that there won't be a fourth industrial revolution. Why is this related to inequality? Because the burden of this decline will fall on the bulk of the population. The continuing prosperity of the wealthiest, on the other hand, will be magnified … If Gordon is even modestly on target, the current presidential campaign begins to ring hollow … Intellectually, both the Obama and Romney campaigns are undoubtedly aware of the general line of thinking that lies behind Gordon's analysis, and of related findings in books like “The Great Stagnation” by Tyler Cowen of George Mason University. Cowen argues that innovation has reached a “technological plateau” that rules out a return to the growth of the 20th century. For Obama, the argument that America has run out of string is politically untouchable. In the case of Romney and the Republican Party, something very different appears to be taking place … While Gordon projects a future of exacerbating inequality (as an ever-increasing share of declining productivity growth goes to the top), the wealthy are acutely aware that the political threat to their status and comfort would come from rising popular demand for policies of income redistribution … Affluent Republicans — the donor and policy base of the conservative movement — are on red alert. They want to protect and enhance their position in a future of diminished resources. What really provokes the ferocity with which the right currently fights for regressive tax and spending policies is a deeply pessimistic vision premised on a future of hard times. This vision has prompted the Republican Party to adopt a preemptive strategy that anticipates the end of growth and the onset of sustained austerity — a strategy to make sure that the size of their slice of the pie doesn't get smaller as the pie shrinks. This is the underlying and inadequately explored theme of the 2012 election.
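[A side note on the arithmetic behind Gordon's distributional claim, for readers who like to see the mechanism spelled out. The following minimal sketch uses invented numbers — not Gordon's or Cowen's figures — to show how median income can stall even while total output still inches upward, once a large share of each year's gains accrues to the top.]

```python
# Illustrative arithmetic only: the growth rates and the top's share of
# gains below are invented assumptions, not figures from Gordon's paper.

def median_income_path(years, total_growth, top_gain_share, start_median=1.0):
    """Track a stylized median income when the top captures a fixed share
    of each year's aggregate gains, leaving only the residual for the median."""
    median = start_median
    path = [median]
    for _ in range(years):
        gain = median * total_growth            # the median household's pro-rata gain
        median += gain * (1 - top_gain_share)   # only the residual reaches the median
        path.append(median)
    return path

# Robust growth, modest capture at the top: the median still advances.
print(median_income_path(30, total_growth=0.02, top_gain_share=0.3)[-1])   # ~1.52
# Near-zero growth, heavy capture at the top: the median barely moves.
print(median_income_path(30, total_growth=0.005, top_gain_share=0.8)[-1])  # ~1.03
```

The point of the toy model is simply that Gordon's two variables compound: slower aggregate growth and a rising top share each depress the median, and together they can flatten it entirely.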
The Tyranny of Algorithms
Evgeny Morozov, The Wall Street Journal, September 20, 2012
Teaser: Do we want a world where a software program picks the next pop-music star and legal systems run on opaque pieces of code?
In “Player Piano,” his 1952 dystopian novel, Kurt Vonnegut rebelled against automation. For Vonnegut, the metaphor of the player piano — where the instrument plays itself, without any intervention from humans — stood for all that was wrong with the cold, mechanical and efficiency-maximizing environment around him. Vonnegut would probably be terrified by Christopher Steiner’s provocative “Automate This,” a book about our growing reliance on algorithms. By encoding knowledge about the world into simple rules that computers can follow, algorithms produce faster decisions. A gadget like a player piano seems trivial in comparison with Music Xray, a trendy company that uses algorithms to rate new songs based on their “hit-appeal” by isolating their patterns of melody, beat, tempo and fullness of sound and comparing those with earlier hits. If the rating is too low, record companies — the bulk of Music Xray’s clientele — probably shouldn’t bother with the artist … On the whole, though, Mr. Steiner believes that we need to accept our algorithmic overlords. Accept them we might — but first we should vigorously, and transparently, debate the rules they are imposing … The real question isn’t whether to live with algorithms — the Sumerians got that much right — but how to live with them. As Vonnegut understood over a half-century ago, an uncritical embrace of automation, for all the efficiency that it offers, is just a prelude to dystopia.
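[For the curious: here is one plausible, and deliberately toy, reading of what “isolating patterns of melody, beat, tempo and fullness of sound and comparing those with earlier hits” might mean computationally — a nearest-neighbor similarity score over feature vectors. Every feature, number, and threshold below is an invented assumption; Music Xray's actual model is proprietary and undoubtedly more sophisticated.]

```python
# A toy "hit-appeal" scorer: represent each song as a feature vector
# (tempo, beat strength, melodic range, fullness) and score a candidate
# by its cosine similarity to a catalog of past hits.
# All features and values are invented; this is not Music Xray's method.

import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def hit_appeal(song, past_hits):
    """Score a candidate song by its best similarity to any prior hit."""
    return max(cosine(song, hit) for hit in past_hits)

# Feature order: [tempo (bpm/200), beat strength, melodic range, fullness]
past_hits = [
    [0.60, 0.90, 0.40, 0.80],
    [0.55, 0.85, 0.50, 0.75],
]
candidate = [0.58, 0.88, 0.45, 0.75]

print(f"hit-appeal: {hit_appeal(candidate, past_hits):.3f}")
# A label would presumably act only above some calibrated threshold.
```

Whatever the real system's sophistication, the unsettling part Morozov flags survives even in this sketch: “hit-appeal” reduces, at bottom, to resemblance to what has already sold.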
Life after TED
April Dembosky, Financial Times, September 29, 2012
Teaser: Ideas conferences have lost their spontaneity, says Richard Saul Wurman. His solution? A $16,000-a-ticket event featuring David Blaine, Herbie Hancock and 72 hours of ‘intellectual jazz.’
[TED] grew and grew, and since Wurman sold it in 2001 for $14m, has developed cult status, attracting thousands of followers jockeying for its $7,500 entry tickets for four days filled with highly produced 18-minute “talks of a lifetime”. TED Talks, a series of lecture videos posted online, have received more than 800m views to date. University professors assign them as required course material. Some airlines, such as Delta, even have a TED channel on their in-flight entertainment systems. But, in Wurman's opinion, TED today has become over-orchestrated, too “slick” … Wurman originally formed TED by “subtracting” elements common to other conferences: introductions, lecterns, suits and ties. “I took away CEOs who legally can't tell the truth,” he says, “and politicians who can't tell the truth because they serve so many constituencies.” In recent years, though, too many things have, he feels, crept back in. “Now every speech is auditioned, rehearsed, edited, rehearsed again,” he says. “The spontaneity is gone and there's a lot of selling of charities. There's the selling of being PC” … Though Wurman's new conference has lots of bigwig guests, and a handful of attendees willing to pay a premium to mingle with them, it lacks the high-tech lighting of TED, the beanbag chairs and ceiling-mounted television screens in the hallways, and the rows of refrigerators stocked with soft drinks and herbal iced tea. But then this is what Wurman wants: he calls WWW the “great leap backward”, both from TED and from the hyper-connected, hyper-busy modern world. “WWW could have taken place 2,500 years ago, with Aristotle and Socrates on stage,” he says.
Encounters with the God Particle: The Higgs boson, the pope, and the curious interaction between organized religion and big science
Paul Fishbane, Tablet, September 27, 2012
[NOTE: This brief piece rewards, and in fact demands, a full and close reading rather than just a skim, because the author’s point only emerges slowly and subtly. And it’s an interesting point indeed, encompassing a papal visit to CERN in 1982, the meaning and significance of both the Higgs boson and the search for it, the past and future (and meaning) of science, and certain unfortunate tendencies in politics and culture that may portend an unhappy prognosis for the future of truly epic, world-changing scientific research.]
[In 1982] Pope John Paul II came to CERN, the European center where I was engaged in physics research on basic physical law. He spoke to the staff about “prodigious things,” world peace, and how he hoped the science discovered at CERN should be subject to the constraints of conscience, quoting Genesis 1:31 (“And God saw everything that he had made, and behold, it was very good”). In reply, the CERN director spoke of a fecund dialogue between science and religion … The invention and search for the Higgs boson are part of the next stage of physics research, where we attempt to learn the rules that govern the sub-constituent parts of the nucleus and other stable or unstable particles and give us a coherent picture of the basic rules governing the way the universe was, is, and will be … [In the two decades before 1982] it was still possible for a professor and several graduate students to perform an experiment at one of the existing accelerators and get a result worth publishing, even to make a major discovery, in a matter of months. Theorists could both propose and analyze on the same time scale. By the early 1980s things had changed. Experiments in the new era involved bigger and bigger detectors. Bigger and bigger collaborations were necessary, and the time required to do an experiment grew longer. The accelerators, ever more expensive, were fewer, and their construction took years and involved dodging political minefields … [T]he fact that the United States has not provided an equivalent machine to check CERN's results — or even to have beaten them to the punch — is discouraging. Will experiments at a single machine, without a second machine to check the results, be acceptable? This is not going to get any easier. Peter Higgs had to wait 50 years to learn that his proposal was at least partly proven right. He retired in 1996 and is now in his early eighties. Results from modern machines come slowly, and many theorists have wandered off into regions where unverifiable speculation is king. For the worker bees who stick to experimentation, thousand-person collaborations are now the rule. Will the most creative individuals be willing to spend all their time in such collaborations on a single life-spanning experiment? I wouldn't bet on it. Perhaps the popes still have something to teach us.
Separating the Pseudo from Science
Michael D. Gordin, The Chronicle of Higher Education, September 17, 2012
[NOTE: Although the author of this piece, which is adapted from his new book The Pseudoscience Wars: Immanuel Velikovsky and the Birth of the Modern Fringe (regarding which, see the illuminating and very inviting review at The New Republic), is far too quick with his uncritical and broad-brushed lumping together of parapsychology with various now-discredited pseudosciences, his overall point about the keen necessity of coming to a better collective understanding of what we mean by “pseudoscience,” and about the need to abandon the project of militant eradication of fringe theories, is excellent indeed, and also very well stated.]
I have come to think of pseudoscience as science’s shadow. A shadow is cast by something; it has no substance of its own. The same is true for these doctrines on the fringe. If scientists use some criterion such as peer review to demarcate, so will the fringe (creationists have peer-reviewed journals, as did Velikovskians). The brighter the light of science — that is, the greater its cultural prestige and authority — the sharper the shadow, and the more the fringe flourishes. Fringe theories proliferate because the status of science is high and science is seen as something worth emulating. Since World War II, science has been consistently prestigious, and heterodox doctrines have proliferated, but the pattern holds in the past as well. Late Enlightenment France and Victorian Britain were high points of scientists’ status, and clusters of such movements (mesmerism, spiritualism, phrenology) cropped up at these moments as well. Paradoxically, pseudoscience is a sign of health, not disease. Shadows are also an inevitable consequence of light. Carl Sagan and other anti-Velikovskians believed that greater scientific literacy could “cure” the ill of pseudoscience. Don’t get me wrong — scientific literacy is a wonderful thing, and I am committed to expanding it. But it won’t eradicate the fringe, and it won’t prevent the proliferation of doctrines the scientific community decries as pseudoscience … [W]e need to be careful about demarcation, to notice how we do it and why we do it, and stop striving for the goal of universal eradication of the fringe that is frankly impossible. We need to learn what we are talking about when we talk about pseudoscience.
Who Killed the Liberal Arts? And why we should care
Joseph Epstein, The Weekly Standard Vol. 18, No. 1, September 17, 2012
[NOTE: In this essay, Epstein offers not only an exceptionally lucid lament over the decline of the liberal arts but a striking and poignant first-person account of his own liberal arts education at the University of Chicago, where, as he describes in candid detail, he was a decidedly poor student. His explanation of how this education shaped him for life as a teacher and human being is truly moving.]
I was never more than a peripheral character, rather more like a tourist than a student, at the University of Chicago. Yet when I left the school in 1959, I was a strikingly different person than the one who entered in 1956. What had happened? My years there allowed me to consider other possibilities than the one destiny would appear to have set in grooves for me. I felt less locked into the social categories — Jewish, middle-class, Midwestern — in which I had grown up, and yet, more appreciative of their significance in my own development. I had had a glimpse — if not much more — of the higher things, and longed for a more concentrated look. Had I not gone to the University of Chicago, I have often wondered, what might my life be like? I suspect I would be wealthier. But reading the books I did, and have continued to throughout my life, has made it all but impossible to concentrate on moneymaking in the way that is required to acquire significant wealth. Without the experience of the University of Chicago, perhaps I would have been less critical of the world's institutions and the people who run them; I might even have been among those who do run them. I might, who knows, have been happier, if only because less introspective — nobody said the examined life is a lot of laughs — without the changes wrought in me by my years at the University of Chicago. Yet I would not trade in those three strange years for anything … The death of liberal arts education would constitute a serious subtraction. Without it, we shall no longer have a segment of the population that has a proper standard with which to judge true intellectual achievement. Without it, no one can have a genuine notion of what constitutes an educated man or woman, or why one work of art is superior to another, or what in life is serious and what is trivial. The loss of liberal arts education can only result in replacing authoritative judgment with rivaling expert opinions, the vaunting of the second- and third-rate in politics and art, the supremacy of the faddish and the fashionable in all of life. Without that glimpse of the best that liberal arts education conveys, a nation might wake up living in the worst, and never notice.
Psychopathy’s Double Edge
Kevin Dutton, The Chronicle of Higher Education, October 22, 2012
[NOTE: Along with offering a fascinating and disturbing first-person account of the writer’s subjective experience of being transformed briefly into a psychopath via transcranial magnetic stimulation, this essay/article, excerpted from The Wisdom of Psychopaths: What Saints, Spies, and Serial Killers Can Teach Us about Success, contains the following reflections about the psychological and neurological effects of reading. Notice how powerfully they resonate, but from a distinctly different angle, with some of Epstein’s words above.]
Over a 28-year-old single-malt scotch at the Society for the Scientific Study of Psychopathy's biennial bash in Montreal in 2011, I asked Bob Hare, “When you look around you at modern-day society, do you think, in general, that we're becoming more psychopathic?” The eminent criminal psychologist and creator of the widely used Psychopathy Checklist paused before answering. “I think, in general, yes, society is becoming more psychopathic” … Precisely why this downturn in social values has come about is not entirely clear. A complex concatenation of environment, role models, and education is, as usual, under suspicion. But the beginnings of an even more fundamental answer may lie in a study conducted by Jeffrey Zacks and his team at the Dynamic Cognition Laboratory, at Washington University in St. Louis. With the aid of fMRI, Zacks and his co-authors peered deep inside the brains of volunteers as they read stories. What they found provided an intriguing insight into the way our brain constructs our sense of self. Changes in characters’ locations (e.g., “went out of the house into the street”) were associated with increased activity in regions of the temporal lobes involved in spatial orientation and perception, while changes in the objects that a character interacted with (e.g., “picked up a pencil”) produced a similar increase in a region of the frontal lobes known to be important for controlling grasping motions. Most important, however, changes in a character’s goal elicited increased activation in areas of the prefrontal cortex, damage to which results in impaired knowledge of the order and structure of planned, intentional action. Imagining, it would seem, really does make it so. Whenever we read a story, our level of engagement is such that we “mentally simulate each new situation encountered in a narrative,” according to one of the researchers, Nicole Speer. Our brains then interweave these newly encountered situations with knowledge and experience gleaned from our own lives to create an organic mosaic of dynamic mental syntheses. Reading a book carves brand-new neural pathways into the ancient cortical bedrock of our brains. It transforms the way we see the world — makes us, as Nicholas Carr puts it in his recent essay, “The Dreams of Readers,” “more alert to the inner lives of others.” We become vampires without being bitten — in other words, more empathic. Books make us see in a way that casual immersion in the Internet, and the quicksilver virtual world it offers, doesn’t.
All Is Not Vanity: The rise of literary self-publishing
Literary Review of Canada, September 1, 2012
Two technological breakthroughs have driven [the current explosion of literary self-publishing]: the development of the user-friendly e-reader, such as the Kobo, the Kindle and the Nook, and the appearance of inexpensive print-on-demand technology … Nowadays anyone can become a book publisher … By “literary” I mean the kind of novels that vie for the literary prizes, the pool of serious, high-quality fiction out of which emerge the books that last. What does the rise of literary self-publishing mean for the future of literature? … [T]he line between self-published and conventionally published literature will disappear as more and more mainstream fiction is published as e-book or print-on-demand. Barriers to the mainstream marketplace will collapse as the physical bookstore disappears. Inevitably, medium-sized commercial publishers will gradually fade away, leaving an environment where a few entertainment megaliths battle for their share of the mass market while most serious books are produced by boutique publishers or are self-published … But will any of these future literary creations be works that last? The digital world has two cankers that constantly gnaw away at all notions of permanence: fragmentation and endless revisability. The former of these is our daily lament about our wired world: too much information, too many content providers, not enough time to begin to absorb any of it. The latter is less discussed. Yet the instant and infinite revisability of virtual text means that authors can continuously “improve” their work, perhaps in response to criticism, perhaps simply because writers are never truly ready to part with their creations. The notion of a definitive edition of an enduring work may soon disappear. Is the rise of literary self-publishing the beginning of the death of literature, of works that become part of a culture's DNA and pass from generation to generation? When the next Stone Angel or Fifth Business is published, how many of us will even know it exists? Will any of the fine novels now being brought into the world be read a hundred years from now?
Interview with Eric Kandel: “I See Psychoanalysis, Art, and Biology Coming Together”
Johann Grolle, Spiegel Online, October 11, 2012
[NOTE: Kandel is of course the author of The Age of Insight: The Quest to Understand the Unconscious in Art, Mind, and Brain, from Vienna 1900 to the Present, which is the subject of this interview.]
Teaser: Eric Kandel is considered one of the world’s most important neuroscientists. He recently published a book about the creative power of Vienna, the city of his birth. In an interview, he discusses the demonic side of man and the postcoital perspectives offered in Gustav Klimt’s paintings.
KANDEL: I began to realize that biology is a better way of approaching the truth about the mind [than psychoanalysis] … I don’t think biology is replacing the feeling experienced through art. Biology is capable of giving additional insights. It’s a parallel, not a substitutive process … My book is not only about biological insights into art but also about artistic insight into the mind … Truth has many dimensions, and the way you arrive at truth in complex situations is through many perspectives. I see psychoanalysis, art and biology ultimately coming together, just like cognitive psychology and neuroscience have merged recently … [A]rtists had insights into the mind that Freud did not have and that enriched and corrected what Freud taught us.
Creativity “closely entwined with mental illness”
Michelle Roberts, BBC News, October 17, 2012
[NOTE: It’s encouraging, and also amazing, to see this kind of thing getting such prominent mainstream press. When you see the lead researcher on a project that examined the relationship between creativity and mental illness announcing that there is indeed a connection, and that it indicates a need to reconsider the long-established trend of automatically medicating psychiatric patients to remove “everything regarded as morbid” because “the findings suggested disorders should be viewed in a new light and that certain traits might be beneficial or desirable,” you can know there’s a cultural sea change in the offing.]
Creativity is often part of a mental illness, with writers particularly susceptible, according to a study of more than a million people. Writers had a higher risk of anxiety and bipolar disorders, schizophrenia, unipolar depression, and substance abuse, Swedish researchers at the Karolinska Institute found. They were almost twice as likely as the general population to kill themselves. Dancers and photographers were also more likely to have bipolar disorder. As a group, those in the creative professions were no more likely to suffer from psychiatric disorders than other people. But they were more likely to have a close relative with a disorder, including anorexia and, to some extent, autism, the Journal of Psychiatric Research reports. Lead researcher Dr Simon Kyaga said the findings suggested disorders should be viewed in a new light and that certain traits might be beneficial or desirable. For example, the restrictive and intense interests of someone with autism and the manic drive of a person with bipolar disorder might provide the necessary focus and determination for genius and creativity. Similarly, the disordered thoughts associated with schizophrenia might spark the all-important originality element of a masterpiece. Dr Kyaga said: “If one takes the view that certain phenomena associated with the patient’s illness are beneficial, it opens the way for a new approach to treatment. In that case, the doctor and patient must come to an agreement on what is to be treated, and at what cost. In psychiatry and medicine generally there has been a tradition to see the disease in black-and-white terms and to endeavour to treat the patient by removing everything regarded as morbid.”
The Dark and Starry Eyes of Ray Bradbury
Lauren Weiner, The New Atlantis, Summer 2012
Entertaining a faith in faith may be pretty conventional stuff, but it connects Bradbury (who was raised vaguely Baptist) to American readers in a unique way. Then, too, consider the leave-me-alone libertarianism that is typical of genre fiction. The bon mot of the sci-fi writer Octavia Butler captured it: “I’ve never believed in utopias,” she said, “since my utopia could so easily be someone else’s hell.” Bradbury doesn’t believe in utopia, either; he is seeking for mankind not perfection but a cheerful muddling through. Yet he is no libertarian. He naturally thinks in terms of the group, not the individual — as in the moving final vignette of The Martian Chronicles, with the pioneer family from Minnesota that has escaped a moribund Planet Earth and that will seed new human communities on Mars. Again, sheer convention — and yet in Bradbury’s hands it is given a touch of the sublime. More important than the technology that humans invent is the vision of the inventors; the fact that they dared is what matters most. Bradbury wrote stories that tried to hypnotize us into finding the future oddly, but comfortably, familiar — so that we might go forward to meet it not in fearful uncertainty but with courage, and therefore with success.
Beyond the Fields We Know
Michael Dirda, Lapham’s Quarterly, October 24, 2012
[NOTE: This essay is simply magnificent. Dirda enfolds and examines a galaxy of writers in his argument that supernatural and metaphysical fiction is the native literary mode of exploring the unconscious and dealing with serious cutting-edge cultural ideas, and that its ancientness and ubiquity underscores the lurking fact that our current “much vaunted realism is the sport, the mutant.”]
Human desires being limitless, a vestigial tropism for magic and the supernatural is thus likely to be with us always. Our twenty-first-century minds no longer grant credence to love charms or prophecies or spells, yet our hearts still thrill to fairy tales, ghost stories, and the wonders of the Arabian Nights entertainments. While science fiction, the literature of extrapolation, answers the question “if this goes on,” stories of the supernatural build on “what if,” or even the hushed unspoken wish “if only.” They are tales of transcendence, whether of incontrovertible facts like death or of the horrors of modern life or of the burden of our own personalities. The fantastic pervades the world’s literatures of every time and place; our much vaunted realism is the sport, the mutant … [T]he supernatural is the habitual mode by which writers explore the irrational and the subconscious: As within, so without … Far too long, I think, the realist novel has dominated our thinking about the course of English literature. Let us honor the marvelous as well as the matter of fact! It is time we paid more attention to metaphysical fiction, whether labeled fantasy, supernatural thriller or spiritual psychodrama. Some high spots of this lineage include Mary Shelley’s Frankenstein, James Hogg’s Memoirs and Confessions of a Justified Sinner, Emily Brontë’s Wuthering Heights, George MacDonald’s Phantastes, Robert Louis Stevenson’s Dr. Jekyll and Mr. Hyde, and Joseph Conrad’s Heart of Darkness and The Secret Sharer. These demanding and disturbing novels of Machen, de la Mare, and Blackwood belong in their company. But there are many more examples in the twentieth century, from the light-hearted to the tragic: Sylvia Townsend Warner’s Lolly Willowes, David Garnett’s Lady Into Fox, David Lindsay’s A Voyage to Arcturus, Franz Kafka’s “Metamorphosis,” H.P. Lovecraft’s “The Shadow Out of Time,” Charles Williams’ All Hallows’ Eve, John Crowley’s Little, Big, and Philip K. Dick’s The Three Stigmata of Palmer Eldritch, to mention just a few. Such books remind us that we are all strangers and pilgrims.
The problem with Epstein's article on the decline of the liberal arts is that, while it is true as far as it goes, it overlooks the far greater threat to the liberal arts: the profit-obsessed corporate capitalism that the Weekly Standard exists to defend at all costs.