Throughout the 1990s the Clinton administration pushed hard for the universal integration of computers and information technology throughout America’s public education system, culminating in Bill Clinton’s official presidential call for “a computer in every classroom,” since, in his words, technology is “the great equalizer” for schools. No matter that this was an idea (and ideology) essentially conjured from thin air, lacking any real evidentiary support. No matter that, as Todd Oppenheimer incisively argued in a now-classic 1997 Atlantic article titled “The Computer Delusion” (and later in his 2003 book-length expansion of it, The Flickering Mind: The False Promise of Technology in the Classroom and How Learning Can Be Saved), “There is no good evidence that most uses of computers significantly improve teaching and learning, yet school districts are cutting programs — music, art, physical education — that enrich children’s lives to make room for this dubious nostrum, and the Clinton Administration has embraced the goal of ‘computers in every classroom’ with credulous and costly enthusiasm.”

The techno-utopian impulse for America’s schools proved unstoppable on a practical level, and schools en masse, from kindergarten to college, swallowed it hook, line, and sinker. The idea prevalent at administrative levels was, and still is (as I can vouch from having spent the last decade-plus working in high school and college settings), that technology in and of itself is a Great Thing that will Revolutionize Learning. Even though many individual administrators and teachers are quite savvy about the techno-utopian gospel and sensitive to its nuances, the overall institutional-cultural pressure pushes overwhelmingly toward uncritical adoption.
A week ago the Innovations blog at The Chronicle of Higher Education published a post that serves as a kind of update on these matters, issuing a timely warning about the persistent delusion that technology’s impact on education, especially at the college level, is automatic and automatically wholesome:
The notion is widespread that higher education, and maybe all education, is about to be transformed by technology. This hope or fear (depending on where you sit) is fed by the breakneck pace at which new technologies burst upon the scene. Enterprises like Google and Facebook are among the poster children for this observation, and who can fail to be astounded by the way first the iPhone and then seemingly ten minutes later the iPad transformed first the cell phone and then the portable computing industries. Perhaps most important, these devices have quickly transformed the way many of us communicate, read, and organize many aspects of our lives.
Technology and technical change certainly have great potential for changing education. The way students study today is dramatically different from the way they studied even a decade ago. But questions about the pace and character of change require some perspective…[C]reating new technologies is a much more straightforward task than transforming the educational processes to which those technologies contribute. And the classroom (whether physical or virtual) processes and the individual courses they compose can’t quickly and simply be put together into new educational institutions and experiences. Attempting to transform them overnight is more likely to yield an Apple Newton than a nice new iPad.
— Sandy Baum and Michael McPherson, “Instant Revolution? Technology and Higher Education,” Innovations, The Chronicle of Higher Education, June 2, 2012
By way of illustration and comparison, the authors call attention to an extremely interesting article published last month by The New England Journal of Medicine that examines the decades-long attempt to integrate information technology into the health care profession in the hope of “catalyz[ing] the transformation of health care delivery in the United States from a fragmented cottage industry plagued by poor quality and high costs to a highly organized, integrated system that delivers high-quality care efficiently.” Drawing not only on the results of this push in the medical arena but also on the comparative results of adopting new technology in other industries, the article arrives at a clear and compelling lesson:
[C]urrent conclusions about the value of health IT investments may be premature. Research suggests three lessons for physicians and health care leaders: invest in creating new measures of productivity that can reveal the quality and cost gains that arise from health IT, avoid impatience or overly optimistic expectations about return on investment and focus on the delivery reengineering needed to create a productivity payoff, and pay greater attention to measuring and improving IT usability. In the meantime, avoiding broad claims about overall value that are based on limited evidence may permit a clearer focus on the best ways of optimizing IT’s use in health care.
— Spencer S. Jones et al., “Unraveling the IT Productivity Paradox–Lessons for Health Care,” The New England Journal of Medicine, June 20, 2012
“Avoiding broad claims about overall value that are based on limited evidence” is of course grand advice for all people, in all places, at all times. But the American education system is deeply and directly entangled with gargantuan questions of economic inequality and apocalyptic sociopolitical turmoil, even as it goes on fulfilling its traditional role of molding and cultivating minds and characters (albeit now in a run-amok vein of corporate consumerism). Given that entanglement, Baum and McPherson are surely correct when they say in their Chronicle post that the medical article’s advice is “well worth the attention of educational leaders.”
Whether anybody in a significant position of power will actually hear this advice, pay attention, and act on it is another question altogether. In The End of Education: Redefining the Value of School, published in 1995, just a year before Clinton issued his call for computers in classrooms, Neil Postman argued that America’s education system has come to be defined by technology worship and consumerism. “Nowhere do you find more enthusiasm for the god of Technology than among educators,” he wrote. “There is hardly a school superintendent anywhere, or a college dean, who cannot give us a ready-made sermon on how we now live in an ‘information age.’ ”

Chances are this won’t change. Like so many other aspects of life at our current cultural and civilizational juncture, techno-utopianism, including its educational iteration, is a self-perpetuating meme or, to say much the same thing in different terms, a daemon run amok. This means we can’t (or shouldn’t) oppose it directly, because direct opposition only underscores and intensifies its power. Our entrancement with technology is a truly Frankensteinian situation. The best thing we can do is listen to it individually, each of us in the sanctum of his or her own soul, and hear what it has to tell us about who we are, how we’re built, what we want, and where we’re headed.