The Off Button

Published: January 15, 2013

Recently the Pandora’s Box of how best to reform our institutions of higher learning has been opened on a number of fronts. Government officials are heard in the media talking about us becoming “road kill” on the information superhighway if we don’t get rid of the “sages on stages” approach to teaching.

The Ontario Government’s 2012 white paper on education, “Strengthening Ontario’s Centres of Creativity, Innovation and Knowledge,” notes that there are “new ways for students to learn from and interact with faculty and each other,” and that we may be entering an age when the “digital delivery of course content can free faculty in traditional institutions to engage in direct dialogue and mentorship with students.”

Diane Rasmussen Neal of Western University warns in an article in The Western News that the modern classroom must “evolve or die,” adopting teaching analogues of Facebook like Edmodo to make education more fun and engaging. “The only certainty in today’s online, information-driven world is change, and higher education must evolve along with our technological society in order to remain relevant to our young students and their future employers.”

In fact, the phrase “sages on stages” has become something of a buzzword for digital neo-liberals, repeated over and over in the media recently.

Such views are misinformed at best, crude propaganda for Apple and Microsoft at worst. A careful analysis of the role of digital technology in higher education will show that it’s promoted ignorance, not knowledge, and severely degraded basic reading, writing, and thinking skills. It’s time to hit the off button.

All technologies are ambivalent in the sense that for every benefit they offer us we must pay some sort of price. Personal cars allow us to zip across town in the privacy of our own steel cabins, but they also cause noise, pollution, traffic congestion, accidents, and wars in the Middle East. The use of technology in higher education is equally ambivalent, a fact not recognized by its naïve boosters.

When talking about technology in education, we have to make some sharp distinctions. Most of the time we’re not talking about mechanical technologies like bicycles and buses and bullhorns. Also, we’re not talking about medical technologies like penicillin and dental drills. No one wants to go back to a time when we used horses ‘n buggies to cross town or leeches to cure a fever. When we talk about the use of technology in education, we’re talking mostly about digital technologies.

One can find three general views toward technology defended in the public sphere. First, there’s the techno-skepticism I’m defending here, which questions the utility, both political and practical, of adopting each new gadget or system. Radically opposed to this is techno-futurism, which sees almost all technological progress as social progress. Futurists demand that we adopt new technologies as they arise as a sort of moral imperative, rushing to Future Shop at midnight to buy the latest video game or iPhone. Third, there are the techno-defeatists, who say, in effect, “I know this technology may not be beneficial, but almost everyone is using it, so resistance is futile: I had best jump on the bandwagon before it leaves me behind.” Most people commenting on higher education are either defeatists or futurists. The latter are at least defending a moral principle; the former have just given up.

One problem with both defeatists and futurists is that too many of them haven’t spent any time in the classroom in the last decade. If they had, they’d realize that digital technology is already omnipresent there, used by both students and professors. Almost every undergraduate student in North America is addicted to texting on their smart phones and checking their Facebook pages on an hourly basis. Almost every professor uses a computer, projector, and PowerPoint presentation as part of his or her lectures. Most also use video clips and the Internet in some way. Calling for more digital technology in education today is like calling for more white people in the Republican Party.

The suite of digital technologies used on campus is obvious to all active teachers: smart phones, iPods, iPads, and computers connected to the Internet. The real questions are how these are used, and whether these uses contribute anything to the main goal of higher education: to improve students’ minds and characters by helping them to learn facts, debate ideas, and understand the world better. The answer is, for the most part, no – study after study shows that digital technology has dumbed down higher education. These technologies may make education more “fun” and “engaging,” but that is only to say that they’ve turned education into a form of entertainment, which no one who cares about improving the knowledge and skills of their students should defend. Writing essays, reading difficult texts, and figuring out complex mathematical problems are “fun” for very few people.

We need a more careful examination of the beneficial and harmful effects of specific digital technologies before it’s too late and we’re all assimilated by them. On the plus side, the use of the computer as a delivery device for texts and images is largely a positive development. Gone are the nights spent in the bowels of the university library looking through card catalogues and the Social Science Index for books and articles. Now they’re only a few mouse clicks away on the library website. It’s also useful from a teacher’s point of view to be able to display images and video via classroom computers when teaching things like fine art, comics, and film.

However, even this seemingly benign benefit of digital tech comes with a price. Also gone are the hours spent leafing through tomes on library shelves, hours that encouraged lateral thinking by turning up books and articles related to whatever one was investigating, broadening one’s perspective. Digital technology promotes a smart-bomb approach to research, zeroing in on specific targets. For instance, I’ve read Nicholas Carr’s ground-breaking article “Is Google Making Us Stupid?” online, but I have no idea what the other articles in that summer 2008 issue of The Atlantic are about.

Laptops in the classroom are much more of a problem. Yes, one in 10 students actually uses them to look up facts and issues related to the topic of the lecture they’re listening to, but the other nine are using classroom wifi to check their Facebook pages, email, or celebrity websites. Portable computers combine all four of the general functions of digital technology: information delivery, peer communication, entertainment, and procrastination. Cellphones concentrate on the last three functions, and have no pedagogical purpose that I can see. They are merely an annoyance that has to be policed. They also encourage students’ obsession with having CNN-style updates on their grades.

Anyone who has walked to the back of a university classroom and seen what students are actually doing with their various screens will abandon any sense that digital technology plays a positive role in the classroom. Once notes are taken, the great majority revert to the peer communication, entertainment, and procrastination functions of the computer. They learn nothing of value from these functions. By contrast, many of the best students I’ve had in the last decade print off class notes at home, then come to class with these, a pen, and a book or two in hand. Overall, the use of laptops in the classroom hurts students’ ability to learn.

The latest craze in the U.S. is MOOCs, massive open online courses offered both by specific colleges and by more profit-oriented companies such as Coursera and Udacity. Now instead of having Dr. Mike lecture to a thousand first-year psychology students in front of a twenty-foot screen with an army of TAs in tow, major universities like Harvard and MIT can administer courses with 100,000 or more students over the Net. Indeed, an early MOOC taught through Stanford by Sebastian Thrun of Udacity drew a surprising 160,000 signups. Duke University professor and blogger Cathy Davidson calls them “breathtaking” innovations with transformative, epoch-making potential.

Those soft liberals questioning the value of MOOCs have their brains turn spongy when digital entrepreneurs inform them that MOOCs are ideal learning instruments for the disabled, the poor, and those in remote communities without access to campus life in major cities. And indeed they can perform such a useful function. However, this function is like the politically correct meat thrown to distract the watchdogs of the critical mind while digital burglars abscond with yet another aspect of higher education. The danger is that MOOCs will migrate from the kitchen tables of soccer moms to the core curricula of major universities, from well-meaning amateur learning to professional credentialism. And have no illusions: university students today want credentials.

Four types of critiques can be made of these MOOCs. First, there are the practical problems of dealing with thousands of people online instead of a hundred in a physical classroom: how do you know students are actually learning anything? If you test them, how do you avoid cheating? And who wants to mark a thousand essays, a sizable proportion of which will, no doubt, be semi-literate? I vaguely recall this was the punishment assigned to sinners in Dante’s 8th circle of the Inferno, but I could be wrong, since I took the course online.

Second, there is what I’ll call the “de-socialization” effect of such courses. Yes, Johnny and Joanne will no longer have to put on their boots and coats on a cold January morning to trudge to the Social Science Center for a lecture on political theory. But is this a good thing? Leaving aside the lack of physical exercise digital society engenders, MOOCers will no longer have to directly engage with their fellow students or listen to points of view they don’t like. Our current peer culture will fragment into a series of micro-peer subcultures. Laura Pappano reports in the New York Times on November 2, 2012 that getting fellow MOOCers to meet for study groups in real spaces is a hard sell, mentioning one meeting planned by an information tech manager where no one showed up.

Third, there are the “digital effects” of converting the physical to the virtual. Allow me a parable. When I was younger, I was an avid board gamer, concentrating on military strategy and sci-fi role-playing games. Typically, a board game would have a 10-20 page rule book, with some “campaign” style games like Star Fleet Battles having over a hundred pages. After some early attempts to mimic board games, by the end of the century video games had internalized the mechanics the traditional board game laid out in its rules, in recent years offering gamers a slim 4-8 page booklet inside the DVD case that contains little more than credits and information on how to install the game. As video games became more sophisticated visually, they became easier to play. And they shifted from history to fantasy: if you search for PS3 games on Amazon.com, you’ll find one page of historical strategy games, but hundreds of first-person shooters and fantasy role-playing games.

The same digital effects seen in gaming – a move to simpler and more visually appealing content at the cost of cognitive crunching and historical and political knowledge – are likely to accompany the shift from smaller classroom courses to MOOCs with thousands of enrollees. Education will be gamified, in the worst sense. After wading through page after page of high-minded rhetoric about “connectivity” and “collaborative institutions” in reports like Cathy Davidson and David Theo Goldberg’s “The Future of Learning Institutions in a Digital Age,” one is massively let down when one discovers that the exemplars of the “new learning” are Second Life, SimCity, Civilization, and World of Warcraft, accompanied by a picture of one student showing off his elfin-eared avatar. Galadriel would be proud.

MOOC students will never see in the flesh the ghosts in the machines that teach them. Connected to these effects is the refusal of modern students to do any work not directly associated with their grades, which doesn’t bode well for the success of self-directed online learning. There are already reports of massive dropout rates in the non-credentialed MOOCs offered so far – Pappano mentions a machine learning course where 72 per cent of enrollees didn’t finish.

Lastly, if MOOCs become credentialed courses in major universities in Canada, they will inevitably contribute to the McDonaldization of higher learning. Imagine a sociology or English MOOC with 5,000 students. How will the professor provide feedback to individual students? How will anyone mark with care the thousands of essays that it produces? Will there be any meaningful dialogue between teachers and students, or even between students? The answer is simple: no. We’ll let machines do the work. So no more essays, no more seminars, and multiple-choice exams marked by computers only. Such a course has only one purpose: to save money by turning learning into a Fordist assembly line. On major campuses the rationale is economic, an extension of the shift from permanent to part-time faculty in undergraduate teaching that began in the 1990s.

Returning to the big picture, what’s especially frustrating about the blind support for digital technology bruited in government white papers and the mass media today is the refusal of both futurists and defeatists to acknowledge the substantial body of empirical research and theoretical argument produced over the last decade that questions the value of such technology.

Mark Bauerlein’s The Dumbest Generation contains literally dozens of studies that show how digital technology has helped to create a generation of proud bibliophobes who avoid complex knowledge like the plague. As Bauerlein says, young users of digital tech “upload and download, surf and chat, post and design, but they haven’t learned to analyze a complex text, store facts in their heads, comprehend a foreign policy decision, take lessons from history, or spell correctly” (201-202). Talkers become texters, readers morph into web surfers.

Nicholas Carr has shown how Wikipedia, Google, and other websites have fragmented our memories and attention spans, making an evening spent reading War and Peace with all screens off almost impossible. Jean Twenge and Keith Campbell’s The Narcissism Epidemic shows how celebrity culture, Web 2.0, and soft parenting have inflated young people’s sense of self-esteem beyond all reasonable boundaries of actual achievement. The mass culture tells them that everyone can be a star, facts be damned.

It’s not unusual for students to pick up poorly written and researched essays with bad grades, look at the mark for a few seconds, then forget about them or simply throw them out (I’ve actually seen this happen). In the worst cases, they don’t care why they got the grade, but are just frustrated by the bad result. This typifies the most cynical student attitude today toward institutions of higher learning: that they’re just degree-granting machines where the student is incarcerated for a few years before graduating, getting a job, and buying lots of stuff.

Digital narcissists don’t care about their inability to read and write English or their ignorance of a range of basic historical and political facts. Their egos prevent them from acknowledging the absolutely essential notion that they can actually learn something from institutions of higher education, especially when teachers give them low marks that damage self-images pumped up by hours and hours of looking at the carefully posed pictures they’ve posted on their Facebook pages. Revenge time comes at the end of term, when they can fill out anonymous class evaluations.

What we need are technologies and techniques that decrease self-involvement and narcissism. Yes, Facebook may help shy students express themselves digitally. But wouldn’t it be better if they fought that shyness by talking to flesh-and-blood people in a real physical environment? In the long term, which life technique will help them more? Teachers today too often take the easy way out, doing what will make them popular with students, rather than helping them develop their skills and characters. Short-term solutions breed long-term problems.

My solution? Hit the off button in as many places as we can. Turn off wifi in the classroom, restricting it to student lounges scattered across campus. Create a school-wide policy that bans the use of cellphones during lectures and seminars. Since texting has become an addiction for many, treat cells like cigarettes: if you want to text, do it outside. Ban the use of social networking websites during class. Stop promoting Internet-based courses: these are cheap imitations of the real thing. Digital technologies can be great delivery devices. But what they too often deliver has nothing to do with education.

Doug Mann is an adjunct professor in Sociology and FIMS at UWO who checks his email every day, uses computers and projectors in every lecture he’s given since at least 2005, and has never met Ted Kaczynski. A much shorter version of this article appeared in the Toronto Star on October 6, 2012.

Photo courtesy of Reuters.
