EDUCAUSE Review Online

Individual Knowledge in the Internet Age


© 2010 Larry Sanger. The text of this article is licensed under the Creative Commons Attribution-Noncommercial-ShareAlike 3.0 License (http://creativecommons.org/licenses/by-nc-sa/3.0/).

EDUCAUSE Review, vol. 45, no. 2 (March/April 2010): 14-24

Larry Sanger is best known as co-founder of Wikipedia. He went on to start another wiki encyclopedia project, Citizendium.org, and has most recently started an innovative directory of educational videos: WatchKnow.org. He is also interested in educational philosophy and policy. He earned his Ph.D. in Philosophy, with a dissertation about the theory of knowledge, from Ohio State in 2000.

In the last several years, many observers of education and learning have been stunned by the abundance of information online, the ever-faster findability of answers, and the productivity of online "crowds," which have created information resources like Wikipedia and YouTube. The enormous scope of these developments has surprised me too, despite the fact that they are more or less what many of us had hoped for and deliberately tried to bring into being. These sudden, revolutionary developments demand analysis: How is this latest information explosion changing the way we live? Is the relationship between society and individual changing? More to the point for this article, how is the Internet revolution changing education?

I will analyze three common strands of current thought about education and the Internet. First is the idea that the instant availability of information online makes the memorization of facts unnecessary or less necessary. Second is the celebration of the virtues of collaborative learning as superior to outmoded individual learning. And third is the insistence that lengthy, complex books, which constitute a single, static, one-way conversation with an individual, are inferior to knowledge co-constructed by members of a group.

Though seemingly disparate, these three strands of thought are interrelated. Each tends to substitute the Internet — both a resource and an innovative way to organize people — for individual learning and knowledge. I have devoted my Internet career to creating educational tools, so I am sympathetic to the use of the Internet for education. But I believe that it is a profound mistake to think that the tools of the Internet can replace the effortful, careful development of the individual mind — the sort of development that is fostered by a solid liberal arts education.

Unnecessary Memorization

Whenever I encounter yet another instance of educationists' arguments against "memorizing," the following rather abstract yet simple thought springs to my philosopher's mind: Surely the only way to know something is to have memorized it. How can I be said to know something that I do not remember? So being opposed to memorizing has always sounded to me like being opposed to knowledge. I realize this argument likely seems glib. The thing educationists object to, of course, is not the remembering or even the memorizing but rather the memorizing by rote — that is, by dull repetition and often without experience or understanding.

In a December 2008 interview, Don Tapscott, a popular writer on the subject of the Internet and society, argued that the Internet is now "the fountain of knowledge" and that students need not memorize particular facts such as historical dates. "It is enough that they know about the Battle of Hastings," he said, "without having to memorise that it was in 1066. They can look that up and position it in history with a click on Google."1 This view is common enough among the Wikipedia users I have come across; they sometimes declare that since the free online encyclopedia is so huge and easy to use, they feel less pressure to commit "trivia" to memory. More recently, the online salon Edge asked 172 prominent scientists and thinkers: "How is the Internet changing the way you think?" (http://www.edge.org/q2010/q10_index.html). One of the main themes of the responses was that, metaphorically put, we will soon be "uploading our brains" to the Internet — or more literally, we will be relying increasingly on the Internet as an extension or prosthesis of our memory.2 As David Dalrymple (to take just one example) noted:

Before the Internet, most professional occupations required a large body of knowledge, accumulated over years or even decades of experience. But now, anyone with good critical thinking skills and the ability to focus on the important information can retrieve it on demand from the Internet, rather than her own memory. On the other hand, those with wandering minds, who might once have been able to focus by isolating themselves with their work, now often cannot work without the Internet, which simultaneously furnishes a panoply of unrelated information — whether about their friends' doings, celebrity news, limericks, or millions of other sources of distraction. The bottom line is that how well an employee can focus might now be more important than how knowledgeable he is. Knowledge was once an internal property of a person, and focus on the task at hand could be imposed externally, but with the Internet, knowledge can be supplied externally, but focus must be forced internally.3

Dalrymple explicitly draws the conclusion — that the Internet has made acquiring "a large body of knowledge" unnecessary, since it can be "supplied externally" — that Tapscott implicitly conveyed in his interview. Of course, nobody is saying that knowledge is unimportant. Tapscott, for example, says simply that knowing the exact date of the Battle of Hastings is not important. Dalrymple says only that acquiring "a large body of knowledge" is not necessary. So is this merely a practical problem of detail, the problem of distinguishing the essential, important-to-remember items from the trivial, just-look-it-up-on-Google items?

No. In fact, this is a deeply theoretical and crucial issue. What counts as trivial? How much knowledge is enough, so that more than that would be a large body of (presumably unnecessary) individual knowledge? And what makes acquiring knowledge valuable in the first place? I suspect that these questions are ultimately beside the point for those who promote the educational value of the Internet. Their point, I think, is merely that we can now learn less than we have learned in the past, again because the Internet is such a ready mental prosthesis.

But to claim that the Internet allows us to learn less, or that it makes memorizing less important, is to belie any profound grasp of the nature of knowledge. Finding out a fact about a topic with a search in Wolfram Alpha, for example, is very different indeed from knowing about and understanding the topic. Having a well-understood fact ready to recall is far different from merely getting an unfamiliar answer to a question. Reading a few sentences in Wikipedia about some theories on the causes of the Great Depression does not mean that one thereby knows or understands this topic. Being able to read (or view) anything quickly on a topic can provide one with information, but actually having a knowledge of or understanding about the topic will always require critical study. The Internet will never change that.

Moreover, if you read an answer to a question, you usually need fairly substantial background knowledge to interpret the answer. For example, if you have never memorized any dates, then when you discover from Wikipedia that the Battle of Hastings took place in 1066, this fact will mean absolutely nothing to you. (Anyone who has tried to teach a little history to young children, as I have to my three-year-old, knows this.) Indeed, you need knowledge in order to know what questions to ask. Defenders of liberal arts education often remind us that the point of a good education is not merely to amass a lot of facts.4 The point is to develop judgment or understanding of questions that require a nuanced grasp of the various facts and to thereby develop the ability to think about and use those facts. If you do not have copious essential facts at the ready, then you will not be able to make wise judgments that depend on your understanding of those facts, regardless of how fast you can look them up. If public intellectuals can say, without being laughed at and roundly condemned, that the Internet makes learning ("memorizing") facts unnecessary because facts can always be looked up, then I fear that we have come to a very low point in our intellectual culture. I fear we have completely devalued or, perhaps worse, forgotten about the deep importance of the sort of nuanced, rational, and relatively unprejudiced understanding of issues that a liberal education provides.

That I make this point may seem ironic. After all, in place of rote memorizing of trivia, practically everybody agrees that we should "learn how to learn" — and this is one of the hallmarks of a liberal education. Tapscott argued that the ability to learn new things is more important than ever "in a world where you have to process new information at lightning speed." He added: "Children are going to have to reinvent their knowledge base multiple times. So for them memorising facts and figures is a waste of time."5 This is an old argument of many educationists: the ever-changing nature of science and technology in the information age makes it unnecessary to amass a lot of soon-to-be-out-of-date knowledge. Since an ever-expanding amount of information and research is frequently updating our understanding of disciplines, there is no reason to insist on memorizing facts and figures — and no reason to insist on a core of basic knowledge and books that should be mastered.

But this argument seems fallacious. It implies that the new information has either replaced or made trivial the old information. And this is obviously not so in most subjects. Think of all the things typically taught in primary schools: reading, writing, mathematics, basic science. How much of this has changed in the last one hundred years? Even granting that some of our understanding, especially in more advanced education, has been replaced (as in nuclear physics and geography) or refined (as in biology and history), the vast body of essential facts that undergird any sophisticated understanding of the way the world works does not change rapidly.6 This is as true in biology and medicine, fields with stunning recent advances, as it is in mathematics and philosophy.7 And to return to my point, unless one learns the basics in those fields, Googling a question will merely allow one to parrot an answer — not to understand it.

It also won't do to make the facile reply that there is no such thing as "the basics." The basics can be understood as what is commonly taught in introductory courses or what commonly appears in introductory textbooks. Granted, there are some (new) specialized fields in which there are relatively few basics that everyone is taught — I am thinking of knowledge management, computer programming, and social media. But in most fields, there is certainly a body of core knowledge.

To possess a substantial understanding of a field requires not just memorizing the facts and figures that are used by everyone in the field but also practicing, using, and internalizing those basics. To return to my "glib" argument, surely the only way to begin to know something is to have memorized it.

Outmoded Individual Learning

Belittling substantial knowledge as unnecessary rote memorization, in the new age of Internet searching, is only one way in which the Internet is being made to substitute for the difficult work of developing individual minds. Another way is to suggest, often vaguely, that collaborative work via the Internet makes more traditional modes of study old-fashioned and also unnecessary. The first attack is on the content of learning; the second is on the method.

I have some acquaintance with the use of online collaboration in learning. My appreciation for it began in the mid-1990s when I was a philosophy graduate student. I started several mailing lists and organized careful readings of some philosophy books. I also started an online philosophical society, which hosted fascinating discussions of many topics. I learned a tremendous amount from these extracurricular activities. Later, in the first years of Wikipedia, I saw collaboration being used in course assignments. Then a few years later, I had students in my philosophy of law course post their writings and discussions on a wiki. More recently, another wiki encyclopedia project I started, Citizendium, officially invited college teachers to assign group-written encyclopedia articles via our Eduzendium program. These various experiences have given me some practical idea of the merits and drawbacks of collaborative learning online.

My own view of online collaborative learning is that it can be an excellent method of (1) exchanging written ideas, especially when those involved are interested and motivated, via student forums, and (2) obtaining free public reviews of students' work, on wikis. There are drawbacks with each of these, however. First, as to online student forums, attempting to spark a lively online, real-time, always-on conversation among reluctant students is apt to be about as easy as sparking a more traditional lively conversation among similarly reluctant students. That is, the remarks in both forums can be disappointingly perfunctory and not apt to teach much to anyone except, maybe, the student making them. I think some teachers who are early adopters of online student forums probably became enthusiastic about the prospects because of their own experiences, in which conversing online with colleagues and fellow hobbyists can be a fantastic way of learning about one's interests. But there is no reason to think that adopting the tool — online conversation — will necessarily reproduce, in students, either the motivation to pursue interests or the resulting increase in knowledge.

Regarding the second method — getting reviews of students' work on wikis — I and my colleagues have discovered that it can be a handy way for teachers to avoid having to read and give feedback on early drafts of student work. Having students post their work on Wikipedia — or, better, Citizendium, which has many more important topics still wide open — can result in their getting feedback from "the regulars," and that can be very valuable. One problem, however, is that the regulars might have some rather idiosyncratic ideas about the subject (especially on Wikipedia), which arguably wastes the student's time. Another problem is that a significant level of useful feedback cannot be guaranteed. Some students might get a huge amount of feedback, and others might get none, and that hardly seems fair.

Another way of using wikis (and similar online collaborative tools) is to require students to work together on co-written papers. But it is definitely a mistake to think that using a wiki as a tool will, by itself, reliably create the same magic and excitement among students as was created in the days of Wikipedia's growth. For one thing, many Wikipedians have been motivated by vanity — by a desire to see their own words prevail as representing "what is known" on a subject. In contrast, students who are required by a teacher to write articles together do not suddenly acquire such a motivation from the use of the tool, especially when using a private wiki.

So it goes, I suspect, with all social media. There is no reason to think that repurposing social media for education will magically make students more inspired and engaged. What inspires and engages some people about social media is the passion for their individual, personal interests, as well as the desire to stay in touch with friends. Remove those crucial elements, and you merely have some neat new software tools that make communication faster.

Some of the claims made on behalf of online collaborative learning are quite dramatic. It seems that some educationists are conceiving of a whole new pedagogy centered on groupwork done online. In a recent EDUCAUSE Review article, for example, John Seely Brown and Richard P. Adler position "social learning" online as the cornerstone of "Learning 2.0."8 And in another EDUCAUSE Review article, Don Tapscott and Anthony D. Williams argued that we should "adopt collaborative learning as the core model of pedagogy," with "the Internet and the new digital platforms for learning" being "critical to all of this."9

Brown and Adler do not view the Internet as merely a fancy new set of tools, as I am inclined to.10 They regard it as potentially revolutionary for educational methods:

The most profound impact of the Internet, an impact that has yet to be fully realized, is its ability to support and expand the various aspects of social learning. What do we mean by "social learning"? Perhaps the simplest way to explain this concept is to note that social learning is based on the premise that our understanding of content is socially constructed through conversations about that content and through grounded interactions, especially with others, around problems or actions. The focus is not so much on what we are learning but on how we are learning.

They draw a distinction between social learning and what they call "Cartesian" learning:

The emphasis on social learning stands in sharp contrast to the traditional Cartesian view of knowledge and learning — a view that has largely dominated the way education has been structured for over one hundred years. The Cartesian perspective assumes that knowledge is a kind of substance and that pedagogy concerns the best way to transfer this substance from teachers to students. By contrast, instead of starting from the Cartesian premise of "I think, therefore I am," and from the assumption that knowledge is something that is transferred to the student via various pedagogical strategies, the social view of learning says, "We participate, therefore we are."11

I will pass over some philosophical quibbles with this,12 and I hope I will not be blamed if I try to respond to Brown and Adler in a fairly brief space: properly analyzing all of their theories would take much longer. My focus here is specifically on the notion that the opportunities afforded by the Internet ought to fundamentally change the way that we teach.

First is their distinction between the Cartesian view of learning and the social view of learning. In Cartesian learning, knowledge is "transferred" from a teacher to a student, whereas social learning involves students "constructing" knowledge collaboratively. Brown and Adler apparently think that since knowledge is transferred, according to the Cartesian view, there must be something that is transferred, like a baton that is passed. But why saddle Cartesian learning with the notion of a transferred substance? One could just as easily, and with just as much justification, assert that what is constructed in social learning is a "substance" that is socially shared. One can simply say instead that Cartesian learning involves the teacher causing the student to believe something that is true, by communicating the true thought.

In Cartesian learning, a person learns by himself or herself with the help of a teacher — hence the article's drawing of the lone "I think, therefore I am" student. On the other hand, in social learning, students learn together in a group — hence the article's drawing of the two "We participate, therefore we are" students. The distinction here is simply the difference between learning with and without the help of peers.13 Examples of Cartesian learning would involve reading a book, doing homework alone, or writing a paper by oneself. Examples of social learning would involve discussions, doing homework together, co-writing papers, and working or doing practicums together. As Brown and Adler explain:

This perspective [the social view of learning] shifts the focus of our attention from the content of a subject to the learning activities and human interactions around which that content is situated. This perspective also helps to explain the effectiveness of study groups. Students in these groups can ask questions to clarify areas of uncertainty or confusion, can improve their grasp of the material by hearing the answers to questions from fellow students, and perhaps most powerfully, can take on the role of teacher to help other group members benefit from their understanding (one of the best ways to learn something is, after all, to teach it to others).14

The conclusion here is that social learning is superior to Cartesian learning because students who use study groups score better than those who study alone (says one group of researchers), students in groups can ask questions and hear answers more readily, and students in groups can teach each other. This reasoning hardly clinches the matter. But then, I doubt this is meant as a rigorous argument. It is simply an articulation of a view for those educationists who are already inclined to agree with both the premises and the conclusion, so that they can nod their heads as they move on to the meat of the paper.

The meat of Brown and Adler's paper comes when they present some Internet tools for "extending education." These tools provide examples of how social learning flourishes online. The examples include the Terra Incognita project and a Harvard Law School course, which were developed in virtual classrooms within Second Life; Digital StudyHall, an online help center for students in India; the Faulkes Telescope Project, which allows students to access high-powered telescopes via the Internet, and a couple of other astronomy tools; the Decameron Web, a site for study of and scholarship about the classic work by Giovanni Boccaccio; and public blogging by David Wiley's students in a course called "Understanding Online Interaction."

I am sure that such educational tools are exciting, fascinating, and no doubt excellent learning resources. The Internet in general is the greatest educational tool that has been devised since, perhaps, the invention of the printing press. But the question under examination is whether the mere existence of such learning resources somehow establishes the conclusion that social learning is superior to Cartesian learning — that the Internet makes it possible for social learning to replace or displace more traditional individual learning.

To see just how difficult it would be to establish this conclusion, consider all the many aspects of a quality liberal arts education — not a technical education but a more foundational, liberal one — that involve necessarily individual acts:

  • You can find the Decameron online, you can even listen to another person reading it to you, but you must mentally process it yourself. No one else, certainly no group, can do your reading for you, no matter how helpful they may be in discussing it or summarizing it. Either you read/process it or you don't.
  • Similarly, you may post your essays online in public blogs and benefit from comments others offer, but you will not become well educated unless you engage in the essentially solitary act of writing,15 no matter how much others may assist you with drafts and no matter how much you may help others with collaboratively written papers.
  • It is one thing to engage in a discussion — whether online, in a traditional classroom, or in a study session — and thereby be inspired to think fascinating thoughts, but it is quite another to think creatively and critically for oneself. A person who has no experience or inclination for the latter may work well in groups but would seem to be missing something essential to the ordinary notion of scholarship. My notion of a good scholar — perhaps standards are changing — is someone who is capable of thinking independently.
  • Similarly, you may get tremendous help solving problems in your math and science classes by working in groups, online or off, but ultimately the knowledge and skills developed are your own. After you have engaged in a study session with others, you had better make sure you can do the problems by yourself. If you cannot, you probably do not understand the material.

These four activities — reading, writing, critical thinking, and calculation — should make up the vast bulk of a liberal education. Social learning could not replace these individual, "Cartesian" activities without jettisoning liberal education itself.

Boring Old Books

Perhaps the point is that liberal education is outmoded. Perhaps the advantage being sought from collaborative learning lies in a new mode of life, an online social life perhaps sought for its own sake, as knowledge itself has often been thought to be sought for its own sake. Online communities that are open to students and that promote collaboration count as practical training in that social life. But is fostering a deeply networked online social life among the proper tasks of education, independent of or in addition to or instead of the more traditional tasks of a liberal education? Indeed, is participating in online communities via social media a replacement for reading boring old books?

Until recent years, the question would have been very puzzling, but now the suggestion will sound familiar to many readers. Books, we hear, are old-fashioned: they are not interactive, and they constitute a single, static, one-way conversation with an individual. The Internet theorist Clay Shirky has noted that we are undergoing another, inevitable "transformation of the media landscape" — as happened previously with Gutenberg's printing press — in which an older "monolithic, historic, and elite culture" is being sacrificed in favor of "a diverse, contemporary, and vulgar one." An "upstart literature" is destined to become "the new high culture." Taking on this "challenge" will mean "altering our historic models for the summa bonum of educated life."16 Shirky was writing in response to Nicholas Carr's essay "Is Google Making Us Stupid?"17 Carr explained that the fracturing of our attention may be making us less capable of processing broad, complex information and, more simply, less capable of reading books. For example, Carr noted that an acquaintance of his, a medical professor, admitted to not being able to read War and Peace anymore. On the Britannica Blog, Shirky wrote: "It's not just Carr's friend, and it's not just because of the web — no one reads War and Peace. It's too long, and not so interesting." He underscored his seriousness by saying: "This observation is no less sacrilegious for being true." Shirky proceeded to argue that "the literary world is now losing its normative hold on culture." Both Shirky and Carr quote the playwright Richard Foreman, who observed that the "complex, dense and 'cathedral-like' structure of the highly educated and articulate personality" is at risk.18

Shirky's argument implies that the now-dying medium of paper publishing encouraged "wordy, abstract, and dense" writing and that such writing is now falling away with the demise of the medium. Works like War and Peace — and no doubt most other classics — are outmoded. They are outmoded because the highly networked nature of communication and publishing presents us with a different "media landscape," one with different requirements. The requirements of the new medium make not just books but also personalities that are "complex, dense and 'cathedral-like' " positively anachronistic. But it seems to me that to say so is to declare the irrelevance of most of the thinkers throughout history — as Shirky notes, "the literary world is now losing its normative hold" on our culture. In other words, it seems that Shirky is saying that blog and Twitter posts, Wikipedia and YouTube contributions, which arguably weaken our attentional capabilities, are becoming dominant in our culture and that more challenging, pre-Internet modes of expression, like books, are going by the wayside.

But is knowledge, even the knowledge contained in great books, now something that can be adequately replaced by the collaborative creations of the students themselves? Perhaps that is the point. Perhaps the notion is that knowledge-as-co-created by students is superior to knowledge-as-passed-along-by-teachers-and-books, regardless of quality. Perhaps the accuracy of the information co-created by students does not matter, because as shared information it enjoys a social validity that dusty old volumes and teachers speaking from authority cannot. Perhaps the subtlety and depth of thinking that comes from critical reading and evaluation of great writers does not matter, because information is now ultimately best understood as belonging to and produced by large groups of people. Perhaps being acquainted with the original sources of great ideas does not matter, because reproducing those ideas, even if in stunted ways, enjoys an authenticity that convoluted old texts cannot. Perhaps the perennial nature of the classics, the fact that they have been loved and learned from for generations, does not matter, because in the new publishing and societal paradigm they will be replaced by an "upstart literature" — literature that is more realistic about the capabilities of attention-challenged students.

It might now sound as if I am attacking a straw man, with no one really talking about wholesale replacement. I hope that is true. To be well educated, to be able to pass along the liberal and rational values that undergird our civilization, we must as a culture retain our ability to comprehend long, difficult texts written by individuals. Indeed, the single best method of getting a basic education is to read increasingly difficult and important books. To be sure, other tasks are essential, especially for training in scientific and applied fields; there are some people who are very well trained for various trades without reading many books. But when it comes to getting a solid intellectual grounding — a foundational, liberal education — nothing is less dispensable than getting acquainted with many books created by the "complex, dense" minds of deep-thinking individuals.


Considering the amount of play that collaborative learning and Web 2.0 educational methods have received recently, I suspect that the above discussion may sound pedestrian and even backward. My attitude is probably not at all what one would expect from a co-founder of Wikipedia. But I am not alone in my perspective. The merits of online communities have not at all been universally agreed-upon. Wikipedia, YouTube, Facebook, and Twitter all have harsh critics. More to the point, some of the digerati have contempt specifically for the more collectivist aspects of Internet communities. In this genre, Andrew Keen's The Cult of the Amateur is well known. Other examples include Maggie Jackson's Distracted, which argues that the sheer amount of information and activity in our always-on culture is fracturing our attention and hence our ability to process information. Mark Bauerlein's The Dumbest Generation, far from celebrating social networks for enhancing education, instead blames them for turning the "digital natives" into the "dumbest generation." Perhaps the highest-octane criticism, quite relevant to the current discussion, is Jaron Lanier's essay "Digital Maoism." Among Lanier's well-placed points is that online collaboration in what he (along with Kevin Kelly and others) calls "hive minds" (e.g., Wikipedia) unsurprisingly tends to depersonalize and alienate us, cheapening our individuality and sapping the interest and idiosyncrasy from our writing and thinking.19

While I tip my hat to such thinkers, my main interest in this article has been to analyze the recent boosterism about the educational merits of using Internet tools to replace the effortful, careful development of the individual mind. The three strands of current thought explored above — about how the Internet might change the educational role of memorization, about individual versus collaborative learning, and about the future of books and book-reading — are really just an extension of the older debate over the value of Western civilization, liberal arts, and "the canon." The key assumption underlying my view is that liberal education and the Western enlightenment ideals that it inculcates not only are valuable but are essential to our future.

This is worth emphasizing. Some Internet boosters argue that Google searching serves as a replacement for our memory and that students need not memorize — need not learn — as much as they did before the Internet. Educationists inspire us with the suggestion that collaborative learning online can serve as "the core model of pedagogy." Knowledge is primarily to be developed and delivered by students working in online groups. And finally, the co-creation of knowledge can and should take the place of reading long, dense, and complex books. Such claims run roughshod over the core of a liberal education. They devalue an acquaintance with (involving much memorization of) many facts about history, literature, science, mathematics, the arts, and philosophy. Such claims also ignore the individual nature of much of liberal education. Reading, writing, critical thinking, and calculation, however much they can be assisted by groups, are ultimately individual skills that must, in the main, be practiced by individual minds capable of working independently. And such claims dismiss the depth of thinking that results from a critical reading and evaluation of many long and complex books.

The educational proposals and predictions of the Internet boosters described above point to a profoundly illiberal future. I fear that if we take their advice, in the place of a creative society with a reasonably deep well of liberally educated critical thinkers, we will have a society of drones, enculturated by hive minds, who are able to work together online but who are largely innocent of the texts and habits of study that encourage deep and independent thought. We will be bound by the prejudices of our "digital tribe," ripe for manipulation by whoever has the firmest grip on our dialogue. I see all too much evidence that we are moving headlong in that direction. Indeed, I fear this is already happening. I honestly hope that I prove to be an alarmist, but I am a realist reporting on my observations. I wish the news were better.

  1. Tapscott quoted in Alexandra Frean, "Google Generation Has No Need for Rote Learning," Times (London), December 2, 2008, <http://www.timesonline.co.uk/tol/life_and_style/education/article5270092.ece>.
  2. This curious description of the Internet as a mental "prosthesis" is not my own. See, for example, Stephen M. Kosslyn, "A Small Price to Pay," Edge, <http://www.edge.org/q2010/q10_3.html#kosslyn>, and Clifford Pickover, "The Rise of Internet Prosthetic Brains and Soliton Personhood," Edge, <http://www.edge.org/q2010/q10_11.html#pickover>. I myself addressed the issue in "The Un-Focusing, De-Liberating Effects of Joining the Hive Mind," Edge, <http://www.edge.org/q2010/q10_2.html#sanger>.
  3. David Dalrymple, "Knowledge Is Out, Focus Is In, and People Are Everywhere," Edge, <http://www.edge.org/q2010/q10_16.html#dalrymple>.
  4. John Henry Newman, in The Idea of a University (1873), is my favorite example.
  5. Quoted in Frean, "Google Generation."
  6. In an article footnote, John Seely Brown and Richard P. Adler approvingly quote someone as saying that the "half life" of much technical information is "now less than four years." Perhaps — but that is the nature of technology. It is not the nature of the basic arts and sciences. Formulating educational policy about all of knowledge based on the nature of technical knowledge seems like a very bad idea. John Seely Brown and Richard P. Adler, "Minds on Fire: Open Education, the Long Tail, and Learning 2.0," EDUCAUSE Review, vol. 43, no. 1 (January/February 2008), p. 32 (note #21), <http://www.educause.edu/library/erm0811>.
  7. Would you like to be treated by a doctor who believed that he or she did not need to memorize the basic facts and figures of the field because the field is changing so quickly and facts could be looked up in a medical database?
  8. Brown and Adler, "Minds on Fire."
  9. Don Tapscott and Anthony D. Williams, "Innovating the 21st-Century University: It's Time!" EDUCAUSE Review, vol. 45, no. 1 (January/February 2010), p. 26, <http://www.educause.edu/library/erm1010>.
  10. One thing I am not mentioning here is using the Internet as a way to organize education, as opposed to a way to deliver education. In fact, I've written a speculative "2.0" article on just that topic: "Education 2.0," The Focus (German-based English-language magazine), Spring 2007. A copy can be viewed here: <http://blog.citizendium.org/?p=193>.
  11. Brown and Adler, "Minds on Fire," p. 18 (italics in original).
  12. Descartes thought that each person's mind, not "knowledge" itself somehow, was a mental substance (a thinking thing, res cogitans). The notion that knowledge itself is a "substance" would have been incoherent to him.
  13. The authors do go on to posit "a second, perhaps even more significant, aspect of social learning" — namely, becoming a practitioner in a field. I think this means that according to Brown and Adler's view, one cannot really have learned something "socially" without becoming part of the social community of practice that "does" whatever is learned about. This raises even further issues, which would require much more space to address.
  14. Brown and Adler, "Minds on Fire," p. 18.
  15. Most original acts of writing, even on wikis, are solitary. You, an individual, are the producer of the sentences that others may edit.
  16. Clay Shirky, "Why Abundance Is Good: A Reply to Nick Carr," Encyclopaedia Britannica Blog, July 17, 2008, <http://www.britannica.com/blogs/2008/07/why-abundance-is-good-a-reply-to-nick-carr/>.
  17. Nicholas Carr, "Is Google Making Us Stupid?" The Atlantic, July/August 2008, <http://www.theatlantic.com/doc/200807/google>.
  18. Richard Foreman, "The Pancake People; or, 'The Gods Are Pounding My Head,' " Edge, March 8, 2005, <http://www.edge.org/3rd_culture/foreman05/foreman05_index.html>.
  19. Andrew Keen, The Cult of the Amateur: How Today's Internet Is Killing Our Culture (New York: Doubleday/Currency, 2007); Maggie Jackson, Distracted: The Erosion of Attention and the Coming Dark Age (Amherst, N.Y.: Prometheus Books, 2008); Mark Bauerlein, The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future (or, Don't Trust Anyone under 30) (New York: Jeremy P. Tarcher/Penguin, 2008); Jaron Lanier, "Digital Maoism: The Hazards of the New Online Collectivism," Edge, May 30, 2006, <http://www.edge.org/3rd_culture/lanier06/lanier06_index.html>.


Realism and the Individual Learner

I was struck by the penultimate sentence in this piece and the claim of realism. Can we look at the arguments through the lens offered in the documentary Declining by Degrees? There the ideal liberal arts education was reasonably well delivered at Amherst College, an incredibly selective and high-tuition school. It was much less well delivered at the public institutions depicted in the film. Is this essay meant to protect the style of education at places like Amherst (I doubt there is much threat to that), or is it meant to include public institutions as well?

Where the ideal was not delivered well, there were a variety of causes: (lack of) institutional commitment, commitment of individual instructors, student preparedness, and student engagement. The realistic question to me is how best to approximate the ideal articulated in the paper when students are not so elite, or expenditure per student is not so high, or both.

Recently I've taught some of the elite kids here, at a public university to be sure but one of the better ones in the country. I would say the principal educational threat for these kids is not the Internet. Rather, it is that they are incredibly over-programmed. Much of that, in turn, stems from a perceived need to develop credentials: a long list of extracurricular activities on top of a rigorous curriculum. For critical thinking, the breadth-depth trade-off is out of balance. There is some of the dumbing down of education that concerns this piece, but the causality, at least for this sort of student, is misidentified.

Turning to the more typical students, my understanding is that much of the work arguing for a social networking approach is buttressed by a belief in constructivism. On some of the listservs I frequent, the constructivist approach is taken as the only true way to learn. I am not a constructivist, and I agree with a lot of what is argued in this piece about the individual as a learner; I particularly liked the points about the necessity of foreknowledge for producing new knowledge out of information. But there are some strengths to the constructivist point of view, particularly its focus on motivation. The section on Wikipedia participation talks about motivation; in the discussion of the traditional Cartesian approach, learner motivation is hardly mentioned. If one did incorporate motivation into the discussion in a realistic way, what would emerge? My experience is that students are terribly frightened of showing their ignorance of a subject to anyone, but especially to their instructor. Given that, and trying to identify motivation as emerging from the interplay between individual effort and peer interaction, where some openness may be possible, don't we get something that looks like a hybrid of what is advocated in this piece and what the Tapscotts and Seely Browns are arguing for?

One last point. Much of the media seems to have embraced hyperbole as the norm in making an argument. If a reasonable person nowadays wanted to encourage movement toward some sort of hybrid, would that person argue in print for the hybrid itself, or instead for some overshoot, knowing that is the more likely way to get there?

Posted by: larvan on May 7, 2010

Learning Has Not Changed... much

I do not believe that learning methods have changed because of resources available via the Internet or because of social networking online. I do think that more people have access to resources and community because of the Internet and that, in spite of the digital divide, many people are accessing educational resources and communities that they would not have been able to reach in pre-digital times. I have a master's degree in Library Science that I would not possess were it not for the Internet.

But people have been combining individual study and group study for as long as people have been learning. Think of Socrates, or Jewish scholars. I saw medical students forming study groups before Google and Wikipedia, and I still see them forming study groups. The only difference is that half of them show up to their study group meetings with laptops instead of stacks of books. Has anything changed since then? The media and formats may have changed, but the basic principles of learning have not.

Is the Internet making us dumber?

That is a question I think has been asked for a while, and whenever I hear it I think about how TV was blamed in the past; before TV it was radio, before radio it was the printing press, and before the printing press it was writing. Yes, writing was blamed for making us stupid.

From Socrates, regarding writing: "…father of letters, … this discovery of yours will create forgetfulness in the learners' souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality."

Have things gotten better since Socrates? One could point to the marvels of modern medicine and the widespread lack of head lice, but one could also point to weapons of mass destruction and the depletion of the ozone layer. I do not think we are dumber. I do think that we see dumbness more often, as it is readily presented to us through comment boxes and the like. Just because we have more things to be dumb about and more avenues through which to broadcast it does not actually mean we are dumber.

Posted by: referencegirl on April 16, 2010

