Grinell Smith (SJSU)

Introduction

The central claim of an increasingly popular position regarding the future of computer technology in education is this: technology has changed our students in profound ways that affect how they learn. The implication is that as educators we must profoundly change the way we teach them (see, for example, Ellison, 1994; Frand, 2000; Gaston, 2006; Howe & Strauss, 2000, 2003; Oblinger, 2003; Prensky, 2001, 2003, 2005, 2006; Prensky & Berry, 2001; Tapscott, 1998). The claim has become so commonplace in the education literature that much of the recently published research treats it as if it had been established beyond doubt and simply proceeds to prescribe how our teaching must change in response. However, there are hidden dangers in rushing toward reform without careful examination of the underpinning justifications: at a minimum, we risk spending time and money that could be better spent elsewhere; more troublingly, we risk damaging the relationship between teachers and students by reifying differences between them.

At first glance, the claim that technology has transformed us seems unassailable, in no small part because of the undeniable reach of technology into our everyday lives. A more critical view, however, reveals that the claim is supported in large part by anecdotal rather than empirical evidence, is based on a debunked model of the mind as a blank slate (Pinker, 2003), and requires a curious ahistoricity with regard to progress (Bowers, 1988, 1993, 2000, 2008; Danto, 1991; Feenberg, 2002). Further, the implication that we must change the face of teaching to accommodate students who have been transformed by technology stakes out a solidly deterministic view of technology's relationship to education (Bowers, 2000; Cuban, 2001; McKnight & Robinson, 2006). As McKnight and Robinson (2006) point out:

…with a few exceptions (e.g. Postman, 2000; Ross, 2000; Stanley, 2001), those involved with research on technology within the social studies fail to grapple with, much less acknowledge, the preponderance of historical, social and critical theory literature that interrogates and problematizes the very technological deterministic discourse and its underlying meta-narrative upon which much of the research is based. (para 6)

As inevitable as a marriage of education and technology may be, the mainstream position that dogmatically focuses only on the benefits of technology in education while assiduously ignoring the downsides forecloses the possibility of constructive discussion of dissenting opinion and makes us blind to the "danger of thinking of computers only within the framework of their personal use and within the myth of progress that surrounds them" (Bowers, 2000, p. 2). This paper questions the validity of the claim that technology has changed our children in ways relevant to how we should structure education, and cautions against pursuing reform without a clear and compelling reason to do so.


The evidence of change

A typical approach to supporting the premise that children have been transformed by technology is not to refer to empirical evidence but rather to trot out a few suitably stunning statistics about the pace of technological breakthroughs, or to offer a few overwhelming anecdotes illustrating the ease of adolescents and young adults with technology, in the hope that the reader will leap to the "obvious" conclusion that today's youth is qualitatively different (e.g., McNeely, 2005; Windham, 2005). At first glance the technique seems as persuasive as it is common, largely because such anecdotes are readily at hand and are often, in fact, quite stunning.

For example, not long ago, while riding the light rail commuter train through Silicon Valley to the university campus where I teach, I watched a teenager sitting alone, rocking gently to music delivered discreetly from his iPod to the tiny white buds sprouting from his ears, as he thumbed a message into his BlackBerry, sent it out across the ether, and then laughed out loud at the response that arrived seconds later. As I watched this young man, the train made a half-dozen stops, but not once did he look up from his 3-inch screen. And as I sat, voyeuristic, isolated, and silenced by the strange juxtaposition of anonymity and proximity that commonly arises on commuter trains, I realized that he wasn't experiencing the same phenomenon I was at all. He was anything but isolated or silenced. He was in a conversation with somebody on the other end of his BlackBerry. His conversation was transposed from sound to signal and mediated through space not by his vocal cords but by his impressively dexterous teenaged thumbs, yet it was a conversation all the same. And as the stations rattled past, I was struck by two related thoughts: first, how absolutely easy it is to communicate complex ideas and thoughts to people at a distance, and second, how quickly we have become accustomed to that ability. Communicating like this is so commonplace that we usually don't give it a second thought. As technological advances accelerate, the recently amazing becomes merely novel and then quotidian at an equally accelerating pace. Sitting across from me, this teenager personified what is commonly offered as evidence that technology has utterly transformed us culturally, and perhaps even physiologically, or at least those of us who have infused it into our daily lives to the degree he had. But such anecdotes do not by themselves constitute compelling evidence of a profound transformation in how people think.


Neural plasticity

Some writers on the influence of technology on the brain offer what at first glance appears to be more compelling evidence. Prensky and Berry (2001), for example, cite experiments from neurology and cognitive psychology and invoke the concept of "neural plasticity" to explain how structural dissimilarities between the brains of technophiles and Luddites might arise. The idea they refer to is that the brain is shaped by experience, and that hours of online activity result in physical differences in the brain. And indeed, imaging studies demonstrate that physical differences do arise. The observation that techno-brains and Luddite brains differ leads some to a bold conclusion: technology has the power to radically transform minds. Futurists in education view such a transformation in the lives and minds of students as a Herculean challenge compelling us to remake current educational systems to accommodate the new. One need only read the title of Prensky's (2006) "Don't bother me Mom, I'm learning" to guess at the guiding principle behind this philosophy. The clear implication is that as educators we must hitch ourselves to the same wagon that is hurtling headlong into the future with our students aboard, or we risk being rendered useless to them, or worse, a hindrance (Frand, 2000). Indeed, the logic seems to have a certain inescapability to it. That the sentence "He was in a conversation with somebody on the other end of his BlackBerry" makes sense certainly seems to imply radical change.

The logic would be compelling, except for one detail: all learning physically changes the brain. The brain, after all, is not a magical device operating outside the realm of physics; it is made entirely of physical matter and operates entirely in the physical world, and so to function as a brain at all it must change in response to experience. It is not clear exactly how these changes happen – whether new experiences are stored in new synapses or new neurons, whether new skills are stored globally in the cortex or are sequestered in specific neural organs – but it is a banality to say that experiences mark the brain in a physical way. As Steven Pinker (2003) points out, if we didn't have the ability to change our brains in response to experience, we would all be "permanent amnesiacs." Of course our brains change if we roam around in Second Life or play first-person shooter games eight hours a day. So they do if we play baseball eight hours a day, or work at an auto factory, or teach 8th grade. The ability of our brains to respond to sensory input is, in fact, one of their fundamental properties. To be sure, precisely how that response occurs is still quite unclear, but that it occurs is hardly a profound insight. Neural plasticity, Pinker writes, "is just another name for learning and development, described at a different level of analysis" (p. 86).

John Medina, a prominent developmental molecular biologist, cautions against co-opting findings from modern brain science and pressing them into service to support education reform. In a keynote address at the 2009 Learning and the Brain Conference in San Francisco, California, he pointed out that biology professors and education professors frequently work their entire academic careers within rock-throwing distance of each other's offices and never darken each other's doors. The reason, he said, is that biology, and particularly brain biology, has not progressed far enough to be prescriptive for educators. Despite admirable efforts (see, for example, Geake & Cooper, 2003), brain scientists simply do not yet know enough about how the brain does what it does to guide teachers looking to fine-tune their educational approaches (Medina, 2009).


Linguistic relativity

But there can be no argument that there are new things under the sun. The train-riding teenager exemplifies many of them in stereotype. Quantifiable measures of newness exist as well, and in the context of the social sciences perhaps none carries more purity or compelling force than the observation that our very language has been noticeably affected by the pace of our progress. Write a paper about technology using a version of Microsoft Word from the early 1990s and your paper will be littered with words correctly spelled yet underlined in red. Neologists tell us that thousands of new words are added to the English language each year (Algeo, 1980) – words like blog, wiki, email, and Internet – and existing words are pressed into service in new ways, like "friend" as a verb, as in "My mother friended me on Facebook."

To many social scientists, particularly those who subscribe to what Barkow, Cosmides, and Tooby (1992) call the Standard Social Science Model, it is accepted as fact that differences in language inevitably manifest themselves as differences in cognition and behavior. This idea, known as the principle of linguistic relativity (or linguistic determinism in its stronger form), is at the heart of many of the liberation theories espoused by scholars who seek the admirable goals of empowering the oppressed and building more equitable societies. Edward Sapir developed the principle to explore how differences in grammatical structure influence how speakers perceive the world. His student Benjamin Whorf argued that language defines the very nature of the things people are able to think about (Bloom, 1981; Koerner, 2000). Whorf, for example, claimed that the Hopi language is a "timeless" language with a fundamentally different structure that does not reference the passage of time in the way that "temporal" languages like English do, and thus that the Hopi do not, and in fact cannot, conceptualize time in the same way as native speakers of English (Whorf, 1970).

As it turns out, however, the Hopi language does contain references to time passing (as does every other language known to anthropology), and there is no evidence whatsoever that the Hopi think of time differently than any other group of people (Pinker, 1995). Other oft-quoted studies fall apart in similar fashion under examination. For example, the myth that Eskimos have dozens or even hundreds more words for snow than English speakers do turns out to be exactly that – a myth, or as Geoffrey Pullum (1989) refers to it, a story of "unredeemed piffle" (p. 275). A few scholars, such as Lera Boroditsky (2003), have presented convincing empirical evidence of linguistic relativity, but the evidence supports only the "weak versions of the Whorfian hypothesis, namely that words can have some effect on memory or categorization" (Pinker, 1995, p. 55). As Pinker points out, however, this is hardly surprising. And it is also a far cry from the much stronger claim that thought and behavior are shaped in any significant way by language. The principle of linguistic relativity, in other words, can offer no support for the argument that technology is transforming minds.


The illusion of progress

The argument that technology is transforming minds by acting on language falls flat for another reason as well. The rise of technology hasn't required us to learn a new language. At most, we have merely had to master a new vocabulary. And even that is only half the story. What we often forget is that for every new word that makes its way into the language, there is a balancing trend of desuetude (Algeo, 1993). Words fall from use and disappear from the lexicon; they die as surely as they are born. And so while it is true that there are many new words to describe many new things, that fact is actually quite trivial simply because it has always been so. Considering only the arrival of the new while ignoring the passing of the old feeds the illusion that we are witnessing something historically unique.

The same holds true of ideas, and even abilities. We tend to hold a myopic view of ourselves, believing that we are somehow more advanced, more able, different, smarter than our forebears because we know how to edit video and upload it to YouTube or how to manipulate XML code. This perception of historical uniqueness, in turn, suggests the possibility of unique consequences and makes it possible, if not compelling, for us to conclude that our children may be qualitatively different kinds of thinkers than we are. But this particular view reveals an arrogance born of a strong temporal bias toward the now. How many of us, for example, would be able to fend for ourselves if we were suddenly dropped into the agrarian world of a 10th-century farmer? Seen from a longer vantage point, the pattern is one of substitution rather than transformation. The wheeled cart replaces the dragged sled. The car replaces the wagon. The typewriter replaces the quill. The Internet replaces the newspaper. In the large view, rather than creating something entirely new, what our latest explosion of technological advances has done for us, by and large, is to provide us with new ways to do the same old things we've been doing since we drifted out of the Olduvai Gorge, across the Serengeti, and fanned out into Europe, Asia and the Americas.

As anecdotal evidence, I can think of no better illustration of how we have remained essentially and fundamentally ourselves through time than to consider Shakespeare's Sonnet 116:


Let me not to the marriage of true minds
Admit impediments. Love is not love
Which alters when it alteration finds,
Or bends with the remover to remove:
O no! it is an ever-fixed mark
That looks on tempests and is never shaken;
It is the star to every wandering bark,
Whose worth's unknown, although his height be taken.
Love's not Time's fool, though rosy lips and cheeks
Within his bending sickle's compass come:
Love alters not with his brief hours and weeks,
But bears it out even to the edge of doom.
If this be error and upon me proved,
I never writ, nor no man ever loved.

Shakespeare wrote Sonnet 116 about 400 years ago, yet while the text itself may be a bit, well, Shakespearean, the ideas are timeless. It is every bit as up-to-date as anything in last week's New Yorker. Shakespeare didn't write about a kind of love particular to the late 1500s, some quaint or unfamiliar version of our modern experience; he wrote about love! The ideas captured by Shakespeare are independent of time. Notably, they are also independent of technological format. Sonnet 116 would still ring true even if he had tapped it out with his thumbs on a QWERTY keyboard and posted it on his Facebook site. A rose by any other name would smell as sweet, as the saying goes.

There is another illusion hidden here that our myopic view fosters: that the way information is stored and presented is changing in some fundamentally new way. Of course, we do edit videos and upload them to YouTube, and we do edit XML code, and this is different from what came before, but it is in no way fundamentally different. Again, to put things into perspective, it helps to take the long view of how information has been created, stored, and shared through history. In pre-literate cultures, before text, the history of a person, or a people, was oral and pictorial. Stories that painted pictures in the mind's eye were told and retold, carried through the generations, stored in the memories of those to whom the stories belonged. The first writing, or at least the first writing that historians consider coherent, dates to about 4500 years ago in Sumer and Egypt (Coulmas, 1991). With the rise of text, Sumerian and Egyptian cultures experienced information revolutions that surely dwarfed the current one. As writing evolved, Sumerians were able to capture and transport ideas through time and space in ways utterly unlike anything available before writing existed. Egyptians could project their ideas forward in time or revisit a past that had been crystallized verbatim. Writing allows this. Text, it has been argued, is as remarkable, as powerful, as any technological invention the world has seen. In other words, text is a supernova of an idea, and the invention of hypertext doesn't hold a candle to it.

But as remarkable and important to the history of humanity as the invention of writing certainly is, it is a stretch to conclude that it fundamentally changed us, physiologically, intellectually, or in any other way relevant to the foundations of education. Evidence of this lies in the content of the earliest writings. Scholars spend years interpreting ancient texts and decode for us not the unimaginable, not the unrecognizable, not even the bizarre, but the familiar - laws that codify socially adaptive behavior, trading logs that track mercantile exchange, recipes, medical texts, stories of families, of wars, of disasters and struggles, of triumphs and of myths. Add to this the fact that anthropological studies of non-writing indigenous peoples have consistently failed to find distinct differences between writing and non-writing cultures attributable to the presence or absence of text itself (Olson, 1996), and there is only one viable conclusion: text does not seem to have changed what it is to be human at all.

What text does is preserve information in a linguistic form so that the information can be retrieved independently of the initial act of formulation. What hypertext adds to this is the ability to connect one piece of information so stored to another piece more readily. But the information language expresses, text preserves, and hypertext links together is, fundamentally, the same as it ever was – human stories, even from 40 centuries ago, still make sense to us. For all of our account passwords and our cloud computing, our linked lives and smartphones, our ability to access the world's information instantly and to communicate with anyone anywhere anytime, we are quite the same as our forebears. We may use different tools to do things in new and different ways, but we're still doing the same old things. As educators, we should keep this simple fact in mind as we think about the relationship between education and technology: we are still who we have always been.


Digital natives

So in this light, I think about the young man with the earbuds. Who is he? What's he like? How is he like me? How is he different? As an educator, how should I respond to him? Michael Wesch, a professor of cultural anthropology, and his students at Kansas State University produced a YouTube video called A Vision of Students Today (Wesch, 2008) that offers a powerful description of the modern student, one consistent with what I observe among my own Silicon Valley students, who bring their laptops to class, check email during breaks (if not during class while I'm talking to them), keep blogs, follow each other on Twitter, and update their Facebook pages with something approaching obsession. As Wesch's video illustrates, today's students are, to use Prensky's phrase, digital natives.

But it is important not to fall prey to the power of nostalgia when looking back to earlier times, say, two decades ago, before the rise of the World Wide Web and before anyone outside ARPANET had an email address. The tools we used then to access, explore, produce, and share information have certainly changed - drastically so - but when I watch Wesch's video I see nothing to indicate that students themselves have changed significantly in twenty years. We should keep in mind that Prensky's evocative characterization of the young as digital natives and the old as digital immigrants isn't really an anthropological classification marking a line between two distinct cultures, much less a signifier of physiological or neurological differences that manifest themselves as differences in the way we think. It is merely a clever turn of phrase. Despite the fact that when I'm forced to type with my thumbs I'm all thumbs, and despite the fact that I can't get across the first screen of World of Warcraft without some 14-year-old handing me my hat, the difference between us - digital immigrants and digital natives - is not that the minds of the young have been sculpted into something unfamiliar to the old by endless hours of gaming, texting, surfing, and cell-phone talking, or by the words they use when they speak. What separates us is that they are growing up in different times, with different demands, and with different diversions. In other words, today's students don't think differently. They're simply interested in different things. And as educators, it's important to realize that we're not seeing anything particularly new. For example, here is perhaps a fairly accurate description of today's youth:

The children now love luxury. They have bad manners, contempt for authority, they show disrespect to their elders.... They no longer rise when elders enter the room. They contradict their parents, chatter before company… and are tyrants over their teachers.

The originator of the description is unknown – it is often attributed to Socrates, probably incorrectly – but regardless of the source, what is clear is that complaints of this kind are very old indeed, and the passage makes the point that the youth of any day are different from their parents and their teachers, and that parents and teachers always bemoan it. (Socrates also believed writing was harmful because it weakened memory. But as Dennis Baron [2009] wryly points out, "We remember what Socrates said because Plato wrote it down.")

Our students aren't different from us because they game, blog, and communicate with their thumbs; they're different from us simply because they're our students, and they are younger than we are. What is easily seen in many educative environments today, and is often incorrectly attributed to some qualitative difference in brain structure and function, is actually something much more conventional. To quote Prensky somewhat out of context, the reality is that our students "accustomed to the twitch-speed, multitasking, random-access, graphics-first, active, connected, fun, fantasy, quick-payoff world of their video games, MTV, and Internet are bored by most of today's education, well meaning as it may be" (Prensky, 2001, p. 5). In other words, in too many classrooms, between teachers and students, it's the same old story. Students find schoolwork boring. They'd rather be doing something else.


Implications for educators' response

The idea that today's students are qualitatively different from us is an illusion given form and structure by well-meaning, if overexuberant, advocates of educational reform. Worryingly, however, it is an illusion with potentially dangerous consequences. Bennett, Maton and Kervin (2008) contend that imbuing the debate about how to use technology in education with a sense of impending crisis incites the "academic equivalent of a moral panic that restricts critical and rational debate" (p. 776). The illusion also abets the construction of an artificial divide between teacher and student that forecloses many avenues of meaningful connection between them. Recasting mere differences in interests and behaviors as significant differences in abilities, rooted in structural differences in minds or mental capacities, can make teachers feel inadequate to the task of teaching this "new breed" of student. This is especially true of teachers with more classroom experience, who research suggests are more likely than their younger peers to lack strong technology backgrounds (Russell et al., 2007), when in fact it is often these veteran teachers who have the most to offer. Finally, the illusion reifies the culturally determined point of view that technology in education is merely a neutral tool that, used correctly, cannot help but result in improvement (Bowers, 1993, 2000; Cuban, 1986, 2001).

The fundamental challenge facing modern educators is the same as it has always been: to engage students in meaningful learning and to help them as they grow. The essence of good teaching has not changed. What is changing (and this is very good news) is the set of tools teachers have at their disposal to reach, engage, and guide students. This is where educational technology's impact is strongest – in relation to decisions teachers make about how they teach. New tools for learning have widened the landscape of possible answers to the questions, "What are you interested in learning? How would you like to learn it?" Recently, for example, one of my students, a pre-service teacher, was planning a "technology lesson" as part of a teaching credential requirement - in this case, a summary lesson about mitosis. In the past, this might typically have involved gathering up a class of students, walking them down to the computer lab, and sending them out into cyberspace in search of information on the given topic. Instead, this teacher asked her students how they would like to express what they had learned. After discussing some possibilities, the class decided to create an animated video of mitosis and upload it to YouTube. An archaic school policy prevented them from disseminating their work online, but the learning took place all the same. The students produced their mitosis video last year, but if I asked them today to tell me about mitosis, I think they could probably do it. At the very least, they could probably do it better than the kids across the hall who went on a WebQuest instead.

The point here is simple: in many regards, technologists and futurists are correct to encourage the use of technology in education. Clearly, computer and networking technologies are here to stay, and schools and teachers must respond to the new demands of society and the new interests of their students. But it is counterproductive to cast the changes that we must make as fundamental. Doing so damages our ability to critically examine the landscape of choices and possibilities before us, and their consequences. It also devalues the act of teaching and prevents us from recognizing that "stability in teaching practices and the craft of teaching are positive forces in schools, maintaining a delicate balance amidst swiftly changing public expectations" (Cuban, 1986, p. 7). Technology will undoubtedly help us design and teach better lessons and will allow students to interact with information and with each other in pedagogically effective and powerful ways, and we would be wise to examine critically how best to use the new opportunities technology offers to the advantage of our students and our society. But the real heavy lifting won't be technological, because the essence of teaching is not technological. This idea is captured nicely in a cliché known as Edward's law: technological solutions don't apply to sociological problems. In the case of technology and its relation to education, the cliché holds. As we move into the future, we would do well to keep this in mind.


References

Algeo, J. (1980). Where do all the new words come from? American Speech, 264-277.

Algeo, J. (1993). Desuetude among new English words. International Journal of Lexicography, 6 (4), 281-293.

Baron, D. (2009). The web of language. Retrieved February 24, 2010 from https://illinois.edu/db/view/25/13953

Bennett, S., Maton, K., & Kervin, L. (2008). The 'digital natives' debate: A critical review of the evidence. British Journal of Educational Technology, 39 (5), 775-786.

Bloom, A. H. (1981). The linguistic shaping of thought: A study in the impact of language on thinking in China and the West. Hillsdale, NJ: L. Erlbaum.

Boroditsky, L. (2003). Linguistic relativity. In L. Nadel (Ed.), Encyclopedia of cognitive science (pp. 917-922). London: Macmillan.

Bowers, C.A. (1988). The cultural dimensions of educational computing: Understanding the non-neutrality of technology. New York: Teachers College Press.

Bowers, C. A. (1993). Education, cultural myths, and the ecological crisis: Toward deep changes. Albany, NY: State University of New York Press.

Bowers, C. A. (2000). Let them eat data: How computers affect education, cultural diversity, and the prospects of ecological sustainability. Athens, GA: University of Georgia Press.

Bowers, C. A. (2008). Why a critical pedagogy of place is an oxymoron. Environmental Education Research, 14 (3), 325-335.

Coulmas, F. (1991). The writing systems of the world. Oxford, UK: Blackwell Publishing.

Cuban, L. (1986). Teachers and machines. New York: Teachers College Press.

Cuban, L. (2001). Oversold and underused: Computers in classrooms, 1980-2000. Cambridge: Harvard University Press.

Danto, A. C. (1991). Encounters and reflections: Art in the historical present. New York: The Noonday Press.

Ellison, M. (1994). The effect of computer and calculator graphics on students' ability to mentally construct calculus concepts. (Doctoral dissertation, The University of Minnesota, 1993). Dissertation Abstracts International, 51 (11), 4020. Abstract retrieved February 4, 2010 from First Search/Dissertation Abstracts International database.

Feenberg, A. (2002). Questioning technology. New York: Routledge.

Frand, J. (2000). The information-age mindset: Changes in students and implications for higher education. Educause Review, 35 (5), 14-24.

Gaston, J. (2006). Reaching and teaching the digital natives. Library Hi Tech News, 23 (3), 12-13.

Gros, B. (2003). The impact of digital games in education. First Monday, 8 (7). Retrieved February 12, 2010, from http://www.firstmonday.org/issues/issue8_7/xyzgros/index.html

Howe, N. & Strauss, W. (2000). Millennials rising: The next great generation. New York: Vintage.

Howe, N., & Strauss, W. (2003). Millennials go to college. Washington, DC: American Association of Collegiate Registrars and Admissions Officers.

Koerner, E. F. K. (2000). Towards a full pedigree of the Sapir-Whorf Hypothesis: From Locke to Lucy. In M. Pütz & M. H. Verspoor (Eds.), Explorations in linguistic relativity (pp. 1-23). Philadelphia: John Benjamins North America.

McHale, T. (2005). Portrait of a digital native. Technology and Learning, 26 (2), 33–34.

McKnight, D. & Robinson, C. (2006). From technologia to technism: A critique on technology's place in education. Technology, Humanities, Education, and Narrative, 3. Retrieved November 20, 2009 from http://thenjournal.org/feature/116/

McNeely, B. (2005). Using technology as a learning tool, not just a cool new thing. In D. Oblinger & J. Oblinger (Eds.), Educating the Net generation (pp. 4.1–4.10). Boulder, CO: Educause. Retrieved February 12, 2010 from http://www.educause.edu/educatingthenetgen

Oblinger, D. (2003). Boomers, Gen-Xers & Millennials. Understanding the new students. Educause Review, 38 (4), 37-47. Retrieved February 2, 2010 from http://www.educause.edu/ir/library/pdf/ERM0342.pdf

Olson, D. R. (1996). The world on paper: The conceptual and cognitive implications of writing and reading. Cambridge University Press.

Pinker, S. (1995). The language instinct: The new science of language and mind. London: Penguin.

Pinker, S. (1997). How the mind works. New York: Norton.

Pinker, S. (2003). The blank slate: The modern denial of human nature. London: Penguin.

Prensky, M. (2001). Digital natives, digital immigrants. On the Horizon, 9 (5), 1-6.

Prensky, M. (2003). Digital game-based learning. Computers in Entertainment (CIE), 1 (1), 21.

Prensky, M. (2005). Listen to the natives. Educational Leadership, 63 (4), 8.

Prensky, M. (2006). Don't bother me Mom, I'm learning. St. Paul, MN: Paragon House.

Prensky, M., & Berry, B. D. (2001). Do they really think differently? On the Horizon, 9 (6), 1-7.

Pullum, G. (1989). The great Eskimo vocabulary hoax. Natural Language & Linguistic Theory, 7 (2), 275-281.

Russell, M., O'Dwyer, L. M., Bebell, D., & Tao, W. (2007). How teachers' uses of technology vary by tenure and longevity. Journal of Educational Computing Research, 37 (4), 393-417.

Tapscott, D. (1998). Growing up digital: The rise of the Net generation. New York: McGraw-Hill.

Wesch, M. (2008). A vision of students today. Retrieved June 17, 2009 from http://www.youtube.com/watch?v=dGCJ46vyR9o

Whorf, B. L. (1970). Science and linguistics. Indianapolis: Bobbs-Merrill.

Windham, C. (2005). Father Google & mother IM: Confessions of a net gen learner. Educause Review, 40 (5) 42–59.