April 19, 2011
In a recent seminar organised by the Oxford Internet Institute, Baroness Susan Greenfield (Professor of Synaptic Pharmacology at Lincoln College, Oxford) urged caution about the extensive use of modern technology, arguing that technology alters the way people think – a condition she referred to as ‘mind change’ – and is likely to have serious adverse effects.
Airing her concerns about children’s increasing exposure to screen technologies, she suggested that whilst such sensory stimuli might give us a higher IQ and a better short-term memory, they could also shorten our attention span, lower our ability to develop important traits like empathy, and weaken our capacity to handle abstract concepts or evaluate risk. Claiming that this is an issue almost as significant as climate change, she called for immediate action from the government and researchers (see her talk, book and article for examples of her views on this issue).
Naturally, we completely agree that there is a need for much research in this area, and many of us are working to contribute to that effort. However, Greenfield appears to begin with the premise that the extensive use of technology is essentially negative – a position based on her apparently limited experience of children and screen technologies rather than on any established studies.
Greenfield began her presentation with a valuable and clearly explained description of the brain’s plasticity: how it adapts and evolves through individuals’ experiences of the world. It is precisely this biological makeup that makes our brains susceptible to changes in environment and experience.
This much was uncontroversial, but Greenfield then asserted that the use of screen technologies involves little or no social interaction, which is an essential part of human development. However, she chose to illustrate her point with a picture of two boys playing a traditional console game. Leaving aside the fact that recent console technologies have prioritised social interaction (such as the Nintendo Wii), the two boys in question looked to us as if they were clearly interacting with each other whilst having fun! Indeed, studies of domestic gaming environments reinforce this view (see, for example, Voila and Greenberg, 2010).
Greenfield also compared computer games and books, commenting that she could not see how qualities like honour could be instilled in people through playing games like World of Warcraft (WoW), whereas reading a book like War and Peace would bring about that effect (in fact, she appeared to be comparing the single game WoW with the entire body of classical literature). In any case, those of us who are familiar with WoW or similar games know that loyalty, respect and honour are at the very heart of them. It is impossible to progress in WoW without demonstrating and upholding such important human qualities (see, for example, Williams et al., 2006).
With modern technology becoming more integrated and intertwined with our daily lives, it is bound to have some effect. Some will be bad, some might be positive – and too much of anything, be it good or bad, could have adverse effects. However:
“One can no more ask, ‘How is technology affecting cognitive development?’ than one can ask, ‘How is food affecting physical development?’ As with food, the effects of technology will depend critically on what type of technology is consumed, how much of it is consumed, and for how long it is consumed.” (Bavelier, Green and Dye, 2010, Children, Wired: For Better or for Worse, p. 692)
Being too fixated on a strictly dichotomous ‘good/bad’ classification could prevent us from focusing on a more important task: developing a holistic understanding of how technology might affect the way we think and make sense of the world around us.
There is no doubt that this central question – how is technology affecting cognitive development and how might we harness or limit these effects – is an important one. By all means, let us keep the conversation going and push for more research, but let us start that research with an open mind. Insights from domain experts are always valuable but, in the absence of proper research, they risk becoming premature speculation that may not only be unconstructive but may also impede progress.
(joint blog post by Wayne and Wan-Ying)
August 31, 2010
Many writers have proposed that new technologies have negative consequences for our abilities to think, learn and process information – often citing very specific lab-based experiments, and typically resting their ideas on (sometimes pseudo-) understandings of neuroscience.
In the Times two weeks ago Nicholas Carr suggested that, “We seem to have arrived at an important juncture in our intellectual and cultural history, a moment of transition between two very different modes of thinking. Calm, focused, undistracted, the linear mind is being pushed aside by a new kind of mind that wants and needs to take in and dole out information in short, disjointed and overlapping bursts – the faster the better.”
The problem I have with these kinds of claims is that we never consider people as people – we always study them in a specific context doing a specific task. Individuals live both in the real world and (to varying degrees) engage in the online one. All of these experiences are likely to have some kind of influence on our brains and, as a result, on our abilities to deal with information – not just those experiences that occur online.
We are all scanning and filtering more information online – but that is because we have to. We all can (and do) still fully engage with texts over long periods of time when we need or want to. The idea, suggested by Nicholas Carr and others, that this skill is somehow removed simply because we spend time surfing the web is not what I see in the students I teach or the young people I study in my research.
What we do need is more informed research and greater discussion between the disciplines. Neuroscience has a lot to add to this debate – see, for example, Blakemore and Frith’s great book, The Learning Brain.
Is the Internet making us stupid? I doubt it…