It is not unusual, these days, for children to spend five hours a day looking at computer screens or mobile phones. That’s five hours a day that they’re not exploring their neighborhoods, feeling the wind in their hair, or giving someone a hug. Yet it seems we are sleepwalking into this new cyberculture without even questioning its impact on our brains.
The human brain is exquisitely talented at adapting to its environment. Its plasticity has allowed humans to occupy more ecological niches than any other species on the planet. But this adaptability also means that as our environments change in unprecedented ways, so, too, do our brains.
In our eagerness to embrace technology, we have created environments that are almost certainly impacting our brains, and, more pressingly, the brains of a younger generation that has never known anything else.
Look no further than the fact that we are constantly connected through Facebook and Twitter. We are developing the attention spans of gnats; many of us have given up privacy in return for “celebrity”; and we take it as a given that it is desirable to be connected all the time. Is it really desirable? I’m not convinced.
We now live in a strange, narcissistic world where even the most banal aspects of our lives are fodder for digital output. We constantly seek the validation of others through our tweets and status updates. Adults might be equipped for these changes, but how is this impacting the development of children, who are increasingly living their lives and forming their identities through the eyes of virtual others?
I find it scary that children can’t reflect on whether they feel excited, happy, let down, disappointed, or frustrated without also considering whether something is “Facebook-worthy.”
Young people are also missing out on the opportunity to rehearse basic skills like eye contact, body language, and recognizing conversational turns – the kind of things one learns by practicing face-to-face communication. Consequently, the development of complex emotions like empathy is now at risk. Perhaps this is why we get the kind of ruthless bullying that now occurs on Facebook pages, or the spiteful remarks that we find in comments sections online. Might the next generation never experience the kind of accountability that comes naturally through face-to-face interaction?
In her recent book, Alone Together, MIT psychologist Sherry Turkle argues that, paradoxically, the more connected people become online, the less comfortable they tend to feel with themselves. The disconnect between the selves that we portray on Facebook and our real selves is likely familiar to many readers.
All of this is to say nothing of the impact that things like gaming are having on risk-taking, violence, and addiction, how our reliance on search engines is affecting our capacity for higher-order cognition, or even how interaction with a two-dimensional screen is destroying our ability to focus.
Critics will point out that we don’t yet have the definitive scientific proof needed to make causal claims about these things. But the initial trends and correlations already surfacing in the scientific literature, and in our communities, are evidence enough to at least encourage us to stop being so complacent.
Consider the 1950s, when the increased incidence of lung cancer correlated strongly with cigarette consumption, but there was still no conclusive proof of causation, so nothing was done. It seems foolish, now, that we failed to take precautions then.
Modern technology’s impact on our brains poses a complex, multifaceted, and significant risk to humanity. I have coined the term “mind change” to emphasize the dangers that we face in this regard. My hope is that echoing the phrase “climate change” might encourage people to take the issue more seriously.
Unlike with climate change, there is more we can do about mind change than simply limit the damage. Governments can begin to invest in the kind of longitudinal studies and other important scientific research that we need to better understand how our brains are changing. We can also push social-networking sites, video-game designers, and other industries to create more sophisticated digital experiences – virtual worlds that provide meaning and depth.
I’m certainly not suggesting that we should turn back the clock or take hatchets to our computers. I don’t question the fantastic benefits the cyberworld has brought to our societies. But it would be ironic if the technologies that have enabled such incredible progress deprived us of the ability to maximize our individual potential.
Original Article
Source: The Mark News
Author: Susan Greenfield