THE ATLANTIC. Students admit their devices distract them from class. Now, some educators are incorporating the digital tools into their teaching.
[Image: Dierk Schaefer/Flickr]
Smartphones, iPads, TVs, computers, video games. Technology is omnipresent, especially for young students, and they just can’t get enough: one 2013 study found that college students check their digital devices for non-class purposes 11 times per day on average, and 80 percent of them admitted that the technology distracted them from class. This has some educators and scientists concerned: Are students distracted because their brains are hard-wired for it after a lifetime of screens? Or is there a cultural or behavioral element to the fixation that has infiltrated the classroom?
As scientists work to answer these questions, educators are finding ways to adapt to students’ changes—whatever their cause—and use the technology to their advantage, especially in courses focused on science, technology, engineering, and math (STEM).
When scientists talk about how technology affects the brain, there is one particular network they focus on most often: executive function, the set of mental processes at the root of working memory, which underpins tasks like remembering instructions and abilities such as multitasking and paying attention. While the most rapid brain development occurs before the age of five, people don’t hit their peak executive function until their late 20s.
A number of studies have shown the connection between stimuli and executive function. One of the most famous was conducted in 2011 and is commonly known as the “SpongeBob study.” The research revealed that four-year-olds experience impaired executive function after watching the cartoon for just 10 minutes. In a separate 2011 study, researchers found that teens who are addicted to the Internet have abnormal neural pathways, which are tied to executive function.
While scientists have not yet looked at how this kind of stimulation affects executive function in the long term, there’s reason to think it might. This has to do with our understanding of neuroplasticity, or how an individual’s brain changes over time depending on how that person uses it. During adolescence, each person’s brain weeds out the pathways that it uses less often in a process called neural pruning, said Gary Small, a psychiatry professor and director of the University of California, Los Angeles, Longevity Center at the Semel Institute for Neuroscience and Human Behavior. “It’s hard to imagine that the way you’re using your brain at a young age isn’t going to affect the pruning process,” he added. In other words, if you spent your youth in front of screens, it would make sense that your adult brain would be hard-wired to process information at a frenzied pace.
But Daniel Willingham, a psychology professor at the University of Virginia in Charlottesville, says it’s still too early to draw conclusions about any long-term effects. “I don’t think there’s really good evidence that there is any fundamental change to cognition as a consequence of technology,” he said. “Anyone who says anything about the long-term effects is going to be guessing.” Findings like those of the SpongeBob study only capture short-term impacts on executive function, Willingham said. “The thing about that study is that the measures were taken right after the kids watched the videos, so a better interpretation is that SpongeBob makes you tired,” he added. If technology had changed kids’ brains, we would be seeing the effects of the inattention in places other than classroom behavior, Willingham said. “You’d predict a significant dive in standardized test scores over the time frame you’re guessing kids have been heavy users of digital technology,” he said. The data just doesn’t reflect that, so the jury is still out when it comes to technology’s long-term effects on the brain.