Today marks the beginning of Screen-Free Week, when children throughout the nation are encouraged to unplug from the screen to rediscover the joys of childhood, the richness of relationships, and the adventure of stillness.
In anticipation of Screen-Free Week, I spent some of my weekend reviewing a number of studies on how digital addiction and information overload are rewiring our children’s brains. I summarized some of these studies in my TSM article “The Dangers of Digital Addiction and Information Overload: How I Discovered that Silence is Good for my Brain.”
One piece of research that was fascinating (and somewhat alarming) showed that simply being able to see a smartphone, even when you’re not using it, hinders a person’s ability to focus on tough tasks. Having a tablet or smartphone in the room, whether or not one actually uses it, leads to a reduced attention span and poorer performance on tasks, especially tasks requiring high levels of attention or cognitive ability. The mere presence of a smartphone in the same room also weakens a person’s ability to connect with other people, especially when something meaningful is being discussed.
In Part 2 of my interview with Graham Taylor about brain fitness, I shared what I learned from Nicholas Carr’s book The Shallows about mental schemas:
We’ve probably all seen people who have an ability to learn information quickly, perhaps when studying for a test, but then they forget it afterwards, and then we know other people who are able to achieve content mastery. What’s the difference? The difference is that in order for content mastery to occur, let alone understanding and wisdom, the brain has to move beyond massed practice and even memorizing; rather, the brain needs to start schematizing. This is because schemas serve as hooks on which to fasten new information. Without our brain’s ability to create schemas, without a sense of the connectedness of things, everything we learn would be simply a random collection of disconnected facts and there would never be any true understanding.
Nicholas Carr puts it like this in his book The Shallows: “The depth of our intelligence hinges on our ability to transfer information from working memory to long-term memory and weave it into conceptual schemas.” And again: “…brain scientists have come to realize that long-term memory is actually the seat of understanding. It stores not just facts but complex concepts, or ‘schemas.’ By organizing scattered bits of information into patterns of knowledge, schemas give depth and richness to our thinking. ‘Our intellectual prowess is derived largely from the schemas we have acquired over long periods of time,’ says Sweller. ‘We are able to understand concepts in our areas of expertise because we have schemas associated with those concepts.’”
Carr points out that although the mental skill of schema formation is needed today more than ever, it is in jeopardy from technologies that orient us towards a state of continuous partial attention. As concentrated attention spans and focus become replaced by broad attention ranges and multitasking, what is lost is the type of slow, methodical, systematic and linear cognition that favors the formation of schemas in the long-term memory. In order for the brain to build up schemas effectively, a person has to reflect deeply about her life and what she has learned, and this reflection needs to occur in a slow and undistracted manner. When this is not the case, or when we form schemas badly, then the brain easily falls prey to oversimplifications. We see this all the time in our public political discourse, where issues are deliberated upon in isolated compartments that are often dominated by ideology, resulting in gross oversimplifications.
Also, the brain has to be nimble and flexible enough to adjust our schemas in light of new information we receive through knowledge, experience and personal growth. Sometimes we learn or experience things that do not fit within our existing neurological schemas, and so the brain has to alter existing schemas or create new neural pathways, a process known as accommodation. That’s where intellectual humility, mental flexibility and open-mindedness become really important. But when our thinking is dominated by impressions, by emotional reasoning or ideology, then we become closed-minded and stuck in schemas that can actually detach us from reality.
The solution is to engage constantly in deep intellectual reflection, to eschew what Socrates called “the unexamined life.” Deep intellectual reflection is to a healthy brain what water is to a healthy plant. People have known that since before Socrates, but now we have the brain science to go with it.
Moore’s law, which expresses itself in computers becoming smaller and smaller, seems to parallel what is happening in our machine-mediated discourse. Our public discourse has been shrinking at a rate rivaling the speed at which the integrated circuit has diminished in size.
When fax machines first appeared, it was like magic precisely because they could transmit so much text. I remember standing in wonder at the fax machine in my father’s bookstore as it dropped page after page on the floor. When email appeared, it was again astonishing that so much text could be sent over the computer. People would spend hours crafting careful email messages that drew on the tradition of letter writing.
That didn’t last very long. As our communication media have evolved through instant messaging, text messaging and finally Twitter, what attracts us is not length but brevity. Our communication media orient us to eschew complexity and depth, to give preference to what is brief and transitory.
At least, that is what dawned on me this morning when reading Nicholas Carr’s chapter on Twitter in his brand-new book Utopia is Creepy and Other Provocations. This chapter, which is a reprint of Carr’s 2007 blog post, points out that Twitter’s great accomplishment has been to fragment the fragments, enabling us to turn any event in our lives, no matter how trifling, into a headline. Twitter dignifies the banal and glorifies the boring by enabling us to turn any experience into a stop-the-press bulletin. Twitter thus “wraps itself and its users in an infantile language” in which we can take refuge in the insignificant. Carr’s closing paragraph connects Twitter to emerging Virtual Reality technologies:
As the physical world takes on more of the characteristics of a simulation, we seek reality in the simulated world. At least there we can be confident that the simulation is real. At least there we can be freed from the anxiety of not knowing where the edge between real and unreal lies. At least there we find something to hold onto, even if it’s nothing.
I have really benefited from these talks presented by Father Maximos. Father Maximos shares practices we can all do to cultivate inner prayer in times of constant distraction, and he also has some helpful things to say about mindfulness and the history of the Philokalia. The talks are totally free to download from Patristic Nectar.
Most of the time we do grow more skilled at the things we practice, whether it’s learning to play the violin or speak French. But studies show that multitasking falls into the weird category of behaviors that go against this norm: the more you practice it, the worse you become. In 2009, researchers at Stanford found that those who multitasked frequently and believed that it boosted their performance were actually worse at multitasking than those who preferred not to multitask. This is sobering: if you multitask a lot and think you’re good at it, the statistical likelihood is that you are actually a very bad multitasker. The more you practice it, the worse you become.
This research seems counter-intuitive and weird. How could practicing an activity make a person worse at that activity? From a neurological perspective, however, this is not surprising. Study after study has found that in order for the higher functions of the brain to flourish, the brain needs to be given frequent and regular spaces of silence, as well as spaces of deep undistracted attentiveness to a single activity. What both silence and attentiveness share in common is that they depend on the brain being able to weed out incoming stimuli. In other words, for the brain to work properly, it needs times when it is not multitasking. Times of quiet, as well as times of undistracted focused activity, act as incubation periods in which the brain consolidates what it has learned, much as a computer defragments its disk to weed out the junk. Of course, undistracted focus is not possible in an environment of multitasking.
The following is an excerpt from my TSM article ‘Recovering Quiet in an Age of Noise.’
I recently published an article for the Taylor Study Method titled ‘Recovering Quiet in an Age of Noise.’ This article continues to explore my ongoing interest in digital distractions and email addiction, but this time I’m approaching the topic from a personal angle and sharing my own journey. I share what I discovered when I went from having internet only on my computer to having it strapped to my side at all times. To read my article just click on the following link:
Recovering Quiet in an Age of Noise
(This post is a condensed and re-organized version of two earlier blog posts.)
“Technology tends to see reality as heaps, as a conglomeration of fragments that somehow are put together by someone in order to obtain something . . . that don’t have any inner order or interiority that is resistant to human manipulation.”
When Apple unveiled its new Apple Watch Series 2 at this year’s long-anticipated launch, news of the new smart-watch was overshadowed by reactions to the iPhone 7. Yet the underpublicized news that the Apple Watch is soon to be equipped with Pokémon Go is perhaps of greater significance than the annoying fact that Apple has decided to remove the headphone jack from the iPhone.
As I mentioned in last week’s post ‘Unbundled Reality and the Anti-Poetry of Pokémon Go,’ I only found out about Pokémon Go a few months ago after it had already taken over the world. The hugely popular augmented-reality game has been downloaded over 500 million times since its release this summer, bringing in over $10 million a day to the developers. The game enables users to find and capture virtual creatures that have been placed in real-world locations and which are only visible to those with the right electronic devices.
Pokémon in “Real Time”
Not long after digital books started becoming readily accessible on the internet, I began hearing that one of their advantages was that they enabled key sections of a book to be extracted from the larger context. Instead of having to read the whole book, a person can use search tools and navigational aids to jump straight into the best sections.
What really caught my attention, however, is when I began being told that eventually the context of a book, even a work of fiction, might pass into irrelevancy as an anachronistic relic of our literary past. Instead, sections of literary works might come to be organized according to new fluid contexts that emerge organically from algorithms based on user preferences. In an article on the post-literary mind, Mark Federman called this emerging model “the UCaPP world” (UCaPP stands for “ubiquitously connected and pervasively proximate.”) Federman described this as
“a world of relationships and connections. It is a world of entangled, complex processes, not content. It is a world in which the greatest skill is that of making sense and discovering emergent meaning among contexts that are continually in flux. It is a world in which truth, and therefore authority, is never static, never absolute, and not always true.”
Steven Johnson’s 2005 book Everything Bad is Good for You: How Today’s Popular Culture is Actually Making Us Smarter, offered an enormous boon to everyone who wanted to defend the neurological virtues of our increasingly technologized and media-driven culture. I remember hearing Doug Wilson speak at our former church, where he cited Johnson’s book–which was then required reading at New Saint Andrew’s College–as balancing out the work of Neil Postman.
In Nicholas Carr’s book The Shallows: What the Internet is Doing to our Brains, Carr responded to Johnson’s claim that book reading understimulates the senses compared to using a computer. Here is Carr’s rebuttal:
Steven Johnson, in his 2005 book Everything Bad is Good for You, contrasted the widespread, teeming neural activity seen in the brains of computer users with the much more muted activity evident in the brains of book readers. The comparison led him to suggest that computer use provides more intense mental stimulation than does book reading. The neural evidence could even, he wrote, lead a person to conclude that ‘reading books chronically understimulates the senses.’ But while Johnson’s diagnosis is correct, his interpretation of the differing patterns of brain activity is misleading. It is the very fact that book reading ‘understimulates the senses’ that makes the activity so intellectually rewarding. By allowing us to filter out distractions, to quiet the problem-solving functions of the frontal lobes, deep reading becomes a form of deep thinking. The mind of the experienced book reader is a calm mind, not a buzzing one. When it comes to the firing of our neurons, it’s a mistake to assume that more is better.