The year is 2060. Professor Updike stands to take the podium for the keynote speech at his university’s annual communications conference.
Professor Updike is a clean-shaven African American man in his mid-forties. To the audience, however, these details are irrelevant. Everyone in attendance is wearing virtual reality glasses—a technology that allows each person to customize their own reality and seamlessly overlay that reality onto the physical world. This technology, at one time experimental and cumbersome, has now become normal and ubiquitous. In fact, it has become unusual not to see people wearing these glasses, although there remain some neo-Luddite holdouts in the rural areas.
Through their VR glasses, some people see Professor Updike as he would have looked twenty-five years ago as an undergraduate. Others have adjusted their VR settings to see him as a white person, or another race of preference. For still others, the professor appears to be giving his speech completely nude.
In the video below, Pulitzer Prize finalist Nicholas Carr shares evidence from brain science about what happens when our devices (particularly smartphones) infuse into our lives perpetual distractibility, multitasking, and split attention. He shows that what science is finding (and you can see footnotes to the actual peer-reviewed studies in Carr’s book The Shallows) is that there is a trade-off whereby certain cognitive functions become diminished.
That much isn’t surprising, but what I found really interesting is which cognitive functions are compromised.
Most of the time, of course, we do grow more skilled at the things we practice, whether it’s learning to play the violin or speak French. But studies show that multitasking falls into the weird category of behaviors that go against this norm: the more you practice it, the worse you become. In 2009, researchers at Stanford found that those who multitasked frequently and believed that it boosted their performance were actually worse at multitasking than those who preferred not to multitask. Indeed, when measured on the same established dimensions of cognitive control, the group with a lifestyle of frequent multitasking performed worse than the group of light multitaskers.
This is sobering: if you multitask a lot and think you’re good at it, there is a statistical likelihood that you are actually a very bad multitasker. The study also found that multitasking makes a person “more susceptible to interference from irrelevant environmental stimuli and from irrelevant representations in memory.”
This morning, while doing some research for a couple of clients, I came across two interesting articles that seemed to connect.
One article was a piece by Rod Dreher about his time at the recent Society of Classical Learning (SCL) conference. In the piece, titled “The Problem with ‘Worldview’ Education,” Dreher shared Joshua Gibbs’ insight that “real art is not something that calls forth an immediate response. You have to contemplate it, turn it over in your mind for a while.” Gibbs went on to suggest that one of the casualties of the worldview-based approach to education is that the rush to analyze texts through a worldview grid can prematurely foreclose–or even completely short-circuit–this necessary process of wondering about and contemplating texts.
Moore’s law, one consequence of which is that computers keep becoming smaller and smaller, seems to parallel what is happening in our machine-mediated discourse. Our public discourse has been shrinking at a rate that rivals the speed at which the integrated circuit has diminished in size.
When fax machines first appeared, it was like magic precisely because they could transmit so much text. I remember standing in wonder at the fax machine in my father’s bookstore as it dropped page after page on the floor. When email appeared, it was again astonishing that so much text could be sent over the computer. People would spend hours crafting careful email messages that drew on the tradition of letter writing.
That didn’t last very long. As our communication media have evolved through instant messaging, text messaging and finally Twitter, what attracts us is not length but brevity. Our communication media orient us to eschew complexity and depth, to give preference to what is brief and transitory.
At least, that is what dawned on me this morning when reading Nicholas Carr’s chapter on Twitter in his brand-new book Utopia is Creepy and Other Provocations. This chapter, which is a reprint of Carr’s 2007 blog post, points out that Twitter’s great accomplishment has been to fragment the fragments, enabling us to turn any event in our lives, no matter how trifling, into a headline. Twitter dignifies the banal and glorifies the boring by enabling us to turn any experience into a stop-the-press bulletin. Twitter thus “wraps itself and its users in an infantile language” in which we can take refuge in the insignificant. Carr’s closing paragraph connects Twitter to emerging Virtual Reality technologies:
As the physical world takes on more of the characteristics of a simulation, we seek reality in the simulated world. At least there we can be confident that the simulation is real. At least there we can be freed from the anxiety of not knowing where the edge between real and unreal lies. At least there we find something to hold onto, even if it’s nothing.
We live in strange times when to think critically about emerging technologies, and to ask difficult questions about how to harness our technologies towards the ends of making us more human, is to invite the criticism of being a Luddite. It should be an axiom of the examined life that as new tools and art forms become available to us, they should be the subject of deep reflection, and that the intellectual life should admit no boundaries to the scope of its reflections. Sometimes the willingness to ask questions is more important than the answers we arrive at. But not so in the anti-intellectual climate of today. I am increasingly finding that certain questions are taboo, and that caricatures like “Luddite” and “old-fashioned” are functioning as substitutes for genuine refutation.
Today marks the beginning of Screen-Free Week, when children throughout the nation are encouraged to unplug from the screen to rediscover the joys of childhood, the richness of relationships, and the adventure of stillness.
In anticipation of Screen-Free Week, I spent some of my weekend reviewing a number of studies on how digital addiction and information overload are rewiring our children’s brains. I summarized some of these studies in my TSM article “The Dangers of Digital Addiction and Information Overload: How I Discovered that Silence is Good for my Brain.”
One piece of research that was fascinating (and somewhat alarming) showed that simply being able to see a smartphone, even if you’re not using it, hinders a person’s ability to focus on tough tasks. Having a tablet or smartphone in the room, whether or not one actually uses it, leads to a reduced attention span and poorer performance on tasks, especially tasks requiring high levels of attention or cognitive ability. The mere presence of a smartphone in the same room also weakens a person’s ability to connect with other people, especially when something meaningful is being discussed.
In Part 2 of my interview with Graham Taylor about brain fitness, I shared what I learned from Nicholas Carr’s book The Shallows about mental schemas:
We’ve probably all seen people who have an ability to learn information quickly, perhaps when studying for a test, but then they forget it afterwards, and then we know other people who are able to achieve content mastery. What’s the difference? The difference is that in order for content mastery to occur, let alone understanding and wisdom, the brain has to move beyond massed practice and even memorizing; rather, the brain needs to start schematizing. This is because schemas serve as hooks on which to fasten new information. Without our brain’s ability to create schemas, without a sense of the connectedness of things, everything we learn would be simply a random collection of disconnected facts and there would never be any true understanding.
Nicholas Carr puts it like this in his book The Shallows: “The depth of our intelligence hinges on our ability to transfer information from working memory to long-term memory and weave it into conceptual schemas.” He continues: “…brain scientists have come to realize that long-term memory is actually the seat of understanding. It stores not just facts but complex concepts, or ‘schemas.’ By organizing scattered bits of information into patterns of knowledge, schemas give depth and richness to our thinking. ‘Our intellectual prowess is derived largely from the schemas we have acquired over long periods of time,’ says Sweller. ‘We are able to understand concepts in our areas of expertise because we have schemas associated with those concepts.’”
Carr points out that although the mental skill of schema formation is needed today more than ever, it is in jeopardy from technologies that orient us towards a state of continuous partial attention. As concentrated attention spans and focus become replaced by broad attention ranges and multitasking, what is lost is the type of slow, methodical, systematic and linear cognition that favors the formation of schemas in the long-term memory. In order for the brain to build up schemas effectively, a person has to reflect deeply about her life and what she has learned, and this reflection needs to occur in a slow and undistracted manner. When this is not the case, or when we form schemas badly, then the brain easily falls prey to oversimplifications. We see this all the time in our public political discourse, where issues are deliberated upon in isolated compartments that are often dominated by ideology, resulting in gross oversimplifications.
The brain also has to be nimble and flexible enough to adjust our schemas in light of new information we receive through knowledge, experience, and personal growth. Sometimes we learn or experience things that do not fit within our existing neurological schemas, and so the brain has to alter existing schemas or create new neural pathways, a process known as accommodation. That’s where intellectual humility, mental flexibility, and open-mindedness become really important. But when our thinking is dominated by impressions, by emotional reasoning, or by ideology, then we become closed-minded and stuck in schemas that can actually detach us from reality.
The solution is to engage constantly in deep intellectual reflection, to eschew what Socrates called “the unexamined life.” Deep intellectual reflection is to a healthy brain what water is to a healthy plant. People have known that since before Socrates, but now we have the brain science to go with it.
I have really benefited from these talks presented by Father Maximos. Father Maximos shares practices we can all do to cultivate inner prayer during times of constant distractions, and he also has some helpful things to say about mindfulness and the history of the Philokalia. The talks are totally free to download from Patristic Nectar.