Moore’s law, which expresses itself in ever-shrinking computer chips, seems to have a parallel in our machine-mediated discourse. Our public discourse has been shrinking at a rate that rivals the speed at which the integrated circuit has diminished in size.
When fax machines first appeared, they seemed like magic precisely because they could transmit so much text. I remember standing in wonder at the fax machine in my father’s bookstore as it dropped page after page on the floor. When email appeared, it was again astonishing that so much text could be sent over the computer. People would spend hours crafting careful email messages that drew on the tradition of letter writing.
That didn’t last very long. As our communication media have evolved through instant messaging, text messaging and finally Twitter, what attracts us is not length but brevity. Our communication media orient us to eschew complexity and depth, to give preference to what is brief and transitory.
At least, that is what dawned on me this morning when reading Nicholas Carr’s chapter on Twitter in his brand-new book Utopia is Creepy and Other Provocations. This chapter, which is a reprint of Carr’s 2007 blog post, points out that Twitter’s great accomplishment has been to fragment the fragments, enabling us to turn any event in our lives, no matter how trifling, into a headline. Twitter dignifies the banal and glorifies the boring by enabling us to turn any experience into a stop-the-press bulletin. Twitter thus “wraps itself and its users in an infantile language” in which we can take refuge in the insignificant. Carr’s closing paragraph connects Twitter to emerging Virtual Reality technologies:
As the physical world takes on more of the characteristics of a simulation, we seek reality in the simulated world. At least there we can be confident that the simulation is real. At least there we can be freed from the anxiety of not knowing where the edge between real and unreal lies. At least there we find something to hold onto, even if it’s nothing.
I have really benefited from these talks presented by Father Maximos. Father Maximos shares practices we can all use to cultivate inner prayer in times of constant distraction, and he also has some helpful things to say about mindfulness and the history of the Philokalia. The talks are totally free to download from Patristic Nectar.
This morning, while doing some research for a couple of clients, I came across two interesting articles that seemed to connect.
One article was a piece by Rod Dreher about his time at the recent Society of Classical Learning (SCL) conference. In the piece, titled ‘The Problem with “Worldview” Education’, Dreher shared Joshua Gibbs’ insight that “real art is not something that calls forth an immediate response. You have to contemplate it, turn it over in your mind for a while.” Gibbs went on to suggest that one casualty of the worldview-based approach to education is that the rush to analyze texts through a worldview grid can prematurely foreclose, or even completely short-circuit, this necessary process of wondering about and contemplating texts.
We live in strange times when to think critically about emerging technologies, and to ask difficult questions about how to harness our technologies towards the ends of making us more human, is to invite the criticism of being a Luddite. It should be an axiom of the examined life that as new tools and art forms become available to us, they should be the subject of deep reflection, and that the intellectual life should admit no boundaries to the scope of its reflections. Sometimes the willingness to ask questions is more important than the answers we arrive at. But not so in the anti-intellectual climate of today. I am increasingly finding that certain questions are taboo, and that caricatures like “Luddite” and “old-fashioned” are functioning as substitutes for genuine refutation.
Today marks the beginning of Screen-Free Week, when children throughout the nation are encouraged to unplug from the screen to rediscover the joys of childhood, the richness of relationships, and the adventure of stillness.
In anticipation of Screen-Free Week, I spent some of my weekend reviewing a number of studies on how digital addiction and information overload are rewiring our children’s brains. I summarized some of these studies in my TSM article “The Dangers of Digital Addiction and Information Overload: How I Discovered that Silence is Good for my Brain.”
One piece of research that was fascinating (and somewhat alarming) found that even if you’re not using it, simply being able to see a smartphone hinders a person’s ability to focus on tough tasks. Having a tablet or smartphone in the room, whether or not one actually uses it, leads to a reduced attention span and poorer performance on tasks, especially those requiring high levels of attention or cognitive ability. The mere presence of a smartphone in the same room also weakens a person’s ability to connect with other people, especially when something meaningful is being discussed.
In Part 2 of my interview with Graham Taylor about brain fitness, I shared what I learned from Nicholas Carr’s book The Shallows about mental schemas:
We’ve probably all seen people who have an ability to learn information quickly, perhaps when studying for a test, but then they forget it afterwards, and then we know other people who are able to achieve content mastery. What’s the difference? The difference is that in order for content mastery to occur, let alone understanding and wisdom, the brain has to move beyond massed practice and even memorizing; rather, the brain needs to start schematizing. This is because schemas serve as hooks on which to fasten new information. Without our brain’s ability to create schemas, without a sense of the connectedness of things, everything we learn would be simply a random collection of disconnected facts and there would never be any true understanding.
Nicholas Carr puts it like this in his book The Shallows: “The depth of our intelligence hinges on our ability to transfer information from working memory to long-term memory and weave it into conceptual schemas.” He continues: “…brain scientists have come to realize that long-term memory is actually the seat of understanding. It stores not just facts but complex concepts, or ‘schemas.’ By organizing scattered bits of information into patterns of knowledge, schemas give depth and richness to our thinking. ‘Our intellectual prowess is derived largely from the schemas we have acquired over long periods of time,’ says Sweller. ‘We are able to understand concepts in our areas of expertise because we have schemas associated with those concepts.’”
Carr points out that although the mental skill of schema formation is needed today more than ever, it is in jeopardy from technologies that orient us towards a state of continuous partial attention. As concentrated attention spans and focus give way to broad attention ranges and multitasking, what is lost is the type of slow, methodical, systematic and linear cognition that favors the formation of schemas in the long-term memory. In order for the brain to build up schemas effectively, a person has to reflect deeply about her life and what she has learned, and this reflection needs to occur in a slow and undistracted manner. When this is not the case, or when we form schemas badly, the brain easily falls prey to oversimplifications. We see this all the time in our public political discourse, where issues are deliberated upon in isolated compartments that are often dominated by ideology, resulting in gross oversimplifications.
Also, the brain has to be nimble and flexible enough to adjust our schemas in light of new knowledge, experience and personal growth. Sometimes we learn or experience things that do not fit within our existing schemas, and so the brain has to alter existing schemas or create new neural pathways, a process known as accommodation. That’s where intellectual humility, mental flexibility and open-mindedness become really important. But when our thinking is dominated by impressions, by emotional reasoning or ideology, then we become closed-minded and stuck in schemas that can actually detach us from reality.
The solution is to engage constantly in deep intellectual reflection, to eschew what Socrates called “the unexamined life.” Deep intellectual reflection is to a healthy brain what water is to a healthy plant. People have known this since before Socrates, but now we have the brain science to go with it.
Most of the time we do grow more skilled at the things we practice, whether it’s learning to play the violin or speak French. But studies show that multitasking falls into the weird category of behaviors that go against this norm: the more you practice it, the worse you become. In 2009, researchers at Stanford found that those who multitask frequently and believe it boosts their performance were actually worse at multitasking than those who prefer not to multitask. This is sobering: if you multitask a lot and think you’re good at it, the statistical likelihood is that you are actually a very bad multitasker. The more you practice it, the worse you become.
This research seems counter-intuitive and weird. How could practicing an activity make a person worse at that activity? From a neurological perspective, however, this is not surprising. Study after study has found that in order for the higher functions of the brain to flourish, the brain needs to be given frequent and regular spaces of silence, as well as spaces of deep, undistracted attentiveness to a single activity. What silence and attentiveness share in common is that they both depend on the brain being able to weed out incoming stimuli. In other words, for the brain to work properly, it needs times when it is not multitasking. Times of quiet, as well as times of undistracted focused activity, act as incubation periods in which the brain consolidates what it has learned, like a computer defragmenting its disk to weed out the junk. Of course, undistracted focus is not possible in an environment of multitasking.
The above was an excerpt from my TSM article ‘Recovering Quiet in an Age of Noise’.
I recently published an article for the Taylor Study Method titled ‘Recovering Quiet in an Age of Noise.’ This article continues to explore my ongoing interest in digital distractions and email addiction, but this time I’m approaching the topic from a personal angle and sharing my own journey. I share what I discovered when I went from having internet only on my computer to having it strapped to my side at all times. To read my article just click on the following link:
Recovering Quiet in an Age of Noise
(This post is a condensed and re-organized version of two earlier blog posts.)
“Technology tends to see reality as heaps, as a conglomeration of fragments that somehow are put together by someone in order to obtain something . . . that don’t have any inner order or interiority that is resistant to human manipulation.”
When Apple unveiled its new Apple Watch Series 2 at this year’s long-anticipated launch, news of the new smart-watch was overshadowed by reactions to the iPhone 7. Yet the underpublicized news that the Apple Watch is soon to be equipped with Pokémon Go is perhaps of greater significance than the annoying fact that Apple has decided to remove the headphone jack from the iPhone.
As I mentioned in last week’s post ‘Unbundled Reality and the Anti-Poetry of Pokémon Go’, I only found out about Pokémon Go a few months ago, after it had already taken over the world. The hugely popular augmented-reality game has been downloaded over 500 million times since its release this summer, bringing in over $10 million a day to its developers. The game enables users to find and capture virtual creatures that have been placed in real-world locations and which are only visible to those with the right electronic devices.
Pokémon in “Real Time”