Worldview Education, Technology and the Value of Boredom

This morning, while doing some research for a couple of clients, I came across two interesting articles that seemed to connect.

One article was a piece by Rod Dreher reflecting on his time at the recent Society of Classical Learning (SCL) conference. Titled “The Problem with ‘Worldview’ Education,” the piece shared Joshua Gibbs’ insight that “real art is not something that calls forth an immediate response. You have to contemplate it, turn it over in your mind for a while.” Gibbs went on to suggest that one casualty of the worldview-based approach to education is that the rush to analyze texts through a worldview grid can prematurely foreclose, or even completely short-circuit, this necessary process of wondering about and contemplating texts.


Customized Communication in a Virtual World

The year is 2060. Professor Updike stands to take the podium for the keynote speech at his university’s annual communications conference.

Professor Updike is a clean-shaven African American man in his mid-forties. To the audience, however, these details are irrelevant. Everyone in attendance is wearing virtual reality glasses—a technology that allows each person to customize their own reality and seamlessly overlay that reality onto the physical world. This technology, at one time experimental and cumbersome, has now become normal and ubiquitous. In fact, it has become unusual not to see people wearing these glasses, although there remain some neo-Luddite holdouts in the rural areas.

Through their VR glasses, some people see Professor Updike as he would have looked twenty-five years ago as an undergraduate. Others have adjusted their VR settings to see him as a white person, or another race of preference. For still others, the professor appears to be giving his speech completely nude.


Nicholas Carr on the Decline of Deep Thinking

In the video below, Pulitzer Prize finalist Nicholas Carr shares evidence from brain science about what happens when our devices (particularly smartphones) infuse our lives with perpetual distraction, multitasking and split attention. He shows that what the science is finding (and you can see footnotes to the actual peer-reviewed studies in Carr’s book The Shallows) is that there is a trade-off whereby certain cognitive functions become diminished.

That much isn’t surprising, but what I found really interesting is which cognitive functions are compromised.


Multitasking: the more you practice it, the worse you become

Most of the time, of course, we do grow more skilled at the things we practice, whether it’s learning to play the violin or speak French. But studies show that multitasking falls into the weird category of behaviors that go against this norm: the more you practice it the worse you become. In 2009, researchers at Stanford found that those who multitask frequently and believed that it boosted their performance were actually worse at multitasking than those who preferred not to multitask. Indeed, when measured by the same established cognitive control dimensions, the group who had a lifestyle of frequent multitasking performed worse than the group of light multitaskers.

This is sobering: if you multitask a lot and think you’re good at it, it is statistically likely that you are actually a very bad multitasker. The study also found that multitasking makes a person “more susceptible to interference from irrelevant environmental stimuli and from irrelevant representations in memory.”


Refuge in Insignificance

Moore’s law, one consequence of which is that computers keep getting smaller and smaller, seems to parallel what is happening in our machine-mediated discourse. Our public discourse has been shrinking at a rate that rivals the miniaturization of the integrated circuit.

When fax machines first appeared, they seemed like magic precisely because they could transmit so much text. I remember standing in wonder at the fax machine in my father’s bookstore as it dropped page after page on the floor. When email appeared, it was again astonishing that so much text could be sent over the computer. People would spend hours crafting careful email messages that drew on the tradition of letter writing.

That didn’t last very long. As our communication media have evolved through instant messaging, text messaging and finally Twitter, what attracts us is not length but brevity. Our communication media orient us to eschew complexity and depth, to give preference to what is brief and transitory.

At least, that is what dawned on me this morning when reading Nicholas Carr’s chapter on Twitter in his brand-new book Utopia is Creepy and Other Provocations. This chapter, which is a reprint of Carr’s 2007 blog post, points out that Twitter’s great accomplishment has been to fragment the fragments, enabling us to turn any event in our lives, no matter how trifling, into a headline. Twitter dignifies the banal and glorifies the boring by enabling us to turn any experience into a stop-the-press bulletin. Twitter thus “wraps itself and its users in an infantile language” in which we can take refuge in the insignificant. Carr’s closing paragraph connects Twitter to emerging Virtual Reality technologies:

As the physical world takes on more of the characteristics of a simulation, we seek reality in the simulated world. At least there we can be confident that the simulation is real. At least there we can be freed from the anxiety of not knowing where the edge between real and unreal lies. At least there we find something to hold onto, even if it’s nothing.

Strange Times

We live in strange times when to think critically about emerging technologies, and to ask difficult questions about how to harness our technologies towards the ends of making us more human, is to invite the charge of being a Luddite. It should be an axiom of the examined life that as new tools and art forms become available to us, they should be the subject of deep reflection, and that the intellectual life should admit no boundaries to the scope of its reflections. Sometimes the willingness to ask questions is more important than the answers we arrive at. But not so in the anti-intellectual climate of today. I am increasingly finding that certain questions are taboo, and that caricatures like “Luddite” and “old-fashioned” are functioning as substitutes for genuine refutation.


Smartphones and Your Brain

Today marks the beginning of Screen-Free Week, when children throughout the nation are encouraged to unplug from the screen to rediscover the joys of childhood, the richness of relationships, and the adventure of stillness.

In anticipation of Screen-Free Week, I spent some of my weekend reviewing a number of studies on how digital addiction and information overload are rewiring our children’s brains. I summarized some of these studies in my TSM article “The Dangers of Digital Addiction and Information Overload: How I Discovered that Silence is Good for my Brain.”

One piece of research that was fascinating (and somewhat alarming) showed that even when you’re not using it, simply being able to see a smartphone hinders a person’s ability to focus on demanding tasks. Having a tablet or smartphone in the room, whether or not one actually uses it, leads to a reduced attention span and poorer performance on tasks, especially tasks requiring high levels of attention or cognitive ability. The mere presence of a smartphone in the same room also weakens a person’s ability to connect with other people, especially when something meaningful is being discussed.

The Dangers of Digital Addiction and Information Overload: How I Discovered that Silence is Good for my Brain

This post was originally published at the Taylor Study Method and is reprinted here with permission of the author (me).

I still remember the night that convinced me I finally needed to join the twenty-first century.

I had just finished a long day helping as a judge for a debate tournament. By the time I finally headed home it was dark. Or at least, I thought I was headed home. However, the further I drove, the less I recognized my surroundings. As the road progressed further and further up into the mountains, I remembered my young children waiting at a friend’s house for me to collect them. Finally, the road abruptly ended. Literally, it just ended. I had no choice but to turn around and start over.

At about midnight I finally pulled into the driveway of my friend’s house to collect my tired children. I determined never to let myself get lost again: I would finally invest in a GPS.

A few weeks later I went into an electronics store and asked for a device that had GPS capabilities. They sold me an Android tablet. I quickly discovered that the tablet was more than just a GPS: it was also an audio player, a camera, a gaming device, even a flashlight. Moreover, the tablet had a perpetual connection so it was always online.

I felt proud of my new device and even had a green pouch custom made for wearing it at my side. Since it was a camera, I took it with me everywhere. Wearing the tablet made me feel sophisticated and modern: I had finally entered the 21st century!

You Have Email!

About a week after purchasing the device, I learned that I could check my email on it. And although the particular device I had purchased didn’t function as a phone, I found an app that allowed me to have text messages directed to it. At first I had mixed feelings about these features. During my work day I was constantly at the computer, continually available to people who needed to contact me. Did I really want this to spill into my personal life? So I determined not to use my tablet to check my messages when I wasn’t working, except perhaps when I was expecting something really important.

At first it was easy to keep to my resolve. When I wasn’t working I was generally either doing something with my children, reading, or walking in nature. These activities provided valuable opportunities to rest my brain, to be in a state of stillness. Why would I want to spoil these precious times with the noise of the internet?

If I ever did feel tempted to start using the device to go online when I wasn’t working, I was well fortified with a host of research on why I shouldn’t. You see, I had read about studies on the neuroscience of message addiction, and I had occasion to write about this over the years (see here and here and here and here and here); as a journalist I had even been hired to review studies on the dangers of digital addiction and information overload. I knew that many office workers check their email every minute, while an increasing number of high school students are interrupted from their homework as often as every 20 seconds by the sound of an incoming text message.

One of the most fascinating studies, published in 2015, found that young adults were using their phones five hours a day, across 85 separate occasions. Other research suggests that children between 8 and 12 spend an average of about six hours a day consuming online media, while the typical teenager spends as much as 9 hours a day using online media. Rather worryingly, the research is also finding that because smartphone use has become habituated, our interaction with these devices often occurs without any conscious awareness of our behavior, in much the same way that we are not always conscious of automated behaviors like breathing or scratching ourselves.

Digital saturation is actually becoming a public health concern: in a September 2016 special edition of Time Magazine, it was reported that scientists have discovered that the type of multitasking encouraged by electronic devices (especially those that ding or light up every time we have a new message) damages the brain’s prefrontal cortex in areas required for decision making, reasoning skills, and social virtues like empathy, understanding and interpersonal communication. (For more information about this, see Pamela DeLoatch’s article ‘The Four Negative Sides of Technology‘.)

At least, I thought, that would never happen to me! Whatever the ignorant masses might be doing, I would never let my focus become compromised. I would never succumb to what Cory Doctorow has termed the “ecosystem of interruption technologies.” As I played with my children or walked in the woods with my Android device strapped to my side, I congratulated myself that I was not even tempted to check my messages. I had turned off all notifications and used the device only as a camera, GPS, and audio player.

Into the Digital Abyss 

I can’t remember when I first started checking my email regularly. It all happened so gradually that I hardly noticed at first. But eventually I began to find that when I was walking, listening to music, playing with children, reading, and sometimes even while I was meditating, a compulsion would suddenly come upon me to open up my tablet to check my messages. Maybe someone was trying to send me a text. Maybe there was an email waiting to be read!

I first realized I had a problem when I found myself checking my messages during a walk in the woods, hiding behind trees so as not to be a bad example to my kids. They naively believed they had my full attention.

The worst thing about my tablet was that even when I refused to give in to the impulse to check my messages, my times of silence were filled with thoughts about the omnipresent conversation happening around me. Moreover, I began to find that the more I let my attention be scattered by external stimuli, the harder it was to control the internal distractions that naturally arise in the brain throughout the day (i.e., useless thoughts, negative ruminations, toxic imaginations, etc.). Looking back, I believe this was significant. You see, as far back as the written record extends, it’s clear that human beings have recognized that one of the greatest obstacles to well-being is the thousands of useless thoughts that constantly arise in the human brain. Great spiritual traditions ranging from Christian monasticism to Zen Buddhism have offered various techniques for rejecting this ever-changing kaleidoscope of mental stimuli and bringing the brain to a place of stillness. The condition in which we now find ourselves is one in which the kaleidoscope of mental stimuli has been externalized through our self-imposed bombardment with incoming digital stimuli.

Living in today’s world we do not merely have to contend with the useless ruminations of our ever-distractible brains: through social media we also have to deal with the cognitions of thousands of others. And the frightening thing is that the neuropathways activated by the latter undermine the brain’s resistance against the former.

In my own case, the electronic distractions eventually went beyond emails and text messages. I became a consumer of social media, using my tablet to go on Facebook to see if people were commenting on articles I’d written. I even remember once, while driving, finding myself longing for the traffic light to turn red so I could take a quick peek at my messages and notifications.

I still knew that silence is good for my brain, but I stopped being able to enjoy silence.

Offline Distractions

Once I realized I had a problem, the solution was simple: don’t reach for my tablet. So I developed little routines to make it harder for me to go online. If I was using the tablet to listen to audio books while driving, I would put it on airplane mode so it wasn’t online. When I got home from work I would leave the tablet in the car instead of taking it into the house. When I went for hikes, I wouldn’t even take the tablet. I felt proud of myself: I had conquered the impulse to be distracted!

I quickly found that things were not so simple. The worst distractions occurred when I was not actually using my tablet. Even if I wasn’t consciously thinking about it, at the back of my mind was always the awareness that there was a conversation happening that I could jump into at the click of a button. Whether or not I actually turned my tablet on, there remained a nagging awareness of the conversations happening around me. Maybe someone had just sent me an email. Maybe someone had replied to my Facebook comment. Maybe I had a text message waiting to be read. This awareness of what we might call “information potentiality” drained my attention and hindered my ability to focus. In fact, these offline distractions proved harder to manage than the online ones. Whereas an online distraction may last a few seconds, the scattered focus of my offline hours was interminable.

At first I assumed I was an anomaly for experiencing these symptoms. I have friends and family members who use smartphones all the time, and yet I never hear them complaining about these types of problems. We hear a lot about online distractions, but why does there seem to be so little awareness of this kind of offline distraction, caused by thinking about what might be happening online? Was there something wrong with my brain?

With these questions in mind, I jumped back into some of the research I had reviewed a few years earlier.

What the Research Says

In jumping back into the research, I was struck by a pair of studies I read about in Time Magazine’s special edition on Mindfulness. These studies suggested that the primary challenge electronic distractions bring is not in the areas we might at first suspect. A common assumption is that our electronic devices distract us by consuming our time. However, glancing at a text message or a Facebook comment in the middle of homework or office time can be a very brief exercise, occupying no more than a few seconds. Rather, the primary problem is that by exposing our minds to this constant stream of stimuli we use up valuable cognitive resources, putting a drain on working memory (the part of the brain through which all information must pass before it can reach long-term memory). This part of the brain can only hold so much information at any one time, which is why it is important not to overload it.

Information overload can happen without a person actually feeling that their brain is overloaded, since the brain adjusts to this state of affairs by shutting down other processes. Think of a computer with limited RAM that shuts down certain functions to accommodate others.

What are the functions that get shut down when our brains are exposed to too much information? This question was addressed by Nicholas Carr in his book The Shallows: What the Internet is Doing to Our Brains. Carr combed through study after study showing that when our working memory is compromised by too many distractions, some of the first mental functions to be shut off are the ability to put knowledge into schemas, to make connections, and to grasp over-arching narratives of meaning. In short, our brains become lost in a sea of particulars without the ability to connect these particulars into larger structures of understanding. Another function to be shut down is the ability to be attentive to others, to empathize, and to understand things from another person’s point of view.

In order for these higher cognitive functions to work, the brain needs lots of time during the day when we are at rest, when we are quiet, and when we can focus on specific mental, imaginative or interpersonal tasks against a backdrop of stillness. What our hand-held electronic devices do is to replace the backdrop of stillness with a backdrop of informational noise.

Having reviewed this research, the solution for me was simple: I needed to ramp up the self-control and check my messages less frequently. But how much less frequently? Earl Miller, professor of neuroscience at MIT, says that “switch cost” (the loss of attention when we’re pulled away from a task, even if only for a split second to glance at a message) has an effect on the brain’s ability to focus that lasts up to 15 or 20 minutes. The worst effects of switch cost occur in the first 64 seconds after checking one’s email or text messages, regardless of whether there was a message of significance.

Because of this, it’s probable that the average office worker wastes at least 8.5 hours a week figuring out what he or she was doing moments before. This may be one of the reasons researchers have found that merely being able to see a cell phone hinders the brain’s ability to focus. Having a tablet or smartphone in the room, whether or not one actually uses it, leads to a reduced attention span and poorer performance on tasks, especially tasks requiring high levels of attention or cognitive ability. The mere presence of a smartphone in the same room also weakens a person’s ability to connect with other people, especially when something meaningful is being discussed.
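Miller’s switch-cost figures make it possible to sanity-check that weekly number. The sketch below is my own back-of-envelope arithmetic, not a calculation from the cited research: it assumes six attention-resetting interruptions per workday and a seventeen-minute refocusing cost per switch (roughly the midpoint of the 15 to 20 minute range quoted above).

```python
# Back-of-envelope check of the "8.5 hours a week" figure.
# The inputs below are illustrative assumptions, not numbers
# reported in the research discussed above.

interruptions_per_day = 6   # assumed attention-resetting checks per workday
refocus_minutes = 17        # assumed cost per switch (midpoint of 15-20 min)
workdays_per_week = 5

lost_minutes = interruptions_per_day * refocus_minutes * workdays_per_week
lost_hours_per_week = lost_minutes / 60

print(f"Estimated focus lost: {lost_hours_per_week:.1f} hours per week")
```

On these particular assumptions the estimate works out to 8.5 hours, matching the figure quoted above; different assumptions about interruption frequency would of course scale the result up or down.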

My Experiment Going Offline

At this point I’d like to be able to say that after reviewing all this research I acted nobly and threw away my tablet. The reality is much more prosaic: last year I cancelled the data plan with my Android because I couldn’t afford to keep paying $30 a month for a continual connection.

At first it was kind of scary not to have the internet with me all the time. When I left the wifi zone of my office or home, I felt like I had personally been disconnected, isolated from the center of social gravity. To ease my newfound insecurity, I had to keep reminding myself that people lived for thousands of years without a ubiquitous internet connection.

My fear and insecurity were quickly replaced by a sense of relief: no longer was I continually subject to the tyranny of being constantly available. No longer were my times of stillness plagued with the thought, “I wonder if I have a message.” Gradually, little by little, the stillness I had previously treasured began to return.

Since that time I admit I can sometimes be like a person who gives up smoking and then becomes intolerable to those who still practice the habit. Last week I was tutoring a high school student who was grappling with one of the more complex chapters in E. M. Forster’s classic novel A Passage to India. As I struggled to explain the chapter’s dominant themes, every 20 to 30 seconds our concentration was interrupted by a bell on his phone signifying an incoming text message. Every time this happened he would glance at his phone, usually for less than a second, to confirm that the message was unimportant. To avoid having our focus continually derailed, I eventually asked him to turn off the device or put it in another room. I backed up my request by explaining what scientists were discovering about the subliminal effects of “switch cost.” After listening to my review of these studies, my student continued to maintain that he was not distracted by his phone and, consequently, he didn’t need to turn it off. Eventually we reached a compromise: he would put his phone on vibrate mode. That way he would still know if someone was trying to reach him.

Is Email Addicting?

As the above interchange suggests, asking a teenager to turn off his phone may well be like asking a gambling addict to stay away from the casino. The comparison has warrant, since there is a large body of research showing that digital addiction (including a compulsion to incessantly check one’s messages) activates the same pathways in the brain as gambling. Slot machines follow a “variable interval reinforcement schedule,” which means that a certain action is rewarded some of the time but not all of the time, and not in a predictable pattern. The presence of occasional rewards paced at unpredictable intervals incentivizes the gambler to keep putting money in the slot machine even when this usually results in a loss. Similarly, checking one’s email or glancing at incoming text messages also results in a loss most of the time: there is nothing of interest, and our focus is briefly scattered. However, occasionally these actions are rewarded: we receive a message that brings excitement, social nourishment, or a diversion from tedious work or study. It is these once-in-a-while rewards that incentivize a person to keep coming back to re-check email or review the latest text messages, even when the results are deleterious to their mental and psychological wellbeing.
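The reward pattern described above can be mimicked in a few lines of code. This toy simulation is my own illustration, not drawn from any cited study; it assumes, purely for the sake of the example, a one-in-ten chance that any given inbox check turns up a rewarding message.

```python
import random

random.seed(42)  # fixed seed so the illustration is repeatable

# Assumed, purely illustrative odds: roughly 1 check in 10
# delivers a "reward" (an exciting or socially nourishing message).
REWARD_PROBABILITY = 0.1

def check_messages() -> bool:
    """One inbox check; True on the rare rewarding message."""
    return random.random() < REWARD_PROBABILITY

results = [check_messages() for _ in range(1000)]
rewards = sum(results)

print(f"{rewards} rewarding checks out of {len(results)};")
print(f"{len(results) - rewards} checks were pure distraction.")
```

As with the slot machine, the overwhelming majority of pulls are losses; but because the wins arrive at unpredictable points in the sequence, the behavior is reinforced far more stubbornly than any predictable schedule would reinforce it.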

“a much more blended physical and digital world”

I had been able to escape from digital addiction and information overload simply by phoning my provider and asking to cancel my plan. But a day may be swiftly approaching when the internet is forced upon each of us, whether we choose it or not. At the 2007 Consumer Electronics Show, Bill Gates described the bedroom of the future as one with walls completely covered with computer screens.

Building on this vision, in 2011 Microsoft released a “Productivity Future Vision” video. The video plays majestic music while presenting the vision of a bright future in which online connectivity will be seamlessly integrated into every aspect of life, everywhere in the world.

The video envisions a world in which multitasking and divided-attention have become the norm, where contented men and women negotiate hundreds of digital inputs and outputs every day, thus enabling everyone to be linked together in a web of goodwill and productivity. There is something very factory-like in this pragmatic vision, which is also shared by Google and most of Silicon Valley. But whereas the factory has generally come to be seen as dehumanizing, the vision of a digital utopia is presented as a way of enabling humanity to flourish.

Not to be left behind, Facebook has now jumped on the bandwagon, announcing last week that it was working on technology that would enable digital messages to be layered on top of the physical world. Facebook’s Regina Dugan told TheVerge.com that the company was working on an augmented reality system that would integrate our online and offline lives, towards the goal of “a much more blended physical and digital world.”

When Does Practice not Make Perfect?

If these technocratic utopias are ever realized, it would be nice to think that the human brain will adjust like it has adjusted to other technologies. As we become more and more practiced in multitasking, won’t we get better at it? After all, practice makes perfect, right?

Most of the time, of course, we do grow more skilled at the things we practice, whether it’s learning to play the violin or speak French. But studies show that multitasking falls into the weird category of behaviors that go against this norm: the more you practice it, the worse you become. In 2009, researchers at Stanford found that those who multitask frequently and believed that it boosted their performance were actually worse at multitasking than those who preferred not to multitask. Indeed, when measured by the same established cognitive control dimensions, the group who frequently practiced multitasking performed worse than the group of light multitaskers. This is sobering: if you multitask a lot and think you’re good at it, it is statistically likely that you are actually a very bad multitasker. The study also found that multitasking makes a person “more susceptible to interference from irrelevant environmental stimuli and from irrelevant representations in memory.”

This research seems counter-intuitive. How could practicing an activity make a person worse at it? From a neurological perspective, however, this is not surprising. Study after study has found that in order for the higher functions of the brain to flourish, the brain needs to be given frequent and regular spaces of silence, as well as spaces of deep, undistracted attentiveness to a single activity. What silence and attentiveness share in common is that they depend on the brain being able to weed out incoming stimuli. In other words, for the brain to work properly, it needs times when it is not multitasking. Times of quiet, as well as times of undistracted focused activity (whether listening to music, reading, prayer, exercise or meditation), act as incubation periods in which the brain consolidates what it has learned, like a computer defragmenting itself to weed out the junk. Of course, undistracted focus is not possible in an environment of multitasking. This may be one of the reasons another study found that multitasking shrinks the part of the brain involved in emotional regulation and higher cognitive functions.

If the neuroscientists are correct that our brains need regular periods of stillness and attentiveness to function properly, then the greatest virtue we can pass on to our children is the virtue of stillness. One might say that in a world of universal noise, being quiet is a revolutionary act. Stillness is revolutionary because it challenges the pragmatism that has become the modus operandi of our culture, where value is determined by the speed at which the mind can receive inputs and execute outputs.

Stillness in Education

Since various schools and community groups are officially recognizing this week as Screen-Free Week, I’d like to conclude by drawing some implications for school teachers and administrators.

I’ve been involved in various college prep schools, including charter schools and classical schools, and when I talk to the educators in these systems the one thing they always lament is the lack of time to cover everything they would like. Content, content, content: the more the better. Accordingly, curriculum is structured with the goal of cramming as much information as possible into the students’ minds. As inputs are continually increased, the outputs expected of the students also increase, with the result that little space is left for reflecting deeply on any one thing. Thus, from an early age students learn that success in life is directly correlated with the speed at which they can absorb inputs and produce outputs. When students are occasionally given time to reflect deeply on a single thing, their minds often find it difficult to adjust to the slower pace. When the freneticism of information overload is the norm, thinking deeply feels strangely uncomfortable, while stillness comes to feel unnatural, even frightening and disconcerting.

On one level it makes sense for educators to throw as much information at our kids as possible. After all, our goal is to produce smart kids who will be able to get into good colleges and tackle the demands of an increasingly competitive world. But what if the effective means of reaching these goals are the opposite of what we usually think? What if a child is very different from a machine? What if a child’s ability to sit in silence, to voluntarily slow down the speed of inputs and outputs, is actually one of the biggest indicators of future success, prosperity and academic achievement?

I pose these as questions, but science is already on the way to providing the answers. In his book Focus, Daniel Goleman shared research suggesting that the ability to focus is an even greater indicator of future life success than IQ. A series of randomized controlled studies conducted in the 2011-12 school year found that periods of focused stillness improved students’ aptitude in attention, self-control, self-care, participation, and showing care for others. In San Francisco, after some of the roughest schools implemented periods of focused breathing, twice as many students scored proficient in English on the California Achievement Test compared to schools that didn’t use the program.

This shouldn’t be surprising since the ability to attend to one thing, and one thing only (be it a conversation, a piece of music, someone else’s feelings, or just the pattern of one’s breathing), is a skill that is directly related to the same part of the brain we use for planning, creativity, strategic thinking, empathy, schema formation, memory and learning, and the ability to grasp the big picture. This is the part of my brain that was in danger of being lost once the internet became the constant companion at my side.

This should prompt us to ask some serious questions about the use of technology in our schools. Each year American school systems spend inordinate time, money and training to more fully integrate technology into the classroom. Often we feel that if our students are not tech-savvy from as young an age as possible, they will struggle to compete in the global economy of the 21st century. As teachers and school administrators, we may also feel that integrating digital technology into the classroom shows that we are advanced and forward-looking. The result is not only that devices like laptops, iPads and smartphones are becoming integrated into the classroom, but that it is now considered normal for students to use the internet instead of books for their research. Although an increasing body of research suggests that technology damages students’ brains in precisely those areas required for learning and serious reflection, teachers who raise concerns about this are often dismissed with caricatures like “Luddite” or “old-fashioned.”

The irony is that this movement to bring more technology into our schools runs parallel with the movement to integrate mindfulness into the classroom. At first glance, one might be forgiven for thinking that the habits of mind that the mindfulness movement tries to cultivate—habits such as attention, inner stillness, mono-focus, meta-cognition, contemplation and impulse control—might be compromised by devices that practically encourage compulsive multitasking, digital saturation and information overload. But apparently many people within the mindfulness movement do not see it like that. Not only are those at the forefront of the mindfulness movement neglecting to initiate a public conversation about the effect of technology on our children’s brains, but they often see technology as the friend of mindfulness. The industry of “mindfulness apps” is now big bucks, as a variety of programs offer digital assistance for being calm, achieving inner quiet, improving attentiveness skills and becoming more mindful. The irony is that the areas of the brain that are being atrophied by our technology are precisely those areas involved in attentiveness, the cornerstone of mindfulness.

William James talked about “the faculty of voluntarily bringing back a wandering attention, over and over again,” as being “the very root of judgment, character, and will” and added that “an education which should improve this faculty would be the education par excellence.” As more and more education takes place in front of a computer, however, the habit of remaining attentive to any one thing for long is weakened.

Within a digital environment children’s “brains are rewarded not for staying on task but for jumping to the next thing,” Harvard professor Michael Rich told The New York Times. “The worry is we’re raising a generation of kids in front of screens whose brains are going to be wired differently.”

And that, by the way, is why I support all the schools throughout the country who are celebrating screen-free week from May 1 to May 7. As children throughout the nation unplug from digital entertainment, they have the opportunity to rediscover the joys of childhood, the richness of relationships, and the adventure of stillness.

Three months after switching off my tablet and recovering quiet, I stumbled upon a poem by William Henry Davies that I had loved in my youth but had forgotten. Davies’ words make a fitting conclusion to the thoughts I’ve been sharing.

What is this life if, full of care,
We have no time to stand and stare.
No time to stand beneath the boughs
And stare as long as sheep or cows.

No time to see, when woods we pass,
Where squirrels hide their nuts in grass.
No time to see, in broad daylight,
Streams full of stars, like skies at night.

No time to turn at Beauty’s glance,
And watch her feet, how they can dance.
No time to wait till her mouth can
Enrich that smile her eyes began.

A poor life this if, full of care,
We have no time to stand and stare.

Nicholas Carr on Schema Formation

In Part 2 of my interview with Graham Taylor about brain fitness, I shared what I learned from Nicholas Carr’s book The Shallows about mental schemas:

We’ve probably all seen people who have an ability to learn information quickly, perhaps when studying for a test, but who then forget it afterwards, while other people are able to achieve content mastery. What’s the difference? The difference is that in order for content mastery to occur, let alone understanding and wisdom, the brain has to move beyond massed practice and even memorizing; rather, the brain needs to start schematizing. This is because schemas serve as hooks on which to fasten new information. Without our brain’s ability to create schemas, without a sense of the connectedness of things, everything we learn would be simply a random collection of disconnected facts and there would never be any true understanding.

Nicholas Carr puts it like this in his book The Shallows: “The depth of our intelligence hinges on our ability to transfer information from working memory to long-term memory and weave it into conceptual schemas.” Elsewhere in the book he elaborates: “…brain scientists have come to realize that long-term memory is actually the seat of understanding. It stores not just facts but complex concepts, or ‘schemas.’ By organizing scattered bits of information into patterns of knowledge, schemas give depth and richness to our thinking. ‘Our intellectual prowess is derived largely from the schemas we have acquired over long periods of time,’ says Sweller. ‘We are able to understand concepts in our areas of expertise because we have schemas associated with those concepts.’”

Carr points out that although the mental skill of schema formation is needed today more than ever, it is in jeopardy from technologies that orient us towards a state of continuous partial attention. As concentrated attention spans and focus become replaced by broad attention ranges and multitasking, what is lost is the type of slow, methodical, systematic and linear cognition that favors the formation of schemas in long-term memory. In order for the brain to build up schemas effectively, a person has to reflect deeply on her life and what she has learned, and this reflection needs to occur in a slow and undistracted manner. When this is not the case, or when we form schemas badly, the brain easily falls prey to oversimplifications. We see this all the time in our public political discourse, where issues are deliberated upon in isolated compartments that are often dominated by ideology, resulting in gross oversimplifications.

Also, the brain has to be nimble and flexible enough to adjust our schemas in light of new information we receive through knowledge, experience and personal growth. Sometimes we learn or experience things that do not fit within our existing neurological schemas, and so the brain has to alter existing schemas or create new neural pathways, a process known as accommodation. That’s where intellectual humility, mental flexibility and open-mindedness become really important. But when our thinking is dominated by impressions, by emotional reasoning or ideology, then we become closed-minded and stuck in schemas that can actually detach us from reality.

The solution is to engage constantly in deep intellectual reflection, to eschew what Socrates called “the unexamined life.” Deep intellectual reflection is to a healthy brain what water is to a healthy plant. People have known that since before Socrates, but now we have the brain science to go with it.