Earl Miller, BA '85 Photographed by Jason Grow

Attention, Please

MIT neuroscientist Earl Miller, BA ’85, continues to break new ground in the understanding of cognition—and his research may help us move beyond the limits of the brain’s working memory. 

By Adam Piore, Photography by Jason Grow

In the rehearsal space of the Boston band What She Said, Earl Miller lays into his bass guitar, plucking out a funky groove. In a black band tee, faded cargo pants and signature newsboy cap, Miller looks like a seasoned musician you’d see in any corner dive bar.

But at his nearby office at MIT, Miller is nothing if not professorial. How could that rocker in the cap be the same bookish academic now gazing solemnly at me across his paper-strewn desk at the Picower Institute for Learning and Memory? The jarring contrast between the two Earl Millers is a fitting way to begin a discussion of the pioneering neuroscientist’s work.

After all, some of Miller’s biggest contributions to the field over the past 20 years have explored exactly how contrasts like these are possible; how it is, in other words, that human beings—or any other animal with a brain—are able to seamlessly adapt behavior to changing rules and environments. How is it that distinct populations of brain cells, or neurons, are able to work together to quickly summon an appropriate response? How do we know when it’s fitting to play a Patti Smith bass line, and when it’s time to explain the complex workings of brain waves?

This mental flexibility is so fundamental that it’s easy to take it for granted. But there are few functions the brain must perform that are more complex or crucial to survival than recognizing when something has changed and then calling up all the disparate information needed to adapt appropriately.

“Think about what we’re doing here,” Miller says. “Right now. We’re sitting on chairs. We’re taking turns talking. This is based on rules. We’ve learned how to behave in this context we’re in right now.”

To pull off tasks like these, the brain uses something called working memory. Cognitive psychologists coined the term in 1960 as they tried to explain the fundamental structure of the human thought process.

Try to hold that last sentence in your mind, or memorize a phone number you’re about to dial, and you’ll have engaged this critical brain system.

Miller has spent the past two decades trying to understand the mechanisms behind working memory, and he believes the key lies in the brain’s prefrontal cortex. Insights into this thin layer of neurons at the front of the brain could answer questions that have flummoxed scientists for generations. It might have practical use, too.

Experts have long known that we have a virtually unlimited capacity to store new long-term memories. Yet there’s a limit on how much information we can cram into our working memory.

In studying the prefrontal cortex’s functions, Miller and others are coming closer to finally explaining this contradiction. And by solving this riddle, we may find ways to get beyond those limits.

Someday, Miller believes, he’ll be able to make us all smarter.

Building the Picture

As postdoctoral fellows at Baltimore’s Johns Hopkins University in the late 1950s, David Hubel and Torsten Wiesel set out to solve a long-standing mystery: What happens in the brain when we see objects and shapes?

Every one of us has about 100 billion neurons, separated by gaps called synapses. Neurons talk to each other by passing signals across these spaces. When one neuron’s signal is strong enough, it causes the neuron on the other side of the synapse to fire an electrical spike. When that second neuron fires, it passes messages to all the other neurons it’s connected to, which can cause those neurons to fire. This sequential firing of neurons allows us to think, to move—and to see.

After a series of experiments performed on the visual cortex of animals, Hubel and Wiesel argued that it is the consecutive firing of individual, specialized neurons, each responsible for a specific detail in a picture or pattern, that helps us build complex images in our mind’s eye. Their work earned them the Nobel Prize in Physiology or Medicine in 1981.

As it happened, Miller entered Kent State University the same year—with dreams of becoming a doctor. That quickly changed when he started working in a neuroscience lab.

“The moment I first dropped an electrode into a slice of brain and heard all these neurons firing away like a thunderstorm, I was hooked,” Miller recalls.

As a Princeton University graduate student, Miller studied the inferior temporal cortex, a patch of neurons slightly forward of the visual cortex. Scientists had demonstrated that this region knits together a unified image from all the complex individual components Hubel and Wiesel identified, then begins the “higher-level” processing of the outside world.

By the time Miller earned his PhD in 1990, he was asking the questions that would later define his career: What happens in the inferior temporal cortex after a unified picture emerges? How do our brains tell us what it means?

Miller tried to answer those questions while working in the lab of National Institute of Mental Health neuroscientist Bob Desimone. Miller was looking for neurons that fired only when an animal spotted an item it was storing in short-term memory. Miller and Desimone trained animals to hold a single image in mind—such as an apple—and release a lever when that picture reappeared on a screen.

If the animal remembered the first picture it saw and released the lever, a drop of delicious juice would roll down a tube and into its cage.

The pair noticed that certain parts of the animal brain were inherently sensitive to repetition—regardless of whether it translated into a valued juice reward. Some neurons fired when animals saw a second banana or second image of trees. It was as if the brain was on automatic pilot, primed to notice repetition without any active effort to do so, even when that repetition had no meaning.

But the pair also discovered a second type of firing pattern. When the animal spotted a picture it was actively holding in its memory—hoping for a juice reward—not only did different neurons fire, but those neurons fired far more intensely.

“Something was switching the volume to make these neurons fire, more or less, depending on the nature of the memory,” Miller says. “That got me wondering. Who’s turning up or down the volume?”

Turn It Up

Scientists have suspected that the prefrontal cortex plays a key role in high-level cognition since the case of Phineas Gage. On Sept. 13, 1848, Gage, who worked in railroad construction, was setting an explosive charge with a tamping iron when the gunpowder detonated, rocketing a metal rod up through the roof of his mouth, into his left frontal lobe and through the top of his skull. The rod landed 75 feet away, coated in pieces of Gage’s brain.

Miraculously, Gage survived and could speak, walk and function. But, it was written later, he could no longer stick to plans and lost much of his social grace and restraint.

From studying Gage and others like him, neuroscientists surmised that the frontal lobes performed the brain’s “executive functions.” They run the business of thinking and processing and directing the spotlight of attention. And yet, nearly 150 years after Gage’s famous injury, scientists were still trying to understand how the frontal lobe works.

So, when Miller started his own lab at MIT in 1995, he decided to switch his focus to the prefrontal cortex. By then, some of his peers had already shown that clusters of neurons in lab animals would fire repeatedly in the prefrontal cortex during memory exercises. Their results suggested this region houses our working memory.

To Miller, however, this didn’t explain how the executive areas of the brain could “turn up the volume” on memories associated with free juice.

How does the animal know how to do the task? How does the animal know the rules?

“I thought that was the most important thing,” Miller says. “I didn’t understand why no one was studying it. Context-dependent behavior is what high-level cognition is all about.”

In his new lab, Miller designed an experiment that complicated the choice his animals faced. Instead of just showing an animal a picture and training it to respond every time it reappeared, he varied the number of possible responses by adding a second cue.

Miller predicted he’d detect activity in multiple neurons in the prefrontal cortex every time he changed the rule. These neurons, he believed, somehow turned up or down the “volume” of the neurons he’d recorded in other areas of the brain.

Not only was Miller right, but the rule change consistently caused twice as many neurons in the prefrontal cortex to fire as in the simpler experiments, where the task required the animal just to hold a picture in mind.

“That told us something,” he says. Perhaps the prefrontal cortex’s primary job wasn’t short-term memory at all, but to learn the rules of the game.

In 2001, Miller published a research review that fundamentally shifted the way many viewed the prefrontal cortex. He compared the prefrontal cortex to a railroad switch operator, and the rest of the brain to railroad tracks: the switch operator activates some parts of the track and takes others offline. This model would explain how attention works. It explains, for instance, how an animal can focus on a picture while suppressing a noise. And it would explain why Phineas Gage had trouble blocking out distractions and focusing on the task at hand.

The theory made intuitive sense. But to some, steeped in the specialized-neuron theories of Hubel and Wiesel, Miller’s theory seemed preposterous.

“That’s impossible!” Miller recalls one prominent neuroscientist declaring after Miller delivered an invited lecture. “We all know that neurons do one thing. Your problem is you can’t figure out what these neurons are doing,” the researcher told him.

But Miller has continued to accumulate experimental evidence—as have many other labs—gradually winning scientists over to his idea.

“Neurons are multifunctional,” Miller says. “We’ve shown this over and over again for 20 years.”

Earl Miller plays bass at the Tavern at the End of the World in Boston's Charlestown neighborhood. Photograph by Jason Grow.

“Earl is kind of a rock star. When he says something, a lot more people notice it.”

Wave Change

These days, Miller is taking on another piece of dogma—that neurons primarily communicate by electrical spikes. In recent papers, Miller argues that there’s still a lot to learn from the rhythmic electrical currents called oscillations, or brain waves.

When we hold an item in working memory, these oscillations move through brain circuits in waves that rise and fall scores of times a second. These oscillations, he argues, are how the prefrontal cortex—that mental “switch operator”—stores several items on the cusp of our awareness in working memory, so we can pull them into our conscious minds as needed.

The oscillations aren’t enough to make the neurons spike. But the brain waves bind together all the neurons in a circuit with every crest, pushing the neurons so close to their firing point that they’re primed to respond to just the slightest extra stimulus.

This might help answer a question that has long intrigued scientists: How can the human brain store a virtually unlimited number of long-term memories, yet remain severely limited in the information we can hold in our conscious minds at once?

It’s a limit most notably characterized by Princeton cognitive psychologist George Miller (no relation) in a 1956 paper, “The Magical Number Seven, Plus or Minus Two.” George Miller, who helped coin the term working memory, argued that seven, plus or minus two, is the maximum number of objects most of us can hold in our short-term memory at once. Researchers have since demonstrated the number can vary far more widely and may even be smaller than seven. But no one doubts there are limits. (See sidebar below.)

If working memory is encoded in oscillations, Earl Miller says it would explain these limits, because a single wave can only rise and fall a certain number of times a second. “That means you have to fit in everything you want to juggle in your current conscious mind,” he says. “That’s a natural limitation in bandwidth.”

Brad Postle, a University of Wisconsin-Madison neuroscientist, says the idea that something other than the spiking of neurons is important has been “kicking around for a while.” Postle himself suggested brain waves may play a role in focusing attention. Still, he believes it’s significant that Miller is now arguing the point.

“Having it come out of Earl Miller’s mouth almost by definition will bring attention to it,” says Postle, who authored a widely used neuroscience textbook that includes many of Miller’s earlier experiments. “Earl is kind of a rock star. When he says something, a lot more people notice it.”

Now, Miller is focusing on new technologies that might actually enhance working memory capacity.

“If we find a way to stretch the cycle, increase amplitude, make it taller or maybe slow the frequency a little bit, maybe we could increase the capacity of working memory,” he says.

So he’s planning on experimenting with a technique that uses electrodes placed on top of the scalp to deliver faint pulses of electricity and record the impact. If these pulses are timed correctly, they could change the shape of the brain waves.

It would be a significant technological feat, but Miller thinks it’ll work. If he’s correct, it could have a profound impact on human performance, literally expanding our brainpower.

Adam Piore is a writer based in Boston, Mass. Excerpted from an article that originally appeared in the October 2016 issue of Discover magazine.


The power of paying attention

When neuroscientist Earl Miller, BA ’85, spoke at Kent State’s Commencement in May 2016, he gave the graduates some practical advice that boiled down to one word: focus. “Multitasking ruins productivity, causes mistakes and impedes creative thought,” he says. Below, he breaks down why that is—and what you can do about it.

Multitasking is a misnomer

Your brain has limited capacity for simultaneous thought. Humans can only hold a little bit of information in mind at any single moment, but your brain deludes you into thinking you understand more about what’s going on around you than you actually do. For example, you probably think you see almost everything in front of you. But you’re actually sipping at the outside world through a straw. Your brain can only take information in little snippets, which it combines to give you the illusion of seamless visual perception.

You can’t pay attention to two things at the same time. Toggling between tasks requires a series of small cognitive shifts. You may think you’re juggling two tasks at once, but actually you’re switching back and forth very rapidly. For example, if you interrupt a project to check an incoming email or watch a cat video, when you finally return to the task your brain has to expend valuable mental energy refocusing, backtracking and fixing errors.

Whenever you switch from one thought to another, you cognitively stumble a little bit. Humans have a great ability to change their thoughts from moment to moment, but it comes at a cost. As the cognitive apparatus in your brain reconfigures from one mode of thought to another, you slow down, make more errors and miss things.

You’re less likely to think creatively if you multitask. Innovative thinking comes from extended concentration, i.e., the ability to follow links between thoughts. Memory is a big network of associations. Truly deep and creative thoughts come from following the path of this network to new and different places. When you try to multitask, you typically don’t get far enough down any path to stumble upon something original because you’re constantly switching and backtracking. Multitasking lowers the quality of your thoughts, making them more superficial, less creative and less innovative.

Your brain is ill-equipped to handle sensory overload. In prehistoric days, when the human brain first evolved, the environment was very different. Any new information could be critical to survival—a rustling in the bush might mean a tiger is about to leap out. It was adaptive for our brains to seek out and pay attention to new information, and our brains also evolved to focus on one thing at a time. In today’s society, however, the ceaseless onslaught of information has the potential to cripple us. What was once an evolutionary advantage has become a distraction.

You may think you’re good at multitasking, but you’re not. Studies have shown that people who think they are good at multitasking are actually the worst at it. People who multitask a lot do so because they are easily distracted. Then they rationalize it by convincing themselves that they are really good at multitasking. 

Improve your ability to focus

Block out a period of time. Think ahead about what you need to accomplish, and plan to focus instead of trying to multitask.

Eliminate as many distractions as possible. Work in a quiet environment. Put away your mobile phone and tablet. Shut off extra computer screens. Turn off email alerts and shut down your email if you have to. Don’t try to monotask by willpower alone; it’s too hard to fight the brain’s craving for new information. Instead, prevent the urge by removing the temptation.

Work on one task at a time for extended periods. Your work quality and productivity will improve if you focus on one task at a time.

Take a short break. If you feel yourself losing focus, walk around a bit. It increases blood flow to the brain and helps restore focus.

Prioritize by doing the most important tasks first. This removes some of the pressure to multitask as deadlines draw near.

Introduce novelty. What we perceive is not a faithful representation of the world. Our brain is constantly interpreting sensory inputs, and if something is familiar, the brain begins to gloss over it. For example, to catch errors when proofing a paper, change the formatting or read it aloud to “wake up” your brain and cause it to pay closer attention.

Put away your cell phone when you drive. Your ability to pay attention to the road while you talk on the phone is another delusion. It is estimated that as many as half of the car accidents in the United States alone are due to distracted driving. Studies have shown that talking on the phone causes drivers to miss as much as half the things in front of them.

Hands-free headsets don’t help much, because it’s the cognitive demands of conversation that cause the distraction. (Talking to a passenger is different, because they know when to shut up.)

And if you find yourself focusing intently on a radio program or an audiobook, etc., turn it off. You have a limited pool of cognitive resources. Multitasking while driving is just plain dangerous.—Earl Miller

Learn more about Professor Miller's research at https://ekmillerlab.mit.edu/