Researchers at the Princeton Neuroscience Institute (PNI) are tackling some of the biggest mysteries of the human mind: Why we think and behave as we do, how we make decisions, how we choose what to ignore and remember, and how we can learn to forget.
These abilities arise from our 100 billion brain cells, each of which communicates with about 10,000 other nerve cells. Discovering how these neural conversations give rise to a thought, a memory or a decision is the goal of Princeton neuroscientists.
Some of these scientists come at the challenge by probing individual neurons, while others study the activity of entire brain regions. Only by understanding both the big picture and the tiny details of neuronal function and communication can we begin to understand the complexity of the brain, said PNI co-directors Jonathan Cohen, the Robert Bendheim and Lynn Bendheim Thoman Professor in Neuroscience, and David Tank, the Henry L. Hillman Professor in Molecular Biology.
These two scientists epitomize PNI’s approach to understanding the brain. Cohen looks at brain activity and constructs theories of how we guide attention, thought and action in accordance with our goals and intentions. Tank uses a microscope trained on living neurons to explore how networks of these cells orchestrate short-term memory and decision-making.
“The institute successfully spans the different levels of analysis that we frankly need for understanding how the brain gives rise to thoughts, feelings and behaviors,” Cohen said.
“This continuum of people with expertise in computation, mathematics, psychology, biology and related disciplines is what helps set the Princeton Neuroscience Institute apart,” Tank added.
The interdisciplinary nature of Princeton neuroscience is what attracted Carlos Brody to PNI. Brody, an associate professor of molecular biology and PNI, is also a Howard Hughes Medical Institute (HHMI) investigator. He focuses on developing computational models that explain the neural pathways behind behavior and cognition. One such model simulates decision-making behavior in the prefrontal cortex, the area of the brain behind the forehead. Even when we make what seems like a simple decision, neurons are sending and receiving signals to and from thousands of other cells within their neural network.
“We are just at the beginning of understanding the brain, so we rely on model systems that can help us understand how the brain works,” Brody said. “It is the connections between neurons that make the brain work the way it does.”
Brody and his team are using a computer model to explore a theory of decision-making wherein the brain tallies information little by little until it finally makes a decision. Usually this happens so quickly that we are unaware of it.
But sometimes the process happens slowly, as in the morning when it takes you several seconds to realize that your alarm clock is beeping and it is time to get out of bed.
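The little-by-little tallying that Brody describes can be sketched as a simple accumulator that commits to a choice once the running total crosses a bound. This is an illustrative toy with made-up parameters, not the lab's actual model:

```python
import random

def decide(n_sounds=20, p_left=0.7, threshold=4):
    """Toy accumulator: add +1 for a sound on the left, -1 for one on
    the right, and commit to a choice once the running tally crosses
    a decision bound. Parameters are invented for illustration."""
    tally = 0
    for _ in range(n_sounds):
        tally += 1 if random.random() < p_left else -1
        if tally >= threshold:
            return "left"
        if tally <= -threshold:
            return "right"
    # No bound crossed: go with whichever side the tally favors.
    return "left" if tally >= 0 else "right"
```

With the evidence favoring one side (here `p_left = 0.7`), most simulated trials end in a correct "left" choice, while noisier evidence takes longer to reach the bound.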
To test his computer model of how the brain tallies incoming information prior to making a decision, Brody trained laboratory rodents to respond to a series of sounds coming from the right or left side. To earn a reward, the rodents had to decide from which side the majority of sounds were coming, and then look in that direction.
During the experiment, Brody and his team monitored the rodents’ brains to determine which neural networks in the prefrontal cortex were active. Their research, which is ongoing and is supported by HHMI and the Human Frontier Science Program, could shed light on how we create so-called “working memory,” the temporary store of information that is essential for making decisions and other cognitive functions.
The brain’s working memory is a central component not only of decision-making but also of navigation behavior. PNI co-director David Tank and his team are studying how networks of neurons create working memory in mice navigating a virtual maze.
While navigating the maze, each mouse sees visual patterns it has learned to recognize as an indication to turn at an upcoming intersection in the maze. The animal must then hold the signal in memory until it reaches the turn. Tank and his colleagues discovered that during this task the neuron populations involved in storing the memory fire in distinctive sequences. The study was published in the journal Nature in April 2012 and was supported by the National Institutes of Health, including a National Institutes of Health Challenge Grant, part of the American Recovery and Reinvestment Act of 2009.
Tank uses mathematical models and statistical data analyses to describe how the firing activity of neurons, linked together into neural circuits, causes memories to be held or lost and how they lead to decisions.
“Studies such as this are aimed at understanding the basic principles of neural activity during memory and decision making in the normal brain,” Tank said. “However, the work may in the future assist researchers in understanding how activity might be altered in brain disorders that involve deficits in working memory. Schizophrenia is one disorder thought to involve deficits in working memory.”
Our ability to recall previous experiences, while impressive, can weaken due to age or other factors. But assigning a timestamp to a memory, said Kenneth Norman, an associate professor of psychology and PNI, can help the brain retrieve important information when needed. Norman suspects that people categorize the “when” of a memory by storing additional information about what happened just before or just after the event. If you stopped for a cappuccino after class one day, for example, remembering the café could jog your memory of the lecture topic.
With his students, Norman is developing computational theories of how timestamping works and testing these theories against experimental data. The work could lead to new techniques that enhance our ability to remember.
“Using computers, we can build networks of neurons and test our theories of how the strength of the connections between those neurons changes as a function of experience,” Norman said. “If we build a good model, these networks should ‘remember’ in the same way that humans do.”
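The kind of network Norman describes can be illustrated with a textbook Hebbian (Hopfield-style) memory, in which connection strengths are set by which neurons were active together during study, and a partial cue is enough to pull back the whole pattern. This is a generic sketch of the idea, not Norman's actual model:

```python
import numpy as np

def hebbian_store(patterns):
    """Store binary (+1/-1) patterns by strengthening the connection
    between every pair of neurons that were co-active (Hebbian rule).
    `patterns` is a 2-D array, one pattern per row."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)   # co-active pairs get stronger links
    np.fill_diagonal(w, 0)    # no self-connections
    return w / len(patterns)

def recall(w, cue, steps=10):
    """Retrieve a stored pattern from a noisy or partial cue by
    repeatedly updating each neuron from its weighted inputs."""
    state = cue.copy()
    for _ in range(steps):
        state = np.where(w @ state >= 0, 1, -1)
    return state
```

Corrupting a few neurons of a stored pattern and running `recall` restores the original, which is the sense in which remembering the café can jog the memory of the lecture.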
Although memory is essential, forgetting also can be valuable. Methods for extinguishing bad memories could be of use in treating post-traumatic stress disorder.
Norman is exploring the idea that bringing a memory partially to mind can weaken or extinguish it. “If you totally shut out a memory, then, according to our theory, it will come back just as strong as it was before,” Norman said of the research, which is sponsored by the National Institute of Mental Health. “Similarly, if you constantly relive the memory, it will get stronger. We hope to develop procedures for eliciting just the right level of memory.”
Weak recall of a memory may be what helps the brain forget. To test this idea, Norman and his team are scanning brains of human volunteers using functional magnetic resonance imaging (fMRI), which reveals active areas of the brain by measuring blood flow. A participant is asked to study several randomly generated pairings of words and photographs — for example, the word “nickel” paired with a photo of a man’s face — until the two concepts are linked in the participant’s mind. Then the researchers present the cue word “nickel” and ask the participant to avoid thinking about the associated picture.
The researchers look at the brain’s activity to see how much the picture of the man is coming to mind while participants try to avoid thinking about it. Results from this study show that when the picture comes to mind only moderately, the word-picture association is more likely to be forgotten.
Just as forgetting unpleasant memories is useful for mental health, so is being able to ignore unwanted details when necessary. When hailing a cab in a major city, your brain can ignore hundreds of cars while searching for one with a bright yellow paint job. But when it is time to cross the street, you risk your life if you ignore even one of those hundreds of cars.
“Our brains have a way to determine what is relevant for decision-making in any given scenario and what details can be ignored,” said Yael Niv, an assistant professor of psychology and PNI, who explores how the brain decides what is important. “If we didn’t, we wouldn’t be able to learn from our experiences because no two situations are exactly alike.”
Niv created a computer game in which she could find out if players used a strategy of ignoring non-relevant information.
The player views three pictures on a screen, each varying in color, pattern and shape. The goal of the game is to guess which feature, such as a triangle shape, the computer has specified in advance. After viewing the screen, the player makes a guess and receives a point if he or she chooses the picture that contains the correct feature. By viewing a series of screens and receiving feedback on each guess, the player can figure out that the triangle shape is the correct feature.
After human volunteers played the game, the researchers created a computer model that described the typical player’s progress toward the correct answer. Previous models suggested that the player learns about all image features at once, slowly homing in on the correct one. But Niv and PNI postdoctoral research fellow Robert Wilson found that the brain employs shortcuts: it forms a hypothesis about a single feature, tests it, and discards it if it proves wrong, repeating the process until the player deduces the correct answer.
“When you choose a green triangle with polka dots, you may be learning about green, about triangles and about polka dots,” Niv said. “But what our model shows is that human behavior is more consistent with the following description: You think in your head, ‘maybe green is correct,’ so you choose the green polka-dotted triangle, and based on the feedback, you update whether you think green is correct. You basically learn nothing about triangles and polka dots because you were ignoring them.”
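The strategy Niv describes can be sketched as a toy simulation of the game. The feature names and trial counts below are invented for illustration, and the hold-until-first-failure rule is a deliberate simplification of the published model:

```python
import random

COLORS = ["red", "green", "blue"]
PATTERNS = ["dots", "stripes", "waves"]
SHAPES = ["circle", "square", "triangle"]
ALL_FEATURES = COLORS + PATTERNS + SHAPES

def make_screen():
    """Build three pictures, each combining one color, one pattern and
    one shape, with the values shuffled across the three pictures."""
    cols, pats, shps = COLORS[:], PATTERNS[:], SHAPES[:]
    random.shuffle(cols)
    random.shuffle(pats)
    random.shuffle(shps)
    return [set(trio) for trio in zip(cols, pats, shps)]

def play(target, n_trials=100):
    """Serial hypothesis testing: hold one candidate feature in mind,
    pick the picture containing it, and discard the hypothesis the
    first time that choice earns no point."""
    candidates = ALL_FEATURES[:]
    random.shuffle(candidates)
    hypothesis = candidates.pop()
    for _ in range(n_trials):
        screen = make_screen()
        choice = next(pic for pic in screen if hypothesis in pic)
        if target not in choice:          # no point: rule it out
            hypothesis = candidates.pop()  # move to the next candidate
    return hypothesis
```

Because the correct feature always earns a point, a single failure is enough to rule a hypothesis out, so the player converges on the target after testing only a handful of candidates instead of tracking all nine features at once — the memory saving Niv points to.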
These mental shortcuts are faster and require less memory than would be needed to explore the entire range of possibilities, Niv said. The work, which was supported by the Alfred P. Sloan Foundation, the United States-Israel Binational Science Foundation and the U.S. National Institute on Drug Abuse, was published in a January 2012 issue of Frontiers in Human Neuroscience.
“This ’structure learning’ — using experiences to learn what information to ignore in a specific task — is what allows animals and humans to learn so quickly,” Niv said. “Learning seems to depend on how we take a task and divide it into parts that are relevant and irrelevant, based on previous experiences.”
PNI provides a framework for fostering communication between what once were largely separate areas of study. The task of understanding the cognitive brain brings together researchers from psychology, molecular biology and other disciplines.
“The fields of neuroscience and psychology are increasingly complementary,” said Deborah Prentice, chair of Princeton’s Department of Psychology and the Alexander Stewart 1886 Professor of Psychology and Public Affairs.
“It used to be that experimentalists had a black box approach — you altered some experimental factors and you observed what came out the other side,” Prentice said. “Today’s neuroscience techniques allow us to finally look inside the black box.”
“Some of the most exciting and challenging problems in molecular and cell biology are found in neuroscience,” said PNI researcher Lynn Enquist, the Henry L. Hillman Professor of Molecular Biology and chair of the Department of Molecular Biology. “Today, we are using high-resolution optical imaging in combination with powerful new tools in genetics, genomics and cell biology to produce a wealth of information about the brain.”
Wilson, Robert C., and Yael Niv. 2012. “Inferring Relevance in a Changing World.” Frontiers in Human Neuroscience, Vol. 5, Article 189.
Harvey, Christopher D., Philip Coen and David W. Tank. 2012. “Choice-Specific Sequences in Parietal Cortex During a Virtual-Navigation Decision Task.” Nature, Vol. 484, no. 7392: 62-68.