MIT Down syndrome researchers work on ways to ensure a healthy lifespan

An Alana Down Syndrome Center webinar, co-sponsored by the Massachusetts Down Syndrome Congress, presented numerous MIT studies that all share the goal of improving health throughout life for people with trisomy 21.

David Orenstein | The Picower Institute for Learning and Memory
April 24, 2025

In recent decades the life expectancy of people with Down syndrome has surged past 60 years, so the focus of research at the Alana Down Syndrome Center at MIT has been to make sure people can enjoy the best health during that increasing timeframe.

“A person with Down syndrome can live a long and happy life,” said Rosalind Mott Firenze, scientific director of the center founded at MIT in 2019 with a gift from the Alana Foundation. “So the question is now how do we improve health and maximize ability through the years? It’s no longer about lifespan, but about healthspan.”

Firenze and three of the center’s Alana Fellows spoke during a webinar, hosted on April 17, where they described the center’s work toward that goal. An audience of 99 people signed up for the webinar, titled “Building a Better Tomorrow for Down Syndrome Through Research and Technology,” with many viewers hailing from the Massachusetts Down Syndrome Congress, which co-sponsored the event.

The research they presented covered ways to potentially improve health from stages before birth to adulthood in areas such as brain function, heart development, and sleep quality.

Boosting brain waves

One of the center’s most important areas of research involves testing whether boosting the power of a particular frequency of brain activity—“gamma” brain waves at 40 Hz—can improve brain development and function. The lab of the center’s Director Li-Huei Tsai, Picower Professor in The Picower Institute for Learning and Memory and the Department of Brain and Cognitive Sciences, uses light that flickers and sound that clicks 40 times a second to increase that rhythm in the brain. In early studies of people with Alzheimer’s disease, which is a major health risk for people with Down syndrome, the non-invasive approach has proved safe and appears to improve memory while preventing brain cells from dying. It appears to work because it promotes a healthy response among many types of brain cells.
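The stimulus itself is conceptually simple: a light flickering, and a sound clicking, 40 times per second. As a rough illustration only (not the Tsai lab’s actual stimulus hardware or software), the sketch below synthesizes a 40 Hz auditory click train in Python; the sample rate, click duration, and total length are arbitrary choices.

```python
import numpy as np

# Illustrative sketch only: a 40 Hz click train like the one described above.
# Parameter values here are arbitrary, not the lab's protocol.
SAMPLE_RATE = 44_100                      # audio samples per second
CLICK_RATE_HZ = 40                        # the "gamma" frequency
DURATION_S = 10                           # total stimulus length, in seconds
CLICK_SAMPLES = SAMPLE_RATE // 1000       # roughly a 1 ms click

signal = np.zeros(SAMPLE_RATE * DURATION_S)
period = SAMPLE_RATE // CLICK_RATE_HZ     # one click onset every ~25 ms

for onset in range(0, len(signal), period):
    signal[onset:onset + CLICK_SAMPLES] = 1.0   # brief rectangular click

# `signal` can be written to a WAV file for playback; a 40 Hz visual flicker
# is analogous, toggling a light on and off at the same rate.
```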

Working with mice that genetically model Down syndrome, Alana Fellow Dong Shin Park has been using the sensory stimulation technology to study whether the healthy cellular response can affect brain development in a fetus while a mother is pregnant. In ongoing research, he said, he’s finding that exposing pregnant mice to the light and sound appears to improve fetal brain development and brain function in the pups after they are born.

In his research, Postdoctoral Associate Md. Rezaul Islam worked with 40 Hz sensory stimulation in Down syndrome model mice at a much later stage of life: adulthood. Together with former Tsai Lab member Brennan Jackson, he found that when the mice were exposed to the light and sound, their memory improved. The underlying reason appeared to be not only an increase in new connections among their brain cells, but also an increase in the generation of new brain cells. The research, currently online as a preprint, is expected to appear in a peer-reviewed journal soon.

Firenze said the Tsai lab has also begun to test the sensory stimulation in human adults with Down syndrome. In that testing, which is led by Dr. Diane Chan, it is proving safe and well tolerated, so the lab is hoping to do a year-long study with volunteers to see if the stimulation can delay or prevent the onset of Alzheimer’s disease.

Studying cells

Many Alana Center researchers are studying other aspects of the biology of cells in Down syndrome to improve healthspan. Leah Borden, an Alana Fellow in the lab of Biology Professor Laurie Boyer, is studying differences in heart development. Using advanced cultures of human heart tissues grown from trisomy 21 donors, she is finding that tissue tends to be stiffer than in cultures made from people without the third chromosome copy. The stiffness, she hypothesizes, might affect cellular function and migration during development, contributing to some of the heart defects that are common in the Down syndrome population.

Firenze pointed to several other advanced cell biology studies going on in the center. Researchers in the lab of Computer Science Professor Manolis Kellis, for instance, have used machine learning and single cell RNA sequencing to map the gene expression of more than 130,000 cells in the brains of people with or without Down syndrome to understand differences in their biology.
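Single-cell analyses of this kind generally follow a common computational recipe: normalize each cell’s counts, cluster cells into putative types, and then compare gene expression between conditions within the dataset. The sketch below uses the scanpy toolkit to illustrate that general workflow; it is not the Kellis lab’s actual pipeline, and the file name and the “genotype” labels are hypothetical.

```python
import scanpy as sc

# Generic single-cell RNA-seq workflow (illustrative; not the Kellis lab's code).
# "brain_cells.h5ad" and the "genotype"/"trisomy21"/"control" labels are hypothetical.
adata = sc.read_h5ad("brain_cells.h5ad")            # e.g., ~130,000 profiled brain cells

sc.pp.normalize_total(adata, target_sum=1e4)        # library-size normalization
sc.pp.log1p(adata)                                  # log-transform counts
sc.pp.highly_variable_genes(adata, n_top_genes=2000)
sc.pp.pca(adata, n_comps=50)
sc.pp.neighbors(adata)                              # build a cell-cell similarity graph
sc.tl.leiden(adata)                                 # cluster cells into putative cell types

# Compare expression between Down syndrome and control donors.
sc.tl.rank_genes_groups(adata, groupby="genotype",
                        groups=["trisomy21"], reference="control")
```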

Researchers in the lab of Y. Eva Tan Professor Edward Boyden, meanwhile, are using advanced tissue imaging techniques to look into the anatomy of cells in mice, Firenze said. They are finding differences in the structure of key organelles called mitochondria, which provide cells with energy.

And in 2022, Firenze recalled, Tsai’s lab published a study showing that brain cells in Down syndrome mice exhibited a genome-wide disruption in how genes are expressed, leading them to take on a more senescent, or aged-like, state.

Striving for better sleep

One other theme of the Alana Center’s research that Firenze highlighted focuses on ways to understand and improve sleep for people with Down syndrome. In mouse studies in Tsai’s lab, they’ve begun to measure sleep differences between model and neurotypical mice to understand more about the nature of sleep disruptions.

“Sleep is different and we need to address this because it’s a key factor in your health,” Firenze said.

Firenze also highlighted how the Alana Center has collaborated with MIT’s Deshpande Center for Technological Innovation to help advance a new device for treating sleep apnea in people with Down syndrome. Led by Mechanical Engineering Associate Professor Ellen Roche, the ZzAlign device improves on current technology with a custom-fit oral prosthesis paired with just a small tube that provides the air pressure needed to stabilize mouth muscles and prevent obstruction of the airway.

Through many examples of research projects aimed at improving brain and heart health and enhancing sleep, the webinar presented how MIT’s Alana Down Syndrome Center is working to advance the healthspan of people with Down syndrome.

 

Sebastian Lourido awarded highest alumni honor from alma mater

Whitehead Institute Member Sebastian Lourido receives the Tulane 2025 Science and Engineering Outstanding Alumni Award for Professional Excellence

Whitehead Institute
April 11, 2025

The Lourido laboratory at Whitehead Institute studies the developmental transitions and molecular pathways that the single-celled parasite Toxoplasma gondii (T. gondii) uses to infect its host, causing toxoplasmosis. The lab combines approaches spanning phospho-proteomics, chemical genetics, and genome editing to investigate the unique biology of these organisms and to identify features that can be targeted to treat infections by T. gondii and related parasites.

Lourido, who is also an associate professor of biology at Massachusetts Institute of Technology, originally joined Whitehead Institute as a Whitehead Fellow in 2012, a program that allows promising MD or PhD graduates to initiate their own research program in lieu of a traditional postdoctoral fellowship. “Sebastian’s demonstrated excellence as a young investigator underscores the importance of investing in the next generation of scientists and scientific leaders,” says Ruth Lehmann, Whitehead Institute’s President and Director.

After receiving both a BS in Cell and Molecular Biology and a BFA in Studio Art, Lourido went on to pursue graduate work at Washington University in St. Louis. In addition to this honor, Lourido has received other awards, including the NIH Director’s Early Independence Award and the 2024 William Trager Award from the American Society of Tropical Medicine and Hygiene, and has been recognized as one of the Burroughs Wellcome Fund’s Investigators in the Pathogenesis of Infectious Disease.

Manipulating time with torpor

New research from the Hrvatin Lab recently published in Nature Aging indicates that inducing a hibernation-like state in mice slows down epigenetic changes that accompany aging.

Shafaq Zia | Whitehead Institute
March 7, 2025

Surviving extreme conditions in nature is no easy feat. Many species of mammals rely on special adaptations called daily torpor and hibernation to endure periods of scarcity. These states of dormancy are marked by a significant drop in body temperature, low metabolic activity, and reduced food intake—all of which help the animal conserve energy until conditions become favorable again.

The lab of Whitehead Institute Member Siniša Hrvatin studies daily torpor, which lasts several hours, and its longer counterpart, hibernation, in order to understand their effects on tissue damage, disease progression, and aging. In their latest study, published in Nature Aging on March 7, first author Lorna Jayne, Hrvatin, and colleagues show that inducing a prolonged torpor-like state in mice slows down epigenetic changes that accompany aging.

“Aging is a complex phenomenon that we’re just starting to unravel,” says Hrvatin, who is also an assistant professor of biology at Massachusetts Institute of Technology. “Although the full relationship between torpor and aging remains unclear, our findings point to decreased body temperature as the central driver of this anti-aging effect.”

Tampering with the biological clock

Aging is a universal process, but scientists have long struggled to find a reliable metric for measuring it. Traditional clocks fall short because biological age doesn’t always align with chronology—cells and tissues in different organisms age at varying rates.

To solve this dilemma, scientists have turned to studying molecular processes that are common to aging across many species. This, in the past decade, has led to the development of epigenetic clocks, new computational tools that can estimate an organism’s age by analyzing the accumulation of epigenetic marks in cells over time.

Think of epigenetic marks as tiny chemical tags that cling either to the DNA itself or to the proteins, called histones, around which the DNA is wrapped. Histones act like spools, allowing long strands of DNA to coil around them, much like thread around a bobbin. When epigenetic tags are added to histones, they can compact the DNA, preventing genetic information from being read, or loosen it, making the information more accessible. When epigenetic tags attach directly to DNA, they can alter how the proteins that “read” a gene bind to the DNA.

While it’s unclear if epigenetic marks are a cause or consequence of aging, this much is evident: these marks change over an organism’s lifespan, altering how genes are turned on or off, without modifying the underlying DNA sequence. These changes have enabled researchers to track the biological age of individual cells and tissues using dedicated epigenetic clocks.
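Most epigenetic clocks are, at their core, penalized linear regressions: a weighted sum of methylation levels at selected CpG sites, fit against known ages. The sketch below illustrates that idea on simulated data; published clocks, including the mammalian blood clock mentioned later in this story, rely on curated CpG panels and calibrations not reproduced here.

```python
import numpy as np
from sklearn.linear_model import ElasticNetCV

# Toy illustration of how an epigenetic clock is typically trained:
# an elastic-net regression from CpG methylation fractions to age.
# The data here are simulated; real clocks use curated CpG panels.
rng = np.random.default_rng(0)
n_samples, n_cpgs = 200, 500
methylation = rng.uniform(0, 1, size=(n_samples, n_cpgs))        # beta values in [0, 1]
weights = rng.normal(0, 1, n_cpgs) * (rng.random(n_cpgs) < 0.05) # a few informative sites
age = methylation @ weights * 10 + 50 + rng.normal(0, 2, n_samples)

clock = ElasticNetCV(cv=5).fit(methylation, age)

# "Epigenetic age" is the clock's prediction for a new methylation profile;
# the gap between predicted and chronological age is the age acceleration.
predicted_age = clock.predict(methylation[:5])
```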

In nature, states of stasis like hibernation and daily torpor help animals survive by conserving energy and avoiding predators. But now, emerging research in marmots and bats hints that hibernation may also slow down epigenetic aging, prompting researchers to explore whether there’s a deeper connection between prolonged bouts of torpor and longevity.

However, investigating this link has been challenging, as the mechanisms that trigger, regulate, and sustain torpor remain largely unknown. In 2020, Hrvatin and colleagues made a breakthrough by identifying neurons in a specific region of the mouse hypothalamus, known as the avMLPA, which act as core regulators of torpor.

“This is when we realized that we could leverage this system to induce torpor and explore mechanistically how the state of torpor might have beneficial effects on aging,” says Jayne. “You can imagine how difficult it is to study this in natural hibernators because of accessibility and the lack of tools to manipulate them in sophisticated ways.”

The age-old mystery

The researchers began by injecting adeno-associated virus in mice, a gene delivery vehicle that enables scientists to introduce new genetic material into target cells. They employed this technology to instruct neurons in the mice’s avMLPA region to produce a special receptor called Gq-DREADD, which does not respond to the brain’s natural signals but can be chemically activated by a drug. When the researchers administered this drug to the mice, it bound to the Gq-DREADD receptors, activating the torpor-regulating neurons and triggering a drop in the animals’ body temperature.

However, to investigate the effects of torpor on longevity, the researchers needed to maintain these mice in a torpor-like state for days to weeks. To achieve this, the mice were continuously administered the drug through drinking water.

The mice were kept in a torpor-like state with periodic bouts of arousal for a total of nine months. The researchers measured the blood epigenetic age of these mice at the 3-, 6-, and 9-month marks using the mammalian blood epigenetic clock. By the 9-month mark, the torpor-like state had reduced blood epigenetic aging in these mice by approximately 37%, making them biologically three months younger than their control counterparts.
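The two figures are consistent under a simple assumption (a simplification of mine, not a statement from the paper): if control animals’ blood epigenetic age advanced by roughly the full nine months of the experiment, then a 37 percent reduction corresponds to about 0.37 × 9 ≈ 3.3 months of aging avoided, or roughly three months of biological youth preserved relative to controls.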

To further assess the effects of torpor on aging, the group evaluated these mice using the mouse clinical frailty index, which includes measurements like tail stiffening, gait, and spinal deformity that are commonly associated with aging. As expected, mice in the torpor-like state had a lower frailty index compared to the controls.

With the anti-aging effects of the torpor-like state established, the researchers sought to understand how each of the key factors underlying torpor—decreased body temperature, low metabolic activity, and reduced food intake—contributed to longevity.

To isolate the effects of reduced metabolic rate, the researchers induced a torpor-like state in mice while maintaining the animals’ normal body temperature. After three months, the blood epigenetic age of these mice was similar to that of the control group, suggesting that a low metabolic rate alone does not slow down epigenetic aging.

Next, Hrvatin and colleagues isolated the impact of low caloric intake on blood epigenetic aging by restricting the food intake of mice in the torpor-like state, while maintaining their normal body temperature. After three months, these mice were a similar blood epigenetic age as the control group.

When both low metabolic rate and reduced food intake were combined, the mice still exhibited higher blood epigenetic aging after three months compared to mice in the torpor state with low body temperature. These findings, combined, led the researchers to conclude that neither low metabolic rate nor reduced caloric intake alone are sufficient to slow down blood epigenetic aging. Instead, a drop in body temperature is necessary for the anti-aging effects of torpor.

Although the exact mechanisms linking low body temperature and epigenetic aging are unclear, the team hypothesizes that it may involve the cell cycle, which regulates how cells grow and divide: lower body temperatures can potentially slow down cellular processes, including DNA replication and mitosis. This, over time, may impact cell turnover and aging. With further research, the Hrvatin Lab aims to explore this link in greater depth and shed light on the lingering mystery.

Taking the pulse of sex differences in the heart

Work led by MD-PhD student Maya Talukdar and Page Lab postdoc Lukáš Chmátal shows that there are differences in how healthy male and female heart cells—specifically, cardiomyocytes, the muscle cells responsible for making the heart beat—generate energy.

Greta Friar | Whitehead Institute
February 18, 2025

Heart disease is the number one killer of men and women, but it often presents differently depending on sex. There are sex differences in the incidence, outcomes, and age of onset of different types of heart problems. Some of these differences can be explained by social factors—for example, women experience less-well recognized symptoms when having heart attacks, and so may take longer to be diagnosed and treated—but others are likely influenced by underlying differences in biology. Whitehead Institute Member David Page and colleagues have now identified some of these underlying biological differences in healthy male and female hearts, which may contribute to the observed differences in disease.

“My sense is that clinicians tend to think that sex differences in heart disease are due to differences in behavior,” says Harvard-MIT MD-PhD student Maya Talukdar, a graduate student in Page’s lab. “Behavioral factors do contribute, but even when you control for them, you still see sex differences. This implies that there are more basic physiological differences driving them.”

Page, who is also an HHMI Investigator and a professor of biology at the Massachusetts Institute of Technology, and members of his lab study the underlying biology of sex differences in health and disease, and recently they have turned their attention to the heart. In a paper published on February 17 in the women’s health edition of the journal Circulation, work led by Talukdar and Page lab postdoc Lukáš Chmátal shows that there are differences in how healthy male and female heart cells—specifically, cardiomyocytes, the muscle cells responsible for making the heart beat—generate energy.

“The heart is a hard-working pump, and heart failure often involves an energy crisis in which the heart can’t summon enough energy to pump blood fast enough to meet the body’s needs,” says Page. “What is intriguing about our current findings and their relationship to heart disease is that we’ve discovered sex differences in the generation of energy in cardiomyocytes, and this likely sets up males and females differently for an encounter with heart failure.”

Page and colleagues began their work by looking for sex differences in healthy hearts because they hypothesize that these impact sex differences in heart disease. Differences in baseline biology in the healthy state often affect outcomes when challenged by disease; for example, people with one copy of the sickle cell trait are more resistant to malaria, certain versions of the HLA gene are linked to slower progression of HIV, and variants of certain genes may protect against developing dementia.

Identifying baseline traits in the heart and figuring out how they interact with heart disease could not only reveal more about heart disease, but could also lead to new therapeutic strategies. If one group has a trait that naturally protects them against heart disease, then researchers can potentially develop medical therapies that induce or recreate that protective feature in others. In such a manner, Page and colleagues hope that their work to identify baseline sex differences could ultimately contribute to advances in prevention and treatment of heart disease.

The new work takes the first step by identifying relevant baseline sex differences. The researchers combined their expertise in sex differences with heart expertise provided by co-authors Christine Seidman, a Harvard Medical School professor and director of the Cardiovascular Genetics Center at Brigham and Women’s Hospital; Harvard Medical School Professor Jonathan Seidman; and Zoltan Arany, a professor and director of the Cardiovascular Metabolism Program at the University of Pennsylvania.

Along with providing heart expertise, the Seidmans and Arany provided data collected from healthy hearts. Gaining access to healthy heart tissue is difficult, and so the researchers felt fortunate to be able to perform new analyses on existing datasets that had not previously been looked at in the context of sex differences. The researchers also used data from the publicly available Genotype-Tissue Expression Project. Collectively, the datasets provided information on bulk and single cell gene expression, as well as metabolomics, of heart tissue—and in particular, of cardiomyocytes.

The researchers searched these datasets for differences between male and female hearts and found evidence that female cardiomyocytes have higher activity in the primary pathway for energy generation than male cardiomyocytes. Fatty acid oxidation (FAO) is the pathway that produces most of the energy that powers the heart, in the form of the energy molecule ATP. The researchers found that many genes involved in FAO have higher expression levels in female cardiomyocytes. Metabolomic data reinforced these findings, showing that female hearts had greater flux of free fatty acids, the molecules used in FAO, and used more of them than male hearts to generate ATP.
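In practice, a comparison like this boils down to testing, gene by gene, whether expression differs by sex. The sketch below shows the general shape of such a test; it is not the authors’ analysis code, and the file name, column names, and the example FAO genes are stand-ins.

```python
import pandas as pd
from scipy import stats

# Generic sketch of a per-gene sex comparison (not the study's actual pipeline).
# Assumes a hypothetical table with one row per heart sample, a "sex" column,
# and log-normalized expression columns for genes of interest.
expr = pd.read_csv("cardiomyocyte_expression.csv")
fao_genes = ["CPT1B", "ACADVL", "HADHA", "ACADM"]   # example fatty acid oxidation genes

for gene in fao_genes:
    female = expr.loc[expr["sex"] == "F", gene]
    male = expr.loc[expr["sex"] == "M", gene]
    t, p = stats.ttest_ind(female, male, equal_var=False)   # Welch's t-test
    print(f"{gene}: mean female-male difference = {female.mean() - male.mean():.2f}, p = {p:.2g}")
```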

Altogether, these findings show that there are fundamental differences in how female and male hearts generate energy to pump blood. Further experiments are needed to explore whether these differences contribute to the sex differences seen in heart disease. The researchers suspect that an association is likely, because energy production is essential to heart function and failure.

In the meantime, Page and his lab members continue to investigate the biology underlying sex differences in tissues and organs throughout the body.

“We have a lot to learn about the molecular origins of sex differences in health and disease,” Chmátal says. “What’s exciting to me is that the knowledge that comes from these basic science discoveries could lead to treatments that benefit men and women, as well as to policy changes that take sex differences into account when determining how doctors are trained and patients are diagnosed and treated.”

A planarian’s guide to growing a new head

Researchers at the Whitehead Institute have described a pathway by which planarians, freshwater flatworms with spectacular regenerative capabilities, can restore large portions of their nervous system, even regenerating a new head with a fully functional brain.

Shafaq Zia | Whitehead Institute
February 6, 2025

Cut off any part of this worm’s body and it will regrow. This is the spectacular yet mysterious regenerative ability of freshwater flatworms known as planarians. The lab of Whitehead Institute Member Peter Reddien investigates the principles underlying this remarkable feat. In their latest study, published in PLOS Genetics on February 6, first author staff scientist M. Lucila Scimone, Reddien, and colleagues describe how planarians restore large portions of their nervous system—even regenerating a new head with a fully functional brain—by manipulating a signaling pathway.

This pathway, called the Delta-Notch signaling pathway, enables neurons to guide the differentiation of a class of progenitors—immature cells that will differentiate into specialized types—into glia, the non-neuronal cells that support and protect neurons. The mechanism ensures that the spatial pattern and relative numbers of neurons and glia at a given location are precisely restored following injury.

“This process allows planarians to regenerate neural circuits more efficiently because glial cells form only where needed, rather than being produced broadly within the body and later eliminated,” said Reddien, who is also a professor of biology at Massachusetts Institute of Technology and an Investigator with the Howard Hughes Medical Institute.

Coordinating regeneration

Multiple cell types work together to form a functional human brain. These include neurons and a more abundant group of cells called glial cells—astrocytes, microglia, and oligodendrocytes. Although glial cells are not the fundamental units of the nervous system, they perform critical functions in maintaining the connections between neurons, called synapses, clearing away dead cells and other debris, and regulating neurotransmitter levels, effectively holding the nervous system together like glue. A few years ago, Reddien and colleagues discovered cells in planarians that looked like glial cells and performed similar neuro-supportive functions. This led to the first characterization of glial cells in planarians in 2016.

Unlike in mammals, where the same set of neural progenitors gives rise to both neurons and glia, glial cells in planarians originate from a separate, specialized group of progenitors. These progenitors, called phagocytic progenitors, give rise not only to glial cells but also to pigment cells that determine the worm’s coloration, as well as other, less well understood cell types.

Why neurons and glia in planarians originate from distinct progenitors—and what factors ultimately determine the differentiation of phagocytic progenitors into glia—were questions that still puzzled Reddien and his team. Then a study showing that planarian neurons regenerate before glia form led the researchers to wonder whether a signaling mechanism between neurons and phagocytic progenitors guides the specification of glia in planarians.

The first step to unravel this mystery was to look at the Notch signaling pathway, which is known to play a crucial role in the development of neurons and glia in other organisms, and determine its role in planarian glia regeneration. To do this, the researchers used RNA interference (RNAi)—a technique that decreases or completely silences the expression of genes—to turn off key genes involved in the Notch pathway and amputated the planarian’s head. It turned out Notch signaling is essential for glia regeneration and maintenance in planarians—no glial cells were found in the animal following RNAi, while the differentiation of other types of phagocytic cells was unaffected.

Of the different Notch signaling pathway components the researchers tested, turning off the genes notch-1, delta-2, and suppressor of hairless produced this phenotype. Interestingly, the signaling molecule Delta-2 was found on the surface of neurons, whereas Notch-1 was expressed in phagocytic progenitors.

With these findings in hand, the researchers hypothesized that interaction between Delta-2 on neurons and Notch-1 on phagocytic progenitors could be governing the final fate determination of glial cells in planarians.

To test the hypothesis, the researchers transplanted eyes either from planarians lacking the notch-1 gene or from planarians lacking the delta-2 gene into wild-type animals and assessed the formation of glial cells around the transplant site. They observed that glial cells still formed around the notch-1 deficient eyes, as notch-1 was still active in the glial progenitors of the host wild-type animal. However, no glial cells formed around the delta-2 deficient eyes, even with the Notch signaling pathway intact in phagocytic progenitors, confirming that delta-2 in the photoreceptor neurons is required for the differentiation of phagocytic progenitors into glia near the eye.

“This experiment really showed us that you have two faces of the same coin—one is the phagocytic progenitors expressing Notch-1, and one is the neurons expressing Delta-2—working together to guide the specification of glia in the organism,” said Scimone.

The researchers have named this phenomenon coordinated regeneration, as it allows neurons to influence the pattern and number of glia at specific locations without the need for a separate mechanism to adjust the relative numbers of neurons and glia.

The group is now interested in investigating whether the same phenomenon might also be involved in the regeneration of other tissue types.

A sum of their parts

Researchers in the Department of Biology at MIT use an AI-driven approach to computationally predict short amino acid sequences that can bind to or inhibit a target, with a potential for great impact on fundamental biological research and therapeutic applications.

Lillian Eden | Department of Biology
February 6, 2025

All biological function is dependent on how different proteins interact with each other. Protein-protein interactions facilitate everything from transcribing DNA and controlling cell division to higher-level functions in complex organisms.

Much remains unclear about how these functions are orchestrated at the molecular level, however, and about how proteins interact — whether with other proteins or with copies of themselves.

Recent findings have revealed that small protein fragments have a lot of functional potential. Even though they are incomplete pieces, short stretches of amino acids can still bind to interfaces of a target protein, recapitulating native interactions. Through this process, they can alter that protein’s function or disrupt its interactions with other proteins. 

Protein fragments could therefore empower both basic research on protein interactions and cellular processes and could potentially have therapeutic applications. 

Recently published in Proceedings of the National Academy of Sciences, a new computational method developed in the Department of Biology at MIT builds on existing AI models to computationally predict protein fragments that can bind to and inhibit full-length proteins in E. coli. Theoretically, this tool could lead to genetically encodable inhibitors against any protein. 

The work was done in the lab of Associate Professor of Biology and HHMI Investigator Gene-Wei Li in collaboration with the lab of Jay A. Stein (1968) Professor of Biology, Professor of Biological Engineering and Department Head Amy Keating.

Leveraging Machine Learning

The program, called FragFold, leverages AlphaFold, an AI model that has led to phenomenal advancements in biology in recent years due to its ability to predict protein folding and protein interactions. 

The goal of the project was to predict fragment inhibitors, which is a novel application of AlphaFold. The researchers on this project confirmed experimentally that more than half of FragFold’s predictions for binding or inhibition were accurate, even when researchers had no previous structural data on the mechanisms of those interactions. 

“Our results suggest that this is a generalizable approach to find binding modes that are likely to inhibit protein function, including for novel protein targets, and you can use these predictions as a starting point for further experiments,” says co-first and corresponding author Andrew Savinov, a postdoc in the Li Lab. “We can really apply this to proteins without known functions, without known interactions, without even known structures, and we can put some credence in these models we’re developing.”

One example is FtsZ, a protein that is key for cell division. It is well-studied but contains a region that is intrinsically disordered and, therefore, especially challenging to study. Disordered proteins are dynamic, and their functional interactions are very likely fleeting — occurring so briefly that current structural biology tools can’t capture a single structure or interaction. 

The researchers leveraged FragFold to explore the activity of fragments of FtsZ, including fragments of the intrinsically disordered region, to identify several new binding interactions with various proteins. This leap in understanding confirms and expands upon previous experiments measuring FtsZ’s biological activity. 

This progress is significant in part because it was made without solving the disordered region’s structure, and because it exhibits the potential power of FragFold.

“This is one example of how AlphaFold is fundamentally changing how we can study molecular and cell biology,” Keating says. “Creative applications of AI methods, such as our work on FragFold, open up unexpected capabilities and new research directions.”

Inhibition, and beyond

The researchers accomplished these predictions by computationally fragmenting each protein and then modeling how those fragments would bind to interaction partners they thought were relevant.

They compared the maps of predicted binding across the entire sequence to the effects of those same fragments in living cells, determined using high-throughput experimental measurements in which millions of cells each produce one type of protein fragment. 

AlphaFold uses co-evolutionary information to predict folding, and it typically evaluates the evolutionary history of proteins using multiple sequence alignments (MSAs) for every single prediction run. The MSAs are critical, but they are a bottleneck for large-scale predictions — they can take a prohibitive amount of time and computational power.

For FragFold, the researchers instead pre-calculated the MSA for a full-length protein once and used that result to guide the predictions for each fragment of that full-length protein. 
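A minimal sketch of that fragment-scanning strategy appears below, assuming a fragment length and step size chosen purely for illustration. The tiling and the reuse of the parent protein’s precomputed MSA are the key ideas; the scoring function is a stub standing in for the AlphaFold co-folding step, since FragFold’s actual interfaces are not reproduced here, and the sketch ignores alignment gaps for simplicity.

```python
from typing import List, Tuple

def fragments(sequence: str, length: int = 30, step: int = 1):
    """Tile a protein sequence into overlapping fragments (start index, subsequence).
    The 30-residue length and single-residue step are illustrative choices."""
    for start in range(0, len(sequence) - length + 1, step):
        yield start, sequence[start:start + length]

def slice_msa(parent_msa: List[str], start: int, end: int) -> List[str]:
    """Reuse the full-length protein's precomputed alignment by taking the columns
    covering the fragment, instead of rebuilding an MSA for every fragment.
    (A real implementation must map sequence positions through alignment gaps.)"""
    return [aligned_seq[start:end] for aligned_seq in parent_msa]

def score_binding(target_msa: List[str], fragment_msa: List[str]) -> float:
    """Stub standing in for the AlphaFold co-folding and scoring step."""
    return 0.0

def scan_fragments(target_msa: List[str], parent_msa: List[str],
                   parent_seq: str, length: int = 30) -> List[Tuple[int, float]]:
    """Predict a binding/inhibition score for every fragment tiled along the parent."""
    scores = []
    for start, _fragment in fragments(parent_seq, length):
        fragment_msa = slice_msa(parent_msa, start, start + length)
        scores.append((start, score_binding(target_msa, fragment_msa)))
    return scores
```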

Savinov, together with Keating Lab alum Sebastian Swanson, PhD ‘23, predicted inhibitory fragments of a diverse set of proteins in addition to FtsZ. Among the interactions they explored was a complex between lipopolysaccharide transport proteins LptF and LptG. A protein fragment of LptG inhibited this interaction, presumably disrupting the delivery of lipopolysaccharide, which is a crucial component of the E. coli outer cell membrane essential for cellular fitness.

“The big surprise was that we can predict binding with such high accuracy and, in fact, often predict binding that corresponds to inhibition,” Savinov says. “For every protein we’ve looked at, we’ve been able to find inhibitors.”

The researchers initially focused on protein fragments as inhibitors because whether a fragment could block an essential function in cells is a relatively simple outcome to measure systematically. Looking forward, Savinov is also interested in exploring fragment function outside inhibition, such as fragments that can stabilize the protein they bind to, enhance or alter its function, or trigger protein degradation. 

Design, in principle 

This research is a starting point for developing a systematic understanding of cellular design principles, and of what elements deep-learning models may be drawing on to make accurate predictions.

“There’s a broader, further-reaching goal that we’re building towards,” Savinov says. “Now that we can predict them, can we use the data we have from predictions and experiments to pull out the salient features to figure out what AlphaFold has actually learned about what makes a good inhibitor?” 

Savinov and collaborators also delved further into how protein fragments bind, exploring other protein interactions and mutating specific residues to see how those interactions change how the fragment interacts with its target. 

Experimentally examining the behavior of thousands of mutated fragments within cells, an approach known as deep mutational scanning, revealed key amino acids that are responsible for inhibition. In some cases, the mutated fragments were even more potent inhibitors than their natural, full-length sequences. 
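A typical way to mine such a deep mutational scan, sketched below under assumed (hypothetical) file and column names rather than the study’s actual data format, is to average the measured inhibition over every variant at each position: positions where mutations consistently weaken inhibition flag the residues the fragment depends on, while individual variants scoring above the unmutated fragment mark potential potency-enhancing mutations.

```python
import pandas as pd

# Hypothetical layout: one row per fragment variant, with the mutated position,
# the substituted residue ("WT" marks the unmutated fragment), and an inhibition
# score measured in the pooled cellular assay.
dms = pd.read_csv("fragment_dms_scores.csv")

# Positions where mutations most strongly reduce inhibition point to key residues.
per_position = (dms[dms["substitution"] != "WT"]
                .groupby("position")["inhibition_score"]
                .mean()
                .sort_values())
key_positions = per_position.head(10)

# Variants that inhibit more strongly than the unmutated fragment.
wildtype_score = dms.loc[dms["substitution"] == "WT", "inhibition_score"].mean()
enhanced_variants = dms[dms["inhibition_score"] > wildtype_score]
print(key_positions, len(enhanced_variants))
```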

“Unlike previous methods, we are not limited to identifying fragments in experimental structural data,” says Swanson. “The core strength of this work is the interplay between high-throughput experimental inhibition data and the predicted structural models: the experimental data guides us towards the fragments that are particularly interesting, while the structural models predicted by FragFold provide a specific, testable hypothesis for how the fragments function on a molecular level.”

Savinov is excited about the future of this approach and its myriad applications.

“By creating compact, genetically encodable binders, FragFold opens a wide range of possibilities to manipulate protein function,” Li agrees. “We can imagine delivering functionalized fragments that can modify native proteins, change their subcellular localization, and even reprogram them to create new tools for studying cell biology and treating diseases.” 

Alumni Profile: Desmond Edwards, SB ’22

An interest in translating medicine for a wider audience

School of Science
February 6, 2025

Growing up hearing both English and Patois in rural Jamaica, Desmond Edwards always had an interest in understanding other languages, so he studied French in high school and minored in it at MIT. As a child with persistent illnesses, he was frustrated that doctors couldn’t explain the “how” and “why” of what was happening in his body. “I wanted to understand how an entity so small that we can’t even see it with most microscopes is able to get into a massively intricate human body and completely shut it down in a matter of days,” he says.

Edwards, now an MIT graduate and a PhD candidate in microbiology and immunology at Stanford University—with a deferred MD admission in hand as well—feels closer to answering those questions. The financial support he received at MIT from the Class of 1975 Scholarship Fund, he says, was one major reason he chose MIT.

Support for research and discovery

I took a three-week Independent Activities Period boot camp designed to expose first-years with little or no research background to basic molecular biology and microbiology techniques. We had guidance from the professor and teaching assistants, but it was up to us what path we took. That intellectual freedom was part of what made me fall in love with academic research. The lecturer, Mandana Sassanfar, made it her personal mission to connect interested students to Undergraduate Research Opportunities Program placements, which is how I found myself in Professor Rebecca Lamason’s lab.

At the end of my first year, I debated whether to prioritize my academic research projects or leave for a higher-paying summer internship. My lab helped me apply for the Peter J. Eloranta Summer Undergraduate Research Fellowship, which provided funding that allowed me to stay for the summer, and I ended up staying in the lab for the rest of my time at MIT. One paper I coauthored (about developing new genetic tools to control pathogenic bacteria’s gene expression) was published this year.

French connections

French is one of the working languages of many global health programs, and being able to read documents in their original language has been helpful because many diseases that I care about affect Francophone countries, like those in sub-Saharan and West Africa. In one French class, we had to analyze an original primary historical text, so I was able to look at an outbreak of plague in the 18th century and compare that public health response with ours to Covid-19. My MIT French classes have been useful in some very cool ways that I did not anticipate.

Translating medicine for the masses

When I go home and talk about my research, I often adapt folk stories, analogies, and relatable everyday situations to get points across since there might not be exact Patois words or phrases to directly convey what I’m describing. Taking these scientific concepts and breaking them all into bite-size pieces is important for the general American public too. I want to lead a scientific career that not only advances our understanding and treatment of infectious diseases, but also positively impacts policy, education, and outreach. Right now, this looks like a combination of being an academic/medical professor and eventually leading the Centers for Disease Control and Prevention.

Alumni Profile: Matthew Dolan, SB ’81

From Bench to Bedside and Beyond

Lillian Eden | Department of Biology
January 16, 2025

Matthew Dolan, SB ‘81, worked in the U.S. and abroad during a fascinating time in the field of immunology and virology.

In medical school, Matthew Dolan, SB ‘81, briefly considered specializing in orthopedic surgery because of the materials science nature of the work — but he soon realized that he didn’t have the innate skills required for that type of work. 

“I’ll be honest with you — I can’t parallel park,” he jokes. “You can consider a lot of things, but if you find the things that you’re good at and that excite you, you can hopefully move forward with those.” 

Dolan certainly has, tackling problems from bench to bedside and beyond. Both in the U.S. and abroad through the Air Force, Dolan has emerged as a leader in immunology and virology and has served as Director of the Defense Institute for Medical Operations. He’s worked on everything from foodborne illnesses and Ebola to biological weapons and COVID-19, and has even been a guest speaker on NPR’s Science Friday.

“This is fun and interesting, and I believe that, and I work hard to convey that — and it’s contagious,” he says. “You can affect people with that excitement.” 

Pieces of the Puzzle

Dolan fondly recalls his years at MIT, and is still in touch with many of the “brilliant” and “interesting” friends he made while in Cambridge. 

He notes that the challenges that were most rewarding in his career were also the ones MIT had uniquely prepared him for. Dolan, a Course 7 major, naturally took many classes outside of Biology as part of his undergraduate studies: organic chemistry was foundational for understanding toxicology when he studied chemical weapons, while problems posed by pathogens like Legionella, which causes pneumonia and can spread through water systems such as ice machines or air conditioners, are solved at the interface between public health and ecology.

Matthew Dolan stateside with his German Shepherd Sophie. Photo courtesy of Matthew Dolan.

“I learned that learning can be a high-intensity experience,” Dolan recalls. “You can be aggressive in your learning; you can learn and excel in a wide variety of things and gather up all the knowledge and knowledgeable people to work together towards solutions.”

Dolan, for example, worked in the Amazon Basin in Peru on a public health crisis of a sharp rise in childhood mortality due to malaria. The cause was a few degrees removed from the immediate problem: human agriculture had affected the Amazon’s tributaries, leading to still and stagnant water where before there had been rushing streams and rivers. This change in the environment allowed a certain mosquito species of “avid human biters” to thrive.  

“It can be helpful and important for some people to have a really comprehensive and contextual view of scientific problems and biological problems,” he says. “It’s very rewarding to put the pieces in a puzzle like that together.” 

Choosing To Serve

Dolan says a key to finding meaning in his work, especially during difficult times, is a sentiment from Alsatian polymath and Nobel Peace Prize winner Albert Schweitzer: “The only ones among you who will be really happy are those who will have sought and found how to serve.”

One of Dolan’s early formative experiences was working in the heart of the HIV/AIDS epidemic, at a time when there was no effective treatment. No matter how hard he worked, the patients would still die. 

“Failure is not an option — unless you have to fail. You can’t let the failures destroy you,” he says. “There are a lot of other battles out there, and it’s self-indulgent to ignore them and focus on your woe.” 

Lasting Impacts

Dolan couldn’t pick a favorite country, but notes that he’s always impressed seeing how people value the chance to excel with science and medicine when offered resources and respect. Ultimately, everyone he’s worked with, no matter their differences, was committed to solving problems and improving lives. 

Dolan worked in Russia after the Berlin Wall fell, on HIV/AIDS in Moscow and tuberculosis in the Russian Far East. Although relations with Russia are currently tense, to say the least, Dolan remains optimistic about a brighter future.

“People that were staunch adversaries can go on to do well together,” he says. “Sometimes, peace leads to partnership. Remembering that it was once possible gives me great hope.” 

Dolan understands that the most lasting impact he has had is, likely, teaching: time marches on, and discoveries can be lost to history, but teaching and training people continues and propagates. In addition to guiding the next generation of healthcare specialists, Dolan also developed programs in laboratory biosafety and biosecurity with the State Department and the Defense Department, and taught those programs around the world. 

“Working in prevention gives you the chance to take care of process problems before they become people problems — patient care problems,” he says. “I have been so impressed with the courageous and giving people that have worked with me.” 

Cellular interactions help explain vascular complications due to COVID-19 virus infection

Whitehead Institute Founding Member Rudolf Jaenisch and colleagues have found that cellular interactions help explain how SARS-CoV-2, the virus that causes COVID-19, could have such significant vascular complications, including blood clots, heart attacks, and strokes.

Greta Friar | Whitehead Institute
December 31, 2024

COVID-19 is a respiratory disease primarily affecting the lungs. However, the SARS-CoV-2 virus that causes COVID-19 surprised doctors and scientists by triggering an unusually large percentage of patients to experience vascular complications – issues related to blood flow, such as blood clots, heart attacks, and strokes.

Whitehead Institute Founding Member Rudolf Jaenisch and colleagues wanted to understand how this respiratory virus could have such significant vascular effects. They used pluripotent stem cells to generate three relevant vascular and perivascular cell types—cells that surround and help maintain blood vessels—so they could closely observe the effects of SARS-CoV-2 on the cells. Instead of using existing methods to generate the cells, the researchers developed a new approach, providing them with fresh insights into the mechanisms by which the virus causes vascular problems. The researchers found that SARS-CoV-2 primarily infects perivascular cells and that signals from these infected cells are sufficient to cause dysfunction in neighboring vascular cells, even when the vascular cells are not themselves infected. In a paper published in the journal Nature Communications on December 30, Jaenisch, postdoc in his lab Alexsia Richards, Harvard University Professor and Wyss Institute for Biologically Inspired Engineering Member David Mooney, and then-postdoc in the Jaenisch and Mooney labs Andrew Khalil share their findings and present a scalable stem cell-derived model system with which to study vascular cell biology and test medical therapies.

A new problem requires a new approach

When the COVID-19 pandemic began, Richards, a virologist, quickly pivoted her focus to SARS-CoV-2. Khalil, a bioengineer, had already been working on a new approach to generate vascular cells. The researchers realized that a collaboration could provide Richards with the research tool she needed and Khalil with an important research question to which his tool could be applied.

The three cell types that Khalil’s approach generated were endothelial cells, the vascular cells that form the lining of blood vessels; and smooth muscle cells and pericytes, perivascular cells that surround blood vessels and provide them with structure and maintenance, among other functions. Khalil’s biggest innovation was to generate all three cell types in the same media—the mixture of nutrients and signaling molecules in which stem cell-derived cells are grown.

The combination of signals in the media determines the final cell type into which a stem cell will mature, so it is much easier to grow each cell type separately in specially tailored media than to find a mixture that works for all three. Typically, Richards explains, virologists will generate a desired cell type using the easiest method, which means growing each cell type and then observing the effects of viral infection on it in isolation. However, this approach can limit results in several ways. Firstly, it can make it challenging to distinguish the differences in how cell types react to a virus from the differences caused by the cells being grown in different media.

“By making these cells under identical conditions, we could see in much higher resolution the effects of the virus on these different cell populations, and that was essential in order to form a strong hypothesis of the mechanisms of vascular symptom risk and progression,” Khalil says.

Secondly, infecting isolated cell types with a virus does not accurately represent what happens in the body, where cells are in constant communication as they react to viral exposure. Indeed, Richards’ and Khalil’s work ultimately revealed that the communication between infected and uninfected cell types plays a critical role in the vascular effects of COVID-19.

“The field of virology often overlooks the importance of considering how cells influence other cells and designing models to reflect that,” Richards says. “Cells do not get infected in isolation, and the value of our model is that it allows us to observe what’s happening between cells during infection.”

Viral infection of smooth muscle cells has broader, indirect effects

When the researchers exposed their cells to SARS-CoV-2, the smooth muscle cells and pericytes became infected—the former at especially high levels—and the infection triggered strong inflammatory gene expression, but the endothelial cells resisted infection. Endothelial cells did show some response to viral exposure, likely due to interactions with proteins on the virus’ surface. Typically, endothelial cells press tightly together to form a firm barrier that keeps blood inside blood vessels and prevents viruses from getting out. When exposed to SARS-CoV-2, the junctions between endothelial cells appeared to weaken slightly. The cells also had increased levels of reactive oxygen species, damaging byproducts of certain cellular processes.

However, big changes in endothelial cells occurred only after the cells were exposed to infected smooth muscle cells. That exposure triggered high levels of inflammatory signaling within the endothelial cells and changed the expression of many genes relevant to the immune response. Some of the affected genes are involved in coagulation pathways, which thicken blood and so can cause blood clots and related vascular events. The junctions between endothelial cells weakened much more significantly after exposure to infected smooth muscle cells, which would lead to blood leakage and viral spread. All of these changes occurred without SARS-CoV-2 ever infecting the endothelial cells.

This work shows that viral infection of smooth muscle cells, and their resultant signaling to endothelial cells, is the lynchpin in the vascular damage caused by SARS-CoV-2. This would not have been apparent if the researchers had not been able to observe the cells interacting with each other.

Clinical relevance of stem cell results

The effects that the researchers observed were consistent with patient data. Some of the genes whose expression changed in their stem cell-derived model had been identified as markers of high risk for vascular complications in COVID-19 patients with severe infections. Additionally, the researchers found that a later strain of SARS-CoV-2, an Omicron variant, had much weaker effects on the vascular and perivascular cells than did the original viral strain. This is consistent with the reduced levels of vascular complications seen in COVID-19 patients infected with recent strains.

Having identified smooth muscle cells as the main site of SARS-CoV-2 infection in the vascular system, the researchers next used their model system to test one drug’s ability to prevent infection of smooth muscle cells. They found that the drug, N,N-dimethyl-D-erythro-sphingosine, could reduce infection of that cell type without harming smooth muscle or endothelial cells. Although preventing vascular complications of COVID-19 is not as pressing a need with current viral strains, the researchers see this experiment as proof that their stem cell model could be used for future drug development. New coronaviruses and other pathogens are frequently evolving, and when a future virus causes vascular complications, this model could be used to quickly test drugs to find potential therapies while the need is still high. The model system could also be used to answer other questions about vascular cells, how these cells interact, and how they respond to viruses.

“By integrating bioengineering strategies into the analysis of a fundamental question in viral pathology, we addressed important practical challenges in modeling human disease in culture and gained new insights into SARS-CoV-2 infection,” Mooney says.

“Our interdisciplinary approach allowed us to develop an improved stem cell model for infection of the vasculature,” says Jaenisch, who is also a professor of biology at the Massachusetts Institute of Technology. “Our lab is already applying this model to other questions of interest, and we hope that it can be a valuable tool for other researchers.”

From Molecules to Memory

On a biological foundation of ions and proteins, the brain forms, stores, and retrieves memories to inform intelligent behavior.

Noah Daly | Department of Biology
December 23, 2024

Whenever you go out to a restaurant to celebrate, your brain retrieves memories while forming new ones. You notice the room is elegant, that you’re surrounded by people you love, having meaningful conversations, and doing it all with good manners. Encoding these precious moments (and not barking at your waiter, expecting dessert before your appetizer), you rely heavily on plasticity, the ability of neurons to change the strength and quantity of their connections in response to new information or activity. The very existence of memory and our ability to retrieve it to guide our intelligent behavior are hypothesized to be movements of a neuroplastic symphony, manifested through chemical processes occurring across vast, interconnected networks of neurons.

During infancy, brain connectivity grows exponentially, rapidly increasing the number of synapses between neurons, some of which are then pruned back to select the most salient for optimal performance. This exuberant growth followed by experience-dependent optimization lays a foundation of connections to produce a functional brain, but the action doesn’t cease there. Faced with a lifetime of encountering and integrating new experiences, the brain will continue to produce and edit connections throughout adulthood, decreasing or increasing their strength to ensure that new information can be encoded.

There are a thousand times more connections in the brain than stars in the Milky Way galaxy. Neuroscientists have spent more than a century exploring that vastness for evidence of the biology of memory. In the last 30 years, advancements in microscopy, genetic sequencing and manipulation, and machine learning technologies have enabled researchers, including four MIT Professors of Biology working in The Picower Institute for Learning and Memory – Elly Nedivi, Troy Littleton, Matthew Wilson, and Susumu Tonegawa – to help refine and redefine our understanding of how plasticity works in the brain, what exactly memories are, how they are formed, consolidated, and even changed to suit our needs as we navigate an uncertain world.

Circuits and Synapses: Our Information Superhighway

Neuroscientists hypothesize that how memories come to be depends on how neurons are connected and how they can rewire these connections in response to new experiences and information. This connectivity occurs at the junction between two neurons, called a synapse. When a neuron wants to pass on a signal, it releases chemical messengers called neurotransmitters into the synaptic cleft from the end of a long protrusion called the axon, often called the “pre-synaptic” area.

These neurotransmitters, whose release is triggered by electrical impulses called action potentials, can bind to specialized receptors on the root-like structures of the receiving neuron, known as dendrites (the “post-synaptic” area). Dendrites are covered with receptors that are either excitatory or inhibitory, meaning they can increase or decrease the post-synaptic neuron’s chance of firing its own action potential and carrying a message further.

Not long ago, the scientific consensus was that the brain’s circuitry became hardwired in adulthood. However, a completely fixed system does not lend itself to incorporating new information.

“While the brain doesn’t make any new neurons, it constantly adds and subtracts connections between those neurons to optimize our most basic functions,” explains Nedivi. Unused synapses are pruned away to make room for more regularly used ones. Nedivi has pioneered techniques of two-photon microscopy to examine the plasticity of synapses on axons and dendrites in vivid, three-dimensional detail in living, behaving, and learning animals.

But how does the brain determine which synapses to strengthen and which to prune? “There are three ways to do this,” Littleton explains. “One way is to make the presynaptic side release more neurotransmitters to instigate a bigger response to the same behavioral stimulus. Another is to have the postsynaptic cell respond more strongly. This is often accomplished by adding glutamate receptors to the dendritic spine so that the same signal is detected at a higher level, essentially turning the radio volume up or down.” (Glutamate, one of the most prevalent neurotransmitters in the brain, is our main excitatory messenger and can be found in every region of our neural network.)

Littleton’s lab studies how neurons can turn that radio volume up or down by changing presynaptic as well as postsynaptic output. Characterizing many of the dozens of proteins involved has helped Littleton discover, for instance, how signals from the post-synaptic area can make some pre-synaptic signals stronger and more active than others, a finding his lab reported in 2005. “Our interest is really understanding how the building blocks of this critical connection between neurons work, so we study Drosophila, the simple fruit fly, as a model system to address these questions. We usually take genetic approaches where we can break the system by knocking out a gene or overexpressing it, which allows us to figure out precisely what the protein is doing.”

In general, the release of neurotransmitters can make it more or less likely that the receiving cell will continue the line of communication through activation of voltage-gated channels that initiate action potentials. When these action potentials arrive at presynaptic terminals, they can trigger that neuron to release its own neurotransmitters to influence downstream partners. The conversion of electrical signals to chemical transmitters requires presynaptic calcium channels, pores in the cell membrane that act as a switch, telling the cell to pass along the message in full, reduce the volume, or change the tune completely. By altering calcium channel function, which can be done using a host of neuromodulators or clinically relevant drugs, synaptic function can be tuned up or down to change communication between neurons.

The third mechanism, adding new synapses, has been one of the focal points of Nedivi’s research. Nedivi models this in the visual cortex, labeling and tracking cells in lab mice exposed to different visual experiences that stimulate plasticity.

In a 2016 study, Nedivi showed that the distribution of excitatory and inhibitory synaptic sites on dendrites fluctuates rapidly, with inhibitory sites disappearing and reappearing in the course of a single day. The action, she explains, is in the spines that protrude from dendrites along their length and house post-synaptic areas.

“We found that some spines which were previously thought to have only excitatory synapses are actually dually innervated, meaning they have both excitatory and inhibitory synapses,” Nedivi says. “The excitatory synapses are always stable, and yet on the same spine, about 70% of the inhibitory synapses are dynamic, meaning they can come and go. It’s as if the excitatory synapses on the dually innervated spines are hard-wired, but their activity can be attenuated by the presence of an inhibitory synapse that can gate their activity.” Thus, Nedivi found that inhibitory synapses, which make up roughly 15% of the synaptic density of the brain as a whole, play an outsized role in managing the passage of signals that lead to the formation of memory.

“We didn’t start out thinking about it this way, but the inhibitory circuitry is so much more dynamic,” she says. “That’s where the plasticity is.”

Inside Engrams: Memory Storage & Recall

A brain that has made many connections and can continually edit them to process information is well set up for its neurons to work together to form a memory. Understanding the mystery of how it does this excited Susumu Tonegawa, a molecular biologist who won the Nobel Prize for his prior work in immunology.

“More than 100 years ago, it was theorized that, for the brain to form a biological basis for storing information, neurons form localized groupings called engrams,” Tonegawa explains. Whenever an experience exposes the brain to new information, synapses among ensembles of neurons undergo persistent chemical and physical changes to form an engram.

Engram cells can be reactivated and modified physically or chemically by a new learning experience. Repeating stimuli present during a prior learning experience (or at least some part of it) also allows the brain to retrieve some of that information.

In 1992, Tonegawa’s lab was the first to show that knocking out a gene for the synaptic protein alpha-CaMKII could disrupt memory formation, helping to establish molecular biology as a tool to understand how memories are encoded. The lab has made numerous contributions on that front since then.

By 2012, neuroscience approaches had advanced to the point where Tonegawa and colleagues could directly test for the existence of engrams. In a study in Nature, Tonegawa’s lab reported that directly activating a subset of neurons involved in the formation of a memory – an engram – was sufficient to induce the behavioral expression of that memory. They pinpointed the cells involved in forming a memory (a moment of fear instilled in a mouse by giving its foot a little shock) by tracking the activity-dependent expression of the protein c-fos in neurons in the hippocampus. They then labeled these cells using specialized ion channels that activate the neurons when exposed to light. After observing which cells were activated during the formation of the fear memory, the researchers traced the synaptic circuits linking them.

It turned out that they only needed to optically activate the neurons involved in the memory of the footshock to trigger the mouse to freeze (just as it does when returned to the fearful scene), which proved those cells were sufficient to elicit the memory. Later, Tonegawa and his team also found that when this memory forms, it forms simultaneously in the cortex and the basolateral amygdala, where the brain forms emotional associations. This discovery contradicted the standard theory of memory consolidation, in which memories form in the hippocampus before migrating to the cortex for later retrieval.

Tonegawa has also found key distinctions between memory storage and recall. In 2017, he and colleagues induced a form of amnesia in mice by disrupting their ability to make proteins needed for strengthening synapses. The lab found that the engrams could still be reactivated artificially, instigating the freezing behavior, even though the memories could no longer be retrieved through natural recall cues. They dubbed these no-longer naturally retrievable memories “silent engrams.” The research showed that while synapse strengthening was needed to recall a memory, the mere pattern of connectivity in the engram was enough to store it.

While recalling memories stored in silent engrams is possible, they require stronger-than-normal stimuli to be activated. “This is caused in part by the lower density of dendritic spines on neurons that participate in silent engrams,” Tonegawa says. Notably, Tonegawa sees applications of this finding in studies of Alzheimer’s disease. Working with a mouse model that exhibits the early stages of the disease, Tonegawa’s lab was able to stimulate silent engrams to help the mice retrieve memories.

Making Memory Useful

Our neural circuitry is far from a hard drive or a scrapbook. Instead, the brain actively evaluates the information stored in our memories to build models of the world, and then makes modifications to better utilize our accumulated knowledge in intelligent behavior.

Processing memory involves making structural and chemical changes throughout life. That work requires dedicated offline time, such as sleep or quiet waking rest: to hit replay on essential events and simulate how they might be replicated in the future, we need to power down and let the mind work. These so-called “offline states,” and the processes of memory refinement and prediction they enable, fascinate Matt Wilson. Wilson has spent the last several decades examining the ways different regions of the brain communicate with one another during various states of consciousness to learn, retrieve, and augment memories to serve an animal’s intelligent behavior.

“An organism that has successfully evolved an adaptive intelligent system already knows how to respond to new situations,” Wilson says. “They might refine their behavior, but the fact that they had adaptive behavior in the first place suggests that they have to have embedded some kind of a model of expectation that is good enough to get by with. When we experience something for the first time, we make refinements to the model–we learn–and then what we retain from that is what we think of as memory. So the question becomes, how do we refine those models based on experiences?”

Wilson’s fascination with resting states began during his postdoctoral research at the University of Arizona, where he noticed that a sleeping lab rat was producing the same electrical activity in its brain as it did while running through a maze. Since then, he has shown that different offline states, including different states of sleep, serve different kinds of offline functions, such as replaying experiences or simulating them. In 2002, Wilson’s work on slow-wave sleep showed the important role the hippocampus plays in spatial learning. Using electrophysiology, in which probes are inserted directly into the brain tissue of a living animal, Wilson found that the sequential firing of the same hippocampal neurons that had been active while a rat sought pieces of chocolate at either end of a linear track occurred 20 times faster while the rat was in slow-wave sleep.

In 2006, Wilson co-authored a study in Nature showing that rats can retrace their steps after completing a maze. Using electrophysiological recordings of the activity of many individual neurons, Wilson showed that the rats replay the memory of each turn they took in reverse, doing so multiple times whenever they had an opportunity to rest between trials. These replays manifested as ripples of electrical activity like those that occur during slow-wave sleep.

“REM sleep, on the other hand, can produce novel recapitulation of action-based states, where long sequences and movement information are also repeated,” Wilson says. (For example, when your dog moves its legs during sleep, it could be producing a full-fledged simulation of running.) Three years after his initial replay study, Wilson found that rats can initiate replay from any point in the sequence of turns in the maze, and can do so forward or in reverse.

“Memory is not just about storing my experience,” Wilson explains. “It’s about making modifications in an existing adaptive model, one that’s been developed based on prior experience. In the case of A.I.s such as large language models [like ChatGPT], you just dump everything in there. For biology, it’s all about the experience being folded into the evolutionary operating system, governed by developmental rules. In a sense, you can put this complexity into the machine, but you just can’t train an animal up de novo; there has to be something that allows it to work through these developmental mechanisms.”

The property of the brain that many neuroscientists believe enables this versatile, flexible, and adaptive approach to storing, recalling, and using memory is its plasticity. Because the brain’s machinery is molecular, it is constantly renewable and rewireable, allowing us to incorporate new experiences even as we apply prior ones. Because we’ve had many dinners in many restaurants, we can navigate the familiar experience while appreciating the novelty of a celebration. We can look into the future, imagining similarly rewarding moments that have yet to come, and game out how we might get there. The marvels of memory allow us to draw on much of this information in real time, and scientists at MIT continue to learn how this molecular system guides our behavior.