Tissue architecture affects chromosome segregation

Biologists discover that the environment surrounding a cell plays an integral role in its ability to accurately segregate its chromosomes.

Ashley Junger | Koch Institute
August 24, 2018

All growth and reproduction relies on a cell’s ability to replicate its chromosomes and produce accurate copies of itself. Every step of this process takes place within that cell.

Based on this observation, scientists have studied the replication and segregation of chromosomes as a phenomenon exclusively internal to the cell. They traditionally rely on warm nutritional cultures that promote growth but bear little resemblance to the conditions a cell experiences in its natural environment.

New research by a group of MIT biologists reveals that this long-held assumption is incorrect. In a paper published this week, they describe how some types of cells rely on signals from surrounding tissue in order to maintain chromosome stability and segregate accurately.

Kristin Knouse, a fellow at the Whitehead Institute, is the lead author of the paper, which was published online in the journal Cell on Aug. 23. Angelika Amon, the Kathleen and Curtis Marble Professor in Cancer Research in the Department of Biology and a member of the Koch Institute for Integrative Cancer Research, is the senior author.

“The main takeaway from this paper is that we must study cells in their native tissues to really understand their biology,” Amon says. “Results obtained from cell lines that have evolved to divide on plastic dishes do not paint the whole picture.”

When cells replicate, the newly duplicated chromosomes line up within the cell and cellular structures pull one copy to each side. The cell then divides down the middle, separating one copy of each chromosome into each new daughter cell.

At least, that’s how it’s supposed to work. In reality, there are sometimes errors in the process of separating chromosomes into daughter cells, known as chromosome mis-segregation. Some errors simply result in damage to the DNA. Other errors can result in the chromosomes being unevenly divided between daughter cells, a condition called aneuploidy.

These errors are almost always harmful to cell development and can be fatal. In developing embryos, aneuploidy can cause miscarriages or developmental disorders such as Down syndrome. In adults, chromosome instability is seen in a large number of cancers.

To study these errors, scientists have historically removed cells from their surrounding tissue and placed them into easily controlled plastic cultures.

“Chromosome segregation has been studied in a dish for decades,” Knouse says. “I think the assumption was … a cell would segregate chromosomes the same way in a dish as it would in a tissue because everything was happening inside the cell.”

However, in previous work, Knouse had found that reported rates of aneuploidy in cells grown in culture were much higher than the rates she found in cells that had grown within their native tissue. This prompted her and her colleagues to investigate whether the surroundings of a cell influence the accuracy with which it divides.

To answer this question, they compared mis-segregation rates between five different cell types in native and non-native environments.

But not all cells’ native environments are the same. Some cells, like those that form skin, grow in a very structured context, where they always have neighbors and defined directions for growth. Other cells, however, like cells in the blood, have greater independence, with little interaction with the surrounding tissue.

In the new study, the researchers observed that cells that grew in structured environments in their native tissues divided accurately within those tissues. But once they were placed into a dish, the frequency of chromosome mis-segregation drastically increased. The cells that were less tied to structures in their tissue were not affected by the lack of architecture in culture dishes.

The researchers found that maintaining the architectural conditions of the cell’s native environment is essential for chromosome stability. Cells removed from the context of their tissue don’t always faithfully represent natural processes.

The researchers determined that architecture didn’t have an obvious effect on the expression of known genes involved in segregation. The disruption in tissue architecture likely causes mechanical changes that disrupt segregation, in a manner that is independent of mutations or gene expression changes.

“It was surprising to us that for something so intrinsic to the cell — something that’s happening entirely within the cell and so fundamental to the cell’s existence — where that cell is sitting actually matters quite a bit,” Knouse says.

Through the Cancer Genome Project, scientists learned that despite high rates of chromosome mis-segregation, many cancers lack any mutations to the cellular machinery that controls chromosome partitioning. This left scientists searching for the cause of these frequent division errors. This study suggests that tissue architecture could be the culprit.

Cancer development often involves disruption of tissue architecture, whether during tumor growth or metastasis. This disruption of the extracellular environment could trigger chromosome segregation errors in the cells within the tumor.

“I think [this paper] really could be the explanation for why certain kinds of cancers become chromosomally unstable,” says Iain Cheeseman, a professor of biology at MIT and a member of the Whitehead Institute, who was not involved in the study.

The results point not only to a new understanding of the cellular mechanical triggers and effects of cancers, but also to a new understanding of how cell biology must be studied.

“Clearly a two-dimensional culture system does not faithfully recapitulate even the most fundamental processes, like chromosome segregation,” Knouse says. “As cell biologists we really must start recognizing that context matters.”

This work was supported by the National Institutes of Health, the Kathy and Curt Marble Cancer Research Fund, and the Koch Institute Support (core) Grant from the National Cancer Institute.

Antidepressant restores youthful flexibility to aging inhibitory neurons

Neural plasticity and arbor growth decline with age, study in mice shows.

David Orenstein | Picower Institute for Learning and Memory
August 20, 2018

A new study provides fresh evidence that the decline in the capacity of brain cells to change (called “plasticity”), rather than a decline in total cell number, may underlie some of the sensory and cognitive declines associated with normal brain aging. Scientists at MIT’s Picower Institute for Learning and Memory show that inhibitory interneurons in the visual cortex of mice remain just as abundant during aging, but their arbors become simplified and they become much less structurally dynamic and flexible.

In their experiments, published online in the Journal of Neuroscience, they also show that they could restore a significant degree of lost plasticity to the cells by treating mice with the commonly used antidepressant medication fluoxetine, also known as Prozac.

“Despite common belief, loss of neurons due to cell death is quite limited during normal aging and unlikely to account for age-related functional impairments,” write the scientists, including lead author Ronen Eavri, a postdoc at the Picower Institute, and corresponding author Elly Nedivi, a professor of biology and brain and cognitive sciences. “Rather it seems that structural alterations in neuronal morphology and synaptic connections are features most consistently correlated with brain age, and may be considered as the potential physical basis for the age-related decline.”

Nedivi and co-author Mark Bear, the Picower Professor of Neuroscience, are affiliated with MIT’s Aging Brain Initiative, a multidisciplinary effort to understand how aging affects the brain and sometimes makes the brain vulnerable to disease and decline.

In the study, the researchers focused on the aging of inhibitory interneurons, which is less well understood than that of excitatory neurons but potentially more crucial to plasticity. Plasticity, in turn, is key to enabling learning and memory and to maintaining sensory acuity. While the researchers focused on the visual cortex, the plasticity they measured is believed to be important elsewhere in the brain as well.

The team counted and chronically tracked the structure of inhibitory interneurons in dozens of mice aged to 3, 6, 9, 12 and 18 months. (Mice are mature by 3 months and live for about 2 years, and 18-month-old mice are already considered quite old.) In previous work, Nedivi’s lab has shown that inhibitory interneurons retain the ability to dynamically remodel into adulthood. But in the new paper, the team shows that new growth and plasticity reaches a limit and progressively declines starting at about 6 months.

But the study also shows that as mice age there is no significant change in the number or variety of inhibitory cells in the brain.

Retraction and inflexibility with age

Instead, the changes the team observed were in the growth and performance of the interneurons. For example, under the two-photon microscope the team tracked the growth of dendrites, which are the tree-like structures on which a neuron receives input from other neurons. At 3 months of age mice showed a balance of growth and retraction, consistent with dynamic remodeling. But between 3 and 18 months they saw that dendrites progressively simplified, exhibiting fewer branches, suggesting that new growth was rare while retraction was common.

In addition, they saw a precipitous drop in an index of dynamism. At 3 months virtually all interneurons were above a crucial index value of 0.35, but by 6 months only half were, by 9 months barely any were, and by 18 months none were.

Bear’s lab tested a specific form of plasticity that underlies visual recognition memory in the visual cortex, where neurons respond more potently to stimuli they were exposed to previously. Their measurements showed that in 3-month-old mice “stimulus-selective response potentiation” (SRP) was indeed robust, but its decline went hand in hand with the decline in structural plasticity, so that it was significantly lessened by 6 months and barely evident by 9 months.

Fountain of fluoxetine

While the decline of dynamic remodeling and plasticity appeared to be natural consequences of aging, they were not immutable, the researchers showed. In prior work Nedivi’s lab had shown that fluoxetine promotes interneuron branch remodeling in young mice, so they decided to see whether it could do so for older mice and restore plasticity as well.

To test this, they put the drug in the drinking water of mice at various ages for various amounts of time. Three-month-old mice treated for three months showed little change in dendrite growth compared to untreated controls, but 25 percent of cells in 6-month-old mice treated for three months showed significant new growth (at the age of 9 months). Among 3-month-old mice treated for six months, however, 67 percent of cells showed new growth by the age of 9 months, showing that treatment starting early and lasting for six months had the strongest effect.

The researchers also saw similar effects on SRP. Here, too, the effects ran parallel to the structural plasticity decline. Treating mice for just three months did not restore SRP, but treating mice for six months did so significantly.

“Here we show that fluoxetine can also ameliorate the age-related decline in structural and functional plasticity of visual cortex neurons,” the researchers write. The study, they noted, adds to prior research in humans showing a potential cognitive benefit for the drug.

“Our finding that fluoxetine treatment in aging mice can attenuate the concurrent age-related declines in interneuron structural and visual cortex functional plasticity suggests it could provide an important therapeutic approach towards mitigation of sensory and cognitive deficits associated with aging, provided it is initiated before severe network deterioration,” they continued.

In addition to Eavri, Nedivi and Bear, the paper’s other authors are Jason Shepherd, Christina Welsh, and Genevieve Flanders.

The National Institutes of Health, the American Federation for Aging Research, the Ellison Medical Foundation, and the Machiah Foundation supported the research.

Study suggests glaucoma may be an autoimmune disease

Unexpected findings show that the body’s own immune system destroys retinal cells.

Anne Trafton | MIT News Office
August 11, 2018

Glaucoma, a disease that afflicts nearly 70 million people worldwide, is something of a mystery despite its prevalence. Little is known about the origins of the disease, which damages the retina and optic nerve and can lead to blindness.

A new study from MIT and Massachusetts Eye and Ear has found that glaucoma may in fact be an autoimmune disorder. In a study of mice, the researchers showed that the body’s own T cells are responsible for the progressive retinal degeneration seen in glaucoma. Furthermore, these T cells appear to be primed to attack retinal neurons as the result of previous interactions with bacteria that normally live in our body.

The discovery suggests that it could be possible to develop new treatments for glaucoma by blocking this autoimmune activity, the researchers say.

“This opens a new approach to prevent and treat glaucoma,” says Jianzhu Chen, an MIT professor of biology, a member of MIT’s Koch Institute for Integrative Cancer Research, and one of the senior authors of the study, which appears in Nature Communications on Aug. 10.

Dong Feng Chen, an associate professor of ophthalmology at Harvard Medical School and the Schepens Eye Research Institute of Massachusetts Eye and Ear, is also a senior author of the study. The paper’s lead authors are Massachusetts Eye and Ear researchers Huihui Chen, Kin-Sang Cho, and T.H. Khanh Vu.

Genesis of glaucoma

One of the biggest risk factors for glaucoma is elevated pressure in the eye, which often occurs as people age and the ducts that allow fluid to drain from the eye become blocked. The disease often goes undetected at first; patients may not realize they have the disease until half of their retinal ganglion cells have been lost.

Most treatments focus on lowering pressure in the eye (also known as intraocular pressure). However, in many patients, the disease worsens even after intraocular pressure returns to normal. In studies in mice, Dong Feng Chen found the same effect.

“That led us to the thought that this pressure change must be triggering something progressive, and the first thing that came to mind is that it has to be an immune response,” she says.

To test that hypothesis, the researchers looked for immune cells in the retinas of these mice and found that indeed, T cells were there. This is unusual because T cells are normally blocked from entering the retina, by a tight layer of cells called the blood-retina barrier, to suppress inflammation of the eye. The researchers found that when intraocular pressure goes up, T cells are somehow able to get through this barrier and into the retina.

The Mass Eye and Ear team then enlisted Jianzhu Chen, an immunologist, to further investigate what role these T cells might be playing in glaucoma. The researchers generated high intraocular pressure in mice that lack T cells and found that while this pressure induced only a small amount of damage to the retina, the disease did not progress any further after eye pressure returned to normal.

Further studies revealed that the glaucoma-linked T cells target proteins called heat shock proteins, which help cells respond to stress or injury. Normally, T cells should not target proteins produced by the host, but the researchers suspected that these T cells had been previously exposed to bacterial heat shock proteins. Because heat shock proteins from different species are very similar, the resulting T cells can cross-react with mouse and human heat shock proteins.

To test this hypothesis, the team brought in James Fox, a professor in MIT’s Department of Biological Engineering and Division of Comparative Medicine, whose team maintains mice with no bacteria. The researchers found that when they tried to induce glaucoma in these germ-free mice, the mice did not develop the disease.

Human connection

The researchers then turned to human patients with glaucoma and found that these patients had five times the normal level of T cells specific to heat shock proteins, suggesting that the same phenomenon may also contribute to the disease in humans. The researchers’ studies thus far suggest that the effect is not specific to a particular strain of bacteria; rather, exposure to a combination of bacteria can generate T cells that target heat shock proteins.

One question the researchers plan to study further is whether other components of the immune system may be involved in the autoimmune process that gives rise to glaucoma. They are also investigating the possibility that this phenomenon may underlie other neurodegenerative disorders, and looking for ways to treat such disorders by blocking the autoimmune response.

“What we learn from the eye can be applied to the brain diseases, and may eventually help develop new methods of treatment and diagnosis,” Dong Feng Chen says.

The research was funded by the National Institutes of Health, the Lion’s Foundation, the Miriam and Sheldon Adelson Medical Research Foundation, the National Nature Science Foundation of China, the Ivan R. Cottrell Professorship and Research Fund, the Koch Institute Support (core) Grant from the National Cancer Institute, and the National Eye Institute Core Grant for Vision Research.

School of Science appoints eight faculty members to named professorships
School of Science
July 23, 2018

The School of Science announced that eight of its faculty members have been appointed to named professorships. These positions afford the faculty members additional support to pursue their research and develop their careers.

Eliezer Calo, assistant professor in the Department of Biology, has been named the Irwin W. and Helen Sizer Career Development Professor. He focuses on the coordination of RNA metabolism using a combination of genetic, biochemical, and functional genomic approaches. The core of Calo’s research program is to understand how ribosome biogenesis is controlled by specific RNA binding proteins, particularly RNA helicases of the “DEAD box” family, and how dysregulation of ribosome biogenesis contributes to various diseases, including cancer. He proposes initially to characterize the functions of specific genes of interest, including the DDX21 RNA helicase and the TCOF1 factor involved in RNA Pol I transcription and rRNA processing, using biochemical, molecular, and genome-wide approaches in mouse, Xenopus, and zebrafish models.

Steven Flavell, assistant professor in the Department of Brain and Cognitive Sciences, has been named the Lister Brothers Career Development Professor. He uses Caenorhabditis elegans to examine how neuromodulators coordinate activity in neural circuits to generate locomotion behaviors linked to the feeding or satiety states of an animal. His long-term goal is to understand how neural circuits generate sustained behavioral states, and how physiological and environmental information is integrated into these circuits. Gaining a mechanistic understanding of how these circuits function will be essential to decipher the neural bases of sleep and mood disorders.

Pablo Jarillo-Herrero, the Cecil and Ida Green Professor of Physics, explores quantum transport in novel condensed-matter systems such as graphene, transition metal dichalcogenides and topological insulators. In recent work, he has demonstrated the presence of a bandgap in graphene-based van der Waals heterostructures, novel quantum spin Hall and photothermoelectric effects in graphene, as well as light-emitting diodes, photodetectors and solar cells in the atomically thin tungsten diselenide system. He has also made advances in characterizing and manipulating the properties of other ultrathin materials such as ultrathin graphite and molybdenum disulphide, which lack graphene’s ultrarelativistic properties, but possess other unusual electronic properties.

Becky Lamason, assistant professor in the Department of Biology, has been named the Robert A. Swanson (1969) Career Development Professor of Life Sciences. She investigates how intracellular bacterial pathogens hijack host cell processes to promote infection. In particular, she studies how Rickettsia parkeri and Listeria monocytogenes move through tissues via a process called cell-to-cell spread. She utilizes cellular, molecular, genetic, biochemical, and biophysical approaches to elucidate the mechanisms of spread in order to reveal key aspects of pathogenesis and host cell biology.

Rebecca Saxe, the inaugural John W. Jarve (1978) Professor in Brain and Cognitive Sciences, is best known for her discovery of a brain region that is specialized for “theory of mind,” people’s ability to think about the thoughts, beliefs, plans, hopes and emotions of other people. Saxe continues to study this region and its role in social cognition, and is exploring the theory-of-mind system as a promising candidate for understanding the biological basis of autism. She also studies brain development in human babies, including her own.

Omer Yilmaz, assistant professor in the Department of Biology, has been named the Eisen and Chang Career Development Professor. He studies how the adult intestine is maintained by stem cells that require a cellular neighborhood, or niche, consisting in part of Paneth cells. Specifically, he investigates the molecular mechanisms by which intestinal stem cells and their Paneth cell niche respond to diverse diets to coordinate intestinal regeneration with organismal physiology, and how this response affects the formation and growth of intestinal cancers. By better understanding how intestinal stem cells adapt to diverse diets, he hopes to identify and develop new strategies that prevent and reduce the growth of cancers involving the intestinal tract, which includes the small intestine, colon, and rectum.

Yufei Zhao, assistant professor in the Department of Mathematics, has been named the Class of 1956 Career Development Professor. He has made significant contributions in combinatorics with applications to computer science. Recently, Zhao and three undergraduates solved an open problem concerning the number of independent sets in an irregular graph, a conjecture first proposed in 2001. Understanding the number of independent sets — subsets of vertices where no two vertices are adjacent — is important to solving many other combinatorial problems. In other research accomplishments, Zhao co-authored a proof with Jacob Fox and David Conlon that contributed to a better understanding of the celebrated Green-Tao theorem that states prime numbers contain arbitrarily long arithmetic progressions. Their work improves our understanding of pseudorandom structures — non-random objects with random-like properties — and has other applications in mathematics and computer science.

Martin Zwierlein, the inaugural Thomas A. Frank (1977) Professor of Physics, studies ultracold gases of atoms and molecules. These gases host novel states of matter and serve as pristine model systems for other systems in nature, such as neutron stars or high-temperature superconductors. In contrast to bulk materials, in experiments with cold gases one can freely tune the interaction between atoms and make it as strong as quantum mechanics allows. This enabled the observation of a novel robust form of superfluidity: Scaled to the density of electrons in solids, superfluidity would in fact occur far above room temperature. Under a novel quantum gas microscope with single-atom resolution, the team recently studied charge and spin correlations and transport in a Fermi-Hubbard lattice gas. This system is believed to hold the key to high-temperature superconductivity in cuprate materials. Using ultracold molecules, Zwierlein’s group also demonstrated coherence times on the order of seconds, spurring hopes for the future use of such molecules in quantum information applications.

What separates the strong from weak among connections in the brain

MIT study finds synapses develop strength with calcium, maturation.

David Orenstein | Picower Institute for Learning and Memory
July 10, 2018

To work at all, the nervous system needs its cells, or neurons, to connect and converse in a language of electrical impulses and chemical neurotransmitters. For the brain to be able to learn and adapt, it needs the connections, called synapses, to be able to strengthen or weaken. A new study by neuroscientists at MIT’s Picower Institute for Learning and Memory helps to explain why strong synapses are stronger, and how they get that way.

By pinpointing the properties of synaptic strength and how they develop, the study could help scientists better understand how synapses might be made weaker or stronger. Deficiencies in synaptic development and change, or plasticity, have a role in many brain diseases such as autism or intellectual disability, says senior author Troy Littleton, the Menicon Professor of Neuroscience in MIT’s Department of Biology.

“The importance of our study is figuring out what are the molecular features of really strong synapses versus their weaker neighbors and how can we think about ways to convert really weak synapses to stronger ones,” Littleton says.

In the study, published in eLife, Littleton’s team used innovative imaging techniques in the fruit fly Drosophila, a model organism, to focus on “active zones,” which are fundamental components of synapses. The scientists identified specific characteristics associated with a strong connection on both sides of the synapse.

The team, led by postdoc Yulia Akbergenova and graduate student Karen Cunningham, also studied how strong synapses and active zones grow, showing that those that have the longest time to mature during a few critical days of development become the strongest.

Sources of strength

The team’s study began with a survey of active zones at a junction where a motor neuron links up with a muscle. About 300 active zones were present at the neuromuscular junction, which gave the team a rich diversity of synapses to examine.

Typically, neuroscientists study neural connectivity by measuring the electrical currents in the postsynaptic neuron after activation of the presynaptic one, but such measures represent an accumulation of transmission from many active zones. In the new study, the team was able to directly visualize the activity of individual active zones with unprecedented resolution using “optical quantal imaging.”

“We optimized a genetically encoded calcium sensor to position it near active zones,” Akbergenova says. “This allows us to directly visualize activity at individual release sites. Now we can resolve synaptic transmission at the level of each individual release site.”

Across many flies, the team consistently found that only about 10 percent of the active zones at the junction were strong, as measured by a high likelihood that they would release the neurotransmitter glutamate when the presynaptic neuron was stimulated. About 70 percent of the active zones were much weaker, barely ever releasing glutamate given the same stimulation. Another 20 percent were inactive. The strongest active zones had release probabilities as much as 50 times greater than weak ones.

“The initial observation was that the synapses made by the exact same neuron are not of the same strength,” Littleton says. “So then the question became, what is it about an individual synapse that determines if it is strong or weak?”

The team ran several tests. In one experiment, for instance, they showed that the difference does not come from the supply of synaptic vesicles, the containers that hold the cache of glutamate. When they stimulated the presynaptic neurons over and over, the strong active zones retained their comparatively higher likelihood of release, even as their synaptic vesicle supply was intermixed with that of nearby active zones.

The presynaptic tests that showed a difference had to do with measuring the rate of calcium influx into the active zone and the number of channels through which that calcium reaches the active zone. Calcium ions stimulate the vesicles to fuse to the membrane of the presynaptic cell, allowing neurotransmitters to be released.

At strong synapses, active zones had a significantly greater influx of calcium ions through a notably higher abundance of calcium ion channels than weak synapse active zones did.

Stronger active zones also had more of a protein called Bruchpilot (BRP), which helps to cluster calcium channels at synapses.

Meanwhile, on the postsynaptic side, when the scientists measured the presence and distribution of glutamate receptor subtypes they found a dramatic difference at strong synapses. In the typical weak synapse, GluRIIA and GluRIIB containing receptors were pretty much mixed together. But in strong synapses, the A subtype, which is more sensitive, crowded into the center while B was pushed out to the periphery, as if to maximize the receiving cell’s ability to pick up that robust signal.

Might through maturity

With evidence of what makes strong synapses strong, the scientists then sought to determine how they get that way and why there aren’t more of them. To do that, they studied each active zone from the beginning of development to several days afterward.

“This is the first time people have been able to follow a single active zone over many days of development from the time it is born in the early larvae through its maturation as the animal grows,” Littleton says.

They did this “intravital imaging” by briefly anesthetizing the larvae every day to check for changes in the active zones. Using engineered GluRIIA and GluRIIB receptor proteins that glow different colors, they could tell when a strong synapse had formed by the characteristic concentration of A and marginalization of B.

One phenomenon they noticed was that active zone formation accelerated with each passing day of development. This turned out to be important because their main finding was that synapse strength was related to active zone age. As synapses matured over several days, they accumulated more calcium channels and BRP, meaning that they became stronger with maturity, but only a few had the chance to do it for several days.

The researchers also wanted to know whether activity affected the rate of maturation, as would be expected in a nervous system that must be responsive to an animal’s experience. By tinkering with different genes that modulate the degree of neuronal firing, they found that active zones indeed matured faster with more activity and slower when activity was reduced.

“These results provide a high resolution molecular and developmental understanding of several major factors underlying the extreme heterogeneity in release strength that exists across a population of active zones,” Cunningham says. “Since the cohort of proteins that make up the presynaptic active zone in flies is largely conserved in mammalian synapses, these results will provide valuable insight into how active zone release heterogeneity might arise in more complex neural systems.”

In addition to Littleton, Akbergenova, and Cunningham, the paper’s other authors are MIT postdoc Shirley Weiss-Sharabi and former MIT postdoc Yao Zhang.

The National Institutes of Health supported the research.

Institute Archives spotlights pioneering women at MIT

Initiative is building collections highlighting the contributions of female faculty.

Brigham Fay | MIT Libraries
July 6, 2018

A new MIT Libraries initiative aims to highlight MIT’s women faculty by acquiring, preserving, and making accessible their personal archives. The Institute Archives and Special Collections (IASC) launched the project last year with the generous support of Barbara Ostrom ’78 and Shirley Sontheimer.

The first year of the project has focused on reaching out to faculty who are ending the active phase of their careers. Four faculty members added their personal collections, comprising 234 boxes and 50 gigabytes of material. They are:

  • Nancy Hopkins, the Amgen Inc. Professor of Biology Emerita, known for making zebrafish a widely used research tool and for bringing about an investigation that resulted in the landmark 1999 report on the status of women at MIT;
  • Mary Potter, professor emerita in the Department of Brain and Cognitive Sciences, former chair of the MIT faculty, and member of the Committee of Women Faculty in the School of Science, whose research and teaching focused on experimental methods to study human cognition;
  • Mary Rowe, adjunct professor at the MIT Sloan School of Management, special assistant to the president, and ombudsperson, a conflict resolution specialist whose work led to MIT having one of the nation’s first anti-harassment policies; and
  • Sheila Widnall ’60, SM ’61, ScD ’64, Institute Professor and professor of aeronautics and astronautics, the first woman to serve as secretary of the Air Force, and the first woman to lead an entire branch of the U.S. military.

A donation of the papers of Mildred Dresselhaus, late Institute Professor emerita of electrical engineering and computer science and physics, is also forthcoming. Dresselhaus, whose work paved the way for much of today’s carbon-based nanotechnology, was also known for promoting opportunities for women in science and engineering. Discussions with additional faculty are also underway.

“We are honored to be stewards of these personal archives that have been given to MIT,” says Liz Andrews, project archivist. “We’re committed to preserving and making accessible these unique materials so they can be shared with the world into the future.”

Acquisitions of MIT administrative records provide additional context to the personal archives and a broader view on issues of gender equity and the challenges faced by women in academia. In the next phase of the project, archivists will continue to manage donations, prepare collections for use, and enlarge this core group by reaching out to female faculty who were tenured in the 1960s, ’70s, and ’80s.

Ultimately, the collections will provide not only rich resources for researchers, journalists, teachers, and students, but also, as Sontheimer says, inspiration for generations of women to come. “I’m hoping the project will encourage more women to become engaged in science, technology, and engineering,” she says.

Restricting a key cellular nutrient could slow tumor growth

Researchers identify the amino acid aspartate as a metabolic limitation in certain cancers.

Raleigh McElvery | Department of Biology
June 29, 2018

Remove tumor cells from a living organism and place them in a dish, and they will multiply even faster than before. Why this happens has long stumped cancer researchers, though many have simply focused on the mutations and chains of molecular reactions that could prompt such a disparity. Now, a group of MIT researchers suggests that the growth limitations in live organisms may stem from a different source: the cell’s environment. More specifically, they found that the amino acid aspartate serves as a key nutrient needed for the “proliferation,” or rapid duplication, of cancer cells when oxygen is not freely available.

The biologists took cancer cells from various tissue types and engineered them to convert another, more abundant substrate into aspartate using the gene encoding an enzyme from guinea pigs. This had no effect on the cells sitting in a dish, but the same cells implanted into mice engendered tumors that grew faster than ever before. The researchers had increased the cells’ aspartate supply, and in doing so successfully sped up proliferation in a living entity.

“There hasn’t been a lot of thought into what slows tumor growth in terms of the cellular environment, including the sort of food cancer cells need,” says Matthew Vander Heiden, associate professor of biology, associate director of the Koch Institute for Integrative Cancer Research, and senior author of the study. “For instance, if you’re trying to get to a given destination and I want to slow you down, my best bet is to set up a roadblock at a place on your route where you’d experience a slow-down anyways, like a long traffic light. That’s essentially what we’re interested in here — understanding what nutrients the cell is already lacking that put the brakes on proliferation, and then further limiting those nutrients to inhibit growth even more.”

Lucas Sullivan, a postdoc in Vander Heiden’s lab, is the lead author of the study, which appeared in Nature Cell Biology on June 25.

Building the case for aspartate

Isolating a single factor that could impact tumor growth within an organism is tricky business. One potential candidate came to Sullivan via a paper he co-authored with graduate student Dan Gui in 2015, which asked a somewhat controversial question: Why is it that cells need to consume oxygen through cellular respiration in order to proliferate?

It’s a rather counter-intuitive question, because some scientific literature suggests just the opposite: Cancer cells in an organism (“in vivo”) do not enjoy the same access to oxygen as they would in a dish, and therefore don’t depend on oxygen to produce enough energy to divide. Instead, they switch to a different process, fermentation, that doesn’t require oxygen. But Sullivan and Gui noted that cancer cells do rely on oxygen for another reason: to produce aspartate as a byproduct.

Aspartate, they soon confirmed, does, in fact, play a crucial role in controlling the rate of cancer cell proliferation. In another study one year later, Sullivan and Gui noted that the antidiabetic drug metformin, known to inhibit mitochondria, slowed tumor growth and decreased aspartate levels in cells in vivo. Since mitochondria are key to cellular respiration, Sullivan reasoned that blocking their function in an already oxygen-constrained environment (the tumor) might make cancer cells vulnerable to further suppression of respiration — and aspartate — explaining why metformin seems to have such a strong effect on tumor growth.

Despite being required for the synthesis of certain amino acids and all four DNA nucleotides, aspartate is already hard to come by, even in oxygen-rich environments. It’s among the lowest-concentration amino acids in our blood, and has no way to enter our cells unless a rare protein transporter is present. Precisely why aspartate import is so inefficient remains an evolutionary mystery; one possibility is that its scarcity serves as a “failsafe,” preventing cells from multiplying until they have all the resources to properly do so.

Regardless, the easiest way for cells to get aspartate is not to import it from outside, but rather to make it directly inside, breaking down another amino acid called asparagine to generate it. However, there are very few known mammals that have an enzyme capable of producing aspartate from asparagine — among them, the guinea pig.

Channeling the guinea pig

In the 1950s, a researcher named John Kidd made an accidental discovery. He injected cancer-ridden rats with sera from various animals — rabbits, horses, guinea pigs, and the like — and discovered that guinea pig serum alone shrank the rats’ tumors. It wasn’t until years later that scientists learned it was an enzyme in guinea pig blood called guinea pig asparaginase 1 (gpASNase1) that was responsible for this antitumorigenic effect. Today, we know about a host of simpler organisms with similar enzymes, including bacteria and zebrafish. In fact, bacterial asparaginase is approved as a medicine to treat acute lymphocytic leukemia.

Because guinea pigs are mammals and thus have similar metabolisms to our own, the MIT researchers decided to use gpASNase1 to increase aspartate levels in tumors in four different tumor types and ask whether the tumors would grow faster. This was the case for three of the four types: The colon cancer cells, osteosarcoma cells, and mouse pancreatic cancer cells divided more rapidly than before, but the human pancreatic cancer cells continued to proliferate at their normal pace.

“This is a relatively small sample, but you could take this to mean that not every cell in the body is as sensitive to loss of aspartate production as others,” Sullivan says. “Acquiring aspartate may be a metabolic limitation for only a subset of cancers, since aspartate can be produced via a number of different pathways, not just through asparagine conversion.”

When the researchers tried to slow tumor growth using the antidiabetic metformin, the cells expressing gpASNase1 remained unaffected — confirming Sullivan’s prior suspicion that metformin slows tumor growth specifically by impeding cellular respiration and suppressing aspartate production.

“Our initial finding connecting metformin and proliferation was very serendipitous,” he says, “but these most recent results are a clear proof of concept. They show that decreasing aspartate levels also decreases tumor growth, at least in some tumors. The next step is to determine if there are other ways to more intentionally target aspartate synthesis in certain tissues and improve our current therapeutic approaches.”

Although the efficacy of using metformin to treat cancer remains controversial, these findings indicate that one means to target tumors would be to prevent them from accessing or producing nutrients like aspartate to make new cells.

“Although there are many limitations to cancer cell proliferation, which metabolites become limiting for tumor growth has been poorly understood,” says Kivanc Birsoy, the Chapman-Perelman Assistant Professor at Rockefeller University. “This study identifies aspartate as one such limiting metabolite, and suggests that its availability could be targeted for anti-cancer therapies.”

Birsoy, a former postdoc in the lab of MIT biology professor David Sabatini, authored a paper published in the same issue of Nature Cell Biology identifying aspartate as a major growth limitation in oxygen-deprived tumors.

“These companion papers demonstrate that some tumors in vivo are really limited by the chemical processes that require oxygen to get the aspartate they need to grow, which can affect their sensitivity to drugs like metformin,” Vander Heiden says. “We’re beginning to realize that understanding which cancer patients will respond to which treatments may be determined by factors besides genetic mutations. To really get the full picture, we need to take into account where the tumor is located, its nutrient availability, and the environment in which it lives.”

The research was funded by an NIH Pathway to Independence Award, the American Cancer Society, Ludwig Center for Molecular Oncology Fund, the National Science Foundation, a National Institutes of Health Ruth Kirschstein Fellowship, Alex’s Lemonade Stand Undergraduate Research Fellowship, Damon Runyon Cancer Research Foundation, Howard Hughes Medical Institute Faculty Scholar Award, Stand Up to Cancer, Lustgarten Foundation, Ludwig Center at MIT, the National Institutes of Health, and the Koch Institute’s Center for Precision Cancer Medicine.

Advancing knowledge in medical and genetic sciences

Three MIT faculty members selected for funding from the G. Harold and Leila Y. Mathers Foundation.

Danielle Randall | Department of Chemistry
June 27, 2018

Research proposals from Laurie Boyer, associate professor of biology; Matt Shoulders, the Whitehead Career Development Associate Professor of Chemistry; and Feng Zhang, associate professor in the departments of Brain and Cognitive Sciences and Biological Engineering, Patricia and James Poitras ’63 Professor in Neuroscience, investigator at the McGovern Institute for Brain Research, and core member of the Broad Institute, have recently been selected for funding by the G. Harold and Leila Y. Mathers Foundation. These three grants from the Mathers Foundation will enable, over the next three years, key projects in the researchers’ respective labs.

Regenerative medicine holds great promise for treating heart failure, but that promise is unrealized, in part, due to a lack of sufficient understanding of heart development at the mechanistic level. Boyer’s research aims to achieve a deep, mechanistic understanding of the gene control switches that coordinate normal heart development. She then aims to leverage this knowledge and design effective strategies for rewiring faulty circuits in aging and disease.

“We are very grateful to receive support and recognition of our work from the Mathers Foundation,” said Boyer. “This award will allow us to build upon our prior work and to embark upon high risk projects that could ultimately change how we think about treating diseases resulting from faulty wiring of gene expression programs.”

Shoulders’ goal, with this support from the Mathers Foundation, is to elucidate underlying causes of osteoarthritis. There is currently no cure for osteoarthritis, which is perhaps the most common aging-related disease and is characterized by a progressive deterioration of joint cartilage culminating in inflammation, debilitating pain, and joint dysfunction. The Shoulders Group aims to test a new model for osteoarthritis — specifically, the concept that a collapse of proteostasis in aging cartilage cells creates an unrecoverable cartilage repair defect, thus initiating a self-amplifying, destructive feedback loop leading to pathology. Proteostasis collapse in aging cells is a well-known, disease-causing phenomenon that has previously been considered primarily in the context of neurodegenerative disorders. If correct, the proteostasis collapse model for osteoarthritis could one day lead to a novel class of therapeutic options for the disease.

“We are delighted to receive this generous support from the Mathers Foundation, which makes it possible for us to pursue an outside-the-box, high-risk/high-impact idea regarding the origins of osteoarthritis,” said Shoulders. “The research we are now able to pursue will not only provide fundamental, molecular-level insights into joint function, but also could change how we think about this widespread disease.”

Many genetic diseases are caused by the change of just a single base of DNA. Zhang is a leader in the field of genome editing, and he and his team have developed an array of tools based on the microbial immune CRISPR-Cas systems that can manipulate DNA and RNA in human cells. Together, these tools are changing the way molecular biology research is conducted, and they hold immense potential as therapeutic agents to correct thousands of genetic diseases. Now, with the support of the Mathers Foundation, Zhang is working to realize this potential by developing a CRISPR-based therapeutic that works at the level of RNA and offers a safe, effective route to treating a range of diseases, including diseases of the brain and central nervous system, which are difficult to treat with existing gene therapies.

“The generous support from the Mathers Foundation allows us the freedom to explore this exciting new direction for CRISPR-based technologies,” Zhang stated.

Known for their generosity and philanthropy, G. Harold and Leila Y. Mathers created their foundation with the goal of distributing their wealth among sustainable, charitable causes, with a particular interest in basic scientific research. The Mathers Foundation, whose ongoing mission is to advance knowledge in the life sciences by sponsoring scientific research and applying learnings and discoveries to benefit mankind, has issued grants since 1982.

3Q: Nancy Hopkins on the impact and potential of cancer prevention

Mechanism-based cancer prevention is poised to further decrease the numbers of U.S. cancer deaths, says MIT professor emerita.

Anne Trafton | MIT News Office
June 25, 2018

Great progress has already been made in reducing the cancer death toll through prevention, according to a new article in the June 25 issue of Genes and Development by MIT Professor Emerita Nancy Hopkins and colleagues from the Broad Institute, Fox Chase Cancer Center, University of Texas M.D. Anderson Cancer Center, and Oxford University. The potential for further reduction is great for two reasons, these researchers say: If these approaches can be more widely applied, in principle about half of current U.S. cancer deaths could be prevented over the next two to three decades; and new discoveries about how cancer develops could help scientists develop even better prevention and screening methods. MIT News spoke with Hopkins, the Amgen Inc. Professor of Biology Emerita, about why this is an exciting time for cancer research.   

Q: What does your new article reveal about the impact of cancer prevention and early detection?

A: We’ve described how researchers are integrating the dramatic advances in understanding the molecular biology of cancer to explain long-known facts about how lifestyle choices and factors in the environment affect how cancers arise, and how they progress to become detectable tumors.

Prevention and early detection have already had a tremendous impact on reducing U.S. cancer death rates. In the cancer prevention community, it is well-known that about half of current U.S. cancer deaths could, in theory, be prevented over the next two to three decades simply by the full uptake of proven methods of cancer prevention. This important fact is not as well appreciated by the larger cancer research community. This is not a fault of the cancer researchers; it simply reflects the reality that after years of investment and growth, the field of cancer is very broad, with most people working in areas of specialty.

Given the difficulties in treating established cancers, preventing many cancers entirely would obviously produce a quantum leap in reducing U.S. cancer death rates. But in addition, we believe that recent progress in understanding the molecular mechanisms that underlie cancer, and new technologies associated with these advances, could also lead to novel approaches to preventing cancer, detecting it at earlier stages when treatment is often far more successful, or even intercepting the progression of incipient cancers before they develop into tumors.

Q: What interventions have had success preventing cancer, and what promising new approaches are on the horizon?

A: Spectacular examples of preventing cancers from arising in the first place (formally called “primary cancer prevention”) include (1) successful efforts that reduced smoking rates in the United States (from over 40 percent in the 1960s to about 15 percent today) and that have led to a decline in the incidence of lung cancer and a dozen or more other types of cancer caused by smoking; (2) vaccines for cancer-causing viruses, including hepatitis B virus (a cause of liver cancer) and papilloma viruses (the cause of cervical, head and neck, and several other cancers); (3) clean air and water acts and safer workplace laws in the United States that have prevented workers as well as the general population from exposure to high concentrations of certain industrial chemicals known to cause cancer; (4) the development of drugs to cure hepatitis C infection, which are expected to prevent the development of liver cancer in the future; (5) campaigns such as the one in Australia to prevent skin cancers (particularly melanoma) by behavioral changes related to sun exposure.

As for promising new approaches to primary cancer prevention, the fuller uptake of proven methods of prevention is obviously one way to ensure a dramatic decrease in U.S. cancer death rates in the next two to three decades. This would require a greater investment in public health measures. As our article outlines, we are only now coming to understand the mechanisms by which factors such as obesity, inflammation, and some lifestyle choices synergize with long-appreciated risk factors to promote cancer. Based on this improved understanding, prevention could also be aided by research into new drugs, for example to prevent nicotine addiction or to intercept cancer progression by targeting inflammation. Exciting, too, is the possibility that DNA sequencing of cancer genomes may help to identify additional external causes of cancers based on the “mutational signatures” they leave in our DNA after exposures. If so, these agents may prove to be removable or avoidable in future.

We also discuss a second type of intervention to prevent cancer. This is screening, sometimes referred to as “secondary cancer prevention,” which can detect precancers and cancers at an early enough stage to remove them completely or treat them much more successfully. Spectacular successes to date include the Pap test that has greatly reduced deaths from cervical cancer in the United States and elsewhere; newer molecular tests focused on HPV-virus detection have proven similarly effective and are now replacing traditional Pap tests which require expert pathologic interpretations, making screening more widely available. A second success is colonoscopy, which has been enormously successful at detecting precancerous polyps and early-stage colon cancers that can be removed through the endoscope, or detected earlier when they’re more likely to be responsive to treatment. Additionally, other less-invasive methods of colon cancer screening are readily available and highly effective. Also successful has been mammography in combination with follow-up treatment. Along with greatly improved treatment, it is credited with contributing to the declining death rate from breast cancer.

Q: What types of new screening methods do you believe could help to further improve early detection of cancer?

A: Many of the most successful screening methods are for cancers that develop on body surfaces and hence can be detected by visual inspection. Imaging can be hugely successful for cancers that lie deeper in the body — breast for example — but imaging that becomes more and more sensitive can identify many abnormalities that may not be cancer at all. This can lead to costly and invasive testing of what are sometimes referred to as “incidentalomas.” Much needed are novel methods of screening that may combine imaging with other markers to make it possible to distinguish true cancers from noncancerous aberrations occurring in internal organs.

The holy grail of cancer screening would be blood tests to detect early-stage cancers, and many efforts are now directed to this goal. This is an extremely exciting time for the emergence of powerful molecular diagnostics that can help pinpoint very early-stage tumors. Some of these rely on relatively noninvasive methods, such as measurement of DNA signatures found in the blood. Widespread availability and demonstrated effectiveness of such methods would greatly enhance the field of secondary prevention, but there remain substantial challenges and it is not yet known if this approach will succeed. Also very exciting are methods being developed by bioengineers here at MIT and in other places to try to amplify other signals arising from tumors that may be difficult to detect otherwise and include, for example, completely noninvasive urine-based tests.

After decades of effort, cancer is gradually coming under control thanks to prevention and early detection, improvements in “conventional” cancer treatment (imaging, surgery, radiation, chemotherapy, and some adjuvant therapies), and novel approaches to treatment based on immunotherapy and more personalized drugs. But it is likely that for now, the full implementation of proven methods of prevention offers the most reliable approach to large-scale reduction of U.S. cancer deaths. Meanwhile, research into novel mechanism-based approaches to preventing the initiation and progression of cancer may one day prevent the majority of cancers from occurring in the first place.

Fighting implicit bias in STEM with increased cognitive control

In a visit with the Department of Biology, Lydia Villa-Komaroff PhD '75 explains how “thinking fast makes changing slow.”

Raleigh McElvery | Department of Biology
June 26, 2018

The brain carries out many processes automatically and without our conscious recognition. This means that when we encounter certain information — like the name on a resume suggesting a specific gender or race — we make an immediate and unintentional judgement. At the Building 68 Department of Biology retreat on June 14, keynote speaker Lydia Villa-Komaroff PhD ’75 explained the physiological roots of this implicit bias and offered potential solutions.

Villa-Komaroff is a biologist and businesswoman advocating for diversity in STEM. When she received her PhD from the Department of Biology in 1975, she was one of the first Mexican American women to receive a doctorate in the sciences. She served as the chief operating officer and vice president of research for MIT’s Whitehead Institute for Biomedical Research for two years, and later founded her own one-woman consulting firm, Intersections, SBD. She is a board member, former CEO, and former chief science officer of the biotech company Cytonome/ST, LLC, and a member of the Biology Department Visiting Committee. She is also a co-founding member of the Society for the Advancement of Chicanos/Hispanics and Native Americans in Science (SACNAS).

According to Villa-Komaroff, it’s not that STEM fields are completely without diversity. Rather, there are fewer members of underrepresented groups in positions of academic power relative to their peer populations. Women and underrepresented minorities tend to hold instructor roles or assistant professorships, and are less likely to become full professors, deans, and presidents.

“There has been some progress,” she said, “since the proportion of women and underrepresented groups has climbed. Women have climbed at a faster rate than have individuals from underrepresented ethnic groups, but the rate of increase in both of those groups is still slow relative to the changing population. Clearly something is going on in our society, and it has been going on for a very long time, longer than any of us have been around. So what might that be?”

Data are amassing, and not only from sociologists and psychologists, but from neuroscientists as well, Villa-Komaroff pointed out. Research has shown that humans are wired to make quick decisions that serve us well most of the time, but these inclinations can also cause us to misjudge the abilities of the person before us.

Since the brain is constantly confronted with a deluge of information, over the course of time it developed two systems to sift through all the input. System 1 is automatic: It’s running all the time, requires very little energy, and is crucial to our survival — permitting us to recognize danger and possible threats in a split second. It also allows us to complete habitual tasks, like playing the violin or holding a pipette, with very little conscious effort.

System 2 begets what we generally consider to be “thinking.” It is deliberate and requires a lot of energy to run. Often without our conscious awareness, System 1 overtakes System 2 and our decisions are driven by our instincts. Villa-Komaroff said we need to fight this tendency to “trust our instincts” when it comes time to select colleagues or students. It’s not simply about activating your thinking; it’s about challenging it.

“I’m sorry to say that we — that is, those of us in the hard sciences — have been the most resistant to thinking that this might be the case,” she said. “I can’t tell you how many times my colleagues have said to me, ‘This is not a problem for us because we care only about merit, and that is what we are basing our decisions upon.’ It’s true we care about merit, but that is often not the factor on which we base our initial decisions.”

In fact, it has been shown that science faculty presented with two applications for a lab manager position, identical except for the names “Jennifer” and “John,” will evaluate John as more competent, give him more money, and offer him more career mentorship. The kicker is that these implicit biases aren’t just limited to a particular segment of the population. Women often have biases against other women, and the same is true for members of underrepresented groups.

But hope is not lost, Villa-Komaroff said. We can do something to counter this tendency if we just teach ourselves to recognize our own biases and deliberately work to override them.

In one study, researchers noticed that the panels at the American Society for Microbiology General Meeting consisted primarily of males. The panel committees that selected them also happened to be predominantly all-male. The researchers presented these data to the selection committees, and gave them an explicit call to action: Do something about it. The next year, the number of female speakers increased, and the number of all-male session planning committees decreased.

In another study, researchers took 92 medicine, science, and engineering departments from the University of Wisconsin at Madison and divided them into a matching control and test group, where the test group was invited to enroll in a short, two-and-a-half hour workshop on implicit bias. Despite the fact that, on average, just 25 percent of the faculty from the test group departments attended the session, afterwards they reported more self-initiated efforts to promote gender equity and better conflict resolution. Most notably, over the next several years the percentage of hires from underrepresented groups rose from 8 percent to 11 percent, while the controls saw a decrease from 10 percent to 5 percent. 

As part of the Strategies and Tactics to Increase Diversity and Excellence (STRIDE) program at the University of Michigan, full professors must now attend workshops on implicit bias in the fall during peak faculty recruitment season. Between 2001 and 2007, the percentage of faculty searches resulting in a female hire rose from 15 percent to 32 percent in STEM disciplines. If nothing else, these kinds of interventions may kick in the second, deliberate decision-making system, and allow us to see past the name on the resume.