MITx has awarded its second annual Prize for Teaching and Learning in MOOCs (massive open online courses) to two instructors selected from a pool of individuals who made significant contributions to MITx MOOC coursework offered on edX.org during the 2017 calendar year.
The MITx Prize for Teaching and Learning recognizes educators who have developed innovative digital course content that engages online learners around the world through digital classrooms. The award debuted last year as part of the Institute’s effort to encourage the development of new MOOC methods and technologies.
Chris Caplice, executive director of the MIT Center for Transportation and Logistics, received the award for his work on the MicroMasters Program in Supply Chain Management. MIT launched the MicroMasters credential in fall of 2015. Since then, more than 25 universities have also launched MicroMasters programs through edX. The MicroMasters Program in Supply Chain Management includes five courses of 13 weeks each, plus a final comprehensive exam.
Individuals who pass the five courses plus the exam earn the MicroMasters credential and may apply for a master’s degree at MIT and other universities. Caplice notes that online learners can access the same exceptional resources as those taking courses on campus.
“Each course is rigorous and mirrors what we do in the campus classes here at MIT,” he says.
The award for Caplice praises his “dedication to creating a high-quality learner experience both in the courses and beyond, and his work to ensure the value of the credential through rigorous assessment.”
Justin Reich, executive director of the MIT Teaching Systems Lab, was selected for his work on 11.154x (Launching Innovation in Schools), a six-week course targeted to school leaders including teachers, principals, superintendents, school board members, and others.
“It’s for anyone who wants to make their schools better,” Reich says.
The course was recognized for creating a collaborative environment in which educators take the course in facilitated learning circles, where school leaders are encouraged to learn and create change together. The course was also lauded for encouraging educators to take specific actions each week that they can implement immediately in their own schools.
“We envision our courses not just as a learning experience you do and apply afterwards,” he explains. “You are encouraged to take what you are learning and immediately start applying it in your context to get feedback and iteratively improve a change initiative together.”
“Chris and I are both really interested in professionals out in the working world who are committed learners,” adds Reich. He also points out that distance learning is not new, having been a part of human learning for over a century from the earliest correspondence courses. “MOOCs like ours are built on that long tradition and we are in our century taking advantage of the new technologies to serve our learners today interested in life-long learning experiences,” he says.
More than 10,000 learners have earned over 20,000 individual course certificates, and 1,062 learners have completed the MicroMasters credential. On June 8, the first group of 40 individuals graduated with a master’s degree in supply chain management under the new blended master’s program.
“This year’s winners reflect the dedication of all of our MITx faculty in sharing the excellence of our on-campus MIT experience with global learners,” says MIT Professor Krishna Rajagopal, dean of digital learning. “Their innovative and timely courses have set a high standard for online learning that influences the whole field of this form of education.”
Anyone interested in more information about MOOCs on MITx is encouraged to visit MIT’s Open Learning site. MIT Open Learning will begin accepting nominations for next year’s prize in early 2019.
Noelle Selin, an associate professor with a joint appointment in the MIT Institute for Data, Systems, and Society (IDSS) and the Department of Earth, Atmospheric and Planetary Sciences (EAPS), has been appointed the next director for the Technology and Policy Program (TPP) at MIT.
TPP is a two-year, interdisciplinary master of science program that combines science and engineering with social sciences to educate students whose research addresses important technological issues confronting society. Over more than 40 years, TPP’s more than 1,200 alumni have gone on to work in industry and government as well as academia.
Selin’s own research links science and policy, particularly on the topic of atmospheric pollutants. Her interdisciplinary research aims to inform decision-making on air pollution, climate change, and hazardous substances. A major focus is on mercury pollution, where she has engaged with policy-makers both domestically and internationally. In addition to her work modeling the transport and fate of pollutants, she has published articles and book chapters on the interactions between science and policy in international environmental negotiations, in particular focusing on global efforts to regulate hazardous substances.
“Noelle is an excellent educator and teacher, and has substantially contributed to the curriculum in IDSS and TPP,” said IDSS Director Munther Dahleh, a professor in IDSS and MIT’s Department of Electrical Engineering and Computer Science. While serving as associate director of TPP, Selin managed the admission process and led a curricular development effort that revised the set of course requirements for TPP students. In 2018, she shared the Joseph A. Martore ’75 Award for Exceptional Contributions to Education in IDSS for her contributions to the core TPP course Science, Technology, and Public Policy. She also received TPP’s Faculty Appreciation Award in 2013.
Selin first came to MIT in 2007 as a postdoc at the Center for Global Change Science. She joined the Engineering Systems Division as an assistant professor in 2010 with a joint appointment in EAPS. She joined IDSS as a core faculty member when it was launched in 2015. She was promoted to associate professor with tenure in July 2017.
In the area of policy, Selin had prior appointments as a research associate with the Initiative on Science and Technology for Sustainability at Harvard’s Kennedy School and as a visiting researcher at the European Environment Agency in Copenhagen, Denmark. She also previously worked on chemicals issues at the U.S. Environmental Protection Agency. She holds a BA in environmental science and public policy and an MA and PhD in earth and planetary sciences, all from Harvard University.
Selin received the NSF CAREER Award and two best Environmental Policy paper awards from the journal Environmental Science and Technology (2015 and 2016). She is a Kavli Frontiers of Science Fellow, a member of the Global Young Academy (2014-2018), a fellow of the AAAS Leshner Leadership Institute for Public Engagement (2016-2017), and a Leopold Leadership Fellow (2013).
“I am truly honored to be named as the next Director of TPP,” says Selin. “I see TPP as a hub for education, research, and practice in mobilizing technical expertise to inform policy, within MIT and beyond, and I am excited to help shape its future.”
Faculty and senior researchers at MIT are teaming up in unprecedented ways to help define the next frontier in human and machine intelligence with projects that delve into everything from fundamental research to societal applications for new technologies.
The MIT-SenseTime Alliance on Artificial Intelligence, a program within the MIT Quest for Intelligence, has announced funding for 27 projects involving about 50 principal investigators from departments and labs within engineering, science, architecture and planning, management, and the humanities and social sciences.
SenseTime, a leading artificial intelligence company founded by MIT alumnus Xiao’ou Tang PhD ’96, jointly created the alliance with MIT earlier this year to define the next frontier of human and machine intelligence. The selected projects, each one year in duration, are intended to kick-start new efforts and initiate longer-term work.
“We were thrilled with the range and creativity of the proposals we received,” says Anantha P. Chandrakasan, dean of the MIT School of Engineering and Vannevar Bush Professor of Electrical Engineering and Computer Science, who chairs the MIT-SenseTime Alliance.
“It is particularly exciting to see faculty from so many disciplines join together to embark on projects that speak to the major objectives of the MIT Quest for Intelligence,” he says.
Chandrakasan points to the funded exploratory “moonshot” projects that align with the work of “The Core,” which seeks to advance the science and engineering of human and machine intelligence, explore human intelligence using insights from computer science, and develop new machine-learning algorithms.
One such project features MIT faculty from the Department of Linguistics and Philosophy, the Department of Brain and Cognitive Sciences, and MIT’s Computer Science and Artificial Intelligence Laboratory. These professors are working on a “language moonshot” that explores how insights from linguistic theory can be transformed into machine-learning algorithms to better approximate how people converse.
Several MIT neuroscientists from the McGovern Institute for Brain Research are joining with an MIT physics professor and others in a moonshot project that explores the biological mechanisms underlying learning and how to incorporate that knowledge into more robust machine-learning techniques. Meanwhile, MIT faculty from computer science and brain and cognitive sciences are collaborating in a project focused on how artificial systems, like robots, can learn common sense knowledge.
Also popular were projects described as “mission-driven” that aligned, as Chandrakasan notes, with the MIT Quest for Intelligence’s second key entity, “The Bridge,” which is dedicated to the application of MIT discoveries in natural and artificial intelligence to all disciplines.
Seven principal investigators from CSAIL, the Department of Aeronautics and Astronautics, the Department of Earth, Atmospheric and Planetary Sciences, and other units are collaborating on a mission-driven project that will explore drone intelligence for societal applications.
Another mission-driven project features a faculty member from the Sloan School of Management teaming up with an MIT professor of electrical engineering and computer science who proposes to use machine learning to amplify human intelligence and human employment in future manufacturing. Another involves two MIT mechanical engineering faculty focusing on developing tools in product design and systems architecture that capitalize on strategies that combine human intelligence with machine intelligence.
Eight collaborative projects that involve two or more principal investigators also received funding. For example, Media Lab professors will evaluate the use of a social robot as a personalized emotional wellness coach; electrical engineering and computer science professors will use embedded AI to track neurocognitive decline in the elderly; nuclear science and engineering, materials science and engineering, and electrical engineering professors will develop an all-solid device for low-power, fast, brain-like computing; Sloan professors will research dynamic portfolio management with deep learning; and mechanical engineering professors will investigate a specific approach to low-power image regression at the edge for the internet of things.
Additional funding was dedicated to 13 projects led by individual faculty from a range of departments, including media arts and sciences, chemistry, computer science, anthropology, physics, brain and cognitive sciences, and more. These projects are wide-ranging and include pursuits such as harnessing artificial intelligence to design vaccines for treating metastatic skin cancer, quantifying the robustness of neural networks, and using AI to increase equity in decision making for agriculture market design.
“These projects are just the beginning of what promises to be one of the most important initiatives MIT has ever started,” says Antonio Torralba, the director of the MIT Quest for Intelligence. “SenseTime’s tremendous support for pivotal research will deepen the footprint of The Quest on campus.”
Not every technology platform or tool you use, or website you visit, comes straight from a startup or Silicon Valley. Many are developed by nonprofits, government agencies, or advocacy groups practicing community technology, technology for social justice, or “public interest technology.” What can we learn from these community-engaged technology practitioners? How can organizations that work for equity achieve the diversity they often advocate for in society?
Sasha Costanza-Chock, an associate professor in Comparative Media Studies/Writing at MIT, is the lead author of a new report, titled “#MoreThanCode: Practitioners reimagine the landscape of technology for justice and equity,” which delves into these issues. The report distills 109 interviews, 11 focus groups, and data from thousands of organizations into five high-level recommendations for those who want to use technology for the public good. (The report was funded by NetGain, the Ford Foundation, Mozilla, Code for America, and OTI.) MIT News sat down with Costanza-Chock to talk about the report and its recommendations.
Q: Who are the practitioners in this tech ecosystem?
A: “#MoreThanCode” is a report about people working to use technology for social good and for social justice — the space the report’s funders call “public interest technology.” There’s a very wide range of roles for people who use technology to advance the public interest, and it’s not only software developers who are active.
One of our key recommendations is that when funders and organizations — be they city governments or nonprofits or for-profit companies — are putting together teams, they need to think broadly about who is on that team. We found that a good team to develop technology that’s going to advance social justice or the public interest is going to include software developers, graphic designers, researchers, and domain [subject] experts. Domain experts might have formal expertise, but the most important team member is someone with lived experience of the particular condition that technology is supposed to address.
Q: On that note, can you say a little about the current state of social diversity in this sector?
A: Certainly. One of our key goals in the report was to produce baseline knowledge about who’s working in public interest technology. And unfortunately, in terms of hard data, the main finding is that we don’t have it, because many organizations in the space have not published diversity and inclusivity data about who their staff are, who their volunteers are.
And so one recommendation in the report is that everybody who says they’re doing public interest technology, or using technology for good, should be gathering data about, at the very least, race and gender, and publicly releasing it. Gathering and releasing diversity data, and setting time-bound, public targets for diversity and inclusion goals, are two main things that we know work in organizations, from the evidence-based literature. Good intentions aren’t enough.
Although we weren’t able to gather that kind of sector-wide diversity data, we did interview 109 people and conduct focus groups with 79 more, and asked them about their experiences with racism, sexism, transphobia, ableism, and other common forms of systematic marginalization people experience. About half of the people we talked to for the report said they had experiences like that.
The leading recommendation at the end of the report is summed up in a slogan from the disability justice movement, which is, “Nothing about us, without us.” The idea is that when you’re going to develop a technology to help a community, you have to include members of that community from the beginning of your process … and ideally in the governance of the project when it’s deployed.
Q: The report also suggests people should not always look for “silver bullets” or instant answers from technology alone. Why is that, and what are some of the other recommendations from the report?
A: I’m not going to say it’s never about finding a new [technological] solution, but over and over again, the people we interviewed said the projects that were most successful were deployments of resilient, proven technology, rather than some super-exciting new app that’s suddenly supposed to solve everything.
One recommendation is that when organizations set up tech teams, someone from the community should be on the design team from beginning to end, not just brought in for a moment of consultation. That’s a pretty important takeaway. A lot of people told us it was important to go further than initial consultations — keeping community members on the design team throughout is a best practice we recommend.
Some people talked about creating tech clinics, modeled after legal clinics in education. That would be something a place like MIT could think about. Law schools often require students to spend a certain number of hours providing legal services pro bono to people in different domains who otherwise can’t afford lawyers. It would be interesting to consider whether there could be a [similar] tech clinic concept.
Our final recommendation was about recognizing organizational models beyond traditional startups, government offices, or 501c3 nonprofits — for example, consider tech cooperatives, or ad hoc networks that emerge around a crisis moment. These are hard for investors or foundations to fund: Whom do you fund? And yet a lot of really important technology projects are informal. In the wake of Hurricane Maria in Puerto Rico, there were hundreds of developers, techies, and community organizers doing everything they could, ad hoc, to get communications infrastructure back up.
People should develop strategies for supporting those kinds of networks when they do spring up. For funders, that may mean setting up a crisis response fund with a mechanism to rapidly dispense smaller amounts of funds. And members of the MIT community who are creating new companies to bring “tech for good” innovations to market should consider worker-owned cooperatives, platform co-ops, and other models that internally mirror the kind of world they’d like to build.
For decades, researchers have been exploring ways to replicate on Earth the physical process of fusion that occurs naturally in the sun and other stars. Confined by its own strong gravitational field, the sun’s burning plasma is a sphere of fusing particles, producing the heat and light that make life possible on Earth. But the path to creating a commercially viable fusion reactor, which would provide the world with a virtually endless source of clean energy, is filled with challenges.
Researchers have focused on the tokamak, a device that heats and confines turbulent plasma fuel in a donut-shaped chamber long enough to create fusion. Because plasma responds to magnetic fields, the torus is wrapped in magnets, which guide the fusing plasma particles around the toroidal chamber and away from the walls. Tokamaks have been able to sustain these reactions only in short pulses. To be a practical source of energy, they will need to operate in a steady state, around the clock.
Researchers at MIT’s Plasma Science and Fusion Center (PSFC) have now demonstrated how microwaves can be used to overcome barriers to steady-state tokamak operation. In experiments performed on MIT’s Alcator C-Mod tokamak before it ended operation in September 2016, research scientist Seung Gyou Baek and his colleagues studied a method of driving current to heat the plasma called Lower Hybrid Current Drive (LHCD). The technique generates plasma current by launching microwaves into the tokamak, pushing the electrons in one direction — a prerequisite for steady-state operation.
Furthermore, the strength of the Alcator magnets has allowed researchers to investigate LHCD at a plasma density high enough to be relevant for a fusion reactor. The encouraging results of their experiments have been published in Physical Review Letters.
“The conventional way of running a tokamak uses a central solenoid to drive the current inductively,” Baek says, referring to the magnetic coil that fills the center of the torus. “But that inherently restricts the duration of the tokamak pulse, which in turn limits the ability to scale the tokamak into a steady-state power reactor.”
Baek and his colleagues believe LHCD is the solution to this problem.
MIT scientists have pioneered LHCD since the 1970s, using a series of “Alcator” tokamaks known for their compact size and high magnetic fields. On Alcator C-Mod, LHCD was found to be efficient for driving currents at low density, demonstrating that plasma current could be sustained non-inductively. However, researchers discovered that as they raised the density in these experiments to the higher levels necessary for steady-state operation, the effectiveness of LHCD to generate plasma current disappeared.
This fall-off in effectiveness as density increased was first studied on Alcator C-Mod by research scientist Gregory Wallace.
“He measured the fall-off to be much faster than expected, which was not predicted by theory,” Baek explains. “For the last decade people have been trying to understand this, because unless this problem is solved you can’t really use this in a reactor.”
Researchers needed to find a way to boost effectiveness and overcome the LHCD density limit. Finding the answer would require a close examination of how lower hybrid (LH) waves respond to the tokamak environment.
Driving the current
Lower hybrid waves drive plasma current by transferring their momentum and energy to electrons in the plasma.
Head of the PSFC’s Physics Theory and Computation Division, senior research scientist Paul Bonoli compares the process to surfing.
“You are on a surf board and you have a wave come by. If you just sit there the wave will kind of go by you,” Bonoli says. “But if you start paddling, and you get near the same speed as the wave, the wave picks you up and starts transferring energy to the surf board. Well, if you inject radio waves, like LH waves, that are moving at velocities near the speed of the particles in the plasma, the waves start to give up their energy to these particles.”
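Bonoli’s surfing picture corresponds to the standard wave-particle resonance condition behind Landau damping, which is the mechanism by which lower hybrid waves give up momentum to electrons. As a schematic illustration (stated here in textbook form, not drawn from the paper itself):

\[
\omega - k_{\parallel} v_{\parallel} = 0
\quad\Longleftrightarrow\quad
v_{\parallel} \approx \frac{\omega}{k_{\parallel}},
\]

where \(\omega\) is the wave frequency, \(k_{\parallel}\) the wavenumber along the magnetic field, and \(v_{\parallel}\) the electron velocity along the field. Electrons traveling slightly slower than the wave’s parallel phase velocity are “picked up” like the surfer, absorbing wave momentum in one direction and thereby producing a net driven current.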
Temperatures in today’s tokamaks — including C-Mod — are not high enough to provide good matching conditions for the wave to transfer all its momentum to the plasma particles on the first pass from the antenna, which launches the waves to the core plasma. Consequently, researchers noticed, the injected microwave travels through the core of the plasma and beyond, eventually interacting multiple times with the edge, where its power dissipates, particularly when the density is high.
Exploring the scrape-off layer
Baek describes this edge as a boundary area outside the main core of the plasma where, in order to control the plasma, researchers can drain — or “scrape-off” — heat, particles, and impurities through a divertor. This edge has turbulence, which, at higher densities, interacts with the injected microwaves, scattering them, and dissipating their energy.
“The scrape-off layer is a very thin region. In the past RF scientists didn’t really pay attention to it,” Baek says. “Our experiments have shown in the last several years that interaction there can be really important in understanding the problem, and by controlling it properly you can overcome the density limit problem.”
Baek credits extensive simulations by Wallace and PSFC research scientist Syun’ichi Shiraiwa for indicating that the scrape-off layer was most likely the location where LH wave power was being lost.
Detailed research on the edge and scrape-off layer conducted on Alcator C-Mod in the last two decades has documented that raising the total electrical current in the plasma narrows the width of the scrape-off layer and reduces the level of turbulence there, suggesting that it may reduce or eliminate its deleterious effects on the microwaves.
Motivated by this, PSFC researchers devised an LHCD experiment to push the total current from 500,000 amps to 1,400,000 amps, enabled by C-Mod’s high-field tokamak operation. They found that the effectiveness of LHCD to generate plasma current, which had been lost at high density, reappeared. Making the width of the turbulent scrape-off layer very narrow prevents it from dissipating the microwaves, allowing densities beyond the LHCD density limit to be reached.
The results from these experiments suggest a path to a steady-state fusion reactor. Baek believes they also provide additional experimental support to proposals by the PSFC to place the LHCD antenna at the high-field (inboard) side of a tokamak, near the central solenoid. Research suggests that placing it in this quiet area, as opposed to the turbulent outer midplane, would minimize destructive wave interactions in the plasma edge, while protecting the antenna and increasing its effectiveness. Principal research scientist Steven Wukitch is currently pursuing new LHCD research in this area through the PSFC’s collaboration with the DIII-D tokamak in San Diego.
Although existing tokamaks with LHCD are not operating at the high densities of C-Mod, Baek feels that the relationship between the current drive and the scrape-off layer could be investigated on any tokamak.
“I hope our recipe for improving LHCD performance will be explored on other machines, and that these results invigorate further research toward steady-state tokamak operation,” he says.
All growth and reproduction rely on a cell’s ability to replicate its chromosomes and produce accurate copies of itself. Every step of this process takes place within that cell.
Based on this observation, scientists have studied the replication and segregation of chromosomes as a phenomenon exclusively internal to the cell. They traditionally rely on warm, nutrient-rich cultures that promote growth but bear little resemblance to the cell’s external surroundings in its natural environment.
New research by a group of MIT biologists reveals that this long-held assumption is incorrect. In a paper published this week, they describe how some types of cells rely on signals from surrounding tissue in order to maintain chromosome stability and segregate accurately.
Kristin Knouse, a fellow at the Whitehead Institute, is the lead author of the paper, which was published online in the journal Cell on Aug. 23. Angelika Amon, the Kathleen and Curtis Marble Professor in Cancer Research in the Department of Biology and a member of the Koch Institute for Integrative Cancer Research, is the senior author.
“The main takeaway from this paper is that we must study cells in their native tissues to really understand their biology,” Amon says. “Results obtained from cell lines that have evolved to divide on plastic dishes do not paint the whole picture.”
When cells replicate, the newly duplicated chromosomes line up within the cell and cellular structures pull one copy to each side. The cell then divides down the middle, separating one copy of each chromosome into each new daughter cell.
At least, that’s how it’s supposed to work. In reality, there are sometimes errors in the process of separating chromosomes into daughter cells, known as chromosome mis-segregation. Some errors simply result in damage to the DNA. Other errors can result in the chromosomes being unevenly divided between daughter cells, a condition called aneuploidy.
These errors are almost always harmful to cell development and can be fatal. In developing embryos, aneuploidy can cause miscarriages or developmental disorders such as Down syndrome. In adults, chromosome instability is seen in a large number of cancers.
To study these errors, scientists have historically removed cells from their surrounding tissue and placed them into easily controlled plastic cultures.
“Chromosome segregation has been studied in a dish for decades,” Knouse says. “I think the assumption was … a cell would segregate chromosomes the same way in a dish as it would in a tissue because everything was happening inside the cell.”
However, in previous work, Knouse had found that reported rates of aneuploidy in cells grown in culture were much higher than the rates she found in cells grown within their native tissue. This prompted her and her colleagues to investigate whether a cell’s surroundings influence the accuracy with which it divides.
To answer this question, they compared mis-segregation rates between five different cell types in native and non-native environments.
But not all cells’ native environments are the same. Some cells, like those that form skin, grow in a very structured context, where they always have neighbors and defined directions for growth. Other cells, however, like cells in the blood, have greater independence, with little interaction with the surrounding tissue.
In the new study, the researchers observed that cells that grew in structured environments in their native tissues divided accurately within those tissues. But once they were placed into a dish, the frequency of chromosome mis-segregation drastically increased. The cells that were less tied to structures in their tissue were not affected by the lack of architecture in culture dishes.
The researchers found that maintaining the architectural conditions of the cell’s native environment is essential for chromosome stability. Cells removed from the context of their tissue don’t always faithfully represent natural processes.
The researchers determined that architecture didn’t have an obvious effect on the expression of known genes involved in segregation. The disruption in tissue architecture likely causes mechanical changes that disrupt segregation, in a manner that is independent of mutations or gene expression changes.
“It was surprising to us that for something so intrinsic to the cell — something that's happening entirely within the cell and so fundamental to the cell's existence — where that cell is sitting actually matters quite a bit,” Knouse says.
Through the Cancer Genome Project, scientists learned that despite high rates of chromosome mis-segregation, many cancers lack any mutations to the cellular machinery that controls chromosome partitioning. This left scientists searching for the cause of these frequent division errors. This study suggests that tissue architecture could be the culprit.
Cancer development often involves disruption of tissue architecture, whether during tumor growth or metastasis. This disruption of the extracellular environment could trigger chromosome segregation errors in the cells within the tumor.
“I think [this paper] really could be the explanation for why certain kinds of cancers become chromosomally unstable,” says Iain Cheeseman, a professor of biology at MIT and a member of the Whitehead Institute, who was not involved in the study.
The results point not only to a new understanding of the cellular mechanical triggers and effects of cancers, but also to a new understanding of how cell biology must be studied.
“Clearly a two-dimensional culture system does not faithfully recapitulate even the most fundamental processes, like chromosome segregation,” Knouse says. “As cell biologists we really must start recognizing that context matters.”
This work was supported by the National Institutes of Health, the Kathy and Curt Marble Cancer Research Fund, and the Koch Institute Support (core) Grant from the National Cancer Institute.
Alan S. Hanson PhD '77, who served as executive director of the International Nuclear Leadership Education Program within the MIT Department of Nuclear Science and Engineering (NSE) until June 2015, died on Aug. 4 from metastatic cancer at the age of 71.
Hanson was born in Chicago, Illinois, on Dec. 27, 1946. He received a bachelor’s degree in mechanical engineering in 1969 from Stanford University, and a PhD in nuclear engineering in 1977 from MIT. His passion for discovery and learning led him back to school to earn a master's degree in liberal studies at Georgetown University in 2009.
Before returning to MIT in 2012, Hanson served for more than 30 years in increasingly senior executive positions in the nuclear industry, accumulating broad managerial, international, and engineering experience, most of it devoted to the back end of the nuclear fuel cycle, nuclear waste management, and issues of non-proliferation.
Hanson joined the International Atomic Energy Agency (IAEA) in Vienna, Austria, in 1979, where he served first as coordinator of the International Spent Fuel Management Program and later as policy analyst with responsibilities in the areas of safeguards and non-proliferation policies.
Upon returning to the U.S., Hanson served as president and CEO of Transnuclear, Inc. In 2005 he was appointed executive vice president of technologies and used fuel management at AREVA NC Inc. In this position he was responsible for all of AREVA’s activities in the back end of the nuclear fuel cycle in the U.S.
He completed a year-long assignment as a visiting scholar at the Center for International Security and Cooperation (CISAC) at Stanford University on loan from AREVA in 2011. At CISAC he conducted research on the worldwide nuclear supply chain and international fuel assurance mechanisms.
In 2012 Hanson returned to MIT as the executive director of NSE’s International Nuclear Leadership Education Program (INLEP). INLEP, an intensive executive education course designed for nuclear leaders from countries new to nuclear power, was the only program of its kind in the world.
Hanson proved to be an outstanding leader of INLEP. Professor Richard Lester, then head of NSE, recalled him as "a man of great ability and great integrity." Added Lester, "Alan listened carefully before he spoke, but he never hesitated to say what he thought. We could always be sure that he would put the interests of the department and MIT first, and during his service as INLEP executive director I relied on him implicitly for his wisdom, judgment, commitment and dedication."
Hanson’s love for the outdoors took him on hiking trips on the Appalachian Trail, in Acadia National Park, and through Austria and Ireland. Over the last few years he helped clear trails with the Lewisboro Land Trust. Jazz, classical music, and reading were also among Hanson’s joys.
Loved by all who knew him, Hanson is remembered as a kind, brilliant man, devoted to family, friends, and colleagues alike. He is survived by his wife of 34 years, Bairbre; his daughter Alanna Reed and son-in-law Tim Reed; son Colin Hanson; his two grandchildren, Madeline and Molly Reed; his sister Shelley Ruth; and nephew Jason.
A celebration of his life will be held on Sept. 8 in Croton Falls, New York.
Chronic rhinosinusitis is distinct from your average case of seasonal allergies. It causes the sinuses to become inflamed and swollen for months to years at a time, leading to difficulty breathing and other symptoms that make patients feel miserable. In some people, this condition also produces tissue outgrowths known as nasal polyps, which, when severe enough, have to be removed surgically.
By performing a genome-wide analysis of thousands of single cells from human patients, MIT and Brigham and Women’s Hospital researchers have created the first global cellular map of a human barrier tissue during inflammation. Analysis of this data led them to propose a novel mechanism that may explain what sustains chronic rhinosinusitis.
Their findings also offer an explanation for why some rhinosinusitis patients develop nasal polyps, which arise from epithelial cells that line the respiratory tract. Furthermore, their study may have broader implications for how researchers think about and treat other chronic inflammatory diseases of barrier tissues, such as asthma, eczema, and inflammatory bowel disease.
“We saw major gene-expression differences in subsets of epithelial cells which had been previously obscured in bulk tissue analyses,” says Alex K. Shalek, the Pfizer-Laubach Career Development Assistant Professor of Chemistry, a core member of MIT’s Institute for Medical Engineering and Science (IMES), and an extramural member of the Koch Institute for Integrative Cancer Research, as well as an associate member of the Ragon and Broad Institutes.
“When you look across the entire transcriptome, comparing cells from patients with different disease statuses over thousands of genes, you can start to understand the relationships between them and discover which transcriptional programs have supplanted the usual ones,” Shalek says.
The lead authors of the paper, which appears in the Aug. 22 issue of Nature, are Jose Ordovas-Montanes, an IMES postdoctoral fellow supported by the Damon Runyon Cancer Research Foundation, and Daniel Dwyer, a research fellow at Brigham and Women’s Hospital. Shalek and Nora Barrett, an assistant professor of medicine at Brigham and Women’s, are the paper’s senior authors.
Clinical single-cell RNA sequencing
Last year, Shalek and his colleagues developed a new portable technology that enables rapid sequencing of the RNA contents of several thousand single cells in parallel from tiny clinical samples. This technology, known as Seq-Well, allows researchers to see what transcriptional programs are turned on inside individual cells, giving them insight into the identities and functions of those cells.
In their latest study, the MIT and Brigham and Women’s researchers applied this technology to cells from the upper respiratory tract of patients suffering from chronic rhinosinusitis, with the hypothesis that distinct gene-expression patterns within epithelial cells might reveal why some patients develop nasal polyps while others do not.
This analysis revealed striking differences in the genes expressed in basal epithelial cells (a type of tissue stem cell) from patients with and without nasal polyps. In nonpolyp patients and in healthy people, these cells normally form a flat base layer of tissue that coats the inside of the nasal passages. In patients with polyps, these cells begin to pile up and form thicker layers instead of differentiating into epithelial cell subsets needed for host defense.
This type of gross tissue abnormality has been observed through histology for decades, but the new study revealed that basal cells from patients with polyps had turned on a specific program of gene expression that explains their blunted differentiation trajectory. This program appears to be sustained directly by IL-4 and IL-13, immune response cytokines known to drive allergic inflammation when overproduced at pathologic levels.
The researchers found that these basal cells also retain a “memory” of their exposure to IL-4 and IL-13: When the researchers removed basal cells from patients with and without polyps, grew them under identical conditions for a month, and then exposed them to IL-4 and IL-13, they found that even before stimulation, cells from patients with polyps already expressed many of the genes that the cytokines induced in cells from patients without polyps. Among these IL-4- and IL-13-responsive memory signatures were genes from a cell signaling pathway known as Wnt, which controls cell differentiation.
Immunologists have long known that B cells and T cells can store memory of an allergen that they have been exposed to, which partly explains why the immune system may overreact the next time the same allergen is encountered. However, the new finding suggests that basal cells also contribute a great deal to this memory.
Since basal cells are stem cells that generate the other cells found in the respiratory epithelium, this memory may influence their subsequent patterns of gene expression and ability to generate mature specialized epithelial cells. The team noted a substantial impact on the balance of cell types within the epithelium in patients with severe disease, leading to a population of cells with diminished diversity.
“Once you know that IL-4 and IL-13 act on stem cells, it changes the way in which you have to think about intervening, versus if they acted on differentiated cells, because you have to erase that memory in order to bring the system back to homeostasis,” Shalek says. “Otherwise you’re not actually dealing with a root cause of the problem.”
The findings show the importance of looking beyond immune cells for factors that influence chronic allergies, says Shruti Naik, an assistant professor of pathology, medicine, and dermatology at New York University School of Medicine.
“They examined the tissue as a whole rather than biasing the study toward one cell type or another, and what they found is that other components of the tissue are irreversibly impacted by inflammation,” says Naik, who was not involved in the research.
Blocking cytokines in humans
The findings suggested that ongoing efforts to block the effects of IL-4 and IL-13 might be a good way to try to treat chronic rhinosinusitis, a hypothesis that the researchers validated using an antibody that blocks a common receptor for these two cytokines. This antibody has been approved to treat eczema and is undergoing further testing for other uses. The researchers analyzed the gene expression of basal cells taken from one of the patients with polyps before and after he had been treated with this antibody. They found that most, but not all, of the genes that had been stimulated by IL-4 and IL-13 had returned to normal expression levels.
“It suggests that blockade of IL-4 and IL-13 can help to restore basal cells and secretory cells towards a healthier state,” Ordovas-Montanes says. “However, there’s still some residual genetic signature left. So now the question will be, how do you intelligently target that remainder?”
The researchers now plan to further detail the molecular mechanisms of how basal cells store inflammatory memory, which could help them to discover additional drug targets. They are also studying inflammatory diseases that affect other parts of the body, such as inflammatory bowel disease, where inflammation often leads to polyps that can become cancerous. Investigating whether stem cells in the gut might also remember immunological events, sustain disease, and play a role in tumor formation will be key to designing early interventions for inflammation-induced cancers.
The research was funded by the Searle Scholars Program, the Beckman Young Investigator Program, the Pew-Stewart Scholars Program, Sloan Fellowship Program, the Steven and Judy Kaye Young Innovators program, the Damon Runyon Cancer Research Foundation, the Bill and Melinda Gates Foundation, and the National Institutes of Health.
MIT researchers have taken a step toward solving a longstanding challenge with wireless communication: direct data transmission between underwater and airborne devices.
Today, underwater sensors cannot share data with those on land, because the two use different wireless signals that work only in their respective media. Radio signals that travel through air die out rapidly in water. Acoustic signals, or sonar, sent by underwater devices mostly reflect off the surface without ever breaking through. These limitations cause inefficiencies for a variety of applications, such as ocean exploration and submarine-to-plane communication.
In a paper being presented at this week’s SIGCOMM conference, MIT Media Lab researchers have designed a system that tackles this problem in a novel way. An underwater transmitter directs a sonar signal to the water’s surface, causing tiny vibrations that correspond to the 1s and 0s transmitted. Above the surface, a highly sensitive receiver reads these minute disturbances and decodes the sonar signal.
“Trying to cross the air-water boundary with wireless signals has been an obstacle. Our idea is to transform the obstacle itself into a medium through which to communicate,” says Fadel Adib, an assistant professor in the Media Lab, who is leading this research. He co-authored the paper with his graduate student Francesco Tonolini.
The system, called “translational acoustic-RF communication” (TARF), is still in its early stages, Adib says. But it represents a “milestone,” he says, that could open new capabilities in water-air communications. Using the system, military submarines, for instance, wouldn’t need to surface, and thereby reveal their location, to communicate with airplanes. And underwater drones that monitor marine life wouldn’t need to constantly resurface from deep dives to send data to researchers.
Another promising application is aiding searches for planes that go missing underwater. “Acoustic transmitting beacons can be implemented in, say, a plane’s black box,” Adib says. “If it transmits a signal every once in a while, you’d be able to use the system to pick up that signal.”
Today’s technological workarounds to this wireless communication issue suffer from various drawbacks. Buoys, for instance, have been designed to pick up sonar waves, process the data, and shoot radio signals to airborne receivers. But these can drift away and get lost. Many are also required to cover large areas, making the approach impractical for, say, submarine-to-surface communications.
TARF includes an underwater acoustic transmitter that sends sonar signals using a standard acoustic speaker. The signals travel as pressure waves of different frequencies corresponding to different data bits. For example, when the transmitter wants to send a 0, it can transmit a wave traveling at 100 hertz; for a 1, it can transmit a 200-hertz wave. When the signal hits the surface, it causes tiny ripples in the water, only a few micrometers in height, corresponding to those frequencies.
To achieve high data rates, the system transmits multiple frequencies at the same time, building on a modulation scheme used in wireless communication, called orthogonal frequency-division multiplexing. This lets the researchers transmit hundreds of bits at once.
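The encoding described above can be sketched in a few lines. This is a simplified illustration, not the paper's actual implementation; the subcarrier frequencies, spacing, symbol duration, and sample rate below are assumptions chosen for clarity.

```python
import numpy as np

def tarf_waveform(bits, base_hz=100.0, spacing_hz=50.0,
                  duration_s=0.1, sample_rate=8000):
    """Simplified TARF-style encoder: each bit position gets its own
    subcarrier frequency (base_hz + i * spacing_hz). A 1 turns that
    subcarrier on, a 0 leaves it off; summing the active subcarriers
    sends all bits in the same transmission window, as in OFDM."""
    t = np.arange(int(duration_s * sample_rate)) / sample_rate
    wave = np.zeros_like(t)
    for i, bit in enumerate(bits):
        if bit:
            wave += np.sin(2 * np.pi * (base_hz + i * spacing_hz) * t)
    return wave

# Send the eight bits 10110001 at once on subcarriers from 100 to 450 Hz.
wave = tarf_waveform([1, 0, 1, 1, 0, 0, 0, 1])
```

A receiver can recover the bits by checking which subcarrier frequencies are present in the spectrum of the received signal.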
Positioned in the air above the transmitter is a new type of extremely-high-frequency radar that processes signals in the millimeter wave spectrum of wireless transmission, between 30 and 300 gigahertz. (That’s the band where the upcoming high-frequency 5G wireless network will operate.)
The radar, which looks like a pair of cones, transmits a radio signal that reflects off the vibrating surface and rebounds back to the radar. Due to the way the signal collides with the surface vibrations, the signal returns with a slightly modulated angle that corresponds exactly to the data bit sent by the sonar signal. A vibration on the water surface representing a 0 bit, for instance, will cause the reflected signal’s angle to vibrate at 100 hertz.
“The radar reflection is going to vary a little bit whenever you have any form of displacement like on the surface of the water,” Adib says. “By picking up these tiny angle changes, we can pick up these variations that correspond to the sonar signal.”
Listening to “the whisper”
A key challenge was helping the radar detect the water surface. To do so, the researchers employed a technology that detects reflections in an environment and organizes them by distance and power. As water has the most powerful reflection in the new system’s environment, the radar knows the distance to the surface. Once that’s established, it zooms in on the vibrations at that distance, ignoring all other nearby disturbances.
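That selection step can be illustrated with a toy sketch; the distances and power values here are invented for illustration, not measurements from the system.

```python
def locate_surface(reflections):
    """Given radar reflections as (distance_m, power) pairs, assume
    the water surface is the strongest reflector in the scene and
    return its distance. Later processing would then examine
    vibrations only at that range, ignoring other disturbances."""
    distance, _ = max(reflections, key=lambda r: r[1])
    return distance

# Clutter at 0.10 m and 0.50 m, strong surface return at 0.30 m.
surface_range = locate_surface([(0.10, 2.0), (0.30, 9.5), (0.50, 1.1)])
```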
The next major challenge was capturing the micrometer-scale vibrations amid much larger, natural waves. The smallest ocean ripples on calm days, called capillary waves, are only about 2 centimeters tall, but that’s 100,000 times larger than the vibrations. Rougher seas can create waves 1 million times larger. “This interferes with the tiny acoustic vibrations at the water surface,” Adib says. “It’s as if someone’s screaming and you’re trying to hear someone whispering at the same time.”
To solve this, the researchers developed sophisticated signal-processing algorithms. Natural waves occur at about 1 or 2 hertz — that is, a wave or two moving over the signal area every second. The sonar vibrations of 100 to 200 hertz, however, are about a hundred times faster. Because of this frequency differential, the algorithm can zero in on the fast-moving vibrations while ignoring the slower natural waves.
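A minimal frequency-domain version of that idea is shown below. This is a toy high-pass filter, not the paper's actual algorithm; the cutoff, sample rate, and amplitudes are assumptions for illustration.

```python
import numpy as np

def suppress_slow_waves(samples, sample_rate, cutoff_hz=10.0):
    """Remove slow ocean-wave motion (roughly 1-2 Hz) while keeping
    the fast sonar-driven vibrations (100-200 Hz) by zeroing the
    low-frequency part of the spectrum."""
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), 1.0 / sample_rate)
    spectrum[freqs < cutoff_hz] = 0.0  # discard the slow swells
    return np.fft.irfft(spectrum, n=len(samples))

# A 1 Hz swell 10,000x larger than a 150 Hz vibration riding on it:
rate = 1000
t = np.arange(2 * rate) / rate
mixed = 1e4 * np.sin(2 * np.pi * 1 * t) + np.sin(2 * np.pi * 150 * t)
clean = suppress_slow_waves(mixed, rate)  # only the 150 Hz signal remains
```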
Testing the waters
The researchers took TARF through 500 test runs in a water tank and in two different swimming pools on MIT’s campus.
In the tank, the radar was placed at ranges from 20 centimeters to 40 centimeters above the surface, and the sonar transmitter was placed from 5 centimeters to 70 centimeters below the surface. In the pools, the radar was positioned about 30 centimeters above the surface, while the transmitter was immersed about 3.5 meters below. In these experiments, the researchers also had swimmers creating waves that rose to about 16 centimeters.
In both settings, TARF was able to accurately decode various data — such as the sentence, “Hello! from underwater” — at hundreds of bits per second, similar to standard data rates for underwater communications. “Even while there were swimmers swimming around and causing disturbances and water currents, we were able to decode these signals quickly and accurately,” Adib says.
In waves higher than 16 centimeters, however, the system isn’t able to decode signals. The next steps are, among other things, refining the system to work in rougher waters. “It can deal with calm days and deal with certain water disturbances. But [to make it practical] we need this to work on all days and all weathers,” Adib says.
The researchers also hope that their system could eventually enable an airborne drone or plane flying across a water’s surface to constantly pick up and decode the sonar signals as it zooms by.
The research was supported, in part, by the National Science Foundation.
As technology, trade, and globalization tie the world’s cultures and communities ever closer together, the responsibility of each to guarantee and protect the well-being of the others grows in step — and that goes for nations and corporations alike.
That was the message that Kofi Annan SF ’72, SM ’72, who served as the seventh secretary-general of the United Nations from 1997 to 2006, had for members of the MIT Sloan community in October of 2002, when he spoke to mark the 50th anniversary of his alma mater.
Annan, the first black African to hold the top U.N. post, died Saturday at the age of 80 after a short illness.
In the talk, Annan said his time as an MIT Sloan Fellow during the early part of his career, which he spent almost entirely with the U.N., broadened his perspective on how to achieve international change and cooperation.
“Sloan looked well beyond the confines of this campus, encouraged people from many nations to study here, and was eager to advance the cause of international cooperation, scholarly and otherwise,” Annan said.
That education would come in handy later on, he noted, as he helped the U.N. navigate some of its most challenging moments and found himself negotiating across from many of the world’s most powerful leaders.
Halfway through his tenure as secretary-general, Annan and the U.N. were jointly awarded the Nobel Peace Prize for their work toward a “better organized and more peaceful world,” which included containing the spread of HIV in Africa and opposing international terrorism.
But Annan also faced his fair share of challenging diplomatic situations. As the U.N.’s chief of peacekeeping, he oversaw the response to the Rwandan genocide of the mid-1990s, and later worked feverishly in an attempt to dissuade the United States from launching its 2003 invasion of Iraq. He told Time magazine in 2013 that his failure to prevent that action was “his darkest moment.”
Even after he left the U.N., he returned in various capacities, being tapped in 2012 to help find a resolution for the still-raging civil war in Syria. He also launched the Kofi Annan Foundation, a nonprofit that works to promote better global governance and world peace.
The challenges facing the world are much the same now as they were in 2002 — cultural distrust leading to violence, uncertainty in the markets raising global anxiety, and concerns that globalization is enriching a select few at the expense of the many. During his tenure, Annan’s emphasis on shared responsibility led to the formation of partnerships between the U.N., major corporations, and the world’s governments designed to ensure sustainable progress for all.
Annan, in the MIT Sloan speech, emphasized the importance of trust and understanding among the world’s governing institutions and highlighted the crucial role of global business in helping to solve those problems.
“Businesses may ask why they should go down this path, especially if it involves taking steps that competitors might not, or steps they feel are rightly the province of governments,” he said. “Sometimes, doing what is right … is in the immediate interest of business.”
Corporations, he said, should see it as their responsibility to use their resources to pass knowledge, technology, and training along to the communities in which they operate.
When German car manufacturer Volkswagen found that it was losing some of its best managers to HIV/AIDS in Brazil, Annan recounted, the company implemented an education and treatment program that enabled employees to survive and pass the same information on to their communities.
He continued: “Sometimes we must do what is right simply because not to do so would be wrong. And sometimes, we do what is right to help usher in a new day, of new norms and new behaviors. We do not want business to do anything different from their normal business; we want them to do their normal business differently.”
Absent that effort, he said, the world risks rejecting global citizenship and retreating into protectionism and isolation, to the detriment of all.
“All of us — the private sector, civil society, labor unions, NGOs, universities, foundations, and individuals — must come together in an alliance for progress,” Annan said. “Together, we can and must move from value to values, from shareholders to stakeholders, and from balance sheets to balanced development. Together, we can and must face the dangers ahead and bring solutions within reach.”
If you type the phrase “living wage” into Google’s search engine, the first result that appears — even before the Wikipedia entry — is MIT’s Living Wage Calculator (LWC). Created by Amy Glasmeier, a professor of economic geography and regional planning at the School of Architecture and Planning, the LWC calculates the baseline wage employees need to earn to support themselves in any county in the United States. The online tool factors in costs including food, housing, transportation, medical care, child care, and taxes.
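The calculation behind the tool is conceptually simple: sum a family's basic annual costs and divide by the hours available for full-time work. A minimal sketch of that idea, using invented placeholder numbers rather than actual LWC data or methodology:

```python
# Illustrative annual costs for a hypothetical household (placeholders,
# not figures from the Living Wage Calculator):
annual_costs = {
    "food": 3_000, "housing": 10_000, "transportation": 4_000,
    "medical care": 2_500, "child care": 6_000, "taxes": 3_500,
}
WORK_HOURS_PER_YEAR = 52 * 40  # one full-time job, 40 hours/week

# The living wage is the hourly rate that just covers those costs.
living_wage = sum(annual_costs.values()) / WORK_HOURS_PER_YEAR
```

In practice the real calculator aggregates county-level data for each cost category, which is where most of the difficulty lies.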
While Glasmeier has made minimal revisions to the LWC since she launched it in 2003 — in part to maintain its prominence in Google rankings — she has dramatically expanded its use, enlisting corporate, municipal, and civic partners who share her conviction and commitment to promoting a living wage. Some partners are ethically minded technology companies that lend expertise. Others are large nonprofits that use the LWC to help shape policy. Still others are corporations — including IKEA and Patagonia — that want to ensure that their employees can thrive on their salaries. A select few are partners that have contributed to the annual updating of the tool.
“There were several organizations that had been using our data,” explains Glasmeier, who built the LWC with the support of a Ford Foundation grant. “They all seemed to care deeply about the issue, so it was natural for me to invite them to be partners. It’s an interesting group of individuals and organizations who believe that corporations can build a living wage into their business models and still be successful.”
The U.S. was just emerging from a recession when Glasmeier began to work on the LWC. Rural areas had been hit particularly hard; residents of towns with one or two large employers lost their livelihoods when those employers failed.
“The LWC was never intended to be a high-wage or middle-class income-based tool,” explains Glasmeier. “It was created to show policymakers what happens when a big employer goes under. That knowledge encouraged them to formulate programs to ease the consequences of massive layoffs and plant closures.”
Today the idea of a living wage has gained momentum across a country where the gap between rich and poor widens from week to week. The LWC receives 100,000 hits per month. In 2014, it drew national attention when IKEA publicly adopted the tool to set wages in its U.S. facilities.
“They told me they chose the LWC because it was durable,” Glasmeier recalls. “And since it came from MIT, they figured it was calculated correctly.”
For IKEA, it was a natural fit.
“We believe if we want our co-workers to stay and grow with IKEA, we need to demonstrate our commitment to our co-workers through meaningful investments in compensation and benefits,” says Kenya Jacobs, IKEA's U.S. compensation manager. “The Living Wage Calculator has helped us estimate the appropriate wage needed to meet a person’s basic needs including food, housing, and transportation.”
Scott Woods is president of LWC partner West Arete, the Pennsylvania-based company that wrote the software running the LWC and took on the difficult task of aggregating multiple data sources on a county-by-county basis to make the calculator work. West Arete, which is certified as a B Corporation and is a member of 1% for the Planet (a nonprofit that directs corporate contributions toward environmental concerns), was drawn to the LWC by its methodology and its mission, Woods says.
“The project is meaningful because of the impact it can have,” says Woods, whose company writes software for higher education. “We saw thousands of IKEA workers get raises overnight — and workers at other companies as well. And it’s hard to imagine anyone doing a better or more accurate job at this than Amy has. I don’t know where you’d go for better data.”
Carey Anne Nadeau ’15, CEO and founder of LWC partner Open Data Nation, believes that good policy requires good data, and that the LWC provides that data regarding a living wage.
“Without an expertise in data, we can’t make precise assumptions about what can and cannot be done,” says Nadeau, who earned a master’s degree in city planning at MIT. Her company uses data collection and analysis to help municipalities and, more recently, insurance companies make evidence-based proactive decisions. “With this tool we can fill the void, and effectively advocate for folks who have to work more hours than there are in a day and still can’t pay their bills.”
Early this year, Patagonia entered into a partnership with LWC. The outdoor and recreation company had been using LWC data for several years, and Glasmeier had visited corporate headquarters to speak with its human resource department about the labor market.
“Most companies in our position want to pay a competitive market wage,” says Jasin Nazim, a senior compensation analyst at Patagonia. “They base that wage on an industry survey. But it’s much more complicated to get data for a living wage in a company with employees in multiple cities. With the MIT Living Wage Calculator, I know I’m being as fair to my employees in Austin, Texas, as I am to those in San Francisco. And my employees can trust the numbers, because they’re coming from a third party, and that third party is MIT.”
At Just Capital, a New York–based nonprofit that advises companies and investors on important social issues, the LWC fell right in its wheelhouse.
“The living wage was one of the most important issues that emerged from our most recent survey,” says Andrew Stevenson, managing director for thematic research at Just Capital, whose board of directors includes Arianna Huffington and Deepak Chopra. “And Dr. Glasmeier’s work is well known and respected. We approached her to help draft policy for public service employees. It is a great way for us to partner with MIT on something that truly matters.”
For some LWC partners, joining a community of advocates was just as important as sharing critical data.
“We decided to partner with LWC because of the quality of the brand,” says Maurice Jones, president and CEO of the Local Initiatives Support Corporation (LISC), a nonprofit community development institution active in dozens of urban and rural areas. “Both MIT and LWC are a guarantee of quality. In addition, this partnership puts us in touch with a network of people who are committed to paying their employees a living wage. It’s a movement that is expanding, and that we want to be part of.”
The LWC has enabled MIT to develop robust and vibrant relationships with LWC partners. “IKEA and MIT understand that both the employer and employees win when businesses choose to pay their employees a living wage,” says IKEA’s Jacobs. “In addition, MIT provides research on IKEA’s behalf that is unique to our business model. This has enabled IKEA to maintain a sustainable living wage model.”
But there is still much work to be done. As of 2017, a typical family of four (two working adults, two children) needs the equivalent of nearly four full-time minimum-wage jobs (a 76-hour work week per working adult) to earn a living wage. Single parents need to work almost twice as many hours as each adult in a two-earner family to earn the living wage. A single mother with two children earning the federal minimum wage of $7.25 per hour would need to work 138 hours per week, nearly the equivalent of working 24 hours per day for six days, to earn a living wage.
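Those figures follow from straightforward arithmetic and can be checked directly; the hours are the article's, and the weekly-earnings line simply multiplies them out.

```python
FEDERAL_MIN_WAGE = 7.25   # dollars per hour (2017 federal minimum)
FULL_TIME_HOURS = 40      # hours per week in one full-time job

# Figures from the article:
single_parent_hours = 138  # hours/week for a single mother of two
two_adult_hours_each = 76  # hours/week per adult, family of four

# Hours translate into equivalent full-time minimum-wage jobs:
family_jobs = 2 * two_adult_hours_each / FULL_TIME_HOURS  # 3.8 jobs
single_days = single_parent_hours / 24                    # 5.75 days

# Implied weekly earnings for the single parent at minimum wage:
weekly_earnings = FEDERAL_MIN_WAGE * single_parent_hours  # $1000.50
```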
“It’s no longer a moral issue,” says Glasmeier. “More and more people are understanding that the current system just isn’t going to work. We need to allow people to live and work without going into debt. And we need to educate companies to think of a living wage as a fixed cost, and to economize on other budget items. Once they understand they can pay their employees a living wage and still be profitable, they’ll want to do it.”
At entrepreneurial cauldrons like MIT and other likeminded schools, the developing world is increasingly viewed as an enticing business opportunity, one where principled innovators can not only turn a profit but also solve daunting social challenges, create good jobs, and accelerate the local economy. This growing appetite for profit-paired-with-impact is reflected in the rising level of support top schools are giving to students who wish to explore, or outright pursue, frontier market opportunities.
At the Institute, such students often find their academic home within a home at the Legatum Center for Development and Entrepreneurship, which operates on the belief that entrepreneurs and their market-driven solutions are critical to advancing global prosperity. The Center is celebrating a decade of progress it has made in that regard — as captured in its recently released report, "The Legatum Fellowship: 10 Years of Impact" — while also welcoming its newest cohort of fellows.
“Our 10-year Impact Report illustrates just how powerful a force for economic and social change entrepreneurs can be,” says Megan Mitchell, director of fellowship and student programs. “That said, what is most important to us is cultivating the next generation of change agents, and our new group of fellows shows enormous potential.”
While the Legatum Center offers a range of programs for students, the fellowship is reserved for those most committed to building and scaling ventures in the developing world. Besides tuition, travel, and prototyping support, Legatum Fellows receive access to mentors and advisors, a targeted for-credit curriculum, and the peer support of an incubator-like community.
Meghan McCormick, a recent MIT Sloan graduate, valued the community aspect of her Legatum Fellowship most of all.
“I worked hand-in-hand with students from eight different MIT departments and five different continents,” McCormick says. “Each of them contributed something that is now a part of my venture’s DNA.”
This year’s cohort of 24 Legatum Fellows was selected from over 80 applicants across all MIT schools. They are implementing ventures in four Latin American countries, six African countries, and five countries in South and Southeast Asia. Their industries include health care, education, professional services, real estate, IT-telecom, energy, legal services, and agriculture.
Elisa Mansur, for instance, is building a network of home-based daycare centers to deliver early childhood education to low-income neighborhoods in Brazil. Christian Ulstrup’s venture seeks to reduce health care disparities in Cambodia by empowering endoscopists with real-time lesion detection tools. The new cohort also includes several students returning for their second year of the fellowship, such as Prosper Nyovanie, whose company makes solar power more accessible to low-income households in Zimbabwe, and Juliet Wanyiri, whose venture empowers local innovators through design workshops in Kenya.
The new fellows will join a growing family of Legatum-bred entrepreneurs that stretches back more than a decade. To illustrate more broadly the benefit of supporting early-stage entrepreneurs within a robust ecosystem like MIT, the Legatum Center gathered as much information as possible on its alumni's professional activity and analyzed it in its 10-year Impact Report.
With support from the Legatum Group, the Mastercard Foundation, and HRH Princess Moudi Bint Khalid, the Legatum Center distributed over $7 million in funding to 213 Fellows between 2007 and 2017. Alumni with active ventures in frontier markets have since reportedly raised a total of $79 million in outside funding and created 14,700 jobs. They have also impacted 600,000 consumers, 300,000 farmers, 230,000 patients, and 37,000 fellow business owners.
The report also features vignettes on some of the Center’s most impactful alumni. David Auerbach and Ani Vallabhaneni, for instance, are two cofounders of Sanergy, a company that’s making quality sanitation affordable in the slums of Nairobi, Kenya, by collecting and converting the waste into valuable products like fertilizer and insect-based animal feed. They employ 250 people directly at the company and work with over 1,000 more as franchise toilet operators. Sanergy serves 60,000 customers per day and is growing by 80-100 toilets each month.
Another alumna is Fernanda de Velasco, who, with her cofounders, established Mexico’s first equity-based crowdfunding site. Called Play Business, it has already helped fund over 100 startups, helped create over 1,500 new jobs, and is growing by 1,300 new users per month. Fernanda and her team also worked proactively with the Mexican federal government for two years to draft and pass the country’s first comprehensive fintech legislation.
Aukrit Unahalekhaka cofounded Ricult, which empowers over 1,000 smallholder farmers in Thailand and Pakistan through a platform that bridges credit, information, and access gaps. Ricult is backed by the Gates Foundation and recently raised $1.85 million in seed funding.
Bilikiss Adebiyi-Abiola, meanwhile, launched an innovative recycling business in Lagos, Nigeria, which employs over 100 people and has processed more than 3,000 tons of recycling that would otherwise sit in trash heaps. Bilikiss recently handed the reins of the venture over to her COO (who is also her brother) in order to accept a government appointment as general manager for the Parks and Gardens Agency for Lagos State, a role that allows her to continue in her lifelong mission to clean up Lagos by leading citywide initiatives and shaping policy.
Mitchell says it’s “always a joy” to have star alumni come back for networking events or to present in class.
“There’s so much the current fellows can learn from them,” she says, “but it’s also powerful for fellows just to look at these alumni and envision where they themselves could be in a few short years, already making an impact of their own.”
The School of Science recently announced the winners of its 2018 Teaching Prizes for Graduate and Undergraduate Education. The prizes are awarded annually to School of Science faculty members who demonstrate excellence in teaching. Winners are chosen from nominations by students and colleagues.
Ankur Moitra, the Rockwell International Career Development Associate Professor in the Department of Mathematics, was awarded the prize for graduate education, for course 18.S996/18.409 (Algorithmic Aspects of Machine Learning), which he designed. Notes from this class have been turned into a monograph, which has already been used in courses across the country. Nominators said Moitra distinguishes himself as an inspirational, caring, and captivating teacher.
Paul O'Gorman, an associate professor in the Department of Earth, Atmospheric and Planetary Sciences, was also awarded the prize for graduate education, for his teaching of 12.810 (Atmospheric Dynamics). Nominators noted that his class was well-organized with clear expectations set, and they also lauded his humorous, engaging, and passionate teaching style.
Kerstin Perez, an assistant professor in the Department of Physics, was awarded the prize for undergraduate education, for her outstanding mentoring of undergraduate students, specifically women and students of color. In addition, nominators noted Perez consistently receives top marks for her classes, 8.13 (Experimental Physics) and 8.02 (Electricity and Magnetism).
William Minicozzi, the Singer Professor of Mathematics, was also awarded the prize for undergraduate education, for his teaching of 18.02 (Multivariable Calculus), one of the General Institute Requirement subjects. Students consistently praise his clarity, ability to engage the class, and sense of humor. Nominators also note Minicozzi's ability to treat difficult topics at an appropriate pace in his upper-level undergraduate courses.
The School of Science welcomes Teaching Prize nominations for its faculty during the spring semester each academic year.
The human body produces many antimicrobial peptides that help the immune system fend off infection. Scientists hoping to harness these peptides as potential antibiotics have now discovered that other peptides in the human body can also have potent antimicrobial effects, expanding the pool of new antibiotic candidates.
In the new study, researchers from MIT and the University of Naples Federico II found that fragments of the protein pepsinogen, an enzyme used to digest food in the stomach, can kill bacteria such as Salmonella and E. coli.
The researchers believe that by modifying these peptides to enhance their antimicrobial activity, they may be able to develop synthetic peptides that could be used as antibiotics against drug-resistant bacteria.
“These peptides really constitute a great template for engineering. The idea now is to use synthetic biology to modify them further and make them more potent,” says Cesar de la Fuente-Nunez, an MIT postdoc and Areces Foundation Fellow, and one of the senior authors of the paper.
Other MIT authors of the paper, which appears in the Aug. 20 issue of the journal ACS Synthetic Biology, are Timothy Lu, an associate professor of electrical engineering and computer science and of biological engineering, and Marcelo Der Torossian Torres, a former visiting student.
Discovering new functions
Antimicrobial peptides, which are found in nearly all living organisms, can kill many microbes, but they are typically not powerful enough to act as antibiotic drugs on their own. Many scientists, including de la Fuente-Nunez and Lu, have been exploring ways to create more potent versions of these peptides, in hopes of finding new weapons to combat the growing problem posed by antibiotic-resistant bacteria.
In this study, the researchers wanted to explore whether other proteins found in the human body, outside of the previously known antimicrobial peptides, might also be able to kill bacteria. To that end, they developed a search algorithm that analyzes databases of human protein sequences in search of similarities to known antimicrobial peptides.
“It’s a data-mining approach to very easily find peptides that were previously unexplored,” de la Fuente-Nunez says. “We have patterns that we know are associated with classical antimicrobial peptides, and the search engine goes through the database and finds patterns that look similar to what we know makes up a peptide that kills bacteria.”
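The gist of such a motif-based screen can be sketched in a few lines. The patterns and sequences below are made-up illustrations of the general idea (cationic clusters and hydrophobic stretches are common features of antimicrobial peptides), not the team's actual algorithm:

```python
import re

# Hypothetical motifs loosely inspired by features of known antimicrobial
# peptides. These patterns are illustrative assumptions, not the study's model.
MOTIFS = [
    re.compile(r"[KR]{2,}"),       # cluster of cationic residues (Lys/Arg)
    re.compile(r"[AILMFWV]{4,}"),  # stretch of hydrophobic residues
]

def antimicrobial_score(sequence: str) -> int:
    """Count how many illustrative motifs appear in a protein fragment."""
    return sum(1 for motif in MOTIFS if motif.search(sequence))

def screen(fragments: dict[str, str], threshold: int = 2) -> list[str]:
    """Return the names of fragments whose score meets the threshold."""
    return [name for name, seq in fragments.items()
            if antimicrobial_score(seq) >= threshold]

# Toy fragments (invented sequences, for illustration only):
fragments = {
    "candidate_a": "GKKRLAILVMKA",  # cationic + hydrophobic -> flagged
    "candidate_b": "GSDEQNTSGSDE",  # polar/acidic -> not flagged
}
print(screen(fragments))  # ['candidate_a']
```

A real screen would use statistically derived patterns and score thousands of sequences, but the shape of the computation is the same: known patterns in, previously unexplored candidate peptides out.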
In a screen of nearly 2,000 human proteins, the algorithm identified about 800 with possible antimicrobial activity. In the ACS Synthetic Biology paper, the research team focused on pepsinogen, the inactive precursor of an enzyme that breaks down proteins in food. After pepsinogen is secreted by cells that line the stomach, hydrochloric acid in the stomach converts it into pepsin A, which digests proteins, and into several other small fragments.
Those fragments, which previously had no known functions, showed up as candidates in the antimicrobial screen.
Once the researchers identified those candidates, they tested them against bacteria grown in lab dishes and found that they could kill a variety of microbes, including foodborne pathogens, such as Salmonella and E. coli, as well as others, including Pseudomonas aeruginosa, which often infects the lungs of cystic fibrosis patients. This effect was seen at both acidic pH, similar to that of the stomach, and neutral pH.
“The human stomach is attacked by many pathogenic bacteria, so it makes sense that we would have a host defense mechanism to defend ourselves from such attacks,” de la Fuente-Nunez says.
More potent drugs
The researchers also tested the three pepsinogen fragments against a Pseudomonas aeruginosa skin infection in mice, and found that the peptides significantly reduced the infections. The exact mechanism by which the peptides kill bacteria is unknown, but the researchers’ hypothesis is that their positive charges allow the peptides to bind to the negatively charged bacterial membranes and poke holes in them, a mechanism similar to that of other antimicrobial peptides.
The researchers now hope to modify these peptides to make them more effective, so that they could be potentially used as antibiotics. They are also seeking new peptides from organisms other than humans, and they plan to further investigate some of the other human peptides identified by the algorithm.
“We have an atlas of all these molecules, and the next step is to demonstrate whether each of them actually has antimicrobial properties and whether each of them could be developed as a new antimicrobial,” de la Fuente-Nunez says.
A new study provides fresh evidence that a decline in the capacity of brain cells to change (called “plasticity”), rather than a decline in total cell number, may underlie some of the sensory and cognitive declines associated with normal brain aging. Scientists at MIT’s Picower Institute for Learning and Memory show that inhibitory interneurons in the visual cortex of mice remain just as abundant during aging, but their arbors become simplified and much less structurally dynamic and flexible.
In their experiments, published online in the Journal of Neuroscience, they also show that they could restore a significant degree of lost plasticity to the cells by treating mice with the commonly used antidepressant medication fluoxetine, also known as Prozac.
“Despite common belief, loss of neurons due to cell death is quite limited during normal aging and unlikely to account for age-related functional impairments,” write the scientists, including lead author Ronen Eavri, a postdoc at the Picower Institute, and corresponding author Elly Nedivi, a professor of biology and brain and cognitive sciences. “Rather it seems that structural alterations in neuronal morphology and synaptic connections are features most consistently correlated with brain age, and may be considered as the potential physical basis for the age-related decline.”
Nedivi and co-author Mark Bear, the Picower Professor of Neuroscience, are affiliated with MIT’s Aging Brain Initiative, a multidisciplinary effort to understand how aging affects the brain and sometimes makes the brain vulnerable to disease and decline.
In the study, the researchers focused on the aging of inhibitory interneurons, which is less well understood than that of excitatory neurons but potentially more crucial to plasticity. Plasticity, in turn, is key to enabling learning and memory and to maintaining sensory acuity. While the researchers focused on the visual cortex, the plasticity they measured is believed to be important elsewhere in the brain as well.
The team counted and chronically tracked the structure of inhibitory interneurons in dozens of mice aged to 3, 6, 9, 12 and 18 months. (Mice are mature by 3 months and live for about 2 years, and 18-month-old mice are already considered quite old.) In previous work, Nedivi’s lab has shown that inhibitory interneurons retain the ability to dynamically remodel into adulthood. But in the new paper, the team shows that new growth and plasticity reaches a limit and progressively declines starting at about 6 months.
But the study also shows that as mice age there is no significant change in the number or variety of inhibitory cells in the brain.
Retraction and inflexibility with age
Instead the changes the team observed were in the growth and performance of the interneurons. For example, under the two-photon microscope the team tracked the growth of dendrites, which are the tree-like structures on which a neuron receives input from other neurons. At 3 months of age mice showed a balance of growth and retraction, consistent with dynamic remodeling. But between 3 and 18 months they saw that dendrites progressively simplified, exhibiting fewer branches, suggesting that new growth was rare while retraction was common.
In addition, they saw a precipitous drop in an index of dynamism. At 3 months virtually all interneurons were above a crucial index value of 0.35, but by 6 months only half were, by 9 months barely any were, and by 18 months none were.
Bear’s lab tested a specific form of plasticity that underlies visual recognition memory in the visual cortex, where neurons respond more potently to stimuli they were exposed to previously. Their measurements showed that in 3-month-old mice “stimulus-selective response potentiation” (SRP) was indeed robust, but its decline went hand in hand with the decline in structural plasticity: It was significantly lessened by 6 months and barely evident by 9 months.
Fountain of fluoxetine
While the decline of dynamic remodeling and plasticity appeared to be natural consequences of aging, they were not immutable, the researchers showed. In prior work Nedivi’s lab had shown that fluoxetine promotes interneuron branch remodeling in young mice, so they decided to see whether it could do so for older mice and restore plasticity as well.
To test this, they put the drug in the drinking water of mice of various ages for various amounts of time. Three-month-old mice treated for three months showed little change in dendrite growth compared to untreated controls, while 25 percent of cells in 6-month-old mice treated for three months showed significant new growth (at the age of 9 months). Among 3-month-old mice treated for six months, however, 67 percent of cells showed new growth by the age of 9 months, showing that treatment starting early and lasting six months had the strongest effect.
The researchers also saw similar effects on SRP. Here, too, the effects ran parallel to the structural plasticity decline. Treating mice for just three months did not restore SRP, but treating mice for six months did so significantly.
“Here we show that fluoxetine can also ameliorate the age-related decline in structural and functional plasticity of visual cortex neurons,” the researchers write. The study, they noted, adds to prior research in humans showing a potential cognitive benefit for the drug.
“Our finding that fluoxetine treatment in aging mice can attenuate the concurrent age-related declines in interneuron structural and visual cortex functional plasticity suggests it could provide an important therapeutic approach towards mitigation of sensory and cognitive deficits associated with aging, provided it is initiated before severe network deterioration,” they continued.
In addition to Eavri, Nedivi and Bear, the paper’s other authors are Jason Shepherd, Christina Welsh, and Genevieve Flanders.
The National Institutes of Health, the American Federation for Aging Research, the Ellison Medical Foundation, and the Machiah Foundation supported the research.
Investigating inside the human body often requires cutting open a patient or swallowing long tubes with built-in cameras. But what if physicians could get a better glimpse in a manner that is less expensive, less invasive, and less time-consuming?
A team from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) led by Professor Dina Katabi is working on doing exactly that with an “in-body GPS" system dubbed ReMix. The new method can pinpoint the location of ingestible implants inside the body using low-power wireless signals. These implants could be used as tiny tracking devices on shifting tumors to help monitor their slight movements.
In animal tests, the team demonstrated that they can track the implants with centimeter-level accuracy. The team says that, one day, similar implants could be used to deliver drugs to specific regions in the body.
ReMix was developed in collaboration with researchers from Massachusetts General Hospital (MGH). The team describes the system in a paper that's being presented at this week's Association for Computing Machinery's Special Interest Group on Data Communications (SIGCOMM) conference in Budapest, Hungary.
Tracking inside the body
To test ReMix, Katabi’s group first implanted a small marker in animal tissues. To track its movement, the researchers used a wireless device that reflects radio signals off the patient. This was based on a wireless technology that the researchers previously demonstrated to detect heart rate, breathing, and movement. A special algorithm then uses that signal to pinpoint the exact location of the marker.
Interestingly, the marker inside the body does not need to transmit any wireless signal. It simply reflects the signal transmitted by the wireless device outside the body. Therefore, it doesn't need a battery or any other external source of energy.
A key challenge in using wireless signals in this way is the many competing reflections that bounce off a person's body. In fact, the signals that reflect off a person’s skin are actually 100 million times more powerful than the signals of the metal marker itself.
To overcome this, the team designed an approach that essentially separates the interfering skin signals from the ones they're trying to measure. They did this using a small semiconductor device, called a “diode,” that mixes signals together so the team can then filter out the skin-related signals. For example, if the skin reflects at frequencies of F1 and F2, the diode creates new combinations of those frequencies, such as F1-F2 and F1+F2. When all of the signals reflect back to the system, the system only picks up the combined frequencies, filtering out the original frequencies that came from the patient’s skin.
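The frequency arithmetic behind this trick can be illustrated numerically. The sketch below is an assumption-level toy model, not the ReMix hardware: it squares two "skin" tones to mimic a diode's nonlinearity and shows the new sum and difference frequencies appearing in the spectrum.

```python
import numpy as np

# Two tones standing in for reflections at frequencies f1 and f2.
fs = 10_000                      # sample rate, Hz
t = np.arange(fs) / fs           # 1 second of samples
f1, f2 = 700, 500                # illustrative "skin" frequencies, Hz
tones = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)

# A square-law nonlinearity is a simple stand-in for the diode's mixing.
mixed = tones ** 2

# Locate the dominant spectral components of the mixed signal.
spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(mixed.size, 1 / fs)
peaks_hz = sorted(int(round(f)) for f in freqs[spectrum > 0.1 * spectrum.max()])
print(peaks_hz)  # [0, 200, 1000, 1200, 1400]
```

The mixed signal contains the new combinations f1 - f2 = 200 Hz and f1 + f2 = 1200 Hz (plus harmonics at 1000 and 1400 Hz), while the original 500 and 700 Hz tones are absent, which is what lets a filter separate the diode's products from the direct skin reflections.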
One potential application for ReMix is in proton therapy, a type of cancer treatment that involves bombarding tumors with beams of magnet-controlled protons. The approach allows doctors to prescribe higher doses of radiation, but requires a very high degree of precision, which means that it’s usually limited to only certain cancers.
Its success hinges on something that's actually quite unreliable: a tumor staying exactly where it is during the radiation process. If a tumor moves, then healthy areas could be exposed to the radiation. But with a small marker like ReMix’s, doctors could better determine the location of a tumor in real time and either pause the treatment or steer the beam into the right position. (To be clear, ReMix is not yet accurate enough to be used in clinical settings. Katabi says a margin of error closer to a couple of millimeters would be necessary for actual implementation.)
"The ability to continuously sense inside the human body has largely been a distant dream," says Romit Roy Choudhury, a professor of electrical engineering and computer science at the University of Illinois, who was not involved in the research. "One of the roadblocks has been wireless communication to a device and its continuous localization. ReMix makes a leap in this direction by showing that the wireless component of implantable devices may no longer be the bottleneck."
There are still many ongoing challenges for improving ReMix. The team next hopes to combine the wireless data with medical data, such as that from magnetic resonance imaging (MRI) scans, to further improve the system’s accuracy. In addition, the team will continue to reassess the algorithm and the various tradeoffs needed to account for the complexity of different bodies.
"We want a model that's technically feasible, while still complex enough to accurately represent the human body," says MIT PhD student Deepak Vasisht, lead author on the new paper. "If we want to use this technology on actual cancer patients one day, it will have to come from better modeling a person's physical structure."
The researchers say that such systems could help enable more widespread adoption of proton therapy centers. Today, there are only about 100 centers globally.
"One reason that [proton therapy] is so expensive is because of the cost of installing the hardware," Vasisht says. "If these systems can encourage more applications of the technology, there will be more demand, which will mean more therapy centers, and lower prices for patients."
Katabi and Vasisht co-wrote the paper with MIT PhD student Guo Zhang, University of Waterloo professor Omid Abari, MGH physicist Hsiao-Ming Lu, and MGH technical director Jacob Flanz.
Last year, physicists at MIT, the University of Vienna, and elsewhere provided strong support for quantum entanglement, the seemingly far-out idea that two particles, no matter how distant from each other in space and time, can be inextricably linked, in a way that defies the rules of classical physics.
Take, for instance, two particles sitting on opposite edges of the universe. If they are truly entangled, then according to the theory of quantum mechanics their physical properties should be related in such a way that any measurement made on one particle should instantly convey information about any future measurement outcome of the other particle — correlations that Einstein skeptically saw as “spooky action at a distance.”
In the 1960s, the physicist John Bell calculated a theoretical limit beyond which such correlations must have a quantum, rather than a classical, explanation.
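For background, the limit is often stated in the CHSH form of Bell's inequality (a standard variant shown here for illustration; the exact statistic used in any particular test may differ), where E(a,b) denotes the measured correlation for detector settings a and b:

```latex
% CHSH statistic built from correlations at two settings per side:
S \;\equiv\; E(a,b) - E(a,b') + E(a',b) + E(a',b')
% Any local hidden-variable (classical) model satisfies
\lvert S \rvert \le 2,
% while quantum mechanics permits violations up to Tsirelson's bound:
\lvert S \rvert \le 2\sqrt{2} \approx 2.83
```

Measuring a value of S above 2 is therefore the signature that no classical, locally causal account can reproduce the observed correlations.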
But what if such correlations were the result not of quantum entanglement, but of some other hidden, classical explanation? Such “what-ifs” are known to physicists as loopholes to tests of Bell’s inequality, the most stubborn of which is the “freedom-of-choice” loophole: the possibility that some hidden, classical variable may influence the measurement that an experimenter chooses to perform on an entangled particle, making the outcome look quantumly correlated when in fact it isn’t.
Last February, the MIT team and their colleagues significantly constrained the freedom-of-choice loophole, by using 600-year-old starlight to decide what properties of two entangled photons to measure. Their experiment proved that, if a classical mechanism caused the correlations they observed, it would have to have been set in motion more than 600 years ago, before the stars’ light was first emitted and long before the actual experiment was even conceived.
Now, in a paper published today in Physical Review Letters, the same team has vastly extended the case for quantum entanglement and further restricted the options for the freedom-of-choice loophole. The researchers used distant quasars, one of which emitted its light 7.8 billion years ago and the other 12.2 billion years ago, to determine the measurements to be made on pairs of entangled photons. They found correlations among more than 30,000 pairs of photons, to a degree that far exceeded the limit that Bell originally calculated for a classically based mechanism.
“If some conspiracy is happening to simulate quantum mechanics by a mechanism that is actually classical, that mechanism would have had to begin its operations — somehow knowing exactly when, where, and how this experiment was going to be done — at least 7.8 billion years ago. That seems incredibly implausible, so we have very strong evidence that quantum mechanics is the right explanation,” says co-author Alan Guth, the Victor F. Weisskopf Professor of Physics at MIT.
“The Earth is about 4.5 billion years old, so any alternative mechanism — different from quantum mechanics — that might have produced our results by exploiting this loophole would’ve had to be in place long before there was even a planet Earth, let alone an MIT,” adds David Kaiser, the Germeshausen Professor of the History of Science and professor of physics at MIT. “So we’ve pushed any alternative explanations back to very early in cosmic history.”
Guth and Kaiser’s co-authors include Anton Zeilinger and members of his group at the Austrian Academy of Sciences and the University of Vienna, as well as physicists at Harvey Mudd College and the University of California at San Diego.
A decision, made billions of years ago
In 2014, Kaiser and two members of the current team, Jason Gallicchio and Andrew Friedman, proposed an experiment to produce entangled photons on Earth — a process that is fairly standard in studies of quantum mechanics. They planned to shoot each member of the entangled pair in opposite directions, toward light detectors that would also make a measurement of each photon using a polarizer. Researchers would measure the polarization, or orientation, of each incoming photon’s electric field, by setting the polarizer at various angles and observing whether the photons passed through — an outcome for each photon that researchers could compare to determine whether the particles showed the hallmark correlations predicted by quantum mechanics.
The team added a unique step to the proposed experiment, which was to use light from ancient, distant astronomical sources, such as stars and quasars, to determine the angle at which to set each respective polarizer. As each entangled photon was in flight, heading toward its detector at the speed of light, researchers would use a telescope located at each detector site to measure the wavelength of a quasar’s incoming light. If that light was redder than some reference wavelength, the polarizer would tilt at a certain angle to make a specific measurement of the incoming entangled photon — a measurement choice that was determined by the quasar. If the quasar’s light was bluer than the reference wavelength, the polarizer would tilt at a different angle, performing a different measurement of the entangled photon.
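The color-based setting choice described above can be sketched in a few lines. The reference wavelength and polarizer angles below are illustrative placeholders, not the experiment's actual values:

```python
# Hedged sketch of the quasar-driven setting choice. In the real apparatus
# this decision and the polarizer update happen within microseconds, while
# the entangled photon is still in flight.
REFERENCE_NM = 700.0                 # assumed dividing wavelength, nm
ANGLE_RED, ANGLE_BLUE = 22.5, 67.5   # example polarizer settings, degrees

def polarizer_angle(quasar_photon_nm: float) -> float:
    """Choose a measurement setting from one quasar photon's color."""
    if quasar_photon_nm > REFERENCE_NM:
        return ANGLE_RED     # redder than reference -> first setting
    return ANGLE_BLUE        # bluer than reference -> second setting

print(polarizer_angle(750.0))  # 22.5
print(polarizer_angle(450.0))  # 67.5
```

The point of the design is that the "random" bit steering each polarizer was fixed by light emitted billions of years ago, far outside any conceivable classical influence on the experiment.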
In their previous experiment, the team used small backyard telescopes to measure the light from stars as close as 600 light years away. In their new study, the researchers used much larger, more powerful telescopes to catch the incoming light from even more ancient, distant astrophysical sources: quasars whose light has been traveling toward the Earth for at least 7.8 billion years — objects that are incredibly far away and yet are so luminous that their light can be observed from Earth.
On Jan. 11, 2018, “the clock had just ticked past midnight local time,” as Kaiser recalls, when about a dozen members of the team gathered on a mountaintop in the Canary Islands and began collecting data from two large, 4-meter-wide telescopes: the William Herschel Telescope and the Telescopio Nazionale Galileo, both situated on the same mountain and separated by about a kilometer.
One telescope focused on a particular quasar, while the other telescope looked at another quasar in a different patch of the night sky. Meanwhile, researchers at a station located between the two telescopes created pairs of entangled photons and beamed particles from each pair in opposite directions toward each telescope.
In the fraction of a second before each entangled photon reached its detector, the instrumentation determined whether a single photon arriving from the quasar was more red or blue, a measurement that then automatically adjusted the angle of a polarizer that ultimately received and detected the incoming entangled photon.
“The timing is very tricky,” Kaiser says. “Everything has to happen within very tight windows, updating every microsecond or so.”
Demystifying a mirage
The researchers ran their experiment twice, each for around 15 minutes and with two different pairs of quasars. For each run, they measured 17,663 and 12,420 pairs of entangled photons, respectively. Within hours of closing the telescope domes and looking through preliminary data, the team could tell there were strong correlations among the photon pairs, beyond the limit that Bell calculated, indicating that the photons were correlated in a quantum-mechanical manner.
Guth led a more detailed analysis to calculate the chance, however slight, that a classical mechanism might have produced the correlations the team observed.
He calculated that, for the best of the two runs, the probability that a mechanism based on classical physics could have achieved the observed correlation was about 10 to the minus 20 — that is, about one part in one hundred billion billion, “outrageously small,” Guth says. For comparison, researchers have estimated the probability that the discovery of the Higgs boson was just a chance fluke to be about one in a billion.
“We certainly made it unbelievably implausible that a local realistic theory could be underlying the physics of the universe,” Guth says.
And yet, there is still a small opening for the freedom-of-choice loophole. To limit it even further, the team is entertaining ideas of looking even further back in time, to use sources such as cosmic microwave background photons that were emitted as leftover radiation immediately following the Big Bang, though such experiments would present a host of new technical challenges.
“It is fun to think about new types of experiments we can design in the future, but for now, we are very pleased that we were able to address this particular loophole so dramatically. Our experiment with quasars puts extremely tight constraints on various alternatives to quantum mechanics. As strange as quantum mechanics may seem, it continues to match every experimental test we can devise,” Kaiser says.
This research was supported in part by the Austrian Academy of Sciences, the Austrian Science Fund, the U.S. National Science Foundation, and the U.S. Department of Energy.
In the second grade, Kelsey Moore became acquainted with geologic time. Her teachers instructed the class to unroll a giant strip of felt down a long hallway in the school. Most of the felt was solid black, but at the very end, the students caught a glimpse of red.
That tiny red strip represented the time on Earth in which humans have lived, the teachers said. The lesson sparked Moore’s curiosity. What happened on Earth before there were humans? How could she find out?
A little over a decade later, Moore enrolled in her first geoscience class at Smith College and discovered she now had the tools to begin to answer those very questions.
Moore zeroed in on geobiology, the study of how the physical Earth and biosphere interact. During the first semester of her sophomore year of college, she took a class that she says “totally blew my mind.”
“I knew I wanted to learn about Earth history. But then I took this invertebrate paleontology class and realized how much we can learn about life and how life has evolved,” Moore says. A few lectures into the semester, she mustered the courage to ask her professor, Sara Pruss in Smith’s Department of Geosciences, for a research position in the lab.
Now a fourth-year graduate student at MIT, Moore works in the geobiology lab of Associate Professor Tanja Bosak in MIT’s Department of Earth, Atmospheric, and Planetary Sciences. In addition to carrying out her own research, Moore, who is also a Graduate Resident Tutor in the Simmons Hall undergraduate dorm, makes it a priority to help guide the lab’s undergraduate researchers and teach them the techniques they need to know.
“We have a natural curiosity about how we got here, and how the Earth became what it is. There’s so much unknown about the early biosphere on Earth when you go back 2 billion, 3 billion, 4 billion years,” Moore says.
Moore studies early life on Earth by focusing on ancient microbes from the Proterozoic, the period of Earth’s history that spans 2.5 billion to 542 million years ago — between the time when oxygen began to appear in the atmosphere up until the advent and proliferation of complex life. Early in her graduate studies, Moore and Bosak collaborated with Greg Fournier, the Cecil and Ida Green Assistant Professor of Geobiology, on research tracking cyanobacterial evolution. Their research is supported by the Simons Collaboration on the Origins of Life.
The question of when cyanobacteria gained the ability to perform oxygenic photosynthesis, the oxygen-producing process by which most plants on Earth today capture energy, is still under debate. To track cyanobacterial evolution, MIT researchers draw from genetics and micropaleontology. Moore works on molecular clock models, which track genetic mutations over time to measure evolutionary divergence in organisms.
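The core idea behind a molecular clock can be sketched in a few lines. Under the simplest (strict-clock) assumption, substitutions accumulate at a roughly constant rate, so a pairwise genetic distance converts directly into a divergence time. The function name and the numbers below are illustrative, not from the article:

```python
def divergence_time(genetic_distance, rate_per_site_per_year):
    """Strict molecular clock: two lineages accumulate substitutions
    independently after splitting, so their pairwise distance grows at
    twice the per-lineage rate. Returns estimated years since divergence."""
    return genetic_distance / (2.0 * rate_per_site_per_year)

# Illustrative inputs: two lineages differing at 2% of sites, with an
# assumed clock rate of 1e-9 substitutions per site per year.
t = divergence_time(0.02, 1e-9)
print(f"{t:.1e} years")  # prints 1.0e+07 years
```

Real analyses relax the constant-rate assumption and calibrate the clock against dated fossils, which is where the fossil work described above comes in.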
Clad in a white lab coat, lab glasses, and bright purple gloves, Moore sifts through multiple cyanobacteria under a microscope to find modern analogs to ancient cyanobacterial fossils. The process can be time-consuming.
“I do a lot of microscopy,” Moore says with a laugh. Once she’s identified an analog, Moore cultures that particular type of cyanobacteria, a process which can sometimes take months. After the strain is enriched and cultured, Moore extracts DNA from the cyanobacteria. “We sequence modern organisms to get their genomes, reconstruct them, and build phylogenetic trees,” Moore says.
By tying information together from ancient fossils and modern analogs using molecular clocks, Moore hopes to build a chronogram — a type of phylogenetic tree with a time component that eventually traces back to when cyanobacteria evolved the ability to split water and produce oxygen.
Moore also studies the process of fossilization, on Earth and potentially other planets. She is collaborating with researchers at NASA’s Jet Propulsion Laboratory to help them prepare for the upcoming Mars 2020 rover mission.
“We’re trying to analyze fossils on Earth to get an idea for how we’re going to look at whatever samples get brought back from Mars, and then to also understand how we can learn from other planets and potentially other life,” Moore says.
After MIT, Moore hopes to continue research, pursue postdoctoral fellowships, and eventually teach.
“I really love research. So why stop? I’m going to keep going,” Moore says. She says she wants to teach in an institution that emphasizes giving research opportunities to undergraduate students.
“Undergrads can be overlooked, but they’re really intelligent people and they’re budding scientists,” Moore says. “So being able to foster that and to see them grow and trust that they are capable of doing research, I think, is my calling.”
Geology up close
To study ancient organisms and find fossils, Moore has traveled across the world, to Shark Bay in Australia, Death Valley in the United States, and Bermuda.
“In order to understand the rocks, you really have to get your nose on the rocks. Go and look at them, and be there. You have to go and stand in the tidal pools and see what’s happening — watch the air bubbles from the cyanobacteria and see them make oxygen,” Moore says. “Those kinds of things are really important in order to understand and fully wrap your brain around how important those interactions are.”
And in the field, Moore says, researchers have to “roll with the punches.”
“You don’t have a nice, beautiful, pristine lab set up with all the tools and equipment that you need. You just can’t account for everything,” Moore says. “You have to do what you can with the tools that you have.”
As a Graduate Resident Tutor, Moore helps to create supportive living environments for the undergraduate residents of Simmons Hall.
Each week, she hosts a study break in her apartment in Simmons for her cohort of students — complete with freshly baked treats. “[Baking] is really relaxing for me,” Moore says. “It’s therapeutic.”
“I think part of the reason I love baking so much is that it’s my creative outlet,” she says. “I know that a lot of people describe baking as like chemistry. But I think you have the opportunity to be more creative and have more fun with it. The creative side of it is something that I love, that I crave outside of research.”
Part of Moore’s determination to research, trek out in the field, and mentor undergraduates draws from her “biggest science inspiration” — her mother, Michele Moore, a physics professor at Spokane Falls Community College in Spokane, Washington.
“She was a stay-at-home mom my entire childhood. And then when I was in middle school, she decided to go and get a college degree,” Moore says. When Moore started high school, her mother earned her bachelor’s degree in physics. Then, when Moore started college, her mother earned her PhD. “She was sort of one step ahead of me all the time, and she was a big inspiration for me and gave me the confidence to be a woman in science.”
The American Physical Society (APS) has recognized MIT Plasma Science and Fusion Center (PSFC) principal research scientists John Wright and Stephen Wukitch, as well as Yevgen Kazakov and Jozef Ongena of the Laboratory for Plasma Physics in Brussels, Belgium, with the Landau-Spitzer Award for their collaborative work.
Given biennially to acknowledge outstanding plasma physics collaborations between scientists in the U.S. and the European Union, the prize this year is being awarded “for experimental verification, through collaborative experiments, of a novel and highly efficient ion cyclotron resonance heating scenario for plasma heating and generation of energetic ions in magnetic fusion devices.”
The collaboration originated at a presentation on a proposed heating scenario by Kazakov, given at a conference in 2015. Wright and Wukitch were confident that MIT's Alcator C-Mod (the world’s highest-field tokamak) and the UK's JET (the world’s largest tokamak) would allow for an expedited and comprehensive experimental investigation. C-Mod’s high magnetic fields made it ideal for confining energetic ions, and its unique diagnostics allowed the physics to be verified within months of the conference. The results greatly strengthened Kazakov and Ongena's proposal for a JET experiment that conclusively demonstrated generation of energetic ions via this heating technique.
Additional C-Mod experiments were the first to observe alpha-like energetic ions at high magnetic field and reactor-like densities. The joint experimental work highlighting JET and C-Mod results was published in Nature Physics.
One of the key fusion challenges is confining the very energetic fusion product ions, which must transfer their energy to the core plasma before they escape confinement. This heating scenario efficiently generates ions with energies comparable to those of fusion products, and can be used to study energetic ion behavior in present-day devices such as JET and the stellarator Wendelstein 7-X (W-7X). It will also allow study in the non-nuclear phase of ITER, the next-generation fusion device being built in France.
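The resonance behind this heating technique is the standard ion cyclotron condition: radio-frequency waves deposit their energy on ions whose gyration frequency in the magnetic field matches the wave frequency. A textbook sketch (the numerical example is illustrative, not taken from the experiments above):

```latex
% Ion cyclotron frequency for an ion of charge q and mass m in field B:
f_c = \frac{q B}{2\pi m}
% For a deuteron (q = e, m \approx 2 m_p) in a field of a few tesla,
% f_c falls in the tens of MHz, i.e. the radio-frequency range.
```

Tuning the launched wave to match (or to a harmonic of) this frequency for a chosen minority ion species is what lets the scheme channel RF power into a small, very energetic ion population.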
“It will be the icing on the cake to use this scenario at W-7X,” says Wright. “Because stellarators have large volume and high-density plasmas, it is hard for current heating scenarios to achieve those fusion energies. With conventional techniques it has been difficult to show if stellarators can confine fast ions. Using this novel scenario will definitely allow researchers to demonstrate whether a stellarator will work for fusion plasmas.”
The award, given jointly by APS and the European Physical Society, will be presented to the team in November at the APS Division of Plasma Physics meeting in Portland, Oregon.
Do you have a great idea for a nonfiction book in science or technology, broadly defined? Editors receive hundreds of inquiries each year. But what makes one book project stand out from the rest?
In the spirit of fun and fostering the Boston publishing scene, the MIT Press is hosting its first-ever Pitchfest competition.
Pitchfest will give six contestants the opportunity to present their best science or technology book idea before a panel of judges and a live audience at the Boston Book Festival on Oct. 13 in Boston. The winner will be given the opportunity to workshop a full-fledged book proposal with an MIT Press editor, get advice on how to navigate the publishing world, and receive a $1,000 cash prize.
The deadline for submissions is Sept. 1, after which time finalists will be selected and given offers to participate in the event. Pitchfest is an open competition and anyone is welcome to submit a proposal.
The MIT Press is a leading publisher of books and journals at the intersection of science, technology, and the arts. MIT Press books and journals are known for their intellectual daring, scholarly standards, and distinctive design.