MIT Latest News
The newly renovated Hayden Library and Building 14 courtyard opened to the MIT community Aug. 23. The spaces were re-envisioned to provide areas for collaborative work, exploring collections, a teaching and event space, a new cafe, and areas to unwind, surrounded by greenery and natural light.
“Libraries have a unique role to play at an institution like MIT, especially the physical spaces of the Libraries,” says MIT Libraries Director Chris Bourg. “It was critical that the new Hayden Library and courtyard meet some core needs for the MIT community, including a place for working hands-on with collections, spaces for collaborative group work and community building, and accessible, welcoming spots for working or relaxing in beautiful surroundings.”
A library for our current moment
Construction on the renovation began in January 2020 following a nine-month design phase that incorporated input from the MIT community with the vision for academic libraries outlined in the Institute’s Ad Hoc Task Force on the Future of Libraries. The design by Kennedy & Violich Architecture (KVA) creates flexible spaces that accommodate a range of teaching and learning needs — from focused, quiet study to collaborative work to casual conversation — and supports the important community functions of the libraries at MIT.
Titled “Research Crossroads,” KVA’s design concept for Hayden reflects the many intersections that define research in today’s academic library — connections between disciplines, between community members, and between digital and tangible collections.
“So much of our vision for libraries has been given new urgency by the pandemic — the need for equitable digital access to information, adapting to hybrid modes of teaching and learning, and the essential role of physical space, including outdoor space, on a campus,” says Bourg. “With this renovation, the design and features of Hayden and the courtyard meet that new urgency with an almost uncanny prescience.”
First floor: Collaboration and community
By exposing the courtyard-facing windows on the first floor of the library, the renovation has created an expansive, double-height space filled with natural light. At the center, two double-height central pavilions clad in translucent glass and ash wood offer views to both the courtyard and the Charles River while providing eight new reservable group study rooms and a loft space for exhibits. A variety of seating areas, punctuated with whiteboards, movable curtains, and curated library displays, face the Boston skyline in the collaborative study area.
The first floor is also home to two brand-new community spaces. The Nexus is a flexible teaching and gathering space that can be configured in different ways to host a range of events, from library workshops to MIT Reads author events. The Courtyard Cafe, open Monday through Friday, 10 a.m. to 6 p.m., offers coffee, drinks, sandwiches, and snacks that can be enjoyed in a seating area overlooking the courtyard. The entire first floor, more than 10,000 square feet of space, is accessible 24 hours a day to current students, faculty, and staff.
Second floor: Quiet space and expert help
The second floor features the Quiet Reading Room, with Hayden’s signature window bays overlooking the river. Adjacent to the reading room is an area called the Oasis, envisioned as a technology-minimal zone for meditation, relaxation, and taking a study break.
In the second floor’s Consultation Suite, the community can meet with library experts in a range of areas: subject specialties, scholarly communications, copyright, and more. Additional flexible study space for the MIT community has been added on the east end of the building on the second floor, including a Research Alcove for using library collections.
A revitalized courtyard
Working in collaboration with Stephen Stimson Associates Landscape Architects, KVA designed a woodland garden at the center of Building 14 with nine new Katsura trees and plantings in a new green topography bordered by paved walkways and seating areas. The renovated courtyard has two accessible entrances and a new, three-season porch extending along the east end of the courtyard. The porch’s accordion doors can open onto the courtyard in warm weather, and a perforated sun screen creates a dappled sunlight effect, extending the atmosphere of the tree canopy into the new space.
“In envisioning different ways that the MIT community could gather, collaborate, and conduct research, it was essential to design an intimate connection with nature in the courtyard,” says Sheila Kennedy, principal at KVA and professor of architecture in the MIT School of Architecture and Planning. “The new porch space minimizes energy loads and maximizes flexible uses on campus. It is a special place where people can enjoy a conversation with each other in the presence of plants and trees.”
Designed for inclusivity and wellness
In addition to targeting LEED Gold certification, the Hayden Library project meets a Fitwel certification goal that supports healthier workplace environments to help improve occupant health and productivity. The library is the first MIT construction project to target “Red List Free” products for all interior finish materials and fabrics to create a healthier space for the community. All-gender restrooms throughout Building 14 provide equitable access for all members of the MIT community, and the second floor also features a lactation room. A new installation of art from the List Visual Arts Center throughout the second floor seeks to both engage the MIT community and reflect its diversity.
“The past year-and-a-half has underscored how important it is to connect with each other in physical spaces at MIT, as well as the need to do that in ways that promote health and well-being,” says Krystyn Van Vliet, associate provost and associate vice president for research. “From the beginning, these projects have prioritized the needs of the MIT community, the value of green space, and the accessible, inclusive, and health-conscious aspects of campus space.”
Hayden Library and the Building 14 Courtyard are currently open to anyone in Covid Pass and escorted visitors. Please visit MIT Now for the latest information on campus access and Institute policies.
By definition, dwarf galaxies are small and dim, with just a fraction of the stars found in the Milky Way and other galaxies. There are, however, giants among the dwarfs: Ultra-diffuse galaxies, or UDGs, are dwarf systems that contain relatively few stars but are scattered over vast regions. Because they are so diffuse, these systems are difficult to detect, though most have been found tucked within clusters of larger, brighter galaxies.
Now astronomers from MIT, the University of California at Riverside, and elsewhere have used detailed simulations to detect “quenched” UDGs — a rare type of dwarf galaxy that has stopped generating stars. They identified several such systems in their simulations and found the galaxies were not in clusters, but rather exiled in voids — quiet, nearly empty regions of the universe.
This isolation goes against astronomers’ predictions of how quenched UDGs should form. So, the team used the same simulations to rewind the dwarf systems’ evolution and see exactly how they came to be.
The researchers found that quenched UDGs likely coalesced within halos of dark matter with unusually high angular momentum. Like a cotton candy machine, this extreme environment may have spun out dwarf galaxies that were anomalously stretched out.
These UDGs then evolved within galaxy clusters, like most UDGs. But interactions within the cluster likely ejected the dwarfs into the void, giving them wide, boomerang-like trajectories known as “backsplash” orbits. In the process, the galaxies’ gas was stripped away, leaving the galaxies “quenched” and unable to produce new stars.
The simulations showed that such UDGs should be more common than what has been observed. The researchers say their results, published today in Nature Astronomy, provide a blueprint for astronomers to go looking for these dwarfish giants in the universe’s voids.
“We always strive to get a complete census of the galaxies that we have in the universe,” says Mark Vogelsberger, associate professor of physics at MIT. “This study is adding a new population of galaxies that the simulation actually predicts. And we now have to look for them in the real universe.”
Vogelsberger co-led the study with Laura Sales of UC Riverside and José A. Benavides of the Institute of Theoretical and Experimental Astronomy in Argentina.
Red vs. blue
The team’s search for quenched UDGs began with a simple survey for UDG satellites — ultra-diffuse systems that reside outside galaxy clusters. Astronomers predict that UDGs within clusters should be quenched, as they would be surrounded by other galaxies that would essentially rub out the UDG’s already-diffuse gas and shut off star production. Quenched UDGs in clusters should then consist mainly of old stars and appear red in color.
If UDGs exist outside clusters, in the void, they are expected to continue churning out stars, as there would be no competing gas from other galaxies to quench them. UDGs in the void, therefore, are predicted to be rich with new stars, and to appear blue.
When the team surveyed previous detections of UDG satellites, outside clusters, they found most were blue as expected — but a few were red.
“That’s what caught our attention,” Sales says. “And we thought, ‘What are they doing there? How did they form?’ There was no good explanation.”
To find one, the researchers looked to TNG50, a detailed cosmological simulation of galaxy formation developed by Vogelsberger and others at MIT and elsewhere. The simulation runs on some of the most powerful supercomputers in the world and is designed to evolve a large volume of the universe, from conditions resembling those shortly after the Big Bang to the present day.
The simulation is based on fundamental principles of physics and the complex interactions between matter and gas, and its results have been shown in many scenarios to agree with what astronomers have observed in the actual universe. TNG50 has therefore been used as an accurate model for how and where many types of galaxies evolve through time.
In their new study, Vogelsberger, Sales, and Benavides used TNG50 to first see if they could spot quenched UDGs outside galaxy clusters. They started with a cube of the early universe measuring about 150 million light years wide, and ran the simulation forward, up through the present day. Then they searched the simulation specifically for UDGs in voids, and found most of the ones they detected were blue, as expected. But a surprising number — about 25 percent — were red, or quenched.
They zeroed in on these red satellite dwarfs and used the same simulation, this time as a sort of time machine to see how, when, and where these galaxies originated. They found that the systems were initially part of clusters but were somehow thrown out into the void, on a more elliptical, “backsplash” orbit.
“These orbits are almost like those of comets in our solar system,” Sales says. “Some go out and orbit back around, and others may come in once and then never again. For quenched UDGs, because their orbits are so elliptical, they haven’t had time to come back, even over the entire age of the universe. They are still out there in the field.”
The simulations also showed that the quenched UDGs’ red color arose from their ejection — a violent process that stripped away the galaxies’ star-forming gas, leaving them quenched and red. Running the simulations further back in time, the team observed that the tiny systems, like all galaxies, originated in halos of dark matter, where gas coalesces into galactic disks. But for quenched UDGs, the halos appeared to spin faster than normal, generating stretched-out, ultra-diffuse galaxies.
Now that the researchers have a better understanding of where and how quenched UDGs arose, they hope astronomers can use their results to tune telescopes, to identify more such isolated red dwarfs — which the simulations suggest must be lurking in larger numbers than what astronomers have so far detected.
“It’s quite surprising that the simulations can really produce all these very small objects,” Vogelsberger says. “We predict there should be more of this kind of galaxy out there. This makes our work quite exciting.”
MIT.nano has added the Nanoscribe Photonic Professional GT2, a high-speed, three-dimensional microfabrication instrument, to its fabrication capabilities. The GT2 will provide MIT.nano users with the ability to create high-resolution 3D structures. It has been installed and qualified in MIT.nano’s third-floor soft lithography space, and is now available for training and use.
The GT2 uses two-photon technology to crosslink special polymers and produce intricate structures of nearly any 3D shape, including crystal lattices, porous scaffolds, and naturally inspired patterns. The tool offers this design flexibility while remaining remarkably precise, with a finest resolution of 400 nanometers. It is also fast, with a maximum scan speed of 625 millimeters per second. The tool will support research and rapid prototyping in a variety of areas such as microfluidics, micromechanics, biomedical engineering, micro-optics, and nanostructures.
“The GT2 employs a fundamentally different approach compared to other types of lithography systems we have in the fab,” says Assistant Director for User Services Jorg Scholvin. “It opens up completely new types of devices and research paths that would be impossible to explore with conventional lithography methods.”
The new instrument was purchased by the Portela Research Group in the Department of Mechanical Engineering. The group, which focuses on architected mechanics and materials across scales, is led by d’Arbeloff Career Development Assistant Professor Carlos Portela. In a recent study led by Portela, researchers fabricated an ultralight “nanoarchitected” material that can withstand supersonic microparticle impacts. “This work marks the beginning of our explorations on the dynamics of nanomaterials, which we plan to do at MIT.nano,” says Portela.
To accompany the new GT2, MIT.nano also acquired a multi-application critical point dryer (CPD) with a 2.5-inch chamber for use with lithography processes. Critical point drying is a process to remove liquid and dry delicate samples in a controlled way. This is particularly important for the delicate, sub-micron structures fabricated with the GT2. Forces from surface tension can pull at small, fragile structures as liquids change phase to gas and evaporate, resulting in damaged or destroyed devices. By pushing the pressure-temperature profile of the fluid near the critical point — at which the differences between the gas and liquid phases no longer exist — the sample can be “dried” without crossing these phase boundaries, avoiding damage from surface tension forces.
The new dryer, a Tousimis Autosamdri-931, can process up to five substrates per run and offers fast chamber cooling, dropping from 25 degrees Celsius to 3 C in just two minutes to push carbon dioxide fluid around its critical point quickly. The CPD is now available to researchers for training and use.
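The principle described above can be reduced to a simple check: the sample is safe to vent only once the CO2 in the chamber has been carried past its critical point, where no liquid-gas boundary exists. The sketch below is illustrative only (it is not part of any tool's software), using the textbook critical-point values for CO2:

```python
# Published critical point of carbon dioxide.
CO2_CRITICAL_TEMP_C = 31.1        # critical temperature, degrees Celsius
CO2_CRITICAL_PRESSURE_MPA = 7.38  # critical pressure, MPa

def is_supercritical(temp_c, pressure_mpa):
    """True if CO2 at this state has no liquid-gas phase boundary to cross."""
    return (temp_c >= CO2_CRITICAL_TEMP_C
            and pressure_mpa >= CO2_CRITICAL_PRESSURE_MPA)

# A CPD run ramps the chamber past the critical point before venting, so the
# liquid never evaporates across a phase boundary:
print(is_supercritical(35.0, 8.0))  # True: supercritical, safe to "dry"
print(is_supercritical(20.0, 5.5))  # False: ordinary liquid CO2 remains
```

The fast cooling to 3 C noted above serves the same logic in reverse: it keeps the fill-and-purge cycles well inside the liquid region before the final ramp past the critical point.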
For more information about MIT.nano’s tools and instruments, visit nanousers.mit.edu.
Paul Penfield, the Dugald C. Jackson Professor of Electrical Engineering, Emeritus, died on June 22 at age 88. Affiliated with the Microsystems Technology Laboratories (MTL), Penfield was a member of the MIT Department of Electrical Engineering and Computer Science (EECS) faculty for 45 years, beginning in 1960. He served as associate head of the department from 1974-78; as director of the Microsystems Research Program from 1985-89; and as department head from 1989 to 1999.
He was the Dugald C. Jackson Professor of Electrical Engineering from January 2000 until his retirement in June 2005. In 1996-1997 he served as president of the National Electrical Engineering Department Heads Association (NEEDHA) and in March 2000 received its Outstanding Service Award. In 1998 he organized the Building 20 Commemoration, for which he received the 1999 Presidential Citation from The Association of Alumni and Alumnae of MIT.
As a departmental leader, Penfield was instrumental in building up MIT EECS’s activity in silicon integrated circuits. He played a key role in bringing Lynn Conway, as a visiting professor, to EECS in 1978 to teach the first very large-scale integration system design course, culminating with student-designed integrated circuits that were fabricated by HP. This launched not only a universally accepted syllabus, but also the concept of the MOSIS foundry. As associate head of the EECS department, and subsequently founding director of the Microsystems Research Program, he also spearheaded the establishment of the MTL, which allowed MIT to reclaim a prominent national position in silicon technology. As associate head of EECS, he built faculty strength not just in integrated circuits, but in the computer architecture that this field of study enabled.
One of Penfield’s most significant contributions to the EECS community was the development of 6.050J / 2.110J (Information, Entropy, and Computation), a course offered jointly by EECS and the Department of Mechanical Engineering for almost 20 years, which helped make the Second Law of Thermodynamics more accessible to first-year students by treating entropy as a form of information. Penfield cared deeply about education in electrical engineering and computer science, articulating the importance of both “The Electron and the Bit” in the past and future evolution of the field. His devotion to his students was fittingly commemorated with the Paul L. Penfield Student Service Award, granted to undergraduate, MEng, and graduate students alike in recognition of service to the department. Another lasting contribution was Penfield’s establishment of the Master of Engineering degree as an accessible and primary path for MIT EECS undergraduates, following (or integrated with) their bachelor’s degree.
Beyond the classroom, Penfield’s wide-ranging research interests included inquiry into solid-state microwave devices and circuits, noise, and thermodynamics; electrodynamics of moving media; circuit theory; computer-aided design; APL language extensions; integrated-circuit design automation; computer-aided fabrication of integrated circuits; and the equivalence of information and thermodynamic entropy. His meaningful contributions to the field, including five books and hundreds of papers, were recognized with the Centennial Medal from IEEE in 1984, the IEEE Circuits and Systems Society Darlington Prize Paper Award in 1985, and the IEEE Circuits and Systems Society Golden Jubilee Award in 1999. Penfield was a fellow of the Institute of Electrical and Electronics Engineers (IEEE) and the International Engineering Consortium; a member of the National Academy of Engineering, Sigma Xi, the American Physical Society, the Association for Computing Machinery, the Audio Engineering Society, and the National Electrical Engineering Department Heads Association.
A passionate environmentalist and member of the American Fern Society and the Hardy Fern Foundation, Penfield cultivated a collection of ferns, beginning with spores in petri dishes and planting the larger specimens in the garden around his home. In the weeks immediately preceding his death, friends found him busily engaged with the installation of a QR-code-based system of plant identification placards for his home town of Weston, Massachusetts.
“For me, dad was through-and-through a professor,” says his son, David Penfield. “Knowledge was his currency of life … His goal was to become an expert — not for fame, fortune, to one-up people, or to belittle people. Rather, he wanted to be an expert because he realized it was the best way to contribute to a project, to others’ well-being, and to make the world a better place.”
One of Penfield's most notable contributions to that better world was his tireless championship of the development of an ecologically sound rail trail in Weston, where residents and visitors could hike and ride bikes in peaceful, wooded surroundings.
Penfield’s devotion to his community was exceeded only by his dedication to his family, whose growth and successes he chronicled with pride as an amateur genealogist. (His talent for investigation was reflected in his meticulously researched family tree, which extended back to the 1600s.) He was predeceased by his first wife Martha Elise (Dieterle) Penfield, his second wife Barbara Jean (Buehrig Lory) Penfield, his sister Eleanor Baldwin (Penfield) Spencer and her husband Guilford Lawson Spencer II. He is survived by his companion, Catherine Liddell, of Natick, Massachusetts; his sister Martha Warren (Penfield) Brown and her husband David Tyner Brown of Virginia Beach, Virginia; three children: David Wesley Penfield and his wife Rebecca Emily Bronson of Westford, Massachusetts (sons Andrew Bronson Penfield and Scott Arthur Penfield), Patricia Penfield Jonas and her husband Craig Christopher Jonas of Lakewood, Colorado (son Paul Penfield Jonas), and Michael Baldwin Penfield of Saint Paul, Minnesota; four step-children: John Albert Lory and his wife, Joan Elaine Jouriles of Columbia, Missouri (sons Joshua Nicholas Lory and Asa John Lory), Stephen Richards Lory of Arcata, California (children Aaron Matthew Lory and Shaely Sullivan), Carol Frances (Lory) Topp of Vienna, Virginia (children Russell Francis Topp and Emilia Susan Topp), and Cameron Lory Faulds and her husband Andrew James Faulds of Brooklyn, New York.
From tropical storms to landslides, the form and frequency of natural hazards vary widely. But the feelings of vulnerability they can provoke are universal.
Growing up in hazard-prone cities, Ipek Bensu Manav, a civil and environmental engineering PhD candidate with the MIT Concrete Sustainability Hub (CSHub), noticed that this vulnerability was always at the periphery. Today, she’s studying vulnerability, in both its engineering and social dimensions, with the aim of promoting more hazard-resilient communities.
Her research at CSHub has taken her across the country to attend impactful conferences and allowed her to engage with prominent experts and decision-makers in the realm of resilience. But more fundamentally, it has also taken her beyond the conventional bounds of engineering, reshaping her understanding of the practice.
From her time in Miami, Florida, and Istanbul, Turkey, Manav is no stranger to natural hazards. Istanbul, which suffered a devastating earthquake in 1999, is predicted to experience an equally violent tremor in the near future, while Miami ranks among the top cities in the U.S. in terms of natural disaster risk due to its vulnerability to hurricanes.
“Growing up in Miami, I’d always hear about hurricane season on the news,” recounts Manav. “While in Istanbul there was a constant fear about the next big earthquake. Losing people and [witnessing] those kinds of events instilled in me a desire to tame nature.”
It was this desire to “push the bounds of what is possible” — and to protect lives in the process — that motivated Manav to study civil engineering at Boğaziçi University. Her studies there affirmed her belief in the formidable power of engineering to “outsmart nature.”
This, in part, led her to continue her studies at MIT CSHub — a team of interdisciplinary researchers who study how to achieve resilient and sustainable infrastructure. Her role at CSHub has given her the opportunity to study resilience in depth. It has also challenged her understanding of natural disasters — and whether they are “natural” at all.
“Over the past few decades, some policy choices have increased the risk of experiencing disasters,” explains Manav. “An increasingly popular sentiment among resilience researchers is that natural disasters are not ‘natural,’ but are actually man-made. At CSHub we believe there is an opportunity to do better with the growing body of knowledge in engineering and policy research.”
As a part of the CSHub portfolio, Manav’s research looks not just at resilient engineering, but the engineering of resilient communities.
Her work draws on a metric developed at CSHub known as city texture, which is a measurement of the rectilinearity of a city’s layout. City texture, Manav and her colleagues have found, is a versatile and informative measurement. By capturing a city’s order or disorder, it can predict variations in wind flow — variations currently too computationally intensive for most cities to easily render.
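To make the idea of rectilinearity concrete, here is one plausible way to quantify it from street or building orientations, offered purely as an illustration: fold each orientation into a 90-degree period (a right-angle grid repeats every 90 degrees) and measure how tightly the folded angles cluster. The actual CSHub city-texture metric is defined in the group's publications and may differ from this sketch.

```python
import math

def rectilinearity(orientations_deg):
    """Return an order parameter in [0, 1]: 1 = perfect grid, near 0 = disordered."""
    # Multiplying by 4 maps the 90-degree period onto a full circle, so that
    # 0, 90, 180, and 270 degrees all land on the same point.
    xs = [math.cos(math.radians(4 * a)) for a in orientations_deg]
    ys = [math.sin(math.radians(4 * a)) for a in orientations_deg]
    # Mean resultant length of the folded orientation vectors.
    return math.hypot(sum(xs), sum(ys)) / len(orientations_deg)

grid_like = [0, 90, 0, 90, 180, 270]   # segments aligned to a right-angle grid
irregular = [0, 23, 47, 61, 88, 112]   # an irregular, "disordered" layout
print(rectilinearity(grid_like))   # close to 1.0 for a perfect grid
print(rectilinearity(irregular))   # substantially lower
```

A single scalar like this is cheap to compute from map data, which is why an order/disorder measure can stand in for wind-flow simulations that would otherwise be computationally prohibitive.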
Manav has derived this metric for her native South Florida. A city texture analysis she conducted there found that numerous census tracts could experience wind speeds 50 percent greater than currently predicted. Mitigating these wind variations could lead to some $697 million in savings annually.
Such enormous hazard losses and the growing threat of climate change have presented her with a new understanding of engineering.
“With resilience and climate change at the forefront of engineering, the focus has shifted,” she explains, “from defying limits and building impressive structures to making structures that adapt to the changing environment around us.”
Witnessing this shift has reoriented her relationship with engineering. Rather than viewing it as a distinct science, she has begun to place it in its broader social and political context — and to recognize how those social and political dynamics often determine engineering outcomes.
“When I started grad school, I often felt ‘Oh this is an engineering problem. I can engineer a solution’,” recounts Manav. “But as I’ve read more about resilience, I’ve realized that it’s just as much a concern of politics and policy as it is of engineering.”
She attributes her awareness of policy to MIT CSHub’s collaboration with the Portland Cement Association and the Ready Mixed Concrete Research & Education Foundation. The commitment of the concrete and cement industries to resilient construction has exposed her to the myriad policies that dictate the resilience of communities.
“Spending time with our partners made me realize how much of a policy issue [resilience] is,” she explains. “And working with them has provided me with a seat at the table with the people engaged in resilience.”
Opportunities for engagement have been plentiful. She has attended numerous conferences and met with leaders in the realm of sustainability and resilience, including the International Code Council (ICC), Smart Home America, and Strengthen Alabama Homes.
Some opportunities have proven particularly fortuitous. When attending a presentation hosted by the ICC and the National Association for the Advancement of Colored People (NAACP) that highlighted people of color working on building codes, Manav felt inspired to reach out to the presenters. Soon after, she found herself collaborating with them on a policy report on resilience in communities of color.
“For me, it was a shifting point, going from prophesying about what we could be doing, to observing what is being done. It was a very humbling experience,” she says. “Having worked in this lab made me feel more comfortable stepping outside of my comfort zone and reaching out.”
Manav credits this growing confidence to her mentorship at CSHub. More than just providing support, CSHub Co-director Randy Kirchain has routinely challenged her and inspired further growth.
“There have been countless times that I’ve reached out to him because I was feeling unsure of myself or my ideas,” says Manav. “And he’s offered clarity and assurance.”
Before her first conference, she recalls Kirchain staying in the office well into the evening to help her practice and hone her presentation. He’s also advocated for her on research projects to ensure that her insight is included and that she receives the credit she deserves. But most of all, he’s been a great person to work with.
“Randy is a lighthearted, funny, and honest person to be around,” recounts Manav. “He builds in me the confidence to dive straight into whatever task I’m tackling.”
That current task is related to equity. Inspired by her conversations with members of the NAACP, Manav has introduced a new dimension to her research — social vulnerability.
In contrast to place vulnerability, which captures the geographical susceptibility to hazards, social vulnerability captures the extent to which residents have the resources to respond to and recover from hazard events. Household income could act as a proxy for these resources, and the spread of household income across geographies and demographics can help derive metrics of place and social vulnerability. And these metrics matter.
“Selecting different metrics favors different people when distributing hazard mitigation and recovery funds,” explains Manav. “If we’re looking at just the dollar value of losses, then wealthy households with more valuable properties disproportionately benefit. But, conversely, if we look at losses as a percentage of income, we’re going to prioritize low-income households that might not necessarily have the resources to recover.”
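The trade-off Manav describes can be made concrete with a toy calculation. The household figures below are entirely hypothetical; the point is only that the two metrics rank the same households in opposite orders:

```python
# Two hypothetical households hit by the same hazard event.
households = [
    {"name": "A", "income": 250_000, "loss": 50_000},  # wealthy, large dollar loss
    {"name": "B", "income": 40_000,  "loss": 20_000},  # low-income, smaller dollar loss
]

# Metric 1: rank by absolute dollar losses.
by_dollars = sorted(households, key=lambda h: h["loss"], reverse=True)

# Metric 2: rank by losses as a fraction of income (the "burden" on the household).
by_burden = sorted(households, key=lambda h: h["loss"] / h["income"], reverse=True)

print([h["name"] for h in by_dollars])  # ['A', 'B']: dollar metric favors A
print([h["name"] for h in by_burden])   # ['B', 'A']: burden metric favors B
```

Household A lost more money, but household B lost half a year's income versus A's one-fifth, so whichever metric a funding formula adopts effectively decides who gets helped first.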
Manav has incorporated metrics of social vulnerability into her city texture loss estimations. The resulting approach could predict unmitigated damage, estimate subsequent hazard losses, and measure the disparate impact of those losses on low-income and socially vulnerable communities.
Her hope is that this streamlined approach could change how funds are disbursed and give communities the tools to solve the entwined challenges of climate change and equity.
The city texture work Manav has adopted is quite different from the gravity-defying engineering that drew her to the field. But she’s found that it is often more pragmatic and impactful.
Rather than mastering the elements, she’s learning how to adapt to them and help others do the same. Solutions to climate change, she’s discovered, demand the collaboration of numerous parties — as well as a willingness to confront one’s own vulnerabilities and make the decision to reach out.
The Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) recently announced the 2021 J-WAFS Solutions grant recipients. The J-WAFS Solutions program aims to propel MIT water- and food-related research toward commercialization. Grant recipients receive one year of financial support, as well as mentorship, networking, and guidance from industry experts, to begin their journey into the commercial world — whether that be in the form of bringing innovative products to market or launching cutting-edge startup companies.
This year, three projects will receive funding across water, food, and agriculture spaces. The winning projects will advance nascent technologies for off-grid refrigeration, portable water filtration, and dairy waste recycling. Each provides an efficient, accessible solution to the respective challenge being addressed.
Since the start of the J-WAFS Solutions program in 2015, grants have provided instrumental support in creating a number of key MIT startups that focus on major water and food challenges. A 2015-16 grant helped the team behind Via Separations develop their business plan to massively decarbonize industrial separations processes. Other successful J-WAFS Solutions alumni include researchers who created a low-cost water filter made from tree branches and the team that launched the startup Xibus Systems, which is developing a handheld food safety sensor.
“New technological advances are being made at MIT every day, and J-WAFS Solutions grants provide critical resources and support for these technologies to make it to market so that they can transform our local and global water and food systems,” says J-WAFS Executive Director Renee Robins. “This year’s grant recipients offer innovative tools that will provide more accessible food storage for smallholder farmers in places like Africa, safer drinking water, and a new approach to recycling food waste,” Robins notes. She adds, “J-WAFS is excited to work with these teams, and we look forward to seeing their impact on the water and food sectors.”
The J-WAFS Solutions program is implemented in collaboration with Community Jameel, the global philanthropic organization founded by Mohammed Jameel ’78, and is supported by the MIT Venture Mentoring Service and the I-Corps New England Regional Innovation Node at MIT.
Mobile evaporative cooling rooms for vegetable preservation
Food waste is a persistent problem across food systems supply chains, as 30-50 percent of food produced is lost before it reaches the table. The problem is compounded in areas without access to the refrigeration necessary to store food after it is harvested. Hot and dry climates in particular struggle to preserve food before it reaches consumers. A team led by Daniel Frey, faculty director for research at MIT D-Lab and professor of mechanical engineering, has pioneered a new approach to enable farmers to better preserve their produce and improve access to nutritious food in the community. The team includes Leon Glicksman, professor of building technology and mechanical engineering, and Eric Verploegen, a research engineer in MIT D-Lab.
Instead of relying on traditional refrigeration with high energy and cost requirements, the team is utilizing forced-air evaporative cooling chambers. Their design, based on retrofitting shipping containers, will provide a lower-cost, better-performing solution enabling farmers to chill their produce without access to power. The research team was previously funded by J-WAFS through two different grants in 2019 to develop the off-grid technology in collaboration with researchers at the University of Nairobi and the Collectives for Integrated Livelihood Initiatives (CInI), Jamshedpur. Now, the cooling rooms are ready for pilot testing, which the MIT team will conduct with rural farmers in Kenya and India. The MIT team will deploy and test the storage chambers through collaborations with two Kenyan social enterprises and a nongovernmental organization in Gujarat, India.
Off-grid portable ion concentration polarization desalination unit
Shrinking aquifers, polluted rivers, and increased drought are making fresh drinking water increasingly scarce, driving the need for improved desalination technologies. The water purifier market, valued at $45 billion in 2019, is expected to grow to $90.1 billion by 2025. However, current products on the market are limited in scope: they are designed to treat water that is already relatively low in salinity, and they do not account for lead contamination or other technical challenges. A better solution is required to ensure access to clean and safe drinking water in the face of water shortages.
A team led by Jongyoon Han, professor of biological engineering and electrical engineering at MIT, has developed a portable desalination unit that utilizes an ion concentration polarization process. The compact and lightweight unit has the ability to remove dissolved and suspended solids from brackish water at a rate of one liter per hour, both in installed and remote field settings. The unit was featured in an award-winning video in the 2021 J-WAFS World Water Day Video Competition: MIT Research for a Water Secure Future. The team plans to develop the next-generation prototype of the desalination unit alongside a mass-production strategy and business model.
Converting dairy industry waste into food and feed ingredients
One of the trendiest foods in the last decade, Greek yogurt, has a hidden dark side: acid whey. This low-pH, liquid by-product of yogurt production has been a growing problem for producers, as untreated disposal of the whey can pose environmental risks due to its high organic content and acidic odor.
With an estimated 3 million tons of acid whey generated in the United States each year, MIT researchers saw an opportunity to turn waste into a valuable resource for our food systems. Led by the Willard Henry Dow Professor in Chemical Engineering, Gregory Stephanopoulos, and Anthony J. Sinskey, professor of microbiology, the researchers are utilizing metabolic engineering to turn acid whey into carotenoids, the yellow and orange organic pigments found naturally in carrots, autumn leaves, and salmon. The team is hoping that these carotenoids can be utilized as food supplements or feed additives to make the most of what otherwise would have been wasted.
The MIT University Center for Exemplary Mentoring (UCEM) was founded in 2015 with an Alfred P. Sloan Foundation grant that centers on the recruitment, retention, and academic success of underrepresented doctoral students of color in five areas within the School of Engineering: the departments of Biological Engineering, Chemical Engineering, Electrical Engineering and Computer Science (EECS), and Mechanical Engineering, and the Institute for Medical Engineering and Science/Harvard-MIT Program in Health Sciences and Technology.
Promising PhD candidates are recruited for the UCEM program, and once enrolled receive financial support, mentorship, and professional development training as well as access to a broad and diverse professional network.
UCEM engages candidates through the entire process of “thinking about a PhD, getting in and doing a PhD, and then exiting the PhD,” says Leslie Kolodziejski, UCEM principal investigator and professor of electrical engineering in the Department of Electrical Engineering and Computer Science (EECS). Over the life of UCEM thus far, 83 scholars have been supported, with 58 scholars currently in the program.
A valued space at MIT
UCEM Scholar Jean Carlos Serrano Flores SM ’18 PhD ’21, a graduate of the Department of Mechanical Engineering, remembers first encountering UCEM at an MIT event before he had formally accepted his offer of admission.
“I had two students from [UCEM] directly talk to me prior to my decision,” Serrano Flores says. “UCEM made me feel much more secure that I was going someplace where I’d be welcome … and supported.”
Danielle Olson ’14, SM ’19, PhD ’21, also a UCEM scholar and a graduate of EECS, names the UCEM funding as an important part of her choice to attend MIT. She also says she appreciated the flexibility of the funding, which can be used to attend conferences and for research expenses that aren’t covered under lab funding.
But both Olson and Serrano Flores emphasize the value of the community and mentorship elements of the UCEM program. UCEM scholars get to know each other through seminars, one-on-one mentoring, conferences, and other UCEM events, such as regular scholar lunches and social activities.
Serrano Flores describes seminars ranging from developing different aspects of a scholar’s academic career to navigating interpersonal relationships with advisors and peers to managing racial biases that graduate students of color experience in academia.
“We're all from underrepresented backgrounds,” says Serrano Flores. “Being in a room with everybody that knows the experiences that you would go [through] along the way, and how tricky it is to really manage them in today's world, was really good.”
Olson feels similarly.
“I felt certainly firsthand in my first several years of graduate school that it's really easy to feel like you could fall between the cracks if you don't have support in place,” she says.
Olson says she was able to access one-on-one support through UCEM, in addition to finding space to talk about common challenges such as “imposter syndrome.”
“I wouldn't necessarily go to a network event in my community of research to talk about those things,” she says. “But [at UCEM] you have these spaces where you can not only have honest conversations … but also get strategies for addressing those things.”
Bianca Lepe is a fourth-year PhD candidate in the Department of Biological Engineering, a UCEM scholar, and the Graduate Student Council’s diversity, equity, and inclusion chair. She says that the networking opportunities at UCEM helped her get the “lay of the land,” so that she could more effectively participate in addressing academic racism at MIT.
“I am on the steering committee for the Strategic Action Plan that's being created around diversity, equity, and inclusion,” Lepe says. “A lot of the conversations and lived experiences that I've had [with] my fellow UCEM scholars has been really helpful in creating the priorities … under the official MIT strategic plan.”
Increasing representation in academia
In accordance with the Sloan Foundation objective to increase representation in academic faculty positions, Kolodziejski says that UCEM scholars attend the Institute on Teaching and Mentoring conference twice during their tenure at MIT. The conference is a three-day crash course in the fundamentals of being a professor, and covers topics ranging from mentoring students to writing proposals to presenting scholarship in job talks.
Kolodziejski says that participation in this conference can be very clarifying for students, and for those who want to pursue an academic path, it helps outfit them with the tools they need upon graduating. However, Kolodziejski makes it clear that UCEM scholars are under no obligation to take an academic path.
For more information about UCEM, or if you are interested in becoming a UCEM scholar, please visit the UCEM website.
When Obiageli “Oby” Nwodoh arrived at MIT, she already felt at home. A native of Bedford, Massachusetts, she was the daughter of Thomas Nwodoh, a former MIT Media Lab researcher; her first physics teacher at Bedford High was an MIT alum, Joe Zahka; and she had participated in the Minority Introduction to Engineering and Science (MITES) program.
At MIT, she studied physics, excelling in research, data analytics, machine learning, and computer programming. “I fell in love with physics because it touched reality,” says Nwodoh. “I had a way of explaining the world in numbers when words were challenging. It was learning a new language and using it to describe the world.”
But her interests began to drift toward economic justice. Away from home, she slowly began to understand the economic inequality her family had always experienced. Though unaware as a child, she later learned her family benefited from certain antipoverty initiatives. “It helped us immensely with paying bills, funding extracurricular programs, and more,” she says.
The final click for her came during an internship with a defense contractor, which didn’t align with her political views. She wanted to take her career in a more people-focused direction, so as a sophomore she enrolled in classes and extracurricular activities that stoked her interests in social justice, science activism, public policy, and equity and diversity.
That’s when it dawned on this physics student that she wanted to be a lawyer. And she was surprised at how well the two disparate fields complemented each other.
“The law requires the critical thinking offered by physics," she says. "With both, there is always the need to observe global issues, obtain necessary data, and use some framework to find a solution. I wanted to solve hard world problems, but those that helped people. The law was an outlet to solving major world issues that I experienced as a child. I believe that in America, we are so comfortable with poverty. The law has been a way to change that, along with many other issues.”
Nwodoh worked for several summers with Greater Boston Legal Services’ low-income tax clinic, on cases pertaining to taxes, immigration, and employment. “It was meaningful because I was solving so many issues my own single mother faced,” she says.
By her second summer with GBLS, she was helping people obtain pandemic stimulus checks. “What really opened up my eyes was how the pandemic affected low-income populations,” she says. “The stimulus provided money for people, but I didn’t hear enough about people who didn’t receive the checks, including immigrants and many people receiving federal assistance through welfare. There were a lot of forgotten people in the pandemic. My work at GBLS solidified my interest in the law and how much impact it could have.”
As a host of the Division of Student Life’s podcast “MIT Is…,” Nwodoh, along with her co-host Gabe Owens ’21, explored everything from MIT student life to global issues. She turned some of her research projects into podcasts about immigration, minority voter suppression, and the U.S. tax code; another podcast turned into a research project in which she examined how tax credits could be distributed in the state of New York to maximize payout. “I have dreams of starting my own show one day,” she says.
Nwodoh later worked with the Harvard College Black Pre-Law Association, before helping launch the MIT Pre-Law Society to connect students with relevant career opportunities, classes, and resources. She also was active with the National Society of Black Engineers, and was a peer career advisor at MIT’s Career Advising and Professional Development office. “So many face imposter syndrome, both academically and professionally. Being able to hype a student up and reassure them of their capabilities always filled me with joy,” she says.
Her physics education continued to play a role in her legal work. When she researched policing and voting, and steered various projects as a virtual racial justice data analyst intern with the NAACP Legal Defense and Education Fund, she relied on her skills as a scientist.
“I saw how there was a plethora of data in the world, but not as many people who knew how to use it. Though my experience was short, it inspired me to learn more about data analytics and how it could be useful in the law, ethics, and other fields.”
After graduating this spring with a major in physics and a minor in political science, she became a program paralegal at Ropes & Gray in Chicago, and is looking into law schools. She hopes to focus on technology, such as the impact that algorithmic bias has on vulnerable populations.
“I have cherished how being a physicist has prepared me to not be a physicist," she says. "Physics taught me the importance of problem-solving which could be applied in other areas of my life and interests. The technical skills could be used to ‘hack’ different parts of my world. Physics and the law come down to the same thing: interacting with the world in a profound way. MIT taught me that there is always space for my skills in every nook and cranny of the world’s biggest questions. I feel like my work as a physicist has prepared me to delve deeper into any issue, and holds me to an ethical standard of doing so.”
Neural networks (NNs) are increasingly being used to predict new materials, the rate and yield of chemical reactions, and drug-target interactions, among other applications. For such tasks, they are orders of magnitude faster than traditional methods such as quantum mechanical simulations.
The price for this agility, however, is reliability. Because machine learning models only interpolate, they may fail when used outside the domain of training data.
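A toy illustration (not from the paper) of why interpolation-only models fail out of domain: a one-parameter linear model fit to sin(x) sampled on [-1, 1] is accurate inside that interval but badly wrong far outside it.

```python
import math

# Hypothetical toy model: fit y = w * x (least squares) to sin(x) on [-1, 1]
xs = [i / 10 for i in range(-10, 11)]
ys = [math.sin(x) for x in xs]
w = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def err(x):
    """Absolute error of the fitted model against the true function."""
    return abs(w * x - math.sin(x))

print(err(0.5))  # small: inside the training domain
print(err(4.0))  # large: extrapolation far outside it
```

Any flexible model trained only on the narrow interval shows the same behavior; the linear fit just makes the failure easy to see.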
But the part that worried Rafael Gómez-Bombarelli, the Jeffrey Cheah Career Development Professor in the MIT Department of Materials Science and Engineering, and graduate students Daniel Schwalbe-Koda and Aik Rui Tan was that establishing the limits of these machine learning (ML) models is tedious and labor-intensive.
This is particularly true for predicting “potential energy surfaces” (PES), or the map of a molecule's energy in all its configurations. These surfaces encode the complexities of a molecule into flatlands, valleys, peaks, troughs, and ravines. The most stable configurations of a system are usually in the deep pits — quantum mechanical chasms from which atoms and molecules typically do not escape.
In a recent Nature Communications paper, the research team presented a way to demarcate the “safe zone” of a neural network by using “adversarial attacks.” Adversarial attacks have been studied for other classes of problems, such as image classification, but this is the first time that they are being used to sample molecular geometries in a PES.
“People have been using uncertainty for active learning for years in ML potentials. The key difference is that they need to run the full ML simulation and evaluate if the NN was reliable, and if it wasn't, acquire more data, retrain and re-simulate. Meaning that it takes a long time to nail down the right model, and one has to run the ML simulation many times,” explains Gómez-Bombarelli.
The Gómez-Bombarelli lab at MIT works on a synergistic combination of first-principles simulation and machine learning that greatly speeds up this process. The actual simulations are run for only a small fraction of the molecules of interest, and all those data are fed into a neural network that learns to predict the same properties for the rest. They have successfully demonstrated these methods for a growing class of novel materials that includes catalysts for producing hydrogen from water, cheaper polymer electrolytes for electric vehicles, zeolites for molecular sieving, magnetic materials, and more.
The challenge, however, is that these neural networks are only as smart as the data they are trained on. Considering the PES map, 99 percent of the data may fall into one pit, totally missing valleys that are of more interest.
Such wrong predictions can have disastrous consequences — think of a self-driving car that fails to identify a person crossing the street.
One way to find out the uncertainty of a model is to run the same data through multiple versions of it.
For this project, the researchers had multiple neural networks predict the potential energy surface from the same data. When the network is fairly sure of its prediction, the variation between the outputs of different networks is minimal and the surfaces largely converge. When the network is uncertain, the predictions of different models vary widely, producing a range of outputs, any of which could be the correct surface.
The spread in the predictions of a “committee of neural networks” is the “uncertainty” at that point. A good model should not just indicate the best prediction, but also indicate the uncertainty about each of these predictions. It’s like the neural network saying, “this property for material A will have a value of X, and I’m highly confident about it.”
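The committee idea can be sketched in a few lines of pure Python. Everything here is an invented stand-in: a toy quadratic "energy" curve and small per-member perturbations replace trained networks, but the key behavior is the same: members agree near the training region and diverge away from it.

```python
import random
import statistics

# Hypothetical committee: each "member" is a noisy surrogate for a true
# energy curve E(x) = x**2 (a 1-D stand-in for one slice of a PES).
def make_member(seed):
    rng = random.Random(seed)
    a = 1.0 + rng.uniform(-0.05, 0.05)  # small per-member variation
    b = rng.uniform(-0.5, 0.5)          # cubic term dominates far from x = 0
    return lambda x: a * x**2 + b * x**3

committee = [make_member(s) for s in range(5)]

def predict_with_uncertainty(x):
    """Committee mean is the prediction; committee spread is the uncertainty."""
    preds = [m(x) for m in committee]
    return statistics.mean(preds), statistics.stdev(preds)

mean_near, std_near = predict_with_uncertainty(0.1)  # near the "training" region
mean_far, std_far = predict_with_uncertainty(5.0)    # far from it
print(std_near, std_far)  # members agree near 0 and diverge far away
```

In the real setting, each committee member is a separately trained neural network potential, and x is a full molecular geometry rather than a single coordinate.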
This could have been an elegant solution but for the sheer scale of the combinatorial space. “Each simulation (which is ground feed for the neural network) may take from tens to thousands of CPU hours,” explains Schwalbe-Koda. For the results to be meaningful, multiple models must be run over a sufficient number of points in the PES, an extremely time-consuming process.
Instead, the new approach only samples data points from regions of low prediction confidence, corresponding to specific geometries of a molecule. These molecules are then stretched or deformed slightly so that the uncertainty of the neural network committee is maximized. Additional data are computed for these molecules through simulations and then added to the initial training pool.
The neural networks are trained again, and a new set of uncertainties are calculated. This process is repeated until the uncertainty associated with various points on the surface becomes well-defined and cannot be decreased any further.
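A minimal sketch of that adversarial loop, with a hypothetical 1-D committee standing in for trained NN potentials, and a greedy finite-step search standing in for the paper's gradient-based attack:

```python
import statistics

# Hypothetical 1-D stand-in for a molecular coordinate: three slightly
# different "fits" whose disagreement (stdev) plays the role of the
# committee uncertainty being maximized.
members = [
    lambda x: x**2,
    lambda x: x**2 + 0.1 * x**3,
    lambda x: x**2 - 0.1 * x**3,
]

def uncertainty(x):
    return statistics.stdev(m(x) for m in members)

def adversarial_sample(x0, step=0.05, iters=200):
    """Greedy hill-climb: nudge the geometry in whichever direction
    increases committee disagreement, stopping when neither does."""
    x = x0
    for _ in range(iters):
        up, down = uncertainty(x + step), uncertainty(x - step)
        if max(up, down) <= uncertainty(x):
            break
        x = x + step if up > down else x - step
    return x

x_adv = adversarial_sample(x0=0.5)
print(x_adv, uncertainty(x_adv))  # ends far more uncertain than the start
```

In the actual method, the geometry found this way would then be evaluated with a first-principles simulation, added to the training pool, and the committee retrained, repeating until the uncertainty stops shrinking.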
Gómez-Bombarelli explains, “We aspire to have a model that is perfect in the regions we care about (i.e., the ones that the simulation will visit) without having had to run the full ML simulation, by making sure that we make it very good in high-likelihood regions where it isn't.”
The paper presents several examples of this approach, including predicting complex supramolecular interactions in zeolites. These materials are cavernous crystals that act as molecular sieves with high shape selectivity. They find applications in catalysis, gas separation, and ion exchange, among others.
Because performing simulations of large zeolite structures is very costly, the researchers show how their method can provide significant savings in computational simulations. They used more than 15,000 examples to train a neural network to predict the potential energy surfaces for these systems. Despite the large cost required to generate the dataset, the final results are mediocre, with only around 80 percent of the neural network-based simulations being successful. To improve the performance of the model using traditional active learning methods, the researchers calculated an additional 5,000 data points, which improved the performance of the neural network potentials to 92 percent.
However, when the adversarial approach is used to retrain the neural networks, the authors saw a performance jump to 97 percent using only 500 extra points. That’s a remarkable result, the researchers say, especially considering that each of these extra points takes hundreds of CPU hours.
This could be the most realistic method to probe the limits of models that researchers use to predict the behavior of materials and the progress of chemical reactions.
In the face of grave concerns about misinformation, social media networks and news organizations often employ fact-checkers to sort the real from the false. But fact-checkers can only assess a small portion of the stories floating around online.
A new study by MIT researchers suggests an alternate approach: Crowdsourced accuracy judgments from groups of normal readers can be virtually as effective as the work of professional fact-checkers.
“One problem with fact-checking is that there is just way too much content for professional fact-checkers to be able to cover, especially within a reasonable time frame,” says Jennifer Allen, a PhD student at the MIT Sloan School of Management and co-author of a newly published paper detailing the study.
But the current study, examining over 200 news stories that Facebook’s algorithms had flagged for further scrutiny, may have found a way to address that problem, by using relatively small, politically balanced groups of lay readers to evaluate the headlines and lead sentences of news stories.
“We found it to be encouraging,” says Allen. “The average rating of a crowd of 10 to 15 people correlated as well with the fact-checkers’ judgments as the fact-checkers correlated with each other. This helps with the scalability problem because these raters were regular people without fact-checking training, and they just read the headlines and lead sentences without spending the time to do any research.”
That means the crowdsourcing method could be deployed widely — and cheaply. The study estimates that the cost of having readers evaluate news this way is about $0.90 per story.
“There’s no one thing that solves the problem of false news online,” says David Rand, a professor at MIT Sloan and senior co-author of the study. “But we’re working to add promising approaches to the anti-misinformation tool kit.”
Intriguingly, when the regular readers recruited for the study were sorted into groups with the same number of Democrats and Republicans, their average ratings were highly correlated with the professional fact-checkers’ ratings — and with at least a double-digit number of readers involved, the crowd’s ratings correlated as strongly with the fact-checkers as the fact-checkers did with each other.
“These readers weren’t trained in fact-checking, and they were only reading the headlines and lead sentences, and even so they were able to match the performance of the fact-checkers,” Allen says.
While it might seem initially surprising that a crowd of 12 to 20 readers could match the performance of professional fact-checkers, this is another example of a classic phenomenon: the wisdom of crowds. Across a wide range of applications, groups of laypeople have been found to match or exceed the performance of expert judgments. The current study shows this can occur even in the highly polarizing context of misinformation identification.
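The averaging effect behind the wisdom of crowds can be simulated directly; the noise levels, rater counts, and story scores below are hypothetical, not the study's data.

```python
import random
import statistics

rng = random.Random(0)

# Hypothetical model: each story has a latent accuracy score; fact-checkers
# and lay readers both observe it with noise, lay readers more noisily.
N_STORIES = 200
truth = [rng.gauss(0, 1) for _ in range(N_STORIES)]

def avg_ratings(noise, n_raters):
    """Per-story average of n_raters independent noisy reads."""
    return [statistics.mean(t + rng.gauss(0, noise) for _ in range(n_raters))
            for t in truth]

def pearson(a, b):
    ma, mb = statistics.mean(a), statistics.mean(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

fact_checkers = avg_ratings(noise=0.5, n_raters=3)  # experts: low noise
solo_reader = avg_ratings(noise=2.0, n_raters=1)    # one lay reader
crowd_of_15 = avg_ratings(noise=2.0, n_raters=15)   # averaged lay crowd

print(pearson(solo_reader, fact_checkers))  # weak correlation
print(pearson(crowd_of_15, fact_checkers))  # much stronger
```

Averaging shrinks the crowd's independent noise by roughly the square root of the group size, which is why a modest, balanced crowd can track expert ratings even though each individual reader is far noisier than a fact-checker.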
The experiment’s participants also took a political knowledge test and a test of their tendency to think analytically. Overall, the ratings of people who were better informed about civic issues and engaged in more analytical thinking were more closely aligned with the fact-checkers.
“People that engaged in more reasoning and were more knowledgeable agreed more with the fact-checkers,” Rand says. “And that was true regardless of whether they were Democrats or Republicans.”
The scholars say the finding could be applied in many ways — and note that some social media behemoths are actively trying to make crowdsourcing work. Facebook has a program, called Community Review, where laypeople are hired to assess news content; Twitter has its own project, Birdwatch, soliciting reader input about the veracity of tweets. The wisdom of crowds can be used either to help apply public-facing labels to content, or to inform ranking algorithms and what content people are shown in the first place.
To be sure, the authors note, any organization using crowdsourcing needs to find a good mechanism for participation by readers. If participation is open to everyone, it is possible the crowdsourcing process could be unfairly influenced by partisans.
“We haven’t yet tested this in an environment where anyone can opt in,” Allen notes. “Platforms shouldn’t necessarily expect that other crowdsourcing strategies would produce equally positive results.”
On the other hand, Rand says, news and social media organizations would have to find ways to get large enough groups of people actively evaluating news items in order to make the crowdsourcing work.
“Most people don’t care about politics enough to try to influence things,” Rand says. “But the concern is that if you let people rate any content they want, then the only people doing it will be the ones who want to game the system. Still, to me, a bigger concern than being swamped by zealots is the problem that no one would do it. It is a classic public goods problem: Society at large benefits from people identifying misinformation, but why should users bother to invest the time and effort to give ratings?”
The study was supported, in part, by the William and Flora Hewlett Foundation, the John Templeton Foundation, and the Reset project of Omidyar Group’s Luminate Project Limited. Allen is a former Facebook employee who still has a financial interest in Facebook; other studies by Rand are supported, in part, by Google.
As the sun broke through the clouds on a breezy Monday morning, first-year students and their families gathered on Kresge Oval for MIT’s Convocation, the Institute’s annual welcome to the incoming class.
The ceremony marked one of the first major events MIT has hosted on campus since the start of the Covid-19 pandemic. And while some aspects of the occasion were shaped by the ongoing pandemic — notably, masks were required of all who attended — the message to the 1,184 members of MIT’s Class of 2025 was one of hope, connection, and gratitude.
“Whether you know it or not, along with your suitcases, your boxes, your duffel bags, and your satchels, you also brought a gift to our community,” said President L. Rafael Reif in welcoming the incoming class. “You brought to us a gift of your talent, your energy, your curiosity, your creativity, and your drive. And you cannot imagine how grateful we are for that.”
As guests settled into their seats under a large and airy tent, the event opened with “Diary of a Pandemic Year,” a virtual performance that was written, composed, produced, and performed by hundreds of MIT musicians and community members.
“It is a homemade MIT masterpiece,” Reif said of the composition. “It offers a marvelous taste of so many things we love about MIT: a wonderful mix of people and backgrounds, the pleasure we take in making things together, and the energy and creative aspiration of everyone we meet.”
“Your new home”
Reif recalled first arriving at MIT in 1980 as an assistant professor of electrical engineering and computer science — “Which is course…?” he asked of the new students, to which they confidently shouted back, “Six!”
Having grown up in Caracas, Venezuela, with an accent that was shaped 2,000 miles south of Cambridge, Reif was anxious about fitting in at MIT. But he quickly found that, like him, many at MIT “came from somewhere else, and they cared about helping each other, and helping society.”
Joining Reif onstage were several senior members of the MIT administration: Provost Martin Schmidt, Chancellor Melissa Nobles, Vice Chancellor for Undergraduate and Graduate Education Ian Waitz, and Vice Chancellor and Dean for Student Life Suzy Nelson. Reif briefly introduced each of them, noting that they represent essential pieces of a rich support system available to MIT students.
“You’re surrounded by a community that cares about you,” he said. “All of us are dedicated to your success, and we believe in you.”
Moments to meander
Reif then introduced three members of the MIT faculty, who also happen to be MIT alumni: Shankar Raman ’86, section head and professor of literature; Evelyn Wang ’00, the Ford Professor of Engineering and head of the Department of Mechanical Engineering; and Steven D. Eppinger ’83, SM ’84, ScD ’88, the General Motors Leaders for Global Operations Professor of Management at MIT’s Sloan School of Management.
Raman, Wang, and Eppinger each spoke about living and learning at the Institute. For Raman, the MIT experience started out predictably enough. He recalled arriving as an undergraduate from India, “determined to major in Course 6 and emerge an electrical engineer.”
He also loved literature and philosophy, and on his way toward an engineering degree he sampled courses in German, poetry, and Western philosophy. After signing up for a filmmaking class, he stumbled upon MIT’s Department of Architecture, where the course was taught at the time. This encounter sprouted a new path, and Raman went on to earn degrees in both electrical engineering and architecture.
“Whatever your major, remember these four years are probably the only ones in your life where you can meander — where you can decide to not follow the main avenue, but to follow oblique paths and detours, to discover new areas of study,” he said.
His career continued to take unexpected turns. While pursuing a master’s in electrical engineering from the University of California at Berkeley, he realized that “my heart wasn’t fully in it.” So, he switched fields entirely, earning a PhD in literature from Stanford University. In 1995, he returned to MIT as a faculty member in the MIT Literature Section, and today serves as its head, teaching classes in Shakespeare, postcolonial fiction, and perspectives on artificial intelligence.
“I had come to MIT to become an electrical engineer, and I had certainly learned that,” Raman said. “But MIT also taught me how not to be one. And for that lesson, I will be forever grateful, and I hope it’s one you all will experience.”
“You’ve got this”
As a first-year herself, Evelyn Wang recalled setting out with energy, ready to “bring my ‘A’ game.” But her older brother, who also had attended MIT, warned her about “the wicked-hard problem sets,” and that she might not always get the A’s she was accustomed to in high school.
“Getting straight A’s is really, really tough,” Wang said. “You’re probably going to get a B, and maybe even a C or D, and that’s okay. I got an F on my first physics exam. Grades are only one way to measure what you’ll learn here.”
She offered tips for students to make the most of their time at MIT. The first is to be resilient and keep from dwelling on stress.
“Take breaks when you need to. Walk along Memorial Drive. Take a sailing class on the Charles. Tinker with a pet robot. Then get back to the problem sets,” Wang said. “You’ve got this.”
She also encouraged students to build a community — of friends, professors, and loved ones back home — who can support, advise, and ground them as they navigate the next four years.
Wang also reminded students to stay healthy and pace themselves — advice she learned the hard way as an undergraduate. During a particularly grueling week, she recalled getting very little sleep while attempting to finish multiple class projects. She and her friends were fueled by cans of Mountain Dew, which they stacked at the end of the ordeal into a massive “victory tower.”
“Afterward, I slept for 36 hours straight,” Wang said. “Even when you are young, your body will fall apart if you do that every week. Please hydrate, and maybe drink less Mountain Dew than I did.”
“You are not alone”
As a newly arrived first-year at a similar MIT welcome event, Steven Eppinger remembered being given an obvious yet unsettling reality check.
“A speaker warned us, ‘half of you will be at the bottom half of the class,’” Eppinger said, drawing laughter from the crowd after a beat. “That statistical reality really struck me. Here we were, all these highly accomplished students, being told we may be average or worse. How could I process that?”
He did so by being open to imperfection. He came to MIT on a chemistry scholarship and had been the top chemistry student in both his high school and his state. At MIT, though, he quickly learned to redefine his expectations. “I was not devastated to score poorly on several chemistry exams in my first year,” he said.
Instead, he expanded his interests, by pledging a fraternity, joining the crew team, and participating in design challenges, a talent show, and even some campus hacks — all of which gave him a sense of community and helped to put his heavy courseload into perspective.
He encouraged the Class of 2025 to explore, and to reach out — to study groups, teaching assistants, advisors, and MIT’s Student Support Services — for help along the way.
“You are not alone in this journey,” said Eppinger, closing with a hopeful vision for the future:
“All of you are going to play a role in changing the world, through science and engineering, and a range of humanitarian endeavors,” he said. “You are going to be people of great consequence, who will do great things.”
Would you like to live longer? It turns out that where you live, not just how you live, can make a big difference.
That’s the finding of an innovative study co-authored by an MIT economist, which examines senior citizens across the U.S. and concludes that some locations enhance longevity more than others, potentially for multiple reasons.
The results show that when a 65-year-old moves from a metro area in the 10th percentile, in terms of how much those areas enhance longevity, to a metro area in the 90th percentile, that person’s life expectancy increases by 1.1 years. That is a notable boost, given that mean life expectancy for 65-year-olds in the U.S. is 83.3 years.
“There’s a substantively important causal effect of where you live as an elderly adult on mortality and life expectancy across the United States,” says Amy Finkelstein, a professor in MIT’s Department of Economics and co-author of a newly published paper detailing the findings.
Researchers have long observed significant regional variation in life expectancy in the U.S., and often attributed it to “health capital” — tendencies toward obesity, smoking, and related behavioral factors in the regional populations. But by analyzing the impact of moving, the current study can isolate and quantify the effect that the location itself has on residents.
As such, the research delivers important new information about large-scale drivers of U.S. health outcomes — and raises the question of what it is about different places that affects the elderly’s life expectancy. One clear possibility is the nature of available medical care. Other possible drivers of longevity include climate, pollution, crime, traffic safety, and more.
“We wanted to separate out the role of people’s prior experiences and behaviors — or health capital — from the role of place or environment,” Finkelstein says.
The paper, “Place-Based Drivers of Mortality: Evidence from Migration,” is published in the August issue of the American Economic Review. The co-authors are Finkelstein, the John and Jennie S. MacDonald Professor of Economics at MIT, and Matthew Gentzkow and Heidi Williams, who are both professors of economics at Stanford University.
Comparing movers to see how place matters
To conduct the study, Finkelstein, Gentzkow, and Williams analyzed Medicare records from 1999 to 2014, focusing on U.S. residents between the ages of 65 and 99. Ultimately the research team studied 6.3 million Medicare beneficiaries. About 2 million of those moved from one U.S. “commuting zone” to another, and the rest were a random 10 percent sample of people who had not moved over the 15-year study period. (The U.S. Census Bureau defines about 700 commuting zones nationally.)
A central element of the study involves seeing how different people who were originally from the same locations fared when moving to different destinations. In effect, says Finkelstein, “The idea is to take two elderly people from a given origin, say, Boston. One moves to low-mortality Minneapolis, one moves to high-mortality Houston. We then compare how long each lives after they move.”
Different people have different health profiles before they move, of course. But Medicare records include detailed claims data, so the researchers applied records of 27 different illnesses and conditions — ranging from lung cancer and diabetes to depression — to a standard mortality risk model, to categorize the overall health of seniors when they move. Using these “very, very rich pre-move measures of their health,” Finkelstein notes, the researchers tried to account for pre-existing health levels of seniors from the same location who moved to different places.
Still, even assessing people by 27 measures does not completely describe their health, so Finkelstein, Gentzkow, and Williams also estimated what fraction of people’s health conditions they had not observed — essentially by calibrating the observed health of seniors against health capital levels in places they were moving from. They then consider how observed health varies across individuals from the same location moving to different destinations and, assuming that differences in unobserved health — such as physical mobility — vary in the same way as observed differences in health, they adjust their estimates accordingly.
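The movers design described above can be sketched in miniature. This is an illustrative toy, not the authors' code: the movers, the single health index (standing in for the 27 claims-based measures), and the coefficient `beta` are all hypothetical.

```python
# Toy illustration of the movers design: contrast post-move longevity for
# seniors leaving the same origin for different destinations, netting out
# differences in observed pre-move health between the two groups.
from statistics import mean

# Hypothetical movers:
# (origin, destination, pre-move health index, years lived after move)
movers = [
    ("Boston", "Minneapolis", 0.9, 19.5),
    ("Boston", "Minneapolis", 0.7, 18.0),
    ("Boston", "Houston",     0.8, 17.5),
    ("Boston", "Houston",     0.6, 16.0),
]

def place_effect(dest_a, dest_b, origin="Boston"):
    """Naive destination contrast among movers from one origin.

    Subtracts the part of the raw longevity gap attributable to the
    groups' differing pre-move health, assuming each unit of the health
    index is worth `beta` years (beta would be estimated from the
    mortality-risk model in the real study).
    """
    group_a = [m for m in movers if m[0] == origin and m[1] == dest_a]
    group_b = [m for m in movers if m[0] == origin and m[1] == dest_b]
    raw_gap = mean(m[3] for m in group_a) - mean(m[3] for m in group_b)
    beta = 2.0  # hypothetical years-per-unit-of-health coefficient
    health_gap = mean(m[2] for m in group_a) - mean(m[2] for m in group_b)
    return raw_gap - beta * health_gap

print(round(place_effect("Minneapolis", "Houston"), 2))  # prints 1.8
```

Here the raw 2.0-year gap shrinks to 1.8 years once the Minneapolis group's slightly better pre-move health is accounted for; the paper's further step is to scale this adjustment up to cover unobserved health as well.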
All told, the study found that many urban areas on the East and West Coasts — including New York City, San Francisco, and Miami — have positive effects on longevity for seniors moving there. Some Midwestern metro areas, including Chicago, also score well.
By contrast, a large swath of the deep South has negative effects on longevity for seniors moving there, including much of Alabama, Arkansas, Louisiana, and northern Florida. Much of the Southwest, including parts of Texas, Oklahoma, New Mexico, and Arizona, fares similarly poorly.
The scholars also estimate that health capital accounts for about 70 percent of the difference in longevity across areas of the U.S., and that location effects account for about 15 percent of the variation.
“Yes, health capital is important, but yes, place effects also matter,” Finkelstein says.
Other leading experts in health economics say they are impressed by the study. Jonathan Skinner, the James O. Freeman Presidential Professor of Economics, Emeritus, at Dartmouth College, says the scholars “have provided a critical insight” into the question of place effects “by considering older people who move from one place to another, thus allowing the researchers to cleanly identify the pure effect of the new location on individual health — an effect that is often different from the health of long-term residents. This is an important study that will surely be cited and will influence health policy in coming years.”
The Charlotte Effect: What makes a difference?
Indeed, the significance of place effects on life expectancy is also evident in another pattern the study found. Some locations — such as Charlotte, North Carolina — have a positive effect on longevity but still have low overall life expectancy, while other places — such as Santa Fe, New Mexico — have high overall life expectancy, but a below-average effect on the longevity of seniors who move there.
Again, the life expectancy of an area’s population is not the same thing as that location’s effect on longevity. In places where, say, smoking is highly prevalent, population-wide longevity might be subpar, but other factors might make it a place where people of average health will live longer. The question is why.
“Our [hard] evidence is about the role of place,” Finkelstein says, while noting that the next logical step in this vein of research is to look for the specific factors at work. “We know something about Charlotte, North Carolina, makes a difference, but we don’t yet know what.”
With that in mind, Finkelstein, Gentzkow, and Williams, along with other colleagues, are working on a pair of new studies about health care practices to see what impact place-based differences may have; one study focuses on doctors, and the other looks at the prescription opioid epidemic.
In the background of this research is a high-profile academic and policy discussion about the impact of health care utilization. One perspective, associated with the Dartmouth Atlas of Health Care project, suggests that the large regional differences in health care use it has documented have little impact on mortality. But the current study, by quantifying the variable impact of place, suggests that differences in health care utilization may have a larger impact on mortality than has yet been identified.
For her part, Finkelstein says she would welcome further studies digging into health care use or any other factor that might explain why different places have different effects on life expectancy; the key is uncovering more hard evidence, wherever it leads.
“Differences in health care across places are large and potentially important,” Finkelstein says. “But there are also differences in pollution, weather, [and] other aspects. … What we need to do now is get inside the black box of ‘the place’ and figure out what it is about them that matters for longevity.”
The study was supported, in part, by the National Institute on Aging, the National Science Foundation, and the Stanford Institute for Economic Policy Research.
As the United States races to achieve its goal of zero-carbon electricity generation by 2035, energy providers are swiftly ramping up renewable resources such as solar and wind. But because these technologies churn out electrons only when the sun shines and the wind blows, they need backup from other energy sources, especially during seasons of high electric demand. Currently, plants burning fossil fuels, primarily natural gas, fill in the gaps.
“As we move to more and more renewable penetration, this intermittency will make a greater impact on the electric power system,” says Emre Gençer, a research scientist at the MIT Energy Initiative (MITEI). That’s because grid operators will increasingly resort to fossil-fuel-based “peaker” plants that compensate for the intermittency of the variable renewable energy (VRE) sources of sun and wind. “If we’re to achieve zero-carbon electricity, we must replace all greenhouse gas-emitting sources,” Gençer says.
Low- and zero-carbon alternatives to greenhouse-gas-emitting peaker plants are in development, such as arrays of lithium-ion batteries and hydrogen power generation. But each of these evolving technologies comes with its own set of advantages and constraints, and it has proven difficult to frame the debate about these options in a way that’s useful for policymakers, investors, and utilities engaged in the clean energy transition.
Now, Gençer and Drake D. Hernandez SM ’21 have come up with a model that makes it possible to pin down the pros and cons of these peaker-plant alternatives with greater precision. Their hybrid technological and economic analysis, based on a detailed inventory of California’s power system, was published online last month in Applied Energy. While their work focuses on the most cost-effective solutions for replacing peaker power plants, it also contains insights intended to contribute to the larger conversation about transforming energy systems.
“Our study’s essential takeaway is that hydrogen-fired power generation can be the more economical option when compared to lithium-ion batteries — even today, when the costs of hydrogen production, transmission, and storage are very high,” says Hernandez, who worked on the study while a graduate research assistant for MITEI. Adds Gençer, “If there is a place for hydrogen in the cases we analyzed, that suggests there is a promising role for hydrogen to play in the energy transition.”
Adding up the costs
California serves as a prime example of a swiftly shifting power system. The state draws more than 20 percent of its electricity from solar and approximately 7 percent from wind, with more VRE coming online rapidly. This means its peaker plants already play a pivotal role, coming online each evening when the sun goes down or when events such as heat waves drive up electricity use for days at a time.
“We looked at all the peaker plants in California,” recounts Gençer. “We wanted to know the cost of electricity if we replaced them with hydrogen-fired turbines or with lithium-ion batteries.” The researchers used a core metric called the levelized cost of electricity (LCOE) as a way of comparing the costs of different technologies to each other. LCOE measures the average total cost of building and operating a particular energy-generating asset per unit of total electricity generated over the hypothetical lifetime of that asset.
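The LCOE calculation can be made concrete with a small sketch. This uses the standard discounted-cost formulation; the plant figures below are hypothetical, not numbers from the study.

```python
# Minimal LCOE sketch: total discounted costs divided by total discounted
# electricity generated over the asset's assumed lifetime.
def lcoe(capex, annual_opex, annual_mwh, lifetime_years, discount_rate):
    """Levelized cost of electricity in $/MWh.

    capex: up-front capital cost ($), incurred in year 0
    annual_opex: fixed operating cost per year ($)
    annual_mwh: electricity generated per year (MWh)
    """
    disc_costs = capex
    disc_energy = 0.0
    for year in range(1, lifetime_years + 1):
        d = (1 + discount_rate) ** year
        disc_costs += annual_opex / d
        disc_energy += annual_mwh / d
    return disc_costs / disc_energy

# Hypothetical peaker: $100M capex, $5M/yr opex, 100 MW running at a
# 15 percent capacity factor (0.15 * 100 MW * 8,760 h), 20-year life,
# 7 percent discount rate.
print(round(lcoe(100e6, 5e6, 0.15 * 100 * 8760, 20, 0.07), 1))
```

Because a peaker's generation is concentrated in relatively few hours per year, the denominator is small, which is why peaking assets tend to have high LCOE values compared to baseload plants of similar cost.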
Selecting 2019 as their base study year, the team looked at the costs of running natural gas-fired peaker plants, which they defined as plants operating 15 percent of the year in response to gaps in intermittent renewable electricity. In addition, they determined the amount of carbon dioxide released by these plants and the expense of abating these emissions. Much of this information was publicly available.
Coming up with prices for replacing peaker plants with massive arrays of lithium-ion batteries was also relatively straightforward: “There are no technical limitations to lithium-ion, so you can build as many as you want; but they are super expensive in terms of their footprint for energy storage and the mining required to manufacture them,” says Gençer.
But then came the hard part: nailing down the costs of hydrogen-fired electricity generation. “The most difficult thing is finding cost assumptions for new technologies,” says Hernandez. “You can’t do this through a literature review, so we had many conversations with equipment manufacturers and plant operators.”
The team considered two different forms of hydrogen fuel to replace natural gas, one produced through electrolyzer facilities that convert water and electricity into hydrogen, and another that reforms natural gas, yielding hydrogen and carbon waste that can be captured to reduce emissions. They also ran the numbers on retrofitting natural gas plants to burn hydrogen as opposed to building entirely new facilities. Their model includes identification of likely locations throughout the state and expenses involved in constructing these facilities.
The researchers spent months compiling a giant dataset before setting out on the task of analysis. The results from their modeling were clear: “Hydrogen can be a more cost-effective alternative to lithium-ion batteries for peaking operations on a power grid,” says Hernandez. In addition, notes Gençer, “While certain technologies worked better in particular locations, we found that on average, reformed hydrogen, rather than electrolytic hydrogen, turned out to be the cheapest option for replacing peaker plants.”
A tool for energy investors
When he began this project, Gençer admits he “wasn’t hopeful” about hydrogen replacing natural gas in peaker plants. “It was kind of shocking to see in our different scenarios that there was a place for hydrogen.” That’s because the overall price tag for converting a fossil-fuel based plant to one based on hydrogen is very high, and such conversions likely won’t take place until more sectors of the economy embrace hydrogen, whether as a fuel for transportation or for varied manufacturing and industrial purposes.
A nascent hydrogen production infrastructure does exist, mainly in the production of ammonia for fertilizer. But enormous investments will be necessary to expand this framework to meet grid-scale needs, driven by purposeful incentives. “With any of the climate solutions proposed today, we will need a carbon tax or carbon pricing; otherwise nobody will switch to new technologies,” says Gençer.
The researchers believe studies like theirs could help key energy stakeholders make better-informed decisions. To that end, they have integrated their analysis into SESAME, a life cycle and techno-economic assessment tool for a range of energy systems that was developed by MIT researchers. Users can leverage this sophisticated modeling environment to compare costs of energy storage and emissions from different technologies, for instance, or to determine whether it is cost-efficient to replace a natural gas-powered plant with one powered by hydrogen.
“As utilities, industry, and investors look to decarbonize and achieve zero-emissions targets, they have to weigh the costs of investing in low-carbon technologies today against the potential impacts of climate change moving forward,” says Hernandez, who is currently a senior associate in the energy practice at Charles River Associates. Hydrogen, he believes, will become increasingly cost-competitive as its production costs decline and markets expand.
A study group member of MITEI’s soon-to-be published Future of Storage study, Gençer knows that hydrogen alone will not usher in a zero-carbon future. But, he says, “Our research shows we need to seriously consider hydrogen in the energy transition, start thinking about key areas where hydrogen should be used, and start making the massive investments necessary.”
Funding for this research was provided by MITEI’s Low-Carbon Energy Centers and Future of Storage study.
On July 29, MIT Provost Martin A. Schmidt and Associate Provost Krystyn Van Vliet attended a groundbreaking ceremony to celebrate the construction of Landmark Bio, a new 40,000-square-foot biopharma manufacturing facility at The Arsenal on the Charles in Watertown, Massachusetts. Jongyoon Han, MIT professor of electrical engineering and biological engineering, and Richard D. Braatz, the Edwin R. Gilliland Professor, faculty research officer, and professor of chemical engineering at MIT, also attended the event.
Landmark Bio emerged from a public-private partnership formed in 2019 among MIT and four founding members: Harvard University, FUJIFILM Diosynth Biotechnologies, Cytiva, and Alexandria Real Estate Equities.
The facility — which will house manufacturing and development spaces under the same roof — is open to MIT faculty and their research groups working to advance cell-based and RNA-based therapies for challenging diseases such as cancer, as well as for regenerative medicine. Landmark Bio’s mission is to advance manufacturing technologies and remove barriers to their adoption, while serving as a forum for workforce development available to workers in Massachusetts and beyond.
“With this collaboration among academia, industry, and government, we have a collective opportunity to advance the technologies that manufacture and distribute new and next-generation medicines by incubating and analyzing the data behind those technologies,” says Schmidt. “MIT is proud to play a role in the development of new technologies to make, measure, and analyze new medicines better than we can today.”
For MIT, the partnership will connect to the Institute’s ongoing advanced manufacturing workforce development efforts, including a hands-on cell therapy manufacturing offering piloted this past spring at MIT. Van Vliet, who represents MIT on the Landmark Bio project, led the development of a course to train workers pursuing careers in advanced biomanufacturing in Greater Boston-Cambridge and beyond. More than 2,000 students from the United States and 90 other countries enrolled in the first online offering.
“This includes MIT’s Center for Biomedical Innovation as well as online classes and hands-on training to enable reskilling and upskilling of tomorrow’s manufacturing workforce. Many roles are needed to make these next-generation medicines reach more patients,” says Van Vliet.
A public-private collaboration, Landmark Bio partner members include Beth Israel Deaconess Medical Center, Boston Children’s Hospital, Brigham and Women’s Hospital, the Dana-Farber Cancer Institute, Massachusetts General Hospital, and the Massachusetts Life Sciences Center.
“The partnership promotes cross-sector collaborations to accelerate innovation and strengthen the leading position of the Commonwealth in the life sciences,” says Ran Zheng, Landmark Bio’s chief executive officer. “One of the initiatives we are collaborating on is workforce development. With MIT’s world-class approach to teaching and learning and Landmark Bio’s cutting-edge process development and biomanufacturing facility, we believe we can provide an immersive learning experience to a diverse pool of talent, help address the workforce shortage in novel modality development and manufacturing, and enable the growth of the life sciences sector.”
Each of the founding members invested in the construction and operations of the facility, which is slated to open in 2022.
MIT spinoff OpenSpace invented automated 360-degree video jobsite capture and mapping. “It’s not exactly an amazing observation,” says CEO Jeevan Kalanithi, “but a picture really is worth a thousand words.”
In the world of real estate development, visual documentation of construction projects is critical. It aids in dispute resolution, prevents mistakes from being compounded, and allows for knowledge capture in case of change orders. Builders are often contractually obligated to document progress. Usually, this means hiring someone to walk the site and take photos of key areas once a month. These photos are then slapped in a binder or uploaded into a cloud storage service.
But the old way is akin to taking a few snapshots of the Grand Canyon — if the natural wonder were a human-made, built environment — and expecting your audience to grasp the totality of the spectacle. We now have drones, smartphones, and 360-degree cameras. But drones can’t be operated safely indoors, and even with 360-degree cameras, you still have to hire someone dedicated to the task of photographing the site while considering how the files will be properly stored and shared with stakeholders. Updated tech, prettier pictures; same old problems, and new costs. Change orders still lead to chaos, and accountability disputes run rampant.
Enter OpenSpace, a company that’s propelling the construction of any built environment into the digital age. They’ve updated an essential idea by attaching an off-the-shelf 360-degree camera to a hard hat, and imbued it with cutting-edge computer vision, artificial intelligence, and data visualization software — not unlike the perception and navigation AI systems used in autonomous vehicles.
All you have to do is turn on the camera, tap “go” on the app, and walk the site. It's essentially passive; the OpenSpace Vision System does all the work, mapping site photos to site plans automatically. The complicated part happens under the hood, so to speak: the end user gets ease of use and streamlined simplicity, plus a comprehensive visual record of the site, with 15-minute processing times rather than the hours or days some competitors require.
"OpenSpace provides a living tool for managing just about everything on the job site. It isn't just an archive. Once you have this near-live view of your project, it changes the way people build by instilling a sense of ground truth, shared facts,” says Kalanithi. “And it can be viewed from anywhere. It’s like a time machine meets teleportation device for the job site.”
Kalanithi and his co-founders Phil DeCamp and Mike Fleischman met as grad students at the Media Lab. Kalanithi sold his first company, Sifteo, to a drone company called 3D Robotics, where he eventually became the president. Prior to OpenSpace, CTO DeCamp was a computer vision and data visualization research scientist at the Institute. For his part, Fleischman started a data analytics company called Bluefin Labs, based on his research at the Institute, which he eventually sold to Twitter. It was one of the media giant’s largest, and some would argue most meaningful, acquisitions to date.
OpenSpace has raised close to $33 million in funding, with big names like Lux Capital and Menlo Ventures playing vital roles in their growth. While Kalanithi and team are thrilled to sit in on board meetings with people that have been part of companies that have scaled to the billions, they have not forgotten their roots.
"We exist because of the MIT Startup ecosystem," says Kalanithi. "There simply would not be an OpenSpace if it weren't for MIT. It's not trivial to scale any company, especially one like ours that is trying to help a fractured enterprise market lacking a clearinghouse for solutions and technologies for the industry. There's no substitute for the kind of connections that MIT can provide. Being named to the new cohort of STEX25 startups means we now have access to the leading companies that MIT has connections with,” says Kalanithi.
Founded in 2017, OpenSpace came out of stealth mode in 2018 and is expanding rapidly. It now has thousands of customer projects across 40 countries. From renovations and tenant improvements to stadiums, data centers, and hospitals, the computer vision company with an eye for construction has captured over 3 billion square feet of active construction projects, proving the speed and simplicity of its solution while streamlining low-trust, high-labor, complicated workflows. That number is growing rapidly, and every square foot captured provides data that helps OpenSpace build the future of its analytics products.
Today, it has a suite of products called ClearSight. It's a pioneering new class of AI tools that leverages images to provide unprecedented insight into project status and progression while reversing hundreds of thousands of dollars in change orders and saving millions in overall construction costs by trimming schedules. And thanks to the virtual aspect of the platform, travel budgets get cut significantly. No more trips to the site required when you can see a past-to-present 360-degree view from anywhere.
At the end of the day, the company is solving a tough computer vision problem: "Computer vision allows us to build tools for people that work in real physical reality that you just couldn't before; we’ve crossed a barrier in terms of technological advancement,” says Kalanithi.
A man’s ghostly voice speak-sings from the black screen: “Rock-a-bye baby, on the treetops …” It’s a tentative voice, unused to intoning lullabies, the voice of a man who was just released from prison. When he was convicted, his twin children were 45 days old. Now, they’re 21. This father’s voice is one of dozens collected in the ongoing documentary project “A Father’s Lullaby” by current MIT Open Documentary Lab Fellow Rashin Fahandej, a compilation of recorded lullabies and oral histories from incarcerated fathers separated from their young children. The project has taken such forms as a geo-located sound installation and an award-winning museum exhibition.
This inventive and moving inventory of lost lullabies is one of many examples of the boundary-pushing creative works found in the MIT Open Documentary Lab (ODL) archive, known as the Docubase. Others include a poetic city symphony of Nairobi in virtual reality, a hybrid animated documentary and virtual reality game that tells the story of an Egyptian lesbian couple, and a participatory oral history of immigrant communities in Los Angeles. Many of these projects can also be described as “transmedia” — a term for works that extend beyond a single medium while playing to the strengths of each one.
Docubase, which takes the form of a vast website repository, is only one facet of the ODL’s ongoing mission to explore and incubate innovative forms of documentary using emerging technologies and techniques, among them cell phone recordings, virtual and augmented reality, and deep-fake manipulations. Other facets of the lab include many original projects; a co-creation studio; a weekly publication; conferences; championing public literacy about technologies, including AI, and their implications; and weekly lecture series open to the MIT community and beyond. Now celebrating its 10th year, ODL also boasts a far-reaching network of fellows, creators, and researchers, all perched at, and defining, the cutting edge of what a documentary can be.
In the late aughts, William Uricchio, a professor of comparative media studies and the founding principal investigator of ODL, recognized that documentary was in a moment of transition. He formulated the idea of a new lab at MIT’s unique crossroads of artistic and technological innovation, inspired by the Institute’s long history of using media to record aspects of the world.
Sarah Wolozin, the lab’s founding director and the creator of Docubase, says, “If you look at the history of documentary, it’s always evolving depending on what technology was available. One of the earliest examples is cave paintings. Today people use cellphones, cameras, computers, sensors, and many other technologies and processes to create stories about the world around us.”
Before helping Uricchio found the lab, Wolozin was working as a program manager at MIT Comparative Media Studies/Writing. As a multiplatform documentarian herself, she had been experimenting with media forms as a maker since the mid-’90s, when the internet first became publicly accessible.
Uricchio saw new technologies — and cell phones in particular — as revealing the many perspectives that go into telling any story and potentially changing who gets to tell it. Documentary, he and Wolozin realized, could have a new, accessible home on the internet where the many roles engaged in the genre — creator, producer, technician, subjects, and audience — blur together generatively. The legacy of the open-source movement at MIT also influenced their inspiration for the “open” ethos of the documentary lab: an open system that allows many people to contribute to and iterate on works.
The sea change of cellphone technology and ubiquitous cameras can be felt deeply in our culture, Uricchio observes. Without omnipresent, publicly accessible camera footage, watershed national events, including the murder of George Floyd in Minneapolis — and the resulting national and international protests — would simply not have been visible to the country and the world. Understanding how documentary works and is experienced, as a form of witnessing and truth-telling, has developed into a significant focus of research at the ODL — and an exploration that has become more complex with the advent of deep fakes and other forms of manipulation.
Imagining the future
The nascent Open Documentary Lab hit the ground running in 2011 with its first New Arts of Documentary conference, which overflowed the Media Lab’s massive sixth floor with industry experts, makers, technologists, scholars, and curators.
Uricchio recalls the concept: “We thought: Let’s put the funders, the technologists, the film festival people, and the makers at the same table to have a conversation. And it was magical. It’s been amazing to watch these folks help one another to reimagine the future of documentary storytelling.”
“We were very outward facing from the very beginning,” says Wolozin. “It was really about being in dialogue and interacting with the field.”
Wolozin began fostering partnerships with Tribeca, Sundance, and other leading organizations in the field. The International Documentary Festival of Amsterdam, whose new media program is led by Uricchio’s former student, became an important collaborator; together they created Moments of Innovation, Uricchio’s visual white paper that formed the basis for the lab’s approach to documentary. Another immediate outcome was “Creating Critics,” a program for MIT students developed by Wolozin and Sundance Film Festival New Frontier curator Shari Frilot that still exists today. Recognizing that there was very little critical discourse about the emerging new forms of documentary, the program sends ODL graduate student researchers to Sundance’s New Frontier exhibit as critics; their articles are published in IndieWire, an online film industry publication.
Uricchio had also helped found MIT’s Comparative Media Studies/Writing program itself — the department in MIT SHASS that houses ODL — working in collaboration with foundational new media theorist Henry Jenkins. Where Jenkins’ landmark scholarship focuses on participatory media in a networked body of work (e.g., user-generated content fueling massive companies like YouTube and Twitter), Uricchio’s ODL spirals participation toward vast new speculative horizons: How can stories be told in novel and innovative formats that both give voice to the subject and agency to the audience? The work produced by ODL and its fellows is often interactive and immersive — creating the feeling of being actively engaged and embedded in a story, and often enabling users to find their own stories.
Recalling some of her earliest work on the web in the 1990s — as the world first had public access to the internet — Wolozin reflects, “These impulses for participatory storytelling and interactivity have always been there, and they just evolve and change based on the technology that’s available.”
“We look at where new technology meets the mission of documentary,” says Uricchio. In his own scholarship, which entwines media historicity with forward-thinking possibilities, he is fascinated with how past forms of media inform the present and the future. “Media technologies and affordances have changed over the centuries, and if documentary as an interrogation of the world around us is to remain relevant, we must push the boundaries of — and better understand the implications of — today’s trends such as personalization, interactivity, and immersion. Just as importantly, we now have an opportunity to shift the balance of agency and change who tells the story.”
In that spirit, ODL is also the home of the MIT Co-Creation Studio. Launched in 2016, the Co-Creation Studio dives into the methodological implications of how documentaries are made in a networked world.
Katerina Cizek, a multi-Emmy Award-winning documentarian who has been pioneering participatory and interactive documentary production for decades, leads the young but prolific Co-Creation Studio. Beginning in the early 2000s, Cizek worked as a director with the National Film Board of Canada, reinventing a huge program to use film to advance social justice and community development by partnering with people in the community across disciplines and sectors.
When she first came to ODL as a visiting artist, Cizek recalls, there was no real global hub for exploring co-creative methodologies. Five years ago, Wolozin and Uricchio invited her to bring her idea for a co-creation studio to MIT. MIT Press will publish the Co-Creation Studio’s first book, “Collective Wisdom,” in 2022; the book is based on a pivotal field study from the studio that rejuvenated interest in the hows and whys of the co-creative process in creative fields.
“What the Studio particularly contributes is a focus on new and collective methodologies,” says Cizek.
This past academic year, the studio launched a big, ongoing project: Indigenous Digital Delegation, a partnership with the Indigenous Screen Office based in Canada. It is an initiative for what the Indigenous Screen Office calls “Indigenous narrative sovereignty”: Indigenous control over how Indigenous stories are told and who tells them. The delegation’s partnership with MIT puts Indigenous media scholars and artists into conversation with a wide breadth of experts and thinkers at the Institute.
“It’s a two-way conversation,” says Cizek. “It’s really about developing deep conversation around Indigenous epistemologies, artificial intelligence, and digital worlds. The pickup at MIT was amazing. We had over 60 faculty, staff, and students respond to and participate in a variety of ways with the delegation, and we’ll be running the program again next spring.”
Even today, there is no other lab doing this kind of work, says Wolozin, of the range and nature of ODL’s portfolio. There are new technological factors in the game — extended reality and artificial intelligence, to name two — but the lab’s mission continues to be bringing storytellers and technology scholars together to explore the relationship between representation and reality.
Each week, Immerse, ODL’s publication on Medium, offers a clear window onto the lab’s mission in action, from the role of street projections that subvert official narratives in South America to the social media life of an aging robot to speculative nonfictions in public space. The content of Immerse is all about how stories are being told now, in a dazzling array of media.
Co-founded in 2015 by Wolozin; Ingrid Kopp, director of interactive media at the Tribeca Film Institute; and Jessica Clark, founder and CEO of Dot Connector, the publication is currently headed by Abby Sun, a CMS/W master’s student with an extensive professional history in film festivals and programming. “Editing Immerse is a collective undertaking,” says Sun, honoring the input of several key collaborators and industry veterans, including Wolozin and Cizek. “My role as editor has expanded my consciousness and context for the long history and vibrant future of this work.”
Also at home in Immerse is research and writing by MIT Comparative Media Studies faculty and graduate students, including Sun and Diego Cerna Aragon in the first 2021 issue. MIT alumni associated with ODL include Andrea Kim SM ’21, who recently received a Fulbright fellowship to continue her work on avatars; Sarah Rafsky ’18, a journalist and documentarian who produced an important investigative short film on Mecca, Mexico, for Netflix; Sue Ding SM ’17, a documentarian on the West Coast, with a breakout 2020 feature on Netflix about “The Baby-Sitters’ Club”; and Samuel Mendez ’20, who is now in a PhD program in public health at Harvard University and marries media art and public policy as a programmer.
A major function of the lab is making space for marginalized storytellers to take agency over how their own stories are told. Currently, ODL is engaged in a massive, two-year project on augmenting public space — whether through geo-located sounds, projections on the sides of buildings, or QR codes. “Now that we’re challenging the master narrative of which monuments should be there,” asks Uricchio, “how can we leave traces in a more collective way? How can we actually augment and enhance spaces with people’s stories and narratives?”
In one place-based act of history-telling, ODL Fellow Assia Boundaoui projected redacted FBI surveillance reports of Muslim Americans against the walls of the U.S. National Security Agency building, while using artificial intelligence technology to fill in the redacted language.
Recently, a joint fellowship with the MIT Center for Art, Science & Technology to partner with Black Public Media brought two new fellows to ODL. One resultant project is “Mapping Blackness,” a powerful work by 2020-21 ODL Fellow Carla Bishop that uses innovative, intergenerational oral histories to document forgotten Black communities in northern Texas and Oklahoma. These small, century-old towns are very much alive and thriving — even if you might miss them if you blink driving by on the highway. Bishop’s work archives the stories of these communities and makes their histories accessible to a larger public.
Overall, the lab is guided by an ethos of increasing public literacy about emerging technologies: Storytelling is a powerful way to demystify new technologies, to increase understanding of their implications, and to engage the public in decision-making about how emerging technologies will be deployed.
Documentary itself as a discipline has always been deeply entwined with technological and scientific thinking. “The reality at the core of documentary has made it an ideal lens through which to understand our representational conventions,” Uricchio says. “Mastering those conventions, and at times strategically breaking them, enables documentary not just to interpret the world, but to change it.”
“Making technologies accessible has always been important in my work,” says Wolozin. “Helping people understand their potential for storytelling and information. If we think about stories as a way to understand the world, we can do that with technology, and we can reach people in new ways. Because when people change the way they communicate, they change the way that they tell stories. And by so doing, they can transform how people see the world.”
Story prepared by MIT SHASS Communications
Editorial team: Alison Lanier and Emily Hiestand
Wide Area Networks (WANs), the global backbones and workhorses of today’s internet that connect billions of computers over continents and oceans, are the foundation of modern online services. As Covid-19 has made society vitally reliant on online services, today’s networks are struggling to deliver the high bandwidth and availability demanded by emerging workloads related to machine learning, video calls, and health care.
To connect WANs over hundreds of miles, fiber optic cables, made of incredibly thin strands of glass or plastic known as optical fibers, are threaded throughout our neighborhoods, transmitting data using light. While they’re extremely fast, they’re not always reliable: They can easily break from weather, thunderstorms, accidents, and even animals. These cuts can cause severe and expensive damage, resulting in 911 service outages, lost internet connectivity, and the inability to use smartphone apps.
Scientists from the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) and from Facebook recently came up with a way to keep the network running when a fiber is down, and to reduce cost. Their system, called “ARROW,” reconfigures the optical light from a damaged fiber onto healthy ones, while using an online algorithm to proactively plan for potential fiber cuts ahead of time, based on real-time internet traffic demands.
ARROW is built on the crossroads of two different approaches: “failure-aware traffic engineering,” a technique that steers traffic to where the bandwidth resources are during fiber cuts, and “wavelength reconfiguration,” which restores failed bandwidth resources by reconfiguring the light.
Though this combination is powerful, the problem is mathematically difficult to solve because it is NP-hard.
The team created a novel algorithm that generates “LotteryTickets” as an abstraction for the wavelength reconfiguration problem on optical fibers, feeding only the essential information into the traffic engineering problem. This works alongside their optical restoration method, which moves the light from the cut fiber to “surrogate” healthy fibers to restore network connectivity. The system also takes real-time traffic into account to optimize for maximum network throughput.
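The interplay between restoration and rerouting can be illustrated with a toy example. The sketch below is not the authors’ ARROW implementation: the network, the capacities, and the greedy routing rule are all illustrative assumptions. It shows the general idea of precomputing restoration candidates for a fiber cut and then steering traffic over whatever capacity survives.

```python
# Illustrative sketch only (not ARROW itself): restoration candidates
# precomputed per fiber cut, plus greedy failure-aware rerouting.
# All link names and capacities below are hypothetical.

# Capacities (Gb/s) on links of a toy WAN.
capacity = {("A", "B"): 100, ("B", "C"): 100, ("A", "C"): 50}

# Restoration candidates: if a fiber is cut, how much of its capacity
# can be restored by shifting its wavelengths onto surrogate fibers.
restoration_candidates = {
    ("A", "B"): [(("A", "C"), 40)],  # move light from A-B onto the A-C fiber
}

def capacities_after_cut(cut):
    """Capacity map after `cut` fails, with proactive restoration applied."""
    cap = dict(capacity)
    cap[cut] = 0
    for surrogate, restored in restoration_candidates.get(cut, []):
        cap[surrogate] += restored  # restored wavelengths add capacity
    return cap

def route(demand, paths, cap):
    """Greedy traffic engineering: fill paths in order, respecting capacity."""
    carried = 0
    for path in paths:
        free = min(cap[link] for link in path)
        flow = min(free, demand - carried)
        for link in path:
            cap[link] -= flow
        carried += flow
    return carried

# 120 Gb/s of A->B demand; the direct path first, then a two-hop detour.
paths = [[("A", "B")], [("A", "C"), ("B", "C")]]
print(route(120, paths, dict(capacity)))                    # healthy: 120
print(route(120, paths, capacities_after_cut(("A", "B"))))  # after cut: 90
```

Without restoration, the cut would strand the direct path entirely and the detour alone could carry only 50 Gb/s; shifting the damaged fiber’s wavelengths onto a surrogate recovers most of the lost throughput.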
In large-scale simulations and on a testbed, ARROW carried 2 to 2.4 times more traffic without the deployment of new fibers, while keeping the network highly reliable.
“ARROW can be used to improve service availability, and enhance the resiliency of the internet infrastructure against fiber cuts. It renovates the way we think about the relationship between failures and network management — previously failures were deterministic events, where failure meant failure, and there was no way around it except over-provisioning the network,” says MIT postdoc Zhizhen Zhong, the lead author on a new paper about ARROW. “With ARROW, some failures can be eliminated or partially restored, and this changes the way we think about network management and traffic engineering, opening up opportunities for rethinking traffic engineering systems, risk assessment systems, and emerging applications too.”
The design of today's network infrastructures, both in data centers and in wide-area networks, still follows the “telephony model,” where network engineers treat the physical layer of networks as a static black box with no reconfigurability.
As a result, the network infrastructure is equipped to carry the worst-case traffic demand under all possible failure scenarios, making it inefficient and costly. Yet, modern networks have elastic applications that could benefit from a dynamically reconfigurable physical layer, to enable high throughput, low latency, and seamless recovery from failures, which ARROW helps enable.
In traditional systems, network engineers decide in advance how much capacity to provide in the physical layer of the network. It might seem impossible to change the topology of a network without physically changing the cables, but since optical waves can be redirected using tiny mirrors, they’re capable of quick changes: no rewiring required. This is a realm where the network is no longer a static entity but a dynamic structure of interconnections that may change depending on the workload.
Imagine a hypothetical subway system where some trains might fail once in a while. The subway control unit wants to plan how to distribute the passengers to alternative routes while considering all possible trains and traffic on them. Using ARROW, then, when a train fails, the control unit just announces to the passengers the best alternative routes to minimize their travel time and avoid congestion.
“My long-term goal is to make large-scale computer networks more efficient, and ultimately develop smart networks that adapt to the data and application,” says MIT Assistant Professor Manya Ghobadi, who supervised the work. “Having a reconfigurable optical topology revolutionizes the way we think of a network, as performing this research requires breaking orthodoxies established for many years in WAN deployments.”
To deploy ARROW in real-world wide-area networks, the team has been collaborating with Facebook and hopes to work with other large-scale service providers. “The research provides the initial insight into the benefits of reconfiguration. The substantial potential in reliability improvement is attractive to network management in production backbones,” says Ying Zhang, a software engineer manager at Facebook who collaborated on this research.
“We are excited that there would be many practical challenges ahead to bring ARROW from research lab ideas to real-world systems that serve billions of people, and possibly reduce the number of service interruptions that we experience today, such as fewer news reports on how fiber cuts affect internet connectivity,” says Zhong. “We hope that ARROW could make our internet more resilient to failures with less cost.”
Zhong wrote the paper alongside Ghobadi; MIT graduate student Alaa Khaddaj; and Facebook engineers Jonathan Leach, Ying Zhang, and Yiting Xia. They presented the research at ACM’s SIGCOMM conference.
This work was led by MIT in collaboration with Facebook. The technique is being evaluated for deployment at Facebook. Facebook provided resources for performing the research. The MIT-affiliated authors were supported by the Advanced Research Projects Agency–Energy, the Defense Advanced Research Projects Agency, and the U.S. National Science Foundation.
Ending an eviction moratorium for renters makes people in a community significantly more likely to contract Covid-19, according to a new study co-authored by MIT researchers.
The study uses the variable timing of state-level moratoriums, issued and terminated at different points during the Covid-19 pandemic, to quantify their effect. It is the first study to identify the individual-level risk that ending eviction moratoriums poses to people in different social circumstances. The increased risk runs throughout communities, the research shows, meaning that ending eviction moratoriums does not just affect those who lose their housing.
Eviction moratoriums have been used to protect renters in danger of losing their housing at a time of economic strain caused by the Covid-19 pandemic. The study shows that, on average, when a state lifted its moratorium and let evictions resume, the hazard of contracting Covid-19 was 1.39 times greater after five weeks and 1.83 times greater after 12 weeks than if the moratorium had continued.
For people with three or more co-morbidities, the hazard increased by 2.37 times within 12 weeks. The hazards of contracting Covid-19 in nonaffluent areas and in areas of high rent burden were 2.14 and 2.31 times higher, respectively, within 12 weeks in states that lifted eviction moratoriums, as opposed to maintaining them.
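To get an intuition for what these hazard ratios imply, one can translate them into cumulative risks under a standard proportional-hazards assumption, where a hazard ratio r maps a baseline cumulative risk p0 to 1 − (1 − p0)^r. This is illustrative arithmetic only, not the study's model, and the 5 percent baseline risk below is a hypothetical number, not a figure from the paper.

```python
# Illustrative only: convert a hazard ratio to a cumulative risk under
# proportional hazards. The baseline risk is an assumed example value.
def risk_with_hazard_ratio(p0, r):
    """Cumulative risk after applying hazard ratio r to baseline risk p0."""
    return 1 - (1 - p0) ** r

p0 = 0.05  # hypothetical 12-week baseline infection risk
for r in (1.83, 2.14, 2.31, 2.37):  # hazard ratios reported in the study
    print(f"HR {r}: {risk_with_hazard_ratio(p0, r):.1%}")
```

Under these assumptions, a hazard ratio of 1.83 would lift a 5 percent baseline risk to roughly 9 percent, which conveys the practical magnitude of the reported effects.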
“Not having access to a stable way of sheltering yourself from the pandemic can be very impactful for how the pandemic spreads, not just for you but for your community,” says Sebastian Sandoval-Olascoaga, a doctoral student at MIT and co-author of the new paper. “There are spillover effects, and there is a transmission process created by evictions within a community.”
For that reason, Sandoval-Olascoaga adds, “As new variants spread, our study suggests that this policy, which protects low-income communities and people with co-morbidities, can also create health equity and provide protection for groups with more advantages.”
The paper, “Eviction Moratoria Expiration and COVID-19 Infection Risk Across Strata of Health and Socioeconomic Status in the United States,” was published today by the journal JAMA Network Open.
The co-authors are Sandoval-Olascoaga, a doctoral student in MIT’s Department of Urban Studies and Planning (DUSP); Atheendar S. Venkataramani, an assistant professor of medical ethics and health policy at the University of Pennsylvania’s Perelman School of Medicine; and Mariana Arcaya, an MIT associate professor of urban planning and public health, and associate head of DUSP.
“The public health rationale for eviction moratoria appears strong,” says Arcaya.
Different states, different Covid-19 rates
Eviction moratoriums have been the subject of ongoing political debate during the Covid-19 pandemic, and last week the U.S. Supreme Court overturned the Biden administration’s federal eviction moratorium, which had been issued by the U.S. Centers for Disease Control and Prevention (CDC).
The CDC issued an initial ban of its own in the fall of 2020, which had been extended multiple times until the Supreme Court ruling. By September 2020, an estimated 47 percent of renters behind in their payments were in danger of eviction, according to U.S. Census Bureau surveys. Amid this policy uncertainty, 43 U.S. states plus the District of Columbia issued eviction moratoriums during the pandemic; seven never did.
Of those 44 state-level governments, 26 wound up lifting their eviction bans, while 18 did not, in effect forming “treatment” and “control” groups for the study. The researchers used variations in the timing of eviction bans in 2020, while controlling for complicating factors, to identify what difference the resumption of evictions made to state-level trajectories of Covid-19 spread.
“We have a natural experiment where some states could help us as a control group, and some could help us as a treatment group,” Sandoval-Olascoaga says.
To conduct the study, the scholars also examined anonymized commercial insurance and Medicare Advantage records from a large national database with health information on nearly 200 million people; ultimately they analyzed a random sample of 500,000 U.S. residents, to evaluate how the moratoriums affected health. Because many things affect the spread of Covid-19, the study controlled for a wide range of complicating factors, including state policies such as mask mandates, stay-at-home or shelter-in-place orders, school closures, business restrictions, and existing Covid-19 levels at the county and state levels.
Multiple potential mechanisms
The researchers suggest there are multiple potential mechanisms through which lifting an eviction ban increases the spread of Covid-19. More people, once evicted from their housing, may start living with relatives or friends in more crowded settings, in which Covid-19 is more likely to spread. Ending eviction bans also increases homelessness, which likely sends more people into crowded shelters or other situations where they have increased proximity to others.
Additionally, because people in poor health are more likely to be affected by the end of an eviction ban, individuals with greater-than-average vulnerability to Covid-19 are put into situations where there is increased likelihood of transmission. As Sandoval-Olascoaga observes, “an eviction creates a cascade of events” in which Covid-19 can spread more easily.
Moreover, Sandoval-Olascoaga notes, because the study uses data from 2020, the findings show what happens when evictions resume in the context of a less transmissible version of Covid-19 than the currently prevalent Delta variant.
“These results occurred when the Delta variant was not a thing,” Sandoval-Olascoaga says. “We were able to find an impact with a Covid strain that was not as transmissible as this one.”
For her part, Arcaya says that “the pandemic is not over, and while we hear a lot about what individuals can do to protect themselves — with masking and vaccination being critical — stopping evictions and otherwise helping people stay in stable housing are part of how cities, states, and the federal government can protect all of us.”
In recent years, scientists have developed monoclonal antibodies — proteins that mimic the body’s own immune defenses — that can combat a variety of diseases, including some cancers and autoimmune disorders such as Crohn’s disease. While these drugs work well, one drawback to them is that they have to be injected.
A team of MIT engineers, in collaboration with scientists from Brigham and Women’s Hospital and Novo Nordisk, is working on an alternative delivery strategy that could make it much easier for patients to benefit from monoclonal antibodies and other drugs that usually have to be injected. They envision that patients could simply swallow a capsule that carries the drug and then injects it directly into the lining of the stomach.
“If we can make it easier for patients to take their medication, then it is more likely that they will take it, and healthcare providers will be more likely to adopt therapies that are known to be effective,” says Giovanni Traverso, the Karl van Tassel Career Development Assistant Professor of Mechanical Engineering at MIT and a gastroenterologist at Brigham and Women’s Hospital.
In a study appearing today in Nature Biotechnology, the researchers demonstrated in pigs that their capsules can deliver not only monoclonal antibodies but also other large protein drugs, such as insulin.
Traverso and Ulrik Rahbek, vice president at Novo Nordisk, are the senior authors of the paper. Former MIT graduate student Alex Abramson and Novo Nordisk scientists Morten Revsgaard Frederiksen and Andreas Vegge are the lead authors.
Targeting the stomach
Most large protein drugs can’t be given orally because enzymes in the digestive tract break them down before they can be absorbed. Traverso and his colleagues have been working on many strategies to deliver such drugs orally, and in 2019, they developed a capsule that could be used to inject up to 300 micrograms of insulin.
That pill, about the size of a blueberry, has a high, steep dome inspired by the leopard tortoise. Just as the tortoise is able to right itself if it rolls onto its back, the capsule is able to orient itself so that its needle can be injected into the lining of the stomach. In the original version, the tip of the needle was made of compressed insulin, which dissolved in the tissue after being injected into the stomach wall.
The new pill described in the Nature Biotechnology study maintains the same shape, allowing the capsule to orient itself correctly once it arrives in the stomach. However, the researchers redesigned the capsule interior so that it could be used to deliver liquid drugs, in larger quantities — up to 4 milligrams.
Delivering drugs in liquid form can help them reach the bloodstream more rapidly, which is necessary for drugs like insulin and epinephrine, which is used to treat allergic responses.
The researchers designed their device to target the stomach, rather than later parts of the digestive tract, because the amount of time it takes for something to reach the stomach after being swallowed is fairly uniform from person to person, Traverso says. Also, the lining of the stomach is thick and muscular, making it possible to inject drugs while mitigating harmful side effects.
The new delivery capsule is filled with fluid and also contains an injection needle and a plunger that helps to push the fluid out of the capsule. Both the needle and plunger are held in place by a pellet made of solid sugar. When the capsule enters the stomach, the humid environment causes the pellet to dissolve, pushing the needle into the stomach lining, while the plunger pushes the liquid through the needle. When the capsule is empty, a second plunger pulls the needle back into the capsule so that it can be safely excreted through the digestive tract.
In tests in pigs, the researchers showed that they could deliver a monoclonal antibody called adalimumab (Humira) at levels similar to those achieved by injection. This drug is used to treat autoimmune disorders such as inflammatory bowel disease and rheumatoid arthritis. They also delivered a type of protein drug known as a GLP-1 receptor agonist, which is used to treat type 2 diabetes.
“Delivery of monoclonal antibodies orally is one of the biggest challenges we face in the field of drug delivery science,” Traverso says. “From an engineering perspective, the ability to deliver monoclonal antibodies at significant levels really transforms how we start to think about the management of these conditions.”
Additionally, the researchers gave the animals capsules over several days and found that the drugs were delivered consistently each time. They also found no signs of damage to the stomach lining following the injections, which penetrate about 4.5 millimeters into the tissue.
David Brayden, a professor of advanced drug delivery at University College Dublin, who was not involved in the research, described the new approach as “a very exciting advance for the potential oral delivery of macromolecules. That similar blood levels to those arising from injections of these types of drugs can be achieved by stomach administration to large animals is a technical landmark for the field.”
The MIT team is now working with Novo Nordisk to further develop the system.
“Although it is still early days, we believe this device has the potential to transform treatment regimens across a range of therapeutic areas,” Rahbek says. “The ongoing research and development of this approach mean that several drugs that can currently only be administered via parenteral injections (non-oral routes) might be administered orally in the future. Our aim is to get the device into clinical trials as soon as possible.”
Other authors of the paper include MIT’s David H. Koch Institute Professor Robert Langer, Brian Jensen, Mette Poulsen, Brian Mouridsen, Mikkel Oliver Jespersen, Rikke Kaae Kirk, Jesper Windum, Frantisek Hubalek, Jorrit Water, Johannes Fels, Stefan Gunnarsson, Adam Bohr, Ellen Marie Straarup, Mikkel Wennemoes Hvitfeld Ley, Xiaoya Lu, Jacob Wainer, Joy Collins, Siddartha Tamang, Keiko Ishida, Alison Hayward, Peter Herskind, Stephen Buckley, and Niclas Roxhed.
The research was funded by Novo Nordisk, the National Institutes of Health, the National Science Foundation, MIT’s Department of Mechanical Engineering, Brigham and Women’s Hospital’s Division of Gastroenterology, and the Viking Olof Bjork scholarship trust.
Professor Emeritus Paul Schimmel PhD ’66 and his family recently committed $50 million to support the life sciences at MIT. They provided an initial gift of $25 million to establish the Schimmel Family Program for Life Sciences. This gift matches $25 million secured from other sources in support of the Department of Biology. The remaining $25 million from the Schimmel family will go to support the Schimmel Family Program in the form of matching funds as other gifts are secured over the next five years. Schimmel, who is the John D. and Catherine T. MacArthur Professor of Biochemistry and Biophysics Emeritus, is a lifelong supporter of the Institute in teaching, research, and philanthropy.
“I am tremendously grateful to Paul and his family for their generosity and support, and for their advocacy for our department and the life sciences,” says department head Alan D. Grossman, the Praecis Professor of Biology.
This most recent gift is one among many that Schimmel and his family have provided to MIT during their more than 50-year affiliation with the Institute, which includes Paul’s doctorate and his 30 years of teaching and research in the department. While at MIT, Paul and Cleo, Paul’s wife and philanthropic partner, provided an anonymous donation for the construction of Building 68, the most recent home for the Department of Biology.
“We cannot overstate our gratitude for our MIT experience. It was MIT that provided a ‘frontier of knowledge, which has no bounds’ and introduced us to some of the finest minds and people in the world,” Schimmel says.
“They educated and uplifted us, and convinced us of MIT’s singular role in making this a better world for all peoples,” says Cleo Schimmel, a past chair of the MIT Women’s League who, in her own right, contributed to the endowment of the league and other efforts to support women at MIT.
Currently, Paul Schimmel is the Ernst and Jean Hahn Professor at the Skaggs Institute for Chemical Biology at the Scripps Research Institute. Schimmel formally left MIT in 1997 to join Scripps Research, but he has remained actively involved in supporting the Institute’s research enterprise, specifically MIT graduate students.
Graduate funding for the future
Shortly after Paul left MIT, the Schimmels endowed four graduate fellowships for outstanding women in life sciences. “Since 2000, the Cleo and Paul Schimmel Scholars fellowships have helped the biology department recruit and retain the best talent,” says Grossman. Kristin Knouse PhD ’17 is a former Schimmel Scholar who rejoined the department this past July as an assistant professor.
“The MIT Department of Biology encompasses a remarkable breadth of biology within a very close-knit community that places a strong emphasis on graduate training,” says Knouse. “Once in the lab, the resources and collaborations available through MIT provide unparalleled opportunities to accelerate and advance your research.”
Schimmel, who sits on the department’s Visiting Committee, continued to champion graduate student support by helping to endow the Teresa Keng Graduate Teaching Prize to support excellence in graduate student teaching in the department. In 2013, the Schimmel family donated the proceeds from the sale of their La Jolla, California, home for the purpose of training the next generation of MIT graduates in the life sciences. What formally became the department’s Graduate Training Initiative (GTI) was supported by others, including biology alumni Eric Schmidt PhD ’96 and Tracy Smith PhD ’96.
The GTI supports departmental efforts to enhance the graduate student experience in the form of both direct student support, including tuition and stipend, and indirect support, including programmatic activities such as seed funds for student-directed projects, shared computing facilities, and forums related to post-graduation employment.
This new gift to establish the Schimmel Family Program for Life Sciences will support not only the GTI in the Department of Biology, but also graduate students across MIT.
“The life sciences educational enterprise spreads across a dozen departments at MIT,” says Schimmel. “What makes the biology department and the life sciences at MIT so extraordinary is the singular ability to transfer knowledge and inventions to society for its benefit. That is much of why Kendall Square and Boston are what they are.”
To that end, Schimmel has also been an active player in shaping the MIT-Kendall Square innovation ecosystem, including the founding of companies such as Alnylam Pharmaceuticals in 2002. Alnylam — founded by Schimmel along with Institute Professor Phillip Sharp, MIT Professor David Bartel, MIT postdocs Thomas Tuschl and Phillip Zamore, and investors — has been a major player in the biopharma scene. Most recently, Alnylam partnered with Vir Biotechnology to develop therapeutics for coronavirus infections, including Covid-19.
Having a longstanding interest in the applications of basic biomedical research to human health, Schimmel holds numerous patents and is a co-founder or founding director of several biotechnology companies in addition to Alnylam, including aTyr Pharma, Alkermes, Cubist Pharmaceuticals, Metabolon, Repligen, and Sirtris Pharmaceuticals.
“I’ve been talking to the people that I’ve started companies with, reminding them that none of the extensive commercial and residential real estate development, restaurants, hotels, and the founding and locating of major biopharmaceutical enterprises would have happened without the MIT life sciences enterprise,” says Schimmel. “MIT’s Kendall Square is to biopharma what Silicon Valley is to technology. None of the robust economic impact would have occurred if it hadn’t been for MIT’s life sciences.”
The $50 million commitment was a capstone gift to MIT’s Campaign for a Better World, supporting the campaign’s priorities of human health and discovery science. In addition, Schimmel plans to continue supporting the life sciences at MIT through his estate plan with the Institute.
“We are extraordinarily grateful to Paul, Cleo, and the entire family,” says Nergis Mavalvala PhD ’97, the Curtis and Kathleen Marble Professor of Astrophysics and the dean of the MIT School of Science. “Not only do the Schimmels understand, from a firsthand perspective, the need to support graduate students, but they also understand that these young researchers are the future of our life sciences endeavors outside of MIT, in fundamental research, biopharma industries, and beyond.”
Schimmel graduated from Ohio Wesleyan University, earned a doctorate from MIT, and completed postdoctoral research at Stanford University. His many accomplishments include the publication of more than 500 scientific papers, numerous awards and honorary degrees, and elected membership in the American Academy of Arts and Sciences, the National Academy of Sciences, the American Philosophical Society, the Institute of Medicine (now the National Academy of Medicine), and the National Academy of Inventors.