MIT Latest News

MIT News is dedicated to communicating to the media and the public the news and achievements of the students, faculty, staff and the greater MIT community.

Study: Smart devices’ ambient light sensors pose imaging privacy risk

Mon, 01/29/2024 - 3:55pm

In George Orwell’s novel “1984,” Big Brother watches citizens through two-way, TV-like telescreens, surveilling them without any cameras. In a similar fashion, our current smart devices contain ambient light sensors, which open the door to a different threat: hackers.

These passive, seemingly innocuous smartphone components receive light from the environment and adjust the screen's brightness accordingly, like when your phone automatically dims in a bright room. Unlike cameras, though, these sensors can be used by apps without any permission request. In a surprising discovery, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) found that ambient light sensors are vulnerable to privacy threats when embedded in a smart device’s screen. To demonstrate how hackers could use these sensors in tandem with monitors, the team proposed a computational imaging algorithm that recovers an image of the environment, from the perspective of the display screen, using the subtle single-point light-intensity changes the sensors register. An open-access paper on this work was published in Science Advances on Jan. 10.

“This work turns your device's ambient light sensor and screen into a camera! Ambient light sensors are tiny devices deployed in almost all portable devices and screens that surround us in our daily lives,” says Princeton University professor Felix Heide, who was not involved with the paper. “As such, the authors highlight a privacy threat that affects a comprehensive class of devices and has been overlooked so far.”

While phone cameras have previously been exposed as security threats for recording user activity, the MIT group found that ambient light sensors can capture images of users’ touch interactions without a camera. According to their new study, these sensors can eavesdrop on regular gestures, like scrolling, swiping, or sliding, and capture how users interact with their phones while watching videos. For example, apps with native access to your screen, including video players and web browsers, could spy on you to gather this permission-free data.

According to the researchers, a commonly held belief is that ambient light sensors don’t reveal meaningful private information to hackers, so programming apps to request access to them is unnecessary. “Many believe that these sensors should always be turned on,” says lead author Yang Liu, a PhD student in MIT's Department of Electrical Engineering and Computer Science and a CSAIL affiliate. “But much like the telescreen, ambient light sensors can passively capture what we’re doing without our permission, while apps are required to request access to our cameras. Our demonstrations show that when combined with a display screen, these sensors could pose some sort of imaging privacy threat by providing that information to hackers monitoring your smart devices.”

Collecting these images requires a dedicated inversion process where the ambient light sensor first collects low-bitrate variations in light intensity, partially blocked by the hand making contact with the screen. Next, the outputs are mapped into a two-dimensional space by forming an inverse problem with the knowledge of the screen content. An algorithm then reconstructs the picture from the screen’s perspective, which is iteratively optimized and denoised via deep learning to reveal a pixelated image of hand activity.
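The pipeline described above can be sketched as a linear inverse problem. In this illustrative toy (the variable names, dimensions, and data are invented, and ridge regularization stands in for the paper's learned deep-learning denoiser), each known screen frame contributes one scalar sensor reading, and the unknown occlusion image is recovered from many such readings:

```python
import numpy as np

# Hypothetical sketch of the inversion, not the paper's code. Each screen
# frame S_t is a known illumination pattern; the ambient light sensor returns
# one scalar per frame: the total light transmitted past the unknown
# occlusion mask x (the hand blocking part of the screen).
rng = np.random.default_rng(0)
n_pix = 16 * 16                     # coarse 16x16 reconstruction grid
n_frames = 800                      # one scalar measurement per screen frame

A = rng.random((n_frames, n_pix))   # rows: known screen/illumination patterns
x_true = np.ones(n_pix)
x_true[60:68] = 0.2                 # a toy "hand" occluding some pixels
y = A @ x_true + 0.01 * rng.standard_normal(n_frames)  # noisy sensor readings

# Regularized least-squares inversion; the paper additionally denoises the
# estimate with deep learning, which simple ridge regularization stands in for.
lam = 1e-2
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_pix), A.T @ y)

err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {err:.3f}")
```

With many more measurements than unknowns and low noise, the least-squares estimate recovers the occlusion mask closely, which is why accumulating frames over minutes is enough to form a coarse image.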

The study introduces a novel combination of passive sensors and active monitors to reveal a previously unexplored imaging threat that could expose the environment in front of the screen to hackers processing the sensor data from another device. “This imaging privacy threat has never been demonstrated before,” says Liu, who worked on the paper alongside senior author Frédo Durand, an MIT EECS professor and CSAIL member.

The team suggested two software mitigation measures for operating system providers: tightening up permissions and reducing the precision and speed of the sensors. First, they recommend restricting access to the ambient light sensor by allowing users to approve or deny those requests from apps. To further prevent any privacy threats, the team also proposed limiting the capabilities of the sensors. By reducing the precision and speed of these components, the sensors would reveal less private information. From the hardware side, the ambient light sensor should not be directly facing the user on any smart device, they argued, but instead placed on the side, where it won’t capture any significant touch interactions.

Getting the picture

The inversion process was applied to three demonstrations using an Android tablet. In the first test, the researchers seated a mannequin in front of the device, while different hands made contact with the screen. A human hand pointed to the screen, and later, a cardboard cutout resembling an open-hand gesture touched the monitor, with the pixelated imprints gathered by the MIT team revealing the physical interactions with the screen.

A subsequent demo with human hands revealed that the way users slide, scroll, pinch, swipe, and rotate could be gradually captured by hackers through the same imaging method, although only at a speed of one frame every 3.3 minutes. With a faster ambient light sensor, malicious actors could potentially eavesdrop on user interactions with their devices in real time.

In a third demo, the group found that users are also at risk when watching videos like films and short clips. A human hand hovered in front of the sensor while scenes from Tom and Jerry cartoons played on screen, with a white board behind the user reflecting light to the device. The ambient light sensor captured the subtle intensity changes for each video frame, with the resulting images exposing touch gestures.

While the vulnerabilities in ambient light sensors pose a threat, such a hack is still limited. The attack is slow: at the current retrieval rate of one frame every 3.3 minutes, most touch interactions end long before an image can be captured. Additionally, images retrieved from natural video are still somewhat blurry, which may prompt future research. And while telescreens can capture objects away from the screen, this imaging privacy issue has only been confirmed for objects that make contact with a mobile device’s screen, much as selfie cameras cannot capture objects out of frame.

Two other EECS professors are also authors on the paper: CSAIL member William T. Freeman and MIT-IBM Watson AI Lab member Gregory Wornell, who leads the Signals, Information, and Algorithms Laboratory in the Research Laboratory of Electronics. Their work was supported, in part, by the DARPA REVEAL program and an MIT Stata Family Presidential Fellowship.

Benchtop test quickly identifies extremely impact-resistant materials

Mon, 01/29/2024 - 3:00pm

An intricate, honeycomb-like structure of struts and beams could withstand a supersonic impact better than a solid slab of the same material. What’s more, the specific structure matters, with some being more resilient to impacts than others.

That’s what MIT engineers are finding in experiments with microscopic metamaterials — materials that are intentionally printed, assembled, or otherwise engineered with microscopic architectures that give the overall material exceptional properties.

In a study appearing today in the Proceedings of the National Academy of Sciences, the engineers report on a new way to quickly test an array of metamaterial architectures and their resilience to supersonic impacts.

In their experiments, the team suspended tiny printed metamaterial lattices between microscopic support structures, then fired even tinier particles at the materials, at supersonic speeds. With high-speed cameras, the team then captured images of each impact and its aftermath, with nanosecond precision.

Their work has identified a few metamaterial architectures that are more resilient to supersonic impacts compared to their entirely solid, nonarchitected counterparts. The researchers say the results they observed at the microscopic level can be extended to comparable macroscale impacts, to predict how new material structures across length scales will withstand impacts in the real world.

“What we’re learning is, the microstructure of your material matters, even with high-rate deformation,” says study author Carlos Portela, the Brit and Alex d’Arbeloff Career Development Professor in Mechanical Engineering at MIT. “We want to identify impact-resistant structures that can be made into coatings or panels for spacecraft, vehicles, helmets, and anything that needs to be lightweight and protected.”

Other authors on the study include first author and MIT graduate student Thomas Butruille, and Joshua Crone of DEVCOM Army Research Laboratory.

Pure impact

The team’s new high-velocity experiments build off their previous work, in which the engineers tested the resilience of an ultralight, carbon-based material. That material, which was thinner than the width of a human hair, was made from tiny struts and beams of carbon, which the team printed and placed on a glass slide. They then fired microparticles toward the material, at velocities exceeding the speed of sound.  

Those supersonic experiments revealed that the microstructured material withstood the high-velocity impacts, sometimes deflecting the microparticles and other times capturing them.

“But there were many questions we couldn’t answer because we were testing the materials on a substrate, which may have affected their behavior,” Portela says.

In their new study, the researchers developed a way to test freestanding metamaterials, to observe how the materials withstand impacts purely on their own, without a backing or supporting substrate.

In their current setup, the researchers suspend a metamaterial of interest between two microscopic pillars made from the same base material. Depending on the dimensions of the metamaterial being tested, the researchers calculate how far apart the pillars must be in order to support the material at either end while allowing the material to respond to any impacts, without any influence from the pillars themselves.

“This way, we ensure that we’re measuring the material property and not the structural property,” Portela says.

Once the team settled on the pillar support design, they moved on to test a variety of metamaterial architectures. For each architecture, the researchers first printed the supporting pillars on a small silicon chip, then continued printing the metamaterial as a suspended layer between the pillars.

“We can print and test hundreds of these structures on a single chip,” Portela says.

Punctures and cracks

The team printed suspended metamaterials that resembled intricate honeycomb-like cross-sections. Each material was printed with a specific three-dimensional microscopic architecture, such as a precise scaffold of repeating octets, or more faceted polygons. Each repeated unit measured as small as a red blood cell. The resulting metamaterials were thinner than the width of a human hair.

The researchers then tested each metamaterial’s impact resilience by firing glass microparticles toward the structures, at speeds of up to 900 meters per second (more than 2,000 miles per hour) — well within the supersonic range. They caught each impact on camera and studied the resulting images, frame by frame, to see how the projectiles penetrated each material. Next, they examined the materials under a microscope and compared each impact’s physical aftermath.

“In the architected materials, we saw this morphology of small cylindrical craters after impact,” Portela says. “But in solid materials, we saw a lot of radial cracks and bigger chunks of material that were gouged out.”

Overall, the team observed that the fired particles created small punctures in the latticed metamaterials, and the materials nevertheless stayed intact. In contrast, when the same particles were fired at the same speeds into solid, nonlatticed materials of equal mass, they created large cracks that quickly spread, causing the material to crumble. The microstructured materials, therefore, were more effective both at resisting supersonic impacts and at protecting against multiple impact events. In particular, materials printed with the repeating octets appeared to be the hardiest.

“At the same velocity, we see the octet architecture is harder to fracture, meaning that the metamaterial, per unit mass, can withstand impacts up to twice as much as the bulk material,” Portela says. “This tells us that there are some architectures that can make a material tougher, which can offer better impact protection.”

Going forward, the team plans to use the new rapid testing and analysis method to identify new metamaterial designs, in hopes of tagging architectures that can be scaled up to stronger and lighter protective gear, garments, coatings, and paneling.

“What I’m most excited about is showing we can do a lot of these extreme experiments on a benchtop,” Portela says. “This will significantly accelerate the rate at which we can validate new, high-performing, resilient materials.”

This work was funded, in part, by DEVCOM ARL Army Research Office through the MIT Institute for Soldier Nanotechnologies.

Astronomers spot 18 black holes gobbling up nearby stars

Mon, 01/29/2024 - 3:00pm

Star-shredding black holes are everywhere in the sky if you just know how to look for them. That’s one message from a new study by MIT scientists, appearing today in the Astrophysical Journal.

The study’s authors are reporting the discovery of 18 new tidal disruption events (TDEs) — extreme instances when a nearby star is tidally drawn into a black hole and ripped to shreds. As the black hole feasts, it gives off an enormous burst of energy across the electromagnetic spectrum.

Astronomers have previously detected tidal disruption events by looking for characteristic bursts in the optical and X-ray bands. To date, these searches have revealed about a dozen star-shredding events in the nearby universe. The MIT team’s new TDEs more than double the catalog of known TDEs in the universe.

The researchers spotted these previously “hidden” events by looking in an unconventional band: infrared. In addition to giving off optical and X-ray bursts, TDEs can generate infrared radiation, particularly in “dusty” galaxies, where a central black hole is enshrouded with galactic debris. The dust in these galaxies normally absorbs and obscures optical and X-ray light, and any sign of TDEs in these bands. In the process, the dust also heats up, producing infrared radiation that is detectable. The team found that infrared emissions, therefore, can serve as a sign of tidal disruption events.
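A quick back-of-the-envelope check (not from the study itself) shows why heated dust glows in the infrared: by Wien's displacement law, a blackbody at the roughly 1,000-kelvin dust temperature the team cites peaks at a wavelength of about 3 micrometers:

```python
# Wien's displacement law: peak emission wavelength of a blackbody
# at the dust temperature cited in the article (~1,000 K).
WIEN_B = 2.898e-3          # Wien displacement constant, in m*K
T_dust = 1000.0            # dust temperature in kelvins

peak_wavelength_m = WIEN_B / T_dust
peak_wavelength_um = peak_wavelength_m * 1e6
print(f"peak emission wavelength: {peak_wavelength_um:.1f} micrometers")
# ~2.9 micrometers, squarely in the infrared
```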

By looking in the infrared band, the MIT team picked out many more TDEs, in galaxies where such events were previously hidden. The 18 new events occurred in different types of galaxies, scattered across the sky.

“The majority of these sources don’t show up in optical bands,” says lead author Megan Masterson, a graduate student in MIT’s Kavli Institute for Astrophysics and Space Research. “If you want to understand TDEs as a whole and use them to probe supermassive black hole demographics, you need to look in the infrared band.”

Other MIT authors include Kishalay De, Christos Panagiotou, Anna-Christina Eilers, Danielle Frostig, and Robert Simcoe, and MIT assistant professor of physics Erin Kara, along with collaborators from multiple institutions including the Max Planck Institute for Extraterrestrial Physics in Germany.

Heat spike

The team recently detected the closest TDE yet, by searching through infrared observations. The discovery opened a new, infrared-based route by which astronomers can search for actively feeding black holes.

That first detection spurred the group to comb for more TDEs. For their new study, the researchers searched through archival observations taken by NEOWISE — the renewed version of NASA’s Wide-field Infrared Survey Explorer. This satellite telescope launched in 2009 and, after a brief hiatus, has continued to scan the entire sky for infrared “transients,” or brief bursts.

The team looked through the mission’s archived observations using an algorithm developed by co-author Kishalay De. This algorithm picks out patterns in infrared emissions that are likely signs of a transient burst of infrared radiation. The team then cross-referenced the flagged transients with a catalog of all known nearby galaxies within 200 megaparsecs, or 600 million light years. They found that infrared transients could be traced to about 1,000 galaxies.
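The cross-referencing step can be sketched as an angular cross-match: a flagged transient is kept if some catalog galaxy lies within a small angular radius of it. The sketch below uses toy coordinates and an illustrative 1-arcsecond tolerance, not the authors' actual pipeline or matching radius:

```python
import numpy as np

# Illustrative cross-match of flagged infrared transients against a
# nearby-galaxy catalog (toy data, not the study's pipeline).
def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees (haversine formula)."""
    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
    a = (np.sin((dec2 - dec1) / 2) ** 2
         + np.cos(dec1) * np.cos(dec2) * np.sin((ra2 - ra1) / 2) ** 2)
    return np.degrees(2 * np.arcsin(np.sqrt(a)))

# Toy (RA, Dec) positions in degrees for transients and catalog galaxies
transients = np.array([[150.10, 2.20], [210.00, -5.00]])
galaxies = np.array([[150.1001, 2.2001], [30.00, 40.00]])

match_radius = 1.0 / 3600.0  # 1 arcsecond, an illustrative tolerance
matched = [
    any(angular_sep_deg(tr, td, gr, gd) < match_radius for gr, gd in galaxies)
    for tr, td in transients
]
print(matched)  # [True, False]: only the first transient has a host galaxy
```

In practice, libraries such as Astropy provide optimized catalog matching, but the idea is the same: only transients with a nearby-galaxy counterpart survive to the next filtering stage.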

They then zoomed in on the signal of each galaxy’s infrared burst to determine whether the signal arose from a source other than a TDE, such as an active galactic nucleus or a supernova. After ruling out these possibilities, the team then analyzed the remaining signals, looking for an infrared pattern that is characteristic of a TDE — namely, a sharp spike followed by a gradual dip, reflecting a process by which a black hole, in ripping apart a star, suddenly heats up the surrounding dust to about 1,000 kelvins before gradually cooling down.
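The sharp-spike-then-gradual-dip signature can be illustrated with a toy classifier. The criteria and thresholds here are invented for illustration, not the study's actual selection cuts: flag a light curve whose rise to peak is much faster than its subsequent monotonic decline.

```python
import numpy as np

def looks_like_tde(flux, rise_decay_ratio=0.3, spike_factor=2.0):
    """Flag a light curve with a sharp rise to a pronounced peak followed
    by a steady decline, a toy stand-in for a TDE's infrared signature."""
    flux = np.asarray(flux, dtype=float)
    peak = int(np.argmax(flux))        # samples spent rising
    decay = len(flux) - peak - 1       # samples spent declining
    if decay == 0 or flux[peak] < spike_factor * np.median(flux):
        return False                   # no pronounced spike above baseline
    declines = np.all(np.diff(flux[peak:]) <= 0)
    return bool(peak < rise_decay_ratio * decay and declines)

t = np.arange(60.0)
tde_like = np.where(t < 5, t / 5.0, np.exp(-(t - 5) / 20.0))  # fast rise, slow fade
flat = np.ones_like(t)                                        # steady source

print(looks_like_tde(tde_like), looks_like_tde(flat))  # True False
```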

This analysis revealed 18 “clean” signals of tidal disruption events. The researchers took a survey of the galaxies in which each TDE was found, and saw that they occurred in a range of systems, including dusty galaxies, across the entire sky.

“If you looked up in the sky and saw a bunch of galaxies, the TDEs would occur representatively in all of them,” Masterson says. “It’s not that they’re only occurring in one type of galaxy, as people thought based only on optical and X-ray searches.”

“It is now possible to peer through the dust and complete the census of nearby TDEs,” says Edo Berger, professor of astronomy at Harvard University, who was not involved with the study. “A particularly exciting aspect of this work is the potential of follow-up studies with large infrared surveys, and I’m excited to see what discoveries they will yield.”

A dusty solution

The team’s discoveries help to resolve some major questions in the study of tidal disruption events. For instance, prior to this work, astronomers had mostly seen TDEs in one type of galaxy — a “post-starburst” system that had previously been a star-forming factory, but has since settled. This galaxy type is rare, and astronomers were puzzled as to why TDEs seemed to be popping up only in these rarer systems. It so happens that these systems are also relatively devoid of dust, making a TDE’s optical or X-ray emissions naturally easier to detect.

Now, by looking in the infrared band, astronomers are able to see TDEs in many more galaxies. The team’s new results show that black holes can devour stars in a range of galaxies, not only post-starburst systems.

The findings also resolve a “missing energy” problem. Physicists have theoretically predicted that TDEs should radiate more energy than has actually been observed. But the MIT team now says that dust may explain the discrepancy. They found that if a TDE occurs in a dusty galaxy, the dust itself could absorb not only optical and X-ray emissions but also extreme ultraviolet radiation, in an amount equivalent to the presumed “missing energy.”

The 18 new detections are also helping astronomers estimate the rate at which TDEs occur in a given galaxy. Factoring the new TDEs in with previous detections, they estimate that a galaxy experiences a tidal disruption event once every 50,000 years. This rate comes closer to physicists’ theoretical predictions. With more infrared observations, the team hopes to pin down the rate of TDEs, and the properties of the black holes that power them.

“People were coming up with very exotic solutions to these puzzles, and now we’ve come to the point where we can resolve all of them,” Kara says. “This gives us confidence that we don’t need all this exotic physics to explain what we’re seeing. And we have a better handle on the mechanics behind how a star gets ripped apart and gobbled up by a black hole. We’re understanding these systems better.”

This research was supported, in part, by NASA.

Opening the doorway to drawing

Sun, 01/28/2024 - 12:00am

On the first Friday in November, the students of 21A.513 (Drawing Human Experience) were greeted by two unfamiliar figures: a bespectacled monkey holding a heart-shaped message (“I’m so glad you are here”) and the person who drew that monkey on the whiteboard: award-winning cartoonist and educator Lynda Barry, whose “Picture This” was a central text on the new interdisciplinary course’s syllabus.

As the afternoon’s guest speaker, Barry welcomed each arrival, her long gray braids swinging, pens dangling from her neck. Within minutes, she had everyone — even the course’s instructors, anthropologist Graham Jones and visual artist Seth Riskin — settled around tables with their eyes closed, drawing giraffes.

When Barry asked participants to open their eyes and hold up their giraffes, the room filled with laughter over the menagerie of stubby legs, irregular necks, and erratic spots.

“It came out better than I thought!” one student exclaimed.

“Watching people draw with their eyes closed is fantastic,” Barry beamed. “It’s like being in the room with everyone dreaming.”

“Picture This” contends that everyone can draw; children do it unselfconsciously up to a certain age, Barry writes, but all too often conventional qualms put a stop to this expressive and deeply human practice. Jones saw evidence of this when the class convened in September.

“When we went around the room asking students what they wanted to get out of the class, about two-thirds said something like ‘I used to make art, but I don’t have time to do it anymore,’ or ‘I didn’t feel like I was good enough at it,’” he recalls. “For some students, we’ve been opening up a doorway to a set of experiences that’s been shut for a long time.” 

Senior Charles Williams, a computer engineering major, counts himself among that group. “This class breathes back into you the creative and artistic expression that is too often lost as we grow up and mature,” he says.

What it means to be human

Newly offered last fall, Drawing Human Experience was supported by a cross-disciplinary class development grant from the MIT Center for Art, Science & Technology (CAST). It was co-presented by MIT Anthropology and the MIT Museum Studio and Compton Gallery.

It is the second CAST grant shared by Jones and Riskin. In 2019 they co-taught 21A.S01 (Paranormal Machines), which explored how humans can use interactive technologies to create experiences beyond everyday life. That course left them eager to delve further into the intersection of their disciplines at the most essential level.

“Drawing is deceptively simple,” Jones notes. “You can do extraordinarily complicated things with the kind of media that everybody has immediately at hand.”

The course’s syllabus opens with a declaration — “We do not accept distinctions between ‘good’ and ‘bad’ drawing” — and a hint of what students would work toward: “We draw to give our inner world outer form, to create a zone of communication between both us and ourselves, and ourselves and others.”

The course bases students’ grades on their sincere investment in investigating that zone of communication — developing their own visual language along the way — rather than a mastery of photorealistic representation.

“The difference between an ordinary drawing class and this class is that it puts the quality of mind before technical skills,” says Riskin, manager of the MIT Museum Studio and Compton Gallery (where the course met), as well as co-instructor of a long-running class on vision in art and neuroscience.

Jones is a professor of anthropology who researches how people use language and other media to perform and interact.

“On the deepest level, anthropology asks the question ‘what does it mean to be human?’” he says. “What we’re trying to do in this class is allow the students to ask this fundamental anthropological question by going very deeply into their own experience.”

The instructors divided the course into three modules: abstraction, figuration, and diagrams. In the third unit, Jones lectured on the use of diagrams in anthropology to visualize complex social structures such as kinship and gift-giving networks. “Diagrams organize thinking,” Jones told the class, “and they organize people around that thought process. They’re one of the most profound inventions in human history.”

While diagrams the students encounter elsewhere in their studies might aim for the precise presentation of facts, Riskin urged them to consider the term more expansively. “Ambiguity is a very powerful vehicle in art,” he reminded them. “If there’s not ambiguity, maybe it’s not art anymore because there’s no role for the viewer’s imagination.”

Students discussed the work of artist Christine Sun Kim, who uses infographics for social commentary, as in her series of pie charts on “Deaf Rage” that have been exhibited at the MIT List Visual Arts Center and internationally. Then they partnered up for an exercise, documenting their changing relationship with a classmate before, during, and after a getting-to-know-you conversation. The resulting diagrams resembled swirling plasma, mushrooms releasing spores, spiky plants emerging from seeds — nary an x- or y-axis in sight.

The essence of drawing

Between classes, students completed “D-Sets” (drawing-based problem sets) in the hardbound sketchbooks they’d received at the start of the semester. D-Set number four, for example, had them practice gesture drawing — employing rapid, broad strokes — while observing passersby in a public space. The goal, explains Riskin, was “not to capture the many details that accurately represent the human form, but rather in two or three seconds to capture the whole, the gestalt, of a human figure.” He and Jones designed several such exercises for “training immediacy” — an antidote to the self-critical, goal-oriented attitudes that turn many adults away from the act of drawing.

“The assignments helped me think more about drawing to convey, rather than to represent,” says junior Jaclyn Thi, a computer science and engineering major. “They made drawing much more enjoyable overall.”

While students were urged not to overthink the process of putting marks on paper, class meetings provided a social, supportive space to reflect on the results.

“The second class was kind of a shock,” Jones remembers. “We had come up with this whole plan about how they were going to exchange their sketchbooks, and we had prepared all of these prompts. But as soon as I said, ‘OK, turn to somebody next to you who hasn’t seen your work,’ the room immediately erupted in conversation. They talked for half an hour about the drawings, and we had to cut it off. It was like the floodgates opened.”

During a peer feedback session in week six, the students clustered around the studio’s plain wooden tables, which had been pushed together to form a large surface. They gazed down at nearly two dozen sketchbooks splayed open to the latest D-Set: gesture drawings conveying emotional connections to important figures in their lives. Some pages were covered in thick, moody smudges, while others crawled with wispy lines, and a few clean white pages bore only a few bold marks.

Several students singled out a classmate’s drawing, remarking how its confident charcoal strokes — suggesting short hair, glasses, the slight curve of a smile — managed to evoke a sense of lightness and joy. Riskin addressed the artist: “Maybe the drawing surprised you a bit because it was easy? You were just with the person and the drawing came out as an expression of that,” he guessed, eliciting a nod of recognition. “That, to me, is the essence of drawing.”

Sophomore Kanna Pichappan, a brain and cognitive sciences major and anthropology minor, looks back on that assignment as one of her most challenging. “I chose to depict Goddess Durga, a deity from the Hindu tradition who motivates me to live with courage, inner strength, and a commitment to righteousness,” says Pichappan. “I was apprehensive that my drawing might not turn out the way I envisioned. Until this class, I hadn’t even realized that a depiction of a figure’s form is not the same as experiencing the feelings the figure inspires. That D-Set helped me establish new habits: drawing what I feel, rather than what something should look like.”

Weeks later, when choosing a subject for her final project, Pichappan decided to return to exploring the goddess’s role in her life “from a place of creativity and freedom.” The course, she says, made that possible: “It helped me shift from the belief that art is created to represent something, towards the understanding that drawing can be a powerful way of deepening and enriching our understanding of our personal human life experiences.”

School of Engineering fourth quarter 2024 awards

Fri, 01/26/2024 - 2:20pm

Faculty and researchers across MIT’s School of Engineering receive many awards in recognition of their scholarship, service, and overall excellence. The School of Engineering periodically recognizes their achievements by highlighting the honors, prizes, and medals won by faculty and research scientists working in our academic departments, labs, and centers.

Susan Solomon wins VinFuture Award for Female Innovators

Fri, 01/26/2024 - 10:00am

Lee and Geraldine Martin Professor of Environmental Studies Susan Solomon has been awarded the 2023 VinFuture Award for Female Innovators. Solomon was picked out of almost 1,400 international nominations across four categories for “The discovery of the ozone depletion mechanism in Antarctica, contributing to the establishment of the Montreal Protocol.” The award, which comes with a $500,000 prize, highlights outstanding female researchers and innovators who can serve as role models for aspiring scientists.

“I'm tremendously humbled by that, and I'll do my best to live up to it,” says Solomon, who attended the ceremony in Hanoi, Vietnam, on Dec. 20.

The VinFuture Awards are given annually to “honor scientific research and breakthrough technological innovations that can make a significant difference,” according to the foundation’s website. In addition to Female Innovators, the award has two other special categories, Innovators from Developing Countries and Innovators with Outstanding Achievements in Emerging Fields, as well as an overall grand prize. The awards have been given out by the Vietnam-based VinFuture Foundation since 2021.

“Countries all around the world are part of scientific progress and innovation, and that a developing country is honoring that is really very lovely,” says Solomon, whose career as an atmospheric chemist has brought her onto the international stage and has shown her firsthand how important developing countries are in crafting global policy.

In 1986 Solomon led an expedition of 16 scientists to Antarctica to measure the degradation of the ozone layer; she was the only woman on the team. She and her collaborators were able to figure out the atmospheric chemistry of chlorofluorocarbons and other similar chemicals that are now known as ozone-depleting substances. This work became foundational to the creation of the Montreal Protocol, an international agreement that banned damaging chemicals and has allowed the ozone layer to recover.

Solomon joined the MIT faculty in 2012 and holds joint appointments in the departments of Chemistry and Earth, Atmospheric and Planetary Sciences. The success of the Montreal Protocol demonstrates that international cooperation can produce effective environmental agreements; Solomon sees it as a blueprint for crafting further policy when it comes to addressing global climate change.

“Women can do anything, even help save the ozone layer and solve other environmental problems,” she says. “Today's problem of climate change is for all of us to be involved in solving.”

Study: Stars travel more slowly at Milky Way’s edge

Fri, 01/26/2024 - 12:00am

By clocking the speed of stars throughout the Milky Way galaxy, MIT physicists have found that stars farther out in the galactic disk are traveling more slowly than expected compared to stars that are closer to the galaxy’s center. The findings raise a surprising possibility: The Milky Way’s gravitational core may be lighter in mass, and contain less dark matter, than previously thought.

The new results are based on the team’s analysis of data taken by the Gaia and APOGEE instruments. Gaia is an orbiting space telescope that tracks the precise location, distance, and motion of more than 1 billion stars throughout the Milky Way galaxy, while APOGEE is a ground-based survey. The physicists analyzed Gaia’s measurements of more than 33,000 stars, including some of the farthest stars in the galaxy, and determined each star’s “circular velocity,” or how fast a star is circling in the galactic disk, given the star’s distance from the galaxy’s center.

The scientists plotted each star’s velocity against its distance to generate a rotation curve — a standard graph in astronomy that represents how fast matter rotates at a given distance from the center of a galaxy. The shape of this curve can give scientists an idea of how much visible and dark matter is distributed throughout a galaxy.
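The binning-and-plotting step the article describes can be sketched with toy numbers. Everything below is illustrative: the radii, velocities, scatter, and imposed downturn are fabricated stand-ins for the real Gaia/APOGEE measurements, not the study's data.

```python
import numpy as np

# Toy stellar sample (NOT real data): radii in kiloparsecs, velocities in km/s.
rng = np.random.default_rng(0)
r = rng.uniform(2, 30, 1000)            # galactocentric radius of each star
v = np.full_like(r, 220.0)              # flat rotation curve at ~220 km/s
v[r > 25] -= 15 * (r[r > 25] - 25)      # imposed downturn beyond 25 kpc
v += rng.normal(0, 10, r.size)          # measurement scatter

# Build a rotation curve: bin stars by radius and take the median
# circular velocity in each bin (median velocity vs. bin edge).
edges = np.arange(2, 32, 2)
idx = np.digitize(r, edges)
curve = {int(edges[i - 1]): float(np.median(v[idx == i]))
         for i in range(1, len(edges)) if np.any(idx == i)}
```

Plotting `curve` (velocity against radius) reproduces the qualitative shape the researchers describe: flat through the inner bins, then dipping in the outermost ones.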

“What we were really surprised to see was that this curve remained flat, flat, flat out to a certain distance, and then it started tanking,” says Lina Necib, assistant professor of physics at MIT. “This means the outer stars are rotating a little slower than expected, which is a very surprising result.”

The team translated the new rotation curve into a distribution of dark matter that could explain the outer stars’ slow-down, and found the resulting map produced a lighter galactic core than expected. That is, the center of the Milky Way may be less dense, with less dark matter, than scientists have thought.

“This puts this result in tension with other measurements,” Necib says. “There is something fishy going on somewhere, and it’s really exciting to figure out where that is, to really have a coherent picture of the Milky Way.”

The team reports its results this month in the Monthly Notices of the Royal Astronomical Society. The study’s MIT co-authors, including Necib, are first author Xiaowei Ou, Anna-Christina Eilers, and Anna Frebel.

“In the nothingness”

Like most galaxies in the universe, the Milky Way spins like water in a whirlpool, and its rotation is driven, in part, by all the matter that swirls within its disk. In the 1970s, astronomer Vera Rubin was the first to observe that galaxies rotate in ways that cannot be driven purely by visible matter. She and her colleagues measured the circular velocity of stars and found that the resulting rotation curves were surprisingly flat. That is, the velocity of stars remained the same throughout a galaxy, rather than dropping off with distance. They concluded that some other type of invisible matter must be acting on distant stars to give them an added push.

Rubin’s work in rotation curves was one of the first strong pieces of evidence for the existence of dark matter — an invisible, unknown entity that is estimated to outweigh all the stars and other visible matter in the universe.

Since then, astronomers have observed similar flat curves in far-off galaxies, further supporting dark matter’s presence. Only recently have astronomers attempted to chart the rotation curve in our own galaxy with stars.

“It turns out it’s harder to measure a rotation curve when you’re sitting inside a galaxy,” Ou notes.

In 2019, Anna-Christina Eilers, assistant professor of physics at MIT, worked to chart the Milky Way’s rotation curve, using an earlier batch of data released by the Gaia satellite. That data release included stars as far out as 25 kiloparsecs, or about 81,000 light years, from the galaxy’s center.

Based on these data, Eilers observed that the Milky Way’s rotation curve appeared to be flat, albeit with a mild decline, similar to other far-off galaxies, and inferred that the galaxy likely bore a high density of dark matter at its core. But this view shifted when the telescope released a new batch of data, this time including stars as far out as 30 kiloparsecs — almost 100,000 light years from the galaxy’s core.

“At these distances, we’re right at the edge of the galaxy where stars start to peter out,” Frebel says. “No one had explored how matter moves around in this outer galaxy, where we’re really in the nothingness.”

Weird tension

Frebel, Necib, Ou, and Eilers jumped on Gaia’s new data, looking to expand on Eilers’ initial rotation curve. To refine their analysis, the team complemented Gaia’s data with measurements by APOGEE — the Apache Point Observatory Galactic Evolution Experiment, which measures extremely detailed properties of more than 700,000 stars in the Milky Way, such as their brightness, temperature, and elemental composition.

“We feed all this information into an algorithm to try to learn connections that can then give us better estimates of a star’s distance,” Ou explains. “That’s how we can push out to farther distances.”

The team established the precise distances for more than 33,000 stars and used these measurements to generate a three-dimensional map of the stars scattered across the Milky Way out to about 30 kiloparsecs. They then incorporated this map into a model of circular velocity, to simulate how fast any one star must be traveling, given the distribution of all the other stars in the galaxy. They then plotted each star’s velocity and distance on a chart to produce an updated rotation curve of the Milky Way.

“That’s where the weirdness came in,” Necib says.

Instead of the mild decline seen in previous rotation curves, the new curve dipped more strongly than expected at the outer end. This unexpected downturn suggests that while stars travel just as fast out to a certain distance, those at the farthest reaches of the galaxy slow down markedly.

When the team translated this rotation curve to the amount of dark matter that must exist throughout the galaxy, they found that the Milky Way’s core may contain less dark matter than previously estimated.

“This result is in tension with other measurements,” Necib says. “Really understanding this result will have deep repercussions. This might lead to more hidden masses just beyond the edge of the galactic disk, or a reconsideration of the state of equilibrium of our galaxy. We seek to find these answers in upcoming work, using high resolution simulations of Milky Way-like galaxies."

This research was funded, in part, by the National Science Foundation.

Entrepreneur creates career pathways with MIT OpenCourseWare

Thu, 01/25/2024 - 2:10pm

When June Odongo interviewed early-career electrical engineer Cynthia Wacheke for a software engineering position at her company, Wacheke lacked knowledge of computer science theory but showed potential in complex problem-solving.

Determined to give Wacheke a shot, Odongo turned to MIT OpenCourseWare to create a six-month “bridging course” modeled after the classes she once took as a computer science student. Part of MIT Open Learning, OpenCourseWare offers free, online, open educational resources from more than 2,500 courses that span the MIT undergraduate and graduate curriculum. 

“Wacheke had the potential and interest to do the work that needed to be done, so the way to solve this was for me to literally create a path for her to get that work done,” says Odongo, founder and CEO of Senga Technologies. 

Developers, Odongo says, are not easy to find. The OpenCourseWare educational resources provided a way to close that gap. “We put Wacheke through the course last year, and she is so impressive,” Odongo says. “Right now, she is doing our first machine learning models. It’s insane how good of a team member she is. She has done so much in such a short time.”

Making high-quality candidates job-ready

Wacheke, who holds a bachelor’s degree in electrical engineering from the University of Nairobi, started her professional career as a hardware engineer. She discovered a passion for software while working on a dashboard design project, and decided to pivot from hardware to software engineering. That’s when she discovered Senga Technologies, a logistics software and services company in Kenya catering to businesses that ship in Africa. 

Odongo founded Senga with the goal of simplifying and easing the supply chain and logistics experience, from the movement of goods to software tools. Senga’s ultimate goal, Odongo says, is to have most of their services driven by software. That means employees — and candidates — need to be able to think through complex problems using computer science theory.

“A lot of people are focused on programming, but we care less about programming and more about problem-solving,” says Odongo, who received a bachelor’s degree in computer science from the University of Massachusetts at Lowell and an MBA from Harvard Business School. “We actually apply the things people learn in computer science programs.”

Wacheke started the bridging course in June 2022 and was given six months to complete the curriculum on the MIT OpenCourseWare website. She took nine courses: Introduction to Algorithms; Mathematics for Computer Science; Design and Analysis of Algorithms; Elements of Software Construction; Automata, Computability, and Complexity; Database Systems; Principles of Autonomy and Decision Making; Introduction to Machine Learning; and Networks.

“The bridging course helped me learn how to think through things,” Wacheke says. “It’s one thing to know how to do something, but it’s another to design that thing from scratch and implement it.”

During the bridging course, Wacheke was paired with a software engineer at Senga, who mentored her and answered questions along the way. She learned Ruby on Rails, a server-side web application framework under the MIT License. Wacheke also completed other projects to complement the theory she was learning. She created a new website that included an integration to channel external requests to Slack, a cross-platform team communication tool used by the company’s employees.

Continuous learning for team members

The bridging course concluded with a presentation to Senga employees, during which Wacheke explained how the company could use graph theory for decision-making. “If you want to get from point A to B, there are algorithms you can use to find the shortest path,” Wacheke says. “Since we’re a logistics company, I thought we could use this when we’re deciding which routes our trucks take.”
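The shortest-path idea Wacheke describes is classically solved with Dijkstra's algorithm. A minimal sketch follows; the route network and distances are hypothetical, invented purely for illustration, not Senga's actual data.

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm over a weighted graph given as
    {node: [(neighbor, weight), ...]}. Returns (cost, path)."""
    pq = [(0, start, [start])]   # priority queue ordered by cumulative cost
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, w in graph.get(node, []):
            if nbr not in seen:
                heapq.heappush(pq, (cost + w, nbr, path + [nbr]))
    return float("inf"), []

# Hypothetical road network: approximate distances in km between depots.
routes = {
    "Nairobi": [("Nakuru", 160), ("Machakos", 65)],
    "Nakuru": [("Eldoret", 155)],
    "Machakos": [("Mombasa", 420)],
    "Eldoret": [("Kisumu", 120)],
}
cost, path = shortest_path(routes, "Nairobi", "Kisumu")
# cost = 160 + 155 + 120 = 435 km via Nakuru and Eldoret
```

In a real routing application the weights could be travel times or fuel costs rather than distances; the algorithm is unchanged.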

The presentation, which is the final requirement for the bridging course, is also a professional development opportunity for Senga employees. “This process is helpful for our team members, particularly those who have been out of school for a while,” Odongo says. “The candidates present what they’ve learned in relation to Senga. It’s a way of doing continuous learning for the existing team members.”

After successfully completing the bridging course in November 2022, Wacheke transitioned to a full-time software engineer role. She is currently developing a “machine” that can interpret and categorize hundreds of documents, including delivery notes, cash flows, and receipts.

“The goal is to enable our customers to simply feed those documents into our machine, and then we can more accurately read and convert them to digital formats to drive automation,” Odongo says. “The machine will also enable someone to ask a document a question, such as ‘What did I deliver to retailer X on date Y?’ or ‘What is the total price of the goods delivered?’”

The bridging course, which was initially custom-designed for Wacheke, is now a permanent program at Senga. A second team member completed the course in October 2023 and has joined the software team full time. 

“Developers are not easy to find, and you also want high-quality developers,” Odongo says. “At least when we do this, we know that the person has gone through what we need.”

Performance art and science collide as students experience “Blue Man Group”

Thu, 01/25/2024 - 1:10pm

On a blustery December afternoon, with final exams and winter break on the horizon, the 500 undergraduate students enrolled in Professor Bradley Pentelute’s class 5.111 (Principles of Chemical Science) were treated to an afternoon at the theater — a performance of “Blue Man Group” at Boston’s Charles Playhouse — courtesy of Pentelute and the MIT Office of the First Year.

Theatrical thrills aside, it was Blue Man Group’s practical application of chemical principles that inspired Pentelute to initiate and fund this excursion. The MIT Office of the First Year collaborated with him, funding 300 of the tickets and T passes for all, to give first-year students an opportunity to interact with one another outside the classroom.

“By observing the use of specialized paints and materials in the show, students gain a deeper understanding of how chemistry intersects with creative expression,” says Pentelute. “This unique experience is inspired by our discussions on the chemistry of pigments and the role of chemistry in everyday life, aiming to bridge theoretical knowledge with real-world applications. The visit served as an engaging opportunity to enhance [the group’s] learning and foster a sense of community within our class.”

A fixture in Boston’s theater district since 1995, “Blue Man Group” is a euphoric, multi-sensory performance featuring three silent “Blue Men” who interact with the audience and one another not with words, but with art, music, comedy, and non-verbal communication. The characters are other-worldly in their innocence, appearing mystified by the audience and the most commonplace of objects. No two performances are completely alike, as the Blue Men pull members of the audience on stage, make music with instruments fashioned out of construction and plumbing materials, and, perhaps most notably, play drums covered in liquid paint that splashes all over everything — and everyone — in what is known as the Poncho Zone.

The Charles Playhouse has a capacity of 500 seats, so the audience of this particular show was made up entirely of MIT undergraduate students — any tickets not used by 5.111 students were offered to first-generation first-year students. The experience proved to be a vivid demonstration of general chemistry concepts in action, and of undergraduate camaraderie.

Catherine Hazard, a Department of Chemistry graduate student and the teaching assistant for 5.111, was one of the many attendees thrilled to see science in action at the theater.

“The use of brightly colored oil paints, a hallmark of the show, was a direct representation of chemical structures and crystal field theory concepts covered in class,” explains Hazard. “We learned how energy splitting of d orbitals influences color of varying inorganic transition metal complexes, as well as how chemicals such as waxes, resins, polymers, and stabilizers give the oil paint the proper consistency for the performance. The event was a fun culmination of the lessons learned just before heading into a week of finals.”
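The d-orbital splitting Hazard mentions can be related to color with the textbook relation E = hc/λ: the crystal-field splitting energy corresponds to the wavelength a complex absorbs, and the complex appears as the complementary color. The numbers below are generic textbook-style values, not measurements of the show's paints.

```python
# Relate an absorbed wavelength to the d-orbital splitting energy
# (Delta_o) via E = h * c / wavelength. A complex absorbing red light
# (~700 nm) transmits the complement and appears blue-green.
H = 6.626e-34       # Planck constant, J*s
C = 2.998e8         # speed of light, m/s
AVOGADRO = 6.022e23 # particles per mole

wavelength = 700e-9                             # absorbed wavelength, m
delta_o = H * C / wavelength                    # splitting per complex, J
delta_o_kj_mol = delta_o * AVOGADRO / 1e3       # per mole, kJ/mol
# roughly 171 kJ/mol, a typical magnitude for weak-field complexes
```

A larger splitting shifts absorption toward shorter (bluer) wavelengths, which is why changing the ligands around a transition metal changes a pigment's color.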

The goal of the Office of the First Year is to provide excellent services and programs to catalyze student exploration and access to opportunity, and promote the academic success and personal development of undergraduates. Programs and experiences like this one serve to enrich and support undergraduate education at MIT.

Pentelute joined the MIT faculty in 2011. His research group in the Department of Chemistry develops new protein modification chemistries, adapts nature's machines for efficient macromolecule delivery into cells, invents flow technologies for rapid biopolymer production, and discovers peptide binders to proteins.

Unlocking history with geology and genetics

Thu, 01/25/2024 - 12:00am

Fatima Husain grew up in the heart of the Midwest, surrounded by agriculture. “Every time you left your home, you saw fields of corn and soybeans. And it was really quite beautiful,” she says. During elementary school, she developed her own love of gardening and cultivated a small plot in her family’s backyard.

“Having the freedom to make a mess, experiment, and see things grow was very impactful,” says Husain, a fourth-year doctoral candidate in the MIT Department of Earth, Atmospheric and Planetary Sciences (EAPS) and a Hugh Hampton Young Fellow. This experimentation in the garden was the seed that blossomed into her fascination with science. “When you think about gardening and agriculture in Iowa,” she says, “you think about soil and its origins, which led me to geology and geochemistry and all these interdisciplinary fields that play a role in the Earth sciences.”

Husain has maintained that scientific curiosity throughout her academic career. As a graduate student in EAPS’ Program in Geology, Geochemistry, and Geobiology, she studies the fossil and genetic records of ancient and modern life forms to better understand the history of life on Earth. She says, “Twenty years ago, I was a stoked kid working with topsoil in Iowa. Now, I get to work with ancient dirt and sediments to better understand Earth and life’s past.” 

Sharing science

Though Husain loved her environmental science class in high school, when she enrolled at Brown University, she wasn’t sure which STEM major to pursue. Then, a guest lecture in her first-year biology course dispelled any uncertainty. “A professor walked on stage and introduced himself as a biogeochemist, and after that, everything just clicked,” she says. Within weeks of that fateful lecture, she had declared a major in geochemistry. “I’ve never looked back,” she says.

She then immersed herself in her Earth science classes, which applied the core science disciplines she studied to topics such as the oceans, weather and climate, and water quality. “I gained a sincere appreciation for the excellent teaching and writing that helped me access the world of the geosciences,” she says. “And that helped me realize the value in communicating science clearly.”

To hone her writing skills, Husain took nonfiction writing classes as her electives and joined one of the school newspapers. There, she took on the role of science writer and editor. As she neared graduation, she knew that she would eventually pursue geochemistry at the graduate level, but first she wanted to focus on journalism and writing. She reasoned that, if she could formally learn the fundamentals of science writing and reporting, then “I could more effectively share all the science I learned after that point,” she says. With the support of her undergraduate professors, she decided to apply to MIT’s Graduate Program in Science Writing, one of the only such programs in the country.

The program refined Husain’s writing skills and paved the way for her to pursue science journalism opportunities across a variety of media, including print, video, podcasting, and radio. She worked as a writing intern for MIT News during this time, and has written a number of MIT News articles while at MIT. After graduating, she served as a Curiosity Correspondent for the MIT-Nord Anglia Education Collaboration based at the MIT Museum. In that role, she says, “I worked on communicating the amazing science happening here at MIT to K-12 students around the world via educational videos.” Since beginning her PhD studies, Husain has transitioned to a new role in the collaboration — hosting a monthly webinar series called MIT Abstracts, which connects MIT researchers and experts with an international audience of middle schoolers.

Along the way, Husain has also worked as a reporter and managing producer for a Rhode Island-based sustainability science radio show called Possibly. In 2019, she founded a podcast with her colleagues called BIOmarkers, which serves as an oral history project for the discipline of organic geochemistry.

Acquiring the “biggest tool set” possible

After completing her master’s thesis, Husain began to return to her roots in geochemistry. She says, “At some point, when I was interviewing other scientists and they described their experiments, I’d miss being in the lab myself. That feeling helped me realize the time was right to get back into research.” Husain chose to stay at MIT for her PhD. “I couldn’t resist the opportunity to continue working on challenging, interdisciplinary problems within such an exciting environment,” she says. “There really is no other place quite like it.”

She joined the lab group of Roger Summons, the Schlumberger Professor of Geobiology. For her first project as a research assistant, Husain helped then-postdoc Ainara Sistiaga reconstruct the environment of Tanzania’s Olduvai Gorge 1.7 million years into the past, using molecule-scale fossils preserved in archeological sediments. Part of Africa’s Great Rift Valley, the site preserves evidence of ancient hominin tools and activities. The research team’s findings were later published in PNAS.

Under the mentorship of her advisors, Gregory Fournier, an associate professor of geobiology, and Summons, Husain studies both the fossil record and the genetic records of organisms alive today to answer fundamental questions about life’s evolution on Earth. “The farther back into Earth’s history we go, the fewer complete records we have,” Husain says. “To answer the questions that arise, I hope to employ the biggest tool set I can.”

Currently, Husain investigates the biomarkers of microbes living in Antarctic biofilms, which she hopes will provide hints about the types of places where the ancestors of complex life sheltered during global glaciation events through Earth’s Cryogenian period, which stretched from about 720 to 635 million years ago. To do this, Husain applies techniques from chemistry, such as chromatography and mass spectrometry, to isolate and study microbial lipids, the precursors of molecular fossils preserved in the geologic record.

Husain also uses “molecular clocks,” tools which employ the genetic sequences of living organisms to estimate when in evolutionary time different species diverged, to better understand how long ago aerobic respiration arose on Earth. Using the growing databases of publicly available gene sequences, Husain says it’s possible to track the histories of metabolisms that arose billions of years ago in Earth’s past. Much of her research can also be applied to astrobiology, the study of potential life elsewhere in the universe.
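In its simplest strict form, a molecular clock turns observed sequence divergence into a time estimate: if substitutions accumulate at a roughly constant rate along each lineage, the divergence time is the genetic distance divided by twice the rate. The sketch below uses made-up numbers purely to illustrate the arithmetic; real molecular-clock analyses use calibrated, far more sophisticated models.

```python
# Toy strict molecular clock (illustrative numbers only).
# If two lineages differ at a fraction d of aligned sites, and each
# lineage accumulates substitutions at rate r per site per million
# years, the time since their divergence is approximately:
#     t = d / (2 * r)
def divergence_time_my(d, rate_per_my):
    """Estimated divergence time in millions of years."""
    return d / (2 * rate_per_my)

# Assumed values: 2% sequence divergence, 0.001 substitutions
# per site per million years per lineage.
t = divergence_time_my(d=0.02, rate_per_my=1e-3)
# t is about 10 million years
```

In practice the rate itself must be calibrated, often against the fossil record, which is one reason Husain's combined fossil-and-genetic approach is powerful.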

As a PhD student, Husain has also had the opportunity to serve as teaching assistant for 12.885 (Science, Politics, and Environmental Policy) for two semesters. In that role, she says, “My goal is to help students improve their writing skills so that they are equipped to successfully communicate about important issues in science and policy in the future.”

Looking ahead, Husain hopes to continue applying both her science and communication skills to challenging problems related to Earth and the environment. Along the way, she knows that she wants to share the opportunities that she had with others. “Whichever form it takes,” she says, “I hope to play a role in cultivating the same types of supportive environments which have led me here.”  

Researchers demonstrate rapid 3D printing with liquid metal

Thu, 01/25/2024 - 12:00am

MIT researchers have developed an additive manufacturing technique that can print rapidly with liquid metal, producing large-scale parts like table legs and chair frames in a matter of minutes.

Their technique, called liquid metal printing (LMP), involves depositing molten aluminum along a predefined path into a bed of tiny glass beads. The aluminum quickly hardens into a 3D structure.

The researchers say LMP is at least 10 times faster than a comparable metal additive manufacturing process, and the procedure to heat and melt the metal is more efficient than some other methods.

The technique does sacrifice resolution for speed and scale. While it can print components that are larger than those typically made with slower additive techniques, and at a lower cost, it cannot achieve high resolutions.

For instance, parts produced with LMP would be suitable for some applications in architecture, construction, and industrial design, where components of larger structures often don’t require extremely fine details. It could also be utilized effectively for rapid prototyping with recycled or scrap metal.

In a recent study, the researchers demonstrated the procedure by printing aluminum frames and parts for tables and chairs which were strong enough to withstand postprint machining. They showed how components made with LMP could be combined with high-resolution processes and additional materials to create functional furniture.

“This is a completely different direction in how we think about metal manufacturing that has some huge advantages. It has downsides, too. But most of our built world — the things around us like tables, chairs, and buildings — doesn’t need extremely high resolution. Speed and scale, and also repeatability and energy consumption, are all important metrics,” says Skylar Tibbits, associate professor in the Department of Architecture and co-director of the Self-Assembly Lab, who is senior author of a paper introducing LMP.

Tibbits is joined on the paper by lead author Zain Karsan SM ’23, who is now a PhD student at ETH Zurich; as well as Kimball Kaiser SM ’22 and Jared Laucks, a research scientist and lab co-director. The research was presented at the Association for Computer Aided Design in Architecture Conference and recently published in the association’s proceedings.

Significant speedup

One method for printing with metals that is common in construction and architecture, called wire arc additive manufacturing (WAAM), is able to produce large, low-resolution structures, but these can be susceptible to cracking and warping because some portions must be remelted during the printing process.

LMP, on the other hand, keeps the material molten throughout the process, avoiding some of the structural issues caused by remelting.

Drawing on the group’s previous work on rapid liquid printing with rubber, the researchers built a machine that melts aluminum, holds the molten metal, and deposits it through a nozzle at high speeds. Large-scale parts can be printed in just a few seconds, and then the molten aluminum cools in several minutes.

“Our process rate is really high, but it is also very difficult to control. It is more or less like opening a faucet. You have a big volume of material to melt, which takes some time, but once you get that to melt, it is just like opening a tap. That enables us to print these geometries very quickly,” Karsan explains.

The team chose aluminum because it is commonly used in construction and can be recycled cheaply and efficiently.

Bread loaf-sized pieces of aluminum are deposited into an electric furnace, “which is basically like a scaled-up toaster,” Karsan adds. Metal coils inside the furnace heat the metal to 700 degrees Celsius, slightly above aluminum’s 660-degree melting point.

The aluminum is held at a high temperature in a graphite crucible, and then molten material is gravity-fed through a ceramic nozzle into a print bed along a preset path. They found that the larger the amount of aluminum they could melt, the faster the printer could go.

“Molten aluminum will destroy just about everything in its path. We started with stainless steel nozzles and then moved to titanium before we ended up with ceramic. But even ceramic nozzles can clog because the heating is not always entirely uniform in the nozzle tip,” Karsan says.

By injecting the molten material directly into a granular substance, the researchers don’t need to print supports to hold the aluminum structure as it takes shape. 

Perfecting the process

They experimented with a number of materials to fill the print bed, including graphite powders and salt, before selecting 100-micron glass beads. The tiny glass beads, which can withstand the extremely high temperature of molten aluminum, act as a neutral suspension so the metal can cool quickly.

“The glass beads are so fine that they feel like silk in your hand. The powder is so small that it doesn’t really change the surface characteristics of the printed object,” Tibbits says.

The amount of molten material held in the crucible, the depth of the print bed, and the size and shape of the nozzle have the biggest impacts on the geometry of the final object.

For instance, parts of the object with larger diameters are printed first, since the amount of aluminum the nozzle dispenses tapers off as the crucible empties. Changing the depth of the nozzle alters the thickness of the metal structure.

To aid in the LMP process, the researchers developed a numerical model to estimate the amount of material that will be deposited into the print bed at a given time.
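The article does not give the researchers' actual model, but one simple way such a gravity-fed estimate could be framed is Torricelli outflow: the flow speed depends on the height of melt remaining in the crucible, so deposition tapers off as the crucible empties. Everything below, including the nozzle and crucible dimensions, is an assumed illustration, not the team's implementation.

```python
import math

# Hedged sketch (NOT the authors' model): gravity-driven outflow where
# flow speed follows Torricelli's law, v = sqrt(2 * g * h), and the
# volumetric rate is Q = A_nozzle * v for melt height h.
G = 9.81                             # gravitational acceleration, m/s^2
A_NOZZLE = math.pi * 0.002 ** 2      # assumed 4 mm diameter nozzle, m^2
A_CRUCIBLE = 0.01                    # assumed crucible cross-section, m^2

def simulate(h0=0.2, dt=0.01):
    """Integrate melt height and total deposited volume until the
    crucible is effectively empty. Returns (time_s, volume_m3)."""
    h, deposited, t = h0, 0.0, 0.0
    while h > 1e-4:
        q = A_NOZZLE * math.sqrt(2 * G * h)  # volumetric flow, m^3/s
        deposited += q * dt                  # material reaching the bed
        h -= (q / A_CRUCIBLE) * dt           # head drops as melt drains
        t += dt
    return t, deposited

t, deposited = simulate()
```

Consistent with the article's observation, the computed flow rate falls as the crucible empties, which is why wider sections would be printed first.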

Because the nozzle pushes into the glass bead powder, the researchers can’t watch the molten aluminum as it is deposited, so they needed a way to simulate what should be going on at certain points in the printing process, Tibbits explains.

They used LMP to rapidly produce aluminum frames with variable thicknesses, which were durable enough to withstand machining processes like milling and boring. They demonstrated a combination of LMP and these post-processing techniques to make chairs and a table composed of lower-resolution, rapidly printed aluminum parts and other components, like wood pieces.

Moving forward, the researchers want to keep iterating on the machine so they can enable consistent heating in the nozzle to prevent material from sticking, and also achieve better control over the flow of molten material. But larger nozzle diameters can lead to irregular prints, so there are still technical challenges to overcome.

“If we could make this machine something that people could actually use to melt down recycled aluminum and print parts, that would be a game-changer in metal manufacturing. Right now, it is not reliable enough to do that, but that’s the goal,” Tibbits says.

“At Emeco, we come from the world of very analog manufacturing, so seeing the liquid metal printing creating nuanced geometries with the potential for fully structural parts was really compelling,” says Jaye Buchbinder, who leads business development for the furniture company Emeco and was not involved with this work. “The liquid metal printing really walks the line in terms of ability to produce metal parts in custom geometries while maintaining quick turnaround that you don’t normally get in other printing or forming technologies. There is definitely potential for the technology to revolutionize the way metal printing and metal forming are currently handled.”

Additional researchers who worked on this project include Jeremy Bilotti, Bjorn Sparrman, and Schendy Kernizan.

This research was funded, in part, by Aisin Group, Amada Global, and Emeco.

MIT Faculty Founder Initiative announces finalists for second competition

Wed, 01/24/2024 - 4:15pm

The MIT Faculty Founder Initiative has announced 12 finalists for the 2023-24 MIT-Royalty Pharma Prize Competition. The competition, which is supported by Royalty Pharma, aims to support female faculty entrepreneurs in biotechnology and provide them with resources to help take their ideas to commercialization. 

“We are building a playbook to get inventions out of the lab towards impacting patients by connecting female faculty to the innovation ecosystem and creating a community of peers,” says Sangeeta Bhatia, the John J. and Dorothy Wilson Professor of Health Sciences and Technology and of Electrical Engineering and Computer Science (EECS), and faculty director of the MIT Faculty Founder Initiative.

Throughout the academic year, finalists for the prize competition will receive support through a number of events, workshops, and programs. These activities focus on topics ranging from executive education classes in entrepreneurship to intellectual property and fundraising strategy. Participants also have access to over 50 best-in-class executives, investors, and advisors who have volunteered to provide mentorship and guidance to the finalists as they further develop their startup ideas.

This spring, the cohort will pitch their ideas to a selection committee of faculty, biotech founders, and venture capitalists. The grand prize winner will receive $250,000 in discretionary funds, and the breakthrough science award winner and runner-up award winner will each receive $100,000. The winners will be announced at a showcase event on May 2, at which the entire cohort will share their work. All participants also receive a $10,000 stipend for participating in the competition.

“The support the MIT Faculty Founder Initiative provides female entrepreneurs in biotech is tremendous. Participants receive truly invaluable guidance from some of the world’s top experts to help hone their ideas and launch companies that have the potential to make a real impact in the biotech space,” adds Anantha Chandrakasan, dean of the School of Engineering and the Vannevar Bush Professor of Electrical Engineering and Computer Science.  

The MIT Faculty Founder Initiative was launched in 2020 by the MIT School of Engineering, in collaboration with the Martin Trust Center for MIT Entrepreneurship. The idea for the program stemmed from a research project Bhatia conducted alongside Susan Hockfield, MIT Corporation life member, MIT president emerita, and professor of neuroscience, and Nancy Hopkins, professor emerita of biology. The team discovered that of the 250 biotech startups created by MIT professors, fewer than 10 percent had been founded by women, who made up 22 percent of all faculty.

In their research, the team estimated that if female faculty founded startups at the same rate as their male counterparts, there would be 40 more biotech companies.
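The arithmetic behind that estimate can be sketched with illustrative round numbers (the 250 startups, "fewer than 10 percent," and 22 percent figures come from the article; the exact counts below are assumptions, not the study's data):

```python
# Illustrative reconstruction of the team's back-of-envelope estimate.
total_startups = 250          # biotech startups created by MIT professors
share_by_women = 0.10         # "fewer than 10 percent" founded by women
women_faculty_share = 0.22    # women made up 22 percent of all faculty

startups_by_women = total_startups * share_by_women      # 25
startups_by_men = total_startups - startups_by_women     # 225

# If women founded startups at the same per-capita rate as men:
men_rate = startups_by_men / (1 - women_faculty_share)
expected_by_women = men_rate * women_faculty_share       # ~63

missing = expected_by_women - startups_by_women          # ~38, i.e. roughly 40
print(round(missing))
```

With these assumed inputs the gap comes out near 40 additional companies, matching the figure the team cites.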

“What that means is 40 more potential medicines. The societal impact of that is really important. It’s a lost opportunity,” says Bhatia, who co-wrote an editorial in Science alongside Hopkins and Hockfield.

In 2021, the Faculty Founder Initiative launched its first prize competition, which was supported by Northpond Ventures. Nine finalists pitched their ideas, with Ellen Roche, Latham Family Career Development Professor, an associate professor of mechanical engineering, and a core faculty member of the Institute for Medical Engineering and Science (IMES), taking the grand prize. Eight of the nine participants have continued on their entrepreneurial journey.

The second prize competition cohort includes researchers affiliated with MIT as well as Brown University.

“We are thrilled to be supporting the 2023-2024 MIT-Royalty Pharma Prize Competition and this cohort of 12 brilliant researchers. Their ideas can lead to transformative solutions for patients around the world,” says Pablo Legorreta, founder and CEO of Royalty Pharma.

The 2023-24 finalists include:

  • Anne Carpenter, institute scientist at the Broad Institute of MIT and Harvard, serves as the senior director of the Imaging Platform. She is an expert in developing and applying methods for extracting quantitative information from biological images, especially in a high-throughput manner. Her group’s open-source CellProfiler software is used by thousands of biologists worldwide and their Cell Painting assay has been adopted throughout the pharma industry to accelerate drug discovery. Carpenter earned a BS from Purdue University and a PhD from the University of Illinois at Urbana-Champaign.
     
  • Kareen Coulombe, associate professor of engineering, is the director of graduate studies in biomedical engineering at Brown University and leads the Coulombe Lab for Heart Regeneration and Health. She studies cardiac regenerative medicine — from fundamentals of tissue formation and contractility to integration with the host heart — to develop translational therapies for heart disease patients around the world. Coulombe received a BS from the University of Rochester and a PhD from the University of Washington.
     
  • Betar Gallant, Class of 1922 Career Development Professor and associate professor of mechanical engineering, leads the Gallant Energy and Carbon Conversion Lab. Her research focuses on advanced battery chemistries and materials for high-energy rechargeable and primary batteries. She is also developing insights into reaction mechanisms that underpin advanced greenhouse gas mitigation technologies. Gallant received her BS, master’s degree, and PhD from MIT.
     
  • Carolina Haass-Koffler, associate professor of psychiatry and human behavior and associate professor of behavioral and social sciences at Brown University, is the chief of Brown’s Clinical Neuroscience Lab. As a translational investigator, she combines preclinical and clinical research to examine bio-behavioral mechanisms of addiction and develop novel therapeutic interventions. Haass-Koffler received her BS from the University of California at Berkeley, her PharmD from the University of California at San Francisco, and her PhD from Università di Camerino.
     
  • Stephanie Jones is a professor of neuroscience at Brown University. Her research integrates human brain imaging and computational neuroscience methods to study brain dynamics in health and disease. She aims to develop biophysically principled models of neural circuits that bridge electrophysiological measures of brain function to the underlying cellular and network level dynamics. Jones received a BS and master’s degree in mathematics from Boston College, and a PhD in mathematics from Boston University, followed by neuroscience training at Massachusetts General Hospital (MGH). 
     
  • Laura Lewis is the Athinoula A. Martinos Associate Professor of IMES and EECS at MIT, principal investigator in the Research Laboratory of Electronics at MIT, and an associate faculty member at the Martinos Center for Biomedical Imaging at MGH. Lewis focuses on neuroimaging approaches that better map brain function, with a particular focus on sleep. She is developing computational and signal processing approaches for neuroimaging data and applying these tools to study how neural computation is dynamically modulated across sleep, wake, attentional, and affective states. Lewis earned a BS at McGill University and a PhD at MIT.
     
  • Frederike Petzschner is an assistant professor at the Carney Institute for Brain Science at Brown University. She also serves as the director of the Brainstorm Program, an incubator program that accelerates the translation of computational brain science to clinical applications and commercialization. She and her team at the PEAC (Psychiatry, Embodiment, and Computation) Lab study the latent cognitive processes that underpin perception and decision-making in both healthy individuals and those suffering from obsessive-compulsive disorder, addiction, and, most recently, chronic pain. The group recently launched SOMA, a digital tool designed to assist individuals with chronic pain. Petzschner received a BS and MS from the University of Würzburg and a PhD from Ludwig-Maximilians University in Munich.
     
  • Theresa Raimondo is an assistant professor of engineering at Brown University. Her research broadly centers around the design of RNA-lipid nanoparticles (LNPs) for therapeutic applications. By modulating both the RNA molecule (structure and sequence) and the lipid nanoparticle formulation, her team can deliver RNA-LNPs to immune cells in vivo for immunotherapy. In this application, siRNA-LNPs are used as a novel cancer checkpoint inhibitor therapy. Raimondo received a BS from Brown University and an MS and PhD from Harvard University.
     
  • Ritu Raman, the Brit (1961) and Alex (1949) d'Arbeloff Career Development Professor in Engineering Design and assistant professor of mechanical engineering at MIT, designs adaptive living materials powered by assemblies of living cells for applications ranging from medicine to machines. Currently, she is focused on building living neuromuscular tissues to advance understanding of human disease and restore mobility after injury or trauma. Raman received a BS from Cornell University and an MS and PhD as an NSF Fellow from the University of Illinois at Urbana-Champaign.
     
  • Deblina Sarkar, the AT&T Career Development Professor and assistant professor of media arts and sciences at MIT, is the founder and director of the Nano-Cybernetic Biotrek research group. She conducts transdisciplinary research fusing engineering, applied physics, and biology, aiming to bridge the gap between nanotechnology and synthetic biology to develop disruptive technologies for nanoelectronic devices and create new paradigms for life-nanomachine symbiosis. She received a BTech from the Indian Institute of Technology and an MS and PhD from the University of California at Santa Barbara.
     
  • Jessica Stark starts as assistant professor in the departments of Biological Engineering and Chemical Engineering and the Koch Institute for Integrative Cancer Research at MIT this month. She develops biological technologies to realize the largely untapped potential of glycans for immunological discovery and immunotherapy. Stark received a BS from Cornell University and a PhD from Northwestern University.
     
  • Joelle Straehla is a Charles W. (1995) and Jennifer C. Johnson Clinical Investigator at the Koch Institute, a pediatric oncologist at Dana-Farber/Boston Children's Cancer and Blood Disorders Center, and an instructor of pediatrics at Harvard Medical School. She conducts research at the intersection of nanotechnology and systems biology with the ultimate goal of accelerating cancer nanomedicine translation. She received a BS from the University of Florida and an MD from Northwestern University.

Q&A: What sets the recent Japan earthquake apart from others?

Wed, 01/24/2024 - 3:35pm

On Jan. 1, a magnitude 7.6 earthquake struck the western side of Japan on the Noto Peninsula, killing over 200 people. Japan is prone to earthquakes, including a magnitude 9.1 earthquake in 2011 that triggered a tsunami and killed almost 20,000 people.

William Frank, the Victor P. Starr Career Development Professor in the Department of Earth, Atmospheric and Planetary Sciences at MIT, has been studying an earthquake swarm in the region where the most recent earthquake occurred. He explains the difference between subduction earthquakes and earthquake swarms, and why the unknown nature of these swarms makes predictions hard.

Q: Why is Japan prone to earthquakes?

A: Japan is prone to earthquakes because it is at the western edge of the Pacific plate and a more complicated junction where two plates are subducting, or plunging, beneath the tectonic plate that Japan is sitting on. It's at the interface between those plates where you're going to have a lot of earthquakes, because you're generating stress as the plates move past one another.

But interestingly, this earthquake was not due to subduction. It's on the west coast of the island, and the subduction zones are on the east coast. There are still a lot of active tectonics that are not related to subduction. Why there are so many earthquakes in this one place is enigmatic, but there's been an earthquake swarm happening there since 2020. This is the latest big earthquake in that swarm.

Q: What is an earthquake swarm, and how can you tell this earthquake is a part of it?

A: Normally you have the big earthquake, what we call the mainshock, that is followed by a sequence of aftershocks. But in a swarm, there's no clear mainshock because there's a lot of earthquakes before and there's also a lot of earthquakes afterwards. Often there will just be one earthquake that will be bigger than the rest sometime within that swarm duration.

Earthquake swarms are typically around plate boundaries. There's a lot of them in subduction zones, but not only there — there are also earthquake swarms, for example, in Southern California. These can last days, months, years. We call it a swarm because it's generating many more earthquakes than we expect from that region, in sustained activity, for the past few years.
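One crude way to illustrate the distinction Frank describes is a magnitude-gap heuristic: in a mainshock-aftershock sequence, one event clearly dominates, while a swarm has no clear standout. This is an illustrative sketch, not a method from the interview; the 1.0 threshold is an assumption loosely motivated by Båth's law, which says the largest aftershock is typically about 1.2 magnitude units below the mainshock.

```python
# Toy heuristic: compare the largest event to the second largest.
def classify_sequence(magnitudes, gap_threshold=1.0):
    """Label a catalog of event magnitudes as mainshock-aftershock or swarm."""
    ordered = sorted(magnitudes, reverse=True)
    gap = ordered[0] - ordered[1]
    return "mainshock-aftershock" if gap >= gap_threshold else "swarm"

print(classify_sequence([7.6, 6.1, 5.8, 5.5]))       # one dominant event
print(classify_sequence([5.0, 4.9, 5.1, 4.8, 5.2]))  # no clear mainshock
```

Real classification also uses timing and spatial migration of events, which this magnitude-only sketch ignores.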

Q: How can you tell a swarm from general seismic activity in the region?

A: It's not obvious; it'll take some time before you realize that what's happening now is not what was happening previously. Typically, it's something that ramps up, attains some sort of sustained activity level, and then ramps back down.

Q: Tying it into the 2011 earthquake, which caused significantly more damage, what makes an earthquake more damaging than others?

A: The 2011 earthquake was the subduction of the Pacific plate beneath Japan. In there, you have a lot of fault real estate that can rupture altogether and generate a magnitude 9 earthquake. That earthquake was offshore of Japan, so the shaking was strong, but the biggest damage came from the tsunami. The sudden motion of the seafloor moved the water on top, and that created a big tsunami that then caused its own set of aftereffects and damage to the coast of Japan.

For this earthquake on the Noto Peninsula, because it was beneath the land, there was no sudden uplift of the water on top to feed a tsunami. After the New Year’s Day earthquake, the Japanese authorities initially put out a bunch of tsunami alerts, but eventually removed them when they realized that we don't expect this to generate the motion necessary for a tsunami. Whether an earthquake generates a tsunami depends on the tectonic context, and the tsunami is often the hazard that causes the most damage.

Q: What can these swarms tell us about future activity in the region?

A: Going back to the mainshock/aftershock earthquake sequence, we know that there's going to be an elevated rate [of activity] for the next few days or months, and that these earthquakes are going to happen in the general region of where that big earthquake happened.

For a swarm, because we don't understand it as well, we don't have a clear idea of what's going to happen. Sometimes we've seen swarms that are actually stopped by a big earthquake, and then there's nothing else afterwards — it sort of shuts down the system. Sometimes it's just the biggest earthquake in a long sequence of earthquakes.

Q: You'll often hear people talking about big earthquakes being foreshocks or predictors for bigger earthquakes to come. Do we need to be worried about this being a foreshock?

A: When we are thinking about something along the lines of what you just mentioned, it's because we're thinking about the earthquake budget along a tectonic plate boundary. On a boundary, we know the relative motion and we know that the plates are pretty much rigid, so that all the motion is being accommodated at the interface between the two plates. That gives us some budget for how these plates are going to move over a long period of time.

Let's say, for example, there's a magnitude 7, but we know that there's enough slip budget potentially for a magnitude 8, then maybe that magnitude 7 will change the stress state of that tectonic environment and make it so that the eight might come quicker than if the seven hadn't happened.
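To see why the gap between a magnitude 7 and a magnitude 8 matters for such a slip budget, the standard Hanks–Kanamori moment-magnitude relation shows that each whole magnitude step corresponds to roughly a 32-fold jump in seismic moment. A minimal sketch (the relation is standard; the specific magnitudes are just the illustrative ones from the example above):

```python
def seismic_moment(mw):
    """Hanks-Kanamori relation: seismic moment in newton-meters
    from moment magnitude mw."""
    return 10 ** (1.5 * mw + 9.1)

# A magnitude 8 releases ~31.6x the moment of a magnitude 7,
# so the 7 consumes only a small fraction of an 8-sized slip budget.
ratio = seismic_moment(8.0) / seismic_moment(7.0)
print(f"~{ratio:.1f}x")
```

This is why a magnitude 7 can change the stress state without exhausting the budget for a larger event.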

But that's when we put ourselves within the context of a tectonic plate interface, like the subduction zones off the coast of East Japan. For this swarm, we don't have a good idea beforehand of what the actual structures hosting the earthquakes are. Because we don't have a good idea of where the earthquakes can potentially happen, we can't use that simple model of a slip budget along a fault. Until we have a better understanding of which structures are hosting the earthquakes and the relative motion we expect on those over a long period of time, we can't really forecast what will happen.

Generating the policy of tomorrow

Wed, 01/24/2024 - 3:25pm

As first-year students in the Social and Engineering Systems (SES) doctoral program within the MIT Institute for Data, Systems, and Society (IDSS), Eric Liu and Ashely Peake share an interest in investigating housing inequality issues.

They also share a desire to dive head-first into their research.

“In the first year of your PhD, you’re taking classes and still getting adjusted, but we came in very eager to start doing research,” Liu says.

Liu, Peake, and many others found an opportunity to do hands-on research on real-world problems at the MIT Policy Hackathon, an initiative organized by students in IDSS, including the Technology and Policy Program (TPP). The weekend-long, interdisciplinary event — now in its sixth year — continues to gather hundreds of participants from around the globe to explore potential solutions to some of society’s greatest challenges.

This year’s theme, “Hack-GPT: Generating the Policy of Tomorrow,” sought to capitalize on the popularity of generative AI (like the chatbot ChatGPT) and the ways it is changing how we think about technical and policy-based challenges, according to Dansil Green, a second-year TPP master’s student and co-chair of the event.

“We encouraged our teams to utilize and cite these tools, thinking about the implications that generative AI tools have on their different challenge categories,” Green says.

After 2022’s hybrid event, this year’s organizers pivoted back to a virtual-only approach, allowing them to increase the overall number of participants in addition to increasing the number of teams per challenge by 20 percent.

“Virtual allows you to reach more people — we had a high number of international participants this year — and it helps reduce some of the costs,” Green says. “I think going forward we are going to try and switch back and forth between virtual and in-person because there are different benefits to each.”

“When the magic hits”

Liu and Peake competed in the housing challenge category, where they could gain research experience in their actual field of study. 

“While I am doing housing research, I haven’t necessarily had a lot of opportunities to work with actual housing data before,” says Peake, who recently joined the SES doctoral program after completing an undergraduate degree in applied math last year. “It was a really good experience to get involved with an actual data problem, working closer with Eric, who's also in my lab group, in addition to meeting people from MIT and around the world who are interested in tackling similar questions and seeing how they think about things differently.”

Joined by Adrian Butterton, a Boston-based paralegal, as well as Hudson Yuen and Ian Chan, two software engineers from Canada, Liu and Peake formed what would end up being the winning team in their category: “Team Ctrl+Alt+Defeat.” They quickly began organizing a plan to address the eviction crisis in the United States.

“I think we were kind of surprised by the scope of the question,” Peake laughs. “In the end, I think having such a large scope motivated us to think about it in a more realistic kind of way — how could we come up with a solution that was adaptable and therefore could be replicated to tackle different kinds of problems.”

Watching the challenge on the livestream together on campus, Liu says they immediately went to work, and could not believe how quickly things came together.

“We got our challenge description in the evening, came out to the purple common area in the IDSS building and literally it took maybe an hour and we drafted up the entire project from start to finish,” Liu says. “Then our software engineer partners had a dashboard built by 1 a.m. — I feel like the hackathon really promotes that really fast dynamic work stream.”

“People always talk about the grind or applying for funding — but when that magic hits, it just reminds you of the part of research that people don't talk about, and it was really a great experience to have,” Liu adds.

A fresh perspective

“We’ve organized hackathons internally at our company and they are great for fostering innovation and creativity,” says Letizia Bordoli, senior AI product manager at Veridos, a Germany-based identity solutions company that provided this year’s challenge in Data Systems for Human Rights. “It is a great opportunity to connect with talented individuals and explore new ideas and solutions that we might not have thought about.”

The challenge provided by Veridos was focused on finding innovative solutions to universal birth registration, something Bordoli says only benefited from the fact that the hackathon participants were from all over the world.

“Many had local and firsthand knowledge about certain realities and challenges [posed by the lack of] birth registration,” Bordoli says. “It brings fresh perspectives to existing challenges, and it gave us an energy boost to try to bring innovative solutions that we may not have considered before.”

New frontiers

Alongside the housing and data systems for human rights challenges was a challenge in health, as well as a first-time opportunity to tackle an aerospace challenge in the area of space for environmental justice.

“Space can be a very hard challenge category to do data-wise since a lot of data is proprietary, so this really developed over the last few months with us having to think about how we could do more with open-source data,” Green explains. “But I am glad we went the environmental route because it opened the challenge up to not only space enthusiasts, but also environment and climate people.”

One of the participants to tackle this new challenge category was Yassine Elhallaoui, a system test engineer from Norway who specializes in AI solutions and has 16 years of experience working in the oil and gas fields. Elhallaoui was a member of Team EcoEquity, which proposed an increase in policies supporting the use of satellite data to ensure proper evaluation and increase water resiliency for vulnerable communities.

“The hackathons I have participated in in the past were more technical,” Elhallaoui says. “Starting with [MIT Science and Technology Policy Institute Director Kristen Kulinowski’s] workshop about policy writers and the solutions they came up with, and the analysis they had to do … it really changed my perspective on what a hackathon can do.”

“A policy hackathon is something that can make real changes in the world,” she adds.

Faculty, staff, students to evaluate ways to decarbonize MIT's campus

Wed, 01/24/2024 - 3:10pm

With a goal to decarbonize the MIT campus by 2050, the Institute must look at “new ideas, transformed into practical solutions, in record time,” as stated in “Fast Forward: MIT’s Climate Action Plan for the Decade.” This charge calls on the MIT community to explore game-changing and evolving technologies with the potential to move campuses like MIT away from carbon emissions-based energy systems.

To help meet this tremendous challenge, the Decarbonization Working Group — a new subset of the Climate Nucleus — recently launched. Composed of appointed MIT faculty, researchers, and students, the working group is leveraging its members’ expertise to meet the charge of exploring and assessing existing and in-development solutions to decarbonize the MIT campus by 2050. The group is specifically charged with informing MIT’s efforts to decarbonize the campus's district energy system.

Co-chaired by Director of Sustainability Julie Newman and Department of Architecture Professor Christoph Reinhart, the working group includes members with deep knowledge of low- and zero-carbon technologies and grid-level strategies. In convening the group, Newman and Reinhart sought out members researching these technologies as well as exploring their practical use. “In my work on multiple projects on campus, I have seen how cutting-edge research often relies on energy-intensive equipment,” shares PhD student and group member Ippolyti Dellatolas. “It’s clear how new energy-efficiency strategies and technologies could use campus as a living lab and then broadly deploy these solutions across campus for scalable emissions reductions.” This approach is one of MIT’s strong suits and a recurring theme in its climate action plans — using the MIT campus as a test bed for learning and application. “We seek to study and analyze solutions for our campus, with the understanding that our findings have implications far beyond our campus boundaries,” says Newman.

The efforts of the working group represent just one part of the multipronged approach to identify ways to decarbonize the MIT campus. The group will work in parallel and at times collaboratively with the team from the Office of the Vice President for Campus Services and Stewardship that is managing the development plan for potential zero-carbon pathways for campus buildings and the district energy system. In May 2023, MIT engaged Affiliated Engineers, Inc. (AEI), to support the Institute’s efforts to identify, evaluate, and model various carbon-reduction strategies and technologies to provide MIT with a series of potential decarbonization pathways. Each of the pathways must demonstrate how to manage the generation of energy and its distribution and use on campus. As MIT explores electrification, a significant challenge will be the availability of resilient clean power from the grid to help generate heat for our campus without reliance on natural gas.

When the Decarbonization Working Group began work this fall, members took the time to learn more about current systems and baseline information. Beginning this month, members will organize analysis around each of their individual areas of expertise and interest and begin to evaluate existing and emerging carbon reduction technologies. “We are fortunate that there are constantly new ideas and technologies being tested in this space and that we have a committed group of faculty working together to evaluate them,” Newman says. “We are aware that not every technology is the right fit for our unique dense urban campus, nor are we solving for a zero-carbon campus as an island, but rather in the context of an evolving regional power grid.”

Supported by funding from the Climate Nucleus, evaluating technologies will include site visits to locations where priority technologies are currently deployed or being tested. These site visits may range from university campuses implementing district geothermal and heat pumps to test sites of deep geothermal or microgrid infrastructure manufacturers. “This is a unique moment for MIT to demonstrate leadership by combining best decarbonization practices, such as retrofitting building systems to achieve deep energy reductions and converting to low-temperature district heating systems with ‘nearly there’ technologies such as deep geothermal, micronuclear, energy storage, and ubiquitous occupancy-driven temperature control,” says Reinhart. “As first adopters, we can find out what works, allowing other campuses to follow us at reduced risks.”

The findings and recommendations of the working group will be delivered in a report to the community at the end of 2024. There will be opportunities for the MIT community to learn more about MIT’s decarbonization efforts at community events on Jan. 24 and March 14, as well as MIT’s Sustainability Connect forum on Feb. 8.

Q&A: A blueprint for sustainable innovation

Wed, 01/24/2024 - 2:00pm

Atacama Biomaterials is a startup combining architecture, machine learning, and chemical engineering to create eco-friendly materials with multiple applications. Passionate about sustainable innovation, its co-founder Paloma Gonzalez-Rojas SM ’15, PhD ’21 highlights here how MIT has supported the project through several of its entrepreneurship initiatives, and reflects on the role of design in building a holistic vision for an expanding business.

Q: What role do you see your startup playing in the sustainable materials space?

A: Atacama Biomaterials is a venture dedicated to advancing sustainable materials through state-of-the-art technology. With my co-founder Jose Tomas Dominguez, we have been working on developing our technology since 2019. We initially started the company in 2020 under another name and received Sandbox funds the next year. In 2021, we went through The Engine’s accelerator, Blueprint, and changed our name to Atacama Biomaterials in 2022 during the MITdesignX program. 

This technology we have developed allows us to create our own data and material library using artificial intelligence and machine learning, and serves as a platform applicable to various industries horizontally — biofuels, biological drugs, and even mining. Vertically, we produce inexpensive, regionally sourced, and environmentally friendly bio-based polymers and packaging — that is, naturally compostable plastics as a flagship product, along with AI products.

Q: What motivated you to venture into biomaterials and found Atacama?

A: I’m from Chile, a country with a beautiful, rich geography and nature where we can see all the problems stemming from industry, waste management, and pollution. We named our company Atacama Biomaterials because the Atacama Desert in Chile — one of the places where you can best see the stars in the world — is becoming a plastic dump, like many other places on Earth. I care deeply about sustainability, and I have an emotional drive to stop these problems. Considering that manufacturing accounts for 29 percent of global carbon emissions, it is clear that sustainability has a role in how we define technology and entrepreneurship, as well as a socio-economic dimension.

When I first came to MIT, it was to develop software in the Department of Architecture’s Design and Computation Group, with MIT professors Svafa Grönfeldt as co-advisor and Regina Barzilay as committee member. During my PhD, I studied machine-learning methods simulating pedestrian motion to understand how people move in space. In my work, I would use lots of plastics for 3D printing and I couldn’t stop thinking about sustainability and climate change, so I reached out to material science and mechanical engineering professors to look into biopolymers and degradable bio-based materials. This is how I met my co-founder, as we were both working with MIT Professor Neil Gershenfeld. Together, we were part of one of the first teams in the world to 3D print wood fibers, which is difficult — it’s slow and expensive — and quickly pivoted to sustainable packaging. 

I then won a fellowship from MCSC [the MIT Climate and Sustainability Consortium], which gave me freedom to explore further, and I eventually got a postdoc in MIT chemical engineering, guided by MIT Professor Gregory Rutledge, a polymer physicist. This was unexpected in my career path. Winning Nucleate Eco Track 2022 and the MITdesignX Innovation Award in 2022 profiled Atacama Biomaterials as one of the rising startups in Boston’s biotechnology and climate-tech scene.

Q: What is your process to develop new biomaterials?

A: My PhD research, coupled with my background in material development and molecular dynamics, sparked the realization that principles I studied simulating pedestrian motion could also apply to molecular engineering. This connection may seem unconventional, but for me, it was a natural progression. Early in my career, I developed an intuition for materials, understanding their mechanics and physics.

Using my experience and skills, and leveraging machine learning as a technology jump, I applied a similar conceptual framework to simulate the trajectories of molecules and find potential applications in biomaterials. Making that parallel and shift was amazing. It allowed me to optimize state-of-the-art molecular dynamics software to run twice as fast as more traditional technologies through my algorithm presented at the International Conference on Machine Learning this year. This is very important, because this kind of simulation usually takes a week, so narrowing it down to two days has major implications for scientists and industry, in material science, chemical engineering, computer science and related fields. Such work greatly influenced the foundation of Atacama Biomaterials, where we developed our own AI to deploy our materials. In an effort to mitigate the environmental impact of manufacturing, Atacama is targeting a 16.7 percent reduction in carbon dioxide emissions associated with the manufacturing process of its polymers, through the use of renewable energy. 

Another thing is that I was trained as an architect in Chile, and my degree had a design component. I think design allows me to understand problems at a very high level, and how things interconnect. It contributed to developing a holistic vision for Atacama, because it allowed me to jump from one technology or discipline to another and understand broader applications on a conceptual level. Our design approach also meant that sustainability came to the center of our work from the very beginning, not just a plus or an added cost.

Q: What was the role of MITdesignX in Atacama’s development?

A: I have known Svafa Grönfeldt, MITdesignX’s faculty director, for almost six years. She was the co-advisor of my PhD, and we had a mentor-mentee relationship. I admire the fact that she created a space for people interested in business and entrepreneurship to grow within the Department of Architecture. She and Executive Director Gilad Rosenzweig gave us fantastic advice, and we received significant support from mentors. For example, Daniel Tsai helped us with intellectual property, including a crucial patent for Atacama. And we’re still in touch with the rest of the cohort. I really like this “design your company” approach, which I find quite unique, because it gives us the opportunity to reflect on who we want to be as designers, technologists, and entrepreneurs. Studying user insights also allowed us to understand the broad applicability of our research, and align our vision with market demands, ultimately shaping Atacama into a company with a holistic perspective on sustainable material development.

Q: How does Atacama approach scaling, and what are the immediate next steps for the company?

A: When I think about accomplishing our vision, I feel really inspired by my 3-year-old daughter. I want her to experience a world with trees and wildlife when she's 100 years old, and I hope Atacama will contribute to such a future.

Going back to the designer’s perspective, we designed the whole process holistically, from feedstock to material development, incorporating AI and advanced manufacturing. Having proved that there is a demand for the materials we are developing, and having tested our products, manufacturing process, and technology in critical environments, we are now ready to scale. Our technology readiness is comparable to level 4 on NASA’s technology readiness scale.

We have proof of concept: a biodegradable and recyclable packaging material that is cost- and energy-efficient and can serve as a clean-energy enabler in large-scale manufacturing. We have received pre-seed funding, and are sustainably scaling by taking advantage of available resources around the world, like repurposing machinery from the paper industry. As presented at the MIT Industrial Liaison and STEX Program's recent Sustainability Conference, unlike our competitors, we have cost parity with current packaging materials, as well as low-energy processes. And we also proved the demand for our products, which was an important milestone. Our next steps involve strategically expanding our manufacturing capabilities and research facilities, and we are currently evaluating building a factory in Chile and establishing an R&D lab plus a manufacturing plant in the U.S.

New tool predicts flood risk from hurricanes in a warming climate

Wed, 01/24/2024 - 6:00am

Coastal cities and communities will face more frequent major hurricanes with climate change in the coming years. To help prepare coastal cities against future storms, MIT scientists have developed a method to predict how much flooding a coastal community is likely to experience as hurricanes evolve over the next decades.

When hurricanes make landfall, strong winds whip up salty ocean waters that generate storm surge in coastal regions. As the storms move over land, torrential rainfall can induce further flooding inland. When multiple flood sources such as storm surge and rainfall interact, they can compound a hurricane’s hazards, leading to significantly more flooding than would result from any one source alone. The new study introduces a physics-based method for predicting how the risk of such complex, compound flooding may evolve under a warming climate in coastal cities.

One example of compound flooding’s impact is the aftermath of Hurricane Sandy in 2012. The storm made landfall on the East Coast of the United States as heavy winds whipped up a towering storm surge that combined with rainfall-driven flooding in some areas to cause historic and devastating floods across New York and New Jersey.

In their study, the MIT team applied the new compound flood-modeling method to New York City to predict how climate change may influence the risk of compound flooding from Sandy-like hurricanes over the next decades.  

They found that, in today’s climate, a Sandy-level compound flooding event is likely to hit New York City on average once every 150 years. By midcentury, a warmer climate will drive up the frequency of such flooding, to once every 60 years. By the end of the century, destructive Sandy-like floods will deluge the city every 30 years, a fivefold increase compared to the present climate.
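Return periods like these translate directly into annual odds. A small sketch (illustrative arithmetic only, not the study's code) shows the conversion, under the common simplification that each year's flood risk is independent:

```python
# Converting a flood return period into an annual exceedance probability,
# and the chance of at least one Sandy-level flood over a 30-year horizon.
# Illustrative only; the study's risk model is far more detailed.

def annual_probability(return_period_years: float) -> float:
    """Annual exceedance probability for a given return period."""
    return 1.0 / return_period_years

def chance_within(years: int, return_period_years: float) -> float:
    """Probability of at least one exceedance within `years` years,
    assuming independent years (a common simplification)."""
    p = annual_probability(return_period_years)
    return 1.0 - (1.0 - p) ** years

for label, rp in [("today", 150), ("midcentury", 60), ("end of century", 30)]:
    print(f"{label}: annual p = {annual_probability(rp):.4f}, "
          f"30-year chance = {chance_within(30, rp):.2f}")
```

A "once every 30 years" event is far from guaranteed to wait 30 years: over any given 30-year window it has a roughly even chance of occurring at least once.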

“Long-term average damages from weather hazards are usually dominated by the rare, intense events like Hurricane Sandy,” says study co-author Kerry Emanuel, professor emeritus of atmospheric science at MIT. “It is important to get these right.”

While these are sobering projections, the researchers hope the flood forecasts can help city planners prepare and protect against future disasters. “Our methodology equips coastal city authorities and policymakers with essential tools to conduct compound flooding risk assessments from hurricanes in coastal cities at a detailed, granular level, extending to each street or building, in both current and future decades,” says study author Ali Sarhadi, a postdoc in MIT’s Department of Earth, Atmospheric and Planetary Sciences.

The team’s open-access study appears online today in the Bulletin of the American Meteorological Society. Co-authors include Raphaël Rousseau-Rizzi at MIT’s Lorenz Center, Kyle Mandli at Columbia University, Jeffrey Neal at the University of Bristol, Michael Wiper at the Charles III University of Madrid, and Monika Feldmann at the Swiss Federal Institute of Technology Lausanne.

The seeds of floods

To forecast a region’s flood risk, weather modelers typically look to the past. Historical records contain measurements of previous hurricanes’ wind speeds, rainfall, and spatial extent, which scientists use to predict where and how much flooding may occur with coming storms. But Sarhadi argues that these historical records are too limited and too brief to predict future hurricanes’ risks.

“Even if we had lengthy historical records, they wouldn’t be a good guide for future risks because of climate change,” he says. “Climate change is changing the structural characteristics, frequency, intensity, and movement of hurricanes, and we cannot rely on the past.”

Sarhadi and his colleagues instead looked to predict a region’s risk of hurricane flooding in a changing climate using a physics-based risk assessment methodology. They first paired simulations of hurricane activity with coupled ocean and atmospheric models over time. With the hurricane simulations, developed originally by Emanuel, the researchers virtually scatter tens of thousands of “seeds” of hurricanes into a simulated climate. Most seeds dissipate, while a few grow into full hurricanes, depending on the conditions of the ocean and atmosphere.
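The seeding idea can be caricatured in a few lines of Python. This is a toy Monte Carlo sketch with invented thresholds and a made-up `warming` knob, not Emanuel's actual downscaling model; its only point is that each seed survives or dissipates based on the simulated environment, and a more favorable environment lets more seeds survive:

```python
import random

# Toy sketch of hurricane "seeding" (invented thresholds, not the real
# downscaling model): scatter many weak proto-storms into a simulated
# environment and let only those in favorable conditions intensify.

def run_seeds(n_seeds: int, warming: float, seed: int = 0) -> int:
    """Count how many random seeds survive to hurricane strength.

    `warming` is a hypothetical knob: higher values make favorable
    ocean conditions more likely in this toy model.
    """
    rng = random.Random(seed)
    survivors = 0
    for _ in range(n_seeds):
        ocean_heat = rng.random() + warming   # favorable if high
        wind_shear = rng.random()             # unfavorable if high
        if ocean_heat > 0.9 and wind_shear < 0.3:
            survivors += 1
    return survivors

print(run_seeds(50_000, warming=0.0))  # most seeds dissipate
print(run_seeds(50_000, warming=0.2))  # warmer toy climate, more survivors
```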

When the team drives these hurricane simulations with climate models of ocean and atmospheric conditions under certain global temperature projections, they can see how hurricanes change, for instance in terms of intensity, frequency, and size, under past, current, and future climate conditions.

The team then sought to precisely predict the level and degree of compound flooding from future hurricanes in coastal cities. The researchers first used rainfall models to simulate rain intensity for a large number of simulated hurricanes, then applied numerical models to hydraulically translate that rainfall intensity into flooding on the ground as hurricanes make landfall, given information about a region such as its surface and topography characteristics. They also simulated the same hurricanes’ storm surges, using hydrodynamic models to translate hurricanes’ maximum wind speed and sea level pressure into surge height in coastal areas. The simulations also captured how surging ocean waters propagate into coastal areas, causing coastal flooding.

Then, the team developed a numerical hydrodynamic model to predict how the two sources of hurricane-induced flooding, storm surge and rain-driven flooding, would simultaneously interact through time and space, as simulated hurricanes make landfall in coastal regions such as New York City, in both current and future climates.

“There’s a complex, nonlinear hydrodynamic interaction between saltwater surge-driven flooding and freshwater rainfall-driven flooding, that forms compound flooding that a lot of existing methods ignore,” Sarhadi says. “As a result, they underestimate the risk of compound flooding.”

Amplified risk

With their flood-forecasting method in place, the team applied it to a specific test case: New York City. They used the multipronged method to predict the city’s risk of compound flooding from hurricanes, and more specifically from Sandy-like hurricanes, in present and future climates. Their simulations showed that the city’s odds of experiencing Sandy-like flooding will increase significantly over the next decades as the climate warms, from once every 150 years in the current climate, to every 60 years by 2050, and every 30 years by 2099.

Interestingly, they found that much of this increase in risk has less to do with how hurricanes themselves will change in a warming climate than with how sea levels will rise around the world.

“In future decades, we will experience sea level rise in coastal areas, and we also incorporated that effect into our models to see how much that would increase the risk of compound flooding,” Sarhadi explains. “And in fact, we see sea level rise is playing a major role in amplifying the risk of compound flooding from hurricanes in New York City.”

The team’s methodology can be applied to any coastal city to assess the risk of compound flooding from hurricanes and extratropical storms. With this approach, Sarhadi hopes decision-makers can make informed decisions regarding the implementation of adaptive measures, such as reinforcing coastal defenses to enhance infrastructure and community resilience.

“Another aspect highlighting the urgency of our research is the projected 25 percent increase in coastal populations by midcentury, leading to heightened exposure to damaging storms,” Sarhadi says. “Additionally, we have trillions of dollars in assets situated in coastal flood-prone areas, necessitating proactive strategies to reduce damages from compound flooding from hurricanes under a warming climate.”

This research was supported, in part, by Homesite Insurance.

New model predicts how shoe properties affect a runner’s performance

Wed, 01/24/2024 - 12:00am

A good shoe can make a huge difference for runners, from career marathoners to couch-to-5K first-timers. But every runner is unique, and a shoe that works for one might trip up another. Outside of trying on a rack of different designs, there’s no quick and easy way to know which shoe best suits a person’s particular running style.

MIT engineers are hoping to change that with a new model that predicts how certain shoe properties will affect a runner’s performance.

The simple model incorporates a person’s height, weight, and other general dimensions, along with shoe properties such as stiffness and springiness along the midsole. With this input, the model then simulates a person’s running gait, or how they would run, in a particular shoe.

Using the model, the researchers can simulate how a runner’s gait changes with different shoe types. They can then pick out the shoe that produces the best performance, which they define as the degree to which a runner’s expended energy is minimized.

While the model can accurately simulate changes in a runner’s gait when comparing two very different shoe types, it is less discerning when comparing relatively similar designs, including most commercially available running shoes. For this reason, the researchers envision the current model would be best used as a tool for shoe designers looking to push the boundaries of sneaker design.

“Shoe designers are starting to 3D print shoes, meaning they can now make them with a much wider range of properties than with just a regular slab of foam,” says Sarah Fay, a postdoc in MIT’s Sports Lab and the Institute for Data, Systems, and Society (IDSS). “Our model could help them design really novel shoes that are also high-performing.”

The team is planning to improve the model, in hopes that consumers can one day use a similar version to pick shoes that fit their personal running style.

“We’ve allowed for enough flexibility in the model that it can be used to design custom shoes and understand different individual behaviors,” Fay says. “Way down the road, we imagine that if you send us a video of yourself running, we could 3D print the shoe that’s right for you. That would be the moonshot.”

The new model is reported in a study appearing this month in the Journal of Biomechanical Engineering. The study is authored by Fay and Anette “Peko” Hosoi, professor of mechanical engineering at MIT.

Running, revamped

The team’s new model grew out of talks with collaborators in the sneaker industry, where designers have started to 3D print shoes at commercial scale. These designs incorporate 3D-printed midsoles that resemble intricate scaffolds, the geometry of which can be tailored to give a certain bounce or stiffness in specific locations across the sole.

“With 3D printing, designers can tune everything about the material response locally,” Hosoi says. “And they came to us and essentially said, ‘We can do all these things. What should we do?’”

“Part of the design problem is to predict what a runner will do when you put an entirely new shoe on them,” Fay adds. “You have to couple the dynamics of the runner with the properties of the shoe.”

Fay and Hosoi looked first to represent a runner’s dynamics using a simple model. They drew inspiration from Thomas McMahon, a leader in the study of biomechanics at Harvard University, who in the 1970s used a very simple “spring and damper” representation to capture a runner’s essential gait mechanics. Using this gait model, he predicted how fast a person could run on various track types, from traditional concrete surfaces to more rubbery materials. The model showed that runners should run faster on softer, bouncier tracks that supported a runner’s natural gait.

Though this may be unsurprising today, the insight was a revelation at the time, prompting Harvard to revamp its indoor track, a move that quickly produced new track records as runners found they could run much faster on the softer, springier surface.
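The essence of the spring-mass picture can be sketched with a back-of-the-envelope calculation. The numbers and the series-spring simplification below are assumptions for illustration, not values from McMahon's papers: the runner's leg and the track act like two springs in series, so a very stiff track leaves the leg spring in charge, while a softer track lowers the effective stiffness and lengthens ground-contact time:

```python
import math

# Back-of-the-envelope spring-mass sketch (illustrative numbers):
# the runner's leg and the track are modeled as springs in series,
# and ground-contact time scales like sqrt(mass / stiffness).

def effective_stiffness(k_leg: float, k_track: float) -> float:
    """Two springs in series: the softer element dominates."""
    return 1.0 / (1.0 / k_leg + 1.0 / k_track)

def contact_time(mass: float, k_eff: float) -> float:
    """Half-period of a mass on a spring, a rough proxy for stance time."""
    return math.pi * math.sqrt(mass / k_eff)

mass = 70.0        # kg, typical runner (assumed)
k_leg = 20_000.0   # N/m, rough leg-spring stiffness (assumed)

for name, k_track in [("concrete", 1e8), ("tuned wood", 1e5), ("very soft", 1e4)]:
    k = effective_stiffness(k_leg, k_track)
    print(f"{name}: contact time ~ {contact_time(mass, k) * 1000:.0f} ms")
```

This undamped sketch only shows contact time growing as the track softens; McMahon's full analysis also modeled the leg's damping, which is why an intermediate "tuned" compliance, rather than the stiffest possible track, turned out fastest.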

“McMahon’s work showed that, even if we don’t model every single limb and muscle and component of the human body, we’re still able to create meaningful insights in terms of how we design for athletic performance,” Fay says.

Gait cost

Following McMahon’s lead, Fay and Hosoi developed a similar, simplified model of a runner’s dynamics. The model represents a runner as a center of mass, with a hip that can rotate and a leg that can stretch. The leg is connected to a box-like shoe, with springiness and shock absorption that can be tuned, both vertically and horizontally.

They reasoned that they should be able to input into the model a person’s basic dimensions, such as their height, weight, and leg length, along with a shoe’s material properties, such as the stiffness of the front and back midsole, and use the model to simulate what a person’s gait is likely to be when running in that shoe.

But they also realized that a person’s gait can depend on a less definable property, which they call the “biological cost function” — a quality that a runner might not consciously be aware of but nevertheless may try to minimize whenever they run. The team reasoned that if they can identify a biological cost function that is general to most runners, then they might predict not only a person’s gait for a given shoe but also which shoe produces the gait corresponding to the best running performance.

With this in mind, the team looked to a previous treadmill study, which recorded detailed measurements of runners, such as the force of their impacts, the angle and motion of their joints, the spring in their steps, and the work of their muscles as they ran, each in the same type of running shoe.

Fay and Hosoi hypothesized that each runner’s actual gait arose not only from their personal dimensions and shoe properties, but also from a subconscious goal to minimize one or more as-yet-unknown biological measures. To reveal these measures, the team used their model to simulate each runner’s gait multiple times. Each time, they programmed the model to assume the runner minimized a different biological cost, such as the degree to which they swing their leg or the impact that they make with the treadmill. They then compared the modeled gait with the runner’s actual gait to see which modeled gait, and assumed cost, matched the actual gait.
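That matching procedure can be illustrated with a deliberately simple toy. The cost functions, the single gait parameter (step frequency), and all numbers below are invented for illustration, not taken from the study: choose the gait that minimizes each candidate cost, then keep the candidate whose optimum best matches the observed gait:

```python
# Toy illustration of cost-function identification (hypothetical costs
# and numbers): simulate a "gait" by choosing the step frequency that
# minimizes each candidate cost, then keep the candidate whose optimal
# gait lands closest to the observed one.

def impact_cost(freq: float) -> float:
    # hypothetical: fewer, bigger steps mean harder impacts
    return 1.0 / freq

def leg_work_cost(freq: float) -> float:
    # hypothetical: faster leg turnover costs more muscle work
    return freq ** 2

def best_gait(cost, freqs):
    """Gait (step frequency) that minimizes the given cost function."""
    return min(freqs, key=cost)

freqs = [f / 10 for f in range(10, 41)]   # candidate gaits: 1.0 .. 4.0 steps/s
observed_freq = 2.9                        # pretend treadmill measurement

candidates = {
    "impact only": impact_cost,
    "impact + leg work": lambda f: impact_cost(f) + 0.04 * leg_work_cost(f),
}
for name, cost in candidates.items():
    g = best_gait(cost, freqs)
    print(f"{name}: optimal freq {g:.1f}, error {abs(g - observed_freq):.1f}")
```

In this toy, the combined cost reproduces the observed gait better than either term alone, which mirrors the study's logic of testing candidate costs against measured gaits.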

In the end, the team found that most runners tend to minimize two costs: the impact their feet make with the treadmill and the amount of energy their legs expend.

“If we tell our model, ‘Optimize your gait on these two things,’ it gives us really realistic-looking gaits that best match the data we have,” Fay explains. “This gives us confidence that the model can predict how people will actually run, even if we change their shoe.”

As a final step, the researchers simulated a wide range of shoe styles and used the model to predict a runner’s gait and how efficient each gait would be for a given type of shoe.

“In some ways, this gives you a quantitative way to design a shoe for a 10K versus a marathon shoe,” Hosoi says. “Designers have an intuitive sense for that. But now we have a mathematical understanding that we hope designers can use as a tool to kickstart new ideas.”

This research is supported, in part, by adidas.

What to do about AI in health?

Tue, 01/23/2024 - 4:25pm

Before a drug is approved by the U.S. Food and Drug Administration (FDA), it must demonstrate both safety and efficacy. However, the FDA does not require an understanding of a drug’s mechanism of action for approval. This acceptance of results without explanation raises the question of whether the "black box" decision-making process of a safe and effective artificial intelligence model must be fully explained in order to secure FDA approval.

This topic was one of many discussion points addressed on Monday, Dec. 4 during the MIT Abdul Latif Jameel Clinic for Machine Learning in Health (Jameel Clinic) AI and Health Regulatory Policy Conference, which ignited a series of discussions and debates amongst faculty; regulators from the United States, EU, and Nigeria; and industry experts concerning the regulation of AI in health. 

As machine learning continues to evolve rapidly, uncertainty persists as to whether regulators can keep up and still reduce the likelihood of harmful impact while ensuring that their respective countries remain competitive in innovation. To promote frank and open discussion, attendance at the Jameel Clinic event was curated to an audience of 100, and the debates were held under the Chatham House Rule, allowing speakers to discuss controversial opinions and arguments without being identified as the source.

Rather than hosting an event to generate buzz around AI in health, the Jameel Clinic's goal was to create a space to keep regulators apprised of the most cutting-edge advancements in AI, while allowing faculty and industry experts to propose new or different approaches to regulatory frameworks for AI in health, especially for AI use in clinical settings and in drug development. 

AI’s role in medicine is more relevant than ever, as the industry struggles with a post-pandemic labor shortage, increased costs (“Not a salary issue, despite common belief,” said one speaker), as well as high rates of burnout and resignations among health care professionals. One speaker suggested that priorities for clinical AI deployment should be focused more on operational tooling rather than patient diagnosis and treatment. 

One attendee pointed out a “clear lack of education across all constituents — not just amongst developer communities and health care systems, but with patients and regulators as well.” Given that medical doctors are often the primary users of clinical AI tools, a number of the medical doctors present pleaded with regulators to consult them before taking action. 

Data availability was a key issue for the majority of AI researchers in attendance. They lamented the lack of data to make their AI tools work effectively. Many faced barriers such as intellectual property restrictions barring access, or simply a dearth of large, high-quality datasets. “Developers can’t spend billions creating data, but the FDA can,” a speaker pointed out during the event. “There’s a price uncertainty that could lead to underinvestment in AI.” Speakers from the EU touted the development of a system obligating governments to make health data available for AI researchers.

By the end of the daylong event, many of the attendees suggested prolonging the discussion and praised the selective curation and closed environment, which created a unique space conducive to open and productive discussions on AI regulation in health. Once future follow-up events are confirmed, the Jameel Clinic will develop additional workshops of a similar nature to maintain the momentum and keep regulators in the loop on the latest developments in the field.

“The North Star for any regulatory system is safety,” acknowledged one attendee. “Generational thought stems from that, then works downstream.” 

Rowing in the right direction

Tue, 01/23/2024 - 4:10pm

For a college student, senior Tatum Wilhelm wakes up painfully early — at 5:15 a.m., to be exact. Five days per week, by 6:20 a.m. sharp, she is already rowing on the Charles River, bursting through the early morning fog. 

Between majoring in chemical engineering, minoring in anthropology, and working as an undergraduate student researcher at the Furst Lab, Wilhelm’s days are packed. But she says it’s her role on MIT Crew that gives her perspective on her goals and what matters most.  

Stretching her arms after a workout on the erg, the unforgiving indoor rowing machine used for individual training, she explains, “Crew is a set time in the day when I’m not thinking about academics. I’m just focused on pushing myself physically — and the river is beautiful.” 

She was captain of her team last year, but winning isn’t the current that pulls Wilhelm deeper and deeper into her sport; it’s teamwork. 

“When I first came here, I had the preconception that everyone at MIT was a genius and super into their books,” she says. “They are very smart, but everyone also does really cool stuff outside of academics. My favorite thing about this school is the people — especially my team.” 

Fitting in

A first-generation college student raised by a single mom, Wilhelm came to MIT from California with the support of Questbridge, a nonprofit that mentors high-achieving, low-income students as they apply early decision to their top-choice colleges. She was passionate about science and knew that MIT was the right place, but she didn’t know a soul on campus. 

It’s Wilhelm’s friendships, both in the lab and in the eight-person boat, that have given her a feeling of belonging. 

“Before I got to MIT, I honestly didn’t know what an engineer was,” she says bluntly. 

But once Wilhelm saw engineering alumni solving real-world problems in the field, she knew it was for her, ultimately choosing chemical engineering. 

When Covid-19 hit in the spring of her first year and MIT remained virtual for the fall 2020 semester, Wilhelm temporarily relocated to Alaska, where she worked as a farm hand and learned about sustainable agriculture. “I am an engineer — not a farmer. I am also not that outdoorsy, and that experience pushed me way out of my academic comfort zone in a great way,” Wilhelm says.

During that time, she began working remotely as an undergraduate researcher in the Furst Lab, logging on between shifts in the fields to meet with Assistant Professor Ariel Furst, who actively included her as one of the team from the start. 

Back in Cambridge as a sophomore, Wilhelm unexpectedly discovered a passion for anthropology when she signed up for class 21A.157 (The Meaning of Life), a seminar taught by William R. Kenan Jr. Professor of Anthropology Heather Paxson.

Wilhelm admits, “I thought the class would be too philosophical, but it was actually extremely applicable to things that were going on in students’ lives. It was about finding personal meaning in work, family, and money in tangible ways.” At the time, the whole world was still reeling from Covid-19, and being able to conduct that kind of soul-searching became a powerful tool. 

“I just kept going with the anthro courses and soon had collected enough for a minor,” Wilhelm says. “They complement my chemical engineering classes, which are very technical and centered around problem-solving.” 

Real-world chemical engineering

Wilhelm spent her junior year studying thermodynamics and fluid dynamics in the Department of Chemical Engineering (ChemE), as well as taking class 21A.520 (Magic, Science, and Religion), a seminar with professor of anthropology Graham Jones. The contrast both stretched and soothed her brain. She says Jones’s engaging style of teaching made him her favorite MIT professor.

This fall, Wilhelm took a class called 21A.301 (Disease and Health) with associate professor of anthropology Amy Moran-Thomas. Discussions about the biopharmaceutical industry and analyzing modes of care directly connected with her ChemE coursework and internships, and gave her perspective on how her future work can impact real-world users. She reflects, “Looking at how these treatments impact patients’ lives has provided a deeper understanding of the implications of my work. I value being able to look at very technical scientific problems from a humanities lens, and I think it has enhanced my learning in both disciplines.” 

Alongside her academic studies, Wilhelm has continued working at the Furst Lab, more recently with the support of MIT SuperUROP. The competitive program provides advanced undergraduates with independent research opportunities. 

With this funding, Wilhelm is conducting a project to examine how to potentially engineer cell-based electrochemical lanthanide sensors. Lanthanides are rare-earth elements used in several industries, including electronics and green energy, primarily due to their abundance and low cost. 

Wilhelm explains, “The current methods for the separation of lanthanides in mining and recycling are costly and environmentally damaging. This project aims to create an inexpensive and environmentally-friendly method for detecting and recovering lanthanides from complex solutions."

At MIT, she has noticed some interesting parallels between being part of the crew team and sharing the lab with researchers of different ages and backgrounds. In both settings, failing, iterating, and ultimately winning frame the culture. 

She says, “In the lab, there is an overarching sense of purpose, which also translates to crew. In rowing, we are all working together. We train both individually and as a team. Our performance as individuals matters, but we ultimately have to all come together to move the boat forward.” 

Next year, Wilhelm hopes to steer toward a PhD in chemical engineering or materials science.

“I’m really interested in the industry applications of ChemE, but in reality, I just want to continue researching and learning new things every day right now,” she says.
