MIT Latest News

MIT News is dedicated to communicating to the media and the public the news and achievements of the students, faculty, staff and the greater MIT community.

Defending against Spectre and Meltdown attacks

Thu, 10/18/2018 - 3:30pm

In January, the technology world was rattled by the discovery of Meltdown and Spectre, two major security vulnerabilities in the processors found in virtually every computer on the planet.

Perhaps the most alarming thing about these vulnerabilities is that they didn’t stem from normal software bugs or physical CPU problems. Instead, they arose from the architecture of the processors themselves — that is, the millions of transistors that work together to execute operations.

“These attacks fundamentally changed our understanding of what’s trustworthy in a system, and force us to re-examine where we devote security resources,” says Ilia Lebedev, a PhD student at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL). “They’ve shown that we need to be paying much more attention to the microarchitecture of systems.”

Lebedev and his colleagues believe that they’ve made an important new breakthrough in this field, with an approach that makes it much harder for hackers to cash in on such vulnerabilities. Their method could have immediate applications in cloud computing, especially for fields like medicine and finance that currently limit their cloud-based features because of security concerns.

With Meltdown and Spectre, hackers exploited the fact that operations all take slightly different amounts of time to execute. To use a simplified example, someone guessing a PIN might first try the combinations “1111” through “9111.” If the first eight guesses take the same amount of time, and “9111” takes a nanosecond longer, then that guess most likely has at least the “9” right, and the attacker can then try “9111” through “9911,” and so on.
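
To see how such a leak works in code, here is a deliberately vulnerable, minimal Python sketch (the PIN value, function names, and timing loop are illustrative assumptions; real attacks must average over far more measurement noise):

```python
import time

SECRET_PIN = "9461"  # hypothetical secret, for illustration only

def check_pin(guess: str) -> bool:
    # Early-exit comparison: returns as soon as a digit mismatches.
    # The running time therefore leaks how many leading digits are correct.
    for g, s in zip(guess, SECRET_PIN):
        if g != s:
            return False
    return True

def time_guess(guess: str, trials: int = 10000) -> float:
    start = time.perf_counter()
    for _ in range(trials):
        check_pin(guess)
    return time.perf_counter() - start

# An attacker times "1111" through "9111" and picks the slowest guess:
timings = {d + "111": time_guess(d + "111") for d in "123456789"}
likely_first_digit = max(timings, key=timings.get)[0]
print("First digit is probably:", likely_first_digit)
```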

An operation that’s especially vulnerable to these so-called “timing attacks” is accessing memory. If systems always had to wait for memory before doing the next step of an action, they’d spend much of their time sitting idle.

To keep performance up, engineers employ a trick: They give the processor the power to execute multiple instructions while it waits for memory, and then, once memory is ready, the processor discards the results of the instructions that weren’t needed. Hardware designers call this “speculative execution.”

While speculative execution pays off in performance, it also creates new security issues. Specifically, an attacker could make the processor speculatively execute code that reads a part of memory it shouldn’t be able to. Even if the speculative work is discarded, it can still leak data that the attacker can then access.

A common way to try to prevent such attacks is to split up memory so that it’s not all stored in one area. Imagine an industrial kitchen shared by chefs who all want to keep their recipes secret. One approach would be to have the chefs set up their work on different sides — that’s essentially what happens with the Cache Allocation Technology (CAT) that Intel started using in 2016. But such a system is still quite insecure, since one chef can get a pretty good idea of others’ recipes by seeing which pots and pans they take from the common area.

In contrast, the MIT CSAIL team’s approach is the equivalent of building walls to split the kitchen into separate spaces, and ensuring that everyone only knows their own ingredients and appliances. (This approach is a form of so-called “secure way partitioning”; the chefs in the case of cache memory are referred to as “protection domains.”)
                
As a playful counterpoint to Intel’s CAT system, the researchers dubbed their method “DAWG,” which stands for “Dynamically Allocated Way Guard.” (The dynamic part means that DAWG can split the cache into multiple buckets whose size can vary over time.)
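
To illustrate the general idea of way partitioning, here is a toy Python model of a set-associative cache whose ways are assigned to protection domains (a simplified sketch for intuition only; the class names and eviction policy are invented and do not reflect DAWG's actual hardware design):

```python
# A toy model of secure way partitioning in a set-associative cache.
# Each protection domain owns a fixed subset of ways and can neither
# evict nor observe lines in another domain's ways.

class PartitionedCacheSet:
    def __init__(self, n_ways: int, domain_ways: dict):
        # domain_ways maps a domain id to the way indices it owns,
        # e.g. {"victim": [0, 1], "attacker": [2, 3]}.
        self.ways = [None] * n_ways  # each entry holds a cached address or None
        self.domain_ways = domain_ways

    def access(self, domain: str, address: int) -> bool:
        """Returns True on a hit. Misses only evict within the domain's ways."""
        owned = self.domain_ways[domain]
        for w in owned:
            if self.ways[w] == address:
                return True  # hit in this domain's own partition
        # Miss: fill into one of the domain's own ways (simple rotation).
        victim_way = owned[address % len(owned)]
        self.ways[victim_way] = address
        return False

cache_set = PartitionedCacheSet(4, {"victim": [0, 1], "attacker": [2, 3]})
cache_set.access("victim", 0xABC)          # victim caches its line
hit = cache_set.access("attacker", 0xABC)  # attacker cannot see victim's line
print("Attacker observes a hit:", hit)     # False: no cross-domain leakage
```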

Lebedev co-wrote a new paper about the project with lead author Vladimir Kiriansky and MIT professors Saman Amarasinghe, Srini Devadas, and Joel Emer. They will present their findings next week at the annual IEEE/ACM International Symposium on Microarchitecture (MICRO) in Fukuoka City, Japan.

“This paper dives into how to fully isolate one program's side-effects from percolating through to another program through the cache,” says Mohit Tiwari, an assistant professor at the University of Texas at Austin who was not involved in the project. “This work secures a channel that’s one of the most popular to use for attacks.”

In tests, the team also found that the system was comparable with CAT on performance. They say that DAWG requires only minimal modifications to modern operating systems.

“We think this is an important step forward in giving computer architects, cloud providers, and other IT professionals a better way to efficiently and dynamically allocate resources,” says Kiriansky, a PhD student at CSAIL. “It establishes clear boundaries for where sharing should and should not happen, so that programs with sensitive information can keep that data reasonably secure.”

The team is quick to caution that DAWG can’t yet defend against all speculative attacks. However, they have experimentally demonstrated that it is a foolproof solution to a broad range of non-speculative attacks against cryptographic software.

Lebedev says that the growing prevalence of these types of attacks demonstrates that, contrary to popular tech-CEO wisdom, more information sharing isn’t always a good thing.

“There’s a tension between performance and security that’s come to a head for a community of architecture designers that have always tried to share as much as possible in as many places as possible,” he says. “On the other hand, if security was the only priority, we’d have separate computers for every program we want to run so that no information could ever leak, which obviously isn’t practical. DAWG is part of a growing body of work trying to reconcile these two opposing forces.”

It’s worth recognizing that the sudden attention on timing attacks reflects the paradoxical fact that computer security has actually gotten a lot better in the last 20 years.

“A decade ago software wasn’t written as well as it is today, which means that other attacks were a lot easier to perform,” says Kiriansky. “As other aspects of security have become harder to carry out, these microarchitectural attacks have become more appealing, though they’re still fortunately just a small piece in an arsenal of actions that an attacker would have to take to actually do damage.”

The team is now working to improve DAWG so that it can stop all currently known speculative-execution attacks. In the meantime, they’re hopeful that companies such as Intel will be interested in adopting their idea — or others like it — to minimize the chance of future data breaches.

“These kinds of attacks have become a lot easier thanks to these vulnerabilities,” says Kiriansky. “With all the negative PR that’s come up, companies like Intel have the incentives to get this right. The stars are aligned to make an approach like this happen.”

Exploring the future of learning through virtual and augmented reality

Thu, 10/18/2018 - 3:20pm

At a recent on-campus symposium titled “VR, Sound and Cinema: Implications for Storytelling and Learning,” MIT Open Learning explored the future of storytelling and learning through virtual reality (VR) and augmented reality (AR).  

The event featured a panel of faculty and industry experts in VR/AR, cinema, and storytelling, showcasing the power of these tools and their potential impact on learning. Speakers included Sanjay Sarma, vice president for Open Learning; Fox Harrell, a professor of digital media and artificial intelligence at MIT; Academy Award-winning director Shekhar Kapur; Berklee College of Music Professor Susan Rogers; Academy Award-winning sound designer Mark Mangini; and Edgar Choueiri, a professor of applied physics at Princeton University.

Harrell, who is currently working on a new VR/AR project with MIT Open Learning, studies new forms of computational narrative, gaming, social media, and related digital media based in computer science. His talk focused on answering the question: “How do virtual realities impact our learning and engagement?” He also screened a preview of Karim Ben Khelifa’s “The Enemy,” a groundbreaking virtual reality experience that made its American premiere at the MIT Museum in December 2017.

In “The Enemy,” participants embody a soldier avatar who encounters and interacts with enemy soldiers. Participants can ask questions of their enemies, who can then adjust their responses based on the participants’ own lived experiences as well as their real-time physiological responses. The intended result is to create empathy between supposed enemies, whose hopes, dreams, and nightmares are more similar than their biases would have them believe.

“This can be a really powerful teaching tool,” Harrell said, explaining that it could be used in war zones and with child soldiers.

Next, film director and producer Shekhar Kapur spoke about storytelling in the age of infinite technological resources. Kapur pondered why people tend to watch the same movie over and over.

“We don’t always watch a movie again because it’s great, but because we can reflect upon ourselves and how we’ve changed even if the movie content hasn’t,” he said. In this sense, Kapur argued, stories have always been virtual, because they have always been filtered through each person’s subjective and shifting perspective.

“We are the stories we tell ourselves,” said Kapur, who believes that technology has always dictated the storytelling format. “If I don’t learn the new storytelling technologies, I’ll become a dinosaur.” Kapur insists that the three-act narrative dictated by past technologies will have to become more flexible, user-centric, and open-ended as VR becomes more commonplace. “We should be driven by the things we want. For example, I want to see my father again but he passed away several years ago. Can I retell his story with technologies that will make him seem real again? I don’t know.”

Finally, Susan Rogers, a professor of music production and engineering and an expert in music cognition at Boston’s Berklee College of Music, took the floor to talk about how technology is influencing our daily lives.

“Our behavior is becoming further from reality the more our technology imitates reality,” she said.

Rogers’ assessment focused on reality versus truth, examining what would happen to VR once it became so close to reality that it no longer seemed virtual.

“Scientists worship the truth — so how can scientists appreciate virtual reality?” she asked. “It isn’t truth.”

Following the panel, Professor Sarma invited guests to participate in a deeper dive into the day’s discussions. Academy Award-winning sound designer Mark Mangini and Edgar Choueiri, a professor of applied physics at Princeton and director of the university's Electric Propulsion and Plasma Dynamics Laboratory (EPPDyL), led in-depth talks on how sound enhances learning and storytelling.

Mangini spoke of the need for sound designers to embrace artistry and narrative in their work.

“If we live in technique, we live on the boundaries of creativity,” he said. While technology has come a long way, he argued, there is still more to be done with 3-D.

“Our ancestors told stories around a fire,” he said. “Today, we still sit around in the dark watching a flickering light.”

Choueiri ended the event with a special interactive presentation, first asking aloud, “Why has spatial audio been neglected for so long?” and then asserting that people’s emotional reactions are inherently spatial. To demonstrate the visceral nature of 3-D sound, Choueiri chose a volunteer and projected 3-D sound directly to him, by measuring and targeting his head-related transfer function (HRTF).

The sold-out event garnered an impressive level of interest from the public and students from MIT and Berklee College, who made up almost half of the audience. As VR/AR technology applications continue to grow, MIT Open Learning officials say they hope to hold more events that explore the intersection of science, media, and learning.

Cryptographic protocol enables greater collaboration in drug discovery

Thu, 10/18/2018 - 2:00pm

MIT researchers have developed a cryptographic system that could help neural networks identify promising drug candidates in massive pharmacological datasets, while keeping the data private. Secure computation done at such a massive scale could enable broad pooling of sensitive pharmacological data for predictive drug discovery.

Datasets of drug-target interactions (DTI), which show whether candidate compounds act on target proteins, are critical in helping researchers develop new medications. Models can be trained to crunch datasets of known DTIs and then, using that information, find novel drug candidates.

In recent years, pharmaceutical firms, universities, and other entities have become open to pooling pharmacological data into larger databases that can greatly improve training of these models. Due to intellectual property and other privacy concerns, however, these datasets remain limited in scope. Cryptographic methods that could secure the data while it is pooled are so computationally intensive that they don’t scale well beyond, say, tens of thousands of DTIs, which is relatively small.

In a paper published today in Science, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) describe a neural network securely trained and tested on a dataset of more than a million DTIs. The network leverages modern cryptographic tools and optimization techniques to keep the input data private, while running quickly and efficiently at scale.

The team’s experiments show the network performs faster and more accurately than existing approaches; it can process massive datasets in days, whereas other cryptographic frameworks would take months. Moreover, the network identified several novel interactions that could have clinical significance, including one between the leukemia drug imatinib and ErbB4, an enzyme whose mutations have been associated with cancer.

“People realize they need to pool their data to greatly accelerate the drug discovery process and enable us, together, to make scientific advances in solving important human diseases, such as cancer or diabetes. But they don’t have good ways of doing it,” says corresponding author Bonnie Berger, the Simons Professor of Mathematics and a principal investigator at CSAIL. “With this work, we provide a way for these entities to efficiently pool and analyze their data at a very large scale.”

Joining Berger on the paper are co-first authors Brian Hie and Hyunghoon Cho, both graduate students in electrical engineering and computer science and researchers in CSAIL’s Computation and Biology group.

“Secret sharing” data

The new paper builds on previous work by the researchers in protecting patient confidentiality in genomic studies, which find links between particular genetic variants and incidence of disease. That genomic data could potentially reveal personal information, so patients can be reluctant to enroll in the studies. In that work, Berger, Cho, and a former Stanford University PhD student developed a protocol based on a cryptography framework called “secret sharing,” which securely and efficiently analyzes datasets of a million genomes. In contrast, existing proposals could handle only a few thousand genomes.

Secret sharing is used in multiparty computation, where sensitive data is divided into separate “shares” among multiple servers. Throughout computation, each party will always have only its share of the data, which appears fully random. Collectively, however, the servers can still communicate and perform useful operations on the underlying private data. At the end of the computation, when a result is needed, the parties combine their shares to reveal the result.
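
For intuition, here is a minimal sketch of additive secret sharing, the textbook building block behind such protocols (the modulus, three-party setup, and function names are illustrative assumptions, not the researchers' actual protocol):

```python
import random

P = 2**61 - 1  # a large prime modulus (illustrative choice)

def share(secret: int, n_parties: int = 3):
    """Split a secret into n shares that sum to it modulo P."""
    shares = [random.randrange(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares  # each share alone looks uniformly random

def reconstruct(shares):
    return sum(shares) % P

# Parties can add their shares locally to compute a sum of secrets
# without any party ever seeing the other inputs.
a_shares = share(42)
b_shares = share(100)
sum_shares = [(a + b) % P for a, b in zip(a_shares, b_shares)]
print(reconstruct(sum_shares))  # 142
```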

“We used our previous work as a basis to apply secret sharing to the problem of pharmacological collaboration, but it didn’t work right off the shelf,” Berger says.

A key innovation was reducing the computation needed in training and testing. Existing predictive drug-discovery models represent the chemical and protein structures of DTIs as graphs or matrices. These approaches, however, scale quadratically, or squared, with the number of DTIs in the dataset. Basically, processing these representations becomes extremely computationally intensive as the size of the dataset grows. “While that may be fine for working with the raw data, if you try that in secure computation, it’s infeasible,” Hie says.

The researchers instead trained a neural network that relies on linear calculations, which scale far more efficiently with the data. “We absolutely needed scalability, because we’re trying to provide a way to pool data together [into] much larger datasets,” Cho says.

The researchers trained a neural network on the STITCH dataset, which has 1.5 million DTIs, making it the largest publicly available dataset of its kind. In training, the network encodes each drug compound and protein structure as a simple vector representation. This essentially condenses the complicated structures into strings of 1s and 0s that a computer can easily process. From those vectors, the network then learns the patterns of interactions and noninteractions. Fed new pairs of compounds and protein structures, the network predicts if they’ll interact.

The network also has an architecture optimized for efficiency and security. Each layer of a neural network requires some activation function that determines how to send the information to the next layer. In their network, the researchers used an efficient activation function called a rectified linear unit (ReLU). This function requires only a single, secure numerical comparison of an interaction to determine whether to send (1) or not send (0) the data to the next layer, while also never revealing anything about the actual data. This operation can be more efficient in secure computation compared to more complex functions, so it reduces computational burden while ensuring data privacy.
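
In plain terms, the ReLU outputs its input when the input is positive and zero otherwise, so it hinges on a single comparison; smoother activations like the sigmoid require far costlier operations under secure computation. A plaintext sketch of the contrast (illustrative only; in the secure setting that one comparison is computed jointly on secret shares, never in the clear):

```python
import math

def relu(x: float) -> float:
    # One comparison against zero decides the output: pass x, or pass 0.
    return x if x > 0 else 0.0

def sigmoid(x: float) -> float:
    # By contrast, a sigmoid needs an exponential, which is far more
    # expensive to evaluate inside a secure-computation protocol.
    return 1.0 / (1.0 + math.exp(-x))

print(relu(2.5), relu(-1.3))  # 2.5 0.0
```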

“The reason that’s important is we want to do this within the secret sharing framework … and we don’t want to ramp up the computational overhead,” Berger says. In the end, “no parameters of the model are revealed and all input data — the drugs, targets, and interactions — are kept private.”

Finding interactions

The researchers pitted their network against several state-of-the-art, plaintext (unencrypted) models on a portion of known DTIs from DrugBank, a popular dataset containing about 2,000 DTIs. In addition to keeping the data private, the researchers’ network outperformed all of the models in prediction accuracy. Only two baseline models could reasonably scale to the STITCH dataset, and the researchers’ model achieved nearly double the accuracy of those models.

The researchers also tested drug-target pairs with no listed interactions in STITCH, and found several clinically established drug interactions that weren’t listed in the database but should be. In the paper, the researchers list the strongest predictions, including: droloxifene and an estrogen receptor, an interaction that reached phase III clinical trials as a treatment for breast cancer; and seocalcitol and a vitamin D receptor, to treat other cancers. Cho and Hie independently validated the highest-scoring novel interactions via contract research organizations.

Next, the researchers are working with partners to establish their collaborative pipeline in a real-world setting. “We are interested in putting together an environment for secure computation, so we can run our secure protocol with real data,” Cho says.

Electrical properties of dendrites help explain our brain’s unique computing power

Thu, 10/18/2018 - 11:00am

Neurons in the human brain receive electrical signals from thousands of other cells, and long neural extensions called dendrites play a critical role in incorporating all of that information so the cells can respond appropriately.

Using hard-to-obtain samples of human brain tissue, MIT neuroscientists have now discovered that human dendrites have different electrical properties from those of other species. Their studies reveal that electrical signals weaken more as they flow along human dendrites, resulting in a higher degree of electrical compartmentalization, meaning that small sections of dendrites can behave independently from the rest of the neuron.

These differences may contribute to the enhanced computing power of the human brain, the researchers say.

“It’s not just that humans are smart because we have more neurons and a larger cortex. From the bottom up, neurons behave differently,” says Mark Harnett, the Fred and Carole Middleton Career Development Assistant Professor of Brain and Cognitive Sciences. “In human neurons, there is more electrical compartmentalization, and that allows these units to be a little bit more independent, potentially leading to increased computational capabilities of single neurons.”

Harnett, who is also a member of MIT’s McGovern Institute for Brain Research, and Sydney Cash, an assistant professor of neurology at Harvard Medical School and Massachusetts General Hospital, are the senior authors of the study, which appears in the Oct. 18 issue of Cell. The paper’s lead author is Lou Beaulieu-Laroche, a graduate student in MIT’s Department of Brain and Cognitive Sciences.

Neural computation

Dendrites can be thought of as analogous to transistors in a computer, performing simple operations using electrical signals. Dendrites receive input from many other neurons and carry those signals to the cell body. If stimulated enough, a neuron fires an action potential — an electrical impulse that then stimulates other neurons. Large networks of these neurons communicate with each other to generate thoughts and behavior.

The structure of a single neuron often resembles a tree, with many branches bringing in information that arrives far from the cell body. Previous research has found that the strength of electrical signals arriving at the cell body depends, in part, on how far they travel along the dendrite to get there. As the signals propagate, they become weaker, so a signal that arrives far from the cell body has less of an impact than one that arrives near the cell body.
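
To a first approximation, this weakening follows the passive cable model, in which voltage decays exponentially with distance along the dendrite. A small illustrative calculation (the length constant and voltages here are arbitrary assumed values, not measurements from the study):

```python
import math

def attenuated_voltage(v0_mv: float, distance_um: float, lambda_um: float) -> float:
    """Steady-state passive cable attenuation: V(x) = V0 * exp(-x / lambda)."""
    return v0_mv * math.exp(-distance_um / lambda_um)

LAMBDA = 500.0  # length constant in micrometers (illustrative value)

# A signal arriving from a distant branch is weaker at the cell body
# than one injected nearby, and longer dendrites compound the loss.
for distance in (100, 500, 1500):
    v = attenuated_voltage(10.0, distance, LAMBDA)
    print(f"{distance:>5} um from the soma: {v:.2f} mV remaining")
```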

Dendrites in the cortex of the human brain are much longer than those in rats and most other species, because the human cortex has evolved to be much thicker than that of other species. In humans, the cortex makes up about 75 percent of the total brain volume, compared to about 30 percent in the rat brain.

Although the human cortex is two to three times thicker than that of rats, it maintains the same overall organization, consisting of six distinctive layers of neurons. Neurons from layer 5 have dendrites long enough to reach all the way to layer 1, meaning that human dendrites have had to elongate as the human brain has evolved, and electrical signals have to travel that much farther.

In the new study, the MIT team wanted to investigate how these length differences might affect dendrites’ electrical properties. They were able to compare electrical activity in rat and human dendrites, using small pieces of brain tissue removed from epilepsy patients undergoing surgical removal of part of the temporal lobe. In order to reach the diseased part of the brain, surgeons also have to take out a small chunk of the anterior temporal lobe.

With the help of MGH collaborators Cash, Matthew Frosch, Ziv Williams, and Emad Eskandar, Harnett’s lab was able to obtain samples of the anterior temporal lobe, each about the size of a fingernail.

Evidence suggests that the anterior temporal lobe is not affected by epilepsy, and the tissue appears normal when examined with neuropathological techniques, Harnett says. This part of the brain appears to be involved in a variety of functions, including language and visual processing, but is not critical to any one function; patients are able to function normally after it is removed.

Once the tissue was removed, the researchers placed it in a solution very similar to cerebrospinal fluid, with oxygen flowing through it. This allowed them to keep the tissue alive for up to 48 hours. During that time, they used a technique known as patch-clamp electrophysiology to measure how electrical signals travel along dendrites of pyramidal neurons, which are the most common type of excitatory neurons in the cortex.

These experiments were performed primarily by Beaulieu-Laroche. Harnett’s lab, like others, has previously done this kind of experiment in rodent dendrites, but his team is the first to analyze the electrical properties of human dendrites.

Unique features

The researchers found that because human dendrites cover longer distances, a signal flowing along a human dendrite from layer 1 to the cell body in layer 5 is much weaker when it arrives than a signal flowing along a rat dendrite from layer 1 to layer 5.

They also showed that human and rat dendrites have the same number of ion channels, which regulate the current flow, but these channels occur at a lower density in human dendrites as a result of the dendrite elongation. They also developed a detailed biophysical model that shows that this density change can account for some of the differences in electrical activity seen between human and rat dendrites, Harnett says.

Nelson Spruston, senior director of scientific programs at the Howard Hughes Medical Institute Janelia Research Campus, described the researchers’ analysis of human dendrites as “a remarkable accomplishment.”

“These are the most carefully detailed measurements to date of the physiological properties of human neurons,” says Spruston, who was not involved in the research. “These kinds of experiments are very technically demanding, even in mice and rats, so from a technical perspective, it’s pretty amazing that they’ve done this in humans.”

The question remains, how do these differences affect human brainpower? Harnett’s hypothesis is that because of these differences, which allow more regions of a dendrite to influence the strength of an incoming signal, individual neurons can perform more complex computations on the information.

“If you have a cortical column that has a chunk of human or rodent cortex, you’re going to be able to accomplish more computations faster with the human architecture versus the rodent architecture,” he says.

There are many other differences between human neurons and those of other species, Harnett adds, making it difficult to tease out the effects of dendritic electrical properties. In future studies, he hopes to explore further the precise impact of these electrical properties, and how they interact with other unique features of human neurons to produce more computing power.

The research was funded by the Natural Sciences and Engineering Research Council of Canada, the Dana Foundation David Mahoney Neuroimaging Grant Program, and the National Institutes of Health.

Tang family gift supports MIT.nano, MIT Quest for Intelligence

Thu, 10/18/2018 - 12:00am

The Tang family of Hong Kong has made a $20 million gift to MIT to name the Tang Family Imaging Suite in the new MIT.nano facility and establish the Tang Family Catalyst Fund to support the MIT Quest for Intelligence.

The Imaging Suite in MIT.nano is part of a highly specialized facility for viewing, measuring, and understanding at the nanoscale. With design features that include a 5-million-pound slab of concrete for stabilization, isolated construction of individual spaces, and technology to minimize mechanical and electromagnetic interference, MIT.nano’s imaging suites provide the “quiet” environment needed for this sensitive work.

“We are grateful for the Tang family’s generosity and visionary investment in nanoscale research at MIT,” says Vladimir Bulović, inaugural director of MIT.nano and the Fariborz Maseeh Professor in Emerging Technology. “The imaging suite will allow scientists and engineers to decipher the structure and function of matter with precision that has not been possible before and, armed with this new knowledge, identify promising opportunities for innovation in health, energy, communications and computing, and a host of other fields.”

The Tang Family Catalyst Fund will provide $5 million for artificial intelligence (AI) research activities and operations, with a special focus on projects at the intersection of AI and financial technology.

“AI tools and technologies are going to revolutionize many industries and disciplines,” says Anantha P. Chandrakasan, dean of the MIT School of Engineering and the Vannevar Bush Professor of Electrical Engineering and Computer Science. “We are delighted to have the support of the Tang family as we lead the development and discovery of these new tools and technologies.”

“The Quest is fueled by cross-disciplinary collaboration and the support of enterprising people such as the Tang family who see great value in exploration and discovery,” says Antonio Torralba, inaugural director of The Quest for Intelligence. “From seed grants for early-stage faculty and student research, to undergraduate and graduate student activities, the Tang Family Catalyst Fund will kindle new ideas that advance machine learning.”

A passionate advocate for open data

Thu, 10/18/2018 - 12:00am

Radha Mastandrea wants to know what the universe is made of.

More specifically, she wants to know about tiny pieces of it called quarks, the particles that make up other, bigger particles such as protons and neutrons. The more we know about those, she says, the more we know about the building blocks of all matter.

Mastandrea’s research is largely dependent on data, which she gets from CERN’s Large Hadron Collider in Switzerland. The scientists at CERN, the European Organization for Nuclear Research, smash two protons together, generating a number of quarks and gluons. Every such particle then “showers” into a stream of other particles; these streams are called quark jets or gluon jets, depending on the particle they showered from. Mastandrea then sifts through heaps of raw data about these jets, and uses the information to learn more about the particles they came from.

Professor Allan Adams, a former recitation instructor of Mastandrea’s, suggested she describe her current work “as if we were slamming two clocks together and we get two elephants out.” The clocks are protons — when they’re slammed together, the quark jets that result aren’t necessarily what researchers would have predicted.

“The step I add,” Mastandrea says, “is, you slam together clocks, you get two elephants, and then the elephants create baby elephants, and you record the baby elephants,” she laughs. “At some point the analogy kind of breaks down.”

Data-driven

Part of what makes Mastandrea’s research challenging is that data from CERN are organized in a way that works well for researchers there, but are difficult to sort through for scientists outside of CERN who may not understand how the data are structured.

To be sure, Mastandrea applauds CERN for making its data open to everyone; she’s passionate about the ability of open data to further research around the globe. However, she and her labmates have developed a GitHub tool that extracts important information from CERN’s data and puts it in a text file that’s easier for labs outside CERN to interpret. They plan to make that framework public. She used a grant she received from the Heising-Simons Foundation to expand computing resources for this project.  

Before using CERN’s data, Mastandrea primarily worked with simulations. During her first year at MIT she began studying emissions at the 21-centimeter line, which is named after the wavelength of the energy hydrogen atoms emit when they undergo a certain energy change. That summer, she studied neutrinoless double beta decays under Lindley Winslow, the Jerrold R. Zacharias Career Development Assistant Professor of Physics at MIT.

Mastandrea’s time at Caltech, with the Laser Interferometer Gravitational-wave Observatory (LIGO) in the summer after her sophomore year, provided a change of pace. There, she helped the LIGO team simulate black hole mergers, and was on the ground floor during some of the crucial discoveries that led to LIGO’s monumental paper on gravitational waves from neutron star mergers — experiences that made for “the most exciting summer.” But at the time, the LIGO team still hadn’t collected quite enough data from actual black hole mergers to run in the simulation she was working on. For Mastandrea, using the data from CERN in her current research is particularly exciting and meaningful.

“It feels like a true physics analysis. … I’m actually investigating the world, not just the fake world that I generate,” she says.

Dancing out of the lab

Mastandrea had never tried the Indian dance form called bhangra before coming to college. She’d played the trumpet for a long time, and she figured she would keep playing as an out-of-lab outlet. In time, though, she discovered she wasn’t enjoying it as much as she’d hoped. She decided to try something new.

“I was just surprised by how much fun it was,” she says, describing how she joined MIT’s bhangra dance team. “The people I met were so friendly.”

Mastandrea says that dancing is “a different kind of stress” than her work in the lab. Bhangra is no casual exercise — it’s high-energy and a little exhausting — but for Mastandrea, a co-captain of the team, it’s more rewarding than it is strenuous.

“The whole dance is meant to be an expression of joy,” she says. “It’s meant to engage people, and you have to make people not only love watching you, but you have to make them want to join you when you’re done. It’s definitely an amazing experience.”

When she’s not bringing joy through dance, she’s experiencing it through food — she loves to cook. She has a profound adoration for tomatoes — she’ll bite into them like they’re apples, she admits with a grin. One day, she says, she’d love to take time off and attend culinary school. For now, though, she gets her fill of cooking entertainment from Food Network’s “Cutthroat Kitchen,” in which chefs foil each other’s plans as they vie for dominance.

“Throwing yourself into a hugely different part of your life”

On the humanities side of her education, one of Mastandrea’s favorite parts of MIT is the philosophy department.

“From what I can tell … they make a real effort to engage undergraduates and make things interesting, and also just to talk with them. They’re very accessible, and that’s great when the problems that philosophy covers are so broad and require this discussion,” she says.

She’s also currently enrolled in a playwriting class. She gives credit to the theater department for creating such a unique experience for MIT students.

“Everyone I see is so passionate about science … and then I go here [to a theater class]. … I see the same passion about something completely different, throwing yourself into a hugely different part of your life, and … casting yourself out and trying something new. There’s no better way to do that than with the theater classes.”

In addition to classes, there are many individuals she is grateful to have met at MIT. She credits her research advisor, Jesse Thaler, and Lindley Winslow, who is both the advisor for the Undergraduate Women in Physics group (which Mastandrea is president of) and Mastandrea’s former research advisor, with helping transform her “from a person who studies physics, who takes physics classes, to someone who … is beginning to think and research with the mindset of a physicist.” Her academic advisor, Michael McDonald, has also been an important resource to her throughout her time here.

After graduating, Mastandrea plans to continue her education in graduate school and wants to keep researching. Maybe her work will lead her toward an answer to the ultimate question — what makes up the universe as we know it.

“Understanding what matter is made of tells us what matter can do, and how it can act in the world,” she says. “And we need to know the very basic constituents … knowing what things are made of tells us everything.”

Translating research into impact

Wed, 10/17/2018 - 12:40pm

The MIT Tata Center for Technology and Design has funded more than 100 projects since its inception, and it now finds itself at a crucial juncture: identifying market opportunities for some of its advanced-stage projects, which require further support in order to be turned into profitable social enterprises.

The Tata Center was first established at MIT six years ago by a generous donation provided by one of India’s oldest philanthropic organizations, Tata Trusts. With several advanced-stage projects now in the pipeline, the center’s leadership recognized a need to answer a fundamental question: How can the Tata Center provide further support, and what might that support look like, to research projects that have reached a state of maturity?

The center's recently concluded fourth annual symposium and workshop, a two-day event hosted at the Samberg Conference Center and titled “Translating Research into Impact,” aimed to do just that.

“This is a preoccupation for us. We’re no longer looking for things to do, we’ve found things to do. And we’ve brought technologies to a point at which they’re ready to go out into the world in the form of helpful products and services,” Tata Center Director Rob Stoner said as he welcomed students, industry partners, faculty, non-governmental organization representatives, and government officials from both India and the U.S. to the conference. “So, our focus has become translation — handing off technologies that may have reached the prototype or demonstration stage at MIT to entrepreneurial firms, government agencies, NGOs — anyone who has the vision and commitment to bring them to scale in India. It takes a focused effort to do that successfully.”

Stoner was joined at the conference by Manoj Kumar, head of entrepreneurship and innovations at Tata Trusts, and Maurizio Vecchione, the executive vice president of Global Good and Research, a collaboration between Intellectual Ventures and the Gates Foundation.

In his opening keynote address, “The Power of Developing World Technology: Reverse Innovation,” Vecchione stressed the importance of investing in technologies for the developing world from a market-driven perspective. Focusing on the health care sector, Vecchione emphasized the need to dramatically increase research and development budgets targeted toward finding solutions for diseases like HIV, malaria, and tuberculosis in the developing world. The world’s population, led primarily by developing countries like China, India, Nigeria, and Mexico, is projected to reach 9 billion by 2040.

The keynote was followed by a panel on scaling social enterprises with Jessica Alderman, the director of communications for Envirofit International; Alex Eaton, CEO of Sistema Biobolsa and Charity; and Manoj Sinha, CEO of Husk Power Systems. One of the core issues that emerged during the panel was the perceived dichotomy of impact versus profit.

“The idea of profit is important. And profit is absolutely tied to impact,” Alderman said. “You will have a short-lived company if you don’t have a solid way of getting to profit.”

Symposium attendees were also introduced to new Tata Center startups and multiple advanced-stage projects working on technologies including:

  • urine-based tuberculosis diagnostics;
  • affordable silicon-based nanofiltration;
  • accessible intraperitoneal chemotherapy devices;
  • intelligence deployment to improve agri-supply chains; and
  • photovoltaic-powered village-scale desalination systems.

The first day drew to a close with a fireside chat with Ernest Moniz, the Cecil and Ida Green Professor of Physics and Engineering Systems Emeritus and former U.S. Secretary of Energy, followed by a town hall on funding social innovations with Ann Dewitt, COO of The Engine; Barry Johnson of the National Science Foundation; and Harkesh Kumar Mittal from India’s Department of Science and Technology.

On the second day of the conference, Ann Mei Chang, the author of “Lean Impact” and former chief innovation officer at USAID, delivered an inspiring keynote address on the importance of thinking big, starting small, and pursuing impact relentlessly.

This second day was dedicated to parallel sectorial workshops on Tata Center’s six focus areas: housing, health, agriculture, energy, environment, and water. Workshop participants included faculty from MIT, the Indian Institute of Technology in Mumbai, Tata Fellows, active Tata Center collaborators, industry representatives, and representatives of some of India’s most influential NGOs.

“So many projects end up not leaving the institution because of gaps in our support ecosystem,” Stoner said, drawing the event to a close. “We’re determined at the Tata Center not to let that happen with our projects by filling those gaps.”  

The MIT Tata Center’s efforts to build connections in the developing world are linked to MIT’s broader campaign to engage with global challenges, and to translate innovative research into entrepreneurial impact. That work continues year-round. The next Tata Center Symposium will be held at MIT on Sept. 12 and 13, 2019.

Four from MIT named American Physical Society Fellows for 2018

Wed, 10/17/2018 - 12:30pm

Four members of the MIT community have been elected as fellows of the American Physical Society for 2018. The distinct honor is bestowed on less than 0.5 percent of the society's membership each year.

APS Fellowship recognizes members who have completed exceptional physics research, identified innovative applications of physics to science and technology, or furthered physics education. Nominated by their peers, the four were selected based on their outstanding contributions to the field.

Lisa Barsotti is a principal research scientist at the MIT Kavli Institute for Astrophysics and Space Research and a member of the Laser Interferometer Gravitational-Wave Observatory (LIGO) team. Barsotti was nominated by the Division of Gravitational Physics for her “extraordinary leadership in commissioning the advanced LIGO detectors, improving their sensitivity through implementation of squeezed light, and enhancing the operation of the gravitational wave detector network through joint run planning between LIGO and Virgo.”

Martin Bazant is the E. G. Roos (1944) Professor of Chemical Engineering and a professor of mathematics. Nominated by the Division of Fluid Dynamics, Bazant was cited for “seminal contributions to electrokinetics and electrochemical physics, and their links to fluid dynamics, notably theories of diffuse-charge dynamics, induced-charge electro-osmosis, and electrochemical phase separation.”

Pablo Jarillo-Herrero is the Cecil and Ida Green Professor of Physics. Jarillo-Herrero was nominated by the Division of Condensed Matter Physics and selected based on his “seminal contributions to quantum electronic transport and optoelectronics in van der Waals materials and heterostructures.”

Richard Lanza is a senior research scientist in the Department of Nuclear Science and Engineering. Nominated by the Forum on Physics and Society, Lanza was cited for his “innovative application of physics and the development of new technologies to allow detection of explosives and weapon-usable nuclear materials, which has greatly benefited national and international security.”

Fighting cybercrime requires a new kind of leadership

Wed, 10/17/2018 - 12:15pm

On March 22, the city of Atlanta was hit by cyberattackers who locked city-wide systems and demanded a bitcoin ransom. Many city systems still have not recovered, and the cost to taxpayers may have reached as high as $17 million.

Also in March, the U.S. Department of Justice indicted nine Iranian hackers over an alleged spree of attacks on more than 300 universities in the United States and abroad. The hackers stole 31 terabytes of data, estimated to be worth $3 billion in intellectual property.

And recently engineers at Facebook detected the biggest security breach in Facebook's history. It took the company 11 days to stop it.

The FBI reports that more than 4,000 ransomware attacks occur daily. Large private sector companies routinely grapple with cybersecurity and fending off cybercrime, and corporate security isn't getting better fast enough. Cyber risk has emerged as a significant threat to the financial system: A recent IMF study suggests that average annual losses to financial institutions from cyberattacks could reach a few hundred billion dollars a year, potentially threatening financial stability. Hacker attacks on critical infrastructure are already alarming, and the security of our cyber-physical infrastructure — the computer-controlled facilities that produce and deliver our energy, water, and communications, for example — is dangerously exposed.

This imminent danger is the subject of study by Stuart Madnick, founding director of the Cybersecurity at MIT Sloan Initiative. In a recent article for The Wall Street Journal, Madnick warned of the weakest link in the defense against cyberattacks: people.

“Too many companies are making it easy for the attackers to succeed,” Madnick writes. “An analogy that I often use is this: You can get a stronger lock for your door, but if you are still leaving the key under your mat, are you really any more secure?”

In today’s landscape of escalating cybercrime, resiliency calls for a new kind of leadership and a cybersafe culture, requiring the active engagement of both technical and non-technical management. This holistic approach is all the more urgent given the shortage of cybersecurity personnel; in the U.S. alone, 1 to 2 million cybersecurity analyst roles will go unfilled this year. That approach is the focus of a new MIT Sloan Executive Education program taught by Stuart Madnick and his colleagues Keri Pearlson and Michael Siegel: Cybersecurity Leadership for Non-Technical Executives.

Cybersecurity issues are not purely a technology problem — they are multi-headed hydras that need to be addressed with a multi-disciplinary approach. This timely new program provides general managers with frameworks and best practices for managing cybersecurity-related risk. It also addresses the element common among many of the attacks that strike organizations every day — in particular, attacks that start as phishing or “spearphishing” emails. They rely on people falling for them.

“Such gullibility … is the result of a cyberculture where people are willing to share all kinds of information and try new things all the time,” writes Madnick in his recent WSJ article. “There are lots of good things about that, but also much that is dangerous. So now is the time for companies and institutions to change that culture. It won’t be easy, and it will take some time. But it’s crucial if we want our companies and information to be safe from cybertheft. We have to start now, and we have to do it right.”

The first session of Cybersecurity Leadership for Non-Technical Executives will occur Nov. 6-7. The program will be offered again in April and July of 2019.

Probiotics and antibiotics create a killer combination

Wed, 10/17/2018 - 10:22am

In the fight against drug-resistant bacteria, MIT researchers have enlisted the help of beneficial bacteria known as probiotics.

In a new study, the researchers showed that by delivering a combination of antibiotic drugs and probiotics, they could eradicate two strains of drug-resistant bacteria that often infect wounds. To achieve this, they encapsulated the probiotic bacteria in a protective shell of alginate, a biocompatible material that prevents the probiotics from being killed by the antibiotic.

“There are so many bacteria now that are resistant to antibiotics, which is a serious problem for human health. We think one way to treat them is by encapsulating a live probiotic and letting it do its job,” says Ana Jaklenec, a research scientist at MIT’s Koch Institute for Integrative Cancer Research and one of the senior authors of the study.

If shown to be successful in future tests in animals and humans, the probiotic/antibiotic combination could be incorporated into dressings for wounds, where it could help heal infected chronic wounds, the researchers say.

Robert Langer, the David H. Koch Institute Professor and a member of the Koch Institute, is also a senior author of the paper, which appears in the journal Advanced Materials on Oct. 17. Zhihao Li, a former MIT visiting scientist, is the study’s lead author.

Bacteria wars

The human body contains trillions of bacterial cells, many of which are beneficial. In some cases, these bacteria help fend off infection by secreting antimicrobial peptides and other compounds that kill pathogenic strains of bacteria. Others outcompete harmful strains by taking up nutrients and other critical resources.

Scientists have previously tested the idea of applying probiotics to chronic wounds, and they’ve had some success in studies of patients with burns, Li says. However, the probiotic strains usually can’t combat all of the bacteria that would be found in an infected wound. Combining these strains with traditional antibiotics would help to kill more of the pathogenic bacteria, but the antibiotic would likely also kill off the probiotic bacteria.

The MIT team devised a way to get around this problem by encapsulating the probiotic bacteria so that they would not be affected by the antibiotic. They chose alginate in part because it is already used in dressings for chronic wounds, where it helps to absorb secretions and keep the wound dry. The researchers also found that alginate is a component of the biofilms that clusters of bacteria form to protect themselves from antibiotics.

“We looked into the molecular components of biofilms and we found that for Pseudomonas infection, alginate is very important for its resistance against antibiotics,” Li says. “However, so far no one has used this ability to protect good bacteria from antibiotics.”

For this study, the researchers chose to encapsulate a type of commercially available probiotic known as Bio-K+, which consists of three strains of Lactobacillus bacteria. These strains are known to kill methicillin-resistant Staphylococcus aureus (MRSA). The exact mechanism by which they do this is not known, but one possibility is that the pathogens are susceptible to lactic acid produced by the probiotics. Another possibility is that the probiotics secrete antimicrobial peptides or other proteins that kill the pathogens or disrupt their ability to form biofilms.

The researchers delivered the encapsulated probiotics along with an antibiotic called tobramycin, which they chose from among several antibiotics they tested because it effectively kills Pseudomonas aeruginosa, another strain commonly found in wound infections. When MRSA and Pseudomonas aeruginosa growing in a lab dish were exposed to the combination of encapsulated Bio-K+ and tobramycin, all of the pathogenic bacteria were wiped out.

“It was quite a drastic effect,” Jaklenec says. “It completely eradicated the bacteria.”

When they tried the same experiment with nonencapsulated probiotics, the probiotics were killed by the antibiotics, allowing the MRSA bacteria to survive.

“When we just used one component, either antibiotics or probiotics, they couldn’t eradicate all the pathogens. That’s something which can be very important in clinical settings where you have wounds with different bacteria, and antibiotics are not enough to kill all the bacteria,” Li says.

Better wound healing

The researchers envision that this approach could be used to develop new types of bandages or other wound dressings embedded with antibiotics and alginate-encapsulated probiotics. Before that can happen, they plan to further test the approach in animals and possibly in humans.

“The good thing about alginate is it’s FDA-approved, and the probiotic we use is approved as well,” Li says. “I think probiotics can be something that may revolutionize wound treatment in the future. With our work, we have expanded the application possibilities of probiotics.”

In a study published in 2016, the researchers demonstrated that coating probiotics with layers of alginate and another polysaccharide called chitosan could protect them from being broken down in the gastrointestinal tract. This could help researchers develop ways to treat disease or improve digestion with orally delivered probiotics. Another potential application is using these probiotics to replenish the gut microbiome after treatment with antibiotics, which can wipe out beneficial bacteria at the same time that they clear up an infection.

Li’s work on this project was funded by the Swiss Janggen-Poehn Foundation and by Beatrice Beck-Schimmer and Hans-Ruedi Gonzenbach.

Angelika Amon wins 2019 Breakthrough Prize in Life Sciences

Wed, 10/17/2018 - 10:00am

Angelika Amon, an MIT professor of biology, is one of five scientists who will receive a 2019 Breakthrough Prize in Life Sciences, given for transformative advances toward understanding living systems and extending human life.

Amon, the Kathleen and Curtis Marble Professor in Cancer Research and a member of MIT’s Koch Institute for Integrative Cancer Research, was honored for her work in determining the consequences of aneuploidy, an abnormal chromosome number that results from mis-segregation of chromosomes during cell division.

The award, announced this morning, comes with a $3 million prize.

“Angelika Amon is an outstanding choice to receive the Breakthrough Prize,” says Tyler Jacks, director of the Koch Institute and the David H. Koch Professor of Biology. “Her work on understanding how cells control the decisions to divide and the effects of imbalances in chromosome number has helped shape how we think about normal development and disease. Angelika is a fearless investigator and a true scientist’s scientist. All of us in the Koch Institute and across MIT are thrilled by this news.”

Two MIT alumni, Charles Kane PhD ’89 and Eugene Mele PhD ’78, both professors at the University of Pennsylvania, will share a Breakthrough Prize in Fundamental Physics. Kane and Mele are being recognized for their new ideas about topology and symmetry in physics, leading to the prediction of a new class of materials that conduct electricity only on their surface.

New Horizons winners

Also announced today, three MIT physics researchers will receive the $100,000 New Horizons in Physics Prize, awarded to promising junior researchers who have already produced important work.

Lisa Barsotti, a principal research scientist at MIT’s Kavli Institute, and Matthew Evans, an MIT associate professor of physics, will share the prize with Rana Adhikari of Caltech for their work on ground-based detectors of gravitational waves. Daniel Harlow, an MIT assistant professor of physics, will share the prize with Daniel Jafferis of Harvard University and Aron Wall of Stanford University for their work generating fundamental insights about quantum information, quantum field theory, and gravity.

Additionally, Chenyang Xu, an MIT professor of mathematics, will receive a 2019 New Horizons in Mathematics Prize for his work in the minimal model program and applications to the moduli of algebraic varieties.

“On behalf of the School of Science, I congratulate Angelika Amon for this extraordinary honor, in recognition of her brilliant work that expands our understanding of cellular mechanisms that may lead to cancer,” says Michael Sipser, dean of the MIT School of Science and the Donner Professor of Mathematics. “We celebrate all recipients of these prestigious awards, including MIT’s four researchers whose impressive early-career achievements in physics and mathematics are being recognized today. Our scientists pursue fundamental research that advances human knowledge, which in turn leads to a better world.”

Chromosome imbalance

Most living cells have a defined number of chromosomes. Human cells, for example, have 23 pairs of chromosomes. However, as cells divide, they can make errors that lead to a gain or loss of chromosomes.

Amon has spent much of her career studying how this condition affects cells. When aneuploidy occurs in embryonic cells, it is almost always fatal to the organism. For human embryos, extra copies of any chromosome are lethal, with the exceptions of chromosome 21, which produces Down syndrome; chromosomes 13 and 18, which lead to developmental disorders known as Patau and Edwards syndromes; and the X and Y sex chromosomes, extra copies of which may sometimes cause various disorders but are not usually lethal.

In recent years, Amon’s lab has been exploring an apparent paradox of aneuploidy: When normal adult cells become aneuploid, it impairs their ability to survive and proliferate; however, cancer cells, which are nearly all aneuploid, can grow uncontrollably. Amon has shown that aneuploidy disrupts cells’ usual error-repair systems, allowing genetic mutations to quickly accumulate.

A better understanding of the consequences of aneuploidy could shed light on how cancer cells evolve and help to identify new therapeutic targets for cancer. Last year, Amon discovered a mechanism that the immune system uses to eliminate aneuploid cells from the body, raising the possibility of harnessing this system, which relies on natural killer cells, to destroy cancer cells.

Amon, who was informed of the prize several weeks ago, was sworn to secrecy until today’s announcement.

“When I received the phone call, I was driving in the car with my daughter, and it was really hard to not be too excited and thereby spill the beans,” she says. “Of course I am thrilled that our work is recognized in this manner.”

Scientists Frank Bennett of Ionis Pharmaceuticals, Adrian Krainer of Cold Spring Harbor Laboratory, Xiaowei Zhuang of Harvard University, and Zhijian Chen of the University of Texas Southwestern Medical Center will also receive Breakthrough Prizes in Life Sciences.

The 2019 Breakthrough Prize and New Horizons Prize recipients will be recognized at the seventh annual Breakthrough Prize ceremony, hosted by actor, producer and philanthropist Pierce Brosnan, on Sunday, Nov. 4, at NASA Ames Research Center in Mountain View, California, and broadcast live on National Geographic.

A step toward personalized, automated smart homes

Wed, 10/17/2018 - 12:00am

Developing automated systems that track occupants and self-adapt to their preferences is a major next step for the future of smart homes. When you walk into a room, for instance, a system could set the room to your preferred temperature. Or when you sit on the couch, a system could instantly flick the television to your favorite channel.

But enabling a home system to recognize occupants as they move around the house is a more complex problem. Recently, systems have been built that localize humans by measuring the reflections of wireless signals off their bodies. But these systems can’t identify the individuals. Other systems can identify people, but only if they’re always carrying their mobile devices. Both types of systems also rely on tracking signals that can be weak or blocked by various structures.

MIT researchers have built a system that takes a step toward a fully automated smart home by identifying occupants, even when they’re not carrying mobile devices. The system, called Duet, uses reflected wireless signals to localize individuals. But it also incorporates algorithms that ping nearby mobile devices to predict the individuals’ identities, based on who last used the device and their predicted movement trajectory. It also uses logic to figure out who’s who, even in signal-denied areas.

“Smart homes are still based on explicit input from apps or telling Alexa to do something. Ideally, we want homes to be more reactive to what we do, to adapt to us,” says Deepak Vasisht, a PhD student in MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and lead author on a paper describing the system that was presented at last week’s Ubicomp conference. “If you enable location awareness and identification awareness for smart homes, you could do this automatically. Your home knows it’s you walking, and where you’re walking, and it can update itself.”

Experiments done in a two-bedroom apartment with four people and an office with nine people, over two weeks, showed the system can identify individuals with 96 percent and 94 percent accuracy, respectively, including when people weren’t carrying their smartphones or were in blocked areas.

But the system isn’t just a novelty. Duet could potentially be used to recognize intruders or ensure visitors don’t enter private areas of your home. Moreover, Vasisht says, the system could capture behavioral-analytics insights for health care applications. Someone suffering from depression, for instance, may move around more or less, depending on how they’re feeling on any given day. Such information, collected over time, could be valuable for monitoring and treatment.

“In behavioral studies, you care about how people are moving over time and how people are behaving,” Vasisht says. “All those questions can be answered by getting information on people’s locations and how they’re moving.”

The researchers envision that their system would be used with explicit consent from anyone who would be identified and tracked with Duet. If needed, they could also develop an app for users to grant or revoke Duet’s access to their location information at any time, Vasisht adds.

Co-authors on the paper are: Dina Katabi, the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science; former CSAIL researcher Anubhav Jain ’16; and CSAIL PhD students Chen-Yu Hsu and Zachary Kabelac.

Tracking and identification

Duet is a wireless sensor, about a foot and a half square, that’s installed on a wall. It incorporates a floor map with annotated areas, such as the bedroom, kitchen, bed, and living room couch. It also collects identification tags from the occupants’ phones.

The system builds upon a device-based localization system built by Vasisht, Katabi, and other researchers that tracks individuals within tens of centimeters, based on wireless signal reflections from their devices. It does so by using a central node to calculate the time it takes the signals to hit a person’s device and travel back. In experiments, the system was able to pinpoint where people were in a two-bedroom apartment and in a café.

The system, however, relied on people carrying mobile devices. “But in building [Duet] we realized, at home you don’t always carry your phone,” Vasisht says. “Most people leave devices on desks or tables, and walk around the house.”

The researchers combined their device-based localization with a device-free tracking system, called WiTrack, developed by Katabi and other CSAIL researchers, that localizes people by measuring the reflections of wireless signals off their bodies.

Duet locates a smartphone and correlates its movement with individual movement captured by the device-free localization. If both are moving in tightly correlated trajectories, the system pairs the device with the individual and, therefore, knows the identity of the individual.
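
Conceptually, the pairing step is a small matching problem: measure how strongly each phone’s trajectory correlates with each tracked person’s trajectory, then assign phones to people so that total correlation is maximized. The sketch below illustrates that idea only; it is not Duet’s published algorithm, and the correlation measure and function names are our own assumptions.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def trajectory_correlation(a, b):
    """Mean Pearson correlation of x and y coordinates over time.
    a, b: arrays of shape (T, 2), positions sampled at the same instants."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return np.mean([np.corrcoef(a[:, d], b[:, d])[0, 1] for d in (0, 1)])

def pair_devices_to_people(device_tracks, person_tracks):
    """Assign each phone to the person whose motion best matches it."""
    cost = np.array([[-trajectory_correlation(d, p) for p in person_tracks]
                     for d in device_tracks])   # negate: the solver minimizes
    rows, cols = linear_sum_assignment(cost)
    return dict(zip(rows, cols))                # device index -> person index

# Toy check: a phone carried by a walking person pairs with that person.
t = np.linspace(0, 1, 50)
walker = np.stack([t, np.sin(3 * t)], axis=1)
sitter = np.full((50, 2), 3.0) + 0.01 * np.random.randn(50, 2)
phone = walker + 0.05 * np.random.randn(50, 2)
print(pair_devices_to_people([phone], [walker, sitter]))  # {0: 0}, the walker
```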

To ensure Duet knows someone’s identity when they’re away from their device, the researchers designed the system to capture the power profile of the signal received from the phone when it’s used. That profile changes depending on the orientation of the signal, and that change can be mapped to an individual’s trajectory to identify them. For example, when a phone is used and then put down, the system will capture the initial power profile. Then it will estimate how the power profile would look if it were still being carried along a path by a nearby moving individual. The closer the changing power profile correlates to the moving individual’s path, the more likely it is that individual owns the phone.
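
One way to picture this: under a textbook log-distance path-loss model, the power a fixed receiver sees from a phone falls off predictably with distance, so any candidate trajectory implies a candidate power profile that can be correlated against the observed one. This is a simplified sketch under that assumption; Duet’s actual signal model is more sophisticated, and the sensor position and parameters here are invented.

```python
import numpy as np

SENSOR = np.array([4.0, 0.0])   # assumed position of the wall-mounted unit

def predicted_power(path, p0=-30.0, n=2.0):
    """Log-distance path loss: received power (dB) drops with distance.
    path: (T, 2) candidate positions of the phone's carrier over time."""
    d = np.linalg.norm(np.asarray(path, float) - SENSOR, axis=1)
    return p0 - 10.0 * n * np.log10(d)      # assumes the path avoids d = 0

def owner_score(observed_power, candidate_path):
    """Correlate the observed power profile with the profile the phone
    would produce if carried along the candidate trajectory."""
    return np.corrcoef(observed_power, predicted_power(candidate_path))[0, 1]
```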

Logical thinking

One final issue is that structures such as bathroom tiles, television screens, mirrors, and various metal equipment can block signals.

To compensate for that, the researchers incorporated probabilistic algorithms to apply logical reasoning to localization. To do so, they designed the system to recognize entrance and exit boundaries of specific spaces in the home, such as doors to each room, the bedside, and the side of a couch. At any moment, the system will recognize the most likely identity for each individual in each boundary. It then infers who is who by process of elimination.

Suppose an apartment has two occupants: Alisha and Betsy. Duet sees Alisha and Betsy walk into the living room by pairing their smartphone motion with their movement trajectories. Both then leave their phones on a nearby coffee table to charge — Betsy goes into the bedroom to nap; Alisha stays on the couch to watch television. Duet infers that Betsy has entered the bed boundary and didn’t exit, so she must be on the bed. After a while, Alisha and Betsy move into, say, the kitchen — and the signal drops. Duet reasons that two people are in the kitchen, but it doesn’t know their identities. When Betsy returns to the living room and picks up her phone, however, the system automatically re-tags that individual as Betsy. By process of elimination, the other person still in the kitchen must be Alisha.
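
The example boils down to constraint propagation over candidate identity sets: once one track is pinned to Betsy, Betsy can be struck from every other track’s candidates. Here is a deterministic toy version of that elimination step; Duet’s real inference is probabilistic, and the names and structure here are ours.

```python
def eliminate(candidates):
    """Propagate process-of-elimination over per-track identity sets.
    candidates: dict mapping track id -> set of possible occupant names."""
    changed = True
    while changed:
        changed = False
        resolved = {next(iter(s)) for s in candidates.values() if len(s) == 1}
        for s in candidates.values():
            if len(s) > 1 and s & resolved:
                s -= resolved            # a resolved name can't be this track
                changed = True
    return candidates

# The scenario above: two untagged people in the kitchen, until Betsy
# returns and picks up her phone, re-tagging her track.
tracks = {"track1": {"Alisha", "Betsy"}, "track2": {"Alisha", "Betsy"}}
tracks["track2"] = {"Betsy"}             # phone pickup pins track2 to Betsy
print(eliminate(tracks))                 # track1 resolves to Alisha
```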

“There are blind spots in homes where systems won’t work. But, because you have [a] logical framework, you can make these inferences,” Vasisht says.

“Duet takes a smart approach of combining the location of different devices and associating it to humans, and leverages device-free localization techniques for localizing humans,” says Ranveer Chandra, a principal researcher at Microsoft, who was not involved in the work. “Accurately determining the location of all residents in a home has the potential to significantly enhance the in-home experience of users. … The home assistant can personalize the responses based on who all are around it; the temperature can be automatically controlled based on personal preferences, thereby resulting in energy savings. Future robots in the home could be more intelligent if they knew who was where in the house. The potential is endless.”

Next, the researchers aim for long-term deployments of Duet in more spaces and to provide high-level analytic services for applications such as health monitoring and responsive smart homes.

Arctic ice sets speed limit for major ocean current

Wed, 10/17/2018 - 12:00am

The Beaufort Gyre is an enormous, 600-mile-wide pool of swirling cold, fresh water in the Arctic Ocean, just north of Alaska and Canada. In the winter, this current is covered by a thick cap of ice. Each summer, as the ice melts away, the exposed gyre gathers up sea ice and river runoff, and draws it down to create a huge reservoir of frigid fresh water, equal to the volume of all the Great Lakes combined.

Scientists at MIT have now identified a key mechanism, which they call the “ice-ocean governor,” that controls how fast the Beaufort Gyre spins and how much fresh water it stores. In a paper published today in Geophysical Research Letters, the researchers report that the Arctic’s ice cover essentially sets a speed limit on the gyre’s spin.

In the past two decades, as temperatures have risen globally, the Arctic’s summer ice has progressively shrunk in size. The team has observed that, with less ice available to control the Beaufort Gyre’s spin, the current has sped up in recent years, gathering up more sea ice and expanding in both volume and depth.

If global temperatures continue to climb, the researchers expect that the mechanism governing the gyre’s spin will diminish. With no governor to limit its speed, the researchers say the gyre will likely transition into “a new regime” and eventually spill over, like an overflowing bathtub, releasing huge volumes of cold, fresh water into the North Atlantic, which could affect the global climate and ocean circulation.

“This changing ice cover in the Arctic is changing the system which is driving the Beaufort Gyre, and changing its stability and intensity,” says Gianluca Meneghello, a research scientist in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “If all this fresh water is released, it will affect the circulation of the Atlantic.”

Meneghello is a co-author of the paper, along with John Marshall, the Cecil and Ida Green Professor of Oceanography, Jean-Michel Campin and Edward Doddridge of MIT, and Mary-Louise Timmermans of Yale University.

A “new Arctic ocean”

There have been a handful of times in the recorded past when the Beaufort Gyre has spilled over, beginning with the Great Salinity Anomaly in the late 1960s, when the gyre sent a surge of cold, fresh water southward. Fresh water has the potential to dampen the ocean’s overturning circulation, affecting surface temperatures and perhaps storminess and climate.

Similar events could transpire if the Arctic ice controlling the Beaufort Gyre’s spin continues to recede each year.

“If this ice-ocean governor goes away, then we will end up with basically a new Arctic ocean,” Marshall says.

“Nature has a natural governor”

The researchers began looking into the dynamics of the Beaufort Gyre several years ago. At that time, they used measurements taken by satellites between 2003 and 2014 to track the movement of the Arctic ice cover, along with the speed of the Arctic wind. They used these measurements of ice and wind speed to estimate how fast the Beaufort Gyre must be downwelling, or spinning down beneath the ice. But the number they came up with was much smaller than they expected.

“We thought there was a coding error,” Marshall recalls. “But it turns out there was something else kicking back.” In other words, there must be some other mechanism that was limiting, or slowing down, the gyre’s spin.

The team recalculated the gyre’s speed, this time by including estimates of ocean current activity in and around the gyre, which they inferred from satellite measurements of sea surface heights. The new estimate, Meneghello says, was “much more reasonable.”

In this new paper, the researchers studied the interplay of ice, wind, and ocean currents in more depth, using a high-resolution, idealized representation of ocean circulation based on the MIT General Circulation Model, built by Marshall’s group. They used this model to simulate the seasonal activity of the Beaufort Gyre as the Arctic ice expands and recedes each year.

They found that in the spring, as the Arctic ice melts away, the gyre is exposed to the wind, which acts to whip up the ocean current, causing it to spin faster and draw down more fresh water from the Arctic’s river runoff and melting ice. In the winter, as the Arctic ice sheet expands, the ice acts as a lid, shielding the gyre from the fast-moving winds. As a result, the gyre spins against the underside of the ice and eventually slows down.

“The ice moves much slower than wind, and when the gyre reaches the velocity of the ice, at this point, there is no friction — they’re rotating together, and there’s nothing applying a stress [to speed up the gyre],” Meneghello says. “This is the mechanism that governs the gyre’s speed.”

“In mechanical systems, the governor, or limiter, kicks in when things are going too fast,” Marshall adds. “We found nature has a natural governor in the Arctic.”
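
The governor can be caricatured as a relaxation process: in open water, wind stress accelerates the gyre toward the wind-driven speed; under ice, drag pulls it toward the much slower ice velocity, and the stress vanishes once the two move together. The toy model below is our simplification with invented parameters, far simpler than the MIT General Circulation Model simulations the team actually ran, but it shows how a longer ice-free season raises the gyre’s mean speed.

```python
import numpy as np

def simulate_gyre(years=5, steps_per_year=360, ice_free_frac=0.5,
                  u_wind=1.0, u_ice=0.1, k=3.0):
    """Toy relaxation caricature of the ice-ocean governor.
    Summer (open water): wind spins the gyre up toward u_wind.
    Winter (ice lid): friction drags it toward the slow ice speed u_ice,
    and the stress shrinks as the gyre approaches the ice velocity."""
    dt = 1.0 / steps_per_year
    u, speeds = 0.0, []
    for t in np.arange(0.0, years, dt):
        ice_free = (t % 1.0) < ice_free_frac
        target = u_wind if ice_free else u_ice
        u += k * (target - u) * dt       # stress ~ velocity difference
        speeds.append(u)
    return np.array(speeds)

# A longer ice-free season (a warming Arctic) raises the mean gyre speed.
print(simulate_gyre(ice_free_frac=0.4).mean())
print(simulate_gyre(ice_free_frac=0.7).mean())
```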

The evolution of sea ice over the Beaufort Gyre: In springtime, as ice thaws and melts into the sea, the gyre is exposed to the Arctic winds. Courtesy of the researchers

“In a warming world”

Marshall and Meneghello note that, as Arctic temperatures have risen in the last two decades, and summertime ice has shrunk with each year, the speed of the Beaufort Gyre has increased. Its currents have become more variable and unpredictable, and are only slightly slowed by the return of ice in the winter.

“At some point, if this trend continues, the gyre can’t swallow all this fresh water that it’s drawing down,” Marshall says. Eventually, the levee will likely break and the gyre will burst, releasing hundreds of billions of gallons of cold, fresh water into the North Atlantic.

An increasingly unstable Beaufort Gyre could also disrupt the Arctic’s halocline — the layer of ocean water underlying the gyre’s cold, fresh water that insulates it from much deeper, warmer, and saltier water. If the halocline is somehow weakened by a more unstable gyre, this could encourage warmer waters to rise up, further melting the Arctic ice.

“This is part of what we’re seeing in a warming world,” Marshall says. “We know the global mean temperatures are going up, but the Arctic temperatures are going up even more. So the Arctic is very vulnerable to climate change. And we’re going to live through a period where the governor goes away, essentially.”

This research was supported, in part, by the National Science Foundation.

Kristin Bergmann named a 2018 Packard Fellow

Tue, 10/16/2018 - 1:15pm

Kristin Bergmann, the Victor P. Starr Career Development Assistant Professor in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS), has been awarded a 2018 Packard Fellowship in Science and Engineering. Bergmann is one of 18 early-career scientists in the nation selected this year. The prestigious fellowship, which includes a research grant of $875,000, encourages researchers to take risks and explore new frontiers in their field.

“We are all extremely proud and happy that Kristin has received this well-deserved honor,” said Robert van der Hilst, the Schlumberger Professor of Earth and Planetary Sciences, EAPS department head, and a Packard Fellow himself. “Kristin is a wonderful colleague, deeply engaged with our academic community. Running a lab and a field program is a major challenge, and the Packard Fellowship will help her pursue her exciting and ambitious studies of geological processes in Earth’s deep time.”

Bergmann is a geobiologist who reconstructs Earth’s ancient climate and surface environments. She uses methods spanning field measurements, isotope geochemistry and microanalysis to study rocks deposited in ancient oceans before and during the evolution of early animals.

“It is a great honor to have our work recognized and supported by the David and Lucile Packard Foundation,” Bergmann said.

During her fellowship, Bergmann will study ancient climate dynamics and dramatic environmental changes that accompany the emergence and dominance of multicellular, complex life on Earth. “I am fortunate at MIT to be able to pursue a research agenda that includes both field observations and laboratory-based geochemical techniques,” said Bergmann. “Often a researcher feels pulled between whether to spend months in the field or in the lab, but combining and balancing these allows my students to approach a problem from two sides.” By understanding the rocks within their environmental context, Bergmann can focus her research. “Where the sample comes from and its context is as important to me as the laboratory measurements we make at MIT and elsewhere. The Packard Fellowship will support this multidimensional approach.” 

Bergmann feels grateful for and inspired by the award: “Geobiology is an interdisciplinary field requiring a variety of approaches, and I’m very lucky to have the chance to interact with and learn from diverse, passionate scientists here at MIT and, before that, at Carleton College, Caltech, and Harvard. I look forward to meeting and interacting with other Packard Fellows from across the country.”

The David and Lucile Packard Foundation is a private family foundation created by David Packard, cofounder of the Hewlett-Packard Company.

Joining the resolution revolution

Tue, 10/16/2018 - 12:40pm

It's a time of small marvels and big ideas. Welcome to the Nano Age.

MIT’s preparations for this new era are in full swing, including the recent launch of MIT.nano, the Institute's center for nanoscience and nanotechnology. And on the day after MIT.nano’s opening ceremonies, the Department of Biology hosted its Cryogenic Electron Microscopy (Cryo-EM) Symposium, which was co-organized by biology professor Thomas Schwartz and the director of the new facility, Edward Brignole.

“We organized the symposium to raise awareness of this new research capacity, and to celebrate the many people who worked to fund these instruments, design the space, build the suites, and set up the microscopes,” Brignole said of the Oct. 5 event. “We also wanted to bring together the various groups across MIT working on diverse technologies to improve Cryo-EM, from mathematicians, computer scientists, and electrical engineers to biologists, chemists, and biological engineers.”

The event featured pioneers leveraging Cryo-EM for various interdisciplinary applications both on campus and outside of MIT — from biology and machine learning to quantum mechanics.

The program included Ed Egelman from the University of Virginia, Mark Bathe from the MIT Department of Biological Engineering, Katya Heldwein from Tufts University's School of Medicine, and Karl Berggren from the Department of Electrical Engineering and Computer Science. Also giving talks were computational and systems biology graduate student Tristan Bepler from MIT's Computer Science and Artificial Intelligence Laboratory, Luke Chao from Harvard Medical School and Massachusetts General Hospital, postdoc Kuang Shen from the Whitehead Institute at MIT, and graduate student Jonathan Weed from the MIT Department of Mathematics. The talks were followed by a reception in Building 68 and guided tours of the Cryo-EM Facility.

Unlike other popular techniques for determining 3-D macromolecular structures, Cryo-EM permits researchers to visualize more diverse and complex molecular assemblies in multiple conformations. The new facility is housed in precisely climate-controlled rooms in the basement of MIT.nano, built atop a 5-million-pound slab of concrete to minimize vibrations. Two multimillion-dollar microscopes, the first instruments to be installed in MIT.nano, will enable scientists to analyze cellular machinery in near-atomic detail.

As Schwartz explained to an audience of more than 100 people during his opening remarks, he and his colleagues realized they needed to bring this technology to the MIT community. Like many of the center’s other tools, the microscopes would be too costly to purchase and too onerous to maintain for any single researcher or lab.

“Microscopes are very special and expensive tools, so this endeavor turned out to be much more involved than anything else I have done during my 14 years at MIT,” he said. “But this was not an effort of one or two people, it really took a whole community. We have many people to thank today.”

Establishing the Cryogenic Electron Microscopy Facility at MIT has been a long-time dream for Catherine Drennan, a professor of chemistry and biology and a Howard Hughes Medical Institute investigator. At the symposium, Drennan spoke about her work using the microscopes to capture snapshots of enzymes in action.

She remembers it was a “happy coincidence” that the plans for MIT.nano and the Cryo-EM Facility unfolded around the same time and then merged together to become one multi-disciplinary effort. Drennan, Schwartz, and others worked closely with MIT.nano Founding Director Vladimir Bulović and Vice President for Research Maria Zuber to gain institutional support and jumpstart the project. It took six years to design and construct MIT.nano, and four to implement the Cryo-EM Facility.

“We had this vision that the Cryo-EM Facility would be a shared space that people from all around MIT would use,” Drennan said.

An anonymous donor offered $5 million to fund the first microscope, the Titan Krios, while the Arnold and Mabel Beckman Foundation contributed $2.5 million to purchase the second, the Talos Arctica.

“The Beckman Foundation is really pleased to be supporting this kind of technology,” said Anne Hultgren, the foundation's executive director, who attended the symposium. “It was a win-win in terms of the timing and the need in the community. We are excited to be part of this effort, and to drive forward new innovations and experiments.”

The Beckman Foundation has made similar instrumentation grants to Johns Hopkins University School of Medicine, University of Pennsylvania’s Perelman School of Medicine, the University of Utah, and the University of Washington School of Medicine.

Drennan said that as the revolution in resolution continues to build, she hopes MIT’s new microscopes will bolster the resurging Cryo-EM community that’s slowly growing in and around Boston.

“Thanks to facilities like this, the Boston area went from being way behind the curve to right in front of it,” she said. “It's an incredibly exciting time, and I can’t wait to see how we learn and grow as a research community.”

Automated system identifies dense tissue, a risk factor for breast cancer, in mammograms

Tue, 10/16/2018 - 11:09am

Researchers from MIT and Massachusetts General Hospital have developed an automated model that assesses dense breast tissue in mammograms — which is an independent risk factor for breast cancer — as reliably as expert radiologists.

This marks the first time a deep-learning model of its kind has successfully been used in a clinic on real patients, according to the researchers. With broad implementation, the researchers hope the model can help bring greater reliability to breast density assessments across the nation.

It’s estimated that more than 40 percent of U.S. women have dense breast tissue, which alone increases the risk of breast cancer. Moreover, dense tissue can mask cancers on the mammogram, making screening more difficult. As a result, 30 U.S. states mandate that women must be notified if their mammograms indicate they have dense breasts.

But breast density assessment relies on subjective human judgment. Due to many factors, results vary — sometimes dramatically — across radiologists. The MIT and MGH researchers trained a deep-learning model on tens of thousands of high-quality digital mammograms to learn to distinguish different types of breast tissue, from fatty to extremely dense, based on expert assessments. Given a new mammogram, the model can then identify a density measurement that closely aligns with expert opinion.

“Breast density is an independent risk factor that drives how we communicate with women about their cancer risk. Our motivation was to create an accurate and consistent tool that can be shared and used across health care systems,” says Adam Yala, a PhD student in MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and second author on a paper describing the model that was published today in Radiology.

The other co-authors are first author Constance Lehman, professor of radiology at Harvard Medical School and the director of breast imaging at the MGH; and senior author Regina Barzilay, the Delta Electronics Professor at CSAIL and the Department of Electrical Engineering and Computer Science at MIT and a member of the Koch Institute for Integrative Cancer Research at MIT.

Mapping density

The model is built on a convolutional neural network (CNN), a type of model widely used for computer vision tasks. The researchers trained and tested their model on a dataset of more than 58,000 randomly selected mammograms from more than 39,000 women screened between 2009 and 2011. For training, they used around 41,000 mammograms and, for testing, about 8,600 mammograms.

Each mammogram in the dataset has a standard Breast Imaging Reporting and Data System (BI-RADS) breast density rating in four categories: fatty, scattered (scattered density), heterogeneous (mostly dense), and dense. In both training and testing mammograms, about 40 percent were assessed as heterogeneous and dense.

During the training process, the model is given random mammograms to analyze. It learns to map each mammogram to an expert radiologist’s density rating. Dense breasts, for instance, contain glandular and fibrous connective tissue, which appear as compact networks of thick white lines and solid white patches. Fatty tissue networks appear much thinner, with gray areas throughout. In testing, the model observes new mammograms and predicts the most likely density category.
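
In code, the heart of such a classifier is a stack of convolutional layers feeding a four-way output. The PyTorch sketch below shows the general shape only; the architecture, input size, and preprocessing are placeholders and do not reproduce the published model.

```python
import torch
import torch.nn as nn

class DensityClassifier(nn.Module):
    """Minimal CNN mapping a grayscale mammogram to the four BI-RADS
    density categories (fatty, scattered, heterogeneous, dense)."""
    def __init__(self, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),        # pool to one value per channel
        )
        self.head = nn.Linear(32, n_classes)

    def forward(self, x):                   # x: (batch, 1, H, W)
        return self.head(self.features(x).flatten(1))

model = DensityClassifier()
logits = model(torch.randn(1, 1, 256, 256))   # stand-in for a real scan
print(logits.argmax(dim=1))                   # most likely density class
```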

Matching assessments

The model was implemented at the breast imaging division at MGH. In a traditional workflow, when a mammogram is taken, it’s sent to a workstation for a radiologist to assess. The researchers’ model is installed on a separate machine that intercepts the scans before they reach the radiologist and assigns each mammogram a density rating. When radiologists pull up a scan at their workstations, they’ll see the model’s assigned rating, which they can then accept or reject.

“It takes less than a second per image … [and it can be] easily and cheaply scaled throughout hospitals,” Yala says.

On more than 10,000 mammograms at MGH from January to May of this year, the model achieved 94 percent agreement with the hospital’s radiologists in a binary test — determining whether breasts were either heterogeneous and dense, or fatty and scattered. Across all four BI-RADS categories, it matched the radiologists’ assessments 90 percent of the time. “MGH is a top breast imaging center with high inter-radiologist agreement, and this high-quality dataset enabled us to develop a strong model,” Yala says.

In general testing using the original dataset, the model matched the original human expert interpretations 77 percent of the time across the four BI-RADS categories and, in binary tests, 87 percent of the time.

To compare against traditional prediction models, the researchers used a metric called the kappa score, where 1 indicates that two sets of ratings agree every time and lower values indicate less agreement. Commercially available automatic density-assessment models top out at kappa scores of about 0.6. The researchers’ model achieved a kappa score of 0.85 in the clinical application and 0.67 in testing, making its predictions better than those of traditional models.

In an additional experiment, the researchers tested the model’s agreement with consensus from five MGH radiologists from 500 random test mammograms. The radiologists assigned breast density to the mammograms without knowledge of the original assessment, or their peers’ or the model’s assessments. In this experiment, the model achieved a kappa score of 0.78 with the radiologist consensus.
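
For readers unfamiliar with the metric, Cohen’s kappa compares observed agreement against the agreement two raters would reach by chance, given how often each uses each category. A minimal implementation, with made-up labels:

```python
import numpy as np

def cohen_kappa(ratings_a, ratings_b, n_categories):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e), agreement beyond chance."""
    a, b = np.asarray(ratings_a), np.asarray(ratings_b)
    p_o = np.mean(a == b)                            # observed agreement
    p_e = sum(np.mean(a == k) * np.mean(b == k)      # chance agreement
              for k in range(n_categories))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical model vs. radiologist labels over the four BI-RADS classes
# (0 = fatty, 1 = scattered, 2 = heterogeneous, 3 = dense).
model_labels = [0, 1, 2, 3, 2, 1, 1, 3, 0, 2]
radiologist  = [0, 1, 2, 3, 1, 1, 2, 3, 0, 2]
print(cohen_kappa(model_labels, radiologist, n_categories=4))   # ~0.73
```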

Next, the researchers aim to scale the model into other hospitals. “Building on this translational experience, we will explore how to transition machine-learning algorithms developed at MIT into [the] clinic, benefiting millions of patients,” Barzilay says. “This is a charter of the new center at MIT — the Abdul Latif Jameel Clinic for Machine Learning in Health at MIT — that was recently launched. And we are excited about new opportunities opened up by this center.”

This RNA-based technique could make gene therapy more effective

Tue, 10/16/2018 - 11:00am

Delivering functional genes into cells to replace mutated genes, an approach known as gene therapy, holds potential for treating many types of diseases. The earliest efforts to deliver genes to diseased cells focused on DNA, but many scientists are now exploring the possibility of using RNA instead, which could offer improved safety and easier delivery.

MIT biological engineers have now devised a way to regulate the expression of RNA once it gets into cells, giving them precise control over the dose of protein that a patient receives. This technology could allow doctors to more accurately tailor treatment for individual patients, and it also offers a way to quickly turn the genes off, if necessary.

“We can control very discretely how different genes are expressed,” says Jacob Becraft, an MIT graduate student and one of the lead authors of the study, which appears in the Oct. 16 issue of Nature Chemical Biology. “Historically, gene therapies have encountered issues regarding safety, but with new advances in synthetic biology, we can create entirely new paradigms of ‘smart therapeutics’ that actively engage with the patient’s own cells to increase efficacy and safety.”

Becraft and his colleagues at MIT have started a company to further develop this approach, with an initial focus on cancer treatment. Tyler Wagner, a recent Boston University PhD recipient, is also a lead author of the paper. Tasuku Kitada, a former MIT postdoc, and Ron Weiss, an MIT professor of biological engineering and member of the Koch Institute, are senior authors.

RNA circuits

Only a few gene therapies have been approved for human use so far, but scientists are working on and testing new gene therapy treatments for diseases such as sickle cell anemia, hemophilia, and congenital eye disease, among many others.

As a tool for gene therapy, DNA can be difficult to work with. When carried by synthetic nanoparticles, the particles must be delivered to the nucleus, which can be inefficient. Viruses are much more efficient at DNA delivery; however, they can be immunogenic, are difficult and expensive to produce, and often integrate their DNA into the cell’s own genome, limiting their applicability in genetic therapies.

Messenger RNA, or mRNA, offers a more direct, and nonpermanent, way to alter cells’ gene expression. In all living cells, mRNA carries copies of the information contained in DNA to cell organelles called ribosomes, which assemble the proteins encoded by genes. Therefore, by delivering mRNA encoding a particular gene, scientists can induce production of the desired protein without having to get genetic material into a cell’s nucleus or integrate it into the genome.

To help make RNA-based gene therapy more effective, the MIT team set out to precisely control the production of therapeutic proteins once the RNA gets inside cells. To do that, they decided to adapt synthetic biology principles, which allow for precise programming of synthetic DNA circuits, to RNA.

The researchers’ new circuits consist of a single strand of RNA that includes genes for the desired therapeutic proteins as well as genes for RNA-binding proteins, which control the expression of the therapeutic proteins.

“Due to the dynamic nature of replication, the circuits’ performance can be tuned to allow different proteins to express at different times, all from the same strand of RNA,” Becraft says.

This allows the researchers to turn on the circuits at the right time by using “small molecule” drugs that interact with RNA-binding proteins. When a drug such as doxycycline, which is already FDA-approved, is added to the cells, it can stabilize or destabilize the interaction between RNA and RNA-binding proteins, depending on how the circuit is designed. This interaction determines whether the proteins block RNA gene expression or not.
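
Conceptually, the drug acts as a dial on how much of the RNA-binding protein is bound to its target site, which in turn gates translation. The dose-response sketch below is our abstraction with illustrative parameters; the paper’s circuits are implemented molecularly, not as these equations.

```python
def expression_rate(drug, k_max=1.0, kd=0.5, n=2.0, drug_activates=True):
    """Hill-function sketch of drug-gated output from an RNA circuit.
    drug: small-molecule concentration (e.g., doxycycline), arbitrary units.
    drug_activates: whether binding turns expression on or off, mirroring
    how a circuit can be designed either way."""
    occupancy = drug**n / (kd**n + drug**n)   # fraction of bound regulator
    return k_max * (occupancy if drug_activates else 1.0 - occupancy)

for dose in (0.0, 0.5, 2.0):
    print(dose, expression_rate(dose))        # output rises with the dose
```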

In a previous study, the researchers also showed that they could build cell-specificity into their circuits, so that the RNA only becomes active in the target cells.

Targeting cancer

The company that the researchers started, Strand Therapeutics, is now working on adapting this approach to cancer immunotherapy — a new treatment strategy that involves stimulating a patient’s own immune system to attack tumors.

Using RNA, the researchers plan to develop circuits that can selectively stimulate immune cells to attack tumors, making it possible to target tumor cells that have metastasized to difficult-to-access parts of the body. For example, it has proven difficult to target cancerous cells in lung lesions with mRNA because of the risk of inflaming the lung tissue. With RNA circuits, the researchers would first deliver their therapy to targeted cancer cell types within the lung; through its genetic circuitry, the RNA would then activate T cells that treat the cancer’s metastases elsewhere in the body.

“The hope is to elicit an immune response which is able to pick up and treat the rest of the metastases throughout the body,” Becraft says. “If you’re able to treat one site of the cancer, then your immune system will take care of the rest, because you’ve now built an immune response against it.”

Using these kinds of RNA circuits, doctors would be able to adjust dosages based on how the patient is responding, the researchers say. The circuits also provide a quick way to turn off therapeutic protein production in cases where the patient’s immune system becomes overstimulated, which can be potentially fatal.

In the future, the researchers hope to develop more complex circuits that could be both diagnostic and therapeutic — first detecting a problem, such as a tumor, and then producing the appropriate drug.

The research was funded by the Defense Advanced Research Projects Agency, the National Science Foundation, the National Institutes of Health, the Ragon Institute of MGH, MIT, and Harvard, the Special Research Fund from Ghent University, and the Research Foundation – Flanders.

Collaboration runs through J-WAFS-funded projects

Tue, 10/16/2018 - 10:50am

“In order to do the kind and scale of work that we do, international collaboration is essential. However, this can be difficult to fund,” Chris Voigt said. “J-WAFS is providing the support that we need for the cross-institutional and cross-sector collaboration that is enabling our work to move forward.”

Voigt, a professor in the MIT Department of Biological Engineering, made those comments at the first of two research workshops produced by the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) on Sept. 14 and Sept. 28 at the Samberg Center. The annual workshop brings members of the MIT community together to learn about the latest research results from J-WAFS-funded teams, to hear about newly funded projects, and to provide feedback on each other’s work.

The specific collaboration Voigt was referring to is a project that connects the work on prokaryotic gene clusters in his lab to research at the Max Planck Institute of Molecular Plant Physiology in Germany and the Center for Plant Biotechnology and Genomics at the Universidad Politécnica in Spain.

Voigt and experts in plastid engineering and plant gene expression from these partnering institutions are working to engineer cereal grains to produce their own nitrogen, eliminating the need for added fertilizer. Their goal is to transform farming at every scale — reducing the greenhouse gas emissions of industrial fertilizer production, the eutrophication caused by nutrient runoff, and the cost of added nitrogen fertilizer. With a growing world population and increasing demand for grain as food and fuel, the need for innovations in agricultural technologies is urgent, yet the technical challenges are steep and often require complementary areas of expertise. When researchers like Voigt share their skills and resources with other global experts in pursuit of a shared goal, the combined effort has the potential to produce dramatic results.

Such collaboration is a hallmark of MIT’s research culture, and J-WAFS seeks to leverage it by being particularly welcoming of cross-disciplinary project proposals and research teams. In fact, the majority of J-WAFS’ current and concluding projects are led by two or more principal investigators, and many of those teams are cross-disciplinary.

In the case of a J-WAFS Solutions-funded project led by principal investigators Timothy Swager and Alexander Klibanov from the Department of Chemistry, interdisciplinary collaboration grew as the work on the project progressed. The team is developing a handheld food safety sensor that uses specialized droplets — called Janus emulsions — to test for bacterial contamination in food. The droplets behave like a dynamic lens, changing in the presence of specific bacteria. 

In developing optical systems that can indicate the presence or absence of bacteria, including salmonella, by analyzing the light either transmitted through or emanating from these dynamic lenses, the researchers realized that they did not have the expertise to fully understand the optics they observed when the droplets were exposed to light. For that, they needed help. Swager reached out to Mathias Kolle, an assistant professor in the Department of Mechanical Engineering, whose expertise in optical materials proved to be key. 

Kolle, who has received J-WAFS seed funding for his own work on industrial algae production, and his graduate student Sara Nagelberg provided the calculations necessary to understand the mechanics of light’s interaction with the particles. These insights contributed to sensor designs that were dramatically more effective, and the team has now launched a startup — Xibus Systems — and is currently working on product development. 

“This is the beginning of a much longer story for us,” Swager commented, reflecting on his collaboration with Kolle’s lab.

Several other research teams are applying multiple disciplinary perspectives to their work. 

In one project, Evelyn Wang, the Gail E. Kendall Professor in the Department of Mechanical Engineering, has teamed up with Mircea Dincă, an associate professor in the Department of Chemistry, to engineer highly adsorbent metal-organic frameworks in a device that pulls drinking water from air.

In another, assistant professor David Des Marais in the Department of Civil and Environmental Engineering is collaborating with Caroline Uhler, the Henry L. and Grace Doherty Assistant Professor in the Department of Electrical Engineering and Computer Science, to develop tools to analyze and understand the ways that genes regulate plants’ responses to environmental stressors such as drought. Their goal is to apply this understanding to better breed and engineer stress-tolerant plants so that crop yields can improve even as climate change creates more extreme growing conditions.

Meanwhile, J-WAFS itself collaborated with a partner program in organizing the event. The second day of the workshop coincided with the Tata Center’s annual research symposium, which was also held at the Samberg Center. J-WAFS and Tata’s missions have some significant overlaps — many Tata-funded MIT projects address food, water, and agriculture challenges in the developing world. The two groups merged audiences for their afternoon sessions and presentations to take advantage of these synergies, enabling participants of each event to interact and to learn about the food and water innovations that the programs are supporting.      

By funding research in all schools at MIT, and by seeding and supporting innovative collaboration that crosses departments and schools alike, J-WAFS seeks to advance research that can provide answers to what might be one of the most pressing questions of our time: How do we ensure safe and resilient supplies of water and food on our changing planet, now and in the future? When experts come together around an urgent question like this one, each approaches it from a different angle. And when successes emerge from collaborations in J-WAFS-funded projects, they demonstrate the value of MIT’s culture of interdisciplinary collaboration.

Nuno Loureiro: Probing the world of plasmas

Mon, 10/15/2018 - 11:59pm

Growing up in the small city of Viseu in central Portugal, Nuno Loureiro knew he wanted to be a scientist, even in the early years of primary school when “everyone else wanted to be a policeman or a fireman,” he recalls. He can’t quite place the origin of that interest in science: He was 17 the first time he met a scientist, he says with an amused look.

By the time Loureiro finished high school, his interest in science had crystallized, and “I realized that physics was what I liked best,” he says. During his undergraduate studies at IST Lisbon, he began to focus on fusion, which “seemed like a very appealing field” where major developments were likely during his lifetime, he says.

Fusion, and specifically the physics of plasmas, has remained his primary research focus ever since, through graduate school, postdoc stints, and now in his research and teaching at MIT. He explains that plasma research “lives in two different worlds.” On the one hand, it involves astrophysics, dealing with the processes that happen in and around stars; on the other, it’s part of the quest to generate electricity that’s clean and virtually inexhaustible, through fusion reactors.

Plasma is a sort of fourth phase of matter, similar to a gas but with the atoms stripped apart into a kind of soup of electrons and ions. It forms about 99 percent of the visible matter in the universe, including stars and the wispy tendrils of material spread between them. Among the trickiest challenges to understanding the behavior of plasmas is their turbulence, which can dissipate away energy from a reactor, and which proceeds in very complex and hard-to-predict ways — a major stumbling block so far to practical fusion power.

While everyone is familiar with turbulence in fluids, from breaking waves to cream stirred into coffee, plasma turbulence can be quite different, Loureiro explains, because plasmas are riddled with magnetic and electric fields that push and pull them in dynamic ways. “A very noteworthy example is the solar wind,” he says, referring to the ongoing but highly variable stream of particles ejected by the sun and sweeping past Earth, sometimes producing auroras and affecting the electronics of communications satellites. Predicting the dynamics of such flows is a major goal of plasma research.

“The solar wind is the best plasma turbulence laboratory we have,” Loureiro says. “It’s increasingly well-diagnosed, because we have these satellites up there. So we can use it to benchmark our theoretical understanding.”

Loureiro began concentrating on plasma physics in graduate school at Imperial College London and continued this work as a postdoc at the Princeton Plasma Physics Laboratory and later the Culham Centre for Fusion Energy, the U.K.’s national fusion lab. Then, after a few years as a principal researcher in Portugal, he joined the MIT faculty at the Plasma Science and Fusion Center in 2016 and earned tenure in 2017. A major motivation for moving to MIT from his research position, he says, was working with students. “I like to teach,” he says. Another was the “peerless intellectual caliber of the Plasma Science and Fusion Center at MIT.”

Loureiro, who holds a joint appointment in MIT’s Department of Physics, is an expert on a fundamental plasma process called magnetic reconnection. One example of this process occurs in the sun’s corona, a glowing irregular ring that surrounds the disk of the sun and becomes visible from Earth during solar eclipses. The corona is populated by vast loops of magnetic fields, which buoyantly rise from the solar interior and protrude through the solar surface. Sometimes these magnetic fields become unstable and explosively reconfigure, unleashing a burst of energy as a solar flare. “That’s magnetic reconnection in action,” he says.

Over the last couple of years at MIT, Loureiro published a series of papers with physicist Stanislav Boldyrev at the University of Wisconsin, in which they proposed a new analytical model to reconcile critical disparities between models of plasma turbulence and models of magnetic reconnection. It’s too early to say if the new model is correct, he says, but “our work prompted a reanalysis of solar wind data and also new numerical simulations. The results from these look very encouraging.”

Their new model, if proven, shows that magnetic reconnection must play a crucial role in the dynamics of plasma turbulence over a significant range of spatial scales – an insight that Loureiro and Boldyrev claim would have profound implications.

Loureiro says that a deep, detailed understanding of turbulence and reconnection in plasmas is essential for solving a variety of thorny problems in physics, including the way the sun’s corona gets heated, the properties of accretion disks around black holes, nuclear fusion, and more. And so he plugs away, to continue trying to unravel the complexities of plasma behavior. “These problems present beautiful intellectual challenges,” he muses. “That, in itself, makes the challenge worthwhile. But let’s also keep in mind that the practical implications of understanding plasma behavior are enormous.”

Computer model offers more control over protein design

Mon, 10/15/2018 - 3:03pm

Designing synthetic proteins that can act as drugs for cancer or other diseases can be a tedious process: It generally involves creating a library of millions of proteins, then screening the library to find proteins that bind the correct target.

MIT biologists have now come up with a more refined approach in which they use computer modeling to predict how different protein sequences will interact with the target. This strategy generates a larger number of candidates and also offers greater control over a variety of protein traits, says Amy Keating, a professor of biology and biological engineering and the leader of the research team.

“Our method gives you a much bigger playing field where you can select solutions that are very different from one another and are going to have different strengths and liabilities,” she says. “Our hope is that we can provide a broader range of possible solutions to increase the throughput of those initial hits into useful, functional molecules.”

In a paper appearing in the Proceedings of the National Academy of Sciences the week of Oct. 15, Keating and her colleagues used this approach to generate several peptides that can target different members of a protein family called Bcl-2, which help to drive cancer growth.

Recent PhD recipients Justin Jenson and Vincent Xue are the lead authors of the paper. Other authors are postdoc Tirtha Mandal, former lab technician Lindsey Stretz, and former postdoc Lothar Reich.

Modeling interactions

Protein drugs, also called biopharmaceuticals, are a rapidly growing class of drugs that hold promise for treating a wide range of diseases. The usual method for identifying such drugs is to screen millions of proteins, either randomly chosen or selected by creating variants of protein sequences already shown to be promising candidates. This involves engineering viruses or yeast to produce each of the proteins, then exposing them to the target to see which ones bind the best.

“That is the standard approach: Either completely randomly, or with some prior knowledge, design a library of proteins, and then go fishing in the library to pull out the most promising members,” Keating says.

While that method works well, it usually produces proteins that are optimized for only a single trait: how well they bind to the target. It does not allow for any control over other features that could be useful, such as traits that contribute to a protein’s ability to get into cells or its tendency to provoke an immune response.

“There’s no obvious way to do that kind of thing — specify a positively charged peptide, for example — using the brute force library screening,” Keating says.

Another desirable feature is the ability to identify proteins that bind tightly to their target but not to similar targets, which helps to ensure that drugs do not have unintended side effects. The standard approach does allow researchers to do this, but the experiments become more cumbersome, Keating says.

The new strategy involves first creating a computer model that can relate peptide sequences to their binding affinity for the target protein. To create this model, the researchers first chose about 10,000 peptides, each 23 amino acids in length and helical in structure, and tested their binding to three different members of the Bcl-2 family. They intentionally chose some sequences they already knew would bind well, plus others they knew would not, so the model could incorporate data about a range of binding abilities.

From this set of data, the model can produce a “landscape” of how each peptide sequence interacts with each target. The researchers can then use the model to predict how other sequences will interact with the targets, and generate peptides that meet the desired criteria.
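
To make the landscape idea concrete, here is a minimal sketch of a sequence-to-affinity model: encode each peptide numerically, fit one regression per Bcl-2 family member, and score new candidates for selectivity. Everything here (one-hot encoding, ridge regression, 5-residue peptides, synthetic data, and the target names) is an illustrative stand-in for the much richer model and 23-residue helical peptides the team actually used.

```python
import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
AA_INDEX = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

def one_hot(peptide):
    """Encode a peptide as a flat one-hot vector (positions x 20 residues)."""
    x = np.zeros((len(peptide), len(AMINO_ACIDS)))
    for pos, aa in enumerate(peptide):
        x[pos, AA_INDEX[aa]] = 1.0
    return x.ravel()

def fit_ridge(X, y, lam=1.0):
    """Ridge regression: solve (X^T X + lam*I) w = X^T y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

rng = np.random.default_rng(0)
targets = ["Bcl-xL", "Mcl-1", "Bfl-1"]    # assumed family members

# Synthetic training peptides and affinities standing in for measured data.
train = ["".join(rng.choice(list(AMINO_ACIDS), size=5)) for _ in range(200)]
X = np.array([one_hot(p) for p in train])
models = {}
for t in targets:
    w_true = rng.normal(size=X.shape[1])   # hidden "true" landscape
    models[t] = fit_ridge(X, X @ w_true + rng.normal(scale=0.1, size=len(X)))

def selectivity(peptide, want="Bcl-xL"):
    """Predicted affinity for the desired target minus the best off-target."""
    x = one_hot(peptide)
    return models[want] @ x - max(models[t] @ x for t in targets if t != want)

# Explore the landscape: rank fresh candidates by predicted selectivity.
pool = ["".join(rng.choice(list(AMINO_ACIDS), size=5)) for _ in range(1000)]
print(max(pool, key=selectivity))
```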

Using this model, the researchers produced 36 peptides that were predicted to tightly bind one family member but not the other two. All of the candidates performed extremely well when the researchers tested them experimentally, so they tried a more difficult problem: identifying proteins that bind to two of the members but not the third. Many of these proteins were also successful.

“This approach represents a shift from posing a very specific problem and then designing an experiment to solve it, to investing some work up front to generate this landscape of how sequence is related to function, capturing the landscape in a model, and then being able to explore it at will for multiple properties,” Keating says.

Sagar Khare, an associate professor of chemistry and chemical biology at Rutgers University, says the new approach is impressive in its ability to discriminate between closely related protein targets.

“Selectivity of drugs is critical for minimizing off-target effects, and often selectivity is very difficult to encode because there are so many similar-looking molecular competitors that will also bind the drug apart from the intended target. This work shows how to encode this selectivity in the design itself,” says Khare, who was not involved in the research. “Applications in the development of therapeutic peptides will almost certainly ensue.” 

Selective drugs

Members of the Bcl-2 protein family play an important role in regulating programmed cell death. Dysregulation of these proteins can inhibit cell death, helping tumors to grow unchecked, so many drug companies have been working on developing drugs that target this protein family. For such drugs to be effective, it may be important for them to target just one of the proteins, because disrupting all of them could cause harmful side effects in healthy cells.

“In many cases, cancer cells seem to be using just one or two members of the family to promote cell survival,” Keating says. “In general, it is acknowledged that having a panel of selective agents would be much better than a crude tool that just knocked them all out.”

The researchers have filed for patents on the peptides they identified in this study, and they hope that they will be further tested as possible drugs. Keating’s lab is now working on applying this new modeling approach to other protein targets. This kind of modeling could be useful for not only developing potential drugs, but also generating proteins for use in agricultural or energy applications, she says.

The research was funded by the National Institute of General Medical Sciences, National Science Foundation Graduate Fellowships, and the National Institutes of Health.
