MIT Latest News

MIT News is dedicated to communicating to the media and the public the news and achievements of the students, faculty, staff and the greater MIT community.

Translating research into impact

Wed, 10/17/2018 - 12:40pm

The MIT Tata Center for Technology and Design has funded upwards of 100 projects since its inception, and it now finds itself at a crucial juncture: identifying market opportunities for advanced-stage projects that need further support to become profitable social enterprises.

The Tata Center was established at MIT six years ago through a generous donation from Tata Trusts, one of India’s oldest philanthropic organizations. With several advanced-stage projects now in the pipeline, the center’s leadership recognized a need to answer a fundamental question: How can the Tata Center provide further support to research projects that have reached maturity, and what might that support look like?

The center’s recently concluded fourth annual symposium and workshop, a two-day event titled “Translating Research into Impact” and hosted at the Samberg Conference Center, aimed to do just that.

“This is a preoccupation for us. We’re no longer looking for things to do, we’ve found things to do. And we’ve brought technologies to a point at which they’re ready to go out into the world in the form of helpful products and services,” Tata Center Director Rob Stoner said as he welcomed students, industry partners, faculty, non-governmental organization representatives, and government officials from both India and the U.S. to the conference. “So, our focus has become translation — handing off technologies that may have reached the prototype or demonstration stage at MIT to entrepreneurial firms, government agencies, NGOs — anyone who has the vision and commitment to bring them to scale in India. It takes a focused effort to do that successfully.”

Stoner was joined at the conference by Manoj Kumar, head of entrepreneurship and innovations at Tata Trusts, and Maurizio Vecchione, executive vice president of Global Good and Research, a collaboration between Intellectual Ventures and the Gates Foundation.

In his opening keynote address, “The Power of Developing World Technology: Reverse Innovation,” Vecchione stressed the importance of investing in technologies for the developing world from a market-driven perspective. Focusing on the health care sector, he emphasized the need to dramatically increase research and development budgets targeted toward finding solutions for diseases like HIV, malaria, and tuberculosis in the developing world. The world’s population, with growth led primarily by developing countries like China, India, Nigeria, and Mexico, is projected to reach 9 billion by 2040.

The keynote was followed by a panel on scaling social enterprises with Jessica Alderman, the director of communications for Envirofit International; Alex Eaton, CEO of Sistema Biobolsa and Charity; and Manoj Sinha, CEO of Husk Power Systems. One of the core issues that emerged during the panel was the perceived dichotomy of impact versus profit.

“The idea of profit is important. And profit is absolutely tied to impact,” Alderman said. “You will have a short-lived company if you don’t have a solid way of getting to profit.”

Symposium attendees were also introduced to new Tata Center startups and multiple advanced-stage projects working on technologies including:

  • urine-based tuberculosis diagnostics;
  • affordable silicon-based nanofiltration;
  • accessible intraperitoneal chemotherapy devices;
  • intelligence deployment to improve agri-supply chains; and
  • photovoltaic-powered village-scale desalination systems.

The first day drew to a close with a fireside chat with Ernest Moniz, the Cecil and Ida Green Professor of Physics and Engineering Systems Emeritus and former U.S. Secretary of Energy, followed by a town hall on funding social innovations with Ann Dewitt, COO of The Engine; Barry Johnson of the National Science Foundation; and Harkesh Kumar Mittal of India’s Department of Science and Technology.

On the second day of the conference, Ann Mei Chang, the author of “Lean Impact” and former chief innovation officer at USAID, delivered an inspiring keynote address on the importance of thinking big, starting small, and pursuing impact relentlessly.

The second day was dedicated to parallel sectoral workshops on the Tata Center’s six focus areas: housing, health, agriculture, energy, environment, and water. Workshop participants included faculty from MIT and the Indian Institute of Technology in Mumbai, Tata Fellows, active Tata Center collaborators, industry representatives, and representatives of some of India’s most influential NGOs.

“So many projects end up not leaving the institution because of gaps in our support ecosystem,” Stoner said, drawing the event to a close. “We’re determined at the Tata Center not to let that happen with our projects by filling those gaps.”  

The MIT Tata Center’s efforts to build connections in the developing world are linked to MIT’s broader campaign to engage with global challenges, and to translate innovative research into entrepreneurial impact. That work continues year-round. The next Tata Center Symposium will be held at MIT on Sept. 12 and 13, 2019.

Four from MIT named American Physical Society Fellows for 2018

Wed, 10/17/2018 - 12:30pm

Four members of the MIT community have been elected as fellows of the American Physical Society for 2018. The distinct honor is bestowed on less than 0.5 percent of the society's membership each year.

APS Fellowship recognizes members who have conducted exceptional physics research, identified innovative applications of physics to science and technology, or furthered physics education. Nominated by their peers, the four were selected for their outstanding contributions to the field.

Lisa Barsotti is a principal research scientist at the MIT Kavli Institute for Astrophysics and Space Research and a member of the Laser Interferometer Gravitational-Wave Observatory (LIGO) team. Barsotti was nominated by the Division of Gravitational Physics for her “extraordinary leadership in commissioning the advanced LIGO detectors, improving their sensitivity through implementation of squeezed light, and enhancing the operation of the gravitational wave detector network through joint run planning between LIGO and Virgo.”

Martin Bazant is the E. G. Roos (1944) Professor of Chemical Engineering and a professor of mathematics. Nominated by the Division of Fluid Dynamics, Bazant was cited for “seminal contributions to electrokinetics and electrochemical physics, and their links to fluid dynamics, notably theories of diffuse-charge dynamics, induced-charge electro-osmosis, and electrochemical phase separation.”

Pablo Jarillo-Herrero is the Cecil and Ida Green Professor of Physics. Jarillo-Herrero was nominated by the Division of Condensed Matter Physics and selected based on his “seminal contributions to quantum electronic transport and optoelectronics in van der Waals materials and heterostructures.”

Richard Lanza is a senior research scientist in the Department of Nuclear Science and Engineering. Nominated by the Forum on Physics and Society, Lanza was cited for his “innovative application of physics and the development of new technologies to allow detection of explosives and weapon-usable nuclear materials, which has greatly benefited national and international security.”

Fighting cybercrime requires a new kind of leadership

Wed, 10/17/2018 - 12:15pm

On March 22, the city of Atlanta was hit by cyberattackers who locked city-wide systems and demanded a bitcoin ransom. Many city systems still have not recovered, and the cost to taxpayers may have reached as high as $17 million.

Also in March, the U.S. Department of Justice indicted nine Iranian hackers over an alleged spree of attacks on more than 300 universities in the United States and abroad. The hackers stole 31 terabytes of data, estimated to be worth $3 billion in intellectual property.

And recently engineers at Facebook detected the biggest security breach in Facebook's history. It took the company 11 days to stop it.

The FBI reports that more than 4,000 ransomware attacks occur daily. Large private-sector companies routinely grapple with cybersecurity and fending off cybercrime, and corporate security isn’t getting better fast enough. Cyber risk has emerged as a significant threat to the financial system: A recent IMF study suggests that average annual losses to financial institutions from cyberattacks could reach a few hundred billion dollars, potentially threatening financial stability. Hacker attacks on critical infrastructure are already alarming, and the security of our cyber-physical infrastructure — the computer-controlled facilities that produce and deliver our energy, water, and communications, for example — is dangerously exposed.

This imminent danger is the subject of study by Stuart Madnick, founding director of the Cybersecurity at MIT Sloan Initiative. In a recent article for The Wall Street Journal, Madnick warned of the weakest link in the defense against cyberattacks: people.

“Too many companies are making it easy for the attackers to succeed,” Madnick writes. “An analogy that I often use is this: You can get a stronger lock for your door, but if you are still leaving the key under your mat, are you really any more secure?”

In today’s landscape of escalating cybercrime, resiliency calls for a new kind of leadership and a cybersafe culture, requiring the active engagement of both technical and non-technical management. That holistic approach is all the more urgent given the shortage of cybersecurity personnel; in the U.S. alone, 1 million to 2 million cybersecurity analyst roles will go unfilled this year. It is the focus of a new MIT Sloan Executive Education program taught by Stuart Madnick and his colleagues Keri Pearlson and Michael Siegel: Cybersecurity Leadership for Non-Technical Executives.

Cybersecurity issues are not purely a technology problem — they are multi-headed hydras that need to be addressed with a multi-disciplinary approach. This timely new program provides general managers with frameworks and best practices for managing cybersecurity-related risk. It also addresses the element common among many of the attacks that strike organizations every day — in particular, attacks that start as phishing or “spearphishing” emails. They rely on people falling for them.

“Such gullibility … is the result of a cyberculture where people are willing to share all kinds of information and try new things all the time,” writes Madnick in his recent WSJ article. “There are lots of good things about that, but also much that is dangerous. So now is the time for companies and institutions to change that culture. It won’t be easy, and it will take some time. But it’s crucial if we want our companies and information to be safe from cybertheft. We have to start now, and we have to do it right.”

The first session of Cybersecurity Leadership for Non-Technical Executives will take place Nov. 6-7. The program will be offered again in April and July of 2019.

Probiotics and antibiotics create a killer combination

Wed, 10/17/2018 - 10:22am

In the fight against drug-resistant bacteria, MIT researchers have enlisted the help of beneficial bacteria known as probiotics.

In a new study, the researchers showed that by delivering a combination of antibiotic drugs and probiotics, they could eradicate two strains of drug-resistant bacteria that often infect wounds. To achieve this, they encapsulated the probiotic bacteria in a protective shell of alginate, a biocompatible material that prevents the probiotics from being killed by the antibiotic.

“There are so many bacteria now that are resistant to antibiotics, which is a serious problem for human health. We think one way to treat them is by encapsulating a live probiotic and letting it do its job,” says Ana Jaklenec, a research scientist at MIT’s Koch Institute for Integrative Cancer Research and one of the senior authors of the study.

If shown to be successful in future tests in animals and humans, the probiotic/antibiotic combination could be incorporated into dressings for wounds, where it could help heal infected chronic wounds, the researchers say.

Robert Langer, the David H. Koch Institute Professor and a member of the Koch Institute, is also a senior author of the paper, which appears in the journal Advanced Materials on Oct. 17. Zhihao Li, a former MIT visiting scientist, is the study’s lead author.

Bacteria wars

The human body contains trillions of bacterial cells, many of which are beneficial. In some cases, these bacteria help fend off infection by secreting antimicrobial peptides and other compounds that kill pathogenic strains of bacteria. Others outcompete harmful strains by taking up nutrients and other critical resources.

Scientists have previously tested the idea of applying probiotics to chronic wounds, and they’ve had some success in studies of patients with burns, Li says. However, the probiotic strains usually can’t combat all of the bacteria that would be found in an infected wound. Combining these strains with traditional antibiotics would help to kill more of the pathogenic bacteria, but the antibiotic would likely also kill off the probiotic bacteria.

The MIT team devised a way to get around this problem by encapsulating the probiotic bacteria so that they would not be affected by the antibiotic. They chose alginate in part because it is already used in dressings for chronic wounds, where it helps to absorb secretions and keep the wound dry. The researchers also found that alginate is a component of the biofilms that clusters of bacteria form to protect themselves from antibiotics.

“We looked into the molecular components of biofilms and we found that for Pseudomonas infection, alginate is very important for its resistance against antibiotics,” Li says. “However, so far no one has used this ability to protect good bacteria from antibiotics.”

For this study, the researchers chose to encapsulate a type of commercially available probiotic known as Bio-K+, which consists of three strains of Lactobacillus bacteria. These strains are known to kill methicillin-resistant Staphylococcus aureus (MRSA). The exact mechanism by which they do this is not known, but one possibility is that the pathogens are susceptible to lactic acid produced by the probiotics. Another possibility is that the probiotics secrete antimicrobial peptides or other proteins that kill the pathogens or disrupt their ability to form biofilms.

The researchers delivered the encapsulated probiotics along with an antibiotic called tobramycin, which they chose among other tested antibiotics because it effectively kills Pseudomonas aeruginosa, another strain commonly found in wound infections. When MRSA and Pseudomonas aeruginosa growing in a lab dish were exposed to the combination of encapsulated Bio-K+ and tobramycin, all of the pathogenic bacteria were wiped out.

“It was quite a drastic effect,” Jaklenec says. “It completely eradicated the bacteria.”

When they tried the same experiment with nonencapsulated probiotics, the probiotics were killed by the antibiotics, allowing the MRSA bacteria to survive.

“When we just used one component, either antibiotics or probiotics, they couldn’t eradicate all the pathogens. That’s something which can be very important in clinical settings where you have wounds with different bacteria, and antibiotics are not enough to kill all the bacteria,” Li says.

Better wound healing

The researchers envision that this approach could be used to develop new types of bandages or other wound dressings embedded with antibiotics and alginate-encapsulated probiotics. Before that can happen, they plan to further test the approach in animals and possibly in humans.

“The good thing about alginate is it’s FDA-approved, and the probiotic we use is approved as well,” Li says. “I think probiotics can be something that may revolutionize wound treatment in the future. With our work, we have expanded the application possibilities of probiotics.”

In a study published in 2016, the researchers demonstrated that coating probiotics with layers of alginate and another polysaccharide called chitosan could protect them from being broken down in the gastrointestinal tract. This could help researchers develop ways to treat disease or improve digestion with orally delivered probiotics. Another potential application is using these probiotics to replenish the gut microbiome after treatment with antibiotics, which can wipe out beneficial bacteria at the same time that they clear up an infection.

Li’s work on this project was funded by the Swiss Janggen-Poehn Foundation and by Beatrice Beck-Schimmer and Hans-Ruedi Gonzenbach.

Angelika Amon wins 2019 Breakthrough Prize in Life Sciences

Wed, 10/17/2018 - 10:00am

Angelika Amon, an MIT professor of biology, is one of five scientists who will receive a 2019 Breakthrough Prize in Life Sciences, given for transformative advances toward understanding living systems and extending human life.

Amon, the Kathleen and Curtis Marble Professor in Cancer Research and a member of MIT’s Koch Institute for Integrative Cancer Research, was honored for her work in determining the consequences of aneuploidy, an abnormal chromosome number that results from mis-segregation of chromosomes during cell division.

The award, announced this morning, comes with a $3 million prize.

“Angelika Amon is an outstanding choice to receive the Breakthrough Prize,” says Tyler Jacks, director of the Koch Institute and the David H. Koch Professor of Biology. “Her work on understanding how cells control the decisions to divide and the effects of imbalances in chromosome number has helped shape how we think about normal development and disease. Angelika is a fearless investigator and a true scientist’s scientist. All of us in the Koch Institute and across MIT are thrilled by this news.”

Two MIT alumni, Charles Kane PhD ’89 and Eugene Mele PhD ’78, both professors at the University of Pennsylvania, will share a Breakthrough Prize in Fundamental Physics. Kane and Mele are being recognized for their new ideas about topology and symmetry in physics, leading to the prediction of a new class of materials that conduct electricity only on their surface.

New Horizons winners

Also announced today, three MIT physics researchers will receive the $100,000 New Horizons in Physics Prize, awarded to promising junior researchers who have already produced important work.

Lisa Barsotti, a principal research scientist at MIT’s Kavli Institute, and Matthew Evans, an MIT associate professor of physics, will share the prize with Rana Adhikari of Caltech for their work on ground-based detectors of gravitational waves. Daniel Harlow, an MIT assistant professor of physics, will share the prize with Daniel Jafferis of Harvard University and Aron Wall of Stanford University for their work generating fundamental insights about quantum information, quantum field theory, and gravity.

Additionally, Chenyang Xu, an MIT professor of mathematics, will receive a 2019 New Horizons in Mathematics Prize for his work in the minimal model program and applications to the moduli of algebraic varieties.

“On behalf of the School of Science, I congratulate Angelika Amon for this extraordinary honor, in recognition of her brilliant work that expands our understanding of cellular mechanisms that may lead to cancer,” says Michael Sipser, dean of the MIT School of Science and the Donner Professor of Mathematics. “We celebrate all recipients of these prestigious awards, including MIT’s four researchers whose impressive early-career achievements in physics and mathematics are being recognized today. Our scientists pursue fundamental research that advances human knowledge, which in turn leads to a better world.”

Chromosome imbalance

Most living cells have a defined number of chromosomes. Human cells, for example, have 23 pairs of chromosomes. However, as cells divide, they can make errors that lead to a gain or loss of chromosomes.

Amon has spent much of her career studying how this condition affects cells. When aneuploidy occurs in embryonic cells, it is almost always fatal to the organism. For human embryos, extra copies of any chromosome are lethal, with the exceptions of chromosome 21, which produces Down syndrome; chromosomes 13 and 18, which lead to developmental disorders known as Patau and Edwards syndromes; and the X and Y sex chromosomes, extra copies of which may sometimes cause various disorders but are not usually lethal.

In recent years, Amon’s lab has been exploring an apparent paradox of aneuploidy: When normal adult cells become aneuploid, it impairs their ability to survive and proliferate; however, cancer cells, which are nearly all aneuploid, can grow uncontrollably. Amon has shown that aneuploidy disrupts cells’ usual error-repair systems, allowing genetic mutations to quickly accumulate.

A better understanding of the consequences of aneuploidy could shed light on how cancer cells evolve and help to identify new therapeutic targets for cancer. Last year, Amon discovered a mechanism that the immune system uses to eliminate aneuploid cells from the body, raising the possibility of harnessing this system, which relies on natural killer cells, to destroy cancer cells.

Amon, who was informed of the prize several weeks ago, was sworn to secrecy until today’s announcement.

“When I received the phone call, I was driving in the car with my daughter, and it was really hard to not be too excited and thereby spill the beans,” she says. “Of course I am thrilled that our work is recognized in this manner.”

Scientists Frank Bennett of Ionis Pharmaceuticals, Adrian Krainer of Cold Spring Harbor Laboratory, Xiaowei Zhuang of Harvard University, and Zhijian Chen of the University of Texas Southwestern Medical Center will also receive Breakthrough Prizes in Life Sciences.

The 2019 Breakthrough Prize and New Horizons Prize recipients will be recognized at the seventh annual Breakthrough Prize ceremony, hosted by actor, producer, and philanthropist Pierce Brosnan, on Sunday, Nov. 4, at NASA Ames Research Center in Mountain View, California, and broadcast live on National Geographic.

A step toward personalized, automated smart homes

Wed, 10/17/2018 - 12:00am

Developing automated systems that track occupants and self-adapt to their preferences is a major next step for the future of smart homes. When you walk into a room, for instance, a system could set the temperature to your preference. Or when you sit on the couch, a system could instantly flick the television to your favorite channel.

But enabling a home system to recognize occupants as they move around the house is a more complex problem. Recently, systems have been built that localize humans by measuring the reflections of wireless signals off their bodies. But these systems can’t identify the individuals. Other systems can identify people, but only if they’re always carrying their mobile devices. Both systems also rely on tracking signals that could be weak or get blocked by various structures.

MIT researchers have built a system that takes a step toward fully automated smart homes by identifying occupants, even when they’re not carrying mobile devices. The system, called Duet, uses reflected wireless signals to localize individuals. But it also incorporates algorithms that ping nearby mobile devices to predict the individuals’ identities, based on who last used the device and their predicted movement trajectory. It also uses logic to figure out who’s who, even in signal-denied areas.

“Smart homes are still based on explicit input from apps or telling Alexa to do something. Ideally, we want homes to be more reactive to what we do, to adapt to us,” says Deepak Vasisht, a PhD student in MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and lead author on a paper describing the system that was presented at last week’s Ubicomp conference. “If you enable location awareness and identification awareness for smart homes, you could do this automatically. Your home knows it’s you walking, and where you’re walking, and it can update itself.”

Experiments done in a two-bedroom apartment with four people and an office with nine people, over two weeks, showed the system can identify individuals with 96 percent and 94 percent accuracy, respectively, including when people weren’t carrying their smartphones or were in blocked areas.

But the system isn’t just a novelty. Duet could potentially be used to recognize intruders or ensure visitors don’t enter private areas of your home. Moreover, Vasisht says, the system could capture behavioral-analytics insights for health care applications. Someone suffering from depression, for instance, may move around more or less, depending on how they’re feeling on any given day. Such information, collected over time, could be valuable for monitoring and treatment.

“In behavioral studies, you care about how people are moving over time and how people are behaving,” Vasisht says. “All those questions can be answered by getting information on people’s locations and how they’re moving.”

The researchers envision that their system would be used with explicit consent from anyone who would be identified and tracked with Duet. If needed, they could also develop an app for users to grant or revoke Duet’s access to their location information at any time, Vasisht adds.

Co-authors on the paper are: Dina Katabi, the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science; former CSAIL researcher Anubhav Jain ’16; and CSAIL PhD students Chen-Yu Hsu and Zachary Kabelac.

Tracking and identification

Duet is a wireless sensor installed on a wall that’s about a foot and a half square. It incorporates a floor map with annotated areas, such as the bedroom, kitchen, bed, and living room couch. It also collects identification tags from the occupants’ phones.

The system builds on a device-based localization system developed by Vasisht, Katabi, and other researchers that tracks individuals within tens of centimeters, based on wireless signals reflected from their devices. It does so by using a central node to calculate the time it takes the signals to hit a person’s device and travel back. In experiments, the system was able to pinpoint where people were in a two-bedroom apartment and in a café.

The system, however, relied on people carrying mobile devices. “But in building [Duet] we realized, at home you don’t always carry your phone,” Vasisht says. “Most people leave devices on desks or tables, and walk around the house.”

The researchers combined their device-based localization with a device-free tracking system, called WiTrack, developed by Katabi and other CSAIL researchers, that localizes people by measuring the reflections of wireless signals off their bodies.

Duet locates a smartphone and correlates its movement with individual movement captured by the device-free localization. If both are moving in tightly correlated trajectories, the system pairs the device with the individual and, therefore, knows the identity of the individual.
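
How that pairing step might look in code is sketched below. This is only an illustration of the trajectory-matching idea, not the published Duet algorithm; the function name, the assumption of synchronized x-y position tracks, and the use of a simple Pearson correlation as the similarity measure are hypothetical simplifications.

```python
import numpy as np

def pair_device_with_person(device_track, person_tracks, threshold=0.9):
    """Pair a phone's estimated trajectory with the most closely
    correlated device-free (body-reflection) trajectory.

    device_track: (T, 2) array of the phone's x, y positions over time.
    person_tracks: dict mapping a person id to a (T, 2) position array.
    Returns the best-matching person id, or None if nothing correlates
    strongly enough.
    """
    best_id, best_score = None, -1.0
    for pid, track in person_tracks.items():
        scores = []
        for axis in range(2):  # correlate x and y separately, then average
            a, b = device_track[:, axis], track[:, axis]
            if a.std() == 0 or b.std() == 0:  # avoid dividing by zero
                continue
            scores.append(np.corrcoef(a, b)[0, 1])
        score = np.mean(scores) if scores else -1.0
        if score > best_score:
            best_id, best_score = pid, score
    return best_id if best_score >= threshold else None
```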

To ensure Duet knows someone’s identity when they’re away from their device, the researchers designed the system to capture the power profile of the signal received from the phone when it’s used. That profile changes depending on the orientation of the signal, and that change can be mapped to an individual’s trajectory to identify them. For example, when a phone is used and then put down, the system will capture the initial power profile. Then it will estimate how the power profile would look if it were still being carried along a path by a nearby moving individual. The closer the changing power profile correlates to the moving individual’s path, the more likely it is that individual owns the phone.

Logical thinking

One final issue is that structures such as bathroom tiles, television screens, mirrors, and various metal equipment can block signals.

To compensate for that, the researchers incorporated probabilistic algorithms to apply logical reasoning to localization. To do so, they designed the system to recognize entrance and exit boundaries of specific spaces in the home, such as doors to each room, the bedside, and the side of a couch. At any moment, the system will recognize the most likely identity for each individual in each boundary. It then infers who is who by process of elimination.

Suppose an apartment has two occupants: Alisha and Betsy. Duet sees Alisha and Betsy walk into the living room, by pairing their smartphone motion with their movement trajectories. Both then leave their phones on a nearby coffee table to charge — Betsy goes into the bedroom to nap; Alisha stays on the couch to watch television. Duet infers that Betsy has entered the bed boundary and didn’t exit, so must be on the bed. After a while, Alisha and Betsy move into, say, the kitchen — and the signal drops. Duet reasons that two people are in the kitchen, but it doesn’t know their identities. When Betsy returns to the living room and picks up her phone, however, the system automatically re-tags the individual as Betsy. By process of elimination, the other person still in the kitchen is Alisha.
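
The process-of-elimination reasoning in the Alisha-and-Betsy example can be sketched as a toy routine like the one below. It is illustrative only — Duet’s actual framework is probabilistic — and the data structures and names here are assumptions made for the example.

```python
def resolve_identities(room_occupants, household):
    """Assign names to anonymous occupants by elimination.

    room_occupants: dict mapping room -> list of labels, where each label
        is a known name (re-tagged via a phone) or None (unidentified).
    household: set of everyone known to be in the home.
    """
    identified = {label for people in room_occupants.values()
                  for label in people if label is not None}
    unaccounted = list(household - identified)

    resolved = {}
    for room, people in room_occupants.items():
        names = []
        for label in people:
            if label is not None:
                names.append(label)
            elif len(unaccounted) == 1:
                # Only one resident is unaccounted for, so it must be them.
                names.append(unaccounted.pop())
            else:
                names.append("unknown")
        resolved[room] = names
    return resolved

# Betsy has been re-tagged in the living room; the one remaining anonymous
# person in the kitchen must therefore be Alisha.
print(resolve_identities({"living room": ["Betsy"], "kitchen": [None]},
                         household={"Alisha", "Betsy"}))
```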

“There are blind spots in homes where systems won’t work. But, because you have logical framework, you can make these inferences,” Vasisht says.

“Duet takes a smart approach of combining the location of different devices and associating it to humans, and leverages device-free localization techniques for localizing humans,” says Ranveer Chandra, a principal researcher at Microsoft, who was not involved in the work. “Accurately determining the location of all residents in a home has the potential to significantly enhance the in-home experience of users. … The home assistant can personalize the responses based on who all are around it; the temperature can be automatically controlled based on personal preferences, thereby resulting in energy savings. Future robots in the home could be more intelligent if they knew who was where in the house. The potential is endless.”

Next, the researchers aim for long-term deployments of Duet in more spaces and to provide high-level analytic services for applications such as health monitoring and responsive smart homes.

Arctic ice sets speed limit for major ocean current

Wed, 10/17/2018 - 12:00am

The Beaufort Gyre is an enormous, 600-mile-wide pool of swirling cold, fresh water in the Arctic Ocean, just north of Alaska and Canada. In the winter, this current is covered by a thick cap of ice. Each summer, as the ice melts away, the exposed gyre gathers up sea ice and river runoff, and draws it down to create a huge reservoir of frigid fresh water, equal to the volume of all the Great Lakes combined.

Scientists at MIT have now identified a key mechanism, which they call the “ice-ocean governor,” that controls how fast the Beaufort Gyre spins and how much fresh water it stores. In a paper published today in Geophysical Research Letters, the researchers report that the Arctic’s ice cover essentially sets a speed limit on the gyre’s spin.

In the past two decades, as temperatures have risen globally, the Arctic’s summer ice has progressively shrunk in size. The team has observed that, with less ice available to control the Beaufort Gyre’s spin, the current has sped up in recent years, gathering up more sea ice and expanding in both volume and depth.

If global temperatures continue to climb, the researchers expect that the mechanism governing the gyre’s spin will diminish. With no governor to limit its speed, the researchers say the gyre will likely transition into “a new regime” and eventually spill over, like an overflowing bathtub, releasing huge volumes of cold, fresh water into the North Atlantic, which could affect the global climate and ocean circulation.

“This changing ice cover in the Arctic is changing the system which is driving the Beaufort Gyre, and changing its stability and intensity,” says Gianluca Meneghello, a research scientist in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “If all this fresh water is released, it will affect the circulation of the Atlantic.”

Meneghello is a co-author of the paper, along with John Marshall, the Cecil and Ida Green Professor of Oceanography, Jean-Michel Campin and Edward Doddridge of MIT, and Mary-Louise Timmermans of Yale University.

A “new Arctic ocean”

There have been a handful of times in the recorded past when the Beaufort Gyre has spilled over, beginning with the Great Salinity Anomaly in the late 1960s, when the gyre sent a surge of cold, fresh water southward. Fresh water has the potential to dampen the ocean’s overturning circulation, affecting surface temperatures and perhaps storminess and climate.

Similar events could transpire if the Arctic ice controlling the Beaufort Gyre’s spin continues to recede each year.

“If this ice-ocean governor goes away, then we will end up with basically a new Arctic ocean,” Marshall says.

“Nature has a natural governor”

The researchers began looking into the dynamics of the Beaufort Gyre several years ago. At that time, they used measurements taken by satellites between 2003 and 2014, to track the movement of the Arctic ice cover, along with the speed of the Arctic wind. They used these measurements of ice and wind speed to estimate how fast the Beaufort Gyre must be downwelling, or spinning down beneath the ice. But the number they came up with was much smaller than what they expected.

“We thought there was a coding error,” Marshall recalls. “But it turns out there was something else kicking back.” In other words, there must be some other mechanism that was limiting, or slowing down, the gyre’s spin.

The team recalculated the gyre’s speed, this time by including estimates of ocean current activity in and around the gyre, which they inferred from satellite measurements of sea surface heights. The new estimate, Meneghello says, was “much more reasonable.”

In this new paper, the researchers studied the interplay of ice, wind, and ocean currents in more depth, using a high-resolution, idealized representation of ocean circulation based on the MIT General Circulation Model, built by Marshall’s group. They used this model to simulate the seasonal activity of the Beaufort Gyre as the Arctic ice expands and recedes each year.

They found that in the spring, as the Arctic ice melts away, the gyre is exposed to the wind, which acts to whip up the ocean current, causing it to spin faster and draw down more fresh water from the Arctic’s river runoff and melting ice. In the winter, as the Arctic ice sheet expands, the ice acts as a lid, shielding the gyre from the fast-moving winds. As a result, the gyre spins against the underside of the ice and eventually slows down.

“The ice moves much slower than wind, and when the gyre reaches the velocity of the ice, at this point, there is no friction — they’re rotating together, and there’s nothing applying a stress [to speed up the gyre],” Meneghello says. “This is the mechanism that governs the gyre’s speed.”
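
One way to picture this balance — a schematic illustration, not the formulation used in the paper — is a standard quadratic ice-ocean drag law, in which the stress the ice exerts on the water scales with the velocity difference between the two: τ = ρ C_d |u_ice − u_ocean| (u_ice − u_ocean), where ρ is the water density and C_d a drag coefficient. Wind spins the exposed gyre up, but once the ice returns and the ocean velocity approaches the ice velocity, the stress goes to zero and the spin-up stops — the governor kicks in.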

“In mechanical systems, the governor, or limiter, kicks in when things are going too fast,” Marshall adds. “We found nature has a natural governor in the Arctic.”

The evolution of sea ice over the Beaufort Gyre: In springtime, as ice thaws and melts into the sea, the gyre is exposed to the Arctic winds. Courtesy of the researchers

“In a warming world”

Marshall and Meneghello note that, as Arctic temperatures have risen in the last two decades, and summertime ice has shrunk with each year, the speed of the Beaufort Gyre has increased. Its currents have become more variable and unpredictable, and are only slightly slowed by the return of ice in the winter.

“At some point, if this trend continues, the gyre can’t swallow all this fresh water that it’s drawing down,” Marshall says. Eventually, the levee will likely break and the gyre will burst, releasing hundreds of billions of gallons of cold, fresh water into the North Atlantic.

An increasingly unstable Beaufort Gyre could also disrupt the Arctic’s halocline — the layer of ocean water underlying the gyre’s cold fresh water that insulates it from much deeper, warmer, and saltier water. If the halocline is weakened by a more unstable gyre, this could encourage warmer waters to rise up, further melting the Arctic ice.

“This is part of what we’re seeing in a warming world,” Marshall says. “We know the global mean temperatures are going up, but the Arctic temperatures are going up even more. So the Arctic is very vulnerable to climate change. And we’re going to live through a period where the governor goes away, essentially.”

This research was supported, in part, by the National Science Foundation.

Kristin Bergmann named a 2018 Packard Fellow

Tue, 10/16/2018 - 1:15pm

Kristin Bergmann, the Victor P. Starr Career Development Assistant Professor in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS), has been awarded a 2018 Packard Fellowship in Science and Engineering. Bergmann is one of 18 early-career scientists in the nation selected this year. The prestigious fellowship, which includes a research grant of $875,000, encourages researchers to take risks and explore new frontiers in their field.

“We are all extremely proud and happy that Kristin has received this well-deserved honor,” said Robert van der Hilst, the Schlumberger Professor of Earth and Planetary Sciences, EAPS department head, and Packard Fellow himself. “Kristin is a wonderful colleague, deeply engaged with our academic community. Running a lab and a field program is a major challenge, and the Packard Fellowship will help her pursue her exciting and ambitious studies of geological processes in Earth’s deep time.”

Bergmann is a geobiologist who reconstructs Earth’s ancient climate and surface environments. She uses methods spanning field measurements, isotope geochemistry and microanalysis to study rocks deposited in ancient oceans before and during the evolution of early animals.

“It is a great honor to have our work recognized and supported by the David and Lucile Packard Foundation,” Bergmann said.

During her fellowship, Bergmann will study ancient climate dynamics and dramatic environmental changes that accompany the emergence and dominance of multicellular, complex life on Earth. “I am fortunate at MIT to be able to pursue a research agenda that includes both field observations and laboratory-based geochemical techniques,” said Bergmann. “Often a researcher feels pulled between whether to spend months in the field or in the lab, but combining and balancing these allows my students to approach a problem from two sides.” By understanding the rocks within their environmental context, Bergmann can focus her research. “Where the sample comes from and its context is as important to me as the laboratory measurements we make at MIT and elsewhere. The Packard Fellowship will support this multidimensional approach.” 

Bergmann feels grateful and inspired by the award: “Geobiology is an interdisciplinary field requiring a variety of approaches and I’m very lucky to have the chance to interact with and learn from diverse, passionate scientists here at MIT and, before that, at Carleton College, Caltech, and Harvard. I look forward to meeting and interacting with other Packard Fellows from across the country.”

The David and Lucile Packard Foundation is a private family foundation created by David Packard, cofounder of the Hewlett-Packard Company.

Joining the resolution revolution

Tue, 10/16/2018 - 12:40pm

It's a time of small marvels and big ideas. Welcome to the Nano Age.

MIT’s preparations for this new era are in full swing, including the recent launch of MIT.nano, the Institute's center for nanoscience and nanotechnology. And on the day after MIT.nano’s opening ceremonies, the Department of Biology hosted its Cryogenic Electron Microscopy (Cryo-EM) Symposium, which was co-organized by biology professor Thomas Schwartz and the director of the new facility, Edward Brignole.

“We organized the symposium to raise awareness of this new research capacity, and to celebrate the many people who worked to fund these instruments, design the space, build the suites, and set up the microscopes,” Brignole said of the Oct. 5 event. “We also wanted to bring together the various groups across MIT working on diverse technologies to improve Cryo-EM, from mathematicians, computer scientists, and electrical engineers to biologists, chemists, and biological engineers.”

The event featured pioneers leveraging Cryo-EM for various interdisciplinary applications both on campus and outside of MIT — from biology and machine learning to quantum mechanics.

The program included Ed Egelman from the University of Virginia, Mark Bathe from the MIT Department of Biological Engineering, Katya Heldwein from Tufts University's School of Medicine, and Karl Berggren from the Department of Electrical Engineering and Computer Science. Also giving talks were computational and systems biology graduate student Tristan Bepler from MIT's Computer Science and Artificial Intelligence Laboratory, Luke Chao from Harvard Medical School and Massachusetts General Hospital, postdoc Kuang Shen from the Whitehead Institute at MIT, and graduate student Jonathan Weed from the MIT Department of Mathematics. The talks were followed by a reception in Building 68 and guided tours of the Cryo-EM Facility.

Unlike other popular techniques for determining 3-D macromolecular structures, Cryo-EM permits researchers to visualize more diverse and complex molecular assemblies in multiple conformations. The facility is housed in precisely climate-controlled rooms in the basement of MIT.nano, built atop a 5-million-pound slab of concrete to minimize vibrations. Two multimillion-dollar instruments are being installed that will enable scientists to analyze cellular machinery in near-atomic detail; the microscopes are the first instruments to be installed in MIT.nano.

As Schwartz explained to an audience of more than 100 people during his opening remarks, he and his colleagues realized they needed to bring this technology to the MIT community. Like many of the center’s other tools, the microscopes would be too costly to purchase and too onerous to maintain for a single researcher or lab.

“Microscopes are very special and expensive tools, so this endeavor turned out to be much more involved than anything else I have done during my 14 years at MIT,” he said. “But this was not an effort of one or two people, it really took a whole community. We have many people to thank today.”

Establishing the Cryogenic Electron Microscopy Facility at MIT has been a long-time dream for Catherine Drennan, a professor of chemistry and biology and a Howard Hughes Medical Institute investigator. At the symposium, Drennan spoke about her work using the microscopes to capture snapshots of enzymes in action.

She remembers it was a “happy coincidence” that the plans for MIT.nano and the Cryo-EM Facility unfolded around the same time and then merged together to become one multi-disciplinary effort. Drennan, Schwartz, and others worked closely with MIT.nano Founding Director Vladimir Bulović and Vice President for Research Maria Zuber to gain institutional support and jumpstart the project. It took six years to design and construct MIT.nano, and four to implement the Cryo-EM Facility.

“We had this vision that the Cryo-EM Facility would be a shared space that people from all around MIT would use,” Drennan said.

An anonymous donor offered $5 million to fund the first microscope, the Titan Krios, while the Arnold and Mabel Beckman Foundation contributed $2.5 million to purchase the second, the Talos Arctica.

“The Beckman Foundation is really pleased to be supporting this kind of technology,” said Anne Hultgren, the foundation's executive director, who attended the symposium. “It was a win-win in terms of the timing and the need in the community. We are excited to be part of this effort, and to drive forward new innovations and experiments.”

The Beckman Foundation has made similar instrumentation grants to Johns Hopkins University School of Medicine, University of Pennsylvania’s Perelman School of Medicine, the University of Utah, and the University of Washington School of Medicine.

Drennan said that as the revolution in resolution continues to build, she hopes MIT’s new microscopes will bolster the resurging Cryo-EM community that’s slowly growing in and around Boston.

“Thanks to facilities like this, the Boston area went from being way behind the curve to right in front of it,” she said. “It's an incredibly exciting time, and I can’t wait to see how we learn and grow as a research community.”

Automated system identifies dense tissue, a risk factor for breast cancer, in mammograms

Tue, 10/16/2018 - 11:09am

Researchers from MIT and Massachusetts General Hospital have developed an automated model that assesses dense breast tissue in mammograms — which is an independent risk factor for breast cancer — as reliably as expert radiologists.

This marks the first time a deep-learning model of its kind has successfully been used in a clinic on real patients, according to the researchers. With broad implementation, the researchers hope the model can help bring greater reliability to breast density assessments across the nation.

It’s estimated that more than 40 percent of U.S. women have dense breast tissue, which alone increases the risk of breast cancer. Moreover, dense tissue can mask cancers on the mammogram, making screening more difficult. As a result, 30 U.S. states mandate that women must be notified if their mammograms indicate they have dense breasts.

But breast density assessments rely on subjective human judgment, and due to many factors, results vary — sometimes dramatically — across radiologists. The MIT and MGH researchers trained a deep-learning model on tens of thousands of high-quality digital mammograms to learn to distinguish different types of breast tissue, from fatty to extremely dense, based on expert assessments. Given a new mammogram, the model can then produce a density rating that closely aligns with expert opinion.

“Breast density is an independent risk factor that drives how we communicate with women about their cancer risk. Our motivation was to create an accurate and consistent tool that can be shared and used across health care systems,” says Adam Yala, a PhD student in MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and second author on a paper describing the model that was published today in Radiology.

The other co-authors are first author Constance Lehman, professor of radiology at Harvard Medical School and the director of breast imaging at the MGH; and senior author Regina Barzilay, the Delta Electronics Professor at CSAIL and the Department of Electrical Engineering and Computer Science at MIT and a member of the Koch Institute for Integrative Cancer Research at MIT.

Mapping density

The model is built on a convolutional neural network (CNN), an architecture widely used for computer vision tasks. The researchers trained and tested their model on a dataset of more than 58,000 randomly selected mammograms from more than 39,000 women screened between 2009 and 2011. For training, they used around 41,000 mammograms and, for testing, about 8,600 mammograms.

Each mammogram in the dataset has a standard Breast Imaging Reporting and Data System (BI-RADS) breast density rating in one of four categories: fatty, scattered (scattered density), heterogeneous (mostly dense), and dense. In both the training and testing sets, about 40 percent of the mammograms were assessed as heterogeneous or dense.

During the training process, the model is given random mammograms to analyze. It learns to map each mammogram to the expert radiologist’s density rating. Dense breasts, for instance, contain glandular and fibrous connective tissue, which appears as compact networks of thick white lines and solid white patches. Fatty tissue networks appear much thinner, with gray areas throughout. In testing, the model observes new mammograms and predicts the most likely density category.
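
For readers curious what such a classifier looks like in code, a minimal PyTorch sketch of a four-class convolutional network follows. It is purely illustrative: the layer sizes, input resolution, and training loop shown here are assumptions, not the architecture or data pipeline used in the published study.

```python
import torch
import torch.nn as nn

DENSITY_CLASSES = ["fatty", "scattered", "heterogeneous", "dense"]

class DensityCNN(nn.Module):
    """Toy convolutional classifier mapping a mammogram to a density category."""
    def __init__(self, num_classes=len(DENSITY_CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# One training step on a dummy batch of 256x256 grayscale images.
model = DensityCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

images = torch.randn(4, 1, 256, 256)   # stand-in mammograms
labels = torch.tensor([0, 1, 2, 3])    # expert density ratings
optimizer.zero_grad()
loss = loss_fn(model(images), labels)
loss.backward()
optimizer.step()
```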

Matching assessments

The model was implemented at the breast imaging division at MGH. In a traditional workflow, when a mammogram is taken, it’s sent to a workstation for a radiologist to assess. The researchers’ model is installed on a separate machine that intercepts each scan before it reaches the radiologist and assigns the mammogram a density rating. When radiologists pull up a scan at their workstations, they’ll see the model’s assigned rating, which they can then accept or reject.

“It takes less than a second per image … [and it can be] easily and cheaply scaled throughout hospitals,” Yala says.

On more than 10,000 mammograms at MGH from January to May of this year, the model achieved 94 percent agreement with the hospital’s radiologists in a binary test — determining whether breasts were either heterogeneous and dense, or fatty and scattered. Across all four BI-RADS categories, it matched radiologists’ assessments 90 percent of the time. “MGH is a top breast imaging center with high inter-radiologist agreement, and this high-quality dataset enabled us to develop a strong model,” Yala says.

In general testing using the original dataset, the model matched the original human expert interpretations 77 percent of the time across the four BI-RADS categories and, in binary tests, matched the interpretations 87 percent of the time.

To compare their model with traditional prediction models, the researchers used a metric called a kappa score, where 1 indicates that predictions agree every time and anything lower indicates fewer instances of agreement. Kappa scores for commercially available automatic density-assessment models reach a maximum of about 0.6. In the clinical application, the researchers’ model scored a kappa of 0.85 and, in testing, a kappa of 0.67, meaning it makes better predictions than traditional models.
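
For reference, the kappa score mentioned above is Cohen’s kappa, which compares the observed rate of agreement with the agreement expected by chance. A minimal sketch of the computation, with an invented toy example, follows.

```python
from collections import Counter

def cohen_kappa(ratings_a, ratings_b):
    """Cohen's kappa: 1 means perfect agreement, 0 means chance-level agreement."""
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n

    counts_a, counts_b = Counter(ratings_a), Counter(ratings_b)
    categories = set(counts_a) | set(counts_b)
    expected = sum((counts_a[c] / n) * (counts_b[c] / n) for c in categories)

    return (observed - expected) / (1 - expected)

# Toy example with the four density categories.
model_preds = ["fatty", "dense", "scattered", "dense", "heterogeneous"]
radiologist = ["fatty", "dense", "scattered", "heterogeneous", "heterogeneous"]
print(round(cohen_kappa(model_preds, radiologist), 2))  # about 0.74
```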

In an additional experiment, the researchers tested the model’s agreement with consensus from five MGH radiologists from 500 random test mammograms. The radiologists assigned breast density to the mammograms without knowledge of the original assessment, or their peers’ or the model’s assessments. In this experiment, the model achieved a kappa score of 0.78 with the radiologist consensus.

Next, the researchers aim to scale the model into other hospitals. “Building on this translational experience, we will explore how to transition machine-learning algorithms developed at MIT into clinic benefiting millions of patients,” Barzilay says. “This is a charter of the new center at MIT — the Abdul Latif Jameel Clinic for Machine Learning in Health at MIT — that was recently launched. And we are excited about new opportunities opened up by this center.”

This RNA-based technique could make gene therapy more effective

Tue, 10/16/2018 - 11:00am

Delivering functional genes into cells to replace mutated genes, an approach known as gene therapy, holds potential for treating many types of diseases. The earliest efforts to deliver genes to diseased cells focused on DNA, but many scientists are now exploring the possibility of using RNA instead, which could offer improved safety and easier delivery.

MIT biological engineers have now devised a way to regulate the expression of RNA once it gets into cells, giving them precise control over the dose of protein that a patient receives. This technology could allow doctors to more accurately tailor treatment for individual patients, and it also offers a way to quickly turn the genes off, if necessary.

“We can control very discretely how different genes are expressed,” says Jacob Becraft, an MIT graduate student and one of the lead authors of the study, which appears in the Oct. 16 issue of Nature Chemical Biology. “Historically, gene therapies have encountered issues regarding safety, but with new advances in synthetic biology, we can create entirely new paradigms of ‘smart therapeutics’ that actively engage with the patient’s own cells to increase efficacy and safety.”

Becraft and his colleagues at MIT have started a company to further develop this approach, with an initial focus on cancer treatment. Tyler Wagner, a recent Boston University PhD recipient, is also a lead author of the paper. Tasuku Kitada, a former MIT postdoc, and Ron Weiss, an MIT professor of biological engineering and member of the Koch Institute, are senior authors.

RNA circuits

Only a few gene therapies have been approved for human use so far, but scientists are working on and testing new gene therapy treatments for diseases such as sickle cell anemia, hemophilia, and congenital eye disease, among many others.

As a tool for gene therapy, DNA can be difficult to work with. When it is carried by synthetic nanoparticles, the particles must be delivered to the nucleus, which can be inefficient. Viruses are much more efficient for DNA delivery; however, they can be immunogenic, difficult and expensive to produce, and prone to integrating their DNA into the cell’s own genome, limiting their applicability in genetic therapies.

Messenger RNA, or mRNA, offers a more direct, and nonpermanent, way to alter cells’ gene expression. In all living cells, mRNA carries copies of the information contained in DNA to cell organelles called ribosomes, which assemble the proteins encoded by genes. Therefore, by delivering mRNA encoding a particular gene, scientists can induce production of the desired protein without having to get genetic material into a cell’s nucleus or integrate it into the genome.

To help make RNA-based gene therapy more effective, the MIT team set out to precisely control the production of therapeutic proteins once the RNA gets inside cells. To do that, they decided to adapt synthetic biology principles, which allow for precise programming of synthetic DNA circuits, to RNA.

The researchers’ new circuits consist of a single strand of RNA that includes genes for the desired therapeutic proteins as well as genes for RNA-binding proteins, which control the expression of the therapeutic proteins.

“Due to the dynamic nature of replication, the circuits’ performance can be tuned to allow different proteins to express at different times, all from the same strand of RNA,” Becraft says.

This allows the researchers to turn on the circuits at the right time by using “small molecule” drugs that interact with RNA-binding proteins. When a drug such as doxycycline, which is already FDA-approved, is added to the cells, it can stabilize or destabilize the interaction between RNA and RNA-binding proteins, depending on how the circuit is designed. This interaction determines whether the proteins block RNA gene expression or not.

In a previous study, the researchers also showed that they could build cell-specificity into their circuits, so that the RNA only becomes active in the target cells.

Targeting cancer

The company that the researchers started, Strand Therapeutics, is now working on adapting this approach to cancer immunotherapy — a new treatment strategy that involves stimulating a patient’s own immune system to attack tumors.

Using RNA, the researchers plan to develop circuits that can selectively stimulate immune cells to attack tumors, making it possible to target tumor cells that have metastasized to difficult-to-access parts of the body. For example, it has proven difficult to target cancerous cells such as lung lesions with mRNA because of the risk of inflaming the lung tissue. Using RNA circuits, the researchers would first deliver their therapy to targeted cancer cell types within the lung; through its genetic circuitry, the RNA would then activate T cells that could treat the cancer’s metastases elsewhere in the body.

“The hope is to elicit an immune response which is able to pick up and treat the rest of the metastases throughout the body,” Becraft says. “If you’re able to treat one site of the cancer, then your immune system will take care of the rest, because you’ve now built an immune response against it.”

Using these kinds of RNA circuits, doctors would be able to adjust dosages based on how the patient is responding, the researchers say. The circuits also provide a quick way to turn off therapeutic protein production in cases where the patient’s immune system becomes overstimulated, which can be potentially fatal.

In the future, the researchers hope to develop more complex circuits that could be both diagnostic and therapeutic — first detecting a problem, such as a tumor, and then producing the appropriate drug.

The research was funded by the Defense Advanced Research Projects Agency, the National Science Foundation, the National Institutes of Health, the Ragon Institute of MGH, MIT, and Harvard, the Special Research Fund from Ghent University, and the Research Foundation – Flanders.

Collaboration runs through J-WAFS-funded projects

Tue, 10/16/2018 - 10:50am

“In order to do the kind and scale of work that we do, international collaboration is essential. However, this can be difficult to fund,” Chris Voigt said. “J-WAFS is providing the support that we need for the cross-institutional and cross-sector collaboration that is enabling our work to move forward.”

Voigt, a professor in the MIT Department of Biological Engineering, made those comments at the first of two research workshops produced by the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) on Sept. 14 and Sept. 28 at the Samberg Center. The annual workshop brings members of the MIT community together to learn about the latest research results from J-WAFS-funded teams, to hear about newly funded projects, and to provide feedback on each other’s work.

The specific collaboration Voigt was referring to is a project that connects the work on prokaryotic gene clusters in his lab to research at the Max Planck Institute of Molecular Plant Physiology in Germany and the Center for Plant Biotechnology and Genomics at the Universidad Politécnica in Spain.

Voigt and experts in plastid engineering and plant gene expression from these partnering institutions are working to engineer cereal grains that fix their own nitrogen, eliminating the need for added fertilizer. Their goal is to transform farming at every scale — reducing the greenhouse gas emissions of industrial fertilizer production, curbing the eutrophication caused by nutrient run-off, and lowering the cost of added nitrogen fertilizer. With a growing world population and increasing demand for grain as both food and fuel, the need for innovations in agricultural technologies is urgent, yet the technical challenges are steep and often require complementary areas of expertise. Therefore, when researchers like Voigt share their skills and resources with other global experts in pursuit of a shared goal, the combined effort has the potential to produce dramatic results.

Such collaboration is a hallmark of MIT’s research culture, and J-WAFS seeks to leverage it by being particularly welcoming of cross-disciplinary project proposals and research teams. In fact, the majority of J-WAFS’ current and concluding projects are led by two or more principal investigators, and many of those teams are cross-disciplinary.

In the case of a J-WAFS Solutions-funded project led by principal investigators Timothy Swager and Alexander Klibanov from the Department of Chemistry, interdisciplinary collaboration grew as the work on the project progressed. The team is developing a handheld food safety sensor that uses specialized droplets — called Janus emulsions — to test for bacterial contamination in food. The droplets behave like a dynamic lens, changing in the presence of specific bacteria. 

In developing optical systems that can indicate the presence or absence of bacteria, including salmonella, by analyzing the light either transmitted through or emanating from these dynamic lenses, the researchers realized that they did not have the expertise to fully understand the optics they observed when the droplets were exposed to light. For that, they needed help. Swager reached out to Mathias Kolle, an assistant professor in the Department of Mechanical Engineering, whose expertise in optical materials proved to be key. 

Kolle, who has received J-WAFS seed funding for his own work on industrial algae production, and his graduate student Sara Nagelberg provided the calculations necessary to understand the mechanics of light’s interaction with the particles. These insights contributed to sensor designs that were dramatically more effective, and the team has now launched a startup — Xibus Systems — and is currently working on product development. 

“This is the beginning of a much longer story for us,” Swager commented, reflecting on his collaboration with Kolle’s lab.

Several other research teams are applying multiple disciplinary perspectives to their work. 

In one project, Evelyn Wang, the Gail E. Kendall Professor in the Department of Mechanical Engineering, has teamed up with Mircea Dincă, an associate professor in the Department of Chemistry, to engineer highly adsorbent metal-organic frameworks in a device that pulls drinking water from air.

In another, assistant professor David Des Marais in the Department of Civil and Environmental Engineering is collaborating with Caroline Uhler, the Henry L. and Grace Doherty Assistant Professor in the Department of Electrical Engineering and Computer Science, to develop tools to analyze and understand the ways that genes regulate plants’ responses to environmental stressors such as drought. Their goal is to apply this understanding to better breed and engineer stress-tolerant plants so that crop yields can improve even as climate change creates more extreme growing conditions.

Meanwhile, J-WAFS itself collaborated with a partner program in organizing the event. The second day of the workshop coincided with the Tata Center’s annual research symposium, which was also held at the Samberg Center. J-WAFS and Tata’s missions have some significant overlaps — many Tata-funded MIT projects address food, water, and agriculture challenges in the developing world. The two groups merged audiences for their afternoon sessions and presentations to take advantage of these synergies, enabling participants of each event to interact and to learn about the food and water innovations that the programs are supporting.      

By funding research in all schools at MIT, and by seeding and supporting innovative collaboration that crosses departments and schools alike, J-WAFS seeks to advance research that can provide answers to what might be one of the most pressing questions of our time: How do we ensure safe and resilient supplies of water and food on our changing planet, now and in the future? When experts come together around an urgent question like this one, each approaches it from a different angle. And when successes emerge from collaborations in J-WAFS-funded projects, they demonstrate the value of MIT’s culture of interdisciplinary collaboration.

Nuno Loureiro: Probing the world of plasmas

Mon, 10/15/2018 - 11:59pm

Growing up in the small city of Viseu in central Portugal, Nuno Loureiro knew he wanted to be a scientist, even in the early years of primary school when “everyone else wanted to be a policeman or a fireman,” he recalls. He can’t quite place the origin of that interest in science: He was 17 the first time he met a scientist, he says with an amused look.

By the time Loureiro finished high school, his interest in science had crystallized, and “I realized that physics was what I liked best,” he says. During his undergraduate studies at IST Lisbon, he began to focus on fusion, which “seemed like a very appealing field,” where major developments were likely during his lifetime, he says.

Fusion, and specifically the physics of plasmas, has remained his primary research focus ever since, through graduate school, postdoc stints, and now in his research and teaching at MIT. He explains that plasma research “lives in two different worlds.” On the one hand, it involves astrophysics, dealing with the processes that happen in and around stars; on the other, it’s part of the quest to generate electricity that’s clean and virtually inexhaustible, through fusion reactors.

Plasma is a sort of fourth phase of matter, similar to a gas but with the atoms stripped apart into a kind of soup of electrons and ions. It forms about 99 percent of the visible matter in the universe, including stars and the wispy tendrils of material spread between them. Among the trickiest challenges to understanding the behavior of plasmas is their turbulence, which can dissipate away energy from a reactor, and which proceeds in very complex and hard-to-predict ways — a major stumbling block so far to practical fusion power.

While everyone is familiar with turbulence in fluids, from breaking waves to cream stirred into coffee, plasma turbulence can be quite different, Loureiro explains, because plasmas are riddled with magnetic and electric fields that push and pull them in dynamic ways. “A very noteworthy example is the solar wind,” he says, referring to the ongoing but highly variable stream of particles ejected by the sun and sweeping past Earth, sometimes producing auroras and affecting the electronics of communications satellites. Predicting the dynamics of such flows is a major goal of plasma research.

“The solar wind is the best plasma turbulence laboratory we have,” Loureiro says. “It’s increasingly well-diagnosed, because we have these satellites up there. So we can use it to benchmark our theoretical understanding.”

Loureiro began concentrating on plasma physics in graduate school at Imperial College London and continued this work as a postdoc at the Princeton Plasma Physics Laboratory and later the Culham Centre for Fusion Energy, the U.K.’s national fusion lab. Then, after a few years as a principal researcher back in Portugal, he joined the MIT faculty at the Plasma Science and Fusion Center in 2016 and earned tenure in 2017. A major motivation for moving to MIT from his research position, he says, was working with students. “I like to teach,” he says. Another was the “peerless intellectual caliber of the Plasma Science and Fusion Center at MIT.”

Loureiro, who holds a joint appointment in MIT’s Department of Physics, is an expert on a fundamental plasma process called magnetic reconnection. One example of this process occurs in the sun’s corona, a glowing irregular ring that surrounds the disk of the sun and becomes visible from Earth during solar eclipses. The corona is populated by vast loops of magnetic fields, which buoyantly rise from the solar interior and protrude through the solar surface. Sometimes these magnetic fields become unstable and explosively reconfigure, unleashing a burst of energy as a solar flare. “That’s magnetic reconnection in action,” he says.

Over the last couple of years at MIT, Loureiro published a series of papers with physicist Stanislav Boldyrev at the University of Wisconsin, in which they proposed a new analytical model to reconcile critical disparities between models of plasma turbulence and models of magnetic reconnection. It’s too early to say if the new model is correct, he says, but “our work prompted a reanalysis of solar wind data and also new numerical simulations. The results from these look very encouraging.”

Their new model, if proven, shows that magnetic reconnection must play a crucial role in the dynamics of plasma turbulence over a significant range of spatial scales – an insight that Loureiro and Boldyrev claim would have profound implications.

Loureiro says that a deep, detailed understanding of turbulence and reconnection in plasmas is essential for solving a variety of thorny problems in physics, including the way the sun’s corona gets heated, the properties of accretion disks around black holes, nuclear fusion, and more. And so he plugs away, to continue trying to unravel the complexities of plasma behavior. “These problems present beautiful intellectual challenges,” he muses. “That, in itself, makes the challenge worthwhile. But let’s also keep in mind that the practical implications of understanding plasma behavior are enormous.”

Computer model offers more control over protein design

Mon, 10/15/2018 - 3:03pm

Designing synthetic proteins that can act as drugs for cancer or other diseases can be a tedious process: It generally involves creating a library of millions of proteins, then screening the library to find proteins that bind the correct target.

MIT biologists have now come up with a more refined approach in which they use computer modeling to predict how different protein sequences will interact with the target. This strategy generates a larger number of candidates and also offers greater control over a variety of protein traits, says Amy Keating, a professor of biology and biological engineering and the leader of the research team.

“Our method gives you a much bigger playing field where you can select solutions that are very different from one another and are going to have different strengths and liabilities,” she says. “Our hope is that we can provide a broader range of possible solutions to increase the throughput of those initial hits into useful, functional molecules.”

In a paper appearing in the Proceedings of the National Academy of Sciences the week of Oct. 15, Keating and her colleagues used this approach to generate several peptides that can target different members of the Bcl-2 protein family, which help to drive cancer growth.

Recent PhD recipients Justin Jenson and Vincent Xue are the lead authors of the paper. Other authors are postdoc Tirtha Mandal, former lab technician Lindsey Stretz, and former postdoc Lothar Reich.

Modeling interactions

Protein drugs, also called biopharmaceuticals, are a rapidly growing class of drugs that hold promise for treating a wide range of diseases. The usual method for identifying such drugs is to screen millions of proteins, either randomly chosen or selected by creating variants of protein sequences already shown to be promising candidates. This involves engineering viruses or yeast to produce each of the proteins, then exposing them to the target to see which ones bind the best.

“That is the standard approach: Either completely randomly, or with some prior knowledge, design a library of proteins, and then go fishing in the library to pull out the most promising members,” Keating says.

While that method works well, it usually produces proteins that are optimized for only a single trait: how well they bind to the target. It does not allow for any control over other features that could be useful, such as traits that contribute to a protein’s ability to get into cells or its tendency to provoke an immune response.

“There’s no obvious way to do that kind of thing — specify a positively charged peptide, for example — using the brute force library screening,” Keating says.

Another desirable feature is the ability to identify proteins that bind tightly to their target but not to similar targets, which helps to ensure that drugs do not have unintended side effects. The standard approach does allow researchers to do this, but the experiments become more cumbersome, Keating says.

The new strategy involves first creating a computer model that can relate peptide sequences to their binding affinity for the target protein. To create this model, the researchers first chose about 10,000 peptides, each 23 amino acids in length and helical in structure, and tested their binding to three different members of the Bcl-2 family. They intentionally chose some sequences they already knew would bind well, plus others they knew would not, so the model could incorporate data about a range of binding abilities.

From this set of data, the model can produce a “landscape” of how each peptide sequence interacts with each target. The researchers can then use the model to predict how other sequences will interact with the targets, and generate peptides that meet the desired criteria.
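As a rough illustration of that workflow, consider the hypothetical sketch below: a simple model (here, one-hot sequence encoding plus ridge regression on synthetic data) is trained to map peptide sequence to binding affinity for each of three targets, and the resulting landscape is then searched for sequences predicted to bind one target but not the others. The model class, the encoding, the target labels, and the data are all illustrative stand-ins, not the method or measurements from the paper.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
PEPTIDE_LEN = 23          # peptide length used in the study
N_TRAIN = 2000            # synthetic stand-in for the ~10,000 measured peptides
TARGETS = ["Bcl-xL", "Mcl-1", "Bfl-1"]   # illustrative labels for three Bcl-2 family members

def random_peptides(n):
    return ["".join(rng.choice(list(AMINO_ACIDS), PEPTIDE_LEN)) for _ in range(n)]

def one_hot(peptide):
    # Flat one-hot encoding: 23 positions x 20 amino acids.
    vec = np.zeros(PEPTIDE_LEN * len(AMINO_ACIDS))
    for i, aa in enumerate(peptide):
        vec[i * len(AMINO_ACIDS) + AMINO_ACIDS.index(aa)] = 1.0
    return vec

# Synthetic "measured" affinities: a hidden linear landscape per target plus noise.
train_peptides = random_peptides(N_TRAIN)
X_train = np.array([one_hot(p) for p in train_peptides])
hidden = {t: rng.normal(size=X_train.shape[1]) for t in TARGETS}
models = {}
for t in TARGETS:
    y = X_train @ hidden[t] + rng.normal(scale=0.5, size=N_TRAIN)
    models[t] = Ridge(alpha=1.0).fit(X_train, y)

# Explore the learned landscape: score new candidate peptides against all three
# targets and keep those predicted to bind the first tightly but not the others.
candidates = random_peptides(5000)
X_new = np.array([one_hot(p) for p in candidates])
scores = {t: models[t].predict(X_new) for t in TARGETS}
selective = [p for i, p in enumerate(candidates)
             if scores["Bcl-xL"][i] > 4.0
             and scores["Mcl-1"][i] < 0.0
             and scores["Bfl-1"][i] < 0.0]
print(f"{len(selective)} candidates predicted to be selective for the first target")
```

The point of the sketch is the workflow: once sequence-to-affinity models exist for each target, selectivity becomes a query over the learned landscape rather than a new screening experiment.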

Using this model, the researchers produced 36 peptides that were predicted to tightly bind one family member but not the other two. All of the candidates performed extremely well when the researchers tested them experimentally, so they tried a more difficult problem: identifying proteins that bind to two of the members but not the third. Many of these proteins were also successful.

“This approach represents a shift from posing a very specific problem and then designing an experiment to solve it, to investing some work up front to generate this landscape of how sequence is related to function, capturing the landscape in a model, and then being able to explore it at will for multiple properties,” Keating says.

Sagar Khare, an associate professor of chemistry and chemical biology at Rutgers University, says the new approach is impressive in its ability to discriminate between closely related protein targets.

“Selectivity of drugs is critical for minimizing off-target effects, and often selectivity is very difficult to encode because there are so many similar-looking molecular competitors that will also bind the drug apart from the intended target. This work shows how to encode this selectivity in the design itself,” says Khare, who was not involved in the research. “Applications in the development of therapeutic peptides will almost certainly ensue.” 

Selective drugs

Members of the Bcl-2 protein family play an important role in regulating programmed cell death. Dysregulation of these proteins can inhibit cell death, helping tumors to grow unchecked, so many drug companies have been working on developing drugs that target this protein family. For such drugs to be effective, it may be important for them to target just one of the proteins, because disrupting all of them could cause harmful side effects in healthy cells.

“In many cases, cancer cells seem to be using just one or two members of the family to promote cell survival,” Keating says. “In general, it is acknowledged that having a panel of selective agents would be much better than a crude tool that just knocked them all out.”

The researchers have filed for patents on the peptides they identified in this study, and they hope that they will be further tested as possible drugs. Keating’s lab is now working on applying this new modeling approach to other protein targets. This kind of modeling could be useful for not only developing potential drugs, but also generating proteins for use in agricultural or energy applications, she says.

The research was funded by the National Institute of General Medical Sciences, National Science Foundation Graduate Fellowships, and the National Institutes of Health.

Technique quickly identifies extreme event statistics

Mon, 10/15/2018 - 3:03pm

Seafaring vessels and offshore platforms endure a constant battery of waves and currents. Over decades of operation, these structures can, without warning, meet head-on with a rogue wave, freak storm, or some other extreme event, with potentially damaging consequences.

Now engineers at MIT have developed an algorithm that quickly pinpoints the types of extreme events that are likely to occur in a complex system, such as an ocean environment, where waves of varying magnitudes, lengths, and heights can create stress and pressure on a ship or offshore platform. The researchers can simulate the forces and stresses that extreme events — in the form of waves — may generate on a particular structure.

Compared with traditional methods, the team’s technique provides a much faster, more accurate risk assessment for systems that are likely to endure an extreme event at some point during their expected lifetime, by taking into account not only the statistical nature of the phenomenon but also the underlying dynamics.

“With our approach, you can assess, from the preliminary design phase, how a structure will behave not to one wave but to the overall collection or family of waves that can hit this structure,” says Themistoklis Sapsis, associate professor of mechanical and ocean engineering at MIT. “You can better design your structure so that you don’t have structural problems or stresses that surpass a certain limit.”

Sapsis says that the technique is not limited to ships and ocean platforms, but can be applied to any complex system that is vulnerable to extreme events. For instance, the method may be used to identify the type of storms that can generate severe flooding in a city, and where that flooding may occur. It could also be used to estimate the types of electrical overloads that could cause blackouts, and where those blackouts would occur throughout a city’s power grid.

Sapsis and Mustafa Mohamad, a former graduate student in Sapsis’ group who is now an assistant research scientist at the Courant Institute of Mathematical Sciences at New York University, are publishing their results this week in the Proceedings of the National Academy of Sciences.

Bypassing a shortcut

Engineers typically gauge a structure’s endurance to extreme events by using computationally intensive simulations to model a structure’s response to, for instance, a wave coming from a particular direction, with a certain height, length, and speed. These simulations are highly complex, as they model not just the wave of interest but also its interaction with the structure. By simulating the entire “wave field” as a particular wave rolls in, engineers can then estimate how a structure might be rocked and pushed by a particular wave, and what resulting forces and stresses may cause damage.

These risk assessment simulations are incredibly precise and in an ideal situation might predict how a structure would react to every single possible wave type, whether extreme or not. But such precision would require engineers to simulate millions of waves, with different parameters such as height and length scale — a process that could take months to compute. 

“That’s an insanely expensive problem,” Sapsis says. “To simulate one possible wave that can occur over 100 seconds, it takes a modern graphic processor unit, which is very fast, about 24 hours. We’re interested to understand what is the probability of an extreme event over 100 years.”
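For perspective, a quick back-of-the-envelope calculation, ours rather than the researchers', shows what those quoted figures imply: even though an actual assessment would not simulate every possible 100-second window in a century, the raw cost of exhaustive simulation is staggering.

```python
# Back-of-the-envelope cost of brute-force simulation, using the figures
# quoted above: one 100-second wave takes roughly 24 GPU-hours to simulate.
seconds_per_year = 365.25 * 24 * 3600
windows_in_century = 100 * seconds_per_year / 100   # 100-second windows in 100 years
gpu_hours = windows_in_century * 24
gpu_years = gpu_hours / (24 * 365.25)
print(f"{windows_in_century:.2e} hundred-second windows in a century")
print(f"about {gpu_years:,.0f} GPU-years of computing to simulate them all")
```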

As a more practical shortcut, engineers use these simulators to run just a few scenarios, choosing to simulate several random wave types that they think might cause maximum damage. If a structural design survives these extreme, randomly generated waves, engineers assume the design will stand up against similar extreme events in the ocean.

But in choosing random waves to simulate, Sapsis says, engineers may miss other less obvious scenarios, such as combinations of medium-sized waves, or a wave with a certain slope that could develop into a damaging extreme event.

“What we have managed to do is to abandon this random sampling logic,” Sapsis says.

A fast learner

Instead of running millions of waves or even several randomly chosen waves through a computationally intensive simulation, Sapsis and Mohamad developed a machine-learning algorithm to first quickly identify the “most important” or “most informative” wave to run through such a simulation.

The algorithm is based on the idea that each wave has a certain probability of contributing to an extreme event on the structure. That probability itself carries some uncertainty, or error, since it reflects the effect of a complex dynamical system. Moreover, some waves are more likely than others to contribute to an extreme event.

The researchers designed the algorithm so that they can quickly feed in various types of waves and their physical properties, along with their known effects on a theoretical offshore platform. From the known waves that the researchers plug into the algorithm, it can essentially “learn” and make a rough estimate of how the platform will behave in response to any unknown wave. Through this machine-learning step, the algorithm learns how the offshore structure behaves over all possible waves. It then identifies a particular wave that maximally reduces the error in the estimated probability of extreme events. This wave has a high probability of occurring and leads to an extreme event. In this way the algorithm goes beyond a purely statistical approach and takes into account the dynamical behavior of the system under consideration.
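The sketch below shows, in a generic way, what this kind of simulation-guided search can look like: a cheap surrogate model is fit to the handful of expensive simulations run so far, and the next wave to simulate is the one that is simultaneously likely to occur, poorly predicted by the surrogate, and close to the extreme-load threshold. The toy "simulator", the Gaussian-process surrogate, and the acquisition rule are illustrative assumptions of ours, not the specific criterion derived in the PNAS paper.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_simulation(wave_height):
    # Stand-in for a day-long hydrodynamic simulation: structural load vs. wave height.
    return wave_height ** 2 + 2.0 * np.sin(3.0 * wave_height)

def wave_probability(wave_height):
    # Illustrative occurrence statistics: large waves are exponentially rarer.
    return np.exp(-wave_height)

EXTREME_LOAD = 30.0
heights = np.linspace(0.1, 8.0, 400).reshape(-1, 1)

# Seed the surrogate with a few "simulated" waves, as in the paper's initial cases.
X = np.array([[1.0], [3.0], [5.0], [7.0]])
y = np.array([expensive_simulation(h[0]) for h in X])

for step in range(12):
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-6, normalize_y=True)
    gp.fit(X, y)
    mean, std = gp.predict(heights, return_std=True)
    # Acquisition: prefer waves that are likely to occur, uncertain under the
    # surrogate, and predicted to lie near the extreme-load threshold.
    near_threshold = np.exp(-0.5 * ((mean - EXTREME_LOAD) / (std + 1e-9)) ** 2)
    score = wave_probability(heights[:, 0]) * std * near_threshold
    next_wave = heights[np.argmax(score)]
    X = np.vstack([X, [next_wave]])
    y = np.append(y, expensive_simulation(next_wave[0]))

# Estimate the extreme-event probability from the final surrogate.
gp.fit(X, y)
mean, _ = gp.predict(heights, return_std=True)
dh = heights[1, 0] - heights[0, 0]
p_extreme = np.sum(wave_probability(heights[:, 0]) * (mean > EXTREME_LOAD)) * dh
print(f"simulations run: {len(y)}, estimated P(load > {EXTREME_LOAD}): {p_extreme:.4f}")
```

In the actual method, as described above, the selection criterion is the reduction in error of the extreme-event probability itself, and each selected wave is then run through the full, computationally intensive simulator.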

The researchers tested the algorithm on a theoretical scenario involving a simplified offshore platform subjected to incoming waves. The team started out by plugging four typical waves into the machine-learning algorithm, including the waves’ known effects on an offshore platform. From this, the algorithm quickly identified the dimensions of a new wave that has a high probability of occurring and maximally reduces the error in the probability of an extreme event.

The team then plugged this wave into a more computationally intensive, open-source simulation to model the response of a simplified offshore platform. They fed the results of this first simulation back into their algorithm to identify the next best wave to simulate, and repeated the entire process. In total, the group ran 16 simulations over several days to model a platform’s behavior under various extreme events. In comparison, the researchers carried out simulations using a more conventional method, in which they blindly simulated as many waves as possible, and were able to generate similar statistical results only after running thousands of scenarios over several months.

MIT researchers simulated the behavior of a simplified offshore platform in response to the waves that are most likely to contribute to an extreme event. Courtesy of the researchers

Sapsis says the results demonstrate that the team’s method quickly homes in on the waves that are most likely to be involved in an extreme event, and provides designers with more informed, realistic scenarios to simulate in order to test the endurance of not just offshore platforms, but also power grids and flood-prone regions.

“This method paves the way to perform risk assessment, design, and optimization of complex systems based on extreme events statistics, which is something that has not been considered or done before without severe simplifications,” Sapsis says. “We’re now in a position where we can say, using ideas like this, you can understand and optimize your system, according to risk criteria to extreme events.”

This research was supported, in part, by the Office of Naval Research, Army Research Office, and Air Force Office of Scientific Research, and was initiated through a grant from the American Bureau of Shipping.

In pursuit of the elusive stem cell

Mon, 10/15/2018 - 2:40pm

How does the body renew itself? How do cancer cells use the same or similar processes to form tumors and spread throughout the body? How might we use those processes to heal injuries or fight cancer?

A new research program at MIT is tackling fundamental biological questions about normal adult stem cells and their malignant counterparts, cancer stem cells. Launched last spring with support from Fondation MIT, the MIT Stem Cell Initiative is headed by Jacqueline Lees, the Virginia and D.K. Ludwig Professor of Cancer Research, professor of biology, and associate director of the Koch Institute for Integrative Cancer Research. Other founding members of the initiative are Robert Weinberg, a professor of biology, Whitehead Institute member, and director of the Ludwig Center at MIT; and Omer Yilmaz, an assistant professor of biology.

Rare power

Normal adult stem cells have been recognized for more than half a century. Relatively rare, they are undifferentiated cells within a tissue that divide to produce two daughter cells. One remains in the stem cell state to maintain the stem cell population, a process called self-renewal. The second daughter cell adopts a partially differentiated state, then goes on to divide and differentiate further to yield the multiple cell types that form that tissue. In many fully formed adult tissues, normal stem cells divide periodically to replenish or repair the tissue. Importantly, this division is a carefully controlled process that ensures tissues are restricted to the appropriate size and cell content.

Cancer stem cells are also of long-standing interest and share many similarities with normal adult stem cells. They undergo the same kind of division but, rather than differentiating into the cell types of a healthy tissue, the progeny of the second daughter cell amass to form the bulk of the tumor. Following surgery or treatment, cancer stem cells can regrow the tumor — and they are frequently resistant to chemotherapy — making them especially dangerous. This unique ability of normal and cancer stem cells to both self-renew and form a tissue or tumor is referred to by researchers as “stemness,” and it has important implications for biomedical applications.

Because of the key role they play in tissue maintenance and regeneration, normal stem cells hold great promise for use in repairing damaged tissues. Cancer stem cells, correspondingly, are the lifeblood of tumors. Although relatively rare within tumors, they are particularly important because they possess the ability to create tumors and are also chemotherapy-resistant. As a result, cancer stem cells are thought to be responsible for tumor recurrence after remission, and also for the formation of metastases, which account for the majority of cancer-associated deaths. Accordingly, an anti-cancer stem cell therapy that can target and kill cancer stem cells is one of the holy grails of cancer treatment — a means to suppress both tumor recurrence and metastatic disease.

Hiding in plain sight

One of the fundamental challenges to studying normal and cancer stem cells, and to ultimately harnessing that knowledge, is developing the ability to identify, purify, and propagate these cells. This has often proved tricky, as another key similarity between normal and cancer stem cells is that neither is visibly different from other cells in a tissue or tumor. Thus, a major goal in stem cell and cancer stem cell research is finding ways to distinguish these rare specimens from other cells, ideally by identifying unique surface markers that can be used to purify stem cell and cancer stem cell populations and enable their study.

The MIT Stem Cell Initiative is applying new technologies and approaches in pursuit of this goal. More specifically, the program aims to:

  • identify the stem cells and cancer stem cells in various tissues and tumor types;
  • determine how these cells change during aging (in the case of normal stem cells) or with disease progression (in the case of degenerative conditions and cancer); and
  • determine the similarities and differences between normal and cancer stem cells, with the goal of finding vulnerabilities in cancer stem cells that can be viable and specific targets for treatment. 


Ultimately, the ability to identify, purify, and establish various populations of stem cells and cancer stem cells could help researchers better understand the biology of these cells, and learn how to utilize them more effectively in regenerative medicine applications and target them in cancer.

When biology meets technology

MIT Stem Cell Initiative studies focus on normal and cancer stem cells of epithelial tissues. Epithelia are one of four general tissue types in the body; they line most organs and are where the vast majority of cancers arise. Epithelial cells from different organs share some biological properties, but also have distinct differences reflecting the organ’s specific role and/or environment. In particular, the MIT Stem Cell Initiative has focused on the breast and colon, as these tissues are quite different from each other, yet each constitutes a major portion of cancer incidence.

New technologies are enabling the researchers to make significant headway in these investigations, progress that was not feasible just a few years ago. Specifically, they are using a combination of specially cultured cells, sophisticated and highly controllable mouse models of cancer, and single-cell RNA sequencing and computational analysis techniques that are uniquely suited to extracting a great deal of information from the relatively small number of stem cells.

While breast and colon work is ongoing, MIT Stem Cell Initiative members are planning studies of additional tissues and recruiting collaborators for pilot projects. The results of the researchers’ studies will advance understandings of stem cell regulation and may ultimately lead to advances in tissue regeneration and/or cancer analysis and treatment.

Starting new conversations about identity abroad

Mon, 10/15/2018 - 2:30pm

Students from diverse cultural, racial, ethnic, national or economic backgrounds; students with disabilities; LGBTQ+ students; first-generation students; and others face unique challenges when participating in international programs. MIT International Science and Technology Initiatives (MISTI), based in the Center for International Studies within the School of Humanities, Arts, and Social Sciences, has launched an initiative to address such issues and better understand those perspectives.

The mission is simple: to prepare and support all students while abroad. Through student blogs, guided peer-to-peer conversation sessions and tailored resources, MISTI aims to empower students with new methods for engaging with their identities during the course of their international experiences.

“I have always ‘traveled’ through the course of my life. I have, in my 19 years of life, lived in 19 different buildings, four different states, and two different countries. Being a first-generation, low-income student did impact my confidence in my abilities to do well traveling abroad. …Thankfully, there were MISTI resources available that helped me,” says sophomore Enriko Kurtz Granadoz Chave, who participated in an internship in Santiago, Chile, at the University of Santiago de Chile through MIT-Chile.

Co-sponsored by the Institute Community and Equity Office (ICEO), MISTI received grant support to host speakers from Diversity Abroad for both staff and students in 2017, and this year received additional funding to foster student leadership. MISTI is focusing on professional development, campus collaboration, and student communication in order to better prepare students before departure to their host countries and to provide thoughtful support while students are abroad.

To develop the new programming, Mala Ghosh, MIT-India managing director and MISTI diversity lead, talked with campus partners, researched current best practices, and sought out student feedback. “We are proud of the diversity represented in MISTI participation,” says Ghosh. “However, we must go beyond numbers and ensure that we are supporting all students to thrive abroad.”

Creating a conversation

MISTI offers a series of dialogue-based sessions, led by students and guided by MISTI staff, partners, and speakers. These gatherings are focused on particular aspects of identity and are open to all MIT students, with the goal of preparing students for traveling and living abroad. Four sessions were held during the past year: “Embracing Your Diversity Abroad,” “Being Out in the World: Being LGBTQ+ Abroad,” “Going Abroad as a Student of Color,” and “Religion & Spirituality Abroad.”

Eduardo Rivera, MIT-Chile program manager, captures the goals for both students and staff: “Every international academic experience is unique. The singularity of those experiences is not only shaped by the particular context of the destination, but more importantly by the unique lens through which the student will see and interact with the new context. Offering our students an opportunity to reflect on their identities and their international experiences is a fundamental step to supporting their personal and academic growth before, during, and after an experience abroad.”

Sharing student perspectives

MISTI also highlights student-to-student learning through MISTI IdentityX Ambassadors, where students write blogs about their MISTI experiences. These blogs start conversations on the ambassador’s identity and how it shaped their global experiences. This summer, 10 students wrote about religion, race, heritage, prejudice, privilege, LGBTQ+ identity, and economic status, among other topics. 

“I joined the MISTI IdentityX Ambassador Program because it was a way to capture my thoughts while abroad. I picked South Africa because I had questions about my own identity that I sought to answer and this was a perfect medium,” says IdentityX Ambassador and sophomore Peter Williams, who completed a MISTI internship in South Africa to complement his MIT mechanical engineering studies.

“Participating in IdentityX has provided me the opportunity to frame, process, and write about my experience abroad in the context of identity,” says senior Carrie Watkins, who is pursuing her master’s in city planning and completed her internship in The Netherlands. “It has given me an excuse to enter into real conversations with new friends and colleagues.”

MISTI aims for these conversations to inspire students who don’t feel like international opportunities are for them, or are nervous about being successful in an internship abroad. “I think having honest accounts are valuable for individuals who are considering MISTI,” says Yara Jabbour Al Maalouf, a senior in chemical-biological engineering who wrote her IdentityX blogs during her internship in India. “It isn't necessarily purely for advice on 'how to survive' or reassurance of certain worries, but it is also a unique perspective on how to make the most out of the experience and grow.” 

For master's student Trang Luu ’18, who completed MISTI internships in South Africa and Cameroon through MIT-Africa, the international experience forced her to expand and question aspects of her identity. “When I got my acceptance letter to MIT, I felt like I had broken through a glass ceiling,” says Luu. “I decided that the life I was going to live would be the life that I chose — and I chose to be an engineer. Never once did I anticipate that being an engineer could have a downside; however, during my time in Cameroon, I began to realize that I needed to question my own perspectives and ensure broader social impact, not only a technical or physical solution.”

Future goals

Future MISTI events will continue to highlight different perspectives, the intersection of varying identities, and focus on providing country-specific resources to students. IdentityX Ambassadors will play an important part in that goal as peer mentors and program representatives.

“We believe one of the most effective ways for students to learn is by engaging with one another,” says Ghosh. “We are preparing MISTI IdentityX Ambassadors to help lead pre-departure sessions for students going overseas next year. It is vital for students to hear from other students not only about international academic and career opportunities, but also how their various identities played a role in their time abroad. We have found that students tend to open up more in smaller sessions focused on gender and safety abroad, being LGBTQ+ abroad, concerns around immigration and travel, student wellness while abroad, and preparing ahead for managing wellness or accessibility abroad.”

“The blogs and other identity programming can only make MISTI more approachable as a community,” says IdentityX Ambassador Johnson Huynh, who completed his internship in Mexico and is studying mechanical engineering. “If we could continue this trend of encouraging students to think about their identities, and highlight MISTI student personalities, I believe that it can only draw more participants towards the program and to international programs in general.”

The blogs not only met a student need, but also fulfilled a MISTI goal. “The MISTI blogs are a window to discover our students beyond their course or simple demographic data. The blogs are an exercise of reflection, but moreover, they are an expression of life changing experiences narrated in first person, an open book to the entire MIT community,” says Rivera.

Protein has unique effects on information processing

Mon, 10/15/2018 - 1:10pm

Our cognitive abilities come down to how well the connections, or synapses, between our brain cells transmit signals. Now a new study by researchers at MIT’s Picower Institute for Learning and Memory has dug deeply into the molecular mechanisms that enable synaptic transmission, revealing the distinct role of a protein that, when mutated, has been linked to intellectual disability.

The key protein, called SAP102, is one of four members of a family of proteins, called PSD-MAGUKs, that regulate the transport and placement of key receptors called AMPARs on the receiving end of a synapse. But how each member of the family works — for instance, as the brain progresses through development to maturity — is not well understood. The new study, published in the Journal of Neurophysiology, shows that SAP102 and other family members, such as PSD-95, work in different ways, a feature whose evolution may have contributed to the greater cognitive capacity of mammals and other vertebrates.

“Our results show that PSD-95 and SAP102 regulate synaptic AMPAR function differently,” wrote the researchers, who include senior author Weifeng Xu, an assistant professor in the Department of Brain and Cognitive Sciences, and lead author Mingna Liu, a former postdoc in Xu’s lab who is now at the University of Virginia.

Says Xu: “This study is part of a continuous effort in our lab to elucidate the molecular machinery for tuning synaptic transmission critical for cognition.”

Current affairs

Specifically, the scientists found that the proteins distinctly affected how quickly electrical currents lost strength in postsynaptic cells, or neurons.

“For the first time we show that PSD-95 and SAP102 have differential effects on the decay kinetics of synaptic AMPAR currents,” they wrote.

In one key set of experiments in rats, in a region of the brain called the hippocampus, the researchers showed that while knocking out PSD-95 causes a reduction in AMPAR current frequency and amplitude, they could restore both by replacing PSD-95 with a different form, PSD-95alpha, or with SAP102. They made these manipulations by using a virus to perform the swap, a technique Xu has developed called molecular replacement.

But the two proteins are not merely interchangeable. Compared to control neurons with normal PSD-95 or cells in which PSD-95 was replaced with PSD-95alpha, cells in which PSD-95 was replaced with SAP102 had different AMPAR current kinetics, meaning that the currents took longer to decay. That timing difference made by SAP102 could make an important difference in how synapses operate to affect cognition.

“These data showed that PSD-95alpha and SAP102 have distinct effects on the decay time of synaptic AMPAR currents, which potentially lead to differential synaptic integration for neuronal information processing,” the researchers wrote.

Protein partner

In another set of experiments, the team showed that SAP102 uniquely depends on another protein called CNIH-2. Knocking CNIH-2 down on its own didn’t affect AMPAR currents, but when they knocked the protein down in the context of replacing PSD-95 with PSD-95alpha or SAP102, the researchers found that SAP102 could no longer restore the currents. Meanwhile, knocking down CNIH-2 had no effect on PSD-95alpha’s rescue of AMPAR currents.

“These data showed that the effect of SAP102 but not that of PSD-95alpha on synaptic AMPAR currents depends on CNIH-2, suggesting that SAP102 and PSD-95alpha regulate different AMPAR complexes,” they wrote.

In all, the findings suggest that the diversity of AMPAR regulation leads to cognitively consequential differences in current timing at synapses.

“It is likely the AMPAR complex diversity contributes to the temporal profile of synaptic events important for information encoding and integration in different cell types and synapses,” they wrote.

In addition to Liu and Xu, the paper’s other authors are Rebecca Shi, Hongik Hwang, Kyng Seiok Han, Man Ho Wong, Xiobai Ren, Laura Lewis, and Emery N. Brown.

The research was funded through the National Institutes of Health and an MIT Simons Seed Grant.

When light, not heat, causes melting

Mon, 10/15/2018 - 11:00am

The way that ordinary materials undergo a phase change, such as melting or freezing, has been studied in great detail. Now, a team of researchers has observed that when they trigger a phase change by using intense pulses of laser light, instead of by changing the temperature, the process occurs very differently.

Scientists had long suspected that this may be the case, but the process has not been observed and confirmed until now. With this new understanding, researchers may be able to harness the mechanism for use in new kinds of optoelectronic devices.

The unusual findings are reported today in the journal Nature Physics. The team was led by Nuh Gedik, a professor of physics at MIT, with graduate student Alfred Zong, postdoc Anshul Kogar, and 16 others at MIT, Stanford University, and Skolkovo Institute of Science and Technology (Skoltech) in Russia.

For this study, instead of using an actual crystal such as ice, the team used an electronic analog called a charge density wave — a frozen electron density modulation within a solid — that closely mimics the characteristics of a crystalline solid.

While typical melting behavior in a material like ice proceeds in a relatively uniform way through the material, melting induced in the charge density wave by ultrafast laser pulses works quite differently. The researchers found that during the optically induced melting, the phase change proceeds by generating many singularities in the material, known as topological defects, and these in turn affect the ensuing dynamics of electrons and lattice atoms in the material.

These topological defects, Gedik explains, are analogous to tiny vortices, or eddies, that arise in liquids such as water. The key to observing this unique melting process was the use of a set of extremely high-speed and accurate measurement techniques to catch the process in action.

The fast laser pulse, less than a picosecond (a trillionth of a second) long, simulates the kind of rapid phase change that occurs in quenching — such as suddenly plunging a piece of semimolten, red-hot iron into water to cool it off almost instantly. This process differs from the way materials change through gradual heating or cooling, where they have enough time to reach equilibrium — that is, to reach a uniform temperature throughout — at each stage of the temperature change.

While these optically induced phase changes have been observed before, the exact mechanism through which they proceed was not known, Gedik says.

The team used a combination of three techniques, known as ultrafast electron diffraction, transient reflectivity, and time- and angle-resolved photoemission spectroscopy, to simultaneously observe the response to the laser pulse. For their study, they used a compound of lanthanum and tellurium, LaTe3, which is known to host charge density waves. Together, these instruments make it possible to track the motions of electrons and atoms within the material as they change and respond to the pulse.

Video above shows electron diffraction from the sample being studied. The smaller white spots close on either side of the central dot show the charge density wave, which is analogous to a crystal structure, as it "melts" when hit with an ultrafast laser pulse, and then "refreezes."

Energy bands in the material are depicted in this video, where the density of high-energy electrons is plotted versus their momentum. Bright bands that appear and then disappear correspond to the decrease in order (melting) and the reappearance of that order (freezing).

In the experiments, Gedik says, “we can watch, and make a movie of, the electrons and the atoms as the charge density wave is melting,” and then continue watching as the orderly structure then resolidifies. The researchers were able to clearly observe and confirm the existence of these vortex-like topological defects.

They also found that the time for resolidifying, which involves the dissolution of these defects, is not uniform, but takes place on multiple timescales. The intensity, or amplitude, of the charge density wave recovers much more rapidly than does the orderliness of the lattice. This observation was only possible with the suite of time-resolved techniques used in the study, with each providing a unique perspective.

Zong says that a next step in the research will be to try to determine how they can “engineer these defects in a controlled way.” Potentially, that could be used as a data-storage system, “using these light pulses to write defects into the system, and then another pulse to erase them.”

Peter Baum, a professor of physics at the University of Konstanz in Germany, who was not connected to this research, says “This is great work. One awesome aspect is that three almost entirely different, complicated methodologies have been combined to solve a critical question in ultrafast physics, by looking from multiple perspectives.”

Baum adds that “the results are important for condensed-matter physics and their quest for novel materials, even if they are laser-excited and exist only for a fraction of a second.”

The work was carried out in collaboration between researchers at MIT, Stanford University, and Skoltech. It was supported by the U.S. Department of Energy, the Gordon and Betty Moore Foundation, the Army Research Office, and the Skoltech NGP Program.

Provost's letter to the faculty about the MIT Stephen A. Schwarzman College of Computing

Mon, 10/15/2018 - 8:58am

The following email was sent today to the MIT faculty from Provost Martin Schmidt.

Dear colleagues,

As I trust you have seen, this morning Rafael wrote to the community to announce the creation of the MIT Stephen A. Schwarzman College of Computing. This is an historic day for the Institute.

The idea for the College emerged from a process of consultation the administration conducted over the past year. In that time, we consulted with many faculty members, both on School Councils and in some departments with significant computing activities. How to handle the explosive growth in student interest in computing, on its own and across other disciplines, has been an administrative concern for some time. As we’ve seen in the sharp rise in majors “with CS,” individual departments have worked hard to respond. But through more than a year’s worth of thoughtful input from many stakeholders, we came to see that if MIT could take a single bold step at scale, we could create important new opportunities for our community.

A central idea behind the College is that a new, shared structure can help deliver the power of computing, and especially AI, to all disciplines at MIT, lead to the development of new disciplines, and provide every discipline with an active channel to help shape the work of computing itself. Among those we have consulted so far, I sense a deep excitement for the power of this idea.

Opportunities for input

Today’s announcement has defined a vision for this College. Now, to realize its full potential, we are eager to launch a process that includes even more voices and perspectives. As a very first step, Rafael announced a set of community forums where we will share more detail on the vision and a process for moving forward. I hope you will join us for the faculty forum — October 18, 5:30–6:30 PM in 32-123 — so that we can learn from your feedback. The October 17th Faculty Meeting will also include discussion of the new College.

The search for the Dean of the MIT Schwarzman College of Computing

One immediate step is the search for the College’s inaugural dean. I am grateful to Institute Professor Ronald L. Rivest for agreeing to chair the search, and I am in the process of finalizing a search committee; we will announce the membership soon. I will ask the committee to recommend a short list of the best internal and external candidates by the end of November. It’s important that we work efficiently together to appoint a dean in the coming months, so that the new dean will be able to participate fully in implementing all aspects of the College.

I invite you to share your advice with the committee, including your suggestions for candidates for this important position, by sending email to CollegeOfComputingImplementation@mit.edu. All correspondence will be kept confidential.

The process moving forward

The Chair of the Faculty Susan Silbey and I have discussed ideas for the best process moving forward. Even as we conduct a search for the new dean of the College, we can begin to make progress on several fronts.

At this point, we believe we could form a number of working groups to advise the administration on important details of creating the College, perhaps following the process MIT used during the 2008 budget crisis, which actively engaged all key stakeholders at the Institute. The working groups can evaluate options and make recommendations on issues like the detailed structure of the college, how faculty appointments will be made, and how we envision new degrees and instructional support that cut across the Institute. Again, we welcome your comments, questions, and insights as we move forward with this process. Please feel free to contribute any input via CollegeOfComputingImplementation@mit.edu.

We have much work ahead of us, and I look forward to the excitement and challenge of writing this new chapter of the Institute’s history. I welcome your feedback and advice.

With my best regards,

Marty
