MIT Latest News
In the spring of 2016, while Emily Lindemer was working toward her PhD at MIT, she was also struggling with something closer to home: watching someone she knew well fall in and out of recovery from opioid addiction.
Like many people in recovery, Lindemer’s friend had his ups and downs. There were promising periods of sobriety followed by relapses into old habits. As the months went by, Lindemer began to see patterns.
For example, when he lost his driver's license — a common occurrence for people struggling with substance abuse who have run-ins with police — he had to call his friends to give him rides to work. If the friends he called for a lift were also people he used drugs with, Lindemer says, he’d relapse within a week.
“His relapses were predictable almost to a T, just based on the people he was associating with — who he was talking to, calling, texting, and hanging out with,” she says.
This realization turned out to be an inspiration. What if, she thought, there was a way to provide gentle moments of pause to people struggling with substance-abuse disorders? And what if those reminders could come through a smartphone application that monitors users’ contacts, location, and behaviors — and, using the information it gathers, offers encouragement when they are communicating with risky people or when they’re near a trigger area?
Lindemer, then a PhD student in the Harvard-MIT Health Sciences and Technology program, formed a team, which started thinking through the basics of what would become an app called Hey,Charlie. She knew of dozens of existing apps to help people in recovery. Some, like MySoberLife, offer simple lifestyle tracking services. Others, like reSET, are prescription-only and share patients’ responses to questionnaires with doctors. But none addressed the primary trigger Lindemer saw for relapses: social contacts.
Lindemer and her team participated in MIT Hacking Medicine, a worldwide event in which people have a short time to come up with solutions to health care-related problems. They emerged from that experience with sharper ideas, and with a clear sense that they would need funding and more advice. So Lindemer applied to the MIT Sandbox Innovation Fund, a program that provides seed funding for students’ ideas. The team received $25,000 and was connected to mentors with relevant experience. Lindemer and her team streamlined the application and designed a business model, and recently they ran a successful usability pilot.
The Hey,Charlie app works on several levels. When someone downloads it, it prompts them to enter general information about a few of their contacts, including questions that might prove helpful on the road to recovery, for example: “How often does this person express doubt about your ability to continue your recovery process?”
“They are objective questions, not subjective, and they aren't stigmatizing,” Lindemer says. “They do not ask the person in recovery to incriminate anybody. We try to figure out things like, is this a person that even knows that you are struggling with substance abuse disorder? Is this a person who contributes to stress levels in your life? Or is this the type of person who encourages your sobriety?”
The app also asks new users for a unique set of spatial information. Where are the areas of their city or region that could be triggers for users — locations where they bought drugs, or where their friends who use drugs are living? Users identify a particular point and then drag out a circle to match the size of the area. As they go about their day, if they approach a place they have identified as risk-related, the app sends a notification: “Hey, I know you’re near a risky area. You can do this.”
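The location-trigger feature is, at its core, a geofence check: compare the user’s position against each saved point-plus-radius zone. A minimal sketch of how such a check could work (the zone schema and message handling here are hypothetical illustrations, not Hey,Charlie’s actual implementation):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in meters."""
    r = 6371000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def check_risky_areas(position, zones):
    """Return an alert message for each user-drawn risky zone that
    contains the current position (zone schema is hypothetical)."""
    lat, lon = position
    alerts = []
    for zone in zones:
        if haversine_m(lat, lon, zone["lat"], zone["lon"]) <= zone["radius_m"]:
            alerts.append("Hey, I know you're near a risky area. You can do this.")
    return alerts

# One zone: a point the user marked, with a dragged-out 500 m radius.
zones = [{"lat": 42.3601, "lon": -71.0589, "radius_m": 500}]
```

In a real app this check would run on periodic background location updates rather than on demand.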
Even when users aren’t engaged with the app, Hey,Charlie collects data on their activity and interactions — very, very securely, says Lindemer.
“Anything that gets sent into the cloud for Hey,Charlie is encrypted,” she says. “What we get is anonymized communication data. So we might know this user is talking to five unique risky people, but we have no idea who those risky people are, what their phone numbers are, or anything. It’s not the specific people and places that are necessarily important. It is the volume of communication with people that are helpful versus unhelpful.”
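One plausible way to obtain anonymized, volume-only communication data of the kind Lindemer describes is to hash contact identifiers on the device with a key that never leaves the phone, then upload only the tallies. A hedged sketch of that idea (the keyed-hash scheme, field names, and labels are assumptions, not Hey,Charlie’s actual design):

```python
import hashlib
import hmac
from collections import Counter

DEVICE_KEY = b"per-device-secret"  # stays on the phone, never uploaded

def anonymize(contact_id: str) -> str:
    """Keyed one-way hash: the server sees only opaque contact IDs."""
    return hmac.new(DEVICE_KEY, contact_id.encode(), hashlib.sha256).hexdigest()[:16]

def communication_summary(events, labels):
    """Tally communication volume per opaque contact and per risk label.

    `events` is a list of contact identifiers, one per call or text
    logged on-device; `labels` maps identifiers to 'risky'/'helpful'
    from the onboarding questions. Only these tallies would leave
    the device.
    """
    per_contact = Counter(anonymize(c) for c in events)
    by_label = Counter(labels.get(c, "unknown") for c in events)
    return per_contact, by_label

events = ["555-0001", "555-0001", "555-0001", "555-0002"]
labels = {"555-0001": "risky", "555-0002": "helpful"}
per_contact, by_label = communication_summary(events, labels)
```

The server can then reason about "this user contacted three risky people today" without ever holding a real phone number.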
Christopher Shanahan, the director of Hey,Charlie’s recent usability pilot at Boston Medical Center and Mattapan Community Center, says the app’s nudges can help patients stay engaged with their recovery when they’re outside of the clinic.
“As clinicians we only see patients in the clinic 15 or 20 minutes a week, and yet patients have to live 24 hours a day and deal with their addictions all of the time,” Shanahan says. “This is one small way to support our patients in those interim time periods.”
During the pilot, which tracked 24 people using the app over the course of a month, Shanahan says he was surprised at how enthusiastic the responses were — users felt positively toward the app and indicated they would use it again in the future.
Michael Barros, an advisor on Hey,Charlie’s user interface who has been in recovery from heroin addiction, told Lindemer that many treatment facilities are run using old methods that are often ineffective.
“One of the most interesting things about Hey,Charlie is having PhDs like Emily working to bring some science into a part of medicine that is still running on pen, paper, and hunches about what worked for people in the past,” Barros says. “The data that can be collected with an app like Hey,Charlie is badly needed.”
Elite athletes understand that to maximize performance, they can’t only train hard during workouts — they must also train smart. Unfortunately, unless you’re willing to live in a lab, it can be easier to get real-time information about your car than your body.
Startup Humon is one of a growing number of companies trying to change that with wearable sensors and other technologies. The company’s first product, the Hex, measures oxygen in athletes’ muscles as they train, and visualizes those data so users can tailor workouts to their body’s needs.
“The goal was to create the most useful and personalized training tool today,” says Humon CEO and co-founder Alessandro Babini SM ’15. “[To achieve that] we needed both amazing, lab-grade data and the expertise of a personal coach combined in a consumer product.”
The Hex, a lightweight device that straps onto a user’s thigh, determines muscle oxygen levels by emitting light into the muscle tissue and measuring its absorption, in a process called near-infrared spectroscopy. That information is then relayed to a user’s phone, smart watch, or laptop via Bluetooth or ANT+ technology and displayed in a simple graph along with personalized insights.
As athletes train, the graph shows them if their muscles are consuming oxygen at a higher rate than what is being supplied, which tells them if their current pace is sustainable.
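Near-infrared spectroscopy estimates of muscle oxygen saturation typically rest on the modified Beer-Lambert law: measure light absorption at two wavelengths, solve for the relative concentrations of oxy- and deoxyhemoglobin, and report their ratio. The sketch below illustrates that textbook calculation; the extinction coefficients are approximate literature values, and Humon’s actual calibration and algorithm are proprietary.

```python
import numpy as np

# Approximate extinction coefficients (per cm, per unit concentration)
# for oxy- and deoxyhemoglobin at two near-infrared wavelengths.
EXT = np.array([[390.0, 1102.0],    # 730 nm: [HbO2, Hb]
                [1058.0, 691.0]])   # 850 nm: [HbO2, Hb]

def smo2(od_730, od_850, path_length_cm=1.0):
    """Estimate muscle oxygen saturation from optical densities at two
    wavelengths via the modified Beer-Lambert law (illustrative only).

    Solves EXT * [HbO2, Hb] = [od_730, od_850] for the relative
    concentrations, then returns HbO2 / (HbO2 + Hb).
    """
    b = np.array([od_730, od_850])
    hbo2, hb = np.linalg.solve(EXT * path_length_cm, b)
    return hbo2 / (hbo2 + hb)
```

With only two chromophores and two wavelengths the linear system is exactly determined; commercial sensors add wavelengths and detector spacings to correct for skin, fat, and scattering.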
Babini and his co-founder Daniel Wiese SM ’13 PhD ’16 got the idea for the Hex in 2015 while working on a class project together at MIT’s Sloan School of Management. Wiese was minoring in technology, innovation, and entrepreneurship while pursuing his PhD in mechanical engineering, and Babini was on his way to earning a master’s in management studies.
“The only thing we knew is we wanted to start a company centered around the human body, because we see no scenario in the future in which you’ll wake up in the morning knowing as little information about your body as you know today,” Babini says.
After deciding to start a company, the founders received support from the Martin Trust Center for MIT Entrepreneurship, which helped them secure office space, connect with mentors, and speak with founders of established companies.
“I hadn’t thought about starting a company before, much less had any experience with that process,” says Wiese. “I didn’t know the first thing about it, so it was just incredible to have all these resources.”
They also received a grant while participating in MIT’s delta v accelerator, and they joined the MIT Fuse program to conduct market research.
“We realized we had very similar views on where we wanted to be in the future, and we knew we wanted to work together,” Babini recalls. “So we embarked on this market research project to figure out how to get there.”
Finding a target
Elite athletes are relentless in their quest to get even the slightest edge on their competition.
“We realized the early adopters were the athletes because they have a problem, strong purchasing power, and the education to voice their problem,” Babini explains. “The problem is they don’t have the information they need to optimize their training.”
Babini says the only real-time metric athletes have traditionally been able to track noninvasively has been heart rate, which can be influenced by factors such as caffeine intake and sleep deprivation. The founders wanted to track a metric that offers more value in training sessions, and decided muscle oxygen levels were the best metric of exertion.
Data on muscle oxygen levels can be useful in each stage of a workout: Prior to a training session, athletes can use muscle oxygen data to determine when their muscles are warmed up (helping to prevent injuries); during a training session, such as a long run, they can use the data to determine if their current pace is sustainable; and after a workout, they can use the data to monitor their muscle’s recovery and know exactly when they’re ready for more exertion — a particularly useful insight during interval training.
Near-infrared spectroscopy, the core technology behind the Hex, has been used to gather physiological data in labs for decades, but the founders sought to make it work seamlessly within an athlete’s workout routine. In 2016, after speaking with hundreds of athletes, they began a small pilot trial to get a feel for what the sensor’s optimal weight and size would be and how best to display the data. Later that year, they included 250 athletes in a larger trial designed to ensure the Hex solved a significant pain point.
The founders added features to the Hex as they spoke with more athletes, and the company officially started shipping the Hex in February of this year. Wiese likes to break the company’s journey to the market into three steps: developing the technology, making it convenient for athletes, and building out the software features.
“All we’re doing going forward is software and analytics, mining all of these insights and gleaning everything we can, then communicating that to the user via our app,” Wiese says. “We’re building out new features all the time and adding functionalities to integrate with users’ habits.”
The company markets the Hex as the world’s first real-time artificial intelligence coach because the system can prompt users to adjust their pace based on their body’s response to current training intensity and tell users when their muscles are ready for another interval set. Additional features will come as the company gathers more data.
Since the launch earlier this year, Humon has sold thousands of devices to users in 46 countries around the world. Babini reports sales are growing 40 percent month-over-month. But the metric the founders say they’re most proud of is engagement: The average customer uses the Hex three times a week. (The company is only able to track this metric through the smartphone application.)
Hex is also notable for its accuracy. In April, researchers from Harvard Medical School found that Humon’s method of measuring muscle oxygen saturation was up to 96 percent as accurate as the ISS Meta Ox, a stationary device Babini calls the “gold standard” of oxygen monitoring technology.
As the company continues rolling out features for the Hex, it will also unveil plans for a new product early next year. The plans are an indication that, although the founders are happily focused on athletes for now, they’re also mindful of their role in the context of the larger wearables movement.
“We’re big believers in the smart clothing industry,” Babini says. “We think that’s the future of the wearables industry, and [with the Hex] we want to make sure we become a leader in this market using existing sensors and data.”
Although Volha Charnysh initially distanced herself from her native land of Belarus, she has in recent years found reason to return to her Eastern European roots.
"In graduate school, I was at first reluctant to pursue questions that involved the region where I grew up — I feared it might be boring, or limit me somehow," says Charnysh, a newly appointed assistant professor of political science. "But then I encountered compelling, quantitative political science on the legacies of violence for national identity and economic development, and I realized I could explore cases from Eastern Europe from fresh perspectives."
Charnysh's focus on the political and economic impacts of major historical conflicts began when she was a doctoral student in government at Harvard University. Pursuing a hunch, she investigated at a detailed, local level the massive population shifts that took place in and around Poland following World War II, a period when the redefinition of national boundaries displaced millions. This work involved a year in Poland, much of it in Warsaw archives, tracking the changing ethnic composition of more than 1,200 communities.
"It was fascinating to me," she says. "It involved an aspect of the war I hadn't known before, and that had a direct impact on the area where I'd grown up."
Charnysh also analyzed economic development before, during, and after the war in these municipalities — some where people had been forcibly resettled, and others where migrants moved voluntarily. She collected data on job creation and public service provision as well, through the end of Communist rule in Eastern Europe in 1989.
From her analysis, Charnysh learned that after what she calls the "churn of population," some communities contained homogeneous groups of migrants who had similar ethnic and cultural backgrounds, and other communities featured heterogeneous groups from disparate backgrounds.
"This distinction ultimately had a large impact," she says. "During the Communist period, homogeneous and heterogeneous communities were pretty similar, but after 1989, the heterogeneous communities had more enterprises and higher incomes."
Charnysh suggests that the difference in outcomes boils down to social dynamics: People in heterogeneous communities didn't get along as well as those in homogeneous communities, who banded together to provide services and goods for themselves. Heterogeneous communities, where people were more likely to distrust their neighbors, demanded that the state provide more for them by way of health care, security, and education. In the long run, this facilitated the buildup of state capacity and promoted private economic activity that eventually benefited the entire municipality.
"Heterogeneity really paid off after Communism," Charnysh says.
The vast trove of data Charnysh collected is the basis for an ongoing book project she has titled "Migration, Diversity and Economic Development." It has also inspired several related research articles, including the provocatively titled "The Death Camp Eldorado: Political and Economic Effects of Mass Violence," published in the American Political Science Review.
In this study, conducted in collaboration with Evgeny Finkel, Charnysh revealed previously undocumented local impacts of the Holocaust.
“I was curious about what happened in the areas where Jews disappeared, and where the ethnic composition changed drastically,” she says. Charnysh learned that the assets of Jews murdered at the Nazi death camp Treblinka sometimes benefited the population of the surrounding communities.
Charnysh also discovered that areas where there were more Jews before World War II not only registered higher support for anti-Semitic parties, but also voted against joining the European Union in the 2003 accession referendum. She argues that legacies of interethnic competition made voters in these localities more susceptible to arguments that linked membership in the European Union to Jewish influence.
As a result of her research, Charnysh has come to see a part of the world she thought she knew in an entirely different light. She was born in Grodno, a town on Belarus’s western border that over the centuries was a part of the Grand Duchy of Lithuania, the Russian Empire, and Poland. While the historic old town contains a building once used as a synagogue, Charnysh says she "had only a vague sense that there was a Jewish population before."
The patriotism of her countrymen, not the Holocaust, "was central to our nation's World War II experience," she says.
Her father and mother, both lawyers, were intent on her getting the best education possible, and realized that might mean leaving Grodno. When the chance came to participate in a student exchange program in the U.S., the 15-year-old Charnysh seized it with her parents’ blessing.
"The United States was the land of freedom and opportunity, and they wanted this for me," she says.
The Future Leaders Exchange program, sponsored by the U.S. State Department for high school students of former Soviet republics, proved foundational for Charnysh. "I appreciated how different the U.S. was from Belarus, in terms of both wealth and politics," she says. "I no longer wanted to return to Belarus, and decided to apply to colleges in the U.S."
With the help of scholarships, she first attended Cottey, a two-year women's college in Missouri, then Smith College, where she received a BA in government. After college she took a one-year fellowship researching non-proliferation at the Arms Control Association in Washington, before launching on her current academic journey.
Today, Charnysh ponders extending her research to regions outside of Eastern Europe. "I would like to see if my hypothesis about heterogeneous communities, state capacity, and economic outcomes holds in other contexts — maybe Israel, or India after partition," she says.
Charnysh believes her historical research bears on contemporary issues and says she is learning how immigrants integrate into new communities, and how diverse populations facilitate the creation of strong national identities. These matters have personal resonance for her.
"I have been an immigrant myself, and after living for years abroad I understand the problems of making a new home," she says. It is good to feel settled now: "I have a sense of solidarity with my political science colleagues here, and it is really good to be part of the broader academic community."
To assess long-range risks to food, water, energy and other critical natural resources, decision-makers often rely on Earth-system models capable of producing reliable projections of regional and global environmental changes spanning decades.
A key component of such models is the representation of atmospheric chemistry. Atmospheric simulations utilizing state-of-the-art complex chemical mechanisms promise the most accurate results. Unfortunately, their size, complexity, and computational requirements have tended to limit such simulations to short time periods and a small number of scenarios to account for uncertainty.
Now a team of researchers led by the MIT Joint Program on the Science and Policy of Global Change has devised a strategy to incorporate simplified chemical mechanisms in atmospheric simulations that can match the results produced by more complex mechanisms for most regions and time periods. If implemented in a three-dimensional Earth-system model, the new modeling strategy could enable scientists and decision-makers to perform low-cost, rapid atmospheric chemistry simulations that cover long time periods under a wide range of scenarios. This new capability could both improve scientists’ understanding of atmospheric chemistry and provide decision-makers with a powerful risk assessment tool.
In a new study appearing in the European Geosciences Union journal Geoscientific Model Development, the research team conducted three 25-year simulations of tropospheric ozone chemistry using chemical mechanisms of different levels of complexity within the widely used CESM CAM-chem modeling framework, and compared their results to observations. They investigated conditions under which these simplified mechanisms matched the output of the most complex mechanism, as well as when they diverged. The researchers showed that, for most regions and time periods, differences in simulated ozone chemistry between these three mechanisms are smaller than the model-observation differences themselves. They found similar results for simulations of carbon monoxide and nitrous oxide.
“The most simplified mechanism that we tested, called Super-Fast, ran three times as fast as the most complex (MOZART-4) while largely producing the same results,” says Benjamin Brown-Steiner, the study’s lead author and a former postdoc at the MIT Joint Program and Department of Earth, Atmospheric and Planetary Sciences (EAPS). “This level of efficiency could, for instance, enable scientists to study an aspect of atmospheric chemistry over the course of the 21st century, running the simplified model for 100 years, and verifying its accuracy by running the complex model at the beginning, middle and end of the century.”
Brown-Steiner and his collaborators also explored how the concurrent utilization of chemical mechanisms of different complexities can further our understanding of atmospheric chemistry at various scales. They determined that scientists could streamline atmospheric chemistry investigations by developing simulations that include both complex and simplified chemical mechanisms. In such simulations, complex mechanisms would provide a more complete representation of complex atmospheric chemistry, and simple mechanisms would efficiently simulate longer time periods to better understand the roles of meteorological variability and other sources of uncertainty.
“By noting where results produced by simple and complex mechanisms diverge in particular regions, seasons or time periods, you can determine where and when simulations require more complex chemistry, and ramp up the modeling complexity as needed,” Brown-Steiner says.
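The selection criterion Brown-Steiner describes can be sketched in a few lines: for each region, compare the simple-vs-complex mechanism gap to the model-vs-observation gap, and escalate to the complex mechanism only where the former dominates. The data layout and toy values below are illustrative, not the study’s actual code:

```python
import numpy as np

def flag_regions(simple_run, complex_run, obs, regions):
    """Per region, accept the simplified mechanism when its gap to the
    complex mechanism is smaller than the model-observation gap
    (the paper's criterion in spirit; data layout is illustrative)."""
    verdicts = {}
    for i, name in enumerate(regions):
        mech_gap = np.abs(simple_run[i] - complex_run[i]).mean()
        obs_gap = np.abs(complex_run[i] - obs[i]).mean()
        verdicts[name] = "simple ok" if mech_gap < obs_gap else "needs complex"
    return verdicts

# Toy monthly-mean surface ozone (ppb) for two regions, two months each.
regions = ["North America", "Europe"]
simple_run = np.array([[40.0, 41.0], [50.0, 60.0]])
complex_run = np.array([[41.0, 42.0], [55.0, 56.0]])
obs = np.array([[45.0, 46.0], [55.5, 56.5]])
verdicts = flag_regions(simple_run, complex_run, obs, regions)
```

In the toy data, the first region’s mechanisms agree to within the observation error, so the cheap mechanism suffices there; the second region’s mechanisms diverge by more, flagging it for complex chemistry.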
It’s a modeling strategy that promises to enhance both scientists’ understanding of the Earth’s atmosphere and decision-makers’ capability to assess environmental policies, the researchers say.
“Our study shows that more complex models are not always more useful for decision-making,” says Noelle Selin, a co-author of the study, associate professor within MIT’s Institute for Data, Systems and Society and EAPS, and Joint Program faculty affiliate. “Researchers need to think critically about whether simple and efficient approaches like this one can be equally informative at lower cost.”
Finally, the study could lead to the inclusion of simplified atmospheric chemistry mechanisms in three-dimensional Earth-system modeling frameworks. This capability would help enable scientists and decision-makers to run long-term, large-ensemble (covering multiple scenarios to represent a range of uncertainty in key modeling parameters) 3-D simulations of the Earth’s atmosphere within a reasonable period of clock-time.
“We currently represent ozone, sulfate aerosols, and other key contributors to radiative forcing in the Earth system in two-dimensional models that do not provide the level of accuracy we want,” says Ronald Prinn, EAPS professor and Joint Program co-director, who is a co-author of the study.
“To that end we’d like to represent these in three-dimensional models and run ensembles [multiple scenarios], but once we put in a full 3-D chemical package, computer time becomes unaffordable,” Prinn adds. “This study shows that for radiative forcing calculations, incorporating a fast chemical package in a modeling system can get credible agreement among simple and complex chemical mechanisms and observations.”
The genome editing system CRISPR has become a hugely important tool in medical research, and could ultimately have a significant impact in fields such as agriculture, bioenergy, and food security.
The targeting system can travel to different points on the genome, guided by a short sequence of RNA, where a DNA-cutting enzyme known as Cas9 then makes the desired edits.
However, despite the gene-editing tool’s considerable success, CRISPR-Cas9 remains limited in the number of locations it can visit on the genome.
That is because CRISPR needs a specific sequence flanking the target location on the genome, known as a protospacer adjacent motif, or PAM, to allow it to recognize the site.
For example, the most widely used Cas9 enzyme, Streptococcus pyogenes Cas9 (SpCas9), requires two G nucleotides as its PAM sequence, significantly restricting the number of locations it can target, to around 9.9 percent of sites on the genome.
As yet, there are only a handful of CRISPR enzymes with minimal PAM requirements, meaning they are able to target a wider range of locations.
Now researchers at the MIT Media Lab, led by Joseph Jacobson, a professor of media arts and sciences and head of the Molecular Machines research group, have discovered a Cas9 enzyme that can target almost half of the locations on the genome, significantly widening its potential use. They report their findings in Science Advances Oct. 24.
“CRISPR is like a very accurate and efficient postal system, that can reach anywhere you want to go very precisely, but only if the ZIP code ends in a zero,” Jacobson says. “So it is very accurate and specific, but it limits you greatly in the number of locations you can go to.”
To develop a more general CRISPR system, the researchers implemented computational algorithms to conduct a bioinformatics search of bacterial sequences, to determine if there were any similar enzymes with less restrictive PAM requirements.
To carry out the search, the researchers developed a data analysis software tool, which they called SPAMALOT (Search for PAMs by Alignment of Targets).
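The alignment-and-tally idea behind such a search can be illustrated simply: locate every match of a known spacer in a genome and tally the bases immediately 3' of each match; an overrepresented flank is a candidate PAM. A toy sketch (the sequences are made up, and the real SPAMALOT pipeline is more involved):

```python
from collections import Counter

def infer_pam(genome: str, spacers, pam_len: int = 3) -> Counter:
    """Tally the bases immediately 3' of every spacer match; an
    overrepresented flank suggests the enzyme's PAM."""
    flanks = Counter()
    for spacer in spacers:
        start = 0
        while (i := genome.find(spacer, start)) != -1:
            flank = genome[i + len(spacer): i + len(spacer) + pam_len]
            if len(flank) == pam_len:
                flanks[flank] += 1
            start = i + 1
    return flanks

# Two matches of the same spacer, each followed by an ...GG flank,
# hinting at an NGG-like PAM in this toy example.
flanks = infer_pam("AAACCCTGGTTTAAACCCAGGTTT", ["AAACCC"])
```

A real search would align thousands of spacers against many target genomes and score positional base frequencies rather than whole k-mers.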
This revealed a number of interesting possible enzymes, but no clear winner. So the team then built synthetic versions of the CRISPRs in the laboratory, to evaluate their performance.
They found that the most successful enzyme, a Cas9 from Streptococcus canis (ScCas9), was strikingly similar to the Cas9 enzyme already widely used, according to co-lead author Pranam Chatterjee, a graduate student in the Media Lab, who carried out the research alongside fellow graduate student Noah Jakimo.
“The enzyme looks almost identical to the one that was originally discovered … but it is able to target DNA sequences that the commonly used enzyme cannot,” Chatterjee says.
Rather than two G nucleotides as its PAM sequence, the new enzyme needs just one G, opening up far more locations on the genome.
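The practical effect of relaxing the PAM from two Gs (NGG) to one G (NG) can be seen by counting candidate sites in a sequence. A toy single-strand count (a real tally scans both strands of the genome; the example sequence is arbitrary):

```python
import re

def pam_site_fraction(seq: str, pam_regex: str) -> float:
    """Fraction of positions in `seq` where a PAM match begins,
    scanning one strand only; uses a zero-width lookahead so
    overlapping matches are all counted."""
    return len(re.findall(f"(?={pam_regex})", seq)) / len(seq)

seq = "ATGGCCGGTTAGCGGATCGG"  # arbitrary toy sequence
ngg = pam_site_fraction(seq, "[ACGT]GG")  # SpCas9-style PAM: any base, then GG
ng = pam_site_fraction(seq, "[ACGT]G")    # ScCas9-style PAM: any base, then one G
```

Even on this short toy sequence the one-G requirement admits more than twice as many candidate sites, mirroring the article’s jump from roughly a tenth of the genome to nearly half.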
This should allow CRISPR to target many disease-specific mutations that have previously been out of reach of the system.
For example, a typical gene is around 1,000 bases in length, giving researchers a number of different locations to target if their aim is to simply knock out the entire gene, Jacobson says.
However, many diseases, such as sickle cell anemia, are caused by the mutation of a single base, making them much more difficult to target.
“Base editing is not just a matter of hitting that gene anywhere over the 1,000 bases and knocking it out; it is a matter of going in and correcting, in a very precise way, that one base that you want to change,” Jacobson says.
“You need to be able to go to that very exact location, put your piece of CRISPR machinery right next to it, and then with a base editor — another protein that’s attached to the CRISPR — go in and repair or change the base,” he says.
The new CRISPR tool could be particularly helpful in such applications.
“We are excited to get ScCas9 into the hands of the genome editing community and receive their feedback for future development,” Chatterjee says.
The researchers very elegantly took advantage of the natural evolution of Cas9 sequences in Streptococcus bacteria in order to identify the new Cas9 protein, which will be a powerful tool for genome editing, says Jean-Paul Concordet, a CRISPR specialist at the National Museum of Natural History in France, who was not involved in the research.
“The amino acid sequence of ScCas9 is very closely related to that of SpCas9, so the anticipation is that it will also prove very easy to produce in recombinant form and can directly benefit from all the developments made for SpCas9,” Concordet says.
“In addition, ScCas9 works with the same guide RNAs as SpCas9, so it will be possible to use synthetic guide RNAs that are readily available from different companies,” he says.
The researchers are now hoping to use their technique to find other enzymes that could expand the targeting range of the CRISPR system even further, without reducing its accuracy, according to Jacobson.
“We feel confident of being able to go after every address on the genome,” he says.
Launched earlier this year, the MIT Task Force on the Work of the Future brings together a diverse team of MIT faculty and researchers from throughout the Institute, all seeking to understand the relationship between technology and work and how to best prepare workers for the future. To support its efforts, the task force has assembled two boards of experts (listed below). The advisory board includes leaders from industry, academia, labor, government, foundations, and other organizations, who will provide feedback and guidance to the task force. In addition, a research board of leading scholars in related fields will help to refine research-related questions and directions.
Leadership of the task force includes Elisabeth Reynolds, executive director of the MIT Industrial Performance Center (IPC) and lecturer in the Department of Urban Studies and Planning; David Autor, the Ford Professor of Economics and associate head of the MIT Department of Economics; and David Mindell, the Frances and David Dibner Professor of the History of Engineering and Manufacturing, and a professor of aeronautics and astronautics.
“Our advisory and research boards are invaluable resources for us, helping to ensure that our work is relevant, effective, and is informed by what is happening in the world today — in firms, in schools, in cities,” says Reynolds. “The perspectives and collaboration of our board members are critical to the success of the task force and this initiative as a whole.”
While several board members are leaders at companies such as Alphabet, Amazon, Ford Motor Company, IBM Corporation, PepsiCo, and Santander, others, including Jennifer Granholm, former governor of Michigan, bring experiences in policy and public service.
“The MIT Task Force on the Work of the Future seeks to provide some critical understanding at a critical time,” says Granholm. “The realities of technological disruption and the outsourcing of some manufacturing jobs to other countries indicate a fundamental change in the economies of many states — and of the U.S. as a whole. We need to look toward real solutions and long-term strategies toward economic stability.”
The task force will examine the challenges and opportunities that artificial intelligence and robotics bring through several key research areas, or lenses: learning and skills; education and training institutions; how new technologies are being adopted in manufacturing and health care; the implications of autonomous vehicles and ride sharing for mobility in cities; and comparative work in Germany, Scandinavia, China, and Africa. Task force members — as well as board members — bring expertise in a wide variety of fields, including engineering, economics, management, political science, and education innovation.
In terms of education, the task force is looking at the important question of how to ensure that the workforce has access to the training and skills needed to keep up with new technologies. Possible solutions include training offered within companies or online, or hybrid onsite-and-online programs, including those at community and technical colleges.
Annette Parker, president of South Central College — a Minnesota State community and technical college with two campuses — has focused extensively on how to generate a skilled technical workforce, and has forged partnerships between community colleges and both the global automotive industry and manufacturers throughout Minnesota.
“The U.S. workforce must keep up with the innovation in science, technology, engineering, and math for both engineering and technician careers,” says Parker. “There is a critical need for skilled, mid-level workers. Determining how to best prepare and continue to train these employees throughout industry is a major challenge and one that will benefit from the work of this task force.”
Advisory board member and MIT alumnus Jeff Wilke SM ’93, MBA ’93 is currently CEO of Amazon Worldwide Consumer.
“The future of the workforce is one of the most important issues facing the global economy,” says Wilke. “The research of the task force, combined with insights from corporations, governments, and educational institutions, will help evolve our understanding of how new technologies impact the workforce of the future, and how to best respond.”
The advisory board includes: Roger C. Altman, founder and senior chairman of Evercore; Ana Botin, executive chairman of the Santander Group; Charlie Braun, president of Custom Rubber Corp.; Eric Cantor, vice chairman of Moelis & Company; Volkmar Denner, chairman of the board of management at Robert Bosch GmbH; William Clay Ford Jr., executive chairman of Ford Motor Company; Jennifer Granholm, former governor of Michigan; Freeman A. Hrabowski III, president of the University of Maryland, Baltimore County; David H. Long, chairman and CEO of Liberty Mutual Insurance; Karen Mills, senior fellow at Harvard Business School; Indra Nooyi, chairman and CEO of PepsiCo; Annette Parker, president of South Central College; David Rolf, founder and president emeritus of SEIU 775; Ginni M. Rometty, chairman and CEO of IBM Corporation; Juan Salgado, chancellor of City Colleges of Chicago; Eric E. Schmidt, technical advisor and member of the board of Alphabet, Inc.; David M. Siegel, co-chairman of Two Sigma; Elizabeth Shuler, secretary-treasurer of the AFL-CIO; Robert Solow, professor emeritus of economics at MIT; Darren Walker, president of the Ford Foundation; Jeff Wilke, CEO of Amazon Worldwide Consumer; and Marjorie Yang, chairman of Esquel Group.
The research board includes: William Bonvillian, MIT lecturer; Rodney Brooks, founder, chairman, and CTO of Rethink Robotics; Josh Cohen, professor of law at Stanford University; Virginia Dignum, professor of social and ethical artificial intelligence at Umeå University; Susan Helper, professor at Case Western Reserve University; Susan Houseman, vice president and director of research at the W.E. Upjohn Institute; John Irons, director of the future of work at the Ford Foundation; Martin Krzywdzinski, principal investigator at the WZB Berlin Social Science Center; Frank Levy, Rose Professor Emeritus at MIT; Fei-Fei Li, professor of computer science at Stanford University; Nichola J. Lowe, associate professor of city and regional planning at the University of North Carolina at Chapel Hill; Joel Mokyr, professor of economics and history at Northwestern University; Michael Piore, professor emeritus of political economy at MIT; and Gill Pratt, executive technical advisor and CEO of Toyota.
A massive new survey developed by MIT researchers reveals some distinct global preferences concerning the ethics of autonomous vehicles, as well as some regional variations in those preferences.
The survey has global reach and a unique scale, with over 2 million online participants from over 200 countries weighing in on versions of a classic ethical conundrum, the “Trolley Problem.” The problem involves scenarios in which an accident involving a vehicle is imminent, and the vehicle must opt for one of two potentially fatal options. In the case of driverless cars, that might mean swerving toward a couple of people, rather than a large group of bystanders.
“The study is basically trying to understand the kinds of moral decisions that driverless cars might have to resort to,” says Edmond Awad, a postdoc at the MIT Media Lab and lead author of a new paper outlining the results of the project. “We don’t know yet how they should do that.”
Still, Awad adds, “We found that there are three elements that people seem to approve of the most.”
Indeed, the most emphatic global preferences in the survey are for sparing the lives of humans over the lives of other animals; sparing the lives of many people rather than a few; and preserving the lives of the young, rather than older people.
“The main preferences were to some degree universally agreed upon,” Awad notes. “But the degree to which they agree with this or not varies among different groups or countries.” For instance, the researchers found a less pronounced tendency to favor younger people, rather than the elderly, in what they defined as an “eastern” cluster of countries, including many in Asia.
The paper, “The Moral Machine Experiment,” is being published today in Nature.
The authors are Awad; Sohan Dsouza, a doctoral student in the Media Lab; Richard Kim, a research assistant in the Media Lab; Jonathan Schulz, a postdoc at Harvard University; Joseph Henrich, a professor at Harvard; Azim Shariff, an associate professor at the University of British Columbia; Jean-François Bonnefon, a professor at the Toulouse School of Economics; and Iyad Rahwan, an associate professor of media arts and sciences at the Media Lab, and a faculty affiliate in the MIT Institute for Data, Systems, and Society.
Awad is a postdoc in the MIT Media Lab’s Scalable Cooperation group, which is led by Rahwan.
To conduct the survey, the researchers designed what they call “Moral Machine,” a multilingual online game in which participants could state their preferences concerning a series of dilemmas that autonomous vehicles might face. For instance: If it comes right down to it, should autonomous vehicles spare the lives of law-abiding bystanders, or, alternately, law-breaking pedestrians who might be jaywalking? (Most people in the survey opted for the former.)
All told, “Moral Machine” compiled nearly 40 million individual decisions from respondents in 233 countries; the survey collected 100 or more responses from 130 countries. The researchers analyzed the data as a whole, while also breaking participants into subgroups defined by age, education, gender, income, and political and religious views. There were 491,921 respondents who offered demographic data.
The scholars did not find marked differences in moral preferences based on these demographic characteristics, but they did find larger “clusters” of moral preferences based on cultural and geographic affiliations. They defined “western,” “eastern,” and “southern” clusters of countries, and found some more pronounced variations along these lines. For instance: Respondents in southern countries had a relatively stronger tendency to favor sparing young people rather than the elderly, especially compared to the eastern cluster.
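The kind of aggregation described above can be made concrete with a minimal sketch. The records, country codes, and cluster assignments below are entirely invented for illustration (the actual study used nearly 40 million decisions and far richer statistics); the sketch simply tallies a per-cluster rate of choosing to spare the younger character:

```python
from collections import defaultdict

# Hypothetical pairwise-choice records: each respondent saw a dilemma
# pitting a young character against an elderly one and chose whom to spare.
# Country codes and the cluster mapping below are illustrative only.
responses = [
    {"country": "US", "spared_young": True},
    {"country": "US", "spared_young": True},
    {"country": "US", "spared_young": False},
    {"country": "JP", "spared_young": True},
    {"country": "JP", "spared_young": False},
    {"country": "JP", "spared_young": False},
]
CLUSTER = {"US": "western", "JP": "eastern"}  # assumed assignment

def preference_rates(records):
    """Fraction of dilemmas in which the younger character was spared,
    aggregated per cultural cluster."""
    spared = defaultdict(int)
    total = defaultdict(int)
    for r in records:
        cluster = CLUSTER[r["country"]]
        total[cluster] += 1
        spared[cluster] += r["spared_young"]
    return {c: spared[c] / total[c] for c in total}

rates = preference_rates(responses)
print(rates)  # western 2/3, eastern 1/3 for this toy data
```

Comparing such rates across clusters is, in spirit, how a "less pronounced tendency to favor younger people" in one group of countries shows up in the data.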
Awad suggests that acknowledging these types of preferences should be a basic part of informing public-sphere discussion of these issues. For instance, since all regions showed a moderate preference for sparing law-abiding bystanders over jaywalkers, knowing such preferences could, in theory, inform the way software that controls autonomous vehicles is written.
“The question is whether these differences in preferences will matter in terms of people’s adoption of the new technology when [vehicles] employ a specific rule,” he says.
Rahwan, for his part, notes that “public interest in the platform surpassed our wildest expectations,” allowing the researchers to conduct a survey that raised awareness about automation and ethics while also yielding specific public-opinion information.
“On the one hand, we wanted to provide a simple way for the public to engage in an important societal discussion,” Rahwan says. “On the other hand, we wanted to collect data to identify which factors people think are important for autonomous cars to use in resolving ethical tradeoffs.”
Beyond the results of the survey, Awad suggests, seeking public input about an issue of innovation and public safety should become a larger part of the dialogue surrounding autonomous vehicles.
“What we have tried to do in this project, and what I would hope becomes more common, is to create public engagement in these sorts of decisions,” Awad says.
Important because it is mundane, working memory gets us through each day by allowing us, for example, to follow the receptionist’s directions to find the doctor’s office, or to sort through the costs and benefits of one set of tires versus another at the dealership. It’s also profoundly debilitating when it is diminished by disorders such as schizophrenia or autism.
But MIT neuroscientist Earl Miller also sees grandeur in working memory as a system that enables our minds to exert our will over sensory information.
“What’s special about working memory is that it is volitional,” says Miller, the Picower Professor at the Picower Institute for Learning and Memory at MIT. “It is the main mechanism by which your brain wrests control from the environment and puts it under its own control. Any simple creature can just react to the environment. But what higher order animals have evolved is the ability to take control over their own thoughts.”
If only neuroscientists knew how the brain did that. Motivated by that question and by a desire to help people in whom the system is not functioning properly, Miller has been studying how working memory works for more than 20 years.
Now, in the 30th anniversary edition of the journal Neuron, Miller and co-authors Mikael Lundqvist and Andre Bastos present a new model of working memory that explains how the brain holds information in mind (the memory part) and also executes volitional control over it (the working part).
“This model brings together the maintenance and volition of working memory,” says Miller, who recently won the George A. Miller Prize in Cognitive Neuroscience, in part, for this work.
Rocking the boat with waves
Essentially, the model posits that the brain operates working memory by coordinating ensembles of cells, or neurons, in the cortex with timely bursts of activity at the frequencies of specific brain waves. In the model, waves in the low alpha and beta frequencies carry our knowledge and goals in the situation (e.g., “I need tires that will last a long time but don’t want to pay more than $400”) and regulate the higher-frequency gamma waves that handle the new sensory information to be stored and manipulated (e.g., the salesperson’s pitch that tire set A will last 45,000 miles and cost $360, and tire set B will last 60,000 miles and cost $420).
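As a toy caricature of that gating idea (this is not the authors' published model; the frequencies, levels, and on/off gating below are arbitrary simplifications), one can let a slow "beta" control signal suppress a fast "gamma" carrier and check that gamma power appears only when beta dips:

```python
import math

def beta_level(t):
    # Slow top-down control rhythm, simplified to a high/low square wave.
    return 1.0 if math.sin(2 * math.pi * 2.0 * t) > 0 else 0.2

def gamma_burst(t):
    # Fast sensory carrier (60 Hz), allowed through only when beta is low.
    return (1.0 - beta_level(t)) * abs(math.sin(2 * math.pi * 60.0 * t))

ts = [i / 1000.0 for i in range(1000)]  # one second, 1 ms steps
high_beta = [gamma_burst(t) for t in ts if beta_level(t) == 1.0]
low_beta = [gamma_burst(t) for t in ts if beta_level(t) == 0.2]
avg = lambda xs: sum(xs) / len(xs)
print(avg(high_beta) < avg(low_beta))  # True: gamma bursts only when beta dips
```

The anticorrelated beta-down/gamma-up pattern this produces is the qualitative signature the experiments described below were looking for.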
Meanwhile, the temporary storage of that sensory information is achieved by how the interplay of these rhythmic waves changes the weight of connections among the neurons, called synapses. The new paper summarizes several lines of experimental evidence supporting the model, including from papers Miller’s lab published earlier this year in the Proceedings of the National Academy of Sciences and Nature Communications and in 2016 in Neuron.
The evidence, and the model itself, challenges at least two classically held beliefs among neuroscientists. One is that brainwaves are merely byproducts of neural activity and don’t have functional meaning. The other is that working memory is maintained by a persistent hum of neural firing, rather than short, coordinated bursts. But newer and more sophisticated techniques of analysis and measurement of neural activity amid working memory experiments in lab animals have shown otherwise, the three researchers write.
For instance, in the Nature Communications paper, led by Lundqvist, the team showed the functional consequences of the different frequencies of waves. Animals were trained to play a game where they saw a sequence of two images and had to judge whether the next sequence of two had the same images in the same order. Recordings of neural activity showed a specific pattern of interplay between the wave frequencies, in which beta would decline to allow gamma to increase when information needed to be stored or read out. Beta would then increase and gamma would die down when information could be discarded.
More strikingly, the researchers could see that deviations from this pattern correlated strongly with animals making mistakes. From specific deviations the scientists could even tell if the animal subsequently made the wrong decision based on the first or second of the test images.
“This adds to mounting evidence that brain waves have a major functional role in the brain,” Miller says.
In the PNAS paper, led by Bastos, the researchers not only measured this same kind of pattern of brainwave control, but also showed that the governing alpha and beta waves originate from deeper layers of the prefrontal cortex, while gamma waves originate in more superficial layers, just as neuroscientists had previously observed in the visual cortex.
Some of the lab’s newest data, not yet published, suggests that this interplay of what they call top-down alpha-beta rhythms exerting executive control over bottom-up, sensory-oriented gamma rhythms may be widespread around the cortex, therefore potentially governing other related functions such as attention.
A working model
In all, the paper provides researchers a path to follow to advance the study of working memory, says Sabine Kastner, professor of psychology and neuroscience at Princeton University.
“The review by Earl Miller and colleagues provides a beautiful synthesis putting forward a theory for working memory function from the level of neural circuits to neural ensembles at the brain’s large-scale,” she says. “While many aspects of this theory need to be linked more closely to behavior, Miller’s account provides a roadmap for the future of the working memory field.”
Indeed, Miller says his lab is asking new questions of the model both to explore the finer workings of working memory and also to find ways that it might be therapeutically enhanced for the treatment of psychiatric disease. Miller says the team is interested in determining how the model might explain how information can be reordered or sliced and diced in other ways. The lab is also working on systems to read out and, in real time, correctively stimulate rhythms in the prefrontal cortex to enhance working memory function.
“Can we strengthen the top down rhythms when you need to focus, or can we strengthen gamma by weakening beta when you need to be more sensorily receptive?” Miller asks.
With further research, the model may tell. Miller will present this model at the 14th International Workshop on Advances in Electrocorticography in San Diego on Nov. 2. He will also be a panelist at Cell Press Conversations: “The State of the Mind 2018” panel event in San Diego on Nov. 3.
The National Institute of Mental Health, the Office of Naval Research, the Picower Fellows Program, and the MIT Picower Institute Innovation Fund have supported the research.
Imagine trying to write your name so that it can be read in a mirror. Your brain has all of the visual information you need, and you’re a pro at writing your own name. Still, this task is very difficult for most people. That’s because it requires the brain to perform a mental transformation that it’s not familiar with: using what it sees in the mirror to accurately guide your hand to write backward.
MIT neuroscientists have now discovered how the brain tries to compensate for its poor performance in tasks that require this kind of complicated transformation. As it also does in other types of situations where it has little confidence in its own judgments, the brain attempts to overcome its difficulties by relying on previous experiences.
“If you’re doing something that requires a harder mental transformation, and therefore creates more uncertainty and more variability, you rely on your prior beliefs and bias yourself toward what you know how to do well, in order to compensate for that variability,” says Mehrdad Jazayeri, the Robert A. Swanson Career Development Professor of Life Sciences, a member of MIT’s McGovern Institute for Brain Research, and the senior author of the study.
This strategy actually improves overall performance, the researchers report in their study, which appears in the Oct. 24 issue of the journal Nature Communications. Evan Remington, a McGovern Institute postdoc, is the paper’s lead author, and technical assistant Tiffany Parks is also an author on the paper.
Neuroscientists have known for many decades that the brain does not faithfully reproduce exactly what the eyes see or what the ears hear. Instead, there is a great deal of “noise” — random fluctuations of electrical activity in the brain, which can come from uncertainty or ambiguity about what we are seeing or hearing. This uncertainty also comes into play in social interactions, as we try to interpret the motivations of other people, or when recalling memories of past events.
Previous research has revealed many strategies that help the brain to compensate for this uncertainty. Using a framework known as Bayesian integration, the brain combines multiple, potentially conflicting pieces of information and values them according to their reliability. For example, if given information by two sources, we’ll rely more on the one that we believe to be more credible.
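For Gaussian estimates, this reliability weighting has a standard textbook form: weight each source by its precision (the inverse of its variance). The numbers below are invented purely for illustration:

```python
def bayes_combine(mu1, var1, mu2, var2):
    """Precision-weighted combination of two independent Gaussian estimates.
    The more reliable (lower-variance) source gets more weight."""
    w1, w2 = 1 / var1, 1 / var2
    mu = (w1 * mu1 + w2 * mu2) / (w1 + w2)
    var = 1 / (w1 + w2)  # combined estimate is more certain than either alone
    return mu, var

# A reliable source reports 10 (variance 1); a noisy one reports 16 (variance 4).
mu, var = bayes_combine(10.0, 1.0, 16.0, 4.0)
print(mu, var)  # 11.2 0.8 -- the answer lands much closer to the reliable source
```

Note that the combined variance (0.8) is smaller than either input variance, which is the formal sense in which integrating sources reduces uncertainty.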
In other cases, such as making movements when we’re uncertain exactly how to proceed, the brain will rely on an average of its past experiences. For example, when reaching for a light switch in a dark, unfamiliar room, we’ll move our hand toward a certain height and close to the doorframe, where past experience suggests a light switch might be located.
All of these strategies have been previously shown to work together to increase bias toward a particular outcome, which makes our overall performance better because it reduces variability, Jazayeri says.
Noise can also occur in the mental conversion of sensory information into a motor plan. In many cases, this is a straightforward task in which noise plays a minimal role — for example, reaching for a mug that you can see on your desk. However, for other tasks, such as the mirror-writing exercise, this conversion is much more complicated.
“Your performance will be variable, and it’s not because you don’t know where your hand is, and it’s not because you don’t know where the image is,” Jazayeri says. “It involves an entirely different form of uncertainty, which has to do with processing information. The act of performing mental transformations of information clearly induces variability.”
That type of mental conversion is what the researchers set out to explore in the new study. To do that, they asked subjects to perform three different tasks. For each one, they compared subjects’ performance in a version of the task where mapping sensory information to motor commands was easy, and a version where an extra mental transformation was required.
In one example, the researchers first asked participants to draw a line the same length as a line they were shown, which was always between 5 and 10 centimeters. In the more difficult version, they were asked to draw a line 1.5 times longer than the original line.
The results from this set of experiments, as well as the other two tasks, showed that in the version that required difficult mental transformations, people altered their performance using the same strategies that they use to overcome noise in sensory perception and other realms. For example, in the line-drawing task, in which the participants had to draw lines ranging from 7.5 to 15 centimeters, depending on the length of the original line, they tended to draw lines that were closer to the average length of all the lines they had previously drawn. This made their responses overall less variable and also more accurate.
“This regression to the mean is a very common strategy for making performance better when there is uncertainty,” Jazayeri says.
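The benefit of that strategy can be seen in a small simulation (a sketch only, not the study's analysis; the noise level and shrinkage weight are invented, though the 7.5-15 cm range matches the task described above). Pulling noisy responses toward the mean of past trials adds a little bias but removes more variance, lowering overall error:

```python
import random

random.seed(0)

# Target lengths drawn uniformly from the task's 7.5-15 cm response range.
targets = [random.uniform(7.5, 15.0) for _ in range(10000)]
prior_mean = sum(targets) / len(targets)

def mse(estimates, truths):
    return sum((e - t) ** 2 for e, t in zip(estimates, truths)) / len(truths)

# A "hard mental transformation" adds noise to each internal estimate...
noisy = [t + random.gauss(0, 2.0) for t in targets]
# ...and regressing those noisy estimates toward the mean trades a small
# bias for a large cut in variance.
shrunk = [0.6 * n + 0.4 * prior_mean for n in noisy]

print(mse(shrunk, targets) < mse(noisy, targets))  # True: shrinkage lowers error
```

This bias-variance trade-off is exactly why responses that cluster toward the average can be, on the whole, more accurate than unbiased but noisier ones.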
The new findings led the researchers to hypothesize that when people get very good at a task that requires complex computation, the noise will become smaller and less detrimental to overall performance. That is, people will trust their computations more and stop relying on averages.
“As it gets easier, our prediction is the bias will go away, because that computation is no longer a noisy computation,” Jazayeri says. “You believe in the computation; you know the computation is working well.”
The researchers now plan to further study whether people’s biases decrease as they learn to perform a complicated task better. In the experiments they performed for the Nature Communications study, they found some preliminary evidence that trained musicians performed better in a task that involved producing time intervals of a specific duration.
The research was funded by the Alfred P. Sloan Foundation, the Esther A. and Joseph Klingenstein Fund, the Simons Foundation, the McKnight Endowment Fund for Neuroscience, and the McGovern Institute.
Patients with blood cancers such as leukemia and lymphoma are often treated by irradiating their bone marrow to destroy the diseased cells. After the treatment, patients are vulnerable to infection and fatigue until new blood cells grow back.
MIT researchers have now devised a way to help blood cells regenerate faster. Their method involves stimulating a particular type of stem cell to secrete growth factors that help precursor cells differentiate into mature blood cells.
Using a technique known as mechanopriming, the researchers grew mesenchymal stem cells (MSCs) on a surface whose mechanical properties are very similar to that of bone marrow. This induced the cells to produce special factors that help hematopoietic stem and progenitor cells (HSPCs) differentiate into red and white blood cells, as well as platelets and other blood cells.
“You can think about it like you’re trying to grow a plant,” says Krystyn Van Vliet, the Michael and Sonja Koerner Professor of Materials Science and Engineering, a professor of biological engineering, and associate provost. “The MSCs are coming in and improving the soil so that the progenitor cells can start proliferating and differentiating into the blood cell lineages that you need to survive.”
In a study of mice, the researchers showed that the specially grown MSCs helped the animals to recover much more quickly from bone marrow irradiation.
Van Vliet is the senior author of the study, which appears in the Oct. 24 issue of the journal Stem Cell Research and Therapy. The paper’s lead author is recent MIT PhD recipient Frances Liu. Other authors are Singapore-MIT Alliance for Research and Technology (SMART) postdoc Kimberley Tam, recent MIT PhD recipient Novalia Pishesha, and former SMART postdoc Zhiyong Poon, now at Singapore General Hospital.
Cellular drug factories
MSCs are produced throughout the body and can differentiate into a variety of tissues, including bone, cartilage, muscle, and fat. They can also secrete proteins that help other types of stem cells differentiate into mature cells.
“They act like drug factories,” Van Vliet says. “They can become tissue lineage cells, but they also pump out a lot of factors that change the environment that the hematopoietic stem cells are operating in.”
When cancer patients receive a stem cell transplant, they usually receive only HSPCs, which can become blood cells. Van Vliet’s team has shown previously that when mice are also given MSCs, they recover faster. However, in a given population of MSCs, usually only about 20 percent produce the factors that are needed to stimulate blood cell growth and bone marrow recovery.
“Left to their own devices in the current state-of-the-art culture environments, MSCs become heterogeneous and they all express a variety of factors,” Van Vliet says.
In an earlier study, Van Vliet and her SMART colleagues showed that they could sort MSCs with a special microfluidic device that can identify the 20 percent that promote blood cell growth. However, she and her students wanted to improve on that by finding a way to stimulate an entire population of MSCs to produce the necessary factors.
To do that, they first had to discover which factors were the most important. They showed that while many factors contribute to blood cell differentiation, secretion of a protein called osteopontin was most highly correlated with better survival rates in mice treated with MSCs.
The researchers then explored the idea of “mechanopriming” the cells so that they would produce more of the necessary factors. Over the past decade, Van Vliet and other researchers have shown that varying the mechanical properties of surfaces on which stem cells are grown can affect their differentiation into mature cell types. However, in this study, for the first time, she showed that mechanical properties can also affect the factors that stem cells secrete before committing to a specific tissue cell lineage.
Usually, stem cells removed from the body are grown on a flat sheet of glass or stiff plastic. The MIT team decided to try growing the cells on a polymer called PDMS and to vary its mechanical properties to see how that would affect the cells. They designed materials that varied in both their stiffness and their viscosity, which is a measure of how quickly the material stretches when stress is applied.
The researchers found that MSCs grown on materials with mechanical properties most similar to that of bone marrow produced the greatest number of the factors necessary to induce HSPCs to differentiate into mature blood cells.
The researchers then tested their specially grown MSCs by implanting them into mice that had had their bone marrow irradiated. Even though they did not implant any HSPCs, this treatment quickly repopulated the animals’ blood cells and helped them to recover more quickly than mice treated with MSCs grown on traditional glass surfaces. They also recovered faster than mice treated with the factor-producing MSCs that were selected by the microfluidic sorting device.
“The mouse studies were models of radiation therapy commonly used to kill cancer cells in the clinic. However, these therapies are highly destructive and destroy healthy cells as well,” Liu says. “Our mechanoprimed MSCs can help to better support and regenerate those healthy bone marrow cells faster in these mouse models, and we hope the same results would translate to humans.”
“Illustrating how mechanopriming of mesenchymal stem cells can be exploited to improve on hematopoietic recovery is of huge medical significance,” says Viola Vogel, chair of the Department of Health Sciences and Technology at ETH Zurich, who was not involved in the research. “It also sheds light onto how to utilize their approach to perhaps take advantage of other cell subpopulations for therapeutic applications in the future.”
Van Vliet’s lab is now performing more animal studies in hopes of developing a combination treatment of MSCs and HSPCs that could be tested in humans.
“You can’t survive with a low blood cell count for very long,” she says. “If you’re able to get your complete blood cell count up to normal levels faster, you have a much better prognosis for speed of recovery.”
The researchers also hope to study whether mechanopriming can induce MSCs to produce different factors that would stimulate the development of additional cell types that could be useful for treating other diseases.
“You could imagine that by changing their culture environment, including their mechanical environment, MSCs could be used for administration to target several other diseases,” such as Parkinson’s disease, rheumatoid arthritis, and others, Van Vliet says.
The research was funded by the BioSystems and Micromechanics Interdisciplinary Research Group of the Singapore-MIT Alliance for Research and Technology (SMART), through the Singapore National Research Foundation, and the National Institutes of Health.
Infinite Cooling, a company that has developed a technology to capture and reuse water evaporating from cooling towers at power plants, was one of two local startups to be named a $100K Diamond Winner at the MassChallenge Awards, which took place Oct. 17 at the Boston Convention and Exhibition Center. Co-founders Kripa Varanasi, associate professor of mechanical engineering, and postdocs Maher Damak PhD ’18 and Karim Khalil PhD ’18 accepted the award along with COO Derek Warnick.
MassChallenge is a non-profit organization that supports and connects entrepreneurs and startups. Now in its ninth year, the MassChallenge Awards highlight up-and-coming startups in the Boston area. At the event last week, $1 million in prizes were given to startups with the most potential for impact. Infinite Cooling was one of the more than 150 startups spotlighted at the event.
“Receiving the top prize at the MassChallenge Awards was an incredible honor for the Infinite Cooling team and will greatly help us towards our commercialization goals for the technology and solve a major problem at the Energy-Water nexus,” says Varanasi.
The technology behind Infinite Cooling has the potential to substantially reduce the water consumption by power plants and even provide a low-cost source of drinking water. “Our system can very efficiently capture the plumes that would normally be released into the atmosphere through cooling towers at power plants. By recycling this water we provide significant water savings for the power plant and community,” says Damak.
“Our device not only reduces water consumption at power plants, it also reduces costly water-treatment requirements since the recondensed water droplets are pure,” explains Khalil. This clean water could be used for drinking water or recycled for pure water uses at the power plant itself.
Since power plants are one of the largest users of freshwater — about 39 percent of freshwater is earmarked for power plants in the U.S. alone — Infinite Cooling’s technology holds the potential to make a massive impact on water and resource conservation. Their research was supported in part by the Tata Center for Technology and Design, and the prototype was supported, in part, by the Office of Sustainability and is currently being tested at MIT’s Central Utility Plant.
Tiny robots no bigger than a cell could be mass-produced using a new method developed by researchers at MIT. The microscopic devices, which the team calls “syncells” (short for synthetic cells), might eventually be used to monitor conditions inside an oil or gas pipeline, or to search out disease while floating through the bloodstream.
The key to making such tiny devices in large quantities lies in a method the team developed for controlling the natural fracturing process of atomically thin, brittle materials, directing the fracture lines so that they produce minuscule pockets of a predictable size and shape. Embedded inside these pockets are electronic circuits and materials that can collect, record, and output data.
The novel process, called “autoperforation,” is described in a paper published today in the journal Nature Materials, by MIT Professor Michael Strano, postdoc Pingwei Liu, graduate student Albert Liu, and eight others at MIT.
The system uses a two-dimensional form of carbon called graphene, which forms the outer structure of the tiny syncells. One layer of the material is laid down on a surface, then tiny dots of a polymer material, containing the electronics for the devices, are deposited by a sophisticated laboratory version of an inkjet printer. Then, a second layer of graphene is laid on top.
People think of graphene, an ultrathin but extremely strong material, as being “floppy,” but it is actually brittle, Strano explains. Rather than considering that brittleness a problem, the team figured out that it could be used to their advantage.
“We discovered that you can use the brittleness,” says Strano, who is the Carbon P. Dubbs Professor of Chemical Engineering at MIT. “It's counterintuitive. Before this work, if you told me you could fracture a material to control its shape at the nanoscale, I would have been incredulous.”
But the new system does just that. It controls the fracturing process so that rather than generating random shards of material, like the remains of a broken window, it produces pieces of uniform shape and size. “What we discovered is that you can impose a strain field to cause the fracture to be guided, and you can use that for controlled fabrication,” Strano says.
When the top layer of graphene is placed over the array of polymer dots, which form round pillar shapes, the places where the graphene drapes over the round edges of the pillars form lines of high strain in the material. As Albert Liu describes it, “imagine a tablecloth falling slowly down onto the surface of a circular table. One can very easily visualize the developing circular strain toward the table edges, and that’s very much analogous to what happens when a flat sheet of graphene folds around these printed polymer pillars.”
As a result, the fractures are concentrated right along those boundaries, Strano says. “And then something pretty amazing happens: The graphene will completely fracture, but the fracture will be guided around the periphery of the pillar.” The result is a neat, round piece of graphene that looks as if it had been cleanly cut out by a microscopic hole punch.
Because there are two layers of graphene, above and below the polymer pillars, the two resulting disks adhere at their edges to form something like a tiny pita bread pocket, with the polymer sealed inside. “And the advantage here is that this is essentially a single step,” in contrast to many complex clean-room steps needed by other processes to try to make microscopic robotic devices, Strano says.
The researchers have also shown that other two-dimensional materials in addition to graphene, such as molybdenum disulfide and hexagonal boron nitride, work just as well.
Ranging in size from that of a human red blood cell, about 10 micrometers across, up to about 10 times that size, these tiny objects “start to look and behave like a living biological cell. In fact, under a microscope, you could probably convince most people that it is a cell,” Strano says.
This work follows up on earlier research by Strano and his students on developing syncells that could gather information about the chemistry or other properties of their surroundings using sensors on their surface, and store the information for later retrieval, for example by injecting a swarm of such particles into one end of a pipeline and retrieving them at the other to gain data about conditions inside it. While the new syncells do not yet have as many capabilities as the earlier ones, those were assembled individually, whereas this work demonstrates a way of easily mass-producing such devices.
Apart from the syncells’ potential uses for industrial or biomedical monitoring, the way the tiny devices are made is itself an innovation with great potential, according to Albert Liu. “This general procedure of using controlled fracture as a production method can be extended across many length scales,” he says. “[It could potentially be used with] essentially any 2-D materials of choice, in principle allowing future researchers to tailor these atomically thin surfaces into any desired shape or form for applications in other disciplines.”
This is, Albert Liu says, “one of the only ways available right now to produce stand-alone integrated microelectronics on a large scale” that can function as independent, free-floating devices. Depending on the nature of the electronics inside, the devices could be provided with capabilities for movement, detection of various chemicals or other parameters, and memory storage.
There are a wide range of potential new applications for such cell-sized robotic devices, says Strano, who details many such possible uses in a book he co-authored with Shawn Walsh, an expert at Army Research Laboratories, on the subject, called “Robotic Systems and Autonomous Platforms,” which is being published this month by Elsevier Press.
As a demonstration, the team “wrote” the letters M, I, and T into a memory array within a syncell, which stores the information as varying levels of electrical conductivity. This information can then be “read” using an electrical probe, showing that the material can function as a form of electronic memory into which data can be written, read, and erased at will. It can also retain the data without the need for power, allowing information to be collected at a later time. The researchers have demonstrated that the particles are stable over a period of months even when floating around in water, which is a harsh solvent for electronics, according to Strano.
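The write, read, and erase behavior described above can be mimicked with a toy model. This is an illustration only: the class, the per-letter encoding, and the use of Python floats for “conductivity levels” are all invented for this sketch and bear no relation to the actual device physics.

```python
# Toy model only: names and encoding here are invented for illustration;
# the real syncell memory stores data as material conductivity states.
class ConductivityMemory:
    """A bank of cells, each holding one writable conductivity level."""

    ERASED = 0.0  # level representing an erased (blank) cell

    def __init__(self, n_cells):
        self.levels = [self.ERASED] * n_cells

    def write(self, index, level):
        self.levels[index] = level  # set the cell's conductivity level

    def read(self, index):
        return self.levels[index]   # probing does not alter the state

    def erase(self, index):
        self.levels[index] = self.ERASED

# Store the letters M, I, T as three distinct levels (arbitrary encoding),
# then read them back, echoing the team's demonstration.
mem = ConductivityMemory(3)
for i, letter in enumerate("MIT"):
    mem.write(i, float(ord(letter)))
recovered = "".join(chr(int(mem.read(i))) for i in range(3))
print(recovered)  # MIT
```

The key property the sketch captures is that reading is non-destructive and the stored levels persist until explicitly erased, mirroring the particles’ ability to retain data without power.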
“I think it opens up a whole new toolkit for micro- and nanofabrication,” he says.
Daniel Goldman, a professor of physics at Georgia Tech, who was not involved with this work, says, “The techniques developed by Professor Strano’s group have the potential to create microscale intelligent devices that can accomplish tasks together that no single particle can accomplish alone.”
In addition to Strano, Pingwei Liu, who is now at Zhejiang University in China, and Albert Liu, a graduate student in the Strano lab, the team included MIT graduate student Jing Fan Yang, postdocs Daichi Kozawa, Juyao Dong, and Volodymyr Koman, Youngwoo Son PhD ’16, research affiliate Min Hao Wong, and Dartmouth College student Max Saccone and visiting scholar Song Wang. The work was supported by the Air Force Office of Scientific Research, and the Army Research Office through MIT’s Institute for Soldier Nanotechnologies.
Spectrometers — devices that distinguish different wavelengths of light and are used to determine the chemical composition of everything from laboratory materials to distant stars — are typically large instruments with six-figure price tags, and they tend to be found in large university and industry labs or observatories.
A new advance by researchers at MIT could make it possible to produce tiny spectrometers that are just as accurate and powerful but could be mass produced using standard chip-making processes. This approach could open up new uses for spectrometry that previously would have been physically and financially impossible.
The invention is described today in the journal Nature Communications, in a paper by MIT associate professor of materials science and engineering Juejun Hu, doctoral student Derek Kita, research assistant Brando Miranda, and five others.
The researchers say this new approach to making spectrometers on a chip could provide major advantages in performance, size, weight, and power consumption, compared to current instruments.
Other groups have tried to make chip-based spectrometers, but there is a built-in challenge: A device’s ability to spread out light based on its wavelength, using any conventional optical system, is highly dependent on the device’s size. “If you make it smaller, the performance degrades,” Hu says.
Another type of spectrometer uses a mathematical approach called a Fourier transform. But these devices are still limited by the same size constraint: Because high performance requires long, tunable optical path lengths, miniaturized spectrometers have traditionally been inferior to their benchtop counterparts.
Instead, “we used a different technique,” says Kita. Their system is based on optical switches, which can instantly flip a beam of light between optical pathways of different lengths. These all-electronic switches eliminate the need for the movable mirrors required in current versions, and they can easily be fabricated using standard chip-making technology.
By eliminating the moving parts, Kita says, “there’s a huge benefit in terms of robustness. You could drop it off the table without causing any damage.”
By using path lengths in power-of-two increments, these lengths can be combined in different ways to replicate an exponential number of discrete lengths, thus leading to a potential spectral resolution that increases exponentially with the number of on-chip optical switches. It’s the same principle that allows a balance scale to accurately measure a broad range of weights by combining just a small number of standard weights.
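The power-of-two scheme works like binary counting: each switch either adds its segment to the path or bypasses it. A minimal Python sketch (the unit increment is arbitrary, since the article gives no on-chip dimensions) shows how n switches yield 2**n distinct total path lengths:

```python
from itertools import product

def reachable_lengths(n_switches, unit=1):
    """Total optical path lengths reachable with n binary switches,
    where switch i adds either 0 or unit * 2**i to the path."""
    segments = [unit * 2 ** i for i in range(n_switches)]
    return {
        sum(bit * seg for bit, seg in zip(setting, segments))
        for setting in product((0, 1), repeat=n_switches)
    }

print(len(reachable_lengths(6)))   # 64 — as in the six-switch prototype
print(len(reachable_lengths(10)))  # 1024 — with 10 switches
```

Because every switch setting produces a different total length, the number of usable spectral channels grows exponentially with a linear increase in hardware, which is the advantage the balance-scale analogy describes.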
As a proof of concept, the researchers contracted an industry-standard semiconductor manufacturing service to build a device with six sequential switches, producing 64 spectral channels, with built-in processing capability to control the device and process its output. By expanding to 10 switches, the resolution would jump to 1,024 channels. They designed the device as a plug-and-play unit that could be easily integrated with existing optical networks.
The team also used new machine-learning techniques to reconstruct detailed spectra from a limited number of channels. The method they developed works well to detect both broad and narrow spectral peaks, Kita says. They were able to demonstrate that the device’s performance did indeed match their calculations, which opens up a wide range of potential further development for various applications.
The researchers say such spectrometers could find applications in sensing devices, materials analysis systems, optical coherent tomography in medical imaging, and monitoring the performance of optical networks, upon which most of today’s digital networks rely. Already, the team has been contacted by some companies interested in possible uses for such microchip spectrometers, with their promise of huge advantages in size, weight, and power consumption, Kita says. There is also interest in applications for real-time monitoring of industrial processes, Hu adds, as well as for environmental sensing for industries such as oil and gas.
This work “is a very interesting approach, as it enables realizing a high-resolution spectrometer on a small footprint,” says Gunther Roelkens, a professor at Ghent University in Belgium, who was not connected to this research. “This device enables applications such as on-chip spectroscopic sensors, which is a hot research topic.”
“The challenge for future research will be to extend the wavelength coverage while maintaining the same resolution,” Roelkens adds. “Also, addressing different wavelength bands will enable many new applications.”
The team also included MIT undergraduate David Favela, graduate student Jérôme Michon, former postdoc Hongtao Lin, research scientist Tian Gu, and staff member David Bono. The research was supported by the National Science Foundation, MIT SENSE.nano, the U.S. Department of Energy, and the Saks Kavanaugh Foundation.
HUBweek in Kendall Square — it’s become a pretty good bet. And the festivities on Oct. 9 didn’t disappoint.
Now in its fourth year, HUBweek is a “festival of the future” that celebrates science, art, and technology. MIT is a founding sponsor, along with Harvard University, The Boston Globe, and Massachusetts General Hospital. This year’s theme was “We the Future.” The Kendall Square/MIT Innovation Playground and 314 Main Street Ground-breaking showcased the Institute’s innovative spirit right in the historic heart and future hub of MIT and Kendall Square.
The day began with a sold-out MIT Club of Boston event called “Inside the Dome: MIT and the Future of Kendall Square.” Israel Ruiz, MIT’s executive vice president and treasurer, joined with Katie Rae, CEO and managing partner of The Engine; Steve Marsh, managing director of real estate; Elisabeth Reynolds, executive director of the MIT Industrial Performance Center; Muriel Médard, the Cecil H. Green Professor of Electrical Engineering and Computer Science; and over 200 attendees to discuss MIT’s role in the transformation of Kendall Square and their collective vision for its future.
The tone for the rest of the day was set by the energetic buzz at the eight-hour-long “Innovation Playground.” The event was one of MIT’s HUBweek “Open Doors” offerings and took place inside the Institute’s graduate student housing construction site in the middle of Kendall Square.
The raw space, which will one day house a public food hall, was a perfect backdrop for a lineup of activities including digital portals to people around the world, a life-size coloring book, a learn-how-to-DJ station, a make-your-own unity print station, digital graffiti, augmented and virtual reality, a bank of old-fashioned phones sharing snippets of history, a Boeing autonomous vehicle exhibit, a GIF booth, a collection station for a time capsule, an open space brainstorming station, and an MIT-designed interactive art piece, “Flow,” created by MIT alumnus Karl Sims.
About 650 people came through the Innovation Playground during the course of the afternoon to play, explore, and socialize. Attendees say the experience felt part nightclub, part museum, and part coffee shop. Some stayed for hours.
In the evening, over 200 people joined together for an unconventional ground-breaking ceremony for the Institute’s 314 Main Street building, a key component of MIT’s Kendall Square Initiative. The building will be the future home of the new MIT Museum, the Boeing Aerospace and Autonomy Center, and other innovative tenants. Some attendees watched the speaking program while others continued to play games, and about 60 neighborhood children and youth from the nearby Margaret Fuller Neighborhood House, East End House, Innovators4Purpose, and Community Art Center participated enthusiastically in the proceedings. When asked why MIT likes the 314 Main Street address, one child told the crowd that she knew it was the number representing pi, or π.
“I haven’t been to all that many ground-breakings,” said MIT Museum Director John Durant, “but this event is a mash-up, and I’m so happy about that. It reflects the spirit and philosophy of the museum.”
MIT Provost Martin Schmidt described the surrounding context for the 314 Main Street building. It will be located next to a redesigned MBTA head house in the new MIT/Kendall gateway area, and across from the Institute’s new graduate student residence hall, now under construction. A “new front door” will lead to an inviting open space that will be programmed to showcase the innovative spirit of MIT. “When it’s all done, no one will ever come up from the MBTA again and ask, ‘Where is MIT?’”
Schmidt also summarized MIT’s long-term relationship with Boeing: “The MIT/Boeing relationship is over a century old — we have been problem-solving and advancing technologies together for a long time. In this new chapter of our relationship, Boeing will be focused on autonomous vehicles and artificial intelligence-enabled air travel. We are thrilled that they have decided to join us here in Kendall.”
Boeing Chief Technology Officer Greg Hyslop said: “We at Boeing are again embracing our inner startup. We’ve started an organization called NeXt, and we’re pushing forward with a new breed of autonomous vehicles and new concepts in AI-enabled airspace. We see fundamental changes coming in how people and goods move from place to place.”
Aurora Flight Sciences, which is part of the Boeing Company, will be moving its Kendall-based research and development center into the new building. John Langford, Aurora Flight Sciences founder, chief executive officer, and MIT alumnus, said: “Today, Aurora’s Kendall Square team is already building innovative autonomous systems. By expanding Aurora’s 30-year relationship with MIT, and working with Boeing, we are creating a collaborative space where engineers, students, and researchers can work together to create technologies that will define the next century of air mobility.”
Museum membership free for Cambridge residents
At the start of the ground-breaking program, Sarah Gallop, MIT’s co-director of government and community relations, invited Cambridge Mayor Marc McGovern to join her and Durant on stage.
Durant announced that membership to the MIT Museum will be free to all Cambridge residents once it opens its doors in 2021. “Fantastic!” said the mayor. “That is very good and welcome news for our whole community. Thank you, MIT!”
Main Street’s “Walls of Unity” mural
The Community Art Center (CAC), located in the Cambridge neighborhood known as “The Port,” runs youth programs that create community-sourced public art installations. MIT has worked with CAC youth in the Teen Media Program on several murals that have been displayed at MIT construction sites over the past few years. The most recent creation is a 360-foot “Walls of Unity” mural that stretches along Main Street in front of the graduate student housing construction site and 314 Main Street. The mural, part of the CAC’s “Creative Current” initiative, depicts images of unity that have been created by young artists in the organization’s apprenticeship program and members of the MIT and Cambridge communities.
CAC Executive Director Eryn Johnson shared her perspective on the unique endeavor. “We are grateful to MIT for generously partnering with us. It’s extremely meaningful for our young artists to have their creations featured so prominently in Kendall Square.” William Gallop, one of the CAC apprentices, said: “I never thought I’d see my work displayed in public like that. It makes me feel empowered to be able to put my artistic voice towards good purposes.”
Mayor McGovern said, “William, thank you — keep on doing what you’re doing! And Eryn Johnson, you are a stalwart leader and role model in our community.”
314 Main Street
Architects Weiss/Manfredi and Perkins+Will designed the new 437,000-square-foot commercial lab building at 314 Main Street, and Turner Construction Company is constructing the facility. Expected to be completed in 2020, the 250-foot-high building will house the MIT Press Bookstore, a restaurant, and a café, in addition to the MIT Museum and Boeing. Other forward-looking companies are expected to sign on as tenants in the coming months.
Höweler + Yoon Architecture designed the interior museum space, which will be located on the first, second, and third floors and will include exhibition and public programs space, classrooms, a store, and space for administration and collections research.
Before she was a PhD student searching through art history archives around the world, a young Nisa Ari attended museums with her family and tried to discern the histories behind the artwork and artifacts she saw. “It always had the appeal of detective work,” Ari says. Sometimes, when she’d walk into a new gallery, she’d challenge herself to identify artists from paintings at a distance: “That’s a Cézanne, that’s a Picasso, that’s a Léger.”
When visiting her father’s family in Istanbul, the Colorado native would sit and sketch on the porch with her grandfather, an artist. Ari’s interests included the performing arts as well. “I was a singer,” Ari says. “That was my life, but I was really academically minded, too. So when it came time to go to college, I applied to music conservatories as well as [universities].” Ari decided to attend Stanford University — with the intention to move to New York City and live as a performer after completing her undergraduate degree.
She enrolled in an art history class at Stanford during her first year — a decision which led her to major in art history. Following graduation, Ari spent five years auditioning for roles and performing in musical theater productions around the country. But she had a longing to return to the art world, and worked as an assistant director at the Elizabeth Foundation for the Arts in New York City. “That experience was good — it made me realize that I was not willing to totally commit myself to a life of performing,” Ari says. “There was a moment when I thought, it’s time to do a PhD.”
Now, Ari is a doctoral candidate in the History, Theory, and Criticism of Architecture and Art program at MIT, and studies the development of arts and crafts in Palestine from 1876 to 1948. Her transition back into academia was “relatively painless,” she says — and she still takes cues from her training as a performer to work through her dissertation. “I like to say I tap dance through grad school — I feel like I use the skills [gained in New York] all the time in what I do now.”
Ari was drawn to the History, Theory, and Criticism of Architecture and Art program because of its strength in modern Middle Eastern art history. Her early research was rooted in the 1990s — a period when prominent nonprofits that supported the arts arrived in the Middle East. With guidance from co-advisors Nasser Rabbat, the director of the Aga Khan Program for Islamic Architecture at MIT, and Caroline Jones, a professor of art history, Ari zeroed in on the idea of studying the development of Palestinian art during the 19th and 20th centuries. But it was a challenge: Three other students interested in Palestine had been stymied by a lack of access to artwork, the lack of a record of Palestinian artwork in archives, and the difficulties of working across the politics of the region. But Ari, who studied Arabic at Harvard University while enrolled at MIT and took summer research trips to Israel, the Palestinian territories, and Jordan through MISTI, was determined to follow through.
Ari’s fieldwork is “more comparable to that of an anthropologist or sociologist” and involves travel to Israel, the Palestinian territories, Lebanon, England, and Jordan. She looks through a plethora of documents that record the development of art by artists and social commentators alike: files, letters, photographs, postcards. Not all records are public — Ari often works to track down private collections as well. “There are several private Palestinian [art] collectors who live in Beirut, Lebanon. Those collections are more personal,” Ari says. While most of the artists Ari studies are now deceased, she sometimes comes across their descendants and listens to their stories.
“I’m very interested in social relationships and social networks in terms of how they affect artistic production,” Ari says. “For me, the proof is always in the pudding. I always have to be able to still see it in the artwork to understand those relationships.”
Between the end of the Ottoman empire in the early 20th century and the close of the British Mandate in 1948, Palestine underwent widespread political and social changes. Ari focuses on notable shifts in the region’s artwork and how art was used, from religious purposes to commercial and political ones. “I’m trying to understand how changes in art happened in Palestine as a result of this major political and social upheaval,” she says.
Informing research methods
Ari is also well aware of the political and social tensions stemming from the Arab-Israeli conflict in the countries she visits to perform her research. “It’s tough because it’s such an active situation, and people’s lives really are at stake,” Ari says. “There’s a kind of emotional labor involved in that there’s a lot of code switching that I have to do when I’m talking to an Israeli at the Central Zionist Archives in Jerusalem versus when I’m meeting with a Palestinian art collector in Bethlehem. While my project stays the same, my approach has to change.”
In some cases, Ari has to meet with curators and archivists multiple times before gaining access to archives: “Understandably, for some there’s a real fear about how you’re going to use the material.”
“I’ve found that honesty is the best policy and just to present myself as best I can, because the whole purpose of the project is to preserve nuance where so much of it has been lost because of contemporary politics,” Ari says. “I think it really helps that I did not start from a place of politics for this particular project.”
Ari cites her own heritage — her father’s family is orthodox Muslim and her mother’s is orthodox Jewish. “Having some background in both of these religious cultures has helped me to recognize the nuances of the situation. … My job is to be critical and deep as an art historian and not as a politician.”
“I remind myself that I’m making a choice to do this every day when I’m crossing the border [between Israel and the Palestinian territories] — other people who are there with me don’t have that choice,” Ari says.
Ari has also been co-editor of Thresholds, the MIT architecture department’s peer-reviewed journal; a dissertation fellow with the Mellon Foundation/American Council of Learned Societies; a research fellow with the Palestinian American Research Center; a dissertation fellow at Darat al Funun in Amman, Jordan; and the recipient of an international research grant from the Terra Foundation for American Art. She also received the Rhonda A. Saad Prize for Best Paper in Modern and Contemporary Art for a section from her dissertation, which was recently published in Arab Studies Journal.
After MIT, Ari hopes to apply for postdoctoral fellowships and teaching positions and turn her dissertation into a book, with the long-term intention of teaching. “I intended to teach from the start,” she says.
Professor Emeritus Sylvain Bromberger, a philosopher of language and of science who played a pivotal role in establishing MIT’s Department of Linguistics and Philosophy, died on Sept. 16 in Cambridge, Massachusetts. He was 94.
A faculty member for more than 50 years, Bromberger helped found the department in 1977 and headed the philosophy section for several years. He officially retired in 1993 but remained very active at MIT until his death.
Kindness and intellectual generosity
“Although he officially retired 25 years ago, Sylvain was an active and valued member of the department up to the very end,” said Alex Byrne, head of the Department of Linguistics and Philosophy. “He made enduring contributions to philosophy and linguistics, and his colleagues and students were frequent beneficiaries of his kindness and intellectual generosity. He had an amazing life in so many ways, and MIT is all the better for having been a part of it.”
Paul Egré, director of research at the French National Center for Scientific Research (CNRS) and a former visiting scholar at MIT, said, “Those of us who were lucky enough to know Sylvain have lost the dearest of friends, a unique voice, a distinctive smile and laugh, someone who yet seemed to know that life is vain and fragile in unsuspected ways, but also invaluable in others.”
Enduring contribution to fundamental issues about knowledge
Bromberger’s work centered largely on fundamental issues in epistemology, namely the theory of knowledge and the conditions that make knowledge possible or impossible. During the course of his career, he devoted a substantial part of his thinking to an examination of the ways in which we come to apprehend unsolved questions. His research in the philosophy of linguistics, carried out in part with the late Institute Professor Morris Halle of the linguistics section, included investigations into the foundations of phonology and of morphology.
Born in 1924 in Antwerp to a French-speaking Jewish family, Bromberger escaped the German invasion of Belgium with his parents and two brothers on May 10, 1940. After reaching Paris, then Bordeaux, his family obtained one of the last visas issued by the Portuguese consul Aristides de Sousa Mendes in Bayonne. Bromberger later dedicated the volume of his collected papers "On What We Know We Don’t Know: Explanation, Theory, Linguistics, and How Questions Shape Them" (University of Chicago Press, 1992) to Sousa Mendes.
The family fled to New York, and Bromberger was admitted to Columbia University. However, he chose to join the U.S. Army in 1942, and he went on to serve three years in the infantry. He took part in the liberation of Europe as a member of the 405th Regiment, 102nd Infantry Division. He was wounded during the invasion of Germany in 1945.
After leaving the Army, Bromberger studied physics and the philosophy of science at Columbia University, earning his bachelor’s degree in 1948. He received his PhD in philosophy from Harvard University in 1961.
Research and teaching at MIT
He served on the philosophy faculties at Princeton University and at the University of Chicago before joining MIT in 1966. Over the years, he trained many generations of MIT students, teaching alongside such notables as Halle, Noam Chomsky, Thomas Kuhn, and Ken Hale.
In the early part of his career, Bromberger focused on critiquing the so-called deductive-nomological model of explanation, which says that to explain a phenomenon is to deductively derive the statement reporting that phenomenon from laws (universal generalizations) and antecedent conditions. For example, we can explain that this water boils by deriving that fact from the law that all water boils at 100 degrees Celsius and the antecedent condition that the temperature of this water was raised to exactly 100 C.
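In schematic form, the deductive-nomological (Hempel-Oppenheim) model treats an explanation as a valid deduction of the statement to be explained from laws together with antecedent conditions:

```latex
% Laws L_1..L_k plus antecedent conditions C_1..C_m
% deductively entail the explanandum E.
\[
  L_1, \ldots, L_k,\; C_1, \ldots, C_m \;\vdash\; E
\]
```

In the water example, the law L is “all water boils at 100 C,” the condition C is “this water was heated to 100 C,” and the explanandum E is “this water boils.”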
An influential article: Why-questions
One simple though key observation made by Bromberger in his analysis was that we may not only explain that the water boils at 100 C, but also how it boils, and even why it boils when heated up. This feature gradually led Bromberger to think about the semantics and pragmatics of questions and their answers.
Bromberger’s 1966 “Why-questions” paper was probably his most influential article. In it, he highlights the fact that most scientifically valid questions put us at first in a state in which we know all actual answers to the question to be false, but in which we can nevertheless recognize the question to have a correct answer (a state he calls “p-predicament,” with “p” for “puzzle”). According to Bromberger, why-questions are particularly emblematic of this state of p-predicament, because in order to ask a why-question rationally, a number of felicity conditions (or presuppositions) must be satisfied, which are discussed in his work.
The paper had an influence on later accounts of explanation, notably Bas van Fraassen’s discussion of the semantic theory of contrastivism in his book "The Scientific Image" (to explain a phenomenon is to answer a why-question with a contrast class in mind). Still today, why-questions are recognized as questions whose semantics is hard to specify, in part for reasons Bromberger discussed.
In addition to investigating the syntactic, semantic, and pragmatic analysis of interrogatives, Bromberger also immersed himself in generative linguistics, with a particular interest in generative phonology, and the methodology of linguistic theory, teaching a seminar on the latter with Thomas Kuhn.
A lifelong engagement with new ideas
In 1993, the MIT Press published a collection of essays in linguistics to honor Bromberger on the occasion of his retirement. "The View From Building 20," edited by Ken Hale and Jay Keyser, featured essays by Chomsky, Halle, Alec Marantz, and other distinguished colleagues.
In 2017, Egré and Robert May put together a workshop honoring Bromberger at the Ecole Normale Supérieure in Paris. Talks there centered on themes from Bromberger’s work, including metacognition, questions, linguistic theory, and problems concerning word individuation.
Tributes were read, notably this one from Chomsky, who used to take walks with Bromberger when they taught together:
“Those walks were a high point of the day for many years … almost always leaving me with the same challenging question: Why? Which I’ve come to think of as Sylvain’s question. And leaving me with the understanding that it is a question we should always ask when we have surmounted some barrier in inquiry and think we have an answer, only to realize that we are like mountain climbers who think they see the peak but when they approach it find that it still lies tantalizingly beyond.”
Egré noted that even when Bromberger was in his 90s, he had a “constant appetite for new ideas. He would always ask what your latest project was about, why it was interesting, and how you would deal with a specific problem. His hope was that philosophy, linguistics, and the brain sciences would eventually join forces to uncover unprecedented dimensions of the human mind, erasing at least some of our ignorance.”
Bromberger’s wife of 64 years, Nancy, died in 2014. He is survived by two sons, Allen and Daniel; and three grandchildren, Michael Barrows, Abigail Bromberger, and Eliza Bromberger.
Written by Paul Egré and Kathryn O’Neill, with contributions from Daniel Bromberger, Allen Bromberger, Samuel Jay Keyser, Robert May, Agustin Rayo, Philippe Schlenker, and Benjamin Spector
Political pundits are usually confident about their ability to identify why citizens think the way they do. Look at cable television or the internet, and you’ll find someone attributing an election result to economic anxiety, or claiming the latest polling numbers reflect a recent news story.
Teppei Yamamoto has his doubts.
Yamamoto is an associate professor in MIT’s Department of Political Science and an expert in the methodology used by researchers. More specifically, he zeroes in on questions of cause and effect. Yamamoto tries to be as specific as possible: What actually causes the effects we see?
A lot of the time, Yamamoto believes, what might seem to be a single cause actually consists of multiple things tangled together. His work frequently finds ways to separate out these “treatment components,” as he calls them, from one another.
“Even if you look at a clean study, the [outcome] is usually a composite of multiple components, any one of which could be actually causing the effect,” Yamamoto says.
Consider immigration policy. When people oppose immigration, why is that?
“Is it the ethnicity or country of origin that’s changing the effects, or is it something else, like perception of education?” Yamamoto asks. “If people think of immigration [in terms of] people with lower education flooding [into] the United States, they may not like that, as opposed to [immigrants’] nationality itself.”
Then again, it could be perceptions of ethnicity that influence people’s stance on immigration. To scrutinize such things, Yamamoto often uses a method known as “conjoint analysis,” which he has helped introduce to political science. Conjoint analysis examines different combinations of variables to evaluate their relative power; applying it carefully to a political survey, for example, can help ensure researchers are getting inside the minds of voters.
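As a rough illustration of the idea behind conjoint analysis (this is a toy simulation with invented numbers, not Yamamoto's actual study design or data), one can randomize profile attributes independently and then estimate each attribute's average marginal component effect as a simple difference in mean support:

```python
import random

random.seed(0)

# Hypothetical conjoint survey: respondents rate immigrant profiles whose
# attributes (country of origin, education) are randomized independently.
countries = ["A", "B"]
educations = ["low", "high"]

def respond(country, education):
    # Assumed ground truth for the simulation: only education drives support.
    base = 0.5 + (0.2 if education == "high" else -0.2)
    return 1 if random.random() < base else 0

profiles = [(random.choice(countries), random.choice(educations))
            for _ in range(20000)]
responses = [respond(c, e) for c, e in profiles]

def amce(attr_index, level_a, level_b):
    # Average marginal component effect: difference in mean support between
    # two levels of one attribute, marginalizing over the other attribute.
    def mean_support(level):
        rated = [r for (p, r) in zip(profiles, responses)
                 if p[attr_index] == level]
        return sum(rated) / len(rated)
    return mean_support(level_a) - mean_support(level_b)

print(f"education effect: {amce(1, 'high', 'low'):.2f}")  # near 0.4
print(f"country effect:   {amce(0, 'A', 'B'):.2f}")       # near 0.0
```

Because the attributes are randomized independently, the education effect is recovered cleanly even though both attributes vary at once, which is exactly the untangling of "treatment components" described above.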
For his research and teaching, Yamamoto was awarded tenure earlier in 2018 at MIT, where he has carved out a valued role as a thinker grappling with highly relevant issues.
The long road to Cambridge
Yamamoto grew up in the city of Maebashi, the regional capital of Japan’s Gunma prefecture, about 70 miles from Tokyo. But his road to academia in the U.S. went via another place altogether: Thailand, where his family spent two years when Yamamoto was a child, from age 7 to age 9. Yamamoto’s father, an engineer, worked in Japan’s development aid agency.
“I can say that developed my interest for living in other countries and looking at different cultures,” Yamamoto says. “I don’t think I was completely conscious of the politics, but I was interested in how things work in society. I was observing differences. That made a strong impact on me in terms of the kind of work I [later] wanted to do, and I wanted to go abroad instead of staying in Japan for my whole life.”
By the time he reached high school back in Japan, Yamamoto wanted to become a diplomat. But at the University of Tokyo, where he studied international relations, Yamamoto became increasingly interested in academic research, something that was enhanced by studying abroad for a year at Oxford University. Soon Yamamoto was accepted into Princeton University’s PhD program in politics, where he began developing his present interests when reading papers for his seminars.
“I often found myself always focusing on the methodology: How does this paper prove what it says? I was interested in making arguments based on that,” Yamamoto says.
Moreover, he found a like-minded professor there, Kosuke Imai (now at Harvard University), who had the same kinds of interests in methodology, helped encourage Yamamoto’s skills, and was a “role model,” as Yamamoto puts it.
“He showed me how to be successful as a political methodologist, as someone who just came from a foreign country, specifically Japan, but he also showed me how to write a paper, how to find a methodological question,” Yamamoto says. “It was very important for me to see a complete path to success.”
Yamamoto joined the MIT faculty after completing his PhD at Princeton and has been at the Institute since 2011; in addition to his appointment as an associate professor, he is director of the Political Methodology Lab at MIT and a faculty affiliate of the Institute for Data, Systems, and Society (IDSS).
While Yamamoto’s methods work has been paired with studies of many particular political issues, he has also spent the last three years involved in a major MIT project about the effects of news media on political views. The project, funded by the National Science Foundation, is being conducted along with Yamamoto’s political science colleague Adam Berinsky, an expert on political falsehoods.
“The idea is that the American public has been polarized into two extremes in recent decades,” Yamamoto says. The project aims to see whether media outlets merely reflect this polarization or, as many observers have argued, actively contribute to it.
“That’s a plausible explanation, but we do want hard evidence,” Yamamoto says. To that end, he has carefully designed the group’s studies, an arrangement he calls “a division of labor that’s been working well.”
To dig away at the issue, the project’s experiments do not just randomly assign people to watch certain types of content, but also give some people their choice of content, so as to better evaluate the effects of media on a variety of people.
“In the real world, people are not randomly assigned to Fox News or MSNBC,” Yamamoto says. “We have to estimate the effect on people who would watch [a channel anyway], as opposed to the standard effects.”
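The logic of incorporating viewer choice can be sketched in a toy simulation (an invented example for illustration, not the project's actual design or data): observe each respondent's preferred channel first, then randomize actual viewing within the group that prefers a channel, so the effect can be estimated for the people who would watch it anyway.

```python
import random

random.seed(1)

N = 20000

def opinion(prefers_a, channel):
    # Assumed ground truth: channel A shifts opinion only among its fans.
    effect = 0.3 if (prefers_a and channel == "A") else 0.0
    return effect + random.gauss(0, 1)

# Step 1: elicit each respondent's preferred channel (True = prefers A),
# rather than ignoring preferences as a purely forced design would.
people = [random.random() < 0.6 for _ in range(N)]

# Step 2: randomly assign actual viewing *within* the channel-A fans, so the
# causal effect is identified for people who would watch that channel anyway.
fans = [p for p in people if p]
assignments = [(p, random.choice("AB")) for p in fans]
treated = [opinion(p, ch) for p, ch in assignments if ch == "A"]
control = [opinion(p, ch) for p, ch in assignments if ch == "B"]

effect_among_fans = (sum(treated) / len(treated)
                     - sum(control) / len(control))
print(f"estimated effect among would-be viewers: {effect_among_fans:.2f}")
```

With this seed and sample size, the estimate lands close to the assumed 0.3 effect among fans, while a design that ignored preferences would only recover an average diluted across people who would never tune in.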
The project is also tracking respondents over time to measure the effects of long-term exposure to certain news channels, for instance — both in terms of political opinions and subsequent viewing choices.
“We want to see, what’s the effect of exposing people to the same media, and what’s the effect of mixing them up,” Yamamoto says.
All told, as the leading methodologist in his department, Yamamoto is pleased at the way political science generally has been adopting more complex and subtle research strategies, which he thinks will help it obtain robust results.
“It has transformed things,” Yamamoto says. “A very advanced understanding of methodology is standard for all students in political science at MIT. I think that’s the trend in the discipline as well.” Then he pauses for the punchline: “But I can’t claim it’s my causal effect on things.”
Four MIT graduate students have been awarded 2018 United States Department of Energy (DoE) Computational Science Graduate Fellowships to address intractable challenges in science and engineering. Nationwide, MIT garnered the most fellowships out of this year’s 26 recipients.
The fellows receive full tuition and additional financial support, access to a network of alumni, and valuable practicum experience working in a DoE national laboratory. By supporting students like Kaley Brauer, Sarah Greer, William Moses, and Paul Zhang, the DoE aims to help train the next generation of computational scientists and engineers, incite collaboration and progress, and advance the future of the field by bringing more visibility to computational science careers.
Kaley Brauer is a graduate student in the Department of Physics. Her computational work in the Kavli Institute for Astrophysics and Space Research is uncovering new details about how galaxies form — including the origin of the Milky Way. Using high-performance computing simulations and theoretical models, she is identifying processes that underlie galaxy formation to learn more about properties of the early universe.
“You need a detailed model to turn back the clock and learn about how a galaxy evolved step by step,” Brauer says. “In a supercomputer, you can see how things move and make adjustments so that you end up with a galaxy that looks like the galaxy we see today. It’s really fun.”
Brauer says that while she originally wanted to be a scientific illustrator, an undergraduate cosmology class left her eager to learn more. Her current research allows her to combine her interest in both design and cosmology, and she hopes to focus her practicum on scientific visualization.
“I'm very excited that Kaley was chosen as a fellow,” says Anna Frebel, an associate professor of physics and Brauer’s advisor. “It enables her to do the type of computational research she’s most excited about: to study galaxy formation and understand the evolution of our Milky Way Galaxy.”
Sarah Greer is a graduate student in the Computational Science and Engineering (CSE)/Department of Mathematics PhD program. Greer’s undergraduate research in geoscience focused on seismic data processing and improving visualizations of the Earth’s subsurface. She intends to build on this work through her graduate research by using computational mathematics to address large-scale geophysical problems.
Greer says she is grateful for the opportunities that the fellowship affords, including a plan of study that encourages her to take risks.
“It has helped me go outside my comfort zone and find areas I’m interested in that I wouldn’t have explored otherwise,” Greer says. “I also really like that the practicum lab component gives us the chance to try something out and see if it’s the right career option.”
Greer’s advisor, Laurent Demanet, an associate professor of applied mathematics, noted that modern geophysics has benefited from interdisciplinary researchers like Greer, who bring fresh perspectives to longstanding challenges.
“Sarah’s impressive background is a rare blend of data/signal processing, computational mathematics, and Earth sciences,” Demanet says. “It was not a difficult decision to admit her in the new CSE-math PhD program at MIT, and we were all glad that the DoE felt the same way about awarding her this fellowship.”
William “Billy” Moses is a graduate student in the Department of Electrical Engineering and Computer Science. Moses also completed undergraduate and master's degrees at MIT in computer science and physics. His current research in the Computer Science and Artificial Intelligence Laboratory (CSAIL) focuses on performance engineering — strategies to improve ease of use, speed, and efficiency in computing. In addition to developing programs that write code, he works on programs called compilers that allow code to run on different machines.
“I really enjoy working on these problems,” Moses says. “Succeeding at them lets everyone take advantage of the latest advances in computer science without folks needing to spend five years in computer science graduate school.”
Moses described how the financial assistance and connections through the fellowship would support his research and career.
“I have the freedom to work on what I think is important, without necessarily searching for funding,” Moses says. “What really sets the DoE fellowship apart is the community it makes between the fellows and the national labs. Being a fellow in the program means that I have this wealth of resources out there for me.”
“Billy is the kind of student who makes MIT a great place for research,” says Moses’ advisor, Charles Leiserson, a professor of computer science and engineering. Leiserson says that Moses, as an undergraduate, received a best paper award at the 2017 Symposium on Principles and Practice of Parallel Programming for modifying a highly complicated, 4-million-line compiler — “a feat that seasoned compiler engineers deemed well-nigh impossible,” Leiserson says. “I'm delighted that he has chosen graduate school at MIT to continue his research.”
Paul Zhang is a graduate student in the Department of Electrical Engineering and Computer Science. Zhang conducts research in the geometric data processing group in CSAIL with his advisor, Justin Solomon, an assistant professor of electrical engineering and computer science.
The geometric data processing group works on geometric problems in computer graphics, machine learning, and computer vision. Zhang is currently studying hexahedral meshing, a longstanding challenge that involves decomposing objects into cube-like elements for use in fluid simulation.
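To give a sense of what a hexahedral mesh is as a data structure, the sketch below builds a structured hex mesh of a unit cube (vertices plus eight-vertex cell connectivity). This toy structured case is straightforward; the open research problem Zhang's group works on is producing such cube-like decompositions for arbitrary curved shapes.

```python
# Toy structured hexahedral mesh of the unit cube: nx*ny*nz cube-shaped cells.
def hex_mesh(nx, ny, nz):
    def vid(i, j, k):
        # Flatten a 3-D grid index into a single vertex id.
        return (i * (ny + 1) + j) * (nz + 1) + k

    vertices = [(i / nx, j / ny, k / nz)
                for i in range(nx + 1)
                for j in range(ny + 1)
                for k in range(nz + 1)]

    cells = []
    for i in range(nx):
        for j in range(ny):
            for k in range(nz):
                # Each hexahedral cell lists its 8 corner vertices:
                # bottom face (k), then top face (k+1).
                cells.append([
                    vid(i, j, k), vid(i + 1, j, k),
                    vid(i + 1, j + 1, k), vid(i, j + 1, k),
                    vid(i, j, k + 1), vid(i + 1, j, k + 1),
                    vid(i + 1, j + 1, k + 1), vid(i, j + 1, k + 1),
                ])
    return vertices, cells

verts, cells = hex_mesh(2, 2, 2)
print(len(verts), len(cells))  # 27 vertices, 8 hexahedral cells
```

Simulation codes (for fluids, for example) consume exactly this kind of vertex list plus per-cell connectivity; the difficulty lies in generating valid, well-shaped hexahedra for geometry that is not box-like.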
Zhang noted that the DoE fellowship provides important benefits beyond financial support. “In addition to the funding, it gives me the opportunity to meet other experts in my field,” Zhang says. “It also gives me opportunities to use national lab resources like supercomputers.”
Solomon says that in his time as a PhD student, Zhang “has already blown me away with his creativity and productivity — and he has achieved meaningful progress on some open research problems.”
“He is an obvious choice for this fellowship who will succeed in graduate school and become a top leader in the computational science community,” Solomon says.
Zhang and the 2018 cohort join the 10 other MIT students currently supported by the DoE Computational Science Graduate Fellowship Program. Administered by the Krell Institute and funded by the DoE’s Office of Science and the National Nuclear Security Administration, the fellowship program has supported more than 425 talented computational science students across the country since 1991.
Climate change, a surging population, and increasing demand for food, housing and natural resources present Africa and the world with extraordinary challenges.
On Sept. 24, numerous experts from diverse disciplines and areas of the world convened at MIT to discuss sustainable development in Africa. The conference was hosted by the Université Mohammed VI Polytechnique-MIT Research Program (UMRP), a collaboration with the Moroccan university (UM6P) led by MIT faculty director Elfatih A. B. Eltahir, the Breene M. Kerr Professor of Hydrology and Climate in the Department of Civil and Environmental Engineering.
UMRP, which launched in 2016, comprises six projects led by MIT faculty, each built around the dissertation research of an MIT graduate student. The UMRP researchers work closely with faculty and student colleagues from UM6P, who engage in complementary research.
The African Sustainability Conference provided a showcase for these projects, featuring presentations from MIT and UM6P faculty, researchers, and international experts on climate and water, sustainable urbanization, precision agriculture, smart chemistry, and industrial optimization for the phosphate industry. Group discussions related to critical challenges and potential opportunities within each area followed each session.
Eltahir began the conference by highlighting Africa’s significance for global sustainability, noting that the substantial yet uncertain effects of climate change are already noticeable in agricultural productivity and infrastructure throughout the continent. Projections show that by 2050, Africa’s population will double from 1 billion to 2 billion people, driving rapid urbanization.
“We are forging an honest collaboration between MIT and a like-minded research and education partner in Africa with the mission of advancing sustainability goals, while also helping build UM6P’s institutional capacity to lead by example on the continent,” said Eltahir.
Eltahir brings his background in hydrology and climate to his own UMRP research project, which focuses on improving water management and agricultural productivity in one of Morocco’s major river basins, the Oum-Er-Rbia watershed.
“Climate change is a major challenge for the world, especially concerning Africa. Morocco is a country that suffers from interannual rainfall variability. We are focused on looking for ways to improve management for water resources and availability,” explained Eltahir.
Morocco is highly vulnerable to heat waves and low precipitation, and those extremes are expected to intensify due to climate change. Eltahir’s research addresses these issues through a three-level modeling approach geared toward climatology and forecasting, hydrology, and operations in terms of agricultural planning and infrastructure.
He hopes the program will continue to grow, allowing for further collaboration between MIT and UM6P students and faculty. Furthermore, some of the tools, models, and processes being used in Morocco and greater Africa can be applied to other regions around the world that will face similar challenges due to climate change.
In addition to Eltahir, the workshop brought together MIT professors John Fernández of the Department of Architecture, Benedetto Marelli of the Department of Civil and Environmental Engineering, Paul Barton of the Department of Chemical Engineering, and Christopher Cummins and Yogesh Surendranath of the Department of Chemistry. Including UM6P colleagues, invited international experts, and MIT graduate students, the conference highlighted efforts to implement resilience, adaptability, and sustainability into the future of African cities.
John Fernández, director of MIT’s Environmental Solutions Initiative and professor of architecture, helped launch UMRP with a focus on the urgent need for long-term sustainability across society, the economy, and the climate.
Through comprehensive material accounting of the needs of Moroccan cities, Fernández will be developing specific technology and policy recommendations for UM6P, providing the country with a template for long-term urban sustainability.
“One of our goals is to produce a UMRP urban resource tool that would allow Morocco and greater Africa to access data and reach informed decisions about urban sustainability,” said Fernández. The tool’s engine would be developed in partnership with UM6P and the tool itself would be offered online.
The strains of urban population growth, along with a predicted threefold increase in urban energy use and urban land area globally, are a primary motivation for the project. In addition, low-income urban areas in Africa are likely to be most vulnerable to the consequences of climate change because of unreliable and limited access to energy, water, and shelter.
“With climate change, what happens in terms of the vulnerability of lower income segments of urban population, and at what point, with extreme heat, intense precipitation or climate-induced water shortage does urban vulnerability become urban survivability?” Fernández asked.
Securing resources for the future
In addition to climate concerns, agricultural production concerns were raised by both MIT and UM6P experts.
Benedetto Marelli, the Paul M. Cook Career Development Assistant Professor in the Department of Civil and Environmental Engineering, said he is focused on developing new technologies that can increase agricultural production. With a growing population, he noted, a 70 percent increase in food production will be necessary by 2050.
Marelli is in the process of creating biofertilizers that work with plants to boost germination and overcome environmental stressors such as pests, disease, heat waves, and drought.
Manal Mhada, a postdoc from UM6P, presented her research on precision agriculture, and the efficient use of seeds and fertilizers. Her work focuses on human-centered solutions for Moroccan communities, and includes local farmers in her research projects.
Mhada conducts close studies of the crop quinoa, with the intention of introducing it to Morocco in order to provide food and nutritional security. She acknowledges that climate change threatens agriculture, food security, and peace, but emphasizes that “big problems allow for immense opportunity.”
Resilience became a common thread throughout the conference. Hassan Radoine, director of the School of Architecture and Design at UM6P, urged a paradigm shift, pushing back against the common perception of Africa as poor.
“What is resilience? The responsiveness to risk and inventing new solutions. The reconstructing of a community or a place, is resilience,” Radoine said.
Echoing this, Remy Sietchiping, UN-Habitat leader of regional and metropolitan planning, outlined the urban agenda of creating smart cities that encompass adaptability and most importantly, resilience.
“You cannot buy sustainability,” Radoine said.
During the last session of the conference, attention shifted to the “smart chemistry” projects, which work closely with the Moroccan company OCP, the world’s leading supplier of phosphate rock. Paul M. Cook Career Development Assistant Professor Yogesh Surendranath of the MIT Department of Chemistry presented on phosphorus, a natural resource abundant in Morocco.
However, the process of creating phosphate products demands an enormous amount of energy. Surendranath’s research aims to elucidate the process of electrochemical phosphate reduction in molten salts in order to lower economic and environmental costs and strengthen Morocco’s position in chemical markets.
Henry Dreyfus Professor of Chemistry Christopher Cummins’ project also works with phosphate and has successfully created a new method for synthesizing phosphorus. The method utilizes a “wet process,” which reduces energy inputs, waste, and overall harm to the environment.
Following Cummins, Professor Paul Barton of the Department of Chemical Engineering discussed his project on optimal industrial symbiosis for the Jorf Lasfar platform, the phosphate mineral processing facility in Morocco. Barton is studying ways to optimize use of the phosphate resource to generate returns on investment while remaining mindful of energy and water consumption.
Throughout the afternoon, goals for the future were at the forefront of everyone’s mind. UMRP aims to continue to conduct impactful research, tackle developmental challenges, and build a strong foundation for UM6P.
“This conference provided a wonderful platform for UMRP to showcase their projects, build a community with UM6P and other colleagues, and help the growing institutional commitment of MIT to engage fruitfully in a future of sustainable development for Africa,” said UMRP Executive Director Kurt Sternlof.
It was evident that the MIT faculty-led research is results-driven and exhibits a strong vision of a sustainable future. The idea that UMRP research projects develop small solutions with big impacts became a recurring theme of the conference.
“Whether discussing urban metabolism, industrial symbiosis, chemical processing or the hydrological cycle, the common theme of recognizing and optimizing closed loops of resource use — circular economies of production, consumption and renewal — was clear and compelling, and therein beats the heart of sustainability,” Sternlof said.
Members of the MIT engineering faculty receive many awards in recognition of their scholarship, service, and overall excellence. Every quarter, the School of Engineering publicly recognizes their achievements by highlighting the honors, prizes, and medals won by faculty working in our academic departments, labs, and centers.
Anant Agarwal, of the Department of Electrical Engineering and Computer Science and the Computer Science and Artificial Intelligence Laboratory, was named the 2018 Yidan Prize for Education Development Laureate on Sept. 15.
Angela Belcher, of the departments of Materials Science and Engineering and Biological Engineering, won the Xconomy Award for Innovation at the Intersection on July 18.
Martin Bazant, of the departments of Chemical Engineering and Mathematics, became a fellow of the American Physical Society on Sept. 26.
Svetlana Boriskina, of the Department of Mechanical Engineering, was elected to the Optical Society Board of Directors on Sept. 18.
Richard Braatz, of the Department of Chemical Engineering, was named a fellow of the American Institute of Chemical Engineers on Aug. 3.
Marty Culpepper, of the Department of Mechanical Engineering, was named the Class of 1960 Fellow on Sept. 21.
Luca Daniel, of the Department of Electrical Engineering and Computer Science and the Research Lab of Electronics, won the Best Paper Award for the IEEE Transactions on Components, Packaging, and Manufacturing Technologies on Aug. 13.
Constantinos Daskalakis, of the Department of Electrical Engineering and Computer Science and the Computer Science and Artificial Intelligence Laboratory, won the Simons Investigator Award in Theoretical Computer Science from the Simons Foundation on July 1; he also won the Rolf Nevanlinna Prize from the International Mathematical Union on Aug. 1.
Domitilla Del Vecchio, of the Department of Mechanical Engineering, won the National Science Foundation Understanding the Rules of Life Award on Sept. 21.
Srini Devadas, of the Department of Electrical Engineering and Computer Science and the Computer Science and Artificial Intelligence Laboratory, won the Charles A. Desoer Technical Achievement Award of IEEE Circuits and Systems on July 1.
Piotr Indyk, of the Department of Electrical Engineering and Computer Science and the Computer Science and Artificial Intelligence Laboratory, was appointed as the Thomas D. and Virginia W. Cabot Professor on Sept. 13.
Lynn W. Gelhar, of the Department of Civil and Environmental Engineering, won the Charles V. Theis Award on Aug. 1.
Polina Golland, of the Department of Electrical Engineering and Computer Science and the Computer Science and Artificial Intelligence Laboratory, was named the Henry Ellis Warren (1894) Chair on Sept. 13.
Martha Gray, of the Department of Electrical Engineering and Computer Science and the Institute for Medical Engineering and Science, won the Civil Servants Social Security and Services Institute Memorial Award on June 12.
Charles Harvey, of the Department of Civil and Environmental Engineering, was awarded an AGU Fellowship on Aug. 9.
Asegun Henry, of the Department of Mechanical Engineering, won the 2018 Bergles-Rohsenow Young Investigator Award in Heat Transfer on Aug. 28.
Jeffrey A. Hoffman, of the Department of Aeronautics and Astronautics, won the Best Technical Paper Award at the 31st Annual Congress of the Association of Space Explorers on Sept. 14.
Qing Hu, of the Department of Electrical Engineering and Computer Science and the Research Lab of Electronics, won the Kenneth J. Button Prize at the International Conference on Infrared, Millimeter, and Terahertz Waves on Sept. 14.
Klavs Jensen, of the Department of Chemical Engineering and Materials Science and Engineering, was named the 2018 American Institute of Chemical Engineers Prausnitz Institute Lecturer on Aug. 21.
Robert S. Langer, of the Department of Chemical Engineering, was awarded honorary doctorates from the University of Limerick in Ireland and from Université Laval in Canada; he also won the 2018 Leadership Award for Historic Scientific Advancement from the American Chemical Society and the 2018 Alpha Omega Dental Fraternity Achievement Medal Award; in addition, he was inducted into the Advanced Materials Hall of Fame on Aug. 1.
Charles Leiserson, of the Department of Electrical Engineering and Computer Science and the Computer Science and Artificial Intelligence Laboratory, won the Association for Computing Machinery SIGCOMM Networking Systems Award on Aug. 28.
Aleksander Madry, of the Department of Electrical Engineering and Computer Science and the Computer Science and Artificial Intelligence Laboratory, won the Presburger Award for Young Scientists from the European Association for Theoretical Computer Science on July 13.
Tom Magnanti, of the Laboratory for Information and Decision Systems, was honored with Singapore’s National Day Award on Aug. 17.
Heidi Nepf, of the Department of Civil and Environmental Engineering, was awarded an AGU Fellowship on Aug. 9.
Dava Newman, of the Department of Aeronautics and Astronautics, won the 2018 Lowell Thomas Award on July 11.
Asu Ozdaglar, of the Department of Electrical Engineering and Computer Science, was named the School of Engineering Distinguished Professor of Engineering on Sept. 13.
Pablo Parrilo, of the Department of Electrical Engineering and Computer Science, was named the Joseph F. and Nancy P. Keithley Professor on Sept. 13.
Alberto Rodriguez, of the Department of Mechanical Engineering, won the Amazon Robotics Best Systems Paper Award in Manipulation on Sept. 14.
Hadley Sikes, of the Department of Chemical Engineering, was awarded the 2018 Best of BIOT (ACS Division of Biochemical Technology) Award on Sept. 25.
Michael Strano, of the Department of Chemical Engineering and the MIT Energy Initiative, was named on June 29 to lead the new Energy Frontier Research Center to be established at MIT.
Russell Tedrake, of the Department of Electrical Engineering and Computer Science and the Computer Science and Artificial Intelligence Laboratory, won the International Journal of Robotics Inaugural Paper of the Year Award on July 6.
John Tsitsiklis, of the Department of Electrical Engineering and Computer Science, was awarded an honorary doctorate from the Athens University of Economics and Business; he also won the IEEE Control Systems Award on June 30.
Dennis Whyte, of the Department of Nuclear Science and Engineering and the Plasma Science and Fusion Center, won the Fusion Power Associates Leadership Award on Sept. 25.
Gregory Wornell, of the Department of Electrical Engineering and Computer Science and the Research Lab of Electronics, won the IEEE Leon K. Kirchmayer Graduate Teaching Award on Sept. 19.
Xuanhe Zhao, of the Department of Mechanical Engineering, won the Materials Today Rising Star Award on Sept. 19.