MIT Latest News

MIT News is dedicated to communicating to the media and the public the news and achievements of the students, faculty, staff and the greater MIT community.

Artificial intelligence shines light on the dark web

Mon, 05/13/2019 - 12:10pm

Beneath the surface web, the public form of the internet you use daily to check email or read news articles, exists a concealed "dark web." Host to anonymous, password-protected sites, the dark web is where criminal marketplaces thrive in the advertising and selling of weapons, drugs, and trafficked persons. Law enforcement agencies work continuously to stop these activities, but the challenges they face in investigating and prosecuting the real-world people behind the users who post on these sites are tremendous.

"The pop-up nature of dark-web marketplaces makes tracking their participants and their activities extremely difficult," says Charlie Dagli, a researcher in MIT Lincoln Laboratory's Artificial Intelligence Technology and Systems Group. Dagli is referring to the fast rate at which dark-web markets close down (because they are hacked, raided, abandoned, or set up as an "exit scam" in which the site shuts down intentionally after customers pay for unfulfilled orders) and new ones appear. These markets' short lifetimes, from a few months to a couple years, impede efforts to identify their users.

To overcome this challenge, Lincoln Laboratory is developing new software tools to analyze surface- and dark-web data.

These tools are leveraging the one benefit this whack-a-mole-like problem presents — the connections sellers and buyers maintain across multiple layers of the web, from surface to dark, and across dark-web forums. "This constant switching between sites is now an established part of how dark-web marketplaces operate," Dagli says.

Users are making new profiles constantly. Although they may not be employing the same usernames from site to site, they are keeping their connections alive by signaling to each other through their content. These signals can be used to link personas belonging to the same user across dark-web forums and, more revealingly, to link personas on the dark web to the surface web to uncover a user's true identity.

Linking users on the dark web is what law enforcement already tries to do. The problem is that the amount of data that they need to manually shuffle through — 500,000 phone numbers and 2 million sex ads posted a month — is too large and unstructured for them to find connections quickly. Thus, only a low percentage of cases can be pursued.

To automate the persona-linking process, Lincoln Laboratory is training machine learning algorithms to compute the similarity between users on different forums. The computations are based on three aspects of users' communications online: "How they identify to others, what they write about, and with whom they write to," Dagli explains.

The algorithm is first fed data from users on a given Forum A and creates an authorship model for each user. Then, data from users on Forum B are run against all user models from Forum A. To find matches for profile information, the algorithm looks for straightforward clues, such as changes in username spelling like "sergeygork" on Forum A to "sergey gorkin" on Forum B, or more subtle similarities like "joe knight" to "joe nightmare."
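
For intuition, here is a minimal sketch, assuming a simple edit-distance approach, of how username variants like those above could be scored. It is illustrative only; the Lincoln Laboratory system's actual features and models are not public.

```python
# Minimal sketch (not the Lincoln Laboratory code): compare usernames across
# two forums with a simple edit-distance-style similarity score.
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Lowercase and drop spaces/underscores so spelling variants line up."""
    return name.lower().replace(" ", "").replace("_", "")

def username_similarity(name_a: str, name_b: str) -> float:
    """Return a 0-1 similarity score between two usernames."""
    return SequenceMatcher(None, normalize(name_a), normalize(name_b)).ratio()

print(username_similarity("sergeygork", "sergey gorkin"))   # ~0.91, likely match
print(username_similarity("joe knight", "joe nightmare"))   # ~0.76, subtler match
print(username_similarity("sergeygork", "joe nightmare"))   # low, likely different
```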

The next feature the system looks at is content similarity. The system picks up on unique phrases — for example, "fun in the sun" — that are used in multiple ads. "There's a lot of copy-and-paste going on, so similar phrasings will pop up that are likely from the same user," Dagli says. The system then looks for similarities in a user's network, which is the circle of people that the user interacts with, and the topics that the user's network discusses.
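
As an illustration of the content-similarity idea, the sketch below scores the overlap of word n-grams between two ads, one simple way that copy-and-pasted phrases like "fun in the sun" could be surfaced. This is an assumption for exposition, not the system described in the article.

```python
# Illustrative sketch: score content similarity between two users' ad text
# by the overlap of word trigrams, which highlights copy-and-pasted phrases.
def word_ngrams(text: str, n: int = 3) -> set:
    """Return the set of n-word phrases in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def content_similarity(text_a: str, text_b: str, n: int = 3) -> float:
    """Jaccard overlap of word n-grams; 1.0 means identical phrasing."""
    grams_a, grams_b = word_ngrams(text_a, n), word_ngrams(text_b, n)
    if not grams_a or not grams_b:
        return 0.0
    return len(grams_a & grams_b) / len(grams_a | grams_b)

ad_1 = "come have fun in the sun this weekend"
ad_2 = "have fun in the sun with us"
print(content_similarity(ad_1, ad_2))  # 3 shared trigrams of 8 distinct -> 0.375
```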

The profile, content, and network features are then fused to provide a single output: a probability score that two personas from two forums represent the same real-life person.
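
A minimal sketch of such a fusion step appears below, assuming a logistic model with made-up weights; the laboratory's real system learns its model from labeled data and is not described at this level of detail.

```python
# Illustrative fusion step: combine profile, content, and network similarity
# scores into a single match probability via a logistic model.
import math

def match_probability(profile_sim: float, content_sim: float,
                      network_sim: float) -> float:
    """Fuse feature similarities into a probability that two personas
    belong to the same real-life person."""
    # Hypothetical weights and bias; in practice these would be learned
    # from hand-labeled pairs of personas.
    weights = (2.5, 3.0, 1.5)
    bias = -3.0
    z = (bias + weights[0] * profile_sim + weights[1] * content_sim
         + weights[2] * network_sim)
    return 1.0 / (1.0 + math.exp(-z))

print(match_probability(0.9, 0.4, 0.6))  # ~0.79: fairly likely the same person
print(match_probability(0.1, 0.0, 0.2))  # ~0.08: probably different people
```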

The researchers have been testing these persona-linking algorithms both with open-source Twitter and Instagram data and hand-labeled ground truth data from dark-web forums. All of the data used in this work are obtained through authorized means. The results are promising. "Every time we report a match, we are correct 95 percent of the time. The system is one of the best linking systems that we can find in the literature," Dagli says. 

This work is the most recent development in ongoing research. From 2014 to 2017, Lincoln Laboratory contributed to the Defense Advanced Research Projects Agency (DARPA) Memex program. Memex resulted in a suite of surface- and dark-web data analysis software developed collaboratively with dozens of universities, national laboratories, and companies. Ten laboratory technologies spanning text, speech, and visual analytics that were created for Memex were released as open-source software via the DARPA Open Catalog.

Today, more than 30 agencies worldwide are using Memex software to conduct investigations. One of the biggest users, and a stakeholder in Memex's development, is the Human Trafficking Response Unit (HTRU) in the Manhattan District Attorney's Office.

Manhattan District Attorney Cyrus Vance Jr. stated in a written testimony to the U.S. House of Representatives that his office used Memex tools to screen more than 6,000 arrests for signs of human trafficking in 2017 alone. "We also used Memex in 271 human trafficking investigations and in six new sex trafficking indictments that were brought in 2017," he stated. With the introduction of Memex, prostitution arrests screened by HTRU for human trafficking indicators increased from 5 to 62 percent, and investigations of New York Police Department prostitution-related arrests increased from 15 to 300 per year.

Jennifer Dolle, the deputy chief of HTRU, visited the laboratory to present how the unit has benefited from these technologies. "We use these tools every single day. They really have changed how we do business in our office," Dolle says, explaining that prior to Memex, a human trafficking investigation could take a considerably longer time.

Now, Memex tools are enabling HTRU to quickly enhance emerging cases and build sex trafficking investigations from leads that have little information. For example, these tools — including one called TellFinder (built by Memex contributor Uncharted Software) for indexing, summarizing, and searching sex ad data — have been used to identify additional, underage victims from data in a single online prostitution advertisement. "These additional investigative leads allow HTRU to prosecute traffickers on violent felony charges and hold these defendants responsible for the true nature of the crimes they commit against vulnerable victims," says Dolle.

Researchers are continuing to learn how emerging technologies can be tailored to what agencies need and for how the dark web operates. "Data-driven machine learning has become a demonstrably important tool for law enforcement to combat illicit online marketplaces on the dark web," says Lin Li, a principal investigator of this continuous work in the laboratory’s Human Dynamic Dark Networks program, which is funded through the laboratory’s Technology Office. "But, some of the ongoing challenges and areas of research include expanding our understanding of the demand economy, disrupting the supply economy, and gaining a better overall situational awareness."

A better understanding of how the supply-and-demand chains of the dark-web economy work will help the team develop technologies to disrupt these chains. Part of the goal is to raise the risks of participating in this illicit economy; linking personas on the dark web to those on the surface web is one potentially powerful tactic.

"This fast-growing illicit economy was shown by DARPA to fund terrorist activities and shown by HTRU as a driver of modern-day slavery. Defeating terrorism and eliminating slavery are national and humanitarian needs," says Joseph Campbell, leader of the Artifical Intelligence Technology and Systems Group. "Our group has extrodinary expertise in AI, machine learning, and the analysis of human networks based on information extracted from multilanguage speech, text, and video combined with network communications and activities. The state-of-the-art technologies that we create, develop, and advance are transferred to our sponsors, who use them daily with tremendous impact for these national and humanitarian needs."

Caught between criminals and cops

Mon, 05/13/2019 - 10:10am

To a resume rich in policy and security studies, work experience, and publications, Andrew Miller may now add the unlikely skill of video production. While investigating the impact of gang violence on Lagos, Nigeria, the sixth-year political science doctoral candidate came up with an innovative research tool: immersive, virtual reality (VR) videos.

"This was the first time VR was deployed in a large-scale field survey," says Miller, a PhD candidate in the MIT Department of Political Science. "Using VR video vignettes, we could immerse respondents in hypothetical scenarios, which helped elicit their real-world emotions when answering questions about these scenarios."

Miller's foray into production evolved as part of his multi-year doctoral study into the ways criminal organizations wield influence in communities.

"Deaths from criminal violence likely equal deaths from civil war, terrorism, and interstate war combined," he says, "and those responsible often operate with quasi-impunity." In the Americas, for instance, for every 100 murders, only about 25 people are convicted, Miller notes. "It's not just a problem for developing countries; even in some major American cities, people who commit murder are much more likely to get away with it than be arrested or convicted."

Miller has a master's degree in foreign service and security from Georgetown University, and has held international development and security positions with Deloitte Consulting and the Council on Foreign Relations. After spending significant time on the ground in places like Iraq, Afghanistan, Bosnia, and Kosovo, he became keenly aware of "criminal organizations operating in many of these places under the surface," and of frequent collusion between criminal groups and governments.

"You could have a government with all the resources, the trappings of legitimacy and legal frameworks, and still have small, illegal organizations that exercise a surprising degree of control in communities," he says.

In the daily lives of citizens in so many of the places he visited, the most meaningful security issues involve "problems with underground economies, real or perceived corruption of the police, and threatened and actual violence by criminals trying to control these economies," Miller says.

Concerned by this pervasive problem, which is only likely to grow in significance as urban areas expand in population, Miller set out to investigate the relationships between citizens and law enforcement. He decided to focus specifically on how and why people in communities afflicted by gang violence decide to cooperate with police. "If someone sees a shooting or hears about somebody involved in a shooting, what determines if that person shares information with the police?" Miller wondered.

Trust issues

Hoping to develop a broadly applicable theory, Miller chose two very different locales as research sites: Lagos, Nigeria, and Baltimore, Maryland. The former, home to more than 10 million people and the economic and cultural hub of West Africa, has pockets of the city beset with groups that extort shopkeepers, along the lines of Sicily's mafia. Baltimore is afflicted with gang violence around drug trafficking and one of the highest murder rates in the United States. What unites both cities, says Miller, is "a strained relationship between many residents and the police.”

Miller began in Lagos, with its densely populated markets, to explore this distrust. His research had built-in constraints: He could not run real-world simulations of violent incidents to test witness responses.

So Miller devised the notion of VR vignettes played on mobile phones to engage subjects and make it a more realistic experience for them. Hiring a Lagos production team and actors, he filmed a series of staged fights, with more than a dozen variations changing the circumstances of the fight or police response. Shown these different videos, 1,025 people completed surveys about their willingness to share information with the police.

After 11 months in Nigeria, Miller has begun to glean insights from his fieldwork. Among them: The central constraint to reporting incidents to police is "a deep-seated perceived retaliation risk from gangs, which are regarded with both antipathy and fear," says Miller. (One possible remedy to this hurdle that he identified through his research: expanding access to anonymous police tip lines — not currently available in Lagos.)

His survey data also revealed that even if citizens witness police using excessive force, violating the rights of suspects, they still believe sharing information is important.

"It was surprising to me that, even in cases where police are widely perceived as corrupt, citizens hold an enduring faith in their ability to bring law and order, as long as it doesn't jeopardize personal safety," he says. "People show amazing resilience in the face of their problems."

Baltimore and beyond

Miller has now turned his focus to completing the Baltimore phase of research. He's donning his production hat once again — this time for video segments of local news stories designed for an online survey. Both the work in Lagos and Baltimore will feature in his thesis on cooperation between citizens and the police in communities with gangs.

Although Miller has given himself little time off, he managed to slip away to northern Italy recently and was able to indulge in his favorite pastimes of travel and food.

While he once pursued a future in development and humanitarian assistance, he has fully committed to a life in academia. "I really love digging into issues deeply, and I enjoy teaching, especially the undergraduates at MIT," he says. He also cites the fruitful support and friendships he found in the political science department "that proved instrumental at all stages of the research process, from developing ideas to writing up the results."

A faculty position in a comparable environment that enables him to continue this work would be ideal, says Miller. "It's important that my work both contributes to academic theory and is relevant to people's lives," he says. "People in the communities where I have been working have emphasized to me that research like this needs to be done, so I hope it will be useful."

Structure of T cell epitopes a decisive factor in natural HIV control, study finds

Mon, 05/13/2019 - 9:00am

In 2017, nearly 37 million people were living with HIV worldwide, according to the Joint United Nations Program on HIV/AIDS (UNAIDS). Current treatments for the virus, which becomes AIDS in its most advanced stage, involve a combination of drugs known as antiretroviral therapy (ART). For those who know they have the virus and are receiving treatment, ART can help to reduce viral loads to undetectable — and therefore untransmittable — levels, allowing individuals to stay healthy for years. Today, regular ART treatment is the only known course for stopping the spread of HIV.

Although effective, treatment as prevention, scientists say, is a far cry from a preventive HIV vaccine: There are an estimated 5,000 new cases of HIV per day. A safe and affordable vaccine administered to individuals before they become infected would significantly lower the number of new cases, at a minimum. At best, it would eradicate the virus entirely, on a global scale.

In a paper published in Science, researchers from the lab of Bruce Walker, director of the Ragon Institute of MGH, MIT, and Harvard, and professor of the practice at the Institute for Medical Engineering and Science (IMES), hit upon a unique approach to understanding the immune responses of individuals who exhibit a natural ability to suppress HIV, key to developing a vaccine.

Building on previous studies that revealed how proteins could be described as networks, lead researchers Gaurav Gaiha and Elizabeth J. Rossin, both graduates of the Harvard-MIT Program in Health Sciences and Technology, sought to apply network theory to viral protein structure, to determine the degree to which the topology and connectivity of a given amino acid residue, or a single unit of an HIV protein, might affect an individual’s reaction to the virus.

“The concept of structural topology can define how critical a residue is to viral protein structure and function,” Gaiha said. “Considering structural topology gave us a novel lens through which to then look at T cell epitopes, linear viral peptide strands presented by infected cells, that hadn’t been seen before.”

Previous studies have relied on viral sequence conservation, i.e., how often a given amino acid remains unchanged across viral sequences, to determine why an individual's immune response might be better at suppressing HIV than that of someone with a similar viral load who goes on to develop AIDS. Gaiha cited a 2015 paper by Mark Connors at the National Institutes of Health, which found no difference in the sequence conservation of epitopes targeted by T cells in controllers — that is, subjects with the ability to quell the virus to non-transmissible levels — and non-controllers.

While Gaiha and his team did not throw out sequence conservation as a guiding factor, Connors’ findings did impel them to look at why sequences were conserved, or repeated, in the first place. According to the researchers, protein structure had to play a substantial role.

“We decided that we needed to change our thought process and move beyond sequence conservation towards structural constraint. In other words, looking directly at the structure of a site to see how well it is able to accommodate a mutation. That will hopefully lead us to better vaccines going forward.”

In the new study, the Walker lab built networks of amino acids and non-covalent interactions using publicly available HIV protein structure data, and then made calculations to measure the topological importance of each amino acid residue. They found that higher scoring residues — ones with greater degrees of connectivity to other residues — also tended to better impair viral replication when mutated than low-scoring residues.
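
The sketch below illustrates the general idea with the networkx library, scoring residues in a toy contact network by eigenvector centrality. The residues, contacts, and choice of centrality measure are assumptions for illustration, not the study's actual scoring method.

```python
# Minimal sketch of the network idea: nodes are amino acid residues, edges
# are non-covalent interactions inferred from protein structure.
import networkx as nx

# Hypothetical toy contacts; a real network would come from structural data.
contacts = [
    ("R1", "R2"), ("R1", "R3"), ("R1", "R4"), ("R1", "R7"),
    ("R2", "R3"), ("R4", "R5"), ("R5", "R6"), ("R6", "R7"),
]
residue_network = nx.Graph(contacts)

# Score each residue by eigenvector centrality: residues connected to other
# well-connected residues score higher, mirroring the idea that
# "director-like" residues are hardest for the virus to mutate.
scores = nx.eigenvector_centrality(residue_network)
for residue, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{residue}: {score:.2f}")
```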

Gaiha likened what he saw to a social network, such as in a corporate setting. “If you removed someone critical rather than peripheral to an operation, someone more connected and less dispensable, like the director, it would be very perturbing for the organization,” he said. “It’s the same with amino acid residues within viral proteins. Some are really critical for structural integrity and for its function, and if you target them, it’s hard for them to accommodate mutations.”

In a subsequent portion of the study, researchers analyzed 114 untreated HIV-positive individuals, each with different levels of viral load. Those with smaller loads were designated controllers, while those with higher loads, who were least able to suppress the virus, were called progressors. Controllers overwhelmingly directed their strongest responses at epitopes with high network scores, while progressors had weak or absent responses against these epitopes. The presence of “protective” HLA alleles, typically considered definitive markers of a patient’s ability to withstand the presence of HIV, did not correlate with an individual’s low viral load: some controllers possessed these so-called protective alleles, and others did not.

The study found that high network-scoring epitopes were more difficult for the virus to mutate, or were left completely unmutated, allowing CD8 T cells to kill infected CD4 T cells and maintain viral suppression. Thus, the link between network score, determined by the connectivity of T cell epitopes, and immune response was found to be significant.

“People are realizing the importance of structural studies to be able to guide drug design, but I think the concept of using structure to guide T cell vaccines is very much an emerging one,” Gaiha said. “Ideally, our study is one that starts making people think more about this for HIV and other viruses — this concept of structural vaccinology.”

Rossin, a biologist and current chief resident at Massachusetts Eye and Ear, summed up how the study’s findings could prompt scientists to continue to examine subjects from a nontraditional standpoint.

“The field of biology and medicine over the past two decades has transitioned from looking at things in a linear way, to realizing that biology is multi-dimensional,” Rossin said.

“Biological mechanisms exert their effect via complex interrelationships between proteins and even between molecules between proteins, and if we just look at them in a one-dimensional way, we’re not getting to the answers we need…If you’re looking at the trees, you’re missing the forest.”

Making it real

Mon, 05/13/2019 - 12:00am

Cloudy beige liquid swirls inside a large bioreactor resembling a French press as Jenna Ahn examines small flasks nearby. The lab where Ahn is working, in the subbasement of Building 66, has the feel of a beehive. She’s part of one of nine teams of undergraduates huddling in groups at their benches. Every now and then, someone darts off to use a larger piece of equipment among the shakers, spectrometers, flasks, scales, incubators, and bioreactors lining the walls.

These students aren’t practicing routine distillations or titrations set up by an instructor. Each team of three or four is trying to solve a problem that doesn’t yet have an answer. In 10.26/27/29 (Chemical Engineering/Energy/Biological Engineering Projects Laboratory), students are focused on data-driven, applied research projects. They work on engineering problems posed by companies and by research labs from across the Institute, with the goal of finding solutions that can be applied to the real world.

Ahn, a junior majoring in chemical and biological engineering, and her teammates are studying acid whey, a byproduct of cheese and yogurt production. Although whey has nutritional value, it is often treated as a waste product, and its disposal can remove oxygen from waterways and kill aquatic life. While it can be purified and treated like wastewater, the process is expensive.

Ahn’s team is using genetically engineered yeast to break down whey into nutritious components like sugars and omega-3 fatty acids, which could then be introduced back into the food chain. After combining the yeast with the whey, the team regularly checks dissolved oxygen and pH levels and monitors whether the yeast is breaking down the whey into its components. “This could be turned into a component of animal feed for cows and other animals,” says Ahn, gesturing to the swirling mixture in her flask.

Fundamentals in action

Gregory Rutledge, the Lammot du Pont Professor of Chemical Engineering, has been the instructor in charge of 10.26/27/29 (Chemical Engineering/Energy/Biological Engineering Projects Laboratory) for about five years. The excitement among the course’s students stems from the knowledge that they are directly contributing to advancing technology, he says. “It’s a great motivator. They may have gotten fundamentals in their classes, but they may not have seen them in action.”

The course has existed in its current form for about 30 years, Rutledge estimates. Its chemical engineering, biological engineering, and energy-related projects appeal to a wide variety of interests. Students are given project descriptions at the beginning of the semester and have flexibility in their choices.

In the current format, students give presentations on their research progress throughout the semester and are evaluated by the 10.26/27/29 professors and their peers. At the end of the term, final presentations are judged by faculty from the entire Department of Chemical Engineering during a project showcase.

The competitive element, Rutledge says, is just one part of how the course has changed over time. “It has evolved toward this organically, as we figure out what students need to know and how to best get that to them.”

Each year, the focus of the students’ projects changes. Two of this year’s teams are working in collaboration with Somerville, Massachusetts, startup C16 Biosciences, trying to use yeast to produce a sustainable alternative to palm oil. The production of palm oil, which is primarily used for culinary and cosmetic purposes, is a leading cause of deforestation.

“We’re trying to increase production of saturated fat sustainably,” explains Kaitlyn Hennacy, a junior majoring in chemical engineering. “This doesn’t require cutting down rainforests and could be a substitute in many applications.” Hennacy is examining a cuvette of yellow liquid in which there is a collection of bright orange blobs. The blobs’ color is a carotenoid pigment produced as a byproduct during the process. Her team is using seven different solvents, such as hexane and pentane, to extract a palm oil alternative from the yeast.

“It’s the intersection of an energy-related project and a consumer project,” says Carlos Sendao, one of Hennacy’s teammates and a fellow chemical engineering major. “This is a challenge I knew to take.” Sendao is going to continue research on this project over the summer through the Undergraduate Research Opportunities Program (UROP) and the MIT Energy Initiative.

Another team is looking into recycling plastics with an enzyme called PETase, which breaks down polyethylene terephthalate (PET), the type of plastic found in single-use water bottles. “One of the biggest constraints is time,” says Connor Chung, a junior majoring in chemical engineering. “We only have three to four months to learn as much as we can about this enzyme.”

Life lessons

Every year Rutledge is impressed with how much students learn and grow over the course of the semester. The problems they’re tackling aren’t easy, and working in teams presents challenges as students navigate the dynamics of group work.

“They’re also learning a lot about life. They’re probably going to run into something in the future — whether it’s a boss, a team member, or a piece of lab equipment — that doesn’t work in the way they expect,” he says. “We try to give the students the tools if or when they come across this. And when they give those final presentations, you can see they really have evolved as engineers,” he adds.

The approach seems to be effective, says Rutledge. “People will come back one, two, three years later when they’re working,” he says. “They say, ‘I learned so much. This is what I actually do.’”

MIT, Blue Origin to cooperate on sending research experiment to the moon

Fri, 05/10/2019 - 5:15pm

MIT and Blue Origin have signed a memorandum outlining plans to pursue mutual interests in space exploration. MIT will develop one or more payload experiments to be launched aboard Blue Origin’s Blue Moon, a flexible lander delivering a wide variety of small, medium, and large payloads to the lunar surface.

MIT Apollo Professor of Astronautics and former NASA Deputy Director Dava Newman, who developed the agreement with Blue Origin, says that over the coming months, MIT researchers will invite input from the MIT community to help determine the nature of the flight opportunity experiment. “Some potential areas include smart habitats, rovers, life support and autonomous systems, human-machine interaction, science of the moon, lunar poles, sample return, and future astronaut performance and suit technologies,” Newman says.

Blue Origin’s business development director, A.C. Charania, has said the company’s lunar transportation program is its “first step to developing a lunar landing capability for the country, for other customers internationally, to be able to land multimetric tons on the lunar surface.” Blue Moon payloads could include science experiments, rovers, power systems, and sample return stages.

MIT has a long history of aerospace engineering development and lunar science related to space exploration, including receiving the first major contract of the Apollo program, which involved the design and development of the lunar missions’ guidance and navigation computers. MIT experiments have flown on Space Shuttle missions, and been conducted aboard Skylab, Mir, and the International Space Station. MIT also led the GRAIL (Gravity Recovery And Interior Laboratory) mission to explore the moon’s gravity field and geophysical structure.

Robots shoot for the moon in MIT’s annual 2.007 competition

Fri, 05/10/2019 - 4:35pm

In their historic lunar mission 50 years ago, Apollo 11 astronauts Buzz Aldrin and Neil Armstrong collected and returned to Earth more than 48 pounds of lunar material, including 50 moon rocks that researchers have been analyzing intensely ever since. If they’d only had help from some MIT robots, the astronauts might have been able to bring back even more lunar loot.

On Thursday evening, students of MIT’s popular class 2.007 (Design and Manufacturing I) proved that robots can be efficient, ingenious, and even highly entertaining moon-rock scavengers.

Over four often nail-biting hours, 32 student finalists, winnowed from a roster of 165, competed head to machined head, in the course’s annual robot competition, held in the ice rink at MIT’s Johnson Athletic Center. This year’s theme, Moonshot, was an homage to the Apollo 11 moon landing, celebrating its 50th anniversary this year.

The course designers and machinists took the theme to heart, constructing two huge, identical game boards over which pairs of student-designed robots faced off. At the center of each board stood a replica of the Apollo 11 lunar module, or LEM, which served as the competition’s starting point. Ramps on either side of the LEM sloped down to a lunar-like surface, littered with “moon rocks” — stones of various sizes and shapes, which, for practical purposes, were of Earthly origin.

The challenge called for students to maneuver their robots, which either moved autonomously or were remotely controlled, from the LEM’s starting point down to the “lunar” surface to collect as many moon rocks as possible, and return them up to the LEM, within two minutes. Robots gained more points by planting a small flag on a hillside, spinning a wheel to “charge” the LEM’s battery, and pulling a cord to jettison two weights — a particularly tricky task that, if accomplished, would trigger the LEM to “lift off,” to dramatic smoke and sound effects.

“The competition name is very apropos of the challenge that the students face, because for many of them, making a robot by themselves for the first time is a moonshot,” says Amos Winter, course co-instructor and associate professor of mechanical engineering at MIT. Winter and Associate Professor Sangbae Kim served as the competition’s emcees, both suited up for the occasion in astronaut gear.

The 2.007 competition is a yearly tradition that dates back to the 1970s, with the course’s first instructor, Woodie Flowers, the Pappalardo Professor Emeritus of Mechanical Engineering, who developed 2.007 as one of the first hands-on, project-based undergraduate courses.

Each year, at the start of the semester, students are given the same toolbox of parts, including gears, wheels, circuit boards, and microcontrollers. Through lectures and time — lots of time — in the lab, students learn to design and machine their own robot, to carry out that year’s competition challenges.

This year’s challenge inspired a range of robotic strategies and designs, including a bot, aptly named Scissorlift, that stretched itself up via a scissoring mechanism to plant a flag, and a two-bot system named Lifties, comprising one robot that hoisted rocks up to a second robot via a telescoping arm.

While most students hoped for a win, sophomore Jaeyoung Jung simply wanted to entertain. After requesting that the event’s overhead music be turned down, Jung, sporting a tuxedo, made some music of his own, playing a recorder that he had rigged to maneuver his robot. Each note he played was converted to electrical signals that were picked up by a computer, which in turn sent a corresponding command to the robot’s controller to spin a wheel and charge up the LEM.
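
A speculative sketch of how such a pipeline could work is shown below: detect the recorder's dominant pitch with an FFT and map nearby notes to robot commands. The note-to-command table, sample rate, and command names are hypothetical; Jung's actual rig is not described in detail.

```python
# Speculative sketch: map a detected recorder note to a robot command.
import numpy as np

SAMPLE_RATE = 44100  # samples per second, typical for a microphone

# Hypothetical mapping from approximate note frequencies (Hz) to commands.
NOTE_COMMANDS = {523: "spin_wheel", 587: "stop", 659: "drive_forward"}

def dominant_frequency(audio: np.ndarray) -> float:
    """Return the strongest frequency component of the audio buffer."""
    spectrum = np.abs(np.fft.rfft(audio))
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / SAMPLE_RATE)
    return freqs[np.argmax(spectrum)]

def note_to_command(audio: np.ndarray, tolerance: float = 15.0) -> str:
    """Pick the command whose note is closest to the detected pitch."""
    pitch = dominant_frequency(audio)
    note = min(NOTE_COMMANDS, key=lambda f: abs(f - pitch))
    return NOTE_COMMANDS[note] if abs(note - pitch) < tolerance else "idle"

# Example: a synthetic 523 Hz tone (the note C5) maps to "spin_wheel".
t = np.linspace(0, 0.1, int(SAMPLE_RATE * 0.1), endpoint=False)
print(note_to_command(np.sin(2 * np.pi * 523 * t)))
```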

Though the competition’s first music-controlled robot didn’t make it through the first round, it was met with cheers from an often raucous crowd of family and friends, who were sporadically treated to a confetti of custom-designed foam astronauts fired from an air cannon.

Among the enthusiastic crowd was Evelyn Wang, head of the Department of Mechanical Engineering, and her two young children, who were seeing the engineering spectacle for the first time. The competition brought back memories for Wang, who participated in 2.007 when she herself was an MIT undergraduate. That year, she recalls having to compete on a game board dubbed “Ballcano,” for a volcano-like structure that spit out balls, which robots had to catch and distribute at various locations across the game board.

“It was the first time I learned how to design, build, machine, and work with different actuation mechanisms and motors and pneumatics,” says Wang, who proudly remembers taking home fourth place.

As the night wore on, robots battled over who could scrabble up the most moon rocks, using a variety of designs, from grippers and grabbers to snowplow- and comb-like sweepers, and rotating flippers and flaps. Between each bout, course assistants quickly repositioned the moon rocks and swept the game board of any residual moon rock dust that could make a robot slip. To contend with this potential hazard, some students designed their bots with extra traction, lining their wheels with Velcro or, in the case of one bot named Sloth, rubber bands.

Sophomore Jessica Xu, whose spiky, rock-snatching “Cactus-bot” made it all the way to the semifinals, says that 2.007’s hands-on experience has helped to steer her toward a mechanically-oriented career.

“This is my first experience ever even thinking of building a robot,” Xu says. “I started the class googling, ‘What are mechanisms that robots even do?’ Because I wasn’t even sure what the possibilities were. I came into college wanting to do something that applies to health care. Now I’m hoping to concentrate in medical devices, applying the mechanical side. I’m excited to see what it could be.”

In the end, it was a powerful, motor-heavy bot named Rocky that gobbled up rocks “like Cookie Monster,” as Winter reported to the crowd, that took home the prize. Rocky’s designer, sophomore Sam Ubellacker, says it could have been the bot’s drive train that made the difference. While most students included two motors in their drive trains, Ubellacker opted for four, in order to move twice as fast as his competitors — an 11th-hour decision that ultimately paid off.

“I pretty much redesigned my entire robot the week before this competition, because I realized my other one wasn’t going to score any points,” says Ubellacker, who, as it happens, has kept up the family tradition — his brother Wyatt won first prize in 2011. “I’ve probably worked about 100 hours this week on this robot. I’m just glad that it worked out.”

He credits his success, and all the know-how he’s gained throughout the semester, to all the experts behind 2.007.

“I didn’t know much about machining going in,” Ubellacker says. “Interacting with the machinists and the staff will be my most memorable experiences. They’re all really cool people, and they shared all this knowledge with me. This was all really great.”

Ashwin Sah, Megan Yamoah, and Steven Truong named 2019-20 Goldwater Scholars

Fri, 05/10/2019 - 3:30pm

Three undergraduate students have been selected for a 2019-20 Barry M. Goldwater Scholarship, two in the School of Science and one in the School of Engineering. In partnership with the U.S. Department of Defense National Defense Education Programs, the Goldwater Foundation gave the award to 496 sophomore and junior students within the United States, chosen from more than 5,000 nominations this year.

One of the 62 scholars majoring in mathematics and computer science, Ashwin Sah not only aims to continue his education in mathematics with a PhD but also hopes to teach as a faculty member at a university, researching theoretical mathematics. Now a sophomore in the Department of Mathematics, he was previously one of six Putnam Fellows in the William Lowell Putnam Mathematical Competition and won a gold medal at the International Mathematical Olympiad. Sah has produced two papers accepted for publication in research journals, has written several others independently, and solved a 2001 conjecture by Jeff Kahn regarding the maximum number of independent sets in a graph. He is on track to graduate with his bachelor’s degree in three years.

Megan Yamoah, a junior in the Department of Physics, is among the 360 recipients majoring in natural sciences. In addition to an outstanding academic record, she performs research in two groups and has worked with a third as a repeat participant in MIT’s Undergraduate Research Opportunities Program. Yamoah has built a semiconductor laser and subsequently performed an experiment with it, culminating in a patent currently under review. She also helped install dilution refrigerators in a lab on MIT’s campus and has experimented with them largely on her own, designing and engineering a microcontroller to regulate them. In the future, Yamoah plans to focus on quantum computing. Beyond research, she is a strong student leader in many physics societies and groups on campus.

In Course 20 (biological engineering), Steven Truong joins 74 engineers across the country who were granted this year’s Goldwater scholarship. He is a junior in the Department of Biological Engineering and is also a double major in Writing. Truong has an outstanding academic record and is also an opinion editor for The Tech, MIT’s student newspaper, and co-president of the MIT Biological Engineering Undergraduate Board. His research interests lie in diabetes, including developing new ways to deliver insulin to diabetics. He currently works with members of the MIT Koch Institute and has also collaborated with the Joslin Diabetes Center, an affiliate of Harvard Medical School, and traveled to Vietnam for a project he co-led.

The Barry Goldwater Scholarship and Excellence in Education Program was established by Congress in 1986 to honor Senator Barry Goldwater, who served for 30 years in the U.S. Senate. The program was designed to foster and encourage outstanding students in their pursuit of careers in mathematics, the natural sciences, and engineering, providing recipients with stipends of $7,500 per year to contribute toward their educational expenses.

Q&A: Heather Paxson on a new model for open-access publishing in anthropology

Fri, 05/10/2019 - 3:00pm

Publishers, librarians, research funders, and leaders from across the field of anthropology — including journal editors and representatives of the major Anglophone anthropological societies of both Europe and North America — gathered at MIT on April 24 for an invitational workshop focused on a sea change for everyone who attended: moving the discipline’s journals to an open-access (OA) model.

Currently, the expense of academic publishing creates significant barriers to the broad dissemination of scholarly findings. The goal of the workshop was to consider a new model for providing open access to journal publications in a way that could transform both anthropology and a full range of academic disciplines. Professor Heather Paxson, interim head of MIT Anthropology, and an event organizer, shared her thoughts on the workshop and new OA plans with SHASS Communications.

Q: Why is open access so important for academic publications, and why has it been so difficult to achieve up until this point?

A: As a mechanism for sharing knowledge freely, open access publishing promotes the advancement of science by fostering dialogue and constructive engagement among scholars around the world. Today, with much of our scholarly knowledge gated behind steep paywalls, scholars and students affiliated with wealthier institutions have easier access to more information than others. Consequently, it can be difficult for researchers in the global south — and at under-resourced institutions anywhere — to keep up to date with citations and this, in turn, limits their opportunity to be published in leading journals. OA has the potential to break down this barrier and to make the production of knowledge — not only its consumption — more inclusive, comprehensive, and experimental.

In keeping with MIT Open CourseWare and MITx, open access publishing is part of a broader commitment to inclusive, democratic knowledge-making. As the Institute’s Ad Hoc Faculty Task Force on Open Access underscores in its draft recommendations, OA publishing is fully in line with MIT’s vision, “that science and knowledge progress more quickly and can more readily be applied to solving the world’s biggest challenges when shared openly.”

The biggest challenge to OA publishing is ensuring sustainable funding. Who will pay the bills to provide free access to knowledge? Publishers, after all, are also in the business of making money. The project of OA is nothing less than to clear a new commons within an economy of publishing that has come, too often, to put profit before science.

Research libraries have been leading the way in calling for open access. They would prefer to see their subscription costs directly support the work of students and researchers rather than generate corporate profits, but their own budgets are precarious.

Should academic societies become publishers? As a values proposition it would seem to make sense — think of it as academic self-publishing at scale. But for academic societies to become publishers would require faculty to attain the skills and assume the work of a second profession on top of the one we’re trying to enrich in the first place.

Q: What are the key features of the new “Library + Funder” (L+F) model proposed by the anthropology journal collective Libraria, and what are its pros and cons?

A: The L+F model proposed by Libraria at our MIT workshop acknowledges the complexity of today’s knowledge ecology and includes all players who participate in the production and distribution of scientific and humanistic knowledge: scholarly journals and societies, research granting agencies, publishers, and libraries.

The basic idea is to ask granting agencies to support the open publication of their funded research so that findings may reach a wider audience, with libraries covering remaining publication costs out of the subscription fees they’re currently paying to for-profit publishers who keep articles behind paywalls, or impose steep article processing charges [APCs] to open articles on an individual basis.

The major “pro” of this funding model is that it offers a way around a problem currently common to open access publishing — namely, the exploitation of underpaid or volunteer labor of production staff, or of the goodwill of authors and their backing institutions in paying APCs. By escaping proprietary agreements, the L+F model also promises greater budgetary transparency and access to data analytics for all involved.

As discussed at the MIT workshop, a major challenge of the proposed model is bringing research funders on board. The agencies that support anthropological research represent a variety of organizational structures, each raising its own set of issues. For example, for a small private research foundation also to invest in publishing might require shifting limited resources away from funded research.

Meanwhile, national foundations like NSF [National Science Foundation] and NIH [National Institutes of Health] would require a change in federal law to be able to collaborate with individual journals, since their expenditures are currently required to be via grant, cooperative agreement, or competitively-bid contract. Such structural diversity poses bureaucratic challenges to setting up a mechanism for the broad participation by funders suggested by the L+F model.

Q: What consensus, if any, was reached by participants in MIT’s invitational workshop on open access? Can you describe the next steps?

A: Inspired by the day’s discussions, Berghahn Journals and Libraria will devise a “subscribe-to-open” package to provide research libraries with the option of supporting the move of 13 Berghahn anthropology journals to open access in 2020. With library support, the Berghahn 13 will become the biggest block of titles to move from closed subscriptions to open access within a single discipline since the SCOAP3 agreement converted 12 titles in high energy physics to OA in 2012.

As a pilot project, the Berghahn and Libraria bundling of 13 anthropology titles offers an opportunity to collect data and measure success that may inspire further confidence in open access. As such, the pilot represents the first phase of a larger Libraria initiative that will include: continuing to seek ways that granting agencies can more directly support the open publication of funded research; recruiting additional anthropology journals to the Berghahn + Libraria open package; and exploring ways of supporting similar initiatives across other disciplines.

Q: In what ways do you think this workshop has advanced the goals of open access, and what benefits do you hope to accrue — both to the field of anthropology and perhaps to other disciplines considering the move to OA?

A: The workshop represented an historic gathering of leaders — from across the U.S. as well as from Canada, Europe, and the U.K. — in all the sectors involved in making the results of anthropological research available to scholars and interested publics around the world. We took a tremendous step forward by bringing everyone into the same room, with stakeholders comparing notes both within and across sectors. And we emerged with the outline of a pilot project that may yet lead the way in moving scholarly journals to open access publishing at a disciplinary scale.


Interview prepared by SHASS Communications
Editorial and Design Director: Emily Hiestand
Senior Writer: Kathryn O'Neill

Recreating ancient minerals

Fri, 05/10/2019 - 2:45pm

When it comes to making a lasting impression in geological history, the medium makes all the difference, especially in the Earth’s paleo-oceans. Here, during the Archean Eon (4,000-2,500 million years ago) and at times during the Proterozoic (2,500-541 million years ago), when oxygen in the atmosphere and oceans was much lower than today, sedimentary minerals preserved signatures of biological activity in the form of fine textures created by microbial communities. The environmental conditions under which rocks like these form dictate how the crystal structure develops — the more orderly and fine-grained, the better the preservation.

Understanding, and better yet, replicating how these ancient minerals grew provides information about Earth’s past environments, and how organisms developed and behaved. One of these fossil-bearing rocks has proven difficult to copy in the lab — until now.

Researchers from MIT and Princeton University have found a way to emulate a part of ancient Earth in the lab by reproducing one of these weathering-resistant, information-carrying minerals: dolomite, whose formation has long perplexed scientists. A close relative of the minerals that make limestone, and one that can be created from them, dolomite was pervasive in the past; however, researchers rarely find it in modern environments. While it is created from components commonly found in seawater, there are physical and kinetic barriers preventing the formation of dolomite — layers of carbonate (CO₃²⁻) ions with alternating central atoms of calcium and magnesium. Instead, studies have reported protodolomite — a rock with a disordered crystalline structure, occurring only in very salty modern environments — but this mineral does not preserve the same fine microbial textures as its more ordered counterpart.

“To look for evidence of ancient life and old processes, you have to look at microbial structures. That's where the information is. Some of that information is preserved in the form of very finely-grained dolomite, which precipitates almost as the microbes grow. It preserves the lamina of these microbial mats,” says Tanja Bosak, associate professor in the MIT Department of Earth, Atmospheric and Planetary Sciences (EAPS) whose lab led the research. Her group uses experimental geobiology to explore modern biogeochemical and sedimentological processes in microbial systems and interpret the record of life on the early Earth. However, “there's a big problem about the origin of finely-grained dolomite in a lot of microbial structures through time: There was no clear way of making dolomite under Earth's surface conditions.”

Their results published in the journal Geology report the first creation of ordered dolomite and find that the trick to capturing these textures may be a slurry of manganese ions, seawater, light, and a biofilm of anaerobic, sulfur-metabolizing, photosynthetic microbes in an oxygen-free environment.

The study’s co-authors are former EAPS postdoc Mirna Daye and Associate Professor John Higgins from Princeton University.

Dolomite problem and the importance of order

Since the first identification of dolomite in the 18th century in what is now known as the Dolomite Mountains of Northern Italy, scientists have been stumped by how dolomite forms, and why there is so much ancient dolomite and so little of the mineral in modern times. This issue was dubbed “the dolomite problem.”

Scientists have found that modern dolomite can form in two main ways. It precipitates when shallow, hypersaline seawater is heated, and when limestone encounters magnesium-rich water, like a deep reef that’s invaded by seawater solutions. However, both methods make large crystals that obscure much of the biological information. In modern seawater, however, aragonite and calcite (different crystalline structures of calcium carbonate) are more likely to precipitate out than dolomite. “It's not hard to make dolomite if you heat up a beaker of seawater to very high temperatures, but you'll never get it at the Earth’s surface temperature and pressure just on its own,” says Bosak. “It's really hard to get magnesium into the minerals; it doesn't really want to go into the crystal lattice.” That’s a portion of the larger picture. Additionally, these mechanisms do not account for mineral variations (manganese or iron-rich dolomite) seen during the Archean and Proterozoic periods that preserved these textures. “You see that seawater is saturated with respect to dolomite, [but] it just doesn't form, so there's some kinetic barrier to that.”

It wasn’t until the turn of the 20th century that a Russian microbiologist demonstrated the potential for anaerobic bacteria to cause dolomite to form from minerals in ocean water, a process called biomineralization. Since then, researchers have found that in modern environments, biofilms — containing photosynthetic microbes and the slimy organic matrix that they excrete for their home (exopolymeric substances) — in highly evaporative pools of salty water can provide a surface on which dolomite can nucleate and grow. However, these biofilms are not photosynthetic. In contrast, many microbial structures that were preserved before the rise of oxygen grew in less-salty marine environments and are thought to have been produced by photosynthetic microbial communities. Additionally, the location of ions and microbes thought to be involved in this process likely differed in the past. The past microbes relied on sulfide, hydrogen, or iron ions for photosynthesis. Researchers suspect that more than 2 billion years ago, manganese and iron ions were present higher in the ocean sediments or even the water column. Today, because of the oxygenated atmosphere, they’re buried deeper in sediments where anaerobic conditions can occur. However, the lack of sunlight means that microbial mats don’t grow here, so neither does dolomite.

While the suggestion of microbial involvement was a strong step to solving the dolomite problem, the matters of crystal ordering and formation in the sunlit marine zone, where microbes colonize sediments, were still unresolved.

Reproducing the past

While investigating early sedimentological preservation, the group performed a series of experiments replicating the conditions of these ancient oceans with an anaerobic atmosphere. They used a combination of modern biofilms, light/dark environments, and seawater modified to mimic early Earth conditions with and without manganese, one of the metals often found in the mineral and thought to facilitate bacterial growth. The researchers used microbes from a lake in upstate New York, from depths that lack oxygen.

In their experiments, the researchers noticed something unexpected — that the most abundant mineral in the biofilms was highly ordered dolomite, and the vials that produced the most dolomite contained photosynthesizing microbes and manganese — a result consistent with field reports. As the mats grew up toward the light, crystals accumulated on them, with the oldest on the bottom capturing tiny wiggles where now-degraded microbial mats used to be. The more extensive the coverage, the smaller the porosity, which reduced the chances of fluids infiltrating them, interacting with and dissolving the minerals, and essentially erasing data. The experiments lacking manganese or performed in the dark (not photosynthesizing) developed disordered dolomite. “We don't understand exactly why manganese and the microbes have that effect, but it seems like they do. It's almost like a natural consequence of those types of conditions,” says Bosak. Nonetheless, “It was a big deal to show that that can actually happen.”

Now that the team has found a way to make ordered dolomite, they plan to look into why it forms, how it varies, and how the rock records the environmental conditions in which it forms. After seeing the effect that manganese had on dolomite, the researchers will look at iron ions, which were also incorporated into these ancient rocks. “Iron also seems to stimulate the formation of the incorporation of magnesium into this mineral, for whatever reason,” says Bosak.

They’ll also investigate the unique microbial interactions and physical properties present to see what components are essential to precipitating dolomite. The individual niches that each anaerobic organism occupies seem to help the community grow, cycle elements, degrade substances, and provide a surface for crystals. The Bosak group will do this by fossilizing various organisms under the same or different environmental conditions to see if they can produce dolomite. During these experiments, they will monitor how well dolomite records the temperature at which it was made, as well as the chemical and isotopic composition of the surrounding solution, to understand the process better.

"I think it tells us that — when we are trying to interpret the past — it's a really different planet: different types of organisms, different types of metabolisms that were dominant,” says Bosak, “and I think we are just starting to scratch the surface of what possible mineral outcomes, what kind of textural outcomes we can even expect.”

This research was supported, in part, by the Simons Foundation Collaboration on the Origins of Life and the NSF FESD project. Part of this work was performed at the MIT Center for Materials Science and Engineering, a Materials Research Science and Engineering Center funded by the National Science Foundation.

Building a community for statistics and data science at MIT and beyond

Fri, 05/10/2019 - 2:10pm

As a focal point for statistics at MIT, the Statistics and Data Science Center (SDSC) reflects the unique nature of statistics at MIT: steeped in cutting-edge computation, with both theoretical explorations and novel applications across departments and domains. As part of the Institute for Data, Systems, and Society (IDSS), the SDSC also fosters multi-disciplinary collaborations that bring new approaches to complex societal challenges.

These themes — computation, cross-disciplinary collaboration, creative problem-solving — were all on display at the SDSC’s third annual SDSCon, a celebration of the statistics and data science community at MIT and beyond.

SDSCon brought together over 200 participants from academia and industry, with talks ranging from tactics and techniques like machine learning to statistical applications in biology and business. “The purpose of SDSCon is to bring together folks ... interested in statistics and data science, to both celebrate as well as build community,” said SDSC director and professor of electrical engineering and computer science (EECS) Devavrat Shah in his opening remarks. School of Engineering Dean Anantha Chandrakasan commented on the work the SDSC has done in building that community by “coalescing a community of scholars across campus around the shared mission to use statistical tools to advance research and education.”

“I feel somewhat like an interloper because I am not a statistician,” joked Esther Duflo in a plenary talk that highlighted how statistical methods are being used in new cross-disciplinary ways to address societal challenges. Duflo is the Abdul Latif Jameel Professor of Poverty Alleviation and Development Economics at MIT. Her research uses machine learning to analyze the results of randomized controlled trials. By combining that analysis with data collection and the leveraging of social networks, she seeks to raise the number of children in developing countries who receive crucial, life-saving immunizations.

A panel of talks exploring statistics in the social sciences addressed other key societal challenges. Alberto Abadie, an MIT professor of economics and associate director of IDSS, discussed how data science is driving changes in social science research and policy making. Stanford University’s Ashish Goel looked at tools for public decision making, while Aaron Roth of the University of Pennsylvania explored how social values and ethics can be better embedded into algorithms that make autonomous decisions.

Members of the community of scholars employing advanced statistics tools at MIT gave presentations on their work, ranging from mechanical engineering and IDSS Professor Anette “Peko” Hosoi’s investigation of luck versus skill in fantasy sports, to biology professor and SDSC affiliate Aviv Regev’s design for better experiments in solving large scale challenges in cellular biology. Nike Sun, an MIT math professor, described progress toward a solution in a theoretical geometric problem in classic probability called the Ising perceptron, while John Tsitsiklis, an EECS professor who directs MIT’s Laboratory for Information and Decision Systems, gave a plenary talk focused on gaps between theory and practice in a kind of machine learning known as reinforcement learning.

SDSCon also featured talks from data science practitioners in industry. Dawn Woodard, an adjunct professor at Cornell University who is also director of data science for maps at Uber, demonstrated methods for dynamic pricing and matching in ride hailing. Lester Mackey, an adjunct professor at Stanford and statistical machine learning researcher for Microsoft Research, discussed how machine learning tools are being used to improve weather and climate forecasting that is "subseasonal," a time period from two to six weeks in the future where precipitation prediction can have a big impact on water management.

The Statistics and Data Science Center, along with IDSS, will join the new MIT Stephen A. Schwarzman College of Computing in the fall. The new college, like IDSS, crosses all five schools at MIT, and should serve as a fitting home for what Chandrakasan called the “deep interdisciplinary nature of statistics and data science.”

Said Chandrakasan: “I commend SDSC for providing a shared space among disciplines, and shaping the practice of statistics at MIT in a manner that focuses on multi-disciplinary collaborations that examine some of the most complex societal challenges we face today.”

How to tell whether machine-learning systems are robust enough for the real world

Fri, 05/10/2019 - 12:00am

MIT researchers have devised a method for assessing how robust machine-learning models known as neural networks are for various tasks, by detecting when the models make mistakes they shouldn’t.

Convolutional neural networks (CNNs) are designed to process and classify images for computer vision and many other tasks. But slight modifications that are imperceptible to the human eye — say, a few darker pixels within an image — may cause a CNN to produce a drastically different classification. Such modifications are known as “adversarial examples.” Studying the effects of adversarial examples on neural networks can help researchers determine how their models could be vulnerable to unexpected inputs in the real world.

For example, driverless cars can use CNNs to process visual input and produce an appropriate response. If the car approaches a stop sign, it would recognize the sign and stop. But a 2018 paper found that placing a certain black-and-white sticker on the stop sign could, in fact, fool a driverless car’s CNN to misclassify the sign, which could potentially cause it to not stop at all.

However, there has been no way to fully evaluate a large neural network’s resilience to adversarial examples for all test inputs. In a paper they are presenting this week at the International Conference on Learning Representations, the researchers describe a technique that, for any input, either finds an adversarial example or guarantees that all perturbed inputs — that still appear similar to the original — are correctly classified. In doing so, it gives a measurement of the network’s robustness for a particular task.

Similar evaluation techniques do exist but have not been able to scale up to more complex neural networks. Compared to those methods, the researchers’ technique runs three orders of magnitude faster and can scale to more complex CNNs.

The researchers evaluated the robustness of a CNN designed to classify images in the MNIST dataset of handwritten digits, which comprises 60,000 training images and 10,000 test images. The researchers found around 4 percent of test inputs can be perturbed slightly to generate adversarial examples that would lead the model to make an incorrect classification.

“Adversarial examples fool a neural network into making mistakes that a human wouldn’t,” says first author Vincent Tjeng, a graduate student in the Computer Science and Artificial Intelligence Laboratory (CSAIL). “For a given input, we want to determine whether it is possible to introduce small perturbations that would cause a neural network to produce a drastically different output than it usually would. In that way, we can evaluate how robust different neural networks are, finding at least one adversarial example similar to the input or guaranteeing that none exist for that input.”

Joining Tjeng on the paper are CSAIL graduate student Kai Xiao and Russ Tedrake, a CSAIL researcher and a professor in the Department of Electrical Engineering and Computer Science (EECS).

CNNs process images through many computational layers containing units called neurons. For CNNs that classify images, the final layer consists of one neuron for each category. The CNN classifies an image based on the neuron with the highest output value. Consider a CNN designed to classify images into two categories: “cat” or “dog.” If it processes an image of a cat, the value for the “cat” classification neuron should be higher. An adversarial example occurs when a tiny modification to that image causes the “dog” classification neuron’s value to be higher.
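
In code, that decision rule is just an arg-max over the final layer's outputs. A minimal sketch of the cat/dog example, with invented logit values rather than numbers from the paper:

    import numpy as np

    classes = ["cat", "dog"]

    # Hypothetical final-layer outputs for an image of a cat (values are invented)
    logits_original = np.array([2.3, 1.1])
    print(classes[int(np.argmax(logits_original))])   # -> cat

    # After a tiny, imperceptible perturbation, the "dog" neuron edges ahead,
    # which is exactly what an adversarial example does
    logits_perturbed = np.array([1.9, 2.0])
    print(classes[int(np.argmax(logits_perturbed))])  # -> dog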

The researchers’ technique checks all possible modifications to each pixel of the image. Basically, if the CNN assigns the correct classification (“cat”) to each modified image, no adversarial examples exist for that image.

Behind the technique is a modified version of “mixed-integer programming,” an optimization method where some of the variables are restricted to be integers. Essentially, mixed-integer programming is used to find a maximum of some objective function, given certain constraints on the variables, and can be designed to scale efficiently to evaluating the robustness of complex neural networks.

The researchers set the limits allowing every pixel in each input image to be brightened or darkened by up to some set value. Given the limits, the modified image will still look remarkably similar to the original input image, meaning the CNN shouldn’t be fooled. Mixed-integer programming is used to find the smallest possible modification to the pixels that could potentially cause a misclassification.

The idea is that tweaking the pixels could cause the value of an incorrect classification to rise. If a cat image were fed into the pet-classifying CNN, for instance, the algorithm would keep perturbing the pixels to see if it could raise the value of the neuron corresponding to “dog” above that of the neuron for “cat.”

If the algorithm succeeds, it has found at least one adversarial example for the input image. The algorithm can continue tweaking pixels to find the minimum modification that was needed to cause that misclassification. The larger the minimum modification — called the “minimum adversarial distortion” — the more resistant the network is to adversarial examples. If, however, the correct classifying neuron fires for all different combinations of modified pixels, then the algorithm can guarantee that the image has no adversarial example.
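
The authors' own verifier is not reproduced here, but the general shape of that optimization can be sketched with an off-the-shelf solver. In the toy below, the two-input network, its weights, the perturbation budget, and the big-M constant are all invented for illustration; each ReLU unit is encoded exactly using one binary variable, which is where the "integer" part of mixed-integer programming enters. A positive optimum means an adversarial example was found inside the allowed pixel range; a non-positive optimum certifies that none exists for this input (with more classes, the same check would be repeated for each incorrect label).

    # Illustrative sketch only: a tiny ReLU network's robustness check posed as a
    # mixed-integer program with PuLP. All weights and constants are made up.
    from pulp import LpProblem, LpVariable, LpMaximize, LpBinary, value, PULP_CBC_CMD

    W1 = [[1.0, -1.0], [0.5, 1.0]]   # hidden-layer weights (2 ReLU units)
    b1 = [0.0, -0.2]
    W2 = [[1.0, 0.5], [-1.0, 1.0]]   # output weights: row 0 = "cat", row 1 = "dog"

    x0 = [0.6, 0.3]                  # original input, classified "cat" by this toy net
    eps = 0.1                        # each "pixel" may move up or down by at most eps
    M = 10.0                         # big-M bound on the pre-activations

    prob = LpProblem("find_adversarial_example", LpMaximize)

    # Perturbed input, confined to the eps-box around x0 (and to [0, 1])
    x = [LpVariable(f"x{i}", max(0.0, x0[i] - eps), min(1.0, x0[i] + eps))
         for i in range(2)]

    # Each ReLU h = max(0, z) is encoded exactly with one binary indicator a
    h = []
    for j in range(2):
        z = W1[j][0] * x[0] + W1[j][1] * x[1] + b1[j]
        hj = LpVariable(f"h{j}", 0)
        aj = LpVariable(f"a{j}", cat=LpBinary)
        prob += hj >= z                  # h is at least z
        prob += hj <= z + M * (1 - aj)   # h equals z when the unit is active (a = 1)
        prob += hj <= M * aj             # h equals 0 when the unit is inactive (a = 0)
        h.append(hj)

    logit_cat = W2[0][0] * h[0] + W2[0][1] * h[1]
    logit_dog = W2[1][0] * h[0] + W2[1][1] * h[1]

    # Objective: push the wrong class ("dog") as far above the right one as possible
    prob += logit_dog - logit_cat
    prob.solve(PULP_CBC_CMD(msg=False))

    if value(prob.objective) > 0:
        print("adversarial example found:", [round(value(v), 3) for v in x])
    else:
        print("no adversarial example exists within eps =", eps)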

“Given one input image, we want to know if we can modify it in a way that it triggers an incorrect classification,” Tjeng says. “If we can’t, then we have a guarantee that we searched across the whole space of allowable modifications, and found that there is no perturbed version of the original image that is misclassified.”

In the end, this generates a percentage for how many input images have at least one adversarial example, and guarantees the remainder don’t have any adversarial examples. In the real world, CNNs have many neurons and will train on massive datasets with dozens of different classifications, so the technique’s scalability is critical, Tjeng says.

“Across different networks designed for different tasks, it’s important for CNNs to be robust against adversarial examples,” he says. “The larger the fraction of test samples where we can prove that no adversarial example exists, the better the network should perform when exposed to perturbed inputs.”

“Provable bounds on robustness are important as almost all [traditional] defense mechanisms could be broken again,” says Matthias Hein, a professor of mathematics and computer science at Saarland University, who was not involved in the study but has tried the technique. “We used the exact verification framework to show that our networks are indeed robust … [and] made it also possible to verify them compared to normal training.”

Ambient plant illumination could light the way for greener buildings

Thu, 05/09/2019 - 11:59pm

Buildings of the future may be lit by collections of glowing plants and designed around an infrastructure of sunlight harvesting, water transport, and soil collecting and composting systems. That’s the vision behind an interdisciplinary collaboration between an MIT architecture professor and a professor of chemical engineering.

The light-emitting plants, which debuted in 2017, are not genetically modified to produce light. Instead, they are infused with nanoparticles that turn the plant’s stored energy into light, similar to how fireflies glow. “The transformation makes virtually any plant a sustainable, potentially revolutionary technology,” says Michael Strano, the Carbon P. Dubbs Professor of Chemical Engineering at MIT. “It promises lighting independent of an electrical grid, with ‘batteries’ you never need to charge, and power lines that you never need to lay.”

But Strano and his colleagues soon realized that they needed partners who could expand the concept and understand its challenges and potential as part of a future of sustainable energy. He reached out to Sheila Kennedy, professor of architecture at MIT and principal at Kennedy and Violich Architecture, who is known for her work in clean energy infrastructure.

“The science was so new and emergent that it seemed like an interesting design challenge,” says Kennedy. “The work of this design needed to move to a different register, which went beyond the problem of how the plant nanobionics could be demonstrated in architecture. As a design team, we considered some fundamental questions, such as how to understand and express the idea of plant lighting as a living, biological technology and how to invite the public to imagine this new future with plants.”

“If we treat the development of the plant as we would just another light bulb, that’s the wrong way to go,” Strano adds.

In 2017, Kennedy and Strano received a Professor Amar G. Bose Research Grant to build on their collaboration. The MIT faculty grants support unconventional, ahead-of-the-curve, and often interdisciplinary research endeavors that are unlikely to be funded through traditional avenues, yet have the potential to lead to big breakthroughs.

The first year of the Bose grant yielded several generations of the light-emitting watercress plants, which shine longer and brighter than the first experimental versions. The team is evaluating a new component of the nanobionic plants that they call light capacitor particles. The capacitor, in the form of infused nanoparticles in the plant, stores spikes in light generation and “bleeds them out over time,” Strano explains. “Normally the light created in the biochemical reaction can be bright but fades quickly over time. Capacitive particles extend the duration of the generated plant light from hours to potentially days and weeks.”

The researchers have added to their original patent on the light-emitting plant concept, filing a new patent on the capacitor and other components as well, Strano says.

Designing for display

As the nanobionic plant technology has advanced, the team is also envisioning how people might interact with the plants as part of everyday life. The architectural possibilities of their light-emitting plant will be on display within a new installation, “Plant Properties, a Future Urban Development,” at the Cooper Hewitt, Smithsonian Design Museum in New York opening May 10.

Visitors to the installation, part of the 2019 “Nature—Cooper Hewitt Design Triennial” exhibition, can peek into a scaled architectural model of a New York City tenement building — which also serves as a plant incubator — to see the plants at work. The installation also demonstrates a roadmap for how an existing residential building could be adapted and transformed by design to support the natural growth of plants in a future when available energy could be very limited.

“In Plant Properties, the nanobionic plant-based infrastructure is designed to use nature’s own resources,” says Kennedy. “The building harvests and transports sunlight, collects and recycles water, and enriches soil with compost.”

The invitation to contribute to the Cooper Hewitt exhibition offered an unexpected way to demonstrate the plants’ possibilities, but designing an exhibit brought about a whole new set of challenges, Kennedy explains. “In the world of design museums, you’re usually asked to show something that’s already been exhibited, but this is new work and a new milestone in this project.”

“We learned a lot about the care of plants,” Strano adds. “It’s one thing to make a laboratory demonstration, but it’s another entirely to make 33 continuous weeks of a public demonstration.”

The researchers had to come up with a way to showcase the plants in a low-light museum environment where dirt and insects attracted by living plants are usually banished. “But rather than seeing this as a sort of insurmountable obstacle,” says Kennedy, “we realized that this kind of situation — how do you enable living plants to thrive in the enclosed setting of a museum — exactly paralleled the architectural problem of how to support significant quantities of plants growing inside buildings.”

In the installation, multiple peepholes into the building model offer glimpses into the ways people in the building are living with the plants. Museum visitors are encouraged to join the experiment and crowdsource information on plant growth and brightness by uploading their own photos of the plants to Instagram and tagging the MIT Plant Nanobionics lab, using @plantproperties.

The team is also collecting data on how the plants respond to the nanoparticles and other potential stresses. “The plants are actually under more stress from being in the museum environment than from the modifications that we introduce, but these effects need to be studied and mitigated if we are to use plants for indoor lighting,” Strano notes.

Bright and nurturing futures

Kennedy and Strano say the plants could be at the center of a new — but also “pre-electric” — idea in architecture.

For most of human history, Kennedy explains, natural processes from sunlight to waste composting were part of the essential infrastructure of buildings. But these processes have been excluded in modern thinking or hidden away, preventing people from coming face to face with the environmental costs of energy infrastructure made from toxic materials and powered by fossil fuels.

“People don’t question the impacts of our own mainstream electrical grid today. It’s very vulnerable, it’s very brittle, it’s so very wasteful and it’s also full of toxic material,” she says. “We don’t question this, but we need to.”

“Lighting right now consumes a vast portion of our energy demand, approaching close to 20 percent of our global energy consumption, generating two gigatons of carbon dioxide per year,” Strano adds. “Consider that the plants replace more than just the lamp on your desk. There’s an enormous energy footprint that could potentially be replaced by the light-emitting plant.”

The team is continuing to work on new ways to infuse the nanoparticles in the plants, so that they work over the lifetime of the plant, as well as experimenting on larger plants such as trees. But for the plants to thrive, architects will have to develop building infrastructure that integrates the plants into a new internal ecosystem of sunlight, water and waste disposal, Kennedy says.

“If plants are to provide people with light, we need to keep plants healthy to benefit from everything they provide for us,” she says. “We think this is going to trigger a much more caring or nurturing relationship of people and their plants, or plants and the people that they illuminate.”

Cornerstone donation sparks bright future for MISTI MIT-Israel program

Thu, 05/09/2019 - 12:20pm

In the first major step toward solidifying a future for the MIT International Science and Technology Initiatives (MISTI) MIT-Israel program, Arthur J. Samberg '62 has made a $1 million donation. This extraordinary gift is a foundational move in making sure the program — a critical bridge between MIT and Israel for over a decade — will be able to continue supporting student and faculty work for years to come.

“As someone who has been very closely involved at MIT for many years," Samberg reflects, "I was very excited to hear about the MIT-Israel program and the work MISTI has been doing in the country. I really hope that my gift will motivate others to ensure that the MISTI MIT-Israel program will be endowed in perpetuity for many generations to come to continue to reap the benefits of this impactful program — both for our students and for the State of Israel.”

A gift toward sustainability

MIT-Israel has seen strong demand for more than 11 years. The initiative pairs students with tailored, intensive internship, research, and teaching opportunities across the country. Over 800 MIT students have taken part, with more than 100 host institutions participating. MIT-Israel has also facilitated upwards of 30 faculty seed funds, and in the last year alone, 118 MIT students took part in the popular program.

Christine Ortiz, the Morris Cohen Professor of Materials Science and Engineering and founding faculty of the MIT-Israel program, writes that the program has met with incredible popularity. “When we launched the MIT-Israel program, we had no idea of student and faculty demand and impact,” says Ortiz. “The program continues to grow with strong demand both from the MIT community and our partners in Israel. Mr. Samberg’s gift is a strong step towards our goal of ensuring the financial sustainability of the program for generations to come.”

Opportunities in the "Startup Nation"

Despite the program’s momentum, MIT-Israel has not secured the funding needed to ensure its current and future programming into its second decade. Samberg’s gift — $500,000 in endowment and $500,000 in expendables for student and seed funds — is a major step in what the program hopes will be continuing support from the MIT community of alumni and donors to keep the program running strong. As David Dolev, MIT-Israel managing director, shares: “This is an incredible next step towards ensuring a strong MIT-Israel engagement for our students and faculty. We invite our alumni and others in the wider community to join us and build on this gift to fully endow the MISTI MIT-Israel program and enable these opportunities for future generations.”

Dubbed the “Startup Nation,” Israel is a center of innovation and opportunity that attracts ambitious MIT students. Major marquee companies are also drawn to Israel, hoping to join in what MIT Technology Review calls “the magic in the country’s bubbling ecosystem of more than 6,000 start-ups.” For over a decade, MIT-Israel has been enabling students to plug into and have transformational experiences in this ecosystem, where talented minds have never been in higher demand.

Cultural and professional growth

Alumni of the MIT-Israel program frequently cite the cultural impact of the program as well. For Tammy Wu, who took part in MISTI’s Global Teaching Lab to teach in a network of Israeli high schools over Independent Activities Period, MISTI-Israel was a chance to immerse herself in an unfamiliar, non-Western culture. The difference in behavioral norms gave her a chance to grow in new ways. “For example,” she says, “because Israel’s culture invites discussion, I’ve improved on asserting myself and not being afraid to share an opinion that is different than others.”

Another alumnus recalls, “By specifically doing MISTI in Israel, I got the rare opportunity to grow both culturally and professionally. The work I did in my lab was probably the most interesting project I’ve ever worked on, and the experience of traveling around Israel was one of the most rewarding cultural experiences I’ve ever had.”

The transformative experiences that MIT students are having in Israel are part of the educational arc of the program. As Dolev says: “One key component of the MISTI experience is the rigorous pre-departure preparation. As with all MISTI programs, students need to acquire country-based knowledge prior to their experience abroad. The aim of this training is to give the students a deep understanding of the country and culture they will be immersed in so that they can both succeed professionally and navigate effectively the new culture they will be living and working in."

While in Israel, students also have assignments to meet with local alumni to gain a deeper understanding of the unique factors that have brought about the startup-nation culture, and they meet with local leaders in their internship fields. Upon their return, students have the opportunity to mentor students going to Israel the following year.

Multi-generational impact

“I have had the opportunity over the years to engage with our students going to Israel for hands-on experiences,” says Eran Ben-Joseph, faculty director and head of the MIT Department of Urban Studies and Planning. “They come back with both a deep understanding of the region while taking their course of study to the next level. Both aspects of their Israel experience are invaluable, and they take them with them far beyond their time at MIT. Mr. Samberg’s support will impact many students for years to come.”

MISTI creates applied international learning opportunities for MIT students that increase their ability to understand and address real-world problems and bolsters MIT’s research mission by promoting collaborations between MIT faculty members and their counterparts abroad.

Building global understanding

“I personally think it’s incredibly important,” says senior Alice Zhou, who also taught in Israel alongside Tammy Wu, “to take the time to appreciate different cultures, as we live in a global community, but so much fear and hatred can brew from a lack of understanding each other. Cultural differences should be celebrated, and the first step to that appreciation is exposure.”

Looking to the future, Zhou sees an important mission for MIT-Israel and similar programs. “I hope in the future more MIT students will be able to see and experience the same cultural diversity I was able to and carry that work forward fostering global cooperation.”

 

Story prepared by MIT SHASS Communications
Editorial and Design Director: Emily Hiestand
Writer: Alison Lanier / Advisor: David Dolev, MISTI

 

New seed fund to address food, water, and agriculture in India

Thu, 05/09/2019 - 12:00pm

Representatives of the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS), MIT-India, and the Indian Institute of Technology Ropar (IIT Ropar) gathered recently for a signing ceremony to formally launch a new faculty seed fund. The seed fund will help initiate new water- and food systems-related research collaborations between faculty and research scientists from MIT and IIT Ropar. Through it, MIT will provide grants for early-stage research projects on topics primarily related to water, food, and agriculture.

“Our goal is to establish a pathway for research collaboration with IIT Ropar’s faculty and students on the world’s pressing challenges around water supply and food security. The interchange will provide MIT researchers with a direct window into the agricultural and water resources environment of Punjab and Himachal Pradesh in India,” says John Lienhard, J-WAFS director and Abdul Latif Jameel Professor of Water at MIT.

J-WAFS promotes and supports research at MIT, with a mission to make meaningful contributions to solving the diverse challenges surrounding the world’s food and water needs. Through research funding and other activities, it supports the development and deployment of effective technologies, programs, and policies to address concerns stemming from population growth, climate change, urbanization, and development.

“IIT Ropar is a fast-rising research institution, already highly-ranked for research citations in India," says Lienhard. “IIT Ropar lies in one of India’s most important agricultural regions and will be an excellent partner for research around water and food.” IIT Ropar is one of eight new Indian Institutes of Technology (IITs) set up by the government of India to expand the reach and enhance the quality of technical education in India. The IITs emphasize research as a primary focus of the institution.

IIT Ropar Director Professor Sarit K. Das, who also served as a visiting professor at MIT in 2007 and 2011, spoke at the ceremony about the need for research to address these critical issues. “We are one of the new generation IITs, and we want a very large focus on research,” he says. “There are large problems with agriculture, with water resources, and we want this as one of our focus areas. This is where J-WAFS comes in; we decided that we must join hands together to do something.”

To provide these opportunities for joint research, J-WAFS and IIT Ropar will work with MIT-India, part of MIT International Science and Technology Initiatives (MISTI), to expand the outreach to MIT’s research community.  “This is a natural partnership for us,” says Renee Robins, executive director of J-WAFS. “Most faculty members are aware of MISTI Global Seed Funds, and the MIT-India program has a long track record of successful international collaborations, with established infrastructure for program management and proposal review.”

“We are committed to working with our MIT colleagues through these interdisciplinary initiatives to address the research interests of our MIT community and our Indian colleagues,” says Mala Ghosh, managing director of MIT-India. “This new seed fund will create a cross-fertilization of ideas in critical areas, generate student involvement, and link the overlapping networks of J-WAFS, MIT-India, and IIT, thereby launching an even more robust and effective research environment.”

The MIT-IIT Ropar Seed Fund will become a part of the MISTI Global Seed Funds. Open to faculty and researchers from MIT and IIT Ropar who are pursuing water- and food-related research, MISTI Global Seed Funds create opportunities for international cooperation by funding early-stage collaboration between MIT researchers and their counterparts around the world. The call for proposals will open in May with a deadline in September.

Martin Zwierlein receives Vannevar Bush Faculty Fellowship

Thu, 05/09/2019 - 11:50am

Physics Professor Martin Zwierlein has been named to the 2019 class of Vannevar Bush Faculty Fellows, one of 10 recipients selected by the U.S. Department of Defense (DoD).

Zwierlein and the other fellows will each receive up to $3 million over a five-year fellowship term. Zwierlein aims to “uncover the rules by which ensembles of atoms and molecules organize under the laws of quantum mechanics,” he explains. To do so, he will use a system in which ultracold gases of atoms and molecules serve as stand-ins for electrons in condensed matter or for neutrons and protons in nuclear matter. With this “quantum emulator,” Zwierlein’s research group hopes to gain insights into a much wider range of physical systems.

“The award provides me with the unique opportunity to go into truly uncharted territory in the quantum world, not driven by deadlines and milestones but, in the best sense of fundamental research, by curiosity,” says Zwierlein, who is the inaugural Thomas A. Frank (1977) Professor of Physics. “With luck, we may stumble upon new states of matter with extraordinary properties that we did not even anticipate.”

The highly competitive fellowship, formerly known as the National Security Science and Engineering Faculty Fellowship, aims to advance transformative, university-based fundamental research. It is named in honor of Vannevar Bush PhD 1916 (1890-1974), a professor, dean of engineering, and vice president at MIT, and later chair and honorary chair of the MIT Corporation. A scientist and engineer nicknamed “The General of Physics,” he organized and led American science and technology during World War II. Bush also served as director of the U.S. Office of Scientific Research and Development and founded a large defense and electronics company.

Selected by a panel of experts from among more than 250 white papers, this year’s awardees will join 55 current fellows conducting DoD-related research in areas that include materials science, cognitive neuroscience, quantum information sciences, and applied mathematics.    

"The Department of Defense is the home of big ideas for unique problem sets," said Bindu Nair, deputy director for basic research in the Office of the Under Secretary of Defense for Research and Engineering. "The Vannevar Bush Faculty Fellowship reflects the department's commitment to support paradigm-shifting research that explores the unknown, engages outstanding scientists and engineers on these challenges, and helps to define and transform our research agendas of the future."

Painting a fuller picture of how antibiotics act

Thu, 05/09/2019 - 10:59am

Most antibiotics work by interfering with critical functions such as DNA replication or construction of the bacterial cell wall. However, these mechanisms represent only part of the full picture of how antibiotics act.

In a new study of antibiotic action, MIT researchers developed a machine-learning approach to discover an additional mechanism that helps some antibiotics kill bacteria. This secondary mechanism involves activating the bacterial metabolism of nucleotides that the cells need to replicate their DNA.

“There are dramatic energy demands placed on the cell as a result of the drug stress. These energy demands require a metabolic response, and some of the metabolic byproducts are toxic and help contribute to killing the cells,” says James Collins, the Termeer Professor of Medical Engineering and Science in MIT’s Institute for Medical Engineering and Science (IMES) and Department of Biological Engineering, and the senior author of the study. Collins is also the faculty co-lead of the Abdul Latif Jameel Clinic for Machine Learning in Health.

Exploiting this mechanism could help researchers to discover new drugs that could be used along with antibiotics to enhance their killing ability, the researchers say.

Jason Yang, an IMES research scientist, is the lead author of the paper, which appears in the May 9 issue of Cell. Other authors include Sarah Wright, a recent MIT MEng recipient; Meagan Hamblin, a former Broad Institute research technician; Miguel Alcantar, an MIT graduate student; Allison Lopatkin, an IMES postdoc; Douglas McCloskey and Lars Schrubbers of the Novo Nordisk Foundation Center for Biosustainability; Sangeeta Satish and Amir Nili, both recent graduates of Boston University; Bernhard Palsson, a professor of bioengineering at the University of California at San Diego; and Graham Walker, an MIT professor of biology.

“White-box” machine-learning

Collins and Walker have studied the mechanisms of antibiotic action for many years, and their work has shown that antibiotic treatment tends to create a great deal of cellular stress that makes huge energy demands on bacterial cells. In the new study, Collins and Yang decided to take a machine-learning approach to investigate how this happens and what the consequences are.

Before they began their computer modeling, the researchers performed hundreds of experiments in E. coli. They treated the bacteria with one of three antibiotics (ampicillin, ciprofloxacin, or gentamicin), and in each experiment, they also added one of about 200 different metabolites, including an array of amino acids, carbohydrates, and nucleotides (the building blocks of DNA). For each combination of antibiotic and metabolite, they measured the effects on cell survival.

“We used a diverse set of metabolic perturbations so that we could see the effects of perturbing nucleotide metabolism, amino acid metabolism, and other kinds of metabolic subnetworks,” Yang says. “We wanted to fundamentally understand which previously undescribed metabolic pathways might be important for us to understand how antibiotics kill.”

Many other researchers have used machine-learning models to analyze data from biological experiments, by training an algorithm to generate predictions based on experimental data. However, these models are typically “black-box,” meaning that they don’t reveal the mechanisms that underlie their predictions.

To get around that problem, the MIT team took a novel approach that they call “white-box” machine-learning. Instead of feeding their data directly into a machine-learning algorithm, they first ran it through a genome-scale computer model of E. coli metabolism that had been characterized by Palsson’s lab. This allowed them to generate an array of “metabolic states” described by the data. Then, they fed these states into a machine-learning algorithm, which was able to identify links between the different states and the outcomes of antibiotic treatment.

Because the researchers already knew the experimental conditions that produced each state, they were able to determine which metabolic pathways were responsible for higher levels of cell death.

“What we demonstrate here is that by having the network simulations first interpret the data and then having the machine-learning algorithm build a predictive model for our antibiotic lethality phenotypes, the items that get selected by that predictive model themselves directly map onto pathways that we’ve been able to experimentally validate, which is very exciting,” Yang says.
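
As a rough, hypothetical sketch of that two-stage idea (not the team's actual pipeline or data): a mechanistic simulation first turns each experimental condition into interpretable metabolic-state features, and a standard regression model then ranks which of those features track antibiotic lethality. The simulation step below is a placeholder function, and every name, shape, and number is invented for illustration.

    # "White-box" pattern in miniature: mechanistic, named features first, then a
    # predictive model whose selected features map back onto biology. Placeholder only.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    pathways = ["purine", "pyrimidine", "amino_acid", "carbohydrate", "tca_cycle"]

    def simulate_metabolic_state(antibiotic, metabolite):
        """Stand-in for a genome-scale metabolism simulation (returns per-pathway fluxes)."""
        return rng.normal(size=len(pathways))

    # One row per screened condition (antibiotic plus supplemented metabolite)
    conditions = [(drug, f"metabolite_{i}")
                  for drug in ["ampicillin", "ciprofloxacin", "gentamicin"]
                  for i in range(200)]
    X = np.array([simulate_metabolic_state(a, m) for a, m in conditions])
    y = rng.uniform(size=len(conditions))     # measured cell survival (placeholder values)

    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

    # Because each feature is a named metabolic quantity, importances point at pathways
    for name, importance in sorted(zip(pathways, model.feature_importances_),
                                   key=lambda t: -t[1]):
        print(f"{name:>14s}  {importance:.3f}")

In the actual study, the metabolic states came from a genome-scale model of E. coli metabolism rather than from random numbers, which is what makes the features the model selects experimentally testable.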

Markus Covert, an associate professor of bioengineering at Stanford University, says the study is an important step toward showing that machine learning can be used to uncover the biological mechanisms that link inputs and outputs.

“Biology, especially for medical applications, is all about mechanism,” says Covert, who was not involved in the research. “You want to find something that is druggable. For the typical biologist, it hasn’t been meaningful to find these kinds of links without knowing why the inputs and outputs are linked.”

Metabolic stress

This model yielded the novel discovery that nucleotide metabolism, especially metabolism of purines such as adenine, plays a key role in antibiotics’ ability to kill bacterial cells. Antibiotic treatment leads to cellular stress, which causes cells to run low on purine nucleotides. The cells’ efforts to ramp up production of these nucleotides, which are necessary for copying DNA, boost the cells’ overall metabolism and lead to a buildup of harmful metabolic byproducts that can kill the cells.

“We now believe what’s going on is that in response to this very severe purine depletion, cells turn on purine metabolism to try to deal with that, but purine metabolism itself is very energetically expensive and so this amplifies the energetic imbalance that the cells are already facing,” Yang says.

The findings suggest that it may be possible to enhance the effects of some antibiotics by delivering them along with other drugs that stimulate metabolic activity. “If we can move the cells to a more energetically stressful state, and induce the cell to turn on more metabolic activity, this might be a way to potentiate antibiotics,” Yang says.

The “white-box” modeling approach used in this study could also be useful for studying how different types of drugs affect diseases such as cancer, diabetes, or neurodegenerative diseases, the researchers say. They are now using a similar approach to study how tuberculosis survives antibiotic treatment and becomes drug-resistant.

The research was funded by the Defense Threat Reduction Agency, the National Institutes of Health, the Novo Nordisk Foundation, the Paul G. Allen Frontiers Group, the Broad Institute of MIT and Harvard, and the Wyss Institute for Biologically Inspired Engineering.

Franklin Fisher, professor emeritus of economics, dies at 84

Wed, 05/08/2019 - 12:45pm

Franklin M. Fisher, the Jane Berkowitz Carlton and Dennis William Carlton Professor of Microeconomics, emeritus, died on April 29 at the age of 84. 

Fisher was born in New York City and received both his undergraduate degree and his PhD from Harvard University. He joined the MIT faculty in 1960, after a one-year post-PhD stint as an assistant professor at the University of Chicago. Fisher spent the rest of his career at MIT. In 2000, he was appointed the inaugural holder of the Jane Berkowitz Carlton and Dennis William Carlton Professorship of Microeconomics. He became a professor emeritus in 2004. 

Fisher was a versatile economist who made important contributions to economic theory, econometric methods, and the empirical analysis of firm and industry behavior. He was best known for his research on aggregation theory, estimation of simultaneous equation models, and the measurement and consequences of industry concentration. His contributions were widely celebrated. In 1973, he received the John Bates Clark Medal from the American Economic Association, an award then presented every other year to the American economist under the age of 40 who is judged to have made the most significant contributions to economic thought and knowledge. He served as president of the Econometric Society, was a fellow of the American Academy of Arts and Sciences, and held an honorary degree from the Hebrew University in Jerusalem.

Fisher was actively involved in antitrust policy. He served as the lead economic expert for IBM in the 1970s, when the U.S. Department of Justice sued the firm for anticompetitive behavior. After the case was settled in the early 1980s, he and co-authors John McGowan and Joen Greenwood published "Folded, Spindled and Mutilated," a comprehensive analysis of the economic issues in the case. Several years later, Fisher served as the lead expert for the Department of Justice in another high-profile antitrust case, U.S. v. Microsoft. 

Late in his career, Fisher’s interests shifted to the economics of water distribution in the Middle East, leading a team to model water resources and identify opportunities for gains from cross-border water trading. 

Fisher was a very popular teacher and an active dissertation adviser. He served as the primary adviser for 47 doctoral students, and as the secondary adviser (committee member) for dozens more. Five of his advisees are current MIT faculty members, including Nancy L. Rose ’85, the current department head for Economics. 

“Frank was a wonderful mentor whose lectures combined technical rigor with a rich interest in applied questions," Rose says. "He was in high demand as a dissertation supervisor, where his advice ranged from econometric specifications to the craft of writing.”

Fisher was active in a broad range of outside pursuits. He was a silver life master of duplicate bridge and an avid sailor. His work with numerous nonprofit organizations included presidencies of American Friends of Peace Now, the New Israel Fund, and the American Jewish Congress New England Region.

He is survived by his wife, Ellen Paradise Fisher of Cambridge, Massachusetts; three children: Abraham and Abigail of Belmont, Massachusetts, and Naomi of Ann Arbor, Michigan, and their spouses; and eight grandchildren.   
 

Gil Strang is still going strong, online and in print

Wed, 05/08/2019 - 12:30pm

MIT’s class 18.06 (Linear Algebra) has surpassed 10 million views on OpenCourseWare (OCW). That’s the kind of math that makes Professor Gilbert Strang one of the most recognized mathematicians in the world.

“That was a surprise to me,” says Strang. But not to those at OCW.

“He is a favorite; there is no way around it,” says OCW Director Curt Newton. Each month, OCW publishes a list of its most-visited courses, and Newton points out that Strang’s course has always been among the top 10 most-viewed since OCW launched. “He cracked the 10 million number,” he says. “It’s clear that Gil’s teaching has struck just the right chord with learners and educators around the world.”

Strang’s 18.06 lectures, posted between 2002 and 2011, also have more than 3.1 million YouTube views from math students in places such as India, China, and across Africa. “His lectures are just excellent,” explains math Professor Haynes Miller. To illustrate the videos’ massive popularity, Miller recounts a conversation, at the online Electronic Seminar on Mathematics Education, about revising a linear algebra course at the University of Illinois. “In the new version, they do almost no lecturing ... and one reason they feel that they can get away with that is that they can send students to Gil’s lectures on OCW.”

A linear path to MIT

Strang, the MathWorks Professor of Mathematics, received his BS from MIT in 1955. After earning his PhD from the University of California at Los Angeles in 1959, he returned to MIT to teach.

Strang began teaching linear algebra in the 1970s, during a time when engineers and scientists wrote large software packages using the finite element method to solve structural problems, computing forces and stresses in solid and fluid mechanics. Strang recalls his “Aha!” moment when he thought about the finite element method of solving partial differential equations using simple trial functions. With scientists generating a huge amount of data, from magnetic resonance scans producing millions of images to microarrays of entire genomes, the goal was to find structure and language to make sense of it all.

Once Strang realized that the tools of linear algebra were related to everything from pure math to the internet, he decided to change the way the subject was taught. The 18.06 class soon became popular with science and engineering students, at MIT and around the world. Now in its fifth edition, Strang’s textbook "Introduction to Linear Algebra" has been translated into French, German, Greek, Japanese, and Portuguese. More than 40 years later, about a third of MIT students take this course.

“I’m not teaching the math guys who jump over linear algebra,” he says. “18.06 is specifically for engineering and science and economics and management.”

Certainly one of the secrets to his OCW success is his teaching style. Strang has a quick smile and an encouraging manner. In his class, he says “please” and “thank you.” To gauge whether students are keeping up, he asks, “Am I OK?” or adds explanations and recaps. He strives for an interactive class by asking questions, and gives intuitions and pictures before presenting a formal proof. And the students seem delighted to see beautiful results emerge from seemingly simple constructions.

After a lifetime of teaching at MIT, he is still able to project energy and enthusiasm over his subject. In short, he’s a natural for video.

“My original motive for doing this was to encourage other faculty to do it, and maybe show them a new way to teach linear algebra,” he says. His first set of lectures was recorded in 1999 with support from the Lord Foundation of Massachusetts. The videos don’t feature fancy graphics or music, but are an homage to the power of old-school lectures with a chalkboard by a master teacher.

The most popular of Strang’s multiple 18.06 OCW versions is the enhanced 18.06SC “OCW Scholar” version, published in 2011. It adds problem-solving videos by grad students and postdocs patiently explaining a complex subject to a grateful audience, very much in the spirit of Strang’s lectures.

“This lecture series is one of the few that I like to watch for fun,” says one commenter. Adds another, “This teacher would be fun to sit down with and have a cup of coffee and conversation.” And a high school teacher says, “He is clear, interesting, and nonthreatening. I watch his linear algebra lessons and wish I could tell him how terrific he is.”

A new book

OCW will soon post 34 videos, along with an introduction, to his relatively new class 18.065 (Matrix Methods in Data Analysis, Signal Processing, and Machine Learning). To accompany the class, Strang recently released "Linear Algebra and Learning from Data," his 12th textbook.

Strang is known for his clear yet lively writing, and early reviews confirm that this new book continues his style. Even the book’s cover is evocative. He chose a photo his son Robert took, on Inle Lake in Myanmar, of a man on a boat holding a fishing net encased in a bamboo cage. The man is lifting up what Strang says resembles a neural net.

The class was a chance for Strang to expand his linear algebra teachings into the area of deep learning. This class debuted in 2017 when Professor Raj Rao Nadakuditi of the University of Michigan spent his sabbatical teaching 18.065 at MIT. For the class, professor of applied mathematics Alan Edelman introduced the powerful language Julia, while Strang explained the four fundamental subspaces and the Singular Value Decomposition.

“This was linear algebra for signals and data, and it was alive,” says Strang. “More important, this was the student response, too.”

Last spring, he started assembling the handouts and online materials into a book. Now in its third year, the class is held in 2-190 and is filled to capacity. In the class and book, Strang starts with linear algebra and moves to optimization by gradient descent, and then to the structure and analysis of deep learning. His goal is to organize central methods and ideas of data science, and to show how the language of linear algebra expresses those ideas.

“The new textbook is just the beginning, as the course invites students to ask their own questions and write their own programs. Exams are outlawed. A key point of the course is that it ends with a project from each student — and those projects are wonderful.”

His students agree.

“Professor Strang structures the class so that ideas seem to flow from the students into proofs,” says senior and math major Jesse Michel. “There’s a nice balance between proofs and examples, so that you know the approaches work in general, while never losing sight of practice. Every class includes a cool math trick or joke that keeps the class laughing. Professor Strang’s energy and emphasis on the exciting points keeps the class on the edge of their seats.”

Open means open

Haynes Miller says that all MIT faculty are invited to contribute courses to OCW. There are about 2,450 courses on OCW currently, with over 100 having complete video lectures, and more going up as fast as OCW can post them.

“OCW began under foundation grants, but is now supported by the provost here at MIT, corporate sponsors, and user donations,” says Miller. “I feel that MIT faculty are extremely lucky to have OpenCourseWare as a publication venue for courseware we design.”

Professor Emeritus David Gordon Wilson, expert in human-powered transport and gas turbines, dies at 91

Wed, 05/08/2019 - 10:40am

David Gordon Wilson, professor emeritus of mechanical engineering, passed away on May 2 at the age of 91. Wilson had served on MIT’s faculty since 1966 and remained an active member of the mechanical engineering community until his death.

Wilson was born in 1928 and grew up in Warwickshire, England. Inspired by his love for bicycles, Wilson studied engineering at the University of Birmingham, where he received his bachelor’s degree in 1948. He continued his education at the University of Nottingham, where he earned his PhD in 1953.

Upon completing his PhD, Wilson was given a postdoctoral Commonwealth Fund Fellowship to conduct research abroad at MIT and Harvard University. At the conclusion of his fellowship, Wilson worked at Boeing as a gas turbine engineer.

After briefly returning to the U.K., Wilson embarked on a two-year stint in Africa, where he taught at the University of Ibadan in Zaria, Nigeria. He also worked for Voluntary Service Overseas in Cameroon. A case of malaria forced Wilson to move home to England.

In 1960, Wilson was invited by the Northern Research and Engineering Corp. to serve as technical director and vice president. He was charged with leading efforts to form a London branch of the company that specialized in heat transfer and turbo-powered machinery.

At the invitation of Richard Soderberg, then the head of MIT’s Department of Mechanical Engineering, Wilson joined MIT’s faculty in 1966. He taught thermodynamics and mechanical design. As a professor, Wilson served as advisor to a number of students conducting research in turbomachinery, fluid mechanics, and various design topics.

While much of his primary research focused on turbine gas engines and jet engine design, Wilson parlayed a number of his passions into professional pursuits. His interest in transportation led to an appointment on a commission of the Massachusetts Bay Transportation Authority, where he gave recommendations on how to increase use and efficiency in public transportation. He also served on the Center for Transportation Studies.

Transportation was a key theme in Wilson’s career — not only in his research on jet engines, but also in a thread that would weave throughout his life: his love of bicycles. Wilson was particularly enamored with recumbent bicycles. In 1967, he helped organize an international design competition in human-powered land transport in an effort to get more people interested in bicycle design.

In 1974, Wilson released the first edition of "Bicycling Science." It became MIT Press' best-selling book and is regarded as the premier authority on bicycle design. Throughout the 1970s, he continued to design recumbent bicycles. He eventually designed the Avatar 2000, a bike that broke the International Human Powered Vehicle Association's world speed record in 1982.

Around the same time, Wilson studied fossil fuel emissions and human impact on the environment. He was a staunch advocate for a “carbon fee” to encourage companies to curb fossil fuel emissions and promote the adoption of renewable energy. This pursuit got him more engaged in government, and as a result he joined the Massachusetts chapter of the grassroots organization Common Cause. He also was co-founder of the Massachusetts Action on Smoking and Health, which advocated for nonsmokers' rights.

After 28 years on the faculty at MIT, Wilson retired in 1994. In 2001, he co-founded Wilson TurboPower, a company focused on the development of microturbines.

In retirement, Wilson remained an active member of the MIT community — often attending departmental meetings and serving as a faculty judge at the annual de Florez Awards. He is survived by his wife, Ellen Wilson, his two daughters, Erica Mandau and Susan Wilson, and his granddaughter.

A memorial service will be held on May 17 at 10 a.m. at The Parish of the Epiphany, 70 Church Street, Winchester, Massachusetts.

The (evolving) art of war

Wed, 05/08/2019 - 12:00am

In 1969, the Soviet Union moved troops and military equipment to its border with China, escalating tensions between the communist Cold War powers. In response, China created a new military strategy of “active defense” to repel an invading force near the border. There was just one catch: China did not actually implement its new strategy until 1980.

Which raises a question: How could China have waited a full decade before shifting its military posture in the face of an apparent threat to its existence?

“It really comes down to the politics of the Cultural Revolution,” says Taylor Fravel, a professor of political science at MIT and an expert in Chinese foreign policy and military thinking. “China was consumed with internal political upheaval.”

That is, through the mid-1970s, leader Mao Zedong and his hardline allies sought to impose their own visions of politics and society on the country. Those internal divisions, and the extraordinary political strife accompanying them, kept China from addressing its external threats — even though it might sorely have needed a new strategy at the time.

Indeed, Fravel believes, every major change in Chinese military strategy since 1949 — and there have been a few — has occurred in the same set of circumstances. Each time, the Chinese have recognized that global changes in warfare have occurred, but they have required political unity in Beijing to implement those changes. To understand the military thinking of one of the world’s superpowers, then, we need to understand its domestic politics.

Fravel has synthesized these observations in a new book, “Active Defense: China’s Military Strategy since 1949,” published by Princeton University Press. The book offers a uniquely thorough history of modern Chinese military thinking, a subject that many observers have regarded as inscrutable.

“One way to understand how great powers think about the use of military force is to examine their [formal] military strategy,” Fravel notes. “In this respect, China has not been studied as thoroughly or systematically as the other great powers.”

Rethinking Mao

Fravel’s book examines military thinking during the entire length of the People’s Republic of China, dating to 1949, when Mao led the communist takeover of the country. China was not at that point regarded as a serious military power, although Fravel notes that the country’s leaders were giving the idea of becoming one serious thought back then.

“I think some people might be surprised to learn that China has been dedicated to building a modern military, and thus thinking about strategy, since the birth of the People’s Republic,” Fravel says.

As Fravel sees it, based on a significant amount of original archival research, there are nine times in modern China’s history when the government has issued comprehensive new military strategies. These formal strategic plans, he thinks, are critical to understanding what Chinese leaders have thought about military force and how to use it.

“It’s an articulation of principles that should guide subsequent activities,” Fravel says.

Of these nine strategies, Fravel finds three to be particularly significant: those issued in 1956, 1980, and 1993. The first articulated a posture of forward defense meant to insulate the country from invasion, principally by the U.S.

By the 1960s, however, the country had shifted toward a different military posture, one more in line with Mao’s own thinking, which emphasized guerrilla-style retreat and the concession of territory in the face of a potential invasion. The idea, deployed by Mao in China’s civil war in the 1930s, was to wear an enemy down over time while presenting only elusive targets to opponents.

The Soviet massing of military forces just outside China in the late 1960s raised concerns that it might be better to pursue a more “active defense” — and thus the title of Fravel’s book — in which China positioned its armed forces to contain enemies near the border. But given all the internal political conflict (and leadership purges) within China, this shift did not gain enough traction to be implemented in the 1970s. Moreover, as a distinct change from Mao’s ideas, the notion of active defense required considerable political unity to be implemented.

“In that sense, it was profoundly different, and perhaps challenging to pursue,” Fravel says. “They had to de-emphasize one of Mao’s core strategic principles.”

Still, the new strategy became official policy, and remained such for over a decade — until Chinese military leaders watched the 1991 Gulf War on television and recognized that the new era of precision aerial warfare demanded another shift in strategy for them as well.

“I think in many countries, the Gulf War catalyzed a complete rethinking of warfare in very short order,” Fravel says.

And yet, even as this was occurring, China was experiencing yet another moment of internal political division, following the Tiananmen Square massacre of 1989. It took another year or two, and a new internal political consensus, before China could develop a new, contemporary strategy for fighting high-tech wars.

“What they wanted to do was really challenging,” Fravel says, noting that the new strategy required complex coordination of different military domains — air, sea, and land — that had not previously been unified.

The nuclear exception

China’s 1993 statement of strategy remains a guidepost for its current military thinking. However, as Fravel notes, there is one area of military force — nuclear weapons — that is an “exception to the rule” he postulates about policy following unity. China has possessed nuclear weapons since the 1960s, but it has always regarded them as a deterrent and has never threatened their first use.

“When you look at the nuclear domain, they’ve basically had the same strategic goal since testing their first device in 1964, which is to deter other countries from attacking China first with nuclear weapons,” Fravel says. “It’s also the one element of defense strategy never delegated by top party leaders. It was so important to them, they never let go of the authority to devise nuclear strategy.”

Other scholars regard “Active Defense” as a significant contribution to its field. Charles Glaser, a professor at George Washington University, states that Fravel “contributes significantly to our understanding of the evolution of China’s military strategy, and offers insightful theoretical arguments about civil-military relations.”

Avery Goldstein, a professor at the University of Pennsylvania, calls the book “an impressive achievement” and notes that Fravel “deftly draws on a wide range of literature about influences on military strategy” as well as “newly available sources of evidence” from historical archives.

For his part, Fravel says that identifying the strong pattern behind changes in China’s military strategy will also serve as a guide to the future.

“China is a country we know less about, in the study of international politics, than the other great powers,” Fravel says. “If there is a significant shift in the kinds of warfare in the international system, then China would be more likely to consider changing its military strategy.”
