MIT Latest News
New machine-learning application to help researchers predict chemical properties
A shared, fundamental goal of most chemistry researchers is predicting a molecule’s properties, such as its boiling or melting point. Once researchers can pin down those predictions, they can move forward with work that yields discoveries leading to medicines, materials, and more. Historically, however, the traditional methods of making these predictions have carried significant costs — time, wear and tear on equipment, and funds.
Enter a branch of artificial intelligence known as machine learning (ML). ML has lessened the burden of molecular property prediction to a degree, but the advanced tools that most effectively expedite the process — by learning from existing data to make rapid predictions for new molecules — require significant programming expertise. This creates an accessibility barrier for many chemists, who may not have the computational proficiency required to navigate the prediction pipeline.
To alleviate this challenge, researchers in the McGuire Research Group at MIT have created ChemXploreML, a user-friendly desktop app that helps chemists make these critical predictions without requiring advanced programming skills. Freely available, easy to download, and functional on mainstream platforms, this app is also built to operate entirely offline, which helps keep research data proprietary. The exciting new technology is outlined in an article published recently in the Journal of Chemical Information and Modeling.
One specific hurdle in chemical machine learning is translating molecular structures into a numerical language that computers can understand. ChemXploreML automates this complex process with powerful, built-in "molecular embedders" that transform chemical structures into informative numerical vectors. Next, the software implements state-of-the-art algorithms to identify patterns and accurately predict molecular properties like boiling and melting points, all through an intuitive, interactive graphical interface.
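The embed-then-predict pipeline described above can be sketched in a few lines. This is a toy illustration, not ChemXploreML’s actual code: the trigram-hashing “embedder” and the tiny boiling-point dataset are made-up stand-ins for real learned embedders such as Mol2Vec and for real training data.

```python
import zlib
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def embed_smiles(smiles: str, dim: int = 64) -> np.ndarray:
    """Toy 'molecular embedder': hash character trigrams of a SMILES
    string into a fixed-length count vector (a stand-in for learned
    embedders like Mol2Vec or VICGAE)."""
    vec = np.zeros(dim)
    for i in range(len(smiles) - 2):
        vec[zlib.crc32(smiles[i:i + 3].encode()) % dim] += 1
    return vec

# Tiny illustrative dataset: SMILES strings paired with boiling points (deg C).
train = [("CCO", 78.4), ("CCCO", 97.2), ("CCCCO", 117.7),
         ("CC(C)O", 82.5), ("CCOCC", 34.6), ("CCCCCO", 137.9)]

X = np.array([embed_smiles(s) for s, _ in train])
y = np.array([bp for _, bp in train])

# Fit a tree-ensemble regressor on the embedded molecules.
model = GradientBoostingRegressor(n_estimators=50, random_state=0)
model.fit(X, y)

# Predict the boiling point of a molecule not in the training set.
pred = model.predict(embed_smiles("CCCCCCO").reshape(1, -1))[0]
print(f"Predicted boiling point: {pred:.1f} deg C")
```

A real pipeline would use a chemically meaningful embedder and thousands of measured property values; the structure (embed, fit, predict) is the part ChemXploreML wraps behind a graphical interface.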
"The goal of ChemXploreML is to democratize the use of machine learning in the chemical sciences,” says Aravindh Nivas Marimuthu, a postdoc in the McGuire Group and lead author of the article. “By creating an intuitive, powerful, and offline-capable desktop application, we are putting state-of-the-art predictive modeling directly into the hands of chemists, regardless of their programming background. This work not only accelerates the search for new drugs and materials by making the screening process faster and cheaper, but its flexible design also opens doors for future innovations.”
ChemXploreML is designed to evolve over time, so as future techniques and algorithms are developed, they can be seamlessly integrated into the app, ensuring that researchers are always able to access and implement the most up-to-date methods. The application was tested on five key molecular properties of organic compounds — melting point, boiling point, vapor pressure, critical temperature, and critical pressure — and achieved high accuracy scores of up to 93 percent for the critical temperature. The researchers also demonstrated that a new, more compact method of representing molecules (VICGAE) was nearly as accurate as standard methods, such as Mol2Vec, but was up to 10 times faster.
“We envision a future where any researcher can easily customize and apply machine learning to solve unique challenges, from developing sustainable materials to exploring the complex chemistry of interstellar space,” says Marimuthu. Joining him on the paper is senior author and Class of 1943 Career Development Assistant Professor of Chemistry Brett McGuire.
Scientists apply optical pooled CRISPR screening to identify potential new Ebola drug targets
The following press release was issued today by the Broad Institute of MIT and Harvard.
Although outbreaks of Ebola virus are rare, the disease is severe and often fatal, with few treatment options. Rather than targeting the virus itself, one promising therapeutic approach would be to interrupt proteins in the human host cell that the virus relies upon. However, finding those regulators of viral infection using existing methods has been difficult and is especially challenging for the most dangerous viruses like Ebola that require stringent high-containment biosafety protocols.
Now, researchers at the Broad Institute and the National Emerging Infectious Diseases Laboratories (NEIDL) at Boston University have used an image-based screening method developed at the Broad to identify human genes that, when silenced, impair the Ebola virus’s ability to infect. The method, known as optical pooled screening (OPS), enabled the scientists to test, in about 40 million CRISPR-perturbed human cells, how silencing each gene in the human genome affects virus replication.
Using machine-learning-based analyses of images of perturbed cells, they identified multiple host proteins involved in various stages of Ebola infection that, when suppressed, crippled the ability of the virus to replicate. Those viral regulators could represent avenues to one day intervene therapeutically and reduce the severity of disease in people already infected with the virus. The approach could be used to explore the role of various proteins during infection with other pathogens, as a way to find new drugs for hard-to-treat infections.
The study appears in Nature Microbiology.
“This study demonstrates the power of OPS to probe the dependency of dangerous viruses like Ebola on host factors at all stages of the viral life cycle and explore new routes to improve human health,” said co-senior author Paul Blainey, a Broad core faculty member and professor in the Department of Biological Engineering at MIT.
Previously, members of the Blainey lab developed the optical pooled screening method as a way to combine the benefits of high-content imaging, which can show a range of detailed changes in large numbers of cells at once, with those of pooled perturbational screens, which show how genetic elements influence these changes. In this study, they partnered with the laboratory of Robert Davey at BU to apply optical pooled screening to Ebola virus.
The team used CRISPR to knock out each gene in the human genome, one at a time, in nearly 40 million human cells, and then infected each cell with Ebola virus. They next fixed those cells in place in laboratory dishes and inactivated them, so that the remaining processing could occur outside of the high-containment lab.
After taking images of the cells, they measured overall viral protein and RNA in each cell using the CellProfiler image analysis software, and to get even more information from the images, they turned to AI. With help from team members in the Eric and Wendy Schmidt Center at the Broad, led by study co-author and Broad core faculty member Caroline Uhler, they used a deep learning model to automatically determine the stage of Ebola infection for each single cell. The model was able to make subtle distinctions between stages of infection in a high-throughput way that wasn’t possible using prior methods.
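The staging step can be illustrated with a minimal stand-in. The study used a deep learning model working directly on cell images; the sketch below instead trains a simple linear classifier on synthetic per-cell feature vectors (the stage names, feature values, and class structure are all invented for illustration), which is the simplest version of the same idea: map each cell’s measurements to a discrete infection stage.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for per-cell image features (e.g., viral protein
# intensity, RNA signal, inclusion-body texture). Four hypothetical
# infection stages, each generating features around a different center.
stages = ["uninfected", "early", "inclusion-body", "late"]
n_per_stage = 200
X_parts, y_parts = [], []
for label, center in enumerate([0.0, 1.0, 2.5, 4.0]):
    X_parts.append(rng.normal(loc=center, scale=0.5, size=(n_per_stage, 3)))
    y_parts += [label] * n_per_stage
X = np.vstack(X_parts)
y = np.array(y_parts)

# Train a classifier that assigns a stage to each cell's feature vector.
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Stage a new cell from its (synthetic) measurements.
cell = np.array([[2.4, 2.6, 2.3]])
print(stages[clf.predict(cell)[0]])
```

The point of the deep learning model in the study was to extract such distinctions directly from raw images, at a scale and subtlety a hand-built feature pipeline could not match.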
“The work represents the deepest dive yet into how Ebola virus rewires the cell to cause disease, and the first real glimpse into the timing of that reprogramming,” said co-senior author Robert Davey, director of the National Emerging Infectious Diseases Laboratories at Boston University, and professor of microbiology at BU Chobanian and Avedisian School of Medicine. “AI gave us an unprecedented ability to do this at scale.”
By sequencing parts of the CRISPR guide RNA in all 40 million cells individually, the researchers determined which human gene had been silenced in each cell, indicating which host proteins (and potential viral regulators) were targeted. The analysis revealed hundreds of host proteins that, when silenced, altered overall infection level, including many required for viral entry into the cell.
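The deconvolution logic of a pooled screen can be sketched simply: look up which gene each cell’s sequenced guide targets, then aggregate the per-cell infection phenotypes by gene. The guide sequences and scores below are invented for illustration (NPC1 and UQCRB are real genes, but these numbers are not study data).

```python
from collections import defaultdict
from statistics import mean

# Hypothetical lookup from CRISPR guide sequence to its targeted gene.
guide_to_gene = {
    "ACGTACGTACGTACGTACGT": "NPC1",
    "TTGCAATTGCAATTGCAATT": "NPC1",
    "GGCCGGCCGGCCGGCCGGCC": "UQCRB",
    "CATGCATGCATGCATGCATG": "UQCRB",
}

# Per-cell observations: (guide sequenced in that cell, infection score
# measured from its image, e.g., normalized viral protein level).
cells = [
    ("ACGTACGTACGTACGTACGT", 0.05),
    ("TTGCAATTGCAATTGCAATT", 0.10),
    ("GGCCGGCCGGCCGGCCGGCC", 0.30),
    ("CATGCATGCATGCATGCATG", 0.25),
]

# Aggregate per-cell infection scores by the gene each guide targets.
per_gene = defaultdict(list)
for guide, score in cells:
    per_gene[guide_to_gene[guide]].append(score)

gene_effect = {gene: mean(vals) for gene, vals in per_gene.items()}

# Genes whose knockout most suppresses infection rank lowest.
for gene, eff in sorted(gene_effect.items(), key=lambda kv: kv[1]):
    print(gene, round(eff, 3))
```

At the study’s scale this aggregation runs over roughly 40 million cells and every gene in the genome, but the mapping-and-averaging structure is the same.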
Knocking out other genes enhanced the amount of virus within inclusion bodies, structures that form in the human cell to act as viral factories, and prevented the infection from progressing further. Some of these human genes, such as UQCRB, pointed to a previously unrecognized role for mitochondria in the Ebola virus infection process that could possibly be exploited therapeutically. Indeed, treating cells with a small molecule inhibitor of UQCRB reduced Ebola infection with no impact on the cell’s own health.
Other genes, when silenced, altered the balance between viral RNA and protein. For example, perturbing a gene called STRAP resulted in increased viral RNA relative to protein. The researchers are currently doing further studies in the lab to better understand the role of STRAP and other proteins in Ebola infection and whether they could be targeted therapeutically.
In a series of secondary screens, the scientists examined some of the highlighted genes’ roles in infection with related filoviruses. Silencing some of these genes interrupted replication of Sudan and Marburg viruses, which have high fatality rates and no approved treatments, so it’s possible a single treatment could be effective against multiple related viruses.
The study’s approach could also be used to examine other pathogens and emerging infectious diseases and look for new ways to treat them.
“With our method, we can measure many features at once and uncover new clues about the interplay between virus and host, in a way that’s not possible through other screening approaches,” said co-first author Rebecca Carlson, a former graduate researcher in the labs of Blainey and Nir Hacohen at the Broad, who co-led the work with co-first author J.J. Patten at Boston University.
This work was funded in part by the Broad Institute, the National Human Genome Research Institute, the Burroughs Wellcome Fund, the Fannie and John Hertz Foundation, the National Science Foundation, the George F. Carrier Postdoctoral Fellowship, the Eric and Wendy Schmidt Center at the Broad Institute, the National Institutes of Health, and the Office of Naval Research.
Astronomers discover star-shredding black holes hiding in dusty galaxies
Astronomers at MIT, Columbia University, and elsewhere have used NASA’s James Webb Space Telescope (JWST) to peer through the dust of nearby galaxies and into the aftermath of a black hole’s stellar feast.
In a study appearing today in Astrophysical Journal Letters, the researchers report that for the first time, JWST has observed several tidal disruption events — instances when a galaxy’s central black hole draws in a nearby star and whips up tidal forces that tear the star to shreds, giving off an enormous burst of energy in the process.
Scientists have observed about 100 tidal disruption events (TDEs) since the 1990s, mostly as X-ray or optical light that flashes across relatively dust-free galaxies. But as MIT researchers recently reported, there may be many more star-shredding events in the universe that are “hiding” in dustier, gas-veiled galaxies.
In their previous work, the team found that most of the X-ray and optical light that a TDE gives off can be obscured by a galaxy’s dust, and therefore can go unseen by traditional X-ray and optical telescopes. But that same burst of light can heat up the surrounding dust and generate a new signal, in the form of infrared light.
Now, the same researchers have used JWST — the world’s most powerful infrared detector — to study signals from four dusty galaxies where they suspect tidal disruption events have occurred. Within the dust, JWST detected clear fingerprints of black hole accretion, a process by which material, such as stellar debris, circles and eventually falls into a black hole. The telescope also detected patterns that are strikingly different from the dust that surrounds active galaxies, where the central black hole is constantly pulling in surrounding material.
Together, the observations confirm that a tidal disruption event did indeed occur in each of the four galaxies. What’s more, the researchers conclude that the four events were products of not active black holes but rather dormant ones, which experienced little to no activity until a star happened to pass by.
The new results highlight JWST’s potential to study in detail otherwise hidden tidal disruption events. They are also helping scientists to reveal key differences in the environments around active versus dormant black holes.
“These are the first JWST observations of tidal disruption events, and they look nothing like what we’ve ever seen before,” says lead author Megan Masterson, a graduate student in MIT’s Kavli Institute for Astrophysics and Space Research. “We’ve learned these are indeed powered by black hole accretion, and they don’t look like environments around normal active black holes. The fact that we’re now able to study what that dormant black hole environment actually looks like is an exciting aspect.”
The study’s MIT authors include Christos Panagiotou, Erin Kara, and Anna-Christina Eilers, along with Kishalay De of Columbia University and collaborators from multiple other institutions.
Seeing the light
The new study expands on the team’s previous work using another infrared detector — NASA’s Near-Earth Object Wide-field Infrared Survey Explorer (NEOWISE) mission. Using an algorithm developed by co-author Kishalay De of Columbia University, the team searched through a decade’s worth of data from the telescope, looking for infrared “transients,” or short peaks of infrared activity from otherwise quiet galaxies that could be signals of a black hole briefly waking up and feasting on a passing star. That search unearthed about a dozen signals that the group determined were likely produced by a tidal disruption event.
“With that study, we found these 12 sources that look just like TDEs,” Masterson says. “We made a lot of arguments about how the signals were very energetic, and the galaxies didn’t look like they were active before, so the signals must have been from a sudden TDE. But except for these little pieces, there was no direct evidence.”
With the much more sensitive capabilities of JWST, the researchers hoped to discern key “spectral lines,” or infrared light at specific wavelengths, that would be clear fingerprints of conditions associated with a tidal disruption event.
“With NEOWISE, it’s as if our eyes could only see red light or blue light, whereas with JWST, we’re seeing the full rainbow,” Masterson says.
A bona fide signal
In their new work, the group looked specifically for a peak in the infrared that could only be produced by black hole accretion, a process by which material is drawn toward a black hole in a circulating disk of gas. This disk produces an enormous amount of radiation that is so intense that it can kick out electrons from individual atoms. In particular, such accretion processes can blast several electrons out from atoms of neon, and the resulting ion can transition, releasing infrared radiation at a very specific wavelength that JWST can detect.
“There’s nothing else in the universe that can excite this gas to these energies, except for black hole accretion,” Masterson says.
The researchers searched for this smoking-gun signal in four of the 12 TDE candidates they previously identified. The four signals include: the closest tidal disruption event detected to date, located in a galaxy some 130 million light years away; a TDE that also exhibits a burst of X-ray light; a signal that may have been produced by gas circulating at incredibly high speeds around a central black hole; and a signal that also included an optical flash, which scientists had previously suspected to be a supernova, or the collapse of a dying star, rather than a tidal disruption event.
“These four signals were as close as we could get to a sure thing,” Masterson says. “But the JWST data helped us say definitively these are bona fide TDEs.”
When the team pointed JWST toward the galaxies of each of the four signals, in a program designed by De, they observed that the telltale spectral lines showed up in all four sources. These measurements confirmed that black hole accretion occurred in all four galaxies. But the question remained: Was this accretion a temporary feature, triggered by a tidal disruption and a black hole that briefly woke up to feast on a passing star? Or was this accretion a more permanent trait of “active” black holes that are always on? In the case of the latter, it would be less likely that a tidal disruption event had occurred.
To differentiate between the two possibilities, the team used the JWST data to detect another wavelength of infrared light, which indicates the presence of silicates, or dust in the galaxy. They then mapped this dust in each of the four galaxies and compared the patterns to those of active galaxies, which are known to harbor clumpy, donut-shaped dust clouds around the central black hole. Masterson observed that all four sources showed very different patterns compared to typical active galaxies, suggesting that the black hole at the center of each of the galaxies is not normally active, but dormant. If an accretion disk formed around such a black hole, the researchers conclude that it must have been a result of a tidal disruption event.
“Together, these observations say the only thing these flares could be are TDEs,” Masterson says.
She and her collaborators plan to uncover many more previously hidden tidal disruption events, with NEOWISE, JWST, and other infrared telescopes. With enough detections, they say TDEs can serve as effective probes of black hole properties. For instance, how much of a star is shredded, and how fast its debris is accreted and consumed, can reveal fundamental properties of a black hole, such as how massive it is and how fast it spins.
“The actual process of a black hole gobbling down all that stellar material takes a long time,” Masterson says. “It’s not an instantaneous process. And hopefully we can start to probe how long that process takes and what that environment looks like. No one knows because we just started discovering and studying these events.”
This research was supported, in part, by NASA.
Theory-guided strategy expands the scope of measurable quantum interactions
A new theory-guided framework could help scientists probe the properties of new semiconductors for next-generation microelectronic devices, or discover materials that boost the performance of quantum computers.
Research to develop new or better materials typically involves investigating properties that can be reliably measured with existing lab equipment, but this represents just a fraction of the properties that scientists could potentially probe in principle. Some properties remain effectively “invisible” because they are too difficult to capture directly with existing methods.
Take electron-phonon interaction — this property plays a critical role in a material’s electrical, thermal, optical, and superconducting properties, but directly capturing it using existing techniques is notoriously challenging.
Now, MIT researchers have proposed a theoretically justified approach that could turn this challenge into an opportunity. Their method reinterprets an often-overlooked interference effect in neutron scattering as a potential direct probe of electron-phonon coupling strength.
When a neutron beam strikes a material, it produces two distinct interaction effects. The researchers show that, by deliberately designing their experiment to leverage the interference between these two interactions, they can capture the strength of a material’s electron-phonon interaction.
The researchers’ theory-informed methodology could be used to shape the design of future experiments, opening the door to measuring new quantities that were previously out of reach.
“Rather than discovering new spectroscopy techniques by pure accident, we can use theory to justify and inform the design of our experiments and our physical equipment,” says Mingda Li, the Class of 1947 Career Development Professor and an associate professor of nuclear science and engineering, and senior author of a paper on this experimental method.
Li is joined on the paper by co-lead authors Chuliang Fu, an MIT postdoc; Phum Siriviboon and Artittaya Boonkird, both MIT graduate students; as well as others at MIT, the National Institute of Standards and Technology, the University of California at Riverside, Michigan State University, and Oak Ridge National Laboratory. The research appears this week in Materials Today Physics.
Investigating interference
Neutron scattering is a powerful measurement technique that involves aiming a beam of neutrons at a material and studying how the neutrons are scattered after they strike it. The method is ideal for measuring a material’s atomic structure and magnetic properties.
When neutrons collide with the material sample, they interact with it through two different mechanisms, creating a nuclear interaction and a magnetic interaction. These interactions can interfere with each other.
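In textbook form (see standard treatments of polarized neutron scattering), the scattered intensity contains both amplitudes plus a cross term. Schematically, writing the nuclear amplitude as $N(\mathbf{Q})$, the magnetic amplitude as $\mathbf{M}_{\perp}(\mathbf{Q})$, and the beam polarization as $\mathbf{P}$:

```latex
\frac{d\sigma}{d\Omega} \;\propto\;
|N(\mathbf{Q})|^{2}
\;+\; |\mathbf{M}_{\perp}(\mathbf{Q})|^{2}
\;+\; 2\,\mathbf{P}\cdot\operatorname{Re}\!\left[N^{*}(\mathbf{Q})\,\mathbf{M}_{\perp}(\mathbf{Q})\right]
```

The last term is the nuclear-magnetic interference at issue here; the paper’s specific result relating this term to the electron-phonon coupling strength is not reproduced in this sketch.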
“The scientific community has known about this interference effect for a long time, but researchers tend to view it as a complication that can obscure measurement signals. So it hasn’t received much focused attention,” Fu says.
The team and their collaborators took a conceptual “leap of faith” and decided to explore this oft-overlooked interference effect more deeply.
They flipped the traditional materials research approach on its head by starting with a multifaceted theoretical analysis. They explored what happens inside a material when the nuclear interaction and magnetic interaction interfere with each other.
Their analysis revealed that this interference pattern is directly proportional to the strength of the material’s electron-phonon interaction.
“This makes the interference effect a probe we can use to detect this interaction,” explains Siriviboon.
Electron-phonon interactions play a role in a wide range of material properties. They affect how heat flows through a material, impact a material’s ability to absorb and emit light, and can even lead to superconductivity.
But the complexity of these interactions makes them hard to directly measure using existing experimental techniques. Instead, researchers often rely on less precise, indirect methods to capture electron-phonon interactions.
However, leveraging this interference effect enables direct measurement of the electron-phonon interaction, a major advantage over other approaches.
“Being able to directly measure the electron-phonon interaction opens the door to many new possibilities,” says Boonkird.
Rethinking materials research
Based on their theoretical insights, the researchers designed an experimental setup to demonstrate their approach.
Since the available equipment wasn’t powerful enough for this type of neutron scattering experiment, they were only able to capture a weak electron-phonon interaction signal — but the results were clear enough to support their theory.
“These results justify the need for a new facility where the equipment might be 100 to 1,000 times more powerful, enabling scientists to clearly resolve the signal and measure the interaction,” adds Landry.
With improved neutron scattering facilities, like those proposed for the upcoming Second Target Station at Oak Ridge National Laboratory, this experimental method could be an effective technique for measuring many crucial material properties.
For instance, by helping scientists identify and harness better semiconductors, this approach could enable more energy-efficient appliances, faster wireless communication devices, and more reliable medical equipment like pacemakers and MRI scanners.
Ultimately, the team sees this work as a broader message about the need to rethink the materials research process.
“Using theoretical insights to design experimental setups in advance can help us redefine the properties we can measure,” Fu says.
To that end, the team and their collaborators are currently exploring other types of interactions they could leverage to investigate additional material properties.
“This is a very interesting paper,” says Jon Taylor, director of the neutron scattering division at Oak Ridge National Laboratory, who was not involved with this research. “It would be interesting to have a neutron scattering method that is directly sensitive to charge lattice interactions or more generally electronic effects that were not just magnetic moments. It seems that such an effect is expectedly rather small, so facilities like STS could really help develop that fundamental understanding of the interaction and also leverage such effects routinely for research.”
This work is funded, in part, by the U.S. Department of Energy and the National Science Foundation.
