Feed aggregator
Oil-backed startup begins sucking carbon from the ocean
Trump taps ex-NY congressman for mass transit czar
New Mexico’s ‘green amendment’ faces clean energy opposition
California Senate leader floats year-round peak firefighting staffing
A luxury house is close to tumbling into Cape Cod Bay. Will anyone stop it?
Monsoon floods kill 2 people, wreak havoc in Australia’s Queensland
Making climate-friendly lifestyle choices isn’t always easy, India learns
Observed multi-decadal increase in the surface ocean’s thermal inertia
Nature Climate Change, Published online: 06 February 2025; doi:10.1038/s41558-025-02245-w
Analysis of 42 years of daily sea surface temperature data shows increasing persistence of anomalies. These changes, which are attributed to deepening of the mixed layer, reduced oceanic forcing and reduced damping associated with stronger stratification, have implications for marine heatwave duration.
3 Questions: What the laws of physics tell us about CO2 removal
Human activities continue to pump billions of tons of carbon dioxide into the atmosphere each year, raising global temperatures and driving extreme weather events. As countries grapple with climate impacts and ways to significantly reduce carbon emissions, there have been various efforts to advance carbon dioxide removal (CDR) technologies that directly remove carbon dioxide from the air and sequester it for long periods of time.
Unlike carbon capture and storage technologies, which are designed to remove carbon dioxide at point sources such as fossil-fuel plants, CDR aims to remove carbon dioxide molecules that are already circulating in the atmosphere.
A new report by the American Physical Society and led by an MIT physicist provides an overview of the major experimental CDR approaches and determines their fundamental physical limits. The report focuses on methods that have the biggest potential for removing carbon dioxide, at the scale of gigatons per year, which is the magnitude that would be required to have a climate-stabilizing impact.
The new report was commissioned by the American Physical Society’s Panel on Public Affairs and appeared last week in the journal PRX Energy. The report was chaired by MIT professor of physics Washington Taylor, who spoke with MIT News about CDR’s physical limitations and why it’s worth pursuing in tandem with global efforts to reduce carbon emissions.
Q: What motivated you to look at carbon dioxide removal systems from a physical science perspective?
A: The number one thing driving climate change is the fact that we’re taking carbon that has been stuck in the ground for 100 million years, and putting it in the atmosphere, and that’s causing warming. In the last few years there’s been a lot of interest both by the government and private entities in finding technologies to directly remove the CO2 from the air.
How to manage atmospheric carbon is the critical question in dealing with our impact on Earth’s climate. So, it’s very important for us to understand whether we can affect the carbon levels not just by changing our emissions profile but also by directly taking carbon out of the atmosphere. Physics has a lot to say about this because the possibilities are very strongly constrained by thermodynamics, mass issues, and things like that.
Q: What carbon dioxide removal methods did you evaluate?
A: They’re all at an early stage. It's kind of the Wild West out there in terms of the different ways in which companies are proposing to remove carbon from the atmosphere. In this report, we break down CDR processes into two classes: cyclic and once-through.
Imagine we are in a boat that has a hole in the hull and is rapidly taking on water. Of course, we want to plug the hole as quickly as we can. But even once we have fixed the hole, we need to get the water out so we aren't in danger of sinking or getting swamped. And this is particularly urgent if we haven't completely fixed the hole so we still have a slow leak. Now, imagine we have a couple of options for how to get the water out so we don’t sink.
The first is a sponge that we can use to absorb water, that we can then squeeze out and reuse. That’s a cyclic process in the sense that we have some material that we’re using over and over. There are cyclic CDR processes like chemical “direct air capture” (DAC), which acts basically like a sponge. You set up a big system with fans that blow air past some material that captures carbon dioxide. When the material is saturated, you close off the system and then use energy to essentially squeeze out the carbon and store it in a deep repository. Then you can reuse the material, in a cyclic process.
The second class of approaches is what we call “once-through.” In the boat analogy, it would be as if you tried to fix the leak using rolls of paper towels: you let each roll saturate, throw it overboard, and use it only once.
There are once-through CDR approaches, like enhanced rock weathering, that are designed to accelerate a natural process, by which certain rocks, when exposed to air, will absorb carbon from the atmosphere. Worldwide, this natural rock weathering is estimated to remove about 1 gigaton of carbon each year. “Enhanced rock weathering” is a CDR approach where you would dig up a lot of this rock, grind it up really small, to less than the width of a human hair, to get the process to happen much faster. The idea is, you dig up something, spread it out, and absorb CO2 in one go.
The key difference between these two processes is that the cyclic process is subject to the second law of thermodynamics and there’s an energy constraint. You can set an actual limit from physics, saying any cyclic process is going to take a certain amount of energy, and that cannot be avoided. For example, we find that for cyclic direct-air-capture (DAC) plants, based on second law limits, the absolute minimum amount of energy you would need to capture a gigaton of carbon is comparable to the total yearly electric energy consumption of the state of Virginia. Systems currently under development use at least three to 10 times this much energy on a per ton basis (and capture tens of thousands, not billions, of tons). Such systems also need to move a lot of air; the air that would need to pass through a DAC system to capture a gigaton of CO2 is comparable to the amount of air that passes through all the air cooling systems on the planet.
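To make that benchmark concrete, here is a rough back-of-the-envelope version of the second-law bound (our illustrative numbers, assuming ambient CO2 near 420 ppm, a temperature of about 300 K, and a target of one gigaton of CO2; the report’s own calculation is more careful):

$$W_{\min} \approx RT \ln\frac{1}{x_{\mathrm{CO_2}}} \approx (8.314\ \mathrm{J\,mol^{-1}\,K^{-1}})(300\ \mathrm{K})\,\ln\frac{1}{4.2\times10^{-4}} \approx 19\ \mathrm{kJ\,mol^{-1}}$$

$$1\ \mathrm{Gt\,CO_2} \approx \frac{10^{15}\ \mathrm{g}}{44\ \mathrm{g\,mol^{-1}}} \approx 2.3\times10^{13}\ \mathrm{mol} \quad\Rightarrow\quad W_{\mathrm{total}} \gtrsim 4\times10^{17}\ \mathrm{J} \approx 120\ \mathrm{TWh},$$

which is indeed on the order of Virginia’s annual electricity consumption, and a strict floor that real systems exceed several-fold.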
On the other hand, if you have a once-through process, you could in some respects avoid the energy constraint, but now you’ve got a materials constraint due to the central laws of chemistry. For once-through processes like enhanced rock weathering, that means that if you want to capture a gigaton of CO2, roughly speaking, you’re going to need a billion tons of rock.
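The ton-per-ton scale follows from simple stoichiometry. Taking olivine (forsterite), one commonly cited weathering mineral, as an illustrative example (ours; the report surveys a range of rock types):

$$\mathrm{Mg_2SiO_4} + 4\,\mathrm{CO_2} + 4\,\mathrm{H_2O} \longrightarrow 2\,\mathrm{Mg^{2+}} + 4\,\mathrm{HCO_3^-} + \mathrm{H_4SiO_4}$$

Roughly 140 g of rock binds at most 176 g of CO2, so even in the ideal case the rock-to-CO2 mass ratio is of order one: removing a gigaton of CO2 means grinding and spreading on the order of a billion tons of rock.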
So, to capture gigatons of carbon through engineered methods requires tremendous amounts of physical material, air movement, and energy. On the other hand, everything we’re doing to put that CO2 in the atmosphere is extensive too, so large-scale emissions reductions face comparable challenges.
Q: What does the report conclude, in terms of whether and how to remove carbon dioxide from the atmosphere?
A: Our initial prejudice was, CDR is just going to take so much energy, and there’s no way around that because of the second law of thermodynamics, regardless of the method.
But as we discussed, there is this nuance about cyclic versus once-through systems. And there are two points of view that we ended up threading a needle between. One is the view that CDR is a silver bullet, and we’ll just do CDR and not worry about emissions — we’ll just suck it all out of the atmosphere. And that’s not the case. It will be really expensive, and will take a lot of energy and materials to do large-scale CDR. But there’s another view, where people say, don’t even think about CDR. Even thinking about CDR will compromise our efforts toward emissions reductions. The report comes down somewhere in the middle, saying that CDR is not a magic bullet, but also not a no-go.
If we are serious about managing climate change, we will likely want substantial CDR in addition to aggressive emissions reductions. The report concludes that research and development on CDR methods should be selectively and prudently pursued despite the expected cost and energy and material requirements.
At a policy level, the main message is that we need an economic and policy framework that incentivizes emissions reductions and CDR in a common framework; this would naturally allow the market to optimize climate solutions. Since in many cases it is much easier and cheaper to cut emissions than it will likely ever be to remove atmospheric carbon, clearly understanding the challenges of CDR should help motivate rapid emissions reductions.
For me, I’m optimistic in the sense that scientifically we understand what it will take to reduce emissions and to use CDR to bring CO2 levels down to a slightly lower level. Now, it’s really a societal and economic problem. I think humanity has the potential to solve these problems. I hope that we can find common ground so that we can take actions as a society that will benefit both humanity and the broader ecosystems on the planet, before we end up having bigger problems than we already have.
Seeking climate connections among the oceans’ smallest organisms
Andrew Babbin tries to pack light for work trips. Along with the travel essentials, though, he also brings a roll each of electrical tape, duct tape, lab tape, a pack of cable ties, and some bungee cords.
“It’s my MacGyver kit: You never know when you have to rig something on the fly in the field or fix a broken bag,” Babbin says.
The trips Babbin takes are far out to sea, on month-long cruises, where he works to sample waters off the Pacific coast and out in the open ocean. In remote locations, repair essentials often come in handy, as when Babbin had to zip-tie a wrench to a sampling device to help it sink through an icy Antarctic lake.
Babbin is an oceanographer and marine biogeochemist who studies marine microbes and the ways in which they control the cycling of nitrogen between the ocean and the atmosphere. This exchange helps maintain healthy ocean ecosystems and supports the ocean’s capacity to store carbon.
By combining measurements that he takes in the ocean with experiments in his MIT lab, Babbin is working to understand the connections between microbes and ocean nitrogen, which could in turn help scientists identify ways to maintain the ocean’s health and productivity. His work has taken him to many coastal and open-ocean regions around the globe.
“You really become an oceanographer and an Earth scientist to see the world,” says Babbin, who recently earned tenure as the Cecil and Ida Green Career Development Professor in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “We embrace the diversity of places and cultures on this planet. To see just a small fraction of that is special.”
A powerful cycle
The ocean has been a constant presence for Babbin since childhood. His family is from Monmouth County, New Jersey, where he and his twin sister grew up playing along the Jersey shore. When they were teenagers, their parents took the kids on family cruise vacations.
“I always loved being on the water,” he says. “My favorite parts of any of those cruises were the days at sea, where you were just in the middle of some ocean basin with water all around you.”
In school, Babbin gravitated to the sciences, and chemistry in particular. After high school, he attended Columbia University, where a visit to the school’s Earth and environmental engineering department catalyzed a realization.
“For me, it was always this excitement about the water and about chemistry, and it was this pop of, ‘Oh wow, it doesn’t have to be one or the other,’” Babbin says.
He chose to major in Earth and environmental engineering, with a concentration in water resources and climate risks. After graduating in 2008, Babbin returned to his home state, where he attended Princeton University and set a course for a PhD in geosciences, with a focus on chemical oceanography and environmental microbiology. His advisor, oceanographer Bess Ward, took Babbin on as a member of her research group and invited him on several month-long cruises to various parts of the eastern tropical Pacific.
“I still remember that first trip,” Babbin recalls. “It was a whirlwind. Everyone else had been to sea a gazillion times and was loading the boat and strapping things down, and I had no idea of anything. And within a few hours, I was doing an experiment as the ship rocked back and forth!”
Babbin learned to deploy sampling canisters overboard, then haul them back up and analyze the seawater inside for signs of nitrogen — an essential nutrient for all living things on Earth.
As it turns out, the plants and animals that depend on nitrogen to survive are unable to take it up from the atmosphere themselves. They require a sort of go-between, in the form of microbes that “fix” nitrogen, converting it from nitrogen gas to more digestible forms. In the ocean, this nitrogen fixation is done by highly specialized microbial species, which work to make nitrogen available to phytoplankton — microscopic plant-like organisms that are the foundation of the marine food chain. Phytoplankton are also a main route by which the ocean absorbs carbon dioxide from the atmosphere.
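The chemistry these specialized microbes carry out is, in canonical textbook form, the nitrogenase reaction (standard stoichiometry, not specific to Babbin’s work):

$$\mathrm{N_2} + 8\,\mathrm{H^+} + 8\,e^- + 16\,\mathrm{ATP} \longrightarrow 2\,\mathrm{NH_3} + \mathrm{H_2} + 16\,\mathrm{ADP} + 16\,\mathrm{P_i}$$

The steep ATP cost is one reason fixation is left to a few specialized microbes rather than performed by the plants and animals that need the nitrogen.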
Under certain conditions, microorganisms may also use these biologically available forms of nitrogen for energy, returning nitrogen to the atmosphere. These microbes can also release nitrous oxide as a byproduct, a potent greenhouse gas that can also catalyze ozone loss in the stratosphere.
Through his graduate work, at sea and in the lab, Babbin became fascinated with the cycling of nitrogen and the role that nitrogen-fixing microbes play in supporting the ocean’s ecosystems and the climate overall. A balance of nitrogen inputs and outputs sustains phytoplankton and maintains the ocean’s ability to soak up carbon dioxide.
“Some of the really pressing questions in ocean biogeochemistry pertain to this cycling of nitrogen,” Babbin says. “Understanding the ways in which this one element cycles through the ocean, and how it is central to ecosystem health and the planet’s climate, has been really powerful.”
In the lab and out to sea
After completing his PhD in 2014, Babbin arrived at MIT as a postdoc in the Department of Civil and Environmental Engineering.
“My first feeling when I came here was, wow, this really is a nerd’s playground,” Babbin says. “I embraced being part of a culture where we seek to understand the world better, while also doing the things we really want to do.”
In 2017, he accepted a faculty position in MIT’s Department of Earth, Atmospheric and Planetary Sciences. He set up his laboratory space, painted in his favorite brilliant orange, on the top floor of the Green Building.
His group uses 3D printers to fabricate microfluidic devices in which they reproduce the conditions of the ocean environment and study microbe metabolism and its effects on marine chemistry. In the field, Babbin has led research expeditions to the Galapagos Islands and parts of the eastern Pacific, where he has collected and analyzed samples of air and water for signs of nitrogen transformations and microbial activity. His new measuring station in the Galapagos is able to infer marine emissions of nitrous oxide across a large swath of the eastern tropical Pacific Ocean. His group has also sailed to southern Cuba, where the researchers studied interactions of microbes in coral reefs.
Most recently, Babbin traveled to Antarctica, where he set up camp next to frozen lakes and plumbed for samples of pristine ice water that he will analyze for genetic remnants of ancient microbes. Such preserved bacterial DNA could help scientists understand how microbes evolved and influenced the Earth’s climate over billions of years.
“Microbes are the terraformers,” Babbin notes. “They have been, since life evolved more than 3 billion years ago. We have to think about how they shape the natural world and how they will respond to the Anthropocene as humans monkey with the planet ourselves.”
Collective action
Babbin is now charting new research directions. In addition to his work at sea and in the lab, he is venturing into engineering, with a new project to design denitrifying capsules. While nitrogen is an essential nutrient for maintaining a marine ecosystem, too much nitrogen, such as from fertilizer that runs off into lakes and streams, can generate blooms of toxic algae. Babbin is looking to design eco-friendly capsules that scrub excess anthropogenic nitrogen from local waterways.
He’s also beginning the process of designing a new sensor to measure low oxygen concentrations in the ocean. As the planet warms, the oceans are losing oxygen, creating “dead zones” where fish cannot survive. While others, including Babbin, have tried to map these oxygen minimum zones, or OMZs, they have done so sporadically, by dropping sensors into the ocean over limited ranges, depths, and times. Babbin’s sensors could potentially provide a more complete map of OMZs, as they would be deployed on wide-ranging, deep-diving, and naturally propulsive vehicles: sharks.
“We want to measure oxygen. Sharks need oxygen. And if you look at where the sharks don’t go, you might have a sense of where the oxygen is not,” says Babbin, who is working with marine biologists on ways to tag sharks with oxygen sensors. “A number of these large pelagic fish move up and down the water column frequently, so you can map the depth to which they dive, and infer something about their behavior. And my suggestion is, you might also infer something about the ocean’s chemistry.”
When he reflects on what stimulates new ideas and research directions, Babbin credits working with others, in his own group and across MIT.
“My best thoughts come from this collective action,” Babbin says. “Particularly because we all have different upbringings and approach things from a different perspective.”
He’s bringing this collaborative spirit to his new role, as a mission director for MIT’s Climate Project. Along with Jesse Kroll, who is a professor of civil and environmental engineering and of chemical engineering, Babbin co-leads one of the project’s six missions: Restoring the Atmosphere, Protecting the Land and Oceans. Babbin and Kroll are planning a number of workshops across campus that they hope will generate new connections, and spark new ideas, particularly around ways to evaluate the effectiveness of different climate mitigation strategies and better assess the impacts of climate on society.
“One area we want to promote is thinking of climate science and climate interventions as two sides of the same coin,” Babbin says. “There’s so much action that’s trying to be catalyzed. But we want it to be the best action. Because we really have one shot at doing this. Time is of the essence.”
Closing the Gap in Encryption on Mobile
It’s time to expand encryption on Android and iPhone. With governments around the world engaging in constant attacks on users’ digital rights and access to the internet, taking glaring and potentially dangerous targets off of people’s backs when they use their mobile phones is more important than ever.
So far we have seen strides toward keeping messages private on mobile devices with end-to-end encrypted apps like Signal, WhatsApp, and iMessage. Encryption on the web has been widely adopted; we even declared in 2021 that “HTTPS Is Actually Everywhere.” Most web traffic is encrypted, and for a website to have a reputable presence with browsers, it has to meet certain requirements that major browsers enforce today. Mechanisms like certificate transparency, cross-origin resource sharing (CORS) rules, and HTTPS enforcement help protect users from malicious activity every day.
Yet mobile has always been a different and ever-expanding context. You access the internet on mobile devices through more than just the web browser, and mobile applications have more room to make network requests without the user ever knowing where and when a request was sent: there is no URL bar where the user can see and check a request’s destination. In some cases, apps have been known to “roll their own” cryptography instead of following standard encryption practices.
While there is much to discuss on the privacy issues of TikTok and other social media apps, for now let’s just focus on encryption. In 2020, security researcher Baptiste Robert found that TikTok used its own “custom encryption,” dubbed “ttEncrypt.” Later research showed this was a weak encryption algorithm compared to simply using HTTPS. TikTok eventually replaced ttEncrypt with HTTPS, but this is one example of the many practices mobile applications can engage in without much regulation, transparency, or control by the user.
Android has made some strides to protect users’ traffic in apps, like allowing you to set private DNS. Yet Android app developers can still set a flag that permits cleartext (unencrypted) requests, and Android owners should be able to block app requests engaging in this practice. While security settings can be difficult for users to configure on their own, this would be a valuable setting to provide, especially since users are currently being bombarded on their devices to turn on features they didn’t even ask for or want. Blocking this flag can’t capture all cleartext traffic, because apps can access the network stack “below” HTTPS, but it would be a good first step for the many apps that still make unencrypted HTTP requests.
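For context, the flag in question lives in an app’s own configuration, so it is the developer, not the device owner, who decides. A minimal sketch of the standard Android settings involved:

```xml
<!-- AndroidManifest.xml: opt the app out of cleartext HTTP entirely. -->
<application
    android:usesCleartextTraffic="false"
    android:networkSecurityConfig="@xml/network_security_config" />

<!-- res/xml/network_security_config.xml: the same policy expressed in the
     network security configuration, which takes precedence on Android 7.0+. -->
<network-security-config>
    <base-config cleartextTrafficPermitted="false" />
</network-security-config>
```

Setting either value to "true" re-enables unencrypted requests, and there is no system-wide toggle a user can flip to override that choice across all apps.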
As for iOS, Apple introduced a feature called iCloud Private Relay. In Apple’s words, “iCloud Private Relay is designed to protect your privacy by ensuring that when you browse the web in Safari, no single party — not even Apple — can see both who you are and what sites you're visiting.” This helps shield your IP address from the websites you visit, and it is a useful alternative for people who use VPNs for IP masking. In several countries engaging in internet censorship and digital surveillance, using a VPN can put a target on your back, so it’s more pertinent than ever to be able to browse privately on your devices without setting off alarms. But Private Relay sits behind an iCloud+ subscription and is only available in Safari. It would be better to make it free and expand Private Relay across more of iOS, especially to apps.
There are nuances as to why Private Relay isn’t like a traditional VPN. The “first hop” exposes your IP address to Apple and your internet service provider, but neither party can see the names of the websites you request. Apple is vague about the “second relay,” stating only that “The second internet relay is operated by third-party partners who are some of the largest content delivery networks (CDNs) in the world.” Cloudflare has confirmed that it is one of those partners, and its explanation goes further, noting that Private Relay is built on the TLS 1.3, QUIC, and MASQUE standards.
A similar combination of protocols can be had on Android by using Cloudflare’s 1.1.1.1 app, which is the closest technical match and applies globally rather than just in the browser. A more favorable outcome would be deploying this technology on mobile in a way that doesn’t rely on a single company to distribute modern encryption. Android’s Private DNS setting allows a choice among providers, but it covers only the encrypted-DNS part of the request.
VPNs are another tool that can mask an IP address and circumvent censorship, especially where someone distrusts their internet service provider (ISP). But using VPNs for this sole purpose should become obsolete as modern encryption protocols that protect the user are deployed. Better encryption practices across mobile platforms would lessen the need for people to flock to potentially nefarious VPN apps that put them in danger. Android just announced a new badge program that attempts to address this issue by getting VPNs to adhere to Play Store security guidelines and Mobile Application Security Assessment (MASA) Level 2 validation. That attempt is welcome, but when mass censorship is applied, users may not always go to the most reputable VPN, or may not be able to access reputable VPNs at all, because Google and Apple comply with app store takedowns. So widening encryption outside of VPN usage is essential. Blocking cleartext requests by apps, allowing users to restrict an app’s network access, and expanding Apple’s Private Relay would be steps in the right direction.
There are many other privacy leaks apps can engage in that expose what you are doing. When apps behave badly, whether by rolling their own unverified cryptography or by using plain HTTP, users should be able to block those apps’ network access. The problem of mobile privacy is complex, but that complexity shouldn’t stall progress. We can have a more private internet on our phones. “Encrypt all the things!” includes the devices we use most to access the web and communicate with each other every day.
Paraguay’s Broadband Providers Continue to Struggle to Attain Best Practices at Protecting Users’ Data
Paraguay’s five leading broadband service providers made some strides in making their privacy policies more accessible to the public, but continue to fall short in their commitments to transparency, due process in sharing metadata with authorities, and promoting human rights—all of which limits users’ privacy rights, according to the new edition of TEDIC’s ¿Quién Defiende Tus Datos? (“Who Defends Your Data?”).
The report shows that, in general, providers operating as subsidiaries of foreign companies are making more progress in committing to user privacy than national internet providers. But the overall performance of the country’s providers continues to lag behind their counterparts in the region.
As in its four previous reports about Paraguay, TEDIC evaluated Claro, Personal, and Tigo, which are subsidiaries, and national providers Copaco and Vox.
The companies were evaluated on seven criteria: whether they provide clear and comprehensive information about how they collect, share, and store user data; require judicial authorization to disclose metadata and communication content to authorities; notify users whose data is turned over to the government; publicly take a stance in support of rights protections; publish transparency reports; provide guidelines for security forces and other government bodies on how to request user information; and make their websites accessible to people with disabilities.
Tigo performed best, demonstrating 73% overall compliance with the criteria, while Vox came in last, receiving credit for complying with only 5% of the requirements.
Paraguay’s full study is available in Spanish.
Privacy, Judicial Authorization Policies Lag
The report shows that Claro, Personal, and Tigo provide relatively detailed information on data collection and processing practices, but none clearly describes data retention periods, a crucial aspect of data protection. Copaco, despite having a privacy policy, limits its scope to data collected on its applications, neglecting to address data processing practices for its services, such as internet and telephone. Vox has no publicly available privacy policy.
On the plus side, three of the five providers in the report met all criteria in the privacy policy category; no company disclosed its data collection policies at all when TEDIC’s reports began in 2017. The progress, though slow, is notable given that Paraguay doesn’t have a comprehensive data protection law—it is one of the few Latin American countries without one. A bill is pending in Paraguay’s Parliament, but it has not yet been approved.
All five providers require a court order before handing over user information, but the report concludes that their policies don’t cover communications metadata. This is despite international human rights standards on surveillance, established in the Inter-American Court of Human Rights rulings Escher v. Brazil (2009) and CAJAR v. Colombia (2023), which hold that metadata deserves the same privacy protections as the content of communications.
None of the five ISPs has a policy of notifying users when their data is requested by the authorities. This lack of transparency, already identified in all previous editions of QDTD, raises significant concerns about user rights and due process protections in Paraguay.
While no provider has made a strong commitment to publicly promote human rights, Tigo met three of the four requirements for full credit in this category, and Claro received half credit; in both cases the credit stems from the policies of their parent companies rather than direct commitments by their local units. Tigo and Claro are also the companies with the most user security campaigns identified across the editions of ¿Quién Defiende Tus Datos?
Claro and Tigo also provide some transparency about government requests for user data, but these reports are accessible only on their parent companies’ websites. Even then, the regional transparency reports do not always provide detailed country-level breakdowns, making it difficult to assess the specific practices and compliance rates of their national subsidiaries.
David McGee named head of the Department of Earth, Atmospheric and Planetary Sciences
David McGee, the William R. Kenan Jr. Professor of Earth and Planetary Sciences at MIT, was recently appointed head of the MIT Department of Earth, Atmospheric and Planetary Sciences (EAPS), effective Jan. 15. He assumes the role from Professor Robert van der Hilst, the Schlumberger Professor of Earth and Planetary Sciences, who led the department for 13 years.
McGee specializes in applying isotope geochemistry and geochronology to reconstruct Earth’s climate history, helping to ground-truth our understanding of how the climate system responds during periods of rapid change. He has also been instrumental in the growth of the department’s community and culture, having served as EAPS associate department head since 2020.
“David is an amazing researcher who brings crucial, data-based insights to aid our response to climate change,” says Nergis Mavalvala, dean of the School of Science and the Curtis (1963) and Kathleen Marble Professor of Astrophysics. “He is also a committed and caring educator, providing extraordinary investment in his students’ learning experiences, and through his direction of Terrascope, one of our unique first-year learning communities focused on generating solutions to sustainability challenges.”
“I am energized by the incredible EAPS community, by Rob’s leadership over the last 13 years, and by President Kornbluth’s call for MIT to innovate effective and wise responses to climate change,” says McGee. “EAPS has a unique role in this time of reckoning with planetary boundaries — our collective path forward needs to be guided by a deep understanding of the Earth system and a clear sense of our place in the universe.”
McGee’s research seeks to understand the Earth system’s response to past climate changes. Using geochemical analysis and uranium-series dating, McGee and his group investigate stalagmites, ancient lake deposits, and deep-sea sediments from field sites around the world to trace patterns of wind and precipitation, water availability in drylands, and permafrost stability through space and time. Armed with precise chronologies, he aims to shed light on drivers of historical hydroclimatic shifts and provide quantitative tests of climate model performance.
Beyond research, McGee has helped shape numerous Institute initiatives focused on environment, climate, and sustainability, including serving on the MIT Climate and Sustainability Consortium Faculty Steering Committee and the faculty advisory board for the MIT Environment and Sustainability Minor.
McGee also co-chaired MIT's Climate Education Working Group, one of three working groups established under the Institute's Fast Forward climate action plan. The group identified opportunities to strengthen climate- and sustainability-related education at the Institute, from curricular offerings to experiential learning opportunities and beyond.
In April 2023, the working group hosted the MIT Symposium for Advancing Climate Education, featuring talks by McGee and others on how colleges and universities can innovate and help students develop the skills, capacities, and perspectives they’ll need to live, lead, and thrive in a world being remade by the accelerating climate crisis.
“David is reimagining MIT undergraduate education to include meaningful collaborations with communities outside of MIT, teaching students that scientific discovery is important, but not always enough to make impact for society,” says van der Hilst. “He will help shape the future of the department with this vital perspective.”
From the start of his career, McGee has been dedicated to sharing his love of exploration with students. He earned a master’s degree in teaching and spent seven years as a teacher in middle school and high school classrooms before earning his PhD in Earth and environmental sciences from Columbia University. He joined the MIT faculty in 2012, and in 2018 received the Excellence in Mentoring Award from MIT’s Undergraduate Advising and Academic Programming office. In 2015, he became the director of MIT’s Terrascope first-year learning community.
“David's exemplary teaching in Terrascope comes through his understanding that effective solutions must be found where science intersects with community engagement to forge ethical paths forward,” adds van der Hilst. In 2023, for his work with Terrascope, McGee received the school’s highest award, the School of Science Teaching Prize. In 2022, he was named a Margaret MacVicar Faculty Fellow, the highest teaching honor at MIT.
As associate department head, McGee worked alongside van der Hilst and student leaders to promote EAPS community engagement, improve internal supports and reporting structures, and bolster opportunities for students to pursue advanced degrees and STEM careers.
Victory! EFF Helps Defeat Meritless Lawsuit Against Journalist
Jack Poulson is a reporter, and when a confidential source sent him the police report of a tech CEO’s arrest for felony domestic violence, he did what journalists do: reported the news.
The CEO, Maury Blackman, didn’t like that. So he sued Poulson—along with Amazon Web Services, Substack, and Poulson’s nonprofit, Tech Inquiry—to try to force Poulson to take down his articles about the arrest. Blackman argued that a court order sealing the arrest record allowed him to censor the internet—despite decades of contrary precedent from the Supreme Court and the California Courts of Appeal.
This is a classic SLAPP: strategic lawsuit against public participation. Fortunately, California’s anti-SLAPP statute provides a way for defendants to swiftly defeat baseless claims designed to chill their free speech.
The court granted Poulson’s motion to strike Blackman’s complaint under the anti-SLAPP statute on Tuesday.
In its order, the court agreed that the First Amendment protects Poulson’s right to publish and report on the incident report.
This is an important ruling.
Under Bartnicki v. Vopper, the First Amendment protects journalists who report on truthful matters of public concern, even when the information they are reporting on was obtained illegally by someone else. Without it, reporters would face liability when they report on information provided by whistleblowers that companies or the government wants to keep secret.
Those principles were upheld here: Although courts have the power to seal records in appropriate cases, if and when someone provides a copy of a sealed record to a reporter, the reporter shouldn’t be forced to ignore the newsworthy information in that record. Instead, they should be allowed to do what journalists do: report the news.
And thanks to the First Amendment, a journalist who hasn’t done anything illegal to obtain the information has the right to publish it.
The court agreed that Poulson’s First Amendment defense defeated all of Blackman’s claims. As the court said:
"This court is persuaded that the First Amendment’s protections for the publication of truthful speech concerning matters of public interest vitiate Blackman’s merits showing…in this case there is no evidence that Poulson and the other defendants knew the arrest was sealed before Poulson reported on it, and all defendants’ actions in not taking down the arrest information after Blackman informed them of the sealing order was not so wrongful or unlawful that they are not protected."
The court also agreed that CEOs like Blackman cannot rewrite history by obtaining court orders that seal unflattering information—like an arrest for felony domestic violence. Blackman argued that, because, under California law, sealed arrests are “deemed” not to have occurred for certain legal purposes, reporting that he had been arrested was somehow false—and actionable. It isn’t.
The court agreed with Poulson: statutory language that alleviates some of the consequences of an arrest “cannot alter how past events unfolded.”
Simply put, no one can use the legal system to rewrite history.
EFF is thrilled that the court agrees.
DDoSed by Policy: Website Takedowns and Keeping Information Alive
Who needs a DDoS (distributed denial of service) attack when you have a new president? As of February 2nd, thousands of web pages and datasets have been removed from U.S. government agencies following a series of executive orders. The impacts span the Department of Veterans Affairs and the Centers for Disease Control and Prevention, all the way to programs like Head Start.
Government workers had just two days to carry out sweeping takedowns and rewrites due to a memo from the Office of Personnel Management. The memo cites a recent executive order attacking Trans people and further stigmatizing them by forbidding words used to accurately describe sex and gender. The result was government-mandated censorship to erase these identities from a broad swath of websites, resources, and scientific research, regardless of context. This flurry of confusion comes on the heels of another executive order threatening CDC research by denying funding to government programs that promote diversity, equity, and inclusion or climate justice. What we’re left with is an anti-science, anti-speech, and just plain dangerous fit of panic with untold impacts on the most vulnerable communities.
The good news is technologists, academics, librarians, and open access organizations rushed to action to preserve and archive the information once contained on these sites. While the memo’s deadline has passed, these efforts are ongoing and you can still help.
New administrations often revise government pages to reflect new policies, though they are usually archived, not erased. These takedowns are alarming because they go beyond the usual changes in power, and could deprive the public of vital information, including scientific research impacting many different areas ranging from life saving medical research to the deadly impacts of climate change.
To help mitigate the damage, institutions like the Internet Archive provided essential tools to fight these memory holes, such as their “End of Term” archives, which include public-facing websites (.gov, .mil, etc) in the Legislative, Executive, and Judicial branches of the government. But anyone can use the Wayback Machine for other sites and pages: if you have something that needs archiving, you can easily do so here. Submitted links will be backed up and can be compared to previous versions of the site. Even if you do not have direct access to a website's full backup or database, saving the content of a page can often be enough to restore it later. While the Wayback archive is surprisingly extensive, some sites or changes still slip through the cracks, so it is always worth submitting them to be sure the archive is complete.
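As an illustration of how low the barrier to archiving is, a page can be submitted to the Wayback Machine’s public Save Page Now endpoint in a few lines (a minimal sketch; for bulk or authenticated captures, the Internet Archive’s SPN2 API is the more robust route):

```python
import requests

def save_to_wayback(url: str) -> int:
    """Ask the Wayback Machine to capture `url`; returns the HTTP status code."""
    # Fetching https://web.archive.org/save/<url> triggers a capture, the same
    # mechanism behind the "Save Page Now" form on the Wayback Machine site.
    resp = requests.get("https://web.archive.org/save/" + url, timeout=120)
    return resp.status_code

if __name__ == "__main__":
    print(save_to_wayback("https://www.data.gov/"))
```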
Academics are also in a unique position to protect established science and historical record of this public data. Library Innovation Lab at Harvard Law School, for example, has been preserving websites for courts and law journals. This has included hundreds of thousands of valuable datasets from data.gov, government git repositories, and more. This initiative is also building new open-source tools so that others can also make verifiable backups.
The impact of these executive orders goes beyond public-facing website content. The CDC, affected by both executive orders, also hosts vital scientific research data. If someone from the CDC wanted to back up vital scientific research that isn’t public-facing, there are other road maps as well. Sci-Hub, a project to provide free and unrestricted access to all scientific knowledge, containing 85 million scientific articles, was kept alive by individuals downloading and seeding 850 torrents of its 77 TB library. A community of “data hoarders,” independent archivists who declare a “rescue target” and build a “rescue team” of storage and seeders, is also archiving public datasets, like those formerly available at data.cdc.gov, which were not saved in the Internet Archive’s End of Term Archive.
Dedicating time to salvage, upload, and stop critical data from going dark, as well as rehosting later, is not for everyone, but is an important way to fight back against these kinds of takedowns.
Maintaining Support for Open Information
This widespread deletion of information is one of the reasons EFF is particularly concerned with government-mandated censorship in any context: It can be extremely difficult to know how exactly to comply, and it’s often easier to broadly remove huge swathes of information rather than risk punishment. By rooting out inconvenient truths and inconvenient identities, untold harms are done to the people most removed from power, and everyone’s well-being is diminished.
Proponents of open information have won hard-fought censorship battles in the past, and those battles helped create the tools and infrastructure needed to protect us in this moment. The global collaborative efforts afforded by digital technology mean the internet rarely forgets, all thanks to the tireless work of institutions, communities, and individuals in the face of powerful and erratic censors.
We appreciate those who have stepped in. These groups need constant support, especially our allies who have had their work threatened, and so EFF will continue to advocate for both their efforts and for policies which protect progress, research, and open information.
Study in India shows kids use different math skills at work vs. school
In India, many kids who work in retail markets have good math skills: They can quickly perform a range of calculations to complete transactions. But as a new study shows, these kids often perform much worse on the same kinds of problems when the problems are posed the way they are taught in the classroom. This happens even though many of these students still attend school, or attended school through 7th or 8th grade.
Conversely, the study also finds, Indian students who are still enrolled in school and don’t have jobs do better on school-type math problems, but they often fare poorly at the kinds of problems that occur in marketplaces.
Overall, both the “market kids” and the “school kids” struggle with the approach the other group is proficient in, raising questions about how to help both groups learn math more comprehensively.
“For the school kids, they do worse when you go from an abstract problem to a concrete problem,” says MIT economist Esther Duflo, co-author of a new paper detailing the study’s results. “For the market kids, it’s the opposite.”
Indeed, the kids with jobs who are also in school “underperform despite being extraordinarily good at mental math,” says Abhijit Banerjee, an MIT economist and another co-author of the paper. “That for me was always the revelation, that the one doesn’t translate into the other.”
The paper, “Children’s arithmetic skills do not transfer between applied and academic math,” is published today in Nature. The authors are Banerjee, the Ford Professor of Economics at MIT; Swati Bhattacharjee of the newspaper Ananda Bazar Patrika, in Kolkata, India; Raghabendra Chattopadhyay of the Indian Institute of Management in Kolkata; Duflo, the Abdul Latif Jameel Professor of Poverty Alleviation and Development Economics at MIT; Alejandro J. Ganimian, a professor of applied psychology and economics at New York University; Kailash Rajaha, a doctoral candidate in economics at MIT; and Elizabeth S. Spelke, a professor of psychology at Harvard University.
Duflo and Banerjee shared the Nobel Prize in Economics in 2019 and are co-founders of MIT’s Abdul Latif Jameel Poverty Action Lab (J-PAL), a global leader in development economics.
Three experiments
The study consists largely of three data-collection exercises with some embedded experiments. The first one shows that 201 kids working in markets in Kolkata do have good math skills. For instance, a researcher, posing as an ordinary shopper, would ask for the cost of 800 grams of potatoes sold at 20 rupees per kilogram, then ask for the cost of 1.4 kilograms of onions sold at 15 rupees per kilo. They would request the combined answer — 37 rupees — then hand the market worker a 200 rupee note and collect 163 rupees back. All told, the kids working in markets correctly solved this kind of problem from 95 to 98 percent of the time by the second try.
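For readers who want to check the arithmetic of that transaction, a quick sketch (function and variable names are ours):

```python
def cost(quantity_kg: float, rate_per_kg: float) -> float:
    """Price of a quantity sold at a per-kilogram rate."""
    return quantity_kg * rate_per_kg

potatoes = cost(0.8, 20)   # 800 g at 20 rupees/kg -> 16 rupees
onions = cost(1.4, 15)     # 1.4 kg at 15 rupees/kg -> 21 rupees
total = potatoes + onions  # 37 rupees
change = 200 - total       # 163 rupees back from a 200-rupee note
print(total, change)       # 37.0 163.0
```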
However, when the working children were pulled aside (with their parents’ permission) and given a standardized Indian national math test, just 32 percent could correctly divide a three-digit number by a one-digit number, and just 54 percent could correctly subtract a two-digit number from another two-digit number two times. Clearly, the kids’ skills were not yielding classroom results.
The researchers then conducted a second study with 400 kids working in markets in Delhi, which replicated the results: Working kids had a strong ability to handle market transactions, but only about 15 percent of the ones also in school were at average proficiency in math.
In the second study, the researchers also asked the reverse question: How do students doing well in school fare at market math problems? Here, with 200 students from 17 Delhi schools who do not work in markets, they found that 96 percent of the students could solve typical problems with a pencil, paper, unlimited time, and one opportunity to self-correct. But when the students had to solve the problems in a make-believe “market” setting, that figure dropped to just 60 percent. The students had unlimited time and access to paper and pencil, so that figure may actually overestimate how they would fare in a market.
Finally, in a third study, conducted in Delhi with over 200 kids, the researchers compared the performances of both “market” and “school” kids again on numerous math problems in varying conditions. While 85 percent of the working kids got the right answer to a market transaction problem, only 10 percent of nonworking kids correctly answered a question of similar difficulty, when faced with limited time and with no aids like pencil and paper. However, given the same division and subtraction problems, but with pencil and paper, 59 percent of nonmarket kids got them right, compared to 45 percent of market kids.
To further evaluate market kids and school kids on a level playing field, the researchers then presented each group with a word problem about a boy going to the market and buying two vegetables. Roughly one-third of the market kids were able to solve this without any aid, while fewer than 1 percent of the school kids did.
Why might the performance of the nonworking students decline when given a problem in market conditions?
“They learned an algorithm but didn’t understand it,” Banerjee says.
Meanwhile, the market kids seemed to use certain tactics to handle retail transactions. For one thing, they appear to use rounding well. Take a problem like 43 times 11. To handle that intuitively, you might multiply 43 times 10, and then add 43, for the final answer of 473. This appears to be what they are doing.
“The market kids are able to exploit base 10, so they do better on base 10 problems,” Duflo says. “The school kids have no idea. It makes no difference to them. The market kids may have additional tricks of this sort that we did not see.” On the other hand, the school kids had a better grasp of formal written methods of division, subtraction, and more.
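A sketch of the base-10 shortcut as we understand it (our reconstruction, not code from the study):

```python
def times_11(n: int) -> int:
    """Multiply by 11 the way the market kids appear to: treat 11 as 10 + 1,
    multiply by 10 (a digit shift in base 10), then add one more n."""
    return n * 10 + n

print(times_11(43))  # 473
```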
Going farther in school
The findings raise a significant point about students’ skills and academic progress. While it is a good thing that the kids with market jobs are proficient at generating rapid answers, it would likely be better for their long-term futures if they also did well in school and wound up with a high school degree or better. Finding a way to cross the divide between informal and formal ways of tackling math problems, then, could notably help some Indian children.
The fact that such a divide exists, meanwhile, suggests some new approaches could be tried in the classroom.
Banerjee, for one, suspects that part of the issue is a classroom process that makes it seem as if there is only one true route to finding an arithmetic answer. Instead, he believes, following the work of co-author Spelke, that helping students reason their way to an approximation of the right answer can help them truly get a handle on what is needed to solve these types of problems.
Even so, Duflo adds, “We don’t want to blame the teachers. It’s not their fault. They are given a strict curriculum to follow, and strict methods to follow.”
That still leaves open the question of what to change, in concrete classroom terms. That topic, as it happens, is something the research group is weighing as they consider new experiments that might address it directly. The current finding, however, makes clear that progress would be useful.
“These findings highlight the importance of educational curricula that bridge the gap between intuitive and formal mathematics,” the authors state in the paper.
Support for the research was provided, in part, by the Abdul Latif Jameel Poverty Action Lab’s Post-Primary Education Initiative, the Foundation Blaise Pascal, and the AXA Research Fund.
Physicists measure a key aspect of superconductivity in “magic-angle” graphene
Superconducting materials are similar to the carpool lane in a congested interstate. Like commuters who ride together, electrons that pair up can bypass the regular traffic, moving through the material with zero friction.
But just as with carpools, how easily electron pairs can flow depends on a number of conditions, including the density of pairs that are moving through the material. This “superfluid stiffness,” or the ease with which a current of electron pairs can flow, is a key measure of a material’s superconductivity.
Physicists at MIT and Harvard University have now directly measured superfluid stiffness for the first time in “magic-angle” graphene — materials that are made from two or more atomically thin sheets of graphene twisted with respect to each other at just the right angle to enable a host of exceptional properties, including unconventional superconductivity.
This superconductivity makes magic-angle graphene a promising building block for future quantum-computing devices, but exactly how the material superconducts is not well-understood. Knowing the material’s superfluid stiffness will help scientists identify the mechanism of superconductivity in magic-angle graphene.
The team’s measurements suggest that magic-angle graphene’s superconductivity is primarily governed by quantum geometry, which refers to the conceptual “shape” of quantum states that can exist in a given material.
The results, which are reported today in the journal Nature, represent the first time scientists have directly measured superfluid stiffness in a two-dimensional material. To do so, the team developed a new experimental method which can now be used to make similar measurements of other two-dimensional superconducting materials.
“There’s a whole family of 2D superconductors that is waiting to be probed, and we are really just scratching the surface,” says study co-lead author Joel Wang, a research scientist in MIT’s Research Laboratory of Electronics (RLE).
The study’s co-authors from MIT’s main campus and MIT Lincoln Laboratory include co-lead author and former RLE postdoc Miuko Tanaka as well as Thao Dinh, Daniel Rodan-Legrain, Sameia Zaman, Max Hays, Bharath Kannan, Aziza Almanakly, David Kim, Bethany Niedzielski, Kyle Serniak, Mollie Schwartz, Jeffrey Grover, Terry Orlando, Simon Gustavsson, Pablo Jarillo-Herrero, and William D. Oliver, along with Kenji Watanabe and Takashi Taniguchi of the National Institute for Materials Science in Japan.
Magic resonance
Since its first isolation and characterization in 2004, graphene has proven to be a wonder substance of sorts. The material is effectively a single, atom-thin sheet of graphite consisting of a precise, chicken-wire lattice of carbon atoms. This simple configuration can exhibit a host of superlative qualities in terms of graphene’s strength, durability, and ability to conduct electricity and heat.
In 2018, Jarillo-Herrero and colleagues discovered that when two graphene sheets are stacked on top of each other, at a precise “magic” angle, the twisted structure — now known as magic-angle twisted bilayer graphene, or MATBG — exhibits entirely new properties, including superconductivity, in which electrons pair up, rather than repelling each other as they do in everyday materials. These so-called Cooper pairs can form a superfluid, with the potential to superconduct, meaning they could move through a material as an effortless, friction-free current.
“But even though Cooper pairs have no resistance, you have to apply some push, in the form of an electric field, to get the current to move,” Wang explains. “Superfluid stiffness refers to how easy it is to get these particles to move, in order to drive superconductivity.”
Today, scientists can measure superfluid stiffness in superconducting materials through methods that generally involve placing a material in a microwave resonator — a device which has a characteristic resonance frequency at which an electrical signal will oscillate, at microwave frequencies, much like a vibrating violin string. If a superconducting material is placed within a microwave resonator, it can change the device’s resonance frequency, and in particular, its “kinetic inductance,” by an amount that scientists can directly relate to the material’s superfluid stiffness.
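In outline, the method rests on two standard relations (textbook resonator physics, not the paper’s specific analysis). The resonance frequency of an LC resonator is

$$f_0 = \frac{1}{2\pi\sqrt{(L_g + L_k)\,C}},$$

where $L_g$ is the fixed geometric inductance and $L_k$ is the kinetic inductance contributed by the superconductor. Because $L_k$ scales inversely with the density of superconducting pairs, a measured shift in $f_0$ pins down $L_k$, and with it the superfluid stiffness, which is proportional to $1/L_k$.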
However, to date, such approaches have only been compatible with large, thick material samples. The MIT team realized that to measure superfluid stiffness in atomically thin materials like MATBG would require a new approach.
“Compared to MATBG, the typical superconductor that is probed using resonators is 10 to 100 times thicker and larger in area,” Wang says. “We weren’t sure if such a tiny material would generate any measurable inductance at all.”
A captured signal
The challenge to measuring superfluid stiffness in MATBG has to do with attaching the supremely delicate material to the surface of the microwave resonator as seamlessly as possible.
“To make this work, you want to make an ideally lossless — i.e., superconducting — contact between the two materials,” Wang explains. “Otherwise, the microwave signal you send in will be degraded or even just bounce back instead of going into your target material.”
Will Oliver’s group at MIT has been developing techniques to precisely connect extremely delicate, two-dimensional materials, with the goal of building new types of quantum bits for future quantum-computing devices. For their new study, Tanaka, Wang, and their colleagues applied these techniques to seamlessly connect a tiny sample of MATBG to the end of an aluminum microwave resonator. To do so, the group first used conventional methods to assemble MATBG, then sandwiched the structure between two insulating layers of hexagonal boron nitride, to help maintain MATBG’s atomic structure and properties.
“Aluminum is a material we use regularly in our superconducting quantum computing research, for example, aluminum resonators to read out aluminum quantum bits (qubits),” Oliver explains. “So, we thought, why not make most of the resonator from aluminum, which is relatively straightforward for us, and then add a little MATBG to the end of it? It turned out to be a good idea.”
“To contact the MATBG, we etch it very sharply, like cutting through layers of a cake with a very sharp knife,” Wang says. “We expose a side of the freshly-cut MATBG, onto which we then deposit aluminum — the same material as the resonator — to make a good contact and form an aluminum lead.”
The researchers then connected the aluminum leads of the MATBG structure to the larger aluminum microwave resonator. They sent a microwave signal through the resonator and measured the resulting shift in its resonance frequency, from which they could infer the kinetic inductance of the MATBG.
When they converted the measured inductance to a value of superfluid stiffness, however, the researchers found that it was much larger than what conventional theories of superconductivity would have predicted. They had a hunch that the surplus had to do with MATBG’s quantum geometry — the way the quantum states of electrons correlate to one another.
“We saw a tenfold increase in superfluid stiffness compared to conventional expectations, with a temperature dependence consistent with what the theory of quantum geometry predicts,” Tanaka says. “This was a ‘smoking gun’ that pointed to the role of quantum geometry in governing superfluid stiffness in this two-dimensional material.”
“This work represents a great example of how one can use sophisticated quantum technology currently used in quantum circuits to investigate condensed matter systems consisting of strongly interacting particles,” adds Jarillo-Herrero.
This research was funded, in part, by the U.S. Army Research Office, the National Science Foundation, the U.S. Air Force Office of Scientific Research, and the U.S. Under Secretary of Defense for Research and Engineering.
A complementary study on magic-angle twisted trilayer graphene (MATTG), conducted by a collaboration between Philip Kim’s group at Harvard University and Jarillo-Herrero’s group at MIT, appears in the same issue of Nature.
On Generative AI Security
Microsoft’s AI Red Team just published “Lessons from Red Teaming 100 Generative AI Products.” Their blog post lists “three takeaways,” but the eight lessons in the report itself are more useful:
- Understand what the system can do and where it is applied.
- You don’t have to compute gradients to break an AI system.
- AI red teaming is not safety benchmarking.
- Automation can help cover more of the risk landscape.
- The human element of AI red teaming is crucial.
- Responsible AI harms are pervasive but difficult to measure.
- LLMs amplify existing security risks and introduce new ones...