Feed aggregator
California carbon offset program under scrutiny
EU plans to spare 80 percent of firms from carbon tax
Trump rescinds $4B in US pledges for UN climate fund
Inside the divided coalition coming for the Green Deal
Uniformity of climate anxiety scales
Nature Climate Change, Published online: 07 February 2025; doi:10.1038/s41558-025-02261-w
Plants countering downpours
Nature Climate Change, Published online: 07 February 2025; doi:10.1038/s41558-025-02262-9
Thermal impacts on aquatic fertility
Nature Climate Change, Published online: 07 February 2025; doi:10.1038/s41558-025-02260-x
Ineffective carbon offset
Nature Climate Change, Published online: 07 February 2025; doi:10.1038/s41558-025-02259-4
Validation technique could help scientists make more accurate forecasts
Should you grab your umbrella before you walk out the door? Checking the weather forecast beforehand will only be helpful if that forecast is accurate.
Spatial prediction problems, like weather forecasting or air pollution estimation, involve predicting the value of a variable in a new location based on known values at other locations. Scientists typically use tried-and-true validation methods to determine how much to trust these predictions.
But MIT researchers have shown that these popular validation methods can fail quite badly for spatial prediction tasks. This might lead someone to believe that a forecast is accurate or that a new prediction method is effective, when in reality that is not the case.
The researchers developed a technique to assess prediction-validation methods and used it to prove that two classical methods can be substantively wrong on spatial problems. They then determined why these methods can fail and created a new method designed to handle the types of data used for spatial predictions.
In experiments with real and simulated data, their new method provided more accurate validations than the two most common techniques. The researchers evaluated each method using realistic spatial problems, including predicting the wind speed at Chicago O'Hare Airport and forecasting the air temperature at five U.S. metro locations.
Their validation method could be applied to a range of problems, from helping climate scientists predict sea surface temperatures to aiding epidemiologists in estimating the effects of air pollution on certain diseases.
“Hopefully, this will lead to more reliable evaluations when people are coming up with new predictive methods and a better understanding of how well methods are performing,” says Tamara Broderick, an associate professor in MIT’s Department of Electrical Engineering and Computer Science (EECS), a member of the Laboratory for Information and Decision Systems and the Institute for Data, Systems, and Society, and an affiliate of the Computer Science and Artificial Intelligence Laboratory (CSAIL).
Broderick is joined on the paper by lead author and MIT postdoc David R. Burt and EECS graduate student Yunyi Shen. The research will be presented at the International Conference on Artificial Intelligence and Statistics.
Evaluating validations
Broderick’s group has recently collaborated with oceanographers and atmospheric scientists to develop machine-learning prediction models that can be used for problems with a strong spatial component.
Through this work, they noticed that traditional validation methods can be inaccurate in spatial settings. These methods hold out a small amount of training data, called validation data, and use it to assess the accuracy of the predictor.
To find the root of the problem, they conducted a thorough analysis and determined that traditional methods make assumptions that are inappropriate for spatial data. Evaluation methods rely on assumptions about how validation data and the data one wants to predict, called test data, are related.
Traditional methods assume that validation data and test data are independent and identically distributed, which implies that the value of any data point does not depend on the other data points. But in a spatial application, this is often not the case.
For instance, a scientist may be using validation data from EPA air pollution sensors to test the accuracy of a method that predicts air pollution in conservation areas. However, the EPA sensors are not independent — they were sited based on the location of other sensors.
In addition, perhaps the validation data are from EPA sensors near cities while the conservation sites are in rural areas. Because these data are from different locations, they likely have different statistical properties, so they are not identically distributed.
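The failure mode is easy to see in a toy example. The sketch below is a minimal illustration, not the researchers' code or data: it builds a smooth synthetic "pollution" field, places validation sensors in one region and the locations of interest in another, and shows that a plain holdout error estimate looks far more optimistic than the error actually incurred where predictions are needed. The field function and all numbers are invented.

```python
# Minimal illustration (not the researchers' code): a holdout estimate computed
# from spatially clustered validation sensors can badly understate the error at
# the locations we actually care about.
import numpy as np

rng = np.random.default_rng(0)

def field(x):
    """A smooth synthetic spatial signal standing in for, e.g., pollution."""
    return np.sin(3 * x) + 0.5 * x

# Validation sensors clustered in one region; target sites in another.
x_val = rng.uniform(0.0, 1.0, size=200)     # e.g., sensors near cities
x_test = rng.uniform(2.0, 3.0, size=200)    # e.g., rural conservation areas
y_val = field(x_val) + 0.1 * rng.standard_normal(200)
y_test = field(x_test) + 0.1 * rng.standard_normal(200)

# A deliberately naive predictor: predict the mean of the sensor readings.
prediction = y_val.mean()

val_rmse = np.sqrt(np.mean((y_val - prediction) ** 2))    # looks acceptable
test_rmse = np.sqrt(np.mean((y_test - prediction) ** 2))  # much larger
print(f"holdout estimate of error:        {val_rmse:.2f}")
print(f"actual error at the target sites: {test_rmse:.2f}")
```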
“Our experiments showed that you get some really wrong answers in the spatial case when these assumptions made by the validation method break down,” Broderick says.
The researchers needed to come up with a new assumption.
Specifically spatial
Thinking specifically about a spatial context, where data are gathered from different locations, they designed a method that assumes validation data and test data vary smoothly in space.
For instance, air pollution levels are unlikely to change dramatically between two neighboring houses.
“This regularity assumption is appropriate for many spatial processes, and it allows us to create a way to evaluate spatial predictors in the spatial domain. To the best of our knowledge, no one has done a systematic theoretical evaluation of what went wrong to come up with a better approach,” says Broderick.
To use their evaluation technique, one inputs the predictor, the locations where predictions are needed, and the validation data; the technique then automatically does the rest, ultimately estimating how accurate the predictor’s forecast will be for each of those locations. However, effectively assessing the validation technique itself proved to be a challenge.
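To convey the flavor of exploiting spatial smoothness, here is a hedged sketch of one way an error estimate could weight validation residuals by proximity to the target locations rather than averaging them as if they were interchangeable. This illustrates the general idea under the smoothness assumption described above; it is not the authors' actual algorithm, and the function name, kernel choice, and bandwidth are assumptions.

```python
# Sketch of a spatially aware error estimate: under the assumption that errors
# vary smoothly in space, weight each validation residual by how close that
# validation point is to the locations where the predictor will be used.
# Illustrative only; not the method from the paper.
import numpy as np

def spatially_weighted_mse(val_locs, val_sq_errors, target_locs, bandwidth=0.5):
    """Estimate mean squared error at target_locs from validation residuals.

    val_locs:      (n, d) coordinates of validation points
    val_sq_errors: (n,) squared residuals of the predictor at those points
    target_locs:   (m, d) coordinates where predictions will actually be made
    """
    estimates = []
    for t in target_locs:
        dist_sq = np.sum((val_locs - t) ** 2, axis=1)
        weights = np.exp(-dist_sq / (2 * bandwidth ** 2))  # Gaussian kernel
        estimates.append(np.sum(weights * val_sq_errors) / np.sum(weights))
    return float(np.mean(estimates))

# Compare against the naive average that treats validation data as i.i.d.:
#   naive_mse   = val_sq_errors.mean()
#   spatial_mse = spatially_weighted_mse(val_locs, val_sq_errors, target_locs)
```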
“We are not evaluating a method, instead we are evaluating an evaluation. So, we had to step back, think carefully, and get creative about the appropriate experiments we could use,” Broderick explains.
First, they designed several tests using simulated data, which had unrealistic aspects but allowed them to carefully control key parameters. Then, they created more realistic, semi-simulated data by modifying real data. Finally, they used real data for several experiments.
Using three types of data from realistic problems, like predicting the price of a flat in England based on its location and forecasting wind speed, enabled them to conduct a comprehensive evaluation. In most experiments, their technique was more accurate than either traditional method they compared it to.
In the future, the researchers plan to apply these techniques to improve uncertainty quantification in spatial settings. They also want to find other areas where the regularity assumption could improve the performance of predictors, such as with time-series data.
This research is funded, in part, by the National Science Foundation and the Office of Naval Research.
Cleaning up critical minerals and materials production, using microwave plasma
The push to bring manufacturing back to the U.S. is running up against an unfortunate truth: The processes for making many critical materials today create toxic byproducts and other environmental hazards. That’s true for commonly used industrial metals like nickel and titanium, as well as specialty minerals, materials, and coatings that go into batteries, advanced electronics, and defense applications.
Now 6K, founded by former MIT research scientist Kamal Hadidi, is using a new production process to bring critical materials production back to America without the toxic byproducts.
The company is actively scaling its microwave plasma technology, which it calls UniMelt, to transform the way critical minerals are processed, creating new domestic supply chains in the process. UniMelt uses beams of tightly controlled thermal plasma to melt or vaporize precursor materials into particles with precise sizes and crystalline phases.
The technology converts metals, such as titanium, nickel, and refractory alloys, into particles optimized for additive manufacturing for a range of industrial applications. It is also being used to create battery materials for electric vehicles, grid infrastructure, and data centers.
“The markets and critical materials we are focused on are important for not just economic reasons but also U.S. national security, because the bulk of these materials are manufactured today in nonfriendly countries,” 6K CEO Saurabh Ullal says. “Now, the [U.S. government] and our growing customer base can leverage this technology invented at MIT to make the U.S. less dependent on these nonfriendly countries, ensuring supply chain independence now and in the future.”
Named after the 6,000-degree temperature of its plasma, 6K is currently selling its high-performance metal powders to parts manufacturers as well as defense, automotive, medical, and oil and gas companies for use in applications from engine components and medical implants to rockets. To scale its battery materials business, 6K is also building a 100,000-square-foot production facility in Jackson, Tennessee, with construction set to begin later this year.
A weekend project
Between 1994 and 2007, Hadidi worked at the Plasma Science and Fusion Center (PSFC), where he developed plasma technologies for a range of applications, including hydrogen production, fuel reforming, and detecting environmental toxins. His first company was founded in 2000 out of the PSFC to detect mercury in coal-fired power plants’ smokestacks.
“I loved working at MIT,” Hadidi says. “It’s an amazing place that really challenges you. Just being there is so stimulating because everyone’s trying to come up with new solutions and connect dots between different fields.”
Hadidi also began using high-frequency microwave plasmas to create nanomaterials for use in optical applications. He wasn’t a materials expert, so he collaborated with Professor Eric Jordan, a materials synthesis expert from the University of Connecticut, and the researchers started working on nights and weekends in the PSFC to develop the idea further, eventually patenting the technology.
Hadidi officially founded the company as Amastan in 2007, exploring the use of his microwave plasma technology, later named UniMelt for “uniform melt state process,” to make a host of different materials as part of a government grant he and Jordan received.
The researchers soon realized the microwave plasma technology had several advantages over traditional production techniques for certain materials. For one, it could eliminate several high-energy steps of conventional processes, reducing production times from days to hours in some cases. For batteries and certain critical minerals, the process also works with recycled feedstocks. Amastan was renamed 6K in 2019.
Early on, Hadidi produced metal powders used in additive manufacturing through a process called spheroidization, which results in dense, spherical powders that flow well and make high-performance 3D-printed parts.
Following another grant, Hadidi explored methods for producing a type of battery cathode made from lithium, nickel, manganese, and cobalt (NMC). The standard process for making NMC cathodes involves chemical synthesis, precipitation, heat treatment, and a lot of water. 6K is able to cut out many of those steps, speeding up production and lowering costs while also being more sustainable.
“Our technology completely eliminates toxic waste and recycles all of the byproducts back through the process to utilize everything, including water,” Ullal says.
Scaling domestic production
Today, 6K’s additive manufacturing arm operates out of a factory in Pennsylvania. The company’s critical minerals processing, refining, and recycling systems can produce about 400 tons of material per year and can be used to make more than a dozen types of metal powders. The company also has a 33,000-square-foot battery center in North Andover, Massachusetts, where it produces battery cathode materials for its energy storage and mobility customers.
The Tennessee facility will be used to produce battery cathode materials and represents a massive step up in throughput. The company says it will be able to produce 13,000 tons of material annually when construction is complete next year.
“I’m happy if what I started brings something positive to society, and I’m extremely thankful to all the people that helped me,” says Hadidi, who left the company in 2019. “I’m an entrepreneur at heart. I like to make things. But that doesn’t mean I always succeed. It’s personally very satisfying to see this make an impact.”
The 6K team says its technology can also create a variety of specialty ceramics, advanced coatings, and nanoengineered materials. They say it may also be used to eliminate PFAS, or “forever chemicals,” though that work is at an early stage.
The company recently received a grant to demonstrate a process for recycling critical materials from military depots to produce aerospace and defense products, creating a new value stream for these materials that would otherwise deteriorate or go to landfill. That work is consistent with the company’s motto, “We take nothing from the ground and put nothing into the ground.”
The company’s additive division recently received a $23.4 million Defense Production Act grant “that will enable us to double processing capacity in the next three years,” Ullal says. “The next step is to scale battery materials production to the tens of thousands of tons per year. At this point, it’s a scale-up of known processes, and we just need to execute. The idea of creating a circular economy is near and dear to us because that’s how we’ve built this company and that’s how we generate value: addressing our U.S. national security concerns and protecting the planet as well.”
EFF Applauds Little Rock, AR for Cancelling ShotSpotter Contract
Community members coordinated to pack Little Rock City Hall on Tuesday, where board members voted 5-3 to end the city's contract with ShotSpotter.
Initially funded through a federal grant, Little Rock began its experiment with the “gunshot detection” sensors in 2018. ShotSpotter (now SoundThinking) has long been accused of steering federal grants toward local police departments in an effort to secure funding for the technology. Members of Congress are investigating this funding. EFF has long encouraged communities to follow the money that pays for police surveillance technology.
Now, faced with a $188,000 contract renewal using city funds, Little Rock has joined the growing number of cities nationwide that have rejected, ended, or called into question their use of the invasive, error-prone technology.
EFF has been a vocal critic of gunshot detection systems and extensively documented how ShotSpotter sensors risk capturing private conversations and enable discriminatory policing—ultimately calling on cities to stop using the technology.
This call has been echoed by grassroots advocates coordinating through networks like the National Stop ShotSpotter Coalition. Community organizers have dedicated countless hours to popular education, canvassing neighborhoods, and conducting strategic research to debunk the company's spurious marketing claims.
Through that effort, Little Rock has now joined the ranks of cities throughout the country to reject surveillance technologies like gunshot detection that harm marginalized communities and fail time and time again to deliver meaningful public safety.
If you live in a city that's also considering dropping (or installing) ShotSpotter, share this news with your community and local officials!
MIT method enables ultrafast protein labeling of tens of millions of densely packed cells
A new technology developed at MIT enables scientists to label proteins across millions of individual cells in fully intact 3D tissues with unprecedented speed, uniformity, and versatility. Using the technology, the team was able to richly label large tissue samples in a single day. In their new study in Nature Biotechnology, they also demonstrate that the ability to label proteins with antibodies at the single-cell level across large tissue samples can reveal insights left hidden by other widely used labeling methods.
Profiling the proteins that cells are making is a staple of studies in biology, neuroscience, and related fields because the proteins a cell is expressing at a given moment can reflect the functions the cell is trying to perform or its response to its circumstances, such as disease or treatment. As much as microscopy and labeling technologies have advanced, enabling innumerable discoveries, scientists have still lacked a reliable and practical way of tracking protein expression at the level of millions of densely packed individual cells in whole, intact 3D tissues. Because such work has largely been confined to thin tissue sections mounted on slides, scientists have not had the tools to thoroughly appreciate cellular protein expression in the whole, connected systems in which it occurs.
“Conventionally, investigating the molecules within cells requires dissociating tissue into single cells or slicing it into thin sections, as light and chemicals required for analysis cannot penetrate deep into tissues. Our lab developed technologies such as CLARITY and SHIELD, which enable investigation of whole organs by rendering them transparent, but we now needed a way to chemically label whole organs to gain useful scientific insights,” says study senior author Kwanghun Chung, associate professor in The Picower Institute for Learning and Memory, the departments of Chemical Engineering and Brain and Cognitive Sciences, and the Institute for Medical Engineering and Science at MIT. “If cells within a tissue are not uniformly processed, they cannot be quantitatively compared. In conventional protein labeling, it can take weeks for these molecules to diffuse into intact organs, making uniform chemical processing of organ-scale tissues virtually impossible and extremely slow.”
The new approach, called “CuRVE,” represents a major advance — years in the making — toward that goal by demonstrating a fundamentally new approach to uniformly processing large and dense tissues whole. In the study, the researchers explain how they overcame the technical barriers via an implementation of CuRVE called “eFLASH,” and provide copious vivid demonstrations of the technology, including how it yielded new neuroscience insights.
“This is a significant leap, especially in terms of the actual performance of the technology,” says co-lead author Dae Hee Yun PhD '24, a recent MIT graduate who is now a senior application engineer at LifeCanvas Technologies, a startup company Chung founded to disseminate the tools his lab invents. The paper’s other lead author is Young-Gyun Park, a former MIT postdoc who’s now an assistant professor at KAIST in South Korea.
Clever chemistry
The fundamental reason why large, 3D tissue samples are hard to label uniformly is that antibodies seep into tissue very slowly, but are quick to bind to their target proteins. The practical effect of this speed mismatch is that simply soaking a brain in a bath of antibodies will mean that proteins are intensely well labeled on the outer edge of the tissue, but virtually none of the antibodies will find cells and proteins deeper inside.
To improve labeling, the team conceived of a way — the conceptual essence of CuRVE — to resolve the speed mismatch. The strategy was to continuously control the pace of antibody binding while at the same time speeding up antibody permeation throughout the tissue. To figure out how this could work and to optimize the approach, they built and ran a sophisticated computational simulation that enabled them to test different settings and parameters, including different binding rates and tissue densities and compositions.
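The speed mismatch can be reproduced in a toy reaction-diffusion model. The sketch below is a simplified, one-dimensional illustration of the idea, not the team's simulation: antibody diffuses in from the tissue surfaces while binding to targets, and lowering the binding rate, as CuRVE aims to do chemically, turns an edge-heavy labeling profile into a far more uniform one. All parameters are arbitrary.

```python
# Toy 1-D reaction-diffusion sketch of the speed mismatch (not the authors'
# simulation): with fast binding, antibody is consumed near the surface and the
# tissue core stays unlabeled; with throttled binding, labeling is far more
# uniform. All units and parameters are arbitrary.
import numpy as np

def label_profile(k_bind, n=21, steps=10_000, dt=0.02, D=1.0, site_density=10.0):
    """Return the bound-label profile across a 1-D strip of tissue voxels."""
    free = np.zeros(n)                     # free antibody concentration
    sites = np.full(n, site_density)       # unoccupied target sites
    bound = np.zeros(n)                    # bound (labeled) targets
    for _ in range(steps):
        free[0] = free[-1] = 1.0           # antibody bath at both surfaces
        lap = np.zeros(n)                  # discrete Laplacian for diffusion
        lap[1:-1] = free[:-2] - 2 * free[1:-1] + free[2:]
        rate = k_bind * free * sites * dt  # binding events in this time step
        free += D * lap * dt - rate
        sites -= rate
        bound += rate
    return bound

fast = label_profile(k_bind=2.0)       # uncontrolled, fast binding
slow = label_profile(k_bind=0.0005)    # continuously throttled binding
center, edge = len(fast) // 2, 1
print("fast binding, center/edge label ratio:", round(fast[center] / fast[edge], 4))
print("slow binding, center/edge label ratio:", round(slow[center] / slow[edge], 4))
```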
Then they set out to implement their approach in real tissues. Their starting point was a previous technology, called “SWITCH,” in which Chung’s lab devised a way of temporarily turning off antibody binding, letting the antibodies permeate the tissue, and then turning binding back on. As well as that worked, Yun says, the team realized there could be substantial improvements if antibody binding speed could be controlled continuously, but the chemicals used in SWITCH were too harsh for such ongoing treatment. So the team screened a library of similar chemicals to find one that could more subtly and continuously throttle antibody binding speed. They found that deoxycholic acid was an ideal candidate. Using that chemical, the team could modulate antibody binding not only by varying the chemical’s concentration, but also by varying the labeling bath’s pH (or acidity).
Meanwhile, to speed up antibody movement through tissues, the team used another prior technology invented in the Chung Lab: stochastic electrotransport. That technology accelerates the dispersion of antibodies through tissue by applying electric fields.
Implementing this eFLASH system of accelerated dispersion with continuously modifiable binding speed produced the wide variety of labeling successes demonstrated in the paper. In all, the team reported using more than 60 different antibodies to label proteins in cells across large tissue samples.
Notably, each of these specimens was labeled within a day, an “ultra-fast” speed for whole, intact organs, the authors say. Moreover, different preparations did not require new optimization steps.
Valuable visualizations
Among the ways the team put eFLASH to the test was comparing their labeling to another often-used method: genetically engineering cells to fluoresce when the gene for a protein of interest is being transcribed. The genetic method doesn’t require dispersing antibodies throughout tissue, but it can be prone to discrepancies because gene transcription and actual protein production are not exactly the same thing. Yun adds that while antibody labeling reliably and immediately reports the presence of a target protein, the genetic method can be less immediate and more persistent, still fluorescing even when the actual protein is no longer present.
In the study, the team employed both kinds of labeling simultaneously in the same samples. Visualizing the labels that way, they saw many examples in which antibody labeling and genetic labeling differed widely. In some areas of mouse brains, they found that two-thirds of the neurons expressing PV (a protein prominent in certain inhibitory neurons) according to antibody labeling did not show any genetically based fluorescence. In another example, only a tiny fraction of the cells that reported expression of a protein called ChAT via the genetic method also showed that protein via antibody labeling. In other words, there were cases where genetic labeling severely underreported protein expression compared to antibody labeling, and others where it severely overreported it.
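For intuition, the comparison boils down to a per-cell tally of agreement between the two labels. The snippet below uses entirely made-up numbers, chosen only to echo the kind of discrepancy reported above, to show how such a discordance fraction is computed.

```python
# Made-up per-cell calls, for illustration only: how often do antibody-positive
# cells lack a reporter signal?
import numpy as np

rng = np.random.default_rng(1)
n_cells = 10_000
antibody_pos = rng.random(n_cells) < 0.15                   # e.g., PV+ by antibody
reporter_pos = antibody_pos & (rng.random(n_cells) < 0.35)  # reporter misses many

missed = antibody_pos & ~reporter_pos
print(f"antibody-positive cells with no reporter signal: "
      f"{missed.sum() / antibody_pos.sum():.0%}")
```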
The researchers don’t mean to impugn the clear value of using the genetic reporting methods, but instead suggest that also using organ-wide antibody labeling, as eFLASH allows, can help put that data in a richer, more complete context. “Our discovery of large regionalized loss of PV-immunoreactive neurons in healthy adult mice and with high individual variability emphasizes the importance of holistic and unbiased phenotyping,” the authors write.
Or as Yun puts it, the two different kinds of labeling are “two different tools for the job.”
In addition to Yun, Park, and Chung, the paper’s other authors are Jae Hun Cho, Lee Kamentsky, Nicholas Evans, Nicholas DiNapoli, Katherine Xie, Seo Woo Choi, Alexandre Albanese, Yuxuan Tian, Chang Ho Sohn, Qiangge Zhang, Minyoung Kim, Justin Swaney, Webster Guan, Juhyuk Park, Gabi Drummond, Heejin Choi, Luzdary Ruelas, and Guoping Feng.
Funding for the study came from the Burroughs Wellcome Fund, the Searle Scholars Program, a Packard Award in Science and Engineering, a NARSAD Young Investigator Award, the McKnight Foundation, the Freedom Together Foundation, The Picower Institute for Learning and Memory, the NCSOFT Cultural Foundation, and the National Institutes of Health.
Streamlining data collection for improved salmon population management
Sara Beery came to MIT as an assistant professor in MIT’s Department of Electrical Engineering and Computer Science (EECS) eager to focus on ecological challenges. She has fashioned her research career around the opportunity to apply her expertise in computer vision, machine learning, and data science to tackle real-world issues in conservation and sustainability. Beery was drawn to the Institute’s commitment to “computing for the planet,” and set out to bring her methods to global-scale environmental and biodiversity monitoring.
In the Pacific Northwest, salmon have a disproportionate impact on the health of their ecosystems, and their complex reproductive needs have attracted Beery’s attention. Each year, millions of salmon embark on a migration to spawn. Their journey begins in freshwater stream beds where the eggs hatch. Young salmon fry (newly hatched salmon) make their way to the ocean, where they spend several years maturing to adulthood. As adults, the salmon return to the streams where they were born in order to spawn, ensuring the continuation of their species by depositing their eggs in the gravel of the stream beds. Both male and female salmon die shortly after supplying the river habitat with the next generation of salmon.
Throughout their migration, salmon support a wide range of organisms in the ecosystems they pass through. For example, salmon bring nutrients like carbon and nitrogen from the ocean upriver, enhancing their availability to those ecosystems. In addition, salmon are key to many predator-prey relationships: They serve as a food source for various predators, such as bears, wolves, and birds, while helping to control other populations, like insects, through predation. After they die from spawning, the decomposing salmon carcasses also replenish valuable nutrients to the surrounding ecosystem. The migration of salmon not only sustains their own species but plays a critical role in the overall health of the rivers and oceans they inhabit.
At the same time, salmon populations play an important role both economically and culturally in the region. Commercial and recreational salmon fisheries contribute significantly to the local economy. And for many Indigenous peoples in the Pacific Northwest, salmon hold notable cultural value, as they have been central to their diets, traditions, and ceremonies.
Monitoring salmon migration
Increased human activity, including overfishing and hydropower development, together with habitat loss and climate change, has had a significant impact on salmon populations in the region. As a result, effective monitoring and management of salmon fisheries are important to ensure balance among competing ecological, cultural, and human interests. Accurately counting salmon during their seasonal migration to their natal river to spawn is essential in order to track threatened populations, assess the success of recovery strategies, guide fishing season regulations, and support the management of both commercial and recreational fisheries. Precise population data help decision-makers employ the best strategies to safeguard the health of the ecosystem while accommodating human needs. Yet monitoring salmon migration remains a labor-intensive and inefficient undertaking.
Beery is currently leading a research project that aims to streamline salmon monitoring using cutting-edge computer vision methods. This project fits within Beery’s broader research interest, which focuses on the interdisciplinary space between artificial intelligence, the natural world, and sustainability. Its relevance to fisheries management made it a good fit for funding from MIT’s Abdul Latif Jameel Water and Food Systems Lab (J-WAFS). Beery’s 2023 J-WAFS seed grant was the first research funding she was awarded since joining the MIT faculty.
Historically, monitoring efforts relied on humans to manually count salmon from riverbanks using eyesight. In the past few decades, underwater sonar systems have been implemented to aid in counting the salmon. These sonar systems are essentially underwater video cameras, but they differ in that they use acoustics instead of light sensors to capture the presence of a fish. Use of this method requires people to set up a tent alongside the river to count salmon based on the output of a sonar camera that is hooked up to a laptop. While this system is an improvement over the original method of monitoring salmon by eyesight, it still relies significantly on human effort and is an arduous and time-consuming process.
Automating salmon monitoring is necessary for better management of salmon fisheries. “We need these technological tools,” says Beery. “We can’t keep up with the demand of monitoring and understanding and studying these really complex ecosystems that we work in without some form of automation.”
In order to automate counting of migrating salmon populations in the Pacific Northwest, the project team, including Justin Kay, a PhD student in EECS, has been collecting data in the form of videos from sonar cameras at different rivers. The team annotates a subset of the data to train the computer vision system to autonomously detect and count the fish as they migrate. Kay describes the process of how the model counts each migrating fish: “The computer vision algorithm is designed to locate a fish in the frame, draw a box around it, and then track it over time. If a fish is detected on one side of the screen and leaves on the other side of the screen, then we count it as moving upstream.” On rivers where the team has created training data for the system, it has produced strong results, with only 3 to 5 percent counting error. This is well below the target that the team and partnering stakeholders set of no more than a 10 percent counting error.
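The counting step Kay describes can be captured in a few lines once detections have been linked into tracks. The sketch below is a hypothetical illustration of that logic, not the team's model: the detector and tracker are assumed to exist, and the track data, frame width, and margin are invented.

```python
# Hypothetical sketch of the counting logic: a tracked fish is counted as an
# upstream passage if its track enters on one side of the sonar frame and
# exits on the other. Detection and tracking are assumed to be handled by a
# trained model before this function is called.

def count_upstream(tracks, frame_width, margin=0.1):
    """tracks: dict of track_id -> list of (frame_index, x_center, y_center)."""
    left_edge = margin * frame_width
    right_edge = (1.0 - margin) * frame_width
    upstream = 0
    for points in tracks.values():
        points = sorted(points)                    # order detections by frame
        x_first, x_last = points[0][1], points[-1][1]
        if x_first < left_edge and x_last > right_edge:
            upstream += 1                          # entered left, exited right
    return upstream

# Invented example tracks from a 320-pixel-wide sonar frame.
tracks = {
    0: [(0, 5, 40), (10, 160, 42), (20, 310, 45)],  # crosses the frame: counted
    1: [(3, 200, 80), (8, 220, 82)],                # lingers mid-frame: not counted
    2: [(5, 300, 10), (12, 90, 12)],                # moving the other way: not counted
}
print(count_upstream(tracks, frame_width=320))      # -> 1
```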
Testing and deployment: Balancing human effort and use of automation
The researchers’ technology is being deployed to monitor the migration of salmon on the newly restored Klamath River. Four dams on the river were recently demolished, making it the largest dam removal project in U.S. history. The dams came down after a more than 20-year-long campaign to remove them, which was led by Klamath tribes, in collaboration with scientists, environmental organizations, and commercial fishermen. After the removal of the dams, 240 miles of the river now flow freely and nearly 800 square miles of habitat are accessible to salmon. Beery notes the almost immediate regeneration of salmon populations in the Klamath River: “I think it was within eight days of the dam coming down, they started seeing salmon actually migrate upriver beyond the dam.” In a collaboration with California Trout, the team is currently processing new data to adapt and create a customized model that can then be deployed to help count the newly migrating salmon.
One challenge with the system revolves around training the model to accurately count the fish in unfamiliar environments with variations such as riverbed features, water clarity, and lighting conditions. These factors can significantly alter how the fish appear on the output of a sonar camera and confuse the computer model. When deployed in new rivers where no data have been collected before, like the Klamath, the performance of the system degrades and the margin of error increases substantially, to 15 to 20 percent.
The researchers constructed an automatic adaptation algorithm within the system to overcome this challenge and create a scalable system that can be deployed to any site without human intervention. This self-initializing technology works to automatically calibrate to the new conditions and environment to accurately count the migrating fish. In testing, the automatic adaptation algorithm was able to reduce the counting error down to the 10 to 15 percent range. The improvement in counting error with the self-initializing function means that the technology is closer to being deployable to new locations without much additional human effort.
Enabling real-time management with the “Fishbox”
Another challenge faced by the research team was the development of an efficient data infrastructure. In order to run the computer vision system, the video produced by sonar cameras must be delivered via the cloud or by manually mailing hard drives from a river site to the lab. These methods have notable drawbacks: a cloud-based approach is limited due to lack of internet connectivity in remote river site locations, and shipping the data introduces problems of delay.
Instead of relying on these methods, the team has implemented a power-efficient computer, coined the “Fishbox,” that can be used in the field to perform the processing. The Fishbox consists of a small, lightweight computer with optimized software that fishery managers can plug into their existing laptops and sonar cameras. The system is then capable of running salmon counting models directly at the sonar sites without the need for internet connectivity. This allows managers to make hour-by-hour decisions, supporting more responsive, real-time management of salmon populations.
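As a rough picture of how such an on-site box might operate, the sketch below polls a local folder for new sonar clips and logs counts locally, with no network dependency. The folder layout, file format, and run_counting_model function are assumptions for illustration, not details of the actual Fishbox software.

```python
# Hypothetical on-device loop for an edge box at a sonar site: watch a local
# folder for new clips, run the counting model locally, and append results to
# a local log. No internet connection is required. Paths and the counting
# function are invented for illustration.
import csv
import time
from pathlib import Path

CLIP_DIR = Path("/data/sonar_clips")   # hypothetical drop folder for clips
LOG_FILE = Path("/data/counts.csv")    # hypothetical local results log

def watch_and_count(run_counting_model, poll_seconds=60):
    seen = set()
    while True:
        for clip in sorted(CLIP_DIR.glob("*.mp4")):
            if clip.name in seen:
                continue
            count = run_counting_model(clip)       # inference happens on-site
            with LOG_FILE.open("a", newline="") as f:
                csv.writer(f).writerow([int(time.time()), clip.name, count])
            seen.add(clip.name)
        time.sleep(poll_seconds)
```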
Community development
The team is also working to bring a community together around monitoring for salmon fisheries management in the Pacific Northwest. “It’s just pretty exciting to have stakeholders who are enthusiastic about getting access to [our technology] as we get it to work and having a tighter integration and collaboration with them,” says Beery. “I think particularly when you’re working on food and water systems, you need direct collaboration to help facilitate impact, because you're ensuring that what you develop is actually serving the needs of the people and organizations that you are helping to support.”
This past June, Beery’s lab organized a workshop in Seattle that convened nongovernmental organizations, tribes, and state and federal departments of fish and wildlife to discuss the use of automated sonar systems to monitor and manage salmon populations. Kay notes that the workshop was an “awesome opportunity to have everybody sharing different ways that they're using sonar and thinking about how the automated methods that we’re building could fit into that workflow.” The discussion continues now via a shared Slack channel created by the team, with over 50 participants. Convening this group is a significant achievement, as many of these organizations would not otherwise have had an opportunity to come together and collaborate.
Looking forward
As the team continues to tune the computer vision system, refine their technology, and engage with diverse stakeholders — from Indigenous communities to fishery managers — the project is poised to make significant improvements to the efficiency and accuracy of salmon monitoring and management in the region. And as Beery advances the work of her MIT group, the J-WAFS seed grant is helping to keep challenges such as fisheries management in her sights.
“The fact that the J-WAFS seed grant existed here at MIT enabled us to continue to work on this project when we moved here,” comments Beery, adding “it also expanded the scope of the project and allowed us to maintain active collaboration on what I think is a really important and impactful project.”
As J-WAFS marks its 10th anniversary this year, the program aims to continue supporting and encouraging MIT faculty to pursue innovative projects that aim to advance knowledge and create practical solutions with real-world impacts on global water and food system challenges.
Protecting Free Speech in Texas: We Need To Stop SB 336
The Texas legislature will soon be debating a bill that would seriously weaken the free speech protections of people in that state. If you live in Texas, it’s time to contact your state representatives and let them know you oppose this effort.
Texas Senate Bill 336 (SB 336) is an attack on the Texas Citizens Participation Act (TCPA), the state’s landmark anti-SLAPP law, passed in 2011 with overwhelming bipartisan support. If passed, SB 336 (or its identical companion bill, H.B. 2459) will weaken safeguards against abusive lawsuits that seek to silence people’s speech.
What Are SLAPPs?
SLAPPs, or Strategic Lawsuits Against Public Participation, are lawsuits filed not to win on the merits but to burden individuals with excessive legal costs. SLAPPs are often used by the powerful to intimidate critics and discourage public discussion that they don’t like. By forcing defendants to engage in prolonged and expensive legal battles, SLAPPs create a chilling effect that discourages others from speaking out on important issues.
Under the TCPA, when a defendant files a motion to dismiss a SLAPP lawsuit, the legal proceedings are automatically paused while a court determines whether the case should move forward. They are also paused if the SLAPP victim needs to get a second review from an appellate court. This is crucial to protect individuals from being dragged through an expensive discovery process while their right to speak out is debated in a higher court.
SB 336 Undermines Free Speech Protections
SB 336 strips away safeguards by removing the automatic stay of trial court proceedings in certain TCPA appeals. Even if a person has a strong claim that a lawsuit against them is frivolous, they would still be forced to endure the financial and emotional burden of litigation while waiting for an appellate decision.
This would expose litigants to legal harassment. With no automatic stay, plaintiffs with deep pockets will be able to financially drain defendants. In the words of former Chief Justice of the Texas Supreme Court, Wallace B. Jefferson, removing the automatic stay in the TCPA would create a “two-tier system in which parties would be forced to litigate their cases simultaneously at the trial and appellate courts.”
If the TCPA is altered, the biggest losers will be everyday Texans who rely on the TCPA to shield them from retaliatory lawsuits. That will include domestic violence survivors who face defamation suits from their abusers after reporting them; journalists and whistleblowers who expose corruption and corporate wrongdoing; grassroots activists who choose to speak out; and small business owners and consumers who leave honest reviews and speak out against unethical business practices.
Often, these individuals already face uphill battles when confronting wealthier and more powerful parties in court. SB 336 would tip the scales further in favor of those with the financial means to weaponize the legal system against speech they dislike.
Fighting To Protect Free Speech For Texans
In addition to EFF, SB 336 is opposed by a broad coalition of groups including the ACLU, the Reporters Committee for Freedom of the Press, and an array of national and local news organizations. To learn more about the TCPA and current efforts to weaken it, check out the website maintained by the Texas Protect Free Speech Coalition.
Unfortunately, this is the fourth legislative session in a row in which a bill has been pushed to significantly weaken the TCPA. Those efforts started in 2019, and while we stopped the worst changes that year, the 2019 Texas Legislature did vote through some unfortunate exceptions to TCPA rules. We succeeded in blocking a slate of poorly thought-out changes in 2023. We can, and must, protect the TCPA again in 2025, if people speak up.
If you live in Texas, call or email your state representatives or the senators on the Committee on State Affairs today and urge them to vote NO on SB 336. Let’s ensure Texas continues to be a place where people’s voices are heard, not silenced by unjust lawsuits.
AIs and Robots Should Sound Robotic
Most people know that robots no longer sound like tinny trash cans. They sound like Siri, Alexa, and Gemini. They sound like the voices in labyrinthine customer support phone trees. And even those robot voices are being made obsolete by new AI-generated voices that can mimic every vocal nuance and tic of human speech, down to specific regional accents. And with just a few seconds of audio, AI can now clone someone’s specific voice.
This technology will replace humans in many areas. Automated customer support will save money by cutting staffing at ...