Feed aggregator
Enbridge proposes expansion of New England pipeline
6 things Trump won’t see in the FEMA report
Democratic governors have a new playbook: Build projects fast
Florida and Georgia wildfires show growing risk in Southeast
New York asks to back Sunrise Wind in legal challenge
Texas lifts fiber-optic rule for camp safety enacted after deadly flood
What to know about predictions for record-breaking El Niño
NATO backs renewables as solution to energy security, despite US skepticism
Passengers evacuate from hantavirus ship at Tenerife
Despite gains, forest degradation in Brazil’s Amazon is looming threat
City type specifies carbon cycle
Nature Climate Change, Published online: 11 May 2026; doi:10.1038/s41558-026-02646-5
Largest increase of carbon dioxide in 2024
Nature Climate Change, Published online: 11 May 2026; doi:10.1038/s41558-026-02647-4
Food policy adaptation
Nature Climate Change, Published online: 11 May 2026; doi:10.1038/s41558-026-02645-6
Decreasing ice and colder winters
Nature Climate Change, Published online: 11 May 2026; doi:10.1038/s41558-026-02648-3
Scientists breed low-emission rice to fight climate change
Nature Climate Change, Published online: 11 May 2026; doi:10.1038/s41558-026-02614-z
New hybrid grains are expected to emit less than half of the methane that their natural counterparts emit.
Carbon markets rule change would harm mitigation and Indigenous peoples
Nature Climate Change, Published online: 11 May 2026; doi:10.1038/s41558-026-02629-6
Congress Narrowed the GUARD Act, But Serious Problems Remain
Following criticism, lawmakers have narrowed the GUARD Act, a bill aimed at restricting minors’ access to certain AI systems. The earlier version could have applied broadly to nearly every AI-powered chatbot or search tool. The amended bill focuses more narrowly on so-called “AI companions”—conversational systems designed to simulate emotional or interpersonal interactions with users.
That change does address some of the broadest concerns raised about the original proposal, though some questions about the bill’s reach remain. Bottom line: the revised bill still creates serious problems for privacy, online speech, and parental choice.
Tell Congress: Oppose the GUARD Act
The new GUARD Act still requires companies offering AI companions to implement burdensome age-verification systems tied to users’ real-world identities. Even parents who specifically want their teenagers to use these systems would still face significant hurdles. A family might decide that a conversational AI tool helps an isolated teenager practice social interaction, or engage in harmless creative roleplay. A parent deployed in the military might set up a persistent AI storyteller for a younger child. Under the revised bill, those users could still face mandatory age checks tied to sensitive personal or financial information before they or their children can use these services.
The revised bill also leaves important definitions unclear while sharply increasing penalties for developers and companies that get those judgments wrong. Congress narrowed the GUARD Act. But it is still trying to solve a complicated social problem with vague legal standards, heavy liability, and privacy-invasive verification systems.
Intrusive Age-Verification Remains in the Bill
The revised GUARD Act still requires companies offering AI companions to verify that users are adults through a “reasonable age verification” system. The bill allows a broader set of verification methods than the earlier version, but they are still tied to a user’s real-world identity—such as financial records, or age-verified accounts for a mobile operating system or app store.
That approach still raises serious privacy and access concerns. Millions of Americans do not have current government ID, accounts at major banks, or stable access to the kinds of digital identity systems the bill contemplates. Even for those who do, requiring identity-linked verification to access online speech tools creates real risks for privacy, anonymity, and data security. Many people are rightly creeped out by age-verification systems, and may simply forgo using these services rather than compromise their privacy and security.
The revised definition of “AI companion” is also narrower than before, but it’s unclear at the margins. The bill now focuses on systems that “engage in interactions involving emotional disclosures” from the user, or present a “persistent identity, persona or character.”
EFF appreciates that the authors recognized that the prior definition could reach a variety of AI systems that are not chatbots, including internet search engines. But the narrowed definition could be read to also apply to a variety of chat tools that are not AI companions. For example, many modern online conversational systems increasingly recognize and respond to users’ emotions. Customer service systems, including completely human-powered ones that existed long before AI chatbots, have long been designed to recognize frustration and respond empathetically. As conversational AI becomes more emotionally responsive, a customer service chatbot’s efforts to empathize may sweep it within the bill’s definition.
Bigger Penalties, Bigger Incentives to Restrict Access
The revised bill also sharply increases penalties. Instead of $100,000 per violation, companies—including small developers—can face fines of up to $250,000 per violation, enforced by both federal and state officials.
That kind of liability creates incentives to over-restrict access, especially for minors. Smaller developers, in particular, may decide it is safer to block younger users entirely, disable conversational features, or avoid developing certain tools at all, rather than risk severe penalties under vague standards.
The concerns driving this bill are real. Some AI systems have engaged in troubling interactions with vulnerable users, including minors. But the right answer to that is targeted enforcement against bad actors, and privacy laws that protect us all. The revised GUARD Act instead responds with a privacy-invasive system that burdens the right to speak, read, and interact online.
Congress did improve this bill, but EFF’s core speech, privacy, and security issues remain.
Friday Squid Blogging: Giant Squid Live in the Waters of Western Australia
Insider Betting on Polymarket
Insider trading is rife on Polymarket:
Analysis by the Anti-Corruption Data Collective, a non-profit research and advocacy group, found that long-shot bets—defined as wagers of $2,500 or more at odds of 35 percent or less—on the platform had an average win rate of around 52 percent in markets on military and defense actions.
That compares with a win rate of 25 percent across all politics-focused markets and just 14 percent for all markets on the platform as a whole.
It is absolutely insane that this is legal. We already know how insider betting warps sports. Insider betting warping politics—and military actions—is orders of magnitude worse...
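The screen described above can be sketched in a few lines. The thresholds ($2,500 minimum stake, odds of 35 percent or less) come from the article; the bet records and the comparison against odds-implied expectations are purely illustrative assumptions, not Polymarket data or the Anti-Corruption Data Collective's actual methodology.

```python
# Hypothetical sketch of the long-shot screen described above.
# Thresholds ($2,500 stake, <=35% odds) are from the article;
# the bet data below is invented for illustration.

def long_shot_win_rate(bets, min_stake=2500, max_odds=0.35):
    """Win rate among qualifying long-shot bets, or None if there are none."""
    qualifying = [b for b in bets
                  if b["stake"] >= min_stake and b["odds"] <= max_odds]
    if not qualifying:
        return None
    wins = sum(1 for b in qualifying if b["won"])
    return wins / len(qualifying)

# Illustrative wallet: long-shot results far above what the odds imply.
bets = [
    {"stake": 3000, "odds": 0.20, "won": True},
    {"stake": 5000, "odds": 0.30, "won": True},
    {"stake": 2600, "odds": 0.10, "won": False},
    {"stake": 4000, "odds": 0.25, "won": True},
]
rate = long_shot_win_rate(bets)
implied = sum(b["odds"] for b in bets) / len(bets)
print(f"long-shot win rate: {rate:.0%} (odds imply about {implied:.0%})")
```

A win rate near the odds-implied rate is what a fair market predicts; a large, persistent gap on big bets is the kind of signal the analysis flags.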
Free Signal Guide
EFF friend Guy Kawasaki* has written a book: Everybody Has Something to Hide: Why and How to Use Signal to Preserve Your Privacy, Security, and Well-Being. This guide is now available in Spanish and English as an ebook in the EPUB format that you can download here. Take a look and consider sharing it with anyone you know who uses (or should use) Signal.
And don't forget: EFF has two short guides on using Signal on our Surveillance Self-Defense site: an intro How to Use Signal guide and a guide on Managing Signal Groups.
Everybody Has Something to Hide: Why and How to Use Signal to Preserve Your Privacy, Security, and Well-Being courtesy of Guy Kawasaki.
*Guy Kawasaki is an EFF donor.
