Feed aggregator
EFF Stands in Solidarity With RightsCon and the Global Digital Rights Community
When governments shut down spaces for dialogue, dissent, and collective organizing, the damage extends far beyond a single event. The abrupt cancellation of RightsCon 2026—the world’s largest annual global digital rights conference—is not just a logistical disruption for thousands of researchers, journalists, technologists, and activists—it is part of a growing global pattern of shrinking civic space and increasing hostility toward free expression and independent civil society.
Just days before the conference was set to begin and as participants had begun to arrive in Lusaka, organizers announced that RightsCon would no longer proceed in Zambia or online after mounting political pressure and demands that would have excluded vulnerable communities and constrained discussion. The U.N.’s World Press Freedom Day, which was set to take place just prior to the conference, was scaled down in light of the events, and its press freedom prize ceremony postponed to a later date.
RightsCon has long served as one of the few truly global convenings where civil society groups, grassroots organizers, technologists, and policymakers can meet on equal footing to confront some of the most urgent human rights challenges of the digital age—from censorship and surveillance to internet shutdowns, platform accountability, and the safety of marginalized communities online. EFF has had a presence at RightsCon since its inception in 2011, and had planned to meet with and learn from international partners and present our work during several sessions in Lusaka.
The cancellation is especially devastating because of what RightsCon represents. For many advocates—particularly those from the global majority—it is not merely another conference. It is a rare opportunity to build solidarity across borders, form lasting partnerships, learn from other regions’ experiences, secure funding and support for local work, and ensure that the people most impacted by digital repression have a seat at the table. Holding the event in southern Africa carried particular significance, promising to elevate regional voices and strengthen local digital rights networks.
What happened in Zambia sends a chilling message. According to organizers and multiple reports, the pressure surrounding the event included Chinese government demands to exclude Taiwanese participants and moderate discussions around politically sensitive topics. At a moment when governments around the world are increasingly restricting protest, targeting journalists, cutting funds for human rights work, banning young people from online communities, censoring speech, and criminalizing civil society activity, the cancellation of RightsCon reflects the broader erosion of democratic space online and offline.
Organizations from the digital rights community have spoken out forcefully against the government’s cancellation of the conference, making clear that these attacks on civic participation will not pass unnoticed. Access Now described the decision as evidence of “the far reach of transnational repression targeting civil society.” Index on Censorship’s response warned that the move represents a dangerous escalation in attempts to suppress open dialogue, while IFEX rightly described the cancellation as a blow not just to one conference, but to freedom of expression and assembly everywhere.
We are also heartened to see statements from members of the international community—including Tabani Moyo, who spoke about the impact on the southern African community, and Taiwanese participant Shin Yang, who emphasized the importance of preserving spaces where marginalized communities can safely organize and speak—underscoring that attempts to silence civil society only reinforce the importance of defending open, global spaces for organizing and debate.
Even as this cancellation represents a serious setback, it is important to remember that the digital rights community has always adapted under pressure. Around the world, advocates continue to organize in increasingly difficult environments, finding new ways to connect, collaborate, and resist censorship and repression. Upcoming events like the Global Gathering and FIFAfrica—both of which EFF plans to attend—will bring together members of the community to tackle tough issues. And in the meantime, groups from all over the world are working together to incorporate global perspectives into platform regulations, oppose age verification laws, protect against surveillance, and fight internet shutdowns, among many other efforts.
RightsCon itself emerged from a recognition that defending human rights in the digital age requires international solidarity—and that need has not disappeared.
The conversations that were supposed to happen in Lusaka will continue elsewhere: in community spaces, online gatherings, encrypted chats, and future convenings yet to come. Governments may close venues, restrict participation, or attempt to narrow the boundaries of acceptable speech, but they cannot erase the global movement working to defend a free and open internet.
RightsCon will not go on in Zambia, but we remain heartened and inspired by the strength of the global digital rights community, stand with them in solidarity, and look forward to seeing our allies at the next RightsCon and other upcoming events.
LLMs and Text-in-Text Steganography
Turns out that LLMs are really good at hiding text messages in other text messages.
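The linked work has the LLM itself encode hidden bits through its word choices. As a much simpler illustration of the general idea of text-in-text steganography (not the LLM-based technique itself), a payload can be hidden invisibly inside an ordinary cover sentence using zero-width Unicode characters; this is a minimal hypothetical sketch, not anything from the article:

```python
# Sketch: hide a secret string inside a cover string using zero-width
# Unicode characters. The stego text looks identical to the cover text
# when displayed, but carries the payload bit-by-bit.

ZW0 = "\u200b"  # zero-width space      -> encodes bit 0
ZW1 = "\u200c"  # zero-width non-joiner -> encodes bit 1

def hide(cover: str, secret: str) -> str:
    """Embed `secret` (UTF-8) as invisible characters after the first word."""
    bits = "".join(f"{byte:08b}" for byte in secret.encode("utf-8"))
    payload = "".join(ZW1 if bit == "1" else ZW0 for bit in bits)
    head, _, tail = cover.partition(" ")
    return head + payload + " " + tail

def reveal(stego: str) -> str:
    """Extract the hidden payload by reading back the zero-width characters."""
    bits = "".join("1" if ch == ZW1 else "0"
                   for ch in stego if ch in (ZW0, ZW1))
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

stego = hide("Meet me at the usual place.", "hi")
print(reveal(stego))  # -> hi
```

Unlike this fragile trick (zero-width characters are easy to strip or detect), the LLM approach hides information in plausible word selections, which is what makes it hard to spot.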
Enbridge proposes expansion of New England pipeline
6 things Trump won’t see in the FEMA report
Democratic governors have a new playbook: Build projects fast
Florida and Georgia wildfires show growing risk in Southeast
New York asks to back Sunrise Wind in legal challenge
Texas lifts fiber-optic rule for camp safety enacted after deadly flood
What to know about predictions for record-breaking El Niño
NATO backs renewables as solution to energy security, despite US skepticism
Passengers evacuate from hantavirus ship at Tenerife
Despite gains, forest degradation in Brazil’s Amazon is looming threat
City type specifies carbon cycle
Nature Climate Change, Published online: 11 May 2026; doi:10.1038/s41558-026-02646-5
Largest increase of carbon dioxide in 2024
Nature Climate Change, Published online: 11 May 2026; doi:10.1038/s41558-026-02647-4
Food policy adaptation
Nature Climate Change, Published online: 11 May 2026; doi:10.1038/s41558-026-02645-6
Decreasing ice and colder winters
Nature Climate Change, Published online: 11 May 2026; doi:10.1038/s41558-026-02648-3
Scientists breed low-emission rice to fight climate change
Nature Climate Change, Published online: 11 May 2026; doi:10.1038/s41558-026-02614-z
New hybrid grains are expected to emit less than half of the methane that their natural counterparts emit.
Carbon markets rule change would harm mitigation and Indigenous peoples
Nature Climate Change, Published online: 11 May 2026; doi:10.1038/s41558-026-02629-6
Congress Narrowed the GUARD Act, But Serious Problems Remain
Following criticism, lawmakers have narrowed the GUARD Act, a bill aimed at restricting minors’ access to certain AI systems. The earlier version could have applied broadly to nearly every AI-powered chatbot or search tool. The amended bill focuses more narrowly on so-called “AI companions”—conversational systems designed to simulate emotional or interpersonal interactions with users.
That change does address some of the broadest concerns raised about the original proposal, though some questions about the bill’s reach remain. Bottom line: the revised bill still creates serious problems for privacy, online speech, and parental choice.
Tell Congress: Oppose the GUARD Act
The new GUARD Act still requires companies offering AI companions to implement burdensome age-verification systems tied to users’ real-world identities. Even parents who specifically want their teenagers to use these systems would still face significant hurdles. A family might decide that a conversational AI tool helps an isolated teenager practice social interaction, or engage in harmless creative roleplay. A parent deployed in the military might set up a persistent AI storyteller for a younger child. Under the revised bill, those users could still face mandatory age checks tied to sensitive personal or financial information before they or their children can use these services.
The revised bill also leaves important definitions unclear while sharply increasing penalties for developers and companies that get those judgments wrong. Congress narrowed the GUARD Act. But it is still trying to solve a complicated social problem with vague legal standards, heavy liability, and privacy-invasive verification systems.
Intrusive Age-Verification Remains In The Bill
The revised GUARD Act still requires companies offering AI companions to verify that users are adults through a “reasonable age verification” system. The bill allows a broader set of verification methods than the earlier version, but they are still tied to a user’s real-world identity—such as financial records, or age-verified accounts for a mobile operating system or app store.
That approach still raises serious privacy and access concerns. Millions of Americans do not have current government ID, accounts at major banks, or stable access to the kinds of digital identity systems the bill contemplates. Even for those who do, requiring identity-linked verification to access online speech tools creates real risks for privacy, anonymity, and data security. Many people are rightly creeped out by age-verification systems, and may simply forgo using these services rather than compromise their privacy and security.
The revised definition of “AI companion” is also narrower than before, but it’s unclear at the margins. The bill now focuses on systems that “engage in interactions involving emotional disclosures” from the user, or present a “persistent identity, persona or character.”
EFF appreciates that the authors recognized that the prior definition could reach a variety of AI systems that are not chatbots, including internet search engines. But the narrowed definition could be read to also apply to a variety of chat tools that are not AI companions. For example, many modern online conversational systems increasingly recognize and respond to users’ emotions. Customer service systems, including completely human-powered ones that existed long before AI chatbots, have long been designed to recognize frustration and respond empathetically. As conversational AI becomes more emotionally responsive, a customer service chatbot’s efforts to empathize may sweep it within the bill’s definition.
Bigger Penalties, Bigger Incentives To Restrict Access
The revised bill also sharply increases penalties. Instead of $100,000 per violation, companies—including small developers—can face fines of up to $250,000 per violation, enforced by both federal and state officials.
That kind of liability creates incentives to over-restrict access, especially for minors. Smaller developers, in particular, may decide it is safer to block younger users entirely, disable conversational features, or avoid developing certain tools at all, rather than risk severe penalties under vague standards.
The concerns driving this bill are real. Some AI systems have engaged in troubling interactions with vulnerable users, including minors. But the right answer to that is targeted enforcement against bad actors, and privacy laws that protect us all. The revised GUARD Act instead responds with a privacy-invasive system that burdens the right to speak, read, and interact online.
Congress did improve this bill, but EFF’s core speech, privacy, and security issues remain.
