EFF: Updates

EFF's Deeplinks Blog: Noteworthy news from around the internet

Online Tracking is Out of Control—Privacy Badger Can Help You Fight Back

Thu, 03/27/2025 - 5:09pm

Every time you browse the web, you're being tracked. Most websites contain invisible tracking code that allows companies to collect and monetize data about your online activity. Many of those companies are data brokers, who sell your sensitive information to anyone willing to pay. That’s why EFF created Privacy Badger, a free, open-source browser extension used by millions to fight corporate surveillance and take back control of their data. 

Since we first released Privacy Badger in 2014, online tracking has only gotten more invasive and Privacy Badger has evolved to keep up. Whether this is your first time using it or you’ve had it installed since day one, here’s a primer on how Privacy Badger protects you.

Online Tracking Isn't Just Creepy—It’s Dangerous 

The rampant data collection, sharing, and selling fueled by online tracking has serious consequences. Fraudsters purchase data to identify elderly people susceptible to scams. Government agencies and law enforcement purchase people’s location data and web browsing records without a warrant. Data brokers help predatory companies target people in financial distress. And surveillance companies repackage data into government spy tools.

Once your data enters the data broker ecosystem, it’s nearly impossible to know who buys it and what they’re doing with it. Privacy Badger blocks online tracking to prevent your browsing data from being used against you. 

Privacy Badger Disrupts Surveillance Business Models

Online tracking is pervasive because it’s profitable. Tech companies earn enormous profits by targeting ads based on your online activity—a practice called “online behavioral advertising.” In fact, Big Tech giants like Google, Meta, and Amazon are among the top companies tracking you across the web. By automatically blocking their trackers, Privacy Badger makes it harder for Big Tech companies to profit from your personal information.

Online behavioral advertising has made surveillance the business model of the internet. Companies are incentivized to collect as much of our data as possible, then share it widely through ad networks with no oversight. This not only exposes our sensitive information to bad actors, but also fuels government surveillance. Ending surveillance-based advertising is essential for building a safer, more private web. 

While strong federal privacy legislation is the ideal solution—and one that we continue to advocate for—Privacy Badger gives you a way to take action today. 

Privacy Badger fights for a better web by incentivizing companies to respect your privacy. Privacy Badger sends the Global Privacy Control and Do Not Track signals to tell companies not to track you or share your data. If they ignore these signals, Privacy Badger will block them, whether they are advertisers or trackers of other kinds. By withholding your browsing data from advertisers, data brokers, and Big Tech companies, you can help make online surveillance less profitable. 
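To make the mechanics concrete, here is a minimal sketch, in TypeScript, of how a website's own code could honor the opt-out signals Privacy Badger sends. The `Sec-GPC` request header and the `navigator.globalPrivacyControl` and `navigator.doNotTrack` properties come from the GPC and DNT specifications; the function name and surrounding logic are illustrative assumptions, not part of Privacy Badger itself.

```typescript
// Sketch: how a site could check the opt-out signals a visitor's browser sends.
// `navigator.globalPrivacyControl` is not yet in the standard TypeScript DOM
// typings, so it is read through a widened type.

function visitorHasOptedOut(): boolean {
  const nav = navigator as Navigator & { globalPrivacyControl?: boolean };

  // Global Privacy Control: also delivered to the server as a
  // `Sec-GPC: 1` request header on every request.
  const gpc = nav.globalPrivacyControl === true;

  // Do Not Track: the legacy signal, exposed as the string "1" when enabled
  // and sent as a `DNT: 1` request header.
  const dnt = navigator.doNotTrack === "1";

  return gpc || dnt;
}

// Example: a site respecting the signals before loading ad-tech code.
if (visitorHasOptedOut()) {
  console.log("Visitor opted out of tracking; skipping third-party trackers.");
} else {
  // load analytics / advertising scripts here
}
```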

How Privacy Badger Protects You From Online Tracking

Whether you're looking to protect your sensitive information from data brokers or simply don’t want Big Tech monetizing your data, Privacy Badger is here to help.

Over the past decade, Privacy Badger has evolved to fight many different methods of online tracking. Here are some of the ways that Privacy Badger protects your data:

  • Blocks Third-Party Trackers and Cookies: Privacy Badger stops tracking code from loading on sites that you visit. That prevents companies from collecting data about your online activity on sites that they don’t own. 
  • Sends the GPC Signal to Opt Out of Data Sharing: Privacy Badger sends the Global Privacy Control (GPC) signal to opt out of websites selling or sharing your personal information. This signal is legally binding in some states, including California, Colorado, and Connecticut. 
  • Stops Social Media Companies From Tracking You Through Embedded Content: Privacy Badger replaces page elements that track you but are potentially useful (like embedded tweets) with click-to-activate placeholders. Social media buttons, comments sections, and video players can send your data to other companies, even if you don’t click on them.
  • Blocks Link Tracking on Google and Facebook: Privacy Badger blocks Google and Facebook’s attempts to follow you whenever you click a link on their websites. Google not only tracks the links you visit from Google Search, but also the links you click on platforms that feel more private, like Google Docs and Gmail.
  • Blocks Invasive “Fingerprinting” Trackers: Privacy Badger blocks trackers that try to identify you based on your browser's unique characteristics, a particularly problematic form of tracking called “fingerprinting.” 
  • Automatically Learns to Block New Trackers: Our Badger Swarm research project continuously discovers new trackers for Privacy Badger to block. Trackers are identified based on their behavior, not just human-curated blocklists (a simplified sketch of this behavior-based approach follows this list).
  • Disables Harmful Chrome Settings: Privacy Badger automatically disables Google Chrome settings that are bad for your privacy.
  • Easy to Disable on Individual Sites While Maintaining Protections Everywhere Else: If blocking harmful trackers ends up breaking something on a website, you can disable Privacy Badger for that specific site while maintaining privacy protections everywhere else.
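As promised above, here is a deliberately simplified TypeScript sketch of a behavior-based learning heuristic: a third-party domain gets blocked once it has been observed tracking on several distinct first-party sites. The threshold, data structures, and function names are illustrative assumptions for this example, not Privacy Badger's actual implementation.

```typescript
// Sketch of a behavior-based tracker heuristic: flag a third-party domain
// once it has been seen tracking visitors on enough distinct first-party sites.

const TRACKING_THRESHOLD = 3; // illustrative cutoff

// Map of third-party domain -> set of first-party sites it was seen tracking on.
const observations = new Map<string, Set<string>>();

function recordTracking(thirdParty: string, firstPartySite: string): void {
  const sites = observations.get(thirdParty) ?? new Set<string>();
  sites.add(firstPartySite);
  observations.set(thirdParty, sites);
}

function shouldBlock(thirdParty: string): boolean {
  return (observations.get(thirdParty)?.size ?? 0) >= TRACKING_THRESHOLD;
}

// Example: the same tracker observed setting unique cookies on three sites.
recordTracking("tracker.example", "news.example");
recordTracking("tracker.example", "shop.example");
recordTracking("tracker.example", "blog.example");
console.log(shouldBlock("tracker.example")); // true
```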

All of these privacy protections work automatically when you install Privacy Badger—there’s no setup required! And it turns out that when Privacy Badger blocks tracking, you’ll also see fewer ads and your pages will load faster. 

You can always check to see what Privacy Badger has done on the site you’re visiting by clicking on Privacy Badger’s icon in your browser toolbar.

Fight Corporate Surveillance by Spreading the Word About Privacy Badger

Privacy is a team sport. The more people who withhold their data from data brokers and Big Tech companies, the less profitable online surveillance becomes. If you haven’t already, visit privacybadger.org to install Privacy Badger on your web browser. And if you like Privacy Badger, tell your friends about how they can join us in fighting for a better web!

Install Privacy Badger

A New Tool to Detect Cellular Spying | EFFector 37.3

Wed, 03/26/2025 - 3:54pm

Take some time during your Spring Break to catch up on the latest digital rights news by subscribing to EFF's EFFector newsletter!

This edition of the newsletter covers our new open source tool to detect cellular spying, Rayhunter; The Foilies 2025, our tongue-in-cheek awards to the worst responses to public records requests; and our recommendations to the NSF for the new AI Action Plan to put people first.

You can read the full newsletter here, and even get future editions delivered directly to your inbox when you subscribe! Additionally, we've got an audio edition of EFFector on the Internet Archive, or you can listen to it on YouTube.


Since 1990 EFF has published EFFector to help keep readers on the bleeding edge of their digital rights. We know that the intersection of technology, civil liberties, human rights, and the law can be complicated, so EFFector is a great way to stay on top of things. The newsletter is chock full of links to updates, announcements, blog posts, and other stories to help keep readers—and listeners—up to date on the movement to protect online privacy and free expression. 

Thank you to the supporters around the world who make our work possible! If you're not a member yet, join EFF today to help us fight for a brighter digital future.

How to Delete Your 23andMe Data

Wed, 03/26/2025 - 2:45pm

This week, the genetic testing company 23andMe filed for bankruptcy, which means the genetic data the company collected on millions of users is now up for sale. If you do not want your data included in any potential sale, it’s a good time to ask the company to delete it.

When the company first announced it was considering a sale, we highlighted many of the potential issues, including the risk of that data being sold to companies with poor security practices or direct links to law enforcement. With this bankruptcy, the concerns we expressed last year remain the same. It is unclear what will happen with your genetic data if 23andMe finds a buyer, and that uncertainty is a clear indication that you should consider deleting your data. California Attorney General Rob Bonta agrees.

First: Download Your Data

Before you delete your account, you may want to download the data for your own uses. If you do so, be sure to store it securely. To download your data:

  1. Log into your 23andMe account and click your username, then click "Settings." 
  2. Scroll down to the bottom where it says "23andMe Data" and click "View."
  3. Here, you'll find the option to download various parts of your 23andMe data. The most important ones to consider are:
    1. The "Reports Summary" includes details like the "Wellness Reports," "Ancestry Reports," and "Traits Reports."
    2. The "Ancestry Composition Raw Data" the company's interpretation of your raw genetic data.
    3. If you were using the DNA Relatives feature, the "Family Tree Data" includes all the information about your relatives. Based on the descriptions of the data we've seen, this sounds like the data that bad actors collected in the company's 2023 breach.
    4. You can also download the "Raw data," which is the uninterpreted version of your DNA. 

There are other types of data you can download on this page, though much of it will not be of use to you without special software. But there's no harm in downloading it all. 
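If you do want to poke at the "Raw data" export yourself, a short script is usually enough. The sketch below (TypeScript on Node.js) assumes the export is a plain-text file with "#" comment lines followed by tab-separated columns for variant ID, chromosome, position, and genotype; the file name and exact format here are assumptions for illustration, so adjust as needed.

```typescript
// Sketch: load a raw genotype export, skipping comment lines and parsing
// tab-separated rows into simple objects. Format details are assumed.

import { readFileSync } from "node:fs";

interface Variant {
  rsid: string;
  chromosome: string;
  position: number;
  genotype: string;
}

function parseRawData(path: string): Variant[] {
  return readFileSync(path, "utf8")
    .split("\n")
    .filter((line) => line.trim() !== "" && !line.startsWith("#"))
    .map((line) => {
      const [rsid, chromosome, position, genotype] = line.split("\t");
      return { rsid, chromosome, position: Number(position), genotype };
    });
}

// Example: count how many variants were exported.
const variants = parseRawData("raw_data_export.txt"); // hypothetical file name
console.log(`Parsed ${variants.length} variants.`);
```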

How to Delete Your Data

Finally, you can delete your data and revoke consent for research. While 23andMe doesn’t make this clear on the deletion page, deleting your data also authorizes the company to destroy your DNA sample, if you haven’t already asked it to do so. You can also make this request explicitly in the Account Preferences section if you want.

If you're still on the page to download your data from the steps above, you can skip to step three. Otherwise:

  1. Click your username, then click "Settings." 
  2. Scroll down to the bottom where it says "23andMe Data" and click "View."
  3. Scroll down to the bottom of this page, and click "Permanently Delete Data."
  4. You should get a message stating that 23andMe received the request but you need to confirm by clicking a link sent to your email. 
  5. Head to your email account associated with your 23andMe account to find the email titled "23andMe Delete Account Request." Click the "Permanently Delete All Records" button at the bottom of the email, and you will be taken to a page that will say "Your data is being deleted" (You may need to log in again, if you logged out).

23andMe should give every user a real choice to say “no” to a data transfer in this bankruptcy and ensure that any buyer makes real privacy commitments. Other consumer genetic genealogy companies should proactively take these steps as well. Our DNA contains our entire genetic makeup. It can reveal where our ancestors came from, who we are related to, our physical characteristics, and whether we are likely to get genetically determined diseases. Even if you don’t add your own DNA to a private database, a relative could make that choice for you by adding their own.

This incident is an example of why this matters, and how certain features that may seem useful in the moment can be weaponized in novel ways. A bankruptcy should not result in our data getting shuffled off to the highest bidder without our input or a guarantee of real protections.

Saving the Internet in Europe: Fostering Choice, Competition and the Right to Innovate

Tue, 03/25/2025 - 9:58am

This is the fourth and final post in a series about EFF’s work in Europe. Read about how and why we work in Europe here.

EFF’s mission is to ensure that technology supports freedom, justice, and innovation for all people of the world. While our work has taken us to far corners of the globe, in recent years we have worked to expand our efforts in Europe, building up a policy team with key expertise in the region, and bringing our experience in advocacy and technology to the European fight for digital rights.   

In this blog post series, we will introduce you to the various players involved in that fight, share how we work in Europe, and discuss how what happens in Europe can affect digital rights across the globe.  

EFF’s Approach to Competition  

Market concentration and monopoly power, both among internet companies and in internet access, affect many of EFF’s issues, particularly innovation, consumer privacy, net neutrality, and platform censorship. And we have said it many times: Antitrust law and rules on market fairness are powerful tools with the potential either to further entrench established giants’ hold over a market or to challenge incumbents and spur innovation and choice that benefit users. Antitrust enforcement must hit monopolists where it hurts: ensuring that anti-competitive behaviors like abuse of dominance by multi-billion-dollar tech giants come at a price high enough to force real change.

The EU has recently shown that it is serious about cracking down on Big Tech companies with its full arsenal of antitrust rules. For example, in a high-stakes appeal in 2022, EU judges upheld a record fine of roughly €4.1 billion against Google for abusing its dominant position by locking Android users into its search engine (the case is now pending before the Court of Justice).

We believe that with the right dials and knobs, clever competition rules can complement antitrust enforcement and ensure that firms that grow top-heavy and sluggish are displaced by nimbler new competitors. Good competition rules should enable better alternatives that protect users’ privacy and enhance users’ technological self-determination. In the EU, this requires not only proper enforcement of existing rules but also new regulation that tackles gatekeepers’ dominance before harm is done.

The Digital Markets Act  

The DMA will probably turn out to be one of the most impactful pieces of EU tech legislation in history. It’s complex, but the overall approach is to place new requirements and restrictions on online “gatekeepers”: the largest tech platforms, which control access to digital markets for other businesses. These requirements are designed to break down the barriers businesses face in competing with the tech giants.

Let’s break down some of the DMA’s rules. If enforced robustly, the DMA will make it easier for users to switch services, install third-party apps and app stores, and have more power over default settings on their mobile computing devices. Users will no longer be steered into sticking with the defaults embedded in their devices and can choose, for example, their own default browser on Apple’s iOS. The DMA also tackles data collection practices: gatekeepers can no longer combine users’ data across services or sign users into new services without their explicit consent, and must provide them with a specific choice. A “pay or consent” advertising model, as proposed by Meta, will probably not cut it.

There are also new data access and sharing requirements that could benefit users, such as the right of end users to request effective portability of data and get access to effective tools to this end. One section of the DMA even requires gatekeepers to make their person-to-person messaging systems (like WhatsApp) interoperable with competitors’ systems on request—making it a globally unique ex ante obligation in competition regulation. At EFF, we believe that interoperable platforms can be a driver for technological self-determination and a more open internet. But even though data portability and interoperability are anti-monopoly medicine, they come with challenges: Ported data can contain sensitive information about you and interoperability poses difficult questions about security and governance, especially when it’s mandated for encrypted messaging services. Ideally, the DMA should be implemented to offer better protections for users’ privacy and security, new features, new ways of communication and better terms of service.  

There are many more do's and don'ts in the EU’s new fairness rulebook, such as the prohibition on platforms favouring their own products and services over those of rivals in ranking, crawling, and indexing (ensuring users a real choice!), along with many other measures. All of these requirements are intended to create more fairness and contestability in digital markets—a laudable objective. If done right, the DMA presents an opportunity for real change for technology users—and a real threat to current abusive or unfair industry practices by Big Tech. But if implemented poorly, it could create more legal uncertainty, restrict free expression, or even legitimize the status quo. It is now up to the European Commission to bring the DMA’s promises to life.

Public Interest 

As the EU’s 2024–2029 mandate is now in full swing, it will be important not to lose sight of the big picture. Fairness rules can only be truly fair if they follow a public-interest approach: empowering users, businesses, and society more broadly, and making it easier for users to control the technology they rely on. And we cannot stop here: the EU must strive to foster a public interest internet and support open-source and decentralized alternatives. Competition and innovation are interconnected forces, and the recent rise of the Fediverse makes this clear. Platforms like Mastodon and Bluesky thrive by filling gaps (and addressing frustrations) left by corporate giants, offering users more control over their experience and ultimately strengthening the resilience of the open internet. The EU should generally support user-controlled alternatives to Big Tech and use smart legislation to foster interoperability for services like social networks. In an ideal world, users are no longer locked into dominant platforms and the ad-tech industry—responsible for pervasive surveillance and other harms—is brought under control.

What we don’t want is a European Union that conflates fairness with protectionist industrial policies or reacts to geopolitical tensions with measures that could backfire on digital openness and fair markets. The enforcement of the DMA and new EU competition and digital rights policies must remain focused on prioritizing user rights and ensuring compliance from Big Tech—not tolerating malicious (non)compliance tactics—and upholding the rule of law rather than politicized interventions. The EU should avoid policies that could lead to a fragmented internet and must remain committed to net neutrality. It should also not hesitate to counter the concentration of power in the emerging AI stack market, where control over infrastructure and technology is increasingly in the hands of a few dominant players. 

EFF will be watching. And we will continue to fight to save the internet in Europe, ensuring that fairness in digital markets remains rooted in choice, competition, and the right to innovate. 

230 Protects Users, Not Big Tech

Mon, 03/24/2025 - 3:22pm

Once again, several Senators appear poised to gut one of the most important laws protecting internet users: Section 230 (47 U.S.C. § 230).

Don’t be fooled: many of Section 230’s detractors claim that this critical law only protects Big Tech. The reality is that Section 230 provides limited protection for all platforms, though the biggest beneficiaries are small platforms and users. Why else would some of the biggest platforms be willing to endorse a bill that guts the law? In fact, repealing Section 230 would only cement the status of Big Tech monopolies.

As EFF has said for years, Section 230 is essential to protecting individuals’ ability to speak, organize, and create online. 

Congress knew exactly what Section 230 would do – that it would lay the groundwork for speech of all kinds across the internet, on websites both small and large. And that’s exactly what has happened.  

Section 230 isn’t in conflict with American values. It upholds them in the digital world. People are able to find and create their own communities, and moderate them as they see fit. People and companies are responsible for their own speech, but (with narrow exceptions) not the speech of others. 

The law is not a shield for Big Tech. Critically, the law benefits the millions of users who don’t have the resources to build and host their own blogs, email services, or social media sites, and instead rely on services to host that speech. Section 230 also benefits thousands of small online services that host speech. Those people are being shut out as the bill sponsors pursue a dangerously misguided policy.  

If Big Tech is at the table in any future discussion of what rules should govern internet speech, EFF has no confidence that the result will protect and benefit internet users, as Section 230 does currently. If Congress is serious about rewriting the internet’s speech rules, it must spend time listening to the small services and everyday users who would be harmed should it repeal Section 230.

Section 230 Protects Everyday Internet Users 

There’s another glaring omission in the arguments to end Section 230: how central the law is to ensuring that every person can speak online, and that Congress or the Administration does not get to define what speech is “good” and “bad”.   

Let’s start with the text of Section 230. Importantly, the law protects both online services and users. It says that “no provider or user of an interactive computer service shall be treated as the publisher or speaker” of content created by another. That's in clear agreement with most Americans’ belief that people should be held responsible for their own speech—not that of others.

Section 230 protects individual bloggers, anyone who forwards an email, and social media users who have ever reshared or retweeted another person’s content online. Section 230 also protects individual moderators who might delete or otherwise curate others’ online content, along with anyone who provides web hosting services.

As EFF has explained, online speech is frequently targeted with meritless lawsuits. Big Tech can afford to fight these lawsuits without Section 230. Everyday internet users, community forums, and small businesses cannot. Engine has estimated that without Section 230, many startups and small services would be inundated with costly litigation that could drive them offline. Even entirely meritless lawsuits cost thousands of dollars to fight, and often tens or hundreds of thousands of dollars.

Deleting Section 230 Will Create A Field Day For The Internet’s Worst Users  

Section 230’s detractors say that too many websites and apps have “refused” to go after “predators, drug dealers, sex traffickers, extortioners and cyberbullies,” and imagine that removing Section 230 will somehow force these services to better moderate user-generated content on their sites.  

These arguments fundamentally misunderstand Section 230. The law lets platforms decide, largely for themselves, what kind of speech they want to host, and to remove speech that doesn’t fit their own standards without penalty. 

 If lawmakers are legitimately motivated to help online services root out unlawful activity and terrible content appearing online, the last thing they should do is eliminate Section 230. The current law strongly incentivizes websites and apps, both large and small, to kick off their worst-behaving users, to remove offensive content, and in cases of illegal behavior, work with law enforcement to hold those users responsible. 

If Congress deletes Section 230, the pre-digital legal rules around distributing content would kick in. Those rules strongly discourage services from moderating or even knowing about user-generated content. This is because the more a service moderates user content, the more likely it is to be held liable for that content. Under that legal regime, online services will have a huge incentive to simply not moderate and not look for bad behavior. This would result in the exact opposite of lawmakers’ stated goal of protecting children and adults from harmful content online.
