Cindy Cohn Is Leaving the EFF, but Not the Fight for Digital Rights

After a quarter century defending digital rights, Cindy Cohn announced on Tuesday that she is stepping down as executive director of the Electronic Frontier Foundation. Cohn, who has led the San Francisco–based nonprofit since 2015, says she will leave the role later this year, concluding a chapter that helped define the modern fight over online freedom.
Cohn first rose to prominence as lead counsel in Bernstein v. Department of Justice, the 1990s case that overturned federal restrictions on publishing encryption code. As EFF’s legal director and later executive director, she guided the group through legal challenges to government surveillance, reforms to computer crime laws, and efforts to hold corporations accountable for data collection. Over the past decade, EFF has expanded its influence, becoming a central force in shaping the debate over privacy, security, and digital freedom.
In an interview with WIRED, Cohn reflected on EFF’s foundational encryption victories, its unfinished battles against National Security Agency (NSA) surveillance, and the organization’s work protecting independent security researchers. She spoke about the shifting balance of power between corporations and governments, the push for stronger state-level privacy laws, and the growing risks posed by artificial intelligence.
Though stepping down from leadership, Cohn tells WIRED she plans to remain active in the fight against mass surveillance and government secrecy. Describing herself as “more of a warrior than a manager,” she says her intent is to return to frontline advocacy. She is also at work on a forthcoming book, Privacy’s Defender, due out next spring, which she hopes will inspire a new generation of digital rights advocates.
This interview has been edited for length and clarity.
WIRED: Tell us about the fights you won, and the ones that still feel unfinished after 25 years.
CINDY COHN: The early fight we waged to free encryption from government regulation still stands out as setting the stage for a potentially secure internet. We're still working on turning that promise into a reality, but we're in such a different place than we would've been in had we lost that fight. Encryption protects anybody who buys anything online, whistleblowers and journalists who use Signal, or just regular people who want privacy and use WhatsApp or Signal. Even the behind-the-scenes certificate infrastructure provided by certificate authorities like Let's Encrypt—which makes sure that when you think you're going to your bank, you're actually going to your bank's website—is made possible because of encryption. These are all things that would've been at risk if we hadn't won that fight. I think that win was foundational, even though the fights aren't over.
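To make that concrete, here is a minimal, illustrative sketch in Python (standard library only, with a placeholder hostname) of the kind of check a browser performs on every HTTPS connection: verifying that the server's certificate chains back to a trusted authority, such as Let's Encrypt, and matches the site you asked for.

```python
# Illustrative sketch: how a TLS client verifies it reached the real site.
# Standard library only; "example.com" is a placeholder hostname.
import socket
import ssl

hostname = "example.com"  # stand-in for your bank's domain

# The default context loads the system's trusted certificate authorities
# (Let's Encrypt's root is typically among them) and requires validation.
context = ssl.create_default_context()

with socket.create_connection((hostname, 443)) as sock:
    # The TLS handshake checks that the server's certificate chains to a
    # trusted CA and matches the hostname; if either check fails,
    # ssl.SSLCertVerificationError is raised and no data is exchanged.
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        cert = tls.getpeercert()
        print("Issued by:", dict(pair[0] for pair in cert["issuer"]))
        print("Valid until:", cert["notAfter"])
```

If the certificate were forged or expired, the handshake would fail before any data was sent; that failure mode is part of what the encryption fights of the 1990s preserved.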
The fights that we've had around the NSA and national security, those are still works in progress. We were not successful with our big challenge to the NSA spying in Jewel v. NSA, although over the long arc of that case and the accompanying legislative fights, we managed to claw back quite a bit of what the NSA started doing after 9/11.
But it's unfinished business. A related problem is the government's misuse of national security justifications to violate people's rights, including their privacy. I think it's especially important now that we do something about the misuse of national security arguments and secrecy, because in the hands of the Trump administration it's become an all-purpose tool to go after political enemies, violate people's rights, and do so many other things that we have been warning about for years.
Another thing that we've really accomplished at EFF, and that is sometimes overlooked, is the work we call "coders' rights," especially protecting people who do independent security research. The first year I was at EFF, the FBI arrested somebody at Defcon for having the audacity to reveal that Adobe's security on one of its products was lousy—the guy was Dmitry Sklyarov. Dramatic arrests of security researchers don't happen anymore, thankfully. We've seen the software world really evolve on this. Now companies have bug-bounty programs to reward independent security researchers. And companies like Microsoft have come a long way in devoting themselves to security and supporting security research. We still occasionally have to stand up for somebody revealing security flaws, though these days more often outside traditional software development, in areas like car and medical-device security. But we just passed another Defcon where we didn't have to bail anyone out of jail or fight a gag order, and that's cause for celebration.
Tech companies often claim to take privacy seriously, even as many expand surveillance-driven business models. At this point, who poses the greater risk to personal privacy—government agencies or corporations?
One thing that I've learned at EFF is that there isn't a bright line between those two. The NSA spying that we talk about was the NSA using the telecommunications, internet, and communications companies to spy on us.
We're seeing a lot more skepticism when it comes to company promises that they "take our privacy seriously." The thing that jumps to mind right now is that the largest jury verdict in history just happened here in San Francisco, on a privacy issue involving Google. That kind of tells you that people are pretty fed up, and they really don't think the companies have their best interests at heart. And that wasn't always true.
The companies have also moved against our privacy over time. I like to remind people that Facebook launched itself as the privacy-protective social network, back when that actually was true. They then dismantled that protection over time.
There’s no question that we need a comprehensive national privacy law that has teeth and empowers people to protect their own privacy through things like a private right of action. That doesn't feel very close right now. What we are seeing is some state laws that are coming close, and it's getting easier and easier to get them passed, because it's very popular to protect people's privacy.
With Congress perpetually gridlocked on this issue, EFF has been pushing harder at the state and local level. Has that shift produced meaningful results, or are you simply filling a gap until federal action becomes possible?
We made an assessment a few years ago that, on the federal level, Congress is broken for reasons that are much bigger than digital rights issues. The digital rights community is still pretty good at stopping bad things federally—not all the time, as FOSTA/SESTA showed, but we did help stop the recent attempt to block any state AI legislation—yet it's just really hard to build good things in Congress. So yeah, we decided that we would dig in and develop our expertise in the states, and we started with California because that's where we are and that's also where so much of the tech industry is. If you get a good law passed in California, it can have a wider impact than in some other places.
We've been pretty successful at getting things passed and stopping a lot of bad ideas, most recently around AI. There are a couple of things pending in the California legislature now that are looking pretty good, and I hope they pass, including the "1337" bill.
Also, mass surveillance is now happening at the local level, which it really wasn't even a few years ago. We've partnered with local groups to try to create local control and more accountability. For instance, in the past year we've supported groups in Austin and in Denver. The statewide work is great, but we're also trying to really partner with local groups on a community level, because this kind of mass surveillance is coming to them very fast now and they're asking us, "What do we do about these automated license plate readers? What do we do about mass spying? What do we do about the Ring cameras and the impacts they have on communities when police have access to them?"
Many of these issues end up being hyperlocal, and we try to be strategic. We’re looking to protect people wherever we can.
The EFF has served as one of the biggest bridges between technologists, activists, and lawyers in the US. What has that model revealed about coalition-building that outsiders often miss?
Well, we have this kind of coalition building inside EFF because we're made up of lawyers, activists, and technologists. For most of the issues we work on, whether it's protecting the children or street-level surveillance or the NSA, we have working groups internally that have representatives from each of the teams. I think that's really been our superpower. We try to bridge it out in the coalition work we do with other groups as well.
We hired the very first staff technologist at EFF, and now we have a whole team of them. I think 12 or 13 on staff. We were really the first organization that had real tech expertise internally to inform what we're doing. And that helps us in a couple of ways. One is that it's sacrosanct at EFF that when we weigh in on a technology’s impact on people's rights, we are right about how the technology works. That’s the chief job of the “public interest technologists” on the staff, which we now call the PIT crew.
I also think that gives us a lot of power when we show up in DC or in the European Parliament or in state legislatures. The people we talk to, whether they agree with us or not, know that when we're telling them how the tech works, we're telling them straight and not misleading them.
The same is true in the courts. Especially in the early days, we wrote a lot of briefs explaining, for instance, how peer-to-peer file sharing actually works. We always have a position we're advocating for, but we also spend time explaining the technology because we want the courts to be educated enough to rule correctly.
That's true in the legislature, and it's true with the public. I think having the PIT team in-house—along with the many computer scientists and academics who take the time to talk to us—has made sure we're always right on the technology.
Being right about how the tech works also means we're trusted by people who work in tech, who can easily dismiss criticisms that are based on a misunderstanding of how the technology works. We try never to be those people. We want to come from a place of deep knowledge.
The other side of that is making sure the technologists really do understand how the law works. The bridging goes that way too. For example, we want people who are building technologies that might face law enforcement requests for information to understand the difference between a warrant, a court order, and a subpoena, and why that would matter in terms of people's rights. There are many more people who are cross-trained now between law and tech than when we started in the 1990s, but having both sets of expertise weigh in on issues really does make us stronger. And then of course, our activism team not only moves the conversation forward with the public, but also makes sure that neither the techs nor the lawyers bore everybody with how we explain stuff.
Courts and lawmakers regularly rely on clumsy or misleading metaphors to describe technology. I’ve seen a prosecutor compare someone who posted leaked data online to a person stealing an already stolen car. Do you have your own favorite example of this?
That's hilarious.
The first thing that comes to mind is not a tech metaphor, it's “intellectual property,” because it sets the whole thing up as if there's a scarcity issue when there is not. If somebody comes and steals your cow, you don't have the cow anymore. If somebody comes and makes a copy of your book, you still have the book. Thomas Jefferson said: “He who receives an idea from me, receives instruction himself without lessening mine; as he who lights his taper at mine, receives light without darkening me.” That's a better metaphor for how the law should work in the digital age.
I'm pretty proud that EFF recognized how important getting copyright and patents and trademarks right would be for the health of the open internet. And one of the things that drives me crazy are these “stealing” and “property” metaphors.
End-to-end encryption remains a flash point. Where do you see the most immediate threats today—and how concerned are you that backdoors could be introduced without public scrutiny?
It's a tremendously important issue. It's hard enough in ordinary times to get companies to install end-to-end encryption in the tools they're offering us, because it can interfere with the surveillance business model. But I think even people who weren't worried about corporate surveillance should be more worried now, with the government directly taking stakes in some of these companies.
I'm reminded of the things we learned from the Snowden revelations, specifically the efforts the NSA went to, even on the chip level, to weaken security in order to try to ensure that they would always have access to everything people said or did using technology.
Anyone who thinks about it for 10 seconds understands you can't build a backdoor into systems that only good guys can use. Anytime you make a system insecure so that good guys have access, whether you call it “lawful access” or anything else, bad guys are going to use it.
And speaking of metaphors, I often say that the government uses the digital space as a bit of a smoke screen to hide what it's actually doing to our security when it attacks encryption. If the cops showed up at your front door and said, "Look, we've got a problem because you might be a thief, so we want you to leave your back door open so we can come and check that you're not a thief the next time there's a break-in somewhere," most people would say, "That's crazy, and it will make me less safe, not only from you but from bad guys too." Yet that's what we see over and over and over again in these encryption conversations—law enforcement claiming that we need to weaken security and claiming that this will make us safer. It won't.
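Her point about backdoors can be shown in miniature. What follows is a toy sketch, not any real system's design: it uses the third-party Python "cryptography" package, and the escrow key is hypothetical. Once a per-message key is also wrapped for an escrow holder, whoever obtains that single escrow key can read every message.

```python
# Toy sketch of "lawful access": wrapping each message key for an escrow
# holder means one leaked escrow key exposes all traffic. Illustrative only;
# requires the third-party package: pip install cryptography
from cryptography.fernet import Fernet

alice_key = Fernet.generate_key()    # the user's own key
escrow_key = Fernet.generate_key()   # hypothetical "good guys only" key

# Each message gets its own key; the ciphertext is safe on its own.
message_key = Fernet.generate_key()
ciphertext = Fernet(message_key).encrypt(b"meet at the usual place")

# The backdoor: the message key is wrapped twice, once for Alice and
# once for the escrow holder.
wrapped_for_alice = Fernet(alice_key).encrypt(message_key)
wrapped_for_escrow = Fernet(escrow_key).encrypt(message_key)

# Anyone who steals or leaks the single escrow key can now unwrap the
# message key for every message in the system, not just this one.
recovered_key = Fernet(escrow_key).decrypt(wrapped_for_escrow)
print(Fernet(recovered_key).decrypt(ciphertext))  # b'meet at the usual place'
```

Nothing in the math distinguishes a police officer holding the escrow key from a criminal holding a copy of it, which is the point Cohn keeps making.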
One thing that worries me the most right now is that we may not even be having these conversations in public anymore. With the tech companies being so closely aligned with the Trump administration right now, those backdoors may happen without us having the chance to have the public conversation about whether they're a good idea. That wouldn't be without precedent for the NSA.
As artificial intelligence systems become more embedded in daily life, what risks do you see repeating from past tech booms, and what threats demand immediate attention?
There are two things that I've seen before. One is this crazy hype, as if AI is going to solve every problem. This idea is not true. But neither is the reverse: the idea that AI will make everything suck in every single dimension.
We have to be sensible about where it's useful and where it's not. Often when people say AI, they're talking about large language models, but there are many areas where machine learning or AI might be used. It's important that we understand better how these things fail, because they don't fail in the same way people fail. And if we don't know the actual ground truth of a situation, we cannot even evaluate whether the system is failing or lying to us. That's my worry about "predictive policing" and other attempts to use AI systems to predict future human behavior, especially in the context of law enforcement or government decisions like whether to remove a child from the home.
I don't feel like we have a very good intuition about how and when AI fails, which means we can’t build the kind of guardrails we need.
We know, for example, that human judges get tired in the afternoons and their sentencing decisions change as a result, in ways that are pretty predictable and have been tracked. AI doesn't have that problem; it never gets tired. But if an AI is deciding whether you get bail or not, it's going to have other vulnerabilities. We know some of them: based on what it's trained on, it will have biases that mean people won't get judged on their individual merits. We know quite a bit about that at this point, but that's just one example. We still don't know all the ways in which these systems will fail, so it's dangerous to rely on them for decisions that matter.
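That training-data point can be shown in miniature, too. Here is an illustrative toy model in Python, using numpy and scikit-learn on made-up data (it is not any real bail or sentencing system): a classifier trained on historical decisions that penalized one group keeps penalizing that group, even for two candidates who are otherwise identical.

```python
# Toy sketch: a model trained on biased historical decisions reproduces the
# bias. Made-up data; requires numpy and scikit-learn.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
merit = rng.normal(size=n)          # what we wish decisions were based on
group = rng.integers(0, 2, size=n)  # an attribute that should be irrelevant

# Historical outcomes: partly merit, partly a built-in penalty on group 1.
past_decision = (merit - 1.0 * group + rng.normal(scale=0.5, size=n)) > 0

# Trained on those outcomes, the model learns the penalty as if it were signal.
X = np.column_stack([merit, group])
model = LogisticRegression().fit(X, past_decision)

# Two candidates with identical merit, differing only by group:
p0 = model.predict_proba([[0.0, 0]])[0, 1]
p1 = model.predict_proba([[0.0, 1]])[0, 1]
print(f"group 0: {p0:.2f}, group 1: {p1:.2f}")  # group 1 scores far lower
```

The model never gets tired, as Cohn says, but it faithfully reproduces the bias baked into its training data, and without the ground truth you cannot tell its output apart from a fair judgment.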
The other big thing with AI is that the court decisions are starting to come out, and the first set has been based in copyright, which is not a good legal vehicle for addressing problems with AI. It's unfortunate that copyright came first, because I think it has clouded things. Copyright law is not designed to be applied to the context of training up AIs—in general, that training process should be fair use. And overall, copyright is too blunt an instrument, with statutory damages and other features that are not well suited to the problems that AI may cause.
For younger advocates stepping into this space—where surveillance is pervasive and privacy protections are thin—what’s the case for staying in the fight?
Organizations like EFF were made for this moment. If everything was magically going well and governments were great and benevolent and companies were doing the right thing, you wouldn't need a digital civil society or an organization filled with fighty lawyers, technologists, and activists. We need EFF, but more than that, we need a robust digital civil society to try to hold the line today and to try to make things better tomorrow. With the clear slide into tyranny, and tech so central to that slide, we need digital society more than ever.
So the first thing I would say to somebody if they were thinking about becoming a digital rights advocate is this: We need you. There are so few of us compared to the size of the problems and the complexity of the problems that we're trying to address. I had the luxury of coming up in a time when a lot of people didn't understand technology and it wasn't in very many people's hands. So while the fights were important, they were more about laying the groundwork for the future. We didn’t win all of them, though we did win a lot and helped move things forward for many others.
But regardless, we're in the future now. There’s a lot more money at stake, a lot more government involvement, and a lot more public attention and reliance on digital tools. So in some ways it's harder, but it's also more important to try to build a better digital future.
I think it's easy to look at the stakes and what we're up against, and be despondent. But we can either lose now, or fight and maybe lose later. I’m on the side of fighting and standing up for what we believe in. It's not magically going to go well, and it never was magically going to go well. If the founders of EFF had thought that, they wouldn’t have created the organization and hired lawyers like me.
It’s clearer than ever that people have to stand up for freedom, justice, and innovation for all the people of the world, which is EFF's mission.
But another thing I would say is that it’s great fun to be on the righteous side. You get to work with really good, smart, and hilarious people. I often say the people working for a better world throw better parties, and that’s been my experience for 25 years and counting.
Obviously the last question is: What’s next?
I don't know yet. I really wanted to set EFF up for success and make sure it could fly without me, so I didn't spend a lot of energy figuring out my next steps. I would say a couple of things. While it's an honor to be EFF’s executive director, it's never really been my dream job. I'm more of a warrior and a fighter, so I'm looking for a way to get more directly into the fight than I could do from this role. It’s wonderful, but when you're responsible for the care and feeding of 125 people, you spend a lot of time on HR and budgets and other things that are really, really important but are not the part that feeds me.
I'm looking to get back into the fight, and I'm not sure exactly what that's going to look like. I am still passionate about fighting unnecessary government surveillance, mass surveillance, and secrecy. I’m still interested in fighting when national security arguments are used as cover for undermining people's privacy and free speech and short-circuiting due process.
I believe that, after 25 years, it's time for other people to get a chance to lead EFF. But I'm not done yet. I want to stay in the fight.