The pandemic has plunged the world into a privacy nightmare
Privacy is hard to come by in today's connected world, and the problem has only been exacerbated by the pandemic, which has legitimized the collection of vast amounts of personal data. Governments have already started using Covid data in other contexts. And beyond medical data, the debate over the use of facial recognition and biometrics by state agencies continues. To learn more, TechRadar Pro spoke with Cindy Cohn, executive director of the Electronic Frontier Foundation, an organization that opposes censorship and surveillance.

What are the main issues the EFF is currently working on?

We are working to create competition in technology markets, particularly through tools such as interoperability. But we also remain concerned about the telecommunications giants and their impact on society; we believe a lack of competition is a key reason so many people don't have good broadband options today. We continue to work to support free expression around the world, both by fighting bad legislative proposals like Section 230 "reform" and by helping creators on YouTube and elsewhere who have been censored by overly broad content filters. And we defend your security and privacy against so many attacks from so many directions that it's hard to name just one or two, but our work tracking the surveillance technologies acquired by US law enforcement agencies through our Atlas of Surveillance project is particularly exciting right now.

The EFF has been tackling censorship and surveillance issues for a long time. How have these issues evolved over the years, and what do they mean in the era of the remote workplace?

The biggest change is that now everyone understands what we have been talking about for a long time. Previously, we had to convince people that the Internet would mean we would all have the opportunity to make our voices heard, or that people's work and culture would come to depend heavily on digital networks. Not anymore. Another positive change is that people are more skeptical of technology: they look for privacy problems, and they care about where their data is going and who is using it. Now our next step is to empower them. Governments and companies have convinced many that the situation is dire; our job is to convince them that a better future is possible and that we can make it happen.

Surveillance law has long rested on the presumption that we keep our most private information secret: that we keep our key documents in our homes, and that our key relationships are not tracked and often not even known to anyone outside our circle of family and friends (and maybe not even there). Today, our most sensitive information sits in the hands of not one but many third parties, with Facebook, Amazon, Google and others tracking, knowing, or inferring (via machine learning or AI) all of our associations. This has implications for our privacy rights against both governments and private companies.

As for censorship, we started out worrying about governments, because they really were the main threats to online speech. But now we are all seeing that the big tech giants are the main decision makers about whether you can speak online. It's not a constitutional issue (in the legal sense of the term), but from a practical standpoint we need to get to a place where we don't have just a few companies controlling what is said online, and where we have the best tools, digital or otherwise, to protect ourselves from hate, bullying, and other harmful activity online. I think it's a mistake to expect the tech giants to solve these problems in a way that makes everyone (or maybe even anyone) happy, so we should focus on ways to promote competition and interoperability with their systems, to give people better options.

Benjamin Franklin said that "every problem is an opportunity in disguise," and this seems true of the way governments and private entities have used Covid-19 to exert more control. Wouldn't you agree?

Yes, exactly. We have been very concerned about the possibility of more widespread surveillance, as well as the use of that surveillance to limit or control our access to goods, services and benefits. We know that crisis responses often continue long after the crisis has passed and are then used for other purposes. We've already seen the government in Singapore start using Covid tracking data for other purposes, and I wouldn't be surprised if this happens in some places in the US as well. We are also concerned that these measures are regressive: people with resources can easily get things like up-to-date, correct Covid passports, while other people cannot. And since those are often the communities most at risk of contracting the virus, the impact ends up being exactly backwards. The people who need protection the most have it the least.

Looking ahead, what are some of the aspects of digital freedom that EFF will be working on in the near future?

We have issues that are still relevant today: ensuring you have rights and a voice when you go online, advocating for real security, including strong end-user-controlled encryption, working to ensure the Fourth Amendment protects us in the digital age, and making sure everyone has true broadband access. In 2021, we have identified three "challenges" to which we intend to pay special attention:

Policing
This category includes facial recognition and other biometric data collection, as well as warrantless data collection by federal agencies, such as invasive searches of devices and social media by Customs and Border Protection.

Disciplinary technologies
Disciplinary technologies are sold to companies, schools, and individuals for the ostensible purpose of monitoring performance, confirming policy compliance, or ensuring safety. In reality, they are non-consensual violations of a person's autonomy and privacy that are, at best, loosely related to the system's stated purpose. The clearest examples are student monitoring software (especially proctoring tools) and employee monitoring software. Consumer stalkerware, child-monitoring apps and other spyware are closely related and often overlap, used to monitor and control intimate partners and household members.

Strengthening policies and structures that support online discourse and promote user expression
This includes promoting and defending intermediary models that protect users' rights and interests, promoting a human rights framework for preserving speech through online services, and promoting competitive compatibility, which puts control in the hands of users.

EFF has been a strong advocate for decentralized technologies and protocols. Can you suggest ways that ordinary people can protect themselves from surveillance and control?

There is a growing set of decentralized technologies available that are worth checking out. They offer a glimpse of a better approach, and the more people use and contribute to them, the better they will get. For almost every service provided by a tech giant, there is a community building a decentralized version, and I look forward to the day when the next tool or service that makes your life better comes from one of those communities. That day is drawing near, I believe. But we need to clear more space for this world, and some of that will require legal and policy changes, not just individual choices by users and developers. We need to stop interoperability from being blocked by laws like the CFAA and the DMCA, and by the click-through agreements nobody reads that prevent reverse engineering and the other steps needed to build new tools that can interoperate with the old ones. And we may need some affirmative requirements that companies allow, and even facilitate, interoperability.

In the meantime, there are things ordinary people can do. EFF does not endorse any specific tool or service, but interested readers should look into privacy tools such as Signal and Tor, tracker blockers such as EFF's Privacy Badger, and privacy-protective browsers and search services such as those offered by Firefox and DuckDuckGo. There are others, of course. And use your privacy settings, even on services like Google and Facebook. They're not great, but they also send a signal that you care. I often push back against business executives who argue that people just don't care about privacy and security: when I ask how many of them have adjusted their own privacy settings, the response makes clear that people do care; we just need better options.