
Let’s not trample upon human rights in the name of “cyber”


This year, major data breaches at corporations and within government have spurred officials to do something — anything at all — to fix online security. We’ve seen a raft of so-called cybersecurity legislation introduced around the world that creates more problems than it solves, trampling upon privacy and human rights in the name of “cyber.” Access has been involved in a number of these fights.

For example, in the U.S., we’ve been fighting the Cybersecurity Information Sharing Act (CISA), a dangerous bill that reduces companies’ incentives to secure the private information they hold. This vaguely worded legislation would give corporations broad liability protections when they share that information, undermining consumers’ privacy. The information can be delivered to military and intelligence agencies, including the U.S. National Security Agency. This would create massive repositories of personal information for surveillance, yet it does not address many of the real problems with online security, such as existing software vulnerabilities. Fortunately, as of this writing, we are beating back CISA. People who want real security joined the campaign we led with our partners Fight for the Future and EFF, generating more than 6 million faxes to lawmakers. The U.S. Senate went home for summer break without passing CISA, so it will not be up for a vote until the fall — if ever.

In China, lawmakers have advanced a “cybersecurity” law that ostensibly aims to clarify the laws governing the internet, but in fact increases government control of the internet and raises a number of human rights concerns. Under article 50, the law sanctions government shutdowns of the internet, a practice that can precede and facilitate egregious human rights violations, as I’ve previously argued. Other articles in the legislation require tech firms and social media companies to track the “real” identity of users. That is in addition to the Great Firewall of China, a sophisticated monitoring and blocking regime that stifles free expression — one that was just bolstered by a new rule requiring police to be embedded within technology companies.

The United Kingdom has also promoted several policies over the last year that would drastically impact human rights. Portions of the Data Retention and Investigatory Powers Act (DRIPA), passed as emergency legislation in 2014, were struck down by the U.K. High Court last month in a case brought by the human rights organization Liberty. Yet the ruling did not properly follow a decision by the Court of Justice of the European Union, which explained that both the retention of and access to retained data interfere with the rights to privacy and data protection. Prime Minister David Cameron has called for banning encryption technologies, which would not only harm the privacy rights of U.K. citizens but also weaken their security, and would threaten the rights and security of people around the world who rely on encryption to stay safe.

We’ve seen similar laws from countries as far afield as Switzerland, Kenya, and Tunisia. The common thread among these pieces of legislation is that governments have not sought input from a wide variety of stakeholders, a crucial step for developing considered laws and policies. By listening only to groups that enjoy regular access to the halls of power — typically business or other branches of government — lawmakers tend to pass laws that ignore diversity and trample on human rights. In their rush to do something, they make things worse. For example, CISA grants companies the power to hack back against perceived threats, even if innocent users may be harmed in the process. Bad laws can put marginalized minorities or dissidents at risk of unconscionable fines, imprisonment, or death. They can treat our personal data — about our health, families, and acquaintances — as evidence to be used against us.

It’s not easy to gather input from a variety of stakeholders, but when it’s done properly, we can craft good cyber legislation or decide to keep laws that are already effective. Mozilla recently reached out to cyber experts to identify what they believe are the most pressing issues in the field. Mozilla used a partially anonymized method to eliminate bias, and the findings showed that the experts tend to agree that confidentiality, integrity, and availability of information are critical aspects of cybersecurity. Many emphasized the importance of automatic updates to patch bugs, for example. Some experts agreed that sharing security information is useful, but they also called for strong privacy laws. Importantly, the experts came from diverse backgrounds, including civil liberties, academia, security research, technology, and government/military. These approaches require incorporating human rights, which are interrelated and indivisible. You can read the paper here [PDF].

We’ve spent a lot of time thinking about cybersecurity at Access. Our 24-hour digital security helpline serves users at risk around the world, and our Digital Security Action Plan, a core part of our Encrypt All the Things Campaign, provides concrete steps companies can take to bolster corporate security. Many others have come up with effective solutions that also respect human rights, which we celebrated at our first Crypto Summit in Washington, D.C. last month. We believe these conversations help address government surveillance and security at the same time.

In the headlong rush to respond to security breaches, governments would prefer to act rather than listen to stakeholders. But this mistakenly conflates taking action with finding an enduring solution. To achieve real cybersecurity, policy makers need to take a user-first approach and ensure that human rights remain at the very center of any debate.

photo credit: Surian Soosay