  • The UN Cybercrime Draft Convention is a Blank Check for Surveillance Abuses

    This is the second post in a series highlighting the problems and flaws in the proposed UN Cybercrime Convention. Check out our detailed analysis on the criminalization of security research activities under the proposed convention.

    The United Nations Ad Hoc Committee is just weeks away from finalizing an overly broad Cybercrime Draft Convention. This draft would normalize unchecked domestic surveillance and rampant government overreach, enabling serious human rights abuses around the world.

    The latest draft of the convention—originally spearheaded by Russia but since then the subject of two and a half years of negotiations—still authorizes broad surveillance powers without robust safeguards and fails to spell out data protection principles essential to prevent government abuse of power.

    As the August 9 finalization date approaches, Member States have a last chance to address the convention’s lack of safeguards: prior judicial authorization, user notification, and independent oversight, along with data protection principles such as transparency, data minimization, and purpose limitation. If left as is, the convention can and will be wielded as a tool for systemic rights violations.

    Countries committed to human rights and the rule of law must unite to demand stronger data protection and human rights safeguards, or reject the treaty altogether. These domestic surveillance powers are especially consequential because they underpin the treaty’s international surveillance cooperation.

    EFF’s Advocacy for Human Rights Safeguards

    EFF has consistently advocated for human rights safeguards to be a baseline for both the criminal procedural measures and international cooperation chapters. The collection and use of digital evidence can implicate human rights, including privacy, free expression, fair trial, and data protection. Strong safeguards are essential to prevent government abuse.

    Regrettably, many states already fall short in this regard. In some cases, surveillance laws have been used to justify overly broad practices that disproportionately target individuals or groups based on their political views—particularly ethnic and religious groups. This leads to the suppression of free expression and association, the silencing of dissenting voices, and discriminatory practices. Examples of these abuses include covert surveillance of internet activity without a warrant, using technology to track individuals in public, and monitoring private communications without legal authorization, oversight, or safeguards.

    The Special Rapporteur on the rights to freedom of peaceful assembly and of association has already sounded the alarm about the dangers of current surveillance laws, urging states to revise and amend these laws to comply with international human rights norms and standards governing the rights to privacy, free expression, peaceful assembly, and freedom of association. The UN Cybercrime Convention must be radically amended to avoid entrenching and expanding these existing abuses globally. If not amended, it must be rejected outright.

    How the Convention Fails to Protect Human Rights in Domestic Surveillance

    The idea that checks and balances are essential to avoid abuse of power is a basic “Government 101” concept. Yet throughout the negotiation process, Russia and its allies have sought to chip away at the already-weakened human rights safeguards and conditions outlined in Article 24 of the proposed Convention. 

    Article 24 as currently drafted requires that every country that agrees to this convention must ensure that when it creates, uses, or applies the surveillance powers and procedures described in the domestic procedural measures, it does so under its own laws. These laws must protect human rights and comply with international human rights law. The principle of proportionality must be respected, meaning any surveillance measures should be appropriate and not excessive in relation to the legitimate aim pursued.

    Why Article 24 Falls Short

    1. The Critical Missing Principles

    While incorporation of the principle of proportionality in Article 24(1) is commendable, the article still fails to explicitly mention the principles of legality, necessity, and non-discrimination, which carry equal weight under human rights law where surveillance is concerned. A primer:

    • The principle of legality requires that restrictions on human rights including the right to privacy be authorized by laws that are clear, publicized, precise, and predictable, ensuring individuals understand what conduct might lead to restrictions on their human rights.
    • The principles of necessity and proportionality ensure that any interference with human rights is demonstrably necessary to achieve a legitimate aim and limited to measures that are proportionate to that aim.
    • The principle of non-discrimination requires that laws, policies, and human rights obligations be applied equally and fairly to all individuals, without any form of discrimination based on race, color, sex, language, religion, political or other opinion, national or social origin, property, birth, or other status, including in the application of surveillance measures.

    Without including all these principles, the safeguards are incomplete and inadequate, increasing the risk of misuse and abuse of surveillance powers.

    2. Inadequate Specific Safeguards 

    Article 24(2) requires countries to include, where “appropriate,” specific safeguards like:

    • judicial or independent review, meaning surveillance actions must be reviewed or authorized by a judge or an independent regulator.
    • the right to an effective remedy, meaning people must have ways to challenge or seek remedy if their rights are violated.
    • justification and limits, meaning there must be clear reasons for using surveillance and limits on how much surveillance can be done and for how long.

    Article 24(2) introduces three problems:

    2.1 The Pitfalls of Making Safeguards Dependent on Domestic Law

    Although these safeguards are mentioned, making them contingent on domestic law can vastly weaken their effectiveness, as national laws vary significantly and many of them won’t provide adequate protections. 

    2.2 The Risk of Ambiguous Terms Allowing Cherry-Picked Safeguards

    The use of vague terms like “as appropriate” in describing how safeguards will apply to individual procedural powers allows for varying interpretations, potentially leading to weaker protections for certain types of data in practice. For example, many states provide minimal or no safeguards for accessing subscriber data or traffic data despite the intrusiveness of resulting surveillance practices. These powers have been used to identify anonymous online activity, to locate and track people, and to map people’s contacts. By granting states broad discretion to decide which safeguards to apply to different surveillance powers, the convention fails to ensure the text will be implemented in accordance with human rights law. Without clear mandatory requirements, there is a real risk that essential protections will be inadequately applied or omitted altogether for certain specific powers, leaving vulnerable populations exposed to severe rights violations. Essentially, a country could just decide that some human rights safeguards are superfluous for a particular kind or method of surveillance, and dispense with them, opening the door for serious human rights abuses.

    2.3 Critical Safeguards Missing from Article 24(2)

    Prior judicial authorization, transparency, and user notification are critical to any effective and proportionate surveillance power, yet none of these requirements is included in Article 24(2).

    Prior judicial authorization means that before any surveillance action is taken, it must be approved by a judge. This ensures an independent assessment of the necessity and proportionality of the surveillance measure before it is implemented. Although Article 24 mentions judicial or other independent review, it lacks a requirement for prior judicial authorization. This is a significant omission that increases the risk of abuse and infringement on individuals’ rights. Judicial authorization acts as a critical check on the powers of law enforcement and intelligence agencies.

    Transparency involves making the existence and extent of surveillance measures known to the public; people must be fully informed of the laws and practices governing surveillance so that they can hold authorities accountable. Article 24 lacks explicit provisions for transparency, so surveillance measures could be conducted in secrecy, undermining public trust and preventing meaningful oversight. Transparency is essential for ensuring that surveillance powers are not misused and that individuals are aware of how their data might be collected and used.

    User notification means that individuals who are subjected to surveillance are informed about it, either at the time of the surveillance or afterward when it no longer jeopardizes the investigation. The absence of a user notification requirement in Article 24(2) deprives people of the opportunity to challenge the legality of the surveillance or seek remedies for any violations of their rights. User notification is a key component of protecting individuals’ rights to privacy and due process. It may be delayed, with appropriate justification, but it must still eventually occur and the convention must recognize this.

    Independent oversight involves monitoring by an independent body to ensure that surveillance measures comply with the law and respect human rights. This body can investigate abuses, provide accountability, and recommend corrective actions. While Article 24 mentions judicial or independent review, it does not establish a clear mechanism for ongoing independent oversight. Effective oversight requires a dedicated, impartial body with the authority to review surveillance activities continuously, investigate complaints, and enforce compliance. The lack of a robust oversight mechanism weakens the framework for protecting human rights and allows potential abuses to go unchecked.

    Conclusion

    While it’s somewhat reassuring that Article 24 acknowledges the binding nature of human rights law and its application to surveillance powers, it is utterly unacceptable how vague the article remains about what that actually means in practice. The “as appropriate” clause is a dangerous loophole, letting states implement intrusive powers with minimal limitations and no prior judicial authorization, only to then disingenuously claim this was “appropriate.” This is a blatant invitation for abuse. There’s nothing “appropriate” about this, and the convention must be unequivocally clear about that.

    This draft in its current form is an egregious betrayal of human rights and an open door to unchecked surveillance and systemic abuses. Unless these issues are rectified, Member States must recognize the severe flaws and reject this dangerous convention outright. The risks are too great, the protections too weak, and the potential for abuse too high. It’s long past time to stand firm and demand nothing less than a convention that genuinely safeguards human rights.

    Check out our detailed analysis on the criminalization of security research activities under the UN Cybercrime Convention. Stay tuned for our next post, where we’ll explore other critical areas affected by the convention, including its scope and human rights safeguards.

    https://www.eff.org/deeplinks/2024/06/un-cybercrime-draft-convention-blank-check-unchecked-surveillance-abuses


  • Culpability for this Ransomware Belongs to the NSA


    In all the coverage of the recent ransomware attack shutting down computer systems around the world, one point has been buried and obscured. The focus has been on precisely who spread this horrid thing, what damage it has done, what to do once you have it, and how to prevent it.

    All fascinating questions. But an equally, if not more, important question is: who created this weapon of mass computer destruction? What was its origin? How did it get released in the first place?

    And here, the answer is as sure as it is alarming. The culpability belongs to the National Security Agency. That’s right. The government that claims to be protecting us against cybercrime both built the exploit at the heart of this malware and failed to secure it from being stolen by malicious actors.



    ComputerWorld explains:

    The tools, which security researchers suspect came from the NSA, include an exploit codenamed EternalBlue that makes hijacking older Windows systems easy. It specifically targets the Server Message Block (SMB) protocol in Windows, which is used for file-sharing purposes…. The developer of Wanna Decryptor appears to have added the suspected NSA hacking tools to the ransomware’s code, said Matthew Hickey, the director of security provider Hacker House, in an email.

    ArsTechnica explains:

    A highly virulent new strain of self-replicating ransomware shut down computers all over the world, in part by appropriating a National Security Agency exploit that was publicly released last month by the mysterious group calling itself Shadow Brokers…. Another cause for concern: wcry copies a weapons-grade exploit codenamed Eternalblue that the NSA used for years to remotely commandeer computers running Microsoft Windows. Eternalblue, which works reliably against computers running Microsoft Windows XP through Windows Server 2012, was one of several potent exploits published in the most recent Shadow Brokers release in mid-April.
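    As the quotes above note, EternalBlue abused the SMB file-sharing service, which listens on TCP port 445, and WannaCry spread by probing that port on reachable machines. A first-line defensive check for administrators is simply to verify whether a host exposes SMB at all. The sketch below is our own illustration, not something from the article or a vulnerability test; the function name is hypothetical, and a reachable port only shows exposure, not vulnerability.

    ```python
    import socket


    def smb_port_open(host: str, port: int = 445, timeout: float = 2.0) -> bool:
        """Return True if a TCP connection to the given port succeeds.

        A reachable port 445 does not prove a host is vulnerable to
        EternalBlue; it only shows the SMB service is exposed on the network.
        """
        try:
            # create_connection resolves the host and attempts a TCP handshake.
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            # Connection refused, unreachable host, or timeout.
            return False
    ```

    Hosts that do expose SMB would still need to be checked for the MS17-010 patch (for example with Nmap’s `smb-vuln-ms17-010` script) or have SMBv1 disabled outright.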

    The New York Times says:

    The attacks on Friday appeared to be the first time a cyberweapon developed by the N.S.A., funded by American taxpayers and stolen by an adversary had been unleashed by cybercriminals against patients, hospitals, businesses, governments and ordinary citizens…. The United States has never confirmed that the tools posted by the Shadow Brokers belonged to the N.S.A. or other intelligence agencies, but former intelligence officials have said that the tools appeared to come from the N.S.A.’s “Tailored Access Operations” unit, which infiltrates foreign computer networks. (The unit has since been renamed.)

    The furious president of Microsoft weighed in:

    Starting first in the United Kingdom and Spain, the malicious “WannaCrypt” software quickly spread globally, blocking customers from their data unless they paid a ransom using Bitcoin. The WannaCrypt exploits used in the attack were drawn from the exploits stolen from the National Security Agency, or NSA, in the United States…. The governments of the world should treat this attack as a wake-up call. They need to take a different approach and adhere in cyberspace to the same rules applied to weapons in the physical world. We need governments to consider the damage to civilians that comes from hoarding these vulnerabilities and the use of these exploits. This is one reason we called in February for a new “Digital Geneva Convention” to govern these issues, including a new requirement for governments to report vulnerabilities to vendors, rather than stockpile, sell, or exploit them.

    Cyberscoop interviewed several experts:

    “In my view, there isn’t a policy problem, it’s an operational problem,” [former White House National Security Council cyber staffer Rob] Knake, now with the Council on Foreign Relations, told CyberScoop. “NSA should not have lost those tools. No way for policymakers to account for that problem other than to move quickly to get info on the vulnerabilities out, which they apparently did. Loss of the tools is an operational problem. The response was appropriate and timely.”

    This is obviously terrible for the United States in terms of international relations. It is the equivalent of having built a weapon of mass destruction and then failed to secure it from access by criminals. Yes, the people who use such weapons are bad actors, but the bureaucracy that made the weapon and allowed its release in the first place bears primary responsibility.

    And while the NSA’s responsibility is certainly being downplayed in the American mainstream media – NPR reported it, but quietly and inconspicuously – you can bet it is all the talk in the 100 countries that are affected.

    Yes, it would be very sweet if users around the world were forgiving and understanding. Everyone makes mistakes. Sadly, that is not the case. The NSA developed this exploit to use against the network systems of enemy countries and failed to secure it. The head of Microsoft is correct that this really is an outrage, and that it cries out for a fix.

    Had a private company been responsible, its stock would now sit at nearly zero and the feds would be all over it, holding it responsible for the resulting cybercrime. Probably there would be jail time.

    What will be the fallout from the NSA screw-up? Watch for it: surely a bigger budget.


    Jeffrey A. Tucker

    Jeffrey Tucker is Director of Content for the Foundation for Economic Education. He is also Chief Liberty Officer and founder of Liberty.me, Distinguished Honorary Member of Mises Brazil, research fellow at the Acton Institute, policy adviser of the Heartland Institute, founder of the CryptoCurrency Conference, member of the editorial board of the Molinari Review, an advisor to the blockchain application builder Factom, and author of five books. He has written 150 introductions to books and many thousands of articles appearing in the scholarly and popular press.

    This article was originally published on FEE.org. Read the original article.