• Tag Archives: censorship
  • States Introduce Dubious Anti-Pornography Legislation to Ransom the Internet

    More than a dozen state legislatures are considering a bill called the “Human Trafficking Prevention Act,” which has nothing to do with human trafficking and everything to do with one man’s crusade against pornography at the expense of free speech.

    At its heart, the model bill would require device manufacturers to pre-install “obscenity” filters on devices like cell phones, tablets, and computers. Consumers would be forced to pony up $20 per device in order to surf the Internet without state censorship. The legislation is not only technologically unworkable; it also violates the First Amendment and significantly burdens consumers and businesses.

    Perhaps more shocking is the bill’s provenance. The driving force behind the legislation is a man named Mark Sevier, who has been using the alias “Chris Severe” to contact legislators. According to the Daily Beast, Sevier is a disbarred attorney who has sued major tech companies, blaming them for his pornography addiction, and sued states for the right to marry his laptop.  Reporters Ben Collins and Brandy Zadrozny uncovered a lengthy legal history for Sevier, including an open arrest warrant and stalking convictions, as well as evidence that Sevier misrepresented his own experience working with anti-trafficking non-profits.

    The bill has been introduced in some form in Alabama, Florida, Georgia, Indiana, Louisiana, New Jersey, North Dakota, Oklahoma, South Carolina, Texas, West Virginia, and Wyoming (list here). We recommend that any legislator who has to consider this bill read the Daily Beast’s investigation.

    But that’s not why they should vote against the Human Trafficking Prevention Act. They should kill this legislation because it’s just plain awful policy. Obviously, each version of the legislation varies, but here is the general gist.



    Read EFF’s opposition letter against H.3003, South Carolina’s iteration of the Human Trafficking Prevention Act. 

    Pre-installed Filters

    Manufacturers of Internet-connected devices would have to pre-install filters to block pornography, including “revenge porn.” Companies would also have to ensure that all child pornography, “revenge pornography,” and “any hub that facilitates prostitution” are rendered inaccessible. Most iterations of the bill require this filtering technology to be turned on and locked in the on position by default.

    This is terrible for consumer choice because it forces people to purchase a software product they don’t necessarily want. It’s also terrible for free speech because it restricts what you can see. Because of the risk of legal liability, companies are more likely to over-censor, blocking content by default rather than giving websites the benefit of the doubt. The proscriptions are also technologically unworkable: no algorithm can reliably determine whether an item of pornography is “revenge” or consensual, or whether a site is a “hub” for prostitution.

    To be clear, unlocking such filters would not just be about accessing pornography.  A user could be seeking to improve the performance of their computer by deleting unnecessary software.  A parent may want to install premium child safety software, which may not play well with the default software. And, of course, many users will simply want to freely surf the Internet without repeatedly being denied access to sites mistakenly swept up in the censorship net.
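
    The overblocking problem is easy to demonstrate. Below is a deliberately naive Python sketch of the kind of crude keyword blocklist that a mandated, one-size-fits-all filter tends to reduce to; the blocked terms and page titles are hypothetical, chosen only to illustrate how library, health, and government pages get swept up alongside the intended targets.

    ```python
    # Toy illustration only: a naive keyword blocklist, NOT any vendor's
    # actual filtering product. All page titles below are hypothetical.
    BLOCKED_TERMS = {"porn", "sex", "escort", "xxx"}

    def is_blocked(page_title: str) -> bool:
        """Return True if any blocked term appears anywhere in the title."""
        title = page_title.lower()
        return any(term in title for term in BLOCKED_TERMS)

    pages = [
        "XXX Adult Videos",                        # the intended target
        "Middlesex County Public Library",         # innocent: "sex" inside "Middlesex"
        "Sex Education Resources for Teens",       # innocent: health information
        "Essex DMV: Escort Vehicle Requirements",  # innocent: traffic regulation
        "Gardening Tips for Spring",               # unrelated, passes through
    ]

    for page in pages:
        print(("BLOCKED" if is_blocked(page) else "allowed") + "  " + page)
    ```

    Four of the five pages are blocked, and three of those four are plainly innocent. Real filters are more sophisticated than this toy, but the underlying trade-off is the same: the more aggressively a filter is tuned to avoid missing forbidden content, the more lawful speech it silences by mistake.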

    A Censorship Tax

    The model bills would require consumers to pay a $20 fee to unlock each of their devices to exercise their First Amendment rights to look at legal content. Consumers could end up paying a small fortune: a household with a router, two smartphones, a tablet, and two computers, for example, would owe $120 just to browse the web unfiltered.

    Data Collection

    Anyone who wants to unlock the filters on their devices would have to put their request in writing. Then they’d be required to show ID, be subjected to a “written warning regarding the potential dangers” of removing the obscenity filter, and then would have to sign a form acknowledging they were shown that warning. That means stores would be maintaining private records on everyone who wanted their “Human Trafficking” filters removed.

    The Censorship Machine

    The bill would force the companies we rely upon to ensure open access to the Internet to create a massive censorship apparatus that is easily abused.

    Under the bill, tech companies would be required to operate call centers or online reporting centers to monitor complaints that a particular site isn’t included in the filter or complaints that a site isn’t being properly filtered. Not only that, but the bill specifically says they must “ensure that all child pornography and revenge pornography is inaccessible on the product,” putting immense pressure on companies to aggressively and preemptively block websites, lest a single illegal or forbidden image slip past their filters and expose them to legal liability. Social media sites would only be immune if they also create a reporting center and “remain reasonably proactive in removing reported obscene content.”

    It’s unfortunate that the Human Trafficking Prevention Act has gained traction in so many states, but we’re pleased to see that some, such as Wyoming and North Dakota, have already rejected it. Legislators should do the right thing: uphold the Constitution, protect consumers, and not use the problem of human trafficking as an excuse to promote this individual’s agenda against pornography.

    Source: States Introduce Dubious Anti-Pornography Legislation to Ransom the Internet | Electronic Frontier Foundation


  • Government Pressure Shutters Backpage’s Adult Services Section

    Succumbing to years of government pressure, the online classified ads website Backpage.com has shut down its adult services section. Just like Craigslist before it, Backpage faced the difficult choice of censoring an entire forum for online speech rather than continuing to endure the costly onslaught of state and federal government efforts seeking to hold it responsible for the illegal activity of some of its users.

    The announcement came on the eve of a hearing by the Senate Permanent Subcommittee on Investigations (PSI). The hearing was the backdrop for the release of a committee report [PDF] alleging that Backpage knew that its website was being used to post ads for illegal prostitution and child sex trafficking, and that it directly edited such ads to make their illegality less conspicuous or coached posters on how to do so themselves.

    While acknowledging the horrific nature of sex trafficking, EFF has participated in several cases to remind courts about the importance of preserving strong legal protection under the First Amendment and Section 230 (47 U.S.C. § 230) for Internet intermediaries.

    For example, we were counsel for the Internet Archive in two cases in which Backpage was co-plaintiff, one in Washington state and the other in New Jersey, challenging state laws that sought to hold online companies responsible for hosting third-party ads for illegal sexual transactions. We successfully argued that the laws were invalid under the First Amendment and Section 230.

    Section 230 is the two-decade-old statute passed by Congress to promote online free speech and innovation by immunizing (with certain exceptions) Internet intermediaries from liability for illegal content created or posted by their users. Section 230 immunity holds as long as the companies did not themselves create the illegal content; editing user-generated content is permitted as long as the editing itself does not make the content illegal.

    We’ve also filed amicus briefs in support of strong legal protections for Internet intermediaries. We filed an amicus brief in an emotionally tough Massachusetts case against Backpage brought by young women trafficked for sex as minors via the website. The court rightly dismissed the case, largely adopting our Section 230 arguments.

    Many of Backpage’s fights have hinged on defending fundamental First Amendment rights online. We submitted an amicus brief in a case where Backpage successfully challenged the “campaign of suffocation” by an Illinois sheriff who had illegally coerced major credit card companies to stop doing business with Backpage. Recently, we submitted an amicus brief in a case where Backpage is challenging some of the subpoenas issued by PSI, arguing that the committee’s inquiry into Backpage’s ad moderating practices amounts to improper government interference into core editorial functions protected by the First Amendment—something we also argued Sen. Thune did in relation to Facebook’s “trending” news stories.

    During the PSI hearing, senators expressed their disdain for Backpage’s reliance on Section 230 and the First Amendment. Chairman Rob Portman (R-OH) said that Backpage’s invocation of Section 230 is a “fraud on courts, on victims, and on the public.” Ranking Member Claire McCaskill (D-MO) exclaimed, “This investigation is not about curbing First Amendment rights. Give me a break!” And Sen. Heidi Heitkamp (D-ND) said that Backpage has “the audacity to hide behind the First Amendment.”

    EFF and other civil liberties organizations are all too familiar with the fact that First Amendment rights are often championed by those accused of disseminating unpopular or harmful speech. And when First Amendment rights are weakened for one unsavory person or entity, First Amendment rights become weakened for everyone.

    Most disturbing during the hearing, Chairman Portman said that the committee will explore “legislative remedies” to address the problem of online sex trafficking. This surely means a weakening of Section 230 protection for Internet intermediaries, which EFF strongly opposes. Congress already passed the SAVE Act in 2015, which amended the federal criminal statute on sex trafficking to include anyone involved in advertising sex trafficking. This amendment was specifically meant to target online platforms that host ads posted by third parties and to strip those platforms of Section 230 protection, since the statute does not provide immunity against federal criminal charges.

    Any changes to Section 230 itself, to make it easier to impose liability on companies for user-generated content, would be devastating to the web as we know it—as a thriving online metropolis of free speech and innovation. As my colleague Matt Zimmerman wrote back in 2010 when Craigslist shuttered its adult services section, Section 230 “is not some clever loophole” but rather “a conscious policy decision by Congress to protect individuals and companies who would otherwise be vulnerable targets to litigants who want to silence speech to which they object.”

    Matt further explained:

    This clear protection plays an essential role in how the Internet functions today, protecting every interactive website operator—from Facebook to Craigslist to the average solo blog operator—from potentially crippling legal bills and liability stemming from comments or other material posted to websites by third parties. Moreover, if they were obligated to pre-screen their users’ content, wide swaths of First Amendment-protected speech would inevitably be sacrificed as website operators, suddenly transformed into conservative content reviewers, permitted only the speech that they could be sure would not trigger lawsuits.

    So while Backpage’s announcement suggests that the company’s opponents have at least temporarily won the battle against the adult services section of the website (because Backpage has vowed to continue its legal battles), EFF will continue to try to win the war to ensure that both the First Amendment and Section 230 remain strong protectors of Internet intermediaries—the online innovators who enable the rest of us to communicate, engage in commerce, and generally be active participants in our democratic and diverse society like never before.

    Source: Government Pressure Shutters Backpage’s Adult Services Section | Electronic Frontier Foundation


  • App Store Censorship and FBI Hacking Proposed at Congressional Crypto Hearing

    Tech experts and industry representatives squared off against law enforcement officials in two sessions of lively testimony today in front of the House Energy and Commerce Committee. Today’s hearing is the latest in the ongoing battle in the courts and legislature commonly called the second “Crypto Wars,” after a similar national debate in the 1990s.

    Two witnesses on the law enforcement panel offered a chilling proposal to deal with the well-documented weakness that any domestic encryption ban would do little against the hundreds of encryption products developed and sold internationally. Thomas Galati of the NYPD and Charles Cohen of the Indiana State Police argued that software could be kept off American computing devices by exerting legal pressure on the Android, Apple, and Blackberry app stores.

    That proposal would seem to leave to app store gatekeepers the nigh-impossible task of ensuring that none of the software they carry comes with “warrant-proof” cryptographic options. But worse, it cuts right to the core of fundamental computing freedom questions and cues up the next legislative battle over what software people are allowed to run on their devices.

    It’s a scenario envisioned by EFF Special Advisor Cory Doctorow in his essay Lockdown: as long as we’re using the kinds of general purpose computers that power our phones, laptops, and increasingly everything else, the only way to remove capabilities is by requiring DRM software and other spyware to make sure users are in compliance.

    The laws that currently aim to enforce those kinds of restrictions piggyback on copyright law, and create uncertainty around phone jailbreaking, to pick a relevant example. EFF has argued for—and won—explicit exemptions to those laws, allowing users to install software from alternative app stores. It’s not hard to imagine that if a proposal to regulate encryption software through app store chokepoints were to proceed, it would be accompanied by pressure to tighten those restrictions.

    At another point in the hearing, lawmakers pressed the FBI’s Amy Hess on the role of third-party “grey hat” hackers in accessing the data on the iPhone at the heart of the hotly contested “Apple v. FBI” case. Representative Diana DeGette of Colorado suggested those capabilities might be cultivated internally instead.

    Hess disagreed, saying the FBI will always need to seek the cooperation of industry and academic experts. That might have been an opportunity to discuss the duty FBI and other agencies have in disclosing vulnerabilities to those same tech industry companies—an area EFF has worked to shine light on through Freedom of Information Act requests and lawsuits concerning the Vulnerabilities Equities Process (VEP). Unfortunately, no lawmakers pushed Hess on the question.

    The second panel—made up of industry and tech representatives—seemed to serve as a fact-checking service for the first. Apple’s General Counsel Bruce Sewell, for example, categorically denied three allegations made about his company in the previous panel, saying Apple has not provided source code to the Chinese government, has not actively “thrown away” keys it once used to assist law enforcement, and has not announced passcode protection for the next generation of its iCloud backup software.

    Other irresponsible statements from the first panel went without comment. When Charles Cohen, the Indiana State Police commander, was asked about information that is more accessible to surveillance now than before cell phones, he drew a blank. “I’m having problems thinking of information that is available now that was not before. From my perspective, thinking through investigations that we previously had information for, when you combine the encryption issue along with shorter and shorter retention periods for Internet service providers … it might be difficult to find an example of an avenue that is now available that was not before.” It’s possible that Cohen is not familiar with the myriad ways in which cell phone metadata, content, and location tracking are being used by law enforcement—but that would be quite a surprise, given the Indiana State Police’s long history with the technology.

    Ultimately, it’s a step forward that a congressional committee has summoned tech expertise into the room, if only to explain why law enforcement wasn’t able to compromise our security in the first Crypto Wars. Speaking to a representative who floated the idea of a key escrow system, University of Pennsylvania Associate Professor of Computer and Information Science Dr. Matt Blaze explained: “I just want to caution that the split-key design, as attractive as it sounds, was also at the core of the NSA design of the Clipper Chip, which was where we started over two decades ago.” Blaze should know; his research discovering a fatal flaw in the Clipper Chip protocol is often credited with sinking the project.
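
    For readers unfamiliar with the term, a “split-key” escrow design divides a decryption key into shares held by separate escrow agents, so that exceptional access requires combining the shares. The Python sketch below shows the generic idea with a simple two-share XOR split; it is purely illustrative and is not the Clipper Chip’s actual Skipjack/LEAF protocol.

    ```python
    # Minimal sketch of the generic "split-key" escrow idea, NOT the Clipper
    # Chip's actual design. The device key is split into two shares; either
    # share alone is indistinguishable from random bytes, but whoever obtains
    # both can recover the key.
    import secrets

    def split_key(key: bytes) -> tuple[bytes, bytes]:
        """2-of-2 XOR secret sharing: share1 is random, share2 = key XOR share1."""
        share1 = secrets.token_bytes(len(key))
        share2 = bytes(k ^ s for k, s in zip(key, share1))
        return share1, share2

    def recombine(share1: bytes, share2: bytes) -> bytes:
        """XOR the shares back together to recover the original key."""
        return bytes(a ^ b for a, b in zip(share1, share2))

    device_key = secrets.token_bytes(16)  # a 128-bit key, for illustration
    escrow_agent_1, escrow_agent_2 = split_key(device_key)
    assert recombine(escrow_agent_1, escrow_agent_2) == device_key
    ```

    The math of splitting a key is the easy part; the Clipper Chip experience suggests the hard part is the escrow and access machinery built around those shares, where a single protocol flaw, like the one Blaze found, can unravel the entire scheme.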

    Meanwhile in the Senate, draft legislation could threaten uncompromised cryptography altogether. U.S. readers, tell your Senators to oppose the Burr-Feinstein backdoor proposal today.

    Source: App Store Censorship and FBI Hacking Proposed at Congressional Crypto Hearing | Electronic Frontier Foundation