People lie to the media about EARN IT; the media must stop repeating their false claims
from the that’s-not-how-any-of-this-works department
Update: After this post was published, Tech Review appears to have made a major edit to the article, adding a correction regarding the completely false claim that Section 230 protects CSAM. The article still has issues, but isn’t nearly so blatant anymore. The post below is about the original article.
MIT Tech Review has an article this week, presented as a news article, claiming (dubiously) that “the United States now hosts more child sexual exploitation material (CSAM) online than any other country”, and claiming that if we don’t adopt the EARN IT Act, “the problem will only get worse.” The problem is that the article is full of false or misleading claims that the reporter apparently did not verify.
The biggest problem with the article is that it attributes this turn of events to two things: a group of “prolific CSAM sites” moving their servers from the Netherlands to the US, and then… Section 230.
The second is that Internet platforms in the United States are protected by Section 230 of the Communications Decency Act, which means they cannot be sued if a user uploads something illegal. While there are exceptions for copyright infringement and material related to adult sex work, there is no exception for CSAM.
So that’s the claim a lot of people make, but a journalist from a respectable publication shouldn’t make it, because it is simply wrong. Incredibly, the reporter points out that there are “exceptions” for copyright violations, but she fails to note that the exception she names, 230(e)(2), comes after another exception, 230(e)(1), which literally says:
(1) No effect on criminal law
Nothing in this section shall be construed to impair the enforcement of section 223 or 231 of this title, chapter 71 (relating to obscenity) or 110 (relating to sexual exploitation of children) of title 18, or any other Federal criminal statute.
It’s almost as if the reporter just accepted the claim that there was no exception for CSAM and didn’t bother to, you know, look at the actual law. Child sexual abuse material violates federal criminal law. Section 230 explicitly exempts federal criminal law. The idea that 230 has no exception for CSAM is simply wrong. It’s not a matter of interpretation; it’s a matter of fact, and MIT Tech Review is getting the facts wrong.
The article then gets worse.
The article quotes Hany Farid, professor of computer science at the University of California, Berkeley and co-developer of PhotoDNA, a technology that turns images into unique digital signatures, called hashes, to identify CSAM, claiming that companies have “little legal incentive” to deal with the problem.
People keep making this “little legal incentive” claim as if 18 USC 2258A does not exist. But it does. And that law says very clearly that websites must report CSAM found on their platforms. A website that fails to do so can be fined $150,000 for its first violation and up to $300,000 for each subsequent violation.
I don’t know how anyone can look at this and say there is no legal incentive to keep CSAM off their platform.
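To make those quoted figures concrete, here is a minimal sketch of that fine schedule as arithmetic. It uses only the two numbers cited above; the statute’s actual penalty provisions have more conditions and nuance than this toy function captures.

```python
def max_fine(violations: int) -> int:
    """Maximum fine using the figures quoted above: $150,000 for the
    first failure to report, up to $300,000 for each one after that."""
    if violations <= 0:
        return 0
    return 150_000 + 300_000 * (violations - 1)

# A platform with three unreported violations could face up to $750,000.
assert max_fine(1) == 150_000
assert max_fine(3) == 750_000  # 150k + 2 * 300k
```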
And, just to make the point even clearer, you’ll be hard pressed to find any legitimate Internet service that wants this content on its website, for fairly obvious reasons. First, it is abhorrent content. Second, it’s a good way to get your entire company shut down when the DOJ comes after you. Third, it’s bad for any kind of regular business (especially an ad-based one) to be “the platform that enables” this kind of content.
To claim that there is no incentive, legal or otherwise, is simply not true.
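As an aside on the hashing technology mentioned above: PhotoDNA itself is proprietary and far more robust, but the general idea of a perceptual hash can be sketched with a simple “average hash” over a grayscale pixel grid. All names here are illustrative, not PhotoDNA’s actual API. The key property is that near-duplicate images produce identical or nearby hashes, while unrelated images do not, so a service can match uploads against a database of known-bad hashes without storing the images themselves.

```python
# Illustrative sketch only: a toy "average hash" perceptual hash.
# Images are represented as lists of rows of grayscale values (0-255).

def average_hash(pixels):
    """Bit string with 1 wherever a pixel is >= the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p >= mean else '0' for p in flat)

def hamming_distance(h1, h2):
    """Number of differing bits; a small distance suggests a near-duplicate."""
    return sum(a != b for a, b in zip(h1, h2))

original = [[10, 200], [190, 20]]
slightly_edited = [[12, 198], [185, 25]]  # e.g. a re-compressed copy
unrelated = [[200, 10], [20, 190]]

h = average_hash(original)
assert hamming_distance(h, average_hash(slightly_edited)) == 0  # match
assert hamming_distance(h, average_hash(unrelated)) == 4        # no match
```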
Later in the article, the reporter mentions that companies must report this content, but then argues that it’s different because “they don’t have to actively seek it out.” And that goes to the heart of the EARN IT debate. EARN IT’s supporters insist it’s not a “surveillance” bill, but when you dig into the details, they admit that what’s really upsetting them is just a few companies that refuse to install this kind of filtering technology. Except, as we’ve detailed (and the article doesn’t even bother to address), if the US government passes a law mandating filters, it creates a massive 4th Amendment problem that will make it more difficult to prosecute CSAM purveyors: under the 4th Amendment, the government cannot order a blanket search like this, and if it does, those being prosecuted can move to suppress the evidence.
Plus, we’ve been through this time and time again. If the real problem is companies failing to find and report CSAM, then the real question is why the DOJ has done nothing about it. It already has the tools, under Section 230 (which exempts federal criminal law, including CSAM) and 2258A, to bring these cases. But it hasn’t. And EARN IT does nothing to better fund the DOJ, or even to ask why the DOJ never brings any of these cases.
Incredibly, some of the quoted “experts”, all of whom stand to benefit from EARN IT’s passage (because the reporter apparently didn’t bother to ask anyone else), basically make this point themselves, without even realizing it:
Other than “bad press,” there aren’t many penalties for platforms that fail to quickly remove CSAM, says Lloyd Richardson, director of technology at the Canadian Centre for Child Protection. “I think you’d be hard-pressed to find a country that fines an e-service provider for being slow or not removing CSAM,” he says.
Well, isn’t that the problem then? If the problem is that countries are not enforcing the law, shouldn’t we ask why, and how to get them to enforce it? Instead, they want this new EARN IT law, which does nothing to actually increase enforcement, but will open up many websites to completely frivolous lawsuits if they dare to do something like offer encrypted messaging to end users.
Incredibly, later in the article, the reporter admits that (as mentioned at the beginning of the article) the reason so many websites hosting this kind of abusive material left the Netherlands was… that the Dutch government finally started taking enforcement seriously. But she then immediately says that since the content simply moved to the United States, this wasn’t really effective, and that “the solution, according to child protection experts, will come in the form of legislation”.
But, again, it’s already illegal. We already have laws. The problem is not the legislation. The problem is the enforcement.
Finally, at the end, the reporter mentions that “privacy and human rights advocates” don’t like EARN IT, but misrepresents their actual arguments, presenting a false dichotomy in which tech companies are “prioritizing the privacy of those who distribute CSAM on their platforms over the safety of those who are victimized by it.” That’s just rage-baiting nonsense.
Companies rightly prioritize encryption because it protects the privacy and data of everyone, and encryption is particularly important for marginalized and at-risk people who need to be able to seek help in ways that aren’t compromised. And, again, every major Internet company already takes this stuff very seriously, as it must under existing law.
Also, as mentioned above, the article never once mentions the 4th Amendment, and with it the fact that mandating that websites scan everything will actually make it much, much harder to stop CSAM. Experts have explained this. Why didn’t the reporter talk to actual experts?
The whole article repeatedly conflates sketchy dark web purveyors with big internet companies. EARN IT will not be used against those dark web forums. Just as with FOSTA, it is going to be used against random third parties whose services those sketchy operations happened to use. We know it. We have seen it. Mailchimp and Salesforce were both sued under FOSTA because some people tangentially associated with sex trafficking also used those services.
And with EARN IT, anyone offering encryption will also fall victim to these kinds of lawsuits.
An honest account of EARN IT and what it does would (1) not lie about what Section 230 does and does not protect, (2) not misrepresent the state of the law for websites in the US today, (3) not quote only people heavily invested in the fight for EARN IT, (4) not misrepresent the warnings of those pointing out EARN IT’s many problems, (5) not overlook that the real problem is the DOJ’s unwillingness to enforce existing law, (6) be willing to discuss the real threats of undermining encryption, (7) be willing to discuss the real problems of requiring universal scanning/upload filters, and (8) not let anyone get away with a bogus quote falsely claiming that companies care more about the privacy of CSAM purveyors than about stopping CSAM. That last one is truly infuriating, because there are a lot of really good people trying to figure out how these companies can stop the spread of CSAM, and articles like this one, full of errors and nonsense, belittle all the work they do.
MIT Tech Review should know better and shouldn’t publish garbage like this.
Filed Under: csam, earn it, incentives, photodna, scanning, section 230, surveillance
Companies: mit technology review