A new attack on social media's immunity
Published Date: 6/13/2019
Source: axios.com
For all the talk of antitrust investigations, the bigger threat to tech platforms like Google and Facebook is an intensifying call from Congress to revamp a law that shields them and other web companies from legal liability for users' posts.

Driving the news: House Intelligence Chairman Adam Schiff today joined a motley group of policymakers calling to reconsider the legal protections afforded to tech platforms. It's a broadening of a line of attack that caught fire last year, when a new law made it easier to sue tech platforms for hosting sex-trafficking ads.

The big picture: Social media companies are taking hits from every direction for allowing hate speech, false information and now fake video to mushroom on their sites. But legally, they're in the clear even when hosting the most odious content.

Be smart, per Axios' David McCabe: Lawmakers have been threatening broad changes to the immunity law for over a year but haven't advanced any legislative proposals to do so. At this point, it's more potent as leverage than as something they've been willing to move forward.

Details: Section 230 of the Communications Decency Act protects companies that carry user-generated content — like Facebook, Twitter, YouTube and other sites — from bearing legal liability for what their users post.

It's become a cornerstone of the modern internet since it was passed in 1996, freeing companies from having to closely police every sentence, video or photo published on their platforms.

But critics say it has allowed them to shirk a societal responsibility to keep harmful and false information from spreading online.

After a hearing today on the national security implications of deepfakes — AI-manipulated videos — Schiff told reporters:

"If the social media companies can't exercise a proper standard of care when it comes to a whole variety of fraudulent or illicit content, then we have to think about whether that immunity still makes sense. These are not nascent industries or companies that are struggling for viability — they're now behemoths, and we need them to act responsibly."

One idea for how to update the law comes from Danielle Citron, a University of Maryland law professor who has written extensively about deepfakes and was a witness at today's hearing.

"It shouldn't be a free pass," Citron said of the immunity. "It should be conditioned on reasonable content moderation practices."

That would mean companies like Facebook could get in legal trouble if someone posted a defamatory fake video and the company didn't act reasonably to take it down or tell users it was manipulated.

The "reasonable person" standard is commonly applied across the law.

What's next: If this idea picks up steam again in Congress, expect Big Tech — including any site that hosts user comments and reviews, user-written ads, or videos and photos — to fight tooth and nail to keep its Section 230 immunity.