Policymakers and users are demanding that Facebook, Google, YouTube and Twitter identify and take down hate speech, terrorist propaganda and other forms of problematic speech. The European Commission has signed a memorandum of understanding that obliges the social platforms to speed up their takedowns. In June, Germany adopted a law that will impose large fines on networks that do not remove unlawful speech within 24 hours of notification.
In response, the platforms are developing automated tools, ranging from keyword filters to machine-learning software, to moderate large amounts of content. The Center for Democracy & Technology (CDT) is publishing a new paper evaluating how well these tools perform. Do they end up removing legal, even newsworthy, posts? Do they leave many illegal posts online?
Come hear Emma Llansó present the results of CDT's research and preview its recommendations. While automation may be desirable, or even necessary, in some aspects of large-scale content moderation, she also underlines the risks of overbroad censorship.
Emma Llansó's short biography:
Emma Llansó is the Director of CDT's Free Expression Project, which works to promote law and policy that support users' free expression rights in the United States and around the world. She leads CDT's work in developing content policy best practices with Internet content platforms and in advocating for user-empowerment tools and other alternatives to government regulation of online speech.
Emma earned a B.A. in anthropology from the University of Delaware and a J.D. from Yale Law School. Emma joined CDT in 2009 as the Bruce J. Ennis First Amendment Fellow; her fellowship project focused on legal and policy advocacy in support of minors’ First Amendment rights in the US.
- Emma Llansó, CDT: Automation in Content Moderation: Capabilities and Limitations
- Konrad Niklewicz, Civic Institute: Weeding out Fake News: An Approach to Social Media Regulation
- CEPS Commentary based on the event, by William Echikson: To filter or not to filter – That is the question