Law Review Symposium Examines Online Terrorist Incitement


The Fordham Law Review’s spring symposium convened leading First Amendment and Internet law scholars from across the country on April 7 to present scholarly works addressing the tenuous balance between online free speech and national security concerns.

The symposium, “Terrorist Incitement on the Internet,” featured presentations from 14 renowned law professors with panels examining how the U.S. Constitution might apply to the regulation of terrorist speech in the Internet Age, the role social media plays in terrorist incitement, and inter-doctrinal policy challenges raised by critical race theory and human behavior in the online context. The professors’ papers on the symposium topic will be published in the Fall 2017 edition of the Fordham Law Review.

In recent years, the global reach of social media platforms such as Facebook and Twitter, combined with terrorist organizations’ increased use of such outlets to recruit new members and inspire violent acts, has raised major questions among legal scholars about whether the free speech doctrine outlined in the Supreme Court’s landmark Brandenburg v. Ohio ruling should be narrowed. Under Brandenburg, advocacy can be prohibited only if it is “directed to inciting or producing imminent lawless action” and is “likely to incite or produce such action.”

Democratic government in the Internet age has “a capability problem dealing with incitement and precursors to incitement” such as online hate speech, Fordham Law School Professor Joel Reidenberg said during the day’s opening panel, “Current Approaches & New Realities,” which was moderated by Fordham Law Professor Joseph Landau.

Technology is responsible for the capability problem, noted Reidenberg, who is the founding academic director of Fordham CLIP, the Center on Law and Information Policy.

With little to no accountability, he explained, one person can use social media infrastructure to instantly transmit a fundamentally corrosive message, such as fake news or anti-Semitic content, to millions of people across geographic boundaries. Such capability “cripples” democratic governments’ ability to counter this messaging and jeopardizes public order and safety, Reidenberg said.

Thus far, Section 230 of the Communications Decency Act of 1996 has shielded social media gatekeepers from liability for failing to remove posts that could incite violence. Reidenberg argued that the law goes too far in protecting them. According to Reidenberg, the law must determine what level of responsibility an algorithm’s developers should bear for designing the filters that decide which posts are displayed on social media platforms.

“The challenge is going to be figuring out what level of responsibility they should have and how we are going to oversee implementation of that responsibility,” Reidenberg continued, noting that in this context responsibility means liability.

Reidenberg and other presenters also grappled with free speech, as a legal and societal concept, in the post-9/11 social media age.

“The First Amendment has, in effect, made the U.S. a safe haven for online material that is illegal in other democratic countries that have different views,” said Reidenberg, who also moderated the day’s fourth and final panel, “Inter-Doctrinal Interplay,” in addition to presenting during the morning.

The day’s second panel, “Caution Against Overreaching,” featured presentations by Northwestern University Law Professor Andrew Koppelman and NYU Distinguished Fellow Thane Rosenbaum that questioned the wisdom of tolerating free speech that advocates violating the law. The Internet can and does breed real-life violence and fascism, and thus steps must be taken to confront “morally toxic” hate speech and calls to violence on the web, Rosenbaum asserted.

Context must be taken into account when judging online threats, cautioned Professor Lyrissa Lidsky of the University of Florida Levin College of Law. Lidsky illustrated her point with the story of a 15-year-old “League of Legends” player arrested after he threatened to shoot up a school, an empty threat liberally thrown around in that particular gaming culture and one that should not have resulted in legal repercussions, the professor contended.

Restricting free speech principles can also produce vast unintended consequences for corporate entities and societies, said Danielle Keats Citron ’94, a professor at the University of Maryland Carey School of Law, during the day’s third panel, “Terrorism on Social Media.” Fordham Law Professor Abner Greene moderated the panel.

“In the last year, we’ve seen European regulators essentially coerce companies into wholesale changes to their speech rules and practices in ways that threaten creeping censorship and are far more systematized than ever before,” Citron said, noting the risk is that such provisions later censor far more speech than the hate speech they were designed to police.

Citron was joined on the third panel by Loyola University Chicago Law Professor Alexander Tsesis, who organized the academics who participated in the symposium; Yale Law Professor Jack M. Balkin, founder and director of Yale’s Information Society Project; and Professor Raphael Cohen-Almagor, chair in politics and founding director of The Middle East Study Group at the University of Hull in England.

Loyola University Chicago School of Law co-sponsored the symposium.
