Will robots take my job? It’s a frightening thought that many American workers have faced, including legal professionals. A website called WillRobotsTakeMyJob.com, which calculates the probability of certain jobs being automated, puts lawyers at a relatively safe 4% risk of being replaced by automation, while paralegals and legal assistants are at 94%—“Automation Risk Level: You Are Doomed” is the site’s deadpan prognosis for these professionals. But lawyers who are resistant to technology as a whole might need to be reminded that employment anxiety is only the tip of the iceberg: law’s relationship to artificial intelligence promises to grow in scope and complexity in the coming years, as a February 15 conference at Fordham Law confirmed.
The conference, titled Rise of the Machines: Artificial Intelligence, Robotics, and the Reprogramming of Law, brought together a diverse and interdisciplinary array of professionals, including attorneys, neuroscientists, and technologists, to explore current and near-future developments in robotics, artificial intelligence, law, and policy.
“It seems like just a few years ago that these issues were purely in the realm of science fiction,” said Dean Matthew Diller in his opening remarks. Ethical standards for the development of artificial intelligence, Diller stressed, will be especially critical moving forward—a sentiment that was echoed throughout the day.
“In many ways machines are only as good as their creators,” added Professor Deborah Denno, director of Fordham Law’s Neuroscience and Law Center and one of the conference’s organizers, in her welcome address.
The first panel, “How Neuroscience and Ethics Inform Robots and the Laws Governing Them,” directly tackled these pressing ethical concerns. Panelist Shlomit Yanisky-Ravid, visiting professor at Fordham Law and head of the AI-IP Project at the Law School’s Center on Law and Information Policy, talked about recent AI-produced artwork, including an incredibly convincing mock-Rembrandt painting. The question for Yanisky-Ravid becomes one of ownership and accountability: who legally owns these artworks, once generated? “IP laws are outdated and irrelevant in many ways, regarding AI,” she noted, and they must be updated in order to clear up the confusion.
The second panel of the day was “AI and Machine Learning in Finance and the Ethical Effects on Markets.” Tom C.W. Lin, professor at Temple University’s Beasley School of Law, spoke about the ways in which AI has affected the world of finance, particularly high-speed trading. “AI has led to a more efficient marketplace,” he said, but there are caveats. “AI often has the veneer of objectivity, masked by math and technology,” he said, but AI makes mistakes too. Lin stressed that, in the end, people are at the heart of markets and finance, and they must maintain a presence there, particularly to counterbalance the natural limitations of AI.
Panel III addressed “Ethical Programming and the Impact of Algorithmic Bias.” Topics ranged from biases in online marketing algorithms to moral and legal issues surrounding autonomous weapons systems. Rebecca Crootof, executive director of Yale Law School’s Information Society Project, presented on the latter topic. Liability was a major point of concern. Instead of asking, “Who is responsible?” when a weapons system malfunctions and harms innocent people, Crootof suggested we ask “What is the appropriate liability?” Indeed, placing full blame on a software engineer for unintended deaths caused by an AI malfunction could set a troubling precedent.
The fourth and final panel of the day was “Protecting Consumers’ Privacy and Ensuring Ethical Data Practices.” Panelist Oliver Round ’07, counsel and vice president at BNY Mellon, talked about the day-to-day challenges of dealing with new technology while working at a law firm. Round and his colleagues had to build an AI system in order to find information they needed in a collection of digitized contracts—all while dealing with the thorny issue of privacy. The field of privacy law is “constantly evolving” thanks to new technologies, according to Round, and it will be crucial for privacy attorneys and lawmakers to keep up.
Friday’s conference was presented by the Fordham Law Review and Fordham Law’s Neuroscience and Law Center, and made possible through the sponsorship of Davis Polk & Wardwell LLP.