When Algorithms Don’t Account for Civil Rights


Olivier Sylvain was quoted in The Atlantic about instances of discrimination in online content.

When it comes to assessing culpability for online discrimination, the Communications Decency Act is often used to determine whether internet platforms are at fault for illegal content that appears on their sites. The law, passed in 1996, essentially says that platforms hosting large amounts of user-uploaded content, such as Facebook, YouTube, or Craigslist, generally can't be held responsible when a user posts something discriminatory, according to Olivier Sylvain, a professor at Fordham Law School.


But posting paid advertising that violates anti-discrimination laws is different, Sylvain says: "They are on the hook when they contribute one way or another in their design and the way in which the information is elicited." One example that illustrates the limits of the protections the Communications Decency Act (CDA) offers companies involved the website Roommates.com. The platform, a forum to help individuals find roommates, was sued for violating the Fair Housing Act by allegedly enabling gender discrimination in housing. A court ruled that because the site's design required users to fill in fields about gender in order to post, it couldn't rely on the CDA's immunity as a defense. Roommates.com ultimately won the lawsuit, but the platform now makes adding information about gender optional. (Roommates.com did not respond to a request for comment.)


Read full article.
