“It’s free and always will be”—a marketing line that Facebook features on its homepage—is an assertion that Tom Norton, executive director at Fordham Law’s Center on Law and Information Policy (CLIP), takes issue with. “There really is no such thing as a free product or service,” Norton says. “When internet services say things like this, you have to realize you’re paying for it somehow, and it’s most likely with your information, and therefore, your privacy.”
The ways in which internet giants such as Facebook and Google have been utilizing their users’ private information have come under scrutiny in recent months, with last year’s Cambridge Analytica scandal serving as the poster child for this disturbing trend. In fact, it was this scandal—in which up to 87 million Facebook users had their personal information harvested and shared with a data-mining firm without their consent—that recently prompted a group of Fordham Law and University of Michigan scholars to spread the word about how data collection, and particularly data exchange between websites, works.
The result was the paper “APIs and Your Privacy,” released today and co-written by N. Cameron Russell, former executive director of CLIP; Florian Schaub, assistant professor at the University of Michigan’s School of Information; Allison McDonald, Ph.D. candidate in computer science at the University of Michigan’s College of Engineering; and William Sierra-Rocafort, project fellow at CLIP. The paper uses everyday language to explain how application programming interfaces (APIs) function and what impact their use has had on the web, and on society at large.
APIs, simply defined, are software intermediaries that interconnect internet services. An API is what allows people to sign in to Pandora using their Facebook login information, or to use a custom Google search function on a website other than Google.
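In code, that intermediary role is just one service calling a function that another service exposes. The sketch below uses made-up names and a fake token store, not any real Facebook or Pandora endpoint, to show the shape of a delegated sign-in:

```python
# Toy sketch of an API as an intermediary (hypothetical names throughout;
# real sign-in flows use protocols like OAuth and are far more involved).

def identity_api(token):
    """Stand-in for a social platform's login API: a valid token
    maps to a basic user profile."""
    known_tokens = {"tok-123": {"name": "Alice", "email": "alice@example.com"}}
    if token not in known_tokens:
        raise ValueError("invalid token")
    return known_tokens[token]

def third_party_sign_in(token):
    """A music service signs a user in by calling the platform's API
    instead of keeping its own password database."""
    profile = identity_api(token)
    return f"Welcome, {profile['name']}!"

print(third_party_sign_in("tok-123"))
```

The convenience and the privacy cost come from the same place: the third-party service never handles a password, but the platform learns where its users sign in, and the third party receives whatever profile fields the API hands back.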
But APIs have a dark side too, given how freely they facilitate the exchange of information between internet services—often without user knowledge. Indeed, it was Facebook’s Open Graph API that enabled the thisisyourdigitallife app to provide Cambridge Analytica with third-party access to Facebook user data. While 300,000 Facebook users consented to give the app access to some of their own personal information, the API also enabled access to the data of those users’ ‘friends,’ affecting 87 million people.
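A toy model makes the mechanism concrete. The names and data below are invented, and the real Open Graph API had permission scopes this sketch omits, but the core problem is the same: one user's consent unlocks other people's records.

```python
# Toy model (invented names and data) of how a token for one consenting
# user could also expose that user's friends' profiles.

PROFILES = {
    "alice": {"likes": ["hiking"], "friends": ["bob", "carol"]},
    "bob":   {"likes": ["chess"],  "friends": ["alice"]},
    "carol": {"likes": ["jazz"],   "friends": ["alice"]},
}

def api_get_profile(requested_user, token_user):
    """Platform API: grants access to the token holder's own profile
    AND, crucially, to the profiles of the token holder's friends."""
    if requested_user == token_user or \
            requested_user in PROFILES[token_user]["friends"]:
        return PROFILES[requested_user]
    raise PermissionError("token does not grant access to this profile")

# Only alice consented to the app, yet it can pull her friends' data too.
app_data = {user: api_get_profile(user, "alice")
            for user in ["alice"] + PROFILES["alice"]["friends"]}
print(sorted(app_data))  # alice plus two friends who never consented
```

Scaled up, this is how 300,000 consenting users could expose 87 million people: each token fanned out across a friend list the app's users did not control.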
APIs also allow advertisers to tailor ads to individuals by mining personal information from their Facebook updates and Google searches, a phenomenon that leaves some users uneasy.
“Some people are somewhat aware of the data exchange that goes on—and of the collection, sharing, and digital economy that enables that,” Norton says, “but I think that most people aren’t aware of how deep it goes and how vast data collection actually is.”
Given that smartphone and computer users regularly allow themselves to be physically tracked through location-based apps, frequently input sensitive financial data on mobile payment systems, and casually reveal details about their personal lives on social media, the implications of API abuse are troubling.
Luckily, Norton notes, some progress has been made toward curbing potential issues with APIs. In June 2018, California passed the California Consumer Privacy Act, one of the nation’s toughest privacy laws. The following month, Twitter placed new restrictions on its API. Sweeping data protection laws have already taken effect in Europe, and Norton hopes that the United States will be quick to follow.
For now, he notes, there are a few tools users can install to combat some unwanted aspects of APIs, including Ghostery, Privacy Badger, and uBlock Origin.
Norton notes that any sort of sweeping ban on APIs would be impossible. “Basically all of the internet that we use today is based on APIs,” he says. “They are everywhere, and they’re not going away.”
With this study, Norton says, “we want people to internalize how vast the data ecosystem is—how much data from you, and people all around you, is being collected and shared all the time and at many different levels.
“This paper is a good start, but policymakers have to pick up the ball. We hope this paper will show policymakers how this technology can impact privacy and influence them to push concrete policy change.”