As gig workers’ pay gets slashed by algorithms, experts warn that A.I.-driven wage systems put no one’s paycheck out of reach. In a recent Slate article, Fordham law professor Zephyr Teachout warns that these algorithmic wage-setting tactics could eventually be deemed unfair trade practices under state or federal law, even as specific legislation remains the likeliest path to curbing them.
Algorithms can be employed to sniff out desperation for income from the lengths people are willing to go to on the job, such as Uber drivers’ high trip acceptance rates. With this trove of granular information, A.I. can calculate the lowest possible pay that workers across sectors will tolerate and suggest incentives like bonuses to steer their behavior. Bosses have always offered so-called variable pay, such as higher rates for night shifts or performance-based salary boosts, but high-tech surveillance coupled with A.I. is taking real-time tailored wages to new extremes.
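To make that mechanism concrete, here is a minimal, purely hypothetical sketch in Python of how such a system could work. Every signal, weight, and function name below is invented for illustration; it does not describe Uber’s or any other platform’s actual code.

```python
from dataclasses import dataclass

@dataclass
class WorkerSignals:
    """Behavioral signals a platform might log (all fields hypothetical)."""
    acceptance_rate: float    # share of offered jobs accepted, 0.0-1.0
    late_night_share: float   # share of hours worked between midnight and 5 a.m.
    days_since_last_decline: int

def desperation_score(s: WorkerSignals) -> float:
    """Toy 'willingness to accept low pay' score in [0, 1].
    A real system would be trained on historical accept/reject data;
    these weights are made up purely for illustration."""
    score = (
        0.5 * s.acceptance_rate
        + 0.3 * s.late_night_share
        + 0.2 * min(s.days_since_last_decline / 30, 1.0)
    )
    return max(0.0, min(score, 1.0))

def personalized_offer(base_rate: float, s: WorkerSignals, max_discount: float = 0.25) -> float:
    """Shave the offer toward the lowest pay the model predicts the worker will tolerate."""
    return round(base_rate * (1 - max_discount * desperation_score(s)), 2)

# Two workers doing identical work receive different offers.
flexible = WorkerSignals(acceptance_rate=0.55, late_night_share=0.05, days_since_last_decline=2)
dependent = WorkerSignals(acceptance_rate=0.98, late_night_share=0.40, days_since_last_decline=45)
print(personalized_offer(20.00, flexible))   # stays close to the base rate
print(personalized_offer(20.00, dependent))  # discounted more aggressively
```

The point of the toy example is the structural one: two workers doing identical work can be shown different pay, with the discount keyed to how dependent the model guesses each one is.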
“Now you have machine learning trained on identifying the desperation index of workers,” Zephyr Teachout, a professor of law at Fordham University, told me. “When you move to the formal employment context, there is every reason to think that employers who can would be interested in tailoring their wages and using behavioral data.”
…
Organized labor has also started cracking open the A.I. “black box”: In 2020, unionized IBM employees in Japan claimed the company refused to explain how it uses A.I. to help evaluate staff and decide on salary increases. This kicked off an investigation by a labor relations commission that ended in a settlement this past August, when IBM agreed to reveal to the union the 40 data categories involved.
Ultimately, experts say that specific legislation is likely the best bet to curb algorithmic wage discrimination. These tactics are currently considered legal in the U.S., law professor Veena Dubal wrote, as long as they don’t violate minimum wage or antidiscrimination laws. Teachout thinks these types of tactics could eventually be deemed unfair trade practices under state or federal law, but to her knowledge such cases haven’t yet been brought in court.
Federal and state laws technically protect us from workplace discrimination based on factors like race and gender, but Teachout has noted in her research that A.I. may incorporate data that acts as a proxy for these identities. An algorithm could find, for example, that workers with less in savings are less likely to leave a job even when paid low wages. This could disproportionately affect Black and Latino workers, who tend to have less in savings than other groups.
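To illustrate how a proxy can reproduce a protected characteristic the model never sees, here is a small, entirely synthetic simulation; the group labels, savings figures, and pay formula are invented for illustration only.

```python
import random
from statistics import mean

random.seed(0)

# Synthetic workforce: the pay model is never shown 'group', only savings.
# The savings gap between groups is invented here purely to show how a proxy works.
def make_worker(group: str) -> dict:
    typical_savings = {"A": 6.0, "B": 2.0}[group]  # hypothetical group averages (months of expenses)
    return {
        "group": group,
        "savings_months": max(0.0, random.gauss(typical_savings, 1.5)),
    }

workers = [make_worker("A") for _ in range(1000)] + [make_worker("B") for _ in range(1000)]

def offered_wage(worker: dict, base: float = 20.0) -> float:
    """Toy pay model that discounts workers predicted to be less likely to quit.
    It only looks at savings, yet its effects track group membership."""
    cushion = min(worker["savings_months"] / 6.0, 1.0)  # 0 means no buffer, 1 means 6+ months saved
    discount = 0.15 * (1.0 - cushion)                   # bigger cut for less cushion
    return base * (1.0 - discount)

for g in ("A", "B"):
    avg = mean(offered_wage(w) for w in workers if w["group"] == g)
    print(f"group {g}: average offer ${avg:.2f}")
```

Even though the model only looks at savings, its average offers end up lower for the group that, in this synthetic data, starts with less of a financial cushion.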
Read “Why You Might Soon Be Paid Like an Uber Driver—Even If You’re Not One” on Slate.