Credit Scores: Transparency and Inclusiveness in the Banking Industry


Information collection and online tracking are an extremely lucrative industry controlled by data brokers who treat humans as products.[1] Data brokers acquire information across thousands of channels, aggregate it around a target audience, and resell it to interested third parties.[2] They collect data points such as name, address, age, income, education, occupation, race, social media use, web history, and credit card or purchase history, then feed them into algorithms that predict future actions.[3] Brokers obtain the information from government databases, public websites, and other data brokers' databases, and their algorithms often use past trends to hypothesize and fill in data where our information is lacking.[4] The interested third parties use this information to predict or manipulate our actions and further their business models.[5]

For the purposes of this post, the interested third parties are banks, specifically those analyzing credit and determining interest rates.

Interest rates are traditionally based on credit scores, which gauge an individual's loan risk by entering the individual's past credit activity into a statistical model.[6] Under traditional credit score calculations, 26 million Americans lack the credit history necessary to generate a credit score, subjecting them to higher interest rates.[7] Those higher interest rates, unfairly derived from insufficient information, can catapult affected individuals into a cycle of debt and cost the banking industry billions in earnings.[8] If banks embrace new clients and provide more loans, they will increase their profits from interest.[9]
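To make the mechanics concrete, the kind of statistical model described above can be sketched as a simple logistic risk score. Everything here is hypothetical: the feature names, weights, and rates are illustrative assumptions, not any bureau's actual formula, since real scoring models are proprietary.

```python
import math
from typing import Optional

# Hypothetical feature weights -- illustrative assumptions only; real
# scoring models (e.g., FICO) are proprietary and far more complex.
WEIGHTS = {
    "on_time_payment_rate": -3.0,  # more on-time payments -> lower risk
    "credit_utilization": 2.0,     # higher utilization -> higher risk
    "years_of_history": -0.1,      # longer history -> lower risk
}
INTERCEPT = 0.5

def default_probability(history: dict) -> Optional[float]:
    """Estimate loan risk from past credit activity.

    Returns None for 'credit invisible' applicants with no history,
    mirroring the ~26 million Americans who cannot be scored.
    """
    if not history:
        return None
    z = INTERCEPT + sum(w * history.get(k, 0.0) for k, w in WEIGHTS.items())
    return 1 / (1 + math.exp(-z))  # logistic link: risk in (0, 1)

def offered_rate(history: dict, base=0.05, penalty=0.18) -> float:
    """Price the loan against estimated risk; unscorable applicants
    fall back to the highest (penalty) rate."""
    risk = default_probability(history)
    return penalty if risk is None else base + 0.10 * risk
```

Under these assumed numbers, an applicant with a thick, positive file is quoted a rate near the base, while an applicant with no file at all receives the penalty rate regardless of actual creditworthiness, which is exactly the dynamic the post describes.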

As a result, many banks are beginning to incorporate information from data brokers in determining interest rates.[10] If successful, traditionally excluded individuals will have access to reasonable rates.[11] In turn, reasonable rates advance their opportunities and help end the cycle of poverty.[12] This process has the potential to limit systemic barriers within the industry and increase inclusivity, allowing many new individuals to receive benefits the banking industry formerly withheld.[13]

The credit industry is regulated by the Fair Credit Reporting Act (“FCRA”), which requires that all individuals have access to a free credit report containing all material information and have the opportunity to dispute any incorrect or incomplete information.[14] If the information is incorrect, the credit reporting agency must correct or delete it.[15] The credit industry is also regulated by the Equal Credit Opportunity Act (“ECOA”), which “prohibits creditors from discriminating against credit applicants on the basis of race, color, religion, national origin, sex, marital status, age . . . .”[16] When banks incorporate data-broker information into their scores, they remain subject to these standards.[17]

Additionally, a bill recently introduced in the Senate, the Data Broker List Act of 2019, would require data brokers to register with the Federal Trade Commission.[18] However, this regulation is not expansive enough to protect against the severe implications of unknown data sharing. Even combined, the FCRA, the ECOA, and the proposed Data Broker List Act of 2019 would insufficiently regulate credit data brokers, given the industry's rapid expansion, the uncontrolled and potentially inaccurate nature of today's internet, and the vast amount of data involved in brokers' analyses.[19] Individuals cannot protect their privacy when brokers are not visible, and because data brokers are not required to disclose their algorithms, individuals are left frustrated, with insufficient recourse to correct the information.[20]

The scores created from the information data brokers provide are only as unbiased and successful as the algorithms that power them.[21] Unsuccessful programming can lead to implicit bias and systemic barriers in this method of collecting data.[22] For example, one data broker that analyzes Google searches determined that individuals who searched “drunk” were higher risks, so banks may charge them a higher rate.[23] However, it does not necessarily follow that this seemingly innocent Google search shows an individual is financially irresponsible; the person could be searching for such content out of concern for a family member or friend.
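The pitfall the “drunk” example illustrates is that a keyword feature is blind to intent. A minimal sketch of such a scorer, with an entirely assumed keyword list and penalty, shows that two searchers with very different motives receive identical penalties:

```python
# Hypothetical alternative-data feature: flag "risky" search terms,
# as in the example the post cites. The keyword list and the penalty
# size are assumptions made for illustration.
RISKY_KEYWORDS = {"drunk", "payday loan"}

def search_risk_penalty(search_history):
    """Add a flat rate penalty per query containing a risky keyword,
    with no attempt to infer the searcher's intent or context."""
    hits = sum(
        any(kw in query.lower() for kw in RISKY_KEYWORDS)
        for query in search_history
    )
    return 0.01 * hits  # each hit raises the offered rate by one point

# Two very different motives, one identical penalty:
self_search = ["am i drunk right now"]
concerned_search = ["how to help a drunk family member"]
```

Because the feature only pattern-matches text, the concerned relative is priced as the same risk as the person searching about their own behavior, which is the implicit bias the paragraph describes.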

The risk of financial instability increases when the channels offering credit are not controlled and the bases of their decisions are not disclosed.[24] The sheer amount of data makes it difficult to determine what is material and exactly which factors led to a given conclusion.[25] Thus, the broker may not even be able to articulate what was materially relevant to its own conclusion.[26]

If implemented successfully, data brokers can completely transform the banking industry and help eliminate current systemic barriers.[27] A bank would enter the individual's information into an algorithm and generate a reasonable interest rate based on the individual's character and personality traits.[28] For the banking industry's risk model to be both accurate and protective of individuals' privacy rights, the algorithm must eliminate, not perpetuate, material error and implicit bias. How can we be sure it is accurate if the information is not available to be verified? The United States must enact new regulations that force transparency and limit the amount of data analyzed, thereby ensuring only relevant, unbiased factors are considered, to the benefit of both consumers and the banking industry.

[1] See generally FTC, Data Brokers: A Call for Transparency and Accountability 3 (2014).

[2] See id. at ii.

[3] See id. at iv, 12–13.

[4] See Douglas MacMillan, Data Brokers Are Selling Your Secrets. How States Are Trying to Stop Them, The Wash. Post (June 24, 2019, 5:54 PM).

[5] See Amy Kapczynski, The Law of Information Capitalism, 129 Yale L.J. 1460, 1469 (2020) (explaining how Google enters our searches to determine our behaviors, make predictions about future trends, and monetize these behaviors).

[6] See Fin. Stab. Bd., Artificial Intelligence and Machine Learning in Financial Services: Market Developments and Financial Stability Implications 12 (2017).

[7] Kenneth P. Brevoort & Michelle Kambara, CFPB Data Point: Becoming Credit Visible 4 (2017).

[8] See generally Aaron Klein, Reducing Bias in AI-Based Financial Services, Brookings (July 10, 2020).

[9] Id.

[10] See U.S. Gov’t Accountability Off., Information Resellers: Consumer Privacy Framework Needs to Reflect Changes in Technology and the Marketplace 16–19 (2013).

[11] See Klein, supra note 8.

[12] See id.

[13] See Fin. Stab. Bd., supra note 6, at 13.

[14] 15 U.S.C. § 1681.

[15] Id.

[16] 15 U.S.C. § 1691.

[17] See U.S. Gov’t Accountability Off., supra note 10, at 8.

[18] Data Broker List Act, S. 2342, 116th Cong. (2019).

[19] See Klein, supra note 8.

[20] See Kalev Leetaru, The Data Brokers So Powerful Even Facebook Bought Their Data – But They Got Me Wildly Wrong, Forbes (Apr. 8, 2018, 4:08 PM).

[21] See Klein, supra note 8.

[22] Id.

[23] See, e.g., Katja Langenbucher, Responsible A.I. Credit Scoring – A Legal Framework, 25 Eur. L. Rev. 1, 3 (2020) (citing The Future of Credit Scoring, TotallyMoney (last visited Sept. 25, 2020)).

[24] See Ratna Sahay et al., Financial Inclusion: Can It Meet Multiple Macroeconomic Goals?, Int’l Monetary Fund 16 (2015).

[25] See Lael Brainard, Governor, Fed. Rsrv. Sys., Speech at Fintech and the New Financial Landscape: What Are We Learning About Artificial Intelligence in Financial Services (Nov. 3, 2018).

[26] Id.

[27] See Klein, supra note 8.

[28] Id.



Fordham Journal of Corporate & Financial Law