
Digital Credit Scoring: To What Ends—and Whose?

  • Molly Bombard
  • May 2
  • 3 min read

In the modern age of finance, digital credit scoring resembles a high-stakes poker game, but the rules of the table are known to the house alone, leaving many players in the dark about their financial status. 


Gone are the days when credit reporting agencies relied on traditional metrics such as credit history and income to determine one’s credit score. Instead, financial hegemons such as Bank of America and American Express now delve into an individual’s social media activity, online behavior, and personal affiliations to develop an automated “digital credit score.”


The increase in digital credit scoring by financial institutions raises concerns not just about its opacity but also its inherent biases, departing from established norms surrounding credit reporting. Heretofore, the Fair Credit Reporting Act (FCRA) has been the main policy vehicle for consumers to ensure fairness in their credit scores. Yet, with alternative data sources such as online browsing history, users are not privy to the types of data used to create their credit score. As a result, the effectiveness of the FCRA is limited.


It is hard to believe that consumers who accept a bank’s privacy policy are consenting to an analysis of where they fall within the hierarchy of their social networks or whether they attend marriage counseling.


What’s next? Could the person behind you in line actively affect your credit score?


Well, actually yes. Consider the case of Kevin Johnson in 2008. Despite maintaining a clean credit history and using his American Express card responsibly, his credit limit was abruptly slashed from $10,800 to $3,800 because “other customers who have used their card at an establishment [he] frequently shops at have a poor repayment history with American Express.” Johnson, a Black man living in a historically underinvested neighborhood, was thus punished for factors unrelated to his creditworthiness.


How is this any different from the nefarious practice of redlining? Consumers are financially affected on the basis of their neighborhoods. Thus, a problem arises with digital credit scoring: the use of vast troves of online data enables banks to discriminate against individuals historically excluded from financial services. 


Consumers from different racial and cultural backgrounds may access the Internet in different ways that leave different digital footprints—such as using a mobile phone versus a computer—and people from different cultures are more likely to visit certain websites than others (think religious websites and cultural organizations). Using nontraditional behavioral data that is highly correlated with protected characteristics can therefore introduce bias against a legally protected group.
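The mechanism at work here can be sketched in a few lines of Python. The data, groups, and scoring rule below are entirely hypothetical—a minimal illustration, not any bank’s actual model—but they show how a score that never “sees” a protected attribute can still penalize one group, because the behavioral signal it does use (device type, here) is correlated with group membership:

```python
# Hypothetical illustration of proxy discrimination in scoring.
# Device usage correlates with group membership, but repayment
# behavior does not; the score only looks at the device.

# Each applicant: (group, device). Group A mostly uses mobile,
# group B mostly uses desktop -- a made-up correlation.
applicants = (
    [("A", "mobile")] * 8 + [("A", "desktop")] * 2 +
    [("B", "mobile")] * 2 + [("B", "desktop")] * 8
)

def score(device: str) -> int:
    """A naive rule that penalizes mobile access -- a stand-in
    for any behavioral signal a lender might weight."""
    return 600 if device == "mobile" else 700

def group_average(group: str) -> float:
    scores = [score(d) for g, d in applicants if g == group]
    return sum(scores) / len(scores)

print(group_average("A"))  # 620.0 -- group A scores lower on average
print(group_average("B"))  # 680.0
```

Even though `group` is never an input to `score`, the averages diverge—which is exactly how a facially neutral variable can reproduce the disparities it is correlated with.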


Due to proprietary information laws, banks are not required to disclose how their digital scoring algorithms incorporate personal data. This creates a major issue for consumers who feel their digital credit score does not reflect their trustworthiness and want to seek recourse through the FCRA. The policy “serves the dual goals of ensuring fairness in consumer credit reporting through limitations on how consumer credit information can be disclosed or used.” Yet this law places the burden of accuracy on consumers. If banks are not required to release their decision-making processes for digital credit scoring, it is nearly impossible for consumers to rectify inaccurate reports.


Efforts to mitigate this issue are gradually gaining traction in discussions of financial ethics. This September, the Consumer Financial Protection Bureau issued guidance asserting that lenders should provide specific and accurate reasons for using AI to take adverse actions against consumers. How seriously banks will implement this guidance is unclear. What is clear: the Fair Credit Reporting Act must not be outpaced by developments in digital credit scoring; proprietary information laws should not protect discriminatory practices. If our country truly values equal opportunity, banks ought to be required to release the data used in creating individual credit scores.


In this high-stakes poker game of digital credit scoring, only the banks know the rules and keep their cards close. It is time for the house to finally reveal its hand.




Bibliography


Chopra, Sahiba. “Current Regulatory Challenges in Consumer Credit Scoring Using Alternative Data-Driven Methodologies.” Vanderbilt Journal of Entertainment & Technology Law 23 (2020): 625.


Fowler, Geoffrey. “The Spy in Your Wallet: Credit Cards Have a Privacy Problem.” Washington Post (2019).


Hurley, Mikella. “Credit Scoring in the Era of Big Data.” Yale Journal of Law and Technology 18 (2016): 148.


Shaylor, Jay. “Some Credit Card Companies Financially Profiling Customers.” ABC News (2008).
