Bias and Discrimination with Artificial Intelligence
By Robert Robinson (@medicaliphone)
Artificial intelligence (AI) systems have been shown to discriminate, effectively and invisibly, against customers on the basis of gender and race, reinforcing existing barriers to social and financial success.
Gender discrimination has the potential to affect the majority of the financial industry's customers because of deep societal biases against women. AI systems have been shown to hide high-paying job openings, exclude otherwise qualified job candidates, and prioritise professions or activities based on gender stereotypes. These biases reinforce low-paying jobs, reduced opportunities, and gender stereotypes for women. They can limit women's financial potential and have lifelong consequences for the individual and for society.
Invisibly building gender bias, via artificial intelligence, into the foundations of the financial system will exacerbate gender-based financial disparities. Racial discrimination through AI can be seen in longer prison sentences in the United States for minorities, the identification of minorities as animals in photographs, and the targeting of people for increased law enforcement scrutiny based on race alone. This discrimination is serious and potentially life altering, and it invisibly reinforces segregation and negative stereotypes.
Historic bias in the financial industry against racial minorities could readily return with the deployment of AI-based financial risk assessment systems trained on historical, prejudiced data. Limiting financial freedom and opportunity on the basis of gender and race through hidden biases within AI systems will also reduce the potential of the financial industry itself.
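To see how past prejudice can resurface, consider a minimal sketch (all data and names invented for illustration): a naive risk model fit to historically biased approval decisions reproduces the bias through a proxy feature, here a made-up postcode that correlates with a protected group, even though the protected attribute itself is never used.

```python
from collections import defaultdict

# Hypothetical historical records: (postcode, approved).
# The postcode acts as a proxy for race; past approvals were
# prejudiced, not merit-based.
history = [
    ("A1", True), ("A1", True), ("A1", True), ("A1", False),
    ("B2", False), ("B2", False), ("B2", True), ("B2", False),
]

# "Training": learn the approval rate per postcode from the past.
counts = defaultdict(lambda: [0, 0])  # postcode -> [approved, total]
for postcode, approved in history:
    counts[postcode][0] += approved
    counts[postcode][1] += 1

def predict(postcode):
    """Approve when the historical approval rate exceeds 50%."""
    approved, total = counts[postcode]
    return approved / total > 0.5

# Two equally creditworthy applicants receive different outcomes
# purely because of where they live: the prejudice is replicated.
print(predict("A1"))  # True
print(predict("B2"))  # False
```

The model never sees race, yet its decisions mirror the discriminatory pattern in the training data, which is exactly the mechanism described above.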
Maximising opportunities for all through the use of unbiased systems will reduce disparities and result in a stronger and more inclusive global financial system. This can only be accomplished through algorithmic transparency and conscious attention to the potential of incorporating historical societal and cultural biases into AI systems.
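One concrete form such transparency can take is a routine audit of outcomes by group. The sketch below (data and thresholds invented for illustration) computes a disparate impact ratio, a widely used heuristic in which a ratio of favourable-outcome rates below 0.8 is treated as a red flag:

```python
def selection_rate(outcomes):
    """Fraction of favourable (True) outcomes in a group."""
    return sum(outcomes) / len(outcomes)

def disparate_impact(protected, reference):
    """Ratio of selection rates between a protected group and a
    reference group; values below 0.8 are a common red flag
    (the 'four-fifths rule')."""
    return selection_rate(protected) / selection_rate(reference)

# Invented audit data: loan approvals for two groups.
group_a = [True, True, False, True, False]  # 60% approved
group_b = [True, True, True, True, False]   # 80% approved

ratio = disparate_impact(group_a, group_b)
print(round(ratio, 2))  # 0.75 -- below 0.8, so the system warrants review
```

A check like this does not prove fairness on its own, but making such numbers visible is one practical step towards the algorithmic transparency the paragraph above calls for.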