Former FCA exec says banking tech could lead to discrimination
Advances in banking technology are putting vulnerable customers at risk of discrimination, according to a former City regulator, as reported by The Guardian.
Mick McAteer, a former board member at the Financial Conduct Authority (FCA), told the publication that lenders and insurers were gaining access to tools to more accurately identify “unprofitable” or costly customers, increasing the risk of exclusion for certain sections of society.
The issue gained fresh attention after an algorithm used to set credit limits for the new Apple Card sparked claims of gender discrimination. The tech entrepreneur David Heinemeier Hansson said he had been offered a credit limit 20 times higher than his wife’s.
The Apple co-founder Steve Wozniak said that he and his wife, who share all their assets, faced similar problems with the Goldman Sachs-run algorithm, which is now under formal investigation by a US regulator.
“Everyone’s getting very excited about fintech and open banking and all this new technology, but the fact is … this actually risks more people being excluded from the financial system,” said McAteer.
“Technology and the use of big data actually allows banks – and insurance companies, for that matter – to identify more precisely which customers are profitable, which customers are not, [and] which customers cost more to maintain.”
Financial firms already differentiate between customers to some degree, offering different rates on products such as credit cards and mortgages, and McAteer believes more precise profiling could amplify the bias in that process.
“A lot of the business models are built around standard males with average earnings, with certain predictable patterns of earnings growth. There’s a risk that those that exhibit different behaviour end up being discriminated against,” said McAteer, who now runs the Financial Inclusion Centre thinktank.
In the UK, firms are not allowed to segment customers by gender, race or physical ability, but technology has made it easier to identify such customers through analysis of data such as spending habits and income.
“The whole point of this tech is to allow very, very precise profiling of people’s behaviour. Some firms will use that for good purposes, others won’t,” he added.
Last year, the FCA raised concerns that certain customers were being racially profiled by insurers who were buying extra data to set their prices.
Some firms decided to scrap the use of that data after the regulator said it contained information that could identify race or ethnicity. But others defended their data mining as a “proportionate means of achieving a legitimate aim” around pricing.
“We’re very concerned about the downsides of the growth and use of technology and big data,” added McAteer. “We think it will lead to more exclusion unless regulators constrain the way that it’s used.”
Charlotte Crosswell, the chief executive of the UK’s fintech lobby group Innovate Finance, said diverse hiring practices would help address discrimination by financial technologies. “In the race to leverage artificial intelligence, we must ensure that algorithms are set without bias of offerings to consumers based on gender and social background. Diversity is therefore imperative at all levels of all companies, from those who work in engineering through customer-facing roles and in the leadership team.”
Responding to the Apple Card controversy, Goldman Sachs said all its credit applications were evaluated on income and creditworthiness factors such as personal credit scores and debt levels. “Based on these factors, it is possible for two family members to receive significantly different credit decisions. In all cases, we have not and will not make decisions based on factors like gender.”