
A client's loan denial letter made me question the whole 'unbiased algorithm' thing

I was helping a couple look at houses in Akron last fall, and they were perfect on paper. Good jobs, solid down payment, the works. Their loan got flagged by the bank's automated system for 'high risk', with no real reason given. The wife showed me the letter, and she said, 'It feels like the computer just looked at our zip code and said no.' That stuck with me. I've seen a dozen similar cases since. If the system is trained on old, biased lending data, it's just going to keep making the same bad calls forever. Has anyone else seen a clear case where a 'fair' algorithm seemed to repeat a human bias?
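To make the feedback-loop worry concrete, here's a minimal sketch with entirely made-up numbers (the zip codes and incomes are hypothetical, not from any real bank). A naive "model" that just memorizes historical approval rates per zip code will reproduce past redlining no matter how qualified a new applicant is:

```python
from collections import defaultdict

# Hypothetical historical decisions: zip 44320 was systematically denied.
history = [
    {"zip": "44303", "income": 90, "approved": True},
    {"zip": "44303", "income": 60, "approved": True},
    {"zip": "44303", "income": 40, "approved": False},
    {"zip": "44320", "income": 95, "approved": False},  # biased denial
    {"zip": "44320", "income": 70, "approved": False},  # biased denial
    {"zip": "44320", "income": 50, "approved": False},
]

def train(records):
    """Learn per-zip approval rates from past decisions."""
    stats = defaultdict(lambda: [0, 0])  # zip -> [approvals, total]
    for r in records:
        stats[r["zip"]][0] += r["approved"]
        stats[r["zip"]][1] += 1
    return {z: a / n for z, (a, n) in stats.items()}

def predict(model, applicant):
    """Approve only if the applicant's zip was historically approved."""
    return model.get(applicant["zip"], 0.0) > 0.5

model = train(history)

# A strong applicant from the redlined zip is still denied,
# because the training labels already encoded the bias.
strong_applicant = {"zip": "44320", "income": 120}
print(predict(model, strong_applicant))  # False
```

Real underwriting models are far more complex, but the mechanism is the same: if the historical labels encode a bias, a model scored purely on matching those labels will learn it.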
3 comments

shanef34 · 26d ago
How is that different from a human loan officer?
-1
lewis.mila · 26d ago
I used to think the same, but then I saw one of these systems make a call based on data a human reviewer would have missed.
3
kim_ramirez3
Exactly, it's not just about seeing the data, it's about seeing ALL of it at once and connecting weird dots. A person might look at a few big things like income and credit score. The system can check thousands of tiny data points in a second, like how you fill out a form or payment timing on old bills. It finds patterns humans would NEVER spot. That can actually be fairer sometimes, because it ignores things like how someone looks or sounds. But it's also scary because we don't always know WHY it made the call.
6