A client's loan denial letter made me question the whole 'unbiased algorithm' thing
I was helping a couple look at houses in Akron last fall, and they were perfect on paper: good jobs, solid down payment, the works. Their loan still got flagged by the bank's automated system as 'high risk', with no real reason given. The wife showed me the letter and said, 'It feels like the computer just looked at our zip code and said no.' That stuck with me, and I've seen a dozen similar cases since.

Here's my worry: if the system is trained on old, biased lending data, it's just going to keep making the same bad calls forever. Has anyone else seen a clear case where a 'fair' algorithm seemed to repeat a human bias?
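To make my worry concrete, here's a toy sketch. This is synthetic data with made-up zip codes and a deliberately naive 'model' — nothing like a real underwriting system — but it shows the mechanism I mean: if you fit anything to biased historical approvals, two applicants with identical finances can get opposite answers based on zip code alone.

```python
# Toy illustration of bias inherited from training data.
# The historical records below are invented: past human decisions
# denied applicants from zip "44307" regardless of credit score.

# (zip_code, credit_score, approved) -- all values hypothetical
history = [
    ("44333", 700, True),  ("44333", 650, True),  ("44333", 620, True),
    ("44307", 700, False), ("44307", 720, False), ("44307", 650, False),
]

def train_rate_model(records):
    """'Learn' an approval rate per zip from past human decisions."""
    counts = {}
    for zip_code, _score, approved in records:
        ok, total = counts.get(zip_code, (0, 0))
        counts[zip_code] = (ok + int(approved), total + 1)
    return {z: ok / total for z, (ok, total) in counts.items()}

model = train_rate_model(history)

# Two applicants, same credit score, different zip codes:
print(model["44333"])  # 1.0 -> approved
print(model["44307"])  # 0.0 -> denied, purely from biased history
```

The model never sees race, income, or intent — it just faithfully replays whatever pattern was in the old decisions, which is exactly what I suspect happened to my clients.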