AI-powered inequity and discrimination
In a revelation that left many questioning the fairness of cutting-edge technology, the credit limit algorithm13 behind the prestigious Apple Card faced accusations of gender bias. The supposedly sophisticated AI-driven system, designed to streamline the credit experience for millions of users, came under scrutiny after users reported that women were receiving significantly lower credit limits than male counterparts with strikingly similar financial backgrounds. The disparity sparked a public outcry and fierce debate, leaving many to wonder whether even the giants of technology could succumb to the antiquated prejudices plaguing society.
A controversial Stanford University study by Wang and Kosinski, "Deep neural networks are more accurate than humans at detecting sexual orientation from facial images,"14 which the press dubbed an artificially intelligent "gaydar," claimed that widely used facial...