Twitter scraps image cropping algorithm after finding racial, gender bias

Twitter is scrapping its AI-driven image cropping algorithm after the microblogging site found problematic biases in how it worked. The study, conducted by three of its machine learning researchers, found that the AI-powered algorithm gave a slight preference to white individuals over Black individuals in images, and to male-presenting images over female-presenting images. The algorithm was used to automatically decide which portion of a photo shared on the social network was best suited for the screen viewing it.
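Twitter has described the cropper as a saliency model that predicts where people are likely to look and centres the crop there. The sketch below is a minimal illustration of that general idea, not Twitter's actual system; the function name and the assumption that a per-pixel saliency map is already available are illustrative.

```python
import numpy as np

def crop_to_salient_region(image: np.ndarray, saliency: np.ndarray,
                           crop_h: int, crop_w: int) -> np.ndarray:
    """Illustrative saliency-based auto-crop (not Twitter's model).

    `saliency` is assumed to be a per-pixel score map from some
    saliency predictor, the same shape as the image's height/width.
    """
    # Find the pixel the saliency model scores highest.
    y, x = np.unravel_index(np.argmax(saliency), saliency.shape)

    # Centre a fixed-size crop window on that point, clamped to the image.
    h, w = image.shape[:2]
    top = min(max(y - crop_h // 2, 0), max(h - crop_h, 0))
    left = min(max(x - crop_w // 2, 0), max(w - crop_w, 0))
    return image[top:top + crop_h, left:left + crop_w]
```

Because the crop follows whichever region scores highest, any systematic skew in those scores translates directly into which faces get shown.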

It found an 8% difference from demographic parity in favor of women, and a 4% difference in favor of white individuals, news agency Reuters reported. Objectification biases were also discovered in the study. The paper cited several possible reasons, including a preference for high-contrast images and issues with image backgrounds and eye color, but said none were an excuse.
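The 8% and 4% figures describe how far the cropper's choices drifted from demographic parity, i.e. the 50/50 split expected if group membership played no role in which image was favoured. A minimal sketch of that kind of calculation, assuming paired trials in which one image from each group competes for the crop (the function name and data layout are illustrative, not taken from the paper):

```python
def demographic_parity_difference(crop_choices: list[str]) -> float:
    """Difference from demographic parity over paired crop trials.

    Illustrative only: assumes each trial pairs one image from group "A"
    with one from group "B" and records which group the cropper favoured.
    Parity means each group is favoured in 50% of trials; the return value
    is how far group A's share deviates from that 0.5 baseline.
    """
    favoured_a = sum(1 for choice in crop_choices if choice == "A")
    return favoured_a / len(crop_choices) - 0.5

# Example: group A favoured in 58 of 100 paired trials -> 8% above parity.
trials = ["A"] * 58 + ["B"] * 42
print(f"{demographic_parity_difference(trials):+.0%}")  # +8%
```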

“Machine learning based cropping is fundamentally flawed because it removes user agency and restricts user’s expression of their own identity and values, instead imposing a normative gaze about which part of the image is considered the most interesting,” the researchers wrote.

To address the issue, the microblogging site recently started showing standard aspect ratio photos in full – without any crop – on its mobile apps and is trying to expand that effort. The findings also highlight the disparate impact of artificial intelligence systems, including demographic biases identified in facial recognition and text analysis, the paper said.

Last year, the company was in the eye of the storm after several users pointed out racial bias in its picture-cropping algorithm.


Users noticed that when two photos – one of a Black face, the other of a white one – were in the same post, Twitter often showed only the white face on mobile.

Subsequently, the company apologised for the algorithmic flaw and launched an investigation.

In a statement, a Twitter spokesperson admitted the company had work to do. “Our team did test for bias before shipping the model and did not find evidence of racial or gender bias in our testing. But it’s clear from these examples that we’ve got more analysis to do. We’ll continue to share what we learn, what actions we take, and will open source our analysis so others can review and replicate.”

