Dr. Pragya Agarwal
Talking about face recognition tech & bias, today I was unable to take a picture in a photo kiosk. I was clearly not smiling, nor was my mouth open. But 18 tries in a row, even when I tried to purse my lips into a scowl, as here, it did not meet their biometric regulations.
Rich McEachran Dec 23
Replying to @DrPragyaAgarwal
As someone with a facial difference, AI and facial recognition tech are not my friend either. I recently pitched a feature on this to a few editors, but none of them bit.
Dr. Pragya Agarwal Dec 23
Replying to @richmceachran
these algorithms and tech are trained on some ‘standard’ parameters & reflect society’s biases about what is considered the norm. I wrote something for Prospect magazine recently & also cover more in my upcoming book, but haven’t been able to place other articles either...
Sarah Hart Dec 23
has a long story similar to this... Am on bedtime duty now but will look for the thread later
Dr. Pragya Agarwal Dec 23
Pls do share. Am on bedtime duty soon. I wrote something about the gender bias in facial recognition tech
d0nkey_k0ng Dec 26
Replying to @DrPragyaAgarwal
Bias isn’t something that’s manually programmed. It comes entirely from the data and the features the model’s been trained on. A dataset where some racial groups are barely represented, bad model tuning and bad testing can all be causes.
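[A minimal sketch of the point in the tweet above, in Python; the group labels, predictions, and numbers are purely illustrative and not from the thread. It shows why disaggregated testing matters: an overall accuracy score can look acceptable while a group that is underrepresented in the data does far worse.]

# Sketch of "disaggregated" evaluation: check how the data is distributed
# across demographic groups and how accuracy differs between them, rather
# than reporting a single aggregate number. All labels here are made up.
from collections import Counter

def group_distribution(groups):
    """Share of examples contributed by each demographic group."""
    counts = Counter(groups)
    total = sum(counts.values())
    return {g: n / total for g, n in counts.items()}

def per_group_accuracy(y_true, y_pred, groups):
    """Accuracy computed separately for each group, so gaps are visible."""
    correct = Counter()
    seen = Counter()
    for t, p, g in zip(y_true, y_pred, groups):
        seen[g] += 1
        correct[g] += int(t == p)
    return {g: correct[g] / seen[g] for g in seen}

# Illustrative toy labels: group "B" is underrepresented, and the model
# does far worse on it, the kind of gap an aggregate score hides.
groups = ["A"] * 8 + ["B"] * 2
y_true = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]
y_pred = [1, 0, 1, 1, 0, 1, 0, 1, 0, 1]

print(group_distribution(groups))                  # {'A': 0.8, 'B': 0.2}
print(per_group_accuracy(y_true, y_pred, groups))  # {'A': 1.0, 'B': 0.0}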
Dr. Pragya Agarwal Dec 26
Replying to @RaghavKukreti
Yes that is often the case due to the bias in the sample data, and lack of diversity in it. I cover it in more detail in my upcoming book (pinned tweet)
Amr Abdelwahab(عمرو) Dec 24
Replying to @DrPragyaAgarwal
I wanna make a case study of how these technologies deal with my curly hair hahaha
Dr. Pragya Agarwal Dec 24
Replying to @amrAbdelwahab
Have you had any issues with your hair?
damien venditti Dec 23
Replying to @DrPragyaAgarwal
Wtf?
Jonathan Lambert Dec 23
Replying to @DrPragyaAgarwal
Women with darker skin tones, and anyone who doesn’t conform to mainstream learning models, aren’t exactly an edge condition: they’re humanity. Many underestimate the cost of making these technologies work inclusively, and how much of that cost is borne by specific groups & individuals.