Joy Buolamwini introduces herself as a "poet of code" on a mission to combat "the coded gaze," a term she uses to describe algorithmic bias.
Algorithmic bias, like human bias, results in unfairness, but it can spread rapidly and at massive scale, producing exclusionary experiences and discriminatory practices.
Personal Experiences with Algorithmic Bias
Joy encountered algorithmic bias while developing the Aspire Mirror project, which involved using facial recognition software.
The software failed to detect her face unless she wore a white mask.
An earlier project at Georgia Tech had exposed the same problem: the social robot she was working with could not detect her face.
Years later, a social robot demo in Hong Kong, built on the same facial recognition software, again failed to detect her face.
How Algorithmic Bias Occurs
Computer vision systems learn to detect faces through machine learning: a model is trained on sets of example face images.
If those training sets lack diversity, the resulting system may fail to detect faces that deviate too far from the examples it was trained on.
Training sets can be deliberately constructed, so there is an opportunity to build full-spectrum training sets that reflect a richer portrait of humanity.
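To make the mechanism concrete, here is a minimal, hypothetical sketch (not code from the talk): a toy "detector" learns only the average of its training faces, and because the training set is skewed toward one group, it misses faces from the under-represented group far more often. The group centers, threshold, and synthetic data are all invented for illustration.

```python
# Toy illustration: how an unbalanced training set can yield a face detector
# that works well for one group and poorly for another. "Faces" are mocked as
# 2-D feature vectors; the detector learns a prototype (mean of training faces)
# and accepts anything within a fixed distance of that prototype.
import numpy as np

rng = np.random.default_rng(0)

def sample_faces(center, n):
    """Draw n synthetic 'face embeddings' clustered around a group-specific center."""
    return rng.normal(loc=center, scale=0.5, size=(n, 2))

CENTER_A = np.array([1.0, 1.0])    # demographic group A (illustrative)
CENTER_B = np.array([-1.0, -1.0])  # demographic group B (illustrative)

# Skewed training set: 95% group A, 5% group B.
train = np.vstack([sample_faces(CENTER_A, 950), sample_faces(CENTER_B, 50)])

prototype = train.mean(axis=0)  # what the detector considers a "typical" face
threshold = 1.5                 # accept faces within this distance of the prototype

def detection_rate(faces):
    dists = np.linalg.norm(faces - prototype, axis=1)
    return (dists < threshold).mean()

# Balanced test set: the detector must find faces from both groups equally.
print("Group A detection rate:", detection_rate(sample_faces(CENTER_A, 1000)))
print("Group B detection rate:", detection_rate(sample_faces(CENTER_B, 1000)))
# Typical output: group A is detected almost always, group B far less often,
# even though nothing in the detector explicitly mentions group membership.
```

The point of the toy example is that the disparity emerges purely from what the training data contains; no rule was written that treats one group differently.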
Consequences of Algorithmic Bias
Facial recognition is increasingly used by police departments in the US.
Georgetown Law reported that one in two US adults has their face in a law-enforcement facial recognition network, and these networks remain unregulated.
Misidentification by a police facial recognition system can implicate someone in a crime, a far more serious consequence than a mistagged photo on a platform like Facebook.
Algorithms extend beyond facial recognition into areas like hiring, loans, insurance, college admissions, and pricing.
Machine learning is also used in predictive policing and in risk scores that inform judicial decisions.
Solutions to Combat Algorithmic Bias
Importance of "incoding" movement: creating more inclusive code.
Three tenets of incoding:
Who codes matters: assembling full-spectrum teams of diverse individuals who can check one another's blind spots.
How we code matters: factoring fairness in as systems are developed (a minimal audit sketch follows this list).
Why we code matters: Making social change a priority.
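As a concrete, hypothetical illustration of the "how we code matters" tenet, the sketch below audits a model's accuracy per demographic group before release. The function name, record format, and gap threshold are assumptions made here for illustration, not part of any tooling described in the talk.

```python
# Hypothetical per-group audit: before shipping a model, compare its error
# rates across demographic groups and flag large gaps. Field names and the
# acceptable-gap threshold are illustrative assumptions.
from collections import defaultdict

def audit_by_group(records, max_gap=0.05):
    """records: iterable of dicts with keys 'group', 'label', 'prediction'.
    Returns per-group accuracy, the worst accuracy gap, and a pass/fail flag."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for r in records:
        total[r["group"]] += 1
        correct[r["group"]] += int(r["prediction"] == r["label"])
    accuracy = {g: correct[g] / total[g] for g in total}
    gap = max(accuracy.values()) - min(accuracy.values())
    return accuracy, gap, gap <= max_gap

# Example: a detector that misses many more faces from group "B".
records = (
    [{"group": "A", "label": 1, "prediction": 1}] * 95
    + [{"group": "A", "label": 1, "prediction": 0}] * 5
    + [{"group": "B", "label": 1, "prediction": 1}] * 60
    + [{"group": "B", "label": 1, "prediction": 0}] * 40
)
accuracy, gap, passed = audit_by_group(records)
print(accuracy, f"gap={gap:.2f}", "PASS" if passed else "FAIL: investigate before release")
```

An audit like this, run on a balanced held-out evaluation set as a release gate, is one practical way to factor fairness into how systems are developed.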
Initiatives and Call to Action
Joy launched the Algorithmic Justice League, an initiative where anyone who cares about fairness can help fight the coded gaze.
Invitation to join the incoding movement to build inclusive technology.
Call to report bias, request audits, and join the conversation using #codedgaze.
Encouragement to help build inclusive training sets, for example through the "Selfies for Inclusion" campaign.
Conclusion
Joy concludes with a call to action for creating a world where technology benefits everyone and promotes inclusivity.
Ends with an invitation to join her in the fight against the coded gaze.