How bias is coded into our technology
By Faith Foushee
Digital Media Editor
“Coded Bias” is a documentary film about Joy Buolamwini, a black female student at the Massachusetts Institute of Technology. In an MIT Media Lab course, she was building a mirror that would overlay inspiring graphics on the user’s reflection, such as a lion’s face over her own. The mirror relied on artificial intelligence algorithms for facial recognition.
At first, the AI could not correctly detect her face unless she wore a white mask. Buolamwini considered potential issues with lighting and camera angles but concluded that the algorithm itself carried racial and gender bias.
The algorithmic issue sparked Buolamwini’s interest in uncovering whatever bias exists against women and people of color. She learned that AI began in the math department at Dartmouth College, and that the group that worked on the project was made up exclusively of white males. As a result, the algorithms were built to best detect the people who created them.
Facial recognition is widely used these days. I use facial recognition without thinking about it every time I use a filter on Snapchat. I have never thought that the information Snapchat received from recognizing my face could be used in other ways.
And it doesn’t end with Snapchat; many people with the latest iPhones use facial recognition to unlock their smartphones.
“It’s kind of scary to think that every time I use facial recognition to unlock my phone or log in to an app, they are getting information,” said Missy Marazzo, an HPU sophomore and communication fellow. “It seems like this is a hot topic, and more and more information is coming out. Maybe, we’ll soon have answers to how this data is being utilized.”
Throughout the documentary, Buolamwini examined the use of AI and facial recognition in a variety of settings. She found examples in the United Kingdom and China, where police were using AI to identify criminals on the street.
She found that the algorithms were biased against anyone who was not a white male and produced many inaccurate results, labeling individuals as criminals or as likely to commit a crime. Labeling people as a high risk to the community based on their skin color or social status is unfair. The technology shouldn’t be permitted for use until this issue is resolved.
Buolamwini also found that many businesses, including IBM, Amazon and Apple, use AI in their hiring processes, and that these algorithms show gender bias. In one example, Amazon initially used AI to sift through applications, and the algorithm, prone to gender bias, automatically discarded applications from people who listed women’s colleges, women’s sports and similar activities on their resumes.
While Buolamwini’s work has brought the issue to the attention of many of the employers whose tools showed this gender bias, there are likely many more companies that aren’t aware the bias is happening. Does this mean that, by identifying myself as female in the demographics section of a job application, I am potentially setting myself up for rejection?
Job applications often include demographic questions, sometimes listed as optional. I sometimes hesitate at those questions, out of fear that the employer will judge me by my selected gender or race rather than my qualifications. I still share the information because I presume they’d know I am female based on my name; however, this documentary leaves me unsure.
Buolamwini’s determination throughout the film was inspiring. When she first noticed the bias, she started digging instead of complaining, and when she found answers, she took her findings to companies like IBM to make a difference. Her persistence led these companies to stop using the AI technology in an effort to reduce racial and gender bias in their hiring processes.
She also noted that, because she is a woman of color, people will try to discredit and defund her work. Women in technology are often underestimated because they are not part of the majority, but her determination and passion drive her to push through such challenges. Your race or gender should not define your success. Buolamwini’s work is important and should be recognized.
Information is constantly being gathered through AI and facial recognition, and we do not know how it is being used or to what extent it can be used. Even now, there are no regulations on these algorithms, so there are no limitations. Buolamwini emphasized that we need a regulatory body that ensures algorithms work for all of society without discriminating.
More and more types and styles of AI will continue to enter and transform our world. What will people in power do with this information? I guess we will have to wait and see.