I recently started thinking about racial bias in cell phone technology. This isn’t something I would have considered, insulated as I am by white privilege. But during the Super Bowl, there was a commercial for the Google Pixel 6. One of the advertised advantages of its camera is a technology, trademarked “Real Tone,” that is designed to accurately capture the skin tones of people of color.
Once I saw the commercial, I had to learn more about racial bias in cell phone technology. Google has an excellent YouTube video about the need for Real Tone and how the team developed it. I was impressed with the thoughtfulness they brought to the issue.
Skin tone in pictures has always been a problem for people of color. In 1954, Kodak produced the Shirley Card so that small, independent film labs could make sure they were getting the colors right when they developed photos. Shirley was a white Kodak employee. For the next twenty years, color accuracy in photo development was calibrated against a white woman. In the 1970s, Kodak recognized the need to test color accuracy against more diverse skin tones and began producing Shirley Cards with white, Black, and Asian models.
Facial recognition systems, however, continue this racial bias. In a 2019 National Institute of Standards and Technology report, researchers studied 189 facial recognition algorithms. They found that the algorithms falsely matched Black and Asian faces 10 to 100 times more often than white faces. The algorithms also falsely matched women more often than men, making Black women particularly vulnerable to algorithmic bias. Algorithms tested on U.S. law enforcement images falsely matched Native Americans more often than people from any other demographic.
These algorithmic biases have major real-life consequences. Law enforcement agencies at several levels use facial recognition to support policing and airport screening, and the technology sometimes influences who receives housing or employment offers. An analyst at the American Civil Liberties Union reportedly warned that false matches “can lead to missed flights, lengthy interrogations, watch list placements, tense police encounters, false arrests, or worse.” There are many documented cases of people being wrongfully charged with a crime after a facial recognition system falsely identified them.
Racial bias in cell phone technology might seem a lesser problem, but technologies overlap. What if a self-driving car “sees” a white person crossing the road but doesn’t “see” a Black person? And the very idea that a cell phone camera will take lousy pictures of a person based solely on skin tone bothers me.
This idea became more real to me when I looked at the pictures I took at the Fort Frederica African American Festival. The Gullah Geechee Ring Shouters’ faces disappear in one picture, while the faces of the white people behind them are fine. In the shadows you can see the white faces, because the camera compensated for them, but you can barely see the faces of the Black people. This especially bothered me in the picture I took of artist S.A. Hunter: you can’t see her face at all, while the face of the white man in front of her is perfectly clear. That is plainly wrong and unfair to people who are not white.
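Why does a camera do this? A big part of the answer is exposure metering: the camera picks one exposure for the whole frame, and a simple average-based meter will favor whatever dominates the scene. Here is a toy sketch of that mechanism. To be clear, this is my own illustration, not Google’s or Apple’s actual pipeline, and every number in it is invented.

```python
import numpy as np

# Toy model of average-luminance auto-exposure. Real cameras adjust
# shutter speed, aperture, and ISO with far more sophisticated metering
# and tone mapping; a single digital gain stands in for all of that here.
# Luminance values are linear, in [0, 1]. All numbers are made up.

TARGET_MEAN = 0.18  # the classic 18%-gray exposure target

def auto_exposure_gain(scene: np.ndarray) -> float:
    """Gain that brings the scene's average luminance to the target."""
    return TARGET_MEAN / scene.mean()

# A scene dominated by bright content (sky, light clothing, lighter
# faces), with a smaller region of darker skin sitting in shadow.
bright_region = np.full(900, 0.45)  # 90% of pixels, fairly bright
darker_face   = np.full(100, 0.05)  # 10% of pixels, dark skin in shadow
scene = np.concatenate([bright_region, darker_face])

gain = auto_exposure_gain(scene)
exposed = np.clip(scene * gain, 0.0, 1.0)

print(f"gain applied:        {gain:.2f}x")
print(f"bright region after: {exposed[:900].mean():.3f}")
print(f"darker face after:   {exposed[900:].mean():.3f}  (still near black)")
```

In this toy scene the meter exposes for the bright majority of pixels, and the darker face stays crushed near black, which is roughly what happened in my festival photos. Real Tone, as Google describes it, tunes steps like these (exposure, white balance, tone mapping) using much more diverse imagery.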
I applaud Google for taking this problem seriously with the Pixel 6 and trying to get it right. I also appreciate that their commercial pushed me to face my own biases and think about the problem. My old iPhone still has this bias built into its camera, and Apple’s “True Tone” technology, which adjusts the display’s white balance to ambient light, does not address the camera problem at all. When I am ready to upgrade, I will certainly consider how people of all colors are portrayed by the phone’s camera.