Posted on August 28, 2023 in Blog Posts
If you took a picture of a gorilla at the zoo and uploaded it to your Google Photos account, you may not be able to find it by searching. In May 2023, the New York Times reported that in photo apps run by Google, Amazon, Microsoft, and Apple, searching for “gorilla” or other terms for primates may bring back no results, even when photos of primates are present in the apps’ collections (Grant & Hill, 2023). This is because of algorithmic bias.
Algorithmic bias is not a new concept. Much has been written about it, and the news has reported plenty of stories about it. In 2015, Google Photos’ auto-tagging feature mischaracterized selfies of two Black people as gorillas (Wachter-Boettcher, 129). This terminology, of course, has a long, racist history when used to describe Black people and demonstrates why “gorilla” is not a searchable term today. Rather than fix the underlying image recognition, Google appears to have simply blocked results for the term to prevent the issue from ever repeating.
In response to this incident, Google’s Yonatan Zunger, chief social architect at the time, said on Twitter that “we’re working on longer term fixes around … image recognition itself (e.g. better recognition of dark-skinned faces.).” Did the training data fed to Google Photos fail to include enough pictures of Black people and other people of color? If so, this is a significant lapse in judgment and demonstrates the false assumption Sara Wachter-Boettcher uncovers in Technically Wrong: “that the data [technologists] have is neutral, and that anything at the edges can be written off” (141). Data is not neutral, and the programs we build on biased data reflect a biased reality.
This is not an example of a glitch. The racist auto-tagging in Google Photos instead makes clear that the cultural norms of programmers are being coded into technical systems. Glitches are, Ruha Benjamin writes in Race After Technology, “a form of exclusion and subordination built into the ways in which priorities are established and solutions defined in the tech industry … [G]litches are not spurious, but rather a kind of signal of how the system operates” (79). By 2022, Google was promoting a feature of its Pixel 6 phones called Real Tone, which represents “nuances of different skin tones for all people beautifully and authentically” (Google, n.d.). Eight years have passed, and Google still has not fixed the image recognition problem in Google Photos, yet somehow its camera app is calibrated differently: well enough, at least, to sell phones.
In Masked by Trust, Matthew Reidsma describes the authority of algorithms to make everyday decisions as “our increasing faith that the algorithm will deliver a more objective decision than a human could, that somehow the algorithm eliminates the human biases that we often see coloring our decisions” (21). Google relies on this faith and exploits it in “How Search Works,” the documentation of its search engine linked at the bottom of google.com. The company is aware that problematic content can appear. As of April 2022, the documentation read, “[O]ccasionally results may contain content that some find objectionable or offensive.” Note that Google does not say it serves results that may include biased or even factually incorrect information, just that some results may be objectionable or offensive. According to the documentation, policy-violating content is addressed first by improving automated systems; in some cases, human beings may take manual action to block content, but they will not rearrange ranked results. Google is careful to distance itself from problematic content while also emphasizing the objectivity of its automated systems.
If we know algorithms are biased and sometimes even untrustworthy, we can work toward a solution. We must recognize the limitations of algorithms, learn to approach their results with skepticism, and share that knowledge with our patrons and users. It’s not just a matter of improving diversity in Silicon Valley. As Safiya Umoja Noble says in Algorithms of Oppression, “[W]e have automated human decision making and then disavowed our responsibility for it” (181). Too often, DEI initiatives put the onus on marginalized people to learn to code, become programmers, or step into leadership roles in the face of systemic racism at the very companies that promise diverse hiring practices.
Algorithmic bias is a multilayered problem that needs many different solutions working together. Wachter-Boettcher writes, “[I]f technology has the power to connect the world, as technologists so often proclaim, then it also has the power to make the world a more inclusive place, simply by building interfaces that reflect all its users” (197).
Benjamin, R. (2019). Race After Technology: Abolitionist Tools for the New Jim Code. Polity Press.
Google. (n.d.). How Search Works. Retrieved April 5, 2022, from https://www.google.com/search/howsearchworks/our-approach/maximize-access/
Google. (n.d.). Take Authentic, Accurate Portraits with Real Tone. Retrieved April 6, 2022, from https://store.google.com/product/pixel_6?hl=en-US#p6-overview-camera
Grant, N., & Hill, K. (2023, May 22). Google’s Photo App Still Can’t Find Gorillas. And Neither Can Apple’s. The New York Times. https://www.nytimes.com/2023/05/22/technology/ai-photo-labels-google-apple.html
Noble, S. U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press.
Reidsma, M. (2019). Masked by Trust: Bias in Library Discovery. Litwin Books.
Wachter-Boettcher, S. (2018). Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech. W.W. Norton & Company.