IP3 Algorithms: Tipping the Scales in Whose Favour?
Updated: Oct 23
Due to a number of factors, Google Search results are considered neutral and trustworthy by most users (Noble, 2018). However, they are far from neutral: as Safiya Umoja Noble argues in her 2018 book Algorithms of Oppression, they exacerbate historical and existing inequalities experienced by marginalized groups through what Noble calls technological redlining and algorithmic oppression.
To understand why it's problematic to use Google Search results without a critical eye, it helps to first understand content prioritization. Content prioritization refers to the systematic ranking of materials according to a value system established by whoever created the code. There may be layers of value systems working on top of each other as previous models are folded into newer ones. Materials can include almost anything: songs, images, search results, and web pages. Examples of content prioritization in action include:
Google search results, including website listings, retail suggestions, and Google Maps options
Spotify song and podcast recommendations
Instagram content feeds
Content prioritization algorithms use a combination of past user behaviour, user demographics, and content tags to statistically predict an outcome, such as the likelihood of user engagement or financial gain through either user purchases or advertising spending. The system then displays search results or populates a content feed or dashboard based on what it has prioritized.
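As a simplified illustration of the process described above, a prioritization step can be sketched as a weighted scoring function followed by a sort. This is a toy model, not any platform's actual code; the signal names and weights are invented, and real systems use far more complex, proprietary models.

```python
# Toy sketch of content prioritization (illustrative only).
# Each item carries predicted signals; the "value system" is
# encoded entirely in the weights chosen by whoever wrote the code.

def score(item, weights):
    """Weighted sum of an item's signals (e.g. predicted user
    engagement and expected ad revenue)."""
    return sum(weights[signal] * item[signal] for signal in weights)

def prioritize(items, weights):
    """Rank items by score, highest first."""
    return sorted(items, key=lambda item: score(item, weights), reverse=True)

results = [
    {"name": "independent blog",  "engagement": 0.9, "ad_revenue": 0.1},
    {"name": "sponsored listing", "engagement": 0.4, "ad_revenue": 0.9},
]

# A commercially weighted value system surfaces the advertiser first...
commercial = prioritize(results, {"engagement": 0.3, "ad_revenue": 0.7})
# ...while an engagement-weighted one surfaces the blog first.
editorial = prioritize(results, {"engagement": 0.7, "ad_revenue": 0.3})
```

The point of the sketch is that nothing in the ranked output reveals which weights produced it: the same two pages can appear in either order depending on a value system the user never sees.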
The comic below (created in Pixton) is a re-imagining of an experience I had searching for baby sleep support after my first child was born. I was struck by how all the top results seemed to repeat each other, and none of them reflected what I knew as an early childhood educator who had studied early childhood development at the post-secondary level. I had to seek out reputable sources elsewhere and, even though I've accessed certain sites frequently, I still have to type their URLs in directly. Considering the sleep training industry is worth hundreds of millions of dollars per year in the US alone, this isn't surprising.
Unlike in the comic above, the content prioritization process is invisible to users, which creates a propensity for users to trust that the results are neutral (Noble, 2018).
The impact of commercialized search results cuts deeper than mere frustration. Extreme views are shared freely online, making them profitable and normalizing their content (Noble, 2018). Noble describes search results as "an intersection of popular and commercial interests" (Noble, 2018). With this in mind, the algorithms that power Google Search and similar tools favour those already in power, both in cultural narrative and in financial gain (Crawford, 2021; Noble, 2018).
Additionally, content prioritization lacks context and human nuance in decision-making (Crawford, 2021; Noble, 2018). Because of this, aspects invisible or unimportant to power-holders can be invisible, or rank low, in the algorithms those power-holders design. Noble's Algorithms of Oppression (2018) includes the story of a Black hair salon owner who was consistently downgraded in Yelp search results because she declined to pay Yelp for advertising. Despite being the only Black salon in the area, having positive reviews, and holding a long-standing reputation, her business was all but invisible. The platform directed traffic to other businesses even when hers could be found. According to Yelp, her salon was almost irrelevant; in reality, her salon, now damaged by Yelp's content prioritization, had been a valued small business in the community.
The impacts of algorithms are, of course, present in my professional life as well. As an instructional designer who regularly works with a variety of clients on different topics,
I use Google Search or Google Scholar for a preliminary introduction to businesses or concepts and to complement subject matter expert knowledge.
It's increasingly common for IDs to use ChatGPT to write an initial draft of content and then edit with a human touch, which only protects against issues if the human doing the editing knows enough to catch missing or downgraded information.
I search for stock images, videos, music, and narrators, relying on recommendation systems to search faster.
I am also starting to be involved in the design of dashboards and AI-generated feeds or recommendation systems. This IP, as well as IP2 on AI, has helped me develop a more critical perspective on these design practices. In particular, Noble's suggestion of a colour-wheel-style search has been very impactful.
Personally, PageRank affects my life every day. I use Google Search to shop, find information, get directions, pick businesses and destinations, and learn about concepts and world news, multiple times per day. I can easily say it is the portal through which I access almost all of the Internet content I interact with daily. While I cannot control the capitalist pressures on PageRank (apart from advocating and voting for better regulation), I can influence it to some degree through what I search for and what information I agree to share. However, as Noble (2018) notes, the burden is often placed on users to improve search results when in reality they don't have the ability to change systemic issues. What I can do is review Google Search results (and other algorithmic curations) with a critical eye, aware of the other priorities at work.
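For readers curious what PageRank actually computes, the textbook version of the algorithm fits in a few lines. This is a minimal sketch of the published power-iteration formulation, not Google's production system (which layers commercial signals, personalization, and much else on top), and the three-page link graph is invented for illustration.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Textbook power-iteration PageRank.

    links: dict mapping each page to the list of pages it links to.
    Returns a dict of page -> rank, where ranks sum to 1.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1 / n for p in pages}  # start with rank spread evenly
    for _ in range(iterations):
        # Every page keeps a small baseline (the "random surfer" jump)...
        new = {p: (1 - damping) / n for p in pages}
        # ...and passes the rest of its rank along its outbound links.
        for page, outs in links.items():
            if not outs:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[page] / n
            else:
                for q in outs:
                    new[q] += damping * rank[page] / len(outs)
        rank = new
    return rank

# Invented three-page web: "c" receives links from both "a" and "b",
# so it accumulates the most rank.
web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(web)
```

Even in this neutral-looking formulation, the outcome is entirely a function of who links to whom, which is one reason well-resourced, heavily linked sites dominate results regardless of quality.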
Crawford, K. (2021). Atlas of AI. Yale University Press.
Noble, S. U. (2018). Algorithms of oppression. New York University Press.