When using Google's search engine as your primary method of acquiring information, you would hope that the results it returns hold an abundance of factual and unbiased information. Good arguments are founded on premises that are in fact true, and the logic that follows from that groundwork must be valid and sensible. Think back to the basic spelling tests and math problems taught by your elementary school teachers. Did you ever doubt that the logic they used to teach those concepts was faulty or inaccurate? Probably not, given that their teachings continued to hold steady in later stages of your education. The same idea applies to search engines: we obtain information from web-based services and act on it in future contexts. To put it simply, when searching the equation “2+2,” any reliable search platform should return the answer “4.” The harsh reality, however, is that commercial search engines serve up sources of information tailored to what advertisers think we are looking for, and the algorithms that process our searches inherently steer us toward popular and profitable websites. In other words, the knowledge we obtain from search engines like Google will always be biased in some capacity.
As mentioned earlier, we tend to act on previously learned information in later situations, which makes biased search results all the more concerning. In a chapter from Algorithms of Oppression, Safiya Umoja Noble recounts how one of the worst hate crimes in recent memory was committed by a white nationalist named Dylann “Storm” Roof, who opened fire on unsuspecting African American Christian worshipers. The key to understanding why this tragedy took place lies in Mr. Roof’s search history. A simple Google search for “black on white crime” led Mr. Roof to believe the false notion that Black violence against White Americans is out of control in the United States. Noble conducted the exact same search and found her screen flooded with conservative, White-nationalist websites that foster pure hatred toward African Americans. If commercial search engines like Google were truly unbiased, they would have surfaced the Federal Bureau of Investigation (FBI) crime statistics on violence in the United States. The FBI’s data clearly show that harm toward White Americans is largely an intra-racial issue, meaning that violence against White Americans is overwhelmingly carried out by other White Americans. In essence, Mr. Roof cultivated a hatred of Black Americans by obtaining extremely biased results from a search engine whose algorithm is capable of perpetuating forms of oppression.
The real problem with seeing results that lean toward one ideology over all others is the lasting effect they can have if the researcher is never exposed to other viewpoints. Say you come home from elementary school and use a search engine to answer the extremely basic equation “1+1.” Through the influence of advertising and an algorithm that prefers to display popular websites, you conclude that the answer is in fact “3.” This blunder is tolerable only because going back to school the next day allows your teacher to correct the faulty search result, replacing “3” with the right answer, “2.” Mr. Roof had no such guiding figure to dispel the false premises a biased search engine exposed him to. Instead, he read statements and opinions from members of the Council of Conservative Citizens (CCC), an organization whose stated goal is to prevent all efforts to mix the races of mankind. While there may be no direct link between the internet and murder, it can be surmised that Mr. Roof’s horrific hate crime was, in part, brought to fruition by the unbalanced nature of our search engines.
Can search engines or web browsers be better designed to expose people to different perspectives on the information they are looking for?
It is so scary that the most “reliable” source of knowledge in our lives, the thing anyone turns to when they need a question answered, can be biased. I am not sure how search engines could be redesigned to keep this from happening. In some ways it makes sense that one’s search engine is tailored to them, for things like shopping preferences or recent life developments. But just like everything else, biases about race have to change in order for things to improve.
I thought the title was really eye-catching! I find this really interesting and had never really thought about it before. Reading this post also brought up the fact that AI is inherently biased, and how we could change that. The distinction you made about Google always being biased in some capacity is an important one, and it is very interesting to think about. I also wonder if we could ever fix that, even a little.