Is Google Racist? How Machine Learning Fairness (MLF) Went Terribly Wrong

Did you know CNN anchor Don Lemon is the best example of a white man? At least to Google he is. If you search Google Images for “white man”, Don Lemon is the first search result, followed by a woman, a composite image of low-polling politicians and a serial killer. It turns out LeBron James, Ilhan Omar and Kamala Harris are also pertinent examples of white men.

The most important takeaway from googling “white men” is that they are really, really bad, if not downright evil. Google’s motto is “Don’t be evil,” so you can trust them.

Of the top 20 Google search results, only one portrays white men in a positive light. Another shows empathy for white men while calling out their stranglehold on power.

Let that sink in. The world’s leading search engine, whose mission is “to organize the world’s information and make it universally accessible and useful,” makes it almost impossible to find any examples of good white men. It should be noted that the media may be just as much to blame.

The most common Google search results for “white men” cite stories about serial killers and mass murderers. As a result, many on the far Left are convinced white men commit the vast majority of homicides in the United States. Inconveniently, the United States Department of Justice reports that blacks are seven times more likely to commit homicide than whites. Sometimes numbers can be so racist, right? Maybe Google can correct for that too.

You might wonder what happens when you google “black men”. The top 20 Google search results are all black men, as one would expect. Some are referred to in the captions as “good”, “resilient” and “lovable”. None are portrayed in a negative light. It’s also noteworthy that none of the search results show Caucasians or women, in contrast to our search results for “white men”.

As Douglas Murray explains in his book The Madness of Crowds: Gender, Race and Identity, Google engineers have implemented algorithms designed to correct for historical prejudice. The technical term is Machine Learning Fairness, or MLF.
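
Neither Murray nor Google publishes the code behind MLF, so any concrete picture of how such a correction works is necessarily speculative. Purely as a rough illustration of the general idea, the sketch below shows one generic way a “fairness” layer could re-rank search results toward a demographic quota. The function names, the `target_share` parameter and the group labels are invented for the example and do not describe Google’s actual system.

```python
# Hypothetical sketch of a fairness-aware re-ranker. This is NOT Google's
# algorithm; it only illustrates how a ranked list could be adjusted so that
# a protected group's share of positions meets a target quota.

from dataclasses import dataclass

@dataclass
class Result:
    url: str
    relevance: float   # score from the base ranking model
    group: str         # demographic attribute attached to the result

def rerank_with_quota(results, target_share, protected_group="B"):
    """Greedily rebuild the ranking: at each position, promote a result from
    the protected group whenever its share so far falls below target_share;
    otherwise take the most relevant remaining result."""
    remaining = sorted(results, key=lambda r: r.relevance, reverse=True)
    ranked = []
    while remaining:
        protected_so_far = sum(r.group == protected_group for r in ranked)
        need_protected = protected_so_far < target_share * (len(ranked) + 1)
        pick = None
        if need_protected:
            pick = next((r for r in remaining if r.group == protected_group), None)
        if pick is None:
            pick = remaining[0]          # fall back to pure relevance
        remaining.remove(pick)
        ranked.append(pick)
    return ranked

if __name__ == "__main__":
    sample = [
        Result("a.example", 0.95, "A"),
        Result("b.example", 0.90, "A"),
        Result("c.example", 0.85, "B"),
        Result("d.example", 0.80, "A"),
        Result("e.example", 0.75, "B"),
    ]
    for r in rerank_with_quota(sample, target_share=0.4):
        print(r.url, r.group, r.relevance)
```

Run on the sample data, the quota pushes the less relevant “B” results above more relevant “A” results, which is the kind of visible departure from pure relevance ranking that the search examples above suggest.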

In the book, Murray provides numerous examples of Machine Learning Fairness gone wrong. Some examples include Google image searches for “straight couples” versus “gay couples”, “white family” versus “black family” and even “European art”.

Murray argues that such absurdities could not occur as a result of algorithms operating without human influence. It is far more likely that Google employees are trying to teach their machines, and us, how to think about fairness. Unfortunately, they have exposed the extreme biases operating within the company.

Google search results for “white men” (December 8, 2019)

References

Douglas Murray, The Madness of Crowds: Gender, Race and Identity, in which the author discusses Google’s efforts to ensure “Machine Learning Fairness” in search results.

I am the truth, and the truth hurts.