On countless occasions, Google has proven to be our saviour. Too bad it can't save itself from controversy. Here are a few instances of Google giving the most random, even inappropriate, answers to queries, and landing in trouble for them.
1. When it showed Kannada to be the ugliest language in India.
This one is pretty recent, and it escalated to the point where the search-engine company had to apologise for the mistake.
#Kannada is one of the most beautiful language and I am so glad that I got to learn this amazing language! 🙏🏽 @doddaganesha— Anirban Chakraborty (@anirban_thestar) June 3, 2021
I am proud to know and speak Kannada. https://t.co/87NCtomR8e
2. When it showed Narendra Modi as the answer to "India's first Prime Minister".
While the text correctly said it was Jawaharlal Nehru, the image shown was that of Narendra Modi, which made people ask a lot of questions.
3. And before that, when it showed PM Modi's picture for the search "world's most stupid prime minister".
Google later clarified that this happened because of a Reuters article titled "Why work with India's new leader? It's the economy, stupid." One wonders how strong the SEO of that article must have been.
4. And before THAT, when it showed PM Modi's photo when someone searched "top 10 criminals in India".
Again, Google had to apologise and clarify why it was happening (answer: metadata).
These results trouble us and are not reflective of the opinions of Google. Sometimes, the way images are described on the internet can yield surprising results to specific queries.
5. When it showed Donald Trump's image for the search word "idiot" (which still happens to this day).
6. When Google Home refused to respond to the question "who is Jesus?". However, it could answer questions about Buddha and Muhammad Ali, which led people to point out the discrimination.
And here was Google's response.
Some have noticed the Google Assistant wouldn’t respond for “Who is Jesus.” This wasn’t out of disrespect but to ensure respect. Some Assistant replies come from the web. It might not reply in cases where web content is more vulnerable to vandalism & spam. Our full statement: pic.twitter.com/7iu1D8FEEK— Google SearchLiaison (@searchliaison) January 26, 2018
7. For years, people have been pointing out the biases in Google's results. For instance, if you search for the word "girl", the images that appear are not very diverse or inclusive.
Whatever the reason these results keep appearing, we are pretty sure they can be stopped, or at the very least, some clarification can be offered.
Googling how to make Google better.