Netizens are in a panic, fearing Google is creating a “police” force tasked with stopping free speech.
Already, its fake news detection “army” numbers around 10,000. Google calls them “Search Quality Raters”. They are scanning the web, flagging websites they believe fall into categories such as conspiracy theory or offensive content, amongst other things.
Much of the panic surrounding Google’s efforts stems from the assumption that there is a left-leaning bias involved. Is Google tailoring its results to suit its own narrative?
These 10,000 paid contractors are allegedly reviewing search terms commonly entered by Google users. Google has claimed that this is merely an experimental exercise.
Free Speech vs Hate Speech
Contractors, for example, make a search such as “holocaust history”. What concerned Google about searches like this is that conspiracy websites appeared at the top of results pages. Racist and non-factual websites such as Stormfront were able to spread malicious lies to innocent people.
Some people argue that we must trust humans to use their brains to make intelligent decisions. Adults should be able to discern whether what they are reading is based on legitimate historical sources, or whether it is emotive opinion constructed from a biased foundational narrative.
If Google plans on running this experiment correctly, then it must show evidence of policing all websites – left-leaning, right-leaning, and everything in between.
Currently, the flagging done by these contractors is not changing anything within Google’s search algorithms. The data is being collected for use by the human coders employed by Google. It is these coders who will create future algorithms based on the findings.
So what exactly is Google looking for?
On the surface, according to the Search Quality Rater Guidelines, the job is merely about tidying up the web. When the guidelines are examined in full, flagging news as “fake” is just a small part of a larger project.
So let’s look at some of Google’s intentions:
1. Financial Scams
If a web page looks like it could trick vulnerable people out of their life savings, and potentially ruin lives, Google wants it to be flagged. Such pages are termed “Your Money or Your Life” (YMYL) pages.
YMYL flags do not apply to proven, reputable shopping websites. A site without a secure payment connection, however, would be deemed suspicious.
2. Expertise, Authoritativeness, and Trustworthiness (EAT)
Google wants to make sure that the websites leading search results are factual. Who are the experts quoted? What are their histories and contexts?
Are the claims made backed by science, proven sources, or multiple witnesses? Opinions are never facts. Facts can sometimes become blurred; however, they should be easy to distinguish from the author’s opinions.
3. Helpful, and Easy to Navigate
The websites that lead search results should be professional and serve the visitor. Google wants its contractors to visit websites and rate how comfortable the experience was. It wants to see a clear distinction between a site’s content and its advertising.
Google wants people to find the answers they are looking for as quickly as possible. There should not be misleading landing pages for searches. The contractors will rate how “front and center” the answer to a person’s query appears.
4. Abandoned Websites
If a website is no longer being tended by its owner, it is vulnerable. Especially if it is quite old, it can become subject to hackers, spam, and the infiltration of false information.
Google wishes to “prune” these kinds of websites from its searches.