Google doubles down on deceiving customers about its political manipulation


(Natural News) After multiple sources corroborated the longstanding accusation that Google stealthily infuses its political preferences into its products, the company has continued to claim neutrality, leading to incongruous answers by its executives to lawmakers’ questioning.

(Article by Petr Svab republished from

A June 24 exposé by Project Veritas showed several Google employees and a cache of internal documents describing methods Google has used to tweak its products to surreptitiously push its users toward a certain worldview.

One employee even appeared to say, when caught on hidden camera, that Google’s goal was preventing President Donald Trump, or anybody like him, from being elected again—an assertion confirmed by another employee who spoke under the condition of anonymity.

Google spokespeople have failed to produce an official response, but two of its executives were questioned about the revelations—one at a June 25 Senate hearing and one at a House hearing the following day.

During the June 26 House Homeland Security Committee hearing, Rep. Debbie Lesko (R-Ariz.) confronted Derek Slater, Google’s global director of information policy, with one of the leaked documents on “algorithmic unfairness” (pdf).

“Imagine that a Google image query for ‘CEOs’ shows predominantly men. Even if it were a factually accurate representation of the world, it would be algorithmic unfairness,” the document says, explaining that in some cases “it may be desirable to consider how we might help society reach a more fair and equitable state, via … product intervention.”

“What does that mean Mr. Slater?” Lesko asked.

“I’m not familiar with the specific slide,” he said. “But I think what we’re getting at there is when we’re designing our products, again, we’re designing for everyone. We have a robust set of guidelines to ensure we’re providing relevant, trustworthy information. We work with a set of Raters around the world, around the country, to make sure those Search Rater Guidelines are followed, those are transparent, available for you to read on the web.”

“All right. Well, I personally don’t think that answered the question at all,” she replied.

Similarly, Maggie Stanphill, Google’s head of Digital Wellbeing, was questioned by Senate Commerce Committee member Ted Cruz (R-Texas) the day before.

He asked whether Stanphill agreed with a quote from one of the leaked documents saying that Google should “intervene for fairness” in its machine-learning algorithms. Stanphill said she didn’t agree with it.

But according to the employees and documents in the Project Veritas report, Google has already put the "fairness" doctrine into practice.

‘Algorithmic Unfairness’

“Our goal is to create a company-wide definition of algorithmic unfairness that … establishes a shared understanding of algorithmic unfairness for use in the development of measurement tool, product policy, incident response, and other internal functions,” says a document last updated in February 2017.

“What they’re really saying about fairness is that they have to manipulate their search results so it gives them the political agenda that they want,” the unidentified insider said.

For instance, when one types "men can" followed by a space into the Google search bar, the search engine suggests phrases such as "men can have babies," "men can get pregnant," and "men can have periods."

When one types "women can" followed by a space, the suggestions include phrases such as "women can vote," "women can do anything," and "women can be drafted."

This isn’t because these phrases are so popular among users, but because the “fairness” algorithm pulled them from so-called “sources of truth”—they reflect the political narrative Google desires, the insider said.

Moreover, Google has adopted the doctrine while keeping its users in the dark, he said. One of the documents says "it is not a goal at this time to release this definition [of algorithmic unfairness] externally."
