Google offers watered-down descriptions of controversial figures, study finds

If you start typing Alex Jones’ name into Google search, it suggests “Alex Jones, radio host” and shows a thumbnail so you can check whether this is the person you’re looking for.

The radio host label might surprise you: Jones is an American far-right figure and founder of the website Infowars who became famous for his conspiracy theories about the Sandy Hook shooting and the 9/11 attacks.

Do the same exercise with Gavin McInnes, founder of the far-right group Proud Boys, and he is presented to you as a writer. For Jake Angeli, known as the QAnon Shaman, one of the participants in the assault on the United States Capitol in 2021, the search engine uses the label activist.

Discussions about platforms that help spread misinformation online regularly target Facebook and Twitter, while Google’s search engine is too often overlooked, argues Ahmed Al-Rawi, a professor at the School of Communication at Simon Fraser University (SFU) in British Columbia.

However, the descriptions given by the autocomplete tool of Google’s search engine omit part of the truth and soften the public’s perception of these individuals, explains Ahmed Al-Rawi, who also directs The Disinformation Project, a lab at SFU.

“Calling someone an activist, when that person has spread hatred and even called for genocide, is not normal.”

A quote from Ahmed Al-Rawi, Director, Disinformation Project, SFU

Activists, journalists, and many more

As part of this study, Ahmed Al-Rawi and other researchers from The Disinformation Project examined the subtitles Google’s search engine suggests for 37 people considered to be conspiracy theorists or to have promoted conspiracy theories.

A suggestion from Google's search engine featuring a photo of Alex Jones and calling him a radio host.

Alex Jones is one of the 37 conspiracy theorists examined in the study.

Photo: Radio-Canada

The results of the study, published in the M/C Journal, show that, of the 30 people who had a subtitle, none of the labels reflected the public’s perception of them, according to Ahmed Al-Rawi: 16 were described by their contribution to the arts, 4 were called activists, 7 were associated with their original occupations, 2 with journalism, one with his sports career, and the last was identified as a researcher.

Knowing that Google’s search suggestions can vary depending on geographic location, the researchers were surprised to find the same results in Canada, the United States and the Netherlands. They reached this conclusion by using a virtual private network (VPN) to connect from each of the three countries.

A bias impossible to pin down without access to the algorithm

An algorithm applies a series of rules to a given dataset to arrive at a result. If the underlying data is biased, those biases carry through to the output.

In this case, are the labels assigned to conspiracy theorists the result of an error in Google’s database or a deliberate choice? It is hard to say, given how little information is available about how its algorithm works, says Stéphane Couture, a communications professor at the Université de Montréal and co-lead of the Laboratory on online rights and alternative technologies.

However, it is clear that these titles were not chosen as part of an editorial policy. “Google doesn’t have an editor who decided to give Alex Jones that radio host subtitle,” explains Stéphane Couture.

A robot looks at algorithms.

Human beings transmit their biases to the algorithms they design.

Photo: gettyimages/istockphoto / PhonlamaiPhoto

Ahmed Al-Rawi, for his part, fears that conspiracy groups could take advantage of this system.

The director of The Disinformation Project argues that, if Google’s database draws on information available on the Internet, as the platform states, these conspiracy theorists could influence how the search tool presents them through the way they describe themselves online.
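The mechanism Al-Rawi describes can be sketched with a toy example. The function below is not Google’s algorithm — it is a hypothetical majority-vote labeler, with invented page text — but it shows the loophole he warns about: if self-published pages outnumber critical coverage in the corpus being read, the self-chosen label wins.

```python
from collections import Counter

def suggest_subtitle(pages: list[str], descriptors: list[str]) -> str:
    """Pick the descriptor mentioned most often across indexed pages.
    A deliberately simplified stand-in for a real suggestion system."""
    counts = Counter()
    for page in pages:
        for d in descriptors:
            counts[d] += page.lower().count(d)
    # The most frequent label wins, regardless of who wrote the pages.
    return counts.most_common(1)[0][0]

# Hypothetical corpus: self-descriptions outnumber critical coverage.
pages = [
    "Alex Jones, radio host and broadcaster",   # self-description
    "Alex Jones, radio host, on air daily",     # syndicated bio
    "Alex Jones, radio host fan page",          # supporter site
    "Alex Jones, conspiracy theorist, sued",    # news article
]
print(suggest_subtitle(pages, ["radio host", "conspiracy theorist"]))
# → radio host
```

Under this (assumed) majority logic, anyone who controls enough pages about themselves effectively controls their own label — which is the “loophole” the researcher describes.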

“The system is being gamed; it’s like a loophole that lets conspiracy theorists promote themselves with Google’s help.”

A quote from Ahmed Al-Rawi, Director, The Disinformation Project, SFU

The shadow of a woman hides behind data.

Algorithmic biases can lead to situations of discrimination.

Photo: iStock

A limited vision

Stéphane Couture explains that Google relies on the claimed neutrality of its search process to maintain the trust of as many Internet users as possible.

Although this neutrality is only claimed and the subtitles give a limited picture of these individuals, the fact remains that they are not false, the communications professor maintains. Alex Jones is indeed a radio host, he points out.

Of course, if the platform presented Alex Jones as a conspiracy theorist, people in Alex Jones’s camp would be angry with Google.

“This information seems biased to us, but it isn’t to Alex Jones.”

A quote from Stéphane Couture, professor in communication, University of Montreal

According to Stéphane Couture, it is rather in choosing whether or not to give these individuals a subtitle that Google takes a position. Other controversial figures, such as Osama bin Laden, have none, for example.

Control and transparency required

The two researchers point out that this is not the first time Google’s search algorithms have come under scrutiny.

Ahmed Al-Rawi believes that enough pressure from the international community could prompt the digital giant to step in and fix the situation.

Google has already made changes to its search algorithm after an outcry over pejorative terms the search tool used to describe women and racialized groups, recalls Ahmed Al-Rawi.

In 2020, Google also decided to remove gendered labels, such as man or woman, from its algorithm to comply with its ethics rules on artificial intelligence.

The Google logo on the exterior facade of a building.

Google has already adopted measures to avoid certain biases in its algorithm.

Photo: dpa via getty images / NOAH SEELAM

Stéphane Couture says Google should be more transparent about how its algorithms work. He suggests that the platform remove these titles from its suggestions and appoint an editor-in-chief who could be held accountable when questions of algorithmic bias arise.

Their impact is very real, he says, given how widely the general public uses the platform.

“It’s like saying, ‘Osama bin Laden was a former citizen of Saudi Arabia.’ It completely erases his history and, in a strange way, the political dimension behind it.”

A quote from Stéphane Couture, professor in communication, University of Montreal

Internet giants like Google justify their algorithms’ biases by arguing that they mirror society, but a growing number of researchers and politicians say these platforms have an editorial role to play, says Stéphane Couture.

With information from Nantou Soumahoro
