Second, these systems are built on historical data. The AI is often trained on past news articles and content, which contain hidden (or not-so-hidden) biases. This is why Google Images will mostly show images of women when you search for ‘nurse’ and men when you search for ‘footballer’.
In relatively new areas, like some women’s sports leagues, there are fewer examples for Google to use. The NRL was established in 1998, which means Google has 24 years of references to the men’s NRL Grand Final to build on, while there are only four years of women’s competition.
This hinders the progress of women’s sport by perpetuating their online invisibility and reinforcing historical prejudices that men are at the center of sport, while women are sidelined.
It also makes it harder for fans to find information about the games and competitions they follow, which limits their engagement.
This feeds a loop in which the patterns of the past are amplified.
As for solutions, Pfefferkorn says Google has the ability to step in, and has done so in the past. In 2016, the search engine made algorithmic changes to demote Holocaust-denial content, and it has long downgraded spam sites. However, Google is more inclined to limit what its algorithms do than to improve them for the benefit of society.
This is where bringing more women and people from marginalized communities into computing and software development can help challenge the status quo embedded in these algorithms.
Increased coverage of women’s sports will also help the AI learn where the world is heading: the more stories the media publishes about the NRLW, the more Google will know about it.
And if you want to help out, you can champion women’s sports across online spaces. Or Google “NRLW grand final” 1,000 times until it gets the idea.