Google’s invisible filters: how to identify and bypass them? – SEO and engine news

When working on SEO, it can happen that a page meets all the conditions needed to rank, yet remains invisible in Google. No matter how hard we look for explanations on the technical or editorial side, or related to the age of the site, Search Console sends no signal that could clarify the situation: the “Manual actions” page remains blank. You have likely fallen victim to one of Google’s post-filters, not obvious at first glance but nevertheless very real. In this article we review Google’s main filters (SafeSearch, DMCA, the right to be forgotten, etc.): what they are and how to identify them in the search results.

As we know, SEO involves a whole set of actions to make a site compatible with both the needs of Internet users and those of search engines, Google included. And most of the time, that is enough.

But if, despite this, the results do not satisfy us, we review our optimizations, looking for improvements on the technical, editorial or popularity side. This approach is absolutely correct, with one caveat: there are also, so to speak, powers “from above” that search engines, Google included, can apply on top of their results, and that directly influence rankings.

Thus, alongside classic ranking factors, a page or website may not appear in search results because of:

  1. SafeSearch filtering of explicit content.
  2. A DMCA complaint for copyright infringement.
  3. A decision of the Court of Justice of the European Union (CJEU), i.e. the “right to be forgotten”.
  4. Government restrictions.

The SafeSearch filter or “adult filter”

The SafeSearch filter, also known as the “adult filter”, is the most common filter in Google search results of all types: classic web search, image search or video search. We encounter it every time we use Google, often without realizing it, simply because it is enabled by default, notably on Google Images.

SafeSearch option enabled by default in Google Images.
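A quick way to check whether SafeSearch is hiding a page is to compare search results with the filter forced on and off via the `safe` URL parameter (`safe=active` vs `safe=off`), which Google exposes publicly. A minimal sketch, assuming only that parameter; the comparison itself is done manually or with a rank-tracking tool:

```python
from urllib.parse import urlencode

def google_search_url(query: str, safesearch: bool) -> str:
    """Build a Google search URL with SafeSearch forced on or off."""
    params = {
        "q": query,
        # 'safe=active' forces the filter on; 'safe=off' disables it.
        "safe": "active" if safesearch else "off",
    }
    return "https://www.google.com/search?" + urlencode(params)

# Compare the two result pages: if a page appears only with
# safe=off, SafeSearch is filtering it out.
print(google_search_url("example query", safesearch=True))
print(google_search_url("example query", safesearch=False))
```

If a URL ranks normally with `safe=off` but vanishes with `safe=active`, the SafeSearch filter is the likely culprit.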

The SafeSearch filter was born around the year 2000 under the direct supervision of Matt Cutts. At the time, Google was not yet using machine learning to identify pornographic content in photos, and filtering was based solely on the textual content of pages.

Years passed, and the SafeSearch filter became ever more vigilant and precise, thanks in particular to the introduction of machine learning and image-analysis technologies such as OCR. Today it analyzes both textual and visual content, and a dedicated team oversees the feature.

The SafeSearch filter removes explicit content from search results. Explicit results include sexual, pornographic, violent and gory content: that is what Google’s official documentation page tells us.

But browsing through the options of Google’s API (SafeSearch Annotation), one discovers that it goes well beyond that.

There are five categories of content that Google analyzes and classifies as explicit:
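The five categories above correspond to the Cloud Vision API’s SafeSearch annotation: `adult`, `spoof`, `medical`, `violence` and `racy`, each scored on a likelihood scale from `VERY_UNLIKELY` to `VERY_LIKELY`. A minimal sketch of the JSON body one would POST to the Vision API’s `images:annotate` endpoint (endpoint and field names are from Google’s public Vision API documentation; the image URL is a placeholder):

```python
import json

# The five SafeSearch categories returned by the Vision API.
SAFESEARCH_CATEGORIES = ["adult", "spoof", "medical", "violence", "racy"]

def build_safesearch_request(image_url: str) -> dict:
    """Build the JSON body for a Vision API SAFE_SEARCH_DETECTION call."""
    return {
        "requests": [{
            "image": {"source": {"imageUri": image_url}},
            "features": [{"type": "SAFE_SEARCH_DETECTION"}],
        }]
    }

# POST this body (with an API key) to:
# https://vision.googleapis.com/v1/images:annotate
body = build_safesearch_request("https://example.com/photo.jpg")
print(json.dumps(body, indent=2))
```

The response’s `safeSearchAnnotation` object carries one likelihood value per category, which is how Google can flag far more than just pornography.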

[This article is available in full to subscribers of the Réacteur site. To learn more:]


An article written by Alexis Rylko, senior SEO consultant at iProspect.
