Google is filtering explicit images for generic searches

Dec 12, 2012 22:21 GMT

Google has changed the way it filters "adult" image results for certain queries. Users with safe search disabled, i.e. those who don't want any filtering, are now getting much cleaner results than they used to. In fact, disabling the safe search filter doesn't do all that much anymore.

Plenty of people are angry over the changes, which they see as a restriction on their freedom.

In essence, Google is not filtering results in the strict sense; rather, for generic searches, it makes sure to keep explicit results out.

The same searches, with safe search disabled, used to be filled with explicit images. However, Google is not censoring anything; users just have to be more specific in their searches.

Adding terms to the query that indicate they are actually looking for explicit images will give users exactly what they want.

"We are not censoring any adult content, and want to show users exactly what they are looking for," Google explained in a statement.

"We use algorithms to select the most relevant results for a given query. If you're looking for adult content, you can find it without having to change the default setting -- you just may need to be more explicit in your query if your search terms are potentially ambiguous. The image search settings now work the same way as in Web search," it said.

While some people were quick to call this censorship, it's actually a good idea. With safe search disabled, users regularly stumbled on explicit images for some queries.

It may not have bothered them, since they chose to disable safe search in the first place, but it wasn't what they were looking for. Google is now simply smarter about understanding queries and giving users what they want. Overall, the change doesn't negatively affect the vast majority of searches.