The company rolled out a change to its image search algorithm overnight that makes it tougher to stumble across adult pictures, whether or not you’re searching for them. Here’s how a Google representative explains the change:
“We are not censoring any adult content, and want to show users exactly what they are looking for — but we aim not to show sexually-explicit results unless a user is specifically searching for them. We use algorithms to select the most relevant results for a given query. If you’re looking for adult content, you can find it without having to change the default setting — you just may need to be more explicit in your query if your search terms are potentially ambiguous. The image search settings now work the same way as in Web search.”
In other words, if you have SafeSearch turned off, you can probably still find whatever you’re looking for by appending the word “porn” to your search.
But that might not be enough to satisfy some users. In a Reddit thread this morning, users reported struggling to find explicit content even with SafeSearch turned off.
“What is this? communism?!” asked user Fake_Cakeday. “BRING BACK THE PORN!”
Google says the change simply brings image search settings in line with existing settings for Web and video search. But it’s worth noting that the move is unusual for a company that, in nearly every other case, works to make it easier, not harder, to find what you’re looking for. The idea behind products like the Knowledge Graph and Google Now is that the company should bring you information with the least amount of effort possible. With its new image search settings, Google has identified one place where it wants users to work a bit harder.