Google Search's problems with explicit material in its image results are reportedly getting worse as the platform continues to surface deepfakes and non-consensual sexual photos.
According to an in-depth report by Wired, Google employees are having a hard time urging executives to act on the issue amid the growth of explicit images on the platform.
Although Google has made it easier to request removal of non-consensual sexual content, several former staffers note that the company has been lax in following up on earlier proposals to fully address the issue.
Among the recommendations previously proposed are stricter website verification policies to block non-consensual materials from being uploaded online and the adoption of the StopNCII tool.
StopNCII, or Stop Non-Consensual Intimate Image Abuse, is a free, industry-wide tool that helps social media platforms block explicit images by matching them against its database of over 572,000 hashed photos and videos.
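The matching idea behind such a tool can be sketched in a few lines. This is an illustrative assumption, not StopNCII's actual implementation: real systems use perceptual hashes that tolerate small image edits, whereas the cryptographic SHA-256 used below only matches byte-identical files. The blocklist and function names are hypothetical.

```python
import hashlib

# Hypothetical blocklist standing in for a hash database like StopNCII's.
# SHA-256 is a stand-in here; production systems use perceptual hashing.
BLOCKED_HASHES = {
    # SHA-256 of the bytes b"test", used as a dummy blocked entry
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_hash(data: bytes) -> str:
    """Hash raw image bytes (cryptographic hash, for illustration only)."""
    return hashlib.sha256(data).hexdigest()

def should_block(data: bytes) -> bool:
    """Return True if the upload's hash matches a known blocked hash."""
    return image_hash(data) in BLOCKED_HASHES

print(should_block(b"test"))        # True: hash is in the blocklist
print(should_block(b"other file"))  # False: no match
```

The key property is that only hashes, never the images themselves, need to be shared with platforms, which is why this approach is used for sensitive material.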
A Google representative told Wired that preventing the spread of non-consensual images on the platform remains a top priority as its workers "continue to work diligently to bolster our safeguards."
Generative AI Could Worsen Explicit Images on Google Search
With generative AI becoming more widely available to the public, experts have long raised warnings that explicit deepfakes would only increase as malicious actors exploit the technology for illegal uses.
Several earlier victims of explicit images on Google Search are already seeing AI-generated sexual deepfakes of their likenesses despite earlier takedown requests.
Deepfakes are even able to bypass the platform's automated censorship of adult content, as AI-generated sexual images of celebrities can easily be surfaced with simple search queries.
White House Steps Up Efforts to Combat Non-Consensual Explicit Images
This is not the first time Google has been accused of negligence in addressing explicit content on its platforms, to the point that lawmakers have started intervening on the issue.
Just last May, US President Joe Biden issued an executive order to curb "image-based sexual abuse" online, including the use of AI, to target vulnerable groups.
One of the primary steps in the executive order is the formation of a new task force to prevent the further spread of non-consensual material.
This is in addition to urging online platforms like Google to "manage the risks of AI" and prevent the technology from being monetized at the expense of sexually exploited individuals.