Victims Struggle as Google Fails to Act Swiftly

Reports of intimate images and videos posted online without consent are rising, and deepfakes are adding a new dimension to the problem. In early 2022, two Google policy staffers met with women victimized by the GirlsDoPorn scam, whose explicit videos were circulating online, including through Google search results.

The women suggested that Google use a 25-terabyte hard drive containing every GirlsDoPorn episode to create a "hash" of each clip and block matching copies from search results. Two years later, none of their suggestions has been implemented, and the videos continue to appear in search results.
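To make the hashing idea concrete, here is a minimal, hypothetical sketch in Python of how hash-based blocking works in principle: fingerprint every known clip once, then refuse to surface any file whose fingerprint matches. The directory layout, file extension, and use of SHA-256 are assumptions for illustration; production matching systems generally rely on perceptual hashes that tolerate re-encoding, unlike the exact-match hash shown here.

```python
import hashlib
from pathlib import Path

# Illustrative sketch only: compute a fingerprint for every known clip once,
# then check whether a newly surfaced file matches one of them. SHA-256 only
# catches byte-identical copies; real matching systems typically use
# perceptual hashes that survive re-encoding and cropping.

def file_sha256(path: Path) -> str:
    """Hash a file in 1 MB chunks so large videos never need to fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def build_blocklist(archive_dir: str) -> set[str]:
    """Fingerprint every .mp4 in the archive (e.g. the 25 TB drive)."""
    return {file_sha256(p) for p in Path(archive_dir).rglob("*.mp4")}

def is_known_clip(candidate: Path, blocklist: set[str]) -> bool:
    """Return True if the candidate file matches a clip in the blocklist."""
    return file_sha256(candidate) in blocklist
```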

Google’s Inadequate Response

Despite recent changes that make it easier for survivors of image-based sexual abuse to remove unwanted search results, victims and advocates are frustrated that Google has not taken bolder action. Google has declined to adopt StopNCII, an industry tool that shares hashes of known nonconsensual intimate imagery (NCII) across participating platforms, citing concerns about what is in the tool's database.

Internally, Google employees have suggested stricter measures, such as requiring adult websites to verify consent, but these ideas have not been adopted. A Google spokesperson stated that combating nonconsensual explicit imagery (NCEI) remains a priority and that the company’s actions go beyond what is legally required.

However, sources within Google argue that more could be done. They point to Google’s tighter restrictions on child sexual abuse material (CSAM) as evidence that the company can implement stricter measures for NCII.

Calls for Proactive Measures

Advocates believe that Google should take more proactive steps to protect victims. The National Center on Sexual Exploitation argues that Google should automatically honor all takedown requests and require websites to prove there was consent to record and publish the disputed content.

Google’s current systems, which attempt to automatically remove search links when previously reported content resurfaces, have limitations. The known-victim protection system, which is meant to filter explicit results out of queries similar to ones a victim has already reported, is not foolproof.
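As a rough illustration of what automatically removing links to resurfaced content involves, the hypothetical sketch below stores fingerprints of previously reported items and checks newly indexed results against them; the names and structure are assumptions, not a description of Google's systems, and a copy that has been re-encoded or cropped would produce a different fingerprint, which is one reason such filtering is not foolproof.

```python
# Hypothetical sketch of resurfacing detection, not a description of Google's
# internal systems: fingerprints of previously reported items are stored, and
# each newly indexed result is checked against them before it is surfaced.

reported: set[str] = set()

def record_report(fingerprint: str) -> None:
    """Remember the fingerprint of content a victim has reported."""
    reported.add(fingerprint)

def should_suppress(fingerprint: str) -> bool:
    """Suppress a result whose fingerprint matches previously reported content."""
    return fingerprint in reported
```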

Victims like those of the GirlsDoPorn scam are forced to remain vigilant, constantly searching for and reporting new uploads of their NCII. This burden should not fall on the victims, says Adam Dodge, founder of the advocacy group Ending Tech-Enabled Abuse. He calls for Google to take more responsibility in proactively identifying and removing nonconsensual media.

A Call to Action

Victims and advocates are urging Google to adopt stronger measures to combat the growing problem of nonconsensual explicit images. While Google has made some improvements, such as updating its takedown forms and policies, these efforts are seen as insufficient.

With the rise of AI-generated deepfakes and increasing reports of NCII, the need for more robust and proactive measures is more critical than ever. Victims are counting on Google to take the steps needed to protect their privacy and safety, and to end the re-victimization caused by the repeated reappearance of their explicit images online.