South Korea: Online sexual abuse on the rise as survivors blame Google’s failures

Women and girls said that the process for reporting non-consensual explicit content to Google was difficult to navigate, which meant videos of sexual abuse continued to circulate online.

“While the wave of digital sex crimes in South Korea wreaks havoc on the women and girls it targets, Google’s inadequate system for reporting explicit, non-consensual content is exacerbating matters,” said Jihyun Yoon, Director of Amnesty International Korea.

“Google needs to do more to prevent the spread of gender-based violence online, not just in Korea but globally. Anyone who wants harmful content removed has to use the same flawed reporting system, so this problem very likely extends far beyond Korea.”

Today, Amnesty International launched a global petition calling on Google to address flaws in its reporting system.

Digital sex crimes on the rise despite ‘Nth Room’ case

In March 2020, a group of South Korean journalists exposed the existence of eight secret chatrooms on the messaging app Telegram where thousands of sexually explicit videos of women and girls had been sold, without the victims’ consent, in exchange for cryptocurrency. Known as the “Nth Room” case, the crimes committed in these chatrooms involved more than 60,000 people, according to South Korean police.

In October 2021, one of the chatroom operators in the ‘Nth Room’ case was sentenced to 42 years in prison. However, digital sex crimes continue, and the ease with which images can be shared and repeatedly reposted creates additional suffering.

Recent criminal cases show that perpetrators often use existing video footage to threaten survivors into further sexual abuse and the production of more content. This demonstrates that unless non-consensual content and survivors’ personal information are removed, women and girls continue to experience harm and crime even after the original perpetrators are punished.

“Removing explicit, non-consensual content circulating online is essential to restoring the daily lives of survivors. These women have no choice but to place their trust in tech companies’ removal processes, despite the painful process of having to repeatedly search for the explicit, non-consensual content in which they appear and compile it for reporting,” said Jihyun Yoon.

“If these requests are not addressed quickly and the harmful content can be redistributed at any time, survivors suffer long-term physical and mental harm.”

Google’s broken reporting system

Google says that explicit, non-consensual content can be removed upon request. However, survivors and activists who spoke to Amnesty International Korea said Google’s reporting categories and procedures were confusing and difficult to follow. The right forms were hard to find, and their categories made it unclear what type of content was being reported.

Even after a complaint was successfully filed, users heard little or nothing about the progress of their claim, often for months at a time.

Amnesty International Korea conducted a survey of 25 survivors and activists. All 11 of those who filed a complaint through Google said it was difficult to confirm whether their requests were handled properly. This was mainly due to the lack of communication from Google during the reporting process.

Hyun-jin*, one of those surveyed, waited just over a year between receiving an acknowledgment from Google and finally being notified of the outcome of her series of takedown requests.

Google does not appear to take survivors’ trauma into account when designing its removal procedures. When reporting content, users must check a box acknowledging that they can be penalized if the submission is not valid, while Google states that it will not consider incomplete complaints.

Hyun-jin said these instructions added to her anxiety: “It was hard enough to bring this up, but instead of being reassured that the content would be removed, I became more anxious, thinking that if it didn’t work out it would be my responsibility.”

She has since drafted a sample 600-word letter detailing why such content is illegal, and shared it with other survivors to help them file takedown requests.

One of Google’s reporting forms also requires that “photo ID” be attached to the report, ignoring the fact that survivors, whose explicit images were distributed without their consent, are fearful of uploading photos of themselves online.

“Asking survivors to upload their passport photos to the internet, where videos of them are circulating, is heartbreaking,” said Dan of the activist group Team Flame.

Google is the “worst secondary victimization site”

Amnesty International interviewed four survivors of online gender-based violence and six activists who supported them. All survivors reported harm to their physical and mental health, including the need to isolate themselves from society to avoid stigma.

The sexual abuse and its spread online had already caused serious harm to these survivors; that harm was exacerbated when they faced the slow and confusing process of getting content removed from the internet.

“It was very easy [for the perpetrator] to upload a video, but it took months to remove it,” survivor Hyun-jin told Amnesty International.

She had gone to the police after a sexually explicit video of her was circulated online without her consent, wrongly assuming that the video would soon be removed.

“If you become a victim in this way, you have no idea what to do. I looked at my phone all day and googled my name. I could barely sleep an hour a day, because I spent most of my time searching. I had nightmares all the time, but the reality itself was a nightmare.”

“To remove videos, photos and keywords, I had to take hundreds of screenshots and report them to Google. I couldn’t ask anyone else to do all this for me because I had to include these malicious images of myself in my report. I had to do everything on my own.”

“Google has many advantages – you can easily get the information you want. But for victims, Google is nothing more than a huge distribution site. It is the worst site in terms of secondary victimization. The other day I checked URLs where the content had been distributed, and [the search results] ran to more than 30 pages. It is not easily removed upon request. [Still] I have no choice but to continue filing takedown requests,” Hyun-jin said.

Technology companies’ responsibility to prevent harm on their platforms

Google’s inadequate reporting system is inconsistent and difficult to navigate, and it fails to assist survivors in a prompt and transparent manner.

The responsibility of all companies to respect human rights is articulated in the United Nations Guiding Principles on Business and Human Rights, which state that companies must avoid causing or contributing to adverse human rights impacts through their own activities, and must address such impacts when they occur.

Google’s human rights policy states that it is committed to “upholding the standards set by the United Nations Guiding Principles on Business and Human Rights.”

“By being inconsistent and slow in responding to removal requests from survivors of digital sex crimes, Google is failing to respect human rights. It should implement a survivor-centered reporting system that prevents re-traumatization and is easy to access, navigate, and audit,” Jihyun Yoon said.

“Google must ensure that online gender-based violence does not take place through its services. Its reporting mechanisms must help survivors of digital sex crimes, rather than needlessly prolonging their suffering.”

Amnesty International wrote to Google on 11 November requesting a response to its findings. Google has not provided an official response, but in a meeting it acknowledged that this is an important issue and said it wants to improve how it handles such cases.

* The real identities of all survivors have been protected at their request.
