Addressing the distribution of illicit sexual content by minors online


2023-06-15

The creation and trading of Child Sexual Abuse Material, or CSAM, is often regarded as the most harmful form of abuse found across online communication and social media platforms.

Most of the policy, law enforcement and platform discussion on addressing CSAM rightfully focuses on adult offenders who create, distribute and monetize sexual imagery of children. While the majority of content purchased or traded online is created by adult abusers, in some instances minors (often teenagers) also create this illegal content. They often provide paid offerings modeled after content on well-known adult sites, such as OnlyFans. In other cases, minors are coerced into producing illicit sexual content, a practice known as sextortion; the FBI reported a dramatic increase in such cases in the past year.

A new Stanford Internet Observatory report investigates networks on Instagram and Twitter that are involved in advertising and trading self-generated child sexual abuse material (SG-CSAM). These findings were covered by the Wall Street Journal, with responses from the companies named in the report.

Key Takeaways

- Large networks of accounts that appear to be operated by minors are openly advertising self-generated child sexual abuse material (SG-CSAM) for sale.
- Instagram is currently the most important platform for these networks, with features like recommendation algorithms and direct messaging that help connect buyers and sellers.
- Twitter had an apparent and now-resolved regression allowing CSAM to be posted to public profiles despite hashes of these images being available to platforms and researchers for automated detection and removal.
- Telegram implicitly allows the trading of CSAM in private channels.
- Gift card swapping and exchanges, such as G2G, are a critical part of the monetization of SG-CSAM, allowing anonymous compensation for content.
- Study of these dynamics is challenging but necessary, particularly in an environment where platform providers are divesting from Trust and Safety programs. SIO has implemented systems to study these networks while preventing exposure to or storage of CSAM itself.

Our investigation finds that large networks of accounts, purportedly operated by minors, are openly advertising SG-CSAM for sale on social media. Instagram has emerged as the primary platform for such networks, providing features that facilitate connections between buyers and sellers. Instagram's popularity and user-friendly interface make it a preferred option for these activities. The platform's recommendation algorithms effectively advertise SG-CSAM: they analyze user behavior and content consumption to suggest related content and accounts to follow.

These networks are also present on Twitter. In the course of the investigation, researchers found that, despite the availability of image hashes to identify and remove known CSAM, Twitter experienced an apparent regression in its mitigation of the problem. Using PhotoDNA, a widely deployed system for detecting known CSAM, researchers identified matches on public profiles, bypassing safeguards that should have prevented the spread of such content. This gap was disclosed to Twitter's Trust & Safety team, which responded to address the issue. The failure nonetheless highlights the need for platforms to prioritize user safety and the importance of collaborative research efforts to mitigate and proactively counter online child abuse and exploitation.
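To illustrate the general technique only: systems like PhotoDNA work by computing a perceptual hash of each uploaded image and checking it against a database of hashes of known abuse material shared by vetted organizations. PhotoDNA itself is proprietary, so the minimal sketch below uses the open-source imagehash library as a stand-in; the hash file, threshold, and file paths are illustrative assumptions, not Twitter's or Microsoft's actual implementation.

```python
# Illustrative sketch of hash-based known-image detection.
# PhotoDNA is proprietary; this substitutes the open-source
# perceptual hash from `imagehash` (pip install imagehash pillow).
# The hash list, threshold, and paths below are hypothetical.

from PIL import Image
import imagehash

# In production the known-hash set would come from a vetted
# hash-sharing program, not a local text file.
with open("known_hashes.txt") as f:
    KNOWN_HASHES = {
        imagehash.hex_to_hash(line.strip())
        for line in f if line.strip()
    }

# Perceptual hashes tolerate small edits (resizing, re-encoding),
# so matching uses Hamming distance rather than exact equality.
MATCH_THRESHOLD = 5  # illustrative; real systems tune this carefully

def is_known_image(path: str) -> bool:
    """Return True if the image's perceptual hash falls within
    MATCH_THRESHOLD bits of any hash in the known set."""
    h = imagehash.phash(Image.open(path))
    return any(h - known <= MATCH_THRESHOLD for known in KNOWN_HASHES)

if __name__ == "__main__":
    print(is_known_image("upload.jpg"))
```

At platform scale the lookup runs against an indexed hash database rather than a linear scan, but the matching principle (near-duplicate detection against a shared list of known hashes) is the same.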

While the primary platforms identified as having significant SG-CSAM activity were Instagram and Twitter, a wide cross-section of the industry is leveraged by this ecosystem, some of which we could not analyze in depth using open-source methods.

An industry-wide initiative is needed to limit production, discovery, advertisement and distribution of SG-CSAM; more resources should be devoted to proactively identifying and stopping abuse. These networks utilize not only social media platforms, but file sharing services, merchants, and payment providers. Given the multi-platform nature of the problem, addressing it will require better information sharing about production networks, countermeasures, and methods for identifying buyers.

SIO hopes that this research aids industry and non-profits in their efforts to remove child sexual abuse material from the internet. Platforms have updated safety measures based on the findings, but more work is needed. We will continue to partner with technology and child safety organizations to conduct further research and recommend countermeasures.


