WASHINGTON (BP and local reports) – Lina Nealon, a sexual assault survivor and mother of four children, poses as a 14-year-old on Snapchat. Within minutes, Snapchat’s Stories section — marketed as “content from vetted media publishers and content creators” — recommends sexually suggestive videos.
“Clearly Snap and I have a different idea of what is acceptable content to share with teens,” said Nealon, vice president and director of corporate advocacy for the National Center on Sexual Exploitation (NCOSE) in Washington, D.C.
Nealon shared her experience on Instagram as part of NCOSE’s May 2 release of its 10th annual Dirty Dozen List, which accuses companies of facilitating, enabling, and profiting from sexual abuse and exploitation.
Snapchat joins the Apple App Store, Discord, eBay, Instagram, Kik, Microsoft’s GitHub, OnlyFans, Reddit, Roblox, Spotify, and Twitter on NCOSE’s Dirty Dozen List.
“Those on our 2023 Dirty Dozen List were included for facilitating a diverse set of sexual exploitation issues including sex trafficking, image-based sexual abuse, child sexual abuse material, grooming children for exploitation, and childlike sex abuse dolls,” Nealon said.
“Sexual abuse and exploitation are on the rise and are facilitated by digital platforms,” she alleged. “It is past time for tech platforms to stop their products from threatening the safety of children and enabling sexual abuse to happen to people of all ages.”
NCOSE released the list with documented examples captured directly from the platforms, corroborating information supporting its accusations of sexual exploitation, and avenues for public advocacy for change.
“We call for urgent change from those who made the 2023 Dirty Dozen List. Over the past decade, the Dirty Dozen List campaign has instigated major policy changes at Google, TikTok, Comcast, Delta Airlines, Amazon, the Department of Defense, and many other influential institutions.
“The list exposes practices and products that endanger and harm people and galvanizes the public to press on the named entities to act ethically and promote human dignity,” Nealon said.
Apple reached out to NCOSE ahead of the release of the list “with an offer to renew meetings around safeguarding concerns,” NCOSE said in a press release.
NCOSE presents Apple’s App Store as “deceptive to the core,” stating that the store’s age ratings and descriptions “mislead parents about the content, risks, and dangers to children on available apps.
“It is yet to be seen what steps Apple will take to address the concerns contained within the Dirty Dozen List, but NCOSE is hopeful that Apple will step up its efforts to protect children in the App Store.”
Spotify’s purported filter fails to screen out abusive content, an NCOSE researcher said on Instagram when the report was released.
“When I began my research on Spotify I was absolutely shocked at the amount of pornographic content that I was able to easily find,” the researcher said.
“Parents are putting their trust in this filter and they’re sometimes even paying for a premium subscription in order to be able to control this filter on their kids’ account, and this filter is doing virtually nothing to shield their kids from hardcore pornography on the app.
“It very rarely catches the audio pornography and it actually never catches the visual pornography because it just doesn’t [monitor] images,” the researcher said.
Twitter persistently hosts child sexual abuse material and has resisted legal challenges over material considered sexually exploitative and abusive, NCOSE said, citing lawsuits against the company, which was recently purchased by billionaire Elon Musk.
“In the ‘John Doe #1 and John Doe #2 v. Twitter’ case, the company continues to claim 100 percent legal immunity for its actions in facilitating the child sexual exploitation of John Doe and his friends,” NCOSE wrote.
NCOSE describes the Discord messaging platform as a “haven for sexual exploiters,” offering avenues for predators to groom children, find and trade child sexual abuse material, and commit image-based sexual abuse of adults.
NCOSE quotes eBay’s advertising slogan, “Whatever it is, you can get it on eBay,” in pointing to content available on the site, including “childlike sex abuse dolls and spy cams advertised specifically for filming women without their consent.”
Instagram facilitates the capture and sharing of images in ways that can make grooming minors for sexual abuse easier, along with the dissemination of child sexual abuse material, sex trafficking, and other harms, NCOSE said.
Kik is “crawling with criminals,” NCOSE asserts, describing the anonymous messaging app as a “predator’s paradise for grooming kids and sharing images of their abuse, in addition to hardcore porn and prostitution ads.”
NCOSE describes GitHub as “the go-to place to create sexually exploitative technology,” including deepfakes, convincing fake images and audio and video hoaxes generated with artificial intelligence; “nudify” apps that make fully clothed people appear naked; and artificial intelligence-generated pornography.
OnlyFans empowers sex traffickers, child exploiters, and “revenge porn” attackers in committing their crimes, NCOSE said, while Roblox may expose kids to predators and inappropriate content.
The full Dirty Dozen List is available on NCOSE’s website, with the organization’s related editorial posts on Instagram. Resources on how to deal with potentially inappropriate content on the Internet are also available.