Child abuse is generally a hidden act in our society, one that usually happens behind closed doors and in private moments, often perpetrated by the adults most trusted by their child victims. Although methods of online child sexual abuse vary from offender to offender, the purpose is the same for all. Child sexual abuse material (CSAM) includes video, images, text and drawings of children being abused and exploited. Although online child sexual exploitation (CSE) reached its peak during the COVID-19 pandemic, the abuse itself has existed offline for more than five decades. Exploitation of children across the age spectrum is widely observed, and it generally takes place on social media apps and in chat rooms. The technology industry has been vocal against online CSE, but it has yet to prioritize it over profit. Every half a second, a child goes online for the first time. In 2022 alone, approximately 85 million images and videos of child sexual abuse circulated online, material that now lives on the internet forever.1
Language Matters
CSAM demonstrates that a child has been groomed, coerced and exploited by abusers. When it comes to pornography, in most cases, the adults featured have decided to participate and have consented to be filmed. That is not the case with CSE, as minors cannot legally give consent to have sex or have images of their abuse recorded and distributed publicly/privately.2 Rather, each and every video/image of CSAM is actually evidence that the child has been the victim of sexual abuse. In most cases, the abuse has been committed by someone who is known or trusted by the child. These "trusted" individuals may include family members, teachers, coaches/trainers and others who may be near and dear to the child. Abusers/offenders often use "grooming" techniques to encourage secrecy and normalize the sexual misconduct.
Social Media and Messaging Apps
Online predators have flocked to social networking sites, eager to find and exploit individuals who are willing to share personal information. Few children consider the consequences of the information they share online. Cyber stalkers use social media to discover where children live and what school they attend.
Facebook, Instagram, Snapchat, X (formerly known as Twitter), TikTok, Discord, Telegram, Pinterest, Signal, Vine, Google Chat and Dropbox are the most popular platforms.3 Facebook, X and Instagram are used mainly to share images through mobile devices, while Snapchat is used for "snaps" (disappearing photos and videos). Snapchat was intentionally designed to be a visual communications platform for communicating with real friends. The initial product design included safeguards to make it harder for strangers to find and contact younger people, but predators have targeted the platform for child abuse since those original safeguards were relaxed and children have learned how to override parental privacy settings. Snapchat also gives users a false sense of security that information or pictures are being deleted and that no evidence of communication between two people remains on the platform. Adult offenders often connect with teens on one platform, such as Instagram,4 and then move the conversation onto Snapchat. Messaging apps must use technology to detect and combat abuse, as well as report any known images and videos featuring CSE to authorities. They should use machine learning-driven tools to identify keywords and account behaviors that suggest abusive accounts or other suspicious activity, and then use these signals to flag high-risk accounts for review. Meta's Facebook groups, Instagram hashtags, recommendation features and algorithms often make it easier for people to connect with abusive content and others interested in it, thus enabling the promotion and sharing of child exploitation material.
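To illustrate the kind of signal-based flagging described above, the sketch below combines hypothetical keyword matches and behavioral signals into a simple risk score. The field names, keywords and thresholds are illustrative assumptions for this article, not any platform's actual detection logic, and a real system would rely on far richer machine learning features.

```python
# Minimal sketch of keyword- and behavior-based account flagging.
# All field names, keywords and thresholds are illustrative assumptions.
from dataclasses import dataclass, field

SUSPICIOUS_KEYWORDS = {"trade pics", "young only", "pay per video"}  # placeholder terms


@dataclass
class AccountActivity:
    account_id: str
    messages: list[str] = field(default_factory=list)
    follow_requests_sent_24h: int = 0          # mass "follow" outreach signal
    share_of_minor_contacts: float = 0.0       # fraction of contacts flagged as minors


def risk_score(activity: AccountActivity) -> float:
    """Combine simple keyword and behavioral signals into a 0-1 risk score."""
    keyword_hits = sum(
        any(k in msg.lower() for k in SUSPICIOUS_KEYWORDS) for msg in activity.messages
    )
    score = min(keyword_hits * 0.2, 0.6)                              # repeated keyword use
    score += 0.2 if activity.follow_requests_sent_24h > 500 else 0.0  # mass outreach
    score += 0.2 if activity.share_of_minor_contacts > 0.8 else 0.0   # contacts mostly minors
    return min(score, 1.0)


def flag_for_review(accounts: list[AccountActivity], threshold: float = 0.6) -> list[str]:
    """Return account IDs whose combined signals exceed the review threshold."""
    return [a.account_id for a in accounts if risk_score(a) >= threshold]
```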
Telegram implicitly allows the trading of CSAM in private channels. Platforms like Discord expose children to predators online with almost no checkpoints or safeguards. Almost all of the financial sextortion targeting minors comes from a cybercrime group called the Yahoo Boys, located in Nigeria and Côte d'Ivoire. Moreover, nearly all of the financial sextortion targeting minors starts on one of three platforms: Instagram, Snapchat and Wizz.5 Criminals use catfish accounts impersonating young females to send hundreds of thousands of "follow" requests to boys on Instagram. The moment a teen accepts, the criminal takes screenshots of his follower and following lists. Instagram could mitigate the vast majority of sextortion incidents with one simple privacy change today: allowing users to keep their followers and following lists always private and making that the default setting for minors.
AFC Professionals Can Help
Investigators working on transaction monitoring teams should be particularly wary when confronted with cases involving the scenarios listed below (a simple rule-based sketch follows the list):
- Frequent transactional (especially low value) flows (usually cross-border, through money services businesses) between adults and apparently young clients, with a focus on age (e.g., 40-50 for adults and 20-25 for the young clients) and gender of customers
- Repetitive or frequent use of a few specific keywords while facilitating transactions
- Use of high-risk merchants/payment processors as delivery channels
- Payments related to online purchases, app purchases, online gaming and gambling, use of online video and communication technologies and use of online file storage
- Purchases at vendors that offer online encryption tools, virtual private network services, gift cards, software to clear online tracking and vendors that offer software for peer-to-peer sharing platforms
- Payments related to travel-related expenses (e.g., passport purchase, flight bookings, airline baggage fees) that occur closely before or after transfers to a jurisdiction of concern for child sexual exploitation
- Purchases on dating platforms or sites for adult entertainment content (e.g., www.onlyfans.com, www.filipinocupid.com, www.asianbeauties.com, www.asiandating.com, www.asiandatingspace.com, www.asiandate.com, www.amolatina.com, www.lovetoria.com, www.mingle2.com, www.tinder.com, www.naughtydate.com), creator-content streaming websites (e.g., www.restream.io, www.vimeo.com), domain registration/website hosting entities (e.g., www.hostinger.com, www.godaddy.com)
- Use of virtual currencies to fund a virtual currency account, convert funds and/or transfer funds to another virtual currency wallet, obtain a cryptocurrency loan or withdraw funds in cash
- Email money transfers that include a partial email address or reference with terms possibly related to child sexual exploitation
- Transactions to reload prepaid credit cards (particularly ones that deal with virtual currencies)
- Transactions made by account logins through IP addresses in a jurisdiction of concern
- Purchases at youth-oriented stores or venues (e.g., toy store, children's clothing store, amusement park, play center, candy shop)
- Payments to streaming platforms (e.g., Pornhub, Lips, clickcastx) or file-sharing service providers (e.g., WeTransfer)
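For monitoring teams that want to turn indicators like these into automated alerts, the rule-based sketch below shows one possible starting point. The transaction fields, keyword sets, amount and age thresholds and country codes are illustrative assumptions rather than regulatory guidance; real monitoring would also weigh transaction frequency and the broader customer profile.

```python
# Minimal rule-based sketch of a few of the red flags listed above.
# Fields, keywords, thresholds and country codes are illustrative assumptions.
from dataclasses import dataclass

REFERENCE_KEYWORDS = {"cam show", "private video", "young"}       # placeholder reference terms
HIGH_RISK_MERCHANT_CATEGORIES = {"dating", "adult_content", "file_sharing"}
JURISDICTIONS_OF_CONCERN = {"XX", "YY"}                           # placeholder country codes


@dataclass
class Transaction:
    sender_age: int
    receiver_age: int
    amount: float
    cross_border: bool
    channel: str                 # e.g., "msb", "card", "email_transfer"
    merchant_category: str
    reference_text: str
    receiver_country: str


def red_flags(tx: Transaction) -> list[str]:
    """Return the names of the indicators this transaction trips."""
    flags = []
    # Low-value cross-border MSB flows between older adults and apparently young clients
    if (tx.cross_border and tx.channel == "msb" and tx.amount < 100
            and tx.sender_age >= 40 and tx.receiver_age <= 25):
        flags.append("low_value_cross_border_age_gap")
    # Keywords in payment references possibly related to child sexual exploitation
    if any(k in tx.reference_text.lower() for k in REFERENCE_KEYWORDS):
        flags.append("suspicious_reference_keyword")
    # High-risk merchants or payment processors used as delivery channels
    if tx.merchant_category in HIGH_RISK_MERCHANT_CATEGORIES:
        flags.append("high_risk_merchant")
    # Transfers to a jurisdiction of concern for child sexual exploitation
    if tx.receiver_country in JURISDICTIONS_OF_CONCERN:
        flags.append("jurisdiction_of_concern")
    return flags
```

In practice, alerts would typically be generated only when several of these flags recur for the same customer over time, rather than from a single transaction.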
Vulnerable Jurisdictions
In the digital world, any person from any location can create and store sexually exploitative content. Child sex offenders may even livestream sexual abuse from the confines of their homes, directing on-demand abuse of children far away. However, China, Pakistan, Egypt, Vietnam, Argentina, Russia, the United Arab Emirates, Jamaica, Nigeria, Kazakhstan, Australia, the U.S., Thailand, Colombia, the Philippines, Ukraine, Romania, Ghana and the Dominican Republic are among the countries that pose online safety risks for children.6 Undeniably, the U.S. is home to the world's highest number of data centers and secure internet servers with fast, reliable networks, which makes it a top choice for CSAM hosting sites.
The Victims
Victims of CSE share some common emotional and behavioral indicators: difficulty setting boundaries; shame, blame and guilt; post-traumatic stress disorder; depression; obesity; reliance on alcohol/drugs and eating disorders; and low self-esteem.
Grooming a child may involve a close relationship between the offending adult, the targeted child and, potentially, the child's caregivers. Often, the offenders are highly regarded and trusted within the community. In general, the grooming process may be divided into six distinct stages:
- Targeting the child: During this stage, the predator pays special attention to the targeted child to get close to them. They exploit gaps (e.g., lack of parental oversight, isolation, marital discord between the parents, disputes/chaos, separation, divorce) to gain insight into the child's unfulfilled emotional needs.
- Gaining the child's trust (and caregivers' where applicable): This is done by having a 360-degree understanding of the child's needs and figuring out ways to fulfill those needs that may have been missed/ignored by parents.
- Filling the need of the child: Once the child's special need is identified, the predator extends tempting offers to fulfill that need, often using gifts or money as tactics.
- Isolating the child: The predator may create situations in which they and the targeted child can be alone with each other. They often try to make the child believe that their parents do not care as much for them as the predator does.
- Bringing sensuality/eroticism to the relationship: Once the predator achieves their goal of gaining trust along with emotional dependence, the offender exploits the child's natural curiosity about the unknown (e.g., details about the physical relationships of parents and other people, revealing or nude photos of the human body), information or material that is usually kept from the child by parents.
- Maintaining secrecy and control over the victim: Once the offender begins sexual misconduct with the targeted child, they blame the child, insisting the fault was the child's, and threaten the child to maintain secrecy. At some point, the child may start believing that the consequences of exposing the relationship would be more humiliating than continuing the abusive relationship.
Taking Appropriate Steps
When a child is identified as vulnerable (through intuition or obvious unusual behaviors) or shows signs of being groomed, parents may take the following steps:7
- Establish house/family rules about when, how and to what extent adults (other than the parents) may engage with their children. When a boundary seems to be crossed, the offending adult may be confronted or warned (if necessary).
- Children should be taught about body autonomy (and private parts) and be assured that there is nothing wrong with saying "no" to adults when their touches seem uncomfortable.
- Nothing can ever replace the benefits of having open conversations with the child. It is critical to let them know that their parents are always there for them no matter how hectic the day or situation may be.
- Suspected adult(s) should be informed that unexpected drop-ins, taking the child out or spending time alone with the child are not allowed and are being strictly monitored.
- It is vital that parents have control over their child's devices (cell phones and tablets, including iPads) and knowledge of the applications being used (e.g., Discord) to check with whom their child is communicating. Limits (in terms of usage and accessibility to portals/applications) may also be established. Several parental control apps are available to facilitate establishing these limits.
Conclusion
The emotional, societal and economic implications (e.g., the judicial, social services and counseling costs) of CSE are huge; however, translating a child's pain into a dollar value may only undermine a sensitive understanding of the issue. Ensuring a safe online environment for our children is our shared responsibility. CSE impacts the lives of far too many children, adolescents, families and communities. When it comes to the role of social media in CSE, we parents must take responsibility rather than simply shift the blame to social media or private messaging platforms. If X decides to restore accounts that were identified as sharing CSAM and other artificial intelligence-generated harmful content, then as parents we need to do our part to protect our children from predators. Social media platforms will rarely prioritize a moral calling over earning revenue, but parents are not focused on revenue when their children's safety is at stake. We need to be more conscious of the threat and more vigilant about dangerous online content. Ensuring a safe environment for our future generations means everyone needs skin in the game. The cry of a child sounds the same, whether it comes from a CSE victim or from a wounded, near-death child hit by senseless carpet bombing or starved in a conflict zone. No "double standard" or "motivation toward profit at the cost of human lives" applies.
Ahsan Habib, AFC practitioner in Canada
- Nicole Lin Chang, "A million child sex abuse images get 'digital fingerprints' to prevent online sharing," Euronews.next, June 7, 2022, https://www.euronews.com/next/2022/06/07/a-million-child-sex-abuse-images-get-digital-fingerprints-to-prevent-online-sharing
- "Why language matters: why we should never use 'child pornography' and always say child sexual abuse material," NSPCC Learning, January 30, 2023, https://learning.nspcc.org.uk/news/why-language-matters/child-sexual-abuse-material
- "Social Media Is Accelerating the Spread of Child Sexual Abuse Material," Giving Compass, November 30, 2022, https://givingcompass.org/article/social-media-is-accelerating-the-spread-of-child-sexual-abuse-material
- David Thiel, Renee DiResta and Alex Stamos, "Addressing the distribution of illicit sexual content by minors online," Stanford University Cyber Policy Center, June 6, 2023, https://cyber.fsi.stanford.edu/news/addressing-distribution-illicit-sexual-content-minors-online
- Kevin Poireault, "Nigerian 'Yahoo Boys' Behind Social Media Sextortion Surge in the US," Infosecurity Magazine, January 29, 2024, https://www.infosecurity-magazine.com/news/nigerian-yahoo-boys-social-media/
- "Ending Online Child Sexual Exploitation and Abuse," United Nations Children's Fund (UNICEF), December 2021, https://www.unicef.org/sites/default/files/2021-12/Ending%20online%20child%20sexual%20exploitation%20and%20abuse_December%202021_0.pdf
- Isha Bhargava, "Child luring and exploitation through Snapchat is on the rise. Here's what you should look out for," CBC News, January 24, 2023, https://www.cbc.ca/news/canada/london/child-luring-and-exploitation-through-snapchat-is-on-the-rise-here-s-what-you-should-look-out-for-1.6722978