“We are not going to prosecute our way out of the epidemic of child pornography on the internet. Industry — which has benefited so much from the unfettered flow of content — must take responsibility for protecting children from the posting of child sex abuse images on its platforms.” (Carol Hepburn, in Canadian Centre for Child Protection, 2019a, p. 5)
Introduction
Child sexual abuse has been well documented throughout recorded history (Vieth, 2018). However, the recording of child sexual abuse material (CSAM) has only been possible since the invention of the camera in the late 19th century (Binford et al., 2014). Criminalization of child sexual abuse, limited access to CSAM, and public revulsion toward child sex offenders served to limit the trade of child sexual abuse images (Binford et al., 2014). With the birth and maturation of the digital age, technology has obliterated these social restraints, offering a tsunami of available images to invisible and undetectable virtual criminal consumers.
In 2018 alone, U.S. tech companies reported 45 million CSAM images to law enforcement (Keller & Dance, 2019b). Law enforcement officials, overrun by the deluge of reports, have necessarily limited investigative resources to CSAM cases involving infants and young children (Barbaro, 2020; Keller & Dance, 2019b).
Like U.S.-based policy, the Canadian Criminal Code prohibits the making, distribution, possession, and accessing of child sexual abuse material (Government of Canada, 2020). A further Canadian statute, which received royal assent in 2011, requires “the mandatory reporting of Internet child pornography by persons who provide an Internet service” (Government of Canada, 2011). This federal act states that if tech companies are advised that CSAM is being made available through the course of their service provision, they have a legal duty to report it to law enforcement and to the Canadian Centre for Child Protection (herein referred to as The Canadian Centre or The Canadians). The current impetus for tech companies to take down CSAM arises only if the images meet the criminal law threshold for child pornography (Canadian Centre for Child Protection, 2019b; Government of Canada, 2011). The implications of this federal policy, tech companies’ adherence to the act, gaps in the policy, and a framework for next steps reside herein.
Overview of CSAM as a child welfare issue
The making, distribution, possession, and accessing of CSAM is a crime under the Criminal Code of Canada (Government of Canada, 2020); yet Canadian internet hosting services are overrun with images (Canadian Centre for Child Protection, 2019a). Law enforcement is so overwhelmed that it must selectively investigate the most egregious cases, those involving the youngest and most vulnerable victims: preverbal children (Keller & Dance, 2019b). The volume of reported images has increased forty-five-fold over roughly the past decade and doubled compared to the year prior (Cordua, 2020; Keller & Dance, 2019b).
Pedophiles are universally reviled by the general population (Salter, 2003). In his Four Preconditions Model of sexual offending, Finkelhor (1984) submits that offenders must overcome certain preconditions, such as the stigma and shame of criminal conduct, in order to sexually offend. The internet affords the most reviled offenders a platform to shamelessly congregate, celebrate, and showcase their egregious criminality against the most vulnerable of victims. Offenders are often cheered on by their predatory peers, gaining esteem for fresh abusive acts and notoriety for novel violence (Cordua, 2020; Thompson, 2020). Live streaming of sexual abuse has become commonplace content and e-currency (Binford et al., 2014; IWF, 2018).
The scope of CSAM
The Canadian Centre operates Cybertip.ca, Canada’s tip line and reporting “front door” for CSAM (Canadian Centre for Child Protection, 2017). Launched in 2002, the tip line now fields more than 10,000 calls per month, 98% of which are related to CSAM (Canadian Centre for Child Protection, 2017, 2020). Using Project Arachnid, a crawling tool that detects CSAM images on the web, The Canadians now process approximately 100,000 reports per month (Canadian Centre for Child Protection, 2019a). For scale, in the United States, the National Center for Missing and Exploited Children (NCMEC) receives over 1 million CSAM reports per month (Canadian Centre for Child Protection, 2017).
The personal cost of CSAM for victims
In a recent TED Talk, Julie Cordua (2020) of THORN, a U.S. NGO, observes: “It is clear that abusers have been quick to leverage new technology, but our response as a society has not.” The first cohort of CSAM victims to have their abuse images recorded and/or disseminated on the internet has now survived to adulthood (Canadian Centre for Child Protection, 2017). In a recent survivor survey conducted by The Canadian Centre, evidence suggests that these victims experience additional compounding harms from image-based trauma compared to their abused, but non-recorded, peers (Canadian Centre for Child Protection, 2017). Survivor self-reports suggest irreparable lifetime traumatization, increased risk of suicidality, lack of control or capacity to remove CSAM from public circulation, fear of being recognized from CSAM, and real-time realization of those fears (Binford et al., 2014; Canadian Centre for Child Protection, 2017; Wolak et al., 2005).
The gain of CSAM for offenders
On both the open and dark web, there is a thriving offender community complete with its own economy and e-currency: criminal content (Cordua, 2020). A UN Special Rapporteur report (2009) indicates that as many as 750,000 predators are online at any given point in time. Global CSAM coffers are profitable to offenders not only visually but also fiscally, with estimated market values between $3 and $20 billion (U.N., 2009). The exponential growth of CSAM on the open and dark web ensures that this form of borderless criminality is thriving locally, nationally, and internationally (BBC, 2014; Keller & Dance, 2019b). Tech companies have the legal responsibility to report and the technological capacity to take down; yet corporate apathy remains the overwhelming response (Keller & Dance, 2019b). Why the corporate foot-dragging and noncompliance?
The inconsistent response of tech companies
Digital imaging technology has been used to irreparably harm innumerable children; it can, however, also be used to detect and take down CSAM. Stakeholders have developed technology (PhotoDNA) that detects CSAM by forming a fingerprint of each image, called a hash (Binford et al., 2014). Yet having developed the technology to detect images, the same tech companies appear reluctant to remove them (Canadian Centre for Child Protection, 2019b). In some cases, tech companies do not even review the content on their servers, citing privacy concerns (Keller & Dance, 2019a). Shockingly, some search engines on the open web (Bing, DuckDuckGo, Yahoo) actually recommend CSAM search terms (Constine, 2019).
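To make the hash-matching idea concrete, the minimal Python sketch below flags files whose fingerprints appear on a verified list; the KNOWN_HASHES set and the file paths are hypothetical stand-ins for a hotline’s database. Note that PhotoDNA itself is a proprietary perceptual hash designed to survive resizing and re-encoding, whereas the exact cryptographic hash used here matches only bit-identical files, so this illustrates the matching workflow rather than PhotoDNA’s actual algorithm.

```python
import hashlib
from pathlib import Path

# Hypothetical set of fingerprints for images already verified by hotline
# analysts. A real system would use robust perceptual hashes (e.g., PhotoDNA);
# SHA-256 is used here only to illustrate the lookup workflow.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def fingerprint(path: Path) -> str:
    """Compute an exact SHA-256 fingerprint of a file's bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def is_known_image(path: Path) -> bool:
    """Return True if the file's fingerprint matches the verified list."""
    return fingerprint(path) in KNOWN_HASHES
```

The key design point is that platforms never need to store or view the abusive content itself: comparing fingerprints against a shared list is sufficient to flag known material for reporting and removal.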
The threshold of abuse: See no evil
Over 90% of the 45 million detected and reported images originated from Facebook Messenger (Keller & Dance, 2019a), which is slated to become end-to-end encrypted, a change that would eliminate the means of mass detection (Keller & Dance, 2019a). If Facebook and its peers cannot see CSAM, a legal loophole forms: they cannot be required to report what they do not detect. Additionally, unless the abusive images meet the threshold of child pornography in their country, tech companies are not obligated to remove them (Canadian Centre for Child Protection, 2019a; Fehr, 2019).
The untimeliness of tech takedowns
In the past three years, The Canadians have created Project Arachnid, a computer program that crawls the open and dark web to detect CSAM, and have issued nearly 5 million takedown notices to tech companies (Canadian Centre for Child Protection, 2019b). When The Canadians send a takedown notice to any of the 400 companies that host CSAM, the top ten percent of respondents comply within a day; the lowest ten percent take a leisurely two weeks (Canadian Centre for Child Protection, 2019b). While there are enforceable federal penalties for failing to report known CSAM, there is no such policy impetus for tech companies to remove CSAM in a comprehensive or timely fashion (Canadian Centre for Child Protection, 2019b).
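The disparity in response times can be made concrete with a small sketch. Assuming a hypothetical record format for issued notices (the field names and hosts below are invented for illustration and do not reflect Arachnid’s actual data model), time-to-takedown per notice can be measured as follows:

```python
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical notice records: when a takedown notice was issued and when
# the hosting provider confirmed removal (None if the content is still up).
notices = [
    {"host": "host-a.example", "issued": datetime(2019, 5, 1), "removed": datetime(2019, 5, 1, 18)},
    {"host": "host-b.example", "issued": datetime(2019, 5, 1), "removed": datetime(2019, 5, 15)},
    {"host": "host-b.example", "issued": datetime(2019, 5, 3), "removed": None},
]

def time_to_takedown(notice: dict) -> Optional[timedelta]:
    """Elapsed time between notice issuance and confirmed removal."""
    if notice["removed"] is None:
        return None  # no confirmed removal yet; nothing to measure
    return notice["removed"] - notice["issued"]

for n in notices:
    delta = time_to_takedown(n)
    print(f"{n['host']}: {delta if delta is not None else 'STILL ONLINE'}")
```

Tracking this metric per host is precisely what would allow a regulator to enforce a takedown deadline, which is the policy gap the paragraph above identifies.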
Policy description & evaluation
The current Canadian policy mandates that all internet service providers follow the parameters outlined by the statute. The federal government partners with key stakeholders (The Canadian Centre for Child Protection, Public Safety Canada, the Department of Justice, the Royal Canadian Mounted Police) and conducts ongoing oversight of the federal policy’s outcomes. As part of the National Strategy for the Protection of Children from Exploitation on the Internet, the long-term goal of mandatory reporting and subsequent takedown of CSAM by tech companies is to protect children and to reduce or eliminate the online predatory pipeline of CSAM (Public Safety Canada, 2015). The federal government provides millions of dollars in annual funding to invested Canadian stakeholders for the implementation of these policies in accordance with the National Strategy (Public Safety Canada, 2015). The federal act is expected to remain in place indefinitely, although its effectiveness has already been called into question (Canadian Centre for Child Protection, 2019a; Public Safety Canada, 2015).
The goals of the federal act are legal, just, and democratic: they hold to account tech companies that have for far too long enjoyed the profits of the unregulated underworld of the open and dark webs, while contributing to the protection of children and the prevention of image-based exploitation. Reporting and removing CSAM from the internet is in keeping with the values of safeguarding children, protecting their privacy, and honouring their human dignity (Canadian Centre for Child Protection, 2019a).
Policy recommendations
There is evidence to suggest that the public at large remains virtually unaware of the dangers of online child exploitation (Public Safety Canada, 2015). The safety and security of children underpins this federal act as well as other congruent and concurrent policy initiatives (Public Safety Canada, 2015). The exponential proliferation of CSAM has outpaced the necessary changes in public policy. Child pornography laws have been followed by mandatory reporting laws for internet providers, but not by robust enforcement of timely and comprehensive image takedown. Built into the statutes for internet service providers are consequences for companies that are non-compliant reporters. Unfortunately, the statutes do not outline or define takedown parameters for the internet industry.
The Canadian Centre reports that tech companies fit into one of the following categories: proactively seeking to take down CSAM; reactive to takedown requests but not actively seeking to prevent CSAM from being disseminated on their hosting sites; resistant; non-compliant; and, finally, actively complicit (Canadian Centre for Child Protection, 2019a). Advocates and altruistic stakeholders alike are seeking consistent, proactive action and accountability from tech companies (Canadian Centre for Child Protection, 2019a). These actions include, but are not limited to: using readily available technology tools such as PhotoDNA (Keller & Dance, 2019a); creating new detection tools; partnering with other industry stakeholders; and actively preventing minors from using end-to-end encrypted services on their platforms (NCMEC, 2020).
A forward-thinking framework
As a global leader in combating CSAM, The Canadians have proposed a principled and cogent industry framework that sets the standard for tech companies. This framework includes the immediate removal of ALL sexual abuse material related to an abusive incident or series of incidents. The Canadians further clarify that this spectrum of CSAM includes images which “often do not meet criminal law definitions but are still part of the continuum of abuse” (Canadian Centre for Child Protection, 2019b).
Of growing concern to the public are “nude or partially nude images of children that have been made publicly available (typically stolen from unsecured social media accounts or secretly taken images), AND are used in a sexualized context or connected to sexual commentary” (Canadian Centre for Child Protection, 2019b). Although not covered under the definition of child pornography, such material is nevertheless used as part of the compendium of abuse, and the Canadian framework rightly requests that industry stakeholders also remove “images/videos of children being physically abused, tortured, or restrained” (Canadian Centre for Child Protection, 2019b). The Canadians offer common-sense recommendations in an industry where common sense, like offenders, is increasingly invisible.
The tech “industry must act on removal notices without subjectivity or unevenness when notified by a trusted/verified hotline, which includes internet providers denying services to those negligent or complicit in the online availability of child sexual abuse images” (Canadian Centre for Child Protection, 2019b). Legislative bodies must center policy on the best interests and needs of children — not the ease and efficiencies of tech companies (Canadian Centre for Child Protection, 2019b). As a global society, with a global pandemic of CSAM, we “must demand change” (Canadian Centre for Child Protection, 2019b).
Indeed — we must.
“From its earliest days, the internet has been weaponized against children around the world. From its earliest days, the technology sector had been negligent in ensuring that their platforms are not used to post child sexual abuse images. From its earliest days, the technology sector has profited while turning a blind eye to the horrific action of millions of their users around the world. This shameful behavior must end. We must reclaim our online communities and hold the technology sector responsible for their actions and lack of action. With the emphasis where it belongs, on the young victims, the Canadian Centre for Child protection is taking the long-needed steps to reframe the problem and the solution.” (Dr. Hany Farid, in Canadian Centre for Child Protection, 2019a, p. 3)
References
Binford, W., Giesbrecht-McKee, J., Savey, J., & Schwartz-Gilbert, R. (2014). Beyond Paroline: Ensuring meaningful remedies for child pornography victims at home and abroad. SSRN Electronic Journal, 35(2). https://doi.org/10.2139/ssrn.2481515
Canadian Centre for Child Protection. (2017). Survivors survey: Executive summary.
Canadian Centre for Child Protection. (2019b). How we are failing children: Project arachnid: The data.
Canadian Centre for Child Protection. (2020). Programs & initiatives. https://www.protectchildren.ca/en/programs-and-initiatives/
Constine, J. (2019). Microsoft Bing not only shows child sexual abuse, it suggests it. TechCrunch. https://techcrunch.com/2019/01/10/unsafe-search/
Cordua, J. (2020). How we can eliminate child sexual abuse material from the internet [TED Talk]. https://www.thorn.org/
Fehr, C. (2019). A proposal for police acquisition of ISP subscriber information on administrative demand in child pornography investigations. Canadian Criminal Law Review, 24(2), 1–7.
Finkelhor, D. (1984). Four preconditions: A model. In Child Sexual Abuse: New Theory and Research (pp. 63–68). The Free Press.
Government of Canada. (2011). An act respecting the mandatory reporting of internet child pornography by persons who provide an Internet service. Justice Laws Website. https://laws-lois.justice.gc.ca/eng/acts/I-20.7/page-1.html
Government of Canada. (2020). Criminal code. Justice Laws Website. https://laws-lois.justice.gc.ca/eng/acts/c-46/section-163.1.html
IWF. (2018). Trends in online child sexual exploitation: Examining the distribution of captures of live-streamed child sexual abuse. Internet Watch Foundation. https://www.iwf.org.uk/sites/default/files/inline-files/Distribution of Captures of Live-streamed Child Sexual Abuse FINAL.pdf
Keller, M. H., & Dance, G. J. X. (2019a). Child abusers run rampant as tech companies look the other way. The New York Times. https://www.nytimes.com/interactive/2019/11/09/us/internet-child-sex-abuse.html
Keller, M. H., & Dance, G. J. X. (2019b). The internet is overrun with images of child sexual abuse. What went wrong? The New York Times. https://www.nytimes.com/interactive/2019/09/28/us/child-sex-abuse.html
NCMEC. (2020). End-to-end encryption. Statement Regarding End-To-End Encryption. http://www.missingkids.org/blog/2019/post-update/end-to-end-encryption
Thompson, L. A. (2020). Mod 6 paper: Chad 540.
U.N. (2009). Special rapporteur report.
Wolak, J., Finkelhor, D., & Mitchell, K. (2005). Child-pornography possessors arrested in internet-related crimes. National Center for Missing and Exploited Children.