28 November 2013
Siobhan Weare and Suzanne Ost examine the recent blocking of online child sexual abuse images by Google and Microsoft. They agree that although this is clearly a step in the right direction in attempting to combat such abuse, it is one which only scrapes the surface.

The recent news that Google and Microsoft have agreed to block child abuse images online has been presented as a positive step forward in the fight against child sexual abuse and exploitation. The measures introduced by Google and Microsoft prevent illegal child abuse images and videos from appearing in the results of around 100,000 search terms, instead triggering warnings that such images are illegal. The move has been hailed by the Prime Minister, David Cameron, as constituting ‘significant progress’.

Pressure for the introduction of these measures increased after Stuart Hazell and Mark Bridger were convicted of the murders of Tia Sharp and April Jones respectively, with the suggestion that there was a link between their viewing of abusive images online and their subsequent perpetration of violence against children. However, the relationship between viewing child sexual abuse images and child sexual abuse offending is a tenuous one at best, with diverging opinions in the research on whether such a causal connection actually exists.

It is clear that not all perpetrators of child sexual abuse view child abuse images online, and vice versa. Indeed, child sexual abuse can be perpetrated in a variety of contexts that do not involve the viewing of such images by perpetrators (see S. Ost, Child Pornography and Sexual Grooming: Legal and Societal Responses (Cambridge University Press, 2009), 39-46). So, for example, a recent inquiry conducted by the Office of the Children’s Commissioner for England found that 2,049 youngsters were known to be victims of child sexual exploitation by groups of perpetrators, and a two-year study released by the University of Bedfordshire found that sexual exploitation, such as rape, was used by group members as ‘a form of weaponry’ (see further ‘Group Localised Grooming’, a forthcoming paper in the Child and Family Law Quarterly by J. Mooney and S. Ost).

Equally, it is clear that sexual abuse can occur as a result of viewing adult pornography, rather than child sexual abuse images. Newport Crown Court recently heard how a 12-year-old boy admitted to three counts of raping his younger sister after he tried to re-enact scenes from an adult porn film. Thus, the blocking of online child sexual abuse images arguably does little to combat these particular forms of sexual exploitation and abuse.

Despite being a clear step in the right direction, such measures only scrape the surface when attempting to combat child sexual abuse and exploitation online. Jim Gamble, former head of the Child Exploitation and Online Protection Centre (CEOP), warned that this step would not solve the problem but would instead only ‘mask the symptoms’. He noted that child sexual abusers do not use traditional search engines to source abuse images, but rather use so-called ‘dark corners of the internet’, such as peer-to-peer file-sharing networks. These areas of the ‘dark internet’ are not catalogued by search engines such as Google and are often password protected, granting access only to users connecting from certain addresses or to those invited to join.

Catching those who make use of the ‘dark internet’ to share images online is one of the biggest challenges, with law enforcement agencies playing a game of cat and mouse to try to keep up with and out-manoeuvre the individuals and groups who are adept at keeping their online criminal activities underground and undetected. In fact, cynics might argue that demanding the step taken by Google and Microsoft is an easy way for the government to be seen to be ‘doing something’ to tackle child sexual abuse and exploitation online when faced with the difficulties of tackling the much bigger problem.

Nonetheless, blocking images of child sexual abuse online is a clear attempt to address the harms of such abuse, invoking the common-sense notion that fewer images must mean fewer victims. Although such a statement may have some validity, blocking these images does not deal with the underlying causes of and issues surrounding child sexual abuse. As one person commenting on a BBC News report put it: ‘Do you really think the abusers will stop the abuse just because it’s harder to post an image online?’ Undoubtedly, combatting child sexual abuse at its source is the most effective way ultimately to tackle the distribution of abusive images online, something which Google and Microsoft cannot do and, indeed, should not be responsible for. This is the responsibility of both the government and society more widely. We would suggest that:

  • More resources need to be invested in educating parents and children on the signs of child sexual exploitation and abuse.
  • In particular, more education is needed on the risks of sexual exploitation and abuse faced by children in offline contexts.
  • There is a need for more societal debate regarding the potential risks caused by children accessing and sharing adult pornography on the internet and sexting (see A.A. Gillespie, ‘Adolescents, Sexting and Human Rights’ (2013) 13 Human Rights Law Review 4).

It is, therefore, not enough simply to block these online images of sexual abuse. However, this is a measure which, alongside other more effective approaches such as those listed above, has the potential to help reduce both the incidence of, and harms caused by, child sexual abuse and exploitation.

Siobhan Weare is a PhD Candidate in the Law School at Lancaster University. Her research interests are focused in the areas of criminal law and criminal justice. She has recently published a paper on the socio-legal responses to and constructions of women who kill. She is also researching in the area of child sexual abuse and exploitation, with a particular focus on non-ideal perpetrators of such abuse, for example women and children.

You can find out more about Siobhan's research at http://www.lancaster.ac.uk/fass/law/profiles/siobhan-weare

Suzanne Ost is Professor of Law in the Law School at Lancaster. She researches in the areas of child sexual exploitation and medical law. She is the author of Child Pornography and Sexual Grooming: Legal and Societal Responses. She has recently co-authored a paper on group-localised grooming with Jamie-lee Mooney and is currently working on a paper exploring how victims of child pornography should receive reparation.

You can find out more about Suzanne's research at http://www.lancaster.ac.uk/fass/law/profiles/suzanne-ost