Law and the Internet: Regulating Cyberspace
eds. Lilian Edwards and Charlotte Waelde
Published by Hart Publishing, 1997

Chapter 13: Governance of Pornography and Child Pornography on the Global Internet: A Multi-Layered Approach

By Yaman Akdeniz *

The full citation for this chapter is: Akdeniz, Yaman "Governance of Pornography and Child Pornography on the Global Internet: A Multi-Layered Approach," in Edwards, L and Waelde, C eds, Law and the Internet: Regulating Cyberspace, Hart Publishing, 1997, pp 223-241.

Ph.D. Student at the Centre for Criminal Justice Studies, Law Faculty, University of Leeds, Leeds LS2 9JT. E-mail: lawya@leeds.ac.uk. For further information see http://www.cyber-rights.org/yamancv.htm

Copyright © 1997-2000, Yaman Akdeniz.

Table of Contents

Governance of Pornography and Child Pornography on the Global Internet: A Multi-Layered Approach

Introduction
The availability of pornographic content on the Internet
The governance of the Internet
Overview of UK pornography laws
Obscene Publications Acts 1959 and 1964
Child pornography
UK child pornography laws
Protection of Children Act 1978
Section 160 of the Criminal Justice Act 1988
Operation Starburst
Possession offences
Distribution offences
Fellows and Arnold: The Birmingham University Case
US attempts to regulate the Internet - the Communications Decency Act 1996 (CDA)
Legal challenges to the CDA
Developments within the European Union
Responsibility of Internet Service Providers (ISPs)
Self-regulation by ISPs - the Internet Watch Foundation
UK police censorship of Internet newsgroups
Technical solutions and rating systems
Parental control software
Conclusion


Governance of Pornography and Child Pornography on the Global Internet: A Multi-Layered Approach

Introduction

How pornography should be regulated is one of the most controversial topics to have arisen in relation to the Internet in recent years. The widespread availability of pornography on the Internet has stirred up a ‘moral panic’(1) shared by the government, law enforcement bodies such as the police, prosecutors and judges, and the media in general.(2)

Governments and law enforcement bodies all around the world have made many attempts to limit the availability of pornographic content on the Internet. While the US Government introduced the Communications Decency Act 1996 (‘CDA’), the UK police attempted to censor Usenet discussion groups allegedly carrying child pornography in the summer of 1996. Both attempts were criticised, and the US Supreme Court struck down the CDA in June 1997.

There is no settled definition of pornography, either in the United Kingdom itself, or in the multi-national environment of the Internet, where cultural, moral and legal variations all around the world make it difficult to define ‘pornographic content’ in a way acceptable to all. What is considered simply sexually explicit but not obscene in England may well be obscene in many other countries; conversely what is considered lawful but not pornographic in Sweden may well be obscene under the current UK legislation.

This chapter will discuss two different issues: the regulation of potentially harmful content such as pornography on the Internet; and the regulation of invariably illegal content such as child pornography. These issues are different in nature and should not be confused. It is the submission of this paper that any regulatory action intended to protect a certain group of people, such as children, should not take the form of an unconditional prohibition on using the Internet to distribute certain content where that content is freely available to adults in other media.

Before explaining the possibilities of how to govern the availability of ‘pornographic content’ on the global Internet, I will briefly discuss how and in what form these materials are available on the Internet.

 

The availability of pornographic content on the Internet

Pornography on the Internet is available in different formats. These range from pictures and short animated movies, to sound files and stories. Most of this kind of pornographic content is available through World Wide Web (‘WWW’) pages, but it is sometimes also distributed through an older communication process, Usenet newsgroups. The Internet also makes it possible to discuss sex, see live sex acts, and arrange sexual activities(3) from computer screens. There are also sex related discussions on Internet Relay Chat (‘IRC’) channels, where users in small groups or in private channels exchange messages and files. But as with the Web and the Usenet, only a small fraction of IRC channels are dedicated to sex. There are more than 14,000 Usenet discussion groups all around the world but only around 200 groups are sex related, and some of these host socially valuable and legitimate discussions concerning, eg, homosexuality or sexual abuse.

 

The governance of the Internet

If illegal and harmful content on the Internet needs to be regulated then the question is: how should this be achieved? Despite the popular perception, the Internet is not a ‘lawless place.’(4) Rather the Internet ‘poses a fundamental challenge for effective leadership and governance.’(5) Walker states that:

‘In the current stage of modern, or post-modern society, one can expect a trend towards ‘governance’ rather than the ‘government’, in which the role of the nation state is not exclusive but may need further sustenance by the activation of more varied levels of power at second hand.’(6)

According to Reidenberg, laws, regulations, and standards will affect the development of the Internet, and this is equally true of the self-regulatory solutions introduced to deal with the availability of pornographic content on the Internet. Reidenberg states that:

‘Rules and rule-making do exist. However, the identities of the rule makers and the instruments used to establish rules will not conform to classic patterns of regulation.’(7)

The Internet is a complex, anarchic, and multi-national environment where old concepts of regulation, reliant as they are upon tangibility in time and space, may not be easily applicable or enforceable. This is why the wider concept of governance may be more suitable. According to Walker, ‘social regulation within modern society has developed within physical bounds of time and space, but the development of cyberspace distanciates its inhabitants from local controls and the physical confines of nationality, sovereignty and governmentality leading to new possibilities in relationships and interaction.’(8) The idea of ‘governance without government’ may be the best approach for the development of the Internet. But ‘if such mechanisms of international governance and re-regulation are to be initiated then the role of nation states is pivotal.’(9)

There appears to be no single solution to the regulation of illegal and harmful content on the Internet because, for example, the exact definition of offences such as child pornography varies from one country to another and also what is considered harmful will depend upon cultural differences. A recent European Commission Communication Paper stated that ‘each country may reach its own conclusion in defining the borderline between what is permissible and not permissible.’(10) The multi-layered governance system should be a mixture of national and international legislation, and self-imposed regulation by the ISPs and on-line users. This should include codes of conduct by the ISPs, software filters to be used by parents, advice to parents and school teachers, hotlines and special organisations to report illegal content on the Internet.

Governance theorists are beginning to recognise that ‘objects of governance are only known through attempts to govern them’(11) and ‘governance is not a choice between centralisation and decentralisation. It is about regulating relationships in complex systems,’(12) and the global Internet does provide a great challenge for governance. The following headings will try to address the issues arising from the multi-layered approach to the governance of ‘pornographic content’ on the Internet.

 

Overview of UK pornography laws

This section concentrates mainly on those aspects of UK law relating to obscenity which have particular reference to the Internet. UK obscenity legislation has recently been amended by the Criminal Justice and Public Order Act 1994 (‘CJPOA 1994’) to deal with the specific problem of Internet pornography.(13) The following will show, however, that there are difficulties with the application of existing national laws to a medium such as the global Internet which does not have any borders.

Obscene Publications Acts 1959 and 1964

These two statutes constitute the major legislation to combat pornographic material of any kind in the UK. Section 1(1) of the 1959 Act provides that ‘an article shall be deemed to be obscene if its effect or the effect of any one of its items is, if taken as a whole, such as to tend to deprave and corrupt persons who are likely, having regard to all relevant circumstances, to read, see or hear the matter contained or embodied in it.’(14)

Under Section 2(1) of the Obscene Publications Act (‘OPA’), it is an offence to publish an obscene article or to have an obscene article for publication for gain. Section 1(3) of the 1959 Act makes it clear that the ‘articles’ contemplated were such items as computer disks; however, most of the pornography on the Internet is now transferred electronically from one computer to another using telephone lines and modems rather than via any tangible medium such as disks. This left a possible lacuna in section 1(3), OPA 1959, but it has now been plugged by the CJPOA 1994, which amended the meaning of "publication" in that section, so that electronic transmission of pornographic material is now clearly covered by the 1959 Act as amended. When A sends B pornographic pictures attached to an e-mail, this electronic transmission will be a publication covered by the Act.(15)

Section 1(2) of OPA 1964 makes it an offence to have an obscene article in ownership, possession or control with a view to publishing it for gain. Following the amendments made by CJPOA 1994, this would even apply when A simply makes the data available to be transferred or downloaded electronically, by providing a password to B, so that B can access the materials and copy them.(16)

 

Child pornography

The main concern of legislators and parents in relation to Internet content is child pornography, rather than other forms of pornographic content. This has been the case ever since paedophiles started to use the Internet for circulating pornographic materials related to children.(17) Paedophiles can be seen as a minority sexual group, with their own form of expression explicitly involving fantasies and imaginings about sex with children. But while it is often argued that pornography should not be proscribed on the basis of freedom of speech arguments, there is a general consensus that the line should be drawn at child pornography. In most cases, child pornography is a permanent record of the sexual abuse of an actual child (except in the case of pseudo-photographs, which are discussed below). An understanding of the special way in which child pornography is child abuse is crucial to an understanding of the whole problem of child pornography.

 

UK child pornography laws

Protection of Children Act 1978

The 1978 Act was passed in response to the growing problem of child pornography. Its main purpose was to close some potential gaps in the measures available to police and prosecutors.(18) The definition of "photograph" given in section 7(4) of the 1978 Act was extended to include photographs in electronic data format following the amendments made by section 84(4) of the Criminal Justice and Public Order Act 1994 (CJPOA 1994).

The CJPOA 1994 introduced the concept of ‘pseudo-photographs’ of children. Pseudo-photographs appear to be photographs, but they are created by computer software manipulating one or more pre-existing pictures. For example, a child’s face can be superimposed on an adult’s body, or onto another child’s body, with the characteristics of the body altered to create pornographic computer-generated images without the involvement of a real child. It is now an offence "for a person to take, or permit to be taken or to make, any indecent photographs or pseudo-photographs of a child; (or) to distribute or show such indecent photographs or pseudo-photographs" under section 1 of the 1978 Act.

The UK police believe that the creators or possessors of pseudo-photographs will end up abusing children, so the purpose of the new legislation may be seen as to criminalise acts preparatory to abuse,(19) and also to close possible future loopholes in the prosecution of such cases, as it may be very difficult to separate a pseudo-photograph from a real photograph.(20)

Although pseudo-photographs can be created without the involvement of real children, there is a justifiable fear that harm to children is associated with all child pornography. The Williams Committee stated:

‘Few people would be prepared to take the risk where children are concerned and just as the law recognises that children should be protected against sexual behaviour which they are too young to properly consent to, it is almost universally agreed that this should apply to participation in pornography.’(21)

On the other hand, there are arguments that pseudo-photographs are not harmful. The children involved in child pornography may suffer physical or mental injury, but with pseudo-photographs, the situation is quite different. These photographs are created only by the use of computers. There is no involvement of children in production and there is no direct harm to children in their use. However there is substantial evidence that photographs of children engaged in sexual activity are used as tools for the further molestation of other children,(22) and photographs or pseudo-photographs will be used interchangeably for this purpose.(23)

Section 160 of the Criminal Justice Act 1988

Under section 160 of the 1988 Act as amended by section 84(4) of the CJPOA 1994, it is an offence for a person to have an indecent photograph or pseudo-photograph of a child in his possession. This offence is now a serious arrestable offence, with a maximum penalty of six months’ imprisonment. It has been successfully used in its new form in recent cases involving possession of child pornography.

 

Operation Starburst

In July 1995, the British police were involved in Operation Starburst, an international investigation of a paedophile ring which used the Internet to distribute graphic pictures of child pornography. Nine British men were arrested as a result of the operation, which also involved arrests in Europe, America, South Africa and the Far East. The operation identified 37 men worldwide.(24)

 

Possession offences

As a result of Operation Starburst, many cases of simple possession offences were brought to court. Christopher Sharp was fined £9000 and was the first person to be prosecuted in a case involving pornography and the Internet in the UK. Sharp admitted two charges of possessing indecent photographs of children under the age of 16 contrary to section 160 of the Criminal Justice Act 1988. In early 1996, Martin Crumpton, a former computer consultant, was sentenced to three months’ imprisonment by a Birmingham magistrates’ court. He also admitted possession of indecent pictures of children and was the first person to be jailed in the UK for an offence concerning pornography and the Internet.(25)

 

Distribution offences

Fellows and Arnold: The Birmingham University Case

Fellows and Arnold faced a total of 18 charges under the Protection of Children Act 1978, the Obscene Publications Act 1959, and the CJPOA 1994, which widened the definition of "publication" to include computer transmission. The West Midlands Police Commercial Vice Squad was contacted by US Customs, who had identified a site in the UK. Vice Squad officers then swooped on the Department of Metallurgy at Birmingham University and discovered thousands of pictures of youngsters engaged in obscene acts stored on the computer system. The material could be accessed through the Internet across the world. Fellows had built up an extensive library of explicit pornography called ‘The Archive,’ featuring children as young as three, on a computer at Birmingham University where he worked.

The judge ruled that the computerised images could be legally regarded as photographs, setting a legal precedent that a pornographic computer image was, in law, the same as a photograph. After the ruling of the trial judge, Fellows admitted four charges of possessing indecent photographs of children with a view to distributing them, and one of possessing obscene photographs of adults for publication. Arnold also admitted distributing indecent photographs of children. Fellows was jailed for three years, and Arnold for six months for providing Fellows with up to 30 pornographic pictures of children.

Owen J. stated:

‘The pictures could fuel the fantasies of those with perverted attitudes towards the young and they might incite sexual abuse on innocent children.’

This decision, and Crumpton’s imprisonment in 1996, both show the current judicial attitude towards traffickers of child pornography and paedophiles in general.

On appeal, Evans LJ upheld the ruling of the trial judge that images stored on computer disk constitute photographs.(26) His Lordship reviewed the terms of the Protection of Children Act and decided that although the computer disk was not a photograph, it was ‘a copy of an indecent photograph.’(27)

 

US attempts to regulate the Internet - the Communications Decency Act 1996 (CDA)

The US Telecommunications Act 1996, including the provisions of the CDA 1996, attempted to restrict access by minors to ‘patently offensive depictions of sexual or excretory activities’, a provision clearly intended to cover the pornographic images and materials which are widely available on-line over the Internet. In particular the CDA specified that it covered content available via an ‘interactive computer service’, which obviously included materials available on the Internet. In the US, speech which is not considered ‘obscene’ but is indecent enjoys First Amendment protection, though it can still be regulated where there is a sufficient governmental interest. Because the CDA was intended to prohibit ‘indecent speech’, it would have had an unprecedented effect on the Internet. Information regarding protection from AIDS, birth control or prison rape is sexually explicit and may be considered ‘indecent’ or ‘patently offensive’ in some communities, and this kind of speech would have been affected by the provisions of the CDA, particularly as it had no definition of the word ‘indecent’.

 

Legal challenges to the CDA

The American Civil Liberties Union (ACLU) and other civil liberties groups filed a lawsuit challenging the CDA as an unconstitutional restraint on free speech on the Internet. In ACLU v. Janet Reno, ACLU claimed that the CDA was ill defined and did not sufficiently delineate what speech or other actions would be subject to prosecution. ACLU and the other plaintiffs argued that:

‘Not only does this ban unconstitutionally restrict the First Amendment rights of minors and those who communicate with them about important issues, but, because of the nature of the online medium, it essentially bans "indecent" or "patently offensive" speech entirely, thus impermissibly reducing the adult population to "only what is fit for children".’

ACLU did not challenge the statute to the extent that it covered already proscribed obscenity or child pornography, merely opposing the extension of liability for speech introduced by the CDA.(28)

Following an initial temporary restraining order obtained by the ACLU, in June 1996 the Federal District Court of Philadelphia held that ACLU had established a reasonable probability of eventual success in the litigation by demonstrating that sections 223(a)(1)(B) and 223(a)(2) of the CDA were unconstitutional on their face to the extent that they covered ‘indecency’. Accordingly, a preliminary injunction was granted. Dalzell J stated:

‘As the most participatory form of mass speech yet developed, the Internet deserves the highest protection from government intrusion. Just as the strength of the Internet is chaos, so the strength of our liberty depends upon the chaos and cacophony of the unfettered speech the First Amendment protects.’(29)

The final appeal in the ACLU case, to the Supreme Court, resulted in a historic ruling on June 26, 1997 in which, by a 7-2 vote, the online censorship provisions of the CDA were struck down. The Supreme Court affirmed the Philadelphia Court’s ruling that the CDA was unconstitutional, declaring that ‘[t]he CDA’s "indecent transmission" and "patently offensive display" provisions abridge the freedom of speech protected by the First Amendment’.(30) They went on to add:

‘As a matter of constitutional tradition, in the absence of evidence to the contrary, we presume that governmental regulation of the content of speech is more likely to interfere with the free exchange of ideas than to encourage it. The interest in encouraging freedom of expression in a democratic society outweighs any theoretical but unproven benefit of censorship.’

One of the principal issues addressed in the judgement was whether Internet content was more akin to content in print media or in broadcast media such as television. Because of its mass appeal and easy access by children, a higher level of scrutiny is justified in broadcasting than in print media. If part of a broadcast programme on radio or on television is patently offensive, vulgar or shocking, then it may be considered indecent and banned at certain times of the day. The Supreme Court explained that the factors that are present in broadcasting are not present in cyberspace. ‘Neither before nor after the enactment of the CDA have the vast democratic fora of the Internet been subject to the type of government supervision and regulation that has attended the broadcast industry.’ The Internet was not as invasive a medium as radio or television, since communications over the Internet did not invade an individual’s home, or appear on one’s computer screen unbidden. Users seldom encountered offensive content by accident. Proscribing offensive content on the Internet for all users just to protect children would be ‘burn[ing] the house to roast the pig.’(31)

In his opinion for the Court, Justice Stevens wrote that ‘[t]he CDA, casting a far darker shadow over free speech, threatens to torch a large segment of the Internet community.’ The CDA went too far in reducing all material accessible on the global Internet to a level suitable only for children.

 

Developments within the European Union

The European Commission launched a Communication Paper on ‘Illegal and Harmful Content’ together with a Green Paper on the Protection of Minors and Human Dignity in Audio-visual and Information Services in October 1996.(32) The Communication Paper was the result of calls for the regulation of the Internet within the European Union dating from early 1996.

The European Commission documents followed a resolution adopted by the Telecommunications Council of Ministers in September 1996, concerning the dissemination of illegal content on the Internet, especially child pornography. While the Communication gives policy options for immediate action to fight against harmful and illegal content on the Internet, the Green Paper sets out to examine the challenges that society faces in ensuring that these issues of over-riding public interest are adequately taken into account in the rapidly evolving world of audio-visual and information services.(33) All these initiatives at the European level were adopted in a Resolution at the Telecommunications Council in November 1996.(34)

The European Parliament adopted a resolution following a report about the European Commission Communication in April 1997.(35) Following the resolution, the European Commissioner Martin Bangemann stated that, in his view, ‘it is difficult to pass legislation at international level on harmful content on the Internet, but there is no cultural difference in what is illegal, and the response must be global.’(36) Therefore solutions may not be limited to the EU level, and the involvement of other fora such as the OECD or G7 is likely in future.

 

Responsibility of Internet Service Providers (ISPs)

It is not possible to access the Internet without the services of an ISP, and thus the role of ISPs in content regulation of the Internet is crucial. As a result they are obvious targets for enforcement authorities. ISPs have recently been charged with criminal offences of providing child pornography in both Germany and France. Access to "hate speech" on the Internet is of particular concern to the German government, and again the ISPs have been the ‘usual suspects’ in investigations of provision of such material on the Internet.(37)

The UK Government’s preferred option in relation to ISPs, like that of the EC, is one of self-regulation rather than control by legislation.(38) ISPs have been encouraged to produce codes of practice to control access to illegal and unsuitable material.(39) The Home Office stated that:

‘it is important to distinguish between illegal material and material which is legal but which some would find offensive. Self-regulation is an appropriate tool to address the latter. Dealing with illegal material is a matter for the courts and the law enforcement agencies.’(40)

Walker comments that:

‘Self-regulation in this field has a number of advantages. Rules devised by the media are more likely to be internalised and accepted. In addition, it may avoid heavy-handed legal intervention which carries with it the spectre of government censorship.’(41)

It should not however be forgotten that the prime responsibility for content lies with authors and primary content providers. Blocking access at the level of access providers was criticised in the EU Communication Paper discussed above on the ground that it restricts access to far more material than the limited category of illegal communications. Such a restrictive regime severely interferes with the freedom of the individual and the political traditions of Europe. There is a real need for the legal position of the ISPs to be clarified, so that they need not, as at present, steer a path between accusations of censorship by users, and exposure to liability for the content they carry.

 

Self-regulation by ISPs - the Internet Watch Foundation (42)

The Internet Watch Foundation (IWF) was announced in September 1996 with the backing of the UK government. It follows a similar initiative in Holland, although there are differences between the two hotline systems.(43) The IWF has an e-mail, telephone and fax hot-line so that users can report materials related to child pornography and other obscene materials.(44) The IWF undertakes to inform all British ISPs once it locates undesirable content. The ISP concerned then has no excuse in law that it is unaware of the offending material, and the UK police will be entitled to take action against any ISP which does not remove the relevant content when requested to do so by the IWF.(45)

Although the IWF proposals state that UK ISPs should bear responsibility for their services, and take reasonable measures to hinder the use of the Internet for illegal purposes, it is wrong to assume that ISPs should be held solely responsible for content provided by third parties on the Internet. The real problem will remain elsewhere: in the real rather than the virtual world, where pornographic materials are originally created. As long as such material is produced, there can never be a total solution to its availability via the Internet. The Internet is just another convenient tool for paedophiles who wish to traffic in this kind of material.(46) The formation of the IWF sets a dangerous precedent for privatised censorship on the Internet. A better approach would have been a free confidential telephone hot-line not run by the industry itself, akin to that run by the Metropolitan Police in London to combat terrorism. Furthermore, removing materials containing child pornography from the Internet at a UK level only is near futile, as material can always be accessed by UK residents from computers located abroad.

There are further problems. Users of the IWF hotline will probably report material unacceptable according to their taste and moral views, but it should be remembered that what is obscene or illegal is a matter for the courts. The IWF also promotes and recommends the use of rating systems such as PICS (see below) but industry based organisations backed up by governments should not impose rating systems nor get involved in their development. The utility of the IWF will need to be monitored and perhaps re-assessed.

 

UK police censorship of Internet newsgroups

Although the UK Government supports self-regulation with respect to the Internet, the UK police appear to wish to take a more pro-active regulatory role. In mid August 1996, the Clubs & Vice Unit of the Metropolitan Police sent a letter to the UK ISPs supplying them with a list of Usenet discussion groups believed to contain pornographic material. The list mainly covered newsgroups which carried child pornography, such as ‘alt.binaries.pictures.lolita.fucking, alt.binaries.pictures.boys,’ but it also included such newsgroups as ‘alt.sex.fetish.tickling, alt.sex.fetish.wrestling, alt.homosexual,’ which might or might not include pornographic content. As many people post the same material to multiple newsgroups, it is possible to find child pornography in newsgroups not intentionally devoted to the topic but attracting a similar readership, such as alt.sex.fetish.tickling.

The action taken by the UK police appears to have been ill-considered and will not do much to reduce the availability of pornographic content on the Internet. Furthermore, the list of newsgroups provided by the UK police includes much material that is not illegal, such as legitimate discussion groups for homosexuals, and discussion groups which do not contain any pictures, but contain text, sexual fantasies and stories. These would almost certainly not infringe UK obscenity laws. The action of the UK police also amounted to censorship of material without public debate in Parliament or elsewhere. Political action by the UK government would be preferable to random censorship by law enforcement authorities.

 

Technical solutions and rating systems

Platform for Internet Content Selection (‘PICS’)(47) is a rating system for the Internet similar to the "V-chip" technology used to filter out violence or pornography on television systems. PICS is widely supported by various governments and industry based organisations such as the Internet Watch Foundation in the UK. PICS works by embedding electronic labels in the text or image documents to vet their content before the computer displays them or passes them on to another computer.(48) The vetting system can be applied to political, religious, advertising or commercial topics. PICS tags can be added by the publisher of the material, by the company providing access to the Internet, or by an independent vetting body. The most common scheme for screening material is that developed in the United States by the Recreational Software Advisory Council on the Internet (‘RSACi’). This was originally a scheme for rating computer games.(49) It rates material according to the degree of sex, violence, nudity, and bad language depicted. It is usually this PICS/RSACi screening combination that people have in mind when they refer to PICS.(50) PICS/RSACi initiatives are strongly criticised in the UK by ‘The Campaign for Internet Freedom’ organised by Living Marxism Online:

‘We do not have the freedom to make up our own minds. PICS is just the modern face of censorship... State bans are overt, public and contestable. By contrast, the censorship of PICS is covert; the ratings authorities are not democratically accountable; the ratings schemes are not publicly determined; and there is no room for dissent.’(51)

According to Electronic Frontiers Australia, ‘the definitions used in determining the four categories were clearly chosen with computer games in mind and lack the flexibility required for a wider range of materials. It is ludicrous that such a system should be applied to novels, online libraries, art galleries, and other such resources.’(52)

There will be many rating authorities, and different communities may consider the same web pages to be in different PICS/RSACi categories. Some rating authorities may, for example, judge a certain site as offensive even though it has a public purpose, such as Web sites dealing with sexual abuse and AIDS. There will be no opportunity for free speech arguments to be made if ratings have been applied by private bodies, as the government itself will not be involved directly in censorship.
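To make the labelling mechanism described above concrete, the sketch below shows, in Python, a simplified RSACi-style label of the kind a publisher might embed in a web page, and how client software could compare it against ceilings chosen by a parent. The label syntax is abridged, the helper functions are invented for illustration, and the ceiling values are purely hypothetical; real PICS labels and the RSACi category definitions are considerably more elaborate.

    # Minimal sketch of PICS/RSACi-style filtering (illustrative only).
    # RSACi rates a page on four categories - nudity (n), sex (s), violence (v)
    # and language (l) - each on a 0-4 scale. A simplified label might be
    # embedded in a page roughly as follows:
    #   <meta http-equiv="PICS-Label"
    #         content='(PICS-1.1 "http://www.rsac.org/ratingsv01.html"
    #                   l r (n 0 s 2 v 0 l 1))'>
    import re

    def parse_rsaci_label(label: str) -> dict:
        """Extract the category/value pairs (eg 'n 0 s 2 v 0 l 1') from a label."""
        return {cat: int(val) for cat, val in re.findall(r"\b([nsvl])\s+(\d)", label)}

    def page_allowed(label: str, ceilings: dict) -> bool:
        """Allow the page only if every rated category is within the parent's ceiling."""
        ratings = parse_rsaci_label(label)
        return all(value <= ceilings.get(cat, 0) for cat, value in ratings.items())

    # Hypothetical parental settings: tolerate mild language, nothing else.
    ceilings = {"n": 0, "s": 0, "v": 0, "l": 1}
    label = '(PICS-1.1 "http://www.rsac.org/ratingsv01.html" l r (n 0 s 2 v 0 l 1))'
    print(page_allowed(label, ceilings))  # False - the sex rating (2) exceeds the ceiling (0)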

 

Parental control software

Filtering software products(53) are available which are intended to allow parents to implement their preferences as to content when making decisions for their own children. The vast majority of the material available on the Internet is related to everyday topics, such as politics, news, sports, and shopping, but just as in the real world, there are areas of cyberspace which may contain materials that are not appropriate for children. Blocking and filtering technologies are far more effective and far more flexible than any law. The tools are designed to be easy to use for parents who may not be as computer savvy as their children.(54) The National Center for Missing and Exploited Children produces a brochure called ‘Child Safety on the Information Highway.’(55) After explaining the benefits of the Internet, it also explains the risks of the Internet for children:

(a) Exposure to inappropriate material,

(b) Physical molestation,

(c) Harassment.

The brochure strongly emphasises the importance of parents and their responsibility for their children’s use of on-line services. Similar brochures are also produced in the UK(56) and blocking and filtering software is available to limit or control children’s access to adult oriented Internet sites.(57) By using such technology parents themselves have the chance to decide what is good for their children, and what is not, but do not inflict this choice on the rest of the world’s Internet users. There are many programs available with parental control features including ‘Surf Watch,’(58) ‘Net Nanny’(59) and ‘CYBERsitter’.(60) Sometimes this kind of software is over-inclusive and limits access to or censors inconvenient web sites, or filters potentially educational materials regarding AIDS and drug abuse prevention.(61) Again, the companies creating this kind of software provide no appeal system to content providers who are "banned" by parents, thereby ‘subverting the self-regulating exchange of information that has been a hallmark of the Internet community.’(62) As one opponent of such systems put it:

‘A close look at CYBERsitter reveals an agenda that infringes on the rights of children, parents and teachers wherever the program is used. Despite the hype over ‘parental control’ as an alternative to government censorship, it is Solid Oak Software that takes control when CYBERsitter is running on your computer.’(63)

CYBERsitter, it should be remembered, still relies upon an initial form of labelling outside the home, which can amount to unchallengeable censorship. It is better for such control to be placed wholly in the hands of parents who can set standards for the welfare of individual children.
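The over-blocking complained of above stems from the crude way much of this software has been reported to work. The sketch below, in Python, uses a purely hypothetical word list, URLs and function of my own devising to show how a simple keyword blocklist catches an AIDS-education page just as readily as a commercial pornography site; the actual techniques used by individual products are proprietary and more sophisticated, so this is an assumption-laden illustration rather than a description of any particular package.

    # Sketch of crude keyword blocking (illustrative only; the word list
    # and the URLs below are hypothetical, not taken from any real product).
    BLOCKED_WORDS = {"sex", "porn", "xxx"}

    def is_blocked(url: str, page_text: str) -> bool:
        """Block a page if any listed word appears in its address or its text."""
        haystack = (url + " " + page_text).lower()
        return any(word in haystack for word in BLOCKED_WORDS)

    # A page giving safer-sex advice about AIDS is caught just as readily as a
    # commercial pornography site - the over-blocking problem noted above.
    print(is_blocked("http://example.org/aids-education",
                     "Advice on safer sex and HIV prevention"))   # True
    print(is_blocked("http://example.org/football-results",
                     "League tables and match reports"))          # False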

 

Conclusion

By providing quick and cheap access to any kind of information, the Internet is the first truly interactive ‘mass’ medium. It should not be surprising that governments around the globe are anxious to control this new medium,(64) and the Internet seems to be following a pattern common to the regulation of new media.(65) In reality, while the Internet tends to produce extreme versions of problems, it rarely produces genuinely new ones.

There is a real problem of availability of child pornography on the Internet (and elsewhere), as well as that of the availability of sexually explicit material to unsuitable audiences, such as children. But any regulatory action intended to protect children from being abused in the production of pornography, or from accessing unsuitable content, should not take the form of an unconditional prohibition on using the Internet to distribute content where that content is freely available to adults in other media.

At the moment bans or pre-censorship acts in relation to Internet pornography or sexual content would in any case be unworkable because of the diversity of pornographic sources. Following the introduction of the CDA 1996 in the USA, many WWW pages containing sexually explicit material introduced password protection schemes which required credit card numbers. For example, Adultcheck(66) is one of the main US based companies regulating WWW pages carrying sexually explicit content on the Internet. Its system requires that both the willing adults and the providers register, paying fees to obtain usernames and passwords. By means such as this, the pornography industry will regulate itself anyway. To do so is in its best interest, since it will wish to safeguard the substantial profits the industry makes each year.(67)

The prime responsibility for assuring an appropriate moral environment for children does not rest with Internet content suppliers or access providers. Instead parents and teachers should be responsible for protecting children from accessing sexual or other material which may be harmful to their development. Standards that are overly broad or too loosely defined will result if the job of rating is handed over to rating bodies with different cultural backgrounds, the software industry, or even the producers of pornography. It is not unreasonable to demand that parents take personal responsibility, when the computer industry is already supplying software which parents can use to regulate access to the Internet.

Child pornography is another matter. Its availability and distribution should be regulated, whether on the Internet or elsewhere. But the main concern of enforcement authorities should remain the prevention of child abuse - the involvement of children in the making of pornography, or its use to groom them to become involved in abusive acts - rather than victimless discussion and fantasy by adults. Child pornography not only consists of ‘crime scene photographs’ of child sexual abuse and exploitation, but is also a possible tool for future criminal abuse and exploitation of other children. It is considered ‘illegal’ in many countries, so there is no need to single it out in a special way because it is found on the Internet. The police should make no distinction whether the offence is committed in Oxford Street or on the Internet. Hotlines and monitoring of Internet content should however be encouraged, and police forces should take action if a content provider refuses to remove illegal materials. Existing UK legislation is capable of fighting child pornography on the Internet and elsewhere, but many paedophiles act in international rings; the targeted group should be the distributors rather than the possessors of child pornography (in some countries possession of child pornography is not an offence(68)), and tougher sentences for the production of child pornography may be needed. Although the UK police succeeded with ‘Operation Starburst’ in identifying an international paedophile ring, substantial collaboration at an international level is needed between various national police forces. All nations have an important part to play in the fight against child pornography. This can be achieved, as suggested by the European Commission, initially at the EU level.

There are no borders on the Internet, and actions by individual governments and international organisations can have a profound effect on the rights of the citizens around the world. The full potential for the development of the Internet depends on global society striking the right balance between freedom of speech and public interest considerations; between policies designed to foster the emergence of new services, and the need to ensure that the opportunities they create are not abused.


Endnotes

* LL.B., MA , Ph.D. Student at the Centre for Criminal Justice Studies, Law Faculty, University of Leeds. E-mail: lawya@leeds.ac.uk. For further information, see <http://www.leeds.ac.uk/law/pgs/yaman/yaman.htm>. Portions of this article appeared in an earlier form in Y Akdeniz, ‘The Regulation of Pornography and Child Pornography on the Internet,’ (1997) 1 Journal of Information, Law and Technology.

  1. See S Cohen, Folk Devils and Moral Panics: Creation of Mods and Rockers (Blackwell, 1987).

  2. It all started with a controversial Time Magazine article in the summer of 1995. See P Elmer-Dewitt ‘On a screen near you: Cyberporn’, Time, 1995, July 3, 34-41.

  3. But see section 2 of the Sexual Offences (Conspiracy and Incitement) Act 1996 which makes it an offence to incite another person to commit certain sexual acts against children abroad. The scope of incitement for the purposes of section 2 extends to the use of the Internet, and any incitement will be deemed to take place in the UK if the message is received in the UK.

  4. See J R Reidenberg, "Governing Networks and Cyberspace Rule-Making" (1996) Emory Law Journal 45.

  5. Ibid.

  6. See C Walker, "Cyber-Contempt: Fair Trials and the Internet" (1997) Yearbook of Media and Entertainment Law.

  7. See J R Reidenberg, "Governing Networks and Cyberspace Rule-Making" (1996) Emory Law Journal 45.

  8. C Walker, "Cyber-Contempt: Fair Trials and the Internet" (1997) Yearbook of Media and Entertainment Law.

  9. P Hirst and G Thompson, ‘Globalization and the Future of the Nation State,’ (1995) 24 Economy and Society 408 at 430.

  10. European Commission Communication to the European Parliament, The Council, The Economic and Social Committee and the Committee of the Regions: Illegal and Harmful Content on the Internet, Com (96) 487, Brussels, 16 October 1996. An on-line copy is available at <http://www2.echo.lu/legal/en/internet/content/content.html>

  11. A Hunt & G Wickham, Foucault and Law: Towards a Sociology of Law as Governance (Pluto Press, 1994) at p 78.

  12. RAW Rhodes, ‘The Hollowing Out of the State: The Changing Nature of the Public Services in Britain’, (1994) Political Quarterly 138 at p 151.

  13. See House of Commons, Home Affairs Committee: First Report on Computer Pornography (HMSO, 1994).

  14. This legal definition of obscene is narrower than the ordinary meaning of obscene which is filthy, lewd or disgusting. See R v Anderson and others [1971] 3 All ER 1152.

  15. See further Y Akdeniz, "Computer Pornography: A Comparative Study of the US and UK Obscenity Laws and Child Pornography Laws in Relation to the Internet", [1996] 10 International Review of Law, Computers & Technology 235.

  16. See R v Arnolds; R v Fellows, (1996) The Times, 27 September. See also section 43 of the Telecommunications Act 1984 which makes it an offence to send ‘by means of a public telecommunications system, a message or other matter that is grossly offensive or of an indecent, obscene or menacing character’ and is an imprisonable offence with a maximum term of six months. In addition to dealing with indecent, obscene or offensive telephone calls, the Act also covers the transmission of obscene materials through the telephone systems by electronic means.

  17. The Meese Commission Report, in 1986, provides evidence that paedophile offenders and child pornographers had begun to use personal computers and computer networks for communication and distribution of materials. See Attorney General’s Commission on Pornography: Final Report, 2 vols. Washington, D.C.: U.S. Government Printing Office, July 1986 [The Meese Commission] at page 629.

  18. T Gibbons, ‘Computer Generated Pornography’, International Yearbook of Law, Computers and Technology, 1995, vol 9, pp 83-95, at p 87.

  19. In March 1996 the author had an interview with Detective Inspector David Davis, head of the West Midlands Police commercial vice unit which deals with child pornography. He clearly stated that the UK police believe that if somebody creates or possesses indecent pseudo-photographs of children, he is a potential child abuser and will abuse children in the future. See also the Explosive Substances Act 1883 as an example of preparatory acts being criminalised.

  20. See the Canadian case of R v. Pecchiarich [1995] 22 O.R. (3d) 748-766, in which Pecchiarich, 19, was convicted and sentenced to two years’ probation and 150 hours of community service for distributing ‘pseudo-photographs’ of children over the Internet under section 163(1) of the Canadian Criminal Code. Although Pecchiarich created these materials and they prove his paedophilic tendencies and fantasies, he did not commit any offence towards children. Compare the case of Jake Baker, who had fantasies about torturing, raping and murdering a female student at the University of Michigan. He also sent his story to alt.sex.stories giving the name of a classmate. His case was dismissed by a US District Court Judge, who ruled that he was protected by the First Amendment. Baker’s case was tackled as a speech issue and although he had sick fantasies, they did not involve immediate danger or any criminal activity. See U.S. v. Baker, 890 F. Supp. 1375 (1995).

  21. Williams Committee Report (1979) Obscenity and Film Censorship, Cmnd 7772, (London: HMSO), page 90, para 6.68.

  22. Attorney General’s Commission [The Meese Commission] Final Report on Pornography, 2 vols. (Washington, D.C.: U.S. Government Printing Office) July 1986, page 411.

  23. See also the recent US legislation, Child Pornography Prevention Act 1996 which sets mandatory prison sentences of 15 years for production of child pornography, five years for possession of child pornography, and life imprisonment for repeat offenders convicted of sexual abuse of a minor. The 1996 Act also covers the computer generated images of children as in Canada and the UK.

  24. See further Y Akdeniz, ‘The Regulation of Pornography and Child Pornography on the Internet,’ 1997 (1) The Journal of Information, Law and Technology.

  25. More recently, Dr John Payne, 48, a GP in Warminster, Wiltshire, admitted a string of computer child pornography charges in November 1996 and was sentenced to 120 hours’ community service in December 1996, by the Trowbridge Magistrates. He had four images of children in indecent poses stored on his home computer. See Cyber-Rights & Cyber-Liberties (UK) for further information on all UK child pornography cases involving the Internet at <http://www.leeds.ac.uk/law/pgs/yaman/yaman.htm>.

  26. See R v. Fellows, R v. Arnold, CA, The Times October 3, 1996.

  27. See also the case of Father Adrian McLeish, a Roman Catholic priest at St Joseph’s church in Gilesgate, Durham, who held the largest known collection of child pornography yet gathered electronically. He had amassed a vast store of obscene pictures and drawings in his presbytery and exchanged thousands of explicit e-mail messages with other paedophiles. McLeish was sentenced to six years’ imprisonment by Newcastle upon Tyne Crown Court in November 1996. His activities had been exposed a year earlier during ‘Operation Starburst.’ See Cyber-Rights & Cyber-Liberties (UK) supra for further information and for other cases involving child pornography and the Internet.

  28. See 18 U.S.C. 1464-65 and 2251-52. See also New York v. Ferber, 458 U.S. 747 (1982), and Miller v. California, 413 U.S. 15 (1973), and U.S. v. Thomas 74 F.3d 701 (1996).

  29. ACLU et al. v. Janet Reno, 929 F Supp 824 (1996).

  30. See the Supreme Court decision No 96-511, at <http://www.aclu.org/court/renovacludec.html>.

  31. Quoted from Sable Communications v FCC 492 US 115 (1989).

  32. See European Commission Green Paper on the Protection of Minors and Human Dignity in Audio-visual and Information Services, Brussels, 16 October 1996. An on-line copy is available at <http://www2.echo.lu/legal/en/internet/content/content.html>

  33. See also the European Commission Working Party Report (1996) ‘Illegal and Harmful Content on the Internet’ at <http://www2.echo.lu/legal/en/internet/content/wpen.html>

  34. See <http://law-www-server.law.strath.ac.uk/diglib/lab/resol.html>.

  35. See Report on the Commission Communication on illegal and harmful content on the Internet (COM(96)0487 - C4-0592/96) Committee on Civil Liberties and Internal Affairs, Rapporteur: Mr Pierre PRADIER - 20 March 1997, available at <http://www.europarl.eu.int/dg1/a4/en/a4-97/a4-0098.htm>.

  36. Agence Europe, ‘MEPs want voluntary code of good conduct to guarantee freedom of expression, while protecting children,’ April 26, 1997.

  37. Deutsche Telekom (DT), the national telephone company, in January 1996, blocked users of its T-Online computer network from accessing Internet sites used to spread anti-Semitic propaganda, which is a crime in Germany. The company was responding to demands by Mannheim prosecutors who were investigating Ernst Zundel, a German-born neo-Nazi living in Toronto. See "German Service Cuts Net Access" San Jose Mercury News, January 27, 1996.

  38. "Home Office Meeting of January 19th 1996" available at Cityscape manager Clive Feather’s home page at <http://www.gold.net/users/cdwf/homeoffice/>.

  39. See for example the JANET Acceptable Use Policy, at <http://www.ja.net/documents/use.html>.

  40. See House of Lords, Select Committee on Science and Technology, "Information Society: Agenda for Action in the UK", Session 1995-96, 5th Report, London: HMSO, 23 July 1996, available at <http://www.parliament.the-stationery-office.co.uk/pa/ld199596/ldselect/inforsoc/inforsoc.htm>, para 4.63.

  41. C Walker, "Fundamental Rights, Fair Trials and the New Audio-Visual Sector" (1996) 59 MLR 517-539, at pp 537-538.

  42. See further Cullen, p XX.

  43. While the Dutch hotline was established by the Dutch Foundation for Internet Providers (‘NLIP’), Dutch Internet users, the National Criminal Intelligence Service (‘CRI’), National Bureau against Racial Discrimination and a psychologist, the UK Internet Watch Foundation (‘IWF’) is predominantly industry based.

  44. See <http://www.internetwatch.org.uk/hotline/>.

  45. See Safety-Net proposal, "Rating, Reporting, Responsibility, For Child Pornography & Illegal Material on the Internet" adopted and recommended by the Executive Committee of ISPA - Internet Services Providers Association, LINX - London Internet Exchange and The Safety-Net Foundation, at <http://dtiinfo1.dti.gov.uk/safety-net/r3.htm>.

  46. David Kerr, head of the IWF, has been reported as stating that ‘there is also a whole category of dangerous subjects that demand ratings’ such as discussions advocating suicide, information about dangerous sports like bungee-jumping, and more common areas of concern such as drugs, cigarette advertising, sex, and violence. See Wendy Grossman, ‘Europe Readies Net Content Ratings,’ Wired News, 7 July, 1997, at <http://www.wired.com/news/news/politics/story/5002.html>.

  47. PICS has been developed by the World Wide Web Consortium at http://www.w3.org/pub/WWW/PICS/, a non-profit making association of academics, public interest groups and computer companies that looks at the social consequences of technology. It has the backing of 39 global computer and communications companies. The WWW Consortium expects the vetting system to be in widespread use by the end of this year and 80 per cent of information on the Internet to be coded by the end of 1997.

  48. See R Whittle’s web site "Internet censorship, access control and content regulation" at http://www.ozemail.com.au/~firstpr/contreg/ for an explanation of the PICS system and how it works. See also for a critique of PICS by The Campaign for Internet Freedom, ‘Frequently Asked Questions about PICS and Censorship’ at http://www.junius.co.uk/censorship/faq.html.

  49. See <http://www.rsac.org/>.

  50. See <http://www.junius.co.uk/censorship/PICS.html>.

  51. See <http://www.junius.co.uk/censorship/index.html>.

  52. See Electronic Frontier Australia, "Media Release: Internet Labelling System Condemned," 9 February, 1997, at <http://www.efa.org.au/Publish/PR970209.html>

  53. See Netparents.org, which provides resources for parents, at <http://www.netparents.org>.

  54. See CDT Policy Post Vol 3 (10), July 16, 1997 at <http://www.cdt.org>. See also ‘Summary of the Internet Family Empowerment White Paper: How Filtering Tools Enable Responsible Parents to Protect Their Children Online,’ July 16, 1997, at <http://www.cdt.org/speech/summary.html>. See the White Paper at <http://www.cdt.org/speech/empower.html>.

  55. NCMEC and Interactive Services Association "Child Safety on the Information Highway" 1994, available at <http://www.isa.net/isa>. See also the Interactive Working Report to Senator Leahy, "Parental Empowerment, Child Protection, & Free Speech in Interactive Media" July 24, 1995 available at <http://www.cdt.org/>.

  56. See British Computer Society, "Combatting Computer Pornography: Guidance Notes for the BCS Members" BCS, April 1995, National Council for Educational Technology, "NCET Information Sheet for Schools: Computer Pornography" NCET, February 1995 and Norfolk IT Team, "Organising IT in Schools: Computer Pornography" Norfolk Educational Press, 1994.

  57. See eg. Cyber Patrol available on the Internet at http://www.microsys.com/CYBER/.

  58. Surf Watch, available at http://www.surfwatch.com/, is designed to provide parental control for families who do not subscribe to commercial online services. Surf Watch allows parents to block their children’s access to known Internet sites.

  59. Net Nanny is designed to prevent children from accessing areas on the Internet that a parent deems inappropriate, prevent children from giving the name, address, telephone number, credit card, or other personal information to strangers via e-mail or chat rooms, and can log off an on-line service or shut down the computer when the child attempts any of these activities. Net Nanny is available at http://www.netnanny.com/netnanny.

  60. CYBERsitter is similar to the other two, with an option to prevent children from accessing files on the home PC. It is available at http://www.solidoak.com/.

  61. It has been reported in December 1996 that CYBERsitter completely or partially blocks access to sites such as the National Organization of Women (<http://www.now.org>), and the Yahoo search engine (http://www.yahoo.com).

  62. From a letter sent to Solid Oak (CYBERsitter) by The Cyber-Rights working group of Computer Professionals for Social Responsibility, a group of computer and network users concerned about the preservation of free and open expression on computer networks in the USA, dated 18 December 1996. See Cyber-Rights at <http://www.cpsr.org/cpsr/nii/cyber-rights/>.

  63. See ‘Don’t Buy Cybersitter’ at The Ethical Spectacle Web page, http://www.spectacle.org and Peacefire web pages at http://www.peacefire.org

  64. See Human Rights Watch Report, "Silencing The Net: The Threat to Freedom of Expression On-line" [1996] Monitors: A Journal of Human Rights and Technology 8 (2), at <http://www.cwrl.utexas.edu/~monitors/>

  65. See eg. the Cinemas Act 1909, Broadcasting Act 1952.

  66. See Adultcheck at http://www.adultcheck.com

  67. It has been estimated that pornography, including child pornography, is an $8 to $10 billion a year business, and it is also said to be organised crime’s third biggest money maker, after drugs and gambling. See US Senate Report 104-358, Child Pornography Prevention Act 1996.

  68. The recent cases show that many involved simple possession offences in the UK. See Cyber-Rights & Cyber-Liberties UK at http://www.leeds.ac.uk/law/pgs/yaman/yaman for a complete list of child pornography prosecutions in the UK.

