Founder and Director: Yaman Akdeniz, LL.B, MA
E-mail: lawya@cyber-rights.org Tel: +44 (0) 7798 865116 - Fax: +44 (0) 7092199011
Mail Correspondence Address: Cyberlaw Research Unit, Centre For Criminal Justice Studies, University of Leeds, Leeds LS2 9JT, UK.


Cyber-Rights & Cyber-Liberties (UK) Report,

‘Who Watches the Watchmen: Internet Content Rating Systems, and Privatised Censorship’

Cite as: Cyber-Rights & Cyber-Liberties (UK) Report, 'Who Watches the Watchmen: Internet Content Rating Systems, and Privatised Censorship,' November 1997, http://www.cyber-rights.org/watchmen.htm.

Copyright 1997-2001 Cyber-Rights & Cyber-Liberties (UK)


 

Table of contents

Introduction

A short history of content regulation and content blocking technology

Eurim Report and proposed Legislation

(I) No Pressing Need in Fact

(II) National Legislation is the Wrong Response

(III) Confusion between Illegal and Harmful Content

(IV) Adults should not be treated like Children

Self-regulatory solutions for the Internet

Rating Systems

Parents should be responsible for protecting children

The role of self-regulatory bodies such as the Internet Watch Foundation

Internet Service Providers’ Liability

Policing the Internet and the Role of the UK Police

Conclusion

Appendix I

Bibliography and Online Resources

Credits

Notes for the Media


Introduction

After much recent publicity concerning the availability of materials on the Internet that are offensive to many people (racist and Nazi propaganda, pornography, and information on disrupting train travel), Internet content rating systems are developing with broad support from government agencies and industry, but without much public debate over their utility or their long-term implications. Civil liberties proponents in many countries who have examined content-control proposals have found them to be much more intrusive and restrictive than the supporters of rating systems and filtering software claim. The proposed systems often exceed their makers’ claims in the types of content restricted, the number and type of people prevented from reaching content, the technical changes required to public electronic networks, and the burdens placed on content providers and Internet service providers.

Recently the UK Internet Watch Foundation (‘IWF’) convened an advisory board comprising representatives of content providers, children’s charities, regulators from other media, Internet Service Providers and civil liberties groups, to propose a UK-focused system for rating Internet content. (See House of Commons, 26 June 1997, Written Answers, Internet).

Cyber-Rights & Cyber-Liberties (UK), a non-profit civil liberties organisation which promotes free speech and privacy on the Internet, has recently discovered that no civil liberties organisations are in fact involved in the development of rating systems at the UK level. It has been wrongly stated many times by the media, by Members of Parliament, and in various EU reports that UK civil liberties organisations are involved with the development of rating systems and have also been consulted on these issues.

It is the purpose of this report to explain why debates on the regulation of Internet content should take place openly, with the involvement of the public at large, rather than at the hands of a few industry-based private bodies.

 

A short history of content regulation and content blocking technology

 

Until the 1990s there were no restrictions on Internet content. Governments did not concern themselves because Internet access was available mainly to a relatively small (though international) community of academics and engineers at universities, government research institutions, and commercial research institutions.

Despite the largely serious and academic nature of most material, a sub-culture also flourished of odd sexually oriented, politically oriented, and other materials often considered ‘wacko’ (insane). The presence of such materials was tolerated by all users and even considered a sign of the health of the medium. In particular, few people were bothered by the presence of pornography in a community made up of over 90% male users.

When the Internet became more widespread and governments began to take notice, the first stage in Internet content control began, consisting of heavy-handed and repressive forays into censorship. The U.S. Communications Decency Act 1996 was part of this trend, as are more recent but similar proposals by the Australian government.

The first wave of direct censorship ran its course, turned back by concerns over its effects on free expression (the CDA was declared to infringe on constitutionally protected speech; see Reno v. ACLU, 117 S. Ct. 2329 (1997)), as well as by its technological inappropriateness for the medium and its ineffectiveness in a global environment.

The second stage in content control thus began with the introduction of rating and filtering products that claim to permit users to block unwanted material from their personal systems. The most sophisticated and widely recognised of these systems is the Platform for Internet Content Selection (‘PICS’), introduced by the World Wide Web Consortium. European governments were especially interested in this hoped-for solution. Having quickly backed away from the direct suppression of the first stage, they put forward PICS and rating systems, both through national governments and the European Union, as a self-regulatory solution to Internet content.

There are many problems, however, in rating and filtering systems as will be explained in this report. They are crude and tend to block too many sites. Most focus on the World Wide Web, offering no way to block objectionable content on other distribution mechanisms of the Internet such as newsgroups and ftp sites. Each system is extremely subjective and affected by cultural assumptions, so international exchanges of systems will not satisfy users. Finally, the systems were designed for individual users and do not scale well to use by entire countries and third parties.

Thus, we are beginning to see a third stage emerge in content control: that of international co-operation to remove content from the Internet. For some clearly delineated materials, such as sexually explicit material in which children are actors, such co-operation may be helpful. However, as a general trend this stage is fraught with danger. The public is not likely to support the suppression of material that is legal in their own country but illegal in another.

 


 

Eurim Report and proposed Legislation

 

Eurim, a UK body made up of members of parliament, industry representatives and special interest groups, set up a working party earlier this year to examine illegal content. In July 1997 Eurim published a report entitled ‘Internet Content Regulation,’ which found that existing regulations are inadequate to cover the new medium of the Internet.

"There is a need to clarify and refine our existing laws on illegal material. The application of such laws to the Net ... is not particularly clear ... but even when the law is clear, we must ensure that those whose job it is to uphold it, our police forces, are given the equipment and specialist training they need," said Baroness Dean, Eurim council representative.

The Eurim Report recommends strengthening the Internet Watch Foundation (IWF), or setting up a statutory body to monitor the Internet industry. The report states that ‘the IWF is not independent from the ISPs and lacks the credibility and influence which formal recognition and legal status could give.’ Tory MP Ian Bruce, vice-chairman of Eurim and a member of the Parliamentary IT Committee, said he aimed for the watchdog to have ‘legislative teeth’ to cope with cases where Internet service providers refuse to act voluntarily against an offending source. Bruce also suggested it might be necessary to set up an ‘OFNET’, along the lines of OFTEL, to take over the IWF’s regulatory work. (see Computing, ‘MPs act to curb Net abuse,’ 10 September 1997)

Cyber-Rights & Cyber-Liberties (UK) does not agree that the existing laws need to be clarified to cover the new medium. UK defamation law was recently updated by the Defamation Act 1996, which clarifies the liability of ISPs. The Sexual Offences (Conspiracy & Incitement) Act 1996 refers to the use of the Internet, and the child pornography laws are more than adequate to deal with the availability and dissemination of this kind of material on the Internet. There have been many prosecutions following ‘Operation Starburst’ in the UK (see Appendix I for a complete list of child pornography cases involving the UK).

 

(I) No Pressing Need in Fact

A new bill, the Internet (Dissemination of Child Pornography) Bill, was presented in the UK Parliament by Mrs Ann Winterton in June 1997; it would create rules that are more restrictive and oppressive on the Internet than in other media. Restrictive legislation of this kind should be resisted, as fears and impressions of illegal trafficking on the Internet are exaggerated. Between December 1996 and June 1997, about 1,000 illegal items were reported to the Internet Watch Foundation, but only 9 reports, involving 75 items, concerned material originating from the UK. There is therefore no need for heavy-handed legislation on the dissemination of child pornography on the Internet, as most of the illegal content available on the Internet does not originate from the UK. There is also no need for expensive monitoring of the Internet at a national level, as the few problems created by the Internet remain global ones.

A recent European Commission working paper agreed and stated that ‘there is no legal vacuum as regards the protection of minors and human dignity, not even in online and Internet services. According to the principle of territorial jurisdiction, the law applies on the national territory of the State and hence also applies to online services.’ (see Commission Staff Working paper, ‘Protection of Minors and Human Dignity in Audio-visual and Information Services: Consultation on the Green Paper, SEC (97) 1203, Brussels, June 1997).

 

(II) National Legislation is the Wrong Response

However, we do recognise that the Internet is a global medium which does not respect boundaries, and that individual nation-states are losing their capacity for governance. Heavy-handed new legislation at a national level will therefore in any event be inadequate and ineffective. All nations have an important part to play in the fight against internationally defined illegal material, such as forms of child pornography. Although the UK police have been successful with ‘Operation Starburst’ in identifying an international paedophile ring, substantial collaboration between various national police forces may be needed to fight child pornography at an international level. This can be achieved, as suggested by the European Commission, initially at the EU level. It is therefore not in the best interest of the UK Parliament to legislate on these matters just because there is a public outcry and moral panic.

 

(III) Confusion between Illegal and Harmful Content

The regulation of potentially ‘harmful content’ such as pornography on the Internet and regulation of invariably illegal content such as child pornography are different in nature and should not be confused. Child pornography is banned in a wide range of countries because its creation involves child abuse. Other types of offensive content, by contrast, are ‘victimless crimes’ and have no proven ill-effects on other people. For example, a link between the consumption of pornography and sexual abuse has never been established (see e.g. Dennis Howitt and Guy Cumberbatch, Pornography: Impacts and Influences, Research and Planning Unit London: HMSO, 1990). This distinction explains why there is a wide variation among countries (and local communities within those countries) about what is tolerable in pornography involving adults.

 

(IV) Adults should not be treated like Children

Any regulatory action intended to protect a certain group of people, such as children, should not take the form of an unconditional prohibition on using the Internet to distribute content that is freely available to adults in other media. Attempts to pass online censorship legislation such as the US Communications Decency Act (part of the 1996 Telecommunications Act) should therefore be avoided, and child pornography laws should not be used as false examples of supposedly legitimate restriction of freedom of expression. The US Supreme Court recently stated in Reno v. ACLU, 117 S. Ct. 2329 (1997) that ‘the Internet is not as "invasive" as radio or television’, and confirmed the finding of the US Court of Appeals that ‘communications over the Internet do not "invade" an individual’s home or appear on one’s computer screen unbidden. Users seldom encounter content by accident.’ Partly on the basis of this user-driven aspect of the Internet, the court unanimously struck down the Communications Decency Act, which tried to restrict the distribution of ‘indecent’ material.

This report will now proceed to examine the technical means of restricting content which have been widely proposed as a self-regulatory solution instead of ‘top-down’ regulatory restrictions.

 


 

Self-regulatory solutions for the Internet

There appears not to be a single solution for the regulation of illegal and harmful content on the Internet because, for example, the exact definition of offences such as child pornography varies from one country to another.

These are pressing issues of public, political, commercial and legal interest. The treatment of material considered harmful may differ between societies, and what is considered harmful depends on cultural differences. It is therefore imperative that international initiatives take into account different ethical standards in different countries in order to explore appropriate rules to protect people against offensive material. For example, the European Court of Human Rights in Handyside (see Handyside case (1976) 19 Y.B.E.C. 506) stated that the steps necessary in a democratic society for the protection of morals will depend on the type of morality to which a country is committed. A recent European Commission Communication Paper (1996) stated that ‘each country may reach its own conclusion in defining the borderline between what is permissible and not permissible’. A conflict always exists between the desire to allow free expression and the feeling that morality must be enforced, and each society must decide where to draw the line. A good rule of thumb, however, is that free expression is more important to a healthy and free society, and should not be seriously harmed by attempts to enforce moral standards.

In this context it might be useful to quote from one of the more recent judgements of the European Court of Human Rights in Castells v. Spain (judgement of 23 April 1992, Series A no. 236, p.22, 42):

‘... freedom of expression constitutes one of the essential foundations of a democratic society, one of the basic conditions for its progress. Subject to paragraph 2 of Article 10 [of the European Convention on Human Rights], it is applicable not only to "information" or "ideas" that are favourably received or regarded as inoffensive or as a matter of indifference, but also to those that offend, shock or disturb. Such are the demands of that pluralism, tolerance or broadmindedness without which there is no democratic society.’

‘Harm’ is a criterion which will depend upon cultural differences. There have been attempts, for example, by the German government to restrict the availability of hate speech on the Internet, specifically the web sites related to the denial of the Holocaust. Many of these same materials are legal in other countries, even though most of the population finds them offensive. The preservation of the principle of free expression should be more important than the pursuit and prosecution of every potentially dangerous speaker.

Self-regulation is an appropriate tool to address the criteria of harmful content. Dealing with illegal material is a matter for the courts and the law enforcement agencies. (see House of Lords, Select Committee on Science and Technology "Information Society: Agenda for Action in the UK", Session 1995-96, 5th Report, London: HMSO, 23 July 1996, para 4.163).

‘Self-regulation in this field has a number of advantages. Rules devised by the media are more likely to be internalised and accepted. In addition, it may avoid heavy-handed legal intervention which carries with it the spectre of government censorship.’ (see Clive Walker, "Fundamental Rights, Fair Trials and the New Audio-Visual Sector" [1996] MLR 59, 4, 517-539.)

A self-regulatory model for harmful content on the Internet may include the following levels; in this model ‘self’ means the individual, without state involvement:

 

User or Parental Responsibility

Parental Software

 

On the other hand, we offer the following model for fighting illegal content such as forms of child pornography on the Internet; this is a more collective solution, different from the model above:

User Responsibility to report it

Hotlines for reporting

Code of Conduct by ISPs

National Legislation - distribution

International Level - Co-operation

 

Rating systems are not needed for illegal content, and the next sections explain why they are not needed for harmful content on the Internet either.

 


 

Rating Systems

There have been recent calls in Europe for the regulation of the Internet, and these are relevant to UK developments. In October 1996 the European Commission approved a Communication on Illegal and Harmful Content on the Internet and a Green Paper on the protection of minors and human dignity in the context of new electronic services. The European Commission documents follow the resolution adopted by the Telecommunications Council of Ministers in September 1996 on preventing the dissemination of illegal content on the Internet, especially child pornography. While the Communication gives policy options for immediate action to fight against harmful and illegal content on the Internet, the Green Paper sets out to examine the challenges that society faces in ensuring that these issues of overriding public interest are adequately taken into account in the rapidly evolving world of audiovisual and information services.

The European Commission Communication Paper suggested that:

"the answer to the challenge will be a combination of self-control of the service providers, new technical solutions such as rating systems and filtering software, awareness actions for parents and teachers, information on risks and possibilities to limit these risks and of international co-operation."

All these initiatives at the European level were adopted in a Resolution at the Telecommunications Council of November 1996. The European Parliament also adopted a resolution following these initiatives. The UK Government welcomed the Communication with its emphasis on self-regulation by industry, as entirely consistent with the UK’s approach:

 

"The UK strongly agrees with the Commission that since a legal framework for regulation of the Internet already exists in Member States, new laws or regulations are unnecessary." (Select Committee on European Legislation, 1996, para 14.8)

Cyber-Rights & Cyber-Liberties (UK) argues that a radical self-regulatory solution for hybrid Internet content should not include any kind of rating system, and that self-regulatory solutions should involve minimal government and industry involvement.

The Platform for Internet Content Selection (‘PICS’) is a rating system for the Internet, similar to the V-chip technology for filtering out violence or pornography on television. PICS is widely supported by various governments and industry-based organisations such as the Internet Watch Foundation in the UK. PICS works by embedding electronic labels in text or image documents to vet their content before the computer displays them or passes them on to another computer. The vetting system could cover political, religious, advertising or commercial topics. Labels can be added by the publisher of the material, by the company providing access to the Internet, or by an independent vetting body.
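The labelling mechanism described above can be illustrated with a short sketch. The following Python fragment builds an RSACi-style PICS-1.1 label and the HTML META tag a publisher would embed to carry it. The category letters (n, s, v, l for nudity, sex, violence and language) follow the published RSACi vocabulary; the function names and sample scores are our own illustration, not part of any official tool.

```python
# Illustrative sketch only: builds a PICS-1.1 label using the RSACi
# vocabulary (n = nudity, s = sex, v = violence, l = language).
# Function names and sample scores are invented for this example.

def rsaci_label(nudity=0, sex=0, violence=0, language=0):
    """Return a PICS-1.1 label string for the RSACi rating service."""
    return ('(PICS-1.1 "http://www.rsac.org/ratingsv01.html" '
            'l r (n {0} s {1} v {2} l {3}))'
            .format(nudity, sex, violence, language))

def meta_tag(label):
    """Wrap a PICS label in the HTML META tag a publisher would embed."""
    return "<META http-equiv=\"PICS-Label\" content='{0}'>".format(label)

# A publisher self-rating a page as containing mild violence:
print(meta_tag(rsaci_label(violence=2)))
```

A browser or proxy configured to honour PICS would read this label before rendering the page and compare the scores against the user's configured limits.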

Currently (as of November 1997), there are three PICS-related rating systems that are being widely used or promoted:

RSACi: The most common scheme for screening material was developed by the United States based Recreational Software Advisory Council on the Internet (‘RSACi’), originally a scheme for rating computer games. It rates material according to the degree of sex, violence, nudity, and bad language depicted. It is usually this PICS/RSACi screening combination that people have in mind when they refer to PICS. As of September 1997, RSACi claims to have over 43,000 sites rated.

SafeSurf: Developed by the SafeSurf corporation, this system’s categories include ‘Age Range’, ‘Profanity’, ‘Heterosexual Themes’, ‘Homosexual Themes’, ’Nudity’, ‘Violence,’ ‘Sex, Violence, and Profanity’, ‘Intolerance’, ‘Glorifying Drug Use’, ‘Other Adult Themes’, and ‘Gambling’, with 9 distinctions for each category.

SafeSurf and RSACi both rely on self-rating of Internet sites by web publishers. While apparently voluntary and fair, this kind of system is likely to end up being a serious burden on content providers. First, the only way to deal with incorrect ratings is to prosecute content providers, which is very dangerous and an infringement on free speech. Secondly, ISPs and search engines will simply block any unrated sites, so that content providers will feel it necessary to rate their sites even if they oppose the system.
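The concern about unrated sites can be made concrete. The sketch below is a hypothetical filter, not taken from any shipping product: it allows a page only when the page carries a rating and every rated category falls within the user's limits, and it refuses any page with no label at all. That default is precisely the pressure that makes ‘voluntary’ self-rating compulsory in practice.

```python
# Hypothetical filter logic, not any real product's algorithm.
# A page is allowed only if it is rated AND every category is
# within the user's limits; unrated pages are blocked outright.

def allow_page(rating, limits):
    if rating is None:          # unrated content: blocked by default
        return False
    return all(rating.get(category, 0) <= maximum
               for category, maximum in limits.items())

limits = {"sex": 0, "violence": 2, "language": 1}

print(allow_page({"sex": 0, "violence": 1, "language": 0}, limits))  # True
print(allow_page(None, limits))                                      # False
```

Note that an AIDS-information page whose author declines to rate it is treated exactly like the material the filter was meant to exclude.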

NetShepherd: Based in Calgary, NetShepherd rates sites by maturity level (General, Child, Pre-teen, Teen, Adult, and Objectionable) and quality level (1-5 stars). Unlike SafeSurf and RSACi, NetShepherd conducts third-party ratings of web sites. NetShepherd claims to have rated over 300,000 sites and has announced partnerships with firms such as AltaVista and Catholic Telecom, Inc.

The Eurim Report encourages the development of internationally accepted rating systems so that some sort of ‘harmful content’ may be controlled at the point of access. The Internet Watch Foundation (‘IWF’) was seen as a possible way forward on this subject by the Eurim report, and the IWF has been working on the introduction of these rating systems together with its European partners (including ECO, the German Electronic Commerce Forum, and Childnet International, the UK-based charity) under the Internet Content Rating for Europe (‘INCORE’) project.

This initiative aims to: (1) create a forum of interested groups to investigate content rating (identifying illegal and classifying legal material. A key element of this will be consumer research as to users’ expectations regarding the Internet and, more specifically, the kind of material they would consider to be appropriate to apply ratings to); (2) draw together self-regulatory bodies as hot-line organisations; and (3) consider European input into world-wide standards.

Child pornography is often used as an excuse to regulate the Internet, but there is no need to rate illegal content such as child pornography, since it is forbidden for any conceivable audience; this kind of illegal content should be dealt with by enforcing existing UK laws. On the other hand, the Internet contains other kinds of content which are legal but may be considered harmful, for instance to children.

According to the Internet Watch Foundation, there is ‘a whole category of dangerous subjects’ that require ratings: information related to drugs, sex, violence, information about dangerous sports like bungee-jumping, and hate speech material (see Wired News, ‘Europe Readies Net Content Ratings,’ 7 July 1997). It is surprising to see bomb-making material omitted from this list, but we can expect it to be added, as happened recently in the US, where Senator Dianne Feinstein introduced legislation specifically making it illegal to distribute bomb-making information on the Internet. This legislation was found unconstitutional in the US, and it should be noted that this kind of information, including the Anarchist’s Cookbook, is available through well-known bookshops such as Waterstones and Dillons within the UK.

We also warn that self-rating systems must not be used as a pretext for ‘zoning’ the Internet, as two dissenting justices suggested in Reno v. ACLU, 117 S. Ct. 2329 (1997). The dissenting opinion, while agreeing that the CDA was unconstitutional, left open the possibility that material could in future be banned from the open Internet and allowed only in special sites where access would be controlled by identification and screening of users. This proposal is onerous for several reasons: it threatens to restrict socially valuable information that the government does not wish people to see, and it requires users to reveal their identities when viewing sensitive materials such as information on sexually transmitted diseases or information for victims of AIDS. Such a violation would have serious implications for the privacy of online users and a chilling effect on use of the Internet.

Recently in the USA, the American Civil Liberties Union was alarmed by the failure to examine the longer-term implications for the Internet of rating and blocking schemes. The ACLU published a white paper in August 1997 entitled Fahrenheit 451.2: Is Cyberspace Burning? How Rating and Blocking Proposals May Torch Free Speech on the Internet (see <http://www.aclu.org/issues/cyber/burning.html>). The ACLU paper warned that government-coerced industry efforts to rate content on the Internet could torch free speech online.

‘In the physical world, people censor the printed word by burning books,’ said Barry Steinhardt, Associate Director of the ACLU and one of the paper’s authors. ‘But in the virtual world, you can just as easily censor controversial speech by banishing it to the farthest corners of cyberspace with blocking and rating schemes.’ According to the ACLU, third-party ratings systems pose free speech problems and with few third-party rating products currently available, the potential for arbitrary censorship increases. The white paper was distributed with an open letter from Steinhardt to members of the Internet community. ‘It is not too late for the Internet community to slowly and carefully examine these proposals and to reject those that will transform the Internet from a true marketplace of ideas into just another mainstream, lifeless medium.’

The ACLU white paper gave six reasons why self-rating schemes are wrong for the Internet and Cyber-Rights & Cyber-Liberties (UK) endorses these statements:

(1) Self-rating schemes will cause controversial speech to be censored.

(2) Self-rating is burdensome, unwieldy, and costly.

(3) Conversation cannot be rated.

(4) Self-rating will create ‘Fortress America’ on the Internet.

(5) Self-rating will only encourage, not prevent, government regulation.

(6) Self-rating schemes will turn the Internet into a homogenised medium dominated by commercial speakers.

It seems likely that there will be many rating authorities, and different communities will consider the same web pages to fall into different PICS/RSACi categories. Some rating authorities may judge a site as offensive even though it has a socially valuable purpose, such as web sites dealing with sexual abuse and AIDS. This would mean there is no space for free speech arguments and dissent, because the ratings will be done by private bodies and the government will not be involved ‘directly.’

Governments need neither impose rating systems and rating bodies with different cultural backgrounds, nor get involved in their development.

 


 

Parents should be responsible for protecting children

The prime responsibility for assuring an appropriate moral environment for children must rest elsewhere. Parents and teachers should be responsible for protecting children from accessing pornographic content which may be harmful to their development. Standards that are overly broad or loose will result if the job is handed over to rating bodies with different cultural backgrounds, the software industry, or even the producers of pornography. This is not a hopeless demand for personal responsibility, since the computer industry is also supplying the means of protection.

Most of the filtering software available is designed for the home market and is intended to respond to the preferences of parents making decisions for their own children. There are currently some 15 blocking and filtering products, mainly US-based (see http://www.netparents.org/software/), and they do not reflect the cultural differences of a global environment such as the Internet.

It has been reported many times that this kind of software is over-inclusive, limiting access to or censoring inconvenient web sites and filtering potentially educational materials on AIDS and drug abuse prevention. Thus ‘censorware’ enters homes despite the hype over ‘parental control’ as an alternative to government censorship. The companies creating this kind of software also provide no appeal system to content providers who are ‘banned’, thereby ‘subverting the self-regulating exchange of information that has been a hallmark of the Internet community.’ (see the CPSR letter dated 18 December 1996 sent to Solid Oak, the makers of CyberSitter, at http://www.cpsr.org/cpsr/nii/cyber-rights/)
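The over-inclusiveness complaint is easy to reproduce. The toy blocker below mimics the crude substring matching reported of early ‘censorware’; it is a caricature for illustration, not the algorithm of any named product. It blocks a safer-sex education title, and even the word ‘Sussex’, exactly as it would block genuinely pornographic titles.

```python
# Toy keyword blocker illustrating over-inclusive substring matching.
# A caricature of early filtering products, not any vendor's code.

BANNED_WORDS = ["sex", "breast"]

def blocked(text):
    """Return True if any banned word appears anywhere in the text."""
    lower = text.lower()
    return any(word in lower for word in BANNED_WORDS)

for page in ["Safer sex and AIDS prevention",
             "University of Sussex prospectus",
             "Breast cancer screening advice"]:
    print(page, "->", "BLOCKED" if blocked(page) else "allowed")
```

All three pages are blocked, though none is pornographic; a keyword list cannot distinguish health education from the material it targets.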

Such software should therefore not be used in public and university libraries, because libraries are responsible for serving a broad and diverse community with different preferences and views. The American Library Association, in a resolution adopted in June 1997, stated that ‘blocking Internet sites is antithetical to library missions because it requires the library to limit information access.’

We recommend that any filtering system should be market-driven by local industries, without government interference, and that the local industries creating these kinds of parental tools should be open and accountable to online users.

 


 

The role of self-regulatory bodies such as the Internet Watch Foundation

The Internet Watch Foundation, supported by the UK Government, was announced in September 1996 and follows a similar initiative in Holland (see below), although there are differences between the two hotline systems. While the Dutch hotline was established by the Dutch Foundation for Internet Providers (‘NLIP’), Dutch Internet users, the National Criminal Intelligence Service (‘CRI’), the National Bureau against Racial Discrimination and a psychologist, the UK Internet Watch Foundation (‘IWF’) is predominantly industry-based.

The Dutch Model

The Dutch hotline has been operating quite successfully since June 1996, resulting in a substantial reduction in the amount of child pornography distributed from Holland and in the actual prosecution of authors, in close co-operation with the police. Furthermore, a procedure has been developed to deal with child pornography originating from countries other than the Netherlands. If such a complaint is sent to the hotline, the foreign author and service provider are notified. If this does not lead to the actual removal of the content, the Dutch police, after being informed by a representative of the hotline, notify their colleagues in the country of origin.

 

The Metropolitan Police in London operates a free confidential telephone hotline (0800-789321) to combat terrorism, and a similar general-purpose step could have been taken to combat child pornography and child sexual abuse, whether related to the Internet or not. Removing child pornography from the Internet at UK level alone is no solution in a multi-national environment. The IWF is playing with fire: its possible future involvement with other kinds of content, which may be offensive but entirely legal, could set a dangerous precedent of privatised censorship in which there is no space for dissent.

The IWF runs an e-mail, telephone and fax hotline through which users can report material related to child pornography and other obscene material. Once it locates the 'undesirable content', the IWF informs all British ISPs. The ISPs will then have no defence in law of being unaware of the offending material, and the UK police will probably take action against those ISPs that do not remove content at the IWF's request.

In contrast to the Dutch model, the IWF proposals state that UK ISPs should bear responsibility for their services and need to implement reasonable, practicable and proportionate measures to hinder the use of the Internet for illegal purposes. But it is wrong to assume that ISPs should be responsible for content provided by third parties on the Internet.

There are also practical problems with the IWF initiative under which online users report unwanted material. Users will probably report material unacceptable according to their own taste and moral views, but it should be remembered that it is for the courts to decide whether something is obscene or illegal, and with reporting systems the interpretation of images will always be subjective. The IWF also promotes and recommends the use of rating systems such as PICS (see above), but industry-based organisations backed by governments need neither impose rating systems and rating bodies with different cultural backgrounds nor become involved in their development. The application and utility of the IWF will have to be assessed and perhaps reviewed.
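For readers unfamiliar with PICS, a label under the W3C PICS-1.1 specification attaches machine-readable category scores to a URL on behalf of a rating service. The sketch below follows the general PICS-1.1 label syntax, but the rating-service URL, rated page and category names are hypothetical examples, not taken from any actual rating bureau:

```
(PICS-1.1 "http://ratings.example.org/v1.0"
  labels on "1997.11.05T08:15-0000"
    for "http://www.example.org/page.html"
    ratings (violence 0 sex 0 nudity 0 language 1))
```

The point made above follows from this structure: whoever defines the category vocabulary and assigns the numbers holds the real power over what browsers configured to honour the labels will display.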


Internet Service Providers’ Liability

ISPs differ in nature in different countries, but their main aim remains the provision of Internet-related services to online users. Technically it is not possible to access the Internet without the services of an ISP, and this crucial gatekeeping role has made ISPs visible targets for those seeking to control Internet content.

A recent European Commission Communication to the European Parliament, The Council, The Economic and Social Committee and the Committee of the Regions on Illegal and Harmful Content on the Internet, (1996) stated that ‘Internet access providers and host service providers play a key role in giving users access to Internet content. It should not however be forgotten that the prime responsibility for content lies with authors and content providers.’

Blocking access at the level of access providers has been criticised in the EU Communication on the ground that such action goes far beyond the limited category of illegal content: 'such a restrictive regime is inconceivable for Europe as it would severely interfere with the freedom of the individual and its political traditions.' Therefore 'the law may need to be changed or clarified to assist access providers and host service providers, whose primary business is to provide a service to customers.'

The EU developments are very important and would affect both the UK and other Member States. ‘Therefore, the position of the ISPs should be clarified, and they should not be targeted by the individual governments and law enforcement bodies where the ISPs have no control of the Internet content.’

Two technical factors prevent a service provider, such as the CompuServe branch prosecuted twice in Germany over the past two years, from blocking the free flow of information on the Internet. First, an Internet service provider cannot easily stop the incoming flow of material: the thousands of unsolicited commercial e-mails that pass through ISPs' systems are a good example. No one can monitor the enormous quantity of network traffic, which may consist of hundreds of thousands of e-mails, newsgroup messages, files and Web pages in dozens of text and binary formats, some readable only by particular proprietary tools. As the European Commission noted recently, 'it is as yet unclear how far it is technically possible to block access to content once it is identified as illegal. This is a problem which also affects the degree of liability of the access providers.'

A second technical problem is that a provider cannot selectively disable transmission to particular users. Electronic networks typically do not allow for the identification of particular users or their national region. Thus, CompuServe correctly claimed that it cannot provide material in one country while blocking it in another; such a distinction would require an enormous new infrastructure on top of the current network.

Some networking technologies, such as newsgroups, may allow individual operators to select some groups or items and block others. But many technologies, such as the widely used World Wide Web, currently do not support such selectivity.

The recent ‘Bonn Declaration’ underlined the importance of clearly defining the relevant legal rules on responsibility for content of the various actors in the chain between creation and use. The Declaration recognised the need to make a clear distinction between the responsibility of those who produce and place content in circulation and that of intermediaries such as the Internet Service Providers. (see <http://www2.echo.lu/bonn/final.html>.)

The current situation in the UK does not represent a self-regulatory solution as suggested by the UK Government. It is moving towards a form of censorship, privatised and industry based, in which there will be no space for dissent, carried out through private organisations, rating systems, and pressure on UK Internet Service Providers at the entry level. One need only recall the events of summer 1996, when ISPs were pressured by the Metropolitan Police to remove around 130 newsgroups from their servers.


Policing the Internet and the Role of the UK Police

Internet-related crime is not a priority for UK police forces: there is an insatiable public demand for the bobby on the beat, and the reduction of street crimes such as car theft takes precedence. Given the international nature of the Internet, it would not in any case be for the UK police, or any other police force acting on its own, to attempt to patrol the Internet.

The action taken by the Metropolitan Police in August 1996 to censor Usenet discussion groups was ill-considered and did not reduce the availability of pornographic content on the Internet. The list of newsgroups provided by the police included much material that is not illegal, such as legitimate discussion groups for homosexuals and groups containing no pictures but only text, sexual fantasies and stories, which would almost certainly not infringe UK obscenity laws. The action also amounted to censorship of material without any public debate. Any regulation of the Internet should follow informed debate and policy-making by Parliament, not by the police (or the industry itself). Sensible action by the UK Government is needed to resolve the problem, rather than censoring or banning distasteful material on the Internet, and it is wrong to treat ISPs as 'usual suspects' for the provision of illegal content.


Conclusion

With rating systems and the moral panic surrounding Internet content, the Internet could be transformed into a 'family friendly' medium, just like the BBC. But it should be remembered that the Internet is not as intrusive as television, and users seldom encounter illegal content such as child pornography. Like other historical forms of censorship, current attempts to define and ban objectionable content are vague and muddy, reaching far beyond their reasonable targets and damaging the promise of open communication systems.

Government-imposed censorship, over-regulation, or service provider liability will do nothing to keep people from obtaining material the government does not like, as most of it will be on servers in another country (as happened recently with the availability of the JET Report in 37 different web sites on the Internet outside the UK). Such restrictions would, however, make Britain, like any other jurisdiction that goes too far, a very hostile place for network development or any other high-tech industry and investment.

If anyone needs to be educated on Internet matters, it is government officials, the police and MPs, together with the media, rather than online users, parents and children. We do not need moral crusaders in the guise of industry-based organisations deciding what is and is not acceptable.

Child pornography is another matter, and its availability and distribution should be regulated whether on the Internet or elsewhere. But the main concern should remain the prevention of child abuse - the involvement of children in the making of pornography, or its use to groom them into abusive acts - rather than discussion and fantasy. It was reported recently by the Home Department that the National Criminal Intelligence Service ('NCIS') Paedophile Section spent £53,027 (1995-96) and £61,672 (1996-97) on gathering information on all forms of paedophile activity. More money should be spent on gathering information about paedophiles and online paedophile activity, rather than on developing rating systems.

When censorship is implemented with government threat in the background but run by private parties, legal action is nearly impossible, accountability is difficult, and the system is neither open nor democratic. These are sensitive issues, and before such systems are introduced there should be an open public debate, possibly together with a consultation paper from the DTI. It should be noted that the IWF is predominantly industry based and therefore does not necessarily represent the public at large or UK society.

 



Appendix I

 

TABLE OF UK CASES INVOLVING CHILD PORNOGRAPHY ON THE INTERNET

 

Case Name           | Date           | Possession/Distribution        | Legislation         | Pseudo-Photographs | Sentence
Christopher Sharp   | October 1995   | Possession                     | s.160 CJA 1988      | No                 | Fined £9,000
Martin Crumpton     | January 1996   | Possession                     | s.160 CJA 1988      | No                 | 3 months
Melvin Dunstan      | January 1996   | Both                           | s.43 TA 1984        | No                 | 120 hrs community service
A. Fellows          | May 1996       | Both                           | PCA 1978; OPA 1959  | No                 | 3 years
S. Arnold           | May 1996       | Both                           | PCA 1978; OPA 1959  | No                 | 6 months
Smith               | ???            | Distribution                   | ???                 | ???                | Not on trial yet
Simon Jackson       | September 1996 | Both                           |                     | Yes                | 4 months
Peter Crowhurst     | August 1996    | Possession                     | s.160 CJA 1988      | Yes                | Charged
John Payne          | November 1996  | Possession                     | s.160 CJA 1988      | No                 | 120 hrs community service
Father A. McLeish   | November 1996  | Both                           |                     | Yes                | 6 years
Robert Bickerstaffe | August 1996    | Charged with both              |                     | ???                | Found dead
Danial Nye          | September 1996 | Possession                     |                     |                    | 6 months
Graham Fitchie      | July 1997      | Both, and indecent assault     |                     |                    | 3 years
Peter James Morris  | July 1997      | Possession                     |                     | No                 | £4,000
George Reid         | October 1997   | Possession                     |                     | No                 | 3 months
Christopher Wells   | December 1996  | Both                           |                     |                    | 2 years
Graham Warren       | November 1996  | Possession                     |                     |                    | Fined £1,000

This table is up-to-date to October 1997 and compiled by Yaman Akdeniz.
For recent cases see Yaman Akdeniz, Regulation of Child Pornography on the Internet: 
Cases and Materials, at http://www.cyber-rights.org/reports/child.htm (Last Updated April 2001)


 

Bibliography and Further Online Resources

Akdeniz, Yaman, ‘The Regulation of Pornography and Child Pornography on the Internet’ 1997 (1) The Journal of Information, Law and Technology (JILT). http://elj.warwick.ac.uk/jilt/internet/97_1akdz/

American Civil Liberties Union, Fahrenheit 451.2: Is Cyberspace Burning? How Rating and Blocking Proposals May Torch Free Speech on the Internet, August 1997 http://www.aclu.org/issues/cyber/burning.html

Computer Professionals for Social Responsibility Question Internet Filtering Agreement, July 18, 1997, at http://www.cpsr.org/dox/issues/filters.html

EURIM is an association of Parliamentarians and businesses established to advance the UK's contribution to pan-European informatics (information technology and related products, services and issues) and telematics (electronic communication whether by wire, fibre or wavelength, and the products, services and materials transmitted over these networks), and to act as a link between parliamentarians, commerce and industry, Whitehall and Brussels. At the start of 1995, members of EURIM included over 75 MPs, MEPs and Peers from all parties, plus over 40 Corporate Members and Not-for-profit and Small Firm Associates.

EURIM web pages are at <http://www.eurim.org/>. EURIM Briefing No 19 : The Regulation of Content on the Internet, July 1997.

European Commission Communication to the European Parliament, The Council, The Economic and Social Committee and the Committee of the Regions: Illegal and Harmful Content on the Internet, Com (96) 487, Brussels, 16 October 1996.

European Commission Green Paper on the Protection of Minors and Human Dignity in Audiovisual and Information Services, Brussels, 16 October 1996.

European Commission Working Party Report (1996) ‘Illegal and Harmful Content on the Internet’

EPIC Censorware pages at http://www.epic.org/free_speech/censorware/

Filtering Facts, a web site which supports the idea of filtering on the Internet, http://www.filteringfacts.org/index.htm

Finkelstein, Seth, The Truth Isn't Out There, http://www.spectacle.org/cs/seth.html

Internet Watch Foundation is available at http://www.internetwatch.org.uk/

Lessig, Lawrence, ‘Tyranny in the Infrastructure: The CDA was bad - but PICS may be worse’, Wired, Issue 5.07, July 1997.

Peacefire, a US organisation which opposes blocking software, http://www.peacefire.org/info/blocking_software.shtml

Wallace, Jonathan, The Censorware Page at http://www.spectacle.org/cs/

 


Credits

Cyber-Rights & Cyber-Liberties (UK) Report, ‘Who Watches the Watchmen: Internet Content Rating Systems, and Privatised Censorship’ was written by Yaman Akdeniz. Professor Clive Walker, Centre for Criminal Justice Studies, University of Leeds; Ms Louise Ellison, Faculty of Law, University of Manchester; and Mr Andrew Oram, Computer Professionals for Social Responsibility (USA), contributed to this report.

Cite as: Cyber-Rights & Cyber-Liberties (UK) Report, 'Who Watches the Watchmen: Internet Content Rating Systems, and Privatised Censorship,' November 1997, http://www.cyber-rights.org/watchmen.htm. Copyright 1997-2001, Cyber-Rights & Cyber-Liberties (UK).

Notes for the Media

Cyber-Rights & Cyber-Liberties (UK) is a non-profit civil liberties organisation founded on January 10, 1997. Its main purpose is to promote free speech and privacy on the Internet and raise public awareness of these important issues. The Web pages have been online since July 1996. Cyber-Rights & Cyber-Liberties (UK) started to become involved with national Internet-related civil liberties issues following the release of the DTI white paper on encryption in June 1996 and the Metropolitan Police action to censor around 130 newsgroups in August 1996. Cyber-Rights & Cyber-Liberties (UK) recently criticised the attempts of the Nottinghamshire County Council to suppress the availability of the JET Report on the Internet.

Cyber-Rights & Cyber-Liberties (UK) covers such important issues as the regulation of child pornography on the Internet and UK Government’s encryption policy. The organisation provides up-to-date information related to free speech and privacy on the Internet. Cyber-Rights & Cyber-Liberties (UK) is a member of various action groups on the Internet and also a member of the Global Internet Liberty Campaign (see <http://www.gilc.org>) which has over 30 member organisations world wide.

READ also the follow up to this report: Cyber-Rights & Cyber-Liberties (UK) Report: "Who Watches the Watchmen: Part II - Accountability & Effective Self-Regulation in the Information Age," September 1998 at http://www.cyber-rights.org/watchmen-ii.htm  and note the Cyber Rights & Cyber-Liberties (UK) Memorandum for the Internet Content Summit 1999, September, 1999, at http://www.cyber-rights.org/reports/summit99.htm 


Back to Cyber-Rights & Cyber-Liberties (UK) Home page.