Internet Censorship:
Law & policy around the world

Last Updated: 28 March 2002

This report contains information on government policy and/or laws regarding Internet censorship in various countries around the world. The information was compiled by EFA in March 2002 in response to a request from the Chair of a NSW Parliamentary Committee inquiring into a NSW Bill intended to censor the Internet (for background, see the Introduction section below).

Introduction

This report contains information on government policy and/or laws regarding Internet censorship in various countries around the world. Information herein was compiled by EFA in March 2002 in response to a request by the Chair of the NSW Standing Committee on Social Issues for information on whether or not other countries have Internet censorship laws similar to Schedule 2 of the NSW Classification (Publications, Films and Computer Games) Enforcement Amendment Bill 2001. (For detailed information about this Bill, see the NSW Internet Censorship Bill section of EFA's web site).

EFA subsequently undertook extensive research into the current status of laws and government policy outside Australia. EFA was unable to find any indication that any country broadly comparable to Australia (in terms of democratic political systems and cultures) has, or intends to introduce, Internet censorship laws as restrictive as the provisions of the NSW Bill, or as restrictive as existing Commonwealth legislation. While numerous countries have laws of general application that apply to Internet content, such as laws against child pornography or incitement to racial hatred, these countries do not prohibit or otherwise restrict provision of "matter unsuitable for minors" on the Internet.

The lack of similar laws in comparable countries is not due to a failure of Parliaments or Governments to consider the problems of illegal content or content unsuitable for minors on the Internet. Rather, it reflects a different approach from that in Australia to dealing with the problems.

The remainder of this document contains an overview of governmental approaches to dealing with Internet content that is illegal, or is unsuitable for minors, followed by sections containing more detailed information about various countries.

Overview

Since approximately 1995, numerous governments around the world have been addressing the problems of material on the Internet that is illegal under their offline laws, as well as material considered harmful or otherwise unsuitable for minors. The nature of the material of principal concern has varied substantially, for example: political speech; promotion of or incitement to racial hatred; and pornographic material. Few governments have attempted to ban or otherwise legislatively restrict access to "matter unsuitable for minors" as distinct from material illegal to distribute to adults.

As at March 2002, government policies concerning censorship of the Internet may be broadly grouped into four categories:

a) Government policy to encourage Internet industry self-regulation and end-user voluntary use of filtering/blocking technologies.

This approach is taken in the United Kingdom, Canada, and a considerable number of Western European countries. It also appears to be the current approach in New Zealand where applicability of offline classification/censorship laws to content on the Internet seems less than clear.

In these countries laws of general application apply to illegal Internet content such as child pornography and incitement to racial hatred.

Content "unsuitable for minors" is not illegal to make available on the Internet, nor must access to same be controlled by a restricted access system. Some (perhaps all) such governments encourage the voluntary use of, and ongoing development of, technologies that enable Internet users to control their own, and their children's, access to content on the Internet.

b) Criminal law penalties (fines or jail terms) applicable to content providers who make content "unsuitable for minors" available online.

This approach is taken in some Australian State jurisdictions and has been attempted in the USA (although no such US Federal law is presently enforceable, nor, to the best of EFA's knowledge, is any such US State law).

In these countries, in addition, laws of general application apply to content that is illegal for reasons other than its unsuitability for children, such as child pornography.

c) Government mandated blocking of access to content deemed unsuitable for adults.

This approach is taken in Australian Commonwealth law (although it has not been enforced in this manner to date) and also in, for example, China, Saudi Arabia, Singapore, the United Arab Emirates and Vietnam. Some countries require Internet access providers to block material while others only allow restricted access to the Internet through a government controlled access point.

d) Government prohibition of public access to the Internet.

A number of countries either prohibit general public access to the Internet, or require Internet users to be registered/licensed by a government authority before permitting them restricted access as in (c) above. Information on countries in this category is available in the Reporters Without Borders/Reporters Sans Frontières report Enemies of the Internet of February 2001.

In the many countries that have Internet censorship laws far more restrictive than those existing or proposed in Australia, governmental focus appears to be on prohibiting and/or restricting politically sensitive speech, criticism of the government, etc. These governments do not appear to have any focus on prohibiting or restricting content deemed unsuitable for minors as distinct from content deemed unsuitable for adults.

Commentary on the ineffectiveness of legislation like the NSW Bill in protecting children on the Internet often focusses on the US constitutional right to freedom of speech and the fact that the US Supreme Court struck down a similar law. However, from a global perspective, the law in the USA is irrelevant. Even in the highly unlikely event that the USA and Australia enacted identical laws, these would remain ineffective in protecting minors on the global Internet.

Concerns about access to content on the Internet vary markedly around the world, and regulatory policy reflects this. What is illegal in one country is not illegal in others, and what is deemed unsuitable for minors in one country is not in others. For example, films classified R18 in Australia are often classified suitable for persons under 18 years in other countries: Intimacy (sex scenes) and Hannibal (violence) are classified 18 in Australia, but 12 in France. However, France prohibits the display of Nazi memorabilia, including on web pages, which is not prohibited by Australian offline laws nor by existing or proposed online censorship laws. Many similar examples demonstrate the ineffectiveness of national censorship laws in protecting children (or adults) on the Internet.

The USA is not the only country where citizens have a right to freedom of expression. In contrast to Australia, governments in comparable countries including Canada, New Zealand, the United Kingdom and various European countries have chosen to legislate to give citizens a right in domestic law to freedom of expression similar to that contained in the International Covenant on Civil and Political Rights (ICCPR). Such a right is by no means absolute and does not prevent governments from enacting or enforcing laws restricting freedom of expression. However, to the best of EFA's knowledge, governments in these countries have not enacted, or indicated any intent to enact, Internet censorship legislation as restrictive of adults' freedom of expression as that existing and proposed in Australia.

Chronology of developments
(in Australia, Canada, NZ, UK, USA)

This section contains a chronology of developments since 1996 regarding governmental policy on Internet censorship and freedom of expression rights in Australia, Canada, NZ, UK, and the USA.

  • 1996:
    • Victoria, WA and NT Internet censorship laws commenced (1 Jan 96)
    • USA Communications Decency Act (CDA) enacted; a US Court restraining order prevented its enforcement (Feb 96)
    • USA Court ruled CDA unconstitutional (Jun 96)
    • UK ISP trade associations, with UK Government agreement, issued the R3 Safety-Net action plan, involving industry establishment of a complaints hotline and related take-down procedures for illegal Internet content, primarily child pornography (Sep 96)
    • European Commission issued a Communication on illegal and harmful content on the Internet advocating the use of filtering software and rating systems, and an encouragement to self-regulation of access-providers. (Oct 96)
  • 1997:
    • USA Supreme Court struck down the CDA (Jun 97)
    • NZ Parliament rejected a Bill intended to censor Internet content unsuitable for minors (Jul 97).
  • 1998:
    • UK Human Rights Act 1998 enacted, incorporating the European Convention on Human Rights (ECHR) into UK law.
    • USA Child Online Protection Act (COPA) enacted (Oct 98). US Court restraining order prevented its enforcement (Nov 98).
  • 1999:
    • Canadian Radio-television and Telecommunications Commission (CRTC) announced it would not regulate the Internet (17 May 99).
    • Australian Commonwealth Parliament passed Internet censorship Bill (26 May & 30 Jun 99)
    • Australian State and Territory Censorship Ministers issued draft Internet censorship legislation for comment (Aug 99).
    • NZ Court of Appeal overturned a classification of (offline) 'objectionable' material on grounds that the Classification Board had failed to demonstrate the classification was consistent with the NZ Bill of Rights (Dec 99)
  • 2000:
    • Australian Commonwealth Internet censorship legislation commenced (1 Jan 2000)
    • US Court of Appeals upheld injunction restraining enforcement of US Child Online Protection Act (COPA). (Jun 2000)
    • UK censorship of sexually explicit material became less restrictive following an appeal case (Jul 2000).
    • Australian censorship of sexually explicit material became more restrictive following a government decision to ban depictions of various types of legal activity between adults. (Sep 2000)
    • UK Courts became able to test the compatibility of UK law with the European Convention on Human Rights (commencement of relevant provisions of the UK Human Rights Act 1998). (Oct 2000)
    • South Australian State Government tabled an Internet censorship Bill (Nov 2000)
    • UK Government issued Communications White Paper indicating no intent to enact Internet censorship legislation and stating policy of supporting means of enabling Internet users to control their own and their children's access, "rather than third party regulation". (Dec 2000)
  • 2001:
    • NSW State Government tabled an Internet censorship Bill (Nov 2001)

Country information

Australia

The Internet censorship regime in Australia comprises legislation at both Commonwealth and State/Territory Government level.

In April 1999, the Commonwealth Government introduced an Internet censorship Bill into Parliament. This Bill, if it had been passed without amendments, would have required Internet Service Providers to, among other things, block adults' access to content on sites outside Australia on threat of fines for non-compliance. Following widespread criticism, the proposed law was amended to include provision for an additional access prevention method (other than ISP blocking of overseas hosted material at the server level).

The amended law, the Broadcasting Services Amendment (Online Services) Act 1999, commenced operation on 1 January 2000.

The Commonwealth regime is a complaints based system and applies to content hosts including ISPs, but not to content creators/providers. Content hosts are required to delete Australian hosted content from their server (Web, Usenet, FTP, etc) that is deemed "objectionable" or "unsuitable for minors" on receipt of a take-down notice from the government regulator, the Australian Broadcasting Authority ("ABA"). To date (March 2002), the law has been implemented in a way that does not require ISPs to block access to content hosted outside Australia. Instead, the ABA notifies filtering/blocking software providers of content hosted outside Australia to be added to their blacklists. Australian Internet users are not required by law to use blocking software.

In addition, State and Territory criminal laws apply to content providers/creators. These laws enable prosecution of Internet users who make available material that is deemed "objectionable" or "unsuitable for minors". The detail of the criminal offence provisions is different in each jurisdiction that has enacted or proposed laws of this nature.

For more detailed information about the Commonwealth and State/Territory laws, see EFA's Internet Censorship in Australia page.

Canada

In August 1998, the Canadian Radio-television and Telecommunications Commission (CRTC) called for public discussion on what role - if any - it should have in regulating matters such as online pornography, hate speech, and "Canadian content" on the Web.

Subsequently, on 17 May 1999, the CRTC issued a media release titled "CRTC Won't Regulate the Internet" stating, among other things, that:

"The Canadian Radio-television and Telecommunications Commission (CRTC) announced today that it will not regulate new media services on the Internet. After conducting an in-depth review under the Broadcasting Act and the Telecommunications Act beginning last July, the CRTC has concluded that the new media on the Internet are achieving the goals of the Broadcasting Act and are vibrant, highly competitive and successful without regulation. The CRTC is concerned that any attempt to regulate Canadian new media might put the industry at a competitive disadvantage in the global marketplace."

See also: Canada Won't Regulate Net, Pierre Bourque, Wired News, 17 May 1999

Offline Classification:

In Canada, film and video classifications are province-based. Some provinces use the same classification system for both films and videos, while others, e.g. Ontario, use a different system. Because videos for home rental can have different provincial classifications, in 1995 the film and video industry developed the Canadian Rating System for Home Videos. Under this system, ratings are determined by averaging the classifications assigned by the provincial boards, and the resulting rating is placed on the video.

Some provinces include sexually explicit material in the R category, while other provinces have a separate category for material similar to that classified X, and some classified RC, in Australia.

Computer games are not required by law to be classified in most Canadian provinces. Video and computer game distributors voluntarily use a non-government rating scheme which (unlike the Australian scheme) includes an Adults Only/18 rating.

China

In September 1996, China reportedly banned access to an estimated 100 Web sites by using a filtering system to prevent delivery of offending information. The banned sites included Western news outlets, Taiwanese commentary sites, anti-China dissident sites and sexually explicit sites.

Since 1996, the Chinese government has enacted a number of highly restrictive laws prohibiting, among other things, publication of political commentary the government considers undesirable, and there have been continuing reports of various foreign media and human rights Web sites being blocked.

On 18 January 2002, Associated Press reported that:

"China has issued its most intrusive Internet controls to date, ordering service providers to screen private e-mail for political content and holding them responsible for subversive postings on their Web sites. ...

Under the new rules, general portal sites must install security programs to screen and copy all e-mail messages sent or received by users. Those containing 'sensitive materials' must be turned over to authorities.

Providers are also responsible for erasing all prohibited content posted on their Web sites, including online chatrooms and bulletin boards.

The new rules include a long list of banned content prohibiting writings that reveal state secrets, hurt China's reputation or advocate the overthrow of communism, ethnic separatism or 'evil cults.'

The last category covers the Falun Gong spiritual movement, which has frequently resorted to the Internet to defy a harsh two-year crackdown.

Pornography and violence are also prohibited."

(China creates stern Internet, e-mail rules, Associated Press/USA Today, 18 Jan 2002)

For further information, see the report Freedom of Expression and the Internet in China issued by Human Rights Watch on 1 August 2001.

Denmark

As at March 2002, Denmark had no law making it a criminal offence to make material that is unsuitable for minors available on the Internet, nor were there any proposals to create such a law. Discussion related to protection of minors centred primarily on the issue of filtering at public libraries.

European Union

The approach of a large majority of (perhaps all) European Union Member States to dealing with illegal and harmful content on the Internet appears to be in accord with the 1996 recommendations of the European Commission. In these countries, laws regarding material that is illegal offline, such as child pornography and racist material, also apply to Internet content. With regard to material unsuitable for children, the EU Safer Internet Action Plan covering the period 1999-2002 has a budget of 25 million euro for, inter alia, initiatives directed to enabling end-users to better manage their own, and their children's, Internet access.

On 16 October 1996, the European Commission issued a Communication on illegal and harmful content on the Internet and a Green Paper on the protection of minors in the context of new electronic services. The EC media release stated:

"...While the Communication gives policy options for immediate action to fight against harmful and illegal content and concentrates on the Internet, the Green Paper takes a horizontal approach and will initiate a medium- and long-term reflection on the issue across all electronic media. Both documents advocate a closer co-operation between Member States and on an international level, the use of filtering software and rating systems, and an encouragement to self-regulation of access-providers."

The EU Safer Internet Action Plan covering the period 1999-2002 has a budget of 25 million euro, and has three main action lines:

  • Creating a safer environment through promotion of hotlines, encouragement of self-regulation and codes of conduct,
  • Developing filtering and rating systems, facilitation of international agreement on rating systems,
  • Awareness: Making parents, teachers and children aware of the potential of the Internet and its drawbacks, overall co-ordination and exchange of experience.

Information on initiatives undertaken in EU, and other European, countries is included in the Summary and Analysis of replies to a questionnaire on self-regulation and user protection against illegal or harmful content on the new communications and information services, prepared by the Directorate General of Human Rights, Council of Europe, in approx. Nov 2001.

France

Regulatory activity in France concerning illegal material on the Internet has recently been focussed on enforcing French laws prohibiting race hate material.

In May 2000, a French judge ruled that USA Yahoo! Inc must make it impossible for French users to access sites auctioning race hate memorabilia. Yahoo! said it was technically impossible for it to block Internet users in France from seeing Nazi-related content on its USA Web site and that its French site complied with France's laws prohibiting advertising Nazi memorabilia. In November 2001, a US District Court ruled that Yahoo! does not have to comply with the French court's order concerning access to its USA site. The Court ruled that the USA First Amendment protects content generated in the US by American companies from being regulated by authorities in countries that have more restrictive laws on freedom of expression.

Germany

In the mid 1990s, German ISPs were expected to block access to some Internet content hosted outside Germany that is illegal under German laws of general application, particularly race hate propaganda and child pornography. In July 2000, it was reported that the German government had ceased trying to bar access to content outside Germany, although police would continue to aim to stop illegal "homegrown" material. In 2001 and 2002, German authorities issued takedown notices to a number of web hosts in the USA; these web hosts refused to comply. Further information on these developments is below.

In September 1996, following "advice" from the German Chief Prosecutor's Office, German Internet Service Providers started attempting to block access to sites containing material banned in Germany, such as the Netherlands-based website of the magazine Radikal. This had the immediate effect of further publicising the material, and resulted in mirror sites springing up elsewhere around the world.

In July 2000, Reuters reported that:

"Germany, which has some of the world's toughest laws banning race hate propaganda, has conceded defeat to the cross-border reach of the Internet and given up trying to bar access to foreign-based neo-Nazi sites. Deputy Interior Minister Brigitte Zypries, the government's Internet security chief, said this week in an interview with Reuters that it was unrealistic to try to shield Germans from foreign Web sites, even though police do aim to stop homegrown Nazi and other offensive material, such as child pornography."
(Source: Germany won't block access to foreign Nazi sites, Adam Tanner, Silicon Valley News, 25 Jul 2000)

In November 1999, a German court of appeal overturned the conviction of the former head of Internet service provider CompuServe's German operations on charges of spreading child pornography on the Web. The court said it was not possible for the CompuServe chief to block the publication of child pornography on the Internet. Reportedly, CompuServe-Germany president Frank Sarfeld hailed the court's decision, saying it re-established 'legal security' for Internet providers.
(Source: Court reverses Net porn charge, AFP, 23 Nov 1999)

The Ministry for Families, Seniors, Women and Youth (Bundesministerium für Familie, Senioren, Frauen und Jugend) continues to issue takedown notices to foreign web hosts under the "Act on the Dissemination of Publications and other Media Morally Harmful to Youth" in relation to offshore sites that contain material "harmful to youth", including online pornography that would be rated X or RC in Australia. A number of web hosts in the USA were issued with these notices in 2001 and 2002, and refused to comply with the takedown request. The Ministry claims jurisdiction over web sites worldwide that contain "pornographic, extreme violence, war-mongering, racist, fascist and/or anti-Semitic content". The notices require the web host (as opposed to the website owner or content provider) to either remove the material or subject it to an age-verification system based on, for example, credit card checks.

Malaysia

In March 1999, numerous media organisations reported that Prime Minister Datuk Seri Dr Mahathir Mohamad had said there would be no censorship of the Internet. For example, in an article titled Do Away With Internet Censorship: PM the Utusan Express reported:

"KUALA LUMPUR March 16 - Prime Minister Datuk Seri Dr Mahathir Mohamad has advised the Cabinet to dispense with attempts on internet censorship, said Multimedia Development Corporation (MDC) executive chairman Tan Sri Dr Othman Yeop Abdullah today.
'The Prime Minister has given specific instructions that there will not be censorship on the internet. That's the position of the government,' he said at a press conference.
He explained that the Prime Minister was reaffirming a pledge in the Bill of Guarantee last week that there will be no censorship on the internet."

On 14 February 2002, an article in The Australian reported that the Malaysian government had signalled an intent to require website operators to obtain licences. According to the article "Steven Gan, editor-in-chief of, says the Mahathir government's plan to require licences for website operators would effectively silence the only dissenting media voices in a nation with rigid controls over traditional news outlets, especially newspapers."

New Zealand

New Zealand had not enacted legislation specifically directed to censorship of Internet content as at 11 March 2002.

In 1993, the NZ Films, Videos, and Publications Classification Act (FVPCA) was enacted. The definitions in this Act are said by regulatory authorities to cover Internet content: "publication" covering text and static images, and "film" covering moving images. However, the offence provisions of the Act have apparently very rarely (if ever) been enforced to date in relation to online content other than child pornography. The Act certainly applies to material downloaded to a computer disk from the Internet, but the extent to which its provisions apply to content on the Internet may be questionable.

In 1994, a Private Member's Bill titled the "Technology and Crimes Reform Bill" (aka the "Trevor Rogers Bill") was introduced into NZ Parliament, but was not enacted. The Bill was intended, among other things, to regulate the transmission of objectionable material via the Internet and also proposed to make material restricted to adults offline, illegal on the Internet. The Bill attracted considerable criticism, and was considered by a Parliamentary Select Committee.

In July 1997, the Committee recommended the Bill not be passed and recommended a voluntary Code of Practice by Internet Service Providers as the best option. Among other things the Committee was concerned with Bill of Rights implications, according to an article on NZ law firm Clendon Feeney's web site. The Committee's recommendations were accepted by the Parliament and the Bill did not proceed.

The NZ Department of Internal Affairs (DIA) states on its "Censorship and the Internet" web page that:

"The Department of Internal Affairs' Inspectors have the role of investigating New Zealand Internet websites and newsgroups and enforcing censorship legislation. We take a proactive role in prosecuting New Zealanders who trade objectionable material via the Internet. If a publication is categorised as 'objectionable' it is automatically banned by the Films, Videos, and Publications Classification Act 1993."

Reports indicate that the DIA, as at March 2002, had been concentrating its Internet related enforcement activities concerning objectionable material almost entirely on child pornography and especially on the use of chat services such as IRC to distribute/obtain such material.

Unofficial information provided to EFA indicates that probably only one web site has been classified objectionable, in approximately 1997 as a result of one or more complaints. The NZ site, which promoted the supposed US urban Afro-American gangster culture, had been the subject of a NZ newspaper report (subsequent reports in other media indicate the content was prepared by a person who had never visited the USA). The site was classified objectionable not because of material involving violence and speaking approvingly of killing and arson, but because its expressed attitude to women was found to contravene Section 3(3)(e) of the Act which prohibits material that:

"Represents (whether directly or by implication) that members of any particular class of the public are inherently inferior to other members of the public by reason of any characteristic of members of that class, being a characteristic that is a prohibited ground of discrimination specified in section 21 (1) of the Human Rights Act 1993."

The DIA web site does not refer to any prohibitions or restrictions relative to material on the Internet that is or would be classified "R18" or any other equivalent of "unsuitable for minors" under NZ legislation.

In NZ, "objectionable" material does not include that classified X in Australia, nor is there an "X" classification under NZ law. The R18 category covers all material that is classified restricted to adults, including sexually explicit material.

In 2000, a NZ Parliament Select Committee (the Government Administration Committee) commenced an inquiry into the operation of the Films, Videos, and Publications Classification Act 1993 and related issues including new communications media, the Court decision in Moonen (see later herein), and numerous other matters. Public submissions closed on 4 May 2001. As at mid March 2002, the Committee had not issued a report on its inquiry.

Offline Classification:

NZ law does not require publications, other than films and videos, to be classified and labelled prior to being made available to the public (publications may be voluntarily submitted for classification). Computer games are exempt from classification unless the game would be likely to be restricted if classified. Films and videos falling within exemption criteria in the Act (e.g. educational, cultural, scientific, etc, material) are also not required to be classified.

Sexually explicit films and videos (of the type classified X in Australia) are classified "R18" in NZ with a descriptive note "contains explicit sex scenes" attached. The R18 category covers all material that is classified restricted to adults, not only sexually explicit material.

Under New Zealand law, "objectionable" material may encompass a smaller range of material than the nearest equivalent under Australian law. The NZ test for banning material is based on harm: whether the material is "likely to be injurious to the public good". The Australian test is based on offensiveness: whether the material "offend[s] against the standards of morality, decency and propriety generally accepted by reasonable adults".

In addition, NZ classifiers must take the New Zealand Bill of Rights Act into account for every decision they make, even if the material clearly fits into the criteria set out in Section 3(2) of the NZ Classification Act. In this regard, the NZ Office of Film and Literature Classification states in an article titled Highlights from the 1999-2000 Annual Report on its web site that:

"Finally, a review of the year cannot go by without some comment on the Court of Appeal's interpretation of s3(2) in Moonen v Film and Literature Board of Review. That case concerned the classification of 37 photographs and one book as objectionable under s3(2)(a) because the Board of Review found that they "promote or support the exploitation of children, or young persons, or both, for sexual purposes". Section 3(2) deems such publications to be objectionable. The Court overturned the Board's decision on the ground that the Board insufficiently demonstrated how such a classification was a reasonable and demonstrably justified limitation on the freedom of expression."

Norway

With regard to new media such as the Internet, the Norwegian Board of Film Classification's web site states (as at 22 March 2002):

New Media
The work here is mainly connected with research, information and guidance. The new media in Norway is not regulated by law so the only option is to try to keep the public informed. We publish book, booklets and reports on the subject matter. We also use polling bureaus to help us monitor the development, and give lectures for schools and other interested parties. We also work internationally with these matters and try to stay on top of the events happening internationally, as with the technological developments and social impacts on the field.

Offline Classification:

Cinema films are required to be classified by the Norwegian Board of Film Classification, an independent body reporting to the Ministry of Cultural Affairs. Classification of videos and DVDs for sale or hire is not mandatory. Video distributors place an advisory age limit on the videos on a voluntary basis, or may obtain a penal code evaluation of the video from the Board. The videos have to be registered in a video registry and a small number of videos are checked every year. (Source: Norwegian Board of Film Classification web site).

Saudi Arabia

In Saudi Arabia, public access to and from the Internet has been funnelled through a single government-controlled centre since February 1999, when Internet access was first made available. From this centre, the government blocks access to Internet content deemed unsuitable for the country's citizens, such as information considered sensitive for political or religious reasons, pornographic sites, etc. The list of rules includes:

"1. Anything contravening a fundamental principle or legislation, or infringing the sanctity of Islam and its benevolent Shari'ah, or breaching public decency.
2. Anything contrary to the state or its system.
5. Anything damaging to the dignity of heads of states or heads of accredited diplomatic missions in the Kingdom, or harms relations with those countries."

According to a report in the New York Times on 19 November 2001 ("Companies Compete to Provide Saudi Internet Veil"), over 7,000 sites are added to the blacklist monthly, and the control centre receives more than 100 requests a day to remove specific sites from the blacklist, many because they have been wrongly characterised by the US commercial blocking software used.

For further information, see:

  • Saudi Internet Rules, Council of Ministers Resolution, 12 February 2001
  • Losing the Saudi cyberwar, by Brian Whitaker, The Guardian, 26 Feb 2001
    "The authorities in Saudi Arabia, who recently boasted that they had found a way to censor the internet, are now licking their wounds in a cyberwar against an opposition group based in London."
  • Saudis pay to surf censored sites, by Frank Gardner in Saudi Arabia, BBC UK, 3 Nov 2001
    "The Saudi Government has tried with limited success to stop its citizens from accessing pornographic and politically controversial websites. But where there is a will there is a way, and there is no shortage of people willing to pay to log onto forbidden websites."
  • The report titled Enemies of the Internet, issued by Reporters Without Borders/Reporters Sans Frontiers, February 2001.


Singapore

The Singapore Broadcasting Authority (SBA) has regulated Internet content as a broadcasting service since July 1996.

A Class Licence Scheme operates under which Internet Content Providers and Internet Service Providers are deemed automatically licensed. Licensees are required to comply with the Class Licence Conditions and the Internet Code of Practice which includes the definition of "prohibited material". Apparently the Scheme does not apply to all Internet content providers. The Internet Industry Guidelines on the SBA web site state:

"Individuals who put up personal web pages are exempted from the Class Licence, unless they are putting these web pages for business, or to promote political or religious causes."
"Internet Content Providers who are not targeting Singapore as their principal market will not be subject to Singapore's standards unless they are primarily in the business of distributing pornography. For example, movie sites which are hosted in Singapore can promote and carry movie clips, even those which do not meet Singapore's standards."

The SBA has the power to impose sanctions, including fines, on licensees who contravene the Code of Practice. The "SBA takes a light-touch approach in regulating services on the Internet. For example, licensees [Internet Content Providers and Internet Service Providers] found to be in breach of regulations will be given a chance to rectify the breach before the Authority takes action." (Extract from the Internet Industry Guidelines)

"Prohibited Material" is defined in the Code of Practice and appears to involve material deemed unsuitable for adults by the Singaporean Government. It does not appear to cover information unsuitable for minors, nor does it contain a requirement that web sites attempt to restrict access to such material to adults. Briefly, "prohibited material" is that which is deemed "objectionable on the grounds of public interest, public morality, public order, public security, national harmony, or is otherwise prohibited by applicable Singapore laws". The stated factors to be considered in determining what is prohibited material indicate this includes material of a pornographic nature; advocacy of "homosexuality or lesbianism"; depictions of "detailed or relished acts of extreme violence or cruelty" and material that "glorifies, incites or endorses ethnic, racial or religious hatred, strife or intolerance". An additional factor is "whether the material has intrinsic medical, scientific, artistic or educational value".

According to the SBA's FAQ as at 11 March 2002:

"Users in Singapore...have access to all material available on the Internet, with the exception of a few high impact illegal websites"
"Content on the Internet are not pre-censored by SBA, nor are ISPs required to monitor the Internet. ISPs are currently only required to limit public access to some high-impact illegal sites [as identified by SBA] as a statement of Singapore's social values. Internet Content Providers (ICPs) are required to exercise judgement with regards to content posted on the Internet according to the definitions of what constitute prohibited material in the Internet Code of Practice. SBA is concerned primarily with pornography, violence and incitement of racial or religious hatred. SBA's purview only covers the provision of material to the public. Private communications, such as email and Internet Relay Chat between two individuals or parties are not covered. Users can freely access any material on the Internet in the privacy of their own home. SBA also does not regulate what companies provide to their employees."

The SBA Internet Industry Guidelines included the following information as at 11 March 2002:

  • "ISPs are required to limit access to some high-impact websites, as identified by SBA. ISPs are encouraged to take their own initiative against offensive content through their own Acceptable Use Policies. They are not required to monitor the Internet or their users' Internet activities.
    ISPs are free to decide which newsgroups to subscribe to based on their business policy and the Code. We do not require ISPs to monitor or remove postings, nor does SBA monitor postings in newsgroups. ... We require ISPs to make an initial assessment as to the likelihood of a newsgroup being a conduit for prohibited material. If any subscribed newsgroups is subsequently found to contain a significant amount of prohibited material, we encourage ISPs to exercise judgement on whether to unsubscribe to the newsgroup. SBA may also direct ISPs to unsubscribe to particular newsgroups which contain prohibited material."
  • "Internet Content Providers (ICP), particularly web authors, should observe the Internet Code of Practice. There is no need to seek SBA's prior approval for content posted on the Net. ICPs should exercise judgement according to the definitions of what constitute prohibited material. Licensees who are not sure if their content would be considered prohibited may check with SBA."

For more detailed information, see the SBA's web site.

In relation to the number of web sites blocked, as at 11 March 2002, the SBA web site refers to "a few" and "some" as quoted above. Previously, in May 1999 and November 2001, a page on the SBA site stated that "100 high-impact pornographic sites" were blocked. That page said:

"What are the obligations of the Internet Service Providers?

According to the licensing framework, ISPs must exercise their best possible efforts to ensure that the main information highways stay clean. SBA does not want this task to be an onerous one for the ISPs, hence, they are only required to limit access to 100 high-impact pornographic sites. The ISPs are also required to observe the SBA content guidelines to determine which newsgroups they should subscribe to. SBA encourages industry self-regulation."

An archived copy of a June 2001 page from the SBA site on The Internet Archive Wayback Machine site states:

"[ISPs] ... need to limit access to 100 websites that are contrary to community values, as identified by SBA."

South Korea

In July 2001, the "Internet Content Filtering Ordinance" (law) became effective in South Korea.

According to media and other reports, the provisions of the Ordinance are controversial within South Korea. Similar content filtering provisions included in a 2000 Bill were not passed by the National Assembly, apparently because of the harsh criticism they provoked from civic groups.

Recent information on English-language web sites regarding the extent and manner of enforcement of the 2001 Ordinance is scarce. The information below has been obtained from articles and reports, principally from before and shortly after the Ordinance was enacted, as listed at the end of this section.

The Ordinance reportedly contains a number of components including:

  • requiring Internet Service Providers to block access to web sites on a government compiled list,
  • requiring Internet access facilities accessible to youth, such as Internet cafes, public libraries and schools, to install filtering software, and
  • introducing an internet content rating system.

Sites required to be blocked are those meeting criteria determined by the Information and Communications Ethics Committee (ICEC). The ICEC is a Seoul-based non-governmental organisation mandated by the 1995 Law for Electronics and Communications Businesses to suppress harmful information and communication and to foster a healthy information culture by "filtering national illegal and harmful information" and preventing "cyber sexual violence".

As part of the Ordinance, on 1 November 2001, the Ministry of Information and Communications (MIC) formally enacted an internet content rating system implementing the ICEC's rating criteria. The ICEC had released its "Criteria for Indecent Internet Sites" on 24 April 2001 which reportedly, among other things, classifies information concerning homosexuality in the category of "obscenity and perversion".

As at August 2001, according to media and other reports, the government had indicated an intent to require blocking of: sites in a list of over 120,000 that had been compiled by the ICEC; sites containing words in a control list of keywords; and sites rated in some categories of the government's Internet Content Rating System. (As at 23 March 2002, EFA has not been able to ascertain whether these indicated intentions of the government became fully operative, nor to what extent any of them are currently operative. With regard to the Rating System, it appears that rating of content may be largely voluntary, but may be mandatory for some types of content, such as sites deemed "indecent" by the ICEC.)

For further information see:

  • The ICEC Web site which includes some information about Internet regulation in the English language.
  • Alert and Background Information issued by the International Gay and Lesbian Human Rights Commission (IGLHRC), 23 Aug 2001

    "In July 2001, an Internet Content Filtering Ordinance entered into effect, and implemented the definitions of the ICEC. ...

    Since the introduction of the Internet Content Filtering Ordinance in July, ... government interventions have moved beyond labeling to active and extensive blocking of gay sites throughout Korea. ..."

  • 120,000 Internet Sites Blacklisted, Kim Deok-hyun, The Korea Times, 3 May 2001

    "The Information and Telecommunication Ethics Commission, a regulatory group under the Ministry of Information and Communication, has blacklisted some 119,000 Internet sites as requiring access to be blocked, fanning the flames in the fight over Internet filtering and civil rights. ... The government is preparing to use new systems to block those "anti-social" Internet sites under the Internet filtering ordinance, which will take effect July 1. 'The categorized list of barred Internet sites will be used for software makers to develop an advanced blocking system,' the regulatory body said."
  • Internet Filtering Ordinance Spurs New Debate, Kim Deok-hyun, Korea Times, 23 Apr 2001

    "In a public hearing yesterday at the Sejong Cultural Art Center in downtown Seoul, civic organizations and Internet users strongly protested against the new ordinance, saying it amounts to de facto censorship to curb the free flow of information. ...
    The contents filtering was included in the revised bill to promote information-technology activities last year, but the National Assembly didn't pass the bill due to the harsh criticism it provoked from civic groups. ...
    'The new ordinance is not designed to control the Internet but to provide services to protect children from being exposed to malicious online content,' said MIC director Na Bong-ha. ...
    However, civic groups dismissed such defensive announcements, saying it could result in 'over blocking.' They also expressed concerns about the new ordinance that would bring about unexpected side effects of curbing the freedom of expression and criticism. ..."


Sweden

In May 1998, the Swedish Parliament enacted the "Act (1998:112) on Responsibility for Electronic Bulletin Boards". "Electronic bulletin board" is defined so as to include storage of data on the Internet, such as web pages (thereby excluding personal e-mail that is only stored in individuals' mailboxes).

The law places an obligation on service providers who store information (not those who only provide connections to the Internet) to remove or make inaccessible content that is obviously illegal under the penal code (they are not required to decide doubtful cases), that is:

  • instigation of rebellion (section 16 article 5)
  • racial agitation (section 16 article 8)
  • child pornography (section 16 article 10a)
  • illegal depiction of violence (section 16 article 10b)
  • or when it is obvious that the user has infringed copyright law.

According to the government's explanatory statements on the law, in computer areas where illegal messages often occur, service providers must check what is stored; in other areas, it is sufficient for service providers to check when someone complains that something illegal has been stored.

The above is a very brief summary of information available (as at March 2002) on the web site of Jacob Palme of the Department of Computer and Systems Sciences, Stockholm University/KTH, where more detailed information can be found.


United Kingdom

The United Kingdom has not enacted censorship legislation specific to the Internet and appears to have no intention of doing so.

In December 2000, a Communications White Paper setting out the "Government's response to the new communications environment" was jointly published by the Departments of Trade and Industry (DTI), and Culture, Media, and Sport (DCMS). The paper includes principles under which a new telecommunications and broadcasting regulator, the Office of Communications (OFCOM), will operate.

In relation to regulation of Internet content, the White Paper states that research suggests people want tools that help them control what they and their children will see on the Internet, "rather than third party regulation" and that government and industry partnership provides the best approach:

"6.3.4 ... We recognise, for example, that partnership provides the best approach for Internet services. In this area, therefore, we will expect OFCOM to continue to support the Internet Watch Foundation's work to allow Internet users to regulate their own Internet experience, or that of their children, by using rating and filtering systems, as outlined in section 6.10." (by_chapter/ch6/6_3.htm)

"6.10.1 OFCOM should ensure continuing and effective mechanisms for tackling illegal material on the Internet, such as those being pursued under the auspices of the Internet Watch Foundation. It will also promote rating and filtering systems that help Internet users control the content they and their children will see.

The Government sees enormous benefits in promoting new media, especially the Internet. But it is important that there are effective ways of tackling illegal material on the Internet and that users are aware of the tools available, such as rating and filtering systems, that help them control what they and their children will see on the Internet. Research suggests that this is what people want in relation to the Internet, rather than third party regulation. ..." (by_chapter/ch6/6_10.htm)

In September 1996, a non-government organisation named the UK Internet Watch Foundation (IWF) (originally the Safety-Net Foundation) was established by Internet Service Provider associations to implement proposals to deal with illegal material on the Internet, with particular reference to child pornography. The proposals were agreed between the major UK Internet Service Provider trade associations, the Department of Trade and Industry (DTI), the Home Office and the Metropolitan Police. The IWF's establishment followed a letter sent by the London Metropolitan Police to all Internet Service Providers on 9 August 1996, requesting that they censor Usenet newsgroups or else police would find it necessary to prosecute ISPs in relation to illegal material made available via their systems.

The IWF operates a hotline to enable members of the public to report child pornography or other illegal material on the Internet. According to the White Paper:

"When the IWF receives a report, it reviews the material and decides whether it is potentially illegal. If it is, it then tries to determine the origin of the material and notifies the UK police or appropriate overseas law enforcement agency. It also notifies UK ISPs that they should take the material down from their servers; if they do not, they risk prosecution. The Government will continue to encourage and support this work."

Since 1996, the IWF has dealt with material involving child pornography. In February 2002, the IWF announced it would henceforth also deal with "criminally racist content" and that the Home Office had provided IWF with "an extended guide to the application of the [UK] law to racism on the Internet - 'Racially Inflammatory Material on the Internet'". IWF said it envisaged it would concentrate its efforts on the offences referred to as "Threats of Direct Violence" in the guide.

Offline Classification:

In the UK, films, videos and some computer games are classified by the British Board of Film Classification (BBFC), an independent, non-governmental body. The BBFC's film classifications are not binding. Local councils have statutory powers in relation to films and may (and sometimes do) overrule the BBFC's decisions for films exhibited in their jurisdiction. Video recordings (other than those covered by exemptions) offered for sale or hire commercially in the UK must be classified by the BBFC, which is the classification authority designated by the Home Secretary under the Video Recordings Act 1984. Some video games/computer games (those that are likely to be restricted) must be classified by the BBFC under the VRA.

If a film or video work is obscene within the meaning of the Obscene Publications Act (OPA), or offends against other provisions of the criminal law, then the work is refused a BBFC classification certificate. (Under the OPA, an obscene article is one which, in the view of the court, tends to "deprave and corrupt" those likely, having regard to all relevant circumstances, to read, see or hear it.)

The BBFC must also consider the provisions of the European Convention on Human Rights (ECHR), which is incorporated in the UK Human Rights Act 1998. Article 10 of the ECHR contains a right to freedom of expression and requires that any restriction of freedom of expression imposed by national law must be capable of objective justification as being necessary in a democratic society for one of the purposes set out in Article 10. From October 2000, the courts have been able to consider issues arising under the ECHR, and hence to test the compatibility of UK law with the ECHR.

Non-violent sexually explicit films and videos in the UK are classified "R18". Sale of such videos is restricted to adults through licensed adult shops, and mail order sales are illegal; this is the reverse of the situation in Australia, where over-the-counter sale of such (X-rated) videos is banned in the States but mail order sale from the ACT and Northern Territory is lawful. On 18 July 2000, following an appeal case and public consultation, new, less restrictive R18 classification guidelines for sexually explicit material were issued.

A more detailed overview of UK classification, censorship and related laws is available in the Home Office's July 2000 Consultation Paper on the Regulation of R18 Videos. See also the BBFC web site.


United States of America

The USA Government has enacted two Federal laws intended to censor offensive online content. Neither of these laws is in force as at March 2002. The first law (the CDA) was struck down by the USA Supreme Court on First Amendment grounds. The second law (the COPA), which is more narrowly focussed and covers only communications made for commercial purposes on the World Wide Web, is the subject of a Court injunction (also on First Amendment grounds) preventing its enforcement pending a decision of the Supreme Court. That decision is expected to be handed down in the latter part of 2002.

Since 1996, four U.S. states (New York, New Mexico, Michigan and Virginia) have passed Internet censorship legislation restricting or banning online distribution of material deemed "harmful to minors". These laws have been struck down on Constitutional grounds.

Information about the two Federal laws is provided below.

The Communications Decency Act (CDA)

The CDA was enacted in February 1996. In the same month, a US Court issued a restraining order preventing its enforcement. In June 1996, a panel of federal judges in Philadelphia ruled the CDA unconstitutional. In June 1997, the US Supreme Court struck down the CDA on grounds that it violated the First Amendment.

The following brief information about the CDA is extracted from the USA Court of Appeals for the Third Circuit's decision (Feb 2000) on the COPA:

"The CDA prohibited Internet users from using the Internet to communicate material that, under contemporary community standards, would be deemed patently offensive to minors under the age of eighteen. In so restricting Internet users, the CDA provided two affirmative defenses to prosecution: (1) the use of a credit card or other age verification system, and (2) any good faith effort to restrict access by minors. In holding that the CDA violated the First Amendment, the Supreme Court explained that without defining key terms the statute was unconstitutionally vague. Moreover, the Court noted that the breadth of the CDA was "wholly unprecedented" in that, for example, it was "not limited to commercial speech or commercial entities . . . [but rather] [i]ts open-ended prohibitions embrace all nonprofit entities and individuals posting indecent messages or displaying them on their own computers".

Further, the Court explained that, as applied to the Internet, a community standards criterion would effectively mean that because all Internet communication is made available to a worldwide audience, the content of the conveyed message will be judged by the standards of the community most likely to be offended by the content. Finally, with respect to the affirmative defenses authorized by the CDA, the Court concluded that such defenses would not be economically feasible for most noncommercial Web publishers, and that even with respect to commercial publishers, the technology had yet to be proven effective in shielding minors from harmful material. As a result, the Court held that the CDA was not tailored so narrowly as to achieve the government's compelling interest in protecting minors, and that it lacked the precision that the First Amendment requires when a statute regulates the content of speech."

Child Online Protection Act (COPA)

COPA is the sequel to the CDA and aimed to avoid the constitutional defects of the CDA. COPA covers communications that are made for commercial purposes on the World Wide Web. It requires commercial Web publishers to ensure that minors do not access "material harmful to minors" on their Web site.

COPA was enacted on 21 October 1998. On 20 November 1998, the US District Court for the Eastern District of Pennsylvania issued a temporary restraining order against enforcement of the law and subsequently, on 1 February 1999, issued an injunction preventing the government from enforcing the law. On 22 June 2000, the US Court of Appeals for the Third Circuit upheld the lower court's injunction. The Court stated in its conclusion that "Due to current technological limitations, COPA -- Congress' laudatory attempt to achieve its compelling objective of protecting minors from harmful material on the World Wide Web -- is more likely than not to be found unconstitutional as overbroad on the merits."

The decision was appealed to the US Supreme Court and that Court's decision is expected to be handed down in the latter part of 2002.

An overview of COPA's provisions is included in the Court of Appeals February 2000 decision:

'COPA ... attempts to "address[ ] the specific concerns raised by the Supreme Court" in invalidating the CDA. COPA prohibits an individual or entity from:

"knowingly and with knowledge of the character of the material, in interstate or foreign commerce by means of the World Wide Web, mak[ing] any communication for commercial purposes that is available to any minor and that includes any material that is harmful to minors."

As part of its attempt to cure the constitutional defects found in the CDA, Congress sought to define most of COPA's key terms. COPA attempts, for example, to restrict its scope to material on the Web rather than on the Internet as a whole;4 to target only those Web communications made for "commercial purposes";5 and to limit its scope to only that material deemed "harmful to minors."

Under COPA, whether material published on the Web is "harmful to minors" is governed by a three-part test, each part of which must be satisfied before liability can attach:

(A) the average person, applying contemporary community standards, would find, taking the material as a whole and with respect to minors, is designed to appeal to, or is designed to pander to, the prurient interest;

(B) depicts, describes, or represents, in a manner patently offensive with respect to minors, an actual or simulated sexual act or sexual contact, an actual or simulated normal or perverted sexual act, or a lewd exhibition of the genitals or post-pubescent female breast; and

(C) taken as a whole, lacks serious, literary, artistic, political, or scientific value for minors.

COPA also provides Web publishers subject to the statute with affirmative defenses. If a Web publisher "has restricted access by minors to material that is harmful to minors" through the use of a "credit card, debit account, adult access code, or adult personal identification number . . . a digital certificate that verifies age . . . or by any other reasonable measures that are feasible under available technology," then no liability will attach to the Web publisher even if a minor should nevertheless gain access to restricted material under COPA.'

Footnotes from the Court of Appeals' decision:

'4. COPA defines the clause "by means of the World Wide Web" as the "placement of material in a computer server-based file archive so that it is publicly accessible, over the Internet, using hypertext transfer protocol or any successor protocol."

5. COPA defines the clause "commercial purposes" as those individuals or entities that are "engaged in the business of making such communications." In turn, COPA defines a person "engaged in the business" as one

who makes a communication, or offers to make a communication, by means of the World Wide Web, that includes any material that is harmful to minors, devotes time, attention, or labor to such activities, as a regular course of such person's trade or business, with the objective of earning a profit as a result of such activities (although it is not necessary that the person make a profit or that the making or offering to make such communications be the person's sole or principal business or source of income).'

Offline Classification:

In the USA, films, videos and computer games are not legislatively required to be classified prior to exhibition, sale or hire. Voluntary rating systems, established and administered by non-government bodies, are widely used.