Internet censorship in the United Kingdom is conducted under a variety of laws, judicial processes, administrative regulations and voluntary arrangements. It is achieved by blocking access to sites as well as through laws that criminalise the publication or possession of certain types of material. These include English defamation law, UK copyright law,[1] and legislation against incitement to terrorism[2] and child pornography.
British citizens have a negative right to freedom of expression under the common law.[3] Since 2000, under the Human Rights Act 1998, UK courts have been required to interpret domestic legislation, so far as possible, compatibly with the European Convention on Human Rights, including the guarantee of freedom of expression in Article 10; public authorities likewise have a duty to act compatibly with Convention rights. Where a compatible interpretation is impossible, certain courts can issue a declaration of incompatibility, but the incompatible domestic legislation remains intact and it is for Parliament to decide whether to amend it to bring it into line with the Convention. Moreover, the Convention itself contains a broad sweep of exceptions.
The law provides for freedom of speech and press, and prohibits arbitrary interference with privacy, family, home, or correspondence, and the government routinely respects these rights and prohibitions. An independent press, an effective judiciary, and a functioning democratic political system combine to ensure freedom of speech and press. Individuals and groups routinely use the Internet, including e-mail, to express a wide range of views.[4]
Since the mid-2000s there has been a gradual shift toward increased surveillance and police measures in the UK. National security concerns, terrorism and crime, and issues regarding child protection have resulted in the state introducing extensive surveillance measures over online communications as well as filtering and tracking practices. In some cases these are encouraged or required by the state and used by state agencies. In others they are voluntarily implemented by private operators (e.g., internet service providers).
The country was listed among the "Enemies of the Internet" in 2014 by Reporters Without Borders,[5] a category of countries with the highest level of internet censorship and surveillance that "mark themselves out not just for their capacity to censor news and information online but also for their almost systematic repression of Internet users".[6] Other major economies listed in this category include China, Iran, Pakistan, Russia and Saudi Arabia. However, in subsequent reports the UK was no longer listed as an "enemy of the internet".[7]
In 2017 the Communications Select Committee set up an inquiry as to whether to, and how to, further regulate the Internet in the UK.[8]
See main article: Web blocking in the United Kingdom.
UK mobile phone operators began filtering Internet content in 2004[9] when Ofcom published a "UK code of practice for the self-regulation of new forms of content on mobiles".[10] This provided a means of classifying mobile Internet content to enable consistency in filtering. All major UK operators now voluntarily filter content by default and when users try to access blocked content they are redirected to a warning page. This tells them that they are not able to access an 'over 18 status' Internet site and a filtering mechanism has restricted their access. Categories that are listed as blocked include: adult / sexually explicit, chat, criminal skills, drugs, alcohol and tobacco, gambling, hacking, hate, personal and dating, violence, and weapons.[11] Users who are adults may have the block lifted on request.[11]
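The default-on filtering described above can be illustrated with a minimal sketch. The category names follow the list given in the code of practice; the function, the site-to-category mapping and the warning-page URL are hypothetical, purely for illustration of the default-block-with-adult-opt-out behaviour:

```python
# Hypothetical sketch of default-on mobile content filtering: requests for
# sites in a blocked category are redirected to a warning page unless the
# account holder has had the block lifted as a verified adult.
BLOCKED_CATEGORIES = {
    "adult / sexually explicit", "chat", "criminal skills",
    "drugs, alcohol and tobacco", "gambling", "hacking", "hate",
    "personal and dating", "violence", "weapons",
}

WARNING_PAGE = "https://example-operator.test/over18-warning"  # hypothetical

def resolve_request(url_category: str, adult_verified: bool = False) -> str:
    """Return the URL to serve: the requested content, or the warning page.

    By default every account is assumed to belong to an under-18 user;
    the block is lifted only after the subscriber opts out as an adult.
    """
    if url_category in BLOCKED_CATEGORIES and not adult_verified:
        return WARNING_PAGE
    return "ALLOW"

# A gambling site is blocked by default...
assert resolve_request("gambling") == WARNING_PAGE
# ...but served once the subscriber has had the block lifted on request.
assert resolve_request("gambling", adult_verified=True) == "ALLOW"
assert resolve_request("news") == "ALLOW"
```

The key design point reflected in the text is that the filter is opt-out rather than opt-in: the default assumption is that the user is under 18.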
Guidelines published by the Independent Mobile Classification Body were used by mobile operators to classify sites until the British Board of Film Classification took over responsibility in 2013.[12] Classification determines whether content is suitable for customers under 18 years old.[13] The default assumption is that a user is under 18.
The following content types are blocked for under-18s:[13]
Significant overblocking of Internet sites by mobile operators has been reported, including the blocking of political satire, feminism and gay content.[14] Research by the Open Rights Group highlighted the widespread nature of unjustified site blocking.[15] In 2011 the group set up Blocked.org.uk, a website allowing users to report sites and services that are blocked on their mobile network.[16] [17] The website received hundreds of reports[18] of the blocking of blogs, business sites, internet privacy resources and internet forums across multiple networks. The Open Rights Group also demonstrated that correcting the erroneous blocking of innocent sites can be difficult. No UK mobile operator provides an on-line tool for identifying blocked websites. The O2 website status checker[19] [20] was available until the end of 2013 but was suspended in December[21] after it had been widely used to determine the extent of overblocking by O2.[22] Not only were civil liberties and computing sites being blocked,[23] but also the websites of Childline, the NSPCC and the police. An additional opt-in whitelist service aimed at users under 12 is provided by O2; it only allows access to websites in a list of categories deemed suitable for that age group.[24]
The main focus of political censorship in UK law is the prevention of political violence. Incitement to ethnic or racial hatred is a criminal offence in the UK, and those who create racist websites are liable to prosecution. Incitement to hatred against religions is an offence in England and Wales under the Racial and Religious Hatred Act 2006. Holocaust denial is not an offence per se unless it contravenes other laws. Other legal exceptions to the principle of freedom of speech include the following:
In September 2014 Home Secretary Theresa May proposed the introduction of Extremism Disruption Orders. These would allow judges to ban people deemed extremists (but who "do not break laws") from broadcasting, protesting in designated places or posting messages on social media.[34]
There are a number of legal exceptions to freedom of speech in the United Kingdom that concern pornography. These include obscenity and indecency, including corruption of public morals and outraging public decency. The UK has a markedly different tradition of pornography regulation from that found in other Western countries. It was almost the only liberal democracy not to have legalised hardcore pornography during the 1960s and 1970s. Pre-existing laws, such as the Obscene Publications Act 1959, continued to make its sale illegal through the 1980s and 1990s. Additionally, new laws were introduced to extend existing prohibitions. The Video Recordings Act 1984 required the BBFC to censor all video works before release. As a result, the UK became one of the few liberal democracies where the sale of explicit pornography on video (and later DVD) was illegal (thus opening the market to unlicensed pornography shops which technically operated in defiance of the haphazardly enforced laws).[35]
The appearance of the Internet during the 1990s introduced unregulated access to hardcore pornography in the UK for the first time. The existing legal and regulatory framework came to be seen as insufficient and in the 21st century a number of measures have been introduced, including web blocking and additional criminal legislation. Nevertheless, the Obscene Publications Act is still in force, and it makes it illegal for websites that can be accessed from the UK without age restriction to contain certain types of adult content.[36]
See main article: Child pornography laws in the United Kingdom. The first attempts to regulate pornography on the Internet concerned child pornography. Legislation in the form of the Protection of Children Act 1978 already existed making it illegal to take, make, distribute, show or possess an indecent photograph or pseudo-photograph of someone under the age of 18. The R v Bowden case in 2000 established that downloading indecent images of children from the Internet constituted the offence of making, since doing so causes a copy of the image to exist which previously did not exist.[37]
Initial steps to restrict pornography on the Internet were taken by the UK police. In the 1990s they began to take a pro-active regulatory role with respect to the Internet, using existing legislation and working on a self-tasking basis. In August 1996, the Metropolitan Police Clubs & Vice Unit sent an open letter to the Internet Service Providers Association (ISPA) supplying them with a list of 132 Usenet discussion groups that they believed to contain pornographic images or explicit text and requesting that they ban access to them.[38] The list mainly included newsgroups which carried child pornography. Ian Taylor, the Conservative Science and Industry Minister, warned ISPs that the police would act against any company which provided their users with "pornographic or violent material".[39] Taylor went on to make it clear that there would be calls for legislation to regulate all aspects of the Internet unless service providers were seen to wholeheartedly embrace "responsible self-regulation". Following this, a tabloid-style exposé of ISP Demon Internet appeared in the Observer newspaper, which alleged that Clive Feather (a director of Demon) "provides paedophiles with access to thousands of photographs of children being sexually abused".[40]
During the summer and autumn of 1996 the UK police made it known that they were planning to raid an ISP with the aim of launching a test case regarding the publication of obscene material over the Internet. The action of the UK police has been described as amounting to censorship without public or Parliamentary debate. It has been pointed out that the list supplied to ISPs by the police in August included a number of legitimate discussion groups concerned with legal sexual subjects. These contained textual material without pictures that would not be expected to infringe UK obscenity laws.[41]
See main article: Internet Watch Foundation.
The direct result of the 1996 campaign of threats and pressure was the setting up of the Internet Watch Foundation (IWF), an independent body to which the public could report potentially criminal Internet content, both child pornography and other forms of criminally obscene material. These reports would be passed on to ISPs and the Police as a ‘notice and takedown’ service for the removal of potentially illegal content hosted in the UK. It was intended that this arrangement would protect the internet industry from any criminal liability. The IWF was also intended to support the development of a website rating system.[42] [43] Demon Internet was a driving force behind the IWF's creation, and one of its directors, Clive Feather, became the IWF's first chairman.[44]
After three years of operation, the IWF was reviewed for the Department of Trade and Industry (DTI) and the Home Office by consultants KPMG and Denton Hall. Their report was delivered in October 1999 and resulted in a number of changes being made to the role and structure of the organisation, and it was relaunched in early 2000, endorsed by the government and the DTI, which played a "facilitating role in its creation", according to a DTI spokesman.
At the time, Patricia Hewitt, then Minister for E-Commerce, said: "The Internet Watch Foundation plays a vital role in combating criminal material on the Net." To counter accusations that the IWF was biased in favour of the ISPs, a new independent chairman was appointed, Roger Darlington, former head of research at the Communication Workers Union.
The Google search engine Google Search includes a SafeSearch filter which restricts the content returned by a search. In December 2012 the option to turn the filter off entirely was removed.[45]
In July 2013 Prime Minister David Cameron called on Internet search engines to "blacklist" certain search terms, so that they would bring up no results. Microsoft quickly responded by introducing a blacklist provided by the Child Exploitation and Online Protection Centre (CEOP). A 'pop-up' warning appears on the UK version of its search engine Bing when searches contravene the blacklist.[46] In November 2013 Google announced that 100,000 "blacklisted" search terms would no longer give any results, while 13,000 would produce a warning message. Child protection experts, including a former head of the CEOP, have warned that these measures will not help to protect children because most child pornography on the Internet is on hidden networks inaccessible through these search engines.[47]
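The two-tier approach described above, in which some queries return no results at all while others return results behind a warning, might be sketched as follows. The term sets, index structure and function name are hypothetical, purely to illustrate the distinction between the two tiers:

```python
# Hypothetical sketch of a two-tier search-term blacklist: queries matching
# one list return no results; queries matching the other return results
# accompanied by a deterrence warning.
NO_RESULTS_TERMS = {"blacklisted term a"}   # ~100,000 terms in Google's case
WARNING_TERMS = {"blacklisted term b"}      # ~13,000 terms produce a warning

def handle_query(query: str, index: dict) -> dict:
    q = query.lower()
    if any(term in q for term in NO_RESULTS_TERMS):
        # Tier 1: the query is suppressed entirely.
        return {"results": [], "warning": None}
    if any(term in q for term in WARNING_TERMS):
        # Tier 2: results are returned, but with a warning message.
        return {"results": index.get(q, []), "warning": "deterrence message"}
    return {"results": index.get(q, []), "warning": None}

index = {"blacklisted term b": ["page1"], "weather": ["forecast"]}
assert handle_query("blacklisted term a", index)["results"] == []
assert handle_query("blacklisted term b", index)["warning"] is not None
assert handle_query("weather", index) == {"results": ["forecast"], "warning": None}
```

As the child-protection experts quoted above note, such a filter operates only at the search-engine layer: material on hidden networks never enters the index and is unaffected.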
In 2009 the UK Ministry of Justice claimed that legislation was needed to reduce the availability of hardcore paedophilic cartoon pornography on the internet.[48] The decision was made to make possession of cartoon pornography depicting minors illegal in the UK. The Coroners and Justice Act 2009 (sections 62–68), which came into force on 6 April 2010,[49] created an offence in England, Wales and Northern Ireland of possession of a prohibited image of a child.[50] The maximum penalty is three years' imprisonment and inclusion on the sex offenders register.[51]
A prohibited cartoon image is defined as one which involves a minor in situations which are pornographic and "grossly offensive, disgusting or otherwise of an obscene character". The Act makes it illegal to own any picture depicting under-18s participating in sexual activities, or depictions of sexual activity in the presence of someone under 18 years old. The definition of a "child" in the Act includes depictions of 16- and 17-year-olds who are over the age of consent in the UK, as well as any adults where the "predominant impression conveyed" is of a person under the age of 18. "The law has been condemned by a coalition of graphic artists, publishers, and MPs, fearing it will criminalise graphic novels such as Lost Girls and Watchmen."[48]
See main article: Section 63 of the Criminal Justice and Immigration Act 2008. Calls for violent adult pornography sites to be shut down began in 2003, after the murder of Jane Longhurst by Graham Coutts, a man who said he had an obsession with Internet pornography.[52] Jane Longhurst's mother and sister also campaigned to tighten laws regarding pornography on the Internet. In response the government announced plans to crack down on sites depicting rape, strangulation, torture and necrophilia.[53] [54] [55] However, in August 2005 the Government announced that instead of targeting production or publication, it planned to criminalise private possession of what the Government now termed "extreme pornography".[56] [57] This was defined as real or simulated examples of certain types of sexual violence as well as necrophilia and bestiality. The passing of the Criminal Justice and Immigration Act 2008 resulted in the possession of "extreme pornographic images" becoming illegal in England and Wales as of January 2009.[58]
The law has been criticised for criminalising images where no crime took place in their creation.[59] Additionally, the law's placing of liability on consumers rather than producers has been criticised for creating a power imbalance between the individual and the state. There has never been a legal challenge to the law in the UK as the cost of doing so would be beyond most individuals.[60] In 2011, there were over 1,300 prosecutions under the law, compared to the Government estimate of 30 cases a year.[61] [62]
In 2004 in Scotland, a committee of Members of the Scottish Parliament backed a call to ban adult pornography as the Equal Opportunities Committee supported a petition claiming links between porn and sexual crimes and violence against women and children.[63] A spokeswoman said "While we have no plans to legislate we will, of course, continue to monitor the situation." In 2007, MSPs looked again at criminalising adult pornography, in response to a call from Scottish Women Against Pornography for pornography to be classified as a hate crime against women. This was opposed by Feminists Against Censorship.[64] [65] In September 2008, Scotland announced its own plans to criminalise possession of what it termed "extreme" adult pornography, but extending the law further, including depictions of rape imagery.[66] These plans became law with the Criminal Justice and Licensing (Scotland) Act 2010.
In July 2013, David Cameron proposed that pornography which depicts rape (including simulations involving consenting adults) should become illegal in England and Wales bringing the law in line with that of Scotland.[67] These plans became law with the Criminal Justice and Courts Act 2015.
In January 2019, the Crown Prosecution Service amended their advice regarding prosecutions under obscenity laws of depictions of acts that are themselves legal to perform, stating that they "do not propose to bring charges based on material that depicts consensual and legal activity between adults, where no serious harm is caused and the likely audience is over the age of 18".[68]
See also: Audiovisual Media Services Regulations 2014 and ATVOD.
The Audiovisual Media Services Regulations 2014 require that the online streaming of videos (known as Video On Demand or VOD) in the UK conforms to the BBFC R18 certificate regulations which had previously only restricted those sold in licensed sex shops.[69] The regulations were first announced in July 2013 by David Cameron.[67]
The UK regulator of VOD is Ofcom, which replaced ATVOD as the regulator from the beginning of 2016.[70] During its tenure as regulator ATVOD regularly instructed UK websites to comply with its rules and failure to do so resulted in Ofcom issuing a fine or shutting down a website.[36] [71] It is a criminal offence not to restrict access to adult VOD content to those aged over 18, by means such as requiring the user to provide credit card details.[72]
In March 2014 ATVOD proposed new legislation that would introduce a licensing system for all UK adult content providers. The verification of customers' ages would be a condition of granting a license. Furthermore, there would be a legal requirement on financial institutions to block the customer payments of unlicensed adult websites.[73]
See main article: Revenge porn.
An amendment to the Criminal Justice and Courts Act 2015 creates a specific offence in England and Wales of distributing a private sexual image of someone without their consent and with the intention of causing them distress (commonly called "revenge porn"). The maximum custodial sentence is two years. The law received Royal Assent and came into effect in February 2015.[74]
Pressure for a change in the law came from reports in April 2014 by UK charities including The National Stalking Helpline, Women's Aid, and the UK Safer Internet Centre that the use of revenge porn websites had increased. Women's Aid Chief Executive Polly Neate stated, "To be meaningful, any attempt to tackle revenge porn must also take account of all other kinds of psychological abuse and controlling behaviour, and revenge porn is just another form of coercive control. That control is central to domestic violence, which is why we're campaigning for all psychological abuse and coercive control to be criminalised". In July, Justice Secretary Chris Grayling announced plans to "take appropriate action" to address revenge porn in Britain.[75] A House of Lords Committee, in a report on social media crime, subsequently called for clarification from the Director of Public Prosecutions (DPP) as to when revenge porn becomes a crime.[76] [77]
See main article: R v Walker.
R v Walker, sometimes called the "Girls (Scream) Aloud Obscenity Trial", was the first prosecution for written material under Section 2(1) of the Obscene Publications Act in nearly two decades.[78] It involved the prosecution of Darryn Walker for posting a story entitled "Girls (Scream) Aloud" on an internet erotic story site in 2008. The story was a fictional written account describing the kidnap, rape and murder of pop group Girls Aloud.[79] It was reported to the IWF who passed the information on to Scotland Yard’s Obscene Publications Unit. During the trial the prosecution claimed that the story could be "easily accessed" by young fans of Girls Aloud. However, the defence demonstrated that it could only be located by those specifically searching for such material. As a result, the case was abandoned and the defendant cleared of all charges.[80] [81]
In October 2013 a press exposé resulted in a number of on-line e-book retailers removing adult fiction titles including descriptions of rape, incest or bestiality from their download catalogues.[82]
See main article: Proposed UK Internet age verification system.
With the passing of the Digital Economy Act 2017, the United Kingdom became the first country to pass a law containing a legal mandate on the provision of an Internet age verification system. Under the act, websites that publish pornography on a commercial basis would have been required to implement a "robust" age verification system.[83] [84] The British Board of Film Classification (BBFC) was charged with enforcing this legislation.[85] [86] [87] After a series of setbacks, the planned scheme was eventually abandoned in 2019.[88]
Social media in the United Kingdom are subject to a number of laws which restrict the range of comments that users can make.
See main article: Communications Act 2003.
Section 1 of the Malicious Communications Act 1988 criminalises sending another any article which is indecent or grossly offensive with an intent to cause distress or anxiety (which has been used to prohibit speech of a racist or anti-religious nature).[89] [90]
Section 127 of the Communications Act 2003 makes it an offence to send a message that is grossly offensive or of an indecent, obscene or menacing character over a public electronic communications network.[91] The section replaced section 43 of the Telecommunications Act 1984 and is drafted as widely as its predecessor.[92] The section has controversially been widely used to prosecute users of social media.[93] On 19 December 2012, to strike a balance between freedom of speech and criminality, the Director of Public Prosecutions issued interim guidelines, clarifying when social messaging is eligible for criminal prosecution under UK law. Revisions to the interim guidelines were issued on 20 June 2013 following a public consultation[94] and have been updated since then.
See main article: English defamation law.
The fact that existing libel laws apply to Internet publishing was established by the Keith-Smith v Williams case of 2006. The usual limitation period of one year after publication for libel suits does not apply to Internet publishing, because each occasion on which material is accessed on the Internet is treated as a new publication. As a result, many newspapers and journals do not keep controversial material in their on-line archives for fear of potential libel suits.[95] In addition, individuals without the financial means to defend themselves against libel suits can be reluctant to publish controversial material on-line. With older forms of publishing, the media companies themselves had legal responsibility for what they published, but with social media such as Twitter it is the users, not their online hosts, who bear legal responsibility.[96]
Individuals who are defamed online may also not have the financial means to seek legal redress. The UK Ministry of Justice drew up plans in 2008 to give such individuals access to low-cost legal recourse but these proposals were never implemented.[97] Instead the Defamation Act 2013 (which came into force on 1 January 2014[98]) reformed libel law to allow new defences and introduce a requirement for claimants to show that they have suffered serious harm.[99] The intention behind the reform was to make it harder to bring libel suits in Britain.[100]
Exceptions to freedom of speech include prior restraint, restrictions on court reporting including names of victims and evidence and prejudicing or interfering with court proceedings,[101] prohibition of post-trial interviews with jurors,[101] and scandalising the court by criticising or murmuring judges.[101] [102]
The use of social media to comment on a legal case can constitute contempt of court, resulting in the fining or imprisonment of the social media user. This can happen if a trial is seriously prejudiced as a result of a comment, such as a breach of jury confidentiality, resulting in the need for a retrial.[103] It can also happen if the identity of an individual is publicly revealed when their identity is protected by a court. For instance, victims of rape and serious sexual offences are entitled as a matter of law to lifelong anonymity in the media under the Sexual Offences Act 1992, even if their name has been given in court.[104]
There have been a number of instances of users of social media being prosecuted for contempt of court. In 2012 the R v Evans and McDonald rape trial generated more than 6,000 tweets, with some users naming the victim on Twitter and other social media websites. Nine people were prosecuted.[105] In February 2013, the Attorney General's Office instituted contempt of court proceedings against three men who used Twitter and Facebook to publish photographs which allegedly showed the two murderers of the toddler James Bulger as adults. This use of social media breached a worldwide injunction that prevented publication of anything that could identify the pair.[106]
In December 2013 the Attorney General's Office set up a Twitter account to provide advice to individuals using social media. The advice is intended to help individuals avoid committing contempt of court when commenting on legal cases. The professional news media routinely receive such advice.[107]
On 11 August 2011, following the widespread riots in England, British Prime Minister David Cameron said that Theresa May, the Home Secretary, would meet with executives of the Web companies Facebook and Twitter, as well as Research In Motion, maker of the BlackBerry smartphone, to discuss possible measures to prevent troublemakers from using social media and other digital communications tools.[108] During a special debate on the riots, Cameron told Parliament:
Everyone watching these horrific actions will be struck by how they were organised via social media. Free flow of information can be used for good. But it can also be used for ill. And when people are using social media for violence we need to stop them. So we are working with the police, the intelligence services and industry to look at whether it would be right to stop people communicating via these Web sites and services when we know they are plotting violence, disorder and criminality.
Critics said that the British government was considering policies similar to those it had criticised in totalitarian and one-party states.[109] In the immediate aftermath of the 2011 England riots, Iran, often criticised by the West for restricting the Internet and curbing free speech, offered to "send a human rights delegation to Britain to study human rights violations in the country".[110]
On 25 August 2011 British officials and representatives of Twitter, Facebook and BlackBerry met privately to discuss voluntary ways to limit or restrict the use of social media to combat crime and periods of civil unrest.[111] The government was seeking ways to crack down on networks being used for criminal behaviour, but was not seeking any additional powers and had no intention of restricting Internet services.[112] It was not clear what new measures, if any, would be taken as a result of the meeting.
The practice of file sharing constitutes a breach of the Copyright, Designs and Patents Act 1988 if it is performed without the permission of a copyright holder. Courts in the UK routinely issue injunctions restricting access to file sharing information published on the Internet. The British Phonographic Industry represents the interests of British record companies and along with the British Video Association encourages UK governments to regulate and legislate to reduce copyright infringement. As a result, the Digital Economy Act was passed in 2010. Further legislation has been suggested, such as the 2014 proposal for a general law to prevent search engines from returning file-sharing websites as search results.[113]
See main article: Digital Economy Act 2010.
The Digital Economy Act 2010 is the only Internet-specific legislation regarding copyright in the UK. Progress on the implementation of the Act was slow,[114] [115] and in the end its copyright-enforcement measures were never brought into force.
The Act had proposed a Code to be drafted by Ofcom and implemented by Parliament, containing provisions restricting the downloading of copyrighted material from the Internet. Under the Act, warning letters would have been sent to Internet users suspected of downloading copyright-infringing material (provided their ISP had more than 400,000 customers), and a customer receiving three such letters in one year would have been recorded by their service provider and could have been subject to a civil claim by the copyright holder under the Copyright, Designs and Patents Act 1988 (the copyright holder having first sought the subscriber's identity using a court order). After these provisions had been in force for a year, additional rules could then have been applied, requiring ISPs to reduce the download speed of repeat offenders and in some cases disconnect their Internet supply. The Act originally allowed the Secretary of State to order the blocking of websites which provided material that infringed copyright, although this section was dropped following the successful use of court orders to block websites. Commentators debate the practicality of such controls and the ability of the UK government to exact control.[116]
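The "three letters in one year" scheme described above amounts to a simple per-subscriber counter with a rolling time window. A minimal sketch, under the assumption (for illustration only) that dates are represented as day numbers and that the class and method names are hypothetical:

```python
from collections import defaultdict

# Hypothetical sketch of the Act's proposed warning-letter scheme: each
# infringement report triggers a letter, and a subscriber who accumulates
# three letters within one year is recorded by the ISP and could face a
# civil claim by the copyright holder (after a court order reveals identity).
class WarningLetterScheme:
    def __init__(self):
        # subscriber id -> list of letter dates (as day numbers)
        self.letters = defaultdict(list)

    def report_infringement(self, subscriber: str, day: int) -> None:
        """Record that a warning letter was sent on the given day."""
        self.letters[subscriber].append(day)

    def eligible_for_claim(self, subscriber: str, day: int) -> bool:
        """True if three or more letters fall within the last 365 days."""
        recent = [d for d in self.letters[subscriber] if day - d < 365]
        return len(recent) >= 3

scheme = WarningLetterScheme()
scheme.report_infringement("sub1", day=0)
scheme.report_infringement("sub1", day=100)
assert not scheme.eligible_for_claim("sub1", day=100)   # only two letters
scheme.report_infringement("sub1", day=200)
assert scheme.eligible_for_claim("sub1", day=200)       # third letter within a year
```

The sketch covers only the recording stage; the later "technical measures" (speed reduction, disconnection) would have been a separate escalation applied after the provisions had been in force for a year.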