Filtering content

In mid-March 2017 I published the following piece on my LinkedIn page. A few days later Davey Winder linked to it from his article in SC Magazine on Age Verification and the progress of the Digital Economy Bill.

Age Verification under the Digital Economy Bill 2016-17

Last month (February 2017) on the BBC R4 Today programme there was a horrific piece by Matthew Price on a nine-year-old child viewing a beheading video on the internet. This material should not be available to children. Around the same time, Barnardo's released a report on child-on-child abuse and its extraordinary increase, owing to the lack of filtering on children's tablets and mobile phones.

Gloria Steinem, in a 2016 conversation with Professor Heather McGregor, said that exposure among teenagers to large amounts of graphic content and pornography was increasing violence against women by normalising what would otherwise be aberrant behaviour. Her statements resonate with a 2013 TEDMED talk in which violence is treated as an infectious disease. Although the medical evidence is not definitive, a reasonable application of the precautionary principle suggests that age verification controls should be mandated on our internet systems, to stop children having easy access to graphic video content and video pornography, on public-health as well as social and moral grounds.

My company, SafeCast, is an associate member of the Digital Policy Alliance (DPA) in the UK Parliament. The DPA is the organisation behind the age verification provisions in the Digital Economy Bill (DEB), currently going through Parliament, which form part of a universal child protection strategy.

John Carr OBE is a strong advocate of age verification on the internet for the protection of children. He is a member of the Executive Board of the UK Council for Child Internet Safety (UKCCIS), which is co-chaired by three Government Ministers: one from the Department for Education, one from the Home Office and one from the Department for Culture, Media and Sport. At the end of January 2017, John wrote in his blog:

"The great majority of social media sites say that 13-year-olds may use their services but because few, if any, do any kind of age verification we know that 75% of all 10-12-year-olds in the UK are on them. However, on Twitter and Tumblr, for example, there is a phenomenal amount of hardcore porn published by individual account holders. For practical purposes, it can be viewed by anyone. There is no justification for a site that specifies 13 as its minimum age to provide ready access to 18+ material, and if the same site also knows that in fact large numbers of sub-13s are customers their position is completely indefensible. We cannot let the Digital Economy Bill pass into law without addressing this. ..."

The way that Parliament, through the DEB, will establish the universal child protection requirements demanded by children's advocates such as John Carr is through ‘contractual enforcement of age membership terms’, backed up by Ofcom regulations and rules on Internet Service Providers. All providers of broadcasting and internet services in the UK come under the aegis of Ofcom regulation, and contractual enforcement is to be based around the ‘internet filters’ provisions set out in Clause 91 of the DEB:

  91 Internet filters

  (1) A provider of an internet access service to an end-user may prevent or restrict access on the service to information, content, applications or services, for child protection or other purposes, if the action is in accordance with the terms on which the end-user uses the service.

  (2) This section does not affect whether a provider of an internet access service may prevent or restrict access to anything on the service in other circumstances.

  ...

In my opinion, this form of words (the use of a civil law enforcement power by internet access service providers rather than criminal sanctions and penalties) avoids the ‘net neutrality’ and censorship problems that beset Baroness Howe’s attempts to protect children and led to the withdrawal of her Online Safety Bill in 2015.

However, a ‘restriction on access for child protection or other purposes’ is, in context, ‘age verification’: that is the clear meaning of the legalistic language. Hence the DEB encourages all ‘providers of internet access services’ to implement the BSI Age Verification Standard (AVS). All bodies that provide internet access services are licensed and regulated by Ofcom, so the clause effectively mandates that every such organisation implement the AVS in some shape or form in its electronic access devices (tablets, mobile phones, PCs, televisions and so on).

The problem is that few people have actually had sight of the AVS, so they do not know what is in it or how it is going to work. In my opinion, however, it is clear from the context that, through the provisions of Clause 91 of the DEB, the AVS is going to be required and enforced by civil law restrictions as well as by criminal law sanctions. Civil law will require the filtering of some content from age zero upwards; criminal and civil law will require the filtering and exclusion of a smaller quantity of content at age 18 and above. This will not be a simple binary requirement of content being allowed or not allowed at the 18 age of majority. Rather, it will be a graduated scale, enforced by civil law requirements, which flips into an area subject to criminal sanctions (in addition to the civil penalties) at age 18.
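To make the graduated scale concrete, here is a minimal sketch of how a provider's filter might behave. It is purely illustrative: Clause 91 does not prescribe an algorithm, and the threshold logic below is my hypothetical reading of the civil/criminal split, not anything defined by the Bill or the AVS.

```python
# Hypothetical sketch of Clause 91-style graduated filtering.
# The age thresholds and outcome labels are illustrative assumptions only.

def access_decision(content_min_age: int, viewer_age: int) -> str:
    """How a provider might treat a content request for a verified viewer age."""
    if viewer_age >= content_min_age:
        return "allow"
    if content_min_age >= 18:
        # At 18+ the restriction flips into territory backed by criminal
        # sanctions in addition to civil penalties.
        return "exclude (civil and criminal enforcement)"
    # Below 18, the graduated scale is enforced by civil-law filtering only.
    return "filter (civil enforcement)"

print(access_decision(15, 12))  # filter (civil enforcement)
print(access_decision(18, 12))  # exclude (civil and criminal enforcement)
print(access_decision(12, 15))  # allow
```

The point of the sketch is that the decision is not a single yes/no at the age of majority but a scale, with the nature of the enforcement changing at 18.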

Filtering from age zero upwards can work without censorship only if content is labelled by its creator. Hence SafeCast's process of labelling video content.

Labelling - The SafeCast Headcodes and a demonstration of their use

I explain below the history of content labelling and provide a link to our SafeCast Headcode system so that readers can test out our proposals, which are free for all content creators.


The unfettered access by children to the flood of pornographic content from tube sites such as Kink.com and Pornhub arises because there has been no universally accepted labelling standard for video content and its suitability for children. Every labelling system promulgated over the past three decades has added cost for the site operator. With tight margins and legitimate concerns about censorship, website operators have been able to avoid including metadata labels in their content.

Content labelling issues are not new. In the mid-1990s the World Wide Web Consortium (W3C), an international community led by Sir Tim Berners-Lee, set about drawing up web standards. The W3C quickly identified the need for an international standard specification to enable labels (metadata) to be associated with internet content, and developed the PICS specification (Platform for Internet Content Selection). PICS was designed to help parents and teachers control what children access on the internet, and it became the construct on which rating services and filtering software, such as NetNanny and K9 Web Protection, were built.
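For flavour, a PICS-1.1 label looked roughly like this (the rating-service URL is the RSACi service used in the specification's own examples, whose categories score violence, sex, nudity and language on 0-4 scales; the page URL and values here are illustrative):

```text
(PICS-1.1 "http://www.rsac.org/ratingsv01.html"
 labels on "1996.04.16T08:15-0500"
   for "http://example.com/clip.html"
   ratings (v 1 s 0 n 0 l 0))
```

Filtering software read labels of this kind, served either with the page or by a third-party label bureau, and compared the scores against limits a parent had set.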

PICS was eventually superseded by the Protocol for Web Description Resources (POWDER), which brought greater flexibility to the PICS classification system. Unfortunately, the POWDER specification settled on a design that separated content labels from the content itself, a design inappropriate for cloud-based operations. Then, as an additional complication, it started to add quality labels to content. Rather than a simple binary statement (Is there any violence in this clip? Yes/No), POWDER found itself trying to answer questions such as “Is this a good piece of literature? Is this an important website?” As a result, the POWDER web classification system became one of the tools of “astroturfers”, as battles raged between Search Engine Optimisers trying to push their clients’ sites up the search engines’ listings and Google’s analytical algorithms trying to show only the best sites to its users.

In 2009 the W3C approved POWDER as “the recommended standard method for describing Web sites and building applications that act on such descriptions”. Unfortunately, by the end of this process it was useless as a practical tool for protecting children from pornographic or inappropriate video content.

That is why our SafeCast Headcode system is important. My wife, Diana Kelman, and I built on the analysis and simplicity of one of the founders of computing. In 1947 John von Neumann, the famously gifted Hungarian mathematician, was the keynote speaker at the first annual meeting of the Association for Computing Machinery. In his address he said that future computers would get along with just a dozen instruction types, a number known to be adequate for expressing all of mathematics. He went on to say that one need not be surprised at this small number, since 1,000 words were known to be adequate for most situations in real life, and mathematics was only a small part of life, and a simple part at that.

Coming from a technology background, we reviewed the process of labelling video content on television and the internet, and formed the view that it is equivalent to enumerating the instruction types required for expressing all of mathematics. Our analysis built on sixty years of experience in UK commercial television; on Baroness Howe’s work at the Independent Broadcasting Authority (where she famously established that the TV Watershed should be a ‘watershed and not a waterfall’); on the growth of internet delivery of video; and on the need for uniform international standards to protect children from seeing inappropriate, harmful or violent material, whether on television or on the internet via their tablets and mobile phones. We also took on board Ofcom’s requirements restricting content and advertising to children, the concerns of the National Association of Head Teachers, who sought to protect young children from coming to harm in the UK, and reports from other organisations and agencies involved in child protection and in the sale of inappropriate content to children (e.g. the 2015 Mothers’ Union report “Bye Buy Childhood”).

Our analysis suggested that the number of head codes needed is very small. In our view, just six head codes can label all video content on television and the internet. We call these the SafeCast Headcodes. They can be used free of charge to embed metadata in video content and thereby allow it to be filtered in accordance with Clause 91 of the DEB. Our proposals are therefore well suited to serve as the taxonomy underpinning an international standard for child protection.
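Since the six Headcodes themselves are not enumerated in this piece, the sketch below uses placeholder identifiers "A" to "F". It illustrates only the core idea: a binary, creator-applied label per code (in contrast to POWDER's quality judgements), matched against a viewer-side filter.

```python
# Hypothetical sketch: "A".."F" are placeholders, not the real SafeCast
# Headcodes, which are not listed in this piece.

HEADCODES = {"A", "B", "C", "D", "E", "F"}  # placeholders for the six codes

def label_content(*codes: str) -> frozenset:
    """Creator side: attach the applicable codes as metadata in the clip."""
    unknown = set(codes) - HEADCODES
    if unknown:
        raise ValueError(f"unrecognised headcode(s): {sorted(unknown)}")
    return frozenset(codes)

def passes_filter(label: frozenset, blocked: set) -> bool:
    """Viewer side: show the clip only if none of its codes are blocked."""
    return not (label & blocked)

clip = label_content("B", "E")
print(passes_filter(clip, {"E"}))  # False: the filter blocks code "E"
print(passes_filter(clip, {"A"}))  # True: no blocked code present
```

Because each code is a simple yes/no applied by the creator, the filter needs no judgement of quality or context, which is what keeps the scheme free of censorship concerns.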

The SafeCast Headcode Labeller (demonstrator)

To test out the SafeCast Headcode labeller (demonstrator), please go to the SafeCast website and click on the large logo. This will start up a Google Form which demonstrates the labelling operations required when uploading a video to a site such as YouTube or Facebook. There are two modes, Expert and Beginners, which are self-explanatory. Please feel free to contact me with your questions after trying the demonstrator.


Ours is a simple system mapped onto the Ofcom regulatory framework. Programme makers within the major broadcasters already decide on their scheduling in accordance with the TV Watershed regulations, so they could easily insert the SafeCast Headcodes that match the broadcasting decisions taken in their viewing rooms. Overnight, all mainstream video broadcasting would become automatically filterable in a measured manner, protecting children and vulnerable people.