By Lynette Owens

On Monday, July 22, British Prime Minister David Cameron gave a speech strongly urging ISPs, search engines, and mobile operators to block consumers from accessing certain online content, namely pornography.  The motivation for the speech was a belief that this content is harmful to the well-being of children and that the technology industry should fix it.

The speech stopped short of calling for censorship, suggesting that consumers of age could opt in to access pornography if they so wished.  But it also carried a strong threat of legislation if the parties held accountable did not fall in line.

It’s hard to disagree with wanting good things for our children.  But it’s dangerous and, ironically, immoral to use them as a means to make a lot of noise and get credit for actions that do not fully solve the problem and create new ones in the process.

Criticism of the speech has come in many forms.  Here is ours.

It Wrongly Assumes Technology is the Answer

The goal of the UK ban is to shield children from online pornography and to stop all people from accessing content that contains child sexual abuse images or pornography depicting rape.

Such images are worth eliminating entirely from the Internet.  Many industries and organizations, from technology to financial to non-profits, including ours, have already done, and continue to do, numerous things to stop the distribution of child sexual abuse images.  In our case, we work closely with both the Internet Watch Foundation (IWF) and the National Center for Missing and Exploited Children (NCMEC) to ensure our products block access to child sexual abuse images.

But even those measures are not fool-proof.  The IWF and NCMEC will tell you that even with the help of law enforcement agencies, we are all still reacting (albeit as quickly as possible) versus preventing such images from getting online.  The images are being taken in places we don’t know about.  While those of us in the tech industry can always do more to help this issue, it ultimately cannot be 100 percent effective since the root cause of the problem is the creation of the images in the first place.  This is something the tech industry can never solve.

It Diminishes the Role of Parents

In his speech, Mr. Cameron used the words “filter” 29 times and “block” 11 times.  By contrast, he used the words “teach” and “educate” only three times each.

Though never stated overtly, the speech was ultimately a broader call to clean up content on the Internet that is harmful to kids.  The criticism many have already voiced is that there is no objective measure of what is or isn’t okay for our kids to see.  This is ultimately an individual family’s decision, based on their own values and the particular needs of their children.  Mr. Cameron, however, argues that by forcing filtering in the home “we can protect all children, whether their parents are engaged in Internet safety or not.”  This is absolutely the wrong message to parents; it both dismisses the importance of their role and gives them license to be disengaged and/or a false sense that their concerns will be handled by someone else.

Anyone, whether a parent or not, knows the well-being of a child depends greatly on their parents’ engagement.  My kids may see something that is harmful to them – however one defines “harmful” – whether online, on TV, or in real life.  While I don’t seek to deliberately expose them to such things, it is inevitable.  In April, my children saw news footage of the Boston Marathon bombings airing on a television in a restaurant where we dined.  Instead of denying this upsetting reality, it is my job to talk to my children about it and assure them they are safe and have nothing to worry about.

To assume that Internet filtering will eliminate the need for such conversations is a far cry from reality.  If the well-being of our children is the goal, then there should be greater emphasis and investment in encouraging parent involvement than on prosecuting search engines.

It Does Not Address the Origin of Content

Mr. Cameron’s speech also overlooks two issues of the content that he is concerned about.

First, in an age where social media and user-generated content comprise an enormous amount of the information we find online, some of the media that may be harmful to kids might be created by kids.  While the majority of content on the Internet is harmless, there has been concern about teens sending inappropriate images of themselves to others (particularly through popular apps like Snapchat, where images disappear after 10 seconds).  Filtering or blocking does nothing to prevent the creation of such images in the first place.  Parent involvement, along with online safety and digital literacy education in schools, is a better solution to this problem.

Second, the Internet is not the only way kids might be exposed to age-inappropriate or harmful content.  Mr. Cameron, where is your speech to the movie, television and video game industries?

It Creates New Privacy and Security Issues

Finally, in addition to falling short of being an effective solution, Mr. Cameron’s proposal gives rise to new problems.

First, it requires consumers to divulge personal information never previously required.  Those of age who do not want to be blocked from accessing legal pornographic content must identify themselves.  This alone has both angered and amused privacy advocates for its sheer audacity.  This information would presumably be stored by individual ISPs and mobile operators, though it is not entirely clear who would be responsible for collecting, storing, and protecting it.

A second issue, which follows from the first, is that such draconian measures may force people underground; they would encourage people to use proxy services to get around filters and access content that is already legal.  It is our belief that this will consequently compromise trust and safety on the Internet.  For those of us whose life’s work is to distinguish between the good and bad actors online, our job would become more difficult if these technical disguises were used even by those who are doing nothing wrong.

This criticism is not a campaign against filtering.  Online filters have their use and place.  It is part of our ongoing mission to educate everyone – from parents to schools to organizations to governments – that filtering in and of itself can never be the end.  It should be one step in a series of many to ensure that all of us are heavily involved in helping the world’s youngest citizens use the Internet in ways that help them succeed and contribute to society.  Filters are designed to be set and forgotten.  That’s the easy part.  Raising a generation of good digital citizens?  No one can do this alone.  Not a single government, organization, school, or family, by threat or otherwise.

Protecting our children online does not mean building higher walls around them.  It means teaching them to protect themselves: to think critically about what they see and share, to understand how others’ online actions can impact them and to be aware of how their own actions can impact themselves and others.  It is not the easy path, but it’s the only one worth taking.

Lynette Owens

Lynette Owens is Vice President of Global Consumer Education & Marketing at Trend Micro and Founder of the Internet Safety for Kids and Families program. With 25+ years in the tech industry, Lynette speaks and blogs regularly on how to help kids become great digital citizens. She works with communities and 1:1 school districts across the U.S. and around the world to support online safety, digital and media literacy and digital citizenship education. She is a board member of the National Association for Media Literacy Education, an advisory committee member of the Digital Wellness Lab, and serves on the advisory boards of INHOPE and U.S. Safer Internet Day.

Follow her on Twitter @lynettetowens.