Listen to what citizens say about online regulation
Technology is changing the world in all sorts of wonderful ways, but for many of us that sense of change can also be scary. We live in a time of proliferating data, sensors and intelligent machines. The ways we live, work, travel and stay healthy are going to continue to change long after the current pandemic is over. At BT we're excited and ambitious about what it could all mean. But we also know the pace of change can be jarring, disorientating and hard to understand.
That’s understandable given the clear risks and harms caused by illegal and damaging online content, in particular for children, who can be vulnerable to abuse and exploitation. People worry about what they might experience online, from harassment to disinformation and fraud. For some, it’s enough to put them off going online at all.
There is a noisy debate about how far Government should intervene in digital markets, whether through regulation or taxation. Many companies and commentators claim to know what’s best for people or what the public want, but too often these views aren’t evidence based.
Ofcom has published some very thorough analysis of the prevalence of different sorts of online harm and offence, and the extent of people’s concern. But there has been very little published research that tests what ordinary citizens think about how such things should be regulated and why.
BT has 30 million customers right across the UK with our three brands – BT, EE and Plusnet. We want to understand what they think there is to hope or fear from technology: we want to listen to them, and to give them a voice in these debates.
That’s why we asked Demos to conduct in-depth research into public attitudes to the different ways that the worst excesses of internet behaviour could be tackled, which was the focus of the Government Online Harms White Paper last year.
The research found that there is real consensus over the need for online service providers and social media companies to better protect those they put at risk, especially children. The headline results are striking and suggest that the Government has a very clear mandate to be bold with its forthcoming legislation. In a representative sample of more than 2,000 people:
More than half (53%) have themselves experienced online harm
Over 80% think the grooming, bullying and sexual exploitation of children are big problems for society
A majority think that responsibility for fixing that is shared – between different sorts of companies, politicians, police, regulators and individuals themselves
For the most serious crimes, the great majority (again over 80%) place most responsibility with the social platforms and with Government and regulators
77% favour ‘age verification’ measures to keep kids away from certain sites
Over 75% support the blocking of entire websites if companies fail to take the right steps to prevent online harm
None of this means the task of framing legislation is easy, because there are some complicated trade-offs involved. For example, if both social media companies and regulators are to be held responsible for preventing harm, where does the platform’s responsibility stop and when does the regulator take over? Is it right to include the contents of private messaging services in the scope of any future regulation: at what point should an individual user’s right to privacy give way to a need to protect the wider safety of children?
Within BT, from time to time we’ve had our own internal debates about how to approach these questions. In fact, that was one of the reasons we commissioned this research in the first place – because we could all agree that the public’s voice should be the loudest and the most important.
The research shows that many people are willing to make some sacrifices to their individual online liberties for the sake of the wider community’s protection and security. For example:
64% want to end online user anonymity because they think it allows more harmful behaviour
65% say people should not be free to express themselves online if what they say causes harm or serious distress to someone else
58% would be happy to see barriers to harmful content put up, even if that means accidentally censoring some non-harmful content
However, it’s important to say that these responses also seem to be less clear-cut than others. When tested in focus groups, people raised concerns about who should be able to decide what is harmful, and how. Perhaps counter-intuitively, some of those who have themselves been direct victims of online harm are less likely to see it as a big problem for society, or to propose the strongest action. But there are also some who decide to leave online spaces entirely, given their experiences there.
The most divisive question Demos asked was about whether it should be possible for government agencies to access the contents of private messages between two people, in order to identify and prevent the most serious crimes, such as child abuse and terrorism. Respondents were split roughly 50/50.
When people were given more time to talk this through in groups, they were quick to focus on issues such as the boundaries that could be put around access, and the checks and balances that could ensure the authorities targeted any extreme intervention of this sort at the most extreme examples of criminal behaviour. In other words, they got straight to the sort of debate you might expect to hear in Parliament.
All of which leads me to make two very broad suggestions to DCMS on the back of this research.
First, press on and publish a draft Bill as soon as possible: there is strong public demand for this legislation; the principles are very clear; and, if appointed as the regulator, Ofcom will do a great job.
At the same time: invite more open scrutiny and deliberation on the really tricky questions, about how to regulate in a way that provides both freedom and protection to all online citizens in the UK. These are fundamentally democratic questions, and many of the right answers are likely to come from further research and consultation with citizens themselves.
- Alex Towers, Director of Policy and Public Affairs, BT Group
Demos’s summary of their findings is here: https://demos.co.uk/wp-content/uploads/2020/10/Online-Harms-A-Snapshot-of-Public-Opinion.pdf
Full polling tables: https://demos.co.uk/wp-content/uploads/2020/10/Full-Polling-tables.pdf
Polling and focus group toplines: https://demos.co.uk/wp-content/uploads/2020/10/Polling-and-Focus-Group-Toplines.pdf