
National Online Harms Consultation Responses Catch22, August 2020

The Catch22 Online Harms Consultation was launched in June 2020, acknowledging the increasing amount of time young people were spending online as a result of the lockdown.

As we adapt and develop services across the UK, we need insight into what is happening on the frontline across youth services.

Catch22 received responses from young people, frontline professionals, tech platforms and commissioners: this summary is an insight into those responses.

WHAT DID YOUNG PEOPLE SAY?

Catch22 consulted with 16 to 23-year-olds and received detailed responses from 22 young people.

Top apps young people report using in 2020:

  • Facebook
  • Instagram
  • Snapchat
  • YouTube
  • Twitter
  • TikTok

What would you do if you saw something concerning online?

“I would report it if it’s genuinely offensive and against community standards, although I’m beginning to give up because nothing will get done. Otherwise, I sometimes ignore stuff. If people request that I complain to support them, I will.”

Do you read the terms and conditions for the platforms or apps you use?

73% do not read any terms and conditions on any platform they use.

Of those who say they do read them, 67% said it does not affect how they use the app.

What should social media platforms be doing to better support their users?

Improvements to privacy and moderation were the key themes, including reference to blocking ‘fake news’, blocking harmful comments and users, and automatically tracking and blocking certain phrases or images. A number of young people requested checks for under-age users.

“When blocking someone … the app should ask why you are blocking the person.”
“Sometimes people complain about the platform not acting [on reported content] and then those posts are taken down! They act to protect themselves but not their users.”

What is your favourite content online?

  • Talking and gaming with friends
  • Fitness and health videos
  • Good news stories
  • Recipes
  • Funny videos
  • Discussion and debates
  • Music and music videos

If there was one thing you could do to make the online world a safer place for young people today, what would you do?

Almost all responses referred to improving online moderation and privacy checks, citing the need for quicker responses to harmful content, the blocking of fake accounts, and limits on users with a history of posting harmful comments. Other respondents referred to the need for more online counselling support, such as Kooth.

“Social media companies need to take more responsibility for the exposure of explicit and fake content to young and vulnerable people.”
“Simple terms and conditions … which can be easily understood.”

WHAT DID SERVICE COMMISSIONERS AND PROVIDERS SAY?

We received detailed responses from third sector organisations, Police and Crime Commissioners and researchers.

100% of respondents are working with other organisations to improve online safety

What are they seeing?

“The vast majority of CSE offences involve the use of social media and other online applications or websites”

“The grooming of a 12-year-old female, who was encouraged to leave their local area via train, where she met up with a 21-year-old male and was sexually abused.”

“Those programmes which equip adults and carers to talk to children and young people, support them to navigate the online space and are built on developing open dialogue and trust between young people and the adults who care for them and support them.”

What do they want to see?

  • Total transparency of social media identity
  • Up-to-date technology for support staff
  • Lengthy enforceable bans for cybercrime offenders
  • Place the burden on technology platforms to develop safe spaces
  • Education for parents and carers on the risks of online behaviour, and the benefits of understanding those risks
  • Policing of social media platforms, to prevent the sharing of inappropriate images and the ability to remove photos from the internet
  • Improved moderation from social media platforms
  • Well-evidenced safeguarding policies
  • High-quality training for parents and professionals working with young people.
“Improvements in regulation and oversight, both via increased responsibility of ISPs and increased resources for open source investigation and proactive law enforcement”

But they are facing barriers to tackling online harms too:

  • Organisational understanding of the risks
  • Keeping up with the rapid evolution of platforms
  • Funding for the extra work
  • Keeping ahead of the online environment
  • Lack of training for staff
  • Risks to staff members in using platforms securely
  • Digital access and inclusivity: “not all volunteers and mentors have smartphones, preventing engagement”
  • Barriers from social media platforms putting checks and balances in place
  • Determining the skills required and embedding the prevention of online harm into everyday business.
“Managing what are typically highly complex and resource intensive online investigations, often with multiple interconnected victims and perpetrators.”

What are they doing that works?

They are taking varied approaches to addressing online harms.

WHAT DID FRONTLINE WORKERS SAY?

We asked frontline experts working directly with young people what harms they are seeing and what help is needed.

We received responses from 75 practitioners, including teachers and youth workers.

What are their main concerns about online safety?

  • Grooming, cyber bullying and the sending and receiving of explicit images were mentioned as concerns by almost all respondents.
  • The constant changes to platforms and privacy settings, without the training to match.
  • Mental health issues aggravated by content seen online.
  • Lack of age verification – we received reports of children as young as 10 years old being groomed online.
  • Harmful content – practitioners want to see automated blocking of certain images and content.

What questions do they have?

“What are the most common social media platforms used by young people and for what reason do they use them?”
"How do we get young people to talk honestly about their online activity?”
“How do we approach the mental health issues which result from negative online activity?"
“How do I support a young person who is not responding to me online? For example, a bullying incident happened and they are now not answering messages or calls from staff. When we worked face-to-face, it wasn’t easy for them to ignore us like they can with phone or emails.”

What do they want to see?

  • Shared knowledge on the latest harms being seen
  • Relevant, regular and up-to-date training
  • Guidance on how to talk to young people about online harms
  • Insight into the harms they are facing
  • Which social media platforms are being used the most, and why young people want to use them

WHAT DO TECH AND SOCIAL MEDIA PLATFORMS SAY?

We asked technology platforms what online behaviour concerns them or their platform, and what they are seeing.

A range of online harms concerned the respondents: hacking, digital viruses, scams and fraud, child sexual exploitation and abuse, and online bullying were all raised as key issues for young users and their platforms.

Helen Burrows, Content and Policy Services Director, BT:

“We have worked closely with the Government and social media on online conspiracy theories around 5G, although this experience has revealed the weaknesses of a 'self regulatory' approach. We pro-actively filter out hacks and scams, offer security software to our customers, offer a range of tools including parental controls to protect, and enable our customers to protect themselves.”

Facebook:

"Due to the unprecedented COVID-19 situation, we took the decision in March to temporarily send these content reviewers home, for their safety. As a result, since Mid- March we have been operating with a reduced content review workforce. With a reduced and remote workforce, we will now rely more on our automated systems to detect and remove violating content and disable accounts … This means some reports will not be reviewed as quickly as they used to be and we will not get to some reports at all. Specifically related to youth crime in the UK, we don’t allow any criminal organisation or gang to have a presence on Facebook and Instagram. We have consulted a number of UK experts on policies relevant to the area of serious youth violence, including the sale of knives as well as promotion of regulated goods, which includes knives."

What do they want to see?

“Differentiated devices with a management system which flags their age range to sites and services so they can only access content that is suitable. For younger cohorts, give them access only to a limited list of suitable services, not the open internet.”

“Create digital identities and make online anonymity impossible on social platforms.”

“At a time when young people are faced with so much uncertainty and anxiety around their futures, positive content that can bring inspiration about their future is needed more than ever.” - Facebook

WE ASKED ALL RESPONDENTS WHAT PROGRAMMES ARE HAVING AN IMPACT:

Only 20% of frontline practitioners nationwide were able to name support or prevention programmes which discuss online behaviour and have a strong impact.

Those referenced as effective were:

The Social Switch Project: Funded by Google.org and the Mayor of London’s Violence Reduction Unit, with an advisory board that includes researchers and representatives from the Metropolitan Police and leading social media platforms. The project has:

  • Trained more than 500 frontline professionals on how the relationship between social media and youth violence should be understood, tackled and solved.
  • Trained and developed 40 young people to launch their digital careers.
  • Distributed more than £75,000 in grants for grassroots projects across London, reaching approximately 5,000 young people.

The Digital Safety Ambassadors programme with the Diana Award and Childnet: Reached over 1,700 schools, and nearly 40,000 young people are now trained as anti-bullying ambassadors.

Drillosophy: Online workshops created and delivered by Ciaran Thapar and Reveal Poison via a highly popular YouTube channel. The workshops reach vulnerable young people who might be difficult to reach through traditional outreach work.

Livity, Facebook and The Prince’s Trust: A six-week residency for six creatives through January 2020, tasking them with building a youth-led digital content campaign for those affected by youth violence.

Wud U? by Barnardo’s: An educational app for use with young people who might be at risk of sexual exploitation.

CEOP Training: Training from CEOP, the law enforcement agency helping to keep children and young people safe from sexual abuse and grooming online.

NSPCC Online Safety Training: E-learning for teachers, social workers, volunteers and youth workers to keep children safer online.

Bullybusters: An anti-bullying initiative providing training and awareness sessions for young people, children, professionals, governors and parents or carers.

Mental health services were also heavily referenced as a need in this space, including:

  • Young Minds
  • Samaritans
  • Charlie Waller
  • Kooth
For more information, contact: Catch22 Marketing Team | marketing@catch-22.org.uk