As seen above, inappropriate content covers a wide range of issues, and Insafe helplines categorise it under various headings. Overall, inappropriate content could be considered the main issue that helplines deal with, since cyberbullying and sexting (both the subjects of previous “Insafe insights…” reports) clearly involve content which is deemed inappropriate.
Inappropriate content is addressed at every Insafe Training meeting. Recently there were discussions about children and young people being exposed to inappropriate online challenges which, in some cases, had led to young people self-harming or committing suicide. Known as the Blue Whale Challenge, this was an issue which needed to be addressed quickly. Despite being a hoax, it was clear that young people were harming themselves as a result, and authorities faced a dilemma over when to alert people to the issue: do it too soon and you can be accused of scare-mongering; too late and you are not doing your job. Discussing issues of this kind at a network level means that a consistent approach can be applied, as was seen more recently with concerns over the Momo challenge. Helplines, awareness centres and social media providers met quickly following press coverage in Austria and the UK to devise a strategy for working with the media to provide useful information around the issue.
Safer Internet Centres that are part of the Insafe network enjoy positive relationships with many of the key social media providers. This is very important in being able to understand how best to support end users who encounter problems on the various platforms. Colleagues have a good knowledge of community standards/community guidelines and the most effective ways to have problematic content removed. One key piece of advice from social media providers is that users should provide as much information as possible when making a report: the more context that can be provided about an offending piece of content, the more likely it is to be dealt with properly.
Unfortunately, reporting mechanisms on social media platforms have not enjoyed a good reputation in recent years, with many users expressing frustration that they reported content but nothing happened as a result. Some platforms now keep users informed about the action taken on a particular report they have made, and many platforms are keen to provide as much transparency as possible regarding their reporting processes.
Similarly, some of the major providers have also released transparency reports which show how they have dealt with specific types of content, how much they have removed, and so on. Sadly, the figures are quite staggering, such is the volume of traffic that these sites attract.
Further statistics can be found at www.betterinternetforkids.eu/helpline-statistics.