Hello and welcome to Safeguarding Soundbites. As always, we’ll be checking out the week’s digital and safeguarding news, plus giving you the latest updates and advice from our Online Safety Experts.
Have you heard of or ever used Twitch, the online livestreaming platform where people broadcast themselves playing video games, going for walks, cooking and just about anything else you can think of?! You might be wondering how it works.
People – called streamers – livestream themselves while viewers watch and interact with the streamer via chat. Twitch users can subscribe to their favourite streamers and donate money to them. It’s a lot like following an influencer or celebrity on social media but the ‘live’ element is what really hooks viewers in.
The livestreaming platform has an average of a whopping 103,000 live streams happening at any given time…that’s a lot of people sharing their lives. Although it is mostly used to watch gamers play, the platform is also used by streamers to simply chat with their viewers or ‘take them along’ as they carry out activities or go about their daily lives. But unfortunately, the live aspect of Twitch has brought problems, and some safeguarding risks. Streamers have broadcast sexually explicit and inappropriate content, despite that being against Twitch’s Community Guidelines. Our Online Safety Experts have put together a Guide to Twitch, which explains what the platform is and what the risks are, and gives great top tips on how you can help the young people in your care stay safer on Twitch and other livestreaming platforms.
You can visit ineqe.com to find our Guide to Twitch, as well as a recent Safeguarding Alert about an explicit trend on TikTok and a Safeguarding Update about a new type of game on Roblox.
In the news this week, Meta has announced that they are co-launching a new platform that aims to stop intimate images of young people being posted online. Called ‘Take It Down’, it’s been created in conjunction with the National Center for Missing & Exploited Children. Young people will be able to ask participating apps to search for intimate images of themselves – images which, depending on the age of the person depicted, are referred to as youth-produced sexual imagery. Meta, the company behind Facebook and Instagram, has faced strong criticism in the past over its lack of action on child sex abuse images hosted on its platforms. In 2019, the National Center for Missing & Exploited Children received 16.9 million referrals of child sex abuse images from US tech firms, 94% of which came from Facebook.
As the Online Safety Bill continues its journey through parliament, the president of messaging app Signal has spoken out about the bill – and she’s not happy. In fact, she’s threatened to pull the app from the UK altogether should the bill go through. Signal prides itself on its commitment to privacy, with no screenshotting of messages, no tracking and no marketing. However, it’s the app’s end-to-end encrypted messaging that’s led Signal’s president to promise to “100% walk” away if the bill becomes law. The Online Safety Bill could mean that tech companies like Signal would be required to allow Ofcom to scan encrypted messages for child sexual abuse material and terrorism content. End-to-end encryption has long been at the centre of debate between online privacy campaigners and safeguarding organisations. With the bill’s details still being finalised as it passes through the House of Lords, it remains to be seen whether it will signal the end of Signal in the UK.
And it’s a bit of a change of tune for Twitter, which has announced a tightening of its rules around violent content. The social media platform has been in the spotlight recently due to its concerning behaviour when it comes to safeguarding, such as firing large numbers of its content moderation staff. But the recently announced policy will see a ban on wishing harm on others – ranging from traffic accidents and illness to death – along with threats against homes and infrastructure. The zero-tolerance approach won’t extend to speech related to sporting events or video games, however, nor to satire or artistic expression used to express a viewpoint.
The British Transport Police are launching a Snapchat campaign in cities across England to warn male teenagers against getting involved in gangs. Aimed at 13 to 15-year-olds, it’s been launched to help combat the use of social media by drug dealers to recruit boys into their gangs. This age group has been targeted and enticed with the promise of rewards like money, new clothes and mobile phones for completing tasks such as transporting drugs to different areas by train. The British Transport Police have reported finding teenagers as young as 13 being exploited by gangs. The Snapchat campaign will launch in London, Birmingham and Liverpool.
Some of the biggest platforms have announced new safety features for their users this past week. TikTok has planned improvements to its screen time tool, with additional options for users to trial. There will also be new default settings for teen accounts and plans to include more parental controls on the platform, such as customisable daily screen time limits and a schedule for muting app notifications.
Snapchat have announced that they are trialling the ability for users to pause their snap streaks. This is a big shift in one of their most popular features, as streaks are awarded to friends who have sent each other at least one snap every 24 hours. It is thought that being able to ‘pause streaks’ will help users manage their screen time and reduce the stress a user might feel if they are unable to continue their streak for any reason.
That’s everything for this week’s episode of Safeguarding Soundbites. We’ll be back again next week but, in the meantime, follow us on social media to stay up to date on what our Online Safeguarding Experts are up to – just search for INEQE Safeguarding Group. We’d love you to share this podcast with your friends, family and colleagues so we can help keep the children and young people in our care safer online. Speak to you next time!