Hello and welcome back to your latest edition of Safeguarding Soundbites. My name’s Colin Stitt, Head of Safer Schools here at Ineqe Safeguarding Group and for those of you listening for the first time, this is the mini podcast that rounds up all of the week’s safeguarding news, whilst also letting you know all the latest from our safeguarding experts.
I hope you enjoyed your long weekend last week! We’ve been busy getting back into the swing of things here, so let’s get started with this week’s Safeguarding Soundbites.
When was the last time you sent a text? If you’re like me, it was probably a long time ago! These days, we almost all communicate via messaging apps – be it WhatsApp, Facebook Messenger or – for some people, it’s Telegram. Now, you may have heard of Telegram in the news – it’s had some controversial press over the years. But is it all bad and should we feel comfortable if the young people in our care are using it? Our online safety experts took a look – find out what they had to say in our online safety section at ineqe.com.
Pride Month has begun, the annual month for celebrating and reflecting on LGBTQ+ communities. For some young people who identify as part of this community, there can be extra challenges to face, something that as parents, carers and school staff, we should all be aware of. That’s why we’ve put together our State of the Nation report to help understand and support young people who identify as LGBTQ+. Visit our online safety section to find out more.
Now, if you’re a fan of the metaverse, you might be aware of Horizon Worlds – it’s the virtual universe created by Meta where users can interact, create and explore. The virtual reality platform is for users aged 18 and older, but there are concerning reports that children as young as six have been active on the platform. With adult-focused conversations and even incidents of sexual harassment and abuse, we think it’s important to make sure you’re aware of the risks of this VR platform and how to mitigate them. Learn more in our online safety section.
In the news, charities in the UK are asking for social media firms to be required by law to hand over data linked to children’s suicides. The group of 37 charities, including the NSPCC and Barnardo’s, want the families of children who have died or been seriously hurt to be given access to their social media data and want Ofcom to oversee the legislation.
The Internet Watch Foundation reports that digital fingerprints of a million images of child sexual abuse have been created. The fingerprints, known as hashes, help companies and police find copies of the images, in the hope that reuse of the images can be prevented.
In other news, Instagram has added a new Amber Alert feature to the platform, in order to help raise awareness when a child goes missing. Amber Alerts include details about a missing child, usually in the circumstances of an abduction, such as their photo, location and any key details that may help the child be recognised.
According to the latest Online Nation report by Ofcom, almost two thirds of children own a smartphone by the time they’re aged 10. The annual report covers how adults and young people are using the internet. Keep an eye out for our article on the Online Nation report next week!
That’s everything from me for this week – join me next time, when I’ll have more news and safeguarding advice. In the meantime, find us on social media by searching ‘Ineqe Safeguarding Group’. Thanks for listening and, as always, stay safe!