Danielle: Hello and welcome to Safeguarding Soundbites. My name’s Danielle.
Natalie: And I’m Natalie. And this is the podcast that gives you everything you need to know from this week’s online safeguarding news. So Danielle, what are we going to be talking about today?
Danielle: Well first, we’re going to catch up on some social media news and updates. Then, we’ll be talking about a viral TikTok challenge, why you should be concerned about calculators, and our safeguarding success story of the week.
Natalie: Okay, sounds interesting. Shall we crack on?
Danielle: We shall! We’ll start over on X, formerly known as Twitter –
Natalie: – Still not used to it! –
Danielle: – I don’t think anyone is! But we’ll persevere. So, two main news stories with X this week. Firstly, they’ve updated their reporting system with a new flow that offers more specific reasons for reporting a post. It’s designed to be much simpler, easier, and quicker to do.
Natalie: Okay so if you’re reporting something for…let’s say child safety?
Danielle: It will then give you options for the best match, such as grooming, physical child abuse, underage user, minor at risk etc.
Natalie: It’s good to see Twitter –
Danielle: – X! –
Natalie: Sorry, X – I told you! – implementing better tools. There have been major concerns over safeguarding since Elon Musk took over, with reports of high numbers of moderators being let go. So let’s hope this new system allows users to feel more confident in reporting issues. Did you say there were two news stories on X?
Danielle: Yes, the other story is not really as positive, unfortunately. A new study by the European Commission has found that X has the biggest proportion of disinformation of the six big social networks studied.
Natalie: So that’s X, Facebook, Instagram, TikTok….
Danielle: LinkedIn and YouTube. And interestingly, YouTube has the lowest. Now this study didn’t involve the UK – they looked at Spain, Poland and Slovakia.
Natalie: Any reason why those countries?
Danielle: Yes, those are all countries that are deemed to be especially at risk when it comes to disinformation. In particular, disinformation on elections and the war in Ukraine.
Natalie: Okay. But it does make you question whether it’s the same here too. I remember talking on this podcast earlier in the year about X pulling out of the EU’s code to fight disinformation.
Danielle: They did. And back in April, a BBC investigation found hundreds of Russian and Chinese state propaganda accounts on X so we know this sort of thing is happening here, in America and worldwide.
Natalie: Be careful what you believe on X! Or on any social media site, I guess.
Danielle: Yes, and for parents, carers, and teachers, it’s always a good time to have those conversations with the children in your care. You can find resources on our website ineqe.com and on our safeguarding apps.
Natalie: Great, thank you. And finally for our social media safeguarding news, a BBC Three investigation into TikTok explains how they might be encouraging anti-social behaviour.
Danielle: Yes, this came out last week but we had our special episode on the Online Safety Bill. Basically, the investigation found that TikTok’s algorithm and design mean that people are seeing videos they wouldn’t normally see. It seems to be drawing attention and giving disproportionate engagement to certain content, causing what’s been called an ‘online frenzy’.
Natalie: So similar to the protests we saw in schools or when lots of young people turned up to vandalise or disrupt shops?
Danielle: Exactly. The BBC also found this occurring around particular crime or death-related incidents, like Nicola Bulley or the murder of four university students in Idaho.
Natalie: And didn’t some ex-staff speak out about this issue, too?
Danielle: Yes. According to the BBC, ex-employees have said the issue is not being tackled for fear of slowing down the growth of TikTok’s business.
Natalie: Hmm, that’s not great.
Danielle: Nope, it is not. TikTok did respond to the allegations from the BBC, saying that their algorithm brings together communities while prioritising safety and that the reason this content is recommended is to interrupt repetitive patterns.
Natalie: We are sticking with TikTok for our next story but sadly, we have a tragic incident to report. A 14-year-old girl from Ireland lost her life after potentially participating in a viral social media ‘challenge.’ The young girl was rushed to a Dublin hospital after falling ill. Authorities initiated an investigation amid concerns that she might have encountered this dangerous challenge on TikTok, which involved inhaling an aerosol, a practice commonly known as ‘chroming.’
Danielle: It’s a heartbreaking situation that unfortunately we hear about far too often. TikTok has issued a statement expressing their condolences to the family and emphasising that content of this nature is strictly “prohibited” on their platform and will be swiftly removed if detected.
Natalie: This serves as a stark reminder of the potential dangers of these viral challenges on social media platforms and the real-world consequences they can have, especially when young users are exposed to this kind of harmful content. We encourage you to remind the young people in your care to take a moment when they see an online challenge that could be dangerous – pause and think about the consequences of taking part.
Danielle: Exactly. Just taking a moment to think can help you to resist the pressure to join in and to make a safer choice.
Natalie: Absolutely, and it’s worth noting that we provide valuable information about viral and dangerous online challenges in our safeguarding apps and on our website at ineqe.com, so parents, carers, and educators can stay informed about these trends and better protect young people online.
Natalie: So Danielle, before you moved further into the world of Safeguarding, you were a teacher, weren’t you?
Danielle: That’s right, I was.
Natalie: Well, when you were teaching, would you have been concerned if you noticed a pupil who had a calculator app on their phone?
Danielle: At the time, I probably would have just thought they loved my maths lessons.
Natalie: And I think many parents and safeguarding professionals may think the same. But actually, having a calculator as an app on your phone can often be a bit of a red flag. It could in fact be what we call a ‘decoy app’.
Danielle: So that’s an app that looks fairly normal but actually, once a particular code is entered, it takes you to a vault where you can store and hide images, videos, messages or links.
Natalie: Exactly and we see this with young people who are sharing images and want to keep that hidden away from parents or peers.
Danielle: Unfortunately, our next story is one of the more concerning examples of how these decoy apps are being used. A teenager in England was caught with CSAM (child sexual abuse material) hidden in a fake calculator app. The teen admitted to making indecent images of children after hundreds of category A, B and C images and videos were found on his mobile phone.
Natalie: These images were only found after the young man messaged into a Snapchat group where he posed as a 15 year old girl.
Danielle: Yes, quite worrying. The offender was issued a two-year community order, under which he must complete 30 rehabilitation activity days and 120 hours of unpaid work. He will be placed on the sex offenders’ register and made subject to a sexual harm prevention order for five years.
Natalie: For more information on decoy apps and how to spot them, you can head to your safeguarding app.
Danielle: Oh, I heard some good news earlier this week.
Natalie: Is this our safeguarding success story of the week?
Danielle: No it’s not! But it could be…consider it a bonus! But yes, the NSPCC are making their book called ‘Pantosaurus and the Power of PANTS’ available in Welsh for every primary school, nursery and library in Wales.
Natalie: That’s a great name!
Danielle: It is! It’s a really great book that helps children recognise abuse and teaches them how to speak up if something makes them feel uncomfortable. It also gives advice for parents and carers of children aged three to eight on how to talk to children about abuse.
Natalie: Fantastic. And now it’s time for our safeguarding success story! So, this week, it’s the announcement of a national homework task on the 10th of October!
Danielle: Hmm I don’t know if more homework is going to be a very popular success story!
Natalie: Normally, probably not! But this is a really great and positive homework task. It’s part of ITV’s Britain Get Talking campaign and it’s designed to help young people open up about what’s on their minds.
On the 10th of October, which is World Mental Health Day, teachers are being asked to set their pupils the homework of asking a trusted adult to have a conversation with them about what’s worrying them. It’s that simple! But of course, we know that opening up isn’t actually simple at all, which is why this homework is so important.
Danielle: It’ll get the conversation started. And that makes it easier for the next time.
Natalie: Yes! And of course, for all of our teacher-listeners out there – don’t worry, there’s no marking required!
Danielle: I think I heard that sigh of relief there from our listeners!
Natalie: Well, that’s all from us this week. Join us again next week to find out more news and alerts. And in the meantime, follow us on social media by searching for Ineqe Safeguarding Group and visit our website ineqe.com to sign up for our safeguarding hub.