
Colin: Hello, and welcome to Safeguarding Soundbites! I’m Colin

Danielle: and I’m Danielle! If you haven’t tuned in before, Safeguarding Soundbites is the weekly podcast that brings you all the latest in safeguarding news, advice, and updates in a ‘bite sized’ format!

Colin: We had taken a break for the summer holidays – with a summer special at the end of July – but we are back with some of the top stories from the last few weeks. We’ll be covering the latest updates on social media platforms, a new online challenge warning for parents and carers, new technology being used by the Internet Watch Foundation to detect abuse images, a new study that links excessive screentime to heart damage in kids, and our Safeguarding Success story of the week.

Danielle: Why don’t we jump back in by covering some of the updates on social media platforms?

Colin: That sounds like an X-cellent idea!

Danielle: Speaking of X, formerly known as Twitter, it’s been a rough few weeks for the platform. Since owner Elon Musk pushed through the rebrand, the platform has seen a decline in weekly active users and downloads.

Colin: What’s worrying is that apparently this rebranding also included getting rid of safety features. X has stated that it will be removing the block function from its features, a decision that is putting the platform in conflict with Apple’s App Store and Google Play Store guidelines. According to these guidelines, apps must provide a system for users to block content or other users. As we know, blocking is an important part of online safety. It helps users of all ages control their online spaces and limit interactions with harmful content or abusive users. Many anti-bullying activists and safeguarding professionals have criticised X for seeking to remove “a critical tool to keep people safe online”.

Danielle: I can’t help but agree – of all the questionable moves X has made, I can’t see the good that will come from this one.

Colin: Now Musk has stated that they will not be removing the block feature from direct messages. However, this is only one small section of the platform, and still leaves users at significant risk. Since the announcement, X employees have stood by the decision and promised that they are working on “something better” to protect users. It remains to be seen what that is, and when they will officially remove the blocking function. They have also announced plans to introduce call and video features, and an option to collect users’ biometric data. Our online safety experts will continue to monitor the situation, and we will update you with new information when we can!

Danielle: X hasn’t been the only social media platform making questionable decisions regarding user safety. TikTok has recently been fined for breaching children’s privacy in the EU. This comes after they were fined £12.7 million by the UK for illegally processing the data of over one million children under the age of 13. Regulators have claimed the platform has done “very little, if anything” to verify the ages of its users and remove any underage accounts.

Colin: Does this have anything to do with the EU’s Digital Services Act?

Danielle: Yes, it does! As we know, the EU Digital Services Act came into effect last year, but allowed time for social media platforms to become compliant with the requirements. Despite TikTok voluntarily submitting to a ‘stress test’ by the EU technology commissioner, it was discovered that the platform still has more work to do to ensure it is fully compliant with the regulations. Since then, TikTok has announced a host of new features for the EU, including the option for users to report posts and advertisements they believe are illegal, as well as the option to turn off personalisation.

Colin: Turn off personalisation? So they wouldn’t be seeing content decided for them by algorithms?

Danielle: Exactly. Rather than seeing content based on their interests, which may not be aligned with age-appropriate options, users who turn off personalisation will see content that is suitable for their age range and based on their country of residence. It’s not a perfect solution, but it is a step in the right direction.

Colin: It’s also encouraging to know that big social media platforms are being held accountable by legislation like the Digital Services Act and the UK’s Online Safety Bill, which we’ve covered in previous Safeguarding Soundbites. For those of you who don’t know, this legislation will make social media companies legally responsible for the safety of children and young people online.

Danielle: It includes removing illegal content quickly or preventing it from appearing at all, blocking harmful and age-inappropriate content, strengthening age limits and age verification processes, and ensuring that potential risks and dangers on the largest social media platforms are made more apparent to users.

Colin: Unfortunately, the Online Safety Bill is still making its way through the UK Parliament – and has been for quite some time now – with Ofcom stepping in this past June to provide advice for the government and online services regarding the bill.

Danielle: Hopefully this means we are a big step closer to seeing the Online Safety Bill come into effect!

Colin: In other news, a recent online challenge has prompted warnings to parents and carers about putting their children and young people at risk. Known as the ‘Egg Crack Challenge’, videos of parents cracking an egg on their unsuspecting child’s head have begun to go viral on TikTok and Instagram Reels.

Danielle: What!? Hold on – please explain this. Cracking an egg on a head?

Colin: Yes, it does sound a bit ridiculous. In the videos, a parent or carer is seen standing with their child at a counter. Normally they are baking or cooking, and it seems that the child has been asked to help.

Danielle: A good example of bonding between children and their caregivers.

Colin: It is – until it isn’t. When it comes time to crack the egg into the bowl, the parents take the egg and smack it directly on the child’s forehead. There have been mixed reactions in the videos themselves. Some of the children seem genuinely shocked before bursting into laughter, while others have been visibly upset, hurt, or confused. What is worse is that in nearly every video, the parent immediately begins to laugh at the child’s reaction, even if the child is upset.

Danielle: My first question is ‘why’?

Colin: For many parents, it’s seen as a bit of harmless fun. A laugh between them and their child.

Danielle: Except, of course, for the kids who are upset by it.

Colin: Exactly. Some of these children appear to be very young as well, at an age where they wouldn’t be able to grasp that it is ‘just a joke’ or even understand what an online challenge is. It’s important to remember that while there are always online challenges going around social media, what you do in the challenge can have a lasting harmful impact once the trend has died down.

Child psychology experts are especially concerned about any challenges that involve tricking young children. One British expert has explicitly described the challenge as abusive, saying that parents laughing at the child’s reaction might be experienced by the child as “humiliation” or “a lack of trust”. This could confuse them in a way that is emotionally damaging and could undermine the trust they have in their parent or carer going forward.

Danielle: It’s also worth noting that, in most cases, the child is unable to provide sufficient consent to this type of video being shared on social media.

Colin: Yes, and consent is key. The power dynamic is uneven between a child and the person who is meant to care for and protect them. Even if they are asked, a child may feel they can’t say ‘no’ or that they would be disappointing their parent. Parents and carers may innocently overlook this imbalance of power and miss the opportunity to check whether this ‘prank’ is painful or upsetting to their child.

Danielle: I think it’s important for all of us to remember that just because you can do something, doesn’t mean you should. A big part of teaching children and young people how to navigate the online world and its challenges is about modelling positive behaviours.

Colin: This means asking permission, considering what you’re going to share, reporting things that are harmful, and taking regular breaks from screens. It’s also crucial that children and young people learn that it’s ok to say no to harmful online trends from a young age.

Danielle: Exactly. By showcasing the ability to not go along with something just because it is trending at the time, you’ll help build up their digital resilience.

Colin: Which is a win-win for everyone. For more information on harmful online challenges and trends, look in your safeguarding app or check out our helpful resource on our website. Print it out, share it with friends on social media, and help spread awareness of the risks that can come with trending videos.

Danielle: A technological breakthrough has made the detection and removal of child sexual abuse material, or CSAM, a little easier, thanks to a grant from Nominet. Nominet is the official registry for .UK domain names, giving them wide access to a range of websites. They have partnered with the Internet Watch Foundation (IWF) to ensure this technology is utilised effectively.

Colin: And what kind of technology is this?

Danielle: It is referred to as ‘clustering technology’, which is a way of describing digital resources on one or more connected systems that are transparently available to users. Don’t worry, that didn’t make sense to me at first either – but essentially, it’s being hailed as ‘a revolution’ in assessing online child sexual abuse imagery because it helps analysts assess multiple CSAM images in seconds, rather than hours.

Colin: Oh wow! That’s quite a lot!

Danielle: It is. According to Nominet, it will increase the capacity of the IWF’s hotline image assessment system by 112%. The IWF have said the technology makes it faster to identify images featuring the same victim, to confirm whether or not the victim is a child, and to determine whether criminal abuse is taking place. It will also help them assess criminal images in larger numbers, rather than losing critical time to individual assessments.

Colin: This seems completely groundbreaking, especially as the IWF are the leading experts in this sector.

Danielle: The hotline manager at the IWF has said that what this means for their analysts is “hard to overstate”, as it not only makes their results more accurate at a faster rate but also protects the analysts from prolonged exposure to extreme imagery.

Colin: Advances like these are an incredible example of the way that tech can be used for good. They are vital to creating a safer online space for all, and protecting children and young people as they grow up in the digital world. It also helps victims who have already gone through something traumatic – which actually brings me on to our next story.

A new pilot project is taking place in Scotland. Known as Bairns’ Hoose, the scheme aims to provide a “safe space” for children and young people who have been victims of violent and sexual crimes.

Danielle: Ah yes, I’ve heard about this scheme! It will allow children and young people to give pre-recorded evidence, correct?

Colin: Yes. They will be able to remain in a safe, calming environment and give their evidence to a specially trained police officer without needing to visit a police station or a courthouse. It’s based on the ‘Barnehus’ model trialled in Norway, and some judges are already suggesting that the Bairns’ Hoose model could remove the need for children and young people to be cross-examined by solicitors or to revisit the trauma multiple times.

Danielle: We know at times these processes can add to the trauma of the situation for a child or young person who is already suffering.

Colin: Exactly. The Scottish government has supported this pilot scheme in full and hopes to see it rolled out in other areas across the UK.

Colin: Researchers have found a worrying link between inactivity in the teenage years and signs of heart damage in young adulthood, with increased screentime being a major contributing factor. Studies have suggested that this could lead to heart attacks and strokes in later life.

Danielle: My goodness – that’s quite a shocking find!

Colin: Yes, it is. The study tracked over 700 UK children for 13 years to ensure the results were thorough and spanned different ages, genders, and other variables. It used smartwatches to measure their activity throughout the week, and the participants were assessed at ages 11, 15, and 24.

Danielle: Was there any difference between the ages?

Colin: There was. Researchers found that average sitting time increased from 6 hours a day at age 11 to 9 hours a day at age 24. A scan was used to assess the weight of the heart’s left ventricle, which according to the study is an effective way to predict adverse heart events in adult life and to assess any cardiac damage in children and young adults. Apparently, an increase in left ventricle weight was linked to longer periods of inactivity, such as time spent sitting.

Danielle: So, how does increased screentime factor in?

Colin: As we know, most screentime activities involve sitting down or being ‘inactive’. It’s important to note that this isn’t just a reality at home, either! Many schools and workplaces involve consistent use of screens throughout the day while we sit at a desk, which adds to the overall screentime a person might experience.

Danielle: And a lot of the time, when we come home from school or work, we immediately do something that involves the use of a screen, like scrolling through social media, watching TV, or playing videogames.

Colin: Which we are all guilty of, not just children and young people! It’s also worth mentioning that enjoying screentime activities like gaming is not necessarily a bad thing – it’s all about moderation! I say this because some of the media outlets that picked this story up have run headlines like “Playing videogames linked to heart disease”, which can immediately raise alarm bells about that particular activity.

Danielle: Even though it’s actually all forms of screentime that are contributing.

Colin: Right! The study’s author, a doctor at the University of Eastern Finland, said that parents should try to encourage children and young people to move more by going for walks, and should limit the amount of time they spend on social media and video games in favour of other screen-free activities.

Colin: Now that kids are heading back to school, make sure to introduce a healthy balance of screentime and downtime for your entire household. Having screen-free zones like the dinner table or bedrooms can help, as well as creating a screentime schedule for everyone – even you – to follow. You can also encourage family walks, playing outside with friends, or extracurricular classes as active off-screen alternatives to help get them moving!

Danielle: For more ideas on how to break bad screentime habits and get your whole family moving, check out our Screentime Activity pack on our website!

Danielle: We know that screentime, like all things, should be used in moderation by people of all ages. Playing videogames is included in that.

Colin: It is! We should also consider some of the positives of videogames, as they can help teach children and young people important skills like teamwork, problem solving, creativity, critical thinking, and quick reflexes.

Danielle: Precisely! So, for our first Safeguarding Success story of the school term, let’s focus on something a little more positive in the world of gaming.

Colin: Indeed. UK videogame trade body Ukie has recently released a new campaign called ‘ParentPowerUps’, designed to help families have “conversations about responsible gameplay”. It offers support and guidance on how to use parental controls that help manage screen time, in-game purchases, online interactions, and access to age-appropriate content.

Danielle: It’s great to see. ParentPowerUps actually found that almost 7 in 10 parents already talk to their children about the amount of time they spend gaming – but I suppose that doesn’t tell us much about how those conversations are actually going.

Colin: Well, if it’s anything like my household, those conversations aren’t always easy.

Danielle: But that’s exactly what ParentPowerUps is aiming to do – help families find the right balance and language to use when it comes to gaming. A big part of this initiative is the ‘PowerUpPact’, a downloadable form that can be used to agree on specific aspects of gaming in the household.

Colin: What are some of the things it covers?

Danielle: This form lets families agree on how much gaming time there is per day, how much pocket money can be spent on games per week, who the child is allowed to game with, if they are allowed to play online games, and which PEGI ratings should be followed when it comes to choosing a game.

Colin: That’s such a great way to make sure parents and carers cover all the important gaming questions with their kids!

Danielle: It is! ParentPowerUps is also fronted by some big names, like footballer Jermaine Jenas and comedian Judi Love, and they encourage parents not only to have these conversations but also to get involved with gaming themselves. Professor Tanya Byron, who partnered with Ukie to create the PowerUpPact, said that videogaming “can be a great way to engage in fun activities with your children” and will help parents understand the usage of gaming controls and restrictions.

Colin: Sounds like it’s time to get the dual controllers powered up!

Danielle: It does! Or, at the very least, walk through your child’s favourite game with them.

Colin: Which will help them feel that you value their interests. And it’s certainly an excellent way to encourage discussions about healthy gaming habits. Where can parents and carers find this resource?

Danielle: We’ve placed this resource in your safeguarding app to make it easier for you to find. You can also visit the askaboutgames website for more information, including recommendations for how to approach conversations.

Colin: I loved that success story. And if you’re wanting to brush up on any of your gaming lingo beforehand, you can check out our Gaming Buzzwords resource on both our website and your safeguarding app!

Colin: Well, I don’t know about you Danielle, but it feels great to get back to the studio for weekly Safeguarding Soundbites!

Danielle: It really does, Colin!

Colin: We’ll be back next week with more safeguarding updates and news for you. Until then, you can keep up to date on everything we’re doing by visiting our website or heading to your safeguarding app.

Danielle: You can also follow us on social media by searching for Safer Schools!

Colin: Until next week, goodbye and…

Both: Stay safe!
