
Colin: Hello and welcome to Safeguarding Soundbites!

Natalie: This is the podcast for catching up on all the week’s most important online safeguarding news.

Colin: And for finding advice and getting top tips on some common safeguarding concerns. My name’s Colin.

Natalie: And I’m Natalie.

Colin: As always, let’s start off with some social media news. Natalie, what’s been happening this week?

Natalie: Okay, so this week, it came to light that Twitter is set to leave the EU Code of Practice on Disinformation. Now I know that may sound very boring, but bear with me because it could actually have some significant safeguarding impacts for young people.

Colin: Okay, I’m listening!

Natalie: Good! So the EU Code of Practice on Disinformation is an agreement that all the major platforms operating within the European Union have signed to say, basically, ‘we promise to stick to our obligations to fight fake and false information circulating on our platform’. That encompasses a range of obligations, from reducing the number of malicious bots and deepfakes on their platforms to empowering fact-checking communities, but it really boils down to penalising, removing and reducing disinformation.

Colin: And disinformation is purposely spreading information that is known to be incorrect.

Natalie: Yes – so misinformation is information that is wrong – but that could be accidental. When you’re spreading disinformation, there’s an intent there and it’s pretty much never a good intent! It could be to damage someone’s reputation, to influence politics or to con or scam people.

At the minute, the EU Code of Practice is voluntary. Companies like Meta, TikTok and Google have all signed up.

Colin: And Twitter too?

Natalie: Yes, but now it seems they are going to pull out of the agreement. That’s concerning, because it raises questions about how they will moderate content and, more widely, how they will safeguard their users in general. We’ve mentioned a few times on this podcast Elon Musk’s mass firing of content moderators. Now we have this ‘takesie-backsie’, essentially, on a promise to fight the spread of maliciously false content.

Colin: Okay, so for young people on Twitter, Natalie, what could this mean, risk-wise?

Natalie: Risk-wise Colin, young people using Twitter are potentially going to be exposed to not only a lot more false and confusing information, but also more bots, more spam, more scams.

Colin: All of which can be difficult to spot, particularly since the rise of AI.

Natalie: Yes. And for a young person, it could be especially tricky to figure out if something – or someone – is fake or has malicious intentions. And unfortunately with spam bots, we have seen people making those bots to spread inappropriate content, such as sexually graphic images and videos.

Colin: Now, it does sound like this is something that could really change the landscape of Twitter in the future.

Natalie: Yes, and very much something that we will be monitoring to see if the level of risk increases or if there’s any significant impact.

Colin: Okay, thanks Natalie. Let’s move on now to Snapchat. Snapchat have announced the launch of a new AI feature for their paid users. Snap+ subscribers will be able to send Snaps to the in-app chatbot and receive a generated Snap back featuring a related image. So, for example, you might send the bot a photo of your pet dog and it might send back an equally cute dog photo! Or, if you send a photo of your vegetables, the AI bot might send back a recipe suggestion using the vegetables shown as ingredients. And although it hasn’t launched yet, there are concerns about what harmful images could potentially be generated, as there have been many stories questioning AI’s accuracy and safety. Again, this is something that our researchers will be actively investigating, and you can be sure we’ll keep you in the loop.

And over to WhatsApp who have announced that they will be adding a new feature to the social messaging platform – the ability to edit messages for up to 15 minutes after sending them. The feature has already been made available for some users but will be rolling out to all soon. So if you’ve got a young WhatsApp user in your house, they may already have access to this feature.

Natalie: I’m thinking this will be useful for correcting spelling mistakes or if you’ve sent a message to the wrong person…we’ve all been there! But Colin, are there some ways this could be misused?

Colin: Natalie, absolutely. And those ways are what I am worried about. If users can send a message and then edit it, that opens the door for some quite dark and dangerous uses of the feature. For example, a bully or predator sending a message and then editing it before the victim gets a chance to show it to someone or screenshot it.

Natalie: And we already know that with WhatsApp being end-to-end encrypted – meaning third parties can’t view messages sent between users – it makes it more difficult for relevant authorities to view potentially harmful material.

Colin: So authorities certainly won’t be able to see what was in a message originally before it was edited.

Natalie: Colin, do you have any advice for parents and carers listening who might be quite concerned about this new feature?

Colin: Yes; I think what I would advise first and foremost is having an open conversation with your child – you can mention that you have heard about the feature on the news, which is an easy, natural way to bring up the topic. Then you can have that conversation about the risks…ask them what they think the risks might be, and what they would do if someone does send them an unkind or inappropriate message or asks them to send photos.

Natalie: And asking them what they would do is the perfect moment to reinforce what they should do – and who they can talk to, for example, their parents/carers or one of their trusted adults, like a teacher or relative.

Colin: Absolutely, the perfect moment to reinforce those messages. I’d also say make sure to mention that people can still screenshot a message before it’s edited. So, whether you’re sending an image, a mean message to someone you don’t like or personal information like an address or password, even if the person you’re sending it to says, ‘oh you can edit that out again so don’t worry’, there’s never a guarantee they’re telling the truth. It’s important that your child is already aware that once the message is sent, having an edit feature means nothing if someone has taken a screenshot.

Natalie: Great advice there, Colin, thank you.

Okay, I want to talk now about some new stats that have been released this week from Childline about exam stress.

Colin: It’s that time of year!

Natalie: It is! And here in the UK, the sun has been shining which always reminds me of exam time too. Revising in the sunshine in the garden…

Colin: …or staring out of the window in the exam hall wishing it was over!

Natalie: That too! But no matter what the weather was like, I think we can also remember what a stressful time exam season is. Which is reflected in these new statistics from Childline – in the last year, they have delivered over 2,000 counselling sessions to pupils struggling with exam stress.

Colin: Would that be a standard amount for them?

Natalie: No, that’s actually a 10% rise from the year before.

Colin: That’s a significant amount.

Natalie: It is. Childline have said that some students were worried about the impact of learning disruptions from the pandemic, in particular how the pandemic affected their ability to cope with pressure and their performance. They also encountered children struggling with their mental health and family expectations.

Colin: That’s a lot of worries for a young person to be shouldering. Plus, for children who are about to move to the next stage in their education, such as primary school pupils moving on to secondary, that could be playing on their minds also.

Natalie: Yes, that’s a really big change with lots of new faces in a new place. Now Colin, I know you’re a part of the INEQE furniture here, but I remember when I first started, I was nervous on my first day.

Colin: I’m not sure if I should take that as a compliment or not! But yes, even as an adult, new places and people are nerve-wracking. So that’s on top of exam stress and results worries… What can parents, carers and the school community do to help?

Natalie: Being there to talk to and listening is number one. Knowing how to ask open questions and how to receive a young person talking about their worries without getting worked up yourself or dismissing their concerns. Also knowing when to talk – it’s important to choose your moment! Talking during car journeys is always good. Basically, avoid saying ‘let’s sit down and talk!’ and directing them to the kitchen table…that’s a bit scary! And then also knowing the signs to look out for that they are stressed, such as changes in eating or sleeping habits, acting out of character or being very negative and down about the future.

Colin: Lucky we’ve put all of this together in our most recent article then, isn’t it?

Natalie: Isn’t it just! There’s lots more advice on there about this topic, so it’s worth going to read. Listeners, you can find it on our website in our online safety section at ineqe.com, and also in our Safer Schools and Safeguarding apps. There’s also a shareable that you can read, download and share, and I’d encourage you to do that; you never know who on your social media might be going through this themselves or with their child, and it could be really helpful.

Colin: Very good point.

Okay, I just want to give our listeners a very quick update on a topic we’ve talked about a few times now, and that’s vaping. We’ve talked about this before because vaping has been on the rise amongst young people, and we’ve been particularly concerned about vaping being advertised on social media by influencers. But this week, the Prime Minister has announced that there is going to be a review into banning retailers from selling vapes, including ‘nicotine-free’ vapes, to anyone under 18. And there’s going to be a review of the laws on issuing fines to shops who do sell them to children. There will also be dedicated police school liaison officers to help keep illegal vapes out of schools.

Natalie: Positive news there.

Colin: And speaking of…

Natalie: Our safeguarding success story of the week! This week, our story is about using tech for good, which is something we love doing here at INEQE! It’s the news that a mobile phone app has been created to help children overcome a lazy eye condition. The app has been created by eye specialists, mathematicians and app designers and uses complex programming to check if a user is using their eye patch correctly. With around one in 50 children having this visual impairment, it sounds like this app is great news indeed!

Colin: Fantastic stuff. Well, that’s all from us this week. Remember you can find us on social media by searching for Ineqe Safeguarding Group or Safer Schools.

Natalie: And you can also sign up to our safeguarding hub by visiting ineqe.com. Until next time…

Both: Stay safe!

