
Natalie: Hello and welcome to Safeguarding Soundbites – this is the podcast that brings you up to speed on all the week’s online safeguarding news.

Colin: This week, we’re going to be talking about Horizon Worlds, why Twitter might be in trouble for not paying their bills again and more.

Natalie: My name’s Natalie.

Colin: And I’m Colin.

Natalie: Let’s jump in with our social media news round-up. Colin?

Colin: Yes, this week Meta announced they’re adding a text-based chat to their virtual reality space Horizon Worlds. Called ‘World Chat’, it adds another way for users to communicate with each other alongside the existing voice chat.

Natalie: So it’s text-based chat…that means basically like texting or messaging other users?

Colin: Yes. At the minute, users can only talk to each other through voice chat. This will now be like messaging on any other game or platform.

And for anyone who doesn’t know, Horizon Worlds is Meta’s virtual social space. It’s part of the metaverse, and users interact through avatars, exploring different worlds, playing games and meeting other users in 3D spaces.

Natalie: Okay. And what about safety? We know Meta and Horizon Worlds have been criticised in the past for safeguarding risks. There were reports of sexual harassment, for example.

Colin: Yes, there were previous reports of users being harassed through inappropriate interactions with their avatars. Meta implemented several different features to try and combat this, including a feature that garbles voices so users wouldn’t be exposed to potential harassment via the voice chat.

Meta have promised there will be strict security controls for the new chat feature, including proactive scanning and removal of messages that violate their Code of Conduct, and users will be able to control which communication methods they want to use. It’s also important to mention that Horizon Worlds has an age limit of 18+ here in the U.K., although our researchers previously found high numbers of younger users on the platform. And one report by the Institution of Engineering and Technology found users as young as six years old on the platform.

Natalie: So those security control features should, in theory, remove any inappropriate content someone is trying to send via the World Chat before it’s even sent?

Colin: I think that’s the idea. But with anything like this, it’s impossible for us to say whether or not it’s effective before we see it in action, unfortunately. What I would re-emphasise is that young people need to be 18 in order to use Horizon Worlds; that age limit is there for a reason. So, if you think the child or young person in your care is using Horizon Worlds, this is a good time to bring it up and have a conversation about whether you think they should be using the platform or not.

Natalie: Good idea, it’s always a good opportunity when something has been in the news – it makes it easy to drop into conversation casually: ‘Oh, I was reading a story earlier about this new platform/app feature…what do you think about the metaverse?’ or even ‘Do you or any of your friends use it? Do you think there are any risks to accessing the metaverse?’ and so on.

Colin: Exactly!

Natalie: Okay, moving over to TikTok now, where a new report has shown the platform is the fastest-growing social network… for news! A whopping 20% of 18–24-year-olds are using it as a news source. This is according to the Reuters Institute’s Digital News Report survey, which also found younger generations have little interest in the more conventional news found on other sites like Facebook.

Colin: Although you mentioned 18-24-year-olds, could this apply to younger users, too?

Natalie: I believe this study didn’t look at online users younger than 18, but it’s very possible the same patterns apply – young people at the stage where they’re interested in news are likely sourcing it through the social media platforms they’re already using, like TikTok. Obviously, I can’t say for sure but personally, I’d imagine so!

Colin: And either way, Natalie, it is interesting. Lastly for our social media news, I want to talk about Twitter and some major safeguarding concerns that have been brought up in the news this week.

Natalie: Is this about Google Cloud?

Colin: It is! So Twitter has a contract with Google to host some of their servers on Google Cloud – basically meaning that Twitter has been paying Google to host some of the services that keep Twitter going. But Twitter wants to move their systems over to their own servers before the current contract runs out at the end of June…and it’s been reported that the social media giant has been refusing to pay.

Natalie: Some of that was quite technical!

Colin: It was, but the important thing for our listeners to know is that it’s putting one of Twitter’s really important safeguarding systems at risk. It’s called Smyte, and this system helps Twitter moderate content on the platform. It helps spot safeguarding risks like spam and combat online abuse and child sexual abuse material.

Natalie: Okay, so this is a really important tool for Twitter.

Colin: Yes, it really is.

Natalie: And it was only on last week’s Safeguarding Soundbites that we were discussing the new report from the Stanford Internet Observatory about the prevalence of child sexual abuse material being shared and traded on social media platforms. So tools like Smyte – these systems that work in the background of platforms like Twitter – even if we don’t quite understand the technical ins-and-outs of what they do, we can understand what an essential role they play.
So what’s going to happen next…do we know?

Colin: Right now, reports are saying that Twitter is trying to move all of these systems, including Smyte, over to their own servers before the 30th of June, but they’re behind schedule. For now, this is a wait-and-see situation.

Natalie: Okay, and we’ll come back with any updates to this story in a future episode of Safeguarding Soundbites.
Now, it’s time for our safeguarding success story of the week! This week, it’s the launch of a new project designed to tackle online hate by encouraging conversation between parents/carers and children. Called ‘The Online Together Project’, the initiative has been launched by Samsung and Internet Matters and is an interactive question-based tool that offers advice and aims to open up conversations about online hate, such as racism, homophobia and sexism. With new research showing hate speech is one of the top five things young people experience online, and 62% of parents reporting concerns about their child being exposed to hate speech, this initiative comes at a great time.

Colin: Fantastic and it’s great to see tech companies like Samsung getting involved in providing solutions, too. And, of course, we always champion anything that helps get families talking about online safety and the experiences that children and young people are having online. Oh, and listeners, if your school or organisation is starting any amazing new initiatives or is involved in any great online safeguarding projects, drop us a line and let us know! You can tag us on social media, send us a message or email us. You’ll find all the ways to contact us on our website ineqe.com.

Well, that’s everything from us for this week – we’ll be back next week but in the meantime, you can keep up to date by joining our safeguarding hub on our website.

Natalie: And follow us on social media by searching for Ineqe Safeguarding Group or Safer Schools. Until next time…

Both: Stay safe!
