TikTok algorithm encouraging anti-social behaviour in the real world
- A BBC Three investigation has found that TikTok is encouraging anti-social behaviour in the real world.
- Ex-employees say the issue is not being addressed for fear of slowing the growth of the social media app’s business.
- These ‘frenzies’, where TikTok drives disproportionate amounts of engagement to certain topics, are evidenced by accounts from former staff members and app users, and by BBC analysis of wider social media data.
- Users are being driven to this content through the platform’s design and algorithm.
- Four episodes in recent months in which disproportionate engagement on TikTok was connected to harmful behaviour included interference in the police investigation into Nicola Bulley and school protests involving vandalism that spread across the UK.
- For more, please visit the BBC News website.
AI developing too fast for regulators to keep up, says Oliver Dowden
- Dowden will use a speech at the UN general assembly to raise awareness of the lack of regulation of AI, which he says is developing faster than policymakers expected.
- He will urge other countries to come together to create an international regulatory system.
- Experts say AI can be used to generate fake images, videos, sounds and text that are indistinguishable from reality, making them a powerful disinformation tool.
- Some also worry that the use of AI such as facial recognition software could lead to discriminatory outcomes if the data it has been trained on shows bias.
- For more, please visit The Guardian website.
More online TV channels could face regulation
- According to Ofcom, more than seven in 10 households have an internet-connected smart TV, which can give access to up to 900 unregulated channels.
- The government proposals, which are subject to consultation, would extend Ofcom’s powers to the most popular and easily accessible unregulated channels.
- Rules cover areas such as protecting children from harmful content and impartiality for news.
- While some unregulated channels voluntarily follow rules on inappropriate content, viewers cannot complain to Ofcom if they are concerned about a programme, and the regulator has no powers to issue fines or sanctions if a channel broadcasts harmful content.
- They also do not have to follow Ofcom rules on ensuring subtitles, audio description and signing are available for people with disabilities.
- Announcing the plans at a Royal Television Society conference in Cambridge, Culture Secretary Lucy Frazer said: “Any change to regulations must strike a balance between protecting people – particularly the young and vulnerable – while protecting freedom of speech, and not unduly burdening the TV industry.”
- For more, please visit the BBC News website.
The following story may be regionalised:
Child sex abuse ‘should be treated like a pandemic’
- A new UK institute based in Edinburgh, Childlight, is calling for governments around the world to treat child sex abuse as a global health emergency, like Covid-19.
- The institute plans to gather data from around the world to build an unprecedented picture of the extent and nature of child sexual exploitation and abuse.
- Its Chief Executive Paul Stanfield hopes Childlight’s data will pressurise governments into taking action.
- He stated: “Tech companies need to prevent their platforms being used to facilitate child sexual abuse. They must have the technology to do this.”
- Childlight will take part in a three-day conference being held in Edinburgh by the US-based International Society for the Prevention of Child Abuse and Neglect.
- In Scotland alone there was a 511% increase in reports of online offending targeting children between 2015 and 2021.
- For more, please visit the BBC News website.