Every day, we see the release of new apps and new features that promise to help keep you more connected to others. One of the most popular online spaces for children and young people to connect is Discord. Keep reading to find out everything you need to know about this platform and why it presents a risk to those in your care.
How Does It Work?
While the platform maintains it is “not social media” because it lacks algorithms, news feeds, and other familiar functions, it is built around user interaction. It has a simple layout and is split into online communities called “servers”. Any user can create a server for free; premium memberships with perks and enhanced features are available, but they are not necessary to fully experience Discord. Servers are based on individual topics or interests (such as Among Us, reading, or sports teams) and can be public (anyone can request to join) or private (joining requires an invitation from an admin or moderator).
Once granted access to a server, users can take part in an open chat with other users from all over the world. Text, video, and voice chat options are available, with limits on how many people can join at once. Many young people use the ‘screenshare’ option to communicate with each other while watching films, playing multiplayer video games, or watching sports matches. Private chat options are also available.
[Image: Blurred screenshot of the Discord desktop view. ©Discord]
Age Restrictions & Safety Settings
Discord’s age verification measures are relatively ineffective. Its terms state that users should be at least 13 to use the platform, but registration only asks for a date of birth and no verified ID is required. User accounts also cannot be made fully private, which means any user can see another user’s profile and contact them, even if they are under 18.
The platform says it applies an automatic privacy setting called ‘Keep Me Safe’ to users under 18. This scans all direct messages to block explicit content and restricts access to NSFW (not safe for work) servers, which often contain pornographic content and can belong to extremist or predatory groups.
Day-to-day moderation on Discord is carried out by individual server moderators (ordinary users who are not employed by the company). It is also worth noting that some content on public servers is visible to non-members.
Our online safety experts signed up as a 13-year-old and were able to switch off the ‘Keep Me Safe’ filter in settings. When they tried to join an NSFW server, they received a pop-up warning that it was not suitable for users under the age of 18, but they were able to click OK and proceed.
What Are the Risks?
Discord’s simple design and special-interest categories are especially appealing to children and young people. However, those same features make it easy for someone with harmful intentions to build rapport with a young person over shared interests. That rapport can create an illusion of friendship and trust, and can lead to more serious consequences.