TikTok and Risks to Minors
This report documents an evaluation of TikTok's systems and the risks they pose to minors, including:
- TikTok’s Content Moderation System;
- Understandability of the platform for younger users;
- TikTok’s safety-by-design settings;
- TikTok’s ad manager systems.
We find multiple issues that potentially do not comply with the DSA, including:
- TikTok under-moderates pro-restrictive eating disorder, pro-suicide, and pro-self-harm content;
- TikTok's response is muted when it becomes aware of such material through its user-reporting system: it failed to respond to the majority of pro-restrictive eating disorder, pro-suicide, and pro-self-harm content reported to it;
- A 13-year-old would likely not understand the design and functioning of TikTok at the point of sign-up;
- Safety-by-design settings are not set to the highest possible level. 16-year-olds are not offered best-practice privacy protections on TikTok, and there appears to be between-country variation, with 16-year-olds treated differently in different countries;
- Safety centres and help tools are not routinely accessible to young people in their first languages;
- Age-parameter selections that allow targeting of underage users have not been completely removed from TikTok's ad manager system.