Global platforms, partial protections: Design discriminations on social media platforms
A new Fairplay report reveals discrimination in how popular social media apps operate at the international level.
Protect Children is proud to join Fairplay in bringing to light the global design discrimination that children face online. Fairplay is the leading nonprofit organization committed to helping children thrive in an increasingly commercialized, screen-obsessed culture, and the only organization dedicated to ending marketing to children.
Tegan Insoll, Researcher & Specialist at Protect Children, joined a global coalition of organisations to research the experiences of young people around the world. The research found that young people experience the digital environment very differently, even when using the same platforms. Children in some parts of the world are offered more safety and more privacy than others; this is a form of design discrimination.
Three global platforms show varying levels of design discrimination: WhatsApp, Instagram and TikTok.
TikTok’s “age-appropriate experience”: only for European youth
TikTok’s privacy policies vary around the world, offering users different safety features, privacy features, and minimum ages.
Young people aged 13-17 years based in the EEA, the UK & Switzerland are offered more protection than any other young people around the world.
“To provide users younger than 18 with an age-appropriate experience, certain features are not available.”
This ‘age-appropriate experience’ does not appear in TikTok’s policies anywhere else in the world.
WhatsApp’s differential treatment of data
In July 2021, Data Privacy Brazil compared differences in the Terms and Conditions offered to Brazilian, Indian and European users, including users aged 13-17 years. They found that European children enjoyed stronger data protections against unnecessary data sharing, and more clarity about data deletion and what this means.
Instagram’s privacy settings
When the UK’s Age Appropriate Design Code came into force, Instagram announced a number of changes for young people to offer a ‘safer, more private experience’. They announced:
"Wherever we can, we want to stop young people from hearing from adults they don’t know or don’t want to hear from. We believe private accounts are the best way to prevent this from happening. So starting this week, everyone who is under 16 years old (or under 18 in certain countries) will be defaulted into a private account when they join Instagram."
This means that 16- and 17-year-olds from ‘certain countries’ are better protected from adult strangers. Based on the research conducted, we believe these countries are all or largely European.
Even within Europe, young people are offered different levels of protection: in Finland, 17-year-old Instagram users are not defaulted to the most private settings, unlike their peers in the UK, Slovenia and Germany, who are defaulted to private accounts.
Regulations that require the prioritization of children’s best interests are essential in ensuring children and young people’s digital worlds are as safe, private and rights-advancing as possible.
Such regulations are in place in the UK, Ireland, the Netherlands, France and Sweden, which may explain why European children are afforded greater protections. Similar proposals are under consideration in California, Australia and the EU as a whole. Policymakers and civil society actors around the world should give these regulations serious consideration to ensure that all young people are afforded the protection they deserve.
Individual tech companies should comply with these requirements globally, not only in the countries where they are legally obliged to do so.
Meta (2021) ‘Giving young people a safer, more private experience on Instagram’, https://about.fb.com/news/2021/07/instagram-safe-and-private-for-young-people/