
In light of the 18th edition of Safer Internet Day, which “calls upon all stakeholders to join together to make the internet a safer and better place for all, and especially for children and young people”, I have decided to dedicate a blog post to the Italian TikTok case.
The case of TikTok
The Italian DPA (Garante per la protezione dei dati personali) recently made European headlines with its so-called “crusade against Big Tech”, in which TikTok has taken center stage since January 2020. Just over a year ago, in fact, the then President of the Authority sent a letter to the European Data Protection Board calling the attention of all European Union (EU) DPAs to the need for “strong, coordinated action”, stressing “the importance and sensitiveness of these platforms that are intended mainly for very young users.” Other DPAs, including the United Kingdom’s Information Commissioner’s Office (ICO) and the United States Federal Trade Commission, also had investigations underway at the time.
Italian DPA takes action
The situation became more complex in December when the Italian DPA took action against TikTok, initiating proceedings due to the alleged “Poor attention to the protection of children, easy-to-circumvent signup restrictions for kids, poor transparency and clarity in user information, privacy-unfriendly default settings.”
In a press release, the Italian DPA explains that since March 2020 it has observed several violations of European data protection law. One of the shortcomings pointed out was that TikTok’s sign-up process did not have adequate mechanisms in place to verify the age of the child user. It is important to understand that the minimum age to sign up for the platform is 13, while in Italy the minimum age at which a child can provide their consent without the authorization of their parent or legal guardian is 14 (EU Member States are empowered to establish the age of consent to information society services).
As Guido Scorza, a member of the board of the Italian DPA pointed out in an interview with Politico, it may not be an optimal solution for TikTok to request the identity cards of its users-to-be given that it could run the risk of “creating a database of millions of people, including children, whose addresses are on their ID. We don’t want a cure that could be worse than the ailment.” To this end, Paola Pisano, Italian Minister of Technological Innovation and Digitalization, proposed using the Sistema Pubblico di Identità Digitale (SPID, the Italian Public Digital Identity System) to ensure that only eligible users are able to access social media and create profiles.
Then, just a few days ago, the increasingly active Authority imposed an immediate limitation on TikTok’s processing of the data of users whose age could not be verified with certainty. This swift and historic action was prompted, somberly, by the death of a 10-year-old Sicilian girl, who died while apparently taking part in a “challenge” circulating on the platform, an event that put the Garante’s actions, with respect not only to TikTok but also to Facebook and Instagram, into Italian headlines. The actions of the Italian DPA are particularly interesting in light of the One-Stop-Shop mechanism established by the General Data Protection Regulation. Most recently, in fact, the Danish DPA announced that it was handing its case over to the Irish DPA, given that TikTok’s establishment in Dublin makes the Irish DPA the Lead Supervisory Authority for the social media giant.
TikTok will comply with requests
On 3 February, the Italian DPA announced that TikTok would comply with the Authority’s requests, blocking access for minors under 13. TikTok will remove the relevant Italian users and ask each to provide their date of birth in order to continue using the app; accounts of users identified as being under 13 will be deleted. Following this initial determination of age, the social media company has expressed its commitment to exploring the use of artificial intelligence for age verification, in consultation with the Irish DPA.
Following the Garante’s intervention, as of 25 January TikTok introduced a button directly in the app that allows users to quickly and easily flag other users who appear to be under 13. TikTok also committed to doubling the number of Italian-language content moderators on the platform. Furthermore, starting from 4 February, TikTok launched an information and education campaign providing users with useful information about privacy and security settings. Importantly, it has also promised to improve the summary of its privacy policy for users under 18 to explain in an “accessible and engaging way” the kinds of data it collects and how it processes them.
On 9 February, ANSA reported that TikTok had commenced its age verification procedure and launched its awareness campaign, which explains how the social network functions and provides information on both privacy settings and the platform’s content reporting system.
Awareness and transparency
The Italian DPA also launched a national televised awareness campaign together with Telefono Azzurro, a child abuse hotline, to inform parents about the active role they should take concerning the privacy of their children, urging them to pay particular attention when their children are asked to enter their age upon accessing the TikTok app. Launched on 8 February, the ad, entitled “Minors and social networks. The campaign of the Garante and Telefono azzurro”, urges parents to wait until their children are at least 13 years old before allowing them to use social networks. You can view the ad here.
I hope this case will be taken as an opportunity to concretely address the question of online age verification, which has largely been ignored so far. In this respect, I invite you to see the ICO’s code of practice to protect children’s privacy online (the Age Appropriate Design Code came into force on 2 September 2020, and organizations should conform to it by 2 September 2021). Technologies like AI can surely help in this respect, but adequate Data Protection Impact Assessments, together with Human Rights Impact Assessments, need to be performed. The emphasis on transparency – in terms of using language adequate for children – is also noteworthy in the TikTok case. Together with transparency, awareness campaigns are a crucial factor in helping users, both children and parents, protect themselves on the internet. These are also pillars of the Maastricht Data Protection as a Corporate Social Responsibility Framework, which digital service providers targeting children should apply in order to take their responsibilities beyond pure legal compliance.