The safety and privacy of children online continue to be a focus for regulators in the EU and UK. In this article, we look at the latest from the data protection regulators, what we may expect from other regulators in this landscape, and how businesses can better protect and prepare based on the findings and conclusions of recent enforcement action.
The latest from data protection regulators
TikTok has been issued yet another fine in relation to children’s privacy. Following the £12.7 million fine issued to TikTok by the UK ICO in May 2023, the Irish Data Protection Commission (DPC) is the latest EU regulator to act, imposing a fine of €345 million, along with corrective measures, over TikTok’s alleged EU GDPR violations concerning children’s data protection.
The investigation, launched in September 2021, examined processing between 31 July 2020 and 31 December 2020. The DPC identified a number of issues, including those related to:
- a failure of data protection by design and default, including that child user profile settings were set to public by default, which posed several risks to children under the age of 13 who accessed the platform;
- inadequate verification of users of the platform’s “family pairing” functionality, which was introduced to allow parents/guardians to pair their account with their child’s account in order to manage app settings such as limiting screen time and restricting direct messages and content that may not be appropriate. The DPC found that TikTok did not adequately verify whether the pairing user was actually the child user’s parent or guardian, undermining the purpose of the functionality and providing an easy way to loosen a child’s profile settings; and
- the implementation of “dark patterns”, nudging users towards more privacy-intrusive options during the registration process.
TikTok has three months to bring its practices into compliance. It has disputed the level of the DPC fine on several grounds, including that it had already addressed a number of the concerns before the investigation began, for example by setting all accounts held by 13 to 15 year olds to private by default in 2021.
The fine is very much in line with the €405 million fine issued by the DPC to Instagram this time last year, which raised similar concerns about public-by-default settings on children’s accounts, illustrating how child data protection continues to account for some of the biggest fines handed down by UK/EU data protection regulators in recent years.
What about other regulators in this space?
The main regulator to watch is OFCOM, which will be in charge of enforcement under the Online Safety Bill (OSB), now signed off by both Houses of Parliament. At a high level, the OSB will require online service providers to take a number of steps, including removing illegal content quickly (or preventing it from appearing in the first place), preventing children from accessing harmful and age-inappropriate content, implementing and enforcing age-checking measures, and conducting risk assessments. The OSB introduces categorisations of “in-scope” services, referred to as Categories 1, 2A and 2B, with different obligations imposed on providers depending on their categorisation.
The enforcement appetite under the OSB in respect of children’s online safety will likely mirror the data protection regulators’ appetite in respect of children’s privacy, particularly online. OFCOM’s enforcement powers are similar in scope to those under the GDPR. OFCOM may issue an “information notice” requesting further information, or a “provisional notice of contravention” specifying failures and/or the steps OFCOM considers need to be taken to remedy them in order to comply with the law, or stating that OFCOM proposes to impose a penalty. OFCOM is also able to issue an “enforcement notice” requiring a service provider to do, or refrain from doing, something required under the OSB, and can impose fines of up to £18 million or 10% of global revenue (notably higher than the UK GDPR’s £17.5 million or 4% of annual global turnover). Arguably, the OSB goes one step further than the GDPR by providing a mechanism to impose criminal sanctions for failing to comply with a requirement of an information notice, including fines and imprisonment for up to two years.
Once the OSB becomes law, it will likely be known as the Online Safety Act, and OFCOM’s powers under it will commence. However, given the number of implementation steps that still need to take place, it is unlikely that any fines will be issued immediately, and we understand that OFCOM will take a “phased approach” to enforcement. OFCOM still needs to publish various codes of practice and guidance, which it plans to do shortly after its powers commence, and to designate regulated services by category and publish the threshold criteria for each category. Various consultations will also need to be launched and concluded before the codes and guidance become approved and binding; we expect these consultation processes to launch in the weeks following Royal Assent.
Whilst the changes brought about by the OSB won’t happen overnight, affected businesses should take note of the regulatory landscape affecting platforms that are aimed at, or may be used by, children, not only to ensure they do not fall foul of data protection laws, but also to prepare for another layer of regulation and enforcement that will soon take effect to protect children online.
Top tips to better protect and prepare
- Age verification and the enforcement of age limits appear to be a key focus for regulators, particularly in the data protection space. Consider whether your platform has appropriate age verification mechanisms, and whether you are effectively enforcing any age restrictions or age-checking measures that operate on your service.
- Privacy by design is another key focus, with recent enforcement action in the data protection space scrutinising public-by-default settings and failures to protect children’s personal data by default (for example, through heightened privacy settings for child users).
- DPIAs and risk assessments are some of the first documents that regulators (and, we expect, OFCOM once the OSB comes into force) will ask for. Ensuring these documents are complete and regularly updated can help identify ongoing and new risks affecting your business and platforms.
- Look out for OFCOM’s codes of practice and guidance that will soon be published, which will be applicable should your service fall within the scope of the OSB. We expect these codes of practice and guidance to provide further tips and best practice to help services comply with the new law.