13 March 2025

The children’s privacy code: social media, EdTech and everything in-between

Brett Watson
Manager

elevenM’s Brett Watson unpacks the whys and wherefores of the forthcoming children’s online privacy code, and looks at one of the biggest blind spots for children’s privacy.

The passage of the first ‘tranche’ of reforms to the Privacy Act 1988 (Privacy Act) at the end of 2024 fired the starting gun on a two-year process for the Office of the Australian Information Commissioner (OAIC) to consult on, design and release a children’s online privacy code.

Amongst a set of privacy reforms that were generally considered uncontroversial, creating a new set of rules to protect children’s privacy when they go online was probably the least contentious of the lot. Who could argue with doing something for the children, after all? 

There are nonetheless a range of questions around:

  • why we need a privacy code for children
  • what we can expect the code to contain
  • how the code relates (or doesn’t relate) to the children’s social media ban
  • why schools are children’s privacy blind spot
  • how an Australian code could improve on international examples.

So, what even is a privacy code?

The Australian Privacy Principles (APPs) in the Privacy Act are, as their name suggests, principles-based law. This means that, generally, the APPs do not mandate fixed privacy compliance requirements but can flexibly apply across a range of settings depending on what is reasonable in the circumstances.

In some sectors, or in relation to some processes, certain activities may need specific controls to reduce privacy risk. This is a role that a privacy code can play. Provisions in a privacy code apply in addition to the APPs and provide clarification about what compliance with a privacy principle looks like in specific scenarios.

As an example, APP 1.2 is a general principle requiring entities to have appropriate practices, procedures and systems in place to govern privacy risk. The Australian Government Agencies Privacy Code specifies that, for government agencies, these governance measures must include conducting privacy impact assessments (PIAs) on high risk projects and publishing a register of completed PIAs.

Privacy codes are not commonplace — only three others currently exist, and a series of procedural requirements must be satisfied during the development of a new code.

Where did the idea for a children’s online privacy code come from?

A children’s online privacy code was one of the few reform proposals that was fully agreed to by the government in its response to the Privacy Act review (see page 15), remembering that most proposals were given the more tenuous response of ‘agreed in-principle’. It is generally accepted that:

  • children, due to their young age and developing comprehension skills, are particularly vulnerable to online harms
  • children increasingly rely on online platforms, social media, mobile applications and other internet connected devices in their everyday lives
  • the world for children now is different to when the APPs were created.

Putting all this together, the obvious conclusion is that the legislative framework in the Privacy Act needs to apply with greater detail and specificity to address the potential privacy harms to children in the online environment.

What can we expect a children’s online privacy code to look like?

As well as agreeing to the development of a children’s online privacy code, the government recommended that, to the extent possible, the scope of the code should align with international approaches, including the UK Age Appropriate Design Code (UK Code). Taking a look at the UK Code gives us a reliable indication of what an Australian code is likely to contain.

The UK Code contains 15 standards, which are explained in detailed yet accessible guidance. We think the standards can be generally understood as falling into one of the following four categories:

  1. The one standard to rule them all (Standard 1)

The guiding principle underpinning all the standards in the UK Code is that the best interests of the child are paramount. This means ensuring that a child’s needs for safety, health, wellbeing, family relationships, physical, psychological and emotional development, identity, freedom of expression, privacy and agency to form their own views and have them heard are put before any other interest.

There is nothing in the UK Code designed to prevent children’s online service providers from making a commercial return; however, the Code does make the important point that the commercial interests of an organisation are unlikely to outweigh the best interests of a child. Put another way, making a profit from an online service cannot come at the expense of the privacy or wellbeing of the children that use the service.

  2. The privacy best practice standards (Standards 2, 3, 4, 8, 14, 15)

Several standards in the UK Code require a practice or procedure that would be readily recognisable to a privacy practitioner, as they replicate operational guidance that typically accompanies the APPs.

Conducting PIAs, ensuring your privacy settings are transparently communicated and only collecting the data that you need are (or at least should be) baseline privacy practices.

A unique feature of a privacy code for children is an additional transparency requirement to ensure privacy settings and controls are applied in an age-appropriate way. The UK Code provides some helpful detail about the different stages of cognitive development that can generally be expected based on a child’s age range, and how this could be translated into privacy messages that can be understood by that age group. This could involve using devices like symbols, colours and graphics.

Privacy practitioners know that it can be very difficult to bring the reading age of privacy policies and collection notices down to high school level or below. This is because concepts like ‘consent’, ‘processing’ and ‘disclosure’ are reasonably abstract, even when they are communicated using plain English.

As a result, it isn’t surprising that guidance in the UK Code around age-appropriate communications and settings includes references to parental controls and recommends diverting a child user to a parent in almost all the age ranges included in the guidance, except for children who are in late high school.

  3. The ‘Off by default’ standards (Standards 7, 9, 10, 12)

As the title of this group suggests, the expectation is that an online service aimed at children will behave in a more privacy preserving way than how we have come to expect online services aimed at adults.

This means that an online service used by children cannot, by default:

  • collect excess personal information
  • share personal information with other entities
  • use location specific tracking
  • use personal information about the child user to ‘profile’ them — predict aspects of their life or put them into a category.

  4. The ‘Don’t be evil’ standards (Standards 5, 6, 11, 13)

Finally, there is the group of standards that a reasonable person would think are so obvious that they do not need to be spelled out, yet the very existence of these standards unfortunately suggests otherwise. The standards in this group address design features in an app or software that are confusing and possibly harmful for children. The UK Code prohibits children’s online services from:

  • using children’s personal information in ways that have been shown to be detrimental to their wellbeing (e.g., content that is mentally exploitative, age-inappropriate or traumatising)
  • operating in ways that are contrary to industry codes of practice, or advice from the Government
  • incentivising children to stay online through in-game rewards (or make them ‘lose’ if they leave)
  • having parental monitoring that isn’t communicated to the child using the app/platform/service
  • manipulating or ‘nudging’ children into undermining the default privacy settings (e.g., by using dark patterns or deliberately confusing design).

Standards like these are likely to be unique to a privacy code that applies to children. The common thread running through them is a recognition of the power imbalance between adults and children when it comes to the capacity to understand whether the functionality of an online service is in your own best interest, and of the responsibility that adults have to ensure that this power imbalance is not taken advantage of.

The UK Information Commissioner’s Office recently published an overview of its monitoring and enforcement activities in 2024/25, which includes targeted interventions addressing the practices of online platforms used by children that were not following the Code’s requirements, particularly in relation to default privacy and geolocation settings.

What will the children’s online privacy code apply to?

The UK Code applies to online services that are likely to be accessed by children, and the Australian code is specifically designed to align with the UK Code.  

Whether a service is ‘likely’ (more probable than not) to be accessed by a child will depend on:

  • the nature and content of the service
  • whether that nature and content has particular appeal for children
  • the way in which the service is accessed
  • any measures that are put in place to prevent children gaining access.

For some online services the position will be clear, and for others it may not be — particularly in relation to teenagers. Where the position is unclear, it is not enough for an entity to avoid compliance with the UK Code by simply assuming that its service won’t be accessed by children. This position needs to be supported by evidence.

This is sounding a lot like social media…aren’t children banned from social media now?

Not yet, but the Australian government passed legislation at the end of 2024 that will introduce a mandatory minimum age of 16 for accounts on certain social media platforms by December 2025. Like the application of the UK Code, the law places the onus on the relevant service providers to implement systems and processes to ensure that children under the age of 16 cannot access their service.

There are some fundamental questions that need to be answered before the ban comes into effect. One is how service providers will be expected to demonstrate that they are blocking access to their service for children under the age of 16 in a way that also complies with existing privacy laws. It’s worth noting (and considerably under-appreciated) that the impact of the ban isn’t limited to children — it means age verification for everyone.

The social media ban will be limited to specific social media platforms like Instagram, Snapchat and TikTok. Several services commonly accessed by children, like YouTube, and online games like Fortnite and Roblox, are not covered by the ban. Unsurprisingly, the online platforms slated for the ban are voicing their displeasure at the exemptions available to others.

The technological capability best able to verify the age of users is currently the subject of a trial, which includes options like facial recognition powered age estimation.

Ultimately though, the online environment that children exist within is bigger and more varied than just what is available through a handful of social media platforms, and this is specifically called out in the scope of the children’s online privacy code in the Explanatory Memorandum for the code’s enabling legislation.

So, we do still need a children’s online privacy code?

Yes, absolutely. Privacy Commissioner Carly Kind argues that while banning children from social media may be well-intentioned, it could have the unintended and highly undesirable outcome of pushing children towards lower-quality online environments where the risk of exposure to harmful content is greater. Improving the online environment overall via a children’s online privacy code is the more effective mechanism for achieving the policy objective of protecting children from online harms. At the very least, the code is an essential complement to the social media ban given the breadth of online services that are not covered, not to mention the inevitable ingenuity of teenagers who will treat the ban as a challenge to be beaten.

Finally, while they don’t generate the same headlines as social media platforms, there are many other online services frequently accessed by children that are overdue for tighter regulation.  

The children’s privacy blind-spot

Software, apps and online programs have been used in the school environment to good effect for decades. Out of necessity, the social distancing requirements of the COVID years turbocharged the use of online learning platforms in schools (collectively referred to as ‘EdTech’).

Marking the class roll? There’s an app for that. Learning about science? There’s an app for that. Schools can now even use a program to collect daily red/amber/green indicators from children about their mental health.

The Office of the Victorian Information Commissioner (OVIC) investigated the settings and practices of EdTech software and apps in educational environments in 2019-2020. In the investigation, OVIC reviewed the privacy settings of a range of apps being used in classrooms and interviewed stakeholders at the Victorian Department of Education to understand how the apps were being used and how children and parents were being informed about them.

The report found that government policy required schools to conduct PIAs and to seek parents’ consent for the use of the apps, and that parents were entitled to assume that the privacy implications of EdTech platforms had been reviewed by their child’s school before they were rolled out. The evidence, however, indicated that schools are not well equipped to do this: most schools (three out of four) were not aware of the requirement to seek consent from parents.

The report also describes the privacy settings of some of the apps used in classrooms in Victoria. It notes that, because schools were not fully apprised of privacy risks relating to the apps, they may not be satisfying requirements in the Information Privacy Principles around the protection of personal information or restrictions on disclosing personal information overseas.

There are a couple of issues from this investigation that emphasise the need for a children’s online privacy code. First, the finding that individual schools are resource-constrained environments that lack the expertise to closely consider the privacy implications of EdTech software comes as a surprise to exactly nobody.

Second, the idea that parents can provide consent for their children to use EdTech software in a way that satisfies all four required elements of consent (current, specific, voluntarily provided by someone who is informed and has capacity) is rather far-fetched, particularly in relation to the ‘informed’ element. It is unrealistic to assume that hundreds of parents have the capacity to inform themselves about and understand the privacy policies of a dozen or more separate apps, particularly given the diversity of experiences and cultures that make up a school community.

This excellent explainer from the ABC goes into detail about the incredible length and complexity of terms of service and privacy policies for commonly used software in the classroom — sometimes amounting to more than 200,000 words!

Privacy law does its job when it mediates power imbalances that affect an individual’s ability to exercise their right to privacy. There is a power imbalance between a software developer and a child who wants to use an online service targeted at them. There is a power imbalance between a school and a parent when the school is seeking parental consent for children to use EdTech apps in the classroom. And there is a power imbalance between EdTech providers that present a solution to ever increasing resourcing constraints and the teachers who face them.

Parents are not required to give consent for their children to use art or sports equipment because it is assumed that this equipment is fit for purpose and will not harm the children that use it. Given how manifestly inadequate seeking consent is for ensuring the privacy of children who use EdTech software, a privacy code that mandates minimum information handling requirements for EdTech software is an increasingly important mediator for these power dynamics.

Ok — we need a children’s online privacy code. Should we just copy the UK version?

Given how comprehensively the UK Code outlines requirements for handling children’s personal information, and that mirroring the UK Code was recommended by the Government in its response to the Privacy Act review, it is reasonable to expect that many of the UK Code’s features will be replicated in Australia.

The retention and deletion of children’s personal information is dealt with only in passing by the UK Code, in the data minimisation standard. The forthcoming Australian code could certainly bolster this, for example by mandating the deletion of children’s data when retaining it is no longer in a child’s interests, or even by offering specific guidance on retention periods.

A key limitation of the UK Code is the scope of the services it applies to — online services that are likely to be accessed by children. Framing a children’s privacy code in this way achieves the objective of ensuring that when children use online services, those services operate in a safe and privacy preserving way. It overlooks, however, the privacy harms that can arise for children from online services that they do not use but that handle their personal information anyway.

Children and their childhoods are being ‘datafied’ more than ever before. Any person with a social media account would be familiar with the myriad ways in which children’s activities and milestones are shared on social media. Anyone who has had a child in a childcare centre would also be familiar with the photos and statistics that are shared in daily updates and parent portals. One can imagine that the ‘right to be forgotten’ that has long been a feature of European privacy law, and is also recommended in the Privacy Act review, will become an increasingly important individual right when today’s children grow up.  

Limiting the scope of a children’s online privacy code to the services that children interact with does not address the psychosocial implications of children growing up in an environment where the forensic documentation of their activities by a smartphone has become normalised, and how that might shape their attitudes towards privacy and identity.

In the short time since the introduction of the UK Code, the proliferation of AI powered tools that biometrically identify individuals and manipulate images into deepfake content has clearly demonstrated that seriously invasive privacy harms can materialise for people of any age, without the individual having any direct interaction with an online platform. Developing a code that covers not only online platforms accessed by children, but also platforms that access children, would help to mitigate these potential harms.

What happens next?

An Australian children’s online privacy code is required to be in place by 10 December 2026. Between now and then, the OAIC will be engaging in wide-ranging consultation on the design and content of the code. Organisations that handle children’s personal information, and any interested individuals, should monitor the OAIC’s website for consultation updates and developments.

Contact us

If you’re interested in learning more about uplifting privacy processes in your organisation, please contact us.