elevenM’s Brett Watson outlines the practical steps that organisations need to take to implement tranche one of privacy law reform and be ready for tranche two.
On 12 September the Attorney General introduced the Privacy and Other Legislation Amendment Bill 2024 to Parliament, describing it as the first of two tranches of privacy law reform. We released our first impressions last week, and you can also read our in-depth analysis of the Bill on our website.
We have a Bill! Now what?
In the time since the government released its response to the Privacy Act Review Report, ‘when will we get a Privacy Act reform Bill?’ has been the question on every privacy professional’s lips, closely followed by ‘and what will be in it?’
The first tranche of reforms didn't include many of the proposals with the greatest potential for day-to-day operational impact, such as changes to the definition of personal information, direct marketing opt-outs and the ‘fair and reasonable’ test. That said, there are still inclusions that organisations need to understand and implement.
The reforms are not guaranteed to pass through Parliament without amendment and may still be the subject of a committee review. There appears to be enough time for the reform Bill to clear both houses by the next election, though this assumes that the election will not be called earlier than the deadline of May 2025. If the Bill is enacted, some of the reforms, like the privacy tort and the Office of the Australian Information Commissioner’s (OAIC) new powers, will commence immediately, while there will be a two-year grace period before the children’s privacy code and the privacy policy requirements for Automated Decision Making transparency come into effect.
All the work you were doing before… Keep doing it, but better!
In all the noise around potential reforms it can be easy to lose track of existing privacy obligations. While some of the areas where the Privacy Act was anticipated to ask organisations to ‘do more’ are still in the works, there was absolutely nothing in the first reform tranche to suggest organisations should be doing less. On the contrary, the government has signalled that organisations should expect swifter and more tangible consequences for weaknesses in their privacy program.
The suite of increased powers given to the OAIC to investigate privacy concerns, compel the production of evidence and issue infringement notices (for up to $66k per infringement) all point to privacy becoming a more strongly regulated compliance domain.
The introduction of the long-awaited direct cause of action for individuals who have suffered a serious invasion of privacy (aka the ‘privacy tort’) will open up a new legal risk exposure for organisations over and above their obligations under the Australian Privacy Principles, particularly in relation to data breaches. This is because the privacy tort and complaints processes for interferences with the APPs are designed to operate in parallel.
To use the ‘carrot and stick’ analogy, there were no carrots for organisations in the first tranche of privacy reforms, and the regulatory stick is getting incrementally longer.
Call to action
Privacy, legal and risk teams in organisations need to discuss the impacts of the regulatory changes on their organisation’s risk exposure. They may need to consider whether the introduction of multiple levels of fines and the privacy tort changes their risk ratings, and whether an update to the risk framework is necessary.
Take a pause on Automated Decision Making (ADM)
An aspect of the reforms that will impact all regulated entities is a new requirement to include information about automated decisions in privacy policies. Where automated decision-making is used for decisions that significantly affect the rights or interests of an individual, organisations will be required to disclose the types of decisions made by computer systems in an automated way, and the types of personal information used in those systems.
Updating the wording in a privacy policy is, itself, not necessarily a complicated task. But don’t be fooled — this is the easy part. Ensuring that the wording in a privacy policy is actually true requires a detailed understanding of the information handling practices across an organisation, and the complexity of this task will often increase relative to an organisation’s size. If your organisation has been an enthusiastic adopter of new AI tools for processing personal information over the last six months, this is a privacy reform that you will need to pay close attention to. And it’s not just AI. The provisions are drafted to capture any automated processing of personal information (such as producing a score or rating according to a pre-defined formula) that substantially contributes to a significant decision.
There is some reasonably extensive (but not exhaustive) guidance in the Explanatory Memorandum to the reform Bill (para 343) that sets out examples of the sorts of decisions that are intended to be captured within the scope of this reform, including decisions that affect:
- the granting of a benefit (such as welfare)
- rights under a contract (such as an insurance policy)
- access to services or support (such as healthcare)
- the targeting of individuals with content and advertisements (such as where the targeting results in differential pricing, access to significant goods or services, or employment opportunities).
Call to action
Understanding the ways in which personal information is used in any automated systems will be a necessary precursor for your organisation to comply with the new privacy policy transparency requirement. We have written extensively on the usefulness of tools like data processing inventories for getting a handle on an organisation’s data holdings, and on how valuable they are for managing change.
Though it covers much more than just ‘AI’, this new requirement, as well as others that seem likely to come in the future, dovetails into the increasing need for AI governance within organisations. If your organisation has embraced AI without implementing a robust governance structure, now is the time to bring privacy, risk, governance and legal together to develop one.
Think of the children
The introduction of a privacy code that recognises the vulnerable position of children in the digital economy and specifies APP compliance requirements to protect them from privacy harms is probably the least controversial of all the proposed reforms.
Online service providers that develop content targeted at children are clearly the primary targets of this reform, and many organisations are likely to be largely unaffected by the introduction of a children’s privacy code. It’s important to note, however, that while the code is yet to be designed, as currently framed the test for whether an organisation must comply is not whether it designs its services for children, but whether children are likely to access them. Service providers will be expected to assess this proactively.
This is a subtle distinction, and it is one drawn in the UK’s Age Appropriate Design Code, which the Australian children’s privacy code is generally expected to replicate. The idea is that the focus of harm prevention should not only be on services that are explicitly designed for children, but also on the services that we know children will encounter when they go online.
Another notable feature of the children’s privacy code is that it looks set to define a child as anyone under 18, rather than under 13, which has been something of a de facto threshold in the tech and social media space.
Call to action
If your organisation, or a part of your organisation, provides online content or online services and you aren’t absolutely confident about the composition of your audience, now is a good time to consider the extent to which it might be made up of children. Market research can provide concrete data, and if you’re worried, you may consider either changing the way certain content or services are offered or taking a more active approach to age verification.
We would recommend reading the UK Age Appropriate Design Code, as it offers a preview of what an Australian children’s privacy code is likely to look like.
Keep calm and carry on
As slow as the rate of change may be, the trajectory of privacy reform in Australia is only headed in one direction: towards a model much more closely aligned with privacy legislation in Europe and many other Asia-Pacific jurisdictions. The government has made clear that the first tranche of changes to the Privacy Act is an initial step towards more far-reaching reform. While we don’t know the timeline for the remaining changes, an event such as a large-scale data breach with economy-wide impact could easily send reform hurtling back up the agenda.
Rather than taking their foot off the accelerator, prudent organisations will use this gift of time from the government to understand and gain control of their personal information holdings. There are still plenty of good reasons to do this, from maximising the potential of AI tools to building public trust. If you’re still waiting for an explicit direction in the Privacy Act to tell you what to do, chances are you’re behind the curve.
Contact us
If you’re interested in learning more about how to uplift your privacy program, get in touch. You can also watch a LinkedIn Live conversation featuring elevenM’s Brett Watson and Tessa Loftus here.