19 September 2024

Analysis of tranche one of Privacy Act reform

Tom Kench
Manager

elevenM’s Tom Kench provides a detailed analysis of tranche one of the recently introduced Privacy Act reform, outlining what’s included, what’s impacted, and giving some key takeaways for organisations in Australia.

Introduction

The Privacy and Other Legislation Amendment Bill 2024 was introduced to parliament on 12 September 2024.

We now have what is being termed tranche one of the long-awaited reform to the Privacy Act 1988 (Privacy Act). This bill responds to concerns surrounding vulnerable groups (through the criminalisation of doxxing and the creation of a children's privacy code), individual privacy rights (by introducing a statutory tort for serious invasions of privacy), and artificial intelligence adoption (by imposing transparency requirements on organisations using automated decision making).

Whilst this Bill has been titled tranche one, it follows the October 2022 introduction of the Privacy Legislation Amendment (Enforcement and Other Measures) Bill 2022, which increased civil penalty amounts for serious and repeated interferences with privacy, provided the Office of the Australian Information Commissioner (OAIC) with new information gathering and sharing powers, and broadened the extra-territorial application of the Privacy Act.

What’s missing

Highly anticipated ‘agreed in principle’ proposals from the Review Report have been put on the backburner. The fair and reasonable standard, removal of the small business exemption, improved data subject rights, and improvements to the definitions of personal information and consent will have to wait until tranche two, which we don’t expect to be introduced to parliament before the next federal election (due by May 2025).

Timeline

The development of this bill and the tortious conduct it addresses has itself been tortuous. This era of Australian privacy law reform has been a lengthy one, beginning with the ACCC’s Digital Platforms Inquiry report in July 2019. An issues paper was released in October 2020 and several rounds of consultation and reporting have since followed. Throughout these discussions, the need for an Act and regulator with powers fit for the digital age was a common thread.

Political lens

The Bill is set to implement 23 of the 25 legislative proposals that were agreed in the Government Response to the Privacy Act review, a co-operative approach that defers the more politically contentious proposals. This 'safe' approach is consistent with the strategy the government signalled in its response to the Attorney-General's Department's Review Report.

One of the agreed proposals not implemented in the Bill, Proposal 9.1, would have restricted the journalism exemption for media organisations, and exemplifies industry pushback in the reform process.

Tips for organisations

There are a range of actions that organisations should take to implement tranche one and get ready for tranche two, including assessing their risk rating frameworks, deepening their understanding of their data processing activities, and reviewing the focus of their online services. You can read more about this advice in Where to start with privacy law reform.

Statutory Tort for Serious Invasions of Privacy

Schedule 2 of the Bill introduces a cause of action for individuals in tort for serious invasions of privacy. It will allow individuals to sue where they had a reasonable expectation of privacy, subject to some limitations.

The ALRC defines a tort as:

a legal wrong which one person or entity commits against another person or entity and for which the usual remedy is an award of damages. Many torts protect fundamental liberties, such as personal liberty, and fundamental rights, such as property rights, and provide protection from interferences by other people or entities and by the Crown. In short, torts protect people from wrongful conduct by others and give claimants a right to sue for compensation or possibly an injunction to restrain the conduct.

Presently, although the Information Commissioner is empowered to require compensation to be paid in response to a complaint about a breach of the Privacy Act, there is no general right to obtain compensatory damages for serious invasions of privacy in Australia, and individuals are limited in their ability to directly enforce privacy complaints. Australia has a variety of laws at the Commonwealth, state, and territory levels — including common law, criminal law, and privacy legislation — that address privacy invasions.

However, these laws are not nationally uniform and have significant gaps, particularly regarding the conduct of individuals. They differ between jurisdictions in terms of their application, the forums through which they are enforced, and the remedies they offer.

The 2023 Review Report reiterated the findings of several previous reviews: existing frameworks leave clear gaps in privacy protection, and in the ability of individuals to take steps to protect themselves and seek compensation for invasions of privacy.

History

A tort for privacy interference has long been mooted in both statute and the common law. The High Court's 2001 decision in Australian Broadcasting Corporation v Lenah Game Meats Pty Ltd opened the possibility for a common law tort of interference with privacy to be developed, but the decision has been interpreted restrictively by subsequent courts and no common law tort for invasion of privacy has been fully recognised in Australia.

In its 2008 report For Your Information: Australian Privacy Law and Practice (ALRC Report 108), the ALRC recommended the implementation of a statutory cause of action for serious invasion of privacy, following the commission's 28-month inquiry into the extent to which the Privacy Act effectively protected privacy in Australia. The ALRC's 2014 report, Serious Invasions of Privacy in the Digital Era (ALRC Report 123), examined what this model would look like in practice, and it is this model that is largely reflected in the Bill.

In practice

Elements

The tort would not be specifically activated by a breach of the APPs or other provisions of the Privacy Act (noting that a separate direct right of action for breach of the APPs may be introduced as part of tranche two).

Instead, a cause of action arises here for individuals if they suffer an invasion of their privacy, either by an intrusion into their seclusion or by misuse of information, when:

  1. a person in their position would have had a reasonable expectation of privacy in all the circumstances;
  2. the invasion of privacy was intentional or reckless; and
  3. the invasion of privacy was serious.

Where one or more competing public interests are identified by a defendant (for example, the public interest in freedom of expression), the claimant must also satisfy the court that the public interest in protecting their privacy outweighs those competing public interests.

Defences/Exemptions/Remedies

Oft-cited human rights principles of proportionality and necessity are again relevant here, and public safety, the public interest, and intersections with other human rights will be factored into assessments of privacy incursions.

Further exemptions apply for legitimate activities, including those of law enforcement, intelligence agencies and journalists, and for persons who were under 18 years old when the invasion of privacy occurred.

The exemption for journalists is notably broad and incongruent with the ALRC’s model, providing no cause of action to the extent that the invasion of privacy involves the collection, preparation for publication or publication of journalistic material by:

  1. a journalist;
  2. an employer of a journalist; or
  3. a person assisting a journalist, who is engaged by the journalist’s employer or in the person’s professional capacity.

The rationale in the explanatory memorandum for this exemption is to protect the vital public interest in press freedom, and the role of journalists in fostering informed public debate. However, the exemption in the Bill applies regardless of whether the reporting is in the public interest and whether the journalist has breached professional standards.

Individuals will be able to access compensation under the tort for emotional distress, reputational damage, and harms associated with a significant incursion into their privacy.

Takeaways

The statutory tort for serious invasions of privacy is intended to operate similarly to other torts, in that it would be developed through jurisprudence. It is distinct from the regulatory regime established in the Privacy Act, which requires compliance with the APPs and is overseen by a regulator. As such, it is intended that courts would draw on key concepts from other torts, including privacy torts in other jurisdictions.

From a plaintiff's perspective, a significant drawback of the cause of action is that the fault elements of intent and recklessness apply, but negligence does not. Serious invasions of privacy, and serious harms, caused by negligent acts of organisations or government will not give rise to the action (although where a duty of care can be established, a claim in negligence may still be available).

We are likely to see an uptick in class actions, as there is now a clear action for privacy breaches with associated remedies. In the Optus and Medibank proceedings, class actions had to be framed through consumer law, shareholder claims and regulatory involvement; the tort provides much-needed clarity.

Additionally, privacy complainants will now have another avenue of complaint, directly to the courts. Privacy complaint handlers should connect with their legal departments to ensure that complaint processes identify and manage litigation risk.

Despite the increased financial and litigation risks for organisations created by this tort, the high fault element thresholds limit its application.

Doxxing

The government signalled intent earlier this year to criminalise doxxing off the back of large-scale data breaches and a highly publicised incident in February in which hundreds of Jewish Australians in a private WhatsApp group had their details published online by pro-Palestinian advocates. To this end, the Bill amends Part 10.6 of the Criminal Code Act 1995 (Cth) to introduce new offences aimed at doxxing.

Definition and harms

Doxxing, an abbreviation of the phrase "dropping documents", is the intentional, malicious exposure of an individual's personal data online.

Doxxing can refer to a number of different practices, including:

  1. De-anonymising doxxing – revealing the identity of someone who was previously anonymous (for example, someone who uses a pseudonym).
  2. Targeting doxxing – revealing specific information about someone that allows them to be contacted or located, or their online security to be breached (for example, their phone number or home address, or their account username and password).
  3. De-legitimising doxxing – revealing sensitive or intimate information about someone that can damage their credibility or reputation (for example, their private medical, legal, or financial records, or personal messages and photos usually kept out of public view).

Doxxing is a harmful practice, leaving targets vulnerable to and fearful of:

  • public embarrassment, humiliation or shaming
  • discrimination, if personal characteristics are disclosed
  • cyberstalking and physical stalking 
  • identity theft and financial fraud
  • damage to their personal and professional reputation, leading to social and financial disadvantage such as loss of employment
  • increased anxiety
  • reduced confidence and self-esteem.

In practice

The proposed amendments to Part 10.6 of the Criminal Code:

  1. introduce a new offence for using a carriage service to make available, publish or distribute personal data, where the person engages in the conduct in a way that reasonable persons would regard as being menacing or harassing, and
  2. introduce a further offence where a person or group is targeted because of their race, religion, sex, sexual orientation, gender identity, intersex status, disability, nationality or national or ethnic origin.

The former offence carries a maximum prison term of 6 years and the latter a period of 7 years. These penalties mirror those in the government’s bill targeting sharing of sexually explicit deepfakes.

Schedule 3 of the Bill introduces the concept of 'personal data' of an individual. 'Personal data' will mean information about an individual that allows them to be identified, contacted or located, and will include an individual's name, phone number, photograph, email address, online accounts, residential or work address, and place of education or worship.

'Personal data' is defined more broadly than 'personal information' under the Privacy Act, as it goes beyond an individual's identity to include their contact details and location, even where the individual is not themselves identifiable. This may be an insight into the government's thinking on a future amended definition of personal information.

There are no defences or exceptions listed in relation to the new doxxing offences, and the wording is best described as broad. The drafting is also unclear as to how the offences will operate alongside the implied freedom of political communication, a matter that will be left to the courts.

Deepfakes

Earlier this year, legislation was brought to parliament creating new criminal offences for the sharing of sexually explicit material without consent. This applies to material that is either real or AI-generated, a response to the rising use of deepfakes. Under this legislation, sharing a non-consensual sexually explicit deepfake carries penalties of up to six years' imprisonment. If the person also created the deepfake, this is an aggravated offence and an additional year of imprisonment is available.

Takeaways

Doxxing and deepfakes are two areas where the government has introduced legislation 'fit for the digital age'. Attorney-General Mark Dreyfus said doxxing has led to "public embarrassment, humiliation, shaming, discrimination, stalking and identity theft and financial fraud," and is "often used against women in the context of domestic and family violence." A 2023 report from the Australian Institute of Criminology found that around 4% of people had experienced doxxing in their lifetime; this reform addresses a harmful and dangerous practice.

These reforms will have a limited impact on organisations and are outside the scope of the Privacy Act.

Children’s Online Privacy Code

Part 4 of the Bill imposes a requirement for the Australian Information Commissioner to develop and register a Children’s Online Privacy Code (COP Code) within two years of commencement of the relevant provisions. The COP Code would be an enforceable APP code setting out how one or more of the Australian Privacy Principles are to be applied or complied with in relation to the privacy of children.

Children’s online privacy is being further supported by the government’s plans to introduce a bill imposing a minimum age to use social media platforms by the end of 2024.

The Code

The aim of the COP Code is to clarify and enhance how the relevant APPs apply to children's online privacy. By codifying protections, the right to privacy of a child (established in Article 16 of the UN Convention on the Rights of the Child) is strengthened through specific, enforceable obligations in the handling of children's personal information.

Unlike other APP codes, which are usually developed externally, this code is required to be developed by the Information Commissioner.

This Code will apply to social media platforms, relevant electronic services and designated internet services that are likely to be accessed by children (excluding health services). It is intended that the Code will specify how these types of entities must comply with their privacy obligations in relation to children, and alignment with similar codes in comparable jurisdictions, such as the UK, is encouraged.

UK ICO

The UK Information Commissioner’s Office established the Age Appropriate Design Code (otherwise known as the Children’s Code) in 2020 to apply to any internet connected product or service that is likely to be accessed by a person under the age of 18.

The ICO's Code requires online services to abide by a range of privacy considerations, including to:

  • be designed in the best interests of the child and their health, safety and privacy
  • provide the strongest privacy settings by default
  • practise data minimisation in collection
  • limit third-party disclosures without justification
  • communicate clear privacy policies and controls

Takeaways

The OAIC must develop the Code within 24 months from Royal Assent and make a draft publicly available for a minimum 40-day consultation period prior to registering the Code. The OAIC will receive additional funding of $3 million over 3 years to assist with the development of the Code.

Organisations that may be regulated by the COP Code should keep abreast of its development. Whether a service is one that is 'likely to be accessed by children' will remain open to interpretation until the Code is finalised. However, the Bill's Explanatory Memorandum notes that service providers are expected to proactively assess the likelihood that their service is accessed by children, regardless of whether it is explicitly targeted at children, and should consider:

  • whether the service has a particular appeal to children;
  • market research on the user base of the service; and
  • the way in which the service is accessed, and whether there are measures in place to prevent children from accessing the service.

Automated decision making

One of the measures introduced in the Bill to increase transparency and certainty surrounding the handling of personal information is a requirement for entities to include information in their privacy policies about automated decisions that significantly affect the rights or interests of individuals.

Automated decision making (ADM) systems can be used to assist or replace the judgement of human decision makers. ADM systems pose privacy risks as they can use personal information about individuals in ways which may have significant impact, with little transparency.

Importantly, ‘automated decision making’ is broader than ‘AI’, taking in any computer-driven scoring, rating or processing of personal information that substantially and directly relates to a significant decision, whether based on machine learning or a pre-defined formula.
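
To make that breadth concrete, here is a hedged sketch of a decision process that involves no machine learning at all, yet would plausibly fall within the new transparency obligation. The function name, fields and thresholds are invented for illustration:

```python
# Illustrative sketch only: a pre-defined scoring formula (no machine learning)
# that uses personal information to make, or substantially contribute to, a
# significant decision. All names and thresholds are invented.

def assess_credit_application(income: float, existing_debt: float, missed_payments: int) -> str:
    """Score an application and either decide it or route it to a human."""
    score = (income / 10_000) - (existing_debt / 20_000) - (missed_payments * 2)
    if score >= 5:
        return "approve"            # a decision made solely by the system
    if score >= 2:
        return "refer_to_officer"   # the system does a thing substantially
                                    # and directly related to the decision
    return "decline"

print(assess_credit_application(income=85_000, existing_debt=30_000, missed_payments=1))
# approve
```

An entity running a process like this would need to describe in its privacy policy the kinds of personal information the system uses and the kinds of decisions involved, as the next section sets out.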

Elements

Under the proposed APP 1.8, entities would be required to include information in their privacy policies about:

  1. The types of personal information used in ADM systems;
  2. The types of decisions made solely by these systems; and
  3. The types of decisions for which the systems do something substantially and directly related to the decision.

‘Automated decisions’ to be covered by this new transparency obligation are broadly defined:

  1. the entity has arranged for a system to make a decision, or to do a thing that is substantially and directly related to making a decision;
  2. the decision could reasonably be expected to significantly affect the rights or interests of an individual; and
  3. personal information about the individual is used in the operation of the system described above.

Proposed APP 1.9 provides clarity on the above, in that:

  1. making a decision or "doing a thing" includes refusing or failing to make a decision or "do a thing";
  2. the effect of the decision on the individual's rights or interests may be adverse or beneficial; and
  3. there is a non-exhaustive list of the kinds of decisions that may affect an individual's rights or interests, for example, decisions that:
     • are made under a law to grant, or refuse to grant, a benefit to the individual;
     • affect the individual's rights under a contract, agreement or arrangement; and
     • affect the individual's access to a significant service or support.

Takeaways

This new transparency requirement is one of the few substantive changes to the APPs contained in the bill. Organisations will need to identify and review any decision-making processes that may affect the rights or interests of an individual to assess whether they include any kind of scoring, rating or other automation (whether AI or based on a pre-defined formula) that may be captured.

The Review Report included further proposals relating to automated decisions, which might come into force at a later stage, including:

  • a requirement for APP entities to conduct a PIA prior to engaging in a 'high risk privacy activity', likely to be defined as a function or activity with a significant impact on individuals' privacy; and
  • a right for individuals to request meaningful information about how substantially automated decisions with legal or similarly significant effect are made.

This reform is a welcome response to heightened concerns over automated decision making with the growth of AI processing capabilities and its potential to cause harm (see Robodebt), by promoting AI governance principles of transparency and explainability.  

There are likely to be further developments in the ADM space ahead. Earlier this month, Minister for Industry and Science Ed Husic released a discussion paper proposing 10 "mandatory guardrails" for high-risk AI, including human oversight and the ability to challenge the use of AI or the outcomes of automated decision making.

Additional Reforms

OAIC powers

New enforcement and review powers for the OAIC are introduced by the Bill.

Infringement Notices

The Bill introduces a new tiered penalty regime designed to capture a broader range of contraventions of the Privacy Act. This is a significant shift away from the existing practice of only penalising practices that constitute ‘serious’ or ‘repeated’ interference(s) with the privacy of individuals.

If approved, the following tiered approach will apply to civil penalties and infringement notices under the Act:

  • The OAIC may seek civil penalties of up to $50 million (or potentially more, based on turnover or the benefit derived from the breach) for serious interferences with privacy. These amounts were previously available for serious or repeated breaches; the Bill now limits them to serious breaches, with repetition becoming one of the listed factors for determining seriousness, alongside the kinds of information involved, the number of individuals affected, any vulnerability of those individuals, the actual or potential consequences, and the controls in place.
  • The OAIC may seek civil penalties of up to 2,000 penalty units ($660,000) for individuals, and 10,000 penalty units ($3.3 million) for bodies corporate, for interferences with privacy not deemed 'serious'.
  • The OAIC may issue infringement notices of up to 200 penalty units ($66,000) for individuals, and 1,000 penalty units ($330,000) for bodies corporate, for breaches of specific APP obligations and non-compliant eligible data breach statements. As referenced above, the OAIC can issue infringement notices for these penalties directly, without applying to a court.
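
The dollar amounts above follow from a penalty unit value of $330, which is what the quoted figures imply (penalty unit values are indexed and change over time). A quick check of the conversion:

```python
PENALTY_UNIT = 330  # AUD per Commonwealth penalty unit, as implied by the figures above

tiers = {
    "civil penalty, non-serious (individual)": 2_000,
    "civil penalty, non-serious (body corporate)": 10_000,
    "infringement notice (individual)": 200,
    "infringement notice (body corporate)": 1_000,
}

for tier, units in tiers.items():
    print(f"{tier}: {units:,} units = ${units * PENALTY_UNIT:,}")
# civil penalty, non-serious (individual): 2,000 units = $660,000
# civil penalty, non-serious (body corporate): 10,000 units = $3,300,000
# infringement notice (individual): 200 units = $66,000
# infringement notice (body corporate): 1,000 units = $330,000
```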

This represents a significant shift in the scope of conduct which is captured by civil penalty provisions. Whereas previously only serious or repeated interferences were subject to penalties, under the proposed amendments entities may be subject to penalties for any interference with privacy.

The OAIC’s power to issue infringement notices is limited to non-compliance with specific APP obligations that are administrative in nature and where a contravention can be easily established, including:

  • Failure to have a privacy policy that meets APP 1.4 and new automated decision making requirements.
  • Failure to provide individuals with the right to be anonymous or use a pseudonym (where practicable to do so) per APP 2.1.
  • Failure to comply with direct marketing opt-out obligations under APP 7.
  • Per APP 13, failure to respond to data subjects’ personal information correction requests within a reasonable timeframe, or charging individuals for making these requests.

By enabling the Information Commissioner to issue infringement notices for relatively minor contraventions of the Privacy Act, the Bill allows entities that fall short of their privacy obligations to be penalised without the need for protracted litigation.

Other Powers

The Bill further empowers the OAIC by providing investigation and monitoring powers similar to those of other regulatory bodies. It also entitles the OAIC to conduct public inquiries into matters relating to privacy on the direction or approval of the Minister.

Overseas disclosures

APP 8 of the Privacy Act states that prior to disclosing personal information to an overseas recipient, APP entities must take reasonable steps to ensure that the overseas recipient does not breach the APPs in relation to the information.

The Bill introduces a mechanism akin to the GDPR's adequacy decisions scheme. If passed, it will allow the Governor-General to prescribe countries and binding schemes that provide substantially similar privacy protections to the APPs. The intent behind this mechanism is to facilitate the free flow of information across borders in a modern, borderless digital economy while ensuring the privacy of individuals is maintained.

This amendment simplifies cross-border disclosures for entities by 'whitelisting' jurisdictions, removing the complexity of determining whether foreign laws or schemes are substantially similar to the APPs. The process may be improved in future by adopting the GDPR approach of standard contractual clauses and by defining 'disclosure' in the Privacy Act.
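
If the mechanism is enacted, compliance tooling might encode the prescribed list along these lines. This is a hypothetical sketch: the country codes are placeholders, not jurisdictions actually prescribed under the Bill, and a real implementation would source the list from the regulations rather than hard-coding it:

```python
# Hypothetical sketch only. Placeholder values; the actual prescribed list
# would be set by regulation and should be sourced from it.
PRESCRIBED_JURISDICTIONS: set[str] = {"NZ", "GB"}

def needs_full_app8_assessment(destination: str) -> bool:
    """True if APP 8.1 'reasonable steps' analysis is still required for this disclosure."""
    return destination not in PRESCRIBED_JURISDICTIONS

print(needs_full_app8_assessment("US"))  # True: not prescribed in this placeholder list
print(needs_full_app8_assessment("NZ"))  # False under this placeholder list
```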

APP 11 clarification

APP 11.1 requires APP entities to take reasonable steps to protect the personal information they hold from misuse, interference and loss, as well as unauthorised access, modification or disclosure. APP 11.2 obliges entities to destroy or de-identify information when it is no longer needed for a purpose for which it may be used or disclosed consistently with the APPs.

APP 11.3 is introduced in the Bill, clarifying that reasonable steps to protect information includes technical and organisational measures. This addition reflects current (non-binding) OAIC guidance. Technical measures such as encrypting data and securing access to systems and premises, and organisational measures such as staff training are promoted to address information security risks.
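
As one concrete illustration of a 'technical measure' in the APP 11.3 sense, the sketch below encrypts a record at rest using Python's cryptography library. The library choice and key handling are illustrative assumptions, not anything the Bill or OAIC guidance prescribes:

```python
from cryptography.fernet import Fernet

# Key management is the hard part in practice: keys belong in a KMS or HSM,
# never stored alongside the ciphertext. Generated inline here for brevity.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b"Jane Citizen, 42 Example St, Sydney"  # invented personal information
token = fernet.encrypt(record)       # ciphertext, safe to store at rest
assert fernet.decrypt(token) == record
```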

Notifiable data breaches — information sharing

Part 7 of the Bill includes an information sharing scheme that applies in the event of a notifiable data breach, known as an ‘eligible data breach declaration’.

This declaration allows for limited sharing and handling of personal information to prevent or reduce the risk of harm arising from misuse of personal information, in ways that would otherwise breach privacy laws, duties of confidence or other statutory secrecy provisions. In these circumstances, the ways in which entities may collect, use and disclose information will instead be governed by a written declaration from the Minister. Safeguards apply, including that information may only be shared for the purpose of preventing or reducing the risk of harm to individuals.

The principal use case being contemplated here is to permit sharing of personal information about individuals affected by a data breach with financial institutions to enable proactive monitoring and increased fraud protections.

What’s next?

Once the Bill is passed and receives Royal Assent, most of its provisions will take effect immediately. The statutory tort will commence on a date to be proclaimed, or six months after Royal Assent at the latest, and a two-year grace period applies to the new requirements relating to automated decision making.

As for the remainder of the expected reforms, the Attorney-General's Department has signalled that the rest of the 'agreed in principle' reform items are still en route, and has stated an intent to draft tranche two legislation in the coming months. However, it is unlikely to reach Parliament before the 2025 federal election.

Throughout this reform process, there has been significant lobbying from companies and industry bodies across the marketing, big tech and social media, and journalism industries (to name just a few). It remains to be seen whether a balance can be struck between competing stakeholders, political priorities, and the interests of individuals.

Contact us

If you're interested in learning more about how to uplift your privacy program, get in touch.