This year for Privacy Awareness Week, we unpacked some current issues in privacy – from automated decision-making to facial recognition technology to pentesting-for-privacy and many other topics in between.
Transparency
Only 29% of people globally say it’s easy to understand how organisations protect their personal information, making transparency and clear communication about privacy processes increasingly important.
Incoming changes to the Privacy Act will introduce a transparency requirement around any kind of automated decision-making (this can be anything from AI tools to formulas in a spreadsheet). You will need to publicly disclose that you use it, and implicit in that is being able to explain it to consumers (and the Privacy Commissioner).
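To make that requirement concrete, here is a minimal sketch (the rule, threshold and field names are our own invention, not drawn from the Act) of an automated decision that records enough context to be explained later:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class DecisionRecord:
    """Audit record for an automated decision, kept so the outcome
    can be explained to a consumer (or a regulator) later."""
    applicant_id: str
    outcome: str
    rule_applied: str   # human-readable description of the logic
    inputs_used: dict   # the exact inputs the decision relied on
    decided_at: str

def assess_discount_eligibility(applicant_id: str, annual_spend: float) -> DecisionRecord:
    # A deliberately simple 'formula in a spreadsheet' style rule:
    # spend of $1,000 or more in the past year earns the loyalty discount.
    eligible = annual_spend >= 1000.0
    return DecisionRecord(
        applicant_id=applicant_id,
        outcome="eligible" if eligible else "not eligible",
        rule_applied="Loyalty discount applies when annual spend >= $1,000",
        inputs_used={"annual_spend": annual_spend},
        decided_at=datetime.now(timezone.utc).isoformat(),
    )
```

Even for a decision this simple, keeping the rule and inputs alongside the outcome is what makes a plain-English explanation possible.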
The OAIC has also made clear that it expects transparent, easy-to-understand communication about information handling practices, calling out improvements to privacy policies and collection notices in several privacy assessments in the last year (for example: Digital ID Assessment 3 and Handling of personal information: my health app).
Privacy policies are often treated as either a compliance checkbox exercise or a tool for managing legal risk, but we think that, done right, they can be a powerful customer communication tool.
Don’t neglect your privacy communications – transparency is the first APP for a reason, and it’s worth investing in. Read more about customer-centric privacy communications.

Collection
Data minimisation is increasingly gaining traction as an effective privacy control – as Australian Privacy Commissioner Carly Kind noted, “In a world of increasing cyber threats…know that you can’t lose data you don’t hold in the first place”.
In the past, businesses were more concerned about the risk of not holding information – such is the value of data. But with data breaches steadily increasing and cyber attacks becoming ever more sophisticated, choosing not to collect information, and taking active steps to reduce your personal information holdings, is now the better risk-mitigation approach.
We are increasingly talking to our clients about taking a data minimisation approach to collection processes and developing and implementing retention and disposal schedules and plans. Not to mention that you’ve probably read our blogs about the importance of deletion.
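Those retention and disposal schedules eventually have to be executed against real systems. As a minimal sketch (the table, columns and retention periods here are hypothetical), a scheduled disposal job might look something like this:

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# Hypothetical retention schedule: record category -> retention period.
RETENTION_SCHEDULE = {
    "marketing_leads": timedelta(days=365),      # 1 year after last contact
    "closed_accounts": timedelta(days=7 * 365),  # 7 years after closure
}

def dispose_expired_records(conn: sqlite3.Connection) -> None:
    """Delete personal information whose retention period has lapsed.

    Assumes a customer_records table with 'category' and 'last_activity'
    columns, where last_activity holds UTC ISO-8601 timestamps.
    """
    now = datetime.now(timezone.utc)
    for category, retention in RETENTION_SCHEDULE.items():
        cutoff = (now - retention).isoformat()
        deleted = conn.execute(
            "DELETE FROM customer_records WHERE category = ? AND last_activity < ?",
            (category, cutoff),
        ).rowcount
        print(f"{category}: disposed of {deleted} expired record(s)")
    conn.commit()
```

The schedule itself is the policy artefact; the job is just the plumbing that keeps your holdings aligned with it.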

Use and disclosure
While some consider AI the province of tech, those of us in privacy know that safe and responsible adoption of AI is essential for the protection of personal information. The first-of-its-kind International AI Safety Report discusses privacy as a fundamental systemic risk in AI usage. The risk arises from issues such as sensitive information being used in model development or fed into a tool as an input, but also from the fact that it is often simply not possible to identify the scale or scope of privacy violations: they typically occur at a remove, and people may not even be aware of them.
The National AI Centre’s Responsible AI Index from late last year showed that the mean Responsible AI score for Australian businesses was 44%, putting the bulk of Australian businesses in the ‘emerging’ and ‘developing’ categories, with only 8% of businesses falling into the ‘leading’ category. This indicates that there is quite a lot of work to be done in implementing safe and responsible AI across Australian businesses.
elevenM, as part of the SAAM consortium, is a member of the Responsible AI Network – “a world first cross-ecosystem collaboration aimed at uplifting the practice of responsible AI across Australia’s commercial sector.” The Responsible AI Network (under the aegis of the National AI Centre and Department of Industry, Science & Resources) is working to develop best practice in AI across seven pillars: law, standards, principles, governance, leadership, technology, and design.
Before you start using customer/client information or your organisation’s commercial information in an AI tool (or to develop one), invest in an AI Governance Framework.

Accuracy
Facial recognition technology (FRT) has been gradually improving, and the last few years have seen an explosion in its use.
While most people like being able to use face ID to unlock their devices, the gradual increase in one-to-many facial recognition is raising ethical questions, privacy among them. The Australian Public Attitudes to Facial Recognition Technology report from Monash University found that 89% of people think they should be notified if one-to-many facial recognition is being used in public, and that the vast majority of people do not support the use of FRT for workplace productivity surveillance or for marketing in retail environments.
One-to-many FRT raises a range of significant privacy risks, because it works by collecting (however momentarily) the biometric data of every person in its catchment. This raises issues of necessity and proportionality, consent and transparency, governance, and accuracy – specifically bias and discrimination, an issue the OAIC has indicated it is watching closely.
While accuracy is always important in privacy, it is absolutely essential to managing privacy risk in facial recognition technology. We have worked with a number of clients on this issue, helping them weigh the risks of false positives against false negatives, and consider where the potential harm will fall on individuals. You can read more in our Privacy Officer’s guide to facial recognition technology for security.
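To make that trade-off concrete, here is a minimal sketch (the similarity scores and labels are invented for illustration) showing how moving a match threshold shifts errors between wrongly flagging people and missing genuine matches:

```python
# Hypothetical (score, is_true_match) pairs from a face-matching system.
scores = [
    (0.95, True), (0.88, True), (0.81, True), (0.74, True), (0.62, True),
    (0.79, False), (0.66, False), (0.55, False), (0.43, False), (0.31, False),
]

def error_rates(threshold: float):
    """False positive rate and false negative rate at a given threshold."""
    fp = sum(1 for s, match in scores if s >= threshold and not match)
    fn = sum(1 for s, match in scores if s < threshold and match)
    negatives = sum(1 for _, match in scores if not match)
    positives = sum(1 for _, match in scores if match)
    return fp / negatives, fn / positives

for t in (0.5, 0.7, 0.9):
    fpr, fnr = error_rates(t)
    print(f"threshold={t:.1f}  false positives={fpr:.0%}  false negatives={fnr:.0%}")
```

Raising the threshold trades false positives for false negatives; deciding which error is more tolerable – a shopper wrongly flagged, or a genuine match missed – is a judgement about where harm falls, not just a tuning exercise.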

Security
Australia’s digital risk environment has evolved rapidly. With many high-profile cyber incidents occurring in the last few years, the Office of the Australian Information Commissioner (OAIC) has made clear that failure to adequately protect personal information is not just a PR disaster but can also be a costly breach of the Privacy Act.
The 2025 Verizon Data Breach Investigations Report shows that exploitation of vulnerabilities, which has been growing year-on-year, is now the second most frequent initial access vector: up a huge 34% in the last year, it now accounts for 20% of breaches.
So, where does privacy come into this? In an age where data is currency and breaches are headline material, digital risks are converging and privacy professionals can no longer afford to think of cyber security controls as separate to their specialist priorities — it’s time for cyber and privacy to start playing for the same team.
Vulnerability management and cyber controls like pentesting have long been the preserve of the cyber team, but we think they have an increasing role to play in privacy and preventing data breaches.
Join elevenM’s Nick Allen as he makes the case for privacy-focussed pentesting: How to ensure penetration testing addresses privacy risk.
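As one small illustration of the kind of check a privacy-focussed test can add (the endpoint and patterns below are hypothetical, and any real probing must be authorised and in scope), a tester might look for personal information over-disclosed in API responses:

```python
import re
import urllib.request

# Hypothetical endpoint under test.
ENDPOINT = "https://api.example.com/v1/orders/12345"

# Naive patterns for personal information that should not appear in
# this response: email addresses and AU-style mobile numbers.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\b(?:\+?61|0)4\d{8}\b"),
}

def check_overdisclosure(url: str) -> None:
    """Flag personal information unexpectedly present in an API response."""
    with urllib.request.urlopen(url) as resp:
        body = resp.read().decode("utf-8", errors="replace")
    for label, pattern in PII_PATTERNS.items():
        hits = pattern.findall(body)
        if hits:
            print(f"{url}: found {len(hits)} possible {label} value(s)")

check_overdisclosure(ENDPOINT)
```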

Access
93% of Australians think they should have the right to ask a business to delete their personal information, and with the Government agreeing in principle to introduce a right to be forgotten, we should prepare for it as part of upcoming tranches of Privacy Act reforms.
While this right might seem relatively straightforward, the complex integrations and uses of data these days mean it is likely to be extremely difficult to implement.
One of the challenges that comes up when we discuss data holdings with our clients is that the uses of data assets are rarely straightforward and frequently one segment of a business won’t know exactly how reliant other areas are on the data in systems or platforms. ‘We don’t want to break anything downstream’ is one of the recurring messages we get when developing retention and disposal plans for large and complex organisations.
Our advice here is advice we often give in the privacy space – you need to know what data you hold and how you are collecting, using and disposing of it.
Read more about the challenges of the right to erasure in Australia and learn more about the merits, competing rights and prospects of the right to be forgotten in elevenM podcast episode #110.
And if you’re ready to start working on this problem, find out more about how to undertake a data processing inventory or implement retention and disposal and other data controls.
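For a feel of what a data processing inventory captures, here is a minimal sketch (the fields and values are illustrative only) – note the explicit record of downstream consumers, which is exactly what makes safe disposal possible:

```python
from dataclasses import dataclass, field

@dataclass
class DataAsset:
    """One entry in a data processing inventory (fields are illustrative)."""
    system: str               # where the data lives
    categories: list          # kinds of personal information held
    purpose: str              # why it was collected
    retention: str            # how long it is kept, and why
    downstream_consumers: list = field(default_factory=list)  # who relies on it

crm_contacts = DataAsset(
    system="CRM",
    categories=["name", "email", "purchase history"],
    purpose="customer support and order fulfilment",
    retention="7 years after account closure (financial records)",
    downstream_consumers=["marketing analytics", "finance reporting"],
)

# Before disposal, check nothing downstream still relies on the asset -
# the 'we don't want to break anything' problem, made visible.
if crm_contacts.downstream_consumers:
    print(f"Hold disposal of {crm_contacts.system} data; still used by: "
          + ", ".join(crm_contacts.downstream_consumers))
```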
