The annual International Association of Privacy Professionals (IAPP) ANZ Summit took place last week, and as is often the case, several consistent themes threaded through the many interesting and varied presentations.
AI, biometrics and facial recognition were, unsurprisingly, all popular topics, but three thematic threads drew strong agreement between presenters:

- the absolute need for data minimisation and data inventory work across all industries;
- the inherent link between privacy and cyber security, a steadily growing topic of discussion; and
- the need to strategically consider points of failure (whether likely data breaches, weaknesses of biometrics, or other areas) and to proactively address the harms that arise from them.
We were all also treated to a complete casserole of (primarily food-based) analogies and metaphors, some better than others…
Here are a few highlights from some of the elevenM team:
Children’s privacy

Cavill Borger, Consultant
After a delicious lunch interlude, the (first) afternoon began with a panel discussion about implementing children’s privacy protections. The panel, led by Nicole Stephensen, gave a comprehensive overview of where children’s privacy is most at risk online, including social media platforms, gaming apps, health tech and education tech. The central issue in this area is how we can give children the benefits of an online experience and access to a wealth of information, while also protecting their vulnerabilities and shielding them from inappropriate content and exploitation.
Loren Leung, from Bird & Bird, gave great insights into how other jurisdictions are tackling this issue. This included an overview of the UK ICO’s Age Appropriate Design Code, the Irish Data Protection Commission’s 14 principles for services likely to be accessed by children, and a comparison of the USA’s Children’s Online Privacy Protection Act and California’s Age-Appropriate Design Code.
The overarching theme across these approaches to child privacy lies in the UN Convention on the Rights of the Child: what is in the best interests of the child? Implementing this requires a flexible approach, including establishing an age at which a child can give consent on their own, coupled with age verification requirements and transparent processes. There is also a careful balancing act between prohibiting certain practices, such as tracking or automated decision making, and the benefits that may come from those same practices. A great example is the use of AI in schools: used appropriately, it could identify where a child would benefit most from additional educational support, but concerns about other prominent uses of AI could prompt blanket bans, to the detriment of the child. How this will look in practice remains the great unanswered question, but that’s a whole other blog topic!
Privacy and employment
Margaret Carter, Head of People and Culture
As a People Professional, I was particularly interested in the IAPP ANZ conference this year.
Two proposed changes to the Privacy Act 1988 discussed at the conference have the potential to bring Australia into line with contemporary employee data legislation globally, which I believe will be a hugely positive step forward for all organisations in Australia. As these two changes were presented, I felt a surge of positive energy within myself and across the wider audience.
The first change means that in the future, private businesses may need to comply with the Australian Privacy Principles (APPs) when dealing with employee data (this proposal was agreed in-principle, with the Government flagging further consultation). The impact of this, should it come into force, is that all businesses will need to reconsider the way that they collect, use, store and dispose of employee information.
The second change relates to the proposal that all private registered organisations will need to ensure that a senior employee has specific responsibility for privacy (again, this is currently agreed in-principle). What is promising about this change is that it is supported by clear expectations about what this role may entail, and the skills and knowledge required to perform the position effectively.
I envisage a lot of work for People and Culture departments in preparing for these changes (whatever they end up looking like) over the next 12 months, and you’ll hear more from me on this topic.
An introduction to privacy
Hannah Dellevoet, Consultant
When I say I am new to the privacy space, I mean I could hardly be newer. I joined elevenM just under six months ago as a Privacy Consultant, coming from a retail analytics background, and this IAPP Summit is the first conference I have ever attended. Since I started, I have been struck by what an interesting time it is to become a privacy professional. Privacy in Australia and New Zealand seems to be in a state of flux: data breaches are rife, community attitudes to privacy are changing, we are on the cusp of Privacy Act reform, and the emergence of more and more AI tools and platforms is forcing us to consider how best to regulate AI.
Over the course of the two days and 10 sessions I attended, I found there was much I agreed with my peers on (including the necessity of coffee to privacy practice) and even some things I disagreed on! There is a metric tonne for me to learn, and privacy extends into areas I would never have thought of. When choosing which sessions to attend, I focused on those I could learn most from, including discussions of the upcoming Privacy Act reform and how best to tackle privacy in practice. I very much enjoyed the chance to listen and learn, and welcomed the friendly debate that helped me formulate some of my own opinions on privacy practice.
I noticed some common themes emerge across the sessions as the days progressed. Some that I gravitated towards were:
- The importance of trust and agency over personal information for the individual in the consumer-company relationship.
- The concept of accountability — particularly that companies must be accountable for their data.
- The importance of data minimisation (“you can’t breach what you don’t hold”).
- The juxtaposition of privacy as a human right against the ‘commerciality of privacy’ and the tensions around how both may exist (or not) in tandem.
The conversations around these themes afforded me valuable learnings that I will keep front of mind as I continue my journey in privacy.