End of year wrap

The year started with a meltdown. Literally.

New Year’s Eve hangovers had barely cleared when security researchers announced they had discovered flaws that would impact “virtually every user of a personal computer”. “Happy new year” to you too. Dubbed “Meltdown” and “Spectre”, the flaws in popular computer processors would allow hackers to access sensitive information from memory – certainly no small thing. Chipmakers urgently released updates. Users were urged to patch. Fortunately, the sky didn’t fall in.

If all this was meant to jolt us into taking notice of data security and privacy in 2018 … well, that seemed unnecessary. With formidable new data protection regulations coming into force, many organisations were already stepping into this year with a much sharper focus on digital risk.

The first of these new regulatory regimes took effect in February, when Australia finally introduced mandatory data breach reporting. Under the Notifiable Data Breaches (NDB) scheme, overseen by the Office of the Australian Information Commissioner, applicable organisations must now disclose any breaches of personal information likely to result in serious harm.

In May, the world also welcomed the EU’s General Data Protection Regulation (GDPR). Kind of hard to miss, with an onslaught of updated privacy policies flooding user inboxes from companies keen to show compliance.

The promise of the GDPR is to give consumers greater consent and control over their data and to place a stronger emphasis on transparency. Its extra-territorial nature (the GDPR applies to any organisation servicing customers based in Europe) meant companies all around the world worked fast to comply, updating privacy policies, implementing privacy by design and creating data breach response plans. A nice reward for these proactive companies was evidence that the GDPR is emerging as a template for new privacy regulations around the world. GDPR compliance gets you ahead of the game.

With these regimes in place, anticipation built around who would be first to test them out. For the local NDB scheme, the honour fell to PageUp. In May, the Australian HR service company detected that an unknown attacker had gained access to job applicants’ personal details, as well as the usernames and passwords of PageUp employees.

It wasn’t the first breach reported under NDB but was arguably the first big one – not least because of who else it dragged into the fray. It was a veritable who’s who of big Aussie brands – Commonwealth Bank, Australia Post, Coles, Telstra and Jetstar, to name a few. For these PageUp clients, their own data had been caught up in a breach of a service provider, shining a bright light on what could be the security lesson of 2018: manage your supplier risks.

By July we were all bouncing off the walls. Commencement of the My Health Record (MHR) three-month opt-out period heralded an almighty nationwide brouhaha. The scheme’s privacy provisions came under heavy fire – particularly the fact that the scheme was opt-out by default, loose provisions around law enforcement access to health records, and a lack of faith in how well-versed those accessing the records were in good privacy and security practices. Things unravelled so much that the Prime Minister had to step in, momentarily taking a break from more important national duties such as fighting those coming for his job.

Amendments to the MHR legislation were eventually passed (addressing some, but not all, of these issues), but not before public trust in the project was severely damaged. MHR stands as a stark lesson for any organisation delivering major projects and transformations – proactively managing the privacy and security risks is critical to success.

If not enough attention was given to data concerns in the design of MHR, security considerations thoroughly dominated the conversation about another national-level digital project – the build out of Australia’s 5G networks. After months of speculation, the Australian government in August banned Chinese telecommunications company Huawei from taking part in the 5G rollout, citing national security concerns. Despite multiple assurances from the company about its independence from the Chinese government and offers of greater oversight, Australia still said ‘no way’ to Huawei.

China responded frostily. Some now fear we’re in the early stages of a tech cold war in which retaliatory bans and invasive security provisions will be levelled at western businesses by China (where local cyber security laws should already be a concern for businesses with operations in China).

Putting aside the geopolitical ramifications, the sobering reminder for any business from the Huawei ban is the heightened concern about supply chain risks. With supply chain attacks on the rise, managing vendor and third-party security risks requires the same energy as attending to risks in your own infrastructure.

Ask Facebook. A lax attitude towards its third-party partners brought the social media giant intense pain in 2018. The Cambridge Analytica scandal proved to be one of the most egregious misuses of data and abuses of user trust in recent memory, with the data of almost 90 million Facebook users harvested by a data mining company to influence elections. The global public reacted furiously. Many users would delete their Facebook accounts in anger. Schadenfreude enthusiasts had much to feast on when Facebook founder and CEO Mark Zuckerberg testified uncomfortably in front of the US Senate.

The social network would find itself under the pump on various privacy and security issues throughout 2018, including the millions of fake accounts on its platform, the high profile departure of security chief Alex Stamos and news of further data breaches.

But when it came to brands battling breaches, Facebook hardly went it alone in 2018. In the first full reporting quarter after the commencement of the NDB scheme, the OAIC received 242 data breach notifications, followed by 245 notifications for the subsequent quarter.

The scale of global data breaches has been eye-watering. Breaches involving Marriott International, Exactis, Aadhaar and Quora all eclipsed 100 million affected customers.

With breaches on the rise, it becomes ever more important that businesses be well prepared to respond. The maxim that organisations will increasingly be judged not on the fact they had a breach, but on how they respond, grew strong legs this year.

But we needn’t succumb to defeatism. Passionate security and privacy communities continue to try to reduce the likelihood or impact of breaches and other cyber incidents. Technologies and solutions useful in mitigating common threats gained traction. For instance, multi-factor authentication had more moments in the sun this year, not least because we became more attuned to the flimsiness of relying on passwords alone (thanks Ye!). Security solutions supporting other key digital trends also continue to gain favour – tools like Cloud Access Security Brokers enjoyed strong momentum this year as businesses look to manage the risks of moving to the cloud.
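Since we’re singing the praises of multi-factor authentication, here is a minimal sketch of how the time-based one-time passwords (TOTP, RFC 6238) behind common authenticator apps are derived and checked. It’s an illustration under our own assumptions (function names, parameters), not production code.

```python
# Minimal TOTP (RFC 6238) sketch: HMAC-SHA1 over a 30-second time counter,
# then "dynamic truncation" (RFC 4226) down to a 6-digit code.
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    """Derive the current one-time code from a shared base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period            # elapsed 30s periods
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify(secret_b32: str, submitted: str) -> bool:
    """Compare in constant time to avoid leaking information."""
    return hmac.compare_digest(totp(secret_b32), submitted)

print(totp("JBSWY3DPEHPK3PXP"))  # e.g. '492039' - changes every 30 seconds
```

The point of the second factor is that even a phished or reused password is useless without the device that holds the secret.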

Even finger-pointing was deployed in the fight against hackers. This year, the Australian government and its allies began to publicly attribute a number of major cyber campaigns to state-sponsored actors. A gentle step towards deterrence, the attributions signalled a more overt and more public pro-security posture from the Government. Regrettably, some of this good work may have been undone late in the year with the passage of an “encryption bill”, seen by many as weakening the security of the overall digital ecosystem and damaging to local technology companies.

In many ways, in 2018 we were given the chance to step into a more mature conversation about digital risk and the challenges of data protection, privacy and cyber security. Sensationalist FUD in earlier years about cyber-attacks or crippling GDPR compliance largely gave way to a more pragmatic acceptance of the likelihood of breaches, high public expectations and the need to be well prepared to respond and protect customers.

At a strategic level, a more mature and business-aligned approach is also evident. Both the Australian and US governments introduced initiatives that emphasise the value of a risk-based approach to cyber security, which is also taking hold in the private sector. The discipline of cyber risk management is helping security executives better understand their security posture and have more engaging conversations with their boards.

All this progress, and we still have the grand promise that AI and blockchain will one day solve all our problems.  Maybe in 2019 ….

Till then, we wish you a happy festive season and a great new year.

From the team at elevenM.

You get an Aadhaar! You get an Aadhaar! Everybody gets an Aadhaar!

On 26 September 2018, the Supreme Court of India handed down a landmark ruling on the constitutionality of the biggest biometric identity system in the world, India’s Aadhaar system.

The Aadhaar Act was passed in 2016, and the system has since acquired over a billion registered users. An Aadhaar is a 12-digit number issued to each resident of India, linked to biometrics – all ten fingerprints, a facial photo and iris scans – and basic demographic data, all held in a central database. Since being implemented, it’s been turned to a variety of uses: proof of identification, tracking of government employee attendance, ration distribution and fraud reduction, entitlements for subsidies, and distribution of welfare benefits. The Aadhaar has quickly become mandatory for access to essential services such as bank accounts, mobile phone SIMs and passports.

Beyond banks and telcos, other private companies have also been eager to use the Aadhaar, spurring concerns about private sector access to the database.

In 2012, a series of legal challenges was levelled at the Aadhaar, including claims that it violated constitutionally protected privacy rights.

In a mammoth 1,448-page judgement, the Court made several key rulings:

  • The Court ruled that the Aadhaar system does not in itself violate the fundamental right to privacy. However, the Court specifically called out a need for a ‘robust data protection framework’ to ensure privacy rights are protected.
  • However, the Aadhaar cannot be mandatory for some purposes, including access to mobile phone services and bank accounts, as well as access to some government services, particularly education. Aadhaar-authentication will still be required for tax administration (this resolves some uncertainty from a previous ruling).
  • The private sector cannot demand that an Aadhaar be provided, and private usage of the Aadhaar database is unconstitutional unless expressly authorised by law.
  • The Court also specified that law enforcement access to Aadhaar data will require judicial approval, and any national security-based requests will require consultation with High Court justices (i.e., the highest court in the relevant Indian state).
  • Indian citizens must be able to file complaints regarding data breaches involving the Aadhaar; prior to this judgment, the ability to file complaints regarding violations of the Aadhaar Act was limited to the government authority administering the Aadhaar system, the Unique ID Authority of India.

The Aadhaar will continue to be required for many essential government services, including welfare benefits and ration distribution – s7 of the Aadhaar Act makes Aadhaar-based authentication a pre-condition for accessing “subsidy, benefits or services” provided by the government. This has been one of the key concerns of Aadhaar opponents – that access to essential government services shouldn’t be dependent on Aadhaar verification. There have been allegations that people have been denied rations due to ineffective implementation of Aadhaar verification, leading to deaths.

It’s also unclear whether information collected under provisions which have now been ruled as unconstitutional – for example, Aadhaar data collected by Indian banks and telcos – will need to be deleted.

As Australia moves towards linking siloed government databases and creating its own digital identity system, India’s experience with the Aadhaar offers many lessons. A digital identity system offers many potential benefits, but all technology is a double-edged sword. Obviously, Australia will need to ensure that any digital identity system is secure but, beyond that, that the Australian public trusts the system. To earn that trust, Australian governments will need to ensure the system and the uses of the digital identity are transparent and ethical – that the system will be used in the interests of the Australian public, in accordance with clear ethical frameworks. Those frameworks will need to be flexible enough to enable interfaces with the private sector to reap the full benefits of the system, but robust enough to ensure those uses are in the public interest. Law enforcement access to government databases remains a major concern for Australians, and will need to be addressed. It’s a tightrope, and it will need to be walked very carefully indeed.



Don’t call me, I’ll call you

You’ve just pulled dinner out of the oven, the kids have been wrangled to the table, and you’re just about to sit down.

Suddenly, your miracle of domestic logistics is shattered by the klaxon of your phone ringing. Juggling a hot plate of roast chicken and a small, wriggling child, you grab for the handset… only to be greeted by the forced enthusiasm of a long-suffering call centre worker who desperately wants to tell you about simply fantastic savings on energy prices.

The Do Not Call Register (DNCR) has been in place since 2006. It allows Australians to place their phone number on a register indicating that they don’t wish to receive marketing calls or faxes, with fines applying for non-compliance.

The ACMA enables organisations that want to conduct telemarketing campaigns to subscribe to the Register and ‘wash’ their call lists against it. This helps organisations make sure they aren’t calling people who don’t want to hear from them.
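Conceptually, washing is simple. Here’s a minimal sketch in Python – the numbers and formats are made up for illustration, and real washing is done through the Register operator’s service rather than a local file:

```python
# A toy illustration of "washing" a call list against a do-not-call register.
def normalise(number: str) -> str:
    """Keep digits only, so differently formatted numbers compare equal."""
    return "".join(ch for ch in number if ch.isdigit())

def wash(call_list: list[str], register: set[str]) -> list[str]:
    """Return only the numbers NOT on the register."""
    registered = {normalise(n) for n in register}
    return [n for n in call_list if normalise(n) not in registered]

register = {"02 9123 4567", "0411 222 333"}          # on the Register
campaign = ["(02) 9123 4567", "03 8000 1111"]        # proposed call list
print(wash(campaign, register))                      # ['03 8000 1111']
```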

Of course, that doesn’t help if you don’t bother to check the Register in the first place, like Lead My Way. Lead My Way received a record civil penalty of $285,600 today for making marketing calls to numbers on the Register. Lead My Way had actually subscribed to the Register, but for some reason hadn’t washed its call list against it. This led to numerous complaints to the ACMA, which commenced an investigation.

Lead My Way was calling people to test their interest in its clients’ products or services, then on-selling that information as ‘leads’ – that is, as prospective customers. This kind of business model can also raise significant Privacy Act compliance issues. Do the people being called understand that their personal information is collected and will be sold? How are they notified of the collection (APP 5)? Have they consented to that use? Is that consent informed and valid? Is the sale of their personal information permissible (APP 6)? Are they able to opt out of receiving further marketing calls, and are those opt-outs being respected (APP 7)?

Cutting corners on how you manage and use personal information may save you time and money in the short term. But, as Lead My Way discovered, in the long run it can create massive compliance risk, annoy your end users, and incur the wrath of the regulators. Were the (likely minuscule) savings of ignoring the DNCR Register worth a regulator investigation and the comprehensive trashing of Lead My Way’s brand?

Perhaps we should call them and ask.



In Privacy Awareness Week, will Australia follow the GDPR?

Last week, the headlines told us that the Senate backs GDPR-style laws in Australia.

But what does this really mean in terms of the government’s commitment to reviewing privacy in Australia?

This does not (necessarily) mean the law will be reviewed

In short, it means very little. The Senate’s support of Senator Jordon Steele-John’s notice of motion – calling on the Government to consider the impact of our current privacy laws on Australians and to look to the GDPR as a potential model for privacy protections – carries no commitment, because the Senate cannot commit the Government to action.

What it does signify is something very big: a shift in the willingness of the Senate to stand behind the Greens’ position that Australian privacy laws must be scrutinised. Just two months ago, Senator Steele-John put forward a very similar notice of motion and it was shut down, as were a couple of other privacy-related motions.

Why did this one pass? What has changed?

There are a few likely reasons why this one passed. Putting aside matters of semantics and the politics of calling on government to subject itself to tighter scrutiny (which sank motions no. 749 and no. 786), there is one material reason why this motion succeeded.

In the last two months, consumers have started to wake up to something we privacy professionals have worried about for a while – that legal compliance is not enough and can, in fact, be damaging if ethical behaviours and transparent practices are perceived to be lacking.

There has been an enormous groundswell in Australia over the last two months, with both Facebook/Cambridge Analytica and the Commonwealth Bank blitzing the press over actions they have taken – or not taken – which, although arguably lawful, have not met public perceptions of fairness and ethics. Put simply, community expectations have surpassed legal standards.

So, Senator Steele-John had his day, and time will tell whether this will serve as a prompt for government to call for a review of Australian privacy law in view of the GDPR.

There are plenty of other reasons why GDPR compliance makes sense, but we’ll leave that to a future blog.

Happy Privacy Awareness Week!



Facebook and Cambridge Analytica: Would the GDPR have helped?

It’s a modern-day truism that when you use a “free” online service, you’re still paying – not with your money, but with your personal information. This is simply the reality for many of the services we’ve come to rely on in our daily lives, and for most it’s an acceptable (if sometimes creepy) bargain.

But what if you’re paying for online services not just with your own personal information, but with that of your friends and family? And what if the information you’re handing over is being shared with others who might use it for purposes you didn’t consider when you signed up – maybe for research purposes, maybe to advertise to you, or maybe even to influence the way you vote?

Last week it emerged that an organisation called Cambridge Analytica may have used personal information scraped from Facebook to carry out targeted political advertising. The information was obtained when Facebook users accessed a psychometric profiling app called thisisyourdigitallife – but the data that was collected wasn’t just about app users, it was also about their Facebook friends (more on that below).

It’s what we’re now seeing from consumers that’s interesting.  People are rightfully asking for an explanation. Whilst we seem to have been asleep at the wheel over the last few years, as data empires around the world have pushed the boundaries, the current Facebook debacle is leading us to ask questions about the value of these so-called “free” services, and where the lines should be drawn.  The next few weeks will be telling, in terms of whether this really is the “tipping point” as many media commentators are calling it, or just another blip, soon forgotten.

In any case, with only a few months until the EU General Data Protection Regulation (GDPR) comes into force, this blog post asks: if the GDPR were operational now, would consumers be better protected?

First, some background

There’s plenty of news coverage out there covering the details, so we’ll just provide a quick summary of what happened.

A UK-based firm called Global Science Research (GSR) published thisisyourdigitallife and used the app to gather data about its users. Because GSR claimed this data was to be used for academic purposes, Facebook policies at the time allowed it to also collect limited information about friends of app users. All up, this meant that GSR collected the personal information of more than 50 million people – many more than the 270,000 people who used the app.

GSR then used the personal information to create psychometric profiles of the individuals concerned, apparently without their informed consent. These profiles were then allegedly passed on to Cambridge Analytica (possibly in breach of Facebook’s rules), which used the data to target, market to – and perhaps manipulate – individuals.

Was this a breach?

There’s been some debate over whether this incident can be fairly labelled a “breach”. Based on what we know, it certainly doesn’t appear that any personal information has been lost or disclosed by means of an accident or a security vulnerability, which is something many consider a necessary element of a “data breach”.

Facebook’s initial response was to hit back at claims it was a “data breach”, saying users willingly handed over their information, and the information of their friends. “Everyone involved gave their consent. People knowingly provided their information, no systems were infiltrated, and no passwords or sensitive pieces of information were stolen or hacked” it allegedly said.

Facebook has since hired a digital forensics firm to audit Cambridge Analytica and has stated that if the data still exists, it would be a “grave violation of Facebook’s policies and an unacceptable violation of trust and the commitments these groups made.”

In more recent days, Mark Zuckerberg has made something of a concession, apologising for the “major breach of trust”. We love this line from the man who told us that privacy is dead.

GDPR – would it have helped?

We at elevenM are supporters of the GDPR, arguably the most extensive and far-reaching privacy reform of the last 25 years. The GDPR raises the benchmark for businesses and government and brings us closer to one global framework for privacy. But would the GDPR have prevented this situation from occurring? Would the individuals whose data has been caught up by Cambridge Analytica be in a better position if the GDPR applied?

Let’s imagine that GDPR is in force and it applies to the acts of all the parties in this case, and that Facebook still allowed apps to access information about friends of users (which it no longer does). Here is the lowdown:

  1. Facebook would have to inform its users in “clear and plain” language that their personal information (aka personal data under GDPR) could (among other things) be shared with third party apps used by their friends.
  2. Because the personal data may have been used to reveal political opinions, users would likely also need to provide consent. The notification and consent would have to be written in “clear and plain” language, and consent would have to be “freely given” via a “clear affirmative act” – implied consent or pre-ticked boxes would not be acceptable (see the sketch after this list).
  3. The same requirements relating to notification and consent would apply to GSR and Cambridge Analytica when they collected and processed the data.
  4. Individuals would also have the right to withdraw their consent at any time, and to request that their personal data be erased (under the new “right to be forgotten”). If GSR or Cambridge Analytica were unable to find another lawful justification for collecting and processing the data (and it’s difficult to imagine what that justification could be), they would be required to comply with those requests.
  5. If Facebook, GSR or Cambridge Analytica were found to be in breach of the above requirements (although again, this is purely hypothetical because GDPR is not in force at the time of writing), they could each face fines up to 20 million EUR, or 4% of worldwide annual turnover (revenue), whichever is higher. Those figures represent the maximum penalty and would only be applied in the most extreme cases – but they make clear that GDPR is no toothless tiger.
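To make point 2 concrete, here is a minimal sketch of how a service might record the “clear affirmative act” and support later withdrawal, so that consent can be demonstrated. The field names are our own – this illustrates the idea; it is not a compliance tool.

```python
# A toy consent record: one entry per user per specific purpose.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    user_id: str
    purpose: str                       # consent is purpose-specific
    wording_shown: str                 # the "clear and plain" language presented
    affirmative_act: str               # e.g. "ticked an unticked box" - never pre-ticked
    given_at: datetime
    withdrawn_at: Optional[datetime] = None

    def withdraw(self) -> None:
        """Withdrawal must be possible at any time; processing then stops."""
        self.withdrawn_at = datetime.now(timezone.utc)

    @property
    def active(self) -> bool:
        return self.withdrawn_at is None
```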

So, there it is.  We think that GDPR would have made it far more likely that EU residents were made aware of what was happening with their personal data and would have given them effective control over it.

Some lessons

With so many recent data incidents resulting from outsourcing and supply chains, regulators around the world are focussing increasingly on supplier risk. Just last week here in Australia, we saw the financial services regulator APRA’s new cyber security regulation littered with references to supplier risk. The Cambridge Analytica situation is another reminder that we are only as strong as our weakest link. The reputations of our businesses and the government departments for whom we work will often hinge on the control environments of third parties. Therefore, organisations need to clearly assess third-party risks and take commensurate steps to assure themselves that the risks and controls are reasonable and appropriate.

As for individuals – regardless of what regulatory action is taken in Australia and abroad, there are simple steps that we all can and should be taking.  This episode should prompt people to think again about the types of personal information they share online, and who they share it with. Reviewing your Facebook apps is a good start – you might be surprised by some of the apps you’ve granted access to, and how many of them you’d totally forgotten about (Candy Crush was so 2015).

What’s next

We expect this issue to receive more attention in the coming weeks and months.

Regulators around the world (including the Australian Privacy Commissioner, the UK Information Commissioner (ICO), the Canadian Privacy Commissioner and the EU Parliament) are looking into these issues now. Just over the weekend we saw images of ICO personnel allegedly raiding the premises of Cambridge Analytica, Law & Order style.

The Australian Competition and Consumer Commission (ACCC) also has been preparing to conduct a “Digital Platforms Inquiry” which, among other things, may consider “the extent to which consumers are aware of the amount of data they provide to digital platforms, the value of the data provided, and how that data is used…”

Meanwhile, we await the consumer backlash.  Consumers will likely expect increasingly higher standards from the organisations they share their data with and will seek out those organisations that are transparent and trustworthy, and which can demonstrate good governance over privacy and data protection practices.   Will you be one of them?



Head to Head: the GDPR and the Australian Privacy Principles – Part 2: A Tale of Two Jurisdictions

This article was originally published in issue #81 (5 December 2017) of Privacy Unbound, the journal of the International Association of Privacy Professionals, Australia-New Zealand (iappANZ).

In Part 1 of this article our aim was to help you understand whether the GDPR applies to your business. In Part 2 we will help you focus your efforts in preparing for the GDPR by identifying links and differences between the 13 Australian Privacy Principles and the GDPR’s 99 Articles.

Gap analysis – Comparing the GDPR and Australian Privacy Principles

If the GDPR is likely to apply to your data processing, understanding the gaps in your current privacy framework will be critical. A gap analysis can help you identify the key areas to focus on.

The GDPR shares some thematic similarities with Australia’s national privacy regulatory regime, set out in the Privacy Act 1988 (Cth) and the Australian Privacy Principles (APPs).

The GDPR and the Privacy Act share a similar purpose – to foster transparent information handling practices and business accountability in relation to the handling of personal information. The two regimes take different approaches – the GDPR’s 99 articles are highly prescriptive, whereas the Privacy Act relies on a principles-based approach supplemented by extensive guidance.

However, the founding principles of the GDPR (the lawful, transparent and fair processing of personal data) laid out in Chapter II (Articles 5-11), and many of the GDPR’s express obligations, align with the steps that the OAIC expects Australian companies to take to comply with the APPs (as set out in OAIC guidance). In short, best-practice compliance with the APPs will help Australian companies support compliance with the GDPR.

There are some key differences – both in terms of legal concepts and additional data subject rights and corresponding obligations found in the GDPR. These are set out in the comparison table below.

Summary of the APPs vs the GDPR

The Australian Privacy Act applies to ‘APP entities’ – that is, Australian and Norfolk Island government agencies (agencies) and private sector businesses (organisations), as well as credit providers and credit reporting bodies. Individuals and many ‘small business operators’ – businesses with an annual turnover of less than AUD $3 million – are exempt from the operation of the Act.

Unlike the GDPR, the Privacy Act does not distinguish between ‘data controllers’ and ‘data processors’ – any APP entity that holds personal information must comply with the APPs.

APP 1 — Open and transparent management of personal information

This first APP requires APP entities to manage personal information in an “open and transparent way”, including taking reasonable steps to ensure that they comply with the APPs.

APP 1 is similar in effect to GDPR Article 5 Principle 2, which requires controllers to be able to demonstrate compliance with the obligations set out in Principle 1. Principle 1(a) also requires data processing to be done in a “transparent manner”.

APP 1.3 and 1.4 also require APP entities to have a clearly expressed privacy policy that deals with specified matters. GDPR Article 7 discusses obtaining consent from an individual in the context of a “written declaration”, and Articles 12-14 address similar matters to those specified in APP 1.3 and 1.4. Articles 13-14 also require additional information to be provided; this includes information about how long personal data will be stored, the enhanced personal rights under the GDPR (such as data portability, the right to withdraw consent, and the right to be forgotten), and any automated decision-making, including profiling.

APP 2 — Anonymity and pseudonymity

APP 2 requires APP entities to give individuals the option of not identifying themselves, or of using a pseudonym, unless a listed exception applies.

There is no direct analogue to this provision in the GDPR. However, the GDPR may apply to pseudonymous information (see Recital 28).

APP 3 — Collection of solicited personal information

APP 3 outlines what personal information an APP entity can collect. In particular, this APP requires that organisations only collect personal information that is reasonably necessary or directly related to their functions or activities, by “lawful and fair means” and, where reasonable and practicable, directly from the individual. Higher standards are applied to the collection of ‘sensitive information’ (see comparison table below); specifically, sensitive information may only be collected with consent, or where a listed exception applies.

A comparison can be drawn here to GDPR Article 5, which requires that data be collected for “specified, explicit and legitimate purposes”, and be processed “lawfully [and] fairly” (Principles 1(a) and (b)). The question of whether a company has a lawful basis for processing personal information is critical.

APP 4 — Dealing with unsolicited personal information

APP 4 requires APP entities to destroy or de-identify unsolicited personal information that they could not have otherwise collected under APP 3.

There is no direct analogue in the GDPR; however, it should be noted that the GDPR does not permit collection of personal data without a specified, explicit purpose.

APP 5 — Notification of the collection of personal information

APP 5 requires APP entities to notify individuals (or otherwise ensure that they are aware) of specified matters when they collect their personal information (for example, by providing individuals with a collection statement).

Again, GDPR Articles 12, 13 and 14 impose requirements for the provision of privacy information about how data is processed that are substantially similar to the matters specified in APP 5, as well as additional obligations (see APP 1, above). This includes a requirement that the information is clear and easy to understand. Australian companies should consider, for example, whether their privacy policies are written in plain English.

APP 6 — Use or disclosure of personal information

This APP outlines the circumstances in which an APP entity may use or disclose personal information that it holds. Where an APP entity has collected personal information for a specific purpose, and wishes to use it for a secondary purpose, APP 6 provides that entities may not do so unless the individual has consented, it is within their reasonable expectations, or another listed exception applies. Exceptions include circumstances involving health and safety and law enforcement.

GDPR Article 6 similarly requires that personal data may only be processed where the data subject has consented to one or more of the specific purposes of the processing, or the processing is otherwise lawful as another listed scenario applies. For example, where the processing is necessary to perform a contract or comply with a legal obligation.

APP 7 — Direct marketing

APP 7 provides that an organisation that is an APP entity may only use or disclose personal information for direct marketing purposes if certain conditions are met. In particular, direct marketing messages must include a clear and simple way to opt out of receiving future messages, and must not be sent to individuals who have already opted out. Sensitive information about an individual may only be used for direct marketing with consent of the individual.

GDPR Article 21 provides individuals with, amongst other things, the right to object to the use of their personal data for direct marketing.

APP 8 — Cross-border disclosure of personal information

This principle requires an APP entity, before it discloses personal information to an overseas recipient, to take reasonable steps to ensure that the recipient does not breach the APPs in relation to that information. Personal information may only be disclosed where the recipient is subject to a regulatory regime that is substantially similar to the APPs, where the individual has consented, or another listed exception applies. APP entities may be liable for the acts and practices of overseas recipients in certain circumstances (s16C).

Chapter 5 of the GDPR provides that transfers of personal data outside of EU jurisdiction may only be made where the recipient jurisdiction has been assessed as ‘adequate’ in terms of data protection, where sufficient safeguards (such as a binding contract or corporate rules) have been put in place, or a listed exception applies. The European Commission has not, to date, assessed Australia as ‘adequate’, but the Commission is currently reviewing its adequacy assessments.

APP 9 — Adoption, use or disclosure of government related identifiers

APP 9 provides that an organisation that is an APP entity may not adopt a government related identifier of an individual as its own identifier, or use or disclose such an identifier, unless a listed exception applies. There is no direct analogue to this provision in the GDPR.

APP 10 — Quality of personal information

APP 10 requires APP entities to take reasonable steps to ensure the personal information they collect, use or disclose is accurate, up to date and complete.

Accuracy and currency of the information are addressed in GDPR Article 5 (Principle 1(d)): “every reasonable step must be taken” to ensure that inaccurate personal data is “rectified without delay”.

APP 11 — Security of personal information

This APP requires APP entities to take reasonable steps to protect personal information they hold from misuse, interference and loss, and from unauthorised access, modification or disclosure. This provision is a frequent focus of investigations into APP entities conducted by the Australian Information Commissioner.

GDPR Article 5 similarly requires that data processing be undertaken in a manner “that ensures appropriate security of the data” (Principle 1(f)). Further, Article 32 requires the data controller and the processor to implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk (taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing). Those measures must also address the confidentiality, integrity and availability of the data.

APP 11.2 provides that APP entities must also take reasonable steps to destroy or de-identify personal information that they no longer require for a lawful business purpose.

GDPR Article 5 imposes a similar storage limitation – personal data may be “kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed” (Principle 1(e)). However, the GDPR also explains that “personal data may be stored for longer periods insofar as the personal data will be processed solely for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes in accordance with Article 89(1)”.

APP 12 — Access to personal information

APP 12 requires APP entities to give an individual access to the personal information about them that the entity holds, on request by that individual. APP 12 imposes procedural requirements around access, and includes limited exceptions.

Article 15 of the GDPR imposes a similar right of access, with additional rights to information about the collection and envisaged use of the data (such as recipients or potential recipients, the likely storage period, and safeguards for overseas transfers).

APP 13 — Correction of personal information

APP 13 requires APP entities to take reasonable steps to correct personal information they hold about an individual, on request by the individual. This APP also imposes procedural requirements and includes limited exceptions.

GDPR Article 16 imposes a similar but stronger right; data subjects have the absolute “right to obtain…without undue delay the rectification of inaccurate personal data concerning [them]”.

GDPR rights that are not in the APPs

What none of the APPs provides is an express right to erasure, a right to restriction of processing, data portability or a right to object. The GDPR provides for these rights in Articles 17, 18, 20 and 21.

Complementary APP v GDPR legal concepts comparison table

Complementary Legal Comparison Table



Head to Head: the GDPR and the Australian Privacy Principles – Part 1: The long arm of the law

This article was originally published in issue #81 (5 December 2017) of Privacy Unbound, the journal of the International Association of Privacy Professionals, Australia-New Zealand (iappANZ).

Introduction

The EU’s new and wide-ranging General Data Protection Regulation (GDPR) represents an unprecedented shakeup of the European data protection regulatory environment. The GDPR promises to set a new regulatory benchmark and drive reform in jurisdictions around the world. The GDPR will come into force on 25 May 2018, replacing the current EU Data Protection Directive 95/46/EC. It will have immediate direct effect in all EU Member States.

Australian companies with exposure to the European market should take note – the GDPR can and will apply to companies based outside of Europe. Australian-based companies should take this opportunity to confirm whether the GDPR will apply to them come May, or whether they need to prepare for GDPR compliance to access the European market in the future.

The costs of non-compliance may be extreme – the GDPR introduces a new set of sharp teeth for European regulators, including fines of up to €20 million or 4% of global revenue, whichever is the greater. However, the added burden of compliance promises to pose a challenge for many businesses working with limited resources.

Part 1 of this article will help you understand whether the GDPR will apply to your business. Part 2 will help you focus your efforts in preparing for the GDPR by identifying links and differences between the 13 Australian Privacy Principles and the GDPR’s 99 Articles.

The GDPR’s extra-territorial application

Critically for Australian companies, Article 3 of the GDPR extends its reach to any company that controls or processes the personal information of individuals in the EU (whatever their nationality or place of residence) if the processing is related to offering them goods or services or monitoring their behaviour – whether or not the company is located in the EU or the processing occurs in the EU.

For the purposes of the GDPR, a data ‘controller’ determines the purposes and means of the processing of personal information, and the ‘processor’ processes the information on the controller’s behalf. ‘Processing’ is not a term found in Australian privacy law. The term is broadly defined and essentially means any act or practice that is done to, or in connection with, personal information.

Therefore, Australian companies that service or supply European clients, or that otherwise offer goods or services to individuals in the EU or monitor behaviour that takes place in the EU, need to assess their client and individual customer bases, operations, systems and processes to answer three key questions (reduced to toy decision logic in the sketch after this list):

  1. Do you have an ‘establishment’ in the EU? (Article 3.1)
  2. Do you offer goods or services to individuals who are in the EU (whether or not you charge for them)? (Article 3.2(a))
  3. Do you monitor any behaviour of individuals in the EU? (Article 3.2(b))
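For illustration only, the three questions reduce to very simple decision logic – the names below are our own, and this is obviously not legal advice:

```python
# Toy encoding of the GDPR Article 3 applicability test described above.
from dataclasses import dataclass

@dataclass
class Article3Assessment:
    establishment_in_eu: bool        # Article 3.1
    offers_goods_or_services: bool   # Article 3.2(a), paid or free
    monitors_behaviour: bool         # Article 3.2(b)

    def gdpr_applies(self) -> bool:
        """A 'yes' to any one of the three questions brings processing in scope."""
        return (self.establishment_in_eu
                or self.offers_goods_or_services
                or self.monitors_behaviour)

print(Article3Assessment(False, True, False).gdpr_applies())  # True
```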

Establishment

Article 4 provides that the main establishment of a data controller is the “place of its central administration” in the EU. That is, where the “decisions on the purposes and means of the processing” occur. For example, if you have an EU office or headquarters.

For processors, the main establishment will be either the place of central administration in the EU or, if the processor does not have one, then where the main processing activity in the EU takes place. For example, if you have your head office in Australia, but maintain an EU data centre.

Offering goods and services

The GDPR recitals explain that a range of factors will be relevant to deciding whether a company is ‘offering goods or services’ to individuals in the EU. These include:

  • the use of language and currency or a top-level domain name of an EU Member State
  • delivery of physical goods to a Member State
  • making references to individuals in a Member State to promote the goods and services, or
  • targeting advertising at individuals in a Member State.

Mere accessibility of an Australian company’s website or app to individuals in the EU will not, by itself, reach the threshold.

Some of these factors obviously indicate that goods and services are being offered. But it may ultimately be the cumulative effect of various activities that bring a company’s data processing within the reach of the GDPR.

Monitoring

To determine whether a processing activity can be considered to be ‘monitoring’ the behaviour of individuals in the EU for the purposes of Article 3.2(b), you should consider whether your company is:

  • associating individuals in the EU with online identifiers provided by their devices, applications, tools and protocols, such as IP addresses and cookie identifiers
  • tracking their behaviour on the Internet, and
  • using data processing techniques that profile individuals, particularly in order to make decisions concerning them for analysing or predicting their personal preferences, behaviours and attitudes.

Enforcement

European data protection authorities will have increased supervisory powers under the GDPR. However, the question of how those authorities will approach extraterritorial enforcement against companies established and operating outside the EU is far from settled.

GDPR Article 50 imposes obligations on the EU Commission and authorities to take appropriate steps to cooperate with international stakeholders. In recent years, there has been increasing cooperation between authorities. Under the GDPR, it is likely that EU authorities will liaise with the Australian privacy regulator – the Office of the Australian Information Commissioner (OAIC) – when responding to data processing by an Australian company. This may in turn trigger regulatory action by the OAIC or a cooperative effort to effect an appropriate response. Any evidence of a company’s presence in or nexus with an EU Member State may influence the potential for cross-border enforcement action.

How can you prepare?

If your answer to any of the three questions above is ‘yes’, then you will need to consider:

  • the risks arising from gaps between your current compliance under Australian privacy law and the GDPR requirements, and
  • what additional steps you need to take to ensure that you can comply with the additional GDPR requirements, or
  • whether you need to cease any activities in relation to individuals in the EU to which the GDPR will apply, and/or restructure your EU operations.


Comparing Australia’s Privacy Act with the GDPR

In this series by the IAPP, they look at laws from across the globe and match them up against the EU General Data Protection Regulation, with the aim of helping you avoid duplication as you move toward GDPR compliance.

In this instalment, our very own Tim de Sousa compares Australia’s Privacy Act 1988 with the GDPR.

Read the full article here: https://iapp.org/news/a/gdpr-matchup-australias-privacy-act-1988/

