Nine steps to a successful privacy and cyber security capability uplift

Most organisations today understand the critical importance of cyber security and privacy protection to their business. Many are commencing major uplift programs, or at least considering how they should get started.

These projects inevitably carry high expectations because of what’s at stake. They’re also inherently complex and impact many parts of the organisation. Converting the effort and funding that goes into these projects into success and sustained improvement to business-as-usual practices is rarely straightforward.

Drawing on our collective experience working on significant cyber security and privacy uplift programs across the globe, in a variety of industries, here are what we believe to be the key elements of success.

1. Secure a clear executive mandate

Your uplift program is dealing with critical risks to your organisation. The changes you will seek to drive through these programs will require cooperation across many parts of your organisation, and potentially partners and third parties too. A mandate and sponsorship from your executive is critical.

Think strategically about who else you need on-side, beyond your board and executive committee. Build an influence map and identify potential enablers and detractors, and engage early. Empower your program leadership team and business leadership from affected areas to make timely decisions and deliver their mandate.

2. Adopt a customer and human-centric approach

Uplift programs need to focus on people change as well as changes to processes and technology. Success in this space very often comes down to changing behaviours and ensuring the organisation has sufficient capacity to manage the new technology and process outputs (e.g. how to deal with incidents).

We therefore suggest that you adopt a customer and human-centric approach. Give serious time, attention and resourcing to areas including communications planning, organisational change management, stakeholder engagement, training and awareness.

3. Know the business value of what you are going to deliver and articulate it

An opaque or misaligned understanding of what a security or privacy program is meant to deliver is often the source of its undoing. It is crucial to ensure scope is clear and aligned to the executive mandate.

Define the value and benefits of your uplift program early, communicate them appropriately and find a way to demonstrate this value over time. Be sure to speak in terms the business understands, not just the new technologies or capabilities you will roll out. For instance, what risks have you mitigated?

You can’t afford to be shy. Ramp up the PR to build recognition of your program and its value among staff, executives and board members. Think about branding.

4. Prioritise the foundational elements

If you’re in an organisation where security and privacy risks have been neglected, but now have a mandate for broad change, you can fall into the trap of trying to do too much at once.

Think of this as being your opportunity to get the groundwork in place for your future vision. Regardless of whether the foundational elements are technology or process related, most with tenure in your organisation know which of them need work. From our experience, those same people will also understand the importance of getting them right and in most cases would be willing to help you fix them.

As a friendly warning, don’t be lured down the path of purchasing expensive solutions without having the right groundwork in place. Most, if not all of these solutions rely on such foundations.

5. Deliver your uplift as a program

For the best results, deliver your uplift as a dedicated change program rather than through business-as-usual (BAU) activity.

Your program will of course need to work closely with BAU teams to ensure the sustained success of the program. Have clear and agreed criteria with those teams on the transition to BAU. Monitor BAU teams’ preparation and readiness as part of your program.

6. Introduce an efficient governance and decision making process

Robust and disciplined governance is critical. Involve key stakeholders, implement clear KPIs and methods of measurement, and create an efficient and responsive decision-making process to drive your program.

Governance can be light touch provided the right people are involved and the executive supports them. Limit the involvement of “passengers” on steering groups who aren’t able to contribute, and make sure representatives from BAU are included.

7. Have a ruthless focus on your strategic priorities

These programs operate in the context of a fast-moving threat and regulatory landscape. Things change rapidly and there will be unforeseen challenges.

It’s important to be brave and assured in holding to your strategic priorities. Avoid the temptation to succumb to tactical “quick fixes” that solve short-term problems but bring long-term pain.

8. Build a high-performance culture and mindset for those delivering the program

These programs are hard but can be immensely satisfying and career-defining for those involved. Investing in the positivity, pride and engagement of your delivery team will pay immense dividends.

Seek to foster a high-performance culture, enthusiasm, tolerance and collaboration. Create an environment that is accepting of creativity and experimentation.

9. Be cognisant of the skills shortage and plan accordingly

While your project may be well funded, don’t be complacent about the difficulty of accessing skilled people to achieve its goals. Globally, the security and privacy industries continue to suffer severe shortages of skilled professionals. Build these constraints into your forecasts and expectations, and think laterally about the use of partners.


If you enjoyed this and would like to be notified of future elevenM blog posts, please subscribe below.

Musings on the OAIC’s second Notifiable Data Breaches report

On 31 July, the Office of the Australian Information Commissioner (OAIC) released its second Notifiable Data Breaches Quarterly Statistics Report.

This report covers the first full quarter since the Notifiable Data Breaches scheme (NDB scheme) began on 22 February 2018, and the OAIC has clearly put some work into building out the report with detailed stats and breakdowns. Let’s take a look.

Going up, up, up!

This quarter there were 242 notifications overall, noting that multiple notifications relating to the same incident (including the infamous PageUp breach) were counted as a single breach.

The OAIC’s month by month breakdown shows a steady increase in notifications by month, going from 55 notifications in March to 90 notifications in June. Overall, accounting for the partial quarter in the first report, we’ve seen a doubling in the rate of notifications.

However, there are a lot of factors that may be affecting the notification rate. Since February, many companies and agencies have implemented new processes to make sure they comply with the NDB scheme, and this may be driving more notifications. On the other hand, in our experience a lot of companies and agencies are still unsure about their notification obligations and when to notify, so they might be over-reporting – notifying breaches that may not meet the ‘likely risk of serious harm’ threshold just to be sure they are limiting their compliance risk.

At this early stage of the scheme, we think it’s premature to draw any conclusions on rising notification rates. The rate may change significantly as companies and agencies come to grips with their obligations and what does and doesn’t need to be reported.

Teach your staff well

59% of breaches this quarter were identified as being caused by malicious or criminal attacks. The vast majority (68%) of attacks were cyber incidents and, of those, over three quarters related to lost or stolen credentials. This includes attacks based on phishing, malware, and social engineering. Brute force attacks also featured significantly.

We think that the obvious conclusion here is that there’s an opportunity to significantly reduce the attack surface by training your staff to better protect their credentials. For example, teach them how to recognise phishing attempts, run drills, and enforce regular password changes.

There are also some system issues that could be addressed, such as multi-factor authentication, enforcing complex password requirements, and implementing rate limiting on credential submissions to prevent brute force attacks.
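As a rough sketch of the rate-limiting idea mentioned above (the thresholds and function names here are our own illustrative choices, not anything prescribed by the OAIC report), a simple sliding-window limit on login attempts per account might look like:

```python
import time
from collections import defaultdict

# Hypothetical limits: at most 5 attempts per account per 60-second window.
MAX_ATTEMPTS = 5
WINDOW_SECONDS = 60

_attempts = defaultdict(list)  # username -> timestamps of recent attempts


def allow_login_attempt(username, now=None):
    """Return True if this attempt is within the rate limit, else False."""
    now = time.time() if now is None else now
    # Keep only attempts inside the current window.
    recent = [t for t in _attempts[username] if now - t < WINDOW_SECONDS]
    _attempts[username] = recent
    if len(recent) >= MAX_ATTEMPTS:
        return False  # refused until older attempts fall out of the window
    _attempts[username].append(now)
    return True
```

In practice a check like this would sit in front of the credential verification step, and production systems would typically persist counters in a shared store rather than in process memory.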

To err is human

Human error accounted for 36% of breaches this quarter. It was the leading cause in the first quarterly report, but again, there are a number of factors that could have caused this shift.

Notably, over half of the breaches caused by human error were scenarios in which personal information was sent to the wrong person – by email, post, messenger pigeon or what have you, but especially email (29 notifications). Again, this suggests a prime opportunity to reduce your risk by training your staff. For example, it appears that at least 7 people this quarter didn’t know (or forgot) how to use the BCC/Blind Carbon Copy function in their email.

People make mistakes. And we know this, so it’s a known risk. We should be designing processes and systems to limit that risk, such as systems to prevent mistakes in addressing.
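As an illustrative sketch of such a safeguard (the domain name and threshold below are hypothetical placeholders, not from the report), an outbound-mail check that flags visible external recipients before sending could be as simple as:

```python
INTERNAL_DOMAIN = "example.com"  # assumption: your organisation's own domain
BULK_THRESHOLD = 5  # hypothetical cut-off for a "bulk" external send


def addressing_warnings(to_addrs, cc_addrs):
    """Return warnings when external recipients are visible to each other."""
    visible = list(to_addrs) + list(cc_addrs)  # BCC recipients are not visible
    external = [
        a for a in visible
        if not a.lower().endswith("@" + INTERNAL_DOMAIN)
    ]
    warnings = []
    if len(external) >= BULK_THRESHOLD:
        warnings.append(
            f"{len(external)} external recipients can see each other's "
            "addresses - consider moving them to BCC"
        )
    return warnings
```

A check like this sits naturally in a mail gateway or send hook, prompting the sender to confirm rather than blocking outright.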

Doctors and bankers and super, oh my!

Much ink has been spilt over information governance in the health and finance sectors recently, and those sectors accounted for more notifications than any other this quarter (49 and 36 notifications respectively). These are pretty massive industry sectors – healthcare alone accounts for 13.5% of jobs in Australia – so scale is likely affecting the high number of notifications. Anyway, the OAIC has helpfully provided industry level breakdowns for each of them.

In the finance sector (including superannuation providers), human error accounted for 50% of all breaches, and malicious attacks for 47%. Interestingly, in the finance sector almost all the malicious attacks were based on lost or stolen credentials, so we’re back to staff training as a key step to reduce risk.

Bucking the trend, human error accounted for almost two thirds of breaches in the health sector – clearly there’s some work to be done in that sector in terms of processes and staff training. Of the breaches caused by malicious attacks, 45% were theft of physical documents or devices. This isn’t particularly surprising, as it can be challenging for the small medical practices that make up a large part of the sector to provide high levels of physical security. It’s important to note that these notifications only came from private health care providers – public providers are covered under state-based privacy legislation. Also, these statistics don’t cover notifications relating to the My Health Records system – the OAIC reports on those numbers separately in its annual report. So these stats don’t offer a full picture of the Australian health industry as a whole.

All in all, this quarter’s NDB scheme report contains some interesting insights, but as agencies and organisations become more familiar with the scheme (and continue to build their privacy maturity), we may see things shift a bit. Only time will tell.



GDPR is here

If the recent flurry of emails from organisations sending privacy policy updates didn’t tip you off, the new EU General Data Protection Regulation (GDPR) commences today.

Reading every one of those emails (something even those of us in the privacy world struggle with) might give you the impression that there’s a standard approach to GDPR compliance. But the truth is that how your organisation has (hopefully) prepared for GDPR, and how it will continue to improve its privacy practices, is highly variable.

We’ve covered the GDPR at length on this blog, and a collection of links to our various articles is at the bottom of this post – but first, we’d like to set out a few thoughts on what the GDPR’s commencement means in practice.

Remember the principles

If the GDPR applies to your organisation, you’ve presumably taken steps to prepare for the requirements that apply under the new privacy regime. Among these are new requirements relating to data breach notification, as well as new rights and freedoms for individuals whose personal data you may be processing.

One aspect of GDPR that has received plenty of attention is the new penalties, which can be up to 4% of an organisation’s annual turnover, or 20 million Euros (whichever is greater). Certainly, those numbers have been very effective in scaring plenty of people, and they may cause you to check once again whether your organisation fully meets the new requirements under the GDPR.

However, the reality isn’t quite so straightforward (or scary). Much of the GDPR is principles-based, meaning that there isn’t always a single way to comply with the law – you need to take account of your organisation’s circumstances and the types of personal data it processes to understand where you stand in relation to GDPR’s requirements.

Although we don’t expect EU supervisory authorities to provide an enforcement ‘grace period’, we’re also of the view that enforcement activities will ramp up gradually. The authorities understand that, for many organisations, GDPR compliance is a journey. Those organisations that can demonstrate they’ve taken every reasonable step to prepare for GDPR, and which have a plan for continuing to improve their privacy compliance and risk programs, will be far better placed than those that have done little or nothing to get ready for the new law.

If your organisation still has work to do to comply with the GDPR, or you want to continue improving your compliance and risk program (and there is always more to do!), there is plenty of help available to navigate GDPR and understand how it applies to your organisation.

Our previous coverage of the GDPR

Tim compares Australia’s Privacy Act with the GDPR

Melanie spoke to Bloomberg about driving competitive advantage from GDPR compliance

Head to Head: the GDPR and the Australian Privacy Principles (Part 1 and Part 2)

A Lesson in Data Privacy: You Can’t Cram for GDPR

Facebook and Cambridge Analytica: Would the GDPR have helped?

5 things you need to know about GDPR’s Data Protection Officer requirement



5 things you need to know about GDPR’s Data Protection Officer requirement

This article was originally published in issue #83 of Privacy Unbound under the title ‘5 Questions about DPOs’. Privacy Unbound is the journal of the International Association of Privacy Professionals, Australia-New Zealand (iappANZ).

1. What is a ‘DPO’, anyway? What are they even supposed to do?

In a nutshell, the Data Protection Officer (DPO) is a senior advisor with oversight of how your organisation handles personal data.

Specifically, DPOs should be able to:

  • inform and advise your organisation and staff about their privacy compliance obligations (with respect to the GDPR and other data protection laws)
  • monitor privacy compliance, which includes managing internal data protection activities, advising on data protection impact assessments, training staff and conducting internal audits
  • act as a first point of contact for regulators and individuals whose data you are handling (such as users, customers, staff… etc.) (Art. 39(1)).

2. But we’re not based in Europe, so do we even need one?

Well, even if you aren’t required to have one, you should have one. If you’re processing, managing or storing personal data about EU residents, you’ll need to comply with the requirements of the GDPR – this is one of those requirements, whether you’re based in the EU or not.

Specifically, the GDPR requires that you appoint a DPO in certain circumstances (Art. 37(1)).

These include if you carry out ‘large scale’ systematic monitoring of individuals (such as online behavioural tracking).

You’ll also need to appoint a DPO if you carry out ‘large scale processing of personal data’, including:

  • ‘special categories of data’ as set out in article 9 – that is, personal data that reveals racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, genetic data, biometric identifiers, health information, or information about a person’s sex life or sexual orientation.
  • data relating to criminal convictions and offences (per Art. 10).

The Article 29 Working Party has stated[1] that ‘large scale’ processing could include, for example, a hospital processing its patient data, a bank processing transactions, the analysis of online behavioural advertising data, or a telco processing the data required to provide phone or internet services.

Even if you don’t fit into one of these categories, you can still appoint a DPO in the spirit of best practice, and to ensure that your company is leading from the top when it comes to privacy.

In this respect, New Zealand is already ahead of the game. Entities covered by the New Zealand Privacy Act are already required to have a privacy officer, and they largely fulfil the same functions as a DPO.[2] However, they’ll still need to meet the other DPO requirements; see below.

While Australia hasn’t made having a privacy officer an express requirement for the private sector, the Office of the Australian Information Commissioner recommends that companies appoint a senior privacy officer as part of an effective privacy management framework.[3]

Government agencies aren’t off the hook

Being in the public service will not save you. Public authorities that collect the information of EU residents are also required to have a DPO (Art. 37(1)).

It’s worth noting that Australian Government agencies will need to appoint privacy officers and senior privacy ‘champions’ under the Australian Government Agencies Privacy Code,[4] which comes into force on 1 July 2018. Agency Privacy Champions may also be able to serve as the DPO.

As New Zealand Government agencies already have privacy officers, the only question they must answer is whether their privacy officer meets the other DPO requirements; see below.

3. OK, fine. We get it. We need a DPO. Who should we appoint?

The DPO needs to be someone who reports to the ‘highest management level’ of your organisation; that is, Board-level or Senior Executive (Art. 38(3)).

They’ll need to be suitably qualified, including having expert knowledge of the relevant data protection laws and practices (Art. 37(5)).

The DPO also needs to be independent; they can’t be directed to carry out their work as DPO in a certain way, or be penalised or fired for doing it (Art. 38(3)). You’ll also need to ensure they’re appropriately resourced to do the work (Art. 38(2)).

If you’re a large organisation with multiple corporate subsidiaries, you can appoint a single DPO as long as they are easily accessible by each company (Art. 37(3)).

You can appoint one of your current staff as DPO (Art. 37(6)), as long as their other work doesn’t conflict with their DPO responsibilities (Art. 38(6)). This means you can’t have a DPO who works on anything the DPO might be required to advise or intervene on; that is, they can’t also have operational responsibility for data handling. You can’t, for example, appoint your Chief Security Officer as your DPO.

4. But that means we can’t appoint any of our current staff. We can’t take on any new hires right now. Can we outsource this?

Yes, you can appoint an external DPO (Art. 37(6)), but whoever you contract will still need to meet all of the above requirements.

Some smaller companies might not have enough work to justify a full-time DPO; an outsourced part-time DPO might be a good option for these organisations.

It might also be hard to find qualified DPOs, at least in the short term; IAPP has estimated that there will be a need for 28,000 DPOs in the EU.[5] A lot of companies in Australia and New Zealand are already having trouble finding qualified privacy staff, so some companies might have to share.

5. This all seems like a lot of trouble. Can we just wing it?

I mean, sure. If you really want to. But under the GDPR, failure to meet the DPO obligations may attract an administrative fine of up to €10 million, or up to 2% of your annual global turnover (Article 83(4)). Previous regulatory action in the EU on privacy issues has also gained substantial media attention. Is it really worth the risk? Especially given that, in the long run, having robust privacy practices will help you keep your users and your customers safe – having an effective DPO may well save you money.

[1] http://ec.europa.eu/information_society/newsroom/image/document/2016-51/wp243_annex_en_40856.pdf

[2] S23, Privacy Act 1993 (NZ); http://www.legislation.govt.nz/act/public/1993/0028/latest/DLM297074.html

[3] https://www.oaic.gov.au/agencies-and-organisations/guides/privacy-management-framework

[4] https://www.oaic.gov.au/privacy-law/privacy-registers/privacy-codes/privacy-australian-government-agencies-governance-app-code-2017

[5] https://iapp.org/news/a/study-at-least-28000-dpos-needed-to-meet-gdpr-requirements/



Our view on APRA’s new information security regulation

For those of you who don’t work in financial services and may not know the structure associated with APRA’s publications, there are Prudential Practice Guides (PPGs) and Prudential Standards (APSs or CPSs). A PPG provides guidance on what APRA considers to be sound practice in particular areas. PPGs discuss legal requirements but are not themselves legal requirements. Simply put, this is APRA telling you what you should be doing without making it enforceable.

On the other hand, APSs and CPSs are regulatory instruments and are therefore enforceable.

Until now, those working within a cyber security team at an Australian financial services company had PPG 234 – Management of security risk in information and information technology (released on 1 February 2010) – as their only reference point as to what APRA was expecting from them in regard to their cyber security controls. But things have moved on a fair bit since 2010. Don’t get us wrong, PPG 234 is still used today as the basis for many ‘robust’ conversations with APRA.

APRA’s announcement

That leads us to the Insurance Council of Australia’s Annual Forum on 7th March 2018. It was at this esteemed event that APRA Executive Board Member Geoff Summerhayes delivered a speech which noted:

“APRA views cyber risk as an increasingly serious prudential threat to Australian financial institutions. To put it bluntly, it is easy to envisage a scenario in which a cyber breach could potentially damage an entity so badly that it is forced out of business.

“… What I’d like to address today is APRA’s view on the extent to which the defences of the entities we regulate, including insurers, are up to the task of keeping online adversaries at bay, as well as responding rapidly and effectively when – and I use that word intentionally – a breach is detected.”

Summerhayes then went on to announce the release of the consultation draft of CPS 234 – Information Security. Yeah, actual enforceable regulatory requirements on information security.

So what does it say?

Overall there are a lot of similarities to PPG 234, but the provisions that caught our eye, based upon our experience working within financial services, were:

Roles and responsibilities

  • “The Board of an APRA-regulated entity (the Board) is ultimately responsible for ensuring that the entity maintains the information security of its information assets in a manner which is commensurate with the size and extent of threats to those assets, and which enables the continued sound operation of the entity”. – An interesting stake in the ground from APRA: Boards need to be clear on how they are managing information security risks. The next obvious question is: what reporting will the Board need from management to discharge those duties?

Information security capability

  • “An APRA-regulated entity must actively maintain its information security capability with respect to changes in vulnerabilities and threats, including those resulting from changes to information assets or its business environment”. – Very interesting. There is a lot in this provision. First, there is a push to a threat-based model, which we fully endorse (see our recent blog post: 8 steps to a threat based defence model). Next, there is a requirement to have close enough control of your information assets to determine whether changes to those assets somehow adjust your threat profile. Definitely one to watch. That brings us nicely to the following:

Information asset identification and classification

  • “An APRA-regulated entity must classify its information assets, including those managed by related parties and third parties, by criticality and sensitivity. Criticality and sensitivity is the degree to which an information security incident affecting that information asset has the potential to affect, financially or non-financially, the entity or the interests of depositors, policyholders, beneficiaries, or other customers”. – This really is a tough one. From our experience, many companies say they have a handle on this for their structured data, with plans in place to address their unstructured data. However, very few actually do anything that would stand up to scrutiny.

Implementation of controls

  • “An APRA-regulated entity must have information security controls to protect its information assets, including those managed by related parties and third parties, that are implemented in a timely manner”. – Coming back to the previous point: now that there is a requirement to have a clear line of sight over the sensitivity of data, this adds a further requirement to build effective controls over that data.
  • “Where information assets are managed by a related party or third party, an APRA-regulated entity must evaluate the design and operating effectiveness of that party’s information security controls”. – Third party security assurance is no longer a nice-to-have, folks! Third party risk is referenced a couple of times in the draft, and so definitely seems to be a focus point. This will be very interesting, as many companies struggle to get to grips with this risk. The dynamic of having to face actual regulatory obligations, however, is a very different proposition.

Incident management

  • “An APRA-regulated entity must have robust mechanisms in place to detect and respond to information security incidents in a timely manner. An APRA-regulated entity must maintain plans to respond to information security incidents that the entity considers could plausibly occur (information security response plans)”. – We love this section. A very important capability that often gets deprioritised when the dollars are being allocated. Whilst the very large banks do have mature capabilities, most do not. Pulling the ‘Banks’ industry benchmark data from our NIST maturity tool we see that for the NIST domain Respond, the industry average is sitting at 2.39. So in maturity terms it is slightly above Level 2 – Repeatable, where the process is documented such that repeating the same steps may be attempted. In short, many have a lot to do in this space.

Testing control effectiveness

  • “An APRA-regulated entity must escalate and report to the Board or senior management any testing results that identify information security control deficiencies that cannot be remediated in a timely manner, to enable an assessment and potential response by the Board or senior management to mitigate the exposure, as appropriate”. – Yep, we also love this. Putting formal requirements around the basic principle of ‘fix what you find’! The key message from us to Boards and senior management is make sure you are clear on what is in/out of scope for this testing and why.
  • “Testing must be conducted by appropriately skilled and functionally independent specialists”. – The Big 4 audit firms will be very excited about this one!

APRA Notification

  • “An APRA-regulated entity must notify APRA as soon as possible, and no later than 24 hours, after experiencing an information security incident”. – Eagle-eyed readers will spot that this reflects the mandatory data breach obligations that recently came into force under the Privacy Act on 22 February. The Privacy Act requires entities that experience a serious breach involving personal information to notify the OAIC and affected individuals ‘as soon as practicable’ after identifying the breach. Another example of how companies now have to contend with notifying multiple regulators, on different time frames.

Conclusion

CPS 234 is just a draft, and ultimately the final product may be vastly different. Nevertheless, we feel APRA’s approach is a positive step to drive awareness of this significant risk, and one which will hopefully be used to baseline the foundational cyber security capabilities noted within. Well done, APRA!

Consultation on the package is open until 7 June 2018. APRA intends to finalise the proposed standard towards the end of the year, with a view to implementing CPS 234 from 1 July 2019.

Link to the consultation draft.



In Privacy Awareness Week, will Australia follow the GDPR?

Last week, the headlines told us that the senate backs GDPR-style laws in Australia.

But what does this really mean in terms of the government’s commitment to reviewing privacy in Australia?

This does not (necessarily) mean the law will be reviewed

In short, it means very little. The senate’s support of senator Jordon Steele-John’s notice of motion – which calls on the Government to consider the impact of our current privacy laws on Australians and to look to the GDPR as a potential model for privacy protections – carries no commitment, as the senate cannot commit the government to action.

What it does signify is something bigger: a shift in the willingness of the senate to stand behind the Greens’ position that Australian privacy laws must be scrutinised. Just two months ago, senator Steele-John put forward a very similar notice of motion and it was shut down, as were a couple of other privacy-related motions.

Why did this one pass? (What has changed?)

There are a few likely explanations. Putting aside matters of semantics and the politics of calling on government to subject itself to tighter scrutiny (which was the case in motions no. 749 and no. 786), there is one material reason why this motion passed.

In the last two months, consumers have started to wake up to something we privacy professionals have worried about for a while: that legal compliance is not enough and can, in fact, be damaging if ethical behaviours and transparent practices are perceived to be lacking.

There has been an enormous groundswell in Australia over the last two months, with both the Facebook/Cambridge Analytica scandal and Commonwealth Bank dominating press coverage over actions they have taken – or not taken – which, although arguably lawful, have not met public perceptions of fairness and ethics. Put simply, community expectations have surpassed legal standards.

So, senator Steele-John had his day, and time will tell whether this will serve as a prompt for government to call for a review of Australian privacy law in view of the GDPR.

There are plenty of other reasons why GDPR compliance makes sense, but we’ll leave that to a future blog.

Happy Privacy Awareness Week!


If you enjoyed this and would like to be notified of future elevenM blog posts, please subscribe below.

Facebook and Cambridge Analytica: Would the GDPR have helped?

It’s a modern-day truism that when you use a “free” online service, you’re still paying – not with your money, but with your personal information. This is simply the reality for many of the services we’ve come to rely on in our daily lives, and for most it’s an acceptable (if sometimes creepy) bargain.

But what if you’re paying for online services not just with your own personal information, but with that of your friends and family? And what if the information you’re handing over is being shared with others who might use it for purposes you didn’t consider when you signed up – maybe for research purposes, maybe to advertise to you, or maybe even to influence the way you vote?

Last week it emerged that an organisation called Cambridge Analytica may have used personal information scraped from Facebook to carry out targeted political advertising. The information was obtained when Facebook users accessed a psychometric profiling app called thisisyourdigitallife – but the data that was collected wasn’t just about app users, it was also about their Facebook friends (more on that below).

It’s what we’re now seeing from consumers that’s interesting.  People are rightfully asking for an explanation. Whilst we seem to have been asleep at the wheel over the last few years, as data empires around the world have pushed the boundaries, the current Facebook debacle is leading us to ask questions about the value of these so-called “free” services, and where the lines should be drawn.  The next few weeks will be telling, in terms of whether this really is the “tipping point” as many media commentators are calling it, or just another blip, soon forgotten.

In any case, with only a few months until the EU General Data Protection Regulation (GDPR) comes into force, this blog post asks: if the GDPR were in force now, would consumers be better protected?

First, some background

There’s plenty of news coverage out there covering the details, so we’ll just provide a quick summary of what happened.

A UK-based firm called Global Science Research (GSR) published thisisyourdigitallife and used the app to gather data about its users. Because GSR claimed this data was to be used for academic purposes, Facebook policies at the time allowed it to also collect limited information about friends of app users. All up, this meant that GSR collected the personal information of more than 50 million people – many more than the 270,000 people who used the app.

GSR then used the personal information to create psychometric profiles of the included individuals, apparently without their informed consent. These profiles were then allegedly passed on to Cambridge Analytica (possibly in breach of Facebook’s rules), which used the data to target, market to – and perhaps manipulate – individuals.

Was this a breach?

There’s been some debate over whether this incident can be fairly labelled a “breach”. Based on what we know, it certainly doesn’t appear that any personal information has been lost or disclosed by means of an accident or a security vulnerability, which is something many consider a necessary element of a “data breach”.

Facebook’s initial response was to hit back at claims it was a “data breach”, saying users willingly handed over their information, and the information of their friends. “Everyone involved gave their consent. People knowingly provided their information, no systems were infiltrated, and no passwords or sensitive pieces of information were stolen or hacked,” it said.

Facebook has since hired a digital forensics firm to audit Cambridge Analytica and has stated that if the data still exists, it would be a “grave violation of Facebook’s policies and an unacceptable violation of trust and the commitments these groups made.”

In more recent days, Mark Zuckerberg has made something of a concession, apologising for the “major breach of trust”. We love this line from the man who told us that privacy is dead.

GDPR – would it have helped?

We at elevenM are supporters of the GDPR, arguably the most extensive and far-reaching privacy reform of the last 25 years. The GDPR raises the benchmark for businesses and government and brings us closer to one global framework for privacy. But would the GDPR have prevented this situation from occurring? Would the individuals whose data has been caught up by Cambridge Analytica be in a better position if the GDPR applied?

Let’s imagine that GDPR is in force and it applies to the acts of all the parties in this case, and that Facebook still allowed apps to access information about friends of users (which it no longer does). Here is the lowdown:

  1. Facebook would have to inform its users in “clear and plain” language that their personal information (aka personal data under GDPR) could (among other things) be shared with third party apps used by their friends.
  2. Because the personal data may have been used to reveal political opinions, users would likely also need to provide consent. The notification and consent would have to be written in “clear and plain” language, and consent would have to be “freely given” via a “clear affirmative act” – implied consent or pre-ticked boxes would not be acceptable.
  3. The same requirements relating to notification and consent would apply to GSR and Cambridge Analytica when they collected and processed the data.
  4. Individuals would also have the right to withdraw their consent at any time, and to request that their personal data be erased (under the new “right to be forgotten”). If GSR or Cambridge Analytica were unable to find another lawful justification for collecting and processing the data (and it’s difficult to imagine what that justification could be), they would be required to comply with those requests.
  5. If Facebook, GSR or Cambridge Analytica were found to be in breach of the above requirements (although again, this is purely hypothetical because the GDPR is not in force at the time of writing), they could each face fines of up to €20 million, or 4% of worldwide annual turnover (revenue), whichever is higher. Those figures represent the maximum penalty and would only be applied in the most extreme cases – but they make clear that the GDPR is no toothless tiger.
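The “whichever is higher” rule in point 5 means the cap on the most serious fines scales with a company’s size. As a back-of-the-envelope sketch (the turnover figures below are purely hypothetical):

```python
def max_gdpr_fine(annual_turnover_eur: float) -> float:
    """Upper bound on a GDPR administrative fine for the most serious
    infringements: the greater of EUR 20 million or 4% of worldwide
    annual turnover (GDPR Article 83)."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# A small firm with EUR 100 million turnover: 4% is only EUR 4 million,
# so the EUR 20 million floor applies.
print(max_gdpr_fine(100_000_000))     # 20000000.0

# A large platform with EUR 40 billion turnover: 4% dominates.
print(max_gdpr_fine(40_000_000_000))  # 1600000000.0 (EUR 1.6 billion)
```

In practice regulators weigh many factors before imposing anything near these maximums, but the formula explains why the GDPR is taken so much more seriously by large global businesses than earlier regimes with fixed penalty caps.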

So, there it is.  We think that GDPR would have made it far more likely that EU residents were made aware of what was happening with their personal data and would have given them effective control over it.

Some lessons

With so many recent data incidents resulting from outsourcing and supply chain arrangements, regulators around the world are focussing increasingly on supplier risk. Just last week here in Australia, we saw the financial services regulator APRA release new cyber security regulation littered with references to supplier risk. The Cambridge Analytica situation is another reminder that we are only as strong as our weakest link. The reputations of our businesses and the government departments for whom we work will often hinge on the control environments of third parties. Organisations therefore need to clearly assess third-party risks and take commensurate steps to assure themselves that the risks and controls are reasonable and appropriate.

As for individuals – regardless of what regulatory action is taken in Australia and abroad, there are simple steps that we all can and should be taking.  This episode should prompt people to think again about the types of personal information they share online, and who they share it with. Reviewing your Facebook apps is a good start – you might be surprised by some of the apps you’ve granted access to, and how many of them you’d totally forgotten about (Candy Crush was so 2015).

What’s next

We expect this issue to receive more attention in the coming weeks and months.

Regulators around the world (including the Australian Privacy Commissioner, the UK Information Commissioner (ICO), the Canadian Privacy Commissioner and the EU Parliament) are looking into these issues now. Just over the weekend we saw images of ICO personnel allegedly raiding the premises of Cambridge Analytica, Law & Order style.

The Australian Competition and Consumer Commission (ACCC) also has been preparing to conduct a “Digital Platforms Inquiry” which, among other things, may consider “the extent to which consumers are aware of the amount of data they provide to digital platforms, the value of the data provided, and how that data is used…”

Meanwhile, we await the consumer backlash.  Consumers will likely expect increasingly higher standards from the organisations they share their data with and will seek out those organisations that are transparent and trustworthy, and which can demonstrate good governance over privacy and data protection practices.   Will you be one of them?



A Lesson in Data Privacy: You Can’t Cram for GDPR

Deadlines are a powerful motivator. While travelling the world over the past year, including here in Australia, I’ve been energized by the discussions that companies are having about data privacy as they prepare for the European Union’s General Data Protection Regulation (GDPR). But I’ve also been dismayed by the numerous companies who remain oblivious – wilfully or otherwise – to the implications GDPR has for their business operations.

The GDPR comes into force on 25 May 2018 – less than four months from now – yet many Australian companies are still confused as to whether the GDPR applies to them.

In a nutshell, if your business has any interaction with the personal data of an EU resident, then GDPR will apply.

I also see companies struggling with what GDPR actually means and being lured by quick fix sales pitches for tools and technology that claim to make you compliant with GDPR.  Vendors, suppliers, and consultants who have never operated in the data privacy space have miraculously become GDPR “experts”, with beautiful brochures and marketing collateral promising that their technology alone will deliver compliance. But, buyer beware: don’t believe the hype.

The high-level concept is simple: GDPR requires that companies have a data privacy legal compliance framework in place. In practice, that will look different for every organisation. That’s why effective compliance will never come straight from a box. Complying with the GDPR requires having a privacy program that lays out your business’s foundation for meeting its obligations around an individual’s fundamental rights to privacy and to own and control their personal data. It incorporates what data your company collects, why and how you collect it, and what you do with it. It takes account of your specific people, processes and systems. Technology has its place, but you must ensure you have the right tools for the right problems.

If you are using outside assistance to help bring you into compliance, there are some key things to consider before you sign on.

Do your homework

When you hire a new employee, you don’t make your decision based strictly on how they sell themselves. You read their resume, interview them, and check their references, because hiring an employee is a long-term investment and a poor decision can have significant consequences. Complying with GDPR is also a long-term proposition that deserves the same level of attention. Just like you would with a prospective new hire, get to know your prospective advisors and their capabilities by digging deeper than glossy sales brochures and snappy product taglines.

If you’re engaging an IT supplier, consider what steps they are taking to ensure that they comply with GDPR. Ask about their privacy framework and the internal policies and processes they have to support it. Ask them specifically how they comply with Australian data protection laws and all other relevant data protection laws.

Choose a company that clearly understands the difference between privacy and security and that takes a holistic view that includes all the processes and tools you need to protect your company and your customers. If you ask a privacy related question and they give a security answer, it is a sure sign that they don’t understand privacy at its core. World class security does not ensure privacy compliance – building a fortress around data you are not legally allowed to have will not save you from the inquisitive eye of European data regulators. And it won’t help restore the trust of customers who feel intruded upon by your organisation.

Information management has become a global proposition, so you want to work with a service provider that has a global approach, not a national one. If you operate in the European Union or provide goods and services to EU residents, member states have laws that also require consideration. Depending on how your business is structured, you may need to comply with the laws of multiple jurisdictions in overlapping contexts. Ask how the provider stays current with new developments in privacy legislation and regulations around the world. If they say that rules in other jurisdictions aren’t ‘relevant’, then keep looking.

Keep your eyes on the prize

As you work to bring your company into compliance, remember your goal. Tools and technology might be part of your solution, but successful compliance with GDPR won’t be measured by the amount of software or data storage that someone installs for you, or the location of your data centre, or the latest data mapping or classification tool you implement. Success will be measured by your ability to demonstrate that you understand what data you’re collecting and what you’re doing with it.

When your service provider is finished, you and your employees should have a solid grasp of several key elements. Firstly, you should know what information you’re collecting about employees and customers, and you should have a procedure to ensure that you have their consent where needed and a lawful right to process data where consent is not an option. You also should know what agreements you have in place with third party providers that collect, process or host information for you. This isn’t the time to pass the buck – you need to know how they protect data that they collect, because they’re doing it on your behalf, and your customers will hold you responsible for their actions. You need to know what data you collect about your customers, why you require that information from them, and what you do with it. If you and your team can’t answer these questions, chances are high that you don’t have an adequate data privacy framework and that you’re not compliant. If that’s the case, it’s time to get cracking.

Don’t cram the night before the exam

If your company hasn’t started preparing for GDPR, don’t panic – just get to work. Start by taking stock of what data you collect and why. If you need external support, don’t be lured by those promising a quick fix – these will only cost you money for the appearance of compliance, and the regulators won’t be fooled. Spend the extra time to hire someone who can help you develop a proper privacy framework that will serve you, your employees, and your customers in the long run.


Head to Head: the GDPR and the Australian Privacy Principles – Part 2: A Tale of Two Jurisdictions

This article was originally published in issue #81 (5 December 2017) of Privacy Unbound, the journal of the International Association of Privacy Professionals, Australia-New Zealand (iappANZ).

In Part 1 of this article our aim was to help you understand whether the GDPR applies to your business. In Part 2 we will help you focus your efforts in preparing for the GDPR by identifying links and differences between the 13 Australian Privacy Principles and the GDPR’s 99 Articles.

Gap analysis – Comparing the GDPR and Australian Privacy Principles

If the GDPR is likely to apply to your data processing, understanding the gaps in your current privacy framework will be critical. A gap analysis can help you identify the key areas to focus on.

The GDPR shares some thematic similarities with Australia’s national privacy regulatory regime, set out in the Privacy Act 1988 (Cth) and the Australian Privacy Principles (APPs).

The GDPR and the Privacy Act share a similar purpose – to foster transparent information handling practices and business accountability in relation to the handling of personal information. The two regimes take different approaches – the GDPR’s 99 articles are highly prescriptive, whereas the Privacy Act relies on a principles-based approach supplemented by extensive guidance.

However, the founding principles of the GDPR (the lawful, transparent and fair processing of personal data) laid out in Chapter II (Articles 5-11) and many of the GDPR’s express obligations align with the steps that the OAIC expects Australian companies to take to comply with the APPs (as set out in OAIC guidance). In short, best practice compliance with the APPs will help Australian companies support compliance with the GDPR.

There are some key differences – both in terms of legal concepts and additional data subject rights and corresponding obligations found in the GDPR. These are set out in the comparison table below.

Summary of the APPs vs the GDPR

The Australian Privacy Act applies to ‘APP entities’ – that is, Australian and Norfolk Island government agencies (agencies) and private sector businesses (organisations), as well as credit providers and credit reporting bodies. Individuals and many ‘small business operators’ – businesses with an annual turnover of less than AUD $3 million – are exempt from the operation of the Act.

Unlike the GDPR, the Privacy Act does not distinguish between ‘data controllers’ and ‘data processors’ – any APP entity that holds personal information must comply with the APPs.

APP 1 — Open and transparent management of personal information

This first APP requires APP entities to manage personal information in an “open and transparent way”, including taking reasonable steps to ensure that they comply with the APPs.

APP 1 is similar in effect to GDPR Article 5 Principle 2, which requires controllers to be able to demonstrate compliance with the obligations set out in Principle 1. Principle 1(a) also requires data processing to be done in a “transparent manner”.

APP 1.3 and 1.4 also require APP entities to have a clearly expressed privacy policy that deals with specified matters. GDPR Article 7 discusses obtaining consent from an individual in the context of a “written declaration”, and Articles 12-14 address similar matters to those specified in APP 1.3 and 1.4. GDPR Articles 13-14 also require additional information to be provided; this includes information about how long personal data will be stored, the enhanced personal rights under the GDPR (such as data portability, the right to withdraw consent, and the right to be forgotten), and any automated decision-making including profiling.

APP 2 — Anonymity and pseudonymity

APP 2 requires APP entities to give individuals the option of not identifying themselves, or of using a pseudonym, unless a listed exception applies.

There is no direct analogue to this provision in the GDPR. However, the GDPR may apply to pseudonymous information (see Recital 28).

APP 3 — Collection of solicited personal information

APP 3 outlines what personal information an APP entity can collect. In particular, this APP requires that organisations only collect personal information that is reasonably necessary or directly related to their functions or activities, by “lawful and fair means” and, where reasonable and practicable, directly from the individual. Higher standards are applied to the collection of ‘sensitive information’ (see comparison table below); specifically, sensitive information may only be collected with consent, or where a listed exception applies.

A comparison can be drawn here to GDPR Article 5, which requires that data be collected for “specified, explicit and legitimate purposes”, and be processed “lawfully [and] fairly” (Principle 1(a) and (b)). The question of whether a company has a lawful basis for processing personal information is critical.

APP 4 — Dealing with unsolicited personal information

APP 4 requires APP entities to destroy or de-identify unsolicited personal information that they could not have otherwise collected under APP 3.

There is no direct analogue in the GDPR; however, it should be noted that the GDPR does not permit collection of personal data without a specified, explicit purpose.

APP 5 — Notification of the collection of personal information

APP 5 requires APP entities to notify individuals (or otherwise ensure that they are aware) of specified matters when they collect their personal information (for example, by providing individuals with a collection statement).

Again, GDPR Articles 12, 13 and 14 impose requirements for the provision of privacy information about how data is processed that are substantially similar to the matters specified in APP 5, as well as additional obligations (see APP 1, above). This includes a requirement that the information is clear and easy to understand. Australian companies should consider, for example, whether their privacy policies are written in plain English.

APP 6 — Use or disclosure of personal information

This APP outlines the circumstances in which an APP entity may use or disclose personal information that it holds. Where an APP entity has collected personal information for a specific purpose, and wishes to use it for a secondary purpose, APP 6 provides that entities may not do so unless the individual has consented, it is within their reasonable expectations, or another listed exception applies. Exceptions include circumstances involving health and safety and law enforcement.

GDPR Article 6 similarly requires that personal data may only be processed where the data subject has consented to one or more of the specific purposes of the processing, or the processing is otherwise lawful as another listed scenario applies. For example, where the processing is necessary to perform a contract or comply with a legal obligation.

APP 7 — Direct marketing

APP 7 provides that an organisation that is an APP entity may only use or disclose personal information for direct marketing purposes if certain conditions are met. In particular, direct marketing messages must include a clear and simple way to opt out of receiving future messages, and must not be sent to individuals who have already opted out. Sensitive information about an individual may only be used for direct marketing with consent of the individual.

GDPR Article 21 provides individuals with, amongst other things, the right to object to the use of their personal data for direct marketing.

APP 8 — Cross-border disclosure of personal information

This principle requires an APP entity, before it discloses personal information to an overseas recipient, to take reasonable steps to ensure that the recipient does not breach the APPs in relation to that information. Personal information may only be disclosed where the recipient is subject to a regulatory regime that is substantially similar to the APPs, where the individual has consented, or another listed exception applies. APP entities may be liable for the acts and practices of overseas recipients in certain circumstances (s16).

Chapter 5 of the GDPR provides that transfers of personal data outside of EU jurisdiction may only be made where the recipient jurisdiction has been assessed as ‘adequate’ in terms of data protection, where sufficient safeguards (such as a binding contract or corporate rules) have been put in place, or a listed exception applies. The European Commission has not, to date, assessed Australia as ‘adequate’, but the Commission is currently reviewing its adequacy assessments.

APP 9 — Adoption, use or disclosure of government related identifiers

APP 9 provides that an organisation that is an APP entity may not adopt a government related identifier of an individual as its own identifier, or use or disclose such an identifier, unless a listed exception applies. There is no direct analogue to this provision in the GDPR.

APP 10 — Quality of personal information

APP 10 requires APP entities to take reasonable steps to ensure the personal information they collect, use or disclose is accurate, up to date and complete.

Accuracy and currency of information are addressed in GDPR Article 5 (Principle 1(d)): “every reasonable step must be taken” to ensure that inaccurate personal data is “rectified without delay”.

APP 11 — Security of personal information

This APP requires APP entities to take reasonable steps to protect personal information they hold from misuse, interference and loss, and from unauthorised access, modification or disclosure. This provision is a frequent focus of investigations into APP entities conducted by the Australian Information Commissioner.

GDPR Article 5 similarly requires that data processing be undertaken in a manner “that ensures appropriate security of the data” (Principle 1(f)). Further, Article 32 requires the data controller and the processor to implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk (taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing). Those measures must also address the confidentiality, integrity and availability of the data.

APP 11.2 provides that APP entities must also take reasonable steps to destroy or de-identify personal information that they no longer require for a lawful business purpose.

GDPR Article 5 imposes a similar storage limitation – personal data may be “kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed” (Principle 1(e)). However, the GDPR also explains that “personal data may be stored for longer periods insofar as the personal data will be processed solely for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes in accordance with Article 89(1)”.

APP 12 — Access to personal information

APP 12 requires APP entities to give an individual access to the personal information about them that the entity holds, on request by that individual. APP 12 imposes procedural requirements around access, and includes limited exceptions.

Article 15 of the GDPR imposes a similar right of access, with additional rights to know information about the collection and envisaged use of the data (such as recipients or potential recipients, likely storage period, and safeguards for overseas transfers).

APP 13 — Correction of personal information

APP 13 requires APP entities to take reasonable steps to correct personal information they hold about an individual, on request by the individual. This APP also imposes procedural requirements and includes limited exceptions.

GDPR Article 16 imposes a similar but stronger right; data subjects have the absolute “right to obtain…without undue delay the rectification of inaccurate personal data concerning [them]”.

GDPR rights that are not in the APPs

None of the APPs provides an express right to erasure, a right to restriction of processing, a right to data portability or a right to object. The GDPR provides for these rights in Articles 17, 18, 20 and 21.

Complimentary APP v GDPR legal concepts comparison table




Head to Head: the GDPR and the Australian Privacy Principles – Part 1: The long arm of the law

This article was originally published in issue #81 (5 December 2017) of Privacy Unbound, the journal of the International Association of Privacy Professionals, Australia-New Zealand (iappANZ).

Introduction

The EU’s new and wide-ranging General Data Protection Regulation (GDPR) represents an unprecedented shakeup of the European data protection regulatory environment. The GDPR promises to set a new regulatory benchmark and drive reform in jurisdictions around the world. The GDPR will come into force on 25 May 2018, replacing the current EU Data Protection Directive 95/46/EC. It will have immediate direct effect in all EU Member States.

Australian companies with exposure to the European market should take note – the GDPR can and will apply to companies based outside of Europe. Australian-based companies should take this opportunity to confirm whether the GDPR will apply to them come May, or whether they need to prepare for GDPR compliance to access the European market in the future.

The costs of non-compliance may be extreme – the GDPR introduces a new set of sharp teeth for European regulators, including fines of up to €20 million or 4% of global revenue, whichever is the greater. However, the added burden of compliance promises to pose a challenge for many businesses working with limited resources.

Part 1 of this article will help you understand whether the GDPR will apply to your business. Part 2 will help you focus your efforts in preparing for the GDPR by identifying links and differences between the 13 Australian Privacy Principles and the GDPR’s 99 Articles.

The GDPR’s extra-territorial application

Critically for Australian companies, Article 3 of the GDPR extends the GDPR to any company that controls or processes the personal information of individuals in the EU (whatever their nationality or place of residence) if the processing is related to offering goods or services or monitoring their behaviour, whether or not the company is located in the EU or the processing occurs in the EU.

For the purposes of the GDPR, a data ‘controller’ determines the purposes and means of processing the personal information, and the ‘processor’ processes the information on the controller’s behalf. ‘Processing’ is not a term found in Australian privacy law. The term is broadly defined and essentially means any act or practice that is done to, or in connection with, personal information.

Therefore, Australian companies that service or supply European clients, or that otherwise offer goods or services to individuals in the EU or monitor behaviour of individuals that takes place in the EU, need to assess their client and individual customer bases, operations, systems and processes to answer three key questions:

  1. Do you have an ‘establishment’ in the EU? (Article 3.1)
  2. Do you offer goods or services to individuals who are in the EU (whether or not you charge for them)? (Article 3.2(a))
  3. Do you monitor any behaviour of individuals in the EU? (Article 3.2(b))

Establishment

Article 4 provides that the main establishment of a data controller is the “place of its central administration” in the EU. That is, where the “decisions on the purposes and means of the processing” occur. For example, if you have an EU office or headquarters.

For processors, the main establishment will be either the place of central administration in the EU or, if the processor does not have one, then where the main processing activity in the EU takes place. For example, if you have your head office in Australia, but maintain an EU data centre.

Offering goods and services

The GDPR recitals explain that a range of factors will be relevant to deciding whether a company is ‘offering goods or services’ to individuals in the EU. These include:

  • the use of language and currency or a top-level domain name of an EU Member State
  • delivery of physical goods to a Member State
  • referring to customers or users in a Member State when promoting the goods and services, or
  • targeting advertising at individuals in a Member State.

Mere accessibility of an Australian company’s website or app to individuals in the EU will not, by itself, reach the threshold.

Some of these factors obviously indicate that goods and services are being offered. But it may ultimately be the cumulative effect of various activities that bring a company’s data processing within the reach of the GDPR.

Monitoring

To determine whether a processing activity can be considered to be ‘monitoring’ the behaviour of individuals in the EU for the purposes of Article 3.2(b), you should consider whether your company is:

  • associating individuals in the EU with online identifiers provided by their devices, applications, tools and protocols, such as IP addresses and cookie identifiers
  • tracking their behaviour on the Internet, and
  • using data processing techniques that profile individuals, particularly in order to make decisions concerning them for analysing or predicting their personal preferences, behaviours and attitudes.
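The three Article 3 triggers described above can be sketched as a simple decision check. This is an illustrative sketch only: the function name and boolean inputs are assumptions for the example, and the real assessment is a legal judgement made on the facts, not a mechanical test.

```python
# Hypothetical sketch of the GDPR territorial-scope test (Article 3).
# The inputs correspond to the three key questions discussed above;
# answering any one of them 'yes' brings processing within scope.

def gdpr_may_apply(has_eu_establishment: bool,
                   offers_goods_or_services_in_eu: bool,
                   monitors_eu_individuals: bool) -> bool:
    """Return True if any of the three Article 3 triggers is met."""
    return (has_eu_establishment               # Article 3.1
            or offers_goods_or_services_in_eu  # Article 3.2(a)
            or monitors_eu_individuals)        # Article 3.2(b)

# Example: an Australian retailer with no EU office that ships goods
# to Germany and uses cookies to profile visitors in the EU.
print(gdpr_may_apply(False, True, True))  # True
```

Note that the triggers are independent: a company with no EU establishment can still be caught by Article 3.2(a) or 3.2(b) alone.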

Enforcement

European data protection authorities will have increased supervisory powers under the GDPR. However, the question of how those authorities will approach extraterritorial enforcement against companies established and operating outside the EU is far from settled.

GDPR Article 50 imposes obligations on the EU Commission and authorities to take appropriate steps to cooperate with international stakeholders. In recent years, there has been increasing cooperation between authorities. Under the GDPR, it is likely that EU authorities will liaise with the Australian privacy regulator – the Office of the Australian Information Commissioner (OAIC) – when responding to data processing by an Australian company. This may in turn trigger regulatory action by the OAIC or a cooperative effort to effect an appropriate response. Any evidence of a company’s presence in or nexus with an EU Member State may influence the potential for cross-border enforcement action.

How can you prepare?

If your answer to any of the three questions above is ‘yes’, then you will need to consider:

  • the risks arising from gaps between your current compliance under Australian privacy law and the GDPR requirements
  • what additional steps you need to take to ensure you can comply with the additional GDPR requirements, and
  • whether you need to cease any activities in relation to individuals in the EU to which the GDPR will apply, and/or restructure your EU operations.

If you enjoyed this and would like to be notified of future elevenM blog posts, please subscribe below.

Melanie spoke to Bloomberg about driving competitive advantage from GDPR compliance

Bloomberg: Aussie Breach Notice Law May Distract from EU Privacy Compliance

By Murray Griffin

Australia’s new mandatory data breach notice law may have distracted some companies from their compliance obligations under the European Union’s new privacy regime, privacy attorneys told Bloomberg BNA.

A decade after it was proposed, mandatory data breach notification in Australia takes effect in February 2018, a few months before the EU General Data Protection Regulation (GDPR) takes effect in May 2018. The GDPR not only has its own breach notice requirement, it includes several provisions that will impose first-time compliance burdens on Australian companies.

The more imminent requirements under the Australian breach notice law “are getting more focus” than the GDPR, Brendan Tomlinson, information technology, privacy, and cybersecurity special counsel with Maddocks in Sydney, told Bloomberg Law. “GDPR isn’t as much on the radar as it needs to be,” he said. “The requirements around data breach for GDPR are more stringent and the potential fines are far greater.”

The Australian law requires that companies notify privacy regulators of data breaches no later than 30 days after discovering the incident. The GDPR requires breach notice within 72 hours.

Although companies with revenues in Australia of over A$3 million ($2.3 million) need to be prepared for the long-awaited Australian breach notice requirement, all companies that collect and control the use of, or process, personal data of EU citizens or aim their business at the EU will be subject to the GDPR’s broad extraterritorial scope. In addition, the maximum A$1.8 million ($1.3 million) penalty under the Australian law pales in comparison to the potential fines of 20 million euros ($23.3 million) or 4 percent of a company’s worldwide revenue available under the GDPR.
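The GDPR penalty figures above can be made concrete with a little arithmetic: the top tier of GDPR fines is the greater of EUR 20 million or 4 percent of worldwide annual turnover. The sketch below is illustrative only; the turnover figures are invented examples, and actual fines are set by regulators case by case, with these amounts as maximums.

```python
# Illustrative arithmetic for the maximum GDPR fine (top tier):
# the greater of EUR 20 million or 4% of worldwide annual turnover.

def gdpr_max_fine_eur(worldwide_turnover_eur: float) -> float:
    """Greater of EUR 20M or 4% of worldwide annual turnover."""
    return max(20_000_000.0, 0.04 * worldwide_turnover_eur)

# For smaller companies the flat EUR 20M floor applies:
print(gdpr_max_fine_eur(100_000_000))    # prints 20000000.0
# For large companies the 4% figure dominates:
print(gdpr_max_fine_eur(1_000_000_000))  # prints 40000000.0
```

The crossover point is EUR 500 million in turnover, above which the 4 percent figure exceeds the flat floor.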

Other Differences

The GDPR will also introduce new privacy principles unknown in Australia, including the right for individuals to request that their personal information be deleted, Melanie Marks, principal of Australian privacy and cybersecurity consulting company elevenM, told Bloomberg Law.

Australian law doesn’t recognize the concept of a data processor, a company that handles and processes personal data on behalf of another company that controls the use of the data. The GDPR extends current privacy requirements beyond data controllers to cover data processors. The GDPR also requires informed consent from individuals to the processing of their personal data and that companies allow individuals the right to easily withdraw their consent.

Australian companies are “reviewing their consent frameworks, and in some cases, we hear they are contemplating moving away from consent as a basis for data handling entirely,” Marks said. Instead they are looking to other legal means to process data under the GDPR, such as through specific contracts, she said.

The GDPR also includes a broad right of individuals to access their personal data.

“The challenge is there is a lot of intermixed data between data that is considered the employee’s and data that is considered the organization’s,” Didier Elzinga, CEO of employee analytics company Culture Amp Pty Ltd, told Bloomberg Law.

Large Australian companies, such as retailer Woolworths Ltd.—which brought in $42 billion in revenues in fiscal year 2017 according to Bloomberg data—have been subject to Australia’s privacy laws for decades. But small businesses in Australia may also be forced to confront privacy protections for the first time with the advent of the GDPR.

“In Australia, there is an exemption if you are a small business from needing to comply with the Australian Privacy Principles,” but the GDPR has no such exemption, Tomlinson said.

Competitive Advantage

For technology companies, much of the pressure to be GDPR-ready will come from business clients or potential clients that value the ability to participate freely in the international digital economy, which provides commercial opportunities, Marks said.

“It becomes a unique selling point or a point of differentiation that you can say ‘our data practices are compliant with the GDPR’,” she said.

About 95 percent of a company’s spending on GDPR preparation should go to compliance efforts and having a good privacy framework, but the other 5 percent should be set aside for marketing the story of GDPR-readiness “because it is a commercial advantage.”
