
#87 Is it time to give up on privacy policies?

21 November 2023

This week, Jordan sits down with elevenM privacy communications expert Tessa Loftus to debate whether we should just give up on privacy policies.

Historically, privacy regulation has leant heavily on transparency and consumer choice – the idea that if we just give consumers the right information, then they can take control of their privacy. But the reality is that most privacy policies are unintelligible to most people, and we’ve all lost control.

So what’s the answer? Can businesses communicate better about privacy? Or should we give up on consumer choice and put the onus on businesses to act ‘fairly and reasonably’?


Transcript

This is an automatically generated transcript. We make our best efforts to check that it is an accurate reflection of the episode, but it may contain some errors and unedited content.

JORDAN
Welcome to another edition of This Week in Digital Trust, elevenM’s regular conversation about all things tech policy, privacy, and cybersecurity. I’m Jordan, joining you today from Wurundjeri country in Melbourne.

TESSA
And I’m Tessa, joining you from Gadigal country in Sydney.

JORDAN
And welcome Tessa. Arj is still on leave, so we’re roping in more people to fill his big shoes.

TESSA
Enormous shoes.

JORDAN
Thank you for putting your hand up. Looking forward to having a chat about privacy comms. Tessa is an expert in privacy comms at elevenM. She’s got a particular obsession with making things accessible, and with communicating privacy effectively to all sorts of audiences. So that’s what we’re going to have a bit of a chat about today.

TESSA
Yep, it’s going to be great. Let’s talk privacy policies. Let’s talk privacy communications.

JORDAN
I think the fun thing, what I’m looking forward to about this conversation, is that we have… a little bit of a disagreement in approach. I mean, we might ham that up a bit, but we have different opinions about the value of privacy comms and the value of privacy policies in particular.

TESSA
Yeah, I mean, I think I generally say that privacy policies, you know, are a tool that exists. And while there’s the possibility of better tools in the future, I don’t want to throw the baby out with the bathwater. We already have this tool and we could be doing so much more with it. And by we, I mean businesses.

JORDAN
I think it’s a fun conversation to have and to pick apart a bit today because it’s been a real tension or topic of debate in the kind of Privacy Act law reform process over the last couple of years. This tension between do we focus privacy law on empowering consumers, giving them the right information so they can make their choices?

Or do we back away from that and focus more on pushing those obligations onto companies and saying, well, no, don’t worry about consumer consent so much. Just make sure that the things you are doing are reasonable and fair and, you know, meet consumer expectations generally.

TESSA
If you read the Privacy Act report, you know, the most recent report, it tries to do both. It’s trying to say, no, let’s move away from consent and we’ll do fair and reasonable. And then you read the next section and it says, and we’re going to enhance privacy collection notices and introduce new streamlined ways of doing consent and notice and privacy policies. It hasn’t itself resolved that tension.

JORDAN
Yeah, yeah. And I think that’s the right place to be, right? I think that, you know, there is not just one magical answer to such a complex set of issues as privacy, right? You’ve kind of got to use all of the tools.

TESSA
Any kind of complicated situation, there’s never one solution. And I think what we’re talking about, and what you and I tend to argue about, is the top down, bottom up approach. You want the policy changes from the top, filtering down, and I’m sort of… saying, well, I agree with you. I’d like that. That’s a nice to have. But, you know, top down approaches are harder and they take longer. So, you know, maybe we should also try to improve our bottom up approach.

JORDAN
Yeah. So the top down in this case being kind of pushing companies to do better, right? Having a regulator with a stick who’s going and reviewing what a company is doing and saying, well, that’s fair, that’s not fair, that’s reasonable, that’s not reasonable. I mean, taking a step back from that, one of the key proposals that we’ve mentioned on the pod before in the Privacy Act reform is this general requirement that anything a company does with personal information needs to be fair and reasonable, right? And so-

TESSA
And I definitely support that.

JORDAN
Which is a great, yeah, great requirement. And it gives a regulator something to hang their hat on in terms of going after companies that are being exploitative or unfair or doing things that are likely to cause people harm. So that’s that kind of top-down approach, right? You have a regulator who’s enforcing from the top. The bottom up is more that consumer choice and transparency and helping, almost helping the market. It’s not really a market-based approach, but it’s helping individuals make proper choices, understand what’s happening with their data. And then I think the flow-on argument is that that kind of shapes a market where companies are responsible and need to make better privacy choices?

TESSA
And it kind of could, like it could have that market impact if people actually felt like they could do that. You know what I mean? So if, if people felt like they could read a privacy policy, understand, and then decide which of these two rewards programs they actually want to sign up to, then maybe that would have the market force effect.

But the way it stands at the moment, it isn’t, it won’t, it can’t, you know, because, as I said, I’m the only person I know who reads privacy policies. And the IAPP, in the global trust report that they did this year, they said that 29% of people globally find it easy to understand how well a company protects their personal information.

JORDAN
That’s unreasonably high, right? What, one in four?

TESSA
That’s unreasonably high. Yeah. There’s still a lot of people that know they don’t.

JORDAN
I think that’s right. And I think that’s what you’re pointing to. There’s some great stats here from a range of different reports and companies about just how much this idea of privacy policies as informing consumers has failed. There’s a great organization here in Australia called the Consumer Policy Research Centre that does this kind of research as well. And they have a recent survey that found that 94% of Australian consumers reported not reading all privacy policies. So, you know, only 6% of consumers claimed to read all of the privacy policies and terms and conditions that applied to them in the last 12 months.

TESSA
I think that’s a more reasonable, like a more realistic number than the 20% in the Australian Community Attitudes to Privacy Survey who claim they read privacy policies. I don’t believe those people.

JORDAN
I work in privacy every day and I don’t read the privacy policies. And the reason, or one of the reasons, is the next bit in this CPRC survey, which is that of that 6% that said they read them, 70% accepted terms even though they weren’t comfortable with them. Yeah, because everyone knows there’s no way of negotiating.

TESSA
Exactly right. We say as privacy professionals, and the OAIC says this as well, we say, you know, one of the ways to protect yourself is to read the privacy policy. But then there isn’t anything you can do, like, you’re just reading it. And then what? What does that do? Right?

JORDAN
Yeah, exactly. What does that do? Yep. There’s two problems there. One is that typically the privacy policy will have these incredibly general statements, like ‘we will share your data with trusted partners for our business activities’. It’s like, well, what does that mean? Right? Is the business activity marketing? Is the business activity auditing financial records? They’re very different risk profiles. They’re very different activities. And the second is, yeah, as we’ve said, it’s just impossible to negotiate. You know, ‘I want to sign up for a Facebook account, but I don’t like clause 43 of your privacy policy’ is not a conversation that you can have with Meta.

TESSA
That would be so entertaining.

JORDAN
Well, look, yeah, we can just start writing letters.

TESSA
Just send them letters. Yeah.

JORDAN
I don’t have time to read the things, let alone get into those arguments. So yeah, maybe not. I think you have a view on this, but the way it’s currently set up, it seems to me like privacy policies are not written for humans.

TESSA
They’re not. So, I mean, if you just go back to the very basic central point, right, in Australia, privacy policies are part of the Australian Privacy Principles. So, APP 1.2 says you must have a clearly expressed and up-to-date privacy policy about how you handle, collect, et cetera, personal information. And the APP guidelines add easy to understand, avoiding jargon and legalistic terms, which indicates to me that no one who’s ever written a privacy policy has ever read the APP guidelines, right?

And that’s consistent as well with the GDPR, because Article 12 of the GDPR says concise, transparent, intelligible, easily accessible and plain language. Those are the words that they use. So I guess what I would say is that when you read that, it is absolutely stated. It’s not even implied. It is stated that the privacy policy is there for the consumer, or for whoever the customer of that business is, right?

JORDAN
These laws are imagining that a customer will follow that OAIC advice. They’ll consider a product and in considering the product they’ll read the privacy policy and then make a decision.

TESSA
Yeah, yeah, absolutely. And those laws and that guidance are written clearly stating that privacy policies are a communication tool. That is what they are, but that is not how they’re used.

So they are used as a tool for managing legal risk. Right. And so, I mean, you only have to see the fact that they’re folded into the terms and conditions to realize that, right? They’re almost always written by lawyers. They’re almost never written by people like me. And that is how they’re used. So, straight from the off, they’re not used the way they’re supposed to be used.

JORDAN
Yeah. They’re written by lawyers, and their audience is typically not the average consumer. Their audience is either the privacy regulator or a body like the ACCC, the Australian Competition and Consumer Commission, that companies are worried about getting fined by for misrepresenting their privacy practices, right?

TESSA
Yeah, they’re written for the regulators, and they’re also written for, you know, the head of legal, the person who signs off, who says, you know, this is what our risk is, and we don’t want any situation where we’re putting ourselves in any kind of risk, and this is a legal document, and all that sort of stuff. And I would say that’s just completely contrary to their purpose.

JORDAN
Yeah, exactly right. Well, the pressure there is to have general statements, to not be pinned down to too many specific commitments that someone might hold against you later, to minimize liability in that way, and at the same time to be incredibly precise, right? To not have anything in there that someone might claim was inaccurate. Whereas if you want to use it as a communication tool to accurately describe what the company is doing, you’re actually better off with simpler statements that might be much easier to understand but might not have quite that legal precision. Statements that might be easier to engage with, or that actually better explain the activities to a normal human, rather than being written for a regulator.

TESSA
Yeah, absolutely. But the thing is, I think, that when people are writing a privacy policy as a legal tool, they’re thinking, oh, we must be incredibly precise. But actually, it’s not uncommon for that incredible precision to end up being very vague and to provide no information whatsoever. So it has the opposite effect of what they say it’s having. Whereas the looser, less specific language that is plain English and communicating to an individual provides a much greater level of understanding, and so, I would argue, actually provides more information. Like, if I read the privacy policy and I come away understanding something, then I have got more information out of that than if I read the privacy policy and understood nothing.

JORDAN
But every statement in the policy was technically accurate.

TESSA
Yeah, it’s just, you know, that is not the same thing as meaning.

JORDAN
Yeah, no, for sure. And I feel like historically, you know, we’ve relied on privacy policies and consumer choice in privacy regulation kind of since the get go, right? Notice and consent has been the regulatory model in large part. But these mismatches of incentives, I think, or the difference between the regulatory intention, which is that it’s a communication tool for consumers, and the practical incentives for companies, which is really that you’re communicating to the regulators or to the privacy advocates or whatever, rather than to the consumer, has led us to this place where they’re just useless, essentially, as communication tools.

Do you see, are there companies that do it well? Are there ways of doing it well? Like, how do we claw that back? Is there a way to claw that back?

TESSA
There is, and it is totally possible to write an understandable privacy policy that’s actually accurate. And I mean, I guess at risk of sounding like the regulator, the answer is layering. But basically what it is, is the Wikipedia approach, right? It’s providing the kind of broad strokes, it’s providing the understanding, before you try to get into the nitty gritty, before you try to get into the absolute detail. And the amount of detail that you’re going to need and the amount of layers of information you need is going to depend on the complexity of your information gathering. I mean, I would say, just as an aside, that there are a lot of bad privacy policies out there that are bad because the companies themselves don’t understand their own information handling.

Right? Like, straight from the off, if you don’t actually know what you’re collecting or what you’re doing with it, there is no chance that your privacy policy will be accurate or comprehensible. Like, straight from the off you’ve already lost. But leaving aside those companies.

JORDAN
Without naming anyone, that’s, that’s something that we occasionally run into with clients, right? It’s like, can you help us update the privacy policy? It’s like, sure. What do you do with information?

TESSA
Oh, exactly. Right.

JORDAN
I was like, all right, well, we’ve got some other work to do before we tell people.

TESSA
Let’s start at the bottom. Yeah, no, but leaving that aside, I have seen good privacy policies, even from complicated businesses. Like, a few years ago, I haven’t read it recently, but a few years ago, Twitter had a really good privacy policy. And the reason it was really good was because it had a short introduction, it had categorization of information, you know, like it had information about collection, it had information about disclosure, it had information about security, and it grouped it, you know? And then for each section, it had a little pop-up box that was, you know, a one sentence or one paragraph plain English summary of the information that was in that section. So it was like layers upon layers. You could look at just the sections you were interested in, you could just read the little pop-ups, you could just read the intro. It’s not that I think that Twitter, as it was five years ago, necessarily had perfect information handling practices or any of that sort of stuff. It wasn’t better because they do, you know, a better job at protecting people’s privacy. Not that I would imply that they don’t. But it was better because it was constructed and written to be read by people. That was so obviously what it was for.

JORDAN
And that doesn’t happen magically, right? Like that’s, that’s actually like a lot of work and a lot of like sitting down and, and working out how you can structure the data, the information, thinking about your audience and you know, how they’re going to process it, what questions they’re going to have when they come to it and frontloading that information.

TESSA
Yeah, absolutely. It’s also involving people in that communications process that aren’t just lawyers, right? Like, the lawyers have a place, and the privacy team has a place, and, you know, the risk area has a place, and the records management team has a place. All of these people have a role to play, but it’s a communications tool. So if you don’t have communications people involved in the writing of your privacy policy, it’s very unlikely to be successful.

JORDAN
No, for sure. It’s something that, I think, privacy people sometimes forget is a problem in other industries as well, you know, like financial services and insurance and energy markets. There’s actually a stack of regulation and research about how people process numbers and quotes and terms, and how you structure information in a way that gives people the key stuff and then lets them dig in later or whatever.

TESSA
Yeah, absolutely. I mean, financial disclosure statements have exactly the same problem that privacy policies have, and I don’t read them. Exactly right. I read privacy policies, but I don’t read financial disclosure statements, because I don’t know enough, and it’s exactly the same problem.

JORDAN
No matter how well drafted and easy to understand a privacy notice or policy is, I think sometimes I’m still not going to read all of them, right? I access a million different websites a day. Even just having to click on a bloody cookie notice for every website that I visit.

TESSA
Everybody hates cookie notices.

JORDAN
Everybody hates cookie notices, right? And that’s a notice and consent kind of situation gone crazy. I think no matter how good we get at privacy notices, there is still a really important role for pushing accountability back onto organizations and out of the hands of the individual, right? Like, we still want to give people environments where they can not pay attention and still be safe.

TESSA
Yeah, absolutely. I mean, I want fair and reasonable. Like, I absolutely want fair and reasonable.

Like, I think we need it, no question. There’s definitely areas, you know, in our privacy law that are just, you know, massive loopholes that need to be closed. And yeah, we definitely need that. But fair and reasonable, it’s funny, because we think, well, fair and reasonable is so great, but fair and reasonable has real limitations as well. And I think that’s what’s interesting: they’ve actually both got these limitations, but they actually kind of mesh together quite well. Like, you don’t want to read privacy policies, right? And you want the fair and reasonable test to do it for you.

JORDAN
Yep.

TESSA
And I’d love that for you, Jordan. But realistically, fair and reasonable is going to cover things like direct marketing to children. It’s going to cover things like collection and use of biometric information. It’s going to cover things like health information, sexuality, all the sensitive information stuff, right?

It’s going to cover, you know, it might cover third party uses in certain circumstances. What it’s absolutely not ever going to do for you is: I have just bought something on this website and they have subscribed me to marketing emails and now they’re sending me, you know, marketing emails. Fair and reasonable is just not going to address that, because it’s not serious enough and it’s too specific. You know what I mean? Fair and reasonable has to be blanket, right? It can’t get down to a level of granularity where it can cover all these little situations that are different in every circumstance. That’s where proper privacy policies and proper notice and consent could actually give us something that we don’t, you know, currently have, or kind of aren’t currently able to access.

JORDAN
Yeah, no, you’re right. Relying on a regulator running around applying a fair and reasonableness test is not really scalable either, right, to those smaller, more context-based situations. I mean, compared to the harm that comes from receiving that direct marketing email, you know, the regulator has a limited amount of resources, so they’re going to be focused on the high risk, high impact, serious stuff.

TESSA
They’re also going to be focused on the stuff that everyone agrees on, right? I think everybody agrees that direct marketing to children is bad. Not everyone agrees that direct marketing to adults is bad, right? There’s actually people that like direct marketing. I don’t understand it, but, you know, each to their own. It exists, it has a role to play. It’s not a definite, right? It’s not an absolute. And so it’s not going to be covered by fair and reasonable in all circumstances.

JORDAN
Yeah, no, that’s very true. There is a very wide range of marketing, or handling and collecting of personal information, or consumer activity, that is legitimately within the realm of choice, right? Like, some people are into it, some people aren’t. And in order to enable that whole slice of, probably the majority of, activities and consumer relationships, you really need to have the capability to communicate effectively about privacy, right? And that’s the role of the notice and consent in the middle there. Yeah.

TESSA
And giving people that opportunity to do the kind of personal maths for themselves, essentially, you know? Do I want this rewards card? Do I want this service? Am I willing to use this website? But I think where we absolutely need fair and reasonable, where privacy policies just can’t get it done, is in those areas that aren’t optional. Government services, and not just government services, medical services, but also those sort of allegedly optional services that aren’t really optional, like social media. You can’t just say, well, oh, but you could just not be on social media. I mean, I don’t know how many people there are that aren’t on social media, but it’s not very many. And it’s quite hard now to not be on social media.

JORDAN
And there are relationships that are embedded there, right? Like, if I want to ditch social media, I need to then satisfy myself that I’m not going to get updates from my cousins or from, you know, the people I care about, where this is my only way of keeping track of their lives in a practical sense.

TESSA
You know, it’s increasingly common that LinkedIn is used to verify people’s CVs, you know? So if you’re not on LinkedIn, you’re committing to, okay, well, I’m applying for jobs and I feel comfortable that this person that I’m applying for the job with will believe that I am who I say I am, or believe what my CV says, you know what I mean? There are a lot of optional services that aren’t really optional now in this kind of brave new world.

JORDAN
Yeah. And so that’s the kind of area, the other area, where fair and reasonable, or that regulator top-down approach, is necessary, because you can’t rely on consumer choice.

I think, though, especially in those areas as well, it’s not either/or, right? As we were saying, the transparency is still really valuable. Like, I might not have a real choice, but still these companies should be pushed to, and government should be pushed to, coherently explain what they’re doing in a way that people can actually understand.

TESSA
Yeah, absolutely. And I think there are a lot of privacy policies that are incomprehensible on purpose. You know what I mean?

There are organizations who are genuinely using them to obfuscate. They don’t want to communicate with you. Right. And I think that should be pushed against by the regulator.

JORDAN
Yeah. No, for sure.

TESSA
They should, like, try harder, you know? Like, I understand that the information handling at, you know, Meta or Centrelink or these big complicated services, it’s complicated. I understand that. And you might struggle to explain that to me, but I think that you could try harder to make a genuine attempt to do so, you know?

JORDAN
Yeah, no, for sure. And again, this is where I like the analogy to financial services, in that we don’t accept that explanation. We don’t accept the, like, oh, it’s complicated. It’s like, well, I don’t care, find a way, right? You need to articulate the terms in detail. And you need to articulate the nature of the product and the risks in a way that a normal human will understand.

And that’s the requirement. I’m sorry, you can’t offer the thing unless you can do that. For me, the lesson or the takeaway here, I mean, we’ve talked a bit about the challenges and the tensions and the mismatched kind of incentives around drafting privacy policies. I think what the Privacy Act reform is trying to do, which I really agree with, is just keep that pressure on, keep pushing, right, and saying, well, actually, you know, these privacy notices, privacy policies need to be clear, concise, understandable. If they’re directed at a child, they need to be drafted in a way that the child will understand. You know, we don’t accept this kind of tendency that’s happened where they’re just legal disclosure documents. While at the same time having these top-down tools for regulators to go after the really bad stuff and not rely on people to identify and protect themselves from the really bad stuff. I think that’s a sensible way forward. Are you optimistic?

TESSA
I don’t know. I mean, if you’re optimistic, you can end up being horribly disappointed, whereas if you’re pessimistic, you’re always pleasantly surprised. I think it’s possible, put it that way. Like, I think that fair and reasonable puts more pressure on notice and consent to kind of come up and meet it somewhere halfway.

JORDAN
One day we’ll meet in the middle, where there’s great privacy communications and well-funded regulators going after the bad stuff. See you in the happy, sunny future, Tessa. Thanks for the chat, it’s been fun.

TESSA
I’ll see you on the other side. Thanks Jordan. Bye.