
#101 Go dark or go home – Unscrambling the encryption debate

20 March 2024

** Content warning: This episode contains discussion of Child Sexual Abuse Material **

Australia’s eSafety Commissioner is trying to force tech giants to scan cloud storage for illegal and harmful content. Apple says this could undermine security protections and facilitate mass surveillance.
 
This is just the latest skirmish in the crypto wars – a 50-year-old policy debate about how to balance strong encryption (which is essential to privacy and security online) with law enforcement access to crucial data.
 
This week we explore some of the policy and rhetorical challenges that arise when trying to debate these competing objectives.
 
Links:
Article on Apple’s criticism of eSafety’s proposed standards (InnovationAus) https://www.innovationaus.com/apple-warns-against-mass-file-scanning-proposal/
 
An old but good breakdown of some common crypto wars rhetoric (The Verge) https://www.theverge.com/2016/1/12/10749806/encryption-debate-fbi-terrorism-going-dark
 
Criticism of Malcolm Turnbull’s laws of mathematics gaffe (Electronic Frontier Foundation) https://www.eff.org/deeplinks/2017/07/australian-pm-calls-end-end-encryption-ban-says-laws-mathematics-dont-apply-down


Transcript

This is an automatically generated transcript. We make our best efforts to check that it is an accurate reflection of the episode, but it may contain some errors and unedited content.

Arj
Welcome to This Week in Digital Trust, elevenM’s regular conversation about all things tech policy, privacy, AI and cybersecurity. I’m Arj, joining you today from Awabakal country.

Jordan
And I’m Jordan joining you from Wurundjeri country in Melbourne.
And Arj, I’m joining you on an encrypted connection today.

Arj
Is that right?

Jordan
Yes, highly encrypted secure.

Arj
Does that matter when you broadcast it out on a public platform like a week later? Does it matter that…

Jordan
Good question.

Arj
We don’t want anyone to listen to the raw uncut version of this.

Jordan
You don’t get to snoop on the actual recording. You just get the polished publication afterwards. Yes.
We’re talking about encryption.

Arj
Yay. I think I said this before, but I am intimidated by this topic. I think it’s really hard to have this debate without it feeling like you have to be a technical expert in cryptography.
Even though you don’t have to be a technical expert in cryptography, there’s much more to it than just the technical elements. Yeah.
As soon as you start talking about it, it immediately feels like you need to know about that stuff.

Jordan
That’s it. And there’s a really kind of long-standing debate about the tensions between encryption and law enforcement access. It’s variously been called the crypto wars, the encryption debate and various other things.
But there are a lot of really important public policy questions there. And yet, every time we try to talk about them, we end up within minutes in the weeds on whether you capture it here rather than there, or encrypt it in this way rather than that way.
We’ll try to keep it at a high level.

Arj
I’m glad we are talking about it though, because I think it’s one of those elements of technology that, to some extent, you don’t see and you take for granted. You know, it’s there, we use it all the time, it protects our information.
And then also, you know, as we’ll talk about, for certain types of people it’s particularly important. You need that protection from bad actors, and encryption is … It’s so accessible to us on all of our devices and it’s pretty cool.

Jordan
Yeah. And it’s one of the fundamental enabling technologies of the internet and modern life, right? Your ability to communicate with someone else and be confident that that’s who is at the end of the line reading you, and that there’s no one in the middle reading your email or viewing your bank account and so on.
So, you know, the value of encryption is astronomical in enabling modern technologies. But the debate is not, is encryption good or bad, right? The debate is: at the margins, what should law enforcement have access to? At the margins, what responsibilities do digital platforms and service providers have to snoop, I guess, on our content, or scan it, as in the kind of story that we’re starting with today?
Or break encryption when there’s a warrant, or lawful access has been granted by a court to your stuff.

Arj
Okay. Well, you mentioned the story that we’re going to kick off with, because it’s a recent story and a good trigger for us to have this conversation. And I suppose before I introduce the story, it’s probably useful for people to know that many of the digital platforms have now rolled out end-to-end encryption across much of their services, whether it’s people using WhatsApp or Signal from a messaging perspective through to iCloud being encrypted.
This basically means that the content on our devices, on our phones, is largely inaccessible to authorities if they want to look for anything that might be what they consider to be bad. So that’s the context for the story I’m about to introduce.

Jordan
Let’s Encrypt as well. There’s been a huge push over the last ten years or so to encrypt just everyday internet traffic. Ten years ago, basically only your access to your bank or properly secret, secure websites would be encrypted.
These days, virtually every website you access, whether it’s shopping, general information or a government site, the traffic between you and that site will be encrypted. So encryption is, like, everywhere now.

Arj
It’s everywhere. And so, you know, in the face of this trend of everything going dark for authorities, there’s this other issue and trend of the proliferation of child sexual abuse material and child sexual exploitation material, which is a genuine and real problem. And we have a large number of child safety advocates, and institutions like the eSafety Commissioner, that are really working hard to think about how we can be more proactive in, you know, identifying, detecting, reporting and stopping this material being produced and proliferated.
eSafety in Australia is pushing for a requirement for tech companies to proactively scan for this material, for child sexual abuse material and pro-terror content, on their platforms and to effectively report it.
And what we’ve seen in this recent news story is a pushback from groups like Apple, Digi, which is an alliance of major tech platforms, I think, and Digital Rights Watch, effectively saying that this is going to undermine, if not encryption itself, their ability to protect the privacy and security of their users’ data.

Jordan
Yeah, super interesting to see Digital Rights Watch and Digi on the same side of a tech policy debate. Usually, certainly when it relates to privacy, they’re on opposite sides.
But yeah, this is part of the eSafety Commissioner’s power to make codes under the Online Safety Act. The Online Safety Act was passed in 2021, and it regulates kind of online material and behaviours, from bullying and harassment to, you know, videos of terror content or super violent stuff or pornography.
And the eSafety Commissioner has a power under that Online Safety Act to work with industry to establish codes for how that kind of material is to be managed. So they’re developing these codes, and in the code that the eSafety Commissioner is proposing, there is this requirement that online platforms or messaging apps proactively scan for and remove child sexual abuse material, wherever it is technically feasible to do so.

Arj
Yeah. And that wording is interesting. In fact, there’s a lot of back and forth here between the platforms like Apple and eSafety around the wording, because technically feasible from an eSafety perspective is sort of saying, look, we’re not asking you to break encryption, because that’s not possible to do in the sense that if you don’t have the keys, you can’t break it. So we’re looking at other measures.
One of the things that eSafety has, for example, talked about is this idea of, you know, we’re not asking you to build systemic vulnerabilities, but to do things like hash matching, which basically means they have an ability to digitally fingerprint a piece of content, like child sexual abuse material, and compare that fingerprint to a similar fingerprint of the content on a device.
So without knowing exactly what the piece of content is, you run an algorithm and get a hash. And then, because they have a database of hashes for child sexual abuse material, they can make that kind of comparison and say, okay, that’s an example of a device that’s holding some child sexual abuse material. But the core thing for me is eSafety’s position: they’re saying, we’re not asking you to break encryption, but Apple still sees that in the language of, you’re asking us to build a backdoor.
So there’s a lot of this kind of jargon around building backdoors and, you know, what’s technically feasible, which I think gets very confusing.
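
To make the hash-matching idea concrete, here is a minimal sketch in Python. It is an illustration only, not how any particular platform implements it: it uses an exact SHA-256 digest, whereas production systems such as Microsoft’s PhotoDNA or Apple’s proposed NeuralHash use perceptual hashes that still match after an image is resized or re-encoded, and the file paths and hash database below are hypothetical placeholders.

    import hashlib
    from pathlib import Path

    def file_fingerprint(path: Path) -> str:
        """Return a hex digest that acts as the file's fingerprint."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def scan_for_known_material(paths, known_hashes):
        """Yield any file whose fingerprint appears in the known-hash database."""
        for path in paths:
            if file_fingerprint(path) in known_hashes:
                yield path

    # Hypothetical usage: "known_hashes.txt" stands in for a database of
    # fingerprints supplied by a body such as NCMEC; the values are placeholders.
    known_hashes = set(Path("known_hashes.txt").read_text().split())
    matches = list(scan_for_known_material(Path("photos").glob("*.jpg"), known_hashes))

The design point is the one described above: the scanner only learns whether a file’s fingerprint appears in the known database, not what is inside files that don’t match.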

Jordan
Yeah. And I think that’s the really interesting thing for me about this debate, right? So you’ve got the dispute between essentially Apple and some of the big tech companies on one side, and eSafety on the other, where eSafety says you must scan for this horrible, objectionable content.
And Apple responds saying, well, privacy: we can’t be scanning for this content because it would require us to break encryption, it would require us to monitor our customers in a way that’s terrible and bad for human rights and privacy and so on. Both of those are really kind of reasonable, sensible positions. Like, I am on board with both of those objectives, right? Child sexual abuse material, which I’ll sometimes reflexively call CSAM.
That’s the acronym, right? Child sexual abuse material, if you’re listening and wondering what that acronym is. So on the one hand, we’ve got to deal with CSAM.
That’s, you know, horrible and objectionable. We need to catch it and stamp it out wherever we can. On the other hand, privacy, security, safety: good encryption is a fundamental technology for the internet, and safety and privacy are super important. And so there’s so much rhetorical power on both sides of this debate.
But the solutions are always in the really low-level technical details about exactly how you do it. That makes it, I don’t know, really hard for these discussions to be had from a policy perspective, because the rhetorical power on either side is so great, but the detail is so difficult.

Arj
Yeah. As we have discovered, I think it’s hard to have a conversation at the level of rhetoric, or the policy level, without at least some diving into and mastery of the detail, because the rhetoric kind of, you know, has meaning. So I see the words a lot in this conversation about, you’re asking us to undermine encryption. And one side of the argument will say, look, we don’t want you to touch encryption. We believe in encryption. We want you to do this other thing.
And the other side will say, but that undermines encryption.
Like, if you asked me to scan a piece of content before it is encrypted in order to tell you what’s in it, that’s undermining the fact that my practice is to encrypt data. Just because I’m not breaking the encryption in a technical sense does not mean I haven’t undermined it. So there’s a little bit of that back and forth in language games and rhetoric, which is difficult.
And, you know, to your point about that kind of power, I pulled out two quotes that I saw around this conversation, which to me, when you read them both in isolation, both seem completely compelling and I’m on board with. Apple’s is, you know: children can be protected without companies combing through personal data, and we will continue working with governments, child advocates and other companies to protect young people, their right to privacy and make the internet a safe place for children and for us all. Sounds kind of reasonable.
And then you’ve got Sarah Gardner, who’s from the HEAT Initiative, which is a child advocacy group, who basically says, look, in terms of Apple, it’s their responsibility to design a safe, privacy-forward environment that allows for the detection of known child sexual abuse images. As long as people can still share and store a known image of a child being raped in iCloud, we will demand they do better. And you know, that also is very compelling.
And we talk about organizational accountability in different contexts, which is like at the end of the day, you build these platforms, you have to take some accountability for the harms.

Jordan
Yeah, no, for sure. And for me, the most compelling fact or element of this, just building on what you said about organizational accountability, is that Apple reports almost no CSAM on their services. So there’s this technology and way of reporting this kind of material that you pointed to earlier, where you make a hash of an image and check it against a central register. There are a couple, but there’s the National Center for Missing and Exploited Children in the US.
There’s a tool that Microsoft makes for checking hashes. You can build this into your system so that the hash gets made and you check, is this known child exploitation material? In 2022, Facebook (Meta) made 21 million reports, Instagram (also Meta) made five million reports, Google made two million reports. Apple made 234.
That cannot be because nobody with an Apple device shares or stores this material. It’s because they’re not looking. And I think, putting aside the debate about whether they’re undermining encryption, if everybody else seems to be able to scan for this material and report it, and Apple cannot, something is wrong there, right?

Arj
And what’s interesting also in the Apple story is how they effectively did an about-face over a couple of years, right? So in 2021, they actually announced exactly this: they announced they were going to build this iCloud photo scanning tool to detect CSAM. And they got a lot of vocal support and pats on the back from child advocacy groups who were wanting to see this, probably for exactly the reason you’ve just outlined, which is that the reports weren’t coming in.
And then there was a sort of flurry of feedback and concern from researchers and advocacy groups and customers. And they paused it in September, like a month later, and then eventually killed it off in December of 2022. So it’s interesting that they’ve also had that about-face, and, to be fair to them, I want to give a little bit of voice to, I guess, some of the concerns that some of those groups were raising.
One of them is the potential for misuse. Okay, we understand the use case around CSAM, we understand that we’re going to just compare a known set of hashes for child sexual abuse material with the hashes on a device, and if there’s a match, then we know we need to report it. The concern raised is that that whole architecture could be misused, because Apple don’t know the validity of any of those hashes.
They’re given those hashes by various bodies like NCMEC, which I think is like a national child protection institute.

Jordan
National Center for Missing and Exploited Children.

Arj
Yeah. And law enforcement say, you know, here’s a bunch of hashes of CSAM that we want you to scan for. And one of the concerns that researchers have raised is that Apple don’t know what those hashes are.
And in certain regimes, an equivalent set of hashes could be given to Apple, and Apple told that’s what they are, but they could be posters calling for political freedom or in support of opposition parties. And suddenly this same architecture, this same technology, is being used in other ways. And then there’s the idea that the same technology could be expanded into text, you know, comparing hashes of text, which again raises issues in terms of freedom of communication.
So these are some of the concerns. I don’t know, this is where I start to get nervous and very aware of my own limitations, because I’m aware that I’m talking about the technology and how it works. But I just wanted to give some voice to these concerns that researchers have raised, which have apparently given groups like Apple pause, though those stats you read out before are incredibly compelling.

Jordan
Well, that’s why I go to the outcomes rather than the how. Because we’re a former lawyer and a former journalist, consultants opining on the specific details of the technology, we’re not qualified on the details. For me, with that disclaimer, that slippery slope argument doesn’t really resonate.
There are lots of technologies that, if these big tech companies allow regimes in undemocratic countries, or democratic countries for that matter, to co-opt them, then that’s a problem for human rights and privacy. You know, I don’t know why scanning for a particular hash is different from, say, just providing the data that you can access for a particular person. I’m not totally convinced.
Maybe it’s because I don’t understand the technical details, but I’m not totally convinced by that slippery slope argument.
Taking a step back, or up, I do find it really interesting how quickly these debates become just a battle of very high-level ideas. For example, language like terrorists are going dark, which you used earlier but which is very common in this debate. Language like tech companies are refusing to cooperate, or that it’s impossible to do anything without putting everyone at risk. It’s really evident in even just the quite mature coverage of this Apple and eSafety dispute, on a very specific point of these eSafety standards. It still ends up with one side saying, well, basically, you know, it’s impossible to do this without putting everyone at risk. The resort to rhetoric is, I think, super interesting.

Arj
Yeah. What I actually said before was the internet going dark, because I’ve heard that phrase as well, about terrorists going dark on law enforcement. But the internet going dark is something I’ve heard actually in security circles, when you’ve got security teams trying to secure their companies, let’s say.
One of the things they used to be able to do was inspect network traffic and say, okay, there’s bad stuff in here, there are malicious bugs and nasties in this traffic and we’re going to block it. And now they increasingly can’t do that as effectively, because there’s just encryption everywhere, end-to-end encryption everywhere.
So security teams haven’t responded to that by going, we need to get rid of encryption, because they see that it has value. Instead they’ve looked at new models like, you know, things like zero trust, which means, can we check the validity of something in other ways that don’t require us to rely on inspecting that traffic? And I wonder if that’s not unlike what people are proposing Apple do.
It’s just, we’re not asking you to break the encryption. We’re saying there’s a trade-off here that means, okay, you’re doing something else on the local device, and that doesn’t mean that you can see the content.
And so, you know, I think that, and again, we’re straying into the technicalities of it, I think you’re right. And I’m a bit like you, where I’m leaning towards the idea that this is a problem, CSAM is a problem, and these platforms ultimately have accountability for the outcomes on their platforms. And so there is something for them to do here. I don’t know exactly what that is, because I think it does require smarter minds than ours from a technical perspective.

Jordan
I think that’s true though. There are engineering solutions to these things. A lot of the encryption debate has been about this core question of, can you build a backdoor that always allows law enforcement access to certain data without compromising security? People talk about laws trying to make people break encryption.
Malcolm Turnbull in 2017, I wanted to get this in, has this wonderful quote when he was talking about proposing laws to require tech companies to break encryption. He said the laws of mathematics are very commendable, but the only law that applies in Australia is the law of Australia. The laws of Australia kind of override mathematical axioms.

Arj
Physics, yeah.

Jordan
Yeah, exactly, which is a great assertion. But again, it’s so tempting to go there, and that’s kind of a straw man, right? What eSafety is asking here is not to break the laws of mathematics. What they are asking for is engineering solutions across a platform that allow for the detection of some of the most horrendous material on the planet.

Arj
I think naturally, even on this podcast, but generally, there’s a cynicism around law enforcement and its abuses, and I think that’s completely warranted in most cases. And so there’s a cynical view around, if you build a capability like this… well, what we’re seeing here is that everything has gone digital. All our content is digital.
And suddenly, you know, the eyes of law enforcement have grown wide because they’re like, oh, well now we can get our hands on all of it. You know, there used to be stuff that people had in their bedrooms and in their drawers and whatever, and we couldn’t really see other than what we could see them do in the public eye.
But now if it’s all on digital technology, we could get access to all of it. So that’s why we’re making these requests. That’s why we’re building these, you know, creating these laws that require platforms to comply. That’s it.
But I think that’s a cynical view, because I think another view is: because everything is digital and encryption is increasingly pervasive, even when law enforcement has a warrant to legitimately access something, it’s easy for hands to just go up in the air and go, we can’t help you. And so we have problem spaces like child sexual abuse material where we know there’s a problem, we know the content’s there, you know, that it exists, but we’ve created a scenario where hands can be thrown in the air and platforms can say they can’t do anything about it.
And effectively, I think you’re right. I think the policy approach, the sort of summary line of it, is: you guys have smart engineers that have built lots of smart stuff, figure it out.

Jordan
Sure, and of course there are limits to what smart engineers can figure out. They’re not, you know, magicians.
But there are also, I think, reasonable steps or reasonable arrangements that can be made that address some of the law enforcement content concerns whilst still preserving privacy to a reasonable extent.
So I think the temptation to go to the extremes, the hyperbole, privacy versus security, is unproductive. And yeah, I think we need to reframe that debate to, okay, what in a practical sense is achievable?

Arj
The last thing I want to say is that I think some of our conversations in other contexts apply here as well, which is that we’re not saying there’s a technology-only solution to what’s really a large social problem.
We’ve talked in other contexts, like dating apps and gendered violence and things like that, about the tendency to go, let’s fix it with the tech. And we’re not saying that here. There is still clearly a case for a lot of these whole-of-society issues to be addressed in other ways, and the balance of responsibilities therefore does fall in different places.
But clearly there’s a dark spot, and you know, those reporting numbers say that there’s a real kind of black hole in terms of what we can see here.

Jordan
And with CSAM especially, it’s a kind of dark economy that is basically built on the internet as well. There’s this kind of international trade in awful abuse material that is a product of the internet. And so I think pointing to a tech solution on that front is not unreasonable.

Arj
Righto, we did it, we made it.

Jordan
I always feel weird coming out of conversations about CSAM and gross online stuff. So I’m gonna go, you know, touch some grass and be in the sunlight and, yeah, chat to you again next week.

Arj
Sounds wonderful. Enjoy, and yeah, I will chat to you next week.

Jordan
Cheers, Arj. See you.

Arj
See you, Jordan.