This week in digital trust » Episode 100

#100 Reflections on 100 – the lessons that popped

12 March 2024

To mark 100 episodes, this week we reflect on the topics covered on This Week in Digital Trust since its inception.

Arj and Jordan also share four key insights and learnings that have emerged through their discussions, and how these have shaped (and are shaped by) their world views.

If you’d like to share some feedback with us, we’d love to hear it.⁠ https://forms.office.com/r/6HxEztcC85

Listen now

Transcript

This is an automatically generated transcript. We make our best efforts to check that it is an accurate reflection of the episode, but it may contain some errors and unedited content.

Arj
Welcome to This Week in Digital Trust, elevenM’s regular conversation about all things tech policy, privacy, AI and cybersecurity. I’m Arj, joining you today from Awabakal country.
Jordan
And for the 100th time, Arj, I’m Jordan, joining you from Wurundjeri country in Melbourne.

Arj
Woohoo! 100 episodes, yeah. Just a few thousand behind Joe Rogan.

Jordan
Yeah, look, we’ll get there. We’ll get there. Do something every week and it adds up.

Arj
Yeah, that’s really cool. Just want to say that from the top. Wouldn’t have expected it, but 100 episodes. Very, very cool.

Jordan
Yeah, I mean, if you’d told us in week one, what was it, November 2021, when we set out on this journey, that we’d make it to 100, I would not have believed you.

Arj
I’m pretty sure those first dozen or so listeners were probably just friends and family, but it’s been nice to pick up a healthy number of listeners over the years, and people who actually stop you at conferences and business events: hey, you’re the guys from the podcast.

Jordan
Yeah, it’s pretty crazy, isn’t it? I think that is even more remarkable than making 100 episodes, that people actually listen to them. So thank you if you’re listening.

We thought we’d spend this ep having a bit of a reflection. So, you know, 100 episodes in, what have we learned? We’ll get to some lessons.
But for me, it’s just wild. Like, 100 episodes over two and a half years. It was really interesting prepping for this ep to go back through the back catalog and have a look at some of the things we’ve been talking about. And, you know, there are some discussions that are, like, two years old that feel like we just had them yesterday. So I think you picked out that Roe v. Wade discussion, about the US rolling back Roe v. Wade. That was in May 2022. It feels like that was a lot more recent than that.

But other discussions as well, about things like children’s privacy, or the failure of notice and consent, or facial recognition, these are things that we’ve been talking about for, like, two and a half years. And maybe it’s a depressing observation that they’re still live issues, right? You could just republish some of those episodes and they’re totally relevant. But then others, like all the conversations we had about crypto in 2022, which was front of mind in all technology media, have just evaporated, right? They’re just completely gone.
So I think it’s something that we don’t often do in kind of technology criticism or technology policy: have a bit of a, I mean, granted it’s only two years, but that kind of longitudinal look, the look back at the things that were big issues two years ago, or even one year ago or 10 years ago.
Have they continued? Is the thing that was promised delivered or is it not? Has the risk materialized or has it not? I think they’re really useful conversations.

Arj
Yeah, I think the weirdness of time is partly a function of the way some of these issues in technology play out. Like, some of them are flashes in the pan. And so when you look at those conversations, you’re like, hey, we haven’t spoken about web 3.0 in, like, 18 months, but it felt like all we were talking about before that was web 3.0.
And other things, even though it’s tech and tech policy, just don’t change. Like you said, with some of these topics we’re still grappling with the issues; there are reforms we’re still waiting for.
So, you know, we talked about them a couple of weeks ago, but we also talked about them a couple of years ago, and we were sort of having the same conversation. So I think that plays into it. I was thinking and reflecting ahead of this podcast about whether I had, like, a list of things I thought we would talk about, and how that actually compared to our back catalog.
And I think some of those things, like privacy reform and cyber strategy and breaches, as expected, are kind of littered throughout. But I don’t think I would have anticipated the sort of post-ChatGPT AI explosion that happened and has kind of dominated.
But yeah, I don’t think I had a list, but one of the things that I think that jumps out for me is that I think when we set out, we did that classic thing people do in podcasts, which is like, have conversations just organically and then at the end of it go, oh, that was very interesting to me. That should be a podcast. We should have a podcast.
But it was always about how we use tech and data, and how that shapes the way we work in the world and our identity. And that was what interested us. As it turns out, that conversation about how we position ourselves in the world and amongst others applies to more than just privacy and cyber. Like, it’s turned out that we do talk about those things.
And we talked about the platforms, but we also talk about dating apps, and we talk about workplace tech, and all of these things, because it fits into that frame of, like, how do these things affect the way we operate in the world, and our relationships with each other, and our place in society. So it’s been really interesting to see how that’s found voice in different topics.

Jordan
Yeah, for sure. And the pod really started for us as a learning journey, right? Coming from our relative positions of expertise, you in kind of comms and communication around privacy and cyber, and myself in kind of lawyering, or, you know, risk management around privacy in particular, and branching out into those broader areas of tech criticism.
And, yeah, how technology impacts every aspect of our lives has been a really interesting learning journey, I think. And so what we’re looking to do today is go through some of the things that we’ve actually learned.

Arj
I’ll kick us off. One of the things this pod has provided me an opportunity to do is evolve my understanding of what privacy actually is. And I know that sounds silly, because, you know, we’re a privacy and cybersecurity consultancy and our pod is about that. But I often find that when you have these conversations, privacy is conflated with security in some cases. Like, people talk about it just in terms of my data being secure, and if there’s a breach, well, then my privacy has been breached. Or about secrecy, like this idea that no one should know anything about me if I don’t want them to, and that’s what privacy is.
And one thing that really became clear was that that just doesn’t gel with the online experience, where we clearly are out there sharing things about ourselves in different forums. People want to reveal themselves, yet we have this tension that our privacy is always under threat.
So for me, I was always trying to wrap my head around what that was all about. And one of the things the pod has allowed me to refine my understanding around, and you brought it to me frankly, is this idea that
a lot of what privacy is about is a sense of agency and control over how you’re seen and understood in different contexts, and having the ability to be in charge of revealing that.

And you brought that to me in the context of the work of Helen Nissenbaum, who talks about this idea of contextual integrity. And again, it’s really that idea of having the ability to manage your persona in different contexts. And that just immediately made sense of so much of what we talk about.
The fact that you can be someone who wants to share information online, but you want to have control over that. And the thing the online world often challenges us with is that information just flows freely and gets assembled, and this world of data brokers, with its on-sharing and selling and accumulation of data, just completely undermines that integrity. So that has been a really good lesson for me.

Jordan
The internet collapses contexts, right?
You’re underselling yourself there. You know, the “what is privacy” question is something that academics have been grappling with for the last century, right? It feels like a silly question, but it really is central, and hard.

Arj
It’s hard. And I think the reason I raised it is also because it goes to the heart of that premise that a lot of the social media platforms started out with, where they’d point to people’s behaviors and go, look, they’re sharing information on Facebook, on these social media platforms, privacy is dead. And this actually explains that that’s not what it’s about. People do want to share; they want to have agency. And one of the episodes where it came to light for me was episode 46, “Leaving on a Jet Plane”, which was actually about this idea that even with information about ourselves that is public, we’re still expecting a right to privacy.
So you might have information about yourself that’s out in the public domain, and yet you feel the need to say, well, actually, I have a right to privacy in how that public information is used, because of this idea of context. Just because I have something on an electoral roll or whatever doesn’t mean you can pipe it over here onto the internet and publish it on a news site. Yeah.

Jordan
That was Elon Musk trying to protect, you know, where his plane flies, right? Stopping people publishing that, yeah.
So, related to that, my kind of philosophical learning is similar but a different lesson, which is how important it is, when thinking about a technology, to think about power relationships and what that technology does to power relationships within our society.
People often claim that technology is neutral, that it doesn’t really do anything, but it always does. It always contains values and changes what humans can do to each other. And so new technologies inevitably change or exacerbate existing power structures. Maybe they devalue labor, in the context of AI and automation. Maybe they empower or disempower workers, gig apps being an example of the second. They might provide new ways of controlling or monitoring behavior.
We did an episode about bossware and surveillance in the workplace, things like that. So in technology policy, in our responses to new technologies, thinking about what a technology does to people’s power relationships, who it empowers and who it disempowers, has been a thread in our conversations throughout. But for me it really crystallized even just a couple of weeks ago, in that conversation about neo-Luddites.
And we had a quote in that from Cory Doctorow about thinking not merely about what technology does, but who it does it for and who it does it to.
And so, yeah, I think that was my lesson: technology is about power. Technology policy is about making a decision as a society on who gets what powers, what new powers technology has brought us, who gets those, and who uses them against whom.

Arj
Yeah. And it makes sense. Like, I probably wouldn’t have anticipated this, but we often spend a lot of time, or I think I raise it probably a lot, on the VC companies, these big venture capital Silicon Valley giants who’ve kind of inserted themselves into our lives.
And, yeah, essentially putting a thumb on the scale of those power relationships. And it makes sense, right? So many of these contexts, like the examples you gave, are really about relationships: consumers and businesses, or individuals and the state, or workers and their bosses. It’s not just abstract tech, it’s about relationships in these different contexts, and the way the tech works, the way the data works, is going to influence the power dynamics there.

Jordan
Yeah, for sure. And it’s something I’ve been conscious of, especially in some of our recent episodes: how you think about those relationships shapes how you think about the technology, right? And so both of our relatively left-leaning politics, I think, comes through in those. You know, we have views about how we should shift the balance of power between labor and employers, for example, or things like that. I think one’s natural politics inevitably comes through, because these are not just conversations about newfangled technology and bits and bytes and whatever; they’re conversations about political power in a society.

Arj
And, you know, there have just been some really, you mentioned the neo-Luddites episode, but even just in the last week or so we’ve had a paper from UNSW and the CPRC about this idea of singling out: the way that the data collected, and the data broker environment, allow companies to single out individuals even without identifying them.
And what that ultimately means is that they can do things to that person. They can offer them different pricing, or prevent them from seeing certain messages or data. So it’s this whole game of, like, the power that organizations have over us on the basis of our data.

Jordan
Yeah. And that’s not a political issue, right? There is a general trend there, particularly with a lot of tech, that it’s centralizing power, giving these kind of big corporate or government actors the power to act on specific individuals where they never used to have it.

Arj
Yeah. Which is why I think some of us have been so keen to see some movement on privacy reform: because of this understanding that tech and the use of data in the internet age is playing into the way our relationships are mediated and power is mediated. And we’ve seen that the balance is not fair for consumers, and we’ve wanted to see privacy reform.
And, you know, this brings us to our next area of learning or evolved understanding, which perhaps is not really quite a learning as much as, you know, a big sigh.

Jordan
Acceptance.

Arj
Yeah, it really has been.
Frankly, it’s really disheartening to be at episode 100 and feel like we’re pretty much where we were at episode 1, which was waiting for privacy reform. And it’s been an emotional rollercoaster.

Jordan
Yeah, no, it really has. You’ve got it in the notes, so maybe I’ll just step on your thunder. But episode 51, which would have been over a year ago, was “the wait is over” on privacy reform. It’s not.

Arj
Well, this project is over, Jordan. We’ve set this up to make this happen and we’re done.

Jordan
That was in February last year. Yeah. It’s hilarious that we called it.

Arj
And so it’s been an emotional rollercoaster, because I think the first 40 episodes were probably under the Coalition government, where there wasn’t really as much traction and receptiveness to the idea of privacy reform. Then we saw a change in government, we saw positive signs, and yet here we are, some years down the road. So, I mean, for me, there’s been no shortage of effort and reviews and conversation, and we now have a set of policy proposals that are agreed at various levels, but we don’t have legislation.
And it’s hard not to conclude, the lesson for me is, that privacy unfortunately lacks a degree of political importance, in the context of making political progress. And maybe I’m overstating it; we’re still expecting to see laws drawn up this year.
But where it does seem to get play very easily is on the back of a cyber context. So there’s a breach like Medibank or Optus, there’s anger, and we saw fines increased, but we’re not seeing the broader suite of reforms. And the importance of advocacy in this context has been really illuminating: seeing the impact of the big tech and big business lobby talking about the chilling of innovation and costs on small business, at the expense of progress in favour of consumers and individuals.

Jordan
Yeah, no, for sure. I think that’s a really interesting and important observation. I mean, you mentioned the privacy paradox a minute ago, that idea that people say they care about privacy, but in their observed behaviours they still use Facebook, they publish stuff, they accept the surveillance. I feel like there’s a similar kind of problem.
One of the drivers of that is that privacy is really hard, right? It’s impossible not to use Facebook, and you can care about privacy, but, like, what are you going to do? I think there’s a really interesting analogy there to the reform problem, right? You can care, and it can be a priority, but it is so hard. It impacts every aspect of our economy. It impacts individuals. You’ve got the big tech lobby coming in from overseas. You’ve got small businesses. Solving privacy reform, it’s not like you’re just passing a new law. They’re really complicated issues. There are a million different stakeholders and a lot of really nuanced, difficult stuff in there. So that’s kind of my potted analysis of why it’s so hard.

Arj
Yeah, I think that’s a really interesting and good point. And it’s like, whether it was a folly to pursue this as one reform bill, one act, when you’re touching so many different stakeholders. And one of the other things I reflect on is just where privacy is grounded in Australian society as well, because we’ve talked about the comparisons with privacy as a human rights issue in Europe, and the strong tradition of rights-based activism there.
And we talked about Max Schrems in episode 65 and the role he’s had. And, you know, where does it sit in Australia? Do we see it as a consumer issue? Is it a rights issue? That, to me, has been interesting to learn about as well.
And I’m really curious, to bring it to sort of a future focus. We’ve got a new Privacy Commissioner, Carly Kind, who’s just started. And when she started, the OAIC welcomed her with a social media post which said: our people came together today to welcome Privacy Commissioner Carly Kind to the OAIC. Commissioner Kind shared that she’s passionate about seeing human rights brought into the policy and legal arena.
So I’m really curious to see what she brings to that. You know, what does that mean in terms of pushing the agenda forward?

Jordan
Yeah, no, that’s I think that’s really significant. I mean.
Historically, the OAIC, and this is going back some years, has been quite reluctant to use rights language, right? Because it often falls flat in Australian politics. So it’s super interesting to see that emphasis.
My kind of politics learning was trying to take the optimistic side of what you’ve just put, right? Which is that people do care about privacy, deeply and almost universally, across Australia. I mean, just anecdotally, that has increased as well. Ten years ago, I would tell people at a barbecue that I work in privacy and it would end the conversation, because, I don’t know, they had no personal experience of it, they didn’t really care. You tell them now and they’re talking about the Optus data breach, they’re talking about TikTok and data going to China and all of this stuff.
And there’s some awesome evidence of this. There’s the community attitudes to privacy survey that the OAIC runs. And there’s a fantastic research and public policy outfit called the Consumer Policy Research Centre that does awesome research on this too. And just on them, there’s their recent report on singling out, which I mentioned before, which they did with Katharine Kemp from UNSW.

Arj
Yeah.
I just saw something on LinkedIn where Katharine Kemp was on the Today Show or something, talking about that report. And, like your barbecue example, imagine that kind of conversation five years ago on the Today Show, on mainstream news: a very wonky policy topic about how organizations use data to single us out.

Jordan
Those CHOICE reports on facial recognition at stadiums, and at Bunnings and The Good Guys and stuff. Same thing, right? They’re making mainstream Today Show news.
There are stats like: eight in 10 Australians care enough about their privacy to take active steps to protect their personal information. 94% of Australian consumers surveyed by the CPRC, the Consumer Policy Research Centre, were uncomfortable about how their information is collected and shared online. That’s 94%. Getting a response of 94% from people doing a survey is wild, right?
You can ask any kind of vaguely political question and it’s, you know, 50-50, 60-40. 90% is almost universal. You know, one of my highlights over the last couple of years was sitting down in episode 74 and interviewing Angelene Falk, the then Information and Privacy Commissioner, now just Information Commissioner, about their community attitudes to privacy survey, breaking it down, and how it influences the way they work and the way they approach policy and complaints and so on.
I think there’s just a really valuable insight there that we need to keep front of mind when we’re talking about the Privacy Act reform stuff, which is that, almost universally, Australians are not happy with how privacy works and how their information is harvested and sold and managed and collected.
And, you know, the argument that you get from big tech, or from people opposed to reform, is usually: don’t worry, things are operating fine. And it’s just not true. Nobody likes the way that things are operating right now.

Arj
Okay, well, over to our final topic of conversation, our final learning from the 100 episodes, which is around cyber security. Cyber can often be a very technical or very national-security-oriented conversation. And one of the things I’ve found, as we’ve talked about it over the 100 episodes, is how much of it is actually a hard-nosed economics conversation.
Like, there’s such an economic imperative behind so many sides of this question. So, you know, the attackers, the world of cyber attackers and cybercrime groups. In episode 90 we interviewed Jonathan Lusthaus, who’s a sociologist out of Oxford, and really came to understand that it’s just a business. You’ve got really skilled technology people in developing countries with no other outlet to earn money, and so they’ve formed this entire industry, this scaled industry.
And so we’re not fighting individuals in basements, we’re fighting an industry. Trying to tackle an entire economy is really the way we should think about this.

Jordan
Yeah, and they’re in an office building with an HR manager and quotas and KPIs. You know, all of this.

Arj
Yeah, yeah. Whiteboards and wall charts and all of the stuff.

Jordan
A lot of the ransomware outfits have, like, customer support in case you can’t unlock your files, you know. All of this stuff. It’s wild.

Arj
Yeah. And this economic lens also came up in the context of whether you pay a ransom or not. So when the Medibank and Optus breaches happened, we talked in episode 39, “To Pay or Not to Pay”, about the pros and cons of paying and not paying a ransom.
And it’s often just framed as a moral choice. Like, we should not pay ransoms, we should not support these groups. But for some businesses, it’s a cold, hard calculation. They say, well, the cost of the ransom versus the cost of continued disruption to my business and loss of revenue: I’m going to weigh that up in financial terms.
So we’ve seen moves to ban payments as a result, but this is an ongoing debate, because in many cases businesses will make the financial decision.

Jordan
Yeah, I think that’s so interesting. With cyber especially, there is that big-picture view that you have to take, especially around policy, right? Thinking about the ecosystem, or the influences on that ecosystem. That’s why we’re talking about banning ransom payments, for example. But you also need to talk about the tiny, technical, specific situations of a particular organisation. These companies still need to do all of the million different operational things: have a culture of security, have technical protections in place, and all of these things.
That was kind of my learning in this area: that holding those two, the big and the little, in your head when we’re talking about these kinds of policy questions is really important. In all of the conversations we have, whether about cyber security or AI or privacy or e-safety, holding the big and the little as equally important is essential to coming to a sensible view. So, like you’re saying, with ransomware, banning ransomware payments makes so much sense in the context of that big picture. But then you look at how that applies in specific organisational contexts, where it might harm the affected individuals, or it might destroy a business because they don’t have backups and can’t recover. That’s where the challenge comes.

Arj
Yeah. And you have a policy imperative to get rid of ransomware gangs. But a policy to ban ransomware payments is not effective unless you’ve reduced the value of that commodity of stolen data. And one of the ways you do that is to encourage businesses to get better at protecting the data, or at having backups, so that the extortion is less effective. So they have to be held together, those two big and small things.
And I don’t know if we’re there yet in terms of the way we think about it at a public policy level. But it’s so interesting what you say: you’re solving it at that big national, and sometimes international, policy level, and then you’ve got this kind of control within organizations at a very granular level.

Jordan
One other episode where that big and little really came through to me as important was back at episode 50, one of my favorite episode titles, by the way: “Swiping Left on Techno-Carceral Solutionism”. We were talking about online safety in the context of dating apps, and how there was, at the time, this kind of push towards more identity proofing and surveillance. You know, there’d been some horrible violence against women that was in part facilitated by these apps.
And so the response was, well, should you have to provide a hundred points of ID to use Hinge or Tinder or whatever? And one of the things that came up in our discussion was how, on the one hand, you have to think about the small scale, the individual experiences, which are both positive and negative, right?
There’s awful stuff that goes on, but there are also whole communities for whom going out to a bar is not a viable option: queer communities in regional towns, for example, or people looking within particular ethnic or religious contexts. These apps can be super valuable for those people, and you’ve got to think about the impact on them of real-identity requirements and so on.
But you also need to think about the super large scale: how violence against women on those apps is not a product of the apps so much as a product of society in general.

Arj
Right. And gender norms and all that.

Jordan
Yeah. And when you’re looking at the large scale, it’s maybe not even really a technology problem. I mean, there are valid questions about how we can make it better or worse with technology.
But if we’re talking about violence against women, it’s not just a product of the apps; it’s an issue we need to deal with more broadly in society. And so, yeah, there’s that kind of technology problem: at the small scale, harming people, affecting people, benefiting people in some cases, but then at the large scale blowing up into this massive social problem.
For me, there’s that issue of, sometimes the stuff we talk about blows up into broader social problems, and sometimes it’s about the specific harm that a specific person feels, you know, understanding the impact of the Optus breach on a specific individual. I think that’s one of the things I’ve really enjoyed about this whole two and a half years, right? Like we’ve just talked about: the philosophical learnings we’ve had, or the political implications and the political challenges of privacy reform,
and the big and the little in terms of tech policy problems. One of the fun things about all of these episodes for me is how we usually sit down each week with a headline or a current issue and kind of start talking. We might not have a great idea of where we want to go when we sit down, and then all of these interesting problems, connections, complexities start falling out. And so I feel like I have a better idea of what this exercise is about now than I did maybe a hundred episodes ago, or 50 episodes ago, right? We’re trying to pull apart and understand the many different threads of the headline each week.

Arj
And a better understanding of how we see the world. I think, you know, inadvertently, I’ve found that you’re testing and evolving your own worldview through these technology challenges, which I didn’t expect.
Like, you think you’re talking about a particular app or a particular platform’s policy, and next thing you know, you’re sort of…

Jordan
Yeah, suddenly we’re talking about gender norms or whatever.

Arj
Yeah, you know, how you frame the world. Which has been really interesting and fun and useful. And, yeah, I hope we get to do it for another 100 episodes.

Yeah, with that, I think let’s round out the first 100. I mean, I’m a cricket fan, so let’s lift the bat in the air. Raise the bat, you know. But I think we should close with a note of thanks to everyone that’s listened along the way.

Jordan
Absolutely.

Arj
As I said at the start, we maybe wouldn’t have expected it, but we do have a healthy audience. We’re very grateful for everyone, and we’d be even more grateful if you could tell more of your friends about us.

Jordan
Tell a friend. Leave a rating in the App Store. Yeah. All the things, yeah.

Write about us on your university bathroom wall. That’s how we did things when I was a kid. Yeah, okay. Stuff like that.

Arj
Good one. Yeah. All right. Well, thanks Jordan.

Jordan
Thanks Arj

Arj
Appreciate you for 100 and yeah, look forward to 101 next week.

Jordan
Here’s to another 100 and yeah, talk to you again next week.