This week we discuss revelations from a US Department of Justice investigation into X (formerly Twitter), which raise questions over how X can comply with existing privacy orders given large workforce departures since Elon Musk’s takeover.
The situation provides insight into how much all organisations rely on well-resourced and skilled privacy teams to meet privacy compliance challenges – and into the growing difficulty of finding skilled people to make up these teams.
Transcript
This is an automatically generated transcript. We make our best efforts to check that it is an accurate reflection of the episode, but it may contain some errors and unedited content.
ARJ
Welcome to This Week in Digital Trust, elevenM’s regular conversation about all things tech policy, privacy and cyber security. I’m Arj, joining you today from Awabakal country.
JORDAN
And I’m Jordan, joining you from Wurundjeri country.
And Arj, it’s been a while since we’ve talked about Elon.
ARJ
Yes. Well, is that a good thing? Or are you saying that with some lament?
JORDAN
Good thing. It’s a great thing, something that I’ve been very happy with. And yet we’ve managed to get drawn back into Elon and X this week.
ARJ
It’s been a while since I’ve used the artist formerly known as Twitter, since I’ve used X. I mean, I’ve kind of been drawn back there, because people sometimes send you links that land you there, and you’re like, what’s this x.com? Oh, that’s right. But yeah, we haven’t really spoken about Elon and all these shenanigans with Twitter and X.
JORDAN
I’ve been the same. I haven’t really been using it either, and I’m always slightly surprised that there are still a lot of people on it. Oh yeah, right, this is still a thing. People are still posting, xing, whatever you call it. The thing that’s baited us back in is this FTC investigation into X. I’ve got “FTC v X” on my screen in front of me, and it just feels like I’m playing Scrabble: F, T, C, V, X. What am I going to do with that?
ARJ
It’s a very, very bad hand. Or, if you’re very good at Scrabble, probably a very good hand.
JORDAN
If there are any vowels available, I’ll be OK.
ARJ
Let’s unpack your Scrabble board a little bit. There’s been a new court filing from the US Department of Justice that reveals there’s been an investigation into Musk’s Twitter takeover, really unpacking, I guess, the chaos behind the scenes since that takeover, but very specifically this question of whether X now has adequate resources to protect user privacy. And this is in relation to its ability to comply with a previous administrative order from the FTC, the Federal Trade Commission, which is the US equivalent of our ACCC, the regulatory body that oversees antitrust and competition policy. They had an administrative order out against X relating to privacy, and this DOJ investigation is looking into that a lot more closely and basically saying there are serious doubts here about whether X would be able to comply with what’s required under that order.
JORDAN
There are a few things to unpack there, right? Because the US doesn’t have a federal comprehensive privacy law, and we’ve talked about that before. So predominantly, privacy in the US is regulated by the Federal Trade Commission, and predominantly it’s regulated as a misleading and deceptive conduct thing. The kind of things they go after companies for is promising they’ll keep data secure and then having a data breach or not investing in information security, or promising they’ll only use data for certain purposes and then using it for other purposes. That’s a misleading and deceptive conduct kind of issue rather than a strictly privacy issue, but the FTC is effectively the privacy regulator in the US. And this all dates back to well before Musk’s acquisition of Twitter. Back in 2010, Twitter had a couple of data breaches and got told off by the FTC for those. The other detail that’s relevant here is that the FTC doesn’t really have powers to fine companies directly for breaching these laws. What they do is investigate a company, and then they effectively enter into an undertaking, a kind of contract, under which the company agrees that it’ll do better, and the FTC can fine companies for breaching those undertakings. So as a company you kind of get one bite at the apple, one free pass: the first time you stuff up, you’re not going to get fined, but you’ll get pushed into this undertaking, and if you stuff up again, you’ll get a fine. In 2022, the FTC went after Twitter again for breaching that 2010 undertaking. Twitter had been using phone numbers provided for second-factor authentication for targeting advertising and that kind of thing. So they got yelled at by the FTC, fined $150 million, and put under this enhanced consent decree. They promised to do all these security compliance and privacy compliance things: notify the FTC about incidents within 30 days, do all these reviews and assessments from third parties, and have a comprehensive privacy programme. That’s all the background. This happened in 2022, shortly before Musk purchased Twitter, so Twitter had essentially committed to this really enhanced privacy and security programme, which is actually quite difficult to implement and guarantee, and we’ll probably get to that later. And then Elon Musk turns up and sacks something like 90% of the workforce, his head privacy officer leaves, all sorts of things start happening, and the FTC is sitting there wondering: hang on a minute, it doesn’t look like you’re going to be able to deliver on all this stuff you’ve promised us in settlement of your previous breaches.
ARJ
Yeah, that’s it. And this court filing involving the DOJ basically says the FTC was forced to ask these questions because there were these sudden, radical changes. So, as you say, you’ve got this existing order, X has committed to run this enhanced security programme, and then suddenly there are all these news reports of Musk sacking half the Twitter workforce, and also, very specifically, heads of security and heads of privacy resigning. The parallel I was thinking of, just to localise it: CBA, our Commonwealth Bank, entered into an enforceable undertaking in 2019 with the Australian privacy regulator, the OAIC, relating to a couple of very serious privacy and security incidents. As a result of that enforceable undertaking, they committed to a four-year programme of work, which I think is still going, or perhaps nearing its conclusion, to do a whole bunch of stuff: enhanced policies, standards, controls and whatnot. It would be as if they had made that commitment, and then somewhere in the middle of it you were hearing about them just laying off all their privacy and security people. You would imagine the OAIC would be going: how are you going to meet these obligations? So that’s kind of what’s playing out; that’s what this filing reveals. And what’s interesting is that there’s, I guess, an element of omission and commission here. There’s the risk that if Musk is laying off all of these very important teams and people, then clearly they’re just not going to be able to get the job done, the general privacy compliance work and security work that needs to happen. I think in this filing there’s a quote from the chief privacy officer saying that the firings and layoffs meant no one was responsible for about 37% of X Corp’s privacy programme controls. 37% is a fairly hefty chunk, I would say. But then there’s also all this stuff about very specific things Musk directed happen while he didn’t have those provisions and teams in place. For example, during the whole Twitter Files business, where he essentially gave a journalist access to a bunch of internal data to talk about how Twitter had not complied with free speech principles before he was in charge, there was talk of him just giving this journalist open access, with no limits at all, and you had internal people saying that’s actually not a good idea from a privacy and security perspective. And then other things, like moving data centres without going through the protocols: where are you moving it, what are the controls?
JORDAN
Yeah, and I think there were some reporting requirements attached to that as well, right? If you’re making a material change, they have to have done privacy and security assessments and informed the FTC. There were also three really senior privacy executives who left a day before Twitter had a deadline to submit some information to the FTC, which I think is particularly relevant when you consider that misleading the FTC, or misrepresenting your compliance, is a criminal offence. Three executives resigning immediately before particular representations were made... yeah, look, it’s looking pretty suspicious.
ARJ
And fair enough, I would say, from the perspective of the FTC.
JORDAN
Yeah, yeah, for sure.
ARJ
You know, this whole situation in the States is quite polarised. I mean, the fact that they’re even having this investigation into X has been characterised by the Republican side of politics as some sort of witch hunt, with plenty of outrage. But there’s actually an interesting issue at the heart of this story, which is: what does it take, in terms of resourcing, and what kind of privacy teams do you need in the current environment to get the job done? Because that’s really the heart of this, and it’s what triggered the investigations from the DOJ and the FTC: OK, you’ve laid off all these key people, so you’re not going to be able to comply with what we expect of you from a privacy and security perspective. So I think we wanted to explore what the scope and requirements of a privacy team within a large organisation actually are today, given all of the regulations around the world and all of the data organisations are collecting. It’d be great to get your sense of this, because you’ve seen it in different organisational contexts. What does privacy resourcing look like?
JORDAN
Yeah, I mean, this is what we do, right? So it’s probably a little self-serving to say it’s hard, but privacy is hard. It is complicated. It is not just one or two people reviewing the occasional contract or giving some advice on a particular technology. The difficulty, I think, comes from the complexity of organisations and the complexity of regulations. If you’re running a digital platform that is global, well, we’ve often talked about the varying privacy standards, and it’s not just privacy: there’s safety, free speech, laws about criticising governments, hate speech, content moderation. You’ve got all of these often competing objectives, which are often quite different in different countries, and then you’ve got a wildly complex technology stack that you’re trying to make sure complies with those varying standards and requirements, which are often quite hard to translate directly into a system, to make sure that information is only used for certain purposes and so on. And corporate IT systems, in technology companies in particular, are much, much more complicated than you’d first think. As soon as you lift up the hood on a corporate IT system, there are literally thousands of different servers and pieces of software and systems, all connected and talking to each other. So trying to make those two things talk to each other, a set of many different legal obligations that are often quite vague standards, like reasonableness or fairness or safety, and a really complex technology stack, is just a really hard problem. Looking at X from the outside, it is very much not surprising to me that the FTC is looking at a team of a handful of people and saying it’s just not possible to do.
ARJ
And that’s the situation with X: you can imagine they’ve got a whole bunch of data and a whole bunch of requirements from all the jurisdictions they operate in. But then if you think outside the context of X, large businesses generally are probably being asked to implement new technologies all the time. There are vendors coming in and saying, oh, here’s my latest AI thing, or here’s my latest SaaS platform of some kind. And then, again, I imagine you’re having to look under the hood of these complex technologies, things that are completely new, and understand how they work, what they’re doing, where the information is flowing, and then what the compliance obligations are.
JORDAN
Yeah, an increasingly important part of the privacy profession is what we call privacy engineering, which is really that more technical role. It’s not just a lawyer; it’s someone who understands the law and the privacy requirements, but who also has some kind of engineering or IT systems design background. They can read and write code, and they can engage with an engineering team at quite a deep level to make sure the inner functioning of a thing meets those requirements. That’s increasingly difficult with AI and so on, which might be black boxes, where they might not be able to give you an assurance about exactly how it works. We had an episode a long time ago, episode 14, where we looked at an internal engineering briefing from Facebook, the message of which was essentially that Facebook’s data structures and flows for targeting ads are so complex that they’re essentially ungovernable: Facebook itself looks at the way they’ve set up their data flows, and they are not able to provide a guarantee that, say, this information is not used for that purpose. And that’s one of the really fundamental privacy restrictions, right? If I’ve collected your information for a certain purpose, I need to be able to give you a guarantee that I’m not going to use it for another purpose. But tracking that through their complex data systems... they don’t have enough controls, they don’t have enough visibility of exactly how it works, because it’s just so complex.
ARJ
I remember the bottle of ink analogy in that paper. The bottle of ink is all of the user PII, all the data they’ve collected about individuals, and the analogy in the Facebook context is that someone has poured that bottle of ink into an open lake, the open lake being the breadth of their systems and their company. And then suddenly they need to know where the ink has gone.
JORDAN
Exactly. All of these things, like “don’t use this data for that purpose”, sound really quite straightforward, right? OK, just make sure everyone knows not to use that data for that purpose. But expand it out to an organisation of thousands or hundreds of thousands of people. Expand it out to thousands of different pieces of information, each with a different set of usage restrictions, which apply differently in different countries. Expand it out to each of those people having their own laptop, with maybe some data stored in Microsoft Office, maybe some data stored on some online cloud platform, and they’re pulling data and mixing it and using it for different purposes. Well, now you’ve got composite data: which restrictions apply to this composite data? Unless you have control over it right at the start, and unless you build these systems so that the data permission goes with the data, applying even a simple restriction like “do not use this data for that purpose” can get wildly complicated as soon as you apply it to a company.
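[To make Jordan’s point about the permission travelling with the data a bit more concrete, here’s a minimal sketch of the kind of “sticky policy” tagging he’s gesturing at. This code isn’t from the episode or from any real platform: the TaggedData class, the purpose names and the intersection rule for merged data are all hypothetical, illustrative choices.]

```python
from dataclasses import dataclass
from typing import Any, Callable, FrozenSet

@dataclass(frozen=True)
class TaggedData:
    """A value bundled with the purposes it may be used for (a 'sticky policy')."""
    value: Any
    allowed_purposes: FrozenSet[str]

    def use(self, purpose: str) -> Any:
        """Release the value only if the requested purpose is permitted."""
        if purpose not in self.allowed_purposes:
            raise PermissionError(f"data not permitted for purpose {purpose!r}")
        return self.value

def merge(a: TaggedData, b: TaggedData,
          combine: Callable[[Any, Any], Any]) -> TaggedData:
    """Composite data inherits only the purposes that BOTH inputs allow."""
    return TaggedData(combine(a.value, b.value),
                      a.allowed_purposes & b.allowed_purposes)

# A phone number collected for second-factor authentication only
# (the scenario behind Twitter's 2022 FTC fine):
phone = TaggedData("+61 400 000 000", frozenset({"2fa"}))
print(phone.use("2fa"))   # OK: releases the number
phone.use("ads")          # raises PermissionError
```

Even this toy version shows where the difficulty lies: the guarantee only holds while every system that touches the data keeps the tag attached, which is exactly what becomes unmanageable at the scale described above.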
ARJ
So I’m interested to understand, then: what’s the skills profile in these teams?
JORDAN
It’s a good question. When I started in privacy some time ago, it was largely a legal kind of discipline; I started as a lawyer. That’s changed a lot. So to answer the question, I’ll take a little step back and talk about the range of work a privacy team does, and then you’ll see the variety of expertise involved. For starters, as a privacy team, your job is to ensure the company complies with privacy laws. But often there’s also a role related to reputation, marketing and comms that privacy is part of: if you get privacy wrong, yeah, you might get a fine, but you’ll also have reputational impacts, so you’re looking at public perceptions and public understanding. It’s not just legal advice, even from the start. Then you’ve got to apply a programme and a consciousness of privacy across an organisation, so you’ve got to look at the big-picture culture and values of the organisation. What are the decision-making structures? What are the processes? That brings in comms expertise, like your own; we often work together on that kind of thing. You’re also bringing in organisational design and corporate governance, in terms of who gets to make a decision and where the processes sit. There’s often a risk component, and risk management is a whole profession of its own: you look at potential privacy failings through a risk lens. What’s the consequence? What’s the likelihood? How does that justify investment? Then you’ve got the traditional legal stuff: contract reviews, design advice, privacy impact assessments, which might be quite legalistic. But then there’s this thing called privacy by design, which is really the best practise of how we apply privacy restrictions. It’s not usually just a one-off assessment; it’s about embedding an expert in a team, understanding their objectives and design considerations, and working with them throughout that product development process.
ARJ
The team being, like, a technology team?
JORDAN
With a technology team, or maybe it’s a service delivery team, or, you know, a physical product design team where the product collects data. So you need an understanding of that kind of product development or engineering or technology function. Then there are audits, reviews and assurance on processes, so you might have auditors or that kind of expertise.
ARJ
That’s a whole different skill set again.
JORDAN
You know, there are like 10 different professional skill sets in that list of tasks, right?
ARJ
Yeah, and I was going to add that, increasingly, these teams also seem to be the stewards of the social licence that organisations need to do a lot of this stuff. There’s this kind of ethical role now. Businesses want to do something, and in addition to all of what you’ve described, knowing the law and being able to advise on it, there’s public sentiment. Facial recognition is a good example. We don’t necessarily have a really good law that governs it at the moment; privacy law has sort of been applied to it. But just because you comply with what the privacy law allows you to do, to the letter of the law, doesn’t mean you’re not going to fall afoul of public sentiment and the social licence your organisation wants to maintain. We’ve seen that happen. And it’s really the privacy advisors, all these people in these teams, who are trying to advise on that and say: look, here’s what you need to consider from a public sentiment perspective as well. So that seems like yet another thing that makes it quite a complex team to fill.
JORDAN
Yeah, it really is a very multidisciplinary area. It’s this whole set of different skills that, again, need to be applied, especially if you’re a multinational, across quite a complex patchwork of requirements in the different jurisdictions you operate in.
ARJ
You’ve just laid out quite a complex set of skills required for these privacy teams. And we clearly need more of these people and these skills within organisations, as they get bigger, as they collect more data, and as privacy and data protection laws are passed; we’ve got a bunch of reforms on the cards here in Australia as well. We know anecdotally that it’s hard to hire people with the right skill set, given what you’ve just described. But there’s also data out there that reflects this. The IAPP, the International Association of Privacy Professionals, estimates there will be a need for 1.5 to 2 million privacy professionals in the next 5 to 10 years. So we have this emerging need, and we have an existing shortage, but it doesn’t really seem to be on the public policy agenda. We don’t tend to talk about privacy skills the way we talk about IT skills or cyber skills. It’s sort of a hidden problem.
JORDAN
One answer, I think, is getting awareness of privacy as a career out there a bit more thoroughly. Having a podcast that people can listen to and realise how much fun it is, maybe. But it’s about getting more engagement with privacy as a career, and I think that cross-disciplinary aspect is exciting more than it is daunting, for me anyway. Nobody can be expected to be an expert in the 20 different fields that privacy touches on, but you get to sit down and work with each of those people and get their advice. In my day-to-day work I sit down with comms people, brand people, engineers, lawyers, programme managers, procurement people, risk professionals. They talk me through their bit, and I bring the privacy concerns. There’s not one person who’s the expert in all of these things, right? The skill set that’s required is the ability to engage with, understand and incorporate all of those professional areas, maybe a jack-of-all-trades kind of situation, without necessarily being a 10-year expert in any one of them.
ARJ
And I think what you said before about even just conceiving of it as a career path matters, helping people, particularly students and young people, understand what a professional privacy role looks like and what it actually means. I think cyber has done this quite well. They’ve been on that journey for 10 or 15 years. Even 10 years ago, people going through school knew about the pathway to becoming a Mark Zuckerberg; programming became all the rage, and everyone understood that was a career path. But the idea that cyber security was a career path was something that had to be built, from the narrative up to the courses and the talent pipeline. In the same way, I think there are people who are interested in technology and the modern digital world, but who also have a grounding in rights, human rights, or a broader sense of fairness and justice. And it’s at that nexus where you can sit within an organisation, talk to people about technology, but talk about applying it in a way that preserves those rights and that fairness. To me, that’s a good story to attract people with. But I think the story around privacy has, for the longest time, probably been that drier legal-compliance story. So there’s a bit of branding to be done.
JORDAN
I think there’s maybe an element of that cyber story that we can pull on, right? Cyber security is appealing because there’s the security aspect, the safety aspect: making sure our banks are safe, our borders are safe. And I think there’s an analogous, but more values-based and rights-based, sell that you can make for privacy: that it’s about keeping our technology-driven lives aligned to our values, and keeping people safe within them.
ARJ
I think there’s more work to be done there, but in the meantime it’s probably good to keep watching what’s going on over in the States, because this investigation has given a bit of profile to the importance of privacy folk within organisations. That’s what attracted our interest, so we’ll watch it with interest. But yeah, it was good to break it down with you.
JORDAN
Yeah, we’ll see how that progresses. And we’ll do our best to keep building the Australian privacy industry. Tell a friend about privacy; that’s your homework for next week, if you’re listening.
ARJ
I like it. I like it.
JORDAN
Yeah, thanks very much. See ya.
ARJ
Good one. Alright, with that, I’ll chat to you next week.
