This week in digital trust » Episode 86

#86 Johanna Weaver – the future of tech policy

14 November 2023

The importance of tech policy as a subset of public policy has emerged forcefully into the spotlight in recent years.

With new technologies rapidly transforming societies, countries and governments all around the world are now grappling with the best way to shape these technologies to serve our collective long-term interests.

This week we sit down with Johanna Weaver, Director of the Tech Policy Design Centre at the Australian National University, to discuss this important area of policy.

We explore Johanna’s perspective on current tech policy challenges, the Australian approach to tech policy, and learn more about the Centre’s work.

Listen now


This is an automatically generated transcript. We make our best efforts to check that it is an accurate reflection of the episode, but it may contain some errors and unedited content.

Welcome to This Week in Digital Trust, elevenM’s regular conversation about all things tech policy, privacy and cyber security. I’m Arj, joining you today from Awabakal country. This week we’re featuring an interview with Johanna Weaver. Johanna is the director of the Tech Policy Design Centre at the Australian National University.
For those that aren’t familiar, the Tech Policy Design Centre is focused on maturing the tech policy ecosystem in Australia with the goal of shaping technology to support the long-term benefit of humanity. Our conversation was a great opportunity to hear Johanna’s perspective on current tech policy challenges and to learn more about the Centre’s work. Hope you enjoy!
Johanna Weaver, welcome to This Week in Digital Trust.

Johanna Weaver
Thanks so much for having me. I’m really excited for the conversation.

Likewise, really excited to have you on. We’ve wanted to have you on for a long time. We first probably came across your work through your podcast. You’ve got a great podcast, Tech Mirror, which I highly recommend for all our listeners. But from that, Jordan and I have closely followed the Tech Policy Design Centre, which you run at the Australian National University and are the founding director of. We’re looking forward to talking more about the Centre, but I’d love to just start by understanding a little bit more of your professional journey: how you came to do what you’re doing and to start the Tech Policy Design Centre.

Thank you, Arjun. It’s actually a real treat to be on the other side of the podcast microphone and being the guest. So thank you so much for the invitation to have me on.
You know, I think like you, I never set out to have a career in technology policy. I started my career as a commercial litigator. I then became a diplomat, just a standard run-of-the-mill diplomat. And after I’d done a diplomatic posting, I decided I wanted to do something different. And I’d read Lawrence Lessig’s book, Code 2.0, which if you haven’t read it, listeners, you absolutely must. It’s a seminal book.
It’s from 2006, but it still stands the test of time. And so I decided I was going to specialise in what I called at the time cyber policy, which I would now call tech policy; it just shows the evolution of the lexicon. And then I ended up actually back in foreign affairs. I never intended to go back into the foreign service, but I was offered a position to help set up Australia’s cyber diplomacy practice in advance of the appointment of our first Ambassador for Cyber Affairs, Dr Tobias Feakin. And so I then spent about five years working in various different capacities within the Foreign Service looking at technology policy from an international perspective. And I finished up in a role as Australia’s independent expert to the United Nations, where I was our chief negotiator on two cyber processes that were going on.
And that meant I spent a lot of time talking to governments all around the world about the challenges of regulating digital technologies and the urgency of doing it. You know, it didn’t matter whether I was talking to the Russian representative, the Chinese representative, a representative from Indonesia, from Singapore, from South Africa, you name it. Every government was grappling with how to do this well and how to do it properly. But there weren’t…
really any institutions looking at how you actually uplift the capacity of governments to be able to do this well. And to circle back to the start of this conversation, where we were saying that we’ve all entered this field without deliberately setting out to do so: there isn’t that body of knowledge within governments on how to regulate this stuff. And, you know, I fundamentally reject the proposition that we can’t regulate it. But I do recognise that we need to provide the supporting infrastructure to uplift that capacity. And so that really was the motivation to establish the Tech Policy Design Centre at the Australian National University. And I’m happy to talk a little bit more about our mission and what we’re setting out to do.

Yeah, let’s do that. I’d love to actually start by asking about the name: what does the Tech Policy Design Centre actually reflect? And then, if you can, tell us about the mission and what it’s all about.

I love that question. I was just reflecting actually with a friend and colleague last night.
Two years ago, when we were talking about the name for the Centre, I was very deliberate with the words that we used. Tech policy is quite common in the lexicon now, but when we chose it two years ago, it wasn’t; people were still largely talking about cyber or emerging technologies, these types of things. And we wanted to have the breadth to grow outside of digital technologies over time. And so that’s why we chose technology.
Policy, because we’re looking at policy in its broader sense: law, regulation, strategy, policy, the full breadth of those issues. And design, I think, is the most important word in the name. We chose it for two reasons. One, because we’re not here to admire the problem, we’re here to co-design solutions, and we really wanted the name to reflect the active nature of the work that we do. And also to imply creativity: we can’t keep applying the same old solutions to these new and novel problems. We actually need to think creatively about how we’re going to address them. And so that was the combination: tech, policy, design. Centre is, I guess, the most boring word of the four, but it’s fairly self-descriptive of what’s on the table.

You’ve got to be somewhere, I guess.

Exactly. Also, on our mission: our mission is to shape technology for the long-term benefit of humanity. And for us, this was about recognising that law, policy and regulation are all tools to shape technology. We so often look to the technologists to shape the technology that’s shaping our world, and part of my observation was that we’re not paying enough attention to the role that policy, law, regulation, et cetera, is playing.
It’s about how we use all of the tools in our arsenal to shape technology for the long-term benefit of humanity. And it really is important to put humans at the centre of this, because obviously the technology is shaping our world and we want the technology to be making our lives better. We don’t just have to accept the tech that we’re served up with. We have agency to shape the technology in different ways if we want it to be made differently. And so, I guess, at the core of what we do is helping people get more engaged in these conversations about what they want from technology in the future.

I love it. I love it because it also parallels, I guess, the development of technology. The concept of design is something that technologists understand: if you want to get this thing to work the way you want it to work, you need to think very carefully and deliberately about design. And so that should be the same for policy. If we can get a little bit more academic with it, I’d love to dive into what we actually mean by tech, because in theory you could have tech policy considerations in every part of the economy. And in some ways, I think the AI regulation conversation is a good example, at least it seems to me to be. We’ve had this discussion recently in Australia around safe and responsible AI, and the discussion paper and process kicked off by Minister Ed Husic. And one of the themes around that was: do we need something bespoke to regulate AI, or does AI get regulated and attended to from a policy perspective by the fact that we’ve already got laws around privacy, around anti-discrimination, around copyright, around workplace relations? So what is tech policy as a distinct thing, given technology is everywhere?

Ultimately for me, my mission is to get to a point where people recognise tech policy as public policy. Because as you say, technology is embedded in everything that we do, and we need our public policy to take into account technology in all of its forms in every aspect of our society. But I also recognise that we’re not there yet. In general, our public servants haven’t been given the skills and the toolkits that they need to be able to consider technology policy as part of public policy. And so what we’re looking to do at the Tech Policy Design Centre is to say: how do we provide that stepping stone? What is the education, what is the research, what are the frameworks that we need to put in place to facilitate that evolution? Nicholas Davis, one of the co-directors of the Human Technology Institute at UTS, wrote an article saying that tech policy is actually just policy. And I very much subscribe to that view.
In terms of the scope of what we do, one of the tools that we’ve developed is something called the Tech Policy Atlas. And if listeners just Google Tech Policy Atlas, you should be able to find it relatively quickly. It currently covers 36 jurisdictions around the world, and it is a really good marker of what we consider to be technology policy. It includes everything from competition policy, to regulation of artificial intelligence, to directors’ duties, to regulations or opportunities for promotion of investment. It’s really broad, because technology is so broad in our society. What we say is basically that technology policy is any element of public policy that involves technology. In our broad definition, we don’t limit that to digital technologies. But in most of our work, because we need to bring focus to what we’re doing, we do focus on digital technologies. So we have that breadth.
And then, to your point about regulating artificial intelligence and the questions posed in the discussion paper that was put out by the department on responsible AI: the first thing I would say is that it was really notable to me that the discussion paper contained an explicit statement that existing laws already apply to artificial intelligence.
There aren’t a lot of governments who have come out and made that policy statement so expressly, so I really commend the government for having done that as a first step. And for listeners who are really interested in this dichotomy between existing laws and where we need new laws, I’d also recommend a submission that was made by DP-REG, the Digital Platform Regulators Forum, which includes the ACCC, the eSafety Commissioner, the Australian Communications and Media Authority and the Office of the Australian Information Commissioner. They put in a joint submission, which is actually quite interesting, right, to have regulators putting in a joint public submission in response to a government call for submissions. But that submission sets out the views of the regulators on how some of these existing laws apply. It’s an incredibly valuable and rich contribution to this conversation.
I don’t know that enough people have paid attention to that. So if people are looking at how the ACCC is going to interpret existing competition law, have a look at this submission, because it goes into some of that detail. What it also does is identify some of the areas where the regulators see gaps. So my principal starting point when we’re talking about regulation of technology is: existing laws apply, and largely those laws will provide the frameworks that we need.
It may not be that the laws themselves have everything that we need in them, but the first principles that guide the laws, the harms that the Competition Act is trying to prevent, the obligations that are imposed on directors via directors’ duties, will largely stand the test of time. It’s a question of what we need to do to make them applicable at the speed and scale that the evolution of technology demands. So I’m very much one of the people who subscribe to: existing laws apply, don’t throw the baby out with the bathwater and create something new. But we also need to really urgently look at where the gaps in those existing laws are and where we need to update them.

There are a couple of themes there around coordination. I mean, DP-REG is obviously a great example of that from a regulatory perspective, but it’s something that I know you’ve focused on a lot in your work. Could you talk a little bit about the barriers that you’re seeing to getting good tech policy outcomes at the moment?

It sounds a bit boring and bureaucratic to talk about coordination of policy processes, but it’s so vitally important. If you work in a field of established public policy, like health policy, education policy, or national security and intelligence policy, when there is a new challenge in that field or you’re having an evolution of policy, there are really well-established processes and muscle memory for coordination among all of the different agencies and bodies that are relevant and have expertise in that particular area. But that muscle memory just doesn’t yet exist when it comes to developing tech policy. And so you see tech policy being developed in silos in a way that just doesn’t happen in other areas of public policy.
So, for example, look at probably three of the biggest areas of reform that are on the table at the moment: digital identity, cyber security and privacy. These are all reporting to separate government ministers, led by different government departments, in some cases multiple government departments, and going through different approvals processes within the Australian Public Service. So the committee processes for approval, before it gets to the politicians, before it goes to parliament, are different processes with different senior bureaucrats approving and involved. And yet privacy, cyber security and digital identity are inherently entwined, right? Privacy without cyber security doesn’t exist in the modern age that we’re living in. And so what we have promoted, through the work that you’ve spoken about previously, is a report called Cultivating Coordination, which proposes a model to streamline coordination from the ministerial level through departments and regulators, through better coordination among the regulators themselves separate from the departments, and then also looking at how we actually upskill and improve coordination across the public service.
And that really came from a bunch of research that we did in an earlier report called Tending the Tech Ecosystem, which identified a number of issues and barriers to effective tech policy that were presented as almost intractable. Things like knowledge asymmetry: how can government have the knowledge and expertise that it needs when this technology is evolving so quickly and the cutting-edge knowledge rests outside of government, mainly within the industry that you’re trying to regulate? The trust deficit between government and industry is really quite large. In many other regulated industries you still have an independent, powerful regulator, but there isn’t quite the trust deficit that there is around technology, particularly when we’re looking at regulating large digital platforms. So the coordination piece that we developed in that second report, Cultivating Coordination, was really looking to say: well, look, these problems may be presented like they’re intractable, but they’re actually not. We’ve solved these problems in other fields. The financial sector is an incredibly complex sector with global reach and implications that evolves really quickly, and we’ve got structures in place that allow us to regulate the financial sector. So what can we learn from those other sectors and bring into the regulation and broader governance of technology policy? And that was the model that we proposed in Cultivating Coordination.

There’s a kind of common framing you often hear when you hear this conversation globally: the US has a more laissez-faire, market-driven, hands-off approach, you know, don’t trample innovation, that’s kind of its approach to tech policy. But then if you look at the EU, it’s more interventionist and rights-based, and there’s a willingness to regulate. You’ve had such expertise and experience as a diplomat talking about these things with international counterparts. Is there a view of what the Australian approach to tech policy is? Do we have one? I’d love to get a sense of that, and also how mature you think our capability is.

So, look, I think Australia often overestimates the value of being first and underestimates the value of getting it right. And the other thing we underestimate, which is why getting it right is so important, is the extent to which countries look to Australia as an example. The US actually isn’t a good example of good policy or regulation, but even if it were regulating effectively, its size and scale mean that it’s not something many countries could duplicate.
Likewise, the size of the EU market is not something that many countries have, and so they can’t necessarily mirror exactly what the EU is doing, or at least not in advance of the EU having done it. Australia, by contrast, is an advanced economy with a relatively small population.
And we have some amazing tech innovators in Australia, but we’re not a powerhouse of tech innovation in the world. It means that lots of countries look to us and say, well, actually, we’re a bit like Australia too, so what is Australia doing in this field? And I think what we’re doing well, and where we have a lot of maturity when I look back from, say, five years ago, and Clare O’Neil, the Minister for Home Affairs, makes this point a lot: Australia has a functioning parliament, and there aren’t actually a lot of democracies in the world that have that. As much as we might critique the parliamentary process in Australia, it is actually still a functional process, and I think we need to value that more.
We also, in this particular government, have ministers that have deep expertise in these issues, a background in them and an interest in getting it right. That is also incredibly valuable, and certainly not a given going into the future. We would hope that it would be, but it’s not a feature of many ministers around the world in this space. So that combination is really important. We are also quite mature in our regulators.
So if you think about the ACCC’s digital platforms work, which has been going on now for many years, initially led by Rod Sims and now by Gina Cass-Gottlieb, the team leading that has built up real expertise in competition and digital platforms. So I think that’s one example. The other example in the regulatory space, obviously, is the eSafety Commissioner, Julie Inman Grant. Again, people can critique the merit or the substance of what these regulators are doing, but they have really built deep expertise in this space. So I think that’s another area where Australia is out in front.
If I’m being really honest, and this is not intended as a critique of any individual, and I very much understand why it is the case: if you compare the depth of expertise that sits within our government departments, so within the Australian Public Service, as opposed to the regulators, we don’t have as much expertise. And I think that often creates a little bit of tension, in the sense that the regulators are independent and their job is to implement and enforce the policy, but not necessarily to develop the policy; that’s the job of the government departments. So this is one area where the Tech Policy Design Centre is looking to help uplift capacity, and we’re in the process of designing some professional development courses in that field. And I think most government departments are quite open about the fact that this is an area of expertise that they want and need to develop. So I say that in a very constructive way.
So I think Australia brings more maturity than many jurisdictions. That doesn’t mean we’re fully mature in this space yet.

Yeah, okay. I think that’s very encouraging to hear. The comment you made about being first versus getting it right resonated with me, because I’ve heard that more recently: oh, we were the first one to introduce X bill or do this, and it wasn’t the right one. Can you maybe tell us a little bit more about the Centre’s activities and work, and what you’re doing to develop that maturity even further?

Yeah, so, I mean, we do a lot of work with government departments and also with politicians. We take a very proactive approach. So we look at it and say: what are the issues that you are grappling with? Rather than asking how you solve the problem of regulating artificial intelligence, the framing that I have is: how do we give politicians and public servants a toolkit so that they can be thinking about the regulation of whatever technology we’re talking about? Because artificial intelligence is obviously the topic du jour, but technology will continue to evolve. And so it’s about actually giving people the conceptual frameworks to take these issues forward.
So some of the work we’ve done is developing a tech policy process. If you go to our website and navigate through the projects, there’s one called the Tech Policy Design Kit, and in that design kit is a tech policy process. This came out of a partnership we had with the Department of the Prime Minister and Cabinet’s Digital Technologies Taskforce, which has now moved to the Department of Industry, Science and Resources, in partnership with the Tech Council of Australia. In my early conversations, largely introductory conversations, people said there isn’t actually a deep understanding of what the process of developing good tech policy is. And so we went through this iterative process of asking: if you’re going to develop tech policy well, what does that process look like?
And it’s kind of a choose-your-own-adventure process. It’s quite a fun, click-through page with a bunch of questions and a bunch of outputs. And if you can’t answer the questions, then it directs you to proceed to independent inquiries and budgets and all sorts of fun stuff.
And so that’s one example. We also developed a set of principles that should guide the development of technology policy. But ultimately what both of those processes revealed to us and the conclusion that we got to in the end is that the process for developing good technology policy is actually just the process of developing good policy. And the principles that apply to developing good tech policy are just the principles that apply to developing good policy, which sort of brings us back to the start of the conversation.
But we’re also in the process of developing, as I said, professional development courses. We do a lot of proactive engagement up at parliament as well, and a number of different research projects and bodies of work, including the Tech Policy Atlas. The purpose of the Atlas is that people can find the laws and regulations from all around the world and bring that comparative approach to their work. If we’re talking about regulation of artificial intelligence, with a few clicks in the Atlas you can have six different jurisdictions and all of the different approaches to artificial intelligence regulation around the world. So we’re really just looking to provide very practical, implementable solutions and tools to uplift capacity across the whole tech ecosystem.

A lot of our listeners are in the private sector and industry. Do they have a role to play? I mean, I think, you know, we often think about their role in terms of, you know, putting a submission to an open consultation, but is there more that industry can and should be doing on tech policy design?

Oh my gosh, absolutely yes. Everything we do at the Tech Policy Design Centre, we do in partnership with government, industry, civil society and academia. Obviously, different bodies of our work are funded by different parts of that trifecta, quadrifecta, I don’t know what the right word is there. But regardless of who’s funding the work, the work that we do is always in collaboration with industry, government and civil society. And people often look at centres like mine and say: where do you sit on the spectrum? Are you pro-government? Are you pro big tech? Are you anti big tech? People try to place you on the spectrum.
My response to that is that we are very neutral. What we’re looking at here is that we need to collaborate with everybody if we’re going to get the right outcomes. I do genuinely believe you have to engage with industry to be able to develop good tech policy, but we also need to upskill and uplift the capacity across government, for example. And civil society is a voice that is not involved enough in these conversations and really needs to be amplified.
And we’ve worked a lot, for example, through the Tech Council, but also with the AIIA, with DIGI, with a number of industry organisations. And to your point, Arjun, about the submissions: when we developed that tech policy design process, it ended up with 11 steps, which I really hate because it’s an odd number. I wish it had been 12 or 10. Anyway, it has 11 steps, and what was really interesting about it is this.
Most people outside of government, so from industry, or to a lesser extent from civil society, thought the tech policy design process started at step six, which is the consultation process that you’re talking about there. But their ability to engage with and influence the process is much more significant in stages one through five, which is the actual policy design process.
By the time you get to consultation, you’re kind of, you know, tweaking words around the edges; you’re not tweaking the foundational principle issues. And so what we are really looking to do is get all of these actors more involved in the process throughout, including through the education courses that we’re about to develop. They’ll be co-designed and developed with industry, government and civil society, as well as co-delivered. And if people are interested in that, we’ve got a mailing list that you can subscribe to through our website. The more people we have engaged in this process, the better. And I really appreciate the opportunity to come and talk about our work today, Arjun.

No, thank you so much. And I just echo that: I really recommend listeners check out your work, including things like the coordination model in one of your reports that lays out the different levels. So in addition to the process you described, where you can get a sense of how early I, as a person in industry, could or should get involved, versus leaving it too late, there’s also a map of the different levels at which I can make a contribution. Because there are those involved in the design, but then at the political, ministerial level there’s a different set of imperatives that you might want to shape the thinking around. So it’s really instructive just to get that map.
So thank you so much for sharing all your wisdom around tech policy and the great work that the Centre’s doing. I look forward to seeing it develop, and hopefully we can continue the conversation. Before we wrap, though, are there any particular resources or links that you’d like to encourage people to go check out?

Look, I really would encourage people to look at our website.
And that has links to the podcast, to the Atlas, to the design kit, to the process, all of the things that we’re talking about. So it’s a really good central repository.
And if you want to follow what we’re doing, the Tech Policy Design Centre on LinkedIn is also a really good way to keep up to date. That’s where we’ll post information about how you can get involved in our forward activities.
And yeah, so thank you so much Arjun. And I’m a long time regular listener of the podcast. So thank you for all of the work that you’re doing and the contribution that you’re making. It’s really wonderful to be able to have the conversation.

Oh, thank you so much. We’ll put all those links in the show notes as well for people to check out. But Johanna Weaver, thank you for your time and for joining us on This Week in Digital Trust.

Thanks Arjun.