This week in digital trust » Episode 79

#79 The long and winding road to age verification

19 September 2023

This week we revisit the hard problem of age verification, which we last discussed in episode #64.

In order to protect kids from explicit content and other harmful effects of online platforms, we first need to be able to identify them. But can we do that without major risks to privacy and free speech?

The eSafety Commissioner’s Roadmap to Age Verification has the answer. Kind of.

We’ll dig through the findings and recommendations in the recently published roadmap report, including the state of current age assurance technology, the legislative and regulatory framework required to make age verification work safely, and the need for a more holistic approach to protect kids.


Transcript

This is an automatically generated transcript. We make our best efforts to check that it is an accurate reflection of the episode, but it may contain some errors and unedited content.

Arj
Welcome to This Week in Digital Trust, elevenM’s regular conversation about all things tech policy, privacy and cyber security. I’m Arj, joining you today from Awabakal country.
Jordan
And I’m Jordan, joining you from Wurundjeri country. But we’re on the road today. We’ve got a map, and we’re heading off.
Arj
A map, and we are heading off. We are forging our path towards age verification.
Jordan
Indeed, we’ve got a roadmap, which actually states and defines more problems than it solves, really. Which is a good thing, because it’s a complex area. It’s the age verification roadmap from eSafety. We promised we’d talk about this when it came out. When we discussed age verification a little while ago, in episode 64, we said that when the eSafety roadmap to age verification dropped, we’d do an episode on it. And here we are.
Arj
When you call something a roadmap, people expect a very clear answer: it’s a roadmap to age verification, we’re going to write a long report, and at the end of it we’ll have age verification. But it’s actually not as clear a picture as that, is it? It’s a very nuanced and detailed report into this problem of age verification, but the solutions are maybe a little more complicated than that.
Jordan
Yeah, the solutions really are complicated, and so we’ll try to draw some of them out today. One of the real concerns that I had, and that I think a lot of the privacy activist, civil society folks had in anticipating this roadmap, was that it would be concrete: let’s go, age verification. The consensus in civil society has been that we’re just not ready yet, the tech’s not there. That was the focus of our conversation a few weeks ago. So there was some worry that this would be a concrete “in three years we’ll be mandating age verification”, but it’s not that, and I’m very pleased.
Arj
On the flip side, that expectation was there amongst some of the activist groups that were looking for really strong action around age verification. One of them, in the press reports about the roadmap, said something like: stop calling this a roadmap, there is no roadmap, there are only delays and obstacles. Which reflects the other side of the fence. We talked about this in episode 64, about how this is a hard problem. But at its core there is a consensus: we’re all on board with the idea that we need to protect children and restrict them from accessing certain kinds of online content. There’s a lot of unsavoury content, from child sexual abuse material to extremist material, but also legal material, like online pornography, that is harmful for kids to access, particularly very young children. So there’s a consensus around wanting to be able to tell that someone’s younger and to restrict them from accessing that content.
Jordan
In terms of kids accessing inappropriate content, the discussion is usually around pornography, and that will probably be the focus of our discussion today. But there’s also gambling and alcohol purchasing, and there are other reasons too, even just adult-themed discussions that you might not want much younger children to be involved in. But yeah, the discussion is primarily around the fact that we have a society where pornography is legal. It’s perfectly OK for adults to be consuming that stuff, but we want to put some gates in place to prevent young kids from accessing it.
Arj
I think that’s a really good point, because it speaks to the fact that this is actually quite a broad challenge. The problem space is often talked about in terms of porn, or gambling sites or alcohol sites, but increasingly we’re also seeing, in data protection and privacy regulation, this idea of needing to do something different for younger or more vulnerable groups. So, ideas like banning targeted advertising to kids, or acting in the best interests of children in the way you design your systems. At the base of all of it is this idea that we need to somehow know who’s a kid and who’s not when they’re accessing the internet. When we talked about it last time, you brought up that famous New Yorker cartoon: on the internet, nobody knows you’re a dog. And that’s the core of it: there are certain things in the online world that we want to restrict kids from being able to access or be exposed to.
Jordan
Or, as you say, we want to apply a higher level of care or support, right? There are a number of proposals for reform of the Privacy Act that include special duties of care to children: if you’re collecting and handling their data, you’ve got to take into account their best interests, you can’t do certain types of targeted advertising, that kind of thing. So at its core, it’s all about how we build a layer on top of the internet that gives you some idea of, not necessarily whether someone’s a dog, but whether they’re 12, whether they’re 15, whether they’re 18, and lets you personalise content based on that.
Arj
Yeah. So we’ve obviously got this roadmap that we’re going to talk about. But before we get into it, it’s worth noting that this is a global challenge. We’re seeing regulators, governments and authorities all around the world looking at ways to solve this problem. Even in jurisdictions like the US, which haven’t necessarily been prone to legislating their way through technology challenges, we’ve seen numerous states over the last couple of years introduce age verification requirements and impose them on social media platforms. So the conversation in Australia is very much a parallel to what we’re seeing all around the world.
Jordan
Yeah, not just the US. France and Germany, I think, both have age verification laws. So more and more there’s this global pressure towards sorting this out. We talked about this last time as well, but broadly there are three approaches. One is traditional identity verification: show me a credit card, show me an ID or a government driver’s licence. That’s problematic because you don’t want every other website on the internet to demand a copy of your ID; you’d have that stuff everywhere, so that’s not ideal. Then you’ve got a bunch of very new, emerging AI and biometric technologies, where you’re trying to guess how old someone is by their face, or their voice, or by asking them questions or looking at their behaviour. That kind of technology is pretty sketchy. Maybe it will get there eventually, but it’s not particularly accurate, I don’t think, and it runs into all these problems with people who are not typical. If you have an atypical face, if you have a particular disability, or even if you have dark skin, they often don’t work as well.
Arj
I read recently that age estimation through facial analysis doesn’t work particularly well for older people, which is quite ironic in this context: using it for age verification, and it works less reliably the older you are.
Jordan
Right, interesting. Yeah, so that’s a bit sketchy too. And then the third is kind of the ideal situation, but we’re not quite there yet: you have some kind of digital identification system where I can prove my age to some trusted party, like a bank or a telco, someone who’s relatively trusted, and they vouch for me. They give me a token that I can pass to a website, which acts as them vouching for my age. They say I’m over 18; they give me a token that proves they’ve checked my ID, and I give that to the website. So the website knows the bank has checked my ID, but I’m not actually giving the website any identification. I don’t even have to give them my actual age. It’s just a token that says I’m over 12, or 15, or 18, or something. That’s the ideal future state of the technology. We’re just not quite there yet.
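To make the shape of that exchange concrete, here is a minimal Python sketch of the token step Jordan describes, assuming an issuer (say, a bank) that has already checked the user’s age out of band. The claim structure and function names are illustrative rather than drawn from the roadmap or any real scheme, and a production system would also need things like revocation, audience binding and issuer-side unlinkability:

```python
# A minimal sketch of a vouching token, assuming the issuer (e.g. a bank)
# has already verified the user's age. Uses the 'cryptography' package
# (pip install cryptography). Claim names here are illustrative only.
import json
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

issuer_key = Ed25519PrivateKey.generate()   # held privately by the issuer
issuer_pub = issuer_key.public_key()        # published, so websites can verify

def issue_age_token(over_18: bool, ttl_seconds: int = 300) -> dict:
    """Issuer signs a bare age claim: no name, no date of birth, no account ID."""
    claim = json.dumps({"over_18": over_18, "exp": time.time() + ttl_seconds}).encode()
    return {"claim": claim, "sig": issuer_key.sign(claim)}

def website_accepts(token: dict) -> bool:
    """Website checks signature and expiry; it never learns who the user is."""
    try:
        issuer_pub.verify(token["sig"], token["claim"])
    except InvalidSignature:
        return False
    payload = json.loads(token["claim"])
    return payload["over_18"] and payload["exp"] > time.time()

token = issue_age_token(over_18=True)
print(website_accepts(token))  # True: age is vouched for, identity never shared
```

The point is what the website never sees: no name, no date of birth, no account identifier, just a signed over-18 bit with an expiry.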
Arj
One thing I think it’s useful to point out, because often people go: why is it so complicated? Why are you mentioning all these things when we’ve already solved this in the real world? You rock up to a bar, the guy at the front says show me your ID, and if you’re old enough you’re let in, and if you’re not, you’re turned away.
Jordan
That’s a tremendously reliable process, yeah.
Arj
So reliable, but…
Jordan
I never got into a bar before I was 18.
Arj
Exactly. But people often say: we just want to do what we do in the real world, this is not a problem there. And you already pointed to one of the differences, which is that in the online world it often means the company or service provider asking for the ID is retaining it, so there’s a privacy and security risk. But there’s also the complication that adult content, content that’s inappropriate for kids, doesn’t always fit neatly within the four walls of a building like a bar. Yes, there are porn sites, and that’s often the use case, but this content is spread across so many different places on the web. Sometimes it’s on social media platforms, or, as you indicated before, it’s content somewhere like Wikipedia that’s maybe a little more adult-appropriate than kid-appropriate. When you’re talking about age gating content like that, you might be blocking off access to large platforms, for example. It’s a much more complicated enterprise, even before you get down to thinking about tokens and all of that sort of stuff.
Jordan
Yeah. There’s a great little set of stats in this age verification roadmap from eSafety. They surveyed a whole bunch of young people about where they had encountered pornography. 70% said on a pornography website, fine, that makes sense. But 35% said on social media, 28% said in their actual social media feed, where they didn’t go looking for it, someone shared it or it popped up as an ad or something in their feed, 17% said in a group chat, and 17% said on private social media pages where it’s being shared into those environments. So a really interesting part of this problem is preventing people from accidentally encountering content they don’t want to encounter. One of the things that really changed my mind in reading this, which I’d never really thought of, is the utility of those “are you over 18, yes or no” pop-ups. You get them on alcohol websites as well, if you’re buying alcohol. I’d always thought they were completely pointless, right? You’re giving me a test of my age where I can just say what I like. What’s the point? But those actually have real value, especially around explicit content and pornography, because you’re giving the person, especially a young person, a warning that what they’re about to encounter is not made for them, and if they don’t want to see it, they should get out of there now. For a young person, that can actually be really effective. It doesn’t stop the person who wants to get there, but for the person who might be shocked, and might not have the maturity and the mental equipment to deal with and process what’s on the other side of that page, it gives them the warning and the opportunity to get out. Which I honestly hadn’t thought of. It’s a really good point.
Arj
Yeah, it is a great point, because it’s quite nuanced in terms of what you’re trying to achieve with age verification in different contexts. In the example you’re giving, you’re giving people a chance: if they definitely don’t want to see anything adult, they have the opportunity to know that’s what they’re about to step into, and to pull out. Last time we spoke, I had a similar attitude about some of these age estimation technologies, where you can’t know for sure that someone’s over 18: what’s the point of that? But there are different use cases. There are going to be examples where it’s enough to get a general sense that someone might not be over 18. We don’t need them to vouch with their ID, we don’t need to go down that path, because there are privacy and security trade-offs in asking people to provide, say, a government-issued ID. Maybe in a particular use case it’s enough to get a sense of someone’s age, or suitability for a piece of content, and that’s where those solutions are OK. But in other examples there’s a legal requirement: we need to know this person is over 18. Then you start to think, OK, is government-issued ID the way, or is it digital identity, which we’ll talk about? There are different use cases for how and why people need to make this assessment, and I think that’s reflected in that spectrum of solutions.
Jordan
Yeah, which also ties to this overall message that it is really complicated, right? Kids aren’t even homogeneous. The way you need to treat a 10 year old, versus a 12 year old, versus a 14 year old, versus a 16 year old, versus an 18 year old is really quite different. The kinds of content, their willingness to accept parental controls, the appropriateness of parental supervision of, you know, the sex life of a 17 year old. All of that changes dramatically across the age groups. We also, as you’re pointing out, have a whole bunch of different use cases: content appearing in a social media feed or being shared among friends, versus a dedicated porn or gambling website. You’ve got different industry verticals, gambling, pornography. You’ve got all of these different axes that make this a really big and complicated problem. For me, that’s one of the real messages that comes out of this roadmap: yep, sure, eventually we want to get to a stage where we have a privacy- and rights-preserving age verification layer on the internet. That’s the end goal. But there are so many different players here, and so many different ways this can impact people, that we just need to get there slowly.
Arj
Let’s talk about the roadmap. The roadmap was something the eSafety Commissioner developed after almost two years of cross-sector consultation and research. They did an independent technical assessment of technologies, which we’re going to talk about, and they submitted it to the Australian Government for consideration. It was just made public in the last few weeks, along with the government’s response, which we’ll also talk about. As you said, we did flag that when the roadmap landed and we got our hands on it, we would talk about it. So let’s step through it. At a very high level, the roadmap splits its findings into four broad areas. One is this idea of an age verification layer of the internet that you spoke about: could a mandatory age verification mechanism practically be achieved? The second is regulatory and legislative frameworks: what could be imposed on different organisations and providers of technology services, and what should those proposals look like? Then we’ve got the complementary measures, which is essentially what you were saying about that broader ecosystem of parents, and the behaviours of kids around devices at different age groups, and what that means for how you respond. And the last is awareness: what kind of education do we need to put out there to help address this challenge? So they’re the four broad areas.
Jordan
So the first area is just the technology, right? eSafety observes that a lot of people are in support: three out of four Australian adults support some kind of age assurance for online pornography. But when you survey the public, a lot of adults are also concerned about the safety and privacy. We’ve talked about that trade-off. eSafety engaged an independent tech consulting company to review a number of age assurance mechanisms, and the one-line summary is that the technology is immature. There are a bunch of potential approaches, which I broke down a minute ago: you can ask people to send in a copy of their ID, you can do biometric assessments, or you can have some kind of vouching or tokenised identity approach. My read of their summary is that the biometric stuff is a maybe, but not there yet; the ID approach is no good because of privacy and security concerns; and really they settled on this double-blind tokenised approach. So you have an independent assurance partner, I prove my ID to them, and they vouch for me to the website. If you do some clever arranging of the cryptography and the relationships, you can do that in a way where the identity provider I prove my age to doesn’t have a record of which website I provided the token to, and the website doesn’t know my identity. You can run it in a way that is double-blind: it preserves privacy in both directions. So my takeaway from the report is that there are potentially some other promising technologies, but that double-blind age assurance process is emerging as an international standard, and their recommendation is that Australia should follow it.
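The double-blind property Jordan mentions, where the identity provider can’t tell which website a token was spent at, is typically achieved with blind signatures (the idea behind protocols like Privacy Pass). Here is a toy illustration of RSA blind signing, with insecurely small numbers purely for readability; a real deployment would use a vetted library and standardised parameters:

```python
# Toy RSA blind signature: the issuer signs a blinded value, so it never
# sees the finished token and cannot link it to the website where it is
# later presented. The numbers are far too small to be secure.
from math import gcd
import secrets

# --- issuer key setup (toy primes; real keys are 2048+ bits) ---
p, q = 1009, 1013
n, e = p * q, 65537
d = pow(e, -1, (p - 1) * (q - 1))    # private signing exponent

# --- user blinds a token value before sending it to the issuer ---
token = 424242                       # stands in for a hash of the real token
while True:
    r = secrets.randbelow(n - 2) + 2
    if gcd(r, n) == 1:               # blinding factor must be invertible mod n
        break
blinded = (token * pow(r, e, n)) % n

# --- issuer signs without ever seeing the token itself ---
blind_sig = pow(blinded, d, n)

# --- user unblinds; the result is an ordinary RSA signature on the token ---
sig = (blind_sig * pow(r, -1, n)) % n

# --- website verifies against the issuer's public key (n, e) ---
assert pow(sig, e, n) == token
print("token verifies: age vouched for, unlinkable by the issuer")
```

Because the issuer only ever sees the blinded value, it can attest that the holder is over 18 without being able to recognise the token later; the website, meanwhile, verifies against the public key without learning who the user is. That is the privacy in both directions.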
Arj
This seemed to be the part of the roadmap that got most of the play in the media coverage, because one of the recommendations was that, yes, the market for these technologies is not quite mature, but the Australian Government should commission some sort of pilot or trial before thinking about mandating them. The government, in its response, has not committed to do that. It’s reaffirmed the findings that the market’s not quite there yet, and as a result it’s saying: we’re going to look towards putting expectations on the platforms first, before we look to the technology. The rights groups and activist groups that are very much behind this idea of protecting children through age verification have been very critical of the government for so-called kicking the can down the road. But my sense, from one of the quotes I heard from the eSafety Commissioner in the news, where she said, look, the government’s made its decision and I need to get on with using the tools I have to reach more parents and children with education and information, was that maybe she was also a little disappointed that a trial hadn’t been entered into.
Jordan
The roadmap does go into a bit of detail about how that trial would look, who should be involved and so on. So yeah, you do get the sense that eSafety really saw that as the path forward.
Arj
Yeah, who should be involved, and they even spell out some of the considerations and requirements around designing a trial, like ensuring there are privacy impact assessments, and it draws on some of those examples you gave from Europe. So that was clearly something they were hoping to see come out of this, but it…
Jordan
Hasn’t. Yeah. So that leaves the legislative and regulatory framework, the next section of the report, and we come to it because, as you said, Julie Inman Grant, the eSafety Commissioner, said she’d get on with her current tools. Mostly this section deals with what tools are currently available and what a regulatory framework around age verification should look like. The answer is quite short, really: if you have a mandate for age verification, you also need some kind of regulatory scheme for accreditation and oversight of that age verification. So what the roadmap is really proposing is that if we do establish expectations and requirements for service providers to apply age assurance, we need a regulatory scheme that provides independent oversight, governance, transparency and trustworthiness, and makes sure that whatever requirement we’re putting on websites is, first of all, actually enforced, and second, enforced in a way, with the identity-proofing process done in a way, that is safe and respectful of privacy and security.
Arj
The other component of the legislative and regulatory approach is the online safety codes: being clear about the expectations of online platforms and social media companies, and getting them to co-design and agree to codes around how they’re going to treat this material. At the moment there’s some progress on phase one of those codes, which covers what they call class 1 content, child sexual exploitation material and terrorism material, with agreements around how platforms will restrict and monitor it, and commitments that if they identify that content they’ll take it down within 24 hours and so forth. But we’re still waiting to see the approach for class 2 content, which is the pornography and the legal-but-inappropriate content. So still a way to go, but I think that’s what Julie Inman Grant, the Commissioner, is saying: I have to get on now with working out what those codes look like, and trying to get industry to take them on and self-police this area, as opposed to rolling out technologies to do it.
Jordan
Yeah, exactly. These codes are initially developed by industry, but they have to be approved by the eSafety Commissioner, so she can knock them back, or she can actually, I think, make the code herself if that process breaks down, if they can’t get to where they need to go through that participatory approach with industry. That phase two hasn’t started yet. It’s taken over a year, I think, to get through the phase one code-making process for class 1, the CSAM and pro-terror content, and I imagine it will take at least as long, perhaps longer, to work through phase two, because it deals with all this legal content, right? It’s going to need to cover things like access, age verification and access by children. To a certain extent, the extreme category is easy: everybody’s on board, it’s the worst of the worst, we’ve got to get rid of it. Once you’re into legal but shocking, or legal but problematic for certain…
Arj
Oh yeah, yeah, yeah.
Jordan
…viewers, or legal but harmful to kids, you get into a much more difficult area to negotiate. So I wonder how hard that code-making process will be.
Arj
I expect it’ll be quite hard. But it speaks to the broader report, which brings us to the last two parts of the roadmap. They really underscore that this was never a silver-bullet problem you could solve with either a piece of tech or a law. The last two parts are the complementary measures for a holistic approach, and the awareness-raising. The complementary measures are really about thinking more broadly about the contexts in which children encounter porn: the different behaviours at different ages, the different technology uses. And recognising that the response to restricting access to inappropriate content is also going to include parents being better educated, the safety discussions that take place in the home and in various other educational contexts, and the kinds of filters and parental controls that can be built in. Thinking very broadly. One of the great examples for me in the roadmap was the idea that if you’re in the 10-to-12 age bracket, you’re probably using a common device, the home laptop or some other shared device, you’re much more reliant on search engines and app stores, and you haven’t had as much sexual education. You haven’t been exposed to safe sex and other contexts in which you can make meaning out of some of this content, should you stumble upon it. Then compare that to the next age bracket up, the 13-to-15 year olds. By this stage they’ve probably got their own device, and they meet the minimum age requirements of many social media platforms, so they’re allowed to be on social media, where there is porn on some platforms. Suddenly it’s a very different situation, and clearly the responses can’t be one-size-fits-all across those two age groups.
Jordan
Yeah, a really interesting little fact I got out of this report, just illustrating what you said, is that Australia’s National Research Organisation for Women’s Safety, which was set up as part of the national plan to end violence against women and children, to build an evidence base around women’s safety, says that from 14 and above, viewing pornography may be an age-appropriate sexual behaviour, right? So as soon as you’re 14, 15, 16, 17, 18, there may even be times when it’s not shocking, not a thing we necessarily need to avoid, when it could be safe and healthy. And I mean, the vast majority of pornography is pretty horrendous, especially for a 14 year old, so certainly not all of it, and certainly only in very specific contexts. But it really illustrates how difficult getting the age balance right is, and exactly how you need to manage kids at different ages. So this section of the report is really interesting, and as you say, it highlights those different approaches for different ages. It also highlights how many little things apart from age verification might be useful here in managing access to inappropriate content, making sure it’s monitored or seen, or that kids have someone to talk to about it. For example, there’s awareness and parental controls that you’d want for a 10 year old that become less appropriate for a 16 or 17 year old. But there are also expectations on platforms, which I think are things the eSafety Commissioner will look at putting into the codes we talked about a moment ago: things like having clear and well-enforced policies about whether pornography is even allowed on the platform; enforcing minimum ages, so that people under 13 can’t get onto, say, Facebook, which is supposed to be for over-13s; having age gates, those “are you over 18” pop-ups; giving users, and kids, control over their safety and privacy settings; and better moderation of content. All of these are non-age-verification tools that can really address the harm we were talking about: stumbling across or encountering pornography in ways and contexts where it’s distressing, or not what they were looking for.
Arj
And with that in mind, there’s a lot of focus on various educational activities as well. Not just for the children, the underage people who might stumble upon the content, but for parents and educators to have the conversations. There are sections in the roadmap that speak to the Australian education curriculum and what needs to happen there, and to funding for resources. eSafety already has a great library of resources and information around online safety, but it’s about being able to expand and enrich that with the research the roadmap has produced. So that very much rounds out the holistic response.
Jordan
Yeah, for sure. And if you’re a parent listening to this and thinking, well, I should probably have a conversation with my kid: eSafety has a great library of conversation templates, actually, and guidance that’s age-appropriate as well, because again, it changes. What’s the level of specificity? How do you talk to a 10 year old or a 12 year old about what they might come across online, versus how do you talk to a 14 or a 16 year old? So yeah, I think that’s really useful.
Arj
Just hearing you say it, as a parent you can see why there’s such an appeal in having a technology solution. Just roll out something that stops them getting to this content altogether. Why do I have to look at the template? But, you know, clearly…
Jordan
Yep, yep. And not having kids, I’m like: why can’t parents just sort this out?
Arj
Yeah. I definitely recommend people check out the roadmap, because it shines a light on the fact that this is not a simple solution, even though, in a way, it’s a simpler problem. With a lot of technology policy issues, like when we talk about privacy, there’s no consensus on the problem itself. In privacy, some people say, well, it’s actually not a problem, we want information to flow freely, we want innovation, and other people say, well, actually, we need to protect people’s rights. There’s contention even on the problem. Whereas here’s one where we all agree on the problem, we want inappropriate content not to make its way to kids who aren’t ready to handle it, but the solution is not straightforward.
Jordan
Yeah, the solution is complicated, and I think the solution is lots of things, right? There’s a great line at the start of this report, which is on point to what you just said: a mandatory age assurance mechanism on its own will not address the issue of pornography’s influence on, and association with, attitudes and behaviours which can contribute to gender-based violence. However, to the extent that age assurance and other forms of safety technology may increase the age at which children are likely to encounter online pornography featuring sexism, misogyny and gender-based violence, they can contribute to reducing harmful attitudes, and that delay then provides an opportunity for children to receive respectful relationships education prior to viewing such content. So if we can just put some pressure back on easy access to horrible, problematic content, then we make room for all of the rest, the education, the coaching, the maturity, to come through. Which made a lot of sense to me.
Arj
OK, well, that’s us for the week. Good to finally get into this roadmap now that it’s out. But it looks like there might be more conversations to come. This is not a…
Jordan
Quick fix, exactly. We’ve got a roadmap, but it is a long and winding road, so we’ll set off and see where we end up. Thanks, Arj. Chat to you next time. See ya.