Ever wondered what can be inferred about you just from the way you walk?
This might be a growing concern as the use of biometric technologies – which include solutions that offer “gait analysis” – becomes more common.
This week we explore how biometrics like fingerprints, faceprints and behavioural attributes are being used in different contexts, for purposes including identification, verification and analysis.
The convenience appeal of these technologies is hard to resist – as anyone with facial verification turned on for their smartphone can attest – but there is also an emerging set of social and policy concerns that needs to be managed.
We discuss these concerns and the emerging regulatory approaches.
This is an automatically generated transcript. We make our best efforts to check that it is an accurate reflection of the episode, but it may contain some errors and unedited content.
Welcome to This Week in Digital Trust, elevenM’s regular conversation about all things tech policy, privacy, AI and cybersecurity. I’m Arj, joining you today from Awabakal country.
And I’m Jordan, joining you from Wurundjeri country in Melbourne.
And Arj, I’m turning over a new leaf for the year, and I want your help with this. We’re going to talk about biometrics today.
The immediate response is: oh, sketchy privacy surveillance problems, you know, don’t like it. But I’m going to try to start the conversation with something good and positive and happy in the new year, right? This new-leaf kind of business.
And so what I wanted to open with is: biometrics are cool, because I can unlock my phone with my face. And I just don’t think that’s appreciated enough as such a cool use case, from both a convenience and a security perspective. It is just the coolest thing in the world. I can leave my phone wherever I like, no one can access it, and then seamlessly I pick it up and I can access my stuff. That is just so cool.
It is. It’s pretty magical when you think about it. And firstly, I’m very warmed by this new-leaf Jordan, this positive techno-optimist.
I’m very curious to see how far it goes. Like should we do blockchain next week and test you out?
Yeah, that’ll be a real challenge.
Yeah. But look, I think you’re right. It is pretty magical, the idea that you can just look at your phone and it opens. And I think that’s one of the tough things with technologies like this: you get used to them so quickly. You just take them for granted so quickly.
And I was trying to find stats just before we got online for how many times face ID gets activated on average a day to unlock someone’s phone. Like how often do we use it?
Because I must be using it all the time. You basically pick up the phone so often. It used to be just for unlocking the phone, but now so many apps have integrated it, so you’re logging into apps, you’re paying for things. It’s magic.
It is. And, see, I’m already falling off my wagon: there’s an insidiousness to that as well, right? We’re having our faces scanned constantly now, and it’s really powerful, in that it can link my identity to my accounts, to payments. It makes all of this stuff, you know, Apple Pay and whatnot, super seamless, super easy. So that biometric linking for identity is super useful.
But like you say, I’d be fascinated to see that number: just how many times per day is my face being scanned? Every time I pick up my phone, it is. And then there are other material uses, you know, if I’m buying something, or accessing apps. A lot of my apps are behind a second factor, you know, facial recognition.
You can buy an app from the App Store: you just look at the phone and, you know, you can get it.
The more positive spin on that is that this is all relatively controlled facial recognition uses, right? I know I’ve signed up for it. It’s this constrained use in the particular use case of, you know, the phone and it’s all encrypted and kind of controlled by the phone manufacturer.
But I mean, I think we’re diving into rabbit holes too quickly, right? Let’s take a step back. We wanted to talk about biometrics today, not so much any one specific use. What are biometrics when we’re talking about them? What are the risks? What are the use cases? Why are they an important topic from a privacy sense? So, Arjun, what are biometrics?
Well, good place to start, because I think, yeah, we just take it for granted that everybody knows what they are. But specifically, obviously biometrics are kind of biological measurements, physical characteristics that we use to identify people. So things like your fingerprint, things like your face print.
But in the context of this conversation, and our industry, and why we talk about them, we typically mean those biometric technologies that use some sort of matching to identify or authenticate a person based on one of those characteristics.
And it’s quite broad. Biometrics can be physical or physiological features, so it can be, as I said, your fingerprint, your face print, also your iris, your voice, your hand geometry. But increasingly, in the innovative biometric space, it’s also behavioral attributes.
So your gait, you know, the way that you walk or run, your signature, your keystroke pattern, like how you, even just how you use a phone, like behavioral tracking of like the way you swipe and the way you are interacting with that device can be a behavioral attribute that is a biometric.
Even language patterns, the way we speak and the use of grammar and words can be considered biometrics. They can be used as a way to identify people through these means. So it’s quite broad when you think about it.
And there are some really interesting use cases there already, right? The typing patterns, or swiping and phone-usage patterns, or voice, or use of language in written communications. All of these things have these nuanced indicators of the person, and they’re increasingly used in security contexts. You can see, if I’ve left my laptop unlocked and someone else has come along and is tapping away, how that would be an incredibly useful measure, right? To see: hold on, suddenly the mouse isn’t moving like Jordan moves it, and the typing isn’t happening like Jordan types.
Yeah. In former lives, I’ve been aware that the banking industry has looked at this closely, in the mobile banking app context. So being able to detect fraud because the person using the app is using it in a way that’s not reflective of the normal user of that app. There are very interesting use cases there.
That’s often not tied to identity, right? Or it might not be. It’s just a material change in a use case, and so let’s flag it for security review or whatever. Banks do it a lot; it’s the same kind of flagging as for suspicious transactions: this person usually uses their credit card in particular ways. So there’s kind of a fuzzy line between that and the biometrics we’re often talking about in privacy, which is pretty focused on using them to identify a person.
So the Privacy Act defines biometrics and biometric templates by reference to a use to identify a person. If you’re using it for some kind of identity purpose, then it’s a biometric and it’s got special regulation under the Privacy Act. But if you’re just using it to flag suspicious behavior, it might not be. There’s a fuzzy line there. But it’s a really useful technology and it’s a really challenging technology.
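To make that behavioural flagging idea a bit more concrete, here is a minimal sketch of a typing-rhythm check, the kind of “is this still the same person at the keyboard?” signal described above. Everything here is illustrative: the timing numbers are made up, and real systems model much richer features (key-hold times, digraph latencies, swipe curves) rather than a single z-score.

```python
from statistics import mean, stdev

def keystroke_profile(intervals):
    """Summarise a user's typing rhythm as the mean and standard
    deviation of their inter-key delays, in milliseconds."""
    return mean(intervals), stdev(intervals)

def looks_like(profile, session_intervals, z_threshold=3.0):
    """Return True if the session's average inter-key delay sits within
    z_threshold standard deviations of the enrolled baseline."""
    mu, sigma = profile
    z = abs(mean(session_intervals) - mu) / sigma
    return z <= z_threshold

# Hypothetical enrolment data: the usual user's inter-key delays (ms).
baseline = keystroke_profile([110, 95, 120, 105, 98, 115, 102, 108])

print(looks_like(baseline, [112, 99, 118, 104]))   # similar rhythm -> True
print(looks_like(baseline, [300, 420, 280, 350]))  # much slower typist -> False
```

Note that this flags a change in behaviour without ever identifying who the new user is, which is exactly the fuzzy line discussed above: the same signal becomes a regulated biometric once it is used to identify someone.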
You know, we’ve talked about a few of the uses and applications, but going back to the start of this conversation, one of the most useful and most powerful is tying identity back to an individual in a really controlled way, as a credential.
So it’s kind of at the core of the digital ID legislation and system that’s being built in Australia. We talked about that in episode 85 with Brett Watson, but core to that is the idea that you can use pictures of people’s faces, or biometric credentials, often through a phone or a verified provider, to tie someone to a digital document, and then you can suddenly pay for things or verify your identity online.
And it’s a super powerful use case. But as we’ve said, there are a bunch of others, right? For security, or, you know, we’ve often talked about facial recognition in public places and public spaces, which covers lots of different kinds of use cases.
Yeah, I was just having a think about some of the different ways we’ve kind of incidentally covered biometrics. Like we’ve talked a lot about, you know, facial recognition and even that there’s a spectrum, as you said, there’s the sort of very controlled like I want to access my phone. I’m the one that’s given the phone, the face print and you know, it’s letting me in through to, you know, security screening and picking people out of a crowd.
But then we’ve talked about fingerprint access to school toilets. That was a case we talked about last year, which is still bizarre even to say out loud now: there was a proposal to get school kids to use their fingerprints to access the school toilets, because the school was concerned about vandalism of the toilets and wanted to lock that down.
And then keystroke patterns and things like that have also come up, in an office-worker-monitoring kind of context. So there’s a wide variety, and there’s a range of different appeals and benefits in that.
Yeah, there are, right. The appeal is largely that it’s something attached to a person that is, if it’s done well, pretty reliable. And, because of the nature of these technologies, it’s quite difficult, not impossible, but quite difficult, to spoof or replicate. It’s quite hard to convince a facial recognition algorithm that you are someone else. So there’s power there. But, you know, with great power comes great risk, right?
Some of the risks with biometrics are attached to those same things that make them compelling technologies: that they remove agency from people. I can’t stop someone, or it’s very hard for me to stop someone, from taking a photo of my face in a public place or through a security camera and identifying me. Whereas with other forms of identity, if you want to check my driver’s license, you need to ask me and I need to consent. Facial recognition, not so much.
And you can often do it at scale as well, in a way that you otherwise can’t. We’ve talked about this in the context of stadiums, right? You could scan the faces of everyone walking through a gate a lot faster than you could look at their driver’s license photos and check that they are who they say they are.
So I think it’s a really interesting technology for that reason that it’s so powerful for these kinds of applications where we wanna have some assurance about who we’re dealing with.
But because it’s so powerful, if it’s used wrongly, it’s really dangerous.
Yeah, the scale one, I agree, is really interesting, because it’s the thing that makes it so powerful and appealing, as you say. You can do things like check people at a scale that you just can’t replicate with non-biometric methods.
But the scale also means that things like error rates become magnified in their impact. You can have a very small error rate, but if you’re pointing these sensors, these facial recognition cameras, at massive numbers of people, and you get even a tiny proportion wrong, you’re potentially impacting large numbers of people.
So you’ve got the classic example, I think you’ve mentioned this on the pod before: if you’re using this in an airport, amazing. If you can clean up that customs line for me, I’m all for it, right? You scan my face, you can tell what you need to tell, and I just walk straight through.
But if you’re getting that wrong for a tiny percentage of the, you know, the large volumes of traffic that walk through airports, that’s a lot of people that could be adversely affected.
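The arithmetic behind that point is worth spelling out. A back-of-envelope sketch, where both the passenger volume and the error rate are invented purely for illustration, not real vendor accuracy figures:

```python
# Back-of-envelope: even a tiny error rate gets magnified at airport scale.
# All figures here are illustrative assumptions.

daily_passengers = 100_000   # assumed daily traffic through a large airport
false_match_rate = 0.001     # assumed 0.1% of scans get the wrong answer

wrongly_flagged_per_day = round(daily_passengers * false_match_rate)
print(wrongly_flagged_per_day)        # 100 people inconvenienced every day
print(wrongly_flagged_per_day * 365)  # 36500 people a year, from a "tiny" rate
```

And that is only false matches; false non-matches, where the system fails to recognise the right person, scale the same way.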
I just wanted to paint the convenience picture a little bit more as well, in terms of the benefits, and step into a couple of the other use cases. One is, obviously, the airport example I just gave. But there are also examples like Amazon doing a kind of palm print as a biometric way of identifying yourself.
I think it’s looking at like your vein, the patterns of your veins in your hand. Right.
So it’s, yeah, your literal palm print, like where you can read your future. Not bad. But it’s the veins.
You’re Arjun, and you have a very auspicious year ahead.
I see a lot of purchases in your immediate future. Yeah, but Amazon have obviously got this out in a lot of their retail stores where, you know, you tie your palm basically to your account, your payment details, and you can just walk through and grab a bunch of stuff, you know, wave your palm over a reader and it pays.
And because it’s Amazon, I think a lot of people’s hair stands up on the back of their neck and they’re a bit worried. But the convenience of that is incredibly compelling, the fact that there’s just no friction in it at all.
Another good one I read about was Disneyland, you know, the most magical place on earth getting even more magical, by allowing you to identify yourself using your fingerprint. Then you’re given some sort of reader, and you’re basically walking around the park using it to skip queues, to get personalized experiences, and for it to know everywhere you are and constantly give you offers.
And so the convenience thing, I mean, we started by talking about Face ID and the phone, but it’s such a factor in this biometric conversation, because some of the concerns are hard to focus on when so many things are made so much easier.
Yeah, it’s so true. And I think that’s the case with a lot of these technologies; it’s why privacy law is so important, I think. We didn’t start this conversation with this as a goal or topic, but we’ve talked about similar topics where I’ve definitely said that I want the good thing without the bad, right?
Technology gives us these opportunities, these powers, to do things that we could never do before. Recognize people at scale, do ID seamlessly, effortlessly tie transactions back to you without having to have your card, or whatever. All of these things, unlocking your phone, keeping it locked, are super useful. I want that, that’s tremendously useful. But the same power that lets us do that gives us the ability to do it for really creepy reasons or really disturbing reasons, right?
And the same power has a flip side if we’re relying on that system when it fails. You know, if your hand is too dark or you have a scar or you don’t have a hand, if the facial recognition system doesn’t work, does that mean you don’t get to go to Disneyland?
Is there a failsafe mechanism that allows a person to still access those stores, those products, those services, if that technology doesn’t work for them? Because if we get too excited about the fun new tool, we leave people behind and create more harm than benefit.
So I really see the role of regulation, and the role of privacy in particular, as stepping in with these technologies, right? And saying: yes, absolutely fantastic, I’m on your side, I love this stuff, let’s build it. But let’s build it in a way that’s not going to lead to some techno-surveillance dystopia where we can’t be private, where we’re constantly surveilled wherever we are.
Yeah, I think that’s a fair point, because as big as the benefits are, the risks aren’t insignificant either. I mean, that surveillance picture is a real one.
You know, the fact that in a world of biometrics, you can be walking around feeling constantly surveilled, if not actually constantly surveilled. And the other one is recognizing the inherently special nature of biometrics, which is that they are completely unique and immutable. So when biometrics are used trivially, without safeguards, in an insecure way, and they’re lost, if there’s a data breach involving your fingerprint or your face print, that’s inherently tied to you.
We’ve kind of used the line: you can reset your password, but you can’t reset your face. So the risk of it going wrong is also incredibly significant. And again, we’re talking about a use case where there’s great appeal and benefit.
I mean, passwords, as the conventional way of accessing things, as a way to secure our accounts, are incredibly problematic. The reason we see half the data breaches we see is because passwords suck, passwords are broken, they’re done. In fact, trying to log into the podcast platform just now to do our podcast,
my password manager popped up and I kept typing my password and getting it wrong. You know, it’s Monday or whatever, I’m not sleep deprived, and I kept getting it wrong.
And my laptop has a biometrics feature on it, so I used that and, bang, I was in. But putting aside the convenience, there’s the security of passwords. As you said, passwords suck.
We don’t use them well. We use insecure, easy-to-guess ones, or we reuse passwords. And it’s created this whole ecosystem of accounts and authentication that’s incredibly porous. That’s why account compromises and data breaches happen all the time.
And biometrics offer us a way to circumvent a lot of that. But the risk on the flip side is, if you lose a biometric, if you lose someone’s face print or fingerprint, there’s real damage there. They can’t readily remediate that the way they can by resetting a password.
Yeah. And I said earlier on that they’re typically pretty hard to fake. You know, it’s harder to recreate a biometric than it is, say, a password. But AI content generation is getting pretty scarily good, right?
And there are some examples. In Australia, the tax office uses voice as a biometric identifier, for example. We’ve got a lot of content about our voices out on the internet, right? A lot of people do. It’s pretty easy to capture someone’s voice and use one of these systems to pretty convincingly sound like that person.
And, yeah, I can’t change my voice. That’s locked in. So it’s definitely an issue that we need to manage; it’s one of those risks.
One of the interesting things to me is the way biometrics also intersect with some of the other privacy concerns that we have generally, that foundation of trust around how organizations manage our data. It applies to biometrics too.
So we talk about consent and in the biometric context, consent is difficult because the facial recognition example we often talk about is like, it’s hard to get consent when you’re scanning a crowd. You can put a sign up in front of a stadium.
So this kind of fear that there’s no proper consent is amplified around biometrics.
But I think the other one is just generally that lack of certainty around things like secondary purpose and secondary use, which is a general thing people worry about with privacy: I give you a piece of information, and you say you wanted it for purpose one, but there are some weasel words that allow you to do other things with it.
I think that is amplified in the case of biometrics. If you, as a workplace or an institution, tell me that you need my fingerprint in order for me to log in to my computer, how do I know that you’re not then using it to track me and monitor my work rate, my behavior? There’s that kind of extra level of tracking and concern around the secondary use of biometrics.
So I think there are some interesting things there, in how the existing privacy conversations we have apply tenfold in the biometric context.
Yeah, I think so. And there’s a question of just how they’re regulated, so maybe I’ll speak to that first, because there is this interaction with the Privacy Act. The Privacy Act deals with biometrics, but it deals with them in the context of identification, and it does that by essentially saying they’re sensitive information, which means that you need to get someone’s consent. That’s why we talk about consent for handling facial recognition or biometrics. So requirement one is consent.
There’s a second requirement that the privacy commissioner has been pushing in some recent determinations. They recently yelled at 7-Eleven for using facial recognition to make sure that people in their stores weren’t gaming their satisfaction surveys. They’d take a photo of the person filling in the survey and check whether that person had filled in a survey before that day, or something like that, so that store owners couldn’t game the system.
There’s a few things that the commissioner was quite critical of there. But one of the really interesting ones is that they said that it’s just not proportionate, right?
That the use of facial recognition, with all these risks and all this power that we’ve been talking about, in order to prevent someone gaming your feedback survey, is just not proportionate. And I think that’s a really useful requirement, pushing organizations to justify it: if you are using facial recognition, you need to justify that there isn’t an easier, safer way of doing this.
Yeah. On the regulation and regulatory side of things, the US, or particularly one state, Illinois, has a very interesting approach to regulating biometrics.
Do you know a little bit about that? Like what’s the story and sort of why they’ve got that dedicated approach?
Yeah. I mean, this is just a really interesting thing about US state based regulation, right? Is that a single state can go ahead and pass a law that then applies to anyone collecting data in that state. And then that state can go ahead and, you know, sue people like Facebook who are doing facial recognition of citizens in that state.
And that state’s requirement kind of becomes a default requirement broadly. I’m not super familiar with the detail; I think the law is essentially quite similar to the Australian requirement of consent, that you cannot do facial recognition or biometric identification of people without their consent. And there’s been quite a lot of really high-impact litigation. Facebook got in trouble, Clearview AI got in trouble, for doing facial recognition of people on their platforms, or on data scraped from the internet, that included citizens of that state. And the potential penalties are quite large too.
That’s useful, because I don’t know the ins and outs of the regulation. But I did come across this great story of how it came to be, which I just think is awesome.
The law was enacted in response to this company called Pay By Touch, which had this really charismatic founder who went and drummed up a whole bunch of investors and venture capital, and then, flush with cash, basically went off the rails: started just spending money, hiring lots of people that didn’t need to be hired, and then, according to allegations from employees, spending it on booze and drugs.
And then the company went bankrupt and as part of trying to get through its bankruptcy, one of the things it sought to do was sell off all the fingerprints it had because it was a pay-by-touch company.
Just a complete train wreck of a story, but one leading to this point where regulators thought: we need more protections around how our fingerprints are treated when companies want to collect them.
That’s so interesting. And it’s also relevant to your point about controlling secondary use, right? How that can fall apart over time, if a company goes bankrupt or just has a change in leadership, a change in values. If they’ve got a big database of faces or fingerprints or other personal data, are they free to sell it on to law enforcement, to others?
And it’s also a whole narrative arc in one nice little story, which is that, you know, pay-by-touch sounds like an amazing idea. So convenient. We all jump up from our seats at the possibility, and people give this guy $50 million worth of investment.
But then you realize down the road that there are risks and that there need to be safeguards. And so, you know, the convenience is like not the end of the story. It’s only the beginning.
Yeah. And this is something that we often bang on about too: the “down the road” part is actually really hard to assess up front. What’s going to happen to this data in five years, in ten years? Is it going to be on-sold, or obtained by warrant for some kind of surveillance database?
Is this somehow going to be misused against me? What’s the technology for replicating fingerprints going to look like in 10 years? Is that going to be something I need to worry about? All of this stuff that like, you know, upfront, as you say, the convenience is very compelling,
You know, it’s impossible for a normal human to assess those risks. So yeah, there’s an important role for law, and for folks like us as well.
Okay, well, that’s us for another week. Good to dive deep into biometrics with you. Watch your gait, and be very mindful of who’s looking at the way you walk.
No, it’s fun. It’s fun doing these little subject-focused rather than news-focused episodes, so hopefully that’s fun. I read you can get orthotics that alter your gait. If you have a stone in your shoe, you’re going to walk slightly differently, right? So I might look into that and stealth my way home.
Good one. Already thinking of the defenses. Love it. Okay. Thanks Jordan. Chat to you next week.
Thanks, Arj. Talk next week.