This week we explore the issue of privacy in the workplace.
Historically, employers have been exempt from the Privacy Act in Australia, but this is a live issue again with the privacy reform process underway and the Government having agreed in principle to extending privacy protections to workers.
We also explore how the problem space is expanding in the era of hybrid working and as technologies for tracking and quantifying work continue to evolve.
This is an automatically generated transcript. We make our best efforts to check that it is an accurate reflection of the episode, but it may contain some errors and unedited content.
Welcome to This Week in Digital Trust, elevenM’s regular conversation about all things tech policy, privacy and cybersecurity. I’m Arj, joining you today from Awabakal country.
And I’m Jordan, joining you from Wurundjeri country. And I’d like to start off the podcast, Arj, by first of all welcoming you back after a period of leave.
Thank you. It’s great to be back.
And I’d also like to say hello to Mel and Pete, our bosses, who listen to the podcast, supporters of the podcast, and it’s appropriate that they’re listening because we’re talking about workplace surveillance.
We are indeed. Workplace privacy and workplace surveillance. And, you know, let’s just say they’re very good on all of those fronts. They’re, as you’d expect, very privacy minded.
As a privacy consultancy, we take this stuff kind of seriously. But yeah, when we’re actively recording our conversations for monitoring by the public at large, yes, the boss might listen in as well.
It’s perhaps not as well known in the general community that employers are actually exempt from the Privacy Act for anything they do related to the employment relationship and for anything that involves an employee record, which is a shock. I think the Community Attitudes to Privacy Survey had some data around what people expect: people assume that businesses that collect work-related information are required to protect it under the Privacy Act. It’s actually not the case. There’s an exemption. And that’s being revisited at the moment under the privacy reform process. So the Government’s response to the Privacy Act review was that it agrees in principle that privacy protections should actually be extended to employees, but there’s going to be further consultation around exactly how to do it. People have made the case to say, hey, these exemptions around privacy for employees need to go.
Yeah. And it’s worth just focusing in on what the existence of that exemption means for a second. So employers don’t need to comply with the Privacy Act for anything they do that’s directly related to an employment relationship and involves an employee record (that’s the terminology in the Act). So for example, the Privacy Act includes an obligation to keep records safe, to prevent them from unauthorized access, to notify people if there’s been a data breach, to not collect information unfairly. As an employer, you don’t have to take any kind of precautions under the Privacy Act to protect employee information from hacking, and you don’t have to notify people if their information gets hacked. When you consider the sensitivity of the information that employers hold, which we’ll probably get to, it is quite a surprising and shocking thing, and it’s quite unusual globally in terms of privacy regulation. So Australia is kind of out on its own here.
Most other equivalent type countries, the EU in particular, but the UK, Ireland, New Zealand, Hong Kong, most countries with privacy laws don’t have a similar exemption. And in fact, a lot of countries like the EU have stronger protections for employees because of the sensitive nature and because of the power imbalances that exist in employee relationships. So yeah, it’s quite surprising and it’s quite unusual.
I imagine there are situations where, particularly in larger organizations, they do so much to protect personal information for customers and yet have to somehow carve out a different set of processes where they don’t have to, or don’t need to, do it for employees. There’s probably this kind of Jekyll and Hyde thing going on within organizations as well.
Yeah, it’s a funny situation, because it’s actually more effort to do that than it is in a lot of cases to just provide the same set of rules, the same set of protections, for everything you hold. One thing I should say: most Australian corporates, in my experience, do protect the personal information of employees to the same level as they do for anyone else, contractors or customers or any other personal information they hold.
That’s kind of the standard practice. It’s what we advise our clients: look, don’t try to have a carved-off area for your employees, who you owe a duty of care to and want to protect, an area where you have lesser protections.
Who you see in the halls and at the water cooler. Right.
And yeah, who you’re probably friends with and whatever. So in practice, I think most companies do protect their employee data, but legally they’re not under a privacy obligation to do so.
So you’ve kind of flagged this idea that it’s a bit bizarre, because if you think about the kind of information that companies hold, it’s actually quite broad and quite sensitive. So it might be worth just quickly touching on some of the potential privacy concerns that are at play. Because, as you said, the kinds of data involved in employment include all the sort of personal information like contact details.
But then you get into the terrain of more confidential types of information, like the terms of your employment and your salary. And then things like medical information can be at play, information about your performance, your banking details, your financial details, and information that’s defined as sensitive information, like your membership of a trade union, a very common piece of information in an employment context.
So even in a vanilla, traditional employment context, there’s quite a broad spectrum of information, before we even start to step into what we’ll talk about later, which is surveillance in a more hybrid, online context.
Yeah, but before anyone listens to your podcast. Yeah, so it’s really standard for companies to have information about disability, reasonable adjustments. And if you use a work device or work email for anything, they might have a bunch of information about you just based on the emails you send or the people you interact with. And, like you say, references, performance evaluations. There’s a huge amount of data that companies collect.
But yeah, like you were flagging, as we’ve moved into working remotely since COVID, for us professional desk workers, the amount of information that companies tend to have has just skyrocketed, right?
Because I’m now sitting in the front room of my house, and you can see, you know, a partner walking past behind me. My work is now co-located in my home space. And there’s a whole bunch of information about my use of the computer that you might collect for security or for monitoring purposes, and a whole bunch of information about my location that a company might have visibility into as well. Remote working technology has really pushed that information collection up a notch.
Yeah, there’s some interesting research and stats around some of this stuff, which is not hard to find. It’s been quite actively reported on. But just as one example, a survey of 645 US-based workers found that the use of surveillance tools increased significantly during the first six months of the pandemic. And so, as you said, a lot of this is driven by a change in the way that we work.
And there’s the fact that it’s just organically more co-located, as you said. Like, me looking at you on a work call now means I’m looking into your home. But there are also active steps being taken by some employers to increase surveillance, because they want to assure themselves about productivity, or they want to make sure that these workers they no longer have line of sight on, because they’re not sitting next to them in the office, are still working at the same rate and still turning over the same amount of output.
And so there’s installation of various devices and various software platforms that enable this increased surveillance in a very deliberate way.
The Electronic Frontier Foundation calls it bossware. But yeah, it’s that kind of software that you can install on a work device or on a personal device that either overtly or covertly monitors how often you’re clicking on things, or even takes screenshots of what your work screen looks like at various points, so that someone can go back and check that you’re actually working all day.
There’s a great TikTok that stuck with me from the COVID era, of someone who attached a ruler to a fan that’s scanning back and forth and sticky-taped their mouse to it.
So that Teams just, you know… and then they zoom in on their Teams profile and it’s like “active”, all of that kind of thing, right? That software can be really invasive, especially if it’s on a mixed-use device that you’re doing personal stuff on as well. Doing that kind of monitoring covertly is largely illegal under Australian law, it’s worth pointing out. But a lot of workplaces have requirements, right, that you sign up to when you’re using work devices, that will tell you that the device is being surveilled, that these are the conditions of the contract, so they need you to sign up to them.
It’s also worth noting that our bias, of course, is kind of office work, professional knowledge work type stuff, but there’s a whole stack of surveillance in other industries that has been intensifying over the years as the technology has developed. Some of it’s driven by COVID, but some of it’s just been the trend. So things like logistics: there’s a lot of coverage of Amazon and similar companies monitoring workplaces like warehouses and employee movements, exactly what you’re packing and when, or monitoring truck driver efficiency for deliveries. There’s Uber and other platforms like that, which have incredibly granular data about where people are, what they’re doing, how fast they’re driving and so on.
There’s a huge amount of tech in mining, biometrics and workplace monitoring, often for safety and regulatory purposes. That’s far from exhaustive, but there are a whole bunch of industries where there’s a stack of other kinds of monitoring. We’re just focusing on, you know, desk stuff, ’cause that’s what we do.
And you mentioned safety, ’cause it’s interesting to see how often the rationale that’s presented is not just, you know, we’re tracking all this for efficiency reasons. Safety and employee wellbeing is often the story that’s told, even in those kinds of warehouse and logistics contexts you were talking about. Or for truck drivers, you talk about monitoring eyelids for sleepiness, or monitoring workers’ movement within the warehouse to ensure there are no unhealthily stationary periods. So it’s often presented that way.
And then you see that kind of safety messaging, that safety rationale, bleed into the hybrid corporate professional context as well. So there are apps and platforms that monitor, you know, employee wellbeing based on your moods and other things like that. It’s very interesting to see that that’s the positioning, but it also gives you an insight into something like employee wellbeing. I mean, that’s insights into your mental health.
So again, you get an insight into the level of sensitivity of the information that’s now being tracked. It’s not just a file with my address on it anymore. It’s this quite rich behavioral and psychological insight.
And it’s what makes this area so interesting and so challenging, right? ’Cause there are some really great uses of this stuff. You know, if I was running a call center, I’d love to know every time a staff member got yelled at, have a little flag that someone they spoke to was abusive or angry and so on, so I can check in on them.
There’s a stack of safety or wellbeing applications that this stuff is great for. But it’s the same technology and the same surveillance that enables a whole bunch of really damaging and problematic uses as well. And that’s the kind of workplace that I think we can all agree we don’t want to work in, where there’s monitoring of where you are, and it’s not about outputs, it’s about what you’re clicking on, or your tone, or…
It’s a good point. ’Cause I probably came across as just cynical in the way I was talking about the safety positioning, but it is a genuine problem that a lot of businesses are trying to work on. In this hybrid environment where people are dislocated from each other, how do you maintain engagement? How do you make sure you have a sense for whether people feel isolated or…
There are legitimate, well-meaning reasons for it, but then, as you say, there are excesses around tracking which are problematic as well.
And that, if anything, just speaks to the fact that the absence of any guardrails, any kind of regulations or provisions around this stuff, is the issue. It’s those guardrails, those provisions, that allow us to pursue the well-meaning use cases confidently.
The proposals in the Privacy Act review are really that, you know, there should be some protections. They don’t go too far in terms of saying exactly what those are. They’re kind of high level in what they’re suggesting: more transparency, applying that security obligation, making sure companies protect the information they collect and notify people about data breaches, but at the same time making sure that employers do have the flexibility to collect information and manage an employment relationship, worry about wellbeing, manage performance, all of that kind of thing, in a fair and reasonable way. I don’t think that regulation solves everything though, right?
Because ultimately, a lot of what we’re talking about isn’t just the existence of the technology or the existence of the monitoring. It’s about the employment relationship in the workplace and how the technology is deployed within that relationship.
And so I think privacy has a role here. But also, like we were just saying, with the way that technology and work have evolved over the last 20 years, and over the last couple of years since the pandemic, the data is there, the technology is there, and they’re increasingly needed for security or safety purposes.
It starts to come down to a negotiation about specific uses and trust and a more nuanced kind of power relationship rather than just keeping stuff secret.
It is, yeah. I think the key word you used there is trust. There are these competing value sets sometimes within organizations. For example, if you worked in a financial institution, a bank, you know that your mission as an organization is also to protect sensitive and confidential information of large numbers of customers, millions of Australians, let’s say. And so often, in a more trusting context, there’s a dialogue about, look, there might be some more restrictive security practices that we’re going to have in place, because we have a mission as an organization to do that.
And so I’m aware of organizations where the CEO has been very open, at least once or twice a year telling staff, look, when you browse the internet on a computer here, we see everything that you do, because we need to monitor that there’s no sensitive customer data leaving the organization.
But we also want to make sure that if there’s malware or something coming back the other way, we can see that, because we want to protect the organization. We want to protect our customers. That’s what we’re all here for.
And it’s a violation of privacy, in some sense, to break open people’s encrypted connection to the internet in order to see that stuff. But that’s the nuanced conversation you’re talking about, where you’re trying to build a more trusted relationship. And so there are different things at play there, you know, in some cases…
That’s exactly how to do it, right? It’s transparent, it’s clearly directed at an outcome that can’t be achieved through some other means, and it’s an effective, proportionate means of achieving that outcome. Fantastic. Whereas making sure that I’m moving my mouse, you know, every minute or something as a measure of my productivity is kind of less effective, right? And that’s, I think, one of the challenges we see in a lot of areas with new technologies. There’s a problem: people are at home, I want to manage their productivity. And there is some typically Silicon Valley startup proposing a magical solution to that problem, which is, my software will give your employees a productivity rating or something based on how much they move their mouse.
And it’s just a digital equivalent of making sure that people are at their desk until at least 6pm, right? It’s not a measure of productivity. They’re sitting around in the office just, you know, waiting for the clock, wanting to show their face.
Yeah, or other versions of that, which are just, you know, you do this stuff because you can, and then you make up a reason for it. So for example, sentiment analysis that consumes everything from, you know, the internal Teams chat in order to get a sense of sentiment. And the rationale you give as an organization is, well, this might give us leading indicators about insider trading or some sort of insider threat. But the process of doing that, you know, is quite violating.
There’s actually really good evidence that that kind of thing pushes in the wrong direction. The more you monitor people, particularly knowledge workers, but really in all sorts of industries, the more you alienate them by refusing to trust them, and the less likely they are to go out of their way to support your company, to go the extra mile or to actually try. You know, if you reduce my job down to how many clicks I did and how much I’m moving my mouse, I’m going to stick my mouse to a fan and meet all your KPIs and not actually achieve the job I’m being employed for.
One of the thoughts I had while we were preparing for this conversation was just the idea that rights in the workplace have often been delivered through collective action.
And that’s always struck me as, you know, potentially something to draw on just for privacy in general, like, you know, outside of the workplace context. One of the reasons we are often at the mercy of large corporates and large platforms is because there is that power imbalance and there’s just me as an individual trying to kind of stand up for my privacy rights.
And it’s interesting to just parallel the fact that many rights in the workplace context are the result of collective action, and to wonder whether there’s a mechanism there, given that the Privacy Act isn’t at play here, it’s industrial relations legislation, whether there’s a role for, you know, collective action to start to move the dial on some of this stuff?
Yeah, I think there is. I mean, it’s interesting that the original justification for leaving privacy of employee records out of the Privacy Act was that it’s an industrial relations thing, and it should be covered under the heading of industrial relations, not the heading of the Privacy Act.
There are some good examples of that in a bunch of industries, particularly in the US, where collective action has led to pushback on that kind of really strict monitoring of meatpacking or warehouse workers or transport workers. The union has made the argument that this is unsafe, that this isn’t the workplace that we want to work in. It’s been the union that has pushed back on that, rather than any kind of regulatory intervention.
I think there’s a real role for that here as well, because that surveillance, those data practices and that monitoring can really impact the experience of the workplace. I think it’s a very reasonable target for enterprise bargaining negotiations to say that certain monitoring, certain uses of AI, certain means of decision-making are out.
We’ve just seen the entertainment guilds, the writers and actors in the US, make a whole bunch of claims, with a fair bit of success, about what exactly their work looks like and how tech interfaces with their work. So yeah, I think that’s right. And the proposals to change the Privacy Act are very much baselines, right? They’re not dealing with any of this detail.
I think increasingly, stuff like how we’re monitored, what the metrics are and how decisions are made about us will have to be a subject for that kind of collective negotiation.
Well, on that note of solidarity…
Right on. Solidarity. Don’t listen to that bit now, Pete. You know, we’ll just get organized down here. Well, let’s reconvene and fight the fight next week.
Fight the fight. Unite. Okay. Thanks, Jordan.