29 September 2019

Parenting, privacy and the future

elevenM Principal Melanie Marks reflects on the need for better public engagement on the future of privacy, much as is emerging around climate change.

Sitting at a symposium on children’s online privacy a little over a week ago, I had the footsteps of a thousand marching children running through my head. Just streets away from the event was the Melbourne base of the national climate change rallies. In the room was an inspiring group of academics, regulators and policy makers, grappling with the future of surveillance and profiling. Both are social issues concerning the future of humanity.

I pondered why the climate rally had delivered so many to the streets now, when we have known about climate change for years. I concluded that what has changed is that we can see it now, and we are parents. We see the world through our kids’ eyes: strangely short seasons, farmers without water and a disappearing reef are some recent reflections in the Marks household.

Privacy harm is more nebulous. The policy issues are hard to solve, and engaging the public is harder still.

Who could blame consumers for tapping out of this conundrum? According to symposium speaker Sven Bluemmel, Victoria’s Information Commissioner, the model that underpins global privacy frameworks is broken. It rests on the idea that we, the consumers, are free, informed and empowered to make decisions. But the paradigm is stretched. AI, if done properly, reaches unexpected conclusions. If you can’t explain it, says Bluemmel, you can’t consent to it. If you don’t collect information but merely infer it, the moment to ask for consent never comes. Add to this the privacy paradox: consumers say we care, but at crunch time our behaviours belie these words. We are bombarded with notices, consents, terms and conditions, breach notifications and more. We are fatigued. We would need 65 hours every day to read all this content.

As a result, we consumers are blindly making a trade-off every time we accept terms and conditions we do not read and cannot understand, weighing benefits now (e.g. convenience, incentives, personalisation) against future, undefined harms.

One of the guests at the symposium asks why regulators aren’t doing more to beat corporations with the available sticks. It’s a fair question, deserving of a response that features cold hard facts, plans and budgets, but it is not the right question right now. The real question is what kind of future we want for ourselves and for our kids.

Do we want to be known in all places, spaces, channels, forums? Do we want decisions to be made by others for us, by stealth? Do we want our mistakes and those of our offspring to be recorded for posterity? Do we want others to film, record and analyse our faces, our moods, our behaviours? Do we want to be served a lifetime of curated content?

The speakers at the symposium present a range of perspectives. Youth ambassadors working with OVIC share a generational view on privacy: they do care about it, but with a 2019 twist. For a generation used to sharing their lives on social media, privacy means controlling what others know about you, how they use it, and who those others are.

Prof. Neil Selwyn of Monash University’s Faculty of Education talks about the impacts of datafication and reductionism (defined as “reducing and turning things into data – remove context and focus on specific qualities only”) in schools: robots to replace teachers, identifying and addressing isolation in the classroom with 71% accuracy (so low?); school-yard surveillance to catch antisocial behaviours; and a shift to using this surveillance data for anticipatory control of behaviour rather than the ‘old school’ disciplinary response. Selwyn talks about the ease with which these technologies have rolled into schools. They are a soft touch, a “loss leader” that introduces new technologies to huge sections of our society without the necessary debate, easing the way for subsequent roll-out in hospitals and other public spaces.

Another perspective comes from Tel Aviv University’s Sunny Kalev, who talks about the breakdown of boundaries between our key spaces: school, home, physical and virtual. Data now flows between these domains for children, which basically means there is no escape; no place is private. He calls on the words of Edward Snowden: “A child born today will grow up with no conception of privacy at all. They’ll never know what it means to have a private moment to themselves, an unrecorded, unanalysed thought. That’s a problem because privacy matters. Privacy is what allows us to determine who we are and who we want to be.”

Dr Julia Fossi is more optimistic about what the eSafety Commissioner’s Office has been able to achieve, imagining a future where we can work with big tech to deploy AI-driven technologies to keep us safe, protect the most vulnerable among us, and send help to people in crisis. It’s a utopian vision that presents a counterpoint to my ‘do we want to be known’ questions. It also gives a little dose of optimism: it is possible to find a middle ground, embracing the best of technology while protecting what matters.

The challenge is building the same kind of consensus around the more nebulous privacy harms as exists around safety, or as is emerging around climate change.