How Much More of Our Children’s Data Will Be Exported Overseas? This Isn’t Protecting Kids From TikTok — It’s Exporting Their Faces.

Do You Know What It Would Take to Ban Under-16s From Social Media?
When a Minister jumps on a podcast and promises to ban every under-16 in Aotearoa from social media before the next election, it sounds simple. It sounds decisive. It sounds like someone doing something “for the kids.”
And I’m sure it’s a slogan B416 will absolutely adore — clean, comforting, and perfectly engineered for a campaign poster. The target is obvious: parents, the voters the Minister needs next. A promise polished for the billboards, not for the realities of the classroom, or the night-time glow of a phone held by a lonely child.
But promises like this always have a price. And in this case the currency is our children’s data, identity, and power.
Anyone who has ever worked in a school, raised a teenager, or tried to navigate the digital world with our tamariki knows: nothing about this is simple. And nothing this big can be done in ten months without very heavy costs — costs that will fall, as they always do, on those with the least power.
To understand the true danger, we have to understand the machinery beneath the headline.
To deliver on Erica’s promise, the government would need to erect a whole new surveillance structure around our children.
Not metaphorical.
Not philosophical.
Literal.
Technical.
Legal.
International.
And here is the part no slogan, no podcast promise, no campaign poster will advertise to its voters: to identify every child, the system must first identify every adult.
A ban on under-16s cannot work unless platforms know exactly who is over 16.
Which means your face, my face, every parent’s face, every adult’s face becomes part of the verification net.
To keep one group out, the system must scan everyone in.
What is being sold as a child-safety measure is, in practice, the creation of the largest biometric database Aotearoa has ever seen — built at speed, outsourced overseas, and normalised under the comforting banner of “protecting kids.”
But it isn’t only our children’s data on the line. It’s all of ours.
It begins with a rushed new law granting the state — and overseas tech vendors — unprecedented access to children’s identities. A nationwide ban cannot be enforced through platform guidelines alone. It would require legislation forcing social-media companies to verify the age of every New Zealand user, which means uploading ID, matching faces to documents, or scanning a child’s face to estimate their age.
In the real world, that is biometric data, and biometric data does not stay on our shores. It goes where the vendors are, where the contracts sit, and wherever storage is cheapest.
Facial templates — digital fingerprints that can never be changed, even if breached — would be sold, traded, stored or analysed under frameworks we do not control.
Because no sovereign, Tiriti-honouring, privacy-protected biometric system can be built in ten months, Aotearoa would be forced to import one. It would come from the US, Australia, Europe — from whoever can deliver by the deadline.
And with that import comes their algorithms, their biases, their failure rates for brown skin, young faces, Māori features and neurodiverse expression, along with their ownership and retention of our children’s stored data. This is not “protecting kids from TikTok.” This is exporting their faces.
Every parent, caregiver and kaiako should be asking: What values are we trading for this? Whose legal, cultural and ethical frameworks will govern these systems? And where does Māori data sovereignty sit within a ten-month imported solution? Because the truth is simple: it doesn’t.
There would also be no time for meaningful consultation with Māori, rangatahi, educators, disability advocates or trauma-affected communities. A change of this scale requires deep kōrero, slow kōrero, intergenerational kōrero. But a rushed rollout erases those voices entirely, all in the name of a political campaign.
And in communities like ours, where many of our tamariki carry anxiety, neurodiversity, grief or trauma into their school bags every day, online spaces can be the one place they can exhale. A space where friendships feel safer, communication is gentler, and they can participate without fear of being seen “wrong.” These are the young people who will feel the sudden rupture most, the ones least considered in the political rush.
Meanwhile, the A-list suburbs will applaud the Minister for “fixing social media.” They won’t be the ones paying for the downstream harm. They won’t be the ones whose kids are mis-aged, mis-flagged, locked out of community, or swept into a biometric dragnet they never asked for.
So we must ask, urgently and honestly: Whose children does this policy actually protect? And whose does it place under surveillance?
Because from Te Tai Tokerau to the motu, the risk is that we trade our children’s data, our children’s autonomy, our Treaty obligations, and our collective power for a slogan on a billboard.
Our communities deserve policy shaped with us, not done to us.
Our tamariki deserve safety crafted through compassion, not control. And if a Minister is willing to hand our children’s digital fingerprints to offshore companies for a pre-election win, then we must be willing — as parents, as whānau, as educators — to stand tall and say:
Not with our kids.
Not with their data.
Not with their futures.
Aotearoa is capable of so much more than this.



