Protecting Children or Profiting From Them? A Question for the Education Minister
- Dec 30, 2025
- 8 min read
by ELV

Admittedly, there is nothing I can say or do to stop the rollout of Janison’s SMART tool.
My own school has already been approved for six months of PLD to support its use.
This is not a piece about refusal.
It is a piece about understanding.
Because SMART feels like a scab, raw and inflamed, one that will leave a scar on our children no matter how much we pick at it.
There is plenty to say about the shoddy procurement, about a Minister planning this tool years before legislation ever reached the public, about the familiar hunger for a system-level solution (designed offshore) to harvest data on our children in the hope of resurrecting something dangerously close to National Standards.
But there is more educators and whānau deserve to know. And this concern is grounded not in fear, but in documented failures, media reporting, and international warnings.
It won’t stop the machine.
But it matters that we understand what kind of machine this is — and who it ultimately serves.
Much of what follows is conspicuously absent from the Risk Assessment released through the OIA process. Yet it is all sitting in plain sight in Janison’s 2025 Shareholders Annual Report — written not for teachers, not for whānau, not for children, but for investors.
And when companies speak to investors, they are careful — but they are honest.
Just not in the way the public is usually invited to read.
What Actually Sits Behind SMART
The AI software underpinning Janison’s ed-tech ecosystem is called Jai — Janison AI.
The problem is not AI itself.
The problem is that many of the grown-ups in the room — and yes, that includes politicians — do not fully understand how AI systems work.
I am not an AI engineer. But I have completed two micro-credentials and earned 20 credits through Dr Craig Hansen’s Summit Institute. I know enough to say this plainly:
AI systems learn from what they are fed. Humans feed them.
That means our histories, our hierarchies, and our blind spots are baked directly into machine learning. Bias is not a glitch — it is a predictable outcome of the data we give it and the world that data comes from.
If readers want to see this made visible, the short video How AI Image Generators Make Bias Worse is worth watching. It shows how AI reproduces stereotypes around race, gender, power, and “normality” — not because the machine is malicious, but because it is statistically copying patterns it has absorbed.
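To make that mechanism concrete, here is a deliberately tiny sketch of my own (pure Python, and nothing to do with Janison's actual Jai system): a "model" that learns what is "normal" purely from frequency in a skewed sample, then flags everything else. The data and the flag rule are invented for illustration.

```python
# Toy illustration only: a naive "model" whose notion of normal
# is just the most common pattern in its training data.
from collections import Counter

# A skewed sample: most examples come from one group of students,
# so their working pace defines the statistical "norm".
training_paces = ["steady"] * 90 + ["bursty"] * 10

counts = Counter(training_paces)
norm = counts.most_common(1)[0][0]  # the learned "normal"

def flag(pace: str) -> bool:
    """Flag any pace that deviates from the learned norm as 'suspicious'."""
    return pace != norm

print(norm)            # the majority pattern becomes "normal"
print(flag("bursty"))  # a valid but under-represented pattern gets flagged
```

Nothing here is malicious. The skew in the sample alone is enough to turn difference into deviation, which is the whole point.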
Once bias is automated, it looks neutral.
Objective.
Scientific.
And when systems like this are applied to education and assessment, those representational harms don’t disappear — they scale.
Difference becomes deviation.
Pattern becomes judgement.
And harm hides behind code.
The Line We Crossed Without Naming It
Jai has been trained primarily on westernised datasets, deficit-framed assessment models, and linear, time-bound notions of success. These are not neutral foundations. They are ideological ones.
There is nowhere near enough Māori data online to have trained this system in ways that honour mātauranga Māori, relational learning, whakapapa-based understandings of progress, or cyclical growth.
The moment Māori students — and all New Zealand students — engage with this system, their data becomes part of the machine.
Their responses.
Their behaviours.
Their hesitations.
Their neurodiversity.
Their difference.
That data does not stay in the classroom.
It becomes part of Janison’s asset base.
Many ed-tech companies around the world would dearly love to claim their AI is “culturally responsive”. Janison can now inch closer to that claim — because Aotearoa has handed them the data to do so.
But who agreed to that?
Were whānau consulted?
Were iwi asked?
Were teachers invited into informed consent?
No.
No.
And no.
This data was effectively gifted on our behalf — without mandate, without transparency — by Ellen MacGregor-Reid, Pauline Cleaver, and Erica Stanford.

And once data enters an AI system, it does not simply leave.
That is not innovation.
That is extraction.
Your child’s learning data should never become someone else’s product.
Surveillance, Rebranded as Progress
Janison’s platform also includes ‘remote proctoring capabilities’.
It sounds technical.
It sounds efficient.
It sounds harmless.
It is none of those things.
Remote proctoring allows assessments to happen anywhere — bedrooms, living rooms, libraries — while students are watched by webcams, tracked by eye-movement software, recorded through microphones, and flagged by algorithms deciding what looks “suspicious”.
There is no teacher present.
No relationship.
No context.
A system decides what counts as normal.
For neurodiverse learners, anxious learners, and traumatised learners, this is not neutral. It is punitive by design.
This is not assessment in the service of learning.
It is surveillance in the service of scale.
We Have Seen This Before
NAPLAN Online is the shiny star Janison waves at investors.
But the risk is never theirs.
NAPLAN has been widely criticised for increasing anxiety, narrowing curriculum, distorting teaching priorities, and amplifying inequity. Its online rollout has been plagued by technical failures and unresolved questions about validity.
When large-scale delivery failed in New South Wales, it made national news. Crowd control collapsed at major test sites. Hundreds of students and whānau were left waiting for hours. Police — including a riot squad — were called in.
Janison called this an “operational challenge”.
On the ground, it was distressing.
For children, frightening.
For families, unacceptable.
For those of us in Aotearoa, it sounds eerily familiar.
Because we have lived this before.
It echoes Novopay — where warnings were raised early, reassurances were offered loudly, and the real cost was carried by people who never consented to be part of the experiment.
History tells us this much:
When public systems become test beds for complex digital infrastructure, risk is never absorbed at the top.
It lands on children.
It lands on whānau.
It lands on schools.
Even the speed and scale of this procurement echo the Novopay process we were supposed to have learned from.
Crucially, the Auditor-General found that the Ministry, at the time:
“Did not fully understand the risks it was taking on”
and that the decision to proceed was driven by time pressure rather than readiness.
We Don’t Have to Imagine the Risks — We’ve Seen Them Reported
This is not speculation. The risks of large-scale, high-stakes online assessment have already played out publicly — and they have been documented by mainstream media, not critics on the fringe or ‘loony lefties with political agendas’.
In May 2025, national news outlets across Australia reported that riot squad police were called in to manage crowd chaos outside selective school placement exam sites in New South Wales after serious logistical failures during online testing. Thousands of students and whānau were left waiting for hours, tests were delayed or cancelled, and parents described confusion, poor communication, and unsafe crowd conditions.
The NSW Department of Education issued an apology and acknowledged the situation “fell short of expectations” (ABC News coverage). This happened just two months before the Ministry and the Minister announced that Janison had won the New Zealand contract. The risk was known to them all.
Within days, a sweeping review was announced into what went wrong — including planning, venue coordination, safety, and online delivery. That review was not triggered by ideology. It was triggered by failure (News.com.au reporting).
The Guardian confirmed similar issues across multiple Sydney sites, reporting crowd control breakdowns, traffic congestion, and police intervention as authorities scrambled to keep students safe (The Guardian report).
Even financial and investor reporting outlets acknowledged the disruption, noting that Janison postponed NSW school tests due to safety concerns, working with the Department to reschedule assessments after the scale of the problem became impossible to ignore (TipRanks summary).
This matters because these incidents were not edge cases. They were the product of scale meeting reality.
And this is not limited to crowd management.
Australian educators and unions have long warned that NAPLAN Online introduces distortions and inequities that are not captured in official reporting. A government document revealed that differences do exist between online and pen-and-paper test results, despite repeated assurances that the formats were comparable — validating long-standing teacher concerns that modality affects outcomes.
Unions and academics have repeatedly warned that NAPLAN has become a policy-driving force, shaping curriculum, resourcing, and teaching priorities in ways that narrow learning rather than support it. As one Australian educator put it, “When performance and policy decisions are dictated by a narrow measure such as NAPLAN scores, it severely inhibits the capacity for educators to do things differently.”
I could list many more headlines.
What I have struggled to find are stories about thriving children, calmer classrooms, or empowered teachers as a result of these systems.
What I have found instead are apologies, reviews, postponements, and warnings that were raised early — and not listened to.
For New Zealand educators, this should feel uncomfortably familiar.
Because we have been here before.
We were told Novopay would be efficient.
We were told it would modernise the system.
We were told problems would be minor and temporary.
And we remember who paid the price.
The Government’s Contradiction
It is the role of government to protect children. Yet we are living with a glaring contradiction.
The same Education Minister who publicly condemns social media giants as dangerous and unregulated — and who has made holding them to account a centrepiece of an election-year campaign — has quietly signed off on arrangements that expose children’s data and privacy to an ed-tech company without consent from whānau or teachers.
The United Nations has already warned governments about this exact scenario: that ed-tech companies process and share children’s data far beyond the school gate; that the data is deeply personal and often sensitive; that harvesting it infringes children’s rights to privacy, education, and freedom from commercial exploitation; and that children themselves are unhappy about it.
The UN also warns that schools and Boards of Trustees are unfairly burdened with navigating opaque contracts and legal risk — while we are simultaneously told that Te Tiriti o Waitangi is too complex for Boards to understand or enact.
Apparently Boards can manage AI, data protection law, and global commercial interests — but cannot be trusted with partnership, protection, and participation.
That contradiction ought to trouble us deeply.
This is a public-interest alarm.
If there are lawyers, privacy experts, or public-law specialists reading this — we need you.
Is it lawful for children’s standardised assessment data, including data from neurodiverse and vulnerable learners, to be used to train AI systems without explicit, informed consent from whānau?
Is an “opt-out” meaningful once a system is embedded nationally — or does participation become compulsory by design?
At what point did administrative approval become public consent?
Can a Minister get away with this?
And perhaps the most important question of all:
Surely we have a say?
Because if we do not speak now — if silence is allowed to stand in for consent — then this is not just about SMART.
It is about the future rules of engagement between children, data, power, and profit.
And once those rules are set, they are very hard to undo.
This is the moment.
Before the scar sets.
Children should never be the collateral damage of speed, scale, or political ambition.
Sources & Further Reading
Australian media reporting on Janison-delivered assessments and NAPLAN Online
ABC News — Riot squad called in to control chaos at NSW selective school exam sites (May 2025): https://www.abc.net.au/news/2025-05-02/nsw-riot-squad-called-control-chaos-selective-school-exam-sites/105245958
The Guardian Australia — Police called as NSW selective school exams descend into chaos: https://www.theguardian.com/australia-news/2025/may/02/nsw-selective-school-exams-crowd-traffic-issues-police-called-ntwnfb
News.com.au — Sweeping review flagged after NSW selective school tests postponed amid crowd chaos: https://www.news.com.au/national/nsw-act/news/sweeping-review-flagged-for-nsw-selective-school-testing-after-tests-postponed-from-crowd-control-chaos/news-story/a5a5a788fd797e4d76fd38e987cdbaa9
TipRanks (Investor Reporting) — Janison postpones NSW school tests due to safety concerns: https://www.tipranks.com/news/company-announcements/janison-postpones-nsw-school-tests-due-to-safety-concerns
Evidence on NAPLAN Online validity, impact, and inequity
ABC News — NAPLAN: Differences do exist between online and pen-and-paper test results, document reveals: https://www.abc.net.au/news/2023-08-07/naplan-differences-online-paper-tests-document-reveals/102691882
The Guardian Australia — Critical or troublesome? All you need to know about NAPLAN and its impact: https://www.theguardian.com/australia-news/2024/mar/12/naplan-test-2024-paper-explainer-what-is-the-test-results-released
Australian Education Union / educator commentary (referenced in Guardian & ABC coverage): repeated warnings that NAPLAN has become a policy-driving force narrowing curriculum, teaching priorities, and resource allocation.
International warnings on EdTech, children’s data, and rights
UNICEF / UN Human Rights / Data Futures Commission — Children’s data and digital rights in education. Summary of findings on data mining, commercial exploitation, privacy breaches, and the burden placed on schools and boards: https://www.unicef.org/globalinsight/reports/childrens-data
Digital Futures for Children (DFC) — EdTech and children’s rights https://www.digital-futures-for-children.net
Understanding AI bias (public explainer)
Video explainer — How AI Image Generators Make Bias Worse. Demonstrates how AI systems reproduce and amplify existing social biases based on their training data.
Corporate disclosures
Janison Education Group — 2025 Shareholders Annual Report
Referenced for statements on AI (Jai), global market strategy, platform customers, and operational challenges.
Available via ASX/company investor relations publications.