
ACT MP Moves to Criminalize Harmful Deepfakes, Citing a Dangerous Gap in NZ Law

  • Phoebe Robertson
  • Oct 9
  • 3 min read

CW: Self Harm, Revenge Porn


When first-term ACT MP Laura McClure stood in Parliament and held up a blurred, nude AI-generated image of herself, the stunt landed with a jolt. She had deepfaked her own likeness, she says, to force colleagues to confront how easily such images can be made—and how New Zealand law hasn’t kept pace.


McClure did not imagine a career in politics. After university she trained as a pharmacy technician, then spent nearly two decades in her family’s health-and-safety business, gravitating to ACT over time on small-business issues and the End of Life Choice debate. “It wasn’t a chosen career path,” she says. 


Deepfakes entered her world through school visits in her education portfolio, where principals and parents began flagging cases. One, she says, involved a Year 9 girl in Auckland—just 13—who was deepfaked and later attempted suicide. The gendered nature of the abuse, she adds, is unmistakable. “Someone has to do something.” 


Her members’ bill aims to make it explicitly illegal to create or distribute synthetic sexual images intended to harm a person—closing what she describes as a glaring hole in the Harmful Digital Communications Act (HDCA). Sharing real intimate images without consent is already a crime; using someone’s face to fabricate them, she argues, can be “far more damaging,” yet is not clearly outlawed. To her knowledge, there have been no successful prosecutions under the current framework for deepfakes.


Rather than chase the technology, McClure borrowed from South Korea’s behaviour-based approach. Regulating the tech is “whack-a-mole,” she says; the image she showed MPs took minutes to make on a free site with a couple of clicks. “Are you 18? Do you have consent? Done.” The bill, therefore, targets the act of weaponising synthetic images, not the tools themselves. 


Getting any members’ bill heard is a lottery—literally. Proposals sit in a “biscuit tin” and rely on the ballot to be drawn. McClure says she deepfaked herself to attract attention across the House. It worked, but it wasn’t easy: “I’m pretty new… it was terrifying,” she says of showing the image in the chamber. 


Since then, she’s found unusual allies. McClure says MPs from multiple parties reacted with surprise that the conduct isn’t clearly illegal and expressed willingness to act. The Greens, she says, nearly co-signed before stepping back; Te Pāti Māori has been supportive. “This isn’t political. Anyone can be affected,” she says. Her lobbying pitch stresses urgency and asks colleagues to prioritise this bill over their own in the ballot—no small ask in a system where backing one bill can mean sidelining another.


What about ACT’s free-speech instincts? McClure argues the line is bright. “We already criminalise sharing someone’s nudes without consent,” she says. “Saying it’s okay because it was made with technology makes no sense.” She insists the bill protects legitimate, creative uses of AI by focusing on malicious behaviour: “We want to protect people who use it for good.”


With a maximum penalty around two years, criminalisation would principally send a signal and unlock support pathways. Schools that have dealt with cases lack consistent victim services precisely because the conduct isn’t clearly a crime, she argues. Naming it as such enables police, victim support and other agencies to act coherently. “It says to young people: this is serious.” 


McClure places her bill within a broader, overdue update of the HDCA, which she notes is roughly a decade old. Technology will keep moving—today’s deepfake will be tomorrow’s something-else—so she favours faster, periodic reviews to close new loopholes before harms metastasise.


For students, the call to action is immediate: email MPs and lend your name to petitions urging the Government to add the issue to its work programme, McClure says. And if you’re targeted, do not go it alone—contact Netsafe for takedowns and support, and talk to police. Even under current law, some remedies are available. “Please reach out,” she says.


McClure’s gambit—using a fabricated image of herself to argue for a real law—captured attention because it compressed the issue into a single, unsettling truth: if an MP can be deepfaked in minutes, so can anyone else. Her bet is that Parliament won’t wait for catastrophe to prove the point. 

