The stunt had a purpose. The image was an AI-generated deepfake, which she created in just a few minutes using software freely available on the internet.
There are hundreds of such sites that can "nudify" a supplied photo. Some require proof of consent; most do not.
Software to alter or edit photographs has existed for decades, but this technology, which can also be used on videos or to create them, represents a whole new level of digital manipulation.
Research released last year suggested, probably unsurprisingly, that celebrities and other prominent people, even backbench New Zealand list MPs, are those most likely to find their images altered in this way. US singer Taylor Swift is but one prominent example.
Perhaps inevitably, 90%-95% of such images are used for pornographic purposes. Almost as inevitably, 90% of the images altered by such software are of women.
Some might regard such behaviour as a novel form of "artistic" free speech; others may consider it to be harmless fun.

Ms McClure quite literally put herself out there to raise awareness of the issue, and to push for wider parliamentary support for a Member’s Bill in her name, which seeks to have non-consensual deepfake images treated in the same way the law treats illegal intimate visual recordings or harmful digital communications.
Last week Ms McClure took another General Debate call to once more promote her Bill.
"It's been four months since I put my Member's Bill into the tin, and it could sit there for ever," she said.
"At the time, I shared the stories of those that had been traumatised by this form of digital harm, but what I didn't expect was the global exposure this would get and the flurries of further concerns that came through into my inbox."
What she also could not have expected was that, by sheer chance, her number would be drawn in the very next Member’s Bill ballot the following day. The Deepfake Digital Harm and Exploitation Bill now sits on the order paper and awaits its first reading.
If it remains a Member’s Bill, that first reading might come late this year, but early next year seems more likely. That is unless the government adopts it as government business, something not beyond the realm of possibility given National has already backed its MP Catherine Wedd’s Bill to restrict under-16s’ access to social media.
Coincidentally, Ms Wedd’s Bill was drawn in the same ballot as Ms McClure’s Bill.
Ms McClure finished her speech with a passionate plea for cross-party support for her Bill, noting that Te Pati Maori MP Hana-Rawhiti Maipi-Clarke had already expressed her support.
"We're not talking about a bit of fun, or a bit of a joke; we're talking about young individuals, nearly always female, that are finding themselves abused by this kind of behaviour.
"Having your image taken and turned into some kind of pornography is so damaging. One of the survivors of this has found herself unable to get a job, for example, because her potential employer googled her and found pornography that wasn't even her. Dropping out of high school, dropping out of university — these are serious things."
Ms McClure is right, and also correct to say that sophisticated solutions are needed rather than blanket bans. That legislation and law enforcement struggle to keep up with the pace of technology is no excuse for ignoring a real problem.
The legislation Ms McClure proposes amending badly needs teeth, and those tasked with enforcing it need the resources to do their job in a timely manner. As things stand, the authorities have neither the tools nor the teeth to aid those captured in such malicious fakery.
That Ms McClure’s own party, the avowed defenders of free speech, recognises this is "speech" which needs to be curbed should be all the incentive other political parties need to get on board and pass laws to protect unwitting victims of malevolent behaviour.