
Regardless of our interest in our personal appearance, we are not tolerant of anyone stealing or misusing our facial images.
We expect any organisation, governmental or otherwise, who might be handling our image to have a responsible approach to our privacy and not to misrepresent us or expose us to the risk of misidentification.
This is something Hobson’s Pledge and The Campaign Company, handling its campaign against Māori wards in local government, did not seem to understand when the image of a kuia with moko kauae was projected on an electronic billboard last week.
The woman concerned and her whānau were appalled, since she is vehemently against the message on the billboard, which was worded in a way that made it seem like a quote from her.
It is easy to understand why some might interpret this as a cynical way of drumming up more publicity for the Hobson’s Pledge campaign, ramping up the inevitable online vitriol from both sides of the argument, rather than a genuine misunderstanding about the right to use the photo.
The fact the photo came from a company selling stock images should not have overridden the common sense and common decency of checking the person concerned was comfortable with such a contentious portrayal.

Other issues of facial recognition were in the news last week, with the Privacy Commissioner, Michael Webster, issuing the Biometric Processing Privacy Code, which has been developed after considerable consultation.
Mr Webster says biometrics should only be used if they are necessary, effective, and proportionate, the key consideration being that the benefits outweigh the privacy risks.
Biometrics involves the automated recognition of individuals from their biological or behavioural characteristics. They could involve characteristics as diverse as a person’s face, fingerprints, eyes, voice or even the way a person walks or smells.
They can be used for verification, identification, and for categorisation or profiling.
As Mr Webster has pointed out, biometrics can have major benefits, including convenience, efficiency, and security.
However, they can also create significant risks, including those relating to surveillance and profiling, a lack of transparency and control, and problems of accuracy, bias, and discrimination.
The new code comes into force on November 3, but agencies already using biometrics have a year to align themselves with the new rules.
The usual requirements of the Privacy Act will apply as well as requirements for agencies to assess if biometrics are fit for the circumstances, to adopt safeguards to reduce privacy risk, and to tell people a biometric system is being used.
The code also limits particularly intrusive uses of biometric technologies, such as predicting people’s emotions or inferring information like ethnicity or sex, or other information protected under the Human Rights Act.
We hope those organisations using, or wishing to use, this technology take the code seriously.
We have previously been critical of the police attitude to privacy, including its gung-ho attitude to the introduction of smartphones more than a decade ago.
Instead of embedding privacy principles from the get-go, thousands of images were taken and stored on individual police devices, many of them found likely to be unlawful.
The lack of proper storage systems made the images hard to access and is also making it difficult for unlawful ones to be deleted.
This saga has dragged on for years and RNZ reported last week police had missed another deadline on this. Mr Webster had given them until the end of June to find a way to detect and delete all unlawfully collected material in their systems.
They have also not found a failsafe way of ensuring the photos are not used in any way in the meantime.
If it is a funding shortage stopping the police from installing a system to comply with the law, our tough-on-crime politicians need to address it.
The public expects the police to obey the law just as police expect the rest of us to.