Te Kete o Karaitiana Taiuru (Blog)

Customary Māori Values and Copyright Legislation for Facial Recognition Technology (FRT)

For many years, I have written about the risks that Māori and other minority communities face from emerging digital technologies, in particular deepfakes, cultural appropriation (such as the misuse of moko), voice cloning, and biased facial recognition systems.

These technologies raise significant concerns, including identity theft, fake pornography, digital impersonation, and the broader misuse of Māori identity and data. We continue to see Māori with facial moko have their images stolen and sold as artworks, and fake advertisements using people’s videos and images. Perhaps the most famous cases involve Tame Iti and Oriini Kaipara. Among other issues, moko are being copied from photographs and used to train AI image generators.

Recently, the Privacy Commission released findings from the Foodstuffs Facial Recognition Technology trial. They concluded that the trial complied with the Privacy Act and that live facial recognition was effective in reducing repeat offending during the period under review.

The Minister of Justice, Hon. Paul Goldsmith, was quick to endorse the findings, stating:

“I expect our Ministerial Advisory Group will continue to look at this technology as an option to be used more widely and engage with the sector on it. I’ll be encouraging the MAG to take this report into serious consideration.” (New Tools to Fight Retail Crime Welcomed)

 

It is unfortunate that Māori perspectives were largely absent from emerging tech ethics and governance. There was a missed opportunity by the Privacy Commission to engage meaningfully with Māori communities, AI experts, and other government agencies already working with Māori, to ensure a diverse range of views on FRT were considered. However, the Privacy Commission’s mandate was limited to the scope of the Privacy Act.

Although the Commission stated it took advice from a range of Māori voices, it also has a Māori advisory group that largely consists of individuals from a single network and lacks in-depth technological expertise. This group recommended that facial recognition not be used in supermarkets, citing concerns about Māori being disproportionately surveilled by authorities. While valid, this blanket opposition risks unintended consequences, such as supermarkets becoming targets for criminals because they are among the few retail locations not using FRT. It also potentially contradicts feedback from Māori community consultations, particularly from victims of domestic abuse, stalking, and child custody disputes.

 

What Māori Communities Are Saying

From the consultations, a more nuanced position emerged:

  • Māori are not fundamentally opposed to FRT.
  • There is a desire to see the technology co-designed with Māori to address potential bias before it is deployed.
  • Tikanga Māori should be respected in its development and application.
  • Ownership, human oversight, training data, and the secure storage of and access to biometric information are essential considerations.

 

Ongoing Concerns

The following issues remain unresolved:

  • FRT systems are not trained on datasets that represent Māori and Pacific faces.
  • These systems regularly misidentify people with darker skin, especially women.
  • Trials by the Department of Internal Affairs (DIA) and Police showed inconclusive results for Māori and Pacific participants.
  • DIA trials revealed inconsistencies when facial moko was present.

 

Historical and Cultural Context

Pre-colonisation, everyone’s face was sacred to the individual, and no one was permitted to touch another person’s face without strict permission. Many other protocols governed daily life: reflections in the water were considered tapu, so you could not drink from it; shadows cast upon certain landmarks were bad omens; and among the consequences of war, your head might be used as a decoration to humiliate your hapū (clan) and whānau (family).

Traditional stories warn of the dangers and consequences of your face or voice being stolen, and for many, the popular Māui stories of how Māui could change his appearance into various birds have become part of primary school teachings.

There were protocols about speaking, and about echoes and voices being stolen. For some people of higher social status, there were protocols about showing various parts of the body in a society where clothing was largely not worn during the summer months.

Today, tikanga (customary practices) and mātauranga Māori (ancestral knowledge) continue to guide many Māori families and communities. It is therefore important that these traditional views are also considered.

 

International Perspectives and Potential Legislation

Denmark recently announced plans to introduce legislation allowing individuals to copyright their facial features and body images. Culture Minister Jakob Engel-Schmidt said:

“Everybody has the right to their own body, their own voice, and their own facial features… which is apparently not how the current law is protecting people against generative AI.”

This move would enable Danes to demand the removal of deepfakes and AI-generated imitations of their likeness or performances, especially when shared without consent. Non-compliant tech platforms would face severe penalties, and impersonated individuals could seek compensation.

In the U.S., the Take It Down Act criminalises the distribution of non-consensual intimate imagery, including AI-generated content and deepfake ‘revenge porn.’

These legal moves align closely with Māori values. They address the same threats: AI-generated fakery, identity theft, cultural erasure, and misuse of voice and likeness. They reflect a growing global recognition of personal sovereignty in digital spaces.

 

A Māori-Led Approach Forward

For Māori, legislation must also consider protocols around the deceased and moko. Many wānanga have expressed a strong desire to uphold traditions around images of those who have passed. As the DIA considers upgrades to the Births, Deaths, and Marriages database and iwi data governance, this is a timely moment to include protections for the living and the deceased.

A law similar to Denmark’s could be adapted for New Zealand, allowing individuals to copyright their image, voice, and other biometric data. This would offer protection in what is fast becoming a digital wild west, where tech innovation has outpaced ethics and accountability.

 

Conclusion

Yes, there will be complexities in implementation. But if Denmark can chart a path, so can New Zealand. A uniquely New Zealand approach that is rooted in Te Tiriti o Waitangi, tikanga Māori, and contemporary human rights, could provide both protection and leadership in the face of fast-moving technology.

 

Disclaimer: Image created by deep.ai using the prompt: “copyright symbol, facial recognition and one a light brown skinned face”.

DISCLAIMER: This post is the personal opinion of Dr Karaitiana Taiuru and is not reflective of the opinions of any organisation that Dr Karaitiana Taiuru is a member of or associates with, unless explicitly stated otherwise.
