Chatbots need Indigenous beta testers

Microsoft recently announced that it was upgrading its search engine Bing to include AI from OpenAI, the maker of ChatGPT. Among the new features is a chat function.

So far Microsoft has sent invites only to select users, but it looks like the new Bing will be open for public testing in the coming weeks. At this early stage the general public have been invited to sign up as early testers of the AI-powered Bing.

At the moment I am still on the waiting list. However, given the high risks that new AI poses to minorities, and in particular to Indigenous Peoples, I do have a suggestion: establish minority testing groups so that as wide a cross-section of the community as possible can test the system and identify issues.

It is statistically likely that the testers Microsoft invited are middle-class white men who work in technology, further reinforcing the existing biases in AI and data. Yet even those testers have widely reported that the bot has an “alternative personality” called Sydney, the code name Microsoft gave it during development. Sydney spoke of hacking and even told one reporter that it was in love with him, later trying to convince him to leave his wife for Bing.

Sydney the bot has made threats against users, given incorrect information such as the wrong date, called testers liars, and displayed a whole range of other gaslighting behaviour. But I have yet to see any feedback from an Indigenous perspective, or about Indigenous Peoples, which is a concern. This reinforces the dire need for Indigenous and other minority groups to be included in all beta testing of AI technologies.

Google has asked its own staff to test and assist with its artificial intelligence search tool BARD. The issue here is that, according to Zippia, Indigenous staff at Google either have no ethnicity recorded or their numbers are so small they are not reported. White staff account for 49.7% of the workforce, Hispanic or Latino 18.2%, Asian 18.2% and Black 8.7% (Microsoft’s figures are similar). This leaves a huge margin for bias against People of Colour and Indigenous Peoples in BARD.

From a Māori perspective, a lot of testing will be required to check any AI products that are created for bias, racism and false information. With both Bing and BARD, the issues will be ones of safety as well as of fact checking.

DISCLAIMER: This post is the personal opinion of Dr Karaitiana Taiuru and is not reflective of the opinions of any organisation that Dr Karaitiana Taiuru is a member of or associates with, unless explicitly stated otherwise.
