Q+A: Woman claims she was discriminated against by AI technology and had to get her white boyfriend to help her

An Australian woman of South Asian descent has exposed a major flaw in Airbnb’s use of artificial intelligence after she was locked out of the service.

Francesca Dias, a Sydney woman, told her story to the panel on ABC’s Q+A, revealing she was unable to create her own account due to an issue with the app’s AI.

She had to turn to her white partner, who easily set up an account.

‘So recently I found out that I couldn’t activate an Airbnb account simply because the facial recognition software couldn’t match two photos of a photo ID of me, so I ended up having my white male partner make the booking for me,’ she said.

The story was met with horror by the panel, with host Patricia Karvelas describing her treatment as ‘truly shocking’.

Ms Dias’ story was not surprising to AI expert and founder of the Responsible Metaverse Alliance Catriona Wallace, who said it was a social issue that caused the problem.

‘Often society doesn’t have a good representation of the entire population in its data sets because we’ve been so biased historically, and those historical sets are used to train the machines that are going to make decisions about the future, like what happened with Francesca,’ she said.

‘So it’s baffling that this is still the case, and that it’s Airbnb. You would think that a large, global company would be able to get this right, but they still haven’t.’

Karvelas then asked why big tech companies wouldn’t invest more in ensuring all people can use their services, prompting technology journalist Angharad Yeo to say she was ‘optimistic’ about it.

‘Because the technology is still new, I think it’s very easy for them to get very excited about the technology being implemented at all,’ she said.

A Sydney woman of South Asian descent, Francesca Dias (pictured), claims she was discriminated against by AI after facial recognition software prevented her from creating an account

The Q+A panel was ‘saddened’ but ‘not surprised’ by Ms Dias’ claims, agreeing that big companies must do more to ensure their AI does not discriminate against minorities

‘…I think this is one of those areas that really puts a spotlight on these biases. If it’s a little more hidden it’s easy to ignore, but if it’s, “I literally can’t use this service because the AI doesn’t work”, then you really think, we have a real problem here.’

That big companies are not equipped to deal with bias in AI is something that needs to be addressed through regulation, according to CSIRO chief executive Doug Hilton, who said he was ‘not at all’ surprised by Francesca’s story.

‘We have racial discrimination laws and we need to enforce them vigorously, so it’s in Airbnb’s best interest to get this right,’ he said. ‘We actually know technically how to solve this, we know how to actually make the algorithm (work).’
