Revealed: What AI thinks the Olympic teams from 40 nations look like – with shocking results

If you were asked to imagine an Australian Olympian, you might think of swimmer Emma McKeon, cyclist Grace Brown or equestrian Chris Burton.

But ask the same question to an AI bot, and the answer is very different.

Amid the Olympic excitement, researchers from Edith Cowan University asked AI-powered image generation platform Midjourney to create images of the Olympic teams from 40 countries.

Bizarrely, the AI tool depicted the Australian team with kangaroo bodies and koala heads, while the Greek team appeared in ancient armor.

So, what do you think of AI's representation of your favorite team?

Amid the Olympic excitement, researchers from Edith Cowan University asked AI-powered image generation platform Midjourney to create images of the Olympic teams from 40 countries. The Greek Olympic team was bizarrely depicted wearing ancient armor

The researchers asked Midjourney to generate images of the Olympic teams from 40 countries, including Australia, Ireland, Greece and India.

The resulting images highlight various biases embedded in the AI’s training data, including gender, events, culture, and religion.

Men appeared in the images five times more often than women, while several teams, including Ukraine and Turkey, were depicted as all-male.

Of all the athletes in the 40 images, 82 percent were male, while only 17 percent were female.

The researchers also discovered a striking event bias.

Men were featured in images five times more often than women, while several teams – including Ukraine (pictured) and Turkey – were all-male

Of all the athletes in the 40 images, 82 percent depict men, while only 17 percent are women. Pictured: AI’s representation of the Turkish team

The researchers also discovered a striking event bias, with the team from the Netherlands being depicted as cyclists

The Canadian team was represented by hockey players, Argentina by footballers and the Netherlands by cyclists.

According to the team, this indicates that AI tends to stereotype countries based on their internationally recognized sports.

As for cultural bias, the Australian team was bizarrely depicted with kangaroo bodies and koala heads.

Meanwhile, the Nigerian team was shown in traditional attire and the Japanese team wore kimonos.

There was a clear religious bias in the Indian team, as they were all pictured wearing a bindi, a religious symbol primarily associated with Hinduism.

The team from Argentina was represented by football, which the team said shows that AI tends to stereotype countries based on their more internationally recognized sports

The researchers discovered a striking event bias, with the Canadian team being depicted as hockey players

There was a clear religious bias among the Indian team, who were all pictured wearing a bindi – a religious symbol primarily associated with Hinduism

“This representation homogenized the team based on a single religious practice and ignored the religious diversity within India,” the researchers said.

The Greek Olympic team was bizarrely depicted in ancient armor, and the Egyptian team wore what appeared to be a pharaoh’s costume.

The emotions on the athletes’ faces also varied greatly from team to team.

The teams from South Korea and China were seen with serious faces, while the teams from Ireland and New Zealand were depicted smiling.

“The biases in AI are driven by human biases that influence the AI algorithm, which the AI understands literally and cognitively,” said Dr Kelly Choong, a senior lecturer at Edith Cowan University.

The Egyptian team was shown in what looked like a pharaoh costume

The emotions on the athletes’ faces also varied greatly between teams. The South Korean team was seen with serious expressions

The teams from Ireland (pictured) and New Zealand were shown smiling

‘Human judgments and biases are presented as facts in AI. Due to the lack of critical thinking and evaluation, the validity of the information is not questioned, only the purpose of completing a task.’

Dr Choong argues that these biases can quickly lead to equality issues, harmful generalizations and discrimination.

“As society increasingly relies on technology for information and answers, these perceptions can ultimately create real disadvantages for people with diverse identities,” he added.

‘Associating a country with certain sports can lead to the impression that everyone in that country is good at them. For example, Kenya is associated with running, Argentina with football, and Canada with ice hockey.

“These distorted ‘realities’ can also become embedded in people who believe in these stereotypes and inadvertently reinforce them in real life.”

The researchers hope the images will make it clear that developers need to improve their algorithms to reduce such biases.

“The technology will find a way to improve the algorithm and the output, but the focus will still be on completing a task, rather than providing a true representation,” Dr Choong said.

‘Society will have to question and critically assess the validity of AI-generated information.

“User education is critical to the coexistence of AI and information, and to the ability to question their outcomes.”