Faking an honest woman: Why Russia, China and Big Tech all use faux females to get clicks

WASHINGTON — When disinformation researcher Wen-Ping Liu examined China’s attempts to influence Taiwan’s recent elections using fake social media accounts, something unusual stood out about the most successful profiles.

They were women, or at least they seemed to be. Fake profiles claiming to be women received more engagement, more attention and more influence than supposedly male accounts.

“Pretending to be a woman is the easiest way to gain credibility,” said Liu, an investigator at Taiwan’s Ministry of Justice.

Whether it’s Chinese or Russian propaganda agencies, online scammers or AI chatbots, it pays to be a woman – proving that while technology grows ever more sophisticated, the human brain remains surprisingly easy to hack, thanks in part to age-old gender stereotypes that have migrated from the real world to the virtual one.

People have long assigned human characteristics such as gender to inanimate objects – ships are one example – so it makes sense that human characteristics would make fake social media profiles or chatbots more appealing. However, questions about how these technologies may reflect and reinforce gender stereotypes are gaining attention as more voice assistants and AI-enabled chatbots enter the market, blurring the lines between man (and woman) and machine.

“You want to inject some emotion and warmth and a very easy way to do that is to choose a woman’s face and voice,” says Sylvie Borau, a marketing professor and online researcher in Toulouse, France, whose work has found that internet users prefer ‘female’ bots and consider them more human than ‘male’ versions.

People tend to see women as warmer, less threatening and more pleasant than men, Borau told The Associated Press. Men, on the other hand, are often seen as more competent, but also more often as threatening or hostile. This makes many people, consciously or unconsciously, more willing to engage with a fake account pretending to be a woman.

When Sam Altman, CEO of OpenAI, was looking for a new voice for the ChatGPT AI program, he approached Scarlett Johansson, who said Altman told her that users would find her voice – which she lent to the AI voice assistant in the movie ‘Her’ – ‘comforting.’ Johansson declined Altman’s request and threatened to sue when the company went ahead with what she called an “eerily similar” voice. OpenAI put the new voice on hold.

Female profile pictures, especially those that show women with flawless skin, luscious lips and big eyes in revealing outfits, can be another online lure for many men.

Users also treat bots differently based on their perceived gender: Borau’s research found that “female” chatbots experience sexual harassment and threats at a much higher rate than “male” bots.

Female social media profiles receive on average more than three times as many views as male ones, according to an analysis of more than 40,000 profiles conducted for the AP by Cyabra, an Israeli technology company that specializes in bot detection. Female profiles that claim to be younger get the most views, Cyabra found.

“Creating a fake account and presenting it as a woman will give the account more reach than if it were presented as a man,” the Cyabra report said.

The online influence campaigns of countries such as China and Russia have long used fake female personas to spread propaganda and disinformation. These campaigns often exploit people’s perceptions of women. Some personas appear to be wise, caring grandmothers dispensing homespun wisdom, while others emulate young, conventionally attractive women eager to talk politics with older men.

Last month, researchers at the firm NewsGuard discovered that hundreds of fake accounts — some with AI-generated profile pictures — were being used to criticize President Joe Biden. It happened after some Trump supporters started posting personal photos announcing that they “will not vote for Joe Biden.”

While many of the messages were authentic, more than 700 came from fake accounts. Most profiles claimed to be young women living in states like Illinois or Florida; one was called PatriotGal480. But many of the accounts used nearly identical language and had profile photos that were AI-generated or stolen from other users. And while NewsGuard’s researchers couldn’t say for sure who was running the fake accounts, they found dozens with links to countries like Russia and China.

X deleted the accounts after NewsGuard contacted the platform.

A UN report suggested there is an even more obvious reason why so many fake accounts and chatbots are feminine: they are made by men. The report, titled “Are Robots Sexist?,” examined gender differences in the technology industry and concluded that greater diversity in programming and AI development could lead to fewer sexist stereotypes embedded in these products.

For programmers looking to make their chatbots as human as possible, this creates a dilemma, says Borau: If they select a female character, are they encouraging sexist views of real women?

“It’s a vicious circle,” Borau said. “Humanizing AI can dehumanize women.”