Google algorithms point readers to fake news less often than their own political biases, study finds

Paranoid that social media and search engine algorithms are serving the public slanted news? Perhaps the worst offenders are prejudiced members of the public themselves.

Communications and data scientists from three universities tracked the browsing habits of more than 1,000 internet news consumers during the 2018 and 2020 US election cycles.

They compared the partisan slant of these users' Google search results with their own independent internet habits, as well as with which links from those Google results the subjects actually clicked.

Just 31.3 percent of participants accounted for a whopping 90 percent of all engagement with unreliable news stories in 2018, and just 25.1 percent accounted for 90 percent of that fake news in 2020.

Those who took the clickbait, at least judging by these findings, tended to be older and more likely to self-identify as “strongly Republican.”

Researchers from Rutgers, Stanford and Northeastern University found that Google’s search results returned more diverse and reliable news than readers were likely to click on. On average, their study participants revealed a slight preference for biased and unreliable news.

In both the 2018 and 2020 election cycles, subjects who self-identified as “Strong Republican” were more likely to engage with both unreliable and highly partisan news online

“What our findings suggest is that Google surfaces this content equally among users with different political views,” said study co-author Katherine Ognyanova, an associate professor of communications at the Rutgers School of Communication and Information.

“To the extent that people engage with those websites,” Ognyanova said, “it’s largely based on personal political views.”

In both election cycles, the average participant was slightly more likely to engage with untrustworthy news than Google was likely to expose them to untrustworthy news in their search results.

The difference was about one percentage point in each election cycle.

Links leading to untrustworthy news made up an average of 2.05 percent of participants’ Google search results in 2018 and 0.72 percent in 2020.

But participants were overall slightly more likely to click those sketchy links when Google surfaced them: such links made up 2.36 percent of search follows in 2018 and 0.93 percent in 2020.

And the share of untrustworthy sites in their independent browsing was higher still: 3.03 percent in 2018 and 1.86 percent in 2020.

Rutgers’ Ognyanova and her colleagues at the Stanford Internet Observatory and Northeastern University’s Network Science Institute also had their volunteer subjects complete a survey about their political identity.

The participants self-reported their political identification on a seven-point scale ranging from “strong Democrat” to “strong Republican.”

The researchers then linked these survey results to web traffic data collected from the same participants, 1,021 in all, who voluntarily installed a special extension for their Chrome and Firefox browsers.

This custom browser extension logged Google search results URLs, as well as participants’ Google and browser histories, and tracked their exposure to and engagement with online news and political content.

The software provided the team with detailed information not only about the media these users engaged with online, but also how long they spent with it.

Ognyanova and her colleagues used the term “Google search follows” to indicate instances where participants actually clicked and engaged with content that emerged from their search results.

They defined “follows” as cases where a participant visited a URL within 60 seconds of being exposed to it in Google search results.
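The 60-second rule can be illustrated with a minimal sketch. This is not the researchers’ actual code; the data layout (lists of URL–timestamp pairs) and function names are assumptions made purely for illustration.

```python
# Illustrative sketch of the study's "Google search follow" rule:
# a visit counts as a follow if it occurs within 60 seconds of the
# URL appearing in the participant's Google search results.

FOLLOW_WINDOW_SECONDS = 60  # threshold described in the study

def is_search_follow(exposure_time, visit_time):
    """True if the visit happened at, or within 60 seconds after,
    the moment the URL was shown in Google results."""
    delta = visit_time - exposure_time
    return 0 <= delta <= FOLLOW_WINDOW_SECONDS

def count_follows(exposures, visits):
    """exposures and visits are lists of (url, unix_timestamp) pairs.
    Counts visits that qualify as follows of some prior exposure."""
    follows = 0
    for url, visit_time in visits:
        if any(url == e_url and is_search_follow(e_time, visit_time)
               for e_url, e_time in exposures):
            follows += 1
    return follows

# A visit 30 seconds after exposure counts; one 300 seconds later does not.
exposures = [("news.example/a", 1000), ("news.example/b", 1000)]
visits = [("news.example/a", 1030), ("news.example/b", 1300)]
print(count_follows(exposures, visits))  # -> 1
```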

These Google search follows were also compared with “general engagement,” meaning all news sites participants visited on their own, without Google’s help.

By tracking the online habits of 1,021 participants, the team found that both self-identified Republicans (red, above) and Independents (gray) were more likely to engage with both partisan and untrustworthy news for extended periods of time, despite similar Google results for each group

The researchers also found a strong relationship between partisan news and news that was factually unreliable. As they tracked users’ exposure to “fake news” through Google, they could see a preference among self-described “strong Republicans” for biased and unreliable news

For both election years, the study found that the partisan gap in news reading, “the difference in news partisanship between the average strong Republican and the average strong Democrat,” was small based on what Google had to offer. But the gap grew based on what these groups clicked on or visited themselves.

“Right-wing partisans, but not left-wingers, are more likely to follow identity-congruent news sources from Google Search,” the researchers write in their study, published today in the journal Nature, “even if they take into account the content of their searches.”

Strong Republicans engaged with significantly more news from unreliable sources than independents did, according to the findings for both 2018 and 2020.

The study also found that participants aged 65 and older were more likely to encounter and engage with unreliable news than those in younger demographics.

While her team’s results suggest that news consumers may be their own worst enemies, Ognyanova said she thinks Google’s algorithms can still generate results that are polarizing and potentially inflammatory.

“This doesn’t let platforms like Google off the hook,” she said. “They are still showing people information that is biased and unreliable. But our research underlines that it is the satisfied consumer who is in the driver’s seat.”