- Fewer people visited untrustworthy websites ahead of the 2020 election than ahead of the 2016 election, a new study says.
- Just 26.2% of Americans were exposed to untrustworthy sites in 2020, compared to 44.3% in 2016.
Fewer Americans visited websites containing misinformation ahead of the 2020 presidential election than ahead of the 2016 election, a new study says.
The study, published April 13 in the journal Nature Human Behaviour, found that just 26.2% of Americans were "exposed to untrustworthy websites" in 2020, down from 44.3% in 2016.
Journalists and pundits stepped up efforts to quell the spread of misinformation ahead of the 2020 election after online misinformation dominated the 2016 race.
One Facebook article that incorrectly claimed former President Donald Trump won the popular vote in 2016 received more than 480,000 engagements, while another fake article claiming Hillary Clinton had suggested Trump should run for president garnered more than 407,000, Insider previously reported.
At the end of the 2016 election, several leading fake news stories outperformed real election coverage from major media outlets online, BuzzFeed News reported at the time.
BuzzFeed reported that the top 20 fake news stories on Facebook outperformed the top 20 real news stories in total Facebook engagements in the final three months of the 2016 presidential campaign, citing a Facebook monitoring tool.
False election stories from fake websites and partisan blogs garnered 8,711,000 shares, reactions, and comments on Facebook over that same period, BuzzFeed reported.
The most popular fake news story on Facebook in those final three months was titled "Pope Francis Shocks World, Endorses Donald Trump for President, Releases Statement," which drew around 960,000 engagements, according to BuzzFeed.
According to the researchers, conservatives and older adults were still the most exposed to untrustworthy sites in 2020, as they were in 2016, but they encountered those sites at lower rates than before.
"The role of online platforms in exposing people to untrustworthy websites changed, with Facebook playing a smaller role in 2020 than in 2016," the study says. "Our findings do not minimize misinformation as a key social problem, but instead highlight important changes in its consumption, suggesting directions for future research and practice."
Jeff Hancock, the founding director of the Stanford Social Media Lab and the lead author of the report, told The New York Times that the results make him optimistic that a growing majority of people are becoming more aware of misinformation on the internet.
"We're getting better and better at distinguishing really problematic, bad, harmful information from what's reliable or entertainment," Hancock told The Times.