“We the People,” as we identify ourselves in the preamble of the Constitution, are more divided than ever in these ostensibly United States. Social media is a good place to look when misinformation is fostering this polarization. Research shows that Twitter, Facebook, and other platforms can widen social gaps, not by creating echo chambers but by pushing users to signal their allegiance to causes, identities, and political parties. When the whole world is watching you engage in heated dialogue, it is difficult to look past differences and make connections with others. There is also the public roster of the club you belong to–the lists that show precisely whom you follow on social networks. Following the “wrong” accounts can be read as complicity by association, while following a “desirable” group of people and sharing their posts gives you a sense of belonging.
Misinformation also spreads through traditional forms of social communication, such as the mailbox, television ads, and person-to-person contact. Outside the online global forum, we tend to maintain messier social circles and personal interactions. The audience is smaller and consists of close friends or family members with whom you can work out differences and with whom you must live. These brick-and-mortar relationships are less likely to lead to polarized views.
Social media, by contrast, pulls users en masse in opposite directions, hardening partisan differences. This environment creates an opportunity for misinformation to be widely disseminated in the public sphere. It also makes it easier to inflict harm, especially on platforms that are most likely to see users sharing content with the intent to harm “the other side”.
So, what can we do to make the world a better place? It is natural to give in to the pressure to exclude outsiders or to prove one’s worthiness within an in-group. How can social media users avoid falling for misinformation manipulators whose mission of chaos, confusion, and lies threatens the stability of our nation’s elections and our democracy? The question feels especially urgent as voters weather the flurry of allegations of misbehavior and malfeasance leading up to next month’s midterm elections.
Jay Van Bavel’s research focuses on misinformation in public life–in particular, on how social media and other forces exploit our human need for belonging, and for exclusion, fueling the spread of the false information that has been so common in the run-up to November’s tipping-point elections. Van Bavel, an associate professor of psychology and neural science at New York University, spoke to Scientific American about who is most susceptible to these lies, how they spread, and what we can do about the misinformation ecosystem we all inhabit.
[An edited transcript of the interview follows.]
How does misinformation spread, and how does it affect the political environment?
Social media users scroll through about 300 feet of news feed a day. That is a lot, and it is easy for any single item to get lost in the sea of information. What breaks through is content that is relevant or interesting to us or that signals our affiliation with a political leader or party. Many of these messages are filled with scandalous allegations of corruption or affairs, and if they spread widely, they can be very damaging to candidates.
A surprising data point from the 2016 election was that older people spread misinformation [more than] young people [do]. Older people are more polarized than young people and are more committed to their identities. They are also less savvy with social media than young people, who use it frequently. And in midterm elections, older people vote at twice the rate of young people.
What are some factors in the success of viral misinformation?
It’s a combination. In structural terms, the U.S. is more polarized than it’s been in 20 years, more driven by out-group hate than by in-group love.
Then you have political leaders. During the pandemic, it was Donald Trump who minimized its dangers in the initial months. He admitted in private that it was a risk, but he was running for reelection and didn’t want the economy to be damaged and people to blame him.
And then you have the group of people around those leaders, the political elites. Their role is important: they give people cues about whom to trust and what to believe. It’s a trickle-down effect, starting with senators who are active on social media or on Fox News. From there it reaches people like us online, scrolling and sharing, retweeting it, or mentioning it in an e-mail to a friend. People trust their family and friends more than they trust news sources, so seeing something from them makes it seem more trustworthy.
Not all misinformation starts with leaders, either. Trump tweeted conspiracy theories and misinformation to his followers and would amplify claims that were in his favor. Many conspiracy theorists are creative at inventing stories, and that material trickles up and is amplified and endorsed by elites. There is a vicious cycle, an ecosystem: the more misinformation people share, the larger the audience they build, the more they get monetized, and the more they get invited onto Fox News. There are many incentive structures at work.
Which platforms are the biggest incubators of misinformation?
TikTok is a big one, although [the company] is trying to update its policies to manage misinformation. Facebook is probably a bigger problem than Twitter. One large study found that people who got their COVID information from Facebook had the highest levels of vaccine hesitancy, even compared with people who got it from Fox News. These data suggest that Facebook is a major risk factor because it is where most people get their news.
How do the parties responsible for generating misinformation decide what to disseminate?
Misinformation that spreads quickly is information that connects with themes people already believe or have heard. One of the most popular examples was “Stop the Steal”, the false belief that the election was stolen. During the pandemic there were many conspiracy theories and much misinformation, covering everything from vaccines and how they affect children to masks. A lot of it was shared in an effort to discredit the health risks the pandemic posed.
What surprises might be on the horizon for the midterms?
It’s harder to predict than a presidential race. Most misinformation will focus on specific candidates, most likely in the most contested Senate races, such as those in Georgia or Pennsylvania, or perhaps in some swing congressional districts. That is where the architects and promoters of misinformation are trying to shake things up.
A small number of people are politically very extreme, and the majority of them are on the right–Republican political actors. You can imagine misinformation coming from the left as well, but it is not as widely spread. The people spreading [misinformation] are the usual suspects–Alex Jones, Roger Stone. It is usually generated by a few accounts and then spread by people who are politically aligned.
What can people do to be less vulnerable to these misinformation campaigns?
We need to be more skeptical of information that comes from our political leaders and our own groups. I have only ever shared a few pieces of misinformation. One was a parody, and I didn’t realize it was a parody because it aligned with the zeitgeist, my beliefs, and my identity. Friends on social media corrected it, and I felt embarrassed and took it down. If I kept sharing information the way Alex Jones does, I would stop being invited to conferences.
We need norms of correcting one another rather than leaving it all to someone else. This is how science works, through peer review: scholars come after us and point out our mistakes. Scientists live in a community that makes us smarter over time. You want to be part of communities where that is the norm.
Another tactic is “prebunking”, which means getting the facts out before misinformation spreads. It acts like a vaccine: inoculation arms your brain with antibodies to counter the misinformation. This seems to make people more skeptical.
Online there is a really cool game, the Bad News Game, that teaches how [misinformation spreads] and what works. Data suggests that the game helps people.
Journalists have a role to play in identifying misinformation that is likely to spread during this cycle, much of which will piggyback on “stop the steal” and fake-vote allegations. I encourage journalists to provide accurate information about how people are likely to be manipulated, based on what happened in the last election.
How does someone with an entrenched belief react to fact-checks?
Fact-checks are generally effective, but in studies their impact is very small compared with that of partisan identity. Sometimes fact-checking actually backfires: people become more entrenched when the facts are checked by the other side. The people who experience this backfire effect identify strongly with a particular group. This type of entrenchment is more common among people who drive around with five Trump flags on their pickups or with stickers all over the car listing every liberal cause they support. People who don’t consider politics an integral part of their identity are less likely to become entrenched.
I think this is going to remain a major problem. It’s not going away; it’s only growing. And it’s not just about individuals becoming more savvy. It’s that we are entangled in systems that financially reward misinformation spreaders. Alex Jones is a good example: he makes a great deal every year selling conspiracy theories to his audience. People don’t like being controlled, and they need to be more aware of how they are being manipulated.