My grandfather had a saying, and I’ve shared it often: “Never wrestle with a pig; you both get dirty, but the pig likes it.”
The pig has some success no matter what. I think this is playing out with anti-vax and conspiracy arguments… they have some success every time we argue. The reason for this success is that they are operating from a fixed mindset: their minds are made up. But they are often arguing with people who have a growth mindset and are open to some level of persuasion. It’s a guaranteed downward spiral, with some of their fixed and misguided ideas seeping into the consciousness of people who try to factor everything into their understanding.
An example of this is the fall of the Twin Towers in New York. All kinds of conspiracy theories started with the premise that ‘steel towers can’t crumble like that just because a plane crashed into them’. Spoiler alert: they can. But at the time we had no precedent to go by and no familiar science to point to, so just raising this concern could put doubt into a reasonable person’s mind. Then came the videos. Google something like “twin tower conspiracy video” and you’ll see what I mean. These videos are well crafted and convincing.
If you are someone prone to the idea that some cabal has a master plan to rule the world, the fall of the Twin Towers fits that narrative easily. However, if you are someone who looks at evidence and makes sound decisions based on the information you have, too much of this convincing misdirection and misinformation can still influence your thinking. In other words, well-constructed fake news influences all parties… while simple logic and boring facts only work on those with growth mindsets who are willing to do the research.
The pig wins the moment you engage in the fight. They get you dirty. Here is a study done at MIT, ‘Does correcting online falsehoods make matters worse?’, which looks at how pointing out mistakes doesn’t help the argument:
Not only is misinformation increasing online, but attempting to correct it politely on Twitter can have negative consequences, leading to even less-accurate tweets and more toxicity from the people being corrected, according to a new study co-authored by a group of MIT scholars.
The study was centered around a Twitter field experiment in which a research team offered polite corrections, complete with links to solid evidence, in replies to flagrantly false tweets about politics.
“What we found was not encouraging,” says Mohsen Mosleh, a research affiliate at the MIT Sloan School of Management, lecturer at University of Exeter Business School, and a co-author of a new paper detailing the study’s results. “After a user was corrected … they retweeted news that was significantly lower in quality and higher in partisan slant, and their retweets contained more toxic language.”
And the article goes on to say:
“We might have expected that being corrected would shift one’s attention to accuracy. But instead, it seems that getting publicly corrected by another user shifted people’s attention away from accuracy — perhaps to other social factors such as embarrassment.” The effects were slightly larger when people were being corrected by an account identified with the same political party as them, suggesting that the negative response was not driven by partisan animosity.
Now in this case the ‘evidence’ the corrected users share often degrades in quality, so it may not be too convincing. But research like this suggests that conspiracy and fake news spreaders are very unlikely to change their minds when given sound evidence against their ideas… while thoughtful people who aren’t fixed in their opinions remain susceptible when those false ideas are well crafted and instil doubt.
Social media engagement is more likely to nudge people towards believing aspects of fake news than to promote facts and sound evidence. It’s a downward spiral, and it’s getting us all a little dirty.