Writing is my artistic expression. My keyboard is my brush. Words are my medium. My blog is my canvas. And committing to writing daily makes me feel like an artist.
As we watch war and resistance play out in Ukraine, we are watching it through footage filmed by civilians. Phones have become weapons against propaganda, weapons against tyranny, weapons of war. Instead of the story being told by a handful of brave reporters, anyone with a phone is now reporting and sharing updates.
These videos are being mapped and checked for authenticity, and they are being shared on social media. The battle might be being played out in Ukraine, but it’s also being replayed all over the world. And now Elon Musk’s Starlink will ensure that the videos keep coming.
Resistance is now a shared global experience. The phone camera has become a weapon against tyranny.
Daniel Funke shared a thread of images that are NOT from the current Russian invasion of Ukraine, but are being spread on social media as if they are from the current battle.
Today I read an article that stated, “Facebook has blocked Russian state media outlets from using its advertising platform or using other monetization features in response to the invasion of Ukraine.”
It’s amazing that propaganda is so prevalent today when information is so easily accessible. But facts do not travel at the speed of fiction. Lies spread faster than truth. Sensationalism trumps information, and upset and outrage create the perfect venue for re-sharing fabricated stories that go viral.
Facts blend with fiction into a narrative that is anything but real news. What stories do the news stations in Moscow share with their citizens? How different does the news sound in neighbouring Belarus, compared to China, compared to news here in North America?
It’s easy to share narratives that match your own view, even if the source of the data is unreliable. We are living in an era when misinformation reigns. Social media has become an unstoppable misinformation machine, and every time we click a like, re-share, or forward a narrative that isn’t true, we become part of the machine. After all, we are the social in social media. We are cogs in the misinformation machine.
Yesterday I popped into a TikTok Live event related to the pandemic, and I was horrified by the comments. The live event featured a spokesperson from the Canadian government and was put on by a news outlet’s account. I get it if there are a few hecklers unsatisfied with the way things are going, but these were the kinds of comments being shared:
“Are the side effects as low as ivermectin?”
“All these people need to be charged with Crimes Against Humanity.”
“You don’t need any shots”
“There’s no pandemic it’s a scam.”
“Wake up people.”
“Health Canada shifted to primarily pharmaceutical funding in 2018.”
“We listen to Malone!!!! Not you filthy animals!!!!”
“Time for a new government.”
“Not true!”
“The government is corrupt”
“I’m sick of these lies!”
“Evil pieces of 💩”
“We need to hire a hit team to take them all down.”
“There is no science, it’s called being bought by Pfizer”
“How can anyone believe there’s a pandemic?”
“this segment of propaganda brought to you by Bill Gates.”
“none of the vaccines work, it’s all a scam!!!”
“Hurry up you only have a 98.5% chance of survival.”
“Sorry, I don’t play medical Russian roulette.”
“Oh hey! It’s a terrorist speaking! Listen to your overlords, slaves.”
—
I cleaned up the punctuation a bit, and I didn’t reproduce the all-capitals formatting a few of them used. I also didn’t share them all, but none of them were positive. None.
I have a 30-minute timer for TikTok, and I am committed to not going past that on work days. It’s crazy how the algorithm works. Before the self-imposed time restriction, I could get sucked in for over an hour… Occasionally, on weekends, I still do. And my TikTok is nothing like my daughter’s; we are into completely different things.
What’s scary is how well the algorithm has me figured out. I can watch 20-30 short videos in a row without skipping one. It has completely figured out what I like, and feeds me related and relatable content. If you are a fan of Facebook or Instagram, you’ll notice the same thing.
But I’m someone who watches very little TV, and doesn’t spend much time on social media, and so 30 minutes is like sitting down to watch a single TV show. It’s entertainment for me and I allow myself that break.
But what about our younger generations? What kind of time are they spending sucked into attention algorithms designed to keep them engaged? Designed to keep them watching?
In China, the Chinese version of TikTok, Douyin, is shut off from 10pm to 6am, and I’ve heard that it intentionally pushes educational content when it is on. This may seem draconian, but I’m not sure that letting addictive social media tools run rampant is a good idea. I’m not sure what balance looks like, but I am pretty sure that these tools are a bit too addictive to let them co-parent our kids.
I remember joining Twitter reluctantly in 2007. I thought, ‘I never update my status on Facebook, why would I join a new social media platform that is just the one feature of another social media platform that I don’t use?’ But as an educational blogger, I was reading about how powerful this tool was for educators and I hesitantly jumped on board.
After a short experimental phase I was hooked. Things like this happened all the time!
I was connected to a powerful network of educators who went out of their way to make connections, build community, and converse about teaching and learning. I’d go to conferences and connect with people I’d never met face to face, but whom I knew well, thanks to this amazing tool.
I even wrote a book to help others get started on Twitter:
Now I no longer use Twitter, or most other social media tools, nearly as much. They have become one-way transmission tools for my daily blog, which auto-posts to Twitter, Facebook, and LinkedIn when I hit ‘Publish’. I focus more on productivity and writing than on consuming content through these tools.
But it’s still fun to get notifications like this yesterday:
I may not be on it as much, but Twitter helped me create an amazing community, and I cherish the connections and memories made.
My grandfather had a saying, and I’ve shared it often: “Never wrestle with a pig; you both get dirty, but the pig likes it.”
The pig has some success no matter what. This is something that I think is playing out with anti-vax and conspiracy arguments… they have some success every time we argue. The reason for this success is that they are operating from a fixed mindset, their minds are made up… but they are often arguing with people who have a growth mindset and are open to some level of persuasion. It’s a guaranteed downward spiral, with some of their fixed and misguided ideas seeping into the consciousness of people who try to factor all things into their understanding.
An example of this is when the twin towers fell in New York. There were all kinds of conspiracy theories that started with the premise that ‘steel towers can’t crumble like that just because a plane crashed into them’. Spoiler alert, they can. But at the time we had no examples to go by, no science to support the possibility, and so just raising this concern could put doubt into a reasonable person’s mind. Then came the videos. Google something like “twin tower conspiracy video” and you’ll see what I mean. These videos are well crafted and convincing.
If you are someone prone to the idea that there is some cabal with a master plan to rule the world, the fall of the twin towers easily fits that narrative. However, if you are someone who looks at evidence and makes sound decisions based on the information you have, too much of this convincing misdirection and misinformation could influence your thinking. In other words, the spread of well-constructed fake news has influence on all parties… meanwhile, simple logic and boring facts only work on those with growth mindsets willing to do the research work.
The pig wins the moment you engage in the fight. They get you dirty. Here is a study done at MIT, ‘Does correcting online falsehoods make matters worse?’, which looks at how pointing out mistakes doesn’t help the argument:
Not only is misinformation increasing online, but attempting to correct it politely on Twitter can have negative consequences, leading to even less-accurate tweets and more toxicity from the people being corrected, according to a new study co-authored by a group of MIT scholars.
The study was centered around a Twitter field experiment in which a research team offered polite corrections, complete with links to solid evidence, in replies to flagrantly false tweets about politics.
“What we found was not encouraging,” says Mohsen Mosleh, a research affiliate at the MIT Sloan School of Management, lecturer at University of Exeter Business School, and a co-author of a new paper detailing the study’s results. “After a user was corrected … they retweeted news that was significantly lower in quality and higher in partisan slant, and their retweets contained more toxic language.”
And the article goes on to say,
“We might have expected that being corrected would shift one’s attention to accuracy. But instead, it seems that getting publicly corrected by another user shifted people’s attention away from accuracy — perhaps to other social factors such as embarrassment.” The effects were slightly larger when people were being corrected by an account identified with the same political party as them, suggesting that the negative response was not driven by partisan animosity.
Now in this case the ‘evidence’ will often degrade, and so it may not be too convincing, but research like this suggests that the conspiracy or fake news spreader is very unlikely to change their mind given sound evidence against their ideas… but when their false ideas are well crafted and instil doubt, the same can’t be said for thoughtful people who aren’t fixed in their opinions.
Social media engagement is more likely to influence people towards believing aspects of fake news than to promote facts and sound evidence. It’s a downward spiral, and it’s getting us all a little dirty.
“We are living in a red pill/blue pill moment, except people are colourblind and everyone thinks they are taking the red pill.”
— — —
The terms Red Pill and Blue Pill refer to a choice between revealing an unpleasant truth, represented by the red pill, or remaining in blissful ignorance, represented by the blue pill. These terms are in reference to the 1999 film The Matrix. ~Wikipedia
The insightful thing about this is that there are a lot of people who are (unknowingly) choosing the blue pill. This can be summarized by two TikToks I’ve seen recently:
While these are American references (welcome to using social media in Canada; that’s what you get), there are many conspiracy theorists and anti-vaxxers all over the world who think they’ve somehow taken the red pill, but are colourblind and have ignorantly taken the blue pill.
This is so much more dangerous than people who simply choose the blue pill because that’s what they wanted. This is about people steadfastly believing that they have seen behind the (metaphorical) curtain. They “know” the unpleasant ‘Truth’.
Ignorance may be bliss but intentionally seeking out ignorance and claiming it is fact is outright dangerous.
Dangerous. Not mistaken, not misguided, not just ignorant. Dangerous.
Social media has amplified this danger. When Facebook posts containing misinformation get shared five times as fast and as widely as the information debunking them; when QAnon can constantly change its stance(s) and people still believe, despite how wrong this ‘inside information’ has been; when crackpots who claim to be experts get more views than researchers who actually share the data… this is dangerous.
It’s one thing to choose the blue pill, it’s a whole other kind of scary thing when the blue pill is ignorantly chosen while the taker believes they are taking the red pill.
Signal-to-noise ratio (SNR or S/N) is a measure used in science and engineering that compares the level of a desired signal to the level of background noise. SNR is defined as the ratio of signal power to the noise power, often expressed in decibels. A ratio higher than 1:1 (greater than 0 dB) indicates more signal than noise. Wikipedia
This is a scientific term that relates to how much background noise interferes with the data or information you are trying to receive. A simple way to think about this is having a conversation at a party. If the noise of the party is too loud, you can’t pick up the signal (what the other person is saying). At one point on the scale the noise does not interfere and the signal/communication is easy to hear; moving along the scale, the noise can interfere a little or a lot.
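As a quick illustration of the definition quoted above, the dB calculation is simple arithmetic: ten times the base-10 logarithm of the power ratio, with 0 dB marking the 1:1 threshold where signal and noise are equally strong. The function name here is my own; this is just a sketch of the formula, not code from any particular library.

```python
import math

def snr_db(signal_power: float, noise_power: float) -> float:
    """Signal-to-noise ratio in decibels: 10 * log10(P_signal / P_noise)."""
    return 10 * math.log10(signal_power / noise_power)

# Equal signal and noise power is the 1:1 threshold: 0 dB.
print(snr_db(1.0, 1.0))   # 0.0
# Signal ten times stronger than the noise: +10 dB.
print(snr_db(10.0, 1.0))  # 10.0
# Noise dominating the signal gives a negative ratio in dB.
print(snr_db(1.0, 10.0))  # -10.0
```

The party analogy maps onto the numbers: a positive dB value is a conversation you can follow, and the further below zero it drops, the more the room drowns the other person out.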
With machines this ratio is easy to calculate. With humans it’s a lot harder. It isn’t always about the quality of the signal; it’s also about the willingness of the receiver to receive the signal. Sometimes people are not ready to receive the signal no matter how clear it is. Sometimes people choose to listen to the noise. Sometimes the noise is in their own head, not just coming from outside.
We are currently living in a world where a large number of people pay attention to the noise and are missing the signal altogether. A world where the noise is intentionally being spread. A world where the signal is considered noise. But humans aren’t machines, and so the noise isn’t easily calibrated and removed.
Social media used to amplify the signal; now it amplifies the noise. News used to amplify the signal; now it constantly reports on the problem of the noise, thus highlighting the noise and bringing it to everyone’s attention… not always in a negative light… or it puts the signal and the noise on equal footing, as if to say here are two equal signals to be weighed and considered. As a result, communities, families, and friendships are being torn apart as they argue about what is signal and what is noise.
I’m reminded of the ‘More Cowbell’ skit on Saturday Night Live.
https://vimeo.com/257364428
The noise is becoming too loud to receive the signal in any meaningful way. We need to simultaneously turn up the signal and turn down the noise. If not, we better get used to the cowbell.
This is an interesting time that we live in. I find myself in a position where I need to question my own values. I don’t do this lightly. I don’t pretend that my values have suddenly changed. It’s just that present circumstances put me at odds with my own beliefs around freedom of speech.
I am a strong believer in freedom of speech. I think that when a society censors speech, it is on a dangerous path. I take this to an extreme. Except for slander, threats, and inciting violence, I think people have a right to say and believe what they want. I believe that taking away such freedom puts us on a perilous path where a select few get too much control, and can undermine our freedoms.
An example where I take this to the extreme would be agreeing with Noam Chomsky.
So now, even as an ardent defender of free speech, I find myself agreeing with YouTube’s decision to ban vaccine misinformation:
YouTube doesn’t allow content that poses a serious risk of egregious harm by spreading medical misinformation about currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and by the World Health Organization (WHO). This is limited to content that contradicts local health authorities’ or the WHO’s guidance on vaccine safety, efficacy, and ingredients.
Two, four, eight, or sixteen years ago when YouTube began, I would have screamed ‘Censorship!’ at the idea of a platform banning free speech. Even now it bothers me. But I think it is necessary. The first problem is that lies and misinformation are too easily shared, and spread too easily. The second problem is that the subject area is one where too many people do not have enough information to discern fact from fiction, science from pseudoscience. The third problem is that any authentic discussion about these topics is unevenly biased towards misinformation. This last point needs explanation.
If I wanted to argue with you that Zeus the Greek God produces lightning and thunder when he is angry, I think everyone today would say that I was stupid to think such a thing. However, if I was given an opportunity to debate a scientist on this in a public forum, what inadvertently happens is that my crazy idea now gets to have an equal amount of airtime with legitimate science. These two sides do not deserve equal airtime in a public, linkable, shareable format that appears to give my opinion an equal footing against scientific evidence.
Now, when the topic is something as silly as believing in a thunder god, this isn’t a huge issue. But when it’s scientific-sounding, persuasive, fear-mongering misinformation that can cause harm, that’s a totally different situation. When a single counter-example, say a person having adverse effects from a vaccine, becomes a talking point, it’s hard to balance that in an argument against millions of people not having adverse effects while also drastically reducing their risk of a death the vaccine prevented. The one example, one data point, ends up being a scare tactic that convinces some people hearing the argument that the millions of counter-examples don’t matter. And when social media platforms feed similar, unbalanced, misleading information to people over and over again, and the algorithms queue up ‘similar’ next videos or targeted misinformation, this actually gets dangerous. It threatens our ability to separate fact from fiction, news from fake news, science from pseudoscience. It feeds and fosters ignorance.
I don’t know how else to fight this than to stop bad ideas from spreading by banning them.
This flies in the face of my beliefs about free speech, but I don’t know any alternative to prevent bad ideas from spreading faster than good ones. And so while I see censorship as inherently evil, it is a lesser evil than allowing ignorance to spread and go viral. And while it potentially opens a door to less freedom, and I have concerns about who decides what information should be banned, I’d rather see a ban like this attempted than continue to let really bad ideas spread.
I thought that in this day and age common sense would prevail and there would be no need to censor speech at all. However, it seems that as a society we just aren’t smart enough to discern truth from cleverly told fiction. So we need to stop the spread of bad ideas, even if that means less freedom to say anything we want.
It has been a slow process, but I’ve really disengaged from interactions on social media. It has become a one-way transmission tool for my daily blog, and not much else. Well, other than 30 minutes of TikTok that I watch instead of TV, but that’s entertainment rather than engagement. My only social media comments tend to be responses made to my posts about my blog.
I think the disengagement started with US political news dominating everything a few years back. I got fed up watching post after post that had no real connection to me as a Canadian, but still angered and upset me. I got tired of the childish anger and upset. Then came the pandemic, and more (digital) yelling and screaming about how to handle it… with healthy doses of ignorance and bickering about the science. But this fighting isn’t between professionals and real experts; it’s between doctors/scientists and ‘armchair experts’ who demonstrate how expert they are at spewing stupidity and ignorance.
Between politics and pandemic, I’m really done engaging on social media much. That said, these topics still reach me, and I still find myself talking about them here on my blog. Now there is a Canadian election, but I tend not to discuss who I’m voting for and why. Instead, I prefer to focus on encouraging people to get out and vote. I think it’s our duty as citizens to exercise our right to vote, and I’d even want to see tax-related fines for those who don’t.
I might be disengaged from social media. I may not like the news that I see. But I believe we should all be appreciative and respectful of living in a democracy, and that we should participate in a democracy if we want to keep it. If we value having a voice, we should use it in a vote… before worrying about what that voice should be saying on social media.