When bad ideas go viral

A decade ago, videos went viral organically. People shared them, those shares landed in other people's timelines, and those people shared them too. This can still happen on Twitter, to a small extent, but not on Facebook. Facebook viral videos only happen through advertising dollars. Even that cute cat video doesn't spread unless it is pushed with paid promotion. The next time you see a viral video with millions of views on Facebook, take the time to notice three things:

First, notice that the video was uploaded to Facebook. It's not a YouTube or Vimeo embed; it's a video hosted on Facebook itself. This is so that Facebook can fully track, and have advertisers pay for, engagement.

Second, notice that the video has some branding on it: a website logo or emblem of some kind. It is being promoted by a company or organization.

Third, notice whether it comes from someone you regularly see in your timeline. Your timeline used to show every post from every person you follow; now it is curated, and you see some people more than others… but when a friend who seldom appears in that curated feed shares a paid-for, promoted video, suddenly their post is back in your timeline.

This becomes dangerous when the information shared is false. Here are two specific examples where this is scary right now:

1. Anti-vaccination propaganda: vaccines have made our world a significantly safer place. Hopefully there will be a vaccine for Covid-19 soon, but the reality is that millions of people will likely opt out of taking it, and the virus could linger for years, mutating and rendering that very vaccine useless. Measles has had a resurgence because fewer people around the world are vaccinating their kids.

2. False claims about Covid-19 cures and anti-mask groups are undermining the science behind fighting this virus. The most recent viral video, taken down by Facebook, features a doctor claiming she has cured over 300 Covid-19 patients with Hydroxychloroquine. I wrote about this drug as a treatment for Covid-19 here: Trying to find the Truth

It's one thing to say a drug is useful in helping treat an ailment and quite another to claim it is a cure or somehow preventative (like a vaccination).

I've had an argument with some very smart people who think Facebook should not take down videos like this because 'people should be able to watch them and make their own decisions'. Maybe I'd have agreed with them a decade ago, when a video had to spread organically and people could share their concerns along with it. But today, that video will reach millions of viewers targeted through advertising dollars spent to promote bad ideas. And advertisers with their own agendas will feed the video to the people most likely to agree with it and share it approvingly.

This isn't an organic process. It's marketplace advertising being used to market and sell bad ideas, incite anger, and polarize opinions and perspectives. Even the taking down of the video has become polarized, with claims that 'Leftist social media sites only take down videos they don't agree with'… and suddenly this is about politics rather than the spread of dangerous ideas.

There are a lot of bad ideas being intentionally spread right now. The scary part of this is that these bad ideas are going viral through advertising dollars spent with an agenda to create anger, divisiveness, and the polarization of people.

And while I've focused on anti-vaccine propaganda and Covid-19 cures, media outlets have used fake (or carefully edited) protest videos to incite anger and gain clicks and advertising revenue, and have chosen language that panders to specific audiences. The drive to share messages virally means that almost any (newsworthy) viral video you see will likely have an agenda behind it, and that agenda is seldom to give you the truth.

Ask yourself who is behind the next viral video you see, and then ask what their agenda is.
