
Growing up in the 90s and aughts, I never thought very deeply about what I saw online. Back then, the internet was the Wild West; depending on who you asked, it was either a place of total freedom where files and information were distributed freely to anyone who wanted them, or a place full of offensive, dangerous and sometimes highly illegal material. To me, it was both. It was horrible and caused a lot of damage to my developing psyche, and it was an exciting wonderland full of ideas I’d never encountered before. While I don’t miss the hours I spent scrolling through gory photos, overall I’m grateful I grew up in the internet renaissance.

 

As the internet grew, the user base swelled from millions to billions, and with it came opportunities to monetize it. The ads came, and bots to facilitate the ads, and now, here we are. The Dead Internet theory is a conspiracy theory positing that the internet now consists primarily of bot activity and is manipulated by algorithms, which reduces organic human interaction and creates a population that is easily controlled.1 While it remains a conspiracy theory and thus can’t be proven, there is a grain of truth to it – nearly half (49.6%) of all internet traffic came from bots, the majority of them malicious in origin.2 There are plenty of real humans on the internet, of course, as so much of the world is connected via accessible forms of internet now, primarily mobile – but in another few years, the number of bots will likely have increased to comprise most of the internet. That is a terrifying thought.

Harmful bots include not only ones that try to scam you out of your private information, but far more nefarious ones. Propaganda bots were and are a known fixture online3, and their efforts to control the narrative during the Covid pandemic and the Trump presidency (as well as to spread “anti-woke” efforts through Star Wars, of all things4) have been widely reported on. Slightly lesser known, however, are efforts to manipulate progressives – the Kremlin has been caught numerous times using bots and trolls to spread anti-Ukrainian misinformation in the form of Tweets and memes5, as well as false posts supporting Hamas.6

The jury seems to be out on whether propaganda has driven the increase in polarization in online political discourse – or whether things are actually more polarized in the first place, polarization being a difficult thing to quantify – but I’m fairly sure it’s at least part of why the internet has become a more unpleasant place to be these days. It’s led me to become increasingly weary of social media and the amount of content that gets shoved in my face – each meme or tweet or short video lasting mere seconds or minutes before the algorithm places yet another in front of me. Here, a post of a cute kitten, then a photo of a beautiful sunrise, next, a photograph of a child’s corpse, then back to cute kittens. It’s gruesome and dystopian, and social media companies don’t care as long as people keep up the engagement.

I ended up disconnecting from most social media. And interestingly enough, while I thought I would be extremely bored and without much to distract myself, I actually find myself relishing moments I used to see as banal. My attention span may be healing. Time will tell.

 

All that got me digging into WHY I had originally become so entrenched in social media and in reading people’s Hot Takes during the pandemic. At the time, it had been almost intoxicating – I admit to enjoying content back then that I now find ghoulish, like Reddit’s Herman Cain Awards subreddit, which celebrated and mocked the deaths of Covid denialists.7

There is a certain comfort and pleasure in schadenfreude, in feeling like you are right. It elevates you to a place where you are better than “those other people”. Sometimes they even start to not seem like people. I felt myself sliding into that territory far too often. As with some sort of cult, it’s a gradual process; in the beginning I didn’t feel like anything was that different. I’d always been a consumer of online content, particularly on sites like Reddit, and I never felt like I was changing, even as the content I was seeing became more and more insular and polarized. While I had been extremely critical of how QAnon became a cult-like phenomenon, I was starting to wonder if similar ways of thinking had infected my brain as well – though, thankfully, in a less severe and devastating way.

Because I was so curious about this, I started to read through the absolutely massive pile of articles, think pieces and the like that have been written about it. An interesting thread was woven throughout – the similarity of some political content now to the Seven Propaganda Devices.

The Seven Propaganda Devices were introduced by the Institute for Propaganda Analysis back in 1937, and they have been a widely used framework for analyzing propaganda, in particular wartime propaganda in the form of speeches, posters, literature, etc. The framework is a generalization and not accurate in every context, but it’s a useful tool for analyzing the media one consumes. The following is excerpted from the Institute’s list.8

***

These techniques are: 1) Name Calling, 2) Glittering Generalities, 3) Transfer, 4) Testimonial, 5) Plain Folks, 6) Card Stacking, and 7) Band Wagon.

Name Calling: Propagandists use this technique to create fear and arouse prejudice by using negative words and names to create an unfavorable opinion or hatred against a group, beliefs, ideas or institutions they would have us denounce. This method calls for a conclusion without examining the evidence. Name Calling is used as a substitute for arguing the merits of an idea, belief, or proposal. It is often employed using sarcasm and ridicule in political cartoons and writing. Some examples of name calling could be “nazi”, “groomer”, “fascist”, etc.

When confronted with this technique the Institute for Propaganda Analysis suggests we ask ourselves the following questions: What does the name mean? Is there a real connection between the idea and the name being used? What are the merits of the idea if I leave the name out of consideration? When examining this technique try to separate your feelings about the name and the actual idea or proposal.

 

Glittering Generalities: Propagandists employ vague, sweeping statements (often slogans or simple catchphrases) using language associated with values and beliefs deeply held by the audience without providing supporting information or reason. They appeal to such notions as honor, glory, love of country, desire for peace, freedom, and family values. The words and phrases are vague and suggest different things to different people but the implication is always favorable. It cannot be proved true or false because it really says little or nothing at all. A good example from recent times could be “Make America Great Again”.

The Institute of Propaganda Analysis suggests a number of questions we should ask ourselves if we are confronted with this technique: What do the slogans or phrases really mean? Is there a legitimate connection between the idea being discussed and the true meaning of the slogan or phrase being used? What are the merits of the idea itself if it is separated from the slogans or phrases? 

 

Transfer: Transfer is a technique used to carry over the authority and approval of something we respect and revere to something the propagandist would have us accept. Propagandists often employ symbols (e.g., waving the flag) to stir our emotions and win our approval. 

The Institute for Propaganda Analysis suggests we ask ourselves these questions when confronted with this technique. What is the speaker trying to pitch? What is the meaning of the thing the propagandist is trying to impart? Is there a legitimate connection between the suggestion made by the propagandist and the person or product? Is there merit in the proposal by itself? When confronted with this technique, question the merits of the idea or proposal independently of the convictions about other persons, ideas, or proposals. 

Testimonial: Propagandists use this technique to associate a respected person or someone with experience to endorse a product or cause by giving it their stamp of approval hoping that the intended audience will follow their example. 

The Institute for Propaganda Analysis suggests we ask ourselves the following questions when confronted with this technique: Who is quoted in the testimonial? Why should we regard this person as an expert or trust their testimony? Is there merit to the idea or product without the testimony? You can guard yourself against this technique by demonstrating that the person giving the testimonial is not a recognized authority, proving they have an agenda or vested interest, or showing that other experts disagree. 

Plain Folks: Propagandists use this approach to convince the audience that the spokesperson is from humble origins, someone they can trust and who has their interests at heart. Propagandists have the speaker use ordinary language and mannerisms to reach the audience and identify with their point of view. 

The Institute for Propaganda Analysis suggests we ask ourselves the following questions before deciding on any issue when confronted with this technique. Is the person credible and trustworthy when they are removed from the situation being discussed? Is the person trying to cover up anything? What are the facts of the situation? When confronted with this type of propaganda consider the ideas and proposals separately from the personality of the presenter. 

Bandwagon: Propagandists use this technique to persuade the audience to follow the crowd. This device creates the impression of widespread support. It reinforces the human desire to be on the winning side. It also plays on feelings of loneliness and isolation. Propagandists use this technique to convince people not already on the bandwagon to join in a mass movement while simultaneously reassuring that those on or partially on should stay aboard. Bandwagon propaganda has taken on a new twist. Propagandists are now trying to convince the target audience that if they don't join in they will be left out. The implication is that if you don't jump on the bandwagon the parade will pass you by. While this is contrary to the other method, it has the same effect: getting the audience to join in with the crowd. 

The Institute of Propaganda Analysis suggests we ask ourselves the following questions when confronted with this technique. What is the propagandist's program?  What is the evidence for and against the program? Even though others are supporting it, why should I? As with most propaganda techniques, getting more information is the best defense.  When confronted with Bandwagon propaganda, consider the pros and cons before joining in. 

Card Stacking: Propagandists use this technique to make the best case possible for his side and the worst for the opposing viewpoint by carefully using only those facts that support his or her side of the argument while attempting to lead the audience into accepting the facts as a conclusion. In other words, the propagandist stacks the cards against the truth. Card stacking is the most difficult technique to detect because it does not provide all of the information necessary for the audience to make an informed decision. The audience must decide what is missing. 

The Institute for Propaganda Analysis suggests we ask ourselves the following questions when confronted with this technique: Are facts being distorted or omitted? What other arguments exist to support these assertions? As with any other propaganda technique, the best defense against Card Stacking is to get as much information as possible before making a decision.

***

I started looking at the things I read through these lenses and dusting off my innate curiosity and skepticism to filter everything. It’s still quite a challenge to leave my comfortable bubble of a space that I had built to help my fragile mental state during the pandemic, but I do feel like a fog has been lifted and I am able to navigate the world a bit better.

All that said, ultimately I’ve come to the conclusion that limiting my exposure to negative content was the way to go for me. While I occasionally have the energy to weed through content and pick it apart (and even find it fun), I usually do not. In those times, I’ve found it best to stay as social media-free as possible and to work on supporting the immediate community around me. 
 

1 Maybe You Missed It, but the Internet ‘Died’ Five Years Ago: A conspiracy theory spreading online says the whole internet is now fake. It’s ridiculous, but possibly not that ridiculous? The Atlantic, Kaitlyn Tiffany
2 Bots Now Make Up Nearly Half of All Internet Traffic Globally, Imperva

3 Bots and Computational Propaganda: Automation for Communication and Control, Social Media and Democracy, Cambridge University Press, Samuel C. Woolley

4 Russian trolls used Star Wars to sow discord online. The fact that it worked is telling. Grappling with the rise of “fandamentalism,” where too many of us turn pop culture into a religion. Vox, Emily St. James

5 Millions of Leftists Are Reposting Kremlin Misinformation by Mistake. A Kremlin-backed media outlet masquerading as a left-wing news source has been racking up likes and shares on its posts about Ukraine. Vice, David Gilbert

6 In a Worldwide War of Words, Russia, China and Iran Back Hamas. Officials and researchers say the deluge of online propaganda and disinformation is larger than anything seen before. New York Times, Steven Lee Myers and Sheera Frenkel

7 Reddit's Herman Cain Covid 'award' is a depressing sign of our times. This push to revel in schadenfreude, and to assign collective blame, is understandable. But it's not psychologically healthy. NBC News Opinion, F. Diane Barth


8 Via Eastern Illinois University.
