[MUSIC] Our press secretary gave alternative facts to that. >> My goal in this deposition was to be truthful, but not particularly helpful. >> Welcome to Unspun, the podcast that makes you better at finding the truth. The way people get news is changing. It used to be that there were many reporters who researched stories and wrote articles, but now politicians and famous people share information directly with you on social media and the internet. That means you find out things fast, but it's up to you to make sure the information's actually accurate. And newsmakers don't always do their part. The temptation to manipulate information is strong. They bend the truth to deceive so that they can avoid accountability, so that they can advance their agendas. When you recognize these agendas, you can sometimes find out what's real. And we're at a crossroads where anyone can share anything online. So it's important to sharpen your critical thinking skills. Finding that deception before it goes viral is pretty much a survival skill now. And we're going to do it together. Let's get unspun. [MUSIC] >> Hello, truth seekers, and welcome to this week's Unspun. Today, we'll break down a new logic issue and also hear about what current research says about the state of propaganda we're in right now. Sound good? Before we get going, I'd like to ask you a small favor. If these episodes of Unspun have value for you, if you're a little bit smarter in conversation or better able to think about the news, would you please consider giving a quick rating in the podcast app you're listening to right now? Your ratings and comments help with the show's discoverability and also help me to recruit better guests. So if you have a moment, I'd really appreciate it. Okay, let's get unspun. Today, we're talking about guilt by association. This idea is when someone is blamed not for what they did, but because of who they know. So let's explore this with a couple of stories. First, we'll talk about Sophia.
She lives in a small city and she works at the local convenience store, but her real love is helping animals. Every weekend, she volunteers at the local no-kill animal shelter doing things like cleaning out cages and grooming dogs. She's known for her kindness and her dedication, and she shows up every week to help her furry friends. She makes a difference and even helped organize a big adoption event where many happy families met their future pets. Sophia's co-worker at the convenience store, Mark, also volunteers at the shelter. Being a small convenience store clerk doesn't pay great, and Mark has been caught stealing money from the shelter's donations. People start to whisper and point fingers at Sophia. They wonder if she's involved too, even though she knew nothing about what Mark did. Across town at the high school, Mr. Thompson is a science teacher. He helps his students learn biology, but you'll find students hanging out in his room before school almost every day as he talks them through personal challenges. The students and the parents adore him for his dedication, and the PTA even threw a surprise party to celebrate Mr. Thompson when he hit 20 years of teaching. The school's principal, who was a close friend, even showed up with Mylar balloons to congratulate him while his students clapped. Then one morning, there was a scandal on the front page of the local news website. It turned out the principal didn't actually have the doctorate degree she claimed when she was hired. And despite having had no part in the deception, Mr. Thompson still gets distrust from parents and students alike. They question his integrity just because he and the principal are friends. Now these are both cases of guilt by association. You know, Sophia didn't do anything wrong and neither did Mr. Thompson. Guilt by association is an informal logic problem where someone is blamed not for what they did, but because of who they know or hang out with.
Being blamed for something bad just because you were with someone who did it, for example, or being accused of corruption because a relative did something wrong, something like that. And this idea has been around for a long time and it still affects us today. So let's listen to a couple of examples for this week's warm up. >> The Clinton camp hopes superdelegates will take a look at Obama's problems, especially controversial remarks by his former pastor. Barack Obama's preacher is on the record saying a number of very troubling anti-American things. >> That's former Fox News host Bill O'Reilly questioning the credibility of then-candidate Barack Obama because of comments made by a pastor, Jeremiah Wright, who led a church that Obama used to attend. Wright's comments raised a lot of eyebrows. Have a listen. So, as you can imagine, those remarks caused a lot of controversy. But is it fair to blame Obama for them? I mean, Obama didn't say those things. Let's listen to another example. >> And why don't you tell the 800,000 Polish Americans right here in Pennsylvania how quickly you would give up for the sake of favor and what you think is a friendship with what is known to be a dictator who would eat you for lunch? >> That was Vice President Kamala Harris addressing Donald Trump in a presidential debate. And she's suggesting that Trump's actions in the future will be predictable because of his affinity for Russian president Vladimir Putin. Now, this is an interesting case of using guilt by association. She's asking listeners to judge predicted future behavior because of a present affinity. It's still guilt by association and it's still a logical issue. So here are a few tips to recognize guilt by association. First, you should identify the important facts. You know, is the speaker giving real proof that someone themselves did something wrong? You can also look for the logic: are they blaming someone just because they know a guilty person? You could consider the source, right?
Is the source reliable or are they known to be exaggerating? And finally, you can find more views, you know, read different sources and get the full story. So I need to take a quick break. But when I come back, the main topic: what the research tells us about propaganda today. Welcome back. Let's take a look at how a failed novelist and academic became one of history's most dangerous propagandists. Paul Joseph Goebbels was born in 1897 to working-class Catholic parents in industrial Rhineland Germany. And he was a genuine intellectual: he had a doctorate, and he desperately wanted to be a writer. But unfortunately, his early literary attempts failed. He wrote two plays in 1923 and they just went nowhere. He worked at jobs he hated, like being a bank clerk, and he obsessively kept diaries that later provided really great insight into the Nazi regime's inner workings. What's fascinating is how this intellectual, who had once studied under Jewish professors, transformed into the architect of Nazi propaganda. In 1924, Goebbels found himself drawn to Hitler's nationalist movement, becoming a Nazi party member. And the key point came in 1926 when Hitler appointed him as district head of Berlin. Goebbels, who had actually initially aligned with the more socialist wing of the Nazi party, underwent a dramatic shift after hearing Hitler speak. In his diary, he wrote, "I love him, such a sparkling mind can be my leader." And when the Nazis seized power in 1933, Goebbels became Reich Minister of Propaganda. And he was masterful at using new media, especially radio and film, to spread the regime's message. He was a man who understood that controlling information meant controlling people. So think about that for a second, you know, a failed writer who wanted to create fiction ended up crafting one of history's most devastating real-world narratives. He shaped how millions of Germans perceived reality during the Third Reich.
And this kind of tells us how sophisticated propaganda can transform a modern society. You know, think about those iconic images from Nazi Germany, massive rallies with torchlight parades, films that portrayed Hitler as a godlike savior, and posters demonizing Jews and other minority groups. That was all Goebbels. He used every tool at his disposal, newspapers, radio, film, even art and music, to flood the information landscape with pro-Nazi messaging. And he was really effective. He created an atmosphere of fear and hatred and blind obedience that allowed the Nazi party to seize control of Germany and to drag the world into war. But you know, we didn't leave propaganda in the 1940s, did we? It may have changed its face, but it's still very much with us today. But propaganda today isn't always dramatic films or speeches, sometimes it takes a softer touch. So let me take you inside the world of Confucius Institutes. On paper, they're centers for Chinese language and cultural exchange, and they've sprung up on university campuses worldwide, like kind of educational embassies. They offer language classes and cultural programs and a window into Chinese civilization, which sounds really benign, right? But here's where it gets interesting. These institutes aren't just teaching Mandarin. They're promoting what's carefully termed China's "excellent traditional culture," a curated highlight reel, if you will, of Chinese heritage that just happens to line up perfectly with current political ideas. Think about it this way. Imagine if you went to a history class that only taught the parts of history that made one particular government look good. That's essentially what Confucius Institute critics are worried about. But the real story isn't just about what these institutes are teaching. It's about what they're not teaching.
Critics have pointed to a pattern of self-censorship at host institutions, certain topics suddenly becoming too sensitive to discuss, you know, kind of like having an invisible editor standing over your shoulder, except this editor has some very specific ideas about what constitutes acceptable things to talk about at your university. We're talking about a system where academic freedom, which is a cornerstone of university education, might be kind of quietly compromised, you know, not through obvious censorship, but through subtle pressure and self-regulation. It's like watching academic institutions develop an allergic reaction to some topics, but the allergy is really selective and it consistently aligns with particular political interests. Well, Uncle Sam had some thoughts about these institutes, big thoughts, you know, the kind that end up in congressional legislation. These institutes operated through what you might call a joint custody arrangement. American universities partnered with Chinese entities, complete with directors that were appointed by both sides, and that sounds really collaborative, right? But there's a catch. When one parent is paying the bills and that parent happens to be the Chinese government, Congress starts asking some pretty pointed questions about who's calling the shots. And finally in 2018, Washington decided to play hardball. Congress essentially told universities it's us or them: choose between keeping your Confucius Institute or keeping your federal funding. And spoiler alert, money talks. The Government Accountability Office, which is the US's national watchdog, found that more than 60% of schools closed their Confucius Institutes, mostly because they didn't want to lose Department of Defense money. And it wasn't just about money, we're talking about a broader concern over what some people have called academic colonization.
And reports started surfacing about the topics that had become those educational third rails that you couldn't touch, things like Tibet, Taiwan, Tiananmen Square. And the aftermath is complicated. Some universities did manage to keep their Chinese language programs by tapping alternative funding. You could think of it as academic diversification, not keeping all your educational eggs in one Beijing basket. So soft power is one type of modern propaganda, and it shows up in a bunch of different ways. Here's another example: state-run TV networks for export. In one corner we have RT, formerly Russia Today, and in the other Voice of America, or VOA. They're both state-funded media organizations, they both broadcast globally, and they're both trying to tell their nation's stories to the world. You can think of the networks like two different restaurants, maybe they're both serving news but with really different recipes. VOA is like a traditional diner, you know, maybe not too fancy, but you know what you're getting. RT is more like a restaurant where the menu says one thing, but what shows up on your plate might be something else completely. RT's own editor-in-chief compared the network to Russia's Ministry of Defense. You know, so let that sink in, essentially saying, yes, we're fighting an information war. It's like they're not even trying to hide the propaganda cookbook. And research suggests that RT employs a "partisan-parasite" propaganda model, big mouthful there, focusing on US domestic issues and personalities and often framing them through a right-wing lens, kind of similar to outlets like Fox News or Newsmax or OANN. And that approach aims to kind of find existing divisions within the United States and amplify them. VOA, on the other hand, operates behind what they call a firewall. You can think of it as like a journalistic force field designed to keep government hands off the editorial process.
So Uncle Sam signs the checks, but he's not supposed to be in the newsroom. Keep in mind, though, that both of these networks are playing the same game, soft power, just with different rule books. VOA is kind of trying to win the championship following the rules, while RT is more trying to change the whole game. Sometimes, though, propaganda is more of a direct effort. Your social media feed became a battlefield in the war for your mind. So let's think about why that like button might be more dangerous than you think. Propaganda doesn't just mean dropping leaflets from airplanes. You know, every scroll, every share, every retweet can be a weapon in the arsenal of influence. And social media has essentially eliminated the middleman. You know, all those lovely journalists who used to fact-check things, they're increasingly irrelevant in a world where organizations can tweet as easily as Taylor Swift. Groups like ISIS and Al Qaeda didn't need a press release, they just needed a good Wi-Fi connection. The speed and reach of social media means that false information can circle the globe while truth is still putting its shoes on. And we can also think about echo chambers, you know, those cozy little digital bubbles that we live in. Social media algorithms are like overprotective parents, and they show us just what they want us to see, and it's comfortable, but it's also dangerous. You know, when's the last time your social media feed showed you something you fundamentally disagreed with? But wait, there's more. The real masters of the game understand that a picture is worth a thousand words, and a meme might be worth a million votes. Take the Internet Research Agency, Russia's notorious troll factory. They didn't just post propaganda, they created personas. The regular guy, the attractive young woman, digital wolves in sheep's clothing designed to make you let your guard down.
And these propagandists are playing identity politics on a whole new level, turning your very sense of self into a weapon against you. This isn't just about spreading lies anymore, it's about weaponizing who you are. Your politics, your religion, your community, anything that makes you you, can be turned into ammunition. And according to research, Islamist and far-right extremist groups have become very good at this, sort of masters of this digital domain, creating what experts call a, quote, seductive subculture, which is a dangerous mix of slick production values and some really bad ideology. And they're not amateur operations. We're talking about sophisticated propaganda machines that know exactly how to push your emotional buttons. They create content that would make Hollywood producers take notes, except they're not selling movie tickets, they're selling extremism. And they're targeting: they're creating carefully crafted messages designed to resonate with specific audiences, exploiting frustrations and fears and grievances people already have. And they're also masters of digital camouflage. So they create those seemingly authentic profiles, you know, the passionate political commentator or your friendly neighbor, all designed to kind of draw you into their world, kind of like the frog in slowly boiling water. By the time you realize what's going on, you're already pretty deep in the ecosystem. And they're not just preaching, they're teaching. They're using social media to distribute virtual playbooks for violence, turning online radicalization into real-world action. They're exploiting those real-world grievances and conflicts, and using those as catalysts for extremism. So here are a couple of other recent examples of propaganda campaigns that illustrate the vulnerability of social media platforms to this kind of manipulation. The first one would be the 2016 US presidential election, which was a watershed moment in digital propaganda.
There was a whole mix of foreign operatives and domestic political groups and profit-motivated actors who took social media platforms and turned them into instruments of influence. And they deployed really targeted disinformation campaigns and conspiracy theories, trying to exploit divisions within the electorate. Russia's actions during the 2014 Ukrainian crisis are another sophisticated application of social media propaganda. So Russian social media outlets and other accounts systematically put out fabricated accounts of Ukrainian military atrocities, with a clear strategic purpose: to undermine the legitimacy of the Ukrainian government while creating a pretext for Russian intervention. The anti-vaccination movement presents a different but also significant case study. Unlike state-sponsored campaigns, this was a grassroots propaganda effort, showing that social media can amplify fringe views into the mainstream. And anti-vaccine advocates have effectively leveraged the reach and anonymity of social media to spread medical misinformation. And this has actually kept people from getting vaccinated, and it's really led to some disease outbreaks in a bunch of places. But there's a consistent pattern here. Whether it's driven by state actors, political organizations, or grassroots movements, all modern propaganda campaigns exploit social media's core features: its reach, since most people use it, its targeting capabilities, and the fact that it's really resistant to traditional fact-checking. So finally, research in psychology, communication, and political science offers a bunch of insights into the factors that make people more susceptible to propaganda. And we can think of three main categories: cognitive factors, emotional factors, and social factors. So here are the cognitive factors. First up is cognitive laziness. People are bombarded with a constant stream of information.
So they will usually take mental shortcuts to process information quickly. And this makes them more susceptible to accepting information at face value without really thinking about whether it's true or not. Next up is confirmation bias. People are much more likely to accept even false information if it confirms what they already believe. This makes individuals vulnerable to propaganda that reinforces their preconceived notions or seems in line with a group they're already a member of. And sometimes it's just, you know, poor skill at telling true from false. Research shows that people struggle to distinguish between true and false information, particularly when there's a lot of contradictory information coming in. And this is made worse by what they call the sleeper effect. This is where information from low-credibility sources becomes convincing over time as people forget the source, but remember the message. All right. And then there are the emotional factors. Information that evokes strong emotions such as fear or anger or joy is more likely to be shared and remembered regardless of whether it's true or not. So propagandists will create messages that tap into emotion and trigger those strong responses just to keep people from thinking rationally. Propaganda often plays on fears and anxieties, and it uses scare tactics and really exaggerated threats just to manipulate people. For example, terrorist organizations use graphic imagery and violent language to instill fear in their target populations while also appealing to a sense of belonging and purpose to attract recruits. Now, on the lighter side maybe, propaganda can also use humor. And this can disarm audiences and make the messages more palatable. Satirical news outlets, for example, can criticize political opponents or promote specific viewpoints, but the message goes down easily because it's funny, and it kind of blurs the line between entertainment and propaganda.
There are social factors too, like identity and belonging. So propagandists often target specific groups based on their social identities. You know, we talked about divisions. They'll also look at things that people have in common. People are just more likely to trust information from sources they identify with. And people who struggle with critical thinking skills are less equipped to evaluate information sources and identify propaganda techniques. And then there's the information environment itself: social media platforms create echo chambers where users mostly see information that confirms what they already believe. They don't see diverse viewpoints. And this was really interesting. Another factor is trust in authority figures. Propaganda often relies on the credibility of authority figures and institutions to legitimize its messages. And I want you to think about that for a second. How many times have you been told that you should look for information from, quote, reliable networks? What's the basis for that source being reliable? You know, could a reliable source ever make a mistake? And if they do, could a reliable source be sharing a bad message? People who place blind trust in authority are more likely to accept propaganda. So how do we become more resistant? Here's a couple of practical strategies. Let's start with what might be the most powerful tool in the arsenal: understanding how persuasion works. It's kind of like learning how a magic trick is done. Once you know the mechanics, you're less likely to be fooled. And when you understand how emotional appeals work, how narratives are designed to manipulate your thinking, how your own biases can work against you, you're halfway there in building resistance. But understanding isn't enough. You also need to develop skills, and that's where critical thinking comes in. It's not just about being skeptical. It's about knowing what questions to ask. When you encounter information, ask yourself, what's the evidence?
Who benefits from this message? Now media literacy isn't just a buzzword, it's a survival skill. And that means knowing how to evaluate sources, understanding the difference between news and opinion, and recognizing those common propaganda techniques like we go over on this show. Here's something concrete that you can do. Use fact-checking websites like PolitiFact, but don't stop there. Cross-reference the information across multiple sources, and when you see a flood of similar messages across different platforms, which experts call the firehose of falsehood, that's your cue, you should be extra cautious. But perhaps the most powerful strategy is active engagement, and that means stepping out of your information comfort zone. You know, actually talk to people who hold different views from you, and support quality journalism. Being forewarned about propaganda makes you more resistant to it. It's kind of like getting vaccinated against misinformation, so don't put your head in the sand and only look for the news that you care about. Sometimes, directly arguing against propaganda isn't the most effective approach. Focus instead on understanding what the propaganda is trying to achieve. Remember, this is a community effort, so I hope you can support media literacy education, you can join in constructive debates, and most of all you can share reliable information, because the more we work together to create an informed society, the more resistant we all become to manipulation. See you next time. Thanks for getting Unspun with me this week. Unspun is a production of me, Amanda Sturgill, and is a proud member of the MSW Media family of podcasts. Send me your thoughts and ideas about trickery in the news on Gmail at TheUnspunPodcast@gmail.com. I'll even write back. Find this episode's show notes and more information at TheUnspunPodcast.substack.com. Want to learn more and get smarter?
Check out my book, Detecting Deception: Tools to Fight Fake News, which is available on Amazon or your favorite online bookseller. And until next time, stay sharp, everyone. [MUSIC]
In this episode of Unspun, Dr. Sturgill explores guilt by association and dives deep into modern propaganda. From historical examples to today's digital battlegrounds, discover how propaganda has evolved and what research says makes you susceptible.