The Friendly Face of Modern Propaganda
Modern influence campaigns don’t force ideas upon us — they create environments where those ideas feel like natural conclusions.
Last week, a reader asked me to comment on the GOP’s decades-long campaign to define Democrats in negative terms — a highly effective form of propaganda. The request jump-started my thinking about how best to explore that topic. It also triggered a small army of algorithms that do their best to shape my online world.
Suddenly, propaganda was everywhere in my feeds. Bluesky offered up experts. YouTube suggested documentaries about information warfare. Threads provided an infographic about media manipulation. Newsletters hinted a subscription could reveal the secrets of political persuasion. Even a spontaneous visit to Amazon served up a library of books on propaganda — all on sale.
My tendency toward mystical thinking perked up: “This is a sign from the universe — write about this topic next.” But my more realistic side intervened. What felt like cosmic synchronicity was actually just really good digital marketing. The algorithms lying beneath the surface of every screen had caught a whiff of my interest and served up exactly what they thought would keep me scrolling.
I was watching propaganda work its magic in real time, even as I was researching how propaganda works. It was a perfect demonstration of how modern influence campaigns operate: they don’t force ideas upon us — they create environments where those ideas feel like natural conclusions.
From Posters to Personalization
Propaganda used to be a blunt instrument. World War I posters used simple illustrations to engage audiences and convince them to “Join the Army” or “Buy a Liberty Bond.” More recent campaigns leaned on mascots like Smokey Bear or bold acronyms like D.A.R.E. (Drug Abuse Resistance Education) in hopes of influencing behavior on a massive scale.
In comparison, modern propaganda is a heat-seeking missile locked onto your particular anxieties, hopes, and beliefs. The system learns which emotional notes to strike, which formats capture your attention, and which sources you trust. Each click, pause, or share helps it refine its approach, making its next attempt more likely to penetrate your natural skepticism.
This all works because the system understands something fundamental about human nature: we’re much more likely to believe something when it feels like we discovered it ourselves.
As with my algorithm-driven journey into propaganda research, each suggestion feels natural, each article seems relevant, and each video appears at just the right moment. The path feels self-directed, even though it’s carefully curated. That’s no accident — it’s by design.
Inside the Influence Machine
Propaganda’s evolution helps explain how it works now, but seeing these principles in action is even more revealing. The GOP’s effective recasting of Democrats in negative terms offers a perfect case study in modern propaganda. While the full story spans decades, the basic mechanics are surprisingly simple to understand.
It often begins with a carefully chosen phrase like “liberal elite.” Conservative thought leaders introduce it in speeches and interviews. Friendly media outlets like Fox News or Newsmax weave it into their coverage, bolstering its authenticity by connecting it to real tensions — like when environmental regulations impact mining communities, or when free trade policies reshape local manufacturing economies.
Then the algorithms take over, initiating a networked click-fest. The phrase spreads through social media, appearing in Facebook posts, Twitter debates, and Instagram memes. Someone posts about “liberal elites” cancelling conservative speech. Their uncle comments about his experience with a condescending liberal professor. A friend posts a meme about $10 lattes, and strangers pile on with “so true!”
Each mention reinforces the others, creating an impression that coastal liberals are somehow disconnected from “real” Americans. The narrative spreads not because it’s demonstrably true, but because it’s emotionally engaging — exactly what algorithms are designed to multiply.
The message resonates because it simplifies real economic and cultural divisions, offering an “other” to blame. When an Ohio factory worker earning $45,000 hears about coastal tech salaries topping $200,000, the “elite” label feels viscerally true. When rural communities are hollowed out and left to die, the “disconnection” feels real. The propaganda succeeds by offering easily believed explanations for complex realities.
Interestingly, attempts to create similarly potent labels for conservatives haven’t worked as well yet. Terms like “radical right” or “MAGA Republicans” focus mainly on political ideology rather than lifestyle or economic status. They lack the same impact because they don’t tap into comparable social and economic divisions. This asymmetry reveals another key insight about effective propaganda: its power often comes from connecting to existing fault lines in society, not just political disagreements.
No master strategist needs to oversee or coordinate these attacks; someone only has to plant the initial seeds. Social media’s architecture — its reward for emotional content, its embrace of conflict, its spotlighting of provocative posts — does the rest. People frustrated by social or economic changes become unwitting amplifiers, spreading the message through shares, comments, and personal anecdotes.
In short, modern propaganda builds itself — a self-perpetuating narrative machine powered by algorithms and fueled by our human need to make sense of a complex world. It’s cheap, it’s easy, and it works.
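For readers who want to see the gears turning, here’s a minimal sketch of that feedback loop in Python. Everything in it is invented for illustration: the posts, the reaction counts, and the weights are my assumptions, not any platform’s real ranking code. The point is simply that once emotionally charged reactions carry more weight, provocative posts float to the top, attract still more reactions, and climb even higher the next time the feed is scored.

```python
# Toy illustration only: the post data, reaction weights, and scoring rule are
# invented for this sketch; they are not any platform's actual ranking algorithm.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int
    angry_reactions: int

def engagement_score(post: Post) -> float:
    # Reactions that signal conflict or strong emotion get heavier weights,
    # because they tend to predict more scrolling, replies, and shares.
    return (
        1.0 * post.likes
        + 3.0 * post.comments
        + 5.0 * post.shares
        + 8.0 * post.angry_reactions
    )

feed = [
    Post("Local library expands weekend hours", likes=40, comments=3, shares=2, angry_reactions=0),
    Post("'Liberal elites' are coming for your town", likes=25, comments=60, shares=30, angry_reactions=45),
]

# The provocative post wins the top slot, gets seen by more people,
# collects more reactions, and scores even higher on the next pass.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(round(engagement_score(post)), "-", post.text)
```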
The Media’s Impossible Choice
Combating this type of propaganda creates a genuine Catch-22 for responsible media outlets. When they try to debunk false narratives, they often inadvertently strengthen them. A fringe theory about “liberal indoctrination in schools” circulates online. Major newspapers and networks, committed to informing the public, feel compelled to cover it “objectively.” But their fact-checking, however well-intended, keeps the topic trending on social media, spawning more coverage and discussion — frequently reinforcing rather than debunking the claim.
Fox News recognized and exploited this dynamic years ago. Their strategy of “just asking questions” about Obama’s birth certificate or Hunter Biden’s laptop forced other mainstream outlets into an unwinnable position. Each attempt to fact-check or contextualize these “controversies” breathed new life into the narratives. News organizations that refused to play along faced accusations of bias: “What are they trying to hide? Why won’t they cover this story?”
Conservative media may have pioneered this approach, but they don’t hold a monopoly on it anymore. Climate activists link every major storm to global warming. Voting rights advocates frame every electoral reform as potential voter suppression. Social media amplifies local controversies about book banning into evidence of creeping authoritarianism. These narratives gain power because they connect to genuine concerns about real social changes. But they don’t foster dialogue. They smother debate with a cloud of emotional exhaust.
And there’s the media’s dilemma in a nutshell: ignore questionable claims and watch them spread, or engage with them and risk extending their reach and distorting the issues. The challenge becomes even more complex because news organizations depend financially on the very social media platforms that make propaganda so effective. Each click, share, and angry comment helps somebody’s bottom line — just rarely the truth’s.
Beyond the Echo Chamber
We’d all like to think we’re too smart to fall for propaganda. That’s another reason it’s so effective: because it doesn’t feel like propaganda, our defenses are lowered. Consider my own experience: despite decades of living and working in Silicon Valley, surrounded by discussions of algorithms and machine learning, I still found myself attributing online content suggestions to cosmic significance. That’s how seamlessly engineered opinions can blend into our natural thought processes.
Each day, thousands of messages compete for our attention through social feeds, news alerts, and casual conversations. We simply don’t have the time or energy to fact-check everything so we rely on shortcuts. Does this align with my experience? Is it shared by people I trust? Have I seen similar posts from reliable sources? These mental shortcuts aren’t flaws in our thinking — they’re necessary tools for navigating a complex world. But they’re also the very pathways that modern propaganda exploits with increasing sophistication.
The real challenge isn’t spotting individual pieces of propaganda — it’s recognizing how these messages collectively create the lens through which we view the world. We see patterns that feel meaningful and natural, even when they’re carefully constructed.
Today’s propaganda doesn’t arrive wearing a warning label or announce itself with bold posters and slogans. Instead, it shows up as a steady stream of content that feels personally relevant, naturally discovered, and seemingly validated by multiple sources. Before we realize it, these curated perspectives become the foundation for how we understand new information and interpret the world around us. It becomes easy, and even seemingly reasonable, to view all Democrats as “liberal elites.”
Understanding this dynamic doesn’t make us immune to it, but it does give us a fighting chance to think more critically about the information environments we inhabit. When we recognize propaganda as a process rather than just content — a subtle shaping of our digital world rather than mere messages within it — we have a better chance of maintaining our intellectual independence while staying genuinely informed.
Your Turn
While there’s no perfect defense against modern propaganda, I believe we can develop better habits for constraining its influence. Consider these suggestions for improving your ability to spot propaganda for what it is:
Notice patterns. When stories cluster in your feed, ask yourself why they’re appearing now. What actions did you take that might have attracted these narratives and this point of view?
Check your reactions. Strong emotional responses often signal that you’re interacting with content designed to provoke that very response. Think about the last political story that made you angry. Can you trace how it reached you? What made it feel especially credible or concerning?
Seek primary sources. If something seems outrageous, track down the original context before sharing. If the news source is a single person claiming expertise, confirm their credentials.
Expand your inputs. Follow thoughtful people you sometimes disagree with. Not the provocateurs, but the genuine bridge-builders who offer an alternative perspective.
Practice patience. Not every story needs an immediate reaction. Sometimes the best response is to wait for more information, or to share your reaction offline with friends or colleagues and listen to their points of view.
The goal isn’t to become cynical about everything we read. That’s just another form of manipulation. Instead, we can aim for a kind of informed skepticism — one that acknowledges both our own biases and the sophisticated systems trying to exploit them.
I’d love to hear your thoughts in the comments. Your insights on how propaganda shows up for you might help others better understand how it invades and shapes their thought processes.
If you’re new here, an algorithm probably guided you. In that case, I recommend you confirm who I am, where my expertise lies, and what biases I may bring to my posts. If you want to read more, I’d suggest you start with my foundational post, The Hidden Influence of Branding in American Politics.