The Economics of Division
#NEWSLETTER | How digital media profits from our anger and distorts our worldview, and why families need to reclaim a balanced perspective to prepare kids for an AI-driven future.
"We've never been more divided…"
Is that true? Or is it a byproduct of how digital media (social + mainstream + new) make money? How many times a day do you feel angry or anxious because of something you read online? You probably don't often think about the revenue your reactions to online content generate, but the future depends on us reclaiming control.
This is not to say we have no personal agency. But our actions and engagement create data, and as the demand for data to fuel AI development grows, so too does our susceptibility to being manipulated into engaging and reacting.
The Attention Economy: How it Works
It’s important to remind ourselves, and our kids, that media platforms primarily generate revenue through advertising, which of course requires maximizing user attention and engagement.
It’s an economic ecosystem that relies on getting your attention. And it’s far easier to get you to click and participate when you are angry, concerned, or feeling scorned.
It’s quite remarkable to consider the tools available to play with our emotions. In fact, when it comes to social media, in particular, the entire system is built to maximize "time on platform" through infinite scrolling, autoplay features, and notification systems that trigger a dopamine response.
Dr. Anna Lembke, the author of "Dopamine Nation: Finding Balance in the Age of Indulgence," often talks about the mechanics of addiction as it relates to online content. She explains how these digital environments exploit the same neural pathways as addictive substances (like candy), creating powerful feedback loops that keep us coming back for more.
A New System Built on Division
So how, literally, does this all work? Here is what’s behind the wizard’s curtain:
Algorithms
Algorithms, in this context, are the "rules" that determine what content is shown to you. These underpin the ecosystem by promoting content that triggers strong emotional reactions (especially outrage, fear, and anger) because this content gets more clicks, comments, and shares. These invisible systems are constantly learning which emotional buttons to push to keep you scrolling.
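To make the idea concrete, here is a deliberately simplified toy sketch (not any platform's actual code, which is proprietary and far more complex): a feed that ranks posts by predicted engagement will naturally push the most provocative content to the top.

```python
# Toy model of an engagement-driven feed ranker (illustrative only --
# real platform algorithms are proprietary and far more sophisticated).

def rank_feed(posts):
    """Order posts by a simple predicted-engagement score."""
    def score(post):
        # Clicks, comments, and shares all count as "engagement";
        # outrage-driven posts tend to rack up all three.
        return post["clicks"] + 2 * post["comments"] + 3 * post["shares"]
    return sorted(posts, key=score, reverse=True)

posts = [
    {"title": "Local library extends hours", "clicks": 120, "comments": 4, "shares": 2},
    {"title": "You won't BELIEVE what they said!", "clicks": 90, "comments": 60, "shares": 45},
]

for post in rank_feed(posts):
    print(post["title"])
# The outrage headline ranks first, despite fewer raw clicks.
```

Notice that the ranker never asks whether a post is true or useful; the only signal is reaction.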
Personalization Features
With repeated engagement, we end up in "filter bubbles" that reinforce existing beliefs and minimize exposure to diverse perspectives.
For instance, YouTube or Instagram’s recommendation system typically guides viewers toward increasingly extreme content in the same ideological direction. Someone watching basic fitness videos/reels might gradually be recommended content about extreme diets, while someone watching basic political commentary might be steered toward more radical viewpoints.
These bubbles create the illusion that everyone thinks like we do.
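The narrowing described above can be sketched in a few lines (again, a toy illustration with made-up data, not a real recommender system): once one topic dominates your watch history, everything else quietly falls off the recommendation list.

```python
# Toy sketch of a "filter bubble" forming (illustrative only -- real
# recommender systems are proprietary and vastly more sophisticated).
from collections import Counter

def recommend(watch_history, catalog, k=3):
    """Surface the k catalog items whose topic dominates the watch history."""
    topic_counts = Counter(video["topic"] for video in watch_history)
    # Items from the user's dominant topic float to the top, so every
    # additional watch in that topic makes the next batch narrower.
    return sorted(catalog, key=lambda v: topic_counts[v["topic"]], reverse=True)[:k]

history = [{"topic": "fitness"}] * 5 + [{"topic": "news"}]
catalog = [
    {"title": "Extreme 500-calorie diet", "topic": "fitness"},
    {"title": "Beginner stretching", "topic": "fitness"},
    {"title": "Local news roundup", "topic": "news"},
    {"title": "Weeknight cooking basics", "topic": "cooking"},
]

for video in recommend(history, catalog):
    print(video["title"])
# Fitness content crowds the list; cooking never appears at all.
```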
Clickbait Headlines
Even our most mainstream and historically trusted media frequently use (and abuse) these attention-grabbing tactics: deliberately sensationalized headlines and scandalous content designed to get people to click, engage, and share. The gap between headline and actual substance often doesn't matter, as long as the initial click occurs.
And, unfortunately, there is very little recourse if the content or a headline is wrong. Once a story is out in the digital universe, a correction rarely travels as far as the original claim.
Viral Sharing Mechanics
Of course, if all the above is "working" then the media want more eyeballs on the content and most users are happy to oblige by sharing the content with others. This amplification system ensures that the most emotionally provocative content—not necessarily the most accurate or important—reaches the widest audience.
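The amplification dynamic above comes down to compounding: each round of sharing exposes a new crowd, so even a modest difference in how "shareable" a post is produces a huge difference in reach. A back-of-the-envelope sketch (the numbers are invented for illustration):

```python
# Toy model of viral amplification: a post's reach compounds with its
# share rate. All figures are made up for illustration.

def reach_after(generations, seed_viewers, share_rate, avg_followers=10):
    """Total viewers after several 'generations' of sharing."""
    viewers = seed_viewers
    total = seed_viewers
    for _ in range(generations):
        # Each sharer exposes the post to their own followers.
        viewers = viewers * share_rate * avg_followers
        total += viewers
    return int(total)

# A provocative post (20% share rate) vs. a measured one (2%),
# both starting from the same 100 viewers:
print(reach_after(4, 100, 0.20))  # 3100
print(reach_after(4, 100, 0.02))  # 124
```

A tenfold difference in share rate yields roughly a twenty-five-fold difference in reach after just four rounds, which is why provocative content dominates feeds even when most people share responsibly.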
The Old Way Fostered Consensus
Now, I'm not going to argue that the way we consumed media in the past was better… I'm excited by technology and the future. But to understand the constraints and ecosystem inherent to media years ago is to understand how division yesterday hurt sales but today fuels revenue.
Thinking about the previous economic environment of media companies can also help us understand what "divided" even means.
And none of this is to say that sensational media did not exist to titillate and sell ad space before. But ultimately, the structural incentives for extreme polarization weren't as strong or pervasive.
Here is why:
Space Limitations
First, the literal, physical limitations of space in print media (newspapers, magazines) and time on air for broadcast media meant that content had to appeal to broad audiences. And by appealing more broadly, consensus and connection were byproducts of the content. Editors had to make careful choices about what deserved coverage, prioritizing stories with wider relevance.
Incentives for Satisfaction, Not Rage
Also, to get people to subscribe, it was important to gain and keep trust. Rather than rapid-fire content designed for fleeting engagement, publications needed to cultivate long-term satisfaction to keep subscribers spending money to receive a newspaper or magazine. The relationship was built on reliability rather than constant emotional stimulation.
Higher Barrier to Entry
There has been something liberating and exciting about having fewer gatekeepers and many more voices on the media front. But the much higher barrier to entry meant far fewer media voices, and with them, far less extreme content in circulation. Mainstream publications had reputational stakes that moderated their approach.
Slower News Cycle
The entire pre-digital ecosystem was slower by design. While, again, the benefits of speed are obvious, when it wasn't available to us, there was more time for fact-checking and deliberation. Stories developed more gradually, allowing for context and understanding rather than instant reactions.
Connection to AI Innovation
I attended a number of events this week with experts, academics, and leaders in the "Responsible AI" space. And there is a serious concern that the way in which we have allowed kids to experience a polarized digital environment will affect their ability to engage critically and safely with AI.
Because if children are easily swayed, they may struggle to distinguish between credible and non-credible content. It may be harder to ask questions and think critically if they have been trained to seek peer influence and social validation above thoughtful engagement.
The fact is, while children's sense of reality and social norms is still forming, a worldview that appears fractured and encourages them to act out or stay angry is not conducive to using future innovation to promote change that benefits society. Children raised in digital environments optimized for outrage rather than understanding may bring those same patterns of interaction to emerging technologies.
Sounds heavy, right? Well, there are things we can do…
How to Navigate This New Reality
Just like getting those fruits and veggies in front of kids, or promoting good habits of study and fitness, we should consider adding in work to think critically about digital content. We want kids to advocate, debate, protest, and have a voice. But it's also important that we don't look away, as much of this generates profit for someone else.
Have Open Conversations
Explain how social media companies make money through attention and engagement and be honest about what is happening
Use age-appropriate examples to show how algorithms promote extreme content and discuss how this content makes both of you feel
Discuss how headlines can be misleading or designed to trigger emotions
Share examples from your experience noticing media manipulation (and maybe even examples where you weren’t super proud of your own response)
Develop Media Literacy Skills Together
Practice identifying credible vs. non-credible sources
Seek evidence and multiple perspectives on controversial topics
Discuss the difference between facts, opinions, and emotional appeals
Analyze headlines together: "Why did they phrase it this way?"
Create Healthy Media Habits
Establish tech-free times and zones in your home (and be honest about how hard it is for you, too)
Model balanced media consumption yourself
Follow a diverse range of sources that represent different viewpoints (don’t vilify credible but alternative outlets)
Take "digital detox" breaks as a family
Focus on the Real World
Encourage community involvement and real-world social connections
Discuss how online discourse often differs from in-person conversations
Point out that the most extreme voices often get amplified online, while moderate perspectives are common in everyday life
Remind children that what they see online is often designed to provoke reaction rather than represent reality
Seek Quality Alternatives
Look for age-appropriate news sources designed for children
Support subscription-based journalism that isn't solely dependent on clicks
Find online communities centered around constructive dialogue rather than conflict (there are some great ones here on Substack!)
Explore educational content that teaches critical thinking
Final Thoughts…
The commercial structures of today's digital media landscape have fundamentally changed how information reaches us, and understanding how they often reward polarization and extreme emotion can be helpful to all of us.
By understanding these dynamics, families can help children navigate this environment more thoughtfully, developing the skills to recognize manipulation and maintain a balanced perspective on the world around them.
The goal isn't to shield children completely from digital media—that's neither practical nor desirable in today's world. Instead, we want to empower them with the awareness and skills to engage thoughtfully, recognizing when they're being manipulated and making conscious choices about their digital consumption.
Remember that while technology created these challenges, it's still ultimately human judgment and critical thinking that offer the best protection against them.
I was reminded this week of human agency when talking to a digital investigator who helps companies root out bad actors burrowing deep into AI and manipulating consumers. He made a point about bad people doing bad things, no matter the vehicle.
It's a good point to remember: while the world shifts around us, what's good, bad, right, or wrong stays the same. We just need to find our footing.