Blood in the Feed: Social Media’s Deadly Design
The assassination of Charlie Kirk on September 10, 2025, marked a horrifying inflection point in the growing debate over how digital platforms amplify rage and destabilize politics. As someone who had already stepped back from social media after Trump’s re-election, watching these events unfold from a distance only confirmed my decision. My feeds had become pits of despair, grievance, and negativity that did my mental health no good. While I understand the need to shine a light on the atrocities of Trump and his government, the constant barrage was too much. So I mostly opted out, save for the occasional promotion of my writing.
Kirk’s death feels like the inevitable conclusion of systems we’ve built—systems that reward outrage, amplify division, and transform human beings into content machines optimized for engagement at any cost.
The Mechanics of Disconnection
As it turns out, my behavior isn’t out of the ordinary. People quit social media for various reasons, often situational—seeking balance in an increasingly overwhelming digital landscape. As a participant explained in a research project about social media disconnection:
It was just a build-up of stress and also a huge urge to change things in life. Like, ‘It just can’t go on like this.’ And that made me change a number of things. So I started to do more sports and eat differently, have more social contacts and stop using online media. And instead of sitting behind my phone for two hours in the evening, I read a book and did some work, went to work out, I went to a birthday or a barbecue. I was much more engaged in other things. It just gave me energy. And then I thought, ‘This is good. That’s the way it’s supposed to be. I have to maintain this.’
Sometimes the realization is more visceral—that on these platforms, we are the product. As Jef van de Graaf provocatively puts it:
Every post we make, every friend we invited, every little notification dragging us back into the feed serves one purpose: to extract money from us—and give nothing back but dopamine addiction and mental illness.
While his language is deliberately inflammatory, the sentiment resonates with many who’ve watched their relationship with these platforms sour. As he cautions:
Remember: social media exists because we feed it our lives. We trade our privacy and sanity so VCs and founders can get rich and live like greedy fucking kings.
The Architecture of Rage
The internet was built to connect people and ideas. Even the early iterations of Facebook and Twitter were relatively harmless because their timelines were chronological. But then the product managers, designers, and engineers behind social media platforms began to optimize for engagement and time spent. Was the birth of the social media algorithm the original sin?
Kevin Roose and Casey Newton explored this question in their Hard Fork episode following Kirk’s assassination, discussing how platforms have evolved to optimize for what they call “borderline content”—material that comes right up to the line of breaking a platform’s policy without quite going over. As Newton observed about Kirk himself:
He excelled at making what some of the platform nerds that I write about would call borderline content. So basically, saying things that come right up to the line of breaking a platform’s policy without quite going over... It turns out that the most compelling thing you can do on social media is to almost break a policy.
Kirk mastered this technique—speculating that vaccines killed millions, calling the Civil Rights Act a mistake, flirting with anti-Semitic tropes while maintaining plausible deniability. He understood the algorithm’s hunger for controversy, and fed it relentlessly. And then, in a horrible irony, he was killed by someone who had likely been radicalized by the very same algorithmic forces he’d helped unleash.
As Roose reflected:
We as a culture are optimizing for rage now. You see it on the social platforms. You see it from politicians calling for revenge for the assassination of Charlie Kirk. You even see it in these individual cases of people getting extremely mad at the person who made a joke about Charlie Kirk that was edgy and tasteless, and going to report them to their employer and get them fired. It’s all this sort of spectacle of rage, this culture of destroying and owning and humiliating.
The Unraveling of Digital Society
Social media and smartphones have fundamentally altered how we communicate and socialize, often at the expense of face-to-face interactions. These technologies have created a market for attention that fuels fear, anger, and political conflict. The research on mental health impacts is sobering: studies found that the introduction of Facebook to college campuses led to measurable increases in depression, accounting for approximately 24 percent of the increased prevalence of severe depression among college students over two decades.
In the wake of Kirk’s assassination, what struck me most was how the platforms immediately transformed tragedy into content. Within hours, there were viral posts celebrating his death, counter-posts condemning those celebrations, organizations collecting databases of “offensive” comments, people losing their jobs, death threats flying in all directions. As Newton noted:
This kind of surveillance and doxxing is essentially a kind of video game that you can play on X. And people like to play video games. And because you’re playing with people’s real lives, it feels really edgy and cool and fun for those who are participating in this.
The human cost is staggering—teachers, firefighters, and military members fired or suspended for comments about Kirk’s death. Many received death threats. Far-right activists called for violence and revenge, doxxing anyone they accused of insufficient mourning.
Blood in the Feed
The last five years have been marked by eruptions of political violence that cannot be separated from the online world that incubated them.
- The attack on Paul Pelosi (2022). The man who broke into House Speaker Nancy Pelosi’s San Francisco home and fractured her husband’s skull had been marinating in QAnon conspiracies and election denialism online. Extremism experts warned it was a textbook case of how stochastic terrorism—the idea that widespread demonization online can trigger unpredictable acts of violence by individuals—travels from platform rhetoric into a hammer-swinging hand.
- The Trump assassination attempt (July 2024). A young man opened fire at a rally in Pennsylvania. His social media presence was filled with antisemitic, anti-immigrant content. Within hours, extremist forums were glorifying him as a martyr and calling for more violence.
- The killing of Minnesota legislator Melissa Hortman and her husband (June 2025). Their murderer left behind a manifesto echoing the language of online white supremacist and anti-abortion communities. He wasn’t a “lone wolf.” He was drawing from the same toxic well of white supremacist and anti-abortion rhetoric that floods online forums. The language of his manifesto wasn’t unique—it was copied, recycled, and amplified in the ideological swamps anyone with a Wi-Fi connection can wander into.
These headline events sit atop a broader wave: the New Orleans truck-and-shooting rampage inspired by ISIS propaganda online (January 2025), the Cybertruck bombing outside Trump’s Las Vegas hotel tied to accelerationist forums—online spaces where extremists argue that violence should be used to hasten the collapse of society (January 2025), and countless smaller assaults on election workers, minority communities, and public officials.
The pattern is depressingly clear. Platforms radicalize, amplify, and normalize the language of violence. Then, someone acts.
The Death of Authenticity
As social media became commoditized—a place to influence and promote consumption—it became less personal and more like TV. The platforms are now being overrun by AI spam and engagement-driven content that drowns out real human connection. As James O’Sullivan notes:
Platforms have little incentive to stem the tide. Synthetic accounts are cheap, tireless and lucrative because they never demand wages or unionize... Engagement is now about raw user attention – time spent, impressions, scroll velocity – and the net effect is an online world in which you are constantly being addressed but never truly spoken to.
Research confirms what users plainly see: tens of thousands of machine-written posts now flood public groups, pushing scams and chasing engagement. Whatever remains of genuine human content is increasingly sidelined by algorithmic prioritization, receiving fewer interactions than the engineered content and AI slop optimized solely for clicks.
The result? Networks that once promised a single interface for the whole of online life are splintering. Users drift toward smaller, slower, more private spaces—group chats, Discord servers, federated microblogs, and email newsletters. A billion little gardens replacing the monolithic, rage-filled public squares that have led to a burst of political violence.
The Designer’s Reckoning
This brings us to design and our role in creating these systems. As designers, are we beginning to reckon with what we’ve wrought?
Jony Ive, reflecting on his own role in creating the smartphone, acknowledges this burden:
I think when you’re innovating, of course, there will be unintended consequences. You hope that the majority will be pleasant surprises. Certain products that I’ve been very involved with, I think there were some unintended consequences that were far from pleasant. My issue is that even though there was no intention, I think there still needs to be responsibility. And that weighs on me heavily.
His words carry new weight after Kirk’s assassination—a death enabled by platforms we designed, algorithms we optimized, engagement metrics we celebrated.
At the recent World Design Congress in London, architect Indy Johar didn’t mince words:
We need ideas and practices that change how we, as humans, relate to the world... Ignoring the climate crisis means you’re an active operator in the genocide of the future.
But we might ask: What about ignoring the crisis of human connection? What about the genocide of civil discourse? Climate activist Tori Tsui’s warning applies equally to our digital architecture: “The rest of us are at the mercy of what you decide to do with your imagination.”
Political violence is accelerating and people are dying because of what we did with our imagination. If responsibility weighs heavily, so too must the search for alternatives.
The Possibility of Bridges
There are glimmers of hope in potential solutions. Aviv Ovadya’s concept of “bridging-based algorithms” offers one path forward—systems that actively seek consensus across divides rather than exploiting them. As Casey Newton explains:
They show them to people across the political spectrum... and they only show the note if people who are more on the left and more on the right agree. They see a bridge between the two of you and they think, well, if Republicans and Democrats both think this is true, this is likelier to be true.
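The core mechanism can be sketched in a few lines. This is a hypothetical simplification of the bridging idea, not Community Notes’ actual algorithm (which infers rater viewpoints via matrix factorization rather than using labeled clusters): a note is surfaced only when raters on both sides of a divide find it helpful.

```python
# Hypothetical sketch of a bridging-based ranking rule. All names and
# thresholds are illustrative assumptions, not a real platform's API.

def bridging_score(ratings):
    """ratings: list of (cluster, helpful) pairs, e.g. ('left', True).
    Returns the approval rate of the LEAST supportive cluster, so a
    note scores well only with cross-divide consensus."""
    by_cluster = {}
    for cluster, helpful in ratings:
        by_cluster.setdefault(cluster, []).append(helpful)
    if len(by_cluster) < 2:
        return 0.0  # no bridge without raters from at least two sides
    return min(sum(v) / len(v) for v in by_cluster.values())

def show_note(ratings, threshold=0.6):
    # Surface the note only if even the least supportive cluster
    # approves it at the threshold rate.
    return bridging_score(ratings) >= threshold
```

Contrast this with engagement ranking, which would reward the note that provokes the most reactions; here the minimum across clusters means a note loved by one side and rejected by the other scores zero.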
But technological solutions alone won’t save us. The participants in social media disconnection studies often report developing better relationships with technology only after taking breaks. One participant explained:
It’s more the overload that I look at it every time, but it doesn’t really satisfy me, that it no longer had any value at a certain point in time. But that you still do it. So I made a conscious choice – a while back – to stop using Facebook.
Designing in the Shadow of Violence
Rob Alderson, in his dispatch from the World Design Congress, puts together a few pieces. Johar suggests design’s role is “desire manufacturing”—not just creating products, but rewiring society to want and expect different versions of the future. As COLLINS co-founder Leland Maschmeyer argued, design is about…
What do we want to do? What do we want to become? How do we get there?… We need to make another reality as real as possible, inspired by new context and the potential that holds.
The challenge before us isn’t just technical—it’s fundamentally about values and vision. We need to move beyond the Post-it workshops and develop what Johar calls “new competencies” that shape the future.
As I write this, having stepped back from the daily assault of algorithmic rage, I find myself thinking about the Victorian innovators Ive mentioned—companies like Cadbury’s and Fry’s that didn’t just build factories but designed entire towns, understanding that their civic responsibility extended far beyond their products. They recognized that massive societal shifts, like moving people from the land they farmed to the cities where they worked in factories, require holistic thinking about how people live and work together.
We stand at a similar inflection point. The tools we’ve created have reshaped human connection in ways that led directly to Charlie Kirk’s assassination. A young man, radicalized online, killed a figure who had mastered the art of online radicalization. The snake devoured its tail on a college campus in Utah, and we all watched it happen in real time, transforming even this tragedy into content.
The vast majority of Americans, as Newton reminds us, “do not want to participate in a violent cultural war with people who disagree with them.” Yet our platforms are engineered to convince us otherwise, to make civil war feel perpetually imminent, to transform every disagreement into an existential threat.
The Cost of Our Imagination
Perhaps the real design challenge lies not in creating more engaging feeds or stickier platforms, but in designing systems that honor our humanity, foster genuine connection, and help us build the bridges we so desperately need.
Because while these US incidents show how social media incubates lone attackers and small cells, they pale in comparison to Myanmar, where Facebook’s algorithms directly amplified hate speech and incitement, contributing to the deaths of thousands—estimates range from 6,700 to as high as 24,000—and the forced displacement of over 700,000 Rohingya Muslims. That catastrophe made clear: when platforms optimize only for engagement, the result isn’t connection but carnage.
This is our design failure. We built systems that reward extremism, amplify rage, and treat human suffering as engagement. The tools meant to bring us together have instead armed us against each other. And we all bear responsibility for that.
It’s time we imagined something better—before the systems we’ve created finish the job of tearing us apart.