[Header image: close-up of a Frankenstein-like monster face with stitched scars and neck bolts, overlaid by horizontal digital glitch bars]

Architects and Monsters

According to recently unsealed court documents, Meta discontinued its internal studies on Facebook’s impact after discovering direct evidence that its platforms were detrimental to users’ mental health.

Jeff Horwitz reporting for Reuters:

In a 2020 research project code-named “Project Mercury,” Meta scientists worked with survey firm Nielsen to gauge the effect of “deactivating” Facebook, according to Meta documents obtained via discovery. To the company’s disappointment, “people who stopped using Facebook for a week reported lower feelings of depression, anxiety, loneliness and social comparison,” internal documents said.

Rather than publishing those findings or pursuing additional research, the filing states, Meta called off further work and internally declared that the negative study findings were tainted by the “existing media narrative” around the company.

Privately, however, a staffer insisted that the conclusions of the research were valid, according to the filing.

As more and more evidence comes to light about Mark Zuckerberg and Meta’s failings and possibly criminal behavior, we as tech workers, and specifically as designers making technology that billions of people use, have to do better. While my previous essay, written after the assassination of Charlie Kirk, was an indictment of the algorithm, I’ve come across a couple of pieces recently that bring the responsibility closer to UX’s doorstep.

In an essay published in UX Collective, Daley Wilhelm asks, “Is addiction the responsibility of UX?” My short answer is yes. In her essay, Wilhelm dives into the infinite scroll, (supposedly) invented by Aza Raskin, son of Jef Raskin, the computer pioneer who started the Macintosh project at Apple. We even have a pejorative name for the behavior it enables: doomscrolling.

While I whole- and heavy-heartedly believe that the innovations piloted by the UX industry, like infinite scroll, have blame to shoulder for how appealing and engrossing social media can be, research has revealed that the best way for users to break out of the loop is to do it themselves. The real world has to step in; another app or notification isn’t going to pack the same punch as a pet needing attention or a friend asking a question.

That said, I do believe that more robust, industry-wide ethical mandates — like the ones proposed by Eleanor Howe in her excellent and extremely relevant article on the subject — would help to alleviate the overall harm brought on by the tech industry. More accountability for moving fast and breaking things would be a great next step toward healing what has been harmed.

We’ll get to Eleanor Howe’s article in a second. But first, Elvis Hsiao analyzes the causes of brain rot. The culprit, obviously, is social media. He cites a recent report that says:

The average user now spends 2 hours and 24 minutes per day on social platforms, roughly 19 hours each week. These users are not confined to one or two apps. They spread their time across an average of seven different platforms monthly.

This is the attention economy, Hsiao declares.

Your attention is worth serious money. In 2024, Meta generated $160.6 billion in advertising revenue, with ad profits reaching approximately $87 billion. TikTok generated $23 billion, which is a 42.8% increase from the previous year.

These companies aren’t selling products to you. They’re selling you to advertisers.

But why do we willingly spend roughly a tenth of our week (2 hours and 24 minutes of every day) doomscrolling? It’s because we’re biologically wired to do so. It’s dopamine, triggered by push notifications, red badges on icons, and wanting to see what the next reel is.

Social media companies pretend like they’re helping. Elvis Hsiao again:

Instagram lets you set daily time limits. When you hit your limit, a gentle reminder appears with a button that says “Ignore Limit For Today.” One tap and you’re back to scrolling…

TikTok offers screen time management rules buried three menus deep. YouTube suggests taking a break, but it automatically queues up the next video.

But here’s the indictment:

Those designers knew what they were building. Internal documents, whistleblower testimony, and the simple existence of features like screen time limits prove the companies understand the addictive nature of their products, but prioritize profits over well-being.

So what can we as designers and makers of technology do?

Perfectly timed with the release of Guillermo del Toro’s Frankenstein, Eleanor Howe uses Victor Frankenstein’s story as a parable:

Victor Frankenstein’s great folly lay not in his ambition, but in his abdication. Horrified that he “had turned loose into the world a depraved wretch,” he ran. He left his creation to be feared and misunderstood by a society that was not prepared for it.

This is the dilemma of the modern tech profession. We, the architects of the digital commons, have acted with the same hubris. We build systems optimized for profit and engagement and unleash them upon society, then hide behind a structure of diffused responsibility when systemic harms emerge (teen mental health crisis, political polarization, mass addiction, erosion of privacy).

Howe argues that the tech industry needs its own form of the Hippocratic Oath, the pledge physicians take to do no harm to their patients. She calls it “The Architect’s Mandate.”

Summarized, the proposed code goes like this:

I. The Mandate of Inquiry. A protected right to ask “why,” demand root‑cause intent, and reject engagement metrics as a proxy for human value.

II. The Mandate of Consequence. A right to access research on likely human impact before release and to prioritize long‑term user well‑being over short‑term platform goals.

III. The Mandate of Refusal. A protected right to decline building systems that exploit vulnerabilities, erode agency or privacy, or amplify division—without retaliation.

IV. The Mandate of Precedence. A pledge to place public safety, agency, and mental health above employer profit or quarterly targets.

V. The Mandate of Testimony. A right to warn the public about systemic harm from a product despite NDAs or internal policies, free from retribution.

VI. The Mandate of Audit. A right to audit datasets and models pre‑deployment, test and document bias, and mitigate or suspend flawed algorithms even if timelines slip.

VII. The Mandate of Sustainable Design. A right to refuse work that enshrines planned obsolescence or materially contributes to environmental degradation.

In my last essay about this subject, I called designers on the carpet and said we should take responsibility. This piece is more forward-looking. I cosign Eleanor Howe’s Architect’s Mandate. To borrow her metaphor, we’ve wrought a monster, but we can’t abdicate our duties any longer. We have to make things better by making better things. Next time you’re asked to design a feature that drives engagement at the expense of the user’s well-being, consider your responsibilities and the mandates above. It’s your choice to do the right thing.
