I sent this article to both of my kids this week. My daughter is in college studying publishing. My son is a high school senior planning to go into real estate. Neither of them works in tech. That’s exactly why they need to read it.
Matt Shumer has spent six years building an AI startup and investing in the space. He wrote this piece for the people in his life who keep asking “so what’s the deal with AI?”—and getting the sanitized answer:
I keep giving them the polite version. The cocktail-party version. Because the honest version sounds like I’ve lost my mind. And for a while, I told myself that was a good enough reason to keep what’s truly happening to myself. But the gap between what I’ve been saying and what is actually happening has gotten far too big. The people I care about deserve to hear what is coming, even if it sounds crazy.
I know this feeling. I wrote yesterday about how AI is collapsing the gap between design and code and shifting the designer’s value toward taste and orchestration. That essay was for the software design industry. Shumer is writing for everyone else.
His core argument: tech workers have already lived through the disruption that’s coming for every other knowledge-work profession. He explains why tech got hit first:
The AI labs made a deliberate choice. They focused on making AI great at writing code first… because building AI requires a lot of code. If AI can write that code, it can help build the next version of itself. A smarter version, which writes better code, which builds an even smarter version. Making AI great at coding was the strategy that unlocks everything else. That’s why they did it first.
Christina Wodtke agrees something big is happening but thinks Shumer’s timeline for everyone else is off. Programming, she argues, is a near-ideal use case for AI—there’s an ocean of public training data, and code has a built-in quality check: it runs or it doesn’t. Hallucinations get caught by the compiler. Other fields aren’t so clean-cut.
Shumer makes the classic tech-insider mistake: assuming his experience generalizes to everyone else’s. It doesn’t. Ethan Mollick’s “jagged frontier” of AI capability is as jagged as ever. AI is spectacular at some tasks and embarrassingly bad at others, and the pattern doesn’t map to human intuitions about difficulty.
She makes another point that matters for anyone in a creative field:
A nuance Shumer completely misses: industries where there isn’t one right answer but there are better and worse answers may actually fare better with AI. When you’re writing strategy, designing an experience, or crafting a narrative, a “hallucination” isn’t necessarily a bug. It might be an interesting idea.
That matches what I know to be true in design. A wrong answer in code crashes the app. A wrong answer in a design brainstorm might be the seed of something good.
This is why I sent Shumer’s piece to my kids but didn’t tell them to panic. Publishing runs on editorial judgment, taste, and relationships with authors. Real estate depends on physical presence, local knowledge, and trust built over handshakes. Neither field has the clean training data and binary pass/fail that made coding so vulnerable so fast. But that doesn’t mean nothing changes. Wodtke again:
Your job probably won’t disappear. But parts of it will shift, and the timeline depends on your field’s specific relationship to data, verification, and ambiguity. Prepare thoughtfully instead of panicking.
Shumer’s practical advice is modest: spend an hour a day experimenting with AI. Not reading about it, but actually using it. I’d add Wodtke’s framing to that: spend the hour figuring out which parts of your work sit on the easy side of the jagged frontier, and which parts don’t. That’s more useful than assuming the whole thing collapses overnight.
I said yesterday that the gap between “designer who orchestrates AI” and “designer who pushes pixels” will be enormous within 12 months. Shumer is making that same argument for every knowledge-work profession. The whole piece is worth your time and maybe worth sharing with someone who’s been resistant to AI. Just keep in mind Wodtke’s nuance.