54 posts tagged with “ux design”

Coincidentally, I was considering adding a service designer to my headcount plan when this article came across my feeds. Perfect timing. It’s hard to imagine that service design is such a young discipline—the RCA’s course only began in 2012, according to the author.

Joe Foley, writing in Creative Bloq:

As a discipline, service design is still relatively new. A course at the Royal College of Art in London (RCA) only began in 2012 and many people haven't even heard of the term. But that's starting to change.

He interviews designer Clive Grinyer, whose new book on service design has just come out. He was co-founder of the design consultancy Tangerine, Director of Design and Innovation for the UK Design Council, and Head of Service Design at the Royal College of Art.

Grinyer:

Great service design is often invisible as it solves problems and removes barriers, which isn’t necessarily noticed as much as a shiny new product. The example of GDS (Government Digital Service) redesigning every government department from a service design perspective and removing many frustrating and laborious aspects of public life from taxing a car to getting a passport, is one of the best.

The key difference between service design and UX is that its end product is not something on a screen:

But service design is not just the experience we have through the glass of a screen or a device: it’s designed from the starting point of the broader objective and may include many other channels and touchpoints. I think it was Colin Burns who said a product is just a portal to a service.

In other words, if you open the aperture of what user experience means, and take on the challenge of designing real-world processes, flows, and interaction—that is service design.


Service design isn't just a hot buzzword, it affects everything in your life

Brands need to catch up fast.

creativebloq.com

Luke Wroblewski, writing in his blog:

Across several of our companies, software development teams are now "out ahead" of design. To be more specific, collaborating with AI agents (like Augment Code) allows software developers to move from concept to working code 10x faster. This means new features become code at a fast and furious pace.

When software is coded this way, however, it (currently at least) lacks UX refinement and thoughtful integration into the structure and purpose of a product. This is the work that designers used to do upfront but now need to "clean up" afterward. It's like the development process got flipped around. Designers used to draw up features with mockups and prototypes, then engineers would have to clean them up to ship them. Now engineers can code features so fast that designers are ones going back and cleaning them up.

This is what I’ve been secretly afraid of. That we would go back to the times when designers were called in to do cleanup. Wroblewski says:

Instead of waiting for months, you can start playing with working features and ideas within hours. This allows everyone, whether designer or engineer, an opportunity to learn what works and what doesn’t. At its core rapid iteration improves software and the build, use/test, learn, repeat loop just flipped, it didn't go away.

Yeah, or the feature will get shipped this way and be stuck this way because startups move fast and move on.

My take is that as designers, we need to meet the moment and figure out how to build design systems and best practices into the agentic workflows our developer counterparts are using.


AI Has Flipped Software Development

For years, it's been faster to create mockups and prototypes of software than to ship it to production. As a result, software design teams could stay "ahead" of...

lukew.com

This is a really well-written piece that pulls the AI + design concepts neatly together. Sharang Sharma, writing in UX Collective:

As AI reshapes how we work, I’ve been asking myself, it’s not just how to stay relevant, but how to keep growing and finding joy in my craft.

In my learning, the new shift requires leveraging three areas
1. AI tools: Assembling an evolving AI design stack to ship fast
2. AI fluency: Learning how to design for probabilistic systems
3. Human-advantage: Strengthening moats like craft, agency and judgment to stay ahead of automation

Together with strategic thinking and human-centric skills, these pillars shape our path toward becoming an AI-native designer.

Sharma connects all the crumbs I’ve been dropping this week:


AI tools + AI fluency + human advantage = AI-native designer

From tools to agency, is this what it would take to thrive as a product designer in the AI era?

uxdesign.cc
Copilots helped enterprises dip their toes into AI. But orchestration platforms and tools are where the real transformation begins — systems that can understand intent, break it down, distribute it, and deliver results with minimal hand-holding.

Think of orchestration as how “meta-agents” are conducting other agents.

The first iteration of AI in SaaS was copilots. They were like helpful interns eagerly awaiting your next command. Orchestration platforms are more like project managers. They break down big goals into smaller tasks, assign them to the right AI agents, and keep everything coordinated. This shift is changing how companies design software and user experiences, making things more seamless and less reliant on constant human input.

For designers and product teams, it means thinking about workflows that cross multiple tools, making sure users can trust and control what the AI is doing, and starting small with automation before scaling up.

Beyond Copilots: The Rise of the AI Agent Orchestration Platform

AI agent orchestration platforms are replacing simple copilots, enabling enterprises to coordinate autonomous agents for smarter, more scalable workflows.

uxmag.com

Let’s stay on the train of designing AI interfaces for a bit. Here’s a piece by Rob Chappell in UX Collective where he breaks down how to give users control—something I’ve been advocating—when working with AI.

AI systems are transforming the structure of digital interaction. Where traditional software waited for user input, modern AI tools infer, suggest, and act. This creates a fundamental shift in how control moves through an experience or product — and challenges many of the assumptions embedded in contemporary UX methods.

The question is no longer:
“What is the user trying to do?”

The more relevant question is:
“Who is in control at this moment, and how does that shift?”

Designers need better ways to track how control is initiated, shared, and handed back — focusing not just on what users see or do, but on how agency is negotiated between human and system in real time.

Most design frameworks still assume the user is in the driver’s seat. But AI is changing the rules. The challenge isn’t just mapping user flows or intent—it’s mapping who holds the reins, and how that shifts, moment by moment. Designers need new tools to visualize and shape these handoffs, or risk building systems that feel unpredictable or untrustworthy. The future of UX is about negotiating agency, not just guiding tasks.


Beyond journey maps: designing for control in AI UX

When systems act on their own, experience design is about balancing agency — not just user flow

uxdesign.cc

Vitaly Friedman writes a good primer on the design possibilities for users to interact with AI features. As AI capabilities become more and more embedded in the products designers make, we have to become facile in manipulating AI as material.

Many products are obsessed with being AI-first. But you might be way better off by being AI-second instead. The difference is that we focus on user needs and sprinkle a bit of AI across customer journeys where it actually adds value.

Design Patterns For AI Interfaces

Designing a new AI feature? Where do you even begin? From first steps to design flows and interactions, here’s a simple, systematic approach to building AI experiences that stick.

smashingmagazine.com

Ted Goas, writing in UX Collective:

I predict the early parts of projects, getting from nothing to something, will become shared across roles. For designers looking to branch out, code is a natural next step. I see a future where we’re fixing small bugs ourselves instead of begging an engineer, implementing that animation that didn’t make the sprint but you know would absolutely slap, and even building simple features when engineering resources are tight.

Our new reality is that anyone can make a rough draft.

But that doesn’t mean those drafts are good. That’s where our training and taste come in.

I think Goas is right and it echoes the AI natives post by Elena Verna. I wrote a little more extensively in my newsletter over the weekend.


Designers: We’ll all be design engineers in a year

And that’s a good thing.

uxdesign.cc

Miqdad Jaffer, a product leader at OpenAI, shares his 4D method for building AI products that users want. In summary, it's…

  • Discover: Find and prioritize real user pain points and friction in daily workflows.
  • Design: Make AI features invisible and trustworthy, fitting naturally into users’ existing habits.
  • Develop: Build AI systematically, with robust evaluation and clear plans for failures or edge cases.
  • Deploy: Treat each first use like a product launch, ensuring instant value and building user trust quickly.

OpenAI Product Leader: The 4D Method to Build AI Products That Users Actually Want

An OpenAI product leader's complete playbook to discover real user friction, design invisible AI, plan for failure cases, and go from "cool demo" to "daily habit"

creatoreconomy.so

I remember the article from 2016 titled “Hamburger Menus and Hidden Navigation Hurt UX Metrics” where the conclusion from NN/g was:

Discoverability is cut almost in half by hiding a website’s main navigation. Also, task time is longer and perceived task difficulty increases.

Fast forward nearly 10 years later and NN/g says:

Hamburger menus are a more familiar pattern today than 10 years ago, but the same old best practices for hidden navigation still apply.

Kate Kaplan, revisiting her conclusion from nearly a decade ago:

Over the past decade, the hamburger menu — much like its namesake — has become a classic. As mobile-first design took hold, it offered a clean, space-saving solution, and when design leaders like Apple and Amazon adopted it, others followed. Its growing ubiquity helped standardize its meaning: Through repeated exposure, users learned to recognize and interpret the icon with increasing confidence.

I think the hamburger menu grew in popularity despite NN/g’s authoritative finger wagging. As designers, most of the time, we have to balance between the needs of the project and client with known best practices. Many websites, especially e-commerce, don’t have four or fewer main navigation links. We had to put the links somewhere and the hamburger menu made sense.


The Hamburger-Menu Icon Today: Is it Recognizable?

Hamburger menus are a more familiar pattern today than 10 years ago, but the same old best practices for hidden navigation still apply.

nngroup.com

Christopher Butler writes a wonderful essay about the “best interfaces we never built,” exploring the UIs from sci-fi:

Science fiction, by the way, hasn’t just predicted our technological future. We all know the classic examples, particularly those from Star Trek: the communicator and tricorder anticipated the smartphone; the PADD anticipated the tablet; the ship’s computer anticipated Siri, Alexa, Google, and AI voice interfaces; the entire interior anticipated the Jony Ive glass filter on reality. It’s enough to make a case that Trek didn’t anticipate these things so much as those who watched it as young people matured in careers in design and engineering. But science fiction has also been a fertile ground for imagining very different ways for how humans and machines interact.

He goes on to namecheck 2001: A Space Odyssey, Quantum Leap, Inspector Gadget and others. I don’t know Butler personally, but I’d bet $1 he’s Gen X like me.

As UX designers, it’s very easy to get stuck thinking that UI is just pixels rendered on a screen. But in fact, an interface is anything that translates our intentions into outcomes that technology can deliver.


The Best Interfaces We Never Built

Every piece of technology is an interface. Though the word has come to be a shorthand for what we see and use on a screen, an interface is anything

chrbutler.com

Sara Paul writing for NN/g:

The core principles of UX and product design remain unchanged, and AI amplifies their importance in many ways. To stay indispensable, designers must evolve: adapt to new workflows, deepen their judgment, and double down on the uniquely human skills that AI can’t replace.

They spoke with seven UX practitioners to get their take on AI and the design profession.

I think this is great advice and echoes what I’ve written about previously (here and here):

There is a growing misconception that AI tools can take over design, engineering, and strategy. However, designers offer more than interaction and visual-design skills. They offer judgment, built on expertise that AI cannot replicate.

Our panelists return to a consistent message: across every tech hype cycle, from responsive design to AI, the value of design hasn’t changed. Good design goes deeper than visuals; it requires critical thinking, empathy, and a deep understanding of user needs.

The Future-Proof Designer

Top product experts share four strategies for remaining indispensable as AI changes UI design, accelerates feature production, and reshapes data analysis.

nngroup.com

Great reminder from Kai Wong about getting stuck on a solution too early:

Imagine this: the Product Manager has a vision of a design solution based on some requirements and voices it to the team. They say, “I want a table that allows us to check statuses of 100 devices at once.”

You don’t say anything, so that sets the anchor of a design solution as “a table with a bunch of devices and statuses.”

Avoid premature solutions: how to respond when stakeholders ask for certain designs

How to avoid anchoring problems that result in stuck designers

dataanddesign.substack.com

When you’re building a SaaS app, I believe it’s important to understand the building blocks, or objects, in your app. What are they? How do they relate to each other? Should those relationships be peer-to-peer or parent-child? Early in my tenure at BuildOps, I mentioned this way of thinking to one of my designers and they pointed me to Object-Oriented UX (OOUX), a methodology pioneered by Sophia Prater.

Object-Oriented UX is a way of thinking about design, introduced and popularized by Sophia Prater. It assumes that instead of starting with specific screens or user flows, we begin by identifying the objects that should exist in the system, their attributes, the relationships between them, and the actions users can take on those objects. Only after this stage do we move on to designing user flows and wireframes.

To be honest, I’d long thought this way, ever since my days at Razorfish when our UX director Marisa Gallagher talked about how every website is built around a core unit, or object. At the time, she used Netflix as an example—it’s centered around the movie. CRMs, CMSes, LMSes, etc. are all object-based.

Anyway, I think Litarowicz writes a great primer for OOUX. The other—and frankly more important, IMHO—advantage to thinking this way, especially for a web app, is that your developers think this way too.
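To make that last point concrete, here is a minimal, hypothetical TypeScript sketch of OOUX-style modeling applied to Gallagher’s Netflix example: objects first, then attributes, relationships, and actions. The names (Movie, Genre, Review) are my own illustration, not code from the article.

```typescript
// Objects are identified first, before any screens or flows are designed.
interface Genre {
  id: string;
  name: string; // attribute
}

interface Review {
  id: string;
  rating: number;  // attribute
  movieId: string; // parent-child: a Review belongs to one Movie
}

// The core object, per the Netflix example: everything centers on the Movie.
interface Movie {
  id: string;
  title: string;     // attribute
  genres: Genre[];   // peer-to-peer relationship
  reviews: Review[]; // parent-child relationship
}

// Actions users can take on the object, decided before wireframing begins.
type MovieAction = "play" | "rate" | "addToWatchlist";
```

Because developers already model entities and relationships this way, a shared object map like this gives design and engineering a common vocabulary before a single screen exists.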


Introduction to Object-Oriented UX

How Object-Oriented UX can help you design complex systems

fundament.design

In this short piece by Luke Wroblewski, he observes how the chat box is slowly giving way as agents and MCP give AI chatbots a little more autonomy.

When agents can use multiple tools, call other agents and run in the background, a person's role moves to kicking things off, clarifying things when needed, and making use of the final output. There's a lot less chatting back and forth. As such, the prominence of the chat interface can recede even further. It's there if you want to check the steps an AI took to accomplish your task. But until then it's out of your way so you can focus on the output.

The Receding Role of AI Chat

While chat interfaces to AI models aren't going away anytime soon, the increasing capabilities of AI agents are making the concept of chatting back and forth wi...

lukew.com

Following up on OpenAI’s acquisition of Jony Ive’s hardware startup, io, Mark Wilson, writing for Fast Company:

As Ive told me back in 2023, there have been only three significant modalities in the history of computing. After the original command line, we got the graphical user interface (the desktop, folders, and mouse of Xerox, Mac OS, and Windows), then voice (Alexa, Siri), and, finally, with the iPhone, multitouch (not just the ability to tap a screen, but to gesture and receive haptic feedback). When I brought up some other examples, Ive quickly nodded but dismissed them, acknowledging these as “tributaries” of experimentation. Then he said that to him the promise, and excitement, of building new AI hardware was that it might introduce a new breakthrough modality to interacting with a machine. A fourth modality. 

Hmm, it hasn’t taken off yet because AR hasn’t really gained mainstream popularity, but I would argue hand gestures in AR UI to be a fourth modality. But Ive thinks different. Wilson continues:

Ive’s fourth modality, as I gleaned, was about translating AI intuition into human sensation. And it’s the exact sort of technology we need to introduce ubiquitous computing, also called quiet computing and ambient computing. These are terms coined by the late UX researcher Mark Weiser, who in the 1990s began dreaming of a world that broke us free from our desktop computers to usher in devices that were one with our environment. Weiser did much of this work at Xerox PARC, the same R&D lab that developed the mouse and GUI technology that Steve Jobs would eventually adopt for the Macintosh. (I would also be remiss to ignore that ubiquitous computing is the foundation of the sci-fi film Her, one of Altman’s self-stated goalposts.)

Ah, essentially an always-on, always watching AI that is ready to assist. But whatever the form factor this device takes, it will likely depend on a smartphone:

The first io device seems to acknowledge the phone’s inertia. Instead of presenting itself as a smartphone-killer like the Ai Pin or as a fabled “second screen” like the Apple Watch, it’s been positioned as a third, er, um . . . thing next to your phone and laptop. Yeah, that’s confusing, and perhaps positions the io product as unessential. But it also appears to be a needed strategy: Rather than topple these screened devices, it will attempt to draft off them.

Wilson ends with the idea of a subjective computer, one that has personality and gives you opinions. He explains:

I think AI is shifting us from objective to subjective. When a Fitbit counts your steps and calories burned, that’s an objective interface. When you ask ChatGPT to gauge the tone of a conversation, or whether you should eat better, that’s a subjective interface. It offers perspective, bias, and, to some extent, personality. It’s not just serving facts; it’s offering interpretation. 

The entire column is worth a read.


Can Jony Ive and Sam Altman build the fourth great interface? That's the question behind io

Where Meta, Google, and Apple zig, Ive and Altman are choosing to zag. Can they pull it off?

fastcompany.com

Related to my earlier post today about Arc’s novelty tax, here’s an essay by DOC, a tribute to consistency.

Leveraging known, established UX patterns and sticking to them prevent users from having to learn net-new interactions and build net-new mental models every time they engage with a new product.

But, as Josh Miller wrote in the aforementioned post, “New interfaces start from familiar ones.” DOC’s essay uses jazz as a metaphor:

Consistency is about making room for differentiation. Think about a jazz session: the band starts from a known scale, rhythm. One musician breaks through, improvising on top of that pattern for a few minutes before joining the band again. The band, the audience, everyone knows what is happening, when it starts and when it ends, because the foundation of it all is a consistent melody.
Geometric pattern of stacked rectangular blocks forming a diagonal structure against a dark sky. Artwork by Maya Lin.

Consistency

On compounding patterns and the art of divergence.

doc.cc

For as long as I can remember, I’ve been fascinated by how television shows and movies are made. I remember the specials ABC broadcast about the making of The Empire Strikes Back and other Lucasfilm movies like the Indiana Jones series. More recently—especially with the advent of podcasts—I’ve loved listening to how showrunners think about writing their shows. For example, as soon as an episode of Battlestar Galactica aired, I would rewatch it with Ronald D. Moore’s commentary. These days, I’m really enjoying the official The Last of Us podcast because it features commentary from both Craig Mazin and Neil Druckmann.

Anyway, thinking about personas as characters from TV shows and movies and using screenwriting techniques is right up my alley. Laia Tremosa for the IxDF:

Hollywood spends millions to bring characters to life. UX design teams sometimes spend weeks… only to make personas no one ever looks at again. So don’t aim for personas that look impressive in a slide deck. Aim for personas that get used—in design reviews, product decisions, and testing plans.

Be the screenwriter. Be the director. Be the casting agent.

The Hollywood Guide to UX Personas: Storytelling That Drives Better Design

Great products need great personas. Learn how to build them using the storytelling techniques Hollywood has perfected.

interaction-design.org
If users don’t trust the systems we design, that’s not a PM problem. It’s a design failure. And if we don’t fix it, someone else will, probably with worse instincts, fewer ethics, and a much louder bullhorn.

UX is supposed to be the human layer of technology. It’s also supposed to be the place where strategy and empathy actually talk to each other. If we can’t reclaim that space, can’t build products people understand, trust, and want to return to, then what exactly are we doing here?

It is a long read but well worth it.


We built UX. We broke UX. And now we have to fix it!

We didn’t just lose our influence. We gave it away. UX professionals need to stop accepting silence, reclaim our seat at the table, and…

uxdesign.cc
A futuristic scene with a glowing, tech-inspired background showing a UI design tool interface for AI, displaying a flight booking project with options for editing and previewing details. The screen promotes the tool with a “Start for free” button.

Beyond the Prompt: Finding the AI Design Tool That Actually Works for Designers

There has been an explosion of AI-powered prompt-to-code tools within the last year. The space began with full-on integrated development environments (IDEs) like Cursor and Windsurf. These enabled developers to leverage AI assistants right inside their coding apps. Then came tools like v0, Lovable, and Replit, where users could prompt screens into existence at first, and before long, entire applications.

A couple weeks ago, I decided to test out as many of these tools as I could. My aim was to find the app that would combine AI assistance, design capabilities, and the ability to use an organization’s coded design system.

While my previous essay was about the future of product design, this article will dive deep into a head-to-head between all eight apps that I tried. I recorded the screen as I did my testing, so I’ve put together a video as well, in case you didn’t want to read this.

Illustration of humanoid robots working at computer terminals in a futuristic control center, with floating digital screens and globes surrounding them in a virtual space.

Prompt. Generate. Deploy. The New Product Design Workflow

Product design is going to change profoundly within the next 24 months. If the AI 2027 report is any indication, the capabilities of the foundational models will grow exponentially, and with them—I believe—so will the abilities of design tools.

A graph comparing AI Foundational Model Capabilities (orange line) versus AI Design Tools Capabilities (blue line) from 2026 to 2028. The orange line shows exponential growth through stages including Superhuman Coder, Superhuman AI Researcher, Superhuman Remote Worker, Superintelligent AI Researcher, and Artificial Superintelligence. The blue line shows more gradual growth through AI Designer using design systems, AI Design Agent, and Integration & Deployment Agents.

The AI foundational model capabilities will grow exponentially and AI-enabled design tools will benefit from the algorithmic advances. Sources: AI 2027 scenario & Roger Wong

The TL;DR of the report is this: companies like OpenAI have more advanced AI agent models that are building the next-generation models. Once those are built, the previous generation is tested for safety and released to the public. And the cycle continues. Currently, and for the next year or two, these companies are focusing their advanced models on creating superhuman coders. This compounds and will result in artificial general intelligence, or AGI, within the next five years. 

Karri Saarinen, writing for the Linear blog:

Unbounded AI, much like a river without banks, becomes powerful but directionless. Designers need to build the banks and bring shape to the direction of AI’s potential. But we face a fundamental tension in that AI sort of breaks our usual way of designing things, working back from function, and shaping the form.

I love the metaphor of AI as a river, with us designers as the banks. Feels very much in line with my notion that we need to become even better curators.

Saarinen continues, critiquing the generic chatbox being the primary form of interacting with AI:

One way I visualize this relationship between the form of traditional UI and the function of AI is through the metaphor of a ‘workbench’. Just as a carpenter's workbench is familiar and purpose-built, providing an organized environment for tools and materials, a well-designed interface can create productive context for AI interactions. Rather than being a singular tool, the workbench serves as an environment that enhances the utility of other tools – including the ‘magic’ AI tools.

Software like Linear serves as this workbench. It provides structure, context, and a specialized environment for specific workflows. AI doesn’t replace the workbench, it's a powerful new tool to place on top of it.

It’s interesting. I don’t know what Linear is telegraphing here, but if I had to guess, I wonder if it’s closer to being field-specific or workflow-specific, similar to Generative Fill in Photoshop. It’s a text field—not textarea—limited to a single workflow.


Design for the AI age

For decades, interfaces have guided users along predefined roads. Think files and folders, buttons and menus, screens and flows. These familiar structures organize information and provide the comfort of knowing where you are and what's possible.

linear.app
The rise of AI tools doesn't mean becoming a "unicorn" who can do everything perfectly. Specialization will remain valuable in our field: there will still be dedicated researchers, content strategists, and designers.

However, AI is broadening the scope of what any individual can accomplish, regardless of their specific expertise.

What we're seeing isn't the elimination of specialization but rather an increased value placed on expanding the top of a professional's “expertise T.”

This reinforces what I talked about in a previous essay, "T-shaped skills [will become] increasingly valuable—depth in one area with breadth across others."

They go on to say:

We believe these broad skills will coalesce into experience designer and architect roles: people who direct AI-supported design tasks to craft experiences for humans and AI agents alike, while ensuring that the resulting work reflects well-researched, strategic thinking.

In other words, curation of the work that AI does.

They also make the point that designers need to be strategic, i.e., focus on the why:

This evolution means that the unique value we bring as UX professionals is shifting decidedly toward strategic thinking and leadership. While AI can execute tasks, it cannot independently understand the complex human and organizational contexts in which our work exists.

Finally, Gibbons and Sunwall end with some solid advice:

To adapt to this shift toward generalist skills, UX professionals should focus on 4 key areas:
• Developing a learning mindset
• Becoming fluent in AI collaboration
• Focusing on transferable skills
• Expanding into adjacent fields

I appreciate the learning mindset bit, since that's how I'm wired. I also believe that collaborating with AI is the way to go, rather than seeing it as a replacement or a threat.


The Return of the UX Generalist

AI advances make UX generalists valuable, reversing the trend toward specialization. Understanding multiple disciplines is increasingly important.

nngroup.com
A cut-up Sonos speaker against a backdrop of cassette tapes

When the Music Stopped: Inside the Sonos App Disaster

The fall of Sonos isn’t as simple as a botched app redesign. Instead, it is the cumulative result of poor strategy, hubris, and forgetting the company’s core value proposition. To recap, Sonos rolled out a new mobile app in May 2024, promising “an unprecedented streaming experience.” What shipped instead was a severely handicapped app, missing core features and breaking users’ systems. By January 2025, that failed launch had wiped nearly $500 million from the company’s market value and cost CEO Patrick Spence his job.

What happened? Why did Sonos go backwards on accessibility? Why did the company remove features like sleep timers and queue management? Immediately after the rollout, the backlash began to snowball into a major crisis.

A collage of torn newspaper-style headlines from Bloomberg, Wired, and The Verge, all criticizing the new Sonos app. Bloomberg’s headline states, “The Volume of Sonos Complaints Is Deafening,” mentioning customer frustration and stock decline. Wired’s headline reads, “Many People Do Not Like the New Sonos App.” The Verge’s article, titled “The new Sonos app is missing a lot of features, and people aren’t happy,” highlights missing features despite increased speed and customization.

As a designer and longtime Sonos customer who was also affected by the terrible new app, a little piece of me died inside each time I read the word “redesign.” It was hard not to take it personally, knowing that my profession could have anything to do with how things turned out. Was it really Design’s fault?