This isn’t just a simple effect. It’s built from several layers working together:
Highlights respond to environmental lighting and device motion. When you unlock your phone, lights move through 3D space, causing illumination to travel around the material.
Shadows automatically adjust based on what’s behind them—darker over text for separation, lighter over solid backgrounds.
Tint layers continuously adapt. As content scrolls underneath, the material flips between light and dark modes for optimal legibility.
Interactive feedback spreads from your fingertip throughout the element, making it feel alive and responsive.
Regular is the workhorse—full adaptive behaviors, works anywhere.
Clear is more transparent but needs dimming layers for legibility.
Clear should only be used over media-rich content when the content layer won’t suffer from dimming. Otherwise, stick with Regular.
It’s like ice cubes—cloudy ones from your freezer versus clear ones at fancy bars that let you see your drink’s color.
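For developers following along, the two variants map onto the glass APIs Apple previewed in the SwiftUI sessions. Here’s a minimal sketch of how a control layer might opt in; the modifier and member names reflect the beta SDK as I understand it and may change before release, so treat them as assumptions:

```swift
import SwiftUI

// A floating control cluster sitting above scrolling content.
// Assumes the iOS 26 beta SwiftUI API (`glassEffect`, `Glass.regular`);
// the exact names are assumptions until the SDK ships.
struct PlayerControls: View {
    var body: some View {
        HStack(spacing: 16) {
            Button("Rewind", systemImage: "backward.fill") {}
            Button("Play", systemImage: "play.fill") {}
            Button("Forward", systemImage: "forward.fill") {}
        }
        .padding()
        // Regular: the workhorse variant with the full set of adaptive behaviors.
        // The `.interactive()` option lets the material respond to touch
        // (the "spreads from your fingertip" behavior described above).
        .glassEffect(.regular.interactive(), in: .capsule)
        // Over media-rich content where dimming is acceptable, the Clear
        // variant (assumed here as `.clear`) trades adaptivity for transparency:
        // .glassEffect(.clear, in: .capsule)
    }
}
```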
Smart Contextual Changes
When elements scale up (like expanding menus), the material simulates thicker glass with deeper shadows. On larger surfaces, ambient light from nearby content subtly influences the appearance.
Elements don’t fade—they materialize by gradually modulating light bending. The gel-like flexibility responds instantly to touch, making interactions feel satisfying.
This is something that’s hard to see in stills.
The New Tinting Approach
Instead of flat color overlays, Apple generates tone ranges mapped to content brightness underneath. It’s inspired by how colored glass actually works—changing hue and saturation based on what’s behind it.
Apple recommends sparing use of tinting. Only for primary actions that need emphasis. Makes sense.
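To make “tone ranges mapped to content brightness” concrete, here’s a rough sketch of the idea in Swift. This is my own speculation, not Apple’s code; the only real API used is SwiftUI’s Color.mix, and the luminance input is a stand-in for whatever the system actually samples:

```swift
import SwiftUI

// Conceptual sketch only, not Apple's implementation. Instead of compositing
// one flat overlay color, derive a tone from a range keyed to the brightness
// of the content behind the glass, the way colored glass shifts hue and
// saturation depending on what's behind it.
// `backgroundLuminance` is assumed to run from 0 (dark content) to 1 (bright).
func glassTint(base: Color, backgroundLuminance: Double) -> Color {
    let l = min(max(backgroundLuminance, 0), 1)
    // Over bright content, pull the tint toward black so it still reads;
    // over dark content, pull it toward white so it doesn't go muddy.
    // Color.mix(with:by:) is the blend API SwiftUI added in iOS 18.
    return base.mix(with: l > 0.5 ? .black : .white,
                    by: 0.3 * abs(l - 0.5) * 2)
}
```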
Design Guidelines That Matter
Liquid Glass is for the navigation and controls layer floating above content—not for everything. Don’t apply Liquid Glass to content areas or turn content itself into Liquid Glass, and never stack glass on glass.
Accessibility features are built-in automatically—reduced transparency, increased contrast, and reduced motion modify the material without breaking functionality.
The Legibility Outcry (and Why It’s Overblown)
“Legibility” was mentioned 13 times in the 19-minute video, so it was clearly a concern of theirs. Yes, the keynote showed clear-tinted home screens, and many on social media took that as an accessibility abomination. Which, yes, it is. But that’s not the default.
The fact that the system senses the type of content underneath it and adjusts accordingly—flipping from light to dark, increasing opacity, or adjusting shadow depth—means they’re making accommodations for legibility.
Maybe Apple needs to do some tweaking, but it’s evident that they care about this.
And as with the 18 macOS releases before Tahoe (this year’s version), accessibility settings and controls are built right in. Universal Access debuted with Mac OS X 10.2 Jaguar in 2002. Apple has had a long history of supporting customers with disabilities, dating all the way back to 1987.
So while the social media outcry about legibility is understandable, Apple’s track record suggests they’ll refine these features based on real user feedback, not just Twitter hot takes.
The Real Goal: Device Continuity
What is Liquid Glass meant to do, and why? It’s unification. With the new design language, Apple has also come out with a new design system. This video presented by Apple designer Maria Hristoforova lays it out.
Hristoforova says that Apple’s new design system overhaul is fundamentally about creating seamless familiarity as users move between devices—ensuring that interface patterns learned on iPhone translate directly to Mac and iPad without requiring users to relearn how things work. The video points out that the company has systematically redesigned everything from typography (hooray for left alignment!) and shapes to navigation bars and sidebars around Liquid Glass as the unifying foundation, so that the same symbols, behaviors, and interactions feel consistent across all screen sizes and contexts.
The Pattern of Promised Unity
This isn’t Apple’s first rodeo with “unified design language” promises.
Back in 2013, iOS 7’s flat design overhaul was supposed to create seamless consistency across Apple’s ecosystem. Jony Ive ditched skeuomorphism for minimalist interfaces with translucency and layering—the foundation for everything that followed.
OS X Yosemite (2014) brought those same principles to desktop. Flatter icons, cleaner lines, translucent elements. Same pitch: unified experience across devices.
macOS Big Sur (2020) pushed even further with iOS-like app icons and redesigned interfaces. Again, the promise was consistent visual language across all platforms.
And here we are in 2025 with Liquid Glass making the exact same promises.
But maybe “goal” is a better word.
Consistency Makes the Brand
I’m OK with the goal of having a unified design language. As designers, we love consistency. Consistency is what makes a brand. As Apple has proven over and over again for decades now, it is one of the most valuable brands in the world. They maintain their position not only by making great products, but also by being incredibly disciplined about consistency.
San Francisco debuted 10 years ago as the system typeface for iOS 9 and OS X El Capitan. They’ve since extended it, and it works great in marketing and in interfaces.
The rounded corners on their devices all use pretty much the same radii. Now that concentricity is being incorporated into the UI, screen elements will be harmonious with their physical surroundings. Only Apple can do that because they control the hardware and the software. And that is their magic.
Design Is Both How It Works and How It Looks
In 2003, two years after the iPod launched, Rob Walker of The New York Times did a profile on Apple. The now popular quote about design from Steve Jobs comes from this piece.
[The iPod] is, in short, an icon. A handful of familiar clichés have made the rounds to explain this — it’s about ease of use, it’s about Apple’s great sense of design. But what does that really mean? “Most people make the mistake of thinking design is what it looks like,” says Steve Jobs, Apple’s C.E.O. “People think it’s this veneer — that the designers are handed this box and told, ‘Make it look good!’ That’s not what we think design is. It’s not just what it looks like and feels like. Design is how it works.”
People misinterpret this quote all the time to mean design is only how it works. That is not what Steve meant. He meant, design is both what it looks like and how it works.
Steve did care about aesthetics. That’s why the Graphic Design team mocked up hundreds of PowerMac G5 box designs (the graphics on the box, not the construction). That’s why he obsessed over the materials used in Pixar’s Emeryville headquarters. From Walter Isaacson’s biography:
Because the building’s steel beams were going to be visible, Jobs pored over samples from manufacturers across the country to see which had the best color and texture. He chose a mill in Arkansas, told it to blast the steel to a pure color, and made sure the truckers used caution not to nick any of it.
Liquid Glass is a welcomed and much-needed visual refresh. It’s the natural evolution of Apple’s platforms, going from skeuomorphic so users knew they could use their fingers and tap on virtual buttons on a touchscreen, to flat as a response to the cacophony of visual noise in UIs at the time, and now to something kind of in-between.
Humans eventually tire of seeing the same thing. Carmakers refresh their vehicle designs every three or four years. Then they do complete redesigns every five to eight years. It gets consumers excited.
Liquid Glass will help Apple sell a bunch more hardware.
I’ve been very interested in finding tools to close the design-to-code gap. Martina Sartor writing in UX Planet articulates why that is so important:
After fifteen years hopping between design systems, dev stand-ups, and last-minute launch scrambles, I’m convinced design-to-dev QA is still one of the most underestimated bottlenecks in digital product work. We pour weeks into meticulous Figma files, yet the last mile between mock-up and production code keeps tripping us up.
This is an honest autopsy of why QA hurts and how teams can start healing it — today — without buying more software (though new approaches are brewing).
In the early days of computing, it was easy for one person to author a complete program. Nowadays, because the software we create is so complex, we need teams.
The faster you accept that they’re not going to change their communication style, the faster you can focus on what actually works — learning to decode what they’re really telling you. Because buried in all that technical jargon is usually something pretty useful for design decisions.
It’s a fun piece on learning how to speak engineer.
I’ve mentioned here before that I’ve been using Macs since 1985. It wasn’t the hardware that drew me in—it was MacPaint. I was always an artistic kid, so being able to paint on a digital canvas seemed thrilling to me. And of course it was back then.
I say this with no hyperbole: Bill Atkinson may well have been the best computer programmer who ever lived. Without question, he’s on the short list. What a man, what a mind, what gifts to the world he left us.
Every day at Figma, we wrestle with the same challenges Atkinson faced: How do you make powerful tools feel effortless? How do you hide complexity behind intuitive interactions? His fingerprints are on every pixel we push, every selection we make, every moment of creative flow our users experience.
Apple’s annual developer conference kicked off today with a keynote that announced:
Unified Version 26 across all Apple platforms (iOS, iPadOS, macOS, watchOS, tvOS, visionOS)
“Liquid Glass” design system. A complete UI and UX overhaul, the first major redesign since iOS 7
Apple Intelligence. Continued small improvements, though not the deep integration promised a year ago
Full windowing system on iPadOS. Windows comes to iPad! Finally.
Of course, those are the very high-level highlights.
For designers, the headline is Liquid Glass. Sebastiaan de With’s predictive post and renderings from last week were very spot-on.
I like it. I think iOS and macOS needed a fresh coat of paint and Liquid Glass delivers.
There’s already been some criticism—naturally, because we’re opinionated designers after all!—with some calling it over the top, a rehash of Windows Vista, or an accessibility nightmare.
The new Liquid Glass design language acts like real glass, refracting light and bending the image behind it accordingly.
In case you haven’t seen it, it’s a visual and—albeit less so—experience overhaul for the various flavors of Apple OSes. Imagine a transparent glass layer where controls sit. The layer has all the refractive qualities of glass, bending the light as images pass below it and catching highlights along its edges from a light source. This is all powered by a sophisticated 3D engine, I’m sure. It’s gorgeous.
It’s been 12 years since the last major refresh, with iOS 7 ushering in the era of so-called flat design. At the time, it was a natural extension of Jony Ive’s predilection for minimalism, to strip things to their core. What could be more pure than using only type? It certainly appealed to my sensibilities. But what it brought on was a universe of sameness in UI design.
The design team at Apple studied the physical properties of real glass to perfect the material in the new versions of the OSes.
With the release of Liquid Glass, led by Apple’s VP of Design, Alan Dye, I hope we’ll see designers add a little more personality, depth, and texture back into their UIs. No, we don’t need to return to the days of skeuomorphism—kicked off by Mac OS X’s Aqua interface design. I do think there’s been a movement away from flat design recently. Even at the latest Config conference, Figma showed off functionality to add noise and texture into our designs. We’ve been in a flat world for 12 years! Time to add a little spice back in.
Finally, it’s a beta. This is typical of Apple. The implementation will be iterated on, and by the time it ships in September, it will have been further refined.
When you’re building a SaaS app, I believe it’s important to understand the building blocks, or objects, in your app. What are they? How do they relate to each other? Should those relationships be peer-to-peer or parent-child? Early in my tenure at BuildOps, I mentioned this way of thinking to one of my designers and they pointed me to Object-Oriented UX (OOUX), a methodology pioneered by Sophia Prater.
Object-Oriented UX is a way of thinking about design, introduced and popularized by Sophia Prater. It assumes that instead of starting with specific screens or user flows, we begin by identifying the objects that should exist in the system, their attributes, the relationships between them, and the actions users can take on those objects. Only after this stage do we move on to designing user flows and wireframes.
To be honest, I’d long thought this way, ever since my days at Razorfish when our UX director Marisa Gallagher talked about how every website is built around a core unit, or object. At the time, she used Netflix as an example—it’s centered around the movie. CRMs, CMSes, LMSes, etc. are all object-based.
Anyway, I think Litarowicz writes a great primer for OOUX. The other—and frankly more important, IMHO—advantage to thinking this way, especially for a web app, is that your developers think this way too.
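To show what I mean, here’s a toy OOUX-style model sketched in Swift. The objects, attributes, and relationships are made up for illustration (a generic service-business example), but the exercise is the point: name the objects, then their attributes, relationships, and actions, and only then worry about screens:

```swift
import Foundation

// A toy OOUX-style domain model. All object and property names here are
// hypothetical, purely for illustration.
struct Customer {
    let id: UUID
    var name: String
    var serviceAddresses: [String]      // attribute
}

struct Technician {
    let id: UUID
    var name: String
    var skills: [String]
}

struct Job {
    enum Status: String { case scheduled, inProgress, completed }

    let id: UUID
    var title: String
    var status: Status
    var customerID: UUID                // parent-child: a Job belongs to a Customer
    var assignedTechnicianIDs: [UUID]   // peer-to-peer: Jobs and Technicians relate many-to-many

    // Actions users can take on the object.
    mutating func assign(_ technician: Technician) {
        assignedTechnicianIDs.append(technician.id)
    }

    mutating func complete() {
        status = .completed
    }
}
```

Whether a relationship is parent-child (a job belongs to a customer) or peer-to-peer (jobs and technicians reference each other) is exactly the kind of decision OOUX surfaces early, and it’s the same decision your engineers will make in their data model.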
The Talking Heads have released a new music video for an old song. Directed by Mike Mills—who is not only a filmmaker but also a graphic designer—and starring Saoirse Ronan, the video for the band’s first hit, “Psycho Killer” is a wonderful study on the pressures, anxieties, and joys of being a young person in today’s world. It was made to celebrate the band’s 50th anniversary.
On Instagram, the band said, “This video makes the song better- We LOVE what this video is NOT - it’s not literal, creepy, bloody, physically violent or obvious.”
Me too.
Bell Labs was a famed research lab run by AT&T (aka “Ma Bell” before it was broken up). You can draw a straight line from Bell Labs to Xerox PARC where essential computing technologies like the graphical user interface, the mouse, Ethernet, and more were invented.
The reason why we don’t have Bell Labs is because we’re unwilling to do what it takes to create Bell Labs — giving smart people radical freedom and autonomy.
The freedom to waste time. The freedom to waste resources. And the autonomy to decide how.
Peter Yang has been doing some amazing experiments with gen AI tools. There are so many models out there now, so I appreciate him going through and making this post and video.
I made a video testing Claude 4, ChatGPT O3, and Gemini 2.5 head-to-head for coding, writing, deep research, multimodal and more. What I found was that the “best” model depends on what you’re trying to do.
I’ve been focused a lot on AI for product design recently, but I think it’s just as important to talk about AI for web design. Though I spend my days now leading a product design team and thinking a lot about UX for creating enterprise software, web design is still a large part of the design industry, as evidenced by the big interest in Framer in the recent Design Tools Survey.
Several companies have released AI-based site generators; WordPress.com is among the latest. Our own Matt Medeiros took it for a spin. He “chatted” with a friendly bot that wanted to know more about his website needs. Within minutes, he had a website powered by WordPress.
These tools aren’t producing top agency-level websites just yet. Maybe they’re a novelty for the time being. But they’ll improve. With that comes the worry of their impact on freelancers. Will our potential clients choose a bot over a seasoned expert?
Karkovack is right. Current AI tools aren’t making well-thought-out custom websites yet. So as an agency owner or a freelance designer, you have to defend your position of expertise and customer service:
Those tools have a place in the market. However, freelancers and agencies must position themselves as the better alternative. We should emphasize our expertise and attention to detail, and communicate that AI is a helpful tool, not a magic wand.
But Karkovack misses an opportunity to offer sage advice, which I will do here. Take advantage of these tools in your workflow so that you can be more efficient in your delivery. If you’re in the WordPress ecosystem, use AI to generate some layout ideas, write custom JavaScript, make custom plugins, or write some copy. These AI tools are game-changing, so don’t rest on your laurels.
In this short piece by Luke Wroblewski, he observes how the chat box is slowly giving way as agents and MCP give AI chatbots a little more autonomy.
When agents can use multiple tools, call other agents and run in the background, a person’s role moves to kicking things off, clarifying things when needed, and making use of the final output. There’s a lot less chatting back and forth. As such, the prominence of the chat interface can recede even further. It’s there if you want to check the steps an AI took to accomplish your task. But until then it’s out of your way so you can focus on the output.
Sebastiaan de With, former designer at Apple and currently co-founder and designer at Lux (makers of Halide, Kino, Spectre, and Orion), imagined what the next era in iOS design might be. (WWDC, Apple’s developer conference, is next week. This is typically when they unveil the new operating systems that will launch in the fall. Rumors are flying as usual.)
But he starts with a history lesson:
Smart people study history to understand the future. If we were to categorize the epochs of iOS design, we could roughly separate them into the Shaded Age, the Adaptive Age, and the New Age.
The Shaded Age, or skeuomorphic age, took inspiration from the Dashboard feature of Mac OS X Tiger. Then came the Flat Age, brought on by the introduction of iOS 7.
de With’s concept mocks for the New Age are fantastic. Based on the physicality of visionOS, with specular highlights and reactive reflections, it’s luscious and reminds me of the first time I ever laid eyes on Aqua—the glossy, candy-like look of the original Mac OS X. Steve Jobs said at its introduction, “…one of the design goals was when you saw it you wanted to lick it.”
Sebastiaan de With: “Philosophically, if I was Apple, I’d describe this as finally having an interface that matches the beautiful material properties of its devices. All the surfaces of your devices have glass screens. This brings an interface of a matching material, giving the user a feeling of the glass itself coming alive.”
Apologies for linking to a lot of Christopher Butler recently, but I really love his thinking about design. This time, Butler reminds us about the importance of structure and how the proto-graphic designers we studied in art history, like Piet Mondrian, mastered it.
A well-composed photograph communicates something essential even before we register its subject. A thoughtfully designed page layout feels right before we read a single word. There’s something happening in that first moment of perception that transcends the individual elements being composed.
My favorite passage in his essay begins here:
Perhaps we “read” composition the way we read text — our brains processing visual structure as a kind of fundamental grammar that exists beneath conscious recognition. Just as we don’t typically think about parsing sentences into subjects and predicates while reading, we don’t consciously deconstruct the golden ratio or rule of thirds while looking at an image. Yet in both cases, our minds are translating structure into meaning.
The next eight short paragraphs build on this idea and crescendo with this banger:
In recognizing composition as this fundamental visual language, we begin to understand why good design works at such a deep level. It’s not just about making things look nice — it’s about speaking fluently in a language that predates words, tapping into patterns of perception that feel as natural as breathing.
Composition is a fundamental visual language. I had never thought of it that way and yet it feels right.
Brad Feld is sharing the Cursor prompts his friend Michael Natkin put together. It’s more or less the same as what I’ve gleaned from the Cursor forums, but it’s nice to have it consolidated here. If you’re curious to tackle a weekend coding project, follow these steps.
Nate Jones did a yeoman’s job of summarizing Mary Meeker’s 340-slide deck on AI trends, the “Trends in Artificial Intelligence (TAI)” report. For those of you who don’t know, Mary Meeker is a famed technology analyst and investor known for her insightful reports on tech industry trends. For the longest time, as an analyst at Kleiner Perkins, she published the Internet Trends report. And she was always prescient.
Half of Jones’ post is the summary, while the other half is how the report applies to product teams. The whole thing is worth 27 minutes of your time, especially if you work in software.
First, two quick observations before I move on to longer ones:
The respondent population of 2,200+ designers is well-balanced among company size, team structure, client vs. product focus, and leadership responsibility
Predictably, Figma dominates the tools stacks of most segments
Surprise #1: Design Leaders Use AI More Than ICs
From the summary of the AI section:
Three clear patterns emerge from our data:
Leadership-IC Divide. Leaders adopt AI at a higher rate (29.0%) than ICs (19.9%)
Text-first adoption. 75.2% of AI usage focuses on writing, documentation, and content—not visuals
Client Influence. Client-facing designers show markedly higher AI adoption than internal-facing peers
That nine-point difference is interesting. The report doesn’t speculate as to why, but here are some possible reasons:
Design leaders are experimenting with AI tools looking for efficiency gains
Leaders write more than they design, so they’re using AI more for emails, memos, reports, and presentations
ICs are set in their processes and don’t have time to experiment
I believe that any company operating with resource constraints—which is all startups—needs to embrace AI. AI enables us to do more. I don’t believe—at least not yet—mid- to senior-level jobs are on the line. Engineers can use Cursor to write code, sure, but it’s probably better for them to give Cursor junior-level tasks like bug fixes. Designers should use AI to generate prototypes so that they can test and iterate on ideas more quickly.
The data here is stale, unfortunately. The survey was conducted between November 2024 and January 2025, just as the AI prompt-to-code tools were coming to market. I suspect we will see a huge jump in next year’s results.
Surprise #2: There’s Excitement for Framer
I’m surprised about Framer winning the “Future of Design” award. Maybe it’s the name of the award; does Framer really represent the “future of design”? Ten percent of respondents say they want to try this.
I’ve not gone back to Framer since its early days when it supported code exports. I will give them kudos that they’ve pivoted and built a solid business and platform. I’m personally wary of creating websites for clients in a closed platform; I would rather it be portable like a Node.js app or even WordPress. But to each their own.
Not Surprised at All
In the report’s conclusion, its first two points are unsurprising:
AI enters the workflow. 8.5% of designers cited AI tools as their top interest for 2025. With substantial AI tooling innovation in early 2025, we expect widespread adoption to accelerate next year.
Like I mentioned earlier, I think this will shift big time.
Design-code gap narrows. Addressing the challenge faced by 46.3% of teams reporting inconsistencies between design system specifications and code implementations.
As I said in a previous essay on the future of product design, the design-to-code gap is begging to be solved, “For any designer who has ever handed off a Figma file to a developer, they have felt the stinging disappointment days or weeks later when it’s finally coded. … The developer handoff experience has been a well-trodden path full of now-defunct or dying companies like InVision, Abstract, and Zeplin.”
Reminder: The Tools Don’t Make You a Better Designer
Inevitably, someone in the comments section will point this out: Don’t focus on the tool. To quote photographer and camera reviewer Ken Rockwell, “Cameras don’t take pictures, photographers do. Cameras are just another artist’s tool.” Tools are commodities, but our skills as craftspeople, thinkers, curators, and tastemakers are not.
…I keep thinking back to Star Trek, and how the device that probably inspired the least wonder in me as a child is the one that seems most relevant now: the Federation’s wearables. Every officer wore a communicator pin — a kind of Humane Pin light — but they also all wore smaller pins at their collars signifying rank. In hindsight, it seems like those collar pins, which were discs the size of a watch battery, could have formed some kind of wearable, personal mesh network. And that idea got me going…
He describes the device as a standardized disc that can be attached to any enclosure. I love his illustration too:
Christopher Butler: “I imagine a magnetic edge system that allows the disc to snap into various enclosures — wristwatches, handhelds, desktop displays, wearable bands, necklaces, clips, and chargers.”
Essentially, it’s an always-on, always observing personal AI.
I knew instantly that the brand identity was paying homage to Eadweard Muybridge’s famous photographic studies of a galloping horse. It’s a logo for an AI video company.
The whole case study from Jody Hudson-Powell and Luke Powell of Pentagram is great.