
39 posts tagged with “apple”

As much as I defended the preview, and as much as Apple wants to make Liquid Glass a thing, the new UI is continuing to draw criticism. Dan Moren for Six Colors:

“Glass” is the overall look of these updates, and it’s everywhere. Transparent, frosted, distorting. In some places it looks quite cool, such as in the edge distortion when you’re swiping up on the lock screen. But elsewhere, it seems to me that glass may not be quite the right material for the job. The Glass House might be architecturally impressive, but it’s not particularly practical.

It’s also a definite philosophical choice, and one that’s going to engender some criticism—much of it well-deserved. Apple has argued that it’s about getting controls out of the way, but is that really what’s happening here? It’s hard to argue that having a transparent button sitting right on top of your email is helping that email be more prominent. To take this argument to its logical conclusion, why is the keyboard not fully transparent glass over our content?

I’ve yet to upgrade myself. I will say that everyone dislikes change. Lest we forget, the now-ubiquitous flat design introduced by iOS 7 was also criticized.


iOS 26 Review: Through a glass, liquidly

iOS 26! It feels like just last year we were here discussing iOS 18. How time flies. After a year that saw the debut of Apple Intelligence and the subsequent controversy over the features that it d…

sixcolors.com

Ah, this brings back memories! I spent so much time in MacPaint working with these patterns when I was young. Paul Smith faithfully recreates them:

I was working on something and thought it would be fun to use one of the classic Mac black-and-white patterns in the project. I’m talking about the original 8×8-pixel ones that were in the original Control Panel for setting the desktop background and in MacPaint as fill patterns.

I figured there must be clean, pixel-perfect GIFs or PNGs of them somewhere on the web. And perhaps there are, but after poking around a bit, I ran out of energy for that, but by then had a head of steam for extracting the patterns en masse from the original source, somehow. Then I could produce whatever format I needed for them.

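For context, each classic pattern is just eight bytes—one byte per row, one bit per pixel. If you wanted to render one yourself, a minimal Python sketch might look like this (using Pillow; the pattern bytes below are illustrative, not extracted from the original ROM):

```python
# A minimal sketch, assuming the classic format: 8 bytes per pattern,
# one byte per row, one bit per pixel (1 = black).
from PIL import Image

def pattern_to_png(pattern_bytes, path, scale=16):
    """Render an 8-byte, 8x8, 1-bit pattern to a scaled-up PNG."""
    img = Image.new("1", (8, 8), 1)          # 1-bit image, white background
    for y, row in enumerate(pattern_bytes):
        for x in range(8):
            if row & (0x80 >> x):            # most significant bit = leftmost pixel
                img.putpixel((x, y), 0)      # 0 = black
    img.resize((8 * scale, 8 * scale), Image.NEAREST).save(path)

# The classic 50% gray: alternating checkerboard rows.
pattern_to_png(bytes([0xAA, 0x55] * 4), "gray.png")
```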

Classic 8×8-pixel B&W Mac patterns

TL;DR: I made a website for the original classic Mac patterns I was working on something and thought it would be fun to use one of the classic Mac black-and-white patterns in the project. I'm talking about the original 8×8-pixel ones that were in the...

pauladamsmith.com
Still from a video shown at Apple Keynote 2025. Split screen of AirPods Pro connection indicator on left, close-up of earbuds in charging case on right.

Notes About the September 2025 Apple Event

Today’s Apple keynote opened with a classic quote from Steve Jobs.

Steve Jobs quote at Apple Keynote 2025 – Black keynote slide with white text: “Design is not just what it looks like and feels like. Design is how it works.” – Steve Jobs.

Then a video played, focused on the fundamental geometric shapes found in Apple’s products: circles in the HomePod, the iPhone shutter button, the iPhone camera, the MagSafe charging ring, and the Digital Crown on Apple Watch; rounded squares in the charging block, the Home scene button, the Mac mini, keycaps, the Finder icon, and Face ID; and lozenges in the AirPods case, the MagSafe port, the Liquid Glass carousel control, and the Action button on Apple Watch Ultra.

Then Tim Cook repeated the notion in his opening remarks:

At Apple, design has always been fundamental to who we are and what we do. For us, design goes beyond just how something looks or feels. Design is also how it works. This philosophy guides everything we do, including the products we’re going to introduce today and the experiences they provide.

Apple announced a bunch of products today, including:

  • AirPods Pro 3 with better active noise canceling, live translation, and heart rate sensing (more below)
  • Apple Watch Series 11, thinner and with hypertension alerts and sleep score
  • iPhone 17 with a faster chip and better camera (as always)
  • iPhone Air at 5.6 mm thin! They packed all the main components into a new full-width camera “plateau” (I guess that’s the new word for camera bump)
  • iPhone 17 Pro / Pro Max with a faster chip and even better camera (as always), along with unibody construction and cool vapor cooling (like liquid cooling, but with vapor), and a beefy camera plateau

Highlights

Live Translation is Star Trek’s Universal Translator

In the Star Trek universe, humans regularly speak English with aliens and the audience hears those same aliens reply in English. Of course, it’s television and it was always explained away—that a “universal translator” is embedded in the comm badge all Starfleet crew members wear.

Apple Keynote 2025 iPhone Live Translation feature – Woman holds up an iPhone displaying translated text, demonstrating Apple Intelligence with AirPods Pro 3.

With AirPods Pro 3, this is becoming real! In one demo video, Apple shows a woman at a market. She’s shopping and hears a vendor speak to her in Spanish. Through her AirPods, she hears the live translation and can reply in English and have that translated back to Spanish on her iPhone. Then, in another scene, two guys are talking—English and Italian—and they’re both wearing the new AirPods and having a seamless conversation. Amazing.

Apple Keynote 2025 AirPods Pro 3 Live Translation demo at café – Man wearing AirPods Pro 3 sits outdoors at a café table, smiling while testing real-time language translation.

Heart Rate Monitoring in AirPods

Apple is extending its fitness tracking features to AirPods, specifically the new AirPods Pro 3. These come with a new sensor that pulses invisible infrared light 256 times per second to detect blood flow and calculate heart rate. I’m always astonished by how Apple keeps extending the capabilities of its devices to push health and fitness metrics, which—at least so their thesis goes—helps with overall wellbeing. (See below.)

Full-Width Camera Bump

Or, the new camera plateau. I actually prefer the full width over just the bump. I feel like the plain camera bump on my iPhone 16 Pro makes the phone too wobbly when I put it on its back. I think a bump that spans the full width of the phone will make it more stable. This new design is on the new iPhone Air and iPhone 17 Pro.

To Air or Not to Air?

I’m on the iPhone Upgrade Program so I can get a new phone each year—and I have for the last few. I’m wondering if I want to get the Air this time. One thing I dislike about the iPhone Pros is their weight. The Pro is pretty heavy, and I can feel it in my hand after prolonged use. At 165 grams, the Air is 17% lighter than the 16 Pro (199 grams). It might make a difference.

Overall Thoughts

Of course, in 2025, it’s a little striking that Apple didn’t mention much about AI. Apple framed AI not as a standalone product but as an invisible layer woven through AirPods, Watch, and iPhone—from Live Translation and Workout Buddy nudges to on-device models powering health insights and generative photo features. Instead of prompts and chatbots, Apple Intelligence showed up as contextual, ambient assistance designed to disappear into the flow of everyday use. And funnily enough, iOS 26 was mentioned in passing, as if Apple assumed everyone watching had seen the prior episode—er, keynote—in June.

It’s interesting that the keynote opened with that Steve Jobs quote about design. Maybe someone in Cupertino read my piece breaking down Liquid Glass where I argued:

People misinterpret this quote all the time to mean design is only how it works. That is not what Steve meant. He meant, design is both what it looks like and how it works.

(Actually, it was probably what Casey Newton wrote in Platformer about Liquid Glass.) 

If you step back and consider why Apple improves its hardware and software every year, it goes back to their implied mission: to make products that better human lives. This is exemplified by the “Dear Apple” spot they played as part of the segment on Apple Watch.


Apple’s foray into wearables—beyond ear- and headphones—with Apple Watch ten years ago was really an entry into health technology. Lives have been saved and people have gotten healthier because Apple technology enabled them. Dr. Sumbul Ahmad Desai, Apple’s VP of Health, mentioned that their new hypertension detection feature could notify over one million people with undiagnosed hypertension in its first year. Apple developed the feature using advanced machine learning, drawing on training data from multiple studies involving over 100,000 participants, then clinically validated it in a separate study of over 2,000 participants. In other words, they’ve become a real force in shaping health tech.

And what also amazes me is that AirPods Pro 3 will now help with health and fitness tracking, too. (See above.)

There’s no doubt that Apple’s formal design is always top-notch. But it’s great to be reminded of their why, and of how these must-buy-by-Christmas devices are capable of solving real-world problems and bettering our lives. (And no, I don’t think having a lighter, thinner, faster, cooler phone falls into this category. We can have both moral purpose and commercial purpose.)

It’s no secret that I am a big fan of Severance, the Apple TV+ show that has 21 Emmy nominations this year. I made a fan project earlier in the year that generates Outie facts for your Innie.

After launching a teaser campaign back in April, Atomic Keyboard is finally taking pre-orders for their Severance-inspired keyboard just for Macrodata Refinement department users. The show based the MDR terminals on the Data General Dasher D2 terminal from 1977. So this new keyboard includes three layouts:

  1. “Innie” which is show-accurate, meaning no Escape, no Option, and no Control keys, and includes the trackball
  2. “Outie,” a 60% layout that includes modern modifier keys and the trackball
  3. “Dasher” which replicates the DG terminal layout

It’s not cheap. The final retail price will be $899, but they’re offering a pre-Kickstarter price of $599.


MDR Dasher Keyboard | For Work That's Mysterious & Important

Standard equipment for Macrodata Refinement: CNC-milled body, integrated trackball, modular design. Please enjoy each keystroke equally.

mdrkeyboard.com

John Calhoun joined Apple 30 years ago as a programmer to work on the Color Picker.

Having never written anything in assembly, you can imagine how overjoyed I was. It’s not actually a very accurate analogy, but imagine someone handing you a book in Chinese and asking you to translate it into English (I’m assuming here that you don’t know Chinese of course). Okay, it wasn’t that hard, but maybe you get a sense that this was quite a hurdle that I would have to overcome.

Calhoun was given an old piece of code and tasked with updating it. Instead, he translated it into a programming language he knew—C—and then decided to add to the feature. He explains:

I disliked HSL as a color space, I preferred HSV (Hue, Saturation, Value) because when I did artwork I was more comfortable thinking about color in those terms. So writing an HSV color picker was on my short list.

When I had my own color picker working I think I found that it was kind of fun. Perhaps for that reason, I struck out again and wrote another color picker. The World Wide Web (www) was a rather new thing that seemed to be catching on, so I naturally thought that an HTML color picker made sense. So I tackled that one as well. It was more or less the RGB color picker but the values were in hexadecimal and a combined RGB string value like “#FFCC33” was made easy to copy for the web designer.
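Both of those pickers boil down to conversions that today fit in a few lines. A minimal sketch of the idea (mine, not Calhoun’s code), using Python’s standard library:

```python
# A minimal sketch of the two conversions (mine, not Calhoun's code).
import colorsys

def hsv_to_hex(h, s, v):
    """HSV (each 0.0-1.0) to a web hex string a designer can copy."""
    r, g, b = colorsys.hsv_to_rgb(h, s, v)
    return "#{:02X}{:02X}{:02X}".format(round(r * 255), round(g * 255), round(b * 255))

print(hsv_to_hex(0.125, 0.8, 1.0))  # -> "#FFCC33", the example above
```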

So an engineer decided, all on his own, that he’d add a couple of extra features. Including the fun crayon picker:

On a roll, I decided to also knock out a “crayon picker”. At this point, to be clear, the color picker was working and I felt I understood it well enough. As I say, I was kind of just having some fun now.

Screenshot of a classic Mac OS color picker showing the “Crayon Picker” tab. A green color named “Watercress” is selected, replacing the original orange color. Options include CMYK, HLS, and HSV pickers on the left.

And Calhoun makes this point:

It was frankly a thing I liked about working for Apple in those days. The engineers were the ones driving the ship. As I said, I wrote an HSV picker because it was, I thought, a more intuitive color space for artists. I wrote the HTML color picker because of the advent of the web. And I wrote the crayon picker because it seemed to me to be the kind of thing Apple was all about: HSL, RGB — these were kind of nerdy color spaces — a box of crayons is how the rest of us picked colors.

Making software—especially web software—has matured since then, with product managers and designers now collaborating closely with engineers. But with AI coding assistants, the idea of an individual contributor making solo decisions and shipping code might become de rigueur again.

Man sitting outside 2 Infinite Loop, Apple’s former headquarters in Cupertino, holding a book with an ID badge clipped to his jeans.

Almost Fired

I was hired on at Apple in October of 1995. This was what I refer to as Apple’s circling the drain period. Maybe you remember all the doomsaying — speculation that Apple was going to be shuttering soon. It’s a little odd perhaps then that they were hiring at all but apparently Apple reasoned that they nonetheless needed another “graphics engineer” to work on the technology known as QuickdrawGX. I was then a thirty-one year old programmer who lived in Kansas and wrote games for the Macintosh — surely, Apple thought, I would be a good fit for the position.

engineersneedart.com

Let’s continue down Mac memory lane with this fun post from Basic Apple Guy:

With macOS 26, Apple has announced a dramatically new look to their UI: Liquid Glass. Solid material icon elements give way to softer, shinier, glassier icons. The rounded rectangle became slightly more rounded, and Apple eliminated the ability for icon elements to extend beyond the icon rectangle (as seen in the current icons for GarageBand, Photo Booth, Dictionary, etc.).

With this release being one of the most dramatic visual overhauls of macOS’s design, I wanted to begin a collection chronicling the evolution of the system icons over the years. I’ve been rolling these out on social media over the past week and will continue to add to and update this collection slowly over the summer. Enjoy!


macOS Icon History

Documenting the evolution of macOS system icons over the past several decades.

basicappleguy.com

This is an amazing article and website by Marcin Wichary, the man behind the excellent Shift Happens book.

…I had a realization that the totemic 1984 Mac control panel, designed by Susan Kare, is still to this day perhaps the only settings screen ever brought up in casual conversation.

I kept wondering about that screen, and about what happened since then. Turns out, the Mac settings have lived a far more fascinating life than I imagined, have been redesigned many times, and can tell us a lot about the early history and the troubled upbringing of this interesting machine.

Indeed, Wichary goes through multiple versions of Mac operating systems and performs digital paleontology, uncovering long-lost Settings minutiae. It’s also a great lesson in UI along the way. Be sure to click around in the Mac screens.


Frame of preference

A story of early Mac settings told by 10 emulators.

aresluna.org

It’s been said that desktop publishing democratized graphic design. For those of you too young to know the term, it refers to the technology that enabled graphic design to go digital. It was an ecosystem, really: the Mac, PostScript, the LaserWriter, and PageMaker. But before all that, designers depended on typesetters to set type.

David Langton writing for UX Collective:

A lot was lost when the Macintosh wiped out the traditional typesetting industry. From the art of typography to the craft of typesetting, many essential elements were lost. Typesetters were part of a tradition that stretched back more than 500 years to Gutenberg’s printing press. They understood the basics of type: kerning (spacing between the letters), leading (the space between lines of text), and line breaks (how to avoid widows — those solo words abandoned at the end of a paragraph). They knew about readability (like how to avoid setting type that was too wide to read). There were classic yet limited fonts, with standards for size and leading that assured that everyone working within common ranges maintained a threshold for quality. Yet it was in the craft or business side of typesetting that these services were most under appreciated. Typesetters provided overnight service. They worked overnight, so graphic designers did not have to. We would finish our days specifying the type, and the typesetters would keystroke the manuscripts, proofread, stylize the type, and set up columns following our instructions.

Designers would then pick up the galleys from the typesetters in the morning. The black type was photographically printed on white photo paper. You’d have to cut them up and paste them onto boards, assembling your layout.

Because this was such a physical process, we had to slow down. Langton says:

But since the Macintosh became an in-house tool, the process was reversed. Now, designers design first, then think about it. This shift in process has contributed to a trivialization of the role of graphic designer because anyone can noodle around with the Mac’s sophisticated type tools and make layouts. The design process has been trivialized while the thinking, the evaluation, and the strategic part of the process are often abandoned.

One small thing I’ll point out is that desktop publishing wasn’t popularized until 1985.

  • PostScript was released by Adobe in 1984.
  • The LaserWriter printer was released by Apple in 1985.
  • PageMaker was released by Aldus—later bought by Adobe—in 1985.

What the 1984 Macintosh revolution teaches designers about the 2025 AI revolution

Upheaval and disruption are nothing new for graphic designers.

uxdesign.cc

Vincent Nguyen writing for Yanko Design, interviewing Alan Dye, VP of Human Interface Design at Apple:

This technical challenge reveals the core problem Apple set out to solve: creating a digital material that maintains form-changing capabilities while preserving transparency. Traditional UI elements either block content or disappear entirely, but Apple developed a material that can exist in multiple states without compromising visibility of underlying content. Dye’s emphasis on “celebrating user content” exposes Apple’s hierarchy philosophy, where the interface serves content instead of competing with it. When you tap to magnify text, the interface doesn’t resize but stretches and flows like liquid responding to pressure, ensuring your photos, videos, and web content remain the focus while navigation elements adapt around them.

Since the Jony Ive days, Apple’s hardware has always been about celebrating the content. Bezels got smaller. Screens got bigger and brighter. Even the flat design brought on by iOS 7 and eventually adopted by the whole ecosystem was a way to strip away the noise and focus on the content.

Dye’s explanation of the “glass layer versus application layer” architecture provides insight into how Apple technically implements this philosophy. The company has created a distinct separation between functional controls (the glass layer) and user content (the application layer), allowing each to behave according to different rules while maintaining visual cohesion. This architectural decision enables the morphing behavior Dye described, where controls can adapt and change while content remains stable and prominent.

The Apple platform UI today sort of does that, but Liquid Glass seems to take it even further.

Nguyen, on his experience using the Music app on Mac:

The difference from current iOS becomes apparent in specific scenarios. In the current Music app, scrolling through your library feels like moving through flat, static layers. With Liquid Glass, scrolling creates a sense of depth. You can see your album artwork subtly shifting beneath the translucent controls, creating spatial awareness of where interface elements sit in relation to your content. The tab bar doesn’t just scroll with you; it creates gentle optical distortions that make the underlying content feel physically present beneath the glass surface.


Apple’s Liquid Glass Hands-On: Why Every Interface Element Now Behaves Like Physical Material

Liquid Glass represents more than an aesthetic update or surface-level polish. It functions as a complex behavioral system, precisely engineered to dictate how interface layers react to user input. In practical terms, this means Apple devices now interact with interface surfaces not as static, interchangeable panes, but as dynamic, adaptive materials that fluidly flex and

yankodesign.com

The Steve Jobs Archive, sharing a little behind-the-scenes of Jobs’s famous Stanford commencement speech:

The talk generated no small measure of anxiety for Steve. He had attended Reed College for only a few months before dropping out; now he would be speaking to graduates of one of the world’s top research universities, a place that meant a great deal to him. An intensely private man, Steve was not in the habit of talking about his personal journey—but he knew the occasion required it.

Steve Jobs always had an aura of invincibility around him—a creative genius who could convince those around him, and the world, of anything he wanted using his “reality distortion field.” But he was also human.

I’m sure you’ve seen it before. But whether you’re 22 years old or 50, his advice still resonates. I love the clarity in this upscaled version.


Stay Hungry, Stay Foolish

Marking the 20th anniversary of Steve Jobs’ 2005 Stanford commencement speech with a digitally enhanced version of the video as well as a behind-the-scenes look at how it came to be: from firsthand accounts from people who were connected to the commencement to Steve’s personal drafts.

stevejobsarchive.com
Collection of iOS interface elements showcasing Liquid Glass design system including keyboards, menus, buttons, toggles, and dialogs with translucent materials on dark background.

Breaking Down Apple’s Liquid Glass: The Tech, The Hype, and The Reality

I kind of expected it: a lot more ink was spilled on Liquid Glass—particularly on social media. In case you don’t remember, Liquid Glass is the new UI for all of Apple’s platforms. It was announced Monday at WWDC 2025, their annual developers conference.

The criticism is primarily around legibility and accessibility. Secondary complaints include aesthetics and the power needed to animate all the bubbles.

How Liquid Glass Actually Works

Before I address the criticism, I think it would be helpful to break down the team’s design thinking and how Liquid Glass actually works.

I watched two videos from Apple’s developer site. Much of the rest of this article is a summary of those videos; if you watch them, you can skip to the end of this piece.

First off is this video that explains Liquid Glass in detail.

As I watched the video, one thing stood out clearly to me: the design team at Apple did a lot of studying of the real world before digitizing it into UI.

The Core Innovation: Lensing

Instead of scattering light like previous materials, Liquid Glass dynamically bends and shapes light in real-time. Apple calls this “lensing.”

It’s their attempt to recreate how transparent objects work in the physical world. We all intuitively understand how warping and bending light communicates presence and motion. Liquid Glass uses these visual cues to provide separation while letting content shine through.
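To make “lensing” concrete: conceptually, it’s a displacement of sample coordinates—the glass shows the backdrop slightly warped, most strongly near its edges. A toy sketch of the idea (my illustration, not Apple’s implementation):

```python
# A toy illustration of lensing -- not Apple's implementation. The "glass"
# samples the backdrop at displaced coordinates, bending more near the edges.
import numpy as np

def lens(backdrop: np.ndarray, strength: float = 6.0) -> np.ndarray:
    """Warp a 2D grayscale backdrop as if seen through a convex glass slab."""
    h, w = backdrop.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    dx, dy = xs - w / 2, ys - h / 2
    r = np.sqrt(dx**2 + dy**2) / (min(h, w) / 2)   # 0 at center, ~1 at edges
    sx = np.clip(xs + (dx / (w / 2)) * r * strength, 0, w - 1).astype(int)
    sy = np.clip(ys + (dy / (h / 2)) * r * strength, 0, h - 1).astype(int)
    return backdrop[sy, sx]                         # displaced sampling

# Example: warp a striped backdrop; the stripes bow outward near the edges.
stripes = (np.indices((64, 64)).sum(axis=0) % 16 < 8).astype(float)
warped = lens(stripes)
```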

A Multi-Layer System That Adapts

Liquid Glass toolbar with pink tinted buttons (bookmark, refresh, more) floating over geometric green background, showing tinting capabilities.

This isn’t just a simple effect. It’s built from several layers working together:

  • Highlights respond to environmental lighting and device motion. When you unlock your phone, lights move through 3D space, causing illumination to travel around the material.
  • Shadows automatically adjust based on what’s behind them—darker over text for separation, lighter over solid backgrounds.
  • Tint layers continuously adapt. As content scrolls underneath, the material flips between light and dark modes for optimal legibility.
  • Interactive feedback spreads from your fingertip throughout the element, making it feel alive and responsive.

All of this happens automatically when developers apply Liquid Glass.
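To get a feel for the kind of decision the tint layer is making, here’s a conceptual sketch—my illustration of the idea, not Apple’s code—of flipping a control between light and dark based on the average luminance of the content beneath it:

```python
# A conceptual sketch of the tint layer's job -- my illustration, not
# Apple's code. Sample the content under a control and flip the control's
# treatment based on average luminance.

def relative_luminance(rgb):
    """Perceptual luminance of an sRGB pixel, 0.0 (black) to 1.0 (white)."""
    r, g, b = (c / 255 for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def glass_mode(pixels_under_control):
    """Pick a light or dark treatment for a glass element.

    pixels_under_control: sequence of (r, g, b) tuples sampled from the
    content layer directly beneath the control.
    """
    avg = sum(map(relative_luminance, pixels_under_control)) / len(pixels_under_control)
    return "dark" if avg > 0.5 else "light"  # bright content -> dark glass

# A control over mostly white album art flips to the dark treatment.
print(glass_mode([(250, 250, 245), (240, 238, 230), (30, 30, 30)]))  # -> "dark"
```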

Two Variants (Regular and Clear)

Liquid Glass comes in two types of material:

  • Regular is the workhorse—full adaptive behaviors, works anywhere.
  • Clear is more transparent but needs dimming layers for legibility.

Clear should only be used over media-rich content when the content layer won’t suffer from dimming. Otherwise, stick with Regular.

It’s like ice cubes—cloudy ones from your freezer versus clear ones at fancy bars that let you see your drink’s color.

Four examples of regular Liquid Glass elements: audio controls, deletion dialog, text selection menu, and red toolbar, demonstrating various applications.

Regular is the workhorse—full adaptive behaviors, works anywhere.

Video player interface with Liquid Glass controls (pause, skip buttons) overlaying blue ocean scene with sea creature.

Clear should only be used over media-rich content when the content layer won’t suffer from dimming.

Smart Contextual Changes

When elements scale up (like expanding menus), the material simulates thicker glass with deeper shadows. On larger surfaces, ambient light from nearby content subtly influences the appearance.

Elements don’t fade—they materialize by gradually modulating light bending. The gel-like flexibility responds instantly to touch, making interactions feel satisfying.

This is something that’s hard to see in stills.

The New Tinting Approach

Red "Add" button with music note icon using Liquid Glass material over black and white checkered pattern background.

Instead of flat color overlays, Apple generates tone ranges mapped to content brightness underneath. It’s inspired by how colored glass actually works—changing hue and saturation based on what’s behind it.

Apple recommends sparing use of tinting. Only for primary actions that need emphasis. Makes sense.

Design Guidelines That Matter

Liquid Glass is for the navigation and controls layer floating above content—not for everything. Don’t apply Liquid Glass to content areas. Never stack glass on glass.

Liquid Glass button with a black border and overlapping windows icon floating over blurred green plant background, showing off its accessibility mode.

Accessibility features are built-in automatically—reduced transparency, increased contrast, and reduced motion modify the material without breaking functionality.

The Legibility Outcry (and Why It’s Overblown)

Apple devices (MacBook, iPad, iPhone, Apple Watch) displaying new Liquid Glass interface with translucent elements over blue gradient wallpapers.

“Legibility” was mentioned 13 times in the 19-minute video, so it was clearly a concern of theirs. Yes, in the keynote, clear-tinted device home screens were shown, and many on social media took that to be an accessibility abomination. Which, yes, it is. But that’s not the default.

The fact that the system senses the type of content underneath it and adjusts accordingly—flipping from light to dark, increasing opacity, or adjusting shadow depth—means they’re making accommodations for legibility.

Maybe Apple needs to do some tweaking, but it’s evident that they care about this.

And like the 18 macOS releases before Tahoe—this version—accessibility settings and controls have been built right in. Universal Access debuted with Mac OS X 10.2 Jaguar in 2002. Apple has had a long history of supporting customers with disabilities, dating all the way back to 1987.

So while the social media outcry about legibility is understandable, Apple’s track record suggests they’ll refine these features based on real user feedback, not just Twitter hot takes.

The Real Goal: Device Continuity

So what is Liquid Glass actually meant to do? It’s unification. With the new design language, Apple has also come out with a new design system. This video, presented by Apple designer Maria Hristoforova, lays it out.

Hristoforova says that Apple’s new design system overhaul is fundamentally about creating seamless familiarity as users move between devices—ensuring that interface patterns learned on iPhone translate directly to Mac and iPad without requiring users to relearn how things work. The video points out that the company has systematically redesigned everything from typography (hooray for left alignment!) and shapes to navigation bars and sidebars around Liquid Glass as the unifying foundation, so that the same symbols, behaviors, and interactions feel consistent across all screen sizes and contexts. 

The Pattern of Promised Unity

This isn’t Apple’s first rodeo with “unified design language” promises.

Back in 2013, iOS 7’s flat design overhaul was supposed to create seamless consistency across Apple’s ecosystem. Jony Ive ditched skeuomorphism for minimalist interfaces with translucency and layering—the foundation for everything that followed.

OS X Yosemite (2014) brought those same principles to desktop. Flatter icons, cleaner lines, translucent elements. Same pitch: unified experience across devices.

macOS Big Sur (2020) pushed even further with iOS-like app icons and redesigned interfaces. Again, the promise was consistent visual language across all platforms.

And here we are in 2025 with Liquid Glass making the exact same promises. 

But maybe “goal” is a better word.

Consistency Makes the Brand

I’m OK with the goal of having a unified design language. As designers, we love consistency. Consistency is what makes a brand. As Apple has proven over and over again for decades now, it is one of the most valuable brands in the world. They maintain their position not only by making great products, but also by being incredibly disciplined about consistency.

San Francisco debuted 10 years ago as the system typeface for iOS 9 and OS X El Capitan. They’ve since extended it, and it works great in marketing and in interfaces.

iPhone Settings screen showing Liquid Glass grouped table cells with red outline highlighting the concentric shape design.

The rounded corners on their devices are all pretty much the same radii. Now that concentricity is being incorporated into the UI, screen elements will be harmonious with their physical surroundings. Only Apple can do that because they control the hardware and the software. And that is their magic.
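Concentricity, for what it’s worth, follows a simple rule: nested corners share a center, so the inner radius is the outer radius minus the inset between the shapes. A quick sketch (example values are mine):

```python
# Concentric corners share a center point, so a nested shape stays
# concentric when its corner radius is the outer radius minus the inset.
def concentric_radius(outer_radius: float, inset: float) -> float:
    return max(outer_radius - inset, 0.0)

# Example: a control inset 8 pt inside a container with 26 pt corners.
print(concentric_radius(26, 8))  # -> 18
```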

Design Is Both How It Works and How It Looks

In 2003, two years after the iPod launched, Rob Walker of The New York Times did a profile on Apple. The now popular quote about design from Steve Jobs comes from this piece.

[The iPod] is, in short, an icon. A handful of familiar clichés have made the rounds to explain this — it’s about ease of use, it’s about Apple’s great sense of design. But what does that really mean? “Most people make the mistake of thinking design is what it looks like,” says Steve Jobs, Apple’s C.E.O. “People think it’s this veneer — that the designers are handed this box and told, ‘Make it look good!’ That’s not what we think design is. It’s not just what it looks like and feels like. Design is how it works.”

People misinterpret this quote all the time to mean design is only how it works. That is not what Steve meant. He meant, design is both what it looks like and how it works.

Steve did care about aesthetics. That’s why the Graphic Design team mocked up hundreds of Power Mac G5 box designs (the graphics on the box, not the construction). That’s why he obsessed over the materials used in Pixar’s Emeryville headquarters. From Walter Isaacson’s biography:

Because the building’s steel beams were going to be visible, Jobs pored over samples from manufacturers across the country to see which had the best color and texture. He chose a mill in Arkansas, told it to blast the steel to a pure color, and made sure the truckers used caution not to nick any of it.

Liquid Glass is a welcome and much-needed visual refresh. It’s the natural evolution of Apple’s platforms: from skeuomorphic, so users knew they could tap virtual buttons on a touchscreen; to flat, as a response to the cacophony of visual noise in UIs at the time; and now to something kind of in-between.

Humans eventually tire of seeing the same thing. Carmakers refresh their vehicle designs every three or four years. Then they do complete redesigns every five to eight years. It gets consumers excited. 

Liquid Glass will help Apple sell a bunch more hardware.

I have relayed the story here before: I’ve been using Macs since 1985. It wasn’t the hardware that drew me in—it was MacPaint. I was always an artistic kid, so being able to paint on a digital canvas seemed thrilling to me. And of course it was back then.

Behind MacPaint was a man named Bill Atkinson. Atkinson died last Thursday, June 5, of pancreatic cancer. In a short remembrance, John Gruber said:

I say this with no hyperbole: Bill Atkinson may well have been the best computer programmer who ever lived. Without question, he’s on the short list. What a man, what a mind, what gifts to the world he left us.

I’m happy that Figma also remembered Atkinson and acknowledged that they’re standing on his shoulders.

Every day at Figma, we wrestle with the same challenges Atkinson faced: How do you make powerful tools feel effortless? How do you hide complexity behind intuitive interactions? His fingerprints are on every pixel we push, every selection we make, every moment of creative flow our users experience.


Bill Atkinson’s 10 Rules for Making Interfaces More Human

We commemorate the Apple pioneer whose QuickDraw and HyperCard programs made the Macintosh intuitive enough for nearly anyone to use.

figma.com
Abstract gradient design with flowing liquid glass elements in blue and pink colors against a gray background, showcasing Apple's new Liquid Glass design language.

Quick Notes About WWDC 2025

Apple’s annual developer conference kicked off today with a keynote that announced:

  • Unified Version 26 across all Apple platforms (iOS, iPadOS, macOS, watchOS, tvOS, visionOS)
  • “Liquid Glass” design system. A complete UI and UX overhaul, the first major redesign since iOS 7
  • Apple Intelligence. Continued small improvements, though not the deep integration promised a year ago
  • Full windowing system on iPadOS. Windows comes to iPad! Finally.

Of course, those are the very high-level highlights.

For designers, the headline is Liquid Glass. Sebastiaan de With’s predictive post and renderings from last week were spot-on.

I like it. I think iOS and macOS needed a fresh coat of paint and Liquid Glass delivers.

There’s already been some criticism—naturally, because we’re opinionated designers after all!—with some calling it over the top, a rehash of Windows Vista, or an accessibility nightmare.

Apple Music interface showing the new Liquid Glass design with translucent playback controls and navigation bar overlaying colorful album artwork, featuring "Blest" by Yuno in the player and navigation tabs for Home, New, Radio, Library, and Search.

The new Liquid Glass design language acts like real glass, refracting light and bending the image behind it accordingly.

In case you haven’t seen it, it’s a visual and—albeit less so—experience overhaul for the various flavors of Apple OSes. Imagine a transparent glass layer where controls sit. The layer has all the refractive qualities of glass, bending the light as images pass below it, and its edges catching highlights from a light source. This is all powered by a sophisticated 3D engine, I’m sure. It’s gorgeous.

It’s been 12 years since the last major refresh, with iOS 7 ushering in the era of so-called flat design. At the time, it was a natural extension of Jony Ive’s predilection for minimalism, to strip things to their core. What could be more pure than using only type? It certainly appealed to my sensibilities. But what it brought on was a universe of sameness in UI design.

Person using an iPad with a transparent glass interface overlay, demonstrating the new Liquid Glass design system with translucent app icons visible through the glass layer.


Hand interacting with a translucent glass interface displaying text on what appears to be a tablet or device, showing the new design's transparency effects.

The design team at Apple studied the physical properties of real glass to perfect the material in the new versions of the OSes.

With the release of Liquid Glass, led by Apple’s VP of Design, Alan Dye, I hope we’ll see designers add a little more personality, depth, and texture back into their UIs. No, we don’t need to return to the days of skeuomorphism—kicked off by Mac OS X’s Aqua interface design. I do think there’s been a movement away from flat design recently. Even at the latest Config conference, Figma showed off functionality to add noise and texture into our designs. We’ve been in a flat world for 12 years! Time to add a little spice back in.

Finally, it’s a beta. This is typical of Apple. The implementation will be iterated on and by the time it ships later this year in September, it will have been further refined. 

I do miss a good 4-minute video from Jony Ive talking about the virtues of software material design though…

Sebastiaan de With, former designer at Apple and currently co-founder and designer at Lux (makers of Halide, Kino, Spectre, and Orion) imagined what the next era in iOS design might be. (WWDC, Apple’s developer conference is next week. This is typically when they unveil the new operating systems that will launch in the fall. Rumors are flying as usual.)

But he starts with a history lesson:

Smart people study history to understand the future. If we were to categorize the epochs of iOS design, we could roughly separate them into the Shaded Age, the Adaptive Age, and the New Age.

The Shaded Age, or skeuomorphic age, took inspiration from the Dashboard feature of Mac OS X Tiger. Then came the Flat Age, brought on by the introduction of iOS 7.

de With’s concept mocks for the New Age are fantastic. Based on the physicality of visionOS, with specular highlights and reactive reflections, it’s luscious and reminds me of the first time I ever laid eyes on Aqua—the glossy, candy-like look of the original Mac OS X. Steve Jobs said at its introduction, “…one of the design goals was when you saw it you wanted to lick it.”

Close-up of a glass-rendered user interface

Sebastiaan de With: “Philosophically, if I was Apple, I’d describe this as finally having an interface that matches the beautiful material properties of its devices. All the surfaces of your devices have glass screens. This brings an interface of a matching material, giving the user a feeling of the glass itself coming alive.”


Physicality: the new age of UI

There’s a lot of rumors of a big impending UI redesign from Apple. Let’s imagine what’s (or what could be) next for the design of iPhones, Macs and iPads.

lux.camera

OpenAI is acquiring a hardware company called “io” that Jony Ive cofounded just a year ago:

Two years ago, Jony Ive and the creative collective LoveFrom, quietly began collaborating with Sam Altman and the team at OpenAI.

It became clear that our ambitions to develop, engineer and manufacture a new family of products demanded an entirely new company. And so, one year ago, Jony founded io with Scott Cannon, Evans Hankey and Tang Tan.

We gathered together the best hardware and software engineers, the best technologists, physicists, scientists, researchers and experts in product development and manufacturing. Many of us have worked closely for decades.

The io team, focused on developing products that inspire, empower and enable, will now merge with OpenAI to work more intimately with the research, engineering and product teams in San Francisco.

It has been an open rumor that Sam Altman and Ive have been working together on some hardware. I had assumed they had already formalized their partnership, but I guess not.


There are some bold statements that Ive and Altman make in the launch video, teasing a revolutionary new device that will enable quicker, better access to ChatGPT—something with a lot less friction than what Altman describes in the video:

If I wanted to ask ChatGPT something right now about something we had talked about earlier, think about what would happen. I would, like, reach down. I would get on my laptop, I’d open it up, I’d launch a web browser, I’d start typing, and I’d have to, like, explain that thing. And I would hit enter, and I would wait, and I would get a response. And that is at the limit of what the current tool of a laptop can do. But I think this technology deserves something much better.

There are a couple of other nuggets about what this new device might be from the statements Ive and Altman made to Bloomberg:

…Ive and Altman don’t see the iPhone disappearing anytime soon. “In the same way that the smartphone didn’t make the laptop go away, I don’t think our first thing is going to make the smartphone go away,” Altman said. “It is a totally new kind of thing.”

“We are obviously still in the terminal phase of AI interactions,” said Altman, 40. “We have not yet figured out what the equivalent of the graphical user interface is going to be, but we will.”

While we don’t know what the form factor will be, I’m sure it won’t be a wearable pin—ahem, RIP Humane. Just to put it out there—I predict it will be a voice assistant in an earbud, very much like the AI in the 2013 movie “Her.” Altman has long been obsessed with the movie, going as far as trying to get Scarlett Johansson to be one of the voices for ChatGPT.

EDIT 5/22/2025, 8:58am PT: Added prediction about the form factor.


Sam and Jony introduce io

Building a family of AI products for everyone.

openai.com

John Gruber wrote a hilarious rant about the single-story a in the iOS Notes app:

I absolutely despise the alternate single-story a glyph that Apple Notes uses. I use Notes every single day and this a bothers me every single day. It hurts me. It’s a childish silly look, but Notes, for me, is one of the most serious, most important apps I use.

Since that sparked some conversation online, he followed up with a longer post about typography in early versions of the Mac system software:

…Apple actually shipped System 1.0 with a version of Geneva with a single-story a glyph — but only in the 9-point version of Geneva. At 12 points (and larger), Geneva’s a was double-story.

To me, it does make sense that 9-point Geneva would have a single-story a, since there are fewer pixels to draw the glyph well and to distinguish it from the lowercase e.


Single-Story a’s in Very Early Versions of Macintosh System 1

A single-story “a” in Chicago feels more blasphemous than that AI image Trump tweeted of himself as the new pope.

daringfireball.net
A screenshot of the YourOutie.is website showing the Lumon logo at the top with the title "Outie Query System Interface (OQSI)" beneath it. The interface has a minimalist white card on a blue background with small digital patterns. The card contains text that reads "Describe your Innie to learn about your Outie" and a black "Get Started" button. The design mimics the retro-corporate aesthetic of the TV show Severance.

Your Outie Has Both Zaz and Pep: Building YourOutie.is with AI

A tall man with curly, graying hair and a bushy mustache sits across from a woman with a very slight smile in a dimly lit room. There’s pleasant, calming music playing. He’s eager with anticipation to learn about his Outie. He’s an Innie who works on the “severed” floor at Lumon. He’s undergone a surgical procedure that splits his work self from his personal self. This is the premise of the show Severance on Apple TV+.

Ms. Casey, the therapist:

All right, Irving. What I’d like to do is share with you some facts about your Outie. Because your Outie is an exemplary person, these facts should be very pleasing. Just relax your body and be open to the facts. Try to enjoy each equally. These facts are not to be shared outside this room. But for now, they’re yours to enjoy.

Your Outie is generous. Your Outie is fond of music and owns many records. Your Outie is a friend to children and to the elderly and the insane. Your Outie is strong and helped someone lift a heavy object. Your Outie attends many dances and is popular among the other attendees. Your Outie likes films and owns a machine that can play them. Your Outie is splendid and can swim gracefully and well.

The scene is from season one, episode two, called “Half Loop.” With season two wrapping up, and with my work colleagues constantly making “my Outie” jokes, I wondered if there was a Your Outie generator. Not really. There’s this meme generator from imgflip, but that’s about it.

Screenshot of the Your Outie meme generator from imgflip.

So, in the tradition of name generator sites like Fantasy Name Generators (you know, for DnD), I decided to make my own using an LLM to generate the wellness facts.

The resulting website took four-and-a-half days. I started Monday evening and launched it by dinner time Friday. All told, it was about 20 hours of work. Apologies to my wife, to whom I barely spoke while I was in the zone with my creative obsession.

Lumon Outie Query System Interface (OQSI)

Your Outie started with a proof-of-concept.

I started with a proof-of-concept using Claude. I gathered information about the show and all the official Your Outie wellness facts from the fantastic Severance Wiki and attached them to this prompt:

I would like to create a “Wellness Fact” generator based on the “Your Outie is…” format from the character Ms. Casey. Question: What questions should we ask the user in order to create responses that are humorous and unique? These need to be very basic questions, potentially from predefined dropdowns.

Claude’s response made me realize that asking about the real person was the wrong way to go. It felt too generic. Then I wondered, what if we just had the user role-play as their Innie?

The prototype was good and showed how fun this little novelty could be. So I decided to put my other side-project on hold for a bit—I’ve been working on redesigning this site—and make a run at creating this.

Screenshot of Claude with the chat on the left and the prototype on the right. The prototype is a basic form with dropdowns for Innie traits.

Your Outie developed the API first but never used it.

My first solution was to create a Python API with a Next.js frontend. From my experience building AI-powered software, I knew that Python was the preferred language for working with LLMs. I also used LangChain so that I could have optionality with foundation models. I took the TypeScript code from Claude and asked Cursor to use Python and LangChain to develop the API. Before long, I had a working backend.

One interesting problem I ran into was that the facts from GPT often came back very similar to each other. So, I added code to categorize each fact and prevent dupes. Tweaking the prompt also yielded better-written results.
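
The dedup step looked roughly like this in spirit—a hypothetical sketch, not the site’s actual code; the prompts and the one-word “categorize” trick are stand-ins:

```python
# A hypothetical sketch of the dedup idea -- not the actual YourOutie.is
# code. The prompts and the one-word "categorize" step are stand-ins.
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini", temperature=1.0)

def categorize(fact: str) -> str:
    # Toy categorizer: ask the model for a one-word topic (music, sports, ...)
    # so two facts about the same topic count as duplicates.
    return llm.invoke(f"In one lowercase word, what topic is this about: {fact}").content.strip()

def generate_facts(innie_traits: str, count: int = 6) -> list[str]:
    """Generate wellness facts, skipping topically duplicate ones."""
    facts, seen_topics = [], set()
    for _ in range(count * 4):              # cap attempts so we always terminate
        if len(facts) == count:
            break
        fact = llm.invoke(
            "Write one 'Your Outie is...' wellness fact in the style of "
            f"Severance for an Innie described as: {innie_traits}"
        ).content.strip()
        topic = categorize(fact)
        if topic not in seen_topics:        # drop near-duplicates by topic
            seen_topics.add(topic)
            facts.append(fact)
    return facts

print("\n".join(generate_facts("devoted to Kier, fond of the MDE")))
```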

Additionally, I tried all the available models—except for the reasoning ones like o1. OpenAI’s GPT-4o-mini seemed to strike a good balance.

This was Monday evening.

Honestly, this was trivial to do. Cursor plus Python LangChain made it easy. 172 lines of code. Boom.

I would later regret choosing Python, however.

Your Outie designed the website in Figma but only the first couple of screens.

Now the fun part was coming up with the design. There were many possibilities. I could riff on the computer terminals on the severed floor like the macrodata refinement game. I could emulate 1970s and ’80s corporate design like Mr. Milchick’s performance review report.

Screenshot of an old CRT monitor with a grid of numbers. Some of these numbers are captured into a box on the bottom of the screen.

The official macrodata refinement game from Apple.

Still from the show of the character Seth Milchick's performance review report.

Seth Milchick receives his first performance review in this report.

I ended up with the latter, but as I started designing, I realized I could incorporate a little early Macintosh vibe. I began thinking of the website as a HyperCard stack. So I went with it.

I was anxious to build the frontend. I started a new Next.js project and fired up Cursor. I forwent a formal PRD and started vibe coding (ugh, I hate that term, more on this in an upcoming post). Using static mock data, I got the UI to a good place by the end of the evening—well, midnight—but there was still a lot of polishing to do.

This was Tuesday night.

Screenshot of the author's Figma canvas showing various screen designs and typographic explorations.

My Figma canvas showing some quick explorations.

Your Outie struggled bravely with Cursor and won.

Beyond the basic generator, I wanted to create something that had both zaz and pep. Recalling the eight-hour remix of the Severance theme by ODESZA, “Music to Refine To,” I decided to add a music player to the site. I found a few cool tracks on Epidemic Sound and tried building the player. I thought it would be easy, but Cursor and I struggled mightily for hours. Play/pause wouldn’t work. Autoplaying the next track wouldn’t work. Etc. Eventually, I cut my losses after getting at least play/pause working and combined the tracks into one long track. Six minutes should be long enough, right?

v0 helped with generating the code for the gradient background.

This is my ode to the Music Dance Experience (MDE) from season one. That was Wednesday.

Still from the show of two characters dancing in the middle of the office.

Your Outie reintegrated.

Thursday’s activity was integrating the backend with the frontend. Again, with Cursor, this was relatively straightforward. The API took the request from the frontend and provided a response. The frontend displayed it. I spent more time fine-tuning the animations and getting the mobile layout just right. You wouldn’t believe how much Cursor-wrangling I had to do to get the sliding animations and fades dialed in. I think this is where AI struggles—with the nuances.

By the end of the night, I had a nice working app. Now, I had to look for a host. Vercel doesn’t support Python. After researching Digital Ocean, I realized I would have to pay for two app servers: one for the Node.js frontend and another for the Python backend. That’s not too cost-effective for a silly site like this. Again, it was midnight, so I slept on it.

Your Outie once refactored code from Python to React in just one hour.

Still from the show of the main character, Mark S. staring at his computer monitor.

In the morning, I decided to refactor the API from Python to JavaScript. LangChain has a JavaScript version, so I asked Cursor to translate the original Python code. The translation wasn’t as smooth as I had hoped. Again, it missed many of the details that I had spent time putting into the original prompt and logic. But a few more chats later, the translation was complete, and now the app was all in React.

Between the end of my work day and dinner on Friday, I finished the final touchups on the site: removing debugging console messages, rewriting error messages to be more Severance-like, and making sure there were no layout bugs.

I had to fix a few more build errors and used Claude Code. It seemed a lot easier than sitting there and going back and forth with Cursor.

Then, I connected my repo to Vercel, and voila! The Lumon Outie Query System Interface (OQSI) was live at YourOutie.is.

I hope you enjoy it as much as I had fun making it. Now, I think I owe my wife some flowers and a date night.

Zuckerberg believes Apple “[hasn’t] really invented anything great in a while…”

Appearing on Joe Rogan’s podcast this week, Meta CEO Mark Zuckerberg said that Apple “[hasn’t] really invented anything great in a while. Steve Jobs invented the iPhone and now they’re just kind of sitting on it 20 years later.”

Let’s take a look at some hard metrics, shall we?

I did a search of the USPTO site for patents filed by Apple and Meta since 2007. In that time period, Apple filed for 44,699 patents. Meta, née Facebook, filed for 4,839—about 10% of Apple’s total.

Side-by-side screenshots of patent searches from the USPTO database showing results for Apple Inc. and Meta Platforms. The Apple search (left) returned 44,699 results since 2007, while the Meta search (right) returned 4,839 results.

You can argue that not all companies file for patents for everything, or that Zuck said Apple hasn’t “really invented anything great in a while.” Great being the keyword here.

He left out the following “great” Apple inventions since 2007:

  • App Store (2008)
  • iPad (2010)
  • Apple Pay (2014)
  • Swift (2014)
  • Apple Watch (2015)
  • AirPods (2016)
  • Face ID (2017)
  • Neural engine SoC (2017)
  • SwiftUI (2019)
  • Apple silicon (2020)
  • Vision Pro (2023) [arguable, since it wasn’t a commercial success, but definitely a technical feat]

The App Store, I’d argue, is on the same level as the iPhone because it opened up an entirely new economy for developers, resulting in an astounding $935 billion market in 2025. Apple Watch might be a close second, kicking off a $38 billion market for smartwatches.

Let’s think about Meta’s inventions since 2007, excluding acquisitions*:

  • Facebook Messenger (2011)
  • React (2013)
  • React Native (2015)
  • GraphQL (2015)
  • PyTorch (2016)
  • Ray-Ban Stories (2021)
  • Llama (2023)

*Yes, excluding acquisitions, as Zuckerberg is talking about inventions. That’s why WhatsApp, Instagram, and Quest are not included. Anything I’m missing on this list?

As you can see, other than Messenger and the Ray-Ban glasses, the rest of Meta’s inventions are aimed at developers, not consumers. I’m being a little generous.

Update 1/12/2025

I’ve added some products to the lists above based on some replies to my Threads post. I also added a sentence to clarify excluding acquisitions.

Apple VR headset on a table

Thoughts on Apple Vision Pro

Apple finally launched its Vision Pro “spatial computing” device in early February. We immediately saw TikTok memes of influencers being ridiculous. I wrote about my hope for the Apple Vision Pro back in June 2023, when it was first announced. When preorders opened for Vision Pro in January, I told myself I wouldn’t buy it. I couldn’t justify the $3,500 price tag. Out of morbid curiosity, I would lurk in the AVP subreddits to live vicariously through those who did take the plunge.

After about a month of reading all the positives from users about the device, I impulsively bought an Apple Vision Pro. I placed my order online at noon and picked it up just two hours later at an Apple Store near me.

Many great articles and YouTube videos have already been produced, so this post won’t be a top-to-bottom review of the Apple Vision Pro. Instead, I’ll try to frame it from my standpoint as someone who has designed user experiences for VR.

Welcome to the Era of Spatial Computing

Augmented reality, mixed reality, or spatial computing—as Apple calls it—on a “consumer” device is pretty new. You could argue that Microsoft HoloLens did it first, but that didn’t generate the same cultural currency as AVP has, and the HoloLens line has been relegated to industrial applications. The Meta Quest 3, launched last October, also has a passthrough camera, but they don’t market the feature; it’s still sold as a purely virtual reality headset.

Screenshot of the Apple Vision Pro home screen showing floating app icons in an augmented reality workspace. Visible apps include TV, Music, Mindfulness, Settings, Safari, Photos, Notes, App Store, Freeform, Mail, Messages, Keynote, and Compatible Apps, overlaid on a real-world office environment.

Vision Pro Home Screen in my messy home office.

Putting on Vision Pro for the first time is pretty magical. I saw the world around me—though a slightly muted and grainy version of my reality—and I saw UI floating and pinned to reality. Unlike any other headset I’ve tried, there is no screen door effect. I couldn’t see the pixels. It’s genuinely a retina display just millimeters away from my actual retinas. 

The UI is bright, vibrant, and crisp in the display. After launching a weather app from the home “screen” and positioning it on a wall, it stays exactly where it is in my living room. As I move closer to the app, everything about the app remains super sharp. It’s like diving into a UI. 

The visionOS User Interface

The visionOS UI feels very much like an extension of macOS. There’s a lot of translucency, blurred backgrounds for a frosted glass effect, and rounded corners. The controls for moving, closing, and resizing a window feel very natural. There were times when I wished I could rotate a window on its Y-axis to face me better, but that wasn’t possible. 

Admittedly, I didn’t turn on any accessibility features. But as is, the UI has a significant contrast problem. Even with no vision issues, half the time I found it hard to tell whether something was highlighted. I would often have to look at another UI component and then back again to make sure a button was actually highlighted.

When you launch a Vision Pro app, it is placed right in front of you. For example, I would be in the Photos app, then click the Digital Crown (the dial for immersion) to bring up the Home Screen, which is overlaid on top of the app. The background app gets fainter, so I can tell the new screen is on top of Photos. Launching the Apple TV app from there brings up the TV window on top of Photos, and the handles for the two windows end up so close together that it’s difficult to select the right one with my eyes in order to move it.

Window management, in general, is a mess. First of all, there is none. There’s no minimizing of windows; I would have to move them out of the way. There’s no collecting of windows. For instance, I couldn’t set up a workspace with the apps in the right place, collapse them all, and bring them with me to another room in my house. I would have to close them all, reopen them, and reposition them in the new room.

Working in Apple Vision Pro

I was excited to try the Mac Virtual Display feature, where you can see your Mac’s screen inside Vision Pro. Turning this on is intuitive. A “Connect” button appeared just above my MacBook Pro when I looked at it.

The Mac’s screen blacks out, and a large screen inside Vision Pro appears. I could resize it, move it around, and position it exactly where I wanted it. Everything about this virtual screen was crisp, but I ran into issues.

First, I’m a pretty good typist but cannot touch-type. With the Mac Virtual Display, I need to look down at my keyboard every few seconds. The passthrough camera on the headset is great but not perfect. There is some warping of reality on the edges, and that was just enough to cause a little motion sickness.

Second, when I’m sitting at my desk, I’m used to working with dual monitors. I usually have email or comms software on the smaller laptop screen while I work in Figma, Illustrator, or Photoshop on my larger 5K Apple Studio Display. If I sit at my desk and turn on Mac Virtual Display, I also lose my Studio Display. Only one virtual display shows up in Vision Pro. 

I tried to mitigate the lost space by opening Messages, Spark Email (the iPad version), and Fantastical in Vision Pro and placing those apps around me. But I found switching from my Mac to these other apps cumbersome: I’d have to stop using my mouse and use my fingers instead when I looked at Spark. Keyboard focus, I found, depends on where my eyes are looking. Say I was reading an email in Spark and needed to glance down at my keyboard to find the “E” key to archive it; if I pressed the key before my eyes were back in the Spark window, that E went to whatever app my gaze happened to cross. In other words, my eyes are my cursor, and that takes a while to get used to.

Spatial Computing 1.0

This is only the first version of visionOS (currently 1.1). I expect many of these issues, like window management, eye-tracking and input confusion, and contrast, to improve in the coming years.

Native visionOS Apps

In many ways, Apple has been telegraphing what they want to achieve with Vision Pro for years. Apple’s API for augmented reality, ARKit, was released way back in June 2017, a full six years before Vision Pro was unveiled. Some of the early AR apps for Vision Pro are cool tech demos.

Screenshot from Apple Vision Pro using the JigSpace app, showing a detailed 3D augmented reality model of a jet engine overlaid in a modern living room environment.

There’s a jet engine in my living room!

The JigSpace app plunks 3D models of real-world objects into your living room. I pulled up a working jet engine and was able to peel away its layers to see how it worked. There’s even a Formula 1 race car that you can load into your environment.

The Super Fruit Ninja game was fun. I turned my living room into a fruit-splattered dojo. I could even launch throwing stars from my hands that would get stuck on my walls.

Screenshot from Apple Vision Pro using the Zillow Immerse app, displaying a virtual tour interface overlaid on a dining area. Navigation options such as “Breakfast nook,” “Living room,” and “Kitchen” appear at the bottom, along with a broken 3D floor plan model in the center.

That’s half a floor plan on top of a low-resolution 360° photo.

Some Vision Pro apps were rushed out the door and are just awful. The Zillow Immerse app is one of them. I found the app glitchy and all the immersive house tours very low-quality. The problem is that the environments that ship with Vision Pro are so high-resolution and detailed that anything short of that is jarringly inferior. 

UX Considerations in Vision Pro

Apple Vision Pro can run iPad apps, at least the ones where the developer has enabled the capability. However, I found that many of the touch targets in iPad apps are not big enough. Apple’s Human Interface Guidelines specify that hit targets should be at least 44×44 pt, but when an app is opened in Vision Pro, that’s not enough. For visionOS, Apple recommends that controls’ centers be at least 60 pt apart.

I would go further and recommend that visionOS controls have large targets everywhere. In Apple’s own Photos app, only the accordion arrow in the left sidebar is a control: looking at and selecting an accordion label like “Spatial” or “Selfies” does nothing. I had to look to the right of the label, at the arrow, in order to select the item. Not great.
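To make this concrete, here’s a minimal SwiftUI sketch of the kind of sidebar row I wish Photos had, where the entire row, not just the disclosure arrow, is the hit target. The numbers follow Apple’s published guidance (44 pt minimum visuals, roughly 60 pt between control centers); the view and label names are mine, not Apple’s.

```swift
import SwiftUI

// A sidebar row whose whole surface responds to eye-and-pinch input,
// not just the disclosure glyph. Names here are illustrative.
struct SidebarRow: View {
    let label: String
    @Binding var isExpanded: Bool

    var body: some View {
        Button {
            isExpanded.toggle()
        } label: {
            HStack {
                Text(label)
                Spacer()
                Image(systemName: isExpanded ? "chevron.down" : "chevron.right")
            }
            .frame(minHeight: 44)          // HIG minimum visual size
            .contentShape(Rectangle())     // the whole row is tappable
            .padding(.vertical, 8)         // nudges row centers toward 60 pt apart
        }
        .buttonStyle(.plain)
        .hoverEffect(.highlight)           // gaze highlight covers the full row
    }
}
```

Whether the spacing math works out to exactly 60 points depends on the surrounding layout, but the principle stands: the label and the arrow should be one target.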

Eye and hand tracking in Vision Pro are excellent, although not perfect. There were many times when I couldn’t get the device to register my pinch gesture, or couldn’t get my eyes to land on the exact spot needed to resize a window.

Some apps take advantage of additional gestures, like pinching with both hands and then pulling them apart to resize something. I do believe more standard gestures will need to be introduced for visionOS in the future.
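One encouraging sign is that some of these gestures may already map onto standard APIs. My understanding, and this is an assumption on my part, is that visionOS delivers the two-hand pinch-and-pull as an ordinary magnification gesture, so in SwiftUI it could look something like this (the image asset name is hypothetical):

```swift
import SwiftUI

// A view resized by the two-hand pinch-and-pull, assuming visionOS
// routes that gesture through SwiftUI's standard MagnifyGesture.
struct ResizablePoster: View {
    @State private var baseScale: CGFloat = 1.0
    @State private var magnification: CGFloat = 1.0

    var body: some View {
        Image("poster")                    // hypothetical asset name
            .resizable()
            .aspectRatio(contentMode: .fit)
            .scaleEffect(baseScale * magnification)
            .gesture(
                MagnifyGesture()
                    .onChanged { value in
                        // Live scale while both pinches are held.
                        magnification = value.magnification
                    }
                    .onEnded { _ in
                        // Commit the new size when the hands let go.
                        baseScale *= magnification
                        magnification = 1.0
                    }
            )
    }
}
```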

Steve Jobs famously once said, “God gave us ten styluses. Let’s not invent another.” Apple eventually introduced the Pencil for iPad. I think for many applications and for users to be productive with them, Apple will have to introduce a controller.

IMAX in My Bedroom

The single most compelling use case for Apple Vision Pro right now is consuming video content, specifically movies and TV shows. The built-in speakers, which Apple calls audio pods, sound fantastic. Apple has been doing a lot of work on Spatial Audio over the years, and I experienced really great surround sound in the Vision Pro. The three apps that currently stand out for video entertainment are IMAX, Disney Plus, and Apple TV.

Watching content in the IMAX app—only a couple of trailers were free—reminded me of the best IMAX screen I’ve ever been to: the one in the Metreon in San Francisco. The screen is floor-to-ceiling with a curved railing in front of it, and on either side is a backlit IMAX logo. I could even choose from a few different positions in the theater!

Screenshot from Apple Vision Pro using the Disney+ app, showing a virtual Star Wars-themed environment with a sunset over Tatooine. A floating screen displays a scene featuring droids BB-8 and R2-D2, blending immersive AR with cinematic playback.

Watching a Star Wars movie on Tatooine.

Disney leverages its IP very well by giving us various sets in which to watch its content. I could watch Avengers: Endgame from Avengers Tower, Monsters, Inc. from the scare floor, or The Empire Strikes Back from Luke’s landspeeder on Tatooine.

With Apple TV, I could watch Masters of the Air in a window in my space or go into an immersive environment. Whether it’s lakeside looking toward Mount Hood, on the surface of the moon, or in a proper movie theater, the content was the star. My wife goes to sleep before me, and I usually put on my AirPods and watch something on my iPad. With Vision Pro, I could be much more engrossed in the show because the screen is as big as my room.

Still from an Apple Vision Pro commercial showing a person lying on a couch wearing the headset, watching a large virtual screen suspended in the air that displays warplanes flying through clouds. The scene emphasizes immersive home entertainment; caption reads “Apple TV+ subscription required.”

From the Apple commercial “First Timer”

I rewatched Dune from 2021 and was blown away by the audio quality of my AirPods Pro. The movie has incredible sound and uses bass and sub-bass frequencies a lot, so I was surprised at how well the AirPods performed. Of course, I didn’t feel the bass rumble in my chest, but I could certainly hear it in my ears.

Vision Pro Industrial Design

Close-up photo of the Apple Vision Pro headset, showcasing its sleek design with a reflective front visor, external cameras, and adjustable fabric headband resting on a dark surface.

The Vision Pro hardware is gorgeous.

As many others have pointed out, the hardware is incredible. It feels very premium and is a technological marvel. The cool-looking Solo Knit Band works pretty well for me, but every head is different, so your mileage may vary. Every face is different, too, and Apple uses the Face ID scanner on the iPhone to scan yours when you order. This scan determines the exact light seal they’ll include with your Vision Pro.

There are 28 different models of light seals. Finding the right one to fit my face wasn’t as easy as taking the recommendation from the scan. When I went to pick up the device, I opted for a fitting, but the 21W that was suggested didn’t feel comfortable. I tried a couple of other light seal sizes and settled on the most comfortable one. But at home, the device was still very uncomfortable: I couldn’t wear it for more than 10 minutes without feeling a lot of pressure on my cheeks.

The next day, I returned to the Apple Store and tried three or four more light seal and headband combinations. But once dialed in, the headset was comfortable enough for me to watch an hour-long TV show.

I wonder why Apple didn’t develop a fit method that requires less variation. Wouldn’t a memory-foam-cushioned light seal work?

Apple’s Ambitions

The Apple Vision Pro is an audacious device, and I can tell where they want to go, but they don’t yet have the technology to get there. They want to make AR glasses with crystal-clear, super-sharp graphics that can then be converted to immersive VR with the flick of a dial.

That’s why EyeSight, the screen on the front of the headset, shows the user’s eyes to people nearby, while the passthrough cameras let the user see out. Together, these two features let Vision Pro act as a clear two-way lens.

But Apple seems to want both AR and VR in the same device, and I would argue that might be physically impossible. Imagine instead an Apple device more like the HoloLens: true glasses with imagery projected onto them. That would eliminate Vision Pro’s smaller-than-its-competitors’ field of view, or FOV. It would eliminate the ridiculous fitting conundrum, since the glasses could float in front of your eyes. And it would probably reduce the device’s weight, which has been discussed at length in many reviews.

And then, for VR, maybe there’s a conversion that could happen with the AR glasses. A dial could turn the glasses from transparent to opaque. Then, the user would snap on a light-blocking attachment (a light seal). I believe that would be a perfectly acceptable tradeoff.

What $3,500 Buys You

In 1985, when I was 12 years old, I badgered my father daily to buy me a Macintosh computer. I had seen it at ComputerLand, a computer shop on Van Ness Avenue. I would go multiple times per week after school just to mess around with the display unit. I was enamored with MacPaint.

Vintage black-and-white print ad announcing the Apple Macintosh, featuring a hand using a computer mouse and a sketch of the Macintosh computer. The headline reads, “We can put you in touch with Macintosh,” promoting its simplicity and ease of use. The ad is from ComputerLand with the tagline “Make friends with the future.”

After I don’t know how many months, my dad relented and bought me a Macintosh 512K. The retail cost of the machine in 1985 was $2,795, equivalent to $8,000 in 2024 dollars. That’s a considerable investment for a working-class immigrant family. But my wise father knew then that computers were the future. And he was right.

With my Mac, I drew illustrations in MacPaint, wrote all my school essays in MacWrite, and made my first program in HyperCard. Eventually, I upgraded to other Macs and got exposed to and honed my skills in Photoshop and Illustrator, which would help my graphic design career. I designed my first application icon when I was a senior in high school.

Of course, computers are much cheaper today. The $999 entry-level MacBook Air can do what my Mac 512K did and so much more. A kid today armed with a MacBook Air could learn so much!

Which brings us to the price tag of the Apple Vision Pro. It starts at $3,499. For a device where you can’t—at least for now—do much but consume. This was an argument against iPad for the longest time: it is primarily a consumption device. Apple went so far as to create a TV spot showing how a group of students use an iPad to complete a school project. With an iPad, there is a lot of creation that can happen. There are apps for drawing, 3D sculpting, video editing, writing, brainstorming, and more. It is more than a consumption device.

More than a Consumption Device? Not So Fast.

For Vision Pro, today, I’m not so sure. The obvious use case is 3D modeling and animation. Someone is already figuring out how to visualize 3D models from Blender in AVP space, though it’s tied to the instance of Blender running on his Mac. 3D modeling and animation software is notoriously complicated: the UI for Cinema 4D, the 3D software I know best, has so many options, commands, and keyboard shortcuts that it would be impossible to replicate in visionOS. Or take simpler apps like Final Cut Pro or Photoshop. Both have iPad apps, but the combination of keyboard and mouse makes a user so much more productive. Imagine having to look at precisely the right UI element in Vision Pro, then pinch at exactly the right thing, in a dense interface like Final Cut Pro. It would be a nightmare.

Screenshot from Apple Vision Pro using the Djay app, showing a realistic virtual DJ setup with turntables and music controls overlaid in a modern living room. A user’s hand interacts with the virtual record player, blending AR and music mixing in real time.

Being creative with djay in Apple Vision Pro

I do think creative apps will eventually find their way to the platform. Fittingly, one of the launch apps is djay, the DJ app. But it will take developers some time to figure out what works.

Beyond that, could a developer use Vision Pro to program in? If we look to the iPadOS ecosystem, there are a handful of apps for writing code, but there is no way to run your code, at least not natively. Erik Bledsoe from Coder writes, “The biggest hurdle to using an iPad for coding is its lack of a runtime environment for most languages, forcing you to move your files to a server for compiling and testing.” The workaround is to use a cloud-based IDE in the browser, like Coder. I imagine the same limitations will apply to Vision Pro.

The Bottom Line

For $3,500, you could buy a 16-inch MacBook Pro with an M3 Pro chip and an iPhone 15 Pro. Arguably, this would be a much more productive setup. With the Mac, you’d have access to tens of thousands of apps, many for professional applications. With the iPhone, nearly two million apps in the App Store.

In other words, I don’t believe buying an Apple Vision Pro today would open a new world up for a teenager. It might be cool and a little inspirational, but it won’t help the creator inside them. It won’t do what the Mac 512K did for me back in 1985.

Vision Pro’s Future

Clearly, the Apple Vision Pro released in 2024 is a first-generation product. Just like the first-gen Apple Watch, Apple and its customers will need to feel their collective way and figure out the right use cases. We can look to the Meta Quest 3 and Microsoft HoloLens 2 for a glimpse.

As much as people were marveling at the AR vacuum cleaning game for Vision Pro, AR and VR apps have existed for a while. PianoVision for Meta Quest 3 combines your real piano or keyboard with a Guitar Hero-like game to teach you how to play. The industrial applications for HoloLens make a lot of sense.

Now that Apple is officially in the AR/VR game, developers will show great enthusiasm and investment in the space. On Reddit, at least, there’s a lot of excitement from users and developers. We will have to see if the momentum lasts. The key for developers will be the size of the market: will there be enough Vision Pro users to sustain a thriving app ecosystem?

As for me, I decided to return my Vision Pro within the 14-day return window. The only real use case for me was media consumption, and I couldn’t justify spending $3,500 on a room-sized TV that only I can watch. Sign me up for version 2, though.

Apple Vision Pro

Transported into Spatial Computing

After years of rumors and speculation, Apple finally unveiled their virtual reality headset yesterday in a classic “One more thing…” segment in their keynote. Dubbed Apple Vision Pro, this mixed reality device is perfectly Apple: it’s human-first. It’s centered around extending human productivity, communication, and connection. It’s telling that one of the core problems they solved was the VR isolation problem. That’s the issue where users of VR are isolated from the real world; they don’t know what’s going on, and the world around them sees that. Insert meme of oblivious VR user here. Instead, with the Vision Pro, when someone else is nearby, they show through the interface. Additionally, an outward-facing display shows the user’s eyes. These two innovative features help maintain the basic human behavior of acknowledging each other’s presence in the same room.

Promotional image from Apple showing a woman smiling while wearing the Vision Pro headset, with her eyes visible through the front display using EyeSight technology. She sits on a couch in a warmly lit room, engaging with another person off-screen.

I know a thing or two about VR and building practical apps for VR. A few years ago, in the mid-2010s, I cofounded a VR startup called Transported. My cofounders and I created a platform for touring real estate in VR. We wanted to help homebuyers and apartment hunters more efficiently shop for real estate. Instead of zigzagging across town running to multiple open houses on a Sunday afternoon, you could tour 20 homes in an hour on your living room couch. Of course, “virtual tours” existed already. There were cheap panoramas on real estate websites and “dollhouse” tours created using Matterport technology. Our tours were immersive; you felt like you were there. It was the future! There were several problems to solve, including 360° photography, stitching rooms together, building a player, and then most importantly, distribution. Back in 2015–2016, our theory was that Facebook, Google, Microsoft, Sony, and Apple would quickly make VR commonplace because they were pouring billions of R&D and marketing dollars into the space. But it turned out we were a little ahead of our time.

Consumers didn’t take to VR as all the technologists predicted. Headsets were still cumbersome. The best device in the market then was the Oculus Rift, which had to be tethered to a high-powered PC. When the Samsung Gear VR launched, it was a game changer for us because the financial barrier to entry was dramatically lowered. But despite the big push from all these tech companies, the consumer adoption curve still wasn’t great.

For our use case—home tours—consumers were fine with the 2D Matterport tours. They didn’t want to put on a headset. Transported withered as the gaze from the tech companies wandered elsewhere. Oculus continued to come out with new hardware, but the primary applications have all been entertainment. Practical uses for VR never took off. Despite Meta’s recent metaverse push, VR was still seen as a sideshow, a toy, and not the future of computing.

Until yesterday.

Blurry, immersive view of a cozy living room with the centered text “Welcome to the era of spatial computing,” representing the Apple Vision Pro experience and its introduction to augmented reality.

Apple didn’t coin the term “spatial computing.” The credit belongs to Simon Greenwold, who, in 2003, defined it as “human interaction with a machine in which the machine retains and manipulates referents to real objects and spaces.” But with the headline “Welcome to the era of spatial computing,” Apple brilliantly reminds us that VR has practical use cases. They take a position opposite of the all-encompassing metaverse playland that Meta has staked out. They’ve redefined the category and may have breathed life back into it.

Beyond marketing, Apple has solved many of the problems that have plagued VR devices.

  • Isolation: As mentioned at the beginning of this piece, Apple seems to have solved the isolation issue with what they’re calling EyeSight. People around you can see your eyes, and you can see them inside Vision Pro.
  • Comfort: One of the biggest complaints about the Oculus Quest is its heaviness on your face. Apple solves this with a wired battery pack that users put into their pockets, thus moving that weight off their heads. But it is a tether.
  • Screen door effect: Even though today’s screens have really tiny pixels, users can still see the individual pixels because they’re so close to the display. In VR, this is called the “screen door effect” because you can see the lines between the screen’s pixels. The Quest 2 is roughly HD-quality (1832x1920) per eye. Apple Vision Pro will be double that to 4K quality per eye. We’ll have to see if this is truly eliminated once reviewers get their hands on test units.
  • Immersive audio: Building on the spatial audio technology they debuted with AirPods Pro, Vision Pro will have immersive audio to transport users to new environments.
  • Control: One of the biggest challenges in VR adoption has been controlling the user interface. Handheld game controllers are not intuitive for most people. In the real world, you look at something to focus on it, and you use your fingers and hands to manipulate objects. Vision Pro looks to overcome this usability issue with eye tracking and finger gestures.
  • Performance: Rendering 3D spaces in real-time requires a ton of computing and graphics-processing power. Apple’s move to its own M-series chips leapfrogs those available on competitors’ devices.
  • Security: In the early days of the Oculus Rift, users had to take off their headsets in the middle of setup to create and log into an online account. More recently, Meta mandated that Oculus users log in with their Facebook accounts. I’m not sure about Vision Pro’s setup process, but privacy-focused Apple has built on Face ID to create an iris-scanning technology called Optic ID. This identifies the specific human, so it’s as secure as a password. Finally, your surroundings captured by the external cameras are processed on-device.
  • Cross-platform compatibility: If Vision Pro is to be used for work, it will need to be cross-platform. In Apple’s presentation, FaceTime calls in VR didn’t exclude non-VR participants. Their collaborative whiteboard app, Freeform, looked to be usable on Vision Pro.
  • Development frameworks: There are 1.8 million apps in Apple’s App Store developed using Apple’s developer toolkits. From the presentation, it looked like converting existing iOS and possibly macOS apps to be compatible with visionOS should be trivial (see the sketch after this list). Additionally, Apple announced they’re working with Unity to help developers bring their existing apps—games—to Vision Pro.
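As a thought experiment on that last point, here’s roughly what a “trivial” conversion could look like: one SwiftUI code base with a single platform-conditional tweak. This is speculative on my part; the visionOS SDK isn’t public yet, so the `os(visionOS)` condition and the `.volumetric` window style are assumptions, not anything Apple showed.

```swift
import SwiftUI

// One SwiftUI app target shared across platforms. The os(visionOS)
// branch and .volumetric style are assumed, not confirmed by Apple.
@main
struct CrossPlatformApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        #if os(visionOS)
        .windowStyle(.volumetric)   // hypothetical visionOS-only window style
        #endif
    }
}

struct ContentView: View {
    var body: some View {
        // The same view hierarchy renders on iOS, macOS, and visionOS.
        Text("Hello from one code base.")
            .padding()
    }
}
```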

Person wearing an Apple Vision Pro headset stands at a desk in a loft-style office, interacting with multiple floating app windows in augmented reality. The text reads, “Free your desktop. And your apps will follow.” promoting spatial computing.

While Apple Vision Pro looks to be a technological marvel that has been years in the making, I don’t think it’s without its faults.

  • Tether: The Oculus Quest was a major leap forward. Free from being tethered to a PC, games like Beat Saber were finally possible. While Vision Pro isn’t tethered to a computer, there is the cord to the wearable battery pack. Apple has been in a long war against wires—AirPods, MagSafe charging—and now they’ve introduced a new one.
  • Price: OK, at $3,500, it is as expensive as the highest-end 16-inch MacBook Pro. This is not a toy and not for everyday consumers. It’s more than ten times the price of an Oculus Quest 2 ($300) and more than six times that of a Sony PlayStation VR 2 headset ($550). I’m sure the “Pro” designation softens the blow a little.

Apple Vision Pro will ship in early 2024. I’m excited by the possibilities of this new platform. Virtual reality has captured the imagination of science-fiction writers, futurists, and technologists for decades. Being able to completely immerse yourself into stories, games, and simulations by just putting on a pair of goggles is very alluring. The technology has had fits and starts. And it’s starting again.

Creative Selection book with Roger Wong's Apple badge

The Apple Design Process

I recently came across Creative Selection: Inside Apple’s Design Process During the Golden Age of Steve Jobs by former software engineer Ken Kocienda. It showed up in one of my social media feeds, and since I’m interested in Apple and the creative process, and was at Apple during that era, I was curious.

I began reading the book Saturday evening and finished it Tuesday morning. It was an easy read, as I was already familiar with many of the players mentioned and nearly all the technologies and concepts. But I’d done something I hadn’t done in a long time—I devoured the book.

Ultimately, this book gave more color and structure to what I’d already known from my time at Apple and my own interactions with Steve Jobs. He was the ultimate creative director, someone who could inspire, choose, and direct work.

Kocienda describes a nondescript conference room called Diplomacy in Infinite Loop 1 (IL1), the first building at Apple’s then main campus. This was the setting for an hours-long meeting where Steve held court with his lieutenants. Their team members would wait nervously outside the room and get called in one by one to show their in-progress work. Kocienda recounts showing Steve the iPad software keyboard for the first time: he presented a solution that let the user choose between two layouts, more but smaller keys or fewer but bigger ones. Steve asked which one Kocienda liked better; he said the bigger keys, and that was decided.

Before reading this book, I had known about these standing meetings. Not the one about software, but I knew about the MarCom meeting. Every Wednesday afternoon, Steve would hold a similar meeting—Phil Schiller would be there too, of course—to review in-progress work from the Marketing & Communications teams. This included stuff from the ad agency and work from the Graphic Design Group, where I was.

My department was in a plain single-story building on Valley Green Drive, a few blocks from the main campus and close to the Apple employee fitness center. The layout inside consisted of one large room where nearly everyone sat. Our workstations were set up on bench-style desks. Picture a six-foot table, with a workstation on the left facing north and another on the right facing south. There were three of these six-foot tables per row and maybe a dozen rows. Tall 48” x 96” Gatorfoam boards lined the perimeter of the open area. On these boards, we pinned printouts of all our work in progress. Packaging concepts, video storyboards, Keynote themes, and messaging headlines were all tacked up. 

There were a handful of offices at one end and two large offices in the back. One was called the Lava Lounge and housed a group of highly skilled Photoshop and 3D artists. In their dim room, lit only by lava lamps, they retouched photos and recreated screenshots and icons at incredibly high resolutions for use on massive billboards. The other office was for people working on super-secret projects. That one, of course, was badge access only.

My boss, Hiroki Asai, the executive creative director at the time, sat out in the open area with the rest of us. Every day around 4pm, he would walk the perimeter of the room and review all the work. He’d offer his critique, which often ended up being, “I think this needs to be more…considered.” (He was always right!) A gaggle of designers, copywriters, and project managers would follow him around and offer their own opinions of the work. In other words, as someone who worked in the room, I had to pin up my work by 4pm every day and show progress. Feedback from Hiroki was essential to moving work forward.

So every Wednesday afternoon, with a bundle of work tucked under his arm, he would exit the side door of the building and race over to IL1 to meet with Steve. I never went with him to those meetings; he usually brought project managers or creative directors. Sometimes Hiroki would come back dejected after being yelled at by Steve, and sometimes he’d come back triumphant, having gotten Steve’s seal of approval.

I like to tell one story about how our design team created five hundred quarter-scale mockups to get to an approval for the Power Mac G5 box. The final design was a black box with a photo of the computer tower on each side, each photo corresponding to that side of the product. Steve didn’t want to be presented with only one option. He needed many. And then they were refined.

The same happened with the Monsters, Inc. logo when I was at USWeb/CKS. We presented Steve with a thick two-inch binder full of logo ideas. There must have been over a hundred in there.

Steve always expected us to do our due diligence, explore all options, and show our work. Show him that we did the explorations. He was the ultimate creative director.

That’s how Steve Jobs also approached software and hardware design, which is nicely recounted in Kocienda’s book. 

In the book, Kocienda enumerates seven essential elements in Apple’s (product) design process: inspiration, collaboration, craft, diligence, decisiveness, taste, and empathy. I would expand upon that and say the act of exploration is also essential, as it leads to inspiration. In Steve’s youth, he experimented with LSD, became a vegetarian, took classes on calligraphy, and sought spiritual teachers in India. He was exploring to find his path. As with his own life, he used the act of exploration to design everything at Apple, to find the right solutions.

As designers, copywriters, and engineers, we explored all possibilities even when we knew where we would end up, just to see what was out there. Take the five hundred Power Mac G5 boxes it took to get to a simple black box with photos. Or my 14 rounds of MacBuddy. The concept of exploring and then refining is the definition of “creative selection,” Kocienda’s play on Darwin’s natural selection. But his essential element of diligence best illustrates the obsessive refinement work underwent at Apple. Quality isn’t magic. It comes from a lot of perspiration.
