
44 posts tagged with “apple”

Nielsen Norman Group weighs in on iOS 26 Liquid Glass. Predictably, they don’t like it. Raluca Budiu:

With iOS 26, Apple seems to be leaning harder into visual design and decorative UI effects — but at what cost to usability? At first glance, the system looks fluid and modern. But try to use it, and soon those shimmering surfaces and animated controls start to get in the way.

I get it. Flat—or mostly flat—and static UI conforms to the heuristics. But honestly, it can get boring and homogeneous quickly. Put the NN/g microscope on any video game UI and it’ll be torn to shreds, despite gamers learning to adapt quickly.

I’ve had iOS 26 on my phone for just a couple of weeks. I continue to be delighted by the animations and effects. So far, nothing has hindered the usability for me. We’ll see what happens as more and more apps adopt the new design.

Liquid Glass Is Cracked, and Usability Suffers in iOS 26

iOS 26’s visual language obscures content instead of letting it take the spotlight. New (but not always better) design patterns replace established conventions.

nngroup.com

As much as I defended the preview, and as much as Apple wants to make Liquid Glass a thing, the new UI is continuing to draw criticism. Dan Moren for Six Colors:

“Glass” is the overall look of these updates, and it’s everywhere. Transparent, frosted, distorting. In some places it looks quite cool, such as in the edge distortion when you’re swiping up on the lock screen. But elsewhere, it seems to me that glass may not be quite the right material for the job. The Glass House might be architecturally impressive, but it’s not particularly practical.

It’s also a definite philosophical choice, and one that’s going to engender some criticism—much of it well-deserved. Apple has argued that it’s about getting controls out of the way, but is that really what’s happening here? It’s hard to argue that having a transparent button sitting right on top of your email is helping that email be more prominent. To take this argument to its logical conclusion, why is the keyboard not fully transparent glass over our content?

I’ve yet to upgrade myself. I will say that everyone dislikes change; lest we forget, the now-ubiquitous flat design introduced by iOS 7 was also criticized when it debuted.


iOS 26 Review: Through a glass, liquidly

iOS 26! It feels like just last year we were here discussing iOS 18. How time flies. After a year that saw the debut of Apple Intelligence and the subsequent controversy over the features that it d…

sixcolors.com

Ah, this brings back memories! I spent so much time in MacPaint working with these patterns when I was young. Paul Smith faithfully recreates them:

I was working on something and thought it would be fun to use one of the classic Mac black-and-white patterns in the project. I’m talking about the original 8×8-pixel ones that were in the original Control Panel for setting the desktop background and in MacPaint as fill patterns.

I figured there must be clean, pixel-perfect GIFs or PNGs of them somewhere on the web. And perhaps there are, but after poking around a bit, I ran out of energy for that, but by then had a head of steam for extracting the patterns en masse from the original source, somehow. Then I could produce whatever format I needed for them.
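(For the curious: each of those classic patterns is just eight bytes of data, one byte per row and one bit per pixel, with the high bit as the leftmost pixel. Here’s a minimal Python sketch of how you might expand one into pixels; the pattern bytes below are a made-up example, not one of the originals.)

```python
# Each classic Mac pattern is 8 bytes: one byte per row, one bit per pixel,
# with the most significant bit as the leftmost pixel.
# EXAMPLE_PATTERN is a made-up placeholder, not one of the original patterns.
EXAMPLE_PATTERN = [0x88, 0x44, 0x22, 0x11, 0x88, 0x44, 0x22, 0x11]

def pattern_rows(pattern):
    """Expand 8 pattern bytes into an 8x8 grid of 0/1 pixel values."""
    return [[(byte >> (7 - bit)) & 1 for bit in range(8)] for byte in pattern]

def render_ascii(pattern):
    """Draw the pattern as text: '#' for a set (black) pixel, '.' for white."""
    return "\n".join(
        "".join("#" if px else "." for px in row) for row in pattern_rows(pattern)
    )

print(render_ascii(EXAMPLE_PATTERN))
```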


Classic 8×8-pixel B&W Mac patterns

TL;DR: I made a website for the original classic Mac patterns I was working on something and thought it would be fun to use one of the classic Mac black-and-white patterns in the project. I'm talking about the original 8×8-pixel ones that were in the...

pauladamsmith.com
Still from a video shown at Apple Keynote 2025. Split screen of AirPods Pro connection indicator on left, close-up of earbuds in charging case on right.

Notes About the September 2025 Apple Event

Today’s Apple keynote opened with a classic quote from Steve Jobs.

Steve Jobs quote at Apple Keynote 2025 – Black keynote slide with white text: “Design is not just what it looks like and feels like. Design is how it works.” – Steve Jobs.

Then a video played, focused on the fundamental geometric shapes found in Apple’s products: circles in the HomePod, iPhone shutter button, iPhone camera, MagSafe charging ring, and Digital Crown on Apple Watch; rounded squares in the charging block, Home scene button, Mac mini, keycaps, Finder icon, and Face ID; and lozenges in the AirPods case, MagSafe port, Liquid Glass carousel control, and the Action button on Apple Watch Ultra.

It’s no secret that I am a big fan of Severance, the Apple TV+ show that has 21 Emmy nominations this year. I made a fan project earlier in the year that generates Outie facts for your Innie.

After launching a teaser campaign back in April, Atomic Keyboard is finally taking pre-orders for their Severance-inspired keyboard just for Macrodata Refinement department users. The show based the MDR terminals on the Data General Dasher D2 terminal from 1977. So this new keyboard includes three layouts:

  1. “Innie” which is show-accurate, meaning no Escape, no Option, and no Control keys, and includes the trackball
  2. “Outie,” a 60% layout that includes modern modifier keys and the trackball
  3. “Dasher” which replicates the DG terminal layout

It’s not cheap. The final retail price will be $899, but they’re offering a pre-Kickstarter price of $599.


MDR Dasher Keyboard | For Work That's Mysterious & Important

Standard equipment for Macrodata Refinement: CNC-milled body, integrated trackball, modular design. Please enjoy each keystroke equally.

mdrkeyboard.com

John Calhoun joined Apple 30 years ago as a programmer to work on the Color Picker.

Having never written anything in assembly, you can imagine how overjoyed I was. It’s not actually a very accurate analogy, but imagine someone handing you a book in Chinese and asking you to translate it into English (I’m assuming here that you don’t know Chinese of course). Okay, it wasn’t that hard, but maybe you get a sense that this was quite a hurdle that I would have to overcome.

Calhoun was given an old piece of code and tasked with updating it. Instead, he translated it into a programming language he knew—C—and then decided to add to the feature. He explains:

I disliked HSL as a color space, I preferred HSV (Hue, Saturation, Value) because when I did artwork I was more comfortable thinking about color in those terms. So writing an HSV color picker was on my short list.

When I had my own color picker working I think I found that it was kind of fun. Perhaps for that reason, I struck out again and wrote another color picker. The World Wide Web (www) was a rather new thing that seemed to be catching on, so I naturally thought that an HTML color picker made sense. So I tackled that one as well. It was more or less the RGB color picker but the values were in hexadecimal and a combined RGB string value like “#FFCC33” was made easy to copy for the web designer.
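(That “combined RGB string value” is just the three 8-bit channels written out in hexadecimal. A quick Python sketch of the idea, mine and not Calhoun’s:)

```python
def rgb_to_hex(r, g, b):
    """Format 0-255 RGB channel values as a web hex color string."""
    for channel in (r, g, b):
        if not 0 <= channel <= 255:
            raise ValueError("channel values must be in the range 0-255")
    return f"#{r:02X}{g:02X}{b:02X}"

print(rgb_to_hex(255, 204, 51))  # -> #FFCC33
```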

So an engineer decided, all on his own, that he’d add a couple extra features. Including the fun crayon picker:

On a roll, I decided to also knock out a “crayon picker”. At this point, to be clear, the color picker was working and I felt I understood it well enough. As I say, I was kind of just having some fun now.

Screenshot of a classic Mac OS color picker showing the “Crayon Picker” tab. A green color named “Watercress” is selected, replacing the original orange color. Options include CMYK, HLS, and HSV pickers on the left.

And Calhoun makes this point:

It was frankly a thing I liked about working for Apple in those days. The engineers were the ones driving the ship. As I said, I wrote an HSV picker because it was, I thought, a more intuitive color space for artists. I wrote the HTML color picker because of the advent of the web. And I wrote the crayon picker because it seemed to me to be the kind of thing Apple was all about: HSL, RGB — these were kind of nerdy color spaces — a box of crayons is how the rest of us picked colors.

Making software—especially web software—has matured since then, with product managers and designers now collaborating closely with engineers. But with AI coding assistants, the idea of an individual contributor making solo decisions and shipping code might become de rigueur again.

Man sitting outside 2 Infinite Loop, Apple’s former headquarters in Cupertino, holding a book with an ID badge clipped to his jeans.

Almost Fired

I was hired on at Apple in October of 1995. This was what I refer to as Apple’s circling the drain period. Maybe you remember all the doomsaying — speculation that Apple was going to be shuttering soon. It’s a little odd perhaps then that they were hiring at all but apparently Apple reasoned that they nonetheless needed another “graphics engineer” to work on the technology known as QuickDraw GX. I was then a thirty-one-year-old programmer who lived in Kansas and wrote games for the Macintosh — surely, Apple thought, I would be a good fit for the position.

engineersneedart.com

Let’s continue down Mac memory lane with this fun post from Basic Apple Guy:

With macOS 26, Apple has announced a dramatically new look to their UI: Liquid Glass. Solid material icon elements give way to softer, shinier, glassier icons. The rounded rectangle became slightly more rounded, and Apple eliminated the ability for icon elements to extend beyond the icon rectangle (as seen in the current icons for GarageBand, Photo Booth, Dictionary, etc.).

With this release being one of the most dramatic visual overhauls of macOS’s design, I wanted to begin a collection chronicling the evolution of the system icons over the years. I’ve been rolling these out on social media over the past week and will continue to add to and update this collection slowly over the summer. Enjoy!


macOS Icon History

Documenting the evolution of macOS system icons over the past several decades.

basicappleguy.com

This is an amazing article and website by Marcin Wichary, the man behind the excellent Shift Happens book.

…I had a realization that the totemic 1984 Mac control panel, designed by Susan Kare, is still to this day perhaps the only settings screen ever brought up in casual conversation.

I kept wondering about that screen, and about what happened since then. Turns out, the Mac settings have lived a far more fascinating life than I imagined, have been redesigned many times, and can tell us a lot about the early history and the troubled upbringing of this interesting machine.

Indeed, Wichary goes through multiple versions of Mac operating systems and performs digital paleontology, uncovering long-lost Settings minutiae. It’s also a great lesson in UI along the way. Be sure to click on the Mac screens.


Frame of preference

A story of early Mac settings told by 10 emulators.

aresluna.org

It’s been said that desktop publishing democratized graphic design. For those of you too young to know the term, it refers to the technology that enabled graphic design to go digital. It was an ecosystem, really: the Mac, PostScript, the LaserWriter, and PageMaker. But before all that, designers depended on typesetters to set type.

David Langton writing for UX Collective:

A lot was lost when the Macintosh wiped out the traditional typesetting industry. From the art of typography to the craft of typesetting, many essential elements were lost. Typesetters were part of a tradition that stretched back more than 500 years to Gutenberg’s printing press. They understood the basics of type: kerning (spacing between the letters), leading (the space between lines of text), and line breaks (how to avoid widows — those solo words abandoned at the end of a paragraph). They knew about readability (like how to avoid setting type that was too wide to read). There were classic yet limited fonts, with standards for size and leading that assured that everyone working within common ranges maintained a threshold for quality. Yet it was in the craft or business side of typesetting that these services were most under appreciated. Typesetters provided overnight service. They worked overnight, so graphic designers did not have to. We would finish our days specifying the type, and the typesetters would keystroke the manuscripts, proofread, stylize the type, and set up columns following our instructions.

Designers would then pick up the galleys from the typesetters in the morning. The black type was photographically printed on white photo paper. You’d have to cut them up and paste them onto boards, assembling your layout.

Because this was such a physical process, we had to slow down. Langton says:

But since the Macintosh became an in-house tool, the process was reversed. Now, designers design first, then think about it. This shift in process has contributed to a trivialization of the role of graphic designer because anyone can noodle around with the Mac’s sophisticated type tools and make layouts. The design process has been trivialized while the thinking, the evaluation, and the strategic part of the process are often abandoned.

One small thing I’ll point out is that desktop publishing wasn’t popularized until 1985.

  • PostScript was released by Adobe in 1984.
  • The LaserWriter printer was released by Apple in 1985.
  • PageMaker was released by Aldus—later bought by Adobe—in 1985.

What the 1984 Macintosh revolution teaches designers about the 2025 AI revolution

Upheaval and disruption are nothing new for graphic designers.

uxdesign.cc

Vincent Nguyen writing for Yanko Design, interviewing Alan Dye, VP of Human Interface Design at Apple:

This technical challenge reveals the core problem Apple set out to solve: creating a digital material that maintains form-changing capabilities while preserving transparency. Traditional UI elements either block content or disappear entirely, but Apple developed a material that can exist in multiple states without compromising visibility of underlying content. Dye’s emphasis on “celebrating user content” exposes Apple’s hierarchy philosophy, where the interface serves content instead of competing with it. When you tap to magnify text, the interface doesn’t resize but stretches and flows like liquid responding to pressure, ensuring your photos, videos, and web content remain the focus while navigation elements adapt around them.

Since the Jony Ive days, Apple’s hardware has always been about celebrating the content. Bezels got smaller. Screens got bigger and brighter. Even the flat design brought on by iOS 7 and eventually adopted by the whole ecosystem was a way to strip away the noise and focus on the content.

Dye’s explanation of the “glass layer versus application layer” architecture provides insight into how Apple technically implements this philosophy. The company has created a distinct separation between functional controls (the glass layer) and user content (the application layer), allowing each to behave according to different rules while maintaining visual cohesion. This architectural decision enables the morphing behavior Dye described, where controls can adapt and change while content remains stable and prominent.

The Apple platform UI today sort of does that, but Liquid Glass seems to take it even further.

Nguyen, on his experience using the Music app on the Mac:

The difference from current iOS becomes apparent in specific scenarios. In the current Music app, scrolling through your library feels like moving through flat, static layers. With Liquid Glass, scrolling creates a sense of depth. You can see your album artwork subtly shifting beneath the translucent controls, creating spatial awareness of where interface elements sit in relation to your content. The tab bar doesn’t just scroll with you; it creates gentle optical distortions that make the underlying content feel physically present beneath the glass surface.


Apple’s Liquid Glass Hands-On: Why Every Interface Element Now Behaves Like Physical Material

Liquid Glass represents more than an aesthetic update or surface-level polish. It functions as a complex behavioral system, precisely engineered to dictate how interface layers react to user input. In practical terms, this means Apple devices now interact with interface surfaces not as static, interchangeable panes, but as dynamic, adaptive materials that fluidly flex and

yankodesign.com

The Steve Jobs Archive shares a little behind-the-scenes look at Jobs’s famous Stanford commencement speech:

The talk generated no small measure of anxiety for Steve. He had attended Reed College for only a few months before dropping out; now he would be speaking to graduates of one of the world’s top research universities, a place that meant a great deal to him. An intensely private man, Steve was not in the habit of talking about his personal journey—but he knew the occasion required it.

Steve Jobs has always had an aura of invincibility around him—a creative genius who could convince those around him and the world of anything he wanted using his “reality distortion field.” But he was also human.

I’m sure you’ve seen it before. But whether you’re 22 years old or 50, his advice still resonates. I love the clarity in this scaled-up version.


Stay Hungry, Stay Foolish

Marking the 20th anniversary of Steve Jobs’ 2005 Stanford commencement speech with a digitally enhanced version of the video as well as a behind-the-scenes look at how it came to be: from firsthand accounts from people who were connected to the commencement to Steve’s personal drafts.

stevejobsarchive.com
Collection of iOS interface elements showcasing Liquid Glass design system including keyboards, menus, buttons, toggles, and dialogs with translucent materials on dark background.

Breaking Down Apple’s Liquid Glass: The Tech, The Hype, and The Reality

I kind of expected it: a lot more ink was spilled on Liquid Glass—particularly on social media. In case you don’t remember, Liquid Glass is the new UI for all of Apple’s platforms. It was announced Monday at WWDC 2025, their annual developers conference.

The criticism is primarily around legibility and accessibility. Secondary reasons include aesthetics and power usage to animate all the bubbles.

I’ve mentioned here before that I’ve been using Macs since 1985. It wasn’t the hardware that drew me in—it was MacPaint. I was always an artistic kid, so being able to paint on a digital canvas seemed thrilling to me. And of course it was back then.

Behind MacPaint was a man named Bill Atkinson. Atkinson died last Thursday, June 5, of pancreatic cancer. In a short remembrance, John Gruber said:

I say this with no hyperbole: Bill Atkinson may well have been the best computer programmer who ever lived. Without question, he’s on the short list. What a man, what a mind, what gifts to the world he left us.

I’m happy that Figma also remembered Atkinson and acknowledged that they are standing on his shoulders.

Every day at Figma, we wrestle with the same challenges Atkinson faced: How do you make powerful tools feel effortless? How do you hide complexity behind intuitive interactions? His fingerprints are on every pixel we push, every selection we make, every moment of creative flow our users experience.


Bill Atkinson’s 10 Rules for Making Interfaces More Human

We commemorate the Apple pioneer whose QuickDraw and HyperCard programs made the Macintosh intuitive enough for nearly anyone to use.

figma.com
Abstract gradient design with flowing liquid glass elements in blue and pink colors against a gray background, showcasing Apple's new Liquid Glass design language.

Quick Notes About WWDC 2025

Apple’s annual developer conference kicked off today with a keynote that announced:

  • Unified Version 26 across all Apple platforms (iOS, iPadOS, macOS, watchOS, tvOS, visionOS)
  • “Liquid Glass” design system. A complete UI and UX overhaul, the first major redesign since iOS 7
  • Apple Intelligence. Continued small improvements, though not the deep integration promised a year ago
  • Full windowing system on iPadOS. Windows comes to iPad! Finally.

Of course, those are the very high-level highlights.

Sebastiaan de With, former designer at Apple and currently co-founder and designer at Lux (makers of Halide, Kino, Spectre, and Orion), imagined what the next era in iOS design might be. (WWDC, Apple’s developer conference, is next week. This is typically when they unveil the new operating systems that will launch in the fall. Rumors are flying as usual.)

But he starts with a history lesson:

Smart people study history to understand the future. If we were to categorize the epochs of iOS design, we could roughly separate them into the Shaded Age, the Adaptive Age, and the New Age.

The Shaded Age, or skeuomorphic age, took inspiration from the Dashboard feature of Mac OS X Tiger. Then came the Adaptive Age, the flat look brought on by the introduction of iOS 7.

de With’s concept mocks for the New Age are fantastic. Based on the physicality of visionOS, with specular highlights and reactive reflections, it’s luscious and reminds me of the first time I ever laid eyes on Aqua—the glossy, candy-like look of the original Mac OS X. Steve Jobs said at its introduction, “…one of the design goals was when you saw it you wanted to lick it.”

Close-up of a glass-rendered user interface

Sebastiaan de With: “Philosophically, if I was Apple, I’d describe this as finally having an interface that matches the beautiful material properties of its devices. All the surfaces of your devices have glass screens. This brings an interface of a matching material, giving the user a feeling of the glass itself coming alive.”


Physicality: the new age of UI

There’s a lot of rumors of a big impending UI redesign from Apple. Let’s imagine what’s (or what could be) next for the design of iPhones, Macs and iPads.

lux.camera

OpenAI is acquiring a hardware company called “io” that Jony Ive cofounded just a year ago:

Two years ago, Jony Ive and the creative collective LoveFrom, quietly began collaborating with Sam Altman and the team at OpenAI.

It became clear that our ambitions to develop, engineer and manufacture a new family of products demanded an entirely new company. And so, one year ago, Jony founded io with Scott Cannon, Evans Hankey and Tang Tan.

We gathered together the best hardware and software engineers, the best technologists, physicists, scientists, researchers and experts in product development and manufacturing. Many of us have worked closely for decades.

The io team, focused on developing products that inspire, empower and enable, will now merge with OpenAI to work more intimately with the research, engineering and product teams in San Francisco.

It has been an open rumor that Sam Altman and Ive have been working together on some hardware. I had assumed they had already formalized their partnership, but I guess not.


There are some bold statements that Ive and Altman make in the launch video, teasing a revolutionary new device that will enable quicker, better access to ChatGPT. Something with a lot less friction than the workflow Altman describes in the video:

If I wanted to ask ChatGPT something right now about something we had talked about earlier, think about what would happen. I would, like, reach down. I would get on my laptop, I’d open it up, I’d launch a web browser, I’d start typing, and I’d have to, like, explain that thing. And I would hit enter, and I would wait, and I would get a response. And that is at the limit of what the current tool of a laptop can do. But I think this technology deserves something much better.

There are a couple of other nuggets about what this new device might be from the statements Ive and Altman made to Bloomberg:

…Ive and Altman don’t see the iPhone disappearing anytime soon. “In the same way that the smartphone didn’t make the laptop go away, I don’t think our first thing is going to make the smartphone go away,” Altman said. “It is a totally new kind of thing.”

“We are obviously still in the terminal phase of AI interactions,” said Altman, 40. “We have not yet figured out what the equivalent of the graphical user interface is going to be, but we will.”

While we don’t know what the form factor will be, I’m sure it won’t be a wearable pin—ahem, RIP Humane. Just to put it out there—I predict it will be a voice assistant in an earbud, very much like the AI in the 2013 movie “Her.” Altman has long been obsessed with the movie, going as far as trying to get Scarlett Johansson to be one of the voices for ChatGPT.

EDIT 5/22/2025, 8:58am PT: Added prediction about the form factor.


Sam and Jony introduce io

Building a family of AI products for everyone.

openai.com

John Gruber wrote a hilarious rant about the single-story a in the iOS Notes app:

I absolutely despise the alternate single-story a glyph that Apple Notes uses. I use Notes every single day and this a bothers me every single day. It hurts me. It’s a childish silly look, but Notes, for me, is one of the most serious, most important apps I use.

Since that sparked some conversation online, he followed up with a longer post about typography in early versions of the Mac system software:

…Apple actually shipped System 1.0 with a version of Geneva with a single-story a glyph — but only in the 9-point version of Geneva. At 12 points (and larger), Geneva’s a was double-story.

To me, it does make sense that 9-point Geneva would have a single-story a, since there are fewer pixels to draw the glyph well and to distinguish it from the lowercase e.


Single-Story a’s in Very Early Versions of Macintosh System 1

A single-story “a” in Chicago feels more blasphemous than that AI image Trump tweeted of himself as the new pope.

daringfireball.net

The New FOX Sports Scorebug

I was sitting on a barstool next to my wife in a packed restaurant in Little Italy. We were the lone Kansas City Chiefs supporters in a nest full of hipster Philadelphia Eagles fans. After Jon Batiste finished his fantastic rendition of the national anthem, and the teams took the field for kickoff, I noticed something. The scorebug—the broadcast industry’s term for the lower-third or chyron graphic at the bottom of the screen—was different, and in a good way.

A Bluesky post praising the minimalistic Super Bowl lower-thirds, with a photo of a TV showing the Chiefs vs. Eagles game and sleek on-screen graphics.

I posted about it seven minutes into the first quarter, saying I appreciated “the minimalistic lower-thirds for this Super Bowl broadcast.” It was indeed refreshing, a break from the over-the-top 3D-animated sparkle. I thought the graphics were clear and utilitarian while being exquisitely designed. They weren’t distracting from the action. As with any good interface design, this new scorebug kept the focus on the players and the game, not itself. I also thought they were a long-delayed response to Apple’s Friday Night Baseball scorebug.

Zuckerberg believes Apple “[hasn’t] really invented anything great in a while…”

Appearing on Joe Rogan’s podcast this week, Meta CEO Mark Zuckerberg said that Apple “[hasn’t] really invented anything great in a while. Steve Jobs invented the iPhone and now they’re just kind of sitting on it 20 years later.”

Let's take a look at some hard metrics, shall we?

I did a search of the USPTO site for patents filed by Apple and Meta since 2007. In that time period, Apple filed for 44,699 patents. Meta, née Facebook, filed for 4,839, or about 10% of Apple’s filings.

Side-by-side screenshots of patent searches from the USPTO database showing results for Apple Inc. and Meta Platforms. The Apple search (left) returned 44,699 results since 2007, while the Meta search (right) returned 4,839 results.

Apple VR headset on a table

Thoughts on Apple Vision Pro

Apple finally launched its Vision Pro “spatial computing” device in early February. We immediately saw TikTok memes of influencers being ridiculous. I wrote about my hope for the Apple Vision Pro back in June 2023, when it was first announced. When preorders opened for Vision Pro in January, I told myself I wouldn’t buy it. I couldn’t justify the $3,500 price tag. Out of morbid curiosity, I would lurk in the AVP subreddits to live vicariously through those who did take the plunge.

After about a month of reading all the positives from users about the device, I impulsively bought an Apple Vision Pro. I placed my order online at noon and picked it up just two hours later at an Apple Store near me.

Many great articles and YouTube videos have already been produced, so this post won’t be a top-to-bottom review of the Apple Vision Pro. Instead, I’ll try to frame it from my standpoint as someone who has designed user experiences for VR.

Apple Vision Pro

Transported into Spatial Computing

After years of rumors and speculation, Apple finally unveiled their virtual reality headset yesterday in a classic “One more thing…” segment in their keynote. Dubbed Apple Vision Pro, this mixed reality device is perfectly Apple: it’s human-first. It’s centered around extending human productivity, communication, and connection. It’s telling that one of the core problems they solved was the VR isolation problem. That’s the issue where users of VR are isolated from the real world; they don’t know what’s going on, and the world around them sees that. Insert meme of oblivious VR user here. Instead, with the Vision Pro, when someone else is nearby, they show through the interface. Additionally, an outward-facing display shows the user’s eyes. These two innovative features help maintain the basic human behavior of acknowledging each other’s presence in the same room.

Promotional image from Apple showing a woman smiling while wearing the Vision Pro headset, with her eyes visible through the front display using EyeSight technology. She sits on a couch in a warmly lit room, engaging with another person off-screen.

I know a thing or two about VR and building practical apps for VR. A few years ago, in the mid-2010s, I cofounded a VR startup called Transported. My cofounders and I created a platform for touring real estate in VR. We wanted to help homebuyers and apartment hunters more efficiently shop for real estate. Instead of zigzagging across town running to multiple open houses on a Sunday afternoon, you could tour 20 homes in an hour on your living room couch. Of course, “virtual tours” existed already. There were cheap panoramas on real estate websites and “dollhouse” tours created using Matterport technology. Our tours were immersive; you felt like you were there. It was the future! There were several problems to solve, including 360° photography, stitching rooms together, building a player, and then most importantly, distribution. Back in 2015–2016, our theory was that Facebook, Google, Microsoft, Sony, and Apple would quickly make VR commonplace because they were pouring billions of R&D and marketing dollars into the space. But it turned out we were a little ahead of our time.