The State of UX in 2025

From design tools, to our design process, to the user behaviors that will change the way we design — a list of what to expect for User Experience (UX) Design in the next year.

trends.uxdesign.cc
Vibrant artistic composition featuring diverse models in striking, colorful fashion. The central figure is dressed in an elaborate orange-red gown, surrounded by models in bold outfits of pink, red, yellow, and orange tones. The background transitions between shades of orange and pink, with the word ‘JAGUAR’ displayed prominently in the center.

A Jaguar Meow

The British automaker Jaguar unveiled its rebrand last week, the first step in relaunching the brand as an all-EV carmaker. Much ink has been spilled about the effort already, most of it negative: design circles have panned the toy-like logotype, while the general town square has mocked the bizarre brand film.

Jaguar’s new brand film

Interestingly, Brand New, the preeminent brand design website, hasn’t weighed in yet. It has decided to wait until after December 2, when Jaguar will unveil the first “physical manifestation of its Exuberant Modernism creative philosophy, in a Design Vision Concept” at Miami Art Week. (Update: Brand New has weighed in with a review of the rebrand. My commentary on it is below.)

There have been some contrarian views, too, decrying the outrage by brand experts. In Print Magazine, Saul Colt writes:

Critics might say this is the death of the brand, but I see it differently. It’s the rebirth of a brand willing to take a stand, turn heads, and claw its way back into the conversation. And that, my friends, is exactly what Jaguar needed to do.

With all due respect to Mr. Colt—and he does make some excellent points in his piece—I’m not in the camp that believes all press is good press. If Jaguar wanted to call attention to itself and make a statement about its new direction, it didn’t need to abandon its nearly 90 years of heritage to do so. A brand is a company’s story over time. Jeff Bezos once said, “Your brand is what people say about you when you’re not in the room.” I’m not so sure this rebrand is leaving the right impression.

Here’s the truth: the average tenure of a chief marketing officer is a short four years, so CMOs feel they need to prove their worth by paying for a brand redesign, complete with a splashy new website and a celebrity-filled ad campaign. But branding alone does not turn around a company—a better product does. Paul Rand, one of the masters of logo design and branding, once said:

A logo derives its meaning from the quality of the thing it symbolizes, not the other way around. A logo is less important than the product it signifies; what it means is more important than what it looks like.

What matters is the thing the logo represents and the meaning instilled in it by others. In other words, it’s not the impression you make but the impression you’re given.

There were many complaints about the artsy, haute couture brand film introducing Jaguar’s new “Copy Nothing” brand ethos. The brand strategy itself is fine, but the execution is terrible. As my friend and notable brand designer Joe Stitzlein says, “At Nike, we used to call this ‘exposing the brief to the end user.’” Elon Musk complained about the lack of cars in the spot, trolling with “Do you sell cars?” Brand campaigns that don’t show the product are fine as long as the spot reinforces what I already know about the brand, so it rings authentic. Apple’s famous “Think Different” ad never showed a computer. Sony’s recent PlayStation “Play Has No Limits” commercial shows no gameplay footage.

Apple’s famous “Think Different” ad never showed a computer.

Sony’s recent PlayStation “Play Has No Limits” commercial doesn’t show any gameplay footage.

All the major automakers are making the transition to electric. None has thrown away its brand to do so. Car marques like Volkswagen, BMW, and Cadillac have made subtle adjustments to their logos to signify an electrified future, but none has ditched its heritage.

Volkswagen’s logo redesign in 2019

Before and after of BMW's logo redesign in 2020

BMW’s logo redesign in 2020

Instead, they’ve debuted EVs like the Mustang Mach-E, the Lyriq, and the Ioniq 5, positioning these vehicles as paths to the future.

Mr. Colt:

The modern car market is crowded as hell. Luxury brands like Porsche and Tesla dominate mindshare, and electric upstarts are making disruption their personal brand. Jaguar was stuck in a lane of lukewarm association: luxury-ish, performance-ish, but ultimately not commanding enoughish to compete.

Hyundai built a splashy campaign around the Ioniq 5, but they didn’t do a rebrand. Instead, they built a cool-looking, retro-future EV that won numerous awards when it launched, including MotorTrend’s 2023 SUV of the Year.

We shall see what Jaguar unveils on December 2. The only teaser shot of the new vehicle concept does look interesting. But the conversation has already started on the wrong foot.

Cropped photo of a new Jaguar concept car


Update

December 3, 2024

As expected, Jaguar unveiled its new car yesterday. Actually, it’s not a new car but a concept car called Type 00. Concept cars are never what actually ships: by the time the required safety equipment is added, including side mirrors and bumpers, the car a consumer can purchase will look drastically different.

Putting aside the aesthetics of the car, the accompanying press release is full of pretension. Appropriate, I suppose, but it feels very much like Jaguar pointing out how cool it is rather than letting the product speak for itself.

Two Jaguar Type 00 concept cars, one blue and one pink


Update 2

December 9, 2024

Brand New has weighed in with a review of the rebrand. Armin Vit ends up liking the work overall because it did what it set out to do—create conversation. However, his readers disagree. As of this writing, the votes are overwhelmingly negative while the comments are more mixed.

Poll results from Brand New showing the overwhelming negative response to the Jaguar rebrand

Griffin AI logo

How I Built and Launched an AI-Powered App

I’ve always been a maker at heart—someone who loves to bring ideas to life. When AI exploded, I saw a chance to create something new and meaningful for solo designers. But making Griffin AI was only half the battle…

Birth of an Idea

About a year ago, a few months after GPT-4 was released and took the world by storm, I worked on several AI features at Convex. One was a straightforward email drafting feature, but with a twist: we incorporated details we knew about the sender, such as their role and offering, and about the recipient, including their role and their company’s industry. To accomplish this, I combined prompt engineering with data from our data providers to shape the responses we got from GPT-4.
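
The prompt-assembly step can be sketched roughly like this. This is a reconstruction for illustration only: the field names and wording are my assumptions, not Convex’s actual code.

```python
# Hypothetical sketch: combine CRM-style data about both parties into the
# prompt that would be sent to GPT-4 as the drafting instruction.
def build_email_prompt(sender: dict, recipient: dict) -> str:
    """Ground the email draft in what we know about sender and recipient."""
    return (
        "Draft a short outreach email.\n"
        f"Sender: {sender['name']}, {sender['role']} selling {sender['offering']}.\n"
        f"Recipient: {recipient['name']}, {recipient['role']} at {recipient['company']} "
        f"({recipient['industry']} industry).\n"
        "Tone: friendly and concise. Reference the recipient's industry."
    )

# Example with made-up data; the resulting string becomes the user message
# in a chat-completion request.
prompt = build_email_prompt(
    {"name": "Ana", "role": "account executive", "offering": "data enrichment"},
    {"name": "Sam", "role": "CTO", "company": "Acme Freight", "industry": "logistics"},
)
```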

Playing with this new technology was incredibly fun and eye-opening. And it gave me an idea. Foundational large language models (LLMs) aren’t great yet at factual data retrieval and analysis. But they’re pretty decent at creativity. No, GPT, Claude, and Gemini couldn’t write an Oscar-winning screenplay or win the Pulitzer Prize for poetry, but they’re not bad for starter ideas that are good enough for specific use cases. Hold that thought.

I belong to a Facebook group for WordPress developers and designers. From the posts in the group, I could see most members were solopreneurs, with very few having worked at a large agency. In my time at Razorfish, Organic, Rosetta, and others, branding projects always included brand strategy, usually a weeks- or months-long endeavor led by brilliant brand or digital strategists. Those brand insights and positioning always led to better work and transformed our relationship with the client into a partnership.

So, I saw an opportunity. Harness the power of gen AI to create brand strategies for this target audience. In my mind, this could allow these solo developers and designers to charge a little more money, give their customers more value, and, most of all, act like true partners.

Validating the Problem Space

The prevailing wisdom is to leverage Facebook groups and Reddit forums to perform cheap—free—market research. However, the reality is that good online communities ban this sort of activity. So, even though I had a captive audience, I couldn’t outright ask. The next best thing for me was paid research. I found Pollfish, an online survey platform that could assemble a panel of 100 web developers who own their own businesses. According to the data, there was overwhelming interest in a tool like this.*

Screenshot of two survey questions showing 79% of respondents would “Definitely buy” or “Probably buy” Griffin AI, and 58% saying they need the app a lot.

Notice the asterisk. We’ll come back to that later on.

I also asked some of my designer and strategist friends who work in branding. They all agreed that there was likely a market for this.

Testing the Theory

I had a vague sense of what the application would be. The cool thing about ChatGPT is that you can bounce ideas back and forth with it, almost as a co-creation partner. But you have to know what to ask, which is why prompt engineering emerged as a skill.

I first tested GPT-3.5’s general knowledge. Did it know about brand strategy? Yes. What about specific books on brand strategy, like Designing Brand Identity by Alina Wheeler? Yes. OK, so the knowledge was in there. I just needed the right prompts to coax out good answers.

I developed a method whereby the prompt reminded GPT of how to come up with the answer and, of course, contained the input from the user about the specific brand.

Screenshot of prompt

Through trial and error and burning through a lot of OpenAI credits, I figured out a series of questions and prompts to produce a decent brand strategy document.
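
The shape of those prompts can be sketched like this. Griffin AI’s real prompts aren’t shown in this piece, so everything below, from the method reminder to the field names, is an illustrative assumption, not the actual product code.

```python
# Hypothetical pattern: (1) remind the model of the strategist's method,
# (2) inject the user's brand details, (3) state the step's task.
METHOD_REMINDER = (
    "You are a senior brand strategist. Before answering, reason through the "
    "brand's audience, competitors, and differentiators, as a strategist would."
)

def build_strategy_prompt(step_question: str, brand_inputs: dict) -> list:
    """Assemble chat messages for one step of the brand-strategy flow."""
    context = "\n".join(f"{k}: {v}" for k, v in brand_inputs.items())
    return [
        {"role": "system", "content": METHOD_REMINDER},
        {"role": "user", "content": f"Brand details:\n{context}\n\nTask: {step_question}"},
    ]

# Example step with made-up brand inputs.
messages = build_strategy_prompt(
    "Write a one-sentence positioning statement.",
    {"name": "Feline Friends Coffee House", "category": "cat cafe"},
)
```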

I tested this flow with a variety of brands, including real ones I knew and fake ones I’d have GPT imagine.

Designing the MVP

The Core Product

Now that I had the conceptual flow, I had to develop a UI to solicit the answers from the user and have those answers inform subsequent prompts. Everything builds on itself.

I first tried an open chat, just like ChatGPT, but with specific questions. The only issue was that I couldn’t limit what the user wrote in the text box.

Early mockup of the chat UI for Griffin AI


AI Prompts as Design

Because the prompts were central to the product design, I decided to add them into my Figma file as part of the flow. In each prompt, I indicated where the user inputs would be injected. Also, most of the answers from the LLM needed to be stored for reuse in later parts of the flow.

Screenshot of app flow in Figma

AI prompts are indicated directly in the Figma file
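
The stored-answer mechanic, where earlier LLM outputs feed later prompts, can be sketched in a few lines. This is my reconstruction of the idea, not Griffin AI’s actual implementation.

```python
# Minimal sketch of prompt chaining: each step's answer is saved, then
# injected into later prompt templates via {placeholders}.
saved = {}

def inject(template: str, store: dict) -> str:
    """Fill {placeholders} in a prompt template with previously saved answers."""
    return template.format(**store)

# Pretend an earlier step produced this positioning statement.
saved["positioning"] = "The cozy cafe where coffee lovers meet adoptable cats."

later_prompt = inject(
    "Given this positioning statement: {positioning}\nPropose three brand values.",
    saved,
)
```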

Living With Imperfect Design

Knowing that I wanted a freelance developer to help me bring my idea to life, I didn’t want to fuss too much about the app design. So, I settled on using an off-the-shelf design system called Flowbite. I just tweaked the colors and typography and lived with the components as-is.

Building the MVP

Building the app myself would be out of my depth. When GPT-3.5 first came out, I test-drove it for writing simple Python scripts. But it failed, and I couldn’t figure out a good workflow to get working code. So I gave up. (Of course, fast-forward to now, and gen AI for coding is much better!)

I posted a job on Upwork and interviewed four developers. I chose Geeks of Kolachi, a development agency out of Pakistan. I picked them because they were an agency—meaning they would be a team rather than an individual. Their process included oversight and QA, which I was familiar with from working at a tech company.

Working Proof-of-Concept in Six Weeks

In just six weeks, I had a working prototype that I could start testing with real users. My first beta testers were friends who graciously gave me feedback on the chat UI.

Through this early user testing, I found that I needed to change the UI. Users wanted more real estate for the generated content, and the free response feedback text field was simply too open, as users didn’t know what to do next.

So I spent another few weekends redesigning the main chat UI, and then the development team needed another three or four weeks to refactor the interface.

Mockup of the revised chat UI

The revised UI gives more room for the main content and allows the user to make their own adjustments.

AI Slop?

As a creative practitioner, I was very sensitive to not developing a tool that would eliminate jobs. The fact is that the brand strategies GPT generated were OK; they were good enough. However, to create a real strategy, a lot more research is required. This would include interviewing prospects, customers, and internal stakeholders, studying the competition, and analyzing market trends.

Griffin AI was a shortcut to producing a brand strategy good enough for a small local or regional business. It was something the WordPress developer could use to inform their website design. However, these businesses would never be able to afford the services of a skilled agency strategist in addition to the logo or website work.

Still, the solo designer could charge a little extra for this branding exercise, or provide more value on top of their normal offering.

I spent a lot of time tweaking the prompts and the flow to produce more than decent brand strategies for the likes of Feline Friends Coffee House (cat cafe), WoofWagon Grooming (mobile pet wash), and Dice & Duels (board game store).

Beyond the Core Product

While the core product was good enough for an MVP, I wanted a valuable feature to justify monthly recurring revenue, aka a subscription. LLMs are pretty good at mimicking voice and tone if you give them enough direction. So I decided to include copywriting as a feature: writing based on the brand voice created once the brand strategy has been developed. ChatGPT isn’t primed to write in a consistent voice out of the box, but it can with the right prompting and context.
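
The voice-consistency trick can be sketched as prepending the saved brand voice as system context to every copywriting request. As before, this is an illustrative assumption about the approach, not Griffin AI’s actual schema.

```python
# Hypothetical sketch: once a brand voice exists, every copywriting call
# carries it as a system message so the output stays in that voice.
def copywriting_messages(brand_voice: str, request: str) -> list:
    """Build chat messages that pin the model to a stored brand voice."""
    return [
        {
            "role": "system",
            "content": (
                "Write all copy in this brand voice:\n"
                f"{brand_voice}\n"
                "Stay consistent in tone, vocabulary, and rhythm."
            ),
        },
        {"role": "user", "content": request},
    ]

# Example with a made-up voice and request.
msgs = copywriting_messages(
    "Warm, playful, and a little nerdy; short sentences; no jargon.",
    "Write a homepage headline for a mobile pet-grooming service.",
)
```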

Screenshots of the Griffin AI marketing site


Beyond those two features, I also had to build ancillary app services like billing, administration, onboarding, tutorials, and help docs. I had to extend the branding and come up with a marketing website. All this ate up weeks more time.

Failure to Launch

They say the last 20% takes 80% of the time, or something like that. And it’s true. The stuff beyond the core features just took a lot of time to perfect. While the dev team was building and fixing bugs, I was on Reddit, trying to gather leads to check out the app in its beta state.

Griffin AI finally launched in mid-June. I made announcements on my social media accounts. Some friends congratulated me and even checked out the app a little. But my agency and tech company friends weren’t the target audience. No, my ideal customer was in that WordPress developers Facebook group where I couldn’t do any self-promotion.

Screenshot of the announcement on LinkedIn

I continued to talk about it on Reddit and everywhere else I could. But the app never gained traction. I wasn’t savvy enough to build momentum and launch on Product Hunt. The Summer Olympics in Paris happened. Football season started. The Dodgers won the World Series. And in the end, I made exactly one sale.

When I told this customer that I was going to shut down the app, he replied, “I enjoyed using the app, and it helped me brief my client on a project I’m working on.” Yup, that was the idea! But not enough people knew about it or thought it was worthwhile to keep it going.

Lessons Learned

I’m shutting Griffin AI down, but I’m not too broken up about it. For me, I learned a lot and that’s all that matters. Call it paying tuition into the school of life.

When I perform a post-mortem on why it didn’t take off, I can point to a few things.

I’m a maker, not a seller.

I absolutely love making and building. And I think I’m not too bad at it. But I hate the actual process of marketing and selling. I believe that had I poured more time and money into getting the word out, I could have attracted more customers. Maybe.

Don’t rely on survey data.

Remember the asterisk? The Pollfish data that showed interest in a product like this? Well, I wonder if it was a good panel at all. In the verbatims, some comments didn’t sound like the respondents were US-based, business owners, or taking the survey seriously. Comments like “i extremely love griffin al for many more research” and “this is a much-needed assistant for my work.” Next time, instead of trusting survey data from a suspect panel, I’ll do more firsthand research before jumping in.

AI moves really fast.

AI has been a rocket ship this past year-and-a-half. Keeping up with the changes and new capabilities is brutal as a side hustle and as a non-engineer. While I thought there might be a market for a specialized AI tool like Griffin, I think people are satisfied enough with a horizontal app like ChatGPT. To break through, you’d have to do something very different. I think Cursor and Replit might be onto something.


I still like making things, and I’ll always be a tinkerer. But maybe next time, I’ll be a little more aware of my limitations and either push past them or find collaborators who can augment my skills.

Closeup of MU/TH/UR 9000 computer screen from the movie Alien: Romulus

Re-Platforming with a Lot of Help From AI

I decided to re-platform my personal website, moving it from WordPress to React. The move was spurred by two things: curiosity to learn a more modern tech stack like React, and the drama that erupted in the WordPress community last month. While I doubt WordPress is going away anytime soon, I do think this rift opens the door for designers, developers, and clients to consider alternatives.

First off, I’m not a developer by any means. I’m a designer and understand technical things well, but I can’t code. When I was young, I wrote programs in BASIC and HyperCard. In the early days of content management systems, I built a version of my personal site using ExpressionEngine. I was always able to tweak CSS to style themes in WordPress. When Elementor came on the scene, I could finally build WP sites from scratch. Eventually, I graduated to other page builders like Oxygen and Bricks.

So, rebuilding my site in React wouldn’t be easy. I went through the React foundations tutorial by Next.js and their beginner full-stack course. But honestly, I just followed the steps and copied the code, barely understanding what was being done and not remembering any syntax. Then I stumbled upon Cursor, and a whole new world opened up.

Screenshot of the Cursor website, promoting it as “The AI Code Editor” designed to boost productivity. It features a “Download for Free” button, a 1-minute demo video, and a coding interface with AI-generated suggestions and chat assistance.

Cursor is an AI-powered code editor (IDE) like VS Code. In fact, it’s a fork of VS Code with AI chat bolted onto the side panel. You can ask it to generate and debug code for you. And it works! I was delighted when I asked it to create a light/dark mode toggle for my website. In seconds, it output code in the chat for three files. I had to go through each code example and apply it to the correct file, but even that’s mostly automatic: I simply accepted or rejected the changes as the diffs showed up in the editor. I had dark mode on my site in less than a minute. I was giddy!

To be clear, it still took about two weekends of work and a lot of trial and error to finish the project. But a non-coder like me, who still can’t understand JavaScript, would not have been able to re-platform their site to a modern stack without the help of AI.

Here are some tips I learned along the way.

Plan the Project and Write a PRD

While I was watching React and Next.js tutorials on YouTube, a video by Jason Zhou about 10xing your Cursor workflow came up. I didn’t watch the whole thing, but his first suggestion was to write a product requirements document, or PRD, which made a lot of sense. So that’s what I did. I wrote a document that spelled out the background (the why), what I wanted the user experience to be, what the functionality should be, and which technologies to use. Not only did this help Cursor understand what it was building, but it also helped me define the functionality I wanted to achieve.

Screenshot of a project requirements document titled “Personal Website Rebuild,” outlining a plan to migrate the site rogerwong.me from WordPress to a modern stack using React, Next.js, and Tailwind CSS. It includes background context, required pages, and navigation elements for the new site.

A screenshot of my PRD

My personal website is a straightforward product when compared to the Reddit sentiment analysis tool Jason was building, but having this document that I could refer back to as I was making the website was helpful and kept things organized.

Create the UI First

I’ve been designing websites since the 1990s, so I’m pretty old school. I knew I wanted to keep the same design as my WordPress site, but I still needed to design it in Figma. I put together a quick mockup of the homepage, which was good enough to jump into the code editor.

I know enough CSS to style elements however I want, but I don’t know any best practices. Thankfully, Tailwind CSS exists. I had heard about it from my engineering coworkers but never used it. I watched a quick tutorial from Lukas, who made it very easy to understand, and I was able to code the design pretty quickly.

Prime the AI

Once the design was in HTML and Tailwind, I felt ready to get Cursor started. In the editor, there’s a chat interface on the right side. You can include the current file, additional files, or the entire codebase for context for each chat. I fed it the PRD and told it to wait for further instructions. This gave Cursor an idea of what we were building.

Make It Dynamic

Then, I included the homepage file and told Cursor to make it dynamic according to the PRD. It generated the necessary code and, more importantly, its thought process and instructions on implementing the code, such as which files to create and which Next.js and React modules to add.

Screenshot of the AI coding assistant in the Cursor editor helping customize Tailwind CSS Typography plugin settings. The user reports issues with link and heading colors, especially in dark mode. The assistant suggests editing tailwind.config.ts and provides code snippets to fix styling.

A closeup of the Cursor chat showing code generation

The UI is well-considered. For each code generation box, Cursor shows the file it should be applied to and an Apply button. Clicking the Apply button will insert the code in the right place in the file, showing the new code in green and the code to be deleted in red. You can either reject or accept the new code.

Be Specific in Your Prompts

The more specific you can be, the better Cursor will work. As I built the functionality piece by piece, I found that the generated code would work better—less error-prone—when I was specific in what I wanted.

When errors did occur, I would simply copy the error and paste it into the chat. Cursor would do its best to troubleshoot. Sometimes, it solved the problem on its first try. Other times, it would take several attempts. I would say Cursor generated perfect code the first time 80% of the time. The remainder took at least another attempt to catch the errors.

Know Best Practices

Screenshot of the Cursor AI code editor with a TypeScript file (page.tsx) open, showing a blog post index function. An AI chat panel on the right helps troubleshoot Tailwind CSS Typography plugin issues, providing a tailwind.config.ts code snippet to fix link and heading colors in dark mode.

Large language models today can’t quite plan. So, it’s essential to understand the big picture and keep that plan in mind. I had to specify the type of static site generator I wanted to build. In my case, just simple Markdown files for blog posts. However, additional best practices include SEO and accessibility. I had to have Cursor modify the working code to incorporate best practices for both, as they weren’t included automatically.

Build Utility Scripts

Since I was migrating my posts and links from WordPress, a fair bit of conversion had to be done to get it into the new format, Markdown. I thought I would have to write my own WordPress plugin or something, but when I asked Cursor how to transfer my posts, it proposed the existing WordPress-to-Markdown script. That was 90% of the work!

I ended up using Cursor to write additional small scripts to add alt text to all the images and to check for broken ones. These utility scripts came in handy for processing 42 posts and 45 links in the linklog.
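
A utility script of that sort can be sketched in a few lines. This is a reconstruction in the spirit of those scripts, not the actual code Cursor generated: scan converted Markdown posts for images that are missing alt text.

```python
import re

# Matches Markdown image syntax: ![alt](src)
IMAGE = re.compile(r"!\[(?P<alt>[^\]]*)\]\((?P<src>[^)]+)\)")

def images_missing_alt(markdown: str) -> list:
    """Return the src of every Markdown image whose alt text is empty."""
    return [
        m.group("src")
        for m in IMAGE.finditer(markdown)
        if not m.group("alt").strip()
    ]

# Example post body with one image missing alt text.
post = "![](/img/hero.png)\n![A concept car](/img/type00.jpg)"
# images_missing_alt(post) -> ["/img/hero.png"]
```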

The Takeaway: Developers’ Jobs Are Still Safe

I don’t believe AI-powered coding tools like Cursor, GitHub Copilot, and Replit will replace developers in the near future. However, I do think these tools have a place in three prominent use cases: learning, hobbying, and acceleration.

For students and those learning how to code, Cursor’s plain language summary explaining its code generation is illuminating. For hobbyists who need a little utilitarian script every once in a while, it’s also great. It’s similar to 3D printing, where you can print out a part to fix the occasional broken something.

Two-panel graphic promoting GitHub Copilot. The left panel states, “Proven to increase developer productivity and accelerate the pace of software development,” with a link to “Read the research.” The right panel highlights “55% Faster coding” with a lightning bolt icon on a dark gradient background.

For professional engineers, I believe this technology can help them do more faster. In fact, that’s how GitHub positions Copilot: “code 55% faster” by using their product. Imagine planning out an app, having the AI draft code for you, and then you can fine-tune it. Or have it debug for you. This reduces a lot of the busy work.

I’m not sure how great the resulting code is. All I know is that it’s working and creating the functionality I want. It might be similar to early versions of Macromedia (now Adobe) Dreamweaver, where the webpage looked good, but when you examined the HTML more closely, it was bloated and inefficient. Eventually, Dreamweaver’s code got better. Similarly, WordPress page builders like Elementor and Bricks Builder generated cleaner code in the end.

Tools like Cursor, Midjourney, and ChatGPT are enablers of ideas. When wielded well, they can help you do some pretty cool things. As a fun add-on to my site, I designed some dingbats—mainly because of my love for 1960s op art and ’70s corporate logos—at the bottom of every blog post. See what happens if you click them. Enjoy.

Photo of Kamala Harris

The Greatest Story Ever Told

I was floored. Under immense pressure, under the highest of expectations, Kamala outperformed, delivering way beyond what anyone anticipated. Her biography is what makes her relatable. It illustrates her values. And her story is the American story.

When she talked about her immigrant parents, I thought about mine. My dad was a cook and a taxicab driver. My mother worked as a waitress. My sister and I grew up squarely in the middle class, in a rented flat in North Beach, a working-class San Francisco neighborhood (yes, back in the 1970s and ’80s it was working class). Our school, though a private parochial one, drew students from around the neighborhood, most of them also kids of immigrants. Education was a top value in our immigrant families, and they made sacrifices to pay for our schooling.

Because they worked so hard, my parents taught my sister and me the importance of dedication and self-determination. Money was always a worry in our household, an unspoken presence permeating every decision. We definitely grew up with a scarcity mindset.

But our parents, especially my dad, taught us the art of the possible. There wasn’t a problem he was unwilling to figure out. He was a jack of all trades who knew how to cook anything, repair anything, and do anything. Though he died when my sister and I were teenagers, his curiosity remained in us, and we knew we could pursue any career we wanted.

With the unwavering support of our mother, we were the first in our extended family to go to college, coming out the other end to pursue white-collar, professional careers, and creative ones at that. We became entrepreneurs, starting small businesses that created jobs.

Kamala Harris’s story and mine are not dissimilar. They’re echoes, variations on the American story of immigrants who come seeking a better life in the greatest country in the world, so that they may give a better life to their children and their children’s children.

The American story changes the further you get away from your original immigrant ancestors — yes, unless your ancestors are indigenous, we’re all descendants of immigrants. But it is still about opportunity; it is still about the art of the possible; it is still about freedom. It is about everyone having a chance.

Kamala ended her speech with “And together, let us write the next great chapter in the most extraordinary story ever told.” It resonated with me and made me emotional, because she captured exactly what it means to me to be an American and to love this country, the only place where an unlikely journey like hers or mine could happen.

Apple VR headset on a table

Thoughts on Apple Vision Pro

Apple finally launched its Vision Pro “spatial computing” device in early February. We immediately saw TikTok memes of influencers being ridiculous. I wrote about my hope for the Apple Vision Pro back in June 2023, when it was first announced. When preorders opened for Vision Pro in January, I told myself I wouldn’t buy it. I couldn’t justify the $3,500 price tag. Out of morbid curiosity, I would lurk in the AVP subreddits to live vicariously through those who did take the plunge.

After about a month of reading all the positives from users about the device, I impulsively bought an Apple Vision Pro. I placed my order online at noon and picked it up just two hours later at an Apple Store near me.

Many great articles and YouTube videos have already been produced, so this post won’t be a top-to-bottom review of the Apple Vision Pro. Instead, I’ll try to frame it from my standpoint as someone who has designed user experiences for VR.

Welcome to the Era of Spatial Computing

Augmented reality, mixed reality, or spatial computing—as Apple calls it—on a “consumer” device is pretty new. You could argue that Microsoft HoloLens did it first, but that didn’t generate the same cultural currency as AVP has, and the HoloLens line has been relegated to industrial applications. The Meta Quest 3, launched last October, also has a passthrough camera, but they don’t market the feature; it’s still sold as a purely virtual reality headset.

Screenshot of the Apple Vision Pro home screen showing floating app icons in an augmented reality workspace. Visible apps include TV, Music, Mindfulness, Settings, Safari, Photos, Notes, App Store, Freeform, Mail, Messages, Keynote, and Compatible Apps, overlaid on a real-world office environment.

Vision Pro Home Screen in my messy home office.

Putting on Vision Pro for the first time is pretty magical. I saw the world around me—though a slightly muted and grainy version of my reality—and I saw UI floating and pinned to reality. Unlike any other headset I’ve tried, there is no screen door effect. I couldn’t see the pixels. It’s genuinely a retina display just millimeters away from my actual retinas. 

The UI is bright, vibrant, and crisp in the display. After launching a weather app from the home “screen” and positioning it on a wall, it stays exactly where it is in my living room. As I move closer to the app, everything about the app remains super sharp. It’s like diving into a UI. 

The visionOS User Interface

The visionOS UI feels very much like an extension of macOS. There’s a lot of translucency, blurred backgrounds for a frosted glass effect, and rounded corners. The controls for moving, closing, and resizing a window feel very natural. There were times when I wished I could rotate a window on its Y-axis to face me better, but that wasn’t possible. 

Admittedly, I didn’t turn on the accessibility feature. But as is, a significant issue that the UI presents is contrast. As someone with no accessibility issues, it was hard to tell half the time when something was highlighted. I would often have to look at another UI component and then back again to make sure a button was actually highlighted.

When you launch a Vision Pro app, it is placed right in front of you. For example, I would look at the Photos app, then click the Digital Crown (the dial for immersion) to bring up the Home Screen, which is then overlaid on top of the app. The background app does get fainter, and I can tell that the new screen is on top of Photos. Launching the Apple TV app from there would bring up the TV window on top of Photos, and I would run into issues where the handles for the windows are really close together, making it difficult to select the right one with my eyes so I can move it.

Window management, in general, is a mess. First of all, there is none. There’s no minimizing of windows; I would have to move them out of the way. There’s no collecting of windows. For instance, I couldn’t set up a workspace with the apps in the right place, collapse them all, and bring them with me to another room in my house. I would have to close them all, reopen them, and reposition them in the new room.

Working in Apple Vision Pro

I was excited to try the Mac Virtual Display feature, where you can see your Mac’s screen inside Vision Pro. Turning this on is intuitive. A “Connect” button appeared just above my MacBook Pro when I looked at it.

The Mac’s screen blacked out, and a large screen appeared inside Vision Pro. I could resize it, move it around, and position it exactly where I wanted it. Everything about this virtual screen was crisp, but I ran into issues.

First, I’m a pretty good typist but cannot touch-type. With the Mac Virtual Display, I need to look down at my keyboard every few seconds. The passthrough camera on the headset is great but not perfect. There is some warping of reality on the edges, and that was just enough to cause a little motion sickness.

Second, when I’m sitting at my desk, I’m used to working with dual monitors. I usually have email or comms software on the smaller laptop screen while I work in Figma, Illustrator, or Photoshop on my larger 5K Apple Studio Display. If I sit at my desk and turn on Mac Virtual Display, I also lose my Studio Display. Only one virtual display shows up in Vision Pro. 

I tried to mitigate the lost space by opening Messages, Spark Email (the iPad version), and Fantastical in Vision Pro and placing those apps around me. But I found switching from my Mac to these other apps cumbersome. I’d have to stop using my mouse and use my fingers instead when I looked at Spark. I found that keyboard focus depended on where my eyes were looking. For example, if I was reading an email in Spark and glanced down at my keyboard to find the “E” key to archive that email, pressing the key before my eyes were back in the Spark window would send that E to whatever app my gaze happened to cross. In other words, my eyes are my cursor, which takes a while to get used to.

Spatial Computing 1.0

It is only the first version of visionOS (currently 1.1). I expect many of these issues, like window management, eye tracking and input confusion, and contrast, to improve in the coming years. 

Native visionOS Apps

In many ways, Apple has been telegraphing what they want to achieve with Vision Pro for years. Apple’s API for augmented reality, ARKit, was released way back in June 2017, a full six years before Vision Pro was unveiled. Some of the early AR apps for Vision Pro are cool tech demos.

Screenshot from Apple Vision Pro using the JigSpace app, showing a detailed 3D augmented reality model of a jet engine overlaid in a modern living room environment.

There’s a jet engine in my living room!

The JigSpace app plunks real-world objects into your living room. I pulled up a working jet engine and was able to peel away the layers to see how it worked. There’s even a Formula 1 race car that you can load into your environment.

The Super Fruit Ninja game was fun. I turned my living room into a fruit-splattered dojo. I could even launch throwing stars from my hands that would get stuck on my walls.

Screenshot from Apple Vision Pro using the Zillow Immerse app, displaying a virtual tour interface overlaid on a dining area. Navigation options such as “Breakfast nook,” “Living room,” and “Kitchen” appear at the bottom, along with a broken 3D floor plan model in the center.

That’s half a floor plan on top of a low-resolution 360° photo.

Some Vision Pro apps were rushed out the door and are just awful. The Zillow Immerse app is one of them. I found the app glitchy and all the immersive house tours very low-quality. The problem is that the environments that ship with Vision Pro are so high-resolution and detailed that anything short of that is jarringly inferior. 

UX Considerations in Vision Pro

Apple Vision Pro can run iPad apps, at least the ones whose developers have enabled the capability. However, I found that many of the touch targets in iPad apps were not big enough. Apple’s Human Interface Guidelines specify that hit targets be at least 44x44 pt, but when those apps open in Vision Pro, that’s insufficient. For visionOS, Apple recommends that controls’ centers be at least 60 pt apart.

I would further recommend that controls in visionOS apps have generous targets. In Apple’s own Photos app, only the accordion arrow in the left sidebar is a control. Looking at and selecting an accordion label like “Spatial” or “Selfies” does nothing; I had to look to the right of the label, at the arrow, in order to select the item. Not great.
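To make sidebar rows like these fully selectable, the whole row, not just the arrow, should be the control. Here’s a minimal SwiftUI sketch of that idea; the `SidebarRow` name and its bindings are illustrative assumptions, not Apple’s actual Photos code:

```swift
import SwiftUI

// Hypothetical sketch: make the entire sidebar row a single control so the
// user can look anywhere on it, not just at the disclosure arrow.
struct SidebarRow: View {
    let title: String            // e.g., “Spatial” or “Selfies”
    @Binding var isExpanded: Bool

    var body: some View {
        Button {
            isExpanded.toggle()
        } label: {
            HStack {
                Text(title)
                Spacer()
                Image(systemName: isExpanded ? "chevron.down" : "chevron.right")
            }
            // visionOS guidance suggests keeping control centers at least
            // 60 pt apart, so give the row a comfortably tall frame...
            .frame(minHeight: 60)
            // ...and make the whole row, including the empty space between
            // label and arrow, register gaze-and-pinch input.
            .contentShape(Rectangle())
        }
        .buttonStyle(.plain)
    }
}
```

The key line is `.contentShape(Rectangle())`: without it, only the visible label and arrow would be hittable, which is exactly the frustration described above.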

Eye and hand tracking in Vision Pro are excellent, although not perfect. There were many times when I couldn’t get the device to register my pinch gesture or get my eyes to a point in a window to resize it.

Some apps take advantage of additional gestures like pinching with both hands and then pulling them apart to resize something. I do believe that more standard gestures need to be introduced in the future for visionOS.

Steve Jobs famously once said, “God gave us ten styluses. Let’s not invent another.” Apple eventually introduced the Pencil for iPad. Likewise, I think that for users to be productive in many applications, Apple will have to introduce a controller.

IMAX in My Bedroom

The single most compelling use case for Apple Vision Pro right now is consuming video content, specifically movies and TV shows. The built-in speakers, which Apple calls audio pods, sound fantastic. Apple has been doing a lot of work in Spatial Audio over the years and I experienced really great surround sound in the Vision Pro. The three apps that currently stand out for video entertainment are IMAX, Disney Plus, and Apple TV. 

Watching content in the IMAX app—only a couple of trailers were free—reminded me of the best IMAX screen I’ve ever been to, the one in the Metreon in San Francisco. The screen is floor-to-ceiling high with a curved railing in front of it. On either side is a backlit IMAX logo, and I could choose from a few different positions in the theater!

Screenshot from Apple Vision Pro using the Disney+ app, showing a virtual Star Wars-themed environment with a sunset over Tatooine. A floating screen displays a scene featuring droids BB-8 and R2-D2, blending immersive AR with cinematic playback.

Watching a Star Wars movie on Tatooine.

Disney leverages its IP very well, giving us various sets in which to watch its content. I could watch Avengers: Endgame from Avengers Tower, Monsters, Inc. from the scare floor, or The Empire Strikes Back from Luke’s landspeeder on Tatooine.

With Apple TV, I could watch Masters of the Air in a window in my space or go into an immersive environment. Whether it was lakeside looking toward Mount Hood, on the surface of the moon, or in a movie theater, the content was the star. My wife goes to sleep before me, and I usually put on my AirPods and watch something on my iPad. With Vision Pro, I could be much more engrossed in the show because the screen is as big as my room.

Still from an Apple Vision Pro commercial showing a person lying on a couch wearing the headset, watching a large virtual screen suspended in the air that displays warplanes flying through clouds. The scene emphasizes immersive home entertainment; caption reads “Apple TV+ subscription required.”

From the Apple commercial “First Timer”

I rewatched Dune from 2021 and was blown away by the audio quality of my AirPods Pro. The movie has incredible sound and uses bass and sub-bass frequencies a lot, so I was surprised at how well the AirPods performed. Of course, I didn’t feel the bass rumble in my chest, but I could certainly hear it in my ears.

Vision Pro Industrial Design

Close-up photo of the Apple Vision Pro headset, showcasing its sleek design with a reflective front visor, external cameras, and adjustable fabric headband resting on a dark surface.

The Vision Pro hardware is gorgeous.

As many others have pointed out, the hardware is incredible. It feels very premium and is a technological marvel. The cool-looking Solo Knit Band works pretty well for me, but every head is so different that your mileage may vary. Every face is also very different, so Apple uses the iPhone’s Face ID scanner to scan your face when you order. The scan determines the exact light seal included with your Vision Pro.

There are 28 different models of light seals. Finding the right light seal to fit my face wasn’t as easy as taking the recommendation from the scan. When I went to pick it up, I opted for a fitting, but the 21W that was suggested didn’t feel comfortable. I tried a couple of other light seal sizes and settled on the most comfortable one. But at home, the device was still very uncomfortable. I couldn’t wear it for more than 10 minutes without feeling a lot of pressure on my cheeks.

The next day, I returned to the Apple Store and tried three or four more light seal and headband combinations. But once dialed in, the headset was comfortable enough for me to watch an hour-long TV show.

I wonder why Apple didn’t develop a method that requires less variation. Wouldn’t a memory-foam-cushioned light seal work?

Apple’s Ambitions

The Apple Vision Pro is an audacious device, and I can tell where they want to go, but they don’t yet have the technology to get there. They want to make AR glasses with crystal-clear, super-sharp graphics that can then be converted to immersive VR with the flick of a dial.

That’s why EyeSight, the screen on the front of the headset, allows people in the surrounding area to see the user’s eyes. The device also has a passthrough camera, allowing the user to see out. Together, these two features allow Vision Pro to act as a clear two-way lens.

But Apple seems to want both AR and VR in the same device, and I would argue that might be physically impossible. Imagine an Apple device more like the HoloLens: true glasses with imagery projected onto them. That would eliminate the smaller-than-competitors’ field of view, or FOV. It would eliminate the ridiculous fitting conundrum, since the glasses could float in front of your eyes. And it would probably reduce the device’s weight, which has been discussed at length in many reviews.

And then, for VR, maybe there’s a conversion that could happen with the AR glasses. A dial could turn the glasses from transparent to opaque. Then, the user would snap on a light-blocking attachment (a light seal). I believe that would be a perfectly acceptable tradeoff.

What $3,500 Buys You

In 1985, when I was 12 years old, I badgered my father daily to buy me a Macintosh computer. I had seen it at ComputerLand, a computer shop on Van Ness Avenue. I would go multiple times per week after school just to mess around with the display unit. I was enamored with MacPaint.

Vintage black-and-white print ad announcing the Apple Macintosh, featuring a hand using a computer mouse and a sketch of the Macintosh computer. The headline reads, “We can put you in touch with Macintosh,” promoting its simplicity and ease of use. The ad is from ComputerLand with the tagline “Make friends with the future.”

After I don’t know how many months, my dad relented and bought me a Macintosh 512K. The retail cost of the machine in 1985 was $2,795, equivalent to $8,000 in 2024 dollars. That’s a considerable investment for a working-class immigrant family. But my wise father knew then that computers were the future. And he was right.

With my Mac, I drew illustrations in MacPaint, wrote all my school essays in MacWrite, and made my first program in HyperCard. Eventually, I upgraded to other Macs and got exposed to and honed my skills in Photoshop and Illustrator, which would help my graphic design career. I designed my first application icon when I was a senior in high school.

Of course, computers are much cheaper today. The $999 entry-level MacBook Air can do what my Mac 512K did and so much more. A kid today armed with a MacBook Air could learn so much!

Which brings us to the price tag of the Apple Vision Pro: it starts at $3,499, for a device where you can’t—at least for now—do much but consume. This was an argument against the iPad for the longest time: that it is primarily a consumption device. Apple went so far as to create a TV spot showing how a group of students use an iPad to complete a school project. And indeed, with an iPad, a lot of creation can happen. There are apps for drawing, 3D sculpting, video editing, writing, brainstorming, and more. It is more than a consumption device.

More than a Consumption Device? Not So Fast.

For Vision Pro, today, I’m not so sure. The obvious use case is 3D modeling and animation. Already, someone is figuring out how to visualize 3D models from Blender in AVP space, though it’s tied to the instance of Blender running on his Mac. 3D modeling and animation software is notoriously complicated. The UI for Cinema 4D, the 3D software I know best, has so many options, commands, and keyboard shortcut combinations that it would be impossible to replicate in visionOS. Or take simpler apps like Final Cut Pro or Photoshop. Both have iPad apps, but a keyboard and mouse make a user so much more productive. Imagine having to look at precisely the right UI element in Vision Pro, then pinch at exactly the right thing in a dense interface like Final Cut Pro. It would be a nightmare.

Screenshot from Apple Vision Pro using the Djay app, showing a realistic virtual DJ setup with turntables and music controls overlaid in a modern living room. A user’s hand interacts with the virtual record player, blending AR and music mixing in real time.

Being creative with djay in Apple Vision Pro

I do think that creative apps will eventually find their way to the platform. One of the launch apps is djay, the DJing app, of course. But it will take some time to figure out.

Beyond that, could a developer program in Vision Pro? If we look to the iPadOS ecosystem, there are a handful of apps for writing code. But there is no way to run your code, at least not natively. Erik Bledsoe from Coder writes, “The biggest hurdle to using an iPad for coding is its lack of a runtime environment for most languages, forcing you to move your files to a server for compiling and testing.” The workaround is to use a cloud-based IDE in the browser, like Coder. I imagine the same limitations will apply to Vision Pro.

The Bottom Line

For $3,500, you could buy a 16-inch MacBook Pro with an M3 Pro chip and an iPhone 15 Pro. Arguably, this would be a much more productive setup. With the Mac, you’d have access to tens of thousands of apps, many for professional applications. With the iPhone, there are nearly five million apps in the App Store.

In other words, I don’t believe buying an Apple Vision Pro today would open a new world up for a teenager. It might be cool and a little inspirational, but it won’t help the creator inside them. It won’t do what the Mac 512K did for me back in 1985.

Vision Pro’s Future

Clearly, the Apple Vision Pro released in 2024 is a first-generation product. Just like with the first-gen Apple Watch, Apple and its customers will need to feel their collective way toward the right use cases. We can look to the Meta Quest 3 and Microsoft HoloLens 2 for a glimpse.

As much as people were marveling at the AR vacuum cleaning game for Vision Pro, AR and VR apps have existed for a while. PianoVision for Meta Quest 3 combines your real piano or keyboard with a Guitar Hero-like game to teach you how to play. The industrial applications for HoloLens make a lot of sense.

Now that Apple is officially in the AR/VR game, developers will show great enthusiasm and invest in the space. At least on Reddit, there’s a lot of excitement from users and developers. We will have to see if the momentum lasts. The key for developers will be the size of the market: will there be enough Vision Pro users to sustain a thriving app ecosystem?

As for me, I decided to return my Vision Pro within the 14-day return window. The only real use case for me was media consumption, and I couldn’t justify spending $3,500 on a room-sized TV that only I could watch. Sign me up for version 2, though.


What Is Brand Strategy and Why Is It So Powerful

Let me tell you a story…

Imagine a smoky wood-paneled conference room. Five men in smart suits sit around a table with a slide projector in the middle. Atop the machine is a finned plastic container that looks like a donut or a bundt cake. A sixth man is standing and begins a pitch.

Technology is a glittering lure, but there’s the rare occasion when the public can be engaged on the level beyond flash, if they have a sentimental bond with the product.

My first job, I was in-house at a fur company with this old pro copywriter—Greek named Teddy. And Teddy told me the most important idea in advertising is “new.” Creates an itch. You simply put your product in there as a kind of calamine lotion.

But he also talked about a deeper bond with the product. Nostalgia. It’s delicate, but potent.

Courtesy of Lions Gate Entertainment, Inc.

Of course, I’m describing an iconic scene from the TV show Mad Men, in which Don Draper, creative director of Sterling Cooper, a mid-level advertising agency on the rise, pitches for Kodak’s business.

Draper weaves a story about technology, newness, and nostalgia. As he clicks through a slideshow of his family on the screen, he channels the desire—no—need of everyone, i.e., consumers, to be loved and how the power of memories can take us there.

Teddy told me that in Greek “nostalgia” literally means “the pain from an old wound.” It’s a twinge in your heart, far more powerful than memory alone. This device isn’t a spaceship. It’s a time machine. It goes backwards, forwards. It takes us to a place where we ache to go again.

It’s not called the Wheel. It’s called the Carousel. It lets us travel the way a child travels. Round and around and back home again, to a place where we know we are loved.

This isn’t brand strategy. However, it is an excellent illustration of how using insights about an audience and the uniqueness of your brand can create a powerful emotional connection. You see, one of Don Draper’s gifts is his instinct about people. He can immediately get deep into a single person’s heart and manipulate them, and he can also apply that skill to audiences. It’s about understanding what makes them tick, what they care about, and then combining their desires with whatever is unique about the brand. (Ironically, in the show, he knows himself the least.)

What is brand strategy? It is identifying the intersection of these two circles of the Venn diagram and finding the emotional truth therein.

What is brand strategy? It's the intersection of Audience and Brand. It's magic.

Understanding the essence of brand strategy

In her seminal book on brand identity, Designing Brand Identity, Alina Wheeler emphasizes that:

Effective brand strategy provides a central, unifying idea around which all behavior, actions, and communications are aligned. It works across all products and services, and is effective over time. The best brand strategies are so differentiated and powerful that they deflect the competition. They are easy to talk about, whether you are the CEO or an employee.

Wheeler goes on to say that brand strategy is deeply rooted in the company’s vision, which is aligned with its leadership and employees, and encapsulates a deep understanding of the customer’s perceptions and needs.

A brand strategy enhances the connection with ideal customers by clearly defining the brand’s value proposition and ensuring the messaging resonates with their needs, preferences, and desires. It streamlines marketing by creating a cohesive narrative across all channels, making it easier to communicate the benefits and unique selling points of products. Furthermore, a solid brand strategy amplifies brand awareness, setting a foundation for consistent and memorable brand experiences, which fosters recognition and loyalty among the target audience.

The core elements of an effective brand strategy

There are five essential elements of brand strategy:

  1. Brand purpose and mission
  2. Consistency in messaging and design
  3. Emotional connection and storytelling
  4. Employee involvement and brand advocacy
  5. Competitive awareness and positioning

Brand purpose and mission

All good brands must exist for some reason beyond the financial. No consumer will have any affinity with a brand that’s only out to make money. Instead, the brand needs to have a higher purpose—a reason for being that is greater than itself. Simon Sinek’s Start with Why is a great primer on why brand purpose is necessary.

A brand’s purpose is then distilled into a succinct statement that acts as the brand’s mission. It is the unifying internal rallying cry for employees so they can share a common purpose.

Branding is consistency in messaging and design

Collage of three images: Woman playing tennis, woman with headphones, abstract pattern.

Target’s brand is very consistent with its white and red color palette.

Keeping the message and design consistent is critical to making a brand stand out. This means always sharing the same core message and look, which helps people recognize and trust the brand. It’s like they’re getting a note from a familiar friend. This builds a strong, trustworthy brand image that people can easily remember, connect with, and love.

Emotional connection and storytelling

Football player diving to catch ball in ad.

Nike celebrates the athlete in all of us.

Creating an emotional connection and weaving compelling storytelling into the fabric of a brand goes beyond mere transactions; it invites the audience into a narrative that resonates on a personal level. Through stories, a brand can illustrate its values, mission, and the impact it aims to have in the world, making its purpose relatable and its vision inspiring. This narrative approach fosters a deeper bond with the audience, turning passive consumers into passionate advocates. Engaging storytelling not only captivates but also enriches the brand experience, ensuring that every interaction is meaningful and memorable.

By integrating authentic stories into the brand strategy, companies can illuminate the human element of their brand, making it more accessible and emotionally appealing to their audience.

Competitive awareness and positioning

Understanding the competitive landscape and strategically positioning the brand within it is crucial. It involves recognizing where your brand stands in relation to competitors and identifying what makes your brand unique through techniques like SWOT analyses and competitive audits. This awareness enables a brand to differentiate itself, highlighting its unique value propositions that appeal to the target audience. By carefully analyzing competitors and the market, a brand can craft a positioning strategy that emphasizes its strengths, addresses consumer needs more effectively, and carves out a distinct space in the consumer’s mind, setting the stage for sustainable growth and loyalty.

More than a logo: The power of storytelling in brand strategy

Man in glasses pondering (maybe crying) during a meeting.

The character Harry Crane reacting to Don Draper’s Carousel pitch.

Brand strategy is much more than just a pretty logo or shiny new website. It’s about creating a meaningful connection with a brand’s audience, as demonstrated by Don Draper’s memorable pitch in Mad Men. The key lies in storytelling and emotional resonance, moving beyond the novelty to forge a genuine bond with customers.

Alina Wheeler’s work further highlights the importance of a unified narrative that aligns with the company’s mission and resonates with both employees and customers. A successful brand strategy differentiates the brand from competitors, not just through its products or services, but through the story it tells and the values it embodies.

To navigate the complexities of brand development effectively, creating a narrative that speaks directly to the audience’s needs and desires is essential. Building a brand is about more than just standing out in the market; it’s about creating a lasting relationship with customers by reflecting their values and aspirations.

What is brand strategy? It’s a secret power.

Apple advertisement: Inspirational tribute to innovative thinkers poster.

Apple’s Think Different campaign celebrated iconoclasts and invited those consumers into their tent.

Not all clients know they need this. Effective brand strategy is key to all successful brands like Nike, Apple, Patagonia, and Nordstrom. It’s the foundation upon which all lasting brands are built. These companies don’t just sell products; they sell stories, experiences, and values that resonate deeply with their customers. These brands stand out not only because of their innovative offerings but also because of their ability to connect with consumers on an emotional level, embedding their products into the lifestyles and identities of their audience. This deep connection is the result of a carefully crafted brand strategy that articulates a clear vision, mission, and set of values that align with those of their target market.

Moreover, an effective brand strategy acts as a guiding star for all of a company’s marketing efforts, ensuring consistency across all touchpoints. It helps businesses understand their unique position in the market, differentiate themselves from competitors, and communicate their message in a compelling and memorable way. By investing in a solid brand strategy, companies can build a robust and cohesive brand identity that attracts and retains loyal customers, driving long-term success and growth. In a world where consumers are bombarded with choices, a well-executed brand strategy is not just a secret power—it’s an essential one.


Why Is Brand Strategy Important

Designing since 1985

I’ve been a designer for as long as I can remember. I usually like to say that I started in the seventh grade, after being tenacious enough to badger my father into buying my first Macintosh computer and then spending hours noodling in MacPaint and MacDraw. Pixel by pixel, I painstakingly drew Christopher Columbus’s ship, the Santa Maria, for a book report cover. I observed the lines of the parabolic exterior of Saint Mary’s Cathedral, known colloquially in San Francisco as “the washing machine,” and replicated them in MacDraw. Of course, that’s not design, but that was the start of my use of the computer to make visuals that communicate. Needless to say, I didn’t know what brand strategy even was, or why it mattered, but we’ll get there.

Pixel art of a woman in traditional attire drawn on an early computer program called MacPaint.

Screenshot of MacPaint (1984)

Amateur hour

The first real logo I designed was for a friend of mine who ran a one-man computer consulting company. Imagine the word “MacSpect” set in Garamond, with a black square preceding it and a wave running through both the shape and the logotype, inverted out of the letters. I thought it was the coolest thing in 1992. But it meant nothing. There was no concept behind it. It borrowed Garamond, Apple’s official typeface at the time, and the invert technique was popular in the late 1980s and early 1990s.

MacSpect logo with stylized typography.

Author’s attempt at recreating his first logo from memory, 32 years later

Concept is king

Fast-forward to my first real design job after design school. One of my early projects was to design a logo for Levi’s. It was not to be their official corporate logo, but instead, it was for a line of clothing called Americana. It would be used on hangtags and retail store signage. I ended up designing a distressed star—grunge was the shit in the mid-1990s—with a black and white inverted bottle cap pattern behind it. (There’s that inverting again!) Even though this was just as trendy as my student-level MacSpect logo, this mark worked. You see, the Levi’s brand has always been rooted in American authenticity, with its history going back to the Gold Rush in the mid-1800s. The distress in the logo represented history. The star shape was a symbol of America. And the pattern in the circle is straight from the label on every pair of Levi’s jeans.

This logo worked because it was rooted in a concept, or put another way, rooted in strategy. And this is where I learned why brand strategy was important to design.

Levi's jeans logo with star design

Why is brand strategy important? Why does it matter?

Designing something visually appealing is easy. Find some inspiration on Instagram, Dribbble, or Behance, put your spin on it, and call it a day. But what you create won’t be timeless. In fact, its shelf life will be as long as the trend lasts. A year? Two at best?

Collage of various user interface design examples. Why is brand strategy important? So you can avoid being the same as everyone else.

Trends like neumorphism come and go quickly

But if your design is rooted in brand strategy—concepts representing the essence of the brand you’re designing for—your creation will last longer. (I won’t say forever because eventually, all logos are redesigned, usually based on the whims of the new marketing person who takes charge.)

Brand strategy is the art of distilling a brand

Big design, branding, marketing, or advertising agencies have dedicated brand strategists. Clients pay a premium for their expertise because they can distill the essence of a brand into key pillars. The process is not unlike talking to a friend about a problem and then having them get to the heart of the matter because they know you and have an objective point of view. For a client, seeing excellent brand strategy deliverables is often jaw-dropping because strategists can articulate the brand better than they can. Their secret isn’t telling clients something they don’t know. Instead, the secret is revealing what they know in their hearts but can’t express.

Woman in orange with a wizard's hat conversing with man sitting.

Brand strategists work their magic by being therapists to clients. (Midjourney)

How do brand strategists work their magic? Through research and by acting as therapists, in a way. They listen and then reflect what they hear and learn.

Branding is more than just creative work

The brand insights articulated by brand strategists are typically used to inform the creative work. From logos to slogans, from landing pages to Instagram posts, all the creative is rooted in the pillars of the brand. That way, the brand’s audience experiences a consistent voice.

However, what clients find most valuable is the illumination of their brand purpose and company mission. You see, brand strategy also crosses into business strategy. They’re not one and the same, but there is overlap. The purpose and mission of a company help align employees and partners. They help with product or service development—the very future of the company.

This is why Simon Sinek’s “Start with why” talk from 2009 resonated with so many business leaders. It’s about purpose and mission. “Why” also happens to be the root of great branding.


Brand strategy is the foundation for building brands—and the companies they represent. And the partner agencies that create that brand strategy for them are invaluable.

Offering brand strategy can propel you from “vendor” to “partner”

Clients will call freelancers and agencies “vendors,” lumping them into the same category as those who sell them copy paper. To move from being thought of as a vendor to being treated as a partner, offering brand strategy is crucial.

Nearly all clients outside the Fortune 500 won’t know what brand strategy is or why it matters. But once they see it, they’ll come to appreciate it.

This shift demands not just skill but a change in mindset. As a freelancer or small agency owner, your value lies in weaving brand stories, not just creating aesthetically pleasing designs and building websites. Your work should mirror the brand’s ethos and vision, making you an essential part of your client’s journey.

Apple Vision Pro

Transported into Spatial Computing

After years of rumors and speculation, Apple finally unveiled its virtual reality headset yesterday in a classic “One more thing…” segment of its keynote. Dubbed Apple Vision Pro, this mixed reality device is perfectly Apple: it’s human-first. It’s centered on extending human productivity, communication, and connection. It’s telling that one of the core problems Apple solved was the VR isolation problem: users of VR are cut off from the real world; they don’t know what’s going on around them, and the people around them can see that. Insert meme of oblivious VR user here. With the Vision Pro, when someone else comes nearby, they appear through the interface. Additionally, an outward-facing display shows the user’s eyes. These two innovative features help maintain the basic human behavior of acknowledging each other’s presence in the same room.

Promotional image from Apple showing a woman smiling while wearing the Vision Pro headset, with her eyes visible through the front display using EyeSight technology. She sits on a couch in a warmly lit room, engaging with another person off-screen.

I know a thing or two about VR and building practical apps for VR. A few years ago, in the mid-2010s, I cofounded a VR startup called Transported. My cofounders and I created a platform for touring real estate in VR. We wanted to help homebuyers and apartment hunters more efficiently shop for real estate. Instead of zigzagging across town running to multiple open houses on a Sunday afternoon, you could tour 20 homes in an hour on your living room couch. Of course, “virtual tours” existed already. There were cheap panoramas on real estate websites and “dollhouse” tours created using Matterport technology. Our tours were immersive; you felt like you were there. It was the future! There were several problems to solve, including 360° photography, stitching rooms together, building a player, and then most importantly, distribution. Back in 2015–2016, our theory was that Facebook, Google, Microsoft, Sony, and Apple would quickly make VR commonplace because they were pouring billions of R&D and marketing dollars into the space. But it turned out we were a little ahead of our time.

Consumers didn’t take to VR as all the technologists predicted. Headsets were still cumbersome. The best device in the market then was the Oculus Rift, which had to be tethered to a high-powered PC. When the Samsung Gear VR launched, it was a game changer for us because the financial barrier to entry was dramatically lowered. But despite the big push from all these tech companies, the consumer adoption curve still wasn’t great.

For our use case—home tours—consumers were fine with the 2D Matterport tours. They didn’t want to put on a headset. Transported withered as the gaze from the tech companies wandered elsewhere. Oculus continued to come out with new hardware, but the primary applications have all been entertainment. Practical uses for VR never took off. Despite Meta’s recent metaverse push, VR was still seen as a sideshow, a toy, and not the future of computing.

Until yesterday.

Blurry, immersive view of a cozy living room with the centered text “Welcome to the era of spatial computing,” representing the Apple Vision Pro experience and its introduction to augmented reality.

Apple didn’t coin the term “spatial computing.” The credit belongs to Simon Greenwold, who, in 2003, defined it as “human interaction with a machine in which the machine retains and manipulates referents to real objects and spaces.” But with the headline “Welcome to the era of spatial computing,” Apple brilliantly reminds us that VR has practical use cases. They take a position opposite of the all-encompassing metaverse playland that Meta has staked out. They’ve redefined the category and may have breathed life back into it.

Beyond marketing, Apple has solved many of the problems that have plagued VR devices.

  • Isolation: As mentioned at the beginning of this piece, Apple seems to have solved the isolation issue with what they’re calling EyeSight. People around you can see your eyes, and you can see them inside Vision Pro.
  • Comfort: One of the biggest complaints about the Oculus Quest is its heaviness on your face. Apple solves this with a wired battery pack that users put into their pockets, thus moving that weight off their heads. But it is a tether.
  • Screen door effect: Even though today’s screens have very small pixels, VR users can still make out individual pixels because the display sits so close to their eyes. This is called the “screen door effect” because you can see the lines between the screen’s pixels. The Quest 2 is roughly HD quality (1832×1920) per eye; Apple Vision Pro will double that to 4K quality per eye. We’ll have to see whether the effect is truly eliminated once reviewers get their hands on test units.
  • Immersive audio: Building on the spatial audio technology they debuted with AirPods Pro, Vision Pro will have immersive audio to transport users to new environments.
  • Control: One of the biggest challenges in VR adoption has been controlling the user interface. Handheld game controllers are not intuitive for most people. In the real world, you look at something to focus on it, and you use your fingers and hands to manipulate objects. Vision Pro looks to overcome this usability issue with eye tracking and finger gestures.
  • Performance: Rendering 3D spaces in real-time requires a ton of computing and graphics-processing power. Apple’s move to its own M-series chips leapfrogs those available on competitors’ devices.
  • Security: In the early days of the Oculus Rift, users had to take off their headsets in the middle of setup to create and log into an online account. More recently, Meta mandated that Oculus users log in with their Facebook accounts. I’m not sure what Vision Pro’s setup process looks like, but privacy-focused Apple has built on its Face ID technology to create an iris-scanning technology called Optic ID. It identifies the specific person, so it’s as secure as a password. And your surroundings, captured by the external cameras, are processed on-device.
  • Cross-platform compatibility: If Vision Pro is to be used for work, it will need to be cross-platform. In Apple’s presentation, FaceTime calls in VR didn’t exclude non-VR participants. Their collaborative whiteboard app, Freeform, looked to be usable on Vision Pro.
  • Development frameworks: There are 1.8 million apps in Apple’s App Store developed using Apple’s developer toolkits. From the presentation, it looked like converting existing iOS and possibly macOS apps to be compatible with visionOS should be trivial. Additionally, Apple announced they’re working with Unity to help developers bring their existing apps—games—to Vision Pro.

Person wearing an Apple Vision Pro headset stands at a desk in a loft-style office, interacting with multiple floating app windows in augmented reality. The text reads, “Free your desktop. And your apps will follow.” promoting spatial computing.

While Apple Vision Pro looks to be a technological marvel that has been years in the making, I don’t think it’s without its faults.

  • Tether: The Oculus Quest was a major leap forward. Free from being tethered to a PC, games like Beat Saber were finally possible. While Vision Pro isn’t tethered to a computer, there is the cord to the wearable battery pack. Apple has been in a long war against wires—AirPods, MagSafe charging—and now they’ve introduced a new one.
  • Price: At $3,500, it’s as expensive as the highest-end 16-inch MacBook Pro. This is not a toy, and not for everyday consumers. It’s more than ten times the price of an Oculus Quest 2 ($300) and more than six times that of a Sony PlayStation VR2 headset ($550). I’m sure the “Pro” designation softens the blow a little.

Apple Vision Pro will ship in early 2024. I’m excited by the possibilities of this new platform. Virtual reality has captured the imagination of science-fiction writers, futurists, and technologists for decades. Being able to completely immerse yourself into stories, games, and simulations by just putting on a pair of goggles is very alluring. The technology has had fits and starts. And it’s starting again.