Chickens to Chatbots: Web Design’s Next Evolution
A grid of winners from The FWA in 2009. Source: Rob Ford.
From the early 2000s through the mid-oughts, every designer I knew wanted to be featured on the FWA, a showcase for cutting-edge web design. While many of the earlier sites were Flash-based, it's also where I first encountered parallax scrolling, Paper.js, and Three.js. Back then, websites were meant to be explored and their interfaces discovered.
One of my favorite sites of that era was Burger King’s Subservient Chicken, where users could type free text into a chat box to command a man dressed in a chicken suit. In a full circle moment that perfectly captures where we are today, we now type commands into chat boxes to tell AI what to do.
The Wild West mentality of web design meant designers and creative technologists were free to make things look cool. Agencies like R/GA, Big Spaceship, AKQA, Razorfish, and CP+B all won numerous awards for clients like Nike, BMW, and Burger King. But as with all frontiers, civilization eventually arrives with its rules and constraints.
The Robots Are Looking
Last week, Sam Altman, the CEO of OpenAI, and a couple of others from the company demonstrated Operator, their AI agent. In the demo, they go through a happy path and have Operator book a reservation on OpenTable. The way it works is that the AI agent reads a screenshot of the page and decides how to interact with the UI. (Reminds me of the promise of the Rabbit R1.)
Let me repeat: the AI is interpreting the UI by looking at it. Inputs need to look like inputs. Buttons need to look like buttons. Links need to look like links and be obvious.
In recent years, there's been a push in the web dev community for accessibility, and complying with WCAG standards has become a positive trend. Now we're discovering an unforeseen secondary effect: accessibility also unlocks AI browsing of sites. If links are underlined and form fields are self-evident, an agent like Operator can interpret where to click and where to enter data.
(To be honest, I’m surprised they’re using screenshots instead of interpreting the HTML as automated testing software would.)
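To make the contrast concrete, here's a minimal sketch of that DOM-driven approach using Playwright (the reservation URL, field labels, and button names below are hypothetical). It locates controls by role and label rather than by pixels, and it only works when the underlying markup is semantic:

```ts
import { chromium } from 'playwright';

async function bookTable() {
  const browser = await chromium.launch();
  const page = await browser.newPage();

  // Hypothetical reservation page, standing in for an OpenTable-style flow.
  await page.goto('https://example.com/reserve');

  // Role- and label-based locators resolve through the accessibility tree,
  // so they only find controls that are marked up semantically: a real
  // <button>, a <label> wired to its input, a link that is an <a>.
  await page.getByLabel('Party size').selectOption('2');
  await page.getByLabel('Date').fill('2025-02-14');
  await page.getByRole('button', { name: 'Find a table' }).click();
  await page.getByRole('link', { name: '7:30 PM' }).click();
  await page.getByRole('button', { name: 'Confirm reservation' }).click();

  await browser.close();
}

bookTable().catch(console.error);
```

None of those locators care what the page looks like; they read the same structure a screen reader does. Which is to say, WCAG-friendly markup and machine-readable markup end up being much the same thing.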
The Economics of Change
Since Perplexity and Arc Search came onto the scene last year, the web's economic foundation has started to shift. For the past 30 years, we've been building a networked store of human knowledge designed for people to consume. Sure, marketers and website owners got smart and figured out how to game the system to rank higher on Google. But ultimately, ranking higher led to more clicks and more traffic to your website.
But the digerati are worried. Casey Newton of Platformer, writing about web journalism (emphasis mine):
The death of digital media has many causes, including the ineptitude of its funders and managers. But today I want to talk about another potential rifle on the firing squad: generative artificial intelligence, which in its capacity to strip-mine the web and repurpose it as an input for search engines threatens to remove one of the few pillars of revenue remaining for publishers.
Elizabeth Lopatto, writing for The Verge, points out:
That means that Perplexity is basically a rent-seeking middleman on high-quality sources. The value proposition on search, originally, was that by scraping the work done by journalists and others, Google’s results sent traffic to those sources. But by providing an answer, rather than pointing people to click through to a primary source, these so-called “answer engines” starve the primary source of ad revenue — keeping that revenue for themselves.
Their point is that the fundamental symbiotic economic relationship between search engines and original-content websites is changing. Instead of sending traffic to those websites, search engines and AI answer engines now scrape the content directly and serve it up within their own platforms.
Christopher Butler captures this broader shift in his essay “Who is the internet for?”:
Old-school SEO had a fairly balanced value proposition: Google was really good at giving people sources for the information they need and benefitted by running advertising on websites. Websites benefitted by getting attention delivered to them by Google. In a “clickless search” scenario, though, the scale tips considerably.
This isn’t just about news organizations—it’s about the fundamental relationship between websites, search engines, and users.
The Designer’s Dilemma
As the web is increasingly consumed not by humans but by AI robots, should we as designers continue to care what websites look like? Or, put another way, should we begin optimizing websites for the bots?
The art of search engine optimization, or SEO, was already pushing us in that direction. It turned personality-driven copywriting into "content," with keyword density and headings tuned for the Google machine rather than for poetic organization. But with GPTBot slurping up our websites, should we be more straightforward in our visual designs? Should we add more copy?
Not Dead Yet
It's still too early to know whether AI optimization (AIO?) will become a real thing. Changes in consumer behavior play out over a handful of years, not months. As of November 2024, ChatGPT is eighth on the list of the most visited websites globally, ranked by monthly traffic. Google is first, with 291 times ChatGPT's traffic.
Top global websites by monthly users as of November 2024. Source: SEMRush.
Interestingly, as Google rolled out AI Overviews for many of its search results, the sites cited in those overviews do see a high clickthrough rate, essentially matching the number-one organic spot. It turns out that nearly 40% of us want more details than what the answer engine tells us. That's a good thing.
Clickthrough rates by entities on the Google search results page. Source: FirstPageSage, January 2025.
Finding the Sweet Spot
There’s a fear that AI answer engines and agentic AI will be the death of creative web design. But what if we’re looking at this all wrong? What if this evolution presents an interesting creative challenge instead?
Just as we once pushed the boundaries of Flash and JavaScript to create award-winning experiences for the FWA, designers will need to find innovative ways to work within new constraints. The fact that AI agents like Operator need obvious buttons and clear navigation isn't necessarily a death sentence for creativity—it's just a new set of constraints to work with. After all, some of the most creative periods in web design came from working within technical limitations. (Remember when we did layouts using tables?!)
The accessibility movement has already pushed us to think about making websites more structured and navigable. The rise of AI agents is adding another dimension to this evolution, pushing us to find that sweet spot between machine efficiency and human delight.
From the Subservient Chicken to ChatGPT, from Flash microsites to AI-readable interfaces, web design continues to evolve. The challenge now isn’t just making sites that look cool or rank well—it’s creating experiences that serve both human visitors and their AI assistants effectively. Maybe that’s not such a bad thing after all.