
35 posts tagged with “technology”

Worn white robots with glowing pink eyes, one central robot displaying a pink-tinted icon for ChatGPT Atlas, in a dark alley with pink neon circle

OpenAI’s ChatGPT Atlas Browser Needs Work

Like many people, I tried OpenAI’s ChatGPT Atlas browser last week. I immediately made it my daily driver, seeing if I could make the best of it. Tl;dr: it’s still early days and I don’t believe it’s quite ready for primetime. But let’s back up a bit.

The Era of the AI Browser Is Here

Back in July, I reviewed both Comet from Perplexity and Dia from The Browser Company. It was a glimpse of the future that I wanted. I concluded:

The AI-powered ideas in both Dia and Comet are a step change. But the basics also have to be there, and in my opinion, should be better than what Chrome offers. The interface innovations that made Arc special shouldn’t be sacrificed for AI features. Arc is/was the perfect foundation. Integrate an AI assistant that can be personalized to care about the same things you do so its summaries are relevant. The assistant can be agentic and perform tasks for you in the background while you focus on more important things. In other words, put Arc, Dia, and Comet in a blender and that could be the perfect browser of the future.

There were also open rumors that OpenAI was working on a browser of their own, so the launch of Atlas was inevitable.

With Cursor and Lovable as the darlings of AI coding tools, don’t sleep on Claude Code. Personally, I’ve been splitting my time between Claude Code and Cursor. While Claude Code’s primary audience is coders and tinkerers, it can be used for so much more.

Lenny Rachitsky calls it “the most underrated AI tool for non-technical people.”

The key is to forget that it’s called Claude Code and instead think of it as Claude Local or Claude Agent. It’s essentially a super-intelligent AI running locally, able to do stuff directly on your computer—from organizing your files and folders to enhancing image quality, brainstorming domain names, summarizing customer calls, creating Linear tickets, and, as you’ll see below, so much more.

Since it’s running locally, it can handle huge files, run much longer than the cloud-based Claude/ChatGPT/Gemini chatbots, and it’s fast and versatile. Claude Code is basically Claude with even more powers.

Rachitsky shares 50 of his “favorite and most creative ways non-technical people are using Claude Code in their work and life.”

Everyone should be using Claude Code more


How to get started, and 50 ways non-technical people are using Claude Code in their work and life

lennysnewsletter.com

Slow and steady wins the race, so they say. And in Waymo’s case, that’s true. Unlike the stereotypical Silicon Valley of “Move fast and break things,” Waymo has been very deliberate and intentional in developing its self-driving tech. In other words, they’re really trying to account for the unintended consequences.

Writing for The Atlantic, Saahil Desai:

Compared with its robotaxi competitors, “Waymo has moved the slowest and the most deliberately,” [Bryant Walker Smith] said—which may be a lesson for the world’s AI developers. The company was founded in 2009 as a secretive project inside of Google; a year later, it had logged 1,000 miles of autonomous rides in a tricked-out Prius. Close to a decade later, in 2018, Waymo officially launched its robotaxi service. Even now, when Waymos are inching their way into the mainstream, the company has been hypercautious. The company is limited to specific zones within the five cities it operates in (San Francisco, Phoenix, Los Angeles, Austin, and Atlanta). And only Waymo employees and “a growing number of guests” can ride them on the highway, Chris Bonelli, a Waymo spokesperson, told me. Although the company successfully completed rides on the highway years ago, higher speeds bring more risk for people and self-driving cars alike. What might look like a few grainy pixels to Waymo’s cameras one moment could be roadkill to swerve around the very next.

Move Fast and Break Nothing


Waymo’s robotaxis are probably safer than ChatGPT.

theatlantic.com

The web is a magical place. It started out as a way to link documents like research papers across the internet, but has evolved into the representation of the internet and the place where we get information and get things done. Writer Will Leitch on Medium:

It is difficult to describe, to a younger person or, really, anyone who wasn’t there, what the emergence of the Internet — this thing that had not been there your entire life, that you had no idea existed, that was suddenly just everywhere — meant to someone who wanted to write. When I graduated college in 1997, the expectation for me, and most wanna-be writers, was that we had two options: Start on the bottom rung of a print publication and toil away for years, hoping that enough people with jobs above you would retire or die in time for you to get a real byline by the time you were 40, or write a brilliant novel or memoir that turned you into Dave Eggers or Elizabeth Wurtzel. That was pretty much it! Then, suddenly, from the sky, there was this place where you could:

  • Write whatever you wanted.
  • Write as long as you wanted.
  • Have your work available to read by anyone, anywhere on the entire freaking planet.

This was — and still is — magical.

The core of Leitch’s argument is that while the business and traffic models that fueled web publishing are collapsing, thanks to the changing priorities of platforms like Google and the dominance of video on social media (i.e., TikTok and Reels), the essential, original magic of publishing on the web isn’t dead.

But that does not mean that Web publishing — that writing on the Internet, the pure pleasure of putting something out in the world and having it be yours, of discovering other people who are doing the same thing — itself is somehow dead, or any less magical than it was in the first place. Because it is magical. It still is. It always was.

It’s the (Theoretical) End of Web Publishing (and I Feel Fine)


Let’s remember why we started publishing on the Web in the first place.

williamfleitch.medium.com
A computer circuit board traveling at warp speed through space with motion-blurred light streaks radiating outward, symbolizing high-performance computing and speed.

The Need for Speed: Why I Rebuilt My Blog with Astro

Two weekends ago, I quietly relaunched my blog. It was a heart transplant, really: the same design I’d launched in late March, rebuilt underneath.

The First Iteration

Back in early November of last year, I re-platformed from WordPress to a home-grown, Cursor-made static site generator. I’d write in Markdown and push code to my GitHub repository, and the post was published via Vercel’s continuous deployment feature. The design was simple, and it was a great learning project for me.

Tim Berners-Lee, the father of the web who gave away the technology for free, says that we are at an inflection point with data privacy and AI. But before he makes that point, he reminds us that we are the product:

Today, I look at my invention and I am forced to ask: is the web still free today? No, not all of it. We see a handful of large platforms harvesting users’ private data to share with commercial brokers or even repressive governments. We see ubiquitous algorithms that are addictive by design and damaging to our teenagers’ mental health. Trading personal data for use certainly does not fit with my vision for a free web.

On many platforms, we are no longer the customers, but instead have become the product. Our data, even if anonymised, is sold on to actors we never intended it to reach, who can then target us with content and advertising. This includes deliberately harmful content that leads to real-world violence, spreads misinformation, wreaks havoc on our psychological wellbeing and seeks to undermine social cohesion.

And about that fork in the road with AI:

In 2017, I wrote a thought experiment about an AI that works for you. I called it Charlie. Charlie works for you like your doctor or your lawyer, bound by law, regulation and codes of conduct. Why can’t the same frameworks be adopted for AI? We have learned from social media that power rests with the monopolies who control and harvest personal data. We can’t let the same thing happen with AI.


Why I gave the world wide web away for free

My vision was based on sharing, not exploitation – and here’s why it’s still worth fighting for

theguardian.com

Here’s a fun visual essay about artist Yufeng Zhao’s piece “Alt Text in NYC.” It’s essentially a visual search engine that searches all the text (words) on the streets of New York City. The dataset comprises over eight million photos from Google Street View! Matt Daniels, writing for The Pudding:

The result is a search engine of much of what’s written in NYC’s streets. It’s limited to what a Google Street View car can capture, so it excludes text in areas such as alleyways and parks, or any writing too small to be read by a moving vehicle.

The scale of the data is immense: over 8 million Google Street View images (from the past 18 years) and 138 million identified snippets of text.

Just over halfway down the article, there is a list of the top 1,000 words in the data. Most are expected words from traffic signs like “stop.” But number twenty-five is “Fedders,” the logo of an air-conditioner brand popular in the 1950s to the 1990s. They’re all over the exteriors of the city’s buildings.

Best viewed on your computer, IMHO.


NYC’s Urban Textscape

Analyzing All of the Words Found on NYC Streets

pudding.cool

Here’s a fun project from Étienne Fortier-Dubois. It is both a timeline of tech innovations throughout history and a family tree. For example, the invention of the wheel led to chariots, and the ancestors of the bulletin board system were the home computer and the modem. From the about page:

The historical tech tree is a project by Étienne Fortier-Dubois to visualize the entire history of technologies, inventions, and (some) discoveries, from prehistory to today. Unlike other visualizations of the sort, the tree emphasizes the connections between technologies: prerequisites, improvements, inspirations, and so on.

These connections allow viewers to understand how technologies came about, at least to some degree, thus revealing the entire history in more detail than a simple timeline, and with more breadth than most historical narratives. The goal is not to predict future technology, except in the weak sense that knowing history can help form a better model of the world. Rather, the point of the tree is to create an easy way to explore the history of technology, discover unexpected patterns and connections, and generally make the complexity of modern tech feel less daunting.


Historical Tech Tree

Interactive visualization of technological history

historicaltechtree.com

Jessica Davies reports that new publisher data suggests some sites are getting as much as 25% less traffic from Google than the previous year.

Writing in Digiday:

Organic search referral traffic from Google is declining broadly, with the majority of DCN member sites — spanning both news and entertainment — experiencing traffic losses from Google search between 1% and 25%. Twelve of the respondent companies were news brands, and seven were non-news.

Jason Kint, CEO of DCN, says that this is a “direct consequence of Google AI Overviews.”

I wrote previously about the changing economics of the web here, here, and here.

And related, Eric Mersch writes in a LinkedIn post that Monday.com’s stock fell 23% after co-CEO Roy Mann said, “We are seeing some softness in the market due to Google algorithm,” during their Q2 earnings call, and the analysts kept hammering him and the CFO about how the algorithm changes might affect customer acquisition.

Analysts continued to press the issue, which caught company management completely off guard. Matthew Bullock from Bank of America Merrill Lynch asked frankly, “And then help us understand, why call this out now? How did the influence of Google SEO disruption change this quarter versus 1Q, for example?” The CEO could only respond, “So look, I think like we said, we optimize in real-time. We just budget daily,” implying that they were not aware of the problem until they saw Q2 results.

This is the first public sign that the shift from Google to AI-powered searches is having an impact.


Google AI Overviews linked to 25% drop in publisher referral traffic, new data shows

The majority of Digital Content Next publisher members are seeing traffic losses from Google search between 1% and 25% due to AI Overviews.

digiday.com

Yesterday, OpenAI launched GPT-5, their latest and greatest model, which replaces the confusing assortment of GPT-4o, o3, o4-mini, etc. with just two options: GPT-5 and GPT-5 Pro. Reasoning is built in, and the new model is smart enough to know when to think harder and when a quick answer suffices.

Simon Willison deep dives into GPT-5, exploring its mix of speed and deep reasoning, massive context limits, and competitive pricing. He sees it as a steady, reliable default for everyday work rather than a radical leap forward:

I’ve mainly explored full GPT-5. My verdict: it’s just good at stuff. It doesn’t feel like a dramatic leap ahead from other LLMs but it exudes competence—it rarely messes up, and frequently impresses me. I’ve found it to be a very sensible default for everything that I want to do. At no point have I found myself wanting to re-run a prompt against a different model to try and get a better result.

It’s a long technical read but interesting nonetheless.


GPT-5: Key characteristics, pricing and model card

I’ve had preview access to the new GPT-5 model family for the past two weeks (see related video) and have been using GPT-5 as my daily-driver. It’s my new favorite …

simonwillison.net

It’s no secret that I am a big fan of Severance, the Apple TV+ show that has 21 Emmy nominations this year. I made a fan project earlier in the year that generates Outie facts for your Innie.

After launching a teaser campaign back in April, Atomic Keyboard is finally taking pre-orders for their Severance-inspired keyboard just for Macrodata Refinement department users. The show based the MDR terminals on the Data General Dasher D2 terminal from 1977. So this new keyboard includes three layouts:

  1. “Innie” which is show-accurate, meaning no Escape, no Option, and no Control keys, and includes the trackball
  2. “Outie,” a 60% layout that includes modern modifier keys and the trackball
  3. “Dasher” which replicates the DG terminal layout

It’s not cheap. The final retail price will be $899, but they’re offering a pre-Kickstarter price of $599.


MDR Dasher Keyboard | For Work That's Mysterious & Important

Standard equipment for Macrodata Refinement: CNC-milled body, integrated trackball, modular design. Please enjoy each keystroke equally.

mdrkeyboard.com

As a certified Star Wars geek, I love this TED talk from ILM’s Rob Bredow. For the uninitiated, Industrial Light & Magic, or ILM, is the company that George Lucas founded to make all the special effects for the original and subsequent Star Wars films. The firm has been an award-winning pioneer in special and visual effects, responsible for the dinosaurs in Jurassic Park, the liquid-metal T-1000 in Terminator 2: Judgment Day, and the de-aging of Harrison Ford in Indiana Jones and the Dial of Destiny.

The point Bredow makes is simple: ILM creates technology in service of the storyteller, or creative.

I believe that we’re designed to be creative beings. It’s one of the most important things about us. That’s one of the reasons we appreciate and we just love it when we see technology and creativity working together. We see this on the motion control on the original “Star Wars” or on “Jurassic Park” with the CG dinosaurs for the first time. I think we just love it when we see creativity in action like this. Tech and creative working together. If we fast forward to 2020, we can see the latest real-time virtual production techniques. This was another creative innovation driven by a filmmaker. In this case, it’s Jon Favreau, and he had a vision for a giant Disney+ “Star Wars” series.

He later goes on to show a short film test made by a lone artist at ILM using an internal AI tool. It features never-before-seen creatures that could exist in the Star Wars universe. I mean, for now they look like randomized versions of Earth animals and insects, but if you squint, you can see where the technology is headed.

Bredow goes on…

Now the tech companies on their own, they don’t have the whole picture, right? They’re looking at a lot of different opportunities. We’re thinking about it from a filmmaking perspective. And storytellers, we need better artist-focused tools. Text prompts alone, they’re not great ways to make a movie. And it gets us excited to think about that future where we are going to be able to give artists these kinds of tools.

Again, artists—or designers, or even more broadly, professionals—need fine-grained control to adjust the output of AI.

Watch the whole thing. Instead of a doom and gloom take on AI, it’s an uplifting one that shows us what’s possible.

Star Wars Changed Visual Effects — AI Is Doing It Again

Jedi master of visual effects Rob Bredow, known for his work at Industrial Light & Magic and Lucasfilm, takes us on a cinematic journey through the evolution of visual effects, with behind-the-scenes stories from the making of fan favorites like “Jurassic Park,” “Star Wars,” “Indiana Jones” and more. He shares how artist-driven innovation continues to blend old and new technology, offering hope that AI won’t replace creatives but instead will empower artists to create new, mind-blowing wonders for the big screen. (Recorded at TED2025 on April 8, 2025)

youtube.com

I found this post from Tom Blomfield to be pretty profound. We’ve seen interest in universal basic income from Sam Altman and other leaders in AI, as they’ve anticipated the decimation of white collar jobs in coming years. Blomfield crushes the resistance from some corners of the software developer community in stark terms.

These tools [like Windsurf, Cursor and Claude Code] are now very good. You can drop a medium-sized codebase into Gemini 2.5’s 1 million-token context window and it will identify and fix complex bugs. The architectural patterns that these coding tools implement (when prompted appropriately) will easily scale websites to millions of users. I tried to expose sensitive API keys in front-end code just to see what the tools would do, and they objected very vigorously.

They are not perfect yet. But there is a clear line of sight to them getting very good in the immediate future. Even if the underlying models stopped improving altogether, simply improving their tool use will massively increase the effectiveness and utility of these coding agents. They need better integration with test suites, browser use for QA, and server log tailing for debugging. Pretty soon, I expect to see tools that allow the LLMs to step through the code and inspect variables at runtime, which should make debugging trivial.

At the same time, the underlying models are not going to stop improving. They will continue to get better, and these tools are just going to become more and more effective. My bet is that the AI coding agents quickly beat the top 0.1% of human performance, at which point it wipes out the need for the vast majority of software engineers.

He quotes the Y Combinator stat I cited in a previous post:

About a quarter of the recent YC batch wrote 95%+ of their code using AI. The companies in the most recent batch are the fastest-growing ever in the history of Y Combinator. This is not something we say every year. It is a real change in the last 24 months. Something is happening.

Companies like Cursor, Windsurf, and Lovable are getting to $100M+ revenue with astonishingly small teams. Similar things are starting to happen in law with Harvey and Legora. It is possible for teams of five engineers using cutting-edge tools to build products that previously took 50 engineers. And the communication overhead in these teams is dramatically lower, so they can stay nimble and fast-moving for much longer.

And for me, this is where the rubber meets the road:

The costs of running all kinds of businesses will come dramatically down as the expenditure on services like software engineers, lawyers, accountants, and auditors drops through the floor. Businesses with real moats (network effect, brand, data, regulation) will become dramatically more profitable. Businesses without moats will be cloned mercilessly by AI and a huge consumer surplus will be created.

Moats are now more important than ever. Non-tech companies—those that rely on tech companies to make software for them, specifically B2B vertical SaaS—are starting to hire developers. How soon will they discover Cursor if they haven’t already? These next few years will be incredibly interesting.

Tweet by Tom Blomfield comparing software engineers to farmers, stating AI is the “combine harvester” that will increase output and reduce need for engineers.

The Age Of Abundance

Technology clearly accelerates human progress and makes a measurable difference to the lives of most people in the world today. A simple example is cancer survival rates, which have gone from 50% in 1975 to about 75% today. That number will inevitably rise further because of human ingenuity and technological acceleration.

tomblomfield.com

Elizabeth Goodspeed, writing for It’s Nice That:

The cynicism our current moment inspires appears to be, regrettably, universal. For millennials, who watched the better-world-by-design ship go down in real time, it’s hard-earned. We saw the idealist fantasy of creative autonomy, social impact, and purpose-driven work slowly unravel over the past decade, and are now left holding the bag. Gen Z designers have the same pessimism, but arrived at it from a different angle. They’re entering the field already skeptical, shaped by a job market in freefall and constant warnings of their own obsolescence. But the result is the same: an industry full of people who care deeply, but feel let down.

Sounds very similar to what Gen X-ers are facing in their careers too. I think it’s universal for nearly all creative careers today.


Elizabeth Goodspeed on why graphic designers can’t stop joking about hating their jobs

Designers are burnt out, disillusioned, and constantly joking that design ruined their life – but underneath the memes lies a deeper reckoning. Our US editor-at-large explores how irony became the industry’s dominant tone, and what it might mean to care again.

itsnicethat.com

Steven Kurtz, writing for The New York Times:

For many of the Gen X-ers who embarked on creative careers in the years after [Douglas Coupland’s Generation X] was published, lessness has come to define their professional lives.

If you entered media or image-making in the ’90s — magazine publishing, newspaper journalism, photography, graphic design, advertising, music, film, TV — there’s a good chance that you are now doing something else for work. That’s because those industries have shrunk or transformed themselves radically, shutting out those whose skills were once in high demand.

My first assumption was that Kurtz was writing about AI and how it’s taking away all the creative jobs. Instead, he weaves together a multifactorial account of the diminishing value of commercial creative endeavors like photography, music, filmmaking, copywriting, and design.

“My peers, friends and I continue to navigate the unforeseen obsolescence of the career paths we chose in our early 20s,” Mr. Wilcha said. “The skills you cultivated, the craft you honed — it’s just gone. It’s startling.”

Every generation has its burdens. The particular plight of Gen X is to have grown up in one world only to hit middle age in a strange new land. It’s as if they were making candlesticks when electricity came in. The market value of their skills plummeted.

It’s more than AI, although certainly, that is top of everyone’s mind these days. Instead, it’s also stock photography and illustrations, graphic templates, the consolidation of ad agencies, the revolutionary rise of social media, and the tragic fall of traditional media.

Similar shifts have taken place in music, television and film. Software like Pro Tools has reduced the need for audio engineers and dedicated recording studios; A.I., some fear, may soon take the place of actual musicians. Streaming platforms typically order fewer episodes per season than the networks did in the heyday of “Friends” and “ER.” Big studios have slashed budgets, making life for production crews more financially precarious.

Earlier this year, I cited Baldur Bjarnason’s essay about the changing economics of web development. As an opening analogy, he referenced the shifting landscape of film and television.

Born in 1973, I am squarely in Generation X. I started my career in the design and marketing industry just as the internet was taking off. So I know exactly what the interviewees of Kurtz’s article are facing. But by dogged tenacity and sheer luck, I’ve been able to pivot and survive. Am I still a graphic designer like I was back in the mid-1990s? Nope. I’m more of a product designer now, which didn’t exist 30 years ago, and which is a subtle but distinct shift from UX designer, which has existed for about 20 years.

I’ve been lucky enough to ride the wave with the times, always remembering my core purpose.


The Gen X Career Meltdown (Gift Article)

Just when they should be at their peak, experienced workers in creative fields find that their skills are all but obsolete.

nytimes.com

Zuckerberg believes Apple “[hasn’t] really invented anything great in a while…”

Appearing on Joe Rogan’s podcast this week, Meta CEO Mark Zuckerberg said that Apple “[hasn’t] really invented anything great in a while. Steve Jobs invented the iPhone and now they’re just kind of sitting on it 20 years later.”

Let’s take a look at some hard metrics, shall we?

I did a search of the USPTO site for patents filed by Apple and Meta since 2007. In that time period, Apple filed for 44,699 patents. Meta, née Facebook, filed for 4,839, or about 10% of Apple’s total.
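For the skeptics, the ratio is easy to check. A two-line Python sketch using the counts from my USPTO search (the variable names are mine, not the USPTO’s):

```python
# Patent filings since 2007, per the USPTO search results cited above.
apple_patents = 44_699  # Apple Inc.
meta_patents = 4_839    # Meta Platforms (formerly Facebook)

# Meta's filings as a share of Apple's.
share = meta_patents / apple_patents
print(f"{share:.1%}")  # → 10.8%
```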

Side-by-side screenshots of patent searches from the USPTO database showing results for Apple Inc. and Meta Platforms. The Apple search (left) returned 44,699 results since 2007, while the Meta search (right) returned 4,839 results.