ChatGPT Killed the Web: For the Better?
After 30 years of clicking, scrolling, and optimizing pixels, websites are becoming obsolete. LLM agents will read and act for us, ending search engines, blue links, and traditional websites.
I haven’t used Google in a year. No search results, no blue links. ChatGPT became my default web browser in December 2024, and it has replaced the traditional web for me entirely.
Soon, no one will use search engines. No one will click through 10 blue links.
But there is more: No one will navigate to websites.
Hell, no one will even read a website again.
A Historical Perspective: The Three Ages
Web 1.0: Read (1990s-2004)
The original web was simple. Static HTML pages. You could read about a restaurant—its menu, hours, location. But that was it. Pure consumption.
Web 2.0: Write (2004-2024)
Then came interactivity. Databases. User accounts. Now you could *do* things like reserve a table at that restaurant, leave a review, upload photos. The web became bidirectional. Every click was an action, every form a transaction.
Web 3.0: LLMs Read and Act For You (2024+)
Now we’re entering a new evolution. You don’t navigate and read the restaurant’s website. You don’t fill out the reservation form.
An LLM agent does both for you.
LLMs Change Everything
Look at websites today. Companies spend millions building elaborate user interfaces—frontend frameworks, component libraries, animations that delight users, complex backends orchestrating data flows. Teams obsess over pixel-perfect designs, A/B test button colors, and optimize conversion funnels.
All of this sophisticated web infrastructure exists for one purpose: to present information to humans and let them take actions.
But if the information is consumed by an LLM, why does it need any of this?
You don’t need a website. You need a text file:
```markdown
# Markdown File for Bella Vista Italian Restaurant

## Location
123 Main St, San Francisco, CA 94102

## Hours
Mon-Thu: 5pm-10pm. Fri-Sat: 5pm-11pm. Sun: 4pm-9pm

## Menu

### Appetizers
- Bruschetta - $12
- Calamari - $16
- Caesar Salad - $14

### Mains
- Margherita Pizza - $22
- Spaghetti Carbonara - $24
- Osso Buco - $38

## Reviews
4.7/5 (312 reviews): “Best Italian in the neighborhood”
```
That’s it. That’s all an LLM needs to answer any question about the restaurant. No UI, no UX polish, no frontend at all.
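To make concrete how little machinery that implies, here is a deliberately naive sketch in plain JavaScript (no libraries) that splits such a file into its `##` sections. In practice you would simply paste the whole file into the LLM’s context; the point is that the “schema” is nothing more than headings:

```javascript
// Split a restaurant's markdown file into its "## Section" parts.
// Naive on purpose: the whole "data model" is just headings and lines.
function sectionsOf(markdown) {
  const sections = {};
  let current = null;
  for (const line of markdown.split("\n")) {
    const m = line.match(/^## (.+)$/); // top-level "## " headings only
    if (m) {
      current = m[1].toLowerCase();
      sections[current] = [];
    } else if (current && line.trim()) {
      sections[current].push(line.trim());
    }
  }
  return sections;
}

const bellaVista = `# Markdown File for Bella Vista Italian Restaurant
## Location
123 Main St, San Francisco, CA 94102
## Hours
Mon-Thu: 5pm-10pm. Fri-Sat: 5pm-11pm. Sun: 4pm-9pm`;

const info = sectionsOf(bellaVista);
console.log(info["hours"][0]); // Mon-Thu: 5pm-10pm. Fri-Sat: 5pm-11pm. Sun: 4pm-9pm
```

A few lines of string handling replace an entire frontend stack, because the consumer is a model rather than an eyeball.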
The Great Consolidation
Here’s what nobody’s talking about: we don’t need thousands of websites anymore.
Take a French boeuf bourguignon recipe. Today, there are hundreds of recipe websites, each with their own version:
- AllRecipes with its community ratings
- Serious Eats with detailed techniques
- Food Network with celebrity chef branding
- Marmiton for French speakers
- Countless food blogs with personal stories
Why do all these exist? They differentiated through:
- Better UI design
- Fewer ads
- Faster load times
- Native language content
- Unique photography
- Personal narratives before the recipe
But LLMs don’t care about any of this. They don’t see your beautiful photos. They skip past your childhood story about grandma’s kitchen. They ignore your pop-up ads.
They just need the recipe:
```markdown
## Boeuf Bourguignon Markdown Recipe
- 2 lbs beef chuck, cubed
- 6 oz bacon, diced
- 1 bottle red wine
- 2 cups beef stock
[...]
Braise for 2-3 hours at 325°F
```
Language barriers? Irrelevant. The LLM translates instantly. French, Italian, Japanese. It doesn’t matter.
What this means: instead of 10,000 cooking websites, we need maybe a couple, or perhaps a single, comprehensive markdown repository of recipes.
This pattern repeats everywhere:
- Travel guides
- Product reviews
- News sites
- Educational content
The web doesn’t need redundancy when machines are the readers.
And there’s more: LLMs can create content too.
The Death of User-Generated Content
Web 2.0’s breakthrough was making everyone a writer. YouTube, Instagram, TikTok—billions of people creating content for billions of people to read. But here’s the thing: why do you need a million human creators when AI can be all of them?
Your favorite cooking influencer? Soon it’ll be an AI chef who knows exactly what’s in your fridge, your dietary restrictions, and your skill level. No more scrolling through 50 recipe videos to find one that works for you.
Your trusted news anchor? An AI that only covers YOUR interests—your stocks, your sports teams, your neighborhood. Not broadcasting to millions, but narrowcasting to one.
That fitness instructor you follow? An AI trainer that adapts to your fitness level, your injuries, your equipment. Every video made just for you, in real-time.
Web 2.0 writing: Humans create content → Millions read the same thing
Web 3.0 writing: AI creates content → Each person reads something unique
The entire creator economy—the crown jewel of Web 2.0—collapses into infinite personalized AI agents. Social media feeds won’t be filled with human posts anymore. They’ll be generated in real-time, specifically for you. Every scroll, unique. Every video, personalized. Every post, tailored.
The paradox: We’ll have infinite content variety with zero human creators. Maximum personalization through total artificial generation.
Just as 10,000 recipe websites collapse into one markdown file for LLMs to read, millions of content creators collapse into personalized AI agents. The “write” revolution of Web 2.0 is being replaced by AI that writes everything, for everyone, individually.
Ok what about taking actions like booking a restaurant?
From APIs to MCP: The Evolution of Acting
The Old Way: APIs
Web 2.0 gave us APIs—structured endpoints for programmatic interaction:
- `POST /api/reservations`
- Rigid schemas: exact field names, specific formats
- Documentation hell: dozens of pages explaining endpoints
- Integration nightmare: every API different, nothing interoperable
APIs assumed developers would read documentation, write integration code, and handle complex error scenarios. They were built for humans to program against, requiring manual updates whenever the API changed, breaking integrations, and forcing developers to constantly maintain compatibility.
The New Way: MCP (Model Context Protocol)
MCP isn’t just another API. It’s designed for LLM agents:
- Dynamic discovery: Agents explore capabilities in real-time through tool introspection
- Flexible schemas: Natural language understanding, not rigid fields
- Universal interoperability: One protocol, infinite services
- Context-aware: Maintains conversation state across actions
What makes MCP special technically:
- Three primitives: Tools (functions agents can call), Resources (data agents can read), and Prompts (templates for common tasks)
- Transport agnostic: Works over STDIO for local servers or HTTP/SSE for remote services
- Stateful sessions: Unlike REST APIs, MCP maintains context between calls
- Built-in tool discovery: Agents can send a `tools/list` request to learn a server’s capabilities dynamically—no documentation parsing needed
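Under the hood, MCP is JSON-RPC 2.0, and the method names `tools/list` and `tools/call` come from the protocol itself. Here is a sketch of the two messages behind discovery and invocation, with a hypothetical `make_reservation` tool standing in for whatever a real server exposes:

```javascript
// Sketch of the JSON-RPC 2.0 messages an MCP client sends.
// The methods (tools/list, tools/call) are MCP's; the
// "make_reservation" tool and its arguments are hypothetical.
const listRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/list", // agent asks: what can this server do right now?
};

const callRequest = {
  jsonrpc: "2.0",
  id: 2,
  method: "tools/call",
  params: {
    name: "make_reservation", // discovered via tools/list, not hard-coded
    arguments: { party_size: 2, time: "19:30" },
  },
};

console.log(JSON.stringify(listRequest));
```

Because the tool name and arguments come from the server’s own listing, the client never hard-codes an endpoint the way a REST integration does.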
Traditional APIs are like giving someone a thick manual and saying “follow these exact steps.” MCP is like having a smart assistant who can figure out what’s possible just by looking around.
When you walk into that restaurant, the agent doesn’t need a 50-page guide—it instantly knows it can check tables, make reservations, or view the menu. And unlike APIs that forget everything between requests (like talking to someone with amnesia!), MCP remembers the whole conversation—so when you say “actually, make it 8pm instead,” it knows exactly what reservation you’re talking about.
Real Example: Booking Your Restaurant
With traditional API:
```javascript
fetch('/api/restaurants/search?type=italian&date=tonight')
  .then(res => res.json())
  .then(restaurants => {
    return fetch('/api/availability', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        restaurant_id: restaurants[0].id,
        party_size: 2,
        date: '2024-12-15',
        time: '19:30'
      })
    });
  });
// ... plus error handling, retries, and the actual reservation call
```
With MCP + LLM:
User: “Book me a table for 2 at Bella Vista tonight”
Agent: [Discovers restaurant’s MCP service]
[Checks availability]
[Makes reservation]
“Done. 7:30pm tonight at Bella Vista, table for 2.”
The agent handles all complexity. No documentation needed. No rigid formats. Just natural interaction.
Even better: when the restaurant adds new capabilities—like booking the entire venue for private events, adding wine pairings, or offering chef’s table experiences—there’s no developer work required.
The LLM agent automatically discovers the expanded schema and adapts. Traditional APIs would break existing integrations or require manual updates. MCP just works.
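The “no developer work required” claim boils down to a loop like the one below. This is a hedged sketch: the client and model objects are stubs standing in for a real MCP client library and a real LLM, but the shape of the loop is the point, since discovery happens on every session:

```javascript
// Hypothetical agent loop: each session re-discovers the server's tools,
// so a capability the restaurant added yesterday is usable today with
// zero client-side code changes. Stubs below are for illustration only.
async function handleRequest(mcpClient, llm, userMessage) {
  const { tools } = await mcpClient.listTools();           // 1. discover
  const choice = await llm.chooseTool(userMessage, tools); // 2. decide
  return mcpClient.callTool(choice.name, choice.args);     // 3. act
}

// Stand-ins for a real MCP client and a real model.
const stubClient = {
  listTools: async () => ({ tools: [{ name: "make_reservation" }] }),
  callTool: async (name, args) => `called ${name} for ${args.party_size}`,
};
const stubLlm = {
  chooseTool: async (msg, tools) => ({
    name: tools[0].name,
    args: { party_size: 2 },
  }),
};

handleRequest(stubClient, stubLlm, "Book a table for 2").then(console.log);
// → called make_reservation for 2
```

Nothing in `handleRequest` names a specific tool, which is exactly why a new “book the whole venue” capability needs no client update.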
Read + Act = The Complete Picture
With markdown for reading and MCP for acting, the entire web infrastructure becomes invisible:
- Read: LLM ingests markdown → understands everything about your service
- Act: LLM uses MCP → performs any action a user needs
Websites become obsolete. Users never leave their chat interface.
The Irony
The web started as simple text documents linked together. We spent 30 years adding complexity: animations, interactivity, rich media. Now we’re stripping it all away again.
But this time, the simplicity isn’t for humans. It’s for machines. And that changes everything.
The web as we know it is disappearing. What replaces it will be invisible, powerful, and fundamentally different from anything we’ve built before.
For someone like me who loves designing beautiful UIs, this is bittersweet. All those carefully crafted interfaces, micro-interactions, and pixel-perfect layouts will become obsolete. But I’m genuinely excited, because what matters in the end is the user experience, and the UX of chatting with (or even calling) your agent is infinitely better than navigating websites.
I can’t wait.