Your best user will never see your interface
The web’s most valuable user will never appreciate your color palette.
We’ve spent years perfecting pixels for people. Color theory, typography scales, whitespace ratios, micro-interactions that feel just right under a thumb. We’ve built entire careers around understanding how a human eye scans a page, where attention drifts, and what makes someone click instead of scroll past.

None of that matters to an AI agent.

That’s the uncomfortable truth sitting in front of every product designer right now. The web’s fastest-growing audience doesn’t see your hero image. It doesn’t care about your font pairing. It will never feel the satisfying bounce of your pull-to-refresh animation. And according to Cloudflare CEO Matthew Prince, speaking at SXSW just days ago, this non-human audience will generate more internet traffic than all of us combined by 2027.

01 — The traffic inversion

Bots crossed 50% in 2025.
Humans are now the minority.

Before the generative AI era, bots were roughly 20% of web traffic. By 2027, Cloudflare projects they’ll be the majority.

[Chart: bot vs. human share of web traffic, 2020–2027. Bot traffic crosses 50% now (2025) and is projected to reach 65%, against 35% human, by 2027.]

Sources: Cloudflare (SXSW 2026), Imperva Bad Bot Report 2025, TollBit State of the Bots

Bot traffic crossed 50% in 2025 and isn’t slowing down. By 2027, humans will be the minority online.

The numbers aren’t subtle. Bots already account for over 50% of all web traffic. A human shopping for a camera might visit five websites. An AI agent doing the same thing on that person’s behalf will hit 5,000. That’s not a rounding error; that’s a different internet.

And if you’re a product designer who hasn’t started thinking about what this means for your work, you’re already behind.

We’ve been here before (sort of)

The shift to mobile is the obvious comparison. In the early 2010s, product teams scrambled to rethink everything they’d built for desktop screens. Responsive design went from a nice-to-have to a baseline expectation. Touch targets replaced hover states. Information architecture got flattened to accommodate smaller viewports and shorter attention spans.

That transition was disorienting, but the user was still a person. We were still designing for eyes, hands, and brains. The agentic web asks something fundamentally different: design for a user that has no eyes, no hands, and a very different kind of brain.

The closest precedent we actually have isn’t mobile; it’s SEO. When search engines became the primary way people discovered content, designers had to start thinking about a non-human intermediary: the crawler. We learned to structure content semantically, to write alt text, to care about heading hierarchies not just for visual rhythm but for machine readability. Google’s bot became an invisible stakeholder in every design decision.

02 — The 1,000x multiplier

A human visits 5 sites.
An agent visits 5,000.

Shopping for a camera? You might check five retailers. Your AI agent will scan a thousand times that, comparing specs, prices, and reviews across the entire web in seconds.

[Counter graphic: websites visited per shopping task, human vs. AI agent.]
Each agent request is real traffic and real server load. Your infrastructure bill doesn’t care whether the visitor has a pulse.

A human shopping for a camera visits 5 sites. An AI agent doing the same task hits 5,000 in seconds. Same internet; very different load.

What’s happening now is that same dynamic, but amplified by orders of magnitude. The agent isn’t just indexing your content; it’s acting on it. It’s comparing your prices, evaluating your product specs, and potentially completing a purchase, all without a single human ever seeing your carefully designed checkout flow.

The attention economy has a new problem

For over two decades, the web’s economic engine has run on a simple exchange: creators make content, platforms distribute it, and advertisers pay to interrupt the process. Every banner ad, every interstitial, every sponsored post assumes the same thing: that a human being is on the other side, capable of being persuaded.

AI agents break that assumption completely. They don’t see ads. They don’t respond to emotional triggers. They can’t be retargeted. When an agent is browsing on someone’s behalf, the entire persuasion layer of the web becomes invisible noise, or worse, an obstacle that slows down the task.

This isn’t theoretical. Clickthrough rates from AI applications to websites dropped from 0.8% to just 0.27% between Q2 and Q4 of 2025. Even publishers with AI licensing deals saw their referral rates collapse. The old bargain (you give us content, we send you traffic) is dissolving in real time.

03 — The attention collapse

AI referral clicks dropped 66% in six months.

AI apps are consuming content but sending almost no one back. The old deal (content in exchange for traffic) is breaking apart.

Clickthrough rate from AI apps to websites (TollBit, 2025):
Q1 ’25: 1.1%   Q2 ’25: 0.8%   Q3 ’25: 0.52%   Q4 ’25: 0.27%

[Stat counters: human web visits declined Q3→Q4 2025; pages ClaudeBot crawls per referral sent back.]

For product designers, this creates a paradox. We’ve spent years optimizing conversion funnels that assume a human is moving through them. The entire concept of a “funnel” presupposes someone who can be guided, nudged, and persuaded at each stage. But when the user is an agent acting on behalf of a human, the funnel collapses into something more like a query: give me the best option that matches these criteria, and let me act on it immediately.

The implications for how we think about interface design are enormous. If the most valuable “user” of your product never sees your interface, what exactly are you designing?

Two users, one product

The answer, at least for now, is both. We’re entering a period where every digital product needs to serve two fundamentally different audiences simultaneously: humans who interact through visual interfaces, and agents who interact through data structures and APIs.

Some people in the industry are calling this “agent-responsive design,” a deliberate echo of the responsive design movement. The analogy works up to a point. Just as responsive design meant thinking about how the same content adapts to different screen sizes, agent-responsive design means thinking about how the same product adapts to different types of users, biological and artificial.

But the parallel breaks down when you look at what agents actually need. A human needs clear visual hierarchy, intuitive navigation, and emotional resonance. An agent needs structured data, semantic markup, and machine-readable APIs. These aren’t just different presentations of the same thing; they’re fundamentally different layers of the product.

Think of it this way: when you design a product page for a human, you’re crafting an experience. When you design it for an agent, you’re exposing a data model. Both need to be accurate, complete, and trustworthy. But the design skills required for each are quite different.

This is where the concept of AX (agent experience) starts to matter. It sits alongside UX and DX (developer experience) as a distinct design discipline. AX asks questions like: Can an agent discover what this product does? Can it extract the information it needs without scraping? Can it complete a transaction through a structured interface? Can it verify that the data it’s receiving is current and accurate?
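
Those AX questions can be turned into an automated check. Here is a minimal audit sketch that scans rendered HTML for two machine-readability signals: embedded JSON-LD structured data, and an advertised API spec. The signal list, the regex-based parsing, and the `rel="service-desc"` / `/openapi` conventions are illustrative assumptions, not a standard.

```python
# Minimal AX audit sketch: does a page expose signals an agent can use
# without scraping? Signals and heuristics here are illustrative assumptions.
import json
import re

def ax_audit(html: str) -> dict:
    """Report which machine-readability signals are present in a page."""
    # JSON-LD blocks are the most common way to expose structured data.
    jsonld_blocks = re.findall(
        r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>',
        html, flags=re.DOTALL,
    )
    structured_types = []
    for block in jsonld_blocks:
        try:
            structured_types.append(json.loads(block).get("@type"))
        except json.JSONDecodeError:
            pass  # malformed JSON-LD forces agents back to guessing
    return {
        "has_jsonld": bool(structured_types),
        "jsonld_types": structured_types,
        # An advertised API spec lets agents transact through a structured
        # interface instead of driving the visual UI.
        "advertises_api": 'rel="service-desc"' in html or "/openapi" in html,
    }

page = '''<html><head>
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Product", "name": "X100 Camera"}
</script>
<link rel="service-desc" href="/openapi.json">
</head><body>...</body></html>'''

print(ax_audit(page))
# → {'has_jsonld': True, 'jsonld_types': ['Product'], 'advertises_api': True}
```

A real audit would render JavaScript, follow the spec link, and validate the schema, but even this level of check surfaces pages that are invisible to agents.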

04 — The design split

Two users. One product.

What each audience actually needs from your design

[Two-column comparison: what matters to humans vs. what matters to agents.]
Trust signals are the one category that matters equally to both. For humans, trust is aesthetic. For agents, trust is structural.

These aren’t engineering questions masquerading as design questions. They’re genuinely new design problems that require understanding of information architecture, data modeling, and system design at a level most visual designers haven’t had to engage with.

What actually changes in our work

Let me get concrete about what this means in practice, because the abstract version of this conversation has been floating around for a while without enough specificity.

Semantic structure becomes a first-class design deliverable. We’ve always known that semantic HTML matters for accessibility and SEO. Now it matters for a third reason: agents need it to understand what your interface represents. Every product listing, every piece of content, every interactive element needs to be marked up in a way that machines can parse without guessing. Schema.org markup, JSON-LD, and OpenAPI specifications are no longer backend concerns. They’re design decisions.
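
To make that concrete, here is a sketch of structured data as a deliverable: Schema.org Product markup (JSON-LD) emitted from the same record that renders the visual page, so the human view and the machine view can’t drift apart. The field names on the `product` dict are assumptions for illustration; the Schema.org types and properties are real.

```python
# Sketch: emit Schema.org Product markup (JSON-LD) from a product record.
# The same string is embedded in <script type="application/ld+json">…</script>.
import json

def product_jsonld(product: dict) -> str:
    """Render a Schema.org Product block for embedding in the page head."""
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": product["name"],
        "sku": product["sku"],
        "offers": {
            "@type": "Offer",
            # Schema.org expects price as a string with explicit currency.
            "price": f'{product["price"]:.2f}',
            "priceCurrency": product["currency"],
            "availability": "https://schema.org/" + product["availability"],
        },
    }
    return json.dumps(data, indent=2)

print(product_jsonld({
    "name": "X100 Camera", "sku": "CAM-100",
    "price": 1299.0, "currency": "USD", "availability": "InStock",
}))
```

The design decision here is the single source of truth: the markup is generated, not hand-maintained, so what the agent reads is always what the human sees.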

Information architecture splits into two tracks. The IA that serves humans (navigation menus, content groupings, visual flow) and the IA that serves agents (structured data, API endpoints, machine-readable capability descriptions) will increasingly diverge. Designing both well requires understanding how each audience discovers and consumes information.

Trust becomes a design material. When an agent is acting on someone’s behalf, the human needs to trust the agent, and the agent needs to trust the data source. Designing for trust in an agentic context means things like transparent pricing structures that machines can verify, consistent and accurate product data, clear permission models, and auditable transaction logs. The visual cues we’ve used to signal trustworthiness to humans (clean design, recognizable branding, social proof) don’t translate to machines. For agents, trust is structural, not aesthetic.
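
One way to read “trust is structural” is as a testable property. The sketch below checks that the price an agent reads from JSON-LD matches the price a human sees in the rendered HTML; divergence between the two is exactly the inconsistency that makes a source untrustworthy to agents. The price-extraction regex and the markup shapes are assumptions for illustration.

```python
# Consistency check sketch: does the machine-readable price match the
# human-visible one? Selectors and formats here are illustrative assumptions.
import json
import re

def prices_consistent(jsonld: str, rendered_html: str) -> bool:
    structured = json.loads(jsonld)
    machine_price = float(structured["offers"]["price"])
    # Pull the first $-prefixed figure a human would see (illustrative regex).
    match = re.search(r"\$([\d,]+\.\d{2})", rendered_html)
    if not match:
        return False  # no visible price at all is also a trust failure
    human_price = float(match.group(1).replace(",", ""))
    return abs(machine_price - human_price) < 0.005

jsonld = '{"@type": "Product", "offers": {"@type": "Offer", "price": "1299.00"}}'
html = '<span class="price">$1,299.00</span>'
print(prices_consistent(jsonld, html))  # → True
```

Checks like this belong in CI alongside visual regression tests: the agent-facing layer regresses just as easily as the human-facing one.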

The design system expands. A modern design system already includes components, tokens, and patterns for human interfaces. Going forward, it will also need to include structured data templates, API interaction patterns, and agent-facing documentation. The design system becomes the single source of truth for both human-readable and machine-readable experiences.

05 — The timeline

This isn’t a five-year prediction.
It’s already happening.

2024: Bots reach ~45% of web traffic
2025: Bots surpass 50%; human traffic starts declining
2026: Agent-responsive design emerges as a discipline (you are here)
2027: Bot traffic exceeds human traffic (source: Cloudflare)
2028: 33% of enterprise software includes agentic AI (source: Gartner)
2030: ~25% of online spending via AI shopping agents (source: Morgan Stanley)

Conversion metrics split. We’ll need separate analytics for human sessions and agent sessions, because the patterns look completely different. An agent might “convert” in milliseconds with zero page views. A human might browse for twenty minutes before making a decision. Measuring both against the same KPIs makes no sense.
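
A minimal version of that split looks like the sketch below: classify sessions as human or agent, then compute conversion rates per group. The classification heuristic (user-agent substrings plus dwell time) and the session fields are assumptions for illustration, not a production bot detector.

```python
# Sketch: separate conversion rates for human vs. agent sessions.
# The detection heuristic here is deliberately naive and illustrative.
from dataclasses import dataclass

@dataclass
class Session:
    user_agent: str
    page_views: int
    duration_s: float
    converted: bool

AGENT_MARKERS = ("bot", "gptbot", "claudebot", "agent")

def is_agent(s: Session) -> bool:
    ua = s.user_agent.lower()
    # Agents tend to "convert" with near-zero dwell time and few page views.
    return any(m in ua for m in AGENT_MARKERS) or (
        s.duration_s < 1 and s.page_views <= 1
    )

def split_conversion(sessions: list[Session]) -> dict:
    rates = {}
    for label, group in (
        ("human", [s for s in sessions if not is_agent(s)]),
        ("agent", [s for s in sessions if is_agent(s)]),
    ):
        rates[label] = (
            sum(s.converted for s in group) / len(group) if group else 0.0
        )
    return rates

sessions = [
    Session("Mozilla/5.0 (iPhone)", 12, 840.0, True),
    Session("Mozilla/5.0 (Macintosh)", 6, 300.0, False),
    Session("ShopAgent/1.0 (autonomous)", 1, 0.4, True),
    Session("ClaudeBot/1.0", 1, 0.2, False),
]
print(split_conversion(sessions))  # → {'human': 0.5, 'agent': 0.5}
```

The point isn’t the detector; it’s that the two populations get separate denominators, so a flood of millisecond agent sessions can’t silently crater your human conversion numbers.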

The skills gap is real

Here’s the part that keeps me up at night. The skills required to design well for agents are not the skills most product designers have been developing. We’ve been trained to think visually, to prototype interactions, to conduct user research with people. Designing for agents requires understanding of data structures, API design, and system architecture, areas that have traditionally been the domain of engineers.

This doesn’t mean product designers need to become engineers. But it does mean we need to get comfortable with a layer of abstraction that sits below the visual interface. We need to understand how data flows through a system, how APIs expose functionality, and how structured data enables machine comprehension. We need to be literate in these areas even if we’re not writing the code ourselves.

The designers who thrive in this transition will be the ones who can hold both perspectives simultaneously (the human experience and the agent experience) and design systems that serve both well. That’s a genuinely new capability that combines traditional UX thinking with what’s essentially information engineering.

What we don’t lose

It would be easy to read all of this and conclude that visual design is dead, that interfaces don’t matter, that the future belongs to APIs and JSON schemas. That’s wrong, and it’s worth being explicit about why.

Humans aren’t going away. People will still use products directly. They’ll still need interfaces that are intuitive, beautiful, and emotionally resonant. The craft of visual design, interaction design, and human-centered research remains incredibly valuable.

What changes is the scope of our responsibility. We’re no longer designing just for the person in front of the screen. We’re designing for the person and for the agent acting on their behalf. That’s an expansion of the design problem, not a replacement.

And there’s something else that doesn’t change: the need for someone to hold the whole picture together. Engineers can build APIs. Data architects can structure schemas. But someone needs to ensure that the human experience and the agent experience are coherent, consistent, and aligned with the product’s purpose. That’s a design job. It always has been.

The uncomfortable question

There’s a deeper question lurking beneath all of this, one that goes beyond tooling and process. If agents become the primary way people interact with the web, who decides what “good” looks like?

When a human shops for a camera, they bring their own judgment, biases, preferences, and intuitions to the process. When an agent shops for them, all of those human qualities get compressed into a set of criteria and constraints. The agent optimizes against those criteria, but it doesn’t browse. It doesn’t get curious. It doesn’t stumble across something unexpected and change its mind.

There’s a real risk that an agent-first web becomes a web optimized purely for efficiency, where the messy, serendipitous, human experience of discovery gets engineered out. As designers, we’ve always been advocates for the human experience. That role becomes even more important when the loudest voice in the room belongs to a machine.