The Universal Interface
What replaces the browser — and how every organisation must now build two front doors.
What replaces the browser is not another browser. It is the user's preferred LLM — a single conversational surface through which all information, services, and decision-points are rendered. Most apps, most websites, most purpose-built interfaces become redundant. They collapse into data sources that the LLM consumes and re-presents contextually.
The direction of flow inverts. Users no longer navigate to information. Information — and the human-in-the-loop decisions that require their attention — comes to them. The LLM becomes the last interface: adaptive, personal, and capable of rendering any service without requiring the user to learn a new app, visit a new site, or manage a new login.
This is not a prediction about a possible future. It is a description of a structural transition already underway. The consequences ripple through every institution, every business model, every system that was built for the browsing paradigm.
The Dual Front Door
If the LLM is the universal interface, every organisation must now maintain a dual presence: a visual website for humans to browse, and a machine-readable gateway that exposes its underlying services directly. Through that gateway, AI agents can programmatically discover, invoke, and transact with those services, without ever parsing a web page or requiring custom developer integrations.
The mechanism for this is already emerging. Protocols like Anthropic's Model Context Protocol (MCP) act as a universal discovery layer on top of existing APIs — translating complex endpoints into standardised tools that give the AI a clear instruction manual for interacting with external systems. Historically, websites were built for human eyeballs and traditional APIs were built strictly for software developers. MCP opens the third channel: Business-to-Agent.
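The shape of that third channel can be sketched in miniature. The snippet below is an illustrative toy, not the actual MCP wire format or SDK: the tool name, parameter schema, and registry are all hypothetical. What it shows is the core idea — each exposed capability pairs a machine-readable description (the agent's "instruction manual") with a callable that wraps an existing API, and discovery means listing those descriptions without exposing implementation details.

```python
import json
from typing import Callable

# Hypothetical registry of agent-discoverable tools. The schema shape is an
# assumption for illustration; real protocols like MCP define their own.
TOOLS: dict[str, dict] = {}

def tool(name: str, description: str, params: dict) -> Callable:
    """Decorator: register a function as a discoverable tool."""
    def register(fn: Callable) -> Callable:
        TOOLS[name] = {"description": description, "params": params, "fn": fn}
        return fn
    return register

@tool(
    name="get_order_status",
    description="Look up the fulfilment status of an order by its ID.",
    params={"order_id": {"type": "string"}},
)
def get_order_status(order_id: str) -> str:
    # In production this would call the firm's existing order API.
    return json.dumps({"order_id": order_id, "status": "shipped"})

def list_tools() -> list[dict]:
    """What an agent sees at discovery time: names, descriptions, and
    parameter schemas -- never the implementation behind them."""
    return [
        {"name": n, "description": t["description"], "params": t["params"]}
        for n, t in TOOLS.items()
    ]
```

The point of the indirection is that the agent reasons over `list_tools()` output alone; the firm can swap the implementation behind `get_order_status` without breaking any agent that has learned to call it.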
Two Modes of Agent Interaction
Not all agent interactions are equal. Where the AI is the application — a dynamic assistant reasoning through unpredictable prompts — protocol-mediated discovery is essential. The agent must find, interpret, and invoke services contextually. The token cost is justified by the complexity of reasoning.
But where the AI builds the application — coding an automated pipeline, a trading bot, a dashboard — the final artefact should bypass the discovery layer entirely and call APIs directly for zero-latency, zero-overhead execution. The protocol layer serves as a development sandbox; the production system runs natively against the underlying infrastructure.
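The two registers can be contrasted in a few lines. This is a sketch under stated assumptions: `get_price`, the registry, and the string-matching "selection" step are all hypothetical stand-ins (a real reasoning agent would choose tools via the LLM, not a substring test). The contrast is the point — the discovery path pays a lookup on every call, while the built artefact hard-codes the function it needs.

```python
from typing import Callable

# Hypothetical endpoint standing in for a firm's real pricing API.
def get_price(symbol: str) -> float:
    prices = {"ACME": 41.5}
    return prices[symbol]

# Register one: discovery-mediated. A reasoning agent matches its intent
# against tool descriptions at run time, then invokes the winner.
# Flexible, but every call pays the lookup (and, in reality, token) cost.
REGISTRY: dict[str, tuple[str, Callable]] = {
    "get_price": ("Fetch the latest quoted price for a ticker symbol.", get_price),
}

def invoke_via_discovery(intent: str, **kwargs) -> float:
    for name, (description, fn) in REGISTRY.items():
        if intent in description.lower():  # stand-in for LLM tool selection
            return fn(**kwargs)
    raise LookupError(f"no tool matches intent: {intent!r}")

# Register two: direct. The artefact the AI *built* skips discovery
# entirely and calls the underlying API natively.
def production_pipeline(symbol: str) -> float:
    return get_price(symbol)
```

Both paths return the same answer; what differs is where the reasoning happens. In the first it happens on every invocation, in the second it happened once, at build time.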
This distinction matters for decomposition. When a firm exposes its primitives, it is not choosing between human interfaces and machine interfaces. It is building both — and understanding that the machine interface itself has two registers: one for reasoning agents that need contextual discovery, and one for automated systems that need raw speed.