The Naftiko Manifest
The Naftiko Manifest traces the evolution of the World Wide Web from its static beginnings through the rise of Dynamic Web applications and the attempts at a Semantic Web, highlighting how each phase introduced new paradigms for content consumption and interaction. It argues that while earlier iterations made significant strides, the full potential of hypermedia for truly dynamic, machine-driven interactions remained largely unfulfilled.
This Naftiko Manifest proposes that the increasing investment in, adoption of, and growth of AI technologies, combined with the power of hypermedia, are now poised to usher in an Agentic Web: a “Web of Capabilities” where new types of user agents can [semi-]autonomously discover and interact with diverse tools, functions, workflows, and rich resources, moving beyond rigid API integrations to create a more meaningful and adaptable web experience.
Short History of the Web
The World Wide Web, or Web 1.0, has evolved significantly since its creation in the 1990s as a fast-growing set of hyperlinked static HTML documents. It has always embraced new ways to consume, publish and interconnect content, knowledge and people on the Internet.
While new generations have not erased the needs and solutions of the past (we still rely heavily on regular web browsers), they have added new facets to our web experience: new devices, more automation, and less manual browsing and clicking. That automation is governed by robots.txt files that instruct web crawlers, with advertising increasingly funding the experience.
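For illustration, a minimal robots.txt might look like the following sketch; the crawler name and paths are hypothetical examples, not recommendations:

```text
# Allow most crawlers everywhere, but keep one bot out of a private area.
# Crawler name and paths are illustrative.
User-agent: GPTBot
Disallow: /private/

User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```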
The Hypertext Transfer Protocol went through several revisions, focusing primarily on performance while remarkably keeping its semantics stable. HTTP is truly the backbone protocol of the Web.
REST, the architectural style of the Web published in 2000, has been foundational in formalizing its core principles and shaping HTTP/1.1, especially through its uniform interface: resources identified by URIs and interactions via the exchange of representations, animated by hypermedia.
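As a sketch of that uniform interface, consider a hypothetical exchange in which a client dereferences a URI and receives a representation whose hypermedia links advertise what it can do next; the host, paths, and use of the HAL media type are illustrative assumptions:

```http
GET /orders/42 HTTP/1.1
Host: api.example.com
Accept: application/hal+json

HTTP/1.1 200 OK
Content-Type: application/hal+json

{
  "status": "shipped",
  "_links": {
    "self":   { "href": "/orders/42" },
    "cancel": { "href": "/orders/42/cancellation" }
  }
}
```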
Along the way, Web APIs emerged to help power Web 2.0, enabling predefined programmatic interactions between web pages and web services by leveraging asynchronous HTTP calls in JavaScript (AJAX) to avoid forced page changes. Web 2.0 began carving up the Web into platforms, blurring the line between the open web and walled gardens. These platforms generated increasingly valuable data resources that drove advertising, but they also laid the base for new and more valuable ways to do business on the public web, as well as behind the corporate firewall.
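The AJAX pattern boils down to something like this sketch, shown here with the modern fetch API rather than the original XMLHttpRequest; the endpoint and element id are hypothetical:

```javascript
// Refresh part of the page without forcing a full page change.
// The endpoint and element id are hypothetical.
async function refreshInbox() {
  const response = await fetch("/api/inbox?unread=true", {
    headers: { Accept: "application/json" },
  });
  const messages = await response.json();
  document.querySelector("#inbox-count").textContent = String(messages.length);
}
```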
Eventually, this approach was reused more broadly outside the context of web pages, giving rise to the Programmable Web of RESTful APIs. This facet of the Web was less driven by hypermedia as the engine of application state, but it became hugely successful nonetheless, partly thanks to the open-source OpenAPI Specification enabling tooling interoperability and JSON Schema validating data along the way, all augmented by commercial formats like Postman Collections that increasingly drove business automation.
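A small, hypothetical excerpt shows how these pieces fit together: the OpenAPI document describes the operation, and an embedded JSON Schema validates the response payload:

```yaml
# Hypothetical OpenAPI 3.1 excerpt; the embedded schema is plain JSON Schema.
openapi: 3.1.0
info:
  title: Orders API
  version: 1.0.0
paths:
  /orders/{orderId}:
    get:
      operationId: getOrder
      parameters:
        - name: orderId
          in: path
          required: true
          schema: { type: string }
      responses:
        "200":
          description: A single order
          content:
            application/json:
              schema:
                type: object
                required: [id, status]
                properties:
                  id: { type: string }
                  status: { type: string, enum: [pending, shipped, delivered] }
```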
Significant investment was made to try to make web API consumption hypermedia-driven by embracing linked data media types such as RDF and JSON-LD. This Semantic Web, also known as Web 3.0, already had a vision of semi-autonomous agents, but it achieved limited success: its clients were typically hard-coded programs with specific integrations and automations that were developed once and then operated, missing the dynamic power of the original Web, and sharing common semantics via ontologies proved difficult.
The Semantic Web’s main benefit was enriching search engines with additional metadata to build knowledge graphs, helping users obtain better, more structured results than typical web page extraction and hyperlink analysis could provide.
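For instance, this is the kind of JSON-LD metadata, expressed here with the Schema.org vocabulary and hypothetical values, that search engines ingest to build such knowledge graphs:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Acme Anvil",
  "offers": {
    "@type": "Offer",
    "price": "119.00",
    "priceCurrency": "USD"
  }
}
```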
Emergence of Web 4.0
The Agentic Web is a new attempt to build on previous generations: on HTTP, on other established standards, and on schemas already in use. Artificial intelligence is being leveraged first to assimilate knowledge into models (typically LLMs) in a way that allows interaction with those models as a textual conversation, between humans and agents as well as between agents themselves.
Many use cases are emerging, and both the strengths and weaknesses of these agents are still being discovered, but they appear well equipped to address the complexity humans have struggled to keep up with: producing documentation, refining schemas, defining policies, and delivering the context needed to power this iteration of the Web.
- Due to the increased investment, adoption, and growth of AI technologies and the acceleration of related innovations, programs and robots have finally gained the ability to read and write like humans, benefit from the power of hypermedia, and thrive in more dynamic, unpredictable environments, provided the right semantics and context are present. Robots learning to read and write opens up a new generation of the Web, filled with both opportunities and consequential disruptions, which we believe will lead to Web 4.0.
- AI is a consumer of content on the Web, pushing well beyond the traditional search engine indexing and crawling governed by robots.txt, agents.json, terms of service, and copyright law to deeply digest content and build the LLMs behind generative AI and agentic AI use cases. This harvesting reinforces the importance of an HTML-first approach to web pages over the overly dynamic pages resulting from the JavaScript-first approach promoted over the past ten years.
- More recently, the Model Context Protocol (MCP) has captured a lot of attention and, despite initial design flaws around security, licensing, and access control, is fueling a wave of experimentation and growth not unlike the early days of Web 1.0. Some of the issues with MCP are being addressed incrementally by revising the specification, while the more serious ones will require solid tooling to compensate until a more mature major revision becomes available and adopted.
- While MCP relies on HTTP and SSE to exchange messages with an agent, its media type is a specialization of JSON-RPC, itself a specialization of JSON, much as XHTML was a specialization of XML. While the value of MCP lies in the underlying applications and data, often exposed via regular web APIs, it enables dynamic discovery of and interaction with capabilities in the form of tools and procedures, resources, and prompts (a minimal message is sketched after this list).
- NLWeb is a specialization of MCP and Schema.org that facilitates agents’ interaction with web sites using a unified conversational language and microdata, similar to how HTML facilitates interaction with humans.
- Agent2Agent (A2A) is a complementary protocol that helps agents dynamically discover and use additional agents to achieve their goals via delegation or coordination. Each agent describes its capabilities in an A2A AgentCard (also sketched after this list), which can then be referenced as an MCP resource by one or many external agents.
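To make the MCP media type concrete, here is roughly what a tool invocation looks like on the wire: a JSON-RPC 2.0 request whose method names are defined by the MCP specification, while the tool name and arguments shown are hypothetical:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_orders",
    "arguments": { "status": "shipped", "limit": 5 }
  }
}
```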
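Similarly, here is a minimal sketch of an A2A AgentCard, the document an agent publishes so that others can discover its skills; the exact shape is governed by the A2A specification, and all values here are illustrative:

```json
{
  "name": "Order Assistant",
  "description": "Answers questions about orders and shipments.",
  "url": "https://agents.example.com/order-assistant",
  "version": "1.0.0",
  "capabilities": { "streaming": true },
  "skills": [
    {
      "id": "track-order",
      "name": "Track an order",
      "description": "Looks up the shipping status of an order."
    }
  ]
}
```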
Web 4.0 is just getting started, and there is a lot of work ahead. A model-driven approach to software is beginning to change how we do business on the web, and MCP, A2A, and NLWeb are just the front line of this shift in how data is integrated. These specifications are aligning engineering groups, but much refactoring of previous waves of digital transformation remains, including how we manage cost, revenue, and access control, and how we automate our valuable business resources into the capabilities needed for what is next.
Core Web 4.0 Principles
The shift towards the Agentic Web takes technical discipline, but also alignment with business functions. It builds upon existing standards and tooling, keeps pace with the protocols powering the web, and leans towards automation and scale using AI, while retaining the discipline to keep humans in the loop, to value open ecosystems and tooling, and to ensure the sovereignty and integrity of the networks in which we operate.
- World Wide Web of capabilities: Web 4.0 envisions a Web where the fundamental components of interaction are “capabilities.” These capabilities represent diverse tools, functions, workflows, and rich resources that can be discovered and utilized to power personal and business automation at a web scale.
- Hypermedia driven and intelligent: This new generation of the web will harness the power of hypermedia, but unlike previous iterations, it will be animated by artificial intelligence. AI agents will be able to read, write, and benefit from hypermedia, thriving in dynamic environments by understanding and interacting with these capabilities.
- Protocols: HTTP/3, QUIC: The underlying network protocols for Web 4.0 will be HTTP/3 and QUIC, indicating a focus on modern, efficient, and performant communication that operates at web scale, providing the real-time bandwidth required to support the next generation of automation.
- Media types: MCP, A2A, NLWeb: Key media types enabling Web 4.0 are the Model Context Protocol (MCP), the Agent2Agent (A2A) protocol, and the Natural Language Web (NLWeb). MCP facilitates dynamic discovery and interaction with capabilities (tools, procedures, resources, and prompts), while A2A allows agents to dynamically discover and use other agents through delegation or coordination.
- User agents: [semi-]autonomous: In Web 4.0, user agents will evolve to be [semi-]autonomous. This signifies a shift towards intelligent programs and robots that can independently interact with the web’s capabilities, moving beyond traditional manual browsing and rigid integrations.
- Open-Source Core with a Negotiated Commercial Layer: The core of Web 4.0 is built on open standards and tooling, laying a solid foundation for then layering on commercial, partner, and 3rd-party services that intentionally employ user, usage, and model-based pricing that is machine-readable, standardized, and automated as part of the rest of operations.
- Security, Data Sovereignty, Control & Clear Boundaries: A modern Web 4.0 operates within and across well-defined business, industry, and sovereign boundaries, with identity, authentication, and access control as the default on the open web as well as within the corporate firewall, providing business leadership with the control they need to operate.
Web 4.0 principles reflect the technical and business shifts, but also the political shifts, occurring both on- and offline today. Web 4.0 represents the public, private, and partner Web that has emerged over the last decade, and it presents a new opportunity to scale and streamline automation, allowing us to do more with fewer resources and to assert our place within the marketplace, securely and confidently operating and leading in a volatile digital marketplace that will continue to shift and evolve.
Web 4.0 Opportunity
If you adhere to the vision and core principles of Web 4.0, you can star the manifesto here (GitHub), as well as share or embed it to voice your support. As it is still nascent, there is also an opportunity for the community to revise this manifesto by forking it, contributing a pull request, and engaging in related discussion. You are also encouraged to experiment with the emerging protocols (see references below), contribute to their specifications to help make them better and more robust for wider adoption by enterprises and society in general, and develop interoperable tooling to power their effective usage.
References
These are the references that underpin the Naftiko Manifest, which in turn informs the Naftiko product road map and documentation, building on the work of the visionaries who set in motion the Web we all use to power our businesses and markets.
Key papers
- Tim O’Reilly - AI Integration is the new Moat
- Tim Berners-Lee - Weaving the Web
- Tim O’Reilly - What is Web 2.0?
- Jon Bosak, Tim Bray - XML and the Second-Generation Web
- Tim Berners-Lee, James Hendler, Ora Lassila - The Semantic Web
- Roy Thomas Fielding - Dissertation on architectural styles and REST
Key specifications
- Model Context Protocol (MCP)
- Agent2Agent (A2A)
- Agent Communication Protocol (ACP), merging into A2A
- Agent Network Protocol (ANP)
- Agents.json
Additional resources
- Microsoft - NLWeb Pioneers: Success Stories & Use Cases
- Julien Simon - Why MCP’s Disregard for 40 Years of RPC […] Will Burn Enterprises
- Dick Hardt - OAuth is not a good fit for MCP
- Alexander Williams - HTML-First, Framework-Second: Is JavaScript Finally Growing Up?
- Daniel Kocot - Beyond MCP - How Capability, Ecosystem, and Systems Thinking […]
- Christian Posta - From APIs to Capabilities, What API Agents Mean for App Architecture
- Sean Falconer - AI agents are finally delivering on the semantic web’s promise
- Sean Falconer - The Agentic Web: How AI Agents Are Rewiring Internet Infrastructure
- Gaowei Chang, Ruoxi Ran - Agent Network Protocol White Paper
- Serena Capital - The State of Commercial Open Source 2025