a16z Crypto Entrepreneurship Course: Three Mental Models of Crypto Infrastructure
Author: Ali Yahya
Compiled and translated by: Sissi
Translator's Note:
**Web3 is an emerging model for the Internet, and its core lies in building out the infrastructure of the entire ecosystem. Precisely because of that importance, Web3 infrastructure has sparked widespread controversy and discussion. To help us understand it better, Ali Yahya, a partner at a16z crypto, proposes three useful mental models: the narrow waist, modularity, and the network flywheel.**
**These mental models matter because they give us a framework for thinking about the characteristics and potential of Web3 infrastructure. Understanding them is important for anyone involved in the Web3 space, from both a technical and a business perspective. By studying and applying these models, we can better build and shape the future of Web3 and promote the sustainable development of the entire ecosystem. For more content from the a16z Crypto Entrepreneurship Course, see the link.**
>>Mental Model One: Narrow Waist
The first mental model is the "narrow waist" of a protocol technology stack. Here is a question to spark your thinking: how do you build a protocol that can take over the world? Can anyone name the most successful protocol in the history of computer science? Hint: it runs a network of networks. The Internet Protocol, one of the most successful protocols ever, is a good illustration of why this mental model matters. The Internet Protocol (IP) and the Transmission Control Protocol (TCP) are inextricably linked; for convenience, this talk will focus on the Internet Protocol.
Internet Protocol
Using the Internet Protocol as the driving case study for this mental model, let's look in detail at why these protocols have been so successful. If you think about it carefully, you will see just how extraordinary the design of the Internet Protocol is. It has taken us from a world with only a handful of connected devices to one with over 20 billion. Over the past 40 years, even through periods of rapid change in computing, the Internet Protocol has kept its vitality. During that time, the technology used for data transmission improved by roughly a factor of a million, yet the protocol's design survived all of these changes, remaining future-proof and extensible with only minor tweaks. Although some things have changed, it still largely retains the character of its original early-1970s design, and throughout their development the core ideas behind the Internet Protocol and the Transmission Control Protocol have remained the same.
The "narrow waist" design concept introduced by the Internet Protocol
There are many reasons why the Internet Protocol has been so successful, but the most important is the "narrow waist" concept it introduced.
First, the Internet Protocol creates a unifying layer that connects the fragmented world of computer networks. From the beginning, one of its design goals was to let any network, built on any technology, provide network support to any application. Look at the diagram below: IP and TCP sit in the middle, with the various networking technologies at the bottom (cable, fiber optics, radio, and more) and, at the top, the many different kinds of applications that need different kinds of network support.
Every packet of data sent over the Internet passes through the Internet Protocol, which creates a simple interface layer below the applications and above the hardware. Before the Internet Protocol, each application had to deal with the intricate details of each networking technology, which fragmented the networking landscape. Every network had to be custom-built for the needs of a particular application, and in turn every application had to build its own network and network protocols and supply its own bandwidth and hardware. This also made applications incompatible and unable to cooperate with one another; in other words, there was no interoperability.
All of these problems meant that networking at the time had little practical use; it was a toy for researchers and academics. The Internet Protocol, by contrast, acted as an aggregator, pulling the entire networking world into one standard. In doing so, it accomplished three main things. First, it allowed requests from any application at the top (an application that needs to move data from point A to point B) to be served by any provider, regardless of the technology the provider uses, as long as it can transfer data between points A and B. Second, it allowed every hardware and bandwidth provider to address the entire application market by supporting just one protocol, rather than pairing up with each application individually. Most importantly, the Internet Protocol decoupled the application layer from the hardware layer, allowing the two layers to evolve independently.
In fact, there is a classic line in David Clark's book "Designing an Internet": interfaces can unshackle constraints. By defining a simple interface layer, applications are separated from the implementation details of the underlying hardware, which gives the system flexibility and scalability. An interface is not only a restriction (both parties must conform to its specification) but also a means of lifting restrictions: once both sides satisfy the interface, they can evolve independently without paying much attention to what the other is doing. This is exactly what the Internet Protocol was designed to do from the start.
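The decoupling described above can be sketched in a few lines. The `Transport` interface below is a hypothetical stand-in for the narrow waist (the names are illustrative, not real networking APIs): the application is written only against the interface, so transports underneath can be swapped or added without touching it.

```python
from typing import Protocol

class Transport(Protocol):
    """The 'narrow waist': the only contract apps and networks share."""
    def send(self, dest: str, payload: bytes) -> None: ...

class FiberLink:
    """One 'hardware' implementation of the interface."""
    def __init__(self) -> None:
        self.delivered: list[tuple[str, bytes]] = []

    def send(self, dest: str, payload: bytes) -> None:
        self.delivered.append((dest, payload))

class RadioLink:
    """Another implementation; it can evolve independently of FiberLink."""
    def __init__(self) -> None:
        self.delivered: list[tuple[str, bytes]] = []

    def send(self, dest: str, payload: bytes) -> None:
        self.delivered.append((dest, payload))

def send_message(net: Transport, dest: str, text: str) -> None:
    """An application written only against the interface."""
    net.send(dest, text.encode())

# The same application code runs unchanged over either transport.
for link in (FiberLink(), RadioLink()):
    send_message(link, "10.0.0.2", "hello")
    assert link.delivered == [("10.0.0.2", b"hello")]
```

Adding a third transport, say satellite, would require no change to `send_message`; that is what lifting restrictions via an interface means in practice.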
The Positive Feedback Loop Created by the Internet Protocol
More interesting, however, is the economic impact of the Internet Protocol. By bridging the disparate worlds of hardware and application software, the Internet Protocol's simple interface created a powerful positive feedback loop that ultimately drove IP's ubiquity around the world. It works like this: as more bandwidth providers join the network, application developers gain access to more bandwidth, enabling them to build more applications that take full advantage of it.
As developers build applications that are useful to users, demand for bandwidth grows, which further incentivizes more bandwidth providers to join the network. This cycle drove the widespread adoption of the Internet Protocol, which eventually became the Internet we know today. So why is this so-called "narrow waist" so important?
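The cycle just described can be made concrete with a toy simulation. The growth coefficients here are invented for illustration and are not from the talk; the point is only that each side's growth feeds the other's.

```python
def flywheel(steps: int, supply_gain: float = 0.1, demand_gain: float = 0.1):
    """Simulate the two-sided loop: bandwidth supply enables apps,
    and apps raise demand that attracts more suppliers."""
    providers, apps = 1.0, 1.0
    history = []
    for _ in range(steps):
        apps += demand_gain * providers    # more bandwidth -> more apps
        providers += supply_gain * apps    # more apps -> more providers
        history.append((providers, apps))
    return history

trajectory = flywheel(10)
# Both sides grow monotonically: each step's totals exceed the last.
assert all(trajectory[i] < trajectory[i + 1] for i in range(len(trajectory) - 1))
```

With any positive gains the loop is self-reinforcing, which is the economic argument for why a neutral interface layer, once adopted, tends to become ubiquitous.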
The most fundamental property of the Internet Protocol, one that any computer networking textbook will mention, is that it is unrestricted. This is crucial. It means the protocol itself does not care how it is used, whether by the applications above it or by the hardware and bandwidth providers below it. The Internet Protocol tries to maximize the degrees of freedom of all parties, and this is the defining characteristic of the Internet. That is why it is called the Internet's "narrow waist."
The Internet Protocol is just a simple interface through which all data can flow, regardless of the type of traffic or the technology used. For exactly this reason it has broad reach, flexibility, and staying power, and it has remained alive through 40 years of technological change. Its reach spans the entire online world, which is another of its defining characteristics.
It is worth mentioning that many competing standards emerged during the Internet Protocol's development, such as ATM and XNS. They were actually more complex and more feature-rich, but they lacked the Internet Protocol's flexibility and placed restrictions on how users could use them. In the end, the Internet Protocol, with its radical minimalism and unrestricted nature, triumphed and became the winner that history remembers.
Blockchain
So what do these lessons mean for today's new protocols? Let's bring the question back to the crypto space. First, let's take a step back and look at the role of the protocol's "narrow waist" in enabling the emergence of "multi-sided markets."
What is a "multi-sided market"? Here, we define a multi-sided market as a common ground that creates value by enabling direct economic interaction between different kinds of participants. There are many examples.
One example is the Internet Protocol, which creates value by enabling bandwidth and hardware providers to interact with application developers. There are other examples from the traditional technology world: operating systems such as Windows, Mac OS X, or iOS are also protocols that enable multi-sided markets to emerge. The same is true of ride-sharing services like Uber, with passengers on one side and drivers on the other. It may go against the prevailing view, but the multi-sided market is the central template for the success of any protocol, because that is what a protocol is designed to do. Its purpose is to connect, to build bridges between different kinds of participants.
Taking the crypto space as an example, Uniswap and Compound are decentralized finance (DeFi) protocols: Uniswap connects market makers and traders, and Compound connects lenders and borrowers. There are other examples too. Sound.xyz is a decentralized music streaming platform connecting artists and listeners, and Farcaster is a decentralized social network connecting content creators and users. Each provides a common ground on which different kinds of participants can interact economically.
Consensus mechanism: the key to connecting blockchain participants
The blockchain itself is the prime example. Going forward, we will see blockchain computing act as a "narrow waist." In the high-level architecture of a blockchain, the natural "narrow waist" is the consensus mechanism at its center. The consensus mechanism creates a multi-sided market that connects all participants: validators, who provide computing resources and security, and application developers, who build applications and deploy smart contracts to the blockchain. Just as the Internet Protocol creates a unified address space for everyone on the Internet, the consensus mechanism builds a unified computing foundation for everyone on the blockchain. And as with IP, people do not have to care about the complex details of the underlying technology; they only need to target the interface above it. Thanks to the consensus mechanism, developers deploying smart contracts on a blockchain need not worry about the specifics of validator hardware.
Similar to the Internet Protocol, the consensus mechanism establishes a two-sided network effect and a positive feedback loop. As more validators join the network, they provide application developers with more security and more computing power for building useful applications. This creates more demand and creates value for the token, which in turn strengthens the incentive for more validators to join and supply more security and computing resources to the network, a self-reinforcing feedback loop.
Balancing unrestrictedness and opinionatedness
So what can we learn from comparing blockchains with the Internet Protocol? As we have seen, one of the most important factors behind IP's success is its unrestricted nature. Today, however, a myriad of different blockchains are running experiments that span the entire spectrum.
At one end are highly opinionated blockchains: vertically integrated, controlling everything from the peer-to-peer network to the consensus mechanism and the computing layer above it, including the permitted instruction set. At the other end are thoroughly minimal blockchains that strive to remain as unrestricted as possible, expressing no preference about the network layer or the programming language used. Between these two poles lie various degrees of opinionatedness.
To give a controversial example, Bitcoin is a highly opinionated blockchain, albeit a very minimalist one. It prescribes that any program running on it must be written in Bitcoin Script, a limited, non-Turing-complete programming language. Ethereum, on the other hand, is a more complex but less restricted blockchain. It provides a more expressive programming language for building smart contracts and gives users more freedom than Bitcoin.
It is too early to tell, but it will be interesting to see whether history repeats itself here: whether unrestricted protocols again produce a narrow waist that effectively decouples the layers and creates a flywheel effect. Perhaps those will be the dominant blockchains that rise the way the Internet did. We should note, however, that the pursuit of unrestrictedness may come at some sacrifice of control, so the different factors need to be weighed.
Finally, a few questions worth pondering for anyone building a protocol: How does the protocol enable a multi-sided market to emerge? Who are the participants? How much freedom do those participants have? How opinionated is the protocol? Does it have what it takes to become a narrow waist? These questions are worth asking because they may influence the design and the direction it takes.
>>Mental Model Two: Modularity
The second mental model is "modularity," a concept closely related to unrestrictedness. Although the two are often related and easily confused, they are actually independent. The core question of this second mental model is: what degree of modularity should a protocol adopt?
Modularity vs. Unrestrictedness
First, some simple definitions. A protocol is "unrestricted" when it gives users as much freedom as possible in how they use it or build on top of it. A protocol is "modular" when it, or the architecture that results from it, can be broken down into self-contained basic components. Sometimes the modularity of a protocol leads to greater unrestrictedness, because modularity often brings flexibility. But not always; some counterexamples follow below. In fact, sometimes you must reduce modularity in order to increase unrestrictedness. The two concepts, then, are distinct.
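The distinction can be sketched in code (the class and layer names here are hypothetical, not the API of any real chain): a monolithic design fuses every layer into one unit, while a modular design exposes each self-contained layer behind a small interface so it can be swapped.

```python
from dataclasses import dataclass
from typing import Callable

class MonolithicChain:
    """Consensus, execution, and data availability are fused:
    no layer can be replaced without rewriting the whole class."""
    def process(self, tx: str) -> str:
        return f"stored:executed:ordered:{tx}"

@dataclass
class ModularChain:
    """Each layer is a self-contained, swappable module."""
    order: Callable[[str], str]    # consensus: sequence transactions
    execute: Callable[[str], str]  # execution: run them
    store: Callable[[str], str]    # data availability: persist them

    def process(self, tx: str) -> str:
        return self.store(self.execute(self.order(tx)))

chain = ModularChain(
    order=lambda tx: f"ordered:{tx}",
    execute=lambda tx: f"executed:{tx}",
    store=lambda tx: f"stored:{tx}",
)
assert chain.process("tx1") == "stored:executed:ordered:tx1"

# Swapping one module (e.g., a different execution environment)
# leaves the others untouched: the flexibility modularity buys.
chain.execute = lambda tx: f"wasm-executed:{tx}"
assert chain.process("tx1") == "stored:wasm-executed:ordered:tx1"
```

Note that both designs can be equally unrestricted or restricted toward their users; whether the extra swap-ability is worth the added interfaces is exactly the trade-off this section examines.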
To clarify further, the Internet is the prime example of a protocol stack that combines unrestrictedness and modularity. Its design is unrestricted, but it is also modular, because it is built as a stack of layers, each independent of the others, with clear abstractions and interfaces between them. To illustrate the independence of the two concepts, picture a chart with two axes: the vertical axis represents unrestrictedness, that is, the user's degrees of freedom, and the horizontal axis represents the degree of modularity.
Take the example of Bitcoin and Ethereum discussed earlier. Bitcoin is relatively less modular, as it is fairly vertically integrated: it binds the consensus mechanism to a specific computing environment and prescribes a specific programming language. It therefore limits users' freedom and does not allow building with a Turing-complete programming language as other blockchains do. Ethereum, by contrast, is more modular and encourages a layered blockchain stack, with Layer 1 and Layer 2 solutions providing higher performance and additional functionality. Ethereum's vision is a blockchain that works in a more modular and unrestricted way, even if that makes it more complex.
Another example is Aptos and Sui, which are highly vertically integrated, with the teams building these blockchains responsible for every layer of the stack. While they also provide expressive programming languages, they are relatively less unrestricted. In contrast, Celestia envisions a hyper-modular, minimalist architecture: Celestia exists as a data availability layer, with the other components built as modules around it. Celestia is more modular, separating data availability and computation into different modules, but it places some restrictions on running smart contracts, so it sits relatively lower on the unrestrictedness axis.
TCP/IP, finally, is an architecture that is both highly modular and highly unrestricted. It is structured in layers, each independent of the others, with clear abstractions and interfaces. This example nicely illustrates that modularity and unrestrictedness are different things.
The advantages and challenges of modularity
The Innovator's Solution is an excellent book written by Clayton Christensen in the early 2000s. Two of its chapters argue that when a technology cannot yet fully meet user needs, a monolithic approach is more rational than a modular one. This view also applies to the cryptocurrency and blockchain space. In the early stages, when the technology falls short of user expectations, users crave more capability, which describes blockchains today: they cannot yet meet all expectations and needs.
Modularity becomes more reasonable and beneficial once a technology meets or exceeds user needs. Modularity reduces costs: each module can be outsourced to an ecosystem of participants who build functionality for it and commercialize it, lowering the cost of the overall system. Modularity also gives users and developers greater flexibility, since they can choose to swap modules out. However, it also carries costs, especially in complexity. When building a modular system, you must consider edge cases and define concrete interfaces and abstractions to divide the modules, which adds complexity and limits the flexibility to try different approaches.
According to Christensen, modularity becomes critical once technical capability outstrips customer needs: if you do not take a modular approach, someone else will build a modular architecture and overtake you. Until that point, however, staying monolithic and controlling the entire architecture lets you optimize aggressively enough to get as close as possible to user needs. The chart below, from The Innovator's Solution, shows three lines: a dashed line for user expectations, a top solid line for the capability achieved by an integrated, centralized approach, and a bottom line for the capability achieved by a more modular architecture. The modular line sits lower because choosing modularity gives up some control and thus rules out the most aggressive optimization; the vertically integrated architecture above it has better raw capability. Over time, though, the monolithic architecture starts to overshoot, delivering more capability than users actually need or want. At that point integration no longer provides any additional benefit, so it makes sense to move to a modular approach and reap its advantages while still meeting the baseline user needs represented by the dashed line.
Let's clarify this concept with an example: the evolution of mobile devices. Before the iPhone there was the BlackBerry, a highly integrated, vertically integrated device. It focused on one important application, email, and was tightly controlled by RIM, the company that made it. It was designed to provide good-enough email functionality for business customers. At the time, trying a more modular, general, and unopinionated approach would have been very challenging, because the technology was not advanced enough. Only once the technology had advanced did Apple introduce modularity through the App Store, giving the iPhone far more flexibility. With the App Store, outside developers could write apps for the iPhone, and that turned out to be the winning formula. Until the technology evolves far enough to support it, though, such an approach is hard to pull off.
Now let's address an apparent exception: the Internet seems to contradict this theory. When the Internet first appeared, it could not meet users' needs and often ran unreliably. Competing with it was a highly vertically integrated approach, the "information superhighway," designed to create a seamless end-to-end experience. Interestingly, however, IP and the Internet, with their hyper-modular approach, eventually triumphed over the information superhighway despite their imperfect beginnings. This challenges Christensen's theory. One possible explanation is that Christensen did not fully appreciate the power of network effects. In the case of IP, the unopinionated nature of the narrow waist mattered so much that it defied the conventional theory, which underscores the enormous influence of network effects.
So how should we think about the question of modularity? Modularity can be good or bad, depending on the situation. Its advantage is increased flexibility, but it comes with costs, such as added complexity and more decisions to make. Modularity can reduce costs, increase user flexibility, and attract more participants. Note, however, that in some cases modularity can work against the "narrow waist" dynamics discussed earlier, which calls for vigilance. In general, we need a mental model for navigating the complexity of modularity choices, since the right choice varies case by case.
>>Mental Model Three: Network Flywheel
Now for the third and final mental model: the network flywheel. This model focuses on the role of tokens in a protocol whose goal is to bootstrap a multi-sided market. For a protocol to succeed, however, it must solve a hard problem known as the "cold start" problem: without a supply side or a demand side, the protocol cannot function. Solving the cold-start problem has historically been challenging, but the introduction of tokens provides a solution. Tokens can be used as an incentive mechanism to motivate and engage participants: they provide a way to capture and distribute value across the network, thereby encouraging both the supply and demand sides to join and interact with the protocol.
Tokenized Incentive Model
In the past, solving the cold-start problem in multi-sided markets often required significant capital infusions, usually from venture capitalists (VCs) or government entities. Companies like Uber, for example, drove growth by subsidizing one or both sides of the market; this infusion of external funding helped overcome the supply-demand matching problem in the initial stages. Tokens, however, offer a different solution to the cold-start problem. They can be used as an incentive mechanism to mobilize network participants: by distributing tokens in exchange for contribution or participation, the network can propel itself rather than rely solely on the infusion of external funds.
Let's consider an idea for improving Uber. Suppose that instead of driving the market with cash subsidies, Uber gave drivers a small stake in the company for every trip they completed. What would the effect be? Such equity could give drivers a sense of the long-term value of their stake, fostering loyalty to Uber and incentivizing them to keep driving for the company. This approach also represents a more efficient capital structure, since the funding comes from the participants themselves rather than from external subsidies.
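This thought experiment can be sketched with a toy model (all numbers are made up; this is not Uber's actual economics): each trip both grants the driver a small stake and adds value to the network, so an early driver's payoff compounds.

```python
def stake_value(trips: int,
                stake_per_trip: float = 0.001,
                value_per_trip: float = 2.0) -> float:
    """Value of a driver's accumulated stake after `trips` rides."""
    stake = 0.0
    network_value = 0.0
    for _ in range(trips):
        stake += stake_per_trip          # ownership earned by driving
        network_value += value_per_trip  # each trip grows the network
    return stake * network_value         # driver's share of network value

# Doubling the trips more than doubles the payoff: because the driver
# owns a piece of a growing network, the incentive compounds.
assert stake_value(2_000) > 2 * stake_value(1_000)
```

The compounding is the key difference from a flat cash subsidy, whose payoff grows only linearly with trips and aligns drivers with nothing beyond the next ride.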
We can use the Internet and TCP/IP to illustrate the potential benefit of tokens. Imagine if TCP/IP had had a token, so that those who contributed to its early development gained ownership in the network.
Network Effects of Tokenization
In that scenario, token holders would share in the network's revenue. Such a token-based approach might have made it easier to launch the Internet without over-reliance on government funding, and it can strengthen network effects by attracting both capital and talent. This leads to the concept of the network flywheel, which applies especially to Layer 1 blockchains but can apply to other protocols as well. The flywheel's starting point is a founding team and core developers who conceive and build the protocol. Funded by investors, they create an initial token value. The tokens, now valuable, incentivize validators to join the network and contribute productive capital, such as computing resources, to ensure the network's security and functionality. This in turn attracts third-party developers, who contribute their human capital by building useful applications on the platform. Those applications then deliver utility to end users, gradually forming a community that reinforces the original vision at the protocol's core, thus completing the circle. That is how the network flywheel works.
As the protocol's vision grows stronger, the token becomes more valuable. Perhaps with additional funding from investors, validators and other participants face stronger incentives to supply more productive capital to the network and improve its functionality, which in turn encourages more third-party developers to build applications that provide more utility to end users. This once again reinforces the original vision at the network's core. That is the concept of the network flywheel.
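The full loop can be summarized in a toy simulation with the stages named above (all coefficients are invented for illustration, not taken from the talk): token value attracts validators, validator capacity attracts developers, applications attract users, and user demand feeds back into token value.

```python
def network_flywheel(steps: int):
    """One state variable per flywheel stage; each turn of the loop
    lets every stage reinforce the next."""
    token_value, validators, developers, users = 1.0, 1.0, 1.0, 1.0
    for _ in range(steps):
        validators += 0.1 * token_value   # incentives draw productive capital
        developers += 0.1 * validators    # capacity draws builders
        users += 0.1 * developers         # applications draw end users
        token_value += 0.1 * users        # demand accrues to the token
    return token_value, validators, developers, users

# The longer the flywheel turns, the larger every participant group gets.
assert all(b < a for b, a in zip(network_flywheel(5), network_flywheel(10)))
```

The model's only point is structural: because every stage feeds the next and the last feeds the first, growth anywhere in the loop eventually shows up everywhere in the loop.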
Therefore, we want to emphasize: if a network is properly designed, then at each stage of the flywheel, participants of every kind can earn tokens for helping to bootstrap the network. This is precisely how tokens help the network overcome the cold-start problem. Tokens can play many roles, but one of the most influential is helping to bootstrap a protocol's multi-sided market.
Original article: 3 mental models for crypto infrastructure | Ali Yahya
Source: a16z crypto