Token design: the deployment phase, focusing on system safety

January 9, 2019

As Web 3.0 continues to develop and various projects build out the infrastructure needed to scale a decentralised web to the masses, many of these protocols can be viewed as public utilities for the market applications emerging on top of them. As public utilities, it is crucial that these systems are reliable and robust. The token models we create are, at this point, simply hypotheses; the objective of any token team should therefore be to iterate an initial design into a mathematical model, and eventually into a Minimum Viable Token (MVT) computational model. A computational model is required for parameter tuning and general optimization, and it enables quick iteration from the initial design through to an MVT that is ready for system integration.
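
To make this concrete, the sketch below shows what a first-pass computational token model can look like: a handful of tunable parameters driving a simple supply-and-demand loop. The parameter names and dynamics are illustrative assumptions chosen for demonstration only, not a prescribed design.

```python
# Minimal, illustrative computational token model (a sketch, not a prescribed design).
# All parameters and dynamics here are assumptions chosen for demonstration.

from dataclasses import dataclass


@dataclass
class Params:
    issuance_rate: float = 0.02   # new tokens minted per step, as a fraction of supply
    burn_rate: float = 0.01       # tokens burned per step, as a fraction of usage volume
    demand_growth: float = 0.03   # assumed growth in network usage per step


def simulate(params: Params, steps: int = 100) -> list[dict]:
    supply, demand = 1_000_000.0, 50_000.0
    history = []
    for _ in range(steps):
        minted = params.issuance_rate * supply
        burned = params.burn_rate * demand
        supply += minted - burned
        demand *= 1.0 + params.demand_growth
        history.append({"supply": supply, "demand": demand})
    return history


if __name__ == "__main__":
    # Sweep one parameter and inspect the end state -- the essence of parameter tuning.
    for rate in (0.0, 0.01, 0.02, 0.05):
        final = simulate(Params(issuance_rate=rate))[-1]
        print(f"issuance_rate={rate:.2f} -> final supply {final['supply']:,.0f}")
```

Even a toy model like this makes it possible to ask quantitative questions of the design before anything ships to production.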


However, for these systems to operate safely, it is crucial that we apply the same engineering rigour used in the construction of physical infrastructure projects, such as highways and buildings, to the design, deployment and maintenance of tokenised ecosystems.

The deployment phase is focused on system integration and continued model optimization through testing and iteration. Deploying the token model is a continuous process involving two distinct layers of iteration: design and development.

The initial token design is still an untested hypothesis; therefore, once finalized, it needs to be tested before being implemented. Throughout this phase, design iterations require a deep understanding of the possible states of the model: the question is what could happen, rather than exactly what will happen. Within this framework, incentives should be used to encourage desired states, and mechanisms that could lead to undesired states should be cut out. By limiting the action space in this way, the token model is shaved down to the MVT. Getting to the optimal MVT requires applying principles from systems engineering and control theory throughout the testing and implementation phase, in order to arrive at an effective incentive structure and a production-ready model. In much the same way that the laws of physics are the primitives for classical engineering problems, the economic theories and incentives that guided the initial design are the primitives of token engineering.
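
One lightweight way to explore "what could happen" is to sample many plausible parameter settings and check which of them drive the model into undesired states. The sketch below does this for a toy model like the one above; the thresholds for "collapse" and "runaway inflation" are assumptions chosen purely for illustration.

```python
# Sketch of state-space exploration: sample plausible parameter settings at random
# and flag the runs that reach an undesired state. Thresholds are illustrative.

import random


def run(issuance_rate: float, burn_rate: float, demand_growth: float, steps: int = 200) -> str:
    supply, demand = 1_000_000.0, 50_000.0
    for _ in range(steps):
        supply += issuance_rate * supply - burn_rate * demand
        demand *= 1.0 + demand_growth
        if supply <= 0:
            return "collapse"            # undesired state: supply driven to zero
    if supply > 10 * 1_000_000.0:
        return "runaway_inflation"       # undesired state: supply grows tenfold
    return "ok"


random.seed(0)
outcomes = {}
for _ in range(1_000):
    outcome = run(
        issuance_rate=random.uniform(0.0, 0.05),
        burn_rate=random.uniform(0.0, 0.05),
        demand_growth=random.uniform(-0.02, 0.05),
    )
    outcomes[outcome] = outcomes.get(outcome, 0) + 1

print(outcomes)
```

Parameter regions that repeatedly end in undesired states are candidates for mechanisms to cut, which is exactly how the action space gets narrowed towards the MVT.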


Iteration

In the Design Phase, some of the parameters in the model may have been validated independently to generate the initial design; most likely, however, the sum of all parts has not been tested together. As mentioned before, there are two distinct layers of iteration at the Deployment Phase: design iteration and development iteration. At each iteration, we should focus on integrating the most recent model updates and test results into the latest version of the model.

System Integration

This stage comes into play once the model has moved past design iteration and we have a fully validated model that needs to be integrated into the rest of the technology stack. As mentioned before, these decentralised networks are essentially two-layer systems, consisting of a market layer and a ledger layer.
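
As a rough sketch of what that integration boundary can look like, the code below separates a market-layer model from a ledger layer behind a minimal interface. The class and method names are illustrative assumptions, not a real API; in practice the ledger side would be a smart-contract system rather than an in-memory stub.

```python
# Hedged sketch of integrating a validated market-layer model with a ledger layer.
# Interfaces and names are illustrative assumptions, not a real protocol API.

from abc import ABC, abstractmethod


class LedgerLayer(ABC):
    """Settlement/state layer (in production, e.g. a smart-contract system)."""

    @abstractmethod
    def read_state(self) -> dict: ...

    @abstractmethod
    def submit(self, tx: dict) -> None: ...


class InMemoryLedger(LedgerLayer):
    """Stand-in ledger used while the model is still being integration-tested."""

    def __init__(self) -> None:
        self._state = {"supply": 1_000_000.0}

    def read_state(self) -> dict:
        return dict(self._state)

    def submit(self, tx: dict) -> None:
        self._state["supply"] += tx.get("mint", 0.0) - tx.get("burn", 0.0)


class MarketLayerModel:
    """The validated token model: observes ledger state, produces an action."""

    def step(self, state: dict) -> dict:
        return {"mint": 0.02 * state["supply"], "burn": 500.0}


ledger, model = InMemoryLedger(), MarketLayerModel()
for _ in range(3):
    ledger.submit(model.step(ledger.read_state()))
print(ledger.read_state())
```

Keeping the two layers behind an explicit interface means the same validated model can be exercised against a test ledger first and against the live one later.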


Measuring, Monitoring & Maintenance

Once these models are out in the wild, the job of token design and engineering has only just started. Token models will need to be constantly monitored, maintained, tuned, and iterated on. A good project will feel responsible for the network it has launched and will ensure its long-term sustainability, effectiveness, and security well past the initial network launch.
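
A minimal monitoring loop might look like the sketch below: compare live network metrics against safe operating ranges and raise an alert when anything drifts out of bounds. The metrics and thresholds are assumptions for illustration; a real deployment would pull these values from on-chain data and analytics pipelines.

```python
# Illustrative monitoring sketch: flag metrics that leave their assumed safe ranges.

SAFE_RANGES = {
    "daily_active_users": (1_000.0, float("inf")),
    "token_velocity": (0.1, 10.0),
    "staking_ratio": (0.2, 0.9),
}


def check_metrics(metrics: dict) -> list[str]:
    alerts = []
    for name, (low, high) in SAFE_RANGES.items():
        value = metrics.get(name)
        if value is None:
            alerts.append(f"{name}: no data")
        elif not low <= value <= high:
            alerts.append(f"{name}={value} outside safe range [{low}, {high}]")
    return alerts


# Example reading; in practice these values would come from live network data.
print(check_metrics({"daily_active_users": 850, "token_velocity": 2.3, "staking_ratio": 0.15}))
```

Alerts like these are what feed the tuning and iteration described above.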

System safety needs to be a primary focus for emerging tokenised ecosystems if they are to scale successfully. By focusing on continued iteration and building robust feedback loops, tokenised ecosystems can react to and learn from small shocks, becoming more robust and stable over time as they adjust to further shocks. The idea of antifragility (famously articulated by Nassim Nicholas Taleb), of being strengthened by disorder, applies well to the token design space. This is why engineering rigour and system safety need to be a key focus when deploying your token design into the wild.
