What could tokenised generative AI look like?

Generative AI is evolving at a rapid and only-accelerating pace, faster than our human minds can reasonably comprehend. Capabilities we would once have considered distant-future science fiction are being realised today. This technology is both promising and worrying: in the hands of the few (as opposed to the hands of the many), it may have similar implications to the centralisation of control over any other important technology (e.g. money). It holds the promise of utopia, but also bears the potential for dystopia.

For the same reasons it is beneficial to have decentralised alternatives to traditional technologies (efficiency, credible neutrality, etc.), it is important to apply the same thinking to AI.

This short article was born of an internal thought experiment conducted last November, after the launch of ChatGPT by OpenAI. In this experiment, we unbundled the AI tech stack (at a high level) and looked at ways it could be decentralised at each layer.

When it comes to decentralisation, it is crucial to correctly align the incentives of disparate stakeholders, which is usually achieved through the use of a token. Therefore, a potential token use case is explored when theorising what each layer could look like if it were decentralised.

It must also be mentioned that decentralisation is a spectrum, not a binary. In practice, nothing will be ‘truly’ decentralised; rather, everything will exist within various degrees of decentralisation.

Please note that not all of this may be technically correct or technically feasible. It is a thought experiment conducted by non-AI experts.

1. The Application Layer

What is it

The first layer is the application layer, which consists of the applications and widgets built on top of the underlying technology, each serving a specific purpose or solving a specific problem.

The state of the art

At OpenAI, the application layer follows a freemium business model: the API may be used at a throttled rate for free, and extra usage credits may be purchased. The ChatGPT web app built by OpenAI, however, is openly accessible. In centralised operations, the API is not open and freely available for consumption, but permissioned (via registration, identification, payment, etc.). Access is controlled and granted by a centralised entity.

How it would look if decentralised

In decentralised operations, there is an open, permissionless API that anyone can access, use, and build on top of. Tokens could be utilised in the form of credits (payments and discounts), access, and curation. Tokens here could represent ‘prompt tickets’, and/or act as signalling mechanisms showing which applications users rate as most valuable, structured so that all users have an incentive to rate the apps.

Access to this API could be maintained and provided by a marketplace of providers running their own AI nodes, with permissionless discovery, matching, and curation.
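To make this concrete, below is a minimal Python sketch of how a provider node might meter prompts with token credits and collect curation signals. Every name here (the PromptTicketLedger, the one-ticket-per-rating reward) is hypothetical; a real system would implement this logic in smart contracts rather than in-memory Python.

```python
from dataclasses import dataclass, field


@dataclass
class PromptTicketLedger:
    """Hypothetical ledger of prompt tickets (token credits) and app ratings."""
    balances: dict[str, int] = field(default_factory=dict)
    ratings: dict[str, list[int]] = field(default_factory=dict)  # app -> scores

    def deposit(self, user: str, tickets: int) -> None:
        self.balances[user] = self.balances.get(user, 0) + tickets

    def spend(self, user: str, tickets: int = 1) -> bool:
        """Deduct tickets for a prompt; anyone with a balance may call this."""
        if self.balances.get(user, 0) < tickets:
            return False
        self.balances[user] -= tickets
        return True

    def rate_app(self, user: str, app: str, score: int) -> None:
        """Record a curation signal and reward the rater."""
        self.ratings.setdefault(app, []).append(score)
        self.deposit(user, 1)  # toy incentive: one ticket per rating


def serve_prompt(ledger: PromptTicketLedger, user: str, prompt: str) -> str:
    """A provider node serves a prompt only once the ticket is paid."""
    if not ledger.spend(user):
        return "error: insufficient prompt tickets"
    return f"model output for: {prompt!r}"  # stand-in for real inference


ledger = PromptTicketLedger()
ledger.deposit("alice", 2)
print(serve_prompt(ledger, "alice", "Explain tokenised AI"))
ledger.rate_app("alice", "summariser-app", 5)
```

The design choice worth noting is that payment and curation share one ledger: the same token that pays for prompts also rewards ratings, giving users a built-in incentive to curate applications.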

2. The Functional Layer

What is it

The second layer is the functional layer, which comprises the AI model and the parameter settings describing how the AI functions in practice. This layer also includes the talent with the relevant expertise doing the day-to-day work of developing and training these models.

The state of the art

In centralised operations, the functional layer is closed-source and opaque. It is usually the core IP of a for-profit business entity. The functional layer is governed by a closed group of stakeholders who decide upon and implement the functions of the model itself (how it works).

How it would look if decentralised

In decentralised operations, the functional layer is more open, optimally consisting of big tech, governments, enterprises, and relevant opinion leaders/experts who have a say in how the model functions and who coordinate the development and maintenance of those models. This could take the form of an elected governing body, or Proof of Authority within a rough DAO structure. Different types of tokens (both fungible and non-fungible) could be leveraged here to construct governance frameworks and voting rights.

DAO members who have attained various badges and attestations (tokens) verifying their expertise can propose or vote on suggested changes to the workings of the model. Various governance structures could be applied here; however, the emphasis is on those that value expertise and reputation as opposed to financial capital. Theoretically, any person can become one of those working directly on the functional layer; however, they must appeal to those with the power to include them in this core group.
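As a rough sketch, the snippet below shows what such badge-weighted voting could look like. The badge taxonomy, weights, and proposal structure are all assumptions made for illustration; an actual DAO would encode these rules in smart contracts.

```python
from dataclasses import dataclass, field

# Hypothetical weights: non-transferable expertise badges count for more
# than headcount or capital, per the reputation-over-capital emphasis above.
BADGE_WEIGHTS = {"core-researcher": 5, "ml-practitioner": 3, "contributor": 1}


@dataclass
class Proposal:
    description: str
    votes_for: int = 0
    votes_against: int = 0
    voters: set = field(default_factory=set)


def voting_weight(badges: list[str]) -> int:
    """Weight derived from attested badges, not financial stake."""
    return sum(BADGE_WEIGHTS.get(b, 0) for b in badges)


def cast_vote(proposal: Proposal, member: str, badges: list[str], support: bool) -> None:
    if member in proposal.voters:
        raise ValueError("member has already voted")
    weight = voting_weight(badges)
    if weight == 0:
        raise ValueError("no recognised expertise badges")
    proposal.voters.add(member)
    if support:
        proposal.votes_for += weight
    else:
        proposal.votes_against += weight


p = Proposal("Lower the model's default sampling temperature")
cast_vote(p, "alice", ["core-researcher"], support=True)
cast_vote(p, "bob", ["contributor"], support=False)
print(p.votes_for > p.votes_against)  # True: expertise outweighs headcount
```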

3. The Data Layer

What is it

The third layer is the data layer, which consists of the datasets, and the selection thereof, used to train the AI model in the functional layer.

The state of the art

In centralised operations, a central team decides which datasets the model is trained on, with minimal or no oversight from external stakeholders. At OpenAI, the training data is selected and curated by the OpenAI team. To the best of our knowledge, this currently includes data from Common Crawl (commoncrawl.org), WebText, Wikipedia, and e-books.

How it would look if decentralised

In decentralised operations, a federated group of internal and external stakeholders makes recommendations and reaches consensus on which training datasets should be used. A token here would attach rational, economic incentives for these stakeholders to reach consensus.

A DAO permissionlessly nominates and elects a diverse group of experts to decide what training data is used. An additional layer of decentralisation would be to decentralise not only the decision-making on which datasets to use for training, but also the sourcing of the data itself, via decentralised data marketplaces.
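A minimal sketch of that dataset consensus, assuming a hypothetical quorum rule over the endorsements of elected experts (the function name and threshold are inventions for the example):

```python
from collections import Counter


def select_datasets(nominations: dict[str, set[str]], quorum: float = 0.5) -> list[str]:
    """Approve a dataset once more than `quorum` of the elected experts back it.

    `nominations` maps each elected expert to the datasets they endorse.
    """
    if not nominations:
        return []
    tally = Counter(d for endorsed in nominations.values() for d in endorsed)
    threshold = quorum * len(nominations)
    return sorted(d for d, votes in tally.items() if votes > threshold)


# Three elected experts endorse overlapping sets of candidate datasets.
experts = {
    "expert-a": {"common-crawl", "wikipedia"},
    "expert-b": {"common-crawl", "e-books"},
    "expert-c": {"wikipedia", "common-crawl"},
}
print(select_datasets(experts))  # ['common-crawl', 'wikipedia']
```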

4. The Computational Layer

What is it

The computational layer of the generative AI architecture refers to the computing resources utilised in the actual computation of prompts, as well as in the pre-training and reinforcement of models prior to release. This layer also includes the parameters used for each individual computation.

The state of the art

Computation occurs on centralised servers run on-premises by OpenAI, in enterprise cloud computing solutions, or a combination of both. Users must trust that the same model and parameters are used for each prompt.

How it would look if decentralised

The responsibility for computation could be given to a federated network of whitelisted nodes that have the capacity to run the model and must pass certain requirements to participate. Alternatively, requests could be delegated to a decentralised computing network such as Cudos or Golem, where computations can be proven to have been conducted honestly in a trustless, crypto-economic way. Security is achieved through tokens, which create crypto-economic guarantees that computations are performed honestly; providers of computing resources are paid in tokens for their service.

Furthermore, each computing node in the network would have to submit a proof, or a hash, of the model and the parameters it is using to fulfil tasks requested by the network. If a node's submitted proof/hash does not match those submitted by other nodes, this could trigger a slashing condition, causing the loss of the tokens that the node had put up as collateral in order to start operating in the first place.
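As an illustration, the sketch below derives a fingerprint from the model weights and run parameters, treats the majority fingerprint as canonical, and slashes the stake of any deviating node. The majority rule, stake amounts, and penalty are assumptions for the example; a production network would need a far more robust consensus and proof mechanism.

```python
import hashlib
from collections import Counter


def model_fingerprint(model_weights: bytes, params: str) -> str:
    """Hash of the model weights plus run parameters a node claims to use."""
    return hashlib.sha256(model_weights + params.encode()).hexdigest()


def settle_round(submissions: dict[str, str], stakes: dict[str, int],
                 penalty: int = 10) -> dict[str, int]:
    """Slash nodes whose submitted hash deviates from the majority hash."""
    canonical, _ = Counter(submissions.values()).most_common(1)[0]
    for node, digest in submissions.items():
        if digest != canonical:
            stakes[node] = max(0, stakes[node] - penalty)  # slashing condition
    return stakes


weights = b"\x00\x01\x02"  # stand-in for real model weights
honest = model_fingerprint(weights, "temperature=0.7")
cheater = model_fingerprint(b"smaller-model", "temperature=0.7")

stakes = {"node-a": 100, "node-b": 100, "node-c": 100}
submissions = {"node-a": honest, "node-b": honest, "node-c": cheater}
print(settle_round(submissions, stakes))  # node-c loses part of its stake
```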

Conclusion

The AI tech stack is a complex system of layers, each with its own unique characteristics and functions. We have compared centralised and decentralised operations across the layers of this stack. While centralised operations involve a closed group of stakeholders controlling the functional layer, decentralised operations allow for more open decision-making through a DAO, as well as collective ownership and development of a highly disruptive new technology.

We have also discussed the data and computational layers, along with their potential for decentralised decision-making and decentralised marketplaces. Security may be achieved through native consensus, outsourced consensus, and various economic and/or cryptographic guarantees that computations are performed honestly.

As we enter the exponential age, the wielders of new, permissioned technology will have access to vastly more advanced capabilities than those without this privilege. Hence, elements of decentralisation are arguably more important now than ever for equality of both opportunity and outcome. In a world where most economic activity transpires digitally, cryptography and tokenisation are two of the leading tools we can use to create a more equitable future for all.
