The Decentralised Value Ecosystem: Production > Distribution > Consumption
By Lawrence Lundy-Bryan
Contributions from Joel John, Harry McLaverty, and Shaquile Noor
What does artificial intelligence have to do with blockchains? Well, it’s actually helpful to strip away the buzzwords and talk about a new decentralised data value ecosystem in which data is produced, distributed and consumed. Using that framing, the Internet of Things produces data, blockchains and other authentication technologies distribute it, and then the data needs to be processed, analysed and automated. This is where so-called smart contracts, decentralised compute and decentralised machine learning can be applied to data held in decentralised databases, document stores and blockchains.
It is here that innovations from the blockchain and artificial intelligence communities blur together, and it becomes clear how deeply the two fields are intertwined.
Both smart contracts and machine learning offer differing levels of automation and decentralisation depending on the type of input data and level of trust the use case demands.
Distributed Compute
Distributed compute refers to computing in which a complex problem is broken down into simpler tasks. These simple tasks are distributed to a network of trusted computers to be solved in parallel, and the solutions are then combined to solve the main problem at hand. This is similar to how processors (CPUs and GPUs) developed from single-core to multi-core designs, with multiple cores solving a problem more quickly than one core by itself. The premise is simple, but traditionally the participating computers need to trust each other for the system to work. Blockchains and ledgers remove that requirement: they can be used to create networks of computers bound by a ‘trust framework’ and to incentivise nodes to work together, rewarding those who solve the simple tasks with tokens that have a financial value, however small. Blockchain projects including Golem and iExec are actively working on this problem. Other projects like Truebit are working towards trustless off-chain computation using a prover-verifier game. Verifiable and non-verifiable distributed processing will both be needed, depending on the level of trust between participants in the network. Interestingly, we could finally see the realisation of the vision behind the National Science Foundation Network (NSFNET) project from the 1980s: a supercomputer on demand for any computing task. Other distributed computing projects like Nyriad are looking to achieve hyper-scale storage processing, but without tokens, using a concept called ‘liquid data’.
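To make the pattern concrete, here is a minimal Python sketch of the divide-solve-combine loop described above, using only the standard library. It assumes trusted workers on one machine; the verification and token-incentive layers that projects like Golem, iExec and Truebit add on top are deliberately left out.

```python
# Minimal sketch of the divide-and-conquer pattern behind distributed compute:
# a large problem is split into independent subtasks, solved in parallel,
# and the partial results are combined. Real networks like Golem or Truebit
# add untrusted workers, verification games, and token payments on top.
from concurrent.futures import ProcessPoolExecutor

def subtask(chunk):
    """A 'simple problem': here, summing the squares of one slice of data."""
    return sum(x * x for x in chunk)

def solve_distributed(data, workers=4):
    # 1. Break the complex problem into simpler, independent pieces.
    size = len(data) // workers or 1
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # 2. Farm the pieces out to a pool of workers running in parallel.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        partials = list(pool.map(subtask, chunks))
    # 3. Combine the partial solutions into the answer to the main problem.
    return sum(partials)

if __name__ == "__main__":
    print(solve_distributed(list(range(1_000_000))))
```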
Quantum computing is different from distributed computing in that it looks to solve problems that are intractable for existing computers (read: Turing machines). By using quantum particles, the nascent technology has the potential to explore a vast number of candidate solutions simultaneously within a single machine, rather than across a network of machines. These machines pose a potential threat to blockchain technology because blockchains rely on public-key cryptography (also commonly used in banking for credit card security), whose security rests on problems such as finding the prime factors of huge numbers. These problems would typically take many hundreds or even thousands of years to solve, but with quantum computers, this timeframe could be reduced to hours or minutes. Companies like IBM, Rigetti and D-Wave are driving progress in the field.
Parallelisation is the thread that ties together distributed computing and quantum computing. Distributed computing involves networks of computers that solve a problem by working on smaller sub-problems in parallel, while a quantum computer explores many possibilities simultaneously within a single machine. In both cases, we can start to rely on networks of incentivised machines to solve computational challenges, rather than servers owned by centralised entities. From an incentivisation perspective, blockchains enable these networks to work efficiently and ‘trustlessly’, with a token powering a marketplace of nodes with computing power. Quantum computers could also form part of these networks, solving the specific problems that classical computers cannot.
Smart Contracts
There are currently a handful of smart contract blockchain platforms that have successfully captured the market. According to Etherscan, there are 93,039 ERC20 token contracts on Ethereum. Waves, NEO and Stellar are all developing their own standards in an attempt to challenge Ethereum’s dominance. In a nutshell, smart contracts are programmable “if this, then that” conditions attached to transactions on the blockchain. If situation ‘A’ occurs, the contract is coded to execute an automated response ‘B’. This idea isn’t new, and we can find examples all around us, such as in vending machines: if button ‘A’ is pressed, then amount ‘X’ is required; if amount ‘X’ is paid, then snack ‘B’ is dispensed. When this simple concept is added to blockchains, contracts cannot be forged, changed, or destroyed without an audit trail, because the ledger distributes identical copies of the contract across a network of nodes for verification by anyone at any time. When transparency can be guaranteed, these contracts become possible in industries that would previously have deemed them too risky.
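As a toy illustration, the vending-machine logic above can be sketched in a few lines of Python. This is not how production smart contracts are written (those use contract languages such as Solidity, deployed once and executed identically by every node); the class and names here are purely illustrative.

```python
# Toy illustration of a smart contract's "if this, then that" logic,
# modelled on the vending machine example. On a real blockchain this logic
# would run identically on every node, so the outcome is auditable by anyone.
class VendingContract:
    PRICES = {"A": 150, "B": 200}  # snack -> price in cents

    def __init__(self):
        self.ledger = []  # append-only audit trail of every state change

    def purchase(self, button, amount_paid):
        price = self.PRICES.get(button)
        if price is None:
            raise ValueError("unknown button")
        # If button 'A' is pressed, amount 'X' is required...
        if amount_paid < price:
            self.ledger.append(("rejected", button, amount_paid))
            return None
        # ...and if 'X' is paid, snack 'B' is dispensed: no third party needed.
        self.ledger.append(("dispensed", button, amount_paid))
        return {"snack": button, "change": amount_paid - price}

contract = VendingContract()
print(contract.purchase("A", 200))  # {'snack': 'A', 'change': 50}
print(contract.ledger)              # every interaction leaves a trace
```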
With embedded legal frameworks, smart contracts have the potential to replace and automate many existing paper contracts. Mattereum is working on such legally-enforceable smart contracts. The process of buying a house could become far more efficient, with no banks, lawyers, or estate agents required. Countless hours, expenses and middle-men could be condensed into a few dozen lines of code and an automated product. This automation principle applies to any industry that requires trusted third parties to oversee agreements. Contracts are only as good as their enforcement, however, so decentralised dispute resolution services are necessary to make smart contracts useful. Early efforts in this direction, such as Kleros, use prediction markets and reputation-staking tools.
With the rapid development and convergence of AI and decentralised networks, we will begin to see more complex smart contracts develop, such as contracts whose conditions are evaluated by expansive neural networks rather than hand-coded rules.
The development of these systems could surface inconsistencies in legal frameworks, resulting in a more robust legal system. Smart contracts would then be built upon those legal models, with which AI systems must comply. It is still early in the development cycle of smart contracts, and progress will require collaboration from the legal industry as well as lawmakers in government; smart contracts should be seen as the legal rails for the digital world. If tokens are the beginnings of digitally-native money and financial assets, smart contracts are the beginnings of a digitally-native legal system. Smart contracts, like distributed computation and decentralised machine learning, will automate data processing in the Convergence Ecosystem, creating unprecedented levels of automation within auditable parameters.
Decentralised Machine Learning
Machine learning is a field within artificial intelligence that focuses on enabling computers to learn rather than be explicitly programmed. More traditional AI approaches based on rules and symbols are not capable of capturing the complex statistical patterns present in natural environments such as visual and auditory scenes, or in our everyday modes of interaction such as movement and language. A relatively recent breakthrough in machine learning called deep learning is currently driving progress in the field (though for how much longer is up for debate). Deep learning techniques are ‘deep’ because they use multiple layers of information-processing stages to identify patterns in data, with successive layers training the system to understand structures within the data. Deep learning as a technique is not new, but combined with big data, greater computing power and parallel computing, it has become increasingly accurate at previously challenging tasks such as computer vision and natural language processing. The most recent breakthroughs in transfer learning and strategic play come from the combination of deep learning and reinforcement learning, as with DeepMind’s AlphaGo.
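A minimal numpy sketch of what “multiple layers of information processing stages” means in practice follows. The weights here are random placeholders (a real system would learn them from data with gradient descent); the layer sizes are arbitrary assumptions.

```python
# Minimal numpy sketch of why deep learning is 'deep': the input passes
# through several stacked processing stages (layers), each transforming the
# previous layer's output so that progressively more abstract patterns can
# be represented. Weights are random here; training would adjust them.
import numpy as np

rng = np.random.default_rng(0)

def layer(x, n_out):
    """One information-processing stage: linear map + nonlinearity (ReLU)."""
    w = rng.normal(size=(x.shape[-1], n_out))
    return np.maximum(0.0, x @ w)

x = rng.normal(size=(1, 64))   # e.g. a 64-dimensional input signal
h1 = layer(x, 32)              # first layer: low-level features
h2 = layer(h1, 16)             # second layer: combinations of features
out = layer(h2, 4)             # final layer: task-level representation
print(out.shape)               # (1, 4)
```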
Machine and deep learning techniques can transform raw data into actionable knowledge: converting voice input into text output in voice-to-text programs, or turning LIDAR input into a driving decision. In diverse fields including image and speech recognition, medical diagnosis, and fraud detection, machine learning is equipping us with the ability to learn from large amounts of data. In the current machine learning paradigm, solutions are delivered as cloud-based APIs by a handful of leading companies, but it is becoming increasingly apparent that this paradigm is not sustainable.
“Data and services are costly to use and can’t sell themselves. It’s staggering to consider all that gets lost without its value ever being realised — especially when it comes to intelligence constructed about markets and data. We simply can’t let all that value be captured by a select few. Fetch has a mission to build an open, decentralised, tokenised network that self-organises and learns how to connect those with value to those who need it, or indeed may need it; creating a more equitable future for all.” Toby Simpson, Co-founder, Fetch
As the Convergence paper argues throughout, centralised systems suffer from a few fundamental problems: an inability to coordinate globally, limits on collaboration and interoperability, and a tendency toward market monopoly and censorship. With machine learning becoming integral to our lives, centralised machine learning is a threat to both economic competition and freedom of speech.
The Convergence Ecosystem, if realised, provides global data-sharing and marketplace infrastructure that enables AIs to collaborate and coordinate processing in a decentralised way. Removing centralised bottlenecks for heavy computational workloads also helps address latency issues, reducing the time needed to train models. On-device training, as in Google’s Federated Learning model, is a technical improvement, but it lacks the mass coordination that marketplaces and tokens make possible.
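A minimal sketch of the federated-averaging idea behind on-device training is shown below, in numpy. The simple linear model, client data and round counts are illustrative assumptions, not Google’s actual implementation.

```python
# Minimal sketch of federated averaging, the idea behind on-device training:
# each client improves the shared model on its own private data, and only
# the resulting weights (never the raw data) are averaged centrally. A
# decentralised version would replace the averaging server with a
# token-incentivised network of nodes.
import numpy as np

rng = np.random.default_rng(1)

def local_update(weights, X, y, lr=0.1, steps=20):
    """One client's on-device training: gradient descent on private data."""
    w = weights.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

# Three clients, each holding private data drawn from the same true model.
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

global_w = np.zeros(2)
for _ in range(10):
    # Each round: clients train locally, then the updates are averaged.
    updates = [local_update(global_w, X, y) for X, y in clients]
    global_w = np.mean(updates, axis=0)

print(global_w)  # approaches [2, -1] without raw data leaving any client
```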
Decentralised machine learning not only provides a coordination mechanism for the more efficient allocation of resources; it also increases access to machine learning capabilities by allowing anyone to submit models and algorithms and get paid based on their quality and utility. SingularityNET, doc.ai and Fetch (a portfolio company) are examples of companies already building the type of decentralised artificial intelligence described here. Decentralised machine learning is the end result, but it would not be possible without the development of distributed ledgers, consensus mechanisms, identity, reputation, interoperability protocols and data marketplaces.
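To illustrate the kind of mechanism described, here is a hypothetical toy marketplace loop in which submitted models are scored on a shared task and a reward pool is split in proportion to measured quality. SingularityNET and Fetch each define their own protocols, staking and reputation systems; none of the names or logic below come from their implementations.

```python
# Hypothetical sketch of a decentralised ML marketplace's core loop:
# anyone may submit a model, submissions are scored against a shared task,
# and a token reward pool is split in proportion to measured quality.
# Illustrative only; real protocols add staking, reputation, and consensus.

def score(model, test_set):
    """Fraction of test examples the submitted model predicts correctly."""
    return sum(model(x) == y for x, y in test_set) / len(test_set)

def settle(submissions, test_set, reward_pool):
    """Pay each contributor a share of the pool proportional to quality."""
    scores = {name: score(m, test_set) for name, m in submissions.items()}
    total = sum(scores.values()) or 1.0
    return {name: reward_pool * s / total for name, s in scores.items()}

# Two toy 'models' competing on a parity-prediction task.
test_set = [(n, n % 2) for n in range(100)]
submissions = {
    "alice": lambda n: n % 2,  # perfect model
    "bob":   lambda n: 0,      # always guesses 'even', right half the time
}
print(settle(submissions, test_set, reward_pool=100.0))
# alice earns ~66.7 tokens, bob ~33.3, matching their relative accuracy
```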
We must avoid the “disconnected and dysfunctional ‘villages’ of specialization”, as Alexander von Humboldt put it, and instead take a holistic view that sees the connectedness of seemingly disparate technological innovations.
Read the full Convergence paper here, or go back and read the rest of the abridged Convergence articles:
- Decentralised Marketplaces – Value Capture in Web 3.0
- Interoperability – Building the Internet of Blockchains
- Data Distribution – A Real Use Case for Blockchains: A Global Data Commons
- Data Collection – Building a New Data Infrastructure with Blockchains and the Internet of Things
- Governance – The End of Scale: Blockchains, Community, & Crypto Governance
- Thesis Introduction – VC for The Decentralised Future: Introducing the Convergence Ecosystem