By Lawrence Lundy-Bryan, Head of Research
The Convergence Ecosystem is open-source, distributed, decentralised, automated and tokenised, and we believe it is nothing less than an economic paradigm shift.
We are excited to introduce the Outlier Ventures vision of the future and our investment thesis: The Convergence Ecosystem. The new data value ecosystem sees data captured by the Internet of Things, managed by blockchains, automated by artificial intelligence, and all incentivised using crypto-tokens. For a summary of the thesis, take a look at the introductory blog, and for a deeper look into Blockchains, Community, & Crypto Governance, have a read of my last post here. Today, though, I want to talk specifically about the convergence of blockchains and the Internet of Things.
[image id=’1961′]
Complexity versus simplicity
As the graphic above shows, data enters the ecosystem at the ‘data collection’ layer through either hardware (the Internet of Things) or software (web, VR, AR, etc.). In fact, early feedback on the paper has suggested that what we are really talking about here is a new data value chain, and I agree with that to some extent. But of course, this is just a snapshot, a simplification of the emerging data value chain.
If your first thought upon reading the paper or looking at the graphic was “buzzword salad” or “this is too abstract, what are the actual products and protocols that need to be built?”, you are not alone. Indeed, thinking through the Convergence Ecosystem involved a constant tension between complexity and simplification.
I felt it was more important that non-technical people understood that all these seemingly disparate technologies are connected than that I went into detail about the technical differences between, say, Cosmos and Polkadot in addressing blockchain interoperability. This simplification can be seen at the data collection layer, where I note the Internet of Things and software as the two entry points for data. This is purposefully broad: I had another attempt which separated hardware into types of devices (mobile, wearables, IoT devices, learning robots), but ultimately the ecosystem became too complex and overwhelming for the layperson to understand. With that in mind, I decided that any sensor measuring the external environment should be bundled together under the umbrella term the ‘Internet of Things’; this includes all the sensors in smartphones and wearables, such as gyroscopes, accelerometers and proximity sensors, as well as the hundreds of other sensors measuring our world. As for software, this is broad enough to include any data created in the digital environment regardless of application: augmented reality and virtual reality worlds, our digital exhaust from online activity, and gaming are just a few examples.
The key exercise isn’t to define exactly where data will come from. The key message is that the amount of data created annually will reach 180 zettabytes (one zettabyte is equal to one trillion gigabytes) by 2025, up from 4.4 zettabytes in 2013, and that the average person anywhere will interact with connected devices every 18 seconds (nearly 4,800 times a day).
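For the curious, the 4,800 figure is simply the number of 18-second intervals in a day; a quick back-of-the-envelope check in Python:

```python
# Sanity check on the projected interaction rate.
SECONDS_PER_DAY = 24 * 60 * 60          # 86,400 seconds
SECONDS_PER_INTERACTION = 18            # projected average interval

interactions_per_day = SECONDS_PER_DAY / SECONDS_PER_INTERACTION
print(f"{interactions_per_day:.0f} interactions per day")  # -> 4800
```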
The so-called Internet of ‘Things’
If you thought the blockchain industry lacked a clear definition, the internet of so-called ‘things’ is even worse. The industry lacks a standard definition of the IoT; in its broadest sense, it will come to include every physical object that has a sensor, a microcontroller and an Internet connection. Today that mainly means connected home devices like the Amazon Echo, wearables like the Apple Watch, industrial and agricultural connected sensors, and smart meters measuring home energy usage. But the range of applications is growing, and it has been estimated that by 2024 the automotive industry will account for almost a third of all IoT connections, followed by consumer electronics, FMCG (fast-moving consumer goods) and the utility sector. Other sectors, including smart cities, supply chain, manufacturing and healthcare, will make up a relatively small proportion of the connections. The IoT market intersects with the robotics market in the sense that a robot has the same features as an IoT device, but with the addition of actuators and the means to move and respond to its environment. We would consider connected vehicles, service robots and other types of robotics to be data-collecting machines.
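To make that distinction concrete, here is a minimal sketch in Python (the class and field names are my own, purely illustrative): both kinds of machine collect sensor readings, while the robot adds actuators it can drive in response to its environment.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class IoTDevice:
    """A connected 'thing': sensors, a microcontroller and an Internet connection."""
    device_id: str
    sensors: List[str]                        # e.g. ["gyroscope", "accelerometer"]
    readings: Dict[str, float] = field(default_factory=dict)

    def record(self, sensor: str, value: float) -> None:
        self.readings[sensor] = value         # keep the latest reading per sensor


@dataclass
class Robot(IoTDevice):
    """Same data-collecting features, plus actuators to act on the environment."""
    actuators: List[str] = field(default_factory=list)

    def actuate(self, actuator: str, command: str) -> str:
        return f"{self.device_id}: {actuator} -> {command}"


car = Robot(device_id="vehicle-7", sensors=["lidar", "gps"], actuators=["steering", "brakes"])
car.record("gps", 51.5074)
print(car.actuate("brakes", "engage"))
```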
The IoT market is often measured by the number of connections (roughly 30 billion by the end of the decade) or by economic impact (McKinsey puts it at 11 trillion dollars over the next decade). A less-asked question is: what happens to all the data? The same McKinsey study found that we may be using as little as 1% of the data being generated. As well as under-utilising data, how data is being used is often unclear: in a survey by the Ponemon Institute, 82% of respondents said that IoT manufacturers had not provided any details about how their personal information is handled.
The emergence of distributed systems like IPFS and Filecoin, alongside blockchains, offers a potential new model for data storage and utilisation. It has long been expected that data would be fought over by device makers, software providers, cloud providers and data analytics companies. In fact, the reluctance of car makers to put Android Auto or Apple CarPlay into their cars reflects an awareness that they would lose control of valuable data.
So the key value proposition of distributed and decentralised systems in many cases isn’t actually ‘censorship resistance’ or ‘unstoppable payments’; it is a shared (but private) dataset of industry data, both transactional and otherwise. We are still early in the development of the blockchain industry: we still need to prove and scale privacy technologies like zero-knowledge proofs, secure multi-party computation and differential privacy, as well as increase the throughput of blockchains and link them robustly with off-chain databases to handle the volumes of data we expect the IoT to generate.
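One common pattern for that kind of on-chain/off-chain linkage, sketched below with purely illustrative stand-ins rather than any specific chain or database, is to keep raw IoT readings in a private off-chain store and commit only a content hash on-chain, so participants can share and verify data without publishing it.

```python
import hashlib
import json

# Hypothetical stand-ins: a private off-chain store holding raw readings,
# and an on-chain log that records only content hashes.
off_chain_store = {}
on_chain_commitments = []


def content_hash(record):
    """Deterministic SHA-256 digest of a reading, used as the on-chain commitment."""
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


def publish_reading(record):
    """Keep the raw data private off-chain; publish only its hash."""
    digest = content_hash(record)
    off_chain_store[digest] = record
    on_chain_commitments.append(digest)
    return digest


def verify_reading(record):
    """Anyone holding the raw record can check it against the public commitment."""
    return content_hash(record) in on_chain_commitments


reading = {"device": "meter-42", "kwh": 1.37, "ts": "2018-03-01T12:00:00Z"}
publish_reading(reading)
assert verify_reading(reading)
```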
Very broadly speaking, decentralised technologies can provide shared data infrastructure whereby data use isn’t a zero-sum game. It is no longer a case of generating data under a use-it-or-lose-it model. The stack of technologies, including blockchain-based marketplaces, enables IoT data creators, whether machine-owned or human-owned, to buy and sell data.
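As a rough illustration of what such a marketplace might expose, the sketch below uses hypothetical names, addresses and token amounts (not any specific protocol) to list a data stream at a token price and settle a purchase that grants the buyer access.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Listing:
    seller: str          # address of the data creator (human- or machine-owned)
    stream_id: str       # identifier of the IoT data stream being sold
    price_tokens: float  # asking price in the marketplace's crypto-token


@dataclass
class Marketplace:
    listings: Dict[str, Listing] = field(default_factory=dict)
    access: Dict[str, List[str]] = field(default_factory=dict)   # stream_id -> buyers
    balances: Dict[str, float] = field(default_factory=dict)     # address -> tokens

    def list_data(self, listing: Listing) -> None:
        self.listings[listing.stream_id] = listing

    def buy(self, buyer: str, stream_id: str) -> None:
        listing = self.listings[stream_id]
        if self.balances.get(buyer, 0.0) < listing.price_tokens:
            raise ValueError("insufficient token balance")
        # settle the token transfer and grant data access
        self.balances[buyer] -= listing.price_tokens
        self.balances[listing.seller] = self.balances.get(listing.seller, 0.0) + listing.price_tokens
        self.access.setdefault(stream_id, []).append(buyer)


market = Marketplace(balances={"0xbuyer": 10.0})
market.list_data(Listing(seller="0xsensor-owner", stream_id="air-quality-ldn", price_tokens=2.5))
market.buy("0xbuyer", "air-quality-ldn")
```

In a real deployment the balances and access registry would live in a smart contract and the data itself would stay off-chain, but the interface is essentially this simple: list, price, settle, grant access.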
Software is eating the world, and throwing off valuable data
On top of the tens of billions of IoT connections, we also need to add digital media and human-generated digital data. We are on our way to quantifying and digitising our external world, and we are even further along in gathering data on our digital lives. We use the term ‘software’ broadly as a producer of data, capturing all personal and business data produced through interaction with databases, operating systems, applications and APIs. These interactions build up digital dossiers, including cookie and web-browsing data as well as active traces like social media and messaging.
On the business side, as we continue to digitise and bring online our offline interactions and documents like electronic health records and learning records, key sectors will have an overwhelming amount of data to handle, which they do not have the capabilities to utilise. On the consumer side, digitally-created and digitally-augmented environments with augmented reality (AR) or virtual reality (VR) will lead the growth in available personal information.
Half the world’s population (3.82 billion people) will have an Internet connection by the end of 2018, and by 2020 it will be 4.17 billion. Mobile data traffic will grow to 49 exabytes per month by 2021, a sevenfold increase over 2016, according to Cisco. We are creating unfathomable amounts of data, and the growth shows no sign of abating. Adoption of AR and VR will further increase the amount and granularity of data that we can collect, enabling deeper insights into individual and collective behaviours. Whether it comes from the IoT or from software, we have a massive data problem.
IoT needs blockchains
We are creating and collecting more data than ever, but we are storing it in insecure private databases with no incentives to share it. Data breaches and hacks are commonplace, and data can be censored or tampered with. Software-generated data is lost, hoarded or left latent. There is no incentive for consumers to do anything other than give their data away for free, and none for corporations to do anything other than hoard it.
Open-source, distributed, decentralised, automated and tokenised infrastructure offers a solution.
For more on how communities and tokens will integrate with the Internet of Things and Artificial Intelligence, read the full paper here.