Building a Responsible Data Economy, Dawn Song of Oasis Labs



October 2020

Posted by

Jamie Burke

CEO and Founder

Dawn is founder & CEO of Oasis Labs and the Oasis Network, as well as an award-winning professor at Berkeley working on world-leading research in applied cryptography, security, blockchain and machine learning, and a serial entrepreneur. We talk about spinning out deep tech startups from academia, as well as her important mission to create a more responsible data economy based on privacy and user centricity.


Key Themes:

  • The new open / responsible data economy
  • Surveillance Capitalism
  • User centricity in Web 3
  • Creating an AI Eden; sovereign AI
  • New data ownership models
  • Data commons
  • Convergence of AI & blockchain

Listen on iTunes


Jamie Burke 0:06
Join us for Diffusion, diffusion.events, a two-day virtual conference from the 15th to the 16th of September, showcasing leading projects from across Web 3, with firesides and panels featuring leaders like Joe Lubin of Ethereum and others from projects including Set Protocol, The Graph, Parity, R3, Golem and eToro. So today, I'm really happy to welcome the founder of Oasis Labs, Dawn Song. Welcome, Dawn.

Dawn Song 0:35
Hey, thanks for having me.

Jamie Burke 0:37
You describe Oasis Labs, at least on the website, as the all-in-one tool for controlling and sharing data. With the Oasis network, you're looking to build a better internet; actually, that's the tagline we use at Outlier for our accelerator, so I think we're probably well aligned. And you describe it as one that protects the data rights of individual users whilst enabling fundamentally new use cases and applications, and we're gonna unpack a lot of what that means later. You also have something called the Oasis Foundation, and that is putting open data in the context of open money and open finance, and now of course open data. And this is something that I personally really subscribe to, and again a fundamental part of the thesis that we've been developing at Outlier Ventures for several years. So, some of the reasons why I wanted you on the show. Firstly, you're an award-winning professor and academic. You've won the MacArthur Fellowship and the Guggenheim Fellowship for natural sciences, as well as being a serial entrepreneur, and not many people manage to straddle both of those worlds; you've managed to do it very effectively. At Oasis network, you're focused on an area, as I said, that's been really core to the thesis at Outlier Ventures: what we call a new Open Data economy, as well as this convergence of machine learning and blockchain. And I believe you call what we call the Open Data economy the responsible data economy, and again we're gonna come to exactly why that terminology and why you believe we need a more responsible data economy. But I think if you look at your academic background, specifically in security, this idea that it is possible to build an antidote to surveillance capitalism seems to be a theme that runs through a lot of the work that you're doing.
But also, as I've been doing the Web 3 founder podcast, you specifically have come up in the journey of other founders, I believe as a consequence of the work that you've been doing at Berkeley. So these are the several reasons why I'm really excited to have you on the show. So, by way of a kind of segue into understanding your origin story, which I'll try my best to summarise: as I mentioned, your research interests in academia lie within deep learning and security, in particular privacy issues in computer systems and networks, distributed systems security and applied cryptography, as well as the intersection of machine learning and security. So I believe you first began studying in academia at, I'm going to try to pronounce this right, you might have to correct me, Tsinghua University. Did I get it right?

Dawn Song 3:49
Tsinghua University?

Jamie Burke 3:52
Right. And was that physics? I didn't actually understand exactly what it was, but I heard reference somewhere that you started out in physics.

Dawn Song 3:59
Yeah, my undergraduate degree is in physics.

Jamie Burke 4:02
Okay, perfect. All right, I'm glad I got that bit right after a bit of an awkward start. You then did a Master of Science in computer science at Carnegie Mellon University, graduating in '99. You then went to the University of California, Berkeley, and did a Doctor of Philosophy in computer science, and then went back to Carnegie Mellon as a professor from 2002 to 2007. I guess the kind of more relevant work, specific to blockchain and some of the stuff that you're doing now at Oasis, was as a professor at Berkeley, which is from 2007 to now, around 13 years. And there are several different research centres that you either lead or play a very active role in. Trying my best to summarise some of them: things like the Real-time Intelligent Secure Explainable systems lab (RISE Lab), all the way through to the Berkeley Institute for Data Science, the Berkeley Artificial Intelligence Research lab, and the Foundations of Resilient Cyber-Physical Systems (FORCES). So very varied, but of course related. And I believe, in parallel to a lot of the work that you've been doing at Berkeley, and as I said, where you've been formative to a number of people's careers in blockchain, you've also been a serial entrepreneur, but that was something I couldn't find so much information about. So it'd be really interesting to understand your journey as, I guess, an academic in the space, but also somebody that's been exploring the commercialization of some of these research areas.

Dawn Song 5:55
Yeah, thanks a lot. It's really impressive that you did a lot of research for this podcast, thanks a lot. And yes, on the entrepreneurship side: I have been doing research, as I mentioned, for decades now, and as my research group develops new technologies, the area moves at a rapid pace, and a number of our papers have won awards, Test of Time awards and so on. So it's one thing to push forward the research frontier, and at the same time, in addition, it's often very rewarding to see the technology have a real impact in the real world. So the next step is about taking research technology into the real world: doing startups, commercialising ideas, developing products and so on, to really have the technology take the next step. And in the past, I actually have done this a few times. The first time was a startup called Ensighta Security, where we focused on mobile security, in particular developing automated technologies and tools to analyse mobile apps, for example Android apps and so on, to analyse for security vulnerabilities as well as malicious behaviours. And this information can be very helpful in protecting, for example, mobile phone users and enterprises against mobile security threats. And yeah, the company was successful; it was acquired by FireEye, and the technology was used by both top companies and governments around the world.

Jamie Burke 8:04
Was that a spin-out? Was that something that was developed within, say, Berkeley with students and then spun out as a commercial entity? Or was it just in parallel?

Dawn Song 8:13
It started out in my research lab at Berkeley, where we developed these automated security analysis tools for mobile apps. And as the technology became powerful, sufficiently powerful to really analyse these real-world apps, that's when me and some of my students realised it was a good time to take this to the real world, and to have the technologies and tools benefit the real world as well. And yeah, that's how that happened.

Jamie Burke 8:51
What are the time cycles involved in something like that? Obviously, often these things are referred to as deep tech, and that definition usually means it could be decades of research before the commercialization of a technology. But how does that work for you in a venture like that?

Dawn Song 9:09
It was developed actually over a long period of time. In fact, with these automated analysis tools, I started the research in the space initially on program binaries; that was, I think, when I was at Carnegie Mellon. These program analysis tools oftentimes share similar principles, and various parts you can apply in different domains. We started with program binaries, where we developed some of the first dynamic and static analysis tools for program binaries, with advanced capabilities like symbolic execution and so on. And then we also worked on web applications and then mobile applications. I think the timing was very good, and certain features in mobile applications make the technology more effective in addressing certain types of issues. So yes, I would say the accumulation of the technical expertise and developing the right approaches and solutions takes many years of research and development.

Jamie Burke 10:48
And so you were talking about that was one instance. But there were kind of several more where you’d kind of spun out and commercialised

Dawn Song 10:56
There's also the company that is now called Menlo Security. That also started out as a research technology developed in my research lab. Essentially, it's about secure browsing. As you browse the web, of course there can be malicious JavaScript that comes through, and some of this malicious code can compromise your browser, compromise your machine, and leave the user in an insecure state. So the idea is: can we actually build a browser that doesn't really impact the user experience, so the user can still browse like he or she normally does, but that actually provides strong protection to prevent, for example, this malicious JavaScript or other malicious code from coming through. The way we describe it is like building a glass box, where you contain the malicious code inside this glass box. The user can still see and interact with the webpage as normal, but it's through this glass box: you can see what's inside, but what's inside cannot come out. So that's one analogy. Essentially, that's what we call a secure browser. And the way to think about it is that the glass box is in the cloud, so it's a cloud browser: the vendor renders the webpage in a sandboxed environment, and it passes through the image, allowing the user to view and interact with the webpage through this secure glass box. This way, the user's machine is protected, but at the same time the user's browsing experience stays the same. And yeah, that company is also doing well; it has Fortune 500 companies as its customers around the globe as well. So these are great examples of how technologies developed in a research group can really help benefit the real world.

Jamie Burke 13:31
Yeah, it's fascinating. And so it's a good segue into Oasis Labs, where you're founder and CEO. In a way, it could be said that you, as an innovator, are almost building a portfolio of IP that you can then commercially exploit with various teams. How is your role in these organisations? Is it the same as now with Oasis, where effectively you play a role in the early stage of a company, in terms of helping bring it to market and providing that leadership, but then you kind of step back and there's a bigger team that scales it? As a founder, how would I understand, because obviously there are different types of founders, how would I understand your typical role as a founder in the context of these kind of R&D spin-outs?

Dawn Song 14:35
Yeah, that's a very good question. Oftentimes with these technologies, even though we have been working on the development in the research lab for, oftentimes, a long time, when we first take it into the commercial world it's still early, and oftentimes you still need to figure out what the product is. In academia, in a research setting, we focus on developing the best cutting-edge technology and solving really hard, challenging technical problems. But when you take that technology into the real world, you then actually have to pick out what is the right product to build that the technology can enable, and make it easy to use, make it easy to deploy, and solve customers' pain points. And because of that, oftentimes you need a person who knows the technology well and also sees the real-world problems, who can see what problem the technology can help solve and how to best translate the technology into a product. And because of that, in these examples, in this case I was the CEO, and a big part of that is to try to identify how to best apply the technology in the real world. So I think, especially at the early stage, that's really important, because yeah, you need someone who knows the technology very well

Jamie Burke 16:44
To do that. You know, often when I speak to founders who are from an R&D background, especially an academic one, they are very good at solving technical problems, but they're quite difficult for a non-technical person like myself to keep up with. But I must say, just even hearing you talk now about analogies like containing things in a glass box, and in other interviews that I've seen, you're very good at framing and simplifying complexity. And I think that's clearly why you've been able to be so successful in the commercialization of projects. So now you're at Oasis Labs, founded in March 2018, where you're founder and CEO. You've been backed by a number of major investors including Andreessen Horowitz and Accel, as well as Binance and several others. I believe you have two parts, Oasis Labs and the Oasis Foundation, and mainnet is imminent at the time of talking, so we're in September 2020, and this will allow the network to become token-optimised. Before we get into Oasis Labs, I think it's important to step back a bit and frame the mission. So, as I alluded to earlier, you refer to it as a responsible data economy, something we at Outlier would refer to as an open data economy. But it would be great to hear from you: firstly, why do we need a responsible data economy? I'm assuming it is in response to the current data economy and what's wrong with it; obviously a term that's starting to resonate beyond the niche of technical communities is surveillance capitalism, this idea that the current business model of the data economy is not working, or doesn't work for the average user. But it'd be great to hear from you: why do we need a responsible data economy? Why is that mission important to you, presumably the most important thing to you, because you could be working on several other things right now, were it not for Oasis Labs?

Dawn Song 19:06
So, yeah, that's a very good question, and it's also an area that I have been working in for a very long time. Today, as we all know, data is a key driver of the modern economy. It helps us extract better insights, helps us make better decisions and so on, and especially with the rise of data science, machine learning and so on, it's becoming even more important, and in the future it's going to be pervasive. At the same time, a lot of this data is sensitive, and handling sensitive data has been posing many challenges, as we have experienced today. So first, let's look at the user side. A lot of users are realising that they are actually losing control of their data once their data is collected. Oftentimes they don't even know when data is being collected and what data has been collected, and they don't know, after the data is acquired, how it's being used, and so on. And also, users often mistrust the services that they use because of this as well. At the same time, users are not getting sufficient benefits from their own valuable data. Of course, users are getting certain free services and so on, but in the future, as we know, data is going to be even more important, and ultimately I think a big part of who you are is actually going to be defined by your data, by our digital selves. And hence there is, of course, the question: is it sufficient for users to just get free services, or can users actually get even more direct benefits from their data? So this is one side, the user side. And on the business side, we're also seeing a lot of challenges. First, businesses continue to suffer really large-scale data breaches and so on; even when they want to do a good job, oftentimes they just don't have sufficient technologies and tools to help them with that.

And also, with the rise of the new privacy regulations such as GDPR, CCPA and so on, it's becoming more and more cumbersome and expensive for businesses to comply with these regulations, and there have been estimates that compliance alone is going to cost billions of dollars for businesses. And even more importantly, what we continue to hear is that it's actually really difficult for businesses to get access to data, to actually utilise data. A lot of valuable data is locked up in data silos, due to privacy concerns and so on. COVID-19 is a very good example. Of course, it's really important for us to have a better fight against COVID-19, both in terms of doing better contact tracing, so that we can better identify who is infected and at risk and what's the right measure to take, and also to identify who may be sick and how to better utilise hospital resources and so on. But in general, it is still actually difficult for people, for example for medical researchers, to get access to data to help them develop better tools for COVID and so on. And sometimes it actually goes the other way, too far, and there is no privacy protection; with some of the earlier guidance that was put in place, there was also a discussion about removing protections to try to optimise for speed as well. So again, I think COVID-19 is a great example showing that we really need better, more effective and systematic solutions to address these challenges: essentially, how you can provide better privacy protection and at the same time be able to utilise data.

So to address these challenges that I just mentioned, essentially we need a new framework, because what we have today, as we have seen, is really insufficient. This leads to the new paradigm and framework that I call the responsible data economy: essentially, to build up this new framework that ultimately can help us both establish and ensure users' rights to data, and help them maintain their rights to data, including how they want their data to be used, and at the same time to really enable data to be more effectively utilised while honouring and enforcing these users' rights to data. So that's really the essence of this responsible data economy: to bring what we see as conflicts to, ideally, a win-win situation for users and businesses.

Jamie Burke 25:37
Obviously, when we're thinking about data and privacy, there are different cultural attitudes towards it. So I would argue it's not the same in Europe, in Germany for example, as it is on the west coast of the US, and similarly in China and parts of Asia. So when we're talking about privacy and we're talking about data, how do you see what you're doing with Oasis, that mission of responsible data? Do you believe that there is a universal form of privacy that is common across all of these different cultures? Or are there actually multiple data economies, so rather than there being a data economy, there are multiple data economies? And how does that interact with the different polities, effectively, from a jurisdictional perspective?

Dawn Song 26:39
Oh yeah, that's a great question. And of course, different people naturally have different preferences, and hence I think the point of the responsible data economy, I would say, is to give users control, and users can decide what they are comfortable with and what their preferences are. So, for example, let me actually give you one use case that we have been working on; this is a product that we will be launching soon with a partner. It is a use case in genomic data. As we know, genomic data is of course one of the most sensitive types of data for users; in particular, you cannot change it, you can't change your genes. So privacy is particularly important. You may have heard of 23andMe and other companies like that; these are direct-to-consumer genomics companies that provide genomic analysis to their users, and these companies, 23andMe and Ancestry, earlier this year actually had layoffs, citing in particular lower consumer demand, largely due to privacy concerns. And a number of my friends have actually told me that they would be concerned about using these types of genomic services, for privacy reasons and so on. So the product that we are in the process of launching with a pioneer in the space actually helps users become owners of their genomic data and take control of it. In this case, users' data will be stored in encrypted form, and users have control: they can decide how they want the data to be used and by whom they would allow the data to be used. So, for example, the user can give consent to a genomics company to run genomic analysis on their data to provide results: for example, which diseases you may have a higher likelihood of, and so on.

And in this case, we're actually utilising our technology, combining secure computing and blockchain. The blockchain helps keep an immutable ledger of users' rights to data, in this case their rights to their genomic data, and also the policy for that data, for example the consent for their data to be used by this particular genomic analysis company. And then when the genomics company runs the analysis, it actually doesn't get a raw copy of the user's data; the analysis runs in a secure execution environment. In this case, the user will see the results, but because the computation actually runs in this secure execution environment, the genomics company doesn't get a copy of the raw data. And hence, after the computation is done, the genomics company doesn't have access to the data any more; if it wants to use the data again, it needs to ask for permission again, and so on. So this is how users can retain control of their data and at the same time get benefits from their data. And of course, there are many more use cases like this. Again, one thing you may notice here is that it's up to the user to decide. For example, some users actually put their genomic data openly in the public, but lots of other users have more concerns about privacy, about protecting their genomic data, and so on. So I think what we're doing here is providing the technologies and tools to help users achieve what they want, while providing the capabilities to support a wide spectrum.
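The flow Dawn describes (consent recorded on an immutable ledger, analysis run in a secure environment that releases only results, never raw data) can be sketched in miniature. Everything below is illustrative: the class and function names are hypothetical, not Oasis APIs, a toy hash chain stands in for the blockchain, and `decrypt` is a placeholder for decryption that would only ever happen inside a secure enclave.

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class ConsentLedger:
    """Append-only record of who may run which analysis on whose data."""
    entries: list = field(default_factory=list)

    def grant(self, user, analyst, analysis_id):
        # Each grant is hash-chained to the previous entry, so past
        # consents cannot be silently rewritten.
        prev = self.entries[-1]["hash"] if self.entries else ""
        record = {"user": user, "analyst": analyst, "analysis": analysis_id}
        record["hash"] = hashlib.sha256(
            (prev + str(sorted(record.items()))).encode()).hexdigest()
        self.entries.append(record)

    def allowed(self, user, analyst, analysis_id):
        return any(e["user"] == user and e["analyst"] == analyst
                   and e["analysis"] == analysis_id for e in self.entries)

def decrypt(blob):
    return blob  # placeholder: real data would be encrypted at rest

def run_genomic_analysis(ledger, user, analyst, analysis_id, encrypted_genome):
    """Stand-in for computation inside a secure enclave: the analyst
    receives only the result, never the raw (decrypted) data."""
    if not ledger.allowed(user, analyst, analysis_id):
        raise PermissionError("no consent recorded on ledger")
    genome = decrypt(encrypted_genome)        # happens only inside the enclave
    return {"risk_score": sum(genome) % 100}  # toy analysis result
```

The key property mirrored here is that the only path to the data goes through a consent check, and the only thing that comes back out is the derived result.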

Jamie Burke 31:50
The interesting point about that use case, genomic data, is that it builds quite a strong argument for user centricity, as you say, for user control. Because on the one hand, I've heard the example that if you hand over the rights to your genomic data to a particular organisation, then effectively it's turned your DNA into IP, and that IP could be sold to pharma; you effectively lose control, you could be cloned in the future, who knows. But at the same time, for the organisation that owns that data, they have to be able to secure it, and to fail to secure it is a huge liability now, especially with data laws like GDPR. The cost of not securing it could be billions; it could bankrupt a company. I think in some cases it's almost unlimited. So there is a very good commercial argument as to why you don't necessarily want to directly own the data; you just want to have access to that data to derive value from it, for putting into machine learning, for example.

Dawn Song 33:06
Right, yeah, exactly. This is also part of the value proposition that we provide to our customers, because oftentimes they actually don't have the skill sets and the technology needed to do that well.

Jamie Burke 33:24
Right. So that leads naturally into one of my other questions, which is: if you look at your research background, there are kind of these parallel streams. You have an interest in applied cryptography and blockchain, and of course machine learning, and these two streams have converged into what you're now doing at Oasis Labs and with the Oasis network. Was there an aha moment in that process, or did it just naturally happen? And could you explain, I guess, the benefits of blockchain to machine learning?

Dawn Song 34:10
Yeah, that's a good question. So the combination here, as I mentioned, is something that Oasis has in our platform: we are combining blockchain and secure computing, where the blockchain provides an immutable ledger for users' rights to data and their policies on how the data should be utilised, and so on. And in machine learning, of course, as we know, in particular with methods like deep learning and so on, it's hugely data-hungry: you need huge amounts of data to train these models. And of course, oftentimes a lot of data can be very sensitive. So essentially, with the Oasis platform, again as I mentioned with this responsible data economy, it helps users maintain control of their data and their rights to it, and at the same time the data can be utilised in a privacy-preserving way, for example to help train machine learning models. And also, because with the blockchain here we actually track every access to users' data, how the data is being used, by whom and by what programme and so on, it also naturally establishes an audit trail, even for value distribution. So, for example,

Dawn Song 35:53
If a machine learning model is trained using users' data, and the machine learning model as a service becomes profitable, then some of that profit could be distributed back to the data contributors. And in my research at Berkeley, with my students and collaborators, we have also studied the question of how to value data: for example, if you're training the model using a set of data points from different data sources, different data contributors, what's the best way, the fair way, to distribute the value created by the machine learning model back to these data contributors? We actually identified this notion called the Shapley value, which provides a unique solution that essentially satisfies a set of desired properties. So this is another example of how this responsible data economy, this new framework, can actually help users not just maintain control of the data that they have invested in, but also get benefits directly from their data.
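To make the Shapley value concrete: each contributor's payout is their average marginal contribution to model utility over all possible coalitions of the other contributors. The brute-force computation below is just illustrative (it is exponential in the number of contributors; the research Dawn refers to uses efficient approximations), and the "accuracy" function is a toy utility, not a real trained model.

```python
from itertools import combinations
from math import factorial

def shapley_values(contributors, utility):
    """Exact Shapley value for each contributor: the weighted average of
    their marginal contribution utility(S + {i}) - utility(S) over all
    subsets S of the other contributors."""
    n = len(contributors)
    values = {}
    for i in contributors:
        others = [c for c in contributors if c != i]
        total = 0.0
        for k in range(n):
            for subset in combinations(others, k):
                # Standard Shapley weight: |S|! (n - |S| - 1)! / n!
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (utility(set(subset) | {i}) - utility(set(subset)))
        values[i] = total
    return values

# Toy utility: model "accuracy" grows with the number of pooled data points.
data = {"alice": 30, "bob": 20, "carol": 10}

def accuracy(coalition):
    return min(1.0, sum(data[c] for c in coalition) / 60)

payouts = shapley_values(list(data), accuracy)
```

Because the toy utility here is additive, the payouts come out proportional to contribution (alice 1/2, bob 1/3, carol 1/6); with a non-additive utility, such as real model accuracy, the Shapley split captures interaction effects between datasets as well, which is exactly why it satisfies the fairness properties Dawn mentions.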

And we are also exploring using this technology to create what we call data trusts, where different users can put their data into the data trust. And again, for the data consumers, they can then utilise the data to train machine learning models, do other analysis and so on. This helps with, for example, providing social goods for medical research, or, if it generates revenue or profits, those can be issued back into the trust and distributed back to the different contributors, and so on. And the hope is that in the future this actually can be a very good way to help solve this bigger set of problems as well: to make it easier for machine learning researchers to gain access to data, and at the same time to provide privacy protection for users' data.
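The data-trust idea (pooled data, approved consumers who see only aggregates, revenue flowing back to contributors) might look roughly like this in miniature. All names here are hypothetical, and the pro-rata revenue split is just one possible policy; a real trust would enforce the access rules with secure computation and an on-chain ledger rather than in-process checks.

```python
class DataTrust:
    """Minimal data-trust sketch: members pool records against a stated
    purpose, approved consumers may run only aggregate queries, and any
    revenue is split pro rata by contribution."""

    def __init__(self, purpose):
        self.purpose = purpose
        self.records = {}      # member -> list of data points
        self.approved = set()  # consumers allowed for this purpose

    def contribute(self, member, points):
        self.records.setdefault(member, []).extend(points)

    def approve(self, consumer):
        self.approved.add(consumer)

    def aggregate_mean(self, consumer):
        if consumer not in self.approved:
            raise PermissionError("consumer not approved for this purpose")
        flat = [x for pts in self.records.values() for x in pts]
        return sum(flat) / len(flat)  # only the aggregate leaves the trust

    def distribute(self, revenue):
        # Pro-rata split by number of contributed points; a Shapley-style
        # split, as in the discussion above, is a fairer alternative.
        total = sum(len(p) for p in self.records.values())
        return {m: revenue * len(p) / total for m, p in self.records.items()}
```

For example, a trust for medical research could admit a research lab as an approved consumer, expose only cohort-level statistics, and route any downstream licensing revenue back through `distribute`.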

Jamie Burke 38:37
I think new models in data ownership, data trusts, data commons, where you can aggregate data against predefined purposes, are a really interesting space. I'm going to be interested to see how that converges with DAOs, decentralised autonomous organisations, around the ownership of data. An interesting area, which Trent McConaghy of Ocean talks about a lot (he's done a recent post again in September), is the idea of leveraging the billions of dollars that have gone into creating wallets, assets and exchanges for crypto, this kind of high-value asset, and how that can now be repurposed for the data economy, which you could argue is a low-value asset; we can now leverage the same infrastructure if we can turn data into a digital asset. So, I personally totally subscribe to all the things that you've been talking about. It kind of feels like there are still some missing building blocks for us to achieve this in Web 3, in particular identity, and I know that this has been an area you've been speaking about a lot recently. If we're going to give control and permission to data in an increasingly decentralised way, we somehow need to solve for decentralised identity, that self-sovereign identity, through innovations such as DIDs, decentralised identifiers. How do you see the importance of solving identity? Do you think that it limits what's possible today, or do you think that there's a hybrid, more centralised approach which can allow us to still unlock this value from the data economy?

Dawn Song 40:32
And yeah, that’s a good question. So I think decentralise I didn’t see it’s just one step in this responsible economy. And where you want

users to own their data or to retain

control to their data and their rights to data, and they this entire decentralised identity helped them and it’s a it’s one step in this in this process. Essentially. If you think about this as used to maintain, you know, control of the data, the way that anything could draft their data is that essentially you can utilise decentralised identity to help

meeting, you know,

to identify which they had the user has control over, and so on. So the the mechanics Actually, it’s a natural components in that, but the responsibility economy is much broader than just identity, as you know, different aspects of users data and how that data can be utilised.

Jamie Burke 41:40
Right. And I guess to zoom out a bit, you know, if we look at what we collectively are trying to achieve in the responsible data economy, we're kind of putting in place these various building blocks, protocols, to create a stack to allow this fairer, more responsible, or at least user-controlled, data economy. But ultimately, in my mind, that is really to serve a fairer or more responsible AI economy, because ultimately the purpose of data these days is to feed into or improve forms of AI, or at least machine learning. So if we look at that paradigm we're building, alongside other instances, things like GPT-3 that's being rolled out by OpenAI, which is a collaboration between a number of organisations, including Musk and Peter Thiel. That is permissioned, at least for now: whilst it's open sourced, it is heavily permissioned, and it is not leveraging blockchain; it's not token-optimised in any way. Do you see this as a parallel, competing paradigm? Or are they different things, working towards different ends?

Dawn Song 43:14
So I think one thing that's really important is that we are still just at the very beginning. Right now, AI power is concentrated in, like, the bigger companies and so on. And ultimately, as AI becomes even more and more advanced, so many aspects of users' lives will be impacted, or even determined, by these AI agents. And ideally, I think as a next step, we talk about self-sovereignty, you know, self-sovereignty of finance, of data, but ultimately you want to have self-sovereignty of AI. You would like to have AI assistant agents that actually work on your behalf, that are working in your best interests, instead of just, you know, providing data to some other third-party agents that may have others' best interests at heart. So we are still really early on for that. And I think about two years ago, I actually gave a talk on this, about how eventually we would like to build this paradigm that we call AI Eden, where ultimately users should have these AI agents that are under users' control and work in the best interest

of users.

So ultimately, that's where we want to be. And it's great to see all these researchers working to actually build these powerful AI agents and so on. And yeah, today, with something like GPT-3, the model is really big, so it's also difficult for individuals to, like, train their own, and so on. But hopefully that will change in the future as the technology continues to improve. And I do hope that one day this vision becomes true.

Jamie Burke 46:11
Yes, it's interesting. So whilst, in parallel, people are building out these new models, which will hopefully become increasingly more open, or at least less concentrated in the hands of a few platforms, in parallel we're also fixing the web. We're building a web that has increased levels of user centricity, so by the time OpenAI matures, it can be birthed into a web which serves the individual versus, say, the shareholder or the state. Dawn, it's been really fascinating talking to you. I know we tried really hard to make this happen, for a number of reasons, notwithstanding Zoom problems, because at the moment I'm speaking to you while you're in China, and that always brings technical difficulties. So thanks for making this happen and bearing with those technical difficulties. And good luck with mainnet for the Oasis Network; I believe that this is imminent, right? You're aiming for October this year, 2020? Yes? Perfect. I like the way you gave yourself some flex there, a good four-week flex. Dawn, it's been fascinating talking to you. Thanks for your time. Looking forward to speaking to you again.

Dawn Song 47:33
Great, yeah, thank you so much. It's been a lot of fun talking together.

Jamie Burke 47:41
You're an early stage Web 3 founder? Apply to our award-winning accelerator programme Basecamp at outlierventures.io/basecamp. We write your first $50,000 cheque and give you access to 200 mentors, including many of the leading Web 3 founders, and a network of thousands of the world's leading investors, in exchange. We've helped over 30 startups from 15 countries all around the world raise $130 million in growth funding. We can help you fast track product-market fit and, where relevant, the launch of your token economy.