Tim Daubenschütz: Rugpull Index - Exposing Rugpull risks on the Ethereum Blockchain
Episode 3 • 29th March 2022 • Ocean Missions Campfire • Scott Milat
Duration: 00:59:16


Shownotes

We speak with Tim Daubenschütz, the founder of rugpullindex.com, and learn how monitoring the Ethereum blockchain can help expose the risk of getting rugpulled before you stake your tokens in liquidity pools.

Rugpull Index currently monitors Ocean Protocol's datatokens, but we explore how the service could be extended to cover all Balancer liquidity pools.

Website - https://rugpullindex.com/

Discord - https://discord.gg/hBQVJY9Me6

RPI Blog - https://rugpullindex.com/blog

Tim's Blog - https://timdaub.github.io/

Transcripts

Scott Milat:

But this is basically just a conversation to learn more about the project. We're going to assume that people listening have either heard about it or haven't heard about it at all, so it should cover that broad area. And yeah, basically I was just wondering if you could start off by telling us a bit about yourself and how you became involved with Ocean Protocol.

Tim Daubenschütz:

And so I was looking, in Berlin specifically, for an employer where I could learn more about blockchain. I had previously been on the Bitcointalk forum doing a bunch of freelance work and always getting scammed, basically my work didn't get paid. But ultimately I ended up working for a company called ascribe.

Ascribe later pivoted.

So I was really, let's say, an early employee of Ocean Protocol. I started out as a front-end developer at ascribe, and we did very similar things to what people are doing today with NFTs, registering digital art. And I ended up being the product manager of BigchainDB, a blockchain database.

And when we came to the pivot from BigchainDB to Ocean, I helped prepare the token launch. I was also, I think, one of the people within the company arguing for launching a token, because at that point it was fairly controversial to do that.

Then I left to do my own things.

Although at the time, instead of automated market makers, I think we were largely still calling them bonding curves, and there's a tiny detail in how these actually work differently. But yeah, when I saw that they had deployed this marketplace for data, I got really interested and I started rugpullindex.

So when Ocean Protocol launched a data marketplace, the version three, so to speak, there were rugpulls happening. A rugpull, in a way, is a financial fraud where, let's say, you set up a project and you advertise it with a roadmap, and you say, yeah, we're going to do all these awesome things in the future and so on.

And maybe you have some sales website where you sell a token or an NFT or any kind of financial asset. And once you have finalized the sale, and maybe made a bunch of money through it, then you quote-unquote pull the rug. What that means is that either you just don't do any of the promised things on the roadmap,

so you just quote-unquote exit scam, or another way of pulling a rug would be that maybe you kept some of those tokens that you sold, maybe you kept a reserve, and now you're selling that reserve on the market

in a very unfairly priced way. You are just dumping it on the market. In both cases you are metaphorically pulling the rug out from under someone. And yeah, those rugpulls happened for Ocean Protocol, especially in the beginning.

I wasn't even quite there to witness them at first, but I saw an issue on GitHub where they were simply outlining the problem. And I thought, oh, that's actually quite an interesting problem to solve, because it's not only applicable to Ocean Protocol; as I said, the exit scam is very similar.

It's really just one class of financial fraud that can happen in myriad ways. And since I'm a freelancer and can allocate most of my free time as I want, I just went heads-down and coded a thing. I think my first prototype was simply an Excel spreadsheet.

And the idea I had for Ocean Protocol, and this is still the intuition that rugpullindex is built on, was that a rugpull can only happen if there is not a diversified interest of investors. What I mean by that is that maybe there is only one really big investor with one certain interest, rather than many investors with a diverse set of interests. For example, in the US stock market, on the S&P 500, there are so many different people trading that asset with so many different motivations that one single event will not crash the whole market, because everyone else thinks differently about the value of that asset.

If everyone thought alike, let's say everyone said the S&P 500 is a valuable asset because the sun shines in California, and suddenly the sun stopped shining in California, then probably everyone would sell and the price would go to zero. But since that is not the case, there is some sort of robustness in the pricing.

And that is actually a good thing. Mathematically, you can capture something like this in a Gini index. The Gini index is a mathematical formula that calculates statistical inequality, classically income inequality over a population. And so really the idea of rugpullindex was: if we have many different investors in the pool of a dataset, all with different interests,

and that is just an assumption, of course. But if we have many, and we can measure the inequality in a pool, so whether only one person controls it or, say, ten people control it, then we can also make a statement about the safety and the health of the market. Because the other option, back when rugpullindex didn't exist, was to just do a manual due diligence check.
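The Gini-index intuition Tim describes can be sketched in a few lines. This is the generic textbook Gini coefficient applied to pool share balances, not rugpullindex's actual code:

```python
def gini(balances):
    """Gini coefficient of liquidity-pool share balances.

    0.0 means perfectly equal holdings; values near 1.0 mean a
    single holder dominates the pool.
    """
    xs = sorted(balances)
    n, total = len(xs), sum(xs)
    if n == 0 or total == 0:
        return 0.0
    # Standard closed form: G = 2 * sum(i * x_i) / (n * total) - (n + 1) / n
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return 2 * weighted / (n * total) - (n + 1) / n

print(gini([10] * 10))          # ten equal holders -> 0.0
print(gini([1, 1, 1, 1, 100]))  # one whale -> roughly 0.76
```

The higher the coefficient, the closer the pool is to the "one big investor" scenario that makes a rugpull possible.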

So what people did, and I think Datawhale was one of them, among others: for the assets on Ocean Protocol, they sometimes bought them, so they put up their own money. They bought these assets, then they looked at the data and said, ah, this is a really valuable piece of data,

so we should price it higher. Or: this is trash, so we should price it lower. But in an open market where there's constant change going on, and where prices can already be quite high, in the Ocean marketplace today, for example, there is one dataset that costs 40,000 euros,

I think this way of manually doing due diligence becomes cumbersome, not very efficient, and very tough to automate. And so I thought: we are actually already rating financial assets based on their market health, and the S&P 500 is in fact a good example of this. The S&P 500 contains the 500 most prosperous companies in the US, and it picks the ranking,

so which one is first, second, third and so on, by the market capitalization of each individual company. I'm not sure, but I think the top company is Apple, simply because they have an over-a-trillion-dollar market cap. And then the second is, I don't know, Alphabet, which is Google.

The third is Facebook, and so on and so forth until you get to number 500, which is some company that I definitely don't know, but which has a much lower market cap than Apple or Google. And so rugpullindex is basically that: if you open the website today and you look at the table presented to you under the graph,

then you can see that we are giving a score for each dataset that we call the rating. This rating is formed by the inequality factor, that is, the Gini index I just mentioned. And a secondary factor that we are using is the total amount of liquidity in the, what do you call it, the pool, or automated market maker.

And with these two factors, essentially, we are giving a rating. As you can see today, and also for the last few days or even months, I think, the top product was Innovation Atelier's Products on Amazon.com dataset, and it is getting the best rating of the day, which is 7.1%.

Even though they have a very high rate of inequality, they also have, relatively speaking to all the other assets on the Ocean market and the Big Data Protocol market, a lot of liquidity. So that's, I think, why they are on top.
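The exact formula behind the rating isn't spelled out in the conversation, but the two-factor idea, equality of holdings plus depth of liquidity, can be sketched like this. The equal weighting and the normalization are illustrative assumptions, not rugpullindex's published method:

```python
def rating(gini, liquidity, max_liquidity):
    """Toy two-factor score: reward equal holdings (low Gini) and
    deep liquidity relative to the deepest pool being compared."""
    equality = 1.0 - gini                # 1.0 = perfectly diversified
    depth = liquidity / max_liquidity    # 1.0 = deepest pool
    return 0.5 * equality + 0.5 * depth

pools = {
    "dataset-a": {"gini": 0.90, "liquidity": 500_000},  # unequal but deep
    "dataset-b": {"gini": 0.20, "liquidity": 20_000},   # equal but shallow
}
deepest = max(p["liquidity"] for p in pools.values())
for name, p in pools.items():
    print(name, rating(p["gini"], p["liquidity"], deepest))
```

Note how the deep pool can outrank the more equal one, mirroring the Amazon.com dataset example, where a lot of liquidity outweighed high inequality.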

Scott Milat:

So inside of Ocean Protocol, all of the datatokens are essentially ERC-20 based. So what are your plans for the future of Rugpull Index? Is this something that's going to start playing more of a role, this interoperability with ERC-20? What are some of the things you have in mind for the future?

Tim Daubenschütz:

But I think it makes total sense. The way we have started to rate ERC-20 tokens based on market health has of course been done by other platforms; classic examples are CoinMarketCap and also CoinGecko, but they are just looking at the total market capitalization of the token. And now I'm getting back to your question.

It would totally make sense to look at other ERC-20 tokens, as you correctly pointed out, because I think it would just be interesting. Let's say I would love to know the top asset based on market health on the Ethereum network, across all of the pools that exist: Uniswap, SushiSwap, Balancer, and so on.

And I think in the beginning we were quite naive in doing this project, because we thought, yeah, okay, this is a way for rugpullindex to solve rugpulls on Ocean Protocol, right? Because the idea is that a user comes to our website because they want to, let's say, speculate in data assets.

And so our website gives them an overview of the data assets that are really trustworthy. In fact, if you look at it, we only list 40 datasets, and the rating rapidly drops off. So there are really only a few datasets that we think are really good, compared to many that are not that great.

The idea was that we are allowing investors, but also data scientists who want to consume data, to make an informed buying or investing decision about what assets to invest in. And that, of course, applies to any kind of ERC-20 token. We want to expand our crawler, so that maybe one day in the future we are able to really track, first of all, I think, all the Balancer pools, and then maybe we can extend to Uniswap and SushiSwap pools and so on.

That dimension, that a user visits our website, seems to have been the initial idea for the project, and it's also how we still frame ourselves, as a knowledge base in a way. But since this launched, like 13 months ago or so, people have come to us with different motivations than just looking at our website.

So for example, I had people reaching out from a project similar to Numerai Signals, and they wanted my website to tell them when to buy and when to sell Ocean Protocol, based on the information I have from the data assets.

Their idea was that if the data market in total crashes, this would be an indication that OCEAN should also be sold, or bought, depending on what you want to bet on. So that was one very interesting use case that we hadn't anticipated. And then recently, through a conversation with the CEO of Paraswap, I think another interesting dimension was discovered, which is that for everyone doing routing on automated market makers, so 1inch, Paraswap and others, I think even MetaMask might do this,

for all of them it might also be interesting to get a heuristic on which of the pools that exist, let's say, on the Ethereum network are trustworthy, or have this quote-unquote high diversity of interest from the liquidity providers. Because quite clearly you want to route a payment through a pool that is safe, where the liquidity is not rugpulled or manipulated in any other way. You always want to suggest to the user pools that are non-malicious and do what they promise to do, so to speak.

Now, I'm not sure how much the inequality and liquidity factors really work as a qualitative metric for those products, how relevant they are; they might not be relevant enough. But the principle we are using, that we are looking at pools and rating them based on some factors,

I think that is, at least for very numerically minded people, quite interesting. And I guess it's interesting that there are such different use cases for such a simple website.

Scott Milat:

Putting some data into an Excel spreadsheet, or whatever it was. And then over time, obviously, you've hardened that product and also updated the front end. But just the fact that you're going in and looking at on-chain data and highlighting specific pieces of information, and that it's starting to open up a whole world of new use cases, is definitely something that's very interesting.

Is there, did you want to mention anything about the rugpulled network?

Tim Daubenschütz:

And I think it became clear in the beginning that building such a website comes with its own challenges. One of them, for example, is that the .com domain, the server, all of this, the bills are paid by me. And then also the backend: I decided to leave it closed source.

So it's not actually available on GitHub, and this came with its own challenges when, in I think July or August, we started onboarding developers through this kind of blockchain-based process of working together.

So we basically, we have a markdown file on GitHub that basically outlines the process of how you can work together with us. And it's very simple. It's very similar to let's say, GitCoin where you can create bounties in the same way you can come to our project.

You can read through the handbook document on GitHub, you make a proposal on what you want to work on, and you send it to me. I have been receiving money from the OceanDAO for many months now, and if I accept the proposal, then you are good to go to start developing some feature or something.

And in the end you get paid in cryptocurrency. So it's this regular bounty-based approach to recruiting or paying people. I had used this process already in a job I had done before, at a project called LeapDAO, where we did almost the exact same thing, except that

the money to be dispersed was held in a Gnosis Safe, a multisignature wallet. And then everyone had the opportunity to just create GitHub issues. There was a process that would allow you to veto any kind of bounty, but if it wasn't vetoed, then you could start working on it.

That had been a while back. I was really working for a DAO; it was very weird. But what I always liked about this process was that there was a very low threshold to contributing. And also, through that, the kind of people that we onboarded were super interesting, because they were just really entrepreneurial.

And they were really living and breathing crypto. And because we didn't have any interviews or any kind of gatekeeping on who could work with us, I think the level of diversity of the contributors that we had was extremely high. So I got to work with people that I would otherwise never have worked with.

And so I wanted to replicate, or let's say almost replicate, this process, except that I'm still gatekeeping on rugpullindex: accepting the proposals myself, managing the money, and so on. But one of the problems with this approach that I had to learn was that if you have closed-source software, then it's really not possible to have an external contributor contribute to the project in this blockchain-bounty fashion. Because, you know, in the traditional environment, what we would do is I would start talking with my lawyer and we would set up a contract.

And then there would be this NDA-style, so a non-disclosure-agreement-style, contract where I would say: I'm sharing the closed-source software with you, and that means on one side you will be able to work on it, but you should never distribute it publicly or use it for your own benefit, like host it for your own benefit or whatever. But in, let's say, the

strict interpretation of working with people online through cryptocurrencies, this kind of working with the actual law of a country is not possible, because essentially I don't necessarily know my contributors. I mean, I know their names on Discord, but I don't know where they live.

I don't know where they are from, and I'm not super interested in that either. So we are paying ourselves through invoicing, through Request actually, and that works quite well. But then, of course, I cannot rely on the legal system. And I mean, of course we could make all of this happen; I'm not saying it's impossible.

I just want to stick with this principle of working with people in a crypto-native way. Let's say it's a principle, a way of living and working that I want to follow. And in this particular paradigm I cannot enforce NDAs, which means that if I gave access to my closed-source library to an anonymous contributor, then they could either decide to work with me,

and that would be really great, or they could just decide to quote-unquote rugpull me and run away with my software, host it somewhere else, do the exact same thing as me, become a competitor, and basically ruin my business. So then there's really only two ways of working with external contributors in this blockchain or cryptocurrency way.

Actually, there's probably only one way, and that is through open source. Now, this is a lot of context for the Rugpulled network, but that was basically the problem that led to the idea of a community crawler being born. Because I'm speculating that this problem, of not being able to enforce NDAs with anonymous contributors, is a problem of the future that we'll face more and more often.

Especially in DAOs, I think it will become a really prevalent problem. And so I thought, okay, then we have to somehow address the problem on the technical side, and these are actually my skills, so I can put them out there for solving that. So I said: why don't we decentralize the bread and butter of the website, which is the crawler itself that picks up on-chain data and delivers it to a database?

And then, you know, when you visit my website, rugpullindex.com, you see these charts and the table. So why don't we decentralize the process of acquiring that data? And how do we do it? We do it by giving people incentives to deliver me a certain kind of data. So maybe, in this oracle network that we are now building, I can say something like: give me, every hour, the price of QUICRA-0, which is an Ocean Protocol datatoken.

Give me that price in euros every hour. And if someone sets up their computer and runs the community crawler, software that checks periodically, once an hour, and sends the information to me, then what I would do in exchange, to pay the person for what they've done, is send them some money back.
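A hypothetical shape for such a bounty and crawler loop might look like the sketch below. Everything here, the field names, `fetch_spot_price`, and `submit`, is made up for illustration; it only mirrors the request/deliver/reward cycle described in the conversation:

```python
import time

def run_crawler(bounty, fetch_spot_price, submit, rounds=3):
    """Serve one recurring price bounty: poll the price each interval
    and submit it; the requester pays per accepted submission."""
    for _ in range(rounds):
        price = fetch_spot_price(bounty["pool"], bounty["quote"])
        submit(bounty["id"], price)
        time.sleep(bounty["interval_s"])

# A requester might post "give me that price in euros every hour"
# encoded as a small job description like this:
bounty = {
    "id": "quicra0-eur-hourly",
    "pool": "0x...",        # pool address elided
    "quote": "EUR",
    "interval_s": 3600,     # once an hour
    "reward": "0.5 OCEAN",  # hypothetical payment per submission
}
```

Any number of independent crawlers could serve the same bounty, which is exactly the redundancy Tim wants the network to provide.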

So I would either send them, I don't know, USDC, or OCEAN, which also seems a very reasonable choice, or any other cryptocurrency. And through that, suddenly we are hollowing out the value of my website and shifting its value onto this network, where different people can run it.

And the guarantees on the reliability and robustness of receiving this data periodically are enforced through the network, and ideally also incentivized through the network. So at one point, maybe, either I just run the website, or, in an even better scenario, everyone can just click together their own.

You know, maybe there's a client for the Rugpulled network that works in a similar fashion to, let's say, the Bloomberg terminal. They can load up their wallet with 100 OCEAN or whatever; say they want to track the prices of five different datasets, then they just click it together.

They launch these bounties. They say: for the next 24 hours, I want to see the prices at this and that resolution. And if they are lucky and there's a miner that can actually deliver this data, then it will show up on the dashboard. In that case the benefits for me are that I don't have to run a website anymore.

I don't have to pay bills in fiat currency anymore, I don't have to have a closed-source backend, and I can just talk to people on a high level, not as, let's say, an employer or in a contractor relationship. And I think that through a network, things would also start to become much more robust, because networks have a tendency to overcompensate through redundancy.

So maybe instead of just one person submitting these data feeds, there would be 100. And that's of course much better than just me reporting on this data. You know, I can make mistakes, I need to sleep, all of these things.

Scott Milat:

And from there you're surfacing some information from the blockchain and providing it in graphs or whatever. And really the idea is adding the network layer to the backend, so that you're not just serving these two pieces of information, you're actually creating the infrastructure where

anyone can come along, like all of these contributors who are already coming to you, and basically plug into the network, provide you with a data feed, and get some funding in return. So that concept is super deep, super rich, and very fascinating. But it does lead me to one question, which you're welcome to tackle at whatever level you feel comfortable with. I can imagine that there are a number of interesting technical considerations or challenges that would need to be overcome in order to enable something like this.

Can you maybe just address or highlight the one that's top of mind for you right now, in terms of the first piece of the puzzle that really needs to be moved in order to pursue this direction further?

Tim Daubenschütz:

The Rugpulled network is, I think, conceptually very similar to peer-to-peer oracles that put prices on chain. So Chainlink and Band Protocol, with their price feeds for DeFi applications, are already doing something that I think is very similar to what we might also want to do at first.

But on a very high level, what we want, and what I think would be a really beneficial thing if we were able to achieve it, is to guarantee the quality of the data that a buyer gets transferred. That would be really awesome. You have to think of it

in the blockchain way: you can design a blockchain or a peer-to-peer network in any kind of way you want. It's not mandated or written in stone, for example, that a transaction is finalized within, like, 13 seconds, as in the Ethereum network; or that it's atomic, meaning that once it's written, it's written, and it does not leave any side effects; or that there's an agreement that if you pay a high enough gas fee, you have some sort of economic or statistical guarantee that your transaction gets into the block and on top of the transaction pool. All of these things, when Bitcoin and Ethereum, but mainly when Bitcoin, launched, were new.

And Bitcoin really just nailed these qualities. They could have been built in any other way. For example, compare those things to PayPal. PayPal is just a money-transmitting website, and for example it doesn't have the concept of bidding on how fast your transaction gets included or executed.

Because I think PayPal just says all transactions are equal, or whatever. In the same way, I think when we build new software, and especially when we build this kind of very innovative software where we don't really know what we want, it's always important to have a thesis of what people might want or what can create value.

And in the case of a data economy, I definitely agree with, let's say, everyone at Ocean Protocol that a good way of pricing a dataset is something really valuable. And using automated market makers to determine the price of an ERC-20 token that actually represents access to a dataset is a really cool innovation.

But on the other hand, for example, Ocean Protocol has, I think, no guarantee at all about the delivery of the data. Unless I'm missing something, the way data is published on Ocean Protocol is that you give an HTTP link to your server.

And then once someone buys the datatoken and sends it to you in a dataset sale, so to speak, the HTTP endpoint is exposed and you can download from the server. You could argue that the price reflects the quality and the uptime of the server as well; you could definitely argue that. But the fact of the matter is that the Ocean protocol itself, on a technical level, does not provide any guarantees on whether the server is online, whether the server sends you the right kind of data, or whether the server is sending you data that's actually not what you think it is.

Or it's just zeros, you know, just zeros on a binary level, or whatever, I don't know. It could literally be that someone just multiplied all the numeric data fields by pi, so it still looks valid but it's not. And that's the mindset with which we are approaching the

Rugpulled network, because we think it's most important that you get an extremely strong guarantee that what you buy is what you get. So if you are buying the price of USDC to Ether on the Uniswap pool in one hour, then we want there to be a 99.9% chance that this is the data that gets delivered to you.

And if not, then we want someone to feel the consequences: we slash the person that provided you with this false information, or, you know, maybe we slash some stake they have somehow, or we ban them from further selling the price feed, and so on.

So the biggest technical challenge on the Rugpulled network, off the top of my head, is: how can we make sure that the data feed you are buying is the most accurate, best possible data feed? On a technical level, I think this has been practically solved.

I think the solutions are not super elegant, but what I've so far seen is that for example, on Chainlink kind of, there is like this bidding process. So for a given data feed, someone just says, ah, to whoever delivers me the valid price of USDC to Ether in one hour, I'm giving that person, you know, I dunno 0.1 ether.

And then what is happening is that like hundreds of people maybe are submitting certain prices,

and then what Chainlink does is create an aggregate. For example, it checks the median value, or the average value, and it picks the submission that is closest to the median. So that would be one approach: we pick something that is in the middle of all submissions.
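That median-of-submissions settlement can be sketched directly. This is a simplification of what Chainlink-style networks do, not their actual protocol:

```python
import statistics

def settle(submissions):
    """Aggregate (reporter, price) submissions: take the median of all
    prices and reward the reporter whose submission is closest to it."""
    median = statistics.median(price for _, price in submissions)
    winner = min(submissions, key=lambda s: abs(s[1] - median))
    return median, winner

subs = [("alice", 1.001), ("bob", 0.998), ("carol", 1.000),
        ("dave", 1.450), ("erin", 0.999)]
median, (who, _) = settle(subs)
print(median, who)  # dave's 1.450 outlier barely moves the median
```

A single wild outlier loses here, though if a majority of reporters collude, the median moves with them, which is exactly the weakness discussed next in the conversation's own terms.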

But of course that would be subject to collusion. Let's say you have 10 people submitting the data source and they all talk to each other and collude to all commit that the price is one. Then the average would also be one, everyone would get rewarded, but the price would still be wrong.

That, I think, was a problem Chainlink had faced. And I think there is a way to circumvent this by fragmenting submissions into two groups and having one group check the other. So there is almost a prisoner's dilemma: if one group colludes and says something wrong, then the other group will put them in jail, so to speak, and vice versa.

I think that was proposed, I don't know, at a very high level, in what Chainlink has done in their version two. Another approach that's very interesting is what Numerai Signals is doing, and in general it's a very Numerai idea: each person providing a data feed would, along with the actual data itself, always stake some amount of money on the data.

So let's say I submit a data point and stake $1,000 on it with confidence, and someone else submits theirs with only $1.

Then, for roughly 99.9%, you would use the value of the person that submitted with $1,000; not exactly that percentage, but you would only weight the $1 submission at roughly 0.1%. So that is another option. And Numerai is doing it in a very interesting way, because you can basically only stake with Numeraire, which is their token.
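The stake-weighted idea reduces to a weighted average. This is a rough illustration of the spirit of the approach, not Numerai's actual payout or scoring math:

```python
def stake_weighted_price(submissions):
    """Aggregate (value, stake) submissions so that values backed by
    larger stakes dominate the result."""
    total_stake = sum(stake for _, stake in submissions)
    return sum(value * stake for value, stake in submissions) / total_stake

# $1,000 staked on a price of 1.00 vs. $1 staked on a bogus 5.00:
# the $1 report carries only about 0.1% of the weight.
print(stake_weighted_price([(1.00, 1000), (5.00, 1)]))
```

The result lands just above 1.00, so a low-stake bad report nudges the aggregate only slightly.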

And so if everyone colluded and staked Numeraire with confidence, they would screw themselves over, because Numerai would be worthless; it wouldn't be functioning as a system. And because everyone has staked with Numeraire, at one point the market would realize: if Numerai doesn't work, then Numeraire is not worth anything, so we can also sell it.

So the people that are staking and gaining from cheating the system would ultimately, I think, also get punished, just by staking and holding the token. That's, I think, approach number two. And there are different combinations of these approaches.

I think, let's say, a novelty that we are anticipating with the Rugpull Index network is that we want to do an extremely good job at identifying a data source. So let's say "who's going to be the president of the United States" is, I think, a very difficult thing to define, to specify, or to identify as a data point. Because who would you ask, you know? In the end, I think you would somehow have to witness who's going to be the president.

I mean, I'm not American, so I really don't know from a legal perspective how this would work, but I would guess that you would have to be at the inauguration, you would have to see it with your own eyes to understand who's the next president. Or, let's say, you would have to somehow look at all the different votes and conclude it from there.

So that's a very difficult data point to identify, I would say. But on the other hand, the spot price of a Balancer V1 pool that features a given data token on one side and the Ocean Protocol ERC-20 token on the other side, at, you know, the 5th of January at 5:00 PM Central European Time, that is a data point that we can concretely identify.

It's a number of strings being put together: we know it's on Ethereum, we know the chain ID, we know the contract, and we know how the JSON-RPC works. And additionally, we can even confirm this kind of data point by recomputing the Merkle proof of the block in which we were looking for this data.

So we can not only point to this exact data point, we can also validate it. And these are the pieces that, in the beginning, we will need to establish a network with very strong guarantees for data. So I think the Rugpull Index network will not be usable for

any kind of data trading or any kind of data feed, but really only for data feeds where the actual datums, the individual pieces of data, can be uniquely identified within some kind of computer network. So yeah, the Ethereum chain is good, I think any kind of API is great, anything digital, I think, is good.
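The "number of strings being put together" can be made concrete with a small sketch: hash the canonical fields that pin down an on-chain datum to get a reproducible identifier. The field names and the hashing scheme here are illustrative assumptions, not Rugpull Index's actual format.

```python
import hashlib
import json

def datum_id(chain_id, contract, block_number, method):
    """Derive a reproducible identifier for one on-chain data point."""
    fields = {
        "chainId": chain_id,            # 1 = Ethereum mainnet
        "contract": contract.lower(),   # e.g. the Balancer V1 pool address
        "block": block_number,          # pins the exact point in time
        "method": method,               # e.g. the spot-price call
    }
    # Canonical JSON, so every crawler serializes the fields identically.
    canonical = json.dumps(fields, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()
```

Two independent crawlers that describe the same datum this way derive the same ID, which is what lets a network first agree on exactly which data point is meant before anyone argues about its value.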

[:

That's been operational for a number of years is a lot more concrete. Am I right in assuming, though, that with that architecture, I can understand how you would continue to, maybe, you know, set up something that's checking a data source that you have built yourself or whatever, and obviously you can have high guarantees of its quality, or the quality of the feed.

But doesn't it defeat the purpose of decentralizing it over time anyway? Or, you know, is this an interim fix? Or is there something that I'm missing that would enable that approach to expand in an open network without the need to have, you know, a single source of truth that you come back and compare to?

[:

I think so. And just to give examples: one where we are able to validate a data source would be the Ethereum network, when we want to have the automated market maker price of some data token or whatever. That's relatively easy to validate.

Whereas I think other kinds of data could be hugely complicated. I'm definitely not the best person to speak to this, probably. But I know that, for example, the Chainlink founder defines this kind of concept of being able to identify truth, but not necessarily being able to validate it, I think, as defined truth.

And I think I agree with that vision or idea. I guess it's technically possible to also buy and sell data where you cannot be absolutely sure whether or not it reflects the truth. But for those, I think the mechanisms for pricing and delivering the data feed are rather challenging.

In the space that started in:

And then, you know, some people bet on one outcome and some people bet on the other. And the assumption was that, in the end, the prediction market always gets the answer right, or whatever. And I don't know if it actually worked out. Honestly, I think there were some really difficult questions that were asked in the end.

Like, I think the president thing worked out well, but I remember one case in Kleros — there is also this sort of arbitration platform that is doing something very similar, where they are basically seeking the truth through these sort of game-theoretic games or whatever.

I don't know. The devil is so much in the detail that sometimes, for a certain data feed that you want, I think it's nearly impossible to answer it in a definite way. And I want to avoid going down that rabbit hole in the first place. So, to double down on that, I think I want to stay practical right now with the community crawler, because, I mean, as I've said here in this interview, it can have a huge scope if you want, but I intentionally framed it narrowly.

And I'm only doing it because I have a real problem. And the real problem is that I cannot put anonymous people under NDA. So I'm not really committed or married to the idea of solving the vision that I have laid out so much as to solving the actual real-life problem, which is that I need data on chain, and I run a website, and I have people that are working for me, and I have a closed-source backend, and I don't want to have that.

I want everyone to be able to kind of co-run my website with me, and whatever serves the purpose of solving this problem will eventually become the Rugpull Index network. Because, you know, I think that's the problem with technical visions: we are always idolizing a certain way of doing things and so on.

Whereas the actual solution is always so messy, and there are so many weird details, that in the end it never really equates to what you originally planned for. And I think that's important. So basically, I think you were asking this question of whether it wouldn't defeat the purpose of being decentralized when we can validate the data.

I think, you know, if in one year or so I don't have to have the responsibility of being the administrator of the website, of having a closed-source code base and so on, then I'm already happy. So, for example, where we can scale the crawler, where we have different data points that we can crawl, where other people are running my website so there's redundancy.

So if my server goes down, the website isn't down. All these things. If we accomplish, you know, only a minor fraction of those, I think it would be quite a success. Yeah.

[:

The chance of it being led astray is reduced because there's sort of more eyes on it. Obviously you've got, you know, social layers, like people acting in good faith, et cetera. So there are these things you can do, and it's contextual as well. You know, if you were engaging with someone for Rugpull Index's data feeds, for example, and they started sending you a whole bunch of garbage, then presumably they wouldn't continue to send that for much longer, or at least it wouldn't continue to influence the information that was on the website for much longer.

[:

[:

So, you know, through Ocean Missions, I've been speaking to a few people who are, you know, developers or data scientists, and they have experience with Python and whatnot. But interacting with, you know, extracting information from the Ethereum blockchain or whatever is new to them.

Is there any sort of, you know, practical advice?

So, you know, say I'm a developer and I have the skills to do it. What is the best way to go in and start pulling information from the blockchain?

[:

So if you crawl data on Ethereum, then get an Ethereum full node. Like, rent a server, pay the costs. You know, do all of those things yourself, because you will be happy being able to go into the node and run commands, and maybe, you know, kill it and restart it, or clean out some of the database in case there's a fork or something.

And yeah, don't rely on other services that, you know, have fancy things written on their websites, like "we have this and that uptime". Don't rely on those. Also, do things yourself: don't use huge libraries. If you can code a script that can do a certain thing, do it yourself.

Unless, of course — I mean, don't roll your own crypto, don't build your own calendar. But apart from that, I think a benefit of doing things yourself is that if you have control, you also have a quicker turnaround when things go wrong, because you can just log in and fix the problem.

You can look into most of the source code — it's open source, and, you know, people are very helpful. Whereas when you use hosted services, maybe they push an update Sunday morning and they fuck up your website, and then you have to get up and undo what they have done. And, you know, you're not going to be happy.
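As a minimal sketch of the "do it yourself, don't use huge libraries" advice: a crawler can talk to its own full node over JSON-RPC with nothing but the Python standard library. The endpoint URL below is an assumption (a default local geth); only standard Ethereum JSON-RPC methods are used.

```python
import json
import urllib.request

def rpc_payload(method, params=None, request_id=1):
    """Build a JSON-RPC 2.0 request body for an Ethereum node."""
    return json.dumps({
        "jsonrpc": "2.0",
        "method": method,
        "params": params or [],
        "id": request_id,
    }).encode()

def call_node(url, method, params=None):
    """POST one JSON-RPC request to your own node and return its result."""
    req = urllib.request.Request(
        url,
        data=rpc_payload(method, params),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.loads(resp.read())["result"]

# Usage, assuming a node listening locally (e.g. geth's default port):
#   latest = call_node("http://localhost:8545", "eth_blockNumber")
#   print(int(latest, 16))   # block numbers come back hex-encoded
```

A script like this is small enough to read end to end, which is exactly the quick-turnaround property being described: when it breaks, there is no opaque dependency to debug around.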

And so that's how Rugpull Index is also run: most of our infrastructure we are running ourselves. So we have an Ethereum node, and we really try to not be reliant on any other kinds of data sources or feeds. Of course, all of those can go down, and things can happen, and, oh, you know, APIs change and blah, blah, blah.

So yeah, with Rugpull Index now, for example, it's a privilege to be able to also focus on the Rugpull Index network, because there just has to be almost no maintenance on the Rugpull Index website. I mean, the crawler has been running since we eliminated most of the external data sources and everything.

And since we have tested everything, basically, I don't know, the last time I had to restart the crawler or do some database maintenance was probably like two months ago. So the website's online, it works, that's it. Sometimes I don't even look at it for a day, and still everything works and so on.

It's counterintuitive. Especially as a junior, I think I would never have advised this kind of thing, but now that I've had the experience, I think, yeah, there's a lot of value in having control over the things that determine the quality of service you want to deliver.

Awesome. So that

[:

Where's the best place for them to keep up to date with what's going on?

[:

We have, I think, a fairly active GitHub organization. So we are maintaining a bunch of projects that I think can be used in production. We have a library for querying an Ethereum full node over JSON-RPC. It's called eth-fun, and it's basically similar to ethers or web3.

Then we also have, for example, a library in Solidity for creating a sparse Merkle tree that is cost-effective. We have a library where you can generate our pricing charts using SVG. We have done some research on roll-ups and so on. And of course we are in progress of building the Rugpull Index network, and that's also just a repo on GitHub right now. So if you want to technically contribute, you can. I am also writing a lot of blog posts.

So if you visit rugpullindex.com/blog: basically since the very beginning of the project, I have written blog posts very frequently on all sorts of topics. And, for example, if you are interested in being a project in the Ocean ecosystem, I think a lot of insight might be in those posts, because I've just made it

my task to journal all my thinking there. I also have a personal blog at timdaub.github.io, where I blog about all sorts of crypto-related things.

But I would be very happy if you really want to start working with us, for example, to contribute, or if you want to just learn more or something. Then I think the best tangible way is if you come on our Discord, chat a bit, and see how you can get started on whatever you want to explore.

Awesome. Cool. Thank you so much for

[:

And you'll see a link up there to the Discord, so get involved and find out more.

Links