Chris Wessels, Indexer at The Graph (Graphops), Wave 1 Grantee, The Graph Foundation

GRTiQ Podcast: 16 Chris Wessels

Episode 16: Today I’m speaking with Chris Wessels, an Indexer at The Graph. In addition to his Indexer operation, Graphops, and being a well-respected voice in the Indexer community, Chris also received a Wave 1 grant from The Graph Foundation for his work on the Indexer-Agent. 

Our conversation is incredibly insightful – Chris is brilliant. We discuss his departure from FinTech to crypto, how he thinks about DeFi and its potential impact, we have a fascinating discussion about Layer 2 blockchains, Ethereum, and gas fees, and we conclude our conversation with a great discussion of The Graph – where it’s heading and where we are in that journey. 

The GRTiQ Podcast owns the copyright in and to all content, including transcripts and images, of the GRTiQ Podcast, with all rights reserved, as well as our right of publicity. You are free to share and/or reference the information contained herein, including show transcripts (500-word maximum), in any media articles, personal websites, other non-commercial articles or blog posts, or on a non-commercial personal social media account, so long as you include proper attribution (i.e., “The GRTiQ Podcast”) and link back to the appropriate URL (i.e., [episode]). We do not authorize anyone to copy any portion of the podcast content or to use the GRTiQ or GRTiQ Podcast name, image, or likeness for any commercial purpose or use, including without limitation inclusion in any books, e-books or audiobooks, book summaries or synopses, or on any commercial websites or social media sites that either offer or promote your products or services, or anyone else’s products or services. The content of GRTiQ Podcasts is for informational purposes only and does not constitute tax, legal, or investment advice.



We use software and some light editing to transcribe podcast episodes. Any errors, typos, or other mistakes in the show transcripts are the responsibility of GRTiQ Podcast and not our guest(s). We review and update show notes regularly, and we appreciate suggested edits (email: iQ at GRTiQ dot COM).

The following podcast is for informational purposes only. The contents of this podcast do not constitute tax, legal, or investment advice. Take responsibility for your own decisions, consult with the proper professionals, and do your own research.

I think that as blockchain begins to eat more and more of both our fundamental institutions and the kinds of products and services that we just use every day, it increases the total addressable market of The Graph.

Welcome to the GRTiQ Podcast. Today I’m speaking with Chris Wessels, an Indexer at The Graph. In addition to his Indexer operation, Graphops, Chris is a well-respected voice within the Indexer community. He is also a Wave 1 grant recipient from The Graph Foundation for his work on the Indexer Agent. Our conversation is incredibly insightful. We discussed Chris’s departure from FinTech, his thoughtful description of what DeFi is, and how it might impact the world. We also have a fascinating conversation about Ethereum, Layer 2 blockchains, and gas fees. We concluded by discussing Chris’s long-term vision for The Graph, and where we currently are in its evolution. I began the conversation by asking Chris about his background.

I’m a technologist, I’d say. Within the realm of technologists, I’m definitely more on the generalist side of things. You know, I’ve always just been fascinated by interesting problems and trying to understand and reason about them. I was born in South Africa, actually, and very early on I was absolutely taken by computers and, you know, the vast world that existed within them. And so from quite a young age I started playing with the computer, and that led me into programming quite young. First, starting with silly little scripts, you know, back in the days of Windows BAT files, and, you know, really just things to tickle my imagination and help me understand this new world better. That led me into games. And you know, at first it was kind of playing games with friends, but very quickly I think I became more fascinated by how these games worked than by playing the games themselves. So I started writing games at quite a young age. And then from games I kind of went into websites, and then from websites into network applications. And yeah, all kind of fueled by this desire to understand, and just curiosity. And yeah, so this takes us to kind of like very early high school. At this point I kind of started a business, and we ran essentially a web agency in South Africa, and across high school completed about 20 projects of varying complexity: e-commerce websites, brochure sites, and things like that. I suppose, you know, from a young age I’ve always been interested in technology and computers, but I’ve also always had a bit of an entrepreneurial spirit. And so after finishing high school, I went immediately into the software industry and professional programming, at first in Cape Town. But it wasn’t long before I felt like I had exhausted the opportunity space in Cape Town. And so at 21, I moved to London to kind of seek out bigger opportunities and more interesting problems to work on.
I’ve been in London now for about seven years. And in that time, I’ve mostly worked on startups and within the startup ecosystem, spanning quite a wide range of industries, you know, both as a founder, as an employee, and as a consultant. I went through Entrepreneur First in London and, you know, have since founded a few businesses along the way, raised lots of venture funding, hired lots of people and kind of built out teams, and failed a lot of times. And it was really in the depths of building out a business within the regulated financial services sector that I encountered DeFi seriously for the first time. Engaging with it with that kind of depth of knowledge of how the traditional banking and payment systems work, it just completely blew me away. And I really knew that I needed to focus on it.

I want to come back to DeFi in a minute, but can you tell us when you first got involved or interested in crypto?

So I first encountered crypto painfully early, actually. It must have been, you know, maybe 2013 or 2012, when I first started mining Bitcoin on my laptop CPU, back in the day when that was totally a viable way of mining. And then, you know, I thought this is awesome, and proceeded to forget about it for the next couple of years, format my hard drive, and lose all of my coins. I also had the opportunity to invest in the Ethereum ICO. So I was kind of around then, and engaged enough to see that happen. I bought a pair of speakers instead, so a painful decision, in hindsight. But I suppose, you know, all through that time, crypto, and really what was Bitcoin back then, was definitely at the periphery for me, and something that I was aware of and tracking from a distance, but not something that I’d really engaged with. You know, fast forward many years, and I found myself in London as the CTO of a banking-as-a-service company. And, you know, I’d really spent the time engaging with the traditional payment system and working with the card networks and the banks, even at the European central banking level. We got regulated, we did all the right things, at least in hindsight, to deliver what feels like quite a simple product. And it was at that time that I kind of seriously engaged with DeFi. Back then it was really like the very early days of MakerDAO, with kind of Uniswap v1 coming shortly thereafter. But it just really hit me how, through a number of factors, you know, partially just that the technology was fundamentally superior for building and representing these kinds of systems.
But I think also the lack of regulation, which, you know, in my experience within FinTech really stifled innovation and served a kind of strong function of keeping the existing group of entrenched businesses in a strong position of power, and making it really difficult for innovative competitors to come in. It just struck me that DeFi was this incredible change over that. And, you know, in a matter of weeks of deeply engaging with the technology, I kind of went from the opinion that what we’re doing here is working on the forefront of finance, and FinTech is an amazing force to change the landscape, to actually realizing: well, we’re working on the past here. Quite literally, what we’re building will be leapfrogged by a new class of protocol built on blockchains like Ethereum. And, you know, I think subsequently I’ve come to believe that it really will be Ethereum at the center of a lot of this. And so yeah, that was the point at which I decided, you know, I need to take a step back. I need to focus on this because, you know, now that Pandora’s box is open, I can’t close it.

I want to ask you more about DeFi. It’s a little bit of a jargon word or buzzword in the crypto space. How do you think about what DeFi is?

I think it always depends on how much time there is to have the discussion and how genuinely interested the other person is. For me, I like to think of the implications of blockchain almost from first principles. And in that sense, I think understanding first principles allows one to understand the impact of blockchain on finance, and that kind of frames what DeFi is, I would say.

Well, I love that answer. So let’s do it. Take us through what DeFi is starting from first principles.

I think the key realization for me was that when most people think about blockchain, they think about it as a digital technology. And it is a digital technology, but it’s also so much more than that, I think. Blockchain is fundamentally a complete revolution in social technology. And this kind of, you know, speaks to what I was saying earlier about the breadth of definition of what technology is. I think so many people think technology equals digital technology. But if we wind back the clock on humanity, something like discovering, and being able to manipulate, fire was an incredibly transformative technology for humankind, and really shifted our trajectory in a way that was completely unimaginable. And similarly, you know, if you’re talking about small groups of humans and tribes coordinating socially to serve some broader purpose, you run into this idea of Dunbar’s number, which, if anybody doesn’t know, is a number that was proposed by, I think, a British anthropologist. It is theorized as the maximum number of people that a single individual can have a meaningful relationship with, and it’s kind of theorized to be a cognitive limitation. And Dunbar’s number is 150; you know, that’s the figure commonly used as Dunbar’s number. So as you have these groups of humans attempting to coordinate and, you know, function together as a small society, we invented another piece of technology, a social technology, which is the institution. And the institution is this really useful piece of abstraction: it allows us to dissociate the role of something from a kind of group of human beings, or, you know, human beings that we kind of connect with on a personal basis. And we kind of form this relationship with this abstract thing that is the institution. And that technology is really what allowed human beings to scale societies beyond Dunbar’s number.
So, you know, institutions are an incredible technology in the kind of grand arc of humanity. But unfortunately, and, you know, maybe this is somewhat informed by my experiences and perspectives growing up in South Africa, I believe that, given enough human beings and given enough time, institutions naturally evolve to become misaligned with the mandate that gave them their stature and credibility within society as a whole, and ultimately transition into extractive layers that no longer really serve the public good, or even the kind of private interest that, you know, they were architected for. And so that is, for me, where blockchain comes in. And, you know, I see blockchain as a fundamental revolution in social technology. And yes, that’s enabled through this digital technology, but the implications extend so far beyond that, because we can move the operating frameworks and rules by which these institutions run, en masse, right, like we’re talking about the most substantial abstractions in the world that we live in. We can transition those rules and operating principles from meatspace, which is subject to, you know, the corruptibility of human beings, into smart contracts and, you know, into essentially code, where that code runs at a global scale, and is not corruptible, and can be designed to serve its participants in a way that cannot become extractive, or, you know, is far more anti-fragile in that sense. And that was really, I think, one of the major revelations that caused crypto to become like a one-way street for me: I realized that this was an opportunity to work on something that had enormous implications in the context of humanity’s development and arc as a species. And so I think that is the frame that I like to think about when I’m trying to assess how blockchain can impact any single industry.
And so, you know, DeFi is obviously the impact of blockchain on finance. I think DeFi is what happens when blockchain begins to eat finance as it exists in the world today, where it largely exists as a group of interlinked institutions, run by human beings, and unfortunately, in many layers, incredibly corrupt and extractive, and not serving the kind of public good mandate that they were originally created for. And practically, you know, DeFi is about permissionless inclusion, right: anybody can access these services from anywhere, and nobody can prevent them from doing so. And that’s not just on the demand side of the market, where you have customers that are looking for these services; it’s equally on the supply side. That was a stark realization for me: the existing financial incumbents essentially have regulation as a moat, a moat that, on the surface, seems like it’s there to protect consumers, but in practice does an incredibly good job of insulating the incumbents from competition. I find the implications of that on many of the most important industries and institutions to be incredibly exciting.

Well, that might be one of my favorite answers to “what is DeFi?” I’ve ever heard. But I want to ask a follow-up. A lot of the listeners to this podcast aren’t working in crypto, so they’re sort of outside the space. For the average everyday person, how will DeFi impact them?

You can come at it from different perspectives, I think. Like, fundamentally, you know, I’ve heard the sentiment that Bitcoin is DeFi. I’m not sure I’d agree with that entirely, but I do think that it hints at one of the most powerful components of what DeFi is, and that is access to monetary systems and services and assets that have hard properties. And, you know, a particular set of reading that really informed my perspective on this is Ray Dalio’s writings, both his books around principles and how to navigate big debt cycles, and then his more recent work on the changing world order. If you zoom out and look at the history of money and financial services as, you know, a set of institutions that are really there to serve the needs of human beings, right, like money is really just an abstraction that allows us to coordinate more easily, we have a really bad track record of building systems that are anti-fragile and that serve the interests of the everyday person in the long term. And particularly given our environment now, I think we are in an incredibly interesting time in terms of the international monetary order and the changing role of the US dollar. And I think, at the kind of base layer, DeFi and blockchains, just as a way of everyday people having permissionless access to an asset class that is not subject to the same manipulations as our existing systems, is huge. And that may not be the case if we were, you know, at the beginning of a grand supercycle or some kind of reserve currency cycle, but particularly where we are now, where the US dollar, and the US, is really being challenged in its role, I think this is an important thing for everyday people now.
And then on top of that, you know, I think that people don’t just want idle assets, right. So much of the value that exists within financial services isn’t just about representing and trading assets, but about the productive value and the real utility that can come from those assets. And, you know, that for me is what DeFi is. I think the fact that everyday people will be able to access real productive utility on their assets, without anyone being able to prevent them from doing so, or changing the terms under which they do that, is incredibly exciting. And, you know, it is incredibly empowering for the individual in this evolving environment. But there are so many layers to it.

I believe you. I know that’s true. And I think you did a really great job there. So let’s turn our attention back to you a little bit. Why was it, when you had what seems like a lot of opportunities and you were on a career track, that you decided to pursue a vocation in crypto?

I think it just appealed deeply to me in terms of being an interesting and fascinating intersection of all these different disciplines, and, you know, really an opportunity to understand more about the world in a way that had practical benefit and real-world implementation benefit. But other than the kind of, you know, interesting challenge of the problems, something that really attracted me to the space was the culture of the community and the ecosystem. And I do think it’s important to say that, you know, crypto is an incredibly diverse bunch, and I certainly don’t agree with every subset’s perspectives and positioning. But something that I have always valued tremendously is how much the community values innovation and rigorous intellectual discourse. And I think something that really exemplifies this is how, you know, maybe 50% of the key contributors, at least in DeFi, are anon or pseudonymous folks, right: we have no idea who they are or where they are. And it always just struck me as an incredible thing, when you can have, you know, an ecosystem of people that are so unconcerned with things like who you are, or what you look like, or where you’re from, and just so focused on progress and on innovation and on advancing the discussion. And I cannot think of any other industry that wouldn’t reject that with a degree of pretense. You know, I can’t imagine finance, or any number of other industries, having a serious conversation or engagement with a bunch of anonymous people. And so for me, that really just exemplifies where the values within the community are. And I think that that environment is, you know, predictive of hugely impactful and innovative change at a global scale.

Do you remember when you first became aware of The Graph?

I think I first encountered The Graph somewhere around mid-2019, maybe early 2019. This was after I had begun to investigate DeFi more in depth. And it struck me that there was this incredibly interesting activity burgeoning within DeFi, right: we were making the very real transition from Ethereum mostly being, you know, the rails to move around ICO tokens that had basically no value, to building real applications that created this productive utility for assets. But it was very difficult to reason about this activity in concrete terms, and to measure it and, you know, track it. And so one of the projects that I had considered working on was essentially, you know, something to solve this problem: something to essentially stream changes as they come off of the blockchain and index data in a way that would make it accessible and easy to represent, even in the case of dashboards, to help educate and bring awareness to this activity that was burgeoning. It was in that context that I first came across The Graph. And then sometime later, towards the end of 2019, I joined ConsenSys. I made the decision that, you know, there was absolutely nothing else that I wanted to work on more than blockchain and the Ethereum ecosystem, and ConsenSys seemed like a really fantastic place to be able to have real input into the development and evolution of the ecosystem, but, you know, also retain quite a broad perspective of what was happening in the market. And I landed up working on a similar set of problems within ConsenSys, around extracting data about the real activity and utility that was being created in these burgeoning DeFi platforms, and, you know, delivering it, in the case of ConsenSys, to institutions that wanted API endpoints to fetch this data and stick it into their ML algorithms, or whatever, and crunch it. And so it was at ConsenSys that we really looked carefully at The Graph.
And, you know, I’m grateful we landed up going this way, because it really helped me to be convinced of the value of The Graph. We landed up working on essentially an alternative to The Graph, right: a real-time data pipeline that was capable of streaming data off of the blockchain as blocks were mined, and kind of transforming it and normalizing it into these higher-order data schemas that could then easily be queried. And, you know, leading the teams that built those solutions, and really getting into the meat of the problems that needed to be solved to do that, whether you’re interested, as we were, in quite a holistic sense, or whether you’re an individual project that’s looking to materialize information about your own system, it really gave me firsthand experience of the challenges associated with doing this.

I have a follow-up about that. What did those challenges or that experience teach you about The Graph?

I think it made me appreciate the legitimacy and difficulty of the problem. And, you know, I think there were two or three realizations that I had about The Graph that were enough for me to become convinced that it was going to be the solution that won in this layer.


So then, I guess you decided at some point that you wanted to make the move and get involved in The Graph community. What was behind that move?

So I think, first and foremost, I had, you know, firsthand experience of the problem that The Graph was trying to solve. And from that experience, I knew that it was a real problem, and that there was tremendous value in whichever solution ultimately won that race. So there was that. I think the next thing was that, around me, all of these incredible DeFi protocols were being built out, and The Graph really had unparalleled traction in DeFi, right. Everybody was saying: I have this DeFi protocol, I want to materialize higher-order information about the activity within it, what are my solutions? Well, it really came down to The Graph, or roll your own. So The Graph had incredible traction, and, you know, there are few substitutes for traction. And then I started to do some thinking around the problem and had some realizations about The Graph’s approach that made me very much a believer. And I think the first was realizing the incredible power of building a credibly neutral, open standard as a solution to this problem. A few years earlier, I had spent some time building out essentially a Kubernetes alternative; this was before Kubernetes had actually launched. If anyone doesn’t know, Kubernetes is a container orchestration platform, and it really revolutionized how we deploy software. And Kubernetes itself was built on top of Docker, which is this piece of software for making containers. And it really struck me that what The Graph was doing was very similar to what Docker had done. Docker is, you know, really just a way of taking the components that you need, kind of like your runtime environment, to run a given piece of software, and packaging them up into this little blob of data that you could then ship anywhere and run on any server. And it totally changed the game for how software is deployed and orchestrated.
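To make the Docker packaging idea above concrete for readers, a container image is described by a small Dockerfile that bundles an application together with its runtime environment. The sketch below is a hypothetical example: the base image, file names, and commands are illustrative, not from the conversation.

```dockerfile
# Hypothetical example: bundle a Node.js app and its runtime into one image.
FROM node:18-alpine       # the runtime environment the app needs
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev     # bake the dependencies into the image
COPY . .
CMD ["node", "server.js"] # the image can now be shipped to and run on any server
```

Building this with `docker build` produces the portable “little blob of data” described above, which any Docker-compatible host can run unchanged.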
What was curious about Docker is that the underlying primitives, the underlying kind of bits of technology that made Docker possible, had been around in the Linux kernel for, you know, a decade plus. And all it really took to create this spark was for Docker to create an open standard, right: truly open source software that was in the community, for the community, and eventually by the community, to solve that problem. And they had incredible traction and success with that. And I think that, generally, when you have credibly neutral open standards, if you can rally your community around that, and, you know, essentially rally that kind of collective brain power trying to solve that problem around an open standard, then you create this snowball and everybody wins, right. It doesn’t matter if you have competitors that are working on the standard; I think the whole becomes greater than the sum of its parts. And this was something that I recognized about what The Graph was doing, specifically in terms of subgraphs. So, you know, subgraphs were this open standard, this open packaging format, essentially, to bring together all the individual pieces of the technical problem, right, like things like your smart contract ABIs, and your mappings, and your schema; creating this open package to put all these things together into one open standard that would allow anyone to develop subgraphs and solve this problem. So that was the first thing: you know, it’s very difficult for any centralized competitor to compete with a credibly neutral, open standard. And then the second, slightly more practical thing, which is really just an implication of the first, is that this effectively executes an inversion of control.
And what I mean by that is: before The Graph, there were solutions, you know, companies that effectively offered this kind of solution as a service, where you could go to them and say, this is my protocol, these are my contracts, I want essentially this schema, and I want you to materialize that schema and maintain it and make sure that I can query it, and I’ll pay you for that. And that worked, right; like, we still have companies that operate that way successfully today. But a key challenge there is that it puts the onus of developing the solution for that individual customer or product on this centralized data company. So you go to them and you say, I want the solution; they ask you some questions, they go away and build it, and then you get it and you can use it. Subgraphs as a credibly neutral, open standard: you know, that’s awesome, first and foremost. But it also repositions the responsibility for developing the components of the subgraph, all those things that it packages up. Instead of requiring a centralized entity to do that work, you have an inversion of control, and whoever is actually building the product, whether it’s a DeFi product or an NFT product or whatever, who, you know, naturally has the best understanding of their own domain, right, they understand their system the best and they probably understand their customers the best, is the one who develops all of the components required to deliver that data solution. And not only does that mean that the quality of those components, and the likelihood that they meet the brief, goes up rapidly, but it also removes the centralized bottleneck, right? If you have 1,000 of these projects all coming to the same company at the same time, there’s going to be a huge bottleneck, and the ability for that business to deliver all of those projects is going to be severely limited.
But in this model, with the subgraph inversion of control, individual developers, individual projects, and companies are empowered to solve that problem themselves, and not have to solve, you know, the vast technical problems at the periphery, like database management and index optimization. They don’t have to worry about any of that. They just build the few pieces that are critical to getting the right data off the chain and putting it in the right format. And that little package can then be shipped off to any Indexer who can serve those requests and those requirements.
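For readers unfamiliar with subgraphs, the “little package” described above is anchored by a manifest (conventionally `subgraph.yaml`) that ties together the smart contract ABIs, the mappings, and the schema. Here is a minimal sketch; the contract name, address, and entity names are placeholders invented for illustration.

```yaml
# Hypothetical subgraph.yaml: the open packaging format described above.
specVersion: 0.0.4
schema:
  file: ./schema.graphql                 # GraphQL schema defining the entities to index
dataSources:
  - kind: ethereum/contract
    name: ExampleToken                   # placeholder contract name
    network: mainnet
    source:
      address: "0x0000000000000000000000000000000000000000"  # placeholder address
      abi: ExampleToken
    mapping:
      kind: ethereum/events
      apiVersion: 0.0.6
      language: wasm/assemblyscript
      file: ./src/mapping.ts             # AssemblyScript handlers (the mappings)
      abis:
        - name: ExampleToken
          file: ./abis/ExampleToken.json # the smart contract ABI
      entities:
        - Transfer
      eventHandlers:
        - event: Transfer(indexed address,indexed address,uint256)
          handler: handleTransfer        # runs for each matching on-chain event
```

Once a package like this is deployed, any Indexer can pick it up and serve queries against the resulting data, which is the inversion of control described above.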

How would you explain your rationale for deciding to become an Indexer?

For me, it was a matter of leveraging existing skills, you know, specifically around large-scale systems and infrastructure and, you know, running and maintaining big systems that serve lots of requests. These are skills that I developed in my professional career, and they feel tremendously relevant in the context of what it means to be a good and effective Indexer. So there was a really good natural fit with my skill set. But I think beyond that, you know, being a core participant in the network has always felt like a tremendous opportunity to impact the evolution of what I see as one of the most important protocols in the Web3 space. And, you know, zooming out, or thinking even more holistically, if we think of things like The Graph as these new-age institutions, a new way of collaborating and coordinating at a global scale around a common objective, we’re really talking about working within systems that are completely new and innovative in the context of our history as a species. So the opportunity to be intimately engaged and involved in a protocol like that, at this early stage, is just absolutely tremendous, and, you know, something that really attracted me to getting involved as well.

In addition to being an Indexer, you also received a Wave 1 grant from The Graph Foundation. Can you tell us a little bit about that experience?

Yeah. And, you know, that whole process was really fantastic. Generally, I found that interacting with The Graph Foundation has been incredibly accessible, and they’re generally a very friendly and welcoming group of people. I applied and went through a small series of interviews, just so that they could better understand the scope of work that I was proposing. And, generally, they were very supportive and, you know, happy to offer additional support in kind of any way that they could. I would highly encourage anybody who is interested in contributing to The Graph to have a look at the Foundation’s grants website and get a feel for the process involved. It really is quite light-touch, and focused on finding the right, valuable projects. So there really isn’t, you know, a lot of red tape or anything like that.

So tell us a little bit about the project you received the grant for?

My grant is focused on the Indexer agent, which is a component of the software stack that Indexers run. It is responsible for interacting with The Graph protocol smart contracts on the Ethereum blockchain. In short, the scope of the grant is to do some work which optimizes the gas efficiency of Indexer allocations, and that work has already resulted in about a 23% gas cost reduction for Indexer reallocations.

The GRTiQ podcast is made possible by a generous grant from The Graph Foundation. The Graph Grants Program provides support for protocol infrastructure, tooling, subgraphs, and community building efforts. Learn more at The Graph dot foundation. That’s The Graph dot foundation.

Hi, this is GRTiQ. In past episodes of the podcast, we've learned that one way Delegators help to secure the network at The Graph is by staking GRT with Indexers. The network is designed to look at how Delegators stake their GRT to identify trustworthy and reliable Indexers. In a very similar way, podcast directories look to the reviews and ratings of listeners like you as a way to gauge the trustworthiness and reliability of podcasts. If every listener of this episode took five minutes to leave a review and rate the podcast, that directory would rank the GRTiQ podcast higher in search results, making it easier for community members and those seeking more information about The Graph to find it. Thanks for supporting this project.

In preparation for this interview, and trying to learn a little bit more about your background, I stumbled across this notion that, like so many of my prior guests, it was really Ethereum that drew your attention to the crypto space. And so I guess I'd like to ask: is that a fair characterization? Was Ethereum the hook that brought you into the crypto space?

So my entry, like I think many of us, was Bitcoin. But the problem with Bitcoin is that all you can really do with the Bitcoin blockchain is move coins around from one wallet to the next. And people don't just want to hold assets, they want to make real, productive use of those assets. The problem with Bitcoin in that context is that to do anything of productive value with your Bitcoin, you need to move it off of the Bitcoin blockchain, and in the case of the Bitcoin ecosystem, mostly into any number of fully centralized, fully custodial services. And that always just seemed, you know, totally antithetical to me, because the properties provided by the blockchain are entirely useless in that context. And we effectively just recreate the same system, where the assets that are being used productively are just held and controlled by a small number of institutions. What really blew my mind about Ethereum was this ability to construct more general-purpose applications that were capable of creating utility and productivity for assets. And even beyond assets, if you're talking about things like, you know, nation-state-scale voting, or any number of other coordination problems, you can create real value. And you can encode the operating principles and frameworks of how that value functions and is created and sustained, and have all of those rules be subject to the same scrutiny and immutability and security as the asset itself.

Okay, I want to come back to gas fees for a minute here in the context of Ethereum. And certainly within the context of this podcast, a lot of listeners have interest in the future of gas fees. How should we think about that?

So people obviously want ultra-low costs for transacting. And that's fair, and it's something that we should seek to optimize for. In the long term, we need interacting with these decentralized applications and protocols to be as cheap as possible, so that access is not restricted to those that can pay lots to interact. But transaction revenues are actually a really important component of the incentives that ensure security for our blockchains. And the vast majority of that security today, and this is less the case for Ethereum, which is another great indicator of why it's such a strong network, but generally the vast majority of these incentives are provided by block rewards, right? New currency issuance to pay for that security. And longer term, we want transaction revenues to be high enough to really provide a strong and consistent set of revenues that ultimately result in incentives and security spend by the actors, like miners or eventually Proof of Stake stakers, that provide real security to the network.

Well, I’ve never really thought of gas fees in this way. So I really appreciate that answer. So then how do we resolve this conflict between the ideas that gas fees help secure the Ethereum network and this constant push within crypto to somehow lower gas fees?

So I think the answer is Layer 2s. Maybe not Layer 2s in isolation, things like side-chains as well are aligned with this vision, but I really think Layer 2s are going to be the major catalyst for the reduction of gas costs for end users. What makes Layer 2s really interesting is that they move the computational aspect of transacting off of Ethereum Layer 1, while all of the data relating to the transactions that happen still gets settled onto Ethereum Layer 1. And this means that Layer 2s still inherit the incredibly strong security properties of Ethereum Layer 1, but because that computational aspect, which is really the majority of the gas cost that is incurred when transacting, is moved off chain, you're able to pass that set of gas cost savings on to the user. So as we see Layer 2s launch in the coming months, it's going to be really exciting to see applications migrate into these new environments and to have users and capital follow. And I think The Graph protocol is, just as many other apps, a prime candidate to move certain aspects of the system's operation into Layer 2. That will drop gas costs for all participants, including Indexers, Delegators, and Curators, but without any reduction in the economic security that exists today on Ethereum Layer 1. Now beyond that, I think there's quite an interesting opportunity that emerges if you follow the train of thought, which is: as applications and users migrate onto these different Layer 2 solutions, the data that sits underneath these systems, the data that is critical to understanding, you know, the activity within these products and protocols, and users' positions within them, is then segmented or fragmented across many different execution environments. And this further complicates the problem that The Graph is seeking to solve, right?
If I'm a developer, and I don't have something like The Graph today, accessing the data for my application on Ethereum Layer 1 and creating a data model to reason about it is difficult enough as it is. But as my applications and users migrate into many different locations, that data just becomes even more difficult to access, unify, and reason about, and this really cements the value proposition of The Graph.
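The point above, that computation is the majority of Layer 1 gas cost while rollups only settle data back to Layer 1, can be sketched with rough back-of-the-envelope arithmetic. Everything below (the gas amounts, batch size, and function names) is an illustrative assumption for the sake of the example, not real Ethereum or rollup figures.

```python
# Illustrative sketch of why rollups cut per-user gas costs.
# All numbers are assumptions, not measured protocol costs.

GAS_PRICE_GWEI = 50  # assumed gas price

def l1_tx_cost(compute_gas=80_000, data_gas=20_000):
    """Transaction fully executed on Layer 1: pay for compute AND data."""
    return (compute_gas + data_gas) * GAS_PRICE_GWEI

def l2_tx_cost(data_gas=20_000, batch_size=100, batch_overhead_gas=200_000):
    """Rollup: computation happens off-chain; only the transaction data,
    plus a per-user share of the batch's fixed settlement overhead,
    is paid for on Layer 1."""
    return (data_gas + batch_overhead_gas / batch_size) * GAS_PRICE_GWEI

l1 = l1_tx_cost()  # 100_000 gas * 50 gwei = 5_000_000 gwei
l2 = l2_tx_cost()  # (20_000 + 2_000) gas * 50 gwei = 1_100_000 gwei
print(f"L1 per-tx cost: {l1:,.0f} gwei")
print(f"L2 per-tx cost: {l2:,.0f} gwei ({l1 / l2:.1f}x cheaper)")
```

Under these assumed numbers the Layer 2 user pays only the data portion of the cost, which is why larger batches and cheaper off-chain execution translate directly into lower fees.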

Okay, we're definitely coming back to that observation about The Graph and its value proposition. But before we do, I gotta be honest, I've never really understood why gas fees exist. Who receives them? Why do we pay them? Can you help me understand that?

Gas fees exist, I think, fundamentally to protect the network from spam. If transactions didn't cost anything, then an attacker could feasibly overwhelm the network and bring it to its knees at zero cost. So gas fees are actually, you know, an important part of preventing the network from being spam attacked in that nature. Where gas fees actually go is going to change over time. Today, gas fees go to miners, and they form part of the incentives that miners receive to spend money on equipment and energy, and secure the network as a result. And this touches on what we were speaking about earlier, around how transaction revenues are a really important part of security incentives, and that, you know, cryptocurrency networks should really be striving to maximize those incentives in terms of transaction fees, as opposed to block rewards, which are effectively just subsidized with supply inflation.

Great, that's very helpful. And it brings up a second follow-up question. I've never really understood the difference between Layer 1 and Layer 2 blockchains. But something you said there kind of prompts this question, which is: is one way to think about it that Layer 2 blockchains exist somehow to deal with gas fees, or to speed up transactions? How do you think about that?

Yes, I think the definition of a Layer 2 is that it is a scaling solution that inherits the full degree of economic security from the Layer 1 that it kind of branches out of, and then subsequently batch settles updates into. I think Layer 2s really bring two transformational changes from a user experience perspective. The first is the one that we've spoken about, which is that by virtue of moving the computational aspect out of Layer 1, and doing that in an environment that is a lot cheaper, that significant component of gas cost is removed. And so those cost benefits can be passed on to users of the Layer 2, and that just means transacting is a lot cheaper, and, you know, significantly increases the economic scalability of the Layer 2. But the other very important benefit that Layer 2s bring is much faster state advancement. And what I mean by that is, the rate at which state advances on Ethereum Layer 1 is the blockchain, right? So roughly every 13 seconds, a new block is mined, and all of the transactions in that block are committed into the state of the system. And so, if I'm a user, and I'm interacting with Ethereum Layer 1, I can submit a transaction, and in the best case, I'm going to get feedback about whether that succeeded or failed in, you know, 13 seconds. And as we try to build out Web3 products and experiences that are not just on par with, but even exceed, the experiences of Web 2.0, that 13-second feedback loop is just not acceptable. And so one of the great things that Layer 2s do is allow, you know, what you could call block times to become very short. So the feedback loop for indicating to a user whether the transaction that they're attempting has succeeded or failed, and then, you know, showing them the results of that, can get cut down from 13 seconds all the way down to, you know, half a second or less. And from a user experience perspective, I think that that is as important and impactful as just reducing cost as much as possible.

So you've really got me thinking here about gas fees, and L1 and L2 blockchains. I want to ask, then, is your vision of the future that L2s will be more important? Are L2s the future of blockchain?

Yes, I do think so. I think that the definition of Layer 2 is going to evolve slightly. I think fundamentally, what makes Layer 2s just a superior scaling solution to anything else is that they inherit that very strong set of security properties from Ethereum Layer 1. But I think if you, you know, play time forward, the vast majority of the applications that users care about will be located on these Layer 2s. And, you know, you might even see a future where users never really interact with Ethereum Layer 1 directly, they just have no need to. That's not where the applications that they care about are located. And, you know, there's no real downside to only using Layer 2s, because they're fast, they're cheap, but they still come with those strong security properties.

So if L2s are the future of blockchain, how does The Graph fit into all of that?

If L2s are the future, then the data problems that The Graph solves today only become worse for developers and users trying to reason about their activity on chain, and what will become many chains. And so the more that Layer 2s and other solutions, like side chains, are the future, the more that the value of The Graph as a network is cemented, as a way of having a unified model for accessing, querying, and reasoning about that data.

Alright, Chris, this is all very helpful. And I do feel like I'm thinking more clearly about gas fees and Layer 1 and Layer 2 blockchains. But I need to throw Ethereum 2.0 into the mix now. So in the context of the discussion we're having, how should I think about the difference, or the impact, of Ethereum 1.0 and the move to Ethereum 2.0?

I think that the community, as a result of the kind of messaging or positioning that was put out at some point, has got quite hung up on, you know, what feels like a very stark difference between Ethereum 1.0 and Ethereum 2.0. I would say that the reality is more that Ethereum follows a roadmap that allows it to evolve and be upgraded over time. And really, it's the same network. The most fundamental change in Ethereum 2.0 is really just that we change the consensus mechanism of the network: we plug out Proof of Work and we replace that with Proof of Stake. And that comes with, you know, an enormous set of advantages, in terms of hugely reducing the environmental and operational costs of maintaining that security. So quite literally, the cost to provide a given level of security to the network is going to drop drastically once Proof of Stake is launched. So that's really fantastic for the entire network. And it also brings a bunch of other really great properties, like faster finalization of transactions and things like that. I think if you look beyond Proof of Stake, and you try to bring together the idea of Layer 1 and Layer 2 and Ethereum 2.0, essentially what rollups do is use Ethereum 1.0, as it exists today, as a data availability layer. And what I mean by that is, the transactions, which are practically a bunch of code that runs and figures out, you know, a result, an answer, are no longer actually executed on Layer 1, unless there's some kind of dispute or something of that nature. And all of that computation is moved off of Ethereum Layer 1, and it happens, you know, off chain in the Layer 2, and therefore doesn't incur all of the gas costs associated with that computation.
But to ensure that security is maintained, even though those transactions are not executed on Layer 1, the transactions themselves, i.e., something like "Bob sends 10 ether to Alice", those are recorded on Ethereum Layer 1, even if they're not executed there, because that ensures that the state of the Layer 2 can always be reconstructed and validated. And, you know, all of the economic security that goes into Ethereum itself also secures that data. So you can kind of think of Layer 2s as using Ethereum Layer 1 as a place to save and access data about what happened on the Layer 2 in the past. What Ethereum 2.0 brings, post Proof of Stake and as we introduce data shards, is many different data shard chains, each of which has a data capacity of its own. So imagine that the amount of data, in terms of, like, you know, megabytes per second, that the chain can actually come to consensus on and secure is x, and Ethereum 1.0 is capable of x. That kind of naturally results in scale constraints for Layer 2, right? Because Layer 2 is really just saving data onto Layer 1, but Layer 1 can only handle x amount of data per unit of time, and that creates kind of natural scale limits on how much can happen in Layer 2, because all of that needs to be saved back to Layer 1. And as Ethereum 2.0 introduces data shards, each shard has a capacity of x. So if there are 64 shards, then Ethereum as a Layer 1 network can support 64 times more data capacity in terms of, you know, megabytes per second of data. And that extends into, you know, what rollups are capable of, because suddenly the amount of intermediary state, right, all of this detail about what has happened inside the Layer 2, that can effectively 64x as well, because we now have so much more space on the Layer 1 to save that data.
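The sharding arithmetic walked through above, each shard adding a capacity of x so that 64 shards give 64 times the data throughput, amounts to simple multiplication. In this toy sketch the per-shard capacity value is a placeholder assumption; only the 64-shard count comes from the conversation.

```python
# Toy calculation of data-shard scaling, per the reasoning above.
# per_shard_capacity is an arbitrary placeholder, not a real Ethereum figure.

def total_data_capacity(per_shard_capacity: float, num_shards: int) -> float:
    """Each shard contributes its own data capacity, so total Layer 1 data
    throughput (and hence rollup headroom) scales linearly with shard count."""
    return per_shard_capacity * num_shards

x = 0.06  # assumed MB/s a single chain can come to consensus on and secure
eth1 = total_data_capacity(x, 1)   # pre-sharding: one chain's worth of data
eth2 = total_data_capacity(x, 64)  # with 64 data shards
print(f"Scaling factor: {eth2 / eth1:.0f}x")
```

Because rollups mainly consume Layer 1 data capacity rather than execution capacity, rollup throughput gets roughly the same linear headroom as the data layer itself.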

Hi, this is Chris, with graphops, and I’m an Indexer at The Graph. If my conversation with the GRTiQ podcast has been helpful to you, then please consider supporting future episodes by becoming a subscriber. Visit for more information. That’s slash podcast. Thanks for listening.

I want to move to a part in the podcast where I asked guests to define important concepts or ideas. So let’s start with The Graph itself. How would you describe or define what The Graph is for somebody who’s non-technical?

So I would describe The Graph as a fair marketplace that brings together actors on both the supply side, people who index data and can answer questions about that data, and the demand side, people who are seeking those answers. And the rules of this marketplace, which is also a protocol, are enforced by the blockchain. And that means that all of its participants can be sure that the protocol will not become extractive and is a true public good, and that its objective, its governing rules, and its operating principles are geared towards serving its participants as well as possible. Beyond that, I think The Graph is a revolution for data markets waiting to happen. It's really powerful when you can create open networks that also have strong incentives, because incentives drive so much behavior. And I think the incentives inherent to The Graph protocol are going to drive a huge amount of innovation and, you know, unification in the availability and cohesion of data, in terms of, you know, global data markets. And I think these kinds of pressures, competitive pressures and incentives, are going to build better global data businesses. And I think that that is really exciting when compared to our environment today, where data is incredibly siloed, and the systems that allow it to be productively used to generate insights are equally siloed. I think The Graph is a real force to change that.

So if we were to complement that definition or that description, how would you answer the question of what makes The Graph special or unique?

What's special about it is that, because it is, you know, one of these kind of neo-institutions, right, like protocols as institutions, it can coordinate this marketplace in a way that optimizes for the outcomes that are important to its participants. It can be a public good in that sense. It can facilitate this commerce of people selling answers to questions and people buying those answers in a way that is not extractive at all, and promotes, you know, rapid innovation and development and competition, which really leads to the best outcomes for that network's participants and customers.

Another concept I always ask guests to define or describe is that of a subgraph. So let’s do that. How would you describe or define what a subgraph is?

Subgraphs are an open standard that allows Web3 developers to solve their own data needs. They do this by writing only the bits of code and technology that are essential to their specific data problem, and then they package all of that up into the subgraph. That subgraph can then be loaded into the Indexer software to serve an open API that anyone can use to get the data they need. So in a way, I think you can think of a subgraph kind of like a, you know, international shipping container. What's inside the container, the cargo, and, you know, how that cargo is chosen to be packed and padded and made safe for transit, can vary significantly. And this is also true of a subgraph: the contents of the subgraph will really be specific to the data challenges associated with that problem. But like a shipping container, once that subgraph is packaged up, it can be loaded onto, you know, any truck or ship and treated like any other shipping container. So this standardization and abstraction brings huge benefits, because it effectively delegates each part of the problem to the party that is best suited to solve it. So the specifics of the data problem are solved by the subgraph developer, who knows the most about their problem. And the specifics relating to indexing all of that data, and serving requests for it at scale, are solved by the Indexer. But each set of problems is isolated from the other.

It's crazy to think about, and you just kind of described it there. But there are conflicts between the data models on the blockchain and how to query this data, all the way down to the smart contract. How do you think about that?

When you are writing smart contracts, you know, which is really what these Web3 applications are, the optimizations that you are making are largely about cost reduction, because blockchains, as we know, are expensive to transact on, and they're not really the place to store vast amounts of data. So when you are structuring the storage of that data on the blockchain, you're making decisions that optimize the structure of that data to be as low cost as possible. And the set of tradeoffs that you would make if you wanted that data to be really easy to reason about and ask questions of is quite different. But the implications of that set of optimizations, if you were doing it directly on the blockchain, are that it would be insanely expensive to transact. So if you recognize that the data model that smart contract developers are optimizing for is very different to the data model that you would optimize for if you wanted to very easily reason about and query the data, these are very different data models. And so the subgraph is effectively the glue between those two data models.

So how does The Graph network fit into all of this?

So I think of The Graph network really just as a coordination layer for all of the participants in the system. Practically, the subgraph describes how to fetch the data of interest from the chain and transform it into the structure or format that best solves the problem that the subgraph developer was trying to solve. And so the Indexer takes this subgraph and runs it. And as it runs, it, you know, effectively pulls that data off of the blockchain and transforms it. And then it actually gets saved into a database that is operated and controlled and managed by the Indexer. So you could almost say that all of the data in The Graph network is essentially held by all of the Indexers that exist within it. And then you have mechanisms like the Proof of Indexing mechanism that allow the network to validate and ensure that, if you have, you know, a large set of Indexers, and they are saying, well, I'm indexing this subgraph, and I am therefore storing the data for this subgraph and allowing people to query it, the Proof of Indexing mechanism allows the protocol to ensure that all of the Indexers supplying that data have acted in good faith and truly have all of the data that they claim they have, and that the integrity of that data is 100%. And therefore, when queries are distributed to those Indexers, the protocol is ensuring that they are serving the correct information.

I'm going to also ask you about the role of Delegators. In your opinion, how important are Delegators to The Graph ecosystem?

Really important. I think, you know, there's naturally a separation between those who have capital to deploy into ecosystems like The Graph, and those who have the technical expertise that's required to operate a competitive Indexer. And without that separation in, you know, the network model, I think that would really drive centralization and generally undesirable characteristics. So I think, you know, separating people who have access to capital, and are willing to deploy it into an ecosystem and make well-informed choices, from those who actually operate the Indexers just makes a lot of sense to me as a design decision. I think Delegators also surface a really interesting intelligence in the market, right? Like, they signal to the market what sentiment or confidence is for a given set of Indexers. And I think that's also an important data point to consider in the broader perspective of who is doing well and who isn't, and how do we, you know, drive the network towards participants that are maximally additive and effective in that role?

What advice do you have for Delegators when it comes to selecting an Indexer to stake their GRT with?

I think there's going to be a lot of tooling that comes out soon to make this kind of decision easier. You know, I'm naturally somebody that's drawn to the data and wants to take a data-driven approach to it. And I think right now there's a fairly limited set of reliable indicators around who is running a good operation and who isn't. I would certainly say don't bet solely on APRs or, you know, APYs, especially if they are subsidized by negative effective cuts from the Indexer. You know, those are never sustainable. You would question any business that operates at a loss. Beyond, you know, kind of taking a more holistic view than just APRs, I think at this stage community engagement is one of the best ways to assess the competency of an Indexer. So I definitely look at community engagement. And one of the best ways to assess that is to pop into Indexer office hours, which happens weekly on a Tuesday. This is a great way to be part of the conversation, or even just to listen in on what Indexers are doing, what the active concerns within the network are at any given point in time, what challenges are being faced, and what solutions are being proposed. And I think you'll find that Indexers that are engaged and active in that context really understand the challenges of operating an Indexer within The Graph protocol. And I'd say that that is a great signal to understand who is worth delegating to. Outside of Indexer office hours, and this can be challenging because the data isn't always easy to get at, but as the network becomes more dynamic, and network conditions shift more rapidly, you know, things like new subgraphs or version upgrades being posted, I think who is first to move on those opportunities, and who responds quickly, is another great indication of engagement and how competent an Indexer is.
And yeah, then as we approach the post migration phases, where we have real query demand coming through the network, I think the ratio of query fees relative to self-stake will be a great indicator of how well an Indexer is leveraging their stake in the network to meet demand levels and, you know, get paid for queries.
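The warning above about APRs "subsidized by negative effective cuts" can be illustrated with a simplified model. The formula below is a sketch of the idea, not The Graph protocol's exact reward calculation: if an Indexer keeps a smaller share of the pool's rewards than their share of the pool's stake, the effective cut on delegators is negative, meaning the Indexer is paying delegators out of their own rewards, which is rarely sustainable.

```python
# Simplified sketch of the "negative effective cut" idea. This is an
# illustration, not The Graph's exact reward formula.

def effective_cut(self_stake: float, delegated: float, reward_cut: float) -> float:
    """How much of the delegators' proportional rewards the Indexer keeps.
    A "fair" reward_cut would equal the Indexer's share of pool stake;
    a negative result means the Indexer subsidizes delegators."""
    stake_share = self_stake / (self_stake + delegated)
    delegator_share = 1 - stake_share
    delegator_rewards = 1 - reward_cut  # fraction of pool rewards paid out
    return 1 - delegator_rewards / delegator_share

# An Indexer owns 10% of the pool but takes only a 5% reward cut:
cut = effective_cut(self_stake=100_000, delegated=900_000, reward_cut=0.05)
print(f"Effective cut: {cut:.2%}")  # negative => subsidized, likely unsustainable
```

In this toy model, a 10% reward cut would exactly match the Indexer's 10% stake share (effective cut of zero); anything below that boosts the delegators' headline APR at the Indexer's own expense.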

On this topic of query fees, I want to ask a follow-up question, and it's just something I've been thinking about. Consumers of The Graph's services have been using the hosted service for free, and now, as subgraphs move to the mainnet, they'll have to pay query fees. So the question is, do you think there's any reason to be concerned that maybe people won't be willing to pay query fees?

I don't think so. Not in a way that represents any kind of existential threat. I think that there will be some adjustment period, you know, people like getting things free. But I also would not underestimate the tremendous network effects that have been formed around the subgraph format. I don't see many teams undertaking, as an alternative, you know, huge engineering costs to attempt to solve this problem in some other way. I think they're far more likely to accept that there is a real cost basis to operating these services. And once the hosted service disappears, it's not like there's any other venue where you can really get these answers free. So I would predict that, you know, the vast majority of applications will adopt the decentralized network. I think another thing to realize is that, if you kind of play through your head what a developer might go through: you know, they've had this service free for some time, and it's met their needs incredibly well. They have sunk very significant engineering hours and funds into developing subgraphs, ensuring that they meet their needs, and developing that competency and skill set within their team. Then free queries begin to go away. And the choice that they are presented with is, well, this is unpalatable. You know, I want free queries. And I don't think that they're going to get free queries from anywhere else in the market. So then the next step becomes, well, you know, I could basically attempt to completely re-develop this solution, right, like develop something that isn't The Graph from scratch. That just makes no sense. I think the next step is, you know, what gets you to the point of believing in the decentralized network, which is, well, "Okay, I have my subgraphs, and they're an incredibly neutral open standard. And The Graph node itself is completely open source. So I can essentially set up a mini indexing operation", right, like, minus all of the decentralized network components.
But I can launch my own service somewhere in the cloud that indexes my own subgraph, and serves queries for that subgraph. And then, you know, I point my application at it, and I don't pay query fees. But obviously, I, you know, have to maintain that service, I have to pay for the infrastructural costs, and I need to develop competency and expertise in scaling and optimizing that service as demand increases. And I think you only get so far down that path before you realize that it is near enough impossible for an independent to compete with, you know, the decentralized network Indexers that essentially sink all of their time, resources, and, you know, in some cases, teams of people into running and optimizing these services, and operating them in a reliable and cost-efficient manner. And I think at that point, as a developer, you just realize that the net cost of deploying my subgraph to the decentralized network is actually going to be substantially lower than trying to cut out the query fee market and just run the raw infrastructure myself. And, you know, the decentralized network comes with a tremendous amount of benefits in terms of resiliency and redundancy, but also, you know, it's highly geo-distributed. So latency to the end user for queries is going to be a lot better than anything that they could run. And just generally, it's really hard to compete with a network that, you know, has collectively got billions of dollars at stake to do one thing really, really well.

So the fact that we’re talking about query fees, and migration of subgraphs to the main net all speaks to the fact that we’re still in an evolving environment. And Delegators constantly have this question or this concern of where are we in the evolution of The Graph? So what would be your advice to Delegators who are curious about that? Where we’re at in the evolution of The Graph?

I think we're incredibly early. I think The Graph is in an unusual position of having bootstrapped an incredibly capable product in a centralized context, right? Like, the hosted service is the data backbone of Web3. And that's definitely an unusual position for a protocol to be in. But when we think about The Graph in the decentralized network context, we are incredibly early. I think the migration itself has been incredibly positive in highlighting some of the deficits that exist in terms of serving that data market in a decentralized way at scale. I think my advice to Delegators would be, you know, if you're not in this for the long haul, then get out, because we are right at the beginning. The network is incredibly young. And we are still at the phase of bootstrapping the network, and validating that it is sufficiently capable and sufficiently tested to serve, you know, the very significant levels of demand that exist already in the market. And systems like these need to be rolled out in a progressive and careful manner, because we need to maintain the credibility of the network, and doing things too quickly, you know, would risk damaging that. Also, if you're zooming out sufficiently far, I think that generally the conception of where we are is somewhat constrained to the single Ethereum blockchain context. And actually, if you zoom out and recognize that we are going to see the migration of applications and users and capital into these alternate execution environments, like side chains, but, you know, more significantly, I think, Layer 2s, then you really begin to see that in the grand arc of The Graph, and it becoming this unified, aggregated data layer, we're really just at the beginning.

I love that answer, and I love the way you articulated that. So I guess the best follow-up to a question about where we are in the evolution of The Graph is: what’s your long-term vision for The Graph?

I think that as blockchain begins to eat more and more of both our fundamental institutions and the kind of products and services that we just use every day, they increase the total addressable market of The Graph. Our lives already revolve around digital services, and there’s certainly no doubt in my mind that the vast majority of these services will, in time, transition to being blockchain-based protocols and networks, because that style of coordination is just hugely superior for maximizing the outcomes for its participants and users. So as more and more of these digital services we depend on become blockchain-ified, they essentially fall into that total addressable market for The Graph, and I think we’re really just at the beginning of what that looks like. Today, these applications largely exist on things like Ethereum, and not much else. But in time, I think we are going to see a diversification of where these applications run, and where they store their data as well. And what’s really interesting about The Graph is that, in that context, it functions almost as this layer of connective tissue that is able to aggregate data and create a real-time free market for that data in one place, even though the data will be hugely disparate in where it is located in terms of the blockchains or execution environments, like Layer 2s, where these applications actually run. But then also, if you look at the evolution of the data requirements that any given application has, I think we’re definitely going to see an evolution of data storage beyond storing data on the blockchain.
And depending on the importance and tradeoffs associated with storing that data, it will move into other places like Filecoin, or Arweave, or other storage systems that are decentralized and have strong properties over our existing systems, but still occupy quite a wide spectrum of tradeoffs that applications will make depending on the importance or value of that data. And in the same way that we talk about the fragmentation of capital and applications as they move into things like Layer 2s, I think the same is true for data. As blockchain subsumes more and more of the services that we depend on every day, where data is located is going to become really disparate. And The Graph is incredibly well positioned to just glue everything together and offer an aggregated and unified interface for querying that data in a meaningful way, in one place, without needing to deal with any of the complexities of where that data is located.

Chris, thank you so much for your time. I really enjoyed this interview. For those listeners that want to stay in touch with you or learn more about what you’re up to, what’s the best way to stay in touch?

Thank you so much. It’s been fantastic to be on the show, and I’ve really enjoyed chatting with you. I’m available on Discord, in The Graph Protocol Discord, under chris.eth. You’ll find me in the Indexer channel and a few others. I’m also available on Twitter at undefinedza; you can find that in the show notes. Please do reach out, happy to speak to anyone.

This has been a production of the GRTiQ Podcast. You can find additional information, including detailed show notes, by visiting GRTiQ.com/podcast. That’s GRTiQ.com/podcast. Please support this project and help out the community by subscribing and leaving a review.


Please support this project
by becoming a subscriber!



DISCLOSURE: GRTIQ is not affiliated, associated, authorized, endorsed by, or in any other way connected with The Graph, or any of its subsidiaries or affiliates. This material has been prepared for information purposes only, and it is not intended to provide, and should not be relied upon for, tax, legal, financial, or investment advice. The content for this material is developed from sources believed to provide accurate information. The Graph token holders should do their own research regarding individual Indexers and the risks, including objectives, charges, and expenses, associated with the purchase of GRT or the delegation of GRT.