* This episode was recorded on July 11, 2021.
Episode 17: Today I am releasing a special edition of the podcast – a panel discussion with three members of the Graph Community, each of whom played an important part in the July 7th launch of Curation, the Graph Explorer, and the Subgraph Studio at The Graph.
Only days after one of the most important events in the evolution of The Graph, I spoke with Nena, Juan, and Payne – all names that anyone familiar with the recent product launches will recognize. I can’t emphasize enough, nor adequately express my appreciation, that they would take the time, after such an exhausting and labor-intensive contribution, to share their insights and story with the GRTiQ Podcast audience. I can tell you that each of them sacrificed greatly, emptying an already empty tank, to provide the following content for the community.
Last week was a historic event. And despite Nena, Juan, and Payne’s critical roles, you will also quickly realize that they acknowledge the contributions of many others who also made last week’s launch possible. I think it’s important to recognize the efforts of all who helped – I know there were many – and I am sure we will never fully appreciate everything they did, and continue to do, to make The Graph’s vision a reality.
During this panel discussion, we discuss each element of last week’s launch – Curation, Graph Explorer, and Subgraph Studio – what they learned along the way, and how these things will impact The Graph ecosystem.
The GRTiQ Podcast owns the copyright in and to all content, including transcripts and images, of the GRTiQ Podcast, with all rights reserved, as well as our right of publicity. You are free to share and/or reference the information contained herein, including show transcripts (500-word maximum), in any media articles, personal websites, other non-commercial articles or blog posts, or on a non-commercial personal social media account, so long as you include proper attribution (i.e., “The GRTiQ Podcast”) and link back to the appropriate URL (i.e., GRTiQ.com/podcast[episode]). We do not authorize anyone to copy any portion of the podcast content or to use the GRTiQ or GRTiQ Podcast name, image, or likeness for any commercial purpose or use, including without limitation inclusion in any books, e-books or audiobooks, book summaries or synopses, or on any commercial websites or social media sites that either offer or promote your products or services, or anyone else’s products or services. The content of the GRTiQ Podcast is for informational purposes only and does not constitute tax, legal, or investment advice.
We use software and some light editing to transcribe podcast episodes. Any errors, typos, or other mistakes in the show transcripts are the responsibility of GRTiQ Podcast and not our guest(s). We review and update show notes regularly, and we appreciate suggested edits (email: iQ at GRTiQ dot COM).
The following podcast is for informational purposes only. The contents of this podcast do not constitute tax, legal, or investment advice. Take responsibility for your own decisions, consult with the proper professionals, and do your own research.
To me, it just gives me tingles right now, how powerful it is to be a part of a project like that. And as I’m building things, I always care about the users who are going to use them, why they’re going to use them, and, most importantly, how that is going to impact their lives. And to me, that’s kind of my big ‘Why’: it is going to impact people’s lives.
Welcome to the GRTiQ Podcast. Today I’m releasing a special edition of the podcast – a panel discussion with three members of The Graph community, each of whom played an important part in the July 8 launch of Curation, The Graph Explorer, and the Subgraph Studio. Only days after one of the most important events in the evolution of The Graph, I met with Nena, Juan, and Payne – all names that anyone familiar with the recent product launches will recognize. I can’t emphasize enough, nor adequately express my appreciation, that they would take the time, after such an exhausting and labor-intensive contribution, to share their insights and story with the GRTiQ Podcast audience. I can tell you that each of them sacrificed greatly, emptying an already empty tank, to provide the following content for this community. Last week was a historic event. And despite Nena, Juan, and Payne’s critical roles, you’ll also quickly realize that they acknowledge the contributions of many others who also made last week’s product launch possible. I think it’s important to recognize the efforts of all who contributed – I know there were many – and I’m sure we will never fully appreciate everything they did, and continue to do, to make The Graph’s vision a reality. During the panel discussion, we discussed each element of last week’s launch, what each panelist learned along the way, and how these things will impact The Graph ecosystem. We started the discussion by having each guest introduce themselves and share the role they played in the recent launch.
All right, and thank you for having me. My name is Nena, and I’m the product engineering lead at Edge & Node. In terms of this particular launch, I had multiple roles. One of them was coding, of course, developing this app that we launched. Another was tech lead: leading my team, helping with communication around the launch, getting everybody together, and things like that.
Hey, I’m Juan. I’m a software developer helping the Edge & Node team with everything related to subgraphs. I’m basically the guy that maintained and upgraded the subgraph for the network. And it’s not just one subgraph, it’s two now. So yeah, I basically maintained and split the subgraph, added new stuff to the subgraphs, and everything.
Hi, everyone, my name is Alex. People know me by Payne in the Discord and on Twitter. I was responsible for the QA on the product side, so basically the Explorer and the Studio for the product launch.
So, Nena, Juan, and Payne, the reason we’re meeting today is because of the recent announcement of The Graph launching the decentralized data economy, with the focus on Curation, Graph Explorer, and Subgraph Studio. I want to talk about each one of these products that were shipped in greater detail and allow listeners to better understand not only what happened, but how it will impact the entire Graph ecosystem. So can we start with Graph Explorer? I want to start there because I think most listeners have used it or seen it. What’s changed, or what’s different, about Graph Explorer?
That’s a very good question. Since December last year, we’ve had a very stripped-down version out there that was showing some stats about the network, showing Delegators, and enabling delegations for users. And since then, we’ve worked really hard to add more features to support the other protocol roles. One of the big things we have is subgraphs and curation, and that particular part is really exciting for us. Right now, the subgraphs that are published to the decentralized network, which is what you can do in Subgraph Studio, show in Explorer, and people can signal on them. Once they’re signaled on, people can watch live how much progress Indexers have made indexing those subgraphs, and they can view a lot of other KPIs. We also have charts to see data over time. And then we have a Curators table, in addition to the Delegators table, and we also show all the Indexers and a lot of other statistics. We kept the Delegator features and improved them, and for Indexers we added staking into the protocol through the UI.
And we also added the profile settings. That way, as an Indexer, or as another role in the protocol, whatever you may be, you can go there and set your profile picture and your ENS name. And as an Indexer, you can also set the delegation parameters and the operators for your Indexer node.
Yeah, I guess we could probably talk about all the changes that were required on the subgraph for this. There were many, many changes that were needed for the Explorer to be fast, mainly because the old Explorer was kind of slow. It needed to do a lot of calculations on the front end to be able to display many of the rich data points it was showing. This new overhaul of the Explorer required a lot of changes in the subgraph, and the main change it required was to have a separate subgraph which tracks all the analytics. By analytics, I mean historical data that is used for, for example, the delegation charts and the subgraph charts. Pretty much all the charts that display historical data are using a new subgraph, which is what we call the analytics subgraph. The analytics subgraph is also doing some more intense calculations for Delegator data, which we didn’t have before in the main subgraph. So yeah, there was a major overhaul of the subgraphs, and even a split, to be able to improve the speed of the Explorer, which was quite awesome. To be honest, it was a major milestone for the team.
Yes, and it was great what Juan did there, because originally we did a lot of those calculations on the front end. Once Juan upgraded the subgraph, which is now going to be available for everybody else to use as well, we could really simplify that and use the power of GraphQL to query all the data, make it sortable, add pagination, and all the other features.
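For readers who want a concrete picture of what Nena is describing, here is a rough, illustrative sketch of leaning on GraphQL for sorting and pagination instead of computing them client-side. The entity and field names (`indexers`, `stakedTokens`) and the fake fetcher are assumptions for illustration, not the real network subgraph schema:

```python
import re

# Illustrative sketch only: entity/field names ("indexers", "stakedTokens")
# are assumed for the example and are not the real network subgraph schema.
INDEXERS_QUERY = """
{{
  indexers(first: {first}, skip: {skip}, orderBy: stakedTokens, orderDirection: desc) {{
    id
    stakedTokens
  }}
}}
"""

def paginate(fetch, page_size=2):
    """Yield pages of results until the endpoint returns a short or empty page."""
    skip = 0
    while True:
        page = fetch(INDEXERS_QUERY.format(first=page_size, skip=skip))
        if not page:
            break
        yield page
        if len(page) < page_size:
            break
        skip += page_size

# Stand-in for a real GraphQL endpoint: data already sorted by the "subgraph",
# sliced according to the first/skip arguments in the rendered query.
DATA = [{"id": f"0x{i}", "stakedTokens": 100 - i} for i in range(5)]

def fake_fetch(query):
    first = int(re.search(r"first: (\d+)", query).group(1))
    skip = int(re.search(r"skip: (\d+)", query).group(1))
    return DATA[skip:skip + first]

pages = list(paginate(fake_fetch, page_size=2))
```

Because sorting and slicing happen on the server side, the front end just renders each page as it arrives, which is the simplification Nena and Juan are pointing at.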
And it’s so much faster now. Everywhere you go is so much faster than before. It makes me so happy.
Indeed. What explains the increase in speed, then? Was it just, you know, a different type of subgraph, written differently, or was it a whole bunch of other things that I just don’t know about?
This in particular was just a refactor of, basically, how we consume the data from the subgraph. We wanted to make it so the front end, and so basically the end users, don’t have to calculate a bunch of stuff to be able to display the page. The main idea was: okay, we can move all those calculations, or most of those calculations, to the subgraph. Of course, that impacts the amount of time the subgraph requires to sync. We basically went from syncing the whole subgraph in about eight hours to about three and a half days. So it’s a major change in the subgraph, but it allowed us to avoid having to calculate everything in the front end, and it allowed the page to load a second or two faster, so it feels a lot more snappy. It also simplifies the code a lot for the front end, so Nena could throw some new stuff in there without the code getting too bloated.
When we’ve looked at the impact subgraphs and The Graph are having in the larger crypto, DeFi, Web 3 environment on this podcast, we’ve used use case examples like Uniswap or Livepeer. But now The Graph and its own Explorer seem to be a really cool example of a use case. Is that how you all see it as well, as a highly visible use case of what subgraphs and The Graph Network can do?
Yeah, absolutely. And we were also talking about publishing the analytics and the main subgraph to the network for Indexers to serve data for, but unfortunately they use IPFS data that is not yet reliable enough to make them fully deterministic. Correct me if I’m wrong, Juan. But as of now, we can’t do it yet, although we’re working on a solution. We initially tried to strip those subgraphs of the IPFS data, but that didn’t cut it, because we needed more stuff to be pulled from the subgraphs.
Yeah, basically, to give you an example of the difficulties of publishing this to the network yet: it requires some data that is gathered from IPFS nodes, so it’s basically external to the on-chain data. The issue with that is that the files might not be there. Basically, your Graph Node might not find them, and that makes the subgraph non-deterministic, at least for now. That would impact Indexers that want to index it and get rewards for indexing it. So that’s one of the reasons why it’s not published yet on the network. But we needed that data because, for example, it’s used for the images and everything related to profiles in the Explorer. We couldn’t get rid of the data, because it’s super useful to have it on the subgraph. So we chose to basically index it ourselves, for now. But the main idea is to be able to deploy at least a stripped-down version to the network, or the main version, whenever IPFS becomes deterministic.
I just want to say that the specific metadata we get from IPFS, such as user images and names and a couple of other things, is what I heard users really ask us for several times, because we have a huge community and we really listen to feedback. So we really didn’t want to eliminate that. When you make compromises like we had to do in several places, you need to sacrifice something else.
Nena, what’s your vision for how members of The Graph community use the Explorer, then? I mean, I understand the amount of data that is now accessible. But what’s the use case for members of the community? Why should they go use the Explorer?
That’s a great question. First, I just want to give a little bit of background: we spent a lot of time trying to brainstorm and figure out what would be the best user experience for people to jump into The Graph and easily take on one of the protocol roles, because some of them are more technical and some of them are less technical. We did a lot of user research and tried to gather user feedback, which I always love to hear. The reason we built Explorer is really to make it easier for anybody who is interested in The Graph, and maybe is unsure how to get started or what to do, to have this hopefully intuitive and beautiful user experience that they can jump into and that’s going to guide them through the process. At least that’s our hope. Now that it’s out there, we’re going to gather more user feedback and improve things, of course.
Yeah. From my point of view as an Indexer, it’s very nice to have all this new data in the Explorer now, because I can go view my own profile and I can see, when I open the allocations, how long they have been open for. I can also see the past allocations, how many rewards they generated, what POI (proof of indexing) they were closed with, and all that sort of great stuff that we have all been using Graphscan for until now. I’m very happy to use the official Explorer for this. It’s amazing. And I still have some feedback that I want to send in, specifically for the subgraph pages. That’s also very useful.
Well, that’s one thing I really appreciate about the new look and feel of the Explorer. As a Delegator within the community, I have been using other dashboards in the community to get data and to look at how my Indexers are doing, how I’m doing on rewards, and different things like that. But I feel like with the new Explorer, I can just go there. Was that sort of the vision, Nena, when the Explorer was created?
Yes, that was really the goal. Originally, I was the only person building it last year, on top of all the other work that I had to do. Then this year I hired some really amazing developers, who started several months ago at kind of different times, and that really helped us and gave Explorer this new look, stability, and performance. The team that I work with right now is so amazing, and you can really notice that. The reason it took us some time is because, as a startup, as probably a lot of people know, we experienced a little bit of growing pain together, and we tried to stay focused. But yes, the goal was to have an Explorer that you could go to and view whatever you’re interested in about The Graph. Having said that, we are totally fine if people go to other dashboards as well; it depends on your preference. But this is something that we wanted to provide for our users as kind of a one-stop shop. And like I was saying, Payne had some feedback, and we have a lot of work and a lot of features coming up that we’re going to add to Explorer over time.
Well, Juan, The Graph Explorer is getting a lot of attention; it’s a highly visible element of the launch. Should we say more about the subgraph behind the scenes powering it all?
The spotlight here should be on the Explorer, because the subgraph just follows along with whatever the Explorer needs, and not only the Explorer: the Studio and other tools too. I could probably add that, for example, I’m not the only one who has worked on the subgraph, even though I’m, I guess, the only one that’s fully working on the subgraphs now. There’s also Dave. Dave basically built the first version of the subgraph, which is a huge subgraph. It’s incredibly huge. I guess the only other subgraph that is as big as this one is the Maker one, which is also, of course, huge because Maker’s huge. So it’s incredible the amount of work he had to put into this one, and of course he’s also working on other stuff for the whole protocol. When I joined the team to work on the subgraph, I basically was the only one developing the subgraph from that point onwards, minus a few specific tasks that I needed help with from day one. But we were pretty much always working as a team, trying to improve the product, and whatever changes the subgraph needed to have to allow the Explorer UI to be better, we worked on them, even if it took us months to get there. It’s super useful to have a back end that helps the UI. From a subgraph developer perspective, the Explorer itself is a huge improvement on what we had before. There are many third-party tools that also have a bunch of really useful metrics on the protocol. I even made a spreadsheet when we first started, at the December launch last year, which for me was super useful. But having the official UI with a ton of really useful metrics and data is incredibly useful, to be honest. That’s at least my point of view as a subgraph developer, not as the internal developer of this specific tool.
Juan, to add on top of what you already said: the thing with subgraphs is that everyone can consume that data, and they can use it for their own purposes, for dashboards, to analyze the network and draw conclusions, or whatever they want to do. Everyone can use them. This is the beauty of The Graph, and it’s really great to see it as a live example. There are so many great things you can build on The Graph.
Yeah, that’s totally true.
I totally agree. I’m so happy we did this upgrade, because many times when I was building things and something was really painful and I got frustrated, I was thinking, wow, this must be the same for others who are building their UIs on top of this subgraph. We always have those users in mind, and now it’s becoming so much easier. From a dApp developer perspective, you can basically just consume the queried data and rely on the GraphQL API.
Well, for listeners that aren’t familiar with The Graph Explorer, you can go to thegraph.com/explorer and see all the things that have been added, lots of great data. And as the panelists have said, it’s a really great example of the power of subgraphs and The Graph, and all the things that you can do with this technology.
I now want to turn our attention to Subgraph Studio. I know that this was dissimilar to The Graph Explorer, which was an upgrade of something that already existed; Subgraph Studio was something entirely new. Can someone maybe, in non-technical terms, explain what the Subgraph Studio is and what it does?
So, we built Subgraph Studio primarily geared toward subgraph developers. And even though it’s an improvement on what we had in the hosted service, we actually started from scratch. We were envisioning this playground where developers can easily create and deploy their subgraphs without having to publish them right away to the decentralized network, so they wouldn’t need to cover any of the gas fees and things. They can basically just try and test things out. They can query them in the playground, update their code, and when they’re happy with it, they can, with what I hope is an easy UX, publish their subgraphs to the network. That’s the main purpose of Studio. Another big set of features we have there is creating API keys in order to use your subgraphs, and billing. We decided to go with a Polygon bridge for the billing, which makes transactions faster and more scalable. So that’s also done through Studio.
So, Juan, as a subgraph developer, explain how people like yourself will benefit from something like Subgraph Studio.
Well, what the Subgraph Studio allows you to do right now, which we couldn’t do before, is to basically deploy stuff to the network. The idea behind it is that, with the Studio, you can deploy to the network, and you can have as many Indexers as want to index your subgraph indexing and serving data, which basically makes your subgraph more production-ready than it was before. The hosted service always had some issues with specific subgraphs and indexing some stuff, because, of course, it’s a huge beast of an infrastructure that needs to handle thousands of subgraphs from basically everywhere in the world. So, for a subgraph developer, migrating to the network will allow you to have production-ready subgraphs deployed in a decentralized manner, which basically will make the uptime of your subgraph better and most likely make your latency go down. That’s something you couldn’t do before; it’s not just an improvement, it’s a new tool for you. So that’s the main point of the Studio: it allows you to do stuff that you could never do before.
Payne. How about you? What do you want to add to that?
Right, so I want to give an overview of the Studio and the experience that goes with it. When you log into the Studio, you have to connect your wallet, in order for you to be able to publish those subgraphs to the network. Once you are connected, you can create a subgraph and deploy it to the Studio, which is basically a kind of staging environment to allow you to prepare to publish it to the network. Once you deploy it to the Studio, you can then play around with your subgraph. The Studio here basically acts as kind of a hosted service: you can query your subgraph with a specific URL, you can play in the playground, create queries, see that everything works, and upload the pictures and metadata for that subgraph, and then eventually publish it to the decentralized protocol, where the Indexers can pick it up and do their job. Apart from that, you also have API keys, like Nena mentioned. You can create an API key, and you can add GRT to the billing contract, so you basically have a way to query that subgraph when it’s published to the network, using the API key. You can also control which domains are allowed to interact with that API key, and which subgraphs that API key can work with, right from the Studio, which is very great. And yeah, a couple of days before launch, I was literally bombarding the whole network of Indexers with queries, and I couldn’t believe my eyes. Everything was there, and everything was working. I sent, like, 1.6 million queries at some point to the decentralized protocol, and I was like, wow, this is really happening. I went to Jannis, because I was asking him a bunch of questions about the gateways and stuff, and I was like, holy cow, this is amazing. This actually works. We’re actually here, we’re actually doing it. It’s literally just mind-blowing.
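As a concrete sketch of the API-key flow Payne describes: once a subgraph is published and an API key exists, a client queries it through a gateway URL that embeds the key. The URL shape below, and the placeholder key and subgraph ID, are assumptions for illustration; check The Graph’s documentation for the exact endpoint format before relying on it:

```python
import json

# Hedged sketch: the gateway URL shape below is an assumption for
# illustration, and MY_API_KEY / SUBGRAPH_ID are placeholders, not real values.
GATEWAY = "https://gateway.thegraph.com/api/{api_key}/subgraphs/id/{subgraph_id}"

def build_request(api_key, subgraph_id, query):
    """Return the (url, body) pair a client would POST to the gateway."""
    url = GATEWAY.format(api_key=api_key, subgraph_id=subgraph_id)
    body = json.dumps({"query": query})
    return url, body

url, body = build_request(
    api_key="MY_API_KEY",        # created in Subgraph Studio
    subgraph_id="SUBGRAPH_ID",   # shown after publishing to the network
    query="{ tokens(first: 5) { id } }",
)
```

The domain and subgraph restrictions Payne mentions are enforced on the gateway side against this same key, which is why rotating or scoping the key in Studio is enough to control access.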
Nena, with the release of the Subgraph Studio, what’s kind of the vision for what happens next? Was the vision that we’d see more subgraphs? Was the vision to lower the barriers to creating and deploying subgraphs? How do you think about that?
Yeah, basically both of those. We are hoping to make it easier for people to create and deploy subgraphs. In terms of publishing, you can publish to Rinkeby, which will be visible on our testnet, and then from there publish to Mainnet as well. We are hoping that people will create more subgraphs and, in the future, collaborate on them; that’s something that we’re really thinking about adding into the UI. And then there’s the whole API key and billing piece, which is really necessary to use the subgraphs, so that had to exist somewhere. For that, we actually use a database, which is what Studio uses as well.
So Payne, how should listeners think about all those subgraphs being hosted? And now the introduction of the Subgraph Studio? How do those two different paths merge?
Right, so everyone that has their subgraphs on the hosted service right now will eventually want to migrate them, through the Studio, to the decentralized protocol. There is no way to avoid the Studio: you have to deploy your subgraph to the Studio, and then from the Studio you can publish it to the network. And given that we are going to deprecate the hosted service at some point, I would just ask everyone to get used to how the Studio works and get a feel for it.
Yeah, one of the things that you should really keep in mind with the network, the Studio, and the hosted service, and how they all interact with each other, is that the hosted service is a single Indexer, a single, gigantic Indexer, but it’s also a single point of failure. Whenever that Indexer goes down, everything goes down with it. The idea behind the network is that you have 150 Indexers, or more, right now indexing stuff, each of them basically being a hosted service by itself. They may not be as gigantic as the hosted service is right now, but they could very well be in the future. And you will have 150 different infrastructures serving your data anywhere in the world. So that is really a change in the playing field: you won’t rely on a single point of failure, you will have many. I mean, for your subgraph to really go down, it would have to go down on at least 150 Indexers. So it’s very likely that your subgraph will be alive at all times, whenever you need it. It’s completely different to having your stuff on the hosted service.
Yeah. And another good thing about the decentralized protocol is that you have a kind of natural load balancing and also a geographically distributed environment. If someone connects from Africa, they will hit an Indexer that’s sitting in Africa, basically, and we have a couple of them, I think, at this point. The latency will be very small compared to hitting the hosted service, which I believe is running in Europe and the US at this point. And like Juan said, if one of the Indexers goes down, there are 150 more to hit.
I just want to quickly add to all of the great points that Juan and Payne mentioned: we will be deprecating the hosted service over time. We have a plan for it, and we’re not going to just shut it down, so people don’t need to worry about it. And at the same time, we would like to help users transition to Studio and encourage them to do so.
Yep. Regarding the Studio, you can basically think of it as a tool for easy access to the network. It makes deploying and publishing subgraphs to the network a lot easier. You could technically do it before; it has been possible, like basically everything, since December. But you needed to go through the contracts, you needed to basically craft the transactions yourself, and they are not easy to craft. I’ve been there; I tried to play around when we were testing these new tools, and it’s not easy, even if you really know what you’re doing. And playing around with on-chain data is never fun, because on-chain data is immutable: you need to do it right the first time around. So having a tool that pretty much makes it a three-click adventure for you to publish a subgraph is a game changer. The way that I think about Studio is as the tool that allows me to use the network. It’s not that it’s a product in and of itself, but it’s the tool that allows you to take the next step in your subgraph deployment and publishing.
Nena. When I look at the new Explorer, the updates to the Explorer, and I hear what you guys are saying about Subgraph Studio, I can’t help but think that a lot of this will attract web 2.0 developers that were maybe waiting for an interface, an invitation to get more involved. How do you think about that?
Yes, I would love for more Web 2.0 developers to join us. As a matter of fact, I’m a Web 2 developer who switched to Web 3 a few years ago, and the first project I built, together with Dave, who I think was on your podcast, was Everest. Honestly, at that time it was really painful to do anything in Web 3, and things have improved over time. So I totally understand the pain of transitioning from Web 2 developer to Web 3, and I also understand the value of the experience that Web 2 developers who want to learn Web 3 often have. One of our big goals is to basically hold their hands, help them, teach them, onboard them, and get more Web 2 developers into our space, for sure. It’s such a valuable group of people.
Payne and Juan. What’s your opinion on that? I mean, as you look at the Web 3 landscape, it seems like there’s a lot of people that kind of started in Web 2 and maybe moved to Web 3, how do these tools assist in that process?
I mean, I’m also an ex-Web 2.0 developer that transitioned to Web 3. But I fortunately had the luck, I guess, to go into Web 3 with The Graph already an established product. So my interaction with Web 3 was completely hand in hand with The Graph; I basically started working as a subgraph developer. So my experience with Web 3 was much better than, I guess, Nena’s experience, because I didn’t have to do manual stuff all the time. I didn’t need to bother myself with how each of the chains that I need to interact with works, or how the underlying parts of each chain work. So I’m not one of those few unlucky developers that had to transition when everything was ultra-new; it’s still really early, I mean, and they were building the tools for the next developers, but I had the luck to land on Web 3 with a few of those tools already deployed. So, for sure, I see those tools as something that really allows you to be a really productive developer, because losing time on the caveats of how stuff works on the underlying nodes is never fun, and it doesn’t allow you to focus on the really big issues that you have, which is focusing on your product, on whatever you’re trying to ship. So yeah, it’s a game changer.
So I want to turn our attention to curation now, which was obviously something the whole Graph community was really excited about. Before we do that, any initial learnings after deploying the improvements on Graph Explorer and the Subgraph Studio?
So when the Explorer launched, there were a bunch of new subgraphs coming in, like right after the launch. And the thing that you need to remember is that the protocol is permissionless. So everyone can publish new subgraphs, regardless of whether they are the actual developers of those subgraphs, or they are just a bunch of forks of subgraphs not developed by them. And this is kind of what happened: there were a bunch of subgraphs that were published not by the underlying projects themselves, but by some other entities. The subgraphs actually have the actual data that is needed to index, for example, for Uniswap, but they are not coming from the developers of Uniswap. And the catch here is that if those subgraphs ever fail, there is no way for the Uniswap team to upgrade them, because they are not the owners of the subgraphs in the Explorer. So you can’t actually trust that those subgraphs will be maintained in the future, which is something to keep in mind.
Given that it’s a permissionless protocol, as Payne said, anyone can deploy pretty much any subgraph they want. And the really important thing to keep in mind here is that there’s a specific role in the protocol, which was designed this way, which is the Curator role. As many of you know, people have been waiting to be able to curate. The main idea behind the Curators is to be able to identify which subgraphs are worth indexing for Indexers, and which ones are not worth indexing, for whatever reason. Yeah, the main goal here is that Curators will basically be the gatekeepers of the whole network. They will be the ones to decide, hey, I think this subgraph is really, really well done, the data that it’s serving is really useful, it’s going to be used by many, many, I don’t know, web pages, protocols, whatever you like. And by signaling, they will be saying that formally in the protocol to Indexers, so they can understand, hey, this subgraph is good, this subgraph will be used, there’s real value in indexing this subgraph, go ahead and index it. And the way that they do it is by signaling, because they are showing their commitment to support that subgraph. Of course, given it’s a permissionless protocol, anyone can deploy anything. So a Curator will have the responsibility to understand all the specifics: whether that subgraph is actually doing something, whether anyone will actually be able to use it, whether it’s indexing data that’s useful for them, whether that subgraph will be maintained or not, whether there’s some subgraph developer behind it that will be working actively, full time or part time or whatever, on maintaining that subgraph. There are many different caveats, and different points of view that you can, as a Curator, try to understand about the subgraph that you’re trying to curate.
And that will basically allow you to, if you want, give it a score on whether that subgraph is actually good or not. Maybe the data that it’s serving is super useful, but the developer decided to retire and will never be updating it again. And you know, that’s not perfect. So is it worth signaling on, if it could fail at some point and nobody’s going to be updating it? There are many different aspects that you can look at as a Curator. And the idea is that Curators will be rewarded for finding really, really good subgraphs and for curating early on those subgraphs. So that’s the main goal of the curation role. And yeah, they’re pretty much the gatekeepers of the network.
I want to expand a bit on what Juan said, on the relationship between Curators and Indexers. What essentially happens here is that a Curator deposits tokens into the bonding curve; we call that signal tokens. The subgraphs that have more signal tokens will attract more of the indexing reward split towards them. So out of the 3% annual inflation, this 3% is distributed proportionally to the signal tokens on each of the subgraphs, and then it is distributed again proportionally to each of the allocations of the Indexers on those subgraphs. So if, for some reason, the Curators are not doing their job well, and they are signaling fake subgraphs that are not going to be used, they’re essentially wasting everyone’s resources in the protocol and splitting the rewards in a way that everyone earns less. And it’s very important that when I say everyone, that’s also Curators, that’s also Indexers, and obviously Delegators, because they’re tied already to the Indexers. So it’s very important for everyone to do their job well, and to, like, do their own research when it comes to signaling, and not just, hey, this image looks good, so I’ll throw a bunch of GRT on it.
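The two-step split described above, first pro rata by curator signal across subgraphs, then pro rata by each Indexer's allocation within a subgraph, can be sketched in a few lines of Python. The numbers and names are hypothetical, and this ignores delegation cuts, rebates, and the exact on-chain issuance formula:

```python
def indexing_rewards(total_issuance, signal, allocations):
    """Split indexing rewards: first proportionally to curator signal on
    each subgraph, then proportionally to each Indexer's allocation on
    that subgraph. Simplified sketch, not the on-chain implementation."""
    total_signal = sum(signal.values())
    rewards = {}
    for subgraph, sig in signal.items():
        # This subgraph's share of the total issuance.
        subgraph_reward = total_issuance * sig / total_signal
        total_alloc = sum(allocations[subgraph].values())
        for indexer, alloc in allocations[subgraph].items():
            rewards[indexer] = rewards.get(indexer, 0.0) + \
                subgraph_reward * alloc / total_alloc
    return rewards


# Hypothetical example: 300 GRT of issuance split over two subgraphs.
signal = {"uniswap": 75.0, "other": 25.0}
allocations = {
    "uniswap": {"indexer1": 50.0, "indexer2": 50.0},
    "other": {"indexer1": 100.0},
}
print(indexing_rewards(300.0, signal, allocations))
# indexer1 earns 112.5 + 75 = 187.5; indexer2 earns 112.5
```

This also makes Nena's point concrete: if "other" were a fake subgraph, the 75 GRT routed to it would be issuance that productive subgraphs, and everyone staked on them, no longer receive.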
Yeah, that’s true. And it’s also self-correcting, because the idea is that curation will be a market that regulates itself. If a subgraph is not generating queries and nobody’s using the data it’s serving, eventually Curators will migrate from that subgraph to another subgraph that will be a lot more useful, because nobody wants to support something that doesn’t work. You know, as a Curator, you have the responsibility to signal on stuff that is really, really good. But you can also make mistakes, you know. So if you make a mistake, it’s fine, you can just get out of that subgraph and go find another subgraph that works. Maybe it worked at some point and the developer, like, as I said, retired, maybe he’s, like, turned 80 or something, and he’s no longer working. So you can never know what’s going to happen to something, you know. But yeah, basically, as a Curator, you will be moving and signaling tokens to show that you support the subgraph and you think the subgraph is good. Of course, this comes with the warning that you really need to do your research. Because, for example, if in real life you give your support to someone, you don’t want to be giving your support to someone that you don’t even know. You don’t want to go out on the streets saying, hey, I support that person that I don’t even know. You obviously support the people that you know, or that you at least researched a bit. If you want to think of it as maybe politicians, or football stars, whatever you like, you know, you support the people that you know, that you have trust in. So of course you won’t be, like, supporting whatever comes first. The idea is that the curation market will regulate itself and find good subgraphs, even if at the beginning, of course, it’s semi-chaotic, because everyone’s getting used to the UI, everyone’s getting used to the idea that they now can curate when they couldn’t before.
So of course, we expected it to be a little bit rocky at the beginning, because people are going to be trying it out for the first time, and they are going to be learning a bit. There’s always a learning curve. Even if, I guess, the most technical role in the network would be the Indexers, because they have to run infrastructure and stuff, the Curator is a really, really important role that also requires some technical knowledge. So there’s always a learning curve for everything that is technical. We expect Curators to maybe not be 100% on the first try; they will be learning from their mistakes. We’ve seen many Curators forming groups and forming, like, communities around curation; they are gathering, they are discussing. We have seen, for example, some Curators like Graph God and many others creating Telegram groups to discuss curation, to discuss, hey, we found this subgraph, this is good; we found this subgraph done by this subgraph developer, they are really cool, we’ve been able to discuss with them. And there are many, as I said, many points of view when you’re curating something. Some of them are directly, like, trying to find who created the subgraph, to ask them questions, to better understand what the subgraph does. That is amazing. It’s part of what we envisioned. Well, I say we, but mainly the Edge & Node team really, really envisioned this; I just joined the bandwagon afterwards. But yeah, it’s an amazing product, and it’s an amazing role, and it’s super important. We need to really understand that there’s a learning curve for this, and it seems that people are picking it up really, really fast. We’ve seen, in the very beginning, the first hours, I’d say, of the deployment, there were many new subgraphs coming in. Some of them were good, and some of them were not.
There were also the already migrated partners, like Omen and many others; there were many subgraphs that were already curated because they were already well known. But there were many new subgraphs that could be good, or could be not as good. People, of course, would maybe make mistakes on the first subgraphs. But it seems that the curation community really grouped up, and they are really discussing and learning early. It’s amazing to see this process happening live. So yeah, I’m pretty pumped.
The bonding curve is really important to curation. It seems like early on, a lot of Curators participating in the network maybe didn’t fully understand that. What’s your opinion on that, Juan?
So yeah, basically, the way that the bonding curve works is, the earlier you get into the bonding curve, the more shares of the bonding curve you will get for the same amount of GRT. So you could think of it as basically an incentivized way of getting in early on the curation, of trying to signal early. There’s also the other aspect of it: given that whenever you signal on a subgraph, you basically get into the bonding curve, you will be getting a different amount of shares for the same amount of GRT, and it always gives you more shares when you’re early. The other side effect is, if you’re entering a bonding curve that is already really, really overpopulated, you won’t really be getting many shares. So the idea behind that is that you don’t really want to incentivize people getting into whatever subgraph they want. Because if they all jump into, of course, the eight or ten subgraph migration partners, it’s not really decentralized; everyone will just jump on the ones that have already been, like, certified by a centralized entity, like for example Edge & Node or The Graph Foundation or whatever. The idea is that with the bonding curves, you will be incentivized to look out for new subgraphs and try to support new subgraphs. Because if we didn’t have the bonding curve, everyone would just jump into, I don’t know, PoolTogether, or maybe DODO, or maybe Rai, or whatever, any of the partners. But with the bonding curve, since you are incentivized to get in early enough in the curation of a subgraph, everyone will try to maximize their profits rationally by jumping into new subgraphs, into new, unseen subgraphs, ideally. But of course, you don’t really want to jump into just any subgraph. Because if I jump into whatever subgraph I find, just because it’s early, I could be getting into a subgraph that isn’t really good, and maybe it’s not worth it as an investment, or as signaling towards that subgraph.
So the idea is that, even though you incentivize early signaling, there’s also the idea that Curators really need to research where they are signaling. Because if you get into a bonding curve that is not worth getting into, there’s really no point, there’s really no incentive getting into those. But you also don’t want to jump into something that’s overpopulated by Curators, because it doesn’t really give anything to the network; Indexers will already be indexing that subgraph, so basically nobody’s getting anything out of that specific signaling. So yeah, that’s pretty much how the bonding curve works: get in early, get more shares; if you get in late, you get fewer shares. It tries to incentivize those behaviors.
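As a rough illustration of the "get in early, get more shares" dynamic Juan describes, here is a toy square-root-style bonding curve. The formula (a reserve ratio of 1/2) and the numbers are illustrative assumptions, not the exact on-chain implementation, and curation tax is ignored:

```python
import math


def shares_minted(reserve, total_shares, deposit):
    """Shares minted for a GRT deposit on a square-root-style bonding
    curve (reserve ratio 1/2). Toy model for illustration only; the
    real protocol's formula and parameters may differ."""
    if total_shares == 0:
        # First curator bootstraps the curve.
        return math.sqrt(deposit)
    return total_shares * (math.sqrt(1.0 + deposit / reserve) - 1.0)


# The same 1,000 GRT buys fewer shares the later you arrive:
early = shares_minted(0.0, 0.0, 1000.0)      # ~31.6 shares
late = shares_minted(1000.0, early, 1000.0)  # ~13.1 shares
```

So in this toy model, the first curator gets more than twice the shares for the same deposit, which is exactly the incentive to scout new, uncurated subgraphs rather than pile onto curves that are already full.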
So Nena, what would be your advice then to listeners who have been just kind of watching curation launch and trying to decide if they want to get involved? What would be your advice to them?
That’s a great question. So first, I think we have a really big community on Discord, and Payne and Juan are active there. I think there is a Curators group. We also updated our docs, and there are videos and tutorials that some of our team members are launching. We have a forum and a Telegram channel, so there are many ways to get involved and learn from us, but also from each other. So I would encourage people to take advantage of those resources prior to jumping into things if they don’t know what they’re doing. If you know what you’re doing, you can just go into Graph Explorer, pick your subgraph, and signal; it really is easy. So there are different ways to do your own research.
Nena, what would you tell listeners about the amount of work and effort that you and others that work to launch last week, these important products? What went into all of this?
So there are many aspects to this product. First, starting with our amazing designers, who put a ton of work into thinking through all the UI- and UX-related problems, and the research we all did together to get to some solutions. Then the developers, all of us who were very involved in building it, to Juan and Payne, Payne doing the QA nonstop, then the contracts team. Pretty much everybody in the company was somehow involved in this launch. And it took a lot of brainstorming, building, learning, trying, fixing bugs, learning from mistakes, and many, many hours of work to basically get where we are. And that’s kind of expected when you want to innovate and push the boundaries, which is what we were doing. Oftentimes, the solutions to your problems are not very clear. Or maybe you go down one path, and then when you’re at the end of that path, you realize, well, maybe this isn’t the best bet, and you need to kind of go back and start things from scratch. And this happened several times. For example, like Juan talked about before, we needed to refactor the subgraph, because exactly that: we had gone down one route, and then we realized that we actually needed to do something else.
Yeah, I mean, innovation never comes cheap. You know, you need to really put in the hours, you need to really, really think about it. You might get the wrong answer the first time; you might even fail multiple times until you really manage to create something that’s actually really useful, something that’s actually a game changer. Because let’s not underestimate the impact of this tool. Like, it’s super impressive that you can pretty much work as a Web 2.0 developer and basically interact with anything Web 3. You don’t really need to learn much more as a front-end developer; you can just use the subgraphs that some other people created, and use them as a normal GraphQL endpoint. It’s really, really mind blowing. It’s not something to underestimate, to be honest.
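To make concrete what "use them as a normal GraphQL endpoint" means: a front end fetches indexed data with a plain HTTP POST carrying a GraphQL query, no node RPCs or ABI decoding involved. The endpoint URL and the `tokens` entity below are hypothetical placeholders, not a real subgraph:

```python
import json
import urllib.request

# Hypothetical subgraph endpoint; real URLs and schemas differ.
ENDPOINT = "https://api.thegraph.com/subgraphs/name/example/example"

# An ordinary GraphQL query, exactly as a Web 2 front end would write it.
query = """
{
  tokens(first: 5, orderBy: volume, orderDirection: desc) {
    id
    symbol
  }
}
"""

# GraphQL over HTTP is just a JSON body with a "query" field.
payload = json.dumps({"query": query}).encode("utf-8")
request = urllib.request.Request(
    ENDPOINT,
    data=payload,
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(request).read() would return the indexed
# data as a JSON document, the same as any Web 2 API response.
```

That is the whole integration surface for the consuming developer, which is the point Juan is making about the experience feeling like ordinary Web 2 work.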
Yeah. And for me, personally, I get into this mode where I obsess over problems that I’m unable to solve, or that I don’t think I solved correctly. So sometimes I feel like they wake me up overnight. They show up when I’m taking a walk and thinking about something else, and then I go back and feel like, oh, we need to do this in curation. So that happened many times throughout, especially as we were kind of getting closer to the launch. And what I love about our team, and I really need to give a shout out to our amazing team at The Graph and a big shout out to my team of product engineers, is that everybody really stepped up. Even people who just joined two weeks ago or a month ago, they stepped up, offered to help, dug into the code, basically onboarding themselves, which was really great to see. But it did take a lot of effort, and I feel like all of us at this point are a little bit mentally overwhelmed with everything we put in. But it’s also very rewarding to see all the effort we put in, not just building features, but making them stable, addressing all the use cases, the edge cases. And Payne did such intense testing of every feature that we built. And just, like, kind of going back to the code, and you spend so much time just adding this one more edge case that you need to address and fix. Personally, even though it was at times really hard, I think it was all really worth it. And we’re kind of seeing how things are really stable, and we can all kind of take a deep breath, be proud of ourselves, and reset for the next set of things that we are going to build.
So when you think about the amount of work and time and energy that went into last week’s launch, I want to give each of you a chance to describe your personal motivation. I know you sacrifice time with family and friends. Why did you do it? Why put in all this time and effort?
During these past few years that I’ve been with The Graph, I kind of firsthand observed how the protocol that we created and the products we’ve built so far are changing people’s lives, how they’re impacting users around the world, giving them opportunities that they didn’t have before, how they’re creating kind of an economy of its own that’s taking shape. And it just gives me tingles right now, how powerful it is to be a part of a project like that. And as I’m building things, I always care about the users: who is going to use this, why they’re going to use it, and most importantly, how it’s going to impact their life. And to me, that’s kind of my big why: to impact people’s lives. And I do it through technology, because that’s what my expertise is in. And another big reason for me personally is just this wonderful team we have, and not just the Edge & Node team, but also our contributors, our community, and people who are getting involved with The Graph. It’s just such a great group of really, really smart and really, really nice people. And just the fact that we’re building this together, that we’re basically changing the world in a way together, makes me very motivated to get up every day and jump into new things.
My motivation, I guess: as I said before, I was a Web 2.0 developer, a full-stack developer that did many weird things, like designing a parking meter, for example. So when I joined crypto, or the crypto space, I had the opportunity to join it in a specific state where subgraphs were really a thing, where you could, like, do stuff pretty easily. And I didn’t have to really go through the whole process of doing stuff manually with the nodes, or trying to get the data that you needed in some specific manner that didn’t work for a new version, that broke again two weeks after, whatever. So I had the real opportunity to work in a space that was easy to work in, and I have really seen pretty much only that. I also did try to do stuff manually and so on. But I really want people to be able to work with the same ease that I did, to have a learning curve that isn’t extremely demotivating for anyone to work through. And I want that for everybody else. Because I think that crypto is something that could potentially change the world. And I’m not, like, overestimating it; I really think that it’s something that could drive change in the world. And I think The Graph especially, and the whole technology and everything that it allows you to do, really allows other teams that are working in the crypto space to improve easily, to make their work more accessible, to make everything work easier with everything else. It lets the teams working behind the scenes really focus on whatever they want to build, instead of focusing on all the caveats of how everything works. So I really think that my main motivation is making sure that we are able to actually help crypto change the world.
Ever since I first heard about The Graph, I think very early in its development and inception, I was waiting for the product to be kind of ready, to dive more into it, because I knew from the very beginning that it was going to be something really useful and needed for the whole blockchain space. And then I had the opportunity to dive into the Indexer testnet, basically, last summer. And it really pulled me into the whole ecosystem. And I felt like I had to do something, because the whole community is great. Everyone was so friendly, and I literally made tons of friends by just being in the community. And it was really meaningful to me. I felt like I had to give back to the community by doing everything I can to help the protocol grow and help the community grow. And I hope I did the right thing.
So I want to congratulate all of you and everyone else that had a role in last week’s launch. I know it was a huge deal. I know, a lot of time and effort went into it. And I also recognize that this will have a huge impact on the Web 3 space and on the future. But I’d love to hear your vision for that. What is, in your own minds, the impact of what was accomplished last week, on the future of The Graph?
Yes, well, first, thank you. It’s very gratifying that people are excited about The Graph. But for me personally, this is just the beginning of a long journey. When we get more and more people involved in the protocol, we also get people coming up with different ideas, different things that we can build on, how we can help them. And we just continue building this thing that we created and see how it behaves in real life. And we keep innovating.
As I said before, mainly, everything that we launched last week pretty much allowed anyone to use the whole network. That was the only thing that was missing, because everything was there on-chain back in December, when the mainnet launch happened, but it wasn’t really easy to use. You know, you had to use the contracts; it wasn’t friendly for anyone to go ahead and use it. So I guess the main thing that it means for the future of The Graph is that anyone will be able to experience The Graph, and especially the network. Everyone will be able to really use it and improve their own products by using the network, everyone will be able to curate, everyone will be able to really show support for their favorite subgraphs and their favorite developers, everyone will be able to do pretty much everything that they want with the network. So I guess, in a sense, what last week’s launch means for the future is that it’s just starting, you know, it’s just the beginning. We are just barely starting with this whole idea of changing the world and helping others change the world. So that’s pretty much what I think the future holds. We need to continue building, we need to continue getting new stuff out, getting other people to join, getting people excited about it. Yeah, it’s pretty much just the beginning, you know.
DISCLOSURE: GRTIQ is not affiliated, associated, authorized, endorsed by, or in any other way connected with The Graph, or any of its subsidiaries or affiliates. This material has been prepared for information purposes only, and it is not intended to provide, and should not be relied upon for, tax, legal, financial, or investment advice. The content for this material is developed from sources believed to be providing accurate information. The Graph token holders should do their own research regarding individual Indexers and the risks, including objectives, charges, and expenses, associated with the purchase of GRT or the delegation of GRT.