Rook Q2 Quarterly Report: Engineering Recap


The Rook Labs Product and Protocol teams present an update to the community on Q2 development, the new tech being built, and how the project will continue to improve.

Key Takeaways: Protocol Update

Coordination Protocol [2:30]

  • This mechanism enables keepers (trading bots) to create value flow and return MEV profits back to users.
  • The greenlight algorithm determines which keeper wins the right to facilitate a trade. It balances how much keepers share with users against how much they retain, keeping the system equitable.
  • The burn mechanism ensures that there is always some ROOK that is only bought and never sold, which creates value for the ecosystem.
  • Users can control their own order flow by receiving MEV rebates into their wallets.
  • Users are receiving 80% of bid value via the Coordinator.
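The value flow in these takeaways can be sketched as a small model. Only the 80% user share is stated in the report; the remaining split between the burn, the DAO Treasury, and the staking pool is assumed purely for illustration, and `distribute_bid` is a hypothetical helper, not protocol code.

```python
# Hypothetical split of a keeper's winning ROOK bid. Only the 80% user
# share is stated in the report; the other allocations are assumed
# purely for illustration.
BID_SPLIT = {
    "user": 0.80,      # MEV rebate paid back to the trader's wallet
    "burn": 0.10,      # ROOK bought by keepers and burned (assumed share)
    "treasury": 0.05,  # DAO Treasury (assumed share)
    "stakers": 0.05,   # ROOK staking pool (assumed share)
}

def distribute_bid(bid_rook: float) -> dict:
    """Split a winning bid (denominated in ROOK) across participants."""
    assert abs(sum(BID_SPLIT.values()) - 1.0) < 1e-9
    return {party: bid_rook * share for party, share in BID_SPLIT.items()}

print(distribute_bid(100.0)["user"])  # 80.0 ROOK rebated to the user
```

The point of the sketch is that the user's rebate, the burn, and the treasury flows all come out of the same bid, so tuning one share trades off directly against the others.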

Developing Meta [6:00]

  • Keepers’ bidding, staking, and transaction broadcasting behavior is competitive, meaning they must adapt and adjust to gain an edge.
  • Keepers decide how to broadcast and fill the trade for users (e.g., mempool, flashbots).
  • The team is focusing on easy integrations for other application-layer projects.
  • A new partner rewards mechanism can incentivize anything that delivers Rook order flow, allowing for further MEV capture.
  • This is a feature that plugs into the Coordinator to conveniently enable partner integrations.
  • The new feature helps to cater and support the strategies and requirements of any given partner, and allows them to adjust the reward distribution parameters.
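As a sketch of what per-partner distribution parameters might look like, the snippet below models a partner that keeps the whole bid to redistribute through its own tokenomics (one of the use cases discussed in the call). All class and field names are hypothetical.

```python
# Hypothetical per-partner reward configuration. The report only states
# that distribution parameters are adjustable per partner; all names and
# the example split below are assumptions.
from dataclasses import dataclass

@dataclass
class PartnerRewardConfig:
    partner_id: str
    user_share: float     # portion of the bid rebated to the end user
    partner_share: float  # portion routed to the integrating partner

    def split_bid(self, bid: float) -> tuple:
        """Return (user, partner, remainder) amounts for a winning bid."""
        user = bid * self.user_share
        partner = bid * self.partner_share
        return user, partner, bid - user - partner

# A partner protocol that keeps the whole bid to redistribute via its
# own tokenomics:
keep_all = PartnerRewardConfig("protocol-x", user_share=0.0, partner_share=1.0)
print(keep_all.split_bid(100.0))  # (0.0, 100.0, 0.0)
```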

Protocol Performance [12:30]

  • $300M in trading volume since the Coordination Protocol launch has created deflationary pressure.
  • A baseline of 0.1% of volume is extracted as MEV.
  • Revenue scales directly with volume through the Coordinator.
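These figures combine into a simple back-of-the-envelope estimate. The function below is illustrative only, using the report's roughly $300M volume and the 0.1% baseline income ratio.

```python
# Back-of-the-envelope check of the figures above: applying the 0.1%
# baseline income ratio to roughly $300M of volume since launch.
def mev_captured(volume_usd: float, income_ratio: float = 0.001) -> float:
    """Estimate MEV captured (USD) for a given trading volume."""
    return volume_usd * income_ratio

volume = 300_000_000  # ~$300M since the Coordination Protocol launch
print(round(mev_captured(volume)))  # roughly $300k at the 0.1% baseline
```

Because the ratio has held steady as volume grew, revenue in this model is simply linear in volume, which is what the last bullet asserts.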

Trading Protocol [16:00]

  • The protocol upgrade will maximize gas efficiency while maintaining gasless trading.
  • This will reduce trade costs and increase capital efficiency.
  • Keepers will no longer need to flash loan or hold their own assets; the user's own funds serve as the flash loan. Keepers are rewarded based on their execution price.
  • This upgrade allows for competition with DEX aggregators.
  • New order features are on the way to make trading convenient (to be announced).
  • Transaction relay will be a plug-and-play RPC for wallets and backends to integrate. It is easy and scalable as a means of capturing nascent MEV.
  • The GMX transaction relay is already online.
  • The next step is to integrate relay with the Coordinator.
  • Using a generic Ethereum-based transaction relay, transactions from outside the Rook ecosystem can be funneled through Rook, which allows for increased MEV capture.
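The flash-loan-style settlement described above can be sketched as follows. This is a simplified Python model of the mechanism, not the actual smart-contract code; the names and the exact check are assumptions.

```python
# Simplified model of the flash-loan-style settlement described above.
# Names and the exact check are assumptions; this is not contract code.
class TradeReverted(Exception):
    """Raised when settlement fails, undoing the whole trade."""

def settle_trade(user_amount_in: float, execution_price: float,
                 keeper_fill) -> float:
    """Hand the user's tokens to the keeper, let it fill the trade, then
    revert everything unless the committed execution price was honored."""
    # 1. The user's input tokens are transferred to the keeper inside the
    #    trade -- the user effectively *is* the flash loan.
    # 2. The keeper executes the swap however it likes.
    amount_out = keeper_fill(user_amount_in)
    # 3. Post-execution check: if the output does not meet the price the
    #    keeper committed to, the whole trade reverts.
    if amount_out < user_amount_in * execution_price:
        raise TradeReverted("execution price not honored")
    return amount_out

# A well-behaved keeper filling 10 tokens at a committed price of 1500:
print(settle_trade(10.0, 1500.0, lambda amt: amt * 1500.0))  # 15000.0
```

The safety property is entirely in step 3: either the keeper's committed price is met atomically, or nothing happens at all.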

Ninja [27:00]

  • Ninjas are divided into "Block Ninja" and "Event Ninja."
  • Block Ninja waits for a block to be mined to discover arbitrage opportunities to trade on.
  • Event Ninja processes data as it hits the order book, and moves to trade on those opportunities.
  • Event Ninja is filling a high percentage of orders and volume per day, and has been increasing coverage.
  • Challenging other keepers to match performance helps encourage keeper competition.
  • DEX aggregator Ninja is coming soon, which will further increase coverage.
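The split between the two Ninja styles can be sketched as two small loops. Every name here is hypothetical, and the real keepers are far more involved.

```python
# Illustrative sketch of the two keeper styles; all names hypothetical.
def block_ninja(blocks, find_arbs, execute):
    """For each newly mined block, fetch fresh state and scan it
    for arbitrage opportunities."""
    for block in blocks:                  # in practice: wait for new blocks
        for opportunity in find_arbs(block):
            execute(opportunity)

def event_ninja(orders, is_arbitrageable, execute):
    """Decide on each order the moment it hits the order book,
    trading a full-state scan for per-order responsiveness."""
    for order in orders:                  # in practice: an event stream
        if is_arbitrageable(order):
            execute(order)

filled = []
event_ninja(orders=[{"id": 1, "edge": 0.5}, {"id": 2, "edge": -0.1}],
            is_arbitrageable=lambda o: o["edge"] > 0,
            execute=lambda o: filled.append(o["id"]))
print(filled)  # [1]
```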

Token Listings [30:00]

  • A new streamlined token whitelisting process is in place.
  • Once DEX aggregator Ninja is released, Rook will be able to support any ERC-20 token that the integrated DEX aggregators support.

Questions [34:18]

  • The protocol upgrade will treat limit and market orders the same, addressing the gap between retail users who desire quick swaps and whales who patiently perform limit orders.
  • Trades made using either the Rook trading app or DEXs that have integrated with the transaction relay will still allow for MEV capture. Any profit made through facilitating or arbitraging these trades will be delivered back to the users.

Key Takeaways: Product Updates

Trading App [41:00]

  • The Rook trading app was built during Q1 and Q2, and launched in April.
  • The app was rebuilt from the ground up with all original code.
  • Rook also built and maintains a server that allows for persistent data storage, primarily for the Coordination Game.
  • Moving forward, the app team will work on additional ways to interface with the trading protocol.
  • Future partnerships are in the pipeline. No names can be released publicly before third-party deals are confirmed.

Analytics [50:44]

  • The Rook database allows the team to view orders in the HidingBook and analyze orders to help monitor the protocol.
  • The database also allows for insight into the types of trades that are being made, including sizes and trading pairs. It can also help flag orders that are difficult to fill.

DevOps Improvements [55:57]

  • Containerization will allow for quicker deployment and an increase in CI/CD pipeline speed.
  • Logging and observability capabilities have been greatly improved.
  • These improvements enable more efficient full stack monitoring and quicker response times.
  • With the implementation of Rook's own database, there is no longer a constant need for Rook's frontend to make web3 calls. Instead, the backend may be queried.
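That last point can be illustrated with a before/after sketch. The `call_contract` helper, endpoint path, and field names are all hypothetical; the point is only that chain reads move from per-client web3 calls to a single backend query.

```python
# Before/after sketch of the frontend data path. The `call_contract`
# helper, endpoint path, and field names are all hypothetical.
def get_order_status_via_web3(w3_client, order_hash: str) -> str:
    # Old path: every page load reads the chain through a web3 provider.
    return w3_client.call_contract("getOrderStatus", order_hash)

def get_order_status_via_backend(http_get, order_hash: str) -> str:
    # New path: one HTTP call to Rook's backend, served from the database.
    return http_get(f"/api/orders/{order_hash}")["status"]

# Stubbed backend response, just to show the shape of the call:
fake_backend = lambda path: {"status": "filled"}
print(get_order_status_via_backend(fake_backend, "0xabc"))  # filled
```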


[00:00:00.490] - Deetz
All right. Well, looking forward to getting started here with our third event of the end-of-quarter series. So this event will be focused on our quarterly report, and specifically the engineering portion of it. So I'm really excited to do this. We've got a bunch of our contributors on hand here to walk us through some of the developments from the last three months.

[00:00:27.830] - Deetz
Just a little bit of housekeeping before we get started. So as you can see, we're in a new channel that has a little bit different permissions than we typically do with our stages. So if you do have a question for folks, be sure to tag me in the voice chat text channel and I can unmute you for a question when we're ready for taking questions throughout each section. But with that, we'll be running through pretty quickly today. As you guys can see, I'm screen-sharing the report simultaneously as we run through it, but I believe each of our speakers are going to be keeping it pretty high level as we talk through each of these sections. But as we go through, we'll have room for questions and obviously have some time at the end to kind of talk about any of the bigger engineering questions that we might have.

[00:01:26.180] - Deetz
But with that, I'm going to ask JZ to kick us off here and give us a little bit of an intro into the last three months on the engineering side.

[00:01:36.710] - JZ
Welcome, everybody. Thanks for joining the call. Thank you, Deetz. All right. So we're going to dive into the engineering side of our quarterly update. We've had a very busy quarter. A very exciting quarter. Our team has grown so much across the entire tech stack. We have a lot of new folks over the past three, four months, and many of them have stepped up into important roles and made significant contributions in just one quarter. And it's fantastic to see. I love being part of this team. You really can only just kind of scratch the surface with the quarterly report, and then in an hour-long call, you can only scratch the surface of the report itself. We'll do our best to add color as best we can here in this short call.

[00:02:25.010] - JZ
So really the big exciting thing is that we launched our coordination protocol and our flagship trading app where users can conveniently trade and own their own order flow by receiving MEV rebates right back into their wallets. And that's an exciting feature and there's a lot of engineering that goes into that, so we're going to touch on that here.

[00:02:50.190] - JZ
So let's dive into the first section here and I'll review some things you guys may have heard before, and some of this may be new to you depending on how much you follow. So the Coordinator is the mechanism that enables keepers, which are trading bots, to efficiently and profitably extract value and give it back to the user, right? So without the Coordinator, you can't do that. Without the Coordinator, you have basically a bunch of bots fighting over this value. They're fighting over trade execution and they're fighting over value. And it makes for a really inefficient system. The Coordinator makes all of this possible and all of this efficient. And so one of the key features of the Coordinator is the green light algorithm. And what the green light algorithm does is it basically determines who is going to obtain the rights to facilitate trade for a user. And this doesn't have to be a trade. Like this could be any other thing as well. This could be something coming through some other product or protocol as well. But for this example, we can just focus on just simple trades, right? And we've been constantly tuning this algorithm as we learn more and as things start to evolve, which is really cool because there's a lot of changes that may need to be made over time, right?

[00:04:13.880] - JZ
Because if you have an ecosystem -- you have users that are executing trades and you have keepers that are executing that on behalf of the user and facilitating that trade -- you're going to have value flowing through this ecosystem, and you may have users getting a ton of rewards. And so then you may need to adjust the algorithm so that keepers can make more money, so that the keepers aren't going broke. And then at the same time, if keepers are making too much money and users aren't getting enough of their reward, you can tweak the algorithm to make sure that users are happy and keepers aren't being too greedy. And then there's a lot of economic factors in this as well. So our token economics graphics are fantastic. They're so cool to look at and it really helps people understand what this does. It's hard to visualize this without data and without charts and graphs. So this graphic does a great job of that. And so it kind of shows the flow of value through the ecosystem. You have basically opportunities that arise as a result of users wanting to trade, right? So it all starts with users or market makers -- anybody who comes to the table and they say: hey, I want to trade. There are opportunities on the blockchain to extract value from that.

[00:05:33.720] - JZ
And that value flows through the ecosystem and it creates value for the users. It creates value for the keepers. And then we even have a burn mechanism in there which basically adds some value to the ecosystem as well, because if users have to buy ROOK to participate -- I'm sorry, if keepers have to buy ROOK to participate -- the burn is making it so that there's always ROOK that will be bought but never sold. And so that adds a lot of value to the ecosystem. And then we even have the DAO Treasury looped in, the Rook staking pool looped in as well to add a lot of value there. And really it's an exciting ecosystem. It's greatly benefiting the user. Users are getting 80% of the value that's bid through the Coordinator. You won't find that anywhere else. That is, that beats everything that's top notch. Even other protocols, if they wanted to try to compete with that, it would be very difficult based on some of the other factors playing in. There's a lot of great mechanisms in here. And then there's also an interesting meta that we're seeing develop. And I didn't mention this in the report because this is still something we're kind of learning about.

[00:06:42.240] - JZ
So maybe this will come in the next quarterly report. But there's this keeper meta that's developing within the ecosystem, and I definitely expect this to change over time. But basically keepers' bidding behavior, their staking behavior, their transaction broadcasting behavior is... think of it as like a meta in a game where people are kind of countering each other and trying to gain an edge here, trying to gain an edge there. And then it kind of finds a balance, but then something changes and it shifts that balance, and they adapt and adjust. And some keepers are staking more ROOK than others to get an advantage, and other keepers are staking less ROOK than others because they feel that that gives them maybe less token risk, right? So there's a lot of different factors in there. And then there's this meta where people are deciding. Keepers are deciding. How do they want to broadcast this trade and fill the trade for the user? Do you want to use mempool, or do you want to use something like a Flashbots where it gives you some advantages? And there's pros and cons both ways. And we're even working on some strategy on to intelligently decide. Like how do you decide that?

[00:07:47.500] - JZ
It's really cool, and we're basically helping keepers and working with keepers. We spend a lot of time on this where we discover something, we learn something, and then we talk to keepers about it and relay this information -- share it, get their opinion on it, ask them what are they seeing in the meta, making sure are keepers making enough money, are they bidding too high, are they bidding too low? That sort of thing. So it's really cool.

[00:08:11.070] - JZ
Another big feature that we're working on... and we've basically planned this since last year. So since the Hiding Game, we planned a mechanism for partner rewards, right? So if we have a partner who wants to integrate with us -- this could be a trading app, this could be a wallet, this could be some aggregator, this could be any sort of a partner, some other project -- anything that somehow in some way delivers us order flow. We want to incentivize that and basically reward them because they're participating in this MEV capture as well, right? Because they built the product for the user whose user then generated MEV for themselves. So we're basically building a feature to plug right into the Coordinator to enable this in the most convenient way possible.

[00:09:07.490] - JZ
And it's a really cool feature. I'll hand it off to Pai-Sho to talk more about this.

[00:09:14.670] - Pai-Sho
Yeah, so this is a really exciting feature because it basically lets us adjust what that pie chart looks like in the graphic above for each individual partner or integration that we have. So there's a lot of different potential use cases for this. For example, there might be some partner protocol that wants to keep all of the bid as a protocol and then distribute it to their users in some way that makes sense within the context of their own tokenomics. Or maybe they have their own stakers or something like that. So we can support arbitrary bid distribution parameters on a per-partner basis. And this also... we're expanding right now to include a referral program where we'll be able to sort of use the same functionality to have like a user referral program, partner referral program, and adjust the bid distribution that way to incentivize the sort of viral marketing that you get from these referral programs. So, yeah, it's a really exciting feature. We're stoked to ship it.

[00:10:24.090] - Deetz
So why don't we pass it back to Joey here. Let's talk a little bit about protocol performance.

[00:10:32.130] - JZ
Excellent. Thank you. So protocol performance can be measured in a variety of different ways. So you can look at volume that's going through the ecosystem and then you can also look at bids that are going through the ecosystem. Right? So if you look at trading volume, you're talking about the quantity of trades users are putting through, but that doesn't really factor in MEV, right? MEV can be compared when looking at the ROOK that is bid through the ecosystem, and in a lot of ways they're related. And our team has actually been working on some tools to not only simulate different scenarios, but also just measure the current data that we have. Because we've accumulated data for the last several months now, and it's been very helpful to look at this data and try to understand the ratio of MEV to volume. Maybe I'll hand it back to either Pai or Zubair or whoever wants to take this one to talk about their model that they made. Yeah.

[00:11:36.790] - Pai-Sho
So first I just want to highlight... The coordination protocol has been live for just a hair over two months now. We've gotten $300 million in trading volume. This is an important figure because if you remember our old Hiding Game, we had about something like $600 million in all-time volume in the 14 months or so that it was live. And that volume was mostly the result of the inflation of the ROOK token. Whereas this $300 million of volume we've gotten in the last two months is purely natural trading volume that's actually resulted in deflationary pressure on the token. So I think that's really important to highlight. But I'll hand it off to Zubair to talk about the model we've made here.

[00:12:28.090] - Zubair
Yes. Thanks Pai. This is actually pretty interesting because we were looking at the ROOK bids for several trades that go through the ecosystem, and sort of came up with a ballpark figure of how much MEV we capture for the volume that flows into the ecosystem. So with the last two months of data -- two and a half months of data since the launch -- what you see on the chart, the lower chart, is the accumulated volume going through the ecosystem.

[00:13:01.510] - Zubair
So by the end of June, we have $270-280 million USD through the various orders filled, and we also looked at the amount of ROOK that was bid for filling those transactions. So the chart above is the ratio between the two. So it simply describes the income ratio, which is the money that gets into the ecosystem because of the trades being filled. So it stays pretty consistent around 0.1%, which is a little higher than the ballpark figure we came up with. And one interesting aspect about this ratio is that this is just the baseline figure, because as Pai and JZ mentioned earlier, with the partner rewards program, we can tune it to generate more revenue if some partners want to customize the model for them.

[00:14:03.680] - Zubair
So this is really like the bottom line and we are hoping to do better than that. But if it stays around 0.1% going forward, it's very consistent.

[00:14:15.670] - Pai-Sho
Yeah, that's a really exciting chart there on the top, because it shows that the ratio of MEV captured to volume stays very consistent even as volume grows. So that means our revenue basically scales directly with the amount of volume we get through the Coordinator.

[00:14:35.590] - Zubair
Yeah. And this includes small trades, the whale trades, trades of various sizes. It's just pretty exciting to see that we have a very consistent ratio -- the income ratio.

[00:14:49.870] - Deetz
Absolutely awesome, guys. Joey, why don't you pop back in here and talk about what's next for coordination and the protocol team.

[00:14:59.170] - JZ
Yeah, absolutely. So the biggest thing that's next for coordination is our protocol upgrade. And this has been something that's been in the works since last year, but it is finally at the forefront of maximum priority. So when we launched the Hiding Game last year, we started kicking things off using the 0x v4 RFQ protocol. And while that one is battle-tested and executes swaps -- it works just fine -- it is not tailor-made for exactly what we're doing. And so along the way, we discovered some inefficiencies that we knew existed, but which unfortunately have been preventing us from making the most optimal and gas-efficient trading protocol possible. So starting last year, I've basically been designing a new protocol. And then it was just a matter of, like, when do we take the time, hunker down and finish it -- knock it out and push it across the finish line? So we decided to launch the coordination protocol first, which is what's happened. And now we're getting ready to push the new protocol upgrade across the finish line. This is really exciting because once we get this in place, we're going to see a massive upgrade, right?

[00:16:26.880] - JZ
So this is going to maximize gas efficiency still while maintaining gasless trades. So users can trade without paying gas, but now their trades will execute sooner. They will generate more MEV. So trade costs are reduced. Keepers will now make more money because they can trade sooner. Their profit margin gets better -- or if their profit margin stays the same, they can just trade now, when previously they couldn't have. And now keepers won't have to flash loan. Keepers won't even have to have their own assets, because basically we're using the user as the flash loan in this case. And it's totally secure because at the start of the trade, inside the smart contract, the user transfers the token to the keeper. The keeper executes the trade in a flash loan style, which is safe and secure. And then at the end of execution, it checks that the keeper has been paid based on their execution price. So the keeper gets to decide what price they want to have their swap executed as. And if somebody tries to rip them off, or if there was a bug and it didn't fully execute and the keeper didn't get paid, it will revert.

[00:17:37.880] - JZ
So it's completely safe, operates just like any other flash loan would, and it has so many benefits. It's going to be above and beyond amazing. And this is going to allow us to have more competitive pricing. It will allow us to now heavily compete with DEX aggregators. And then what's even cooler -- which we'll talk about a bit later -- is we're going to also add DEX aggregators into the mix as well. But when we turn on this protocol upgrade, from the user's perspective not much changes, right? You'll have to approve allowance on a new protocol, which is just one simple allowance approval. And then you can swap just as you normally would. You're just signing an order just as you normally would with some specifications.

[00:18:24.410] - JZ
We are going to add some new order features. Not going to get into the details on this call, but the new order features are really cool and they're going to make trading fun and convenient and really efficient for everybody. And the next big thing that we're working on right now -- have been for quite a while -- is the transaction relay. So you guys have probably heard a good bit about the GMX transaction relay, which is already online, and we have one keeper working with that and that's really cool.

[00:18:55.790] - JZ
The next step for the GMX transaction relay is really to just kind of gain more traction, and gain more volume, and gain more order flow. And then assuming that kind of hits our internal threshold, then we're going to integrate that with the Coordinator, right? Because once that becomes a higher demand and there's a lot of order flow coming through there and more keepers want to integrate with that, then we hook it up to the Coordinator and now essentially allow that to show up on the dashboard. We'll have multiple keepers bidding on it. For today, it's kind of tied into one specific keeper until we kind of let that one take off.

[00:19:29.080] - JZ
But really the big one that we're working on behind the scenes is our generic Ethereum-based transaction relay. And this is absolutely huge because this enables us to take say, any transaction on Ethereum that anybody would do, even if it's outside of the Rook ecosystem. Actually, especially if it's outside the Rook ecosystem and it allows us to bring that within our ecosystem, right? So if somebody's going to Uniswap's app directly and they're just purely just trading on Uniswap and they're not touching our app, if we could get them to funnel that transaction through our relay, then what it does is it brings their transaction into our whitelisted keeper realm and it allows our keepers to extract MEV for them.

[00:20:17.040] - JZ
And it's really exciting because now this user who wasn't even a part of our ecosystem before is now benefiting from our ecosystem. And I'll hand this off to Gman to tell you guys more about the transaction relay.

[00:20:34.430] - Gman
Is my audio working?

[00:20:35.970] - JZ
Yes, it is. Loud and clear.

[00:20:41.850] - Gman
This is probably the most exciting thing that I'm currently working on. I think that this is the thing that will make it really easy for people to use Rook in all aspects of Ethereum. Before, you would have to use Rook only if you're doing a regular swap. But with the transaction relay, it's really easy for people to just get onboarded into the coordination engine because you're just going to plug and play the RPC in your wallet or your back end system. If you are a partner and you don't want to change your smart contracts in order to integrate, it's just easy. And it's going to really make this entire system scalable in capturing MEV, and getting all of Ethereum to just coordinate together, and kind of remove the invisible tax that happens when you use Ethereum. Just by using the [inaudible 00:21:52] to capture all the MEV in the transaction that you're putting out. Whereas before you would constantly be losing your value to the ecosystem from bots that are just skimming some of the money.

[00:22:22.890] - Gman
I don't want to go into too much detail, because we're still working on it and we have partnerships coming up that are wanting to test it out. But overall, I think Joey already went through a lot of the details here. So yeah, I'm super excited to be working on it, and keep your eyes peeled.
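The "plug and play the RPC" idea Gman describes amounts to swapping the JSON-RPC endpoint an app already talks to. A minimal sketch, with both URLs hypothetical:

```python
# Sketch of why a relay RPC is "plug and play": wallets and backends
# already speak JSON-RPC to some endpoint, so adopting the relay is
# just swapping the URL. Both URLs below are hypothetical.
DEFAULT_RPC = "https://mainnet.example-node.invalid"
RELAY_RPC = "https://relay.example-rook.invalid"

def make_rpc_request(method: str, params: list, url: str = RELAY_RPC) -> dict:
    """Build a standard JSON-RPC 2.0 payload; switching from a public
    node to the relay changes only the URL it is sent to."""
    return {
        "url": url,
        "payload": {"jsonrpc": "2.0", "id": 1,
                    "method": method, "params": params},
    }

# The application code is otherwise unchanged:
req = make_rpc_request("eth_sendRawTransaction", ["0xsigned-tx-bytes"])
print(req["url"])  # the transaction now routes through the relay
```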

[00:22:44.910] - Deetz
Awesome. Thanks, Gman. Appreciate it. Why don't we toss it back over to Joey for Rook for smart contracts?

[00:22:53.490] - JZ
Excellent. Thank you. Yeah, one thing I did forget to mention is that Gman is leading our engineering on the transaction relay. So that's really exciting. Another new contributor that I had mentioned earlier -- he's really taking the reins on things.

[00:23:12.430] - Gman
What's really cool: before coming in on this, I was just a community member that was kind of thinking about these things. And just by being in the Discord and kind of talking about these issues, it's kind of one of the first things that came up to make things easy, as in: if I was a user, how would I want to use Rook? And it's really cool to be able to join the team and immediately start working on something so cool. We really are just a bunch of community members that are really changing DeFi. I'll let you go.

[00:23:54.790] - JZ
Thanks, Gman. Excellent. Yeah, that's a common theme among our team. That's a great way to look at it. Excellent.

[00:24:02.740] - JZ
So another thing that we're working on is basically really deep integration at the smart contract level with some other folks. And the vision here is this. So let's say you're a protocol developer, and you're building a thing in DeFi. How could you directly integrate with the Rook protocol and gain access to its keeper network, right? So that's what we're focused on. So we have a variety of partners, including B.Protocol, mStable, Gamma, a number of others. And we're basically working on the most streamlined way to work with these folks so that our ecosystems can work together. And their biggest benefit is that they get the benefits of things like our keeper network -- network of bots that are going to do things for them -- and also our partner rewards, so they can essentially share in that MEV for their users. For their app. For whatever they're doing. That sort of thing. And then so for us, really, it's been a matter of: how do we most efficiently integrate with these guys? Because in the perfect world, if you can just envision, say, let's say, the next big bad awesome protocol comes out... We want every new protocol to look at Rook and say: we got to integrate with that. We need to design our protocol with Rook in mind. And that's the goal. That's really the end goal here.

[00:25:31.200] - JZ
So we're focused on integrations on the application layer, on the protocol layer -- from DeFi application protocol layer, not necessarily blockchain protocol at this point in time. And there's a lot of good use cases that we have that we're working on, and they're definitely still a work in progress. But the good news is we have designed basically an interface -- a way for folks to access our whitelist on chain. And it's cool because it's upgradable, it's future-proof, and then they can still take advantage of things like the Coordinator partner rewards, while very, very simply -- with a very simple mechanism -- interface into our keeper network and basically just say: hey, we're going to allow Rook keepers to do this thing and no one else, and that way they can essentially lean on our whitelist. It's a really cool feature, and this will be something that is going to be an ongoing evolution. So look for more updates on this on a partner-by-partner basis as the year goes on, and then hopefully forever going forward as well.

[00:26:39.010] - Deetz
Really awesome. So something that I've seen a ton of work on, and really actually have seen kind of the fruits of it come with Ninja's better participation throughout the coordination is some of these upgrades that have been done. So we'd love to bring in Zubair and Mia to talk about some of the Ninja upgrades and refactors that have been completed.

[00:27:06.370] - Zubair
All right. Yeah. So we have been thinking about improving Ninja's arbitrage algorithms and the execution of those algorithms. So there's been some work done to address some technical debt and make scaling improvements in how Ninja processes data, and how quickly it can make decisions on whether an order is arbitrageable or not. So the main thing has been splitting one Ninja into two. We have a Block Ninja and an Event Ninja now. So the Block Ninja waits for a block to get mined and then fetches all the new data -- discovers arbitrage opportunities and trades on them. Event Ninja actually processes the order as soon as it hits the order book. So that's pretty cool. It can quickly make a decision on a single order and go forth with a trade if the opportunity's available. So recently Ninja has been filling a lot more... a bigger percentage of orders per day that flow through the ecosystem. And not only is it filling a higher percentage of orders, it's also filling a higher percentage of volume per day. That's been pretty exciting to see. This not only increases the coverage for the number of orders we fill, but it also kind of challenges other keepers to match the performance.

[00:28:35.960] - Zubair
So if any of the keepers outperforms the others significantly, other keepers just have to [inaudible 00:28:41] their deposit and the arbitrage algorithms to keep up with it. So overall it's just great for the protocol in general. So even though we made these improvements, we didn't compromise the speed and efficiency. So Ninja actually discovers more arbitrage opportunities, but also it has a smaller memory footprint. So that's been pretty good. It's important for scaling. As more and more volume goes through, the keepers may have to process a lot more in a given amount of time, with less [inaudible 00:29:18]. JZ also already mentioned DEX aggregator Ninja. So in addition to the Event- and Block-driven Ninjas, we will also have Ninjas that just route trades through different DEX aggregators for... that our keeper network doesn't support. So they also improve coverage of the trades that get bid through the ecosystem. Other than that, there's been a bunch of other developments as well. Let Mia or Joey take over.

[00:29:57.390] - Deetz
Did you have anything else that you wanted to add about the Ninja work happening?

[00:30:03.210] - Mia
I think he did a great job. I just want to say that we are working on several fronts -- updates to the contract and several things in the back end.

[00:30:14.870] - Mia
And it's pretty cool to see all this happening. But I think he covered everything well.

[00:30:23.850] - Deetz
Awesome. Fantastic. Well, Joey, why don't we have you take us home on the protocol side and talk about just some of the other things that have been going on as well, and then we'll pause and open up for some questions. So Joey, go ahead.

[00:30:36.660] - JZ
Sure. Excellent. Great job, guys. Yeah. So just to touch on a few other developments: there are a lot of things that we did not fit into this quarterly report. Like, a lot. Some of them are big, but you just can't fit everything because there's too much, and some of it is super specific to some random feature within an upgrade. But the big ones that I think the community would resonate with are listed here.

[00:31:07.680] - JZ
As far as other developments: we've talked about token listing -- like, oh, why do you support this token and not that token? Or: hey, can we get this token added to the list? In the past there'd be lengthy delays before we finally got it listed. Well, we now have a streamlined token whitelisting process, which is really cool. That process is much faster and much easier. And then to go one step further: soon, once we launch our DEX aggregator Ninjas, we'll be able to support swapping, in theory, any ERC-20 token. This is because if any of our keepers don't specifically support it, or if a keeper requires a reboot to add the new token and just hasn't been rebooted yet, well, guess what?

[00:31:49.850] - JZ
Any DEX aggregator Ninja that is trading through 1inch or Matcha or anything like that -- as long as 1inch or Matcha supports it, we will support it. And what's cool is the DEX aggregators have done a good job of dynamically adding new tokens, so that support will be very good. So basically, if the token is supported on a DEX aggregator, we will support it. And that will be out later this year once our DEX aggregator images are up.
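The fallback logic JZ describes (keeper support first, with DEX aggregator routing as a catch-all) might look something like this; the token lists and function names are made up for illustration:

```python
def token_supported(token, keeper_tokens, aggregator_tokens):
    """A token is tradable if any keeper supports it directly, or,
    failing that, if any DEX aggregator Ninja can route it."""
    if any(token in supported for supported in keeper_tokens.values()):
        return "keeper"
    if any(token in supported for supported in aggregator_tokens.values()):
        return "aggregator"
    return None

keepers = {"ninja": {"WETH", "DAI"}}
aggs = {"1inch": {"WETH", "DAI", "SHIB"}, "matcha": {"WETH"}}
print(token_supported("DAI", keepers, aggs))   # filled by a keeper directly
print(token_supported("SHIB", keepers, aggs))  # routed via an aggregator
```

Because the aggregators add tokens dynamically, the second branch keeps coverage broad even when no keeper has been rebooted to support a new token yet.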

[00:32:14.660] - JZ
So we have an awesome internal tool called the HidingBook viewer which allows us to visualize everything that's going on. Fantastic tool. It is amazing. We may consider releasing a version of it to the public at some point in the future, but for right now it's for our own internal use. We've also made a lot of node infrastructure upgrades, one of them specifically being adding MEV-Geth to our network of nodes, which helps us simulate transactions faster. We've had a big focus on DevOps and infrastructure scaling, which we've talked a little bit about, and we're also working on refining our coding standards, alongside that DevOps and infrastructure work.

[00:33:00.100] - JZ
This is big when you have a growing team, and we have a growing team, so we're really taking a good pass at it, making sure our team can grow and scale together. Let's see. I already mentioned the contract integration interface, and we've also talked about GMX. So that's it. And obviously I just wanted to give one last shout-out to our team. We've had a lot of team growth on the protocol side. Mia, Gman, Holger, Zubair -- it's been fantastic having you guys join this quarter. Really excited for the future with you. And next up, we'll do the product engineering side of things. So, Deetz, do you want to give them an introduction?

[00:33:42.300] - Deetz
Yeah. First, we've been at it for about 30 minutes here, so if anyone's got any questions for the protocol side, ping me in voice chat and I'll unmute you, and we can go ahead and ask some questions. I'll give ten seconds for anyone to quickly tag me if they want to ask a question. If not, we'll hand it off to Pangolin and keep moving through the report. Frogmonkee -- got you. Let me get you, bud. All right, go ahead.

[00:34:14.660] - frogmonkee
Can you guys hear me?

[00:34:15.260] - Deetz
Yes, we can.

[00:34:17.050] - frogmonkee
Cool. This may have already been covered, or might be a bit of a beginner question, but my understanding of part of the adoption question with Rook is... The way that Rook works is using limit orders for now, right? Which take some time to get filled. That works well for whales that have the patience or the desire to get the best possible price, but it doesn't work for the long tail of retail users that just want to quickly swap one token for another using an AMM or something like that. I think I heard JZ talk about a swap product that might be in the pipeline, but... I guess, can you speak more to applying the Coordination Game to swaps and capturing that long tail of user adoption?

[00:35:24.350] - JZ
Yes, great question, and I think I glossed over this a bit too much, trying to cram too much into the call. This is really important, so I'm glad you asked. Regarding our trading protocol upgrade: it is specifically geared towards this exact scenario, where somebody wants to come in and make an instant swap -- a market-order type of trade. What's cool about the protocol upgrade is that it's going to treat limit orders and market orders exactly the same at the protocol level. But the behavior, and how the UI sets things up for you, determines whether this is essentially a limit order -- one we expect to sit on the books for a while -- or a market order, one we expect to execute very quickly. This is absolutely our focus with the protocol upgrade, and we are going to make sure the UI/UX is good. And the goal is: when you think about MEV extraction with limit orders versus with a market order, the goal with a market order is to have it trade as close to breakeven as possible.

[00:36:38.130] - JZ
There are weird times where things shift. Maybe you took a while to sign it and the market moved a little bit, and there might be something going on there, right? So we're going to basically tailor the UI/UX such that it gives the user a great experience. And if there is any MEV extractable, it will be given back to you as a ROOK rebate. But for the most part -- ideally, in a perfect world -- there would be no rebate, and you'll be getting the maximized output of the trade as well. You should be able to do cool things like: hey, let me find the market order price and just swap immediately, and it'll be a gasless swap. Or maybe you get a little creative and say: hey, what's the market order price? I'm going to be a little greedy and back off a little bit. Maybe it'll take an hour, maybe it'll take a day, but hopefully it'll get through. But yeah, like you said, the large majority of people want to swap immediately, and that's what this protocol upgrade is designed to handle. That's the majority of users. That's their desire, that's what they want, and that's what we're delivering with this next upgrade.
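One way to picture "limit and market orders are the same at the protocol level" is that only the parameters the UI fills in differ. A minimal sketch, with invented field names and parameter values:

```python
import time

def make_order(sell, buy, limit_price, expiry_seconds):
    """At the protocol level there is only one order type; the UI
    chooses the parameters (illustrative field names, not Rook's)."""
    return {"sell": sell, "buy": buy, "limit_price": limit_price,
            "expiry": time.time() + expiry_seconds}

def market_order(sell, buy, market_price):
    # Priced at market with a short expiry: expected to fill immediately.
    return make_order(sell, buy, market_price, expiry_seconds=120)

def limit_order(sell, buy, market_price, discount=0.02):
    # Priced better than market with a long expiry: may sit on the books.
    return make_order(sell, buy, market_price * (1 - discount),
                      expiry_seconds=86_400)

print(market_order("WETH", "DAI", 100.0)["limit_price"])
print(limit_order("WETH", "DAI", 100.0)["limit_price"])
```

The keeper-side machinery never needs to know which kind it is handling; only the price and expiry the UI chose differ.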

[00:37:40.190] - frogmonkee
Got you. That makes sense. Thank you.

[00:37:44.130] - Deetz
Gman, did you have something to add, then?

[00:37:46.790] - Gman
Yeah. The other cool thing is: if users start using the transaction relay, then even if they go somewhere else to do a swap, like on Uniswap, they're still going to be able to capture value. As the transaction comes through the transaction relay, the keepers will be able to see if there's an arbitrage opportunity on another DEX, and they'll bid on the transaction, and you'll still get a rebate even though you aren't going through our swap protocol. So when it comes out, people will still be able to collect their MEV even from swaps outside of our ecosystem.

[00:38:31.310] - JZ
And to piggyback on that -- I know we're digging into this call a little longer, but this is just super interesting, high-value stuff. Let's say someone were to just go to 1inch, as a more detailed example. Someone goes to 1inch, they're a whale, and they're making a really big trade. Oftentimes what bots in the wild will do is backrun that user. So that 1inch trade, maybe it hits Uniswap v2 a little too hard, and that actually creates a backrunning profit opportunity. If you were to go to 1inch directly and make this trade, either a miner or a bot or both are going to make a profit off of you.

[00:39:08.250] - JZ
But if you were to route that same exact trade through the Rook ecosystem, you're going to have keepers in the Rook ecosystem making that money back for you, and then you're going to get 80% of that bid. And so it's super high value. The suggestion would be to do one of two things: either make your trade through the Rook app instead, or if you still want to use 1inch, use 1inch, but have it routed through our TX relay. Then if there is money made off of your trade, it goes right back into your pocket. So, absolutely, it's a fantastic upgrade, and we're looking forward to both things: the transaction relay and the protocol upgrade.

[00:39:46.430] - frogmonkee
That would require 1inch being part of this transaction relay network, though, right? It's not something that we can just impose on other apps?

[00:39:56.690] - Gman
You can do it at two different levels. One is a partnership with 1inch. The other is that a user can just set up their wallet to use the transaction relay, and all their transactions from whatever protocol they use will be routed through it, and it will work the same way.
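A toy model of the relay path Gman describes: every transaction passes through the relay, keepers inspect it for a backrun opportunity, and the user is rebated a share of the best bid even though the swap itself happened elsewhere. The keeper heuristic and the numbers are invented for illustration; the 80% share is the figure quoted on the call.

```python
def route_through_relay(tx, keepers, user_share=0.80):
    """Show the transaction to every keeper; if any finds a backrun
    opportunity it bids, and the user gets a share of the best bid."""
    bids = [k(tx) for k in keepers]
    best = max((b for b in bids if b is not None), default=None)
    return {"winning_bid": best,
            "rebate": user_share * best if best is not None else 0.0}

def toy_keeper(tx):
    # Pretend a trade over $10k moves a pool enough to create a backrun
    # worth ~10% of its size (purely illustrative numbers).
    return 0.10 * tx["value"] if tx["value"] > 10_000 else None

print(route_through_relay({"value": 50_000}, [toy_keeper]))
```

Small trades that create no opportunity simply pass through with a zero rebate, which matches the "in a perfect world there would be no rebate" framing above.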

[00:40:18.200] - JZ
That's something you can do in MetaMask yourself. So a power user can do that themselves. Or a random user going through one of our partners can latch on through the partner.

[00:40:29.750] - Deetz
And Joey, just quickly: my assumption would be that we'll have documentation on this to walk users through it, say, if they wanted to set it up inside their wallet?

[00:40:40.260] - JZ
Yes, absolutely.

[00:40:42.050] - Deetz
Awesome. Great. Do we have anyone else with protocol-based questions here? Feel free to quickly tag me. If not, we'll keep trucking along. All right, well, with that, we're going to skip on down ahead to the product side. So, Pangolin, why don't you quickly introduce us to the last quarter's worth of product work and then we can get through the update.

[00:41:08.090] - Pangolin
Cool. Yeah. I'll try to be quick. I'm also feeling very sick today, so apologies if my voice sounds scuffed. I don't want to take too much time with product, because I think protocol is the important thing, and that's what we're all excited about. I was thinking about what to talk about with the product team over the last quarter, and I realized one thing worth covering is: what is the product team, and why are there two engineering teams?

[00:41:44.190] - Pangolin
So, we're not really a company selling products. We use "product" to refer to anything that sits at the interface between a human and the protocol. It could be called the interface team or the service team -- basically, things which help usage, particularly by humans. One key thing about this is that you could actually spin up your own product team and build all this stuff yourself, because it interfaces with the protocol using public APIs. We have our own one at Rook Labs because obviously we know the protocol very well. But this makes it very easy from an engineering point of view to split the work completely, because there's not too much direct engineering work that needs to be done on both sides at the same time.

[00:42:45.390] - Pangolin
So the product team, when it started, was basically just me and Pai. Pai has since moved over to protocol, and we've grown. The part of the product team you see the most is the app itself, of course, but we've also grown to cover back-end product and a lot of the analytics you see, like the HidingBook viewer that Joey mentioned before. The reason for that is we don't yet feel the need to have our own analytics team, and because the product team owns the database with all the data, it makes sense for the analytics to be done in the same team. So we have this list of contributors here -- I think it's about ten overall. The story of this year has been building out the app while onboarding and breaking in a new team of people. We've been very careful not to let one side of that equation affect the other: maintain a cadence of development while bringing people up to speed. And now we have this great team who are very agile and can delve into stuff as it arises.

[00:44:15.190] - Pangolin
In April '22, I think, we launched the app. That was about three or four months of development. We rebuilt it completely from the ground up, which is the first time that has happened at Rook since the beginning. Traditionally, Rook has always kind of limped along, using the old code and adding to it. We just found that this wasn't a tenable situation anymore; the code was getting very hard to maintain.

[00:44:49.510] - Pangolin
So we built it all again from scratch. It's all our own code; we didn't fork Uniswap or anything. The app, I'm sure you're very familiar with. It's four tabs which do different things, the most complex one being the trading tab. At the moment it just does limit orders, but we're looking to do swaps with the upgrade, and that will probably coincide with another larger refresh of the app. I think we can skip over the app. Sorry, just losing myself... yeah. Okay. So the app is what everyone sees, and it's probably what you're most familiar with in the product. The app itself is actually served mainly from a server, which we maintain and also built from scratch.

[00:45:50.570] - Pangolin
And this is a pretty big leap forward for us. Before, in the Hiding Game era, we didn't really collect much persistent data, because we just didn't have the team to do it. Everything was kind of just running, and a lot of the records were logged only a little bit, or not at all. We decided that we wanted a good persistent data store to log everything that goes on, especially with the Coordination Game. With the coordination protocol, things are an order of magnitude more complicated: we have several different keeper bots run by different teams; we have the Coordinator, the bidding, the auctioning, all of that; and then all the stuff coming in from user land, like the orders themselves coming in from market makers and users and integrations. There are a lot of moving parts. When we launched in April, the whole thing actually worked very well, and that was a huge victory for us, because launching an entirely new tech stack all at once is not generally good advice. I think the team did a really good job of making sure this wasn't going to fall over when it entered the public, and then after the launch.

[00:47:19.560] - Pangolin
And the story of this quarter has been making sure it stays running. Most problems anywhere in the millions of lines of code in the stack tend to surface eventually as a problem in the app. Someone will say: oh my, the auctions aren't loading. And then we have to track back and see what's happened. This is where all of the analytics and logging and DevOps really comes into its own, and allows us to move quickly and make sure it's not a problem. So far there haven't really been any critical issues, but there have been points where something has fallen over. So this quarter we've been doing a lot of work improving this process. Obviously we'd like it to never break, but when it does break we can quickly diagnose the problem, get into it, learn from it, and add some sort of process to prevent it in the future. So that's been Q2 for the product team. Most of Q1 was building the app, and the first month of Q2 as well. We froze adding any new features for the rest of this quarter, just to go back and really make sure things are being done well and the code is healthy.

[00:48:49.320] - Pangolin
So towards the end of the year you'll probably see us start adding new things to our own app. The other thing we're largely going to be focusing on going forward is other ways to interface. Using app.rook.fi is convenient, and it's one way you can access the protocol, but we think there's going to be a lot of value in going through other people's products and interfaces to use the protocol. That is very general-purpose, and all of those efforts are also owned by the product team.

[00:49:30.320] - Pangolin
So we're talking with a lot of different partners and evaluating the engineering cost of partnering with different people. Some of these things are ongoing. I'd love to talk more about them, but we can't yet. It's difficult when there's a third party involved, because we want to make sure everything is set in stone before announcing their involvement publicly, and a lot of things may not pan out for various reasons. But you should see some more stuff to do with partnerships coming up this year. Yeah, that's the high level. I don't know -- Pai or Perry, if you wanted to dig in a bit more on the back end and analytics?

[00:50:21.530] - Pai-Sho
I think that was a good summary. We can probably leave it high-level. There's a lot of detail in the report itself, but if anyone has any questions, we're happy to answer them now.

[00:50:31.850] - Deetz
I will just give a quick plug, since I use this as I do some of our internal tracking for market makers and keepers. If you've got some time, go to api.rook.fi/docs. It is an awesome little system where you can filter through pretty much every piece of data that's coming through the protocol. So I just want to give another shout-out to the product team, and specifically Pai, for building that as well. You can grab a bunch of great data -- you can even do things as simple as pulling what the most current xROOK APR might be for the last week. Its capabilities on the analytics side are absolutely fantastic. Stile, do you want to add some color to the analytics section here?
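As a sketch of the kind of query Deetz mentions: the real endpoints and response schema live at api.rook.fi/docs, so the JSON shape below is an assumption for illustration, and the parsing is shown against a hardcoded sample rather than a live request.

```python
import json

def latest_xrook_apr(payload):
    """Return the most recent APR from a time series payload.
    The field names here are assumed, not the real API schema;
    consult api.rook.fi/docs for the actual shape."""
    points = json.loads(payload)
    return max(points, key=lambda p: p["timestamp"])["apr"]

# Hardcoded stand-in for what an APR endpoint might return.
sample = '[{"timestamp": 1, "apr": 0.11}, {"timestamp": 2, "apr": 0.13}]'
print(latest_xrook_apr(sample))  # 0.13
```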

[00:51:39.270] - Stile
There's not a whole lot to add, I think... But the addition of the Rook database has really allowed us to look at things that we haven't been able to look at before. We've been able to look at the orders both on the HidingBook viewer and on what we're calling our order analyzer. We've been able to look at what types of trades we're seeing -- what sizes, what pairs -- and to explore whether there are certain things we're having issues with: whether certain pairs aren't filling, or are having a harder time filling. We're also able to look at whether keepers are crossing user orders. Those are some really interesting trades, because sometimes a keeper can just take two orders in the HidingBook and fill both sides. So we can look at that sort of stuff, look at some of our past auctions, and get a really good insight into the entire ecosystem.
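The "crossing" case Stile describes can be sketched as follows. Two resting orders cross when each sells what the other buys at overlapping prices, so a keeper can fill both sides in one transaction and keep the spread. The fields and price convention here are invented, not the HidingBook schema.

```python
def find_cross(orders):
    """Return the ids of the first pair of orders that cross.
    `price` is quoted as units of `buy` per unit of `sell`, so a
    pair crosses when the product of the two prices is at most 1."""
    for i, a in enumerate(orders):
        for b in orders[i + 1:]:
            if (a["sell"] == b["buy"] and a["buy"] == b["sell"]
                    and a["price"] * b["price"] <= 1.0):
                return a["id"], b["id"]
    return None

book = [
    {"id": 1, "sell": "WETH", "buy": "DAI", "price": 1800.0},   # wants 1800 DAI/WETH
    {"id": 2, "sell": "DAI", "buy": "WETH", "price": 0.00054},  # wants 0.00054 WETH/DAI
]
print(find_cross(book))  # 1800 * 0.00054 = 0.972 <= 1, so they cross
```

In this toy book the keeper pays order 1 its 1800 DAI, hands the WETH to order 2's maker, and pockets the roughly 52 DAI of spread.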

[00:52:41.010] - Pangolin
Yeah, I just want to add something kind of interesting about DeFi. Obviously everything is on chain at the end of the day, and that is touted as one of the great advantages of doing things this way. But something I've definitely found over the last year is that a lot of teams working in this space are actually flying pretty blind with regards to analytics. Most teams don't have the expertise or resources to really look at the data at all, and the data in its rawest form is very difficult to look at -- if you're actually just pulling off the chain, that's very hard for us as well. We were aware that we didn't have enough insight into our own protocol, and that was really when we knew we needed to increase headcount; it's one of the main reasons for doing it. And now we're already seeing the fruits of this labor a lot. It comes up all the time when, like I said, something's not loading in the app. It comes up when we talk to a partner and they ask us to engineer something for them, and then we do a quick study and it turns out it would be incredibly inefficient or unprofitable or whatever.

[00:54:15.370] - Pangolin
There are just all sorts of questions that can be answered very quickly. I think most teams have not invested in the ability to answer these questions, but we have, and I think that all compounds going forward. These tools get built, and then you can just lean on them, and then you can build on top of them and build on top of them. We've just seen the first phase of that complete, with the database and the API, and we're already seeing these secondary tools coming this quarter, which make it easy for someone -- even people like Kyle -- to quickly tag in and have a look at some data without pulling in an engineer. All of these little things compound and make the whole team much more productive at achieving the goal of advancing the protocol.

[00:55:13.930] - Pai-Sho
And it also gives us a lot of really valuable granular insight into the dynamics of the coordination protocol itself. Like Joey was talking about earlier, we've been constantly tuning the greenlight algorithm and the bid distribution parameters, and all these things play into a really complicated dynamical system. Having all this really granular data helps us analyze that and figure out how to tune it, and it makes that whole iterative process a lot more efficient. Awesome. Okay.
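A toy version of that iterative tuning loop: sweep candidate user-share values against recorded auction data and keep the most generous setting that still meets some other target (here, a minimum amount of bid value left over for the burn). The data, parameter names, and constraint are all invented; the real greenlight algorithm is not public in this form.

```python
def tune_user_share(auctions, candidates, min_burn=10.0):
    """Pick the largest user share whose leftover bid value still
    meets the burn target; fall back to the smallest candidate."""
    total_bid = sum(a["bid"] for a in auctions)
    feasible = [s for s in candidates if total_bid * (1 - s) >= min_burn]
    return max(feasible) if feasible else min(candidates)

history = [{"bid": 100.0}, {"bid": 80.0}]
print(tune_user_share(history, [0.70, 0.80, 0.90, 0.95]))
```

With richer data (per-keeper margins, fill rates, order flow) the same sweep structure extends to the multi-parameter tuning described on the call.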

[00:55:48.700] - Deetz
Yeah. So why don't we send it to the last part here, where I believe Perry is going to talk a little bit about the DevOps improvements.

[00:55:57.550] - Perry
Yeah. Some of this has already been covered, so I will just cover the three things that really haven't been touched on: containerization, our testing, and then our logging -- our ability to understand what's going on with our back end and all of our infrastructure.

[00:56:15.790] - Perry
So the biggest thing is probably the containerization. This isn't just on the product side, either; it's also happening on the protocol side. Starting to containerize our services will let us deploy quicker, and our CI/CD pipelines will speed up. We'll be a lot faster going from dev to production, because you're working in the same environment whether it's local, your staging deployment, or production. I think that's really going to help all teams. With our testing: we didn't really have any testing, especially on the product back end, before Q2, but we've made a lot of progress with testing in the back end. We still want to continue pushing that, because it's never bad to have more tests, but we're in a much better place than we were before. And then our logging: we have current efforts going on that will improve our logging capabilities and our ability to monitor our full stack and understand where errors are coming from.

[00:58:29.660] - Perry
When we do have downtime, we'll be quicker to respond to it: we'll have a better understanding of where the error is coming from and be able to root-cause the problem quicker. Our logging is much better than it was at the beginning of Q2. We've made a lot of effort on the back end, especially, to log every single spot where something might go down, so we now have a good idea on the back end of where a failure might be happening when something goes down.

[00:58:29.660] - Perry
And we have context for those errors. It's really come together over the last quarter: we're a lot more robust to errors and downtime than we used to be, and we're a lot quicker to respond. The other points in here -- the modern Python API stack -- were touched on with our new product back end, and the same with the modern React stack for the front end. We covered that stuff already. But I will say both of those improvements have come to fruition with a lot better performance; our page load times have significantly improved.

[00:58:56.760] - Perry
A big part of that was that the front end used to have to make all of its own web3 calls, but now all of that is done through the product back end, and a lot of that information is just stored in the database. So we don't have to make those web3 calls all the time; we can just query our own back end, which has really helped with the performance of the app. And that's really all we had for DevOps. There's a lot more detail we could go into, but we don't really need to dive into all the specifics.
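The caching pattern Perry describes (the back end fetches chain data once, the front end reads the stored copy) in miniature. The class name and TTL are illustrative, not the actual back-end design:

```python
import time

class ChainDataCache:
    """Sketch: make the web3 call once, store the result with a
    timestamp, and serve repeated queries from the cache until the
    entry is older than the TTL."""

    def __init__(self, fetch, ttl=15.0):
        self.fetch = fetch   # the expensive call (e.g. a web3 query)
        self.ttl = ttl       # seconds before a cached value goes stale
        self._store = {}

    def get(self, key):
        hit = self._store.get(key)
        if hit is not None and time.monotonic() - hit[1] < self.ttl:
            return hit[0]                      # fresh cached value
        value = self.fetch(key)                # slow path: hit the node
        self._store[key] = (value, time.monotonic())
        return value

calls = []
cache = ChainDataCache(lambda k: calls.append(k) or 42)
cache.get("balance"); cache.get("balance")
print(len(calls))  # 1 -- the second read came from the cache
```

This is why page loads improved: repeated front-end reads become database or cache hits instead of round-trips to an Ethereum node.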

[00:58:56.760] - Deetz
Great. And I believe Pangolin did touch a little bit on this -- on development practices and using analytics. Is there anything that we didn't cover in this section, guys?

[00:58:56.760] - Pangolin
I would shout out, if anyone is in a technical field: we switched all our project management on the product team over to a tool called Linear. If you've ever used Jira, it's kind of like that, but way better. I just have to shout it out because it's the new kid on the block and it's a really good tool. It does everything that we need it to do, and it's like a hundred times better than what I've used in the past.

[00:59:43.750] - Deetz
Awesome. So why don't we give an opportunity here for anybody to ask any questions of the product team. Go ahead and ping me in chat if you do have any questions.

[00:59:57.010] - Deetz
All right. I'm not seeing any typing happening here, so I'll just say that for folks who maybe want to go back through anything we covered a little quickly, we will have a recording and a recap available for this call by early next week at the latest, if not much sooner. But as we wrap up this call -- Joey, or Hazard, or anyone -- do you want to give us a wrap on Q2 engineering?

[01:00:31.870] - JZ
Sure. Yeah, I'll give it a wrap. Q2 was extremely successful. The foundation that we've laid here for our future use and the future public good of DeFi-ers around Ethereum and around just crypto in general -- this is massive. This is, again, only the foundation. We're working on upgrades. We have many plans. There's a lot of cool stuff that can be done. And it all starts with this critical foundation. And special thanks to this team because everyone here is kicking ass and working really hard, and people are owning so many things, and it's really enabled us to do what we're doing. So look for improvements, look for add-ons to the foundation. And I'm really excited to be working with this group, and really excited to see how this year concludes with Q3 and Q4 coming up here.

[01:01:33.790] - Deetz
Absolutely awesome. Well, great. Just a little bit of housekeeping as we clear out of this call today. A quick heads-up: we have another call starting in less than an hour on Rook narratives, which I know Watts and Troy and Hazard have put a lot of time into. This is something we're excited to not just share with you all, but also take to a broader stage soon. So please make some time for that 1:00 p.m. Eastern call. And then we'll wrap up these end-of-quarter events at 3:00 p.m. today with the operations portion of the quarterly report.

[01:02:18.580] - Deetz
But with that, I appreciate everybody being here for this call today -- especially the questions that were asked. A big thanks to the folks on the engineering teams who got up, presented, and contributed to this report. We'll hopefully be seeing a bunch of you in about 54 minutes, but we'll go ahead and wrap this one. So thanks, everybody, and we'll talk to you all soon.