Introduction
Typically, when using Paima, you are reacting to events created by underlying funnels. There are cases, however, where you want your rollup to react to events emitted by the rollup itself, and this is supported through "ticks" (scheduled events). But what if you want your rollup to emit events purely for external applications to consume, as opposed to triggering a state transition in your rollup? This is the goal of event logs.
Core concept
When writing an application, being able to easily understand usage patterns is crucial:
- For core developers, to know what is working and what isn't
- For external developers, to be able to build interesting projects on top of the core protocol
Although there are cases where relational databases provide significant speed, clarity, and usability benefits, in many cases non-relational databases are significantly easier to write to and much easier to consume externally. This is especially true when user-generated content plays a key role in the protocol, as you often cannot know ahead of time which relational structures best fit user behavior. You can see this in practice with blockchains, where there are often relational database options for the parts that seldom change (the core protocol itself), whereas user-generated content (ex: dApps) is often indexed using general logging systems (ex: event logs in Solidity).
Similarly, for Paima Engine, we provide database management of rollup state out of the box, but also provide a simpler logging system for cases where it makes sense.
Notably, Paima's event system was designed with the following desirable properties in mind:
- Events should be customizable
- Events can be defined on a per-app basis and emitted wherever needed (ex: an event when a player completes a quest)
- Events can self-define which fields are indexable to fine-tune performance
- Events should be easily parsable
- Events should support static typing using json-schema
- Events should come with code generation and documentation generation to make them easy for external developers to understand
- Events should be easy to access
- Historical events are kept permanently by default
- (for historical data) Events should be accessible via a REST interface
- (for realtime data) Events should be accessible via a pub-sub system
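To make the properties above concrete, here is a minimal sketch of what a per-app event definition could look like. The JSON Schema object and the type derivation via the `json-schema-to-ts` package are standard; the `registerEvent` helper and its `indexed` option are illustrative placeholders, not the actual Paima Engine API, so consult the Paima documentation for the real entry points.

```typescript
import { FromSchema } from 'json-schema-to-ts';

// JSON Schema for a "quest completed" event, defined on a per-app basis.
const QuestCompletedSchema = {
  type: 'object',
  properties: {
    questId: { type: 'integer' },
    player: { type: 'string' }, // wallet address of the player
    completedAt: { type: 'integer' }, // block height at completion
  },
  required: ['questId', 'player', 'completedAt'],
  additionalProperties: false,
} as const;

// Static typing derived from the schema.
type QuestCompleted = FromSchema<typeof QuestCompletedSchema>;

// Hypothetical registration helper: the name, signature, and the `indexed`
// option are assumptions for illustration only, not the real Paima API.
declare function registerEvent(config: {
  name: string;
  schema: typeof QuestCompletedSchema;
  indexed: (keyof QuestCompleted)[]; // fields that should be indexable for faster lookups
}): void;

registerEvent({
  name: 'QuestCompleted',
  schema: QuestCompletedSchema,
  indexed: ['questId', 'player'],
});
```

Marking only the fields you expect to query on as indexable keeps write overhead low while still letting external consumers filter efficiently.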
Use-cases
- Analytics: event logs can help you get exactly the data you need for better insight into your game, like knowing how many users have completed a quest (see the sketch after this list)
- Notifications: events can easily be streamed into other tools like Discord bots to give notifications when users perform notable actions
- Shadow logs: sometimes it can be hard (or gas-expensive!) to encode the exact logging statements you want in Solidity. In fact, sometimes it can be impossible if your application spans multiple stacks or contains non-EVM components. Paima event logs can solve this by acting as shadow logs: logs that are deterministically emitted as an open standard (assuming you open-source the code you wrote with Paima) that others can consume for their application. Better support for this will come to Paima in the future (see this issue for more)
- Triggering offchain computation / oracles: there are many cases where you want your application reaching a certain state to trigger some known private key / entity to perform an action (ex: rewarding users with tokens, marking an objective as complete in some 3rd-party tool, etc.). This can be achieved with Paima event logs, as you can have a JavaScript program listen for specific events and use them as the trigger for additional logic (see the sketch after this list).
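As a rough illustration of the analytics and offchain-trigger use-cases, the sketch below subscribes to a pub-sub topic for realtime events and queries a REST endpoint for historical ones. The client calls use the standard npm `mqtt` package, but the broker URL, topic name, and the REST path and query parameters are placeholders for illustration and are not guaranteed to match Paima's actual endpoints.

```typescript
import mqtt from 'mqtt';

// Realtime: subscribe to new events over MQTT (the broker URL and topic name
// below are assumptions; check your Paima node's configuration).
const client = mqtt.connect('ws://localhost:8080');
client.on('connect', () => {
  client.subscribe('QuestCompleted');
});
client.on('message', (_topic, payload) => {
  const event = JSON.parse(payload.toString());
  // ex: post a Discord notification or trigger an offchain action here
  console.log(`Player ${event.player} completed quest ${event.questId}`);
});

// Historical: query past events over REST for analytics
// (the path and query parameters below are hypothetical).
async function countQuestCompletions(questId: number): Promise<number> {
  const response = await fetch(
    `http://localhost:3333/events/QuestCompleted?questId=${questId}`
  );
  const events: unknown[] = await response.json();
  return events.length;
}
```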