# Creating Specs
Hyperstack uses a declarative Rust DSL (Domain Specific Language) to define how on-chain Solana data should be transformed, aggregated, and streamed to your application. Instead of writing complex ETL pipelines, you simply declare the final state you want, and Hyperstack handles the rest.
## Why Declarative?

Building data pipelines for Solana typically involves manual account parsing, complex event handling, and managing state synchronization. Hyperstack replaces this imperative approach with a declarative model:
| Imperative Approach (Traditional) | Declarative Approach (Hyperstack) |
|---|---|
| Write custom decoding logic for every account | Use `#[map]` to link IDL fields to your state |
| Manually track and sum event values | Use `#[aggregate(strategy = Sum)]` |
| Manage WebSocket connections and state diffs | Define entities and let Hyperstack stream updates |
| Build custom backend services for data | Deploy a spec and use generated SDKs |
## Anatomy of a Spec

A Hyperstack spec is a Rust module annotated with `#[hyperstack]`. Inside this module, you define Entities—the structured data objects your application will consume.
```rust
use hyperstack_macros::{hyperstack, Stream};

#[hyperstack(idl = "path/to/idl.json")]
pub mod my_stream {
    #[entity(name = "Token")]
    #[derive(Stream)]
    pub struct Token {
        /// The primary key (mint address)
        #[from_instruction(Create::mint, primary_key)]
        pub mint: String,

        /// Real-time reserves mapped from account state
        #[map(BondingCurve::reserves)]
        pub reserves: u64,

        /// Total volume aggregated from individual trade events
        #[aggregate(from = Trade, field = amount, strategy = Sum)]
        pub total_volume: u64,

        /// Computed field derived from existing fields
        #[computed(reserves * price)]
        pub tvl: u64,
    }
}
```

### Key Components

- `#[hyperstack]` module: The container for your specification. It links to your data sources (IDL or Protobuf files).
- `#[entity]` struct: Defines a projection of on-chain data. Each entity represents a collection of related data that will be streamed as a single unit.
- Primary key: Every entity must have a primary key (usually a `Pubkey` or `String`). This is how Hyperstack tracks individual instances of an entity.
- Field mappings: Attributes on struct fields that define where the data comes from and how it's processed.
## Mapping Types

Hyperstack provides several mapping attributes to populate your entity fields:
| Attribute | Source | Description |
|---|---|---|
| `#[map]` | Account State | Tracks fields within a Solana account. Updates whenever the account changes. |
| `#[from_instruction]` | Instructions | Extracts arguments or account keys from a specific instruction. |
| `#[aggregate]` | Events/Instructions | Computes running values (Sum, Count, etc.) from a stream of events. |
| `#[event]` | Events | Captures specific instructions as a log of events within the entity. |
| `#[snapshot]` | Account State | Captures the entire state of an account at a specific point in time. |
| `#[computed]` | Local Fields | Derives a new value by performing calculations on other fields in the same entity. |
| `#[derive_from]` | Instructions | Populates fields by deriving data from instruction context. |
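Several of these attributes can be combined in one entity. The sketch below illustrates `#[event]` and `#[snapshot]` alongside a primary key; it assumes a `Trade` instruction and a `BondingCurve` account exist in the linked IDL, and the exact attribute parameters and field types shown here are illustrative rather than definitive:

```rust
use hyperstack_macros::{hyperstack, Stream};

#[hyperstack(idl = "path/to/idl.json")]
pub mod my_stream {
    #[entity(name = "TokenActivity")]
    #[derive(Stream)]
    pub struct TokenActivity {
        /// Primary key taken from the instruction that creates the token
        #[from_instruction(Create::mint, primary_key)]
        pub mint: String,

        /// Log of Trade instructions captured as events on this entity
        /// (attribute form is an assumed sketch)
        #[event(Trade)]
        pub trades: Vec<Trade>,

        /// Full account state captured whenever the curve account changes
        #[snapshot(BondingCurve)]
        pub curve: BondingCurve,
    }
}
```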
## Population Strategies

When data arrives, strategies determine how the field value is updated. This is particularly powerful for aggregations.
| Strategy | Behavior |
|---|---|
| `LastWrite` | (Default) Overwrites the field with the latest value. |
| `SetOnce` | Sets the value once and ignores subsequent updates (perfect for IDs). |
| `Sum` | Adds the incoming value to the existing total. |
| `Count` | Increments the total by 1 for every matching event. |
| `Append` | Adds the incoming value to a list (creating an event log). |
| `Max` / `Min` | Keeps only the highest or lowest value seen. |
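In practice, different strategies often coexist on one entity. The following sketch assumes a `Trade` event with `amount` and `price` fields in the IDL; the `Count` and `Max` parameter forms shown are assumptions extrapolated from the `Sum` example earlier on this page:

```rust
use hyperstack_macros::{hyperstack, Stream};

#[hyperstack(idl = "path/to/idl.json")]
pub mod my_stream {
    #[entity(name = "TokenStats")]
    #[derive(Stream)]
    pub struct TokenStats {
        /// Primary key, set once from the Create instruction
        #[from_instruction(Create::mint, primary_key)]
        pub mint: String,

        /// Running total of trade amounts
        #[aggregate(from = Trade, field = amount, strategy = Sum)]
        pub total_volume: u64,

        /// Number of trades observed (assumed: Count needs no field)
        #[aggregate(from = Trade, strategy = Count)]
        pub trade_count: u64,

        /// Highest trade price seen so far
        #[aggregate(from = Trade, field = price, strategy = Max)]
        pub all_time_high: u64,
    }
}
```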
## How Specs Become Data Streams

When you build your Rust project containing a Hyperstack spec, the macros perform several steps:
- Validation: The macro verifies that your mappings match the provided IDL or Protobuf definitions.
- AST Generation: It generates a JSON-based Abstract Syntax Tree (AST) representing your data pipeline.
- Bytecode Compilation: When you deploy your spec (`hyperstack up`), Hyperstack Cloud compiles this AST into optimized bytecode.
- Execution: The Hyperstack VM executes this bytecode in real time against the Solana transaction firehose, updating your entity state and streaming diffs to your clients.
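The end-to-end flow can be sketched as two commands. `hyperstack up` is the deploy command named above; the use of a standard `cargo build` for the validation and AST-generation steps is an assumption based on the spec being an ordinary Rust project:

```shell
# Build the project: the macros validate mappings against the IDL
# and generate the JSON AST at compile time.
cargo build

# Deploy the spec: Hyperstack Cloud compiles the AST into bytecode
# and the Hyperstack VM begins streaming entity updates.
hyperstack up
```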
## Next Steps

- Mapping Macros — Deep dive into every mapping attribute and its parameters.
- Aggregation Strategies — Learn how to build complex metrics using different strategies.
- CLI Reference — Learn how to build and deploy your specs.