# Streaming

Push indexed events to external systems in real time.

## Webhooks

An HTTP POST is sent after each block is committed. Useful for triggering downstream pipelines or cache invalidation.

```toml
# sieve.toml
[[streams]]
name = "my_webhook"
type = "webhook"
backfill = false
```

Set `WEBHOOK_URL` in `.env`.

### Payload

```json
{
  "block_number": 22516100,
  "block_timestamp": 1700000000,
  "tables": [
    { "name": "usdc_transfers", "event": "Transfer", "count": 3 },
    { "name": "eth_transfers", "event": "transfer", "count": 1 }
  ]
}
```
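Any HTTP endpoint that accepts a JSON POST can consume this payload. A minimal stdlib-only Python receiver might look like the following sketch (the port, handler name, and `summarize` helper are illustrative, not part of sieve):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def summarize(payload: dict) -> str:
    """Build a one-line summary of a block-commit webhook payload."""
    total = sum(t["count"] for t in payload["tables"])
    return f"block {payload['block_number']}: {total} events across {len(payload['tables'])} tables"

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        print(summarize(payload))  # trigger cache invalidation, jobs, etc. here
        self.send_response(204)    # acknowledge quickly; delivery is best-effort
        self.end_headers()

# To run: HTTPServer(("", 8000), WebhookHandler).serve_forever()
```

Respond quickly (e.g. `204 No Content`) and do heavy work asynchronously, since delivery is best-effort and the indexer will not retry on your behalf.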

## RabbitMQ

Per-event JSON messages to an AMQP exchange with configurable routing keys.

```toml
# sieve.toml
[[streams]]
name = "rabbitmq_events"
type = "rabbitmq"
exchange = "sieve_events"
routing_key = "{table}.{event}"
backfill = false
```

Set `RABBITMQ_URL` in `.env`. The `routing_key` supports `{table}` and `{event}` placeholders.
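The placeholder expansion can be sketched with a small helper (illustrative only; this is not sieve's code, just the substitution the template describes):

```python
def routing_key(template: str, table: str, event: str) -> str:
    """Expand {table}/{event} placeholders into a concrete AMQP routing key."""
    return template.format(table=table, event=event)

# A Transfer event on the usdc_transfers table with the default template:
key = routing_key("{table}.{event}", "usdc_transfers", "Transfer")
# → "usdc_transfers.Transfer"
```

With dot-separated keys like this, consumers on a topic exchange can bind with patterns such as `usdc_transfers.*` or `*.Transfer` to filter by table or event.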

### Message payload

```json
{
  "table": "usdc_transfers",
  "event": "Transfer",
  "contract_name": "USDC",
  "contract": "0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48",
  "block_number": 22516100,
  "block_timestamp": 1700000000,
  "tx_hash": "0xabc...",
  "log_index": 5,
  "tx_index": 42,
  "tx_from": "0x1234...5678",
  "data": {
    "from": "0xDead...beef",
    "to": "0xCafe...babe",
    "value": "1000000"
  }
}
```

The `data` keys use raw ABI parameter names, not your TOML column names. If the Solidity ABI defines `_troveId`, the stream sends `_troveId` even though PostgreSQL stores it as `trove_id`.

## Stream options

| Option | Required | Default | Description |
| --- | --- | --- | --- |
| `name` | required | — | Unique stream identifier |
| `type` | required | — | `webhook` or `rabbitmq` |
| `backfill` | optional | `true` | Whether to send during historical sync |
| `exchange` | required for `rabbitmq` | — | AMQP exchange name |
| `routing_key` | optional | `{table}.{event}` | AMQP routing key template |

## Behavior

- Webhook and RabbitMQ streams can run simultaneously.
- Delivery is best-effort: failures are logged and never block indexing.
- RabbitMQ uses a lazy connection with auto-reconnect on failure.
- Receipt fields are included in payloads when `include_receipts = true`.
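As an illustration of the first point, a single `sieve.toml` can declare both stream types side by side (values reused from the examples above):

```toml
[[streams]]
name = "my_webhook"
type = "webhook"
backfill = false

[[streams]]
name = "rabbitmq_events"
type = "rabbitmq"
exchange = "sieve_events"
routing_key = "{table}.{event}"
backfill = false
```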