Event Streams

Trigger business processes, and stream data to off-chain caching or analytics

Description

Build event-driven applications on the blockchain with ease. Respond dynamically when participants on your chain submit state changes, or transactions that require coordination between multiple parties to complete.

Connect your on-chain data to your off-chain systems of record. Drive updates into those systems as state changes in the shared ledger, triggering business processes, applications and notifications.

Build an off-chain cache of on-chain data for end-user mobile and web experiences that perform high-volume queries against the on-chain data. Stream data to your analytics data lake, or real-time analytics platform, to drive actionable insights from your consortium's shared ledger.

Kaleido Event Streams perform at-least-once delivery of Ethereum events from your blockchain node to any REST HTTPS endpoint.

The events are delivered as simple JSON payloads, with all of the complexities of Ethereum topics, RLP encoding and type mapping handled for you. Just provide a simple service that delivers the payload to your chosen backend, pre-batched by the event stream.
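A receiving service only needs to accept the pre-batched JSON and hand each decoded event to the backend. A minimal sketch, assuming the payload is a JSON array of event objects with fields like `signature` and `blockNumber` (illustrative names, not a documented schema):

```python
import json

def handle_batch(body: str) -> int:
    """Parse a pre-batched JSON payload of decoded events,
    pass each one to the backend, and return the count processed."""
    events = json.loads(body)
    for event in events:
        # Each event arrives fully decoded -- no topic or RLP handling needed.
        store_event(event)
    return len(events)

def store_event(event: dict) -> None:
    # Placeholder for your system-of-record update.
    print(f"{event.get('signature')} @ block {event.get('blockNumber')}")

batch = json.dumps([
    {"signature": "Transfer(address,address,uint256)", "blockNumber": 101,
     "data": {"from": "0xabc", "to": "0xdef", "value": "42"}},
])
count = handle_batch(batch)
```

The same handler shape works whether the endpoint is a serverless function or a small service in front of your database.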

The subscription interface is integrated with the REST API Gateway, so subscribing to events is a simple API call. You can subscribe to events on an instance-by-instance basis for a contract, or subscribe to all events of a given signature on the chain.
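As a sketch of what such an API call might carry, the body below builds a hypothetical subscription request. The field names (`stream`, `event`, `fromBlock`) and their semantics are assumptions for illustration; consult the REST API Gateway reference for the actual schema and endpoint path.

```python
import json

def build_subscription(stream_id: str, event_name: str, from_block: str = "0") -> str:
    """Assemble an illustrative subscription request body (field names are
    assumptions, not the documented API)."""
    return json.dumps({
        "stream": stream_id,      # the event stream to deliver into
        "event": event_name,      # e.g. one event signature on one contract instance
        "fromBlock": from_block,  # replay history starting from this block
    })

body = build_subscription("es-1234", "Transfer")
# This body would be POSTed to the gateway's subscription endpoint.
```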

Features

Reliable batched delivery of events

Events from multiple subscriptions are batched and streamed efficiently together on a single event stream, with checkpointing on each subscription to ensure reliable at-least-once delivery
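At-least-once delivery means a batch can be redelivered after a failure, so consumers should be idempotent. A common consumer-side pattern, sketched below, keeps a checkpoint per subscription and skips anything at or before it; the `(blockNumber, logIndex)` ordering key is an assumption about the payload, not a documented contract:

```python
# Last-applied (blockNumber, logIndex) per subscription.
checkpoints: dict = {}

def process(sub_id: str, events: list) -> int:
    """Apply only events past the subscription's checkpoint; return count applied."""
    applied = 0
    last = checkpoints.get(sub_id, (-1, -1))
    for e in sorted(events, key=lambda e: (e["blockNumber"], e["logIndex"])):
        key = (e["blockNumber"], e["logIndex"])
        if key <= last:
            continue  # duplicate from a redelivered batch -- skip it
        # ... apply the event to the system of record here ...
        last = key
        applied += 1
    checkpoints[sub_id] = last
    return applied

batch = [{"blockNumber": 5, "logIndex": 0}, {"blockNumber": 5, "logIndex": 1}]
first = process("sub-1", batch)   # fresh batch: both events applied
second = process("sub-1", batch)  # redelivery: nothing applied
```

Because the stream checkpoints on the server side too, this client-side guard only has to absorb the occasional redelivered batch, not arbitrary replays.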

Bind to AWS Kinesis with an AWS Lambda serverless function

No transformation required. Just receive the payload, pre-batched, and pass it on to create a firehose into your analytics data lake
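The "pass it on" step inside such a Lambda handler can be as small as reshaping the pre-batched events into Kinesis `PutRecords` entries. A sketch under stated assumptions: the stream name and the choice of partition key are illustrative, and the actual `boto3` call is left as a comment so the shaping logic stays self-contained:

```python
import json

def to_kinesis_records(events: list) -> list:
    """Shape pre-batched events into Kinesis PutRecords entries."""
    return [
        {
            "Data": json.dumps(e).encode("utf-8"),
            # Partition by contract address so each contract's events stay ordered
            # within a shard (an assumption about the payload's "address" field).
            "PartitionKey": e.get("address", "unknown"),
        }
        for e in events
    ]

records = to_kinesis_records([
    {"address": "0xabc", "signature": "Transfer(address,address,uint256)"},
])
# Inside the real Lambda handler:
# boto3.client("kinesis").put_records(
#     StreamName="my-analytics-stream", Records=records)
```

The equivalent Azure Functions binding below follows the same pattern, forwarding the batch to an Event Hub instead of a Kinesis stream.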

Bind to Microsoft Azure Event Hubs with an Azure Functions serverless function

No transformation required. Just receive the payload, pre-batched, and pass it on to create a firehose into your analytics data lake