
USE CASE

Breaking the Streaming and Batch Silos

Stop maintaining separate streaming and batch pipelines. Unify ingestion and transformation in a single system.

Simplify Data Pipelines in One System

Unify stream and batch ingestion and processing pipelines in one architecture. Stream and process data at low latency where your historical data is. Ingest data easily with pre-built streaming connectors via Snowflake Openflow. Use SQL to process streaming data for many use cases with Dynamic Tables.
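As a sketch, a Dynamic Table lets you declare a streaming transformation in plain SQL. The table, column, and warehouse names below are illustrative, not from the original page:

```sql
-- Illustrative sketch: raw_events, clean_events, and my_wh are hypothetical names.
CREATE OR REPLACE DYNAMIC TABLE clean_events
  TARGET_LAG = '1 minute'     -- freshness target; Snowflake schedules refreshes to meet it
  WAREHOUSE  = my_wh
AS
  SELECT event_id, user_id, TO_TIMESTAMP(event_ts) AS event_ts, payload
  FROM raw_events
  WHERE event_id IS NOT NULL;
```

You write only the query; Snowflake tracks upstream changes and keeps the result within the declared lag.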

Optimize Cost Without Wasted Compute

Streaming ingest for rowsets is as much as 50% cheaper than file ingestion at the same volume. Dynamic Tables help you avoid wasted compute by providing performance guidance and choosing between incremental and full refresh for more efficient transformations.
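As a hedged sketch of the refresh-mode choice, a Dynamic Table can be created with `REFRESH_MODE = AUTO`, letting Snowflake pick incremental refresh when the query supports it (the table and warehouse names are hypothetical):

```sql
-- Illustrative sketch: daily_totals, raw_events, and my_wh are hypothetical names.
CREATE OR REPLACE DYNAMIC TABLE daily_totals
  TARGET_LAG   = '15 minutes'
  WAREHOUSE    = my_wh
  REFRESH_MODE = AUTO          -- Snowflake chooses incremental refresh where possible
AS
  SELECT user_id, DATE_TRUNC('day', event_ts) AS day, COUNT(*) AS events
  FROM raw_events
  GROUP BY user_id, DATE_TRUNC('day', event_ts);

-- Inspect the mode Snowflake actually selected:
SHOW DYNAMIC TABLES LIKE 'daily_totals';
```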

Take Advantage of the AI Data Cloud

Since streaming capabilities are deeply integrated with the AI Data Cloud, you can still enjoy the security and governance capabilities you’ve come to rely on through Snowflake Horizon.

High-Throughput, Low-Latency Streaming Data

Snowflake Openflow directly connects to streaming sources, including Apache Kafka and Amazon Kinesis, plus Kafka Sink*, so streaming data can flow into Snowflake and back to streaming systems. 

Thanks to the new Snowpipe Streaming integration with Openflow, streaming ingestion now delivers up to 10 GB/s of throughput, makes data queryable within five seconds, and supports inline transformation.


Adjust Latency with a Single Parameter Change

With Dynamic Tables, you can use SQL or Python to declaratively define data transformations. Snowflake will manage the dependencies and automatically materialize results based on your freshness targets. Dynamic Tables only operate on data that has changed since the last refresh, making high data volumes and complex pipelines simpler and more cost-efficient.

Easily adapt to evolving business needs by turning a batch pipeline into a streaming pipeline with a single latency parameter change.
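The parameter in question is the table's lag target. As a sketch, tightening it on a hypothetical table named `clean_events` is a one-line change:

```sql
-- Illustrative sketch: shift a batch-style pipeline toward streaming freshness.
ALTER DYNAMIC TABLE clean_events SET TARGET_LAG = '1 minute';  -- previously '1 hour'
```

No queries, schedules, or downstream consumers need to change; Snowflake simply refreshes more often to meet the new target.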

Bring Streaming to the Open Lakehouse

Snowflake’s streaming capabilities work with Apache Iceberg format to help you build an open lakehouse architecture easily with versatile processing options.

Snowflake Openflow persists data in Apache Iceberg format and supports Apache Polaris-based catalogs. You can then build low-latency, declarative processing with Dynamic Tables for both Snowflake-managed and unmanaged* Apache Iceberg tables.
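As a hedged sketch, a Dynamic Table can itself be a Snowflake-managed Iceberg table; the external volume, base location, and source table names below are hypothetical:

```sql
-- Illustrative sketch: all object names here are hypothetical.
CREATE OR REPLACE DYNAMIC ICEBERG TABLE events_iceberg
  TARGET_LAG      = '5 minutes'
  WAREHOUSE       = my_wh
  EXTERNAL_VOLUME = 'my_iceberg_volume'  -- points at your cloud storage
  CATALOG         = 'SNOWFLAKE'          -- Snowflake-managed Iceberg catalog
  BASE_LOCATION   = 'events_iceberg/'
AS
  SELECT * FROM raw_events;
```

The refreshed results land in open Iceberg format, so other engines can read them from the same storage.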
