Combining the concepts of streams and tasks to build pipelines that process changed data on a schedule
In this recipe, we will combine the concepts of streams and tasks to set up a scheduled Snowflake data pipeline that processes only changed data into a target table.
How to do it…
The following steps describe how to set up a stream to track and process changes made to table data. The steps are as follows:
- Let's start by creating a database and a staging table on which we will create our stream object. The staging table simulates data arriving from outside Snowflake and being processed further through a stream:
CREATE DATABASE stream_demo;
USE DATABASE stream_demo;
CREATE TABLE customer_staging
(
  ID INTEGER,
  Name STRING,
  State STRING,
  Country STRING
);
- Next, create a stream on the table that captures only inserts. The insert-only mode is achieved by setting APPEND_ONLY to TRUE when creating the stream, as shown in the sketch below.
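A minimal sketch of the stream creation follows; the stream name customer_changes is an assumption for illustration, not fixed by the recipe:

-- Append-only stream on the staging table; captures inserts only
-- (the stream name customer_changes is an assumed name)
CREATE STREAM customer_changes
  ON TABLE customer_staging
  APPEND_ONLY = TRUE;

With the stream in place, a task can consume it on a schedule and move new rows into a target table, which is the pattern this recipe builds toward. The following is a hedged sketch, assuming a target table named customer, a warehouse named compute_wh, and a five-minute schedule; adjust these names to your environment:

-- Target table mirroring the staging table's structure (assumed name)
CREATE TABLE customer LIKE customer_staging;

-- Scheduled task that runs only when the stream has unconsumed rows
CREATE TASK process_customer_changes
  WAREHOUSE = compute_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('customer_changes')
AS
INSERT INTO customer
SELECT ID, Name, State, Country
FROM customer_changes;

-- Tasks are created in a suspended state and must be resumed to start running
ALTER TASK process_customer_changes RESUME;

Because the INSERT ... SELECT reads from the stream inside a DML statement, it advances the stream's offset, so each scheduled run processes only the rows that arrived since the previous run.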