Tutorial: Create a streams flow from a Data Historian example flow
In this tutorial, you learn how to create and run an example streams flow. We supply the Data Historian example streams flow and the sample data. You don’t need to configure anything.
This tutorial is a high-level, bird's-eye view of a streams flow that takes approximately 10 minutes to finish. Other tutorials provide an in-depth examination of the canvas.
The sample data is taken from five weather stations. The data includes weather station ID, time zone, date in Coordinated Universal Time (UTC) format, latitude, longitude, temperature, barometric pressure, humidity, indoor temperature, and rainfall today.
The Data Historian example streams flow has two Aggregation operators:
The first Aggregation operator partitions the incoming data by weather station ID. Consequently, each weather station has a partition. Within each partition, the data is grouped by weather station. As a result, every partition has one group.
Every 60 seconds, the data “tumbles out” and a designated function is applied to data in each group. For example, the Average function is applied to rainfall data, but the Min function is applied to the barometric pressure data.
The second Aggregation operator ingests the output of the first Aggregation operator. It partitions and groups the data just like the first Aggregation operator, but the data “tumbles out” every 180 seconds.
Output data from the second Aggregation operator is stored in Cloud Object Storage (COS) for further analysis later.
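To make the windowing behavior concrete, here is a minimal sketch in plain Python (the function and field names are illustrative assumptions, not the actual operator API). It shows one "tumble": the events collected during a window are partitioned by station ID, and a different function is applied per field, Average to rainfall and Min to barometric pressure.

```python
from collections import defaultdict

def tumble(events):
    # Partition one window's worth of events (e.g., 60 seconds) by station ID.
    groups = defaultdict(list)
    for e in events:
        groups[e["station_id"]].append(e)
    # Apply a designated function to each field in every group:
    # Average for rainfall, Min for barometric pressure.
    results = []
    for station, rows in groups.items():
        results.append({
            "station_id": station,
            "rainfall_avg": sum(r["rainfall"] for r in rows) / len(rows),
            "pressure_min": min(r["pressure"] for r in rows),
        })
    return results

# Sample events from two hypothetical stations within one window.
window = [
    {"station_id": "ws1", "rainfall": 0.2, "pressure": 1013},
    {"station_id": "ws1", "rainfall": 0.4, "pressure": 1011},
    {"station_id": "ws2", "rainfall": 0.0, "pressure": 1009},
]
out = tumble(window)
```

The second Aggregation operator works the same way on the first operator's output, just with a 180-second window.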
Watch this video to see how to create and run a simple streams flow by using an example flow and sample data.
Now it’s your turn - try out the following tutorial steps in your own environment.
You must have a Cloud Object Storage (COS) instance and a Streaming Analytics service instance that is associated with the project where the streams flow runs. Go to the project's Settings page.
In the Storage section of the page, check that a COS instance is listed.
In the Associated Services section of the page, check that a Streaming Analytics service is listed.
To provision either instance, go to your account in the IBM Cloud Dashboard. Click Create resource, and then follow the prompts.
Watch this video to see how to provision the services necessary to create, edit, and run a streams flow.
Create the example streams flow
Create a streams flow by using the Data Historian example streams flow and its sample data.
From the Projects menu, click View All Projects. Click the name of the project where you want to put your new streams flow.
On the Project page, click the **Assets** tab. In the Streams flow area, click **New streams flow**. ![View project](images/TUTORIAL_DH_PROJECT.gif)
On the New Streams Flow page, click the From example tab.
Do the following steps:
Leave the Name and Description fields blank; the example streams flow completes them automatically.
In the Streaming Analytics service list, the Streaming Analytics service that is associated with the project is already selected. Here’s an example:
In the Select Example area, click the Data Historian Example box. Note that the Name and Description fields are now filled.
In the Connection list, the COS instance that is associated with the project is already listed. In the following example, we select the COS instance “CloudObjectStorage-2w”.
In the File path field, click the icon to open the Select Data Asset window. Select a bucket, and then click Select. In this example, we select a bucket called “datahistorian”.
In the File path field, append the string /%TIME. The variable %TIME appends the time to the file name to make it unique. Add the file extension .parquet because partitioning uses Parquet files.
Here is an example file path for a bucket.
In the New Streams Flow page, click Create.
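The steps above can be summed up with a rough sketch of why %TIME makes each output file unique: the placeholder can be thought of as expanding to a timestamp at write time. The function below is illustrative only and is not the service's actual implementation.

```python
import time

def expand_time_placeholder(path: str) -> str:
    # Substitute a UTC timestamp (e.g., 20240101120000) for the %TIME token,
    # so files written at different times get distinct names.
    stamp = time.strftime("%Y%m%d%H%M%S", time.gmtime())
    return path.replace("%TIME", stamp)

# Hypothetical file path built from the "datahistorian" bucket in this tutorial.
path = expand_time_placeholder("datahistorian/output/%TIME.parquet")
```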
Run the example streams flow
Your new Data Historian example streams flow is automatically shown in the Metrics page.
Click the Run icon to start the streams flow. The Status indicator reflects the changing stages as the streams flow is deployed.
Notice that until the status is “Running”, the streams flow is static and uses arrows to connect operators.
When the status is “Running”, you can see the data as it flows between operators. Put your mouse pointer over a data flow to get real-time metrics. Click the data flow to see the events flowing to the next operator.
You just created a streams flow from the Data Historian example flow and its sample data. You started the streams flow in the Metrics page.
Learn more about the Data Historian example streams flow (its scope, its operators, and its output) in the topic Data Historian example streams flow.
For more information about specific features and operators, check out our other tutorials for streams flow.