Tutorial: Create a streams flow from a Data Historian example flow

Learning objective

In this tutorial, you learn how to create and run an example streams flow. We supply the Data Historian example streams flow and the sample data. You don’t need to configure anything.

This tutorial gives a high-level, bird’s-eye view of a streams flow and takes approximately 10 minutes to complete. Other tutorials provide an in-depth look at the canvas, the Metrics page, and the operators.

Overview

The sample data is taken from five weather stations. The data includes weather station ID, time zone, date in Coordinated Universal Time (UTC) format, latitude, longitude, temperature, barometric pressure, humidity, indoor temperature, and rainfall today.

The Data Historian example streams flow has two Aggregation operators:

  • The first Aggregation operator partitions the incoming data by weather station ID. Consequently, each weather station has a partition. Within each partition, the data is grouped by weather station. As a result, every partition has one group.

    Every 60 seconds, the data “tumbles out” and a designated function is applied to data in each group. For example, the Average function is applied to rainfall data, but the Min function is applied to the barometric pressure data.

  • The second Aggregation operator ingests the output of the first Aggregation operator. It partitions and groups the data just like the first Aggregation operator, but the data “tumbles out” every 180 seconds.

    Output data from the second Aggregation operator is stored in Cloud Object Storage (COS) for further analysis.
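The tumbling-window aggregation described above can be sketched in plain Python. This is an illustrative model only, not the operator’s actual implementation: the station IDs, readings, and timestamp layout are invented sample values, and the real operator handles streaming data rather than an in-memory list.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical sample events: (station_id, timestamp_seconds, rainfall, pressure)
events = [
    ("ws1",   5, 0.2, 1012.0),
    ("ws2",  12, 0.0, 1009.5),
    ("ws1",  47, 0.4, 1011.2),
    ("ws2",  95, 0.1, 1008.8),
    ("ws1", 130, 0.6, 1010.7),
]

def tumble(events, window_seconds):
    """Partition events by station ID into tumbling windows, then apply
    a designated function per field: Average to rainfall, Min to
    barometric pressure (mirroring the functions named in the tutorial)."""
    windows = defaultdict(list)  # key: (station_id, window index)
    for station, ts, rain, pressure in events:
        windows[(station, ts // window_seconds)].append((rain, pressure))
    results = []
    for (station, idx), rows in sorted(windows.items()):
        rains = [r for r, _ in rows]
        pressures = [p for _, p in rows]
        results.append((station, idx, mean(rains), min(pressures)))
    return results

# First Aggregation operator: data "tumbles out" every 60 seconds
first_stage = tumble(events, 60)
for station, idx, avg_rain, min_pressure in first_stage:
    print(station, idx, round(avg_rain, 2), min_pressure)
```

The second Aggregation operator behaves the same way but with a 180-second window over the first stage’s output; in this sketch that would be another `tumble`-style pass with `window_seconds=180`.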

Preview

Watch this video to see how to create and run a simple streams flow by using an example flow and sample data.

Figure 1. Create a Streams Flow based on the Data Historian Example
This video demonstrates how to create and run a simple streams flow.

Now it’s your turn: try out the following tutorial steps in your own environment.

Prerequisites

You must have a Cloud Object Storage (COS) instance and a Streaming Analytics service instance that is associated with the project where the streams flow runs. To check, go to the project page, and then click the Settings tab.

Settings tab of a project

  • In the Storage section of the page, check that a COS instance is listed.

  • In the Associated Services section of the page, check that a Streaming Analytics service is listed.

To provision either instance, go to your account in IBM Cloud Dashboard. Click Create resource, and then follow the prompts.

Create the example streams flow

Create a streams flow by using the Data Historian example streams flow and its sample data.

  1. From the Projects menu, click View All Projects. Click the name of the project where you want to put your new streams flow.

View all projects

  2. In the Project page, click the Assets tab. In the Streams flow area, click New streams flow.

    View project

  3. In the New Streams Flow page, click the From example tab.

    New stream From Example

  4. Perform the following steps:

    • Leave the Name and Description fields blank; they are completed automatically when you select the example.
    • In the Streaming Analytics service list, the Streaming Analytics service that is associated with the project is already selected. Here’s an example: Streaming Analytics service

    • In the Select Example area, click the Data Historian Example box. Note that the Name and Description fields are now filled.
    • In the Connection list, the COS instance that is associated with the project is already selected. In the following example, we select the COS instance “CloudObjectStorage-2w”.

    • In the File path field, click the Data assets icon to open the Select Data Asset window. Select a bucket, and then click Select.

      In this example, let’s select a bucket called “datahistorian”.
      Select bucket for Data Historian

    • In the File path field, append the string /%TIME. The %TIME variable appends the current time to the file name to make it unique. Add the file extension .parquet because partitioning uses Parquet files. Here is an example file path for a bucket.
      File path for bucket

    • In the New Streams Flow page, click Create.
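To illustrate why the %TIME variable makes each output file name unique, here is a minimal sketch of the substitution in Python. The timestamp format and the `expand_time_variable` helper are assumptions for illustration; the service defines its own format when it expands %TIME.

```python
from datetime import datetime, timezone

def expand_time_variable(path_template, now=None):
    """Replace the %TIME placeholder with a UTC timestamp so each
    written file gets a unique name. The format string here is an
    assumption, not the service's actual format."""
    now = now or datetime.now(timezone.utc)
    return path_template.replace("%TIME", now.strftime("%Y%m%d-%H%M%S"))

# Hypothetical bucket path resembling the tutorial's example
template = "datahistorian/%TIME.parquet"
fixed = datetime(2024, 1, 2, 3, 4, 5, tzinfo=timezone.utc)
print(expand_time_variable(template, fixed))  # datahistorian/20240102-030405.parquet
```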

Run the example streams flow

Your new Data Historian example streams flow is automatically shown in the Metrics page. The Status indicator shows that it is in a Stopped state.

Click the Run icon to start the streams flow. The Status indicator reflects the changing stages as the streams flow is deployed.

New streams flow in Stopped state

Notice that until the status is “Running”, the streams flow is shown as static, with arrows connecting the operators.

When the status is “Running”, you can see the data as it flows between operators. Put your mouse pointer over a data flow to get real-time metrics. Click the data flow to see the events flowing to the next operator.

New streams flow in Running state

Summary

You just created a streams flow from the Data Historian example flow and its sample data. You started the streams flow in the Metrics page and watched weather station data flow between operators.

Learn more

Learn more about the Data Historian example streams flow, including its scope, its operators, and its output, in the topic Data Historian example streams flow.

For more information about specific features and operators, check out our other tutorials for streams flow.