Last updated: Dec 09, 2024
DataStage test cases are design-time assets that use data files to define the inputs and expected outputs of your DataStage flows.
A test case consists of input data files, output data files, and a test specification.
Creating a test case
To create a test case for your flow, click the flask-shaped Test cases icon in the toolbar of the DataStage design canvas. You can also create a test case from the Assets tab of your project.
Specify which input and output links to use as stubbed links. Test data will be injected into the input links, and the data from the output links will be compared to the expected output. You can also specify the names of parameters or parameter sets for which your test will supply its own values.
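The parameter behavior above amounts to a simple override: values supplied by the test case take precedence over the flow's own defaults at run time. A minimal sketch of that semantics (the function name and default values are illustrative, not part of any DataStage API):

```python
def resolve_parameters(flow_defaults: dict, test_values: dict) -> dict:
    """Merge parameter values: test-supplied values override flow defaults."""
    return {**flow_defaults, **test_values}

# Hypothetical flow defaults, overridden by the test case's values.
merged = resolve_parameters(
    {"DSJobStartDate": "2000-01-01", "paramStartKey": "1"},
    {"DSJobStartDate": "2012-01-15", "paramStartKey": "100"},
)
print(merged)  # → {'DSJobStartDate': '2012-01-15', 'paramStartKey': '100'}
```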
When you create a test case, DataStage creates an empty test data file for each source and target link in your flow. DataStage also creates a test specification with references to each test data file and all parameters in your flow. The test case specification is a JSON file.
See the following example:
{
  …
  "given": [
    {
      # Inject this test data file into stageA.linkA
      "path": "fileCustomers.csv",
      "stage": "sfCustomers",
      "link": "Customers"
    },
    {
      # Inject this test data file into stageB.linkB
      "path": "fileOrders.csv",
      "stage": "sfOrders",
      "link": "Orders"
    }
  ],
  "when": {
    # Execute the test case with these options and hardcoded parameter values
    # An internally-generated reference to the flow with which this test is associated
    "data_intg_flow_ref": "3023970f-ba2dfb02bd3a",
    "parameters": {
      "DSJobStartDate": "2012-01-15",
      "DSJobStartTime": "11:05:01",
      "paramStartKey": "100"
    }
  },
  "then": [
    {
      # Expect flow output on stageX.linkX to look like this test data file
      "path": "ODBC_customers.csv",
      "stage": "ODBC_customer",
      "link": "customer_out"
    },
    {
      # Expect flow output on stageY.linkY to look like this test data file
      "path": "ODBC_orders.csv",
      "stage": "ODBC_order",
      "link": "order_out"
    }
  ]
}
For more information on the spec, see the MettleCI documentation on the Unit test specification format.
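Note that the `#` comment lines shown in the example are documentation annotations, not strict JSON. If you want to inspect such a spec with an ordinary JSON parser, one approach is to strip comment lines first. A minimal sketch (the `load_test_spec` helper is hypothetical, not a DataStage or MettleCI function):

```python
import json

def load_test_spec(text: str) -> dict:
    """Drop '#' comment lines (not valid JSON), then parse the rest."""
    cleaned = "\n".join(
        line for line in text.splitlines()
        if not line.lstrip().startswith("#")
    )
    return json.loads(cleaned)

# A trimmed-down spec in the same shape as the example above.
spec_text = """
{
  "given": [
    {
      # Inject this test data file into sfCustomers.Customers
      "path": "fileCustomers.csv",
      "stage": "sfCustomers",
      "link": "Customers"
    }
  ],
  "when": {
    "data_intg_flow_ref": "3023970f-ba2dfb02bd3a",
    "parameters": {"DSJobStartDate": "2012-01-15"}
  },
  "then": [
    {"path": "ODBC_customers.csv", "stage": "ODBC_customer", "link": "customer_out"}
  ]
}
"""

spec = load_test_spec(spec_text)
print(sorted(spec))  # → ['given', 'then', 'when']
```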
Editing a test case
Click Test cases to edit your specifications, test data and metadata. Click Settings to specify test settings, including scheduling and record storage.
Running a test case
Open Test cases and click the run icon for a specific test case to run it. During execution, the input data in your flow is replaced by your input test data, and the data on your output links is compared to your output test data. If the actual output differs from the expected output, the test fails and a difference report is generated. To view the results of a specific test, open Test cases and click the timestamp of that run.
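Conceptually, the comparison step matches actual output rows against the expected test data file, row by row and column by column. The sketch below illustrates that idea for CSV data; the `diff_report` function and its output format are illustrative only and do not reproduce DataStage's actual difference report:

```python
import csv
import io

def diff_report(expected_csv: str, actual_csv: str) -> list:
    """Compare expected vs. actual rows; return a list of human-readable differences."""
    expected = list(csv.DictReader(io.StringIO(expected_csv)))
    actual = list(csv.DictReader(io.StringIO(actual_csv)))
    report = []
    if len(expected) != len(actual):
        report.append(f"row count: expected {len(expected)}, got {len(actual)}")
    for i, (exp, act) in enumerate(zip(expected, actual)):
        for col, exp_val in exp.items():
            if act.get(col) != exp_val:
                report.append(
                    f"row {i}, column {col!r}: expected {exp_val!r}, got {act.get(col)!r}"
                )
    return report

expected = "id,name\n1,Alice\n2,Bob\n"
actual = "id,name\n1,Alice\n2,Bobby\n"
print(diff_report(expected, actual))
# → ["row 1, column 'name': expected 'Bob', got 'Bobby'"]
```

An empty report means the actual output matched the expected output and the test passes.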
For more details on unit testing, see the MettleCI docs.