The Classification and Regression (C&R) Tree node generates a decision
tree that allows you to predict or classify future observations. The method uses recursive
partitioning to split the training records into segments by minimizing the impurity at each step,
where a node in the tree is considered "pure" if 100% of cases in the node fall into a specific
category of the target field. Target and input fields can be numeric ranges or categorical (nominal,
ordinal, or flags); all splits are binary (only two subgroups).
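The impurity-minimizing binary split described above can be illustrated with a short sketch. This is not the C&R Tree implementation itself, just a hedged demonstration of the idea using Gini impurity (one common impurity measure for categorical targets); the function names are illustrative:

```python
from collections import Counter

def gini_impurity(labels):
    """Gini impurity of a node: 1 - sum(p_k^2) over target categories.
    A node is "pure" (impurity 0) when all cases fall in one category."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def best_binary_split(values, labels):
    """Scan candidate thresholds on a numeric input and return the one
    whose binary split minimizes the weighted impurity of the two
    child segments, as in recursive partitioning."""
    best = (None, float("inf"))
    for threshold in sorted(set(values))[:-1]:
        left = [l for v, l in zip(values, labels) if v <= threshold]
        right = [l for v, l in zip(values, labels) if v > threshold]
        weighted = (len(left) * gini_impurity(left) +
                    len(right) * gini_impurity(right)) / len(labels)
        if weighted < best[1]:
            best = (threshold, weighted)
    return best

# A pure node has impurity 0; a 50/50 node has impurity 0.5.
print(gini_impurity(["yes", "yes"]))   # 0.0
print(gini_impurity(["yes", "no"]))    # 0.5
# Splitting at 2 separates the classes perfectly (weighted impurity 0).
print(best_binary_split([1, 2, 10, 11], ["no", "no", "yes", "yes"]))  # (2, 0.0)
```

Growing the tree repeats this search over all inputs at each node, always producing exactly two subgroups per split.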
C&R Tree
models require a single target and one or more input fields. A frequency
field can also be specified. See the topic Common modeling node properties for more information.
continue_training_existing_model (flag)

objective (Standard, Boosting, Bagging, psm): psm is used for very large datasets, and requires a Server connection.

model_output_type (Single, InteractiveBuilder)

use_tree_directives (flag)

tree_directives (string): Specify directives for growing the tree. Directives can be wrapped in triple quotes to avoid escaping newlines or quotes. Note that directives may be highly sensitive to minor changes in data or modeling options and may not generalize to other datasets.

use_max_depth (Default, Custom)

max_depth (integer): Maximum tree depth, from 0 to 1000. Used only if use_max_depth = Custom.

prune_tree (flag): Prune tree to avoid overfitting.

use_std_err (flag): Use maximum difference in risk (in Standard Errors).

std_err_multiplier (number): Maximum difference.

max_surrogates (number): Maximum surrogates.

use_percentage (flag)

min_parent_records_pc (number)

min_child_records_pc (number)

min_parent_records_abs (number)

min_child_records_abs (number)

use_costs (flag)

costs (structured): Structured property.

priors (Data, Equal, Custom)

custom_priors (structured): Structured property.

adjust_priors (flag)

trails (number): Number of component models for boosting or bagging.
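In a Modeler script, these properties are set on the node with setPropertyValue. The following is a minimal sketch, assuming the usual Modeler scripting environment (it runs only inside SPSS Modeler, and the node name and placement coordinates are illustrative):

```python
# Illustrative only: requires the SPSS Modeler scripting runtime.
import modeler.api

stream = modeler.script.stream()
node = stream.createAt("cart", "C&R Tree", 200, 100)  # position is arbitrary

# Grow to a custom maximum depth, then prune using the risk rule.
node.setPropertyValue("use_max_depth", "Custom")
node.setPropertyValue("max_depth", 8)
node.setPropertyValue("prune_tree", True)
node.setPropertyValue("use_std_err", True)
node.setPropertyValue("std_err_multiplier", 1.0)
```

Values shown for flag properties are Python booleans; enumerated values such as "Custom" are passed as strings.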