Gradient Boosting
A Gradient Boosting model can be used both for classification and for regression. Like the Random Forest, it belongs to the ensemble tree methods. Compared to an individual Decision Tree, a Gradient Boosting model can improve prediction accuracy, at the cost of the model no longer being simple to explain. Random Forest and Gradient Boosting differ from each other in the way the individual trees are generated.
Supported properties
ONNX support
- TreeEnsembleClassifier
- TreeEnsembleRegressor
Samples of the export of Gradient Boosting models can be found here: ONNX export of Gradient Boosting.
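As an illustration, the following is a minimal export sketch, assuming scikit-learn and skl2onnx are installed; the data, model parameters and file name are purely illustrative and are not taken from the linked samples.

```python
# Minimal export sketch (assumption: scikit-learn and skl2onnx are installed;
# the data, parameters and file name are illustrative only).
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from skl2onnx import convert_sklearn
from skl2onnx.common.data_types import FloatTensorType

# Train a simple Gradient Boosting classifier on synthetic data.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
clf = GradientBoostingClassifier(n_estimators=50, max_depth=3, random_state=0)
clf.fit(X, y)

# Convert to ONNX; the classifier is exported as a TreeEnsembleClassifier node.
onnx_model = convert_sklearn(
    clf,
    initial_types=[("input", FloatTensorType([None, 4]))],
)
with open("gradient_boosting_classifier.onnx", "wb") as f:
    f.write(onnx_model.SerializeToString())
```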
Classification limitation
With classification models, only the output of the labels is mapped in the PLC. The scores/probabilities are not available in the PLC.
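To see which outputs such a classifier model exposes on the ONNX side, the exported graph can be inspected, for example as in the following sketch (assuming the file from the export sketch above exists; the output names depend on the exporter used).

```python
# Sketch: list the graph outputs of the exported classifier model
# (assumption: the file from the export sketch above exists).
import onnx

model = onnx.load("gradient_boosting_classifier.onnx")
for output in model.graph.output:
    print(output.name)

# With skl2onnx the outputs are typically named "output_label" and
# "output_probability"; only the label output is mapped in the PLC,
# the probabilities/scores remain unavailable there.
```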
Supported data types
A distinction must be made between "supported data type" and "preferred data type". The preferred data type corresponds to the precision of the execution engine.
The preferred data type is floating point 64 (E_MLLDT_FP64-LREAL).
When using a supported data type, an efficient type conversion takes place automatically in the library. Slight performance losses can occur due to the type conversion.
A list of the supported data types can be found in ETcMllDataType.