Open Neural Network Exchange (ONNX)
What is ONNX?
ONNX is an open file format for representing machine learning models and is maintained as a community project. The homepage of the ONNX community is onnx.ai.
The ONNX standard defines standardized operator sets, so that trained models can be exchanged and used interoperably across different frameworks, runtimes and other tools.
ONNX can describe neural networks as well as classic machine learning models and is therefore a suitable format for both the TwinCAT Machine Learning Inference Engine and the TwinCAT Neural Network Inference Engine.
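To illustrate the standardized operator sets, the following minimal sketch uses the onnx Python package to load a model file and list the opset and operators it relies on. The file name model.onnx is a placeholder for any exported model.

```python
# Minimal sketch: inspect an ONNX file with the onnx Python package.
# "model.onnx" is a placeholder for any exported model file.
import onnx

model = onnx.load("model.onnx")

# Verify that the file is a structurally valid ONNX model.
onnx.checker.check_model(model)

# The opset version pins the standardized operator set the model uses.
for opset in model.opset_import:
    print("domain:", opset.domain or "ai.onnx", "opset version:", opset.version)

# List the operators (graph nodes) that make up the model.
for node in model.graph.node:
    print(node.op_type, node.name)
```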
Why ONNX?
By supporting ONNX, Beckhoff integrates the TwinCAT Machine Learning products in an open manner and thus enables flexible workflows. The automation specialist can work in TwinCAT 3, while the data scientist works with their usual tools (PyTorch, Scikit-learn, ...).
The use of ONNX facilitates collaboration across workgroups, both within a company and with external partners. The automation specialist provides the data scientist with recorded data. The data scientist creates an ML model and hands the result over to the automation specialist as an ONNX file. This file already contains all the information needed to execute the model in TwinCAT.
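A minimal sketch of this hand-over step, assuming a scikit-learn classifier and the skl2onnx converter; the data set, input shape and file name are placeholders:

```python
# Minimal sketch: train a classic ML model and hand it over as an ONNX file.
# Assumes scikit-learn and skl2onnx; data set, input shape and file name
# are placeholders.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from skl2onnx import convert_sklearn
from skl2onnx.common.data_types import FloatTensorType

X, y = load_iris(return_X_y=True)
clf = RandomForestClassifier(n_estimators=10).fit(X, y)

# Convert the trained model; the initial type fixes the input name and shape.
onnx_model = convert_sklearn(
    clf, initial_types=[("float_input", FloatTensorType([None, 4]))]
)

# This file contains everything needed to execute the model elsewhere.
with open("classifier.onnx", "wb") as f:
    f.write(onnx_model.SerializeToString())
```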
Offline testing of models is also simplified, because all common AI frameworks and runtimes can load and execute ONNX files.
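For example, an exported file can be tested offline with ONNX Runtime before it is deployed; the file name and input shape below are placeholders:

```python
# Minimal sketch: load and execute an ONNX file with ONNX Runtime
# for offline testing. "classifier.onnx" is a placeholder file name.
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("classifier.onnx", providers=["CPUExecutionProvider"])

# Query the expected input name from the model itself.
input_name = sess.get_inputs()[0].name

# Run the model on a dummy sample with the expected shape.
sample = np.random.rand(1, 4).astype(np.float32)
outputs = sess.run(None, {input_name: sample})
print(outputs)
```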
Which software supports ONNX?
A list of tools that support ONNX is maintained by the ONNX community at onnx.ai/supported-tools.
These include, for example, the following frameworks (an example export from PyTorch is sketched after the list):
- PyTorch
- Keras/TensorFlow
- MXNet
- Scikit-learn
- …
Graph Optimizer
Graph Visualizer
- Netron (https://github.com/lutzroeder/Netron)
- …