Batch files for control
The console client can be used in batch files to control the Analytics Storage Provider; a combined batch file sketch is shown at the end of this section. The following parameters are provided for this purpose:
-Help / -H / -? | Returns a description of all parameters. |
Parameters for the configuration settings:
-CreateASPConfig | Create a new Analytics Storage Provider settings XML. |
-MainTopic <mainTopic> | Analytics Storage Provider Main Topic. |
-Comment <comment> | Analytics Storage Provider comment. |
-EventLogTrace <True|False> | Trace to the event log. |
-DebugLog <True|False> | Additional DebugLog. |
-StorageType <type> | Storage type (ANALYTICSFILE, AZURESQL, AZUREBLOB). |
-StorageConnString <connString> | Connection string or path to storage. |
-TlsType <Tls1.0|Tls1.1|Tls1.2> | TLS version (for AZUREBLOB). |
-MaxDuration <duration (sec)> | Maximum duration of a TAY file. |
-MaxWriteLen <writeLen (bytes)> | Maximum length of a data packet. |
Configuration parameters:
-LocalProvider | Use the connection settings of the locally installed Analytics Storage Provider. |
-ConfigFile <path> | Use all settings from the configuration file of an Analytics Storage Provider Recorder window. |
-ProviderGuid <guid> | Guid of the Analytics Storage Provider to be used. |
-ConfigCmdID <id> | ID number of the preconfigured recording in the configuration file. |
-ConfigCmdAlias <alias> | Alias of the preconfigured recording in the configuration file. |
Connection parameters:
-Broker / -Host <hostname> | Host name or IP address of the broker used. |
-Port <port> | Broker port (default value: 1883). |
-User <username> | Username for the connection. |
-Password / -Pwd <password> | Password for the connection. |
-CA <path> | Path to the CA certificate for the connection. |
-Cert <path> | Path to the certificate for the connection. |
-Key_Cert <path> | Path to the key file for the connection. |
-Key_Pwd <password> | Password for the key file for the connection. |
Function parameters:
-StartRecord | Sends a StartRecord command. |
-StopRecord | Sends a StopRecord command. |
-IsRecordingActive | Checks whether a recording is currently active. |
-GetHistorical | Sends a GetHistoricalData command. |
-StopHistorical | Sends a StopHistoricalData command. |
-UpdateHistorical | Sends a HistoricalUpdate command. |
-CancelAllRec | Sends a Cancel command to all active recordings. |
-CancelAllHist | Sends a Cancel command to all active historical data streams. |
-StartPipeline | Sends a StartRuleEngine pipeline command. |
-StopPipeline | Sends a StopRuleEngine pipeline command. |
-RestartRule | Sends a RestartRule command. |
-DeleteRecordingsOlderThan | Deletes recordings whose end time is older than a specified timestamp. Optionally, the topic of a historical stream can also be specified; only active historical streams are taken into account. |
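-DataImport | Sends a DataImport command (see the data import parameters below). |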
Recording start/stop parameters:
-Alias <alias> | Alias name of the recording. |
-RecName <record> | Record name of the data set. |
-Topic <topic> | Topic to be included. |
-DataFormat <Bin|Json> | Data format of the live data stream. |
-Duration <seconds> | Recording duration. |
-Ringbuffer <None|TimeBased|DataBased> | Ring buffer mode (default value: None). |
-RinbufferPara <minutes/MB> | Parameter for the ring buffer (in minutes or megabytes). |
-Mode <All|Subset> | Recording mode: records all symbols or only a subset of the symbols. |
-Symbols / -Sym <Symbol1,Symbol2> | Symbol subset as a comma-separated list. |
-RecorderGuid <guid> | Guid of the Analytics Storage Provider Recorder window. |
-Storage <guid> | Guid of the storage to be written to. |
-SubBroker <guid> | Guid of the sub broker from which the data is to be recorded. |
Historical data stream start/stop parameters:
-SystemID <systemID guid> | System ID of the recorded data set. |
-Topic <topic> | Topic of the recorded data set. |
-Layout <layout guid> | Layout of the recorded data set. |
-RecordID <id> | ID of the data set to be streamed. |
-StartTime <time ns> | Start time of the data set to be streamed in nanoseconds. |
-EndTime <time ns> | End time of the data set to be streamed in nanoseconds. |
-MaxSamples <samples> | Maximum number of samples (default value: 5000). |
-UsrSampleTime <ms> | Sampling rate in milliseconds (default value: -1 = sampling rate of the recording). |
-DataFormat <Bin|Json> | Data format of the data stream. |
-ResultTopic <topic> | Result MQTT topic to which the data will be streamed. |
-Mode <All|Subset> | Streaming mode. Streams all or a subset of the symbols. |
-Symbols / -Sym <Symbol1,Symbol2> | Symbol subset as a comma-separated list. |
Historical data stream update parameters:
-MaxSamples <samples> | Maximum number of samples (default value: 5000). |
-UsrSampleTime <ms> | Sampling rate in milliseconds (default value: -1 = sampling rate of the recording). |
-MaxPackSize <kB> | Maximum message size in kilobytes. |
-SendDuration <ms> | Waiting time between sending messages in milliseconds. |
-ResultTopic <topic> | Result MQTT topic to which the data will be streamed. |
RuleEngine pipeline parameters:
-PipelineGuid <guid> | Guid of the RuleEngine pipeline. |
-RuleID <id> | ID of the rule within a RuleEngine pipeline. |
Delete recordings parameters:
-DateTimeOlderThan <datetime> | Timestamp in the format "yyyy-MM-dd hh:mm". Any recording with an end time older than this timestamp will be deleted. |
-HistoricalStreamTopic <topic> | Topic of the historical stream (optional). |
Data import parameters:
-SourcePath <Path> | Path to the folder with the files to be imported. |
-StorageType <StorageType> | Storage type: AnalyticsFile, Apache_IoTDB, AzureBlob, CSVFile, InfluxDB_Plain, MsSQL, MsSQL_Plain, PostgreSQL. |
-Storage <StorageGuid> | Guid of the Storage. |
-CopyData <true|false> | Copy (true) or move (false) the data. |
-Topic <string> | Topic name of the data to be imported. |
-Alias <string> | Alias name of the data to be imported. |
-RecName <string> | Record name of the data to be imported. |
-SystemID <Guid> | System ID of the data to be imported. |
-SysIDAlias <string> | System alias of the data to be imported. |
-Address <string> | Address of the data to be imported. |
-Latitude <double> | Latitude of the data to be imported. |
-Longitude <double> | Longitude of the data to be imported. |
Command line samples:
Create configuration:
TwinCAT.Analytics.StorageProvider.Client
-CreateASPConfig
-MainTopic Beckhoff/ASPTest
-Comment "Analytics Storage Provider (Test)"
-EventLogTrace False
-DebugLog False
-StorageType ANALYTICSFILE
-StorageConnString C:\TwinCAT\Functions\TF3520-Analytics-StorageProvider\Storage
-MaxDuration 120
-MaxWriteLen 2048
-Broker 172.17.62.135
-Port 1883
-User tcanalytics
-Pwd 123
Start a recording with the local Analytics Storage Provider:
TwinCAT.Analytics.StorageProvider.Client
-localprovider
-startrecord
-alias cmdTest
-recname cmdRec1
-topic TestSignals/TestStream
-dataformat Bin
-Duration 30
-mode Subset
-Symbols Variables.fCosine,Variables.fSine
Start a preconfigured recording from a configuration file:
TwinCAT.Analytics.StorageProvider.Client
-ConfigFile "C:\Users\User\AppData\Roaming\Beckhoff\TwinCAT Analytics Storage Provider\TcAnalyticsStorageProvider_Recorder.xml"
-ProviderGuid 76141a7f-e580-4281-99d8-1b8a75ca014d
-startrecord
-ConfigCmdAlias cmdTest
Check recording status:
TwinCAT.Analytics.StorageProvider.Client
-Broker 172.17.62.135
-Port 1883
-User tcanalytics
-Pwd 123
-ProviderGuid 76141a7f-e580-4281-99d8-1b8a75ca014d
-IsRecordingActive
-alias cmdTest
-recorderGuid a8e171d2-712d-bd8e-da15-7eef28b71ad2
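In a batch file, this status query can be repeated until the recording has finished. The following loop is a minimal sketch, assuming the client prints the result of -IsRecordingActive to the console; the findstr search string is a placeholder and must be adapted to the actual client output:
@echo off
rem Minimal polling sketch (assumption: the client writes the status to stdout).
:poll
TwinCAT.Analytics.StorageProvider.Client -Broker 172.17.62.135 -Port 1883 -User tcanalytics -Pwd 123 -ProviderGuid 76141a7f-e580-4281-99d8-1b8a75ca014d -IsRecordingActive -alias cmdTest | findstr /i "true" >nul
if %errorlevel%==0 (
    rem A recording is still active; wait 5 seconds and query again.
    timeout /t 5 /nobreak >nul
    goto poll
)
echo Recording finished.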
Stop all recordings:
TwinCAT.Analytics.StorageProvider.Client
-Broker 172.17.62.135
-Port 1883
-User tcanalytics
-Pwd 123
-ProviderGuid 76141a7f-e580-4281-99d8-1b8a75ca014d
-CancelAllRec
Start historical data stream:
TwinCAT.Analytics.StorageProvider.Client
-localprovider
-GetHistorical
-systemID c29ac2d4-76ce-ff44-4d7f-355ffbcca6bf
-layout 9a8e171d-712d-bd8e-da15-7eef28b71ad2
-topic TestSignals/TestStream
-recordID 1
-startTime 132696863612730000
-endTime 132696864177720000
-maxSamples 5000
-usrSampleTime -1
-resultTopic _TestSignals/TestStream/123
-dataformat Bin
-mode Subset -symbols Variables.fSine
Start RuleEngine pipeline:
TwinCAT.Analytics.StorageProvider.Client
-localprovider
-StartPipeline
-PipelineGuid d00c5366-4cf5-4d4e-a2f6-9dbe759e9dd2
Stop RuleEngine pipeline:
TwinCAT.Analytics.StorageProvider.Client
-localprovider
-StopPipeline
-PipelineGuid d00c5366-4cf5-4d4e-a2f6-9dbe759e9dd2
Restart a specific rule of a RuleEngine pipeline:
TwinCAT.Analytics.StorageProvider.Client
-localprovider
-RestartRule
-PipelineGuid d00c5366-4cf5-4d4e-a2f6-9dbe759e9dd2
-RuleID 2
Delete old recordings:
TwinCAT.Analytics.StorageProvider.Client
-localprovider
-DeleteRecordingsOlderThan
-DateTimeOlderThan "yyyy-MM-dd 00:00"
-HistoricalStreamTopic Beckhoff/TcAnalyticsStorageProvider/41cfa2be-ca72-4145-9e37-875851502aa6/Historical/Stream_65
Import data:
TwinCAT.Analytics.StorageProvider.Client
-DataImport
-SourcePath C:\temp\ED6A9F45-04D7-2D3A-7834-D3D1CF5EB21D
-StorageType CSVFile
-Storage e5a61c3d-dd98-40fc-a63f-4c41f6f19729
-CopyData true
-Topic AnalyticsStorageProvider/UnknownAnalyticsFile
-Alias Unknown
-RecName Record
-SystemID 53fae9bf-03fa-48ac-81e7-74f042eec6c2
-SysIDAlias "Unknown AnalyticsFile"
-Address TestAddress
-Latitude 1.0
-Longitude 5.0
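Combine commands in a batch file:
The individual calls shown above can be combined into a Windows batch file. The following sketch is illustrative only and reuses the placeholder values from the samples above: it starts a 30-second recording via the local provider, waits slightly longer than the recording duration, and then checks whether the recording is still active.
@echo off
set CLIENT=TwinCAT.Analytics.StorageProvider.Client
rem Start a 30-second recording of two symbols via the local provider.
%CLIENT% -localprovider -startrecord -alias cmdTest -recname cmdRec1 -topic TestSignals/TestStream -dataformat Bin -Duration 30 -mode Subset -Symbols Variables.fCosine,Variables.fSine
rem Wait slightly longer than the recording duration (in seconds).
timeout /t 35 /nobreak >nul
rem Verify that the recording is no longer active.
%CLIENT% -localprovider -IsRecordingActive -alias cmdTest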