Locate Edge
In this sample,
- the function F_VN_LocateEdgeExp is used to locate an edge in a defined search window,
- and the execution time is monitored with a watchdog and limited if necessary.
Explanation
A search window can be used to locate edges with the function F_VN_LocateEdge, in which some parameters are fixed for simplicity. The function F_VN_LocateEdgeExp used here, in contrast, provides full access to all parameters. This sample is intended to illustrate these individual parameters. We recommend trying out configuration changes beyond the standard configuration and considering their effects on the edge result and on the processing time. A selection of configuration changes, including descriptions, can be found under Results.
Variables
hr : HRESULT;
hrFunc : HRESULT;
ipImageIn : ITcVnImage;
ipImageInDisp : ITcVnDisplayableImage;
ipImageRes : ITcVnImage;
ipImageResDisp : ITcVnDisplayableImage;
// result
ipEdgePoints : ITcVnContainer;
// parameters
aStartPoint : TcVnPoint2_REAL := [850, 400];
aEndPoint : TcVnPoint2_REAL := [550, 400];
eDirection : ETcVnEdgeDirection := TCVN_ED_DARK_TO_LIGHT;
fMinStrength : REAL := 50;
nSearchLines : UDINT := 31;
fSearchLineDist : REAL := 1;
nMaxThickness : UDINT := 7;
nSubpixIter : UDINT := 10;
eAlgorithm : ETcVnEdgeDetectionAlgorithm := TCVN_EDA_INTERPOLATION;
fAvgStrength : REAL;
// Watchdog
hrWD : HRESULT;
tStop : DINT := 15000; // watchdog time limit in us
tRest : DINT; // remaining time in us when the watchdog is stopped
nFraction : UDINT; // processed fraction in percent
// drawing
aLine : TcVnVector4_LREAL;
aColorGreen : TcVnVector4_LREAL := [0, 175, 0];
aColorBlue : TcVnVector4_LREAL := [0, 0, 255];
aColorRed : TcVnVector4_LREAL := [255, 0, 0];
sText : STRING(255);
Code
hrWD := F_VN_StartRelWatchdog(tStop, hr);
hrFunc := F_VN_LocateEdgeExp(
    ipSrcImage           := ipImageIn,
    ipEdgePoints         := ipEdgePoints,
    aStartPoint          := aStartPoint,
    aEndPoint            := aEndPoint,
    eEdgeDirection       := eDirection,
    fMinStrength         := fMinStrength,
    nSearchLines         := nSearchLines,
    fSearchLineDist      := fSearchLineDist,
    nMaxThickness        := nMaxThickness,
    nSubpixelsIterations := nSubpixIter,
    fApproxPrecision     := 0.0001,
    eAlgorithm           := eAlgorithm,
    hrPrev               := hr,
    fAvgStrength         => fAvgStrength);
hrWD := F_VN_StopWatchdog(hrWD, nFractionProcessed => nFraction, tRest => tRest);
// Draw result for visualization
hr := F_VN_ConvertColorSpace(ipImageIn, ipImageRes, TCVN_CST_GRAY_TO_RGB, hr);
sText := CONCAT(CONCAT('Processed ', UDINT_TO_STRING(nFraction)), '%');
hr := F_VN_PutTextExp(sText, ipImageRes, 25, 50, TCVN_FT_HERSHEY_SIMPLEX, 1.3, aColorGreen, 2, TCVN_LT_8_CONNECTED, FALSE, hr);
sText := CONCAT(CONCAT('Time ', DINT_TO_STRING(tStop - tRest)), 'us');
hr := F_VN_PutTextExp(sText, ipImageRes, 25, 100, TCVN_FT_HERSHEY_SIMPLEX, 1.3, aColorGreen, 2, TCVN_LT_8_CONNECTED, FALSE, hr);
sText := CONCAT('Returncode ', DINT_TO_STRING(hrFunc));
hr := F_VN_PutTextExp(sText, ipImageRes, 25, 150, TCVN_FT_HERSHEY_SIMPLEX, 1.3, aColorGreen, 2, TCVN_LT_8_CONNECTED, FALSE, hr);
hr := F_VN_DrawPoint(REAL_TO_UDINT(aStartPoint[0]), REAL_TO_UDINT(aStartPoint[1]), ipImageRes, TCVN_DS_CIRCLE, aColorRed, hr);
hr := F_VN_DrawPoint(REAL_TO_UDINT(aEndPoint[0]), REAL_TO_UDINT(aEndPoint[1]), ipImageRes, TCVN_DS_X, aColorRed, hr);
hr := F_VN_FitLine(ipEdgePoints, aLine, hr);
hr := F_VN_DrawLine_TcVnVector4_LREAL(aLine, ipImageRes, aColorGreen, 2, hr);
hr := F_VN_DrawPointsExp(ipEdgePoints, ipImageRes, TCVN_DS_PLUS, aColorBlue, 1, 1, TCVN_LT_8_CONNECTED, hr);
// Display source and result image
hr := F_VN_TransformIntoDisplayableImage(ipImageIn, ipImageInDisp, S_OK);
hr := F_VN_TransformIntoDisplayableImage(ipImageRes, ipImageResDisp, S_OK);
Results
For visualization, the edge points found are shown in blue and a regression line through them is drawn in green in the output image. In addition, the starting point (red circle) and the end point (red X) are drawn. The processed percentage, the required execution time and the return value of the function are displayed in the top left corner.
For the parameters used in this sample, the result looks like this:
If the number of search lines is increased from 31 to 61 with the parameter nSearchLines, the required computing time is also doubled. In return, the regression line becomes more precise:
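Expressed against the declarations above, this variant only changes the initialization of nSearchLines; the call in the Code section stays the same:

// Variant: 61 instead of 31 search lines (all other parameters unchanged)
nSearchLines := 61;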
If you want to find the milling edge of the component instead of the outer edge, change the parameter eDirection to TCVN_ED_LIGHT_TO_DARK. This ignores the outer edge transition from dark to light and instead finds the following edge with the transition from light to dark:
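The corresponding change, sketched against the declarations above:

// Variant: locate the light-to-dark transition (milling edge) instead of
// the dark-to-light transition (outer edge)
eDirection := TCVN_ED_LIGHT_TO_DARK;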
Reducing execution time
It is noticeable that the execution time is higher than when locating the outer edge, even though the parameters are otherwise identical. This is because the edge that is found is further away from the starting point.
Conversely, this means that the execution time for a fixed search window varies slightly depending on the position of the component in the image, which can also be seen in the three sample images.
An even greater fluctuation can occur with the two approximation algorithms, as they require a different number of iterations to approximate the model parameters from the surrounding pixel intensities.
In general, fewer iterations are required if the search lines of the search window hit the edge as orthogonally as possible. If the position and orientation of the objects in the image are subject to little or no fluctuation, the search window can be adjusted accordingly to achieve shorter execution times. In addition, the maximum time required can be reduced by reducing the maximum number of iterations, but this may lead to less accurate results.
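As a sketch of such a tuning step, reusing the declarations above (the value 5 is a placeholder, not taken from this sample):

// Sketch: fewer subpixel iterations -> shorter worst-case execution time,
// possibly at the cost of accuracy (placeholder value)
nSubpixIter := 5;
// In addition, aStartPoint and aEndPoint can be chosen so that the search
// lines hit the expected edge as orthogonally as possible.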
Using a watchdog
In addition, the time can be limited with an external watchdog, which terminates the execution of the function if necessary and returns the partial results obtained so far. For example, if the algorithm is changed to TCVN_EDA_APPROX_ERF and the maximum number of iterations is set to 60, the execution for the image LocateEdge2.bmp (outer edge orthogonal to the search lines) takes almost exactly 4 ms (depending on the installed CPU), while the other two images take 4.3 ms and 4.4 ms. If the parameter tStop is set to 4000, the execution of the function is stopped as soon as possible after 4 ms and the partial results available up to then are returned. In other words, the function is still processed to 100 % on the image LocateEdge2.bmp, whereas it is terminated prematurely on the other two images (here LocateEdge3.bmp):
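The parameterization described here, expressed as changes to the declarations above:

// Variant from the description: approximation algorithm, 60 iterations,
// watchdog limit of 4000 us
eAlgorithm  := TCVN_EDA_APPROX_ERF;
nSubpixIter := 60;
tStop       := 4000;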
Since it is only possible to abort at certain points in the algorithm, and the partial results (previously found edge points on the search lines) are returned, the function is not terminated after exactly 4000 µs but takes a little longer, which must be taken into account when selecting the abort time. The maximum additional time required generally depends on the algorithm and the parameterization.
Despite the termination, the partial results are sufficient to achieve the desired result, since only the results of one or two search lines are missing.
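Whether the watchdog stopped the function early can be detected, for example, via the processed percentage returned by F_VN_StopWatchdog; a minimal sketch using the variables from this sample:

// Sketch: react to a watchdog abort; ipEdgePoints then contains only the
// edge points found up to that moment
IF nFraction < 100 THEN
    sText := 'Watchdog stopped the edge localization early';
    // the partial results can still be passed to F_VN_FitLine as shown above
END_IF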