CN113537303B - Multi-optical target rapid classification and identification method based on template matching - Google Patents


Info

Publication number
CN113537303B
CN113537303B (granted from application CN202110705144.7A)
Authority
CN
China
Prior art keywords
template matching
area
image
blob
matching
Prior art date
Legal status
Active
Application number
CN202110705144.7A
Other languages
Chinese (zh)
Other versions
CN113537303A (en
Inventor
王拯洲
王力
魏际同
王伟
李刚
弋东驰
Current Assignee
XiAn Institute of Optics and Precision Mechanics of CAS
Original Assignee
XiAn Institute of Optics and Precision Mechanics of CAS
Priority date
Filing date
Publication date
Application filed by XiAn Institute of Optics and Precision Mechanics of CAS
Priority to CN202110705144.7A
Publication of CN113537303A
Application granted
Publication of CN113537303B

Classifications

    • G06F 18/24: Pattern recognition; analysing; classification techniques
    • G06T 7/11: Image analysis; segmentation; region-based segmentation
    • G06T 7/62: Image analysis; analysis of geometric attributes of area, perimeter, diameter or volume


Abstract

The invention discloses a template-matching-based method for rapidly classifying and identifying multiple optical targets. The method proceeds as follows: first, an original image is collected and binarized, and BLOB information of the binarized image is gathered; second, template matching is performed on the BLOB regions of the original image; then the optical targets are identified, classified, and located; finally, when the two targets are close to each other, full-image matching is performed and the light-spot and boosting-hole targets are re-identified. Under weak-contrast conditions, the method accurately calculates the light-spot center and the boosting-hole center (dual targets) in the simulated-light collimation image, providing an important guarantee for the successful completion of the final simulated-light collimation process.

Description

Multi-optical target rapid classification and identification method based on template matching
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to a multi-optical-target rapid classification and identification method based on template matching.
Background Art
The laser parameter comprehensive diagnosis system is a closed-loop adjustment and monitoring system comprising an optical sampling assembly, detection components, a servo system, a monitoring system, a control system, and the like. It is located at the tail end of the host device and is designed for measuring far-field focal spots with a high dynamic range. In order to accurately measure main-lobe and side-lobe data in the main emission stage, the optical path must be automatically collimated in the pre-emission stage, guiding the main laser beam to the reference position by means of the simulated light.
Simulated-light collimation is one of the three collimation procedures of the comprehensive diagnosis system (concave-mirror collimation, simulated-light collimation, and optical-path butt-joint collimation). Its aim is to collimate the sampled laser beam to a simulated target point. Because the simulated target point corresponds to pre-calibrated reference positions on the near-field and far-field CCDs, simulated-light collimation simultaneously adjusts motors M1 and M2 so that the center of the light spot moves to the specified target positions on the near-field and far-field CCDs.
During optical-path collimation, the images at all monitoring positions must be collected and the beam positions interpreted in order to determine the magnitude and manner of the beam adjustment. The optical-path collimation images of the whole comprehensive diagnosis system fall into 5 types: near-field images, far-field images, pinhole images, schlieren-ball images, and simulated-light butt-joint images. However, since the monitored object differs at each monitoring position, the image characteristics acquired at each position also differ. The difficulty in processing the simulated-light collimation images is therefore that, for images acquired from different optical paths at different times and with varying spot shapes, intensities, and positions, the optical center of each collimation image must be calculated accurately and reliably.
The characteristics of the simulated light-collimated image are summarized as follows:
1) Two optical targets, the light-spot center and the boosting-hole center, must be identified in the simulated-light image, and because the reference changes in real time, the multiple optical targets must be classified, identified, and located while keeping the recognition time short;
2) The optical targets have no distinctive distribution characteristics: the light-spot center and the boosting-hole center cannot be distinguished by features such as shape, area, texture, or gray level;
3) Each optical target is discontinuously distributed, i.e. it is not a single connected region, and its intensity varies with the laser intensity, so one optical target may split into several small, unconnected targets;
4) The two optical targets may stick together, in which case the motors must be adjusted to separate them;
5) Owing to the collimation time requirement, for a simulated-light collimation image of size 1600 × 1200 the optical-target recognition time must not exceed 30 seconds per pass.
Therefore, given the above characteristics of simulated-light collimation, a method is needed that can accurately calculate the light-spot center and the boosting-hole center (dual targets) in the simulated-light collimation image under weak-contrast conditions.
Disclosure of Invention
To meet the requirement of accurately calculating the light-spot center and the boosting-hole center (dual targets) in a simulated-light collimation image under weak-contrast conditions, the invention provides a template-matching-based method for the rapid classification and identification of multiple optical targets.
The specific technical scheme of the invention is as follows:
the method for rapidly classifying and identifying the multiple optical targets based on template matching comprises the following program steps executed on a computer:
step 1: binarization processing and BLOB area information statistics;
carrying out binarization processing on an original image acquired in a simulated-light collimation process, and gathering the linked-list information of each BLOB region in the binarized image; the linked-list information comprises the center position of the BLOB region and the area of the BLOB region;
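As an illustration of step 1, the sketch below binarizes an image and gathers per-BLOB center and area statistics using a simple 8-connected labeling; the threshold value and all function names here are this sketch's assumptions, not the patent's.

```python
import numpy as np
from collections import deque

def binarize_and_blob_stats(img, threshold):
    """Binarize `img` and collect, for each 8-connected BLOB, its centroid
    and pixel area (the per-BLOB linked-list information of step 1)."""
    binary = (img >= threshold).astype(np.uint8) * 255
    h, w = binary.shape
    visited = np.zeros((h, w), dtype=bool)
    blobs = []
    for sy, sx in zip(*np.nonzero(binary)):
        if visited[sy, sx]:
            continue
        # breadth-first search over the 8-connected region
        queue, pixels = deque([(sy, sx)]), []
        visited[sy, sx] = True
        while queue:
            y, x = queue.popleft()
            pixels.append((y, x))
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] and not visited[ny, nx]:
                        visited[ny, nx] = True
                        queue.append((ny, nx))
        ys, xs = zip(*pixels)
        blobs.append({"center": (sum(xs) / len(xs), sum(ys) / len(ys)),
                      "area": len(pixels)})
    return binary, blobs
```

A real deployment would more likely use a library labeling routine; the BFS above just makes the statistics explicit.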
step 2: acquiring, from the original image, a template matching image similar in characteristics to the boosting hole area;
step 3: carrying out template matching by using an autocorrelation matching method;
the process of obtaining the correlation coefficient r (x, y) is specifically as follows:
moving the template matching image w(s, t) point by point over the original image f(x, y) so that the origin of w(s, t) coincides in turn with each point of f(x, y), calculating the sum of products of the corresponding pixels of w(s, t) and the image region of f(x, y) covered by it, and then normalizing the calculation result to obtain the correlation coefficient;
recording the upper-left corner (x0, y0) of the template matching image w(s, t) at the best match as the optimal matching point;
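The point-by-point matching just described can be sketched as follows; the normalization shown (dividing the sum of products by the L2 norms of the two windows) is one plausible reading of the patent's "normalization processing", not a formula the text spells out.

```python
import numpy as np

def correlation_map(f, w):
    """Slide template w over image f and return the normalized correlation
    coefficient r(x, y) at every valid offset (step 3 of the method)."""
    m, n = f.shape          # image height, width
    J, L = w.shape          # template height, width
    r = np.zeros((m - J + 1, n - L + 1))
    wf = w.astype(float)
    w_norm = np.sqrt((wf ** 2).sum())
    for y in range(m - J + 1):
        for x in range(n - L + 1):
            patch = f[y:y + J, x:x + L].astype(float)
            denom = w_norm * np.sqrt((patch ** 2).sum())
            r[y, x] = (wf * patch).sum() / denom if denom else 0.0
    return r

def best_match(f, w):
    """Return the upper-left corner (x0, y0) of the best match and its score."""
    r = correlation_map(f, w)
    y0, x0 = np.unravel_index(np.argmax(r), r.shape)
    return (int(x0), int(y0)), float(r[y0, x0])
```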
step 4: optical target identification, classification, and positioning;
step 4.1: determination of light spot area and booster hole area
Step 4.1.1: the initial search of each BLOB area is 0, the BLOB area with the largest correlation coefficient r (x, y) is selected first from all the BLOB areas, if there is only one BLOB area with the largest correlation coefficient, the area is considered as a boost hole area, the searched flag search is changed to 1, and the step 4.1.3 is skipped: if the number of the BLOB area with the maximum correlation coefficient is more than two, step 4.1.2 is executed:
step 4.1.2: comparing the areas of more than two BLOB areas, determining a boosting hole area by the BLOB area with the largest area, and changing the searched mark searched to 1;
if the correlation coefficients of more than two BLOB regions are the same and the areas are also the same, the pixels of the template matching image are considered to be smaller than the distance between the center of the light spot and the center of the boost hole, and the boost hole region needs to be screened according to the distance between the position of the template matching center and the center of each BLOB region;
the method comprises the following specific steps: calculating the distance between the template matching center position and the center of each BLOB area, wherein the BLOB area with the minimum distance is a boosting hole area, and the corresponding template matching center position with the minimum distance is the center of the boosting hole;
step 4.1.3: searching all BLOB areas with search of 0, and determining the maximum BLOB area as a light spot area;
step 4.2: center position calculation of light spot region and boosting hole region
Because the position returned by template matching is the upper-left corner of the template matching image, it must be converted to the template-matching center position as follows: template-matching center x = returned position x + template image width / 2; template-matching center y = returned position y + template image height / 2.
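Steps 4.1 and 4.2 amount to a small decision procedure. A sketch follows; the record field names ('r', 'area', 'match') are this illustration's assumptions, and the tie case in which areas also coincide (handled later by full-image re-matching) is omitted here.

```python
def classify_targets(blobs, tpl_w, tpl_h):
    """Pick the boosting-hole BLOB (largest correlation coefficient, ties
    broken by largest area) and the light-spot BLOB (largest remaining
    area), then convert each matched upper-left corner to a template
    center.  Each blob dict carries 'area', 'r' (best correlation
    coefficient), and 'match' (upper-left (x, y) of its best match)."""
    for b in blobs:
        b["searched"] = 0
    r_max = max(b["r"] for b in blobs)
    candidates = [b for b in blobs if b["r"] == r_max]
    # one candidate -> it is the boosting hole; several -> take the largest area
    hole = candidates[0] if len(candidates) == 1 else max(candidates, key=lambda b: b["area"])
    hole["searched"] = 1
    spot = max((b for b in blobs if b["searched"] == 0), key=lambda b: b["area"])

    def center(b):  # upper-left corner -> template center (step 4.2)
        x, y = b["match"]
        return (x + tpl_w / 2, y + tpl_h / 2)

    return center(spot), center(hole)
```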
Further, in order to reduce the target-recognition time and improve recognition efficiency, the template matching in step 3 may be performed with a narrowed search range and interval sampling:
narrowing the search range specifically means: the autocorrelation operation is performed only within BLOB regions whose gray level in the binarized image is 255;
interval sampling specifically means: in the template-matching operation, the autocorrelation operation is evaluated only at every 2nd, 4th, or 8th pixel.
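A sketch of the two speed-ups combined: the correlation coefficient is evaluated only inside a search window derived from one BLOB's statistics, and only every `step` pixels. The window format (x0, y0, x1, y1) is an assumption of this sketch.

```python
import numpy as np

def sampled_match(f, w, blob_box, step=4):
    """Best match of template w in f, restricted to the window blob_box
    (narrowed search range) and evaluated at a stride (interval sampling)."""
    J, L = w.shape
    wf = w.astype(float)
    w_norm = np.sqrt((wf ** 2).sum())
    best, best_xy = -1.0, None
    x0, y0, x1, y1 = blob_box
    for y in range(max(0, y0), min(f.shape[0] - J, y1) + 1, step):
        for x in range(max(0, x0), min(f.shape[1] - L, x1) + 1, step):
            patch = f[y:y + J, x:x + L].astype(float)
            denom = w_norm * np.sqrt((patch ** 2).sum())
            r = (wf * patch).sum() / denom if denom else 0.0
            if r > best:
                best, best_xy = r, (x, y)
    return best_xy, best
```

With step=4 the number of evaluated positions drops by a factor of 16, matching the 1/16 reduction claimed in the embodiment.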
Further, the specific formula of the autocorrelation operation in step 3 is as follows:
r(x,y) = \frac{\sum_{s=0}^{L}\sum_{t=0}^{J} w(s,t)\, f(x+s,\, y+t)}{\left[\sum_{s=0}^{L}\sum_{t=0}^{J} w^{2}(s,t)\cdot\sum_{s=0}^{L}\sum_{t=0}^{J} f^{2}(x+s,\, y+t)\right]^{1/2}}
wherein f (x, y) represents an original image;
w (s, t) represents a template matching image;
x=0,1,2,...,n-L;y=0,1,2,...,m-J;
s=0,1,2,...,L;t=0,1,2,...,J。
further, the method is executed in the frequency-tripled simulated-light source software.
The invention has the beneficial effects that:
1. Based on the template-matching principle, the invention accurately calculates the light-spot center and the boosting-hole center (dual targets) in the simulated-light collimation image under weak-contrast conditions. As a result, after the beam-convergence process of acquiring the side-lobe image, calculating its center, and moving the BM6 XY motor is completed, the center of the side-lobe image can be accurately shielded, which provides an important guarantee for the successful completion of the final simulated-light collimation process.
2. In the template-matching process, the invention reduces the amount of computation by narrowing the search range and sampling at intervals, thereby shortening the target-recognition time and improving target-recognition efficiency.
Drawings
FIG. 1 is a flow chart of an embodiment of the present invention.
Fig. 2 (a) is an original image collected in a simulated light collimation procedure.
Fig. 2 (b) shows a binarized image of fig. 2 (a).
FIG. 3 is a template matching image of a booster hole.
Fig. 4 is a schematic diagram of the template-matching process when the two targets are far apart (matching with a narrowed search range and interval sampling).
Fig. 5 (a) is the target-recognition result for the first image with the two targets far apart in the simulated-light collimation experiment.
Fig. 5 (b) is the target-recognition result for the second such image.
Fig. 5 (c) is the target-recognition result for the third such image.
Fig. 5 (d) is the target-recognition result for the fourth such image.
Fig. 6 (a) is an original image when two objects are close to each other.
Fig. 6 (b) is an effect diagram after the object recognition is performed on fig. 6 (a).
Fig. 7 is a schematic view of the processing interface of the frequency-tripled simulated-light source software.
Detailed Description
The invention provides a template-matching-based method for rapidly classifying and identifying multiple optical targets, whose basic principle is as follows:
first, an original image is collected and binarized, and BLOB information of the binarized image is gathered;
second, template matching is performed on the BLOB regions of the original image;
then, the optical targets are identified, classified, and located;
finally, when the two targets are close to each other, full-image matching is performed and the light-spot and boosting-hole targets are re-identified.
The method of the invention is described in further detail below by way of a specific example in conjunction with figure 1.
Step 1: binarization processing and BLOB region information statistics
Binarize the original image acquired in the simulated-light collimation process (4 BLOB regions in this embodiment), and gather the linked-list information of the 4 BLOB regions in the binarized image; the linked-list information comprises the center position of each BLOB region and its area;
the original image and the binarized image are shown in fig. 2, and the size of the original image and the binarized image is 1600 × 1200.
Step 2: acquire, from the original image, a template matching image similar in characteristics to the boosting-hole region;
Step 3: template matching using the autocorrelation matching method
The template-matching-based fast target-classification algorithm mainly identifies optical targets in the simulated-light collimation and optical-path butt-joint collimation processes. It differs from conventional optical-target center calculation mainly in that:
(1) the number of optical targets in the image is 2 or more;
(2) it must identify which spot is the light-spot target and which is the boosting-hole target;
(3) it places demands on the image-processing algorithm: the processing must take less than 40 seconds;
in order to perform template matching using the autocorrelation matching method, the autocorrelation matching method is first described as follows:
A. For an original image f(x, y) of size m × n and a template matching image w(s, t) of size J × L, the correlation of f(x, y) and w(s, t) can be expressed as formula (1):
r(x,y) = \frac{\sum_{s=0}^{L}\sum_{t=0}^{J} w(s,t)\, f(x+s,\, y+t)}{\left[\sum_{s=0}^{L}\sum_{t=0}^{J} w^{2}(s,t)\cdot\sum_{s=0}^{L}\sum_{t=0}^{J} f^{2}(x+s,\, y+t)\right]^{1/2}} \quad (1)
where x = 0, 1, 2, ..., n-L; y = 0, 1, 2, ..., m-J; s = 0, 1, 2, ..., L; t = 0, 1, 2, ..., J;
Specifically: the correlation coefficient r(x, y) is obtained by moving the template matching image w(s, t) point by point over the original image f(x, y), making the origin of w coincide in turn with each point of f, computing the sum of products of w and the corresponding pixels of the region of f covered by w, and normalizing the result. After w has traversed the whole image f, the upper-left corner (x0, y0) of w at the best match is the optimal matching point;
B. Based on the above requirements, in order to reduce the target-recognition time and improve efficiency, the invention adopts the following improvements when the template-matching algorithm is used to distinguish the light-spot region and the boosting-hole region in the simulated-light collimation image:
matching is performed with a narrowed search range and interval sampling, as shown in fig. 4:
B1. Narrowing the search range: template matching is performed only within BLOB regions whose gray level in the binarized image is 255, which greatly reduces the computation of the template-matching algorithm;
B2. Interval-sampled matching: instead of evaluating the template-matching operation at every pixel, an interval-sampling scheme is used. For example, in this embodiment the original image f(x, y) is shown in fig. 2 (a) with size 1600 × 1200, and the template matching image w(s, t) is shown in fig. 3 with size 181 × 145. If a correlation coefficient were computed at every candidate pixel, the count would be 26245; computing one correlation coefficient per 4 × 4 block reduces the amount to 1/16 of the original, i.e. 1640 calculations after interval sampling;
Taking one BLOB region as an example, if a BLOB target required a search range of 20 × 20, the maximum number of correlation-coefficient calculations would be 40885; the actual number is 13834, i.e. the computation is reduced to 33.84% of the original. The correlation-coefficient calculations for the 4 BLOB regions are compared in table 1 below:
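The quoted operation counts can be cross-checked with a few lines of arithmetic; 1,499,520 is the full-image position count that also appears in the later full-image matching log, and the other figures are taken from the text as given.

```python
# Full-image matching: one correlation per valid template offset.
img_w, img_h = 1600, 1200
tpl_w, tpl_h = 181, 145
full = (img_w - tpl_w + 1) * (img_h - tpl_h + 1)
assert full == 1_499_520   # the 'maximum matching number' in the full-image log

# 4 x 4 interval sampling divides the per-pixel count by 16:
narrowed = 26245           # per-pixel count over the narrowed search range, as quoted
assert narrowed // 16 == 1640

# Actual vs. maximum count for the example BLOB region:
assert round(13834 / 40885 * 100, 2) == 33.84
```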
TABLE 1 comparison of correlation coefficient calculation results for all BLOB regions
[Table 1 appears as an image in the original publication.]
4. Optical target identification, classification and localization
Optical-target identification, classification, and localization aim to distinguish which BLOB region is the light-spot region and which is the boosting-hole region; the specific steps are as follows:
step 4.1: determination of light spot area and booster hole area
Step 4.1.1: the initial search of each BLOB area is 0, in all BLOB areas, the BLOB area with the largest correlation coefficient r (x, y) is selected first, if there is only one BLOB area with the largest correlation coefficient, the area is considered as a boost hole area, the searched flag search is changed to 1, and the step 4.1.3 is skipped: if the number of the BLOB area with the maximum correlation coefficient is more than two, step 4.1.2 is executed:
step 4.1.2: comparing the areas of more than two BLOB areas, determining a boosting hole area from the BLOB area with the largest area, and changing the searched mark to 1;
step 4.1.3: searching all BLOB areas with searched 0, and determining the maximum BLOB area as a light spot area;
Step 4.2: Center-position calculation of the light-spot region and the boosting-hole region
Since the position returned by template matching is the upper-left corner of the matched region, it must be converted to the template-matching center position as follows: template-matching center x = returned position x + template image width / 2; template-matching center y = returned position y + template image height / 2;
the light-spot center obtained in this run is calculated to be (332, 90) and the boosting-hole center (898, 602), as shown in table 1.
5. Re-recognition when two objects are very close together
If, after template matching, two or more BLOB regions share the same maximum correlation coefficient and also have the same area, the distance between the light-spot center and the boosting-hole center is considered too small (less than 150 pixels, i.e. less than the template size in pixels). In that case the template-matching regions may overlap during matching, i.e. the light-spot target and the boosting-hole target may be identified as the same target, so it must be distinguished again which BLOB region is the boosting-hole region and which is the light-spot region. The steps are as follows:
1. without narrowing the search range or sampling at intervals, perform template matching at every pixel of the full original image;
2. calculate the template-matching center position;
3. calculate the distance between the template-matching center position and the center of each BLOB region; the BLOB region with the minimum distance is the boosting-hole region, the template-matching center position at that minimum distance is the boosting-hole center, and its searched flag is set to 1;
4. among all BLOB regions whose searched flag is 0, the one with the largest area is determined to be the light-spot region, and the template-matching center position of that region is taken as the light-spot center.
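The re-identification steps above reduce to a nearest-center assignment once the full-image match center is known. A sketch, assuming each BLOB record carries its center and area (field names are this illustration's):

```python
import math

def reidentify(blobs, match_center):
    """After a full-image template match, assign the boosting hole to the
    BLOB whose center lies closest to the matched template center; the
    largest remaining BLOB is the light spot."""
    for b in blobs:
        b["searched"] = 0
    hole = min(blobs, key=lambda b: math.dist(b["center"], match_center))
    hole["searched"] = 1
    spot = max((b for b in blobs if b["searched"] == 0), key=lambda b: b["area"])
    return spot, hole
```

Run on the BLOB centers and match center from the fig. 6 (a) log, this reproduces the log's choice of region No:2 (distance 6) as the boosting hole.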
6. Image processing result analysis
1. Analysis of the processing results when the two targets are far apart (distance between the targets greater than the template matching image size in pixels)
To illustrate the recognition performance of the proposed target-recognition algorithm on multiple optical targets, 4 different simulated-light collimation images were selected and processed by the algorithm.
The target-recognition results with the two targets far apart are shown in fig. 5. In fig. 5 (a), the large cross at mark a identifies the light-spot center (332.5, 90.6), the large cross at mark b the boosting-hole center (893.3, 602.1), and the small cross at mark c the upper-left corner (794.0, 524.0) of the best matching position found when searching the original image f(x, y) of fig. 2 with the template matching image w(s, t). The region of f(x, y) covered by w(s, t) at the best matching position is shown as the rectangular frame in fig. 5 (a).
As can be seen from fig. 5, the algorithm identifies the light-spot center and the boosting-hole center in different simulated-light collimation images; as shown in table 2 and the log, target recognition for image 5 (a) took 6 seconds, and the average processing time for the 4 images was 7.25 seconds.
TABLE 2 comparison of different image processing results
[Table 2 appears as an image in the original publication.]
The image-processing log for fig. 5 (a) is as follows:
[2020-11-18 14:14:10] No:0 Area:1671 x:333 y:91
[2020-11-18 14:14:10] No:1 Area:145 x:898 y:602
[2020-11-18 14:14:10] No:2 Area:103 x:885 y:591
[2020-11-18 14:14:10] No:3 Area:43 x:874 y:578
[2020-11-18 14:14:10] Start time: 2020-11-18 14:14:10
[2020-11-18 14:14:10] [matching begin]
[2020-11-18 14:14:12] [matching end]
[2020-11-18 14:14:12] maximum number of matches: 40885, current number of matches: 13834, coefficient: 0.3750
[2020-11-18 14:14:12] search: No:0 area
[2020-11-18 14:14:12] [matching begin]
[2020-11-18 14:14:13] [matching end]
[2020-11-18 14:14:13] maximum number of matches: 40885, current number of matches: 9158, coefficient: 0.5008
[2020-11-18 14:14:13] search: No:1 area
[2020-11-18 14:14:13] [matching begin]
[2020-11-18 14:14:15] [matching end]
[2020-11-18 14:14:15] maximum number of matches: 40885, current number of matches: 11808, coefficient: 0.5008
[2020-11-18 14:14:15] search: No:2 area
[2020-11-18 14:14:15] [matching begin]
[2020-11-18 14:14:16] [matching end]
[2020-11-18 14:14:16] maximum number of matches: 40885, current number of matches: 14354, coefficient: 0.5008
[2020-11-18 14:14:16] search: No:3 area
[2020-11-18 14:14:16] No:1 area
[2020-11-18 14:14:16] No:0 area
[2020-11-18 14:14:16] End time: 2020-11-18 14:14:16
Figs. 5 (b), 5 (c), and 5 (d) show the processing results for the three remaining multi-target images in the simulated-light collimation experiment; the processing is similar to that of fig. 5 (a).
2. Analysis of processing results when two targets are close to each other
When the two target centers are close, i.e. the distance between the light-spot center and the boosting-hole center is less than 150 pixels, the two BLOB regions have the same correlation coefficient and the same area under template matching; that is, the light-spot target and the boosting-hole target may be identified as the same target. Template matching must then be performed over the whole image (narrowing the search range and interval sampling cannot be used), which increases the target-recognition time; for example, recognition for fig. 6 (a) took 34 seconds. The processing results with the two targets close together are shown in fig. 6 (b): the large cross at mark d indicates the light-spot center (258.5, 437.6), the large cross at mark e the boosting-hole center (360.6, 455.9), and the small cross at mark f the upper-left corner of the best matching position (271.0, 390.0).
The image-processing log for fig. 6 (a) is as follows:
[2021-05-07 17:26:53] No:0 Area:1671 x:259 y:438
[2021-05-07 17:26:53] No:1 Area:145 x:374 y:467
[2021-05-07 17:26:53] No:2 Area:103 x:361 y:456
[2021-05-07 17:26:53] No:3 Area:43 x:350 y:443
[2021-05-07 17:26:53] Start time: 2021-05-07 17:26:53
[2021-05-07 17:26:53] [matching begin]
[2021-05-07 17:26:55] [matching end]
[2021-05-07 17:26:55] maximum number of matches: 40885, current number of matches: 25326, coefficient: 0.4170
[2021-05-07 17:26:55] search: No:0 area 1671 x:259 y:438
[2021-05-07 17:26:55] [matching begin]
[2021-05-07 17:26:56] [matching end]
[2021-05-07 17:26:56] maximum number of matches: 40885, current number of matches: 9158, coefficient: 0.4170
[2021-05-07 17:26:56] search: No:1 area
[2021-05-07 17:26:56] [matching begin]
[2021-05-07 17:26:58] [matching end]
[2021-05-07 17:26:58] maximum number of matches: 40885, current number of matches: 11808, coefficient: 0.4170
[2021-05-07 17:26:58] search: No:2 area
[2021-05-07 17:26:58] [matching begin]
[2021-05-07 17:26:59] [matching end]
[2021-05-07 17:26:59] maximum number of matches: 40885, current number of matches: 14354, coefficient: 0.4170
[2021-05-07 17:26:59] search: No:3 area
[2021-05-07 17:26:59] No:0 area
[2021-05-07 17:26:59] No:1 area
[2021-05-07 17:26:59] matching begin
[2021-05-07 17:27:34] matching end
[2021-05-07 17:27:34] maximum number of matches: 1499520, current number of matches: 58775, coefficient: 0.42
[2021-05-07 17:27:34] recalculate center: No:0 BlbBinder (x:259 y:438) MatchCenter (x:361 y:462)
[2021-05-07 17:27:34] recalculate center: No:1 BlbBinder (x:374 y:467) MatchCenter (x:361 y:462)
[2021-05-07 17:27:34] recalculate center: No:2 BlbBinder (x:361 y:456) MatchCenter (x:361 y:462) r:6
[2021-05-07 17:27:34] recalculate center: No:3 BlbBinder (x:350 y:443) MatchCenter (x:361 y:462) r:22
[2021-05-07 17:27:34] reselect template information: No:2 area
[2021-05-07 17:27:34] No:0 area
[2021-05-07 17:27:34] End time: 2021-05-07 17:27:34
7. Image processing software
The template-matching-based multi-optical-target rapid classification and identification method of this embodiment is implemented in the frequency-tripled simulated-light source software; the software interface is shown in fig. 7. Clicking the 'image on' button selects the image to be processed, and clicking the 'match by BLOB' button completes the multi-optical-target recognition.
8. Fast template matching algorithm
The fast template-matching algorithm is implemented by the function Template_By_BlobInfo in the image-processing class CImageProcess.
[The code listing and parameter descriptions appear as images in the original publication.]
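Since the listing itself survives only as images, the overall flow of the routine can be sketched as follows. The function and field names, the search-window format, and the tie-handling details are assumptions reconstructed from the description, not the patent's actual code.

```python
import numpy as np

def ncc(f, w, x, y):
    """Normalized correlation of template w with image f at offset (x, y)."""
    J, L = w.shape
    patch = f[y:y + J, x:x + L].astype(float)
    denom = np.sqrt((w.astype(float) ** 2).sum() * (patch ** 2).sum())
    return float((w * patch).sum() / denom) if denom else 0.0

def match_template_by_blob_info(f, w, blobs, step=1):
    """Sketch of the routine's flow: match the template inside each BLOB's
    search window, pick the boosting hole (largest coefficient, ties
    broken by area), then the light spot (largest remaining area), and
    return both template centers.  Each blob dict carries 'area' and a
    search window 'box' = (x0, y0, x1, y1)."""
    J, L = w.shape
    for b in blobs:
        x0, y0, x1, y1 = b["box"]
        scored = [((x, y), ncc(f, w, x, y))
                  for y in range(y0, min(y1, f.shape[0] - J) + 1, step)
                  for x in range(x0, min(x1, f.shape[1] - L) + 1, step)]
        b["match"], b["r"] = max(scored, key=lambda s: s[1])
    r_max = max(b["r"] for b in blobs)
    ties = [b for b in blobs if b["r"] == r_max]
    hole = ties[0] if len(ties) == 1 else max(ties, key=lambda b: b["area"])
    spot = max((b for b in blobs if b is not hole), key=lambda b: b["area"])
    to_center = lambda p: (p[0] + L / 2, p[1] + J / 2)
    return to_center(spot["match"]), to_center(hole["match"])
```

The close-target case (full-image re-matching and nearest-center reassignment) would be layered on top when the coefficients and areas coincide.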

Claims (4)

1. A multi-optical-target rapid classification and identification method based on template matching, characterized by comprising the following program steps executed on a computer:
step 1: binarization processing and BLOB area information statistics;
carrying out binarization processing on an original image acquired in a simulated-light collimation process, and gathering the linked-list information of each BLOB region in the binarized image; the linked-list information comprises the center position of the BLOB region and the area of the BLOB region;
step 2: acquiring a template matching image similar to the characteristics of the boosting hole area from the original image;
step 3: carrying out template matching by using an autocorrelation matching method;
the process of obtaining the correlation coefficient r (x, y) is specifically as follows:
moving the template matching image w(s, t) point by point over the original image f(x, y), making the origin of w(s, t) coincide with each point of f(x, y) in turn, calculating the sum of products of corresponding pixels between w(s, t) and the image area it covers in f(x, y), and then normalizing the result to obtain the correlation coefficient;
recording the upper left corner (x0, y0) of the template matching image w(s, t) at the best match as the optimal matching point;
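The correlation search of step 3 can be sketched as below: slide the template over the image, take the sum of products over the overlap, normalize by the energy of both patches, and record the upper-left corner with the largest coefficient. The function name and exhaustive scan are illustrative assumptions; they do not reproduce the patent's Template_By_BlobInfo code.

```python
# Normalized correlation template matching (brute-force sketch).
import numpy as np

def match_template(f, w):
    """Return ((x0, y0), r_max): upper-left corner of the best match and its coefficient."""
    f = np.asarray(f, dtype=float)
    w = np.asarray(w, dtype=float)
    J, L = w.shape
    m, n = f.shape
    w_norm = np.sqrt((w * w).sum())
    best, best_pos = -1.0, (0, 0)
    for y in range(m - J + 1):
        for x in range(n - L + 1):
            patch = f[y:y+J, x:x+L]
            denom = w_norm * np.sqrt((patch * patch).sum())
            # sum of products of corresponding pixels, then normalization
            r = (patch * w).sum() / denom if denom > 0 else 0.0
            if r > best:
                best, best_pos = r, (x, y)
    return best_pos, best
```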
step 4: optical target identification, classification and localization;
step 4.1: determination of the light spot area and the boosting hole area;
step 4.1.1: the initial searched flag of each BLOB area is 0; among all BLOB areas, the BLOB area with the largest correlation coefficient r(x, y) is selected first; if there is only one BLOB area with the largest correlation coefficient, that area is taken as the boosting hole area, its searched flag is changed to 1, and the method jumps to step 4.1.3; if two or more BLOB areas share the maximum correlation coefficient, step 4.1.2 is performed;
step 4.1.2: comparing the areas of the two or more BLOB areas, taking the BLOB area with the largest area as the boosting hole area, and changing its searched flag to 1;
if the correlation coefficients of two or more BLOB areas are the same and their areas are also the same, the size in pixels of the template matching image is considered to be smaller than the distance between the light spot center and the boosting hole center, and the boosting hole area needs to be screened according to the distance between the template matching center position and the center of each BLOB area;
specifically: calculating the distance between the template matching center position and the center of each BLOB area; the BLOB area corresponding to the minimum distance is the boosting hole area, and the template matching center position at that minimum distance is the boosting hole center;
step 4.1.3: searching all BLOB areas whose searched flag is 0, and taking the largest of these BLOB areas as the light spot area;
step 4.2: calculating the central positions of the light spot area and the boosting hole area;
because the position returned by template matching is the upper left corner of the template matching image, the template matching position needs to be converted into the template matching center position; the conversion is: template matching center position x = position x returned by template matching + template matching image width / 2; template matching center position y = position y returned by template matching + template matching image height / 2.
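The decision logic of step 4 (largest coefficient, ties broken by area, then by distance to the template matching center) and the center conversion of step 4.2 can be sketched as follows. The dictionary fields and function name are illustrative assumptions, not the patent's data structures.

```python
# Classify BLOBs into boosting hole and light spot, and convert the
# matched upper-left corner into the boosting hole center.
import math

def classify(blobs, tpl_w, tpl_h):
    """blobs: [{'r': coeff, 'area': n, 'center': (x, y), 'match_pos': (x0, y0)}, ...]"""
    r_max = max(b['r'] for b in blobs)
    cands = [b for b in blobs if b['r'] == r_max]
    if len(cands) > 1:                       # step 4.1.2: tie on r -> largest area
        a_max = max(b['area'] for b in cands)
        cands = [b for b in cands if b['area'] == a_max]
    if len(cands) > 1:                       # tie on r and area -> minimum distance
        def dist(b):
            mx = b['match_pos'][0] + tpl_w / 2   # step 4.2 center conversion
            my = b['match_pos'][1] + tpl_h / 2
            return math.hypot(mx - b['center'][0], my - b['center'][1])
        cands.sort(key=dist)
    hole = cands[0]
    # step 4.1.3: largest remaining (unsearched) BLOB is the light spot
    spot = max((b for b in blobs if b is not hole), key=lambda b: b['area'])
    hole_center = (hole['match_pos'][0] + tpl_w / 2,
                   hole['match_pos'][1] + tpl_h / 2)
    return hole, spot, hole_center
```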
2. The template matching-based multi-optical target rapid classification and identification method according to claim 1, characterized in that:
in step 3, template matching is performed by narrowing the search range and sampling at intervals:
narrowing the search range specifically means: performing the autocorrelation operation only in BLOB areas where the gray level of the binarized image is 255;
interval sampling specifically means: in the template matching operation, performing the autocorrelation operation by sampling at an interval of 2, 4 or 8 pixels.
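The two accelerations of claim 2 can be sketched together: evaluate r(x, y) only at positions inside 255-valued BLOB regions of the binary image, and sample both the template and the covered patch at a stride of 2, 4 or 8 instead of every pixel. Names and the exact stride handling are assumptions for illustration.

```python
# Fast correlation: restricted search range plus interval sampling.
import numpy as np

def fast_match(f, w, binary, step=2):
    f = np.asarray(f, dtype=float); w = np.asarray(w, dtype=float)
    J, L = w.shape
    ws = w[::step, ::step]                      # interval-sampled template
    w_norm = np.sqrt((ws * ws).sum())
    best, best_pos = -1.0, None
    for y in range(f.shape[0] - J + 1):
        for x in range(f.shape[1] - L + 1):
            if binary[y, x] != 255:             # search only inside BLOB areas
                continue
            patch = f[y:y+J:step, x:x+L:step]   # sample every `step` pixels
            denom = w_norm * np.sqrt((patch * patch).sum())
            r = (patch * ws).sum() / denom if denom > 0 else 0.0
            if r > best:
                best, best_pos = r, (x, y)
    return best_pos, best
```

With a stride of 2 the per-position cost drops by roughly a factor of four; the restricted search range removes most candidate positions entirely.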
3. The template matching-based multi-optical target rapid classification and identification method according to claim 1, characterized in that: the formula of the autocorrelation operation in step 3 is:
r(x, y) = Σ_{s=0}^{L} Σ_{t=0}^{J} w(s, t) · f(x+s, y+t) / [ Σ_{s=0}^{L} Σ_{t=0}^{J} w(s, t)² · Σ_{s=0}^{L} Σ_{t=0}^{J} f(x+s, y+t)² ]^{1/2}
wherein f (x, y) represents an original image;
w (s, t) represents a template matching image;
x=0,1,2,...,n-L;y=0,1,2,...,m-J;
s=0,1,2,...,L;t=0,1,2,...,J。
4. The template matching-based multi-optical target rapid classification and identification method according to claim 1, characterized in that: the method is executed on the frequency-tripling analog light source software.
CN202110705144.7A 2021-06-24 2021-06-24 Multi-optical target rapid classification and identification method based on template matching Active CN113537303B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110705144.7A CN113537303B (en) 2021-06-24 2021-06-24 Multi-optical target rapid classification and identification method based on template matching


Publications (2)

Publication Number Publication Date
CN113537303A CN113537303A (en) 2021-10-22
CN113537303B true CN113537303B (en) 2023-01-06

Family

ID=78096610


Country Status (1)

Country Link
CN (1) CN113537303B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114580522B (en) * 2022-02-28 2023-08-11 中国科学院西安光学精密机械研究所 Method for identifying multiple optical targets based on least square circle fitting method
CN114580532B (en) * 2022-02-28 2023-05-26 中国科学院西安光学精密机械研究所 Multi-target identification method based on optical target one-dimensional curve crest feature extraction

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102542660A (en) * 2011-12-05 2012-07-04 中钞实业有限公司 Bill anti-counterfeiting identification method based on bill watermark distribution characteristics
CN103065162A (en) * 2013-01-31 2013-04-24 西安电子科技大学 SAR (Synthetic Aperture Radar) target azimuth angle estimation method based on sparse description
CN109146006A (en) * 2018-10-23 2019-01-04 上海交通大学 Template library combines non-linear template matching Fetal Heart Rate extracting method
CN109816627A (en) * 2018-12-13 2019-05-28 中国科学院西安光学精密机械研究所 The small and weak defect object detection method in flat glass parts ink area
CN110910318A (en) * 2019-10-21 2020-03-24 中国科学院西安光学精密机械研究所 Weak contrast schlieren small ball center calculation method for comprehensive diagnosis light path quick automatic collimation system
RU2738025C1 (en) * 2020-03-27 2020-12-07 Общество с Ограниченной Ответственностью "СТРИМ Лабс" Method of television channel logo detection in television broadcast


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Detection and quantification of single engineered nanoparticles in complex samples using template matching in wide-field surface plasmon microscopy; S Nizamov et al.; ACS Publications; 2016-09-16; full text *
The method for calculating optical target center of weak contrast collimating image in integrated diagnosis system; Zhengzhou Wang et al.; AOPC 2020: Optical Sensing and Imaging Technology; 2020-11-05; full text *
Fast, high-precision and reliable image matching based on shape templates; Zhao Zhenmin et al.; Journal of Computer Applications; 2010-02-01 (No. 02); full text *
Target recognition for fully autonomous soccer robots based on a weighted template matching algorithm; Wu Xiao et al.; Journal of Xiamen University (Natural Science Edition); 2008-11-15 (No. 06); full text *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant