WO2020037466A1 - Image matching method and vision system - Google Patents
Image matching method and vision system
- Publication number
- WO2020037466A1 (PCT/CN2018/101371)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- matching
- matched
- reference line
- image
- point
- Prior art date
Classifications
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Definitions
- the present application relates to the field of industrial vision, and in particular, to an image matching method and a vision system.
- Industrial vision systems are used for automatic inspection, workpiece processing and assembly automation, and control and monitoring of production processes.
- the image recognition process of the industrial vision system is to extract relevant information from the original image data and describe the content of the image in a high-level manner in order to explain and judge certain content of the image.
- the present application mainly provides an image matching method and a vision system, which can realize fast image matching, improve matching speed, and save matching time.
- the first technical solution adopted in this application is to provide an image matching method, which includes: extracting the contour lines of a workpiece to be processed in a template image and generating a matching reference line from those contour lines, where the intersection point between the lines of the matching reference line is a feature point used as a position lock before matching; obtaining the contour lines of the target workpiece in the target image and extracting the corner points of those contour lines as corner points to be matched; and moving the matching reference line to a corner point to be matched and matching it against the target workpiece to obtain a matching result.
- the position of the feature point coincides with the corner point to be matched.
- the second technical solution adopted in this application is to provide a vision system, including an image acquisition device, a memory, and a processor, the processor being coupled to the image acquisition device and the memory, respectively; the image acquisition device is used to acquire a template image and a target image and send the acquired images to the processor; the memory is used to store the template image, the target image, program data, and data processed by the processor; when processing the data, the processor performs the following steps: extracting the contour lines of the workpiece to be processed in the template image and generating a matching reference line from those contour lines, wherein the intersection between the lines of the matching reference line is a feature point used as a position lock before matching; obtaining the contour lines of the target workpiece in the target image and extracting their corner points as corner points to be matched; and moving the matching reference line to a corner point to be matched and matching it against the target workpiece to obtain a matching result, with the feature point coinciding with the corner point to be matched.
- different from the prior art, the present application generates a matching reference line from the contour lines of the workpiece to be processed, extracts the corner points of the contour lines of the target workpiece as corner points to be matched, moves the matching reference line to a corner point to be matched, and matches it against the target workpiece to obtain a matching result.
- by matching the matching reference line, rather than the whole template image, against the target workpiece, images can be matched quickly, increasing matching speed and saving matching time.
- FIG. 1 is a schematic flowchart of an embodiment of an image matching method provided by the present application
- FIG. 2 is a schematic structural diagram of an embodiment of extracting a contour line of a workpiece to be processed provided by the present application
- FIG. 3 is a schematic structural diagram of an embodiment of a matching reference line generated from an outline in FIG. 2;
- FIG. 4 is a schematic structural diagram of an embodiment of extracting a contour line of a target workpiece in a target image provided by the present application
- FIG. 5a is a schematic diagram of the initial position of the matching reference line moved to a corner point to be matched provided by the present application;
- FIG. 5b is a schematic diagram of the position of the matching reference line after rotation by 40 degrees about the corner point to be matched as the rotation center, provided by the present application;
- FIG. 6a is a schematic diagram of the initial position of the matching reference line moved to a corner point to be matched provided by the present application;
- FIG. 6b is a schematic diagram of the position of the matching reference line after rotation by 40 degrees about the corner point to be matched as the rotation center, provided by the present application;
- FIG. 7 is a curve diagram of a change in the matching probability value obtained by the second matching provided by the present application as a function of a rotation angle
- FIG. 8 is a schematic structural diagram of an embodiment of a vision system provided by the present application.
- the present application extracts matching reference lines from the template image and matches the matching reference lines with the target workpiece, thereby enabling rapid image matching and increasing the matching speed.
- FIG. 1 is a schematic flowchart of an embodiment of an image matching method provided by the present application.
- the image matching method of this embodiment includes the following steps:
- Step 101 Extract the contour line of the workpiece to be processed in the template image, and generate a matching reference line according to the contour line of the workpiece to be processed.
- the intersection between the matching reference lines is a feature point and is used as a position lock before matching.
- the contour lines of the workpiece to be processed are extracted from the template image, and two adjacent, intersecting lines are arbitrarily selected from them to generate the matching reference line, which is used instead of the template image as the matching reference.
- the intersection of the two adjacent and intersecting lines is a feature point, and is used as a position lock before matching.
- FIG. 2 is a schematic structural diagram of an embodiment of extracting contour lines of a workpiece to be processed provided in the present application.
- FIG. 3 is a schematic structural diagram of an embodiment of the matching reference line generated from the contour lines in FIG. 2.
- the workpiece 20 to be processed has nine contour lines 21, 22, 23, 24, 25, 26, 27, 28, and 29, and the intersection point of each adjacent two contour lines is a feature point.
- Eight feature points a, b, c, d, e, f, g, h are marked, where the intersection point of the contour line 21 and the contour line 22 is the feature point a, and the intersection point of the contour line 21 and the contour line 23 is the feature point b.
- the contour line 22 and the contour line 26 are selected to generate a matching reference line.
- the intersection of the contour line 22 and the contour line 26 is the feature point d, that is, the feature point of the reference line is the point d.
- the reference line is used instead of the template image as the matching reference in the subsequent matching steps.
- in other embodiments, any other two adjacent and intersecting contour lines in the template image may also be selected to generate the reference line, which is not specifically limited herein.
- the reference lines have the same function as the template image, and both serve as a reference for the matching process. However, compared to the template image, the reference line has a simplified form, and the time required for matching is greatly reduced.
- the matching reference line selected in FIG. 3 includes two contour lines 22 and 26 that are adjacent to each other in the template image, that is, the length of the contour line in the matching reference line is equal to the length of the contour line in the template image.
- the length of the contour line in the matching reference line may also be shorter than the length of the contour line in the template image, which is not specifically limited here.
- Step 102 Obtain the outline of the target workpiece in the target image, and extract the corner points of the outline of the target workpiece as the corner points to be matched.
- FIG. 4 is a schematic structural diagram of an embodiment of extracting the contour lines of the target workpiece in the target image provided in the present application.
- the target workpiece 40 has nine contour lines 41, 42, 43, 44, 45, 46, 47, 48, and 49, and the intersection point of each pair of adjacent, intersecting contour lines is a corner point.
- eight corner points A, B, C, D, E, F, G, H are marked in total.
- the intersection point of the contour line 41 and the contour line 42 is the corner point A, and the intersection point of the contour line 41 and the contour line 43 is the corner point B.
- the corner points of the other adjacent, intersecting pairs of contour lines follow by analogy, and all of these corner points can serve as the corner points to be matched.
- Step 103 Move the matching reference line to the corner point to be matched, and start matching with the target workpiece to obtain a matching result, and the feature point coincides with the position of the corner point to be matched.
- the matching reference line is moved to the corner point of the target workpiece to be matched, and the feature point coincides with the position of the corner point to be matched.
- the matching reference line is then rotated under control about the corner point to be matched as the rotation center to complete the matching. Specifically, the matching reference line is moved to a corner point to be matched on the target workpiece so that the feature point of the matching reference line coincides with that corner point.
- the matching reference line is rotated in large steps about the corner point to be matched as the rotation center to complete a first matching; the first matching result is obtained, the rotation angle interval with the larger matching probability values in the first matching result is taken as the control interval for a second matching, and the matching reference line is rotated in small steps within the control interval to complete the second matching.
- using the feature point as the position lock before matching means that the feature point is moved onto the corner point to be matched so that the two coincide, after which the matching reference line is rotated about that corner point as the rotation center for matching.
- in this embodiment, the corner points to be matched are selected in turn; the matching reference line is moved to each corner point to be matched on the target workpiece so that the feature point coincides with it, and the matching reference line is rotated about that corner point, as the rotation center, in steps of a first preset angle to complete the first matching and obtain a matching probability value curve.
- the matching reference line selected in FIG. 3 is sequentially moved to the corner point to be matched in FIG. 4 to make the feature point d coincide with the position of the corner point to be matched.
- in the following, a first preset angle equal to 40 degrees is taken as an example.
- FIG. 5a is a schematic diagram of an initial position for moving a matching reference line to a corner point to be matched provided in the present application
- FIG. 5b is a schematic diagram, provided in the present application, of the position of the matching reference line after it is rotated by 40 degrees about the corner point to be matched as the rotation center.
- two adjacent dotted lines represent the matching reference line.
- here the corner point to be matched is the corner point A in the target image, and the matching reference line is rotated about the corner point A as the rotation center; the positions at rotations of 80, 120, 160, 200, 240, 280, and 320 degrees follow by analogy.
- the matching reference line is rotated about point A in large steps of the first preset angle equal to 40 degrees, and the matching probability value between the matching reference line and the target image is recorded at every 40-degree step, giving the matching probability value curve of the matching reference line rotated about point A.
- the matching probability value in this embodiment is a matching probability value between the gray information or edge information of the reference line and the gray information or edge information of the target image.
- the closer the matching reference line comes to coinciding with the contour lines of the target image, the closer their gray or edge information agrees and the larger the matching probability value.
- compared with FIG. 5b, the matching reference line in FIG. 5a is closer to the contour lines of the target image.
- in a specific embodiment, the matching probability value of the matching reference line against the target image in FIG. 5a is 8%, and in FIG. 5b it is 4%.
- the matching probability values at the other positions reached by rotating the matching reference line around point A follow by analogy, yielding the matching probability value curve for rotation around point A, that is, the curve of matching probability value versus rotation angle.
- FIG. 6a is a schematic diagram, provided by the present application, of the initial position of the matching reference line moved to a corner point to be matched.
- FIG. 6b is a schematic diagram, provided by the present application, of the position of the matching reference line after it is rotated by 40 degrees about the corner point to be matched as the rotation center.
- in FIGS. 6a and 6b, the two adjacent, intersecting dotted segments represent the matching reference line.
- here the corner point to be matched is the corner point D in the target image, and the matching reference line is rotated about the corner point D as the rotation center; the positions at rotations of 80, 120, 160, 200, 240, 280, and 320 degrees follow by analogy.
- the matching reference line is rotated about point D in large steps of the first preset angle equal to 40 degrees, and the matching probability value between the matching reference line and the target image is recorded at every 40-degree step.
- in a specific embodiment, the matching probability value of the matching reference line against the target image in FIG. 6a is 70%, and in FIG. 6b it is 9%.
- the matching probability values at the other positions reached by rotating the matching reference line around the corner point D follow by analogy, yielding the curve of matching probability value versus rotation angle for rotation around point D.
- in a specific embodiment, matching can be performed using the gray information of the target image and the template image: the greater the similarity between the pixel region selected in the template image and a pixel region in the target image, the larger the matching probability value.
- for example, T(x, y) denotes the pixel region selected at (x, y) in the template image, and Si,j(x, y) denotes the corresponding pixel region in the target image, each region being an M*N rectangle whose top-left corner in the target image has coordinates (i, j).
- the similarity R(i, j) between the selected pixel region in the template image and the pixel region in the target image is calculated by the following formula (1):
- the value of R(i, j) ranges over [0, 1]: when R(i, j) equals 0, the selected pixel region in the template image and the pixel region of the target image match least; when R(i, j) equals 1, they match best, that is, they match completely.
- for example, the selected pixel region of the template image is a 3*3 region of 9 pixels; assume 1 of the 9 pixels has gray value 55, 3 pixels have gray value 78, 2 pixels have gray value 90, and 3 pixels have gray value 100, so the proportions of pixels with gray values 55, 78, 90, and 100 are 1/9, 3/9, 2/9, and 3/9, respectively, and the selected template region can be written as T(x, y) = (1/9, 3/9, 2/9, 3/9); the pixel region of the target image whose rectangle has its top-left corner at (i, j) can likewise be written as Si,j(x, y) = (1/9, 2/9, 4/9, 2/9); substituting T(x, y) and Si,j(x, y) into formula (1) gives the similarity R(i, j) between the two regions, and if the computed R(i, j) is not less than a predetermined value the target image matches the template image, otherwise it does not.
- in this embodiment matching is performed using the gray information of the target image and the template image; in other specific embodiments matching may also be performed using edge information.
- in another specific embodiment, the edge information of the template image and the target image is obtained, an edge being the set of pixels in the image where the gray level changes abruptly.
- the edge information is described with parameters that represent attributes relevant to image similarity, and matching is then performed on the described parameters.
- in this embodiment, the set of contour-line pixels in the image is the edge information; if the similarity obtained by matching the parameters is not less than the predetermined value, the target image matches the template image, otherwise it does not.
- if the maximum matching probability value obtained in the first matching is not less than a predetermined value, the target image matches the template image, and in this case only the large-step rotation of the first matching is needed to obtain the matching result; if the maximum matching probability value is less than the predetermined value, the matching probability value curve obtained from the first matching is taken, and the rotation angle interval with the larger matching probability values in that curve is used as the control interval for the second matching.
- the matching reference line is then rotated within the control interval in steps of a second preset angle, smaller than the first preset angle, to complete the second matching and obtain the matching result.
- in the above embodiment, if the maximum matching probability value of the first matching is not less than the predetermined value, only one pass of rotation matching is needed; otherwise, if the maximum matching probability value of the second matching is not less than the predetermined value, the target image matches the template image, and if it is still less, the two do not match.
- in other specific embodiments, if the maximum matching probability value of the second matching is less than the predetermined value, a third rotation matching may be performed as required, that is, rotation matching with a progressively reduced rotation angle.
- generally this progressively refined rotation matching is performed twice, but it can be adjusted to three or more passes according to the actual situation, which is not specifically limited here.
- in this embodiment, the corner points to be matched in the target image are traversed in turn during the first matching; after all eight corner points A, B, C, D, E, F, G, and H have been traversed, the rotation angle interval with the larger matching probability values is taken from the matching probability value curves of the first matching and used as the control interval for the second matching.
- in other embodiments, not all corner points in the target image need be traversed during the first matching; for example, if the corner points to be matched are taken in turn and the larger matching probability values in the curves of some corner points already exceed the predetermined value, the remaining corner points need not be matched, which further saves matching time.
- in a specific embodiment, the maximum matching probability value obtained in the first matching is 70%, which is less than the predetermined value of 95%; this 70% was obtained by rotating the matching reference line around point D, as shown in FIGS. 6a to 6b.
- with the first preset angle equal to 40 degrees, the corresponding rotation angle interval, for example 0 degrees to 40 degrees, is used as the control interval for the second matching, and the matching reference line is rotated within that interval in steps of a second preset angle equal to 5 degrees to obtain the curve of the second matching probability value versus rotation angle.
- FIG. 7 is a schematic diagram, provided by the present application, of the matching probability value obtained in the second matching as a function of rotation angle.
- the ordinate represents the matching probability value
- the abscissa represents the rotation angle.
- the second preset angle is equal to 5 degrees
- the rotation angle is 5 degrees, 10 degrees, 15 degrees, 20 degrees, 25 degrees, 30 degrees, 35 degrees, and 40 degrees.
- the matching probability values are 80%, 96%, 79%, 51%, 30%, 20%, 13%, and 9%.
- the matching probability value obtained in the second matching is largest at a rotation angle of 10 degrees, where it equals 96%; since 96% is greater than the predetermined value of 95%, the target image matches the template image. In other embodiments, if the maximum of the matching probability values obtained in the second matching is less than the predetermined value of 95%, the target image does not match the template image.
- in the above embodiments the selected matching reference line includes two adjacent, intersecting contour lines of the template image; in other embodiments, to further save matching time, the selected matching reference line may include only one contour line of the template image, with one end point of that contour line used as the feature point for matching.
- for example, the contour line 22 of the template image in FIG. 2 is selected as the matching reference line and its end point d is used as the feature point; the contour line 22 is moved to the target image so that the feature point d coincides in turn with each of the eight corner points in the target image, and rotation matching is performed in turn at each corner point to be matched.
- in another embodiment, all the contour lines of the template image may also be selected directly as the matching reference line, one feature point is selected, and the matching reference line is moved to the target image so that the selected feature point coincides in turn with the corner points to be matched in the target image and rotation matching is performed.
- different from the prior art, the present application generates a matching reference line from the contour lines of the workpiece to be processed, extracts the corner points of the contour lines of the target workpiece as corner points to be matched, moves the matching reference line to a corner point to be matched, and matches it against the target workpiece to obtain a matching result.
- by matching the matching reference line, rather than the whole template image, against the target workpiece, images can be matched quickly, increasing matching speed and saving matching time.
- FIG. 8 is a schematic structural diagram of an embodiment of a vision system provided by the present application.
- the vision system 80 includes an image collector 81, a memory 82, and a processor 83.
- the processor 83 is coupled to the image collector 81 and the memory 82, respectively.
- the image collector 81 is used to acquire a template image and a target image and to send the acquired images to the processor 83.
- the memory 82 is used to store the template image, the target image, program data, and data processed by the processor 83; when processing the data, the processor 83 performs the following steps: extracting the contour lines of the workpiece to be processed in the template image and generating a matching reference line from those contour lines, wherein the intersection between the lines of the matching reference line is a feature point used as a position lock before matching; obtaining the contour lines of the target workpiece in the target image and extracting their corner points as corner points to be matched; and moving the matching reference line to a corner point to be matched and matching it against the target workpiece to obtain a matching result, with the feature point coinciding with the corner point to be matched.
- the vision system of the present application can execute the image matching method in any of the foregoing embodiments for image matching.
- the beneficial effect of the present application is that, unlike the case of the prior art, the vision system of the present application generates a matching reference line according to the contour line of the workpiece to be processed, and extracts the corner points of the contour line of the target workpiece as the corner points to be matched.
- the matching reference line is moved to the corner point to be matched, and starts to match with the target workpiece to obtain a matching result.
Landscapes
- Image Analysis (AREA)
Abstract
An image matching method and a vision system. The image matching method comprises: extracting the contour lines of a workpiece to be processed in a template image and generating a matching reference line from those contour lines, wherein the intersection point between the lines of the matching reference line is a feature point used as a position lock before matching (101); obtaining the contour lines of a target workpiece in a target image and extracting the corner points of those contour lines as corner points to be matched (102); and moving the matching reference line to a corner point to be matched and matching it against the target workpiece to obtain a matching result, the feature point coinciding with the corner point to be matched (103). In this way, images can be matched quickly, increasing matching speed and saving matching time.
Description
The present application relates to the field of industrial vision, and in particular to an image matching method and a vision system.
Industrial vision systems are used for automatic inspection, workpiece machining and assembly automation, and the control and monitoring of production processes. The image recognition process of an industrial vision system extracts the information relevant to the task from the raw image data and describes the image content at a high level of abstraction, so that certain content of the image can be interpreted and judged.
In industrial line production, industrial vision systems often replace the human eye for measurement and judgment. In such production lines, the excessive resolution of the inspected target images and the variable inspection angle often make image matching very slow and reduce working efficiency.
It is therefore necessary to provide an image matching method and a vision system to solve the above technical problem.
[Summary of the Invention]
The present application mainly provides an image matching method and a vision system that can achieve fast image matching, increase matching speed, and save matching time.
To solve the above main technical problem, the first technical solution adopted by the present application is to provide an image matching method, comprising: extracting the contour lines of a workpiece to be processed in a template image and generating a matching reference line from those contour lines, wherein the intersection point between the lines of the matching reference line is a feature point used as a position lock before matching; obtaining the contour lines of a target workpiece in a target image and extracting the corner points of those contour lines as corner points to be matched; and moving the matching reference line to a corner point to be matched and matching it against the target workpiece to obtain a matching result, the feature point coinciding with the corner point to be matched.
To solve the above technical problem, the second technical solution adopted by the present application is to provide a vision system, comprising an image collector, a memory, and a processor, the processor being coupled to the image collector and the memory, respectively; the image collector is used to acquire a template image and a target image and to send the acquired images to the processor; the memory is used to store the template image, the target image, program data, and data processed by the processor; when processing the data, the processor performs the following steps: extracting the contour lines of the workpiece to be processed in the template image and generating a matching reference line from those contour lines, wherein the intersection point between the lines of the matching reference line is a feature point used as a position lock before matching; obtaining the contour lines of the target workpiece in the target image and extracting the corner points of those contour lines as corner points to be matched; and moving the matching reference line to a corner point to be matched and matching it against the target workpiece to obtain a matching result, the feature point coinciding with the corner point to be matched.
The beneficial effect of the present application is that, unlike the prior art, the present application generates a matching reference line from the contour lines of the workpiece to be processed, extracts the corner points of the contour lines of the target workpiece as corner points to be matched, moves the matching reference line to a corner point to be matched, and matches it against the target workpiece to obtain a matching result. By matching the matching reference line against the target workpiece, images can be matched quickly, increasing matching speed and saving matching time.
FIG. 1 is a schematic flowchart of an embodiment of the image matching method provided by the present application;
FIG. 2 is a schematic structural diagram of an embodiment of extracting the contour lines of the workpiece to be processed provided by the present application;
FIG. 3 is a schematic structural diagram of an embodiment of the matching reference line generated from the contour lines in FIG. 2;
FIG. 4 is a schematic structural diagram of an embodiment of extracting the contour lines of the target workpiece in the target image provided by the present application;
FIG. 5a is a schematic diagram of the initial position of the matching reference line moved to a corner point to be matched provided by the present application;
FIG. 5b is a schematic diagram of the position of the matching reference line after rotation by 40 degrees about the corner point to be matched as the rotation center provided by the present application;
FIG. 6a is a schematic diagram of the initial position of the matching reference line moved to a corner point to be matched provided by the present application;
FIG. 6b is a schematic diagram of the position of the matching reference line after rotation by 40 degrees about the corner point to be matched as the rotation center provided by the present application;
FIG. 7 is a curve diagram of the matching probability value obtained in the second matching, provided by the present application, as a function of rotation angle;
FIG. 8 is a schematic structural diagram of an embodiment of the vision system provided by the present application.
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings of the embodiments. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the protection scope of the present application.
When matching a target image against a template image, the sampled images usually have high resolution, so the existing technique of matching every pixel one by one entails a large amount of computation, and its matching speed cannot meet the standard of fast industrial production. To solve this technical problem, the present application extracts a matching reference line from the template image and matches the matching reference line against the target workpiece, which enables fast image matching and increases matching speed. The present application is described in detail below with reference to the accompanying drawings.
Referring to FIG. 1, FIG. 1 is a schematic flowchart of an embodiment of the image matching method provided by the present application. The image matching method of this embodiment includes the following steps:
Step 101: Extract the contour lines of the workpiece to be processed in the template image and generate a matching reference line from those contour lines, wherein the intersection point between the lines of the matching reference line is a feature point used as a position lock before matching.
The contour lines of the workpiece to be processed are extracted from the template image, and two adjacent, intersecting lines are arbitrarily selected from them to generate the matching reference line, which is used instead of the template image as the matching reference. The intersection point of the two adjacent, intersecting lines is the feature point and serves as the position lock before matching.
Referring to FIGS. 2 and 3, FIG. 2 is a schematic structural diagram of an embodiment of extracting the contour lines of the workpiece to be processed provided by the present application, and FIG. 3 is a schematic structural diagram of an embodiment of the matching reference line generated from the contour lines in FIG. 2. As shown in FIG. 2, the workpiece 20 to be processed has nine contour lines 21, 22, 23, 24, 25, 26, 27, 28, and 29, and the intersection point of each pair of adjacent, intersecting contour lines is a feature point. Eight feature points a, b, c, d, e, f, g, and h are marked in FIG. 2: the intersection of contour line 21 and contour line 22 is feature point a, the intersection of contour line 21 and contour line 23 is feature point b, and the feature points of the other adjacent, intersecting pairs of contour lines follow by analogy. In FIG. 3, contour lines 22 and 26 are selected to generate the matching reference line; their intersection is feature point d, that is, the feature point of the reference line is point d, and the reference line is used instead of the template image as the matching reference in the subsequent steps. In other embodiments, any other two adjacent, intersecting contour lines of the template image may be selected to generate the reference line, which is not specifically limited here. The reference line plays the same role as the template image, both serving as a reference for the matching process, but because of its simplified form the reference line greatly reduces the time required for matching.
In this embodiment, the matching reference line selected in FIG. 3 includes the two adjacent, intersecting contour lines 22 and 26 of the template image, that is, the length of the contour lines in the matching reference line equals the length of the contour lines in the template image. In other embodiments, the length of the contour lines in the matching reference line may also be shorter than that in the template image, which is not specifically limited here.
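As an illustration of how the matching reference line of step 101 could be represented in code, the Python sketch below bundles two adjacent, intersecting contour segments with their intersection point (the feature point). This is only a minimal sketch; the function names, the data layout, and the example coordinates are assumptions for illustration and are not taken from the patent.

```python
import numpy as np

def segment_intersection(p1, p2, q1, q2):
    """Intersection of the lines through segments p1-p2 and q1-q2 (2-D points)."""
    p1, p2, q1, q2 = (np.asarray(p, dtype=float) for p in (p1, p2, q1, q2))
    d1, d2 = p2 - p1, q2 - q1
    A = np.column_stack([d1, -d2])            # solve p1 + t*d1 == q1 + s*d2
    if abs(np.linalg.det(A)) < 1e-9:
        raise ValueError("segments are parallel, no unique intersection")
    t, _ = np.linalg.solve(A, q1 - p1)
    return p1 + t * d1

def build_reference_line(seg_a, seg_b):
    """Bundle two adjacent, intersecting contour segments into a reference line.

    The segments are stored relative to their intersection (the feature point),
    so the template can later be translated onto a candidate corner point and
    rotated about it.
    """
    feature = segment_intersection(*seg_a, *seg_b)
    return {
        "feature_point": feature,                              # position lock before matching
        "segments": [np.asarray(seg_a, dtype=float) - feature,
                     np.asarray(seg_b, dtype=float) - feature],
    }

# Example: two contour segments meeting at the origin, like lines 22 and 26 at point d.
ref = build_reference_line(((0, 0), (0, 40)), ((0, 0), (30, 0)))
print(ref["feature_point"])    # -> [0. 0.]
```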
Step 102: Obtain the contour lines of the target workpiece in the target image and extract the corner points of those contour lines as corner points to be matched.
The contour lines of the target workpiece in the target image are obtained, and the intersection point between any two adjacent, intersecting contour lines is selected as a corner point to be matched.
Referring to FIG. 4, FIG. 4 is a schematic structural diagram of an embodiment of extracting the contour lines of the target workpiece in the target image provided by the present application. As shown in FIG. 4, the target workpiece 40 has nine contour lines 41, 42, 43, 44, 45, 46, 47, 48, and 49, and the intersection point of each pair of adjacent, intersecting contour lines is a corner point. Eight corner points A, B, C, D, E, F, G, and H are marked in FIG. 4: the intersection of contour line 41 and contour line 42 is corner point A, the intersection of contour line 41 and contour line 43 is corner point B, and the corner points of the other adjacent, intersecting pairs of contour lines follow by analogy. All of these corner points can serve as corner points to be matched.
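The patent does not prescribe a particular corner detector for step 102, so the following sketch simply treats the target contour as an ordered closed polyline and keeps every vertex where the boundary turns sharply. The 150-degree threshold and the helper name are assumptions for illustration.

```python
import numpy as np

def extract_corners(contour, angle_thresh_deg=150.0):
    """Corner points of a closed polyline given as an (N, 2) array of boundary points.

    A vertex is kept as a candidate corner point to be matched when the angle
    between its incoming and outgoing edges is below `angle_thresh_deg`, i.e.
    the boundary clearly changes direction there.
    """
    pts = np.asarray(contour, dtype=float)
    corners = []
    n = len(pts)
    for i in range(n):
        prev_pt, cur, nxt = pts[i - 1], pts[i], pts[(i + 1) % n]
        v1, v2 = prev_pt - cur, nxt - cur
        cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-12)
        angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
        if angle < angle_thresh_deg:
            corners.append(cur)
    return np.array(corners)

# A rectangular contour yields its four vertices as corner points to be matched.
print(extract_corners([[0, 0], [10, 0], [10, 5], [0, 5]]))
```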
Step 103: Move the matching reference line to a corner point to be matched and match it against the target workpiece to obtain a matching result, the feature point coinciding with the corner point to be matched.
The matching reference line is moved to a corner point to be matched on the target workpiece so that the feature point coincides with that corner point, and the matching reference line is then rotated under control about the corner point to be matched as the rotation center to complete the matching. Specifically, the matching reference line is moved to a corner point to be matched on the target workpiece, the feature point of the matching reference line coincides with the corner point, and the matching reference line is rotated in large steps about the corner point as the rotation center to complete a first matching; the first matching result is obtained, the rotation angle interval with the larger matching probability values in the first matching result is used as the control interval for a second matching, and the matching reference line is rotated in small steps within the control interval to complete the second matching.
Using the feature point as the position lock before matching in step 101 above means that the feature point is moved onto the corner point to be matched so that the two coincide, after which the matching reference line is rotated about the corner point to be matched as the rotation center for matching.
In this embodiment, the corner points to be matched are selected in turn, the matching reference line is moved to each corner point to be matched on the target workpiece so that the feature point coincides with it, and the matching reference line is rotated about the corner point as the rotation center in steps of a first preset angle to complete the first matching and obtain a matching probability value curve. In a specific embodiment, the matching reference line selected in FIG. 3 is moved in turn to the corner points to be matched in FIG. 4 so that the feature point d coincides with each corner point; a first preset angle equal to 40 degrees is taken as an example below.
Referring to FIGS. 5a and 5b, FIG. 5a is a schematic diagram of the initial position of the matching reference line moved to a corner point to be matched provided by the present application, and FIG. 5b is a schematic diagram of the position of the matching reference line after rotation by 40 degrees about the corner point to be matched as the rotation center provided by the present application. In FIGS. 5a and 5b, the two adjacent, intersecting dotted segments represent the matching reference line, the corner point to be matched is the corner point A in the target image, and the positions of the matching reference line rotated about the corner point A by 80, 120, 160, 200, 240, 280, and 320 degrees follow by analogy. The matching reference line is rotated about point A in large steps of the first preset angle equal to 40 degrees, and the matching probability value between the matching reference line and the target image is recorded at every 40-degree step, giving the matching probability value curve of the matching reference line rotated about point A.
The matching probability value in this embodiment is the matching probability value between the gray information and/or edge information of the matching reference line and the gray information and/or edge information of the target image. The closer the matching reference line comes to coinciding with the contour lines of the target image, the closer their gray and/or edge information agrees, that is, the larger the matching probability value. For example, compared with FIG. 5b, the matching reference line in FIG. 5a is closer to the contour lines of the target image. In a specific embodiment, the matching probability value of the matching reference line against the target image in FIG. 5a is 8% and in FIG. 5b it is 4%; the matching probability values at the other positions reached by rotating the matching reference line around point A follow by analogy, yielding the matching probability value curve for rotation around point A, that is, the curve of matching probability value versus rotation angle for rotation around point A.
Referring to FIGS. 6a and 6b, FIG. 6a is a schematic diagram of the initial position of the matching reference line moved to a corner point to be matched provided by the present application, and FIG. 6b is a schematic diagram of the position of the matching reference line after rotation by 40 degrees about the corner point to be matched as the rotation center provided by the present application. In FIGS. 6a and 6b, the two adjacent, intersecting dotted segments represent the matching reference line, the corner point to be matched is the corner point D in the target image, and the positions of the matching reference line rotated about the corner point D by 80, 120, 160, 200, 240, 280, and 320 degrees follow by analogy. The matching reference line is rotated about point D in large steps of the first preset angle equal to 40 degrees, and the matching probability value between the matching reference line and the target image is recorded at every 40-degree step. In a specific embodiment, the matching probability value of the matching reference line against the target image in FIG. 6a is 70% and in FIG. 6b it is 9%; the matching probability values of the matching reference line rotated to other positions around the corner point D follow by analogy, yielding the curve of matching probability value versus rotation angle for rotation of the matching reference line around point D.
Only the rotation of the matching reference line around the corner points A and D has been described above; the rotation of the matching reference line around the corner points B, C, E, F, G, and H follows by analogy and is not repeated here.
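A minimal sketch of the coarse first matching pass described above, under the assumption that the reference line is stored relative to its feature point: the template is translated onto a candidate corner, rotated in steps of the first preset angle, and scored at each step. The scorer `match_probability` is a caller-supplied stand-in for the gray or edge comparison, which is not specified numerically here.

```python
import numpy as np

def rotate_about(points, center, angle_deg):
    """Rotate an (N, 2) point set about `center` by `angle_deg` degrees."""
    a = np.radians(angle_deg)
    R = np.array([[np.cos(a), -np.sin(a)],
                  [np.sin(a),  np.cos(a)]])
    return (np.asarray(points, dtype=float) - center) @ R.T + center

def coarse_scan(ref_segments, corner, match_probability, step_deg=40):
    """First matching pass: rotate the reference line about `corner` in large steps.

    `ref_segments` are the template segments expressed relative to the feature
    point, so adding `corner` moves the feature point onto the corner point to
    be matched.  `match_probability(segments)` returns a value in [0, 1].
    Returns the matching probability value curve as {angle: probability}.
    """
    curve = {}
    for angle in range(0, 360, step_deg):
        placed = [rotate_about(seg + corner, corner, angle) for seg in ref_segments]
        curve[angle] = match_probability(placed)
    return curve          # e.g. {0: 0.70, 40: 0.09, ...} for the corner D example

# Toy usage with a constant dummy scorer.
segments = [np.array([[0.0, 0.0], [0.0, 40.0]]), np.array([[0.0, 0.0], [30.0, 0.0]])]
print(coarse_scan(segments, np.array([5.0, 5.0]), lambda segs: 0.5))
```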
In a specific embodiment, matching can be performed using the gray information of the target image and the template image: the greater the similarity between the pixel region selected in the template image and a pixel region in the target image, the larger the matching probability value. For example, T(x, y) denotes the pixel region selected at (x, y) in the template image and Si,j(x, y) denotes the pixel region at (x, y) in the target image, each pixel region being an M*N rectangle, where (i, j) is the coordinate of the top-left corner of the M*N rectangle in the target image. The similarity R(i, j) between the selected pixel region in the template image and the pixel region in the target image is calculated by the following formula (1):
The value of R(i, j) ranges over [0, 1]. When R(i, j) equals 0, the selected pixel region in the template image and the pixel region of the target image match least; when R(i, j) equals 1, they match best, that is, they match completely. For example, the selected pixel region of the template image is a 3*3 region of 9 pixels; assume 1 of the 9 pixels has gray value 55, 3 pixels have gray value 78, 2 pixels have gray value 90, and 3 pixels have gray value 100, so the proportions of pixels with gray values 55, 78, 90, and 100 are 1/9, 3/9, 2/9, and 3/9, respectively, and the selected template region can be written as T(x, y) = (1/9, 3/9, 2/9, 3/9). The pixel region of the target image whose rectangle has its top-left corner at (i, j) can likewise be written as Si,j(x, y) = (1/9, 2/9, 4/9, 2/9). Substituting T(x, y) and Si,j(x, y) into formula (1) above gives the value of the similarity R(i, j) between the pixel region at (i, j) in the target image and the selected template pixel region. If the computed similarity R(i, j) is not less than a predetermined value, the target image matches the template image; otherwise the two do not match.
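Formula (1) itself is not reproduced in this text, so the sketch below uses histogram intersection purely as an assumed stand-in with the properties stated above: it operates on gray-value proportion vectors, its value lies in [0, 1], and it equals 1 only when the two regions match completely. It is not the patent's own formula.

```python
import numpy as np

def region_similarity(t_hist, s_hist):
    """Similarity of two gray-value proportion vectors, in [0, 1].

    Histogram intersection: 1.0 when the proportions are identical and lower as
    they diverge.  This is only an assumed stand-in with the properties the
    text describes; it is not the patent's formula (1).
    """
    t = np.asarray(t_hist, dtype=float)
    s = np.asarray(s_hist, dtype=float)
    return float(np.minimum(t, s).sum())

# Proportions of gray values 55, 78, 90, 100 in the 3*3 regions from the example.
T = [1/9, 3/9, 2/9, 3/9]     # selected template region T(x, y)
S = [1/9, 2/9, 4/9, 2/9]     # candidate target region Si,j(x, y)
print(region_similarity(T, S))   # -> 0.777..., i.e. 7/9 under this assumed measure
```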
In this embodiment matching is performed using the gray information of the target image and the template image; in other specific embodiments matching may also be performed using edge information.
In another specific embodiment, the edge information of the template image and the target image is obtained, an edge being the set of pixels in the image where the gray level changes abruptly; the edge information is described with parameters that represent attributes relevant to image similarity, and matching is then performed on the described parameters. In this embodiment, the set of contour-line pixels in the image is the edge information; if the similarity obtained by matching the parameters is not less than the predetermined value, the target image matches the template image, otherwise the two do not match. The matching probability value curves obtained in the first matching consist of the curves obtained by rotating the matching reference line around each of the corner points A, B, C, D, E, F, G, and H. The larger matching probability values of the curves obtained by rotating the matching reference line around the eight corner points A, B, C, D, E, F, G, and H are obtained, and the maximum of these eight values is taken as the maximum matching probability value of the first matching. If this maximum matching probability value is not less than the predetermined value, the target image matches the template image, and in this case only the large-step rotation of the first matching is needed to obtain the matching result; if the maximum matching probability value is less than the predetermined value, the matching probability value curves obtained from the first matching are taken, the rotation angle interval with the larger matching probability values in those curves is used as the control interval for the second matching, and the matching reference line is rotated within the control interval in steps of a second preset angle, smaller than the first preset angle, to complete the second matching and obtain the matching result.
In the above embodiment, if the maximum matching probability value of a single matching is not less than the predetermined value, only one pass of rotation matching is needed; or, if the maximum matching probability value of the first matching is less than the predetermined value and the maximum matching probability value of the second matching is not less than the predetermined value, the target image matches the template image, otherwise the two do not match. In other specific embodiments, if the maximum matching probability value of the second matching is less than the predetermined value, a third rotation matching may be performed as required, that is, rotation matching with a progressively reduced rotation angle. Generally this progressively refined rotation matching is performed twice, but it can also be adjusted according to the actual situation, for example to three or more passes, which is not specifically limited here.
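As a rough illustration of the edge-information variant described above, the sketch below realises "pixels where the gray level changes abruptly" with a simple gradient-magnitude threshold. The operator, the threshold value, and whatever descriptor is applied to the resulting pixel set afterwards are all assumptions, since the text leaves them open.

```python
import numpy as np

def edge_pixels(gray, thresh=30.0):
    """Set of (row, col) coordinates where the gray level changes abruptly.

    Central-difference gradients plus a magnitude threshold; one common way to
    realise the criterion, not the specific operator used in the patent.
    """
    g = np.asarray(gray, dtype=float)
    gy, gx = np.gradient(g)
    magnitude = np.hypot(gx, gy)
    rows, cols = np.nonzero(magnitude > thresh)
    return set(zip(rows.tolist(), cols.tolist()))

# A vertical step edge in a small synthetic image.
img = np.zeros((5, 8))
img[:, 4:] = 200
print(sorted(edge_pixels(img)))   # pixels along columns 3 and 4
```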
In this embodiment, the corner points to be matched in the target image are traversed in turn during the first matching; after all eight corner points A, B, C, D, E, F, G, and H in the target image have been traversed, the rotation angle interval with the larger matching probability values is taken from the matching probability value curves of the first matching and used as the control interval for the second matching. In other embodiments, not all corner points in the target image need be traversed during the first matching; for example, if the corner points to be matched are taken in turn and the larger matching probability values in the curves of some corner points already exceed the predetermined value, the remaining corner points need not be matched, which further saves matching time.
In a specific embodiment, the maximum matching probability value obtained in the first matching equals 70%, which is less than the predetermined value of 95%; this 70% was obtained by rotating the matching reference line around point D, as shown in FIGS. 6a and 6b, with the first preset angle equal to 40 degrees. The corresponding rotation angle interval, for example 0 degrees to 40 degrees, is used as the control interval for the second matching, and the matching reference line is rotated within that interval in steps of a second preset angle equal to 5 degrees to obtain the curve of the second matching probability value versus rotation angle.
Referring to FIG. 7, FIG. 7 is a schematic curve diagram of the matching probability value obtained in the second matching, provided by the present application, as a function of rotation angle. The ordinate represents the matching probability value and the abscissa the rotation angle; with the second preset angle equal to 5 degrees, the matching probability values at rotation angles of 5, 10, 15, 20, 25, 30, 35, and 40 degrees are 80%, 96%, 79%, 51%, 30%, 20%, 13%, and 9%, respectively. The matching probability value obtained in the second matching is largest at a rotation angle of 10 degrees, where it equals 96%; since 96% is greater than the predetermined value of 95%, the target image matches the template image. In other embodiments, if the maximum of the matching probability values obtained in the second matching is less than the predetermined value of 95%, the target image does not match the template image.
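The overall two-pass control flow described above (traverse the candidate corners, rotate coarsely, stop early if the predetermined value is already reached, otherwise refine inside the best coarse interval) can be sketched as follows. The function name and the `score(corner, angle)` callback are placeholders; the 40-degree and 5-degree steps and the 95% acceptance value are taken from the example above.

```python
def match_workpiece(corners, score, coarse_step=40, fine_step=5, accept=0.95):
    """Two-pass rotation matching.

    `corners` are the candidate corner points to be matched; `score(corner,
    angle)` returns the matching probability of the reference line placed on
    `corner` and rotated by `angle` degrees (a caller-supplied stand-in for the
    gray/edge comparison).  Returns (matched, corner, angle, probability).
    """
    best = (None, 0, 0.0)                          # (corner, coarse angle, probability)
    for corner in corners:                         # first pass: large rotations
        for angle in range(0, 360, coarse_step):
            p = score(corner, angle)
            if p >= accept:                        # coarse pass already matches: stop early
                return True, corner, angle, p
            if p > best[2]:
                best = (corner, angle, p)
    corner, a0, _ = best                           # second pass: small rotations inside
    for angle in range(a0, a0 + coarse_step + 1, fine_step):   # the best coarse interval
        p = score(corner, angle % 360)
        if p >= accept:
            return True, corner, angle % 360, p
    return False, corner, None, best[2]

# Toy run: a scorer that peaks at corner "D" and 10 degrees, as in the FIG. 7 example.
toy = lambda c, a: 0.96 if (c == "D" and a == 10) else (0.70 if (c == "D" and a == 0) else 0.30)
print(match_workpiece(["A", "D"], toy))   # -> (True, 'D', 10, 0.96)
```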
In the above embodiments the selected matching reference line includes two adjacent, intersecting contour lines of the template image. In other embodiments, to further save matching time, the selected matching reference line may include only one contour line of the template image, with one end point of that contour line used as the feature point for matching. For example, the contour line 22 of the template image in FIG. 2 is selected as the matching reference line and its end point d is used as the feature point; the contour line 22 is moved to the target image so that the feature point d coincides in turn with each of the eight corner points in the target image, and rotation matching is performed in turn at each corner point to be matched.
In another embodiment, all the contour lines of the template image may also be selected directly as the matching reference line, one feature point is selected, and the matching reference line is moved to the target image so that the selected feature point coincides in turn with the corner points to be matched in the target image and rotation matching is performed.
The beneficial effect of the present application is that, unlike the prior art, the present application generates a matching reference line from the contour lines of the workpiece to be processed, extracts the corner points of the contour lines of the target workpiece as corner points to be matched, moves the matching reference line to a corner point to be matched, and matches it against the target workpiece to obtain a matching result. By matching the matching reference line against the target workpiece, images can be matched quickly, increasing matching speed and saving matching time.
Referring to FIG. 8, FIG. 8 is a schematic structural diagram of an embodiment of the vision system provided by the present application. The vision system 80 includes an image collector 81, a memory 82, and a processor 83, the processor 83 being coupled to the image collector 81 and the memory 82, respectively. The image collector 81 is used to acquire a template image and a target image and to send the acquired images to the processor 83; the memory 82 is used to store the template image, the target image, program data, and data processed by the processor 83. When processing the data, the processor 83 performs the following steps: extracting the contour lines of the workpiece to be processed in the template image and generating a matching reference line from those contour lines, wherein the intersection point between the lines of the matching reference line is a feature point used as a position lock before matching; obtaining the contour lines of the target workpiece in the target image and extracting their corner points as corner points to be matched; and moving the matching reference line to a corner point to be matched and matching it against the target workpiece to obtain a matching result, the feature point coinciding with the corner point to be matched.
Further, the vision system of the present application can perform the image matching method of any of the foregoing embodiments; for the specific steps, reference is made to the foregoing description, which is not repeated here.
The beneficial effect of the present application is that, unlike the prior art, the vision system of the present application generates a matching reference line from the contour lines of the workpiece to be processed, extracts the corner points of the contour lines of the target workpiece as corner points to be matched, moves the matching reference line to a corner point to be matched, and matches it against the target workpiece to obtain a matching result. By matching the matching reference line against the target workpiece, images can be matched quickly, increasing matching speed and saving matching time.
The above are only embodiments of the present application and do not limit the patent scope of the present application; any equivalent structural transformation made using the contents of the specification and drawings of the present application, or any direct or indirect application in other related technical fields, is likewise included within the patent protection scope of the present application.
Claims (16)
- An image matching method, characterized by comprising: extracting the contour lines of a workpiece to be processed in a template image and generating a matching reference line from the contour lines of the workpiece to be processed, wherein the intersection point between the lines of the matching reference line is a feature point used as a position lock before matching; obtaining the contour lines of a target workpiece in a target image and extracting the corner points of the contour lines of the target workpiece as corner points to be matched; and moving the matching reference line to a corner point to be matched and matching it against the target workpiece to obtain a matching result, the feature point coinciding with the corner point to be matched.
- The image matching method according to claim 1, characterized in that the step of extracting the contour lines of the workpiece to be processed in the template image and generating the matching reference line from the contour lines of the workpiece to be processed comprises: extracting the contour lines of the workpiece to be processed in the template image, and arbitrarily selecting two adjacent, intersecting lines among them to generate the matching reference line, which is used instead of the template image as the matching reference.
- The image matching method according to claim 1, characterized in that the step of obtaining the contour lines of the target workpiece in the target image and extracting the corner points of the contour lines of the target workpiece as the corner points to be matched comprises: obtaining the contour lines of the target workpiece in the target image, and selecting the intersection point between any two adjacent, intersecting contour lines as a corner point to be matched.
- The image matching method according to claim 1, characterized in that the step of moving the matching reference line to the corner point to be matched and matching it against the target workpiece to obtain the matching result, the feature point coinciding with the corner point to be matched, comprises: moving the matching reference line to the corner point to be matched on the target workpiece, the feature point coinciding with the corner point to be matched, and rotating the matching reference line under control about the corner point to be matched as the rotation center to complete the matching.
- The image matching method according to claim 4, characterized in that the step of moving the matching reference line to the corner point to be matched on the target workpiece, the feature point coinciding with the corner point to be matched, and rotating the matching reference line under control about the corner point to be matched as the rotation center to complete the matching comprises: moving the matching reference line to the corner point to be matched on the target workpiece, the feature point of the matching reference line coinciding with the corner point to be matched, and rotating the matching reference line in large steps about the corner point to be matched as the rotation center to complete a first matching; and obtaining the first matching result, using the rotation angle interval with the larger matching probability values in the first matching result as the control interval for a second matching, and rotating the matching reference line in small steps within the control interval to complete the second matching.
- The image matching method according to claim 5, characterized in that the step of moving the matching reference line to the corner point to be matched on the target workpiece, the feature point of the matching reference line coinciding with the corner point to be matched, and rotating the matching reference line in large steps about the corner point to be matched as the rotation center to complete the first matching comprises: selecting the corner points to be matched in turn, moving the matching reference line to each corner point to be matched on the target workpiece, the feature point coinciding with the corner point to be matched, and rotating the matching reference line about the corner point to be matched as the rotation center by a first preset angle to complete the first matching and obtain a matching probability value curve.
- The image matching method according to claim 6, characterized in that the step of obtaining the first matching result, using the rotation angle interval with the larger matching probability values in the first matching result as the control interval for the second matching, and rotating the matching reference line in small steps within the control interval to complete the second matching comprises: obtaining the matching probability value curve of the first matching, using the rotation angle interval with the larger matching probability values in the matching probability value curve as the control interval for the second matching, and rotating the matching reference line within the control interval by a second preset angle to complete the second matching and obtain the matching result, the second preset angle being smaller than the first preset angle.
- The image matching method according to claim 7, characterized in that the step of using the rotation angle interval with the larger matching probability values in the matching probability value curve as the control interval for the second matching, and rotating the matching reference line within the control interval by the second preset angle to complete the second matching and obtain the matching result comprises: rotating the matching reference line within the control interval by the second preset angle to complete the second matching, and obtaining the matching probability value curve of the second matching; if the maximum of the matching probability values of the second matching is not less than a predetermined value, the target image matches the template image; otherwise, the target image does not match the template image.
- A vision system, characterized by comprising: an image collector, a memory, and a processor, the processor being coupled to the image collector and the memory, respectively; the image collector is configured to acquire a template image and a target image and to send the acquired images to the processor; the memory is configured to store the template image, the target image, program data, and data processed by the processor; when processing the data, the processor performs the following steps: extracting the contour lines of a workpiece to be processed in the template image and generating a matching reference line from the contour lines of the workpiece to be processed, wherein the intersection point between the lines of the matching reference line is a feature point used as a position lock before matching; obtaining the contour lines of a target workpiece in the target image and extracting the corner points of the contour lines of the target workpiece as corner points to be matched; and moving the matching reference line to a corner point to be matched and matching it against the target workpiece to obtain a matching result, the feature point coinciding with the corner point to be matched.
- The vision system according to claim 9, characterized in that the step, performed by the processor, of extracting the contour lines of the workpiece to be processed in the template image and generating the matching reference line from the contour lines of the workpiece to be processed comprises: the processor extracting the contour lines of the workpiece to be processed in the template image, and arbitrarily selecting two adjacent, intersecting lines among them to generate the matching reference line, which is used instead of the template image as the matching reference.
- The vision system according to claim 9, characterized in that the step, performed by the processor, of obtaining the contour lines of the target workpiece in the target image and extracting the corner points of the contour lines of the target workpiece as the corner points to be matched comprises: the processor obtaining the contour lines of the target workpiece in the target image, and selecting the intersection point between any two adjacent, intersecting contour lines as a corner point to be matched.
- The vision system according to claim 9, characterized in that the step, performed by the processor, of moving the matching reference line to the corner point to be matched and matching it against the target workpiece to obtain the matching result, the feature point coinciding with the corner point to be matched, comprises: the processor moving the matching reference line to the corner point to be matched on the target workpiece, the feature point coinciding with the corner point to be matched, and rotating the matching reference line under control about the corner point to be matched as the rotation center to complete the matching.
- The vision system according to claim 12, characterized in that the step, performed by the processor, of moving the matching reference line to the corner point to be matched on the target workpiece, the feature point coinciding with the corner point to be matched, and rotating the matching reference line under control about the corner point to be matched as the rotation center to complete the matching comprises: the processor moving the matching reference line to the corner point to be matched on the target workpiece, the feature point of the matching reference line coinciding with the corner point to be matched, and rotating the matching reference line in large steps about the corner point to be matched as the rotation center to complete a first matching; and the processor obtaining the first matching result, using the rotation angle interval with the larger matching probability values in the first matching result as the control interval for a second matching, and rotating the matching reference line in small steps within the control interval to complete the second matching.
- The vision system according to claim 13, characterized in that the step, performed by the processor, of moving the matching reference line to the corner point to be matched on the target workpiece, the feature point of the matching reference line coinciding with the corner point to be matched, and rotating the matching reference line in large steps about the corner point to be matched as the rotation center to complete the first matching comprises: the processor selecting the corner points to be matched in turn, moving the matching reference line to each corner point to be matched on the target workpiece, the feature point coinciding with the corner point to be matched, and rotating the matching reference line about the corner point to be matched as the center by a first preset angle to complete the first matching and obtain a matching probability value curve.
- The vision system according to claim 14, characterized in that the step, performed by the processor, of obtaining the first matching result, using the rotation angle interval with the larger matching probability values in the first matching result as the control interval for the second matching, and rotating the matching reference line in small steps within the control interval to complete the second matching comprises: the processor obtaining the matching probability value curve of the first matching, using the rotation angle interval with the larger matching probability values in the matching probability value curve as the control interval for the second matching, and rotating the matching reference line within the control interval by a second preset angle to complete the second matching and obtain the matching result, the second preset angle being smaller than the first preset angle.
- The vision system according to claim 15, characterized in that the step, performed by the processor, of using the rotation angle interval with the larger matching probability values in the matching probability value curve as the control interval for the second matching, and rotating the matching reference line within the control interval by the second preset angle to complete the second matching and obtain the matching result comprises: the processor rotating the matching reference line within the control interval by the second preset angle to complete the second matching, and obtaining the matching probability value curve of the second matching; if the maximum of the matching probability values of the second matching is not less than a predetermined value, the target image matches the template image; otherwise, the target image does not match the template image.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2018/101371 WO2020037466A1 (zh) | 2018-08-20 | 2018-08-20 | Image matching method and vision system |
CN201880087415.3A CN111684462B (zh) | 2018-08-20 | 2018-08-20 | Image matching method and vision system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2018/101371 WO2020037466A1 (zh) | 2018-08-20 | 2018-08-20 | Image matching method and vision system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020037466A1 true WO2020037466A1 (zh) | 2020-02-27 |
Family
ID=69592368
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2018/101371 WO2020037466A1 (zh) | 2018-08-20 | 2018-08-20 | 一种图像匹配方法及视觉系统 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111684462B (zh) |
WO (1) | WO2020037466A1 (zh) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112926695B (zh) * | 2021-04-16 | 2024-05-24 | 动员(北京)人工智能技术研究院有限公司 | Image recognition method and system based on template matching |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102651069A (zh) * | 2012-03-31 | 2012-08-29 | 重庆大学 | Contour-based detection method for local invariant regions |
CN105389586A (zh) * | 2015-10-20 | 2016-03-09 | 浙江大学 | Method for automatically detecting shrimp body integrity based on computer vision |
CN107967471A (zh) * | 2017-09-20 | 2018-04-27 | 北京工业大学 | Machine-vision-based automatic meter recognition method |
CN108229560A (zh) * | 2018-01-02 | 2018-06-29 | 上海维宏电子科技股份有限公司 | Method for workpiece positioning and matching in a numerical control system based on a contour curve matching algorithm |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114637261A (zh) * | 2022-03-07 | 2022-06-17 | 深圳市玄羽科技有限公司 | Cloud-platform-based industrial manufacturing system and control method thereof |
CN114637261B (zh) * | 2022-03-07 | 2022-11-15 | 深圳市玄羽科技有限公司 | Cloud-platform-based industrial manufacturing system and control method thereof |
CN116205361A (zh) * | 2023-03-07 | 2023-06-02 | 河海大学 | Industrial water-use efficiency grading prediction method based on matching degree |
CN116205361B (zh) * | 2023-03-07 | 2024-02-23 | 河海大学 | Industrial water-use efficiency grading prediction method based on matching degree |
CN118644397A (zh) * | 2024-08-15 | 2024-09-13 | 全芯智造技术有限公司 | Image processing method and apparatus, and computer-readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN111684462B (zh) | 2024-03-01 |
CN111684462A (zh) | 2020-09-18 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18930726; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 18930726; Country of ref document: EP; Kind code of ref document: A1 |