CN112233178B - Dynamic material ranging method in complex environment based on machine vision - Google Patents


Info

Publication number
CN112233178B
CN112233178B (application CN202011110323.8A)
Authority
CN
China
Prior art keywords
matching
image
target
template
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011110323.8A
Other languages
Chinese (zh)
Other versions
CN112233178A (en)
Inventor
宋宝
唐小琦
漆满江
刘永兴
李鹏帅
钟靖龙
周向东
罗小军
赵磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huazhong University of Science and Technology
Guangdong Topstar Technology Co Ltd
Original Assignee
Huazhong University of Science and Technology
Guangdong Topstar Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huazhong University of Science and Technology and Guangdong Topstar Technology Co Ltd
Priority to CN202011110323.8A
Publication of CN112233178A
Application granted
Publication of CN112233178B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/75 Determining position or orientation of objects or cameras using feature-based methods involving models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/46 Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462 Salient features, e.g. scale invariant feature transforms [SIFT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Multimedia (AREA)
  • Artificial Intelligence (AREA)
  • Data Mining & Analysis (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a machine-vision-based method for ranging dynamic material in a complex environment. Camera calibration parameters are used to complete image preprocessing and correct the images. Material image features are identified, a target image is obtained by matching the source-image features against the template-image features, and the material distance is obtained by ranging the target image. The target image is then used as the template image for matching subsequent source images; the template image follows an iteration strategy in the matching process and is continuously updated with each matching result. Subsequent source images are matched against the template to obtain target images, on which ranging is completed to obtain the material distance. The solvePnP algorithm is adopted, and the material ranging task is completed in coordination with target identification and target tracking. The target image is obtained through feature recognition, optimized feature matching, and template matching; combined with the ranging algorithm, the material distance is measured, and the effectiveness of the method and the accuracy of the distance measurement are verified by experiment.

Description

Dynamic material ranging method in complex environment based on machine vision
Technical Field
The invention relates to the technical field of machine vision, in particular to a dynamic material ranging method in a complex environment based on machine vision.
Background
In the logistics link of industrial production, the motion of the grabbing device must be controlled according to the distance between the device and the material to be grabbed. To improve grabbing efficiency, machine vision is increasingly used to measure this distance. However, interference factors in the factory environment, such as changes in background illumination intensity, other materials in the field of view, and personnel walking by, make the detection background complex and degrade the efficiency and accuracy of dynamic material ranging.
At present, ranging research on still images is mature. When ranging dynamic material, the target image is treated frame by frame as a static image and the distance is measured with a static-image ranging algorithm, so ranging accuracy depends on whether the target image is acquired correctly. Wang Shiyu et al. obtain the position of the target image by combining an improved Canny algorithm with template matching. Another approach corrects image rotation with the Hough transform, so that the corrected image has no rotation angle, and then obtains the position of the target image by template matching. Both methods acquire the target image well under their design conditions. Applied to the industrial production logistics link, however, interference images in the complex background have contours similar to those of the target image, so ranging with a target image obtained by edge detection and template matching yields poor accuracy.
Disclosure of Invention
The invention aims to solve the problems and provide a dynamic material ranging method in a complex environment based on machine vision.
The invention realizes the above purpose through the following technical scheme:
The invention comprises the following steps:
S1: image preprocessing is completed by using camera calibration parameters, and the image is corrected;
S2: identifying material image features, obtaining a target image by matching the source image features and the template image features, and obtaining a material distance by ranging the target image; simultaneously, using the target image as a template image for matching a subsequent source image, wherein the template image adopts an iteration strategy in the matching process; continuously updating the template image by using the result obtained by matching;
S3: the subsequent source images are matched against the template to obtain target images, and ranging is completed on the target images to obtain the material distance; the solvePnP algorithm is adopted, and the material ranging task is completed in coordination with target identification and target tracking.
The target identification is realized through feature identification and feature matching, using SIFT together with the FLANN algorithm; the feature matching algorithm is optimized in four steps: deleting matches whose Euclidean distance is too large, deleting matches that share a point, and deleting confused matches, after which the RansacStatus method is adopted to optimize the matching result as a whole;
the target tracking: the first target image on the pipeline differs from its template image by scaling and rotation, while in the subsequent motion of the target the target image differs from its template image only by scaling; the recognition of the materials on the subsequent pipeline is realized through template matching.
Deleting matches whose Euclidean distance is too large: the feature points in the two images are matched with the matcher, and the minimum Euclidean distance D_min is obtained by comparison, giving the judgment value D_c = min(2·D_min, 0.5); each Euclidean distance D_i is compared with D_c to obtain D_d = D_i − D_c, and the match is deleted if D_d > 0, improving recognition accuracy;
deleting matches that share a point: each pair of matched feature values is compared one by one; if two pairs of matched feature values are found to contain the same feature point, a shared point is judged to exist and the match is deleted, improving recognition accuracy;
deleting confused matches: similar-triangle matching is adopted; to remove erroneous results arising in feature-point matching, the principle that corresponding feature points in similar images form similar triangles is used, the two points with the minimum Euclidean distance are selected for comparison, and a match that does not satisfy the similar-triangle requirement is deleted;
the RansacStatus method for deleting mismatched points: the parameters of a global model are first estimated iteratively, and abnormal data that do not fit the model are eliminated to obtain a valid matching result, optimizing the image matching result.
The invention has the beneficial effects that:
Compared with the prior art, the invention provides a dynamic material ranging method that combines optimized feature matching with iterative learning, addressing the poor accuracy of dynamic material ranging against a complex background. The target image is obtained through feature recognition, optimized feature matching, and template matching; combined with the ranging algorithm, the material distance is measured, and the effectiveness of the method and the accuracy of the distance measurement are verified by experiment.
Drawings
FIG. 1 is a dynamic ranging flow chart;
FIG. 2 is an optimization flow chart;
fig. 3 is a flowchart of optimizing a matching result using euclidean distance.
Detailed Description
The invention is further described below with reference to the accompanying drawings:
Regarding source-image distortion interference: distortion introduced by the camera hardware makes the captured source image inconsistent with the actual scene, which directly affects accurate acquisition of the target image and ultimately reduces ranging precision.
Regarding complex-background interference: the acquisition of the target image is disturbed by similar-looking images, which affects the accuracy of the material ranging result. To solve these problems, on one hand the invention realizes feature-based target identification with the SIFT (Scale-Invariant Feature Transform) algorithm and the FLANN (Fast Library for Approximate Nearest Neighbors) matching algorithm, followed by a four-step optimization of the matching result. The complex background places high demands on the interference resistance of feature recognition; compared with feature recognition methods such as SURF (Speeded-Up Robust Features) and ORB (Oriented FAST and Rotated BRIEF), the SIFT algorithm is more robust against complex backgrounds. Meanwhile, the FLANN method is adopted to handle feature matching under image distortion and achieve adaptive matching. On the other hand, on the basis of the target identification, the invention identifies the target image in subsequent source images with the fast matchTemplate algorithm, improving the tracking speed while keeping the identification result correct.
The dynamic material ranging scheme of the invention is shown in figure 1. Image preprocessing is completed with the camera calibration parameters and the image is corrected. The material image features are identified, the target image is obtained by matching the source-image features against the template-image features, and the material distance is obtained by ranging the target image. The target image is then used as the template image for subsequent source-image matching; the template image follows an iteration strategy in the matching process and is continuously updated with each matching result. Subsequent source images are matched against the template to obtain target images, on which ranging is completed to obtain the material distance. In this scheme the distance measurement is realized through the solvePnP algorithm, working together with target recognition and target tracking to complete the material ranging task.
In the dynamic material ranging algorithm, target identification and target tracking achieve determination of a target area in the preprocessed image, and a basis is provided for measurement of the dynamic material distance. Meanwhile, the interference of the complex environment on the distance measurement of the dynamic materials is reduced by utilizing target identification and target tracking, and high-precision distance measurement is realized.
Target identification:
The target recognition is realized through feature recognition and feature matching, using SIFT together with the FLANN algorithm. Because the FLANN matching algorithm matches by comparing feature quantities, the matching result can contain point-correspondence errors. To improve the accuracy of target identification, the invention optimizes the feature matching in four steps. Based on the Euclidean distance parameter that measures the matching degree and on the distribution characteristics of the matching result, the matching result is optimized as a whole by deleting matched pairs with a large Euclidean distance, deleting matches that share a point, and deleting confused matches. The specific flow is shown in figure 2.
Deleting matches whose Euclidean distance is too large: as shown in fig. 3, after the SIFT algorithm finds the feature points, matching is performed with the FlannBasedMatcher; the larger the Euclidean distance of a match, the lower the matching degree of the two feature points. The feature points in the two images are matched and the minimum Euclidean distance D_min is obtained by comparison, giving the judgment value D_c = min(2·D_min, 0.5). Each Euclidean distance D_i is compared with D_c to obtain D_d = D_i − D_c, and the match is deleted if D_d > 0, improving recognition accuracy.
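Read with the convention that a larger distance means a worse match, this first filtering step can be sketched in plain Python; the tuple representation of a match is an illustrative assumption.

```python
def filter_by_euclidean_distance(matches):
    # matches: list of (query_idx, train_idx, distance) tuples
    if not matches:
        return []
    d_min = min(m[2] for m in matches)
    d_c = min(2.0 * d_min, 0.5)   # judgment value D_c = min(2 * D_min, 0.5)
    # delete any match whose distance exceeds D_c (i.e. D_i - D_c > 0)
    return [m for m in matches if m[2] <= d_c]
```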
Deleting matches that share a point: for the one-point-to-multiple-points situation arising during feature-point correspondence, the invention compares each pair of matched feature values one by one. If two pairs of matched feature values are found to contain the same feature point, a shared point is judged to exist and the match is deleted, improving recognition accuracy.
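A minimal sketch of this deduplication step; the patent only states that shared points are deleted, so keeping the lower-distance match of each conflicting pair is an assumed policy.

```python
def drop_shared_points(matches):
    # matches: (query_idx, train_idx, distance) tuples; when two matches
    # share either endpoint, keep only the lower-distance one
    kept = []
    for m in sorted(matches, key=lambda m: m[2]):
        if all(m[0] != k[0] and m[1] != k[1] for k in kept):
            kept.append(m)
    return kept
```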
Similar triangle matching: aiming at the error of the matching result in the characteristic point matching process, the principle that the corresponding characteristic points in the similar images form similar triangles is utilized, and two points with the minimum Euclidean distance are selected, namely two points m 1 and m 2 in the template image, namely two points s 1 and s 2 in the detected image. The remaining pairs of points m i and s i are compared.
When the above formula 1, formula 2 and formula 3 are satisfied simultaneously, it means that the point meets the requirement of similar triangle, and the matching result is deleted.
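A minimal sketch of the similar-triangle check, under the assumptions that formulas 1 to 3 compare side-length ratios against the reference pair and that matches failing the check are deleted; the tolerance value is illustrative.

```python
import math

def similar_triangle_filter(tmpl_pts, src_pts, tol=0.1):
    # tmpl_pts/src_pts: aligned lists of (x, y); the first two entries are the
    # two matches with the smallest Euclidean matching distance (m1/m2, s1/s2)
    def d(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    base_t = d(tmpl_pts[0], tmpl_pts[1])
    base_s = d(src_pts[0], src_pts[1])
    if base_t == 0 or base_s == 0:
        return list(tmpl_pts[:2]), list(src_pts[:2])
    ratio = base_t / base_s
    kept_t, kept_s = list(tmpl_pts[:2]), list(src_pts[:2])
    for mt, ms in zip(tmpl_pts[2:], src_pts[2:]):
        r1 = d(tmpl_pts[0], mt) / max(d(src_pts[0], ms), 1e-9)
        r2 = d(tmpl_pts[1], mt) / max(d(src_pts[1], ms), 1e-9)
        # triangle (m1, m2, mi) ~ (s1, s2, si): all side ratios must agree
        if abs(r1 - ratio) <= tol * ratio and abs(r2 - ratio) <= tol * ratio:
            kept_t.append(mt)
            kept_s.append(ms)
    return kept_t, kept_s
```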
RansacStatus method for deleting mismatched points: the parameters of a global model are first estimated iteratively, and abnormal data that do not fit the model are eliminated, yielding a valid matching result and optimizing the image matching result as a whole.
Under a complex background, the target recognition algorithm of the invention extracts the target image with rotation characteristics from the source image, and target image acquisition can be achieved over the full range of angles from 0° to 360°.
Target tracking
The first target image on the pipeline differs from its template image by scaling and rotation, while in the subsequent motion of the target the target image differs from its template image only by scaling; using this motion characteristic, the materials on the subsequent pipeline are recognized through template matching. Template matching is realized with the OpenCV matchTemplate algorithm and is initialized from the target recognition result, which avoids the algorithm's inability to handle rotation and improves its running speed.
With the target tracking algorithm, the target image is acquired even when the source image is scaled to different degrees, satisfying recognition at different scaling factors. The template images of the target tracking part differ in scaling factor by 0.1; since the size of the target image scales continuously in practical application, the target tracking algorithm is verified with target images whose scaling factors differ by 0.01. Target tracking performs image matching with the squared-difference-of-pixel-values method, for which the best matching value is 0 and larger values indicate worse matches. The results are shown in Table 1.
The experimental results show that, as the target moves continuously, setting template scaling multiples at intervals of 0.1 allows the target to be identified and matched, and the continuously changing target images are matched to the optimal template through target tracking, meeting the requirement of target tracking.
TABLE 1 target tracking result parameters
Experimental results and analysis
To verify the ranging accuracy of the invention, a simulation scene was built for experimental verification. The experimental equipment comprises a computer, a 5-megapixel industrial camera with 1080×1920 resolution, and a black-and-white bird-shaped image of size 80 mm × 80 mm as the target picture. The travel distance was controlled by laying calibration plates on the ground at 40 mm intervals. Sundries interfering with the detected image were placed in the experimental environment to create the required complex background.
Table 2 shows the average error of material ranging in a complex background.
TABLE 2 target recognition and tracking algorithm running results
The target image was then rotated 90° clockwise and material ranging was repeated; the average ranging errors are shown in Table 3.
TABLE 3 Operation results of the target recognition and tracking algorithm with the target rotated 90°
Against the complex background, the average ranging errors for the non-rotated target image and the 90°-rotated target image are 2.46 mm and 1.77 mm respectively. Analysis of the experimental data shows that the number of templates has an important influence on ranging accuracy: too many templates reduce the detection speed, while too few reduce the detection precision. To balance detection speed and precision, a suitable number of templates must be selected according to the actual application conditions, ensuring the accuracy of the data.
Conclusion
The invention provides a dynamic material distance detection method for complex backgrounds. On the basis of matching the features of the source image and the template image, the matching result is optimized; combined with template matching, the target image is obtained in a series of source images, and distance detection is completed with a ranging algorithm. An experimental environment was established for verification, and the results show that the method meets the detection requirements of the actual working condition.
The foregoing has shown and described the basic principles, main features, and advantages of the present invention. It will be understood by those skilled in the art that the present invention is not limited to the embodiments described above; the above embodiments and descriptions merely illustrate the principles of the invention, and various changes and modifications may be made without departing from its spirit and scope. The scope of the invention is defined by the appended claims and their equivalents.

Claims (1)

1. A dynamic material ranging method in a complex environment based on machine vision, characterized by comprising the following steps:
S1: image preprocessing is completed by using camera calibration parameters, and the image is corrected;
S2: material image features are identified, a target image is obtained by matching the source-image features against the template-image features, and the material distance is obtained by ranging the target image; meanwhile, the target image is used as the template image for matching subsequent source images, the template image follows an iteration strategy in the matching process, and the template image is continuously updated with the matching result;
S3: subsequent source images are matched against the template to obtain target images, and ranging is completed on the target images to obtain the material distance; the solvePnP algorithm is adopted, and the material ranging task is completed in coordination with target identification and target tracking;
the target identification is realized through feature identification and feature matching, using SIFT together with the FLANN algorithm; the feature matching algorithm is optimized in four steps: deleting matches whose Euclidean distance is too large, deleting matches that share a point, and deleting confused matches, after which the RansacStatus method is adopted to optimize the matching result as a whole;
the target tracking: the first target image on the pipeline differs from its template image by scaling and rotation, while in the subsequent motion of the target the target image differs from its template image only by scaling; the recognition of the materials on the subsequent pipeline is realized through template matching;
wherein deleting matches whose Euclidean distance is too large comprises: matching the feature points in the two images with the matcher, and obtaining the minimum Euclidean distance D_min by comparison, giving the judgment value D_c = min(2·D_min, 0.5); each Euclidean distance D_i is compared with D_c to obtain D_d = D_i − D_c, and the match is deleted if D_d > 0, improving recognition accuracy;
deleting matches that share a point comprises: comparing each pair of matched feature values one by one; if two pairs of matched feature values are found to contain the same feature point, a shared point is judged to exist and the match is deleted, improving recognition accuracy;
deleting confused matches comprises: adopting similar-triangle matching; to remove erroneous results arising in feature-point matching, the principle that corresponding feature points in similar images form similar triangles is used, the two points with the minimum Euclidean distance are selected for comparison, and a match that does not satisfy the similar-triangle requirement is deleted;
the RansacStatus method for deleting mismatched points comprises: first estimating the parameters of a global model iteratively, and eliminating abnormal data that do not fit the model, so as to obtain a valid matching result and optimize the image matching result.
CN202011110323.8A 2020-11-11 2020-11-11 Dynamic material ranging method in complex environment based on machine vision Active CN112233178B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011110323.8A CN112233178B (en) 2020-11-11 2020-11-11 Dynamic material ranging method in complex environment based on machine vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011110323.8A CN112233178B (en) 2020-11-11 2020-11-11 Dynamic material ranging method in complex environment based on machine vision

Publications (2)

Publication Number    Publication Date
CN112233178A (en)     2021-01-15
CN112233178B (en)     2024-05-17

Family

ID=74119204

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011110323.8A Active CN112233178B (en) 2020-11-11 2020-11-11 Dynamic material ranging method in complex environment based on machine vision

Country Status (1)

Country Link
CN (1) CN112233178B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113132685A (en) * 2021-04-02 2021-07-16 浙江安防职业技术学院 Fire control remote control system based on thing networking

Citations (3)

Publication number Priority date Publication date Assignee Title
CN106595800A (en) * 2016-12-27 2017-04-26 上海云鱼智能科技有限公司 Machine vision based material level meter
CN109598715A (en) * 2018-12-05 2019-04-09 山西镭谱光电科技有限公司 Material size online test method based on machine vision
WO2020134617A1 (en) * 2018-12-28 2020-07-02 南京航空航天大学 Positioning method for matching buildings of repetitive structures on the basis of street view image

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
CN106595800A (en) * 2016-12-27 2017-04-26 上海云鱼智能科技有限公司 Machine vision based material level meter
CN109598715A (en) * 2018-12-05 2019-04-09 山西镭谱光电科技有限公司 Material size online test method based on machine vision
WO2020134617A1 (en) * 2018-12-28 2020-07-02 南京航空航天大学 Positioning method for matching buildings of repetitive structures on the basis of street view image

Non-Patent Citations (1)

Title
Application research on an industrial robot sorting system based on mobile vision; Peng Huihui et al.; Modern Electronics Technique (现代电子技术); Vol. 43, No. 20; pp. 18-22 *

Also Published As

Publication number    Publication date
CN112233178A (en)     2021-01-15

Similar Documents

Publication Publication Date Title
CN107301654B (en) Multi-sensor high-precision instant positioning and mapping method
CN108256394B (en) Target tracking method based on contour gradient
CN110097093B (en) Method for accurately matching heterogeneous images
CN109903313B (en) Real-time pose tracking method based on target three-dimensional model
CN111899334B (en) Visual synchronous positioning and map building method and device based on point-line characteristics
CN107993258B (en) Image registration method and device
JP6899189B2 (en) Systems and methods for efficiently scoring probes in images with a vision system
CN105740899A (en) Machine vision image characteristic point detection and matching combination optimization method
Chen et al. Robust affine-invariant line matching for high resolution remote sensing images
CN102169581A (en) Feature vector-based fast and high-precision robustness matching method
Feng et al. Fine-grained change detection of misaligned scenes with varied illuminations
CN113393524B (en) Target pose estimation method combining deep learning and contour point cloud reconstruction
CN107490356B (en) Non-cooperative target rotating shaft and rotation angle measuring method
WO2018199958A1 (en) Object recognition
CN110222661B (en) Feature extraction method for moving target identification and tracking
CN111311618A (en) Circular arc workpiece matching and positioning method based on high-precision geometric primitive extraction
CN112163588A (en) Intelligent evolution-based heterogeneous image target detection method, storage medium and equipment
CN114331879A (en) Visible light and infrared image registration method for equalized second-order gradient histogram descriptor
CN108257153B (en) Target tracking method based on direction gradient statistical characteristics
CN112233178B (en) Dynamic material ranging method in complex environment based on machine vision
CN103714550A (en) Image registration automatic optimization algorithm based on matching of curve characteristic evaluation
Luo et al. Improved Harris corner detection algorithm based on canny edge detection and Gray difference preprocessing
Schmid et al. Features for Ground Texture Based Localization--A Survey
CN113095385A (en) Multimode image matching method based on global and local feature description
Yuan et al. Realtime CNN-based keypoint detector with Sobel filter and CNN-based descriptor trained with keypoint candidates

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant