CN111080703B - Mobile robot repositioning method based on linear matching - Google Patents
Mobile robot repositioning method based on linear matching
- Publication number
- CN111080703B CN111080703B CN201911415258.7A CN201911415258A CN111080703B CN 111080703 B CN111080703 B CN 111080703B CN 201911415258 A CN201911415258 A CN 201911415258A CN 111080703 B CN111080703 B CN 111080703B
- Authority
- CN
- China
- Prior art keywords
- straight line
- optimal
- line pair
- pair
- global map
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/168—Segmentation; Edge detection involving transform domain methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20048—Transform domain processing
- G06T2207/20061—Hough transform
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The invention is applicable to the technical field of robot positioning and provides a mobile robot repositioning method based on straight line matching, which comprises the following steps: S1, loading the global map, extracting straight lines from the global map, and storing the straight lines; S2, extracting straight lines from the local map, and screening out the optimal crossed straight line pair and the optimal parallel straight line pair; S3, detecting whether the local map has an optimal crossed straight line pair, executing step S4 if so, and executing step S5 if not; S4, matching the optimal crossed straight line pair in the local map with the straight line pairs in the global map to obtain the optimal rotation matrix R and translation vector T, and positioning the mobile robot; and S5, matching the optimal parallel straight line pair in the local map with the straight line pairs in the global map to obtain the optimal rotation matrix R and translation vector T, and positioning the mobile robot. After an abnormal condition such as "kidnapping" or a restart, the mobile robot can be rapidly repositioned.
Description
Technical Field
The invention belongs to the technical field of robot positioning, and provides a mobile robot repositioning method based on linear matching.
Background
With the development of science and technology, mobile robots play an increasingly important role in fields such as automated factories and intelligent warehousing and logistics. In some situations, when the robot is restarted or is suddenly "kidnapped" to another position, it can no longer determine its pose; it must then be manually moved back to its initial position and restarted before it can resume work. Most existing solutions for robot relocation rely on attaching two-dimensional codes or installing auxiliary devices such as UWB in the environment, which limits the application range of the robot and increases the cost. How to reposition a mobile robot quickly and accurately under abnormal conditions such as pose failure or a restart is therefore a problem that urgently needs to be solved.
Disclosure of Invention
The embodiment of the invention provides a mobile robot repositioning method based on straight line matching, which can reposition the robot quickly and accurately under abnormal conditions such as failure of the robot's own pose or a restart.
The invention is realized in this way, a mobile robot repositioning method based on straight line matching, which specifically comprises the following steps:
s1, loading the global map, extracting straight lines from the global map, and storing the straight lines;
s2, extracting straight lines from the local map, and screening out the optimal crossed straight line pair and the optimal parallel straight line pair;
s3, detecting whether the local map has the optimal crossed straight line pair, if so, executing a step S4, and if not, executing a step S5;
s4, matching the optimal crossed straight line pair in the local map with the straight line pair in the global map to obtain an optimal rotation matrix R and a translation vector T, and positioning the mobile robot based on the optimal rotation matrix R and the translation vector T;
and S5, matching the optimal parallel straight line pair in the local map with the straight line pair in the global map to obtain an optimal rotation matrix R and translation vector T, and positioning the mobile robot based on the optimal rotation matrix R and translation vector T.
Further, the method for extracting the straight line pair in the global map specifically comprises the following steps:
s11, performing edge detection on the global map, converting the global map into a binary image, and detecting all straight line segments in the binary image by using Hough transform;
s12, filtering all the straight line segments extracted from the global map, and reserving the straight line segments with the length larger than the minimum length threshold and smaller than the maximum length threshold;
and S13, performing straight line fitting on the filtered straight line segment, wherein the fitted straight line is the straight line extracted from the global map.
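As an illustrative sketch (not part of the patent text), steps S11 and S12 could be implemented with an edge detector and a probabilistic Hough transform followed by the length filter below; the helper name `filter_segments` and the OpenCV calls mentioned in the comments are assumptions:

```python
import math

# Segments of the form (x1, y1, x2, y2) would typically come from
# cv2.HoughLinesP applied to a Canny edge image of the map (step S11).
def filter_segments(segments, min_len, max_len):
    """Step S12: keep only the segments whose length lies strictly
    between the minimum and maximum length thresholds."""
    kept = []
    for x1, y1, x2, y2 in segments:
        length = math.hypot(x2 - x1, y2 - y1)
        if min_len < length < max_len:
            kept.append((x1, y1, x2, y2))
    return kept

print(filter_segments([(0, 0, 3, 4), (0, 0, 1, 0), (0, 0, 60, 80)],
                      min_len=2, max_len=50))  # [(0, 0, 3, 4)]
```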
Further, the straight line fitting method specifically comprises the following steps:
forming straight line pairs by the filtered straight line segments pairwise, calculating included angles and distances among the straight line pairs, and performing straight line fitting on the straight line pairs with the included angles smaller than an angle threshold value I and the straight line distances smaller than a distance threshold value;
the distance between the two straight lines is the distance between the middle points of the two straight lines.
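A minimal sketch of this pairwise fitting test, assuming the included angle is the acute angle between the undirected segment directions (the helper names are hypothetical, not from the patent):

```python
import math

def seg_angle(s):
    """Undirected direction of a segment (x1, y1, x2, y2), in [0, pi)."""
    x1, y1, x2, y2 = s
    return math.atan2(y2 - y1, x2 - x1) % math.pi

def midpoint(s):
    x1, y1, x2, y2 = s
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

def should_fit(a, b, angle_thresh, dist_thresh):
    """Fit two segments into one straight line only when their included
    angle is below angle threshold one and the distance between their
    midpoints is below the distance threshold."""
    d = abs(seg_angle(a) - seg_angle(b))
    angle = min(d, math.pi - d)  # acute included angle
    (ax, ay), (bx, by) = midpoint(a), midpoint(b)
    dist = math.hypot(bx - ax, by - ay)
    return angle < angle_thresh and dist < dist_thresh

print(should_fit((0, 0, 10, 0), (0, 1, 10, 1), 0.1, 5.0))  # True
```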
Further, the method for extracting the straight line pair in the local map specifically comprises the following steps:
s11, performing edge detection on the local map, converting the local map into a binary image, and detecting all straight line segments in the binary image by using Hough transform;
s12, filtering all the straight line segments extracted from the local map, and reserving the straight line segments with the length larger than the minimum length threshold and smaller than the maximum length threshold;
and S13, performing straight line fitting on the filtered straight line segment, wherein the fitted straight line is the straight line extracted from the local map.
Further, the straight line fitting method specifically comprises the following steps:
forming straight line pairs by the filtered straight line segments pairwise, calculating included angles and distances among the straight line pairs, and performing straight line fitting on the straight line pairs with the included angles smaller than an angle threshold value I and the straight line distances smaller than a distance threshold value;
the distance between the two straight lines is the distance between the middle points of the two straight lines.
Further, the matching process based on the optimal crossed straight line pair specifically includes the following steps:
S41, determining the intersection point of the two straight lines in the optimal crossed straight line pair, taking the intersection point as the starting point and the two straight line end points far away from the intersection point as end points, and defining the two generated vectors as the optimal crossed vector pair;
S42, acquiring all crossed straight line pairs in the global map, taking each intersection point as a starting point and the two straight line end points far away from the intersection point as end points, and defining the two generated vectors as a crossed vector pair;
S43, respectively calculating a rotation matrix R and a translation vector T of the optimal crossed vector pair in the local map relative to each crossed vector pair in the global map;
and S44, scoring each group of rotation matrix R and translation vector T through the likelihood domain model, wherein the rotation matrix R and translation vector T with the highest score are the optimal rotation matrix R and translation vector T.
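The patent does not spell out the likelihood domain model; a common form (assumed here) is the likelihood-field model, which transforms points with a candidate (R, T) and sums Gaussian likelihoods of their distances to the nearest occupied map cell. The distance field could be precomputed with e.g. `cv2.distanceTransform`; everything below, including the function name and signature, is a hedged sketch:

```python
import numpy as np

def likelihood_score(points_local, R, T, dist_field, sigma=1.0):
    """Score one candidate (R, T): map each local point into the global
    frame, look up its distance to the nearest occupied cell in
    dist_field[y, x], and sum Gaussian likelihoods of those distances.
    Points falling outside the map contribute nothing."""
    score = 0.0
    for p in points_local:
        gx, gy = R @ np.asarray(p) + T
        ix, iy = int(round(gx)), int(round(gy))
        if 0 <= iy < dist_field.shape[0] and 0 <= ix < dist_field.shape[1]:
            d = dist_field[iy, ix]
            score += np.exp(-d * d / (2.0 * sigma ** 2))
    return score
```

With an all-zero distance field (every cell "on" an obstacle), each in-bounds point contributes a likelihood of 1, so the score simply counts the points; the candidate (R, T) with the highest score over real data would be kept as optimal.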
Further, the matching process based on the optimal parallel straight line pair specifically includes the following steps:
s51, defining the direction of the optimal parallel straight line pair in the local map, and endowing each group of parallel straight line pairs in the global map with the same direction;
s52, acquiring the highest likelihood score of each group of parallel straight line pairs and the corresponding rotation matrix R and translation vector T;
and S53, acquiring the maximum value of the highest likelihood score of each group of parallel straight line pairs and the corresponding rotation matrix R and translation vector T, wherein the rotation matrix R and the translation vector T are the optimal rotation matrix R and translation vector T.
Further, the method for obtaining the highest likelihood score of the parallel straight line pair specifically includes:
s521, calculating a rotation matrix R of the optimal parallel straight line pair in the local map relative to the parallel straight line pair in the global map;
s522, extracting a starting point of one straight line in the optimal parallel straight line pair, extracting one straight line in the parallel straight line pair in the global map, sequentially traversing each pixel point on the straight line from the starting point of the straight line, and calculating a translation vector T of each pixel point relative to the starting point;
and S523, scoring each group of rotation matrixes R and translation vectors T through a likelihood domain, wherein the highest score is the highest likelihood score of the corresponding parallel straight line pair.
According to the mobile robot repositioning method based on the linear matching, the repositioning can be rapidly carried out after the mobile robot has abnormal conditions such as 'kidnapping' or restarting.
Drawings
Fig. 1 is a flowchart of a mobile robot repositioning method based on line matching according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Fig. 1 is a flowchart of a method for repositioning a mobile robot based on line matching according to an embodiment of the present invention, where the method specifically includes the following steps:
S1, loading the global map, extracting straight lines from the global map, and storing them in a straight line set; in subsequent matching operations, the straight lines are read directly from this set, so that the straight line extraction does not need to be repeated on the global map.
In the embodiment of the present invention, the method for extracting a straight line segment in a global map specifically includes the following steps:
S11, carrying out edge detection on the global map, converting the global map into a binary image, and detecting all straight line segments in the binary image by using the Hough transform, wherein each straight line segment is represented by four elements (x1, y1, x2, y2), in which (x1, y1) and (x2, y2) respectively represent the starting point and the ending point of the straight line segment;
s12, filtering all the straight line segments extracted from the global map, namely keeping the straight line segments with the length larger than a minimum length threshold and smaller than a maximum length threshold;
s13, performing straight line fitting on the filtered straight line segment, wherein the fitted straight line is the straight line segment in the global map, and the fitting method is as follows:
forming straight line pairs by the filtered straight line segments pairwise, calculating included angles and distances among the straight line pairs, and performing straight line fitting on the straight line pairs with the included angles smaller than an angle threshold value I and the straight line distances smaller than a distance threshold value; the distance between the two straight lines is the distance between the middle points of the two straight lines.
S2, extracting straight lines from the local map, and screening out the optimal crossed straight line pair and the optimal parallel straight line pair;
in the embodiment of the present invention, a laser sensor is used to acquire current frame data and convert the current frame data into a local image, and the current position of the laser sensor is determined as the central position of the image, and the method for extracting a straight line in the local map specifically includes the following steps:
S21, carrying out edge detection on the local map, converting the local map into a binary image, and detecting all straight line segments in the binary image by using the Hough transform, wherein each straight line segment is represented by four elements (x1, y1, x2, y2), in which (x1, y1) and (x2, y2) respectively represent the starting point and the ending point of the straight line segment;
s22, filtering all the straight line segments extracted from the local map, namely keeping the straight line segments with the length larger than a minimum length threshold and smaller than a maximum length threshold;
s23, performing straight line fitting on the filtered straight line segment, wherein the fitted straight line is a straight line in the local map, and the fitting method is as follows:
forming straight line pairs by the filtered straight line segments pairwise, calculating included angles and distances among the straight line pairs, and performing straight line fitting on the straight line pairs with the included angles smaller than an angle threshold value I and the straight line distances smaller than a distance threshold value; the distance between the two straight lines is the distance between the middle points of the two straight lines.
In the embodiment of the invention, after the straight line extraction operation is carried out on the local map, a plurality of straight lines are extracted, the straight lines are combined pairwise to form a plurality of groups of straight line pairs, the included angle between the straight line pairs is calculated, if the included angle is greater than an angle preset value, the straight line pair is judged to be a crossed straight line pair, and if the included angle is less than or equal to the angle preset value, the straight line pair is judged to be a parallel straight line pair;
for a crossed straight line pair: extracting the crossed straight line pairs with an included angle larger than angle threshold two, determining the length of the shorter straight line segment in each such pair, and taking the crossed straight line pair containing the longest shorter segment as the optimal crossed straight line pair;
for a parallel straight line pair: and taking the parallel straight line pair with the smallest angle as the optimal parallel straight line pair.
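The screening above can be sketched as follows, where `angle_preset` plays the role of the angle preset value and the optimal crossed pair is the one whose shorter segment is longest (the helper names are assumptions):

```python
import itertools
import math

def included_angle(a, b):
    """Acute angle between two segments of the form (x1, y1, x2, y2)."""
    ang = lambda s: math.atan2(s[3] - s[1], s[2] - s[0]) % math.pi
    d = abs(ang(a) - ang(b))
    return min(d, math.pi - d)

def screen_pairs(lines, angle_preset):
    """Combine the extracted lines pairwise: a pair is crossed if its
    included angle exceeds the preset value, parallel otherwise."""
    crossed, parallel = [], []
    for a, b in itertools.combinations(lines, 2):
        (crossed if included_angle(a, b) > angle_preset else parallel).append((a, b))
    return crossed, parallel

def optimal_crossed(crossed):
    """The crossed pair whose shorter segment is the longest."""
    seg_len = lambda s: math.hypot(s[2] - s[0], s[3] - s[1])
    return max(crossed, key=lambda p: min(seg_len(p[0]), seg_len(p[1])), default=None)
```

The optimal parallel pair would be chosen analogously with `min(parallel, key=lambda p: included_angle(*p))`.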
S3, detecting whether the local map has the optimal crossed straight line pair, if so, executing a step S4, and if not, executing a step S5;
s4, matching the optimal crossed straight line pair in the local map with the straight line pair in the global map to obtain an optimal rotation matrix R and translation vector T, and positioning the mobile robot based on the optimal rotation matrix R and translation vector T;
in the embodiment of the invention, the straight line pair in the global map is formed by combining two straight lines in the global map, and the matching process based on the optimal crossed straight line pair specifically comprises the following steps:
S41, determining the intersection point of the two straight lines in the optimal crossed straight line pair, taking the intersection point as the starting point and the two straight line end points far away from the intersection point as end points, and defining the two generated vectors as the optimal crossed vector pair;
S42, acquiring all crossed straight line pairs in the global map, taking each intersection point as a starting point and the two straight line end points far away from the intersection point as end points, and defining the two generated vectors as a crossed vector pair;
S43, respectively calculating a rotation matrix R and a translation vector T of the optimal crossed vector pair in the local map relative to each crossed vector pair in the global map;
and S44, scoring the rotation matrices R and translation vectors T through the likelihood domain, wherein the rotation matrix R and translation vector T with the highest score are the optimal rotation matrix R and translation vector T.
In the embodiment of the present invention, the rotation matrix R and the translation vector T are calculated as follows:

VL = R·VW

T = PL - R·PW

wherein VL represents a vector formed by a straight line in the local map, VW represents a vector formed by a straight line in the global map, PL represents the coordinates of a point on a local map straight line, and PW represents the coordinates of the corresponding point on a global map straight line.
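In 2-D these two relations can be solved in closed form from one matched vector and one matched point; the sketch below assumes column vectors and a standard planar rotation matrix (the function name is illustrative):

```python
import numpy as np

def solve_rt(v_w, v_l, p_w, p_l):
    """Solve VL = R @ VW and T = PL - R @ PW for a 2-D rotation R and
    translation T, given one matched vector (v_w -> v_l) and one
    matched point (p_w -> p_l)."""
    theta = np.arctan2(v_l[1], v_l[0]) - np.arctan2(v_w[1], v_w[0])
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    T = np.asarray(p_l, dtype=float) - R @ np.asarray(p_w, dtype=float)
    return R, T
```

For example, a global vector (1, 0) matched to a local vector (0, 1) yields a 90-degree rotation, and T then follows directly from the matched point.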
S5, matching the optimal parallel straight line pair in the local map with the straight line pair in the global map to obtain an optimal rotation matrix R and translation vector T, and positioning the mobile robot based on the optimal rotation matrix R and translation vector T;
in the embodiment of the invention, the matching process based on the optimal parallel straight line pair specifically comprises the following steps:
s51, defining the direction of the optimal parallel straight line pair in the local map, and endowing each group of parallel straight line pairs in the global map with the same direction;
s52, obtaining the highest likelihood score of each group of parallel straight lines, and the rotation matrix R and the translation vector T corresponding to the highest likelihood score, where the highest likelihood score is the highest score of each group of parallel straight lines calculated by the likelihood model, and the obtaining method is specifically as follows:
S521, calculating the rotation matrix R of the optimal parallel straight line pair in the local map relative to a parallel straight line pair in the global map, that is, solving R from VL = R·VW, wherein VL represents a vector formed by a straight line in the local map and VW represents a vector formed by a straight line in the global map;
S522, extracting the starting point of one straight line in the optimal parallel straight line pair, extracting one straight line of the parallel straight line pair in the global map, sequentially traversing each pixel point on that straight line from its starting point, and calculating the translation vector T of each pixel point relative to the starting point by the formula T = PL - R·PW, thereby obtaining a translation vector T corresponding to each pixel point.
And S523, scoring the rotation matrixes R and the translation vectors T through the likelihood domain to obtain the highest likelihood score of the corresponding parallel straight line pair.
And S53, acquiring the maximum of the highest likelihood scores over all groups of parallel straight line pairs, and taking the corresponding rotation matrix R and translation vector T as the optimal rotation matrix R and translation vector T.
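A sketch of the S52 translation scan: with R fixed by the line directions, the translation is still ambiguous along the line, so each pixel of the matched global line is tried as the counterpart of the local start point, T = PL - R·PW is computed for each, and the best-scoring candidate is kept (the scoring callback and names are assumptions):

```python
import numpy as np

def best_translation_along_line(R, p_l_start, line_w_pixels, score_fn):
    """Slide the local start point over every pixel of the matched
    global line; for each candidate translation T = PL - R @ PW,
    evaluate the likelihood score and keep the best (score, T)."""
    best_score, best_T = -np.inf, None
    for p_w in line_w_pixels:
        T = np.asarray(p_l_start, dtype=float) - R @ np.asarray(p_w, dtype=float)
        s = score_fn(R, T)
        if s > best_score:
            best_score, best_T = s, T
    return best_score, best_T
```

In S53 this search would run once per parallel-pair hypothesis, and the maximum over all hypotheses gives the optimal R and T.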
According to the mobile robot repositioning method based on linear matching, the repositioning can be rapidly carried out after the mobile robot is subjected to abnormal conditions such as 'kidnapping' or restarting.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.
Claims (6)
1. A mobile robot repositioning method based on straight line matching is characterized by specifically comprising the following steps:
s1, loading the global map, extracting straight lines from the global map, and storing the straight lines;
s2, extracting straight lines from the local map, and screening out the optimal crossed straight line pair and the optimal parallel straight line pair;
extracting the crossed straight line pairs with an included angle larger than angle threshold two, determining the length of the shorter straight line segment in each such pair, and taking the crossed straight line pair containing the longest shorter segment as the optimal crossed straight line pair;
taking the parallel straight line pair with the minimum angle as the optimal parallel straight line pair;
s3, detecting whether the local map has the optimal crossed straight line pair, if so, executing a step S4, and if not, executing a step S5;
s4, matching the optimal crossed straight line pair in the local map with the straight line pair in the global map to obtain an optimal rotation matrix R and a translation vector T, and positioning the mobile robot based on the optimal rotation matrix R and the translation vector T, wherein every two straight lines in the global map are combined to form the straight line pair in the global map;
s5, matching the optimal parallel straight line pair in the local map with the straight line pair in the global map to obtain an optimal rotation matrix R and translation vector T, and positioning the mobile robot based on the optimal rotation matrix R and translation vector T;
the matching process based on the optimal crossed straight line pair specifically comprises the following steps:
S41, determining the intersection point of the two straight lines in the optimal crossed straight line pair, taking the intersection point as the starting point and the two straight line end points far away from the intersection point as end points, and defining the two generated vectors as the optimal crossed vector pair;
S42, acquiring all crossed straight line pairs in the global map, taking each intersection point as a starting point and the two straight line end points far away from the intersection point as end points, and defining the two generated vectors as a crossed vector pair;
S43, respectively calculating a rotation matrix R and a translation vector T of the optimal crossed vector pair in the local map relative to each crossed vector pair in the global map;
S44, scoring each group of rotation matrix R and translation vector T through a likelihood domain model, wherein the rotation matrix R and translation vector T with the highest score are the optimal rotation matrix R and translation vector T;
the matching process based on the optimal parallel straight line pair specifically comprises the following steps:
s51, defining the direction of the optimal parallel straight line pair in the local map, and endowing each group of parallel straight line pairs in the global map with the same direction;
s52, acquiring the highest likelihood score of each group of parallel straight line pairs and the corresponding rotation matrix R and translation vector T;
and S53, acquiring the maximum of the highest likelihood scores over all groups of parallel straight line pairs, wherein the corresponding rotation matrix R and translation vector T are the optimal rotation matrix R and translation vector T.
2. The method for relocating the mobile robot based on the straight line matching as claimed in claim 1, wherein the method for extracting the straight line pair in the global map specifically comprises the following steps:
s11, carrying out edge detection on the global map, converting the global map into a binary image, and detecting all straight line segments in the binary image by using Hough transform;
s12, filtering all the straight line segments extracted from the global map, and reserving the straight line segments with the length larger than the minimum length threshold and smaller than the maximum length threshold;
and S13, performing straight line fitting on the filtered straight line segment, wherein the fitted straight line is the straight line extracted from the global map.
3. The mobile robot repositioning method based on straight line matching as claimed in claim 2, wherein the straight line fitting method is as follows:
forming straight line pairs by the filtered straight line segments pairwise, calculating included angles and distances among the straight line pairs, and performing straight line fitting on the straight line pairs with the included angles smaller than an angle threshold value I and the straight line distances smaller than a distance threshold value;
the distance between the two straight lines is the distance between the middle points of the two straight lines.
4. The method for relocating the mobile robot based on the straight line matching as claimed in claim 1, wherein the method for extracting the straight line pair in the local map specifically comprises the following steps:
s11, performing edge detection on the local map, converting the local map into a binary image, and detecting all straight line segments in the binary image by using Hough transform;
s12, filtering all the straight line segments extracted from the local map, and reserving the straight line segments with the length larger than the minimum length threshold and smaller than the maximum length threshold;
and S13, performing straight line fitting on the filtered straight line segment, wherein the fitted straight line is the straight line extracted from the local map.
5. The method for repositioning the mobile robot based on the straight line matching as claimed in claim 4, wherein the straight line fitting method is as follows:
forming straight line pairs by the filtered straight line segments pairwise, calculating included angles and distances among the straight line pairs, and performing straight line fitting on the straight line pairs with the included angles smaller than an angle threshold value I and the straight line distances smaller than a distance threshold value;
the distance between the two straight lines is the distance between the middle points of the two straight lines.
6. The method for repositioning the mobile robot based on the straight line matching as claimed in claim 1, wherein the method for obtaining the highest likelihood score of the parallel straight line pair is specifically as follows:
s521, calculating a rotation matrix R of the optimal parallel straight line pair in the local map relative to the parallel straight line pair in the global map;
s522, extracting a starting point of one straight line in the optimal parallel straight line pair, extracting one straight line in the parallel straight line pair in the global map, sequentially traversing each pixel point on the straight line from the starting point of the straight line, and calculating a translation vector T of each pixel point relative to the starting point;
and S523, scoring each group of rotation matrixes R and translation vectors T through a likelihood domain, wherein the highest score is the highest likelihood score of the corresponding parallel straight line pair.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911415258.7A CN111080703B (en) | 2019-12-31 | 2019-12-31 | Mobile robot repositioning method based on linear matching |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111080703A CN111080703A (en) | 2020-04-28 |
CN111080703B true CN111080703B (en) | 2022-05-27 |
Family
ID=70320798
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911415258.7A Active CN111080703B (en) | 2019-12-31 | 2019-12-31 | Mobile robot repositioning method based on linear matching |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111080703B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113176783B (en) * | 2021-05-26 | 2024-05-07 | 珠海一微半导体股份有限公司 | Positioning control method based on map matching, chip and robot |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2618232A1 (en) * | 2010-09-17 | 2013-07-24 | Tokyo Institute of Technology | Map generation device, map generation method, method for moving mobile body, and robot device |
CN104503449A (en) * | 2014-11-24 | 2015-04-08 | 杭州申昊科技股份有限公司 | Positioning method based on environment line features |
CN105094135A (en) * | 2015-09-03 | 2015-11-25 | 上海电机学院 | Distributed multi-robot map fusion system and fusion method |
CN107065887A (en) * | 2017-05-26 | 2017-08-18 | 重庆大学 | Backward air navigation aid in omni-directional mobile robots passage |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101457148B1 (en) * | 2008-05-21 | 2014-10-31 | 삼성전자 주식회사 | Apparatus for localizing moving robot and method the same |
US8369606B2 (en) * | 2010-07-21 | 2013-02-05 | Palo Alto Research Center Incorporated | System and method for aligning maps using polyline matching |
- 2019-12-31: application CN201911415258.7A filed; granted as CN111080703B (status: Active)
Non-Patent Citations (2)
Title |
---|
Vision-based mobile robot localization and mapping using the PLOT features;Rui Lin等;《2012 IEEE International Conference on Mechatronics and Automation》;20120827;第1-10页 * |
An indoor autonomous mobile robot localization method; Gao Yunfeng et al.; Journal of Huazhong University of Science and Technology (Natural Science Edition); 20131031; pp. 245-253 *
Also Published As
Publication number | Publication date |
---|---|
CN111080703A (en) | 2020-04-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5164222B2 (en) | Image search method and system | |
CN104200495B (en) | A kind of multi-object tracking method in video monitoring | |
KR20200045522A (en) | Methods and systems for use in performing localization | |
CN109493313B (en) | Vision-based steel coil positioning method and equipment | |
CN113379718A (en) | Target detection method and device, electronic equipment and readable storage medium | |
CN101556647A (en) | mobile robot visual orientation method based on improved SIFT algorithm | |
JP2009020014A (en) | Self-location estimation device | |
LU500407B1 (en) | Real-time positioning method for inspection robot | |
CN104376328B (en) | Coordinate-based distributed coding mark identification method and system | |
CN111832634B (en) | Foreign matter detection method, foreign matter detection system, terminal device and storage medium | |
CN110533699B (en) | Dynamic multi-frame velocity measurement method for pixel change based on optical flow method | |
Liao et al. | A method of image analysis for QR code recognition | |
CN111080703B (en) | Mobile robot repositioning method based on linear matching | |
CN107463939B (en) | Image key straight line detection method | |
CN112652020A (en) | Visual SLAM method based on AdaLAM algorithm | |
CN114863129A (en) | Instrument numerical analysis method, device, equipment and storage medium | |
CN109034151A (en) | A kind of localization method for the identification of multiple pointer instruments | |
CN111400537B (en) | Road element information acquisition method and device and electronic equipment | |
CN112418242B (en) | Color identification system suitable for large-scale targets and identification method thereof | |
CN105654474A (en) | Mechanical arm positioning method based on visual guidance and device thereof | |
CN115902977A (en) | Transformer substation robot double-positioning method and system based on vision and GPS | |
CN111007441B (en) | Electrolytic capacitor polarity detection method and detection system | |
Xiong et al. | Research on real-time multi-object detections based on template matching | |
Yang et al. | CA-YOLOv5: A YOLO model for apple detection in the natural environment | |
CN112069849A (en) | Identification and positioning method, device, equipment and storage medium based on multiple two-dimensional codes |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |