CN111080703A - Mobile robot repositioning method based on linear matching - Google Patents


Info

Publication number
CN111080703A
Authority
CN
China
Prior art keywords
straight line
optimal
pair
line pair
rotation matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911415258.7A
Other languages
Chinese (zh)
Other versions
CN111080703B (en)
Inventor
伍永健
陈智君
郝奇
曹雏清
高云峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhu Hit Robot Technology Research Institute Co Ltd
Original Assignee
Wuhu Hit Robot Technology Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhu Hit Robot Technology Research Institute Co Ltd filed Critical Wuhu Hit Robot Technology Research Institute Co Ltd
Priority to CN201911415258.7A priority Critical patent/CN111080703B/en
Publication of CN111080703A publication Critical patent/CN111080703A/en
Application granted granted Critical
Publication of CN111080703B publication Critical patent/CN111080703B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/168Segmentation; Edge detection involving transform domain methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20048Transform domain processing
    • G06T2207/20061Hough transform

Abstract

The invention is suitable for the technical field of robot positioning and provides a mobile robot repositioning method based on straight line matching, which comprises the following steps: S1, loading the global map, extracting straight lines from it, and storing them; S2, extracting straight lines from the local map and screening out the optimal crossed straight line pair and the optimal parallel straight line pair; S3, detecting whether the local map has an optimal crossed straight line pair; if so, executing step S4, and if not, executing step S5; S4, matching the optimal crossed straight line pair in the local map with straight line pairs in the global map to obtain the optimal rotation matrix R and translation vector T, and positioning the mobile robot; and S5, matching the optimal parallel straight line pair in the local map with straight line pairs in the global map to obtain the optimal rotation matrix R and translation vector T, and positioning the mobile robot. After an abnormal condition such as "kidnapping" or a restart, the mobile robot can be rapidly relocated.

Description

Mobile robot repositioning method based on linear matching
Technical Field
The invention belongs to the technical field of robot positioning, and provides a mobile robot repositioning method based on linear matching.
Background
With the development of science and technology, mobile robots play an increasingly important role in fields such as automated factories, intelligent warehousing and logistics. On some occasions, when the robot is restarted or suddenly "kidnapped" to another position, it can no longer determine its own pose; it must then be moved back to its initial position by a human and restarted before it can resume work. Most existing relocation solutions attach two-dimensional codes or install auxiliary devices such as UWB in the environment, which limits the application range of the robot and increases cost. Achieving fast and accurate relocation of a mobile robot under abnormal conditions such as pose failure or restart is therefore an urgent problem.
Disclosure of Invention
The embodiment of the invention provides a mobile robot repositioning method based on straight line matching, which enables the robot to relocalize quickly and accurately under abnormal conditions such as loss of its own pose or a restart.
The invention is realized as a mobile robot repositioning method based on straight line matching, which specifically comprises the following steps:
s1, loading the global map, extracting straight lines from the global map, and storing the straight lines;
s2, extracting straight lines from the local map, and screening out the optimal crossed straight line pair and the optimal parallel straight line pair;
s3, detecting whether the local map has the optimal crossed straight line pair, if so, executing a step S4, and if not, executing a step S5;
s4, matching the optimal crossed straight line pair in the local map with the straight line pair in the global map to obtain an optimal rotation matrix R and translation vector T, and positioning the mobile robot based on the optimal rotation matrix R and translation vector T;
and S5, matching the optimal parallel straight line pair in the local map with the straight line pair in the global map to obtain an optimal rotation matrix R and translation vector T, and positioning the mobile robot based on the optimal rotation matrix R and translation vector T.
Further, the method for extracting the straight line pair in the global map specifically comprises the following steps:
s11, carrying out edge detection on the global map, converting the global map into a binary image, and detecting all straight line segments in the binary image by using Hough transform;
s12, filtering all the straight line segments extracted from the global map, and reserving the straight line segments with the length larger than the minimum length threshold and smaller than the maximum length threshold;
and S13, performing straight line fitting on the filtered straight line segment, wherein the fitted straight line is the straight line extracted from the global map.
Further, the straight line fitting method specifically comprises the following steps:
forming straight line pairs by the filtered straight line segments pairwise, calculating included angles and distances among the straight line pairs, and performing straight line fitting on the straight line pairs with the included angles smaller than an angle threshold value I and the straight line distances smaller than a distance threshold value;
the distance between the two straight lines is the distance between the middle points of the two straight lines.
Further, the method for extracting the straight line pair in the local map specifically comprises the following steps:
s11, performing edge detection on the local map, converting the local map into a binary image, and detecting all straight line segments in the binary image by using Hough transform;
s12, filtering all the straight line segments extracted from the local map, and reserving the straight line segments with the length larger than the minimum length threshold and smaller than the maximum length threshold;
and S13, performing straight line fitting on the filtered straight line segments, wherein the fitted straight lines are the straight lines extracted from the local map.
Further, the straight line fitting method specifically comprises the following steps:
forming straight line pairs by the filtered straight line segments pairwise, calculating included angles and distances among the straight line pairs, and performing straight line fitting on the straight line pairs with the included angles smaller than an angle threshold value I and the straight line distances smaller than a distance threshold value;
the distance between the two straight lines is the distance between the middle points of the two straight lines.
Further, the matching process based on the optimal crossed straight line pair specifically includes the following steps:
s41, determining the intersection point of two straight lines in the optimal crossing straight line pair, taking the intersection point as a starting point and two straight line end points far away from the intersection point as end points, and defining the two generated vectors as the optimal crossing vector pair;
s42, acquiring all crossed straight line pairs in the global map, taking the intersection point as a starting point and two straight line end points far away from the intersection point as end points, and defining the generated two vectors as crossed vector pairs;
s43, respectively calculating a rotation matrix R and a translation vector T of the optimal cross vector pair in the local map relative to each cross vector pair in the global map;
and S44, scoring each group of rotation matrix R and translation vector T through the likelihood domain model, wherein the rotation matrix R and translation vector T with the highest score are the optimal rotation matrix R and translation vector T.
Further, the matching process based on the optimal parallel straight line pair specifically includes the following steps:
s51, defining the direction of the optimal parallel straight line pair in the local map, and endowing each group of parallel straight line pairs in the global map with the same direction;
s52, acquiring the highest likelihood score of each group of parallel straight line pairs and the corresponding rotation matrix R and translation vector T;
and S53, acquiring the maximum value of the highest likelihood score of each group of parallel straight line pairs and the corresponding rotation matrix R and translation vector T, wherein the rotation matrix R and the translation vector T are the optimal rotation matrix R and translation vector T.
Further, the method for obtaining the highest likelihood score of the parallel straight line pair specifically includes:
s521, calculating a rotation matrix R of the optimal parallel straight line pair in the local map relative to the parallel straight line pair in the global map;
s522, extracting a starting point of one straight line in the optimal parallel straight line pair, extracting one straight line in the parallel straight line pair in the global map, sequentially traversing each pixel point on the straight line from the starting point of the straight line, and calculating a translation vector T of each pixel point relative to the starting point;
and S523, scoring each group of rotation matrixes R and translation vectors T through a likelihood domain, wherein the highest score is the highest likelihood score of the corresponding parallel straight line pair.
With the mobile robot repositioning method based on straight line matching provided by the invention, the robot can be rapidly relocated after abnormal conditions such as "kidnapping" or a restart.
Drawings
Fig. 1 is a flowchart of a mobile robot repositioning method based on line matching according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Fig. 1 is a flowchart of a method for repositioning a mobile robot based on line matching according to an embodiment of the present invention, where the method specifically includes the following steps:
And S1, loading the global map, extracting straight line segments from it, and storing them into a straight line set; in subsequent matching operations the straight lines are read directly from this set, so line extraction does not have to be repeated on the global map.
In the embodiment of the present invention, the method for extracting a straight line segment in a global map specifically includes the following steps:
s11, performing edge detection on the global map, converting it into a binary image, and detecting all straight line segments in the binary image by using the Hough transform, wherein each straight line segment is represented by four elements (x1, y1, x2, y2), with (x1, y1) and (x2, y2) respectively denoting the starting point and the ending point of the segment;
s12, filtering all the straight line segments extracted from the global map, namely keeping the straight line segments with the length larger than a minimum length threshold and smaller than a maximum length threshold;
s13, performing straight line fitting on the filtered straight line segment, wherein the fitted straight line is the straight line segment in the global map, and the fitting method is as follows:
forming straight line pairs by the filtered straight line segments pairwise, calculating included angles and distances among the straight line pairs, and performing straight line fitting on the straight line pairs with the included angles smaller than an angle threshold value I and the straight line distances smaller than a distance threshold value; the distance between the two straight lines is the distance between the middle points of the two straight lines.
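The extraction pipeline above (S11 to S13) can be sketched in code. This is an illustrative Python sketch, not the patent's implementation: the segment representation follows the four-element (x1, y1, x2, y2) form of the description, while the threshold values, function names, and the choice of fusing a merged pair into its two extreme endpoints are assumptions for demonstration.

```python
import math

def seg_length(seg):
    # seg = (x1, y1, x2, y2)
    x1, y1, x2, y2 = seg
    return math.hypot(x2 - x1, y2 - y1)

def seg_angle(seg):
    x1, y1, x2, y2 = seg
    return math.atan2(y2 - y1, x2 - x1)

def midpoint(seg):
    x1, y1, x2, y2 = seg
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

def filter_segments(segments, min_len=20.0, max_len=500.0):
    # S12: keep only segments whose length lies inside the threshold window
    return [s for s in segments if min_len < seg_length(s) < max_len]

def merge_collinear(segments, angle_thresh=math.radians(5), dist_thresh=15.0):
    # S13: pair segments whose included angle is below angle threshold one and
    # whose midpoint distance is below the distance threshold, and fuse each
    # such pair into one fitted segment spanning the two extreme endpoints.
    merged, used = [], set()
    for i in range(len(segments)):
        if i in used:
            continue
        cur = segments[i]
        for j in range(i + 1, len(segments)):
            if j in used:
                continue
            a = abs(seg_angle(cur) - seg_angle(segments[j])) % math.pi
            a = min(a, math.pi - a)  # undirected angle difference
            mi, mj = midpoint(cur), midpoint(segments[j])
            if a < angle_thresh and math.hypot(mi[0] - mj[0], mi[1] - mj[1]) < dist_thresh:
                xs = [cur[0], cur[2], segments[j][0], segments[j][2]]
                ys = [cur[1], cur[3], segments[j][1], segments[j][3]]
                pts = sorted(zip(xs, ys))
                cur = (pts[0][0], pts[0][1], pts[-1][0], pts[-1][1])
                used.add(j)
        merged.append(cur)
    return merged
```

In practice the raw segments in S11 would come from an edge detector plus Hough transform (e.g. OpenCV's `Canny` and `HoughLinesP`); the filtering and fitting stages above are independent of how the segments were detected.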
S2, extracting straight lines from the local map, and screening out the optimal crossed straight line pair and the optimal parallel straight line pair;
In the embodiment of the present invention, a laser sensor is used to acquire the current frame data and convert it into a local map image, with the current position of the laser sensor taken as the center of the image. The method for extracting straight lines from the local map specifically comprises the following steps:
s21, performing edge detection on the local map, converting it into a binary image, and detecting all straight line segments in the binary image by using the Hough transform, wherein each straight line segment is represented by four elements (x1, y1, x2, y2), with (x1, y1) and (x2, y2) respectively denoting the starting point and the ending point of the segment;
s22, filtering all the straight line segments extracted from the local map, namely keeping the straight line segments with the length larger than a minimum length threshold and smaller than a maximum length threshold;
s23, performing straight line fitting on the filtered straight line segment, wherein the fitted straight line is a straight line in the local map, and the fitting method is as follows:
forming straight line pairs by the filtered straight line segments pairwise, calculating included angles and distances among the straight line pairs, and performing straight line fitting on the straight line pairs with the included angles smaller than an angle threshold value I and the straight line distances smaller than a distance threshold value; the distance between the two straight lines is the distance between the middle points of the two straight lines.
In the embodiment of the invention, after straight line extraction is performed on the local map, the extracted straight lines are combined pairwise into straight line pairs and the included angle of each pair is calculated: if the included angle is greater than a preset angle, the pair is judged to be a crossed straight line pair; otherwise it is judged to be a parallel straight line pair.
For crossed straight line pairs: extract those whose included angle is larger than angle threshold two, determine the length of the shorter straight line segment in each such pair, and take the crossed pair whose shorter segment is longest as the optimal crossed straight line pair.
For parallel straight line pairs: take the pair with the smallest included angle as the optimal parallel straight line pair.
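The pair classification and the two selection rules can be sketched as follows. This is an illustrative Python sketch; the angle preset, angle threshold two, and the helper names are assumptions, and segments use the (x1, y1, x2, y2) form from the description.

```python
import math

def seg_len(s):
    return math.hypot(s[2] - s[0], s[3] - s[1])

def included_angle(seg_a, seg_b):
    # undirected included angle between two segments, in [0, pi/2]
    ang = abs(math.atan2(seg_a[3] - seg_a[1], seg_a[2] - seg_a[0])
              - math.atan2(seg_b[3] - seg_b[1], seg_b[2] - seg_b[0])) % math.pi
    return min(ang, math.pi - ang)

def classify_pairs(lines, angle_preset=math.radians(10)):
    # Pairs above the preset angle are "crossed", the rest "parallel".
    crossed, parallel = [], []
    for i in range(len(lines)):
        for j in range(i + 1, len(lines)):
            pair = (lines[i], lines[j])
            (crossed if included_angle(*pair) > angle_preset else parallel).append(pair)
    return crossed, parallel

def best_crossed(crossed, angle_thresh2=math.radians(30)):
    # Among crossed pairs exceeding angle threshold two, pick the pair
    # whose *shorter* member is longest (most reliable intersection).
    cand = [p for p in crossed if included_angle(*p) > angle_thresh2]
    if not cand:
        return None
    return max(cand, key=lambda p: min(seg_len(p[0]), seg_len(p[1])))

def best_parallel(parallel):
    # The pair with the smallest included angle is the optimal parallel pair.
    return min(parallel, key=lambda p: included_angle(*p)) if parallel else None
```

`best_crossed` returning `None` corresponds to the branch of S3 where no optimal crossed pair exists and the method falls back to parallel-pair matching.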
S3, detecting whether the local map has the optimal crossed straight line pair, if so, executing a step S4, and if not, executing a step S5;
s4, matching the optimal crossed straight line pair in the local map with the straight line pair in the global map to obtain an optimal rotation matrix R and translation vector T, and positioning the mobile robot based on the optimal rotation matrix R and translation vector T;
in the embodiment of the present invention, the line pair in the global map is formed by combining two lines in the global map, and the matching process based on the optimal crossing line pair specifically includes the following steps:
s41, determining the intersection point of the two straight lines in the optimal crossed straight line pair and, taking the intersection point as the starting point and the endpoint of each straight line farther from the intersection as the end point, defining the two resulting vectors as the optimal crossed vector pair;
s42, acquiring all crossed straight line pairs in the global map and, in the same way, taking each intersection point as the starting point and the two endpoints farther from it as end points, defining each resulting pair of vectors as a crossed vector pair;
s43, respectively calculating a rotation matrix R and a translation vector T of the optimal cross vector pair in the local map relative to each cross vector pair in the global map;
and S44, scoring the rotation matrices R and translation vectors T through the likelihood domain, wherein the rotation matrix R and translation vector T with the highest score are the optimal rotation matrix R and translation vector T.
In the embodiment of the present invention, the calculation formulas of the rotation matrix R and the translation vector T are specifically as follows:
VL = R·VW, T = PL − R·PW
wherein VL represents a vector formed by a straight line in the local map, VW represents a vector formed by the corresponding straight line in the global map, PL represents the coordinates of a point on the local-map straight line, and PW represents the coordinates of the corresponding point on the global-map straight line.
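In the 2-D case these two relations can be solved directly: the rotation is the planar rotation by the angle between the matched vectors, and the translation follows by substitution. A minimal Python sketch, assuming 2-D column vectors represented as tuples and a rotation matrix as a nested tuple (the function names are illustrative):

```python
import math

def rotation_from_vectors(v_w, v_l):
    # Solve VL = R·VW for a 2-D rotation: R rotates v_w onto v_l's direction.
    theta = math.atan2(v_l[1], v_l[0]) - math.atan2(v_w[1], v_w[0])
    c, s = math.cos(theta), math.sin(theta)
    return ((c, -s), (s, c))

def apply_rot(R, p):
    return (R[0][0] * p[0] + R[0][1] * p[1],
            R[1][0] * p[0] + R[1][1] * p[1])

def translation(R, p_l, p_w):
    # T = PL − R·PW, using matched points on the local and global lines
    rp = apply_rot(R, p_w)
    return (p_l[0] - rp[0], p_l[1] - rp[1])
```

For a crossed pair the matched points would naturally be the two intersection points; each candidate correspondence between the local and global crossed vector pairs yields one (R, T) hypothesis to be scored.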
S5, matching the optimal parallel straight line pair in the local map with the straight line pair in the global map to obtain an optimal rotation matrix R and translation vector T, and positioning the mobile robot based on the optimal rotation matrix R and translation vector T;
in the embodiment of the present invention, the matching process based on the optimal parallel straight line pair specifically includes the following steps:
s51, defining the direction of the optimal parallel straight line pair in the local map, and endowing each group of parallel straight line pairs in the global map with the same direction;
s52, obtaining the highest likelihood score of each group of parallel straight lines, and the rotation matrix R and the translation vector T corresponding to the highest likelihood score, where the highest likelihood score is the highest score of each group of parallel straight lines calculated by the likelihood model, and the obtaining method is specifically as follows:
s521, calculating a rotation matrix R of the optimal parallel straight line pair in the local map relative to a parallel straight line pair in the global map, i.e. solving VL = R·VW, wherein VL represents a vector formed by a local-map straight line and VW a vector formed by a global-map straight line;
s522, extracting the starting point of one straight line in the optimal parallel straight line pair and one straight line of the matched parallel straight line pair in the global map; traversing each pixel point on that global straight line in turn and calculating the translation vector T of each pixel point relative to the starting point by the formula T = PL − R·PW.
And S523, scoring the rotation matrixes R and the translation vectors T through the likelihood domain to obtain the highest likelihood score of the corresponding parallel straight line pair.
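The patent does not spell out the likelihood domain model itself; the sketch below assumes the standard likelihood-field measurement model, in which each scan point is projected into the map frame by a candidate transform and scored by a Gaussian of its distance to the nearest occupied map cell. The projection convention, the `sigma` value, and the brute-force nearest-neighbor search are all illustrative assumptions (a real implementation would precompute a distance transform of the map).

```python
import math

def likelihood_score(scan_pts, occupied, R, T, sigma=1.0):
    # Assumed likelihood-field model: project each local scan point into the
    # global frame with candidate (R, T), then accumulate a Gaussian of the
    # distance from the projected point to the nearest occupied map cell.
    score = 0.0
    for p in scan_pts:
        gx = R[0][0] * p[0] + R[0][1] * p[1] + T[0]
        gy = R[1][0] * p[0] + R[1][1] * p[1] + T[1]
        d = min(math.hypot(gx - ox, gy - oy) for ox, oy in occupied)
        score += math.exp(-d * d / (2.0 * sigma * sigma))
    return score
```

A correct pose hypothesis places the scan points on top of occupied cells and scores near one point per beam, while a wrong hypothesis scatters them into free space and scores near zero, which is what lets the method rank the candidate (R, T) pairs.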
And S53, taking the maximum over the highest likelihood scores of all groups of parallel straight line pairs; the rotation matrix R and translation vector T corresponding to this maximum are the optimal rotation matrix R and translation vector T.
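The candidate-translation enumeration of S522 and the selection of S523 can be sketched as follows. This is an illustrative Python sketch: `global_line_pts` stands in for the pixel points traversed along the global straight line, and `score_fn` is a hypothetical scoring callback (in the method it would be the likelihood-field score of the pose candidate).

```python
def parallel_candidates(local_start, global_line_pts, R):
    # S522: from the start point of the local line, every pixel point P_W on
    # the matched global line yields one candidate translation T = PL − R·PW.
    cands = []
    for p_w in global_line_pts:
        rx = R[0][0] * p_w[0] + R[0][1] * p_w[1]
        ry = R[1][0] * p_w[0] + R[1][1] * p_w[1]
        cands.append((local_start[0] - rx, local_start[1] - ry))
    return cands

def best_parallel_pose(local_start, global_line_pts, R, score_fn):
    # S523: keep the candidate translation with the highest likelihood score.
    return max(parallel_candidates(local_start, global_line_pts, R),
               key=score_fn)
```

This exhaustive sweep along the line is what replaces the single intersection-point correspondence available in the crossed-pair case: a parallel pair fixes the rotation but leaves the translation ambiguous along the line direction, so every pixel on the global line must be tried.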
With the mobile robot repositioning method based on straight line matching provided by the invention, the robot can be rapidly relocated after abnormal conditions such as "kidnapping" or a restart.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (8)

1. A mobile robot repositioning method based on straight line matching is characterized by specifically comprising the following steps:
s1, loading the global map, extracting straight lines from the global map, and storing the straight lines;
s2, extracting straight lines from the local map, and screening out the optimal crossed straight line pair and the optimal parallel straight line pair;
s3, detecting whether the local map has the optimal crossed straight line pair, if so, executing a step S4, and if not, executing a step S5;
s4, matching the optimal crossed straight line pair in the local map with the straight line pair in the global map to obtain an optimal rotation matrix R and translation vector T, and positioning the mobile robot based on the optimal rotation matrix R and translation vector T;
and S5, matching the optimal parallel straight line pair in the local map with the straight line pair in the global map to obtain an optimal rotation matrix R and translation vector T, and positioning the mobile robot based on the optimal rotation matrix R and translation vector T.
2. The method for relocating the mobile robot based on the straight line matching as claimed in claim 1, wherein the method for extracting the straight line pair in the global map specifically comprises the following steps:
s11, carrying out edge detection on the global map, converting the global map into a binary image, and detecting all straight line segments in the binary image by using Hough transform;
s12, filtering all the straight line segments extracted from the global map, and reserving the straight line segments with the length larger than the minimum length threshold and smaller than the maximum length threshold;
and S13, performing straight line fitting on the filtered straight line segment, wherein the fitted straight line is the straight line extracted from the global map.
3. The mobile robot repositioning method based on straight line matching as claimed in claim 2, wherein the straight line fitting method is as follows:
forming straight line pairs by the filtered straight line segments pairwise, calculating included angles and distances among the straight line pairs, and performing straight line fitting on the straight line pairs with the included angles smaller than an angle threshold value I and the straight line distances smaller than a distance threshold value;
the distance between the two straight lines is the distance between the middle points of the two straight lines.
4. The method for relocating the mobile robot based on the straight line matching as claimed in claim 1, wherein the method for extracting the straight line pair in the local map specifically comprises the following steps:
s11, performing edge detection on the local map, converting the local map into a binary image, and detecting all straight line segments in the binary image by using Hough transform;
s12, filtering all the straight line segments extracted from the local map, and reserving the straight line segments with the length larger than the minimum length threshold and smaller than the maximum length threshold;
and S13, performing straight line fitting on the filtered straight line segments, wherein the fitted straight lines are the straight lines extracted from the local map.
5. The method for repositioning the mobile robot based on the straight line matching as claimed in claim 4, wherein the straight line fitting method is as follows:
forming straight line pairs by the filtered straight line segments pairwise, calculating included angles and distances among the straight line pairs, and performing straight line fitting on the straight line pairs with the included angles smaller than an angle threshold value I and the straight line distances smaller than a distance threshold value;
the distance between the two straight lines is the distance between the middle points of the two straight lines.
6. The method for repositioning the mobile robot based on the straight line matching as claimed in claim 1, wherein the matching process based on the optimal crossed straight line pair specifically comprises the following steps:
s41, determining the intersection point of two straight lines in the optimal crossing straight line pair, taking the intersection point as a starting point and two straight line end points far away from the intersection point as end points, and defining the two generated vectors as the optimal crossing vector pair;
s42, acquiring all crossed straight line pairs in the global map, taking the intersection point as a starting point and two straight line end points far away from the intersection point as end points, and defining the generated two vectors as crossed vector pairs;
s43, respectively calculating a rotation matrix R and a translation vector T of the optimal cross vector pair in the local map relative to each cross vector pair in the global map;
and S44, scoring each group of rotation matrix R and translation vector T through the likelihood domain model, wherein the rotation matrix R and translation vector T with the highest score are the optimal rotation matrix R and translation vector T.
7. The method for repositioning the mobile robot based on the straight line matching as claimed in claim 1, wherein the matching process based on the optimal parallel straight line pair specifically comprises the following steps:
s51, defining the direction of the optimal parallel straight line pair in the local map, and endowing each group of parallel straight line pairs in the global map with the same direction;
s52, acquiring the highest likelihood score of each group of parallel straight line pairs and the corresponding rotation matrix R and translation vector T;
and S53, acquiring the highest value of the highest likelihood score of each group of parallel straight line pairs and the corresponding rotation matrix R and translation vector T, wherein the rotation matrix R and the translation vector T are the optimal rotation matrix R and translation vector T.
8. The method for repositioning the mobile robot based on the straight line matching as claimed in claim 7, wherein the method for obtaining the highest likelihood score of the parallel straight line pair is specifically as follows:
s521, calculating a rotation matrix R of the optimal parallel straight line pair in the local map relative to the parallel straight line pair in the global map;
s522, extracting a starting point of one straight line in the optimal parallel straight line pair, extracting one straight line in the parallel straight line pair in the global map, sequentially traversing each pixel point on the straight line from the starting point of the straight line, and calculating a translation vector T of each pixel point relative to the starting point;
and S523, scoring each group of rotation matrixes R and translation vectors T through a likelihood domain, wherein the highest score is the highest likelihood score of the corresponding parallel straight line pair.
CN201911415258.7A 2019-12-31 2019-12-31 Mobile robot repositioning method based on linear matching Active CN111080703B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911415258.7A CN111080703B (en) 2019-12-31 2019-12-31 Mobile robot repositioning method based on linear matching

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911415258.7A CN111080703B (en) 2019-12-31 2019-12-31 Mobile robot repositioning method based on linear matching

Publications (2)

Publication Number Publication Date
CN111080703A true CN111080703A (en) 2020-04-28
CN111080703B CN111080703B (en) 2022-05-27

Family

ID=70320798

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911415258.7A Active CN111080703B (en) 2019-12-31 2019-12-31 Mobile robot repositioning method based on linear matching

Country Status (1)

Country Link
CN (1) CN111080703B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113176783A (en) * 2021-05-26 2021-07-27 珠海市一微半导体有限公司 Positioning control method, chip and robot based on map matching

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090292394A1 (en) * 2008-05-21 2009-11-26 Samsung Electronics Co., Ltd. Apparatus for locating moving robot and method for the same
US20120020533A1 (en) * 2010-07-21 2012-01-26 Palo Alto Research Center Incorporated System And Method For Aligning Maps Using Polyline Matching
EP2618232A1 (en) * 2010-09-17 2013-07-24 Tokyo Institute of Technology Map generation device, map generation method, method for moving mobile body, and robot device
CN104503449A (en) * 2014-11-24 2015-04-08 杭州申昊科技股份有限公司 Positioning method based on environment line features
CN105094135A (en) * 2015-09-03 2015-11-25 上海电机学院 Distributed multi-robot map fusion system and fusion method
CN107065887A (en) * 2017-05-26 2017-08-18 重庆大学 Backward air navigation aid in omni-directional mobile robots passage

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090292394A1 (en) * 2008-05-21 2009-11-26 Samsung Electronics Co., Ltd. Apparatus for locating moving robot and method for the same
US20120020533A1 (en) * 2010-07-21 2012-01-26 Palo Alto Research Center Incorporated System And Method For Aligning Maps Using Polyline Matching
EP2618232A1 (en) * 2010-09-17 2013-07-24 Tokyo Institute of Technology Map generation device, map generation method, method for moving mobile body, and robot device
CN104503449A (en) * 2014-11-24 2015-04-08 杭州申昊科技股份有限公司 Positioning method based on environment line features
CN105094135A (en) * 2015-09-03 2015-11-25 上海电机学院 Distributed multi-robot map fusion system and fusion method
CN107065887A (en) * 2017-05-26 2017-08-18 重庆大学 Backward air navigation aid in omni-directional mobile robots passage

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
RUI LIN et al.: "Vision-based mobile robot localization and mapping using the PLOT features", 2012 IEEE International Conference on Mechatronics and Automation *
GAO Yunfeng et al.: "An Indoor Autonomous Mobile Robot Localization Method", Journal of Huazhong University of Science and Technology (Natural Science Edition) *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113176783A (en) * 2021-05-26 2021-07-27 珠海市一微半导体有限公司 Positioning control method, chip and robot based on map matching

Also Published As

Publication number Publication date
CN111080703B (en) 2022-05-27

Similar Documents

Publication Publication Date Title
CN104200495B (en) A kind of multi-object tracking method in video monitoring
JP5164222B2 (en) Image search method and system
KR20200045522A (en) Methods and systems for use in performing localization
CN109493313B (en) Vision-based steel coil positioning method and equipment
CN101556647A (en) mobile robot visual orientation method based on improved SIFT algorithm
JP2009020014A (en) Self-location estimation device
LU500407B1 (en) Real-time positioning method for inspection robot
CN104376328B (en) Coordinate-based distributed coding mark identification method and system
Liao et al. A method of image analysis for QR code recognition
CN111080703B (en) Mobile robot repositioning method based on linear matching
CN107463939B (en) Image key straight line detection method
CN110910389B (en) Laser SLAM loop detection system and method based on graph descriptor
CN112652020A (en) Visual SLAM method based on AdaLAM algorithm
CN111832634B (en) Foreign matter detection method, foreign matter detection system, terminal device and storage medium
CN109034151A (en) A kind of localization method for the identification of multiple pointer instruments
CN111400537B (en) Road element information acquisition method and device and electronic equipment
Jiang et al. Mobile robot gas source localization via top-down visual attention mechanism and shape analysis
CN116642492A (en) Mobile robot repositioning method and device and mobile robot
CN114526724B (en) Positioning method and equipment for inspection robot
CN115902977A (en) Transformer substation robot double-positioning method and system based on vision and GPS
CN113112551B (en) Camera parameter determining method and device, road side equipment and cloud control platform
CN115272482A (en) Camera external reference calibration method and storage medium
CN111931786B (en) Image processing method and device and computer readable storage medium
Xiong et al. Research on real-time multi-object detections based on template matching
CN112069849A (en) Identification and positioning method, device, equipment and storage medium based on multiple two-dimensional codes

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant