CN115575963A - Positioning method based on fusion of reflector and SLAM - Google Patents

Positioning method based on fusion of reflector and SLAM

Info

Publication number
CN115575963A
CN115575963A
Authority
CN
China
Prior art keywords
reflector
pose
slam
robot
positioning mode
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211565497.2A
Other languages
Chinese (zh)
Inventor
周军
谢杰
龙羽
徐菱
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Ruixinxing Technology Co ltd
Original Assignee
Chengdu Ruixinxing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Ruixinxing Technology Co ltd filed Critical Chengdu Ruixinxing Technology Co ltd
Priority to CN202211565497.2A priority Critical patent/CN115575963A/en
Publication of CN115575963A publication Critical patent/CN115575963A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a positioning method based on the fusion of a reflector and SLAM (simultaneous localization and mapping), comprising the following steps. S1: enter the reflector scanning area and perform reflector matching; if the current mode is not the reflector positioning mode, first switch to the reflector positioning mode and then perform reflector matching. S2: solve the pose of the robot. S3: issue the pose. S4: judge whether the robot has left the reflector scanning area; if so, switch to the SLAM positioning mode and issue the robot pose; otherwise, repeat steps S2 and S3. When the robot enters the reflector scanning area and switches to the reflector positioning mode, its pose is solved and issued; when the lidar scans no reflectors, or fewer than a set number, the robot switches to the SLAM positioning mode, in which the robot pose continues to be issued. The method is suitable for highly dynamic environments and reduces cost.

Description

Positioning method based on fusion of reflector and SLAM
Technical Field
The invention relates to the technical field of positioning, and in particular to a positioning method based on the fusion of a reflector and SLAM.
Background
With the rapid development of the Chinese economy, AGV carriers are used throughout factories, but an AGV's working environment is highly dynamic: people, goods and other mechanical equipment change frequently, so traditional SLAM positioning cannot cope with such a high-dynamic environment. At the same time the actual site is large and the high-dynamic region accounts for only a small part of it, so deploying reflectors across the whole site would be costly. Based on the inventors' long-term research, the invention provides a positioning method based on the fusion of a reflector and SLAM.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a positioning method based on the fusion of a reflector and SLAM.
The purpose of the invention is realized by the following technical scheme. A positioning method based on the fusion of a reflector and SLAM comprises the following steps:
S1: entering a reflector scanning area and, if already in the reflector positioning mode, performing reflector matching; if not in the reflector positioning mode, taking the SLAM pose as the initialization pose, switching to the reflector positioning mode, and then performing reflector matching; in the reflector positioning mode the SLAM positioning mode keeps running and solving continuous poses, but its poses are not issued;
S2: solving the pose of the robot in the reflector positioning mode;
S3: issuing the pose;
S4: judging whether the robot has left the reflector scanning area; if so, switching to the SLAM positioning mode and issuing the robot pose; if not, repeating steps S2 and S3.
Preferably, in step S1, if not in the reflector positioning mode, the SLAM positioning pose is acquired as the initial pose (x0, y0, θ0), with pose matrix
T0 = [cos θ0, -sin θ0, x0; sin θ0, cos θ0, y0; 0, 0, 1].
Preferably, step S2 further includes the following steps:
S21: in the reflector positioning mode, the lidar scans high-reflectivity objects to obtain laser data;
S22: filtering and extracting the laser point cloud from the scanned laser data using an intensity threshold, and calculating the number of points n expected on a reflector as
n = 2·arcsin(r/d) / φ,
where r is the radius of the reflector, d is the distance between the lidar and the reflector, and φ is the angular resolution of the radar;
S23: when the number of extracted laser points satisfies n, the coordinates (x_i, y_i) of each extracted point are obtained from the polar-coordinate formulas
x_i = d_i·cos α_i, y_i = d_i·sin α_i,
where d_i is the distance between the lidar and the reflector and α_i is the scanning angle in the radar coordinate system; the centroid position (c_x, c_y) of the reflector is obtained as a weighted average of all the laser points extracted from the current reflector, and the set of centroid positions of all reflectors extracted from the current frame is recorded as sets;
S24: from the extracted set sets of reflector centroids, the pose matrix T_l of each reflector in the radar coordinate system is built from its position (l_x, l_y) in that system; given the known relative pose T_s of the radar with respect to the robot centre and the current robot pose estimate T0, the currently estimated pose matrix of the reflector is T = T0·T_s·T_l; from this pose matrix the current position (x, y) of each reflector is obtained and matched against the calibrated reflectors, and the successfully matched reflectors are recorded as the set mapsets.
Preferably, step S24 further includes the following steps:
S241: the matched reflectors are combined in pairs, each pair of two reflectors forming one combination, giving a plurality of groups;
S242: all combinations are traversed; each combination contains the information of two reflectors. The calibrated coordinate of reflector a is (w_xa, w_ya) and its coordinate in the current radar coordinate system is (l_xa, l_ya); the calibrated coordinate of reflector b is (w_xb, w_yb) and its coordinate in the current radar coordinate system is (l_xb, l_yb); from these the current pose Tc of the vehicle is calculated.
S243: the vectors of the two reflectors under the world coordinate system are respectively
Figure 428661DEST_PATH_IMAGE027
Figure 2862DEST_PATH_IMAGE028
) Coordinate with
Figure 844041DEST_PATH_IMAGE020
As a starting point, the azimuth angle is wyaw, calculated from the vector as follows,
wyaw=atan2(
Figure 729958DEST_PATH_IMAGE029
);
then two reflective platesConstructed vector of
Figure 83927DEST_PATH_IMAGE030
As a starting point, a position matrix under the world coordinate system is
Figure 106110DEST_PATH_IMAGE031
Figure 321453DEST_PATH_IMAGE031
=
Figure 756982DEST_PATH_IMAGE032
Similarly, the vector formed by the two reflectors in the radar coordinate system runs from (l_xa, l_ya) to (l_xb, l_yb); with (l_xa, l_ya) as the starting point, its direction angle lyaw is calculated from the vector as
lyaw = atan2(l_yb - l_ya, l_xb - l_xa);
the pose matrix Tl of the vector with (l_xa, l_ya) as the starting point in the radar coordinate system is then
Tl = [cos lyaw, -sin lyaw, l_xa; sin lyaw, cos lyaw, l_ya; 0, 0, 1].
Preferably, given the known pose relation T_s of the laser with respect to the robot centre, the pose matrix of the robot in the world coordinate system is Tc; inverting the matrix Tl gives Tl⁻¹, and then
Tc = Tw·Tl⁻¹·T_s.
The robot poses calculated from all combinations are averaged to obtain the current pose of the robot, which is then issued.
Preferably, in step S4, when the lidar scans no reflectors, or the number of scanned reflectors is less than a set number, the method switches to the SLAM positioning mode.
The invention has the following advantages: when the robot enters the reflector scanning area it judges whether to switch to the reflector positioning mode; once switched, the robot pose is solved and issued; when the lidar scans no reflectors, or fewer than a set number, the method switches to the SLAM positioning mode and continues to issue the robot pose there, making it suitable for highly dynamic environments while reducing cost.
Drawings
Fig. 1 is a schematic structural diagram of a logic flow of a positioning method.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings of the embodiments of the present invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, as presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention.
In addition, the embodiments of the present invention and the features of the embodiments may be combined with each other without conflict.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, or orientations or positional relationships that the products of the present invention conventionally lay out when in use, or orientations or positional relationships that are conventionally understood by those skilled in the art, which are merely for convenience of describing the present invention and simplifying the description, but do not indicate or imply that the device or element referred to must have a specific orientation, be constructed in a specific orientation, and be operated, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and the like are used merely to distinguish one description from another, and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should also be noted that, unless otherwise explicitly specified or limited, the terms "disposed," "mounted," "connected," and "connected" are to be construed broadly and may, for example, be fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
In this embodiment, as shown in fig. 1, a positioning method based on the fusion of a reflector and SLAM includes the following steps:
S1: entering a reflector scanning area and, if already in the reflector positioning mode, performing reflector matching; if not in the reflector positioning mode, taking the SLAM pose as the initialization pose, switching to the reflector positioning mode, and then performing reflector matching; in the reflector positioning mode the SLAM positioning mode keeps running and solving continuous poses, but its poses are not issued. Specifically, inside the reflector scanning area the difference between the SLAM pose and the reflector-mode pose is compared in real time against a threshold that can be set manually; when the difference exceeds the threshold, the SLAM pose is corrected with the current reflector-mode pose. To guarantee accuracy, however, the SLAM algorithm's matching score against the map is also used: the laser scan taken from the current positioning pose is matched against the environment map, and when the matching degree of the reflector pose is lower than that of the SLAM positioning pose, the SLAM pose is not corrected; when the matching degree of the reflector pose is higher, the SLAM pose is corrected with the reflector pose, so that the SLAM positioning converges and matches near the correct initial pose. In this embodiment, solving the continuous pose of the robot in the SLAM positioning mode belongs to the prior art and is not described again here.
S2: solving the pose of the robot in the reflector positioning mode;
S3: issuing the pose;
S4: judging whether the robot has left the reflector scanning area; if so, switching to the SLAM positioning mode and issuing the robot pose; if not, repeating steps S2 and S3. Further, in step S4, when the lidar scans no reflectors, or the number of scanned reflectors is less than a set number, the method switches to the SLAM positioning mode. When the robot enters the reflector scanning area it judges whether to switch to the reflector positioning mode; once switched, the robot pose is solved and issued; when the lidar cannot scan any reflectors, or scans fewer than a set number, the method switches to the SLAM positioning mode and continues to issue the robot pose there, making it suitable for highly dynamic environments while reducing cost.
Further, in step S1, if not in the reflector positioning mode, the SLAM positioning pose is acquired as the initial pose (x0, y0, θ0), with pose matrix
T0 = [cos θ0, -sin θ0, x0; sin θ0, cos θ0, y0; 0, 0, 1].
Specifically, if already inside the reflector area, the pose obtained in the previous frame is used as the initial pose.
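The pose matrix used here is the standard 2-D homogeneous (SE(2)) transform; a minimal helper to build it, shown as a sketch:

```python
import numpy as np

def pose_matrix(x: float, y: float, yaw: float) -> np.ndarray:
    """Standard 2-D homogeneous pose matrix for a pose (x, y, yaw)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])
```

Composing two such poses is a plain matrix product, which is how the later steps chain the radar and robot-centre transforms.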
In this embodiment, step S2 further includes the following steps:
S21: in the reflector positioning mode, the lidar scans high-reflectivity objects to obtain laser data;
S22: filtering and extracting the laser point cloud from the scanned laser data using an intensity threshold, and calculating the number of points n expected on a reflector as
n = 2·arcsin(r/d) / φ,
where r is the radius of the reflector, d is the distance between the lidar and the reflector, and φ is the angular resolution of the radar;
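One plausible form of the S22 point count, assuming n = 2·arcsin(r/d)/φ (the angle subtended by a disc of radius r at range d, divided by the beam spacing); the formula in the original filing is an image placeholder and may differ:

```python
import math

def expected_point_count(r: float, d: float, resolution_rad: float) -> int:
    """Estimated number of lidar returns on a reflector of radius r at
    distance d, for a scanner with the given angular resolution (radians)."""
    half_angle = math.asin(min(1.0, r / d))  # half the subtended angle
    return max(1, int(2.0 * half_angle / resolution_rad))
```

For example, a reflector of radius 2.5 cm seen at 5 m by a scanner with 0.25° resolution yields only a couple of returns, which is why the extraction step checks the count before trusting a cluster.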
S23: when the number of extracted laser points satisfies n, the coordinates (x_i, y_i) of each extracted point are obtained from the polar-coordinate formulas
x_i = d_i·cos α_i, y_i = d_i·sin α_i,
where d_i is the distance between the lidar and the reflector and α_i is the scanning angle in the radar coordinate system; the centroid position (c_x, c_y) of the reflector is obtained as a weighted average of all the laser points extracted from the current reflector, and the set of centroid positions of all reflectors extracted from the current frame is recorded as sets;
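Steps S22 and S23 can be sketched as follows. The intensity threshold, the cluster gap and all names are assumed values for illustration; clustering by the gap between consecutive high-intensity returns is one simple way to separate reflectors, not necessarily the one used in the patent:

```python
import numpy as np

# Assumed values: the patent sets the intensity threshold manually, so these
# numbers are placeholders for illustration.
INTENSITY_THRESHOLD = 200.0
CLUSTER_GAP = 0.1  # metres between consecutive returns of one reflector

def reflector_centroids(ranges, angles, intensities):
    """Filter a scan by intensity, convert the surviving returns to Cartesian
    coordinates with the polar formulas x_i = d_i*cos(a_i), y_i = d_i*sin(a_i),
    split them into clusters separated by more than CLUSTER_GAP, and return
    one intensity-weighted centroid per cluster (step S23)."""
    ranges, angles, intensities = map(np.asarray, (ranges, angles, intensities))
    keep = intensities > INTENSITY_THRESHOLD
    xs = ranges[keep] * np.cos(angles[keep])
    ys = ranges[keep] * np.sin(angles[keep])
    w = intensities[keep]
    centroids, cluster = [], []
    for i in range(len(xs)):
        if cluster and np.hypot(xs[i] - xs[cluster[-1]],
                                ys[i] - ys[cluster[-1]]) > CLUSTER_GAP:
            idx = np.array(cluster)
            centroids.append((np.average(xs[idx], weights=w[idx]),
                              np.average(ys[idx], weights=w[idx])))
            cluster = []
        cluster.append(i)
    if cluster:
        idx = np.array(cluster)
        centroids.append((np.average(xs[idx], weights=w[idx]),
                          np.average(ys[idx], weights=w[idx])))
    return centroids
```

The returned list corresponds to the per-frame centroid set called sets above.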
S24: from the extracted set sets of reflector centroids, the pose matrix T_l of each reflector in the radar coordinate system is built from its position (l_x, l_y) in that system; specifically, since only the position of the reflector is solved and its orientation is not of interest, the heading angle defaults to 0. Given the relative pose T_s of the radar with respect to the robot centre and the current robot pose estimate T0, the currently estimated pose matrix of the reflector is T = T0·T_s·T_l; from this pose matrix the current position (x, y) of each reflector is obtained and matched against the calibrated reflectors, and the successfully matched reflectors are recorded as the set mapsets. Specifically, the matching uses the Mahalanobis distance: when the distance between the obtained reflector coordinate and a calibrated reflector coordinate is smaller than a set threshold, the match is considered successful; the threshold can be set manually according to the actual situation.
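A sketch of the S24 association step. The patent specifies the Mahalanobis distance; for brevity this sketch uses a plain Euclidean distance with a manually set threshold, and the names and threshold value are assumptions:

```python
import math

MATCH_THRESHOLD = 0.3  # metres; the patent says the threshold is set manually

def match_reflectors(detected, calibrated):
    """Greedy nearest-neighbour association of detected reflector centroids
    (already transformed into the map frame) with calibrated map reflectors.
    A pair is accepted when its distance is below MATCH_THRESHOLD; the list
    of accepted pairs corresponds to the set called mapsets above."""
    mapsets, used = [], set()
    for dx, dy in detected:
        best, best_d = None, MATCH_THRESHOLD
        for j, (cx, cy) in enumerate(calibrated):
            d = math.hypot(dx - cx, dy - cy)
            if j not in used and d < best_d:
                best, best_d = j, d
        if best is not None:
            used.add(best)
            mapsets.append(((dx, dy), calibrated[best]))
    return mapsets
```

Replacing the Euclidean test with a Mahalanobis one only requires weighting the residual by the inverse covariance of the centroid estimate.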
In this embodiment, step S24 further includes the following steps:
S241: the matched reflectors are combined in pairs, each pair of two reflectors forming one combination, giving a plurality of groups;
S242: all combinations are traversed; each combination contains the information of two reflectors. The calibrated coordinate of reflector a is (w_xa, w_ya) and its coordinate in the current radar coordinate system is (l_xa, l_ya); the calibrated coordinate of reflector b is (w_xb, w_yb) and its coordinate in the current radar coordinate system is (l_xb, l_yb); from these the current pose Tc of the vehicle is calculated.
S243: the vectors of the two reflectors under the world coordinate system are respectively
Figure 565360DEST_PATH_IMAGE027
Figure 613213DEST_PATH_IMAGE028
) Coordinate with
Figure 534901DEST_PATH_IMAGE020
As a starting point, the azimuth angle is wyaw, calculated from the vector as follows,
wyaw=atan2(
Figure 441939DEST_PATH_IMAGE029
);
the vector formed by the two reflectors
Figure 474486DEST_PATH_IMAGE030
As a starting point in world coordinatesPose matrix under tie is
Figure 970321DEST_PATH_IMAGE031
Figure 797331DEST_PATH_IMAGE031
=
Figure 253983DEST_PATH_IMAGE032
Similarly, the vector formed by the two reflectors in the radar coordinate system runs from (l_xa, l_ya) to (l_xb, l_yb); with (l_xa, l_ya) as the starting point, its direction angle lyaw is calculated from the vector as
lyaw = atan2(l_yb - l_ya, l_xb - l_xa);
the pose matrix Tl of the vector with (l_xa, l_ya) as the starting point in the radar coordinate system is then
Tl = [cos lyaw, -sin lyaw, l_xa; sin lyaw, cos lyaw, l_ya; 0, 0, 1].
Further, given the known pose relation T_s of the laser with respect to the robot centre, the pose matrix of the robot in the world coordinate system is Tc; inverting the matrix Tl gives Tl⁻¹, and then
Tc = Tw·Tl⁻¹·T_s.
The robot poses calculated from all combinations are averaged to obtain the current pose of the robot, which is then issued.
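The pair construction and the chain Tc = Tw·Tl⁻¹·T_s can be sketched as below, assuming the standard SE(2) matrices and taking T_s as the identity for simplicity (i.e. the laser sits at the robot centre); all names are illustrative:

```python
import numpy as np

def se2(x, y, yaw):
    """Standard 2-D homogeneous pose matrix."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])

def robot_pose_from_pair(wa, wb, la, lb, T_s=np.eye(3)):
    """Pose of the robot in the world frame from one pair of matched
    reflectors: wa/wb are calibrated world coordinates, la/lb the same
    reflectors in the radar frame, T_s the laser-to-robot-centre transform."""
    wyaw = np.arctan2(wb[1] - wa[1], wb[0] - wa[0])  # pair direction in world
    lyaw = np.arctan2(lb[1] - la[1], lb[0] - la[0])  # pair direction in radar frame
    Tw = se2(wa[0], wa[1], wyaw)                     # pair frame -> world
    Tl = se2(la[0], la[1], lyaw)                     # pair frame -> radar
    return Tw @ np.linalg.inv(Tl) @ T_s              # Tc = Tw * Tl^-1 * T_s
```

For instance, a radar at world position (1, 0) with zero yaw that sees reflectors with world coordinates (2, 0) and (2, 1) at radar-frame coordinates (1, 0) and (1, 1) is recovered at exactly that pose; averaging this result over all pairs gives the issued pose.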
In this embodiment, stable entry into the reflector area is determined as follows: after the robot first enters the reflector area, when the pose error between the reflector mode and SLAM stays continuously within a reasonable error range, the robot is considered to have stably entered the reflector area and the positioning is switched to the reflector positioning mode. When no reflector can be scanned, or the number of scanned reflectors is below a set number, the method switches to the SLAM positioning mode; because the SLAM pose is corrected in real time while in the reflector positioning mode, the pose issued after switching to SLAM remains accurate, and the deployment is flexible and robust. When the robot pose is issued through the SLAM positioning mode, the reflector positioning mode stops issuing the robot pose.
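The stable-entry check just described can be sketched as below; the tolerance and the number of consecutive frames are assumed values, since the patent only speaks of a "reasonable error range":

```python
# Assumed values illustrating the "continuously within a reasonable error
# range" criterion; tune both to the site and the lidar.
ERROR_TOLERANCE = 0.05   # metres of reflector-vs-SLAM pose disagreement
STABLE_FRAMES = 10       # consecutive frames required

def stably_entered(pose_errors) -> bool:
    """True once the reflector/SLAM pose error has stayed within tolerance
    for STABLE_FRAMES consecutive frames after entering the area."""
    recent = list(pose_errors)[-STABLE_FRAMES:]
    return len(recent) == STABLE_FRAMES and all(e < ERROR_TOLERANCE for e in recent)
```

Only after this returns true does the method hand pose publication over to the reflector positioning mode.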
Although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that the embodiments may be modified, or elements thereof replaced with equivalents, without departing from the scope of the invention.

Claims (6)

1. A positioning method based on the fusion of a reflector and SLAM, characterized in that it comprises the following steps:
S1: entering a reflector scanning area and, if already in the reflector positioning mode, performing reflector matching; if not in the reflector positioning mode, taking the SLAM pose as the initialization pose, switching to the reflector positioning mode, and then performing reflector matching; in the reflector positioning mode the SLAM positioning mode keeps running and solving continuous poses, but its poses are not issued;
S2: solving the pose of the robot in the reflector positioning mode;
S3: issuing the pose;
S4: judging whether the robot has left the reflector scanning area; if so, switching to the SLAM positioning mode and issuing the robot pose; if not, repeating steps S2 and S3.
2. The positioning method based on the fusion of a reflector and SLAM according to claim 1, characterized in that: in step S1, if not in the reflector positioning mode, the SLAM positioning pose is acquired as the initial pose (x0, y0, θ0), with pose matrix
T0 = [cos θ0, -sin θ0, x0; sin θ0, cos θ0, y0; 0, 0, 1].
3. The positioning method based on the fusion of a reflector and SLAM according to claim 2, characterized in that step S2 further comprises the following steps:
S21: in the reflector positioning mode, the lidar scans high-reflectivity objects to obtain laser data;
S22: filtering and extracting the laser point cloud from the scanned laser data using an intensity threshold, and calculating the number of points n expected on a reflector as n = 2·arcsin(r/d)/φ, where r is the radius of the reflector, d is the distance between the lidar and the reflector, and φ is the angular resolution of the radar;
S23: when the number of extracted laser points satisfies n, the coordinates (x_i, y_i) of each extracted point are obtained from the polar-coordinate formulas x_i = d_i·cos α_i and y_i = d_i·sin α_i, where d_i is the distance between the lidar and the reflector and α_i is the scanning angle in the radar coordinate system; the centroid position of the reflector is obtained as a weighted average of all the laser points extracted from the current reflector, and the set of centroid positions of all reflectors extracted from the current frame is recorded as sets;
S24: from the extracted set sets of reflector centroids, the pose matrix T_l of each reflector in the radar coordinate system is built from its position (l_x, l_y) in that system; given the known relative pose T_s of the radar with respect to the robot centre and the current robot pose estimate T0, the currently estimated pose matrix of the reflector is T = T0·T_s·T_l; from this pose matrix the current position of each reflector is obtained and matched against the calibrated reflectors, and the successfully matched reflectors are recorded as the set mapsets.
4. The positioning method based on the fusion of a reflector and SLAM according to claim 3, characterized in that step S24 further comprises the following steps:
S241: the matched reflectors are combined in pairs, each pair of two reflectors forming one combination, giving a plurality of groups;
S242: all combinations are traversed, each combination containing the information of two reflectors; the calibrated coordinate of reflector a is (w_xa, w_ya) and its coordinate in the current radar coordinate system is (l_xa, l_ya); the calibrated coordinate of reflector b is (w_xb, w_yb) and its coordinate in the current radar coordinate system is (l_xb, l_yb); from these the current pose Tc of the vehicle is calculated;
S243: the vector formed by the two reflectors in the world coordinate system, with (w_xa, w_ya) as the starting point, has azimuth angle wyaw = atan2(w_yb - w_ya, w_xb - w_xa), and the pose matrix of this vector in the world coordinate system is Tw = [cos wyaw, -sin wyaw, w_xa; sin wyaw, cos wyaw, w_ya; 0, 0, 1]; similarly, the vector formed by the two reflectors in the radar coordinate system, with (l_xa, l_ya) as the starting point, has direction angle lyaw = atan2(l_yb - l_ya, l_xb - l_xa), and its pose matrix in the radar coordinate system is Tl = [cos lyaw, -sin lyaw, l_xa; sin lyaw, cos lyaw, l_ya; 0, 0, 1].
5. The positioning method based on the fusion of a reflector and SLAM according to claim 4, characterized in that: given the known pose relation T_s of the laser with respect to the robot centre, the pose matrix of the robot in the world coordinate system is Tc; inverting the matrix Tl gives Tl⁻¹, and then Tc = Tw·Tl⁻¹·T_s; the calculated robot poses are averaged to obtain the current pose of the robot, which is then published.
6. The positioning method based on the fusion of a reflector and SLAM according to claim 1, characterized in that: in step S4, when the lidar does not scan any reflector, the method switches to the SLAM positioning mode.
CN202211565497.2A 2022-12-07 2022-12-07 Positioning method based on fusion of reflector and SLAM Pending CN115575963A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211565497.2A CN115575963A (en) 2022-12-07 2022-12-07 Positioning method based on fusion of reflector and SLAM


Publications (1)

Publication Number Publication Date
CN115575963A true CN115575963A (en) 2023-01-06

Family

ID=84590456

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211565497.2A Pending CN115575963A (en) 2022-12-07 2022-12-07 Positioning method based on fusion of reflector and SLAM

Country Status (1)

Country Link
CN (1) CN115575963A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110989592A (en) * 2019-12-02 2020-04-10 华中科技大学 Automatic mapping and positioning system for mobile robot
CN111017804A (en) * 2019-11-08 2020-04-17 华中科技大学 Intelligent mobile transfer system and transfer method thereof
CN211477160U (en) * 2019-12-06 2020-09-11 江西洪都航空工业集团有限责任公司 Laser navigation system with multiple positioning navigation modes
CN112629522A (en) * 2020-12-31 2021-04-09 山东大学 AGV positioning method and system with reflector and laser SLAM integrated
CN113625320A (en) * 2021-08-06 2021-11-09 珠海丽亭智能科技有限公司 Outdoor combined positioning method based on differential GPS and reflector
CN115220079A (en) * 2022-07-14 2022-10-21 中国中煤能源集团有限公司 Fusion positioning method and device and storage medium
CN115220012A (en) * 2022-09-20 2022-10-21 成都睿芯行科技有限公司 Positioning method based on reflecting plate
CN115437385A (en) * 2022-10-21 2022-12-06 上海木蚁机器人科技有限公司 Laser positioning method, device, equipment and medium for mobile robot

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Cao Yong: "Design and Implementation of a Warehouse AGV Navigation and Positioning System Based on Multi-Sensor Fusion", China Master's Theses Full-text Database, Information Science and Technology Series *

Similar Documents

Publication Publication Date Title
CN110645974B (en) Mobile robot indoor map construction method fusing multiple sensors
CN110658530B (en) Map construction method and system based on double-laser-radar data fusion and map
EP4283328A1 (en) Multi-radar and camera joint calibration method, system and device, and storage medium
CN115220012A (en) Positioning method based on reflecting plate
US20200030982A1 (en) Robot recharge docking method and robot with the same
CN111273312B (en) Intelligent vehicle positioning and loop detection method
CN109541632B (en) Target detection missing detection improvement method based on four-line laser radar assistance
CN113792699B (en) Object-level rapid scene recognition method based on semantic point cloud
US10509980B2 (en) Method to provide a vehicle environment contour polyline from detection data
CN110794396B (en) Multi-target identification method and system based on laser radar and navigation radar
CN111678516B (en) Bounded region rapid global positioning method based on laser radar
CN109188382A (en) A kind of target identification method based on millimetre-wave radar
CN116449392B (en) Map construction method, device, computer equipment and storage medium
CN115027482A (en) Fusion positioning method in intelligent driving
CN112700537A (en) Tire point cloud construction method, tire point cloud assembly method, tire point cloud control device, and storage medium
CN114295099B (en) Ranging method based on monocular camera, vehicle-mounted ranging equipment and storage medium
CN115201849A (en) Indoor map building method based on vector map
CN115575963A (en) Positioning method based on fusion of reflector and SLAM
Morris et al. A view-dependent adaptive matched filter for ladar-based vehicle tracking
TW202019742A (en) Lidar detection device for close obstacles and method thereof capable of effectively detecting obstacles and enhancing detection accuracy
CN113280829A (en) Target detection method and device based on fisheye vision and millimeter wave radar data
CN112255616B (en) Multi-radar reflective column positioning method and reflective column positioning device
CN111812659A (en) Iron tower posture early warning device and method based on image recognition and laser ranging
CN116399354A (en) High-precision low-drift large-range three-dimensional point cloud map construction and repositioning method
Steder et al. Maximum likelihood remission calibration for groups of heterogeneous laser scanners

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20230106
