CN111580508A - Robot positioning method and device, electronic equipment and storage medium
Robot positioning method and device, electronic equipment and storage medium
- Publication number
- CN111580508A (publication), CN202010291345.2A (application)
- Authority
- CN
- China
- Prior art keywords
- pose
- robot
- data
- positioning
- estimated
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0257—Control of position or course in two dimensions specially adapted to land vehicles using a radar
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/027—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising inertial navigation means, e.g. azimuth detector
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0272—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising means for registering the travel distance, e.g. revolutions of wheels
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Aviation & Aerospace Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The invention discloses a robot positioning method and device, an electronic device and a storage medium, wherein the method comprises the following steps: collecting positioning data of the robot; generating the pose of the robot at the previous moment according to the positioning data, the initial pose of the robot and a preset map; and generating a plurality of estimated poses corresponding to the previous-time pose, and taking, as the current pose of the robot, the estimated pose with the largest weight and/or whose difference from the previous-time pose is smaller than a preset threshold. This solves the problems that the robot is prone to positioning failure during scheduling, which in turn causes scheduling failure, reduces the intelligence and flexibility of the robot, and results in a poor use experience.
Description
Technical Field
The present invention relates to the field of robot technologies, and in particular, to a method and an apparatus for positioning a robot, an electronic device, and a storage medium.
Background
With the rapid development of robotics, Automatic Guided Vehicles (AGVs) are widely used. To enable an AGV to perceive its environment, make decisions and execute them automatically, the AGV generally carries various sensors for perception; decisions are then made from the sensor data, and the pose of the AGV is adjusted by means of its chassis motors.
In the related art, the pose of the robot within a known two-dimensional map is typically updated by the Adaptive Monte Carlo Localization (AMCL) algorithm, which applies a particle filter to 2D radar data and wheel odometer data.
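For context, the related-art AMCL update referred to above can be sketched roughly as follows. This is a heavily simplified, hypothetical particle-filter step written for illustration only: the occupancy-grid representation, the matching score, the noise values and the resampling scheme are assumptions of this sketch, not the actual AMCL implementation and not the method of the present invention.

```python
import numpy as np

def scan_match_score(pose, scan_xy, occ_grid, resolution=0.05):
    """Toy matching score: fraction of radar endpoints that land on occupied map cells."""
    x, y, yaw = pose
    c, s = np.cos(yaw), np.sin(yaw)
    world = scan_xy @ np.array([[c, s], [-s, c]]) + np.array([x, y])  # robot frame -> map frame
    ij = np.floor(world / resolution).astype(int)
    inside = (ij[:, 0] >= 0) & (ij[:, 0] < occ_grid.shape[0]) & \
             (ij[:, 1] >= 0) & (ij[:, 1] < occ_grid.shape[1])
    hits = occ_grid[ij[inside, 0], ij[inside, 1]].sum()
    return (hits + 1e-6) / max(len(scan_xy), 1)

def amcl_step(particles, odom_delta, scan_xy, occ_grid, motion_noise=(0.02, 0.02, 0.01)):
    """One simplified AMCL-style update: motion model, measurement weighting, resampling.

    particles: (N, 3) array of [x, y, yaw] pose hypotheses.
    odom_delta: [dx, dy, dyaw] increment from the wheel odometer since the last update.
    """
    n = len(particles)
    # Motion update: propagate every particle by the odometry increment plus noise.
    particles = particles + odom_delta + np.random.normal(0.0, motion_noise, size=(n, 3))
    # Measurement update: weight each particle by how well the scan fits the known map there.
    weights = np.array([scan_match_score(p, scan_xy, occ_grid) for p in particles])
    weights = weights / weights.sum()
    # Resampling: draw a new particle set in proportion to the weights.
    idx = np.random.choice(n, size=n, p=weights)
    return particles[idx], np.full(n, 1.0 / n)
```

Each iteration propagates the particles with the odometry increment, reweights them by how well the radar scan agrees with the known map at each particle, and resamples; the robot pose is then typically read off as the weighted mean or the highest-weight particle. When a closed elevator car no longer matches the three-sided elevator area in the map, these weights become misleading, which is exactly the failure mode described next.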
However, when positioning AGVs with the AMCL method, it is found that during AGV scheduling, especially when an AGV enters an elevator and the elevator door closes, positioning errors or drift often occur because the actual environment no longer matches the known map, so that the scheduling of the AGV cannot be completed.
Disclosure of Invention
The invention provides a robot positioning method and device, an electronic device and a storage medium, aiming to solve the problems that the robot is prone to positioning failure during scheduling, which in turn causes scheduling failure, reduces the intelligence and flexibility of the robot, and results in a poor use experience.
An embodiment of a first aspect of the present invention provides a robot positioning method, including the following steps: collecting positioning data of the robot; generating the pose of the robot at the previous moment according to the positioning data, the initial pose of the robot and a preset map; and generating a plurality of estimated poses corresponding to the previous-time pose, and taking, as the current pose of the robot, the estimated pose with the largest weight and/or whose difference from the previous-time pose is smaller than a preset threshold.
An embodiment of a second aspect of the present invention provides a robot positioning device, including: an acquisition module for collecting positioning data of the robot; a generating module for generating the pose of the robot at the previous moment according to the positioning data, the initial pose of the robot and a preset map; and a positioning module for generating a plurality of estimated poses corresponding to the previous-time pose and taking, as the current pose of the robot, the estimated pose with the largest weight and/or whose difference from the previous-time pose is smaller than a preset threshold.
An embodiment of a third aspect of the present invention provides an electronic device, including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, the instructions being arranged to perform a method of positioning a robot as described in the above embodiments.
A fourth aspect of the present invention provides a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the positioning method of a robot according to the above embodiments.
With the method, the pose of the robot at the previous moment and a plurality of estimated poses corresponding to that pose are generated from the collected positioning data, the initial pose of the robot and a preset map, and the estimated pose with the largest weight and/or whose difference from the previous-time pose is smaller than a preset threshold is taken as the current pose of the robot. This avoids the problem that the robot is prone to positioning failure during scheduling, in particular during inter-floor scheduling, improves positioning accuracy, avoids scheduling failures, ensures the reliability of the robot and improves the use experience. The problems that positioning failure easily occurs during robot scheduling, causing scheduling failure, reduced intelligence and flexibility of the robot and a poor use experience, are thereby solved.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a flowchart of a positioning method of a robot according to an embodiment of the present invention;
fig. 2 is a schematic illustration of a hoistway positioning failure;
FIG. 3 is a schematic diagram of pose estimation according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating a positioning process after positioning correction according to an embodiment of the present invention;
fig. 5 is a flowchart of a positioning method of a robot according to an embodiment of the present invention;
fig. 6 is a block diagram illustrating a positioning apparatus of a robot according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
A robot positioning method and device, an electronic device and a storage medium according to embodiments of the present invention are described below with reference to the drawings. To address the problems mentioned in the Background section, namely that the robot is prone to positioning failure during scheduling, which causes scheduling failure, reduces the intelligence and flexibility of the robot and results in a poor use experience, the invention provides a robot positioning method. In this method, the pose of the robot at the previous moment and a plurality of estimated poses corresponding to that pose are generated according to the collected positioning data, the initial pose of the robot and a preset map, and the estimated pose with the largest weight and/or whose difference from the previous-time pose is smaller than a preset threshold is taken as the current pose of the robot. This avoids the problem that the robot is prone to positioning failure during scheduling, in particular during inter-floor scheduling, improves positioning accuracy, avoids scheduling failures, ensures the reliability of the robot and improves the use experience, thereby solving the problems described above.
Specifically, fig. 1 is a flowchart of a positioning method of a robot according to an embodiment of the present invention.
As shown in fig. 1, the positioning method of the robot includes the following steps:
in step S101, positioning data of the robot is collected.
Specifically, taking an AGV as an example of the robot: when the AGV enters an elevator and the elevator door closes, the 2D radar scan of the AGV changes from three walls to four walls, whereas in the known map the elevator area is drawn with only three walls, while some building structure elsewhere in the known map resembles the closed elevator car in shape; the positioning failure shown in fig. 2 then occurs. Moreover, since the AGV is by default assumed to be at the area marked by the black square, the dispatching of the elevator is affected at this time. Therefore, the embodiment of the present invention makes the following improvements.
It is to be understood that the positioning data may include, but is not limited to, radar data, inertial measurement data, and wheel odometry data, wherein the radar data may be 2D lidar data, each of which may be collected by a sensor. It should be noted that, those skilled in the art may collect the positioning data of the robot according to actual situations, and the positioning data is not specifically limited herein.
Further, in one embodiment of the present invention, the positioning data comprises radar data, inertial measurement data and wheel odometer data, wherein collecting the positioning data of the robot comprises: performing data format and coordinate conversion on the radar data to obtain first positioning data; and optimizing the inertia measurement data and the wheel odometer data to obtain second positioning data.
It is understood that when the collected positioning data includes radar data, inertial measurement data, and wheel odometer data, embodiments of the present invention may also process the data collected by the sensor, for example, perform data format and coordinate conversion on the radar data, and perform optimization on the inertial measurement data and the wheel odometer data.
Specifically, assume that the acquired inertial measurement data consist of a three-axis acceleration and a three-axis angular velocity [a_x, a_y, a_z, ω_x, ω_y, ω_z]. The embodiment of the invention integrates these data and converts them into the reference coordinate system, so as to obtain the position and attitude P_imu, i.e. the first positioning data; the wheel odometer data can be obtained from the tf_tree and is denoted P_odom, i.e. the second positioning data, where:
P_odom = [x_o, y_o, z_o, roll_o, pitch_o, yaw_o].
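As a concrete illustration of this pre-processing, the sketch below performs a naive Euler integration of the IMU samples to obtain P_imu and reads P_odom through a generic lookup callable. The helper names (integrate_imu, read_odom_pose, tf_lookup), the omission of gravity compensation, bias estimation and proper SO(3) integration, and the 6-DoF vector layout are assumptions of this sketch, not the patent's implementation.

```python
import numpy as np

def integrate_imu(imu_samples, dt, p0=np.zeros(3), v0=np.zeros(3), rpy0=np.zeros(3)):
    """Integrate [ax, ay, az, wx, wy, wz] samples into a pose P_imu = [x, y, z, roll, pitch, yaw].

    Deliberately simplified Euler integration in the reference frame; a real pipeline would
    also remove gravity, estimate biases and integrate the attitude on SO(3).
    """
    p, v, rpy = p0.astype(float), v0.astype(float), rpy0.astype(float)
    for ax, ay, az, wx, wy, wz in imu_samples:
        rpy = rpy + np.array([wx, wy, wz]) * dt   # attitude from angular rates
        v = v + np.array([ax, ay, az]) * dt       # velocity from accelerations
        p = p + v * dt                            # position from velocity
    return np.concatenate([p, rpy])               # P_imu

def read_odom_pose(tf_lookup):
    """tf_lookup is an assumed callable returning ((x, y, z), (roll, pitch, yaw)) from the tf tree."""
    (x, y, z), (roll, pitch, yaw) = tf_lookup()
    return np.array([x, y, z, roll, pitch, yaw])  # P_odom = [x_o, y_o, z_o, roll_o, pitch_o, yaw_o]
```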
in step S102, a pose of the robot at a previous time is generated according to the positioning data, the initial pose, and a preset map of the robot.
Further, in an embodiment of the present invention, generating a pose of the robot at a previous time according to the positioning data, the initial pose and the preset map of the robot includes: and solving by a least square method according to the first positioning data and the second positioning data to obtain the pose of the previous moment.
It can be understood that errors are inevitably present when the sensors acquire data. If the sensor errors were ignored, P_imu = P_odom would hold; because of the sensor errors, however, P_imu ≠ P_odom. Therefore, assuming that a corrected pose P′ exists, the embodiment of the invention solves for P′ by the least square method, so that the pose of the robot at the previous moment can be obtained. It should be noted that the calculation used when solving the previous-time pose of the robot by the least square method is the same as in the related art, and is not described in detail here to avoid redundancy.
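Because the exact least-squares objective is not reproduced in the text above, the following sketch assumes one plausible formulation: P′ is chosen to minimize the inverse-variance-weighted squared residuals to both P_imu and P_odom. The sigma values, and the use of scipy's generic solver instead of the closed-form weighted mean, are illustrative choices of this sketch.

```python
import numpy as np
from scipy.optimize import least_squares

def fuse_poses(p_imu, p_odom, sigma_imu=0.05, sigma_odom=0.02):
    """Solve P' = argmin ||(P' - P_imu) / sigma_imu||^2 + ||(P' - P_odom) / sigma_odom||^2."""
    p_imu, p_odom = np.asarray(p_imu, float), np.asarray(p_odom, float)

    def residuals(p):
        return np.concatenate([(p - p_imu) / sigma_imu, (p - p_odom) / sigma_odom])

    x0 = 0.5 * (p_imu + p_odom)            # reasonable initial guess
    return least_squares(residuals, x0).x  # previous-time pose P'
```

With equal sigmas this reduces to the plain average of the two poses; unequal sigmas let the better-trusted sensor dominate, and richer residuals (for example on-manifold angle differences) can be substituted without changing the structure.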
In step S103, a plurality of estimated poses corresponding to the previous-time pose are generated, and the estimated pose with the largest weight and/or whose difference from the previous-time pose is smaller than a preset threshold is taken as the current pose of the robot.
Specifically, the positioning node and the map management node can generate a plurality of estimated poses near the previous-time pose P′ according to the initial pose of the AGV and the preset map. As shown in fig. 3, line A is the pose curve of the inertial measurement data, line B is the pose curve of the wheel odometer data, line C is the pose curve of the robot at the previous time, and line D is the estimated pose curve. In an embodiment of the present invention, after generating the plurality of estimated poses corresponding to the previous-time pose, the method further includes: matching the positioning data against the preset map at each of the estimated poses, so as to determine the weight of each estimated pose according to its matching degree.
For example, the plurality of estimated poses may be represented by a particle cloud, where each particle represents a possible location of the AGV (for example derived from the radar data) and its weight represents the likelihood that the AGV is at that point. The embodiment of the invention traverses the particle cloud and matches the radar data scanned by the AGV against the preset map at each particle, so that the weight of each estimated pose is determined by the matching degree of that particle. It should be noted that the preset map may be the known map stored inside the AGV, and a higher matching degree of a particle indicates a larger weight for that particle.
Each of the estimated poses is then traversed again, and the current pose of the robot is taken to be the estimated pose with the largest weight, or the estimated pose whose difference from the previous-time pose is smaller than the preset threshold, or the estimated pose that both has the largest weight and differs from the previous-time pose by less than the preset threshold. As shown in fig. 4, the preset threshold range may be indicated by the dashed circle; the preset threshold may be a value preset by the user, a value obtained through a limited number of experiments, or a value obtained through a limited number of computer simulations.
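In code, the selection rule just described might look as follows. The pose-difference metric (planar Euclidean distance), the threshold value and the way the "and/or" condition is resolved are assumptions of this sketch.

```python
import numpy as np

def select_current_pose(estimated_poses, weights, prev_pose, threshold=0.3):
    """Pick the current pose among the estimated poses.

    estimated_poses: (N, 3) array of [x, y, yaw]; weights: (N,) matching-based weights.
    One reading of the 'and/or' criterion: prefer the highest-weight pose that also lies within
    `threshold` of the previous-time pose; otherwise fall back to the highest-weight pose,
    or to the pose closest to the previous-time pose if all weights are zero.
    """
    poses = np.asarray(estimated_poses, float)
    weights = np.asarray(weights, float)
    dists = np.linalg.norm(poses[:, :2] - np.asarray(prev_pose, float)[:2], axis=1)

    near = dists < threshold
    if near.any():
        idx = int(np.argmax(np.where(near, weights, -np.inf)))  # max weight among nearby poses
    elif weights.max() > 0:
        idx = int(np.argmax(weights))                           # max weight only
    else:
        idx = int(np.argmin(dists))                             # nearest pose only
    return poses[idx]
```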
In order to further understand the positioning method of the robot according to the embodiment of the present invention, a detailed description of a specific embodiment is provided below.
As shown in fig. 5, the positioning method of the robot includes the following steps:
step S501, collecting positioning data of the robot.
Step S502, starting a multithreading program to process and optimize the collected positioning data.
Step S503, generating the pose of the robot at the previous moment according to the positioning data, the initial pose and the preset map of the robot.
Step S504, generating a plurality of estimated poses corresponding to the previous-time pose, and taking, as the current pose of the robot, the estimated pose with the largest weight and/or whose difference from the previous-time pose is smaller than a preset threshold.
Step S505, determining whether an end signal is received, if yes, executing step S506, otherwise, executing step S503.
Step S506, ending.
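The flow of steps S501 to S506 can be summarized by the loop sketch below. It reuses the hypothetical fuse_poses and select_current_pose helpers from the earlier sketches, and the collect, preprocess and estimate_poses callables are likewise assumed placeholders standing in for steps S501, S502 and part of S504.

```python
import threading

def positioning_loop(collect, preprocess, estimate_poses, initial_pose, grid_map,
                     stop_event: threading.Event):
    """Hypothetical main loop mirroring steps S501-S506 of fig. 5."""
    prev_pose = initial_pose
    while not stop_event.is_set():                   # S505: no end signal received yet
        raw = collect()                              # S501: radar, IMU and wheel-odometry data
        p_imu, p_odom = preprocess(raw)              # S502: (multithreaded) processing and optimization
        prev_pose = fuse_poses(p_imu, p_odom)        # S503: previous-time pose via least squares
        candidates, weights = estimate_poses(prev_pose, raw, grid_map)
        prev_pose = select_current_pose(candidates, weights, prev_pose)  # S504: pick current pose
    return prev_pose                                 # S506: end
```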
According to the robot positioning method provided by the embodiment of the invention, the pose of the robot at the previous moment and a plurality of estimated poses corresponding to that pose are generated from the collected positioning data, the initial pose of the robot and a preset map, and the estimated pose with the largest weight and/or whose difference from the previous-time pose is smaller than a preset threshold is taken as the current pose of the robot. This avoids the problem that the robot is prone to positioning failure during scheduling, in particular during inter-floor scheduling, improves positioning accuracy, avoids scheduling failures, ensures the reliability of the robot and improves the use experience.
Next, a positioning apparatus of a robot according to an embodiment of the present invention will be described with reference to the accompanying drawings.
Fig. 6 is a block diagram schematically illustrating a positioning apparatus of a robot according to an embodiment of the present invention.
As shown in fig. 6, the positioning device 10 of the robot includes: an acquisition module 100, a generation module 200 and a positioning module 300.
The acquisition module 100 is used for collecting positioning data of the robot. The generating module 200 is configured to generate the pose of the robot at the previous moment according to the positioning data, the initial pose of the robot and a preset map. The positioning module 300 is configured to generate a plurality of estimated poses corresponding to the previous-time pose, and to take, as the current pose of the robot, the estimated pose with the largest weight and/or whose difference from the previous-time pose is smaller than a preset threshold.
Further, in one embodiment of the present invention, the positioning data comprises radar data, inertial measurement data, and wheel odometry data, wherein the acquisition module comprises: the first processing unit is used for carrying out data format and coordinate conversion on the radar data to obtain first positioning data; and the second processing unit is used for optimizing the inertia measurement data and the wheel odometer data to obtain second positioning data.
Further, in one embodiment of the present invention, the generating module comprises:
the calculating unit is used for solving the previous-time pose by a least square method according to the first positioning data and the second positioning data.
Further, in an embodiment of the present invention, the robot positioning device further includes a matching module, which is used for matching the positioning data against the preset map at each of the estimated poses after the plurality of estimated poses corresponding to the previous-time pose are generated, so as to determine the weight of each estimated pose according to its matching degree.
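Purely as an illustration of how these modules could be composed, the sketch below arranges them as a class; the module interfaces (collect, previous_pose, estimate, weigh, select) are assumptions of this sketch and are not prescribed by the device described above.

```python
class RobotPositioningDevice:
    """Hypothetical composition of the acquisition, generating, positioning and matching modules."""

    def __init__(self, acquisition, generating, positioning, matching):
        self.acquisition = acquisition  # module 100: collects positioning data
        self.generating = generating    # module 200: previous-time pose from data, initial pose and map
        self.positioning = positioning  # module 300: estimated poses and current-pose selection
        self.matching = matching        # weights each estimated pose by map matching

    def localize_once(self, initial_pose, preset_map):
        data = self.acquisition.collect()
        prev_pose = self.generating.previous_pose(data, initial_pose, preset_map)
        candidates = self.positioning.estimate(prev_pose, preset_map)
        weights = self.matching.weigh(candidates, data, preset_map)
        return self.positioning.select(candidates, weights, prev_pose)
```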
It should be noted that the above explanation of the embodiment of the positioning method for a robot is also applicable to the positioning device for a robot in this embodiment, and is not repeated here.
According to the robot positioning device provided by the embodiment of the invention, the pose of the robot at the previous moment and a plurality of estimated poses corresponding to that pose are generated from the collected positioning data, the initial pose of the robot and a preset map, and the estimated pose with the largest weight and/or whose difference from the previous-time pose is smaller than a preset threshold is taken as the current pose of the robot. This avoids the problem that the robot is prone to positioning failure during scheduling, in particular during inter-floor scheduling, improves positioning accuracy, avoids scheduling failures, ensures the reliability of the robot and improves the use experience.
In order to implement the above embodiments, the present invention further provides an electronic device, including at least one processor and a memory communicatively connected to the at least one processor, wherein the memory stores instructions executable by the at least one processor, the instructions being configured to perform the robot positioning method of the above embodiments, for example:
collecting positioning data of the robot;
generating a pose of the robot at the previous moment according to the positioning data, the initial pose and a preset map of the robot; and
generating a plurality of estimated poses corresponding to the previous-time pose, and taking, as the current pose of the robot, the estimated pose with the largest weight and/or whose difference from the previous-time pose is smaller than a preset threshold.
In order to achieve the above embodiments, the present invention also proposes a non-transitory computer-readable storage medium storing computer instructions for causing a computer to execute the positioning method of the robot of the above embodiments.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or N embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "N" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more N executable instructions for implementing steps of a custom logic function or process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of implementing the embodiments of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or N wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the N steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.
Claims (10)
1. A method for positioning a robot, comprising the steps of:
collecting positioning data of the robot;
generating a previous-time pose of the robot according to the positioning data, the initial pose and a preset map of the robot; and
generating a plurality of estimated poses corresponding to the previous-time pose, and taking, as the current pose of the robot, the estimated pose with the largest weight and/or whose difference from the previous-time pose is smaller than a preset threshold.
2. The method of claim 1, wherein the positioning data comprises radar data, inertial measurement data, and wheel odometry data, and wherein the collecting the positioning data for the robot comprises:
performing data format and coordinate conversion on the radar data to obtain first positioning data;
and optimizing the inertia measurement data and the wheel odometer data to obtain second positioning data.
3. The method of claim 2, wherein generating the previous-time pose of the robot from the positioning data, the initial pose, and a preset map of the robot comprises:
and solving by a least square method according to the first positioning data and the second positioning data to obtain the pose of the previous moment.
4. The method of claim 1, after generating a plurality of estimated poses corresponding to the previous-time pose, further comprising:
and matching the positioning data with the preset map at each estimated pose of the estimated poses so as to determine the weight of each estimated pose according to the matching degree of each estimated pose.
5. A positioning device for a robot, comprising:
the acquisition module is used for acquiring positioning data of the robot;
the generating module is used for generating a pose of the robot at the previous moment according to the positioning data, the initial pose and a preset map of the robot; and
the positioning module is used for generating a plurality of estimated poses corresponding to the previous-time pose, and for taking, as the current pose of the robot, the estimated pose with the largest weight and/or whose difference from the previous-time pose is smaller than a preset threshold.
6. The apparatus of claim 5, wherein the positioning data comprises radar data, inertial measurement data, and wheel odometry data, wherein the acquisition module comprises:
the first processing unit is used for carrying out data format and coordinate conversion on the radar data to obtain first positioning data;
and the second processing unit is used for optimizing the inertia measurement data and the wheel odometer data to obtain second positioning data.
7. The apparatus of claim 6, wherein the generating module comprises:
and the calculation unit is used for solving the previous time pose by a least square method according to the first positioning data and the second positioning data.
8. The apparatus of claim 5, further comprising:
and the matching module is used for matching the positioning data with the preset map at each estimated pose of the estimated poses after generating a plurality of estimated poses corresponding to the pose at the previous moment so as to determine the weight of each estimated pose according to the matching degree of each estimated pose.
9. An electronic device, comprising: memory, a processor and a computer program stored on the memory and executable on the processor, the processor executing the program to implement the positioning method of the robot according to any of claims 1-4.
10. A non-transitory computer-readable storage medium, on which a computer program is stored, the program being executable by a processor for implementing a positioning method of a robot according to any of claims 1-4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010291345.2A CN111580508A (en) | 2020-04-14 | 2020-04-14 | Robot positioning method and device, electronic equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010291345.2A CN111580508A (en) | 2020-04-14 | 2020-04-14 | Robot positioning method and device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111580508A true CN111580508A (en) | 2020-08-25 |
Family
ID=72126497
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010291345.2A Pending CN111580508A (en) | 2020-04-14 | 2020-04-14 | Robot positioning method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111580508A (en) |
- 2020-04-14 CN CN202010291345.2A patent/CN111580508A/en active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140129027A1 (en) * | 2012-11-02 | 2014-05-08 | Irobot Corporation | Simultaneous Localization And Mapping For A Mobile Robot |
CN109506641A (en) * | 2017-09-14 | 2019-03-22 | 深圳乐动机器人有限公司 | The pose loss detection and relocation system and robot of mobile robot |
CN109144056A (en) * | 2018-08-02 | 2019-01-04 | 上海思岚科技有限公司 | The global method for self-locating and equipment of mobile robot |
CN109579849A (en) * | 2019-01-14 | 2019-04-05 | 浙江大华技术股份有限公司 | Robot localization method, apparatus and robot and computer storage medium |
CN110319832A (en) * | 2019-07-05 | 2019-10-11 | 北京海益同展信息科技有限公司 | Robot localization method, apparatus, electronic equipment and medium |
Non-Patent Citations (1)
Title |
---|
ZHANG Tao et al.: "Introduction to Robotics" (机器人概论), China Machine Press *
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112700495A (en) * | 2020-11-25 | 2021-04-23 | 北京旷视机器人技术有限公司 | Pose determination method and device, robot, electronic device and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20200825 |