CN114562994A - Positioning method of mobile robot in dynamic environment - Google Patents
Positioning method of mobile robot in dynamic environment
- Publication number
- CN114562994A (application CN202210231443.6A)
- Authority
- CN
- China
- Prior art keywords
- data
- sensing data
- unit
- mobile robot
- measurement
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
- G01C21/1652—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Navigation (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The invention provides a positioning method for a mobile robot in a dynamic environment, comprising the following steps: collecting first sensing data and second sensing data measured by an inertial measurement unit and an odometer unit respectively, performing preliminary processing, fusing the processed first and second sensing data to obtain integrated sensing data, and deriving first pose information from the integrated sensing data; collecting third sensing data measured by a lidar unit, and matching the third sensing data against an original map for positioning; judging whether the third sensing data are valid and, if valid, solving for second pose information; and fusing the first pose information and the second pose information to obtain third pose information of the mobile robot. The invention adapts to accurate positioning in a dynamic environment and avoids failed or inaccurate positioning caused by large environmental changes.
Description
Technical Field
The invention relates to the technical field of positioning methods, and in particular to a positioning method for a mobile robot in a dynamic environment.
Background
When a mobile robot works in a dynamic environment such as an outdoor environment or a complex indoor environment, it must first be positioned before it can operate reliably; this positioning is generally realized by a robot positioning algorithm.
Prior-art robot positioning algorithms generally use a two-dimensional scheme covering only forward-backward and left-right travel, and typically rely on a lidar alone. An initial environment map, the so-called offline map, is built first and generally remains unchanged. The mobile robot then begins to work: the lidar collects data, the offline map is matched against the lidar data, and the result is analyzed to obtain the position and heading of the mobile robot.
The method has the following defects:
This approach suits a relatively static environment, one in which the states and positions of objects are essentially unchanged or change only slightly. Outdoor environments and complex indoor environments, however, are generally dynamic: the states and positions of objects change often. In a factory workshop, for example, material transport and storage, the placement of products and goods, worker movement, and equipment operation and maintenance all change the layout of the whole production unit, making the environment dynamically complex. A conventionally built original map cannot reflect such changes promptly and accurately, so the map data fail to match, or match with a high error rate against, the real environment data obtained by the sensor. This easily causes the robot's positioning to fail and can lead to safety accidents.
The above positioning technology therefore adapts poorly to the dynamic environment of a factory workshop, and a mobile robot positioning method with high accuracy that performs well in a highly dynamic environment is needed to solve these problems.
Disclosure of Invention
In view of the defects of the prior art, the invention provides a positioning method for a mobile robot in a dynamic environment. The mobile robot is fixedly connected with an inertial measurement unit, an odometer unit and a lidar unit, and the method comprises the following steps:
collecting first sensing data and second sensing data measured by the inertial measurement unit and the odometer unit respectively, performing preliminary processing, and fusing the processed first and second sensing data to obtain integrated sensing data, from which first pose information is derived;
collecting third sensing data measured by the lidar unit, and matching the third sensing data against an original map for positioning;
judging whether the third sensing data are valid and, if valid, solving for second pose information;
and fusing the first pose information and the second pose information to obtain third pose information of the mobile robot.
Optionally, the first sensing data comprise an acceleration, a first angular velocity and an azimuth angle, and the second sensing data comprise a linear velocity and a second angular velocity.
Optionally, the inertial measurement unit comprises an accelerometer and a gyroscope, wherein,
Before the inertial measurement unit takes measurements, the accelerometer is first kept stationary, and it is judged whether the current gravitational acceleration equals the two-norm of the acceleration value measured during the measurement; if so, the gyroscope is calibrated using the accelerometer calibration data.
Optionally, the processing by which the odometer unit measures the second sensing data comprises:
during the sampling period of the odometer unit, measuring the distance moved by the wheels of the mobile robot relative to the ground and the change of the wheel direction angle, and calculating the linear velocity and the second angular velocity therefrom.
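The odometer step above can be sketched as follows. This is a minimal sketch, not the patented implementation: the function name and the l/Δt form for the linear velocity are assumptions; the half-difference form for the angular velocity follows the patent's own formula ω2 = (vr − vl)/2.

```python
def odometer_velocities(l, v_r, v_l, dt):
    """Hypothetical sketch of the odometer-unit computation.

    l        : distance the wheels moved relative to the ground (m)
    v_r, v_l : right/left wheel speeds over the period (m/s)
    dt       : sampling period / movement duration (s)
    Returns (linear velocity v2, second angular velocity omega2,
    direction-angle change delta_beta).
    """
    v2 = l / dt                  # linear velocity: distance over the sampling period (assumed form)
    omega2 = (v_r - v_l) / 2.0   # second angular velocity, in the patent's half-difference form
    delta_beta = omega2 * dt     # resulting change of the wheel direction angle
    return v2, omega2, delta_beta

# e.g. a straight 0.5 m run in 0.5 s with equal wheel speeds:
# odometer_velocities(0.5, 1.0, 1.0, 0.5) -> (1.0, 0.0, 0.0)
```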
Optionally, judging whether the third sensing data are valid further comprises:
acquiring the coordinate position data in the original map at which the mobile robot lies at the current detection, and the coordinate position data actually measured by the lidar unit;
obtaining measurement quality data from the coordinate position in the original map at the current detection and the coordinate position actually measured by the lidar unit;
when the measurement quality data are smaller than a preset threshold, re-measuring to obtain new third sensing data;
and when the measurement quality data are greater than or equal to the preset threshold, adopting the currently measured third sensing data.
Optionally, the lidar unit measures with the rf2o (range-flow-based 2D odometry) method to obtain the third sensing data.
Optionally, the fusion of the first sensing data and the second sensing data, and the fusion of the first pose information and the second pose information, are performed using a Kalman filtering method and an extended Kalman filtering method.
Compared with the prior art, the invention has the following beneficial effects:
The positioning method of the mobile robot in a dynamic environment adapts to accurate positioning in a dynamic environment and avoids failed or inaccurate positioning caused by large environmental changes.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments with reference to the following drawings:
fig. 1 is a flowchart of a positioning method of a mobile robot in a dynamic environment according to the present invention.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the invention, but are not intended to limit it in any way. It should be noted that various changes and modifications can be made by those skilled in the art without departing from the spirit of the invention, all of which fall within its scope.
Before describing the present embodiment, the Kalman filtering method used in the embodiments of the present application is first introduced.
Kalman filtering is an algorithm that uses a linear system state equation to optimally estimate the state of a system from its input and output observation data. Because the observation data include the effects of noise and interference in the system, the optimal estimation can also be seen as a filtering process.
Data filtering is a data processing technique for removing noise and recovering the true data. Kalman filtering can estimate the state of a dynamic system from a series of measurements containing noise, provided the measurement variance is known. Because the method is easy to implement in computer programs and can update and process field data in real time, Kalman filtering is currently the most widely applied filtering method and is used to good effect in communication, navigation, guidance, control and other fields.
Extended Kalman filter (EKF): an extension of the standard Kalman filter to the nonlinear case; it is an efficient recursive filter.
The basic idea of the EKF is to linearize the nonlinear system with a Taylor series expansion and then filter within the Kalman filtering framework; it is therefore a suboptimal filter.
As shown in fig. 1, before the positioning method of the mobile robot in a dynamic environment according to the present invention is implemented, an inertial measurement unit, an odometer unit and a lidar unit must be fixedly connected to the mobile robot, wherein the inertial measurement unit consists of an accelerometer, a magnetometer and a gyroscope;
in addition, the error of the data measured by the inertial measurement unit gradually grows over successive measurements and affects positioning accuracy, so the error must be reduced before the next measurement. The method for reducing the error of the inertial measurement unit's data is as follows:
Before the inertial measurement unit takes measurements, the accelerometer is first kept stationary and corrected by the dodecahedron (twelve-position) calibration method. Step one: fix the inertial measurement unit on a cube with three orthogonal axes. Step two: place one face of the cube on the calibration surface, let it rest for a period of time and collect data, then rotate the cube by 180 degrees, let it rest again and collect data. Step three: average the two sets of collected data. Step four: repeat step two to measure the remaining faces. When the calibration surface is parallel to the horizontal plane, the actual value of the accelerometer is [g 0 0]^T, where g is the current gravitational acceleration. When there is an included angle α between the calibration surface and the horizontal plane, the value measured by the accelerometer is [-g cos α  g sin α  0]^T; since α is generally small, this may be approximated as [-g  gα  0]^T, and the measured value after rotation as [-g  -gα  0]^T. The average of the two measurements then corresponds to the actual value [-g  0  0]^T, so the influence of the included angle α is eliminated. It is then judged whether the current gravitational acceleration equals the two-norm of the measured acceleration value, and if so, the gyroscope is calibrated using the accelerometer calibration data.
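The averaging step of the calibration above, and the two-norm gate on the gravitational acceleration, can be sketched as follows. The function names and the tolerance value are illustrative assumptions; only the averaging and the two-norm comparison come from the text.

```python
import math

def average_opposed_readings(reading_0, reading_180):
    """Average two accelerometer readings taken on the same face, 180 degrees apart.
    With a small tilt alpha the readings are roughly [-g, g*alpha, 0] and
    [-g, -g*alpha, 0]; their mean recovers [-g, 0, 0], cancelling the tilt term."""
    return [(a + b) / 2.0 for a, b in zip(reading_0, reading_180)]

def gravity_norm_ok(accel, g, tol=1e-3):
    """Gate for gyroscope calibration: the two-norm of the measured
    acceleration must equal the current gravitational acceleration."""
    return abs(math.sqrt(sum(a * a for a in accel)) - g) <= tol

g, alpha = 9.81, 0.01
mean = average_opposed_readings([-g, g * alpha, 0.0], [-g, -g * alpha, 0.0])
# mean -> [-9.81, 0.0, 0.0]: the alpha-dependent term cancels exactly
```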
After the calibration of the inertial measurement unit is completed, positioning begins with the following steps:
the method comprises the steps of respectively collecting first sensing data and second sensing data obtained by measurement of an inertia measurement unit and a speedometer unit, conducting primary processing, conducting fusion processing on the processed first sensing data and the processed second sensing data to obtain comprehensive sensing data, and obtaining first attitude information according to the comprehensive sensing data.
In a specific application, the first sensing data measured by the inertial measurement unit generally comprise an acceleration, a first angular velocity and an azimuth angle; the second sensing data measured by the odometer unit generally comprise a linear velocity and a second angular velocity; and the third sensing data generally comprise obstacles, landmark points, coordinates and similar data in the dynamic scene. The measurement process of the odometer unit is as follows:
During the sampling period of the odometer unit, the pulses of the photoelectric encoder change. The odometer unit then starts to work: the cart moves for a period of time, yielding the distance l moved by the cart's wheels relative to the ground, the change Δβ of the wheel direction angle, and the speeds vr and vl of the right and left small wheels. The odometer unit calculates the linear velocity and the second angular velocity as:
v2 = l / Δt
ω2 = (vr - vl) / 2
Δβ = ω2 · Δt
where v2 is the linear velocity, ω2 is the second angular velocity, and Δt is the movement duration of the cart;
In this embodiment, the fusion of the first sensing data and the second sensing data is realized by the Kalman filtering method and the extended Kalman filtering method. After fusion, a new linear velocity and angular velocity are generally obtained, and the first pose information is obtained by combining the integrated sensing data with a dead-reckoning algorithm, i.e., it is derived from the new linear velocity and angular velocity. The first pose information comprises position coordinate information and direction information, the latter generally understood as a direction angle.
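For intuition, a scalar Kalman-style measurement update fusing one IMU reading with one odometer reading of the same quantity is sketched below. The single-variable form and the variance values are illustrative assumptions; the embodiment's filter operates on the full sensing data.

```python
def kalman_fuse(x_imu, var_imu, x_odo, var_odo):
    """One measurement-update step fusing two noisy estimates of the same
    scalar, e.g. the angular velocity from the IMU and from the odometer.
    Returns the fused estimate and its (reduced) variance."""
    k = var_imu / (var_imu + var_odo)  # Kalman gain: trust the less noisy source more
    x = x_imu + k * (x_odo - x_imu)    # fused estimate
    var = (1.0 - k) * var_imu          # fused variance is smaller than either input
    return x, var

# equal variances -> the fused value is the midpoint:
# kalman_fuse(1.0, 1.0, 2.0, 1.0) -> (1.5, 0.5)
```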
Third sensing data measured by the lidar unit are collected and matched against the original map for positioning.
In practical application, when selecting a lidar unit, performance indices such as visible range, scanning frequency, ranging resolution and angular resolution are generally considered so that the measurement requirements are met. The lidar unit generally measures with the rf2o method, which formulates a range-flow constraint equation for each scan point according to the sensor velocity and minimizes a robust function of the resulting geometric constraints to obtain a motion estimate. The original map is an offline map collected from the site environment before measurement, so its data are generally fixed. Matching and positioning the third sensing data against the original map can be understood as matching the feature points in the third sensing data with the feature points in the original map: when the coincidence rate of the two sets of feature points exceeds sixty percent, the lidar unit is considered to be at the corresponding coordinate position in the original map, and the position of the lidar is thereby determined.
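A minimal sketch of the coincidence-rate test just described: only the sixty-percent threshold comes from the text; the 0.05 m tolerance and the function names are assumptions.

```python
def match_coincidence(scan_points, map_points, tol=0.05):
    """Fraction of scan feature points that coincide, within tol metres,
    with some feature point of the original (offline) map."""
    def near(p, q):
        return abs(p[0] - q[0]) <= tol and abs(p[1] - q[1]) <= tol
    hits = sum(1 for p in scan_points if any(near(p, q) for q in map_points))
    return hits / len(scan_points)

def is_matched(scan_points, map_points, threshold=0.60):
    """True when the coincidence rate exceeds the sixty-percent threshold."""
    return match_coincidence(scan_points, map_points) > threshold
```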
In a static environment the lidar position obtained in this way is accurate, but in a dynamic environment it is necessary to judge whether the third sensing data are valid.
Whether the third sensing data are valid is judged, and the second pose information is solved when they are valid;
The third sensing data obtained by lidar measurement generally comprise data measured at a plurality of moments. After these data are obtained, the specific process of judging whether the third sensing data are valid is as follows:
acquiring the coordinate position data in the original map at which the mobile robot lies at the current detection, and the coordinate position data actually measured by the lidar unit;
and obtaining the measurement quality data from the coordinate position in the original map at the current detection and the coordinate position actually measured by the lidar unit.
It can be understood that the coordinate position in the original map at the current detection is obtained by matching and positioning, and generally differs by some error from the coordinate position actually measured by the lidar unit.
When the measurement quality data are smaller than the preset threshold, re-measure to obtain new third sensing data;
when the measurement quality data are greater than or equal to the preset threshold, adopt the currently measured third sensing data;
wherein the measurement quality data are calculated from the statistic
L ~ N(μ, σ²)
where x and y denote the x-axis and y-axis coordinates of the robot at the current detection, obtained by matching the third sensing data against the original map; a and b denote the x-axis and y-axis coordinates of the robot obtained by lidar measurement; and σ denotes the standard deviation over several measurements, i.e., the standard deviation of the coordinate position of a given point measured several times;
The rate of change ΔL of L over time is then obtained, for example by differentiation;
ΔL may be understood as the measurement quality data. When ΔL is smaller than the preset threshold, it may be understood that a large change has occurred in the surrounding environment; in this case the matching of the lidar data against the original map is inaccurate, and the third sensing data are re-measured;
when the measurement quality data are greater than or equal to the preset threshold, the currently measured third sensing data are adopted. In this case the change in the surrounding environment is small and the lidar data match the original map accurately, so the position information obtained by matching the third sensing data against the original map is reliable.
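The validity gate above can be sketched as follows. This is a minimal sketch under stated assumptions: the finite-difference derivative and the function interface are illustrative, and the exact form of L is left open by the text.

```python
def quality_gate(L_values, dt, threshold):
    """Decide whether the current scan is valid from the rate of change of
    the match statistic L over time. A delta_L below the threshold means the
    environment changed too much and the scan must be re-measured; at or
    above the threshold the currently measured third sensing data are adopted."""
    delta_L = (L_values[-1] - L_values[-2]) / dt  # simple derivative estimate of L
    return ("accept" if delta_L >= threshold else "remeasure"), delta_L
```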
The first pose information and the second pose information are fused to obtain the third pose information of the mobile robot.
In practical application, the fusion of the first pose information and the second pose information is likewise performed with the Kalman filtering method and the extended Kalman filtering method; the fusion filters out interference information, so accurate third pose information is obtained.
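As a stand-in for that final fusion step, a per-axis Kalman-style blend of the two pose estimates is sketched below. The per-axis independence and the variance values are assumptions; a full EKF would also propagate cross-covariances and handle the nonlinear heading.

```python
def fuse_pose(pose_a, var_a, pose_b, var_b):
    """Fuse two pose estimates (x, y, theta), each with per-axis variances,
    by an independent scalar Kalman update on every axis."""
    fused = []
    for a, va, b, vb in zip(pose_a, var_a, pose_b, var_b):
        k = va / (va + vb)             # gain per axis
        fused.append(a + k * (b - a))  # blended coordinate
    return tuple(fused)

# equal confidence in both estimates -> the midpoint pose:
# fuse_pose((0.0, 0.0, 0.0), (1.0, 1.0, 1.0),
#           (2.0, 2.0, 2.0), (1.0, 1.0, 1.0)) -> (1.0, 1.0, 1.0)
```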
The foregoing description has described specific embodiments of the present invention. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes or modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention. Where no conflict arises, the embodiments and the features of the embodiments of the present application may be combined with each other arbitrarily.
Claims (7)
1. A positioning method of a mobile robot in a dynamic environment, characterized in that the mobile robot is fixedly connected with an inertial measurement unit, an odometer unit and a lidar unit, and the method comprises the following steps:
collecting first sensing data and second sensing data measured by the inertial measurement unit and the odometer unit respectively, performing preliminary processing, and fusing the processed first and second sensing data to obtain integrated sensing data, from which first pose information is derived;
collecting third sensing data measured by the lidar unit, and matching the third sensing data against an original map for positioning;
judging whether the third sensing data are valid and, if valid, solving for second pose information;
and fusing the first pose information and the second pose information to obtain third pose information of the mobile robot.
2. The method of claim 1, wherein the first sensing data comprise an acceleration, a first angular velocity and an azimuth angle, and the second sensing data comprise a linear velocity and a second angular velocity.
3. The positioning method of a mobile robot in a dynamic environment according to claim 2, wherein the inertial measurement unit comprises an accelerometer and a gyroscope, and wherein, before the inertial measurement unit takes measurements, the accelerometer is first kept stationary, it is judged whether the current gravitational acceleration equals the two-norm of the acceleration value measured during the measurement, and, if so, the gyroscope is calibrated using the accelerometer calibration data.
4. The method of claim 2, wherein the processing by which the odometer unit measures the second sensing data comprises:
during the sampling period of the odometer unit, measuring the distance moved by the wheels of the mobile robot relative to the ground and the change of the wheel direction angle, and calculating the linear velocity and the second angular velocity therefrom.
5. The method of claim 1, wherein judging whether the third sensing data are valid further comprises:
acquiring the coordinate position data in the original map at which the mobile robot lies at the current detection, and the coordinate position data actually measured by the lidar unit;
obtaining measurement quality data from the coordinate position in the original map at the current detection and the coordinate position actually measured by the lidar unit;
when the measurement quality data are smaller than a preset threshold, re-measuring to obtain new third sensing data;
and when the measurement quality data are greater than or equal to the preset threshold, adopting the currently measured third sensing data.
6. The method according to claim 3, wherein the lidar unit measures with the rf2o method to obtain the third sensing data.
7. The method of claim 5, wherein the fusion of the first and second sensing data and the fusion of the first and second pose information are performed by the Kalman filtering method and the extended Kalman filtering method.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210231443.6A CN114562994A (en) | 2022-03-09 | 2022-03-09 | Positioning method of mobile robot in dynamic environment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114562994A true CN114562994A (en) | 2022-05-31 |
Family
ID=81718221
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210231443.6A Pending CN114562994A (en) | 2022-03-09 | 2022-03-09 | Positioning method of mobile robot in dynamic environment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114562994A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116224349A (en) * | 2022-12-12 | 2023-06-06 | 珠海创智科技有限公司 | Robot positioning method, system and electronic device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109900280A (en) * | 2019-03-27 | 2019-06-18 | 浙江大学 | A kind of livestock and poultry information Perception robot and map constructing method based on independent navigation |
CN111089585A (en) * | 2019-12-30 | 2020-05-01 | 哈尔滨理工大学 | Mapping and positioning method based on sensor information fusion |
CN112066982A (en) * | 2020-09-07 | 2020-12-11 | 成都睿芯行科技有限公司 | Industrial mobile robot positioning method in high dynamic environment |
CN113252033A (en) * | 2021-06-29 | 2021-08-13 | 长沙海格北斗信息技术有限公司 | Positioning method, positioning system and robot based on multi-sensor fusion |
Non-Patent Citations (1)
Title |
---|
Wu Zongwei; Yao Minli; Ma Hongguang; Jia Weimin; Tian Fanghao: "Low-cost multi-sensor fusion attitude estimation method for mobile satellite communications", Journal of Xi'an Jiaotong University, no. 12 *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||