CN115046543A - Multi-sensor-based integrated navigation method and system - Google Patents
- Publication number
- CN115046543A (Application CN202110253038.XA)
- Authority
- CN
- China
- Prior art keywords
- data
- information
- imu
- navigation
- pose
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
Abstract
The invention discloses a multi-sensor-based integrated navigation method and system. The method comprises the following steps: acquiring data of an IMU and a wheel encoder for initialization, obtaining the linear velocity and angular velocity of the wheel encoder from the internal parameters of the odometer, and integrating them with the IMU data; obtaining, through a depth camera, the six-degree-of-freedom pose of the robot at any position using a visual SLAM algorithm, acquiring image frames and extracting ORB descriptors, initializing the integrated IMU data together with the image pose information, taking the integrated IMU pose as the prior value and the visual-SLAM pose as the observed value, and performing time synchronization with the image frames; then performing optimization with a Kalman filter; and finally converting the acquired lidar data, filtering the converted lidar data together with the preliminarily filtered pose data using an extended Kalman filter, and outputting the final optimal navigation information. A navigation system for a mobile robot is further provided according to the integrated navigation method. The multi-sensor-based integrated navigation method and system provided by the invention make the sensors complement one another, meet the navigation requirements of intelligent robots, maintain stable and reliable navigation, and improve the accuracy, anti-interference capability, and application range of the navigation system.
Description
Technical Field
The invention relates to the technical field of multi-sensor integrated navigation, in particular to an integrated navigation method and system based on multiple sensors.
Background
With the rapid economic development of recent years, technologies such as the Internet, artificial intelligence, and cloud computing are flourishing, and the demands on the robot industry keep rising. Robot development depends on high-precision navigation and positioning, which a single sensor cannot provide, so multi-sensor integrated navigation has become a research hotspot at the present stage.
For a single-line lidar, the laser points are not dense enough, so the constructed map often drifts, leading to inaccurate positioning; this is caused by accumulated error. Meanwhile, loop-closure detection has always been a major difficulty in laser SLAM: the acquired laser data are two-dimensional point clouds with no distinctive features and great mutual similarity, so loop-closure detection based on laser data alone often performs poorly.
Although a visual sensor performs well in most scenes with rich texture, it basically cannot work in feature-poor scenes such as glass or white walls. In addition, the working frequency of a visual camera is limited, so the visual update lags during high-speed driving.
Therefore, how to combine multiple sensors to achieve positioning with higher precision and better robustness is a problem that needs to be solved urgently by those skilled in the art.
Disclosure of Invention
In view of the above, and based on the current state of robot navigation, the invention provides a multi-sensor-based integrated navigation method and system that comprehensively combine the advantages of an IMU, an encoder, a depth camera, and a lidar to achieve complementarity and realize high-precision navigation and positioning in indoor environments.
In order to achieve the above object, the present invention adopts the following technical solutions.
A combined navigation method based on multiple sensors comprises the following specific steps.
Data of the IMU and the wheel encoder are acquired for initialization, and the linear velocity and angular velocity of the wheel encoder are obtained from the internal parameters of the odometer.
When the robot moves in a plane, between two consecutive odometry readings only the yaw angle (the rotation about the z-axis) changes, and the other angular changes are 0. In the odometer coordinate frame the rotation change between times k−1 and k is Δθ(k) = ω(k)·Δt, and the angle and position are obtained by integrating the angular velocity and linear velocity of the odometer respectively:
θ(k) = θ(k−1) + ω(k)·Δt
x(k) = x(k−1) + v(k)·cos θ(k)·Δt
y(k) = y(k−1) + v(k)·sin θ(k)·Δt
the angle data of the data obtained by the IMU is relative data, and the angle data obtained by the calculation of the wheel type encoder is replaced by the angle data.
The pose of the robot at any position is acquired by a depth camera using a visual SLAM algorithm; image frames are acquired and ORB descriptors extracted; the integrated IMU data and the image pose information are processed and time-synchronized with the image frames; next, optimization is carried out with a Kalman filter.
Solving the state transition matrix Φ and the noise covariance matrix Q of the system gives the prediction step of the Kalman filter:
x̂(k|k−1) = Φ(k) x̂(k−1|k−1)
P(k|k−1) = Φ(k) P(k−1|k−1) Φ(k)ᵀ + Q(k)
The visual pose estimate, i.e. the coordinate transformation from the camera coordinate system to the world coordinate system, is taken as the observed value z of the filter; combining it with the observation matrix of the state equation yields the expression of the observation equation H:
z(k) = H(k) x(k) + v(k)
From the observation matrix H and the observation noise covariance matrix R, the update step of the Kalman filter is obtained:
K(k) = P(k|k−1) Hᵀ (H P(k|k−1) Hᵀ + R)⁻¹
x̂(k|k) = x̂(k|k−1) + K(k) (z(k) − H x̂(k|k−1))
P(k|k) = (I − K(k) H) P(k|k−1)
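The prediction and update steps above can be sketched with NumPy. The matrix names follow the text (Φ, Q, H, R); the toy one-dimensional example is an illustrative assumption, not the patent's actual state vector:

```python
import numpy as np

def kf_predict(x, P, Phi, Q):
    """Prediction: propagate state and covariance with the state
    transition matrix Phi and process noise covariance Q."""
    x_pred = Phi @ x
    P_pred = Phi @ P @ Phi.T + Q
    return x_pred, P_pred

def kf_update(x_pred, P_pred, z, H, R):
    """Update: fuse the visual-SLAM pose observation z via the gain K."""
    S = H @ P_pred @ H.T + R            # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)
    x = x_pred + K @ (z - H @ x_pred)
    P = (np.eye(len(x)) - K @ H) @ P_pred
    return x, P

# Toy 1-D example: prior at 0, identity dynamics, observe 1.0
x, P = kf_predict(np.array([0.0]), np.eye(1), np.eye(1), np.eye(1) * 0.01)
x, P = kf_update(x, P, np.array([1.0]), np.eye(1), np.eye(1) * 0.01)
```

With equal process and observation noise, the update pulls the estimate most of the way toward the observation and shrinks the covariance accordingly.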
collecting laser radar data information with original data formatWherein, in the process,rfor detecting the distance information from the point to the mobile robot,to measure the declination angle. Collection ofRepresenting a set of laser scan data, N being the number of scan data for one frame,is as followsiThe distance of the obstacle measured in each direction is maximized if no laser data information is returned in that direction.
The position points scanned by the lidar are converted from polar coordinates to the sensor coordinate frame:
x(i) = r(i) cos φ(i), y(i) = r(i) sin φ(i)
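The polar-to-Cartesian conversion of one scan frame, including the maximum-range substitution for missing returns, can be sketched as follows; the sensor range and the function name are assumptions for illustration:

```python
import math

MAX_RANGE = 12.0  # assumed sensor maximum range in metres

def scan_to_cartesian(ranges, angles):
    """Convert one lidar frame from polar (r_i, phi_i) to sensor-frame
    Cartesian (x_i, y_i); missing returns are clamped to MAX_RANGE."""
    points = []
    for r, phi in zip(ranges, angles):
        if r is None or r <= 0.0:      # no return in this direction
            r = MAX_RANGE
        points.append((r * math.cos(phi), r * math.sin(phi)))
    return points

pts = scan_to_cartesian([1.0, None], [0.0, math.pi / 2])
```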
Further, an extended Kalman filter is used to perform the final filtering of the optimized pose information and the converted lidar data; the specific steps are detailed in Embodiment 1.
In a second aspect, an embodiment of the present invention provides a multi-sensor-based integrated navigation system configured to execute any of the multi-sensor-based integrated navigation methods in the foregoing method embodiments. It comprises an IMU and a wheel encoder as internal sensors and a depth camera and a single-line lidar as external sensors, which are mounted on the robot, jointly calibrated, and connected to a data processing module for data initialization.
When the data are transmitted to the data conversion and fusion module, the lidar data are converted and the pose information output by the IMU, the wheel encoder, and the depth camera is optimized by filtering; this module comprises a Kalman filter and an extended Kalman filter, whose processing is detailed in Embodiment 1.
Further, the pose information after filtering optimization is input to a SLAM module for simultaneous localization and mapping, and is also input to a positioning and navigation module.
In some embodiments, the final navigation information output includes the position, velocity, and pose information of the robot.
Further, the SLAM module outputs the constructed two-dimensional grid map to a positioning and navigation module to realize path planning; and uploaded to the Rviz interactive interface.
Furthermore, according to the target-point coordinates issued by the positioning and navigation module, the motor control module controls the robot chassis to move along the planned route and uploads real-time velocity and pose information to the Rviz interactive interface.
In addition, unless otherwise specified, the technical solutions of the present invention may be implemented by conventional means in the art.
Drawings
To more clearly illustrate the embodiments of the present invention or the solutions in the prior art, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description are merely examples of the invention, and that those skilled in the art may derive other drawings from them without inventive effort.
Fig. 1 is a schematic flow chart of a multi-sensor-based integrated navigation method provided by the invention.
FIG. 2 is a schematic diagram of a combined navigation system based on multiple sensors according to the present invention.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example 1:
Fig. 1 shows a multi-sensor-based integrated navigation method according to an embodiment of the present invention, which includes the following specific steps.
Step 1: Data of the IMU and the wheel encoder are collected for initialization. The angle data obtained from the IMU are relative measurements, and the linear velocity v and angular velocity ω of the wheel encoder are obtained from the internal parameters of the odometer:
v = (r_l·ω_l + r_r·ω_r)/2
ω = (r_r·ω_r − r_l·ω_l)/a
where ω_l and ω_r are the rotational speeds of the left and right wheels respectively, r_l and r_r are the radii of the left and right wheels, and a is the distance between the left and right wheels.
When the robot moves in a plane, between two consecutive odometry readings only the yaw angle (the rotation about the z-axis) changes, and the other angular changes are 0. In the odometer coordinate frame the rotation change between times k−1 and k is Δθ(k) = ω(k)·Δt, and the angle and position are obtained by integrating the angular velocity and linear velocity of the odometer respectively:
θ(k) = θ(k−1) + ω(k)·Δt
x(k) = x(k−1) + v(k)·cos θ(k)·Δt
y(k) = y(k−1) + v(k)·sin θ(k)·Δt
where the process noise term is a 3 × 1-dimensional Gaussian noise vector.
The angle data obtained by the IMU are relative measurements, and this angle is used to replace the angle computed by the wheel encoder.
Step 2: The pose of the robot at any position is acquired by a depth camera using a visual SLAM algorithm; image frames are acquired and ORB descriptors extracted; the integrated IMU data and the image pose information are initialized and time-synchronized with the image frames; next, optimization is carried out with a Kalman filter.
The pose information of the integrated IMU is adopted as the prior value and the pose information obtained by visual SLAM as the observed value. Specifically, the state transition matrix Φ is solved by discretizing the system matrix F, i.e., expanding the Taylor series and truncating at the second-order term:
Φ ≈ I + F·Δt + (F·Δt)²/2
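This truncated-Taylor discretization of the matrix exponential can be sketched as follows; the constant-velocity system matrix used in the example is an illustrative assumption, not the patent's actual model:

```python
import numpy as np

def discretize(F, dt, order=2):
    """Phi ≈ I + F·dt + (F·dt)²/2! + ... : truncated Taylor series of
    the matrix exponential expm(F·dt)."""
    A = F * dt
    Phi = np.eye(F.shape[0])
    term = np.eye(F.shape[0])
    for k in range(1, order + 1):
        term = term @ A / k   # k-th Taylor term A^k / k!
        Phi = Phi + term
    return Phi

# Constant-velocity toy model: d/dt [p, v] = [[0, 1], [0, 0]] @ [p, v]
F = np.array([[0.0, 1.0], [0.0, 0.0]])
Phi = discretize(F, 0.1)
```

Because F is nilpotent here, the second-order truncation is exact: Phi maps position to position plus velocity times the step.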
system for solving state transition matrixNoise covariance matrixAnd obtaining a prediction process of Kalman filtering:
using visual attitude estimationRepresenting the coordinate transformation from the camera coordinate system to the world coordinate system, and simultaneously representing the observed value z of the filter, and combining the state equationThe observation matrix in (1) obtains an expression of an observation equation H:
from an observation matrix and an observation noise variance matrixObtaining an updating process of Kalman filtering:
Step 3: Lidar data are collected in the raw format (r, φ), where r is the distance from the detected point to the mobile robot and φ is the measurement deflection angle. The set S = {(r(i), φ(i)), i = 1, …, N} represents one frame of laser scan data, N is the number of measurements in one frame, and r(i) is the obstacle distance measured in the i-th direction; if no laser return is received in that direction, r(i) is set to the maximum range.
The position points scanned by the lidar are converted from polar coordinates to the sensor coordinate frame:
x(i) = r(i) cos φ(i), y(i) = r(i) sin φ(i)
Further, an extended Kalman filter is used to perform the final filtering of the optimized pose information and the converted lidar data. The specific steps are:
x̂(k|k−1) = f(x̂(k−1|k−1)), P(k|k−1) = F P(k−1|k−1) Fᵀ + Q
K(k) = P(k|k−1) Hᵀ (H P(k|k−1) Hᵀ + R)⁻¹
x̂(k|k) = x̂(k|k−1) + K(k) (z(k) − h(x̂(k|k−1))), P(k|k) = (I − K(k) H) P(k|k−1)
where f is the system model, h is the observation model, F is the first-order Taylor (Jacobian) expansion of f, and H is the first-order Taylor (Jacobian) expansion of h.
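A minimal extended-Kalman step under these definitions might look like the following sketch; the models f and h and their Jacobians are placeholders, not the patent's actual system and observation models:

```python
import numpy as np

def ekf_step(x, P, z, f, F_jac, h, H_jac, Q, R):
    """One EKF cycle: nonlinear models f/h with their first-order
    Taylor (Jacobian) linearizations F/H, as defined in the text."""
    # Predict through the nonlinear system model
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q
    # Update with the nonlinear observation model
    H = H_jac(x_pred)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy scalar example with identity models (reduces to a linear KF)
f = h = lambda x: x
jac = lambda x: np.eye(1)
x, P = ekf_step(np.array([0.0]), np.eye(1), np.array([2.0]),
                f, jac, h, jac, np.eye(1) * 0.01, np.eye(1) * 0.01)
```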
The navigation method provided by the invention fuses wheel-encoder and IMU data to achieve more accurate trajectory tracking; the ORB descriptor is then used to synchronize the image pose information with the integrated IMU data, and a Kalman filter corrects the pose; finally, the acquired lidar data are converted to the Cartesian frame, and extended Kalman filtering fuses the optimized IMU pose information with the lidar data to output the final optimal pose for positioning and navigation.
Example 2:
Fig. 2 shows a multi-sensor-based integrated navigation system according to Embodiment 2 of the present invention, configured to execute any of the multi-sensor-based integrated navigation methods in the foregoing method embodiments. It comprises an IMU and a wheel encoder as internal sensors and a depth camera and a single-line lidar as external sensors, which are mounted on the robot, jointly calibrated, and connected to a data processing module for data initialization.
When the data are transmitted to the data conversion and fusion module, the lidar data are converted and the pose information output by the IMU, the wheel encoder, and the depth camera is optimized by filtering; this module comprises a Kalman filter and an extended Kalman filter, whose processing is detailed in Embodiment 1.
Further, the pose information after filtering optimization is input to a SLAM module for simultaneous localization and mapping, and is also input to a positioning and navigation module.
In some embodiments, the final navigation information output includes the position, velocity, and pose information of the robot.
Further, the SLAM module outputs the constructed two-dimensional grid map to a positioning and navigation module to realize path planning; and uploaded to the Rviz interactive interface.
Furthermore, according to the target-point coordinates issued by the positioning and navigation module, the motor control module controls the robot chassis to move along the planned route and uploads real-time velocity and pose information to the Rviz interactive interface.
The combined navigation system based on multiple sensors of embodiment 2 may be used to execute the combined navigation method based on multiple sensors of embodiment 1 of the present invention, and accordingly achieve the technical effects achieved by the combined navigation method based on multiple sensors of embodiment 1 of the present invention, which are not described herein again.
The above-described apparatus embodiments are merely illustrative. The units described as separate parts may or may not be physically separate, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement this without inventive effort.
The emphasis of each embodiment in the present specification is on the difference from the other embodiments, and the same and similar parts among the various embodiments may be referred to each other. The device disclosed by the embodiment corresponds to the method disclosed by the embodiment, so that the description is simple, and the relevant points can be referred to the method part for description.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, and not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.
Claims (8)
1. A combined navigation method based on multiple sensors is characterized by comprising the following specific steps:
step 1: initializing IMU and wheel type encoder data, obtaining linear velocity and angular velocity of the wheel type encoder according to internal parameters of the odometer, and integrating to obtain position and angle information; wherein the angular information calculated using the IMU replaces the angular information obtained by the wheel encoder;
step 2: acquiring image frames by a visual SLAM algorithm and extracting ORB descriptors, initializing the integrated IMU data and the image pose information, performing time synchronization with the image frames, and then optimizing the data by Kalman filtering; the prediction process of the Kalman filtering is:
x̂(k|k−1) = Φ(k) x̂(k−1|k−1), P(k|k−1) = Φ(k) P(k−1|k−1) Φ(k)ᵀ + Q(k)
and the update process of the Kalman filtering, obtained from the observation matrix and the observation noise covariance matrix, is:
K(k) = P(k|k−1) Hᵀ (H P(k|k−1) Hᵀ + R)⁻¹, x̂(k|k) = x̂(k|k−1) + K(k) (z(k) − H x̂(k|k−1)), P(k|k) = (I − K(k) H) P(k|k−1);
and step 3: collecting lidar data, converting the raw-format data (r, φ) into data (x(i), y(i)) in the corresponding rectangular coordinate system, and further performing final filtering on the optimized pose and the converted lidar data using extended Kalman filtering; the specific steps are as follows.
4. the multi-sensor based integrated navigation method of claim 1, wherein the angular information of the wheel encoders is replaced with angular data obtained by the IMU.
5. A multi-sensor-based integrated navigation system for performing the multi-sensor-based integrated navigation method of claim 1, wherein
the system comprises a lidar (1), a depth camera (2), an IMU (3), and a wheel encoder (4), the data output ends of which are connected to a data processing module; a data conversion and fusion module acquires and filters the multi-source sensor data and then sends the fused pose to a SLAM mapping module; the SLAM module further uploads the created grid map to the Rviz interactive interface and issues it to a positioning and navigation module; and a motor control module moves the robot chassis according to the received navigation information.
6. The multi-sensor-based integrated navigation system according to claim 5, wherein the IMU (3) is a six-axis IMU comprising a gyroscope and an accelerometer, the outputs of which are connected to the input of the data processing module.
7. The integrated multi-sensor-based navigation system of claim 5, wherein the data transformation and fusion module includes a Kalman filter and an extended Kalman filter.
8. The multi-sensor based integrated navigation system of claim 5, wherein the output final positional navigation information includes robot pose information and velocity information.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110253038.XA CN115046543A (en) | 2021-03-09 | 2021-03-09 | Multi-sensor-based integrated navigation method and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115046543A true CN115046543A (en) | 2022-09-13 |
Family
ID=83156622
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110253038.XA Pending CN115046543A (en) | 2021-03-09 | 2021-03-09 | Multi-sensor-based integrated navigation method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115046543A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117490705A (en) * | 2023-12-27 | 2024-02-02 | 合众新能源汽车股份有限公司 | Vehicle navigation positioning method, system, device and computer readable medium |
CN117490705B (en) * | 2023-12-27 | 2024-03-22 | 合众新能源汽车股份有限公司 | Vehicle navigation positioning method, system, device and computer readable medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||