WO2021016749A1 - Positioning method based on multi-data fusion, movable platform and storage medium - Google Patents
Positioning method based on multi-data fusion, movable platform and storage medium
- Publication number
- WO2021016749A1 (PCT/CN2019/097957)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- positioning
- gnss
- verification
- output
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/48—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
- G01S19/49—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an inertial position system, e.g. loosely-coupled
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/18—Stabilised platforms, e.g. by gyroscope
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
- G01C25/005—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass initial alignment, calibration or starting-up of inertial devices
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/40—Correcting position, velocity or attitude
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/45—Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
- G01S19/47—Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
Definitions
- The present invention relates to the technical field of positioning, and in particular to a positioning method based on multi-data fusion, a movable platform, and a storage medium.
- Positioning technology can provide position and other information for the movable platform, which is a prerequisite for path planning, motion control and autonomous decision-making of the movable platform.
- A relatively mature approach combines an inertial measurement unit (IMU) with a global navigation satellite system (GNSS) to achieve real-time positioning.
- However, GNSS suffers frequent signal loss in complex environments such as urban canyons, tunnels, or dense jungle, leaving the movable platform unable to position itself accurately using GNSS.
- In that case the movable platform can only rely on the IMU for positioning, but the IMU's positioning accuracy is low and cannot meet the platform's precise positioning requirements.
- An embodiment of the invention discloses a positioning method based on multi-data fusion, a movable platform, and a storage medium, which can position the movable platform in different environments based on different data fusion modes, effectively ensuring positioning accuracy.
- an embodiment of the present invention discloses a positioning method based on multiple data fusion, which is applied to a movable platform, and the method includes:
- GNSS: global satellite navigation system
- SLAM: simultaneous localization and mapping (real-time positioning and mapping)
- an embodiment of the present invention discloses a movable platform, including: a memory and a processor,
- the memory is used to store program instructions
- the processor is configured to execute program instructions stored in the memory, and when the program instructions are executed, the processor is configured to:
- The embodiment of the present invention also discloses a computer-readable storage medium in which a computer program is stored; when the computer program is executed by a processor, the steps of the positioning method based on multi-data fusion described above are implemented.
- In the embodiments of the present invention, the GNSS data, inertial navigation system data, driving state data, and at least one type of SLAM sensor data of the movable platform are mutually verified to obtain verified data, and a target data fusion mode is determined according to the verified data.
- Under the instruction of the target data fusion mode, the GNSS data, inertial navigation system data, driving state data, and at least one type of SLAM sensor data are fused to obtain target information, and the position of the movable platform is determined according to the target information.
- In this way, different data fusion modes position the movable platform in different environments, effectively ensuring positioning accuracy.
- FIG. 1 is a schematic structural diagram of a movable platform disclosed in an embodiment of the present invention.
- FIG. 2 is a schematic flowchart of a positioning method based on multi-data fusion disclosed in an embodiment of the present invention.
- FIG. 3 is a schematic flowchart of another positioning method based on multi-data fusion disclosed in an embodiment of the present invention.
- FIG. 4 is a schematic diagram of the conversion relationship between filtering modes disclosed in an embodiment of the present invention.
- FIG. 5 is a schematic diagram of the conversion relationship between sub-modes in the position observation mode disclosed in an embodiment of the present invention.
- FIG. 6 is a schematic structural diagram of another movable platform disclosed in an embodiment of the present invention.
- FIG. 1 is a schematic structural diagram of a movable platform provided by an embodiment of the present invention.
- The movable platform is configured with a global satellite navigation system GNSS 101, an inertial navigation system INS or strapdown inertial navigation system SINS 102, and a sensor module 103 for collecting driving state data; the movable platform is also configured with at least one positioning module 104 for acquiring simultaneous localization and mapping (SLAM) sensor data.
- The inertial navigation system INS or strapdown inertial navigation system SINS 102 may include an inertial measurement module IMU, which may contain gyroscopes, accelerometers, and the like; the inertial measurement module may be a low-precision micro-electromechanical system (MEMS) IMU, or a fiber-optic or laser IMU.
- The positioning module 104 may be carried on the body 106 of the movable platform through a gimbal 105, which can rotate the positioning module 104 about one or more of the yaw, roll, and pitch axes.
- Alternatively, the positioning module 104 may be carried directly on the body 106 of the movable platform.
- The positioning module 104 may be fully fixed to the gimbal 105, or partially fixed to the gimbal 105 with the remaining part carried directly on the body 106 of the movable platform.
- There may be one or more GNSS systems 101, one or more inertial navigation systems, one or more sensor modules 103 for collecting driving state data, and one or more positioning modules 104 for acquiring SLAM sensor data.
- the positioning module 104 used to acquire SLAM sensor data may be a positioning module based on image sensors, a positioning module based on lidar, or the like.
- the sensor module 103 for collecting driving state data may be an odometer or the like.
- the movable platform shown in FIG. 1 is described by taking a vehicle as an example.
- The movable platform in the embodiment of the present invention may also be other movable equipment such as an unmanned aerial vehicle (UAV), an unmanned ship, or a mobile robot.
- The positioning method based on multi-data fusion described in the embodiment of the present invention can be applied to the movable platform shown in FIG. 1. Specifically, the movable platform obtains its GNSS data, inertial navigation system data, driving state data, and at least one type of SLAM sensor data; the acquired data are mutually verified to obtain verified data, and the target data fusion mode is determined based on the verified data.
- The data fusion mode may be used to indicate the type or types of data to be used for data fusion.
- The movable platform fuses its GNSS data, inertial navigation system data, driving state data, and at least one type of SLAM sensor data according to the instruction of the target data fusion mode to obtain target information, and determines the position of the movable platform according to the target information.
- By adopting the above method, the movable platform can be positioned in different environments based on different data fusion modes, effectively ensuring positioning accuracy. A detailed description is given below.
- FIG. 2 is a schematic flowchart of a positioning method based on multiple data fusion according to an embodiment of the present invention.
- The positioning method based on multi-data fusion described in the embodiment of the present invention is applied to a movable platform configured with a GNSS system, an inertial navigation system INS or strapdown inertial navigation system SINS, a sensor module for collecting driving state data, and at least one positioning module for acquiring SLAM sensor data.
- the method includes the following steps:
- S201 Acquire the GNSS data, inertial navigation system data, driving state data, and at least one type of SLAM sensor data of the movable platform.
- The GNSS data is the observation data output by the GNSS system configured on the movable platform, including the carrier phase data and speed of the movable platform.
- the global satellite navigation system GNSS may be a single-point type global satellite navigation system and/or a differential type global satellite navigation system.
- the inertial navigation system data includes INS data and/or SINS data.
- The INS data is the observation data output by the inertial navigation system INS configured on the movable platform, and the SINS data is the observation data output by the strapdown inertial navigation system SINS configured on the movable platform.
- Inertial navigation system data includes measurement data of gyroscope and accelerometer in inertial navigation system, etc.
- the measurement data of gyroscope includes the angular velocity of the movable platform
- the measurement data of accelerometer includes the acceleration of the movable platform.
- the driving state data is the observation data output by the sensor module configured to collect driving state data on the movable platform.
- The sensor module used to collect driving state data may be an odometer; the driving state data collected by the odometer includes the speed and acceleration of the movable platform. If the movable platform is a vehicle, the driving state data includes the wheel speed and acceleration of the vehicle; if the movable platform is a drone, the driving state data includes the ground speed and ground acceleration.
- The SLAM sensor data is the observation data output by the positioning module configured on the movable platform to obtain SLAM sensor data.
- The positioning module used to obtain SLAM sensor data includes an image-sensor-based positioning module and/or a lidar-based positioning module; the SLAM sensor data specifically includes the positioning data output by the image-sensor-based positioning module and/or the positioning data output by the lidar-based positioning module.
- the positioning data output by the positioning module based on the image sensor includes the speed of the movable platform and the environmental feature information extracted from the image of the environment where the movable platform is located.
- the positioning data output by the positioning module based on lidar includes the speed of the movable platform and the point cloud data of the environment where the movable platform is located.
- S202 Perform mutual verification on the GNSS data, the inertial navigation system data, the driving state data, and at least one SLAM sensor data to obtain data that has passed the verification.
- The GNSS system, the inertial navigation system, the sensor module for collecting driving state data, and the positioning module for acquiring SLAM sensor data all add identification information to their output data, which indicates whether the output data is valid; the movable platform can therefore detect whether data is available from its identification information.
- The output information of each of these systems or modules also includes a reference data output frequency; the movable platform can therefore check whether the actual frequency at which a system or module outputs data matches its reference data output frequency (or falls within a preset error range) to determine whether that system's or module's data is available.
- Based on the identification information of each item of data in the data set composed of the GNSS data, inertial navigation system data, driving state data, and at least one type of SLAM sensor data, and on the output frequency of the corresponding device (that is, the aforementioned system or module), the movable platform inspects each item of data and selects as first data the items whose identification information is valid and whose device's actual output frequency is consistent with its reference output frequency (or within the preset error range); the first data is thus the available data, and the data set composed of the first data is determined as the data set that has passed the detection.
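- The detection step described above can be sketched in code. The following is a minimal illustration, not taken from the patent: the field names (`valid`, `ref_hz`, `meas_hz`) and the 5% frequency tolerance are assumptions chosen for the example.

```python
from dataclasses import dataclass

@dataclass
class SensorOutput:
    name: str       # e.g. "gnss", "imu", "odometer", "lidar_slam"
    valid: bool     # identification information attached by the device
    ref_hz: float   # reference output frequency reported by the device
    meas_hz: float  # actual output frequency measured by the platform
    data: object    # the observation payload

def passes_detection(s: SensorOutput, tol: float = 0.05) -> bool:
    """First data: identification info is valid AND the measured output
    frequency matches the reference frequency within relative tolerance."""
    return s.valid and abs(s.meas_hz - s.ref_hz) <= tol * s.ref_hz

def detect(outputs: list[SensorOutput]) -> list[SensorOutput]:
    # The data set that has passed the detection ("first data").
    return [s for s in outputs if passes_detection(s)]
```

An item is excluded either because its identification flag marks it invalid or because its device is not outputting at the rate it advertises.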
- The movable platform then performs mutual verification on the data in the data set that has passed the detection to obtain verified data.
- Specifically, the movable platform may take the detected data output by a target device among the above systems or modules as reference data, and use the reference data to verify the detected data output by the devices other than the target device, thereby obtaining the verified data.
- The target device is, by default, the system or module whose data accuracy is least affected by environmental factors, so the data it outputs is highly reliable.
- the target device may be a sensor module for collecting driving state data in the aforementioned system or module, and the sensor module may specifically be an odometer.
- For example, the speed output by the odometer can be used to verify the speed output by the GNSS system: if the two speeds are consistent (or differ within the preset error range), the detected data output by the GNSS system is judged reliable and is kept as verified data; otherwise, the detected GNSS output is excluded.
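- This cross-verification against the reference device can be sketched as follows; the 0.5 m/s threshold and the source names are illustrative assumptions, not values from the patent.

```python
def cross_verify_speed(candidate_speed: float, odo_speed: float,
                       max_diff: float = 0.5) -> bool:
    """Use the odometer speed (reference data) to verify another
    device's speed; True when they agree within the error range."""
    return abs(candidate_speed - odo_speed) <= max_diff

def verify_against_reference(candidates: dict, odo_speed: float,
                             max_diff: float = 0.5) -> dict:
    # Keep only sources whose reported speed agrees with the odometer.
    return {name: speed for name, speed in candidates.items()
            if cross_verify_speed(speed, odo_speed, max_diff)}
```

For instance, a GNSS speed of 10.1 m/s passes against an odometer speed of 10.0 m/s, while a vision speed of 12.3 m/s would be excluded.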
- After the movable platform determines the data set that has passed the detection, it may also perform a self-check on second data output by the same system or module within that data set, so that data failing the self-check is excluded from the data set that has passed the detection.
- For example, the odometer can output the respective wheel speeds of the vehicle's four wheels; if comparing the four wheel speeds shows that one differs significantly from the other three, that wheel speed is judged abnormal, the self-check fails for it, and the abnormal wheel speed is excluded from the data set that has passed the detection.
- As another example, the GNSS system can simultaneously output the carrier phase data and the speed of the movable platform.
- The speed of the movable platform can also be derived by processing the carrier phase data; comparing the derived speed with the speed output by the GNSS system, if the difference between the two exceeds the preset error range, the GNSS output is judged abnormal, the self-check fails, and the GNSS data is excluded from the data set that has passed the detection.
- The movable platform then performs mutual verification on the data in the data set that has passed the self-check to obtain the verified data.
- By detecting, self-checking, and mutually verifying the data output by the above systems or modules, the embodiment of the present invention can effectively isolate inaccurate data, thereby ensuring that the determined data fusion mode is optimal and that positioning accuracy is guaranteed.
- In an embodiment, before the movable platform inspects the data set composed of the GNSS data, inertial navigation system data, driving state data, and at least one type of SLAM sensor data based on each item's identification information and the corresponding device's output frequency, it may first convert the data in the data set to a reference coordinate system to obtain a coordinate-converted data set; the inspection based on identification information and device output frequency is then performed on the coordinate-converted data set to obtain the data set that has passed the detection.
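- The coordinate conversion step can be illustrated for the simple 2-D case: rotating a velocity measured in a sensor frame into the reference (body) frame given the sensor's mounting yaw. This is a hedged sketch; the function name and the 2-D simplification are assumptions, and a real system would use a full 3-D rotation plus a lever-arm correction.

```python
import math

def to_reference_frame(vx: float, vy: float, yaw_rad: float) -> tuple[float, float]:
    """Rotate a 2-D velocity measured in a sensor frame into the
    reference frame; yaw_rad is the sensor's mounting yaw angle."""
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return (c * vx - s * vy, s * vx + c * vy)
```

For example, a sensor mounted 90 degrees off the body axis reporting forward motion (1, 0) converts to lateral motion (0, 1) in the reference frame.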
- S203 Determine a target data fusion mode according to the verified data.
- When the verified data includes GNSS data, this indicates that the data output by the GNSS system in the current environment is highly accurate and can be used for accurate positioning; the movable platform therefore determines the GNSS-data-based fusion mode as the target data fusion mode.
- If the at least one type of SLAM sensor data includes positioning data output by the image-sensor-based positioning module, and the verified data includes that positioning data but does not include GNSS data, the GNSS output cannot be used for accurate positioning, but the positioning data output by the image-sensor-based positioning module can be; the movable platform therefore determines the fusion mode based mainly on the image-sensor positioning data as the target data fusion mode.
- the image sensor may be a monocular image sensor, a binocular image sensor, a multi-eye image sensor, a fish-eye image sensor, or a compound-eye image sensor.
- A monocular image sensor can obtain surrounding image information via machine vision and then perform localization or map construction, from which the position of the sensor carrier can be determined. Besides monocular image sensors, positioning information can also be obtained from binocular or multi-camera image sensors; the depth obtained from binocular or multi-camera setups further increases the robustness of localization and mapping, making the positioning results more accurate.
- the positioning information can be used as verification data to be verified with GNSS data and inertial sensor data to retain high-confidence data.
- Similarly, if the at least one type of SLAM sensor data includes positioning data output by the lidar-based positioning module, and the verified data includes that positioning data but does not include GNSS data, accurate positioning is performed using the lidar positioning data; the movable platform determines the fusion mode based mainly on the lidar positioning data as the target data fusion mode.
- If the at least one type of SLAM sensor data includes both the positioning data output by the lidar-based positioning module and the positioning data output by the image-sensor-based positioning module, and the verified data includes both but does not include GNSS data, the movable platform determines as the target data fusion mode the fusion mode based mainly on the positioning data output by the lidar-based positioning module and the positioning data output by the image-sensor-based positioning module.
- If the at least one type of SLAM sensor data includes positioning data output by the lidar-based positioning module and/or positioning data output by the image-sensor-based positioning module, and the verified data includes that positioning data together with GNSS data, the movable platform determines the target data fusion mode by taking the GNSS data together with the lidar-based and/or image-sensor-based positioning data as the main reference data for fusion.
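- The mode-selection cases above can be summarized as a priority rule. The sketch below is illustrative only: the source names and mode labels are assumptions, and the final dead-reckoning fallback (inertial/odometry only, as in the background section) is an inference, not an explicit case in the patent.

```python
def select_fusion_mode(verified: set[str]) -> str:
    """Pick the target data fusion mode from the names of the data
    sources that passed verification (names are illustrative)."""
    has_gnss = "gnss" in verified
    has_lidar = "lidar_slam" in verified
    has_vision = "vision_slam" in verified
    if has_gnss and (has_lidar or has_vision):
        return "gnss+slam_main"    # GNSS plus SLAM data as main reference
    if has_gnss:
        return "gnss_main"         # GNSS-data-based fusion
    if has_lidar and has_vision:
        return "lidar+vision_main"
    if has_lidar:
        return "lidar_main"
    if has_vision:
        return "vision_main"
    return "ins_dead_reckoning"    # fall back to inertial/odometry only
```

Each branch mirrors one of the cases described above, with GNSS-inclusive modes taking precedence when GNSS data passes verification.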
- S204 Perform fusion processing on the GNSS data, inertial navigation system data, driving state data, and at least one SLAM sensor data of the movable platform according to the instruction of the target data fusion mode, to obtain target information.
- The instruction of the target data fusion mode specifies the type or types of data to be used when fusing the movable platform's GNSS data, inertial navigation system data, driving state data, and at least one type of SLAM sensor data to obtain the target information.
- For example, if the target data fusion mode is based mainly on the positioning data output by the lidar-based positioning module, the movable platform fuses its GNSS data, inertial navigation system data, driving state data, and at least one type of SLAM sensor data with that positioning data as the main reference, obtaining the target information.
- Other fusion modes can be deduced by analogy and are not repeated here.
- The target information obtained after fusion includes data subsequently used to determine the position of the movable platform.
- For example, the target information may include the carrier phase data in the GNSS data, and the angular velocity and acceleration in the driving state data.
- The target information may include environmental feature information extracted from images of the environment where the movable platform is located.
- The target information may include the point cloud data of the environment where the movable platform is located and/or feature point information extracted from that point cloud data.
- Using the positioning data output by the lidar-based positioning module and/or the image-sensor-based positioning module as the main reference for fusion ensures that the target information obtained after fusion has high accuracy, thereby meeting the positioning accuracy requirements of the movable platform.
- S205 Determine the position of the movable platform according to the target information.
- the target information includes positioning data or positioning information.
- the movable platform can determine the position of the movable platform in the high-precision map according to the target information.
- the high-precision map may be an offline high-precision map downloaded in advance by the mobile platform.
- The high-precision map records map information that can be verified against the positioning data. For example, landmarks matching the positioning information obtained by the vision sensor or lidar can be found in the high-precision map, and the position of the movable platform is determined by matching those landmarks.
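- Landmark matching against a map can be sketched as a nearest-neighbor search; the 5 m gating distance, the landmark ids, and the 2-D representation below are assumptions for illustration, not details specified by the patent.

```python
import math

def match_landmark(obs: tuple[float, float],
                   map_landmarks: dict[str, tuple[float, float]],
                   max_dist: float = 5.0):
    """Find the map landmark closest to an observed landmark position;
    return its id, or None when nothing lies within `max_dist`."""
    best_id, best_d = None, max_dist
    for lid, (x, y) in map_landmarks.items():
        d = math.hypot(obs[0] - x, obs[1] - y)
        if d <= best_d:
            best_id, best_d = lid, d
    return best_id
```

An observed landmark near the origin matches a map landmark at (0, 0), while an observation far from every map landmark yields no match.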
- the GNSS data, inertial navigation system data, driving state data and at least one SLAM sensor data of the movable platform are mutually verified to obtain the data that has passed the verification, and the data is determined according to the data that has passed the verification.
- Target data fusion mode instructions fusion process the GNSS data of the movable platform, inertial navigation system data, driving status data and at least one SLAM sensor data to obtain the target information, and determine the position of the movable platform based on the target information.
- mobile platforms in different environments can thus be positioned with different data fusion methods, effectively ensuring positioning accuracy.
- the inertial navigation system in the embodiments of the present invention can use a low-precision micro-electromechanical (MEMS) IMU with low energy consumption, small size, and low cost, and
- the foregoing data-fusion-based positioning method fuses multi-sensor data or redundant sensor data, finally providing low-cost, accurate, and reliable positioning information for the autonomous-driving mobile platform.
- FIG. 3 shows a schematic flowchart of another positioning method based on multiple data fusion.
- the sensor modules of the movable platform include an inertial measurement module, an odometer, a positioning module based on lidar, a positioning module based on image sensors, and a global satellite navigation system.
- the lidar-based positioning module mainly uses laser point cloud data to achieve positioning, and may integrate sensors such as inertial measurement modules or odometers.
- the image-based positioning module uses image information to achieve positioning. For example, it uses image information obtained by a visual odometer to achieve positioning.
- the module can also integrate sensors such as inertial measurement modules or odometers.
- the global satellite navigation system can provide both single-point positioning results and higher-precision differential (RTK) positioning results.
- the mobile platform can carry multiple sets of the same sensors, for example, two sets of GNSS systems can be installed at the same time.
- the above-mentioned inertial measurement module corresponds to the aforementioned inertial navigation system
- the above-mentioned odometer corresponds to the aforementioned sensor module for collecting driving state data
- the positioning module based on lidar corresponds to the aforementioned positioning module for acquiring SLAM sensor data.
- the above-mentioned sensor module configured on the movable platform can provide some observation data for navigation and positioning.
- the inertial measurement module can output observation data such as acceleration and angular velocity of the movable platform
- the odometer can output observation data such as the speed of the movable platform
- the positioning module based on lidar can output observation data such as the position and heading of the movable platform.
- the positioning module based on image sensors can output observation data such as the position, attitude and speed of the movable platform
- the global satellite navigation system can output observation data such as the GNSS position and GNSS speed in the single-point positioning results of the movable platform, and can also output observation data such as the RTK position, RTK speed, and dual-antenna heading in the differential positioning results. It can be seen that different sensor modules may output the same kind of observation data.
- the main goal of this scheme is to manage and verify these rich sensor data and to design several different filter modes, that is, the data fusion methods described above. In this way, the best filtering mode can be selected based on the current observation data of the various sensor modules, so as to obtain higher positioning accuracy.
- the filter mode of the filter can be degraded accordingly; the filter here is the device that performs the data fusion processing.
- the movable platform can apply different processing strategies according to the different filtering modes. For example, when the filtering mode has a low priority, the movable platform can be controlled to actively stop moving.
- the observation data output by the above-mentioned sensor modules are denoted as follows: the speed output by the odometer is recorded as odo_v; the position and heading output by the positioning module mainly based on lidar are recorded as laser_p and laser_yaw; the position and attitude output by the positioning module mainly based on image sensors are recorded as vo_pq, its output velocity as vo_v, and its output gravity observation as vo_gravity; the GNSS position and GNSS velocity output by the global satellite navigation system are recorded as gnss_p and gnss_v, the output RTK position and RTK speed as rtk_p and rtk_v respectively, and the output dual-antenna heading as rtk_yaw. The available sensor module observation data are therefore shown in Table 1:
- six filtering modes can be set: 1. Invalid mode FS_NONE (the filter is in an invalid state);
- 2. Image observation mode FS_VPQ (the filter has the relative observation of the visual odometer VIO; having a VIO relative observation means that there is position observation data output by the positioning module mainly based on image sensors, and other similar descriptions are to be read in the same way);
- 3. Position observation mode FS_POSI (the filter is in position mode, with a global position observation);
- 4. Image and position observation mode FS_POSI_VPQ (the filter has both a VIO relative observation and a global position observation);
- 5. RTK observation mode FS_POSI_RTK (the filter has both an RTK relative observation and a global position observation);
- 6. Image and RTK observation mode FS_POSI_VPQ_RTK (the filter has an RTK relative observation, a VIO relative observation, and a global position observation).
- the priority of the above 6 filtering modes from low to high are: FS_NONE, FS_VPQ, FS_POSI, FS_POSI_VPQ, FS_POSI_RTK, FS_POSI_VPQ_RTK.
- Fig. 4 shows the conversion relationship among the above 6 filtering modes and marks the conversion conditions for switching from a lower-priority filtering mode to a higher-priority filtering mode.
- the conversion conditions for switching to a lower-priority filtering mode can be deduced by analogy and will not be repeated here.
- three sub-modes can be set in the position observation mode FS_POSI, respectively marked as the invalid position observation mode FS_POSI_NONE, the lidar position observation mode FS_POSI_LASER, and the GNSS position observation mode FS_POSI_GNSS.
- FIG. 5 shows the conversion relationship among the three sub-modes in the position observation mode FS_POSI.
- when there is a lidar positioning observation, that is, position observation data output by the positioning module mainly based on lidar, the sub-mode preferentially jumps to the FS_POSI_LASER sub-mode; when there is no lidar positioning observation but there is a GNSS observation, that is, position observation data output by the GNSS system, the sub-mode jumps to the FS_POSI_GNSS sub-mode; when there is neither a lidar positioning observation nor a GNSS observation, the sub-mode jumps to the FS_POSI_NONE sub-mode.
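As an illustration (not part of the original disclosure), the sub-mode switching described above follows a fixed priority and can be sketched as follows; the function and flag names are assumptions:

```python
def select_posi_submode(has_laser_position: bool, has_gnss_position: bool) -> str:
    """Pick the FS_POSI sub-mode from the currently available position observations.

    Lidar positioning takes priority over GNSS positioning; when neither is
    available, the sub-mode falls back to FS_POSI_NONE.
    """
    if has_laser_position:
        return "FS_POSI_LASER"
    if has_gnss_position:
        return "FS_POSI_GNSS"
    return "FS_POSI_NONE"
```

The same function is re-evaluated whenever the availability of the lidar or GNSS position observation changes, which yields the jump behavior of Fig. 5.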
- Coordinate conversion: different sensor modules usually have different coordinate systems, so the coordinate system of each sensor module needs to be converted to the coordinate system set by the filter, such as the commonly used East-North-Up (ENU) coordinate system. Among them, the output of the positioning module mainly based on lidar is already in the ENU coordinate system, while the data output by the other sensor modules must undergo coordinate conversion before they can be used in the filter.
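As a sketch of such a coordinate conversion (illustrative only; the patent does not specify the formulas), GNSS geodetic coordinates can be converted to a local ENU frame via an intermediate ECEF step using standard WGS-84 relations:

```python
import math

# WGS-84 ellipsoid constants
_A = 6378137.0           # semi-major axis (m)
_E2 = 6.69437999014e-3   # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Convert geodetic latitude/longitude/height to Earth-centered Earth-fixed."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = _A / math.sqrt(1.0 - _E2 * math.sin(lat) ** 2)  # prime vertical radius
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - _E2) + h) * math.sin(lat)
    return x, y, z

def geodetic_to_enu(lat_deg, lon_deg, h, ref_lat_deg, ref_lon_deg, ref_h):
    """Express a geodetic point in the East-North-Up frame anchored at a reference."""
    x, y, z = geodetic_to_ecef(lat_deg, lon_deg, h)
    xr, yr, zr = geodetic_to_ecef(ref_lat_deg, ref_lon_deg, ref_h)
    dx, dy, dz = x - xr, y - yr, z - zr
    lat, lon = math.radians(ref_lat_deg), math.radians(ref_lon_deg)
    east = -math.sin(lon) * dx + math.cos(lon) * dy
    north = (-math.sin(lat) * math.cos(lon) * dx
             - math.sin(lat) * math.sin(lon) * dy
             + math.cos(lat) * dz)
    up = (math.cos(lat) * math.cos(lon) * dx
          + math.cos(lat) * math.sin(lon) * dy
          + math.sin(lat) * dz)
    return east, north, up
```

A point about 0.001 degrees north of the reference maps to roughly 111 m of north displacement with near-zero east and up components.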
- Information statistics: based on the identification information carried by the data output by each sensor module and the frequency at which each sensor module outputs data, detect whether the data output by each sensor module is normal, that is, count which of the data currently output by each sensor module is available and which is not.
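A minimal sketch of this statistics step (names and the timeout threshold are illustrative assumptions, not from the original): each observation stream is available only if its latest frame is marked valid and arrived recently enough, i.e. its output frequency is normal:

```python
def update_availability(streams, now, max_age=0.5):
    """Flag each sensor output as available or not.

    `streams` maps an observation name (e.g. 'gnss_p') to a dict holding the
    last frame's 'valid' identification flag and its 'stamp' in seconds.
    A frame older than `max_age` seconds indicates an abnormal output rate.
    """
    available = {}
    for name, frame in streams.items():
        fresh = (now - frame["stamp"]) <= max_age  # frequency / timeout check
        available[name] = bool(frame["valid"]) and fresh
    return available
```

The resulting availability map is what the later self-check and mutual-check stages filter further.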
- Module data self-check: each sensor module can usually provide several kinds of data. Through mutual verification between the various data output by the same sensor module, it can be judged whether the data currently output by that sensor module is available. The module data self-check therefore filters the available data given by the information statistics again.
- Inter-module data mutual check: the observation data output by the different sensor modules can also be mutually verified.
- the observation data output by the odometer is generally more reliable.
- the speed output by the odometer can be used to verify the speed output by the GNSS, so as to judge whether the observation data currently output by the GNSS is available. The inter-module data mutual check therefore further filters the available data given by the module data self-check.
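A minimal sketch of this mutual check (the function name and the 1 m/s tolerance are illustrative assumptions): the GNSS speed frame is accepted only when it agrees with the generally more reliable odometer speed:

```python
def check_gnss_speed(gnss_v: float, odo_v: float, tolerance: float = 1.0) -> bool:
    """Cross-check the GNSS speed against the odometer speed.

    Returns False (reject the GNSS frame) when the two disagree by more
    than `tolerance` meters per second.
    """
    return abs(gnss_v - odo_v) <= tolerance
```

A rejected frame is simply withheld from the filter, so a momentarily unreliable GNSS output cannot bias the fused position.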
- Mode selection and filter configuration: mode selection corresponds to the data-fusion-method determination process described above. According to the available data obtained in step (4), determine which filtering modes the filter can be in, and then select the filtering mode with the highest priority among the available filtering modes as the final filtering mode of the filter.
- the filtering mode corresponds to the data fusion method described above; further, the filter is configured according to the selected filtering mode, for example by reducing or increasing the state dimension of the filter and configuring the observation data of the filter.
- the observation data required by the above 6 filtering modes are shown in Table 2:
- OPTION indicates that the observation data is optional
- NECE_0 and NECE_1 indicate that the observation data must be available.
- for example, to enter a mode with RTK observation, the necessary observation data must include rtk_p; for another example, to enter the FS_POSI mode, the necessary observation data are laser_p and laser_yaw, or the observation data gnss_p.
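The mode selection can then be sketched as follows (illustrative only; only the requirements explicitly stated in the text are encoded, the full Table 2 is not reproduced, and the helper names are assumptions): each mode lists alternative sets of necessary observations, and the highest-priority mode whose requirements are met is chosen:

```python
# Requirements per mode: a mode is usable if any one of its alternative
# requirement sets is fully contained in the available observations.
MODE_REQUIREMENTS = {
    "FS_NONE": [set()],
    "FS_VPQ": [{"vo_pq"}],
    "FS_POSI": [{"laser_p", "laser_yaw"}, {"gnss_p"}],
    "FS_POSI_VPQ": [{"laser_p", "laser_yaw", "vo_pq"}, {"gnss_p", "vo_pq"}],
    "FS_POSI_RTK": [{"rtk_p"}],
    "FS_POSI_VPQ_RTK": [{"rtk_p", "vo_pq"}],
}

# From lowest to highest priority, as stated in the text.
MODE_PRIORITY = ["FS_NONE", "FS_VPQ", "FS_POSI",
                 "FS_POSI_VPQ", "FS_POSI_RTK", "FS_POSI_VPQ_RTK"]

def select_filter_mode(available: set) -> str:
    """Return the highest-priority mode whose necessary observations are all
    present in `available` (the set of observation names that passed checks)."""
    best = "FS_NONE"
    for mode in MODE_PRIORITY:
        if any(req <= available for req in MODE_REQUIREMENTS[mode]):
            best = mode
    return best
```

Losing an observation simply shrinks `available`, so re-running the selection implements the priority-ordered mode degradation of Fig. 4.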
- Filter data fusion: use the configured filter to filter and fuse the data output by the inertial measurement module and the available data obtained in step (4) above, that is, the data output by the sensor modules other than the inertial measurement module, so as to obtain the positioning information of the movable platform.
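As a toy illustration of this fusion step (a one-dimensional sketch under strong simplifying assumptions, not the filter disclosed here; the noise values q and r are arbitrary), inertial data drives the prediction and a verified position observation corrects it:

```python
def kalman_fuse_1d(pos, vel, p, accel, z_pos, dt, q=0.1, r=1.0):
    """One predict/update cycle of a minimal 1-D Kalman-style filter.

    `accel` is the inertial measurement, `z_pos` a position observation
    (e.g. laser_p or gnss_p after the availability checks), `p` the position
    variance, q/r the process and measurement noise.
    """
    # Predict: integrate acceleration into velocity and position.
    vel += accel * dt
    pos += vel * dt
    p += q
    # Update: blend in the position observation via the Kalman gain.
    k = p / (p + r)
    pos += k * (z_pos - pos)
    p *= (1.0 - k)
    return pos, vel, p
```

In the real system the state is multi-dimensional (position, velocity, attitude) and the set of observation terms applied in the update is exactly what the selected filtering mode configures.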
- the observation data output by the inaccurate or even faulty sensor module can be effectively isolated.
- the identification information of the GNSS position observation data output by the GNSS system may indicate that the GNSS position observation data is valid while the reliability
- and accuracy of the GNSS positioning result are actually poor at that moment; the corresponding observation data output by the GNSS system can then be verified through the speed output by the odometer, which avoids introducing the poorly reliable observation data currently output by the GNSS system into the filter and causing the positioning result to deviate.
- the observation data output by the GNSS system can be used to verify the observation data output by the positioning module based on lidar.
- the contamination of the filter by the wrong observation data can be greatly reduced.
- the above method can ensure that the movable platform has high positioning accuracy in different environments.
- the observation data output by the positioning module based on lidar, the positioning module based on image sensors, or sensor modules such as the odometer can be
- input into the filter for data fusion, so that high-precision positioning results can still be output at such times.
- the switching between filtering modes, that is, the switching of the data fusion method, ensures that the filter can be configured flexibly and reasonably when various sensors fail or become effective, guaranteeing the accuracy of the output results; meanwhile, reporting the current filtering mode makes it convenient for the movable platform to perform the corresponding operation.
- the positioning module based on lidar adopts the Monte Carlo positioning method based on grid map
- the corresponding grid map needs to be collected in advance.
- the sub-mode of the position observation mode can be switched from the FS_POSI_LASER sub-mode to the FS_POSI_GNSS sub-mode, so that the global position observation data provided by the GNSS system can be used to ensure the global positioning accuracy.
- the filtering mode can use the observation data provided by the image sensor-based positioning module to ensure the accuracy of the positioning results.
- FIG. 6 is a schematic structural diagram of another movable platform according to an embodiment of the present invention.
- the movable platform described in the embodiment of the present invention includes: a processor 601, a communication interface 602, and a memory 603.
- the processor 601, the communication interface 602, and the memory 603 may be connected through a bus or in other ways.
- the embodiment of the present invention takes the connection through a bus as an example.
- the processor 601 may be a central processing unit (CPU), a graphics processing unit (GPU), a network processor (NP), or a combination of a CPU, GPU, and NP.
- the processor 601 may also be a core in a multi-core CPU, a multi-core GPU, or a multi-core NP for implementing communication identification binding.
- the processor 601 may be a hardware chip.
- the hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD) or a combination thereof.
- the PLD may be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), a general array logic (generic array logic, GAL) or any combination thereof.
- the communication interface 602 can be used for the interaction of sending and receiving information or signaling, and the receiving and transmitting of signals.
- the memory 603 may mainly include a storage program area and a storage data area.
- the storage program area may store an operating system and a program required by at least one function (such as a text storage function or a location storage function); the storage data area may store data created according to the use of the device (such as image data or text data), and may include application programs, etc.
- the memory 603 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage devices.
- the memory 603 is also used to store program instructions.
- the processor 601 is configured to execute program instructions stored in the memory 603, and when the program instructions are executed, the processor 601 is configured to:
- when the processor 601 determines the target data fusion mode according to the data that has passed the verification, it is specifically configured to:
- a data fusion method based on GNSS data is determined as the target data fusion method.
- the at least one type of SLAM sensor data includes: positioning data output by a positioning module based on an image sensor.
- when the processor 601 determines the target data fusion mode according to the data that has passed the verification, it is specifically configured to:
- determine the data fusion method mainly based on the positioning data output by the image-sensor-based positioning module as the target data fusion method.
- the at least one type of SLAM sensor data includes: positioning data output by a positioning module based on lidar.
- when the processor 601 determines the target data fusion mode according to the data that has passed the verification, it is specifically configured to:
- determine the data fusion method mainly based on the positioning data output by the lidar-based positioning module as the target data fusion method.
- the at least one type of SLAM sensor data includes: positioning data output by a positioning module based on lidar, and positioning data output by a positioning module based on image sensors.
- when the processor 601 determines the target data fusion mode according to the data that has passed the verification, it is specifically configured to:
- when the verified data includes the positioning data output by the positioning module based on lidar and the positioning data output by the positioning module based on image sensors, and does not include the GNSS data,
- determine the method of fusing the positioning data output by the positioning module mainly based on lidar with the positioning data output by the positioning module mainly based on image sensors as the target data fusion method.
- when the processor 601 performs mutual verification on the GNSS data, the inertial navigation system data, the driving state data, and at least one type of SLAM sensor data to obtain the data that has passed the verification, it is specifically configured to:
- detect the identification information of each data item and the frequency at which the corresponding device outputs data, to obtain a data set that has passed the detection; mutually verify the data in the data set that has passed the detection, to obtain the data that has passed the verification.
- when the processor determines the position of the movable platform according to the target information, it is specifically configured to:
- the position of the movable platform is determined in a high-precision map according to the target information.
- the processor 601, the communication interface 602, and the memory 603 described in the embodiment of the present invention can execute the implementations described in the positioning method based on multiple data fusion provided in the embodiments of the present invention, which will not be repeated here.
- the GNSS data, inertial navigation system data, driving state data, and at least one type of SLAM sensor data of the movable platform are mutually verified by the processor to obtain the data that has passed the verification, and the target data fusion method is determined according to the data that has passed the verification;
- as indicated by the target data fusion method, the GNSS data, inertial navigation system data, driving state data, and at least one type of SLAM sensor data of the movable platform are fused to obtain the target information, and the position of the movable platform is determined according to the target information, so that
- the mobile platform can be positioned in different environments based on different data fusion methods, effectively ensuring positioning accuracy.
- An embodiment of the present invention also provides a computer-readable storage medium in which a computer program is stored; when the computer program is executed by a processor, the positioning method based on multiple data fusion described in the above method embodiment is implemented.
- the embodiment of the present invention also provides a computer program product containing instructions which, when run on a computer, causes the computer to execute the positioning method based on multiple data fusion described in the above method embodiment.
- the modules in the device of the embodiment of the present invention can be combined, divided, and deleted according to actual needs.
- the program can be stored in a computer-readable storage medium, and the storage medium can include: a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, etc.
Abstract
The present invention relates to a positioning method based on the fusion of multiple data, a movable platform, and a storage medium. The method comprises: acquiring GNSS data, inertial navigation system data, driving state data, and at least one type of SLAM sensor data of a movable platform; performing mutual verification on the GNSS data, the inertial navigation system data, the driving state data, and the at least one type of SLAM sensor data to obtain verified data, and determining a target data fusion method according to the verified data; and fusing the GNSS data, the inertial navigation system data, the driving state data, and the at least one type of SLAM sensor data of the movable platform as indicated by the target data fusion method to obtain target information, and determining the position of the movable platform according to the target information. By means of the embodiments of the present invention, the movable platform in different environments can be positioned based on different data fusion methods, which ensures positioning accuracy.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2019/097957 WO2021016749A1 (fr) | 2019-07-26 | 2019-07-26 | Procédé de positionnement basé sur la fusion de multiples données, plateforme mobile et support de stockage |
CN201980030350.3A CN112105961B (zh) | 2019-07-26 | 2019-07-26 | 基于多数据融合的定位方法、可移动平台及存储介质 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2019/097957 WO2021016749A1 (fr) | 2019-07-26 | 2019-07-26 | Procédé de positionnement basé sur la fusion de multiples données, plateforme mobile et support de stockage |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021016749A1 (fr) | 2021-02-04 |
Family
ID=73748804
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/097957 WO2021016749A1 (fr) | 2019-07-26 | 2019-07-26 | Procédé de positionnement basé sur la fusion de multiples données, plateforme mobile et support de stockage |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN112105961B (fr) |
WO (1) | WO2021016749A1 (fr) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112904396B (zh) * | 2021-02-03 | 2024-08-13 | 深圳亿嘉和科技研发有限公司 | 一种基于多传感器融合的高精度定位方法及系统 |
CN114018284B (zh) * | 2021-10-13 | 2024-01-23 | 上海师范大学 | 一种基于视觉的轮速里程计校正方法 |
CN117451034B (zh) * | 2023-12-25 | 2024-04-02 | 天津云圣智能科技有限责任公司 | 一种自主导航的方法、装置、存储介质及电子设备 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102011119762A1 (de) * | 2011-11-30 | 2012-06-06 | Daimler Ag | System und Verfahren zur Positionsbestimmung eines Kraftfahrzeugs |
CN103278837A (zh) * | 2013-05-17 | 2013-09-04 | 南京理工大学 | 基于自适应滤波的sins/gnss多级容错组合导航方法 |
CN106227220A (zh) * | 2016-09-28 | 2016-12-14 | 关健生 | 基于分布式框架的自主导航巡检机器人 |
CN108375370A (zh) * | 2018-07-02 | 2018-08-07 | 江苏中科院智能科学技术应用研究院 | 一种面向智能巡防无人机的复合导航系统 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2330472A1 (fr) * | 2009-09-07 | 2011-06-08 | BAE Systems PLC | Détermination de chemin |
CN103207634A (zh) * | 2013-03-20 | 2013-07-17 | 北京工业大学 | 一种智能车辆中差分gps与惯性导航数据融合的系统和方法 |
CN106780699B (zh) * | 2017-01-09 | 2020-10-16 | 东南大学 | 一种基于sins/gps和里程计辅助的视觉slam方法 |
CN109405824A (zh) * | 2018-09-05 | 2019-03-01 | 武汉契友科技股份有限公司 | 一种适用于智能网联汽车的多源感知定位系统 |
CN109752725A (zh) * | 2019-01-14 | 2019-05-14 | 天合光能股份有限公司 | 一种低速商用机器人、定位导航方法及定位导航系统 |
- 2019-07-26 WO PCT/CN2019/097957 patent/WO2021016749A1/fr active Application Filing
- 2019-07-26 CN CN201980030350.3A patent/CN112105961B/zh active Active
Also Published As
Publication number | Publication date |
---|---|
CN112105961A (zh) | 2020-12-18 |
CN112105961B (zh) | 2024-06-25 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19939761 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 19939761 Country of ref document: EP Kind code of ref document: A1 |