CN112105961A - Positioning method based on multi-data fusion, movable platform and storage medium


Info

Publication number
CN112105961A
Authority
CN
China
Prior art keywords
data
positioning
movable platform
gnss
output
Prior art date
Legal status
Granted
Application number
CN201980030350.3A
Other languages
Chinese (zh)
Other versions
CN112105961B (en)
Inventor
冯国强
Current Assignee
Shenzhen Zhuoyu Technology Co ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Publication of CN112105961A publication Critical patent/CN112105961A/en
Application granted granted Critical
Publication of CN112105961B publication Critical patent/CN112105961B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/48 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • G01S19/49 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an inertial position system, e.g. loosely-coupled
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/18 Stabilised platforms, e.g. by gyroscope
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass initial alignment, calibration or starting-up of inertial devices
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/40 Correcting position, velocity or attitude
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/45 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/47 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing & Machinery (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A positioning method based on multi-data fusion, a movable platform and a storage medium. The method comprises: acquiring GNSS data, inertial navigation system data, driving state data and at least one type of SLAM sensor data of a movable platform; mutually verifying the GNSS data, the inertial navigation system data, the driving state data and the at least one type of SLAM sensor data to obtain data that passes verification, and determining a target data fusion mode according to the data that passes verification; and fusing the GNSS data, the inertial navigation system data, the driving state data and the at least one type of SLAM sensor data of the movable platform as indicated by the target data fusion mode to obtain target information, and determining the position of the movable platform according to the target information. With the embodiments of the invention, the movable platform can be positioned in different environments based on different data fusion modes, and positioning accuracy is effectively guaranteed.

Description

Positioning method based on multi-data fusion, movable platform and storage medium
Technical Field
The invention relates to the technical field of positioning, in particular to a positioning method based on multi-data fusion, a movable platform and a storage medium.
Background
Positioning technology provides position and related information for a movable platform and is a precondition for path planning, motion control and autonomous decision making. A mature approach at present is to combine an inertial measurement unit (IMU) with a global navigation satellite system (GNSS) to realize real-time positioning. However, GNSS signals are frequently lost in complex environments such as urban canyons, tunnels or jungles, so the movable platform cannot be accurately positioned using GNSS alone. In that case the movable platform can only be positioned with the IMU, but the positioning accuracy of the IMU is low and cannot meet the accurate positioning requirements of the movable platform.
Disclosure of Invention
Embodiments of the invention disclose a positioning method based on multi-data fusion, a movable platform and a storage medium, which can position the movable platform in different environments based on different data fusion modes and effectively guarantee positioning accuracy.
In one aspect, an embodiment of the invention discloses a positioning method based on multi-data fusion, applied to a movable platform and comprising the following steps:
acquiring global navigation satellite system (GNSS) data, inertial navigation system data, driving state data and at least one type of simultaneous localization and mapping (SLAM) sensor data of the movable platform;
mutually verifying the GNSS data, the inertial navigation system data, the driving state data and the at least one type of SLAM sensor data to obtain data that passes verification, and determining a target data fusion mode according to the data that passes verification;
and fusing the GNSS data, the inertial navigation system data, the driving state data and the at least one type of SLAM sensor data of the movable platform as indicated by the target data fusion mode to obtain target information, and determining the position of the movable platform according to the target information.
In another aspect, an embodiment of the present invention discloses a movable platform, comprising a memory and a processor, wherein:
the memory is configured to store program instructions; and
the processor is configured to execute the program instructions stored in the memory, and when the program instructions are executed, the processor is configured to:
acquire GNSS data, inertial navigation system data, driving state data and at least one type of SLAM sensor data of the movable platform;
mutually verify the GNSS data, the inertial navigation system data, the driving state data and the at least one type of SLAM sensor data to obtain data that passes verification, and determine a target data fusion mode according to the data that passes verification;
and fuse the GNSS data, the inertial navigation system data, the driving state data and the at least one type of SLAM sensor data of the movable platform as indicated by the target data fusion mode to obtain target information, and determine the position of the movable platform according to the target information.
Correspondingly, an embodiment of the present invention further discloses a computer-readable storage medium in which a computer program is stored; when executed by a processor, the computer program implements the steps of the above positioning method based on multi-data fusion.
In the embodiments of the invention, the GNSS data, the inertial navigation system data, the driving state data and the at least one type of SLAM sensor data of the movable platform are mutually verified to obtain data that passes verification; a target data fusion mode is determined according to the data that passes verification; the GNSS data, the inertial navigation system data, the driving state data and the at least one type of SLAM sensor data of the movable platform are then fused as indicated by the target data fusion mode to obtain target information, and the position of the movable platform is determined according to the target information. In this way, the movable platform can be positioned in different environments based on different data fusion modes, and positioning accuracy is effectively guaranteed.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the embodiments are briefly described below. The drawings described below illustrate only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a schematic structural diagram of a movable platform according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of a positioning method based on multi-data fusion according to an embodiment of the present invention;
FIG. 3 is a schematic flowchart of another positioning method based on multi-data fusion according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating a conversion relationship between filtering modes according to an embodiment of the present invention;
FIG. 5 is a diagram illustrating a conversion relationship between sub-modes in a position observation mode according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of another movable platform disclosed in the embodiment of the present invention.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a movable platform according to an embodiment of the present invention. As shown in fig. 1, the movable platform is configured with a global navigation satellite system (GNSS) 101, an inertial navigation system (INS) or strapdown inertial navigation system (SINS) 102, and a sensor module 103 for collecting driving state data; the movable platform is also configured with at least one positioning module 104 for acquiring simultaneous localization and mapping (SLAM) sensor data. The inertial navigation system INS or strapdown inertial navigation system SINS 102 may include an inertial measurement module (IMU), which may include a gyroscope, an accelerometer, and the like; the inertial measurement module may be a low-precision micro-electro-mechanical system (MEMS) IMU, or an optical-fiber or laser IMU. The positioning module 104 may be carried on a body 106 of the movable platform through a gimbal 105 of the movable platform, and the gimbal 105 may drive the positioning module 104 to rotate around one or more of a yaw axis, a roll axis and a pitch axis so as to adjust the attitude in which the SLAM sensor data is acquired; the positioning module 104 may also be carried directly on the body 106 of the movable platform. In some embodiments, the positioning module 104 may be fixed entirely to the gimbal 105, or one part of it may be fixed to the gimbal 105 while the other part is carried directly on the body 106 of the movable platform.
There may be one or more GNSS systems 101, one or more inertial navigation systems, one or more sensor modules 103 for collecting driving state data, and one or more positioning modules 104 for acquiring SLAM sensor data. The positioning module 104 for acquiring SLAM sensor data may be a positioning module based mainly on an image sensor, a positioning module based mainly on a lidar, or the like. The sensor module 103 for collecting driving state data may be an odometer or the like. It should be noted that the movable platform shown in fig. 1 takes a vehicle as an example; the movable platform in the embodiments of the present invention may also be a movable device such as an unmanned aerial vehicle (UAV), an unmanned ship or a mobile robot.
The positioning method based on multi-data fusion in the embodiments of the present invention can be applied to the movable platform shown in fig. 1. Specifically, the movable platform acquires GNSS data, inertial navigation system data, driving state data and at least one type of SLAM sensor data; the acquired GNSS data, inertial navigation system data, driving state data and at least one type of SLAM sensor data are mutually verified to obtain data that passes verification, and a target data fusion mode is determined according to the data that passes verification. The data fusion mode indicates which type or types of data the data fusion is mainly based on. The movable platform then fuses the GNSS data, the inertial navigation system data, the driving state data and the at least one type of SLAM sensor data of the movable platform as indicated by the target data fusion mode to obtain target information, and determines the position of the movable platform according to the target information. In this way, the movable platform can be positioned in different environments based on different data fusion modes, and positioning accuracy is effectively guaranteed. Details are described below.
Referring to fig. 2, fig. 2 is a flowchart of a positioning method based on multi-data fusion according to an embodiment of the present invention. The method is applied to a movable platform configured with a global navigation satellite system (GNSS), an inertial navigation system INS or strapdown inertial navigation system SINS, a sensor module for collecting driving state data, and at least one positioning module for acquiring SLAM sensor data. The method comprises the following steps:
s201, GNSS data, inertial navigation system data, driving state data and at least one SLAM sensor data of the movable platform are obtained.
In the embodiments of the present invention, the GNSS data are observation data output by the GNSS configured on the movable platform, and include carrier phase data, velocity and the like of the movable platform. The GNSS may be a single-point global navigation satellite system and/or a differential global navigation satellite system. The inertial navigation system data include INS data and/or SINS data, where the INS data are observation data output by the inertial navigation system INS configured on the movable platform, and the SINS data are data output by the strapdown inertial navigation system SINS configured on the movable platform. The inertial navigation system data include measurement data of a gyroscope and an accelerometer in the inertial navigation system; the measurement data of the gyroscope include the angular velocity of the movable platform, and the measurement data of the accelerometer include the acceleration of the movable platform. The driving state data are observation data output by the sensor module configured on the movable platform for collecting driving state data; the sensor module may be an odometer, and the driving state data collected by the odometer include the velocity, acceleration and the like of the movable platform. If the movable platform is a vehicle, the driving state data include the wheel speeds, acceleration and the like of the vehicle; if the movable platform is an unmanned aerial vehicle, the driving state data include the ground velocity, ground acceleration and the like of the unmanned aerial vehicle.
The SLAM sensor data are likewise observation data, output by the positioning module configured on the movable platform for acquiring SLAM sensor data. The positioning module for acquiring SLAM sensor data includes a positioning module based mainly on an image sensor and/or a positioning module based mainly on a lidar, and the SLAM sensor data specifically include positioning data output by the positioning module based mainly on an image sensor and/or positioning data output by the positioning module based mainly on a lidar. The positioning data output by the positioning module based mainly on an image sensor include the velocity of the movable platform, environment feature information extracted from images of the environment in which the movable platform is located, and the like. The positioning data output by the positioning module based mainly on a lidar include the velocity of the movable platform, point cloud data of the environment in which the movable platform is located, and the like.
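The following illustrative sketch (not part of the original disclosure) shows one way the observations gathered in step S201 might be represented in code; all class and field names are assumptions introduced here for illustration.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Observation:
    name: str                  # e.g. "gnss_p", "odo_v", "laser_p"
    value: List[float]         # measurement vector
    valid_flag: bool           # identification information attached by the sensor
    timestamp: float           # seconds
    nominal_rate_hz: float     # reference output frequency reported by the sensor

@dataclass
class SensorSnapshot:
    gnss: List[Observation] = field(default_factory=list)
    ins: List[Observation] = field(default_factory=list)
    driving_state: List[Observation] = field(default_factory=list)
    slam: List[Observation] = field(default_factory=list)
```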
S202: mutually verifying the GNSS data, the inertial navigation system data, the driving state data and the at least one type of SLAM sensor data to obtain data that passes verification.
In the embodiments of the present invention, the GNSS system, the inertial navigation system, the sensor module for collecting driving state data and the positioning module for acquiring SLAM sensor data each add identification information to the data they output, the identification information indicating whether the output data are valid; the movable platform can detect whether data are available through the identification information of the data. The information output by the GNSS system, the inertial navigation system, the sensor module for collecting driving state data and the positioning module for acquiring SLAM sensor data also includes a reference data output frequency; the movable platform can therefore determine whether the data output by a system or module are available by detecting whether the actual frequency at which that system or module outputs data is consistent with (or within a preset error range of) the reference data output frequency included in the information it outputs.
The movable platform inspects each item of data in the data set consisting of the GNSS data, the inertial navigation system data, the driving state data and the at least one type of SLAM sensor data, based on the identification information of each item and the frequency at which the corresponding device (i.e. the system or module above) outputs data, so as to determine first data, namely the available items: data whose identification information indicates validity and whose actual output frequency is consistent with (or within a preset error range of) the reference output frequency of the corresponding device. The data set formed by the first data is determined as the data set that passes detection.
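As an illustration of the detection just described, the sketch below checks an observation stream against its validity flag and its reference output frequency; it assumes the Observation structure from the previous sketch, and the rate tolerance is an assumed value rather than one specified in the patent.

```python
def passes_detection(obs_history, rate_tolerance=0.2):
    """obs_history: chronologically ordered Observation samples from one device.
    Returns True if the newest sample is flagged valid and the measured output
    rate matches the device's reference rate within the tolerance."""
    if len(obs_history) < 2:
        return False
    latest = obs_history[-1]
    if not latest.valid_flag:
        return False
    span = latest.timestamp - obs_history[0].timestamp
    measured_rate = (len(obs_history) - 1) / span if span > 0 else 0.0
    return abs(measured_rate - latest.nominal_rate_hz) <= rate_tolerance * latest.nominal_rate_hz
```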
The movable platform then performs mutual verification on the data in the data set that passes detection to obtain data that passes verification. Specifically, the movable platform may take the detected data output by a target device among the systems or modules as reference data, and use the reference data to verify the detected data output by the other devices, thereby obtaining the data that passes verification. The target device is a system or module whose data accuracy the movable platform by default considers to be little affected by environmental factors, so the reliability of the data it outputs is high. In an optional embodiment, the target device may be the sensor module for collecting driving state data, specifically an odometer. For example, if the data set that passes detection includes a speed output by the odometer and a speed output by the GNSS system, the speed output by the odometer may be used to verify the speed output by the GNSS system: if the speed output by the GNSS system is consistent with the speed output by the odometer, or within a preset error range of it, the detected data output by the GNSS system are determined to be reliable and are taken as data that passes verification; otherwise, the detected data output by the GNSS system are excluded.
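The mutual verification can be pictured as follows: the odometer speed serves as the reference, and the GNSS velocity is accepted or excluded depending on how closely it agrees. This is only a sketch under an assumed threshold, not the patent's implementation.

```python
import numpy as np

def cross_check_speed(odo_speed, gnss_velocity, max_diff_mps=1.0):
    """Accept the GNSS observation only if its speed agrees with the odometer speed."""
    gnss_speed = float(np.linalg.norm(gnss_velocity))
    return abs(gnss_speed - odo_speed) <= max_diff_mps

# Example: the GNSS velocity agrees with the odometer, so the GNSS data are kept.
print(cross_check_speed(odo_speed=8.3, gnss_velocity=[8.1, 1.0, 0.0]))  # -> True
```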
In an optional embodiment, after determining the data set that passes detection, the movable platform performs a self-check on second data output by the same system or module within that data set, so as to exclude data that fail the self-check and obtain a data set that passes the self-check. For example, when the movable platform is a vehicle, the odometer may output the wheel speeds of its four wheels; if comparing the four wheel speeds shows that one wheel speed differs markedly from the other three, that wheel speed can be determined to be abnormal and to fail the self-check, and it is excluded from the data set that passes detection. As another example, the GNSS system may output the carrier phase data and the velocity of the movable platform at the same time, and a velocity of the movable platform can also be obtained by processing the carrier phase data; the processed velocity is compared with the velocity output by the GNSS system, and if the difference between them exceeds a preset error range, the data output by the GNSS system are determined to be abnormal and to fail the self-check, and are excluded from the data set that passes detection. The movable platform then performs mutual verification on the data in the data set that passes the self-check to obtain data that passes verification. By inspecting, self-checking and mutually verifying the data output by the systems or modules, inaccurate data can be effectively isolated, so that the data fusion mode finally determined is optimal and positioning accuracy is guaranteed.
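The self-check examples above (comparing the wheel speeds with one another, and comparing the GNSS-reported speed with a speed derived from the carrier phase data) could be sketched as follows; the thresholds and the median-based rule are assumptions made for illustration.

```python
import statistics

def self_check_wheel_speeds(wheel_speeds, max_dev_mps=0.5):
    """Drop any wheel whose speed deviates too far from the median of all wheels."""
    med = statistics.median(wheel_speeds)
    return [w for w in wheel_speeds if abs(w - med) <= max_dev_mps]

def self_check_gnss(reported_speed, speed_from_carrier_phase, max_diff_mps=0.5):
    """GNSS output passes the self-check only if its two speed estimates agree."""
    return abs(reported_speed - speed_from_carrier_phase) <= max_diff_mps

print(self_check_wheel_speeds([8.0, 8.1, 7.9, 3.2]))  # -> [8.0, 8.1, 7.9]
print(self_check_gnss(8.05, 7.90))                     # -> True
```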
In another optional embodiment, because the coordinate systems adopted by the GNSS system, the inertial navigation system, the sensor module for collecting driving state data and the positioning module for acquiring SLAM sensor data are generally different, the movable platform converts the data into a reference coordinate system before inspecting them, so that the data can be compared and processed conveniently. That is, based on the identification information of each item of data in the data set consisting of the GNSS data, the inertial navigation system data, the driving state data and the at least one type of SLAM sensor data and the frequency at which the corresponding device outputs data, the data in the data set are converted into the reference coordinate system to obtain a coordinate-converted data set; each item of data in the coordinate-converted data set is then inspected based on its identification information and the output frequency of the corresponding device to obtain the data set that passes detection.
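A minimal sketch of such a coordinate conversion is given below: a body-frame velocity (for example from the odometer) is rotated into an east-north-up reference frame assumed to be the filter's frame. Only the yaw angle is used here for brevity, and the heading convention is an assumption; a full implementation would apply the complete attitude.

```python
import numpy as np

def body_velocity_to_enu(v_body, yaw_rad):
    """Rotate a body-frame velocity into east/north/up coordinates.
    Assumes body axes forward/left/up and yaw measured clockwise from north."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    rot = np.array([[  s, -c, 0.0],
                    [  c,  s, 0.0],
                    [0.0, 0.0, 1.0]])
    return rot @ np.asarray(v_body, dtype=float)

# Facing east (yaw = 90 deg), a pure forward velocity maps to the east axis.
print(body_velocity_to_enu([5.0, 0.0, 0.0], np.pi / 2))  # ~[5, 0, 0]
```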
S203: determining a target data fusion mode according to the data that passes verification.
In the embodiments of the present invention, when the data that passes verification includes the GNSS data, the accuracy of the data output by the GNSS system in the current environment is high and the data output by the GNSS system can be used for accurate positioning; the movable platform therefore determines a data fusion mode based mainly on the GNSS data as the target data fusion mode.
If the at least one type of SLAM sensor data includes positioning data output by a positioning module based mainly on an image sensor, then when the data that passes verification includes the positioning data output by the positioning module based mainly on the image sensor but does not include the GNSS data, the accuracy of the data output by the GNSS system in the current environment is low while the accuracy of the positioning data output by the positioning module based mainly on the image sensor is high; in this case accurate positioning cannot rely on the data output by the GNSS system but can rely on the positioning data output by the positioning module based mainly on the image sensor, and the movable platform determines a data fusion mode based mainly on those positioning data as the target data fusion mode. The image sensor may be a monocular, binocular, multi-view, fisheye or compound-eye image sensor. With a monocular image sensor, surrounding image information can be obtained from the returned images in a machine-vision manner, positioning or map construction is then performed, and the position of the sensor carrier can be determined based on the positioning and map-construction information. Besides monocular image sensor data, positioning information can also be obtained through binocular or multi-view image sensors and the like; the depth obtained from binocular or multi-view vision improves the robustness of positioning and map construction, and ultimately makes the positioning result more accurate. The positioning information can be used as verification data to be checked against the GNSS data and the inertial sensor data, and the data with high confidence are retained.
If the at least one type of SLAM sensor data includes positioning data output by a positioning module based mainly on a lidar, then when the data that passes verification includes the positioning data output by the positioning module based mainly on the lidar but does not include the GNSS data, the accuracy of the data output by the GNSS system under the current conditions is low while the accuracy of the positioning data output by the positioning module based mainly on the lidar is high; in this case accurate positioning cannot rely on the data output by the GNSS system but can rely on the positioning data output by the positioning module based mainly on the lidar, and the movable platform determines a data fusion mode based mainly on those positioning data as the target data fusion mode.
If the at least one type of SLAM sensor data includes positioning data output by a positioning module based mainly on a lidar and positioning data output by a positioning module based mainly on an image sensor, then when the data that passes verification includes both sets of positioning data but does not include the GNSS data, the movable platform determines a data fusion mode based mainly on the positioning data output by the positioning module based mainly on the lidar and the positioning data output by the positioning module based mainly on the image sensor as the target data fusion mode.
In another optional embodiment, if the at least one type of SLAM sensor data includes positioning data output by a positioning module based mainly on a lidar and/or positioning data output by a positioning module based mainly on an image sensor, then when the data that passes verification includes those positioning data as well as the GNSS data, the movable platform determines a mode in which the GNSS data together with the positioning data output by the lidar-based and/or image-sensor-based positioning module serve as the main reference data for data fusion as the target data fusion mode.
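Taken together, the case analysis of S203 can be summarized by a selection function like the sketch below. The mode names and the inertial-only fallback are illustrative assumptions; they are not the identifiers used later for the filter modes.

```python
def select_fusion_mode(verified):
    """`verified` is a set of source tags, e.g. {"gnss", "lidar_slam", "visual_slam"}."""
    if "gnss" in verified:
        if verified & {"lidar_slam", "visual_slam"}:
            return "GNSS_PLUS_SLAM"      # GNSS together with SLAM positioning data
        return "GNSS_PRIMARY"            # fusion based mainly on GNSS data
    if {"lidar_slam", "visual_slam"} <= verified:
        return "LIDAR_AND_VISUAL_PRIMARY"
    if "lidar_slam" in verified:
        return "LIDAR_PRIMARY"
    if "visual_slam" in verified:
        return "VISUAL_PRIMARY"
    return "INERTIAL_ONLY"               # assumed fallback: INS/odometer dead reckoning

print(select_fusion_mode({"lidar_slam", "odometer"}))  # -> "LIDAR_PRIMARY"
```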
S204: fusing the GNSS data, the inertial navigation system data, the driving state data and the at least one type of SLAM sensor data of the movable platform as indicated by the target data fusion mode to obtain target information.
In the embodiments of the present invention, the indication of the target data fusion mode specifies which type or types of data the fusion of the GNSS data, the inertial navigation system data, the driving state data and the at least one type of SLAM sensor data of the movable platform is mainly based on, so as to obtain the target information. For example, if the target data fusion mode is based mainly on the positioning data output by the positioning module based mainly on the lidar, the movable platform fuses the GNSS data, the inertial navigation system data, the driving state data and the at least one type of SLAM sensor data of the movable platform with the positioning data output by the lidar-based positioning module as the main reference data, thereby obtaining the target information. The other fusion modes are handled analogously and are not described again here.
The fused target information includes the data subsequently used to determine the position of the movable platform. When the target data fusion mode is based mainly on the GNSS data, the target information may include the carrier phase data in the GNSS data as well as the angular velocity, acceleration and the like in the driving state data. When the target data fusion mode is based mainly on the positioning data output by the positioning module based mainly on the image sensor, the target information may include environment feature information extracted from images of the environment in which the movable platform is located, and the like. When the target data fusion mode is based mainly on the positioning data output by the positioning module based mainly on the lidar, the target information may include point cloud data of the environment in which the movable platform is located and/or feature point information extracted from the point cloud data, and the like.
With the positioning method based on multi-data fusion provided by the embodiments of the present invention, when the GNSS data accuracy is low because the movable platform is in a complex environment such as an urban canyon, a tunnel or a jungle, the positioning data output by the lidar-based positioning module and/or the image-sensor-based positioning module, whose data accuracy is high, are selected for data fusion, so that the target information obtained after data fusion has higher accuracy and the positioning accuracy requirements of the movable platform are met.
S205: determining the position of the movable platform according to the target information.
In the embodiments of the present invention, the target information includes positioning data or positioning information. The movable platform may determine its position in a high-precision map based on the target information. The high-precision map may be an offline high-precision map downloaded by the movable platform in advance. The high-precision map records map information that can be checked against the positioning data; for example, a landmark matching the positioning information obtained from a vision sensor or a lidar can be found, and the position of the movable platform can be determined in the high-precision map by matching against that landmark.
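As a hypothetical illustration of landmark matching against an offline high-precision map, the sketch below picks the nearest stored landmark and infers the platform position from it; the map format, the matching rule and the function names are assumptions, not details taken from the patent.

```python
import numpy as np

def locate_in_map(observed_landmark_xy, landmark_offset_xy, hd_map_landmarks):
    """hd_map_landmarks: {landmark_id: (x, y)} in map coordinates.
    observed_landmark_xy: rough map-frame position of the detected landmark.
    landmark_offset_xy: vector from the platform to the landmark, in map frame.
    Returns the platform position implied by the best-matching map landmark."""
    obs = np.asarray(observed_landmark_xy, dtype=float)
    best_id = min(hd_map_landmarks,
                  key=lambda k: np.linalg.norm(np.asarray(hd_map_landmarks[k]) - obs))
    return np.asarray(hd_map_landmarks[best_id]) - np.asarray(landmark_offset_xy, dtype=float)

landmarks = {"pole_7": (100.0, 50.0), "sign_3": (120.0, 48.0)}
print(locate_in_map([100.4, 49.7], [10.0, 2.0], landmarks))  # -> [90. 48.]
```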
In the embodiments of the invention, the GNSS data, the inertial navigation system data, the driving state data and the at least one type of SLAM sensor data of the movable platform are mutually verified to obtain data that passes verification; the GNSS data, the inertial navigation system data, the driving state data and the at least one type of SLAM sensor data of the movable platform are fused as indicated by the target data fusion mode determined from the data that passes verification, so as to obtain target information; and the position of the movable platform is determined according to the target information. The movable platform can thus be positioned in different environments based on different data fusion modes, and positioning accuracy is effectively guaranteed.
For a better understanding of the positioning method based on multi-data fusion provided by the embodiments of the present invention, a detailed example is given below. Because a high-precision optical-fiber or laser IMU would sharply increase equipment cost, the inertial navigation system in the embodiments of the present invention may adopt a low-precision micro-electro-mechanical (MEMS) IMU with low power consumption, small size and low cost, and the positioning method based on multi-data fusion is used to fuse multi-sensor or redundant-sensor data, finally providing low-cost, accurate and reliable positioning information for an autonomously driven movable platform. Referring to fig. 3, fig. 3 is a flowchart of another positioning method based on multi-data fusion. As shown in fig. 3, the sensor modules of the movable platform include an inertial measurement module, an odometer, a lidar-based positioning module, an image-sensor-based positioning module and a global navigation satellite system. The lidar-based positioning module mainly uses laser point cloud data to realize positioning, and sensors such as the inertial measurement module or the odometer may be fused into it. The image-based positioning module realizes positioning using image information, for example image information acquired by a visual odometer, and sensors such as the inertial measurement module or the odometer may also be fused into it. The global navigation satellite system can provide both single-point positioning results and higher-precision differential (RTK) positioning results. It should be noted that, to ensure the safety of autonomous driving of the movable platform, the movable platform may carry multiple sets of the same sensors; for example, two GNSS systems may be installed at the same time. The inertial measurement module corresponds to the inertial navigation system described above, the odometer corresponds to the sensor module for collecting driving state data described above, and the lidar-based positioning module and the image-sensor-based positioning module correspond to the positioning modules for acquiring SLAM sensor data described above.
The sensor modules configured on the movable platform described above can provide part of the observation data for navigation and positioning. As shown in fig. 3, the inertial measurement module may output observations such as the acceleration and angular velocity of the movable platform; the odometer may output observations such as the velocity of the movable platform; the lidar-based positioning module may output observations such as the position and heading of the movable platform; the image-sensor-based positioning module may output observations such as the position, attitude and velocity of the movable platform; and the global navigation satellite system may output observations such as the GNSS position and GNSS velocity in its single-point positioning result, and observations such as the RTK position, RTK velocity and dual-antenna heading in its differential positioning result. It can be seen that different sensor modules may output the same kind of observation. The main objective of this scheme is to manage and verify these abundant sensor data and to design several different filter modes, i.e. the data fusion modes described above. In this way, the optimal filtering mode can be selected based on the observations currently available from the various sensor modules, and higher positioning accuracy can be obtained. Moreover, when some observations are unavailable, the filtering mode of the filter, i.e. the device performing the data fusion processing, can be correspondingly degraded based on the currently available observations. The movable platform can then apply different processing strategies in different filtering modes; for example, the movable platform can be controlled to actively stop moving when the filtering mode has a low priority.
For convenience of subsequent description, the observations output by each sensor module are labeled as follows: the velocity output by the odometer is denoted odo_v; the position and heading output by the lidar-based positioning module are denoted laser_p and laser_yaw respectively; the position and attitude output by the image-sensor-based positioning module are denoted vo_pq, its output velocity vo_v, and its output gravity observation vo_gravity; the GNSS position and GNSS velocity output by the global navigation satellite system are denoted gnss_p and gnss_v respectively, the RTK position and RTK velocity are denoted rtk_p and rtk_v respectively, and the output dual-antenna heading is denoted rtk_yaw. The sensor module observations that can be obtained are thus as listed in Table 1:
Table 1: laser_p, laser_yaw, vo_pq, vo_v, vo_gravity, rtk_p, rtk_v, rtk_yaw, gnss_p, gnss_v, odo_v
Based on the above observations, the following six filtering modes can be designed:
1. Invalid mode FS_NONE (the filter is in an invalid state). 2. Image observation mode FS_VPQ (the filter has relative observations from the visual odometry VIO); the VIO relative observations are the position observations output by the image-sensor-based positioning module, and similar expressions below are to be understood analogously. 3. Position observation mode FS_POSI (the filter is in position mode, with global position observations). 4. Image and position observation mode FS_POSI_VPQ (the filter has both VIO relative observations and global position observations). 5. RTK observation mode FS_POSI_RTK (the filter has both RTK relative observations and global position observations). 6. Image and RTK observation mode FS_POSI_VPQ_RTK (the filter has RTK relative observations, VIO relative observations and global position observations at the same time). The priority of the six filtering modes, from low to high, is: FS_NONE, FS_VPQ, FS_POSI, FS_POSI_VPQ, FS_POSI_RTK, FS_POSI_VPQ_RTK. Referring to fig. 4, fig. 4 shows the transitions between these six filtering modes and marks the conditions for switching from a low-priority filtering mode to a high-priority one; the conditions for switching from a high-priority mode to a low-priority one are analogous and are not repeated here.
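One possible way to encode these six modes and their priority ordering is sketched below; only the mode names come from the text, while the integer values and the use of Python's IntEnum are assumptions made for illustration.

```python
from enum import IntEnum

class FilterMode(IntEnum):
    FS_NONE = 0
    FS_VPQ = 1
    FS_POSI = 2
    FS_POSI_VPQ = 3
    FS_POSI_RTK = 4
    FS_POSI_VPQ_RTK = 5

def highest_priority(available_modes):
    """Pick the final filter mode from the set of currently feasible modes."""
    return max(available_modes, default=FilterMode.FS_NONE)

print(highest_priority({FilterMode.FS_VPQ, FilterMode.FS_POSI}))  # -> FilterMode.FS_POSI
```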
In addition, since both the position observations laser_p and gnss_p provide global position information, three sub-modes can be set within the position observation mode FS_POSI, denoted the empty position observation sub-mode FS_POSI_NONE, the lidar position observation sub-mode FS_POSI_LASER and the GNSS position observation sub-mode FS_POSI_GNSS. Referring to fig. 5, fig. 5 shows the transitions between these three sub-modes of the position observation mode FS_POSI. When lidar positioning observations exist, i.e. there are position observations output by the lidar-based positioning module, the sub-mode jumps preferentially to FS_POSI_LASER; when there are no lidar positioning observations but there are GNSS observations, i.e. position observations output by the GNSS system, the sub-mode jumps to FS_POSI_GNSS; when neither lidar positioning observations nor GNSS observations exist, the sub-mode jumps to FS_POSI_NONE.
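The sub-mode transitions described above reduce to a simple preference order, sketched here purely for illustration.

```python
def select_posi_submode(has_laser_p, has_gnss_p):
    """Lidar position observations are preferred over GNSS position observations."""
    if has_laser_p:
        return "FS_POSI_LASER"
    if has_gnss_p:
        return "FS_POSI_GNSS"
    return "FS_POSI_NONE"

print(select_posi_submode(False, True))  # -> "FS_POSI_GNSS"
```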
The steps in fig. 3 are described below:
(1) Coordinate conversion: different sensor modules usually use different coordinate systems, so the coordinate systems of the individual sensor modules need to be transformed into the coordinate system used by the filter, for example a common east-north-up coordinate system. The data output by the lidar-based positioning module are already in that coordinate system, while the data output by the other sensor modules can be applied to the filter only after coordinate conversion.
(2) Information statistics, corresponding to the data detection process above: the information statistics mainly detect whether the data output by each sensor module are normal, according to the identification information attached to the data output by each sensor module and the frequency at which each sensor module outputs data, i.e. they count which data output by each sensor module are available and which are not.
(3) Module data self-check: each sensor module can generally provide several kinds of data, and whether the data currently output by a sensor module are available can be judged by mutually verifying the different kinds of data output by that same module. The module data self-check therefore applies a further screening to the available data given by the information statistics.
(4) Mutual data check between modules: besides the module data self-check, the observations output by the different sensor modules can be checked against each other. For example, the observations output by the odometer are generally reliable, so the velocity output by the odometer can be used to check the velocity output by the GNSS, thereby judging whether the observations currently output by the GNSS are usable. The mutual data check between modules therefore further screens the available data given by the module data self-check.
(5) Mode selection and filter configuration, the mode selection corresponding to the determination of the data fusion mode above: according to the available data obtained in step (4), the filtering modes the filter could be in are determined, and the filtering mode with the highest priority among the available filtering modes is selected as the final filtering mode of the filter; this filtering mode corresponds to the data fusion mode described above. The filter is then configured according to the selected filtering mode, for example by decreasing or increasing the state dimension of the filter, configuring the observations of the filter, and so on.
In some embodiments, the observations required for the above six filtering modes are shown in Table 2:
(Table 2 is reproduced as an image in the original publication and is not transcribed here.)
Here, WITHOUT indicates that the observation is not required, OPTION indicates that the observation is optional, and NECE_0 and NECE_1 indicate that the observation is required. For example, to enter the FS_POSI_RTK mode, rtk_p is a necessary observation; to enter the FS_POSI mode, there must be the observations laser_p and laser_yaw, or the observation gnss_p.
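A feasibility check implied by Table 2 could look like the sketch below. Only the two requirements spelled out in the text are encoded; the requirements of the remaining modes are given in the table image of the original publication and are not guessed here.

```python
def mode_feasible(mode, available):
    """`available` is the set of observation names that passed all checks."""
    if mode == "FS_POSI_RTK":
        return "rtk_p" in available
    if mode == "FS_POSI":
        return {"laser_p", "laser_yaw"} <= available or "gnss_p" in available
    raise NotImplementedError("requirements for the other modes are listed in Table 2")

print(mode_feasible("FS_POSI", {"gnss_p", "gnss_v", "odo_v"}))  # -> True
```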
(6) Filter data fusion: the configured filter performs filtering fusion on the data output by the inertial measurement module and the available data obtained in step (4), i.e. the data output by the sensor modules other than the inertial measurement module, to obtain the position information of the movable platform.
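To make the predict/update structure of such a fusion filter concrete, the following highly simplified sketch runs a constant-velocity Kalman filter that predicts with acceleration data and corrects with whichever position observation (laser_p, gnss_p or rtk_p) the selected mode provides. The real filter in this scheme carries far more states and observations; every numeric value and class name here is an illustrative assumption.

```python
import numpy as np

class SimpleFusionFilter:
    def __init__(self):
        self.x = np.zeros(4)                        # state: [px, py, vx, vy]
        self.P = np.eye(4)                          # state covariance
        self.Q = np.diag([0.01, 0.01, 0.1, 0.1])    # process noise (assumed)

    def predict(self, accel_xy, dt):
        """Propagate the state with a constant-velocity model driven by acceleration."""
        F = np.eye(4); F[0, 2] = F[1, 3] = dt
        B = np.array([[0.5 * dt * dt, 0.0], [0.0, 0.5 * dt * dt], [dt, 0.0], [0.0, dt]])
        self.x = F @ self.x + B @ np.asarray(accel_xy, dtype=float)
        self.P = F @ self.P @ F.T + self.Q

    def update_position(self, pos_xy, meas_var):
        """Standard Kalman correction with a 2-D position observation."""
        H = np.array([[1.0, 0.0, 0.0, 0.0], [0.0, 1.0, 0.0, 0.0]])
        R = np.eye(2) * meas_var
        y = np.asarray(pos_xy, dtype=float) - H @ self.x
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ H) @ self.P

f = SimpleFusionFilter()
f.predict(accel_xy=[0.2, 0.0], dt=0.1)              # IMU-driven prediction
f.update_position(pos_xy=[0.5, 0.1], meas_var=1.0)  # laser_p / gnss_p / rtk_p correction
```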
With this approach, observations output by a sensor module that is measuring inaccurately or has even failed can be effectively isolated through the module data self-check and the mutual data check between modules. For example, when the movable platform has just left a tunnel or is in an urban canyon, the identification information of the GNSS position observations output by the GNSS system indicates that they are valid, but the reliability and accuracy of the GNSS positioning result are poor at that moment; by checking the corresponding observations output by the GNSS system against the velocity output by the odometer, the positioning deviation caused by feeding these currently unreliable GNSS observations into the filter can be avoided. As another example, when the lidar-based positioning module is abnormal and the observations output by the GNSS system are accurate, the GNSS observations may be used to check the observations output by the lidar-based positioning module. In short, the module data self-check and mutual check greatly reduce the chance of the filter being contaminated by erroneous observations.
In addition, this approach ensures that the movable platform retains high positioning accuracy in different environments. For example, when GNSS signals are frequently lost in an occluded environment, the movable platform can feed the observations output by sensor modules such as the lidar-based positioning module, the image-sensor-based positioning module and the odometer into the filter for data fusion, so a high-accuracy positioning result can still be output. Moreover, switching between filtering modes, i.e. switching the data fusion mode, ensures that the filter can be configured flexibly and reasonably as the various sensors become invalid or valid, guarantees the accuracy of the output result, and makes the current filtering mode available so that the movable platform can take corresponding action. For example, when the lidar-based positioning module adopts a Monte Carlo localization method based on a grid map, the corresponding grid map needs to be acquired in advance; when the movable platform moves into an area without a map, the sub-mode within the position observation mode can be switched from FS_POSI_LASER to FS_POSI_GNSS, so that global positioning accuracy can be guaranteed using the global position observations provided by the GNSS system. As another example, if the movable platform is in a tunnel where the surrounding texture is too similar for a suitable grid to be found, the laser positioning of the lidar-based positioning module fails, and the filtering mode can be switched from FS_POSI_VPQ to FS_VPQ, so that the accuracy of the positioning result can be guaranteed using the observations provided by the image-sensor-based positioning module.
Referring to fig. 6, fig. 6 is a schematic structural diagram of another movable platform according to an embodiment of the present invention. The movable platform described in this embodiment includes a processor 601, a communication interface 602 and a memory 603. The processor 601, the communication interface 602 and the memory 603 may be connected by a bus or in other ways; the embodiment of the present invention takes a bus connection as an example.
The processor 601 may be a central processing unit (CPU), a graphics processing unit (GPU), a network processor (NP), or a combination of a CPU, a GPU and an NP. The processor 601 may also be a core of a multi-core CPU, a multi-core GPU or a multi-core NP for implementing communication identity binding.
The processor 601 may be a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a Programmable Logic Device (PLD), or a combination thereof. The PLD may be a Complex Programmable Logic Device (CPLD), a field-programmable gate array (FPGA), a General Array Logic (GAL), or any combination thereof.
The communication interface 602 may be used for transmitting and receiving information or for signaling interaction, and for receiving and transferring signals. The memory 603 may mainly include a program storage area and a data storage area; the program storage area may store an operating system and programs required by at least one function (such as a text storage function or a position storage function), and the data storage area may store data (such as image data or text data) created according to the use of the device, application programs, and the like. In addition, the memory 603 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another solid-state storage device.
The memory 603 is also used to store program instructions. The processor 601 is configured to execute the program instructions stored in the memory 603, and when the program instructions are executed, the processor 601 is configured to:
acquiring GNSS data, inertial navigation system data, driving state data and at least one SLAM sensor data of the movable platform;
the GNSS data, the inertial navigation system data, the driving state data and at least one SLAM sensor data are mutually verified to obtain data passing verification, and a target data fusion mode is determined according to the data passing verification;
and performing fusion processing on GNSS data, inertial navigation system data, driving state data and at least one type of SLAM sensor data of the movable platform according to the indication of the target data fusion mode to obtain target information, and determining the position of the movable platform according to the target information.
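For readability, the three steps recited above can be summarized in a minimal Python sketch; the SensorFrame layout, the 5 m agreement threshold and the weighted-average fusion are illustrative assumptions only, since the disclosure fuses the data with a filter rather than a simple average.

from dataclasses import dataclass
from typing import Dict, Optional, Tuple

Position = Tuple[float, float]  # (x, y) in a common reference frame

@dataclass
class SensorFrame:
    gnss: Optional[Position]      # global position, None when no valid fix
    ins: Position                 # inertial navigation position estimate
    odom: Position                # dead-reckoned position from driving state data
    slam: Dict[str, Position]     # e.g. {"lidar": (x, y), "vision": (x, y)}

def _agree(p: Position, q: Position, tol_m: float = 5.0) -> bool:
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5 <= tol_m

def locate(frame: SensorFrame) -> Position:
    # Step 1: acquire all candidate observations.
    candidates: Dict[str, Position] = {"ins": frame.ins, "odom": frame.odom, **frame.slam}
    if frame.gnss is not None:
        candidates["gnss"] = frame.gnss

    # Step 2: mutual verification - keep observations confirmed by at least one other
    # source, then pick the primary source that defines the target data fusion mode.
    verified = {k: p for k, p in candidates.items()
                if any(k != j and _agree(p, q) for j, q in candidates.items())}
    primary = "gnss" if "gnss" in verified else next(iter(verified), "ins")

    # Step 3: fuse the verified observations (weighted toward the primary source)
    # to obtain the target information, i.e. the position of the movable platform.
    weights = {k: (2.0 if k == primary else 1.0) for k in verified}
    total = sum(weights.values()) or 1.0
    return (sum(verified[k][0] * w for k, w in weights.items()) / total,
            sum(verified[k][1] * w for k, w in weights.items()) / total)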
The method executed by the processor in the embodiment of the present invention is described from the perspective of the processor, and it should be understood that the processor in the embodiment of the present invention needs to cooperate with other hardware structures to execute the method. The specific implementation process is not described or limited in detail in the embodiments of the present invention.
In an embodiment, when the processor 601 determines the target data fusion mode according to the data passing the verification, it is specifically configured to:
and when the checked data comprises the GNSS data, determining a data fusion mode mainly based on the GNSS data as the target data fusion mode.
In an embodiment, the at least one SLAM sensor data comprises: positioning data output by a positioning module mainly based on the image sensor.
In an embodiment, when the processor 601 determines the target data fusion mode according to the data passing the verification, it is specifically configured to:
and when the data passing the verification comprises the positioning data output by the positioning module mainly based on the image sensor and does not comprise the GNSS data, determining a mode of performing data fusion mainly based on the positioning data output by the positioning module mainly based on the image sensor as the target data fusion mode.
In an embodiment, the at least one SLAM sensor data comprises: positioning data output by a positioning module mainly based on the laser radar.
In an embodiment, when the processor 601 determines the target data fusion mode according to the data passing the verification, it is specifically configured to:
and when the data passing the verification comprises the positioning data output by the positioning module mainly based on the laser radar and does not comprise the GNSS data, determining a mode of performing data fusion mainly based on the positioning data output by the positioning module mainly based on the laser radar as a target data fusion mode.
In an embodiment, the at least one SLAM sensor data comprises: positioning data output by a positioning module mainly based on the laser radar, and positioning data output by a positioning module mainly based on the image sensor.
In an embodiment, when the processor 601 determines the target data fusion mode according to the data passing the verification, it is specifically configured to:
and when the data passing the verification comprises the positioning data output by the positioning module mainly based on the laser radar and the positioning data output by the positioning module mainly based on the image sensor, and does not comprise the GNSS data, determining a mode of performing data fusion mainly based on the positioning data output by the positioning module mainly based on the laser radar and the positioning data output by the positioning module mainly based on the image sensor as the target data fusion mode.
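Taken together, the mode-selection rules of the above embodiments can be collected into a single decision function; the sketch below is illustrative, and the mode labels and the set-of-source-tags representation of the verified data are assumptions.

def select_fusion_mode(verified_sources: set) -> str:
    """Map the set of data sources that passed mutual verification to a fusion mode.

    `verified_sources` is assumed to contain tags such as "gnss", "lidar"
    (output of the positioning module mainly based on the laser radar) and
    "vision" (output of the positioning module mainly based on the image sensor).
    """
    if "gnss" in verified_sources:
        return "GNSS_PRIMARY"
    if "lidar" in verified_sources and "vision" in verified_sources:
        return "LIDAR_AND_VISION_PRIMARY"
    if "lidar" in verified_sources:
        return "LIDAR_PRIMARY"
    if "vision" in verified_sources:
        return "VISION_PRIMARY"
    # Assumed fallback: dead reckoning on inertial navigation and driving state data.
    return "DEAD_RECKONING"

For example, select_fusion_mode({"lidar", "vision"}) returns "LIDAR_AND_VISION_PRIMARY", which corresponds to the embodiment above.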
In an embodiment, the processor 601 performs mutual verification on the GNSS data, the inertial navigation system data, the driving state data, and at least one type of SLAM sensor data, and when the data passing the verification is obtained, the processor is specifically configured to:
converting each item of data in a data set consisting of the GNSS data, the inertial navigation system data, the driving state data and the at least one type of SLAM sensor data into a reference coordinate system to obtain a coordinate-converted data set; and mutually verifying the data in the data set after the coordinate conversion to obtain the data passing the verification.
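A minimal sketch of this step, assuming the GNSS fix arrives as latitude and longitude and is projected into a local east/north reference frame with a small-area approximation before the pairwise consistency check; the 5 m tolerance and the helper names are illustrative.

import math
from typing import Dict, Tuple

EARTH_RADIUS_M = 6371000.0

def gnss_to_local(lat_deg: float, lon_deg: float,
                  origin_lat_deg: float, origin_lon_deg: float) -> Tuple[float, float]:
    """Project a GNSS fix into a local east/north frame (valid over small areas)."""
    dlat = math.radians(lat_deg - origin_lat_deg)
    dlon = math.radians(lon_deg - origin_lon_deg)
    east = EARTH_RADIUS_M * dlon * math.cos(math.radians(origin_lat_deg))
    north = EARTH_RADIUS_M * dlat
    return east, north

def mutually_verify(positions: Dict[str, Tuple[float, float]],
                    tol_m: float = 5.0) -> Dict[str, Tuple[float, float]]:
    """Keep a source only if at least one other source agrees with it within tol_m."""
    def close(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1]) <= tol_m
    return {k: p for k, p in positions.items()
            if any(k != j and close(p, q) for j, q in positions.items())}

In practice the reference coordinate system would also account for orientation and the mounting offsets between the sensors, which are omitted here.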
In an embodiment, the processor 601 performs mutual verification on the GNSS data, the inertial navigation system data, the driving state data, and at least one type of SLAM sensor data, and when the data passing the verification is obtained, the processor is specifically configured to:
detecting each item of data in a data set consisting of the GNSS data, the inertial navigation system data, the driving state data and at least one kind of SLAM sensor data based on identification information of each item of data and the frequency of output data of corresponding equipment to obtain a data set passing detection; and mutually verifying the data in the data set which passes the detection to obtain the data which passes the verification.
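A minimal sketch of this plausibility check, assuming each item of data carries a source identifier and a timestamp, and that a source is rejected when its identifier is unknown or its recent output rate falls well below the nominal frequency of the corresponding device; the rates and thresholds below are illustrative values, not parameters of the disclosure.

from collections import defaultdict
from typing import Dict, Iterable, List, Tuple

# Nominal output rates (Hz) per source identifier - illustrative values only.
NOMINAL_RATE_HZ = {"gnss": 10.0, "ins": 100.0, "odom": 50.0, "lidar": 10.0, "vision": 20.0}

def detect_valid_sources(items: Iterable[Tuple[str, float]],
                         window_s: float = 1.0,
                         min_rate_fraction: float = 0.5) -> List[str]:
    """Return source IDs whose output rate over the last window is acceptable.

    `items` is a sequence of (source_id, timestamp) pairs collected over the
    last `window_s` seconds; timestamps are only counted here.
    """
    counts: Dict[str, int] = defaultdict(int)
    for source_id, _timestamp in items:
        if source_id in NOMINAL_RATE_HZ:   # unknown identifiers are rejected outright
            counts[source_id] += 1
    valid = []
    for source_id, nominal_hz in NOMINAL_RATE_HZ.items():
        if counts[source_id] / window_s >= min_rate_fraction * nominal_hz:
            valid.append(source_id)
    return valid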
In an embodiment, when the processor determines the position of the movable platform according to the target information, the processor is specifically configured to:
and determining the position of the movable platform in a high-precision map according to the target information.
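As an illustration only, once the target information (the fused position) is available, placing the movable platform in a high-precision map can reduce to a nearest-element lookup in one map layer; the lane-point representation below is a hypothetical simplification of such a map.

import math
from typing import List, Tuple

def locate_in_map(fused_xy: Tuple[float, float],
                  lane_points: List[Tuple[str, float, float]]) -> Tuple[str, float]:
    """Return the ID of the nearest mapped lane point and the distance to it.

    `lane_points` stands in for a high-precision-map layer, given as
    (lane_point_id, x, y) tuples in the same frame as the fused position.
    """
    best_id, best_dist = "", float("inf")
    for point_id, x, y in lane_points:
        d = math.hypot(fused_xy[0] - x, fused_xy[1] - y)
        if d < best_dist:
            best_id, best_dist = point_id, d
    return best_id, best_dist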
In a specific implementation, the processor 601, the communication interface 602, and the memory 603 described in the embodiment of the present invention may execute an implementation manner described in a positioning method based on multiple data fusion provided in the embodiment of the present invention, and details are not described herein again.
According to the embodiment of the present invention, the processor mutually verifies the GNSS data, the inertial navigation system data, the driving state data and the at least one type of SLAM sensor data of the movable platform to obtain the data passing the verification, fuses these data according to the indication of the target data fusion mode determined from the data passing the verification to obtain the target information, and determines the position of the movable platform according to the target information. In this way, the movable platform can be positioned based on different data fusion modes in different environments, and the positioning accuracy is effectively ensured.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the positioning method based on multiple data fusion described in the above method embodiment is implemented.
Embodiments of the present invention further provide a computer program product containing instructions, which when executed on a computer, enable the computer to execute the positioning method based on multiple data fusion described in the above method embodiments.
It should be noted that, for simplicity of description, the above-mentioned embodiments of the method are described as a series of acts or combinations, but those skilled in the art will recognize that the present invention is not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the invention. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required by the invention.
The steps in the method of the embodiment of the invention can be sequentially adjusted, combined and deleted according to actual needs.
The modules in the device provided by the embodiment of the invention can be combined, divided and deleted according to actual needs.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable storage medium, and the storage medium may include: a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, and the like.
The positioning method based on multi-data fusion and the movable platform provided by the embodiments of the present invention are described in detail above. Specific examples are used herein to explain the principle and implementation of the present invention, and the description of the embodiments is only intended to help understand the method and core idea of the present invention. Meanwhile, a person skilled in the art may make changes to the specific embodiments and the application scope according to the idea of the present invention. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (23)

1. A positioning method based on multi-data fusion is applied to a movable platform, and is characterized in that the method comprises the following steps:
obtaining Global Navigation Satellite System (GNSS) data, inertial navigation system data, driving state data, and at least one simultaneous localization and mapping (SLAM) sensor data of the movable platform;
the GNSS data, the inertial navigation system data, the driving state data and at least one SLAM sensor data are mutually verified to obtain data passing verification, and a target data fusion mode is determined according to the data passing verification;
and performing fusion processing on GNSS data, inertial navigation system data, driving state data and at least one type of SLAM sensor data of the movable platform according to the indication of the target data fusion mode to obtain target information, and determining the position of the movable platform according to the target information.
2. The method according to claim 1, wherein the determining a target data fusion mode according to the checked data comprises:
and when the checked data comprises the GNSS data, determining a data fusion mode mainly based on the GNSS data as the target data fusion mode.
3. The method of claim 1, wherein the at least one SLAM sensor data comprises: positioning data output by a positioning module mainly based on the image sensor.
4. The method according to claim 3, wherein the determining a target data fusion mode according to the checked data comprises:
and when the data passing the verification comprises the positioning data output by the positioning module mainly based on the image sensor and does not comprise the GNSS data, determining a mode of performing data fusion mainly based on the positioning data output by the positioning module mainly based on the image sensor as the target data fusion mode.
5. The method of claim 1, wherein the at least one SLAM sensor data comprises: positioning data output by a positioning module mainly based on the laser radar.
6. The method according to claim 5, wherein the determining a target data fusion mode according to the checked data comprises:
and when the data passing the verification comprises the positioning data output by the positioning module mainly based on the laser radar and does not comprise the GNSS data, determining a mode of performing data fusion mainly based on the positioning data output by the positioning module mainly based on the laser radar as a target data fusion mode.
7. The method of claim 1, wherein the at least one SLAM sensor data comprises: positioning data output by a positioning module mainly based on the laser radar, and positioning data output by a positioning module mainly based on the image sensor.
8. The method according to claim 7, wherein the determining a target data fusion mode according to the checked data comprises:
and when the data passing the verification comprises the positioning data output by the positioning module mainly based on the laser radar and the positioning data output by the positioning module mainly based on the image sensor, and does not comprise the GNSS data, determining a mode of performing data fusion mainly based on the positioning data output by the positioning module mainly based on the laser radar and the positioning data output by the positioning module mainly based on the image sensor as the target data fusion mode.
9. The method of any one of claims 1 to 8, wherein said mutually verifying said GNSS data, said inertial navigation system data, said driving state data and at least one SLAM sensor data to obtain verified data comprises:
converting each item of data in a data set consisting of the GNSS data, the inertial navigation system data, the driving state data and the at least one type of SLAM sensor data into a reference coordinate system to obtain a coordinate-converted data set;
and mutually verifying the data in the data set after the coordinate conversion to obtain the data passing the verification.
10. The method of any one of claims 1 to 8, wherein said mutually verifying said GNSS data, said inertial navigation system data, said driving state data and at least one SLAM sensor data to obtain verified data comprises:
detecting each item of data in a data set consisting of the GNSS data, the inertial navigation system data, the driving state data and at least one kind of SLAM sensor data based on identification information of each item of data and the frequency of output data of corresponding equipment to obtain a data set passing detection;
and mutually verifying the data in the data set which passes the detection to obtain the data which passes the verification.
11. The method of any one of claims 1 to 8, wherein said determining the position of the movable platform from the target information comprises:
and determining the position of the movable platform in a high-precision map according to the target information.
12. A movable platform, comprising: a memory and a processor, wherein:
the memory is configured to store program instructions; and
the processor is configured to execute the program instructions stored in the memory, and when the program instructions are executed, the processor is configured to:
acquiring GNSS data, inertial navigation system data, driving state data and at least one SLAM sensor data of the movable platform;
the GNSS data, the inertial navigation system data, the driving state data and at least one SLAM sensor data are mutually verified to obtain data passing verification, and a target data fusion mode is determined according to the data passing verification;
and performing fusion processing on GNSS data, inertial navigation system data, driving state data and at least one type of SLAM sensor data of the movable platform according to the indication of the target data fusion mode to obtain target information, and determining the position of the movable platform according to the target information.
13. The movable platform of claim 12, wherein the processor, when determining the target data fusion mode according to the verified data, is specifically configured to:
and when the checked data comprises the GNSS data, determining a data fusion mode mainly based on the GNSS data as the target data fusion mode.
14. The movable platform of claim 12, wherein the at least one SLAM sensor data comprises: positioning data output by a positioning module mainly based on the image sensor.
15. The movable platform of claim 14, wherein the processor, when determining the target data fusion mode according to the verified data, is specifically configured to:
and when the data passing the verification comprises the positioning data output by the positioning module mainly based on the image sensor and does not comprise the GNSS data, determining a mode of performing data fusion mainly based on the positioning data output by the positioning module mainly based on the image sensor as the target data fusion mode.
16. The movable platform of claim 12, wherein the at least one SLAM sensor data comprises: positioning data output by a positioning module mainly based on the laser radar.
17. The movable platform of claim 16, wherein the processor, when determining the target data fusion mode according to the verified data, is specifically configured to:
and when the data passing the verification comprises the positioning data output by the positioning module mainly based on the laser radar and does not comprise the GNSS data, determining a mode of performing data fusion mainly based on the positioning data output by the positioning module mainly based on the laser radar as a target data fusion mode.
18. The movable platform of claim 12, wherein the at least one SLAM sensor data comprises: positioning data output by a positioning module mainly based on the laser radar, and positioning data output by a positioning module mainly based on the image sensor.
19. The movable platform of claim 18, wherein the processor, when determining the target data fusion mode according to the verified data, is specifically configured to:
and when the data passing the verification comprises the positioning data output by the positioning module mainly based on the laser radar and the positioning data output by the positioning module mainly based on the image sensor, and does not comprise the GNSS data, determining a mode of performing data fusion mainly based on the positioning data output by the positioning module mainly based on the laser radar and the positioning data output by the positioning module mainly based on the image sensor as the target data fusion mode.
20. The movable platform of any one of claims 12 to 19, wherein the processor verifies the GNSS data, the inertial navigation system data, the driving state data, and the at least one SLAM sensor data against each other, and when the data that passes the verification is obtained, the processor is specifically configured to:
converting each item of data in a data set consisting of the GNSS data, the inertial navigation system data, the driving state data and the at least one type of SLAM sensor data into a reference coordinate system to obtain a coordinate-converted data set;
and mutually verifying the data in the data set after the coordinate conversion to obtain the data passing the verification.
21. The movable platform of any one of claims 12 to 19, wherein the processor verifies the GNSS data, the inertial navigation system data, the driving state data, and the at least one SLAM sensor data against each other, and when the data that passes the verification is obtained, the processor is specifically configured to:
detecting each item of data in a data set consisting of the GNSS data, the inertial navigation system data, the driving state data and at least one kind of SLAM sensor data based on identification information of each item of data and the frequency of output data of corresponding equipment to obtain a data set passing detection;
and mutually verifying the data in the data set which passes the detection to obtain the data which passes the verification.
22. The movable platform of any one of claims 12-19, wherein the processor, when determining the position of the movable platform from the target information, is specifically configured to:
and determining the position of the movable platform in a high-precision map according to the target information.
23. A computer-readable storage medium having a computer program stored therein, characterized in that: when the computer program is executed by a processor, the steps of the method according to any one of claims 1 to 11 are implemented.
CN201980030350.3A 2019-07-26 2019-07-26 Positioning method based on multi-data fusion, movable platform and storage medium Active CN112105961B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/097957 WO2021016749A1 (en) 2019-07-26 2019-07-26 Multi-data fusion-based positioning method, movable platform and storage medium

Publications (2)

Publication Number Publication Date
CN112105961A true CN112105961A (en) 2020-12-18
CN112105961B CN112105961B (en) 2024-06-25

Family

ID=73748804

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980030350.3A Active CN112105961B (en) 2019-07-26 2019-07-26 Positioning method based on multi-data fusion, movable platform and storage medium

Country Status (2)

Country Link
CN (1) CN112105961B (en)
WO (1) WO2021016749A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112904396A (en) * 2021-02-03 2021-06-04 深圳亿嘉和科技研发有限公司 High-precision positioning method and system based on multi-sensor fusion
CN114018284A (en) * 2021-10-13 2022-02-08 上海师范大学 Wheel speed odometer correction method based on vision
CN117451034A (en) * 2023-12-25 2024-01-26 天津云圣智能科技有限责任公司 Autonomous navigation method and device, storage medium and electronic equipment


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102011119762A1 (en) * 2011-11-30 2012-06-06 Daimler Ag Positioning system for motor vehicle, has processing unit that determines localized position of vehicle using vehicle movement data measured based on specific location data stored in digital card
CN106227220A (en) * 2016-09-28 2016-12-14 关健生 Independent navigation crusing robot based on Distributed Architecture

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2330472A1 (en) * 2009-09-07 2011-06-08 BAE Systems PLC Path determination
CN103207634A (en) * 2013-03-20 2013-07-17 北京工业大学 Data fusion system and method of differential GPS (Global Position System) and inertial navigation in intelligent vehicle
CN103278837A (en) * 2013-05-17 2013-09-04 南京理工大学 Adaptive filtering-based SINS/GNSS (strapdown inertial navigation system/global navigation satellite system) multistage fault-tolerant integrated navigation method
CN106780699A (en) * 2017-01-09 2017-05-31 东南大学 A kind of vision SLAM methods aided in based on SINS/GPS and odometer
CN108375370A (en) * 2018-07-02 2018-08-07 江苏中科院智能科学技术应用研究院 A kind of complex navigation system towards intelligent patrol unmanned plane
CN109405824A (en) * 2018-09-05 2019-03-01 武汉契友科技股份有限公司 A kind of multi-source perceptual positioning system suitable for intelligent network connection automobile
CN109752725A (en) * 2019-01-14 2019-05-14 天合光能股份有限公司 Low-speed commercial robot, positioning and navigation method and positioning and navigation system

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112904396A (en) * 2021-02-03 2021-06-04 深圳亿嘉和科技研发有限公司 High-precision positioning method and system based on multi-sensor fusion
CN114018284A (en) * 2021-10-13 2022-02-08 上海师范大学 Wheel speed odometer correction method based on vision
CN114018284B (en) * 2021-10-13 2024-01-23 上海师范大学 Wheel speed odometer correction method based on vision
CN117451034A (en) * 2023-12-25 2024-01-26 天津云圣智能科技有限责任公司 Autonomous navigation method and device, storage medium and electronic equipment
CN117451034B (en) * 2023-12-25 2024-04-02 天津云圣智能科技有限责任公司 Autonomous navigation method and device, storage medium and electronic equipment

Also Published As

Publication number Publication date
WO2021016749A1 (en) 2021-02-04
CN112105961B (en) 2024-06-25

Similar Documents

Publication Publication Date Title
CN107885219B (en) Flight monitoring system and method for monitoring flight of unmanned aerial vehicle
US10788830B2 (en) Systems and methods for determining a vehicle position
US10527720B2 (en) Millimeter-wave terrain aided navigation system
US10921460B2 (en) Position estimating apparatus and method
US10422872B2 (en) Integrity monitoring of radar altimeters
Garratt et al. Integration of GPS/INS/vision sensors to navigate unmanned aerial vehicles
WO2018086122A1 (en) Method and system for fusion of multiple paths of sensing data
EP2706379B1 (en) Method and system for providing integrity for hybrid attitude and true heading
JP6506302B2 (en) Method and apparatus for operating a mobile platform
CN112105961B (en) Positioning method based on multi-data fusion, movable platform and storage medium
US8078399B2 (en) Method and device for three-dimensional path planning to avoid obstacles using multiple planes
JP2001331787A (en) Road shape estimating device
US20210035456A1 (en) Unmanned aircraft, and method and system for navigation
CN113196109B (en) Method for determining an integrity range
CN114167470A (en) Data processing method and device
Madison et al. Vision-aided navigation for small UAVs in GPS-challenged environments
KR102093743B1 (en) System for lane level positioning location information of ground vehicle using sensor fusion
US11215459B2 (en) Object recognition device, object recognition method and program
Kopecki et al. Algorithms of measurement system for a micro UAV
US10246180B2 (en) Cooperative perception and state estimation for vehicles with compromised sensor systems
KR101960164B1 (en) A method for calculating a Real-Time Heading value of object using EKF-Cl
KR102087053B1 (en) Real time car navigation control system comprising terminal and server and cotrolling method thereof
CN113272625A (en) Aircraft positioning method and device, aircraft and storage medium
Matute et al. Sensor Fusion-Based Localization Framework for Autonomous Vehicles in Rural Forested Environments
KR102515245B1 (en) Method and apparatus for preventing loss of unmanned air vehicle

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20240517

Address after: Building 3, Xunmei Science and Technology Plaza, No. 8 Keyuan Road, Science and Technology Park Community, Yuehai Street, Nanshan District, Shenzhen City, Guangdong Province, 518057, 1634

Applicant after: Shenzhen Zhuoyu Technology Co.,Ltd.

Country or region after: China

Address before: 518057 Shenzhen Nanshan High-tech Zone, Shenzhen, Guangdong Province, 6/F, Shenzhen Industry, Education and Research Building, Hong Kong University of Science and Technology, No. 9 Yuexingdao, South District, Nanshan District, Shenzhen City, Guangdong Province

Applicant before: SZ DJI TECHNOLOGY Co.,Ltd.

Country or region before: China

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant