WO2023226155A1 - Multi-source data fusion positioning method, apparatus, device, and computer storage medium - Google Patents

Multi-source data fusion positioning method, apparatus, device, and computer storage medium

Info

Publication number
WO2023226155A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
positioning
local
point cloud
coordinate system
Prior art date
Application number
PCT/CN2022/103369
Other languages
English (en)
French (fr)
Inventor
蒿杰
周怡
梁俊
孙亚强
Original Assignee
芯跳科技(广州)有限公司
广东人工智能与先进计算研究院
Priority date
Filing date
Publication date
Application filed by 芯跳科技(广州)有限公司 and 广东人工智能与先进计算研究院
Publication of WO2023226155A1 publication Critical patent/WO2023226155A1/zh


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • G01S19/45Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/47Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1652Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders

Definitions

  • the present application relates to the technical field of navigation and positioning, and in particular to a multi-source data fusion positioning method, device, equipment and computer storage medium.
  • This application provides a multi-source data fusion positioning method, device, equipment and computer storage medium to solve the defect in the existing technology that positioning accuracy depends on satellite signal strength, and to achieve autonomous precise positioning that does not rely on the outside world.
  • This application provides a multi-source data fusion positioning method, including:
  • the multi-source positioning data includes point cloud data collected using lidar, inertial positioning data collected using inertial sensors and GPS data;
  • the local positioning data and the GPS data are fused to determine the global positioning result of the positioning target.
  • the step of calculating the relative pose parameters of the positioning target based on the inertial positioning data includes:
  • the relative pose parameters of the positioning target are calculated according to the motion parameters, where the relative pose parameters include relative speed and relative displacement.
  • the step of correcting the motion distortion of the point cloud data according to the relative pose parameters to obtain local positioning data includes:
  • inter-frame interpolation processing is performed on the pose results corresponding to the positioning target in the local point cloud data to determine the pose information of the positioning target at the time corresponding to each point in the local point cloud data.
  • each frame data point in the local point cloud data is converted into the local coordinate system at the end of frame scanning to correct the motion distortion of the point cloud data and obtain local positioning data.
  • the step of fusing the local positioning data and the GPS data to determine the global positioning result of the positioning target includes:
  • the fused positioning result is converted into the global coordinate system corresponding to the GPS data to obtain a global positioning result for the positioning target.
  • the step of performing point cloud matching on the local positioning data and determining the local positioning result of the positioning target includes:
  • Each frame of point cloud data in the local positioning data is superimposed according to the optimal parameters to match the frames of point cloud data in the local positioning data, obtaining the local positioning result of the positioning target in the local coordinate system.
  • the step of converting the GPS data into the local coordinate system includes:
  • Each data point in the GPS data is converted from the geodetic coordinate system to the local coordinate system by making a difference from the initial data point.
  • before the step of converting the GPS data into the local coordinate system, the method further includes:
  • This application also provides a multi-source data fusion positioning device, including:
  • a multi-source data collection module is used to collect multi-source positioning data of positioning targets.
  • the multi-source data collection module includes a lidar and an inertial sensor.
  • the multi-source positioning data includes point cloud data collected using the lidar, inertial positioning data collected using the inertial sensor, and GPS data;
  • a data correction module configured to calculate the relative pose parameters of the positioning target based on the inertial positioning data, and correct the motion distortion of the point cloud data based on the relative pose parameters to obtain local positioning data;
  • a fusion positioning module is used to fuse the local positioning data and the GPS data to determine the global positioning result of the positioning target.
  • This application also provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor.
  • when the processor executes the program, the multi-source data fusion positioning method as described in any one of the above is implemented.
  • the present application also provides a non-transitory computer-readable storage medium on which a computer program is stored.
  • when the computer program is executed by a processor, the multi-source data fusion positioning method as described in any one of the above is implemented.
  • the present application also provides a computer program product, which includes a computer program that, when executed by a processor, implements any of the above multi-source data fusion positioning methods.
  • the multi-source data fusion positioning method, device, equipment and computer storage medium provided by this application can perform motion distortion correction on the high-precision positioning data of the positioning target by fusing the collected multi-source data to obtain local positioning data of the positioning target.
  • Fusing the local positioning data and GPS data to locate the positioning target fuses positioning results of different precisions for the positioning target, improving the positioning accuracy of the positioning target without relying on satellite signals and thereby achieving autonomous precise positioning of the positioning target.
  • Figure 1 is one of the flow diagrams of the multi-source data fusion positioning method provided by this application.
  • Figure 2 is the second schematic flow chart of the multi-source data fusion positioning method provided by this application.
  • Figure 3 is a schematic structural diagram of the multi-source data fusion positioning device provided by this application.
  • Figure 4 is a schematic structural diagram of an electronic device provided by this application.
  • Figure 1 is one of the flow diagrams of the multi-source data fusion positioning method provided by the embodiment of the present application.
  • positioning currently relies mainly on GPS, so positioning accuracy depends almost entirely on the strength of the satellite signal.
  • embodiments of the present application propose a multi-source data fusion positioning method, using multiple sensors to fuse and complement GPS data to achieve autonomous precise positioning that does not rely on GPS accuracy.
  • the multi-source data fusion positioning method provided by the embodiment of the present application includes:
  • Step 100 collect multi-source positioning data of the positioning target.
  • the multi-source positioning data includes point cloud data collected using lidar, inertial positioning data collected using inertial sensors and GPS data;
  • the positioning target can be a drone, an unmanned vehicle, or a robot.
  • the positioning device is installed on the positioning target, and the positioning target is the carrier of the positioning device.
  • the positioning target is called the carrier.
  • the carrier is fixedly installed with devices such as a GPS receiver, an inertial sensor (IMU; a 9-axis IMU comprising a gyroscope, an accelerometer and a magnetometer is taken as an example) and a lidar.
  • the multi-source positioning data collected includes GPS data, point cloud data collected by the lidar, and inertial positioning data (IMU data) collected by the IMU; these are data of different sources collected by different devices, and their positioning accuracy for the carrier differs.
  • Step 200 Calculate the relative pose parameters of the positioning target based on the inertial positioning data, and correct the motion distortion of the point cloud data according to the relative pose parameters to obtain local positioning data;
  • step 200 may also include:
  • Step 201 Create a local coordinate system corresponding to the inertial positioning data, and determine the state matrix of the positioning target in the local coordinate system based on the inertial positioning data;
  • Step 202 Filter and integrate the inertial positioning data according to the state matrix to obtain the motion parameters of the positioning target in the local coordinate system.
  • the motion parameters include rotation angle, acceleration and angular velocity;
  • Step 203 Calculate relative pose parameters of the positioning target according to the motion parameters, where the relative pose parameters include relative speed and relative displacement.
  • the relative pose parameters of the carrier include relative speed and relative displacement.
  • R t is a 3*3 rotation matrix
  • T t is a 1*3 translation matrix
  • V t is a 1*3 velocity matrix
  • b t is a 1*3 bias error matrix
  • the filtering process applied to the IMU data can be Kalman filtering, which is not limited here. Taking Kalman filtering as an example, after performing Kalman filtering on the IMU data, the translation parameter T t+Δt and velocity parameter V t+Δt of the carrier in the IMU coordinate system C I at the next time t+Δt are solved by integrating the IMU data:
  • V t+ ⁇ t V t +R t (a t -b t -n t ) ⁇ t (2)
  • n t is the white noise at time t.
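A minimal numerical sketch of this propagation step follows. Variable names are mine; equation (1), the translation counterpart, is not reproduced in the text above, so the translation update below is an assumed standard constant-acceleration companion, not the patent's exact formula.

```python
import numpy as np

def propagate_velocity(V_t, R_t, a_t, b_t, n_t, dt):
    # Equation (2): V_{t+dt} = V_t + R_t (a_t - b_t - n_t) dt
    # b_t is the IMU bias term and n_t the white noise at time t.
    return V_t + R_t @ (a_t - b_t - n_t) * dt

def propagate_translation(T_t, V_t, R_t, a_t, b_t, n_t, dt):
    # Assumed companion update (constant acceleration over dt):
    # T_{t+dt} = T_t + V_t dt + 0.5 R_t (a_t - b_t - n_t) dt^2
    return T_t + V_t * dt + 0.5 * R_t @ (a_t - b_t - n_t) * dt ** 2

# A carrier accelerating at 1 m/s^2 along x, perfectly calibrated (zero bias/noise):
R = np.eye(3)
a = np.array([1.0, 0.0, 0.0])
zero = np.zeros(3)
V1 = propagate_velocity(zero, R, a, zero, zero, dt=0.1)        # -> [0.1, 0, 0]
T1 = propagate_translation(zero, zero, R, a, zero, zero, dt=0.1)
```

Each call advances the state by one IMU sample interval; chaining the calls reproduces the dead-reckoning integration described above.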
  • the motion trajectory of the carrier can be predicted.
  • the future direction of the carrier can be obtained.
  • the motion parameters at time t+Δt are used to predict the position of the carrier. Since the integration process is time-consuming, the IMU values would have to be integrated again whenever they are corrected during subsequent processing. Therefore, in this embodiment, the IMU data is pre-integrated to avoid repeated calculation, and all relative velocities Δv ij and relative displacements ΔT ij of the carrier between any times t i and t j are expressed as follows:
  • the motion parameters of the carrier at a certain moment (the reference moment) corresponding to the IMU data can be determined. Based on the defined relative velocity and relative displacement of the carrier, and the calculated motion parameters of the carrier, the relative pose parameters of the carrier are obtained. Based on the relative pose parameters, the relative pose of the carrier at a future time after the reference time can be predicted; the relative pose is the pose of the carrier at a certain future time relative to its pose at the reference time. After determining the pose of the carrier at a certain moment, the pose of the carrier at a certain future moment can be predicted based on its relative pose parameters.
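The pre-integration idea above (accumulating Δv ij and ΔT ij once, so the window never has to be re-integrated when the state at t i is later corrected) can be sketched as follows, assuming a fixed sample interval and dropping the noise term for brevity; function and variable names are illustrative, not the patent's.

```python
import numpy as np

def preintegrate(readings, dt):
    """Accumulate relative velocity dv_ij and relative displacement dT_ij
    between times t_i and t_j from buffered IMU samples.
    `readings` is a list of (R_k, a_k, b_k): rotation, raw acceleration, bias."""
    dv = np.zeros(3)
    dT = np.zeros(3)
    for R_k, a_k, b_k in readings:
        acc = R_k @ (a_k - b_k)             # bias-corrected, rotated acceleration
        dT += dv * dt + 0.5 * acc * dt ** 2  # displacement grows with the current dv
        dv += acc * dt
    return dv, dT

# 1 s of constant 1 m/s^2 acceleration along x, sampled at 100 Hz:
readings = [(np.eye(3), np.array([1.0, 0.0, 0.0]), np.zeros(3))] * 100
dv, dT = preintegrate(readings, dt=0.01)
```

Because dv and dT are relative quantities, correcting the carrier's state at t i only requires composing the correction with these sums, not re-running the loop.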
  • step 200 may also include:
  • Step 210 convert the point cloud data into the local coordinate system, and determine the local point cloud data corresponding to the point cloud data in the local coordinate system;
  • Step 220 In the local coordinate system, based on the relative pose parameters, perform inter-frame interpolation processing on the corresponding pose results of the positioning target in the local point cloud data to determine the pose information of the positioning target at the time corresponding to each point in the local point cloud data;
  • Step 230 Convert each frame data point in the local point cloud data to the local coordinate system at the end of the frame scan according to the pose information to correct the motion distortion of the point cloud data and obtain local positioning data.
  • the specific method is to first convert the collected point cloud data into the local coordinate system established based on the IMU, and determine the local point cloud data corresponding to the collected point cloud data in that coordinate system.
  • data from different sources are represented based on their respective coordinate systems. That is, the positioning of the carrier by data collected through different channels or methods can be represented based on different coordinate systems.
  • For example, the point cloud data can be represented in the local coordinate system established based on the IMU, while the GPS data can be represented in the longitude-latitude coordinate system (or global coordinate system), and so on.
  • the point cloud data first needs to be converted into a local coordinate system for representation.
  • the pose of the carrier at each moment corresponding to the IMU data, i.e. its motion state, is obtained by integrating the IMU data. Since the frequency of collecting IMU data is usually greater than the frequency at which the lidar collects point cloud data (IMU data is typically collected above 100 Hz, while lidar point cloud data is typically collected at 10 Hz), the point cloud data is interpolated based on the collected IMU data: within each frame of lidar data, the states at the IMU times between point cloud data points are interpolated, and the state at the time corresponding to the IMU data is used as the state at the time corresponding to the point cloud data to perform motion compensation on the collected point cloud data.
  • the data points in each frame are converted into the local coordinate system at the end of the scan of that frame, and the pose information of the carrier at the end of each frame's point cloud data scan is used to correct the motion distortion of each data point in that frame's point cloud data, thereby correcting the motion distortion of the collected point cloud data and obtaining the corresponding local positioning data.
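The de-skewing step just described can be illustrated with a simplified sketch. It assumes planar motion (yaw about z only) and linear interpolation of the IMU-derived relative motion across the sweep; all names are hypothetical, and a real implementation would interpolate full SE(3) poses from the IMU samples.

```python
import numpy as np

def deskew_scan(points, timestamps, t_start, t_end, dT, dR_angle):
    """Move each lidar point into the frame of the scan END.
    dT / dR_angle: relative translation and yaw the carrier underwent
    over the sweep, as reported by the (pre-integrated) IMU data."""
    out = np.empty_like(points)
    for i, (p, t) in enumerate(zip(points, timestamps)):
        s = (t_end - t) / (t_end - t_start)  # fraction of motion remaining after this point
        ang = s * dR_angle
        c, sn = np.cos(ang), np.sin(ang)
        R = np.array([[c, -sn, 0.0], [sn, c, 0.0], [0.0, 0.0, 1.0]])
        out[i] = R @ p + s * dT              # apply the interpolated relative pose
    return out

# Carrier moves 0.5 m along x during a 0.1 s sweep, no rotation:
pts = np.array([[1.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
ts = np.array([0.0, 0.1])                    # first point at scan start, second at scan end
fixed = deskew_scan(pts, ts, 0.0, 0.1, np.array([0.5, 0.0, 0.0]), 0.0)
```

A point measured at the end of the scan is left unchanged, while a point measured earlier receives the full remaining motion, which is exactly the distortion-correction behavior described above.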
  • before step 210, the method may further include:
  • Step 211 Eliminate outlier data points in the GPS data.
  • Step 300 Fusion of the local positioning data and the GPS data to determine the global positioning result of the positioning target.
  • the local positioning data obtained after motion distortion correction is fused with the GPS data to obtain the global positioning result of the carrier.
  • the fusion of the local positioning data and the GPS data includes converting the local positioning data and the GPS data into the same coordinate system. It is understandable that the positioning accuracy of GPS and lidar differs: generally, GPS can achieve meter-level accuracy, while lidar can achieve centimeter-level accuracy. By fusing the positioning results of lidar and GPS, centimeter-level precision positioning that does not depend on satellite signals can be achieved, thereby improving the positioning accuracy of the carrier and achieving autonomous precise positioning of the carrier.
  • motion distortion correction is performed on the high-precision positioning data of the positioning target to obtain local positioning data of the positioning target, and the positioning target is positioned by fusing the local positioning data and GPS data. This fuses positioning results of different precisions, improving the positioning accuracy of the positioning target without relying on satellite signals and thereby achieving autonomous precise positioning of the positioning target.
  • step 300 may also include:
  • Step 301 Perform point cloud matching on the local positioning data to determine the local positioning result of the positioning target;
  • Step 302 Convert the GPS data into the local coordinate system, fuse the local positioning result with the positioning result of the GPS data, and obtain a fused positioning result for the positioning target;
  • Step 303 Convert the fused positioning result to the global coordinate system corresponding to the GPS data to obtain a global positioning result for the positioning target.
  • the positioning accuracy of the local positioning result is higher than the positioning accuracy of the GPS data.
  • each frame of point cloud data has been converted to the created local coordinate system, and point cloud matching is performed on the obtained local positioning data in the local coordinate system, thereby determining the local positioning result of the carrier.
  • the GPS data is also converted into the local coordinate system; the positioning result of the GPS data is fused with the local positioning result, and the fused positioning result is converted into the global coordinate system corresponding to the GPS data, thereby obtaining the global positioning result of the carrier.
  • step 301 may also include:
  • Step 311 Determine the relative pose relationship of point clouds between frames based on the local positioning data
  • Step 321 Perform voxel segmentation on each data point in the local positioning data, and calculate the mean and covariance matrix of each segmented voxel;
  • Step 331 Use the relative pose relationship as an initial parameter, iteratively optimize the mean value and the covariance matrix to obtain optimal parameters;
  • Step 341 Superimpose each frame of point cloud data in the local positioning data according to the optimal parameters to match the frames of point cloud data in the local positioning data, obtaining the local positioning result of the positioning target in the local coordinate system.
  • each frame of lidar data has been transferred to the local coordinate system, so the relative translation matrix ΔT and relative rotation matrix ΔR between two frame point clouds can be obtained. However, this is the integration result of the IMU data and contains a certain error; by eliminating the integration error through point cloud matching, the local positioning result of the carrier can be obtained.
  • the NDT (Normal Distributions Transform) method is used for point cloud matching, and the relative pose relationship of the carrier between different frames of point cloud data is determined based on the local positioning data.
  • the data points in each frame of point cloud data have also been converted into the local coordinate system corresponding to the end of that frame's point cloud scan. Therefore, the relative pose relationship of the carrier point cloud between frames can be calculated, and the relative pose relationship can be represented by a translation matrix and a rotation matrix.
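The voxel-statistics step of NDT matching (steps 321 and 331 above) can be sketched as follows. This only builds the per-voxel Gaussians; the scoring of candidate poses against them and the iterative optimization, seeded with the IMU-derived relative pose, are omitted, and the names are mine.

```python
import numpy as np

def ndt_voxel_stats(points, voxel_size):
    """Voxel segmentation for NDT: bucket points into cubic voxels and
    compute the mean and covariance matrix of each voxel's points."""
    voxels = {}
    for p in points:
        key = tuple((p // voxel_size).astype(int))   # integer voxel index
        voxels.setdefault(key, []).append(p)
    stats = {}
    for key, pts in voxels.items():
        pts = np.asarray(pts)
        if len(pts) < 3:            # covariance is degenerate below 3 points
            continue
        stats[key] = (pts.mean(axis=0), np.cov(pts.T))
    return stats

# Three points in one voxel, a lone point in another:
cloud = np.array([[0.1, 0.1, 0.1],
                  [0.2, 0.2, 0.2],
                  [0.3, 0.3, 0.3],
                  [1.5, 1.5, 1.5]])
stats = ndt_voxel_stats(cloud, voxel_size=1.0)
```

The optimizer would then maximize the likelihood of the incoming frame's points under these Gaussians, starting from the ΔT/ΔR obtained by IMU integration, which is how the integration error gets eliminated.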
  • step 302 may also include:
  • Step 312 Create a geodetic coordinate system and convert each data point in the GPS data to the geodetic coordinate system;
  • Step 322 Determine initial data points from each data point corresponding to the GPS data in the geodetic coordinate system according to the time corresponding to the first frame of point cloud data in the point cloud data;
  • Step 332 Convert each data point in the GPS data from the geodetic coordinate system to the local coordinate system by making a difference from the initial data point.
  • GPS uses the station-center coordinate system (ENU coordinate system). When converting points from the station-center coordinate system into the local coordinate system, the coordinate systems need to be aligned through the geodetic coordinate system. After creating the geodetic coordinate system, each data point in the GPS data is converted into the geodetic coordinate system.
  • the ENU coordinate system is the station-center (east-north-up) coordinate system.
  • the angle between the ellipsoid normal line passing through the ground point and the ellipsoid equatorial plane is the geodetic latitude B
  • the angle between the ellipsoid meridian plane passing through the ground point and the starting meridian plane is the geodetic longitude L
  • the distance from the ground point along the normal line of the ellipsoid to the ellipsoid is the geodetic height H.
  • e is the first eccentricity of the reference ellipsoid used by the station-center coordinate system, which is obtained from the semi-major and semi-minor axes:
  • N is the radius of curvature in the prime vertical of the reference ellipsoid, which can be obtained according to the following formula:
  • N = a / sqrt(1 − e² sin² B o ) (13)
  • each GPS data point that passed outlier filtering undergoes the same transformation, converting it to the geodetic coordinate system, and the difference from the initial data point is then taken, converting the GPS data points into the local coordinate system. The GPS data is corrected using the IMU data processed by Kalman filtering and integration, and the positioning results of the GPS data are fused with the local positioning results to achieve global positioning of the carrier.
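The coordinate alignment above (geodetic B, L, H with first eccentricity e and prime-vertical radius N per equation (13), differenced against the initial point to reach the local ENU frame) can be sketched as follows. The WGS-84 ellipsoid constants are my assumption; the patent does not name a specific ellipsoid.

```python
import numpy as np

# WGS-84 ellipsoid (assumed)
A = 6378137.0                   # semi-major axis a
F = 1 / 298.257223563           # flattening
E2 = F * (2 - F)                # first eccentricity squared, e^2

def llh_to_ecef(lat, lon, h):
    """Geodetic (B, L, H), in radians/meters, to Earth-centered Cartesian.
    N = a / sqrt(1 - e^2 sin^2 B) is the prime-vertical radius, eq. (13)."""
    s, c = np.sin(lat), np.cos(lat)
    N = A / np.sqrt(1 - E2 * s * s)
    return np.array([(N + h) * c * np.cos(lon),
                     (N + h) * c * np.sin(lon),
                     (N * (1 - E2) + h) * s])

def ecef_to_enu(p, ref_llh):
    """Difference from the reference (initial) GPS point, rotated into the
    local east-north-up (station-center) frame at that reference."""
    lat, lon, h = ref_llh
    d = p - llh_to_ecef(lat, lon, h)
    sl, cl = np.sin(lat), np.cos(lat)
    so, co = np.sin(lon), np.cos(lon)
    R = np.array([[-so,       co,       0.0],
                  [-sl * co, -sl * so,  cl],
                  [ cl * co,  cl * so,  sl]])
    return R @ d
```

Applying `ecef_to_enu` with the data point matched to the first lidar frame as `ref_llh` reproduces the "difference from the initial data point" step that brings GPS fixes into the local frame.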
  • Figure 2 is another schematic flow chart of the multi-source data fusion positioning method provided by the embodiment of the present application.
  • the position of the carrier can be predicted using processed IMU data and corrected GPS data.
  • the carrier is positioned.
  • pose prediction is performed based on IMU data integration.
  • when the carrier translates, pose prediction is performed using the fusion of GPS data and IMU data.
  • the carrier's pose is updated based on the prediction results, and the updated pose is applied to subsequent predictions through pre-integration processing.
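As a toy illustration of this predict-then-update flow: integrate IMU motion to predict, blend in the GPS fix when one is available, and carry the corrected pose into the next prediction. The blending weight is illustrative only; the patent does not specify the update equations.

```python
import numpy as np

def track(steps, dt=0.1):
    """Minimal predict/update loop. `steps` is a list of
    (accel, gps) pairs; gps is None when no valid fix arrived."""
    pos = np.zeros(3)
    vel = np.zeros(3)
    for acc, gps in steps:
        vel = vel + np.asarray(acc) * dt     # predict: IMU integration
        pos = pos + vel * dt
        if gps is not None:                  # update: pull toward the GPS fix
            pos = 0.8 * pos + 0.2 * np.asarray(gps)
    return pos

# Stationary IMU, steady GPS fix at x = 1 m: the estimate drifts toward the fix.
est = track([(np.zeros(3), np.array([1.0, 0.0, 0.0]))] * 5)
```

With zero acceleration, each update moves the estimate 20% of the way to the fix, so after five steps the x estimate is 1 − 0.8⁵ ≈ 0.672; the absolute GPS measurement bounds the drift that pure IMU integration would accumulate.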
  • the centimeter-level precise positioning results based on lidar and the meter-level GPS positioning results are fused in the local coordinate system to generate the final global positioning result.
  • the optimal parameters are determined through voxel segmentation for point cloud matching to eliminate the integral error of the IMU data.
  • the fusion of the local positioning results and the GPS positioning results is achieved, improving the positioning accuracy of the target.
  • the multi-source data fusion positioning device provided by the present application is described below.
  • the multi-source data fusion positioning device described below and the multi-source data fusion positioning method described above can be mutually referenced.
  • the multi-source data fusion positioning device provided by the embodiment of the present application includes:
  • the multi-source data collection module 10 is used to collect multi-source positioning data of the positioning target, wherein the multi-source data collection module includes a lidar and an inertial sensor, and the multi-source positioning data includes point cloud data collected using the lidar, inertial positioning data collected by the inertial sensor, and GPS data;
  • the data correction module 20 is used to calculate the relative pose parameters of the positioning target based on the inertial positioning data, and correct the motion distortion of the point cloud data according to the relative pose parameters to obtain local positioning data;
  • the fusion positioning module 30 is used to fuse the local positioning data and the GPS data to determine the global positioning result of the positioning target.
  • the data correction module 20 is also used to:
  • the relative pose parameters of the positioning target are calculated according to the motion parameters, where the relative pose parameters include relative speed and relative displacement.
  • the data correction module 20 is also used to:
  • inter-frame interpolation processing is performed on the pose results corresponding to the positioning target in the local point cloud data to determine the pose information of the positioning target at the time corresponding to each point in the local point cloud data.
  • each frame data point in the local point cloud data is converted into the local coordinate system at the end of frame scanning to correct the motion distortion of the point cloud data and obtain local positioning data.
  • the fusion positioning module 30 is also used to:
  • the fused positioning result is converted into the global coordinate system corresponding to the GPS data to obtain a global positioning result for the positioning target.
  • the fusion positioning module 30 is also used to:
  • Each frame of point cloud data in the local positioning data is superimposed according to the optimal parameters to match the frames of point cloud data in the local positioning data, obtaining the local positioning result of the positioning target in the local coordinate system.
  • the fusion positioning module 30 is also used to:
  • Each data point in the GPS data is converted from the geodetic coordinate system to the local coordinate system by making a difference from the initial data point.
  • the multi-source data fusion positioning device further includes an outlier data filtering module for:
  • Figure 4 illustrates a schematic diagram of the physical structure of an electronic device.
  • the electronic device may include: a processor (processor) 410, a communications interface (Communications Interface) 420, a memory (memory) 430 and a communication bus 440.
  • the processor 410, the communication interface 420, and the memory 430 complete communication with each other through the communication bus 440.
  • the processor 410 can call logical instructions in the memory 430 to execute a multi-source data fusion positioning method, which method includes:
  • the multi-source positioning data includes point cloud data collected using lidar, inertial positioning data collected using inertial sensors and GPS data;
  • the local positioning data and the GPS data are fused to determine the global positioning result of the positioning target.
  • the above-mentioned logical instructions in the memory 430 can be implemented in the form of software functional units and can be stored in a computer-readable storage medium when sold or used as an independent product.
  • the technical solution of the present application is essentially or the part that contributes to the existing technology or the part of the technical solution can be embodied in the form of a software product.
  • the computer software product is stored in a storage medium and includes several instructions used to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of this application.
  • the aforementioned storage media include: U disk, mobile hard disk, read-only memory (ROM, Read-Only Memory), random access memory (RAM, Random Access Memory), magnetic disk or optical disk and other media that can store program code.
  • the present application also provides a computer program product.
  • the computer program product includes a computer program.
  • the computer program can be stored on a non-transitory computer-readable storage medium.
  • when the computer program is executed by a processor, the computer can perform the multi-source data fusion positioning method provided by the above methods, which includes:
  • the multi-source positioning data includes point cloud data collected using lidar, inertial positioning data collected using inertial sensors and GPS data;
  • the local positioning data and the GPS data are fused to determine the global positioning result of the positioning target.
  • the present application also provides a non-transitory computer-readable storage medium on which a computer program is stored.
  • the computer program is implemented when executed by the processor to execute the multi-source data fusion positioning method provided by the above methods.
  • the method includes:
  • the multi-source positioning data includes point cloud data collected using lidar, inertial positioning data collected using inertial sensors and GPS data;
  • the local positioning data and the GPS data are fused to determine the global positioning result of the positioning target.
  • the device embodiments described above are only illustrative.
  • the units described as separate components may or may not be physically separated.
  • the components shown as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of this embodiment. Persons of ordinary skill in the art can understand and implement the method without creative effort.
  • each embodiment can be implemented by software plus a necessary general hardware platform, and of course, it can also be implemented by hardware.
  • the computer software product can be stored in a computer-readable storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes a number of instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the various embodiments or in certain parts of the embodiments.

Abstract

The present application provides a multi-source data fusion positioning method, apparatus, device and computer storage medium. The method includes: collecting multi-source positioning data of a positioning target, the multi-source positioning data including point cloud data collected with a lidar, inertial positioning data collected with an inertial sensor, and GPS data; computing relative pose parameters of the positioning target based on the inertial positioning data, and correcting motion distortion of the point cloud data according to the relative pose parameters to obtain local positioning data; and fusing the local positioning data and the GPS data to determine a global positioning result for the positioning target. By fusing the collected multi-source positioning data, positioning results of different accuracies for the target can be combined, improving positioning accuracy without relying on satellite signals and thereby achieving autonomous, precise positioning of the positioning target.

Description

Multi-source data fusion positioning method, apparatus, device and computer storage medium
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to Chinese Patent Application No. 202210576163.9, filed on May 24, 2022 and entitled "Multi-source data fusion positioning method, apparatus, device and computer storage medium", which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
The present application relates to the technical field of navigation and positioning, and in particular to a multi-source data fusion positioning method, apparatus, device and computer storage medium.
BACKGROUND
At present, outdoor positioning of terminal devices such as drones, unmanned vehicles and robots relies mainly on GPS. For precise positioning, differential methods are often used to improve accuracy, which makes positioning accuracy depend entirely on the strength of the satellite signal. In practical applications, however — for example in areas where the satellite signal is unstable — autonomous precise positioning that does not depend on external signals is required, which existing methods cannot achieve.
SUMMARY
The present application provides a multi-source data fusion positioning method, apparatus, device and computer storage medium, to overcome the defect in the prior art that positioning accuracy depends on satellite signal strength, and to achieve autonomous precise positioning that does not rely on external signals.
The present application provides a multi-source data fusion positioning method, including:
collecting multi-source positioning data of a positioning target, the multi-source positioning data including point cloud data collected with a lidar, inertial positioning data collected with an inertial sensor, and GPS data;
computing relative pose parameters of the positioning target based on the inertial positioning data, and correcting motion distortion of the point cloud data according to the relative pose parameters to obtain local positioning data;
fusing the local positioning data and the GPS data to determine a global positioning result for the positioning target.
According to a multi-source data fusion positioning method provided by the present application, the step of computing the relative pose parameters of the positioning target based on the inertial positioning data includes:
creating a local coordinate system corresponding to the inertial positioning data, and determining a state matrix of the positioning target in the local coordinate system based on the inertial positioning data;
filtering and integrating the inertial positioning data according to the state matrix to obtain motion parameters of the positioning target in the local coordinate system, the motion parameters including rotation angle, acceleration and angular velocity;
computing the relative pose parameters of the positioning target from the motion parameters, where the relative pose parameters include relative velocity and relative displacement.
According to a multi-source data fusion positioning method provided by the present application, the step of correcting the motion distortion of the point cloud data according to the relative pose parameters to obtain local positioning data includes:
transforming the point cloud data into the local coordinate system to determine the local point cloud data corresponding to the point cloud data in the local coordinate system;
in the local coordinate system, performing inter-frame interpolation on the pose results of the positioning target in the local point cloud data based on the relative pose parameters, to determine pose information of the positioning target at the time corresponding to each point in the local point cloud data;
transforming the data points of each frame of the local point cloud data into the local coordinate system at the end time of the frame scan according to the pose information, so as to correct the motion distortion of the point cloud data and obtain local positioning data.
According to a multi-source data fusion positioning method provided by the present application, the step of fusing the local positioning data and the GPS data to determine the global positioning result of the positioning target includes:
performing point cloud matching on the local positioning data to determine a local positioning result of the positioning target;
transforming the GPS data into the local coordinate system, and fusing the local positioning result with the positioning result of the GPS data to obtain a fused positioning result for the positioning target;
transforming the fused positioning result into the global coordinate system corresponding to the GPS data to obtain the global positioning result for the positioning target.
According to a multi-source data fusion positioning method provided by the present application, the step of performing point cloud matching on the local positioning data to determine the local positioning result of the positioning target includes:
determining the relative pose relationship between inter-frame point clouds from the local positioning data;
performing voxel segmentation on the data points in the local positioning data, and computing the mean and covariance matrix of each resulting voxel;
using the relative pose relationship as the initial parameter, and iteratively optimizing the mean and the covariance matrix to obtain the optimal parameter;
superimposing each frame of point cloud data in the local positioning data according to the optimal parameter, so as to match the frames of point cloud data in the local positioning data and obtain the local positioning result of the positioning target in the local coordinate system.
According to a multi-source data fusion positioning method provided by the present application, the step of transforming the GPS data into the local coordinate system includes:
creating a geodetic coordinate system, and transforming each data point in the GPS data into the geodetic coordinate system;
determining an initial data point from the data points of the GPS data in the geodetic coordinate system according to the time corresponding to the first frame of the point cloud data;
transforming each data point in the GPS data from the geodetic coordinate system into the local coordinate system by subtracting the initial data point.
According to a multi-source data fusion positioning method provided by the present application, before the step of transforming the GPS data into the local coordinate system, the method further includes:
removing outlier data points from the GPS data.
The present application also provides a multi-source data fusion positioning apparatus, including:
a multi-source data collection module for collecting multi-source positioning data of a positioning target, where the multi-source data collection module includes a lidar and an inertial sensor, and the multi-source positioning data includes point cloud data collected with the lidar, inertial positioning data collected with the inertial sensor, and GPS data;
a data correction module for computing relative pose parameters of the positioning target based on the inertial positioning data, and correcting motion distortion of the point cloud data according to the relative pose parameters to obtain local positioning data;
a fusion positioning module for fusing the local positioning data and the GPS data to determine a global positioning result for the positioning target.
The present application also provides an electronic device including a memory, a processor and a computer program stored in the memory and executable on the processor, where the processor, when executing the program, implements any of the multi-source data fusion positioning methods described above.
The present application also provides a non-transitory computer-readable storage medium on which a computer program is stored, the computer program, when executed by a processor, implementing any of the multi-source data fusion positioning methods described above.
The present application also provides a computer program product including a computer program which, when executed by a processor, implements any of the multi-source data fusion positioning methods described above.
With the multi-source data fusion positioning method, apparatus, device and computer storage medium provided by the present application, motion distortion of the high-accuracy positioning data of the positioning target is corrected by fusing the collected multi-source data to obtain local positioning data for the target; by fusing the local positioning data and the GPS data to position the target, positioning results of different accuracies can be combined, improving positioning accuracy without relying on satellite signals and thereby achieving autonomous, precise positioning of the positioning target.
BRIEF DESCRIPTION OF THE DRAWINGS
To explain the technical solutions of the present application or the prior art more clearly, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show some embodiments of the present application, and persons of ordinary skill in the art can obtain other drawings from them without creative effort.
Figure 1 is a first schematic flowchart of the multi-source data fusion positioning method provided by the present application;
Figure 2 is a second schematic flowchart of the multi-source data fusion positioning method provided by the present application;
Figure 3 is a schematic structural diagram of the multi-source data fusion positioning apparatus provided by the present application;
Figure 4 is a schematic structural diagram of the electronic device provided by the present application.
DETAILED DESCRIPTION
To make the objectives, technical solutions and advantages of the present application clearer, the technical solutions of the present application are described clearly and completely below with reference to the drawings. Obviously, the described embodiments are some rather than all of the embodiments of the present application. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the scope of protection of the present application.
The multi-source data fusion positioning method of the present application is described below with reference to Figures 1 and 2.
Figure 1 is a first schematic flowchart of the multi-source data fusion positioning method provided by an embodiment of the present application. It should be noted that outdoor positioning of terminal devices such as drones, unmanned vehicles and robots currently relies mainly on GPS, so that positioning accuracy depends almost entirely on the strength of the satellite signal. In practice — for example when a terminal device operates in an area with a weak satellite signal — autonomous precise positioning that does not fully depend on external signals is often required. On this basis, an embodiment of the present application proposes a multi-source data fusion positioning method that fuses data from multiple sensors with GPS data in a complementary way, achieving autonomous precise positioning that does not depend on GPS accuracy. Specifically, referring to Figure 1, the multi-source data fusion positioning method provided by an embodiment of the present application includes:
Step 100: collect multi-source positioning data of a positioning target, the multi-source positioning data including point cloud data collected with a lidar, inertial positioning data collected with an inertial sensor, and GPS data.
First, multi-source positioning data of the positioning target is collected. The positioning target may be a drone, an unmanned vehicle or a robot. In general, the positioning apparatus is mounted on the positioning target, so the positioning target is the carrier of the positioning apparatus; hereinafter the positioning target is referred to as the carrier. A GPS receiver, an inertial sensor (IMU — taking a 9-axis IMU as an example, comprising a gyroscope, an accelerometer and a magnetometer) and a lidar are fixedly mounted on the carrier. The collected multi-source positioning data includes GPS data, point cloud data collected with the lidar, and inertial positioning data (IMU data) collected with the IMU. Data from different devices and sources position the carrier with different accuracies.
Step 200: compute relative pose parameters of the positioning target based on the inertial positioning data, and correct motion distortion of the point cloud data according to the relative pose parameters to obtain local positioning data.
The relative pose parameters of the carrier are computed from the IMU data, and the motion distortion of the lidar point cloud data is corrected according to the computed relative pose parameters, achieving autonomous precise positioning of the carrier that does not depend on satellite signals. Note that one frame of lidar point cloud data is not captured at a single instant; rather, the points accumulated over one full rotation form one frame. When the lidar moves quickly, the collected point cloud exhibits motion distortion and must be corrected by motion compensation. Motion compensation of the lidar point cloud based on the IMU data yields local positioning data, from which a local positioning result for the carrier can be obtained.
Further, step 200 may include:
Step 201: create a local coordinate system corresponding to the inertial positioning data, and determine a state matrix of the positioning target in the local coordinate system based on the inertial positioning data;
Step 202: filter and integrate the inertial positioning data according to the state matrix to obtain motion parameters of the positioning target in the local coordinate system, the motion parameters including rotation angle, acceleration and angular velocity;
Step 203: compute the relative pose parameters of the positioning target from the motion parameters, where the relative pose parameters include relative velocity and relative displacement.
The relative pose parameters of the carrier include relative velocity and relative displacement. To compute the relative pose parameters from the IMU data, a local coordinate system C_I corresponding to the IMU data is first created with the IMU as the reference, and the state matrix X_t of the carrier at a time t is defined as:
X_t = [R_t, T_t, V_t, b_t]  (1)
where R_t is a 3×3 rotation matrix, T_t is a 1×3 translation matrix, V_t is a 1×3 velocity matrix, and b_t is a 1×3 bias error matrix.
The IMU data is filtered and integrated to obtain the motion parameters of the carrier in the IMU coordinate system C_I, including the carrier's rotation angle R, acceleration a and angular velocity ω. The filtering applied to the IMU data may be, without limitation, Kalman filtering. Taking Kalman filtering as an example, after the IMU data has been Kalman-filtered, the translation parameter T_{t+Δt} and velocity parameter V_{t+Δt} of the carrier in the IMU coordinate system C_I at time t+Δt are obtained by integrating the IMU data:
V_{t+Δt} = V_t + R_t(a_t − b_t − n_t)Δt  (2)
T_{t+Δt} = T_t + V_t Δt + ½ R_t(a_t − b_t − n_t)Δt²  (3)
R_{t+Δt} = R_t exp((ω_t − b_t − n_t)Δt)  (4)
where n_t is the white noise at time t. The computed motion parameters allow the carrier's trajectory to be predicted: once the motion parameters of the carrier at time t have been obtained from the collected data, the motion parameters at the future time t+Δt follow, and hence the carrier's position can be predicted. Because the integration is time-consuming and any later correction of the IMU values would require re-integration, in this embodiment the IMU data is preintegrated to avoid repeated computation, expressing all relative velocities Δv_ij and relative displacements ΔT_ij of the carrier between any two times t_i and t_j as:
Δv_ij = Σ_{t=i}^{j−1} R_t(a_t − b_t − n_t)Δt  (5)
ΔT_ij = Σ_{t=i}^{j−1} [V_t Δt + ½ R_t(a_t − b_t − n_t)Δt²]  (6)
By integrating the collected IMU data, the motion parameters of the carrier at a reference time in the IMU data are determined; from the defined relative velocity and relative displacement together with the computed motion parameters, the carrier's relative pose parameters are obtained. Based on these relative pose parameters, the carrier's relative pose at a time after the reference time can be predicted. The relative pose is the carrier's pose at a future time expressed relative to its pose at the reference time: once the carrier's pose at some time is determined, its pose at a future time can be predicted from the relative pose parameters.
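The dead-reckoning step of equations (2)-(4) can be sketched numerically as follows. This is a minimal illustration, not the patent's implementation: the function names are illustrative, the noise term n_t is treated as zero-mean and omitted, and the rotation update uses the standard Rodrigues form of the exponential map.

```python
import numpy as np

def so3_exp(w):
    """Rodrigues formula: rotation matrix for a rotation vector w (axis * angle)."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    k = w / theta
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def integrate_imu(R, T, V, a, omega, b_a, b_g, dt):
    """One integration step per equations (2)-(4): propagate rotation R,
    translation T and velocity V by a single bias-corrected IMU sample."""
    acc = R @ (a - b_a)                       # body acceleration rotated into the local frame
    V_new = V + acc * dt                      # eq. (2)
    T_new = T + V * dt + 0.5 * acc * dt ** 2  # eq. (3)
    R_new = R @ so3_exp((omega - b_g) * dt)   # eq. (4)
    return R_new, T_new, V_new
```

Preintegration as in equations (5)-(6) then amounts to accumulating these per-sample increments between t_i and t_j once, so they can be reused without re-integrating when the state estimate is later corrected.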
Further, step 200 may include:
Step 210: transform the point cloud data into the local coordinate system to determine the local point cloud data corresponding to the point cloud data in the local coordinate system;
Step 220: in the local coordinate system, perform inter-frame interpolation on the pose results of the positioning target in the local point cloud data based on the relative pose parameters, to determine pose information of the positioning target at the time corresponding to each point in the local point cloud data;
Step 230: transform the data points of each frame of the local point cloud data into the local coordinate system at the end time of the frame scan according to the pose information, so as to correct the motion distortion of the point cloud data and obtain local positioning data.
To correct the motion distortion of the lidar point cloud using the computed relative pose parameters, the collected point cloud data is first transformed into the local coordinate system established from the IMU, determining the point in the local coordinate system corresponding to each data point in the collected point cloud. When multi-source data is collected, data from different sources is expressed in its own coordinate system; that is, the positioning of the carrier by data collected through different channels or means may be expressed in different coordinate systems — for example, the IMU data in the created local coordinate system, while the GPS data may be expressed in a latitude-longitude coordinate system (also called the global coordinate system), and so on. To apply motion compensation to the lidar point cloud and thereby correct its motion distortion, the point cloud data must first be expressed in the local coordinate system.
After the point cloud data has been transformed into the local coordinate system, the pose (motion state) of the carrier at each time in the IMU data is obtained by integrating the IMU data. Since the IMU sampling rate is usually higher than the lidar's point cloud frame rate, the point cloud data is interpolated based on the collected IMU data, taking the state at the times of the IMU data as the state at the corresponding times of the point cloud data, thereby motion-compensating the collected point cloud. Note that the IMU is usually sampled at above 100 Hz while the lidar typically collects point cloud frames at 10 Hz, so within each lidar frame, interpolation is performed at the times of the IMU data between point cloud frames, inserting the carrier's position and attitude at the times of the IMU data to obtain the pose information at the time of each point in the point cloud. Using the carrier's pose information at the time of each data point, the data points of each frame are transformed into the local coordinate system at the end time of that frame's scan; the carrier's pose at the end of each frame scan is used to correct the motion distortion of the data points of that frame, completing the motion distortion correction of the collected point cloud and yielding the corresponding local positioning data.
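The per-point de-skewing described above can be sketched as follows. This is a simplified illustration under stated assumptions, not the patent's implementation: only the translation is interpolated (piecewise-linearly, via the higher-rate IMU trajectory), and the rotation change within a single scan is assumed small enough to ignore; all function and variable names are illustrative.

```python
import numpy as np

def deskew_points(points, t_points, t_poses, translations, T_end):
    """Move each lidar point into the coordinate frame at scan-end time.
    points: (N, 3) points in the body frame at their capture times;
    t_points: (N,) per-point timestamps;
    t_poses, translations: carrier trajectory samples (from IMU integration);
    T_end: carrier translation at the end of the scan."""
    # carrier translation at each point's timestamp (linear interpolation per axis)
    T_at_point = np.stack(
        [np.interp(t_points, t_poses, translations[:, k]) for k in range(3)], axis=1
    )
    # world ~ p + T(t); re-expressed in the end-of-scan frame: p_end = p + T(t) - T_end
    return points + (T_at_point - T_end)
```

A full implementation would also interpolate the rotation (e.g. by spherical linear interpolation between integrated orientations) before transforming each point.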
Before step 210, the method may further include:
Step 211: remove outlier data points from the GPS data.
Since GPS data contains outliers, the outlier points must be removed from the collected GPS data so that they do not affect the positioning result. When removing outliers, the outlier points caused by GPS drift are identified according to a constant-velocity motion model and removed.
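A minimal sketch of constant-velocity outlier rejection is shown below. The speed threshold and function name are assumptions of this illustration (the patent does not specify them); a fix is rejected when the speed implied between it and the last accepted fix is physically implausible for the carrier.

```python
import numpy as np

def filter_gps_outliers(positions, times, max_speed):
    """Return indices of GPS fixes consistent with a constant-velocity model.
    A fix is dropped when the speed implied between it and the last kept fix
    exceeds max_speed (threshold chosen per platform)."""
    kept = [0]  # first fix is taken as trusted
    for i in range(1, len(positions)):
        j = kept[-1]
        dt = times[i] - times[j]
        speed = np.linalg.norm(positions[i] - positions[j]) / dt
        if speed <= max_speed:
            kept.append(i)
    return kept
```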
Step 300: fuse the local positioning data and the GPS data to determine a global positioning result for the positioning target.
The local positioning data obtained through motion distortion correction is fused with the GPS data to obtain a global positioning result for the carrier; fusing the local positioning data and the GPS data includes transforming them into the same coordinate system. It should be understood that GPS and lidar have different positioning accuracies: GPS typically achieves meter-level positioning while lidar achieves centimeter-level positioning. Fusing the lidar and GPS positioning results achieves centimeter-level positioning that does not depend on satellite signals, improving the positioning accuracy of the carrier and achieving autonomous precise positioning of the carrier.
In this embodiment, by fusing the collected multi-source positioning data, motion distortion correction is applied to the high-accuracy positioning data of the positioning target to obtain local positioning data for the target; by fusing the local positioning data and the GPS data to position the target, positioning results of different accuracies can be combined, improving positioning accuracy without relying on satellite signals and thereby achieving autonomous precise positioning of the positioning target.
In one embodiment, step 300 may further include:
Step 301: perform point cloud matching on the local positioning data to determine a local positioning result of the positioning target;
Step 302: transform the GPS data into the local coordinate system, and fuse the local positioning result with the positioning result of the GPS data to obtain a fused positioning result for the positioning target;
Step 303: transform the fused positioning result into the global coordinate system corresponding to the GPS data to obtain the global positioning result for the positioning target; the positioning accuracy of the local positioning result is higher than that of the GPS data.
After motion compensation of the collected point cloud data has corrected its motion distortion, each frame of point cloud data has been transformed into the created local coordinate system. Point cloud matching is performed on the local positioning data in the local coordinate system to determine the local positioning result for the carrier. The GPS data is likewise transformed into the local coordinate system, the positioning result of the GPS data is fused with the local positioning result, and the fused positioning result is transformed into the global coordinate system corresponding to the GPS data, yielding the global positioning result for the carrier.
Further, step 301 may include:
Step 311: determine the relative pose relationship between inter-frame point clouds from the local positioning data;
Step 321: perform voxel segmentation on the data points in the local positioning data, and compute the mean and covariance matrix of each resulting voxel;
Step 331: use the relative pose relationship as the initial parameter, and iteratively optimize the mean and the covariance matrix to obtain the optimal parameter;
Step 341: superimpose each frame of point cloud data in the local positioning data according to the optimal parameter, so as to match the frames of point cloud data in the local positioning data and obtain the local positioning result of the positioning target in the local coordinate system.
After the motion distortion of the lidar point cloud has been corrected, each lidar frame has been transformed into the local coordinate system, and the relative translation matrix ΔT and relative rotation matrix ΔR between two point cloud frames can be obtained. However, these come from integrating the IMU data and contain some error; eliminating the integration error through point cloud matching yields the local positioning result for the carrier.
Specifically, the NDT (Normal Distributions Transform) method is used for point cloud matching: the relative pose relationship of the carrier between different frames of point cloud data is determined from the local positioning data. Since correcting the motion distortion transformed the point cloud into the local coordinate system and additionally transformed the data points of each frame into the local coordinate system at the end time of that frame's scan, the relative pose relationship of the carrier between inter-frame point clouds can be computed; this relative pose relationship can be expressed by a translation matrix and a rotation matrix.
The relative pose relationship of the carrier under the currently collected point cloud data is determined from the local positioning data and used as the starting parameter [R, T]. Voxel segmentation is performed on the points of the point cloud data in the local coordinate system space, and the mean μ and covariance matrix Σ of each resulting voxel are computed:
μ = (1/n) Σ_{i=1}^{n} x_i  (7)
Σ = (1/n) Σ_{i=1}^{n} (x_i − μ)(x_i − μ)^T  (8)
where μ is the mean, x_i is each point in the voxel, n is the number of points, and Σ is the covariance matrix. Starting from the initial parameter, the mean μ and covariance matrix Σ between two point cloud frames are iteratively optimized to obtain the optimal parameter [R_o, T_o]; the later frame is transformed into the corresponding coordinate system according to the optimal parameter and the point clouds are superimposed, completing the matching of one frame. By matching each frame of the point cloud, the translation and rotation of the carrier relative to the local coordinate system at the time of each frame of point cloud data are obtained, denoted [R_i, T_i] — the local positioning result of the carrier in the local coordinate system.
Further, step 302 may include:
Step 312: create a geodetic coordinate system, and transform each data point in the GPS data into the geodetic coordinate system;
Step 322: determine an initial data point from the data points of the GPS data in the geodetic coordinate system according to the time corresponding to the first frame of the point cloud data;
Step 332: transform each data point in the GPS data from the geodetic coordinate system into the local coordinate system by subtracting the initial data point.
To transform the GPS data into the local coordinate system, a geodetic coordinate system is first created and aligned with the GPS coordinate system. In general, GPS uses a local tangent-plane (ENU, east-north-up) coordinate system; to transform points from the ENU coordinate system into the local coordinate system, the coordinate systems are aligned through the geodetic coordinate system. After the geodetic coordinate system has been created, each data point in the GPS data is transformed into it. Specifically, based on the ENU coordinate system used by GPS, the geodetic latitude B is the angle between the ellipsoid normal through the ground point and the ellipsoid's equatorial plane, the geodetic longitude L is the angle between the ellipsoid meridian plane through the ground point and the prime meridian plane, and the geodetic height H is the distance from the ground point to the ellipsoid surface along the ellipsoid normal. The GPS data point closest in time to the first frame of IMU data is taken as the initial data point and transformed into the geodetic coordinate system, denoted G_0 = [B_o, L_o, H_o]. With the ellipsoid center O as the coordinate origin, the Z axis coinciding with the minor (rotation) axis of the reference ellipsoid, the X axis coinciding with the intersection of the prime meridian plane and the equator, and the Y axis perpendicular to the X axis in the equatorial plane, a right-handed rectangular coordinate system O-XYZ is formed:
X_o = (N + H_o) cos B_o cos L_o  (9)
Y_o = (N + H_o) cos B_o sin L_o  (10)
Z_o = (N(1 − e²) + H_o) sin B_o  (11)
where e is the first eccentricity of the ellipsoid of the ENU coordinate system, obtained from the semi-major and semi-minor axes:
e = sqrt(a² − b²) / a  (12)
and N is the radius of curvature in the prime vertical of the ellipsoid, obtained from:
N = a / sqrt(1 − e² sin² B_o)  (13)
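The forward conversion of equations (9)-(13) can be sketched as follows. The function name is illustrative and the WGS-84 semi-axes are assumed as defaults (the patent does not name a specific ellipsoid); angles are in radians.

```python
import math

def blh_to_ecef(B, L, H, a=6378137.0, b=6356752.314245):
    """Geodetic latitude B, longitude L (radians) and height H to the
    Earth-fixed rectangular frame of equations (9)-(11)."""
    e2 = (a * a - b * b) / (a * a)                  # first eccentricity squared, eq. (12)
    N = a / math.sqrt(1.0 - e2 * math.sin(B) ** 2)  # prime-vertical radius, eq. (13)
    X = (N + H) * math.cos(B) * math.cos(L)         # eq. (9)
    Y = (N + H) * math.cos(B) * math.sin(L)         # eq. (10)
    Z = (N * (1.0 - e2) + H) * math.sin(B)          # eq. (11)
    return X, Y, Z
```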
After the initial data point has been transformed into the geodetic coordinate system, every outlier-filtered GPS data point undergoes the same transformation into the geodetic coordinate system, and is then transformed into the local coordinate system by subtracting the initial data point. The GPS data is corrected with the Kalman-filtered and integrated IMU data, and the positioning result of the GPS data is fused with the local positioning result, achieving global positioning of the carrier.
Referring to Figure 2, another schematic flowchart of the multi-source data fusion positioning method provided by an embodiment of the present application: using the processed IMU data and the corrected GPS data, the position of the carrier can be predicted and the carrier thereby positioned. When the carrier only rotates, pose prediction is based on integrating the IMU data; when the carrier translates, pose prediction fuses the GPS data and the IMU data. Based on the pose prediction, after the carrier translates and/or rotates, its pose is updated from the prediction result, and the updated pose is applied to subsequent predictions through preintegration.
Further, after the local positioning based on the lidar point cloud and the coordinate transformation based on GPS have been completed, the centimeter-level lidar positioning result and the meter-level GPS positioning result are fused in the local coordinate system. The resulting fused positioning result can be transformed into the global coordinate system corresponding to the GPS data according to the GPS initial data point G_0 = [X_o, Y_o, Z_o]. The global coordinate system may be the geodetic coordinate system or the commonly used latitude-longitude coordinate system. Taking the latitude-longitude coordinate system as an example, the fused result is transformed into the geodetic coordinate system and denoted G_t = [X_t, Y_t, Z_t], and finally transformed into the latitude-longitude coordinate system according to the following formulas, achieving centimeter-level high-precision global positioning of the carrier that does not depend on GPS accuracy:
L_t = tan⁻¹(Y_t / X_t)  (14)
B_t = tan⁻¹((Z_t + e² N sin B_t) / sqrt(X_t² + Y_t²))  (15)
H_t = sqrt(X_t² + Y_t²) / cos B_t − N  (16)
where equation (15) is solved iteratively, since N and sin B_t themselves depend on the latitude B_t.
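The inverse conversion of equations (14)-(16) can be sketched as follows. As with the forward conversion, the function name and WGS-84 defaults are assumptions of this sketch; the latitude is obtained by the usual fixed-point iteration on equation (15).

```python
import math

def ecef_to_blh(X, Y, Z, a=6378137.0, b=6356752.314245):
    """Earth-fixed rectangular coordinates back to geodetic latitude B,
    longitude L (radians) and height H, per equations (14)-(16)."""
    e2 = (a * a - b * b) / (a * a)
    L = math.atan2(Y, X)                            # eq. (14)
    p = math.hypot(X, Y)
    B = math.atan2(Z, p * (1.0 - e2))               # initial latitude guess
    for _ in range(10):                             # fixed-point iteration for eq. (15)
        N = a / math.sqrt(1.0 - e2 * math.sin(B) ** 2)
        B = math.atan2(Z + e2 * N * math.sin(B), p)
    H = p / math.cos(B) - N                         # eq. (16); ill-conditioned near the poles
    return B, L, H
```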
In this embodiment, the optimal parameters for point cloud matching are determined through voxel segmentation, eliminating the integration error of the IMU data; at the same time, coordinate alignment of the GPS data achieves the fusion of the local positioning result and the GPS positioning result, improving the positioning accuracy of the positioning target.
The multi-source data fusion positioning apparatus provided by the present application is described below; the apparatus described below and the multi-source data fusion positioning method described above may be referred to in correspondence with each other.
Referring to Figure 3, the multi-source data fusion positioning apparatus provided by an embodiment of the present application includes:
a multi-source data collection module 10 for collecting multi-source positioning data of a positioning target, where the multi-source data collection module includes a lidar and an inertial sensor, and the multi-source positioning data includes point cloud data collected with the lidar, inertial positioning data collected with the inertial sensor, and GPS data;
a data correction module 20 for computing relative pose parameters of the positioning target based on the inertial positioning data, and correcting motion distortion of the point cloud data according to the relative pose parameters to obtain local positioning data;
a fusion positioning module 30 for fusing the local positioning data and the GPS data to determine a global positioning result for the positioning target.
In one embodiment, the data correction module 20 is further configured to:
create a local coordinate system corresponding to the inertial positioning data, and determine a state matrix of the positioning target in the local coordinate system based on the inertial positioning data;
filter and integrate the inertial positioning data according to the state matrix to obtain motion parameters of the positioning target in the local coordinate system, the motion parameters including rotation angle, acceleration and angular velocity;
compute the relative pose parameters of the positioning target from the motion parameters, where the relative pose parameters include relative velocity and relative displacement.
In one embodiment, the data correction module 20 is further configured to:
transform the point cloud data into the local coordinate system to determine the local point cloud data corresponding to the point cloud data in the local coordinate system;
in the local coordinate system, perform inter-frame interpolation on the pose results of the positioning target in the local point cloud data based on the relative pose parameters, to determine pose information of the positioning target at the time corresponding to each point in the local point cloud data;
transform the data points of each frame of the local point cloud data into the local coordinate system at the end time of the frame scan according to the pose information, so as to correct the motion distortion of the point cloud data and obtain local positioning data.
In one embodiment, the fusion positioning module 30 is further configured to:
perform point cloud matching on the local positioning data to determine the local positioning result of the positioning target;
transform the GPS data into the local coordinate system, and fuse the local positioning result with the positioning result of the GPS data to obtain a fused positioning result for the positioning target;
transform the fused positioning result into the global coordinate system corresponding to the GPS data to obtain the global positioning result for the positioning target.
In one embodiment, the fusion positioning module 30 is further configured to:
determine the relative pose relationship between inter-frame point clouds from the local positioning data;
perform voxel segmentation on the data points in the local positioning data, and compute the mean and covariance matrix of each resulting voxel;
use the relative pose relationship as the initial parameter, and iteratively optimize the mean and the covariance matrix to obtain the optimal parameter;
superimpose each frame of point cloud data in the local positioning data according to the optimal parameter, so as to match the frames of point cloud data in the local positioning data and obtain the local positioning result of the positioning target in the local coordinate system.
In one embodiment, the fusion positioning module 30 is further configured to:
create a geodetic coordinate system, and transform each data point in the GPS data into the geodetic coordinate system;
determine an initial data point from the data points of the GPS data in the geodetic coordinate system according to the time corresponding to the first frame of the point cloud data;
transform each data point in the GPS data from the geodetic coordinate system into the local coordinate system by subtracting the initial data point.
In one embodiment, the multi-source data fusion positioning apparatus further includes an outlier data filtering module configured to:
remove outlier data points from the GPS data.
Figure 4 illustrates a schematic diagram of the physical structure of an electronic device. As shown in Figure 4, the electronic device may include: a processor 410, a communications interface 420, a memory 430 and a communication bus 440, where the processor 410, the communications interface 420 and the memory 430 communicate with one another through the communication bus 440. The processor 410 can call logical instructions in the memory 430 to execute the multi-source data fusion positioning method, which includes:
collecting multi-source positioning data of a positioning target, the multi-source positioning data including point cloud data collected with a lidar, inertial positioning data collected with an inertial sensor, and GPS data;
computing relative pose parameters of the positioning target based on the inertial positioning data, and correcting motion distortion of the point cloud data according to the relative pose parameters to obtain local positioning data;
fusing the local positioning data and the GPS data to determine a global positioning result for the positioning target.
In addition, the above logical instructions in the memory 430 may be implemented in the form of software functional units and, when sold or used as an independent product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present application. The aforementioned storage media include: USB flash drive, removable hard disk, read-only memory (ROM), random access memory (RAM), magnetic disk, optical disk, and other media that can store program code.
In another aspect, the present application also provides a computer program product. The computer program product includes a computer program, which can be stored on a non-transitory computer-readable storage medium; when the computer program is executed by a processor, the computer can perform the multi-source data fusion positioning method provided by the above methods, which includes:
collecting multi-source positioning data of a positioning target, the multi-source positioning data including point cloud data collected with a lidar, inertial positioning data collected with an inertial sensor, and GPS data;
computing relative pose parameters of the positioning target based on the inertial positioning data, and correcting motion distortion of the point cloud data according to the relative pose parameters to obtain local positioning data;
fusing the local positioning data and the GPS data to determine a global positioning result for the positioning target.
In yet another aspect, the present application also provides a non-transitory computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the multi-source data fusion positioning method provided by the above methods, which includes:
collecting multi-source positioning data of a positioning target, the multi-source positioning data including point cloud data collected with a lidar, inertial positioning data collected with an inertial sensor, and GPS data;
computing relative pose parameters of the positioning target based on the inertial positioning data, and correcting motion distortion of the point cloud data according to the relative pose parameters to obtain local positioning data;
fusing the local positioning data and the GPS data to determine a global positioning result for the positioning target.
The apparatus embodiments described above are only illustrative. The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of this embodiment. Persons of ordinary skill in the art can understand and implement this without creative effort.
Through the description of the above embodiments, those skilled in the art can clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and of course also by hardware. Based on this understanding, the above technical solution, in essence, or the part contributing to the prior art, may be embodied in the form of a software product. The computer software product can be stored in a computer-readable storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes a number of instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the various embodiments or in certain parts of the embodiments.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, persons of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments may still be modified, or some of the technical features may be replaced with equivalents; such modifications and replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (10)

  1. A multi-source data fusion positioning method, comprising the following steps:
    collecting multi-source positioning data of a positioning target, the multi-source positioning data comprising point cloud data collected with a lidar, inertial positioning data collected with an inertial sensor, and GPS data;
    computing relative pose parameters of the positioning target based on the inertial positioning data, and correcting motion distortion of the point cloud data according to the relative pose parameters to obtain local positioning data;
    fusing the local positioning data and the GPS data to determine a global positioning result for the positioning target.
  2. The multi-source data fusion positioning method according to claim 1, wherein the step of computing the relative pose parameters of the positioning target based on the inertial positioning data comprises:
    creating a local coordinate system corresponding to the inertial positioning data, and determining a state matrix of the positioning target in the local coordinate system based on the inertial positioning data;
    filtering and integrating the inertial positioning data according to the state matrix to obtain motion parameters of the positioning target in the local coordinate system, the motion parameters comprising rotation angle, acceleration and angular velocity;
    computing the relative pose parameters of the positioning target from the motion parameters, wherein the relative pose parameters comprise relative velocity and relative displacement.
  3. The multi-source data fusion positioning method according to claim 2, wherein the step of correcting the motion distortion of the point cloud data according to the relative pose parameters to obtain local positioning data comprises:
    transforming the point cloud data into the local coordinate system to determine the local point cloud data corresponding to the point cloud data in the local coordinate system;
    in the local coordinate system, performing inter-frame interpolation on the pose results of the positioning target in the local point cloud data based on the relative pose parameters, to determine pose information of the positioning target at the time corresponding to each point in the local point cloud data;
    transforming the data points of each frame of the local point cloud data into the local coordinate system at the end time of the frame scan according to the pose information, so as to correct the motion distortion of the point cloud data and obtain local positioning data.
  4. The multi-source data fusion positioning method according to claim 2, wherein the step of fusing the local positioning data and the GPS data to determine the global positioning result of the positioning target comprises:
    performing point cloud matching on the local positioning data to determine a local positioning result of the positioning target;
    transforming the GPS data into the local coordinate system, and fusing the local positioning result with the positioning result of the GPS data to obtain a fused positioning result for the positioning target;
    transforming the fused positioning result into the global coordinate system corresponding to the GPS data to obtain the global positioning result for the positioning target.
  5. The multi-source data fusion positioning method according to claim 4, wherein the step of performing point cloud matching on the local positioning data to determine the local positioning result of the positioning target comprises:
    determining the relative pose relationship between inter-frame point clouds from the local positioning data;
    performing voxel segmentation on the data points in the local positioning data, and computing the mean and covariance matrix of each resulting voxel;
    using the relative pose relationship as the initial parameter, and iteratively optimizing the mean and the covariance matrix to obtain the optimal parameter;
    superimposing each frame of point cloud data in the local positioning data according to the optimal parameter, so as to match the frames of point cloud data in the local positioning data and obtain the local positioning result of the positioning target in the local coordinate system.
  6. The multi-source data fusion positioning method according to claim 4, wherein the step of transforming the GPS data into the local coordinate system comprises:
    creating a geodetic coordinate system, and transforming each data point in the GPS data into the geodetic coordinate system;
    determining an initial data point from the data points of the GPS data in the geodetic coordinate system according to the time corresponding to the first frame of the point cloud data;
    transforming each data point in the GPS data from the geodetic coordinate system into the local coordinate system by subtracting the initial data point.
  7. The multi-source data fusion positioning method according to claim 3, wherein before the step of transforming the GPS data into the local coordinate system, the method further comprises:
    removing outlier data points from the GPS data.
  8. A multi-source data fusion positioning apparatus, comprising:
    a multi-source data collection module for collecting multi-source positioning data of a positioning target, wherein the multi-source data collection module comprises a lidar and an inertial sensor, and the multi-source positioning data comprises point cloud data collected with the lidar, inertial positioning data collected with the inertial sensor, and GPS data;
    a data correction module for computing relative pose parameters of the positioning target based on the inertial positioning data, and correcting motion distortion of the point cloud data according to the relative pose parameters to obtain local positioning data;
    a fusion positioning module for fusing the local positioning data and the GPS data to determine a global positioning result for the positioning target.
  9. An electronic device, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the multi-source data fusion positioning method according to any one of claims 1 to 7.
  10. A non-transitory computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the multi-source data fusion positioning method according to any one of claims 1 to 7.
PCT/CN2022/103369 2022-05-24 2022-07-01 Multi-source data fusion positioning method, apparatus, device and computer storage medium WO2023226155A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210576163.9A CN115236714A (zh) 2022-05-24 2022-05-24 Multi-source data fusion positioning method, apparatus, device and computer storage medium
CN202210576163.9 2022-05-24

Publications (1)

Publication Number Publication Date
WO2023226155A1 true WO2023226155A1 (zh) 2023-11-30

Family

ID=83668316

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/103369 WO2023226155A1 (zh) Multi-source data fusion positioning method, apparatus, device and computer storage medium

Country Status (2)

Country Link
CN (1) CN115236714A (zh)
WO (1) WO2023226155A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117609750A (zh) * 2024-01-19 2024-02-27 中国电子科技集团公司第五十四研究所 Method for calculating a target recognition rate interval based on electric digital data processing technology

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109116397A (zh) * 2018-07-25 2019-01-01 吉林大学 Vehicle-mounted multi-camera visual positioning method, apparatus, device and storage medium
CN110686704A (zh) * 2019-10-18 2020-01-14 深圳市镭神智能系统有限公司 Pose calibration method, system and medium for lidar and integrated inertial navigation
CN111077907A (zh) * 2019-12-30 2020-04-28 哈尔滨理工大学 Autonomous positioning method for outdoor unmanned aerial vehicles
CN112967392A (zh) * 2021-03-05 2021-06-15 武汉理工大学 Large-scale campus mapping and positioning method based on multi-sensor fusion
CN113466890A (zh) * 2021-05-28 2021-10-01 中国科学院计算技术研究所 Lightweight lidar-inertial integrated positioning method and system based on key feature extraction
CN113587930A (zh) * 2021-10-08 2021-11-02 广东省科学院智能制造研究所 Indoor and outdoor navigation method and apparatus for autonomous mobile robots based on multi-sensor fusion
US20220138896A1 (en) * 2019-07-12 2022-05-05 Beijing Voyager Technology Co., Ltd. Systems and methods for positioning


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117609750A (zh) * 2024-01-19 2024-02-27 中国电子科技集团公司第五十四研究所 Method for calculating a target recognition rate interval based on electric digital data processing technology
CN117609750B (zh) * 2024-01-19 2024-04-09 中国电子科技集团公司第五十四研究所 Method for calculating a target recognition rate interval based on electric digital data processing technology

Also Published As

Publication number Publication date
CN115236714A (zh) 2022-10-25

Similar Documents

Publication Publication Date Title
CN108759833B (zh) Intelligent vehicle positioning method based on a prior map
CN110412635B (zh) GNSS/SINS/vision tightly-coupled integration method supported by environmental beacons
EP3454008B1 (en) Survey data processing device, survey data processing method, and survey data processing program
CN113406682B (zh) Positioning method and apparatus, electronic device and storage medium
US8213706B2 (en) Method and system for real-time visual odometry
WO2021143286A1 (zh) Vehicle positioning method and apparatus, controller, intelligent vehicle and system
CN108981687B (zh) Indoor positioning method fusing vision and inertia
CN113551665B (zh) High-dynamic motion state sensing system and sensing method for a moving carrier
CN111829532B (zh) Aircraft repositioning system and repositioning method
KR102239562B1 (ko) Fusion system between airborne observation data and ground observation data
CN108627152B (zh) Navigation method for micro unmanned aerial vehicles based on multi-sensor data fusion
US20220398825A1 (en) Method for generating 3d reference points in a map of a scene
CN112946681B (zh) Lidar positioning method fusing integrated navigation information
CN108444468B (zh) Heading compass fusing downward-looking vision and inertial navigation information
WO2020198963A1 (zh) Data processing method and apparatus for a photographing device, and image processing device
WO2023226155A1 (zh) Multi-source data fusion positioning method, apparatus, device and computer storage medium
CN109916417B (zh) Map building method and apparatus, computer device and storage medium
CN108921896B (zh) Downward-looking visual compass fusing point and line features
CN113129377B (zh) Fast and robust three-dimensional lidar SLAM method and apparatus
CN112197765B (zh) Method for achieving fine navigation of underwater robots
CN114022561A (zh) Urban-area monocular mapping method and system based on GPS constraints and dynamic correction
CN114897942B (zh) Point cloud map generation method, device and related storage medium
CN109459046B (zh) Positioning and navigation method for a suspended underwater autonomous vehicle
WO2023226154A1 (zh) Autonomous positioning method, apparatus, device and computer-readable storage medium
CN115775242A (zh) Matching-based point cloud map quality evaluation method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22943346

Country of ref document: EP

Kind code of ref document: A1