CN116047565A - Multi-sensor data fusion positioning system - Google Patents

Multi-sensor data fusion positioning system

Info

Publication number
CN116047565A
CN116047565A (application CN202211582224.9A)
Authority
CN
China
Prior art keywords
module
positioning
vehicle
point cloud
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211582224.9A
Other languages
Chinese (zh)
Inventor
柯希俊
施亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SAIC Volkswagen Automotive Co Ltd
Original Assignee
SAIC Volkswagen Automotive Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SAIC Volkswagen Automotive Co Ltd filed Critical SAIC Volkswagen Automotive Co Ltd
Priority to CN202211582224.9A
Publication of CN116047565A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/20 Instruments for performing navigational calculations
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/06 Systems determining position data of a target
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/931 Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G01S19/41 Differential correction, e.g. DGPS [differential GPS]
    • G01S19/43 Determining position using carrier phase measurements, e.g. kinematic positioning; using long or short baseline interferometry
    • G01S19/46 Determining position by combining satellite measurements with a supplementary measurement of a radio-wave signal type
    • G01S19/47 Determining position by combining satellite measurements with a supplementary inertial measurement, e.g. tightly coupled inertial
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/05 Three dimensional [3D] modelling: geographic models
    • G06T19/003 Navigation within 3D models or images
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/277 Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T2207/10028 Range image; depth image; 3D point clouds
    • G06T2207/30256 Lane; road marking
    • Y02T10/40 Engine management systems (climate change mitigation technologies related to road transport)

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a multi-sensor data fusion positioning system. The system comprises: a combined positioning subunit, which includes a GNSS module, an RTK service module and an IMU module; a two-dimensional state error Kalman filtering unit, which includes a high-precision map module, a visual lane line detection module, a lane line matching module, a wheel speed odometry module and a first state error Kalman filtering module; a laser radar point cloud matching positioning unit, which includes a point cloud module, a point cloud map module, a positioning initialization module and an NDT matching positioning module; and a three-dimensional state error Kalman filtering unit, which includes a Kalman filter initialization module, an IMU pose state quantity solving module, a judging module, a second state error Kalman filtering module and a posterior pose module. The multi-sensor data fusion positioning system provided by the invention strengthens the robustness of global positioning and adapts to a wider range of scenes.

Description

Multi-sensor data fusion positioning system
Technical Field
The invention relates to the field of intelligent driving for intelligent connected vehicles, and in particular to a multi-sensor data fusion positioning system.
Background
For advanced autonomous driving systems (e.g. level L4), the positioning module is one of the most central modules of autonomous driving. Positioning comprises global positioning and local positioning. For autonomous driving, positioning accuracy must reach the centimeter level, and the positioning system must be highly robust: even when any sensor's data degrade, it should still output accurate real-time positioning results. The positioning module usually fuses the measurements of several sensors, such as GNSS, IMU, wheel speed odometry (dead reckoning output from the vehicle chassis encoders), visual detection information, visual odometry and laser radar odometry (LiDAR odometry), and applies a filtering algorithm (ESKF, EKF, UKF, etc.) to obtain smooth, centimeter-level absolute positioning. Among these, registration-based positioning using a point cloud map and a laser radar (lidar odometry) offers high precision, strong scene adaptability, and good robustness and reliability; it usually carries a large weight in the overall fusion positioning and is a comparatively reliable "absolute positioning" data source in an autonomous driving positioning system.
Global positioning acquires the pose of the vehicle in the earth's global world coordinates by means of a global navigation satellite system (GNSS), and therefore necessarily depends on GNSS. However, in the complex scenarios where autonomous driving is deployed, such as urban canyons, overpasses, tunnels and tree-lined boulevards, GNSS signals are inevitably weak, intermittent or entirely absent. Consequently, an autonomous driving positioning system that adopts only a GNSS-dependent global positioning scheme lacks robust positioning capability in complex scenarios. This is the most critical problem for an all-weather autonomous driving system.
Vehicle-side local positioning obtains the pose of the vehicle in a local scene (such as indoors or in a campus) or in a local map coordinate system. Conventional autonomous mobile robots (such as AGVs) often adopt local positioning to accomplish tasks such as path planning in a given scene, typically using simultaneous localization and mapping (SLAM, Simultaneous Localization and Mapping) for online positioning. The advantage of this approach is that, using relatively low-cost sensors and no GNSS signal, an advanced multi-sensor data fusion algorithm can still produce local positioning of considerable accuracy. However, it suffers from growing accumulated odometry error in large scenes, positioning latency, and similar problems, and is hard to apply directly to complex autonomous driving scenarios: autonomous vehicles usually travel faster, cover longer distances, and operate in very complex scenes, so a map constructed in real time deviates excessively, the odometry error output by the positioning algorithm accumulates, and positioning accuracy degrades.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a multi-sensor data fusion positioning system that strengthens the robustness of global positioning and adapts to a wider range of scenes.
Specifically, the invention provides a multi-sensor data fusion positioning system, suitable for a vehicle, comprising:
a combined positioning subunit, comprising a GNSS module, an RTK service module and an IMU module, wherein the GNSS module is used for acquiring GNSS data, the RTK service module is used for providing a real-time kinematic (RTK) differential correction service, and the IMU module is used for acquiring IMU data of the vehicle;
a two-dimensional state error Kalman filtering unit, comprising a high-precision map module, a visual lane line detection module, a lane line matching module, a wheel speed odometry module and a first state error Kalman filtering module, wherein the high-precision map module is used for acquiring the perpendicular-foot point coordinates of the lane lines of the lane in which the vehicle is located based on a high-precision map and the GNSS data, the visual lane line detection module is used for acquiring the nearest-point and farthest-point coordinates of those lane lines through a visual camera of the vehicle, the lane line matching module is used for acquiring the lateral update quantity of the vehicle based on the outputs of the high-precision map module and the visual lane line detection module, the wheel speed odometry module is used for acquiring the longitudinal update quantity of the vehicle, and the first state error Kalman filtering module is used for fusing the lateral update quantity, the longitudinal update quantity, the GNSS data and the IMU data to obtain a two-dimensional positioning result of the vehicle;
a laser radar point cloud matching positioning unit, comprising a point cloud module, a point cloud map module, a positioning initialization module and an NDT matching positioning module, wherein the point cloud module is used for acquiring real-time point cloud data through the laser radar of the vehicle, the point cloud map module is used for acquiring an offline-constructed point cloud map, the positioning initialization module is used for acquiring the global coordinates of the vehicle, and the NDT matching positioning module is used for acquiring the matched positioning pose of the vehicle based on the real-time point cloud data and the point cloud map; and
a three-dimensional state error Kalman filtering unit, comprising a Kalman filter initialization module, an IMU pose state quantity solving module, a judging module, a second state error Kalman filtering module and a posterior pose module, wherein the Kalman filter initialization module acquires an initial global pose state quantity based on the global coordinates, the IMU pose state quantity solving module acquires an estimated state quantity based on the initial global pose state quantity and the IMU data, and the judging module judges whether the laser radar point cloud matching positioning unit outputs a valid matched positioning pose: if so, the second state error Kalman filtering module updates the estimated state quantity based on the matched positioning pose, and the posterior pose module calculates a three-dimensional positioning result of the vehicle from the updated estimated state quantity; if not, the IMU pose state quantity solving module directly outputs the three-dimensional positioning result based on the estimated state quantity.
If the GNSS module receives GNSS data normally and the RTK service module can provide the RTK differential correction service, the multi-sensor data fusion positioning system outputs, through the combined positioning subunit, a positioning result obtained by fusing the GNSS data and the IMU data.
If the GNSS module receives GNSS data normally but the RTK service module cannot provide the RTK differential correction service, the multi-sensor data fusion positioning system acquires the two-dimensional positioning result of the vehicle through the two-dimensional state error Kalman filtering unit.
If the GNSS module cannot receive complete GNSS data, the multi-sensor data fusion positioning system acquires the three-dimensional positioning result of the vehicle through the laser radar point cloud matching positioning unit and the three-dimensional state error Kalman filtering unit.
According to one embodiment of the invention, the GNSS data acquired by the first state error Kalman filtering module include the longitude and latitude of the vehicle; the module derives the lateral and longitudinal state quantities of the vehicle from the longitude and latitude, and updates and corrects them with the lateral and longitudinal update quantities. The first state error Kalman filtering module obtains the heading angle of the vehicle by fusing the GNSS data and the IMU data.
According to one embodiment of the invention, in the two-dimensional state error Kalman filtering unit, the high-precision map module acquires the perpendicular-foot point coordinates of the left and right lane lines closest to the vehicle based on the high-precision map and the GNSS data, the visual lane line detection module acquires, through the visual camera of the vehicle, the nearest-point and farthest-point coordinates of the closest left and right lane lines ahead of the camera, and the lane line matching module derives the lateral update quantity of the vehicle from the outputs of the high-precision map module and the visual lane line detection module.
According to one embodiment of the present invention, the positioning initialization module obtains the global coordinates of the vehicle by:
transforming the GNSS data into global coordinates to initialize positioning, then continuously updating the global coordinates from the GNSS data;
and/or computing the pose from a short-interval numerical integration of the IMU data to obtain global coordinates and initialize positioning, then continuously updating the global coordinates from the IMU data.
According to one embodiment of the invention, in the point cloud module of the laser radar point cloud matching positioning unit, point cloud data of the vehicle are acquired through the laser radar of the vehicle and passed through voxel grid downsampling filtering to form the real-time point cloud data; the point cloud map is generated offline from data acquired by the vehicle's multiple sensors through an optimization-based fusion mapping algorithm.
According to one embodiment of the invention, the voxel grid downsampling filtering comprises the following step: using the VoxelGrid class of the open-source point cloud library, a set of three-dimensional voxel grids is created over the point cloud data acquired by the laser radar, and within each three-dimensional voxel grid the centroid of all contained points is used to represent them.
According to one embodiment of the invention, the NDT matching positioning module uses an NDT matching algorithm to match the real-time point cloud data against the point cloud map; in the NDT algorithm's parameter configuration, the matching voxel leaf size is set to 3, the iteration count to 30, the resolution to 1.0, the step size to 0.1, and the minimum tolerance to 0.01.
According to one embodiment of the invention, if the NDT matching algorithm fails to match, the NDT matching positioning module is re-initialized either with the current global coordinates output by the positioning initialization module or with an intermediate result obtained by the NDT matching algorithm.
According to one embodiment of the invention, in the three-dimensional state error Kalman filtering unit, the Kalman filter initialization module obtains the initial global pose state quantity through rolling iteration over the global coordinates.
According to one embodiment of the invention, the IMU pose state quantity solving module selects a time interval and performs median (midpoint) numerical integration of the IMU data from the initial global pose state quantity to obtain the estimated state quantity.
In the multi-sensor data fusion positioning system provided by the invention, a laser radar point cloud matching positioning unit is introduced to strengthen the robustness and scene adaptability of global positioning. By combining on-vehicle multi-sensor global positioning with vehicle-side local positioning, the system balances the real-time performance and robustness required of the overall positioning system and adapts to a wider range of scenes.
It is to be understood that both the foregoing general description and the following detailed description of the present invention are exemplary and explanatory and are intended to provide further explanation of the invention as claimed.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention.
In the accompanying drawings:
FIG. 1 shows a schematic diagram of a multi-sensor data fusion positioning system in accordance with an embodiment of the invention.
Wherein the above figures include the following reference numerals:
multi-sensor data fusion positioning system 100
Combined positioning subunit 101
Two-dimensional state error Kalman filtering unit 102
Laser radar point cloud matching positioning unit 103
Three-dimensional state error Kalman filtering unit 104
GNSS module 105
RTK service module 106
IMU module 107
High-precision map module 108
Visual lane line detection module 109
Lane line matching module 110
Wheel speed odometry module 111
First state error Kalman filtering module 112
Point cloud module 113
Point cloud map module 114
Positioning initialization module 115
NDT matching positioning module 116
Kalman filter initialization module 117
IMU pose state quantity solving module 118
Judging module 119
Second state error Kalman filtering module 120
Posterior pose module 121
Detailed Description
It should be noted that, in the case of no conflict, the embodiments and features in the embodiments may be combined with each other.
The technical solutions in the embodiments of the present application will be described below clearly and completely with reference to the drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the present application or its uses. All other embodiments obtained by one of ordinary skill in the art based on the embodiments herein without inventive effort fall within the scope of the present application.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit exemplary embodiments according to the present application. As used herein, the singular forms are intended to include the plural forms as well, unless the context clearly indicates otherwise; furthermore, the terms "comprises" and/or "comprising", when used in this specification, specify the presence of stated features, steps, operations, devices and components, and/or combinations thereof.
The relative arrangement of components and steps, the numerical expressions, and the numerical values set forth in these embodiments do not limit the scope of the present application unless specifically stated otherwise. Meanwhile, it should be understood that, for convenience of description, the sizes of the parts shown in the drawings are not drawn to scale. Techniques, methods and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but should be considered part of the specification where appropriate. In all examples shown and discussed herein, any specific value should be construed as merely illustrative and not as a limitation; other examples of the exemplary embodiments may therefore have different values. It should be noted that like reference numerals and letters denote like items in the following figures, so that once an item is defined in one figure, it need not be further discussed in subsequent figures.
In the description of the present application, it should be understood that orientation terms such as "front, rear, upper, lower, left, right", "transverse, vertical, horizontal" and "top, bottom" indicate orientations or positional relationships generally based on those shown in the drawings, and are used only for convenience and simplicity of description; they do not indicate or imply that the apparatus or elements referred to must have a specific orientation or be constructed and operated in a specific orientation, and thus are not to be construed as limiting the scope of protection of the present application. The orientation words "inner" and "outer" refer to inner and outer relative to the contour of the respective component itself.
Spatially relative terms, such as "above", "over", "on the upper surface of" and the like, may be used herein for ease of description to describe the spatial position of one device or feature relative to another as illustrated in the figures. It will be understood that such terms are intended to encompass different orientations in use or operation in addition to the orientation depicted in the figures. For example, if a device in the figures is turned over, elements described as "above" or "over" other devices or structures would then be oriented "below" or "beneath" them; thus the exemplary term "above" can encompass both "above" and "below". The device may also be oriented at 90 degrees or in other orientations, and the spatially relative terms used herein are interpreted accordingly.
In addition, the terms "first", "second", etc. merely distinguish the corresponding components for convenience and, unless otherwise stated, carry no special meaning, so they are not to be construed as limiting the scope of the present application. Furthermore, although the terms used in the present application are selected from publicly known and commonly used terms, some of them may have been selected by the applicant at his or her discretion, with their detailed meanings described in the relevant parts of this description; the present application is therefore to be understood not simply through the actual terms used but through the meaning each term carries.
FIG. 1 shows a schematic diagram of a multi-sensor data fusion positioning system in accordance with an embodiment of the invention. As shown, the present invention provides a multi-sensor data fusion positioning system 100 suitable for use in a vehicle. The multi-sensor data fusion positioning system 100 mainly comprises a combined positioning subunit 101, a two-dimensional state error Kalman filtering unit 102, a laser radar point cloud matching positioning unit 103 and a three-dimensional state error Kalman filtering unit 104.
The combined positioning subunit 101 includes a GNSS module 105, an RTK service module 106, and an IMU module 107. The GNSS module 105 is configured to acquire GNSS data. GNSS (Global Navigation Satellite System) is a space-based radio navigation and positioning system that can provide all-weather three-dimensional coordinates, velocity and time information to a user at any point on the earth's surface or in near-earth space; it comprises one or more satellite constellations and the augmentation systems required to support particular applications. The RTK (Real-Time Kinematic) service module 106 provides a real-time differential correction service: carrier-phase observations collected at a reference station are sent to the user receiver for differencing and coordinate solving, yielding centimeter-level positioning accuracy in real time in the field. The IMU module 107 is configured to acquire IMU data of the vehicle; an IMU (Inertial Measurement Unit) measures the three-axis attitude angles (or angular rates) and accelerations of the vehicle.
The two-dimensional state error Kalman filtering unit 102 includes a high-precision map module 108, a visual lane line detection module 109, a lane line matching module 110, a wheel speed odometry module 111, and a first state error Kalman filtering module 112. The high-precision map module 108 performs a combined query over the vendor-supplied high-precision map using the GNSS data to obtain the perpendicular-foot point coordinates of the lane lines of the lane in which the vehicle is located. The visual lane line detection module 109 obtains the nearest-point and farthest-point coordinates of those lane lines through the visual camera of the vehicle. The lane line matching module 110 derives the lateral update quantity of the vehicle from the outputs of the high-precision map module 108 and the visual lane line detection module 109. The wheel speed odometry module 111 provides the longitudinal update quantity of the vehicle. The first state error Kalman filtering module 112 fuses the lateral update quantity, the longitudinal update quantity, the GNSS data and the IMU data, and obtains the two-dimensional positioning result of the vehicle through rolling iteration. The two-dimensional positioning result comprises a lateral positioning result, a longitudinal positioning result and a heading angle; a minimal sketch of the per-axis correction follows.
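As a rough illustration of how the first state error Kalman filtering module 112 could apply the lateral and longitudinal update quantities, the sketch below corrects one component of a small [lateral, longitudinal, heading] state per measurement. The state layout, noise values and function names are assumptions for illustration, not the patent's actual implementation.

    #include <Eigen/Dense>

    // State: x = [lateral, longitudinal, heading], with covariance P.
    struct State2D {
        Eigen::Vector3d x;
        Eigen::Matrix3d P;
    };

    // Scalar Kalman update of one state component (idx) with measurement z
    // and measurement noise variance r.
    void correct(State2D& s, double z, int idx, double r)
    {
        Eigen::RowVector3d H = Eigen::RowVector3d::Zero();
        H(idx) = 1.0;                                         // observe a single component
        double S = (H * s.P * H.transpose()).value() + r;     // innovation covariance
        Eigen::Vector3d K = s.P * H.transpose() / S;          // Kalman gain
        s.x += K * (z - s.x(idx));                            // posterior state
        s.P = (Eigen::Matrix3d::Identity() - K * H) * s.P;
    }

    // Usage (illustrative): lateral update from lane-line matching, then a
    // longitudinal update from wheel speed odometry:
    //   correct(s, lateral_update, 0, 0.05);
    //   correct(s, longitudinal_update, 1, 0.10);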
The laser radar point cloud matching positioning unit 103 comprises a point cloud module 113, a point cloud map module 114, a positioning initialization module 115 and an NDT matching positioning module 116. The point cloud module 113 acquires real-time point cloud data through the laser radar of the vehicle. The point cloud map module 114 is configured to obtain a high-quality point cloud map constructed offline. The positioning initialization module 115 is configured to obtain the global coordinates of the vehicle. The NDT matching positioning module 116 obtains the matched positioning pose of the vehicle based on the real-time point cloud data and the point cloud map.
The three-dimensional state error Kalman filtering unit 104 comprises a Kalman filter initialization module 117, an IMU pose state quantity solving module 118, a judging module 119, a second state error Kalman filtering module 120 and a posterior pose module 121. The Kalman filter initialization module 117 acquires the initial global pose state quantity based on the global coordinates. The IMU pose state quantity solving module 118 computes an estimated state quantity from the initial global pose state quantity and the IMU data. The judging module 119 judges whether the laser radar point cloud matching positioning unit 103 outputs a valid matched positioning pose: if so, the second state error Kalman filtering module 120 updates the estimated state quantity based on the matched positioning pose, and the posterior pose module 121 calculates the three-dimensional positioning result of the vehicle from the updated estimated state quantity; if not, the IMU pose state quantity solving module 118 directly outputs the three-dimensional positioning result based on the estimated state quantity. The three-dimensional positioning result contains full six-degree-of-freedom pose data; a control-flow sketch of this gating follows.
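The gating performed by the judging module 119 can be pictured as below. This is a minimal control-flow sketch under assumed types, with a fixed blending gain standing in for the Kalman update described above; all names are illustrative assumptions.

    #include <Eigen/Dense>
    #include <Eigen/Geometry>
    #include <optional>

    struct Pose6D {
        Eigen::Vector3d p;        // position
        Eigen::Quaterniond q;     // attitude
    };

    // One fusion step: the IMU-propagated pose is always available; the
    // correction is applied only when a valid NDT matched pose exists.
    Pose6D fuseStep(const Pose6D& imu_predicted,
                    const std::optional<Pose6D>& ndt_pose)
    {
        if (!ndt_pose)                // no valid match: output IMU result directly
            return imu_predicted;
        const double k = 0.8;         // stand-in for the Kalman gain (assumption)
        Pose6D posterior;
        posterior.p = imu_predicted.p + k * (ndt_pose->p - imu_predicted.p);
        posterior.q = imu_predicted.q.slerp(k, ndt_pose->q);
        return posterior;
    }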
If the GNSS module 105 receives GNSS data normally and the RTK service module 106 can provide the RTK differential correction service, the multi-sensor data fusion positioning system 100 outputs, through the combined positioning subunit 101, a high-precision positioning result obtained by fusing the GNSS data and the IMU data.
If the GNSS module 105 receives GNSS data normally but the RTK service module 106 cannot provide the RTK differential correction service, the multi-sensor data fusion positioning system 100 switches to the two-dimensional state error Kalman filtering unit 102, which fuses the outputs of the lane line matching module 110 and the wheel speed odometry module 111 with the GNSS data and the IMU data to obtain a comparatively accurate two-dimensional positioning result of the vehicle.
If the GNSS module 105 cannot receive complete GNSS data, that is, valid GNSS data are unavailable or entirely absent, the multi-sensor data fusion positioning system 100 obtains a high-precision three-dimensional positioning result of the vehicle by fusing heterogeneous sensor data through the laser radar point cloud matching positioning unit 103 and the three-dimensional state error Kalman filtering unit 104. A sketch of this degradation-aware switching follows.
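The three operating modes reduce to a simple selector. This is a hedged sketch: the enum and function names are assumptions, and the real system performs the switch inside the fusion pipeline rather than as a free function.

    // Degradation-aware mode selection across the three positioning paths.
    enum class Mode {
        GnssRtkImu,     // combined positioning subunit 101
        Lane2dEskf,     // two-dimensional state error Kalman filtering unit 102
        Lidar3dEskf     // lidar matching unit 103 + three-dimensional unit 104
    };

    Mode selectMode(bool gnss_ok, bool rtk_ok)
    {
        if (gnss_ok && rtk_ok) return Mode::GnssRtkImu;
        if (gnss_ok)           return Mode::Lane2dEskf;
        return Mode::Lidar3dEskf;
    }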
The multi-sensor data fusion positioning system 100 provided by the invention fuses multi-sensor data on the basis of a state error Kalman filtering fusion algorithm, combining global positioning with vehicle-side local positioning, and lets different modules output high-precision positioning results according to the actual scene and whether the sensor data have degraded, so that real-time performance and robustness are both served. The multi-sensor data fusion positioning system 100 introduces the laser radar point cloud matching positioning unit 103, based on a high-precision point cloud map and real-time laser radar scan point clouds, to strengthen the robustness and scene adaptability of global positioning; it has a cost advantage and is particularly suitable for complex autonomous driving scenarios such as urban canyons, viaducts, tunnels and tree-lined boulevards.
Preferably, the GNSS data acquired by the first state error Kalman filtering module 112 include the longitude and latitude of the vehicle. The module derives the lateral and longitudinal state quantities from the longitude and latitude, and updates and corrects them with the lateral and longitudinal update quantities, yielding the lateral and longitudinal components of the two-dimensional positioning result. The first state error Kalman filtering module 112 obtains the initial heading angle of the vehicle by fusing the GNSS data and the IMU data, and iteratively updates the heading angle with the median numerical-integration increment of the yaw rate in the IMU data. Preferably, the heading-angle iteration may additionally incorporate a yaw update derived from the angular velocity data of the four wheel encoders through differential-motion calculation, for a more accurate result. Specifically, when the RTK service module 106 cannot provide the RTK differential correction service, i.e. no RTK correction is received, the combined positioning subunit 101 switches to the vendor-supplied GNSS/IMU-fused positioning data (longitude, latitude, altitude, attitude angles, and so on) and re-transforms and re-defines them at the code level so that they conform to the coordinate-frame and attitude-angle conventions of the vehicle hardware platform. In addition, the combined positioning subunit 101 processes the GNSS data and IMU data: the longitude-latitude-altitude data are mapped into the local coordinate system through a Mercator projection, and the linear acceleration and angular velocity in the IMU data are transformed from the navigation frame into the body frame by a three-dimensional spatial transformation, providing initial data for the positioning initialization module 115 of the laser radar point cloud matching positioning unit 103 and the IMU pose state quantity solving module 118 of the three-dimensional state error Kalman filtering unit 104. A sketch of the Mercator mapping follows.
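For concreteness, one common way to map longitude and latitude into a local metric frame with a Mercator projection is sketched below; the spherical-earth radius and the anchoring at a local reference point are assumptions, not details taken from the patent.

    #include <cmath>

    struct LocalXY { double x, y; };

    // Mercator projection of (lat, lon) in radians, re-anchored at a local
    // reference (lat0, lon0) so the output is in meters near the vehicle.
    LocalXY mercatorLocal(double lat, double lon, double lat0, double lon0)
    {
        constexpr double R = 6378137.0;  // WGS-84 equatorial radius, meters
        const double x = R * (lon - lon0);
        const double y = R * (std::log(std::tan(M_PI / 4.0 + lat  / 2.0))
                            - std::log(std::tan(M_PI / 4.0 + lat0 / 2.0)));
        return {x, y};
    }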
Preferably, in the two-dimensional state error Kalman filtering unit 102, the high-precision map module 108 performs a combined query over the high-precision map using the GNSS data to obtain the perpendicular-foot point coordinates of the left and right lane lines closest to the vehicle. The visual lane line detection module 109 acquires, through the visual camera of the vehicle, the nearest-point and farthest-point coordinates of the closest left and right lane lines ahead of the camera. The lane line matching module 110 derives the lateral update quantity of the vehicle from the outputs of the high-precision map module 108 and the visual lane line detection module 109.
Preferably, the positioning initialization module 115 obtains the global coordinates of the vehicle by:
transforming the GNSS data into global coordinates to initialize positioning, then continuously updating the global coordinates from the GNSS data;
and/or computing the pose from a short-interval numerical integration of the IMU data to obtain global coordinates and initialize positioning, then continuously updating the global coordinates from the IMU data.
Preferably, in the point cloud module 113 of the laser radar point cloud matching positioning unit 103, point cloud data of the vehicle are acquired through the laser radar of the vehicle and passed through voxel grid downsampling filtering to form the real-time point cloud data. The point cloud map is generated offline from data acquired by the vehicle's multiple sensors through an optimization-based fusion mapping algorithm: with the sensors' intrinsic and extrinsic parameters calibrated in advance, a high-quality, undistorted, low-storage point cloud map is built offline. More preferably, the voxel grid downsampling filtering comprises the following step: using the VoxelGrid class of the open-source point cloud library, a set of three-dimensional voxel grids is created over the point cloud data acquired by the laser radar, and within each voxel the centroid of all contained points is used to represent them. The density and count of the downsampled point cloud drop markedly while surface-shape features of the original cloud are well preserved, which improves the matching speed and accuracy of the NDT matching positioning module 116. The laser radar point cloud matching positioning unit 103 is an auxiliary positioning technology based on laser point clouds; it has a cost advantage as well as high robustness and all-weather scene adaptability. A sketch of the downsampling step follows.
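A minimal sketch of the voxel grid downsampling with the VoxelGrid class of the Point Cloud Library (PCL) is shown below; the 0.5 m leaf size is an illustrative assumption rather than a value from the patent.

    #include <pcl/point_types.h>
    #include <pcl/point_cloud.h>
    #include <pcl/filters/voxel_grid.h>

    // Downsample a raw lidar scan with a voxel grid: every point inside a
    // voxel is replaced by the centroid of that voxel, preserving surface
    // shape while sharply reducing the point count.
    pcl::PointCloud<pcl::PointXYZ>::Ptr
    downsampleScan(const pcl::PointCloud<pcl::PointXYZ>::Ptr& raw)
    {
        pcl::PointCloud<pcl::PointXYZ>::Ptr filtered(new pcl::PointCloud<pcl::PointXYZ>);
        pcl::VoxelGrid<pcl::PointXYZ> grid;
        grid.setInputCloud(raw);
        grid.setLeafSize(0.5f, 0.5f, 0.5f);  // 0.5 m voxels, illustrative value
        grid.filter(*filtered);
        return filtered;
    }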
Preferably, the NDT matching positioning module 116 uses an NDT matching algorithm to match the real-time point cloud data against the point cloud map. NDT matching is an iterative optimization that runs to convergence. In the NDT algorithm's parameter configuration, the matching voxel leaf size is set to 3, the iteration count to 30, the resolution to 1.0, the step size to 0.1, and the minimum tolerance to 0.01. Once the iterative matching satisfies the convergence condition, a real-time, high-precision matched positioning pose is output. The resulting pose is an accurate homogeneous transformation matrix, which is transformed and converted for updating the estimated state quantity obtained by the IMU pose state quantity solving module 118 in the three-dimensional state error Kalman filtering unit 104. A configuration sketch follows.
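Mapped onto PCL's NormalDistributionsTransform, the parameters above could be configured as follows. Interpreting the patent's "matching voxel leaf size 3" as the leaf size of a voxel filter applied to the input scan is an assumption; the remaining values correspond directly to PCL parameters.

    #include <pcl/point_types.h>
    #include <pcl/point_cloud.h>
    #include <pcl/registration/ndt.h>
    #include <Eigen/Dense>

    using Cloud = pcl::PointCloud<pcl::PointXYZ>;

    // Align a real-time scan to the offline map with NDT, using the
    // parameter values named in the text.
    Eigen::Matrix4f ndtAlign(const Cloud::Ptr& scan, const Cloud::Ptr& map,
                             const Eigen::Matrix4f& initial_guess)
    {
        pcl::NormalDistributionsTransform<pcl::PointXYZ, pcl::PointXYZ> ndt;
        ndt.setTransformationEpsilon(0.01);  // minimum tolerance
        ndt.setStepSize(0.1);                // More-Thuente line-search step
        ndt.setResolution(1.0);              // NDT grid cell size (m)
        ndt.setMaximumIterations(30);        // iteration count
        ndt.setInputSource(scan);            // downsampled real-time scan
        ndt.setInputTarget(map);             // offline point cloud map
        Cloud aligned;
        ndt.align(aligned, initial_guess);   // prior pose from initialization
        return ndt.hasConverged() ? ndt.getFinalTransformation()
                                  : initial_guess;
    }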
Preferably, if the NDT matching algorithm fails to match, the NDT matching positioning module 116 is re-initialized either with the current global coordinates output by the positioning initialization module 115 or with an intermediate result obtained by the NDT matching algorithm. As described above, the current global coordinates may come from continuous iterative updating of the GNSS data or of the IMU data. Re-initializing the NDT matching positioning module 116 keeps the NDT matching algorithm operating effectively in real time.
Preferably, in the three-dimensional state error Kalman filtering unit 104, the Kalman filter initialization module 117 obtains the initial global pose state quantity through rolling iteration over the global coordinates. As described above, the global coordinates may be obtained by transforming the GNSS data or the IMU data.
Preferably, the IMU pose state quantity solving module 118 selects a time interval and performs median (midpoint) numerical integration of the IMU data from the initial global pose state quantity to obtain the estimated state quantity. As is readily understood, the state quantity for IMU data generally comprises: a 3-degree-of-freedom position, a 3-degree-of-freedom attitude angle (represented by a quaternion), a 3-degree-of-freedom linear velocity, a 3-degree-of-freedom IMU angular velocity bias, and a 3-degree-of-freedom IMU linear acceleration bias. More preferably, after the posterior pose module 121 calculates the three-dimensional positioning result of the vehicle from the updated estimated state quantity, it resets the state error quantities to zero. A propagation sketch follows.
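One midpoint-integration propagation step over the nominal state could look as follows; the gravity vector, frame conventions and the small-angle guard are illustrative assumptions.

    #include <Eigen/Dense>
    #include <Eigen/Geometry>

    // Nominal state: position, velocity, gyro bias, accel bias, attitude.
    struct NominalState {
        Eigen::Vector3d p, v, bg, ba;
        Eigen::Quaterniond q;
    };

    // Median (midpoint) integration over [t_k, t_k + dt]: angular velocity and
    // specific force are averaged at both interval endpoints before integrating.
    void propagate(NominalState& s,
                   const Eigen::Vector3d& w0, const Eigen::Vector3d& w1,  // gyro
                   const Eigen::Vector3d& a0, const Eigen::Vector3d& a1,  // accel
                   double dt)
    {
        const Eigen::Vector3d g(0.0, 0.0, -9.81);            // gravity (assumption)
        const Eigen::Vector3d w_mid = 0.5 * (w0 + w1) - s.bg;
        Eigen::Quaterniond dq = Eigen::Quaterniond::Identity();
        const double ang = w_mid.norm() * dt;
        if (ang > 1e-12)                                     // small-angle guard
            dq = Eigen::Quaterniond(Eigen::AngleAxisd(ang, w_mid.normalized()));
        const Eigen::Quaterniond q1 = (s.q * dq).normalized();
        // World-frame acceleration averaged at both endpoints.
        const Eigen::Vector3d a_mid =
            0.5 * (s.q * (a0 - s.ba) + q1 * (a1 - s.ba)) + g;
        s.p += s.v * dt + 0.5 * a_mid * dt * dt;
        s.v += a_mid * dt;
        s.q = q1;
    }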
It should be noted that, for the three-dimensional state error Kalman filtering unit 104, the GNSS clock of the combined positioning subunit 101 is preferably used as the reference, and an existing time alignment algorithm performs alignment interpolation on the timestamps of the laser radar, the IMU and the visual camera, so that the data output by each sensor carry properly aligned timestamps before entering the three-dimensional state error Kalman filtering unit 104. A sketch of such interpolation follows.
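Time alignment typically reduces to interpolating a sample onto the reference clock; a minimal linear-interpolation sketch is given below as one possible realization, not the patent's specific algorithm.

    #include <Eigen/Dense>

    // Linearly interpolate a vector-valued sample onto reference time t,
    // given bracketing samples (t0, x0) and (t1, x1) with t0 <= t <= t1.
    Eigen::Vector3d alignToClock(double t,
                                 double t0, const Eigen::Vector3d& x0,
                                 double t1, const Eigen::Vector3d& x1)
    {
        const double a = (t - t0) / (t1 - t0);  // caller ensures t1 != t0
        return (1.0 - a) * x0 + a * x1;
    }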
It will be apparent to those skilled in the art that various modifications and variations can be made to the above-described exemplary embodiments of the present invention without departing from the spirit and scope of the invention. Therefore, it is intended that the present invention cover the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.

Claims (10)

1. A multi-sensor data fusion positioning system for a vehicle, comprising:
a combined positioning subunit, comprising a GNSS module, an RTK service module and an IMU module, wherein the GNSS module is used for acquiring GNSS data, the RTK service module is used for providing a real-time kinematic (RTK) differential correction service, and the IMU module is used for acquiring IMU data of the vehicle;
a two-dimensional state error Kalman filtering unit, comprising a high-precision map module, a visual lane line detection module, a lane line matching module, a wheel speed odometry module and a first state error Kalman filtering module, wherein the high-precision map module is used for acquiring the perpendicular-foot point coordinates of the lane lines of the lane in which the vehicle is located based on a high-precision map and the GNSS data, the visual lane line detection module is used for acquiring the nearest-point and farthest-point coordinates of those lane lines through a visual camera of the vehicle, the lane line matching module is used for acquiring the lateral update quantity of the vehicle based on the outputs of the high-precision map module and the visual lane line detection module, the wheel speed odometry module is used for acquiring the longitudinal update quantity of the vehicle, and the first state error Kalman filtering module is used for fusing the lateral update quantity, the longitudinal update quantity, the GNSS data and the IMU data to obtain a two-dimensional positioning result of the vehicle;
a laser radar point cloud matching positioning unit, comprising a point cloud module, a point cloud map module, a positioning initialization module and an NDT matching positioning module, wherein the point cloud module is used for acquiring real-time point cloud data through the laser radar of the vehicle, the point cloud map module is used for acquiring an offline-constructed point cloud map, the positioning initialization module is used for acquiring the global coordinates of the vehicle, and the NDT matching positioning module is used for acquiring the matched positioning pose of the vehicle based on the real-time point cloud data and the point cloud map; and
a three-dimensional state error Kalman filtering unit, comprising a Kalman filter initialization module, an IMU pose state quantity solving module, a judging module, a second state error Kalman filtering module and a posterior pose module, wherein the Kalman filter initialization module acquires an initial global pose state quantity based on the global coordinates, the IMU pose state quantity solving module acquires an estimated state quantity based on the initial global pose state quantity and the IMU data, the judging module judges whether the laser radar point cloud matching positioning unit outputs a valid matched positioning pose, and if so, the second state error Kalman filtering module updates the estimated state quantity based on the matched positioning pose and the posterior pose module calculates a three-dimensional positioning result of the vehicle from the updated estimated state quantity, while if not, the IMU pose state quantity solving module directly outputs the three-dimensional positioning result based on the estimated state quantity;
wherein, if the GNSS module receives GNSS data normally and the RTK service module can provide the RTK differential correction service, the multi-sensor data fusion positioning system outputs, through the combined positioning subunit, a positioning result obtained by fusing the GNSS data and the IMU data;
if the GNSS module receives GNSS data normally but the RTK service module cannot provide the RTK differential correction service, the multi-sensor data fusion positioning system acquires the two-dimensional positioning result of the vehicle through the two-dimensional state error Kalman filtering unit; and
if the GNSS module cannot receive complete GNSS data, the multi-sensor data fusion positioning system acquires the three-dimensional positioning result of the vehicle through the laser radar point cloud matching positioning unit and the three-dimensional state error Kalman filtering unit.
2. The multi-sensor data fusion positioning system of claim 1, wherein the GNSS data acquired by the first state error Kalman filtering module include the longitude and latitude of the vehicle, the first state error Kalman filtering module derives the lateral and longitudinal state quantities of the vehicle from the longitude and latitude and updates and corrects them with the lateral and longitudinal update quantities; and the first state error Kalman filtering module obtains the heading angle of the vehicle by fusing the GNSS data and the IMU data.
3. The multi-sensor data fusion positioning system of claim 1, wherein in the two-dimensional state error Kalman filtering unit, the high-precision map module acquires the perpendicular-foot point coordinates of the left and right lane lines closest to the vehicle based on the high-precision map and the GNSS data, the visual lane line detection module acquires, through the visual camera of the vehicle, the nearest-point and farthest-point coordinates of the closest left and right lane lines ahead of the camera, and the lane line matching module derives the lateral update quantity of the vehicle from the outputs of the high-precision map module and the visual lane line detection module.
4. The multi-sensor data fusion positioning system of claim 1, wherein the positioning initialization module obtains the global coordinates of the vehicle by:
transforming the GNSS data into global coordinates to initialize positioning, then continuously updating the global coordinates from the GNSS data;
and/or computing the pose from a short-interval numerical integration of the IMU data to obtain global coordinates and initialize positioning, then continuously updating the global coordinates from the IMU data.
5. The multi-sensor data fusion positioning system of claim 4, wherein in the point cloud module of the laser radar point cloud matching positioning unit, point cloud data of the vehicle are acquired through the laser radar of the vehicle and passed through voxel grid downsampling filtering to form the real-time point cloud data; and the point cloud map is generated offline from data acquired by the vehicle's multiple sensors through an optimization-based fusion mapping algorithm.
6. The multi-sensor data fusion positioning system of claim 5, wherein the voxel grid downsampling filtering comprises the following step: using the VoxelGrid class of the open-source point cloud library, a set of three-dimensional voxel grids is created over the point cloud data acquired by the laser radar, and within each three-dimensional voxel grid the centroid of all contained points is used to represent them.
7. The multi-sensor data fusion positioning system of claim 4, wherein the NDT matching positioning module uses an NDT matching algorithm to match the real-time point cloud data against the point cloud map, and in the NDT algorithm's parameter configuration the matching voxel leaf size is set to 3, the iteration count to 30, the resolution to 1.0, the step size to 0.1, and the minimum tolerance to 0.01.
8. The multi-sensor data fusion positioning system of claim 7, wherein if the NDT matching algorithm fails to match, the NDT matching positioning module is re-initialized either with the current global coordinates output by the positioning initialization module or with an intermediate result obtained by the NDT matching algorithm.
9. The multi-sensor data fusion positioning system of claim 1, wherein in the three-dimensional state error Kalman filtering unit, the Kalman filter initialization module obtains the initial global pose state quantity through rolling iteration over the global coordinates.
10. The multi-sensor data fusion positioning system of claim 9, wherein the IMU pose state quantity solving module selects a time interval and performs median (midpoint) numerical integration of the IMU data from the initial global pose state quantity to obtain the estimated state quantity.
CN202211582224.9A, priority date 2022-12-09, filing date 2022-12-09: Multi-sensor data fusion positioning system, published as pending application CN116047565A.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211582224.9A 2022-12-09 2022-12-09 Multi-sensor data fusion positioning system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211582224.9A 2022-12-09 2022-12-09 Multi-sensor data fusion positioning system

Publications (1)

Publication Number Publication Date
CN116047565A 2023-05-02

Family

ID=86128588

Family Applications (1)

Application Number Priority Date Filing Date Title
CN202211582224.9A 2022-12-09 2022-12-09 Multi-sensor data fusion positioning system (pending)

Country Status (1)

Country Link
CN CN116047565A

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117433511A (en) * 2023-12-20 2024-01-23 绘见科技(深圳)有限公司 Multi-sensor fusion positioning method
CN117433511B (en) * 2023-12-20 2024-03-12 绘见科技(深圳)有限公司 Multi-sensor fusion positioning method
CN117492056A (en) * 2023-12-26 2024-02-02 合众新能源汽车股份有限公司 Vehicle fusion positioning method, system, device and computer readable medium
CN117492056B (en) * 2023-12-26 2024-03-22 合众新能源汽车股份有限公司 Vehicle fusion positioning method, system, device and computer readable medium
CN117490705A (en) * 2023-12-27 2024-02-02 合众新能源汽车股份有限公司 Vehicle navigation positioning method, system, device and computer readable medium
CN117490705B (en) * 2023-12-27 2024-03-22 合众新能源汽车股份有限公司 Vehicle navigation positioning method, system, device and computer readable medium

Similar Documents

Publication Title
US11243081B2 (en) Slam assisted INS
CN107144285B (en) Pose information determination method and device and movable equipment
CN113884102B (en) Calibration method of sensor installation deviation angle, combined positioning system and vehicle
KR101454153B1 (en) Navigation system for unmanned ground vehicle by sensor fusion with virtual lane
CN116047565A (en) Multi-sensor data fusion positioning system
CN110178048A (en) The method and system that vehicle environmental map is generated and updated
WO2013150630A1 (en) Map data creation device, autonomous movement system and autonomous movement control device
JP2018028489A (en) Position estimation device and position estimation method
CN113899375B (en) Vehicle positioning method and device, storage medium and electronic equipment
CN110411457B (en) Positioning method, system, terminal and storage medium based on stroke perception and vision fusion
CN111426320B (en) Vehicle autonomous navigation method based on image matching/inertial navigation/milemeter
CN113670334B (en) Initial alignment method and device for aerocar
CN104777499A (en) Combined navigation method based on INS (inertial navigation system)/GPS (global position system)/SAR (synthetic aperture radar)
CN112596089B (en) Fusion positioning method and device, electronic equipment and storage medium
CN110458885B (en) Positioning system and mobile terminal based on stroke perception and vision fusion
CN113984044A (en) Vehicle pose acquisition method and device based on vehicle-mounted multi-perception fusion
Ding et al. Adding optical flow into the GPS/INS integration for UAV navigation
Andert et al. Optical-aided aircraft navigation using decoupled visual SLAM with range sensor augmentation
Li et al. Aerial-triangulation aided boresight calibration for a low-cost UAV-LiDAR system
US11287281B2 (en) Analysis of localization errors in a mobile object
CN113093759A (en) Robot formation construction method and system based on multi-sensor information fusion
Hussnain et al. Enhanced trajectory estimation of mobile laser scanners using aerial images
Hariz et al. High-Resolution Mobile Mapping Platform Using 15-mm Accuracy LiDAR and SPAN/TerraStar C-PRO Technologies
JP7302966B2 (en) moving body
Noureldin et al. a Framework for Multi-Sensor Positioning and Mapping for Autonomous Vehicles

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination