CN117804448A - Autonomous system positioning method, device, computer equipment and storage medium - Google Patents

Autonomous system positioning method, device, computer equipment and storage medium

Info

Publication number
CN117804448A
CN117804448A (application CN202410200168.0A)
Authority
CN
China
Prior art keywords
data
autonomous system
points
ultra
point cloud
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202410200168.0A
Other languages
Chinese (zh)
Other versions
CN117804448B (en)
Inventor
王春彦
孔唯一
陈雪梅
董伟
王丹丹
张乐乐
邓方
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN202410200168.0A priority Critical patent/CN117804448B/en
Publication of CN117804448A publication Critical patent/CN117804448A/en
Application granted granted Critical
Publication of CN117804448B publication Critical patent/CN117804448B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/64Three-dimensional objects
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/757Matching configurations of points or features
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/023Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W64/00Locating users or terminals or network equipment for network management purposes, e.g. mobility management
    • H04W64/006Locating users or terminals or network equipment for network management purposes, e.g. mobility management with additional information processing, e.g. for direction or speed determination

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Electromagnetism (AREA)
  • Navigation (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses an autonomous system positioning method, an autonomous system positioning device, computer equipment and a storage medium. The autonomous system positioning method comprises: performing state prediction and motion compensation on the inertial measurement data of the autonomous system to obtain a point cloud containing construction points; performing time alignment and space alignment on the collected ultra-wideband data and the lidar feature points, and recording the position information corresponding to the lidar feature points; compensating the lidar error along a predetermined direction dimension based on the ultra-wideband data; calculating, in the current iteration round, the residual between the state-quantity predicted value of each construction point and the measured value of the plane to which each construction point belongs; and adjusting the iteration weight of the IESKF according to the residual, matching the construction point cloud based on the adjusted IESKF, and updating the map points. According to the invention, the IESKF is used to fuse the lidar feature points with the IMU data and the ultra-wideband data; the ultra-wideband data are introduced to compensate the lidar error along the predetermined direction dimension, with a corresponding adjustment in the Kalman-gain part of the filter, so that the indoor positioning performance is improved.

Description

Autonomous system positioning method, device, computer equipment and storage medium
Technical Field
The present invention relates to the field of autonomous system positioning, and in particular, to an autonomous system positioning method, apparatus, computer device, and storage medium.
Background
Autonomous systems generally refer to artificial systems that operate or manage themselves without human intervention by incorporating a variety of advanced technologies such as artificial intelligence, electromechanical control, computers, communications, materials, etc. An autonomous system is a system capable of continuously monitoring its own state and automatically adjusting its state and performing operations to meet a predetermined objective according to internal or external events.
With the increasingly wide application of autonomous system technology in fields such as aerial photography, logistics and agriculture, autonomous positioning faces great challenges, particularly in weak-feature environments such as tunnels, mines and urban canyons, where the positioning performance of an autonomous system can be seriously affected. Conventional autonomous system positioning methods rely mainly on GPS (Global Positioning System); in weak-feature environments where GPS signals are weak or cannot be received, whether indoors (e.g. tunnels, mines) or outdoors (e.g. urban canyons), the accuracy and reliability of GPS-based autonomous system positioning are easily degraded.
In view of this, researchers have been required to explore new positioning methods to improve the performance of autonomous systems in various environments.
Disclosure of Invention
The invention aims to solve the technical problems that: the traditional autonomous system positioning method is easy to cause the accuracy and reliability of the autonomous system based on GPS positioning to be reduced under the weak characteristic environments that GPS signals are weak or GPS signals cannot be received, such as indoors (such as tunnels and mines), outdoors (such as urban canyons) and the like.
In order to solve the above technical problems, the present invention provides an autonomous system positioning method, where the autonomous system has an ultra wideband positioning module, including:
acquiring inertial measurement data and point cloud of the autonomous system, and performing forward propagation and backward propagation on the inertial measurement data based on a kinematic model; the point cloud comprises space points formed by measuring angles and distances of points returned by the laser radar;
performing state prediction on the autonomous system according to the data obtained after the forward propagation, performing motion compensation on the autonomous system according to the data obtained after the backward propagation, and correcting the point cloud based on the motion compensation and obtaining the point cloud containing the construction points;
Collecting ultra-wideband data which are time-synchronous with the inertial measurement data, performing time alignment and space alignment on the ultra-wideband data and the laser radar characteristic points, and recording position information corresponding to the laser radar characteristic points;
compensating for a lidar error along a predetermined directional dimension based on the ultra-wideband data;
calculating residual errors between state quantity predicted values of all construction points and measured values of planes of all construction points in the current iteration round;
and adjusting the iteration weight of the iterated error-state Kalman filter (IESKF) according to the residual, matching the construction point cloud based on the adjusted iteration weight, and updating the map points.
Optionally, the step of forward propagating and backward propagating the inertial measurement data based on the kinematic model includes:
integrating the inertial measurement data and then performing forward propagation to obtain a state quantity predicted value and a noise estimated value of the autonomous system;
and carrying out multi-source data fusion on the inertial measurement data, the ultra-wideband data and the laser radar characteristic points through an iterative extended Kalman filter so as to simultaneously estimate the position and the gesture of the autonomous system.
Optionally, the step of motion compensating the result of the state prediction based on the backward propagation data comprises:
The inertial measurement data is back-propagated and motion distortion between the first scan time and the second scan time is compensated during the back-propagation.
Optionally, before forward propagating and backward propagating the inertial measurement data based on the kinematic model, further comprising:
taking a first IMU frame in the inertial measurement data as a global coordinate system;
calculating external parameters between the laser radar characteristic points and the inertial measurement data after the global coordinate system is determined;
accordingly, the step of predicting the state of the autonomous system according to the data obtained after the forward propagation includes:
the autonomous system is state predicted using a kinematic model and based on the external parameters between lidar feature points and inertial measurements.
Optionally, the step of performing motion compensation on the autonomous system according to the data obtained after the back propagation includes:
and (3) carrying out backward propagation on the inertial measurement data, and obtaining a point cloud result of each moment in the current frame by utilizing interpolation approximation to correct the attitude relation.
Optionally, the step of compensating for a laser radar error along a predetermined directional dimension based on the ultra-wideband data comprises:
and calculating the deviation of the lidar in the predetermined direction from the ultra-wideband range measurements taken along the degradation direction of the indoor weak-feature environment.
Optionally, the step of matching the construction point cloud and updating the map points based on the adjusted iteration weights includes:
and adopting an incremental KD tree data structure, matching the construction point cloud based on the adjusted iterative extended Kalman filter, and updating the map points.
In order to solve the above technical problems, the present invention provides an autonomous system positioning device, the autonomous system having an ultra wideband positioning module, comprising:
the data acquisition module is used for acquiring inertial measurement data and point cloud of the autonomous system, and performing forward propagation and backward propagation on the inertial measurement data based on a kinematic model; the point cloud comprises space points formed by measuring angles and distances of points returned by the laser radar;
the data processing module is used for carrying out state prediction on the autonomous system according to the data obtained after the forward propagation, carrying out motion compensation on the autonomous system according to the data obtained after the backward propagation, and correcting the point cloud based on the motion compensation and obtaining the point cloud containing the construction points;
The data measurement module is used for acquiring ultra-wideband data which are time-synchronous with the inertia measurement data, carrying out time alignment and space alignment on the ultra-wideband data and the laser radar characteristic points, and recording position information corresponding to the laser radar characteristic points;
the data processing module is further used for compensating laser radar errors along a preset direction dimension based on the ultra-wideband data;
the data processing module is also used for calculating residual errors between the state quantity predicted value of each construction point and the measured value of the plane to which each construction point belongs in the current iteration round;
and the positioning module is used for adjusting the iteration weight of the iterated error-state Kalman filter according to the residual, matching the construction point cloud based on the adjusted iteration weight, and updating the map points.
To solve the above technical problem, the present invention provides a computer device, including a memory, a processor, and a computer program stored in the memory and capable of running on the processor, where the processor implements the above method when executing the computer program.
To solve the above technical problem, the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method described above.
The autonomous positioning method provided by the invention offers an IESKF-based framework in which ultra-wideband data are introduced, making it suitable for indoor and outdoor environments where GPS signals are weak or absent. Autonomous system positioning and map updating are performed by fusing three sources of information, namely inertial measurement unit data, ultra-wideband data and lidar feature points, thereby realizing multi-source data fusion and improving the accuracy and reliability of the autonomous system relative to GPS-based positioning. In addition, the invention adopts an incremental KD-tree data structure, reducing the computational complexity. An autonomous unmanned aerial vehicle positioning experiment in a tunnel environment further verifies that the proposed positioning scheme effectively improves the positioning accuracy in weak-feature environments, both indoor and outdoor.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flowchart of an autonomous system positioning method according to an embodiment of the present invention;
fig. 2 is a schematic diagram of an anchor point setting manner of an ultra wideband base station according to an embodiment of the present invention;
fig. 3 is another anchor point setting manner of the ultra wideband base station provided in the embodiment of the present invention;
fig. 4 is a comparison chart of test results of the autonomous system positioning method and the conventional positioning method respectively applied in the weak environment provided by the embodiment of the present invention;
FIG. 5 is a block diagram of an autonomous system positioning apparatus according to an embodiment of the present invention;
FIG. 6 is a block diagram of a computer device according to the present invention;
FIG. 7 is a schematic diagram of an attitude calibrator according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a truth measurement provided by an embodiment of the present invention;
fig. 9 is a schematic diagram of a test mapping effect provided by an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Before describing the autonomous system positioning scheme provided by the present invention, several concepts used in the invention are briefly introduced. The IMU (Inertial Measurement Unit) is a sensor that measures the acceleration and angular velocity of an object and can provide high-frequency attitude information. UWB (Ultra-Wideband) enables high-precision distance measurement and is suitable for indoor and outdoor environments where GPS signals are weak or cannot be received. LiDAR (Light Detection and Ranging, laser radar) is a radar system that emits a detection signal (a laser beam) toward the target to be measured and then measures parameters such as the arrival time and intensity of the reflected or scattered signal to determine the distance, azimuth, motion state and surface optical characteristics of the target; the lidar can provide a high-precision ground-feature point cloud for building an environment map and detecting obstacles.
In the conventional autonomous system positioning scheme, in a weak characteristic environment where GPS signals are weak or GPS signals cannot be received, such as indoors (e.g. tunnels and mines), outdoors (e.g. urban canyons), the accuracy and reliability of the autonomous system based on GPS positioning are easily reduced. In order to solve the problems, the invention provides an autonomous system positioning method, an autonomous system positioning device, computer equipment and a storage medium.
The following describes an autonomous system positioning method according to an embodiment of the present invention.
Example 1
As shown in fig. 1, a flowchart of an autonomous system positioning method according to an embodiment of the present invention, where the autonomous system has an ultra wideband positioning module, may include the following steps:
step S101: inertial measurement data and point clouds of the autonomous system are acquired, and forward propagation and backward propagation of the inertial measurement data are performed based on a kinematic model.
Wherein the point cloud comprises spatial points consisting of measured angles and distances of points returned by the lidar.
Specifically, after the inertial measurement data and the point cloud are acquired, a model may be constructed using a kinematic algorithm, and the inertial measurement data is used as input, and each frame is propagated forward and backward using the constructed model. The invention processes inertial measurement data based on a kinematic model, including forward and backward propagation of the inertial measurement data. In one implementation of the present invention, the inertial measurement data may be propagated forward and backward based on a kinematic model, and specifically may include the following steps:
(1) Integrating the inertial measurement data and then performing forward propagation to obtain a state quantity predicted value and a noise estimated value of the autonomous system;
(2) performing multi-source data fusion of the inertial measurement data, the ultra-wideband data and the lidar feature points through an iterated extended Kalman filter so as to simultaneously estimate the position and attitude of the autonomous system, thereby addressing the insufficient positioning accuracy of the lidar in weak-feature indoor environments.
It should be noted that the above-mentioned iterative extended kalman filter is only one specific implementation of the present invention, and when the inertial measurement data is propagated forward and backward based on the kinematic model, the present invention is not limited to the above-mentioned one, and any possible implementation may be applied to the present invention.
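For illustration only (not part of the patent text), the measurement-update step at the core of the iterated extended Kalman fusion described above can be sketched in scalar form; the function name and the 1-D simplification below are assumptions, and the patent's filter operates on the full multi-dimensional state:

```python
def kalman_update(x_pred, p_pred, z, h, r):
    """One scalar Kalman measurement update: blend the predicted state
    x_pred (variance p_pred) with a measurement z = h*x + noise (variance r).
    A toy 1-D sketch of the fusion principle, not the patent's full filter."""
    k = p_pred * h / (h * p_pred * h + r)    # Kalman gain
    x = x_pred + k * (z - h * x_pred)        # fuse prediction and measurement
    p = (1.0 - k * h) * p_pred               # reduced posterior variance
    return x, p
```

In the patent's method the same gain-weighted blending is carried out over the full state vector, with the iteration weight additionally adjusted according to the residual.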
In one implementation, before forward propagating and backward propagating the inertial measurement data based on a kinematic model, the method further comprises:
taking a first IMU frame in the inertial measurement data as a global coordinate system;
after the global coordinate system is determined, the external parameters between the lidar feature points and the inertial measurement data are written as ${}^{I}T_{L} = ({}^{I}R_{L},\ {}^{I}p_{L})$, where ${}^{I}R_{L}$ denotes the attitude of the radar relative to the inertial measurement unit and ${}^{I}p_{L}$ denotes the position of the radar relative to the inertial measurement unit; the current pose between the inertial measurement data and the first-frame data is written as ${}^{G}T_{I} = ({}^{G}R_{I},\ {}^{G}p_{I})$, where ${}^{G}R_{I}$ denotes the attitude of the inertial measurement unit relative to the first frame and ${}^{G}p_{I}$ denotes its position relative to the first frame;
in one implementation, the step of predicting the state of the autonomous system according to the data obtained after the forward propagation includes:
the autonomous system is state predicted using a kinematic model and based on the external parameters between lidar feature points and inertial measurement data.
Specifically, the first IMU frame (denoted as $I$) is taken as the global coordinate system (denoted as $G$), and ${}^{I}T_{L}$ denotes the external parameters between the LiDAR and the IMU.
Step S102: and carrying out state prediction on the autonomous system according to the data obtained after the forward propagation, carrying out motion compensation on the autonomous system according to the data obtained after the backward propagation, and correcting the point cloud based on the motion compensation and obtaining the point cloud containing the construction points.
The point cloud refers to a set of points of the target surface characteristics. The following several application scenarios may be included: a point cloud obtained according to a laser measurement principle, including three-dimensional coordinates (XYZ) and laser reflection Intensity (Intensity); a point cloud obtained according to the photogrammetry principle, comprising three-dimensional coordinates (XYZ) and color information (RGB); the point cloud is obtained by combining laser measurement and photogrammetry principles, and comprises three-dimensional coordinates (XYZ), laser reflection Intensity (Intensity) and color information (RGB). However, in either case, after the spatial coordinates of each sample Point on the object surface are acquired, a set of points is obtained, called a "Point Cloud".
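As a minimal illustration of the point-cloud notion above (field names are assumptions, not the patent's data layout), each sampled surface point carries three-dimensional coordinates plus optional intensity and colour attributes:

```python
from dataclasses import dataclass

@dataclass
class CloudPoint:
    """One sample of the target surface: XYZ plus optional attributes."""
    x: float
    y: float
    z: float
    intensity: float = 0.0   # laser reflection intensity (laser measurement)
    rgb: tuple = None        # colour information (photogrammetry), if any

# A point cloud is simply the set of such points.
cloud = [CloudPoint(1.0, 2.0, 0.5, intensity=0.8),
         CloudPoint(1.1, 2.1, 0.5, intensity=0.7)]
```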
It can be understood that, in step S102, motion compensation is performed by using the data obtained after backward propagation, and element values of the set where each point in the point cloud is located are corrected based on the motion compensation, and the set where the corrected point is located is used as a construction point in the point cloud, so as to obtain the point cloud including the construction point.
In one implementation, the inertial measurement data is back-propagated and motion distortion between a first scan time and a second scan time is compensated during the back-propagation to motion compensate a result of state prediction based on the back-propagated data.
In one implementation, the point cloud results at each time within the frame are corrected by back-propagating the inertial measurement data and using interpolation approximation to obtain the pose relationship.
The following describes the propagation process of the inertial measurement unit IMU:
Forward propagation process: suppose that after fusing the last (i.e., the $k$-th) lidar scan, the optimal state estimate is $\bar{x}_{k}$ and its covariance matrix is $\bar{P}_{k}$. Forward propagation occurs upon receipt of an inertial measurement unit (IMU) result: with the process noise $w$ set to zero, the angular velocity and linear acceleration are integrated, giving the first-round prior estimate used by the iterated extended Kalman filter, $\hat{x}_{i+1} = \hat{x}_{i} \boxplus \left(\Delta t\, f(\hat{x}_{i}, u_{i}, 0)\right)$, together with the covariance of this estimate, $\hat{P}_{i+1} = F_{\tilde{x}}\hat{P}_{i}F_{\tilde{x}}^{T} + F_{w}QF_{w}^{T}$. Setting the process noise $w$ to zero and computing according to the forward step of the Kalman filter yields the predicted process-noise matrix.
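The forward-propagation step can be illustrated with a planar toy model; the state layout and variable names below are assumptions for illustration, and the patent's filter integrates attitude on SO(3) rather than a scalar yaw:

```python
import math

def forward_propagate(state, imu, dt):
    """One forward-propagation step: integrate angular velocity and linear
    acceleration with process noise set to zero. Simplified 2-D sketch
    with state (x, y, yaw, vx, vy) and IMU input (ax, ay, wz)."""
    x, y, yaw, vx, vy = state
    ax, ay, wz = imu
    # rotate body-frame acceleration into the global frame
    gx = ax * math.cos(yaw) - ay * math.sin(yaw)
    gy = ax * math.sin(yaw) + ay * math.cos(yaw)
    return (x + vx * dt, y + vy * dt, yaw + wz * dt,
            vx + gx * dt, vy + gy * dt)

state = (0.0, 0.0, 0.0, 1.0, 0.0)            # moving along +x at 1 m/s
state = forward_propagate(state, (0.0, 0.0, 0.0), 0.01)
```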
Backward propagation process: to compensate for the motion distortion between time $t_{k}$ and time $t_{k+1}$, the inertial measurement data are propagated backward. Starting from the state quantity at the last moment, the state quantities (such as velocity and bias) are interpolated under the assumption of a constant rate of pose change, $\check{x}_{j-1} = \check{x}_{j} \boxplus \left(-\Delta t\, f(\check{x}_{j}, u_{j}, 0)\right)$, where $\boxplus$ denotes the addition logic corresponding to each state quantity (the pose part belongs to the $SO(3)$ Lie-algebra category, whose addition logic differs from that of velocity, position, linear acceleration, etc.), $\check{x}_{j}$ denotes the state at the $j$-th interpolation point, and $f(x, u, w)$ denotes the kinematic relationship, i.e. the derivative of the state at the current pose $x$ under input $u$ and noise $w$. The frequency of the backward propagation is the same as the frequency of the feature points, which is typically much higher than the sampling rate of the IMU. For all feature points sampled between two IMU measurements, the left (earlier) IMU measurement is used as the input to the backward propagation.

The backward propagation produces a relative pose between each intermediate time $t_{j}$ and the scan end time $t_{k}$: ${}^{I_{k}}\check{T}_{I_{j}} = \left({}^{I_{k}}\check{R}_{I_{j}},\ {}^{I_{k}}\check{p}_{I_{j}}\right)$, where ${}^{I_{k}}\check{R}_{I_{j}}$ denotes the interpolated attitude of the IMU coordinate system at the $j$-th interpolation point relative to time $t_{k}$, and ${}^{I_{k}}\check{p}_{I_{j}}$ denotes the corresponding interpolated position. This relative pose enables each local measurement ${}^{L_{j}}p_{j}$ to be projected onto the scan-end measurement ${}^{L_{k}}p_{j}$.
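The projection of a mid-scan point into the scan-end frame can be sketched under a translation-only, constant-velocity assumption (the patent interpolates full back-propagated poses, including rotation; the function name here is illustrative):

```python
def undistort(point, t, t_end, velocity):
    """Motion-compensate one lidar point sampled at time t: under a
    constant-velocity, translation-only assumption, shift the point into
    the frame at scan end time t_end."""
    dt = t_end - t
    return tuple(p + v * dt for p, v in zip(point, velocity))
```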
Step S103: and collecting ultra-wideband data which are time-synchronous with the inertial measurement data, performing time alignment and space alignment on the ultra-wideband data and the laser radar characteristic points, and recording position information corresponding to the laser radar characteristic points. It can be appreciated that the ultra-wideband data and the lidar feature points are time aligned and spatially aligned to ensure that the time stamps and coordinate systems of the two data sources are matched.
In a specific implementation, the anchor points can be laid out in various forms, and the invention is not limited in this regard: for example, four base stations may be arranged (as shown in fig. 2), in which case the position along the degradation direction can be solved and output directly; or only one base station may be arranged (as shown in fig. 3), in which case the single base station measures a distance, and the distance along the degradation direction is recovered from the known, more accurate estimates in the other two directions. The distance measurement data measured by the indoor ultra-wideband positioning module are recorded, and the lidar feature points are collected.
The distance measurement data measured by the high-precision ultra-wideband positioning module in the predetermined direction are collected, and the position information corresponding to the lidar feature points is recorded; the lidar feature points are collected simultaneously within the same time period. The ultra-wideband data and the lidar feature points are then time-aligned and spatially aligned, ensuring that the timestamps and coordinate systems of the two data sources match. By adding the ultra-wideband positioning module's measurement of the degradation-direction distance in the indoor weak-feature environment to the observation equation, the deviation of the lidar in the indoor degradation direction is deduced and corrected.
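The time-alignment step can be sketched as nearest-timestamp matching between the two data streams; the function name and the matching tolerance below are assumptions, not values from the patent:

```python
import bisect

def time_align(uwb_stamps, lidar_stamp, tol=0.02):
    """Pair a lidar feature-point timestamp with the index of the nearest
    UWB range sample, or return None if no sample lies within `tol`
    seconds. `uwb_stamps` must be sorted ascending."""
    i = bisect.bisect_left(uwb_stamps, lidar_stamp)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(uwb_stamps)]
    best = min(candidates, key=lambda j: abs(uwb_stamps[j] - lidar_stamp))
    return best if abs(uwb_stamps[best] - lidar_stamp) <= tol else None
```

Spatial alignment would additionally transform the matched UWB measurement into the lidar/IMU coordinate system using the known anchor positions.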
It should be noted that the distance measurement data measured by the ultra-wideband positioning module may be collected along a predetermined direction, for example along the indoor degradation direction. Of course, collecting "along the indoor degradation direction" is only a preferred implementation of the present invention; the invention does not limit the predetermined direction to any specific direction, and those skilled in the art should determine the specific direction according to the practical application.
Other observables from the radar point cloud are as follows. Let the state estimate in the current iteration of the update be $\hat{x}_k$. Each measured LiDAR point ${}^L p_j$ is projected into the global coordinate system as ${}^G \hat{p}_j$, where ${}^G \hat{p}_j$ denotes the position of the measured LiDAR point in the global frame, and its nearest 5 points are searched in the map represented by the ikd-Tree. The nearest neighbors found are then used to fit a local small plane patch with normal vector $u_j$ and centroid $q_j$. Furthermore, the measurement equation is approximated by a first-order expansion at $\hat{x}_k$. The quantity

$z_j = u_j^{T}\left({}^G \hat{p}_j - q_j\right)$

is called the residual. The error it contains, namely that a point inferred to lie on the plane does not, under the current pose, project onto that plane, is mainly the plane misalignment caused by the pose error, and plays the role of the measurement noise in the process.
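The point-to-plane residual computation can be sketched as follows; fitting the local plane to the five nearest neighbors via SVD is one common choice (the patent does not specify the fitting method, so this detail is an assumption):

```python
import numpy as np

def plane_residual(p_world, neighbors):
    """Point-to-plane residual used in the measurement equation.

    p_world:   (3,) LiDAR point projected into the global frame.
    neighbors: (k, 3) nearest map points (k >= 3), assumed near-coplanar.
    Returns (residual, normal, centroid): the signed distance from
    p_world to the plane fitted through the neighbors.
    """
    q = neighbors.mean(axis=0)               # plane centroid
    _, _, vt = np.linalg.svd(neighbors - q)  # last row = direction of least variance
    u = vt[-1]                               # unit plane normal
    return float(u @ (p_world - q)), u, q

# Five points on the plane z = 1, query point 0.2 m off the plane
nbrs = np.array([[0, 0, 1], [1, 0, 1], [0, 1, 1], [1, 1, 1], [0.5, 0.5, 1]], float)
r, u, q = plane_residual(np.array([0.3, 0.4, 1.2]), nbrs)
```

The sign of the normal from SVD is arbitrary, so only the magnitude of the residual is geometrically meaningful here.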
Step S104: and compensating laser radar errors along a predetermined direction dimension based on the ultra-wideband data.
In one implementation, the bias of the laser radar along the predetermined direction is calculated from the ultra-wideband measurements of the distance along the degradation direction of the indoor weak-feature environment.
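One possible sketch of this compensation, assuming the bias is corrected only along the degenerate unit direction using a single anchor's range (a simplification of folding the range measurement into the observation equation; all names are illustrative):

```python
import numpy as np

def compensate_degenerate_axis(p_est, anchor, uwb_range, d):
    """Correct the position estimate along the degenerate direction d.

    p_est:     (3,) current LiDAR/IMU position estimate.
    anchor:    (3,) known UWB base-station position.
    uwb_range: measured anchor-to-robot distance (m).
    d:         (3,) unit vector of the degenerate (e.g. tunnel) direction.
    Only the component of the range error along d is corrected, leaving
    the well-observed directions to the LiDAR.
    """
    predicted = np.linalg.norm(p_est - anchor)
    err = uwb_range - predicted               # scalar range innovation
    g = (p_est - anchor) / predicted          # range gradient at the estimate
    gd = float(g @ d)                         # sensitivity of range to motion along d
    if abs(gd) < 1e-6:
        return p_est                          # range insensitive to d; nothing to fix
    return p_est + d * (err / gd)

# Tunnel along x: anchor at origin, true x = 30 m, estimate drifted to 29 m
p = compensate_degenerate_axis(np.array([29.0, 0, 0]), np.zeros(3),
                               30.0, np.array([1.0, 0, 0]))
```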
Step S105: and calculating residual errors between the state quantity predicted value of each construction point and the measured value of the plane to which each construction point belongs in the current iteration round.
Step S106: and adjusting the iteration weight of the error iteration Kalman filter according to the residual error, matching the construction point cloud based on the adjusted iteration weight, and updating the map points.
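An illustrative sketch of residual-driven weighting: a Huber-style scheme that down-weights points with large point-to-plane residuals before the next filter iteration. The exact weighting rule is an assumption; the text only states that the iteration weights are adjusted from the residuals.

```python
import numpy as np

def residual_weights(residuals, c=0.1):
    """Huber-style weights for the iterated update.

    Points whose point-to-plane residual exceeds the inlier threshold c
    (metres, assumed value) are down-weighted, e.g. by inflating their
    measurement noise R_i = R0 / w_i, so the Kalman gain trusts them less.
    """
    r = np.abs(residuals)
    return np.where(r <= c, 1.0, c / r)

w = residual_weights(np.array([0.02, 0.05, 0.5]))
```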
In one implementation, an incremental KD-tree data structure may be employed and the construction point cloud matched and map points updated based on an adjusted iterative extended Kalman filter.
Map points are organized in an ikd-Tree, which grows dynamically by merging new point cloud scans at the odometry rate. To keep the map size bounded, only map points in a large local region of length L around the current laser radar position are retained in the ikd-Tree.
The map region is initialized as a cube of side length L centered at the initial lidar position ${}^G p_0$. The detection region of the lidar is assumed to be a sphere centered at the current lidar position, with radius $R = \gamma r$, where $r$ is the lidar field-of-view range and $\gamma$ is a relaxation parameter greater than 1. When the lidar moves to a new position ${}^G p_k$ at which the detection sphere touches a map boundary, the map region is moved in the direction that increases the distance between the lidar detection region and the touching boundary. The distance the map region moves is set to a constant $d = (\gamma - 1)\, r$. All points in the subtraction region between the new map region and the old one are deleted from the ikd-Tree by a box-wise delete operation.
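The map-region sliding rule can be sketched per axis as follows. The constant move distance d = (gamma - 1) * r and the axis-by-axis boundary handling are assumptions consistent with FAST-LIO2-style map management, not details fixed by the text:

```python
import numpy as np

def move_map_region(lo, hi, p, r, gamma=1.5):
    """Slide the axis-aligned L x L x L map region when the detection
    sphere (radius R = gamma * r, centred at lidar position p) touches
    a boundary face; the region shifts away from that face by the
    constant distance d = (gamma - 1) * r. Returns new (lo, hi) corners;
    points leaving the region would then be removed from the ikd-Tree."""
    R = gamma * r
    d = (gamma - 1.0) * r
    lo, hi = lo.copy(), hi.copy()
    for k in range(3):
        if p[k] - R <= lo[k]:          # sphere touches lower face
            lo[k] -= d; hi[k] -= d
        elif p[k] + R >= hi[k]:        # sphere touches upper face
            lo[k] += d; hi[k] += d
    return lo, hi

# Lidar near the upper x boundary of a 100 m cube, FoV range 10 m
lo, hi = move_map_region(np.array([0.0, 0, 0]), np.array([100.0, 100, 100]),
                         np.array([95.0, 50, 50]), r=10.0)
```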
It should be noted that in weak-feature environments where GPS signals are weak or unavailable, such as indoor (e.g. tunnels, mines) and outdoor (e.g. urban canyons) man-made enclosed environments, the terrain is usually complex and there are few features along the degradation direction, so the acquisition and processing of LiDAR data may be limited. This degrades autonomous positioning, which in turn reduces the accuracy of the resulting global or local map and of obstacle detection.
The autonomous system positioning method provided by the invention proposes an IESKF-based framework and introduces ultra-wideband data, making it suitable for indoor and outdoor environments with weak or no GPS signals. Autonomous system positioning and map updating are performed by fusing three factors: inertial measurement data, ultra-wideband data and laser radar feature points. This realizes multi-source data fusion and improves the accuracy and reliability of autonomous system positioning where GPS-based positioning is unreliable. The IESKF fuses the information of the inertial measurement data and the laser radar feature points, so that the position and attitude of the autonomous system are estimated simultaneously and the positioning performance is improved. In addition, the invention adopts an incremental KD-tree data structure, which reduces the computational complexity. Finally, an autonomous positioning experiment of an unmanned aerial vehicle in a tunnel environment, as shown in fig. 4, verifies that the proposed positioning scheme effectively improves positioning accuracy in weak-feature environments, indoor and outdoor alike.
It should be noted that the invention provides a positioning method that fuses laser radar feature points with inertial measurement data and ultra-wideband data using an iterated extended Kalman filter. The method is based on the FAST-LIO2 framework: during laser radar scanning, the state is predicted by forward propagation of the inertial measurement unit (IMU) data, and point cloud motion compensation is performed by backward propagation. Ultra-wideband data is introduced to compensate the laser radar error along the predetermined direction dimension and is incorporated in the Kalman gain formula, so that the weight of each observation is adjusted and the accuracy along the predetermined direction is effectively improved.
Example two
Fig. 5 is a block diagram of an autonomous system positioning device according to an embodiment of the present invention. Corresponding to the autonomous system positioning method described above, and to solve the above technical problem, the present invention provides an autonomous system positioning device, where the autonomous system has an ultra-wideband positioning module. The device includes:
a data acquisition module 210, configured to acquire inertial measurement data and a point cloud of the autonomous system, and perform forward propagation and backward propagation on the inertial measurement data based on a kinematic model; the point cloud comprises space points formed by measuring angles and distances of points returned by the laser radar;
The data processing module 220 is configured to predict a state of the autonomous system according to the data obtained after the forward propagation, perform motion compensation on the autonomous system according to the data obtained after the backward propagation, and correct a point cloud based on the motion compensation and obtain a point cloud including a construction point;
the data measurement module 230 is configured to collect ultra-wideband data that is time-synchronized with the inertial measurement data, time-align and spatially-align the ultra-wideband data with the lidar feature points, and record location information corresponding to the lidar feature points;
the data processing module 220 is further configured to compensate for a laser radar error along a predetermined direction dimension based on the ultra wideband data;
the data processing module 220 is further configured to calculate a residual error between a state quantity predicted value of each construction point and a measured value of a plane to which each construction point belongs in the current iteration round;
the positioning module 240 is configured to adjust an iteration weight of the error iterative kalman filter according to the residual error, and match the construction point cloud and update the map point based on the adjusted iteration weight.
According to the autonomous system positioning device provided by the invention, based on the IESKF framework, ultra-wideband data is introduced, making the device suitable for indoor and outdoor environments with weak or no GPS signals, and autonomous system positioning and map updating are performed by fusing three factors: inertial measurement data, ultra-wideband data and laser radar feature points. This realizes multi-source data fusion and improves the accuracy and reliability of autonomous system positioning where GPS-based positioning is unreliable. The IESKF fuses the information of the inertial measurement data and the laser radar feature points, so that the position and attitude of the autonomous system are estimated simultaneously and the positioning performance is improved. In addition, the invention adopts an incremental KD-tree data structure, which reduces the computational complexity. The autonomous positioning experiment of an unmanned aerial vehicle in a tunnel environment also verifies that the proposed positioning scheme effectively improves positioning accuracy in weak-feature environments, indoor and outdoor alike.
Example III
To solve the above technical problem, the present invention provides a computer device, as shown in fig. 6, including a memory 310, a processor 320, and a computer program stored on the memory and capable of running on the processor, where the processor implements the method as described above when executing the computer program.
The computer equipment can be a desktop computer, a notebook computer, a palm computer, a cloud server and other computing equipment. The computer devices may include, but are not limited to, a processor 320, a memory 310. It will be appreciated by those skilled in the art that fig. 6 is merely an example of a computer device and is not intended to be limiting, and that more or fewer components than shown may be included, or certain components may be combined, or different components may be included, for example, in a computer device that may also include input-output devices, network access devices, buses, etc.
The processor 320 may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 310 may be an internal storage unit of the computer device, such as a hard disk or a memory of the computer device. The memory 310 may also be an external storage device of a computer device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash Card (Flash Card) or the like. Further, the memory 310 may also include both internal storage units and external storage devices of the computer device. The memory 310 is used to store the computer program and other programs and data required by the computer device. The memory 310 may also be used to temporarily store data that has been output or is to be output.
Example IV
The present application also provides a computer-readable storage medium, which may be a computer-readable storage medium contained in the memory in the above embodiments; or a computer readable storage medium, alone, that is not assembled into a computer device. The computer readable storage medium stores one or more computer programs which when executed by a processor implement the methods described above.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the method of the above embodiment, or may be implemented by a computer program to instruct related hardware, where the computer program may be stored in a computer readable storage medium, and when the computer program is executed by a processor, the computer program may implement the steps of each method embodiment described above. Wherein the computer program comprises computer program code which may be in source code form, object code form, executable file or some intermediate form etc. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a U disk, a removable hard disk, a magnetic disk, an optical disk, a computer Memory 310, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), an electrical carrier wave signal, a telecommunication signal, a software distribution medium, and so forth. It should be noted that the computer readable medium contains content that can be appropriately scaled according to the requirements of jurisdictions in which such content is subject to legislation and patent practice, such as in certain jurisdictions in which such content is subject to legislation and patent practice, the computer readable medium does not include electrical carrier signals and telecommunication signals.
Example five
The method for positioning an autonomous system according to the present invention will be described in detail below with reference to the weak environment feature of "tunnel".
And 1, measuring coordinates of the beginning end and the end of the tunnel by using a total station.
Specifically, targets are erected on the walkway edges at both ends of the tunnel (in practical applications, the targets on the ultra-wideband base stations can be used), the coordinates of the target positions are measured with a total station, and the tunnel direction vector is obtained by processing the measured coordinates of the tunnel start and end.
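The tunnel direction vector reduces to a normalized difference of the two measured coordinates; a trivial sketch (names and coordinates are illustrative):

```python
import numpy as np

def tunnel_direction(start, end):
    """Unit direction vector of the tunnel from the measured start and
    end coordinates, expressed in the total-station frame."""
    v = np.asarray(end, float) - np.asarray(start, float)
    return v / np.linalg.norm(v)

d = tunnel_direction([10.0, 5.0, 0.2], [40.0, 5.0, 0.2])
```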
Step 2, fixing the ultra-wideband base station near a starting point, and measuring the coordinates by using a total station; the anchor point layout mode is shown in fig. 2 and 3.
Step 3, a calibration frame is installed at the radar starting point. For example, for the anchor layout in fig. 2, the start position and attitude and the end position can be calculated by measuring the target positions of the four corner points. The specific scheme is as follows:
(1)
A positioner calibration frame as shown in fig. 7 is fabricated from I-shaped structural steel, and the connection mechanism between the I-beam and the radar top is fabricated by 3D printing. The wedge-shaped structure of the heat sink on the upper part of the radar is used to clamp and fix the radar, and the I-beam is fixed in the groove using the printed part's dedicated slot, screw holes and fixing plate, giving the whole structure stability.
Using information from the radar drawing, two lower calibration targets are pasted at the same height on the two vertical steel members, and the total-station coordinate of the radar center's starting point is obtained by averaging. The two upper targets are placed at the centers of the upper edges of the two vertical members of the I-beam. Given the theoretical coplanarity of the four points, the x-axis direction is obtained by the cross product of in-plane vectors; the z-axis direction is obtained by normalizing and averaging the vectors from the lower to the upper target on each vertical member; the y-axis direction is obtained by normalizing and averaging the upper left-to-right and lower left-to-right connecting vectors; finally, the pairwise perpendicularity of the three axes is verified using the cross-product relations of the unit vectors. After the error is confirmed, the initial attitude matrix is obtained through coordinate transformation, with the initial position origin at the radar center.
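The axis construction can be sketched for an ideal rectangular target layout; the corner naming and the right-handed completion of x via a cross product are assumptions where the text is ambiguous:

```python
import numpy as np

def calibration_frame(ul, ur, ll, lr):
    """Initial attitude from four nominally coplanar targets on the
    I-beam frame (upper-left, upper-right, lower-left, lower-right).

    z: mean of the two lower-to-upper target vectors, normalised;
    y: mean of the two left-to-right connecting vectors, normalised;
    x: y x z, completing a right-handed triad from in-plane vectors.
    Returns the 3x3 rotation whose columns are x, y, z."""
    unit = lambda v: v / np.linalg.norm(v)
    z = unit(unit(ul - ll) + unit(ur - lr))
    y = unit(unit(ur - ul) + unit(lr - ll))
    x = unit(np.cross(y, z))
    # the pairwise-perpendicularity check mentioned in the text
    assert max(abs(x @ y), abs(y @ z), abs(z @ x)) < 1e-6
    return np.column_stack([x, y, z])

# Ideal 1 m x 1 m rectangle in the y-z plane
R = calibration_frame(np.array([0, 0, 1.0]), np.array([0, 1, 1.0]),
                      np.array([0, 0, 0.0]), np.array([0, 1, 0.0]))
```

With real survey data the assertion tolerance would need to reflect target placement error rather than the exact value used here.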
The end position is obtained by a method similar to the start position. Since the attitude at the end does not need to be calculated, only the positions of the two lower targets need to be measured; the total-station coordinate of the radar center, i.e. the end coordinate, is then taken as the midpoint of their connecting line.
And 4, after the ultra-wideband positioning module is installed at the top of the radar equipment, the positioning device is held to move, and the ROS data set rosbag is recorded.
Step 5, measuring coordinates of the end point position (similar to step 1, the relevant points can refer to step 1), and the specific measurement scheme is as follows:
(1) And establishing a total station coordinate system.
A position with a good view and flat terrain is selected in the experimental field, and the total station is installed there. After leveling the total station, the instrument height is measured with a tape measure and entered into the total station. At this point, the ground foot point of the laser plummet is the origin of the total-station coordinate system. A prism is installed at an arbitrary position in the tunnel, the total station is aimed at the prism position, and the current total-station angle is set to zero.
(2) The total station marks the base station position.
Firstly, attaching a reflective patch to the center of a base station, enabling a lens of a total station to be aligned to the center of the reflective patch, measuring coordinates at the initial position by using a coordinate measuring mode by using the total station, and recording the coordinates as base station coordinates.
(3) An endpoint coordinate measurement scheme is performed.
Since the total station cannot directly observe the end position at the start point, a plurality of relay points are selected to indirectly measure the end coordinates.
As shown in fig. 7, the relay point coordinates and the alignment coordinate system are measured as follows:
1) And installing a prism above the relay point, measuring the height of the prism, setting the height of the prism in the total station, aligning the total station with the center of the prism, and measuring the three-dimensional coordinates of the ground point where the prism is positioned by using a coordinate measurement mode.
2) And setting a reflective patch near the relay point, and measuring the coordinates of the reflective patch by using a total station.
3) The angle a is calculated using the retroreflective stickers and the prism coordinates.
4)
The total station is moved to the prism position and leveled, and its laser plummet is centered on the relay point. The total station is then aimed at the reflective patch and the current angle is set to a, so that the coordinate axes at the relay point are parallel to the original coordinate axes. Adding the relay point's coordinates in the original coordinate system to the coordinates measured at the relay point gives coordinates in the original total-station coordinate system. The end coordinates can be measured indirectly by repeating this procedure; please refer to fig. 8.
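Since each relay hop re-aligns the axes parallel to the original frame, chaining the measurements reduces to summing translations; a minimal sketch (names and coordinates are illustrative):

```python
import numpy as np

def chain_to_origin(relay_points, local_coord):
    """Indirect endpoint measurement through relay stations.

    Each relay station's axes are re-aligned parallel to the original
    total-station frame, so every hop is a pure translation: the
    endpoint in the original frame is the sum of each relay point's
    coordinates (expressed in its predecessor's frame) plus the final
    local measurement."""
    p = np.zeros(3)
    for r in relay_points:
        p += np.asarray(r, float)
    return p + np.asarray(local_coord, float)

# Two relay hops down the tunnel, then the endpoint measured locally
end = chain_to_origin([np.array([50.0, 2, 0]), np.array([45.0, -1, 0])],
                      np.array([30.0, 0.5, 0]))
```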
And 6, running the program and checking the recorded result.
Referring to fig. 4, the above measurement results show that the positioning accuracy of the autonomous system provided by the invention is greatly improved: the deviation over 30 m can be kept within 5 cm. The test mapping result is shown in fig. 9; under this algorithm the mapping result shows no ghosting or overlap. The positioning result and the ground truth can be seen in fig. 4, where the dotted line represents the total-station reference end point and the solid line represents the converted odometry result; as fig. 4 shows, the accuracy difference between the two is 5 cm.
It should be noted that, for the system or apparatus embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and references to the parts of the description of the method embodiments are only required.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
It is to be understood that the terminology used in the description of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted as "when", "in response to determining" or "in response to detecting", depending on the context. Similarly, the phrase "if [a condition or event] is determined" or "if [a condition or event] is detected" may be interpreted, depending on the context, as "upon determining", "in response to determining", "upon detecting [the condition or event]" or "in response to detecting [the condition or event]".
Although the embodiments of the present invention are disclosed above, the embodiments are only used for the convenience of understanding the present invention, and are not intended to limit the present invention. Any person skilled in the art can make any modification and variation in form and detail without departing from the spirit and scope of the present disclosure, but the scope of the present disclosure is still subject to the scope of the present disclosure as defined by the appended claims.

Claims (10)

1. An autonomous system positioning method, the autonomous system having an ultra wideband positioning module, comprising:
acquiring inertial measurement data and point cloud of the autonomous system, and performing forward propagation and backward propagation on the inertial measurement data based on a kinematic model; the point cloud comprises space points formed by measuring angles and distances of points returned by the laser radar;
performing state prediction on the autonomous system according to the data obtained after the forward propagation, performing motion compensation on the autonomous system according to the data obtained after the backward propagation, and correcting the point cloud based on the motion compensation to obtain a point cloud containing construction points;
collecting ultra-wideband data which are time-synchronous with the inertial measurement data, performing time alignment and space alignment on the ultra-wideband data and the laser radar characteristic points, and recording position information corresponding to the laser radar characteristic points;
compensating for a lidar error along a predetermined directional dimension based on the ultra-wideband data;
calculating residual errors between state quantity predicted values of all construction points and measured values of planes of all construction points in the current iteration round;
And adjusting the iteration weight of the error iteration Kalman filter according to the residual error, matching the construction point cloud based on the adjusted iteration weight, and updating the map points.
2. The autonomous system positioning method of claim 1, wherein the step of forward propagating and backward propagating the inertial measurement data based on a kinematic model comprises:
integrating the inertial measurement data and then performing forward propagation to obtain a state quantity predicted value and a noise estimated value of the autonomous system;
and carrying out multi-source data fusion on the inertial measurement data, the ultra-wideband data and the laser radar characteristic points through an iterative extended Kalman filter so as to simultaneously estimate the position and the gesture of the autonomous system.
3. The autonomous system positioning method of claim 2, wherein said step of motion compensating said autonomous system in accordance with said backward propagated data comprises:
the inertial measurement data is back-propagated and motion distortion between the first scan time and the second scan time is compensated during the back-propagation.
4. The autonomous system positioning method of claim 3, further comprising, prior to forward and backward propagating the inertial measurement data based on a kinematic model:
Taking a first IMU frame in the inertial measurement data as a global coordinate system;
calculating external parameters between the laser radar characteristic points and the inertial measurement data after the global coordinate system is determined;
accordingly, the step of predicting the state of the autonomous system according to the data obtained after the forward propagation includes:
the autonomous system is state predicted using a kinematic model and based on the external parameters between lidar feature points and inertial measurements.
5. The autonomous system positioning method of claim 1, wherein said step of motion compensating said autonomous system in accordance with said backward propagated data comprises:
and (3) carrying out backward propagation on the inertial measurement data, and obtaining a point cloud result of each moment in the current frame by utilizing interpolation approximation to correct the attitude relation.
6. The autonomous system positioning method of claim 1, wherein the step of compensating for a lidar error along a predetermined directional dimension based on the ultra-wideband data comprises:
and calculating and obtaining the deviation of the laser radar in a preset direction according to the measured data of the ultra-wideband data along the degradation direction distance of the indoor weak characteristic environment.
7. The autonomous system positioning method of claim 2, wherein the step of matching the construction point cloud based on the adjusted iterative extended kalman filter and updating map points comprises:
and adopting an incremental KD tree data structure, matching the construction point cloud based on the adjusted iterative extended Kalman filter, and updating the map points.
8. An autonomous system positioning device, the autonomous system having an ultra wideband positioning module, comprising:
the data acquisition module is used for acquiring inertial measurement data and point cloud of the autonomous system, and performing forward propagation and backward propagation on the inertial measurement data based on a kinematic model; the point cloud comprises space points formed by measuring angles and distances of points returned by the laser radar;
the data processing module is used for carrying out state prediction on the autonomous system according to the data obtained after the forward propagation, carrying out motion compensation on the autonomous system according to the data obtained after the backward propagation, and correcting the point cloud based on the motion compensation and obtaining the point cloud containing the construction points;
the data measurement module is used for acquiring ultra-wideband data which are time-synchronous with the inertia measurement data, carrying out time alignment and space alignment on the ultra-wideband data and the laser radar characteristic points, and recording position information corresponding to the laser radar characteristic points;
The data processing module is further used for compensating laser radar errors along a preset direction dimension based on the ultra-wideband data;
the data processing module is also used for calculating residual errors between the state quantity predicted value of each construction point and the measured value of the plane to which each construction point belongs in the current iteration round;
and the positioning module is used for adjusting the iteration weight of the error iteration Kalman filter according to the residual error, matching the construction point cloud based on the adjusted iteration weight and updating the map points.
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any one of claims 1 to 7 when the computer program is executed.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the method according to any one of claims 1 to 7.
CN202410200168.0A 2024-02-23 2024-02-23 Autonomous system positioning method, device, computer equipment and storage medium Active CN117804448B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410200168.0A CN117804448B (en) 2024-02-23 2024-02-23 Autonomous system positioning method, device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN117804448A true CN117804448A (en) 2024-04-02
CN117804448B CN117804448B (en) 2024-04-30

Family

ID=90431688

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410200168.0A Active CN117804448B (en) 2024-02-23 2024-02-23 Autonomous system positioning method, device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117804448B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6240367B1 (en) * 1998-11-27 2001-05-29 Ching-Fang Lin Full fusion positioning method for vehicle
US20170367766A1 (en) * 2016-03-14 2017-12-28 Mohamed R. Mahfouz Ultra-wideband positioning for wireless ultrasound tracking and communication
CN109946730A (en) * 2019-03-06 2019-06-28 东南大学 Ultra-wideband-based high-reliability fusion positioning method for vehicles under cooperation of vehicle and road
CN115308684A * 2022-07-05 2022-11-08 广州晓网电子科技有限公司 UWB ultra-wideband indoor positioning method and device
CN115598634A * 2022-10-24 2023-01-13 中国科学技术大学 Ultra-wideband radar and laser radar fusion positioning method, device and storage medium
CN116242372A * 2023-01-03 2023-06-09 东南大学 UWB-laser radar-inertial navigation fusion positioning method in a GNSS-denied environment
CN116642482A (en) * 2023-04-14 2023-08-25 陕西远海探科电子科技有限公司 Positioning method, equipment and medium based on solid-state laser radar and inertial navigation
US11808848B1 (en) * 2022-10-08 2023-11-07 Zhejiang Deqing Zhilu Navigation Technology Co., LTD Method, system and terminal for wide-area acoustic indoor positioning based on RF enhancement
CN117570958A * 2023-11-02 2024-02-20 霞智科技有限公司 Robust positioning method applied to an unstructured environment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
巩青歌; 段妍羽; 彭圳生; 彭爱军: "Research on indoor and outdoor continuous positioning technology based on multi-technique fusion", 激光杂志 (Laser Journal), no. 06, 25 June 2018 (2018-06-25) *

Also Published As

Publication number Publication date
CN117804448B (en) 2024-04-30

Similar Documents

Publication Publication Date Title
He et al. An integrated GNSS/LiDAR-SLAM pose estimation framework for large-scale map building in partially GNSS-denied environments
CN108921947B (en) Method, device, equipment, storage medium and acquisition entity for generating electronic map
Puente et al. Accuracy verification of the Lynx Mobile Mapper system
EP3454008A1 (en) Survey data processing device, survey data processing method, and survey data processing program
CN110187375A Method and device for improving positioning accuracy based on SLAM positioning results
CN116625354B (en) High-precision topographic map generation method and system based on multi-source mapping data
CN111025032B (en) Aerial beam measuring system and method based on lift-off platform
Lösler et al. Terrestrial monitoring of a radio telescope reference point using comprehensive uncertainty budgeting: Investigations during CONT14 at the Onsala Space Observatory
CN109031269A (en) Localization method, system, equipment and storage medium based on millimetre-wave radar
CN112702699B (en) Indoor positioning method fusing UWB and LiDAR
CN115200572B (en) Three-dimensional point cloud map construction method and device, electronic equipment and storage medium
CN115950418A (en) Multi-sensor fusion positioning method
Khoshelham et al. Vehicle positioning in the absence of GNSS signals: Potential of visual-inertial odometry
CN112815863B (en) Deformation monitoring system, deformation monitoring method, deformation calculation device and storage medium
CN112098926B (en) Intelligent angle measurement training sample generation method by using unmanned plane platform
Ren et al. Optimal camera focal length detection method for GPS-supported bundle adjustment in UAV photogrammetry
CN113534110A (en) Static calibration method for multi-laser radar system
CN117804448B (en) Autonomous system positioning method, device, computer equipment and storage medium
CN113495281B (en) Real-time positioning method and device for movable platform
CN115752468A (en) Unmanned aerial vehicle obstacle avoidance method based on hand-eye coordination
CN114705223A (en) Inertial navigation error compensation method and system for multiple mobile intelligent bodies in target tracking
WO2022256976A1 (en) Method and system for constructing dense point cloud truth value data and electronic device
Elsayed et al. From Stationary to Mobile: Unleashing the Full Potential of Terrestrial LiDAR through Sensor Integration
Zhan et al. Fast Self-calibration Method for Massive UWB Anchors Aided by Odometry
CN118244785B (en) Amphibious unmanned aerial vehicle with air-ground double modes, positioning method and device thereof, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant