CN115601519B - Remote control-oriented control end mapping method under low communication bandwidth - Google Patents

Remote control-oriented control end mapping method under low communication bandwidth


Publication number
CN115601519B
Authority
CN
China
Prior art keywords
point cloud
map
control end
incremental
robot
Prior art date
Legal status
Active
Application number
CN202211382900.8A
Other languages
Chinese (zh)
Other versions
CN115601519A (en)
Inventor
任明武 (Ren Mingwu)
孙苏军 (Sun Sujun)
Current Assignee
Nanjing University of Science and Technology
Original Assignee
Nanjing University of Science and Technology
Priority date: 2022-11-07
Filing date: 2022-11-07
Publication date: 2024-04-05
Application filed by Nanjing University of Science and Technology
Priority to CN202211382900.8A
Publication of CN115601519A: 2023-01-13
Application granted; publication of CN115601519B: 2024-04-05
Legal status: Active


Classifications

    • G06T17/05 — Three-dimensional [3D] modelling: geographic models
    • G01C21/005 — Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/1652 — Inertial dead reckoning combined with non-inertial navigation instruments, with ranging devices, e.g. LIDAR or RADAR
    • G06T17/005 — Three-dimensional [3D] modelling: tree description, e.g. octree, quadtree
    • H04L69/04 — Protocols for data compression, e.g. ROHC

Abstract

The invention discloses a control-end mapping method for remote control under low communication bandwidth. The robot end performs simultaneous localization and mapping (SLAM) and maintains the sub-maps of consecutive times in a double-buffered octree to obtain incremental point clouds. After encoding and compression, the incremental point clouds are transmitted to the control end together with the robot pose output by the SLAM algorithm. The control end reconstructs a 3D point cloud map from the incremental point clouds, evaluates the map quality through a map quality evaluation and feedback module, and feeds the result back to the robot end so that the incremental point cloud computation parameters are adjusted adaptively. Because the control end rebuilds the three-dimensional map from incremental point clouds, the communication bandwidth requirement of the human-machine collaborative remote control system is greatly reduced; and because the three-dimensional map is generated at the control end, the operator's workload is greatly reduced and operating accuracy is improved.

Description

Remote control-oriented control end mapping method under low communication bandwidth
Technical Field
The invention belongs to the technical field of human-machine collaboration, and particularly relates to a control-end mapping method for remote control under low communication bandwidth.
Background
In a human-machine collaborative remote control system, sensor data such as images acquired at the robot end are generally sent to the control end so that operators can observe the robot's surroundings with less effort. Giving the control end access to lidar point cloud data benefits such a system in two ways. First, other sensors can fail in certain scenes: indoors without illumination or in underground spaces, camera images are often completely dark and operators cannot judge the robot's surroundings from them, whereas a three-dimensional point cloud map built from laser point clouds is unaffected by illumination conditions. Second, a three-dimensional point cloud map reflects the robot's surroundings more intuitively, reducing the operator's workload and improving operating accuracy.
However, because the volume of point cloud data is very large, current human-machine collaborative remote control systems transmit only compressed image data to the control end, not point clouds; under low communication bandwidth, transmitting point cloud data to the control end becomes practically infeasible.
Some multi-robot collaborative systems do transmit point cloud data between robots, or between robots and base stations. The paper "Complementary multi-modal sensor fusion for resilient robot pose estimation in subterranean environments" compresses a single point cloud factor graph to about 2 MB with the DRACO algorithm and sends it to a base station for global optimization; "LOCUS 2.0: Robust and computationally efficient lidar odometry for real-time underground 3D mapping" downsamples point cloud keyframes with a voxel grid before transmitting them to other robots and base stations; Tianling et al. (CN115166686A) disclose a distributed co-location and mapping method for multiple unmanned aerial vehicles in a satellite-denied environment in which point cloud features are exchanged among the UAVs. These methods mainly transmit point cloud data between robots and base stations in multi-robot collaborative systems, with the emphasis on optimizing each robot's pose. Moreover, they either do not compress the point cloud data or use compression that is not efficient enough: even at a frame rate of 1 frame/s, the DRACO-compressed factor graphs still require roughly 2 MB/s of transmission bandwidth, and voxel-grid downsampling accumulates many redundant repeated points across frames. Point cloud transmission therefore still occupies a very large bandwidth, and once image and other data are added the transmission link becomes overwhelmed, which is not feasible under low communication bandwidth.
In summary, constructing a three-dimensional point cloud map at the control end under low communication bandwidth is of great significance, but several technical problems remain.
Disclosure of Invention
In order to solve the technical defects in the prior art, the invention provides a remote control-oriented control end mapping method under low communication bandwidth.
The technical scheme for realizing the purpose of the invention is as follows: a remote control-oriented control end mapping method under low communication bandwidth comprises the following steps:
s1: the robot end acquires multi-sensor data, solves the pose of the robot under a global coordinate system in real time through a synchronous positioning and mapping algorithm, and maintains a map under the global coordinate system;
s2: the robot end obtains an incremental point cloud through an incremental point cloud acquisition algorithm, encodes and compresses the incremental point cloud, and sends the encoded and compressed incremental point cloud to the control end together with the current pose of the robot;
s3: the control end decodes the incremental point cloud, creates a three-dimensional point cloud map, and executes a map quality evaluation and feedback module that evaluates the map construction quality and feeds the result back to the robot end.
Preferably, the specific method by which the robot end acquires multi-sensor data, solves the robot pose in the global coordinate system in real time through a simultaneous localization and mapping algorithm, and maintains a map in the global coordinate system is as follows:
S11: take the initial lidar pose T_0 as the origin pose of the global coordinate system;
S12: at time k, the unmanned vehicle collects laser point cloud data P_k and IMU data I_k in real time, and converts the IMU data into IMU data I'_k in the lidar coordinate system using the calibration parameters;
S13: at time k, fuse the laser point cloud data P_k and the IMU data I'_k in an iterated Kalman filter framework to compute the laser-inertial odometry, obtaining the lidar pose T_k in the global coordinate system at time k;
S14: at time k, transform the laser point cloud data P_k from the lidar coordinate system into the global coordinate system according to T_k, obtaining the laser point cloud data P'_k in the global coordinate system;
S15: at time k, superimpose the laser point cloud data P'_k in the global coordinate system onto the global map Q_{k-1} of time k-1, obtaining the global map Q_k at time k.
Preferably, the robot end obtains the incremental point cloud through an incremental point cloud acquisition algorithm, specifically as follows:
S21: set the parameters of the incremental point cloud acquisition algorithm, including the octree resolution r and the point cloud field-of-view angle α;
S22: at each time step, crop from the current global map the sub-map lying in the [-α, α] sector in front of the robot;
S23: organize the sub-maps of time k and time k+1 in a double-buffered octree of resolution r, and XOR the two buffers to obtain the difference between the two sub-maps, i.e. the incremental point cloud at time k+1.
Preferably, in step S21 the initial parameters of the incremental point cloud acquisition algorithm are set through a configuration file, after which the parameters are adaptively adjusted according to the feedback result from the control end, dynamically controlling the number of incremental points.
Preferably, in step S3 the three-dimensional point cloud map is created at the control end by directly superimposing successive frames of incremental point clouds, which are already expressed in the global coordinate system, yielding a three-dimensional point cloud map synchronized with the robot end.
Preferably, the map quality evaluation and feedback module evaluates the map construction quality as follows:
S31: initialize the parameters, including the execution frame rate f, the local sub-map radius R, the map density factor minimum threshold M_min and maximum threshold M_max, the octree resolution adjustment coefficient Δr, and the point cloud field-of-view adjustment coefficient Δα;
S32: every f frames, the control end analyzes the local sub-map within radius R, counting the number of points n in the sub-map and the distance d_i from each point to its nearest neighbour;
S33: compute the point cloud map density factor M = (1/n) Σ_{i=1..n} d_i;
S34: if M > M_max, the point cloud map quality is poor; adjust the octree resolution to r = r - Δr and the point cloud field-of-view angle to α = α + Δα;
S35: if M < M_min, the point cloud map is denser than necessary; adjust the octree resolution to r = r + Δr and the point cloud field-of-view angle to α = α - Δα;
S36: send the adjusted parameters to the robot end as feedback.
Compared with the prior art, the invention has the following notable advantages:
1. the invention applies incremental point cloud transmission in a human-machine collaborative remote control system, greatly reducing the system's communication bandwidth requirement;
2. the map is built synchronously at the robot end and the control end, reducing the operator's workload and improving operating accuracy;
3. the number of incremental points is adaptively controlled through several parameters, giving the system high flexibility;
4. the quality of the control end's three-dimensional point cloud map is evaluated and fed back to the robot end, saving further communication link bandwidth while guaranteeing the map quality at the control end.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
Drawings
The drawings are only for purposes of illustrating particular embodiments and are not to be construed as limiting the invention, like reference numerals being used to refer to like parts throughout the several views.
FIG. 1 is a flow chart of the remote-control-oriented control-end point cloud synchronous mapping method under low communication bandwidth;
FIG. 2 is a partial schematic view of a three-dimensional point cloud map constructed from an original point cloud;
FIG. 3 is a partial schematic view of a three-dimensional point cloud map constructed with incremental point clouds;
fig. 4 is an overall schematic diagram of a three-dimensional point cloud map constructed with incremental point clouds.
Reference numerals illustrate:
1-starting point; 2-region 1; 3-region 2.
Detailed Description
It is easy to understand that various embodiments of the present invention can be envisioned by those of ordinary skill in the art without altering the true spirit of the present invention in light of the present teachings. Accordingly, the following detailed description and drawings are merely illustrative of the invention and are not intended to be exhaustive or to limit or restrict the invention. Rather, these embodiments are provided so that this disclosure will be thorough and complete by those skilled in the art. Preferred embodiments of the present invention are described in detail below with reference to the attached drawing figures, which form a part of the present application and are used in conjunction with embodiments of the present invention to illustrate the innovative concepts of the present invention.
The robot end is an unmanned ground vehicle carrying a 40-beam mechanical lidar and an IMU; the control end is a notebook computer. The unmanned vehicle and the control end communicate over a radio link.
As shown in fig. 1, the remote control-oriented control end mapping method under low communication bandwidth includes the following steps:
s1: the robot end acquires multi-sensor data, solves the pose of the robot under a global coordinate system in real time through a synchronous positioning and mapping algorithm, and maintains a map under the global coordinate system;
s2: the robot end obtains an incremental point cloud through an incremental point cloud acquisition algorithm, encodes and compresses the incremental point cloud, and sends the encoded and compressed incremental point cloud to the control end together with the current pose of the robot;
s3: the control end decodes the incremental point cloud, creates a three-dimensional point cloud map, and executes a map quality evaluation and feedback module that evaluates the map construction quality and feeds the result back to the robot end.
Specifically, step S1 is implemented as follows:
S11: take the initial lidar pose T_0 as the origin pose of the global coordinate system;
S12: at time k, the unmanned vehicle collects laser point cloud data P_k and IMU data I_k in real time, and converts the IMU data into IMU data I'_k in the lidar coordinate system using the calibration parameters;
S13: at time k, fuse the laser point cloud data P_k and the IMU data I'_k in an iterated Kalman filter framework to compute the laser-inertial odometry, obtaining the lidar pose T_k in the global coordinate system at time k;
S14: at time k, transform the laser point cloud data P_k from the lidar coordinate system into the global coordinate system according to T_k, obtaining the laser point cloud data P'_k in the global coordinate system;
S15: at time k, superimpose the laser point cloud data P'_k in the global coordinate system onto the global map Q_{k-1} of time k-1, obtaining the global map Q_k at time k.
For the control end to build the map synchronously with the unmanned vehicle end, the unmanned vehicle must transmit the point cloud data to the control end; however, the volume of point cloud data is enormous. Experimental measurement shows that with the lidar running at 10 frames/s and the unmanned vehicle driving at 20 km/h for 10 minutes, each point cloud frame contains 55447 points on average. Even if only the three coordinate fields "x", "y", "z" are kept and each field is stored as a 4-byte float, an average frame occupies 665364 bytes, giving a communication rate of about 6.3454 MB/s; this minimum required bandwidth of about 6.35 MB/s cannot be satisfied under low communication bandwidth.
Incremental point cloud transmission solves this problem. Specifically, the incremental point cloud acquisition algorithm of step S2 runs on the unmanned vehicle as follows:
S21: at time 0, read the octree resolution r, the point cloud field-of-view angle α, and related parameters from the configuration file; at every later time, adjust these parameters according to the feedback from the control end;
S22: at time k+1, crop from the current global map the sub-map lying in the [-α, α] sector in front of the robot;
S23: organize the sub-maps of time k and time k+1 in a double-buffered octree of resolution r. Each branch node of the octree holds two buffers. When a new child node must be created, the previous buffer of the current branch node is searched first: if no matching reference pointer is found, a new child node is created and both of its buffers are initialized to 0; if a matching child pointer is found in the previous buffer, that child is reused and only the currently selected buffer is initialized to 0.
The two sub-maps are then XORed through the double buffers to obtain their difference, i.e. the incremental point cloud at time k+1.
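The double-buffered octree described above behaves like PCL's change-detection octree, which keeps two voxel buffers and reports the points falling into voxels that exist only in the newer buffer. The sketch below is a minimal illustration of steps S21-S23 under that assumption; the FOV crop places the robot at the origin facing +x (in practice the sector would be taken around the current pose), and all names are illustrative.

```cpp
#include <cmath>
#include <vector>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/octree/octree_pointcloud_changedetector.h>

using Cloud = pcl::PointCloud<pcl::PointXYZ>;

// Step S22: keep only points inside the [-alpha, alpha] sector in front of
// the robot (robot assumed at the origin, heading +x, alpha in radians).
Cloud::Ptr cropFov(const Cloud::ConstPtr& map, float alpha)
{
    Cloud::Ptr sub(new Cloud);
    for (const auto& p : map->points)
        if (std::fabs(std::atan2(p.y, p.x)) <= alpha)
            sub->push_back(p);
    return sub;
}

// Step S23: XOR the sub-maps of times k and k+1 through the octree's two
// buffers; the returned points are the incremental point cloud at time k+1.
Cloud::Ptr incrementalCloud(const Cloud::ConstPtr& sub_k,
                            const Cloud::ConstPtr& sub_k1, double resolution)
{
    pcl::octree::OctreePointCloudChangeDetector<pcl::PointXYZ> octree(resolution);
    octree.setInputCloud(sub_k);        // fill the first buffer (time k)
    octree.addPointsFromInputCloud();
    octree.switchBuffers();             // activate the second buffer
    octree.setInputCloud(sub_k1);       // fill it with the sub-map at k+1
    octree.addPointsFromInputCloud();

    std::vector<int> new_idx;
    octree.getPointIndicesFromNewVoxels(new_idx);  // voxels new at time k+1
    Cloud::Ptr inc(new Cloud);
    for (int i : new_idx) inc->push_back((*sub_k1)[i]);
    return inc;
}
```

In a streaming pipeline the octree would persist across frames, with switchBuffers() called once per frame rather than both buffers being rebuilt each time.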
Specifically, experimental measurement shows that with the lidar running at 10 frames/s, the unmanned vehicle driving at 20 km/h for 10 minutes, the octree resolution set to 0.3 m, and the point cloud field-of-view angle set to 180°, each incremental point cloud frame contains 4672 points on average. Keeping only the three coordinate fields "x", "y", "z" stored as 4-byte floats, an average incremental frame occupies 56064 bytes, giving a communication rate of about 0.5346 MB/s. Under the same conditions, the bandwidth required to transmit incremental point clouds is far smaller than that required to transmit the original point clouds, while the incremental mapping result at the control end is nearly identical to mapping with the original point clouds, as shown in figs. 2 and 3.
In addition, the incremental point cloud can be further encoded and compressed: with lossless ZIP compression, the average 56064-byte incremental frame shrinks to 51578 bytes, reducing the communication rate to about 0.4919 MB/s. With a more efficient compression coding scheme the bandwidth requirement would be lower still.
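The document does not name a codec beyond "lossless ZIP compression"; the sketch below assumes a DEFLATE-based coder via zlib, which yields compression of the order reported (56064 to 51578 bytes per frame). The packing layout and function name are illustrative.

```cpp
#include <vector>
#include <zlib.h>

// Deflate a packed buffer of n points x 3 coordinates x 4-byte floats.
// Returns the compressed bytes, or an empty vector on failure.
std::vector<unsigned char> compressIncrement(const std::vector<float>& xyz)
{
    const auto* src    = reinterpret_cast<const unsigned char*>(xyz.data());
    uLong       srcLen = static_cast<uLong>(xyz.size() * sizeof(float));
    uLongf      dstLen = compressBound(srcLen);
    std::vector<unsigned char> dst(dstLen);
    if (compress2(dst.data(), &dstLen, src, srcLen, Z_BEST_COMPRESSION) != Z_OK)
        return {};
    dst.resize(dstLen);  // compressed size, e.g. ~51578 bytes per frame above
    return dst;
}
```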
More specifically, with the lidar running at 10 frames/s and the unmanned vehicle driving at 20 km/h for 10 minutes, several groups of experiments show that the octree resolution r and the point cloud field-of-view angle α read in step S21 effectively control the incremental point cloud data volume, so the communication bandwidth occupied by point cloud transmission can be adjusted flexibly:
experiment one: the octree resolution r=0.1m, the point cloud view angle alpha=180°, the increment point cloud number is 10411 points, the increment point cloud occupies 124932 bytes, the ZIP code occupies 1157916 bytes, and the communication rate is about 1.1035MB/s;
experiment II: the octree resolution ratio r=0.3m, the point cloud view angle alpha=180°, the increment point cloud number is 4672 points, the increment point cloud occupies 56064 bytes, the ZIP code occupies 51578 bytes, and the communication rate is about 0.4919MB/s;
experiment III: the octree resolution r=0.5m, the point cloud view angle alpha=180°, the increment point cloud count is 2547 points, the increment point cloud occupies 30564 bytes, the ZIP code occupies 28035 bytes, and the communication rate is about 0.2674MB/s;
experiment IV: the octree resolution r=1.0m, the point cloud view angle alpha=180°, the increment point cloud count is 1036 points, the increment point cloud occupies 12432 bytes, the ZIP code occupies 11592 bytes, and the communication rate is about 0.1106MB/s;
experiment five: the octree resolution r=0.5m, the point cloud view angle α=90°, the incremental point cloud count is 1686 points, the incremental point cloud occupies 20232 bytes, the ZIP code occupies 18593 bytes, and the communication rate is about 0.1773MB/s.
Experiments 1 to 4 show that the octree resolution effectively controls the incremental point cloud data volume, and comparing experiments 3 and 5 shows that the point cloud field-of-view angle does as well.
For the control end, the mapping process is very simple: the incremental point clouds are already expressed in the global coordinate system, so after receiving and decoding an encoded incremental point cloud, the control end directly superimposes it onto the global map it maintains.
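A minimal sketch of this control-end step under the same PCL assumption; decoding the received packet back into a point cloud mirrors the robot-side encoder and is omitted here.

```cpp
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>

using Cloud = pcl::PointCloud<pcl::PointXYZ>;

// The increments arrive already expressed in the global coordinate system,
// so no re-registration is needed: superposition alone rebuilds the map.
void onIncrementReceived(const Cloud::ConstPtr& increment, Cloud& control_map)
{
    control_map += *increment;
}
```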
To guarantee the quality of the three-dimensional point cloud map built at the control end while keeping the system flexible, a map quality evaluation and feedback module is introduced. Specifically, the module of step S3 runs at the control end as follows:
S31: initialize the parameters, including the execution frame rate f, the local sub-map radius R, the map density factor minimum threshold M_min and maximum threshold M_max, the octree resolution adjustment coefficient Δr, and the point cloud field-of-view adjustment coefficient Δα;
S32: every f frames, crop from the global map maintained by the control end the local sub-map within radius R of the robot, count the number of points n in the sub-map, and use PCL's KD-tree nearest-neighbour search to find the distance d_i from each point in the sub-map to its nearest neighbour;
S33: compute the point cloud map density factor M = (1/n) Σ_{i=1..n} d_i;
S34: if M > M_max, the point cloud map quality is poor; adjust the octree resolution to r = r - Δr and the point cloud field-of-view angle to α = α + Δα;
S35: if M < M_min, the point cloud map is denser than necessary; adjust the octree resolution to r = r + Δr and the point cloud field-of-view angle to α = α - Δα;
S36: send the adjusted octree resolution r and point cloud field-of-view angle α to the unmanned vehicle end as feedback.
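The feedback loop of steps S32-S36 can be sketched as follows, using PCL's KD-tree as the text indicates; the density factor is implemented as the mean nearest-neighbour distance M = (1/n) Σ d_i reconstructed above, and the Params struct, function name, and return convention are assumptions.

```cpp
#include <cmath>
#include <vector>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/kdtree/kdtree_flann.h>

using Cloud = pcl::PointCloud<pcl::PointXYZ>;

struct Params { float r, alpha, dr, dalpha, M_min, M_max; };

// Steps S32-S35: evaluate the local sub-map density and adjust r and alpha.
// Returns true if the parameters changed and should be sent back (step S36).
bool evaluateAndAdjust(const Cloud::ConstPtr& local_submap, Params& p)
{
    if (local_submap->empty()) return false;

    pcl::KdTreeFLANN<pcl::PointXYZ> kdtree;
    kdtree.setInputCloud(local_submap);

    double sum = 0.0;
    std::vector<int> idx;
    std::vector<float> sq_dist;
    for (const auto& pt : local_submap->points)
        // k = 2: the first hit is the query point itself, the second is the
        // nearest neighbour, whose distance is d_i.
        if (kdtree.nearestKSearch(pt, 2, idx, sq_dist) == 2)
            sum += std::sqrt(sq_dist[1]);

    const double M = sum / local_submap->size();  // M = (1/n) * sum d_i

    if (M > p.M_max) {        // map too sparse: finer octree, wider FOV
        p.r -= p.dr; p.alpha += p.dalpha; return true;
    }
    if (M < p.M_min) {        // map denser than needed: coarsen, narrow FOV
        p.r += p.dr; p.alpha -= p.dalpha; return true;
    }
    return false;
}
```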
Specifically, as shown in fig. 4, the unmanned vehicle drives at an average speed of 20 km/h, the lidar on the vehicle runs at 10 frames/s, the initial octree resolution is r = 0.3 m, and the initial point cloud field-of-view angle is α = 180°. The map quality evaluation and feedback module parameters are set as follows: execution frame rate f = 20, local sub-map radius R = 50 m, map density factor minimum threshold M_min = 0.05 and maximum threshold M_max = 0.3, octree resolution adjustment coefficient Δr = 0.3 m, point cloud field-of-view adjustment coefficient Δα = 90°.
The unmanned vehicle starts from the starting point and, at frame k, returns to region 1 after one mapping loop. Because a fairly complete point cloud map of region 1 was already built on the earlier pass, superimposing the incremental point cloud of frame k makes the local sub-map of region 1 denser; the computed point cloud map density factor is M_k = 0.0429, and since M_k < M_min the octree resolution is adjusted to r = r + Δr = 0.6 m and the point cloud field-of-view angle to α = α - Δα = 90°. The communication rate of the incremental point cloud data drops further.
Later, at frame k+20n, the unmanned vehicle reaches region 2. Owing to the earlier parameter adjustment, the local sub-map of region 2 is sparse; the computed density factor is M_k = 0.4355, and since M_k > M_max the octree resolution is adjusted back to r = r - Δr = 0.3 m and the point cloud field-of-view angle to α = α + Δα = 180°.
At the next evaluation, frame k+20(n+1), the computed density factor is M_k = 0.0933; the point cloud map quality has thus been adjusted adaptively.
The present invention is not limited to the above-mentioned embodiments, and any changes or substitutions that can be easily understood by those skilled in the art within the technical scope of the present invention are intended to be included in the scope of the present invention.
It should be appreciated that in the above description of exemplary embodiments, various features of the invention are sometimes described within a single embodiment or with reference to a single figure in order to streamline the disclosure and aid understanding of its various aspects. This manner of disclosure, however, is not to be construed as meaning that the claimed invention requires features beyond those expressly recited in the claims.
It should be understood that modules, units, components, etc. included in the apparatus of one embodiment of the present invention may be adaptively changed to arrange them in an apparatus different from the embodiment. The different modules, units or components comprised by the apparatus of the embodiments may be combined into one module, unit or component or they may be divided into a plurality of sub-modules, sub-units or sub-components.

Claims (5)

1. The remote control-oriented control end mapping method under the low communication bandwidth is characterized by comprising the following steps of:
s1: the robot end acquires multi-sensor data, solves the robot pose in the global coordinate system in real time through a simultaneous localization and mapping algorithm, and maintains a map in the global coordinate system, specifically as follows:
S11: take the initial lidar pose T_0 as the origin pose of the global coordinate system;
S12: at time k, the unmanned vehicle collects laser point cloud data P_k and IMU data I_k in real time, and converts the IMU data into IMU data I'_k in the lidar coordinate system using the calibration parameters;
S13: at time k, fuse the laser point cloud data P_k and the IMU data I'_k in an iterated Kalman filter framework to compute the laser-inertial odometry, obtaining the lidar pose T_k in the global coordinate system at time k;
S14: at time k, transform the laser point cloud data P_k from the lidar coordinate system into the global coordinate system according to T_k, obtaining the laser point cloud data P'_k in the global coordinate system;
S15: at time k, superimpose the laser point cloud data P'_k in the global coordinate system onto the global map Q_{k-1} of time k-1, obtaining the global map Q_k at time k;
S2: the robot end obtains an incremental point cloud through an incremental point cloud acquisition algorithm, encodes and compresses the incremental point cloud, and sends the encoded and compressed incremental point cloud to the control end together with the current pose of the robot;
s3: the control end decodes the incremental point cloud, creates a three-dimensional point cloud map, and executes a map quality evaluation and feedback module that evaluates the map construction quality and feeds the result back to the robot end.
2. The remote control-oriented control end mapping method under low communication bandwidth according to claim 1, wherein the robot end obtains the incremental point cloud through an incremental point cloud acquisition algorithm, specifically as follows:
S21: set the parameters of the incremental point cloud acquisition algorithm, including the octree resolution r and the point cloud field-of-view angle α;
S22: at each time step, crop from the current global map the sub-map lying in the [-α, α] sector in front of the robot;
S23: organize the sub-maps of time k and time k+1 in a double-buffered octree of resolution r, and XOR the two buffers to obtain the difference between the two sub-maps, i.e. the incremental point cloud at time k+1.
3. The remote control-oriented control end mapping method under low communication bandwidth according to claim 1, wherein: in step S21, the initial parameters of the incremental point cloud acquisition algorithm are set through a configuration file, after which the parameters are adaptively adjusted according to the feedback result from the control end, dynamically controlling the number of incremental points.
4. The remote control-oriented control end mapping method under low communication bandwidth according to claim 1, wherein: in step S3, the three-dimensional point cloud map is created at the control end by directly superimposing successive frames of incremental point clouds in the global coordinate system, yielding a three-dimensional point cloud map synchronized with the robot end.
5. The remote control-oriented control end mapping method under low communication bandwidth according to claim 1, wherein the map quality evaluation and feedback module evaluates the map construction quality as follows:
S31: initialize the parameters, including the execution frame rate f, the local sub-map radius R, the map density factor minimum threshold M_min and maximum threshold M_max, the octree resolution adjustment coefficient Δr, and the point cloud field-of-view adjustment coefficient Δα;
S32: every f frames, the control end analyzes the local sub-map within radius R, counting the number of points n in the sub-map and the distance d_i from each point to its nearest neighbour;
S33: compute the point cloud map density factor M = (1/n) Σ_{i=1..n} d_i;
S34: if M > M_max, the point cloud map quality is poor; adjust the octree resolution to r = r - Δr and the point cloud field-of-view angle to α = α + Δα;
S35: if M < M_min, the point cloud map is denser than necessary; adjust the octree resolution to r = r + Δr and the point cloud field-of-view angle to α = α - Δα;
S36: send the adjusted parameters to the robot end as feedback.
CN202211382900.8A 2022-11-07 2022-11-07 Remote control-oriented control end mapping method under low communication bandwidth Active CN115601519B (en)


Publications (2)

CN115601519A, published 2023-01-13
CN115601519B, published 2024-04-05





Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant