CN115601519A - Remote control end mapping method facing low communication bandwidth - Google Patents

Info

Publication number
CN115601519A
CN115601519A (application CN202211382900.8A)
Authority
CN
China
Prior art keywords
point cloud
map
control end
incremental
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211382900.8A
Other languages
Chinese (zh)
Other versions
CN115601519B (en)
Inventor
Ren Mingwu (任明武)
Sun Sujun (孙苏军)
Current Assignee
Nanjing University of Science and Technology
Original Assignee
Nanjing University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Nanjing University of Science and Technology
Priority to application CN202211382900.8A
Publication of CN115601519A
Application granted
Publication of CN115601519B
Legal status: Active
Anticipated expiration


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/05: Geographic models
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/005: Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/10: Navigation by using measurements of speed or acceleration
    • G01C 21/12: Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C 21/16: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C 21/165: Inertial navigation combined with non-inertial navigation instruments
    • G01C 21/1652: Inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/005: Tree description, e.g. octree, quadtree
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 69/00: Network arrangements, protocols or services independent of the application payload and not provided for in the other groups of this subclass
    • H04L 69/04: Protocols for data compression, e.g. ROHC

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a remote control end mapping method facing low communication bandwidth. The robot end performs simultaneous localization and mapping (SLAM), and maintains the sub-maps of two consecutive frames with a double-buffer octree to obtain an incremental point cloud. After encoding and compression, the incremental point cloud is sent to the control end together with the robot pose output by the SLAM algorithm. The control end reconstructs a 3D point cloud map from the incremental point clouds, evaluates the map quality through the map quality evaluation and feedback module, and feeds the result back to the robot end, which adaptively adjusts the parameters of the incremental point cloud acquisition algorithm. By having the control end reconstruct the three-dimensional map from incremental point clouds, the invention greatly reduces the communication bandwidth requirement of the human-machine cooperative remote control system; generating the three-dimensional map at the control end also greatly reduces the operator's control burden and improves the operator's control accuracy.

Description

Remote control end mapping method facing low communication bandwidth
Technical Field
The invention belongs to the technical field of human-machine cooperation, and particularly relates to a remote control end mapping method under low communication bandwidth.
Background
In a human-machine cooperative remote control system, sensor data such as images collected at the robot end are generally sent to the operator end, so that the operator can observe the robot's surroundings with less effort. Giving the control end access to the lidar point cloud data brings several benefits. On one hand, other sensor data can fail in certain scenes: indoors without illumination, or in underground spaces, the image acquired by the camera is often completely dark, and the operator cannot judge the robot's surroundings from the camera, whereas a three-dimensional point cloud map built from laser point clouds is unaffected by illumination conditions. On the other hand, a three-dimensional point cloud map reflects the robot's surroundings more intuitively, reducing the operator's control burden and improving the operator's control accuracy.
However, because the amount of point cloud data is huge, current human-machine cooperative remote control systems usually transmit only compressed image data to the control end rather than point cloud data; under low communication bandwidth, transmitting point cloud data to the control end becomes even less feasible.
Some multi-robot cooperative systems do transmit point cloud data between robots and a base station. The document "Comprehensive multi-modal sensor fusion for constrained robot position estimation in subterranean environments" compresses a single point-cloud factor graph to about 2 MB with the DRACO algorithm and sends it to the base station for global optimization; the document "LOCUS 2.0" down-samples point clouds with a voxel grid; and the patent "Distributed cooperative positioning and mapping method for multiple unmanned aerial vehicles in a satellite-denied environment" (CN115166686A), proposed by Tian Bailing et al., transmits point cloud features among multiple unmanned aerial vehicles. These methods are mainly designed to transmit point cloud data between each robot and the base station in a multi-robot cooperative system, and emphasize optimizing the pose of each robot. Moreover, they either do not compress the point cloud data or use compression that is not efficient enough: even at a frame rate of 1 frame/s, a single point-cloud factor graph compressed to about 2 MB by DRACO still requires a transmission bandwidth of 2 MB/s, and voxel-grid down-sampling accumulates many redundant repeated points across frames, so point cloud transmission still occupies a very large bandwidth. If data such as images are added, the whole transmission link becomes too heavy, which is infeasible under low communication bandwidth.
In summary, under low communication bandwidth, building the three-dimensional point cloud map at the control end is of great significance, but several technical problems currently remain.
Disclosure of Invention
In order to solve the technical defects in the prior art, the invention provides a remote control end mapping method under low communication bandwidth.
The technical scheme realizing the purpose of the invention is as follows: a remote control end mapping method facing low communication bandwidth, comprising the following steps:
S1: the robot end collects multi-sensor data, solves the robot pose in the global coordinate system in real time through a simultaneous localization and mapping (SLAM) algorithm, and maintains a map in the global coordinate system;
S2: the robot end obtains an incremental point cloud through an incremental point cloud acquisition algorithm, encodes and compresses it, and sends it together with the current robot pose to the control end;
S3: the control end decodes the incremental point cloud, builds a three-dimensional point cloud map, and executes a map quality evaluation and feedback module that evaluates the mapping quality and feeds the result back to the robot end.
Preferably, the robot end collects multi-sensor data, solves the robot pose in the global coordinate system in real time through the SLAM algorithm, and maintains a map in the global coordinate system as follows:
S11: taking the initial pose of the laser radar as the global coordinate system pose T_0;
S12: at the k-th moment, the unmanned vehicle end collects laser point cloud data P_k and IMU data I_k in real time, and converts the IMU data into IMU data I'_k in the laser radar coordinate system through calibration parameters;
S13: at the k-th moment, the laser point cloud data P_k and the IMU data I'_k are fused in an iterated Kalman filter framework to compute the laser-inertial odometry, obtaining the pose T_k of the laser radar in the global coordinate system at the k-th moment;
S14: the laser point cloud data P_k in the laser radar coordinate system at the k-th moment is transformed into the global coordinate system according to T_k, obtaining the laser point cloud data P_k^G in the global coordinate system;
S15: the laser point cloud data P_k^G at the k-th moment is superimposed onto the global map Q_{k-1} of the (k-1)-th moment to obtain the global map Q_k at the k-th moment.
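A minimal numpy sketch of steps S11 to S15, assuming the odometry pose T_k is available as a 4x4 homogeneous matrix (the point values and function names are illustrative, not the patent's implementation):

```python
import numpy as np

def transform_to_global(points_k: np.ndarray, T_k: np.ndarray) -> np.ndarray:
    """Step S14: apply the 4x4 pose T_k to an (n, 3) scan given in the lidar frame."""
    homogeneous = np.hstack([points_k, np.ones((points_k.shape[0], 1))])
    return (homogeneous @ T_k.T)[:, :3]

T_k = np.eye(4)                                        # S11: identity initial pose T_0
scan_k = np.array([[1.0, 0.0, 0.0], [0.0, 2.0, 0.0]])  # toy lidar scan P_k

# S15: superimpose the transformed scan onto the global map Q_{k-1}.
global_map = np.empty((0, 3))                          # Q_0 starts empty
global_map = np.vstack([global_map, transform_to_global(scan_k, T_k)])
```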
Preferably, the robot end obtains the incremental point cloud through the incremental point cloud acquisition algorithm as follows:
S21: setting the parameters of the incremental point cloud acquisition algorithm, including the octree resolution r and the point cloud field angle α;
S22: at each moment, intercepting from the current global map a sub-map within the range [-α, α] in front of the robot;
S23: organizing the sub-maps of the k-th and (k+1)-th moments with a double-buffer octree of resolution r, and performing an XOR operation over the two buffers to obtain the difference between the two sub-maps, i.e. the incremental point cloud at the (k+1)-th moment.
Preferably, in step S21, the parameters of the incremental point cloud acquisition algorithm are initialized from a configuration file and then adaptively adjusted according to the feedback from the control end, so as to dynamically control the number of incremental point cloud points.
Preferably, in step S3, the three-dimensional point cloud map is created at the control end by directly superimposing multiple frames of incremental point clouds in the global coordinate system, yielding a three-dimensional point cloud map synchronized with the robot end.
Preferably, the map quality evaluation and feedback module evaluates the mapping quality as follows:
S31: initializing the relevant parameters, including the execution frame rate f, the local sub-map radius R, the map density factor minimum threshold M_min and maximum threshold M_max, the octree resolution adjustment coefficient Δr, and the point cloud field angle adjustment coefficient Δα;
S32: every f frames, the control end computes statistics over the local sub-map within radius R, including the number n of points in the sub-map and the distance d_i from each point to its nearest neighbor;
S33: calculating the point cloud map density factor M = (1/n)·Σ d_i, i.e. the mean nearest-neighbor distance;
S34: if M > M_max, the point cloud map is too sparse and its quality is poor; the octree resolution is adjusted to r = r - Δr and the point cloud field angle to α = α + Δα;
S35: if M < M_min, the point cloud map is dense; the octree resolution is adjusted to r = r + Δr and the point cloud field angle to α = α - Δα;
S36: the adjusted parameters are sent to the robot end as feedback.
Compared with the prior art, the invention has the following remarkable advantages:
1. the invention innovatively applies incremental point cloud transmission in a human-machine cooperative remote control system, greatly reducing its communication bandwidth requirement;
2. the robot end and the control end build the map synchronously, reducing the operator's control burden and improving the operator's control accuracy;
3. the number of incremental point cloud points is adaptively controlled through several parameters, giving greater flexibility;
4. the invention evaluates the quality of the control end's three-dimensional point cloud map and feeds the result back to the robot end, saving more communication link bandwidth while guaranteeing the quality of the three-dimensional point cloud map at the control end.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
The drawings are only for purposes of illustrating particular embodiments and are not to be construed as limiting the invention, wherein like reference numerals are used to designate like parts throughout.
FIG. 1 is a flow chart of a point cloud synchronous mapping method of a control end for remote control under a low communication bandwidth;
FIG. 2 is a partial schematic view of a three-dimensional point cloud map constructed from an original point cloud;
FIG. 3 is a partial schematic view of a three-dimensional point cloud map constructed from incremental point clouds;
fig. 4 is an overall schematic diagram of a three-dimensional point cloud map constructed by incremental point clouds.
Description of the reference numerals:
1: starting point; 2: area 1; 3: area 2.
Detailed Description
It is easily understood that various embodiments of the present invention can be conceived by those skilled in the art according to the technical solution of the present invention without changing the essential spirit of the present invention. Therefore, the following detailed description and the accompanying drawings are merely illustrative of the technical aspects of the present invention, and should not be construed as all of the present invention or as limitations or limitations on the technical aspects of the present invention. Rather, these embodiments are provided so that this disclosure will be thorough and complete. The preferred embodiments of the present invention will now be described in detail with reference to the accompanying drawings, which form a part hereof, and which together with the embodiments of the invention serve to explain the innovative concepts of the invention.
The robot end is an unmanned vehicle carrying a 40-line mechanical laser radar and an IMU, and the control end is a notebook computer; the unmanned vehicle and the control end communicate via a radio link.
As shown in fig. 1, the remote control end mapping method under low communication bandwidth comprises the following steps:
S1: the robot end collects multi-sensor data, solves the robot pose in the global coordinate system in real time through a simultaneous localization and mapping (SLAM) algorithm, and maintains a map in the global coordinate system;
S2: the robot end obtains an incremental point cloud through an incremental point cloud acquisition algorithm, encodes and compresses it, and sends it together with the current robot pose to the control end;
S3: the control end decodes the incremental point cloud, builds a three-dimensional point cloud map, and executes a map quality evaluation and feedback module that evaluates the mapping quality and feeds the result back to the robot end.
Specifically, step S1 is implemented as follows:
S11: taking the initial pose of the laser radar as the global coordinate system pose T_0;
S12: at the k-th moment, the unmanned vehicle end collects laser point cloud data P_k and IMU data I_k in real time, and converts the IMU data into IMU data I'_k in the laser radar coordinate system through calibration parameters;
S13: at the k-th moment, the laser point cloud data P_k and the IMU data I'_k are fused in an iterated Kalman filter framework to compute the laser-inertial odometry, obtaining the pose T_k of the laser radar in the global coordinate system at the k-th moment;
S14: the laser point cloud data P_k in the laser radar coordinate system at the k-th moment is transformed into the global coordinate system according to T_k, obtaining the laser point cloud data P_k^G in the global coordinate system;
S15: the laser point cloud data P_k^G at the k-th moment is superimposed onto the global map Q_{k-1} of the (k-1)-th moment to obtain the global map Q_k at the k-th moment.
If the control end is to build the map synchronously with the unmanned vehicle end, the unmanned vehicle end must transmit the point cloud data to the control end, but the amount of point cloud data is huge. Experimental measurement shows that with the laser radar running at a frame rate of 10 frames/s and the unmanned vehicle driving for 10 minutes at 20 km/h, each frame contains 55447 points on average. If only the three coordinate fields "x", "y" and "z" are kept and each field is stored as a 4-byte float, the average point cloud per frame occupies 665364 bytes and the communication rate is about 6.3454 MB/s; that is, a minimum communication bandwidth of about 6.3454 MB/s is required, which cannot be satisfied under low communication bandwidth.
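The bandwidth arithmetic above can be checked directly (taking 1 MB = 1024 × 1024 bytes, which reproduces the figures in the text):

```python
points_per_frame = 55447          # average points per lidar frame (from the text)
bytes_per_point = 3 * 4           # "x", "y", "z", each a 4-byte float
frame_rate = 10                   # lidar frames per second

bytes_per_frame = points_per_frame * bytes_per_point    # 665364 bytes
rate_mb_s = bytes_per_frame * frame_rate / (1024 ** 2)  # about 6.3454 MB/s
```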
The incremental point cloud transmission technique solves this problem. Specifically, the incremental point cloud acquisition algorithm of step S2 is executed at the unmanned vehicle end as follows:
S21: at the 0-th moment, parameters such as the octree resolution r and the point cloud field angle α are read from the configuration file; at subsequent moments these parameters are adjusted according to the feedback from the control end;
S22: at the (k+1)-th moment, a sub-map within the range [-α, α] in front of the robot is intercepted from the current global map;
S23: the sub-maps of the k-th and (k+1)-th moments are organized with a double-buffer octree of resolution r. Each branch node of the octree has two buffers of child pointers. When a new child node is to be created, the previous-pointer buffer of the current branch node is searched first: if no matching reference pointer is found, a new child node is created and both of its buffers are initialized to 0; if a corresponding child-node pointer is found in the previous-pointer buffer, that child node is reused and only the currently selected child-pointer buffer is initialized to 0.
An XOR operation over the two buffers then yields the difference between the two sub-maps, i.e. the incremental point cloud at the (k+1)-th moment.
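For illustration only, the dual-buffer XOR can be approximated by comparing the occupied-voxel key sets of two consecutive sub-maps at resolution r: points of the new sub-map falling in voxels that were not occupied before form the incremental point cloud. A numpy sketch under that assumption (a set difference stands in for the octree's buffer XOR; all names are illustrative):

```python
import numpy as np

def voxel_keys(points: np.ndarray, r: float) -> set:
    """Occupied-voxel keys of a sub-map at octree resolution r."""
    return set(map(tuple, np.floor(points / r).astype(np.int64)))

def incremental_points(prev_submap: np.ndarray, curr_submap: np.ndarray, r: float) -> np.ndarray:
    """Keep the points of the current sub-map whose voxels were not occupied
    in the previous sub-map (stand-in for the dual-buffer XOR of step S23)."""
    prev = voxel_keys(prev_submap, r)
    keys = np.floor(curr_submap / r).astype(np.int64)
    mask = np.array([tuple(k) not in prev for k in keys])
    return curr_submap[mask]

prev_map = np.array([[0.1, 0.1, 0.1], [1.0, 1.0, 1.0]])
curr_map = np.array([[0.1, 0.1, 0.1], [2.0, 2.0, 2.0]])
inc = incremental_points(prev_map, curr_map, r=0.3)  # only the newly seen point remains
```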
Specifically, experimental calculation shows that with the laser radar running at a frame rate of 10 frames/s, the unmanned vehicle driving for 10 minutes at 20 km/h, the octree resolution set to 0.3 m, and the point cloud field angle set to 180°, each frame contains 4672 incremental points on average. If only the three coordinate fields "x", "y" and "z" are kept and each field is stored as a 4-byte float, the average incremental point cloud per frame occupies 56064 bytes and the communication rate is about 0.5346 MB/s. Under the same conditions, the communication bandwidth required to transmit the incremental point cloud is far smaller than that required for the original point cloud, while the incremental mapping result at the control end remains close to the original point cloud mapping result, as shown in fig. 2 and fig. 3.
The incremental point cloud can be further compressed by encoding, for example with lossless ZIP compression, which reduces an average frame from 56064 bytes to 51578 bytes and the communication rate to about 0.4919 MB/s. More efficient compression encodings would lower the required communication bandwidth further.
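The encode/decode round trip can be sketched with Python's zlib, a lossless DEFLATE-based compressor used here as a stand-in for the unspecified ZIP codec; the mock cloud merely reproduces the 4672-point frame size quoted above:

```python
import zlib
import numpy as np

rng = np.random.default_rng(0)
cloud = rng.uniform(-50, 50, size=(4672, 3)).astype(np.float32)  # mock incremental cloud

raw = cloud.tobytes()                 # 4672 points * 12 bytes = 56064 bytes
packed = zlib.compress(raw, level=9)  # lossless: decompression is bit-exact
restored = np.frombuffer(zlib.decompress(packed), dtype=np.float32).reshape(-1, 3)
```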
More specifically, with the laser radar running at a frame rate of 10 frames/s and the unmanned vehicle driving for 10 minutes at 20 km/h, multiple groups of experiments show that the octree resolution r and the point cloud field angle α read in step S21 effectively adjust the incremental point cloud data volume, so that the communication bandwidth occupied by point cloud transmission can be tuned flexibly:
experiment one: octree resolution r = 0.1 m, point cloud field angle α = 180°; 10411 incremental points per frame, occupying 124932 bytes raw and 115791 bytes after ZIP coding; communication rate about 1.1035 MB/s;
experiment two: octree resolution r = 0.3 m, point cloud field angle α = 180°; 4672 incremental points per frame, occupying 56064 bytes raw and 51578 bytes after ZIP coding; communication rate about 0.4919 MB/s;
experiment three: octree resolution r = 0.5 m, point cloud field angle α = 180°; 2547 incremental points per frame, occupying 30564 bytes raw and 28035 bytes after ZIP coding; communication rate about 0.2674 MB/s;
experiment four: octree resolution r = 1.0 m, point cloud field angle α = 180°; 1036 incremental points per frame, occupying 12432 bytes raw and 11592 bytes after ZIP coding; communication rate about 0.1106 MB/s;
experiment five: octree resolution r = 0.5 m, point cloud field angle α = 90°; 1686 incremental points per frame, occupying 20232 bytes raw and 18593 bytes after ZIP coding; communication rate about 0.1773 MB/s.
Experiments one to four verify that the octree resolution effectively adjusts the incremental point cloud data volume, and experiments three and five verify the same for the point cloud field angle.
For the control end, the mapping process is very simple: since each incremental point cloud is obtained in the global coordinate system, the control end receives and decodes the encoded incremental point cloud and superimposes it directly onto the global map it maintains.
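Assuming zlib-compressed float32 payloads (an illustrative codec choice; the patent only specifies that the increment arrives encoded and in the global frame), the control end's update reduces to decompress-and-concatenate:

```python
import zlib
import numpy as np

def decode_increment(payload: bytes) -> np.ndarray:
    """Decode one compressed incremental point cloud into an (n, 3) float32 array."""
    return np.frombuffer(zlib.decompress(payload), dtype=np.float32).reshape(-1, 3)

def update_map(global_map: np.ndarray, payload: bytes) -> np.ndarray:
    """Superimpose the decoded increment onto the control end's global map."""
    return np.vstack([global_map, decode_increment(payload)])

inc = np.array([[1.0, 2.0, 3.0]], dtype=np.float32)
payload = zlib.compress(inc.tobytes())
world = update_map(np.empty((0, 3), dtype=np.float32), payload)
```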
In order to guarantee the quality of the three-dimensional point cloud map built at the control end while keeping the system flexible, a map quality evaluation and feedback module is introduced. Specifically, the map quality evaluation and feedback module of step S3 is executed at the control end as follows:
S31: initializing the relevant parameters, including the execution frame rate f, the local sub-map radius R, the map density factor minimum threshold M_min and maximum threshold M_max, the octree resolution adjustment coefficient Δr, and the point cloud field angle adjustment coefficient Δα;
S32: every f frames, a local sub-map within radius R of the robot is intercepted from the global map maintained by the control end; the number n of points in the sub-map is counted, and the distance d_i from each point to its nearest neighbor is found with the KD-tree nearest-neighbor search in PCL;
S33: calculating the point cloud map density factor M = (1/n)·Σ d_i, i.e. the mean nearest-neighbor distance;
S34: if M > M_max, the point cloud map is too sparse and its quality is poor; the octree resolution is adjusted to r = r - Δr and the point cloud field angle to α = α + Δα;
S35: if M < M_min, the point cloud map is dense; the octree resolution is adjusted to r = r + Δr and the point cloud field angle to α = α - Δα;
S36: the adjusted octree resolution r and point cloud field angle α are sent to the unmanned vehicle end as feedback.
Specifically, as shown in fig. 4, the unmanned vehicle travels at an average speed of 20 km/h, the laser radar on the vehicle runs at 10 frames/s, the initial octree resolution is r = 0.3 m, and the initial point cloud field angle is α = 180°. The parameters of the map quality evaluation and feedback module are: execution frame rate f = 20, local sub-map radius R = 50 m, map density factor minimum threshold M_min = 0.05 and maximum threshold M_max = 0.3, octree resolution adjustment coefficient Δr = 0.3 m, and point cloud field angle adjustment coefficient Δα = 90°.
Starting from the starting point, the unmanned vehicle circles the map once and returns to area 1 at the k-th frame. Because area 1 was passed before, a complete point cloud map has already been built there, and after the k-th frame's incremental point cloud is superimposed the local sub-map of area 1 is dense. The computed point cloud map density factor M_k is 0.0429; since M_k < M_min, the octree resolution is adjusted to r = r + Δr = 0.6 m and the point cloud field angle to α = α - Δα = 90°. The communication rate of the incremental point cloud data is thereby further reduced.
Further, the unmanned vehicle reaches area 2 at the (k+20n)-th frame. Owing to the previous parameter adjustment, the local sub-map of area 2 is sparse: the computed point cloud map density factor M_k is 0.4355; since M_k > M_max, the octree resolution is adjusted to r = r - Δr = 0.3 m and the point cloud field angle to α = α + Δα = 180°.
Further, at the (k+20(n+1))-th frame, the computed point cloud map density factor M_k is 0.0933, which lies within the thresholds, showing that the point cloud map quality has been adaptively adjusted.
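The evaluation-and-feedback loop walked through in this embodiment can be sketched as follows, assuming the density factor M is the mean nearest-neighbor distance over the local sub-map (the patent gives the formula only as an image, so this reading is an assumption consistent with the reported M_k values) and using a brute-force neighbor search in place of PCL's KD-tree:

```python
import numpy as np

def density_factor(submap: np.ndarray) -> float:
    """Mean distance from each point to its nearest neighbor (assumed form of M)."""
    d = np.linalg.norm(submap[:, None, :] - submap[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)           # ignore each point's distance to itself
    return float(d.min(axis=1).mean())

def feedback(m: float, r: float, alpha: float,
             m_min=0.05, m_max=0.3, dr=0.3, dalpha=90.0):
    """Steps S34/S35: adapt the octree resolution r and field angle alpha."""
    if m > m_max:                         # map too sparse: densify
        r, alpha = r - dr, alpha + dalpha
    elif m < m_min:                       # map denser than needed: save bandwidth
        r, alpha = r + dr, alpha - dalpha
    return r, alpha

# A dense toy sub-map: the 8 corners of a 0.02 m cube, so every nearest neighbor is 0.02 m away.
grid = np.stack(np.meshgrid([0, 0.02], [0, 0.02], [0, 0.02]), axis=-1).reshape(-1, 3)
m = density_factor(grid)
r, alpha = feedback(m, r=0.3, alpha=180.0)   # m < M_min, so r grows and alpha shrinks
```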
While the invention has been described with reference to specific preferred embodiments, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention as defined in the following claims.
It should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes described in the context of a single embodiment, or with reference to a single figure, for the purpose of streamlining the disclosure and aiding in the understanding of various aspects of the invention by those skilled in the art. However, the present invention should not be construed such that the features included in the exemplary embodiments are all the essential technical features of the patent claims.
It should be understood that the modules, units, components, and the like included in the device of one embodiment of the present invention may be adaptively changed to be provided in a device different from that of the embodiment. The different modules, units or components comprised by the apparatus of an embodiment may be combined into one module, unit or component or they may be divided into a plurality of sub-modules, sub-units or sub-components.

Claims (6)

1. A remote control end mapping method oriented to low communication bandwidth, characterized by comprising the following steps:
S1: the robot end acquires multi-sensor data, solves the pose of the robot in a global coordinate system in real time through a simultaneous localization and mapping algorithm, and maintains a map in the global coordinate system;
S2: the robot end obtains an incremental point cloud through an incremental point cloud acquisition algorithm, encodes and compresses it, and sends the compressed incremental point cloud together with the current pose of the robot to the control end;
S3: the control end decodes the incremental point cloud, creates a three-dimensional point cloud map, and executes the map quality evaluation and feedback module, which evaluates the mapping quality and feeds the result back to the robot end.
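The send/receive round trip of steps S2 and S3 can be sketched as follows. The claim does not name a codec, so the flat float32 layout and zlib deflate below are stand-in assumptions for whatever encoding the robot end actually uses, and the pose is taken as six floats (x, y, z, roll, pitch, yaw) purely for illustration:

```python
import struct
import zlib

import numpy as np

def encode_packet(inc_points, pose):
    """Robot end (S2): pack the 6-DoF pose and the incremental points
    (float32 xyz triples), then deflate-compress for the low-bandwidth link."""
    payload = struct.pack("<6f", *pose) + inc_points.astype("<f4").tobytes()
    return zlib.compress(payload)

def decode_packet(blob):
    """Control end (S3): inverse of encode_packet."""
    payload = zlib.decompress(blob)
    pose = struct.unpack("<6f", payload[:24])
    pts = np.frombuffer(payload[24:], dtype="<f4").reshape(-1, 3)
    return pts, pose
```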
2. The remote control end mapping method oriented to low communication bandwidth according to claim 1, wherein the specific method by which the robot end acquires multi-sensor data, solves the pose of the robot in the global coordinate system in real time through a simultaneous localization and mapping algorithm, and maintains a map in the global coordinate system comprises the following steps:
S11: taking the initial laser radar pose as the global coordinate system pose
T_0;
S12: at the k-th moment, the unmanned vehicle end collects laser point cloud data P_k and IMU data I_k in real time, and converts the IMU data into IMU data I'_k in the laser radar coordinate system through the calibration parameters;
S13: at the k-th moment, the laser point cloud data P_k and the IMU data I'_k are fused in an iterated Kalman filter framework to compute the laser-inertial odometry, obtaining the pose of the laser radar in the global coordinate system at the k-th moment
T_k;
S14: the laser point cloud data P_k in the laser radar coordinate system at the k-th moment are converted into the global coordinate system according to T_k, obtaining the laser point cloud data in the global coordinate system, denoted P_k^W;
S15: laser point cloud data under k-th time global coordinate system
P_k^W are superimposed onto the global map Q_{k-1} of the (k-1)-th moment to obtain the global map Q_k of the k-th moment.
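Steps S14 and S15 reduce to a rigid-body transform followed by stacking onto the running map. A minimal numpy sketch, with the pose represented as a rotation matrix R_k and translation t_k (one common parameterisation, assumed here):

```python
import numpy as np

def to_global(P_k, R_k, t_k):
    """S14: transform lidar-frame points P_k (n x 3) into the global
    frame using the odometry pose T_k = (R_k, t_k)."""
    return P_k @ R_k.T + t_k

def superimpose(Q_prev, P_world):
    """S15: superimpose the world-frame scan onto the global map Q_{k-1}."""
    return np.vstack([Q_prev, P_world])
```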
3. The remote control end mapping method oriented to low communication bandwidth according to claim 1, wherein the step in which the robot end obtains the incremental point cloud through the incremental point cloud acquisition algorithm specifically comprises the following steps:
S21: setting the parameters of the incremental point cloud acquisition algorithm, including the octree resolution r and the point cloud field angle α;
S22: at each moment, intercepting from the current global map a sub-map within the range [-α, α] directly ahead of the robot;
S23: organizing the sub-maps of the k-th moment and the (k+1)-th moment with a double-buffered octree of resolution r, and performing an XOR operation between the two buffers to obtain the difference of the two sub-maps, namely the incremental point cloud of the (k+1)-th moment.
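The double-buffered octree XOR of step S23 is typically implemented with a point cloud library's octree change detector; the numpy sketch below approximates it by voxelising both sub-maps at resolution r and keeping the points of the new sub-map whose voxel is not occupied in the old one (the additions, which are what must be transmitted):

```python
import numpy as np

def incremental_points(prev_cloud, curr_cloud, r=0.3):
    """Approximate S23: voxel keys at resolution r stand in for octree
    leaves; points of curr_cloud falling in voxels absent from
    prev_cloud form the incremental point cloud."""
    prev_keys = set(map(tuple, np.floor(prev_cloud / r).astype(np.int64)))
    mask = [tuple(k) not in prev_keys
            for k in np.floor(curr_cloud / r).astype(np.int64)]
    return curr_cloud[np.asarray(mask, dtype=bool)]
```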
4. The remote control end mapping method oriented to low communication bandwidth according to claim 1, wherein in step S21, the initial parameters of the incremental point cloud acquisition algorithm are set from a configuration file, and the parameters are then adaptively adjusted according to the feedback result from the control end, so as to dynamically control the number of incremental points.
5. The remote control end mapping method oriented to low communication bandwidth according to claim 1, wherein in step S3, the three-dimensional point cloud map is created at the control end by directly superimposing multi-frame incremental point clouds in the global coordinate system, yielding a three-dimensional point cloud map synchronized with the robot end.
6. The remote control end mapping method oriented to low communication bandwidth according to claim 1, wherein the specific method by which the map quality evaluation and feedback module is executed and the mapping quality is evaluated is as follows:
S31: initializing the relevant parameters, including the execution frame rate f, the local sub-map radius R, the minimum threshold M_min and maximum threshold M_max of the map density factor, the octree resolution adjustment coefficient Δr, and the point cloud field angle adjustment coefficient Δα;
S32: every f frames, the control end gathers statistics over the local sub-map within radius R, including the number n of points in the sub-map and the distance d_i from each point to its nearest neighbor;
S33: calculating point cloud map density factor
M = (1/n) · Σ_{i=1}^{n} d_i;
S34: if M > M_max is satisfied, the point cloud map quality is poor, and the octree resolution is adjusted to r = r - Δr and the point cloud field angle to α = α + Δα;
S35: if M < M_min is satisfied, the point cloud map is dense, and the octree resolution is adjusted to r = r + Δr and the point cloud field angle to α = α - Δα;
S36: the adjusted parameters are sent back to the robot end as feedback.
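Taking the density factor of step S33 to be the mean nearest-neighbor distance d_i over the n sub-map points (an assumed reading of the claim, but one consistent with large M signalling a sparse map in S34), a brute-force computation looks like:

```python
import numpy as np

def density_factor(points):
    """Mean nearest-neighbor distance of an (n x 3) local sub-map;
    larger values indicate a sparser point cloud map."""
    # Pairwise Euclidean distance matrix via broadcasting.
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)  # ignore each point's zero self-distance
    return float(d.min(axis=1).mean())
```

The O(n²) distance matrix is fine for a local sub-map of modest size; a k-d tree would replace it for large maps.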
CN202211382900.8A 2022-11-07 2022-11-07 Remote control-oriented control end mapping method under low communication bandwidth Active CN115601519B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211382900.8A CN115601519B (en) 2022-11-07 2022-11-07 Remote control-oriented control end mapping method under low communication bandwidth


Publications (2)

Publication Number Publication Date
CN115601519A (en) 2023-01-13
CN115601519B CN115601519B (en) 2024-04-05

Family

ID=84852634

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211382900.8A Active CN115601519B (en) 2022-11-07 2022-11-07 Remote control-oriented control end mapping method under low communication bandwidth

Country Status (1)

Country Link
CN (1) CN115601519B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107272454A (en) * 2017-06-19 2017-10-20 中国人民解放军国防科学技术大学 A kind of real time human-machine interaction method based on virtual reality
US20210270612A1 (en) * 2020-03-02 2021-09-02 Beijing Baidu Netcom Science And Technology Co., Ltd. Method, apparatus, computing device and computer-readable storage medium for positioning
CN113470089A (en) * 2021-07-21 2021-10-01 中国人民解放军国防科技大学 Cross-domain cooperative positioning and mapping method and system based on three-dimensional point cloud
CN113720324A (en) * 2021-08-30 2021-11-30 上海大学 Octree map construction method and system
WO2021237667A1 (en) * 2020-05-29 2021-12-02 浙江大学 Dense height map construction method suitable for legged robot planning


Also Published As

Publication number Publication date
CN115601519B (en) 2024-04-05

Similar Documents

Publication Publication Date Title
CN112581590B (en) Unmanned aerial vehicle cloud edge terminal cooperative control method for 5G security rescue networking
CN111415416A (en) Method and system for fusing monitoring real-time video and scene three-dimensional model
CN110060332B (en) High-precision three-dimensional mapping and modeling system based on airborne acquisition equipment
CN103327335B (en) For the FPGA coded method of unmanned plane image transmission, system
JP5861712B2 (en) Captured image compression transmission method and captured image compression transmission system
CN102915562A (en) Compressed sensing-based multi-view target tracking and 3D target reconstruction system and method
CN109917419B (en) Depth filling dense system and method based on laser radar and image
CN105812814B (en) A kind of flight data visualization coding and transmission method based on UAV Video
CN110009675B (en) Method, apparatus, medium, and device for generating disparity map
CN112461210A (en) Air-ground cooperative building surveying and mapping robot system and surveying and mapping method thereof
KR20240058858A (en) Multiple UAV-based image stitching method and system
CN111914615A (en) Fire-fighting area passability analysis system based on stereoscopic vision
CN105096284A (en) Method, device and system of generating road orthographic projection image
CN103442225A (en) Remote sensing image transmission system based on database online learning update limited rate
CN116430403A (en) Real-time situation awareness system and method based on low-altitude airborne multi-sensor fusion
CN114199235B (en) Positioning system and positioning method based on sector depth camera
CN115601519A (en) Remote control end mapping method facing low communication bandwidth
CN114092651A (en) Intelligent modeling system and method for emergency management
CN113569849A (en) Car fills electric pile interface detection intelligent interaction system based on computer vision
CN113012196A (en) Positioning method based on information fusion of binocular camera and inertial navigation sensor
CN112218051A (en) Unmanned aerial vehicle sea area monitoring method based on GIS
CN111025364A (en) Machine vision positioning system and method based on satellite assistance
CN115825946A (en) Millimeter wave radar ranging method and device based on unsupervised learning
CN116797716A (en) Real-time acquisition method, device and system for mapping model
Chen et al. Construction of a digital twin model for ships based on BeiDou communication

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant