CN117636251B - Disaster damage detection method and system based on robot
- Publication number: CN117636251B (Application CN202311616497.5A)
- Authority: CN (China)
- Legal status: Active
Classifications
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
- G01S17/88—Lidar systems specially adapted for specific applications
- G01C21/165—Dead reckoning by integrating acceleration or speed, i.e. inertial navigation, combined with non-inertial navigation instruments
- G01C21/1652—Inertial navigation combined with ranging devices, e.g. LIDAR or RADAR
- G01C21/1656—Inertial navigation combined with passive imaging devices, e.g. cameras
- G01C21/32—Structuring or formatting of map data
- G01C21/3804—Creation or updating of map data
- G01C21/3841—Map data obtained from two or more sources, e.g. probe vehicles
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections
- G06V10/757—Matching configurations of points or features
- G06V10/764—Recognition using classification, e.g. of video objects
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
Abstract
The invention provides a disaster damage detection method and system based on a robot. The disaster damage detection method comprises the following steps: judging in real time whether a disaster damage detection task has been received and, once it is received, obtaining the disaster damage detection target and the target position from the task; controlling the robot to travel to the position of the disaster damage detection target, collecting disaster damage scene image information of the detection target, and sending the disaster damage scene images acquired in real time to a control center platform; and performing image data processing on the disaster damage scene images to obtain the damage result reflected by the images. The system comprises modules corresponding to the method steps.
Description
Technical Field
The invention provides a disaster damage detection method and system based on a robot, and belongs to the technical field of disaster damage detection.
Background
In modern society, disaster accidents occur with increasing frequency and pose a serious threat to people's lives and property. Rapid disaster damage detection and assessment after an accident is essential for taking rescue and repair measures in time. However, traditional manual disaster damage detection suffers from low efficiency, poor safety and a high demand for professional expertise.
With the rapid development of robot technology, robots have shown great potential in many fields, and disaster damage detection is no exception. Robots can execute tasks in dangerous environments with high flexibility and accuracy, effectively completing disaster damage detection tasks and providing accurate data support. In the prior art, however, strong environmental interference at the disaster scene makes wireless communication signals unstable, which affects the success rate and stability of data transmission.
Disclosure of Invention
The invention provides a robot-based disaster damage detection method and system to solve the problem that strong environmental interference at the disaster scene makes wireless communication signals unstable and thereby affects the success rate and stability of data transmission. The technical scheme adopted is as follows:
a robot-based disaster detection method, the disaster detection method comprising:
judging whether a disaster damage detection task is received in real time, and acquiring a disaster damage detection target and a target position through the disaster damage detection task after the disaster damage detection task is received;
The control robot goes to the disaster damage detection target position to acquire disaster damage scene image information of the disaster damage detection target, and sends the disaster damage scene image acquired in real time to a control center platform; wherein the robot includes, but is not limited to, an unmanned aerial vehicle, a robot, etc.;
performing image data processing on the disaster damage scene image to obtain a damage result reflected by the disaster damage scene image;
and judging the disaster damage range and counting the engineering quantity through the existing design/history information matching, and completing the disaster damage information statistical analysis.
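For orientation, the minimal Python sketch below outlines how the four steps above could be orchestrated in a task loop; all class and method names (task_queue, robot, control_center and their members) are hypothetical placeholders for illustration only and are not part of the claimed system.

```python
import time

def run_disaster_detection(task_queue, robot, control_center):
    while True:
        task = task_queue.poll()                      # step 1: check for a detection task in real time
        if task is None:
            time.sleep(0.1)
            continue
        robot.goto(task.target_position)              # step 2: travel to the detection target position
        for image in robot.stream_scene_images(task.target):
            control_center.send(image)                # transmit scene images in real time
        damage = control_center.analyse_damage()      # step 3: image processing -> damage results
        report = control_center.match_design_history(damage)  # step 4: range, quantities, report
        control_center.publish(report)
```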
Further, controlling the robot to travel to the position of the disaster damage detection target, collecting disaster damage scene image information of the detection target, and sending the disaster damage scene images acquired in real time to a control center platform comprises:
controlling the robot to travel to the position of the disaster damage detection target and collect disaster damage scene image information of the detection target in real time, obtaining real-time disaster damage scene images;
the unmanned aerial vehicle sending the disaster damage scene images to the control center platform in real time by wireless communication;
and, while the unmanned aerial vehicle is sending the disaster damage scene images to the control center platform in real time by wireless communication, adjusting the wireless communication strategy by monitoring the wireless transmission quality in real time.
Further, controlling the robot to travel to the position of the disaster damage detection target and collect disaster damage scene image information of the detection target in real time, obtaining real-time disaster damage scene images, comprises:
performing spatial coordinate alignment of the disaster site using the plurality of sensors carried by the unmanned aerial vehicle, and performing joint calibration of the camera, the laser radar and the IMU at the disaster site;
providing an initial pose from the IMU data and correcting the motion distortion of the raw laser radar data;
the laser radar performing the initial pose transformation between keyframes using the IMU data pre-integration result, and obtaining the pose transformation coordinate parameters from the point cloud feature registration result;
providing front-end vision with a binocular/monocular camera, performing image feature extraction and inter-frame feature matching, and outputting the result of matching the visual features with the laser point cloud features;
providing inter-frame velocity estimation and global coordinate constraints through GNSS;
and, using the obtained multi-sensor data, constructing the following objective function from the laser odometry constraints, the IMU constraints and the global coordinate constraints, optimizing the poses and completing the construction of the global-coordinate 3D map:
F(x) = Σ_(i,j) ||e_ij(x)||²  (1)
where each residual e_ij is uniformly expressed as:
e_ij(x) = f_ij(x) - z_ij  (2)
f_ij(x) is the observation-equation prediction (laser radar, IMU or GNSS) from keyframe pose x_i to x_j, and z_ij is the matched measurement between the keyframes.
Further, while the unmanned aerial vehicle is sending the disaster damage scene images to the control center platform in real time by wireless communication, adjusting the wireless communication strategy by monitoring the wireless transmission quality in real time comprises:
at the starting moment of the unmanned aerial vehicle's task, controlling the unmanned aerial vehicle to use the first wireless communication module to establish a wireless data connection with the control center platform in one of a plurality of communication modes and to transmit disaster damage scene images; wherein the plurality of communication modes include a WiFi communication mode and a 4G/5G communication mode;
controlling the second wireless communication module of the unmanned aerial vehicle to detect the communication parameters of the first wireless communication module in real time, and obtaining the communication quality evaluation parameter of the first wireless communication module from those communication parameters;
while the second wireless communication module is detecting the communication parameters of the first wireless communication module, controlling the second wireless communication module to dock with the control center platform using each of the plurality of communication modes in turn and obtaining the communication quality evaluation parameter of each communication mode;
when the communication quality evaluation parameter of the first wireless communication module's current wireless communication mode remains lower than the communication quality evaluation parameter of any of the second wireless communication module's communication modes for longer than a preset first time threshold, adjusting the wireless communication mode between the first wireless communication module and the control center platform to the wireless communication mode corresponding to the second wireless communication module's highest communication quality evaluation parameter;
and, when the communication quality evaluation parameter of the first wireless communication module's adjusted wireless communication mode remains lower than the communication quality evaluation parameter of any of the second wireless communication module's communication modes for longer than a preset second time threshold, exchanging the functional roles of the first wireless communication module and the second wireless communication module.
The constraint between the first time threshold and the second time threshold is set as follows:
where T_1 and T_2 denote the first time threshold and the second time threshold, respectively; T_s1 denotes the average actual data transmission delay of the first wireless communication module in the wireless communication mode used before the adjustment; T_s2 denotes the average actual data transmission delay of the second wireless communication module in the wireless communication mode used before the adjustment; and k denotes a preset adjustment coefficient with a value in the range of 2.1 to 2.3.
The communication quality evaluation parameters of the first wireless communication module and the second wireless communication module are obtained by the following formula:
where E denotes the communication quality evaluation parameter of the first or second wireless communication module; E_0 denotes a preset reference constant; n denotes the number of unit times (1 s each) contained in the current wireless communication duration of the unmanned aerial vehicle; V_di denotes the data download speed of the wireless communication in the i-th unit time; V_si denotes the data upload speed of the wireless communication in the i-th unit time; V_d0 and V_s0 denote the lower limits of the data download speed and the data upload speed, respectively, corresponding to the optimal data communication state; and ΔV denotes the preset rated speed difference between data upload and data download.
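As an illustration of how the quantities entering this evaluation could be gathered, the sketch below records the per-second download and upload speeds (V_di, V_si) during a communication session and hands them to a pluggable evaluation function. The concrete expression for E is the formula given above (reproduced as an image in the original publication), so the `evaluate` callable and the class name are assumptions rather than the claimed formula.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class LinkQualityMonitor:
    evaluate: Callable[[List[float], List[float]], float]  # implements the E formula above
    v_down: List[float] = field(default_factory=list)      # V_di, download speed per 1 s slot
    v_up: List[float] = field(default_factory=list)        # V_si, upload speed per 1 s slot

    def sample(self, download_speed: float, upload_speed: float) -> None:
        """Record the measured speeds for one unit time (1 s)."""
        self.v_down.append(download_speed)
        self.v_up.append(upload_speed)

    def quality(self) -> float:
        """Communication quality evaluation parameter E over the n recorded unit times."""
        return self.evaluate(self.v_down, self.v_up)
```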
Further, adjusting the wireless communication mode between the first wireless communication module and the control center platform to the wireless communication mode corresponding to the second wireless communication module's highest communication quality evaluation parameter comprises:
controlling the second wireless communication module to establish a temporary data communication connection with the control center platform in the wireless communication mode corresponding to the highest communication quality evaluation parameter and to transmit disaster damage scene images;
after the second wireless communication module has completed the data communication connection with the control center platform, adjusting the wireless communication mode between the first wireless communication module and the control center platform to the wireless communication mode corresponding to the second wireless communication module's highest communication quality evaluation parameter and transmitting disaster damage scene images;
and, after the first wireless communication module has completed the transmission of its first disaster damage scene image to the control center platform, disconnecting the temporary data communication connection between the second wireless communication module and the control center platform.
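A minimal sketch of this make-before-break switch-over, assuming simple module and platform interfaces (connect, reconnect and wait_for_first_image_sent are illustrative names, not part of the patent):

```python
def switch_to_best_mode(first_module, second_module, platform, best_mode):
    # 1. the second module opens a temporary link in the best-rated mode and keeps images flowing
    temp_link = second_module.connect(platform, mode=best_mode)
    temp_link.start_image_stream()

    # 2. with the temporary link up, the first module is retuned to the same mode
    first_module.reconnect(platform, mode=best_mode)

    # 3. once the first module has delivered its first scene image over the new mode,
    #    the temporary link is no longer needed and is torn down
    first_module.wait_for_first_image_sent()
    temp_link.close()
```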
Further, determining the disaster damage range and counting the engineering quantities by matching against existing design/history information, completing the statistical analysis of the disaster damage information, comprises:
accessing, through the interface unit, the global 3D map and the image/point cloud feature matching information transmitted by the information transmission module;
importing the longitudinal and transverse curves of the route and the section mileage into the global 3D map, and calculating the normal plane of the route center line at each mileage station according to the following formula:
A(x - x_i) + B(y - y_i) + C(z - h_i) = 0  (4)
where x_i, y_i, h_i are the plane coordinates and elevation of the route center line at the mileage point; A, B and C are the plane parameters, obtained from the vector n = α × β = (A, B, C); α is the tangent direction of the route plane curve at the mileage point; and β is the vertical tangent direction at the mileage point;
setting a slice thickness, searching for points whose distance to the normal plane is less than half the slice thickness and projecting them onto the normal plane; taking the route center-line point of the mileage section as the polar-coordinate origin, calculating the polar angle of each projected point in the damage-recording normal plane, and connecting the projected points with straight lines in order of increasing polar angle to obtain the normal-plane section shape;
comparing the designed section with the normal-plane section shape to obtain the cut or fill area;
obtaining the fill volume from the average of the fill areas of adjacent mileage sections and the mileage length;
and performing segment-wise statistics of the earthwork volumes, identifying the disaster damage type from the image and point cloud feature matching information between adjacent mileage sections, and classifying and counting by route mileage to complete the disaster damage report.
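A possible numerical sketch of the cross-section statistics described above, assuming numpy arrays for the global-map point cloud and illustrative names for the inputs; the comparison with the design section is omitted, so this is a sketch under stated assumptions rather than the claimed implementation.

```python
import numpy as np

def section_area(points, center, alpha, beta, slice_thickness):
    """Cross-section area of the point cloud at one mileage station.
    points: (N, 3) global-map cloud; center: centre-line point (x_i, y_i, h_i);
    alpha, beta: tangent and vertical-tangent directions at that station."""
    n = np.cross(alpha, beta)                 # normal-plane parameters (A, B, C), eq. (4)
    n = n / np.linalg.norm(n)
    d = (points - center) @ n                 # signed distance of each point to the plane
    slab = points[np.abs(d) < slice_thickness / 2.0]
    d_slab = (slab - center) @ n
    proj = slab - d_slab[:, None] * n         # project the slab onto the normal plane

    u = alpha / np.linalg.norm(alpha)         # in-plane axes for the polar angles
    v = np.cross(n, u)
    rel = proj - center
    ang = np.arctan2(rel @ v, rel @ u)
    order = np.argsort(ang)                   # connect points by increasing polar angle
    x, y = (rel @ u)[order], (rel @ v)[order]
    # shoelace formula for the closed polygon traced by the section outline
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

def fill_volume(area_prev, area_next, mileage_length):
    """Earthwork (fill) volume between two adjacent mileage sections."""
    return 0.5 * (area_prev + area_next) * mileage_length
```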
A robot-based disaster damage detection system, the disaster damage detection system comprising:
The target acquisition module is used for judging whether a disaster damage detection task is received or not in real time, and acquiring a disaster damage detection target and a target position through the disaster damage detection task after the disaster damage detection task is received;
The data acquisition and mapping module is used for controlling the robot to go to the disaster damage detection target position to acquire disaster damage scene image information of the disaster damage detection target and transmitting the disaster damage scene image acquired in real time to the control center platform;
the result acquisition module is used for carrying out image data processing on the disaster damage scene image and acquiring a damage result reflected by the disaster damage scene image;
The statistics judging module is used for determining the disaster damage range and counting the engineering quantities by matching against existing design/history information, completing the statistical analysis of the disaster damage information.
Further, the data acquisition and mapping module comprises:
the image acquisition module is used for controlling the robot to go to the disaster damage detection target position to acquire disaster damage scene image information of the disaster damage detection target in real time and acquiring a real-time disaster damage scene image;
the data transmission module is used for transmitting the disaster damage scene image to the control center platform in real time by the unmanned aerial vehicle in a wireless communication mode;
And the strategy adjustment module is used for adjusting the wireless communication strategy by monitoring the wireless transmission quality condition in real time in the process that the unmanned aerial vehicle sends the disaster scene image to the control center platform in real time in a wireless communication mode.
Further, the policy adjustment module includes:
The first communication data acquisition and mapping module is used for controlling the unmanned aerial vehicle to carry out wireless data connection and disaster scene image transmission with the control center platform by utilizing one of a plurality of communication modes through the first wireless communication module at the starting moment of the unmanned aerial vehicle executing task; wherein, the plurality of communication modes comprise a wifi communication mode and a 4G/5G communication mode;
The first evaluation module is used for controlling a second wireless communication module of the unmanned aerial vehicle to detect communication parameters of the first wireless communication module in real time and acquiring communication quality evaluation parameters of the first wireless communication module according to the communication parameters;
The second evaluation module is used for, while the second wireless communication module is detecting the communication parameters of the first wireless communication module, controlling the second wireless communication module to dock with the control center platform using each of the plurality of communication modes in turn and obtaining the communication quality evaluation parameter of each communication mode;
the first adjustment module is used for adjusting the wireless communication mode between the first wireless communication module and the control center platform to the wireless communication mode corresponding to the second wireless communication module's highest communication quality evaluation parameter when the communication quality evaluation parameter of the first wireless communication module's current wireless communication mode remains lower than the communication quality evaluation parameter of any of the second wireless communication module's communication modes for longer than a preset first time threshold;
and the second adjustment module is used for exchanging the functional roles of the first wireless communication module and the second wireless communication module when the communication quality evaluation parameter of the first wireless communication module's adjusted wireless communication mode remains lower than the communication quality evaluation parameter of any of the second wireless communication module's communication modes for longer than a preset second time threshold.
The constraint between the first time threshold and the second time threshold is set as follows:
where T_1 and T_2 denote the first time threshold and the second time threshold, respectively; T_s1 denotes the average actual data transmission delay of the first wireless communication module in the wireless communication mode used before the adjustment; T_s2 denotes the average actual data transmission delay of the second wireless communication module in the wireless communication mode used before the adjustment; and k denotes a preset adjustment coefficient with a value in the range of 2.1 to 2.3.
The communication quality evaluation parameters of the first wireless communication module and the second wireless communication module are obtained by the following formula:
where E denotes the communication quality evaluation parameter of the first or second wireless communication module; E_0 denotes a preset reference constant; n denotes the number of unit times (1 s each) contained in the current wireless communication duration of the unmanned aerial vehicle; V_di denotes the data download speed of the wireless communication in the i-th unit time; V_si denotes the data upload speed of the wireless communication in the i-th unit time; V_d0 and V_s0 denote the lower limits of the data download speed and the data upload speed, respectively, corresponding to the optimal data communication state; and ΔV denotes the preset rated speed difference between data upload and data download.
Further, the first adjustment module includes:
The communication connection module is used for controlling the second wireless communication module and the control center platform to establish temporary data communication connection in a wireless communication mode corresponding to the highest communication quality evaluation parameter so as to transmit disaster damage scene images;
The adjustment execution module is used for, after the second wireless communication module has completed the data communication connection with the control center platform, adjusting the wireless communication mode between the first wireless communication module and the control center platform to the wireless communication mode corresponding to the second wireless communication module's highest communication quality evaluation parameter, and transmitting disaster damage scene images;
And the communication cut-off module is used for disconnecting the temporary data communication connection between the second wireless communication module and the control center platform after the transmission of the first disaster damage scene image is completed between the first wireless communication module and the control center platform.
The invention has the beneficial effects that:
The robot-based disaster damage detection method and system can automatically judge whether a disaster damage detection task has been received and obtain the target and target position information from the task requirements, realizing automated handling of disaster damage detection tasks. By controlling the robot to travel to the target position with the image acquisition equipment it carries, the system can collect disaster damage scene image information in real time, effectively improving the timeliness of data acquisition. By processing and analysing the image data of the disaster damage scene images, the damage information in the images can be extracted and the damage results obtained, effectively improving the accuracy and timeliness of disaster damage information acquisition and the efficiency of disaster damage detection.
Drawings
FIG. 1 is a flowchart of a method for detecting disaster damage;
FIG. 2 is a flow chart II of the disaster detection method;
FIG. 3 is a system block diagram of the disaster detection system;
FIG. 4 is a schematic diagram of the data acquisition and mapping module;
Fig. 5 is a schematic structural diagram of the statistics judging module.
Detailed Description
The preferred embodiments of the present invention will be described below with reference to the accompanying drawings, it being understood that the preferred embodiments described herein are for illustration and explanation of the present invention only, and are not intended to limit the present invention.
The embodiment of the invention provides a disaster damage detection method based on a robot, as shown in fig. 1, comprising the following steps:
S1, judging in real time whether a disaster damage detection task has been received and, once it is received, obtaining the disaster damage detection target and the target position from the task;
S2, controlling the robot to travel to the position of the disaster damage detection target, collecting disaster damage scene image information of the detection target, and sending the disaster damage scene images acquired in real time to a control center platform;
S3, performing image data processing on the disaster damage scene images to obtain the damage result reflected by the images.
The working principle of the technical scheme is as follows: whether a disaster damage detection task has been received is judged in real time, and the target and target position information in the task are obtained. This stage mainly covers task reception and the acquisition of target information according to the task requirements.
The robot is controlled to travel to the position of the disaster damage detection target to collect disaster damage scene image information of the target. By flying to the target position with the corresponding image acquisition equipment on board, the disaster damage scene images can be collected in real time and sent to the control center platform. This stage covers the control of the unmanned aerial vehicle and the image acquisition process.
The collected disaster damage scene images are then processed to obtain the damage results they reflect. Through analysis, processing and recognition of the images, the system can extract the damage information in the disaster damage scene images and generate the corresponding damage results.
The technical effect of this scheme is as follows: the robot-based disaster damage detection method can automatically judge whether a disaster damage detection task has been received and obtain the target and target position information from the task requirements, realizing automated handling of disaster damage detection tasks. By controlling the robot to travel to the target position with the image acquisition equipment it carries, the system can collect disaster damage scene image information in real time, effectively improving the timeliness of data acquisition. By processing and analysing the image data of the disaster damage scene images, the damage information in the images can be extracted and the damage results obtained, effectively improving the accuracy and timeliness of disaster damage information acquisition and the efficiency of disaster damage detection.
In one embodiment of the present invention, controlling the robot to travel to the position of the disaster damage detection target, collecting disaster damage scene image information of the detection target, and sending the disaster damage scene images acquired in real time to a control center platform comprises:
S201, controlling the robot to travel to the position of the disaster damage detection target and collect disaster damage scene image information of the detection target in real time, obtaining real-time disaster damage scene images;
S202, the unmanned aerial vehicle sending the disaster damage scene images to the control center platform in real time by wireless communication;
S203, while the unmanned aerial vehicle is sending the disaster damage scene images to the control center platform in real time by wireless communication, adjusting the wireless communication strategy by monitoring the wireless transmission quality in real time.
The working principle of the technical scheme is as follows: first, the robot is controlled to go to the disaster damage detection target position so as to acquire disaster damage scene image information in real time. The unmanned aerial vehicle carries corresponding image acquisition equipment, and performs image acquisition when flying to a target position, so that a real-time disaster damage scene image is acquired.
Then, the unmanned aerial vehicle sends the acquired disaster scene images to the control center platform in real time in a wireless communication mode. The unmanned aerial vehicle transmits the acquired image data to a control center platform by utilizing a wireless communication technology so as to facilitate subsequent processing and analysis.
Finally, in the process of transmitting images in real time by the unmanned aerial vehicle in a wireless communication mode, the system monitors the wireless transmission quality condition in real time and adjusts the wireless communication strategy according to the monitoring result. By monitoring the wireless transmission quality in real time, the system can find out the problems in transmission in time and adopts a corresponding adjustment strategy to ensure the reliable transmission of the image data.
The technical scheme has the effects that: real-time disaster scene image information can be obtained by controlling the robot to go to the target position for real-time image acquisition, and timely data support is provided. Through wireless communication mode, unmanned aerial vehicle can be in real time with the disaster damage scene image that gathers send to control center platform, can realize real-time transmission and the sharing of data, provide quick information feedback and throughput. The stability and reliability of data transmission can be optimized by monitoring the wireless transmission quality condition in real time and adjusting the wireless communication strategy according to the monitoring result, and the success rate and efficiency of data transmission are improved.
In summary, the technical scheme provided by the embodiment realizes timely acquisition and transmission of disaster damage scene image information through the processes of real-time image acquisition, real-time data transmission, wireless communication strategy optimization and the like, and provides an accurate and efficient data basis for subsequent processing and analysis.
In one embodiment of the present invention, controlling the robot to travel to the position of the disaster damage detection target and collect disaster damage scene image information of the detection target in real time, obtaining real-time disaster damage scene images, comprises:
performing spatial coordinate alignment of the disaster site using the plurality of sensors carried by the unmanned aerial vehicle, and performing joint calibration of the camera, the laser radar and the IMU at the disaster site.
S2011, providing an initial pose from the IMU data and correcting the motion distortion of the raw laser radar data; specifically:
because the IMU sampling rate is much higher than the laser radar frame rate, the radar data frames are used as the timestamps and the time coordinates are aligned; the IMU data between the start and end times of each laser frame are pre-integrated to obtain the rotation and translation increments of the inter-frame pose change, and the inter-frame laser point cloud coordinates are converted to the initial coordinate frame.
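The following simplified sketch illustrates this pre-integration and de-skewing idea under stated assumptions: IMU samples are objects with fields t, dt, gyro and acc, gravity compensation and bias estimation are omitted, and scipy rotations are used for convenience; none of these choices are prescribed by the patent.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def preintegrate(imu_samples, t0, t1):
    """Accumulate IMU samples in [t0, t1] into rotation/translation increments.
    Gravity compensation and bias handling are omitted for brevity."""
    R = Rotation.identity()
    v = np.zeros(3)
    p = np.zeros(3)
    for s in (x for x in imu_samples if t0 <= x.t <= t1):
        p = p + v * s.dt + 0.5 * R.apply(s.acc) * s.dt ** 2
        v = v + R.apply(s.acc) * s.dt
        R = R * Rotation.from_rotvec(s.gyro * s.dt)
    return R, p

def deskew(points, point_times, t0, t1, R_inc, p_inc):
    """De-skew one lidar frame: move every point back to the frame-start pose
    by interpolating the pre-integrated increment over the point's timestamp."""
    ratio = (np.asarray(point_times) - t0) / (t1 - t0)
    rotvec = R_inc.as_rotvec()
    out = np.empty_like(points)
    for i, pt in enumerate(points):
        R_i = Rotation.from_rotvec(ratio[i] * rotvec)   # interpolated rotation increment
        out[i] = R_i.apply(pt) + ratio[i] * p_inc       # interpolated translation increment
    return out
```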
S2012, the laser radar performing the initial pose transformation between keyframes using the IMU data pre-integration result, and obtaining the pose transformation coordinate parameters from the point cloud feature registration result;
specifically, the raw laser point cloud data are filtered to reduce redundancy and remove noise points;
the laser point cloud distortion is eliminated using the IMU pre-integration parameters;
the curvature of each laser point is calculated by squaring the sum of the range deviations between the current point and the five points before and after it, and corner points and planar feature points are extracted according to the curvature;
the laser odometry uses only laser radar keyframes; the remaining frames are ignored.
The laser point cloud map is constructed and maintained as a sliding-window laser keyframe feature map; the IMU pre-integration result is used as the initial matching pose; when a new laser keyframe appears, it is matched against the local feature map, point-to-line and point-to-plane constraint terms are constructed, and the pose of the new laser point cloud frame is obtained by minimizing the total constraint.
A new laser radar keyframe is created when the pose of the mobile platform changes sufficiently or the time interval between keyframes exceeds a certain threshold.
The odometry is built on feature matching, which reduces the computational load.
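A small sketch of the curvature-based feature extraction on one scan line; the thresholds and array layout are illustrative assumptions, as the concrete values are not specified in the text.

```python
import numpy as np

def extract_features(scan_points, corner_thresh=1.0, plane_thresh=0.1):
    """scan_points: (N, 3) points of one lidar scan line, in acquisition order."""
    ranges = np.linalg.norm(scan_points, axis=1)
    n = len(ranges)
    curvature = np.full(n, np.nan)
    for i in range(5, n - 5):
        # sum of range deviations between the 5 points before/after and the current point
        diff = np.sum(ranges[i - 5:i + 6]) - 11 * ranges[i]
        curvature[i] = diff ** 2
    corner_idx = np.where(curvature > corner_thresh)[0]   # high curvature -> corner points
    plane_idx = np.where(curvature < plane_thresh)[0]     # low curvature  -> planar points
    return scan_points[corner_idx], scan_points[plane_idx]
```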
S2013, providing front-end vision with a binocular/monocular camera, performing image feature extraction and inter-frame feature matching, and outputting the result of matching the visual features with the laser point cloud features; specifically:
a brightness threshold is set; for each pixel, the brightness of the 16 pixels on the circle of radius 3 around the central pixel is examined, and feature points are selected by traversing every pixel;
for a binocular camera, the feature points of the left and right images are matched, the depth of each feature point is estimated from the epipolar geometry, and the motion pose is estimated from the change of the feature points between the previous and current frames; for a monocular camera, the image feature points are matched and the motion is estimated from the change of the feature-point pixel positions between the previous and current frames;
the visual feature points are associated with the laser point cloud: (1) the laser point cloud data are converted into the image coordinate system, and the visual feature points and the laser radar depth points are projected onto a unit sphere centered on the camera; (2) for each visual feature point, the 3 laser radar depth points closest to it on the sphere are found and the depth of the feature point is obtained by interpolation, matching the image feature points with the laser point cloud information;
and the matching result between the image and the point cloud space is output.
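A sketch of the depth-association step, assuming the lidar points have already been transformed into the camera frame; the KD-tree search and the inverse-distance interpolation are implementation assumptions, since the text only specifies interpolation from the 3 nearest depth points.

```python
import numpy as np
from scipy.spatial import cKDTree

def associate_depth(feature_dirs, lidar_points_cam):
    """feature_dirs: (M, 3) bearing vectors of image feature points.
    lidar_points_cam: (K, 3) lidar points already transformed into the camera frame."""
    depths = np.linalg.norm(lidar_points_cam, axis=1)
    lidar_dirs = lidar_points_cam / depths[:, None]         # project onto the unit sphere
    tree = cKDTree(lidar_dirs)
    feature_depths = np.empty(len(feature_dirs))
    for i, f in enumerate(feature_dirs):
        dist, idx = tree.query(f / np.linalg.norm(f), k=3)   # 3 nearest lidar directions
        w = 1.0 / np.maximum(dist, 1e-9)                     # inverse-distance weights
        feature_depths[i] = np.sum(w * depths[idx]) / np.sum(w)
    return feature_depths                                    # metric depth per feature point
```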
S2014, providing inter-frame velocity estimation and global coordinate constraints through GNSS;
S2015, using the obtained multi-sensor data, constructing the following objective function from the laser odometry constraints, the IMU constraints and the global coordinate constraints, optimizing the poses and completing the construction of the global-coordinate 3D map:
F(x) = Σ_(i,j) ||e_ij(x)||²  (1)
where each residual e_ij is uniformly expressed as:
e_ij(x) = f_ij(x) - z_ij  (2)
f_ij(x) is the observation-equation prediction (laser radar, IMU or GNSS) from keyframe pose x_i to x_j, and z_ij is the matched measurement between the keyframes.
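To make the structure of equations (1)-(2) concrete, the sketch below stacks the residuals e_ij(x) = f_ij(x) - z_ij of the lidar-odometry, IMU and GNSS constraints and minimizes their total squared norm. The Constraint container and the use of scipy's generic least-squares solver are assumptions; a production system would typically use a dedicated factor-graph library instead.

```python
import numpy as np
from dataclasses import dataclass
from typing import Callable, List
from scipy.optimize import least_squares

@dataclass
class Constraint:
    predict: Callable[[np.ndarray], np.ndarray]  # f_ij: observation prediction from poses x
    z: np.ndarray                                # z_ij: matched measurement between keyframes

def total_residual(x: np.ndarray, constraints: List[Constraint]) -> np.ndarray:
    # stack e_ij(x) = f_ij(x) - z_ij over lidar-odometry, IMU and GNSS constraints
    return np.concatenate([c.predict(x) - c.z for c in constraints])

def optimise_poses(x0: np.ndarray, constraints: List[Constraint]) -> np.ndarray:
    """Minimise sum_ij ||e_ij(x)||^2 over the stacked keyframe poses x."""
    return least_squares(total_residual, x0, args=(constraints,)).x
```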
The working principle of the technical scheme is as follows: for positioning and mapping of the disaster site, the invention adopts a multi-sensor fusion method, which improves the accuracy and redundancy of positioning and mapping of the disaster-site detection data. The architecture and flow of the data acquisition and mapping module are shown in fig. 4. The mobile platform and its sensors are used to couple and map the multi-source data, locate the disaster area and form a topographic map of the disaster area. The results of the acquisition and mapping module are transmitted to the rear for statistical analysis, where the disaster damage range is determined and the engineering quantities are counted by matching against existing design/history information, completing the statistical analysis of the disaster damage information.
The technical effect of this scheme is as follows: the robot disaster damage detection system can perform autonomous multi-sensor-fusion data acquisition on site and complete the construction of the disaster-area map by factor graph optimization, greatly reducing the on-site workload of personnel while improving the mapping accuracy; at the same time, the front-end robot data can be transmitted remotely to the control end, greatly expanding the coverage of the post-disaster dangerous area and reducing the level of dangerous work for personnel. Finally, the disaster damage information is statistically reported, which can greatly improve the response speed of emergency rescue after major road traffic disasters.
In one embodiment of the present invention, while the unmanned aerial vehicle is sending the disaster damage scene images to the control center platform in real time by wireless communication, adjusting the wireless communication strategy by monitoring the wireless transmission quality in real time comprises:
S2031, at the starting moment of the unmanned aerial vehicle's task, controlling the unmanned aerial vehicle to use the first wireless communication module to establish a wireless data connection with the control center platform in one of a plurality of communication modes and to transmit disaster damage scene images; wherein the plurality of communication modes include a WiFi communication mode and a 4G/5G communication mode;
S2032, controlling the second wireless communication module of the unmanned aerial vehicle to detect the communication parameters of the first wireless communication module in real time, and obtaining the communication quality evaluation parameter of the first wireless communication module from those communication parameters;
S2033, while the second wireless communication module is detecting the communication parameters of the first wireless communication module, controlling the second wireless communication module to dock with the control center platform using each of the plurality of communication modes in turn and obtaining the communication quality evaluation parameter of each communication mode;
S2034, when the communication quality evaluation parameter of the first wireless communication module's current wireless communication mode remains lower than the communication quality evaluation parameter of any of the second wireless communication module's communication modes for longer than a preset first time threshold, adjusting the wireless communication mode between the first wireless communication module and the control center platform to the wireless communication mode corresponding to the second wireless communication module's highest communication quality evaluation parameter;
S2035, when the communication quality evaluation parameter of the first wireless communication module's adjusted wireless communication mode remains lower than the communication quality evaluation parameter of any of the second wireless communication module's communication modes for longer than a preset second time threshold, exchanging the functional roles of the first wireless communication module and the second wireless communication module.
The constraint between the first time threshold and the second time threshold is set as follows:
where T_1 and T_2 denote the first time threshold and the second time threshold, respectively; T_s1 denotes the average actual data transmission delay of the first wireless communication module in the wireless communication mode used before the adjustment; T_s2 denotes the average actual data transmission delay of the second wireless communication module in the wireless communication mode used before the adjustment; and k denotes a preset adjustment coefficient with a value in the range of 2.1 to 2.3.
The communication quality evaluation parameters of the first wireless communication module and the second wireless communication module are obtained by the following formula:
where E denotes the communication quality evaluation parameter of the first or second wireless communication module; E_0 denotes a preset reference constant; n denotes the number of unit times (1 s each) contained in the current wireless communication duration of the unmanned aerial vehicle; V_di denotes the data download speed of the wireless communication in the i-th unit time; V_si denotes the data upload speed of the wireless communication in the i-th unit time; V_d0 and V_s0 denote the lower limits of the data download speed and the data upload speed, respectively, corresponding to the optimal data communication state; ΔV denotes the preset rated speed difference between data upload and data download; and abs denotes the absolute value function.
The working principle of the technical scheme is as follows: first, at the starting moment of the unmanned aerial vehicle's task, the first wireless communication module is used to establish a wireless data connection with the control center platform in one of a plurality of communication modes and to transmit disaster damage scene images. The communication modes can include a WiFi communication mode and a 4G/5G communication mode, and a communication connection is established with the control center platform by selecting a suitable mode.
Then, the second wireless communication module of the unmanned aerial vehicle is controlled to detect the communication parameters of the first wireless communication module in real time, and the communication quality evaluation parameter of the first wireless communication module is obtained from those parameters. The second wireless communication module is responsible for monitoring the communication state of the first wireless communication module and obtaining the parameters needed to evaluate its communication quality.
Then, while the second wireless communication module periodically detects the communication parameters of the first wireless communication module, it is also controlled to dock with the control center platform using each of the plurality of communication modes in turn, and the communication quality evaluation parameter of each communication mode is obtained. In this way, performance indicators of the various communication modes are obtained for subsequent comparison and decision making.
Then, when the communication quality evaluation parameter of the first wireless communication module's current communication mode remains lower than that of any of the second wireless communication module's communication modes for longer than the preset first time threshold, the system adjusts the wireless communication mode between the first wireless communication module and the control center platform, selecting the communication mode corresponding to the second wireless communication module's highest communication quality evaluation parameter.
Finally, when the communication quality evaluation parameter of the communication mode adjusted by the first wireless communication module remains lower than that of any of the second wireless communication module's communication modes for longer than the preset second time threshold, the system exchanges the functional roles of the first wireless communication module and the second wireless communication module: the first wireless communication module then monitors the communication quality of the second wireless communication module, and the second wireless communication module takes over communication with the control center platform.
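A compact sketch of this monitoring-and-switching loop under assumed interfaces (transmitting, measure_quality_of, probe_quality and reconnect are illustrative names); reading "lower than any of the second module's modes" as "lower than at least one of them" is an interpretation of the text rather than a stated definition.

```python
def monitor_and_switch(active, standby, platform, t1_threshold, t2_threshold, clock):
    """active/standby: the two wireless communication modules; clock.now() in seconds."""
    worse_since = None
    switched_once = False
    while active.transmitting():
        e_active = standby.measure_quality_of(active)         # E of the active module's current mode
        per_mode = {mode: standby.probe_quality(platform, mode)
                    for mode in ("wifi", "4g_5g")}             # dock with each mode in turn
        if e_active < max(per_mode.values()):                  # worse than at least one standby mode
            if worse_since is None:
                worse_since = clock.now()
            worse_for = clock.now() - worse_since
            if not switched_once and worse_for > t1_threshold:
                best_mode = max(per_mode, key=per_mode.get)
                active.reconnect(platform, mode=best_mode)     # first adjustment (T1 exceeded)
                switched_once, worse_since = True, None
            elif switched_once and worse_for > t2_threshold:
                active, standby = standby, active              # exchange functional roles (T2 exceeded)
                switched_once, worse_since = False, None
        else:
            worse_since = None
```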
The technical scheme has the effects that: and at the starting moment of the unmanned aerial vehicle executing task, the first wireless communication module is utilized to carry out wireless data connection and disaster scene image transmission with the control center platform through one of a plurality of communication modes. Thus, a proper communication mode, such as a WiFi communication mode and a 4G/5G communication mode, can be selected according to actual conditions.
The second wireless communication module is controlled to detect the communication parameters of the first wireless communication module in real time, and the communication quality evaluation parameter of the first wireless communication module is acquired. Meanwhile, signal docking with the control center platform is performed by using the plurality of communication modes in turn, and the communication quality evaluation parameter of each communication mode is obtained. This allows real-time monitoring and assessment of communication quality.
When the communication quality evaluation parameter of the current communication mode of the first wireless communication module stays below the communication quality evaluation parameter corresponding to any one communication mode of the second wireless communication module for longer than the preset time threshold, the system adjusts the wireless communication mode between the first wireless communication module and the control center platform to the wireless communication mode corresponding to the highest communication quality evaluation parameter of the second wireless communication module. If the adjusted communication mode still cannot meet the requirements, the system exchanges the functional roles of the first wireless communication module and the second wireless communication module. In this way, dynamic adjustment and optimization of the communication mode can be realized, and the communication quality and reliability are improved.
Meanwhile, the above method effectively improves the accuracy with which the first time threshold and the second time threshold are set and how well they match the actual communication conditions, which in turn improves the rationality of the wireless communication mode selection. Likewise, setting the communication quality evaluation parameters of the first wireless communication module and the second wireless communication module according to this scheme improves the accuracy of those parameters and how faithfully they reflect the real communication quality.
In one embodiment of the present invention, as shown in fig. 2, the method for adjusting the wireless communication mode between the first wireless communication module and the control center platform to be the wireless communication mode corresponding to the highest communication quality evaluation parameter of the second wireless communication module includes:
Step 1, controlling the second wireless communication module and the control center platform to establish temporary data communication connection in a wireless communication mode corresponding to the highest communication quality evaluation parameter, and transmitting disaster scene images;
Step 2, after the data communication connection is completed between the second wireless communication module and the control center platform, the wireless communication mode between the first wireless communication module and the control center platform is adjusted to be the wireless communication mode corresponding to the highest communication quality evaluation parameter of the second wireless communication module, and disaster damage scene image transmission is carried out;
Step 3, disconnecting the temporary data communication connection between the second wireless communication module and the control center platform after the transmission of the first disaster damage scene image is completed between the first wireless communication module and the control center platform.
The working principle of the technical scheme is as follows: and establishing temporary data communication connection with the control center platform by controlling the second wireless communication module, and transmitting disaster scene images by using a wireless communication mode corresponding to the highest communication quality evaluation parameter.
After stable data communication connection is established between the second wireless communication module and the control center platform, the wireless communication mode between the first wireless communication module and the control center platform is adjusted to be the wireless communication mode corresponding to the highest communication quality evaluation parameter of the second wireless communication module, and disaster damage scene image transmission is carried out.
And after the transmission of the first disaster damage scene image is completed, disconnecting the temporary data communication connection between the second wireless communication module and the control center platform.
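The three steps above amount to a make-before-break handover; the sketch below illustrates the ordering with placeholder link objects, whose method names (connect, set_mode, send_image, disconnect) are illustrative rather than an interface defined by the patent.

```python
# Minimal sketch of the make-before-break handover described above.
# The link objects are placeholders, not an API from the patent.

def handover(first_link, second_link, best_mode, first_image):
    # Step 1: the standby module opens a temporary connection in the best
    #         mode and keeps the image stream alive during the transition.
    second_link.connect(best_mode)
    second_link.send_image(first_image)

    # Step 2: once the temporary connection is up, the primary module is
    #         re-pointed at the same (best) communication mode.
    first_link.set_mode(best_mode)
    first_link.send_image(first_image)

    # Step 3: after the primary module completes its first image transfer,
    #         the temporary connection is released to free the channel.
    second_link.disconnect()


class _StubLink:
    """Tiny stand-in so the sketch can be executed end to end."""
    def __init__(self, name): self.name = name
    def connect(self, mode): print(f"{self.name}: temporary link up via {mode}")
    def set_mode(self, mode): print(f"{self.name}: switched to {mode}")
    def send_image(self, img): print(f"{self.name}: sent {img}")
    def disconnect(self): print(f"{self.name}: temporary link closed")


handover(_StubLink("first"), _StubLink("second"), "5g", "scene_0001.jpg")
```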
The technical scheme has the effects that: the temporary data communication connection is established through the wireless communication mode corresponding to the highest communication quality evaluation parameter of the second wireless communication module, so that the rapid transmission of disaster scene images can be ensured, and the data transmission efficiency is improved.
The wireless communication mode of the first wireless communication module is adjusted to be the wireless communication mode corresponding to the highest communication quality evaluation parameter of the second wireless communication module, so that the communication quality can be further improved by optimizing the communication mode, and stable data transmission is ensured.
The temporary data communication connection between the second wireless communication module and the control center platform is disconnected, so that occupied communication resources can be released, the communication resources can be used for other communication demands, and the resource utilization efficiency is improved.
In one embodiment of the present invention, as shown in fig. 5, determining the disaster damage range and counting the engineering quantities by matching against existing design/historical information, and completing the disaster damage information statistical analysis, includes:
the global 3D map, the image and the point cloud feature matching information transmitted by the information transmission module are accessed through the interface unit;
Importing the horizontal and vertical alignment curves of the route and the section mileage into the global 3D map, and calculating the normal plane of the route center line at that mileage according to the following formula
A(x - x_i) + B(y - y_i) + C(z - h_i) = 0 (4)
Wherein: x_i, y_i, h_i are the plane coordinates and elevation of the route center line at the mileage point; A, B and C are plane parameters, obtained from the vector n = α × β = (A, B, C), where α is the tangent direction of the horizontal curve at the mileage point and β is the vertical tangent direction at the mileage point;
Setting a slice thickness, searching for the points whose distance to the normal plane is smaller than 1/2 of the slice thickness and projecting them onto the normal plane, taking the center point of the route at the mileage section as the polar coordinate origin, calculating and recording the polar angle of each point projected onto the normal plane, and connecting the projected points with straight lines in increasing order of polar angle to obtain the normal plane section shape;
comparing the designed cross-section with the normal plane section shape to obtain the cut or fill area;
obtaining the fill volume by multiplying the average of the fill areas of adjacent mileage sections by the mileage length between them;
and carrying out segmented statistics on the earthwork volume, identifying the disaster damage type from the images and the point cloud feature matching information between adjacent mileage sections, and carrying out classified statistics by route mileage to complete the disaster damage report.
The working principle of the technical scheme is as follows: the global 3D map, image and point cloud feature matching information transmitted by the information transmission module are accessed through the interface unit. Such information may include 3D data of various scene features such as the terrain, buildings and vegetation.
The horizontal and vertical alignment curves of the route and the section mileage are then imported into the global 3D map, and the normal plane of the route center line at the given mileage is calculated from the input mileage information. This calculation involves computing the plane parameters and obtaining the normal vector from the tangent directions at the mileage point, as illustrated in the sketch below.
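As a minimal sketch, and assuming the two tangent directions α and β at the mileage point are available as 3D vectors, the plane parameters of formula (4) can be assembled as follows; the function name and example values are illustrative only.

```python
# Sketch of the normal-plane setup above: n = alpha x beta = (A, B, C), and
# the plane passes through the centerline point (x_i, y_i, h_i).

import numpy as np

def normal_plane(center, alpha, beta):
    """Return (A, B, C, D) for the plane A*x + B*y + C*z + D = 0."""
    a, b, c = np.cross(alpha, beta)        # n = alpha x beta = (A, B, C)
    d = -(a * center[0] + b * center[1] + c * center[2])
    return a, b, c, d

# Example at a hypothetical mileage point (x_i, y_i, h_i) = (10.0, 5.0, 2.0)
# with a horizontal tangent along +x and a vertical tangent along +z.
print(normal_plane(np.array([10.0, 5.0, 2.0]),
                   np.array([1.0, 0.0, 0.0]),
                   np.array([0.0, 0.0, 1.0])))
```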
The slice thickness is set, points whose distance to the normal plane is less than 1/2 of the slice thickness are selected and projected onto the normal plane, and then, taking the center point of the route at the mileage section as the polar coordinate origin, the polar angle of each projected point is calculated and recorded.
The projected points are connected by straight lines in order of increasing polar angle, which yields the normal plane cross-sectional shape. This shape can be used to analyze terrain features of the section such as ridges and valleys; a sketch of the slicing and ordering step follows.
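Assuming the global 3D map is available as an N × 3 point array, the slicing, projection, and polar-angle ordering described above could look roughly like the following; the function and argument names are illustrative.

```python
# Sketch of the slicing step: keep points within half the slice thickness of
# the normal plane, project them onto the plane, and order them by polar
# angle around the centerline point.

import numpy as np

def cross_section(points, center, normal, slice_thickness):
    """Return the section outline: sliced points ordered by polar angle."""
    n = normal / np.linalg.norm(normal)
    d = (points - center) @ n                       # signed distance to plane
    near = points[np.abs(d) < slice_thickness / 2]  # keep only the thin slice
    proj = near - np.outer((near - center) @ n, n)  # project onto the plane

    # In-plane 2D frame (u, v) for polar angles; assumes the plane normal is
    # not vertical, which holds for a normal plane along a road centerline.
    u = np.cross(n, [0.0, 0.0, 1.0])
    u /= np.linalg.norm(u)
    v = np.cross(n, u)
    rel = proj - center
    angles = np.arctan2(rel @ v, rel @ u)
    return proj[np.argsort(angles)]                 # outline in angular order
```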
By comparing the shape of the designed cross-section with the normal plane section shape, the cut or fill area can be calculated. This can be used for estimating engineering quantities, for example statistics of quantities such as cut and fill earthwork and temporary access roads and bridges.
The fill volume is obtained by multiplying the average of the fill areas of adjacent mileage sections by the mileage length between them. This is a further calculation of the engineering quantity, from which the total fill and cut volumes can be obtained. A brief sketch of this average-end-area computation follows.
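The sketch below illustrates the average-end-area computation just described; the input values are illustrative.

```python
# Average-end-area method: the volume between two adjacent mileage sections
# is the mean of their section areas times the mileage length between them.

def earthwork_volume(section_areas, mileages):
    """section_areas[i] is the fill (or cut) area at mileage mileages[i]."""
    volume = 0.0
    for i in range(len(section_areas) - 1):
        mean_area = (section_areas[i] + section_areas[i + 1]) / 2.0
        length = mileages[i + 1] - mileages[i]
        volume += mean_area * length
    return volume

# Example: three sections at K0+000, K0+020, K0+050 (areas in m^2).
print(earthwork_volume([14.2, 18.7, 9.3], [0.0, 20.0, 50.0]))  # -> 749.0 m^3
```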
Segmented statistics of the earthwork volume are then carried out, and the disaster damage type is identified from the images and the point cloud feature matching information between adjacent mileage sections. Using machine vision and image processing techniques, various disaster types such as fire, flood and earthquake damage can be identified by analyzing the image and point cloud data.
And finally, counting the disaster damage types according to the route mileage classification, and completing the generation of a disaster damage report.
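As a minimal illustration of the classified statistics by route mileage described above, recognized damage types and earthwork volumes could be grouped per segment as follows; the segment labels, damage types and volumes are hypothetical.

```python
# Sketch of the per-mileage classified statistics used to assemble the
# disaster damage report. Field names are illustrative.

from collections import defaultdict

def build_report(records):
    """records: iterable of (mileage_segment, damage_type, volume_m3)."""
    report = defaultdict(lambda: defaultdict(float))
    for segment, damage_type, volume in records:
        report[segment][damage_type] += volume
    return {seg: dict(types) for seg, types in report.items()}

print(build_report([
    ("K0+000-K0+200", "landslide", 420.0),
    ("K0+000-K0+200", "washout", 120.5),
    ("K0+200-K0+400", "landslide", 80.0),
]))
```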
The technical effects of the technical scheme are as follows: a large amount of 3D map data and image data can be rapidly and accurately acquired and processed, improving data processing efficiency. Calculating the normal plane and the polar angles of the projected points allows the cross-sectional shape of the terrain to be displayed more intuitively, which facilitates analysis of terrain changes. By comparing the designed cross-section with the normal plane section shape, the cut and fill areas can be calculated accurately, which facilitates estimation of engineering quantities. By utilizing image processing and point cloud data processing techniques, the disaster damage type can be automatically identified and classified statistics can be carried out, improving the efficiency and accuracy of disaster damage assessment. The technical scheme of this embodiment combines 3D map data processing, image processing and machine vision, realizes fine analysis of the landform and automatic evaluation of disaster damage, and has high practical value and application prospects.
The embodiment of the invention provides a disaster damage detection system based on a robot, as shown in fig. 3, the disaster damage detection system comprises:
The target acquisition module is used for judging whether a disaster damage detection task is received or not in real time, and acquiring a disaster damage detection target and a target position through the disaster damage detection task after the disaster damage detection task is received;
The data acquisition and mapping module is used for controlling the robot to go to the disaster damage detection target position to acquire disaster damage scene image information of the disaster damage detection target and transmitting the disaster damage scene image acquired in real time to the control center platform;
the result acquisition module is used for carrying out image data processing on the disaster damage scene image and acquiring a damage result reflected by the disaster damage scene image.
The working principle of the technical scheme is as follows: firstly, judging whether a disaster damage detection task is received or not in real time through a target acquisition module, and acquiring a disaster damage detection target and a target position through the disaster damage detection task after the disaster damage detection task is received;
Then, a data acquisition and mapping module is used for controlling the robot to go to the disaster detection target position to acquire disaster scene image information of the disaster detection target, and a disaster scene image acquired in real time is sent to a control center platform;
And finally, performing image data processing on the disaster damage scene image through a result acquisition module to acquire a damage result reflected by the disaster damage scene image.
The technical scheme has the effects that: the disaster damage detection system based on the robot can automatically judge whether a disaster damage detection task is received or not, and acquire the target and the target position information according to the task requirement, so that the automation processing of the disaster damage detection task is realized. By controlling the robot to go to the target position and carrying the image acquisition equipment, the system can acquire disaster scene image information in real time, and timeliness and instantaneity of data acquisition are effectively improved. By processing and analyzing the image data of the disaster damage scene image, damage information in the image can be extracted, damage results can be obtained, and accuracy and timeliness of disaster damage information obtaining and disaster damage detection efficiency are effectively improved.
In one embodiment of the present invention, the data acquisition and mapping module includes:
the image acquisition module is used for controlling the robot to go to the disaster damage detection target position to acquire disaster damage scene image information of the disaster damage detection target in real time and acquiring a real-time disaster damage scene image;
the data transmission module is used for transmitting the disaster damage scene image to the control center platform in real time by the unmanned aerial vehicle in a wireless communication mode;
And the strategy adjustment module is used for adjusting the wireless communication strategy by monitoring the wireless transmission quality condition in real time in the process that the unmanned aerial vehicle sends the disaster scene image to the control center platform in real time in a wireless communication mode.
The working principle of the technical scheme is as follows: firstly, controlling a robot to go to the disaster damage detection target position through an image acquisition module to acquire disaster damage scene image information of the disaster damage detection target in real time, and acquiring a real-time disaster damage scene image;
Then, the unmanned aerial vehicle is controlled by a data transmission module to transmit the disaster damage scene image to a control center platform in real time in a wireless communication mode;
And finally, adopting a strategy adjustment module to adjust a wireless communication strategy by monitoring the wireless transmission quality condition in real time in the process that the unmanned aerial vehicle sends the disaster scene image to a control center platform in real time in a wireless communication mode.
The technical scheme has the effects that: real-time disaster scene image information can be obtained by controlling the robot to go to the target position for real-time image acquisition, and timely data support is provided. Through wireless communication mode, unmanned aerial vehicle can be in real time with the disaster damage scene image that gathers send to control center platform, can realize real-time transmission and the sharing of data, provide quick information feedback and throughput. The stability and reliability of data transmission can be optimized by monitoring the wireless transmission quality condition in real time and adjusting the wireless communication strategy according to the monitoring result, and the success rate and efficiency of data transmission are improved.
In summary, the technical scheme provided by the embodiment realizes timely acquisition and transmission of disaster damage scene image information through the processes of real-time image acquisition, real-time data transmission, wireless communication strategy optimization and the like, and provides an accurate and efficient data basis for subsequent processing and analysis.
In one embodiment of the present invention, the policy adjustment module includes:
The first communication data acquisition and mapping module is used for controlling the unmanned aerial vehicle to carry out wireless data connection and disaster scene image transmission with the control center platform by utilizing one of a plurality of communication modes through the first wireless communication module at the starting moment of the unmanned aerial vehicle executing task; wherein, the plurality of communication modes comprise a wifi communication mode and a 4G/5G communication mode;
The first evaluation module is used for controlling a second wireless communication module of the unmanned aerial vehicle to detect communication parameters of the first wireless communication module in real time and acquiring communication quality evaluation parameters of the first wireless communication module according to the communication parameters;
The second evaluation module is used for controlling the second wireless communication module to perform signal docking with the control center platform using the plurality of communication modes in turn while the second wireless communication module performs communication parameter detection on the first wireless communication module, and for obtaining the communication quality evaluation parameter of each communication mode;
The first adjustment module is configured to adjust the wireless communication mode between the first wireless communication module and the control center platform to the wireless communication mode corresponding to the highest communication quality evaluation parameter of the second wireless communication module when the duration for which the communication quality evaluation parameter of the current wireless communication mode of the first wireless communication module stays below the communication quality evaluation parameter of any communication mode of the second wireless communication module exceeds a preset first time threshold;
And the second adjustment module is used for exchanging the functional roles of the first wireless communication module and the second wireless communication module when the duration for which the communication quality evaluation parameter of the adjusted wireless communication mode of the first wireless communication module stays below the communication quality evaluation parameter of any communication mode of the second wireless communication module exceeds a preset second time threshold.
The constraint conditions set between the first time threshold and the second time threshold are as follows:
Wherein T_1 and T_2 represent the first time threshold and the second time threshold, respectively; T_s1 represents the actual average data transmission delay corresponding to the wireless communication mode of the first wireless communication module before adjustment; T_s2 represents the actual average data transmission delay corresponding to the wireless communication mode of the second wireless communication module before adjustment; k represents a preset adjustment coefficient, and the value range of k is 2.1-2.3.
The communication quality evaluation parameters corresponding to the first wireless communication module and the second wireless communication module are obtained through the following formula:
Wherein E represents the communication quality evaluation parameter corresponding to the first wireless communication module or the second wireless communication module; E_0 represents a preset reference constant; n represents the number of unit times contained in the current wireless communication duration of the unmanned aerial vehicle, each unit time being 1 s; V_di denotes the data download speed of the wireless communication in the i-th unit time; V_si denotes the data upload speed of the wireless communication in the i-th unit time; V_d0 and V_s0 respectively denote the lower limit of the data download speed and the lower limit of the data upload speed corresponding to the optimal data communication state; ΔV denotes the preset rated speed difference between data upload and data download.
The working principle of the technical scheme is as follows: firstly, at the starting moment of the unmanned aerial vehicle executing task, the unmanned aerial vehicle is controlled to carry out wireless data connection and disaster scene image transmission with a control center platform by utilizing a first wireless communication module through one communication mode of a plurality of communication modes through a first communication data acquisition and mapping module; wherein, the plurality of communication modes comprise a wifi communication mode and a 4G/5G communication mode;
then, a first evaluation module is used for controlling a second wireless communication module of the unmanned aerial vehicle to detect communication parameters of the first wireless communication module in real time, and communication quality evaluation parameters of the first wireless communication module are obtained according to the communication parameters;
then, the second evaluation module is utilized to control the second wireless communication module to perform signal docking with the control center platform using the plurality of communication modes in turn while the second wireless communication module detects the communication parameters of the first wireless communication module, and the communication quality evaluation parameter of each communication mode is obtained;
Then, when the duration for which the communication quality evaluation parameter corresponding to the current wireless communication mode of the first wireless communication module stays below the communication quality evaluation parameter corresponding to any one of the communication modes of the second wireless communication module exceeds a preset first time threshold, the first adjustment module adjusts the wireless communication mode between the first wireless communication module and the control center platform to the wireless communication mode corresponding to the highest communication quality evaluation parameter of the second wireless communication module;
And finally, the second adjustment module is adopted to exchange the functional roles of the first wireless communication module and the second wireless communication module when the duration for which the communication quality evaluation parameter corresponding to the adjusted wireless communication mode of the first wireless communication module stays below the communication quality evaluation parameter corresponding to any one of the communication modes of the second wireless communication module exceeds a preset second time threshold.
The technical scheme has the effects that: and at the starting moment of the unmanned aerial vehicle executing task, the first wireless communication module is utilized to carry out wireless data connection and disaster scene image transmission with the control center platform through one of a plurality of communication modes. Thus, a proper communication mode, such as a WiFi communication mode and a 4G/5G communication mode, can be selected according to actual conditions.
The second wireless communication module is controlled to detect the communication parameters of the first wireless communication module in real time, and the communication quality evaluation parameter of the first wireless communication module is acquired. Meanwhile, signal docking with the control center platform is performed by using the plurality of communication modes in turn, and the communication quality evaluation parameter of each communication mode is obtained. This allows real-time monitoring and assessment of communication quality.
When the communication quality evaluation parameter of the current communication mode of the first wireless communication module stays below the communication quality evaluation parameter corresponding to any one communication mode of the second wireless communication module for longer than the preset time threshold, the system adjusts the wireless communication mode between the first wireless communication module and the control center platform to the wireless communication mode corresponding to the highest communication quality evaluation parameter of the second wireless communication module. If the adjusted communication mode still cannot meet the requirements, the system exchanges the functional roles of the first wireless communication module and the second wireless communication module. In this way, dynamic adjustment and optimization of the communication mode can be realized, and the communication quality and reliability are improved.
Meanwhile, the above method effectively improves the accuracy with which the first time threshold and the second time threshold are set and how well they match the actual communication conditions, which in turn improves the rationality of the wireless communication mode selection. Likewise, setting the communication quality evaluation parameters of the first wireless communication module and the second wireless communication module according to this scheme improves the accuracy of those parameters and how faithfully they reflect the real communication quality.
In one embodiment of the present invention, the first adjustment module includes:
The communication connection module is used for controlling the second wireless communication module and the control center platform to establish temporary data communication connection in a wireless communication mode corresponding to the highest communication quality evaluation parameter so as to transmit disaster damage scene images;
The adjustment execution module is used for adjusting the wireless communication mode between the first wireless communication module and the control center platform after the data communication connection is completed between the second wireless communication module and the control center platform, adjusting the wireless communication mode to the wireless communication mode corresponding to the highest communication quality evaluation parameter of the second wireless communication module, and transmitting disaster damage scene images;
And the communication cut-off module is used for disconnecting the temporary data communication connection between the second wireless communication module and the control center platform after the transmission of the first disaster damage scene image is completed between the first wireless communication module and the control center platform.
The working principle of the technical scheme is as follows: firstly, controlling the second wireless communication module and the control center platform to establish temporary data communication connection in a wireless communication mode corresponding to the highest communication quality evaluation parameter through the communication connection module, and transmitting disaster damage scene images;
then, after the data communication connection is completed between the second wireless communication module and the control center platform by utilizing the adjustment execution module, the wireless communication mode between the first wireless communication module and the control center platform is adjusted to be the wireless communication mode corresponding to the highest communication quality evaluation parameter of the second wireless communication module, and disaster damage scene image transmission is carried out;
and finally, disconnecting the temporary data communication connection between the second wireless communication module and the control center platform after the transmission of the first disaster scene image is completed between the first wireless communication module and the control center platform through the communication cut-off module.
The technical scheme has the effects that: the temporary data communication connection is established through the wireless communication mode corresponding to the highest communication quality evaluation parameter of the second wireless communication module, so that the rapid transmission of disaster scene images can be ensured, and the data transmission efficiency is improved.
The wireless communication mode of the first wireless communication module is adjusted to be the wireless communication mode corresponding to the highest communication quality evaluation parameter of the second wireless communication module, so that the communication quality can be further improved by optimizing the communication mode, and stable data transmission is ensured.
The temporary data communication connection between the second wireless communication module and the control center platform is disconnected, so that occupied communication resources can be released, the communication resources can be used for other communication demands, and the resource utilization efficiency is improved.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.
Claims (6)
1. The disaster damage detection method based on the robot is characterized by comprising the following steps:
judging whether a disaster damage detection task is received in real time, and acquiring a disaster damage detection target and a target position through the disaster damage detection task after the disaster damage detection task is received;
the control robot goes to the disaster damage detection target position to acquire disaster damage scene image information of the disaster damage detection target, and sends the disaster damage scene image acquired in real time to a control center platform;
performing image data processing on the disaster damage scene image to obtain a damage result reflected by the disaster damage scene image;
judging the disaster damage range and counting the engineering quantity through the existing design/history information matching, and completing the disaster damage information statistical analysis;
the control robot goes to the disaster damage detection target position to collect disaster damage scene image information of the disaster damage detection target, and sends the disaster damage scene image obtained in real time to a control center platform, comprising:
The robot is controlled to go to the disaster damage detection target position to acquire disaster damage scene image information of the disaster damage detection target in real time, and a real-time disaster damage scene image is acquired;
the unmanned aerial vehicle sends the disaster damage scene image to a control center platform in real time in a wireless communication mode;
In the process that the unmanned aerial vehicle sends the disaster damage scene image to a control center platform in real time in a wireless communication mode, wireless communication strategies are adjusted by monitoring wireless transmission quality conditions in real time;
In the process that the unmanned aerial vehicle sends the disaster damage scene image to the control center platform in real time in a wireless communication mode, the wireless communication strategy is adjusted by monitoring the wireless transmission quality condition in real time, and the method comprises the following steps:
At the starting moment of the unmanned aerial vehicle executing task, the unmanned aerial vehicle is controlled to carry out wireless data connection and disaster scene image transmission with a control center platform by utilizing a first wireless communication module through one communication mode of a plurality of communication modes; wherein, the plurality of communication modes comprise a wifi communication mode and a 4G/5G communication mode;
Controlling a second wireless communication module of the unmanned aerial vehicle to detect communication parameters of the first wireless communication module in real time, and acquiring communication quality evaluation parameters of the first wireless communication module according to the communication parameters;
While the second wireless communication module detects the communication parameters of the first wireless communication module, the second wireless communication module is controlled to perform signal docking with the control center platform using the plurality of communication modes in turn, and the communication quality evaluation parameter of each communication mode is obtained;
when the duration for which the communication quality evaluation parameter corresponding to the current wireless communication mode of the first wireless communication module stays below the communication quality evaluation parameter corresponding to any one of the communication modes of the second wireless communication module exceeds a preset first time threshold, the wireless communication mode between the first wireless communication module and the control center platform is adjusted to the wireless communication mode corresponding to the highest communication quality evaluation parameter of the second wireless communication module;
When the duration for which the communication quality evaluation parameter corresponding to the adjusted wireless communication mode of the first wireless communication module stays below the communication quality evaluation parameter corresponding to any one of the communication modes of the second wireless communication module exceeds a preset second time threshold, the functional roles of the first wireless communication module and the second wireless communication module are exchanged;
The constraint conditions set between the first time threshold and the second time threshold are as follows:
Wherein T_1 and T_2 represent the first time threshold and the second time threshold, respectively; T_s1 represents the actual average data transmission delay corresponding to the wireless communication mode of the first wireless communication module before adjustment; T_s2 represents the actual average data transmission delay corresponding to the wireless communication mode of the second wireless communication module before adjustment; k represents a preset adjustment coefficient, and the value range of k is 2.1-2.3;
The communication quality evaluation parameters corresponding to the first wireless communication module and the second wireless communication module are obtained through the following formula:
Wherein E represents the communication quality evaluation parameter corresponding to the first wireless communication module or the second wireless communication module; E_0 represents a preset reference constant; n represents the number of unit times contained in the current wireless communication duration of the unmanned aerial vehicle, each unit time being 1 s; V_di denotes the data download speed of the wireless communication in the i-th unit time; V_si denotes the data upload speed of the wireless communication in the i-th unit time; V_d0 and V_s0 respectively denote the lower limit of the data download speed and the lower limit of the data upload speed corresponding to the optimal data communication state; ΔV denotes the preset rated speed difference between data upload and data download; abs denotes the absolute value function.
2. The method for detecting damage according to claim 1, wherein controlling the robot to travel to the damage detection target position to collect damage scene image information of the damage detection target in real time and obtain a real-time damage scene image comprises:
Carrying out spatial coordinate alignment of the disaster site by utilizing a plurality of sensors carried by the unmanned aerial vehicle, and carrying out joint calibration of the camera, the laser radar and the IMU at the disaster site;
providing an initial pose through the IMU data, and correcting the distortion of the raw laser radar data;
The laser radar performs initial pose transformation between key frames by using the IMU data pre-integration result, and obtains pose transformation coordinate parameters from the point cloud feature registration result;
providing front-end vision by using a binocular/monocular camera, realizing image feature extraction and inter-frame feature matching, and outputting the result of matching the visual features with the laser point cloud features;
Providing inter-frame velocity estimation and global coordinate constraints through GNSS;
and constructing the following objective function from the obtained multi-sensor data according to the laser odometry constraint, the IMU constraint and the global coordinate constraint, optimizing the pose and completing the construction of the global coordinate 3D map:
wherein e_ij is defined uniformly as:
e_ij(x) = f_ij(x) - z_ij (2)
f_ij(x) is the predicted value of the laser radar, IMU and GNSS observation equations from key-frame pose x_i to x_j, and z_ij is the matching measurement between the key frames.
3. The disaster detection method according to claim 1, wherein the adjusting the wireless communication mode between the first wireless communication module and the control center platform to the wireless communication mode corresponding to the highest communication quality evaluation parameter of the second wireless communication module comprises:
controlling the second wireless communication module and the control center platform to establish temporary data communication connection in a wireless communication mode corresponding to the highest communication quality evaluation parameter, and transmitting disaster scene images;
After the data communication connection is completed between the second wireless communication module and the control center platform, the wireless communication mode between the first wireless communication module and the control center platform is adjusted to be the wireless communication mode corresponding to the highest communication quality evaluation parameter of the second wireless communication module, and disaster scene image transmission is carried out;
And after the transmission of the first disaster scene image is completed between the first wireless communication module and the control center platform, disconnecting the temporary data communication connection between the second wireless communication module and the control center platform.
4. The method for detecting disaster damage according to claim 1, wherein said determining disaster damage range and calculating engineering quantity by existing design/history information matching, and completing disaster damage information statistical analysis, comprises:
the global 3D map, the image and the point cloud feature matching information transmitted by the information transmission module are accessed through the interface unit;
Importing the horizontal and vertical alignment curves of the route and the section mileage into the global 3D map, and calculating the normal plane of the route center line at that mileage according to the following formula
A(x - x_i) + B(y - y_i) + C(z - h_i) = 0 (4)
Wherein: x_i, y_i, h_i are the plane coordinates and elevation of the route center line at the mileage point; A, B and C are plane parameters, obtained from the vector n = α × β = (A, B, C), where α is the tangent direction of the horizontal curve at the mileage point and β is the vertical tangent direction at the mileage point;
Setting a slice thickness, searching for the points whose distance to the normal plane is smaller than 1/2 of the slice thickness and projecting them onto the normal plane, taking the center point of the route at the mileage section as the polar coordinate origin, calculating and recording the polar angle of each point projected onto the normal plane, and connecting the projected points with straight lines in increasing order of polar angle to obtain the normal plane section shape;
comparing the designed cross-section with the normal plane section shape to obtain the cut or fill area;
obtaining the fill volume by multiplying the average of the fill areas of adjacent mileage sections by the mileage length between them;
and carrying out segmented statistics on the earthwork volume, identifying the disaster damage type from the images and the point cloud feature matching information between adjacent mileage sections, and carrying out classified statistics by route mileage to complete the disaster damage report.
5. A robot-based damage detection system, the damage detection system comprising:
The target acquisition module is used for judging whether a disaster damage detection task is received or not in real time, and acquiring a disaster damage detection target and a target position through the disaster damage detection task after the disaster damage detection task is received;
The data acquisition and mapping module is used for controlling the robot to go to the disaster damage detection target position to acquire disaster damage scene image information of the disaster damage detection target and transmitting the disaster damage scene image acquired in real time to the control center platform;
the result acquisition module is used for carrying out image data processing on the disaster damage scene image and acquiring a damage result reflected by the disaster damage scene image;
The statistics judging module is used for judging the disaster damage range and counting the engineering quantity through the existing design/history information matching, and completing the disaster damage information statistics analysis;
The data acquisition and mapping module comprises:
the image acquisition module is used for controlling the robot to go to the disaster damage detection target position to acquire disaster damage scene image information of the disaster damage detection target in real time and acquiring a real-time disaster damage scene image;
the data transmission module is used for transmitting the disaster damage scene image to the control center platform in real time by the unmanned aerial vehicle in a wireless communication mode;
the strategy adjustment module is used for adjusting the wireless communication strategy by monitoring the wireless transmission quality condition in real time in the process that the unmanned aerial vehicle sends the disaster scene image to the control center platform in real time in a wireless communication mode;
the policy adjustment module comprises:
The first communication data acquisition and mapping module is used for controlling the unmanned aerial vehicle to carry out wireless data connection and disaster scene image transmission with the control center platform by utilizing one of a plurality of communication modes through the first wireless communication module at the starting moment of the unmanned aerial vehicle executing task; wherein, the plurality of communication modes comprise a wifi communication mode and a 4G/5G communication mode;
The first evaluation module is used for controlling a second wireless communication module of the unmanned aerial vehicle to detect communication parameters of the first wireless communication module in real time and acquiring communication quality evaluation parameters of the first wireless communication module according to the communication parameters;
The second evaluation module is used for controlling the second wireless communication module to perform signal docking with the control center platform using the plurality of communication modes in turn while the second wireless communication module detects the communication parameters of the first wireless communication module, and for obtaining the communication quality evaluation parameter of each communication mode;
The first adjustment module is configured to adjust the wireless communication mode between the first wireless communication module and the control center platform to the wireless communication mode corresponding to the highest communication quality evaluation parameter of the second wireless communication module when the duration for which the communication quality evaluation parameter of the current wireless communication mode of the first wireless communication module stays below the communication quality evaluation parameter of any communication mode of the second wireless communication module exceeds a preset first time threshold;
The second adjustment module is used for exchanging the functional roles of the first wireless communication module and the second wireless communication module when the duration for which the communication quality evaluation parameter of the adjusted wireless communication mode of the first wireless communication module stays below the communication quality evaluation parameter of any communication mode of the second wireless communication module exceeds a preset second time threshold;
The constraint conditions set between the first time threshold and the second time threshold are as follows:
Wherein T_1 and T_2 represent the first time threshold and the second time threshold, respectively; T_s1 represents the actual average data transmission delay corresponding to the wireless communication mode of the first wireless communication module before adjustment; T_s2 represents the actual average data transmission delay corresponding to the wireless communication mode of the second wireless communication module before adjustment; k represents a preset adjustment coefficient, and the value range of k is 2.1-2.3;
The communication quality evaluation parameters corresponding to the first wireless communication module and the second wireless communication module are obtained through the following formula:
Wherein E represents the communication quality evaluation parameter corresponding to the first wireless communication module or the second wireless communication module; E_0 represents a preset reference constant; n represents the number of unit times contained in the current wireless communication duration of the unmanned aerial vehicle, each unit time being 1 s; V_di denotes the data download speed of the wireless communication in the i-th unit time; V_si denotes the data upload speed of the wireless communication in the i-th unit time; V_d0 and V_s0 respectively denote the lower limit of the data download speed and the lower limit of the data upload speed corresponding to the optimal data communication state; ΔV denotes the preset rated speed difference between data upload and data download; abs denotes the absolute value function.
6. The disaster detection system of claim 5, wherein the first adjustment module comprises:
The communication connection module is used for controlling the second wireless communication module and the control center platform to establish temporary data communication connection in a wireless communication mode corresponding to the highest communication quality evaluation parameter so as to transmit disaster damage scene images;
The adjustment execution module is used for adjusting the wireless communication mode between the first wireless communication module and the control center platform after the data communication connection is completed between the second wireless communication module and the control center platform, adjusting the wireless communication mode to the wireless communication mode corresponding to the highest communication quality evaluation parameter of the second wireless communication module, and transmitting disaster damage scene images;
And the communication cut-off module is used for disconnecting the temporary data communication connection between the second wireless communication module and the control center platform after the transmission of the first disaster damage scene image is completed between the first wireless communication module and the control center platform.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311616497.5A CN117636251B (en) | 2023-11-30 | 2023-11-30 | Disaster damage detection method and system based on robot |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117636251A CN117636251A (en) | 2024-03-01 |
CN117636251B true CN117636251B (en) | 2024-05-17 |
Family
ID=90021045
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109151751A (en) * | 2018-07-23 | 2019-01-04 | 上海华测导航技术股份有限公司 | A kind of disaster detection management method, apparatus, equipment and system |
CN109491383A (en) * | 2018-11-06 | 2019-03-19 | 上海应用技术大学 | Multirobot positions and builds drawing system and method |
CN111459166A (en) * | 2020-04-22 | 2020-07-28 | 北京工业大学 | Scene map construction method containing position information of trapped people in post-disaster rescue environment |
WO2022036980A1 (en) * | 2020-08-17 | 2022-02-24 | 浙江商汤科技开发有限公司 | Pose determination method and apparatus, electronic device, storage medium, and program |
CN114779698A (en) * | 2022-04-27 | 2022-07-22 | 招商局重庆交通科研设计院有限公司 | Long and large tunnel fire heterogeneous modular robot control system and method |
CN115345945A (en) * | 2022-08-10 | 2022-11-15 | 上海托旺数据科技有限公司 | Automatic inspection method and system for reconstructing expressway by using multi-view vision of unmanned aerial vehicle |
CN116359969A (en) * | 2023-04-17 | 2023-06-30 | 厦门四信物联网科技有限公司 | GNSS ground disaster monitoring device and system based on IMU dynamic self-adaption |
Non-Patent Citations (4)
Title |
---|
Multi-Agents System for Early Disaster Detection, Evacuation and Rescuing;Jawad Abusalama;2020 Advances in Science and Engineering Technology International Conferences (ASET);20200416;full text *
Emergency communication method for nuclear radiation monitoring based on BeiDou RDSS;Wang Tingyin;Lin Minggui;Chen Da;Wu Yunping;Computer Systems & Applications;20191215(No. 12);full text *
Research on a rapid post-disaster assessment system based on UAV images;Zhang Wei;Shi Zhan;Transportation Science & Technology;20191215(No. 06);full text *
Design of a mine rescue robot swarm;Wen Hu;Industry and Mine Automation;20191231;full text *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||