CN111070180B - Post-disaster rescue channel detection robot based on ROS - Google Patents

Post-disaster rescue channel detection robot based on ROS

Info

Publication number
CN111070180B
CN111070180B (application CN201911389188.2A, publication CN 111070180 B)
Authority
CN
China
Prior art keywords
coordinate system
robot
disaster
image
camera
Prior art date
Legal status
Active
Application number
CN201911389188.2A
Other languages
Chinese (zh)
Other versions
CN111070180A (en)
Inventor
尹传忠
李志斌
李柯男
涂贵辉
凌梓钦
王瀛帜
张祥栋
Current Assignee
Shanghai Maritime University
Original Assignee
Shanghai Maritime University
Priority date
Filing date
Publication date
Application filed by Shanghai Maritime University
Priority to CN201911389188.2A
Publication of CN111070180A
Application granted
Publication of CN111070180B


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
        • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
            • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
                • B25J5/00 Manipulators mounted on wheels or on carriages
                    • B25J5/005 mounted on endless tracks or belts
                    • B25J5/007 mounted on wheels
                • B25J9/00 Programme-controlled manipulators
                    • B25J9/16 Programme controls
                        • B25J9/1656 characterised by programming, planning systems for manipulators
                            • B25J9/1664 characterised by motion, path, trajectory planning
                            • B25J9/1666 Avoiding collision or forbidden zones
                        • B25J9/1674 characterised by safety, monitoring, diagnostic
                            • B25J9/1676 Avoiding collision or forbidden zones
                        • B25J9/1694 characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
                            • B25J9/1697 Vision controlled systems
                • B25J13/00 Controls for manipulators
                    • B25J13/08 by means of sensing devices, e.g. viewing or touching devices
                        • B25J13/087 for sensing other physical parameters, e.g. electrical or chemical properties
                • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
                    • B25J19/02 Sensing devices
                        • B25J19/021 Optical sensing devices
                            • B25J19/023 including video camera means
    • G PHYSICS
        • G01 MEASURING; TESTING
            • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
                • G01D21/00 Measuring or testing not otherwise provided for
                    • G01D21/02 Measuring two or more variables by means not covered by a single other subclass
            • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
                • G01N1/00 Sampling; Preparing specimens for investigation
                    • G01N1/02 Devices for withdrawing samples
                        • G01N1/22 in the gaseous state
                            • G01N1/24 Suction devices
                • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
                    • G01N21/17 Systems in which incident light is modified in accordance with the properties of the material investigated
                        • G01N21/25 Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
                            • G01N21/31 Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
                                • G01N21/35 using infrared light
                • G01N27/00 Investigating or analysing materials by the use of electric, electrochemical, or magnetic means
                    • G01N27/26 by investigating electrochemical variables; by using electrolysis or electrophoresis
                • G01N33/00 Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
                    • G01N33/0004 Gaseous mixtures, e.g. polluted air
            • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
                • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
                    • G01S17/88 Lidar systems specially adapted for specific applications
                        • G01S17/89 for mapping or imaging

Abstract

The invention relates to a post-disaster rescue channel detection robot based on ROS. The robot comprises a robot body, a remote computer terminal connected with the ROS system, and a base station positioning system, a terrain detection system, an image recognition system and a gas environment detection system that are mounted on the robot and connected with the ROS system. The robot also comprises a damping device and front crawler wheels and rear wheels for driving it to move. The robot body is divided into an upper layer, a middle layer and a lower layer, and is powered by lithium batteries. The ROS system is provided with a driving device, through which the front crawler wheels and rear wheels are driven. The invention can analyze the air composition of the environment in real time and return images and information, and has the characteristics of strong climbing capability, high stability, agile action, easy control and comprehensive disaster relief function.

Description

Post-disaster rescue channel detection robot based on ROS
Technical Field
The invention belongs to the field of information technology, relates to technologies such as data preprocessing and image processing algorithms, and applies a safe-path detection method for a hazardous chemical plant disaster relief robot based on a wireless UWB ad hoc network, a fire detection method for a hazardous chemical plant disaster relief robot based on big data, and a hazardous chemical gas leakage concentration detection method based on multi-source information fusion. The detection robot goes deep into the disaster site, collects disaster area information, samples the components of toxic and harmful gases in the detection area and explores a safe channel, reducing unnecessary secondary casualties, protecting personal and property safety to the maximum extent, and supporting a scientific disaster relief scheme. The detection robot deeply integrates modern information technologies such as cloud computing, artificial intelligence and the Internet of Things, thereby comprehensively improving the reliability of its detection results in the post-disaster environment.
Background
With the development of the chemical industry, the safety management and technical standards of dangerous chemical logistics and storage have improved to a great extent, but safety accidents still occur. When facilities such as chemical plants and dangerous chemical logistics warehouses suffer fires, explosions and other accidents, besides the direct losses, major secondary casualties are often caused during post-disaster rescue. Owing to the possibility of secondary disasters and the influence of factors such as high temperature, smoke and harmful gases, rescue workers find it difficult to judge accurately whether they can enter the disaster site to execute rescue tasks, and rescue opportunities may be delayed. In order to guarantee the safety of rescue workers and avoid committing large amounts of manpower and material resources to a chemical area, when an accident occurs in a hazardous chemical plant a disaster relief robot is sent into the accident site in advance to detect the disaster environment, ensuring that rescue workers can carry out rescue work safely.
The conventional fire detection technology is mainly the temperature-sensing monitoring method: a temperature sensor carried by the chemical-plant disaster relief robot monitors the hot air flow generated by a fire, i.e. the temperature and the rate of temperature rise at a certain point or along a certain line, thereby monitoring the fire. The method has high accuracy and strong real-time performance, but requires a large number of monitoring points and has certain limitations for detecting fires in a chemical plant.
The traditional dangerous chemical gas concentration detection technology is a heat conduction type monitoring method. The concentration of the hazardous chemical gas is detected by a heat conduction type sensor carried by the disaster relief robot by utilizing the difference between the air heat conductivity and the heat conductivity of various gases and combining the relationship between the gas concentration and the heat conductivity. When the concentration of the measured gas is high, the difference between the thermal conductivity of the measured gas and the thermal conductivity of air is large, and the accuracy of the measurement result is high. When the concentration of the measured gas is low, the thermal conductivity of the measured gas is close to that of air, so that the output signal of the sensor is weak, and the sensitivity and the resolution are low. Therefore, the thermal conductivity type monitoring method is not suitable for monitoring the leakage of low-concentration hazardous chemical gas.
The above methods have the following disadvantage: the traditional disaster relief robot does not carry out comprehensive integrated analysis of multivariate information, but merely performs simplified detection of the disaster area, so the false alarm rate is high. It is therefore necessary to develop a patrol robot for dangerous environments that can analyze the environmental air composition in real time, return images and information, operate reliably under remote control, and carry ample power, with strong climbing capability, high stability, agile action, easy control and comprehensive disaster relief functions.
ROS (Robot Operating System) is an open-source meta operating system suitable for robots. It provides the services an operating system should have, including hardware abstraction, low-level device control, implementation of common functions, inter-process message passing and package management. It also provides the tools and library functions needed to obtain, compile, write and run code across computers. The main goal of ROS is to support code reuse in robot research and development. ROS is a distributed framework of processes (i.e. "nodes") that are encapsulated in packages of programs and functions which are easily shared and released. ROS also supports a federated system similar to a code repository, which enables engineering collaboration and release. This design allows every decision from the file system to the user interface to be developed and implemented completely independently (not restricted by ROS), while all the projects can still be integrated by the basic tools of ROS.
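The node-and-topic message passing that ROS provides can be illustrated with a minimal, self-contained sketch. This is plain Python mimicking the publish/subscribe pattern, not the actual rospy/rclpy API; the class names and the /gas/ppm topic are illustrative assumptions:

```python
from collections import defaultdict

class Topic:
    """Tiny stand-in for a ROS topic: a named channel with many subscribers."""
    def __init__(self):
        self.subscribers = []

class Node:
    """Minimal stand-in for a ROS node that communicates only via topics."""
    registry = defaultdict(Topic)  # topic name -> Topic (plays the master's role)

    def __init__(self, name):
        self.name = name

    def subscribe(self, topic, callback):
        Node.registry[topic].subscribers.append(callback)

    def publish(self, topic, msg):
        # Deliver the message to every callback registered on this topic.
        for cb in Node.registry[topic].subscribers:
            cb(msg)

# A sensor node publishes gas readings; a monitor node consumes them.
received = []
monitor = Node("gas_monitor")
monitor.subscribe("/gas/ppm", received.append)

sensor = Node("mq135_driver")
sensor.publish("/gas/ppm", 412.0)
sensor.publish("/gas/ppm", 850.0)
print(received)  # [412.0, 850.0]
```

In a real ROS deployment the registry's role is played by the ROS master (or DDS discovery in ROS 2) and messages are typed, but the structure of the interaction between nodes is the same.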
Disclosure of Invention
The invention is mainly developed on the ROS system and consists of an intelligent carrier system, a base station positioning system, a terrain detection system, an image recognition system and a gas environment detection system. Overall, a centralized-management, decentralized-control mode is adopted: all modules are mutually independent, reducing control interference between signals, while at the same time the modules are organically combined, and the various data are analyzed and processed by means of the strong processing capability of the Raspberry Pi, STM32 and Arduino controllers. High-performance sensors ensure that the detection robot can still work normally in the severe post-disaster environment. While data are collected in real time, the raw data are analyzed, processed and quickly transmitted back, providing a reliable data basis for disaster big-data analysis.
The technical scheme of the invention is as follows:
a post-disaster rescue channel detection robot based on an ROS system comprises a robot body, a remote computer terminal connected with the ROS system, a base station positioning system, a terrain detection system, an image recognition system, a gas environment detection system, a damping device, a front crawler wheel and a rear wheel, wherein the base station positioning system, the terrain detection system, the image recognition system, the gas environment detection system, the damping device and the front crawler wheel and the rear wheel are arranged on the robot and connected with the ROS system; the method is characterized in that: the gas environment detection system comprises a toxic gas detection module and a temperature and humidity detection module; the robot body is divided into an upper layer, a middle layer and a lower layer, wherein the middle layer and the upper layer are stainless steel plates and are powered by lithium batteries; the front wheel driving the robot to move is a hard crawler wheel, and the rear wheel is a steel chain wrapping a steel inner wheel; the ROS system is provided with a driving device, and the hard crawler wheel and the steel chain are wrapped on the steel inner wheel to move through the driving device.
The damping device is positioned at the lower layer of the robot body and consists of a middle-layer supporting plate, a central bearing, a central damping shaft, a motor damping plate and front and rear damping assemblies; the middle-layer supporting plate is cut from a novel heat-resistant material 5 mm thick; the central bearing consists of four supporting shafts and a novel heat-resistant high-strength material 5 mm thick; the central damping shaft consists of a steel pipe and a spring sleeve; the motor damping plate is cut from a novel heat-resistant material 5 mm thick; the front and rear damping assemblies are made of steel springs and connect the motor damping plate with the middle supporting plate; two damping springs are arranged on each of the left and right sides of the central damping shaft, and two hard steel pipes are arranged below each of the front and rear of the motor damping plate; the front end of the front and rear damping assembly adopts double springs, and the rear end a single spring.
The gas environment detection system also comprises a polyester gas recovery module.
The toxic gas detection module is an MQ-135 gas sensor, the temperature and humidity detection module is an SHT3x sensor, and the polyester gas recovery module is a 365 direct-current micro diaphragm pump.
The base station positioning system is a UWB base station positioning system, and the terrain detection system is a SLAM terrain detection system.
The method of using the ROS-based post-disaster rescue channel detection robot for a hazardous chemical plant comprises the following steps:
(1) a fire detection method for a post-disaster rescue channel detection robot of a hazardous chemical factory based on big data is disclosed. Aiming at disaster information related to fire, such as temperature, humidity, gas concentration, smell, wind speed and the like in the actual environment, a big data technology is adopted to carry out rapid analysis and processing on the information, so that the dangerous chemical fire is detected;
(2) a hazardous chemical gas concentration detection method for the post-disaster rescue channel detection robot based on multi-information fusion. The disaster relief robot platform carries a heat-conduction sensor, a carrier thermal catalysis sensor, an electrochemical sensor and an infrared sensor, achieving multi-information acquisition; the data with the largest change between successive readings are screened out from the data acquired by the sensors and fusion calculation is carried out on them, thereby obtaining an accurate monitoring result for the hazardous chemical gas leakage concentration.
(3) The Dijkstra global path planning algorithm is replaced by the A* algorithm, enhancing the positioning, navigation and autonomous obstacle avoidance capabilities of the post-disaster rescue channel detection robot. In a semi-structured environment, an environment map of the hazardous chemical plant's storage area is established in advance, the shortest path is taken as the optimization target, a grid map is adopted to describe the environment model, and the A* method is used for path planning.
(4) Real-time state estimation is generated from the IMU and laser radar carried by the robot, completing its localization. The map is incrementally constructed while positioning, providing a basis for the subsequent path planning. A path is then planned so that the machine can quickly find an optimal path on the established map and avoid dynamic obstacles in real time. The algorithm mainly uses FastSLAM, which is based on particle filtering, a filtering algorithm combining Monte Carlo localization with Bayesian estimation: each particle is regarded as one estimate of the current state, and the n sample particles are updated and resampled at every moment, guaranteeing that the robot's current pose is updated in real time. The laser radar is combined with UWB positioning inside the hazardous chemical plant to mark barriers, obstacle points and disaster areas.
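The particle-filter idea behind FastSLAM in step (4), each particle as one hypothesis of the robot state, updated and resampled at every moment, can be sketched in one dimension. This is a generic illustration under simplified assumptions (1-D pose, Gaussian motion and measurement noise), not the patent's implementation:

```python
import math
import random

def particle_filter_step(particles, control, measurement, sigma=0.5):
    """One predict / weight / resample cycle for a 1-D pose estimate."""
    # Predict: apply the motion command plus per-particle motion noise.
    moved = [p + control + random.gauss(0.0, sigma) for p in particles]
    # Weight: particles consistent with the range measurement score higher.
    weights = [math.exp(-0.5 * ((measurement - p) / sigma) ** 2) for p in moved]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample: draw a new particle set in proportion to the weights.
    return random.choices(moved, weights=weights, k=len(particles))

random.seed(42)
particles = [random.uniform(0.0, 10.0) for _ in range(500)]
true_pose = 2.0
for _ in range(10):
    true_pose += 1.0                        # the robot advances 1 m per step
    z = true_pose + random.gauss(0.0, 0.3)  # simulated noisy range measurement
    particles = particle_filter_step(particles, 1.0, z)

estimate = sum(particles) / len(particles)
print(round(estimate, 2))
```

FastSLAM additionally attaches a per-particle map (a set of landmark estimates) to each pose hypothesis; the predict/weight/resample cycle shown here is the core of the filter.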
The invention provides a safe, reliable and quick rescue scheme for emergency rescue. And carrying out visual feedback and multi-data acquisition and analysis on the hazardous chemical substance relief area. The comprehensive solution is adopted for adapting to changeable disaster-stricken environments, and a plurality of advanced technologies are fused, so that a plurality of analysis indexes are provided for disaster relief and rescue in various environments.
Drawings
FIG. 1 is a schematic structural diagram of a detection robot;
fig. 2 is a flow chart of control and scheduling of the detection robot system.
Fig. 3 is a diagram of an appearance structure model of the inspection robot.
In the figures: 1. hard crawler wheel; 2. steel chain wrapping the steel inner wheel; 3. central damping shaft; 4. motor damping plate; 5. UWB base station; 6. 365 DC micro diaphragm pump; 7. fisheye camera; 8. laser radar; 9. binocular camera.
Detailed Description
The present invention will be described in further detail with reference to the following examples and accompanying drawings.
As shown in fig. 1, a post-disaster rescue channel detection robot based on an ROS system comprises a robot body, a remote computer terminal connected with the ROS system, a UWB base station 5 arranged on the robot and connected with the ROS system, a SLAM terrain detection system, a laser radar 8, a fisheye camera 7, a binocular camera 9, an MQ-135 gas sensor, an SHT3x sensor and a 365 direct current micro diaphragm pump 6; the robot is characterized by also comprising a damping device, a front wheel and a rear wheel, wherein the front wheel and the rear wheel are used for driving the robot to move, the front wheel is a hard crawler wheel 1, and the rear wheel is a steel chain wrapped on a steel inner wheel 2; the robot body is divided into an upper layer, a middle layer and a lower layer, wherein the middle layer and the upper layer are stainless steel plates and are powered by lithium batteries; the ROS system is provided with a driving device, and the hard crawler wheel 1 and the steel chain wrapped steel inner wheel 2 are driven to move by the driving device.
The damping device is positioned on the lower layer of the robot body and consists of a middle-layer supporting plate, a central bearing, a central damping shaft 3, a motor damping plate 4 and front and rear damping assemblies. The middle-layer supporting plate is cut from a novel heat-resistant material 5 mm thick; the central bearing consists of four supporting shafts and a 15 mm novel heat-resistant high-strength material; the central damping shaft 3 consists of a steel pipe and a spring sleeve; the motor damping plate 4 is cut from a novel heat-resistant material 5 mm thick; the front and rear damping assemblies are made of steel springs and connect the motor damping plate 4 with the middle-layer supporting plate. Two damping springs are arranged on each of the left and right sides of the central damping shaft 3, and two hard steel pipes are arranged below each of the front and rear of the motor damping plate 4; the front end of the front and rear damping assembly adopts a double-spring design, and the rear end a single-spring design.
Step 1: algorithm optimization
Firstly, the map building and positioning functions in an emergency environment are realized through a SLAM algorithm built on the OpenSLAM open-source project; then, on the basis of the navigation module of the ROS system, the Dijkstra global path planning algorithm is replaced by the A* algorithm, enhancing the robot's positioning, navigation and autonomous obstacle avoidance capabilities.
The basic idea of A* search is to evaluate a node by combining g(n), the cost already incurred to reach the node, and h(n), the estimated cost from the node to the target:

f(n) = g(n) + h(n)   (1)

Since g(n) gives the cost of the path from the start node to node n, and h(n) gives the estimated cost of the lowest-cost path from node n to the goal node, f(n) is the estimated cost of the lowest-cost solution passing through node n. Therefore, to find the lowest-cost solution it is reasonable to first expand the node with the smallest g(n) + h(n) value. Moreover, provided the heuristic function h(n) satisfies certain conditions (it never overestimates the true remaining cost), A* search is both complete and optimal.
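The evaluation function above can be turned into a working planner. The sketch below is a generic A* on a 4-connected occupancy grid with a Manhattan-distance heuristic (admissible for unit-cost moves); the grid and coordinates are illustrative, not taken from the patent:

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid, minimizing f(n) = g(n) + h(n)."""
    rows, cols = len(grid), len(grid[0])
    h = lambda n: abs(n[0] - goal[0]) + abs(n[1] - goal[1])  # Manhattan heuristic
    open_heap = [(h(start), 0, start, [start])]              # (f, g, node, path)
    best_g = {start: 0}
    while open_heap:
        f, g, node, path = heapq.heappop(open_heap)
        if node == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = node[0] + dr, node[1] + dc
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] == 0:
                ng = g + 1
                if ng < best_g.get((r, c), float("inf")):
                    best_g[(r, c)] = ng
                    heapq.heappush(open_heap,
                                   (ng + h((r, c)), ng, (r, c), path + [(r, c)]))
    return None  # goal unreachable

# 0 = free cell, 1 = obstacle (e.g. rubble blocking the rescue channel)
grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 0],
        [0, 1, 1, 1],
        [0, 0, 0, 0]]
path = astar(grid, (0, 0), (4, 3))
print(len(path) - 1)  # number of moves on the shortest path: 13
```

Because the heuristic never overestimates the true remaining cost, the first time the goal is popped from the queue the returned path is optimal, which is the completeness and optimality condition mentioned above.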
Step 2: binocular vision recognition embarkation
The onboard binocular camera analyzes the disaster relief environment and, based on machine learning, identifies the personnel and fire-fighting equipment at the disaster relief site and analyzes the safety coefficient of the disaster relief channel. Internal personnel, fire conditions, safety channel signs and the positions of fire hydrants are identified and located, providing a reference basis for subsequent rescue.
(1) Coordinate system conversion
During the imaging process of the camera there are 4 coordinate systems: the pixel coordinate system, the image coordinate system, the camera coordinate system and the world coordinate system. The pixel coordinate system takes the upper-left corner of the image as its origin, the horizontal rows of the image matrix as the u-axis and the vertical columns as the v-axis. Correspondingly, the image coordinate system takes the center of the image matrix as its origin, with its y-axis parallel to the v-axis and its x-axis parallel to the u-axis of the pixel coordinate system. The constants converting between the two coordinate systems are the reciprocals of the unit pixel sizes, $1/d_x$ and $1/d_y$, together with the offsets $u_0, v_0$:

$$u = \frac{x}{d_x} + u_0, \qquad v = \frac{y}{d_y} + v_0$$

In homogeneous form, the matrix relationship between the image coordinate system and the pixel coordinate system can be expressed as

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} 1/d_x & 0 & u_0 \\ 0 & 1/d_y & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}$$
Now let the optical axis of the camera be the z-axis; it intersects the image plane at the image-coordinate origin $O$, and the camera coordinate system has its origin at the optical center $O_c$. With focal length $f$, the matrix relationship between the camera coordinate system and the image coordinate system can be expressed as

$$z_c \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix}$$
The matrix relation mapping the world coordinate system to the camera coordinate system is

$$\begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix} = \begin{bmatrix} R & T \\ 0^{\mathsf T} & 1 \end{bmatrix} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix}$$

where $R$ is the $3 \times 3$ rotation matrix and $T$ the $3 \times 1$ translation vector.
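The three transformations described in this subsection (world to camera, camera to image plane, image plane to pixel) can be chained into a single projection function. The sketch below uses illustrative parameters (identity rotation, a 2 m translation along the optical axis, an 8 mm lens and 10 micrometre square pixels), none of which come from the patent:

```python
def project(point_w, R, T, f, dx, dy, u0, v0):
    """Chain world -> camera (R, T), camera -> image plane (focal length f),
    image plane -> pixel (pixel sizes dx, dy; principal point u0, v0)."""
    # World to camera: Pc = R * Pw + T
    xc, yc, zc = (sum(R[i][j] * point_w[j] for j in range(3)) + T[i]
                  for i in range(3))
    # Camera to image plane (perspective division): x = f*xc/zc, y = f*yc/zc
    x, y = f * xc / zc, f * yc / zc
    # Image plane to pixel: u = x/dx + u0, v = y/dy + v0
    return x / dx + u0, y / dy + v0

# Illustrative intrinsics/extrinsics (not from the patent): identity rotation,
# camera 2 m from the world origin along the optical axis, 640x480 sensor.
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
T = [0.0, 0.0, 2.0]
u, v = project([0.1, 0.05, 0.0], R, T,
               f=0.008, dx=1e-5, dy=1e-5, u0=320.0, v0=240.0)
print(round(u), round(v))  # 360 260: right of and below the principal point
```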
(2) Depth matrix calculation
Through the coordinate transformations above, the pixel coordinate system is mapped to the world coordinate system. Here 2 cameras are used with their optical axes arranged in parallel, separated by the baseline $B$; the relationship between the pixel coordinates of the two cameras was shown in a figure accompanying the original. A scene point images at $x_L$ in the left camera and at $x_R$ in the right camera; letting the disparity be $D = x_L + x_R$ (the two image x-coordinates taken with the sign convention of that figure), similar triangles give

$$z = \frac{fB}{D}$$

which calibrates the relationship between the images of the left and right cameras and the camera coordinate system. The corresponding z value in the camera coordinate system is then obtained by the matrix inversion above.
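Under the parallel-optical-axis assumption, the depth formula reduces to one line of code. The baseline, focal length and pixel coordinates below are illustrative assumptions, not the patent's calibration values:

```python
def stereo_depth(x_left, x_right, baseline, focal):
    """Depth from a rectified stereo pair: z = f * B / D, where the disparity
    D = x_left + x_right uses the text's sign convention (the two image
    x-axes measured toward each other)."""
    d = x_left + x_right
    if d <= 0:
        raise ValueError("non-positive disparity: point at or beyond infinity")
    return focal * baseline / d

# Illustrative numbers: 12 cm baseline, 700 px focal length, a point seen
# 35 px from each optical axis.
z = stereo_depth(35.0, 35.0, baseline=0.12, focal=700.0)
print(z)  # 1.2 (metres)
```

Note the practical consequence visible in the formula: depth resolution degrades as disparity shrinks, so distant points are measured far less precisely than near ones.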
(3) Acquiring the depth-of-field image
After the relationship between the pixel coordinate system and the camera coordinate system is obtained, only the distance $B$ between the optical axes of the two cameras, the focal length $f$, and the corresponding pixel coordinates $(x_L, y)$ and $(x_R, y)$ in the left and right cameras are needed for each point to obtain its depth $z$ in the camera coordinate system.
The optical axis distance and the focal length can be obtained through physical measurement. However, the data cannot simply be fed into OpenCV for binocular vision measurement, because camera mounting errors and lens distortion mean the optical axes of the left and right cameras are not exactly parallel; computing depth of field directly in OpenCV would then give erroneous results. The cameras are therefore calibrated first: calibration produces a rotation matrix and a translation matrix, and the images are rectified by this rotation and translation until they match images produced by parallel optical axes.
Stereo calibration with the Bouguet toolbox in MATLAB generates the correct rotation and translation matrices. The image matrices can then be converted directly into a depth-map matrix by the createStereoBM block matcher in OpenCV, giving the recognition result.
Step 3: Gas collection and detection
The 365 direct-current micro diaphragm pump is adopted for gas collection. This pump has a wide application range (household appliances, medical devices, model DIY, aquarium equipment, etc.), mainly providing water pumping and air circulation; its strong suction is used here to recover gas for further analysis. Gas is collected and detected with real-time return of the results, and rare gas samples are retained. The distribution of flammable, explosive and toxic gases in the disaster environment is thereby determined and, combined with onboard image processing and analysis, secondary disasters during the disaster relief process are prevented.
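As an illustration of how a reading from the MQ-135 gas sensor named above is typically digitized: MQ-series sensors are read through a load-resistor voltage divider, and the ratio of the sensor resistance Rs to its clean-air resistance R0 is looked up on the datasheet curve for each gas. The supply voltage, load resistor and R0 below are assumed example values, not calibration data from the patent:

```python
def mq_sensor_resistance(v_out, v_cc=5.0, r_load=10_000.0):
    """Sensor resistance from the load-resistor voltage divider:
    Rs = R_L * (Vcc - Vout) / Vout."""
    return r_load * (v_cc - v_out) / v_out

def gas_ratio(v_out, r0):
    """Rs/R0 ratio, which the MQ-135 datasheet curves map to concentration."""
    return mq_sensor_resistance(v_out) / r0

# Assumed example: clean-air calibration gave R0 = 26 kOhm; a 2.5 V reading
# means Rs equals the 10 kOhm load resistor exactly.
rs = mq_sensor_resistance(2.5)
ratio = gas_ratio(2.5, 26_000.0)
print(rs, round(ratio, 3))
```

The Rs/R0-to-ppm curve itself is gas-specific and must be taken from the MQ-135 datasheet or an in-situ calibration; this sketch only covers the electrical conversion.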

Claims (6)

1. A post-disaster rescue channel detection robot based on an ROS system comprises a robot body, a remote computer terminal connected with the ROS system, a base station positioning system, a terrain detection system, an image recognition system, a gas environment detection system, a damping device, a front crawler wheel and a rear wheel, wherein the base station positioning system, the terrain detection system, the image recognition system, the gas environment detection system, the damping device and the front crawler wheel and the rear wheel are arranged on the robot and connected with the ROS system; the method is characterized in that: the gas environment detection system comprises a toxic gas detection module and a temperature and humidity detection module; the robot body is divided into an upper layer, a middle layer and a lower layer, wherein the middle layer and the upper layer are stainless steel plates and are powered by lithium batteries; the front wheel driving the robot to move is a hard crawler wheel (1), and the rear wheel is a steel chain wrapping a steel inner wheel (2); the ROS system is provided with a driving device, and the hard crawler wheel (1) and the steel inner wheel (2) wrapped by the steel chain are driven to move by the driving device;
the damping device is located in the lower layer of the robot body and consists of a middle-layer support plate, a central bearing, a central damping shaft (3), a motor damping plate (4), and front and rear damping assemblies; the middle-layer support plate is cut from a 5 mm heat-resistant material; the central bearing consists of four support shafts and a 5 mm heat-resistant high-strength material; the central damping shaft (3) consists of a steel pipe and a spring sleeve; the motor damping plate (4) is cut from a 5 mm heat-resistant material; the front and rear damping assemblies are made of steel springs and connect the motor damping plate (4) with the middle-layer support plate; two damping springs are arranged on each of the left and right sides of the central damping shaft (3), at the front and rear of the motor damping plate (4), and two hard steel pipes are arranged below the motor damping plate; the front end of the front and rear damping assembly uses double springs and the rear end uses a single spring;
the use method of the post-disaster rescue channel detection robot based on the ROS system comprises the following steps:
step 1: optimization algorithm
Firstly, a map building and positioning function is realized in the emergency environment through a SLAM algorithm built on the OpenSLAM open-source project; then, on the basis of the navigation module of the ROS system, the Dijkstra global path planning algorithm is replaced by the A* algorithm, enhancing the positioning, navigation and autonomous obstacle avoidance capabilities of the robot;
the basic idea of a search is to evaluate a node by combining the cost of reaching the node and the cost of going from the node to the target:
f(n)=g(n)+h(n) (1)
it is reasonable to first try to find the smallest node for the value of g (n) + h (n);
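As a sketch of this idea (not code from the patent), a minimal grid-based A* search that always expands the node with the smallest f(n) = g(n) + h(n), using the Manhattan distance as an admissible h(n), might look like:

```python
import heapq
import itertools

def a_star(grid, start, goal):
    """Shortest 4-connected path on a 0/1 occupancy grid (1 = obstacle)."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan h(n)
    tie = itertools.count()                 # tie-breaker so heap never compares nodes
    open_heap = [(h(start), 0, next(tie), start, None)]      # (f, g, tie, node, parent)
    parents, g_best = {}, {start: 0}
    while open_heap:
        f, g, _, node, parent = heapq.heappop(open_heap)
        if node in parents:
            continue                        # already expanded with a better f
        parents[node] = parent
        if node == goal:                    # reconstruct path back to start
            path = []
            while node is not None:
                path.append(node)
                node = parents[node]
            return path[::-1]
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1                  # unit step cost g(n)
                if ng < g_best.get((nr, nc), float("inf")):
                    g_best[(nr, nc)] = ng
                    heapq.heappush(open_heap,
                                   (ng + h((nr, nc)), ng, next(tie), (nr, nc), node))
    return None                             # goal unreachable

# illustrative 3x3 grid with a wall forcing a detour
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = a_star(grid, (0, 0), (2, 0))
```

On this grid the direct route is blocked, so the optimal path detours through the single gap at (1, 2), costing 6 moves.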
step 2: binocular vision recognition embarkation
The on-board binocular camera analyzes the disaster relief environment; personnel and fire-fighting equipment on the disaster relief site are identified by machine-learning-based artificial intelligence, and the safety coefficient of the rescue channel is analyzed; internal personnel, fire conditions, safety exit signs and the positions of fire hydrants are identified and located, providing a reference basis for subsequent rescue;
1) coordinate system conversion
During the imaging process of the camera there are 4 coordinate systems: the pixel coordinate system, the image coordinate system, the camera coordinate system and the world coordinate system; the pixel coordinate system takes the upper-left corner of the image as its origin, the horizontal rows of the image matrix as the u-axis and the vertical columns as the v-axis; correspondingly, the image coordinate system takes the center of the image matrix as its origin, with its y-axis parallel to the v-axis and its x-axis parallel to the u-axis of the pixel coordinate system; the conversion between the two coordinate systems is determined by the inverses of the unit pixel sizes, 1/dx and 1/dy, and the offsets u0, v0; the matrix relationship between the image coordinate system and the pixel coordinate system can then be expressed as

[u; v; 1] = [1/dx, 0, u0; 0, 1/dy, v0; 0, 0, 1] [x; y; 1]    (2)
Introducing the scale factor s along the camera's optical axis, with the optical axis intersecting the image plane at the origin O and the camera coordinate system having its origin at the optical center Oc, the matrix relationship between the camera coordinate system and the image coordinate system can be expressed as

s [x; y; 1] = [f, 0, 0, 0; 0, f, 0, 0; 0, 0, 1, 0] [Xc; Yc; Zc; 1]    (3)
The matrix relation mapping the world coordinate system to the camera coordinate system is

[Xc; Yc; Zc; 1] = [R, T; 0, 1] [Xw; Yw; Zw; 1]    (4)

where R is the 3×3 rotation matrix and T is the 3×1 translation vector;
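The chain of mappings above (world → camera → image → pixel) can be sketched numerically. All intrinsic and extrinsic values below (f, 1/dx, 1/dy, u0, v0, R, T) are illustrative assumptions, not parameters from the patent:

```python
import numpy as np

f = 0.004                  # focal length in metres (illustrative)
inv_dx = inv_dy = 200000   # 1/dx, 1/dy: pixels per metre on the sensor (illustrative)
u0, v0 = 320.0, 240.0      # pixel-coordinate offset of the image centre
R = np.eye(3)              # rotation, world -> camera (identity for the sketch)
T = np.zeros((3, 1))       # translation, world -> camera

def world_to_pixel(Pw):
    """Map a world point through camera, image and pixel coordinates."""
    Pc = R @ np.asarray(Pw, float).reshape(3, 1) + T   # world -> camera, eq. (4)
    x = f * Pc[0, 0] / Pc[2, 0]                        # camera -> image, eq. (3)
    y = f * Pc[1, 0] / Pc[2, 0]
    u = inv_dx * x + u0                                # image -> pixel, eq. (2)
    v = inv_dy * y + v0
    return u, v

u, v = world_to_pixel([0.1, -0.05, 2.0])  # a point 2 m in front of the camera
```

With these made-up intrinsics the point projects to (u, v) = (360, 220), i.e. 40 pixels right of and 20 pixels above the image centre.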
2) Depth matrix computation
Through the above coordinate transformations, the pixel coordinate system is successfully mapped to the world coordinate system; 2 cameras are used, with their optical axes arranged in parallel;
with the two image planes coplanar and the optical axes separated by the baseline distance B, a point (Xc, Yc, Zc) projects into the left and right cameras as

xL = f·Xc/Zc,   xR = f·(Xc − B)/Zc    (5)

which calibrates the relation between the left and right camera images and the camera coordinate system; letting D = xL − xR (the disparity), the corresponding z value in the camera coordinate system is obtained by inverting this relation as z = f·B/D;
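Under the parallel-axis assumption the depth recovery reduces to one line. The baseline, focal length and pixel coordinates below are illustrative values only:

```python
def depth_from_disparity(x_left, x_right, baseline_m, focal_px):
    """Triangulate depth for parallel optical axes: z = f * B / D,
    with disparity D = xL - xR in pixels and baseline B in metres."""
    D = x_left - x_right
    if D <= 0:
        raise ValueError("corresponding point must have positive disparity")
    return focal_px * baseline_m / D

# illustrative numbers: 12 cm baseline, 700 px focal length, 20 px disparity
z = depth_from_disparity(x_left=400.0, x_right=380.0,
                         baseline_m=0.12, focal_px=700.0)
```

Here z = 700 × 0.12 / 20 = 4.2 m; note that depth falls off as 1/D, so distant points (small disparity) are measured less precisely.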
3) method for acquiring depth-of-field image
After the relationship between the pixel coordinate system and the camera coordinate system is obtained as above, confirming the distance B between the optical axes of the two cameras, the focal length f, and the pixel coordinates (xL, y) and (xR, y) of each corresponding point in the left and right cameras suffices to find the depth information z in the camera coordinate system;
the optical axis distance and the focal length can be obtained through physical measurement;
the cameras are calibrated, and the images are adjusted by rotation and translation so that they match images generated with parallel optical axes; stereo calibration is carried out with the Bouguet toolbox of Matlab to generate the correct rotation and translation matrices; the image matrices can then be converted directly into a depth map matrix through the createStereoBM function in OpenCV, yielding the recognition effect map;
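The patent relies on OpenCV's block matcher for this step. Purely as an illustration of the underlying principle (a toy sketch, not the OpenCV implementation), a sum-of-absolute-differences (SAD) matcher on a single rectified scanline might look like:

```python
import numpy as np

def sad_disparity_row(left, right, half=1, max_disp=8):
    """Toy per-scanline SAD block matching: for each pixel of the left
    row, find the horizontal shift d whose window best matches the right
    row, and return that shift as the disparity."""
    left = np.asarray(left, dtype=float)
    right = np.asarray(right, dtype=float)
    n = len(left)
    disp = np.zeros(n, dtype=int)
    for x in range(half, n - half):
        best_cost, best_d = float("inf"), 0
        for d in range(0, min(max_disp, x - half) + 1):
            window_l = left[x - half:x + half + 1]
            window_r = right[x - d - half:x - d + half + 1]
            cost = np.abs(window_l - window_r).sum()   # SAD matching cost
            if cost < best_cost:
                best_cost, best_d = cost, d
        disp[x] = best_d
    return disp

# a step edge that appears 3 pixels further right in the left image
left  = [0, 0, 0, 0, 0, 9, 9, 9, 9, 9, 9, 9]
right = [0, 0, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9]
disp = sad_disparity_row(left, right)
```

Around the edge the matcher recovers a disparity of 3 pixels; in textureless regions the cost is flat and the result is ambiguous, which is why real matchers such as StereoBM add texture and uniqueness checks.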
and step 3: gas collection and detection
A 365 direct-current micro diaphragm pump is adopted to transmit back and detect in real time to collect the rare gas; the distribution condition of flammable and explosive toxic gas under the disaster environment is determined, and the secondary disaster in the disaster relief process is prevented by combining internal image processing and analysis.
2. The post-disaster rescue channel detection robot based on the ROS system in claim 1, characterized in that: the gas environment detection system also comprises a polyester gas recovery module.
3. The post-disaster rescue channel detection robot based on the ROS system in claim 2, characterized in that: the toxic gas detection module is an MQ-135 gas sensor, the temperature and humidity detection module is an SHT3x sensor, and the polyester gas recovery module is a 365 direct-current micro diaphragm pump.
4. The post-disaster rescue channel detection robot based on the ROS system in claim 3, wherein: the base station positioning system is a UWB base station positioning system, and the terrain detection system is a SLAM terrain detection system.
5. The ROS system-based post-disaster rescue channel detection robot of claim 1, 2 or 4, wherein: the image recognition system comprises a fisheye camera and a binocular camera.
6. The post-disaster rescue channel detection robot based on the ROS system according to claim 5, characterized in that:
(1) a big-data-based fire detection method of the post-disaster rescue channel detection robot for a hazardous chemical factory: for the temperature, humidity, gas concentration, odor, wind speed and other fire-related catastrophe information in the actual environment, big data technology is adopted to rapidly analyze and process the information, thereby detecting hazardous chemical fires;
(2) a multi-information-fusion-based hazardous chemical gas concentration detection method of the post-disaster rescue channel detection robot for a hazardous chemical factory: a heat-conduction sensor, a catalytic-combustion sensor, an electrochemical sensor and an infrared sensor are carried on the disaster relief robot platform to realize multi-information acquisition; the data with the largest change between successive readings are screened out of the sensor data and fused, thereby obtaining an accurate monitoring result of the hazardous chemical gas leakage concentration;
(3) the Dijkstra global path planning algorithm is replaced by the A* algorithm, enhancing the positioning, navigation and autonomous obstacle avoidance capabilities of the post-disaster rescue channel detection robot for the hazardous chemical factory; in a semi-structured environment, an environment map of the hazardous chemical storage areas of the factory is established in advance, the shortest path is taken as the optimization target, a grid map is adopted to describe the environment model, and the A* method is used for path planning;
(4) real-time state estimation is generated through the on-board IMU (inertial measurement unit) and the laser radar to complete self-positioning; the map is built incrementally while positioning, providing a basis for subsequent path planning; path planning is then carried out so that the robot can rapidly plan an optimal path on the established map and avoid dynamic obstacles in real time; FastSLAM based on particle filtering is utilized: particle filtering is a filtering algorithm combining Monte Carlo localization with Bayesian estimation, and FastSLAM treats each particle as one estimate of the current state; at any moment the algorithm updates and resamples n sample particles, ensuring that the current pose of the robot is updated in real time; combined with the laser radar and the UWB positioning technology inside the hazardous chemical factory, obstacles, obstacle points and disaster areas are marked.
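As an illustration of the particle-filter idea described above (a toy one-dimensional sketch, not the FastSLAM implementation of the patent), each particle is one pose hypothesis that is propagated through a noisy motion model, weighted by a measurement likelihood, and resampled:

```python
import math
import random

random.seed(0)  # make the sketch reproducible

def motion_update(particles, dx, noise=0.05):
    """Propagate every pose hypothesis through a noisy motion model."""
    return [p + dx + random.gauss(0.0, noise) for p in particles]

def measurement_weights(particles, z, sigma=0.2):
    """Gaussian likelihood of observing z given each particle's pose."""
    return [math.exp(-0.5 * ((z - p) / sigma) ** 2) for p in particles]

def resample(particles, weights):
    """Draw a new particle set in proportion to the weights."""
    total = sum(weights)
    return random.choices(particles,
                          weights=[w / total for w in weights],
                          k=len(particles))

true_pose = 0.0
particles = [random.gauss(0.0, 0.3) for _ in range(1000)]  # initial uncertainty
for _ in range(10):                       # ten motion/measurement cycles
    true_pose += 0.1                      # robot advances 0.1 per step
    particles = motion_update(particles, 0.1)
    particles = resample(particles, measurement_weights(particles, true_pose))

estimate = sum(particles) / len(particles)   # pose estimate ≈ true_pose
```

After ten update/resample cycles the particle cloud concentrates around the true pose; FastSLAM extends this idea by additionally attaching per-particle map estimates.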
CN201911389188.2A 2019-12-30 2019-12-30 Post-disaster rescue channel detection robot based on ROS Active CN111070180B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911389188.2A CN111070180B (en) 2019-12-30 2019-12-30 Post-disaster rescue channel detection robot based on ROS


Publications (2)

Publication Number Publication Date
CN111070180A CN111070180A (en) 2020-04-28
CN111070180B true CN111070180B (en) 2021-07-20

Family

ID=70319544

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911389188.2A Active CN111070180B (en) 2019-12-30 2019-12-30 Post-disaster rescue channel detection robot based on ROS

Country Status (1)

Country Link
CN (1) CN111070180B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111618908B (en) * 2020-05-11 2023-08-15 中国科学院合肥物质科学研究院 Task-based robot platform autonomous capability testing device and testing method
CN111673756B (en) * 2020-05-15 2022-03-04 深圳国信泰富科技有限公司 Chemical leakage disaster relief robot and control method thereof
CN112109090A (en) * 2020-09-21 2020-12-22 金陵科技学院 Multi-sensor fusion search and rescue robot system
CN112857390A (en) * 2021-01-14 2021-05-28 江苏智派战线智能科技有限公司 Calculation method applied to intelligent robot moving path
CN113341951B (en) * 2021-05-13 2022-03-15 杭州电子科技大学 Rescue robot escape method based on multi-objective optimization
WO2024044975A1 (en) * 2022-08-30 2024-03-07 西门子股份公司 Control method and apparatus for mobile robotic arm

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9211648B2 (en) * 2012-04-05 2015-12-15 Irobot Corporation Operating a mobile robot
KR20170091987A (en) * 2016-02-02 2017-08-10 금오공과대학교 산학협력단 Pill bug robot to extinguish the flames
CN107368073A (en) * 2017-07-27 2017-11-21 上海工程技术大学 A kind of full ambient engine Multi-information acquisition intelligent detecting robot system
CN108500992A (en) * 2018-04-09 2018-09-07 中山火炬高新企业孵化器有限公司 A kind of multi-functional mobile security robot
CN109276833A (en) * 2018-08-01 2019-01-29 吉林大学珠海学院 A kind of robot patrol fire-fighting system and its control method based on ROS
CN109285190B (en) * 2018-09-06 2021-06-04 广东天机工业智能系统有限公司 Object positioning method and device, electronic equipment and storage medium
CN109352654A (en) * 2018-11-23 2019-02-19 武汉科技大学 A kind of intelligent robot system for tracking and method based on ROS
CN110489182B (en) * 2019-08-26 2021-05-18 北京机械工业自动化研究所有限公司 Robot cloud platform design method with autonomous navigation module


Similar Documents

Publication Publication Date Title
CN111070180B (en) Post-disaster rescue channel detection robot based on ROS
Bennetts et al. Towards real-world gas distribution mapping and leak localization using a mobile robot with 3d and remote gas sensing capabilities
CN104932001A (en) Real-time 3D nuclear radiation environment reconstruction monitoring system
CN111624641A (en) Explosion-proof type intelligent inspection robot for oil depot area
Singh et al. Comparative analysis of range sensors for the robust autonomous navigation–a review
CN205594404U (en) Security robot
Grehl et al. Mining-rox–mobile robots in underground mining
CN104597216B (en) For the removable objectionable impurities detection and location system and method for indoor and outdoors
CN116339337A (en) Target intelligent positioning control system and method based on infrared imaging, laser radar and sound directional detection
Schneider et al. Unmanned multi-robot CBRNE reconnaissance with mobile manipulation system description and technical validation
KR102615767B1 (en) Systems and methods to support safety management services using AI vision and the Internet of Things
CN204495800U (en) For the removable objectionable impurities detection and location system of indoor and outdoors
CN111590559A (en) Explosion-proof inspection task control method, storage medium and robot
Alhmiedat et al. A Systematic Approach for Exploring Underground Environment Using LiDAR-Based System.
Schneider et al. An autonomous unmanned vehicle for CBRNE reconnaissance
Bennetts et al. Gasbot: A mobile robotic platform for methane leak detection and emission monitoring
CN113963513A (en) Robot system for realizing intelligent inspection in chemical industry and control method thereof
CN112506206A (en) Coal mine rescue robot and obstacle avoidance method thereof
KLEIN et al. DamBotTM: An Unmanned Amphibious Vehicle for Earth Dam Outlet Inspection
Gianni Towards expendable robot teaming in extreme environments
Kumar et al. Gas leakage source localization and boundary estimation using mobile wireless sensor network
Silva et al. Promising Technologies and Solutions for Supporting Human Activities in Confined Spaces in Industry.
CN208969510U (en) A kind of submersible six-freedom motion real-time measurement apparatus
Reitbauer et al. LIWO-SLAM: A LiDAR, IMU, and Wheel Odometry Simultaneous Localization and Mapping System for GNSS-Denied Environments Based on Factor Graph Optimization
Lindqvist et al. Deployment of Autonomous Uavs in Underground Mines: Field Evaluations and Use-Case Demonstrations

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant