CN110159869A - Pipeline detection robot and multi-sensor fusion detection method thereof - Google Patents

Pipeline detection robot and multi-sensor fusion detection method thereof Download PDF

Info

Publication number
CN110159869A
CN110159869A CN201910417013.1A
Authority
CN
China
Prior art keywords
data
robot
sensor
pipe environment
self-state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910417013.1A
Other languages
Chinese (zh)
Other versions
CN110159869B (en)
Inventor
朱颖
张亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong University
Original Assignee
Shandong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University filed Critical Shandong University
Priority to CN201910417013.1A priority Critical patent/CN110159869B/en
Publication of CN110159869A publication Critical patent/CN110159869A/en
Application granted granted Critical
Publication of CN110159869B publication Critical patent/CN110159869B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F16 ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
    • F16L PIPES; JOINTS OR FITTINGS FOR PIPES; SUPPORTS FOR PIPES, CABLES OR PROTECTIVE TUBING; MEANS FOR THERMAL INSULATION IN GENERAL
    • F16L55/00 Devices or appurtenances for use in, or in connection with, pipes or pipe systems
    • F16L55/26 Pigs or moles, i.e. devices movable in a pipe or conduit with or without self-contained propulsion means
    • F16L55/28 Constructional aspects
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00 Measuring or testing not otherwise provided for
    • G01D21/02 Measuring two or more variables by means not covered by a single other subclass
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F16 ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
    • F16L PIPES; JOINTS OR FITTINGS FOR PIPES; SUPPORTS FOR PIPES, CABLES OR PROTECTIVE TUBING; MEANS FOR THERMAL INSULATION IN GENERAL
    • F16L2101/00 Uses or applications of pigs or moles
    • F16L2101/30 Inspecting, measuring or testing

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The present invention relates to the field of robot technology and provides a pipeline detection robot and a multi-sensor fusion detection method thereof that can accurately perceive both the pipe environment and the robot's own state, with comprehensive information detection and high accuracy. In the fusion detection method, a main control module acquires the detection data of each sensor at a preset time interval; all detection data acquired at the current moment are divided into robot self-state data and pipe environment data; feature extraction is performed respectively on the robot self-state data and the pipe environment data acquired at the current moment; parameter estimation and feature identification are then performed on the feature-extracted robot self-state data and pipe environment data; finally, fusion calculation is carried out respectively on the next-moment data estimates and the current-moment feature attributes of the robot self-state data and the pipe environment data, yielding a comprehensive situation estimate of the robot operating state and the pipe environment. This solves the problems of existing pipeline robots, namely incomplete information detection and a high false detection rate.

Description

Pipeline detection robot and multi-sensor fusion detection method thereof
Technical field
The present invention relates to the technical field of pipeline detection robots, and more particularly to a pipeline detection robot and a multi-sensor fusion detection method thereof.
Background Art
Oil and gas pipelines are mostly buried underground. To guarantee the safe use of a pipeline, damage such as deformation and corrosion must be found in time, which requires periodic internal inspection of the pipeline so that defects and damage are discovered in advance, the degree of danger of each pipe section is understood, and corresponding measures are taken, thereby effectively preventing and reducing pipeline accidents and saving maintenance costs. A pipeline detection robot is the ideal equipment for such inspection: it can travel automatically along the inner wall of the pipe and, usually equipped with one or more sensors, carries out in-pipe detection under the remote control of an operator.
For example, Chinese patent document CN 109483561A discloses a modular support crawler-type in-pipe robot comprising a main body mechanism, a modular supporting mechanism and a modular crawler mechanism. The main body mechanism uses a hollow cylindrical frame with multiple sliding slots arranged circumferentially, carrying the modules mounted on the robot and supplying power to them. The modular supporting mechanism, mounted on the main body mechanism, combines an active adaptation mechanism composed of a lead screw and nut with a passive adaptation mechanism composed of a spring, connecting rod and slider, and supports the crawler mechanism against the pipe. The modular crawler mechanism uses a single-crawler, single-motor built-in drive. This crawler-type in-pipe robot carries very few sensors and cannot work independently inside the pipe; its detection of the pipe environment is not comprehensive, its detection error is large and its false detection rate is high. In terms of information processing, it only performs a simple individual judgment on the information returned by each sensor, so it cannot obtain accurate pipe environment information or robot self-state information, and as a result the robot perceives both the in-pipe environment and its own state poorly.
Summary of the invention
Therefore, the technical problems to be solved by the present invention are that prior-art pipeline detection robots carry few sensors, detect information incompletely, have a high false detection rate and process the returned sensor information too simply, so that accurate perception of the in-pipe environment and of the robot itself cannot be obtained. The present invention provides a pipeline detection robot carrying multiple sensors, with comprehensive information detection, which performs comprehensive fusion processing on the information returned by the sensors and can accurately perceive both the in-pipe environment and the robot itself, as well as a multi-sensor fusion detection method thereof.
In order to solve the above technical problems, the technical solution adopted by the present invention is as follows:
A pipeline detection robot comprises a main body frame and three walking assemblies evenly distributed around the circumference of the main body frame. Each walking assembly includes a bottom bracket and a crawler wheel arranged on the bottom bracket, and the bottom bracket is connected to the main body frame through a connecting bracket. A laser radar, a camera, a gyroscope, a temperature and humidity sensor and a gas concentration sensor are arranged on the main body frame; a diaphragm pressure sensor, a magnetic flux leakage module and an encoder are arranged on each crawler wheel; and an infrared distance sensor and an odometer wheel module are respectively arranged on a first extension bracket and a second extension bracket on the outer side of the main body frame.
Preferably, the laser radar and the camera are arranged on the side of the main body frame facing the direction of travel; the gyroscope, the temperature and humidity sensor and the gas concentration sensor are arranged inside the main body frame; the diaphragm pressure sensor is arranged in the interlayer of the crawler wheel; the magnetic flux leakage module is evenly distributed inside the crawler wheel; and the encoder is arranged on the motor gear mounted on the crawler wheel.
A multi-sensor fusion detection method of a pipeline detection robot comprises:
Step 1: a main control module acquires the detection data of each sensor at a preset time interval;
Step 2: all detection data acquired at the current moment are divided into robot self-state data and pipe environment data;
Step 3: feature extraction is performed respectively on the robot self-state data and the pipe environment data acquired at the current moment;
Step 4: parameter estimation is performed respectively on the feature-extracted robot self-state data and pipe environment data to obtain the next-moment data estimates of the robot self-state data and the pipe environment data; at the same time, feature identification is performed on the feature-extracted robot self-state data and pipe environment data to obtain the current-moment feature attributes of the robot self-state data and the pipe environment data;
Step 5: fusion calculation is performed respectively on the next-moment data estimates and the current-moment feature attributes of the robot self-state data and the pipe environment data to obtain a comprehensive situation estimate of the robot operating state and the pipe environment.
Preferably, in Step 3, the feature extraction refers to performing time calibration and spatial coordinate transformation on all detection data at the current moment, so as to form the unified time reference point and spatial reference point required for the fusion calculation.
Preferably, in Step 4, the parameter estimation forms the feature-extracted robot self-state data and the pipe environment data each into a 1 x N matrix of measured values, and multiplies the deviation of the current-moment detection data from the current-moment data estimate made at the previous moment by a weight to obtain the next-moment data estimates of the robot self-state data and the pipe environment data;
the feature identification forms, according to the observation results of the feature-extracted robot self-state data and the pipe environment data, an N-dimensional feature vector for each, in which every dimension represents an independent feature of the detected data, so as to obtain the current-moment feature attributes of the robot self-state data and the pipe environment data.
Preferably, the robot self-state data include crawler drive status data, robot travel distance data, in-pipe obstacle position data and the distance between the robot and an obstacle ahead;
the pipe environment data include pipeline detection images, pipe inner wall magnetic flux leakage signals, in-pipe ambient temperature and humidity data and in-pipe harmful gas concentration data.
Preferably, the crawler drive status data are obtained from the normal pressure data between each crawler and the pipe inner wall and the crawler travel speed data.
Preferably, the crawler drive status data are obtained from the normal pressure data between each crawler and the pipe inner wall and the crawler travel speed data, and are further verified with the robot running acceleration data.
Preferably,
the normal pressure data between the crawler and the pipe inner wall are obtained by the diaphragm pressure sensor;
the crawler travel speed data are obtained by the encoder;
the robot running acceleration data are obtained by the gyroscope;
the robot travel distance data are obtained by the odometer wheel module;
the in-pipe obstacle position data are obtained by the laser radar;
the distance between the robot and an obstacle ahead is obtained by the infrared distance sensor;
the pipeline detection images are obtained by the camera;
the pipe inner wall magnetic flux leakage signals are obtained by the magnetic flux leakage module;
the in-pipe ambient temperature and humidity data are obtained by the temperature and humidity sensor;
the in-pipe harmful gas concentration data are obtained by the gas concentration sensor.
Preferably, the comprehensive situation estimate of the robot operating state and the pipe environment is sent to the main control module for decision making to obtain corresponding counter-measures; the preset time interval is 0.2 s.
Compared with the prior art, the above technical solution of the present invention has the following advantages:
In the pipeline detection robot and the multi-sensor fusion detection method provided by the present invention, the robot serves as a flexible platform equipped with a variety of peripheral modules and sensors, so the detection information is comprehensive. The information returned by the various sensors is processed by multi-sensor fusion, so accurate data on the pipe environment and on the robot's operating state in the pipe can be obtained, which greatly improves detection accuracy.
Brief Description of the Drawings
In order that the content of the present invention may be more clearly understood, the present invention is described in further detail below according to specific embodiments and with reference to the accompanying drawings, in which:
Fig. 1 is a first schematic view of the pipeline detection robot of the present invention;
Fig. 2 is a second schematic view of the pipeline detection robot of the present invention;
Fig. 3 is a diagram of the control system architecture of the pipeline detection robot of the present invention;
Fig. 4 is a diagram of the information processing model of the multi-sensor fusion detection method of the present invention.
Reference numerals in the figures: 1 - main body frame, 2 - connecting bracket, 3 - bottom bracket, 4 - crawler wheel, 5 - first extension bracket, 6 - electric push rod, 7 - motor gear, 8 - second extension bracket.
Detailed Description of the Embodiments
The technical solutions in the embodiments of the present invention will be described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only part, rather than all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
As shown in Figs. 1 and 2, a preferred embodiment of the pipeline detection robot of the present invention comprises a main body frame 1 and three walking assemblies evenly distributed around the circumference of the main body frame 1. Each walking assembly includes a bottom bracket 3 and a crawler wheel 4 arranged on the bottom bracket 3, and the bottom bracket 3 is connected to the main body frame 1 through a connecting bracket 2. The main body frame 1 has the shape of a regular triangular prism, and the connecting bracket 2 is hinged to the main body frame 1 and to the bottom bracket 3. An electric push rod 6 is further arranged between the connecting bracket 2 and the main body frame 1; one end of the electric push rod 6 is hinged to the main body frame 1 and the other end is hinged to the middle of the connecting bracket 2. Through the telescopic motion of the electric push rod, the robot stays fitted against the pipe inner wall during operation in the pipe and obtains a suitable normal pressure. Since the travelling motion of the robot does not affect the main body frame, multiple sensors, a control unit, a battery and the like can be arranged inside the main body frame as needed, giving the robot a strong carrying capacity.
A laser radar, a camera, a gyroscope, a temperature and humidity sensor and a gas concentration sensor are arranged on the main body frame 1; a diaphragm pressure sensor, a magnetic flux leakage module and an encoder are arranged on each crawler wheel 4; and an infrared distance sensor and an odometer wheel module are respectively arranged on the first extension bracket 5 and the second extension bracket 8 on the outer side of the main body frame 1.
Specifically, the laser radar and the camera are arranged on the side of the main body frame 1 facing the direction of travel; the gyroscope, the temperature and humidity sensor and the gas concentration sensor are arranged inside the main body frame 1, the gyroscope preferably being placed at the center of the inside of the main body frame 1; the diaphragm pressure sensor is arranged in the interlayer of the crawler wheel 4; the magnetic flux leakage module is evenly distributed inside the crawler wheel 4; and the encoder is arranged on the motor gear 7 mounted on the crawler wheel 4.
The laser radar performs a two-dimensional 360° scan of the pipe inner wall while the robot travels, building a two-dimensional point cloud image of the pipe inner wall and detecting obstacles. The camera returns and stores pipeline detection images. The gyroscope detects the running acceleration of the robot. The temperature and humidity sensor detects the temperature and humidity of the in-pipe environment as a basis for assessing pipe corrosion. The gas concentration sensor detects the concentration of harmful gases (such as methane) in the pipe. The diaphragm pressure sensor detects the normal pressure between each crawler and the pipe inner wall. The magnetic flux leakage module detects the magnetic flux leakage signal of the pipe inner wall in order to locate pipe defects. The encoder detects the travel speed of the crawler. The infrared distance sensor detects the distance between the robot and an obstacle ahead so that the robot's running speed can be adjusted in time when passing through a bend. The odometer wheel module detects the travel distance of the robot.
The pipeline detection robot of the present invention is designed for large 1016 mm oil pipelines, in line with the substitution of robots for key posts and the intelligentization promoted by Made in China 2025, and with the trend toward larger national oil and gas pipelines. It can carry a variety of detection modules and sensors and can accurately perceive both the in-pipe environment and the robot itself.
The pipeline detection robot of the present invention supports both remote control and autonomous operation, and has functions such as start/stop and travel control, differential rotation control in the pipe, in-pipe defect information detection, robot operating state monitoring and robot automatic fault self-test. As shown in Fig. 3, in the control system of the pipeline detection robot of the present invention, at the application layer an upper computer (PC) communicates with the robot through the application-layer interface to perform manual monitoring and remote control. The sensing layer consists of the laser radar, encoder, diaphragm pressure sensor, gyroscope, odometer wheel, infrared distance sensor, temperature and humidity sensor, gas concentration sensor and camera, and realizes pipe environment perception, robot self-state perception and robot fault pre-detection; the information from the multiple sensors is processed and fused by a signal processing module to compensate for the false detections of any single sensor and improve detection accuracy. The control layer, implemented by a DSP module and a Cortex-M core control module, realizes multi-sensor information processing and fusion as well as autonomous hierarchical control, and communicates with the upper computer; after the robot is switched on, the control layer first initializes, then starts to advance, collects and checks the sensor data transmitted by the sensing layer, runs the corresponding programs in an event-driven manner, and performs robot fault detection. The hardware bottom layer mainly consists of DC motors, drivers (the DC motor drive boards) and a steering gear; the control information sent by the control module passes through the drivers to control the DC motor motion; the camera is mounted on the steering gear, whose motion realizes multi-directional rotation of the camera; and the speed and crawler pressure information fed back by the encoders and pressure sensors realizes closed-loop control of the crawler and push-rod motion.
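As a rough illustration of the event-driven control flow just described, the sketch below polls the sensing layer at the 0.2 s interval and dispatches handlers for detected events; the sensor names, thresholds and handler actions are assumptions made for illustration and are not taken from the patent.

```python
import time

PRESET_INTERVAL = 0.2  # seconds, as stated in the embodiment

def read_sensors():
    """Placeholder for the sensing layer: return one sample per sensor (illustrative values)."""
    return {"encoder": 0.31, "pressure": 52.0, "gyro": 0.02,
            "lidar": 3.4, "infrared": 1.8, "gas": 120.0}

def detect_events(sample):
    """Map raw samples to events that drive the control layer (illustrative thresholds)."""
    events = []
    if sample["gas"] > 1000.0:
        events.append("GAS_ALARM")
    if sample["infrared"] < 0.5:
        events.append("OBSTACLE_AHEAD")
    return events

HANDLERS = {
    "GAS_ALARM": lambda: print("stop motors, report to upper computer"),
    "OBSTACLE_AHEAD": lambda: print("slow down before the bend"),
}

def control_loop(cycles=3):
    for _ in range(cycles):
        sample = read_sensors()              # sensing layer acquisition
        for event in detect_events(sample):  # event-driven dispatch
            HANDLERS[event]()                # control layer handler
        time.sleep(PRESET_INTERVAL)

if __name__ == "__main__":
    control_loop()
```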
As shown in Fig. 4, the multi-sensor fusion detection method of the pipeline detection robot of the present invention comprises:
Step 1: the main control module acquires the detection data of each sensor at a preset time interval.
In this embodiment, the preset time interval is set to 0.2 s. In the present invention, the detection data acquired from each sensor contain both data-layer information and feature-layer information.
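To illustrate the statement that every sensor reading carries both data-layer and feature-layer content, the following record type is a hypothetical structure; the field names and example values are assumptions, not part of the patent.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class SensorReading:
    sensor: str                      # e.g. "encoder", "leakage_module"
    timestamp: float                 # acquisition time in seconds
    data_layer: float                # raw numeric value used for parameter estimation
    feature_layer: Dict[str, bool] = field(default_factory=dict)  # discrete attributes used for feature identification

# Example: one 0.2 s acquisition cycle could yield readings such as
cycle = [
    SensorReading("encoder", 12.0, 0.31, {"wheel_turning": True}),
    SensorReading("leakage_module", 12.0, 0.07, {"defect_suspected": False}),
]
```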
Step 2: all detection data acquired at the current moment are divided into robot self-state data and pipe environment data.
Specifically, the robot self-state data include crawler drive status data, robot travel distance data, in-pipe obstacle position data and the distance between the robot and an obstacle ahead;
the pipe environment data include pipeline detection images, pipe inner wall magnetic flux leakage signals, in-pipe ambient temperature and humidity data and in-pipe harmful gas concentration data.
The crawler drive status data are obtained from the normal pressure data between each crawler and the pipe inner wall and the crawler travel speed data, and are further verified with the robot running acceleration data. In this embodiment,
the normal pressure data between the crawler and the pipe inner wall are obtained by the diaphragm pressure sensor;
the crawler travel speed data are obtained by the encoder;
the robot running acceleration data are obtained by the gyroscope;
the robot travel distance data are obtained by the odometer wheel module;
the in-pipe obstacle position data are obtained by the laser radar;
the distance between the robot and an obstacle ahead is obtained by the infrared distance sensor;
the pipeline detection images are obtained by the camera;
the pipe inner wall magnetic flux leakage signals are obtained by the magnetic flux leakage module;
the in-pipe ambient temperature and humidity data are obtained by the temperature and humidity sensor;
the in-pipe harmful gas concentration data are obtained by the gas concentration sensor.
After the detection data from the diaphragm pressure sensors and encoders in the three crawler wheels are combined in parallel, the resulting data-layer and feature-layer information is fused with the acceleration data from the gyroscope. Combining the information from these three kinds of sensors allows the motion state of the three crawler wheels to be judged accurately, so that a situation in which the robot may become stuck in the pipe can be predicted in time; the robot's motors are then stopped in advance and the pose is adjusted. Next, the data from the gyroscope, odometer wheel, laser radar and infrared distance sensor are fused in parallel to obtain accurate robot self-state data such as the specific position, travel speed and distance to the next bend, and the data from the camera, magnetic flux leakage module, temperature and humidity sensor and gas sensor are fused in parallel to obtain accurate pipe environment data.
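The grouping of one acquisition cycle into the two fusion branches could be sketched as follows; the sensor-to-branch mapping restates the lists given above, while the dictionary keys and function name are assumptions.

```python
ROBOT_STATE_SENSORS = {"diaphragm_pressure", "encoder", "gyroscope",
                       "odometer_wheel", "lidar", "infrared_range"}

def split_detection_data(readings):
    """Divide one acquisition cycle into robot self-state data and pipe environment data."""
    robot_state, pipe_env = [], []
    for reading in readings:  # each reading: {"sensor": name, "value": ...}
        if reading["sensor"] in ROBOT_STATE_SENSORS:
            robot_state.append(reading)
        else:  # camera, leakage module, temperature/humidity, gas concentration
            pipe_env.append(reading)
    return robot_state, pipe_env

robot_state, pipe_env = split_detection_data([
    {"sensor": "encoder", "value": 0.31},
    {"sensor": "gas_concentration", "value": 120.0},
])
```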
Step 3: feature extraction is performed respectively on the robot self-state data and the pipe environment data acquired at the current moment.
The feature extraction refers to performing time calibration and spatial coordinate transformation on all detection data at the current moment in order to unify the time and spatial reference points of the individual sensors, thereby forming the unified time reference point and spatial reference point required for the fusion calculation.
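A minimal sketch of such time calibration and spatial coordinate transformation is shown below, assuming each channel carries its own timestamps and that lidar points are moved into the robot body frame with a fixed mounting pose; the interpolation scheme and mounting values are illustrative only.

```python
import numpy as np

def time_calibrate(t_ref, timestamps, values):
    """Resample a sensor channel onto the common reference instant t_ref by linear interpolation."""
    return float(np.interp(t_ref, timestamps, values))

def to_body_frame(point_sensor, rotation, translation):
    """Transform a point from a sensor's local frame into the robot body frame: p_body = R @ p + t."""
    return rotation @ np.asarray(point_sensor) + np.asarray(translation)

# Example: align an encoder speed sample to t_ref = 10.0 s and move a lidar point
# into the body frame using an assumed mounting pose.
speed_at_ref = time_calibrate(10.0, [9.8, 10.2], [0.30, 0.34])
R_mount = np.eye(3)                    # assumed: lidar axes parallel to body axes
t_mount = np.array([0.25, 0.0, 0.10])  # assumed mounting offset in metres
p_body = to_body_frame([1.0, 0.2, 0.0], R_mount, t_mount)
```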
Step 4: parameter estimation is performed respectively on the feature-extracted robot self-state data and pipe environment data to obtain the next-moment data estimates of the robot self-state data and the pipe environment data; at the same time, feature identification is performed on the feature-extracted robot self-state data and pipe environment data to obtain the current-moment feature attributes of the robot self-state data and the pipe environment data.
The parameter estimation fuses the data-layer information of the detection data of each sensor, using a Kalman filtering data fusion scheme. Specifically, the feature-extracted robot self-state data and pipe environment data are each formed into a 1 x N matrix of measured values; the deviation of the current-moment detection data from the current-moment data estimate made at the previous moment is multiplied by a weight to obtain the next-moment data estimates of the robot self-state data and the pipe environment data. It should be noted that this weight changes continuously: in the Kalman filter, the weight is related to the detection data value and to its deviation from the data estimate for the current moment made at the previous moment.
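The weighted correction described above matches the standard scalar Kalman update, so a minimal per-channel sketch might look like the following; the noise variances and the constant-value process model are assumptions, not parameters disclosed in the patent.

```python
def kalman_step(x_pred, p_pred, z, r_meas, q_proc):
    """One scalar Kalman update: correct the prediction with measurement z, then predict the next moment.

    x_pred, p_pred -- prior estimate and its variance for the current moment (made at the previous moment)
    z              -- current detection data value
    r_meas, q_proc -- measurement and process noise variances (assumed constants)
    """
    k = p_pred / (p_pred + r_meas)       # weight (Kalman gain), changes every step
    x_corr = x_pred + k * (z - x_pred)   # deviation multiplied by the weight
    p_corr = (1.0 - k) * p_pred
    # constant-value process model: the corrected state serves as the next-moment prediction
    return x_corr, p_corr + q_proc

def estimate_next(measurements, estimates, variances, r_meas=0.5, q_proc=0.01):
    """Apply the scalar filter channel by channel to a 1 x N row of measured values."""
    return [kalman_step(x, p, z, r_meas, q_proc)
            for z, x, p in zip(measurements, estimates, variances)]
```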
The feature identification fuses the feature-layer information of the detection data of each sensor. Specifically, according to the observation results of the feature-extracted robot self-state data and pipe environment data, an N-dimensional feature vector is formed for each, in which every dimension represents an independent feature of the detected data, such as whether the pipe has defects or how well the robot is travelling, so as to obtain the current-moment feature attributes of the robot self-state data and the pipe environment data.
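Forming the N-dimensional feature vector can be pictured as in the sketch below; the particular features and thresholds are assumptions chosen only to illustrate the one-feature-per-dimension encoding.

```python
def pipe_feature_vector(obs):
    """Build a feature vector for the pipe environment data; each dimension is one independent feature."""
    return [
        1 if obs["leakage_peak"] > 0.5 else 0,   # suspected wall defect
        1 if obs["gas_ppm"] > 1000.0 else 0,     # harmful gas above limit
        1 if obs["humidity_pct"] > 80.0 else 0,  # corrosion-prone environment
        1 if obs["image_anomaly"] else 0,        # anomaly flagged in camera image
    ]

features = pipe_feature_vector(
    {"leakage_peak": 0.7, "gas_ppm": 150.0, "humidity_pct": 65.0, "image_anomaly": False}
)  # -> [1, 0, 0, 0]
```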
Step 5: fusion calculation is performed respectively on the next-moment data estimates and the current-moment feature attributes of the robot self-state data and the pipe environment data to obtain a comprehensive situation estimate of the robot operating state and the pipe environment.
The fusion calculation verifies, analyses, selects, supplements and modifies the correlated observation results of the 1 x N matrix of measured values output by the parameter estimation and the N-dimensional feature vector output by the feature identification, performs state tracking estimation, and analyses and synthesizes the uncorrelated observation results, thereby obtaining the comprehensive situation estimate of the robot operating state and the pipe environment.
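The final fusion step, which cross-checks the numeric next-moment estimates against the current-moment feature attributes before producing the situation estimate, might be sketched as follows; the consistency rule and the structure of the result are assumed for illustration.

```python
def fuse_situation(next_estimates, feature_vector, feature_names):
    """Combine next-moment numeric estimates with current-moment feature attributes
    into a simple comprehensive situation estimate (illustrative rules only)."""
    situation = {
        "predicted_state": dict(next_estimates),  # channel -> predicted value
        "active_features": [n for n, f in zip(feature_names, feature_vector) if f],
    }
    # cross-check: a defect feature without a supporting leakage estimate is flagged for review
    leak = next_estimates.get("leakage_peak", 0.0)
    if "wall_defect" in situation["active_features"] and leak < 0.3:
        situation["needs_review"] = True
    return situation

report = fuse_situation(
    {"speed": 0.32, "leakage_peak": 0.65},
    [1, 0], ["wall_defect", "gas_alarm"],
)
```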
The final result obtained by the multi-sensor fusion detection method provided by the present invention, i.e. the comprehensive situation estimate of the robot operating state and the pipe environment, is sent to the main control module for decision making to obtain the corresponding counter-measures. The main control module makes counter-measure decisions for the complex environment and the target, and adjusts the DC motors, push rods and the like through the robot's main control module, thereby ensuring trouble-free operation of the robot in the pipe and a relatively comprehensive and accurate detection of the pipe environment. For example, if the comprehensive situation estimate of the robot operating state indicates that the robot may become stuck, the main control module decides to stop the three crawler motors so that the robot stops in time and jamming is avoided.
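A decision step of the kind described (for example, stopping the crawler motors when a jam is predicted) could be expressed as below; the situation fields and counter-measure names are hypothetical.

```python
def decide(situation):
    """Map the comprehensive situation estimate to counter-measures (illustrative)."""
    actions = []
    if situation.get("jam_predicted"):
        actions.append("stop_crawler_motors")        # stop in time to avoid getting stuck
        actions.append("adjust_pose_with_push_rods")
    if "gas_alarm" in situation.get("active_features", []):
        actions.append("report_to_upper_computer")
    if not actions:
        actions.append("continue_travel")
    return actions

print(decide({"jam_predicted": True, "active_features": []}))
```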
Detecting robot of pipe provided by the invention and its Multi-sensor Fusion detection method, and only use a kind of sensor Robot compare, multiple-sensor integration can more fully obtain the information to detected object and increase the reliability of system, Even if system may still operate normally in one or more sensors failure.The data processing side of Fusion Formula has better fault-tolerance compared to the data processing method for only individually being judged sensor information or being simply added, Because the noise of each sensor be it is incoherent, can obviously inhibit noise after fusion treatment, reduce uncertain.Meanwhile this The fusion detection method of invention improves the complementarity between each sensor information, and certain sensors provide dense information, other Sensor provides sparse information, these are complementary after fusion, can compensate for the uncertainty and measurement range of single-sensor Limitation.Use Kalman filtering that can predict that NextState value is provided under statistical significance according to current time value for fuse information Optimal estimation, and its recursion characteristic make system information processing do not need a large amount of information and storage operation.
In other embodiments, according to actual needs, the crawler drive status data may also be obtained only from the normal pressure data between the crawler and the pipe inner wall and the crawler travel speed data, without further verification by the robot running acceleration data; the purpose of the present invention can still be achieved.
Obviously, the above embodiments are merely examples given for clarity of description and do not limit the implementations. For those of ordinary skill in the art, other variations or changes in different forms can also be made on the basis of the above description. It is neither necessary nor possible to exhaust all embodiments here, and obvious variations or changes derived therefrom are still within the protection scope of the present invention.

Claims (10)

1. A pipeline detection robot, comprising a main body frame (1) and three walking assemblies evenly distributed around the circumference of the main body frame (1), each walking assembly comprising a bottom bracket (3) and a crawler wheel (4) arranged on the bottom bracket (3), the bottom bracket (3) being connected to the main body frame (1) through a connecting bracket (2), characterized in that: a laser radar, a camera, a gyroscope, a temperature and humidity sensor and a gas concentration sensor are arranged on the main body frame (1); a diaphragm pressure sensor, a magnetic flux leakage module and an encoder are arranged on each crawler wheel (4); and an infrared distance sensor and an odometer wheel module are respectively arranged on a first extension bracket (5) and a second extension bracket (8) on the outer side of the main body frame (1).
2. The pipeline detection robot according to claim 1, characterized in that: the laser radar and the camera are arranged on the side of the main body frame (1) facing the direction of travel; the gyroscope, the temperature and humidity sensor and the gas concentration sensor are arranged inside the main body frame (1); the diaphragm pressure sensor is arranged in the interlayer of the crawler wheel (4); the magnetic flux leakage module is evenly distributed inside the crawler wheel (4); and the encoder is arranged on the motor gear (7) mounted on the crawler wheel (4).
3. A multi-sensor fusion detection method of the pipeline detection robot according to claim 1 or 2, characterized by comprising:
Step 1: a main control module acquires the detection data of each sensor at a preset time interval;
Step 2: all detection data acquired at the current moment are divided into robot self-state data and pipe environment data;
Step 3: feature extraction is performed respectively on the robot self-state data and the pipe environment data acquired at the current moment;
Step 4: parameter estimation is performed respectively on the feature-extracted robot self-state data and pipe environment data to obtain the next-moment data estimates of the robot self-state data and the pipe environment data; at the same time, feature identification is performed on the feature-extracted robot self-state data and pipe environment data to obtain the current-moment feature attributes of the robot self-state data and the pipe environment data;
Step 5: fusion calculation is performed respectively on the next-moment data estimates and the current-moment feature attributes of the robot self-state data and the pipe environment data to obtain a comprehensive situation estimate of the robot operating state and the pipe environment.
4. The multi-sensor fusion detection method according to claim 3, characterized in that: in Step 3, the feature extraction refers to performing time calibration and spatial coordinate transformation on all detection data at the current moment so as to form the unified time reference point and spatial reference point required for the fusion calculation.
5. The multi-sensor fusion detection method according to claim 4, characterized in that: in Step 4, the parameter estimation forms the feature-extracted robot self-state data and the pipe environment data each into a 1 x N matrix of measured values, and multiplies the deviation of the current-moment detection data from the current-moment data estimate made at the previous moment by a weight to obtain the next-moment data estimates of the robot self-state data and the pipe environment data;
the feature identification forms, according to the observation results of the feature-extracted robot self-state data and the pipe environment data, an N-dimensional feature vector for each, in which every dimension represents an independent feature of the detected data, so as to obtain the current-moment feature attributes of the robot self-state data and the pipe environment data.
6. The multi-sensor fusion detection method according to claim 5, characterized in that:
the robot self-state data include crawler drive status data, robot travel distance data, in-pipe obstacle position data and the distance between the robot and an obstacle ahead;
the pipe environment data include pipeline detection images, pipe inner wall magnetic flux leakage signals, in-pipe ambient temperature and humidity data and in-pipe harmful gas concentration data.
7. The multi-sensor fusion detection method according to claim 6, characterized in that: the crawler drive status data are obtained from the normal pressure data between each crawler and the pipe inner wall and the crawler travel speed data.
8. The multi-sensor fusion detection method according to claim 7, characterized in that: the crawler drive status data are obtained from the normal pressure data between each crawler and the pipe inner wall and the crawler travel speed data, and are further verified with the robot running acceleration data.
9. The multi-sensor fusion detection method according to claim 8, characterized in that:
the normal pressure data between the crawler and the pipe inner wall are obtained by the diaphragm pressure sensor;
the crawler travel speed data are obtained by the encoder;
the robot running acceleration data are obtained by the gyroscope;
the robot travel distance data are obtained by the odometer wheel module;
the in-pipe obstacle position data are obtained by the laser radar;
the distance between the robot and an obstacle ahead is obtained by the infrared distance sensor;
the pipeline detection images are obtained by the camera;
the pipe inner wall magnetic flux leakage signals are obtained by the magnetic flux leakage module;
the in-pipe ambient temperature and humidity data are obtained by the temperature and humidity sensor;
the in-pipe harmful gas concentration data are obtained by the gas concentration sensor.
10. The multi-sensor fusion detection method according to claim 3, characterized in that: the comprehensive situation estimate of the robot operating state and the pipe environment is sent to the main control module for decision making to obtain corresponding counter-measures; and the preset time interval is 0.2 s.
CN201910417013.1A 2019-05-20 2019-05-20 Pipeline detection robot and multi-sensor fusion detection method thereof Expired - Fee Related CN110159869B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910417013.1A CN110159869B (en) 2019-05-20 2019-05-20 Pipeline detection robot and multi-sensor fusion detection method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910417013.1A CN110159869B (en) 2019-05-20 2019-05-20 Pipeline detection robot and multi-sensor fusion detection method thereof

Publications (2)

Publication Number Publication Date
CN110159869A true CN110159869A (en) 2019-08-23
CN110159869B CN110159869B (en) 2020-11-03

Family

ID=67631406

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910417013.1A Expired - Fee Related CN110159869B (en) 2019-05-20 2019-05-20 Pipeline detection robot and multi-sensor fusion detection method thereof

Country Status (1)

Country Link
CN (1) CN110159869B (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110824524A (en) * 2019-11-13 2020-02-21 西安通航装备科技开发有限公司 Satellite video transmission system based on airborne Ka wave band
CN110861123A (en) * 2019-11-14 2020-03-06 华南智能机器人创新研究院 Method and device for visually monitoring and evaluating running state of robot
CN110954593A (en) * 2019-12-18 2020-04-03 中北大学 Pipeline defect in-service detection device and method based on rotating magnetic field array type probe
CN111007532A (en) * 2019-12-27 2020-04-14 江苏恒澄交科信息科技股份有限公司 Pipeline measuring method based on laser radar
CN111735491A (en) * 2020-05-14 2020-10-02 国网浙江宁波市鄞州区供电有限公司 Cable pipeline detection device with accurate measurement
CN112228697A (en) * 2020-11-04 2021-01-15 杭州申昊科技股份有限公司 Crawler-type pipeline robot
CN112228696A (en) * 2020-11-04 2021-01-15 杭州申昊科技股份有限公司 Air bag type pipeline robot
WO2021098342A1 (en) * 2019-11-22 2021-05-27 长安大学 In-situ detection robot for loess geological information
CN113273480A (en) * 2021-05-20 2021-08-20 安徽师范大学皖江学院 Intelligent underground infiltrating irrigation system and method based on soil humidity sensing
CN114135740A (en) * 2021-12-01 2022-03-04 国网江苏省电力有限公司连云港供电分公司 Pipeline inspection robot
CN114136670A (en) * 2021-10-26 2022-03-04 中国石油化工股份有限公司 Pipeline detection robot evaluation method based on pipeline detection robot test platform
CN114414571A (en) * 2022-01-21 2022-04-29 付世艳 Wireless toxic gas detection robot
CN114739303A (en) * 2022-06-09 2022-07-12 国机传感科技有限公司 Pipeline inner diameter sensing scanning device based on line laser
CN114791067A (en) * 2021-01-25 2022-07-26 杭州申昊科技股份有限公司 Pipeline robot with heat detection function, control method and control system
CN115081741A (en) * 2022-07-21 2022-09-20 西南石油大学 Natural gas metrological verification intelligent prediction method based on neural network

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103615663A (en) * 2013-11-11 2014-03-05 江苏师范大学 Robot for detecting inner condition of pipeline
CN104565675A (en) * 2014-06-20 2015-04-29 北京石油化工学院 Pipeline detection robot
CN106764245A (en) * 2017-01-11 2017-05-31 福州川大软件科技有限公司 A kind of crawler belt type pipeline robot
CN109114355A (en) * 2018-10-25 2019-01-01 南京水动力信息科技有限公司 A kind of pipe network Robot system
CN208595354U (en) * 2018-06-08 2019-03-12 上海工程技术大学 A kind of oil pipeline detection maintenance multi-foot robot
CN109611640A (en) * 2018-12-04 2019-04-12 山东大学 Pipe robot
CN109737267A (en) * 2019-01-16 2019-05-10 青岛理工大学 Pipeline detection robot and method based on multi-sensor information fusion

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103615663A (en) * 2013-11-11 2014-03-05 江苏师范大学 Robot for detecting inner condition of pipeline
CN104565675A (en) * 2014-06-20 2015-04-29 北京石油化工学院 Pipeline detection robot
CN106764245A (en) * 2017-01-11 2017-05-31 福州川大软件科技有限公司 A kind of crawler belt type pipeline robot
CN208595354U (en) * 2018-06-08 2019-03-12 上海工程技术大学 A kind of oil pipeline detection maintenance multi-foot robot
CN109114355A (en) * 2018-10-25 2019-01-01 南京水动力信息科技有限公司 A kind of pipe network Robot system
CN109611640A (en) * 2018-12-04 2019-04-12 山东大学 Pipe robot
CN109737267A (en) * 2019-01-16 2019-05-10 青岛理工大学 Pipeline detection robot and method based on multi-sensor information fusion

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
王黎 et al.: "Research on pipeline robot positioning technology based on multi-source information fusion", Science & Technology Information (Academic Research) *

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110824524A (en) * 2019-11-13 2020-02-21 西安通航装备科技开发有限公司 Satellite video transmission system based on airborne Ka wave band
CN110824524B (en) * 2019-11-13 2021-10-26 西安通航装备科技开发有限公司 Satellite video transmission system based on airborne Ka wave band
CN110861123A (en) * 2019-11-14 2020-03-06 华南智能机器人创新研究院 Method and device for visually monitoring and evaluating running state of robot
WO2021098342A1 (en) * 2019-11-22 2021-05-27 长安大学 In-situ detection robot for loess geological information
CN110954593A (en) * 2019-12-18 2020-04-03 中北大学 Pipeline defect in-service detection device and method based on rotating magnetic field array type probe
CN111007532A (en) * 2019-12-27 2020-04-14 江苏恒澄交科信息科技股份有限公司 Pipeline measuring method based on laser radar
CN111735491A (en) * 2020-05-14 2020-10-02 国网浙江宁波市鄞州区供电有限公司 Cable pipeline detection device with accurate measurement
CN112228696A (en) * 2020-11-04 2021-01-15 杭州申昊科技股份有限公司 Air bag type pipeline robot
CN112228697A (en) * 2020-11-04 2021-01-15 杭州申昊科技股份有限公司 Crawler-type pipeline robot
CN114791067A (en) * 2021-01-25 2022-07-26 杭州申昊科技股份有限公司 Pipeline robot with heat detection function, control method and control system
CN114791067B (en) * 2021-01-25 2024-02-06 杭州申昊科技股份有限公司 Pipeline robot with heat detection function, control method and control system
CN113273480A (en) * 2021-05-20 2021-08-20 安徽师范大学皖江学院 Intelligent underground infiltrating irrigation system and method based on soil humidity sensing
CN113273480B (en) * 2021-05-20 2022-11-11 安徽师范大学皖江学院 Intelligent underground infiltrating irrigation system and method based on soil humidity sensing
CN114136670A (en) * 2021-10-26 2022-03-04 中国石油化工股份有限公司 Pipeline detection robot evaluation method based on pipeline detection robot test platform
CN114135740A (en) * 2021-12-01 2022-03-04 国网江苏省电力有限公司连云港供电分公司 Pipeline inspection robot
CN114135740B (en) * 2021-12-01 2024-03-05 国网江苏省电力有限公司连云港供电分公司 Pipeline detection robot
CN114414571B (en) * 2022-01-21 2024-01-26 北京中电华劳科技有限公司 Wireless poisonous gas detection robot
CN114414571A (en) * 2022-01-21 2022-04-29 付世艳 Wireless toxic gas detection robot
CN114739303A (en) * 2022-06-09 2022-07-12 国机传感科技有限公司 Pipeline inner diameter sensing scanning device based on line laser
CN115081741A (en) * 2022-07-21 2022-09-20 西南石油大学 Natural gas metrological verification intelligent prediction method based on neural network

Also Published As

Publication number Publication date
CN110159869B (en) 2020-11-03

Similar Documents

Publication Publication Date Title
CN110159869A (en) A kind of detecting robot of pipe and its Multi-sensor Fusion detection method
CN110095061A (en) Vehicle morpheme detection system and method based on profile scan
CN108154084A (en) For the farm environment cognitive method of the unpiloted Multi-sensor Fusion of agricultural machinery
CN103776463B (en) Manless working face coal-winning machine automatic Memory coal cutting freedom positioning device method of testing
CN104713491B (en) The method that the slope monitoring system of slope deforming three-dimensional data can be obtained and its obtain slope deforming three-dimensional data
KR20200141422A (en) Vehicle and sensing device of utilizing spatial information acquired using sensor, and server for the same
CN101839721A (en) Visual navigation method in autonomous rendezvous and docking
CN110766785B (en) Real-time positioning and three-dimensional reconstruction device and method for underground pipeline
CN110455275A (en) A kind of macrotype spherical storage tank climbing robot Position Fixing Navigation System and method
CN205950750U (en) Transformer station inspection robot control system that navigates based on inertial navigation
Song et al. Design of in-pipe robot based on inertial positioning and visual detection
CN107014296A (en) Comprehensive inspection car OCS inspecting system high speed orientation triggering method and device
CN110454642A (en) A kind of control method of detecting robot of pipe
CN108388187A (en) A kind of robot control system
CN104005324A (en) Pavement texture information detection system
US20220111531A1 (en) Robot globally optimal visual positioning method and device based on point-line features
CN117150425B (en) Segment erector motion state prediction method based on mechanism data fusion
CN109668547A (en) A kind of bridge intelligence inspection system
CN116524116A (en) Drainage pipeline three-dimensional model construction system with multi-sensor data fusion function
Gao et al. Multi-Sensor Fusion Perception System in Train
Monteriù et al. Real-time model-based fault detection and isolation for ugvs
KR102660478B1 (en) System and apparatus for crop automatic image acquisition and diagnosis method of acquired plant image
CN109284753A (en) A kind of localization method of liquid transmission line and application
Tian et al. A design of odometer-aided visual inertial integrated navigation algorithm based on multiple view geometry constraints
Monteriu et al. Experimental validation of a real-time model-based sensor fault detection and isolation system for unmanned ground vehicles

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20201103