CN110174136B - Intelligent detection robot and intelligent detection method for underground pipeline - Google Patents


Publication number
CN110174136B
CN110174136B (granted publication of application CN201910375732.1A; first published as CN110174136A)
Authority
CN
China
Prior art keywords
data
pipeline
mobile robot
robot body
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910375732.1A
Other languages
Chinese (zh)
Other versions
CN110174136A (en
Inventor
柳景斌
熊剑
王泽民
谭智
黄百川
张广东
高鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Survey And Design Co ltd
Wuhan University WHU
Original Assignee
Wuhan Survey And Design Co ltd
Wuhan University WHU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Survey And Design Co ltd, Wuhan University WHU
Priority to CN201910375732.1A
Publication of CN110174136A
Application granted
Publication of CN110174136B

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/10 - Navigation by using measurements of speed or acceleration
    • G01C21/12 - Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 - Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 - Inertial navigation combined with non-inertial navigation instruments
    • G01D - MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00 - Measuring or testing not otherwise provided for
    • G01D21/02 - Measuring two or more variables by means not covered by a single other subclass
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 - Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 - Determining a navigation solution where the system transmits time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 - Determining position
    • G01S19/45 - Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/47 - Determining position where the supplementary measurement is an inertial measurement, e.g. tightly coupled inertial

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Image Analysis (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses an intelligent detection robot and an intelligent detection method for underground pipelines. The intelligent detection robot comprises a mobile robot body and a plurality of sensors. First, the relative position and attitude between the sensors, and between the sensors and the mobile robot body, are calibrated. The mobile robot body then moves autonomously through the pipeline space while the sensors acquire data in real time, and the accurate position of the mobile robot body during data acquisition and movement is computed. From the position of the robot body and the relative position and attitude of each sensor, the spatial position of the sensor data is calculated, and the different sensor data are registered and fused to obtain fused multi-source spatial data of the pipeline space. Finally, current-status information on the geometric shape and attributes of the pipeline is intelligently identified and extracted from the fused multi-source sensor data, and its spatial position is labeled. The invention automates data acquisition for underground pipeline status detection and makes the identification and extraction of pipeline status information intelligent.

Description

Intelligent detection robot and intelligent detection method for underground pipeline
Technical Field
The invention belongs to the field of special surveying and mapping technical equipment, relates to an intelligent underground pipeline detection robot and an intelligent detection method, and particularly relates to an integrated intelligent pipeline detection robot integrating Beidou GNSS satellite positioning, inertial navigation and an environment perception sensor and an intelligent detection method.
Background
Underground pipelines (such as a municipal drainage network) are the veins of a city: they underpin its safe and stable operation, are an important part of smart-city construction, and a modern underground pipeline system has become one of the key marks of the completeness of urban infrastructure and the level of urban management. Pipeline detection is a prerequisite for fine-grained management and preventive repair of underground pipelines, and is necessary for maintaining normal urban functions and protecting life and property. For example, sewage and rainwater pipelines carry out urban sewage collection and rainwater drainage; their effective functioning is the premise for safe city operation and for the safety of life and property, which is why large and medium-sized cities in China are vigorously building intelligent water-affairs systems.
Because underground pipelines are invisible, they rarely receive attention until an accident happens, causing large economic losses and even endangering life and property. The generally accepted solution today is to survey the current condition of a pipeline periodically, both at construction completion and during use, so that potential structural and functional damage is detected and repaired in time; such condition surveys are the precondition for fine-grained management and preventive repair. Accordingly, the General Office of the State Council issued guidance in 2014 on strengthening the construction management of urban underground pipelines, explicitly requiring a general survey of pipeline dimensions, positional relationships, functional attributes and the like. Because the pipeline environment is especially complex, direct access by personnel is difficult and conventional surveying methods are hard to apply. Underground pipeline detection robots have been developed at home and abroad, and the pipeline robots of companies such as Teruil, Leidi century and China instrumentation and Internet of things represent the current state of the art in industry. From the standpoint of the pipeline-inspection service industry, however, existing pipeline detection technology and equipment have two shortcomings and cannot meet the following requirements:
1) The degree of intelligence is low: data (for example CCTV, closed-circuit television, video) is mainly read by human eyes, so pipeline detection is time-consuming and labor-intensive, requires the pipeline to be taken out of service for long periods, and may even require excavating the road surface;
2) Inside the pipeline there is no precise position reference in a global geographic coordinate frame, so the detected data cannot be spatially located and cannot be automatically fused with other "smart city" systems or thematic GIS systems, which reduces the effective use of the data and the overall efficiency of the system (for example decision-support efficiency and the degree of automation).
Modern "smart cities" urgently need new intelligent detection methods and equipment for underground pipelines, in order to achieve efficient data acquisition and intelligent detection of pipeline status and to guarantee, to the greatest extent, the normal operation of urban functions.
Disclosure of Invention
To solve the above technical problems, the invention provides a novel, automated and intelligent solution: an underground pipeline detection robot and an intelligent detection method.
The technical scheme adopted by the intelligent detection robot is as follows: an intelligent detection robot for underground pipelines, characterized in that it comprises a mobile robot body and a plurality of sensors;
the mobile robot body is used for moving in an underground pipeline space automatically;
the sensors are respectively used for autonomous positioning while the mobile robot body moves inside and outside the underground pipeline space, for collecting high-precision fused mapping data, and for providing a geospatial position reference for the data; the high-precision fused mapping data comprise images of the pipeline space, three-dimensional laser point clouds and depth images;
the sensors are fixedly arranged on the mobile robot body and are connected with a central processing unit arranged in the mobile robot body one by one through leads, and information collected by the sensors is transmitted to an external data processing center through a communication module arranged in the mobile robot body for data processing.
The intelligent detection method adopts the technical scheme that: an intelligent detection method for underground pipelines is characterized by comprising the following steps:
Step 1: calibrating the relative position and posture information between the sensors and the mobile robot body;
Step 2: the mobile robot body moves autonomously in a pipeline space, and a sensor acquires data in real time;
Step 3: calculating and determining the accurate position of the mobile robot body in the data acquisition and movement process;
Step 4: calculating the spatial position of sensor data by using the position of the mobile robot body determined in step 3 and the relative position and posture information of the sensors determined in step 1, and registering and fusing different sensor data to obtain fused pipeline space multi-source spatial data;
Step 5: intelligently identifying and extracting the current-status information of the geometric shape and the attributes of the pipeline from the fused multi-source sensor data, and labeling the spatial position of the information;
the geometric shape information comprises a three-dimensional size and a geometric shape of the pipe;
the attribute status information comprises the current status of structural defects and functional defects of the pipeline; the structural defects of the pipeline comprise structural breakage, deformation and collapse; the functional defects of the pipeline comprise blockage or siltation.
Compared with the prior art, the invention has the beneficial effects that:
(1) the pipeline detection robot equipment has seamless positioning capability in complex environments such as the inside and the outside of a pipeline and the like. The seamless positioning brings the following benefits: the robot can move autonomously according to a planned path and automatically acquire data; the collected data and the identified and extracted pipeline geometric shape and attribute status information have geographic coordinates, so that the equipment can automatically generate a pipeline detection report, and corresponding detection result data can be output and integrated with other computer systems (such as a related Geographic Information System (GIS)).
(2) The pipeline detection robot equipment and the method have the advantages of high data acquisition automation degree, high acquisition efficiency and small manual operation burden. The method integrates various sensors, synchronously collects multi-sensor data of the pipeline space, calibrates the position and the posture of the sensor, and obtains deeply fused pipeline space multi-dimensional data (containing geometric information and physical attribute information) with geographic position labels. Compared with the existing equipment and method (such as CCTV camera data acquisition through manual remote control), the automation degree of the data acquisition process is obviously improved, the accuracy and precision of the data are ensured through computer automatic acquisition and processing, the richness and diversity of the data are improved through the multi-sensor fusion method, and multi-dimensional data can be acquired simultaneously.
(3) The pipeline detection robot equipment and method achieve a high degree of intelligence in data processing, high accuracy and a small manual workload. The equipment and method adopt a specially designed data-processing algorithm based on artificial-intelligence principles to identify and extract, from the multi-sensor-fused multi-dimensional data of the pipeline space, the three types of information required by a pipeline inspection project (three-dimensional size and geometric shape of the pipeline, structural defects of the pipeline, and functional defects of the pipeline), generating geographically referenced pipeline status categories and extracting the current status of the pipeline's geometry and attributes. Compared with existing equipment and methods (such as collecting CCTV camera data by manual remote control), the degree of automation of target identification and information extraction is markedly improved, computer processing guarantees the completeness and accuracy of the detection result, and the need for manual field work is greatly reduced.
(4) The pipeline detection robot equipment can automatically generate the inspection report from the computer-processed detection results, reducing the workload of manual report writing, with better completeness and accuracy than the existing manual approach. It can also output detection results in a predefined data format for integration with other computer systems. Compared with existing pipeline detection equipment, the equipment and method are open, which helps raise the overall level of digitization and informatization in the pipeline-inspection industry.
Drawings
FIG. 1 is a flow chart of a method of an embodiment of the present invention;
FIG. 2 is a graph of visual data obtained for an underground pipe space in accordance with an embodiment of the present invention;
FIG. 3 is a graph of the effect of visual SLAM localization of a space within a subterranean passageway obtained in an embodiment of the present invention.
Detailed Description
In order to facilitate the understanding and implementation of the present invention for those of ordinary skill in the art, the present invention is further described in detail with reference to the accompanying drawings and examples, it is to be understood that the embodiments described herein are merely illustrative and explanatory of the present invention and are not restrictive thereof.
The invention provides an intelligent detection robot for underground pipelines, comprising a mobile robot body and a plurality of sensors mounted on the mobile robot body;
the mobile robot body is used for autonomously moving in an underground pipeline space;
the sensors are respectively used for autonomous positioning while the mobile robot body moves inside and outside the underground pipeline space, for collecting high-precision fused mapping data, and for providing a geospatial position reference for the data; the high-precision fused mapping data comprise images of the pipeline space, three-dimensional laser point clouds and depth images;
the sensors are fixedly arranged on the mobile robot body and are connected with a central processing unit arranged in the mobile robot body one by one through leads, and information collected by the sensors is transmitted to an external data processing center through a communication module arranged in the mobile robot body for data processing.
The sensors of the present embodiment include one or more of: a GNSS receiver, an inertial measurement unit (IMU), a lidar scanner (LiDAR), an optical camera, an RGB-D depth camera, a video camera, a thermal imaging camera and a multispectral camera. Typical combinations include: a GNSS receiver alone; GNSS receiver + IMU; GNSS receiver + IMU + LiDAR; GNSS receiver + IMU + optical camera; IMU + LiDAR; IMU + optical camera; IMU + RGB-D depth camera; IMU + thermal imaging camera + multispectral camera; IMU + video camera.
The GNSS receiver can receive and process signals from one or more of the following satellite navigation and positioning systems to determine the receiver's spatial position: the United States Global Positioning System (GPS), the Russian GLONASS navigation satellite system, the European Union Galileo navigation satellite system, the Chinese Beidou navigation satellite system, the Japanese Quasi-Zenith Satellite System (QZSS) and the Indian Regional Navigation Satellite System (IRNSS).
Referring to fig. 1, the intelligent detection method for underground pipelines provided by the invention comprises the following steps:
Step 1: calibrating the relative position and posture information between the sensors and the mobile robot body;
Step 2: the mobile robot body moves autonomously in a pipeline space, and a sensor acquires data in real time;
in this embodiment, the data acquired by the sensor in real time includes the body position data (including an attitude angle and an azimuth angle) of the mobile robot, 2D image data and high definition video data in a pipeline space, thermal imaging picture data and multispectral picture data in the pipeline space, three-dimensional laser point cloud data, and depth image data.
Step 3: calculating and determining the accurate position of the mobile robot body in the data acquisition and movement process;
the specific implementation comprises the following substeps:
Step 3.1: within a simultaneous localization and mapping (SLAM) framework, use a visual-inertial odometer (VIO) to perform joint estimation of motion prediction and measurement-equation construction, and process the images acquired by the camera to obtain a sequence of positions and attitudes;
Step 3.2: in environments without GNSS signals (such as inside an underground pipeline), align the position and attitude sequence estimated by the inertial measurement unit (IMU) with the sequence estimated by the camera to recover the true scale of the camera trajectory, thereby obtaining relative coordinates;
Step 3.3: fuse the relative coordinates with the GPS coordinates obtained by the intelligent pipeline detection robot at the pipeline portal, achieving seamless continuous positioning across complex environments (with and without GNSS signals, inside and outside the pipeline), and determine the geographic coordinates of the robot, thereby obtaining the accurate coordinates of the mobile robot body.
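Steps 3.1 to 3.3 can be illustrated with a minimal sketch of the final fusion step: anchoring the scale-corrected relative SLAM coordinates to the GNSS fix taken at the pipeline portal. The flat-earth approximation and the function name are assumptions made for illustration; the patent does not specify the transformation.

```python
import math

def relative_to_geographic(rel_xy, portal_lat, portal_lon, portal_heading_deg):
    """Map a SLAM-relative position (metres, robot frame anchored at the
    pipe portal) to geographic coordinates, using the GNSS fix taken at
    the portal. A small-area flat-earth approximation is used here; a
    production system would work in a proper projected coordinate frame.
    """
    R = 6378137.0                      # WGS-84 semi-major axis, metres
    h = math.radians(portal_heading_deg)  # heading clockwise from north
    x, y = rel_xy                      # x: forward along pipe, y: left
    # rotate the robot-frame offset into east/north components
    east = x * math.sin(h) - y * math.cos(h)
    north = x * math.cos(h) + y * math.sin(h)
    lat = portal_lat + math.degrees(north / R)
    lon = portal_lon + math.degrees(east / (R * math.cos(math.radians(portal_lat))))
    return lat, lon
```

For example, a robot 10 m into a pipe that runs due north from the portal ends up slightly north of the portal fix, at the same longitude.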
and 4, step 4: calculating the spatial position of sensor data by using the position of the mobile robot body determined in the step 3 and the relative position and posture information of the sensor determined in the step 1, and registering and fusing different sensor data to obtain fused pipeline space multi-source spatial data;
in this embodiment, the seamless continuous positioning precise coordinates of the mobile robot body determined in step 3 and the relative position and posture information between the sensor and the mobile robot body, between the sensor and the sensor determined in step 1 are used to calculate the spatial positions of 2D image data and high definition video data in the sensor data pipeline space, thermal imaging picture data and multispectral picture data in the pipeline space, three-dimensional laser point cloud data, and depth image data, and register and fuse the sensor data to obtain fused pipeline space multisource spatial data (such as two-dimensional image and three-dimensional laser radar point cloud fusion, three-dimensional laser radar point cloud and thermal imaging camera data fusion);
and 5: intelligently identifying and extracting the current status information of the geometric shape and the attribute of the pipeline from the fused multi-source sensor data, and labeling the spatial position of the information;
the geometry information includes the three-dimensional size and geometry of the pipe;
the attribute status information comprises the status quo of structural defects and functional defects of the pipeline; structural defects of the pipeline comprise structural breakage, deformation and collapse; the lack of pipe functionality includes blockage or fouling.
In this embodiment, a variety of artificial-intelligence methods (such as deep learning, machine learning, target recognition and geometric detection) are adopted to intelligently identify and extract the geometric shape and attribute status of the pipeline from the sensor data, including but not limited to: 1) the three-dimensional size and geometric shape of the pipeline; 2) structural defects of the pipeline (e.g. structural breakage, deformation, collapse); 3) functional defects of the pipeline (e.g. blockage and siltation). Geographic coordinates are assigned to this information. The identified and extracted pipeline geometry and attribute status information, with its geographic coordinates, is used to generate a pipeline inspection report, and these data and results can be integrated with other systems (e.g. a relevant geographic information system, GIS).
In this embodiment, the three-dimensional size and geometric shape of the pipeline are obtained from the fused multi-source spatial data (such as the fusion of two-dimensional images with the three-dimensional lidar point cloud): each visible point on the two-dimensional image is matched to the laser point cloud by ray intersection ("collision"), and since the laser point cloud carries real coordinate position information, the three-dimensional size and geometric shape of the pipeline can be derived from it.
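Once the pipe-wall points carry real coordinates, a single cross-section already yields the pipe's inner diameter. The following is a deliberately crude sketch (assuming the section points sample the full circumference); a production system would more likely fit a cylinder robustly, e.g. with RANSAC.

```python
import math

def estimate_pipe_diameter(section_points):
    """Crude diameter estimate for one pipe cross-section: centre the
    (x, y) points, assumed to sample the full inner circumference of the
    georeferenced laser point cloud, and average their distance to the
    centroid. Illustrative only; robust cylinder fitting would be used
    in practice.
    """
    n = len(section_points)
    cx = sum(p[0] for p in section_points) / n
    cy = sum(p[1] for p in section_points) / n
    mean_r = sum(math.hypot(p[0] - cx, p[1] - cy) for p in section_points) / n
    return 2.0 * mean_r
```

Repeating this along the pipe axis gives a diameter profile, from which deformation (ovality) could also be flagged.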
In this embodiment, structural defects of the pipeline are identified as follows: a deep-learning model is trained on a pre-collected, labelled data set until its accuracy reaches 95%; in actual use, the multi-source data are fed into the model for judgment, which can broadly classify conditions such as structural breakage, deformation and collapse; optionally, manual participation and error correction can further improve the model until it reaches an ideal state.
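The human-in-the-loop step described above can be sketched as a small post-processing rule on the model's class probabilities. The class list follows the text; reusing the 95% figure as a review threshold, and the function itself, are illustrative assumptions.

```python
DEFECT_CLASSES = ["intact", "breakage", "deformation", "collapse"]

def classify_defect(class_probs, review_threshold=0.95):
    """Post-process the defect model's softmax output: return the
    predicted class plus a flag requesting manual review when the model
    is not confident enough, mirroring the optional manual error
    correction the embodiment describes. Threshold choice is an
    assumption for illustration.
    """
    best = max(range(len(class_probs)), key=lambda i: class_probs[i])
    needs_review = class_probs[best] < review_threshold
    return DEFECT_CLASSES[best], needs_review
```

Samples flagged for review would be relabelled by an operator and folded back into the training set, gradually improving the model.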
In this embodiment, functional defects of the pipeline (such as blockage and siltation) are detected and reported from the high-definition video data, thermal imaging picture data and multispectral picture data of the pipeline space, using machine learning (such as an SVM classifier) and target identification techniques (such as feature extraction and morphological operations).
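A minimal sketch of this pipeline, assuming hand-picked intensity features and an already-trained linear SVM (the weights and bias would come from offline training; the feature choice is hypothetical):

```python
def blockage_features(frame):
    """Toy feature vector for one greyscale/thermal frame (a 2-D list of
    intensities in [0, 1]): mean brightness and the fraction of dark
    pixels. These are the kind of low-level features that might feed the
    SVM classifier mentioned in the text; the exact choice is an
    assumption.
    """
    flat = [v for row in frame for v in row]
    mean = sum(flat) / len(flat)
    dark_frac = sum(1 for v in flat if v < 0.2) / len(flat)
    return [mean, dark_frac]

def linear_svm_predict(features, weights, bias):
    """Decision function of a trained linear SVM: sign(w . x + b).
    Weights and bias are placeholders standing in for a trained model."""
    score = sum(w * f for w, f in zip(weights, features)) + bias
    return 1 if score >= 0 else -1  # +1: blocked/silted, -1: clear
```

Morphological operations (opening/closing on a thresholded frame) would typically clean up the mask before such features are extracted.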
Fig. 2 and Fig. 3 show results of data acquisition inside a pipeline by the sensors provided by the invention.
The underground pipeline robot equipment has the capability of moving autonomously and automatically acquiring pipeline spatial data; the intelligent detection method can intelligently identify and extract the current-status information of the pipeline's geometry and attributes and label its spatial position, with the computation completed automatically by computer and no manual calculation required.
The invention provides seamless positioning in complex environments such as the inside and outside of an underground pipeline, automatic acquisition of pipeline spatial data, and automatic identification and extraction of pipeline geometry and attribute status information; it constitutes a multi-sensor-fusion method, and intelligent robot equipment, for seamless positioning, automatic data acquisition and intelligent target identification in complex environments.
It should be understood that parts of the specification not set forth in detail are well within the prior art.
It should be understood that the above description of the preferred embodiments is given for clarity and not for any purpose of limitation, and that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (4)

1. An intelligent detection method for underground pipelines adopts an intelligent detection robot for underground pipelines;
the method is characterized in that: the intelligent underground pipeline detection robot comprises a mobile robot body and a plurality of sensors;
the mobile robot body is used for moving in an underground pipeline space automatically;
the sensors are respectively used for autonomous positioning while the mobile robot body moves inside and outside the underground pipeline space, for collecting high-precision fused mapping data, and for providing a geospatial position reference for the data; the high-precision fused mapping data comprise images of the pipeline space, three-dimensional laser point clouds and depth images;
the sensors are fixedly arranged on the mobile robot body and are connected with a central processing unit arranged in the mobile robot body one by one through leads, and information acquired by the sensors is transmitted to an external data processing center for data processing through a communication module arranged in the mobile robot body;
the method comprises the following steps:
Step 1: calibrating the relative position and posture information between the sensors and the mobile robot body;
Step 2: the mobile robot body moves autonomously in a pipeline space, and a sensor acquires data in real time;
Step 3: calculating and determining the accurate position of the mobile robot body in the data acquisition and movement process;
the specific implementation of the step 3 comprises the following substeps:
step 3.1: adopting a synchronous positioning and mapping SLAM method, performing motion prediction and measurement equation construction joint estimation by using a visual inertial odometer VIO, and processing images acquired by a camera to obtain a position and posture sequence;
step 3.2: in an environment without GNSS signals, aligning the position and attitude sequence estimated by an inertial measurement unit IMU and the position and attitude sequence estimated by a camera, estimating the real scale of a camera track, and further obtaining a relative coordinate;
step 3.3: fusing the relative coordinates with GPS coordinates obtained by the pipeline intelligent detection robot at the pipeline port, realizing seamless continuous positioning in various environments, and determining the geographic coordinates of the robot so as to obtain the accurate coordinates of the mobile robot body;
Step 4: calculating the spatial position of sensor data by using the position of the mobile robot body determined in step 3 and the relative position and posture information of the sensors determined in step 1, and registering and fusing different sensor data to obtain fused pipeline space multi-source spatial data;
Step 5: intelligently identifying and extracting the current-status information of the geometric shape and the attributes of the pipeline from the fused multi-source sensor data, and labeling the spatial position of the information;
the geometric shape information comprises the three-dimensional size and geometric shape of the pipeline;
the attribute status information comprises the status of structural defects and functional defects of the pipeline; the structural defects of the pipeline comprise breakage, deformation and collapse; the functional defects of the pipeline comprise blockage or siltation;
the three-dimensional size and geometric shape of the pipeline are obtained from the fused two-dimensional images and three-dimensional lidar point cloud data: visible points in the two-dimensional images are matched to laser points by collision (intersection) testing, and since the laser point cloud contains real coordinate information, the three-dimensional size and geometric shape of the pipeline can be derived from it;
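The pixel-to-point correspondence described above can be approximated by projecting the lidar points through a pinhole camera model, so that each visible pixel inherits the real coordinates of its nearest projected laser point. A minimal sketch (the intrinsic matrix K is an illustrative example; in practice it comes from the step 1 calibration):

```python
import numpy as np

def project_to_image(pts_cam, K):
    """Project (N, 3) camera-frame points to (N, 2) pixel coordinates
    with pinhole intrinsics K; points must have positive depth z."""
    uvw = (K @ pts_cam.T).T
    return uvw[:, :2] / pts_cam[:, 2:3]

# example intrinsics: focal length 500 px, principal point (320, 240)
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
```

A point on the camera's optical axis projects to the principal point; off-axis points land at offsets proportional to x/z and y/z, which is what lets image features be tied back to metric lidar coordinates.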
pipeline structural defects are identified by deep learning: a pre-collected labeled data set is used to train a deep learning model to an accuracy of 95%, and in actual use the multi-source data are input into the trained model for result judgment;
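The patent specifies a deep learning model trained on a labeled set; as a stand-in that illustrates the same train-then-judge workflow on synthetic one-dimensional "defect score" features (a deliberate simplification, not the patent's model), a tiny logistic-regression classifier in numpy:

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic labeled set: 1-D "defect score" feature, label 1 = structural defect
X = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(2.0, 1.0, 200)])
y = np.concatenate([np.zeros(200), np.ones(200)])

w, b = 0.0, 0.0
for _ in range(500):                      # gradient descent on the logistic loss
    p = 1.0 / (1.0 + np.exp(-(w * X + b)))
    g = p - y                             # gradient of the loss w.r.t. the logit
    w -= 0.1 * (g * X).mean()
    b -= 0.1 * g.mean()

pred = (w * X + b) > 0.0                  # decision rule applied at inference time
acc = (pred == y).mean()
```

The real system would swap this scalar model for a deep network over the fused multi-source data, but the split between an offline training phase and an online judgment phase is the same.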
pipeline functional defects are detected from the high-definition video data, the thermal imaging picture data of the pipeline space and the multispectral picture data through machine learning, target recognition and morphological operations, and are reported when a functional defect of the pipeline is found.
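The morphological operations mentioned for functional-defect detection typically mean an opening (erosion then dilation) applied to a thresholded binary mask, suppressing speckle noise before a blockage or siltation region is measured. A pure-numpy sketch with a 3x3 structuring element (function names are illustrative):

```python
import numpy as np

def erode(mask):
    """3x3 binary erosion with zero padding: a pixel survives only if its
    whole 3x3 neighborhood is set."""
    p = np.pad(mask, 1)
    out = np.ones_like(mask)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out &= p[1 + dy:1 + dy + mask.shape[0], 1 + dx:1 + dx + mask.shape[1]]
    return out

def dilate(mask):
    """3x3 binary dilation with zero padding: a pixel is set if any neighbor is set."""
    p = np.pad(mask, 1)
    out = np.zeros_like(mask)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= p[1 + dy:1 + dy + mask.shape[0], 1 + dx:1 + dx + mask.shape[1]]
    return out

def opening(mask):
    """Erosion followed by dilation: removes isolated specks, keeps solid regions."""
    return dilate(erode(mask))
```

Applied to a thresholded thermal or video frame, the opening removes single-pixel noise while a genuine blockage region of contiguous pixels passes through essentially unchanged.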
2. The method of claim 1, wherein: the sensors comprise one or more of a GNSS receiver, an inertial measurement unit (IMU), a lidar, an optical camera, an RGB-D depth camera, a video camera, a thermal imaging camera and a multispectral camera;
the GNSS receiver is used for receiving and processing signals of a satellite navigation positioning system and determining the spatial position of the receiver; the satellite navigation positioning system comprises one or more of the United States' Global Positioning System (GPS), Russia's GLONASS, the European Union's Galileo, China's BeiDou, Japan's Quasi-Zenith Satellite System (QZSS) and the Indian Regional Navigation Satellite System (IRNSS).
3. The method of claim 1, wherein: in step 2, the sensors acquire data in real time; the data comprise the position data of the mobile robot body, 2D image data and high-definition video data of the pipeline space, thermal imaging picture data and multispectral picture data of the pipeline space, three-dimensional laser point cloud data and depth image data; the mobile robot body position data comprise attitude angle data and azimuth angle data.
4. The method of claim 1, wherein: in step 4, the spatial positions of the 2D image data, the high-definition video data, the thermal imaging picture data and multispectral picture data of the pipeline space, the three-dimensional laser point cloud data and the depth image data are calculated using the seamless continuous-positioning coordinates of the mobile robot body determined in step 3 and the relative position and attitude information, determined in step 1, between the sensors and the mobile robot body and among the sensors; the sensor data are then registered and fused to obtain fused multi-source spatial data of the pipeline space, including fused two-dimensional image and three-dimensional lidar point cloud data, and fused three-dimensional lidar point cloud and thermal imaging camera data.
CN201910375732.1A 2019-05-07 2019-05-07 Intelligent detection robot and intelligent detection method for underground pipeline Active CN110174136B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910375732.1A CN110174136B (en) 2019-05-07 2019-05-07 Intelligent detection robot and intelligent detection method for underground pipeline

Publications (2)

Publication Number Publication Date
CN110174136A CN110174136A (en) 2019-08-27
CN110174136B true CN110174136B (en) 2022-03-15

Family

ID=67691226

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910375732.1A Active CN110174136B (en) 2019-05-07 2019-05-07 Intelligent detection robot and intelligent detection method for underground pipeline

Country Status (1)

Country Link
CN (1) CN110174136B (en)

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110634136B (en) * 2019-09-17 2022-09-13 北京华捷艾米科技有限公司 Pipeline wall damage detection method, device and system
CN110766785B (en) * 2019-09-17 2023-05-05 武汉大学 Real-time positioning and three-dimensional reconstruction device and method for underground pipeline
CN110851956B (en) * 2019-10-11 2022-06-14 南昌大学 Automatic calculation, labeling and plotting method for construction engineering pipeline construction positioning information
CN110992349A (en) * 2019-12-11 2020-04-10 南京航空航天大学 Underground pipeline abnormity automatic positioning and identification method based on deep learning
CN111007532A (en) * 2019-12-27 2020-04-14 江苏恒澄交科信息科技股份有限公司 Pipeline measuring method based on laser radar
CN111652261A (en) * 2020-02-26 2020-09-11 南开大学 Multi-modal perception fusion system
CN111551111B (en) * 2020-05-13 2021-02-05 华中科技大学 Part feature robot rapid visual positioning method based on standard ball array
CN113721249A (en) * 2020-05-21 2021-11-30 武汉中仪物联技术股份有限公司 Method, system and equipment for detecting defects in pipeline
CN113702996A (en) * 2020-05-21 2021-11-26 武汉中仪物联技术股份有限公司 Method, system and equipment for detecting laser radar in pipeline
CN111828774B (en) * 2020-07-02 2021-07-13 广州腾鼎勘测科技有限公司 Endoscopic three-dimensional position finder for gas transmission pipeline
CN111856496A (en) * 2020-07-07 2020-10-30 武汉中仪物联技术股份有限公司 Pipeline detection method and pipeline detection device
CN112576861B (en) * 2020-11-25 2021-11-05 深圳市博铭维智能科技有限公司 Geological radar robot, control system, method, terminal and readable storage medium
CN112665582A (en) * 2020-12-18 2021-04-16 南京理工大学 Underground pipeline detecting system based on IMU and laser spot image
CN113327289A (en) * 2021-05-18 2021-08-31 中山方显科技有限公司 Method for simultaneously calibrating internal and external parameters of multi-source heterogeneous sensor
CN113820735B (en) * 2021-08-31 2023-12-01 上海华测导航技术股份有限公司 Determination method of position information, position measurement device, terminal and storage medium
CN114331956A (en) * 2021-11-16 2022-04-12 武汉中仪物联技术股份有限公司 Pipeline detection method and device, electronic equipment and storage medium
CN115014334B (en) * 2021-11-19 2024-08-20 电子科技大学 Pipeline defect detection and positioning method and system based on multi-sensing information fusion
CN114991298B (en) * 2022-06-23 2023-06-06 华中科技大学 Urban drainage pipeline detection and dredging intelligent robot and working method
CN115015911B (en) * 2022-08-03 2022-10-25 深圳安德空间技术有限公司 Method and system for manufacturing and using navigation map based on radar image
CN115451919B (en) * 2022-09-28 2023-06-30 安徽理工大学 Intelligent unmanned mapping device and method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201138451Y (en) * 2007-12-10 2008-10-22 华中科技大学 Robot self-positioning system
CN105573310A (en) * 2014-10-11 2016-05-11 北京自动化控制设备研究所 Method for positioning and environment modeling of coal mine tunnel robot
CN106826833A (en) * 2017-03-01 2017-06-13 西南科技大学 Independent navigation robot system based on 3D solid cognition technologies
CN105222772B (en) * 2015-09-17 2018-03-16 泉州装备制造研究所 A kind of high-precision motion track detection system based on Multi-source Information Fusion
CN108051837A (en) * 2017-11-30 2018-05-18 武汉大学 Multiple-sensor integration indoor and outdoor mobile mapping device and automatic three-dimensional modeling method
CN208503767U (en) * 2018-06-20 2019-02-15 深圳供电局有限公司 Underground pipeline surveying and mapping robot

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104568983B (en) * 2015-01-06 2017-03-15 浙江工业大学 Pipeline Inner Defect Testing device and method based on active panoramic vision

Similar Documents

Publication Publication Date Title
CN110174136B (en) Intelligent detection robot and intelligent detection method for underground pipeline
CN111337030B (en) Backpack-based laser radar scanning system and navigation positioning method
CN104567708B (en) Full section of tunnel high speed dynamical health detection means and method based on active panoramic vision
AU2019217205B2 (en) Method of and apparatus for analyzing images
Tao et al. Lane marking aided vehicle localization
AU2013200708B2 (en) Image registration of multimodal data using 3D GeoArcs
KR101314588B1 (en) Method and apparatus for producing map of artificial mark, method and apparatus for measuring position of mobile object by using same
CN111551958A (en) Mining area unmanned high-precision map manufacturing method
CN110766785B (en) Real-time positioning and three-dimensional reconstruction device and method for underground pipeline
CN111060924B (en) SLAM and target tracking method
CN108733053A (en) A kind of Intelligent road detection method based on robot
US20200240790A1 (en) Localization with Neural Network Based Image Registration of Sensor Data and Map Data
CN111275960A (en) Traffic road condition analysis method, system and camera
CN114638909A (en) Substation semantic map construction method based on laser SLAM and visual fusion
CN112859130B (en) High-precision electronic map position matching method for field navigation patrol
CN110751123B (en) Monocular vision inertial odometer system and method
US20220282967A1 (en) Method and mobile detection unit for detecting elements of infrastructure of an underground line network
CN109685893A (en) Space integration modeling method and device
CN114972970A (en) Coal mine mechanical arm scanning observation system
Tao et al. Automated processing of mobile mapping image sequences
Dai et al. LiDAR–Inertial Integration for Rail Vehicle Localization and Mapping in Tunnels
Koppanyi et al. Experiences with acquiring highly redundant spatial data to support driverless vehicle technologies
TWM600873U (en) Detection system for detecting road damage
Dwivedi et al. New horizons in planning smart cities using LiDAR technology
Roberts et al. Predictive intelligence for a rail traffic management system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant