CN107632615B - Autonomous flight four-rotor tunnel passing method based on visual inspection - Google Patents

Autonomous flight four-rotor tunnel passing method based on visual inspection

Info

Publication number
CN107632615B
CN107632615B (application CN201710750773.5A)
Authority
CN
China
Prior art keywords
tunnel
rotors
flight
flying
autonomous flight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710750773.5A
Other languages
Chinese (zh)
Other versions
CN107632615A (en)
Inventor
史莹晶
李�瑞
刘奇胜
蒋宏
张华林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CN201710750773.5A priority Critical patent/CN107632615B/en
Publication of CN107632615A publication Critical patent/CN107632615A/en
Application granted granted Critical
Publication of CN107632615B publication Critical patent/CN107632615B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The invention discloses a visual-inspection-based tunnel passing method for an autonomous-flight quadrotor: the quadrotor autonomously searches for the entrance and enters the tunnel, navigates autonomously inside the tunnel using a bottom-mounted camera, inspects the cables and their surroundings in real time with the camera while flying, and finally flies out of the tunnel autonomously.

Description

Autonomous flight four-rotor tunnel passing method based on visual inspection
Technical Field
The invention belongs to the technical field of flight control, and particularly relates to an autonomous flight four-rotor tunnel passing method based on visual inspection.
Background
In recent years, underground cables have increasingly replaced traditional overhead lines in the development of large cities; statistics show that in many modern cities the proportion of underground transmission lines already exceeds 70%. A cable tunnel is a fully enclosed underground structure that accommodates a large number of cables and provides a passage for installation and inspection, and it is the preferred way of carrying underground cables. As underground cables become more widespread, how to inspect these enclosed tunnels regularly and prevent emergencies is a problem in urgent need of a solution.
Periodic inspection using a rail-guided robot, or manual inspection, has been proposed. A rail-mounted robot moves along a rail laid at the top or bottom of the tunnel, driven by its own motor or by the rail's transmission, and inspects the tunnel in real time with its on-board sensors. Such a robot must be deployed in the target tunnel in advance, i.e. each tunnel needs its own robot and matching rail installation, and personnel must still enter the tunnel regularly to inspect and maintain the robot, so the cost of use and maintenance is high. At present there is no unmanned device for comprehensive tunnel inspection that requires no advance deployment, can enter and exit the tunnel on its own, and operates fully autonomously inside the tunnel.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides an autonomous flying four-rotor tunnel-passing method based on visual inspection.
In order to achieve the purpose, the invention discloses an autonomous flying four-rotor tunnel-passing method based on visual inspection, which is characterized by comprising the following steps of:
(1) laying a yellow marker line along the center line of the tunnel, mounting a USB camera on the bottom of the autonomous-flight quadrotor, adding LED fill lights so that the camera works normally inside the tunnel, and fitting miniature laser ranging sensors to the nose, the tail and the right side of the fuselage;
(2) controlling the autonomous-flight quadrotor to fly to the tunnel wellhead
(2.1) carrying out power-on initialization on four autonomous flight rotors;
(2.2) taking off manually and flying the quadrotor to a height h1 above the ground, then remotely piloting it to the vicinity of the tunnel wellhead and switching it into fully autonomous flight mode via the remote controller;
(2.3) calculating the real-time attitude angle of the four autonomically flying rotors by the flight control module;
(2.4) locating the quadrotor's real-time position with the laser ranging sensors on the nose and the right side of the fuselage, fusing these measurements with the accelerometer to estimate the flight speed, and having the flight control module combine the real-time attitude angle, position and flight speed to steer the quadrotor to the center of the tunnel wellhead;
(3) controlling the autonomous-flight quadrotor to descend into the tunnel entrance from the center of the tunnel wellhead
The flight control module controls the quadrotor to descend vertically from the center of the tunnel wellhead; once it has descended to the plane of the tunnel top, the laser ranging sensors on the tail and the right side of the fuselage take over real-time positioning, the quadrotor continues its vertical descent based on the current position and real-time attitude angle, and when it reaches the set cruising height it has arrived at the tunnel entrance;
(4) the flight control module controls the autonomous-flight quadrotor through the tunnel
(4.1) periodically capturing an environment image inside the tunnel with the USB camera under the LED fill light;
(4.2) starting the line-following thread to obtain the actual distance d by which the quadrotor deviates from the yellow marker line and the deflection angle theta between its flight direction and the tunnel heading;
(4.2.1) converting the environment image into an HSV color space, and then carrying out binarization processing on the environment image according to the yellow HSV range value to obtain a binarization image;
(4.2.2) filtering white noise points in the binary image by utilizing open operation, and connecting discrete white areas in the binary image by utilizing closed operation to obtain an ideal binary image;
(4.2.3) taking the front n rows of images above the ideal binary image, searching the left edge of a white area from the leftmost side of the ideal binary image to the right and searching the right edge of the white area from the rightmost side to the left for the first row to obtain the white area of the row of images, and then finding out the center of the white area; for the rest n-1 lines of images, searching the left edge and the right edge of the line in the range of m pixels near the left edge and the right edge of the line searched on the upper line of the image to obtain the white area of the line of images, and then finding out the center of the white area corresponding to the line of images;
(4.2.4) making a difference between the center of the white area of the found n rows of images and the absolute center of the corresponding row in the ideal binary image, accumulating and averaging, and taking the average difference value as the pixel quantity of the autonomous flight quadrotors deviating from the white area;
(4.2.5) calculating the actual distance d of the autonomous flight four rotors deviating from the yellow identification line by using a pinhole imaging model according to the pixel quantity of the autonomous flight four rotors deviating from the white area, the current real-time height information of the autonomous flight four rotors and the focal length information of the USB camera;
(4.2.6) selecting the white area centers extracted from the n/3 th line and the 2n/3 th line from the n-line images, and calculating the arctan value of the difference e of the white area centers of the two lines by arctan (e/(n/3)) to obtain the drift angle theta of the flight direction of the autonomous flight four rotors and the tunnel trend;
(4.3) starting an optical flow thread of the four independent flying rotors to obtain the real-time flying speed v of the four independent flying rotors;
(4.3.1) at the initial power-on moment t0, running a corner detection algorithm on the environment image captured by the USB camera at that moment to obtain k Shi-Tomasi corners, which form the Shi-Tomasi corner set a(t0) of the image;
(4.3.2) at each subsequent moment, periodically capturing the environment image, processing the current image with the LK (Lucas-Kanade) optical flow motion estimation method to search for matches of the Shi-Tomasi corners in the previous image's corner set a(t-1), forming the point set b(t), and removing from a(t-1) the corners for which no match was found, thereby obtaining the current corner set a(t);
(4.3.3) judging whether the number of Shi-Tomasi corners in a(t) is greater than the threshold M; if so, proceeding to step (4.3.4), otherwise returning to step (4.3.1);
(4.3.4) accumulating and averaging the coordinate deviations between corresponding points in a(t) and b(t) to obtain the pixel-level flight speed v1;
(4.3.5) compensating v1 with the quadrotor's current real-time altitude and real-time attitude angle to obtain the flight speed v2;
(4.3.6) estimating the inertial navigation speed v3 from the accelerometer and gyroscope, then fusing v3 and v2 with a Kalman filter to obtain the quadrotor's real-time flight speed v;
(4.4) using the actual distance d from the yellow marker line, the real-time flight speed v and the deflection angle theta between the flight direction and the tunnel heading, the flight control module steers the quadrotor so that d and theta approach 0 while it cruises forward; when the distance measured by the nose laser ranging sensor equals half the tunnel width, the quadrotor has reached the tunnel outlet;
(5) controlling the autonomous-flight quadrotor to leave through the tunnel wellhead from the center of the tunnel outlet
The laser ranging sensors on the nose and the right side of the fuselage locate the quadrotor's real-time position while the flight control module makes it ascend vertically from the tunnel outlet; upon reaching the plane of the tunnel top, positioning switches to the laser ranging sensors on the tail and the right side of the fuselage, the quadrotor continues its vertical ascent from the center of the tunnel outlet, and once it has risen above the tunnel wellhead it flies away from the wellhead.
The invention aims to realize the following steps:
the invention relates to a vision inspection-based tunnel passing method with four autonomous flying rotors, which is characterized in that an inlet is autonomously searched by the four autonomous flying rotors and enters a tunnel, autonomous navigation flying is realized in the tunnel by a bottom camera, cables and peripheral conditions in the tunnel are detected in real time by the camera during flying, and finally the cables and the peripheral conditions are autonomously flown out of the tunnel.
Meanwhile, the autonomous flying four-rotor tunnel passing method based on the visual inspection also has the following beneficial effects:
(1) current tunnel inspection robots are mainly rail-mounted, which requires laying a rail along the tunnel top or floor and deploying a robot in each tunnel in advance; this is costly, still requires maintenance visits, and limits the detection range. The invention needs no rail and therefore has the advantage of low cost;
(2) the invention can realize the entrance and exit of the tunnel completely and autonomously without being arranged in the target tunnel in advance, one robot can finish the inspection task of all tunnels in one block, and the maintenance work of the robot is more convenient than that of a tracked robot needing well descending maintenance;
(3) the invention passes through tunnels easily: the quadrotor needs only a modest clearance to perform inspection tasks inside the tunnel, so it is little affected by the in-tunnel environment;
(4) the invention can replace manual well descending to complete detection and inspection tasks, does not need workers to descend the well in the using and maintenance processes, greatly shortens the detection time, ensures the safety of personnel, reduces the using and maintenance cost, and is safer and more reliable.
Drawings
FIG. 1 is a schematic cross-sectional view of a tunnel;
FIG. 2 is a schematic top view of a tunnel;
FIG. 3 is a flow chart of the autonomous flying four-rotor tunneling method based on visual inspection of the present invention;
Detailed Description
The following description of the embodiments of the present invention is provided in order to better understand the present invention for those skilled in the art with reference to the accompanying drawings. It is to be expressly noted that in the following description, a detailed description of known functions and designs will be omitted when it may obscure the subject matter of the present invention.
Examples
In this embodiment, the tunnel is abstracted into the structure shown in fig. 1 and fig. 2; both the tunnel entrance and the exit are of the three-side-enclosed type. The invention is described in detail below with reference to fig. 1 and fig. 2.
As shown in fig. 3, the invention relates to an autonomous flying four-rotor tunnel passing method based on visual inspection, which specifically comprises the following steps:
S1, laying a yellow marker line along the center line of the tunnel, mounting a USB camera on the bottom of the autonomous-flight quadrotor, adding LED fill lights so that the camera works normally inside the tunnel, and fitting miniature laser ranging sensors to the nose, the tail and the right side of the fuselage;
S2, controlling the autonomous-flight quadrotor to fly to the tunnel wellhead
S2.1, performing power-on initialization on four independent flying rotors;
S2.2, taking off manually; the quadrotor is flown to a certain height above the ground (the exact height is not critical), then remotely piloted to the vicinity of the tunnel wellhead, typically to a position where the nose is about 1 meter from the facing wall, and finally switched into fully autonomous flight mode via the remote controller;
s2.3, calculating a real-time attitude angle of the four autonomically flying rotors by the flight control module;
S2.4, locating the quadrotor's real-time position with the laser ranging sensors on the nose and the right side of the fuselage, fusing these measurements with the accelerometer to estimate the flight speed, and having the flight control module combine the real-time attitude angle, position and flight speed to steer the quadrotor to the center of the tunnel wellhead;
S3, controlling the autonomous-flight quadrotor to descend into the tunnel entrance from the center of the tunnel wellhead
The flight control module controls the quadrotor to descend vertically from the center of the tunnel wellhead. When the quadrotor descends to the plane of the tunnel top, the forward wall surface disappears, so the distance measured by the forward (nose) laser ranging sensor jumps sharply or fails outright, while the rearward (tail) laser ranging sensor begins to return valid data. By sensing the jump in the forward range reading, the control algorithm therefore immediately switches to the laser ranging sensors on the tail and the right side of the fuselage to obtain the quadrotor's real-time position. The quadrotor continues its vertical descent based on the current position and real-time attitude angle, and when it descends to the set cruising height it has arrived at the tunnel entrance;
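The sensor hand-over described above can be sketched as a simple jump detector on the forward range reading; the thresholds and the function name below are illustrative assumptions, not values from the patent:

```python
def forward_range_jumped(prev_m, curr_m, jump_m=1.0, max_valid_m=20.0):
    """Return True when the forward (nose) laser range reading jumps or
    fails, signalling that the quadrotor has descended past the forward
    wall and positioning should switch to the tail-side sensor.
    Thresholds are illustrative, not taken from the patent."""
    if curr_m is None or curr_m > max_valid_m:
        return True                      # no echo / beyond the sensor's span
    return abs(curr_m - prev_m) > jump_m
```

A reading of `None` models a ranging failure; in practice the sensor driver would report an error code or a saturated value instead.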
S4, the flight control module controls the autonomous-flight quadrotor through the tunnel
S4.1, periodically capturing an environment image inside the tunnel with the USB camera under the LED fill light;
S4.2, starting the line-following thread to obtain the actual distance d by which the quadrotor deviates from the yellow marker line and the deflection angle theta between its flight direction and the tunnel heading;
S4.2.1, converting the environment image into the HSV color space, then binarizing it according to the yellow HSV range to obtain a binarized image; the yellow HSV range used is H: 26-34, S: 43-255, V: 46-255;
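A minimal numpy sketch of this binarization step, assuming the image has already been converted to OpenCV-style HSV (H in 0-179, S and V in 0-255); in an OpenCV pipeline `cv2.inRange` would normally produce the same mask in one call:

```python
import numpy as np

# Yellow range from the embodiment (OpenCV-style HSV scaling).
H_LO, H_HI = 26, 34
S_LO, S_HI = 43, 255
V_LO, V_HI = 46, 255

def binarize_yellow(hsv: np.ndarray) -> np.ndarray:
    """Return a 0/255 mask of pixels inside the yellow HSV range.

    `hsv` is an (H, W, 3) uint8 array already converted to HSV.
    """
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    mask = ((h >= H_LO) & (h <= H_HI) &
            (s >= S_LO) & (s <= S_HI) &
            (v >= V_LO) & (v <= V_HI))
    return mask.astype(np.uint8) * 255
```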
s4.2.2, filtering white noise points in the binary image by utilizing open operation, and connecting discrete white areas in the binary image by utilizing closed operation to obtain an ideal binary image;
S4.2.3, taking the top 100 rows of the ideal binary image; for the first row, searching rightward from the leftmost column for the left edge of the white area and leftward from the rightmost column for its right edge to obtain that row's white area, then finding its center; for each of the remaining 99 rows, searching for its left and right edges within 30 pixels of the edges found in the row above, obtaining that row's white area and then its center;
S4.2.4, taking the difference between each of the 100 found white-area centers and the absolute center of the corresponding row of the ideal binary image, accumulating and averaging, and using the mean difference as the pixel amount by which the quadrotor deviates from the white area;
s4.2.5, calculating the actual distance d of the autonomous flight four rotors deviating from the yellow identification line by using a pinhole imaging model according to the pixel quantity of the autonomous flight four rotors deviating from the white area, the current real-time height information of the autonomous flight four rotors and the focal length information of the USB camera;
S4.2.6, taking the white-area centers extracted from rows 33 and 66 of the 100 rows and computing arctan(e/33), where e is the difference between the two centers, to obtain the deflection angle theta between the quadrotor's flight direction and the tunnel heading;
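Steps S4.2.3 through S4.2.6 can be sketched as follows on the ideal binary image. This simplified version scans each of the top rows over its full width rather than restricting the search to the 30-pixel window around the previous row's edges; all names and defaults are illustrative:

```python
import numpy as np

def line_offset_and_angle(mask: np.ndarray, n: int = 100):
    """Estimate the pixel offset from the marker line and the drift angle.

    `mask` is the ideal 0/255 binary image; the top `n` rows are scanned
    for the white stripe. Returns (pixel_offset, theta_rad)."""
    h, w = mask.shape
    rows = min(n, h)
    centers = []
    for r in range(rows):
        cols = np.flatnonzero(mask[r] > 0)
        if cols.size == 0:
            centers.append((w - 1) / 2.0)   # no stripe found: assume centered
        else:
            centers.append((cols[0] + cols[-1]) / 2.0)
    centers = np.asarray(centers)
    # Mean deviation of the stripe centers from the image's center column.
    pixel_offset = float(np.mean(centers - (w - 1) / 2.0))
    # Drift angle from the centers at rows n/3 and 2n/3 (cf. S4.2.6).
    e = centers[2 * rows // 3] - centers[rows // 3]
    theta = float(np.arctan2(e, rows / 3.0))
    return pixel_offset, theta

def pixel_offset_to_meters(pixel_offset, height_m, focal_px):
    """Pinhole model (S4.2.5): one pixel on the floor spans height/focal meters."""
    return pixel_offset * height_m / focal_px
```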
s4.3, starting an optical flow thread of the four independent flying rotors to obtain the real-time flying speed v of the four independent flying rotors;
S4.3.1, at the initial power-on moment t0, running a corner detection algorithm on the environment image captured by the USB camera at that moment to obtain 200 Shi-Tomasi corners, which form the Shi-Tomasi corner set a(t0) of the image;
S4.3.2, at each subsequent moment, periodically capturing the environment image, processing the current image with the LK (Lucas-Kanade) optical flow motion estimation method to search for matches of the Shi-Tomasi corners in the previous image's corner set a(t-1), forming the point set b(t), and removing from a(t-1) the corners for which no match was found, thereby obtaining the current corner set a(t);
S4.3.3, judging whether the number of Shi-Tomasi corners in a(t) is greater than the threshold 30; if so, proceeding to step S4.3.4, otherwise returning to step S4.3.1;
S4.3.4, accumulating and averaging the coordinate deviations between corresponding points in a(t) and b(t) to obtain the pixel-level flight speed v1;
S4.3.5, compensating v1 with the quadrotor's current real-time altitude and real-time attitude angle to obtain the flight speed v2;
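The compensation in S4.3.5 amounts to scaling the pixel velocity by the pinhole model and removing the apparent motion induced by body rotation. A small-angle sketch, where the names and the simple rotation term are assumptions rather than the patent's exact formula:

```python
def pixel_to_ground_speed(v_px, height_m, focal_px, pitch_rate_rad_s=0.0):
    """Convert a pixel-domain velocity to meters/second.

    By the pinhole model one pixel on the floor spans height/focal meters,
    so v2 = v1 * h / f; a body rotation of omega rad/s adds an apparent
    ground motion of roughly omega * h, subtracted here (small-angle
    simplification, not the patent's exact attitude compensation)."""
    return v_px * height_m / focal_px - pitch_rate_rad_s * height_m
```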
S4.3.6, estimating the inertial navigation speed v3 from the accelerometer and gyroscope, then fusing v3 and v2 with a Kalman filter to obtain the quadrotor's real-time flight speed v;
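The S4.3.6 fusion can be sketched as a single scalar Kalman update, treating the inertial estimate v3 as the prior and the optical-flow estimate v2 as the measurement; the variances are illustrative tuning parameters, not values from the patent:

```python
def fuse_velocity(v_flow, var_flow, v_ins, var_ins):
    """One scalar Kalman update: prior = inertial speed (v3), measurement =
    optical-flow speed (v2). Returns the fused speed and its variance."""
    k = var_ins / (var_ins + var_flow)          # Kalman gain
    v = v_ins + k * (v_flow - v_ins)
    var = (1.0 - k) * var_ins
    return v, var
```

With equal variances the result is simply the average of the two estimates, and the fused variance is halved.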
S4.4, using the actual distance d from the yellow marker line, the real-time flight speed v and the deflection angle theta between the flight direction and the tunnel heading, the flight control module steers the quadrotor so that d and theta approach 0 while it cruises forward; when the distance measured by the nose laser ranging sensor equals half the tunnel width, the quadrotor has reached the center of the tunnel outlet;
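The control objective of S4.4 (drive d and theta to 0 while cruising) can be sketched as a proportional law; the gains and command conventions below are illustrative assumptions, not the patent's controller:

```python
def track_line(d_m, theta_rad, kp_d=0.8, kp_theta=1.5):
    """Proportional corrections that push the lateral offset d and the
    drift angle theta toward zero; the forward speed is held separately
    at the cruise setpoint. Gains are placeholder values."""
    lateral_cmd = -kp_d * d_m          # sideways velocity command (m/s)
    yaw_cmd = -kp_theta * theta_rad    # yaw-rate command (rad/s)
    return lateral_cmd, yaw_cmd
```

In a real flight stack these commands would feed the attitude loop of the flight control module each image period.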
S5, controlling the autonomous-flight quadrotor to leave through the tunnel wellhead from the center of the tunnel outlet
The laser ranging sensors on the nose and the right side of the fuselage locate the quadrotor's real-time position while the flight control module makes it ascend vertically from the center of the tunnel outlet; upon reaching the plane of the tunnel top, positioning switches to the laser ranging sensors on the tail and the right side of the fuselage, the quadrotor continues its vertical ascent from the center of the tunnel outlet, and once it has risen above the tunnel wellhead it flies away from the wellhead.
Although illustrative embodiments of the invention have been described above to help those skilled in the art understand it, the invention is not limited to the scope of these embodiments. To those skilled in the art, various changes are permissible as long as they remain within the spirit and scope of the invention as defined by the appended claims; everything that makes use of the inventive concept falls under protection.

Claims (1)

1. An autonomous flight four-rotor tunnel passing method based on visual inspection, characterized by comprising the following steps:
(1) laying a yellow marker line along the center line of the tunnel, mounting a USB camera on the bottom of the autonomous-flight quadrotor, adding LED fill lights so that the camera works normally inside the tunnel, and fitting miniature laser ranging sensors to the nose, the tail and the right side of the fuselage;
(2) controlling the autonomous-flight quadrotor to fly to the tunnel wellhead
(2.1) carrying out power-on initialization on four autonomous flight rotors;
(2.2) taking off manually and flying the quadrotor to a height h1 above the ground, then remotely piloting it to the vicinity of the tunnel wellhead and switching it into fully autonomous flight mode via the remote controller;
(2.3) calculating the real-time attitude angle of the four autonomically flying rotors by the flight control module;
(2.4) locating the quadrotor's real-time position with the laser ranging sensors on the nose and the right side of the fuselage, fusing these measurements with the accelerometer to estimate the flight speed, and having the flight control module combine the real-time attitude angle, position and flight speed to steer the quadrotor to the center of the tunnel wellhead;
(3) controlling the autonomous-flight quadrotor to descend into the tunnel entrance from the center of the tunnel wellhead
The flight control module controls the quadrotor to descend vertically from the center of the tunnel wellhead; once it has descended to the plane of the tunnel top, the laser ranging sensors on the tail and the right side of the fuselage take over real-time positioning, the quadrotor continues its vertical descent based on the current position and real-time attitude angle, and when it reaches the set cruising height it has arrived at the tunnel entrance;
(4) the flight control module controls the autonomous-flight quadrotor through the tunnel
(4.1) periodically capturing an environment image inside the tunnel with the USB camera under the LED fill light;
(4.2) starting the line-following thread to obtain the actual distance d by which the quadrotor deviates from the yellow marker line and the deflection angle theta between its flight direction and the tunnel heading;
(4.2.1) converting the environment image into an HSV color space, and then carrying out binarization processing on the environment image according to the yellow HSV range value to obtain a binarization image;
(4.2.2) filtering white noise points in the binary image by utilizing open operation, and connecting discrete white areas in the binary image by utilizing closed operation to obtain an ideal binary image;
(4.2.3) taking the front n rows of images above the ideal binary image, searching the left edge of a white area from the leftmost side of the ideal binary image to the right and searching the right edge of the white area from the rightmost side to the left for the first row to obtain the white area of the row of images, and then finding out the center of the white area; for the rest n-1 lines of images, searching the left edge and the right edge of the line in the range of m pixels near the left edge and the right edge searched in the previous line of the line to obtain the white area of the line of images, and then finding out the center of the white area corresponding to the line of images;
(4.2.4) making a difference between the center of the white area of the found n rows of images and the absolute center of the corresponding row in the ideal binary image, accumulating and averaging, and taking the average difference value as the pixel quantity of the autonomous flight quadrotors deviating from the white area;
(4.2.5) calculating the actual distance d of the autonomous flight four rotors deviating from the yellow identification line by using a pinhole imaging model according to the pixel quantity of the autonomous flight four rotors deviating from the white area, the current real-time height information of the autonomous flight four rotors and the focal length information of the USB camera;
(4.2.6) selecting the white area centers extracted from the n/3 th line and the 2n/3 th line from the n-line images, and calculating the arctan value of the difference e of the white area centers of the two lines by arctan (e/(n/3)) to obtain the drift angle theta of the flight direction of the autonomous flight four rotors and the tunnel trend;
(4.3) starting an optical flow thread of the four independent flying rotors to obtain the real-time flying speed v of the four independent flying rotors;
(4.3.1) at the initial power-on moment t0, running a corner detection algorithm on the environment image captured by the USB camera at that moment to obtain k Shi-Tomasi corners, which form the Shi-Tomasi corner set a(t0) of the image;
(4.3.2) at each subsequent moment, periodically capturing the environment image, processing the current image with the LK (Lucas-Kanade) optical flow motion estimation method to search for matches of the Shi-Tomasi corners in the previous image's corner set a(t-1), forming the point set b(t), and removing from a(t-1) the corners for which no match was found, thereby obtaining the current corner set a(t);
(4.3.3) judging whether the number of Shi-Tomasi corners in a(t) is greater than the threshold M; if so, proceeding to step (4.3.4), otherwise returning to step (4.3.1);
(4.3.4) accumulating and averaging the coordinate deviations between corresponding points in a(t) and b(t) to obtain the flight speed information v1 in pixel units;
(4.3.5) compensating v1 with the current real-time altitude and real-time attitude angles of the autonomous flight quadrotor to obtain the flight speed information v2;
(4.3.6) estimating the inertial navigation speed information v3 from the accelerometer and gyroscope, then fusing v3 and v2 by Kalman filtering to obtain the real-time flight speed v of the autonomous flight quadrotor;
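The fusion in step (4.3.6) can be sketched as a minimal scalar Kalman-style update; the patent does not give the filter's details, so the variance values here are tuning assumptions:

```python
def fuse_speeds(v_inertial, var_inertial, v_flow, var_flow):
    """Scalar Kalman-style fusion of the inertial speed estimate v3 with
    the vision-derived speed v2; variances are assumed/tuned values."""
    # Kalman gain weights the vision measurement by relative uncertainty
    k = var_inertial / (var_inertial + var_flow)
    v = v_inertial + k * (v_flow - v_inertial)
    var = (1.0 - k) * var_inertial   # fused estimate is more certain
    return v, var

# Equal variances: the fused speed is the midpoint of the two estimates
v, var = fuse_speeds(1.0, 0.04, 0.8, 0.04)
```

With equal confidence in both sources the gain is 0.5, so the fused speed lands halfway between the inertial and optical-flow estimates while the variance halves.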
(4.4) according to the actual distance d of the autonomous flight quadrotor from the yellow marker line, the real-time flight speed v, and the drift angle θ between the flight direction and the tunnel heading, controlling the quadrotor through the flight control module so that d and θ approach 0 while it cruises forward; the quadrotor has reached the tunnel outlet when the distance measured by the nose laser ranging sensor equals half the tunnel width;
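A hedged sketch of the tracking control in step (4.4): the patent does not specify the control law, so this proportional mapping, its gain values, and the cruise speed are all illustrative assumptions:

```python
def tunnel_tracking_command(d, theta, v, v_cruise=0.5,
                            kp_d=1.2, kp_theta=0.8, kp_v=0.6):
    """Proportional sketch of step (4.4): drive the lateral offset d and
    the drift angle theta toward 0 while holding a forward cruise speed."""
    roll_cmd = -kp_d * d               # sideways correction toward the marker line
    yaw_cmd = -kp_theta * theta        # align heading with the tunnel direction
    pitch_cmd = kp_v * (v_cruise - v)  # regulate the forward cruise speed
    return roll_cmd, yaw_cmd, pitch_cmd

# Centered, aligned, and at cruise speed: no correction is needed
cmd = tunnel_tracking_command(0.0, 0.0, 0.5)
```

In a real flight stack these commands would feed the attitude loop of the flight control module; an integral or derivative term could be added if steady-state offset or oscillation were observed.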
(5) controlling the autonomous flight quadrotor to leave the tunnel wellhead from the center of the tunnel outlet
The laser ranging sensors on the nose and the right side of the fuselage are used to obtain the real-time position of the autonomous flight quadrotor, and the flight control module controls the quadrotor to ascend vertically from the center of the tunnel outlet. When it reaches the plane of the tunnel ceiling, positioning switches to the laser ranging sensors on the tail and the right side of the fuselage, and the flight control module controls the quadrotor to continue ascending vertically from the outlet center; once it has risen above the tunnel wellhead, it flies away from the wellhead.
CN201710750773.5A 2017-08-28 2017-08-28 Autonomous flight four-rotor tunnel passing method based on visual inspection Active CN107632615B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710750773.5A CN107632615B (en) 2017-08-28 2017-08-28 Autonomous flight four-rotor tunnel passing method based on visual inspection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710750773.5A CN107632615B (en) 2017-08-28 2017-08-28 Autonomous flight four-rotor tunnel passing method based on visual inspection

Publications (2)

Publication Number Publication Date
CN107632615A CN107632615A (en) 2018-01-26
CN107632615B true CN107632615B (en) 2020-07-17

Family

ID=61100590

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710750773.5A Active CN107632615B (en) 2017-08-28 2017-08-28 Autonomous flight four-rotor tunnel passing method based on visual inspection

Country Status (1)

Country Link
CN (1) CN107632615B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108803652A (en) * 2018-04-26 2018-11-13 中国计量大学 A kind of autonomous tracking control method of rotor craft
CN112173104A (en) * 2020-09-03 2021-01-05 昆明理工大学 Inspection robot based on four-rotor aircraft
CN112985296B (en) * 2021-02-06 2022-06-24 郑州地铁集团有限公司 Urban rail transit tunnel structure and control method of protection area
CN113655803A (en) * 2021-08-26 2021-11-16 国网江苏省电力有限公司无锡供电分公司 System and method for calibrating course of rotor unmanned aerial vehicle in tunnel environment based on vision

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6879878B2 (en) * 2001-06-04 2005-04-12 Time Domain Corporation Method and system for controlling a robot
WO2012061137A2 (en) * 2010-10-25 2012-05-10 Lockheed Martin Corporation Building a three dimensional model of an underwater structure
CN102980510A (en) * 2012-08-07 2013-03-20 孟繁志 Laser optical ruler image tree measuring device and method thereof
WO2017091768A1 (en) * 2015-11-23 2017-06-01 Kespry, Inc. Autonomous mission action alteration
US9886845B2 (en) * 2008-08-19 2018-02-06 Digimarc Corporation Methods and systems for content processing

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Application of LED Positioning Technology in Driverless Navigation in Tunnels"; Du Yanzhong et al.; Bulletin of Surveying and Mapping (《测绘通报》); December 2016; pp. 100-101 and 154 *
"Research on Vision-Based Remote Control of Robots in Pipeline Inspection"; Chen Yingsong et al.; Refrigeration and Air Conditioning (《制冷与空调》); August 2010; Vol. 24, No. 4; pp. 133-137 *

Also Published As

Publication number Publication date
CN107632615A (en) 2018-01-26

Similar Documents

Publication Publication Date Title
CN107632615B (en) Autonomous flight four-rotor tunnel passing method based on visual inspection
KR102583989B1 (en) Automated image labeling for vehicles based on maps
CN108496129B (en) Aircraft-based facility detection method and control equipment
CN103419944B (en) Air bridge and automatic abutting method therefor
JP5690539B2 (en) Automatic take-off and landing system
EP2413096B1 (en) Ground-based videometrics guiding method for aircraft landing or unmanned aerial vehicles recovery
US10878709B2 (en) System, method, and computer readable medium for autonomous airport runway navigation
US20180273173A1 (en) Autonomous inspection of elongated structures using unmanned aerial vehicles
KR101933714B1 (en) System for guiding a drone during the approach phase to a platform, in particular a naval platform, with a view to landing same
CN106697322A (en) Automatic abutting system and method for boarding bridge
JP2006027448A (en) Aerial photographing method and device using unmanned flying body
JP2012232654A (en) Taking-off and landing target device, and automatic taking-off and landing system
CN211554748U (en) Mine patrol micro unmanned aerial vehicle system
CN107992829A (en) A kind of traffic lights track level control planning extracting method and device
WO2020000790A1 (en) Vertical mine shaft detection method and system
CN105487092B (en) Airport shelter bridge docks aircraft hatch navigation system
CN106774387A (en) A kind of unmanned plane barrier-avoiding method and obstacle avoidance system
Savva et al. ICARUS: Automatic autonomous power infrastructure inspection with UAVs
CN104113733B (en) A kind of low slow Small object TV reconnaissance probe method
CN107272729B (en) Unmanned aerial vehicle system of cruising based on router
CN107562070B (en) Autonomous flight four-rotor tunnel passing method based on laser radar
CN115826596B (en) Intelligent thermal power plant chimney inspection method and system based on multi-rotor unmanned aerial vehicle
CN108932875B (en) Intelligent airplane berth indicating system with high safety performance
CN106327921A (en) Undercarriage safety monitoring method based on course line and visible data fusion
CN106128170A (en) A kind of aircraft berth intelligent indicating system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant