WO2021070308A1 - Unmanned aerial vehicle and control method for unmanned aerial vehicle - Google Patents
Unmanned aerial vehicle and control method for unmanned aerial vehicle
- Publication number: WO2021070308A1 (application PCT/JP2019/039916)
- Authority: WO (WIPO, PCT)
- Prior art keywords: unmanned aerial vehicle, skeleton, manhole, distance information
- Prior art date: 2019-10-09
Classifications
- G05D1/102: Simultaneous control of position or course in three dimensions, specially adapted for vertical take-off of aircraft
- B64C39/02: Aircraft not otherwise provided for, characterised by special use
- B64U10/13: Type of UAV: rotorcraft flying platforms
- B64U10/14: Flying platforms with four distinct rotor axes, e.g. quadcopters
- B64U20/87: Constructional aspects of UAVs: mounting of imaging devices, e.g. mounting of gimbals
- B64U30/20: Means for producing lift: rotors; rotor supports
- G01S15/87: Combinations of sonar systems
- G05D1/242: Determining position or orientation: means based on the reflection of waves generated by the vehicle
- G05D1/243: Determining position or orientation: means capturing signals occurring naturally from the environment, e.g. ambient optical, acoustic, gravitational or magnetic signals
- G05D1/245: Determining position or orientation using dead reckoning
- B64U2101/30: UAVs specially adapted for imaging, photography or videography
- B64U2201/10: UAVs with autonomous flight controls, i.e. navigating independently of ground or air stations, e.g. using inertial navigation systems [INS]
- G05D2105/89: Specific applications of the controlled vehicles: information gathering for inspecting structures, e.g. wind mills, bridges, buildings or vehicles
- G05D2107/50: Specific environments of the controlled vehicles: confined spaces, e.g. tanks, pipelines, tunnels or containers
- G05D2109/254: Types of controlled vehicles: flying platforms, e.g. multicopters
- G05D2111/10: Control signals: optical signals
- G05D2111/17: Control signals: coherent light, e.g. laser signals
- G05D2111/20: Control signals: acoustic signals, e.g. ultrasonic signals
Definitions
- the present invention relates to an unmanned aerial vehicle and a control method for an unmanned aerial vehicle used for inspecting the inside of a manhole.
- an unmanned aerial vehicle that estimates its own position using GPS (Global Positioning System) and flies autonomously is known.
- unmanned aerial vehicles are also known that estimate their own position and fly autonomously using Visual SLAM (Simultaneous Localization and Mapping) technology, which processes images acquired from cameras mounted on the aircraft (see, for example, Non-Patent Document 1).
- the purpose of the present disclosure, made in view of such circumstances, is to provide an unmanned aerial vehicle and a control method for an unmanned aerial vehicle capable of estimating the self-position with high accuracy even inside a manhole.
- the unmanned aerial vehicle is an unmanned aerial vehicle used for inspecting the inside of a manhole, and includes a camera sensor for photographing the manhole hole, a plurality of rangefinders for measuring the distance to the ground or to a predetermined surface inside the manhole, and a control unit that estimates the self-position based on recognition information of the manhole hole recognized from image information acquired from the camera sensor and distance information to the ground or the predetermined surface acquired from the rangefinders.
- the method for controlling an unmanned aerial vehicle controls an unmanned aerial vehicle that is used for inspecting the inside of a manhole and that includes a camera sensor for photographing the manhole hole and a plurality of rangefinders for measuring the distance to the ground or to a predetermined surface inside the manhole. The method includes a step of estimating the self-position, when the unmanned aerial vehicle flies from the ground portion to the neck portion, based on recognition information of the manhole hole recognized from image information acquired from the camera sensor and distance information to the ground acquired from the rangefinders, and a step of estimating the self-position, when the unmanned aerial vehicle flies from the neck portion to the skeleton portion, based on distance information to the wall surface of the neck portion acquired from the rangefinders.
- it further includes a step of estimating the self-position, when the unmanned aerial vehicle flies from the neck portion back to the ground portion, based on recognition information of the manhole hole recognized from image information acquired from the camera sensor and distance information to the ground acquired from the rangefinders.
- according to the present disclosure, it is possible to provide an unmanned aerial vehicle and a control method for an unmanned aerial vehicle capable of estimating the self-position with high accuracy even inside a manhole.
- the unmanned aerial vehicle 100 includes a main body 1, a propeller 2, a motor 3, an arm 4, and a leg 5.
- the main body 1 includes a plurality of camera sensors 10, a plurality of rangefinders 21 and 22, and a control unit 30.
- the main body 1 has a disk shape with a diameter of about 420 mm and a height of about 230 mm, and is assembled from a plate material made of CFRP (Carbon Fiber Reinforced Plastics).
- the propellers 2a, 2b, 2c, and 2d are rotationally driven by the motors 3a, 3b, 3c, and 3d attached to them, respectively, and generate lift.
- the arms 4a, 4b, 4c, and 4d are rod-shaped support members that extend in the horizontal direction to rotatably support the propellers 2a, 2b, 2c, and 2d.
- the legs 5a, 5b, 5c, and 5d are T-shaped support members that support the unmanned aerial vehicle 100 and prevent it from tipping over during takeoff and landing.
- the manhole 200 is a standard communication manhole.
- the manhole 200 includes a neck portion 210, a skeleton portion 220, an iron lid 230, a pipeline 240, and duct portions 250a and 250b.
- the skeleton portion 220 includes a ceiling portion 221, a floor portion 222, and a side wall portion 223.
- the inside of the manhole 200 is surrounded by a wall surface H of the neck portion 210, a ceiling surface R of the ceiling portion 221, a wall surface H of the side wall portion 223, a floor surface F of the floor portion 222 (or a water surface W of the accumulated water 300), and the like.
- the neck portion 210 has a substantially cylindrical shape having a diameter of about 60 cm and a height of about 60 cm, and is manufactured of reinforced concrete or the like.
- the height of the neck portion 210 refers to the distance between the ground A and the ceiling portion 221.
- the skeleton portion 220 has, for example, a substantially rectangular parallelepiped shape about 2.3 m long in the X-axis direction, about 1.3 m in the Y-axis direction, and about 1.5 m in the Z-axis direction, and is made of reinforced concrete or the like.
- Through holes connected to a plurality of pipelines 240 are formed in the side wall portion 223, and duct portions 250a and 250b are provided.
- the iron lid 230 has a substantially cylindrical shape and fits into the manhole hole C, which is the entrance/exit of the manhole 200.
- the manhole hole C is formed at the boundary between the ground portion 110 and the underground portion B. Communication cables and the like are laid in the plurality of pipelines 240.
- One or more camera sensors 10 are arranged in the vertical direction.
- in the following description, the unmanned aerial vehicle 100 is assumed to include two camera sensors 10V1 and 10V2 arranged in the vertical direction.
- the camera sensor 10V1 is arranged at the upper part in the vertical direction and photographs an object.
- the camera sensor 10V1 photographs the manhole hole C.
- the camera sensor 10V1 outputs the image information of the photographed object (for example, the manhole hole C) to the control unit 30. The operator can appropriately adjust the arrangement position of the camera sensor 10V1.
- the camera sensor 10V2 is arranged at the lower part in the vertical direction and photographs an object.
- the camera sensor 10V2 photographs the manhole hole C.
- the camera sensor 10V2 outputs the image information of the photographed object (for example, the manhole hole C) to the control unit 30. The operator can appropriately adjust the arrangement position of the camera sensor 10V2.
- the rangefinders 21 are, for example, ultrasonic sensors 21V1 and 21V2.
- the rangefinders 22 are, for example, laser sensors 22H1, 22H2, 22H3, 22H4, and 22V1.
- in the following description, the unmanned aerial vehicle 100 is assumed to include four laser sensors 22H1, 22H2, 22H3, and 22H4 in the horizontal direction, and two ultrasonic sensors 21V1 and 21V2 and one laser sensor 22V1 in the vertical direction.
- the ultrasonic sensor 21V1 is arranged at the upper part in the vertical direction and measures the distance to a target surface. For example, the ultrasonic sensor 21V1 measures the distance to the ceiling surface R inside the manhole 200. The ultrasonic sensor 21V1 outputs the measured distance information to the target surface (for example, the ceiling surface R) to the control unit 30. The operator can appropriately adjust the arrangement position of the ultrasonic sensor 21V1.
- the ultrasonic sensor 21V2 is arranged at the lower part in the vertical direction and measures the distance to a target surface. For example, the ultrasonic sensor 21V2 measures the distance to the ground A, to the floor surface F inside the manhole 200, or to the water surface W of the accumulated water 300 present in the skeleton portion 220. The ultrasonic sensor 21V2 outputs the measured distance information to the target surface (for example, the ground A, the floor surface F, or the water surface W) to the control unit 30. The operator can appropriately adjust the arrangement position of the ultrasonic sensor 21V2.
- the laser sensors 22H1, 22H2, 22H3, and 22H4 are arranged in the horizontal direction and measure the distance to a target surface.
- for example, the laser sensors 22H1, 22H2, 22H3, and 22H4 measure the distance to the wall surface H inside the manhole 200 (the wall surface H of the neck portion 210 or of the skeleton portion 220).
- the laser sensors 22H1, 22H2, 22H3, and 22H4 output the measured distance information to the target surface (for example, the wall surface H of the neck portion 210 or of the skeleton portion 220) to the control unit 30.
- the operator can appropriately adjust the arrangement positions of the laser sensors 22H1, 22H2, 22H3, and 22H4.
- the laser sensor 22V1 is arranged at the upper part in the vertical direction and measures the distance to a target surface. For example, the laser sensor 22V1 measures the distance to the ceiling surface R inside the manhole 200. The laser sensor 22V1 outputs the measured distance information to the target surface (for example, the ceiling surface R) to the control unit 30. The operator can appropriately adjust the arrangement position of the laser sensor 22V1.
- the control unit 30 is a small computer such as a Raspberry Pi (registered trademark), and controls each part of the unmanned aerial vehicle 100.
- the control unit 30 controls the above-mentioned various sensors according to the flight environment of the unmanned aerial vehicle 100 (for example, the ground portion 110, the neck portion 210, or the skeleton portion 220) and estimates the self-position based on the information acquired from these sensors.
- when acquiring image information of an object in the vertical direction, the control unit 30 controls the camera sensors 10V1 and 10V2 and acquires the image information of the object from them. For example, in the skeleton portion 220 and on the ground portion 110, the control unit 30 controls the camera sensors 10V1 and 10V2 and acquires image information of the manhole hole C from the camera sensors 10V1 and 10V2.
- when acquiring distance information to a target surface above the vehicle, the control unit 30 controls the ultrasonic sensor 21V1 or the laser sensor 22V1 and acquires the distance information from that sensor.
- for example, in the skeleton portion 220, the control unit 30 controls the ultrasonic sensor 21V1 or the laser sensor 22V1 and acquires the distance information to the ceiling surface R inside the manhole 200.
- when acquiring the horizontal distance to the manhole hole C in the skeleton portion 220 or on the ground portion 110, the control unit 30 controls the camera sensors 10V1 and 10V2, captures images with them, and recognizes the manhole hole C from the captured images.
- for example, on the ground portion 110, the control unit 30 controls the camera sensor 10V2, captures an image in the vertical direction, recognizes the manhole hole C from the captured image, and acquires horizontal distance information between the manhole hole C and the unmanned aerial vehicle.
- for example, in the skeleton portion 220, the control unit 30 controls the camera sensor 10V1, captures an image in the vertical direction, recognizes the manhole hole C from the captured image, and acquires horizontal distance information between the manhole hole C and the unmanned aerial vehicle.
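The description does not specify how the manhole hole C is recognized in the captured image. Since the opening appears as a roughly circular region, one plausible sketch uses OpenCV's Hough circle transform; the function below and all of its parameter values are illustrative assumptions, not part of the patent.

```python
# Hypothetical sketch: recognize the manhole hole C as a circle in a
# vertically captured image and return its pixel offset from the image
# center. The patent does not name a recognition algorithm; OpenCV's
# Hough circle transform is one plausible choice.
import cv2

def detect_manhole_offset(image_bgr):
    """Return the (dx, dy) pixel offset of the detected hole from the
    image center, or None if no circle is found."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)  # suppress sensor noise before detection
    circles = cv2.HoughCircles(
        gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=100,
        param1=100, param2=40, minRadius=30, maxRadius=0)
    if circles is None:
        return None
    x, y, _radius = circles[0][0]  # strongest circle candidate
    h, w = gray.shape
    return float(x) - w / 2.0, float(y) - h / 2.0
```

A nonzero offset means the vehicle is not directly above (or below) the hole, so the controller can translate horizontally until the offset vanishes.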
- when acquiring distance information to a target surface below the vehicle, the control unit 30 controls the ultrasonic sensor 21V2 and acquires the distance information from it.
- for example, on the ground portion 110, the control unit 30 controls the ultrasonic sensor 21V2 and acquires the distance information to the ground A.
- for example, in the skeleton portion 220, the control unit 30 controls the ultrasonic sensor 21V2 and acquires the distance information to the floor surface F inside the manhole 200.
- for example, in the skeleton portion 220, the control unit 30 controls the ultrasonic sensor 21V2 and acquires the distance information to the water surface W of the accumulated water 300 present in the skeleton portion 220.
- when acquiring distance information to a target surface in the horizontal direction, the control unit 30 controls the laser sensors 22H1, 22H2, 22H3, and 22H4 and acquires the distance information from them.
- for example, at the neck portion 210, the control unit 30 controls the laser sensors 22H1, 22H2, 22H3, and 22H4 and acquires from each laser sensor the distance information to the wall surface H of the neck portion 210 inside the manhole 200.
- for example, in the skeleton portion 220, the control unit 30 controls the laser sensors 22H1, 22H2, 22H3, and 22H4 and acquires from each laser sensor the distance information to the wall surface H of the skeleton portion 220 inside the manhole 200.
- the control unit 30 controls the various sensors described above according to the flight environment of the unmanned aerial vehicle 100.
- for example, the control unit 30 measures the distance to the ground A using the ultrasonic sensor 21V2, which is not easily affected by sunlight.
- for example, the control unit 30 measures the distance to the wall surface H of the neck portion 210 using the laser sensors 22H1, 22H2, 22H3, and 22H4, which are less prone to diffuse reflection.
- for example, the control unit 30 measures the distance to the water surface W using the ultrasonic sensor 21V2, which is not easily affected by the accumulated water 300.
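The selection rule described in the three examples above can be summarized as follows. This is a minimal sketch; the Environment enum and the dictionary layout are assumptions made for illustration, while the sensor names follow the description.

```python
# Hypothetical sketch of the environment-dependent sensor selection
# described above. Only the sensor-to-environment pairing comes from
# the patent; the data structures are assumptions.
from enum import Enum, auto

class Environment(Enum):
    GROUND = auto()    # ground portion 110
    NECK = auto()      # neck portion 210
    SKELETON = auto()  # skeleton portion 220

def select_sensors(env):
    """Return the sensors the control unit 30 would poll in each phase."""
    if env is Environment.GROUND:
        # ultrasonic 21V2 is used downward because it resists sunlight
        return {"down": ["ultrasonic 21V2"], "image": ["camera 10V2"]}
    if env is Environment.NECK:
        # lasers suffer less diffuse reflection on the neck wall H
        return {"horizontal": ["laser 22H1", "laser 22H2",
                               "laser 22H3", "laser 22H4"]}
    # skeleton 220: full suite; ultrasonic 21V2 also handles standing water
    return {"down": ["ultrasonic 21V2"],
            "up": ["ultrasonic 21V1", "laser 22V1"],
            "horizontal": ["laser 22H1", "laser 22H2",
                           "laser 22H3", "laser 22H4"],
            "image": ["camera 10V1"]}
```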
- the control unit 30 estimates the self-position by using the image information and the distance information acquired from the various sensors in combination. For example, on the ground portion 110, the control unit 30 estimates the self-position by combining the image information of the manhole hole C and the distance information to the ground A. For example, at the neck portion 210, the control unit 30 estimates the self-position using the distance information to the wall surface H of the neck portion 210. For example, in the skeleton portion 220, the control unit 30 estimates the self-position by combining the distance information to the ceiling surface R, the distance information to the wall surface H of the skeleton portion 220, and the distance information to the water surface W of the accumulated water 300 present in the skeleton portion 220. Because the control unit 30 appropriately controls the various sensors according to the flight environment of the unmanned aerial vehicle 100, the self-position can be estimated with high accuracy.
- the control unit 30 controls the flight of the unmanned aerial vehicle 100 based on the estimated self-position by controlling the driving of the motors 3a, 3b, 3c, and 3d and the rotation speed and rotation direction of the propellers 2a, 2b, 2c, and 2d.
- the control unit 30 controls the flight of the unmanned aerial vehicle 100 by, for example, PID (proportional-integral-derivative) control.
- because the control unit 30 estimates the self-position with high accuracy and controls the flight of the unmanned aerial vehicle 100 based on it, the unmanned aerial vehicle 100 can fly autonomously near the manhole 200 or inside the manhole 200 with high accuracy.
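The patent names PID control but gives no implementation. A minimal single-axis sketch follows; the class structure and all gain values are assumptions, not values from the patent.

```python
# Minimal single-axis PID sketch. The patent only names PID control;
# this structure and the gains below are illustrative assumptions.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        """Return the control output for the current error sample."""
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Illustrative use: hold an assumed 0.8 m above the water surface W,
# with the height measured by the ultrasonic sensor 21V2.
pid_z = PID(kp=1.2, ki=0.1, kd=0.4)   # placeholder gains
measured_height = 0.65                 # stand-in for a 21V2 reading (meters)
thrust_correction = pid_z.update(0.8 - measured_height, dt=0.02)
```

In practice one such controller per controlled axis (X, Y, Z, and yaw) would be driven by the error between the estimated self-position and the current setpoint.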
- a worker uses the unmanned aerial vehicle 100 to inspect the inside of the manhole 200. After opening the iron lid 230 of the manhole 200, the worker has the unmanned aerial vehicle 100 fly autonomously near the manhole 200 or inside the manhole 200, evacuates to a safe place such as a sidewalk, and checks the deterioration of the inside of the manhole 200 from the images taken by the unmanned aerial vehicle 100, thereby inspecting the inside of the manhole 200. As a result, the safety and efficiency of the inspection work can be improved.
- the control unit 30 appropriately controls the various sensors according to the flight environment of the unmanned aerial vehicle 100, acquires image information and distance information, and estimates the self-position. As a result, it is possible to realize an unmanned aerial vehicle 100 capable of estimating its own position with high accuracy even inside a manhole 200, where GPS cannot be used and where Visual SLAM technology cannot be used because of the narrow space.
- the unmanned aerial vehicle 100 flies autonomously based on the estimated self-position. Therefore, even if the accumulated water 300 is present in the skeleton portion 220, the unmanned aerial vehicle 100 can fly autonomously inside the manhole 200 with high accuracy while keeping a constant distance from the water surface W.
- a control method for the unmanned aerial vehicle 100 according to an embodiment of the present invention will be described with reference to FIGS. 3 and 4.
- in FIG. 4, the black circles indicate the arrangement positions of the various sensors (the positions of the unmanned aerial vehicle 100).
- the dotted arrows indicate the shooting directions of the camera sensors 10.
- the solid arrows indicate the measurement directions of the rangefinders 21 and 22.
- in step S1, the control unit 30 causes the unmanned aerial vehicle 100 to take off from the takeoff position.
- in step S2, the control unit 30 controls the camera sensor 10V2, acquires image information of the manhole hole C from it, and recognizes the manhole hole C. The control unit 30 also controls the ultrasonic sensor 21V2 and acquires the distance information to the ground A. The control unit 30 then recognizes the manhole hole C and the distance to the ground A based on the recognition information of the manhole hole C obtained from the image captured by the camera sensor 10V2 and the distance information to the ground A acquired from the ultrasonic sensor 21V2, and estimates the position of the unmanned aerial vehicle 100 on the ground portion 110. The control unit 30 then flies the unmanned aerial vehicle 100 from the takeoff position to the position directly above the manhole hole C. Even near the position directly above the manhole hole C, the control unit 30 can acquire the distance information to the ground A from the ultrasonic sensor 21V2 as long as the directivity of the ultrasonic sensor 21V2 is wide.
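Step S2 combines the recognized hole with the measured ground distance, but the description does not say how the two are fused. One common way, sketched below under the assumption of a pinhole camera model with a known focal length in pixels, converts the hole's pixel offset into a metric horizontal offset; the function name and the numbers are illustrative.

```python
# Hypothetical fusion of step S2's two measurements: the pixel offset of
# the hole C (from camera sensor 10V2) and the altitude above the ground
# A (from ultrasonic sensor 21V2). Assumes a pinhole camera model.
def pixel_offset_to_meters(dx_px, dy_px, altitude_m, focal_px):
    """Horizontal displacement of the vehicle from the point directly
    above the manhole hole C, in meters."""
    meters_per_pixel = altitude_m / focal_px  # ground-plane sampling scale
    return dx_px * meters_per_pixel, dy_px * meters_per_pixel

# e.g. a 120 px offset seen from 1.5 m altitude with a 600 px focal
# length corresponds to a 0.30 m displacement.
dx_m, dy_m = pixel_offset_to_meters(120.0, -45.0, 1.5, 600.0)
```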
- in step S3, the control unit 30 lowers the unmanned aerial vehicle 100 from the position directly above the manhole hole C.
- in step S4, the control unit 30 controls the laser sensors 22H1, 22H2, 22H3, and 22H4 and acquires the distance information to the wall surface H of the neck portion 210. Based on this distance information, the control unit 30 recognizes the distance to the wall surface H of the neck portion 210 and estimates the position of the unmanned aerial vehicle 100 in the neck portion 210. The control unit 30 then lowers the unmanned aerial vehicle 100 through the inside of the neck portion 210 from top to bottom.
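Inside the roughly 60 cm diameter neck portion 210, the four horizontal laser readings are enough to keep the vehicle centered while it descends. A sketch follows; the assignment of the sensors 22H1 to 22H4 to the four directions is an assumption.

```python
# Hypothetical centering computation in the cylindrical neck portion 210.
# Assumes the lasers 22H1-22H4 point forward, backward, left, and right.
def neck_offset(d_front, d_back, d_left, d_right):
    """Offset (x, y) of the vehicle from the neck's center axis, in
    meters. Positive x means the vehicle sits closer to the front wall."""
    x = (d_back - d_front) / 2.0
    y = (d_right - d_left) / 2.0
    return x, y

# When centered, opposing readings are equal; unequal pairs indicate
# drift toward one wall that the controller should correct.
print(neck_offset(0.10, 0.14, 0.13, 0.11))  # approximately (0.02, -0.01)
```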
- in step S5, the control unit 30 controls the camera sensor 10V1, acquires image information of the manhole hole C from it, and recognizes the manhole hole C. The control unit 30 also controls the laser sensors 22H1, 22H2, 22H3, and 22H4 and acquires the distance information to the wall surface H of the skeleton portion 220. If the accumulated water 300 is present in the skeleton portion 220, the control unit 30 controls the ultrasonic sensor 21V2 and acquires the distance information to the water surface W of the accumulated water 300; if not, the control unit 30 controls the ultrasonic sensor 21V2 and acquires the distance information to the floor surface F of the skeleton portion 220. Based on the recognition information of the manhole hole C obtained from the image captured by the camera sensor 10V1, the distance information to the wall surface H of the skeleton portion 220 acquired from the laser sensors, and the distance information to the water surface W (or the floor surface F) acquired from the ultrasonic sensor 21V2, the control unit 30 recognizes the distance to the wall surface H of the skeleton portion 220 and the distance to the water surface W (or the floor surface F), and estimates the position of the unmanned aerial vehicle 100 in the skeleton portion 220. The control unit 30 then stops the unmanned aerial vehicle 100 at the position directly below the manhole hole C.
- in step S6, the control unit 30 controls the camera sensor 10V1, captures an image with it, and recognizes the manhole hole C from the captured image. The control unit 30 also controls the laser sensors 22H1, 22H2, 22H3, and 22H4 and acquires the distance information to the wall surface H of the skeleton portion 220. If the accumulated water 300 is present in the skeleton portion 220, the control unit 30 controls the ultrasonic sensor 21V2 and acquires the distance information to the water surface W; if not, it acquires the distance information to the floor surface F. Based on the recognition information of the manhole hole C acquired from the camera sensor 10V1, the distance information to the wall surface H of the skeleton portion 220 acquired from the laser sensors, and the distance information to the water surface W (or the floor surface F) acquired from the ultrasonic sensor 21V2, the control unit 30 recognizes the manhole hole C, the distance to the wall surface H of the skeleton portion 220, and the distance to the water surface W (or the floor surface F), and estimates the position of the unmanned aerial vehicle 100 in the skeleton portion 220. The control unit 30 then flies the unmanned aerial vehicle 100 freely inside the skeleton portion 220, ascending, descending, and turning.
- while flying in the skeleton portion 220, the control unit 30 also controls the ultrasonic sensor 21V1 or the laser sensor 22V1 and acquires the distance information to the ceiling surface R of the skeleton portion 220, in addition to the manhole hole recognition by the camera sensor 10V1 and the wall-distance measurements by the laser sensors 22H1, 22H2, 22H3, and 22H4. If the accumulated water 300 is present in the skeleton portion 220, the control unit 30 controls the ultrasonic sensor 21V2 and acquires the distance information to the water surface W; if not, it acquires the distance information to the floor surface F. Based on the recognition information of the manhole hole C acquired from the camera sensor 10V1, the distance information to the wall surface H of the skeleton portion 220 acquired from the laser sensors 22H1, 22H2, 22H3, and 22H4, the distance information to the ceiling surface R acquired from the ultrasonic sensor 21V1 or the laser sensor 22V1, and the distance information to the water surface W (or the floor surface F) acquired from the ultrasonic sensor 21V2, the control unit 30 recognizes the manhole hole C, the distance to the wall surface H, the distance to the ceiling surface R, and the distance to the water surface W (or the floor surface F), and estimates the position of the unmanned aerial vehicle 100 in the skeleton portion 220. The control unit 30 then flies the unmanned aerial vehicle 100 freely inside the skeleton portion 220, ascending, descending, and turning.
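In the skeleton portion 220, whose plan dimensions are known (about 2.3 m by 1.3 m), the laser and ultrasonic distances fix the vehicle's position against the walls. The sketch below is one possible formulation; the axis-to-sensor pairing, the corner-origin frame, the airframe allowance, and the residual threshold are all assumptions.

```python
# Hypothetical position estimate in the rectangular skeleton portion 220
# from the horizontal lasers 22H1-22H4 and a downward rangefinder,
# assuming the sensors sit at the body's outer edge.
SKELETON_X_M = 2.3  # wall-to-wall length along the X axis (approx.)
SKELETON_Y_M = 1.3  # wall-to-wall length along the Y axis (approx.)
AIRFRAME_M = 0.42   # body diameter of the unmanned aerial vehicle 100

def skeleton_position(d_xp, d_xn, d_yp, d_yn, d_down):
    """(x, y, z) with the origin at one floor corner; d_xp/d_xn are the
    distances to the walls in the +X/-X directions (likewise for Y), and
    d_down is the distance to the floor F or the water surface W."""
    # Opposing pairs should sum to the room size minus the airframe; a
    # large residual suggests an open duct or a bad reading, so the
    # sample is rejected rather than fused.
    if abs((d_xp + d_xn) - (SKELETON_X_M - AIRFRAME_M)) > 0.3:
        return None
    if abs((d_yp + d_yn) - (SKELETON_Y_M - AIRFRAME_M)) > 0.3:
        return None
    # position of the airframe center relative to the -X/-Y walls
    return d_xn + AIRFRAME_M / 2.0, d_yn + AIRFRAME_M / 2.0, d_down
```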
- in step S7, the control unit 30 controls the camera sensor 10V1, captures an image with it, and recognizes the manhole hole C from the captured image. The control unit 30 also controls the laser sensors 22H1, 22H2, 22H3, and 22H4 and acquires the distance information to the wall surface H of the skeleton portion 220. If the accumulated water 300 is present in the skeleton portion 220, the control unit 30 controls the ultrasonic sensor 21V2 and acquires the distance information to the water surface W; if not, it acquires the distance information to the floor surface F. Based on the recognition information of the manhole hole C acquired from the camera sensor 10V1, the distance information to the wall surface H of the skeleton portion 220 acquired from the laser sensors, and the distance information to the water surface W (or the floor surface F) acquired from the ultrasonic sensor 21V2, the control unit 30 recognizes the distance to the wall surface H of the skeleton portion 220 and the distance to the water surface W (or the floor surface F), and estimates the position of the unmanned aerial vehicle 100 in the skeleton portion 220. The control unit 30 then stops the unmanned aerial vehicle 100 at the position directly below the manhole hole C.
- in step S8, the control unit 30 raises the unmanned aerial vehicle 100 from the position directly below the manhole hole C.
- in step S9, the control unit 30 controls the laser sensors 22H1, 22H2, 22H3, and 22H4 and acquires the distance information to the wall surface H of the neck portion 210. Based on this distance information, the control unit 30 recognizes the distance to the wall surface H of the neck portion 210 and estimates the position of the unmanned aerial vehicle 100 in the neck portion 210. The control unit 30 then raises the unmanned aerial vehicle 100 through the inside of the neck portion 210 from bottom to top.
- in step S10, the control unit 30 controls the camera sensor 10V2, captures an image with it, and recognizes the manhole hole C from the captured image. The control unit 30 also controls the ultrasonic sensor 21V2 and acquires the distance information to the ground A. Based on the recognition information of the manhole hole C acquired from the camera sensor 10V2 and the distance information to the ground A acquired from the ultrasonic sensor 21V2, the control unit 30 recognizes the manhole hole C and the distance to the ground A and estimates the position of the unmanned aerial vehicle 100 in the neck portion 210 and on the ground portion 110. The control unit 30 then raises the unmanned aerial vehicle 100 from the upper part of the neck portion 210 to the position directly above the manhole hole C. Even near the position directly above the manhole hole C, the control unit 30 can acquire the distance information to the ground A from the ultrasonic sensor 21V2 as long as the directivity of the ultrasonic sensor 21V2 is wide.
- in step S11, the control unit 30 controls the camera sensor 10V2, captures an image with it, and recognizes the manhole hole C from the captured image. The control unit 30 also controls the ultrasonic sensor 21V2 and acquires the distance information to the ground A. Based on the recognition information of the manhole hole C acquired from the camera sensor 10V2 and the distance information to the ground A acquired from the ultrasonic sensor 21V2, the control unit 30 recognizes the manhole hole C and the distance to the ground A and estimates the position of the unmanned aerial vehicle 100 on the ground portion 110. The control unit 30 then flies the unmanned aerial vehicle 100 from the position directly above the manhole hole C to a position away from it.
- in step S12, the control unit 30 lands the unmanned aerial vehicle 100 at the landing position.
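Steps S1 through S12 form a fixed sequence. The sketch below restates that sequence as data so the overall mission profile can be read at a glance; the sensor pairings per step follow the description above, while the structure itself is an editorial assumption.

```python
# The S1-S12 mission profile as a simple ordered list (an editorial
# restatement of the steps above, not a structure from the patent).
MISSION = [
    ("S1",  "take off from the takeoff position"),
    ("S2",  "camera 10V2 + ultrasonic 21V2: fly to directly above hole C"),
    ("S3",  "descend from directly above hole C"),
    ("S4",  "lasers 22H1-22H4: descend centered inside neck 210"),
    ("S5",  "camera 10V1 + lasers + 21V2: stop directly below hole C"),
    ("S6",  "full sensor suite: fly freely inside skeleton 220"),
    ("S7",  "re-acquire hole C and stop directly below it"),
    ("S8",  "ascend from directly below hole C"),
    ("S9",  "lasers 22H1-22H4: ascend centered inside neck 210"),
    ("S10", "camera 10V2 + 21V2: rise to directly above hole C"),
    ("S11", "translate away from directly above hole C"),
    ("S12", "land at the landing position"),
]

for step, action in MISSION:
    print(f"{step}: {action}")
```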
- in this way, the control unit 30 appropriately controls the camera sensors 10, the ultrasonic sensors 21, and the laser sensors 22 arranged on the unmanned aerial vehicle 100 throughout the flight: from the takeoff position to the ground portion 110; from the ground portion 110 to the position directly above the manhole hole C; down through the inside of the neck portion 210; stopping directly below the manhole hole C in the skeleton portion 220; flying inside the skeleton portion 220; stopping again directly below the manhole hole C; up through the inside of the neck portion 210; from the neck portion 210 to the position directly above the manhole hole C; from there to a position away from the manhole hole C; and finally to the landing position. It acquires image information and distance information and estimates the self-position by using these pieces of information in combination.
- as a result, it is possible to realize a control method for the unmanned aerial vehicle 100 that can estimate its own position with high accuracy even inside a manhole 200, where GPS cannot be used and where Visual SLAM technology cannot be used because of the narrow space.
- the control unit 30 flies the unmanned aerial vehicle 100 autonomously based on the estimated self-position. Therefore, the unmanned aerial vehicle 100 can autonomously recognize the manhole hole C on the ground portion 110, enter through it, descend the neck portion 210, fly through the skeleton portion 220, recognize the manhole hole C again from within the skeleton portion 220, ascend the neck portion 210, and exit from the manhole hole C, all near or inside the manhole 200. As a result, a worker can use the unmanned aerial vehicle 100 to inspect the inside of the manhole 200.
Description
The configuration of the unmanned aerial vehicle 100 according to an embodiment of the present invention will be described with reference to FIGS. 1 and 2.
The control unit 30 controls the camera sensors 10V1 and 10V2 and acquires image information of an object from them. For example, in the skeleton portion 220 and on the ground portion 110, the control unit 30 controls the camera sensors 10V1 and 10V2 and acquires image information of the manhole hole C from them.
A control method for the unmanned aerial vehicle 100 according to an embodiment of the present invention will be described with reference to FIGS. 3 and 4. In FIG. 4, the black circles indicate the arrangement positions of the various sensors (the positions of the unmanned aerial vehicle 100). The dotted arrows indicate the shooting directions of the camera sensors 10. The solid arrows indicate the measurement directions of the rangefinders 21 and 22.
In step S1, the control unit 30 causes the unmanned aerial vehicle 100 to take off from the takeoff position.
In step S4, the control unit 30 controls the laser sensors 22H1, 22H2, 22H3, and 22H4 and acquires the distance information to the wall surface H of the neck portion 210. Based on the distance information to the wall surface H of the neck portion 210 acquired from the laser sensors 22H1, 22H2, 22H3, and 22H4, the control unit 30 recognizes the distance to the wall surface H of the neck portion 210 and estimates the position of the unmanned aerial vehicle 100 in the neck portion 210. The control unit 30 then lowers the unmanned aerial vehicle 100 through the inside of the neck portion 210 from top to bottom.
In step S5, the control unit 30 controls the camera sensor 10V1, acquires image information of the manhole hole C from it, and recognizes the manhole hole C. The control unit 30 also controls the laser sensors 22H1, 22H2, 22H3, and 22H4 and acquires the distance information to the wall surface H of the skeleton portion 220. If the accumulated water 300 is present in the skeleton portion 220, the control unit 30 controls the ultrasonic sensor 21V2 and acquires the distance information to the water surface W of the accumulated water 300; if not, the control unit 30 controls the ultrasonic sensor 21V2 and acquires the distance information to the floor surface F of the skeleton portion 220. Based on the recognition information of the manhole hole C obtained from the image captured by the camera sensor 10V1, the distance information to the wall surface H of the skeleton portion 220 acquired from the laser sensors 22H1, 22H2, 22H3, and 22H4, and the distance information to the water surface W (or the floor surface F) acquired from the ultrasonic sensor 21V2, the control unit 30 recognizes the distance to the wall surface H of the skeleton portion 220 and the distance to the water surface W (or the floor surface F), and estimates the position of the unmanned aerial vehicle 100 in the skeleton portion 220. The control unit 30 then stops the unmanned aerial vehicle 100 at the position directly below the manhole hole C.
In step S9, the control unit 30 controls the laser sensors 22H1, 22H2, 22H3, and 22H4 and acquires the distance information to the wall surface H of the neck portion 210. Based on this distance information, the control unit 30 recognizes the distance to the wall surface H of the neck portion 210 and estimates the position of the unmanned aerial vehicle 100 in the neck portion 210. The control unit 30 then raises the unmanned aerial vehicle 100 through the inside of the neck portion 210 from bottom to top.
In step S10, the control unit 30 controls the camera sensor 10V2, captures an image with it, and recognizes the manhole hole C from the captured image. The control unit 30 also controls the ultrasonic sensor 21V2 and acquires the distance information to the ground A. Based on the recognition information of the manhole hole C acquired from the camera sensor 10V2 and the distance information to the ground A acquired from the ultrasonic sensor 21V2, the control unit 30 recognizes the manhole hole C and the distance to the ground A and estimates the position of the unmanned aerial vehicle 100 in the neck portion 210 and on the ground portion 110. The control unit 30 then raises the unmanned aerial vehicle 100 from the upper part of the neck portion 210 to the position directly above the manhole hole C. Even near the position directly above the manhole hole C, the control unit 30 can acquire the distance information to the ground A from the ultrasonic sensor 21V2 as long as the directivity of the ultrasonic sensor 21V2 is wide.
In step S11, the control unit 30 controls the camera sensor 10V2, captures an image with it, and recognizes the manhole hole C from the captured image. The control unit 30 also controls the ultrasonic sensor 21V2 and acquires the distance information to the ground A. Based on the recognition information of the manhole hole C acquired from the camera sensor 10V2 and the distance information to the ground A acquired from the ultrasonic sensor 21V2, the control unit 30 recognizes the manhole hole C and the distance to the ground A and estimates the position of the unmanned aerial vehicle 100 on the ground portion 110. The control unit 30 then flies the unmanned aerial vehicle 100 from the position directly above the manhole hole C on the ground portion 110 to a position away from it.
2a, 2b, 2c, 2d propellers
3a, 3b, 3c, 3d motors
4a, 4b, 4c, 4d arms
5a, 5b, 5c, 5d legs
10V1, 10V2 camera sensors
21, 22 rangefinders
21V1, 21V2 ultrasonic sensors
22H1, 22H2, 22H3, 22H4, 22V1 laser sensors
30 control unit
100 unmanned aerial vehicle
110 ground portion
200 manhole
210 neck portion
220 skeleton portion
221 ceiling portion
222 floor portion
223 side wall portion
230 iron lid
240 pipelines
250a, 250b duct portions
Claims (6)
- An unmanned aerial vehicle used for inspecting the inside of a manhole, comprising:
a camera sensor for photographing a manhole hole;
a plurality of rangefinders for measuring the distance to the ground or to a predetermined surface inside the manhole; and
a control unit that estimates the self-position based on recognition information of the manhole hole recognized from image information acquired from the camera sensor and distance information to the ground or the predetermined surface acquired from the rangefinders.
- The unmanned aerial vehicle according to claim 1, wherein the rangefinders include an ultrasonic sensor or a laser sensor.
- The unmanned aerial vehicle according to claim 2, wherein the ultrasonic sensor, the laser sensor, and the camera sensor are arranged at the upper part in the vertical direction,
the ultrasonic sensor and the camera sensor are arranged at the lower part in the vertical direction, and
a plurality of the laser sensors are arranged in the horizontal direction.
- The unmanned aerial vehicle according to any one of claims 1 to 3, wherein the control unit estimates the self-position by using the recognition information and the distance information in combination.
- The unmanned aerial vehicle according to any one of claims 1 to 4, wherein the control unit:
acquires, when the unmanned aerial vehicle flies from a ground portion to a neck portion, image information of the manhole hole from the camera sensor, recognizes the manhole hole from the image information, and acquires distance information to the ground from the rangefinders;
acquires, when the unmanned aerial vehicle flies from the neck portion to a skeleton portion, distance information to a wall surface of the neck portion from the rangefinders;
acquires, when the unmanned aerial vehicle flies in the skeleton portion, image information of the manhole hole from the camera sensor, recognizes the manhole hole from the image information, and acquires from the rangefinders distance information to a wall surface of the skeleton portion, distance information to a ceiling surface of the skeleton portion, and distance information to a water surface of accumulated water present in the skeleton portion;
acquires, when the unmanned aerial vehicle flies from the skeleton portion to the neck portion, distance information to the wall surface of the neck portion from the rangefinders; and
acquires, when the unmanned aerial vehicle flies from the neck portion to the ground portion, image information of the manhole hole from the camera sensor, recognizes the manhole hole from the image information, and acquires distance information to the ground from the rangefinders.
- A control method for an unmanned aerial vehicle that is used for inspecting the inside of a manhole and that comprises a camera sensor for photographing a manhole hole and a plurality of rangefinders for measuring the distance to the ground or to a predetermined surface inside the manhole, the method comprising:
a step of estimating, when the unmanned aerial vehicle flies from a ground portion to a neck portion, the self-position based on recognition information of the manhole hole recognized from image information acquired from the camera sensor and distance information to the ground acquired from the rangefinders;
a step of estimating, when the unmanned aerial vehicle flies from the neck portion to a skeleton portion, the self-position based on distance information to a wall surface of the neck portion acquired from the rangefinders;
a step of estimating, when the unmanned aerial vehicle flies in the skeleton portion, the self-position based on recognition information of the manhole hole recognized from image information acquired from the camera sensor, and on distance information to a wall surface of the skeleton portion, distance information to a ceiling surface of the skeleton portion, and distance information to a water surface of accumulated water present in the skeleton portion acquired from the rangefinders;
a step of estimating, when the unmanned aerial vehicle flies from the skeleton portion to the neck portion, the self-position based on distance information to the wall surface of the neck portion acquired from the rangefinders; and
a step of estimating, when the unmanned aerial vehicle flies from the neck portion to the ground portion, the self-position based on recognition information of the manhole hole recognized from image information acquired from the camera sensor and distance information to the ground acquired from the rangefinders.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2019/039916 (WO2021070308A1) | 2019-10-09 | 2019-10-09 | Unmanned aerial vehicle and control method for unmanned aerial vehicle |
| US17/767,834 | 2019-10-09 | 2019-10-09 | Unmanned aerial vehicle and control method therefor |
| JP2021551025A | 2019-10-09 | 2019-10-09 | Unmanned aerial vehicle and control method for unmanned aerial vehicle |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2019/039916 (WO2021070308A1) | 2019-10-09 | 2019-10-09 | Unmanned aerial vehicle and control method for unmanned aerial vehicle |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2021070308A1 | 2021-04-15 |
Family
ID=75438079
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2019/039916 (WO2021070308A1) | Unmanned aerial vehicle and control method for unmanned aerial vehicle | 2019-10-09 | 2019-10-09 |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US20240085928A1 |
| JP (1) | JP7324388B2 |
| WO (1) | WO2021070308A1 |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20240199246A1 * | 2021-01-13 | 2024-06-20 | Hardshell Labs, Inc. | External cage for unmanned aerial vehicle |
Family Cites Families (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6666095B2 * | 2001-11-30 | 2003-12-23 | The Regents Of The University Of California | Ultrasonic pipe assessment |
| JP6235213B2 | 2013-01-31 | 2017-11-22 | セコム株式会社 | Autonomous flying robot |
| US20160162743A1 * | 2014-12-05 | 2016-06-09 | Magna Electronics Inc. | Vehicle vision system with situational fusion of sensor data |
| JP6439439B2 * | 2014-12-24 | 2018-12-19 | 株式会社リコー | Information processing device, image processing device, program, and user interface |
| JP6596235B2 | 2015-05-22 | 2019-10-23 | 株式会社日立製作所 | Inspection system for sewer pipeline facilities |
| WO2018061823A1 | 2016-09-30 | 2018-04-05 | 日本電産株式会社 | Multicopter control system |
| JP2019036269A | 2017-08-10 | 2019-03-07 | ミスギ工業株式会社 | Flight control method for a small unmanned aerial vehicle, and method for inspecting the condition of an internal space and its wall surfaces |
| JP6904895B2 | 2017-12-08 | 2021-07-21 | 日立Geニュークリア・エナジー株式会社 | Position estimation device and position estimation method |
| WO2019123558A1 | 2017-12-20 | 2019-06-27 | 株式会社日立製作所 | Self-position estimation system |
| US10954648B1 * | 2018-09-16 | 2021-03-23 | Michael D Blackshaw | Multi-sensor manhole survey |
Application events:
- 2019-10-09: US application 17/767,834 filed (published as US20240085928A1, pending)
- 2019-10-09: JP application 2021-551025 filed (granted as JP7324388B2, active)
- 2019-10-09: PCT application PCT/JP2019/039916 filed (published as WO2021070308A1)
Patent Citations (6)
| Publication number | Priority date | Publication date | Title |
|---|---|---|---|
| JP2007113240A * | 2005-10-19 | 2007-05-10 | Method and device for robot self-position detection using a camera sensor and a distance sensor |
| WO2017199940A1 * | 2016-05-16 | 2017-11-23 | Device and computer program for inspecting the inner wall of a pipeline |
| JP2017226259A * | 2016-06-21 | 2017-12-28 | Pipeline facility inspection aircraft and pipeline facility inspection system using it |
| WO2018163699A1 * | 2017-03-06 | 2018-09-13 | Flight vehicle control system, control device therefor, and marker unit |
| WO2019139172A1 * | 2018-01-15 | 2019-07-18 | Information processing system |
| WO2019190398A1 * | 2018-03-26 | 2019-10-03 | Aerial vehicles, methods of imaging a tunnel and methods of imaging a shaft |
Non-Patent Citations (2)
| Title |
|---|
| MOY DE VITRY, MATTHEW et al., "Sewer Inlet Localization in UAV Image Clouds: Improving Performance with Multiview Detection", Remote Sensing, vol. 10, no. 5, 706, 4 May 2018, XP055816703 * |
| TAN, C. H. et al., "A smart unmanned aerial vehicle (UAV) based imaging system for inspection of deep hazardous tunnels", Water Practice and Technology, vol. 13, no. 4, 1 December 2018, pages 991-1000, XP055816699 * |
Also Published As
| Publication number | Publication date |
|---|---|
| US20240085928A1 | 2024-03-14 |
| JP7324388B2 | 2023-08-10 |
| JPWO2021070308A1 | 2021-04-15 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19948749; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2021551025; Country of ref document: JP; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 17767834; Country of ref document: US |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 19948749; Country of ref document: EP; Kind code of ref document: A1 |