WO2022153391A1 - Estimation system for estimating self-location and/or attitude of unmanned aerial vehicle, flight control system, unmanned aerial vehicle, program, and recording medium - Google Patents

Estimation system for estimating self-location and/or attitude of unmanned aerial vehicle, flight control system, unmanned aerial vehicle, program, and recording medium Download PDF

Info

Publication number
WO2022153391A1
WO2022153391A1 (PCT/JP2021/000814)
Authority
WO
WIPO (PCT)
Prior art keywords
cross
unmanned aerial
aerial vehicle
coordinate system
section
Prior art date
Application number
PCT/JP2021/000814
Other languages
French (fr)
Japanese (ja)
Inventor
ティトゥス ヴォイタラ
ビンセント テオドラス テジャウィリヤ
ニクラス ベリストロム
クリストファー トーマス ラービ
Original Assignee
株式会社Acsl
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Acsl
Priority to JP2022574908A priority Critical patent/JPWO2022153391A1/ja
Priority to PCT/JP2021/000814 priority patent/WO2022153391A1/en
Publication of WO2022153391A1 publication Critical patent/WO2022153391A1/en

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C13/00Control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers
    • B64C13/02Initiating means
    • B64C13/16Initiating means actuated automatically, e.g. responsive to gust detectors
    • B64C13/18Initiating means actuated automatically, e.g. responsive to gust detectors using automatic pilot
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00Type of UAV
    • B64U10/10Rotorcrafts
    • B64U10/13Flying platforms
    • B64U10/16Flying platforms with five or more distinct rotor axes, e.g. octocopters
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U30/00Means for producing lift; Empennages; Arrangements thereof
    • B64U30/20Rotors; Rotor supports

Definitions

  • the present invention relates to an estimation system for estimating the self-position and / or attitude of an unmanned aerial vehicle, a flight control system, an unmanned aerial vehicle, a program, and a recording medium.
  • A technique has been proposed that uses self-position estimation by laser SLAM as an alternative to GPS-based self-position estimation (see Non-Patent Document 1 below).
  • Laser SLAM is a technique that uses LIDAR, which irradiates the surroundings with laser light and measures the distance to each irradiated target from the time until the reflected light is received, and estimates the self-position and a map in parallel based on the LIDAR output.
  • In self-position estimation by laser SLAM, the self-position is estimated by referring to the map on the basis of the measurement data from the LIDAR.
  • However, the inner wall surface of a vertically extending tunnel (tank) often has a rotationally symmetric shape such as a circle, so the map itself is also rotationally symmetric, making it difficult to estimate the self-position with high accuracy.
  • One object of the present invention is therefore to provide a system and method for estimating the position and/or attitude of an unmanned aerial vehicle when it autonomously flies in the internal space of a structure having a rotationally symmetric cross section, such as a vertically extending tunnel (tank).
  • One aspect of the present invention is an estimation system that estimates the self-position and/or attitude of an unmanned aerial vehicle. The unmanned aerial vehicle includes a cross-section measuring device that measures, within a cross section in a direction crossing the vehicle's longitudinal flight path, the spatial shape with reference to a predetermined direction of the vehicle and generates cross-section measurement data.
  • The estimation system includes a cross-section measurement data acquisition unit that acquires the cross-section measurement data generated by the cross-section measuring device while the vehicle flies in the internal space of a structure in which a vertically extending internal space is formed and a vertically extending internal structure is provided.
  • The estimation system further includes a relative coordinate system calculation unit that matches a rotationally symmetric first predetermined shape, corresponding to the cross-sectional shape of the inner wall surface of the structure, with the acquired cross-section measurement data; estimates, in a relative coordinate system referenced to the vehicle within the cross section, a first position of a first feature point that is the center of the matched first predetermined shape; estimates, based on the acquired cross-section measurement data, a second position of a second feature point related to the internal structure in the relative coordinate system; and calculates the direction of the second position with respect to the first position in the relative coordinate system.
  • The estimation system can further include an in-cross-section position/attitude estimation unit that estimates the self-position and/or attitude of the unmanned aerial vehicle in an absolute coordinate system whose reference axis is the direction, within the cross section, of the point corresponding to the second feature point of the internal structure relative to the point corresponding to the first feature point of the structure.
  • The in-cross-section position/attitude estimation unit can be configured to transform the relative coordinate system into the absolute coordinate system so that the direction of the second position with respect to the first position in the relative coordinate system corresponds to the reference axis, and thereby estimate the heading direction of the unmanned aerial vehicle in the absolute coordinate system.
  • The in-cross-section position/attitude estimation unit can also be configured to transform the coordinates so that the direction of the second position with respect to the first position in the relative coordinate system corresponds to the reference axis and the first position in the relative coordinate system corresponds to the coordinates, in the absolute coordinate system, of the first feature point of the structure, and thereby estimate the coordinates of the unmanned aerial vehicle in the absolute coordinate system.
  • The first predetermined shape can be a circle, an ellipse, or a rectangle.
  • The relative coordinate system calculation unit can identify a gap region in which no data exist in the vicinity of the matched first predetermined shape in the cross-section measurement data, set a straight line that substantially bisects the angle range of the gap region with respect to the unmanned aerial vehicle, and estimate, as the second position, the intersection of that line with a circle centered on the first feature point whose radius is the preset distance between the first feature point and the second feature point.
  • Alternatively, the relative coordinate system calculation unit can match a second predetermined shape with the portion of the cross-section measurement data that corresponds to the internal structure, and estimate the relative coordinates of a predetermined point of the matched second predetermined shape as the second position.
  • The cross-section measurement data can be discrete data, and the relative coordinate system calculation unit can remove outliers in the discrete data before performing matching.
  • One aspect of the present invention provides a flight control system for an unmanned aerial vehicle, including the above estimation system and a flight control unit that flies the unmanned aerial vehicle along a set flight plan route based on the self-position and/or attitude of the unmanned aerial vehicle estimated by the estimation system.
  • One aspect of the present invention provides an unmanned aerial vehicle having the above estimation system.
  • One aspect of the present invention provides an unmanned aerial vehicle having the above flight control system.
  • One aspect of the present invention is an estimation method in which the self-position and/or attitude of an unmanned aerial vehicle is estimated by a computer, the unmanned aerial vehicle including a cross-section measuring device that measures, within a cross section in a direction crossing the vehicle's longitudinal flight path, the spatial shape with reference to a predetermined direction of the vehicle and generates cross-section measurement data. The method includes: a cross-section measurement data acquisition step of acquiring the cross-section measurement data generated by the cross-section measuring device while the vehicle flies in the internal space of a structure in which a vertically extending internal space is formed and a vertically extending internal structure is provided in that space; a first feature point estimation step of matching a rotationally symmetric first predetermined shape, corresponding to the cross-sectional shape of the inner wall surface of the structure, with the acquired cross-section measurement data and estimating, in a relative coordinate system referenced to the vehicle within the cross section, a first position of the first feature point of the matched shape; a second feature point estimation step of estimating a second position of a second feature point related to the internal structure in the relative coordinate system; and a calculation step of calculating the direction of the second position with respect to the first position in the relative coordinate system.
  • One aspect of the present invention provides a program for causing a computer to execute the above method.
  • One aspect of the present invention provides a computer-readable recording medium on which the above program is recorded.
  • According to the present invention, the position and/or attitude of an unmanned aerial vehicle can be estimated when the vehicle autonomously flies in the internal space of a structure having a rotationally symmetric cross section, such as a vertically extending tunnel (tank).
  • FIG. 5B is a cross-sectional view taken along the line B-B in FIG. 5A.
  • the unmanned aerial vehicle of the present invention is not limited to the multicopter shown in FIG. 1, and may be any unmanned aerial vehicle such as a rotary wing aircraft and a fixed wing aircraft.
  • the system configuration of the unmanned aerial vehicle 1 is not limited to that shown in the figure, and any configuration can be adopted as long as the same operation is possible.
  • For example, the function of the communication circuit may be integrated into the flight control unit, an operation executed by a plurality of components may be executed by a single component, the function of the main calculation unit may be distributed to a plurality of calculation units, or an operation performed by a single component may be performed by a plurality of components.
  • Likewise, the various data stored in the memory of the unmanned aerial vehicle 1 may be stored in a different location; one type of information may be stored distributed across a plurality of memories, or a plurality of types of information may be stored together in one memory.
  • the shape of the tunnel is known from a design drawing or the like.
  • The internal space of the tunnel has an inner wall surface that is circular in horizontal cross section, and the same cross-sectional shape extends in the vertical direction.
  • a pipe having a circular horizontal cross section extends in the vertical direction.
  • FIG. 1 is an external view of a multicopter which is an example of an unmanned aerial vehicle (multicopter) 1 according to an embodiment of the present invention.
  • The unmanned aerial vehicle 1 includes a control unit 101, six motors 102 driven by control signals from the control unit 101, six rotors (rotary wings) 103 that are rotated by driving the respective motors 102 to generate lift, six arms 104 connecting the control unit 101 and the respective motors 102, and a landing gear 105 that supports the unmanned aerial vehicle at the time of landing.
  • The number of motors 102, rotors 103, and arms 104 may each be three or more, for example three, four, and so on.
  • The six motors 102 are rotated by control signals from the control unit 101, and by controlling the rotation speed of each of the six rotors 103, the flight of the unmanned aerial vehicle 1 is controlled, including ascending, descending, flying forward, backward, left and right, and turning.
  • a pedestal 106 is attached above the control unit 101.
  • a camera 108 for photographing an object with high resolution is attached to the pedestal 106 via a support member 107 that rotatably supports the camera 108.
  • a cross section measuring device (horizontal cross section LIDAR) 109 for measuring the shape of the surrounding space with respect to the horizontal cross section (horizontal cross section) is provided.
  • the cross-section measuring device 109 has a horizontal cross-section LIDAR, and irradiates a pulse laser at a predetermined angular interval around the vertical axis in a horizontal plane while the unmanned aircraft 1 is landing or hovering. Then, by receiving the reflected laser that hits the object in the surrounding space, the distance to the object in the surrounding space can be measured based on the time from irradiation to light reception.
  • By driving the cross-section measuring device 109 while the unmanned aerial vehicle 1 is flying in the tunnel, the cross-section measuring device 109 generates cross-section measurement data relating to the horizontal cross-sectional shape of the inner wall surface of the tunnel (the shape of the tunnel internal space).
  • the cross-sectional measurement data is, for example, a set of discrete data of polar coordinates with respect to the front of the unmanned aerial vehicle 1.
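  • As a rough illustration only (not part of the application), the sketch below converts such a set of discrete polar samples, referenced to the front of the unmanned aerial vehicle, into Cartesian points in the vehicle-fixed relative coordinate system; the function name, array layout, and axis convention are assumptions made for the example.

```python
import numpy as np

def polar_scan_to_points(angles_rad, ranges_m, max_range=50.0):
    """Convert one cross-section LIDAR scan (polar, vehicle-fixed) into 2D points.

    angles_rad : beam angles measured from the front of the vehicle (counterclockwise).
    ranges_m   : measured distances; invalid returns are assumed to be inf or NaN.
    Returns an (N, 2) array of [x, y] points in the vehicle-fixed frame,
    with x pointing forward (an assumed convention).
    """
    angles = np.asarray(angles_rad, dtype=float)
    ranges = np.asarray(ranges_m, dtype=float)
    valid = np.isfinite(ranges) & (ranges > 0.0) & (ranges < max_range)
    x = ranges[valid] * np.cos(angles[valid])
    y = ranges[valid] * np.sin(angles[valid])
    return np.stack([x, y], axis=1)
```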
  • In the present embodiment, the horizontal cross section LIDAR is used as the cross-section measuring device 109, but the present invention is not limited to this; any device may be used as long as the shape of the inner wall surface of the tunnel in the cross section can be measured and the self-position in the cross-sectional plane can be estimated.
  • SfM is known as an algorithm for creating such a 3D model
  • Dense Visual SLAM is known as an algorithm for estimating a self-position using SfM.
  • an altimeter 110 is provided above the control unit 101.
  • a barometric altimeter that measures altitude based on barometric pressure is suitable.
  • the unmanned aerial vehicle 1 also has an antenna 117.
  • FIG. 2 is a diagram showing a flight control system for the unmanned aerial vehicle shown in FIG.
  • The flight control system 200 of the unmanned aerial vehicle 1 includes the control unit 101, the motors 102 electrically connected to the control unit 101, the rotors 103 mechanically connected to the motors 102, the camera 108, the cross-section measuring device 109, the altimeter 110, and the IMU 111.
  • The control unit 101 performs the information processing for controlling the flight of the unmanned aerial vehicle 1 and controls the electrical signals for that purpose; typically, it is a unit in which various electronic components are arranged and wired on a substrate to constitute the circuits necessary to realize such functions.
  • The control unit 101 further includes an information processing unit 120, a communication circuit 121, a control signal generation unit 122, speed controllers 123, and an interface 125.
  • the camera 108 is a camera for capturing an object (in the present embodiment, the inner wall of the tunnel) with high resolution.
  • the camera 108 is attached to the pedestal 106 via a support member 107 that rotatably supports the camera 108, whereby the imaging direction can be changed.
  • the camera 108 acquires image data of the shooting range of the unmanned aerial vehicle 1, and the acquired image is stored in a storage device such as a memory described later.
  • the image is typically a moving image consisting of a series of still images.
  • the IMU 111 is an inertial measurement unit (Inertial Measurement Unit) that detects translational motion with an acceleration sensor and rotational motion with an angular velocity sensor (gyro). Further, the IMU 111 can calculate the velocity by integrating the translational motion (acceleration) detected by the acceleration sensor, and can further calculate the moving distance (position) by integrating the velocity. Similarly, the angle (posture) can be calculated by integrating the rotational motion (angular velocity) detected by the angular velocity sensor.
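  • As a rough illustration of the integration described above (not part of the application), the following sketch dead-reckons position, velocity, and yaw from IMU samples; it ignores bias, gravity compensation, and noise, which a real implementation would have to handle, and the sample layout is an assumption.

```python
import math

def integrate_imu(samples, dt):
    """Naive dead reckoning from IMU samples.

    samples : iterable of (accel_x, accel_y, gyro_z) in m/s^2 and rad/s,
              expressed in the vehicle body frame (assumed layout).
    dt      : sampling interval in seconds.
    Returns the final (x, y, yaw, vx, vy) in the starting frame.
    """
    x = y = yaw = vx = vy = 0.0
    for ax_b, ay_b, gz in samples:
        # Rotate body-frame acceleration into the starting (world) frame.
        ax = ax_b * math.cos(yaw) - ay_b * math.sin(yaw)
        ay = ax_b * math.sin(yaw) + ay_b * math.cos(yaw)
        vx += ax * dt        # integrate acceleration -> velocity
        vy += ay * dt
        x += vx * dt         # integrate velocity -> position (moving distance)
        y += vy * dt
        yaw += gz * dt       # integrate angular velocity -> attitude (yaw)
    return x, y, yaw, vx, vy
```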
  • the antenna 117 is an antenna for receiving a radio signal including information and various data for maneuvering and controlling the unmanned aerial vehicle 1, and transmitting a radio signal including a telemetry signal from the unmanned aerial vehicle 1.
  • The communication circuit 121 is an electronic circuit that demodulates maneuvering signals, control signals, various data, and the like for the unmanned aerial vehicle 1 from the radio signals received through the antenna 117 and inputs them to the information processing unit 120, and that generates radio signals carrying telemetry signals and the like output from the unmanned aerial vehicle 1; typically it is a radio signal processing IC. Note that, for example, the communication of maneuvering signals and the communication of control signals and various data may be executed by different communication circuits in different frequency bands. For example, a configuration may be adopted in which communication with the transmitter of the controller (radio control transmitter) for manual maneuvering is performed at a frequency in the 950 MHz band, and data communication is performed at a frequency in the 2 GHz, 1.7 GHz, 1.5 GHz, or 800 MHz band.
  • The control signal generation unit 122 converts the control command value data computed by the information processing unit 120 into a pulse signal (such as a PWM signal) representing a voltage, and is typically an IC including an oscillation circuit and a switching circuit.
  • The speed controller 123 converts the pulse signal from the control signal generation unit 122 into a drive voltage for driving the motor 102, and is typically a smoothing circuit and an analog amplifier.
  • the unmanned aerial vehicle 1 includes a battery device such as a lithium polymer battery or a lithium ion battery, and a power supply system including a distribution system for each element.
  • The interface 125 converts signal forms so that signals can be transmitted and received between the information processing unit 120 and functional elements such as the camera 108, the cross-section measuring device 109, the altimeter 110, and the IMU 111, and electrically connects them. For convenience of explanation, the interface is shown as a single component in the drawings, but it is usual to use a separate interface for each type of connected functional element. The interface 125 may also be unnecessary depending on the type of signals input and output by the connected functional element. Conversely, even for elements shown in FIG. 2 as connected to the information processing unit 120 without the interface 125, an interface may be required depending on the type of signals input and output by the connected functional element.
  • FIG. 4 is a block diagram showing a configuration of an information processing unit of the unmanned aerial vehicle shown in FIG.
  • The information processing unit 120 includes a cross-sectional shape data storage unit 130, a flight path data storage unit 131, a relative coordinate system calculation unit 136, an in-cross-section position/attitude estimation unit 132, a self-position data generation unit 133, a flight control unit 134, and a cross-section measurement data acquisition unit 135.
  • The cross-sectional shape data storage unit 130, the relative coordinate system calculation unit 136, the in-cross-section position/attitude estimation unit 132, the self-position data generation unit 133, and the cross-section measurement data acquisition unit 135 constitute the self-position estimation system 140.
  • FIG. 5 is a diagram showing a hardware configuration of an information processing unit of the unmanned aerial vehicle shown in FIG.
  • the information processing unit 120 includes a CPU 120a, a RAM 120b, a ROM 120c, an external memory 120d, an input unit 120e, an output unit 120f, and a communication unit 120g.
  • the RAM 120b, ROM 120c, external memory 120d, input unit 120e, output unit 120f, and communication unit 120g are connected to the CPU 120a via the bus 120h.
  • the CPU 120a comprehensively controls each device connected to the system bus 120h.
  • the ROM 120c and the external memory store the BIOS and OS, which are control programs of the CPU 120a, and various programs and data necessary for realizing the functions executed by the computer.
  • the RAM 120b functions as a main memory of the CPU 120a, a work area, and the like.
  • the CPU 120a realizes various operations by loading a program or the like necessary for executing a process from the ROM 120c or the external memory 120d into the RAM 120b and executing the loaded program.
  • the external memory 120d is composed of, for example, a flash memory, a hard disk, a DVD-RAM, a USB memory, or the like.
  • the input unit 120e receives an operation instruction or the like from a user or the like.
  • the input unit 120e is composed of, for example, an input device such as an input button, a keyboard, a pointing device, a wireless remote controller, a microphone, and a camera.
  • the output unit 120f outputs the data processed by the CPU 120a and the data stored in the RAM 120b, the ROM 120c, and the external memory 120d.
  • The output unit 120f is composed of, for example, an output device such as a CRT display, an LCD, an organic EL panel, a printer, or a speaker.
  • the communication unit 120g is an interface for connecting / communicating with an external device via a network or directly.
  • the communication unit 120g is composed of interfaces such as a serial interface and a LAN interface, for example.
  • Each of the units 121, 122, 125, and 130 to 135 of the flight control system and the position estimation system shown in the figures is realized by using the CPU 120a, the RAM 120b, the input unit 120e, the output unit 120f, the communication unit 120g, and the like as resources.
  • the cross-sectional shape data storage unit 130 stores the cross-sectional shape data related to the cross-sectional shape of the inner wall surface of the tunnel. Further, the flight route data storage unit 131 stores flight plan route data.
  • the cross-sectional shape data storage unit 130 and the flight path data storage unit 131 are realized by reading the data recorded in the storage medium and storing the data in the memory.
  • the cross-sectional shape data storage unit 130 and the flight path data storage unit 131 can be omitted by incorporating the cross-sectional shape data and the flight plan route data into the program.
  • In the present embodiment, the information processing unit 120 functions as the self-position estimation system and the flight control system, but these systems may be implemented separately from the information processing unit 120 and provided in the unmanned aerial vehicle. Further, the self-position estimation system and the flight control system, or their components, need not be configured as one physical device and may be composed of a plurality of physical devices. The self-position estimation system and the flight control system may also be configured as any suitable device, such as a computer, PC, smartphone, or tablet terminal of a ground station separate from the unmanned aerial vehicle, a cloud computing system, or a combination thereof. In addition, the functions of each part of the self-position estimation system and the flight control system may be distributed and executed across one or more devices provided in the unmanned aerial vehicle and one or more devices separate from it.
  • FIGS. 5A-5B show a tunnel and a flight plan route according to an embodiment of the present invention
  • FIG. 5A is a horizontal sectional view of the tunnel
  • FIG. 5B is a sectional view taken along line B-B in FIG. 5A.
  • The cross section of the inner wall surface 302 of the tunnel 300 is circular, and the same cross-sectional shape extends in the vertical direction.
  • The pipe 310 extends in the vertical direction in the internal space of the tunnel, and the cross section of the outer wall surface 312 of the pipe 310 is circular.
  • In the cross-sectional shape data, the center of the inner wall surface 302 of the tunnel 300 is taken as the origin and the X-Y axes are set so that the center of the pipe 310 lies on the Y axis. The data include the radius R0 of the inner wall surface of the tunnel 300, the center coordinates of the pipe 310, and the radius R1 of the pipe 310, but are not limited to this; only the distance between the center of the inner wall surface 302 of the tunnel 300 and the center of the pipe 310 (R2 in the above example) needs to be included.
  • In the present embodiment, the cross-sectional shape data include data indicating a circle as the geometric shape corresponding to the inner wall surface of the tunnel, but the present invention is not limited to this; depending on the cross-sectional shape of the inner wall surface of the tunnel, the data may indicate a two-dimensional shape such as an ellipse or a rectangle. Similarly, data indicating an ellipse, a rectangle, or the like may be included depending on the cross-sectional shape of the outer wall surface of the pipe.
  • the geometric shape referred to here is, for example, a shape that can be specified by mathematical formulas related to the X coordinate, the Y coordinate, and the Z coordinate.
  • The present invention is suitable when the geometric shape in the cross-sectional shape data is rotationally symmetric.
  • Rotational symmetry as used herein means a state in which the cross-sectional shape of the inner wall surface, when rotated about a specific point, substantially coincides with the original shape; ellipses, rectangles, regular polygons, and the like are included.
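  • As a rough illustration only (not part of the application), the cross-sectional shape data described above could be held as a small record such as the one sketched below; the field names are assumptions, and only the quantities mentioned in the text (tunnel radius R0, pipe center, pipe radius R1, and the derived center-to-center distance R2) are stored.

```python
import math
from dataclasses import dataclass
from typing import Tuple

@dataclass
class CrossSectionShape:
    """Cross-sectional shape data for a circular tunnel containing one pipe."""
    tunnel_radius_r0: float            # radius R0 of the tunnel inner wall
    pipe_center: Tuple[float, float]   # pipe center in absolute (X, Y); here on the Y axis
    pipe_radius_r1: float              # radius R1 of the pipe

    @property
    def center_distance_r2(self) -> float:
        """Distance R2 between the tunnel center (the origin) and the pipe center."""
        return math.hypot(*self.pipe_center)

# Purely illustrative values:
shape = CrossSectionShape(tunnel_radius_r0=5.0, pipe_center=(0.0, 2.0), pipe_radius_r1=0.3)
```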
  • The flight plan route data are data representing a three-dimensional (X, Y, Z) flight plan route in the absolute coordinate system of the unmanned aerial vehicle 1, and are typically the data of a set of a plurality of waypoints P1 to Pmn existing in series along the flight plan route. Each waypoint P1 to Pmn is set by three-dimensional coordinates (X, Y, Z) in the absolute coordinate system.
  • a flight planning path is typically a set of straight lines connecting a plurality of waypoints in order, but can also be a curve of a predetermined curvature within a predetermined range of the waypoints.
  • the flight plan route data may include data that determine the flight speed at a plurality of waypoints.
  • Flight planning route data is typically used to determine flight planning routes in autonomous flight, but can also be used as a guide during flight in non-autonomous flight. Flight plan route data is typically input and stored in the unmanned aerial vehicle 1 prior to flight.
  • In the present embodiment, the cross-sectional shape data and the flight plan route data are expressed in the X-Y coordinate system with the center of the tunnel 300 as the origin, but the present invention is not limited to this; for example, they may be expressed by the height Z and polar coordinates with the center of the tunnel as the origin, taking the direction of the center of the pipe relative to the center of the tunnel as the reference.
  • In the present embodiment, the flight plan route is set as a repetition of the following meandering pattern (a sketch of generating such waypoints is given after this list):
  • moving a predetermined angle clockwise along the inner wall surface 302 of the tunnel 300 on a concentric circle (for example, waypoints P1 to Pn);
  • moving a predetermined angle counterclockwise along the inner wall surface 302 of the tunnel 300 (for example, waypoints Pn+1 to P2n);
  • moving a predetermined distance in the vertical direction (for example, waypoints P2n to P2n+1).
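  • The sketch below generates such a meandering set of waypoints as a rough illustration only; the arc radius, angular span, climb step, and point spacing are illustrative parameters and not values from the application.

```python
import math

def meandering_route(arc_radius, start_z, end_z, dz, arc_span_rad, n_arc_points=10):
    """Generate (X, Y, Z) waypoints: sweep an arc along the wall, climb dz, sweep back, ...

    The tunnel center is the absolute origin, the arc is concentric with the inner
    wall (arc_radius smaller than the tunnel radius), and dz must be positive.
    """
    angles = [i * arc_span_rad / (n_arc_points - 1) for i in range(n_arc_points)]
    waypoints = []
    z = start_z
    forward = True
    while z <= end_z:
        sweep = angles if forward else list(reversed(angles))
        for a in sweep:
            # The +Y axis points from the tunnel center toward the pipe (document convention).
            waypoints.append((arc_radius * math.sin(a), arc_radius * math.cos(a), z))
        z += dz                 # vertical move before the next, opposite-direction pass
        forward = not forward
    return waypoints
```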
  • the relative coordinate system calculation unit 136 calculates the relative coordinates of each feature point in the relative coordinate system based on the cross section measurement data acquired by the cross section measuring device 109.
  • The in-cross-section position/attitude estimation unit 132 estimates the self-position (X, Y) and heading ψ of the unmanned aerial vehicle in the cross section based on the relative coordinates calculated by the relative coordinate system calculation unit 136 and the cross-sectional shape data recorded in the cross-sectional shape data storage unit 130.
  • FIG. 6 is a horizontal cross-sectional view showing a region that can be measured by the cross-sectional measurement device.
  • the midpoint D in the figure indicates the position of the unmanned aerial vehicle 1.
  • The cross-section measuring device 109 irradiates laser beams at predetermined angular intervals, with reference to a predetermined direction (the front of the unmanned aerial vehicle 1: the direction of the arrow), in the horizontal cross section, receives the light reflected by the inner wall surface 302 of the tunnel 300 and the outer wall surface 312 of the pipe 310, and measures the distance to the object according to the time from irradiation to light reception.
  • The cross-section measurement data obtained from the cross-section measuring device 109 thus consist of pairs of the angle θi at which each laser beam was irradiated and the measured distance to the object, with the front of the unmanned aerial vehicle 1 as the reference direction and the position of the unmanned aerial vehicle 1 as the origin.
  • In the portion A of the outer wall surface 312 of the pipe 310 that faces the unmanned aerial vehicle 1, the laser light is reflected back to the cross-section measuring device.
  • the cross-section measuring device 109 cannot detect the side portion B of the outer wall surface 312 of the pipe 310 when viewed from the unmanned aerial vehicle 1.
  • the laser beam does not reach the portion C located behind the pipe 310 on the inner wall surface 302 of the tunnel 300. Therefore, the cross-section measuring device 109 cannot detect the portion C located behind the pipe 310 on the inner wall surface 302 of the tunnel 300.
  • the cross-section measurement data acquisition unit 135 can control the cross-section measurement device 109 and can acquire the cross-section measurement data measured by the cross-section measurement device 109.
  • the self-position in the cross section of the unmanned aerial vehicle is estimated as follows.
  • First, the relative coordinate system calculation unit 136 matches the first geometric shape corresponding to the inner wall surface 302 of the tunnel 300, that is, a circle in the present embodiment, with the cross-section measurement data.
  • FIG. 7 is a diagram showing how the circle corresponding to the inner wall surface of the tunnel is matched with the cross-section measurement data. Note that FIG. 7 shows only a part of the data constituting the cross-section measurement data.
  • The cross-section measurement data consist of discrete points Q1 to Qn, and the relative coordinate system calculation unit 136 matches a circle to the discrete points Q1 to Qn.
  • As the circle matching algorithm, the least squares method, RANSAC, or the like can be applied; RANSAC, which excludes outliers before matching, is suitable, as in the case of straight-line matching.
  • Among the correct-model candidates, the one that best matches the data is adopted as the result of the circle matching.
  • For example, when the discrete points selected in step 1 include Q3, which deviates from the tendency of the other discrete points corresponding to the inner wall surface of the tunnel, the resulting circle model is C1; when Q1, Q4, and Q8 are selected in step 1, the resulting circle model is C2.
  • When the errors between the circle models C1 and C2 and the points Q1 to Qn are obtained, the error of the circle model C1 is larger than the error of the circle model C2. Therefore, in step 5, values such as Q3 that deviate from the tendency of the other data, as used when setting the circle model C1, are not used as data for determining the circle, and a circle with the influence of such outliers removed, like the circle model C2, is matched.
  • the first relative coordinates of the center point O 1 ′ as the first feature point of the circle model C2 and the radius of the circle model C2 are estimated with reference to the front of the unmanned aerial vehicle 1.
  • the center point O 1 ′ as the first feature point of the circular model C2 corresponds to the center point of the inner wall surface 302 of the tunnel 300 in the absolute coordinate system.
  • RANSAC is used as an algorithm for matching by excluding outliers, but Markov Localization, Monte-Carlo Localization, Particle Filter, Hough Transform, etc. can also be used.
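  • As a rough illustration of the RANSAC-style matching described above (not the application's implementation), the sketch below repeatedly samples three scan points, fits a candidate circle through them, and keeps the candidate supported by the most points within a tolerance, so that an outlier such as Q3 does not distort the result; the function names and the tolerance are assumptions made for the example.

```python
import numpy as np

def circle_from_3_points(p1, p2, p3):
    """Return (ux, uy, r) of the circle through three non-collinear 2D points."""
    ax, ay = p1; bx, by = p2; cx, cy = p3
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-9:
        return None  # points are (nearly) collinear
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return ux, uy, float(np.hypot(ax - ux, ay - uy))

def ransac_circle(points, n_iter=200, tol=0.05, seed=0):
    """Fit a circle to 2D points while ignoring outliers (RANSAC sketch)."""
    pts = np.asarray(points, dtype=float)
    if len(pts) < 3:
        return None
    rng = np.random.default_rng(seed)
    best, best_inliers = None, 0
    for _ in range(n_iter):
        sample = pts[rng.choice(len(pts), size=3, replace=False)]
        model = circle_from_3_points(*sample)
        if model is None:
            continue
        cx, cy, r = model
        err = np.abs(np.hypot(pts[:, 0] - cx, pts[:, 1] - cy) - r)
        inliers = int(np.sum(err < tol))
        if inliers > best_inliers:
            best, best_inliers = model, inliers
    return best  # (cx, cy, r) of the best-supported circle, e.g. the model C2
```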
  • the relative coordinate system calculation unit 136 compares the circular model C2 matched in this way with the cross-sectional measurement data.
  • the area where the cross-sectional measurement data exists is shown by a thick line.
  • the laser beam emitted from the cross-section measuring device 109 does not reach the portion of the inner wall surface 302 of the tunnel 300 that is behind the pipe 310. Therefore, by comparing the matched circular model C2 with the cross-sectional measurement data, it is possible to identify the gap region G in the circular model C2 in which the discrete data of the cross-sectional measurement data does not exist in the vicinity.
  • As a method of specifying the gap region, for example, the region between the two end points of the discrete points of the cross-section measurement data whose distance from the circle model C2 is within a predetermined range (that is, where the difference between the distance from the center point O1' to the discrete point and the radius of the circle model C2 is equal to or less than a predetermined value) may be adopted.
  • This gap region corresponds to the angle range, centered on the unmanned aerial vehicle 1 (cross-section measuring device 109), in which the pipe 310 exists. The bisector L3 of the angle range of the gap region thus specified, centered on the unmanned aerial vehicle 1 (cross-section measuring device 109), is then specified.
  • The bisector L3 indicates, with respect to the front of the unmanned aerial vehicle 1, the direction of the point corresponding to the center of the pipe 310 as seen from the vehicle. That is, the direction of the center of the pipe 310 in relative coordinates, referenced to the front of the unmanned aerial vehicle 1, can be estimated.
  • In the present embodiment, the relative coordinate system calculation unit 136 sets the bisector of the gap region, but the present invention is not limited to this; when the shape of the pipe 310 is a quadrangle or the like, a correction according to the shape may be applied to the bisector.
  • For example, a straight line or the like may be matched with the portion corresponding to the pipe in the cross-section measurement data, and the direction of the bisector may be corrected according to the angle of the straight line.
  • In this specification, both the bisector obtained when the circle is matched as described above and the bisector with such a correction are referred to as a substantial bisector.
  • Even when the shape of the pipe 310 is a quadrangle, it is not always necessary to apply a correction, and the direction of the center can be estimated with high accuracy even when the bisector is used as it is.
  • Next, the relative coordinate system calculation unit 136 specifies the relative coordinates of the intersection O2' between the bisector L3 and the circle C3 on which the center of the pipe 310 exists, centered on the center point O1' of the circle model C2.
  • From the cross-sectional shape data, the position where the center of the pipe 310 exists can be set as a circle of radius R2 centered on the center point O1 of the tunnel 300.
  • The relative coordinates of the intersection O2' are the second position (relative coordinates) of the center point, as the second feature point of the pipe 310, as seen from the unmanned aerial vehicle 1 with reference to its front.
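  • A rough illustrative sketch of the above steps is given below: it finds the angular gap in the scan near the matched wall circle, takes its bisector, and intersects that ray with the circle of radius R2 around the matched center to obtain the candidate pipe center O2' in the relative coordinate system. The angle conventions, the tolerance, and the gap handling are assumptions; in particular, a robust implementation would also handle a gap that wraps around ±π.

```python
import numpy as np

def pipe_center_from_gap(points, circle, r2, wall_tol=0.1):
    """Estimate the relative coordinates of the pipe center O2' (illustrative sketch).

    points : (N, 2) scan points in the vehicle-fixed relative coordinate system.
    circle : (cx, cy, r) of the matched tunnel-wall circle (the model C2).
    r2     : known distance between the tunnel center and the pipe center.
    """
    cx, cy, r = circle
    pts = np.asarray(points, dtype=float)
    # Keep only the points lying close to the matched wall circle.
    on_wall = np.abs(np.hypot(pts[:, 0] - cx, pts[:, 1] - cy) - r) < wall_tol
    # Angles of those wall points as seen from the vehicle (origin of the relative frame).
    ang = np.sort(np.arctan2(pts[on_wall, 1], pts[on_wall, 0]))
    if ang.size < 2:
        return None
    # The gap region G is taken as the largest angular interval without wall points.
    gaps = np.diff(ang)
    i = int(np.argmax(gaps))
    bisector = 0.5 * (ang[i] + ang[i + 1])          # direction of the bisector L3
    # Intersect the bisector ray from the vehicle with the circle of radius r2
    # centered on (cx, cy): solve |t * d - c| = r2 for t >= 0.
    d = np.array([np.cos(bisector), np.sin(bisector)])
    c = np.array([cx, cy])
    b = -2.0 * float(d.dot(c))
    disc = b * b - 4.0 * (float(c.dot(c)) - r2 * r2)
    if disc < 0.0:
        return None
    roots = [(-b - np.sqrt(disc)) / 2.0, (-b + np.sqrt(disc)) / 2.0]
    ts = [t for t in roots if t > 0.0]
    if not ts:
        return None
    t = min(ts)   # the pipe sits between the vehicle and the far wall: take the nearer hit
    return t * d  # relative coordinates of O2'
```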
  • the relative coordinate system calculation unit 136 specifies the direction of the second relative coordinates of the intersection point O 2 ′ with respect to the first relative coordinates of the center point O 1 ′ of the circle model C2 specified as described above.
  • The in-cross-section position/attitude estimation unit 132 then performs a coordinate transformation from the relative coordinate system to the absolute coordinate system so that the direction of the intersection O2' with respect to the center point O1' in the relative coordinate system corresponds to the direction (the Y-axis direction) of the center point O2 of the pipe 310 with respect to the center point O1 of the tunnel 300 in the absolute coordinate system, and so that the coordinates of the center point O1' in the relative coordinate system correspond to the center point O1 of the tunnel 300 in the absolute coordinate system.
  • Specifically, the relative coordinate system referenced to the unmanned aerial vehicle 1 is transformed into the absolute coordinate system so that the direction of the intersection O2' with respect to the center point O1' of the circle model C2, specified as described above, becomes the Y direction of the absolute coordinate system and the center point O1' becomes the origin. The X coordinate, the Y coordinate, and the heading ψ of the unmanned aerial vehicle 1 can thereby be calculated.
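  • A minimal sketch of that transformation is given below, assuming the conventions stated in this description (absolute origin at the tunnel center, +Y axis toward the pipe center); the sign convention for the heading is an assumption made for the example.

```python
import math

def relative_to_absolute(o1_rel, o2_rel):
    """Vehicle absolute pose from the two matched feature points (illustrative sketch).

    o1_rel : (x, y) of the tunnel-wall circle center O1' in the vehicle-fixed frame.
    o2_rel : (x, y) of the estimated pipe center O2' in the vehicle-fixed frame.
    Returns (X, Y, psi): vehicle position with the tunnel center as the origin and the
    +Y axis toward the pipe center, and psi = angle of the vehicle's forward axis
    measured counterclockwise from the absolute +X axis.
    """
    # Direction from the tunnel center to the pipe center, expressed in the vehicle frame.
    alpha = math.atan2(o2_rel[1] - o1_rel[1], o2_rel[0] - o1_rel[0])
    # Rotation mapping vehicle-frame vectors into the absolute frame: it must send the
    # direction alpha onto the absolute +Y axis (angle pi/2).
    theta = math.pi / 2.0 - alpha
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    # The vehicle position relative to the tunnel center, in the vehicle frame, is -O1'.
    dx, dy = -o1_rel[0], -o1_rel[1]
    X = cos_t * dx - sin_t * dy
    Y = sin_t * dx + cos_t * dy
    psi = theta  # vehicle forward axis (vehicle-frame +x) rotated into the absolute frame
    return X, Y, psi
```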
  • The method of specifying the positional relationship between the self-position of the unmanned aerial vehicle 1 and the pipe 310 is not limited to the gap-region method described above. For example, a circle as a second predetermined shape may be matched with the cross-section measurement data of the region inside the circle model C2 (in particular the region indicated by A in FIG. 6), and the relative coordinates of a predetermined point of the matched circle, namely its center point, may be estimated as the relative coordinates corresponding to the center of the pipe 310; both methods may also be performed to improve accuracy.
  • The self-position data generation unit 133 generates the self-position data that serve as the reference when the flight control unit 134 flies the vehicle. Specifically, the self-position data generation unit 133 generates self-position data including the XYZ coordinates and the heading ψ, based on the X coordinate, Y coordinate, and heading ψ estimated by the in-cross-section position/attitude estimation unit 132 and the altitude (Z coordinate) acquired from the altimeter 110.
  • The flight control unit 134 controls the flight of the unmanned aerial vehicle so as to follow the flight plan route of the flight plan route data stored in the flight path data storage unit 131, based on the self-position data generated by the self-position data generation unit 133.
  • Specifically, the attitude, speed, and the like of the unmanned aerial vehicle 1 are determined from the various sensors, and the current flight position, heading, and the like of the vehicle are determined based on the self-position data generated by the self-position data generation unit 133. These are compared with target values such as the flight control signal, the flight plan route (target), the speed limit, and the altitude limit to calculate the control command value for each rotor 103, and data indicating the control command values are output to the control signal generation unit 122.
  • the control signal generation unit 122 converts the control command value into a pulse signal representing a voltage and transmits it to each speed controller 123.
  • Each speed controller 123 converts the pulse signal into a drive voltage and applies it to the corresponding motor 102, thereby controlling the drive of each motor 102 and the rotation speed of each rotor 103, whereby the flight of the unmanned aerial vehicle 1 is controlled.
  • FIG. 11 is a flowchart showing a flow of autonomously flying in the tunnel by the unmanned aerial vehicle shown in FIG. 1 and photographing the inner wall surface of the tunnel.
  • As shown in FIG. 11, the information processing unit 120 first refers to the flight plan route data stored in the flight path data storage unit 131, and the flight control unit 134 sets the first waypoint as the target (S100).
  • the information processing unit 120 estimates the self-position and orientation of the unmanned aerial vehicle 1 by the self-position data generation unit 133, and the flight control unit 134 controls the rotation speed of the motor 102 based on the estimated self-position and orientation. , Perform autonomous flight toward the target (S110).
  • FIG. 12 is a flowchart showing in detail the flow of autonomous flight by the unmanned aerial vehicle shown in FIG.
  • First, the cross-section measurement data acquisition unit 135 controls the cross-section measuring device 109 to measure the cross-section measurement data and acquires the measured data (S111: cross-section measurement data acquisition step).
  • Next, the relative coordinate system calculation unit 136 and the in-cross-section position/attitude estimation unit 132 estimate the self-position (X, Y) and attitude (heading ψ) of the unmanned aerial vehicle in the cross section based on the cross-section measurement data acquired by the cross-section measuring device 109 and the cross-sectional shape data recorded in the cross-sectional shape data storage unit 130.
  • Specifically, the relative coordinate system calculation unit 136 matches the circle corresponding to the inner wall surface 302 of the tunnel 300 with the cross-section measurement data, and specifies the first relative coordinates of the center position of the circle as the first feature point (corresponding to the center coordinates of the inner wall surface 302 of the tunnel), referenced to the position and orientation of the unmanned aerial vehicle 1 (S112: first feature point estimation step).
  • Next, the relative coordinate system calculation unit 136 specifies the gap region G, corresponding to the angle range in which no discrete data of the cross-section measurement data exist in the vicinity of the circle model C2 because the laser does not reach the portion of the circle model C2 located behind the pipe 310 (S113).
  • Next, the relative coordinate system calculation unit 136 sets the bisector L3 of the angle range of the gap region G centered on the unmanned aerial vehicle 1 (cross-section measuring device 109) (S114).
  • Next, the relative coordinate system calculation unit 136 acquires the distance between the center of the tunnel 300 and the center of the pipe 310 based on the cross-sectional shape data, and sets a circle centered on the center point of the matched circle whose radius is equal to that distance. The relative coordinates of the intersection of this circle and the bisector are calculated, and the calculated relative coordinates are set as the second relative coordinates of the second feature point corresponding to the center of the pipe 310 (S115: second feature point estimation step).
  • Next, the relative coordinate system calculation unit 136 calculates the direction, in the relative coordinate system, of the second relative coordinates corresponding to the center of the pipe 310 with respect to the first relative coordinates of the center point O1' of the circle (S116: calculation step). The in-cross-section position/attitude estimation unit 132 then transforms the coordinates so that the direction of the intersection O2' with respect to the center point O1' in the relative coordinate system corresponds to the direction (Y-axis direction) of the center point O2 of the pipe 310 with respect to the center point O1 of the tunnel 300 in the absolute coordinate system, and so that the coordinates of the center point O1' in the relative coordinate system correspond to the center point O1 of the tunnel 300 in the absolute coordinate system. As a result, the absolute coordinates of the self-position of the unmanned aerial vehicle 1 in the cross section and its heading ψ are estimated (S117: in-cross-section position/attitude estimation step).
  • Next, the self-position data generation unit 133 generates self-position data from the in-cross-section self-position (X, Y) and heading ψ estimated by the in-cross-section position/attitude estimation unit 132 and the altitude (Z) measured by the altimeter 110 (S118: self-position data generation step).
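  • Taken together, steps S111 to S118 amount to one estimation cycle per control iteration. The sketch below is an illustrative composition of the helper functions sketched earlier in this description (ransac_circle, pipe_center_from_gap, relative_to_absolute); it is not the disclosed implementation.

```python
def estimate_pose(scan_points, r2, altitude_z):
    """One estimation cycle: cross-section scan -> absolute (X, Y, Z) and heading psi.

    scan_points : (N, 2) points from the cross-section LIDAR in the vehicle frame (S111).
    r2          : tunnel-center-to-pipe-center distance from the cross-sectional shape data.
    altitude_z  : altitude measured by the altimeter.
    Assumes ransac_circle, pipe_center_from_gap, and relative_to_absolute as sketched above.
    """
    circle = ransac_circle(scan_points)                      # S112: match circle, center O1'
    if circle is None:
        return None
    o2_rel = pipe_center_from_gap(scan_points, circle, r2)   # S113-S115: gap, bisector, O2'
    if o2_rel is None:
        return None
    o1_rel = (circle[0], circle[1])
    X, Y, psi = relative_to_absolute(o1_rel, o2_rel)         # S116-S117: to absolute frame
    return (X, Y, altitude_z, psi)                           # S118: self-position data
```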
  • The self-position data generation unit 133 may determine the accuracy of the generated self-position data based on the position information acquired from the IMU 111, and the imaging in the tunnel may be stopped if the accuracy of the generated self-position data is low.
  • the flight control unit 134 determines whether or not the unmanned aerial vehicle 1 has reached the target based on the self-position data generated by the self-position data generation unit 133 (S119).
  • When the flight control unit 134 determines that the unmanned aerial vehicle 1 has not reached the target (NO in S119), the flight control unit 134 calculates the control command value for each rotor 103 so that the vehicle flies toward the target, based on the self-position data generated by the self-position data generation unit 133, and outputs data indicating the control command values to the control signal generation unit 122 (S120).
  • the control signal generation unit 122 converts the control command value into a pulse signal representing a voltage and transmits it to each speed controller 123.
  • Each speed controller 123 converts the pulse signal into a drive voltage and applies it to the corresponding motor 102, thereby controlling the drive of each motor 102 and the rotation speed of each rotor 103, whereby the flight of the unmanned aerial vehicle 1 is controlled.
  • the unmanned aerial vehicle 1 autonomously flies toward the target.
  • the information processing unit 120 drives the camera 108.
  • the inner wall surface of the tunnel is photographed (S130).
  • the image data of the captured image is stored in the memory of the information processing unit 120 or recorded in an appropriate terminal via the antenna 117.
  • When the unmanned aerial vehicle 1 determines that all the necessary steps have been completed, the flight control unit 134 ends the work by, for example, flying back to the initial position.
  • According to the present embodiment, the following effects are achieved. Inside the tunnel, GPS and the like cannot be used, and the compass may also be unusable due to disturbances of the geomagnetic field. In such a case, when estimating the self-position of the unmanned aerial vehicle, position estimation by an algorithm such as laser SLAM is conceivable. However, even if position estimation by an algorithm such as laser SLAM is attempted, it is difficult to estimate an accurate position when the cross-sectional shape of the tunnel or the like is rotationally symmetric.
  • In contrast, in the present embodiment, the relative coordinate system calculation unit 136 calculates the direction between the point O1' corresponding to the center of the tunnel 300 and the point O2' corresponding to the center of the pipe 310 in the relative coordinate system. The relationship between the relative coordinate system and the absolute coordinate system can thereby be specified based on this direction, and an accurate self-position can be estimated even in the rotationally symmetric tunnel 300. Further, in the present embodiment, the relative coordinate system calculation unit 136 matches a circle to the inner wall surface of the tunnel.
  • The inner wall of a tunnel is often circular, elliptical, or rectangular, but even if it is matched with a rotationally symmetric shape such as a circle, ellipse, or rectangle, the position of the unmanned aerial vehicle cannot be uniquely identified from the positional relationship between the vehicle and that circle, ellipse, or rectangle alone. In the present embodiment, by contrast, the point corresponding to the center of the pipe 310 is specified in the cross-section measurement data, so the position of the unmanned aerial vehicle can be uniquely specified.
  • Further, in the present embodiment, the in-cross-section position/attitude estimation unit 132 transforms the coordinates so that the direction of the point O2' corresponding to the center of the pipe 310 with respect to the point O1' corresponding to the center of the tunnel 300 in the relative coordinate system corresponds to the Y axis, which is the reference axis. This makes it possible to estimate the heading direction of the unmanned aerial vehicle in the absolute coordinate system.
  • Further, the coordinates are transformed so that the point O1' in the relative coordinate system corresponds to the origin corresponding to the center of the tunnel in the absolute coordinate system. This makes it possible to estimate the coordinates of the unmanned aerial vehicle in the absolute coordinate system.
  • Further, in the present embodiment, the relative coordinate system calculation unit 136 identifies the gap region G in which no data exist within the range corresponding to the circle in the cross-section measurement data, and estimates the point corresponding to the center point of the pipe 310 in the relative coordinate system by finding the intersection of the angle bisector of the gap region G with the circle of radius R2 centered on the center point O1' of the matched circle C2. Thereby, the position of the unmanned aerial vehicle 1 can be estimated even if, for example, the radius of the pipe 310 is unknown, and the relative coordinates of the pipe 310 can be estimated accurately.
  • Further, in the present embodiment, the cross-section measurement data are discrete data, and the relative coordinate system calculation unit 136 removes outliers in the discrete data before performing matching. Matching can therefore be performed even when the inner wall surface 302 of the tunnel 300 has irregularities such as lighting fixtures, or when a ventilation port or the like is formed in it.
  • The cross-sectional shape of the pipe 310 is not limited to a circle and may be another shape such as a rectangle.
  • When the cross-sectional shape of the internal structure is rectangular and the bisector of the gap region is set in S113, the bisector does not necessarily pass through the center of the rectangle, but the error is small, so there is no problem in practical use.
  • In the above embodiment, the relative coordinate system calculation unit 136 matches the circle with the cross-section measurement data in S112, identifies the gap region in S113, sets the bisector of the angle range of the gap region in S114, and calculates the relative coordinates of the pipe 310 by specifying, in S115, the intersection of the bisector of the gap with the circle on which the center of the pipe is located.
  • the method for calculating the relative coordinates of the pipe 310 by the relative coordinate system calculation unit 136 in the present invention is not limited to this.
  • When the cross-sectional shape of the pipe 310 is circular, for example, a circle as a second predetermined shape may be matched with the data, among the cross-section measurement data, that do not belong to the inner wall surface 302 of the tunnel 300, a predetermined position (for example, the center position) of the matched circle may be determined, and this position may be set as the position of the second feature point related to the pipe 310 in the relative coordinate system.
  • Depending on the cross-sectional shape of the pipe, the second predetermined shape to be matched may be changed accordingly.
  • the second predetermined shape to be matched is preferably a geometric shape.
  • the coordinates in the height direction are determined by the altimeter, but the present invention is not limited to this, and the coordinates in the height direction may be determined by the IMU.
  • In the above embodiment, the present invention is applied to the tunnel 300 extending in the vertical direction, but the present invention is not limited to this and can also be applied to a chimney, a shaft in a building, and the like. Furthermore, the present invention can be applied to tunnels and the like that extend diagonally upward.
  • the cross-sectional shape data storage unit 130 may store the cross-sectional shape data of the tunnel 300 and the internal structure for each height.
  • 1 Unmanned aerial vehicle; 101 Control unit; 102 Motor; 103 Rotor; 104 Arm; 105 Landing gear; 106 Pedestal; 107 Support member; 108 Camera; 109 Cross-section measuring device; 110 Altimeter; 111 IMU; 117 Antenna; 120 Information processing unit; 120a CPU; 120b RAM; 120c ROM; 120d External memory; 120e Input unit; 120f Output unit; 120g Communication unit; 121 Communication circuit; 122 Control signal generation unit; 123 Speed controller; 125 Interface; 130 Cross-sectional shape data storage unit; 131 Flight path data storage unit; 132 In-cross-section position/attitude estimation unit; 133 Self-position data generation unit; 134 Flight control unit; 135 Cross-section measurement data acquisition unit; 136 Relative coordinate system calculation unit; 140 Self-position estimation system; 200 Flight control system; 300 Tunnel; 302 Inner wall surface; 310 Pipe; 312 Outer wall surface

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

Provided is a system for estimating the location of an unmanned aerial vehicle in the case when autonomous flight is carried out by the unmanned aerial vehicle within the interior space of a structure such as a tunnel. In the present invention, a self-location estimation system comprises a transverse cross-sectional position/attitude estimation unit which matches a circle to transverse cross-sectional measurement data acquired by a transverse cross-sectional measurement data acquisition unit, estimates a first position of the center point O1' of the circle in a relative coordinate system that uses the unmanned aerial vehicle inside the transverse cross section as a reference, estimates a second position of the center point O2' of piping in the relative coordinate system on the basis of the transverse cross-sectional measurement data, estimates a direction of the second position with respect to the first position in the relative coordinate system, and estimates the self-location and attitude of the unmanned aerial vehicle in an absolute coordinate system that uses the direction of the center of the piping with respect to the center of the tunnel as a reference axis.

Description

Estimation system for estimating the self-position and/or attitude of an unmanned aerial vehicle, flight control system, unmanned aerial vehicle, program, and recording medium
The present invention relates to an estimation system for estimating the self-position and/or attitude of an unmanned aerial vehicle, a flight control system, an unmanned aerial vehicle, a program, and a recording medium.
Recently, attempts have been made to inspect structures such as tunnels with drones. A drone can estimate its own position using GPS satellite positioning information and fly autonomously along a planned flight route, but in the internal space of a structure such as a tunnel it is difficult to obtain accurate position information by GPS. For such environments, a technique has therefore been proposed in which self-position estimation by laser SLAM is used as an alternative to self-position estimation by GPS (Non-Patent Document 1 below). Laser SLAM uses LIDAR, which irradiates the surroundings with laser light and measures the distance to the irradiated object from the time until the reflected light is received, and estimates the self-position and a map in parallel based on the output from the LIDAR.

In self-position estimation by laser SLAM, the self-position is estimated by referring to the map on the basis of the measurement data from the LIDAR. However, the inner wall surface of a tunnel (or tank) extending in the vertical direction often has a rotationally symmetric shape such as a circle, so the map itself is also rotationally symmetric and highly accurate self-position estimation is difficult.

Therefore, one object of the present invention is to provide a system and method for estimating the position and/or attitude of an unmanned aerial vehicle when the unmanned aerial vehicle flies autonomously in the internal space of a structure having a rotationally symmetric cross-sectional shape, such as a tunnel (or tank) extending in the vertical direction.
One aspect of the present invention is an estimation system for estimating the self-position and/or attitude of an unmanned aerial vehicle, wherein the unmanned aerial vehicle includes a cross-section measuring device that measures the spatial shape, referenced to a predetermined direction of the unmanned aerial vehicle, within a cross section crossing the longitudinal flight path of the unmanned aerial vehicle and generates cross-section measurement data. The estimation system comprises: a cross-section measurement data acquisition unit that acquires the cross-section measurement data generated by the cross-section measuring device when the unmanned aerial vehicle flies in the internal space of a structure in which an internal space extending longitudinally along a predetermined path is formed and in which an internal structure extending in the longitudinal direction is provided within the internal space; and a relative coordinate system calculation unit that matches a rotationally symmetric first predetermined shape corresponding to the cross-sectional shape of the inner wall surface of the structure to the cross-section measurement data acquired by the cross-section measurement data acquisition unit, estimates a first position of a first feature point, which is the center of the matched first predetermined shape, in a relative coordinate system referenced to the unmanned aerial vehicle within the cross section, estimates, on the basis of the cross-section measurement data acquired by the cross-section measurement data acquisition unit, a second position of a second feature point related to the internal structure in the relative coordinate system, and calculates the direction of the second position with respect to the first position in the relative coordinate system.

In one aspect of the present invention, the system may further comprise an in-cross-section position/attitude estimation unit that estimates the self-position and/or attitude of the unmanned aerial vehicle in an absolute coordinate system whose reference axis is the direction of the point corresponding to the second feature point of the internal structure with respect to the point corresponding to the first feature point of the structure in the cross section.

In one aspect of the present invention, the in-cross-section position/attitude estimation unit may be configured to transform the relative coordinate system into the absolute coordinate system so that the direction of the second position with respect to the first position in the relative coordinate system corresponds to the reference axis, and to estimate the heading direction of the unmanned aerial vehicle in the absolute coordinate system.

In one aspect of the present invention, the in-cross-section position/attitude estimation unit may perform the coordinate transformation so that the direction of the second position with respect to the first position in the relative coordinate system corresponds to the reference axis and the first position in the relative coordinate system corresponds to the coordinates in the absolute coordinate system of the first feature point of the structure, thereby estimating the coordinates of the unmanned aerial vehicle in the absolute coordinate system.
In one aspect of the present invention, the first predetermined shape can be a circle, an ellipse, or a rectangle.

In one aspect of the present invention, a gap region in which no data exists within the region near the matched first predetermined shape in the cross-section measurement data may be identified, and the intersection of the approximate bisector of the angular range of the gap region as seen from the unmanned aerial vehicle with a circle centered on the first feature point and having as its radius the preset distance between the first feature point and the second feature point may be estimated as the second position.

In one aspect of the present invention, the relative coordinate system calculation unit may match a second predetermined shape to the portion of the cross-section measurement data corresponding to the internal structure and estimate the relative coordinates of a predetermined point of the matched second predetermined shape as the second position.

In one aspect of the present invention, the cross-section measurement data is discrete data, and the relative coordinate system calculation unit can remove outliers from the discrete data and then perform the matching.
One aspect of the present invention provides a flight control system for an unmanned aerial vehicle, including the above estimation system and a flight control unit that causes the unmanned aerial vehicle to fly along a set flight plan route based on the self-position and/or attitude of the unmanned aerial vehicle estimated by the estimation system.

One aspect of the present invention provides an unmanned aerial vehicle having the above estimation system.

One aspect of the present invention provides an unmanned aerial vehicle having the above flight control system.
One aspect of the present invention is an estimation method for estimating, by a computer, the self-position and/or attitude of an unmanned aerial vehicle, wherein the unmanned aerial vehicle includes a cross-section measuring device that measures the spatial shape, referenced to a predetermined direction of the unmanned aerial vehicle, within a cross section crossing the longitudinal flight path of the unmanned aerial vehicle and generates cross-section measurement data. The method includes: a cross-section measurement data acquisition step of acquiring the cross-section measurement data generated by the cross-section measuring device when the unmanned aerial vehicle flies in the internal space of a structure in which an internal space extending longitudinally along a predetermined path is formed and in which an internal structure extending in the longitudinal direction is provided within the internal space; a first feature point estimation step of matching a rotationally symmetric first predetermined shape corresponding to the cross-sectional shape of the inner wall surface of the structure to the acquired cross-section measurement data and estimating a first position of a first feature point, which is the center of the matched first predetermined shape, in a relative coordinate system referenced to the unmanned aerial vehicle within the cross section; a second feature point estimation step of estimating, on the basis of the acquired cross-section measurement data, a second position of a second feature point related to the internal structure in the relative coordinate system; and a calculation step of calculating the direction of the second position with respect to the first position in the relative coordinate system.

One aspect of the present invention provides a program for causing a computer to execute the above method.

One aspect of the present invention provides a computer-readable recording medium on which the above program is recorded.

According to the present invention, it is possible to provide a system and method for estimating the position and/or attitude of an unmanned aerial vehicle when the unmanned aerial vehicle flies autonomously in the internal space of a structure having a rotationally symmetric cross-sectional shape, such as a tunnel (or tank) extending in the vertical direction.
An external view of a multicopter, which is an example of an unmanned aerial vehicle according to an embodiment of the present invention.
A diagram showing the flight control system of the unmanned aerial vehicle shown in FIG. 1.
A block diagram showing the configuration of the information processing unit of the unmanned aerial vehicle shown in FIG. 1.
A diagram showing the hardware configuration of the information processing unit of the unmanned aerial vehicle shown in FIG. 1.
A horizontal sectional view of a tunnel, showing the tunnel and a flight plan route in an embodiment of the present invention.
A cross-sectional view taken along line B-B in FIG. 5A.
A horizontal sectional view showing the region that can be measured by the cross-section measuring device.
A diagram showing how a circle corresponding to the inner wall surface of the tunnel is matched to the cross-section measurement data.
A diagram showing how the matched circle model is compared with the cross-section measurement data.
A diagram showing how the intersection of the circle C3, on which the center of the pipe lies, and the bisector L3 is identified.
A diagram showing the coordinate transformation from the relative coordinate system referenced to the unmanned aerial vehicle to the absolute coordinate system.
A flowchart showing the flow of autonomously flying inside the tunnel with the unmanned aerial vehicle shown in FIG. 1 and photographing the inner wall surface of the tunnel.
A flowchart showing in detail the flow of autonomous flight by the unmanned aerial vehicle shown in FIG. 1.
Hereinafter, embodiments of the present invention will be described with reference to the drawings. However, the present invention is not limited to the specific aspects described below, and various aspects can be taken within the scope of the technical idea of the present invention. For example, the unmanned aerial vehicle of the present invention is not limited to the multicopter shown in FIG. 1 and may be any unmanned aerial vehicle such as a rotary-wing aircraft or a fixed-wing aircraft. The system configuration of the unmanned aerial vehicle 1 is also not limited to that shown in the figures, and any configuration capable of the same operations may be adopted. For example, operations executed by a plurality of components may be executed by a single component, such as integrating the function of the communication circuit into the flight control unit, or operations executed by a single component may be executed by a plurality of components, such as distributing the function of the main arithmetic unit among a plurality of arithmetic units. Furthermore, the various data stored in the memory of the unmanned aerial vehicle 1 may be stored elsewhere; one type of information may be divided among plural types, or plural types of information may be stored together as one type.
In the following description, a case is described in which an unmanned aerial vehicle autonomously flies inside a tunnel extending in the vertical direction and images the inner wall surface of the tunnel. The shape of the tunnel is known from design drawings or the like. In this embodiment, the internal space of the tunnel has an inner wall surface whose horizontal cross section (transverse cross section) is circular, and this cross section extends in the vertical direction (longitudinal direction). In addition, a pipe having a circular horizontal cross section extends in the vertical direction inside the tunnel.
FIG. 1 is an external view of a multicopter, which is an example of an unmanned aerial vehicle 1 according to an embodiment of the present invention. In terms of its external structure, the unmanned aerial vehicle 1 includes a control unit 101, six motors 102 driven by control signals from the control unit 101, six rotors (rotary wings) 103 that are rotated by the respective motors 102 to generate lift, six arms 104 connecting the control unit 101 to the respective motors 102, and landing gear 105 that supports the unmanned aerial vehicle at the time of landing. The numbers of motors 102, rotors 103, and arms 104 may each be any number of three or more, such as three or four. The six motors 102 are rotated by control signals from the control unit 101, and by controlling the rotation speed of each of the six rotors 103, the flight of the unmanned aerial vehicle 1, such as ascending, descending, flying forward, backward, left and right, and turning, is controlled.
A pedestal 106 is attached above the control unit 101, and a camera 108 for photographing an object at high resolution is attached to the pedestal 106 via a support member 107 that rotatably supports the camera 108.
In front of the pedestal 106, a cross-section measuring device (horizontal-section LIDAR) 109 for measuring the shape of the surrounding space in a horizontal cross section (transverse cross section) is provided. In this embodiment, the cross-section measuring device 109 is a horizontal-section LIDAR; with the unmanned aerial vehicle 1 landed or hovering, it irradiates pulsed laser light at predetermined angular intervals over the entire circumference around the vertical axis in the horizontal plane, receives the laser light reflected by objects in the surrounding space, and measures the distance to each object based on the time from irradiation to reception. By driving the cross-section measuring device 109 while the unmanned aerial vehicle 1 flies inside the tunnel, the cross-section measuring device 109 generates cross-section measurement data on the horizontal (transverse) cross-sectional shape of the inner wall surface of the tunnel (the shape of the tunnel's internal space). The cross-section measurement data is, for example, a set of discrete polar-coordinate data referenced to the front of the unmanned aerial vehicle 1. In this embodiment, a horizontal-section LIDAR is used as the cross-section measuring device 109, but the present invention is not limited to this. For example, the entire circumference of the cross section around the unmanned aerial vehicle may be imaged by a visible-light camera, and the captured images may be integrated to create a 3D model, whereby the shape of the inner wall surface of the tunnel within the cross section can be measured and the self-position within the cross section can be estimated. SfM is known as an algorithm for creating such a 3D model, and Dense Visual SLAM is known as an algorithm for estimating self-position using SfM.
An altimeter 110 is provided above the control unit 101. A barometric altimeter, which measures altitude based on atmospheric pressure, is suitable as the altimeter. The unmanned aerial vehicle 1 also has an antenna 117.
FIG. 2 is a diagram showing the flight control system of the unmanned aerial vehicle shown in FIG. 1. The flight control system 200 of the unmanned aerial vehicle 1 includes the control unit 101, the motors 102 electrically connected to the control unit 101, the rotors 103 mechanically connected to the motors 102, the camera 108, the cross-section measuring device 109, the altimeter 110, and the IMU 111.
The control unit 101 performs the information processing for flight control of the unmanned aerial vehicle 1 and the control of the electrical signals required for it, and is typically a unit in which various electronic components are arranged and wired on a board to form the circuits necessary to realize such functions. The control unit 101 further comprises the information processing unit 120, a communication circuit 121, a control signal generation unit 122, speed controllers 123, and an interface 125.
The camera 108 is a camera for imaging an object (in this embodiment, the inner wall of the tunnel) at high resolution. The camera 108 is attached to the pedestal 106 via the support member 107 that rotatably supports it, so that the imaging direction can be changed. During the flight of the unmanned aerial vehicle 1, the camera 108 acquires image data of its shooting range, and the acquired images are stored in a storage device such as the memory described later. The images are typically a moving image consisting of a series of still images.
The IMU 111 is an Inertial Measurement Unit that detects translational motion with an acceleration sensor and rotational motion with an angular velocity sensor (gyro). The IMU 111 can calculate velocity by integrating the translational motion (acceleration) detected by the acceleration sensor, and can further calculate the travel distance (position) by integrating the velocity. Similarly, the angle (attitude) can be calculated by integrating the rotational motion (angular velocity) detected by the angular velocity sensor.
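The following is a minimal illustrative sketch, not part of the patent disclosure, of this dead-reckoning idea; the sample format and the sampling period dt are assumptions introduced only for illustration.

```python
# Minimal dead-reckoning sketch: integrate acceleration twice and angular
# velocity once. All names (dt, samples) are illustrative assumptions.
def integrate_imu(samples, dt):
    velocity, position, angle = 0.0, 0.0, 0.0
    for accel, omega in samples:          # accel [m/s^2], omega [rad/s]
        velocity += accel * dt            # translational motion -> velocity
        position += velocity * dt         # velocity -> travelled distance
        angle += omega * dt               # angular velocity -> attitude angle
    return position, velocity, angle

# Example: constant 0.1 m/s^2 and 0.01 rad/s for 1 s at 100 Hz
pos, vel, ang = integrate_imu([(0.1, 0.01)] * 100, dt=0.01)
```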
The antenna 117 is an antenna for receiving radio signals including information and various data for maneuvering and controlling the unmanned aerial vehicle 1, and for transmitting radio signals including telemetry signals from the unmanned aerial vehicle 1.
The communication circuit 121 is an electronic circuit, typically a radio signal processing IC, that demodulates maneuvering signals, control signals, various data, and the like for the unmanned aerial vehicle 1 from radio signals received through the antenna 117 and inputs them to the information processing unit 120, and that generates radio signals carrying telemetry signals and the like output from the unmanned aerial vehicle 1. Note that, for example, communication of maneuvering signals and communication of control signals and various data may be performed by separate communication circuits in different frequency bands. For example, it is also possible to adopt a configuration in which communication with the transmitter of a controller for manual operation is performed at a frequency in the 950 MHz band, while data communication is performed at a frequency in the 2 GHz, 1.7 GHz, 1.5 GHz, or 800 MHz band.
The control signal generation unit 122 converts the control command value data obtained by computation in the information processing unit 120 into pulse signals representing voltages (PWM signals or the like), and is typically an IC including an oscillation circuit and a switching circuit. The speed controller 123 converts the pulse signal from the control signal generation unit 122 into a drive voltage for driving the motor 102, and is typically a smoothing circuit and an analog amplifier. Although not shown, the unmanned aerial vehicle 1 includes a battery device such as a lithium polymer battery or a lithium ion battery, and a power supply system including power distribution to each element.
The interface 125 electrically connects the information processing unit 120 with functional elements such as the camera 108, the cross-section measuring device 109, the altimeter 110, and the IMU 111 by converting the form of signals so that they can be transmitted and received between them. For convenience of explanation, the interface is shown as a single component in the drawings, but it is usual to use different interfaces depending on the type of functional element to be connected. Depending on the type of signals input and output by the connected functional element, the interface 125 may be unnecessary. Conversely, even for elements shown in FIG. 2 as connected to the information processing unit 120 without the interface 125, an interface may be required depending on the type of signals input and output by the connected functional element.
FIG. 3 is a block diagram showing the configuration of the information processing unit of the unmanned aerial vehicle shown in FIG. 1. As shown in FIG. 3, the information processing unit 120 includes a cross-sectional shape data storage unit 130, a flight path data storage unit 131, a relative coordinate system calculation unit 136, an in-cross-section position/attitude estimation unit 132, a self-position data generation unit 133, a flight control unit 134, and a cross-section measurement data acquisition unit 135. In this embodiment, the cross-sectional shape data storage unit 130, the relative coordinate system calculation unit 136, the in-cross-section position/attitude estimation unit 132, the self-position data generation unit 133, and the cross-section measurement data acquisition unit 135 constitute a self-position estimation system 140 that generates self-position data on the self-position of the unmanned aerial vehicle. In this embodiment, the flight control system comprises the self-position estimation system 140, the flight path data storage unit 131, and the flight control unit 134.

FIG. 4 is a diagram showing the hardware configuration of the information processing unit of the unmanned aerial vehicle shown in FIG. 1. The information processing unit 120 includes a CPU 120a, a RAM 120b, a ROM 120c, an external memory 120d, an input unit 120e, an output unit 120f, and a communication unit 120g. The RAM 120b, the ROM 120c, the external memory 120d, the input unit 120e, the output unit 120f, and the communication unit 120g are connected to the CPU 120a via a bus 120h.
The CPU 120a comprehensively controls each device connected to the system bus 120h.

The ROM 120c and the external memory 120d store the BIOS and OS, which are control programs for the CPU 120a, and the various programs and data necessary for realizing the functions executed by the computer.
The RAM 120b functions as the main memory, work area, and the like of the CPU 120a. The CPU 120a realizes various operations by loading programs and the like necessary for executing processing from the ROM 120c or the external memory 120d into the RAM 120b and executing the loaded programs.

The external memory 120d is composed of, for example, a flash memory, a hard disk, a DVD-RAM, a USB memory, or the like.
The input unit 120e receives operation instructions and the like from a user or the like. The input unit 120e is composed of input devices such as input buttons, a keyboard, a pointing device, a wireless remote controller, a microphone, and a camera.
The output unit 120f outputs data processed by the CPU 120a and data stored in the RAM 120b, the ROM 120c, and the external memory 120d. The output unit 120f is composed of output devices such as a CRT display, an LCD, an organic EL panel, a printer, and a speaker.
The communication unit 120g is an interface for connecting to and communicating with external devices via a network or directly. The communication unit 120g is composed of interfaces such as a serial interface and a LAN interface.
The units 121, 122, 125, and 130 to 135 of the flight control system and the position estimation system shown in FIGS. 2 and 3 are realized by various programs stored in the ROM 120c or the external memory 120d using the CPU 120a, the RAM 120b, the ROM 120c, the external memory 120d, the input unit 120e, the output unit 120f, the communication unit 120g, and the like as resources.
The cross-sectional shape data storage unit 130 stores cross-sectional shape data on the cross-sectional shape of the inner wall surface of the tunnel. The flight path data storage unit 131 stores flight plan route data. The cross-sectional shape data storage unit 130 and the flight path data storage unit 131 are realized by reading data recorded on a storage medium and storing it in memory. Note that the cross-sectional shape data storage unit 130 and the flight path data storage unit 131 can be omitted by incorporating the cross-sectional shape data and the flight plan route data into the program.
In this embodiment, the information processing unit 120 functions as the self-position estimation system and the flight control system, but these systems may instead be provided in the unmanned aerial vehicle separately from the information processing unit 120. The self-position estimation system and the flight control system, or their components, need not be configured as one physical device and may be composed of a plurality of physical devices. The self-position estimation system and the flight control system may also be configured as any suitable device separate from the unmanned aerial vehicle, such as a ground station computer, a PC, a smartphone, or a tablet terminal, as a cloud computing system, or as a combination thereof. Furthermore, the functions of each part of the self-position estimation system and the flight control system may be executed by any one of, or distributed among, one or more devices provided in the unmanned aerial vehicle and one or more devices separate from the unmanned aerial vehicle.
FIGS. 5A and 5B show a tunnel and a flight plan route in an embodiment of the present invention; FIG. 5A is a horizontal sectional view of the tunnel, and FIG. 5B is a sectional view taken along line B-B in FIG. 5A. As shown in FIGS. 5A and 5B, in this embodiment the cross section of the inner wall surface 302 of the tunnel 300 is circular, and the tunnel extends in the vertical direction with the same cross-sectional shape. In addition, a pipe 310 extends in the vertical direction in the internal space of the tunnel, and the cross section of the outer wall surface 312 of the pipe 310 is circular.
Accordingly, the cross-sectional shape data representing the known shape of the inner wall surface 302 of the tunnel is recorded, for example, in an absolute coordinate system in which the center of the circular tunnel 300 is the origin (X = X0 (= 0), Y = Y0 (= 0)), the direction from the center of the tunnel 300 toward the center of the pipe 310 is the Y-axis direction, and the height direction is the Z-axis direction. In such an absolute coordinate system, the cross-sectional shape data representing the inner wall surface 302 of the tunnel is the circle (X - X0)² + (Y - Y0)² = R0², where R0 is the radius. Similarly, the cross-sectional shape data representing the known shape of the outer wall surface 312 of the pipe 310 is the circle (X - X1)² + (Y - R2)² = R1², where the center of the pipe 310 is at X = X1 (= 0), Y = Y1, the radius is R1, and the distance between the center of the tunnel 300 and the center of the pipe 310 is R2 = Y1. In this embodiment, the cross-sectional shape data takes the center of the inner wall surface 302 of the tunnel 300 as the origin, sets the X and Y axes so that the center of the pipe 310 lies on the Y axis, and includes the radius R0 of the inner wall surface of the tunnel 300, the center coordinates of the pipe 310, and the radius R1 of the pipe 310. However, the present invention is not limited to this; it suffices that only the distance between the center of the inner wall surface 302 of the tunnel 300 and the center of the pipe 310 (R2 in the above example) is included.
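To make this geometry concrete, the following is a minimal sketch, not taken from the patent, of one way the cross-sectional shape data could be held in memory; the class name, field names, and numeric values are illustrative assumptions.

```python
from dataclasses import dataclass

# Hypothetical container for the cross-sectional shape data described above.
# The numeric values are examples only; the embodiment notes that only R2
# (the tunnel-center to pipe-center distance) strictly needs to be included.
@dataclass
class CrossSectionShape:
    r0: float  # radius of tunnel inner wall 302: (X - X0)^2 + (Y - Y0)^2 = R0^2
    r1: float  # radius of pipe outer wall 312:  (X - X1)^2 + (Y - R2)^2 = R1^2
    r2: float  # distance from tunnel center to pipe center (= Y1)

shape = CrossSectionShape(r0=5.0, r1=0.5, r2=3.0)  # example values
```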
In this embodiment, the cross-sectional shape data includes data representing a circle as the geometric shape corresponding to the inner wall surface of the tunnel, but it is not limited to this and may include data representing a two-dimensional shape such as an ellipse or a rectangle, depending on the cross-sectional shape of the inner wall surface of the tunnel. Likewise, it may include data representing an ellipse, a rectangle, or the like, depending on the cross-sectional shape of the outer wall surface of the pipe. A geometric shape here means, for example, a shape that can be specified by mathematical expressions in terms of the X, Y, and Z coordinates. The present invention is particularly suitable when the cross-sectional shape data represents a rotationally symmetric geometric shape. Rotational symmetry here means that the cross-sectional shape of the inner wall surface, when rotated about a specific point, becomes substantially equal to the original shape; this includes ellipses, rectangles, regular polygons, and the like.
The flight plan route data is data representing a three-dimensional (X, Y, Z) flight plan route in the absolute coordinate system of the unmanned aerial vehicle 1, and is typically data on a set of a series of waypoints P1 to Pmn existing on the flight plan route. Each waypoint P1 to Pmn is set by three-dimensional coordinates (X, Y, Z) in the absolute coordinate system. The flight plan route is typically a set of straight lines connecting these waypoints in order, but within a predetermined range of a waypoint it can also be a curve with a predetermined curvature. The flight plan route data may include data defining the flight speed at the waypoints. The flight plan route data is typically used to define the flight plan route in autonomous flight, but it can also be used as a guide during flight in non-autonomous flight. The flight plan route data is typically input to and stored in the unmanned aerial vehicle 1 before flight. In this embodiment, the cross-sectional shape data and the flight plan route data are expressed in the XY coordinate system with the center of the tunnel 300 as the origin, but they are not limited to this and may, for example, be expressed by the height Z and polar coordinates with the center of the tunnel as the origin, referenced to the direction from the center of the tunnel toward the center of the pipe.
In this embodiment, as shown in FIG. 5B, the flight plan route is set as a repetition of the following meandering route (a sketch of generating such waypoints follows the list).
-In the cross section (XY cross section), move clockwise by a predetermined angle on a concentric circle along the inner wall surface 302 of the tunnel 300 (for example, waypoints P1 to Pn).
-Move a predetermined distance in the vertical direction (for example, waypoints Pn to Pn+1).
-In the cross section (XY cross section), move counterclockwise by a predetermined angle on a concentric circle along the inner wall surface 302 of the tunnel 300 (for example, waypoints Pn+1 to P2n).
-Move a predetermined distance in the vertical direction (for example, waypoints P2n to P2n+1).
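The following is a hedged sketch, not from the patent, of how such a meandering waypoint list could be generated; the function name, arc length, waypoint spacing, and all numeric parameters are assumptions made for illustration.

```python
import math

# Hypothetical sketch of the meandering waypoint pattern described above:
# arc along the wall, climb, reversed arc, climb, repeated.
def meander_waypoints(radius, z_start, z_step, levels, arc_deg=270.0, n_per_arc=10):
    waypoints = []
    z = z_start
    for level in range(levels):
        idx = range(n_per_arc + 1)
        if level % 2 == 1:                 # reverse the arc on alternate levels
            idx = reversed(list(idx))
        for i in idx:
            theta = math.radians(arc_deg * i / n_per_arc)
            waypoints.append((radius * math.sin(theta),   # X
                              radius * math.cos(theta),   # Y (pipe direction)
                              z))
        z += z_step                        # vertical move to the next level
    return waypoints

wps = meander_waypoints(radius=4.0, z_start=1.0, z_step=0.5, levels=4)
```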
The relative coordinate system calculation unit 136 calculates the relative coordinates of each feature point in the relative coordinate system based on the cross-section measurement data acquired by the cross-section measuring device 109.

The in-cross-section position/attitude estimation unit 132 estimates the in-cross-section self-position (X, Y) and the heading θ of the unmanned aerial vehicle within the cross section based on the relative coordinates calculated by the relative coordinate system calculation unit 136 and the cross-sectional shape data recorded in the cross-sectional shape data storage unit 130.
FIG. 6 is a horizontal sectional view showing the region that can be measured by the cross-section measuring device. Point D in the figure indicates the position of the unmanned aerial vehicle 1. The cross-section measuring device 109 irradiates laser light within the horizontal cross section at predetermined angular intervals, referenced to a predetermined direction (the front of the unmanned aerial vehicle 1: the direction of the arrow), receives the light reflected by the inner wall surface 302 of the tunnel 300 and the outer wall surface 312 of the pipe 310, and measures the distance to the object (the inner wall surface 302 of the tunnel 300 and the outer wall surface 312 of the pipe 310) according to the time from irradiation to reception. Therefore, the cross-section measurement data obtained from the cross-section measuring device 109 is discrete data Q1 to Qn in a relative coordinate system, referenced to the front of the unmanned aerial vehicle 1 and with the position of the unmanned aerial vehicle 1 as the origin, in which the angle αi at which the laser light was irradiated and the distance ri to the object form a pair. The relative coordinates of each discrete data point Qi are (αi, ri) (i = 1, ..., n).
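As an illustration only, not the patent's implementation, the polar samples (αi, ri) can be converted into Cartesian points in the vehicle-referenced relative frame as follows; the scan format is an assumption.

```python
import math

# Convert one scan of (angle, range) pairs into Cartesian points in the
# relative coordinate system (origin at the vehicle, x-axis toward its front).
def scan_to_points(scan):
    points = []
    for alpha, r in scan:                      # alpha [rad], r [m]
        points.append((r * math.cos(alpha),    # x: along the vehicle's front
                       r * math.sin(alpha)))   # y: to the vehicle's left
    return points

points = scan_to_points([(0.0, 4.9), (math.pi / 2, 5.1), (math.pi, 5.0)])
```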
In this case, when the laser light is irradiated from the cross-section measuring device 109 as shown in FIG. 6, for example, the reflected light returns to the cross-section measuring device 109 from the portion A of the outer wall surface 312 of the pipe 310 that directly faces the unmanned aerial vehicle 1, but most of the light reflected by the portion B of the outer wall surface 312 of the pipe 310 that lies to the side as seen from the unmanned aerial vehicle 1 does not return to the cross-section measuring device 109. Therefore, the cross-section measuring device 109 cannot detect the portion B of the outer wall surface 312 of the pipe 310 that lies to the side as seen from the unmanned aerial vehicle 1. In addition, the laser light does not reach the portion C of the inner wall surface 302 of the tunnel 300 located behind the pipe 310, so the cross-section measuring device 109 cannot detect the portion C of the inner wall surface 302 of the tunnel 300 located behind the pipe 310.
The cross-section measurement data acquisition unit 135 can control the cross-section measuring device 109 and can acquire the cross-section measurement data measured by the cross-section measuring device 109.
Based on the above, in this embodiment the in-cross-section self-position of the unmanned aerial vehicle is estimated as follows.

First, the relative coordinate system calculation unit 136 matches a first geometric shape corresponding to the inner wall surface 302 of the tunnel 300, a circle in this embodiment, to the cross-section measurement data. FIG. 7 is a diagram showing how the circle corresponding to the inner wall surface of the tunnel is matched to the cross-section measurement data. Note that FIG. 7 shows only part of the data constituting the cross-section measurement data.
As shown in FIG. 7, the cross-section measurement data consists of discrete points Q1 to Qn, and the relative coordinate system calculation unit 136 matches a circle to the discrete points Q1 to Qn. As the circle-matching algorithm, the least-squares method, RANSAC, or the like can be used, but as with line matching, RANSAC, which performs the matching while excluding outliers, is suitable. When matching a circle by RANSAC, the procedure is as follows (a sketch of this procedure follows the steps).
Step 1. First, randomly select three points from the discrete points Q1 to Qn.
Step 2. Derive a provisional circle model passing through the three points determined in Step 1.
Step 3. Compute the error between the obtained circle model and the discrete points Q1 to Qn. If the error is smaller than a predetermined value, add the model to the "correct model candidates".
Step 4. Repeat Steps 1 to 3.
Step 5. Among the correct model candidates, adopt the one that best fits the data as the result of the circle matching.
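The following is a minimal sketch of the five-step RANSAC circle matching described above, not the patent's actual implementation; the iteration count and error threshold are illustrative assumptions.

```python
import math, random

def circle_from_3(p1, p2, p3):
    # Circumscribed circle through three points (Steps 1-2).
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    d = 2 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    if abs(d) < 1e-9:
        return None                              # collinear points: no circle
    s1, s2, s3 = x1**2 + y1**2, x2**2 + y2**2, x3**2 + y3**2
    cx = (s1 * (y2 - y3) + s2 * (y3 - y1) + s3 * (y1 - y2)) / d
    cy = (s1 * (x3 - x2) + s2 * (x1 - x3) + s3 * (x2 - x1)) / d
    return cx, cy, math.hypot(x1 - cx, y1 - cy)

def ransac_circle(points, iterations=200, max_mean_error=0.05):
    best, best_error = None, float("inf")
    for _ in range(iterations):                              # Step 4
        model = circle_from_3(*random.sample(points, 3))     # Steps 1-2
        if model is None:
            continue
        cx, cy, r = model
        errors = [abs(math.hypot(x - cx, y - cy) - r) for x, y in points]
        mean_error = sum(errors) / len(errors)               # Step 3
        if mean_error < max_mean_error and mean_error < best_error:
            best, best_error = model, mean_error             # Steps 3 and 5
    return best                                              # (cx, cy, radius)
```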
For example, in the example shown in FIG. 7, Q3 deviates greatly from the trend of the other discrete points corresponding to the inner wall surface of the tunnel. In this example, if Q3, Q4, and Q5 are selected in Step 1, the circle model becomes C1, whereas if Q1, Q4, and Q8 are selected in Step 1, the circle model becomes C2. When the errors between these circle models C1 and C2 and the points Q1 to Qn are computed, the error of the circle model C1 is larger than that of the circle model C2. Therefore, in Step 5, values such as Q3 that deviate from the trend of the other data and were used when setting up the circle model C1 are not used as data for determining the circle, and a circle from which the influence of such outliers has been removed, such as the circle model C2, is matched. In this way, the first relative coordinates of the center point O1', which is the first feature point of the circle model C2, and the radius of the circle model C2 are estimated with reference to the front of the unmanned aerial vehicle 1. The center point O1' as the first feature point of the circle model C2 corresponds to the center point of the inner wall surface 302 of the tunnel 300 in the absolute coordinate system.
In this embodiment, RANSAC is used as the algorithm for performing the matching while excluding outliers, but it is also possible to use Markov Localization, Monte-Carlo Localization, a Particle Filter, the Hough Transform, or the like.
Next, as shown in FIG. 8, the relative coordinate system calculation unit 136 compares the circle model C2 matched in this way with the cross-section measurement data. In FIG. 8, the region in which cross-section measurement data exists is indicated by a thick line. As described above, the laser light emitted from the cross-section measuring device 109 does not reach the portion of the inner wall surface 302 of the tunnel 300 behind the pipe 310. Therefore, by comparing the matched circle model C2 with the cross-section measurement data, a gap region G of the circle model C2 in whose vicinity no discrete data of the cross-section measurement data exists can be identified. As a concrete method of identifying the gap region G, for example, the gap region may be taken as the part of the circle model C2 lying between the two end points of those discrete points of the cross-section measurement data whose distance from the circle model C2 is within a predetermined range (that is, for which the difference between the distance from the center point O1' to the discrete point and the radius of the circle model C2 is not more than a predetermined value). This gap region corresponds to the angular range, centered on the unmanned aerial vehicle 1 (the cross-section measuring device 109), in which the pipe 310 exists. The bisector L3 of the angular range, centered on the unmanned aerial vehicle 1 (the cross-section measuring device 109), of the gap region identified in this way is then determined. This bisector L3 indicates, with reference to the front of the unmanned aerial vehicle 1, the direction of the point corresponding to the center of the pipe 310 as seen from the unmanned aerial vehicle 1. That is, the direction of the center of the pipe 310 in the relative coordinates referenced to the front of the unmanned aerial vehicle 1 can be estimated (a simplified sketch of this step follows).
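As a simplified illustration of this gap-and-bisector step (not the patent's method verbatim; the threshold, the scan format, and the use of the largest angular gap are assumptions), one could proceed as follows.

```python
import math

# Identify the gap region G and its bisector direction from a scan of
# (angle, range) pairs and a fitted circle (cx, cy, r).
def gap_bisector(scan, cx, cy, r, wall_tol=0.2):
    # Angles (relative to the vehicle's front) whose returns lie on the
    # fitted tunnel-wall circle.
    wall_angles = []
    for alpha, rng in scan:
        x, y = rng * math.cos(alpha), rng * math.sin(alpha)
        if abs(math.hypot(x - cx, y - cy) - r) <= wall_tol:
            wall_angles.append(alpha)
    wall_angles.sort()
    # Take the largest angular interval with no wall return as the gap;
    # its bisector points toward the occluding pipe. (Wrap-around at +/- pi
    # is ignored in this sketch.)
    best_gap, bisector = 0.0, None
    for a0, a1 in zip(wall_angles, wall_angles[1:]):
        if a1 - a0 > best_gap:
            best_gap, bisector = a1 - a0, (a0 + a1) / 2.0
    return bisector          # direction of the pipe center [rad], or None
```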
In this embodiment, since the pipe 310 is circular, the relative coordinate system calculation unit 136 sets the bisector of the gap region; however, the present invention is not limited to this, and when the shape of the pipe 310 is, for example, rectangular, the bisector may be corrected according to the shape. For example, when the shape of the pipe is rectangular, a straight line or the like may be matched to the portion of the cross-section measurement data corresponding to the pipe, and the direction of the bisector may be corrected according to the angle of this straight line. The bisector obtained when the above circle is matched and the bisector to which such a correction has been applied are collectively referred to as the approximate bisector. However, even if the shape of the pipe 310 is rectangular, the correction is not necessarily required, and the direction of the center can be estimated with high accuracy even using the simple bisector.
Next, as shown in FIG. 9, the relative coordinate system calculation unit 136 identifies the relative coordinates of the intersection O2' between the circle C3 on which the center of the pipe 310 lies, centered on the center point O1' of the circle model C2, and the bisector L3. From the cross-sectional shape data, the position at which the center of the pipe 310 lies can be set as a circle of radius R2 centered on the center point O1 of the tunnel 300. The relative coordinates of this intersection O2' become the second position (relative coordinates) of the center point, as the second feature point of the pipe 310, with respect to the unmanned aerial vehicle 1, referenced to the front of the unmanned aerial vehicle 1.

The relative coordinate system calculation unit 136 then identifies the direction of the second relative coordinates of the intersection O2' with respect to the first relative coordinates of the center point O1' of the circle model C2 identified as described above.
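The intersection O2' itself can be obtained analytically; the following is an illustrative sketch under the same relative-frame conventions (vehicle at the origin, front along +x), not code from the patent.

```python
import math

# Locate the pipe-center estimate O2' as the intersection of the gap-bisector
# ray (from the vehicle, i.e. the relative-frame origin) with the circle of
# radius R2 centered on the fitted tunnel center O1'.
def pipe_center_estimate(o1, r2, bisector_angle):
    ux, uy = math.cos(bisector_angle), math.sin(bisector_angle)
    b = ux * o1[0] + uy * o1[1]                 # projection of O1' on the ray
    c = o1[0] ** 2 + o1[1] ** 2 - r2 ** 2
    disc = b * b - c                            # quadratic: t^2 - 2bt + c = 0
    if disc < 0:
        return None                             # ray misses the circle
    t = b - math.sqrt(disc)                     # nearer intersection first
    if t < 0:
        t = b + math.sqrt(disc)
    return t * ux, t * uy                       # O2' in relative coordinates
```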
Next, the in-cross-section position/attitude estimation unit 132 transforms the relative coordinate system into the absolute coordinate system so that the direction of the intersection O2' with respect to the center point O1' in the relative coordinate system corresponds to the direction (the Y-axis direction) of the center point O2 of the pipe 310 with respect to the center point O1 of the tunnel 300 in the absolute coordinate system, and so that the coordinates of the center point O1' in the relative coordinate system correspond to the center point O1 of the tunnel 300 in the absolute coordinate system. Specifically, as shown in FIG. 10, the relative coordinate system referenced to the unmanned aerial vehicle 1 is transformed into the absolute coordinate system so that the direction of the intersection O2' with respect to the center point O1' of the circle model C2 identified as described above becomes the Y direction in the absolute coordinate system and the center point O1' of the circle model C2 becomes the origin. The X coordinate, Y coordinate, and heading θ of the unmanned aerial vehicle 1 can thereby be calculated (a sketch of this transformation follows).
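A hedged sketch of this relative-to-absolute transformation is shown below; the angle conventions (heading measured from the absolute X axis, vehicle front along the relative +x axis) are assumptions made for illustration.

```python
import math

# Rotate the relative frame so that the O1'->O2' direction becomes the
# absolute +Y axis and O1' becomes the origin, then read off the vehicle pose.
def relative_to_absolute(o1, o2):
    phi = math.atan2(o2[1] - o1[1], o2[0] - o1[0])   # pipe direction in rel. frame
    rot = math.pi / 2 - phi                          # rotation taking phi to +Y
    dx, dy = -o1[0], -o1[1]                          # vehicle position w.r.t. O1'
    x = dx * math.cos(rot) - dy * math.sin(rot)      # absolute X of the vehicle
    y = dx * math.sin(rot) + dy * math.cos(rot)      # absolute Y of the vehicle
    heading = rot                                    # vehicle front in abs. frame
    return x, y, heading
```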
 Note that the method of specifying the second positional relationship between the self-position of the unmanned aerial vehicle 1 and the pipe 310 is not limited to the gap-region method described above. For example, a circle, as a second predetermined shape, may be matched with the cross-section measurement data in the region inside the circle model C2 (in particular, the region indicated by A in FIG. 6), and the relative coordinates of a predetermined point of the matched circle, namely its center point, may be estimated as the relative coordinates corresponding to the center of the pipe 310. Both methods may also be performed together to improve accuracy.
 The self-position data generation unit 133 generates the self-position data that serves as a reference when the flight control unit 134 controls flight. Specifically, the self-position data generation unit 133 generates self-position data including the XYZ coordinates and the heading θ, based on the X coordinate, the Y coordinate and the heading θ estimated by the in-cross-section position/attitude estimation unit 132 and the altitude (Z coordinate) acquired from the altimeter 110.
 The flight control unit 134 controls the flight of the unmanned aerial vehicle so as to follow the flight plan route of the flight plan route data stored in the flight route data storage unit 131, based on the self-position data generated by the self-position data generation unit 133. Specifically, the attitude, speed and the like of the unmanned aerial vehicle 1 are determined from the various sensors, the current flight position and heading of the unmanned aerial vehicle 1 are determined based on the self-position data generated by the self-position data generation unit 133, and a control command value for each rotor 103 is computed by comparison with target values such as the piloting signal, the flight plan route (target), the speed limit and the altitude limit; data indicating the control command values is output to the control signal generation unit 122. The control signal generation unit 122 converts each control command value into a pulse signal representing a voltage and transmits it to the corresponding speed controller 123. Each speed controller 123 converts the pulse signal into a drive voltage and applies it to the corresponding motor 102, thereby controlling the drive of each motor 102 and the rotational speed of each rotor 103, whereby the flight of the unmanned aerial vehicle 1 is controlled.
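 Purely as a hedged illustration of this outer loop, and not the controller of this embodiment, a simple proportional position loop of the following kind could turn the deviation between the generated self-position and the current waypoint into a velocity setpoint that lower-level attitude and ESC loops would convert into per-rotor commands; the gains, limits and names are assumptions of this sketch.

```python
# Illustrative proportional position loop (not the disclosed controller).
import numpy as np

def position_loop(self_pos: np.ndarray, waypoint: np.ndarray,
                  kp: float = 0.8, v_max: float = 1.0) -> np.ndarray:
    """self_pos, waypoint: (x, y, z) in the tunnel frame; returns (vx, vy, vz) in m/s."""
    v_cmd = kp * (waypoint - self_pos)
    speed = float(np.linalg.norm(v_cmd))
    if speed > v_max:
        v_cmd *= v_max / speed        # respect the configured speed limit
    return v_cmd
```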
 The flow in which the above unmanned aerial vehicle autonomously flies inside the tunnel and photographs the tunnel inner wall surface will now be described.
 FIG. 11 is a flowchart showing the flow in which the unmanned aerial vehicle shown in FIG. 1 autonomously flies inside the tunnel and photographs the tunnel inner wall surface. As shown in FIG. 11, the information processing unit 120 first refers to the flight plan route data stored in the flight route data storage unit 131, and the flight control unit 134 sets the first waypoint as the target (S100).
 Next, while the self-position data generation unit 133 estimates the self-position and attitude of the unmanned aerial vehicle 1, the flight control unit 134 of the information processing unit 120 controls the rotational speed of the motors 102 based on the estimated self-position and attitude, and the unmanned aerial vehicle flies autonomously toward the target (S110).
 FIG. 12 is a flowchart showing in detail the flow of autonomous flight by the unmanned aerial vehicle shown in FIG. 1. When estimating the self-position, the cross-section measurement data acquisition unit 135 first controls the cross-section measuring device 109 to measure and acquire the cross-section measurement data (S111: cross-section measurement data acquisition step).
 Then, based on the cross-section measurement data acquired by the cross-section measuring device 109 and the cross-sectional shape data recorded in the cross-sectional shape data storage unit 130, the relative coordinate system calculation unit 136 and the in-cross-section position/attitude estimation unit 132 estimate the self-position (X, Y) and attitude (heading θ) of the unmanned aerial vehicle within the cross section.
 Specifically, as described with reference to FIG. 7, the relative coordinate system calculation unit 136 first matches a circle corresponding to the inner wall surface 302 of the tunnel 300 with the cross-section measurement data, and specifies the first relative coordinates of the center position of the matched circle as the first feature point (corresponding to the center coordinates of the tunnel inner wall surface 302), referenced to the position and attitude of the unmanned aerial vehicle 1 (S112: first feature point estimation step).
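 As an illustrative sketch of the kind of circle matching used in S112 (one common approach, not necessarily the disclosed one), an algebraic least-squares fit to the 2D scan points can be used; the function name and the assumption that the points are expressed as 2D coordinates in the UAV-referenced relative frame are assumptions of the example.

```python
# Algebraic (Kasa) least-squares circle fit to cross-section scan points.
import numpy as np

def fit_circle(points: np.ndarray):
    """points: (N, 2) array of cross-section scan points. Returns (center, radius)."""
    x, y = points[:, 0], points[:, 1]
    # Solve x^2 + y^2 + D*x + E*y + F = 0 for (D, E, F) in the least-squares sense.
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x**2 + y**2)
    D, E, F = np.linalg.lstsq(A, b, rcond=None)[0]
    center = np.array([-D / 2.0, -E / 2.0])
    radius = float(np.sqrt(center @ center - F))
    return center, radius
```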
 Next, as described with reference to FIG. 8, the relative coordinate system calculation unit 136 specifies, in the cross-section measurement data, the gap region G corresponding to the angular range of the circle model C2 that lies behind the pipe 310, which the laser light does not reach and in the vicinity of which no discrete points of the cross-section measurement data exist (S113).
 Next, as described with reference to FIG. 9, the relative coordinate system calculation unit 136 sets the bisector L3 of the angular range of the gap region G centered on the unmanned aerial vehicle 1 (the cross-section measuring device 109) (S114).
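 A rough sketch of S113 and S114 under stated assumptions (the scan reduced to 2D points in the relative frame, a 1-degree angular histogram, wrap-around gaps ignored) might look as follows; all names and thresholds are illustrative, not part of the disclosure.

```python
# Locate the angular range with no returns near the fitted wall circle (the
# gap occluded by the pipe) and return its bisector direction.
import numpy as np

def gap_bisector(points: np.ndarray, center: np.ndarray, radius: float,
                 ang_res_deg: float = 1.0, tol: float = 0.3) -> float:
    """Return the bisector angle (rad) of the widest angular gap, as seen from
    the UAV at the origin, containing no point near the wall circle."""
    angles_deg = np.degrees(np.arctan2(points[:, 1], points[:, 0]))
    on_wall = np.abs(np.linalg.norm(points - center, axis=1) - radius) < tol
    bins = np.arange(-180.0, 180.0, ang_res_deg)
    occupied = np.zeros(len(bins), dtype=bool)
    occupied[np.digitize(angles_deg[on_wall], bins) - 1] = True
    # Longest run of empty bins = gap region G (wrap-around ignored for brevity).
    best_len = best_start = run_len = run_start = 0
    for i, occ in enumerate(occupied):
        if not occ:
            if run_len == 0:
                run_start = i
            run_len += 1
            if run_len > best_len:
                best_len, best_start = run_len, run_start
        else:
            run_len = 0
    mid_deg = bins[best_start] + 0.5 * best_len * ang_res_deg
    return float(np.radians(mid_deg))
```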
 Next, based on the cross-sectional shape data, the relative coordinate system calculation unit 136 obtains the distance between the center of the tunnel 300 and the center of the pipe 310, and sets a circle centered on the center point of the matched circle whose radius is equal to that distance. The relative coordinates of the intersection of this circle and the bisector are then calculated, and the calculated relative coordinates are taken as the second relative coordinates of the second feature point corresponding to the center of the pipe 310 (S115: second feature point estimation step).
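 This intersection can be computed as a ray-circle intersection; the following is a minimal sketch with illustrative names, not the disclosed implementation, in which the bisector ray is cast from the unmanned aerial vehicle at the origin of the relative frame.

```python
# Intersect the gap-bisector ray with the circle of radius r2 centred on the
# estimated tunnel centre o1; the result is the pipe centre in the relative frame.
import numpy as np

def pipe_center_relative(o1: np.ndarray, r2: float, bisector_rad: float) -> np.ndarray:
    """Return the ray/circle intersection nearest to the UAV along the bisector."""
    d = np.array([np.cos(bisector_rad), np.sin(bisector_rad)])   # unit ray direction
    # Points on the ray are t*d, t >= 0; substitute into |t*d - o1|^2 = r2^2.
    b = -2.0 * float(d @ o1)
    c = float(o1 @ o1) - r2**2
    disc = b**2 - 4.0 * c
    if disc < 0.0:
        raise ValueError("bisector ray does not intersect the pipe-centre circle")
    roots = [(-b - np.sqrt(disc)) / 2.0, (-b + np.sqrt(disc)) / 2.0]
    t = min(t for t in roots if t >= 0.0)
    return t * d
```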
 Next, the relative coordinate system calculation unit 136 calculates the direction of the second relative coordinates corresponding to the center point O2 of the pipe 310 with respect to the first relative coordinates of the center point O1' of the circle in the relative coordinate system (S116: calculation step).
 Then, the in-cross-section position/attitude estimation unit 132 performs a coordinate transformation such that the direction from the center point O1' to the intersection O2' in the relative coordinate system corresponds to the direction (Y-axis direction) from the center point O1 of the tunnel 300 to the center point O2 of the pipe 310 in the absolute coordinate system, and such that the coordinates of the center point O1' in the relative coordinate system correspond to the center point O1 of the tunnel 300 in the absolute coordinate system. The absolute coordinates of the self-position of the unmanned aerial vehicle 1 within the cross section and the heading θ are thereby estimated (S117: in-cross-section position/attitude estimation step).
 Next, the self-position data generation unit 133 generates the self-position data from the in-plane self-position (X, Y) and heading θ estimated by the in-cross-section position/attitude estimation unit 132 and the altitude (Z) measured by the altimeter 110 (S118: self-position data generation step). The self-position data generation unit 133 may determine the accuracy of the generated self-position data based on the position information acquired from the IMU 111, and may stop the photographing inside the tunnel when the accuracy of the generated self-position data is low.
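 A small sketch, with assumed field names and thresholds, of how the in-plane estimate, heading and altimeter reading might be combined into one record, with a coarse plausibility check against an IMU-propagated position as mentioned above:

```python
# Assemble a self-position record and sanity-check it against the IMU track.
from dataclasses import dataclass
import numpy as np

@dataclass
class SelfPosition:
    x: float
    y: float
    z: float
    heading: float          # rad, in the tunnel-fixed frame

def make_self_position(x, y, heading, altitude, imu_xy=None, max_jump=1.0):
    """Raise if the cross-section estimate jumps too far from the IMU position."""
    if imu_xy is not None and np.hypot(x - imu_xy[0], y - imu_xy[1]) > max_jump:
        raise RuntimeError("estimate disagrees with IMU; stop in-tunnel imaging")
    return SelfPosition(x, y, altitude, heading)
```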
 Then, based on the self-position data generated by the self-position data generation unit 133, the flight control unit 134 determines whether the unmanned aerial vehicle 1 has reached the target (S119).
 When the flight control unit 134 determines that the unmanned aerial vehicle 1 has not reached the target (NO in S119), the flight control unit 134 computes a control command value for each rotor 103 based on the self-position data generated by the self-position data generation unit 133 so that the vehicle flies toward the target, and outputs data indicating the control command values to the control signal generation unit 122 (S120). The control signal generation unit 122 converts each control command value into a pulse signal representing a voltage and transmits it to the corresponding speed controller 123. Each speed controller 123 converts the pulse signal into a drive voltage and applies it to the corresponding motor 102, thereby controlling the drive of each motor 102 and the rotational speed of each rotor 103, whereby the flight of the unmanned aerial vehicle 1 is controlled.
 By repeating S111 to S120 at predetermined minute time intervals, the unmanned aerial vehicle 1 flies autonomously toward the target. When the unmanned aerial vehicle 1 reaches the target in this way and the flight control unit 134 determines in S119 that the unmanned aerial vehicle 1 has reached the target (YES in S119), the information processing unit 120 drives the camera 108 to photograph the tunnel inner wall surface (S130). The image data of the captured images is stored in the memory of the information processing unit 120 or recorded in an appropriate terminal via the antenna 117.
 Then, when the target reached by the unmanned aerial vehicle 1 is not the final waypoint Pmn (NO in S140), the next waypoint P is set as the target (S135), and S110 to S140 are repeated.
 When the target reached by the unmanned aerial vehicle 1 is the final waypoint Pmn (YES in S140), the unmanned aerial vehicle 1 determines that all the required steps have been completed, and the flight control unit 134 ends the operation, for example by flying the vehicle back to its initial position.
 According to the present embodiment, the following effects are obtained.
 Inside a tunnel, GPS and the like cannot be used, and the compass may also be unusable due to the influence of geomagnetism. In such a case, estimating the position of the unmanned aerial vehicle with an algorithm such as laser SLAM is conceivable. However, even when the position is estimated by an algorithm such as laser SLAM, it is difficult to estimate an accurate position if the cross-sectional shape of the tunnel or the like is rotationally symmetric. In contrast, in the present embodiment, the relative coordinate system calculation unit 136 calculates the direction from the point O1' corresponding to the center of the tunnel 300 to the point O2' corresponding to the center of the pipe 310 in the relative coordinate system, so that the relationship between the relative coordinate system and the absolute coordinate system can be specified based on this direction, and an accurate self-position can be estimated even inside the rotationally symmetric tunnel 300.
 Further, in the present embodiment, the relative coordinate system calculation unit 136 matches a circle with the inner wall surface of the tunnel. The inner wall surface of a tunnel is often circular, elliptical or rectangular, but even if a rotationally symmetric shape such as a circle, an ellipse or a rectangle is matched, the position of the unmanned aerial vehicle cannot be uniquely identified from the positional relationship between the vehicle and the circle, ellipse or rectangle alone. In contrast, in the present embodiment, the point corresponding to the center of the pipe 310 in the cross-section measurement data is specified, so that the position of the unmanned aerial vehicle can be uniquely identified.
 Further, in the present embodiment, the in-cross-section position/attitude estimation unit 132 performs the coordinate transformation such that the direction from the point O1' corresponding to the center of the tunnel 300 to the point O2' corresponding to the center of the pipe 310 in the relative coordinate system corresponds to the Y axis, which is the reference axis. This makes it possible to estimate the heading direction of the unmanned aerial vehicle in the absolute coordinate system.
 Further, in the present embodiment, the in-cross-section position/attitude estimation unit 132 performs the coordinate transformation such that the direction from the point O1' corresponding to the center of the tunnel 300 to the point O2' corresponding to the center of the pipe 310 in the relative coordinate system corresponds to the Y axis, which is the reference axis, and furthermore such that the point O1' in the relative coordinate system corresponds to the origin corresponding to the center of the tunnel in the absolute coordinate system. This makes it possible to estimate the coordinates of the unmanned aerial vehicle in the absolute coordinate system.
 Further, in the present embodiment, the relative coordinate system calculation unit 136 identifies the gap region G, within the range corresponding to the circle in the cross-section measurement data, in which no data exist, and estimates the point corresponding to the center point of the pipe 310 in the relative coordinate system by finding the intersection of the bisector of the gap region G and the circle of radius R2 centered on the center point O1' of the matched circle C2. Consequently, the position of the unmanned aerial vehicle 1 can be estimated even when, for example, the radius of the pipe 310 is unknown. Moreover, even when, as in the present embodiment, the cross-sectional shape of the pipe 310 is circular and the side surface of the pipe 310 cannot be measured by the cross-section measuring device 109 as viewed from the unmanned aerial vehicle 1, the relative coordinates of the pipe 310 can be estimated accurately.
 Further, in the present embodiment, the cross-section measurement data are discrete data, and the relative coordinate system calculation unit 136 removes outliers from the discrete data before performing the matching. Matching can thereby be performed even when the inner wall surface 302 of the tunnel 300 has projections and recesses such as lighting fixtures or has ventilation openings and the like formed in it.
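 As an illustrative sketch of such outlier removal (one simple possibility, not necessarily the disclosed method), returns whose radial residual from a provisional wall-circle fit exceeds a threshold can be discarded before refitting; the threshold and iteration count are assumptions of the example.

```python
# Discard returns far from a provisional wall circle (lights, vents, brackets)
# and refit the circle on the remaining inliers.
import numpy as np

def _kasa_fit(pts: np.ndarray):
    # Algebraic circle fit, as in the fit_circle sketch shown earlier.
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    b = -(pts[:, 0]**2 + pts[:, 1]**2)
    D, E, F = np.linalg.lstsq(A, b, rcond=None)[0]
    c = np.array([-D / 2.0, -E / 2.0])
    return c, float(np.sqrt(c @ c - F))

def fit_circle_robust(points: np.ndarray, threshold: float = 0.2, iters: int = 3):
    center, radius = _kasa_fit(points)
    for _ in range(iters):
        resid = np.abs(np.linalg.norm(points - center, axis=1) - radius)
        inliers = points[resid < threshold]
        if len(inliers) < 3:
            break
        center, radius = _kasa_fit(inliers)
    return center, radius
```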
 Note that, although the present embodiment describes the case where the circular pipe 310 extends in the vertical direction, the cross-sectional shape of the pipe 310 (internal structure) is not limited to this and may be another shape such as a rectangle. When the cross-sectional shape of the internal structure is rectangular, the bisector of the gap region set in S114 does not necessarily pass through the center of the rectangle, but the resulting error is small, so there is no problem in practice.
 Note that in the present embodiment, the relative coordinate system calculation unit 136 calculates the relative coordinates of the pipe 310 by matching the circle with the cross-section measurement data in S112, specifying the gap region in S113, setting the bisector of the angular range of the gap region in S114, and specifying, in S115, the intersection of the gap bisector and the circle on which the center of the pipe lies. However, the method by which the relative coordinate system calculation unit 136 of the present invention calculates the relative coordinates of the pipe 310 is not limited to this. For example, when the cross-sectional shape of the pipe 310 is circular, a circle as a second predetermined shape may be matched with the data of the cross-section measurement data lying outside the range of the inner wall surface 302 of the tunnel 300 to specify a predetermined position of the pipe 310 (for example, its center position), and this position may be used as the position of the second feature point related to the pipe 310 in the relative coordinate system. Naturally, when the shape of the pipe 310 is elliptical or rectangular, the second predetermined shape to be matched may be changed accordingly. The second predetermined shape to be matched is preferably a geometric shape.
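 The alternative just described might look like the following sketch, which assumes a circular pipe and that returns lying well inside the fitted tunnel-wall circle belong to the pipe; the margin and the choice of an algebraic circle fit are assumptions of this illustration.

```python
# Fit a second circle to the non-wall returns; its centre approximates the
# pipe centre (the second feature point) in the relative frame.
import numpy as np

def pipe_center_by_fit(points: np.ndarray, wall_center: np.ndarray,
                       wall_radius: float, margin: float = 0.5) -> np.ndarray:
    inner = points[np.linalg.norm(points - wall_center, axis=1) < wall_radius - margin]
    A = np.column_stack([inner[:, 0], inner[:, 1], np.ones(len(inner))])
    b = -(inner[:, 0]**2 + inner[:, 1]**2)
    D, E, _ = np.linalg.lstsq(A, b, rcond=None)[0]
    return np.array([-D / 2.0, -E / 2.0])
```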
 Further, in the present embodiment, the coordinate in the height direction is determined by the altimeter, but the present invention is not limited to this, and the coordinate in the height direction may instead be determined by the IMU.
 Further, in the present embodiment, the present invention is applied to the tunnel 300 extending in the vertical direction, but the present invention is not limited to this and can also be applied to chimneys, vertical shafts in buildings, and the like. The present invention can furthermore be applied to tunnels and the like extending obliquely upward. In that case, the cross-sectional shape data storage unit 130 may store the cross-sectional shape data of the tunnel 300 and the internal structure for each height.
1    Unmanned aerial vehicle
101  Control unit
102  Motor
103  Rotor
104  Arm
105  Landing gear
106  Pedestal
107  Support member
108  Camera
110  Altimeter
109  Cross-section measuring device
111  IMU
117  Antenna
120  Information processing unit
120a CPU
120b RAM
120c ROM
120d External memory
120e Input unit
120f Output unit
120g Communication unit
120h System bus
121  Communication circuit
122  Control signal generation unit
123  Speed controller
125  Interface
130  Cross-sectional shape data storage unit
131  Flight route data storage unit
132  In-cross-section position/attitude estimation unit
133  Self-position data generation unit
134  Flight control unit
135  Cross-section measurement data acquisition unit
136  Relative coordinate system calculation unit
140  Self-position estimation system
200  Flight control system
300  Tunnel
302  Inner wall surface
310  Pipe
312  Outer wall surface

Claims (14)

  1.  An estimation system for estimating a self-position and/or an attitude of an unmanned aerial vehicle, wherein
     the unmanned aerial vehicle includes a cross-section measuring device that measures a spatial shape, referenced to a predetermined direction of the unmanned aerial vehicle, within a cross section transverse to a longitudinal flight path of the unmanned aerial vehicle, and generates cross-section measurement data,
     the estimation system comprising:
     a cross-section measurement data acquisition unit that acquires the cross-section measurement data generated by the cross-section measuring device when the unmanned aerial vehicle flies in an internal space of a structure in which the internal space extends longitudinally along a predetermined path and a longitudinally extending internal structure is provided in the internal space; and
     a relative coordinate system calculation unit that
     matches a rotationally symmetric first predetermined shape, corresponding to a cross-sectional shape of an inner wall surface of the structure, with the cross-section measurement data acquired by the cross-section measurement data acquisition unit, and estimates a first position of a first feature point that is a center of the matched first predetermined shape in a relative coordinate system referenced to the unmanned aerial vehicle in the cross section,
     estimates, based on the cross-section measurement data acquired by the cross-section measurement data acquisition unit, a second position of a second feature point related to the internal structure in the relative coordinate system, and
     calculates a direction of the second position with respect to the first position in the relative coordinate system.
  2.  The estimation system according to claim 1, further comprising an in-cross-section position/attitude estimation unit that estimates the self-position and/or attitude of the unmanned aerial vehicle in an absolute coordinate system whose reference axis is the direction of a point corresponding to the second feature point of the internal structure with respect to a point corresponding to the first feature point of the structure in the cross section.
  3.  The estimation system according to claim 2, wherein the in-cross-section position/attitude estimation unit is configured to transform the relative coordinate system into the absolute coordinate system such that the direction of the second position with respect to the first position in the relative coordinate system corresponds to the reference axis, and to estimate a heading direction of the unmanned aerial vehicle in the absolute coordinate system.
  4.  The estimation system according to claim 2 or 3, wherein the in-cross-section position/attitude estimation unit performs a coordinate transformation such that the direction of the second position with respect to the first position in the relative coordinate system corresponds to the reference axis and the first position in the relative coordinate system corresponds to coordinates in the absolute coordinate system corresponding to the first feature point of the structure, and estimates coordinates of the unmanned aerial vehicle in the absolute coordinate system.
  5.  The estimation system according to any one of claims 1 to 4, wherein the first predetermined shape is any one of a circle, an ellipse and a rectangle.
  6.  The estimation system according to any one of claims 1 to 5, wherein the relative coordinate system calculation unit
     identifies a gap region in which no data exist within a region near the matched first predetermined shape in the cross-section measurement data, and
     estimates, as the second position, an intersection of a substantially bisecting line of an angular range of the gap region as seen from the unmanned aerial vehicle and a circle whose radius is a preset distance between the first feature point and the second feature point and whose center is the first feature point.
  7.  The estimation system according to any one of claims 1 to 5, wherein the relative coordinate system calculation unit matches a second predetermined shape, corresponding to a portion corresponding to the internal structure, with the cross-section measurement data, and estimates relative coordinates of a predetermined point of the matched second predetermined shape as the second position.
  8.  The estimation system according to any one of claims 1 to 7, wherein the cross-section measurement data are discrete data, and the relative coordinate system calculation unit removes outliers from the discrete data and performs the matching.
  9.  A flight control system for an unmanned aerial vehicle, comprising: the estimation system according to any one of claims 1 to 8; and a flight control unit that causes the unmanned aerial vehicle to fly along a set flight plan route based on the self-position and/or attitude of the unmanned aerial vehicle estimated by the estimation system.
  10.  An unmanned aerial vehicle having the estimation system according to any one of claims 1 to 8.
  11.  An unmanned aerial vehicle having the flight control system according to claim 9.
  12.  An estimation method for estimating, by a computer, a self-position and/or an attitude of an unmanned aerial vehicle, wherein
     the unmanned aerial vehicle includes a cross-section measuring device that measures a spatial shape, referenced to a predetermined direction of the unmanned aerial vehicle, within a cross section transverse to a longitudinal flight path of the unmanned aerial vehicle, and generates cross-section measurement data,
     the method comprising:
     a cross-section measurement data acquisition step of acquiring the cross-section measurement data generated by the cross-section measuring device when the unmanned aerial vehicle flies in an internal space of a structure in which the internal space extends longitudinally along a predetermined path and a longitudinally extending internal structure is provided in the internal space;
     a first feature point estimation step of matching a rotationally symmetric first predetermined shape, corresponding to a cross-sectional shape of an inner wall surface of the structure, with the acquired cross-section measurement data, and estimating a first position of a first feature point that is a center of the matched first predetermined shape in a relative coordinate system referenced to the unmanned aerial vehicle in the cross section;
     a second feature point estimation step of estimating, based on the acquired cross-section measurement data, a second position of a second feature point related to the internal structure in the relative coordinate system; and
     a calculation step of calculating a direction of the second position with respect to the first position in the relative coordinate system.
  13.  A program for causing a computer to execute the method according to claim 12.
  14.  A computer-readable recording medium on which the program according to claim 12 is recorded.
PCT/JP2021/000814 2021-01-13 2021-01-13 Estimation system for estimating self-location and/or attitude of unmanned aerial vehicle, flight control system, unmanned aerial vehicle, program, and recording medium WO2022153391A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2022574908A JPWO2022153391A1 (en) 2021-01-13 2021-01-13
PCT/JP2021/000814 WO2022153391A1 (en) 2021-01-13 2021-01-13 Estimation system for estimating self-location and/or attitude of unmanned aerial vehicle, flight control system, unmanned aerial vehicle, program, and recording medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/000814 WO2022153391A1 (en) 2021-01-13 2021-01-13 Estimation system for estimating self-location and/or attitude of unmanned aerial vehicle, flight control system, unmanned aerial vehicle, program, and recording medium

Publications (1)

Publication Number Publication Date
WO2022153391A1 true WO2022153391A1 (en) 2022-07-21

Family

ID=82447012

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/000814 WO2022153391A1 (en) 2021-01-13 2021-01-13 Estimation system for estimating self-location and/or attitude of unmanned aerial vehicle, flight control system, unmanned aerial vehicle, program, and recording medium

Country Status (2)

Country Link
JP (1) JPWO2022153391A1 (en)
WO (1) WO2022153391A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017075863A (en) * 2015-10-15 2017-04-20 株式会社プロドローン Aerial type inspection device and inspection method
JP2017090146A (en) * 2015-11-06 2017-05-25 株式会社プロドローン Surface inspection device and surface inspection method using the same
JP2019167044A (en) * 2018-03-26 2019-10-03 株式会社日立製作所 Unmanned flight body introduction device, in-conduit line work system and work method using unmanned flight body
CN109031312A (en) * 2018-04-26 2018-12-18 中国计量大学 Flying platform positioning device and localization method suitable for chimney inside processing

Also Published As

Publication number Publication date
JPWO2022153391A1 (en) 2022-07-21

Similar Documents

Publication Publication Date Title
US10914590B2 (en) Methods and systems for determining a state of an unmanned aerial vehicle
US10565732B2 (en) Sensor fusion using inertial and image sensors
EP3158293B1 (en) Sensor fusion using inertial and image sensors
EP3158417B1 (en) Sensor fusion using inertial and image sensors
EP3158411B1 (en) Sensor fusion using inertial and image sensors
US10534068B2 (en) Localization system, vehicle control system, and methods thereof
WO2017177542A1 (en) Object tracking method, device and system
CN112335190B (en) Radio link coverage map and impairment system and method
WO2020103049A1 (en) Terrain prediction method and device of rotary microwave radar, and system and unmanned aerial vehicle
EP4015993A1 (en) Aircraft sensor system synchronization
WO2020062356A1 (en) Control method, control apparatus, control terminal for unmanned aerial vehicle
Del Pizzo et al. A Vision-based navigation system for landing procedure
WO2022153391A1 (en) Estimation system for estimating self-location and/or attitude of unmanned aerial vehicle, flight control system, unmanned aerial vehicle, program, and recording medium
Lauterbach et al. Preliminary results on instantaneous UAV-based 3D mapping for rescue applications
WO2022153392A1 (en) Self-position estimation system and method for uncrewed aircraft, uncrewed aircraft, program, and recording medium
WO2022153390A1 (en) Self-position estimation system for estimating self position of uncrewed aircraft, flight control system, uncrewed aircraft, program, and recording medium
US20210199798A1 (en) Continuous wave radar terrain prediction method, device, system, and unmanned aerial vehicle
JP7031997B2 (en) Aircraft system, air vehicle, position measurement method, program
JP6934116B1 (en) Control device and control method for controlling the flight of an aircraft
Mossel et al. SmartCopter: Enabling autonomous flight in indoor environments with a smartphone as on-board processing unit
Desai et al. Stabilization and control of quad-rotor helicopter using a smartphone device
CN110892353A (en) Control method, control device and control terminal of unmanned aerial vehicle
WO2024004155A1 (en) System, subsystem, method and program for estimating position of wind-power generation device, and storage medium having said program therein
WO2021232296A1 (en) Method for controlling unmanned aerial vehicle, device, unmanned aerial vehicle, and storage medium
WO2023188378A1 (en) System, method, and program for using unmanned aircraft to estimate rotor direction/azimuth of wind power generation device, and storage medium storing program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21919287

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022574908

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21919287

Country of ref document: EP

Kind code of ref document: A1