WO2021149546A1 - Information processing device - Google Patents

Information processing device

Info

Publication number
WO2021149546A1
WO2021149546A1 (PCT/JP2021/000781)
Authority
WO
WIPO (PCT)
Prior art keywords
landing
unit
obstacle
distance
drone
Prior art date
Application number
PCT/JP2021/000781
Other languages
French (fr)
Japanese (ja)
Inventor
高橋 誠
鷹見 忠雄
寛 河上
香緒莉 新畑
Original Assignee
NTT DOCOMO, INC.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NTT DOCOMO, INC.
Priority to JP2021573081A (patent JP7461384B2)
Publication of WO2021149546A1


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C: AEROPLANES; HELICOPTERS
    • B64C27/00: Rotorcraft; Rotors peculiar thereto
    • B64C27/04: Helicopters
    • B64D: EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D45/00: Aircraft indicators or protectors not otherwise provided for
    • B64D45/04: Landing aids; Safety measures to prevent collision with earth's surface
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G5/00: Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/02: Automatic approach or landing aids, i.e. systems in which flight data of incoming planes are processed to provide landing data

Definitions

  • The present invention relates to a technique for landing an air vehicle.
  • Patent Document 1 discloses a technique in which a GPS mounted on an unmanned aircraft is used to fly to the vicinity of the landing point, after which a target mark placed at the landing point is captured by an imaging device and aimed at for landing, and a technique for landing such that the difference between the position information measured by the GPS of the takeoff and landing platform and the position information of the drone becomes zero.
  • An object of the present invention is to reduce the possibility of the flying object colliding with an obstacle when it lands.
  • To achieve this object, the present invention provides an information processing apparatus comprising: an acquisition unit that acquires an image taken by a stereo camera provided on the flying object; a calculation unit that calculates the distance from the stereo camera to the point indicated by each pixel of the acquired image; a detection unit that detects an obstacle from the acquired image; and a landing instruction unit that, even when an obstacle is detected, instructs the landing of the flying object based on the calculated distances if the position of the obstacle satisfies a landing condition.
  • Brief description of the drawings: a diagram showing an example of the damage degree table; a diagram showing an example of the determined damage level; a diagram showing an example of the operation procedure in the landing process; a diagram showing an example of the overall configuration of the landing support system according to the second embodiment; a diagram showing an example of the hardware configuration of the server device; and a diagram showing the functional configuration realized in the embodiment.
  • FIG. 1 shows an example of the hardware configuration of the drone 10 according to the first embodiment.
  • The drone 10 may be physically configured as a computer device including a processor 11, a memory 12, a storage 13, a communication device 14, a flight device 15, a sensor device 16, a stereo camera 17, a bus 18, and the like.
  • In the following description, the word "device" can be read as a circuit, a device, a unit, or the like.
  • The processor 11 operates, for example, an operating system to control the entire computer.
  • The processor 11 may be configured by a central processing unit (CPU: Central Processing Unit) including an interface with peripheral devices, a control device, an arithmetic unit, registers, and the like.
  • The baseband signal processing unit and the like may be realized by the processor 11. Further, the processor 11 reads a program (program code), software modules, data, and the like from at least one of the storage 13 and the communication device 14 into the memory 12, and executes various processes according to what it has read.
  • As the program, a program that causes a computer to execute at least a part of the operations described in the embodiments is used.
  • Although the various processes described above are described as being executed by one processor 11, they may be executed simultaneously or sequentially by two or more processors 11.
  • the processor 11 may be implemented by one or more chips.
  • the program may be transmitted from the network via a telecommunication line.
  • the memory 12 is a computer-readable recording medium.
  • The memory 12 may be composed of at least one of, for example, a ROM (Read Only Memory), an EPROM (Erasable Programmable ROM), an EEPROM (Electrically Erasable Programmable ROM), and a RAM (Random Access Memory).
  • the memory 12 may be referred to as a register, a cache, a main memory (main storage device), or the like.
  • the memory 12 can store a program (program code), a software module, or the like that can be executed to implement the wireless communication method according to the embodiment of the present disclosure.
  • The storage 13 is a computer-readable recording medium, and may be composed of at least one of, for example, an optical disk such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disk, a digital versatile disk, or a Blu-ray (registered trademark) disk), a smart card, flash memory (for example, a card, a stick, or a key drive), a floppy (registered trademark) disk, a magnetic strip, and the like.
  • the storage 13 may be called an auxiliary storage device.
  • the storage medium described above may be, for example, a database, server or other suitable medium containing at least one of the memory 12 and the storage 13.
  • the communication device 14 is hardware (transmission / reception device) for communicating between computers via at least one of a wired network and a wireless network.
  • the above-mentioned transmission / reception antenna, amplifier unit, transmission / reception unit, transmission line interface, and the like may be realized by the communication device 14.
  • The transmission/reception unit may be implemented with the transmission unit and the reception unit physically or logically separated from each other.
  • each device such as the processor 11 and the memory 12 is connected by a bus 18 for communicating information.
  • the bus 18 may be configured by using a single bus, or may be configured by using a different bus for each device.
  • the flight device 15 is a device provided with a motor, a rotor, etc., to fly its own aircraft.
  • the flight device 15 can move its own aircraft in all directions and make its own aircraft stationary (hovering) in the air.
  • the sensor device 16 is a device having a sensor group for acquiring information necessary for flight control.
  • the sensor device 16 includes, for example, a position sensor that measures the position (latitude and longitude) of the own machine.
  • The sensor device 16 also includes a direction sensor that measures the direction in which the own machine is facing (the front direction determined for the drone), and an altitude sensor that measures the altitude of the own machine. Further, the sensor device 16 includes a speed sensor that measures the speed of the own machine and an inertial measurement unit (IMU: Inertial Measurement Unit) that measures angular velocity about three axes and acceleration in three directions.
  • The stereo camera 17 is a camera provided with a plurality of cameras; by photographing an object from a plurality of directions with these cameras, it can record information in the depth direction of the object.
  • the stereo camera 17 includes two cameras in this embodiment.
  • Each camera includes an optical system including a lens and an image sensor.
  • the stereo camera 17 is provided so as to take a picture vertically below the own machine.
  • The above-mentioned devices may be configured to include hardware such as a microprocessor, a digital signal processor (DSP: Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), and an FPGA (Field Programmable Gate Array). Further, part or all of each functional block may be realized by such hardware. For example, the processor 11 may be implemented using at least one of these pieces of hardware.
  • Each function in the drone 10 is realized by loading predetermined software (a program) onto hardware such as the processor 11 and the memory 12, whereby the processor 11 performs operations, controls communication by the communication device 14, and controls at least one of reading and writing of data in the memory 12 and the storage 13.
  • FIG. 2 shows the functional configuration realized by the drone 10.
  • The drone 10 includes a sensor acquisition unit 101, a distance calculation unit 102, a risk determination unit 103, a flight instruction unit 104, a flight control unit 105, a damage scale determination unit 106, an obstacle detection unit 107, a shooting instruction unit 108, and a landing instruction unit 109.
  • the sensor acquisition unit 101 acquires the measured value of the sensor provided in the own aircraft (drone 10) which is an air vehicle.
  • The sensor acquisition unit 101 acquires, as the measured value of the image sensor, an image taken vertically below the own machine by the stereo camera 17 provided on the own machine. In other words, the sensor acquisition unit 101 acquires, as the measured value of the sensor, the pixel values output by the image sensor of the stereo camera 17 that represent the image vertically below the own machine.
  • the sensor acquisition unit 101 is an example of the “acquisition unit” of the present invention. The sensor acquisition unit 101 supplies the acquired image to the distance calculation unit 102.
  • The distance calculation unit 102 calculates, for each pixel of the image acquired by the sensor acquisition unit 101, the distance from the stereo camera 17 to the point where the object indicated by that pixel exists (hereinafter, the "distance for each pixel").
  • The distance calculation unit 102 is an example of the "calculation unit" of the present invention. Since the stereo camera 17 captures the same point from a plurality of angles, the distance from the stereo camera 17 to the point where the object reflected in each pixel exists can be calculated.
  • the captured image does not necessarily show the ground, but may show any object such as a building, a natural object, a water surface, or an animal.
  • the distance from the stereo camera 17 can be regarded as the distance from the drone 10.
  • For this calculation, a well-known technique such as that described in JP-A-11-230745 may be used.
  • In this way, the distance calculation unit 102 calculates the distance for each pixel and supplies distance information indicating the calculated distances to the risk determination unit 103.
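  • As a rough illustration of this step, the following Python sketch converts a rectified stereo pair into a per-pixel distance map using block matching. The focal length, baseline, and the use of OpenCV's StereoBM are assumptions for illustration, not details taken from this publication.

```python
# A minimal sketch of per-pixel distance from a stereo pair, assuming a
# calibrated, rectified camera with known focal length and baseline.
# Camera parameters here are illustrative, not from the patent.
import cv2
import numpy as np

FOCAL_LENGTH_PX = 700.0   # focal length in pixels (assumed)
BASELINE_M = 0.12         # distance between the two cameras in meters (assumed)

def distance_per_pixel(left_gray: np.ndarray, right_gray: np.ndarray) -> np.ndarray:
    """Return a depth map in meters, one value per pixel.

    Inputs are 8-bit single-channel rectified images.
    """
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    # StereoBM returns fixed-point disparity scaled by 16
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan          # no match: distance unknown
    return FOCAL_LENGTH_PX * BASELINE_M / disparity   # Z = f * B / d
```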
  • the risk level determination unit 103 determines the risk level at the time of landing of each of the plurality of divided areas that divide the above-ground area vertically below the own aircraft.
  • The degree of danger at the time of landing represents how likely it is that a safe landing will not be possible.
  • Here, a safe landing is a landing without any damage.
  • The degree of danger represents, for example, the likelihood that the drone 10 itself will be damaged, or that an object the drone 10 comes into contact with will be damaged (including injury), due to contact with some object (including humans and other animals) during landing or due to a disturbed attitude at the time of landing (including the case of falling as a result).
  • FIG. 3 shows an example of the divided area.
  • When the own machine is viewed vertically downward from the sky, the risk determination unit 103 uses four square areas of equal side length, each having the own machine's position as a vertex, as the divided areas B1, B2, B3, and B4.
  • the risk level determination unit 103 determines the risk level based on the ratio of the unsuitable area that is not suitable for landing in the divided area, which is calculated based on the measured value acquired by the sensor acquisition unit 101.
  • Unsuitable areas are, for example, areas such as commercial areas, residential areas, industrial areas, and parks where people gather and where a fall is more likely to injure people than in other areas.
  • The unsuitable area may also be an area, such as a forest, a slope, a utility pole, or a vehicle, where the drone 10 is more likely than in other areas to be damaged by contacting some object or tipping over when trying to land.
  • The risk determination unit 103 first recognizes objects existing on the ground from the distances indicated by the supplied distance information, that is, the distances for each pixel calculated by the distance calculation unit 102. Objects such as buildings, vehicles, people, animals, utility poles, trees, forests, slopes, and flat land may exist on the ground. The risk determination unit 103 calculates the height above the ground of the object indicated by each pixel from the distance between the own machine and each pixel included in the portion of the image representing the object, and from the altitude of the own machine.
  • The risk determination unit 103 stores a pattern of shape and size for each type of object, and recognizes an object as corresponding to a pattern when its similarity to the stored pattern exceeds a threshold value. In this way, the risk determination unit 103 recognizes objects existing on the ground from the pixel values acquired by the sensor acquisition unit 101.
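  • The height calculation described above can be illustrated with a minimal sketch: for a camera pointing vertically downward, the height of an object above flat ground is approximately the own machine's altitude minus the measured distance to the object. The function name and the flat-ground assumption are illustrative only.

```python
import numpy as np

def object_height_map(depth_m: np.ndarray, drone_altitude_m: float) -> np.ndarray:
    """Approximate height above ground of the object seen in each pixel.

    Assumes the stereo camera points vertically downward, so the distance to
    a point on flat ground roughly equals the drone's altitude; anything
    measured as closer than that sticks up out of the ground by the difference.
    """
    height = drone_altitude_m - depth_m
    return np.clip(height, 0.0, None)   # negative values: noise or depressions
```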
  • the risk level determination unit 103 determines the risk level based on the ratio of the unsuitable area calculated by weighting the area occupied by the recognized object according to the attribute of the object.
  • the risk determination unit 103 uses a weight table in which the attributes of the object and the weights are associated with each other.
  • FIG. 4 shows an example of a weight table. In the example of FIG. 4, object attributes and weights are associated as follows: "1.0" for "slope", "1.5" for "building, utility pole, tree", and "2.0" for "vehicle, person, animal".
  • When the risk determination unit 103 recognizes objects existing in a divided region, it multiplies, for every recognized object, the number of pixels of the object by the weight corresponding to the attribute of the object. Then, the risk determination unit 103 divides the total of the calculated values by the number of pixels of the entire divided region, and uses the result as the ratio of the unsuitable region. The risk level determination unit 103 determines the risk level using, for example, a risk level table in which the ratio of the unsuitable region and the risk level are associated with each other.
  • FIG. 5 shows an example of a risk table.
  • the risk levels of "danger Lv1", “danger Lv2”, and “danger Lv3" are associated with the proportions of the unsuitable regions “less than Th11", “Th11 or more and less than Th12", and “Th12 or more”.
  • Th11 and Th12 are thresholds for the proportion of unsuitable regions, such as 30% and 60%. The higher the value, the higher the risk.
  • the risk determination unit 103 determines, for example, the risk of the divided region in which the ratio of the unsuitable region is Th11 or more and less than Th12 as danger Lv2.
  • In this way, the risk level determination unit 103 determines the risk level of each divided region and supplies the determination result to the flight instruction unit 104.
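  • A minimal sketch of the weighted-ratio computation and the table lookup described above, using the example weights from FIG. 4 and the example thresholds Th11 = 30% and Th12 = 60%; the data structures are assumptions for illustration.

```python
# Illustrative weight and threshold values taken from the examples in
# Figs. 4 and 5; the dict-based representation is an assumption.
WEIGHTS = {"slope": 1.0, "building": 1.5, "utility_pole": 1.5, "tree": 1.5,
           "vehicle": 2.0, "person": 2.0, "animal": 2.0}
TH11, TH12 = 0.30, 0.60   # 30% and 60%, as in the example in the text

def unsuitable_ratio(objects, region_pixel_count):
    """objects: list of (attribute, pixel_count) recognized inside the region."""
    weighted = sum(WEIGHTS[attr] * n_px for attr, n_px in objects)
    return weighted / region_pixel_count

def risk_level(ratio):
    """Map the weighted ratio to danger Lv1..Lv3 as in the Fig. 5 table."""
    if ratio < TH11:
        return 1   # danger Lv1
    if ratio < TH12:
        return 2   # danger Lv2
    return 3       # danger Lv3

# e.g. a tree covering 20% of the region and a vehicle covering 10%
print(risk_level(unsuitable_ratio([("tree", 2000), ("vehicle", 1000)], 10000)))
```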
  • The flight instruction unit 104 gives an instruction to move the own aircraft (drone 10) in a direction corresponding to the arrangement of the divided regions whose degree of danger, as determined by the risk determination unit 103, is equal to or higher than a predetermined reference.
  • the flight instruction unit 104 gives the above instructions to the flight control unit 105.
  • the flight control unit 105 controls the flight of its own aircraft (drone 10) based on various values (position, altitude, direction, etc.) measured by the sensor device 16.
  • the flight control unit 105 controls to move the aircraft in the instructed direction.
  • FIG. 6 shows an example of the relationship between the arrangement of the divided areas and the moving direction.
  • In the following examples, it is assumed that the reference (first criterion) risk level is danger Lv3.
  • When the risk levels of three of the divided regions are at or above the reference, the flight instruction unit 104 instructs, as the movement direction, the direction A1 toward the center of the one remaining divided region whose risk level is less than the reference; for example, it instructs movement to the center C1 of the divided region B1 in the direction A1.
  • When the risk levels of two adjacent divided regions are at or above the reference, the flight instruction unit 104 instructs, as the movement direction, the direction A2 toward the boundary between the two remaining divided regions whose risk levels are less than the reference; for example, it instructs movement to the midpoint C2 of the shared side of the divided regions B1 and B2 in the direction A2.
  • Alternatively, in the case of FIG. 6(a), the flight instruction unit 104 may instruct movement to the vertex C3 on the diagonal of the divided region B1, opposite the own machine.
  • Similarly, movement to the far end C4 of the shared side of the divided regions B1 and B2 may be instructed.
  • When a single divided region is at or above the reference, the flight instruction unit 104 instructs, as the movement direction, the direction A3 toward the center of the divided region facing that divided region across the drone 10.
  • When two such candidate directions exist, the flight instruction unit 104 may randomly select either the direction A4-1 or the direction A4-2, or may compare the risk levels of the corresponding divided regions and select the one with the lower level.
  • If the levels are equal, the flight instruction unit 104 may instead compare the ratios of the unsuitable regions and select the smaller one. Further, when two divided regions of danger Lv3 are arranged side by side, the flight instruction unit 104 may instruct, as the movement direction, the direction toward the boundary between the remaining two divided regions whose risk levels are less than the reference; and if, as in FIG. 6(e), those two risk levels are both less than the reference but differ from each other, the direction A5 toward the center of the divided region with the lower risk may be designated as the movement direction.
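  • The mapping from the arrangement of safe divided regions to a movement direction can be sketched as follows: representing each region by a quadrant center vector and averaging the safe ones reproduces directions A1 (one safe region) and A2 (two adjacent safe regions). The quadrant layout and the vector representation are assumptions for illustration, not the publication's method.

```python
import numpy as np

# Quadrant center directions for B1..B4 (unit square vectors in the horizontal
# plane; the layout B1=NE, B2=NW, B3=SW, B4=SE is illustrative, not from Fig. 6).
CENTERS = {
    "B1": np.array([1.0, 1.0]), "B2": np.array([-1.0, 1.0]),
    "B3": np.array([-1.0, -1.0]), "B4": np.array([1.0, -1.0]),
}

def move_direction(risk, first_reference=3):
    """risk: dict region -> level. Averaging the safe regions' center vectors
    gives the single safe center (direction A1) or, for two adjacent safe
    regions, their shared boundary (direction A2)."""
    safe = [CENTERS[r] for r, lv in risk.items() if lv < first_reference]
    if not safe:
        return None   # no safe region: climb or fall back to the damage degree
    v = np.mean(safe, axis=0)
    n = np.linalg.norm(v)
    return v / n if n > 1e-9 else None   # an opposite safe pair cancels out

# e.g. B1 and B2 safe -> unit vector toward their shared (northern) boundary
print(move_direction({"B1": 1, "B2": 1, "B3": 3, "B4": 3}))
```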
  • When there is no divided region whose risk level, as determined by the risk level determination unit 103, is less than the first criterion, the flight instruction unit 104 attempts the following two kinds of flight instruction. First, the flight instruction unit 104 instructs the aircraft to increase its altitude and then has the pixel values acquired again.
  • Specifically, the flight instruction unit 104 instructs the flight control unit 105 to raise the altitude, and instructs the sensor acquisition unit 101 to reacquire the pixel values.
  • As a result, the risk level determination unit 103 determines the risk levels of divided regions that cover a wider range than before the altitude was raised. A divided region whose risk level is less than the first criterion may then be found, making it possible to attempt landing after moving to an area where a safer landing is possible.
  • The flight instruction unit 104 may repeat the above instruction to raise the altitude while no divided region with a risk level below the first criterion is found. However, if no such divided region is found even after repeating the instruction a fixed number of times (for example, the number of repetitions at which the aircraft would reach an altitude where flight is prohibited), the flight instruction unit 104 instructs movement that takes into account the damage in the event of a failed landing.
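  • The climb-and-retry logic just described might look like the following sketch, where the retry cap stands in for the altitude at which flight is prohibited; all function parameters and names are hypothetical.

```python
MAX_RETRIES = 3   # assumed: the number of climbs before a prohibited altitude

def find_safe_region(acquire_image, assess_regions, climb):
    """Climb and re-assess until some divided region is below the first
    reference, up to a fixed number of retries (a sketch of the retry logic).

    acquire_image, assess_regions, climb: hypothetical callables standing in
    for the sensor acquisition, risk determination, and flight control units.
    """
    for _ in range(MAX_RETRIES + 1):
        risks = assess_regions(acquire_image())
        safe = [r for r, lv in risks.items() if lv < 3]   # first reference = Lv3
        if safe:
            return safe
        climb()   # raise altitude so each divided region covers a wider area
    return []     # give up: fall back to the damage-degree based movement
```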
  • the damage scale determination unit 106 determines the degree of damage indicating the magnitude of damage when landing fails for each of the plurality of divided regions.
  • the damage scale determination unit 106 is supplied with distance information indicating the distance for each pixel from the distance calculation unit 102.
  • the damage scale determination unit 106 recognizes an object existing on the ground from the distance for each pixel indicated by the supplied distance information, for example, by a method using the same pattern as the risk determination unit 103.
  • the damage scale determination unit 106 identifies the attributes of each divided region from the recognized object.
  • The damage scale determination unit 106 specifies, for example, a divided area containing trees and slopes in at least a certain proportion as a "forest". It likewise specifies a divided area containing crops and agricultural machinery (tractors, etc.) in at least a certain proportion as "agricultural land", a divided area containing factory equipment (pipelines, warehouses, etc.) in at least a certain proportion as an "industrial area", a divided area containing detached houses and apartment buildings in at least a certain proportion as a "residential area", and a divided area containing arcades and signboards in at least a certain proportion as a "commercial area".
  • the method of specifying the attribute of the divided area is not limited to this.
  • the sensor acquisition unit 101 acquires position information indicating the position of the own machine as a measured value of the sensor and supplies it to the damage scale determination unit 106.
  • the damage scale determination unit 106 stores the map information indicating the attribute for each area, and specifies the attribute of the position indicated by the supplied position information.
  • the damage scale determination unit 106 may use the measured value of the sensor different from that used by the risk determination unit 103 and the like.
  • In either case, the damage scale determination unit 106 determines the degree of damage based on the measured values acquired by the sensor acquisition unit 101 (whether based on them directly, like the position information, or indirectly, like the distance for each pixel). The damage scale determination unit 106 determines the damage degree using, for example, a damage degree table in which the attributes of the divided areas and the damage degrees are associated with each other.
  • FIG. 7 shows an example of the damage degree table.
  • In the example of FIG. 7, the attributes of the divided areas are associated with damage degrees as follows: "damage Lv1" for "forest, agricultural land", "damage Lv2" for "industrial area", and "damage Lv3" for "residential area, commercial area".
  • the damage scale determination unit 106 determines the damage degree associated with the attribute of the divided area specified as described above in the damage degree table as the damage degree of the divided area.
  • In this way, the damage scale determination unit 106 determines the degree of damage for each divided area and supplies the determination result to the flight instruction unit 104.
  • When there is no divided area whose risk degree, as determined by the risk determination unit 103, is less than the reference, the flight instruction unit 104 instructs the own aircraft to move in the direction of the divided area with the smallest degree of damage determined by the damage scale determination unit 106 and to descend.
  • FIG. 8 shows an example of the determined damage degree.
  • In the example of FIG. 8, the degrees of danger of the divided areas B1 to B4 are all Lv3, but the degree of damage is determined to be Lv1 for the divided area B1, Lv2 for the divided area B3, and Lv3 for the divided areas B2 and B4.
  • In that case, the flight instruction unit 104 instructs, as the movement direction, the direction A6 toward the center of the divided region B1 with the smallest damage degree. By instructing the movement direction in consideration of the degree of damage in this way, even if the landing of the drone 10 fails, the resulting damage can be kept small.
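  • A sketch of this fallback using the damage degree table of FIG. 7; the attribute strings and region labels are illustrative.

```python
DAMAGE_TABLE = {"forest": 1, "farmland": 1, "industrial": 2,
                "residential": 3, "commercial": 3}   # from the Fig. 7 example

def fallback_region(region_attrs):
    """region_attrs: dict region -> attribute string. Pick the divided region
    with the smallest damage degree when every region is at or above the
    first risk reference (a sketch of the Fig. 8 behavior)."""
    return min(region_attrs, key=lambda r: DAMAGE_TABLE[region_attrs[r]])

# e.g. B1=forest(Lv1), B2=residential(Lv3), B3=industrial(Lv2), B4=commercial(Lv3)
print(fallback_region({"B1": "forest", "B2": "residential",
                       "B3": "industrial", "B4": "commercial"}))  # -> B1
```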
  • When the destination divided region has a risk level less than a second criterion, the flight instruction unit 104 also instructs a descent of a predetermined distance. In this embodiment, danger Lv2 is used as the second criterion. In the examples of FIGS. 6(a) to 6(e), the movement is toward a divided region of danger Lv1, so the flight instruction unit 104 also instructs a descent of a predetermined distance.
  • Otherwise, the flight instruction unit 104 does not instruct a descent but instructs only the movement.
  • The flight control unit 105 controls the aircraft to move horizontally when only movement is instructed, and controls it to move diagonally downward when instructed both to move and to descend. In the latter case, the flight control unit 105 may instead control the aircraft to move horizontally first and then descend.
  • The flight instruction unit 104 repeats the above movement and descent instructions, and when the altitude of the own aircraft (drone 10) falls below a predetermined altitude, it causes the flight control unit 105 to perform landing control based on the measurement result of a distance measuring means.
  • The distance measuring means is a means for measuring the distance to the ground.
  • In this embodiment, the distance calculation unit 102 is used as the distance measuring means, and the distance calculated by the distance calculation unit 102 is used as the measurement result.
  • the distance calculation unit 102 also supplies distance information indicating the distance for each pixel to the obstacle detection unit 107.
  • the obstacle detection unit 107 detects an obstacle from the image taken by the stereo camera 17 acquired by the sensor acquisition unit 101.
  • the obstacle detection unit 107 is an example of the "detection unit" of the present invention.
  • An obstacle is an object that interferes with the landing of the drone 10, and may be a building, a natural object, an animal (including a human being), or the like.
  • The obstacle detection unit 107 detects, as an obstacle, an object whose shape, as indicated by the distances calculated for the pixels that may represent an obstacle, falls within an allowable range.
  • Here, the shape indicated by the distances is the same kind of object contour as is used when the risk determination unit 103 recognizes objects. Like the risk level determination unit 103, the obstacle detection unit 107 stores a pattern of shape and size for each type of obstacle.
  • The obstacle detection unit 107 judges that a shape is within the allowable range when its similarity to a stored pattern exceeds the threshold value, and detects it as an obstacle corresponding to that pattern.
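  • One way to realize the similarity test described above is normalized cross-correlation between a height patch derived from the distances and a stored template. The correlation measure and the threshold value 0.8 are assumptions, since the publication only specifies a pattern comparison against a threshold.

```python
import numpy as np

SIMILARITY_THRESHOLD = 0.8   # assumed value; the text only says "threshold"

def similarity(candidate: np.ndarray, pattern: np.ndarray) -> float:
    """Normalized cross-correlation between a candidate height patch and a
    stored shape pattern of the same size (one simple similarity measure)."""
    c = candidate - candidate.mean()
    p = pattern - pattern.mean()
    denom = np.linalg.norm(c) * np.linalg.norm(p)
    return float((c * p).sum() / denom) if denom > 1e-9 else 0.0

def detect_obstacles(height_patch, patterns):
    """patterns: dict obstacle type -> height-map template. Return the types
    whose stored pattern matches the candidate patch."""
    return [t for t, tpl in patterns.items()
            if similarity(height_patch, tpl) > SIMILARITY_THRESHOLD]
```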
  • the obstacle detection unit 107 detects an obstacle, it supplies pixel information indicating a pixel in which the detected obstacle is captured to the photographing instruction unit 108.
  • When an obstacle is detected, the shooting instruction unit 108 instructs shooting of the obstacle from another direction.
  • the shooting instruction unit 108 is an example of the “shooting instruction unit” of the present invention.
  • Specifically, the shooting instruction unit 108 determines the position on the ground of the detected obstacle from the positions of the pixels indicated by the supplied pixel information, and instructs the flight control unit 105 to move the aircraft within a range that keeps that position in the angle of view. Then, by instructing the stereo camera 17 to shoot after the movement, it causes the obstacle to be photographed from another direction.
  • The obstacle detection unit 107 adds the image taken in accordance with the shooting instruction of the shooting instruction unit 108 and performs obstacle detection again.
  • The obstacle detection unit 107 supplies the landing instruction unit 109 with pixel information indicating the pixels in which the obstacle detected in this way appears.
  • the landing instruction unit 109 instructs the flight control unit 105 to land the own aircraft (drone 10) based on the distance for each pixel calculated by the distance calculation unit 102.
  • the landing instruction unit 109 is an example of the "landing instruction unit" of the present invention.
  • When no obstacle is detected, the landing instruction unit 109 instructs the aircraft to land by lowering the altitude as it is. Further, even if an obstacle is detected in the landing direction, if the position of the obstacle satisfies the landing condition, the landing instruction unit 109 instructs the flight control unit 105 to land the own aircraft based on the calculated distance for each pixel. The landing instruction unit 109 determines that the landing condition is satisfied when, for example, the obstacle detected by the obstacle detection unit 107 is not included in a predetermined range of the captured image.
  • The predetermined range is, for example, a circular range with a predetermined radius centered on the captured image.
  • The shape of the range is not limited to a circle, and may be an ellipse, a square, a quadrangle, or the like, and it may be offset from the center of the image. In the following, these ranges are referred to as the "landing determination range". With this condition it is possible to land, for example, when an obstacle appears only near the edge of the captured image.
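  • A minimal sketch of the landing-condition test for the circular case: the condition holds when no obstacle pixel falls inside the landing determination range. The radius value is an assumption.

```python
import numpy as np

def satisfies_landing_condition(obstacle_mask: np.ndarray,
                                radius_fraction: float = 0.3) -> bool:
    """True if no obstacle pixel lies inside a circle around the image center.

    obstacle_mask: boolean array, True where obstacle pixels were detected.
    radius_fraction: circle radius as a fraction of image height (assumed).
    """
    h, w = obstacle_mask.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r2 = (yy - h / 2.0) ** 2 + (xx - w / 2.0) ** 2
    inside = r2 <= (radius_fraction * h) ** 2
    return not np.any(obstacle_mask & inside)
```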
  • Also, in the case of a moving obstacle, the landing instruction unit 109 judges that the landing condition is satisfied if the obstacle is no longer included in the landing determination range before the own aircraft lands. Moving obstacles are, for example, vehicles or animals (including humans).
  • the flight control unit 105 lowers the altitude of the aircraft and ends the flight control when it lands on the ground or an object on the ground.
  • the drone 10 performs a landing process for landing its own aircraft based on the above configuration.
  • FIG. 9 shows an example of the operation procedure in the landing process.
  • the operation procedure of FIG. 9 is started, for example, when the drone 10 flies on the planned flight path.
  • First, the drone 10 (sensor acquisition unit 101) acquires an image taken vertically below the own machine by the stereo camera 17 (step S11).
  • Next, the drone 10 (distance calculation unit 102) calculates the distance to the point indicated by each pixel of the acquired captured image (step S12). Subsequently, the drone 10 (risk level determination unit 103) determines the risk level at the time of landing for each of the plurality of divided regions vertically below the own aircraft, based on the calculated distance for each pixel (step S13). Next, the drone 10 (flight instruction unit 104) gives an instruction to move the aircraft in the direction, and to the altitude, corresponding to the arrangement of the divided regions whose determined risk level is at or above the first reference (step S14).
  • Next, the drone 10 determines whether or not its altitude is below the predetermined altitude (step S15); if it determines that it is not (NO), it returns to step S11 and continues the operation, and if it determines that it is (YES), it proceeds to the next operation. First, the drone 10 (sensor acquisition unit 101) acquires a captured image vertically below the own machine in the same manner as in step S11 (step S21).
  • Next, the drone 10 (distance calculation unit 102) calculates the distance to the point indicated by each pixel of the acquired captured image (step S22). Subsequently, the drone 10 (obstacle detection unit 107) performs processing to detect an obstacle from the acquired captured image based on the calculated distance for each pixel (step S23). If an obstacle is detected in step S23 (YES), the drone 10 determines whether or not the landing condition is satisfied (step S24).
  • If it is determined in step S24 that the landing condition is not satisfied (NO), the drone 10 returns to step S21 and continues the operation; if it is determined that the landing condition is satisfied (YES), the drone 10 proceeds to the next operation.
  • The drone 10 (landing instruction unit 109) instructs the flight control unit 105 to land the own aircraft based on the calculated distance for each pixel, and landing control is started (step S25).
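  • The whole procedure of FIG. 9 can be summarized in the following sketch; every method on the hypothetical `drone` object stands in for one of the functional units described above and is not an API from the publication.

```python
def landing_process(drone, predetermined_altitude_m=10.0):
    """Sketch of the Fig. 9 loop: steer toward safer regions until below a
    predetermined altitude, then descend only once the landing condition holds.

    All `drone` methods are hypothetical stand-ins for the functional units.
    """
    # Steps S11-S15: coarse phase, steering by divided-region risk
    while drone.altitude() >= predetermined_altitude_m:
        image = drone.capture_stereo()                 # S11
        depth = drone.distance_per_pixel(image)        # S12
        risks = drone.assess_divided_regions(depth)    # S13
        drone.move_by_risk_arrangement(risks)          # S14

    # Steps S21-S25: fine phase, obstacle check against the landing condition
    while True:
        image = drone.capture_stereo()                 # S21
        depth = drone.distance_per_pixel(image)        # S22
        obstacles = drone.detect_obstacles(image, depth)              # S23
        if not obstacles or drone.landing_condition_met(obstacles):   # S24
            drone.start_landing_control(depth)         # S25
            break
```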
  • As described above, in this embodiment the degree of risk is determined based on the proportion of unsuitable areas that are not suitable for landing. As a result, the degree of risk can be determined with higher accuracy than when the ratio of the unsuitable region is not taken into consideration.
  • Further, even if an obstacle is detected, landing control is performed if the landing condition is satisfied.
  • Also, an image taken by the stereo camera 17 may be used for stereoscopic viewing or the like, and an obstacle can thus be detected using an image taken for such another purpose.
  • FIG. 10 shows an example of the overall configuration of the landing support system 1 according to the second embodiment.
  • the landing support system 1 is a system that supports the landing of an aircraft such as the drone 10.
  • the landing support system 1 includes a network 2, a drone 10, and a server device 20.
  • the network 2 is a communication system including a mobile communication network, the Internet, and the like, and relays data exchange between devices accessing the own system.
  • the server device 20 accesses the network 2 by wire communication (may be wireless communication), and the drone 10 accesses the network 2 by wireless communication.
  • FIG. 11 shows an example of the hardware configuration of the server device 20.
  • the server device 20 may be physically configured as a computer device including a processor 21, a memory 22, a storage 23, a communication device 24, a bus 25, and the like.
  • Hardware with the same name as in FIG. 1, such as the processor 21, is of the same type as in FIG. 1, although there are differences in performance, specifications, and the like.
  • FIG. 12 shows the functional configuration realized in this embodiment.
  • the drone 10 includes a flight control unit 105 and an image capturing unit 110.
  • The server device 20 includes a sensor acquisition unit 101, a distance calculation unit 102, a risk level determination unit 103, a flight instruction unit 104, a damage scale determination unit 106, an obstacle detection unit 107, a shooting instruction unit 108, and a landing instruction unit 109.
  • the image capturing unit 110 includes a stereo camera 17 and captures an image vertically below the own machine.
  • the image capturing unit 110 transmits captured image data indicating the captured image to the server device 20.
  • the sensor acquisition unit 101 of the server device 20 acquires the captured image indicated by the transmitted captured image data as the measured value of the image sensor.
  • the sensor acquisition unit 101 to the landing instruction unit 109 operate in the same manner as in the first embodiment to support the landing of the drone 10.
  • the difference from the first embodiment is that the instructions by the flight instruction unit 104, the shooting instruction unit 108, and the landing instruction unit 109 are given by transmitting the instruction data to the drone 10.
  • FIG. 13 shows an example of the operation procedure in the landing process of this embodiment.
  • the operation procedure of FIG. 13 is started, for example, when the drone 10 flies on the planned flight path.
  • First, the drone 10 (image capturing unit 110) captures an image vertically below the own machine (step S31) and transmits captured image data indicating the captured image to the server device 20 (step S32).
  • Next, the server device 20 determines whether or not the altitude of the drone 10 indicated by the transmitted captured image data is less than the predetermined altitude (step S33). If it is determined in step S33 that the altitude is not less than the predetermined altitude (NO), the server device 20 (sensor acquisition unit 101) acquires, as the measured value of the image sensor, the image taken vertically below the drone by the stereo camera 17 (step S41).
  • Next, the server device 20 (distance calculation unit 102) calculates, as the distance for each pixel, the distance to the point indicated by each pixel of the acquired captured image (step S42). Subsequently, the server device 20 (risk degree determination unit 103) determines the risk degrees of the plurality of divided regions based on the calculated distance for each pixel (step S43). Next, the server device 20 (flight instruction unit 104) generates instruction data indicating an instruction to move the aircraft in the direction, and to the altitude, corresponding to the arrangement of the divided areas whose determined risk level is at or above the first reference (step S44), and transmits it to the drone 10 (step S45).
  • The drone 10 (flight control unit 105) controls the flight of its own aircraft according to the instruction indicated by the transmitted instruction data (step S46). After that, the operations of steps S31 to S46 are repeated until it is determined in step S33 that the altitude is below the predetermined altitude (YES). When it is so determined, the operations of steps S51, S52, S53, and S54, which are the same as those of steps S31, S32, S41, and S42, are first performed.
  • the server device 20 (obstacle detection unit 107) performs a process of detecting an obstacle from the acquired captured image based on the distance for each pixel calculated in step S54 (step S55). In the example of FIG. 13, it is assumed that an obstacle is detected.
  • Next, the server device 20 determines whether or not the landing condition is satisfied (step S56); if it determines that the landing condition is not satisfied (NO), it returns to step S52 and continues the operation, and if it determines that the landing condition is satisfied (YES), it proceeds to the next operation.
  • When it is determined in step S56 that the landing condition is satisfied (YES), the server device 20 (landing instruction unit 109) generates instruction data indicating the calculated distance for each pixel and an instruction to land the own aircraft based on that distance (step S57), and transmits it to the drone 10 (step S58). The drone 10 (flight control unit 105) controls the landing of its own aircraft according to the instruction indicated by the transmitted instruction data (step S59).
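  • The division of labor in this embodiment might be sketched as follows on the drone side; the wire format (JSON) and the `send`/`recv` transport are assumptions, since the publication does not specify how instruction data is encoded.

```python
import json

def drone_step(camera, flight_control, send, recv):
    """One round trip of the second embodiment's loop, seen from the drone.

    `camera.capture()` is assumed to return an already JSON-serializable
    payload (e.g. a base64 string); the transport callables and the message
    schema are assumptions, not taken from the publication.
    """
    # Steps S31-S32: capture and upload the image
    send(json.dumps({"type": "captured_image", "image": camera.capture()}))
    # Steps S45-S46 or S58-S59: obey whatever the server decided
    instruction = json.loads(recv())
    if instruction["type"] == "move":
        flight_control.move(instruction["direction"], instruction["altitude"])
    elif instruction["type"] == "land":
        flight_control.start_landing(instruction["distance_per_pixel"])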
  • In this embodiment, since the server device 20 performs processing such as the calculation of the distance for each pixel, the determination of the degree of danger, the determination of the scale of damage, and the detection of obstacles, the processing load on the drone 10 is reduced compared with the first embodiment.
  • As a result, computer resources such as the processor 11 mounted on the drone 10 can be reduced in weight and size.
  • In the first embodiment, on the other hand, since communication is not required for flight instructions and the like, landing can be supported without being affected by the communication status.
  • Although the sensor acquisition unit 101 acquires the measured values of the image sensor of the stereo camera 17 in the above embodiments, the present invention is not limited to this.
  • In a modification, the sensor acquisition unit 101 acquires, as a measured value of a sensor, position information indicating the position of the own machine measured by the position sensor included in the sensor device 16.
  • the sensor acquisition unit 101 supplies the acquired position information to the risk determination unit 103.
  • the risk level determination unit 103 determines the risk level based on the supplied position information.
  • the risk determination unit 103 stores, for example, a map showing the attributes of an object existing on the ground for each area.
  • the attribute of the object is, for example, the attribute shown in the weight table of FIG.
  • In this modification, the risk determination unit 103 sets, as the above-ground area, a predetermined area on the stored map that includes the position indicated by the position information acquired by the sensor acquisition unit 101. Then, the risk level determination unit 103 determines the risk level based on the ratio of the unsuitable area, calculated by weighting the area occupied by the objects existing in each divided region of that above-ground area according to the attributes of the objects.
  • Specifically, the risk level determination unit 103 recognizes the objects, weights them using the weight table of FIG. 4, and determines the risk level using the risk table of FIG. 5.
  • In this case, since the risk level is determined without using a captured image, the risk level can be determined even in a situation where the ground cannot be clearly photographed from the sky, for example due to fog.
  • On the other hand, when the captured image is used as in the embodiments, an obstacle can be detected using an image taken for another purpose (such as an image taken for stereoscopic viewing), for example.
  • When there is no divided region whose risk level, as determined by the risk level determination unit 103, is less than the first reference (in the example of FIG. 6, no divided region of risk level Lv1 or Lv2), the risk determination unit 103 may enlarge the above-ground area on the stored map and determine the risk again.
  • The above-ground area is the area before being divided into divided regions; in the example of FIG. 6, it is the combined area of the divided regions B1, B2, B3, and B4.
  • the expanded part may include a low-risk area.
  • When the risk level determination unit 103 does not find a divided region whose risk level is less than the first criterion even after re-determining once, it may repeatedly enlarge the above-ground area and re-determine the risk. By doing so, a divided region whose risk level is less than the first criterion can be found more reliably.
  • In the examples, a rotary-wing aircraft was used as the air vehicle, but the air vehicle is not limited to this.
  • The air vehicle may be, for example, an airplane-type air vehicle or a helicopter-type air vehicle, or a vertical take-off and landing aircraft (VTOL: Vertical Take-Off and Landing aircraft).
  • The landing instruction unit 109 may use landing conditions different from those in the first embodiment. For example, in the first embodiment the landing instruction unit 109 determines that the landing condition is satisfied when the detected obstacle is not included in the predetermined range of the captured image; this predetermined range may be changed according to the type of the flying object.
  • In the case of a rotary-wing aircraft, the aircraft descends from substantially vertically above to vertically below, so the landing instruction unit 109 uses a circular range. In the case of a VTOL aircraft, the aircraft often descends while gliding diagonally in order to descend stably, so an elliptical range is used. In the case of an airplane-type flying object, the aircraft descends diagonally at an angle close to horizontal, so an even longer elliptical range is used.
  • If an airplane-type flying object used a circular range, it would approach at a height at which it could contact an obstacle outside that range before landing inside it. By using a longer elliptical range for an airplane-type flying object, it becomes possible to land by passing through the airspace above the predetermined range where there is no risk of contact with obstacles. In this way, by changing the length of the predetermined range according to the angle of the flight path at which the flying object descends, the risk of contacting an obstacle at landing can be reduced for any type of flying object, compared with always using a circular range, for example.
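  • A sketch of selecting the landing determination range by aircraft type, implemented as an elliptical mask elongated along the descent track (assumed here to run along the image's horizontal axis); the semi-axis values are illustrative, as the publication gives only the qualitative shapes.

```python
import numpy as np

# Illustrative semi-axes as fractions of image height: circle for rotary wing,
# ellipse for VTOL, longer ellipse for airplane-type (values assumed).
RANGE_SHAPE = {"rotary_wing": (0.3, 0.3), "vtol": (0.45, 0.3), "airplane": (0.7, 0.3)}

def landing_range_mask(shape, aircraft_type):
    """Boolean mask of the landing determination range, elongated along the
    descent track (assumed aligned with the image x-axis)."""
    a, b = RANGE_SHAPE[aircraft_type]
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    return ((xx - w / 2) / (a * h)) ** 2 + ((yy - h / 2) / (b * h)) ** 2 <= 1.0
```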
  • the landing instruction unit 109 may change a predetermined range according to the weather conditions at the landing point.
  • In this case, the landing instruction unit 109 inquires, for example, of an operator's system that provides weather conditions for each region on the Internet about the weather conditions of the region including the landing point.
  • The landing instruction unit 109 uses a smaller predetermined range the more the weather conditions obtained in response to the inquiry are suitable for the flight of the drone 10.
  • Conversely, the landing instruction unit 109 enlarges the predetermined range the worse the weather conditions are for the flight of the drone 10. If the weather conditions are bad (for example, the wind is strong), the risk that the drone 10 will be blown by the wind while descending and contact a nearby obstacle increases. Therefore, by enlarging the predetermined range as the weather conditions worsen, the risk of contacting an obstacle at landing can be reduced compared with keeping the range constant regardless of the weather conditions.
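  • One possible realization of this weather-dependent enlargement is a linear scaling of the range with wind speed; the scaling law and all numeric values are assumptions, as the publication only states that the range should grow as conditions worsen.

```python
def scaled_radius(base_radius_fraction, wind_speed_ms,
                  calm_ms=2.0, max_ms=10.0, max_scale=2.0):
    """Grow the landing determination range linearly with wind speed between
    a calm value and a maximum value (all parameters are assumed)."""
    t = min(max(wind_speed_ms - calm_ms, 0.0) / (max_ms - calm_ms), 1.0)
    return base_radius_fraction * (1.0 + (max_scale - 1.0) * t)

# e.g. base radius 0.3 of image height, 6 m/s wind -> enlarged radius
print(scaled_radius(0.3, 6.0))
```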
  • each divided region may be rectangular.
  • three regular hexagonal division regions may be used, or six regular triangle division regions may be used.
  • Although the shapes and sizes of the divided regions were the same in the examples, they do not have to be exactly the same and may differ slightly. In any case, it suffices that each arrangement of divided regions whose risk level is at or above the first criterion is associated with a movement direction toward the area where the safest landing can be expected given that arrangement.
  • Although the flight instruction unit 104, the shooting instruction unit 108, and the landing instruction unit 109 give their instructions to the drone 10 in the embodiments, the present invention is not limited to this; if the drone 10 is operated by, for example, a radio controller, the instructions may be given to the operator.
  • In that case, each instruction unit may transmit to the radio controller data indicating, for example, the movement direction, the altitude after movement, the shooting position, and character strings instructing the switch to landing control, and give the instructions by displaying those character strings on the display surface of the radio controller.
  • the risk level determination unit 103 determines the risk level in three stages in the embodiment, but it may be in two stages or in four or more stages.
  • Also, the degree of danger may be represented by character strings such as "large", "medium", and "small", or "A", "B", and "C". In short, any information indicating how likely it is that the drone 10 cannot land safely will suffice.
  • The first criterion used when the flight instruction unit 104 determines the movement direction and the second criterion used when determining whether to descend may be determined according to how the degree of danger is expressed.
  • the flight instruction unit 104 may determine the movement direction by a method different from that of the embodiment.
  • The flight instruction unit 104 may, for example, determine the movement direction in consideration of the balance between the degree of danger and the degree of damage. For example, when there are a divided region B1 of damage Lv1 at danger Lv2 and a divided region B2 of damage Lv3 at danger Lv1, the flight instruction unit 104 may set the divided region B2 with the smaller risk as the movement direction, or may set the divided region B1, for which the total of the two levels is smaller, as the movement direction.
  • the distance measuring means used in landing control is not limited to a stereo camera.
  • a means for measuring the distance to an object using infrared rays, ultrasonic waves, millimeter waves, or the like may be used as the distance measuring means.
  • The devices that realize the functions shown in FIGS. 2 and 12 are not limited to those described above.
  • For example, computer resources provided by a cloud service may realize the functions realized by the server device 20. In any case, it suffices that each function shown in FIG. 2 or FIG. 12 is realized in the landing support system 1 as a whole.
  • The present invention can also be regarded as an information processing system, such as the landing support system 1, that includes the information processing devices described above, such as the drone 10 and the server device 20. Further, the present invention can be regarded as an information processing method for realizing the processing performed by the information processing devices, and as a program for operating the computers that control the information processing devices.
  • The program regarded as the present invention may be provided in the form of a recording medium, such as an optical disk, on which the program is stored, or may be provided in a form in which the program is downloaded to a computer via a network such as the Internet, and the downloaded program is installed and used.
  • Each functional block may be realized by one device that is physically or logically coupled, or by two or more devices that are physically or logically separated and connected directly or indirectly (for example, by wire or wirelessly).
  • A functional block may also be realized by combining software with the one device or the plural devices.
  • Functions include, but are not limited to, judging, determining, calculating, computing, processing, deriving, investigating, searching, confirming, receiving, transmitting, outputting, accessing, resolving, selecting, choosing, establishing, comparing, assuming, expecting, considering, broadcasting, notifying, communicating, forwarding, configuring, reconfiguring, allocating, mapping, and assigning.
  • For example, a functional block (component) that makes transmission work is called a transmitting unit or a transmitter.
  • The input and output information and the like may be stored in a specific location (for example, a memory) or may be managed using a management table. The input and output information and the like can be overwritten, updated, or appended. The output information and the like may be deleted. The input information and the like may be transmitted to another device.
  • A determination may be made by a value represented by one bit (0 or 1), by a Boolean value (true or false), or by a comparison of numerical values (for example, a comparison with a predetermined value).
  • Software, whether referred to as software, firmware, middleware, microcode, hardware description language, or by any other name, should be interpreted broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executable files, execution threads, procedures, functions, and the like.
  • software, instructions, information, etc. may be transmitted and received via a transmission medium.
  • For example, when the software is transmitted from a website, server, or other remote source using at least one of a wired technology (coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), etc.) and a wireless technology (infrared, microwave, etc.), at least one of these wired and wireless technologies is included within the definition of a transmission medium.
  • The terms "determining" and "deciding" as used in this disclosure may encompass a wide variety of actions. "Determining" and "deciding" can include, for example, regarding judging, calculating, computing, processing, deriving, investigating, looking up (searching, inquiring; for example, searching in a table, a database, or another data structure), or ascertaining as "determining" or "deciding".
  • "Determining" and "deciding" can also include regarding receiving (for example, receiving information), transmitting (for example, transmitting information), inputting, outputting, or accessing (for example, accessing data in a memory) as "determining" or "deciding".
  • "Determining" and "deciding" can further include regarding resolving, selecting, choosing, establishing, comparing, and the like as "determining" or "deciding". That is, "determining" and "deciding" can include regarding some action as "determining" or "deciding". Further, "determining (deciding)" may be read as "assuming", "expecting", "considering", and the like.
  • 1 ... Landing support system, 10 ... Drone, 17 ... Stereo camera, 20 ... Server device, 101 ... Sensor acquisition unit, 102 ... Distance calculation unit, 103 ... Danger level determination unit, 104 ... Flight instruction unit, 105 ... Flight control unit , 106 ... Damage scale determination unit, 107 ... Obstacle detection unit, 108 ... Shooting instruction unit, 109 ... Landing instruction unit, 110 ... Imaging unit.

Abstract

A sensor obtainment unit 101 obtains a captured image captured by a stereo camera 17 vertically below a host machine. A distance calculation unit 102 calculates a distance for each of pixels in the obtained captured image. A danger level determination unit 103 determines a danger level for landing in each of a plurality of segmented regions vertically below the host machine, on the basis of the calculated distance for each of the pixels. A flight instruction unit 104 repeatedly makes an instruction to move the host machine in a direction and an altitude according to an arrangement of the segmented regions in which the determined danger level is at least a first standard. When the altitude drops below a predetermined altitude, an obstruction detection unit 107 performs processing for detecting an obstruction from the obtained captured image. When no obstruction is detected, or when a landing condition is met despite an obstruction being detected, a landing instruction unit 109 instructs a flight control unit 105 to land the host machine based on the calculated distance for each of the pixels, and causes landing control to start.

Description

情報処理装置Information processing device
 本発明は、飛行体を着陸させる技術に関する。 The present invention relates to a technique for landing an air vehicle.
 飛行体を着陸させる技術として、特許文献1には、無人飛行機に搭載されているGPSにより着陸点近くまで飛行させたのちに着陸点に置かれたターゲットマークを撮像装置でとらえ、それを目指して着陸させる技術と、離着陸台のGPSが測定する位置情報とドローンの位置情報の差をゼロにするようにして着陸させる技術が開示されている。 As a technique for landing an air vehicle, Patent Document 1 states that a GPS mounted on an unmanned airplane is used to fly to near the landing point, and then a target mark placed at the landing point is captured by an imaging device, aiming at that. A technique for landing and a technique for landing so that the difference between the position information measured by the GPS of the takeoff and landing platform and the position information of the drone are made zero are disclosed.
JP-A-2018-190362
When landing a flying object such as a drone, a collision with an obstacle (a person, a vehicle, a building, etc.) near the landing point may damage not only the flying object but also the obstacle (and, if the obstacle is a person, cause injury).
Accordingly, an object of the present invention is to reduce the possibility of colliding with an obstacle when the flying object lands.
To achieve the above object, the present invention provides an information processing device including: an acquisition unit that acquires an image captured by a stereo camera provided on a flying object; a calculation unit that calculates, for each pixel of the acquired image, the distance from the stereo camera to the point shown by that pixel; a detection unit that detects an obstacle from the acquired image; and a landing instruction unit that, even if an obstacle is detected, instructs landing of the flying object based on the calculated distances when the position of the obstacle satisfies a landing condition.
According to the present invention, the possibility of a collision with an obstacle when the flying object lands can be reduced.
FIG. 1 shows an example of the hardware configuration of the drone according to the first example.
FIG. 2 shows the functional configuration realized by the drone.
FIG. 3 shows an example of the divided regions.
FIG. 4 shows an example of the weight table.
FIG. 5 shows an example of the danger level table.
FIG. 6 shows examples of the relationship between the arrangement of the divided regions and the movement direction.
FIG. 7 shows an example of the damage level table.
FIG. 8 shows an example of determined damage levels.
FIG. 9 shows an example of the operation procedure in the landing process.
FIG. 10 shows an example of the overall configuration of the landing support system according to the second example.
FIG. 11 shows an example of the hardware configuration of the server device.
FIG. 12 shows the functional configuration realized in the second example.
FIG. 13 shows an example of the operation procedure in the landing process of the second example.
[1] First Example
FIG. 1 shows an example of the hardware configuration of the drone 10 according to the first example. The drone 10 may be physically configured as a computer device including a processor 11, a memory 12, a storage 13, a communication device 14, a flight device 15, a sensor device 16, a stereo camera 17, a bus 18, and the like. In the following description, the word "device" can be read as a circuit, a device, a unit, or the like.
Each of the above devices may be included singly or in plurality, and some of the devices may be omitted. The processor 11 controls the entire computer by, for example, running an operating system. The processor 11 may be configured as a central processing unit (CPU) including an interface with peripheral devices, a control device, an arithmetic unit, registers, and the like.
For example, a baseband signal processing unit and the like may be realized by the processor 11. The processor 11 also reads a program (program code), software modules, data, and the like from at least one of the storage 13 and the communication device 14 into the memory 12 and executes various processes according to them. As the program, a program that causes a computer to execute at least part of the operations described in the examples is used.
Although the various processes described above have been explained as being executed by a single processor 11, they may be executed simultaneously or sequentially by two or more processors 11. The processor 11 may be implemented by one or more chips. The program may be transmitted from a network via a telecommunication line. The memory 12 is a computer-readable recording medium.
The memory 12 may be composed of at least one of, for example, a ROM (Read Only Memory), an EPROM (Erasable Programmable ROM), an EEPROM (Electrically Erasable Programmable ROM), and a RAM (Random Access Memory). The memory 12 may be referred to as a register, a cache, a main memory (main storage device), or the like. The memory 12 can store an executable program (program code), software modules, and the like for implementing a wireless communication method according to an embodiment of the present disclosure.
The storage 13 is a computer-readable recording medium and may be composed of at least one of, for example, an optical disc such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, a magneto-optical disk (e.g., a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory (e.g., a card, a stick, or a key drive), a floppy (registered trademark) disk, a magnetic strip, and the like.
The storage 13 may be called an auxiliary storage device. The above-mentioned storage medium may be, for example, a database, a server, or another suitable medium including at least one of the memory 12 and the storage 13. The communication device 14 is hardware (a transmission/reception device) for communication between computers via at least one of a wired network and a wireless network.
For example, the above-mentioned transmission/reception antenna, amplifier unit, transmission/reception unit, transmission line interface, and the like may be realized by the communication device 14. The transmission/reception unit may be implemented with the transmission unit and the reception unit physically or logically separated. The devices such as the processor 11 and the memory 12 are connected by a bus 18 for communicating information. The bus 18 may be configured as a single bus or as different buses between the devices.
The flight device 15 includes motors, rotors, and the like, and flies the host machine. In the air, the flight device 15 can move the host machine in any direction and hold it stationary (hovering). The sensor device 16 has a group of sensors that acquire information necessary for flight control. For example, the sensor device 16 includes a position sensor that measures the position (latitude and longitude) of the host machine.
The sensor device 16 also includes a direction sensor that measures the direction in which the host machine is facing (a front direction is defined for the drone, and the sensor measures which way that front direction points) and an altitude sensor that measures the altitude of the host machine. The sensor device 16 further includes a speed sensor that measures the speed of the host machine and an inertial measurement unit (IMU) that measures angular velocity about three axes and acceleration in three directions.
The stereo camera 17 includes a plurality of cameras and records information in the depth direction of an object by photographing it from a plurality of directions. In this example, the stereo camera 17 includes two cameras, each having an optical system including a lens and an image sensor. The stereo camera 17 is mounted so as to photograph the area vertically below the host machine.
The above devices may also include hardware such as a microprocessor, a digital signal processor (DSP), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array), and some or all of the functional blocks may be realized by such hardware. For example, the processor 11 may be implemented using at least one of these.
Each function of the drone 10 is realized by loading predetermined software (a program) onto hardware such as the processor 11 and the memory 12, whereby the processor 11 performs operations, controls communication by the communication device 14, and controls at least one of reading and writing of data in the memory 12 and the storage 13.
FIG. 2 shows the functional configuration realized by the drone 10. The drone 10 includes a sensor acquisition unit 101, a distance calculation unit 102, a danger level determination unit 103, a flight instruction unit 104, a flight control unit 105, a damage scale determination unit 106, an obstacle detection unit 107, a shooting instruction unit 108, and a landing instruction unit 109. The sensor acquisition unit 101 acquires measured values from the sensors provided on the host machine (the drone 10), which is a flying object.
In this example, the sensor acquisition unit 101 acquires, as the measured value of the image sensor, an image of the area vertically below the host machine captured by the stereo camera 17. In other words, the sensor acquisition unit 101 acquires, as sensor measurements, the pixel values output by the image sensor of the stereo camera 17 that represent the image vertically below the host machine. The sensor acquisition unit 101 is an example of the "acquisition unit" of the present invention. The sensor acquisition unit 101 supplies the acquired image to the distance calculation unit 102.
The distance calculation unit 102 calculates, for each pixel of the image acquired by the sensor acquisition unit 101, the distance from the stereo camera 17 to the point where the object shown by that pixel exists (hereinafter, the "per-pixel distance"). The distance calculation unit 102 is an example of the "calculation unit" of the present invention. Since the stereo camera 17 captures the same point from a plurality of angles, the distance from the stereo camera 17 to the point where the object shown by each pixel exists can be calculated.
Here, the captured image does not necessarily show the ground; it may show any object such as a building, a natural object, a water surface, or an animal. In this example, the distance from the stereo camera 17 can be regarded as the distance from the drone 10. For the method of calculating the per-pixel distance, a well-known technique such as that described in JP-A-11-230745 may be used.
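As one concrete illustration of this step, the following is a minimal sketch of per-pixel distance estimation from a stereo pair. It assumes a rectified 8-bit grayscale image pair, a known focal length in pixels, a known baseline, and OpenCV's block matcher; the patent itself defers to a known method (e.g., JP-A-11-230745), so every parameter here is an assumption for illustration only.

```python
import cv2
import numpy as np

def per_pixel_distance(left_gray, right_gray, focal_px, baseline_m):
    """Distance map (meters) for each pixel of the left image.
    Inputs must be rectified 8-bit single-channel images."""
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    # compute() returns a fixed-point disparity map scaled by 16
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan        # unmatched pixels: distance unknown
    return focal_px * baseline_m / disparity  # Z = f * B / d
```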
After calculating the per-pixel distances, the distance calculation unit 102 supplies distance information indicating them to the danger level determination unit 103. Based on the measured values acquired by the sensor acquisition unit 101, the danger level determination unit 103 determines the danger level for landing in each of a plurality of divided regions obtained by dividing the ground area vertically below the host machine.
The danger level for landing indicates how likely it is that a safe landing cannot be made. A safe landing is a landing without any damage. That is, the danger level represents how likely it is that, for example, the drone 10 contacts some object (including humans and other animals) during landing or loses its attitude on touchdown (including tipping over as a result), thereby damaging the drone 10 itself or damaging (including injuring) the object it contacts.
FIG. 3 shows an example of the divided regions. In this example, when the host machine is viewed vertically downward from above, the danger level determination unit 103 uses as the divided regions B1, B2, B3, and B4 four square regions that share the host machine as a vertex and have sides of equal length. The danger level determination unit 103 determines the danger level based on the proportion of unsuitable area, i.e., area not suitable for landing, in each divided region, calculated from the measured values acquired by the sensor acquisition unit 101.
An unsuitable area is, for example, an area such as a commercial area, a residential area, an industrial area, or a park where people gather, so that the possibility of injuring a person in a fall is higher than in other areas. An unsuitable area may also be an area such as a forest, a slope, a utility pole, or a vehicle where the drone 10 is more likely than in other areas to be damaged by contacting some object or tipping over when it attempts to land.
In this example, the danger level determination unit 103 first recognizes objects existing on the ground from the distances indicated by the supplied distance information, that is, the per-pixel distances calculated by the distance calculation unit 102. Objects such as buildings, vehicles, people, animals, utility poles, trees, forests, slopes, and flat ground may exist on the ground. The danger level determination unit 103 calculates the height above the ground of the object shown by each pixel from the distance between that pixel and the host machine and from the altitude of the host machine.
Once the height of each pixel representing an object is known, the contour of that object in three-dimensional space is represented. The danger level determination unit 103 stores a shape-and-size pattern for each type of object and, when the similarity to a stored pattern exceeds a threshold, recognizes the object as the one corresponding to that pattern. In this way, the danger level determination unit 103 recognizes objects existing on the ground from the pixel values acquired by the sensor acquisition unit 101.
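This recognition can be pictured roughly as follows: convert each per-pixel distance into a height above the ground (for a camera pointing straight down, height = altitude minus distance), segment the above-ground blobs, and compare each blob's outline against stored per-type shape patterns. This is only a hedged sketch; the 0.5 m height cutoff, the dissimilarity threshold, and the use of OpenCV contour matching in place of the unspecified similarity measure are all assumptions.

```python
import cv2
import numpy as np

def recognize_objects(distance_m, altitude_m, patterns, max_dissimilarity=0.3):
    """distance_m: per-pixel distance map; patterns: dict mapping an object
    type name to a template contour of its expected outline."""
    height = altitude_m - distance_m               # nadir camera: h = H - Z
    mask = np.nan_to_num(height) > 0.5             # blobs taller than 0.5 m (assumed)
    contours, _ = cv2.findContours(mask.astype(np.uint8),
                                   cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    recognized = []
    for contour in contours:
        for label, template in patterns.items():
            # matchShapes returns a dissimilarity: small means similar shapes
            if cv2.matchShapes(contour, template,
                               cv2.CONTOURS_MATCH_I1, 0) < max_dissimilarity:
                recognized.append((label, contour))
                break
    return recognized
```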
The danger level determination unit 103 then determines the danger level based on the proportion of unsuitable area, calculated by weighting the area occupied by each recognized object according to the attribute of that object. The danger level determination unit 103 uses a weight table that associates object attributes with weights.
FIG. 4 shows an example of the weight table. In the example of FIG. 4, object attributes are associated with weights: "1.0" for "slope", "1.5" for "building, utility pole, tree", "2.0" for "vehicle, person, animal", and so on.
When the danger level determination unit 103 recognizes the objects existing in a divided region, it calculates, for every recognized object, the value obtained by multiplying the number of pixels of that object by the weight corresponding to its attribute. The danger level determination unit 103 then calculates, as the proportion of unsuitable area, the sum of these values divided by the total number of pixels of the divided region. The danger level determination unit 103 determines the danger level using, for example, a danger level table that associates proportions of unsuitable area with danger levels.
FIG. 5 shows an example of the danger level table. In the example of FIG. 5, the proportions of unsuitable area "less than Th11", "Th11 or more and less than Th12", and "Th12 or more" are associated with the danger levels "danger Lv1", "danger Lv2", and "danger Lv3". Th11 and Th12 are thresholds on the proportion of unsuitable area, for example 30% and 60%. A larger number indicates a higher danger level. For example, the danger level determination unit 103 determines that a divided region whose proportion of unsuitable area is Th11 or more and less than Th12 has danger level Lv2.
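Put as code, the weighted ratio and the table lookup could look like the following sketch. The attribute names are taken from the weight table of FIG. 4, and TH11/TH12 use the 30%/60% examples from the text; everything else is an illustrative assumption.

```python
WEIGHTS = {"slope": 1.0,
           "building": 1.5, "utility_pole": 1.5, "tree": 1.5,
           "vehicle": 2.0, "person": 2.0, "animal": 2.0}
TH11, TH12 = 0.30, 0.60   # example thresholds from the text (30% / 60%)

def danger_level(recognized, region_pixels):
    """recognized: list of (attribute, pixel_count) for one divided region;
    region_pixels: total pixel count of that region."""
    weighted = sum(WEIGHTS[attr] * count for attr, count in recognized)
    ratio = weighted / region_pixels
    if ratio < TH11:
        return 1          # danger Lv1
    if ratio < TH12:
        return 2          # danger Lv2
    return 3              # danger Lv3
```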
After determining the danger level of each divided region, the danger level determination unit 103 supplies the determination results to the flight instruction unit 104. The flight instruction unit 104 instructs the host machine (the drone 10) to move in a direction according to the arrangement of the divided regions whose danger level determined by the danger level determination unit 103 is at or above a predetermined criterion.
The flight instruction unit 104 gives this instruction to the flight control unit 105. The flight control unit 105 controls the flight of the host machine (the drone 10) based on the various values measured by the sensor device 16 (position, altitude, direction, etc.). When the flight instruction unit 104 issues a movement instruction, the flight control unit 105 performs control to move the host machine in the instructed direction.
FIG. 6 shows examples of the relationship between the arrangement of the divided regions and the movement direction. In FIG. 6, the reference danger level is assumed to be Lv3. When there are three divided regions at danger Lv3 as in FIG. 6(a), the flight instruction unit 104 designates as the movement direction the direction A1 toward the center of the one remaining divided region whose danger level is below the criterion. In this example, the flight instruction unit 104 instructs movement to the center C1 of the divided region B1 lying in the direction A1.
When there are two adjacent divided regions at danger Lv3 as in FIG. 6(b), the flight instruction unit 104 designates as the movement direction the direction A2 toward the boundary between the two remaining divided regions whose danger levels are below the criterion. In this case, in this example, the flight instruction unit 104 instructs movement to the center C2 of the shared side of the divided regions B1 and B2 lying in the direction A2. Note that in the case of FIG. 6(a), the flight instruction unit 104 may instead instruct movement to the vertex C3 on the diagonal of the divided region B1.
Likewise, in the case of FIG. 6(b), it may instruct movement to the far end C4 of the shared side of the divided regions B1 and B2. In the following description, however, movement to the positions corresponding to the centers C1 and C2 is assumed to be instructed. When there is only one divided region at danger Lv3 as in FIG. 6(c), the flight instruction unit 104 designates as the movement direction the direction A3 toward the center of the divided region facing it across the drone 10.
When there are only two divided regions at danger Lv3 and they are not adjacent to each other as in FIG. 6(d), the flight instruction unit 104 designates as the movement direction the direction A4-1 or A4-2 toward the center of one of the two divided regions adjacent to them. The flight instruction unit 104 may select either direction A4-1 or A4-2 at random, or may compare the danger levels of the corresponding divided regions and select the lower one.
If the two divided regions have the same danger level, the flight instruction unit 104 may compare their proportions of unsuitable area and select the smaller one. Also, when there are two adjacent divided regions at danger Lv3, the flight instruction unit 104 may designate as the movement direction the direction toward the boundary between the two remaining divided regions whose danger levels are below the criterion; however, if, as in FIG. 6(e), the danger levels of the two remaining divided regions are below the criterion but differ, it may designate as the movement direction the direction A5 toward the center of the divided region with the lower danger level.
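The direction rules of FIGS. 6(a) to 6(e) can be summarized in a short decision routine. In the sketch below, the quadrant center bearings, the adjacency and opposite tables, and the tie-breaking are assumptions made only to give the rules a concrete form.

```python
import math
import random

CENTER_BEARING = {"B1": 45.0, "B2": 135.0, "B3": 225.0, "B4": 315.0}  # assumed layout
ADJACENT = {"B1": {"B2", "B4"}, "B2": {"B1", "B3"},
            "B3": {"B2", "B4"}, "B4": {"B1", "B3"}}
OPPOSITE = {"B1": "B3", "B2": "B4", "B3": "B1", "B4": "B2"}

def _between(a_deg, b_deg):
    """Bearing toward the shared border of two adjacent quadrants."""
    ax, ay = math.cos(math.radians(a_deg)), math.sin(math.radians(a_deg))
    bx, by = math.cos(math.radians(b_deg)), math.sin(math.radians(b_deg))
    return math.degrees(math.atan2(ay + by, ax + bx)) % 360

def move_bearing(danger, criterion=3):
    """danger: dict region id -> danger level. Returns a bearing in degrees,
    or None when no lateral move is derived (all regions at the criterion,
    handled by climbing, or none at it)."""
    safe = sorted(r for r in danger if danger[r] < criterion)
    if not safe or len(safe) == 4:
        return None
    if len(safe) == 1:                                  # Fig. 6(a)
        return CENTER_BEARING[safe[0]]
    if len(safe) == 3:                                  # Fig. 6(c)
        bad = next(r for r in danger if danger[r] >= criterion)
        return CENTER_BEARING[OPPOSITE[bad]]
    a, b = safe                                         # exactly two safe regions
    if danger[a] != danger[b]:                          # Fig. 6(e): safer center
        a, b = (a, b) if danger[a] < danger[b] else (b, a)
        return CENTER_BEARING[a]
    if b in ADJACENT[a]:                                # Fig. 6(b): shared border
        return _between(CENTER_BEARING[a], CENTER_BEARING[b])
    return CENTER_BEARING[random.choice([a, b])]        # Fig. 6(d)
```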
Depending on the area over which the drone 10 is flying, all four divided regions may be at danger Lv3. In that case, the flight instruction unit 104 attempts the following two flight instructions. When there is no divided region whose danger level determined by the danger level determination unit 103 is below the first criterion, the flight instruction unit 104 first instructs raising the altitude and then reacquiring the pixel values.
Specifically, the flight instruction unit 104 instructs the flight control unit 105 to raise the altitude and instructs the sensor acquisition unit 101 to reacquire the pixel values. When the altitude has been raised and the pixel values are acquired as instructed, the danger level determination unit 103 determines the danger levels of the divided regions, which now cover a wider range than before the climb. As a result, a divided region whose danger level is below the first criterion may come into existence, so the drone can attempt a landing after moving to an area where a safer landing is possible.
If no divided region with a danger level below the first criterion is found, the flight instruction unit 104 may repeat the above instruction to raise the altitude. However, if no such divided region is found even after repeating the instruction a set number of times (for example, the number of times it takes to reach an altitude at which flight is prohibited), the flight instruction unit 104 issues a movement instruction that takes into account the damage that would result if the landing failed.
The damage scale determination unit 106 determines, for each of the plurality of divided regions, a damage level indicating the magnitude of damage if the landing fails. The damage scale determination unit 106 is supplied with distance information indicating the per-pixel distances from the distance calculation unit 102. From the per-pixel distances indicated by the supplied distance information, the damage scale determination unit 106 recognizes objects existing on the ground, for example by the same pattern-based method as the danger level determination unit 103. From the recognized objects, the damage scale determination unit 106 identifies the attribute of each divided region.
For example, the damage scale determination unit 106 identifies a divided region containing trees and slopes at or above a certain proportion as "forest". It identifies a divided region containing crops and agricultural machinery (tractors, etc.) at or above a certain proportion as "agricultural land", and one containing factory equipment (pipelines, warehouses, etc.) at or above a certain proportion as "industrial land". It identifies a divided region containing detached houses and housing complexes at or above a certain proportion as "residential land", and one containing arcades and signboards at or above a certain proportion as "commercial land".
The method of identifying the attribute of a divided region is not limited to this. For example, the sensor acquisition unit 101 may acquire position information indicating the position of the host machine as a sensor measurement and supply it to the damage scale determination unit 106.
The damage scale determination unit 106 then stores map information indicating the attribute of each area and identifies the attribute of the position indicated by the supplied position information. In this way, the damage scale determination unit 106 may use measurements from a sensor different from those used by the danger level determination unit 103 and the like.
Since the per-pixel distance is also a value calculated from measured values, in either case the damage scale determination unit 106 determines the damage level based on the measured values acquired by the sensor acquisition unit 101 (whether directly, as with the position information, or indirectly, as with the per-pixel distances). The damage scale determination unit 106 determines the damage level using, for example, a damage level table that associates attributes of divided regions with damage levels.
FIG. 7 shows an example of the damage level table. In the example of FIG. 7, attributes of divided regions are associated with damage levels: "damage Lv1" for "forest, agricultural land", "damage Lv2" for "industrial land", and "damage Lv3" for "residential land, commercial land". The damage scale determination unit 106 determines, as the damage level of a divided region, the damage level associated in the damage level table with the attribute identified for that region as described above.
After determining the damage level of each divided region, the damage scale determination unit 106 supplies the determination results to the flight instruction unit 104. When there is no divided region whose danger level determined by the danger level determination unit 103 is below the criterion, the flight instruction unit 104 instructs the host machine to move in the direction of the divided region with the smallest damage level determined by the damage scale determination unit 106 and to descend.
FIG. 8 shows an example of determined damage levels. In the example of FIG. 8, the danger levels of the divided regions B1 to B4 are all Lv3, but the damage levels are determined to be Lv1 for B1, Lv2 for B3, and Lv3 for B2 and B4. In this case, the flight instruction unit 104 designates as the movement direction the direction A6 toward the center of the divided region B1, which has the smallest damage level. By designating the movement direction in consideration of the damage level in this way, even if the landing of the drone 10 should fail, the resulting damage can be kept to a minimum.
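A sketch combining the damage table of FIG. 7 with the fallback behavior described above (climb and rescan a limited number of times, then head for the least-damage region) might look as follows; the retry limit and the attribute key strings are assumptions.

```python
DAMAGE = {"forest": 1, "agricultural": 1,
          "industrial": 2,
          "residential": 3, "commercial": 3}   # damage table of Fig. 7

def fallback_action(region_attrs, climb_count, max_climbs=3):
    """region_attrs: dict region id -> attribute string. Called when every
    divided region is at danger Lv3."""
    if climb_count < max_climbs:
        return ("climb_and_rescan", None)
    least = min(region_attrs, key=lambda r: DAMAGE[region_attrs[r]])
    return ("move_and_descend", least)          # e.g. region B1 in Fig. 8
```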
When a divided region whose danger level determined by the danger level determination unit 103 is below a predetermined criterion lies in the direction in which the host machine (the drone 10) is moving, the flight instruction unit 104 also instructs a descent of a predetermined distance. In this example, danger Lv2 is used as the second criterion. In each of the examples of FIGS. 6(a) to 6(e), the flight instruction unit 104 moves toward a divided region at danger Lv1, so it also instructs a descent of the predetermined distance.
On the other hand, when the divided region in the movement direction A6 is at danger Lv2 as in FIG. 6(f), the flight instruction unit 104 instructs only movement, without instructing a descent. When instructed only to move, the flight control unit 105 controls the host machine to move horizontally; when instructed to move and descend, it controls the host machine to move diagonally downward. In the latter case, the flight control unit 105 may instead control the host machine to move horizontally first and then descend.
The flight instruction unit 104 repeats the above movement and descent instructions, and when the altitude of the host machine (the drone 10) falls below a predetermined altitude, it instructs the flight control unit 105 to perform landing control based on the measurement results of a distance measuring means. The distance measuring means is a means of measuring the distance to the ground; in this example, the distance calculation unit 102 is used as the distance measuring means, and the distances it calculates are used as the measurement results.
The distance calculation unit 102 also supplies the distance information indicating the per-pixel distances to the obstacle detection unit 107. The obstacle detection unit 107 detects obstacles in the image captured by the stereo camera 17 and acquired by the sensor acquisition unit 101. The obstacle detection unit 107 is an example of the "detection unit" of the present invention. An obstacle is an object that interferes with the landing of the drone 10 and may be a building, a natural object, an animal (including a human), or the like.
The obstacle detection unit 107 detects an obstacle when the shape indicated by the distances calculated for the pixels that may show the obstacle falls within the range allowed for that obstacle. The shape indicated by the distances is the same as the object contour used when the danger level determination unit 103 recognizes objects. Like the danger level determination unit 103, the obstacle detection unit 107 stores a shape-and-size pattern for each type of obstacle.
The obstacle detection unit 107 then determines that the shape is within the range allowed for an obstacle when the similarity to a stored pattern exceeds a threshold, and detects the object as the obstacle corresponding to that pattern. Upon detecting an obstacle, the obstacle detection unit 107 supplies pixel information indicating the pixels in which the detected obstacle appears to the shooting instruction unit 108. When an obstacle is detected by the obstacle detection unit 107, the shooting instruction unit 108 instructs shooting of the obstacle from another direction. The shooting instruction unit 108 is an example of the "shooting instruction unit" of the present invention.
The shooting instruction unit 108 determines the position on the ground of the detected obstacle from the positions of the pixels indicated by the supplied pixel information and instructs the flight control unit 105 to move the host machine within a range that keeps that position in the angle of view. The shooting instruction unit 108 then instructs the stereo camera 17 to shoot after the move, thereby instructing shooting of the obstacle from another direction. The obstacle detection unit 107 detects the obstacle again, adding the image shot in response to the shooting instruction from the shooting instruction unit 108.
By re-shooting and re-detecting in this way, obstacles photographed from a plurality of angles can be detected, and the spatial region occupied by an obstacle can be grasped more accurately than when detecting from a single image. The obstacle detection unit 107 supplies pixel information indicating the pixels in which the obstacle thus detected appears to the landing instruction unit 109. The landing instruction unit 109 instructs the flight control unit 105 to land the host machine (the drone 10) based on the per-pixel distances calculated by the distance calculation unit 102. The landing instruction unit 109 is an example of the "landing instruction unit" of the present invention.
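One way this re-detection step could combine the two viewpoints is to keep only the obstacles whose ground positions roughly agree across views, as in the hedged sketch below. The 1 m agreement radius is an assumption; the patent says only that the second image is added before detecting again.

```python
import math

def confirm_obstacles(first_view, second_view, max_dist_m=1.0):
    """Each view: list of (x_m, y_m) ground positions of detected obstacles.
    Returns obstacles seen at roughly the same position in both views."""
    confirmed = []
    for (x1, y1) in first_view:
        if any(math.hypot(x1 - x2, y1 - y2) <= max_dist_m
               for (x2, y2) in second_view):
            confirmed.append((x1, y1))
    return confirmed
```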
If there is no obstacle in the landing direction, the landing instruction unit 109 instructs the host machine to land by simply lowering its altitude. Even when an obstacle is detected in the landing direction, if the position of the obstacle satisfies the landing condition, the landing instruction unit 109 instructs the flight control unit 105 to land the host machine based on the calculated per-pixel distances. The landing instruction unit 109 determines that the landing condition is satisfied when, for example, the obstacle detected by the obstacle detection unit 107 is not within a predetermined range of the captured image.
The predetermined range is, for example, a circular range of a predetermined radius centered on the captured image. The shape of the range is not limited to a circle and may be an ellipse, a rectangle, a quadrilateral, or the like, and it may be offset from the center of the image. Hereinafter, this range is referred to as the "landing judgment range". This allows a landing to proceed, for example, when an obstacle appears only near the edge of the captured image.
Even when an obstacle detected by the obstacle detection unit 107 is within the landing judgment range, the landing instruction unit 109 determines that the landing condition is satisfied if the obstacle is moving and will leave the landing judgment range before the host machine lands. A moving obstacle is, for example, a vehicle or an animal (including a human). When landing is instructed, the flight control unit 105 lowers the altitude of the host machine and ends flight control when it lands on the ground or on an object on the ground.
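The landing condition itself reduces to a geometric test. The sketch below assumes a circular landing judgment range centered on the image and a constant-velocity prediction for moving obstacles; both the radius and the prediction model are illustrative, since the text leaves them open.

```python
import math

def landing_condition_met(obstacles, img_w, img_h, radius_px, descent_time_s=0.0):
    """obstacles: list of dicts with pixel position 'x', 'y' and optional
    pixel velocity 'vx', 'vy'. True when no obstacle will be inside the
    landing judgment circle at touchdown."""
    cx, cy = img_w / 2.0, img_h / 2.0
    for ob in obstacles:
        x = ob["x"] + ob.get("vx", 0.0) * descent_time_s
        y = ob["y"] + ob.get("vy", 0.0) * descent_time_s
        if math.hypot(x - cx, y - cy) <= radius_px:
            return False              # obstacle (still) inside the range
    return True
```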
Based on the above configuration, the drone 10 performs a landing process for landing itself.
FIG. 9 shows an example of the operation procedure in the landing process. The operation procedure of FIG. 9 is started, for example, when the drone 10 begins flying its planned flight path. First, the drone 10 (sensor acquisition unit 101) acquires, as the measured value of the image sensor, an image of the area vertically below the host machine captured by the stereo camera 17 (step S11).
Next, the drone 10 (distance calculation unit 102) calculates the distance to the point shown by each pixel of the acquired image (step S12). Then, based on the calculated per-pixel distances, the drone 10 (danger level determination unit 103) determines the danger level for landing in each of the plurality of divided regions vertically below the host machine (step S13). Next, the drone 10 (flight instruction unit 104) instructs the host machine to move in a direction and to an altitude according to the arrangement of the divided regions whose determined danger level is at or above the first criterion (step S14).
The drone 10 then determines whether its altitude has fallen below the predetermined altitude (step S15); if not (NO), it returns to step S11 and continues operating, and if so (YES), it proceeds to the next steps. First, the drone 10 (sensor acquisition unit 101) acquires an image of the area vertically below the host machine, as in step S11 (step S21).
Next, the drone 10 (distance calculation unit 102) calculates the distance to the point shown by each pixel of the acquired image (step S22). Then, based on the calculated per-pixel distances, the drone 10 (obstacle detection unit 107) performs processing for detecting an obstacle in the acquired image (step S23). If an obstacle is detected in step S23 (YES), the drone 10 determines whether the landing condition is satisfied (step S24).
If the drone 10 determines in step S24 that the landing condition is not satisfied (NO), it returns to step S21 and continues operating; if it determines that the landing condition is satisfied (YES), it proceeds to the next step. When no obstacle is detected in step S23 (NO), or when the landing condition is satisfied in step S24 (YES), the drone 10 (landing instruction unit 109) instructs the flight control unit 105 to land the host machine based on the calculated per-pixel distances and starts landing control (step S25).
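Wiring the steps of FIG. 9 together gives the control loop sketched below. Every method on the `drone` object is a placeholder for one of the units described above, not a real API, and the 10 m threshold stands in for the unspecified "predetermined altitude".

```python
def landing_procedure(drone, landing_altitude_m=10.0):
    # Phase 1 (S11-S15): steer toward safer ground until below the altitude
    while drone.altitude() >= landing_altitude_m:
        image = drone.capture_stereo()                        # S11
        distances = drone.compute_distances(image)            # S12
        danger = drone.judge_danger(distances)                # S13
        drone.move_by_danger(danger)                          # S14: move / descend
    # Phase 2 (S21-S25): obstacle check, then landing control
    while True:
        image = drone.capture_stereo()                        # S21
        distances = drone.compute_distances(image)            # S22
        obstacles = drone.detect_obstacles(image, distances)  # S23
        if not obstacles or drone.landing_condition_met(obstacles):  # S24
            drone.land(distances)                             # S25
            return
```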
In this example, as described above, among the plurality of divided regions, a divided region with a lower danger level than the others, that is, a place where the flying object has a relatively high possibility of landing safely, can be searched for. Furthermore, in this example, not only movement but also descent is instructed, so the extent of the safe divided region can be narrowed down gradually. Then, since landing control is performed after moving to the divided region found in this way, the flying object can be landed more safely than when landing control is performed regardless of the danger levels of the divided regions.
When landing control is performed based on the measurement results of the distance measuring means as described above, the lower the accuracy of the measurement results, the more likely undesirable behavior becomes (for example, avoiding obstacles more than necessary). In this example, since landing control is performed after the drone 10 has descended below the predetermined altitude, the occurrence of undesirable behavior caused by low measurement accuracy can be suppressed. Furthermore, in this example, the danger level is determined based on the proportion of unsuitable area not suited to landing, so the danger level can be determined more accurately than when that proportion is not taken into account.
In this example, even if the drone 10 detects an obstacle, it performs landing control when the landing condition is satisfied. This reduces the possibility of colliding with an obstacle when the flying object lands, while preventing situations such as the battery running out because a minor obstacle keeps the drone from ever landing. In addition, images captured by the stereo camera 17 may be used for other purposes such as stereoscopic viewing, and obstacles can thus be detected using images captured for such other purposes.
[2] Second Example
The second example of the present invention is described below, focusing on the differences from the first example. In the first example, the drone 10 itself performed all the processing related to landing; in the second example, the drone 10 and a server device share that processing.
FIG. 10 shows an example of the overall configuration of the landing support system 1 according to the second example. The landing support system 1 is a system that supports the landing of a flying object such as the drone 10.
The landing support system 1 includes a network 2, the drone 10, and a server device 20. The network 2 is a communication system including a mobile communication network, the Internet, and the like, and relays the exchange of data between devices that access it. The server device 20 accesses the network 2 by wired communication (wireless communication is also possible), and the drone 10 accesses it by wireless communication.
FIG. 11 shows an example of the hardware configuration of the server device 20. The server device 20 may be physically configured as a computer device including a processor 21, a memory 22, a storage 23, a communication device 24, a bus 25, and the like. Hardware with the same names as in FIG. 1, such as the processor 21, is of the same kind as in FIG. 1, apart from differences in performance, specifications, and the like.
FIG. 12 shows the functional configuration realized in this example. The drone 10 includes the flight control unit 105 and an image capturing unit 110. The server device 20 includes the sensor acquisition unit 101, the distance calculation unit 102, the danger level determination unit 103, the flight instruction unit 104, the damage scale determination unit 106, the obstacle detection unit 107, the shooting instruction unit 108, and the landing instruction unit 109. The image capturing unit 110 includes the stereo camera 17 and captures images of the area vertically below the host machine.
The image capturing unit 110 transmits captured-image data representing the captured image to the server device 20. The sensor acquisition unit 101 of the server device 20 acquires the captured image represented by the transmitted data as the measured value of the image sensor. From then on, the units from the sensor acquisition unit 101 to the landing instruction unit 109 operate as in the first example to support the landing of the drone 10. The difference from the first example is that the instructions from the flight instruction unit 104, the shooting instruction unit 108, and the landing instruction unit 109 are given by transmitting instruction data to the drone 10.
FIG. 13 shows an example of the operation procedure in the landing process of this example. The operation procedure of FIG. 13 is started, for example, when the drone 10 begins flying its planned flight path. First, the drone 10 (image capturing unit 110) captures an image of the area vertically below the host machine and measures the altitude of the host machine (step S31). Next, the drone 10 (image capturing unit 110) transmits the captured image and captured-image data indicating the altitude of the host machine to the server device 20 (step S32).
The server device 20 determines whether the altitude of the drone 10 indicated by the transmitted captured-image data is below a predetermined altitude (step S33). If it determines in step S33 that the altitude is not below the predetermined altitude (NO), the server device 20 (sensor acquisition unit 101) acquires, as the measured value of the image sensor, the image captured vertically below the host machine by the stereo camera 17 (step S41).
Next, the server device 20 (distance calculation unit 102) calculates, as the per-pixel distance, the distance to the point shown by each pixel of the acquired image (step S42). Then, the server device 20 (danger level determination unit 103) determines the danger levels of the plurality of divided regions based on the calculated per-pixel distances (step S43). Next, the server device 20 (flight instruction unit 104) generates instruction data indicating an instruction to move the host machine in a direction and to an altitude according to the arrangement of the divided regions whose determined danger level is at or above the first criterion (step S44) and transmits it to the drone 10 (step S45).
The drone 10 (flight control unit 105) controls its flight in accordance with the instruction indicated by the transmitted instruction data (step S46). Thereafter, the operations of steps S31 to S46 are repeated until it is determined in step S33 that the altitude is below the predetermined altitude (YES). When that determination is made, first, steps S51, S52, S53, and S54, which are the same operations as steps S31, S32, S41, and S42, are performed.
Next, the server device 20 (obstacle detection unit 107) performs processing for detecting an obstacle in the acquired image based on the per-pixel distances calculated in step S54 (step S55). In the example of FIG. 13, it is assumed that an obstacle is detected. The server device 20 determines whether the landing condition is satisfied (step S56); if it determines that the landing condition is not satisfied (NO), it returns to step S52 and continues operating, and if it determines that the landing condition is satisfied (YES), it proceeds to the next step.
If it is determined in step S56 that the landing condition is satisfied (YES), the server device 20 (landing instruction unit 109) generates instruction data indicating the calculated per-pixel distances and an instruction to land based on those distances (step S57), and transmits it to the drone 10 (step S58). The drone 10 (flight control unit 105) controls its own landing in accordance with the instruction indicated by the transmitted instruction data (step S59).
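Putting steps S33 through S59 together, the branching can be read as the control loop sketched below. Every helper here (capture_below, judge_risk, landing_condition_met, and so on) is an assumed stand-in for the corresponding unit, not an API defined by this disclosure.

```python
PREDETERMINED_ALTITUDE_M = 30.0  # assumed value for the step S33 threshold

def landing_process(drone, server):
    """Minimal sketch of the FIG. 13 procedure (steps S31-S59)."""
    while True:
        image, altitude = drone.capture_below()                 # S31 / S51
        if altitude >= PREDETERMINED_ALTITUDE_M:                # S33: NO
            distances = server.per_pixel_distance(image)        # S41-S42
            risks = server.judge_risk(distances)                # S43
            drone.move(server.move_instruction(risks))          # S44-S46
            continue
        distances = server.per_pixel_distance(image)            # S53-S54
        obstacles = server.detect_obstacles(image, distances)   # S55
        if server.landing_condition_met(obstacles):             # S56: YES
            drone.land(server.landing_instruction(distances))   # S57-S59
            return
        # S56: NO -> return to S52 and keep observing
```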
In this embodiment, processing such as calculating the per-pixel distances, determining the risk level, determining the damage scale, and detecting obstacles is performed by the server device 20, so the processing load on the drone 10 is reduced compared with the first embodiment. As a result, the computer resources mounted on the drone 10, such as the processor 11, can be made lighter and smaller. In the first embodiment, on the other hand, no communication is required for flight instructions and the like, so landing can be supported without being affected by communication conditions.
[3] Modifications  The embodiments described above are merely examples of how the present invention may be implemented, and may be modified as follows. The embodiments and the modifications may also be combined as needed. When combining them, the modifications may be implemented with priorities assigned (ranked so as to decide which takes precedence when implementing two modifications together would produce conflicting behavior).
[3-1] Sensor acquisition unit 101  In the embodiments above, the sensor acquisition unit 101 acquires the measurement of the image sensor of the stereo camera 17, but this is not a limitation. The sensor acquisition unit 101 may, for example, acquire, as a sensor measurement, position information indicating the drone's own position measured by a position sensor included in the drone's sensor device 16.
The sensor acquisition unit 101 supplies the acquired position information to the risk determination unit 103, which determines the risk level based on it. For this purpose, the risk determination unit 103 stores, for example, a map representing, for each area, the attributes of objects present on the ground. The object attributes are, for example, those shown in the weight table of FIG. 4.
The risk determination unit 103 first takes, as the ground area, a predetermined area of the stored map that includes the position indicated by the position information acquired by the sensor acquisition unit 101. It then determines the risk level of each divided region obtained by dividing the ground area, based on the ratio of unsuitable area calculated by weighting the area occupied by each object present in that region according to the object's attribute.
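A minimal sketch of that weighted-ratio computation follows. The attribute weights and level thresholds are placeholders standing in for the weight table of FIG. 4 and the risk table of FIG. 5, whose actual values are not reproduced here.

```python
# Placeholder weights per object attribute (stand-ins for the FIG. 4 table).
ATTRIBUTE_WEIGHT = {"building": 1.0, "water": 0.9, "road": 0.8, "grass": 0.2}

def risk_level(region_area_m2: float, objects: list[tuple[str, float]]) -> int:
    """Risk level of one divided region from its weighted unsuitable-area ratio.

    objects: (attribute, occupied_area_m2) pairs inside the region.
    The Lv1/Lv2/Lv3 thresholds below are illustrative assumptions.
    """
    weighted = sum(ATTRIBUTE_WEIGHT.get(attr, 1.0) * area
                   for attr, area in objects)
    ratio = weighted / region_area_m2
    if ratio < 0.1:
        return 1  # Lv1
    if ratio < 0.4:
        return 2  # Lv2
    return 3      # Lv3
```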
As described in the first embodiment, the risk determination unit 103 recognizes the objects, applies weights using the weight table of FIG. 4, and determines the risk level using the risk table of FIG. 5. In this modification, the risk level is determined without using a captured image, so it can be determined even in situations where the ground cannot be clearly photographed from the sky, for example in fog. The first embodiment, on the other hand, allows obstacles to be detected using images captured for another purpose (such as images for stereoscopic viewing).
In this modification, as also described in the first embodiment, when there is no divided region whose risk level as determined by the risk determination unit 103 is below the first criterion (in the example of FIG. 6, no divided region at risk Lv1 or Lv2), the risk determination unit 103 enlarges the ground area on the stored map and determines the risk levels again. The ground area is the area before it is divided into divided regions; in the example of FIG. 6, it is the combined area of divided regions B1, B2, B3, and B4.
By enlarging the ground area, the enlarged portion may come to include low-risk areas. This creates the possibility that a divided region with a risk level below the first criterion exists, so the drone can attempt landing after moving to an area where a safer landing is possible. If no divided region with a risk level below the first criterion is found even after one re-determination, the risk determination unit 103 may repeatedly enlarge the ground area and re-determine the risk levels. Doing so makes it more likely that a divided region with a risk level below the first criterion will be found.
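The enlarge-and-retry behavior might look like the sketch below, which builds on the risk_level() sketch above; map_db.divide() and the expansion cap are assumptions of this illustration.

```python
FIRST_CRITERION = 3   # assumed: regions at Lv3 or above are unsuitable
MAX_EXPANSIONS = 5    # assumed cap so the search terminates

def find_safe_region(map_db, center, initial_size_m: float):
    """Enlarge the ground area until a divided region falls below the
    first criterion, or give up after MAX_EXPANSIONS doublings."""
    size = initial_size_m
    for _ in range(MAX_EXPANSIONS):
        regions = map_db.divide(center, size)  # divided regions B1..Bn
        safe = [r for r in regions
                if risk_level(r.area_m2, r.objects) < FIRST_CRITERION]
        if safe:
            return min(safe, key=lambda r: risk_level(r.area_m2, r.objects))
        size *= 2.0  # enlarge the ground area and re-determine
    return None      # no acceptable region found within the cap
```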
[3-2] Flying object  In the embodiments, a rotary-wing type flying object was used as the flying object, but this is not a limitation. The flying object may be, for example, an airplane-type flying object or a helicopter-type flying object. It may also be a vertical take-off and landing aircraft known as a VTOL (Vertical Take-Off and Landing aircraft).
[3-3] Landing conditions  The landing instruction unit 109 may use landing conditions different from those of the first embodiment. In the first embodiment, for example, the landing instruction unit 109 determined that the landing condition was satisfied when a detected obstacle was not included in a predetermined range of the captured image; this predetermined range may instead be varied according to the type of flying object.
For example, for the rotary-wing or helicopter-type flying object described in the first embodiment, which descends almost vertically from directly above, the landing instruction unit 109 uses a circular range. For a VTOL flying object, which often glides down diagonally in order to descend stably, it uses an elliptical range. For an airplane-type flying object, which descends diagonally at an angle close to horizontal, it uses an even longer elliptical range.
If a circular range were used for an airplane-type flying object, for example, the aircraft would fly at a height at which it could strike obstacles even before reaching that range, creating a risk of contact with obstacles outside it. For an airplane-type flying object, using a longer elliptical range therefore lets the aircraft land by passing through airspace above a predetermined range in which there is no risk of contacting obstacles. By varying the length of the predetermined range according to the angle of the flight path along which the flying object descends, the risk of contacting an obstacle at landing can thus be reduced regardless of the type of flying object, compared with always using a circular range.
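The shape selection could be expressed as below; the semi-axis lengths and the descent-angle mapping for the airplane case are illustrative assumptions only.

```python
import math
from dataclasses import dataclass

@dataclass
class ClearanceRange:
    """Obstacle-free range around the landing point (semi-axes in meters)."""
    semi_major_m: float
    semi_minor_m: float

    def contains(self, dx_m: float, dy_m: float) -> bool:
        """Point-in-ellipse test for an obstacle offset from the landing point."""
        return ((dx_m / self.semi_major_m) ** 2
                + (dy_m / self.semi_minor_m) ** 2) <= 1.0

def range_for(vehicle_type: str, descent_angle_deg: float = 90.0) -> ClearanceRange:
    """Pick the range shape per vehicle type (assumed example dimensions)."""
    if vehicle_type == "rotary":            # near-vertical descent
        return ClearanceRange(5.0, 5.0)     # circle
    if vehicle_type == "vtol":              # diagonal glide
        return ClearanceRange(15.0, 5.0)    # ellipse
    # airplane: the shallower the descent, the longer the major axis
    glide = max(math.tan(math.radians(descent_angle_deg)), 0.05)
    return ClearanceRange(min(10.0 / glide, 200.0), 5.0)
```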
The landing instruction unit 109 may also vary the predetermined range according to the weather conditions at the landing point. In this case, it queries, for example, the system of a provider that offers per-region weather conditions on the Internet for the weather in the region that includes the landing point. The landing instruction unit 109 then uses, as the predetermined range, a range that is smaller the more suitable the weather obtained in the response is for the flight of the drone 10.
In other words, the landing instruction unit 109 enlarges the predetermined range the worse the weather is and the less suitable it is for the flight of the drone 10. In bad weather (for example, strong wind), there is a greater risk that the drone 10 will be pushed by the wind while descending and contact a nearby obstacle. By enlarging the predetermined range as the weather worsens, the risk of contacting an obstacle at landing can be reduced compared with keeping the range constant regardless of the weather.
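One way to realize this scaling, building on the ClearanceRange sketch above; the wind thresholds and factors are assumptions, and the weather query itself is left abstract rather than tied to any real provider's API.

```python
def scaled_range(base: ClearanceRange, wind_speed_mps: float) -> ClearanceRange:
    """Enlarge the clearance range as the weather worsens (assumed factors)."""
    if wind_speed_mps < 3.0:
        factor = 1.0   # calm: weather well suited to flight, smallest range
    elif wind_speed_mps < 8.0:
        factor = 1.5   # moderate wind
    else:
        factor = 2.0   # strong wind: widest margin against being blown off
    return ClearanceRange(base.semi_major_m * factor,
                          base.semi_minor_m * factor)
```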
[3-4] Divided regions  In the embodiments, four square divided regions as seen from vertically above were used, but this is not a limitation. For example, each divided region may be rectangular. Three regular-hexagonal divided regions may be used, or six equilateral-triangular ones. Although the divided regions had identical shapes and sizes, these need not be exactly the same either and may differ somewhat. In any case, it suffices that each arrangement of divided regions whose risk level is at or above the first criterion is associated with a movement direction toward the area where the safest landing can be expected for that arrangement.
[3-5] Instruction method  In the embodiments, the flight instruction unit 104, the shooting instruction unit 108, and the landing instruction unit 109 issue instructions to the drone 10, but this is not a limitation; when the drone 10 is operated with a radio-controller transmitter, for example, the instructions may be issued to the operator. In that case, each instruction unit may issue its instruction by, for example, transmitting to the transmitter data indicating character strings that specify the movement direction, the post-movement altitude, the shooting position, or the switch to landing control, and causing those character strings to be displayed on the transmitter's display.
[3-6] Risk level  The risk determination unit 103 determined the risk in three levels in the embodiments, but two levels, or four or more, may be used. The risk level also need not be expressed numerically like "Lv1"; it may be expressed as character strings such as "high", "medium", "low" or "A", "B", "C". In short, any information expressing how likely it is that the drone 10 cannot land safely will do. The first criterion used by the flight instruction unit 104 to decide the movement direction and the second criterion used to decide whether to descend may then be defined to match the chosen representation of risk.
[3-7] How the movement direction is decided  The flight instruction unit 104 may decide the movement direction by a method different from that of the embodiments. For example, it may decide the movement direction by weighing the risk level against the damage level. Given, say, a divided region B1 at risk Lv2 with damage Lv1 and a divided region B2 at risk Lv1 with damage Lv3, the flight instruction unit 104 may take the lower-risk region B2 as the movement direction, or it may instead take region B1, whose two levels sum to the smaller value, as sketched below.
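A minimal sketch of the summed-level policy; treating the two levels as directly addable integers is itself an assumption of this illustration.

```python
def pick_region(regions: list[dict]) -> dict:
    """Pick the divided region minimizing risk level + damage level.

    Each dict is assumed to hold integer 'risk' and 'damage' levels;
    ties are broken in favor of the lower risk level.
    """
    return min(regions, key=lambda r: (r["risk"] + r["damage"], r["risk"]))

# With B1 (risk Lv2, damage Lv1) and B2 (risk Lv1, damage Lv3), the
# summed-level policy picks B1 (total 3) over B2 (total 4).
print(pick_region([{"name": "B1", "risk": 2, "damage": 1},
                   {"name": "B2", "risk": 1, "damage": 3}]))
```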
[3-8] Distance measuring means  The distance measuring means used in landing control is not limited to a stereo camera. For example, means that measure the distance to an object using infrared light, ultrasound, millimeter waves, or the like may be used as the distance measuring means.
[3-9] Devices realizing each function  The devices realizing the functions shown in FIGS. 2 and 12 are not limited to those described above. For example, the functions realized by the server device 20 may instead be realized by computer resources provided as a cloud service. In either case, it suffices that the landing support system 1 as a whole realizes each function shown in FIG. 2 or FIG. 12.
[3-10] Categories of the invention  Besides information processing devices such as the drone 10 and the server device 20 described above, the present invention can also be understood as an information processing system, such as the landing support system 1, comprising those information processing devices. The present invention can further be understood as an information processing method realizing the processing performed by an information processing device, and as a program causing a computer that controls an information processing device to function. The program regarded as the present invention may be provided in the form of a recording medium, such as an optical disc, on which the program is stored, or in a form in which it is downloaded to a computer via a network such as the Internet and the downloaded program is installed and made available for use.
[3-11] Functional blocks  The block diagrams used in the description of the above embodiments show blocks in functional units. These functional blocks (components) are realized by any combination of at least hardware and/or software, and the method of realizing each functional block is not particularly limited.
That is, each functional block may be realized using one physically or logically coupled device, or using two or more physically or logically separate devices connected directly or indirectly (for example, by wire or wirelessly). A functional block may also be realized by combining software with the one device or the plurality of devices.
Functions include judging, deciding, determining, calculating, computing, processing, deriving, investigating, looking up (searching, inquiring), ascertaining, receiving, transmitting, outputting, accessing, resolving, selecting, choosing, establishing, comparing, assuming, expecting, considering, broadcasting, notifying, communicating, forwarding, configuring, reconfiguring, allocating (mapping), and assigning, but are not limited to these. For example, a functional block (component) that performs transmission is called a transmitting unit or a transmitter. In every case, as noted above, the method of realization is not particularly limited.
[3-12] Handling of input and output information  Input and output information and the like may be stored in a specific location (for example, a memory) or managed using a management table. Input and output information and the like can be overwritten, updated, or appended. Output information and the like may be deleted. Input information and the like may be transmitted to another device.
[3-13] Determination method  A determination may be made by a value expressed in one bit (0 or 1), by a Boolean value (true or false), or by numerical comparison (for example, comparison with a predetermined value).
[3-14] Processing procedures  The processing procedures, sequences, flowcharts, and the like of each aspect/embodiment described in this disclosure may be reordered as long as no contradiction arises. For example, the methods described in this disclosure present elements of the various steps in an exemplary order and are not limited to the specific order presented.
[3-15] Handling of input and output information  Input and output information and the like may be stored in a specific location (for example, a memory) or managed in a management table. Input and output information and the like can be overwritten, updated, or appended. Output information and the like may be deleted. Input information and the like may be transmitted to another device.
[3-16] Software  Software, whether referred to as software, firmware, middleware, microcode, hardware description language, or by another name, should be interpreted broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executable files, threads of execution, procedures, functions, and the like.
Software, instructions, information, and the like may also be transmitted and received via a transmission medium. For example, when software is transmitted from a website, server, or other remote source using wired technology (coaxial cable, optical fiber cable, twisted pair, digital subscriber line (DSL), etc.) and/or wireless technology (infrared, microwave, etc.), at least one of these wired and wireless technologies is included within the definition of a transmission medium.
[3-17] Information and signals  The information, signals, and the like described in this disclosure may be represented using any of a variety of different technologies. For example, the data, instructions, commands, information, signals, bits, symbols, chips, and the like that may be referred to throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, optical fields or photons, or any combination thereof.
[3-18] "Judging" and "determining"  The terms "judging" and "determining" as used in this disclosure may encompass a wide variety of operations. "Judging" and "determining" may include, for example, regarding the act of judging, calculating, computing, processing, deriving, investigating, looking up (searching, inquiring; for example, searching in a table, database, or another data structure), or ascertaining as having "judged" or "determined".
"Judging" and "determining" may also include regarding the act of receiving (for example, receiving information), transmitting (for example, transmitting information), input, output, or accessing (for example, accessing data in a memory) as having "judged" or "determined". Further, "judging" and "determining" may include regarding the act of resolving, selecting, choosing, establishing, comparing, and the like as having "judged" or "determined". In other words, "judging" and "determining" may include regarding some operation as having "judged" or "determined". "Judging (determining)" may also be read as "assuming", "expecting", "considering", and the like.
[3-19] Meaning of "based on"  The phrase "based on" as used in this disclosure does not mean "based only on" unless expressly stated otherwise. In other words, "based on" means both "based only on" and "based at least on".
[3-20] "Different"  In this disclosure, the phrase "A and B are different" may mean "A and B are different from each other". It may also mean "A and B are each different from C". Terms such as "separated" and "coupled" may be interpreted in the same way as "different".
[3-21] "And" and "or"  In this disclosure, for a configuration that can be implemented either as "A and B" or as "A or B", a configuration described with one of these expressions may be used as a configuration described with the other. For example, where "A and B" is written, it may be used as "A or B" as long as doing so remains implementable without inconsistency with the rest of the description.
[3-22] Variations of the aspects, etc.  The aspects/embodiments described in this disclosure may be used alone, in combination, or switched between as they are carried out. Notification of predetermined information (for example, notification that "X holds") is not limited to being made explicitly; it may be made implicitly (for example, by not notifying the predetermined information).
Although the present disclosure has been described in detail above, it is clear to those skilled in the art that the present disclosure is not limited to the embodiments described herein. The present disclosure can be carried out with modifications and variations without departing from the spirit and scope of the present disclosure as defined by the claims. The description of the present disclosure is therefore intended as illustrative and has no limiting meaning with respect to the present disclosure.
1…landing support system, 10…drone, 17…stereo camera, 20…server device, 101…sensor acquisition unit, 102…distance calculation unit, 103…risk determination unit, 104…flight instruction unit, 105…flight control unit, 106…damage scale determination unit, 107…obstacle detection unit, 108…shooting instruction unit, 109…landing instruction unit, 110…image capturing unit.

Claims (7)

  1.  An information processing device comprising:
      an acquisition unit that acquires an image captured by a stereo camera provided on a flying object;
      a calculation unit that calculates, for each pixel of the acquired image, the distance from the stereo camera to the point indicated by that pixel;
      a detection unit that detects an obstacle from the acquired image; and
      a landing instruction unit that, even when the obstacle is detected, instructs landing of the flying object based on the calculated distances if the position of the obstacle satisfies a landing condition.
  2.  The information processing device according to claim 1, wherein the detection unit determines, based on the degree of similarity between the shape of the detected obstacle and a shape registered in advance, whether it is an obstacle.
  3.  The information processing device according to claim 1 or 2, further comprising a shooting instruction unit that, when the obstacle is detected, instructs shooting of the obstacle from another direction,
      wherein the detection unit detects the obstacle again with the addition of the image shot in accordance with the shooting instruction.
  4.  The information processing device according to any one of claims 1 to 3, wherein the instruction unit determines that the landing condition is satisfied when the detected obstacle is not included in a predetermined range within the captured image.
  5.  The information processing device according to claim 4, wherein, even when the detected obstacle is included in the range, the instruction unit determines that the landing condition is satisfied if, by moving, the obstacle will no longer be included in the range before the flying object lands.
  6.  The information processing device according to claim 4 or 5, wherein the instruction unit varies the range according to the type of the flying object.
  7.  The information processing device according to any one of claims 4 to 6, wherein the instruction unit varies the range according to the weather conditions at the landing point.
PCT/JP2021/000781 2020-01-21 2021-01-13 Information processing device WO2021149546A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2021573081A JP7461384B2 (en) 2020-01-21 2021-01-13 Information processing device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-007667 2020-01-21
JP2020007667 2020-01-21

Publications (1)

Publication Number Publication Date
WO2021149546A1 true WO2021149546A1 (en) 2021-07-29

Family

ID=76992268

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/000781 WO2021149546A1 (en) 2020-01-21 2021-01-13 Information processing device

Country Status (2)

Country Link
JP (1) JP7461384B2 (en)
WO (1) WO2021149546A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6293369B1 (en) * 2016-06-13 2018-03-14 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Unmanned aerial vehicle, delivery system, unmanned aircraft control method, and program for controlling unmanned aerial vehicle
WO2019060414A1 (en) * 2017-09-21 2019-03-28 Amazon Technologies, Inc. Object detection and avoidance for aerial vehicles

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019196150A (en) * 2018-05-11 2019-11-14 株式会社自律制御システム研究所 System, method, and program for identifying safe landing area, and storage medium for storing the program

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6293369B1 (en) * 2016-06-13 2018-03-14 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Unmanned aerial vehicle, delivery system, unmanned aircraft control method, and program for controlling unmanned aerial vehicle
WO2019060414A1 (en) * 2017-09-21 2019-03-28 Amazon Technologies, Inc. Object detection and avoidance for aerial vehicles

Also Published As

Publication number Publication date
JP7461384B2 (en) 2024-04-03
JPWO2021149546A1 (en) 2021-07-29

Similar Documents

Publication Publication Date Title
US10126126B2 (en) Autonomous mission action alteration
CN110069071B (en) Unmanned aerial vehicle navigation method and device, storage medium and electronic equipment
EP3735380B1 (en) Adjusting flight parameters of an aerial robotic vehicle based on presence of propeller guard(s)
US11797028B2 (en) Unmanned aerial vehicle control method and device and obstacle notification method and device
US20210271269A1 (en) Unmanned aerial vehicle path planning method and apparatus and unmanned aerial vehicle
EP3735623B1 (en) Adjustable object avoidance proximity threshold based on presence of propeller guard(s)
TWI817962B (en) Method, robotic vehicle, and processing device of adjustable object avoidance proximity threshold based on predictability of the environment
US11798426B2 (en) Autonomous mission action alteration
US20200317339A1 (en) Wireless communication relay system using unmanned device and method therefor
JP6485889B2 (en) Flight control device, flight control method, and program
JPWO2017081898A1 (en) Flight control device, flight control method, and program
WO2021149546A1 (en) Information processing device
WO2021149607A1 (en) Information processing device
JP7082711B2 (en) Information processing equipment and information processing method
JPWO2020189493A1 (en) Information processing equipment and information processing method
CN113574487A (en) Unmanned aerial vehicle control method and device and unmanned aerial vehicle
WO2021140798A1 (en) Information processing system
JP2020067880A (en) Information processing apparatus
WO2021146969A1 (en) Distance measurement method, movable platform, device, and storage medium
US20220147901A1 (en) Information processing apparatus
KR20230046110A (en) Method, system and non-transitory computer-readable recording medium for determining emergency landing sites for drones
CN117724527A (en) Unmanned aerial vehicle autonomous landing method, medium and electronic equipment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21743943

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021573081

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21743943

Country of ref document: EP

Kind code of ref document: A1