WO2023282124A1 - Control device - Google Patents

Control device

Info

Publication number
WO2023282124A1
Authority
WO
WIPO (PCT)
Prior art keywords
drone
priority
landing
structures
control device
Prior art date
Application number
PCT/JP2022/025759
Other languages
French (fr)
Japanese (ja)
Inventor
昌志 安沢
真幸 森下
広樹 石塚
Original Assignee
株式会社Nttドコモ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社NTTドコモ (NTT DOCOMO, INC.)
Priority to JP2023533549A (JPWO2023282124A1)
Publication of WO2023282124A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
        • B64: AIRCRAFT; AVIATION; COSMONAUTICS
            • B64C: AEROPLANES; HELICOPTERS
                • B64C13/00: Control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers
                    • B64C13/02: Initiating means
                        • B64C13/16: Initiating means actuated automatically, e.g. responsive to gust detectors
                            • B64C13/18: Initiating means actuated automatically, using automatic pilot
                • B64C27/00: Rotorcraft; Rotors peculiar thereto
                    • B64C27/04: Helicopters
                • B64C39/00: Aircraft not otherwise provided for
                    • B64C39/02: Aircraft not otherwise provided for, characterised by special use
            • B64D: EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
                • B64D47/00: Equipment not otherwise provided for
                    • B64D47/08: Arrangements of cameras
                • B64D9/00: Equipment for handling freight; Equipment for facilitating passenger embarkation or the like
    • G: PHYSICS
        • G05: CONTROLLING; REGULATING
            • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
                • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
                    • G05D1/10: Simultaneous control of position or course in three dimensions
        • G08: SIGNALLING
            • G08G: TRAFFIC CONTROL SYSTEMS
                • G08G5/00: Traffic control systems for aircraft, e.g. air-traffic control [ATC]
                    • G08G5/02: Automatic approach or landing aids, i.e. systems in which flight data of incoming planes are processed to provide landing data

Definitions

  • The present invention relates to technology for landing a flying object.
  • Patent Literature 1 describes a mechanism in which a landing pad is provided in a landing zone at a drone's delivery destination, and the drone is guided to the landing pad by a visual, optical, or radio assistance device.
  • In that mechanism, a dedicated facility called a landing pad must be installed at the drone's delivery destination.
  • Drone destinations include, for example, dwelling units of various sizes and shapes, and providing landing pads uniformly for all of them is considered impractical.
  • The object of the present invention is therefore to land a flying object at an appropriate position with respect to the various structures designated as its destination.
  • The present invention provides a control device comprising: a detection unit that, after the flying object reaches the sky above a structure designated as its destination, detects structures existing on the site based on the result of the flying object sensing the site containing the structure; a determination unit that determines whether the flying object can land at a landing position determined for a detected structure; and a landing control unit that lands the flying object at the landing position when landing is determined to be possible.
  • According to the present invention, the flying object can be landed at an appropriate position with respect to the structure designated as its destination.
  • FIG. 5 is a bird's-eye view illustrating the structure of the drone's destination according to the embodiment; FIGS. 6 to 9 are diagrams illustrating priority tables according to the embodiment.
  • FIGS. 10 to 12 are diagrams illustrating movement routes of the drone 10 when detecting structures.
  • FIG. 13 is a flowchart illustrating the processing procedure of the drone 10.
  • FIG. 1 is a diagram showing an example configuration of a drone management system 1 according to an embodiment of the information processing system of the present invention.
  • The drone management system 1 includes a drone 10 that transports packages to a destination, a user terminal 30 used by a user living in the building or other structure that is the destination, a wireless communication network 40, and a server device 50 connected to the wireless communication network 40. Although FIG. 1 shows one of each, there may be a plurality of drones 10, user terminals 30, wireless communication networks 40, and server devices 50.
  • The drone 10 is an unmanned flying object that flies in the air.
  • The drone 10 transports cargo by holding it, flying to the destination, and landing there.
  • The user terminal 30 is a computer capable of communication, such as a smartphone, tablet, or personal computer.
  • In this embodiment, the user terminal 30 is a smartphone and functions as a communication terminal for the user who receives the parcel to access the server device 50 via the wireless communication network 40.
  • The server device 50 stores flight plan information such as the flight date and time, flight route, and flight altitude of the drone 10, and remotely steers the drone according to that information.
  • Remote control by the server device 50 mainly covers the section between the drone's departure/arrival point, called a base, and the airspace above the drone's destination.
  • The section between the destination airspace and the drone's landing position is flown under autonomous control by the drone itself.
  • Specifically, the drone 10 detects the structure corresponding to the destination and/or other structures existing on the site containing it (for example, doors, verandas, gates, parking lots, warehouses, and gardens), determines whether it can land on or near these detected structures, and then lands.
  • In this embodiment, as described above, the section between the drone's departure/arrival point and the destination airspace relies on remote control by the server device 50, while the section between the destination airspace and the landing position is flown autonomously by the drone itself; however, the invention is not limited to this example.
  • For example, the drone 10 may fly all sections between the departure/arrival point and the landing position at the destination autonomously, without relying on remote control by the server device 50, or it may fly all of those sections under remote control by the server device 50. A sketch of this division of control follows this item.
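A minimal sketch of the two flight phases described above, assuming a hypothetical `Drone` and `Server` interface (none of these method names come from the patent):

```python
# Hypothetical sketch of the control handoff: server-steered flight to the
# destination airspace, then autonomous detection and landing by the drone.
from enum import Enum, auto

class FlightPhase(Enum):
    REMOTE = auto()      # base -> airspace above the destination (server-steered)
    AUTONOMOUS = auto()  # destination airspace -> landing position (drone-steered)

def fly_mission(drone, server):
    phase = FlightPhase.REMOTE
    while not drone.above_destination():
        # The server device 50 steers the drone according to its flight plan.
        drone.apply(server.next_steering_command(drone.position()))
    phase = FlightPhase.AUTONOMOUS
    # From here the drone detects structures and lands under its own control.
    drone.autonomous_land()
    return phase
```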
  • The wireless communication network 40 may be, for example, equipment conforming to the 4th-generation or the 5th-generation mobile communication system.
  • FIG. 2 is a diagram showing an example of the hardware configuration of the drone 10.
  • The drone 10 is physically configured as a computer device including a processor 1001, a memory 1002, a storage 1003, a communication device 1004, an input device 1005, an output device 1006, a positioning device 1007, a sensor 1008, a flight drive mechanism 1009, and a bus connecting them. In the following description, the term "device" can be read as a circuit, unit, or the like.
  • The hardware configuration of the drone 10 may include one or more of each device shown in the figure, or may omit some devices.
  • Each function of the drone 10 is realized by loading predetermined software (a program) onto hardware such as the processor 1001 and the memory 1002, whereupon the processor 1001 performs calculations, controls communication by the communication device 1004, controls at least one of reading and writing of data in the memory 1002 and the storage 1003, and controls the positioning device 1007, the sensor 1008, and the flight drive mechanism 1009.
  • The processor 1001, for example, runs an operating system and controls the entire computer.
  • The processor 1001 may be configured as a central processing unit (CPU) including interfaces with peripheral devices, a control unit, an arithmetic unit, registers, and the like.
  • For example, a baseband signal processing unit, a call processing unit, and the like may also be implemented by the processor 1001.
  • The processor 1001 reads programs (program code), software modules, data, and the like from at least one of the storage 1003 and the communication device 1004 into the memory 1002, and executes various processes according to them.
  • The functional blocks of the drone 10 may be implemented by a control program stored in the memory 1002 and running on the processor 1001.
  • Various types of processing may be executed by one processor 1001, or by two or more processors 1001 simultaneously or sequentially.
  • The processor 1001 may be implemented by one or more chips. Note that the program may be transmitted to the drone 10 via the wireless communication network 40.
  • The memory 1002 is a computer-readable recording medium and may be composed of at least one of, for example, ROM, EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), and RAM.
  • The memory 1002 may also be called a register, cache, main memory (main storage device), or the like.
  • The memory 1002 can store executable programs (program code), software modules, and the like for performing the methods of the present invention.
  • The storage 1003 is a computer-readable recording medium and may be composed of at least one of, for example, an optical disk such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory (for example, a card, stick, or key drive), a floppy (registered trademark) disk, and a magnetic strip.
  • The storage 1003 may also be called an auxiliary storage device.
  • The storage 1003 stores various programs and data groups.
  • The processor 1001, the memory 1002, and the storage 1003 described above function as an example of the control device of the present invention.
  • The communication device 1004 is hardware (a transmitting/receiving device) for communication between computers via the wireless communication network 40, and is also called a network device, network controller, network card, communication module, or the like.
  • The communication device 1004 includes a high-frequency switch, a duplexer, a filter, a frequency synthesizer, and the like in order to implement frequency division duplexing and time division duplexing.
  • A transmitting/receiving antenna, an amplifier unit, a transmitting/receiving unit, a transmission line interface, and the like may be implemented by the communication device 1004.
  • The transmitting/receiving unit may be implemented with the transmitter and the receiver physically or logically separated.
  • The input device 1005 is an input device that receives input from the outside and includes, for example, keys, switches, and a microphone.
  • The output device 1006 is an output device that outputs to the outside and includes, for example, a display device such as a liquid crystal display and a speaker. The input device 1005 and the output device 1006 may be integrated.
  • The positioning device 1007 is hardware that measures the position of the drone 10, for example a GPS (Global Positioning System) device.
  • The drone 10 flies from the departure/arrival point to the sky above the destination based on positioning by the positioning device 1007.
  • The sensor 1008 includes a ranging sensor that functions as an altitude measurement means and as a means of checking the condition of the landing position, a gyro sensor and a direction sensor that function as attitude measurement means for the drone 10, an image sensor that functions as an imaging means, and the like.
  • The flight drive mechanism 1009 includes hardware such as the motors and propellers with which the drone 10 flies.
  • The devices such as the processor 1001 and the memory 1002 are connected by a bus for communicating information.
  • The bus may be configured as a single bus or as different buses between devices.
  • The drone 10 may also include hardware such as a microprocessor, a GPU (Graphics Processing Unit), a digital signal processor (DSP), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array), and part or all of each functional block may be realized by that hardware.
  • For example, the processor 1001 may be implemented using at least one of these pieces of hardware.
  • FIG. 3 is a diagram showing the hardware configuration of the server device 50.
  • The hardware configuration of the server device 50 may include one or more of each device shown in FIG. 3, or may omit some devices. The server device 50 may also be configured by communicatively connecting a plurality of devices with separate housings.
  • The server device 50 is physically configured as a computer device including a processor 5001, a memory 5002, a storage 5003, a communication device 5004, and a bus connecting them. Each function of the server device 50 is realized by loading predetermined software (a program) onto hardware such as the processor 5001 and the memory 5002, whereupon the processor 5001 performs calculations, controls communication by the communication device 5004, and controls at least one of reading and writing of data in the memory 5002 and the storage 5003. Each of these devices operates with power supplied from a power source (not shown). In the following description, the term "device" can be read as a circuit, unit, or the like.
  • The processor 5001, for example, runs an operating system and controls the entire computer.
  • The processor 5001 may be configured as a central processing unit (CPU) including interfaces with peripheral devices, a control unit, an arithmetic unit, registers, and the like.
  • For example, a baseband signal processing unit, a call processing unit, and the like may also be implemented by the processor 5001.
  • The processor 5001 reads programs (program code), software modules, data, and the like from at least one of the storage 5003 and the communication device 5004 into the memory 5002, and executes various processes according to them.
  • As the program, a program that causes a computer to execute at least part of the operations described below is used.
  • The functional blocks of the drone 10 may be implemented by a control program stored in the memory 5002 and running on the processor 5001.
  • Various types of processing may be executed by one processor 5001, or by two or more processors 5001 simultaneously or sequentially.
  • The processor 5001 may be implemented by one or more chips.
  • The memory 5002 is a computer-readable recording medium and may be composed of at least one of ROM, EPROM, EEPROM, and RAM, for example.
  • The memory 5002 may also be called a register, cache, main memory (main storage device), or the like.
  • The memory 5002 can store executable programs (program code), software modules, and the like for performing methods according to the present invention.
  • The storage 5003 is a computer-readable recording medium and may be composed of at least one of, for example, an optical disk such as a CD-ROM, a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory (for example, a card, stick, or key drive), a floppy (registered trademark) disk, and a magnetic strip.
  • The storage 5003 may also be called an auxiliary storage device.
  • The storage 5003 stores at least the programs and data groups for executing the various processes described later.
  • The communication device 5004 is hardware (a transmitting/receiving device) for communication between computers via the wireless communication network 40, and is also called a network device, network controller, network card, communication module, or the like.
  • The devices such as the processor 5001 and the memory 5002 are connected by a bus for communicating information.
  • The bus may be configured as a single bus or as different buses between devices.
  • The server device 50 may include hardware such as a microprocessor, a digital signal processor, an ASIC, a PLD, or an FPGA, and part or all of each functional block may be realized by that hardware.
  • For example, the processor 5001 may be implemented using at least one of these pieces of hardware.
  • FIG. 4 is a diagram showing an example of the functional configuration of the drone 10. In the drone 10, the functions of an acquisition unit 11, a detection unit 12, a determination unit 13, and a landing control unit 14 are realized.
  • The acquisition unit 11 acquires various data from, for example, the positioning device 1007, the sensor 1008, or the server device 50.
  • After the drone 10 reaches the sky above a structure such as a building designated as its destination, the detection unit 12 detects that structure and/or structures existing on its site based on an image of the site captured by the sensor 1008 (image sensor) of the drone 10.
  • The structures referred to here are elements that make up the building designated as the destination (for example, architectural elements such as doors, verandas, and gates) and buildings, facilities, and other structures other than that building (parking lots, warehouses, gardens, passages, roads, under the eaves, and the like); typically, they are things likely to be found at a general dwelling unit.
  • The detection unit 12 detects structures according to a predetermined priority. Specifically, the detection unit 12 uses a priority according to the positional relationship between the structure designated as the destination of the drone 10 and other structures and/or sites adjacent to it; a priority according to the shape of the structure designated as the destination; a priority according to the arrangement of the structure with respect to its site; and a priority indicated by the user at the destination. Note that "adjacent" does not necessarily mean that the structures are physically in contact; it suffices that they exist within a predetermined distance range from the designated structure. A sketch of how these priorities might be stored follows this item.
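One way to picture how these four kinds of priority might be stored: a minimal sketch in Python. The table keys and entries are invented for illustration; the patent describes the tables (FIGS. 6 to 9) only abstractly.

```python
# Hypothetical priority tables. A priority pattern is simply an ordered list
# of structure types to search for; keys and contents are illustrative only.
PriorityPattern = list[str]

positional_table: dict[str, PriorityPattern] = {          # cf. FIG. 6
    "positional01": ["gate", "parking_lot", "under_eaves"],
}
shape_table: dict[str, PriorityPattern] = {               # cf. FIG. 7
    "shape11": ["passage", "garden"],
    "shape12": ["parking_lot", "under_eaves"],
}
arrangement_table: dict[str, PriorityPattern] = {         # cf. FIG. 8
    "arrangement21": ["garden", "door"],
}
user_table: dict[str, PriorityPattern] = {                # cf. FIG. 9 (per destination)
    "dest-0001": ["parking_lot", "garden"],
}
```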
  • FIG. 5 is a bird's-eye view exemplifying the structure that is the destination of the drone 10 and the layout of its site; that is, it illustrates the image obtained when the drone 10 images the area below with its image sensor from the sky above the destination (for example, 20 m up).
  • Another building B1 and its site G1 are adjacent to the building B and its site G.
  • The site G includes, for example, a gate g, trees W, and a roof P of a parking lot.
  • FIG. 5 exemplifies a passage L1, a parking lot L2, and a garden L3 as the structures described above.
  • This priority is, for example, a priority according to the positional relationship between the building B and the roads R1 and R2, the other building B1, or the other site G1 adjacent to it; a priority according to the shape of the building B; a priority according to the layout of the building B with respect to the site G; or a priority according to an instruction by the user.
  • The detection unit 12 stores a priority table as illustrated in FIG. 6.
  • In this table, positional relationship patterns 01, 02, 03, ... are associated with priority patterns 01, 02, 03, ... used when the destination matches that positional relationship.
  • The positional relationship patterns 01, 02, 03, ... contain information about the placement of structures, such as where a structure and/or site adjoins a road, and where a structure and/or site adjoins another structure and/or site at a distance below a threshold.
  • Each priority pattern specifies an algorithm for detecting structures, such as searching in the order of gate, parking lot, under the eaves, and so on, or excluding from the search range a predetermined area centered on a location where the site adjoins another structure and/or site at a distance below the threshold; a sketch of such a search follows this item.
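A hedged sketch of the search loop this implies: try structure types in priority order while masking a margin around below-threshold adjacency points. The detector and mask helpers are hypothetical stubs, not APIs from the patent.

```python
def build_exclusion_mask(image, adjacency_points, margin_m):
    """Hypothetical: mask out a margin around below-threshold adjacency points."""
    raise NotImplementedError

def detect(image, kind, exclude):
    """Hypothetical detector: return a detection of `kind` or None."""
    raise NotImplementedError

def search_structures(image, priority, adjacency_points, margin_m=3.0):
    # Exclude a predetermined range around points where the site adjoins a
    # neighbouring structure/site at less than the threshold distance.
    mask = build_exclusion_mask(image, adjacency_points, margin_m)
    for kind in priority:  # e.g. ["gate", "parking_lot", "under_eaves"]
        hit = detect(image, kind, exclude=mask)
        if hit is not None:
            return kind, hit  # first hit at the highest remaining priority
    return None
```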
  • The detection unit 12 also stores a priority table as illustrated in FIG. 7.
  • In this table, shape patterns 01, 02, 03, ... of structures and/or sites are associated with priority patterns 11, 12, 13, ...
  • The priority patterns 11, 12, 13, ... each specify an algorithm for detecting structures in an order suited to the shape. For example, with shape pattern 11 the drone first tries to detect a "passage" from one part of the shape and then a "garden" from another part, whereas with shape pattern 12 it first tries to detect a "parking lot" from one part of the shape and then "under the eaves" from another part, and so on.
  • The detection unit 12 also stores a priority table as illustrated in FIG. 8, in which arrangement patterns of the structure with respect to its site are associated with priority patterns 21, 22, 23, ...
  • The detection unit 12 also stores a priority table as illustrated in FIG. 9.
  • In this table, each destination is associated with a priority pattern, which is an algorithm for detecting structures around that destination.
  • This priority is specified according to the wishes of the user at the destination of the drone 10 (that is, the user who receives the package): for example, first try to detect a "parking lot", then a "garden", and so on. The instruction is solicited by a notification sent to the user terminal 30 before the cargo is transported, and the user responds to that notification. Since some users do not indicate a priority, in such cases no priority information is registered in the table for the destination where that user receives the parcel.
  • The priority patterns illustrated in FIGS. 6 to 9 may be used together, as sketched below. For example, the detection unit 12 may search for structures according to the priority pattern of whichever positional relationship pattern, shape pattern, or arrangement pattern illustrated in FIGS. 6 to 8 best fits the destination. For destinations whose priorities are registered in the table of FIG. 9, the detection unit 12 searches for structures according to the priority pattern registered there; for destinations without a registered priority, it searches according to a priority pattern determined using any one of FIGS. 6 to 8 or a combination thereof.
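One plausible reading of this combination rule, reusing the hypothetical tables from the earlier sketch (`match_score` and `all_geometry_patterns` are likewise invented):

```python
def all_geometry_patterns():
    """Hypothetical: yield (key, pattern) pairs from the FIG. 6-8 tables above."""
    yield from positional_table.items()
    yield from shape_table.items()
    yield from arrangement_table.items()

def match_score(site_features, key):
    """Hypothetical: how well the destination fits pattern `key` (higher is better)."""
    raise NotImplementedError

def resolve_pattern(dest_id, site_features):
    if dest_id in user_table:                      # a FIG. 9 registration wins
        return user_table[dest_id]
    candidates = [(match_score(site_features, k), p)
                  for k, p in all_geometry_patterns()]
    return max(candidates, key=lambda c: c[0])[1]  # best-fitting pattern
```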
  • The drone 10 searches for a structure after lowering its altitude somewhat from the sky above the destination to approach the structure. For example, when the detection unit 12 attempts to detect a door, the drone 10 descends to a position from which the door can be recognized (a position roughly level with the door) before starting the search. Time efficiency is therefore poor if the drone 10 descends from the sky anew for each structure it searches for. For this reason, when one of the positional relationship patterns, shape patterns, or arrangement patterns illustrated in FIGS. 6 to 8 is used, it is desirable to use the priority pattern that allows structures to be searched fastest. For example, suppose priority patterns A, B, and C each apply to the positional relationship pattern illustrated in FIG. 6.
  • FIG. 10 illustrates the movement route of the drone 10 when detection is performed according to priority pattern A,
  • FIG. 11 illustrates the movement route when detection is performed according to priority pattern B, and
  • FIG. 12 illustrates the movement route when detection is performed according to priority pattern C.
  • Each of FIGS. 10 to 12 is a bird's-eye view of the destination of the drone 10, showing the structure and its site; the number attached to each arrow indicates the order in which the drone 10 moves. Since the routes differ in how much the drone must ascend and descend, the pattern whose route can be flown most quickly is preferred.
  • In other words, the detection unit 12 uses the order of priority that enables structures to be detected while the flight altitude of the drone 10 stays within a predetermined range.
  • The determination unit 13 determines whether the drone 10 can land at the landing position determined for the structure detected by the detection unit 12. Specifically, the determination unit 13 decides whether the drone 10 can land based on the levelness and flatness of the landing position, and on whether liquid is present there. The levelness and flatness of the landing position are determined from the outputs of the ranging sensor and the image sensor of the sensor 1008, and the presence of liquid (typically water) is determined from the output of the image sensor. A sketch of this test follows this item.
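A minimal sketch of the feasibility test, assuming hypothetical helpers for plane fitting and liquid recognition; the thresholds are illustrative, as the patent gives no numeric values.

```python
def estimate_tilt(ranging_scan):
    """Hypothetical: tilt (degrees) of a plane fitted to the ranging scan."""
    raise NotImplementedError

def estimate_roughness(ranging_scan):
    """Hypothetical: deviation (m) of the scan from the fitted plane."""
    raise NotImplementedError

def looks_like_liquid(image):
    """Hypothetical image-based puddle/water detector."""
    raise NotImplementedError

def can_land(ranging_scan, image, max_tilt_deg=5.0, max_roughness_m=0.05):
    # Levelness and flatness come from the ranging sensor, liquid detection
    # from the image sensor, as the text describes.
    return (estimate_tilt(ranging_scan) <= max_tilt_deg
            and estimate_roughness(ranging_scan) <= max_roughness_m
            and not looks_like_liquid(image))
```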
  • The landing position determined for each structure is, for example: for a garden, the center of the garden; for a parking lot, near the edge of the parking lot.
  • For a door, the landing position is in front of the door; for a gate, it is a position beyond the gate, closer to the structure. These rules can be written as a small lookup table, as sketched below.
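The per-structure rules as a lookup table; the rule functions are hypothetical stand-ins for the qualitative positions the text names.

```python
# Hypothetical mapping from detected structure type to its landing position.
LANDING_RULES = {
    "garden": lambda s: s.center(),           # center of the garden
    "parking_lot": lambda s: s.edge_point(),  # near the edge of the lot
    "door": lambda s: s.front_point(),        # in front of the door
    "gate": lambda s: s.inner_point(),        # beyond the gate, toward the building
}
```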
  • The landing control unit 14 controls the flight drive mechanism 1009 to land the drone 10 at the landing position when the determination unit 13 determines that landing there is possible.
  • The drone 10 starts flying from the departure/arrival point toward the destination (step S01). It then flies under the control of the server device 50 to the destination address specified at the time of the package delivery request.
  • When the drone 10 reaches the sky above the destination, the detection unit 12 detects the structure and/or structures existing on the site based on the image captured by the sensor 1008 (image sensor). In doing so, the detection unit 12 detects structures according to the priority order described above (step S03).
  • Next, the determination unit 13 determines whether the drone 10 can land at the landing position determined for the structure detected by the detection unit 12. Specifically, the determination unit 13 identifies factors related to landing suitability, such as the levelness and flatness of the landing position and the presence of liquid, from the outputs of the ranging sensor and the image sensor of the sensor 1008, and determines whether the drone 10 can land (step S04).
  • The landing control unit 14 controls the flight drive mechanism 1009 to land the drone 10 at the landing position when the determination unit 13 determines that landing there is possible (step S05). If the determination unit 13 does not determine that the drone can land at the landing position, the detection unit 12 tries to detect the structure with the next priority, and steps S03 and S04 above are repeated. When it is determined that no structure is landable, the drone 10 performs a predetermined NG process and returns to the departure/arrival point. The overall flow is sketched below.
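Putting steps S01 to S05 together, a hedged sketch of the overall loop, reusing the hypothetical `LANDING_RULES` table from the earlier sketch; all methods on `drone` are invented for illustration.

```python
def deliver(drone, priority):
    drone.fly_to_destination_airspace()        # S01-S02: server-steered flight
    for kind in priority:                      # S03: priority-ordered detection
        structure = drone.detect(kind)
        if structure is None:
            continue
        rule = LANDING_RULES.get(kind)         # from the earlier sketch
        if rule is None:
            continue
        position = rule(structure)
        if drone.can_land_at(position):        # S04: landing-feasibility judgment
            drone.land(position)               # S05: landing
            return True
    drone.return_to_base()                     # NG process: nothing was landable
    return False
```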
  • [Modifications] The invention is not limited to the embodiment described above.
  • The embodiment described above may be modified as follows, and two or more of the following modifications may be combined.
  • [Modification 1] When a plurality of structures are detected at one destination at the same time, the single structure whose landing feasibility is to be judged may be chosen based on distance from the drone. For example, when two doors are detected in one detection pass, the determination unit 13 judges whether the drone 10 can land at the door closer to it at that time. In other words, when multiple structures are detected, the determination unit 13 may select the structure to be judged based on its distance from the drone 10.
  • The detection unit 12 detects structures based on an image, captured by the sensor 1008 (image sensor) of the drone 10, of the site containing the structure. In this detection, a likelihood (recognition accuracy) that a detected object corresponds to a given structure is calculated. Therefore, when a plurality of structures are detected at one destination at the same time, the determination unit 13 may select the structure to be judged based on the likelihood of each structure; for example, it compares the likelihoods and selects the structure with the highest one. In this way, the structure to be judged as the landing position can be determined based on likelihood. Both selection rules are sketched below.
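These two modifications amount to two tie-breaking rules when one detection pass returns several candidates; for instance (the candidate fields are assumptions):

```python
# Pick the structure to judge either by distance to the drone or by
# recognition likelihood, per the two modifications above.
def pick_candidate(candidates, by="likelihood"):
    if by == "distance":
        return min(candidates, key=lambda c: c.distance_m)  # nearest candidate
    return max(candidates, key=lambda c: c.likelihood)      # most confident one
```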
  • Drone landing control may be realized by so-called edge computing (control by the drone), by cloud computing (control by the server device), or by a combination of the two (control by the drone and the server device), as described in the embodiment. Accordingly, the control device of the present invention may be provided in the server device 50.
  • The unmanned flying object is not limited to what is called a drone, and may have any structure or form as long as it is an unmanned flying object capable of transporting cargo.
  • The present invention can also be applied to a manned flying object that carries a person but operates automatically.
  • In the embodiment, an image sensor is used as the imaging means included in the sensor 1008 of the drone 10 for detecting structures.
  • However, the method of detecting structures is not limited to the example of the embodiment; any sensing result from which the position, shape, size, or the like of a structure can be recognized may be used. That is, it suffices that the sensor 1008 includes a sensor capable of sensing the site containing the structure, and that the detection unit 12 of the server device 50 detects the structure and/or the structures existing on the site based on the result of sensing that site.
  • In the embodiment, the determination unit 13 judges whether the flying object can land based on the levelness and flatness of the landing position and on the presence of liquid.
  • However, the judgment may be made based on other conditions as well: specifically, for example, the material and temperature of the landing position, or the presence of snow cover, can be considered.
  • The material of the landing position and the presence of snow are determined based on, for example, the output of the image sensor of the sensor 1008.
  • The temperature of the landing position is determined based on the output of a non-contact temperature sensor included in the sensor 1008.
  • That is, the determination unit 13 can judge whether the flying object can land based on at least one of a plurality of types of landing-position conditions, such as the levelness, flatness, material, or temperature of the landing position, or the presence of liquid or snow there. A sketch of such a configurable condition set follows.
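The extended condition set could be expressed as a configurable checklist; the field names are illustrative, as the patent names the conditions but no data structure.

```python
from dataclasses import dataclass

# Hypothetical configuration of which landing conditions to evaluate.
# Any subset may be combined, as the text says.
@dataclass
class LandingConditions:
    levelness: bool = True     # ranging sensor
    flatness: bool = True      # ranging sensor
    liquid: bool = True        # image sensor
    material: bool = False     # image sensor
    temperature: bool = False  # non-contact temperature sensor
    snow: bool = False         # image sensor
```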
  • Each functional block may be implemented by one device that is physically and/or logically coupled, or by two or more physically and/or logically separate devices connected directly and/or indirectly (for example, by wire and/or wirelessly).
  • For example, one computer may have the functions of the user terminals 30 to 32 exemplified in the embodiment.
  • Each function exemplified in FIG. 5 may be provided in any of the devices constituting the drone management system 1 as an information processing system.
  • For example, since the server device 50 can directly control the drone 10, the server device 50 may have a function corresponding to the processing unit 313 and directly restrict the flight of the drone 10.
  • The aspects and embodiments described herein may be applied to systems using LTE (Long Term Evolution), LTE-A (LTE-Advanced), SUPER 3G, IMT-Advanced, 4G, 5G, FRA (Future Radio Access), W-CDMA (registered trademark), GSM (registered trademark), CDMA2000 (Code Division Multiple Access 2000), UMB (Ultra Mobile Broadband), IEEE 802.11 (Wi-Fi), IEEE 802.16 (WiMAX), IEEE 802.20, UWB (Ultra-WideBand), and the like.
  • The information and parameters described in this specification may be represented by absolute values, by values relative to a predetermined value, or by other corresponding information.
  • The terms "determining" and "deciding" used herein may encompass a wide variety of actions.
  • "Determining" and "deciding" may include, for example, regarding judging, calculating, computing, processing, deriving, investigating, or looking up (for example, searching in a table, a database, or another data structure) as having "determined" or "decided" something, as well as ascertaining.
  • "Determining" and "deciding" may also include regarding receiving (for example, receiving information), transmitting (for example, transmitting information), input, output, or accessing (for example, accessing data in memory) as having "determined" or "decided" something.
  • The present invention may also be provided as an information processing method or as a program.
  • Such a program may be provided in a form recorded on a recording medium such as an optical disc, or in a form in which it is downloaded to a computer via a network such as the Internet, installed, and made available for use.
  • Software, instructions, and the like may be transmitted and received via a transmission medium.
  • For example, when software is transmitted from a website, a server, or another remote source using wired and/or wireless technologies, those wired and/or wireless technologies are included within the definition of a transmission medium.
  • Data, instructions, commands, information, signals, bits, symbols, chips, and the like may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, light fields or photons, or any combination of these.
  • References to elements using designations such as "first" and "second" used herein do not generally limit the quantity or order of those elements. These designations may be used herein as a convenient way of distinguishing between two or more elements. Thus, a reference to first and second elements does not mean that only two elements may be employed, or that the first element must precede the second in some way.

Abstract

A drone 10 comprises: a detection unit 12 that, after the flying object reaches the sky above a structure specified as its destination, detects the structure and/or a structure on the site, based on an image of the site containing the structure captured by the flying object; a determination unit 13 that determines whether the flying object can land at a landing position determined for the detected structure; and a landing control unit 14 that lands the flying object at the landing position when landing is determined to be possible. The detection unit 12 detects the structure in accordance with an order of priority corresponding to the positional relationship between the structure specified as the destination of the drone 10 and another structure and/or another site adjacent to the structure.

Description

Control device

The present invention relates to technology for landing a flying object.

With the spread of unmanned flying objects called drones, various mechanisms for using drones to deliver packages have been proposed. For example, Patent Literature 1 describes a mechanism in which a landing pad is provided in a landing zone at a drone's delivery destination, and the drone is guided to the landing pad by a visual, optical, or radio assistance device.

Japanese Patent No. 6622291

In the mechanism of Patent Literature 1, a dedicated facility called a landing pad must be installed at the drone's delivery destination. Drone destinations include, for example, dwelling units of various sizes and shapes, and providing landing pads uniformly for all of them is considered impractical.

The object of the present invention is therefore to land a flying object at an appropriate position with respect to the various structures designated as its destination.

The present invention provides a control device comprising: a detection unit that, after the flying object reaches the sky above a structure designated as its destination, detects structures existing on the site based on the result of the flying object sensing the site containing the structure; a determination unit that determines whether the flying object can land at a landing position determined for a detected structure; and a landing control unit that lands the flying object at the landing position when landing is determined to be possible.

According to the present invention, the flying object can be landed at an appropriate position with respect to the structure designated as its destination.

FIG. 1 is a block diagram showing an example configuration of a drone management system 1 according to an embodiment of the present invention. FIG. 2 is a block diagram showing an example hardware configuration of the drone 10 according to the embodiment. FIG. 3 is a block diagram showing an example hardware configuration of the server device 50 according to the embodiment. FIG. 4 is a block diagram showing an example functional configuration of the drone 10. FIG. 5 is a bird's-eye view illustrating the structure of the drone's destination according to the embodiment. FIGS. 6 to 9 are diagrams illustrating priority tables according to the embodiment. FIGS. 10 to 12 are diagrams illustrating movement routes of the drone 10 when detecting structures. FIG. 13 is a flowchart illustrating the processing procedure of the drone 10.
[Configuration]
FIG. 1 is a diagram showing an example configuration of a drone management system 1 according to an embodiment of the information processing system of the present invention. The drone management system 1 includes a drone 10 that transports packages to a destination, a user terminal 30 used by a user living in the building or other structure that is the destination, a wireless communication network 40, and a server device 50 connected to the wireless communication network 40. Although FIG. 1 shows one drone 10, one user terminal 30, one wireless communication network 40, and one server device 50, there may be a plurality of each.
The drone 10 is an unmanned flying object that flies in the air. It transports cargo by holding it, flying to the destination, and landing there.

The user terminal 30 is a computer capable of communication, such as a smartphone, tablet, or personal computer. In this embodiment, the user terminal 30 is a smartphone and functions as a communication terminal for the user who receives the parcel to access the server device 50 via the wireless communication network 40.

The server device 50 stores flight plan information such as the flight date and time, flight route, and flight altitude of the drone 10, and remotely steers the drone according to that information. Remote control by the server device 50 mainly covers the section between the drone's departure/arrival point, called a base, and the airspace above the destination; the section between the destination airspace and the landing position is flown under the drone's own autonomous control. Specifically, the drone 10 detects the structure corresponding to the destination and/or other structures existing on the site containing it (for example, doors, verandas, gates, parking lots, warehouses, and gardens), determines whether it can land on or near these detected structures, and then lands.

In this embodiment, as described above, the section between the departure/arrival point and the destination airspace relies on remote control by the server device 50, while the section between the destination airspace and the landing position is flown autonomously by the drone itself; however, the invention is not limited to this example. For example, the drone 10 may fly all sections between the departure/arrival point and the landing position at the destination autonomously, without relying on remote control by the server device 50, or it may fly all of those sections under remote control by the server device 50.

The wireless communication network 40 may be, for example, equipment conforming to the 4th-generation or the 5th-generation mobile communication system.
FIG. 2 is a diagram showing an example of the hardware configuration of the drone 10. The drone 10 is physically configured as a computer device including a processor 1001, a memory 1002, a storage 1003, a communication device 1004, an input device 1005, an output device 1006, a positioning device 1007, a sensor 1008, a flight drive mechanism 1009, and a bus connecting them. In the following description, the term "device" can be read as a circuit, unit, or the like. The hardware configuration of the drone 10 may include one or more of each device shown in the figure, or may omit some devices.

Each function of the drone 10 is realized by loading predetermined software (a program) onto hardware such as the processor 1001 and the memory 1002, whereupon the processor 1001 performs calculations, controls communication by the communication device 1004, controls at least one of reading and writing of data in the memory 1002 and the storage 1003, and controls the positioning device 1007, the sensor 1008, and the flight drive mechanism 1009.

The processor 1001, for example, runs an operating system and controls the entire computer. The processor 1001 may be configured as a central processing unit (CPU) including interfaces with peripheral devices, a control unit, an arithmetic unit, registers, and the like. For example, a baseband signal processing unit, a call processing unit, and the like may also be implemented by the processor 1001.

The processor 1001 reads programs (program code), software modules, data, and the like from at least one of the storage 1003 and the communication device 1004 into the memory 1002, and executes various processes according to them. As the program, a program that causes a computer to execute at least part of the operations described below is used. The functional blocks of the drone 10 may be implemented by a control program stored in the memory 1002 and running on the processor 1001. Various processes may be executed by one processor 1001, or by two or more processors 1001 simultaneously or sequentially. The processor 1001 may be implemented by one or more chips. The program may be transmitted to the drone 10 via the wireless communication network 40.

The memory 1002 is a computer-readable recording medium and may be composed of at least one of, for example, ROM, EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), and RAM. The memory 1002 may also be called a register, cache, main memory (main storage device), or the like. The memory 1002 can store executable programs (program code), software modules, and the like for implementing the method according to this embodiment.

The storage 1003 is a computer-readable recording medium and may be composed of at least one of, for example, an optical disk such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory (for example, a card, stick, or key drive), a floppy (registered trademark) disk, and a magnetic strip. The storage 1003 may also be called an auxiliary storage device. The storage 1003 stores various programs and data groups.

The processor 1001, the memory 1002, and the storage 1003 described above function as an example of the control device of the present invention.
The communication device 1004 is hardware (a transmitting/receiving device) for communication between computers via the wireless communication network 40, and is also called a network device, network controller, network card, communication module, or the like. The communication device 1004 includes a high-frequency switch, a duplexer, a filter, a frequency synthesizer, and the like in order to implement frequency division duplexing and time division duplexing. A transmitting/receiving antenna, an amplifier unit, a transmitting/receiving unit, a transmission line interface, and the like may be implemented by the communication device 1004. The transmitting/receiving unit may be implemented with the transmitter and the receiver physically or logically separated.

The input device 1005 is an input device that receives input from the outside and includes, for example, keys, switches, and a microphone. The output device 1006 is an output device that outputs to the outside and includes, for example, a display device such as a liquid crystal display and a speaker. The input device 1005 and the output device 1006 may be integrated.

The positioning device 1007 is hardware that measures the position of the drone 10, for example a GPS (Global Positioning System) device. The drone 10 flies from the departure/arrival point to the sky above the destination based on positioning by the positioning device 1007.

The sensor 1008 includes a ranging sensor that functions as an altitude measurement means and as a means of checking the condition of the landing position, a gyro sensor and a direction sensor that function as attitude measurement means for the drone 10, an image sensor that functions as an imaging means, and the like.

The flight drive mechanism 1009 includes hardware such as the motors and propellers with which the drone 10 flies.

The devices such as the processor 1001 and the memory 1002 are connected by a bus for communicating information. The bus may be configured as a single bus or as different buses between devices. The drone 10 may also include hardware such as a microprocessor, a GPU (Graphics Processing Unit), a digital signal processor (DSP), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array), and part or all of each functional block may be realized by that hardware. For example, the processor 1001 may be implemented using at least one of these pieces of hardware.
FIG. 3 is a diagram showing the hardware configuration of the server device 50. The hardware configuration of the server device 50 may include one or more of each device shown in FIG. 3, or may omit some devices. The server device 50 may also be configured by communicatively connecting a plurality of devices with separate housings.

The server device 50 is physically configured as a computer device including a processor 5001, a memory 5002, a storage 5003, a communication device 5004, and a bus connecting them. Each function of the server device 50 is realized by loading predetermined software (a program) onto hardware such as the processor 5001 and the memory 5002, whereupon the processor 5001 performs calculations, controls communication by the communication device 5004, and controls at least one of reading and writing of data in the memory 5002 and the storage 5003. Each of these devices operates with power supplied from a power source (not shown). In the following description, the term "device" can be read as a circuit, unit, or the like.

The processor 5001, for example, runs an operating system and controls the entire computer. The processor 5001 may be configured as a central processing unit (CPU) including interfaces with peripheral devices, a control unit, an arithmetic unit, registers, and the like. For example, a baseband signal processing unit, a call processing unit, and the like may also be implemented by the processor 5001.

The processor 5001 reads programs (program code), software modules, data, and the like from at least one of the storage 5003 and the communication device 5004 into the memory 5002, and executes various processes according to them. As the program, a program that causes a computer to execute at least part of the operations described below is used. The functional blocks of the drone 10 may be implemented by a control program stored in the memory 5002 and running on the processor 5001. Various processes may be executed by one processor 5001, or by two or more processors 5001 simultaneously or sequentially. The processor 5001 may be implemented by one or more chips.
The memory 5002 is a computer-readable recording medium, and may be composed of at least one of a ROM, an EPROM, an EEPROM, a RAM, and the like, for example. The memory 5002 may also be called a register, a cache, a main memory (main storage device), or the like. The memory 5002 can store executable programs (program code), software modules, and the like for carrying out the method according to the present embodiment.
The storage 5003 is a computer-readable recording medium, and may be composed of at least one of, for example, an optical disk such as a CD-ROM, a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory (for example, a card, a stick, or a key drive), a floppy (registered trademark) disk, and a magnetic strip. The storage 5003 may be called an auxiliary storage device. The storage 5003 stores at least the programs and data groups for executing the various kinds of processing described below.
The communication device 5004 is hardware (a transmitting/receiving device) for communication between computers via the wireless communication network 40, and is also called, for example, a network device, a network controller, a network card, or a communication module.
Each device such as the processor 5001 and the memory 5002 is connected by a bus for communicating information. The bus may be configured as a single bus or as different buses between each pair of devices.
The server device 50 may include hardware such as a microprocessor, a digital signal processor, an ASIC, a PLD, or an FPGA, and some or all of the functional blocks may be realized by such hardware. For example, the processor 5001 may be implemented using at least one of these pieces of hardware.
FIG. 4 shows an example of the functional configuration of the drone 10. In the drone 10, the functions of an acquisition unit 11, a detection unit 12, a determination unit 13, and a landing control unit 14 are realized.
The acquisition unit 11 acquires various kinds of data from, for example, the positioning device 1007, the sensor 1008, or the server device 50.
After the drone 10 reaches the airspace above a structure, such as a building, designated as its destination, the detection unit 12 detects that structure and/or structures existing on its site, based on an image of the site including the structure captured by the sensor 1008 (image sensor) of the drone 10. The structures referred to here are the elements constituting the building designated as the destination (for example, architectural elements such as doors, verandas, and gates) and buildings, facilities, and other structures other than that building (parking lots, warehouses, gardens, passages, roads, spaces under the eaves, and the like); typically, they are features likely to be found at an ordinary dwelling.
At this time, the detection unit 12 detects structures according to a predetermined order of priority. Specifically, the detection unit 12 uses a priority determined by the positional relationship between the structure designated as the destination of the drone 10 and other structures and/or sites adjacent to that structure, a priority determined by the shape of the designated structure, a priority determined by the placement of the designated structure on its site, or a priority specified by the user at the destination of the drone 10. Note that "adjacent" does not necessarily require that structures be in physical contact; it suffices that a structure can be recognized as existing within a predetermined distance around the designated structure.
FIG. 5 is a bird's-eye view illustrating a structure that is the destination of the drone 10 and the layout of its site; that is, it illustrates the image obtained when the drone 10 captures the view below with its image sensor from above the destination (for example, from 20 m up). A building B stands inside the site G corresponding to the destination. Roads R1 and R2 adjoin the building B and the site G, as do another building B1 and its site G1. Within the site G there are, for example, a gate g, trees W, and a parking-lot roof P. FIG. 5 shows a passage L1, a parking lot L2, and a garden L3 as examples of the structures described above.
The detection unit 12 of the drone 10 searches for these structures according to the priority described above. As illustrated in FIG. 5, this is a priority determined by the positional relationship between the building B and the adjacent roads R1 and R2 or the other building B1 and other site G1, a priority determined by the shape of the building B, a priority determined by the placement of the building B on the site G, or a priority specified by the user.
For this purpose, the detection unit 12 stores a priority table as illustrated in FIG. 6. In this priority table, positional-relationship patterns 01, 02, 03, ..., each representing one of several positional relationships between the structure designated as the destination and the other structures and/or sites adjacent to it, are associated with priority patterns 01, 02, 03, ... to be used when the destination matches that positional relationship. The positional-relationship patterns 01, 02, 03, ... contain information about the arrangement of structures, such as where a structure and/or site adjoins a road, where it adjoins the wider or the narrower of several adjacent roads, or where it adjoins another structure and/or site at a distance below a threshold. The priority patterns 01, 02, 03, ..., in turn, specify algorithms for detecting structures, such as searching in the order gate, door, parking lot, ... within a predetermined range centered on the point where the structure and/or site adjoins a road; searching in the order gate, parking lot, under the eaves, ... within a predetermined range centered on the area adjoining the wider of several adjacent roads; or excluding from the search range a predetermined range centered on the point where the site adjoins another structure and/or site at a distance below a threshold.
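To make the table concrete, the following Python sketch models the FIG. 6 lookup in miniature. It is an illustration only: the pattern keys, structure names, search centers, and exclusion zones are hypothetical stand-ins, since the publication does not specify the table at this level of detail.

```python
# Illustrative sketch of the priority table of FIG. 6 (hypothetical keys/values).
# Each positional-relationship pattern maps to an ordered list of structure
# types to search for, plus optional search-range rules.

PRIORITY_TABLE = {
    # Pattern 01: site adjoins a road on one side only.
    "positional_01": {
        "search_order": ["gate", "door", "parking_lot"],
        "search_center": "road_adjacency",   # search around where site meets road
        "excluded_zones": [],
    },
    # Pattern 02: site adjoins two roads of different widths.
    "positional_02": {
        "search_order": ["gate", "parking_lot", "under_eaves"],
        "search_center": "wider_road_adjacency",
        "excluded_zones": [],
    },
    # Pattern 03: a neighboring structure lies closer than a threshold distance.
    "positional_03": {
        "search_order": ["garden", "passage"],
        "search_center": "site_center",
        "excluded_zones": ["near_neighbor_boundary"],  # excluded from the search
    },
}

def search_plan(positional_pattern: str) -> dict:
    """Return the search algorithm (order, center, exclusions) for a pattern."""
    return PRIORITY_TABLE[positional_pattern]

if __name__ == "__main__":
    plan = search_plan("positional_02")
    print("search order:", plan["search_order"])
    print("centered on:", plan["search_center"])
```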
The detection unit 12 also stores a priority table as illustrated in FIG. 7. In this priority table, shape patterns 01, 02, 03, ..., each representing one of several shapes of the structure designated as the destination, are associated with priority patterns 11, 12, 13, ... to be used when the destination matches that shape. The shape patterns 01, 02, 03, ... classify the shape of a structure as seen from above into several types. The priority patterns 11, 12, 13, ..., in turn, specify algorithms for detecting structures, that is, which structures to search for preferentially, and from which structure and/or location on the site, for a structure of that shape. For example, under priority pattern 11, detection of a "passage" is attempted first from one part of the shape and detection of a "garden" is then attempted from another part, whereas under priority pattern 12, detection of a "parking lot" is attempted first from one part of the shape and detection of a space "under the eaves" is then attempted from another part, and so on.
The detection unit 12 also stores a priority table as illustrated in FIG. 8. In this priority table, placement patterns 01, 02, 03, ..., each representing one of several placements of the structure designated as the destination on its site, are associated with priority patterns 21, 22, 23, ... to be used when the destination matches that placement. The placement patterns 01, 02, 03, ... classify the position and size of a structure relative to its site, as seen from above, into several types. The priority patterns 21, 22, 23, ..., in turn, specify algorithms for detecting structures, that is, which structures to search for preferentially, and from which structure and/or location on the site, for that placement. For example, under priority pattern 21, detection of a "gate" is attempted first from one part of the structure and detection of a "garden" is then attempted from another part, whereas under priority pattern 22, detection of a "door" is attempted first from one part of the structure and detection of a "parking lot" is then attempted from another part, and so on.
The detection unit 12 also stores a priority table as illustrated in FIG. 9. In this priority table, each destination is associated with a priority pattern, that is, an algorithm for detecting structures around that destination. This priority is specified, according to his or her own wishes, by the user at the destination of the drone 10 (that is, the user who receives the package); for example, detection of a "parking lot" is attempted first, then detection of a "garden", and so on. The instruction is given, for example, by notifying the user terminal 30 before the package is transported and having the user respond to the notification. Some users, however, may not specify a priority; in that case, no priority information is registered in the table for the destination at which that user receives the package.
The priority patterns illustrated in FIGS. 6 to 9 may also be used in combination. For example, if the destination best matches one of the positional-relationship, shape, or placement patterns illustrated in FIGS. 6 to 8, the detection unit 12 searches for structures according to the priority pattern corresponding to that pattern. For a destination whose priority is registered in the table of FIG. 9, the detection unit 12 searches for structures according to the priority pattern registered there; for a destination with no priority registered in the table of FIG. 9, it searches for structures according to a priority pattern determined using any one of FIGS. 6 to 8 or a combination of them, as sketched below.
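This fallback logic can be sketched as follows. It is a minimal illustration under stated assumptions: the destination IDs, the match-score inputs, and the candidate patterns are all hypothetical; the publication specifies only that a registered user priority is honored and that a best-fitting pattern is used otherwise.

```python
# Hypothetical sketch of how the detection unit 12 might resolve which
# priority pattern to follow: a user-specified order (FIG. 9) wins when
# registered; otherwise the best-matching positional/shape/placement
# pattern (FIGS. 6-8) decides. Names and scoring are assumptions.

from typing import Optional

USER_PRIORITY = {  # destination id -> user-specified search order (FIG. 9)
    "dest_123": ["parking_lot", "garden"],
}

def best_fit_pattern(features: dict) -> list:
    """Pick the search order of the FIG. 6-8 pattern that fits best.

    `features` holds a match score in [0, 1] for each candidate pattern,
    assumed to be computed elsewhere from the aerial image.
    """
    candidates = {
        "positional_02": (features["positional_02"], ["gate", "parking_lot"]),
        "shape_01": (features["shape_01"], ["passage", "garden"]),
        "placement_01": (features["placement_01"], ["gate", "garden"]),
    }
    _, order = max(candidates.values(), key=lambda sv: sv[0])
    return order

def resolve_search_order(dest_id: str, features: dict) -> list:
    user_order: Optional[list] = USER_PRIORITY.get(dest_id)
    if user_order:            # a FIG. 9 entry exists: honor the user's wishes
        return user_order
    return best_fit_pattern(features)

if __name__ == "__main__":
    scores = {"positional_02": 0.8, "shape_01": 0.6, "placement_01": 0.3}
    print(resolve_search_order("dest_999", scores))  # falls back to best fit
```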
The drone 10 (detection unit 12) searches for a structure only after descending to some extent from the airspace above the destination and approaching the structure. For example, when the drone 10 (detection unit 12) attempts to detect a door, it descends to an altitude from which the door can be recognized (a position roughly level with the door) before starting the search. It is therefore time-inefficient for the drone 10 to descend from above for every single structure it searches for. Accordingly, when the destination matches, for example, one each of the positional-relationship, shape, and placement patterns illustrated in FIGS. 6 to 8, it is desirable to use, among the three corresponding priority patterns, the one that allows the structures to be searched most quickly without the drone 10 returning to the airspace above. Suppose, for example, that there are priority patterns A, B, and C, matching one each of the positional-relationship pattern of FIG. 6, the shape pattern of FIG. 7, and the placement pattern of FIG. 8. FIG. 10 illustrates the movement route of the drone 10 when detecting according to priority pattern A, FIG. 11 the route for priority pattern B, and FIG. 12 the route for priority pattern C. Each of these figures is a bird's-eye view illustrating the structure that is the destination of the drone 10 and the layout of its site; the arrows indicate the movement route of the drone 10, and the number attached to each arrow indicates the order of movement.
Since the movement route illustrated in FIG. 10 does not require flying over the building B, it returns to the airspace above the structure fewer times than the routes illustrated in FIGS. 11 and 12. In this case, therefore, the movement route of FIG. 10, that is, priority pattern A, is adopted, as in the sketch below. In this way, the detection unit 12 uses a priority that allows structures to be detected while the flight altitude of the drone 10 stays within a predetermined range.
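A minimal sketch of this selection rule, assuming each candidate route is reduced to a sequence of waypoint altitudes: the pattern whose route needs the fewest climbs back above the building wins. The altitude values and route encodings are illustrative, not from the publication.

```python
# Among candidate priority patterns, prefer the one whose search route
# needs the fewest climbs back above the building. All numbers are
# made up for illustration.

def climbs_needed(route: list) -> int:
    """Count how often the drone must regain altitude along the route."""
    return sum(1 for prev, cur in zip(route, route[1:]) if cur > prev)

def pick_pattern(candidates: dict) -> str:
    """candidates: pattern name -> list of waypoint altitudes in meters."""
    return min(candidates, key=lambda name: climbs_needed(candidates[name]))

if __name__ == "__main__":
    routes = {
        "A": [20, 3, 2, 2],        # descends once, stays low (cf. FIG. 10)
        "B": [20, 3, 20, 2, 20],   # repeatedly returns above the building
        "C": [20, 2, 20, 3],
    }
    print(pick_pattern(routes))  # -> "A"
```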
Returning to FIG. 4, the determination unit 13 determines whether the drone 10 can land at the landing position determined for the structure detected by the detection unit 12. Specifically, the determination unit 13 determines whether the drone 10 can land based on the levelness or flatness of the landing position, or on whether liquid is present there. The levelness and flatness of the landing position are judged from the outputs of the ranging sensor and the image sensor of the sensor 1008, and the presence of liquid (typically water) at the landing position is judged from the output of the image sensor of the sensor 1008. The landing position determined for a structure is decided in advance: for example, the landing position for a garden is the center of that garden, the landing position for a parking lot is a position near the edge of that parking lot, the landing position for a door is in front of that door, and the landing position for a gate is a position closer to the main structure than the gate.
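A minimal sketch of this feasibility check follows. The tilt and roughness thresholds, and the sensor abstractions, are assumptions for illustration; the publication states only which conditions are examined, not their numeric tolerances.

```python
# Illustrative feasibility check along the lines of the determination
# unit 13: levelness and flatness from range-sensor data, liquid from
# the image. Thresholds are assumed values, not from the publication.

from dataclasses import dataclass

@dataclass
class LandingSite:
    tilt_deg: float        # surface tilt estimated from ranging data
    roughness_cm: float    # height variation across the landing spot
    liquid_detected: bool  # e.g. a puddle classified in the image

MAX_TILT_DEG = 5.0        # assumed tolerance of the airframe
MAX_ROUGHNESS_CM = 3.0    # assumed tolerance for skid contact

def can_land(site: LandingSite) -> bool:
    """True when the site is level and flat enough and free of liquid."""
    return (site.tilt_deg <= MAX_TILT_DEG
            and site.roughness_cm <= MAX_ROUGHNESS_CM
            and not site.liquid_detected)

if __name__ == "__main__":
    print(can_land(LandingSite(2.0, 1.5, False)))  # True
    print(can_land(LandingSite(2.0, 1.5, True)))   # False: puddle present
```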
When the determination unit 13 determines that landing at the landing position is possible, the landing control unit 14 controls the flight drive mechanism 1009 to land the drone 10 at that landing position.
[Operation]
Next, the processing during flight of the drone 10 will be described with reference to the flowchart shown in FIG. 13. In FIG. 13, the drone 10 starts flying from its departure point toward the destination (step S01). Thereafter, under the control of the server device 50, the drone 10 flies to the airspace above the destination address specified in the package delivery request.
When the drone 10 reaches the airspace above the destination (step S02; YES), the detection unit 12 detects the structure and/or the structures existing on the site, based on an image of the site including the structure captured by the sensor 1008 (image sensor). At this time, the detection unit 12 detects structures according to the order of priority described above (step S03).
When a structure with a high priority is detected, the determination unit 13 determines whether the drone 10 can land at the landing position determined for the structure detected by the detection unit 12. Specifically, the determination unit 13 identifies factors relevant to the suitability of landing, such as the levelness or flatness of the landing position or the presence of liquid, from the outputs of the ranging sensor and the image sensor of the sensor 1008, and determines whether the drone 10 can land (step S04).
When the determination unit 13 determines that landing at the landing position is possible, the landing control unit 14 controls the flight drive mechanism 1009 to land the drone 10 at that landing position (step S05). If the determination unit 13 does not determine that landing at the landing position is possible, the detection unit 12 attempts to detect the structure with the next priority, and the processing of steps S03 and S04 above is repeated.
If landing is not determined to be possible for any of the structures, the drone 10 performs predetermined NG processing and returns to its departure point. A compact sketch of this flow follows.
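The helper functions in the sketch below are hypothetical stubs standing in for steps S03 and S04; only the control structure (search by priority, judge, land, otherwise NG processing) mirrors the flowchart of FIG. 13.

```python
# Compact sketch of the flow of FIG. 13 (steps S01-S05): search structures
# in priority order, check each candidate landing position, land at the
# first feasible one, otherwise run NG processing and return to base.

def detect(structure_type):
    """Step S03 stub: return a detection, or None if nothing is found."""
    return {"type": structure_type}

def feasible(candidate):
    """Step S04 stub: landing-feasibility check (here: gardens only)."""
    return candidate["type"] == "garden"

def deliver(search_order):
    # Steps S01/S02 (flight to the destination airspace) are omitted here.
    for structure_type in search_order:          # step S03, by priority
        candidate = detect(structure_type)
        if candidate and feasible(candidate):    # step S04
            print(f"landing at {structure_type}")  # step S05
            return True
    print("no feasible site: NG processing, returning to base")
    return False

if __name__ == "__main__":
    deliver(["gate", "parking_lot", "garden"])
```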
According to the embodiment described above, an unmanned aerial vehicle can be landed at an appropriate position on any of the various structures designated as destinations, without providing equipment dedicated to the landing of the drone 10 at the destination.
[Modifications]
The present invention is not limited to the embodiment described above. The embodiment may be modified as follows, and two or more of the following modifications may be combined.
[Modification 1]
When a plurality of structures are detected at one destination at the same time, the structure to be judged for landing feasibility may be determined based on the distance from the drone. For example, if two doors are detected in a single detection process, the determination unit 13 judges whether landing is possible at the door closer to the drone 10 at that time. In other words, when a plurality of structures are detected, the determination unit 13 may determine the structure to be judged based on the distance from the drone 10, as in the sketch below.
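A one-function sketch of this distance rule, assuming detections are reduced to planar coordinates; the positions and the Euclidean metric are illustrative choices, not specified in the publication.

```python
# Modification 1 sketch: when one detection pass returns several
# structures of the same kind, judge the one nearest the drone first.

import math

def nearest(drone_xy, detections):
    """detections: list of (x, y) positions of same-type structures."""
    return min(detections,
               key=lambda p: math.dist(drone_xy, p))  # Euclidean distance

if __name__ == "__main__":
    doors = [(12.0, 4.0), (3.0, 5.0)]
    print(nearest((0.0, 0.0), doors))  # -> (3.0, 5.0), the closer door
```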
As described in the embodiment above, the detection unit 12 detects structures based on an image of the site including the structure captured by the sensor 1008 (image sensor) of the drone 10; this detection yields, for each structure, a likelihood (recognition confidence) that the detected object corresponds to that structure.
Accordingly, when a plurality of structures are detected at one destination at the same time, the determination unit 13 may determine the structure to be judged based on the likelihood of each structure. For example, the determination unit 13 compares the likelihoods of the structures and selects the structure with the highest likelihood as the one to be judged. In this way, the structure to be judged as a landing position can be determined based on the likelihood of each structure.
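The likelihood rule admits an equally small sketch; the labels and confidence scores below are invented for illustration.

```python
# Likelihood-based variant: each detection carries the recognition
# confidence produced by the detector, and the candidate with the
# highest confidence is judged first.

def most_likely(detections):
    """detections: list of (label, confidence) pairs from one pass."""
    return max(detections, key=lambda d: d[1])

if __name__ == "__main__":
    hits = [("door", 0.62), ("door", 0.91), ("gate", 0.40)]
    print(most_likely(hits))  # -> ("door", 0.91)
```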
[Modification 2]
The landing control of the drone may be realized by so-called edge computing (control by the drone), by cloud computing (control by the server device), or by cooperation between the two (control by the drone and the server device), as described in the embodiment. Accordingly, the control device of the present invention may be provided in the server device 50.
[Modification 3]
The unmanned aerial vehicle is not limited to what is called a drone, and may have any structure or form as long as it is an unmanned aerial vehicle capable of transporting cargo. The present invention is also applicable to a manned aerial vehicle that carries a person but flies autonomously.
[Modification 4]
The embodiment above was described for the case where the aerial vehicle (drone 10) transporting cargo lands at the destination. The present invention is also applicable to the landing of an aerial vehicle in a scene where, for example, the vehicle lands at the destination without holding cargo, receives and holds cargo at the landing position, and then takes off for the next destination. Moreover, the purpose or use of the flight is not limited to the transport of cargo exemplified in the embodiment; it may be anything, such as measuring or photographing some object. In other words, the present invention can be applied whenever an aerial vehicle lands, regardless of the purpose or use of its flight.
[Modification 5]
In the embodiment above, an image sensor serving as imaging means in the sensor 1008 of the drone 10 was used to detect structures. The detection method is not limited to this example; any method capable of sensing the position, shape, size, or the like of a structure may be used, such as the technique called LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) or the technique called SLAM (Simultaneous Localization and Mapping). That is, it suffices that the sensor 1008 includes a sensor capable of sensing the site including the structure, and that the detection unit 12 of the server device 50 detects the structure and/or the structures existing on the site based on the result of sensing that site.
[Modification 6]
In the embodiment above, the determination unit 13 judged whether the aerial vehicle can land based on the levelness or flatness of the landing position or on the presence of liquid, but the judgment may also be based on other conditions, for example the material or temperature of the landing position, or the presence of snow. The material of the landing position and the presence of snow are judged from, for example, the output of the image sensor of the sensor 1008, and the temperature of the landing position is judged from the output of a non-contact temperature sensor included in the sensor 1008. In short, the determination unit 13 only needs to judge whether the aerial vehicle can land based on at least one of several kinds of landing-position conditions, such as levelness, flatness, material, temperature, or the presence of liquid or snow, as in the sketch below.
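Extending the earlier feasibility sketch with these additional conditions might look as follows; the set of safe materials and the temperature range are assumptions, not values from the publication.

```python
# Modification 6 sketch: material and snow from the image sensor,
# temperature from a non-contact temperature sensor. Which conditions
# apply, and all thresholds, are assumed for illustration.

SAFE_MATERIALS = {"concrete", "asphalt", "paving_stone"}  # assumed set

def can_land_extended(material: str, temperature_c: float,
                      snow: bool, liquid: bool) -> bool:
    """True when material, temperature, snow, and liquid checks all pass."""
    return (material in SAFE_MATERIALS
            and -10.0 <= temperature_c <= 50.0  # assumed operating range
            and not snow
            and not liquid)

if __name__ == "__main__":
    print(can_land_extended("concrete", 18.0, snow=False, liquid=False))  # True
    print(can_land_extended("gravel", 18.0, snow=False, liquid=False))    # False
```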
[Other Modifications]
The block diagrams used in the description of the embodiment above show blocks in functional units. These functional blocks (components) are realized by any combination of hardware and/or software, and the means for realizing each functional block is not particularly limited. That is, each functional block may be realized by one physically and/or logically coupled device, or by two or more physically and/or logically separate devices connected directly and/or indirectly (for example, by wire and/or wirelessly).
For example, a single computer may provide the functions of the user terminals 30 to 32 exemplified in the embodiment. In short, each function illustrated in FIG. 5 need only be provided by one of the devices constituting the drone management system 1 as an information processing system. For example, if the server device 50 can control the drone 10 directly, the server device 50 may have a function corresponding to the processing unit 313 and directly restrict the flight of the drone 10.
Each aspect/embodiment described herein may be applied to systems using LTE (Long Term Evolution), LTE-A (LTE-Advanced), SUPER 3G, IMT-Advanced, 4G, 5G, FRA (Future Radio Access), W-CDMA (registered trademark), GSM (registered trademark), CDMA2000, UMB (Ultra Mobile Broadband), IEEE 802.11 (Wi-Fi), IEEE 802.16 (WiMAX), IEEE 802.20, UWB (Ultra-WideBand), Bluetooth (registered trademark), or other appropriate systems, and/or to next-generation systems extended on the basis of these.
The processing procedures, sequences, flowcharts, and the like of each aspect/embodiment described herein may be reordered as long as no contradiction arises. For example, the methods described herein present the elements of the various steps in an exemplary order and are not limited to the specific order presented. Each aspect/embodiment described herein may be used alone, in combination, or switched between as they are carried out. Notification of predetermined information (for example, notification that something is "X") is not limited to being performed explicitly, and may be performed implicitly (for example, by not notifying the predetermined information).
The information, parameters, and the like described herein may be represented by absolute values, by values relative to a predetermined value, or by other corresponding information.
As used herein, the terms "determining" and "deciding" may encompass a wide variety of operations. "Determining" and "deciding" may include, for example, regarding judging, calculating, computing, processing, deriving, investigating, looking up (for example, looking up in a table, a database, or another data structure), or ascertaining as "determining" or "deciding". They may also include regarding receiving (for example, receiving information), transmitting (for example, transmitting information), input, output, or accessing (for example, accessing data in memory) as "determining" or "deciding". They may further include regarding resolving, selecting, choosing, establishing, comparing, and the like as "determining" or "deciding". In other words, "determining" and "deciding" may include regarding some operation as "determining" or "deciding".
The present invention may be provided as an information processing method or as a program. Such a program may be provided in a form recorded on a recording medium such as an optical disc, or in a form in which it is downloaded to a computer via a network such as the Internet and installed for use.
Software, instructions, and the like may be transmitted and received via a transmission medium. For example, when software is transmitted from a website, a server, or another remote source using wired technologies such as coaxial cable, optical fiber cable, twisted pair, and digital subscriber line (DSL) and/or wireless technologies such as infrared, radio, and microwave, these wired and/or wireless technologies are included within the definition of a transmission medium.
The information, signals, and the like described herein may be represented using any of a variety of different technologies. For example, the data, instructions, commands, information, signals, bits, symbols, chips, and the like that may be referred to throughout the description above may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, optical fields or photons, or any combination of these.
Any reference herein to elements using designations such as "first" and "second" does not generally limit the quantity or order of those elements. These designations may be used herein as a convenient way of distinguishing between two or more elements. Thus, a reference to first and second elements does not mean that only two elements may be employed, or that the first element must precede the second element in some way.
The "means" in the configuration of each device described above may be replaced with "unit", "circuit", "device", or the like.
To the extent that "including", "comprising", and variations thereof are used herein or in the claims, these terms are intended to be inclusive, like the term "provided with". Furthermore, the term "or" as used herein or in the claims is not intended to be an exclusive OR.
Throughout this disclosure, where articles are added by translation, such as "a", "an", and "the" in English, these articles shall include plural referents unless the context clearly indicates otherwise.
Although the present invention has been described in detail above, it is apparent to those skilled in the art that the present invention is not limited to the embodiments described herein. The present invention can be implemented in modified and altered forms without departing from the spirit and scope of the present invention as defined by the claims. Accordingly, the description herein is for the purpose of illustration and has no restrictive meaning with respect to the present invention.
1: drone management system, 10: drone, 11: acquisition unit, 12: detection unit, 13: determination unit, 14: landing control unit, 30: user terminal, 40: wireless communication network, 50: server device, 1001: processor, 1002: memory, 1003: storage, 1004: communication device, 1005: input device, 1006: output device, 1007: positioning device, 1008: sensor, 1009: flight drive mechanism, 5001: processor, 5002: memory, 5003: storage, 5004: communication device.

Claims (10)

  1.  A control device comprising:
     a detection unit that, after an aerial vehicle reaches the airspace above a structure designated as a destination, detects structures existing on a site including the structure, based on a result of the aerial vehicle sensing the site;
     a determination unit that determines whether the aerial vehicle can land at a landing position determined for the detected structure; and
     a landing control unit that causes the aerial vehicle to land at the landing position when it is determined that landing is possible.
  2.  The control device according to claim 1, wherein
     the detection unit detects the structures according to an order of priority.
  3.  The control device according to claim 2, wherein
     the order of priority is determined based on a positional relationship between the structure and at least one of another structure and another site adjacent to the structure.
  4.  The control device according to claim 2 or 3, wherein
     the order of priority is determined based on a shape of the structure.
  5.  The control device according to any one of claims 2 to 4, wherein
     the order of priority is determined based on an arrangement of the structures existing on the site.
  6.  The control device according to any one of claims 2 to 5, wherein
     the detection unit selects, as a detection target from among the structures specified in the order of priority, a structure that is detectable while the flight altitude of the aerial vehicle is within a predetermined range.
  7.  The control device according to any one of claims 1 to 6, wherein
     the detection unit sets a range to be excluded from the search range for structures, according to a distance between the structure designated as the destination and another structure.
  8.  The control device according to any one of claims 1 to 7, wherein,
     when a plurality of the structures are detected, the determination unit determines, based on a distance from the aerial vehicle, one structure to be judged for landing feasibility.
  9.  The control device according to any one of claims 1 to 7, wherein,
     when a plurality of the structures are detected, the determination unit determines, based on a likelihood of each structure, one structure to be judged for landing feasibility.
  10.  The control device according to any one of claims 1 to 9, wherein
     the determination unit determines whether the aerial vehicle can land based on a state of the landing position.
PCT/JP2022/025759 2021-07-07 2022-06-28 Control device WO2023282124A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023533549A JPWO2023282124A1 (en) 2021-07-07 2022-06-28

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021112731 2021-07-07
JP2021-112731 2021-07-07

Publications (1)

Publication Number Publication Date
WO2023282124A1 true WO2023282124A1 (en) 2023-01-12

Family

ID=84801589

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/025759 WO2023282124A1 (en) 2021-07-07 2022-06-28 Control device

Country Status (2)

Country Link
JP (1) JPWO2023282124A1 (en)
WO (1) WO2023282124A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019049874A (en) * 2017-09-11 2019-03-28 Kddi株式会社 Flight management device, flight device, and flight management method
WO2019146576A1 (en) * 2018-01-23 2019-08-01 株式会社Nttドコモ Information processing device
JP6622291B2 (en) * 2014-09-08 2019-12-18 クゥアルコム・インコーポレイテッドQualcomm Incorporated Methods, systems, and devices for delivery drone security
WO2021039100A1 (en) * 2019-08-27 2021-03-04 ソニー株式会社 Mobile body, information processing device, information processing method, and program


Also Published As

Publication number Publication date
JPWO2023282124A1 (en) 2023-01-12

Similar Documents

Publication Publication Date Title
US10217367B2 (en) Unmanned aerial vehicle and system having the same
CN110998467A (en) Model for determining lowering point at transport position
US11932391B2 (en) Wireless communication relay system using unmanned device and method therefor
TW201706970A (en) Unmanned aircraft navigation system and method
CN111542479B (en) Method for determining article transfer location, method for determining landing location, article transfer system, and information processing device
JP2014197404A (en) Transitioning mixed-mode vehicle to autonomous mode
JPWO2018078859A1 (en) Flight control program, flight control method, and information processing apparatus
US20190244530A1 (en) Unmanned aerial vehicle and system having the same and method for searching for route of unmanned aerial vehicle
US20210157012A1 (en) Drone escort system
US20210150914A1 (en) Flight control apparatus
JP7167327B2 (en) Control device, program and control method
JPWO2020194495A1 (en) Landing management device, landing management method, and landing management system
JP7178351B2 (en) flight control system
JP2018206053A (en) Parking support system and parking support method
WO2019087891A1 (en) Information processing device and flight control system
WO2023282124A1 (en) Control device
JP7050809B2 (en) Information processing equipment
JP7157823B2 (en) Information processing equipment
US10755582B2 (en) Drone physical and data interface for enhanced distance coverage
WO2023021948A1 (en) Control device and program
JP7148567B2 (en) System, management device, program, and management method
WO2023042601A1 (en) Information processing device
JP7167341B2 (en) Information processing equipment
WO2023162583A1 (en) Control apparatus
WO2023042551A1 (en) Information processing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22837540

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023533549

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE