WO2023021948A1 - Control device and program - Google Patents

Control device and program

Info

Publication number
WO2023021948A1
Authority
WO
WIPO (PCT)
Prior art keywords
landing
destination
candidate
drone
user
Prior art date
Application number
PCT/JP2022/028865
Other languages
French (fr)
Japanese (ja)
Inventor
昌志 安沢
広樹 石塚
圭祐 中島
真幸 森下
Original Assignee
NTT DOCOMO, INC. (株式会社NTTドコモ)
Priority date
Filing date
Publication date
Application filed by NTT DOCOMO, INC.
Publication of WO2023021948A1 publication Critical patent/WO2023021948A1/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C: AEROPLANES; HELICOPTERS
    • B64C13/00: Control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers
    • B64C13/02: Initiating means
    • B64C13/16: Initiating means actuated automatically, e.g. responsive to gust detectors
    • B64C13/18: Initiating means actuated automatically, e.g. responsive to gust detectors using automatic pilot
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C: AEROPLANES; HELICOPTERS
    • B64C27/00: Rotorcraft; Rotors peculiar thereto
    • B64C27/04: Helicopters
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C: AEROPLANES; HELICOPTERS
    • B64C39/00: Aircraft not otherwise provided for
    • B64C39/02: Aircraft not otherwise provided for characterised by special use
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10: Simultaneous control of position or course in three dimensions
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G5/00: Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/02: Automatic approach or landing aids, i.e. systems in which flight data of incoming planes are processed to provide landing data

Definitions

  • the present invention relates to technology for landing an aircraft.
  • Patent Literature 1 describes a mechanism in which a landing pad is provided in a landing zone of a delivery destination of a drone, and the drone is guided to the landing pad by a visual assistance device, an optical assistance device or a wireless assistance device.
  • In this mechanism, however, a dedicated facility called a landing pad must be provided at the delivery destination of the drone.
  • Destinations for drones include, for example, dwelling units of various sizes and shapes.
  • the purpose of the present invention is to land the flying object at an appropriate position at the destination of the flying object.
  • To solve this problem, the present invention provides a control device comprising: an acquisition unit that acquires information on a plurality of candidate landing positions at a destination of an aircraft; a judgment unit that judges, based on the result of an inspection of the destination performed by the aircraft that has reached the destination based on that information, whether or not the aircraft can land at each of the candidate landing positions; and a landing control unit that lands the aircraft at a candidate landing position judged to be landable.
  • According to the present invention, it is possible to land the flying object at an appropriate position at the destination of the flying object.
  • FIG. 1 is a block diagram showing an example configuration of the drone management system 1 according to one embodiment of the present invention. FIG. 2 is a block diagram showing an example of the hardware configuration of the drone 10 according to the embodiment. FIG. 3 is a block diagram showing an example of the hardware configuration of the server device 50 according to the embodiment. FIG. 4 is a block diagram showing an example of the functional configuration of the drone 10. FIG. 5 is a diagram illustrating the landing candidate position list.
  • FIG. 1 is a diagram showing an example configuration of a drone management system 1 according to an embodiment of an information processing system of the present invention.
  • the drone management system 1 includes a drone 10 that transports a package to a destination, a user terminal 30 used by a user living in the building that is the destination, a wireless communication network 40, and a server device 50 connected to the wireless communication network 40.
  • Although one drone 10, one user terminal 30, one wireless communication network 40, and one server device 50 are illustrated in FIG. 1, there may be a plurality of each.
  • the drone 10 is an unmanned flying object that flies in the air.
  • the drone 10 transports the cargo by holding the cargo, flying to the destination, and landing at the destination.
  • the user terminal 30 is a communicable computer such as a smartphone, tablet, or personal computer.
  • the user terminal 30 is a smart phone and functions as a communication terminal for the user who receives the parcel to access the server device 50 via the wireless communication network 40 .
  • the server device 50 stores flight plan information such as the flight date and time, flight route and flight altitude of the drone 10, and remotely steers the drone according to the flight plan information.
  • Remote control by the server device 50 mainly covers the section between the drone's departure/arrival point, called a base, and the airspace above the destination.
  • Flight in the section between the destination airspace and the landing position of the drone is carried out under autonomous control by the drone itself.
  • In this autonomous section, the drone 10 detects candidate landing positions (for example, a door, a veranda, a gate, a parking lot, a warehouse, a garden, etc.) fixed to the building corresponding to the destination or to the site containing the building, judges whether or not it is possible to land at or near each candidate position, and then lands.
  • In this embodiment, the section between the drone's departure/arrival point and the destination airspace relies on remote control by the server device 50, and the section between the destination airspace and the drone's landing position is realized by autonomous flight by the drone itself; however, the invention is not limited to this example.
  • For example, the drone 10 may fly autonomously in all sections between the departure/arrival point and the landing position at the destination without relying on remote control by the server device 50, or may fly under the remote control of the server device 50 in all sections.
  • the wireless communication network 40 may be, for example, equipment conforming to the 4th generation mobile communication system or may be equipment conforming to the 5th generation mobile communication system.
  • FIG. 2 is a diagram showing an example of the hardware configuration of the drone 10.
  • the drone 10 is physically configured as a computer device including a processor 1001, a memory 1002, a storage 1003, a communication device 1004, an input device 1005, an output device 1006, a positioning device 1007, a sensor 1008, a flight drive mechanism 1009, and a bus connecting them. Note that in the following description, the term "device" can be read as a circuit, unit, or the like.
  • the hardware configuration of the drone 10 may be configured to include one or more of each device shown in the figure, or may be configured without some of the devices.
  • Each function of the drone 10 is realized by loading predetermined software (a program) onto hardware such as the processor 1001 and the memory 1002, and by having the processor 1001 perform calculations, control communication by the communication device 1004, control at least one of reading and writing of data in the memory 1002 and the storage 1003, and control the positioning device 1007, the sensor 1008, and the flight drive mechanism 1009.
  • the processor 1001 for example, operates an operating system and controls the entire computer.
  • the processor 1001 may be configured by a central processing unit (CPU) including interfaces with peripheral devices, a control unit, an arithmetic unit, registers, and the like.
  • a baseband signal processing unit, a call processing unit, and the like may be implemented by the processor 1001 .
  • the processor 1001 reads programs (program codes), software modules, data, etc. from at least one of the storage 1003 and the communication device 1004 to the memory 1002, and executes various processes according to them.
  • the functional blocks of drone 10 may be implemented by a control program stored in memory 1002 and running on processor 1001 .
  • Various types of processing may be executed by one processor 1001, but may also be executed by two or more processors 1001 simultaneously or sequentially.
  • Processor 1001 may be implemented by one or more chips. Note that the program may be transmitted to the drone 10 via the wireless communication network 40 .
  • the memory 1002 is a computer-readable recording medium, and may be composed of at least one of, for example, ROM, EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), RAM, and the like.
  • the memory 1002 may also be called a register, cache, main memory (main storage device), or the like.
  • the memory 1002 can store executable programs (program code), software modules, etc. to perform the methods of the present invention.
  • the storage 1003 is a computer-readable recording medium, and may be composed of, for example, an optical disk such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray disc), a smart card, a flash memory device (for example, a card, stick, or key drive), a floppy disk, a magnetic strip, and/or the like.
  • Storage 1003 may also be called an auxiliary storage device.
  • the storage 1003 stores various programs and data groups.
  • the processor 1001, memory 1002, and storage 1003 described above function as an example of the control device of the present invention.
  • the communication device 1004 is hardware (transmitting/receiving device) for communicating between computers via the wireless communication network 40, and is also called a network device, a network controller, a network card, a communication module, or the like.
  • the communication device 1004 includes a high-frequency switch, duplexer, filter, frequency synthesizer, etc. in order to implement frequency division duplexing and time division duplexing.
  • a transmitting/receiving antenna, an amplifier section, a transmitting/receiving section, a transmission line interface, etc. may be implemented by the communication device 1004 .
  • the transmitting/receiving unit may be implemented with the transmitter and the receiver physically or logically separated.
  • the input device 1005 is an input device that receives input from the outside, and includes, for example, keys, switches, and microphones.
  • the output device 1006 is an output device that outputs to the outside, and includes, for example, a display device such as a liquid crystal display and a speaker. Note that the input device 1005 and the output device 1006 may be integrated.
  • the positioning device 1007 is hardware that measures the position of the drone 10, such as a GPS (Global Positioning System) device.
  • the drone 10 flies from the departure/arrival point to the sky above the destination based on the positioning by the positioning device 1007 .
  • the sensor 1008 includes a ranging sensor that functions as altitude measurement means and landing position status confirmation means for the drone 10, a gyro sensor and direction sensor that function as attitude measurement means for the drone 10, an image sensor that functions as imaging means, and the like.
  • the flight drive mechanism 1009 includes hardware such as motors and propellers for the drone 10 to fly.
  • Each device such as the processor 1001 and the memory 1002 is connected by a bus for communicating information.
  • the bus may be configured using a single bus, or may be configured using different buses between devices.
  • the drone 10 may be configured including hardware such as a microprocessor, a GPU (Graphics Processing Unit), a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), and an FPGA (Field Programmable Gate Array), and part or all of each functional block may be realized by that hardware.
  • processor 1001 may be implemented using at least one of these pieces of hardware.
  • FIG. 3 is a diagram showing the hardware configuration of the server device 50.
  • the hardware configuration of the server device 50 may be configured to include one or more of the devices shown in FIG. 3, or may be configured without some of the devices. Further, the server device 50 may be configured by connecting a plurality of devices having different housings for communication.
  • the server device 50 is physically configured as a computer device including a processor 5001, a memory 5002, a storage 5003, a communication device 5004, and a bus connecting them. Each function in the server device 50 is realized by causing the processor 5001 to perform calculations, controlling communication by the communication device 5004, and controlling at least one of reading and writing of data in the memory 5002 and the storage 5003. Each of these devices operates with power supplied from a power source (not shown). Note that in the following description, the term "device" can be read as a circuit, unit, or the like.
  • a processor 5001 operates an operating system to control the entire computer.
  • the processor 5001 may be configured by a central processing unit (CPU) including interfaces with peripheral devices, a control unit, an arithmetic unit, registers, and the like.
  • a baseband signal processing unit, a call processing unit, and the like may be implemented by the processor 5001 .
  • the processor 5001 reads programs (program codes), software modules, data, etc. from at least one of the storage 5003 and the communication device 5004 to the memory 5002, and executes various processes according to them.
  • As the program, a program that causes a computer to execute at least part of the operations described below is used.
  • the functional blocks of the server device 50 may be implemented by a control program stored in the memory 5002 and running on the processor 5001.
  • Various types of processing may be executed by one processor 5001, but may also be executed by two or more processors 5001 simultaneously or sequentially.
  • Processor 5001 may be implemented by one or more chips.
  • the memory 5002 is a computer-readable recording medium, and may be composed of at least one of ROM, EPROM, EEPROM, and RAM, for example.
  • the memory 5002 may also be called a register, cache, main memory (main storage device), or the like.
  • the memory 5002 can store executable programs (program code), software modules, etc. for performing methods according to the present invention.
  • the storage 5003 is a computer-readable recording medium, and may be composed of, for example, an optical disk such as a CD-ROM, a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory device (for example, a card, stick, or key drive), a floppy disk, a magnetic strip, and/or the like.
  • Storage 5003 may be called an auxiliary storage device.
  • the storage 5003 stores at least programs and data groups for executing various processes described later.
  • the communication device 5004 is hardware (transmitting/receiving device) for communicating between computers via the wireless communication network 40, and is also called a network device, a network controller, a network card, a communication module, or the like.
  • Each device such as the processor 5001 and memory 5002 is connected by a bus for communicating information.
  • the bus may be configured using a single bus, or may be configured using different buses between devices.
  • the server device 50 may be configured including hardware such as a microprocessor, digital signal processor, ASIC, PLD, and FPGA, and part or all of each functional block may be realized by the hardware.
  • processor 5001 may be implemented using at least one of these pieces of hardware.
  • FIG. 4 is a diagram showing an example of the functional configuration of the drone 10.
  • Each function realized by the drone 10 is performed by loading predetermined software (programs) onto hardware such as the processor 1001 and the memory 1002, causing the processor 1001 to perform calculations, controlling communication by the communication device 1004, and controlling at least one of reading and writing of data in the memory 1002 and the storage 1003.
  • In this way, the functions of an acquisition unit 11, an inspection unit 12, a determination unit 13, a presentation unit 14, and a landing control unit 15 are realized.
  • the acquisition unit 11 acquires various data from external devices such as the server device 50 .
  • the acquisition unit 11 acquires from the server device 50 various instructions and commands, as well as landing candidate position information regarding a plurality of candidate landing positions at the destination of the drone 10.
  • a landing candidate position is a candidate position at which the drone 10 lands at the destination.
  • the candidate landing positions are set at various structures and facilities at the destination, such as entrances, doors, verandas, gates, parking lots, warehouses, gardens, rooftops, passages, and areas under the eaves.
  • FIG. 5 is a diagram illustrating a landing candidate position list corresponding to the landing candidate position information.
  • In this list, each candidate landing position is associated with the presence or absence of landing permission designated by the user residing at the destination of the drone 10, and with a priority as the landing position of the drone 10.
  • In the example of FIG. 5, landing permission has been given to every candidate landing position except the roof.
  • That is, the user gives landing permission in advance to candidate landing positions considered suitable for the drone 10 to land at his or her own home corresponding to the destination, and withholds landing permission from candidate landing positions considered unsuitable.
  • the user can access the server device 50 in advance using the user terminal 30 and register any content regarding the presence or absence of such landing permission.
  • the order of priority in the landing candidate position list may be registered in advance by the user in the same manner as the presence or absence of landing permission, or may be determined in advance by the system operator.
  • This list of candidate landing positions is designated by each user for each destination and registered in the server device 50 .
  • the candidate landing positions in the landing candidate position list are not limited to examples represented by names such as the entrance, the garden, the veranda, the roof, outside the gate, inside the gate, and the parking lot.
  • For example, they may be represented by a photographic image of the candidate landing position (for example, taken from a distance of about 1.5 m) or by the relative position of the candidate landing position at the destination (for example, a photograph of the entire destination taken from the air, with each candidate landing position marked).
  • the identifiers of the destinations illustrated in FIG. 5 may be represented by regularly assigned character strings, or may be represented by location information (such as latitude and longitude) of the destinations.
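The candidate landing position list of FIG. 5 can be sketched as a simple data structure. The field names, the destination identifier, and the example rows below are illustrative assumptions, not part of the publication:

```python
from dataclasses import dataclass

@dataclass
class LandingCandidate:
    """One row of the candidate landing position list (cf. FIG. 5)."""
    name: str        # e.g. "entrance", "garden", "roof"
    permitted: bool  # landing permission registered in advance by the user
    priority: int    # lower value = higher priority as a landing position

# Hypothetical list registered for one destination, keyed by a destination identifier.
candidate_list = {
    "destination-001": [
        LandingCandidate("entrance", True, 1),
        LandingCandidate("outside the gate", True, 2),
        LandingCandidate("garden", True, 3),
        LandingCandidate("roof", False, 4),  # the user withheld permission for the roof
    ]
}

# Only positions with landing permission are ever considered as landing positions.
permitted = [c for c in candidate_list["destination-001"] if c.permitted]
print([c.name for c in permitted])
```

The identifier key could equally be latitude/longitude, as the text notes.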
  • the inspection unit 12 uses the sensor 1008 (image sensor) of the drone 10 to image the entire site including the building.
  • the candidate landing positions at the destination are examined based on the obtained image and the list of candidate landing positions at the destination.
  • the inspection here means specifying where the candidate landing positions written in the candidate landing position list corresponding to the destination exist at the destination and what state they are in.
  • Specifically, each candidate landing position is extracted from the image captured by the sensor 1008 based on image features of that candidate landing position, and the sensor 1008 detects its state, for example whether there is liquid or whether there is an obstacle.
  • FIG. 6 is a bird's-eye view illustrating the structure of the building and its site, which is the destination of the drone 10.
  • That is, it corresponds to the image obtained when the drone 10 captures the scene below from the sky above the destination (for example, from 20 m above) with the image sensor.
  • Building B and site G are adjacent to roads R1 and R2.
  • another building B1 and its site G1 are also adjacent to the building B and site G.
  • the site G includes, for example, a gate g, trees W, and a roof P of a parking lot.
  • Note that the drone 10 may inspect a candidate landing position after approaching the building by lowering its altitude to some extent from the sky above the destination. For example, when the drone 10 (inspection unit 12) attempts to inspect the candidate landing position called the entrance, it descends to a position where the entrance (door) can be recognized (a position approximately level with the entrance door) before inspecting it.
  • the determination unit 13 determines whether or not the drone 10 can land at each of the candidate landing positions based on the result of the inspection of the destination by the inspection unit 12. Specifically, the determination unit 13 determines whether the drone 10 can land at each candidate landing position based on its horizontality and flatness, whether there is liquid, or whether there is an obstacle. The horizontality and flatness of the landing position are determined based on the outputs of the ranging sensor and image sensor of the sensor 1008. Whether or not there is liquid (typically water) or an obstacle at the landing position is determined based on the output of the image sensor of the sensor 1008.
  • Note that the determination unit 13 may determine whether or not the drone 10 can land at a position predetermined with respect to each candidate landing position.
  • For example, the landing position for the candidate landing position of the garden may be predetermined as the center of the garden, the landing position for the parking lot as near its edge, and the landing position for the entrance as just in front of the entrance.
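The landability check performed by the determination unit 13 can be sketched as a predicate over sensor-derived measurements. The measurement names and threshold values below are illustrative assumptions; the publication does not specify numeric criteria:

```python
def can_land(tilt_deg: float, roughness_m: float,
             liquid_detected: bool, obstacle_detected: bool,
             max_tilt_deg: float = 5.0, max_roughness_m: float = 0.05) -> bool:
    """Judge whether the drone can land at an inspected candidate position.

    tilt_deg    -- deviation from horizontal, from the ranging sensor
    roughness_m -- surface unevenness (flatness), from ranging/image sensors
    liquid_detected, obstacle_detected -- results of image-based detection
    """
    # Liquid or an obstacle at the position rules out landing outright.
    if liquid_detected or obstacle_detected:
        return False
    # Otherwise require the surface to be sufficiently level and flat.
    return tilt_deg <= max_tilt_deg and roughness_m <= max_roughness_m

# A level, flat, dry, clear surface is judged landable.
print(can_land(1.2, 0.01, False, False))  # True
# A puddle at the position makes it unlandable.
print(can_land(1.2, 0.01, True, False))   # False
```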
  • the presentation unit 14 presents the candidate landing positions determined by the determination unit 13 to be landable to the user corresponding to the destination. At this time, the presentation unit 14 presents these candidate landing positions to the user in accordance with the order of priority. Specifically, the presentation unit 14 notifies the server device 50 of the candidate landing positions determined to be landable, and presentation information about them is transmitted to the user terminal 30 via the server device 50. In the user terminal 30, this presentation information is presented to the user by means of a display or the like.
  • FIG. 7 is a diagram illustrating a screen displayed on the user terminal 30 based on the presentation information.
  • the landing candidate positions determined by the determination unit 13 to be landable are presented, for example, according to the priority assigned to each landing candidate position.
  • In the example of FIG. 7, the candidate landing positions are listed from the top of the screen in order of priority: the entrance, outside the gate, inside the gate, the garden, the veranda, and the parking lot.
  • the user operates the user terminal 30 and selects a desired landing position from the plurality of landing candidate positions thus presented.
  • the candidate landing position selected as the landing position desired by the user is notified from the user terminal 30 to the drone 10 via the server device 50 .
  • As in the example of FIG. 5, the candidate landing positions presented to the user are not limited to examples represented by names such as the entrance, the garden, the veranda, the roof, outside the gate, inside the gate, and the parking lot.
  • For example, they may be represented by a photographic image of the candidate landing position (for example, taken from a distance of about 1.5 m) or by the relative position of the candidate landing position at the destination (for example, a photograph of the entire destination taken from the air).
  • the landing control unit 15 controls the flight drive mechanism 1009 while checking the position and attitude of the drone 10 with the sensor 1008 to land the drone 10 at the candidate landing position selected by the user.
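The exchange between the presentation unit 14 and the user's selection can be sketched as follows. The dictionary fields and candidate names are assumptions drawn from the FIG. 5 and FIG. 7 examples:

```python
def present_candidates(candidates):
    """Return landable candidates sorted by priority (lower number first),
    in the order the presentation unit sends them to the user terminal 30."""
    landable = [c for c in candidates if c["landable"]]
    return sorted(landable, key=lambda c: c["priority"])

candidates = [
    {"name": "garden",   "priority": 3, "landable": True},
    {"name": "entrance", "priority": 1, "landable": True},
    {"name": "veranda",  "priority": 2, "landable": False},  # obstacle found
]

presented = present_candidates(candidates)
print([c["name"] for c in presented])  # ['entrance', 'garden']

# The user terminal returns the selected name via the server device 50;
# the landing control unit 15 then lands the drone at that position.
selected = presented[0]["name"]
```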
  • the drone 10 starts flying from the departure/arrival point to the destination (step S01). After that, the drone 10 flies under the control of the server device 50 up to the destination address specified at the time of the package delivery request.
  • the inspection unit 12 inspects the candidate landing positions at the destination based on the image of the destination captured by the sensor 1008 (image sensor) (step S03). At this time, as described above, the drone 10 (inspection unit 12) may inspect the candidate landing positions after lowering its altitude to some extent from the sky above the destination and approaching the building.
  • the determination unit 13 determines whether the drone 10 can land at each landing candidate position based on the inspection result of the inspection unit 12 . Specifically, the determination unit 13 identifies the horizontality and flatness of the landing position, whether or not there is liquid, or whether or not there is an obstacle, based on the output of the ranging sensor and image sensor of the sensor 1008. Then, it is determined whether or not the drone 10 can land (step S04).
  • the presenting unit 14 presents the presenting information to the user by transmitting the presenting information regarding the candidate landing position determined by the determining unit 13 as possible to land to the user terminal 30 of the user corresponding to the destination (step S05).
  • When the user selects one of the presented candidate landing positions (step S06; YES), the landing control unit 15 lands the drone 10 at the selected candidate landing position (step S07). Note that if a predetermined period elapses without the user selecting a presented candidate landing position, the drone 10 performs predetermined error processing, such as notifying the user terminal 30 of that fact via the server device 50, and may then proceed to the next destination or return to the base.
  • As described above, in this embodiment, landing possibility is determined for each of the candidate landing positions prepared in advance, and the drone 10 can be landed at the candidate landing position desired by the user among the landable candidates. That is, it becomes possible to land the drone 10 at an appropriate position at the destination.
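The sequence from arrival over the destination to landing (steps S03 to S07) can be summarized in a short sketch. The function parameters stand in for the units described above and are placeholders, not an API defined by the publication:

```python
def delivery_flight(candidate_list, inspect, judge, ask_user, land, error_handling):
    """Sketch of the flow from the destination airspace to landing (S03-S07)."""
    # S03: inspect each candidate landing position from the destination airspace.
    inspected = {c: inspect(c) for c in candidate_list}
    # S04: judge landability from the inspection results.
    landable = [c for c, result in inspected.items() if judge(result)]
    # S05-S06: present landable candidates and wait for the user's selection.
    selected = ask_user(landable)  # None if the user does not answer in time
    if selected is None:
        return error_handling()    # notify, then next destination or return to base
    # S07: land at the selected candidate position.
    return land(selected)

# Minimal stand-ins to exercise the flow.
result = delivery_flight(
    ["entrance", "garden"],
    inspect=lambda c: {"ok": c != "garden"},  # pretend the garden has an obstacle
    judge=lambda r: r["ok"],
    ask_user=lambda cands: cands[0] if cands else None,
    land=lambda c: f"landed at {c}",
    error_handling=lambda: "error",
)
print(result)  # landed at entrance
```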
  • the present invention is not limited to the above-described embodiments.
  • the embodiment described above may be modified as follows. Also, two or more of the following modified examples may be combined for implementation.
  • [Modification 1] In the above-described embodiment, if a predetermined period elapsed without the user selecting a presented candidate landing position, the drone 10 performed predetermined error processing, such as notifying the user terminal 30 of that fact via the server device 50, and then headed for the next destination or returned to the base. Instead, when the user does not select a candidate landing position, another user acting as a substitute may select one.
  • In this case, the drone 10 or the server device 50 stores, in association with the identifier of the destination, the communication addresses of the user terminals 30 of a first user (for example, the householder who lives at the destination) and a second user (for example, a family member who lives at the same destination).
  • If the first user does not select a candidate landing position within the predetermined period, the presentation unit 14 presents to the second user corresponding to the destination the candidate landing positions determined by the determination unit 13 to be landable. That is, the presentation unit 14 notifies the server device 50 of those candidate landing positions, and the presentation information about them is transmitted via the server device 50 to the user terminal 30 of the second user.
  • In that user terminal 30, this presentation information is presented to the second user by means of a display or the like.
  • the second user operates his or her own user terminal 30 and selects a desired landing position from among the plurality of presented candidate landing positions.
  • the candidate landing position selected as the landing position desired by the second user is notified from the user terminal 30 to the drone 10 via the server device 50 .
  • the landing control unit 15 of the drone 10 lands the drone 10 at the landing candidate position selected by the second user. In this way, the chances that the drone 10 can land at the destination are increased, and the packages transported by the drone 10 can be delivered more quickly.
  • [Modification 2] In the above-described embodiment, the presentation unit 14 presented the candidate landing positions determined by the determination unit 13 to be landable to the user in order of priority, and the user selected the desired landing position.
  • Instead, the landing control unit 15 may itself select one of the candidate landing positions determined by the determination unit 13 to be landable in accordance with the priority, and land the drone 10 at the selected candidate landing position.
  • Here, selecting a candidate landing position in accordance with the order of priority includes selecting the candidate landing position with the first priority, and also selecting a candidate landing position extracted according to a condition other than priority, such as selecting the widest candidate landing position among those corresponding to a predetermined number of priorities from the top. In this way, the system can automatically land the drone 10 without user selection.
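Both automatic-selection strategies just described can be sketched in one function. The `area_m2` field and the top-3 cutoff are illustrative assumptions for the "widest among the top priorities" condition:

```python
def auto_select(landable, top_n=None):
    """Select a landing position without user input.

    top_n=None: take the highest-priority landable candidate.
    top_n=k:    restrict to the k best priorities and pick the widest
                candidate among them (a condition other than priority).
    """
    ranked = sorted(landable, key=lambda c: c["priority"])
    if top_n is None:
        return ranked[0]
    pool = ranked[:top_n]
    return max(pool, key=lambda c: c["area_m2"])

landable = [
    {"name": "entrance",    "priority": 1, "area_m2": 2.0},
    {"name": "garden",      "priority": 2, "area_m2": 30.0},
    {"name": "parking lot", "priority": 3, "area_m2": 15.0},
]

print(auto_select(landable)["name"])           # entrance (first priority)
print(auto_select(landable, top_n=3)["name"])  # garden (widest of the top 3)
```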
  • the priority in the above embodiment and modifications may change dynamically according to various conditions. For example, the priority may vary depending on the weather before or after the drone 10 lands, the time zone in which the drone 10 lands, the attributes of the cargo transported by the drone 10, or the environment of the destination of the drone 10.
  • as examples of priorities that differ depending on the weather when the drone 10 lands, or before and after landing: if rainfall is predicted within a predetermined period (for example, one hour) based on weather forecast information that the drone 10 acquires from a predetermined weather forecast information providing device, or if the humidity detected by a humidity sensor mounted on the drone 10 is equal to or higher than a threshold, the priority of candidate landing positions with a roof may be raised, or the priority of candidate landing positions without a roof may be lowered. Also, for example, if a wind volume and wind direction above a threshold can be predicted based on the weather forecast information, the priority of candidate landing positions with an obstruction such as a wall on the windward side may be raised (however, if the weight of the parcel is equal to or greater than a threshold, the priority is not changed).
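The weather-dependent adjustment described above can be sketched as follows. This is a hypothetical illustration: the thresholds, the field names (`has_roof`, `windward_wall`), and the function itself are assumptions, with a smaller priority value meaning a higher priority.

```python
def adjust_priorities_for_weather(candidates, rain_expected, wind_speed_mps,
                                  parcel_weight_kg,
                                  wind_threshold_mps=10.0,
                                  weight_threshold_kg=5.0):
    """Return copies of the candidates with weather-adjusted priorities.

    Subtracting 1 from the priority value raises a candidate's priority.
    """
    adjusted = []
    for c in candidates:
        prio = c["priority"]
        if rain_expected and c.get("has_roof"):
            prio -= 1  # raise roofed positions when rain is predicted
        if (wind_speed_mps >= wind_threshold_mps
                and c.get("windward_wall")
                and parcel_weight_kg < weight_threshold_kg):
            prio -= 1  # raise sheltered positions, unless the parcel is heavy
        adjusted.append({**c, "priority": prio})
    return adjusted

candidates = [
    {"name": "entrance", "priority": 2, "has_roof": True,  "windward_wall": False},
    {"name": "garden",   "priority": 1, "has_roof": False, "windward_wall": True},
]
result = adjust_priorities_for_weather(candidates, rain_expected=True,
                                       wind_speed_mps=12.0, parcel_weight_kg=8.0)
print([(c["name"], c["priority"]) for c in result])  # [('entrance', 1), ('garden', 1)]
```

Note that the heavy-parcel exception leaves the garden's priority unchanged here even though strong wind is forecast.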
  • as examples of priorities that differ depending on the time zone in which the drone 10 lands: in late-night hours, the priority of candidate landing positions that are easily visible from the road adjacent to the destination, such as a parking lot or entrance, may be lowered, while in early morning hours (for example, 5 to 6 o'clock), the priority of candidate landing positions that are easily noticed by a user going out, such as the entrance, may be raised.
  • as examples of priorities that differ depending on the attributes of the cargo transported by the drone 10: when the price or importance of the cargo is high, the priority of candidate landing positions that are easily visible from the road adjacent to the destination, such as a parking lot or entrance, may be lowered from the viewpoint of safety; or when the exterior color of the cargo and the color of a candidate landing position are within a similar range, the priority of that candidate landing position may be lowered from the viewpoint of the inconspicuousness of the cargo.
  • as examples of priorities that differ depending on the environment of the destination of the drone 10: when, by referring to a crime map, the crime risk in the vicinity of the destination is found to be above a certain level, or when there is a large-scale facility such as a school in the vicinity of the destination, the priority of candidate landing positions that are easily visible from the road adjacent to the destination, such as a parking lot or entrance, may be lowered from the viewpoint of safety. In addition, when animals such as dogs and cats are at the destination, high candidate landing positions that animals cannot reach (such as verandas) and candidate landing positions far from areas where animals are present may be given higher priority. Lowering the priority of a candidate landing position where there is a car stop is also conceivable.
  • the expression "lowering the priority" includes excluding the position from the candidate landing positions.
  • the priority in the above embodiment and modification may vary depending on whether or not the user is at home at the destination.
  • in this case, the drone 10 includes a first determination unit that determines whether or not the user corresponding to the destination is located at the destination, and the priority is set according to the result of that determination. When the user corresponding to the destination is located at the destination, the priority of candidate landing positions that are easy for the user to access, such as in front of the entrance or a window, may be raised; when the user is not located at the destination, the priority of candidate landing positions such as a garden that is hard to see from the road adjacent to the destination or a veranda that is difficult for others to access may be raised from the viewpoint of safety.
  • as a method of determining whether the user is at home: when the drone 10 approaches the vicinity of the destination, the drone 10 may notify the user terminal 30 via the server device 50; in response to the notification, the user selects whether or not he or she is at home, and the selection result is notified from the user terminal 30 to the drone 10 via the server device 50.
  • alternatively, the drone 10 may call the fixed telephone of the destination, either directly or via the server device 50, and determine that the user is at home if there is a response from the fixed telephone; a reply as to whether or not the parcel can be received may also be made by operating the fixed telephone.
  • instead of a fixed telephone, a smartphone such as the user terminal 30 or a mobile phone may be used.
  • alternatively, when the user has permitted an application of the user terminal 30 to acquire position information, and the drone 10 approaches the vicinity of the destination, the drone 10 may notify the user terminal 30 to that effect via the server device 50, and if the location information of the user terminal 30 falls within a range that is substantially the same as the location of the destination, it may be determined that the user is located at the destination.
  • alternatively, the user's at-home/away state may be manually input into the application of the user terminal 30, and when the drone 10 approaches the vicinity of the destination, that state may be acquired from the user terminal 30 via the server device 50. Also, when the drone 10 approaches the vicinity of the destination, the power usage status at the destination may be monitored using smart meter technology or the like to estimate whether the user is at the destination. For example, if the amount of power used within a certain period is equal to or greater than a threshold, or if the amount of power used fluctuates beyond a threshold within a certain period, it is determined that the user is located at the destination.
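The smart-meter estimate described above can be sketched as follows; the threshold values and the function itself are illustrative placeholders, not figures from the patent.

```python
def estimate_at_home(power_readings_w, usage_threshold_w=500.0,
                     fluctuation_threshold_w=200.0):
    """Estimate presence from a window of power readings (in watts).

    Returns True if either the average usage over the window is at or
    above a threshold, or the usage fluctuates beyond a threshold.
    """
    if not power_readings_w:
        return False
    average = sum(power_readings_w) / len(power_readings_w)
    fluctuation = max(power_readings_w) - min(power_readings_w)
    return average >= usage_threshold_w or fluctuation > fluctuation_threshold_w

print(estimate_at_home([600.0, 650.0, 620.0]))  # True  (high average usage)
print(estimate_at_home([100.0, 110.0, 105.0]))  # False (low, steady usage)
print(estimate_at_home([100.0, 400.0, 120.0]))  # True  (large fluctuation)
```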
  • alternatively, when the drone 10 approaches the vicinity of the destination, the drone 10 may capture an image of the interior of the building corresponding to the destination from its balcony or window with the image sensor, and determine by image recognition whether the lights are on or people are present.
  • alternatively, when the drone 10 approaches the vicinity of the destination, the drone 10 may make a sound in front of the entrance or the like and then capture an image with the image sensor; if a reaction from the user is recognized in the image, it is determined that the user is at the destination.
  • alternatively, when the drone 10 approaches the vicinity of the destination, it may fly around the building of the destination and capture images with the image sensor.
  • alternatively, a light mounted on the drone 10 may emit light toward a window of the building corresponding to the destination, and an image may be captured with the image sensor.
  • alternatively, the drone 10 may communicate with the intercom of the building corresponding to the destination (or press the intercom button) to sound the intercom, capture images with the image sensor, and determine that the user is at the destination if a reaction from the user is observed. As described above, the drone 10 can be landed at an appropriate candidate landing position depending on whether or not the user is at the destination.
  • the priority may change according to the level of traffic around the destination.
  • in this case, the drone 10 includes a second determination unit that determines the number or density of people in a predetermined area including the destination, and the priority is set according to the determination result of the second determination unit.
  • to determine the number or density of people in a predetermined area including the destination, for example, the drone 10 may perform image recognition of the state of pedestrian traffic in the vicinity of the destination from the sky and determine whether the traffic is equal to or greater than a threshold; statistical information such as Mobile Spatial Statistics (registered trademark) provided by NTT DoCoMo, Inc. may be used; or congestion information provided by a congestion information providing device may be used. For example, if the number or density of people in the predetermined area including the destination is above a threshold, the priority of candidate landing positions that are easily visible from the road adjacent to the destination, such as a parking lot or entrance, may be lowered from the viewpoint of safety.
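The density-threshold rule above can be sketched as follows, assuming a larger priority value means a lower rank; the density threshold and the field names (`visible_from_road`) are hypothetical.

```python
def adjust_for_crowding(candidates, people_count, area_km2,
                        density_threshold=1000.0):
    """Demote road-visible candidates when the surrounding area is crowded.

    Returns copies of the candidates; adding 1 to the priority value
    lowers a candidate's priority.
    """
    density = people_count / area_km2
    if density < density_threshold:
        return [dict(c) for c in candidates]
    return [
        {**c, "priority": c["priority"] + 1} if c.get("visible_from_road")
        else dict(c)
        for c in candidates
    ]

candidates = [
    {"name": "entrance", "priority": 1, "visible_from_road": True},
    {"name": "garden",   "priority": 2, "visible_from_road": False},
]
crowded = adjust_for_crowding(candidates, people_count=3000, area_km2=1.0)
print([(c["name"], c["priority"]) for c in crowded])  # [('entrance', 2), ('garden', 2)]
```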
  • landing control of the drone 10 may be performed by so-called edge computing (control by the drone), by cloud computing (control by the server device), or by cooperation of both (control by the drone and the server device) as described in the embodiment. Accordingly, the control device of the present invention may be provided in the server device 50.
  • the unmanned flying object is not limited to what is called a drone, and may be of any structure and form as long as it can transport cargo.
  • the present invention can be applied to a manned flying object that has a human on board but that operates automatically.
  • in the embodiment, the flying object (drone 10) that transports cargo lands at the destination, but the present invention can also be applied to the landing of a flying object in a scene in which cargo is received and held at the landing position and the flying object then takes off to the next destination.
  • the purpose or application of the flying object is not limited to the transport of cargo as exemplified in the embodiment, and may be any purpose such as measuring or photographing some object. That is, the present invention can be applied whenever the flying object lands, regardless of its flight purpose or application.
  • in the embodiment, an image sensor is used as the imaging means provided in the sensor 1008 of the drone 10 for the inspection of the destination, but the inspection method is not limited to this example; for example, a technology called LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), a technology called SLAM (Simultaneous Localization and Mapping), or any other technique capable of sensing shape, size, or the like can be used. That is, it suffices that the sensor 1008 includes a sensor capable of inspecting the candidate landing positions at the destination and that the determination unit 13 determines whether or not the drone 10 can land at each candidate landing position based on the inspection result.
  • in the embodiment, the determination unit 13 determines whether the flying object can land based on the levelness, flatness, presence of liquid, or presence of obstacles at the landing position, but the determination may be made based on conditions other than these. Specifically, for example, the material or temperature of the landing position, or whether or not it is covered with snow, can be considered. The material of the landing position and the presence or absence of snow cover are determined based on the output of the image sensor of the sensor 1008, for example, and the temperature of the landing position is determined based on the output of a non-contact temperature sensor included in the sensor 1008.
  • in short, the determination unit 13 can determine whether the flying object can land based on at least one of a plurality of types of landing position conditions, such as levelness, flatness, material, temperature, or the presence or absence of liquid or snow at the landing position.
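The multi-condition determination described above can be sketched as a single check over several of the condition types named in the text. The field names and thresholds are illustrative assumptions; a real determination unit would derive them from the outputs of the sensor 1008.

```python
def can_land(inspection, max_tilt_deg=5.0, max_roughness=0.02):
    """Return True if the inspected position satisfies every condition."""
    return (inspection["tilt_deg"] <= max_tilt_deg        # levelness
            and inspection["roughness"] <= max_roughness  # flatness
            and not inspection["has_liquid"]
            and not inspection["has_snow"]
            and not inspection["has_obstacle"])

dry_flat = {"tilt_deg": 1.0, "roughness": 0.01,
            "has_liquid": False, "has_snow": False, "has_obstacle": False}
wet = dict(dry_flat, has_liquid=True)
print(can_land(dry_flat))  # True
print(can_land(wet))       # False
```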
  • each functional block may be implemented by one physically and/or logically coupled device, or by two or more physically and/or logically separate devices connected directly and/or indirectly (for example, by wire and/or wirelessly).
  • one computer may have the functions of the user terminals 30 to 32 exemplified in the embodiments.
  • each function exemplified in FIG. 5 may be provided in any of the devices constituting the drone management system 1 as an information processing system.
  • the server device 50 can directly control the drone 10; for example, the server device 50 may have a function corresponding to the processing unit 313 and directly restrict the flight of the drone 10.
  • the aspects and embodiments described in this specification may be applied to systems utilizing LTE (Long Term Evolution), LTE-A (Long Term Evolution-Advanced), SUPER 3G, IMT-Advanced, 4G, 5G, FRA (Future Radio Access), W-CDMA (registered trademark), GSM (registered trademark), CDMA2000 (Code Division Multiple Access 2000), UMB (Ultra Mobile Broadband), IEEE 802.11 (Wi-Fi), IEEE 802.16 (WiMAX), IEEE 802.20, UWB (Ultra-WideBand), and/or other appropriate systems.
  • the information or parameters described in this specification may be represented by absolute values, relative values from a predetermined value, or other corresponding information.
  • the terms "determining" and "deciding" as used herein may encompass a wide variety of actions. "Determining" and "deciding" may include, for example, regarding judging, calculating, computing, processing, deriving, investigating, looking up (e.g., searching in a table, a database, or another data structure), or ascertaining as having "determined" or "decided". "Determining" and "deciding" may also include regarding receiving (e.g., receiving information), transmitting (e.g., transmitting information), input, output, or accessing (e.g., accessing data in memory) as having "determined" or "decided".
  • the present invention may be provided as an information processing method or as a program.
  • the program may be provided in a form recorded on a recording medium such as an optical disc, or may be provided in a form in which the program is downloaded to a computer via a network such as the Internet and installed so as to be usable.
  • Software, instructions, etc. may be transmitted and received via a transmission medium.
  • for example, when software is transmitted from a website, server, or other remote source using wired technology and/or wireless technology, these wired and/or wireless technologies are included within the definition of transmission media.
  • data, instructions, commands, information, signals, bits, symbols, chips, and the like may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, light fields or photons, or any combination of these.
  • references to elements using designations such as "first" and "second" as used herein do not generally limit the quantity or order of those elements. These designations may be used herein as a convenient way of distinguishing between two or more elements. Thus, references to first and second elements do not imply that only two elements may be employed, or that the first element must precede the second element in some way.
  • 1: drone management system, 10: drone, 11: acquisition unit, 12: inspection unit, 13: determination unit, 14: presentation unit, 15: landing control unit, 30: user terminal, 40: wireless communication network, 50: server device, 1001: processor, 1002: memory, 1003: storage, 1004: communication device, 1005: input device, 1006: output device, 1007: positioning device, 1008: sensor, 1009: flight drive mechanism, 5001: processor, 5002: memory, 5003: storage, 5004: communication device.

Abstract

In the present invention, after a drone 10 reaches the sky above a building designated as a destination of the drone 10, an inspection unit 12 inspects candidate landing positions at the destination on the basis of an image of the entire site including the building as captured by a sensor 1008 (image sensor) of the drone 10, and a list of candidate landing positions at the destination. An assessment unit 13 assesses whether the drone 10 can land at each of the candidate landing positions on the basis of a result obtained when the drone 10, which has reached the sky at the destination, inspects the destination on the basis of the candidate landing position list. A presentation unit 14 presents a user with the candidate landing positions at which the assessment unit 13 has assessed that the drone can land. A landing control unit 15 controls a flight driving mechanism 1009 to cause the drone 10 to land at any candidate landing position selected by the user from among the candidate landing positions at which the assessment unit 13 has assessed that the drone can land.

Description

Control device and program
 The present invention relates to a technology for landing a flying object.
 With the spread of unmanned flying objects called drones, various mechanisms for using drones to deliver packages have been proposed. For example, Patent Literature 1 describes a mechanism in which a landing pad is provided in the landing zone of a drone's delivery destination, and the drone is guided to the landing pad by a visual assistance device, an optical assistance device, or a wireless assistance device.
Japanese Patent No. 6622291
 In the mechanism of Patent Literature 1, a dedicated facility called a landing pad must be provided at the delivery destination of the drone. Destinations for drones include, for example, dwelling units of various sizes and shapes, and it is considered that there are constraints on uniformly providing landing pads for all of them.
 An object of the present invention is to land a flying object at an appropriate position at the destination of the flying object.
 The present invention provides a control device comprising: an acquisition unit that acquires information on a plurality of candidate landing positions at a destination of a flying object; a determination unit that determines, based on a result of the flying object that has reached the sky above the destination inspecting the destination based on the information, whether or not the flying object can land at each of the candidate landing positions; and a landing control unit that lands the flying object at a candidate landing position selected from among the candidate landing positions determined to be landable.
 According to the present invention, it is possible to land a flying object at an appropriate position at the destination of the flying object.
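The three-part structure summarized above (acquisition unit, determination unit, landing control unit) can be sketched as the following control flow. The class and its method names are illustrative assumptions, not the patent's actual implementation.

```python
class LandingController:
    def __init__(self, candidate_positions):
        # acquisition unit: information on candidate landing positions
        self.candidates = list(candidate_positions)

    def landable(self, inspect):
        # determination unit: keep candidates the inspection deems landable
        return [c for c in self.candidates if inspect(c)]

    def land(self, inspect, choose):
        # landing control unit: land at the candidate picked by `choose`
        options = self.landable(inspect)
        return choose(options) if options else None

controller = LandingController(["entrance", "garden", "veranda"])
landing = controller.land(inspect=lambda c: c != "entrance",
                          choose=lambda opts: opts[0])
print(landing)  # garden
```

Here `inspect` stands in for the on-site inspection result and `choose` for either the user's selection or a priority-based selection rule.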
FIG. 1 is a block diagram showing an example configuration of a drone management system 1 according to one embodiment of the present invention. FIG. 2 is a block diagram showing an example hardware configuration of the drone 10 according to the embodiment. FIG. 3 is a block diagram showing an example hardware configuration of the server device 50 according to the embodiment. FIG. 4 is a block diagram showing an example functional configuration of the drone 10. FIG. 5 is a diagram illustrating a candidate landing position list according to the embodiment. FIG. 6 is a bird's-eye view illustrating the structure of the destination of the drone 10 according to the embodiment. FIG. 7 is a diagram illustrating a screen displayed on the user terminal 30 according to the embodiment. FIG. 8 is a flowchart illustrating the procedure of processing by the drone 10.
[Configuration]
 FIG. 1 is a diagram showing an example configuration of a drone management system 1 according to an embodiment of the information processing system of the present invention. The drone management system 1 includes a drone 10 that transports a package to a destination, a user terminal 30 used by a user living in the building that is the destination, a wireless communication network 40, and a server device 50 connected to the wireless communication network 40. Although FIG. 1 shows one each of the drone 10, the user terminal 30, the wireless communication network 40, and the server device 50, there may be a plurality of each.
 The drone 10 is an unmanned flying object that flies in the air. The drone 10 transports cargo by holding the cargo, flying to the destination, and landing at the destination.
 The user terminal 30 is a computer capable of communication, such as a smartphone, a tablet, or a personal computer. In this embodiment, the user terminal 30 is a smartphone and functions as a communication terminal for the user who receives the parcel to access the server device 50 via the wireless communication network 40.
 The server device 50 stores flight plan information such as the flight date and time, flight route, and flight altitude of the drone 10, and remotely steers the drone according to that flight plan information. Remote control by the server device 50 mainly covers the section between the drone's departure/arrival point, called a base, and the airspace above the drone's destination. In the section between the airspace above the destination and the drone's landing position, flight is performed under autonomous control by the drone itself. Specifically, the drone 10 detects candidate landing positions (for example, a door, veranda, gate, parking lot, warehouse, or garden) fixed to the building corresponding to the destination or the site containing that building, determines whether or not it can land at or near each candidate landing position, and then lands.
 In this embodiment, as described above, the section between the drone's departure/arrival point and the airspace above the destination relies on remote control by the server device 50, and the section between the airspace above the destination and the drone's landing position is realized by autonomous flight by the drone itself; however, the invention is not limited to this example. For example, the drone 10 may fly autonomously over the entire section between the departure/arrival point and the landing position at the destination without relying on remote control by the server device 50, or it may fly under the remote control of the server device 50 over that entire section.
 The wireless communication network 40 may be, for example, equipment conforming to the fourth-generation mobile communication system or equipment conforming to the fifth-generation mobile communication system.
 FIG. 2 is a diagram showing an example of the hardware configuration of the drone 10. The drone 10 is physically configured as a computer device including a processor 1001, a memory 1002, a storage 1003, a communication device 1004, an input device 1005, an output device 1006, a positioning device 1007, a sensor 1008, a flight drive mechanism 1009, and a bus connecting these. In the following description, the word "device" can be read as a circuit, a device, a unit, or the like. The hardware configuration of the drone 10 may include one or more of each device shown in the figure, or may be configured without some of the devices.
 Each function of the drone 10 is realized by loading predetermined software (a program) onto hardware such as the processor 1001 and the memory 1002, whereby the processor 1001 performs arithmetic operations, controls communication by the communication device 1004, controls at least one of reading and writing of data in the memory 1002 and the storage 1003, and controls the positioning device 1007, the sensor 1008, and the flight drive mechanism 1009.
 The processor 1001, for example, runs an operating system and controls the entire computer. The processor 1001 may be configured as a central processing unit (CPU) including an interface with peripheral devices, a control device, an arithmetic device, registers, and the like. For example, a baseband signal processing unit, a call processing unit, and the like may also be implemented by the processor 1001.
 The processor 1001 reads programs (program code), software modules, data, and the like from at least one of the storage 1003 and the communication device 1004 into the memory 1002, and executes various processes according to them. As the program, a program that causes a computer to execute at least part of the operations described below is used. The functional blocks of the drone 10 may be realized by a control program stored in the memory 1002 and running on the processor 1001. The various processes may be executed by one processor 1001, or may be executed simultaneously or sequentially by two or more processors 1001. The processor 1001 may be implemented by one or more chips. The program may be transmitted to the drone 10 via the wireless communication network 40.
 The memory 1002 is a computer-readable recording medium, and may be composed of at least one of, for example, ROM, EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), and RAM. The memory 1002 may also be called a register, a cache, a main memory (main storage device), or the like. The memory 1002 can store executable programs (program code), software modules, and the like for implementing the method according to this embodiment.
 The storage 1003 is a computer-readable recording medium, and may be composed of at least one of, for example, an optical disc such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory (for example, a card, a stick, or a key drive), a floppy (registered trademark) disk, and a magnetic strip. The storage 1003 may also be called an auxiliary storage device. The storage 1003 stores various programs and data groups.
 The processor 1001, the memory 1002, and the storage 1003 described above function as an example of the control device of the present invention.
 The communication device 1004 is hardware (a transmitting/receiving device) for communication between computers via the wireless communication network 40, and is also called, for example, a network device, a network controller, a network card, or a communication module. The communication device 1004 includes a high-frequency switch, a duplexer, a filter, a frequency synthesizer, and the like in order to realize frequency division duplexing and time division duplexing. A transmitting/receiving antenna, an amplifier unit, a transmitting/receiving unit, a transmission line interface, and the like may be realized by the communication device 1004. The transmitting/receiving unit may be implemented with the transmitting unit and the receiving unit physically or logically separated.
 The input device 1005 is an input device that receives input from the outside, and includes, for example, keys, switches, and a microphone. The output device 1006 is an output device that performs output to the outside, and includes, for example, a display device such as a liquid crystal display and a speaker. The input device 1005 and the output device 1006 may be integrated.
 The positioning device 1007 is hardware that measures the position of the drone 10, for example a GPS (Global Positioning System) device. The drone 10 flies from the departure/arrival point to the airspace above the destination based on positioning by the positioning device 1007.
 The sensor 1008 includes a ranging sensor that functions as altitude measurement means for the drone 10 and as means for checking the condition of the landing position, a gyro sensor and a direction sensor that function as attitude measurement means for the drone 10, an image sensor that functions as imaging means, and the like.
 飛行駆動機構1009は、ドローン10が飛行を行うためのモータ及びプロペラ等のハードウェアを備える。 The flight drive mechanism 1009 includes hardware such as motors and propellers for the drone 10 to fly.
 プロセッサ1001、メモリ1002などの各装置は、情報を通信するためのバスによって接続される。バスは、単一のバスを用いて構成されてもよいし、装置間ごとに異なるバスを用いて構成されてもよい。また、ドローン10は、マイクロプロセッサ、GPU(Graphics Processing Unit)、デジタル信号プロセッサ(DSP:Digital Signal Processor)、ASIC(Application Specific Integrated Circuit)、PLD(Programmable Logic Device)、FPGA(Field Programmable Gate Array)などのハードウェアを含んで構成されてもよく、当該ハードウェアにより、各機能ブロックの一部又は全てが実現されてもよい。例えば、プロセッサ1001は、これらのハードウェアの少なくとも1つを用いて実装されてもよい。 Each device such as the processor 1001 and the memory 1002 is connected by a bus for communicating information. The bus may be configured as a single bus, or as different buses between different pairs of devices. Further, the drone 10 may be configured to include hardware such as a microprocessor, a GPU (Graphics Processing Unit), a digital signal processor (DSP), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array), and part or all of each functional block may be realized by that hardware. For example, the processor 1001 may be implemented using at least one of these pieces of hardware.
 図3は、サーバ装置50のハードウェア構成を示す図である。サーバ装置50のハードウェア構成は、図3に示した各装置を1つ又は複数含むように構成されてもよいし、一部の装置を含まずに構成されてもよい。また、それぞれ筐体が異なる複数の装置が通信接続されて、サーバ装置50を構成してもよい。 FIG. 3 is a diagram showing the hardware configuration of the server device 50. The hardware configuration of the server device 50 may be configured to include one or more of each of the devices shown in FIG. 3, or may be configured without some of those devices. Further, the server device 50 may be configured by communicatively connecting a plurality of devices housed in separate enclosures.
 サーバ装置50は、物理的には、プロセッサ5001、メモリ5002、ストレージ5003、通信装置5004、及びこれらを接続するバスなどを含むコンピュータ装置として構成されている。サーバ装置50における各機能は、プロセッサ5001、メモリ5002などのハードウェア上に所定のソフトウェア(プログラム)を読み込ませることによって、プロセッサ5001が演算を行い、通信装置5004による通信を制御したり、メモリ5002及びストレージ5003におけるデータの読み出し及び書き込みの少なくとも一方を制御したりすることによって実現される。これらの各装置は図示せぬ電源から供給される電力によって動作する。なお、以下の説明では、「装置」という文言は、回路、デバイス、ユニットなどに読み替えることができる。 The server device 50 is physically configured as a computer device including a processor 5001, a memory 5002, a storage 5003, a communication device 5004, and a bus connecting them. Each function in the server device 50 is realized by loading predetermined software (a program) onto hardware such as the processor 5001 and the memory 5002, whereby the processor 5001 performs computations, controls communication by the communication device 5004, and controls at least one of reading and writing of data in the memory 5002 and the storage 5003. Each of these devices operates with power supplied from a power source (not shown). Note that in the following description, the term "apparatus" can be read as a circuit, a device, a unit, or the like.
 プロセッサ5001は、例えば、オペレーティングシステムを動作させてコンピュータ全体を制御する。プロセッサ5001は、周辺装置とのインターフェース、制御装置、演算装置、レジスタなどを含む中央処理装置(CPU)によって構成されてもよい。また、例えばベースバンド信号処理部や呼処理部などがプロセッサ5001によって実現されてもよい。 A processor 5001, for example, operates an operating system to control the entire computer. The processor 5001 may be configured by a central processing unit (CPU) including interfaces with peripheral devices, a control unit, an arithmetic unit, registers, and the like. Also, for example, a baseband signal processing unit, a call processing unit, and the like may be implemented by the processor 5001 .
 プロセッサ5001は、プログラム(プログラムコード)、ソフトウェアモジュール、データなどを、ストレージ5003及び通信装置5004の少なくとも一方からメモリ5002に読み出し、これらに従って各種の処理を実行する。プログラムとしては、後述する動作の少なくとも一部をコンピュータに実行させるプログラムが用いられる。ドローン10の機能ブロックは、メモリ5002に格納され、プロセッサ5001において動作する制御プログラムによって実現されてもよい。各種の処理は、1つのプロセッサ5001によって実行されてもよいが、2以上のプロセッサ5001により同時又は逐次に実行されてもよい。プロセッサ5001は、1以上のチップによって実装されてもよい。 The processor 5001 reads programs (program codes), software modules, data, etc. from at least one of the storage 5003 and the communication device 5004 to the memory 5002, and executes various processes according to them. As the program, a program that causes a computer to execute at least part of the operations described below is used. The functional blocks of drone 10 may be implemented by a control program stored in memory 5002 and running on processor 5001 . Various types of processing may be executed by one processor 5001, but may also be executed by two or more processors 5001 simultaneously or sequentially. Processor 5001 may be implemented by one or more chips.
 メモリ5002は、コンピュータ読み取り可能な記録媒体であり、例えば、ROM、EPROM、EEPROM、RAMなどの少なくとも1つによって構成されてもよい。メモリ5002は、レジスタ、キャッシュ、メインメモリ(主記憶装置)などと呼ばれてもよい。メモリ5002は、本実施形態に係る方法を実施するために実行可能なプログラム(プログラムコード)、ソフトウェアモジュールなどを保存することができる。 The memory 5002 is a computer-readable recording medium, and may be composed of at least one of ROM, EPROM, EEPROM, and RAM, for example. The memory 5002 may also be called a register, cache, main memory (main storage device), or the like. The memory 5002 can store executable programs (program code), software modules, etc. for performing methods according to the present invention.
 ストレージ5003は、コンピュータ読み取り可能な記録媒体であり、例えば、CD-ROMなどの光ディスク、ハードディスクドライブ、フレキシブルディスク、光磁気ディスク(例えば、コンパクトディスク、デジタル多用途ディスク、Blu-ray(登録商標)ディスク)、スマートカード、フラッシュメモリ(例えば、カード、スティック、キードライブ)、フロッピー(登録商標)ディスク、磁気ストリップなどの少なくとも1つによって構成されてもよい。ストレージ5003は、補助記憶装置と呼ばれてもよい。ストレージ5003は、少なくとも、後述するような各種処理を実行するためのプログラム及びデータ群を記憶している。 The storage 5003 is a computer-readable recording medium, for example, an optical disk such as a CD-ROM, a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disk, a digital versatile disk, a Blu-ray (registered trademark) disk ), smart card, flash memory (eg, card, stick, key drive), floppy disk, magnetic strip, and/or the like. Storage 5003 may be called an auxiliary storage device. The storage 5003 stores at least programs and data groups for executing various processes described later.
 通信装置5004は、無線通信網40を介してコンピュータ間の通信を行うためのハードウェア(送受信デバイス)であり、例えばネットワークデバイス、ネットワークコントローラ、ネットワークカード、通信モジュールなどともいう。 The communication device 5004 is hardware (transmitting/receiving device) for communicating between computers via the wireless communication network 40, and is also called a network device, a network controller, a network card, a communication module, or the like.
 プロセッサ5001、メモリ5002などの各装置は、情報を通信するためのバスによって接続される。バスは、単一のバスを用いて構成されてもよいし、装置間ごとに異なるバスを用いて構成されてもよい。 Each device such as the processor 5001 and memory 5002 is connected by a bus for communicating information. The bus may be configured using a single bus, or may be configured using different buses between devices.
 サーバ装置50は、マイクロプロセッサ、デジタル信号プロセッサ、ASIC、PLD、FPGAなどのハードウェアを含んで構成されてもよく、当該ハードウェアにより、各機能ブロックの一部又は全てが実現されてもよい。例えば、プロセッサ5001は、これらのハードウェアの少なくとも1つを用いて実装されてもよい。 The server device 50 may be configured including hardware such as a microprocessor, digital signal processor, ASIC, PLD, and FPGA, and part or all of each functional block may be realized by the hardware. For example, processor 5001 may be implemented using at least one of these pieces of hardware.
 図4は、ドローン10の機能構成の一例を示す図である。ドローン10によって実現される各機能は、プロセッサ1001、メモリ1002などのハードウェア上に所定のソフトウェア(プログラム)を読み込ませることによって、プロセッサ1001が演算を行い、通信装置1004を制御したり、メモリ1002及びストレージ1003におけるデータの読み出し及び書き込みの少なくとも一方を制御したりすることによって実現される。具体的には、ドローン10において、取得部11、検査部12、判断部13、提示部14及び着陸制御部15という機能が実現される。 FIG. 4 is a diagram showing an example of the functional configuration of the drone 10. Each function realized by the drone 10 is implemented by loading predetermined software (a program) onto hardware such as the processor 1001 and the memory 1002, whereby the processor 1001 performs computations, controls the communication device 1004, and controls at least one of reading and writing of data in the memory 1002 and the storage 1003. Specifically, in the drone 10, the functions of an acquisition unit 11, an inspection unit 12, a determination unit 13, a presentation unit 14, and a landing control unit 15 are realized.
 取得部11は、サーバ装置50等の外部装置から各種のデータを取得する。例えば、取得部11は、サーバ装置50による各種の指示や命令のほか、ドローン10の目的地における複数の着陸候補位置に関する着陸候補位置情報をサーバ装置50から取得する。着陸候補位置とは、ドローン10が目的地において着陸する候補となる位置のことである。具体的には、着陸候補位置は例えば玄関、ドア、ベランダ、門扉、駐車場、倉庫、庭、屋上、通路、軒下などの、目的地における各種構造物や各種設備であり、一般的な住戸に備えられている可能性が高いものである。 The acquisition unit 11 acquires various data from an external device such as the server device 50. For example, in addition to various instructions and commands from the server device 50, the acquisition unit 11 acquires from the server device 50 landing candidate position information on a plurality of candidate landing positions at the destination of the drone 10. A candidate landing position is a position that is a candidate for where the drone 10 lands at the destination. Specifically, candidate landing positions are various structures and facilities at the destination, such as an entrance, a door, a veranda, a gate, a parking lot, a warehouse, a garden, a rooftop, a passage, or the space under the eaves, which a typical dwelling is likely to have.
 ここで、図5は、着陸候補位置情報に相当する着陸候補位置リストを例示する図である。着陸候補位置リストには、各々の着陸候補位置に対して、ドローン10の目的地に居住するユーザにより指定された着陸許可の有無と、ドローン10の着陸位置としての優先順位とが対応付けられている。図5の例では、例えば「D00125」という識別子で識別される目的地において、玄関、庭、ベランダ、屋上、門扉外側、門扉内側、駐車場という予め決められた着陸候補位置のうち、屋上を除く着陸候補位置に対して着陸の許可がなされている。ユーザは、ドローン10の目的地に相当する自身の自宅において、ドローン10が着陸するのに適していると考える着陸候補位置に対して着陸許可を予め与えておき、ドローン10が着陸するのに適していないと考える着陸候補位置に対して着陸許可を与えないようにする。このような着陸許可の有無については、ユーザがユーザ端末30を用いて事前にサーバ装置50にアクセスしておき、任意の内容を登録することができる。着陸候補位置リストにおける優先順位については、着陸許可の有無と同様にユーザによって予め登録されていてもよいし、システム運営者が予め決めておいてもよい。 Here, FIG. 5 is a diagram illustrating a landing candidate position list corresponding to the landing candidate position information. In the landing candidate position list, each candidate landing position is associated with the presence or absence of landing permission designated by the user residing at the destination of the drone 10, and with its priority as a landing position for the drone 10. In the example of FIG. 5, at the destination identified by the identifier "D00125", landing permission has been granted to all of the predetermined candidate landing positions (the entrance, the garden, the veranda, the rooftop, the outside of the gate, the inside of the gate, and the parking lot) except the rooftop. At his or her own home corresponding to the destination of the drone 10, the user grants landing permission in advance to candidate landing positions considered suitable for the drone 10 to land on, and withholds it from candidate positions considered unsuitable. The user can access the server device 50 in advance using the user terminal 30 and register arbitrary settings for such landing permission. The priorities in the landing candidate position list may be registered in advance by the user in the same manner as the landing permission, or may be determined in advance by the system operator.
 この着陸候補位置リストは各々のユーザによって目的地ごとに指定されてサーバ装置50に登録されるようになっており、取得部11は、この着陸候補位置リストを無線通信網40経由でサーバ装置50から取得する。なお、着陸候補位置リストにおける着陸候補位置は、玄関、庭、ベランダ、屋上、門扉外側、門扉内側、駐車場といった名称で表現される例に限らず、例えばドローン10やユーザが所有するカメラで撮影した着陸候補位置の写真画像(例えば着陸候補位置から1.5mほど離れたところから撮影したもの)や、目的地における着陸候補位置の相対的な位置(目的地上空から撮影した目的地全体の写真画像において各着陸候補位置を示したものなど)で表現されていてもよい。なお、図5に例示した目的地の識別子は、規則的に割り当てられた文字列で表現されていてもよいし、目的地の位置情報(緯度経度など)で表現されていてもよい。 This landing candidate position list is designated by each user for each destination and registered in the server device 50, and the acquisition unit 11 acquires it from the server device 50 via the wireless communication network 40. Note that the candidate landing positions in the list are not limited to being represented by names such as the entrance, the garden, the veranda, the rooftop, the outside of the gate, the inside of the gate, and the parking lot; they may, for example, be represented by a photographic image of the candidate landing position taken by the drone 10 or by a camera owned by the user (e.g., taken from about 1.5 m away from the candidate position), or by the relative position of the candidate within the destination (e.g., an image of the entire destination taken from above, with each candidate landing position indicated). The destination identifiers illustrated in FIG. 5 may be represented by regularly assigned character strings, or by location information of the destination (such as latitude and longitude).
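The landing candidate position list of FIG. 5 can be illustrated with a minimal data-structure sketch. This is an assumption for illustration only: the field names (`destination_id`, `permitted`, `priority`) and the dictionary layout are hypothetical and do not appear in the original description.

```python
# Hypothetical sketch of the landing candidate position list of Fig. 5.
# Field names are illustrative assumptions, not part of the actual system.
candidate_list = {
    "destination_id": "D00125",
    "candidates": [
        {"name": "entrance",     "permitted": True,  "priority": 1},
        {"name": "gate_outside", "permitted": True,  "priority": 2},
        {"name": "gate_inside",  "permitted": True,  "priority": 3},
        {"name": "garden",       "permitted": True,  "priority": 4},
        {"name": "veranda",      "permitted": True,  "priority": 5},
        {"name": "parking_lot",  "permitted": True,  "priority": 6},
        {"name": "rooftop",      "permitted": False, "priority": 7},  # user denied landing here
    ],
}

def permitted_candidates(clist):
    """Return only the candidates the user has granted landing permission for,
    ordered by their registered priority (smaller number = higher priority)."""
    allowed = [c for c in clist["candidates"] if c["permitted"]]
    return sorted(allowed, key=lambda c: c["priority"])
```

As in the example of FIG. 5, the rooftop is excluded because the user withheld landing permission for it.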
 図4の説明に戻り、検査部12は、ドローン10の目的地として指定された建物の上空にドローン10が到達した後、ドローン10のセンサ1008(イメージセンサ)によってその建物を含む敷地全体を撮像された画像と、その目的地における着陸候補位置リストとに基づいて、その目的地における着陸候補位置を検査する。ここでいう検査とは、目的地において、その目的地に対応する着陸候補位置リストに記された着陸候補位置がどこに存在し、どのような状態であるかを特定することである。具体的には、各着陸候補位置の画像上の特徴量に基づいて、センサ1008によって撮像された画像から各着陸候補位置を抽出し、さらに、各着陸候補位置について、その水平度、平坦度、液体があるか否か、又は、障害物があるか否かをセンサ1008によって検出することである。 Returning to the description of FIG. 4, after the drone 10 reaches the sky above the building designated as its destination, the inspection unit 12 inspects the candidate landing positions at that destination based on an image of the entire site including the building captured by the sensor 1008 (image sensor) of the drone 10 and on the landing candidate position list for that destination. Inspection here means identifying where the candidate landing positions recorded in the list corresponding to the destination are located at the destination and in what state they are. Specifically, each candidate landing position is extracted from the image captured by the sensor 1008 based on its image feature amounts, and, for each candidate position, the sensor 1008 detects its levelness, its flatness, and whether liquid or an obstacle is present.
 ここで、図6は、ドローン10の目的地である建物及びその敷地の構造を例示する鳥瞰図である。つまり、ドローン10が目的地の上空(例えば上空20m)からイメージセンサで下方を撮像したときの画像を例示している。目的地に相当する敷地Gの内側には建物Bがある。建物B及び敷地Gには道路R1,R2が隣接している。また、建物B及び敷地Gには、他の建物B1及びその敷地G1も隣接している。敷地G内には、例えば門扉gや樹木W、及び駐車場の屋根Pなどがある。なお、ドローン10(検査部12)は、目的地の上空から或る程度高度を下げることによって建物に近づいて着陸候補位置を検査してもよい。例えばドローン10(検査部12)が玄関という着陸候補位置の検査を試みる場合は、その玄関(ドア)を認識可能な位置(玄関のドアに対してほぼ水平な位置)の高度まで下がってから検査を試みる。 Here, FIG. 6 is a bird's-eye view illustrating the structure of the building that is the destination of the drone 10 and of its site. That is, it illustrates the image obtained when the drone 10 images the area below with its image sensor from the sky above the destination (e.g., 20 m up). There is a building B inside the site G corresponding to the destination. Roads R1 and R2 are adjacent to the building B and the site G, as are another building B1 and its site G1. Within the site G there are, for example, a gate g, trees W, and a parking lot roof P. Note that the drone 10 (inspection unit 12) may approach the building by lowering its altitude to some extent from the sky above the destination and then inspect the candidate landing positions. For example, when the drone 10 (inspection unit 12) attempts to inspect the candidate landing position called the entrance, it descends to an altitude from which the entrance (door) can be recognized (a position roughly level with the entrance door) before attempting the inspection.
 図4の説明に戻り、判断部13は、検査部12が目的地を検査した結果に基づいて、ドローン10が各々の着陸候補位置に着陸可能か否かを判断する。具体的には、判断部13は、各着陸候補位置について、その水平度、平坦度、液体があるか否か、又は、障害物があるか否かに基づいて、ドローン10が着陸可能か否かを判断する。着陸位置の水平度及び平坦度はセンサ1008の測距センサやイメージセンサの出力に基づいて判断される。また、着陸位置に液体(典型的には水)や障害物があるか否かはセンサ1008のイメージセンサの出力に基づいて判断される。なお、判断部13は、着陸候補位置に対して決められた位置についてドローン10が着陸可能か否かを判断するようにしてもよい。例えば庭という着陸候補位置に対する着陸位置はその庭の中央であり、駐車場という着陸候補位置に対する着陸位置はその駐車場の端に近い位置であり、玄関という着陸候補位置に対する着陸位置はその玄関の前であるというようにあらかじめ決められていてもよい。 Returning to the description of FIG. 4, the determination unit 13 determines whether or not the drone 10 can land at each candidate landing position based on the result of the inspection of the destination by the inspection unit 12. Specifically, for each candidate landing position, the determination unit 13 determines whether the drone 10 can land based on its levelness, its flatness, whether liquid is present, or whether an obstacle is present. The levelness and flatness of a landing position are determined based on the outputs of the ranging sensor and image sensor of the sensor 1008. Whether liquid (typically water) or an obstacle is present at the landing position is determined based on the output of the image sensor of the sensor 1008. Note that the determination unit 13 may determine whether the drone 10 can land at a position predetermined for each candidate landing position. For example, it may be predetermined that the landing position for the candidate position "garden" is the center of the garden, the landing position for "parking lot" is a position near the edge of the parking lot, and the landing position for "entrance" is in front of the entrance.
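The landability judgment by the determination unit 13 can be sketched as a simple rule over the sensor-derived values. This is a hedged illustration only: the threshold values and the field names (`tilt_deg`, `roughness_m`, `has_liquid`, `has_obstacle`) are assumptions, not values taken from the original description.

```python
# Hedged sketch of the landability judgment of the determination unit 13.
# All thresholds and field names are illustrative assumptions.
MAX_TILT_DEG = 5.0      # assumed maximum acceptable deviation from horizontal
MAX_ROUGHNESS_M = 0.03  # assumed maximum acceptable surface unevenness

def can_land(survey):
    """survey: one candidate position's inspection result, with
    'tilt_deg' and 'roughness_m' derived from the ranging sensor / image
    sensor, and 'has_liquid' / 'has_obstacle' from the image sensor."""
    if survey["tilt_deg"] > MAX_TILT_DEG:
        return False  # not level enough
    if survey["roughness_m"] > MAX_ROUGHNESS_M:
        return False  # not flat enough
    if survey["has_liquid"] or survey["has_obstacle"]:
        return False  # water puddle or obstruction detected
    return True
```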
 提示部14は、判断部13によって着陸可能と判断された着陸候補位置を、目的地に対応するユーザに提示する。このとき、提示部14は、判断部13によって着陸可能と判断された着陸候補位置を優先順位に従ってユーザに提示する。具体的には、提示部14は、判断部13によって着陸可能と判断された着陸候補位置をサーバ装置50に通知し、サーバ装置50経由でユーザ端末30にその着陸候補位置に関する提示情報が送信される。ユーザ端末30において、この提示情報が表示等の方法によってユーザに提示される。 The presentation unit 14 presents the candidate landing positions determined to be landable by the determination unit 13 to the user corresponding to the destination, in accordance with their priority. Specifically, the presentation unit 14 notifies the server device 50 of those candidate positions, and presentation information about them is transmitted to the user terminal 30 via the server device 50. At the user terminal 30, this presentation information is presented to the user by display or the like.
 ここで、図7は、提示情報に基づいてユーザ端末30に表示される画面を例示する図である。この提示情報において、判断部13によって着陸可能と判断された着陸候補位置は、例えば各着陸候補位置に割り当てられた優先順位に従って提示される。ここでは、優先順位の高い順に画面の上位から、玄関、門扉外側、門扉内側、庭、ベランダ、駐車場という順序で提示されている。ユーザはユーザ端末30を操作して、このようにして提示された複数の着陸候補位置から、自身が希望する着陸位置を選択する。ユーザが希望する着陸位置として選択された着陸候補位置はユーザ端末30からサーバ装置50経由でドローン10に通知される。なお、図7の例では判断部13によって着陸可能と判断された着陸候補位置が全て提示されているが、例えば優先順位の高い順から所定数(例えば3つ)の着陸候補位置のみが提示されるようにしてもよい。また、ユーザに提示される着陸候補位置は、玄関、庭、ベランダ、屋上、門扉外側、門扉内側、駐車場といった名称で表現される例に限らず、例えばドローン10やユーザが所有するカメラで撮影した着陸候補位置の写真画像(例えば着陸候補位置から1.5mほど離れたところから撮影したもの)や、目的地における着陸候補位置の相対的な位置(目的地上空から撮影した目的地全体の写真画像において各着陸候補位置を示したものなど)で表現されていてもよい。 Here, FIG. 7 is a diagram illustrating a screen displayed on the user terminal 30 based on the presentation information. In this presentation information, the candidate landing positions determined to be landable by the determination unit 13 are presented, for example, according to the priority assigned to each candidate position. Here they are presented from the top of the screen in descending order of priority: the entrance, the outside of the gate, the inside of the gate, the garden, the veranda, and the parking lot. The user operates the user terminal 30 to select his or her desired landing position from the plurality of candidate positions presented in this way. The candidate landing position selected as the user's desired landing position is notified from the user terminal 30 to the drone 10 via the server device 50. Although in the example of FIG. 7 all candidate landing positions determined to be landable by the determination unit 13 are presented, only a predetermined number (e.g., three) of candidates in descending order of priority may be presented instead. Further, the candidate landing positions presented to the user are not limited to being represented by names such as the entrance, the garden, the veranda, the rooftop, the outside of the gate, the inside of the gate, and the parking lot; they may, for example, be represented by a photographic image of the candidate landing position taken by the drone 10 or by a camera owned by the user (e.g., taken from about 1.5 m away from the candidate position), or by the relative position of the candidate within the destination (e.g., an image of the entire destination taken from above, with each candidate landing position indicated).
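The ordering performed by the presentation unit 14, including the optional "top N only" variation, can be sketched as follows. The list layout and the `priority` field are assumptions carried over from the illustration of FIG. 5, not part of the actual implementation.

```python
# Hedged sketch of how the presentation unit 14 might order candidates
# for display. Field names are illustrative assumptions.
def presentation_list(candidates, limit=None):
    """Order landable candidates by priority (smaller = higher) for
    presentation to the user; optionally keep only the top `limit`
    entries, as in the variation that presents a predetermined number."""
    ordered = sorted(candidates, key=lambda c: c["priority"])
    return ordered if limit is None else ordered[:limit]
```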
 図4の説明に戻り、着陸制御部15は、センサ1008によってドローン10の位置や姿勢を確認しながら飛行駆動機構1009を制御して、ユーザによって選択された着陸候補位置にドローン10を着陸させる。 Returning to the description of FIG. 4, the landing control unit 15 controls the flight drive mechanism 1009 while checking the position and attitude of the drone 10 with the sensor 1008 to land the drone 10 at the candidate landing position selected by the user.
[動作]
 次に、図8に示すフローチャートを参照して、ドローン10の飛行時の処理について説明する。図8において、ドローン10は発着地から目的地に向けて飛行を開始する(ステップS01)。以降、ドローン10は、サーバ装置50による制御の下で、荷物の配達依頼時に指定された目的地の住所の上空まで飛行する。
[Operation]
 Next, the processing during flight of the drone 10 will be described with reference to the flowchart shown in FIG. 8. In FIG. 8, the drone 10 starts flying from the departure/arrival point toward the destination (step S01). Thereafter, under the control of the server device 50, the drone 10 flies to the sky above the destination address specified at the time of the package delivery request.
 ドローン10が目的地の上空に到達すると(ステップS02;YES)、検査部12は、センサ1008(イメージセンサ)によって撮像された目的地の画像に基づいて、その目的地における着陸候補位置を検査する(ステップS03)。このとき、前述したように、ドローン10(検査部12)は目的地の上空から或る程度高度を下げて建物に近づいた状態で着陸候補位置を検査してもよい。 When the drone 10 reaches the sky above the destination (step S02; YES), the inspection unit 12 inspects the landing candidate position at the destination based on the image of the destination captured by the sensor 1008 (image sensor). (Step S03). At this time, as described above, the drone 10 (inspection unit 12) may inspect the landing candidate position in a state in which the drone 10 (inspection unit 12) has lowered its altitude to some extent from the sky above the destination and has approached the building.
 判断部13は、検査部12の検査結果に基づいて各着陸候補位置にドローン10が着陸可能か否かを判断する。具体的には、判断部13は、センサ1008の測距センサやイメージセンサの出力から、着陸位置の水平度、平坦度、液体があるか否か、又は、障害物があるか否かを特定し、ドローン10が着陸可能か否かを判断する(ステップS04)。 The determination unit 13 determines whether the drone 10 can land at each landing candidate position based on the inspection result of the inspection unit 12 . Specifically, the determination unit 13 identifies the horizontality and flatness of the landing position, whether or not there is liquid, or whether or not there is an obstacle, based on the output of the ranging sensor and image sensor of the sensor 1008. Then, it is determined whether or not the drone 10 can land (step S04).
 提示部14は、判断部13によって着陸可能と判断された着陸候補位置に関する提示情報を、目的地に対応するユーザのユーザ端末30に送信することで、その提示情報をそのユーザに提示する(ステップS05)。 The presentation unit 14 presents to the user the candidate landing positions determined to be landable by the determination unit 13, by transmitting presentation information about them to the user terminal 30 of the user corresponding to the destination (step S05).
 着陸制御部15は、ユーザに提示された着陸候補位置の中からいずれかがユーザによって選択されると(ステップS06;YES)、選択された着陸候補位置にドローン10を着陸させる(ステップS07)。なお、ユーザに提示された着陸候補位置においてユーザによる選択がなされないまま所定期間が経過した場合、ドローン10はサーバ装置50経由でユーザ端末30にその旨を通知するなどの所定のエラー処理を行い、次の目的地に向かったり基地に帰着したりしてもよい。 When the user selects one of the presented candidate landing positions (step S06; YES), the landing control unit 15 lands the drone 10 at the selected candidate landing position (step S07). Note that if a predetermined period elapses without the user selecting any of the presented candidates, the drone 10 may perform predetermined error processing, such as notifying the user terminal 30 of that fact via the server device 50, and then head for the next destination or return to its base.
 以上説明した実施形態によれば、予め用意された着陸候補位置のそれぞれに対して着陸可否を判断し、さらに、着陸可能な着陸候補位置のうちユーザが希望する着陸候補位置にドローン10を着陸させることができる。つまり、目的地に対してドローン10を適切な位置に着陸させることが可能となる。 According to the embodiment described above, whether landing is possible is determined for each of the candidate landing positions prepared in advance, and the drone 10 can then be landed at whichever of the landable candidates the user desires. In other words, the drone 10 can be landed at an appropriate position with respect to the destination.
[変形例] 本発明は、上述した実施形態に限定されない。上述した実施形態を以下のように変形してもよい。また、以下の2つ以上の変形例を組み合わせて実施してもよい。
[Modifications] The present invention is not limited to the embodiment described above. The embodiment may be modified as follows, and two or more of the following modifications may be combined.
[変形例1] 上記実施形態において、ユーザに提示された着陸候補位置においてユーザによる選択がなされないまま所定期間が経過した場合、ドローン10はサーバ装置50経由でユーザ端末30にその旨を通知するなどの所定のエラー処理を行い、次の目的地に向かったり基地に帰着したりしていた。このように、ユーザによる着陸候補位置の選択が無かった場合に、そのユーザの代理となる他のユーザに着陸候補位置の選択をしてもらってもよい。この場合、ドローン10又はサーバ装置50は、目的地の識別子に対応付けて、第1のユーザ(例えば目的地に居住する世帯主)及び第2のユーザ(例えば同じ目的地に居住する家族)のユーザ端末30の通信アドレスを記憶しておく。そして、提示部14は、目的地に対応する第1のユーザによって着陸候補位置が選択されなかった場合には、目的地に対応する第2のユーザに対し、判断部13によって着陸可能と判断された着陸候補位置を提示する。つまり、提示部14は、判断部13によって着陸可能と判断された着陸候補位置をサーバ装置50に通知し、サーバ装置50経由で第2のユーザのユーザ端末30にその着陸候補位置に関する提示情報が送信される。第2のユーザのユーザ端末30において、この提示情報が表示等の方法によってユーザに提示される。第2のユーザは自身のユーザ端末30を操作して、提示された複数の着陸候補位置から、自身が希望する着陸位置を選択する。第2のユーザが希望する着陸位置として選択された着陸候補位置はユーザ端末30からサーバ装置50経由でドローン10に通知される。ドローン10の着陸制御部15は、第2のユーザによって選択された着陸候補位置にドローン10を着陸させる。このようにすれば、ドローン10が目的地に着陸することができる機会が多くなり、ドローン10が輸送する荷物をより迅速に配達することができる。
[Modification 1] In the above embodiment, if a predetermined period elapsed without the user selecting any of the presented candidate landing positions, the drone 10 performed predetermined error processing, such as notifying the user terminal 30 of that fact via the server device 50, and then headed for the next destination or returned to its base. Alternatively, when the user makes no selection, another user acting on that user's behalf may select a candidate landing position. In this case, the drone 10 or the server device 50 stores, in association with the identifier of the destination, the communication addresses of the user terminals 30 of a first user (e.g., the householder residing at the destination) and a second user (e.g., a family member residing at the same destination). Then, if no candidate landing position is selected by the first user corresponding to the destination, the presentation unit 14 presents the candidate landing positions determined to be landable by the determination unit 13 to the second user corresponding to the destination. That is, the presentation unit 14 notifies the server device 50 of those candidate positions, and presentation information about them is transmitted via the server device 50 to the user terminal 30 of the second user, where it is presented by display or the like. The second user operates his or her own user terminal 30 to select a desired landing position from the presented candidates, and the selected candidate is notified from the user terminal 30 to the drone 10 via the server device 50. The landing control unit 15 of the drone 10 lands the drone 10 at the candidate landing position selected by the second user. In this way, the drone 10 has more opportunities to land at the destination, and the packages it transports can be delivered more quickly.
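The fallback of Modification 1 (first user, then a second user acting on the first user's behalf) can be sketched as an ordered chain of selection attempts. Representing each user as a selection function is an illustrative assumption; in the described system the selection would arrive via the user terminal 30 and the server device 50.

```python
# Hedged sketch of Modification 1's fallback selection.
# Each chooser stands in for one user's selection step and returns a
# chosen candidate or None (no selection within the allotted period).
def select_with_fallback(landable, choosers):
    for choose in choosers:  # e.g., [first_user, second_user]
        choice = choose(landable)
        if choice is not None:
            return choice
    return None  # neither user selected; error processing would follow
```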
[変形例2] 上記実施形態では、提示部14は、判断部13によって着陸可能と判断された着陸候補位置を優先順位に従ってユーザに提示し、ユーザがその着陸候補位置から希望するものを選択していた。このような動作に代えて、着陸制御部15は、判断部13によって着陸可能と判断された着陸候補位置の中からいずれかの着陸候補位置を優先順位に従って選択し、選択した着陸候補位置にドローン10を着陸させるようにしてもよい。ここで、着陸候補位置を優先順位に従って選択するとは、第1優先順位の着陸候補位置を選択することのほか、例えば上位から所定数の優先順位に対応する着陸候補位置のうち最も広い着陸候補位置を選択するなど、優先順位に従って抽出した着陸候補位置のうち優先順位以外の条件に従って選択することを含む。このようにすれば、ユーザによる選択を介さずに、システムが自動でドローン10を着陸させることができる。 [Modification 2] In the above embodiment, the presentation unit 14 presented the candidate landing positions determined to be landable by the determination unit 13 to the user according to their priority, and the user selected the desired one from among them. Instead, the landing control unit 15 may select one of the candidate landing positions determined to be landable by the determination unit 13 according to the priority, and land the drone 10 at the selected position. Here, selecting a candidate landing position according to the priority includes not only selecting the first-priority candidate but also selecting, from candidates extracted according to priority, one that satisfies a condition other than priority, for example the widest candidate among those corresponding to a predetermined number of top priorities. In this way, the system can land the drone 10 automatically, without any selection by the user.
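The automatic selection rule of Modification 2 (pick the widest among a predetermined number of top-priority candidates) can be sketched as follows. The `area_m2` field is a hypothetical value assumed to be derivable from the inspection result; it does not appear in the original description.

```python
# Hedged sketch of Modification 2's automatic selection.
# 'priority' and 'area_m2' are illustrative assumed fields.
def auto_select(landable, top_n=3):
    """Shortlist the top_n landable candidates by priority, then pick
    the widest one (a condition other than priority, as described)."""
    shortlisted = sorted(landable, key=lambda c: c["priority"])[:top_n]
    return max(shortlisted, key=lambda c: c["area_m2"])
```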
[変形例3] 上記実施形態及び上記変形例の優先順位は、諸条件に応じて動的に変動するものであってもよい。例えば、優先順位は、ドローン10が着陸するとき或いはその前後の天候、ドローン10が着陸する時間帯、ドローン10が輸送する荷物の属性、又は、ドローン10の目的地の環境に応じて異なる優先順位であってもよい。 [Modification 3] The priorities in the above embodiment and modifications may vary dynamically according to various conditions. For example, the priority may differ depending on the weather at, before, or after the time the drone 10 lands, the time of day when the drone 10 lands, the attributes of the package the drone 10 transports, or the environment of the destination of the drone 10.
 ドローン10が着陸するとき或いはその前後の天候に応じて異なる優先順位の例としては、例えば所定の天気予報情報提供装置からドローン10が取得した天気予報情報に基づいて所定期間経過後(例えば1時間後)に降雨があるという場合や、ドローン10が搭載する湿度センサによって検出された湿度が閾値以上である場合には、屋根がある着陸候補位置の優先順位を上げたり、屋根のない着陸候補位置の優先順位を下げたりといった例が考えられる。また、例えば所定の天気予報情報提供装置からドローン10が取得した天気予報情報に基づいて閾値以上の風量及び風向を予測できる場合には、風上に壁等の障害物がある着陸候補位置の優先順位を上げる(ただし、荷物の重さが閾値以上の場合は優先順位を変更しない)といった例が考えられる。 As examples of priorities that differ depending on the weather at, before, or after the time the drone 10 lands: if rain is expected after a predetermined period (e.g., one hour) based on weather forecast information the drone 10 has acquired from a predetermined weather forecast information providing device, or if the humidity detected by a humidity sensor mounted on the drone 10 is at or above a threshold, the priority of roofed candidate landing positions may be raised, or that of unroofed candidates lowered. Also, if a wind volume and wind direction at or above a threshold can be predicted from the acquired weather forecast information, the priority of candidate landing positions with an obstacle such as a wall on the windward side may be raised (though the priority is left unchanged when the weight of the package is at or above a threshold).
 ドローン10が着陸する時間帯に応じて異なる優先順位の例としては、例えば日没に近い時間帯(例えば18時以降)である場合には、安全性の観点から、目的地に接する道路から目につきやすい駐車場又は玄関という着陸候補位置の優先順位を下げるといった例や、早朝の時間帯(例えば5~6時)である場合には、外出するユーザが気づきやすい玄関という着陸候補位置の優先順位を上げるといった例が考えられる。 As examples of priorities that differ depending on the time of day when the drone 10 lands: in a time zone close to sunset (e.g., after 18:00), the priority of candidate landing positions that are easily visible from the road adjacent to the destination, such as a parking lot or the entrance, may be lowered from a safety standpoint; in early morning hours (e.g., 5 to 6 o'clock), the priority of the entrance, a candidate position that a user leaving home is likely to notice, may be raised.
 ドローン10が輸送する荷物の属性に応じて異なる優先順位の例としては、例えば荷物の価格や重要性が高い場合には、安全性の観点から、目的地に接する道路から目につきやすい駐車場又は玄関という着陸候補位置の優先順位を下げるといった例や、荷物の外観色と着陸候補位置の色が類似する範囲にある場合には、荷物の目立ちにくさという観点から、その着陸候補位置の優先順位を下げるといった例が考えられる。 As examples of priorities that differ depending on the attributes of the package the drone 10 transports: when the price or importance of the package is high, the priority of candidate landing positions that are easily visible from the road adjacent to the destination, such as a parking lot or the entrance, may be lowered from a safety standpoint; when the exterior color of the package and the color of a candidate landing position are in a similar range, the priority of that candidate position may be lowered in view of how inconspicuous the package would be there.
 As examples of priorities that differ according to the environment of the destination of the drone 10: if a crime map of the area around the destination indicates that the crime risk is at or above a certain level, or if there is a large-scale facility such as a school near the destination, the priority of candidate landing positions such as a parking lot or entrance that are easily visible from the road adjacent to the destination may be lowered from the viewpoint of safety. If an animal such as a dog or cat is at the destination, the priority of elevated candidate landing positions that the animal cannot reach (such as a balcony) or of candidate landing positions far from the area where the animal is present may be raised. The priority of a candidate landing position where a wheel stop is present may also be lowered.
 With the above, the likelihood increases that the drone 10 will land at a candidate landing position appropriate for the weather at or around the time the drone 10 lands, the time of day at which the drone 10 lands, the attributes of the package the drone 10 transports, or the environment of the destination of the drone 10. In the present invention, the expression "lowering the priority" includes excluding the position from the candidate landing positions.
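One way to read the note that "lowering the priority" can include exclusion is to treat exclusion as demotion below a cutoff score. A minimal, hypothetical sketch (the cutoff value and `score` field are illustrative):

```python
# Hypothetical sketch: "lowering the priority" can shade into outright
# exclusion by dropping candidates whose score falls below a cutoff.
EXCLUDE_BELOW = -2  # illustrative cutoff

def rank_candidates(candidates, cutoff=EXCLUDE_BELOW):
    """Return candidates ordered by score, excluding heavily demoted ones."""
    viable = [c for c in candidates if c["score"] >= cutoff]
    return sorted(viable, key=lambda c: c["score"], reverse=True)
```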
[Modification 4] The priorities of the above embodiment and modifications may also vary according to whether the user is at home at the destination. In this case, the drone 10 includes a first determination unit that determines whether the user corresponding to the destination is present at that destination, and the priorities are set according to whether the user corresponding to the destination is present there. If the user corresponding to the destination is present at the destination, the priority of candidate landing positions that are easy for the user to access, such as in front of the entrance or a window, may be raised; if the user is not present at the destination, the priority of candidate landing positions such as a garden that is hard to see from the road adjacent to the destination or a balcony that is difficult for others to access may be raised from the viewpoint of safety.
 Whether the user corresponding to the destination is present at that destination can be determined, for example, as follows: when the drone 10 approaches the vicinity of the destination, the drone 10 notifies the user terminal 30 to that effect via the server device 50, the user selects, in response to the notification, whether or not he or she is at home, and the selection result is communicated from the user terminal 30 to the drone 10 via the server device 50.
 Another conceivable method is that, when the drone 10 approaches the vicinity of the destination, a call is placed to the fixed telephone at the destination from the drone 10 or via the server device 50, and the user is determined to be at home if the fixed telephone answers; alternatively, the user may be asked to reply, by operating the fixed telephone, whether the package can be received. In the latter method, in which the user replies, a smartphone or mobile phone such as the user terminal 30 may be used instead of a fixed telephone. Alternatively, the user may permit an application on the user terminal 30 to acquire location information in advance; when the drone 10 approaches the vicinity of the destination, the drone 10 notifies the user terminal 30 to that effect via the server device 50, and if the location information of the user terminal 30 falls within substantially the same range as the location of the destination, the user is determined to be present at the destination. The user may also manually enter his or her entry and exit state into the application on the user terminal 30 each time he or she enters or leaves the building corresponding to the destination, and the drone 10 may acquire that state from the user terminal 30 via the server device 50 when it approaches the vicinity of the destination. Furthermore, when the drone 10 approaches the vicinity of the destination, the power usage at the destination may be monitored using smart meter technology or the like to estimate whether the user is present there; for example, if the amount of power used within a certain period is equal to or greater than a threshold, or if the power usage fluctuates within a certain period by an amount equal to or greater than a threshold, the user is determined to be present at the destination.
 Another conceivable method is that, when the drone 10 approaches the vicinity of the destination, the drone 10 captures images of the interior of the building corresponding to the destination with an image sensor from a balcony, window, or the like, and determines by image recognition whether, for example, the lights are on or a person is present. Alternatively, when the drone 10 approaches the vicinity of the destination, it may emit a sound in front of the entrance or the like and then capture images with the image sensor; if the entrance door or a window is opened or closed, or a person is seen, within a certain period thereafter, the user is determined to be present at the destination. The drone 10 may also, when it approaches the vicinity of the destination, fly around the building at the destination and capture images with the image sensor; if an open window is found, the user is determined to be present at the destination. A light may also be mounted on the drone 10, light may be shone toward a window of the building corresponding to the destination while images are captured with the image sensor, and the user may be determined to be present at the destination if the user reacts. The drone 10 may also communicate with the intercom of the building corresponding to the destination (or press the intercom button) to sound the intercom and capture images with the image sensor, and the user may be determined to be present at the destination if the user reacts. In any of the above ways, the drone 10 can be landed at an appropriate candidate landing position according to whether the user is present at the destination.
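The several at-home checks above could be combined in code roughly as follows. Every signal checker here is a hypothetical stub standing in for one of the methods in the text (app reply, smart meter, image recognition, and so on):

```python
# Hypothetical sketch of combining the at-home signals described above.
# Each checker returns True (present), False (absent), or None (unknown).

def user_present(signals):
    """Combine presence signals: any positive wins, else any negative, else unknown."""
    results = [check() for check in signals]
    if any(r is True for r in results):
        return True
    if any(r is False for r in results):
        return False
    return None  # fall back to a default landing priority

# Example signal checkers (stubs standing in for the methods in the text):
def app_reply():        # user answered the app notification
    return True

def smart_meter():      # power usage above threshold within a period
    return None         # e.g. meter data unavailable

print(user_present([app_reply, smart_meter]))  # → True
```

A `None` result would leave the priorities unchanged rather than forcing a presence-based reordering.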
[Modification 5] The priorities may also vary according to the level of foot traffic around the destination. In this case, the drone 10 includes a second determination unit that makes a determination regarding the number or density of people within a predetermined area including the destination, and the priorities are set according to the result of the determination made by the second determination unit. The number or density of people within the predetermined area including the destination can be determined, for example, by the drone 10 recognizing, from images taken from the sky, the foot traffic in the vicinity of the destination and judging whether it is equal to or greater than a threshold; by using statistical information such as Mobile Spatial Statistics (registered trademark) provided by NTT DOCOMO; or by using congestion information provided by a congestion information providing device. If the number or density of people within the predetermined area including the destination is equal to or greater than a threshold, the priority of candidate landing positions such as a parking lot or entrance that are easily visible from the road adjacent to the destination may be lowered from the viewpoint of safety.
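A minimal sketch of the density check in Modification 5, assuming a people count is already available from image recognition or statistics (the threshold, `visible_from_road` flag, and `score` field are all hypothetical):

```python
# Hypothetical sketch of Modification 5: demote street-visible candidates
# when the area around the destination is crowded.
DENSITY_THRESHOLD = 50  # people in the predetermined area; illustrative

def adjust_for_crowd(candidates, people_count, threshold=DENSITY_THRESHOLD):
    """Lower the score of road-visible positions when foot traffic is heavy."""
    if people_count >= threshold:
        for c in candidates:
            if c["visible_from_road"]:   # e.g. parking lot, entrance
                c["score"] -= 1
    return candidates
```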
 Note that at least two of the conditions for varying the priorities in Modifications 3 to 5 described above may be used in combination. For example, the priorities may be varied among the case where the landing time is after 18:00 and the user is not present at the destination, the case where the landing time is before 18:00 and the user is not present at the destination, the case where the landing time is after 18:00 and the user is present at the destination, and the case where the landing time is before 18:00 and the user is present at the destination. This is, of course, merely an example, and various other combinations of two or more conditions are conceivable.
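The four-way example above (landing time crossed with user presence) amounts to a small lookup table over condition combinations. A hypothetical sketch, with the candidate names and their orderings purely illustrative:

```python
# Hypothetical sketch: combining two conditions (landing after 18:00,
# user present at destination) into one priority profile per case.
PRIORITY_PROFILES = {
    # (after_1800, user_present): candidate positions, most preferred first
    (True,  True):  ["entrance", "parking", "garden", "balcony"],
    (True,  False): ["balcony", "garden", "parking", "entrance"],
    (False, True):  ["entrance", "parking", "balcony", "garden"],
    (False, False): ["garden", "balcony", "entrance", "parking"],
}

def pick_profile(after_1800, user_present):
    """Look up the priority ordering for one combination of conditions."""
    return PRIORITY_PROFILES[(after_1800, user_present)]
```

Adding a third condition (say, crowd level) would simply widen the key tuple.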
[Modification 6] The landing control of the drone 10 may be realized by so-called edge computing (control by the drone), cloud computing (control by the server device), or cooperation between the two (control by the drone and the server device), as described in the embodiment. Accordingly, the control device of the present invention may be provided in the server device 50.
[Modification 7] The unmanned aerial vehicle is not limited to what is called a drone and may have any structure or form as long as it is an unmanned aerial vehicle capable of transporting a package. The present invention can also be applied to a manned aerial vehicle that carries a person but flies itself automatically.
[Modification 8] The above embodiment has been described using the example of the aerial vehicle (drone 10) that transports a package landing at a destination, but the present invention can also be applied to the landing of an aerial vehicle in a scene in which, for example, the vehicle lands at a destination without holding a package, receives and holds a package at that landing position, and takes off for the next destination. Furthermore, the purpose or application of the vehicle's flight is not limited to the transport of packages exemplified in the embodiment and may be anything, such as measuring or photographing some object. That is, the present invention can be applied when an aerial vehicle lands, regardless of the purpose or application of its flight.
[Modification 9] In the above embodiment, an image sensor serving as the imaging means included in the sensor 1008 of the drone 10 was used to inspect the destination. The method of inspecting the destination is not limited to the example of the embodiment; any method capable of sensing the position, shape, size, or the like of objects can be used, such as the technology called LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) or the technology called SLAM (Simultaneous Localization and Mapping). That is, it suffices for the sensor 1008 to include a sensor capable of inspecting the candidate landing positions at the destination and for the determination unit 13 to determine, based on the inspection result, whether the drone 10 can land at each candidate landing position.
[Modification 10] In the above embodiment, the determination unit 13 determines whether the aerial vehicle can land based on the levelness or flatness of the landing position, or on whether liquid or an obstacle is present; however, the determination may also be made based on other conditions, for example the material or temperature of the landing position, or whether snow has accumulated there. The material of the landing position and the presence or absence of snow are determined based on, for example, the output of the image sensor of the sensor 1008, and the temperature of the landing position is determined based on the output of a non-contact temperature sensor included in the sensor 1008. In this way, the determination unit 13 may determine whether the aerial vehicle can land based on at least one of multiple kinds of landing-position conditions, such as levelness, flatness, material, temperature, or the presence of liquid or snow.
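The multi-condition landability test of Modification 10 could be sketched as a sequence of per-condition checks. The thresholds, field names, and disqualifying materials below are illustrative assumptions, not values from the embodiment:

```python
# Hypothetical sketch of Modification 10: judge landability from several
# sensed surface conditions. All thresholds are illustrative only.
MAX_TILT_DEG = 5.0                  # levelness limit
MIN_TEMP_C, MAX_TEMP_C = -10.0, 60.0
BAD_MATERIALS = {"water", "gravel"}

def can_land(surface):
    """surface: dict of sensed values for one candidate landing position."""
    if surface["tilt_deg"] > MAX_TILT_DEG:
        return False                       # not level enough
    if not surface["flat"]:
        return False                       # uneven surface
    if surface["material"] in BAD_MATERIALS:
        return False                       # unsuitable material
    if not (MIN_TEMP_C <= surface["temp_c"] <= MAX_TEMP_C):
        return False                       # e.g. hot roof or frozen surface
    if surface["liquid"] or surface["snow"]:
        return False                       # puddle or snow cover
    return True
```

Only positions passing every enabled check would remain as landable candidates.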
[Other Modifications] The block diagrams used in the description of the above embodiment show blocks in functional units. These functional blocks (components) are realized by any combination of hardware and/or software, and the means of realizing each functional block is not particularly limited. That is, each functional block may be realized by one physically and/or logically coupled device, or by two or more physically and/or logically separate devices connected directly and/or indirectly (for example, by wire and/or wirelessly).
 For example, a single computer may provide the functions of the user terminals 30 to 32 exemplified in the embodiment. In short, each function illustrated in Fig. 5 need only be provided by one of the devices constituting the drone management system 1 as an information processing system. For example, if the server device 50 can control the drone 10 directly, the server device 50 may have a function corresponding to the processing unit 313 and directly restrict the flight of the drone 10.
 Each aspect/embodiment described herein may be applied to systems utilizing LTE (Long Term Evolution), LTE-A (LTE-Advanced), SUPER 3G, IMT-Advanced, 4G, 5G, FRA (Future Radio Access), W-CDMA (registered trademark), GSM (registered trademark), CDMA2000, UMB (Ultra Mobile Broadband), IEEE 802.11 (Wi-Fi), IEEE 802.16 (WiMAX), IEEE 802.20, UWB (Ultra-WideBand), Bluetooth (registered trademark), or other appropriate systems, and/or to next-generation systems extended on the basis of them.
 The order of the processing procedures, sequences, flowcharts, and the like of each aspect/embodiment described herein may be rearranged as long as no contradiction arises. For example, the methods described herein present the elements of the various steps in an exemplary order and are not limited to the specific order presented. Each aspect/embodiment described herein may be used alone, in combination, or switched as execution proceeds. In addition, notification of predetermined information (for example, notification of "being X") is not limited to being performed explicitly and may be performed implicitly (for example, by not notifying the predetermined information).
 The information, parameters, and the like described herein may be represented by absolute values, by values relative to a predetermined value, or by other corresponding information.
 The terms "determining" and "deciding" as used herein may encompass a wide variety of operations. "Determining" and "deciding" may include, for example, regarding judging, calculating, computing, processing, deriving, investigating, looking up (for example, looking up in a table, database, or other data structure), or ascertaining as "determining" or "deciding". "Determining" and "deciding" may also include regarding receiving (for example, receiving information), transmitting (for example, transmitting information), input, output, or accessing (for example, accessing data in memory) as "determining" or "deciding". Furthermore, "determining" and "deciding" may include regarding resolving, selecting, choosing, establishing, comparing, and the like as "determining" or "deciding". In other words, "determining" and "deciding" may include regarding some operation as "determining" or "deciding".
 The present invention may be provided as an information processing method or as a program. Such a program can be provided in a form recorded on a recording medium such as an optical disc, or in a form in which it is downloaded to a computer via a network such as the Internet and installed for use.
 Software, instructions, and the like may be transmitted and received via a transmission medium. For example, when software is transmitted from a website, server, or other remote source using wired technologies such as coaxial cable, optical fiber cable, twisted pair, and digital subscriber line (DSL), and/or wireless technologies such as infrared, radio, and microwave, these wired and/or wireless technologies are included within the definition of a transmission medium.
 The information, signals, and the like described herein may be represented using any of a variety of different technologies. For example, the data, instructions, commands, information, signals, bits, symbols, chips, and the like that may be referred to throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, optical fields or photons, or any combination thereof.
 Any reference to elements using designations such as "first" and "second" as used herein does not generally limit the quantity or order of those elements. These designations may be used herein as a convenient way of distinguishing between two or more elements. Thus, a reference to first and second elements does not mean that only two elements may be employed there or that the first element must precede the second element in some way.
 "Means" in the configuration of each device described above may be replaced with "unit", "circuit", "device", or the like.
 To the extent that "including", "comprising", and variations thereof are used in this specification or the claims, these terms are intended to be inclusive, in the same way as the term "provided with". Furthermore, the term "or" as used in this specification or the claims is not intended to be an exclusive OR.
 Throughout this disclosure, where articles such as a, an, and the in English are added by translation, these articles are intended to include the plural unless the context clearly indicates otherwise.
 Although the present invention has been described in detail above, it is obvious to those skilled in the art that the present invention is not limited to the embodiments described herein. The present invention can be implemented in modified and altered forms without departing from the spirit and scope of the present invention as defined by the claims. Accordingly, the description herein is for the purpose of illustration and has no restrictive meaning with respect to the present invention.
1: drone management system, 10: drone, 11: acquisition unit, 12: inspection unit, 13: determination unit, 14: presentation unit, 15: landing control unit, 30: user terminal, 40: wireless communication network, 50: server device, 1001: processor, 1002: memory, 1003: storage, 1004: communication device, 1005: input device, 1006: output device, 1007: positioning device, 1008: sensor, 1009: flight drive mechanism, 5001: processor, 5002: memory, 5003: storage, 5004: communication device.

Claims (10)

  1.  A control device comprising:
      an acquisition unit that acquires information on a plurality of candidate landing positions at a destination of an aerial vehicle;
      a determination unit that determines whether the aerial vehicle can land at each of the candidate landing positions, based on a result of an inspection of the destination performed, on the basis of the information, by the aerial vehicle that has reached the sky above the destination; and
      a landing control unit that lands the aerial vehicle at a candidate landing position selected from among the candidate landing positions determined to be landable.
  2.  The control device according to claim 1, further comprising a presentation unit that presents the candidate landing positions determined to be landable by the determination unit to a user corresponding to the destination,
      wherein the landing control unit lands the aerial vehicle at the candidate landing position selected by the user from among the presented candidate landing positions.
  3.  The control device according to claim 2, wherein the presentation unit presents the candidate landing positions determined to be landable by the determination unit to the user in accordance with an order of priority.
  4.  The control device according to claim 2 or 3, wherein, if no candidate landing position is selected by a first user corresponding to the destination, the presentation unit presents the candidate landing positions determined to be landable by the determination unit to a second user corresponding to the destination, and
      the landing control unit lands the aerial vehicle at the candidate landing position selected by the second user.
  5.  The control device according to claim 1, wherein the landing control unit selects, in accordance with an order of priority, one of the candidate landing positions determined to be landable by the determination unit, and lands the aerial vehicle at the selected candidate landing position.
  6.  The control device according to claim 3 or 5, wherein the order of priority differs according to the weather at or around the time the aerial vehicle lands, the time of day at which the aerial vehicle lands, an attribute of a package transported by the aerial vehicle, or the environment of the destination.
  7.  The control device according to any one of claims 3, 5, and 6, wherein the order of priority is set according to whether a user corresponding to the destination is present at the destination.
  8.  The control device according to any one of claims 3, 5, 6, and 7, wherein the order of priority is set according to a determination result regarding the number or density of people within a predetermined area including the destination.
  9.  The control device according to any one of claims 1 to 8, wherein the determination unit determines whether the aerial vehicle can land based on the levelness, flatness, material, or temperature of the candidate landing position, or on whether liquid or snow is present there.
  10.  A program for causing a computer to execute:
      a step of acquiring information on a plurality of candidate landing positions at a destination of an aerial vehicle;
      a step of determining whether the aerial vehicle can land at each of the candidate landing positions, based on a result of an inspection of the destination performed, on the basis of the information, by the aerial vehicle that has reached the sky above the destination; and
      a step of landing the aerial vehicle at a point selected from among the candidate landing positions determined to be landable.
PCT/JP2022/028865 2021-08-16 2022-07-27 Control device and program WO2023021948A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021132189 2021-08-16
JP2021-132189 2021-08-16

Publications (1)

Publication Number Publication Date
WO2023021948A1 true WO2023021948A1 (en) 2023-02-23

Family

ID=85240560

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/028865 WO2023021948A1 (en) 2021-08-16 2022-07-27 Control device and program

Country Status (1)

Country Link
WO (1) WO2023021948A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001328600A (en) * 2000-05-19 2001-11-27 Fuji Heavy Ind Ltd Landing point searching device, flying object using therewith and landing point evaluating device
US9257048B1 (en) * 2010-04-21 2016-02-09 The Boeing Company Aircraft emergency landing route system
WO2019008669A1 (en) * 2017-07-04 2019-01-10 三菱電機株式会社 Aircraft control device, aircraft control system, and aircraft control method
JP2020057225A (en) * 2018-10-02 2020-04-09 パイオニア株式会社 Information processor, information processing method, and program
WO2020121530A1 (en) * 2018-12-14 2020-06-18 楽天株式会社 Control method for unmanned aerial vehicle, management method, control device, management device, and unmanned aerial vehicle system
WO2021033256A1 (en) * 2019-08-20 2021-02-25 楽天株式会社 Information processing system, information processing device and information processing method
JP2021041914A (en) * 2019-08-07 2021-03-18 ザ・ボーイング・カンパニーThe Boeing Company Determining runway exit for landing aircraft
US20210312823A1 (en) * 2020-04-07 2021-10-07 The Boeing Company Landing site candidate identification


Similar Documents

Publication Publication Date Title
US10217367B2 (en) Unmanned aerial vehicle and system having the same
US10157548B2 (en) Waypoint directory in air traffic control systems for unmanned aerial vehicles
US11749074B2 (en) Rescue support in large-scale emergency situations
US10171646B2 (en) Systems and methods for providing geolocation services
US10715653B2 (en) Systems and methods for providing geolocation services
US20190147747A1 (en) Remote Control of an Unmanned Aerial Vehicle
JP6741073B2 (en) Flight control program, flight control method, and information processing apparatus
US11961408B2 (en) Air traffic control of unmanned aerial vehicles for delivery applications
US20210150914A1 (en) Flight control apparatus
US11393344B2 (en) Flight control apparatus and flight control system
WO2023021948A1 (en) Control device and program
KR20190048688A (en) Autonomous flight system using drone and method thereof
WO2019054027A1 (en) Flight control system and flight control apparatus
JP7050809B2 (en) Information processing equipment
JP2022001840A (en) Control device, program, system, and control method
US20220335838A1 (en) Control device, unmanned aerial vehicle, and method
WO2023282124A1 (en) Control device
JP7148567B2 (en) System, management device, program, and management method
WO2023042601A1 (en) Information processing device
JP7319244B2 (en) Control device, program, system and method
WO2023145762A1 (en) Control device
WO2023042551A1 (en) Information processing device
WO2019146577A1 (en) Information processing device
WO2024084781A1 (en) Information processing device
WO2023162583A1 (en) Control apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22858269

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023542297

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE