WO2023282124A1 - Control device - Google Patents

Control device

Info

Publication number
WO2023282124A1
WO2023282124A1 (PCT/JP2022/025759)
Authority
WO
WIPO (PCT)
Prior art keywords
drone
priority
landing
structures
control device
Prior art date
Application number
PCT/JP2022/025759
Other languages
English (en)
Japanese (ja)
Inventor
昌志 安沢
真幸 森下
広樹 石塚
Original Assignee
株式会社Nttドコモ
Priority date
Filing date
Publication date
Application filed by 株式会社Nttドコモ filed Critical 株式会社Nttドコモ
Priority to JP2023533549A (JPWO2023282124A1)
Publication of WO2023282124A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C: AEROPLANES; HELICOPTERS
    • B64C13/00 Control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers
    • B64C13/02 Initiating means
    • B64C13/16 Initiating means actuated automatically, e.g. responsive to gust detectors
    • B64C13/18 Initiating means actuated automatically, e.g. responsive to gust detectors using automatic pilot
    • B64C27/00 Rotorcraft; Rotors peculiar thereto
    • B64C27/04 Helicopters
    • B64C39/00 Aircraft not otherwise provided for
    • B64C39/02 Aircraft not otherwise provided for characterised by special use
    • B64D: EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00 Equipment not otherwise provided for
    • B64D47/08 Arrangements of cameras
    • B64D9/00 Equipment for handling freight; Equipment for facilitating passenger embarkation or the like
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/02 Automatic approach or landing aids, i.e. systems in which flight data of incoming planes are processed to provide landing data

Definitions

  • the present invention relates to technology for landing an aircraft.
  • Patent Literature 1 describes a mechanism in which a landing pad is provided in a landing zone of a delivery destination of a drone, and the drone is guided to the landing pad by a visual assistance device, an optical assistance device or a wireless assistance device.
  • In this mechanism, however, a dedicated facility called a landing pad must be installed at the delivery destination of the drone.
  • Destinations for drones include, for example, dwelling units of various sizes and shapes.
  • the object of the present invention is to land the flying object at an appropriate position with respect to various structures designated as the destination of the flying object.
  • In the present invention, structures existing on a site are detected based on the result of the flying object sensing the site containing the structure designated as the destination.
  • The control device includes a detection unit that detects these structures, a determination unit that determines whether or not the flying object can land at a landing position determined for a detected structure, and a landing control unit that lands the flying object at that landing position.
  • According to the present invention, it is possible to land the flying object at an appropriate position with respect to the structure designated as the destination of the flying object.
  • FIG. 5 is a bird's-eye view illustrating the structure of the destination of the drone according to the same embodiment.
  • FIGS. 6 to 9 are diagrams each illustrating a priority table according to the same embodiment.
  • FIGS. 10 to 12 are diagrams illustrating movement routes of the drone 10 when detecting structures.
  • A flowchart illustrates the procedure of processing by the drone 10.
  • FIG. 1 is a diagram showing an example configuration of a drone management system 1 according to an embodiment of an information processing system of the present invention.
  • the drone management system 1 includes a drone 10 that transports packages to a destination, a user terminal 30 used by a user living in a building or other structure that is the destination, a wireless communication network 40, and a server device 50 connected to the wireless communication network 40.
  • the drone 10 is an unmanned flying object that flies in the air.
  • the drone 10 transports the cargo by holding the cargo, flying to the destination, and landing at the destination.
  • the user terminal 30 is a communicable computer such as a smartphone, tablet, or personal computer.
  • the user terminal 30 is a smart phone and functions as a communication terminal for the user who receives the parcel to access the server device 50 via the wireless communication network 40 .
  • the server device 50 stores flight plan information such as the flight date and time, flight route and flight altitude of the drone 10, and remotely steers the drone according to the flight plan information.
  • Remote control by the server device 50 mainly covers the section between the drone's departure/arrival point, called a base, and the airspace above the destination.
  • The section between the airspace above the destination and the landing position is flown under autonomous control by the drone itself.
  • In this autonomous section, the drone 10 detects the structure corresponding to the destination and/or other structures existing on the site including that structure (for example, doors, verandas, gates, parking lots, warehouses, and gardens), determines whether or not it can land on or in the vicinity of these detected structures, and then lands.
  • In this embodiment, the section between the drone's departure/arrival point and the airspace above the destination relies on remote control by the server device 50, and the section between that airspace and the landing position is realized by autonomous flight by the drone itself; however, the invention is not limited to this example.
  • The drone 10 may autonomously fly all sections between the departure/arrival point and the landing position at the destination without relying on remote control by the server device 50, or may fly according to remote control by the server device 50 in all sections between them.
  • the wireless communication network 40 may be, for example, equipment conforming to the 4th generation mobile communication system or may be equipment conforming to the 5th generation mobile communication system.
  • FIG. 2 is a diagram showing an example of the hardware configuration of the drone 10.
  • the drone 10 is physically configured as a computer device including a processor 1001, a memory 1002, a storage 1003, a communication device 1004, an input device 1005, an output device 1006, a positioning device 1007, a sensor 1008, a flight drive mechanism 1009, and a bus connecting these. Note that in the following description, the term "apparatus" can be read as a circuit, device, unit, or the like.
  • the hardware configuration of the drone 10 may be configured to include one or more of each device shown in the figure, or may be configured without some of the devices.
  • Each function in the drone 10 is realized by loading predetermined software (a program) onto hardware such as the processor 1001 and the memory 1002, causing the processor 1001 to perform calculations, controlling communication by the communication device 1004, controlling at least one of reading and writing of data in the memory 1002 and the storage 1003, and controlling the positioning device 1007, the sensor 1008, and the flight drive mechanism 1009.
  • the processor 1001 for example, operates an operating system and controls the entire computer.
  • the processor 1001 may be configured by a central processing unit (CPU) including interfaces with peripheral devices, a control unit, an arithmetic unit, registers, and the like.
  • a baseband signal processing unit, a call processing unit, and the like may be implemented by the processor 1001 .
  • the processor 1001 reads programs (program codes), software modules, data, etc. from at least one of the storage 1003 and the communication device 1004 to the memory 1002, and executes various processes according to them.
  • the functional blocks of drone 10 may be implemented by a control program stored in memory 1002 and running on processor 1001 .
  • Various types of processing may be executed by one processor 1001, but may also be executed by two or more processors 1001 simultaneously or sequentially.
  • Processor 1001 may be implemented by one or more chips. Note that the program may be transmitted to the drone 10 via the wireless communication network 40 .
  • the memory 1002 is a computer-readable recording medium, and may be composed of at least one of, for example, ROM, EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), RAM, and the like.
  • the memory 1002 may also be called a register, cache, main memory (main storage device), or the like.
  • the memory 1002 can store executable programs (program code), software modules, etc. to perform the methods of the present invention.
  • the storage 1003 is a computer-readable recording medium, for example, an optical disk such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disk, a digital versatile disk, a Blu-ray disk), smart card, flash memory (eg, card, stick, key drive), floppy disk, magnetic strip, and/or the like.
  • Storage 1003 may also be called an auxiliary storage device.
  • the storage 1003 stores various programs and data groups.
  • the processor 1001, memory 1002, and storage 1003 described above function as an example of the control device of the present invention.
  • the communication device 1004 is hardware (transmitting/receiving device) for communicating between computers via the wireless communication network 40, and is also called a network device, a network controller, a network card, a communication module, or the like.
  • the communication device 1004 includes a high-frequency switch, duplexer, filter, frequency synthesizer, etc. in order to implement frequency division duplexing and time division duplexing.
  • a transmitting/receiving antenna, an amplifier section, a transmitting/receiving section, a transmission line interface, etc. may be implemented by the communication device 1004 .
  • the transmitting/receiving section may be implemented with the transmitter and the receiver physically or logically separated.
  • the input device 1005 is an input device that receives input from the outside, and includes, for example, keys, switches, and microphones.
  • the output device 1006 is an output device that outputs to the outside, and includes, for example, a display device such as a liquid crystal display and a speaker. Note that the input device 1005 and the output device 1006 may be integrated.
  • the positioning device 1007 is hardware that measures the position of the drone 10, such as a GPS (Global Positioning System) device.
  • the drone 10 flies from the departure/arrival point to the sky above the destination based on the positioning by the positioning device 1007 .
  • the sensor 1008 includes a ranging sensor that functions as altitude measurement means and landing position status confirmation means for the drone 10, a gyro sensor and direction sensor that function as attitude measurement means for the drone 10, an image sensor that functions as imaging means, and the like.
  • the flight drive mechanism 1009 includes hardware such as motors and propellers for the drone 10 to fly.
  • Each device such as the processor 1001 and the memory 1002 is connected by a bus for communicating information.
  • the bus may be configured using a single bus, or may be configured using different buses between devices.
  • the drone 10 includes a microprocessor, a GPU (Graphics Processing Unit), a digital signal processor (DSP: Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), an FPGA (Field Programmable Gate Array), etc. hardware, and part or all of each functional block may be realized by the hardware.
  • processor 1001 may be implemented using at least one of these pieces of hardware.
  • FIG. 3 is a diagram showing the hardware configuration of the server device 50.
  • the hardware configuration of the server device 50 may be configured to include one or more of the devices shown in FIG. 3, or may be configured without some of the devices. Further, the server device 50 may be configured by connecting a plurality of devices having different housings for communication.
  • the server device 50 is physically configured as a computer device including a processor 5001, a memory 5002, a storage 5003, a communication device 5004, and a bus connecting them. Each function in the server device 50 is realized by causing the processor 5001 to perform calculations, controlling communication by the communication device 5004, and controlling at least one of reading and writing of data in the memory 5002 and the storage 5003. Each of these devices operates with power supplied from a power source (not shown). Note that in the following description, the term "apparatus" can be read as a circuit, device, unit, or the like.
  • a processor 5001 operates an operating system to control the entire computer.
  • the processor 5001 may be configured by a central processing unit (CPU) including interfaces with peripheral devices, a control unit, an arithmetic unit, registers, and the like.
  • a baseband signal processing unit, a call processing unit, and the like may be implemented by the processor 5001 .
  • the processor 5001 reads programs (program codes), software modules, data, etc. from at least one of the storage 5003 and the communication device 5004 to the memory 5002, and executes various processes according to them.
  • As the program, a program that causes a computer to execute at least part of the operations described below is used.
  • the functional blocks of the server device 50 may be implemented by a control program stored in memory 5002 and running on processor 5001.
  • Various types of processing may be executed by one processor 5001, but may also be executed by two or more processors 5001 simultaneously or sequentially.
  • Processor 5001 may be implemented by one or more chips.
  • the memory 5002 is a computer-readable recording medium, and may be composed of at least one of ROM, EPROM, EEPROM, and RAM, for example.
  • the memory 5002 may also be called a register, cache, main memory (main storage device), or the like.
  • the memory 5002 can store executable programs (program code), software modules, etc. for performing methods according to the present invention.
  • the storage 5003 is a computer-readable recording medium, for example, an optical disk such as a CD-ROM, a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disk, a digital versatile disk, a Blu-ray (registered trademark) disk ), smart card, flash memory (eg, card, stick, key drive), floppy disk, magnetic strip, and/or the like.
  • Storage 5003 may be called an auxiliary storage device.
  • the storage 5003 stores at least programs and data groups for executing various processes described later.
  • the communication device 5004 is hardware (transmitting/receiving device) for communicating between computers via the wireless communication network 40, and is also called a network device, a network controller, a network card, a communication module, or the like.
  • Each device such as the processor 5001 and memory 5002 is connected by a bus for communicating information.
  • the bus may be configured using a single bus, or may be configured using different buses between devices.
  • the server device 50 may be configured including hardware such as a microprocessor, digital signal processor, ASIC, PLD, and FPGA, and part or all of each functional block may be realized by the hardware.
  • processor 5001 may be implemented using at least one of these pieces of hardware.
  • FIG. 4 is a diagram showing an example of the functional configuration of the drone 10. In the drone 10, the functions of an acquisition unit 11, a detection unit 12, a determination unit 13, and a landing control unit 14 are realized.
  • the acquisition unit 11 acquires various data from, for example, the positioning device 1007, the sensor 1008, the server device 50, or the like.
  • the detection unit 12 captures an image of the site including the structure with the sensor 1008 (image sensor) of the drone 10, and detects the structure and/or other structures existing on the site based on that image.
  • the structures here refer to the elements that make up the building designated as the destination (for example, architectural elements such as doors, verandas, and gates), as well as other buildings, facilities, and structures typically likely to be provided at general dwelling units (parking lots, warehouses, gardens, passages, roads, areas under the eaves, etc.).
  • the detection unit 12 detects structures according to a predetermined priority. Specifically, the detection unit 12 uses a priority according to the positional relationship between the structure designated as the destination of the drone 10 and other structures and/or sites adjacent to that structure. The detection unit 12 also uses a priority according to the shape of the structure designated as the destination, a priority according to the arrangement of the structure on its site, and a priority indicated by the user at the destination. Note that "adjacent" does not necessarily mean that the structures are physically in contact with each other; it suffices that they exist within a predetermined distance range from the designated structure.
  • FIG. 5 is a bird's-eye view exemplifying the structure that is the destination of the drone 10 and the layout of its site; that is, it illustrates the image captured when the drone 10 images the area below with the image sensor from the sky above the destination (for example, 20 m above).
  • another building B1 and its site G1 are also adjacent to the building B and site G.
  • the site G includes, for example, a gate g, trees W, and a roof P of a parking lot.
  • FIG. 5 exemplifies a passage L1, a parking lot L2, and a garden L3 as the structures described above.
  • This priority is, for example, a priority according to the positional relationship between the building B and the roads R1 and R2, the other building B1, or the other site G1 adjacent to it, a priority according to the shape of the building B, a priority according to the layout of the building B on the site G, or a priority according to an instruction by the user.
  • the detection unit 12 stores a priority table as illustrated in FIG. 6.
  • In this table, positional relationship patterns 01, 02, 03, … are associated with priority patterns 01, 02, 03, … used when the destination corresponds to the respective positional relationship.
  • The positional relationship patterns 01, 02, 03, … contain information about the placement of structures, such as where a structure and/or site is adjacent to a road, and where a structure and/or site is adjacent to another structure and/or site at a distance less than a threshold.
  • Each priority pattern specifies an algorithm for detecting structures, such as searching in the order of gates, parking lots, areas under the eaves, and so on, or excluding from the search range a predetermined area centered on a location where the site adjoins another structure and/or site at a distance less than a threshold.
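The table lookup described above can be sketched as follows. This is a minimal illustration only: the pattern names and structure labels are hypothetical, not taken from the patent.

```python
# Sketch of the FIG. 6 priority table: each positional-relationship
# pattern maps to an ordered list of structure types to search for.
# All keys and labels below are illustrative assumptions.
PRIORITY_TABLE = {
    "pattern_01": ["gate", "parking_lot", "under_eaves"],
    "pattern_02": ["passage", "garden", "door"],
    "pattern_03": ["parking_lot", "garden"],
}

def search_order(positional_pattern: str) -> list:
    """Return the ordered list of structure types to try for a destination
    matching the given positional-relationship pattern (empty if unknown)."""
    return PRIORITY_TABLE.get(positional_pattern, [])
```

For example, `search_order("pattern_01")` tells the detection unit to look for a gate first, then a parking lot, then the area under the eaves.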
  • the detection unit 12 also stores a priority table as illustrated in FIG. 7.
  • In this table, shape patterns 01, 02, 03, … of the structure designated as the destination are associated with priority patterns 11, 12, 13, ….
  • Each of the priority patterns 11, 12, 13, … specifies an algorithm for detecting structures according to the shape. For example, in the case of shape pattern 11, the drone first tries to detect a "passage" from one part of the shape and then tries to detect a "garden" from another part, whereas in the case of shape pattern 12, it first tries to detect a "parking lot" from one part of the shape and then tries to detect an area "under the eaves" from another part, and so on.
  • the detection unit 12 also stores a priority table as illustrated in FIG. 8.
  • In this table, arrangement patterns of the structure with respect to its site are associated with priority patterns 21, 22, 23, ….
  • the detection unit 12 also stores a priority table as illustrated in FIG. 9.
  • each destination is associated with a priority pattern, which is an algorithm for detecting structures around the destination.
  • This priority is specified, according to his or her own wishes, by the user at the destination of the drone 10 (that is, the user who receives the package): for example, first try to detect a "parking lot", then try to detect a "garden", and so on. This instruction is requested of the user terminal 30 before the cargo is transported, and the user responds to the notification. However, since the user may not indicate a priority, in that case no priority information is registered in the table for the destination at which that user receives the parcel.
  • the priority patterns illustrated in FIGS. 6 to 9 may be used together. For example, the detection unit 12 determines which of the positional relationship patterns, shape patterns, and arrangement patterns illustrated in FIGS. 6 to 8 the destination best fits, and searches for structures according to the corresponding priority pattern. For destinations whose priorities are registered in the table shown in FIG. 9, the detection unit 12 searches for structures according to the priority pattern registered in FIG. 9; for destinations for which no priority is registered, it searches according to a priority pattern determined using any one of FIGS. 6 to 8 or a combination thereof.
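The fallback logic above (user-registered priority first, derived pattern otherwise) can be sketched as follows; the argument shapes are illustrative assumptions.

```python
def choose_priority(user_priority, derived_priority):
    """Prefer the user's registered priority pattern (the FIG. 9 table)
    when one exists; otherwise fall back to a pattern derived from the
    positional-relationship, shape, or arrangement tables (FIGS. 6 to 8).
    Both arguments are ordered lists of structure labels; None or an
    empty list means the user registered nothing."""
    return user_priority if user_priority else derived_priority
```

So a user who asked for "parking lot, then garden" overrides whatever the tables derive, while an unregistered destination falls through to the derived pattern.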
  • the drone 10 searches for a structure after approaching it by lowering its altitude to some extent from the sky above the destination. For example, when the drone 10 (the detection unit 12) attempts to detect a door, it descends to a position where the door can be recognized (a position roughly level with the door) before starting the search. Time efficiency is therefore poor if the drone 10 lowers its altitude from the sky each time it searches for each structure. For this reason, when more than one of the positional relationship patterns, shape patterns, and arrangement patterns illustrated in FIGS. 6 to 8 applies, it is desirable to use the priority pattern that allows structures to be searched fastest. For example, assume that there are priority patterns A, B, and C, each applicable to the positional relationship pattern illustrated in FIG. 6.
  • FIG. 10 illustrates the movement route of the drone 10 when detection is performed according to the priority pattern A
  • FIG. 11 illustrates the movement route of the drone 10 when detection is performed according to the priority pattern B
  • 12 exemplifies the moving route of the drone 10 when detection is performed according to the priority pattern C.
  • FIG. 10 is a bird's-eye view illustrating the destination of the drone 10 together with the structure and its site. The number attached to each arrow indicates the movement order of the drone 10. Among the movement routes illustrated in FIGS. 10 to 12, the priority pattern whose route allows structures to be searched fastest is used.
  • the detection unit 12 uses the order of priority that enables detection of structures within a range in which the flight altitude of the drone 10 falls within a predetermined range.
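The comparison among candidate priority patterns can be sketched as a simple cost minimization; the pattern names and route costs below are hypothetical values standing in for the routes of FIGS. 10 to 12.

```python
def pick_fastest_pattern(route_cost: dict) -> str:
    """Pick the priority pattern whose estimated search route is shortest,
    matching the embodiment's preference for the pattern that finds
    structures fastest. `route_cost` maps a pattern name to an estimated
    route length (e.g. in metres); the values are assumptions."""
    return min(route_cost, key=route_cost.get)
```

For instance, if patterns A, B, and C yield estimated routes of 120 m, 80 m, and 95 m respectively, pattern B is selected.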
  • the determination unit 13 determines whether the drone 10 can land at the landing position determined with respect to the structure detected by the detection unit 12. Specifically, the determination unit 13 determines whether or not the drone 10 can land based on the levelness and flatness of the landing position, or whether or not there is liquid. The horizontality and flatness of the landing position are determined based on the outputs of the ranging sensor and image sensor of the sensor 1008 . Also, whether or not there is liquid (typically water) at the landing position is determined based on the output of the image sensor of sensor 1008 .
  • As for the landing position determined for each structure: for example, the landing position for a structure called a garden is the center of the garden, and the landing position for a structure called a parking lot is near the edge of the parking lot.
  • the landing position for a structure called a door is in front of the door, and the landing position for a structure called a gate is a position closer to the structure than the gate.
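The per-structure landing rules above can be sketched as a lookup; the labels and the default string are illustrative assumptions, not claim language.

```python
# Illustrative mapping from a detected structure type to the
# landing-position rule stated in the text.
LANDING_RULES = {
    "garden": "center of the garden",
    "parking_lot": "near the edge of the parking lot",
    "door": "in front of the door",
    "gate": "between the gate and the structure",
}

def landing_rule(structure_type: str) -> str:
    """Return the landing-position rule for a structure type, or a
    default message when no rule is defined."""
    return LANDING_RULES.get(structure_type, "no rule defined")
```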
  • the landing control unit 14 controls the flight drive mechanism 1009 to land the drone 10 at the landing position when the determination unit 13 determines that landing is possible there.
  • the drone 10 starts flying from the departure/arrival point to the destination (step S01). After that, the drone 10 flies under the control of the server device 50 up to the destination address specified at the time of the package delivery request.
  • the detection unit 12 detects the structure and/or other structures existing on the site based on the image captured by the sensor 1008 (image sensor). At this time, the detection unit 12 detects structures according to the above-described priority order (step S03).
  • the determination unit 13 determines whether the drone 10 can land at the landing position determined for the structure detected by the detection unit 12. Specifically, the determination unit 13 identifies factors related to suitability for landing, such as levelness and flatness of the landing position, or presence or absence of liquid, from the output of the ranging sensor and image sensor of the sensor 1008. Then, it is determined whether or not the drone 10 can land (step S04).
  • the landing control unit 14 controls the flight drive mechanism 1009 to land the drone 10 at the landing position when the determination unit 13 determines that landing is possible there (step S05). If the determination unit 13 does not determine that the drone can land at the landing position, the detection unit 12 tries to detect the structure with the next priority, and the above steps S03 and S04 are repeated. When none of the structures is determined to be landable, the drone 10 performs a predetermined NG process and returns to the departure/arrival point.
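The detect-judge-land loop of steps S03 to S05 can be sketched as follows. The `can_land` predicate stands in for the determination unit 13 and is a caller-supplied assumption for illustration.

```python
def attempt_landing(structures_in_priority_order, can_land):
    """Sketch of steps S03 to S05: try each detected structure in
    priority order and land at the first one whose landing position is
    judged landable; if none is landable, perform the NG process
    (return to the departure/arrival point)."""
    for structure in structures_in_priority_order:
        if can_land(structure):           # step S04: landability judgment
            return ("land", structure)    # step S05: land here
    return ("return_to_base", None)       # NG process: no landable structure
```

For example, with priorities `["door", "garden"]` and a judgment that only the garden is landable, the drone skips the door and lands at the garden.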
  • [Modification] The invention is not limited to the embodiments described above.
  • the embodiment described above may be modified as follows. Also, two or more of the following modified examples may be combined for implementation.
  • [Modification 1] For example, when a plurality of structures are detected at one destination at the same time, one structure to be determined for the possibility of landing may be determined based on the distance from the drone. For example, when two structures called doors are detected by one detection process, the determination unit 13 determines whether or not the door closer to the drone 10 at that time can land. In other words, when multiple structures are detected, the determination unit 13 may determine a structure to be determined based on the distance from the drone 10 .
  • the detection unit 12 detects a structure based on an image, captured by the sensor 1008 (image sensor) of the drone 10, of the site including the structure. In this detection, a likelihood (recognition accuracy) that each detected object corresponds to a structure is calculated. Therefore, when a plurality of structures is detected at one destination at the same time, the determination unit 13 may select the structure to be judged based on the likelihood of each structure. For example, the determination unit 13 compares the likelihoods and selects the structure with the highest likelihood as the one to be judged. In this way, the structure to be judged as the landing position can be determined based on likelihood.
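Modifications 1 and 2 give two selection criteria (distance and likelihood). One way to combine them, sketched below, is to rank by likelihood first and break ties by proximity; combining them in this order is an assumption, and the candidate tuples are illustrative.

```python
def select_target(candidates):
    """Pick one structure when several are detected simultaneously:
    highest likelihood wins, and among equal likelihoods the one
    closest to the drone wins. Each candidate is a tuple of
    (label, likelihood, distance_m)."""
    return max(candidates, key=lambda c: (c[1], -c[2]))
```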
  • Drone landing control may be realized by so-called edge computing (control by the drone), by cloud computing (control by the server device), or by a combination of both (control by the drone and the server device), as described in the embodiments. The control device of the present invention may therefore be provided in the server device 50.
  • The unmanned flying object is not limited to what is called a drone; it may have any structure or form as long as it is an unmanned flying object capable of transporting cargo.
  • The present invention can also be applied to a manned flying object that carries a person on board but operates automatically.
  • In the embodiments, an image sensor is used as the imaging means provided in the sensor 1008 of the drone 10 for detecting a structure. However, the method of detecting a structure is not limited to the example of the embodiments; detection based on, for example, shape, size, or the like can also be used. That is, it suffices that the sensor 1008 includes a sensor capable of sensing the site including the structure, and that the detection unit 12 of the server device 50 detects the structure and/or structures present at the site based on the result of sensing the site including the structure.
  • In the embodiments, the determination unit 13 determines whether the flying object can land based on the levelness and flatness of the landing position, or on whether liquid is present there. However, the determination may be made based on other conditions of the landing position; specifically, for example, the material and temperature of the landing position, or whether it is covered with snow, can be considered.
  • The material of the landing position and the presence or absence of snow cover are determined based on, for example, the output of the image sensor of the sensor 1008.
  • The temperature of the landing position is determined based on the output of a non-contact temperature sensor included in the sensor 1008.
  • In short, the determination unit 13 can determine whether the flying object can land based on at least one of a plurality of types of landing-position conditions, such as the levelness, flatness, material, or temperature of the landing position, or whether liquid or snow is present there.
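  • A landability check over several landing-position conditions of this kind might look like the following sketch. The condition keys and threshold values are illustrative assumptions for the example, not values from the publication.

```python
def can_land(conditions, *,
             max_tilt_deg=5.0, max_roughness=0.02, max_temp_c=60.0):
    """Return True only if every checked landing-position condition is acceptable.

    `conditions` maps condition names to sensed values; missing keys fall back
    to benign defaults so that any subset of conditions can be checked.
    """
    checks = [
        conditions.get("tilt_deg", 0.0) <= max_tilt_deg,    # levelness
        conditions.get("roughness", 0.0) <= max_roughness,  # flatness
        conditions.get("temp_c", 20.0) <= max_temp_c,       # surface temperature
        not conditions.get("liquid", False),                # no standing liquid
        not conditions.get("snow", False),                  # no snow cover
    ]
    return all(checks)

print(can_land({"tilt_deg": 2.0, "roughness": 0.01}))  # True
print(can_land({"tilt_deg": 2.0, "snow": True}))       # False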
  • Each functional block may be implemented by one physically and/or logically coupled device, or by two or more physically and/or logically separate devices connected directly and/or indirectly (for example, by wire and/or wirelessly).
  • one computer may have the functions of the user terminals 30 to 32 exemplified in the embodiments.
  • each function exemplified in FIG. 5 may be provided in any of the devices constituting the drone management system 1 as an information processing system.
  • The server device 50 may also directly control the drone 10. For example, the server device 50 may have a function corresponding to the processing unit 313 and directly restrict the flight of the drone 10.
  • The aspects and embodiments described herein may be applied to systems using LTE (Long Term Evolution), LTE-A (LTE-Advanced), SUPER 3G, IMT-Advanced, 4G, 5G, FRA (Future Radio Access), W-CDMA (registered trademark), GSM (registered trademark), CDMA2000, UMB (Ultra Mobile Broadband), IEEE 802.11 (Wi-Fi), IEEE 802.16 (WiMAX), IEEE 802.20, or UWB (Ultra-WideBand), and/or to next-generation systems extended based on them.
  • the information or parameters described in this specification may be represented by absolute values, relative values from a predetermined value, or other corresponding information.
  • The terms "determining" and "deciding" as used herein may encompass a wide variety of actions. "Determining" may include, for example, judging, calculating, computing, processing, deriving, investigating, and looking up (e.g., searching in a table, database, or other data structure), as well as ascertaining. "Determining" may also include regarding receiving (e.g., receiving information), transmitting (e.g., transmitting information), input, output, or accessing (e.g., accessing data in memory) as having "determined" something.
  • The present invention may also be provided as an information processing method or as a program. Such a program may be provided in a form recorded on a recording medium such as an optical disc, or in a form in which the program is downloaded to a computer via a network such as the Internet, installed, and made available for use.
  • Software, instructions, and the like may be transmitted and received via a transmission medium. For example, when software is transmitted from a website, server, or other remote source using wired and/or wireless technologies, those wired and/or wireless technologies are included within the definition of a transmission medium.
  • Data, instructions, commands, information, signals, bits, symbols, chips, and the like may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, light fields or photons, or any combination of these.
  • References to elements using designations such as "first" and "second" as used herein do not generally limit the quantity or order of those elements. These designations may be used herein as a convenient way of distinguishing between two or more elements. Thus, a reference to first and second elements does not mean that only two elements may be employed there, or that the first element must precede the second element in some way.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

According to the invention, a drone (10) comprises: a detection unit (12) for, once a flying object is located above a structure designated as a destination, detecting the structure and/or a structure fixed at the site, based on an image of the site including the structure captured by the flying object; a determination unit (13) that determines whether the flying object can land at a landing position determined for the detected structure; and a landing control unit (14) for causing the flying object to land at the landing position when it is determined that landing is possible. The detection unit (12) detects the structure according to an order of priority corresponding to the positional relationship between the structure designated as the destination of the drone (10) and another structure and/or another site adjacent to the structure.
PCT/JP2022/025759 2021-07-07 2022-06-28 Dispositif de commande WO2023282124A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023533549A JPWO2023282124A1 (fr) 2021-07-07 2022-06-28

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-112731 2021-07-07
JP2021112731 2021-07-07

Publications (1)

Publication Number Publication Date
WO2023282124A1 true WO2023282124A1 (fr) 2023-01-12

Family

ID=84801589

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/025759 WO2023282124A1 (fr) 2021-07-07 2022-06-28 Dispositif de commande

Country Status (2)

Country Link
JP (1) JPWO2023282124A1 (fr)
WO (1) WO2023282124A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019049874A * (ja) 2017-09-11 2019-03-28 Kddi株式会社 Flight management device, flight device, and flight management method
WO2019146576A1 * (fr) 2018-01-23 2019-08-01 株式会社Nttドコモ Information processing device
JP6622291B2 * (ja) 2014-09-08 2019-12-18 クゥアルコム・インコーポレイテッドQualcomm Incorporated Methods, systems, and devices for delivery drone security
WO2021039100A1 * (fr) 2019-08-27 2021-03-04 ソニー株式会社 Mobile body, information processing device, information processing method, and program

Also Published As

Publication number Publication date
JPWO2023282124A1 (fr) 2023-01-12

Similar Documents

Publication Publication Date Title
US20180165973A1 (en) Unmanned aerial vehicle and system having the same
CN110998467A (zh) 确定运送位置处的下放点的模型
US20170358216A1 (en) Flying lane management systems and methods for unmanned aerial vehicles
US11932391B2 (en) Wireless communication relay system using unmanned device and method therefor
CN111542479B (zh) 物品交接场所的决定方法、着陆场所的决定方法、物品交接系统、及信息处理装置
JP2014197404A (ja) 混在モード車輌の自律的モードへの移行
US10810891B2 (en) Unmanned aerial vehicle and system having the same and method for searching for route of unmanned aerial vehicle
US11513233B2 (en) Drone escort system
JPWO2018078859A1 (ja) 飛行制御プログラム、飛行制御方法、および情報処理装置
KR102296225B1 (ko) 카메라를 탑재하지 않은 소형 비행체 및 그 비행체의 이동 방법
WO2018057629A1 (fr) Véhicules autonomes effectuant une gestion de stock
US20210150914A1 (en) Flight control apparatus
JP7167327B2 (ja) 制御装置、プログラム及び制御方法
US10755582B2 (en) Drone physical and data interface for enhanced distance coverage
JP7178351B2 (ja) 飛行制御システム
JP2018206053A (ja) 駐車支援システムおよび駐車支援方法
WO2019087891A1 (fr) Dispositif de traitement d'informations et système de commande de vol
WO2023282124A1 (fr) Dispositif de commande
JP7148567B2 (ja) システム、管理装置、プログラム、及び管理方法
JP7050809B2 (ja) 情報処理装置
US20210343162A1 (en) Information processing apparatus
WO2023021948A1 (fr) Dispositif de commande et programme
WO2023042601A1 (fr) Dispositif de traitement d'informations
JP7167341B2 (ja) 情報処理装置
WO2023162583A1 (fr) Appareil de commande

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22837540

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023533549

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE