WO2020262222A1 - Control system for flying vehicle - Google Patents

Control system for flying vehicle

Info

Publication number
WO2020262222A1
Authority
WO
WIPO (PCT)
Prior art keywords
cursor
image
control system
air vehicle
flying object
Prior art date
Application number
PCT/JP2020/024100
Other languages
English (en)
Japanese (ja)
Inventor
知也 榊原
Original Assignee
株式会社Clue
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Clue
Priority to JP2021526902A (JPWO2020262222A1)
Publication of WO2020262222A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 20/00: Constructional aspects of UAVs
    • B64U 20/80: Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U 20/87: Mounting of imaging devices, e.g. mounting of gimbals
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C: AEROPLANES; HELICOPTERS
    • B64C 13/00: Control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers
    • B64C 13/02: Initiating means
    • B64C 13/16: Initiating means actuated automatically, e.g. responsive to gust detectors
    • B64C 13/20: Initiating means actuated automatically, e.g. responsive to gust detectors using radiated signals
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10: Simultaneous control of position or course in three dimensions
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 5/00: Traffic control systems for aircraft, e.g. air-traffic control [ATC]

Definitions

  • The present disclosure relates to an air vehicle control system, and in particular to an air vehicle control system that controls the flight of an air vehicle by user operation.
  • A flying body such as a so-called drone or multicopter, which flies by rotating multiple propellers, may be used.
  • Patent Document 1 discloses generating a three-dimensional image from an image obtained by capturing an object with a camera mounted on an air vehicle, and Patent Document 2 discloses inspecting a sewage pipeline with a camera mounted on an air vehicle.
  • The present disclosure has been made in view of the above circumstances, and an object of the present invention is to provide an air vehicle control system capable of controlling the flight of an air vehicle by a simple operation.
  • The flying object control system for achieving the above object is a control system that controls the flight of a flying object by a user's operation, and includes: an image acquisition unit that is mounted on the flying object and acquires an image of an object from the flying object; a display unit that has a display surface on which the image acquired by the image acquisition unit is displayed; a cursor generation module that generates a cursor displayed on the display surface so as to be superimposed on the image displayed on the display surface of the display unit; and a cursor movement module that moves the cursor generated by the cursor generation module to an arbitrary position on the image. The system is characterized in that the cursor is moved by the cursor movement module, and the flying object is moved to the actual position corresponding to the arbitrary position on the image at which the cursor is positioned.
  • According to this configuration, a cursor is displayed overlaid on the image shown on the display surface of the display unit, and the flying object can be moved by moving the cursor, so the flight of the flying object can be controlled by a simple operation. It is therefore possible to suppress unexpected behavior of the flying object due to erroneous operation.
  • The control system of this flying object may further be characterized in that, when the cursor is positioned at an arbitrary position on the image, that position on the image is moved to the center of the display surface.
  • The display unit of the control system of the air vehicle may be a touch panel device that receives operation input for the air vehicle when the user touches the display surface.
  • The cursor generation module of the control system of the flying object may be characterized in that the cursor is generated at a position on the image separated from the position on the image that the user touches through the display surface.
  • With this configuration, when the cursor is moved to a position on the image for positioning, the target position on the image is not hidden by the user's finger touching the display surface, so the cursor can be accurately positioned at any position on the image.
  • In the cursor movement module of the control system of the flying object, the position on the image that the user touches through the display surface is set as the contact point; when the user displaces the contact point while remaining in contact with the display surface, the cursor is moved on the image according to the displacement of the contact point, and the cursor is positioned at the position on the image where it is located when the user releases contact with the display surface.
  • The flying object may be one that is moved horizontally above the object, and the object may be the roof of a building.
  • Since the flight of the flying object can be controlled by a simple operation, unexpected behavior of the flying object due to erroneous operation can be suppressed.
  • FIG. 1 is a diagram illustrating an outline of a control system for an air vehicle according to the present embodiment.
  • The flying object control system 1 includes the flying object 10, the portable information terminal 20 that controls the flight of the flying object 10 through operation by the user U, and the server 40, which is communicably connected to the flying object 10 via a network.
  • In the present embodiment, the flying object control system 1 controls the flight of the flying object 10, which images the roof 101 of the building 100 (the object); the image of the roof 101 captured by the flying object 10 is used, for example, to detect portions of the roof 101 requiring repair.
  • The flying object 10 is a so-called drone or multicopter, and includes an airframe 11, motors 16 arranged at the tips of a plurality of arms formed radially from the airframe 11, and propellers 17 connected to the motors 16.
  • FIG. 2 is a block diagram illustrating a hardware configuration of the flying object 10 according to the present embodiment.
  • The flying object 10 includes a transmission/reception unit 12, a flight controller 13 connected to the transmission/reception unit 12, a battery 14 that supplies electric power via the flight controller 13, and an electronic speed controller (ESC) 15 that is connected to the flight controller 13 and controls the motors 16 driving the propellers 17.
  • The transmission/reception unit 12 is a communication interface configured to exchange data with a plurality of external devices such as an information terminal, a display device, or another remote controller. In the present embodiment, the transmission/reception unit 12 receives control signals from the portable information terminal 20 and also sends and receives various data to and from the server 40.
  • The transmission/reception unit 12 can use multiple communication networks such as a local area network (LAN), a wide area network (WAN), infrared, radio, Wi-Fi, a point-to-point (P2P) network, a telecommunications network, and cloud communication.
  • The transmission/reception unit 12 transmits and receives acquired data, processing results generated by the flight controller 13, various control data, and user commands from a terminal or remote controller.
  • The flight controller 13 includes, as its main components, a processor 13A, a memory 13B, and sensors 13C.
  • The processor 13A is composed of, for example, a CPU (Central Processing Unit), and controls the operation of the flight controller 13, controls the transmission and reception of data between elements, and performs the processing necessary for executing programs.
  • The memory 13B includes a main storage device composed of a volatile storage device such as DRAM (Dynamic Random Access Memory), and an auxiliary storage device composed of a non-volatile storage device such as flash memory or an HDD (Hard Disk Drive).
  • The memory 13B is used as a work area for the processor 13A, and also stores logic, code, program instructions executable by the flight controller 13, and various setting information.
  • The memory 13B may also be configured so that data acquired from the sensors 13C and the like is transmitted to it and stored directly.
  • The sensors 13C are composed of various sensors such as a GPS sensor that receives radio waves from GPS satellites, a pressure sensor that measures atmospheric pressure, a temperature sensor that measures temperature, and an acceleration sensor.
  • The flying object 10 also includes a camera 18, an image acquisition unit fixed to the airframe 11, which acquires images of the roof 101 of the building 100 from the flying object 10. The camera 18 captures the roof 101 and acquires it as an image.
  • FIG. 3 is a block diagram illustrating an outline of the configuration of the portable information terminal 20 for operating the flying object 10 according to the present embodiment.
  • The portable information terminal 20 is implemented as a so-called small tablet computer and, as shown in the figure, includes a control unit 21 and a touch panel unit 22 serving as the display unit.
  • The portable information terminal 20 is not limited to the small tablet computer described above, and may be implemented by a smartphone, a personal computer, a portable game machine, or the like.
  • Alternatively, the server 40 may provide the functions of the portable information terminal 20, with the user U acquiring information and performing operations through the interface of another computer, tablet terminal, or the like. The portable information terminal 20 may also be a terminal device, such as a tablet or computer, to which a radio transmitter for controlling the flight of the flying object 10 is attached and which controls that transmitter.
  • The control unit 21 includes, as its main components, a processor 21a, a memory 21b, a storage 21c, a transmission/reception unit 21d, and an input/output unit 21e, which are electrically connected to one another via a bus 21f.
  • The processor 21a is an arithmetic unit that controls the operation of the control unit 21, controls the transmission and reception of data between elements, and performs the processing necessary for executing programs.
  • The processor 21a is, for example, a CPU (Central Processing Unit), and executes each process by running programs stored in the storage 21c and loaded into the memory 21b, as described later.
  • The memory 21b includes a main storage device composed of a volatile storage device such as DRAM (Dynamic Random Access Memory), and an auxiliary storage device composed of a non-volatile storage device such as flash memory or an HDD (Hard Disk Drive), and stores firmware such as a BIOS (Basic Input/Output System).
  • The storage 21c stores programs and information used for various processes. In the present embodiment, it stores a control program for controlling the flight of the flying object 10 and the images of the roof 101 acquired by the camera 18.
  • The transmission/reception unit 21d connects the control unit 21 to a network such as the Internet, and may include a short-range communication interface such as Bluetooth (registered trademark) or BLE (Bluetooth Low Energy). In the present embodiment, control signals for controlling the flight of the flying object 10 are transmitted to the flying object 10 via the transmission/reception unit 21d.
  • The input/output unit 21e is an interface to which input/output devices are connected; in the present embodiment, the touch panel unit 22 is connected to it.
  • The bus 21f transmits, for example, address signals, data signals, and various control signals between the connected processor 21a, memory 21b, storage 21c, transmission/reception unit 21d, and input/output unit 21e.
  • The touch panel unit 22 includes a display surface 22a on which the images acquired by the camera 18 are displayed. The display surface 22a receives information input through contact with it, and is implemented by various techniques such as resistive film and capacitive sensing.
  • FIG. 4 is a diagram illustrating an outline of the configuration of the screen interface of the display surface 22a.
  • The screen interface IF is composed of a window W, a toolbar T, and commands C1 to Cn.
  • In the window W, the image M of the roof 101 of the building 100 acquired by the camera 18 is displayed, and various operations on the flying object 10 are input.
  • The cursor P generated by the cursor generation module described later is displayed superimposed on the image M, at a position on the image M (for example, the top 101a of the roof 101) corresponding to the actual position of the flying object 10.
  • The toolbar T executes various processes on the image M displayed in the window W, and provides functions such as "home", "file", and "edit".
  • The commands C1 to Cn are a plurality of input units for entering commands related to the operation of the flying object 10 and of the camera 18 mounted on it, such as "ascending", "altitude change", and "landing" for the flying object 10, or "imaging" by the camera 18. A sketch of how such input units could be wired up follows below.
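  • As a rough, non-authoritative illustration of such command input units, the following Python fragment maps command names to the actions they trigger; the CommandBar class, the handler names, and the print placeholders are assumptions made for illustration, not the patent's actual implementation.

```python
# Hypothetical sketch: wiring the command input units C1..Cn to actions.
from typing import Callable, Dict

class CommandBar:
    """Maps command names (the input units C1..Cn) to the actions they trigger."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[], None]] = {}

    def register(self, name: str, handler: Callable[[], None]) -> None:
        # Each input unit is registered with the action it should execute.
        self._handlers[name] = handler

    def press(self, name: str) -> None:
        # Called when the user U taps one of the commands on the interface IF.
        self._handlers[name]()

# Example wiring for a few of the commands named in the text.
bar = CommandBar()
bar.register("ascending", lambda: print("generate and send ascend control signal"))
bar.register("landing", lambda: print("generate and send landing control signal"))
bar.register("imaging", lambda: print("trigger image capture by the camera"))
bar.press("ascending")
```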
  • FIG. 5 is a block diagram illustrating an outline of the configuration of the control program 30 stored in the storage 21c of the control unit 21 of the portable information terminal 20.
  • the control program 30 includes an input module 31, a cursor generation module 32, a cursor movement module 33, a video processing module 34, and a control signal generation module 35.
  • The input module 31 is a module that receives various operations on the flying object 10 when the user U touches the window W, or various commands input via the commands C1 to Cn, and executes data processing based on these inputs.
  • FIG. 6 is a diagram illustrating an outline of processing of the cursor generation module 32.
  • The cursor generation module 32 is a module that generates the cursor P, which is displayed in the window W superimposed on the image M shown there.
  • The generated cursor P is displayed superimposed on the image M at a position (for example, the top 101a of the roof 101) corresponding to the actual position of the flying object 10.
  • In the present embodiment, the cursor generation module 32 generates the cursor P at a position on the image M separated by an arbitrary distance δ from the contact point S at which the user U touches the image M.
  • Note that, depending on where the user U touches the image M through the window W, the actual position of the flying object 10 and the position of the cursor P on the image M may differ; at this stage the cursor P is not yet positioned at any particular point, that is, the cursor P can still be moved to an arbitrary position on the image M.
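  • As a rough illustration of this cursor-generation behavior, a minimal Python sketch follows. It assumes a 2D pixel coordinate system for the image M; the Point and CursorGenerator names and the edge-clamping behavior are our own assumptions, not the patent's implementation.

```python
# Hypothetical sketch: spawn the cursor P at a fixed separation (delta) from
# the contact point S, so the finger never covers the cursor itself.
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

class CursorGenerator:
    def __init__(self, delta: Point) -> None:
        # delta is the arbitrary separation between contact point S and cursor P.
        self.delta = delta

    def generate(self, contact: Point, image_w: int, image_h: int) -> Point:
        # Place the cursor at contact + delta, clamped to the image bounds so it
        # always remains visible on the displayed image M (the clamping is an
        # assumption; the patent does not specify edge handling).
        x = min(max(contact.x + self.delta.x, 0), image_w - 1)
        y = min(max(contact.y + self.delta.y, 0), image_h - 1)
        return Point(x, y)

# Touching at (200, 300) with delta (0, -80) spawns the cursor 80 px above the finger.
gen = CursorGenerator(Point(0, -80))
cursor = gen.generate(Point(200, 300), image_w=1280, image_h=720)
```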
  • FIG. 7 is a diagram illustrating an outline of processing of the cursor movement module 33.
  • The cursor movement module 33 is a module that moves the cursor P generated by the cursor generation module 32 to an arbitrary position on the image M.
  • When the user U displaces the contact point S while remaining in contact with the display surface 22a, the cursor P, generated at the top 101a of the roof 101 (a position on the image M at the arbitrary separation distance δ from the contact point S), moves following the displacement of the contact point S.
  • In the illustrated example, the cursor P generated at the top 101a of the roof 101 moves following the displacement of the contact point S, and the cursor P is positioned at the corner 101b of the roof 101 when the user U releases contact with the display surface 22a.
  • Because of the separation distance δ, the corner 101b of the roof 101, which is the target point, is not concealed by the finger F of the user U, so the cursor P can be accurately positioned at the target point.
  • In the above description, the cursor movement module 33 continuously moves the cursor P from its initial position on the image M by following the movement of the contact point S at the separation distance δ. However, the present embodiment is not limited to such an example.
  • For example, the cursor movement module 33 may move the cursor P starting from a position different from the cursor P's initial position. Specifically, the cursor movement module 33 may set an arbitrary position on the image that the user U touches through the display surface 22a as the contact point S, display the cursor P at a position separated from the contact point S by the distance δ, and then move the cursor P following the movement of the contact point S.
  • In this case, the cursor P may remain displayed at its initial position until the user U touches an arbitrary point on the image. If the separation distance δ between the cursor P and the contact point S is a predetermined distance, the user U can easily move the cursor P along with the movement of the contact point S.
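  • The drag-and-release behavior described above can be pictured with the following minimal Python sketch (reusing the Point class from the previous sketch); the event-handler names are touch-UI conventions assumed for illustration and do not come from the patent.

```python
# Hypothetical sketch: while the finger stays down the cursor tracks the
# displacement of the contact point S; lifting the finger fixes the cursor
# as the target point.
class CursorMover:
    def __init__(self, cursor: Point) -> None:
        self.cursor = cursor      # current cursor position P
        self.last_contact = None  # last known contact point S
        self.target = None        # position fixed on release

    def on_touch_down(self, contact: Point) -> None:
        self.last_contact = contact

    def on_touch_move(self, contact: Point) -> None:
        if self.last_contact is None:
            self.last_contact = contact
            return
        # Move the cursor by the same displacement as the contact point,
        # preserving the separation between finger and cursor.
        dx = contact.x - self.last_contact.x
        dy = contact.y - self.last_contact.y
        self.cursor = Point(self.cursor.x + dx, self.cursor.y + dy)
        self.last_contact = contact

    def on_touch_up(self) -> Point:
        # Releasing contact positions the cursor: this becomes the target
        # point toward which the flying object should move.
        self.target = self.cursor
        return self.target
```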
  • FIG. 8 is a diagram illustrating an outline of the processing of the video processing module 34. As shown in the drawing, when the cursor P is positioned with an arbitrary position on the image M as the target point, that position on the image M is moved to the central portion of the window W on the display surface 22a.
  • In the illustrated example, the display of the image M on the display surface 22a is scrolled so that the corner 101b of the roof 101 is positioned at the central portion of the display surface 22a.
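  • One plausible way to realize this recentering, sketched in Python under the same assumptions as the previous sketches: compute the scroll offset that puts the target at the window center, clamped to the image edges (the clamping is our assumption, not the patent's).

```python
# Hypothetical sketch: scroll the image M so the positioned point sits at the
# center of the window W. Returns the top-left image coordinate to display.
def center_on(target: Point, view_w: int, view_h: int,
              image_w: int, image_h: int) -> tuple:
    # Ideal offset puts the target at the window center...
    off_x = target.x - view_w / 2
    off_y = target.y - view_h / 2
    # ...clamped so we never scroll past the edges of the image M.
    off_x = min(max(off_x, 0), max(image_w - view_w, 0))
    off_y = min(max(off_y, 0), max(image_h - view_h, 0))
    return off_x, off_y
```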
  • The control signal generation module 35 is a module that generates various control signals related to the operation of the flying object 10 based on the operations and commands of the user U input via the input module 31.
  • For example, a control signal for adjusting the angle of the camera 18 is generated and transmitted to the flying object 10, whereupon the angle of the camera 18 is adjusted.
  • When the "imaging" command is input, a control signal for causing the camera 18 to capture the roof 101 of the building 100 as an image is generated and transmitted to the flying object 10, and the camera 18 images the roof 101.
  • FIG. 9 is a diagram illustrating an outline of how the flight of the flying object 10 is controlled by a control signal generated by the control signal generation module 35. For example, when an "ascending" command is input via the commands C1 to Cn, a control signal for raising the flying object 10 is generated and transmitted to the flying object 10 as shown in the figure, and the flying object 10 rises above the building 100.
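  • A non-authoritative Python sketch of the command-to-control-signal path follows; the message format, the command payloads, and the transport object's send() method are all assumptions made for illustration, not the patent's actual protocol.

```python
# Hypothetical sketch: translate user commands into serialized control
# signals and hand them to the terminal's transmission unit.
import json

COMMANDS = {
    "ascending": {"type": "velocity", "vz": 1.0},        # climb
    "landing":   {"type": "land"},                       # descend and land
    "imaging":   {"type": "camera", "action": "capture"},
}

def generate_control_signal(command: str) -> bytes:
    # Look up the command and serialize it as the control signal payload.
    if command not in COMMANDS:
        raise ValueError(f"unknown command: {command}")
    return json.dumps(COMMANDS[command]).encode("utf-8")

def send_to_vehicle(transport, command: str) -> None:
    # `transport` stands in for the transmission/reception unit 21d; its
    # send() method is assumed here for illustration.
    transport.send(generate_control_signal(command))
```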
  • The server 40 is implemented by, for example, a desktop computer; in the present embodiment, the images of the roof 101 of the building 100 captured by the camera 18 of the flying object 10 are stored in its storage.
  • The flying object 10 ascends above the building 100, as shown in FIG. 9.
  • The image M acquired by the camera 18 of the flying object 10 is displayed in the window W of the screen interface IF, and the cursor P is generated at the top 101a of the roof 101, the position on the image M corresponding to the actual position of the flying object 10.
  • Next, the user U touches the image M through the window W and moves the cursor P, positioning it at the corner 101b of the roof 101, which is the target point. The flying object 10 then moves in the horizontal direction H of the building 100, from above the top 101a of the roof 101 toward the actual position corresponding to the corner 101b of the roof 101 on the image M.
  • At this time, the display of the image M on the display surface 22a is scrolled so that the corner 101b of the roof 101 is positioned at the central portion of the display surface 22a.
  • The camera 18 of the flying object 10 then captures the corner 101b of the roof 101 as an image.
  • The captured image is stored in the storage 21c of the portable information terminal 20 and also in the storage of the server 40, and is later used to detect portions of the roof 101 requiring repair.
  • To land the flying object 10, the user U touches the image M through the window W and moves the cursor P to an arbitrary position on the image M where the flying object 10 can land (for example, an unobstructed ground surface); the flying object 10 then moves in the horizontal direction H of the building 100 toward the actual position corresponding to that ground surface on the image M.
  • The user U then inputs a command via the commands C1 to Cn of the screen interface IF on the display surface 22a of the touch panel unit 22 of the portable information terminal 20, and the flying object 10 descends and lands on the ground surface.
  • As described above, the cursor P is displayed superimposed on the image M shown in the window W of the screen interface IF on the display surface 22a of the touch panel unit 22 of the portable information terminal 20, and the flying object 10 can be moved by moving the cursor P. Since the flight of the flying object 10 can thus be controlled by a simple operation, it is possible to suppress unexpected behavior of the flying object 10 due to erroneous operation.
  • Moreover, the cursor P is generated at a position on the image M separated by the distance δ from the position that the user U touches with the finger F through the window W of the screen interface IF on the display surface 22a. Therefore, when the cursor P is moved to and positioned at an arbitrary position on the image M as the target point, the finger F of the user U does not hide that position, and the cursor P can be accurately positioned at the target point.
  • Furthermore, the distance on the image between the initial position of the cursor P generated by the cursor generation module 32 and the position at which the cursor P is positioned by the cursor movement module 33 may be calculated, and the distance in real space corresponding to that on-image distance may be estimated based on information about the altitude of the flying object 10 and the zoom state of the camera 18. As a result, the flying object 10 can move more accurately in the horizontal direction of the building 100 toward the actual position corresponding to the position selected on the image M.
  • Next, another example of the present embodiment will be described.
  • FIG. 10 is a diagram illustrating an outline of processing according to another embodiment of the present disclosure.
  • An altitude information box H indicating the altitude of the flying object 10 is superimposed and displayed on the illustrated image M. Further, the image M displays a zoom information object Z indicating the zoom magnification of the camera 18.
  • the zoom information object Z may be a command for adjusting the zoom magnification of the camera 18.
  • In this example, the distance on the image M between the initial position of the cursor P generated by the cursor generation module 32 (in FIG. 10, the position corresponding to the top 101a of the roof 101 displayed in the image M) and the position at which the cursor P is positioned by the cursor movement module 33 (in FIG. 10, the position corresponding to the corner 101b of the roof 101 displayed in the image M) is calculated.
  • The method of calculating the distance on the image M is not particularly limited. The video processing module 34 then estimates the distance in real space corresponding to the distance on the image M, based on the distance on the image M and at least one of the altitude of the flying object 10 and the zoom information of the camera 18.
  • The distance in real space referred to here is the horizontal distance that the flying object 10 should move. The control signal generation module 35 generates a control signal based on the estimated real-space distance and transmits it to the flying object 10.
  • As a result, the flying object 10 moves in the horizontal direction H of the building 100 toward the actual position corresponding to the position selected on the image M, and its moving distance is the estimated real-space distance.
  • In this example, the distance in real space is estimated based on the altitude of the flying object 10 and the zoom information of the camera 18. The size of the roof 101 displayed in the image M changes with the altitude of the flying object 10 and the zoom adjustment of the camera 18, and with it the correspondence between the on-image distance from the initial position of the cursor P to the target position and the distance in real space. By accounting for this, the flying object 10 can be moved more accurately regardless of its altitude and the zoom state of the camera 18.
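  • As one plausible realization of this estimation, the following Python sketch converts the pixel distance between the cursor's initial and target positions into a horizontal real-space distance, assuming a downward-facing pinhole camera whose zoom multiplies the focal length; the patent does not specify the camera model, so every formula and parameter here is an assumption.

```python
# Hypothetical sketch: estimate the horizontal distance the flying object
# should move from the on-image cursor displacement, the altitude, and the
# camera zoom, under a nadir-view pinhole-camera assumption.
import math

def ground_distance(p_start: Point, p_end: Point, altitude_m: float,
                    fov_deg: float, zoom: float, image_w_px: int) -> float:
    # Pixel distance on the image M between the cursor's initial and target positions.
    pixels = math.hypot(p_end.x - p_start.x, p_end.y - p_start.y)
    # Zooming in multiplies the focal length, which narrows the effective
    # field of view (linearly in tan-space).
    half_fov = math.atan(math.tan(math.radians(fov_deg) / 2) / zoom)
    # Ground width covered by the full image width at this altitude.
    ground_width_m = 2 * altitude_m * math.tan(half_fov)
    metres_per_pixel = ground_width_m / image_w_px
    return pixels * metres_per_pixel

# Example: at 30 m altitude with a 70 degree horizontal FOV and no zoom, a
# 400-pixel drag on a 1280-pixel-wide image corresponds to roughly 13 m.
```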
  • Although the control system 1 of the air vehicle described above includes the server 40, the control system 1 can also be constructed without deploying the server 40.
  • For example, the flying object control system 1 may be constructed with the flying object 10 and the portable information terminal 20 communicating directly. In that case, the functions of the server 40 may be realized by the portable information terminal 20.
  • FIGS. 11 and 12 are diagrams illustrating an outline of an air vehicle control system according to another embodiment. For example, as shown in FIG. 11, the system may be configured so that the flying object 10 and the portable information terminal 20 communicate directly with each other, and the portable information terminal 20 and the server 40 communicate directly with each other. In this case, the server 40 communicates with the flying object 10 via the portable information terminal 20.
  • Alternatively, as shown in FIG. 12, the portable information terminal 20 and the server 40 may communicate directly with each other, and the server 40 and the flying object 10 may communicate directly with each other. In this case, the portable information terminal 20 communicates with the flying object 10 via the server 40.
  • Although the object in the above embodiment is the roof 101 of the building 100, the object may instead be a tree or an arbitrary ground surface, or even a temporarily stationary object such as a car or an animal.
  • 1 Aircraft control system, 10 Aircraft, 13 Flight controller, 18 Camera (video acquisition unit), 20 Portable information terminal, 21 Control unit, 22 Touch panel unit (display unit), 22a Display surface, 30 Control program, 32 Cursor generation module, 33 Cursor movement module, 34 Video processing module, 35 Control signal generation module, 100 Building, 101 Roof (object), 101a Top, 101b Corner, C1 to Cn Commands, F Finger, M Video, P Cursor, S Contact point, U User, W Window

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Mechanical Engineering (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A control system for a flying vehicle is disclosed which enables flight control of the flying vehicle by simple operations. The control system controls the flight of the flying vehicle through operations by a user and comprises: a video acquisition unit that is mounted on the flying vehicle and acquires video of a target object from the flying vehicle; a display unit that has a display surface on which the video acquired by the video acquisition unit is displayed; a cursor generation module that generates a cursor displayed on the display surface so as to be superimposed on the video displayed on the display surface of the display unit; and a cursor movement module that moves the cursor generated by the cursor generation module to any position on the video. The cursor is moved by the cursor movement module, and the flying vehicle is moved to a real position corresponding to the position on the video at which the cursor is positioned.
PCT/JP2020/024100 2019-06-24 2020-06-19 Control system for flying vehicle WO2020262222A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2021526902A JPWO2020262222A1 (fr) 2019-06-24 2020-06-19

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-116133 2019-06-24
JP2019116133 2019-06-24

Publications (1)

Publication Number Publication Date
WO2020262222A1 true WO2020262222A1 (fr) 2020-12-30

Family

ID=74059740

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/024100 WO2020262222A1 (fr) Control system for flying vehicle

Country Status (2)

Country Link
JP (1) JPWO2020262222A1 (fr)
WO (1) WO2020262222A1 (fr)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5155683A (en) * 1991-04-11 1992-10-13 Wadiatur Rahim Vehicle remote guidance with path control
JPH07135685A (ja) * 1993-01-05 1995-05-23 Sfim Ind Guidance system
JP2009229313A (ja) * 2008-03-24 2009-10-08 Toyota Motor Corp In-vehicle information device
WO2017057157A1 (fr) * 2015-09-30 2017-04-06 株式会社ニコン Flight device, movement device, server, and program
JP2017138162A (ja) * 2016-02-02 2017-08-10 Jfe鋼板株式会社 Structure inspection system and inspection method
CN108521803A (zh) * 2017-03-15 2018-09-11 深圳市大疆创新科技有限公司 Unmanned aerial vehicle waypoint planning method and system, electronic device, and storage medium
WO2019093504A1 (fr) * 2017-11-09 2019-05-16 株式会社Clue Terminal, method, and program for drone operation

Also Published As

Publication number Publication date
JPWO2020262222A1 (fr) 2020-12-30

Similar Documents

Publication Publication Date Title
US11914370B2 (en) System and method for providing easy-to-use release and auto-positioning for drone applications
CN108279694B Electronic device and control method therefor
US11604479B2 (en) Methods and system for vision-based landing
JP6835392B2 System and method for controlling images acquired by an imaging device
CN108292141B Method and system for target tracking
US20180181119A1 (en) Method and electronic device for controlling unmanned aerial vehicle
US20200326709A1 (en) Method and device for controlling reset of gimbal, gimbal, and unmanned aerial vehicle
CN105549604A Aircraft control method and apparatus
WO2021199449A1 Position calculation method and information processing system
KR102290746B1 Electronic device for controlling an unmanned aerial vehicle, and unmanned aerial vehicle and system controlled thereby
US11327477B2 (en) Somatosensory remote controller, somatosensory remote control flight system and method, and head-less control method
WO2021212278A1 Data processing method and apparatus, movable platform, and wearable device
WO2021251441A1 Method, system, and program
US20210181769A1 (en) Movable platform control method, movable platform, terminal device, and system
CN106020219B Aircraft control method and device
WO2020262222A1 Control system for flying vehicle
KR20190128425A Method for steering an unmanned vehicle based on a cylindrical coordinate system, recording medium storing a program for implementing same, and computer program stored on a medium for implementing same
US20220166917A1 (en) Information processing apparatus, information processing method, and program
JP2021073796A Control device and method for acquiring images
WO2022113482A1 Information processing device, method, and program
WO2022070851A1 Method, system, and program
US20240013460A1 (en) Information processing apparatus, information processing method, program, and information processing system
JP7487900B1 Information processing method, information processing system, and program
WO2021038622A1 Control system for unmanned moving body and method for controlling unmanned moving body
JP2024062247A Information processing system, information processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20832557

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021526902

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20832557

Country of ref document: EP

Kind code of ref document: A1