WO2020262222A1 - Control system for flying vehicle - Google Patents

Control system for flying vehicle Download PDF

Info

Publication number
WO2020262222A1
WO2020262222A1 (PCT/JP2020/024100)
Authority
WO
WIPO (PCT)
Prior art keywords
cursor
image
control system
air vehicle
flying object
Prior art date
Application number
PCT/JP2020/024100
Other languages
French (fr)
Japanese (ja)
Inventor
知也 榊原
Original Assignee
株式会社Clue
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Clue
Priority to JP2021526902A (JPWO2020262222A1)
Publication of WO2020262222A1

Links

Images

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C 13/00 Control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers
    • B64C 13/02 Initiating means
    • B64C 13/16 Initiating means actuated automatically, e.g. responsive to gust detectors
    • B64C 13/20 Initiating means actuated automatically, e.g. responsive to gust detectors using radiated signals
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 20/00 Constructional aspects of UAVs
    • B64U 20/80 Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U 20/87 Mounting of imaging devices, e.g. mounting of gimbals
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10 Simultaneous control of position or course in three dimensions
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 10/00 Type of UAV
    • B64U 10/10 Rotorcrafts
    • B64U 10/13 Flying platforms
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 2101/00 UAVs specially adapted for particular uses or applications
    • B64U 2101/30 UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 30/00 Means for producing lift; Empennages; Arrangements thereof
    • B64U 30/20 Rotors; Rotor supports

Definitions

  • The present disclosure relates to a control system for a flying vehicle, and in particular to a control system that controls the flight of a flying vehicle through user operations.
  • In recent years, flying vehicles such as so-called drones and multicopters, which fly by rotating multiple propellers, have come to be used for observing objects from high places, capturing aerial images of the ground from the sky, and observing areas that are difficult to enter.
  • Patent Document 1 discloses generating a three-dimensional image from images of an object captured by a camera mounted on a flying vehicle, and Patent Document 2 discloses inspecting a sewage pipeline with a camera mounted on a flying vehicle.
  • The present disclosure has been made in view of the above circumstances, and its object is to provide a control system capable of controlling the flight of a flying vehicle through simple operations.
  • To achieve the above object, the flying vehicle control system according to the present disclosure controls the flight of a flying vehicle through user operations and comprises: a video acquisition unit mounted on the flying vehicle that acquires video of a target object from the flying vehicle; a display unit having a display surface on which the video acquired by the video acquisition unit is displayed; a cursor generation module that generates a cursor displayed on the display surface, superimposed on the video displayed there; and a cursor movement module that moves the generated cursor to an arbitrary position on the video. The cursor is moved by the cursor movement module, and the flying vehicle is moved to the real position corresponding to the arbitrary position on the video at which the cursor is positioned.
  • According to this control system, a cursor is displayed superimposed on the video shown on the display surface of the display unit, and the flying vehicle can be moved by moving this cursor, so the flight of the flying vehicle can be controlled by a simple operation. The occurrence of unexpected behavior of the flying vehicle due to erroneous operation can therefore be suppressed.
  • In this control system, when the cursor is positioned at an arbitrary position on the video, that position on the video is placed at the central portion of the display surface.
  • Further, the display unit of the control system is composed of a touch panel device that accepts input of operations on the flying vehicle through the user's contact with the display surface.
  • Moreover, the cursor generation module of the control system generates the cursor at a position on the video separated from the position on the video that the user touches through the display surface.
  • Therefore, when the cursor is moved to a position on the video for positioning, the target position on the video is not hidden by the user's touch on the display surface, and the cursor can be positioned accurately at any position on the video.
  • Moreover, taking the position on the video that the user touches through the display surface as the contact point, the cursor movement module of the control system moves the cursor on the video following the displacement of the contact point as the user displaces the contact point while remaining in contact with the display surface, and positions the cursor at the position on the video where it lies when the user releases contact with the display surface.
  • Further, the flying vehicle may be one that is moved in the horizontal direction of the target object above the target object, and the target object may be the roof of a building.
  • According to the present disclosure, since the flight of the flying vehicle can be controlled by a simple operation, it is possible to suppress the occurrence of unexpected behavior of the flying vehicle due to erroneous operation.
  • FIG. 1 is a diagram illustrating an outline of the control system for a flying vehicle according to the present embodiment.
  • As shown in the figure, the flying vehicle control system 1 comprises a flying vehicle 10, a portable information terminal 20 that controls the flight of the flying vehicle 10 through the operations of a user U, and a server 40 communicably connected to the flying vehicle 10 via a network.
  • In the present embodiment, this control system 1 controls the flight of the flying vehicle 10, which images the roof 101 of a building 100, the target object; the images of the roof 101 captured by the flying vehicle 10 are used, for example, to detect portions of the roof 101 requiring repair.
  • In the present embodiment, the flying vehicle 10 is a so-called drone or multicopter comprising an airframe 11, motors 16 arranged at the tips of a plurality of arms extending radially from the airframe 11, and propellers 17 connected to the motors 16.
  • FIG. 2 is a block diagram illustrating the hardware configuration of the flying vehicle 10 according to the present embodiment.
  • As shown in the figure, the flying vehicle 10 comprises a transmission/reception unit 12, a flight controller 13 connected to the transmission/reception unit 12, a battery 14 that supplies power via the flight controller 13, and electronic speed controllers (ESC) 15 connected to the flight controller 13 that control the motors 16 driving the propellers 17.
  • The transmission/reception unit 12 is a communication interface configured to exchange data with multiple external devices such as an information terminal, a display device, or another remote controller; in the present embodiment, it receives control signals from the portable information terminal 20 and exchanges various data with the server 40.
  • The transmission/reception unit 12 can use multiple communication networks such as, for example, a local area network (LAN), a wide area network (WAN), infrared, radio, WiFi, point-to-point (P2P) networks, telecommunications networks, and cloud communication.
  • Further, the transmission/reception unit 12 transmits and receives various data, including acquired data of various kinds, processing results generated by the flight controller 13, various control data, and user commands from a terminal or remote controller.
  • The flight controller 13 mainly comprises a processor 13A, a memory 13B, and sensors 13C.
  • In the present embodiment, the processor 13A is composed of, for example, a CPU (Central Processing Unit); it controls the operation of the flight controller 13, controls the exchange of data between elements, and performs the processing needed to execute programs.
  • The memory 13B comprises a main storage device composed of volatile memory such as DRAM (Dynamic Random Access Memory) and an auxiliary storage device composed of non-volatile storage such as flash memory or an HDD (Hard Disk Drive).
  • The memory 13B is used as a work area for the processor 13A and also stores various setting information such as logic, code, and program instructions executable by the flight controller 13.
  • The memory 13B may further be configured so that data acquired from the sensors 13C and the like is transmitted to and stored in it directly.
  • In the present embodiment, the sensors 13C comprise various sensors such as a GPS sensor that receives radio waves from GPS satellites, a pressure sensor that measures atmospheric pressure, a temperature sensor, and an acceleration sensor.
  • The flying vehicle 10 further comprises a camera 18, a video acquisition unit fixed to the airframe 11 that acquires video of the roof 101 of the building 100 from the flying vehicle 10. The camera 18 images the roof 101 and acquires it as images.
  • FIG. 3 is a block diagram illustrating an outline of the configuration of the portable information terminal 20 that operates the flying vehicle 10 according to the present embodiment.
  • In the present embodiment, the portable information terminal 20 is implemented as a so-called tablet, a small computer, and comprises a control unit 21 and a touch panel unit 22 serving as the display unit, as shown in the figure.
  • The portable information terminal 20 is not limited to a small tablet computer and may instead be implemented as a smartphone, a personal computer, a portable game machine, or the like.
  • In another embodiment, the server 40 may provide the functions of the portable information terminal 20, with the user U obtaining information and operating the system through the interface of another computer, tablet terminal, or the like.
  • The portable information terminal 20 may also be a terminal device, such as a tablet or computer, that accompanies a radio control transmitter for controlling the flight of the flying vehicle 10 and controls that transmitter.
  • The control unit 21 mainly comprises a processor 21a, a memory 21b, a storage 21c, a transmission/reception unit 21d, and an input/output unit 21e, which are electrically connected to one another via a bus 21f.
  • The processor 21a is an arithmetic unit that controls the operation of the control unit 21, controls the exchange of data between elements, and performs the processing needed to execute programs.
  • In the present embodiment, the processor 21a is, for example, a CPU (Central Processing Unit), and performs each process by executing programs stored in the storage 21c, described later, and loaded into the memory 21b.
  • The memory 21b comprises a main storage device composed of volatile memory such as DRAM (Dynamic Random Access Memory) and an auxiliary storage device composed of non-volatile storage such as flash memory or an HDD (Hard Disk Drive).
  • The memory 21b is used as a work area for the processor 21a and also stores the BIOS (Basic Input/Output System) executed when the control unit 21 starts up, as well as various setting information.
  • The storage 21c stores programs and information used for various processes. In the present embodiment, it stores a control program for controlling the flight of the flying vehicle 10 and images of the roof 101 acquired by the camera 18.
  • The transmission/reception unit 21d connects the control unit 21 to a network such as the Internet and may include a short-range communication interface such as Bluetooth (registered trademark) or BLE (Bluetooth Low Energy).
  • In the present embodiment, control signals for controlling the flight of the flying vehicle 10 are transmitted to the flying vehicle 10 via this transmission/reception unit 21d.
  • The input/output unit 21e is an interface to which input/output devices are connected; in the present embodiment, the touch panel unit 22 is connected to it.
  • The bus 21f carries, for example, address signals, data signals, and various control signals between the connected processor 21a, memory 21b, storage 21c, transmission/reception unit 21d, and input/output unit 21e.
  • The touch panel unit 22 includes a display surface 22a on which video and images acquired by the camera 18 are displayed.
  • In the present embodiment, the display surface 22a accepts input of information through contact with it and is implemented with various technologies such as resistive film or capacitive sensing.
  • FIG. 4 is a diagram illustrating an outline of the configuration of the screen interface of the display surface 22a.
  • As shown in the figure, the screen interface IF is composed of a window W, a toolbar T, and commands C1 to Cn.
  • In the present embodiment, the window W displays the video M of the roof 101 of the building 100 acquired by the camera 18.
  • Various operations on the flying vehicle 10 are input when the user U touches this window W, for example with a finger.
  • In the window W, the cursor P generated by the cursor generation module described later is displayed superimposed on the video M, for example at the position on the video M corresponding to the actual position of the flying vehicle 10 (for example, the top 101a of the roof 101).
  • In the present embodiment, the toolbar T applies various processes to the video M displayed in the window W and comprises various functions such as "home", "file", and "edit".
  • The commands C1 to Cn are used to input various instructions concerning the operation of the flying vehicle 10 and the camera 18 mounted on it; they comprise a plurality of input elements that execute instructions such as "ascend", "change altitude", and "landing" for the flying vehicle 10, or "imaging" by the camera 18.
  • FIG. 5 is a block diagram illustrating an outline of the configuration of the control program 30 stored in the storage 21c of the control unit 21 of the portable information terminal 20.
  • As shown in the figure, the control program 30 comprises an input module 31, a cursor generation module 32, a cursor movement module 33, a video processing module 34, and a control signal generation module 35.
  • In the present embodiment, the input module 31 executes data processing based on inputs made when the user U touches the window W to enter various operations on the flying vehicle 10, or when various instructions are entered via the commands C1 to Cn.
  • FIG. 6 is a diagram illustrating an outline of processing of the cursor generation module 32.
  • As shown in the figure, the cursor generation module 32 is a module that generates the cursor P, which is displayed in the window W superimposed on the video M displayed there.
  • In the present embodiment, the generated cursor P is displayed superimposed on the video M at a position on the video M corresponding to the actual position of the flying vehicle 10 (for example, the top 101a of the roof 101).
  • When the user U touches the video M through the window W with a finger F, the cursor generation module 32 generates the cursor P at a position on the video M separated by an arbitrary distance α from the contact point S where the user U touched the video M.
  • In this case, depending on where the user U touches the video M through the window W, the actual position of the flying vehicle 10 and the position of the cursor P on the video M may differ; the cursor P is then not yet positioned, that is, it is in a state in which it can be moved to an arbitrary position on the video M.
  • FIG. 7 is a diagram illustrating an outline of processing of the cursor movement module 33.
  • As shown in the figure, the cursor movement module 33 is a module that moves the cursor P generated by the cursor generation module 32 to an arbitrary position on the video M.
  • In the present embodiment, when the user U displaces the contact point S while remaining in contact with the display surface 22a, the cursor P, generated at the top 101a of the roof 101 (the position on the video M at the arbitrary separation distance α from the contact point S), moves following the displacement of the contact point S.
  • For example, when the user U displaces the contact point S along the arrow A while touching the window W, the cursor P generated at the top 101a of the roof 101 follows the displacement of the contact point S; when the user U releases contact with the window W while the cursor P is located at the corner 101b of the roof 101, that is, lifts the finger F from the window W, the cursor P is positioned at the corner 101b of the roof 101.
  • Since the cursor P is moved at the separation distance α from the contact point S, when the target point for positioning the cursor P is, for example, the corner 101b of the roof 101, the corner 101b is never concealed by the finger F of the user U, and the cursor P can be positioned accurately at the target point.
  • In the examples shown in FIGS. 6 and 7, the cursor movement module 33 moves the cursor P continuously from its initial position on the video M, following the movement of the contact point S at the separation distance α; however, this embodiment is not limited to this example.
  • For instance, the cursor movement module 33 may move the cursor P from a position different from its initial position. Specifically, it may take an arbitrary position on the video that the user U touches through the display surface 22a as the contact point S, display the cursor P at a position separated from the contact point S by the distance α, and then move the cursor P following the movement of the contact point S. In that case, the cursor P may remain displayed at its initial position until the user U touches an arbitrary point on the video. If the separation distance α between the cursor P and the contact point S is a predetermined distance, the user U can easily move the cursor P with the movement of the contact point S. A minimal code sketch of this touch handling follows.
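  • The publication contains no source code, so the following Python sketch of the cursor generation module 32 and cursor movement module 33 is illustrative only; the class and event names, the fixed offset standing in for the separation distance α, and the touch-event API are all assumptions.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass(frozen=True)
class Point:
    """A position on the video M, in display-surface coordinates."""
    x: float
    y: float


# Stand-in for the separation distance alpha: the cursor sits a fixed
# offset above the contact point S so the finger F never hides it.
ALPHA = Point(0.0, -60.0)


class CursorController:
    """Sketch of cursor generation (module 32) and movement (module 33)."""

    def __init__(self, initial: Point) -> None:
        self.cursor = initial                  # cursor P (e.g. over top 101a)
        self.target: Optional[Point] = None    # positioned target point

    def on_touch_down(self, contact: Point) -> None:
        # Cursor generation: place P at the separation alpha from S.
        self.cursor = Point(contact.x + ALPHA.x, contact.y + ALPHA.y)

    def on_touch_move(self, contact: Point) -> None:
        # Cursor movement: P follows the displacement of S at distance alpha.
        self.cursor = Point(contact.x + ALPHA.x, contact.y + ALPHA.y)

    def on_touch_up(self) -> None:
        # Releasing contact positions P wherever it currently lies
        # (e.g. over corner 101b); this becomes the target point.
        self.target = self.cursor
```

  • On release, the committed target point would be handed to the video processing module 34 (to center the view) and to the control signal generation module 35 (to move the flying vehicle 10).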
  • FIG. 8 is a diagram illustrating an outline of the processing of the video processing module 34. As shown in the figure, when the cursor P is positioned with an arbitrary position on the video M as the target point, the positioned position on the video M is placed at the central portion of the window W on the display surface 22a.
  • In the present embodiment, when the cursor P generated at the top 101a of the roof 101 is positioned at the corner 101b of the roof 101, the target point, the display of the video M on the display surface 22a is scrolled so that the corner 101b is placed at the central portion of the display surface 22a, as sketched below.
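  • A minimal sketch of this centering behavior, reusing the Point type from the sketch above; the convention that scrolling is expressed as an offset added to the video's draw position is an assumption.

```python
def scroll_to_center(target: Point, window_w: float, window_h: float) -> Point:
    """Return the scroll offset that places `target` (the positioned
    cursor location, in window coordinates) at the center of window W."""
    return Point(window_w / 2.0 - target.x, window_h / 2.0 - target.y)
```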
  • The control signal generation module 35 is a module that generates various control signals concerning the operation of the flying vehicle 10 and the like, based on the operations and instructions of the user U input via the input module 31.
  • For example, when an instruction to adjust the angle of the camera 18 is input, a control signal for adjusting the camera angle is generated and transmitted to the flying vehicle 10, and the angle of the camera 18 is adjusted. When the "imaging" instruction is input, a control signal that causes the camera 18 to capture the roof 101 of the building 100 as an image is generated and transmitted to the flying vehicle 10, and the camera 18 images the roof 101.
  • FIG. 9 is a diagram illustrating an outline of flight control of the flying vehicle 10 by control signals generated by the control signal generation module 35. For example, when the "ascend" instruction is input via the commands C1 to Cn, a control signal that raises the flying vehicle 10 is generated and transmitted to it, as shown in the figure, and the flying vehicle 10 ascends above the building 100.
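  • The wire format of these control signals is not disclosed; the sketch below only illustrates how the instructions named in the publication ("ascend", "change altitude", "landing", "imaging") might be mapped to messages for the vehicle's transmission/reception unit 12. The dict format and parameter names are assumptions.

```python
from enum import Enum


class Command(Enum):
    ASCEND = "ascend"
    CHANGE_ALTITUDE = "change altitude"
    LANDING = "landing"
    IMAGING = "imaging"


def make_control_signal(command: Command, **params: float) -> dict:
    """Sketch of the control signal generation module 35: wrap a user
    command and its parameters into a message to be transmitted."""
    return {"command": command.value, "params": params}


# Example corresponding to FIG. 9: the "ascend" instruction raises the vehicle.
signal = make_control_signal(Command.ASCEND, delta_altitude_m=5.0)
# transceiver.send(signal)  # hypothetical call to the communication layer
```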
  • In the present embodiment, the server 40 is implemented by, for example, a desktop computer, and the images of the roof 101 of the building 100 captured by the camera 18 of the flying vehicle 10 are stored in its storage.
  • When the flight of the flying vehicle 10 is controlled, the flying vehicle 10 first ascends above the building 100 as shown in FIG. 9.
  • The video M acquired by the camera 18 of the flying vehicle 10 is displayed in the window W of the screen interface IF.
  • The cursor P is generated at the top 101a of the roof 101, the position on the video M corresponding to the actual position of the flying vehicle 10.
  • The user U touches the video M through the window W and moves the cursor P, positioning it at the corner 101b of the roof 101, the target point.
  • When the cursor P is positioned, the flying vehicle 10 moves in the horizontal direction H of the building 100, from above the top 101a of the roof 101 toward the actual position corresponding to the corner 101b of the roof 101 on the video M.
  • At the same time, the display of the video M on the display surface 22a is scrolled so that the corner 101b of the roof 101 is placed at the central portion of the display surface 22a.
  • The camera 18 of the flying vehicle 10 then captures the corner 101b of the roof 101 as an image.
  • The captured image is stored in the storage 21c of the portable information terminal 20 and in the storage of the server 40, and is later used to detect portions of the roof 101 requiring repair.
  • The user U then touches the video M through the window W to move the cursor P and positions it at a position on the video M where the flying vehicle 10 can land (for example, an unobstructed ground surface); the flying vehicle 10 moves in the horizontal direction H of the building 100 toward the actual position corresponding to that ground surface on the video M.
  • When the user U inputs the "landing" instruction via the commands C1 to Cn of the screen interface IF on the display surface 22a of the touch panel unit 22 of the portable information terminal 20, the flying vehicle 10 descends and lands on the ground surface.
  • In this way, the cursor P is displayed superimposed on the video M shown in the window W of the screen interface IF on the display surface 22a of the touch panel unit 22 of the portable information terminal 20, and the flying vehicle 10 can be moved by moving the cursor P.
  • Since the flight of the flying vehicle 10 can thus be controlled by a simple operation, the occurrence of unexpected behavior of the flying vehicle 10 due to erroneous operation can be suppressed.
  • Moreover, since the cursor P is generated at a position on the video M at the separation distance α from the position that the user U touches with the finger F through the window W of the screen interface IF, when the cursor P is moved to and positioned at an arbitrary target position on the video M, the finger F of the user U does not hide that position, and the cursor P can be positioned accurately at the target point.
  • In addition, the distance on the video between the initial position of the cursor P generated by the cursor generation module 32 and the position at which the cursor P is positioned by the cursor movement module 33 is calculated, and the distance in real space corresponding to this on-video distance is estimated based on information on the altitude of the flying vehicle 10 and the zoom state of the camera 18. As a result, the flying vehicle 10 can move more accurately in the horizontal direction of the building 100 toward the actual position corresponding to the positioned point on the video M.
  • Next, another example of this embodiment will be described.
  • FIG. 10 is a diagram illustrating an outline of processing according to another embodiment of the present disclosure.
  • An altitude information box H indicating the altitude of the flying vehicle 10 is displayed superimposed on the illustrated video M. The video M also displays a zoom information object Z indicating the zoom magnification of the camera 18; the zoom information object Z may also serve as a command for adjusting the zoom magnification of the camera 18.
  • The cursor movement module 33 calculates the distance on the video M between the initial position of the cursor P generated by the cursor generation module 32 (in FIG. 10, the position corresponding to the top 101a of the roof 101 displayed in the video M) and the position of the positioned cursor P (in FIG. 10, the position corresponding to the corner 101b of the roof 101 displayed in the video M).
  • The method of calculating the distance on the video M is not particularly limited.
  • The video processing module 34 then estimates the distance in real space corresponding to the distance on the video M, based on that on-video distance and at least one of the altitude of the flying vehicle 10 and the zoom information of the camera 18.
  • The distance in real space referred to here is the horizontal distance that the flying vehicle 10 should travel.
  • The control signal generation module 35 generates a control signal reflecting the estimated real-space distance and transmits it to the flying vehicle 10.
  • The flying vehicle 10 thereby moves in the horizontal direction H of the building 100 toward the actual position corresponding to the position chosen on the video M, and its travel distance is the estimated real-space distance.
  • In this embodiment, the distance in real space is estimated based on the altitude of the flying vehicle 10 and the zoom information of the camera 18. The apparent size of the roof 101 in the video M changes with the altitude of the flying vehicle 10 and the zoom setting of the camera 18, and with it the correspondence between the on-video distance from the initial position of the cursor P to the target position and the distance in real space.
  • By taking the altitude and zoom information into account, the flying vehicle 10 can therefore be moved more accurately regardless of its altitude and the zoom state of the camera 18, as in the sketch below.
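  • The publication does not give the estimation formula. Under a simple pinhole camera model with the camera 18 pointing straight down (an assumption, as are the parameter names and example values), the conversion from a distance on the video M to a horizontal distance in real space could look like this:

```python
def ground_distance_m(pixel_distance: float, altitude_m: float,
                      focal_length_mm: float, zoom: float,
                      sensor_width_mm: float, image_width_px: int) -> float:
    """Estimate the horizontal real-space distance corresponding to a
    distance measured on the image, for a downward-pointing camera.

    Pinhole model: one pixel covers sensor_width / image_width on the
    sensor, and that span projects onto the ground scaled by
    altitude / effective focal length.
    """
    effective_focal_mm = focal_length_mm * zoom   # zoom scales the focal length
    pixel_pitch_mm = sensor_width_mm / image_width_px
    meters_per_pixel = altitude_m * pixel_pitch_mm / effective_focal_mm
    return pixel_distance * meters_per_pixel


# Example: a 400 px cursor move at 30 m altitude with a 4.5 mm lens at 1x
# zoom, a 6.17 mm wide sensor, and a 4000 px wide image is roughly 4.1 m.
distance = ground_distance_m(400, 30.0, 4.5, 1.0, 6.17, 4000)
```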
  • In the embodiment described above, the flying vehicle control system 1 includes the server 40, but the control system 1 can also be constructed without deploying the server 40.
  • For example, the flying vehicle control system 1 may be constructed with the flying vehicle 10 and the portable information terminal 20 communicating directly, and the functions of the server 40 may be realized by the portable information terminal 20.
  • FIGS. 11 and 12 are diagrams illustrating outlines of flying vehicle control systems according to other embodiments. For example, as shown in FIG. 11, the system may be configured so that the flying vehicle 10 and the portable information terminal 20 communicate directly and the portable information terminal 20 and the server 40 communicate directly; in this case, the server 40 communicates with the flying vehicle 10 via the portable information terminal 20.
  • Alternatively, as shown in FIG. 12, the portable information terminal 20 and the server 40 may communicate directly and the server 40 and the flying vehicle 10 may communicate directly; in this case, the portable information terminal 20 communicates with the flying vehicle 10 via the server 40.
  • In the embodiment described above, the target object is the roof 101 of the building 100, but it may instead be a tree, an arbitrary ground surface, or even a temporarily stationary object such as a car or an animal.
  • Reference signs: 1 flying vehicle control system; 10 flying vehicle; 13 flight controller; 18 camera (video acquisition unit); 20 portable information terminal; 21 control unit; 22 touch panel unit (display unit); 22a display surface; 30 control program; 32 cursor generation module; 33 cursor movement module; 34 video processing module; 35 control signal generation module; 100 building; 101 roof (target object); 101a top; 101b corner; C1 to Cn commands; F finger; M video; P cursor; S contact point; U user; W window

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Mechanical Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Provided is a control system for a flying vehicle according to the present invention, the control system enabling flight control of the flying vehicle through simple operations. The control system for a flying vehicle controls the flight of the flying vehicle through operations by a user and comprises: a video acquisition unit that is mounted on the flying vehicle to acquire video of a target object from the flying vehicle; a display unit that has a display surface on which is displayed the video acquired by the video acquisition unit; a cursor generation module that generates a cursor that is displayed on the display surface so as to be superimposed on the video displayed on the display surface of the display unit; and a cursor movement module that moves the cursor generated by the cursor generation module to any position on the video. The cursor is moved by the cursor movement module, and the flying vehicle is moved to a real position corresponding to any position on the video at which the cursor is positioned.

Description

Control system for flying vehicle
The present disclosure relates to a control system for a flying vehicle, and in particular to a control system that controls the flight of a flying vehicle through user operations.
In recent years, flying vehicles such as so-called drones and multicopters, which fly by rotating multiple propellers, have come to be used for observing objects from high places, capturing aerial images of the ground from the sky, and observing areas that are difficult to enter.
Patent Document 1 discloses generating a three-dimensional image from images of an object captured by a camera mounted on a flying vehicle, and Patent Document 2 discloses inspecting a sewage pipeline with a camera mounted on a flying vehicle.
Patent Document 1: JP-A-2018-10630. Patent Document 2: JP-A-2016-218813.
Operating such a flying vehicle requires advanced piloting skill and is therefore relatively difficult to learn; if, for example, someone with immature operating skill flies the vehicle, erroneous operation may cause it to behave in unexpected ways.
The present disclosure has been made in view of the above circumstances, and its object is to provide a control system capable of controlling the flight of a flying vehicle through simple operations.
To achieve the above object, the flying vehicle control system according to the present disclosure is a control system that controls the flight of a flying vehicle through user operations, and comprises: a video acquisition unit mounted on the flying vehicle that acquires video of a target object from the flying vehicle; a display unit having a display surface on which the video acquired by the video acquisition unit is displayed; a cursor generation module that generates a cursor displayed on the display surface, superimposed on the video displayed on the display surface of the display unit; and a cursor movement module that moves the cursor generated by the cursor generation module to an arbitrary position on the video. The system is characterized in that the cursor is moved by the cursor movement module and the flying vehicle is moved to the real position corresponding to the arbitrary position on the video at which the cursor is positioned.
According to this control system, a cursor is displayed superimposed on the video shown on the display surface of the display unit, and the flying vehicle can be moved by moving this cursor, so the flight of the flying vehicle can be controlled by a simple operation. The occurrence of unexpected behavior of the flying vehicle due to erroneous operation can therefore be suppressed.
This control system is characterized in that, when the cursor is positioned at an arbitrary position on the video, that position on the video is placed at the central portion of the display surface.
Further, the display unit of the control system is characterized by being composed of a touch panel device that accepts input of operations on the flying vehicle through the user's contact with the display surface.
Moreover, the cursor generation module of the control system is characterized in that it generates the cursor at a position on the video separated from the position on the video that the user touches through the display surface.
Therefore, when the cursor is moved to a position on the video for positioning, the target position on the video is not hidden by the user's touch on the display surface, and the cursor can be positioned accurately at any position on the video.
Moreover, the cursor movement module of the control system is characterized in that, taking the position on the video that the user touches through the display surface as the contact point, it moves the cursor on the video following the displacement of the contact point as the user displaces the contact point while remaining in contact with the display surface, and positions the cursor at the arbitrary position on the video where the cursor lies when the user releases contact with the display surface.
Further, the flying vehicle may be one that is moved in the horizontal direction of the target object above the target object, and the target object may be the roof of a building.
According to this invention, since the flight of the flying vehicle can be controlled by a simple operation, the occurrence of unexpected behavior of the flying vehicle due to erroneous operation can be suppressed.
FIG. 1 is a diagram illustrating an outline of a control system for a flying vehicle according to an embodiment of the present disclosure.
FIG. 2 is a block diagram illustrating the hardware configuration of the flying vehicle according to the embodiment.
FIG. 3 is a block diagram illustrating an outline of the configuration of the portable information terminal that operates the flying vehicle according to the embodiment.
FIG. 4 is a diagram illustrating an outline of the configuration of the screen interface of the display surface of the display unit of the portable information terminal.
FIG. 5 is a block diagram illustrating an outline of the configuration of the control program stored in the storage of the control unit of the portable information terminal.
FIG. 6 is a diagram illustrating an outline of the processing of the cursor generation module of the control program.
FIG. 7 is a diagram illustrating an outline of the processing of the cursor movement module of the control program.
FIG. 8 is a diagram illustrating an outline of the processing of the video processing module of the control program.
FIG. 9 is a diagram illustrating an outline of flight control of the flying vehicle by control signals generated by the control signal generation module of the control program.
FIG. 10 is a diagram illustrating an outline of processing according to another embodiment.
FIG. 11 is a diagram illustrating an outline of a control system for a flying vehicle according to another embodiment.
FIG. 12 is a diagram illustrating an outline of a control system for a flying vehicle according to another embodiment.
Next, a control system for a flying vehicle according to an embodiment of the present disclosure will be described with reference to FIGS. 1 to 12.
FIG. 1 is a diagram illustrating an outline of the control system for a flying vehicle according to the present embodiment. As shown in the figure, the flying vehicle control system 1 comprises a flying vehicle 10, a portable information terminal 20 that controls the flight of the flying vehicle 10 through the operations of a user U, and a server 40 communicably connected to the flying vehicle 10 via a network.
In the present embodiment, this control system 1 controls the flight of the flying vehicle 10, which images the roof 101 of a building 100, the target object; the images of the roof 101 captured by the flying vehicle 10 are used, for example, to detect portions of the roof 101 requiring repair.
In the present embodiment, the flying vehicle 10 is a so-called drone or multicopter comprising an airframe 11, motors 16 arranged at the tips of a plurality of arms extending radially from the airframe 11, and propellers 17 connected to the motors 16.
FIG. 2 is a block diagram illustrating the hardware configuration of the flying vehicle 10 according to the present embodiment. As shown in the figure, the flying vehicle 10 comprises a transmission/reception unit 12, a flight controller 13 connected to the transmission/reception unit 12, a battery 14 that supplies power via the flight controller 13, and electronic speed controllers (ESC) 15 connected to the flight controller 13 that control the motors 16 driving the propellers 17.
The transmission/reception unit 12 is a communication interface configured to exchange data with multiple external devices such as an information terminal, a display device, or another remote controller; in the present embodiment, it receives control signals from the portable information terminal 20 and exchanges various data with the server 40.
The transmission/reception unit 12 can use multiple communication networks such as, for example, a local area network (LAN), a wide area network (WAN), infrared, radio, WiFi, point-to-point (P2P) networks, telecommunications networks, and cloud communication.
Further, the transmission/reception unit 12 transmits and receives various data, including acquired data of various kinds, processing results generated by the flight controller 13, various control data, and user commands from a terminal or remote controller.
The flight controller 13 mainly comprises a processor 13A, a memory 13B, and sensors 13C.
In the present embodiment, the processor 13A is composed of, for example, a CPU (Central Processing Unit); it controls the operation of the flight controller 13, controls the exchange of data between elements, and performs the processing needed to execute programs.
The memory 13B comprises a main storage device composed of volatile memory such as DRAM (Dynamic Random Access Memory) and an auxiliary storage device composed of non-volatile storage such as flash memory or an HDD (Hard Disk Drive).
The memory 13B is used as a work area for the processor 13A and also stores various setting information such as logic, code, and program instructions executable by the flight controller 13.
The memory 13B may further be configured so that data acquired from the sensors 13C and the like is transmitted to and stored in it directly.
In the present embodiment, the sensors 13C comprise various sensors such as a GPS sensor that receives radio waves from GPS satellites, a pressure sensor that measures atmospheric pressure, a temperature sensor, and an acceleration sensor.
The flying vehicle 10 further comprises a camera 18, a video acquisition unit fixed to the airframe 11 that acquires video of the roof 101 of the building 100 from the flying vehicle 10. The camera 18 images the roof 101 and acquires it as images.
FIG. 3 is a block diagram illustrating an outline of the configuration of the portable information terminal 20 that operates the flying vehicle 10 according to the present embodiment. In the present embodiment, the portable information terminal 20 is implemented as a so-called tablet, a small computer, and comprises a control unit 21 and a touch panel unit 22 serving as the display unit, as shown in the figure. The portable information terminal 20 is not limited to a small tablet computer and may instead be implemented as a smartphone, a personal computer, a portable game machine, or the like. In another embodiment, the server 40 may provide the functions of the portable information terminal 20, with the user U obtaining information and operating the system through the interface of another computer, tablet terminal, or the like. The portable information terminal 20 may also be a terminal device, such as a tablet or computer, that accompanies a radio control transmitter for controlling the flight of the flying vehicle 10 and controls that transmitter.
The control unit 21 mainly comprises a processor 21a, a memory 21b, a storage 21c, a transmission/reception unit 21d, and an input/output unit 21e, which are electrically connected to one another via a bus 21f.
The processor 21a is an arithmetic unit that controls the operation of the control unit 21, controls the exchange of data between elements, and performs the processing needed to execute programs.
In the present embodiment, the processor 21a is, for example, a CPU (Central Processing Unit), and performs each process by executing programs stored in the storage 21c, described later, and loaded into the memory 21b.
The memory 21b comprises a main storage device composed of volatile memory such as DRAM (Dynamic Random Access Memory) and an auxiliary storage device composed of non-volatile storage such as flash memory or an HDD (Hard Disk Drive).
The memory 21b is used as a work area for the processor 21a and also stores the BIOS (Basic Input/Output System) executed when the control unit 21 starts up, as well as various setting information.
The storage 21c stores programs and information used for various processes. In the present embodiment, it stores a control program for controlling the flight of the flying vehicle 10 and images of the roof 101 acquired by the camera 18.
The configuration of the control program is described in detail later.
The transmission/reception unit 21d connects the control unit 21 to a network such as the Internet and may include a short-range communication interface such as Bluetooth (registered trademark) or BLE (Bluetooth Low Energy).
In the present embodiment, control signals for controlling the flight of the flying vehicle 10 are transmitted to the flying vehicle 10 via this transmission/reception unit 21d.
The input/output unit 21e is an interface to which input/output devices are connected; in the present embodiment, the touch panel unit 22 is connected to it.
The bus 21f carries, for example, address signals, data signals, and various control signals between the connected processor 21a, memory 21b, storage 21c, transmission/reception unit 21d, and input/output unit 21e.
The touch panel unit 22 includes a display surface 22a on which video and images acquired by the camera 18 are displayed. In the present embodiment, the display surface 22a accepts input of information through contact with it and is implemented with various technologies such as resistive film or capacitive sensing.
FIG. 4 is a diagram illustrating an outline of the configuration of the screen interface of the display surface 22a. As shown in the figure, the screen interface IF is composed of a window W, a toolbar T, and commands C1 to Cn.
In the present embodiment, the window W displays the video M of the roof 101 of the building 100 acquired by the camera 18. Various operations on the flying vehicle 10 are input when the user U touches this window W, for example with a finger.
In this window W, the cursor P generated by the cursor generation module described later is displayed superimposed on the video M, for example at an arbitrary position on the video M corresponding to the actual position of the flying vehicle 10 (for example, the top 101a of the roof 101).
In the present embodiment, the toolbar T applies various processes to the video M displayed in the window W and comprises various functions such as "home", "file", and "edit".
In the present embodiment, the commands C1 to Cn are used to input various instructions concerning the operation of the flying vehicle 10 and the camera 18 mounted on it; they comprise a plurality of input elements that execute instructions such as "ascend", "change altitude", and "landing" for the flying vehicle 10, or "imaging" by the camera 18.
FIG. 5 is a block diagram illustrating an outline of the configuration of the control program 30 stored in the storage 21c of the control unit 21 of the portable information terminal 20. As shown in the figure, the control program 30 comprises an input module 31, a cursor generation module 32, a cursor movement module 33, a video processing module 34, and a control signal generation module 35.
In the present embodiment, the input module 31 executes data processing based on inputs made when the user U touches the window W to enter various operations on the flying vehicle 10, or when various instructions are entered via the commands C1 to Cn.
FIG. 6 is a diagram illustrating an outline of the processing of the cursor generation module 32. As shown in the figure, the cursor generation module 32 is a module that generates the cursor P, which is displayed in the window W superimposed on the video M displayed there.
In the present embodiment, the generated cursor P is displayed superimposed on the video M at an arbitrary position on the video M corresponding to the actual position of the flying vehicle 10 (for example, the top 101a of the roof 101).
At this time, when the user U touches the video M through the window W with a finger F, the cursor generation module 32 generates the cursor P at a position on the video M separated by an arbitrary distance α from the contact point S where the user U touched the video M.
In this case, depending on where the user U touches the video M through the window W, the actual position of the flying vehicle 10 and the position of the cursor P on the video M may differ; the cursor P is then in a state in which it is not yet positioned at an arbitrary position on the video M, that is, a state in which the cursor P can be moved to an arbitrary position on the video M.
 FIG. 7 is a diagram outlining the processing of the cursor movement module 33. As illustrated, the cursor movement module 33 is a module that moves the cursor P generated by the cursor generation module 32 to an arbitrary position on the image M.
 In the present embodiment, when the user U displaces the contact point S while remaining in contact with the display surface 22a, the cursor P, generated at the top 101a of the roof 101 (the position on the image M separated from the contact point S by the arbitrary offset distance α), moves so as to follow the displacement of the contact point S.
 For example, when the user U displaces the contact point S along the arrow A while touching the window W, the cursor P generated at the top 101a of the roof 101 moves following the displacement of the contact point S; when the user U releases contact with the window W while the cursor P is located at the corner 101b of the roof 101, that is, lifts the finger F from the window W, the cursor P is positioned at the corner 101b of the roof 101.
 Because the cursor P is thus moved at the offset distance α from the contact point S, when the target point at which the cursor P is to be positioned is, for example, the corner 101b of the roof 101, the target point is not hidden by the finger F of the user U, and the cursor P can be accurately positioned at the target point.
 In the examples shown in FIGS. 6 and 7, the cursor movement module 33 moves the cursor P continuously from its initial position on the image M, following the movement of the contact point S at the offset distance α; however, the present embodiment is not limited to this example. For instance, the cursor movement module 33 may move the cursor P starting from a position different from its initial position. Specifically, the cursor movement module 33 may take an arbitrary position on the image touched by the user U through the display surface 22a as the contact point S, display the cursor P at a position separated from the contact point S by the offset distance α, and then move the cursor P following the movement of the contact point S. In that case, the cursor P may remain displayed at its initial position until the user U touches a point on the image. If the offset distance α between the cursor P and the contact point S is a predetermined fixed distance, it is easy for the user U to move the cursor P by moving the contact point S. A sketch of this drag-and-release behavior follows.
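 The following is a minimal, self-contained sketch of the drag-and-release interaction described above; the class and attribute names are hypothetical, and the fixed upward offset is an assumption:

    class CursorMovementModule:
        # Hypothetical sketch of cursor movement module 33: the cursor P
        # follows the contact point S at the fixed offset α while the finger
        # stays down, and is committed to its current position on release.
        def __init__(self, alpha: float = 80.0):
            self.alpha = alpha   # assumed fixed offset distance α, in pixels
            self.cursor = None   # current position of P on the image M
            self.target = None   # position of P once the user lifts the finger

        def on_touch(self, x: float, y: float, released: bool) -> None:
            self.cursor = (x, y - self.alpha)   # P tracks S at offset α
            if released:
                self.target = self.cursor       # P is positioned at the target point

    # Example: drag from (350, 520) to (500, 400), then release.
    m = CursorMovementModule()
    m.on_touch(350, 520, released=False)
    m.on_touch(500, 400, released=True)
    print(m.target)   # (500, 320): where P was when contact ended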
 FIG. 8 is a diagram outlining the processing of the video processing module 34. As illustrated, when the cursor P is positioned with an arbitrary position on the image M as the target point, the video processing module 34 positions that location on the image M at the central portion of the window W of the display surface 22a.
 In the present embodiment, when the cursor P generated at the top 101a of the roof 101 is positioned at the corner 101b of the roof 101, which is the target point, the display of the image M on the display surface 22a is scrolled so that the corner 101b of the roof 101 is positioned at the central portion of the display surface 22a.
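 A minimal sketch of this recentering step, assuming the image M is panned by a simple screen-space translation (the function and parameter names are assumptions):

    def scroll_to_center(target_x: float, target_y: float,
                         window_w: float, window_h: float) -> tuple:
        # Hypothetical sketch of the recentering step of video processing
        # module 34: return the pan to apply to the image M so that the
        # positioned target point lands at the center of the window W.
        return (window_w / 2.0 - target_x, window_h / 2.0 - target_y)

    # Example: a target at (600, 150) in an 800 x 600 window W pans M by (-200, 150).
    print(scroll_to_center(600, 150, 800, 600))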
 As shown in FIG. 5, the control signal generation module 35 is a module that generates various control signals relating to the operation of the flying object 10 and the like, based on the operations and instructions of the user U entered via the input module 31.
 For example, when an instruction to "adjust the camera angle" is entered via the commands C1 to Cn, a control signal for adjusting the angle of the camera 18 is generated and transmitted to the flying object 10, and the angle of the camera 18 is adjusted; when a "capture image" instruction is entered, a control signal causing the camera 18 to capture an image of the roof 101 of the building 100 is generated and transmitted to the flying object 10, and the camera 18 captures an image of the roof 101.
 FIG. 9 is a diagram outlining the case in which the flight of the flying object 10 is controlled by a control signal generated by the control signal generation module 35. For example, when an "ascend" instruction is entered via the commands C1 to Cn, a control signal for raising the flying object 10 is generated and transmitted to the flying object 10, as illustrated, and the flying object 10 ascends above the building 100.
 Meanwhile, when the user U touches the image M through the window W with the finger F, moves the cursor P and positions it at an arbitrary position on the image M, a control signal for moving the flying object 10 to the actual position corresponding to the positioned location on the image M is generated and transmitted to the flying object 10, and the flying object 10 moves, above the building 100, in the horizontal direction H of the building 100.
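 One hypothetical way of tying these inputs to outgoing signals — the record format, the method names and the transport are all assumptions, not part of the disclosure — is:

    class ControlSignalGenerationModule:
        # Hypothetical sketch of control signal generation module 35: turn
        # user inputs into control signals for the flying object 10.
        def __init__(self, send):
            self.send = send   # callable that transmits a signal to the flying object 10

        def on_command(self, name: str) -> None:
            # Commands C1 to Cn map directly to simple signals.
            self.send({"type": name})                    # e.g. {"type": "ascend"}

        def on_cursor_positioned(self, dx_m: float, dy_m: float) -> None:
            # Horizontal move toward the real position corresponding to the
            # positioned cursor P; dx_m, dy_m come from the distance estimate.
            self.send({"type": "move_horizontal", "dx_m": dx_m, "dy_m": dy_m})

    # Example with a stand-in transport that just prints the signal.
    module = ControlSignalGenerationModule(send=print)
    module.on_command("ascend")
    module.on_cursor_positioned(8.4, -3.1)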
 As shown in FIG. 1, the server 40 is implemented by, for example, a desktop computer; in the present embodiment, images of the roof 101 of the building 100 captured by the camera 18 of the flying object 10 are stored in its storage.
 Next, a method of controlling the flight of the flying object 10 with the flying object control system 1 of the present embodiment will be described.
 First, when the user U enters an "ascend" instruction via the commands C1 to Cn of the screen interface IF on the display surface 22a of the touch panel unit 22 of the portable information terminal 20, the flying object 10 ascends above the building 100, as shown in FIG. 9.
 While the flying object 10 ascends above the building 100, the image M acquired by the camera 18 of the flying object 10 is displayed in the window W of the screen interface IF, as shown in FIG. 4.
 When the flying object 10 has ascended to an arbitrary altitude (for example, 30 m) and transitions to a hovering state at that altitude, the cursor P is generated at the top 101a of the roof 101, the position on the image M corresponding to the actual position of the flying object 10.
 Subsequently, as shown in FIG. 7, the user U touches the image M through the window W, moves the cursor P, and positions it at the corner 101b of the roof 101, the target point.
 At this time, because the cursor P moves at the offset distance α from the contact point S, the corner 101b of the roof 101 is not hidden by the finger F of the user U, and the cursor P can be accurately positioned at the target point.
 When the cursor P is positioned at the corner 101b of the roof 101, the flying object 10 moves in the horizontal direction H of the building 100, from above the top 101a of the roof 101 toward the actual position corresponding to the corner 101b of the roof 101 on the image M, as shown in FIG. 9.
 Meanwhile, when the user U touches the image M through the window W, moves the cursor P, and positions it at the corner 101b of the roof 101, the display of the image M on the display surface 22a is scrolled so that the corner 101b of the roof 101 is positioned at the central portion of the display surface 22a, as shown in FIG. 8.
 Subsequently, when the user U enters a "capture image" instruction via the commands C1 to Cn of the screen interface IF on the display surface 22a of the touch panel unit 22 of the portable information terminal 20, the camera 18 of the flying object 10 captures an image of the corner 101b of the roof 101.
 The captured image is stored in the storage 21c of the portable information terminal 20 and also in the storage of the server 40, and is subsequently used to detect portions of the roof 101 in need of repair.
 Thereafter, when the user U touches the image M through the window W, moves the cursor P, and positions it at an arbitrary position on the image M at which the flying object 10 can land (for example, an unobstructed patch of ground surface), the flying object 10 moves in the horizontal direction H of the building 100 toward the actual position corresponding to that ground surface on the image M.
 After the flying object 10 has moved to the actual position corresponding to the ground surface on the image M, when the user U enters a "land" instruction via the commands C1 to Cn of the screen interface IF on the display surface 22a of the touch panel unit 22 of the portable information terminal 20, the flying object 10 descends and lands on the ground surface.
 As described above, according to the flying object control system 1 of the present embodiment, the cursor P is displayed superimposed on the image M displayed in the window W of the screen interface IF on the display surface 22a of the touch panel unit 22 of the portable information terminal 20, and the flying object 10 can be moved by moving this cursor P.
 Accordingly, since the flight of the flying object 10 can be controlled by a simple operation, the occurrence of unexpected behavior of the flying object 10 due to erroneous operation can be suppressed.
 Moreover, because the cursor P is generated at a position on the image M separated by the offset distance α from the position on the image M that the user U touches with the finger F through the window W of the screen interface IF on the display surface 22a, when the cursor P is moved to and positioned at an arbitrary target position on the image M, that position is not hidden by the finger F of the user U, and the cursor P can be accurately positioned at the target point.
 <Other embodiments>
 Next, another embodiment of the present disclosure will be described. In this embodiment, the on-image distance between the initial position of the cursor P generated by the cursor generation module 32 and the position of the cursor P determined by the cursor movement module 33 is calculated, and based on information relating to the altitude of the flying object 10 and the zoom state of the camera 18, the real-space distance corresponding to that on-image distance is estimated. This enables the flying object 10 to move more accurately in the horizontal direction of the building 100 toward the actual position corresponding to the ground surface on the image M. An example of this embodiment is described below.
 FIG. 10 is a diagram outlining processing according to another embodiment of the present disclosure. In the illustrated image M, an altitude information box H indicating the altitude of the flying object 10 is superimposed and displayed. The image M also displays a zoom information object Z indicating the zoom magnification of the camera 18. The zoom information object Z may also serve as a command for adjusting the zoom magnification of the camera 18.
 For example, when the cursor P is positioned at the corner 101b of the roof 101, the cursor movement module 33 calculates the on-image distance between the initial position of the cursor P generated by the cursor generation module 32 (in FIG. 10, the position corresponding to the top 101a of the roof 101 displayed in the image M) and the position at which the cursor P was positioned (in FIG. 10, the position corresponding to the corner 101b of the roof 101 displayed in the image M). The method of calculating the distance on the image M is not particularly limited.
 The video processing module 34 estimates the real-space distance corresponding to the above on-image distance, based on that distance on the image M and on at least one of the altitude of the flying object 10 and the zoom information of the camera 18. The real-space distance referred to here is the horizontal distance over which the flying object 10 should move.
 The control signal generation module 35 then generates a control signal corresponding to the estimated real-space distance and transmits it to the flying object 10. On receiving this control signal, the flying object 10 moves in the horizontal direction H of the building 100 toward the actual position corresponding to the positioned location on the image M.
 The distance moved by the flying object 10 is the estimated real-space distance, which is estimated based on the altitude of the flying object 10 and the zoom information of the camera 18. The size of the roof 101 displayed in the image M changes with the altitude of the flying object 10 and the zoom adjustment of the camera 18, and with it, the correspondence between the on-image distance from the initial position of the cursor P to the target position and the real-space distance also changes.
 Therefore, by estimating the real-space distance based on the altitude of the flying object 10 and the zoom information of the camera 18, the flying object 10 can be moved more accurately regardless of its altitude and the zoom state of the camera 18.
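 A minimal sketch of one such estimate, assuming a downward-facing pinhole camera whose ground footprint scales linearly with altitude and inversely with zoom magnification; the field-of-view and image-width figures are assumptions, since the disclosure does not prescribe a formula:

    import math

    def pixels_to_meters(d_px: float, altitude_m: float, zoom: float,
                         fov_deg: float = 84.0, image_width_px: int = 1920) -> float:
        # Hypothetical estimate of the real-space horizontal distance that
        # corresponds to an on-image distance d_px, assuming the camera 18
        # points straight down. fov_deg is the assumed horizontal field of
        # view at 1x zoom; zooming in narrows the ground footprint.
        half_fov = math.radians(fov_deg) / 2.0
        ground_width_m = 2.0 * altitude_m * math.tan(half_fov) / zoom
        meters_per_pixel = ground_width_m / image_width_px
        return d_px * meters_per_pixel

    # Example: 400 px on the image M at 30 m altitude and 1x zoom is roughly 11 m.
    print(round(pixels_to_meters(400, 30.0, 1.0), 1))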
 Note that the present disclosure is not limited to the above embodiment, and various modifications are possible without departing from the spirit of the invention. In the above embodiment, the case in which the flying object control system 1 includes the server 40 was described, but the flying object control system 1 can also be constructed without deploying the server 40. Alternatively, the flying object control system 1 may be constructed with the flying object 10 and the portable information terminal 20 communicating directly; in this case, the functions of the server 40 may, for example, be realized by the portable information terminal 20. FIGS. 11 and 12 are diagrams outlining flying object control systems according to other embodiments. For example, as shown in FIG. 11, in a flying object control system 1 according to another embodiment, the flying object 10 and the portable information terminal 20 may communicate directly while the portable information terminal 20 and the server 40 also communicate directly; in this case, the server 40 communicates with the flying object 10 via the portable information terminal 20. Alternatively, as shown in FIG. 12, the portable information terminal 20 and the server 40 may communicate directly while the server 40 and the flying object 10 communicate directly; in this case, the portable information terminal 20 communicates with the flying object 10 via the server 40.
 In the above embodiment, the case in which the object is the roof 101 of the building 100 was described, but the object may be a tree or any ground surface, or even an object such as a temporarily stopped automobile or an animal.
1  Flying object control system
10  Flying object
13  Flight controller
18  Camera (image acquisition unit)
20  Portable information terminal
21  Control unit
22  Touch panel unit (display unit)
22a  Display surface
30  Control program
32  Cursor generation module
33  Cursor movement module
34  Video processing module
35  Control signal generation module
100  Building
101  Roof (object)
101a  Top
101b  Corner
C1–Cn  Commands
F  Finger
M  Image
P  Cursor
S  Contact point
U  User
W  Window

Claims (9)

  1.  A flying object control system for controlling the flight of a flying object that flies according to user operation, comprising:
     an image acquisition unit mounted on the flying object and acquiring an image of an object from the flying object;
     a display unit having a display surface on which the image acquired by the image acquisition unit is displayed;
     a cursor generation module that generates a cursor displayed on the display surface superimposed on the image displayed on the display surface of the display unit; and
     a cursor movement module that moves the cursor generated by the cursor generation module to an arbitrary position on the image,
     wherein the flying object is moved to the actual position corresponding to the arbitrary position on the image at which the cursor, moved by the cursor movement module, is positioned.
  2.  The flying object control system according to claim 1, wherein, when the cursor is positioned at an arbitrary position on the image, that position on the image is positioned at the central portion of the display surface.
  3.  The flying object control system according to claim 1 or 2, wherein the display unit comprises a touch panel device that receives input of operations for the flying object when the user touches the display surface.
  4.  The flying object control system according to claim 3, wherein the cursor generation module generates the cursor at a position on the image separated from the position on the image that the user touches through the display surface.
  5.  The flying object control system according to any one of claims 1 to 4, wherein the cursor movement module, taking the position on the image that the user touches through the display surface as a contact point, moves the cursor on the image so as to follow the displacement of the contact point when the user displaces the contact point while remaining in contact with the display surface, and positions the cursor at the arbitrary position on the image at which the cursor is located when the user releases contact with the display surface.
  6.  The flying object control system according to any one of claims 1 to 5, wherein the cursor movement module calculates the on-image distance between the initial position of the cursor generated by the cursor generation module and the position of the cursor moved and positioned by the cursor movement module, the system further comprising a video processing module that estimates the real-space distance corresponding to the on-image distance based on the on-image distance and information relating to the zoom state of the image acquisition unit.
  7.  The flying object control system according to claim 6, wherein the video processing module estimates the real-space distance corresponding to the on-image distance based on altitude information relating to the altitude of the flying object.
  8.  The flying object control system according to any one of claims 1 to 7, wherein the flying object is moved above the object in the horizontal direction of the object.
  9.  The flying object control system according to any one of claims 1 to 8, wherein the object is a roof of a building.
PCT/JP2020/024100 2019-06-24 2020-06-19 Control system for flying vehicle WO2020262222A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2021526902A JPWO2020262222A1 (en) 2019-06-24 2020-06-19

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019116133 2019-06-24
JP2019-116133 2019-06-24

Publications (1)

Publication Number Publication Date
WO2020262222A1 2020-12-30

Family

ID=74059740

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/024100 WO2020262222A1 (en) 2019-06-24 2020-06-19 Control system for flying vehicle

Country Status (2)

Country Link
JP (1) JPWO2020262222A1 (en)
WO (1) WO2020262222A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5155683A (en) * 1991-04-11 1992-10-13 Wadiatur Rahim Vehicle remote guidance with path control
JPH07135685A (en) * 1993-01-05 1995-05-23 Sfim Ind Guidance system
JP2009229313A (en) * 2008-03-24 2009-10-08 Toyota Motor Corp Onboard information device
WO2017057157A1 (en) * 2015-09-30 2017-04-06 株式会社ニコン Flight device, movement device, server, and program
JP2017138162A (en) * 2016-02-02 2017-08-10 Jfe鋼板株式会社 Inspection system and inspection method of structure
CN108521803A (en) * 2017-03-15 2018-09-11 深圳市大疆创新科技有限公司 Unmanned vehicle destination planing method, system, electronic equipment and storage medium
WO2019093504A1 (en) * 2017-11-09 2019-05-16 株式会社Clue Terminal, method and program for operating drone

Also Published As

Publication number Publication date
JPWO2020262222A1 (en) 2020-12-30


Legal Events

Code Description
121  EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 20832557; Country of ref document: EP; Kind code of ref document: A1)
ENP  Entry into the national phase (Ref document number: 2021526902; Country of ref document: JP; Kind code of ref document: A)
NENP  Non-entry into the national phase (Ref country code: DE)
122  EP: PCT application non-entry in European phase (Ref document number: 20832557; Country of ref document: EP; Kind code of ref document: A1)