WO2022070851A1 - Method, system, and program - Google Patents

Method, system, and program

Info

Publication number
WO2022070851A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
information
flight
flying object
flying
Prior art date
Application number
PCT/JP2021/033465
Other languages
French (fr)
Japanese (ja)
Inventor
瑛理香 田中
Original Assignee
株式会社Clue
Priority date
Filing date
Publication date
Application filed by 株式会社Clue filed Critical 株式会社Clue
Priority to JP2022553765A priority Critical patent/JPWO2022070851A1/ja
Publication of WO2022070851A1 publication Critical patent/WO2022070851A1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C13/00 Control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers
    • B64C13/02 Initiating means
    • B64C13/16 Initiating means actuated automatically, e.g. responsive to gust detectors
    • B64C13/20 Initiating means actuated automatically, e.g. responsive to gust detectors using radiated signals
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00 Constructional aspects of UAVs
    • B64U20/80 Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U20/87 Mounting of imaging devices, e.g. mounting of gimbals
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 UAVs specially adapted for particular uses or applications
    • B64U2101/30 UAVs specially adapted for particular uses or applications for imaging, photography or videography

Definitions

  • This disclosure relates to methods, systems and programs.
  • Patent Document 1 discloses a technique for operating an unmanned flying object using a touch panel.
  • This disclosure has been made in view of this background, and its object is to provide a method, a system, and a program capable of easily setting the flight path of a flying object in the field.
  • According to the present disclosure, there is provided a method relating to control of a flying object, the method including: acquiring information of an image captured by the flying object; displaying the image on a display unit; acquiring input information generated based on an operation on the image displayed on the display unit, the input information including information about a position on the image; and outputting, to the flying object, flight control information for flying to a position in real space corresponding to the position on the image.
  • Further, according to the present disclosure, there is provided a system relating to control of a flying object, the system including: an image acquisition unit that acquires information of an image captured by the flying object; a display control unit that displays the image on a display unit; an input information acquisition unit that acquires input information including information about a position on the image, generated based on an operation on the image displayed on the display unit; and an output control unit that outputs, to the flying object, flight control information for flying to a position in real space corresponding to the position on the image.
  • Further, according to the present disclosure, there is provided a program for causing a computer to function as a control device for a flying object, the program causing the computer to function as: an image acquisition unit that acquires information of an image captured by the flying object; a display control unit that displays the image on a display unit; an input information acquisition unit that acquires input information including information about a position on the image, generated based on an operation on the image displayed on the display unit; and an output control unit that outputs, to the flying object, flight control information for flying to a position in real space corresponding to the position on the image.
  • FIG. 1 is a diagram showing an outline of a system 1 according to an embodiment of the present disclosure.
  • As shown in the figure, the system 1 includes an information processing terminal 10 and an unmanned flying object 20.
  • The system 1 according to the present embodiment can be used, for example, for construction management and inspection of the building S1, which is an object photographed by the unmanned flying object 20.
  • In such a system 1, a user U operates the touch panel of the information processing terminal 10 so as to trace the flight path of the unmanned flying object 20.
  • Such an operation may be a continuous or intermittent operation such as a slide or a swipe.
  • The unmanned flying object 20 can be controlled to fly at positions in real space estimated from the position information on the image obtained by such a tracing operation.
  • The information processing terminal 10 according to the present embodiment is implemented as a so-called tablet-type small computer.
  • In other embodiments, the information processing terminal 10 may be realized by a portable information processing terminal such as a smartphone or a game console, or by a stationary information processing terminal such as a personal computer.
  • The information processing terminal 10 may also be realized by a plurality of hardware devices with its functions distributed among them.
  • FIG. 2 is a block diagram showing the configuration of the information processing terminal 10 according to the present embodiment.
  • As shown in the figure, the information processing terminal 10 includes a control unit 11 and a touch panel 12, which is an example of a display unit.
  • The processor 11a is an arithmetic unit that controls the operation of the control unit 11, controls the transmission and reception of data between the elements, and performs the processing necessary for program execution.
  • In the present embodiment, the processor 11a is, for example, a CPU (Central Processing Unit), and performs each process by executing a program stored in the storage 11c (described later) and loaded into the memory 11b.
  • The memory 11b includes a main storage device composed of a volatile storage device such as a DRAM (Dynamic Random Access Memory), and an auxiliary storage device composed of a non-volatile storage device such as a flash memory or an HDD (Hard Disk Drive). The memory 11b is used as a work area of the processor 11a, and also stores the BIOS (Basic Input/Output System) executed when the control unit 11 is started, various setting information, and the like.
  • The storage 11c stores programs and information used for various processes. For example, when the user operates, via the information processing terminal 10, a flying object for capturing image information of the roof 101, the storage 11c may store a program for controlling the flight of that flying object.
  • The transmission/reception unit 11d connects the control unit 11 to a network such as the Internet, and may include a short-range communication interface such as Bluetooth (registered trademark) or BLE (Bluetooth Low Energy).
  • The input/output unit 11e is an interface to which input/output devices are connected; in the present embodiment, the touch panel 12 is connected to it.
  • The bus 11f transmits, for example, address signals, data signals, and various control signals among the connected processor 11a, memory 11b, storage 11c, transmission/reception unit 11d, and input/output unit 11e.
  • The touch panel 12 is an example of a display unit and includes a display surface on which acquired video and images are displayed.
  • In the present embodiment, this display surface accepts information input through contact with the display surface, and is implemented by various technologies such as the resistive film method or the capacitance method.
  • An image captured by the unmanned flying object 20 can be displayed on the display surface of the touch panel 12. Buttons, objects, and the like for flight control of the unmanned flying object 20 and for control of the imaging device may also be displayed on the display surface. The user can enter input information for the images, buttons, and the like displayed on the display surface via the touch panel 12.
  • The input information will be described in detail later; the operations for entering it include, for example, a touch (tap) operation on a button or object, and a tap, slide, or swipe operation for determining the flight path of the unmanned flying object 20 (see the sketch below).
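  • As an illustration of how such tap and slide operations might be collected into input information, the following minimal Python sketch accumulates touch events into a list of image positions. All names here are hypothetical illustrations, not identifiers from the patent.

```python
# Sketch: collecting touch operations into "input information".
# Class and method names are illustrative assumptions, not from the patent.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class InputInformation:
    # Pixel positions on the displayed image, in the order they were touched.
    positions: List[Tuple[int, int]] = field(default_factory=list)

class TouchInputCollector:
    """Accumulates tap and slide events from the touch panel."""
    def __init__(self) -> None:
        self.info = InputInformation()

    def on_tap(self, x: int, y: int) -> None:
        # A single tap contributes one position (a waypoint or a button hit).
        self.info.positions.append((x, y))

    def on_slide(self, points: List[Tuple[int, int]]) -> None:
        # A slide or swipe contributes a continuous run of positions.
        self.info.positions.extend(points)

# Usage: one traced stroke followed by a tap.
collector = TouchInputCollector()
collector.on_slide([(100, 120), (101, 121), (103, 124)])
collector.on_tap(200, 180)
print(collector.info.positions)  # [(100, 120), (101, 121), (103, 124), (200, 180)]
```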
  • FIG. 3 is a block diagram showing an example of the functional configuration of the unmanned aircraft 20 according to the present embodiment.
  • As shown in FIG. 3, the unmanned flying object 20 according to one embodiment includes, in its main body 21, a transmission/reception unit 22, a flight controller 23, a battery 24, an ESC 25, motors 26, propellers 27, and a camera 28.
  • The unmanned flying object 20 is an example of a flying object.
  • The type of flying object is not particularly limited and may be, for example, a so-called multi-rotor drone as shown in FIG. 3.
  • The flight controller 23 can have one or more processors 23A, such as a programmable processor (e.g., a central processing unit (CPU)).
  • The flight controller 23 has a memory 23B and can access it.
  • The memory 23B stores logic, code, and/or program instructions that the flight controller can execute to perform one or more steps.
  • The memory 23B may include, for example, a separable medium such as an SD card or random access memory (RAM), or an external storage device.
  • Data acquired from the sensors 23C may be transmitted directly to, and stored in, the memory 23B.
  • For example, still image and moving image data captured by the camera 28 are recorded in built-in memory or external memory.
  • The flight controller 23 includes a control module configured to control the state of the flying object.
  • For example, the control module controls the propulsion mechanism of the flying object (the motors 26 and the like) via the ESC (Electric Speed Controller) 25 in order to adjust the spatial arrangement, velocity, and/or acceleration of the flying object with six degrees of freedom (translations x, y, and z, and rotations θx, θy, and θz).
  • The control module can also control one or more of the camera 28, the sensors 23C, and the like.
  • The flight controller 23 can communicate with the transmission/reception unit 22, which is configured to transmit data to and/or receive data from one or more external devices (for example, a terminal such as the information processing terminal 10, a display device, or another remote controller).
  • The transmission/reception unit 22 can use one or more of a local area network (LAN), a wide area network (WAN), infrared, radio, WiFi, a point-to-point (P2P) network, a telecommunications network, cloud communication, and the like.
  • The transmission/reception unit 22 can transmit and/or receive one or more of the data acquired by the camera 28 and the sensors 23C, the processing results generated by the flight controller 23, predetermined control data, user commands from the information processing terminal 10 or a remote controller, and the like.
  • The sensors 23C according to the present embodiment may include an inertial sensor (acceleration sensor, gyro sensor), a GPS sensor, a proximity sensor (e.g., lidar), or a vision/image sensor (e.g., a camera).
  • The battery 24 can be a known battery such as a lithium polymer battery.
  • The power for driving the unmanned flying object 20 is not limited to electric power supplied from the battery 24 or the like, and may come, for example, from an internal combustion engine.
  • The camera 28 is an example of an imaging device.
  • The type of the camera 28 is not particularly limited and may be, for example, an ordinary digital camera, an omnidirectional camera, an infrared camera, or an image sensor such as a thermographic camera.
  • The camera 28 may be connected to the main body 21 via a gimbal or the like (not shown) so as to be independently displaceable.
  • FIG. 4 is a block diagram showing a functional configuration of the control unit 11 according to the present embodiment.
  • As shown in FIG. 4, the control unit 11 includes an image acquisition unit 111, a display control unit 112, an input information acquisition unit 113, a flight control information generation unit 114, a camera control information generation unit 115, and an output control unit 116.
  • Each of these functional units can be realized by the processor 11a reading a program stored in the storage 11c into the memory 11b and executing it; a structural sketch follows below.
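  • The division into functional units could be mirrored in code roughly as follows. This is only a structural sketch under assumed names and placeholder bodies, not the patent's implementation.

```python
# Structural sketch of the control unit 11's functional units. The numbers in
# the comments mirror the reference numerals in the text; bodies are placeholders.
class ImageAcquisitionUnit:             # 111
    def acquire(self, drone):
        return drone.latest_frame()     # hypothetical accessor for the latest image

class DisplayControlUnit:               # 112
    def show(self, image, overlays=()):
        ...                             # draw the image plus buttons/objects on the touch panel

class InputInformationAcquisitionUnit:  # 113
    def acquire(self, touch_events):
        ...                             # turn taps and slides into positions on the image

class FlightControlInfoGenerator:       # 114
    def generate(self, input_info):
        ...                             # image positions -> real-space flight path

class CameraControlInfoGenerator:       # 115
    def generate(self, input_info):
        ...                             # imaging target position -> camera orientation

class OutputControlUnit:                # 116
    def output(self, drone, flight_info, camera_info):
        drone.upload(flight_info, camera_info)  # hypothetical upload call
```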
  • The image acquisition unit 111 has the function of acquiring information on images captured by the unmanned flying object 20.
  • The image acquisition unit 111 acquires, as appropriate, images captured by the camera 28 mounted on the unmanned flying object 20.
  • The acquired image may be a moving image obtained in real time or a still image captured at an arbitrary timing.
  • The acquired image information is output to the display control unit 112.
  • The display control unit 112 has the function of displaying the acquired image on the touch panel 12. It also has the function of displaying, within the image, buttons, objects, text, and other information for presenting information to the user of the system 1 and for acquiring input information based on the user's operations. It may further have the function of displaying on the touch panel 12 a display based on the input information obtained by the input information acquisition unit 113, described later.
  • The input information acquisition unit 113 has the function of acquiring input information generated based on operations on the image displayed on the touch panel 12.
  • The input information referred to here includes, for example, information about a position on the image displayed on the touch panel 12.
  • A position on the image is, for example, the position of a pixel constituting the image. That is, the input information includes information indicating at which position on the image the user performed an operation.
  • The information about a position on the image includes, for example, information on a line segment consisting of a continuous set of positions on the image.
  • Such line segment information is obtained when the user performs an operation such as tracing on the touch panel 12.
  • The trajectory obtained by the tracing can serve as the flight route of the unmanned flying object 20.
  • The tracing operation does not necessarily have to be continuous and may be intermittent. That is, even when the user's tracing operation is performed at a plurality of separate locations, a flight route can be formed based on those trajectories; the portions where the tracing was interrupted can be interpolated as appropriate to complete the route, as in the sketch below.
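  • A minimal sketch of such interpolation, assuming straight-line bridging between traced segments (the helper names are hypothetical):

```python
# Sketch: join intermittently traced strokes into one flight route by linearly
# interpolating across the gaps where the user lifted the finger.
from typing import List, Tuple

Point = Tuple[float, float]

def interpolate_gap(a: Point, b: Point, steps: int = 10) -> List[Point]:
    """Evenly spaced points strictly between a and b."""
    return [(a[0] + (b[0] - a[0]) * t / steps, a[1] + (b[1] - a[1]) * t / steps)
            for t in range(1, steps)]

def join_segments(segments: List[List[Point]]) -> List[Point]:
    route: List[Point] = []
    for seg in segments:
        if route:  # bridge the gap left between strokes
            route.extend(interpolate_gap(route[-1], seg[0]))
        route.extend(seg)
    return route

# Two separate strokes become one route of 4 traced + 9 interpolated points.
strokes = [[(0.0, 0.0), (10.0, 0.0)], [(20.0, 5.0), (30.0, 5.0)]]
print(len(join_segments(strokes)))  # 13
```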
  • The input information may include information about an imaging target position, on the image, for the camera 28 mounted on the unmanned flying object 20.
  • The imaging target position on the image is a position used to determine the imaging direction of the camera 28 during flight of the unmanned flying object 20.
  • The user can designate at least one point in the image of the space displayed on the touch panel 12 as the imaging target position.
  • The input information may include information about the position (altitude) of the unmanned flying object 20 in the height direction in real space during flight under flight control.
  • The orientation of the camera 28 in flight can be determined from the height-direction position information, the (horizontal) real-space position of the unmanned flying object 20 during flight, and the imaging target position of the camera 28.
  • The input information may include information about the speed of the unmanned flying object 20 during flight under flight control.
  • The information input via the touch panel 12 or the like can be acquired as input information by the input information acquisition unit 113 as appropriate.
  • The flight control information generation unit 114 has the function of generating information for controlling the flight of the unmanned flying object 20 (flight control information) based on the input information.
  • The flight control information includes, for example, information on the flight path of the unmanned flying object 20, the start and end points of the flight path, the orientation of the unmanned flying object 20 (about the pitch, roll, and yaw axes), the flight altitude of the unmanned flying object 20, and the flight speed of the unmanned flying object 20.
  • Flight control here means that the flight of the unmanned flying object 20 is controlled according to the flight control information generated based on the input information according to the present embodiment entered on the touch panel 12.
  • The flight control information generation unit 114 calculates, for example, the position in real space corresponding to a position on the image included in the input information.
  • The position information in real space is, for example, latitude and longitude information. Such latitude and longitude information can be calculated based on the imaging position information of the camera 28 (or of the unmanned flying object 20).
  • For example, taking the pixel at the center of the captured image as a reference, the flight control information generation unit 114 can calculate the real-space position corresponding to each pixel of the image based on the angle of view of the camera 28 and the height-direction position of the unmanned flying object 20 at the time the camera 28 captured the image while hovering, for example as sketched below.
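  • A worked sketch of one common form of this calculation, assuming flat ground, a camera pointing straight down, a known horizontal angle of view, and the top of the image facing north (the function name and parameters are illustrative):

```python
import math

# Sketch: map a pixel to a latitude/longitude for a camera hovering at a known
# altitude and looking straight down. A textbook approximation, not necessarily
# the patent's exact method.
EARTH_RADIUS_M = 6_378_137.0  # WGS84 equatorial radius

def pixel_to_latlon(px, py, img_w, img_h, fov_h_deg, altitude_m, cam_lat, cam_lon):
    # Ground width covered by the image at this altitude, from the angle of view.
    ground_w = 2.0 * altitude_m * math.tan(math.radians(fov_h_deg) / 2.0)
    meters_per_px = ground_w / img_w            # ground sample distance
    dx = (px - img_w / 2.0) * meters_per_px     # east offset from the image center
    dy = (img_h / 2.0 - py) * meters_per_px     # north offset (image y grows downward)
    # Small-offset conversion from meters to degrees around the camera position.
    dlat = math.degrees(dy / EARTH_RADIUS_M)
    dlon = math.degrees(dx / (EARTH_RADIUS_M * math.cos(math.radians(cam_lat))))
    return cam_lat + dlat, cam_lon + dlon

# A pixel 100 px right of center, seen from 50 m up with an 84-degree lens.
print(pixel_to_latlon(1060, 540, 1920, 1080, 84.0, 50.0, 35.6812, 139.7671))
```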
  • The flight control information generation unit 114 also determines the flight speed, orientation, and the like of the unmanned flying object 20 in flight at each flight position, and generates these as flight control information.
  • The flight speed during flight may be determined, for example, by an input based on a user operation.
  • The orientation during flight may be determined, for example, according to the orientation of the camera 28 determined by the camera control information generation unit 115 described later; alternatively, if the camera 28 is connected to the main body 21 by a gimbal or the like, the orientation during flight may be kept constant.
  • Control of the behavior of the unmanned flying object 20 during such flight can be realized by known techniques.
  • The camera control information generation unit 115 has the function of generating information for controlling the operation of the camera 28 (camera control information) based on the input information.
  • The camera control information includes, for example, information about the orientation of the camera 28, the imaging timing, the imaging process, and the like.
  • For example, the camera control information generation unit 115 generates information for controlling the orientation (imaging direction) of the camera 28 based on the position in real space corresponding to the imaging target position.
  • The information for controlling the orientation of the camera 28 includes, for example, information about the orientation of the camera 28 at each flight position, calculated based on the imaging target position acquired by the input information acquisition unit 113 and the position of the unmanned flying object 20 in flight. At least the horizontal orientation of the camera 28 can be calculated from the relationship between the imaging target position and the flight position, as in the sketch below.
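  • For the horizontal component, the calculation can be as simple as an atan2 over the offsets between the two positions, as in this hypothetical helper (aviation convention assumed: heading measured clockwise from north):

```python
import math

# Sketch: horizontal camera orientation (yaw) from the drone's flight position
# toward the imaging target position, both given in local east/north meters.
def yaw_toward(target_xy, drone_xy):
    dx = target_xy[0] - drone_xy[0]  # east offset
    dy = target_xy[1] - drone_xy[1]  # north offset
    return math.degrees(math.atan2(dx, dy)) % 360.0  # clockwise from north

print(yaw_toward((10.0, 10.0), (0.0, 0.0)))  # 45.0: target lies to the north-east
```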
  • An imaging target position corresponding to the flight position at a given timing may also be determined separately.
  • As the imaging target position, for example, only the horizontal position may be specified.
  • In that case, the angle of the camera 28 about the pitch axis can be adjusted as appropriate, as described later.
  • The orientation of the camera 28 can also be controlled as appropriate during flight other than under flight control, for example in response to user input on the touch panel 12.
  • The imaging target position may include a position in the height direction in real space. If such a height-direction position is set, the angle of the camera 28 about the pitch axis can be determined automatically, for example as sketched below.
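  • A sketch of that automatic determination, with an assumed sign convention (negative pitch means the camera tilts down):

```python
import math

# Sketch: pitch (gimbal tilt) toward a target whose height is known, from the
# height difference and the horizontal distance. Hypothetical helper.
def pitch_toward(target_alt_m, drone_alt_m, horizontal_dist_m):
    return math.degrees(math.atan2(target_alt_m - drone_alt_m, horizontal_dist_m))

# A drone at 50 m looking at a rooftop point 10 m high, 40 m away horizontally.
print(pitch_toward(10.0, 50.0, 40.0))  # -45.0 degrees, i.e. tilted downward
```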
  • The output control unit 116 has the function of outputting the various information generated by the flight control information generation unit 114 and the camera control information generation unit 115 to the unmanned flying object 20.
  • The unmanned flying object 20 controls its flight and the operation of the camera 28 based on the acquired flight control information and camera control information.
  • FIG. 5 is a flowchart of a series of controls in the system 1 according to the present embodiment.
  • The image acquisition unit 111 acquires image information from the unmanned flying object 20 (step SQ101). At this time, the unmanned flying object 20 is hovering above the object to be imaged.
  • FIG. 6 is a diagram showing an example of the first situation relating to the control method by the system 1 according to the present embodiment.
  • As shown in FIG. 6, the unmanned flying object 20 is hovering at the altitude H1 above the building S1, which is the object to be imaged.
  • The imaging direction of the camera 28 is directly downward.
  • Although the altitude H1 here is the altitude of the camera 28, it may instead be the altitude of the unmanned flying object 20.
  • FIG. 7 is a diagram showing a first screen example displayed on the touch panel 12 according to the present embodiment.
  • An information bar D11, a main screen D12, a sub-screen D13, and a plurality of buttons and objects 101 to 109 are displayed on the screen V1 of the touch panel 12.
  • The information bar D11 is an area mainly for presenting information about the touch panel 12 and the unmanned flying object 20.
  • For example, the information bar D11 displays the radio signal condition of the touch panel 12, the remaining battery level, and the current altitude of the unmanned flying object 20.
  • A captured image or the like obtained by the camera 28 may be displayed on the main screen D12. On the main screen D12, other information may be displayed superimposed on the captured image.
  • The sub-screen D13 is an area for displaying the captured image captured by the camera 28.
  • The sub-screen D13 may be used, for example, to supplementally display a captured image used for determining a flight path. For example, tapping the sub-screen D13 can start the flight route setting. Specific examples will be described later.
  • The button 101 is used for landing the unmanned flying object 20.
  • The button 102 is, for example, for moving the unmanned flying object 20 to the start point of the set flight path.
  • The button 103 is for starting control of flight along the flight path of the unmanned flying object 20.
  • The button 104 is for performing imaging processing with the camera 28.
  • The button 105 is for adjusting the altitude of the unmanned flying object 20.
  • The button 106 is for adjusting the horizontal position of the unmanned flying object 20.
  • The button 107 is for displaying or editing the set route.
  • The object 108 is for adjusting the horizontal orientation (about the yaw axis) of the unmanned flying object 20.
  • The object 109 is for adjusting the angle of the camera 28 about the pitch axis.
  • The captured image displayed on the main screen D12 includes images of the building S1, the plaza S2, and the road S3, which are the portions to be imaged.
  • FIG. 8 is a diagram showing a second screen example displayed on the touch panel 12 according to the present embodiment.
  • The screen shown in FIG. 8 is a screen for inputting the flight path of the unmanned flying object 20.
  • The captured image and buttons 201 to 204 are displayed on the screen V1.
  • The button 201 is for setting a flight path.
  • The button 202 is for setting the center point (imaging target position) of the captured image.
  • The button 203 is for setting the flight speed of the unmanned flying object 20.
  • The button 204 is for saving the settings.
  • Points and lines corresponding to the flight path can be input on the screen V1.
  • The user operates the screen V1 with a finger, a stylus pen, or the like to designate the positions on the image corresponding to the flight path.
  • Such an operation may be an intuitive slide operation, like drawing with a pen.
  • FIG. 9 is a diagram showing a third screen example displayed on the touch panel 12 according to the present embodiment.
  • A locus 210 can be drawn on the screen V1 by a user operation.
  • Points 211 indicating the start point and the end point of the locus 210 may be displayed in a manner different from the rest of the locus.
  • The points 211 indicating the start point and the end point can serve, for example, as the start and goal points in the flight control of the unmanned flying object 20.
  • The start point and the end point of the locus (line segment) may or may not coincide.
  • One of the start point and the end point may be adjusted to coincide with the other, for example as in the sketch below.
  • When they coincide, the images (frames) captured at the start and at the end of the flight of the unmanned flying object 20 match, which makes it possible to obtain a more useful image.
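  • One simple way such an adjustment might work is to snap the traced end point onto the start point whenever the two are close, as in this hypothetical sketch (the snap threshold is an assumed tuning value):

```python
import math

# Sketch: close the traced locus into a loop when start and end nearly coincide.
def close_loop(points, snap_px=20.0):
    if len(points) >= 2:
        (x0, y0), (x1, y1) = points[0], points[-1]
        if math.hypot(x1 - x0, y1 - y0) <= snap_px:
            points[-1] = points[0]  # end point adjusted onto the start point
    return points

print(close_loop([(0, 0), (100, 0), (100, 100), (5, 3)]))  # last point snaps to (0, 0)
```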
  • Either the start point or the end point may be made movable along the locus 210. The locus 210, once drawn, may also be modified as appropriate.
  • FIG. 10 is a diagram showing a fourth screen example displayed on the touch panel 12 according to the present embodiment.
  • The imaging target position (also referred to as a center point) 212 can be set on the image.
  • The imaging target position 212 on the image can be adjusted by, for example, a tap operation or a slide operation.
  • The imaging target position 212 on the image determined here corresponds to at least a horizontal position in real space. That is, during flight control, the camera 28 of the unmanned flying object 20 keeps facing, in the horizontal direction, the direction in which the real-space imaging target position corresponding to the on-image imaging target position 212 exists.
  • FIG. 11 is a diagram showing a modified example of the fourth screen example displayed on the touch panel 12 according to the present embodiment.
  • The imaging target position may be selected not as a single point but as a line segment or the like consisting of a plurality of points.
  • In this case, the start point and the end point of the imaging target position 213 may be indicated as points 214, similarly to the points 211 indicating the start point and the end point of the locus 210. This makes it possible to dynamically change the orientation of the camera 28 of the unmanned flying object 20 under flight control toward the imaging target.
  • FIG. 12 is a diagram showing a fifth screen example displayed on the touch panel 12 according to the present embodiment.
  • The flight speed can be selected by, for example, choosing one of the low speed button 215, the medium speed button 216, and the high speed button 217.
  • The method of setting the flight speed is not limited to this example; for example, the flight speed may be specified numerically.
  • Tapping the button 204 completes the setting of the flight path and the imaging target position of the camera.
  • FIG. 13 is a diagram showing a sixth screen example displayed on the touch panel 12 according to the present embodiment.
  • The screen V1 shown in FIG. 13 displays a form 120 with which the user, having tapped the button 105 on the screen V1 shown in FIG. 7, sets the altitude during flight under flight control of the unmanned flying object 20.
  • The altitude during flight may be fixed or variable. When the altitude is variable, it may be set as appropriate by a form other than the form 120.
  • The altitude during flight may also be predetermined.
  • The information obtained by the user's operations on the touch panel 12 in steps SQ103 to SQ109 can be output to the flight control information generation unit 114 and the camera control information generation unit 115 as input information, as appropriate.
  • The flight control information generation unit 114 generates, based on the obtained input information, flight control information for flying the positions in real space corresponding to the locus 210 (step SQ111). The camera control information generation unit 115 generates, based on the obtained input information, camera control information so that the camera 28 keeps facing the real-space imaging target position corresponding to the on-image imaging target position 212 (step SQ113). These pieces of information are output to the unmanned flying object 20 by the output control unit 116; a sketch of this generation step follows below.
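  • The generation step could be sketched as follows; the data shapes are assumptions, and to_latlon stands for a pixel-to-position mapping like the one sketched earlier:

```python
# Sketch of steps SQ111/SQ113: turn the on-image locus and imaging target
# position into flight control information and camera control information.
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class FlightControlInfo:
    waypoints: List[Tuple[float, float]]  # (lat, lon) along the traced locus
    altitude_m: float
    speed_mps: float

@dataclass
class CameraControlInfo:
    target_latlon: Tuple[float, float]    # real-space imaging target position

def generate_control_info(locus_px, target_px, altitude_m, speed_mps,
                          to_latlon: Callable[[float, float], Tuple[float, float]]):
    flight = FlightControlInfo(
        waypoints=[to_latlon(x, y) for x, y in locus_px],
        altitude_m=altitude_m,
        speed_mps=speed_mps,
    )
    camera = CameraControlInfo(target_latlon=to_latlon(*target_px))
    return flight, camera  # both are then output to the flying object
```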
  • FIG. 14 is a diagram showing an example of a second situation relating to the control method by the system 1 according to the present embodiment.
  • As shown in FIG. 14, the unmanned flying object 20 makes a circling flight above the building S1 at the height H2 according to the flight control information.
  • According to the camera control information, the unmanned flying object 20 can control its attitude during flight so that the imaging target position C1 lies in the imaging direction of the camera 28.
  • FIG. 15 is a diagram showing a seventh screen example displayed on the touch panel 12 according to the present embodiment.
  • Images captured by the camera 28 of the unmanned flying object 20 in flight under flight control are displayed as appropriate.
  • The captured image may be a moving image or still images captured at predetermined intervals. The image captured during hovering may be displayed on the sub-screen D13.
  • The locus 130 corresponding to the flight path previously set by the user's input may be superimposed on this display.
  • An object 131 corresponding to the unmanned flying object 20 may be displayed on the sub-screen D13 at the position corresponding to the current position of the unmanned flying object 20 flying along the flight path, for example as sketched below. This makes it possible to confirm, in real time (or when viewing the image after the flight), from which point on the flight path the image displayed on the main screen D12 was captured.
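  • Placing that marker amounts to inverting the earlier pixel-to-ground mapping: project the drone's current latitude/longitude back onto the hovering reference image. A sketch under the same flat-ground, straight-down assumptions:

```python
import math

# Sketch: pixel coordinates of the drone marker (object 131) on the sub-screen
# image, given the reference image's camera position, altitude, and angle of view.
def latlon_to_pixel(lat, lon, img_w, img_h, fov_h_deg, altitude_m, cam_lat, cam_lon):
    earth_r = 6_378_137.0
    ground_w = 2.0 * altitude_m * math.tan(math.radians(fov_h_deg) / 2.0)
    px_per_meter = img_w / ground_w
    dy = math.radians(lat - cam_lat) * earth_r                       # north offset, meters
    dx = math.radians(lon - cam_lon) * earth_r * math.cos(math.radians(cam_lat))
    return (img_w / 2.0 + dx * px_per_meter,                         # east maps right
            img_h / 2.0 - dy * px_per_meter)                         # north maps up

# Marker position for a drone slightly north-east of the reference camera position.
print(latlon_to_pixel(35.6814, 139.7675, 1920, 1080, 84.0, 50.0, 35.6812, 139.7671))
```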
  • An example of the control method of the unmanned flying object 20 by the system 1 according to the present embodiment has been described above.
  • Although an example of a control method for the actual flight of the unmanned flying object 20 has been described, the present technology is not limited to such an example.
  • For example, this technology can be used for simulation in which a virtual drone is flown in a VR (Virtual Reality) or AR (Augmented Reality) space.
  • Known techniques can be used for the simulation using the VR space or the AR space.
  • As described above, the unmanned flying object 20 can be made to fly a desired flight path merely by inputting position information on the screen of the touch panel 12. This makes it possible to easily set the flight path of a flying object on site. Furthermore, by setting on the touch panel 12 the imaging target position that determines the orientation of the camera, the orientation of the camera during flight can be determined easily. This facilitates imaging of an object such as a building using an unmanned flying object.
  • The flight control information and camera control information, once set, may be stored in the storage 11c or the like, for example as sketched below. This makes it possible to fly an unmanned flying object repeatedly under the same conditions, such as for inspections, and to image the target at the same position and angle each time.
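  • A minimal sketch of such persistence, assuming a JSON file and illustrative field names:

```python
# Sketch: save the settings once, reload them later to repeat the same flight.
import json

settings = {
    "waypoints": [[35.6812, 139.7671], [35.6814, 139.7675]],  # lat, lon pairs
    "altitude_m": 50.0,
    "speed": "medium",
    "imaging_target": [35.6813, 139.7673],
}

with open("flight_settings.json", "w", encoding="utf-8") as f:
    json.dump(settings, f, indent=2)

with open("flight_settings.json", encoding="utf-8") as f:
    restored = json.load(f)
assert restored == settings  # same conditions for a repeat inspection flight
```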
  • The image captured by the unmanned flying object 20 during hovering is an image captured with the imaging direction of the camera 28 mounted on the unmanned flying object 20 pointing toward the ground.
  • FIG. 16 shows an example of a control method by the system 1 according to a modification of the present embodiment. As shown in FIG. 16, in such a system 1, the imaging direction of the camera 28 while the unmanned flying object 20 is hovering may be horizontal with respect to a structure S4. In this case, the unmanned flying object 20 may be controlled so that the camera 28 images the structure S4 while the unmanned flying object 20 flies in a plane parallel to the height direction at the height H3.
  • The imaging direction of the camera 28 may be determined, for example, with the position at the height H3 on the structure S4 as the imaging target position.
  • In this way, the system 1 according to the present embodiment can be applied regardless of the positional relationship between the unmanned flying object 20 and the structure to be imaged.
  • The device described in the present specification may be realized as a single device, or may be realized by a plurality of devices, some or all of which are connected via a network.
  • For example, the control unit and the storage of the information processing terminal 10 may be realized by different servers connected to each other by a network.
  • The series of processes by the device described in the present specification may be realized using software, hardware, or a combination of software and hardware. A computer program for realizing each function of the information processing terminal 10 according to the present embodiment can be created and installed on a PC or the like. A computer-readable recording medium storing such a computer program can also be provided.
  • The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, or a flash memory. The above computer program may also be distributed, for example, via a network, without using a recording medium.
  • (Item 1) A method of controlling a flying object, including: acquiring information of an image captured by the flying object; displaying the image on a display unit; acquiring input information generated based on an operation on the image displayed on the display unit, the input information including information about a position on the image; and outputting, to the flying object, flight control information for flying to a position in real space corresponding to the position on the image.
  • (Item 2) The method according to item 1, wherein the input information includes information on a line segment consisting of a continuous set of positions on the image.
  • (Item 3) The method according to item 2, wherein the information on the line segment includes information on the start point and the end point of the line segment.
  • (Item 4) The method according to any one of items 1 to 3, wherein the input information includes information about an imaging target position, on the image, for an imaging device mounted on the flying object, the method further including outputting, to the flying object, information for controlling the orientation of the imaging device during its flight based on the flight control information, based on the imaging target position in real space corresponding to the imaging target position on the image.
  • (Item 5) The method according to item 4, wherein the imaging target position is designated at a plurality of points or continuously on the image.
  • (Item 6) The method according to item 4 or 5, wherein the imaging target position includes a position in the height direction in real space.
  • (Item 7) The method according to any one of items 1 to 6, further including obtaining information about the position of the flying object in the height direction at the time the image was captured, wherein the position in real space corresponding to the position on the image is determined based on the position of the flying object in the height direction at the time the image was captured.
  • (Item 8) The method according to any one of items 1 to 7, wherein the input information includes information about the height-direction position of the flying object in real space during flight under the flight control.
  • (Item 9) The method according to any one of items 1 to 8, further including displaying, on the display unit, an image captured by an imaging device mounted on the flying object flying based on the flight control.
  • (Item 10) The method according to any one of items 1 to 9, further including displaying, in another area of the display unit, an image previously displayed on the display unit, during flight of the flying object flying under the flight control.
  • (Item 12) The method according to any one of items 1 to 11.
  • (Item 13) The method according to any one of items 1 to 12, wherein the flight control information includes information about the speed at which the flying object is flown, and the operation on the image displayed on the display unit includes an operation for determining the speed.
  • (Item 14) The method according to any one of items 1 to 13, wherein the image captured by the flying object during hovering is an image captured with the imaging direction of the imaging device mounted on the flying object pointing toward the ground.
  • (Item 15) A system including: an image acquisition unit that acquires information of an image captured by a flying object; a display control unit that displays the image on a display unit; an input information acquisition unit that acquires input information including information about a position on the image, generated based on an operation on the image displayed on the display unit; and an output control unit that outputs, to the flying object, flight control information for flying to a position in real space corresponding to the position on the image.
  • (Item 16) A program for causing a computer to function as a control device for a flying object, the program causing the computer to function as: an image acquisition unit that acquires information of an image captured by the flying object; a display control unit that displays the image on a display unit; an input information acquisition unit that acquires input information including information about a position on the image, generated based on an operation on the image displayed on the display unit; and an output control unit that outputs, to the flying object, flight control information for flying to a position in real space corresponding to the position on the image.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Studio Devices (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

[Problem] To make it easy to set the flight path of a flight vehicle on site. [Solution] A method relating to the control of a flight vehicle, said method including: acquiring information about an image captured by an uncrewed flight vehicle 20; displaying the image on a touch panel 12; acquiring input information generated on the basis of an operation on the image displayed on the touch panel 12; and outputting to the uncrewed flight vehicle 20 information about flight control for flying to a position in actual space corresponding to a position on the image, the input information including information relating to the position on the image.

Description

Methods, systems and programs
This disclosure relates to methods, systems, and programs.
There is an increasing need for inspection and photography using unmanned aerial vehicles such as drones. Accordingly, the development of technology for easily operating unmanned aerial vehicles is underway. For example, Patent Document 1 discloses a technique for operating an unmanned flying object using a touch panel.
Japanese Unexamined Patent Publication No. 2019-85041
In order to capture images using a flying object at an inspection or construction management site, it is necessary to set an appropriate flight route on the spot.
This disclosure has been made in view of this background, and its object is to provide a method, a system, and a program capable of easily setting the flight path of a flying object in the field.
According to the present disclosure, there is provided a method relating to control of a flying object, the method including: acquiring information of an image captured by the flying object; displaying the image on a display unit; acquiring input information generated based on an operation on the image displayed on the display unit, the input information including information about a position on the image; and outputting, to the flying object, flight control information for flying to a position in real space corresponding to the position on the image.
Further, according to the present disclosure, there is provided a system relating to control of a flying object, the system including: an image acquisition unit that acquires information of an image captured by the flying object; a display control unit that displays the image on a display unit; an input information acquisition unit that acquires input information including information about a position on the image, generated based on an operation on the image displayed on the display unit; and an output control unit that outputs, to the flying object, flight control information for flying to a position in real space corresponding to the position on the image.
Further, according to the present disclosure, there is provided a program for causing a computer to function as a control device for a flying object, the program causing the computer to function as: an image acquisition unit that acquires information of an image captured by the flying object; a display control unit that displays the image on a display unit; an input information acquisition unit that acquires input information including information about a position on the image, generated based on an operation on the image displayed on the display unit; and an output control unit that outputs, to the flying object, flight control information for flying to a position in real space corresponding to the position on the image.
According to the present invention, it is possible to easily set the flight path of a flying object in the field.
Brief description of the drawings:
FIG. 1 is a diagram showing an outline of the system 1 according to an embodiment of the present disclosure.
FIG. 2 is a block diagram showing the configuration of the information processing terminal 10 according to the embodiment.
FIG. 3 is a block diagram showing an example of the functional configuration of the unmanned flying object 20 according to the embodiment.
FIG. 4 is a block diagram showing the functional configuration of the control unit 11 according to the embodiment.
FIG. 5 is a flowchart of a series of controls in the system 1 according to the embodiment.
FIG. 6 is a diagram showing an example of a first situation relating to the control method by the system 1 according to the embodiment.
FIG. 7 is a diagram showing a first example of a screen displayed on the touch panel 12 according to the embodiment.
FIG. 8 is a diagram showing a second example of a screen displayed on the touch panel 12 according to the embodiment.
FIG. 9 is a diagram showing a third example of a screen displayed on the touch panel 12 according to the embodiment.
FIG. 10 is a diagram showing a fourth example of a screen displayed on the touch panel 12 according to the embodiment.
FIG. 11 is a diagram showing a modification of the fourth screen example displayed on the touch panel 12 according to the embodiment.
FIG. 12 is a diagram showing a fifth example of a screen displayed on the touch panel 12 according to the embodiment.
FIG. 13 is a diagram showing a sixth example of a screen displayed on the touch panel 12 according to the embodiment.
FIG. 14 is a diagram showing an example of a second situation relating to the control method by the system 1 according to the embodiment.
FIG. 15 is a diagram showing a seventh example of a screen displayed on the touch panel 12 according to the embodiment.
FIG. 16 is an example of a control method by the system 1 according to a modification of the embodiment.
Preferred embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings. In the present specification and drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and duplicate description is omitted.
<System overview>
FIG. 1 is a diagram showing an outline of the system 1 according to an embodiment of the present disclosure. As shown in the figure, the system 1 includes an information processing terminal 10 and an unmanned flying object 20. The system 1 according to the present embodiment can be used, for example, for construction management and inspection of the building S1, which is an object photographed by the unmanned flying object 20. In such a system 1, a user U operates the touch panel of the information processing terminal 10 so as to trace the flight path of the unmanned flying object 20. Such an operation may be a continuous or intermittent operation such as a slide or a swipe. The unmanned flying object 20 can be controlled to fly at positions in real space estimated from the position information on the image obtained by such a tracing operation.
The information processing terminal 10 according to the present embodiment is implemented as a so-called tablet-type small computer. In other embodiments, the information processing terminal 10 may be realized by a portable information processing terminal such as a smartphone or a game console, or by a stationary information processing terminal such as a personal computer. The information processing terminal 10 may also be realized by a plurality of hardware devices with its functions distributed among them.
FIG. 2 is a block diagram showing the configuration of the information processing terminal 10 according to the present embodiment. As shown in the figure, the information processing terminal 10 includes a control unit 11 and a touch panel 12, which is an example of a display unit.
The processor 11a is an arithmetic unit that controls the operation of the control unit 11, controls the transmission and reception of data between the elements, and performs the processing necessary for program execution. In the present embodiment, the processor 11a is, for example, a CPU (Central Processing Unit), and performs each process by executing a program stored in the storage 11c (described later) and loaded into the memory 11b.
The memory 11b includes a main storage device composed of a volatile storage device such as a DRAM (Dynamic Random Access Memory), and an auxiliary storage device composed of a non-volatile storage device such as a flash memory or an HDD (Hard Disk Drive). The memory 11b is used as a work area of the processor 11a, and also stores the BIOS (Basic Input/Output System) executed when the control unit 11 is started, various setting information, and the like.
The storage 11c stores programs and information used for various processes. For example, when the user operates, via the information processing terminal 10, a flying object for capturing image information of the roof 101, the storage 11c may store a program for controlling the flight of that flying object.
The transmission/reception unit 11d connects the control unit 11 to a network such as the Internet, and may include a short-range communication interface such as Bluetooth (registered trademark) or BLE (Bluetooth Low Energy).
The input/output unit 11e is an interface to which input/output devices are connected; in the present embodiment, the touch panel 12 is connected to it.
The bus 11f transmits, for example, address signals, data signals, and various control signals among the connected processor 11a, memory 11b, storage 11c, transmission/reception unit 11d, and input/output unit 11e.
The touch panel 12 is an example of a display unit and includes a display surface on which acquired video and images are displayed. In the present embodiment, this display surface accepts information input through contact with the display surface, and is implemented by various technologies such as the resistive film method or the capacitance method.
An image captured by the unmanned flying object 20 can be displayed, for example, on the display surface of the touch panel 12. Buttons, objects, and the like for flight control of the unmanned flying object 20 and for control of the imaging device may also be displayed on the display surface. The user can enter input information for the images, buttons, and the like displayed on the display surface via the touch panel 12. The input information will be described in detail later; the operations for entering it include, for example, a touch (tap) operation on a button or object, and a tap, slide, or swipe operation for determining the flight path of the unmanned flying object 20.
 図3は、本実施形態に係る無人飛行体20の機能構成の一例を示すブロック図である。図3に示すように、一実施形態に係る無人飛行体20は、本体21において、送受信部22、フライトコントローラ23、バッテリ24、ESC25、モータ26、プロペラ27およびカメラ28を備える。なお、無人飛行体20は飛行体の一例である。飛行体の種類は特に限定されず、例えば、図3に示すようなマルチローター式のいわゆるドローンであってもよい。 FIG. 3 is a block diagram showing an example of the functional configuration of the unmanned aircraft 20 according to the present embodiment. As shown in FIG. 3, the unmanned vehicle 20 according to the embodiment includes a transmission / reception unit 22, a flight controller 23, a battery 24, an ESC 25, a motor 26, a propeller 27, and a camera 28 in the main body 21. The unmanned flying object 20 is an example of an flying object. The type of the flying object is not particularly limited, and may be, for example, a so-called multi-rotor type drone as shown in FIG.
 フライトコントローラ23は、プログラマブルプロセッサ(例えば、中央演算処理装置(CPU))などの1つ以上のプロセッサ23Aを有することができる。 The flight controller 23 can have one or more processors 23A such as a programmable processor (eg, central processing unit (CPU)).
 フライトコントローラ23は、メモリ23Bを有しており、メモリ23Bにアクセス可能である。メモリ23Bは、1つ以上のステップを行うためにフライトコントローラが実行可能であるロジック、コード、および/またはプログラム命令を記憶している。 The flight controller 23 has a memory 23B and can access the memory 23B. Memory 23B stores logic, code, and / or program instructions that the flight controller can execute to perform one or more steps.
 メモリ23Bは、例えば、SDカードやランダムアクセスメモリ(RAM)などの分離可能な媒体または外部の記憶装置を含んでいてもよい。センサ類23Cから取得したデータは、メモリ23Bに直接に伝達されかつ記憶されてもよい。例えば、カメラ28で撮影した静止画・動画データが内蔵メモリ又は外部メモリに記録される。 The memory 23B may include, for example, a separable medium such as an SD card or a random access memory (RAM) or an external storage device. The data acquired from the sensors 23C may be directly transmitted and stored in the memory 23B. For example, still image / moving image data taken by the camera 28 is recorded in the built-in memory or the external memory.
 フライトコントローラ23は、飛行体の状態を制御するように構成された制御モジュールを含んでいる。例えば、制御モジュールは、6自由度(並進運動x、y及びz、並びに回転運動θ、θ及びθ)を有する飛行体の空間的配置、速度、および/または加速度を調整するために飛行体の推進機構(モータ26等)をESC(Electric Speed Controller)25を介して制御する。制御モジュールは、カメラ28やセンサ類23C等のうち1つ以上を制御することができる。 The flight controller 23 includes a control module configured to control the state of the flying object. For example, the control module may adjust the spatial placement, velocity, and / or acceleration of an air vehicle with 6 degrees of freedom (translation x, y and z, and rotational motion θ x , θ y and θ z ). The propulsion mechanism (motor 26, etc.) of the flying object is controlled via the ESC (Electric Speed Controller) 25. The control module can control one or more of the camera 28, the sensors 23C, and the like.
 The flight controller 23 can communicate with a transmission/reception unit 22 configured to transmit and/or receive data from one or more external devices (for example, a terminal such as the information processing terminal 10, a display device, or another remote controller). The transmission/reception unit 22 may use, for example, one or more of a local area network (LAN), a wide area network (WAN), infrared, radio, WiFi, a point-to-point (P2P) network, a telecommunications network, cloud communication, and the like.
 The transmission/reception unit 22 can transmit and/or receive one or more of the data acquired by the camera 28 and the sensors 23C, processing results generated by the flight controller 23, predetermined control data, user commands from the information processing terminal 10 or a remote controller, and the like.
 The sensors 23C according to this embodiment may include an inertial sensor (acceleration sensor, gyro sensor), a GPS sensor, a proximity sensor (for example, lidar), or a vision/image sensor (for example, a camera).
 The battery 24 may be a known battery such as a lithium polymer battery. The power for driving the unmanned flying object 20 is not limited to electric power supplied from the battery 24 or the like; it may come, for example, from an internal combustion engine or other power source.
 The camera 28 is an example of an imaging device. The type of the camera 28 is not particularly limited and may be, for example, an ordinary digital camera, an omnidirectional camera, an infrared camera, or an image sensor such as a thermographic camera. The camera 28 may be connected to the main body 21 via a gimbal or the like (not shown) so as to be displaceable independently of the main body.
 FIG. 4 is a block diagram showing the functional configuration of the control unit 11 according to the present embodiment. As shown in FIG. 4, the control unit 11 includes an image acquisition unit 111, a display control unit 112, an input information acquisition unit 113, a flight control information generation unit 114, a camera control information generation unit 115, and an output control unit 116. Each of these functional units can be realized by the processor 11a reading a program stored in the storage 11c into the memory 11b and executing it.
 The image acquisition unit 111 has a function of acquiring information on images captured by the unmanned flying object 20. The image acquisition unit 111 acquires, as appropriate, images captured by the camera 28 mounted on the unmanned flying object 20. The acquired image may be a moving image obtained in real time or a still image captured at an arbitrary timing. The acquired image information is output to the display control unit 112.
 The display control unit 112 has a function of displaying the acquired image on the touch panel 12. The display control unit 112 also has a function of displaying, together with the image, information such as buttons, objects, and text for providing information to the user of the system 1 and for acquiring input information based on the user's operations. It may further have a function of causing the touch panel 12 to display content based on the input information obtained by the input information acquisition unit 113, described later.
 The input information acquisition unit 113 has a function of acquiring input information generated based on operations on the image displayed on the touch panel 12. The input information here includes, for example, information on a position on the image displayed on the touch panel 12. A position on the image is, for example, the position of a pixel constituting the image. That is, the input information includes information indicating at which position on the image the user performed an operation.
 The information on the position on the image includes, for example, information on a line segment consisting of a continuous set of positions on the image. Such line segment information is obtained when the user performs a tracing operation on the touch panel 12. For example, while the touch panel 12 displays an image of a space captured from above by the unmanned flying object 20, the user traces the positions along which they want the unmanned flying object 20 to fly. The trajectory obtained by this tracing can become the flight route of the unmanned flying object 20. The tracing operation need not be continuous; it may be intermittent. That is, even when the user's tracing is performed at a plurality of separate locations, a flight route can be formed based on the resulting trajectory, with the route corresponding to the interrupted portions of the trace interpolated as appropriate.
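 As a rough sketch of this step (the function names and the fixed step size below are assumptions for illustration, not part of the disclosure), one or more traced strokes delivered as pixel sequences could be merged into a single route, with the interrupted portions bridged by linear interpolation:

# Illustrative sketch: merge intermittent touch strokes into one pixel-space
# route, bridging lifted-finger gaps by linear interpolation.
def bridge(p: tuple[float, float], q: tuple[float, float], step: float = 5.0):
    """Yield points on the straight segment from p to q, spaced about `step` px."""
    dx, dy = q[0] - p[0], q[1] - p[1]
    n = max(1, int((dx * dx + dy * dy) ** 0.5 / step))
    for i in range(1, n):
        t = i / n
        yield (p[0] + t * dx, p[1] + t * dy)

def merge_strokes(strokes: list[list[tuple[float, float]]]) -> list[tuple[float, float]]:
    route: list[tuple[float, float]] = []
    for stroke in strokes:
        if route:  # a gap exists between the previous stroke and this one
            route.extend(bridge(route[-1], stroke[0]))
        route.extend(stroke)
    return route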
 The input information may also include information on an imaging target position, on the image, for the camera 28 mounted on the unmanned flying object 20. The imaging target position on the image is a position used to determine the imaging direction of the camera 28 during flight of the unmanned flying object 20. The user can designate at least one location in the image of the space displayed on the touch panel 12 as the imaging target position.
 The input information may also include information on the position (altitude) of the unmanned flying object 20 in the height direction in real space during flight under flight control. From this height-direction position, the (horizontal) real-space position of the unmanned flying object 20 during flight, and the imaging target position of the camera 28, the position of the camera 28 during flight can be determined.
 The input information may also include information on the speed of the unmanned flying object 20 during flight under flight control. Other information entered via the touch panel 12 or the like can likewise be acquired by the input information acquisition unit 113 as input information where appropriate.
 The flight control information generation unit 114 has a function of generating information for controlling the flight of the unmanned flying object 20 (flight control information) based on the input information. The flight control information includes, for example, information on the flight path of the unmanned flying object 20, the start and end points of that flight path, the orientation of the unmanned flying object 20 (about the pitch, roll, and yaw axes), its flight altitude, and its flight speed. Flight control here means controlling the flight of the unmanned flying object 20 in accordance with the flight control information generated based on the input information according to the present embodiment entered on the touch panel 12.
 The flight control information generation unit 114 calculates, for example, the real-space position corresponding to each position on the image included in the input information. The real-space position information is, for example, latitude and longitude. Such latitude and longitude can be calculated based on the imaging position information of the camera 28 (or of the unmanned flying object 20). Specifically, taking the pixel at the center of the captured image as a reference, the flight control information generation unit 114 can calculate the real-space position corresponding to each pixel of the image based on the angle of view of the camera 28 and the height-direction position of the unmanned flying object 20 at the time the image was captured while hovering.
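 For a camera pointing straight down, this pixel-to-position calculation can be sketched as follows (a minimal illustration under assumed names; the disclosure gives no concrete formulas). The ground footprint follows from the altitude and the horizontal angle of view, each pixel's offset from the image center is scaled to meters, and the metric offset is converted to latitude and longitude around the hovering position:

import math

# Illustrative sketch, assuming a nadir-pointing camera and a north-up image.
# hfov_deg is the horizontal angle of view; (lat0, lon0) and altitude_m give
# the hovering position at capture time. All names are assumptions.
def pixel_to_latlon(px, py, img_w, img_h, hfov_deg, altitude_m, lat0, lon0):
    ground_width = 2.0 * altitude_m * math.tan(math.radians(hfov_deg) / 2.0)
    m_per_px = ground_width / img_w
    east_m = (px - img_w / 2.0) * m_per_px   # pixels right of center lie east
    north_m = (img_h / 2.0 - py) * m_per_px  # pixel y grows downward, so smaller py lies north
    dlat = north_m / 111_320.0               # approx. meters per degree of latitude
    dlon = east_m / (111_320.0 * math.cos(math.radians(lat0)))
    return lat0 + dlat, lon0 + dlon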
 After calculating the real-space flight positions (flight path), the flight control information generation unit 114 determines the flight speed, orientation, and so on of the unmanned flying object 20 at those positions during flight, and generates them as flight control information. The in-flight speed may be determined, for example, by input based on a user operation. The in-flight orientation may be determined, for example, according to the orientation of the camera 28 decided by the camera control information generation unit 115 described later; alternatively, if the camera 28 is connected to the main body 21 via a gimbal or the like, the in-flight orientation may be held constant. Control of the behavior of the unmanned flying object 20 during such flight can be realized by known techniques.
 The camera control information generation unit 115 has a function of generating information for controlling the operation of the camera 28 (camera control information) based on the input information. The camera control information includes, for example, information on the orientation of the camera 28, imaging timing, imaging processing, and so on.
 For example, the camera control information generation unit 115 generates information for controlling the orientation (imaging direction) of the camera 28, during flight of the unmanned flying object 20 under the flight control information, based on the real-space position corresponding to the imaging target position. The information for controlling the orientation of the camera 28 includes, for example, information on the orientation of the camera 28 at each flight position, calculated from the imaging target position acquired by the input information acquisition unit 113 and the position of the unmanned flying object 20 in flight. At least the horizontal orientation of the camera 28 can be calculated from the relationship between the imaging target position and the flight position. When there are a plurality of imaging target positions, for example, the imaging target position corresponding to the flight position at a given timing may be determined separately. Further, only the horizontal component of the imaging target position may be specified; in that case, the angle of the camera 28 about its pitch axis can be adjusted as appropriate, as described later. Outside of flight control, the orientation of the camera 28 can be controlled as appropriate during flight, for example in response to user input on the touch panel 12. The imaging target position may also include a height-direction position in real space; if such a height-direction position is set, the angle of the camera 28 about its pitch axis can be determined automatically.
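 One way this orientation can be computed (again a sketch under assumed names rather than the disclosure's own formulas) is to take the camera yaw from the horizontal bearing between the flight position and the imaging target and, when a target height is given, the pitch from the vertical offset over the horizontal distance:

import math

# Illustrative sketch: yaw and pitch (degrees) so that a camera at the flight
# position faces the imaging target. Positions are east/north/up offsets in
# meters from a common origin; the names are assumptions for this example.
def camera_angles(flight_pos, target_pos):
    de = target_pos[0] - flight_pos[0]      # east offset
    dn = target_pos[1] - flight_pos[1]      # north offset
    du = target_pos[2] - flight_pos[2]      # up offset (negative: target below)
    yaw = math.degrees(math.atan2(de, dn))  # 0 deg = north, clockwise positive
    pitch = math.degrees(math.atan2(du, math.hypot(de, dn)))  # negative looks down
    return yaw, pitch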
 The output control unit 116 has a function of outputting the various information generated by the flight control information generation unit 114 and the camera control information generation unit 115 to the unmanned flying object 20. Based on the acquired flight control information and camera control information, the unmanned flying object 20 controls its flight and the operation of the camera 28.
 Next, an example of a method of controlling the unmanned flying object 20 using the system 1 according to the present embodiment will be described with reference to a flowchart. FIG. 5 is a flowchart of a series of controls in the system 1 according to the present embodiment.
 First, the image acquisition unit 111 acquires image information from the unmanned flying object 20 (step SQ101). At this time, the unmanned flying object 20 is hovering over the object to be imaged.
 FIG. 6 is a diagram showing an example of a first situation relating to the control method by the system 1 according to the present embodiment. As shown in FIG. 6, the unmanned flying object 20 is hovering at altitude H1 over the building S1, which is the object to be imaged. The imaging direction of the camera 28 is straight down. Although the altitude H1 is taken here as the altitude of the camera 28, it may instead be the altitude of the unmanned flying object 20.
 FIG. 7 is a diagram showing a first example of a screen displayed on the touch panel 12 according to the present embodiment. As shown in FIG. 7, a screen V1 of the touch panel 12 displays an information bar D11, a main screen D12, a sub screen D13, and a plurality of buttons and objects 101 to 109. The information bar D11 is an area mainly for presenting information about the touch panel 12 and the unmanned flying object 20; in the example shown in FIG. 7, it displays the radio signal status of the touch panel 12, the remaining battery level, and the current altitude of the unmanned flying object 20. The main screen D12 can display captured images obtained by the camera 28, and other information may be displayed superimposed on those images. The sub screen D13 is an area for displaying images captured by the camera 28; in the present embodiment, it can be used, for example, to supplementarily display the captured image used for determining the flight path. For example, tapping the sub screen D13 can start the setting of the flight path. Specific examples are described later.
 The buttons 101 to 107 and the objects 108 and 109 are as follows. The button 101 is used to land the unmanned flying object 20. The button 102 is, for example, for moving the unmanned flying object 20 to the start point of the set flight path. The button 103 is for starting control of the flight of the unmanned flying object 20 along that flight path. The button 104 is for performing imaging processing with the camera 28. The button 105 is for adjusting the altitude of the unmanned flying object 20. The button 106 is for adjusting the horizontal position of the unmanned flying object 20. The button 107 is for displaying and editing the set path. The object 108 is for adjusting the horizontal orientation (about the yaw axis) of the unmanned flying object 20. The object 109 is for adjusting the angle of the camera 28 about its pitch axis. The captured image displayed on the main screen D12 includes images of the building S1 (the imaging target), a plaza S2, and a road S3.
 Next, when the user taps the sub screen D13, processing for inputting the flight path starts (step SQ103). FIG. 8 is a diagram showing a second example of a screen displayed on the touch panel 12 according to the present embodiment. The screen shown in FIG. 8 is for inputting the flight path of the unmanned flying object 20. The screen V1 displays the captured image and buttons 201 to 204. The button 201 is for setting the flight path. The button 202 is for setting the center point (imaging target position) of the captured image. The button 203 is for setting the flight speed of the unmanned flying object 20. The button 204 is for saving the settings.
 When the button 201 is selected on the screen V1 shown in FIG. 8, points and lines corresponding to the flight path can be input on the screen V1. For example, the user operates the screen V1 with a finger, a stylus pen, or the like to designate positions on the image corresponding to the flight path. Such an operation can be an intuitive slide operation, like drawing with a pen.
 FIG. 9 is a diagram showing a third example of a screen displayed on the touch panel 12 according to the present embodiment. As shown in FIG. 9, a trajectory 210 can be drawn on the screen V1 by the user's operation. Points 211 indicating the start and end points of the trajectory 210 may be displayed in a manner different from the rest of the trajectory. These start and end points can serve, for example, as the start and goal points in the flight control of the unmanned flying object 20. The start and end points of the trajectory (line segment) may or may not coincide. When the start and end points of the drawn trajectory 210 are at different positions, one of them may be adjusted so as to coincide with the other. By making the start point and the end point coincide, the images (frames) captured during the flight of the unmanned flying object 20 match at the start and the end of the flight, which makes it possible to obtain more useful images. Either the start point or the end point may be movable along the trajectory 210, and the trajectory 210, once drawn, may be modifiable as appropriate.
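 A simple way to apply this adjustment (an illustrative sketch; the snap threshold and the names are assumptions) is to move the end of the traced route onto its start when the two are close, and otherwise append a closing point:

# Illustrative sketch: make the traced route closed so that the flight starts
# and ends at the same position. The 20-pixel threshold is an assumption.
def close_route(route: list[tuple[float, float]], snap_px: float = 20.0):
    (x0, y0), (x1, y1) = route[0], route[-1]
    if ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 <= snap_px:
        route[-1] = route[0]      # snap the end point onto the start point
    else:
        route.append(route[0])    # close the loop with an added segment
    return route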
 Next, when the user taps the button 202, processing for inputting the imaging target position starts (step SQ105). FIG. 10 is a diagram showing a fourth example of a screen displayed on the touch panel 12 according to the present embodiment. As shown in FIG. 10, when the user taps the button 202, an imaging target position (also called a center point) 212 can be selected on the image. This on-image imaging target position 212 can be adjusted, for example, by a tap or slide operation. The imaging target position 212 determined here corresponds at least to a horizontal position in real space. That is, during flight control, the camera 28 of the unmanned flying object 20 keeps facing, in the horizontal direction, the direction in which the real-space imaging target position corresponding to the on-image imaging target position 212 lies.
 In the example shown in FIG. 10, the imaging target position 212 is a single point, but the present technology is not limited to such an example. FIG. 11 is a diagram showing a modification of the fourth screen example displayed on the touch panel 12 according to the present embodiment. As shown in FIG. 11, like the imaging target position 213, the imaging target position may be selected not as a point but as a line segment or the like consisting of a plurality of points. Like the points 211 indicating the start and end of the trajectory 210, start and end points of the imaging target position 213 may be set as points 214. This also makes it possible to dynamically change the direction in which the camera 28 of the unmanned flying object 20 faces its imaging target during flight control.
 Next, when the user taps the button 203, processing for inputting the flight speed during flight under flight control of the unmanned flying object 20 starts (step SQ107). FIG. 12 is a diagram showing a fifth example of a screen displayed on the touch panel 12 according to the present embodiment. As shown in FIG. 12, when the user taps the button 203, the flight speed at which the unmanned flying object 20 flies along the flight path corresponding to the trajectory 210 can be selected. In the example shown in FIG. 12, the flight speed can be selected by choosing one of a low-speed button 215, a medium-speed button 216, and a high-speed button 217. The method of setting the flight speed is not limited to this example; for instance, the flight speed may also be specified numerically.
 Tapping the button 204 completes the setting of the flight path and of the imaging target position of the camera.
 Next, returning to the screen V1 as shown in FIG. 7, processing for inputting the flight altitude during flight under flight control of the unmanned flying object 20 starts upon the user's selection (step SQ109). FIG. 13 is a diagram showing a sixth example of a screen displayed on the touch panel 12 according to the present embodiment. On the screen V1 shown in FIG. 13, when the user taps the button 105 on the screen V1 shown in FIG. 7, a form 120 for setting the altitude during flight under flight control of the unmanned flying object 20 is displayed. By entering an altitude in the form 120, the altitude during flight under flight control is determined. The in-flight altitude may be fixed or variable during the flight; when it is variable, it may be set as appropriate via a form other than the form 120. The in-flight altitude may also be predetermined.
 The information obtained through the user's operations on the touch panel 12 in steps SQ103 to SQ109 can be output as input information to the flight control information generation unit 114 and the camera control information generation unit 115 as appropriate.
 Next, based on the obtained input information, the flight control information generation unit 114 generates flight control information for flying the real-space positions corresponding to the trajectory 210 (step SQ111). Likewise, based on the obtained input information, the camera control information generation unit 115 generates camera control information for the camera 28 to keep facing, and imaging, the real-space imaging target position corresponding to the on-image imaging target position 212 (step SQ113). These pieces of information are output to the unmanned flying object 20 by the output control unit 116.
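 Pulling the earlier sketches together (still illustrative; the record layout below is an assumption, not the disclosure's data format), steps SQ111 and SQ113 can be pictured as converting the traced pixel route and the on-image target with pixel_to_latlon from the sketch above, then packaging the results, together with the selected altitude and speed, into the records output to the vehicle:

from dataclasses import dataclass

# Illustrative records for steps SQ111/SQ113; the field names are assumptions.
@dataclass
class FlightControlInfo:
    waypoints: list[tuple[float, float]]  # (lat, lon) along the traced route
    altitude_m: float                     # in-flight altitude from form 120
    speed_mps: float                      # speed chosen via buttons 215-217

@dataclass
class CameraControlInfo:
    target_latlon: tuple[float, float]    # real-space imaging target position

def build_control_info(route_px, target_px, img_w, img_h, hfov_deg,
                       capture_alt_m, lat0, lon0, flight_alt_m, speed_mps):
    def to_ll(p):
        return pixel_to_latlon(p[0], p[1], img_w, img_h, hfov_deg,
                               capture_alt_m, lat0, lon0)
    return (FlightControlInfo([to_ll(p) for p in route_px], flight_alt_m, speed_mps),
            CameraControlInfo(to_ll(target_px)))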
 Next, when the user taps the button 102, the unmanned flying object 20 starts moving to the start point of the flight path based on the flight control information (steps SQ115 and SQ117). FIG. 14 is a diagram showing an example of a second situation relating to the control method by the system 1 according to the present embodiment. Here, in accordance with the flight control information, the unmanned flying object 20 flies one full circuit at height H2 so as to circle over the building S1. During this flight, the unmanned flying object 20 can control its in-flight attitude and the like in accordance with the camera control information so that the imaging target position C1 lies in the imaging direction of the camera 28.
 Next, when the user taps the button 103, the unmanned flying object 20 starts flying based on the flight control information (step SQ119). During this flight, the camera 28 acquires captured images while facing the direction of the imaging target position C1 (step SQ121). FIG. 15 is a diagram showing a seventh example of a screen displayed on the touch panel 12 according to the present embodiment. On the main screen D12 of the screen V1 shown in FIG. 15, images captured by the camera 28 of the unmanned flying object 20 in flight under flight control are displayed as appropriate. These captured images may be moving images or still images captured at predetermined intervals. The sub screen D13 may display the image captured during hovering, and the trajectory 130 corresponding to the flight path previously set by the user's input may be displayed superimposed on it. Further, an object 131 corresponding to the unmanned flying object 20 may be displayed on the sub screen D13 at the position corresponding to the current position of the unmanned flying object 20 along the flight path. This makes it possible to confirm, in real time (or when viewing the images after the flight), from which point on the flight path the image displayed on the main screen D12 was captured.
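 Placing the object 131 amounts to the inverse of the earlier pixel-to-position sketch (same assumptions: nadir camera, north-up hover image, hypothetical names): the vehicle's current latitude and longitude are projected back onto the hover image shown in the sub screen:

import math

# Illustrative inverse of pixel_to_latlon above: project the current vehicle
# lat/lon back onto the hover image to place the position marker (object 131).
def latlon_to_pixel(lat, lon, img_w, img_h, hfov_deg, altitude_m, lat0, lon0):
    ground_width = 2.0 * altitude_m * math.tan(math.radians(hfov_deg) / 2.0)
    px_per_m = img_w / ground_width
    north_m = (lat - lat0) * 111_320.0
    east_m = (lon - lon0) * 111_320.0 * math.cos(math.radians(lat0))
    return (img_w / 2.0 + east_m * px_per_m,
            img_h / 2.0 - north_m * px_per_m)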
 An example of a method of controlling the unmanned flying object 20 by the system 1 according to the present embodiment has been described above. Although the above embodiment describes an example of a control method in an actual flight of the unmanned flying object 20, the present technology is not limited to such an example. For example, the present technology can be used for simulations in which a virtual drone is flown in a VR (Virtual Reality) or AR (Augmented Reality) space. By simulating the flight behavior of the unmanned flying object 20 under flight control in VR or AR, the imaging direction of the camera 28 and the like when the unmanned flying object 20 is actually flown can be confirmed in advance. Known techniques can be used for simulation in VR or AR spaces.
 As described above, according to the system 1 of the present embodiment, position information corresponding to a flight path can be entered intuitively on the user interface of the touch panel 12, and simply by entering such position information on the screen of the touch panel 12, the unmanned flying object 20 can be made to fly the desired flight path. This makes it easy to set the flight path of a flying object in the field. In addition, by setting on the touch panel 12 an imaging target position for determining the orientation of the camera, the orientation of the camera during flight can also be determined easily. This facilitates imaging of an object such as a building using an unmanned flying object. Once set, the flight control information and camera control information may be saved in the storage 11c or the like; this makes it possible to fly the unmanned flying object repeatedly under the same conditions, for inspections and the like, and to image the target object from the same positions and angles.
 In the above embodiment, the image captured by the unmanned flying object 20 while hovering is an image obtained with the imaging direction of the camera 28 mounted on the unmanned flying object 20 pointing toward the ground, but the present technology is not limited to such an example. FIG. 16 shows an example of a control method by the system 1 according to a modification of the present embodiment. As shown in FIG. 16, in this system 1, the imaging direction of the camera 28 while the unmanned flying object 20 is hovering may be horizontal with respect to a structure S4. In this case, the unmanned flying object 20 may be controlled so that the camera 28 images the structure S4 while the unmanned flying object 20 flies on a plane parallel to the height direction, with the height H3 as a reference. The imaging direction of the camera 28 in this case may be determined, for example, by taking the position at height H3 on the structure S4 as the imaging target position. In this way, the system 1 according to the present embodiment can be applied regardless of the positional relationship between the unmanned flying object 20 and the structure to be imaged.
 Preferred embodiments of the present disclosure have been described above in detail with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to these examples. It is clear that a person with ordinary knowledge in the technical field of the present disclosure could conceive of various changes or modifications within the scope of the technical ideas set forth in the claims, and these are naturally understood to belong to the technical scope of the present disclosure.
 The devices described in this specification may be realized as a single device, or some or all of them may be realized by a plurality of devices connected over a network. For example, the control unit and the storage of the information processing terminal 10 may be realized by different servers connected to each other over a network.
 The series of processes performed by the devices described in this specification may be realized using software, hardware, or a combination of software and hardware. A computer program for realizing each function of the information processing terminal 10 according to the present embodiment can be created and implemented on a PC or the like. A computer-readable recording medium storing such a computer program can also be provided. The recording medium is, for example, a magnetic disk, an optical disc, a magneto-optical disc, or a flash memory. The above computer program may also be distributed, for example, via a network without using a recording medium.
 The processes described in this specification with reference to flowcharts need not necessarily be executed in the order shown. Some processing steps may be executed in parallel, additional processing steps may be adopted, and some processing steps may be omitted.
 The effects described in this specification are merely explanatory or illustrative and are not limiting. That is, the technology according to the present disclosure may achieve, in addition to or instead of the above effects, other effects that are apparent to those skilled in the art from the description of this specification.
 The following configurations also belong to the technical scope of the present disclosure.
(Item 1)
A method relating to control of a flying object, the method comprising:
acquiring information of an image captured by the flying object;
displaying the image on a display unit;
acquiring input information generated based on an operation on the image displayed on the display unit, the input information including information on a position on the image; and
outputting, to the flying object, flight control information for flying a position in real space corresponding to the position on the image.
(Item 2)
The method according to item 1, wherein the input information includes information of a line segment consisting of a continuous set of positions on the image, and flight control information for flying a route in real space corresponding to the line segment on the image is output to the flying object.
(Item 3)
The method according to item 2, wherein the information of the line segment includes information on a start point and an end point of the line segment, and, when the position of the end point acquired by the operation on the image differs from the position of the start point, the position of the end point is adjusted so as to coincide with the position of the start point.
(Item 4)
The method according to any one of items 1 to 3, wherein the input information includes information on an imaging target position, on the image, of an imaging device mounted on the flying object, the method further comprising outputting, to the flying object, information for controlling the orientation of the imaging device, during flight of the flying object flying based on the flight control information, based on the imaging target position in real space corresponding to the imaging target position.
(Item 5)
The method according to item 4, wherein a plurality of imaging target positions are designated on the image, or the imaging target position is designated continuously on the image.
(Item 6)
The method according to item 4 or 5, wherein the imaging target position includes a position in the height direction in real space.
(Item 7)
The method according to any one of items 1 to 6, wherein information on the position of the flying object in the height direction at the time the image was captured is acquired, and the position in real space corresponding to the position on the image is determined based on the position of the flying object in the height direction at that time.
(Item 8)
The method according to any one of items 1 to 7, wherein the input information includes information on the position of the flying object in the height direction in real space during flight under the flight control.
(Item 9)
The method according to any one of items 1 to 8, further comprising displaying, on the display unit, an image captured by an imaging device mounted on the flying object flying based on the flight control.
(Item 10)
The method according to any one of items 1 to 9, further comprising displaying, in another area of the display unit, the image previously displayed on the display unit, during flight of the flying object flying based on the flight control.
(Item 11)
The method according to item 10, further comprising displaying, in the other area, an object corresponding to the flying object, at the position on the image in the other area corresponding to the flight position in real space of the flying object flying based on the flight control.
(Item 12)
The method according to any one of items 1 to 11, further comprising displaying, on the display unit, a result of a simulation of the flight behavior of the flying object under the flight control.
(Item 13)
The method according to any one of items 1 to 12, wherein the flight control information includes information relating to the speed at which the flying object flies, and the operation on the image displayed on the display unit includes an operation for determining the speed.
(Item 14)
The method according to any one of items 1 to 13, wherein the image captured by the flying object while hovering is an image obtained with the imaging direction of an imaging device mounted on the flying object pointing toward the ground.
(Item 15)
A system relating to control of a flying object, the system comprising:
an image acquisition unit that acquires information of an image captured by the flying object;
a display control unit that displays the image on a display unit;
an input information acquisition unit that acquires input information generated based on an operation on the image displayed on the display unit, the input information including information on a position on the image; and
an output control unit that outputs, to the flying object, flight control information for flying a position in real space corresponding to the position on the image.
(Item 16)
A program for causing a computer to function as a control device for a flying object, the program causing the computer to function as:
an image acquisition unit that acquires information of an image captured by the flying object;
a display control unit that displays the image on a display unit;
an input information acquisition unit that acquires input information generated based on an operation on the image displayed on the display unit, the input information including information on a position on the image; and
an output control unit that outputs, to the flying object, flight control information for flying a position in real space corresponding to the position on the image.
1 System
10 Information processing terminal
11 Control unit
12 Touch panel
20 Unmanned flying object
28 Camera
111 Image acquisition unit
112 Display control unit
113 Input information acquisition unit
114 Flight control information generation unit
115 Camera control information generation unit

Claims (16)

1. A method relating to control of a flying object, the method comprising:
acquiring information of an image captured by the flying object;
displaying the image on a display unit;
acquiring input information generated based on an operation on the image displayed on the display unit, the input information including information on a position on the image; and
outputting, to the flying object, flight control information for flying a position in real space corresponding to the position on the image.
2. The method according to claim 1, wherein the input information includes information of a line segment consisting of a continuous set of positions on the image, and flight control information for flying a route in real space corresponding to the line segment on the image is output to the flying object.
3. The method according to claim 2, wherein the information of the line segment includes information on a start point and an end point of the line segment, and, when the position of the end point acquired by the operation on the image differs from the position of the start point, the position of the end point is adjusted so as to coincide with the position of the start point.
4. The method according to any one of claims 1 to 3, wherein the input information includes information on an imaging target position, on the image, of an imaging device mounted on the flying object, the method further comprising outputting, to the flying object, information for controlling the orientation of the imaging device, during flight of the flying object flying based on the flight control information, based on the imaging target position in real space corresponding to the imaging target position.
5. The method according to claim 4, wherein a plurality of imaging target positions are designated on the image, or the imaging target position is designated continuously on the image.
6. The method according to claim 4 or 5, wherein the imaging target position includes a position in the height direction in real space.
7. The method according to any one of claims 1 to 6, wherein information on the position of the flying object in the height direction at the time the image was captured is acquired, and the position in real space corresponding to the position on the image is determined based on the position of the flying object in the height direction at that time.
8. The method according to any one of claims 1 to 7, wherein the input information includes information on the position of the flying object in the height direction in real space during flight under the flight control.
9. The method according to any one of claims 1 to 8, further comprising displaying, on the display unit, an image captured by an imaging device mounted on the flying object flying based on the flight control.
10. The method according to any one of claims 1 to 9, further comprising displaying, in another area of the display unit, the image previously displayed on the display unit, during flight of the flying object flying based on the flight control.
11. The method according to claim 10, further comprising displaying, in the other area, an object corresponding to the flying object, at the position on the image in the other area corresponding to the flight position in real space of the flying object flying based on the flight control.
12. The method according to any one of claims 1 to 11, further comprising displaying, on the display unit, a result of a simulation of the flight behavior of the flying object under the flight control.
13. The method according to any one of claims 1 to 12, wherein the flight control information includes information relating to the speed at which the flying object flies, and the operation on the image displayed on the display unit includes an operation for determining the speed.
14. The method according to any one of claims 1 to 13, wherein the image captured by the flying object while hovering is an image obtained with the imaging direction of an imaging device mounted on the flying object pointing toward the ground.
15. A system relating to control of a flying object, the system comprising:
an image acquisition unit that acquires information of an image captured by the flying object;
a display control unit that displays the image on a display unit;
an input information acquisition unit that acquires input information generated based on an operation on the image displayed on the display unit, the input information including information on a position on the image; and
an output control unit that outputs, to the flying object, flight control information for flying a position in real space corresponding to the position on the image.
16. A program for causing a computer to function as a control device for a flying object, the program causing the computer to function as:
an image acquisition unit that acquires information of an image captured by the flying object;
a display control unit that displays the image on a display unit;
an input information acquisition unit that acquires input information generated based on an operation on the image displayed on the display unit, the input information including information on a position on the image; and
an output control unit that outputs, to the flying object, flight control information for flying a position in real space corresponding to the position on the image.
PCT/JP2021/033465 2020-09-30 2021-09-13 Method, system, and program WO2022070851A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022553765A JPWO2022070851A1 (en) 2020-09-30 2021-09-13

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-166617 2020-09-30
JP2020166617 2020-09-30

Publications (1)

Publication Number Publication Date
WO2022070851A1 (en)

Family

ID=80951365

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/033465 WO2022070851A1 (en) 2020-09-30 2021-09-13 Method, system, and program

Country Status (2)

Country Link
JP (1) JPWO2022070851A1 (en)
WO (1) WO2022070851A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170031355A1 (en) * 2015-07-29 2017-02-02 Lg Electronics Inc. Mobile terminal and method for controlling the same
JP2017509919A (en) * 2014-09-30 2017-04-06 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Method for operating unmanned aerial vehicle and unmanned aerial vehicle
JP2017076302A (en) * 2015-10-16 2017-04-20 株式会社プロドローン Method for controlling small sized unmanned airplane
US20180157252A1 (en) * 2016-12-05 2018-06-07 Samsung Electronics Co., Ltd. Electronic device and method of controlling electronic device
US20190139420A1 (en) * 2017-11-08 2019-05-09 Sikorsky Aircraft Corporation, A Lockheed Martin Company Aircraft route systems
JP2019178998A (en) * 2018-03-30 2019-10-17 大和ハウス工業株式会社 Position identification system
JP2020003428A (en) * 2018-06-29 2020-01-09 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Information processing device, flight path generating method, program, and recording medium
JP2020149255A (en) * 2019-03-12 2020-09-17 Terra Drone株式会社 Flight route generation device, flight route generation method and program thereof, and structure inspection method


Also Published As

Publication number Publication date
JPWO2022070851A1 (en) 2022-04-07

Similar Documents

Publication Publication Date Title
US11644832B2 (en) User interaction paradigms for a flying digital assistant
KR102236339B1 (en) Systems and methods for controlling images captured by an imaging device
JP6228679B2 (en) Gimbal and gimbal simulation system
WO2020143677A1 (en) Flight control method and flight control system
CN111694376B (en) Flight simulation method and device, electronic equipment and unmanned aerial vehicle
WO2021199449A1 (en) Position calculation method and information processing system
WO2021251441A1 (en) Method, system, and program
JP2024075613A (en) Information processing method, information processing device, and program
JP6966810B2 (en) Management server and management system, display information generation method, program
US20220187828A1 (en) Information processing device, information processing method, and program
US20230359198A1 (en) Unmanned aerial vehicle, control method thereof, and storage medium
WO2022070851A1 (en) Method, system, and program
WO2020042186A1 (en) Control method for movable platform, movable platform, terminal device and system
WO2022188151A1 (en) Image photographing method, control apparatus, movable platform, and computer storage medium
KR20190128425A (en) Method for controling unmanned moving object based on cylindrical coordinate system and recording medium storing program for executing the same, and computer prograom stored in recording medium for executing the same
WO2021064982A1 (en) Information processing device and information processing method
WO2022113482A1 (en) Information processing device, method, and program
JP2018014608A (en) Controller, imaging device, mobile, control method and program
EP3518063A1 (en) Combined video display and gimbal control
JP2023083072A (en) Method, system and program
JP2021012612A (en) Information processor, information processing system, and control method and program for same
KR102542181B1 (en) Method and apparatus for controlling unmanned air vehicle for generating 360 degree virtual reality image
CN117492381B (en) Robot collaborative pointing simulation visualization method, system, equipment and storage medium
US20240013460A1 (en) Information processing apparatus, information processing method, program, and information processing system
WO2020262222A1 (en) Control system for flying vehicle

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 21875153

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022553765

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 21875153

Country of ref document: EP

Kind code of ref document: A1