WO2021130980A1 - Aircraft flight path display method and information processing device - Google Patents

Aircraft flight path display method and information processing device

Info

Publication number
WO2021130980A1
Authority
WO
WIPO (PCT)
Prior art keywords
flight path
information
flight
display method
image
Prior art date
Application number
PCT/JP2019/051213
Other languages
French (fr)
Japanese (ja)
Inventor
西本 晋也
Original Assignee
株式会社センシンロボティクス
Priority date
Filing date
Publication date
Application filed by 株式会社センシンロボティクス
Priority to JP2020511544A (patent JP6730764B1)
Priority to PCT/JP2019/051213 (WO2021130980A1)
Priority to JP2020112346A (JP2021104802A)
Publication of WO2021130980A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00 Constructional aspects of UAVs
    • B64U20/80 Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U20/83 Electronic components structurally integrated with aircraft elements, e.g. circuit boards carrying loads
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64F GROUND OR AIRCRAFT-CARRIER-DECK INSTALLATIONS SPECIALLY ADAPTED FOR USE IN CONNECTION WITH AIRCRAFT; DESIGNING, MANUFACTURING, ASSEMBLING, CLEANING, MAINTAINING OR REPAIRING AIRCRAFT, NOT OTHERWISE PROVIDED FOR; HANDLING, TRANSPORTING, TESTING OR INSPECTING AIRCRAFT COMPONENTS, NOT OTHERWISE PROVIDED FOR
    • B64F1/00 Ground or aircraft-carrier-deck installations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00 Type of UAV
    • B64U10/10 Rotorcrafts
    • B64U10/13 Flying platforms

Definitions

  • FIG. 7 illustrates a flowchart of the flight path display method according to the present embodiment, and FIGS. 8-13 are examples for explaining the generation of a composite image for displaying the flight path according to the embodiment of the present invention.
  • First, the information processing device acquires the captured image S captured by the shooting unit 26 of the user terminal and the shooting state information acquired by the shooting state information acquisition unit 27 (SQ101).
  • Next, the information processing device draws the flight path in the VR space based on the flight path information generated by the management server 1 or the user terminal 2, and acquires the flight path image H by photographing the drawn flight path with a virtual camera in the VR space based on the shooting state information (SQ102).
  • Next, the composite image generation unit 150 of the information processing device superimposes the flight path image H on the captured image S to generate the composite image G (SQ103), and the composite image G is displayed on the screen of the user terminal 2 (SQ104).
  • Next, the information processing device determines whether the shooting conditions have changed (SQ105). If they have, it starts again from acquiring the captured image and the shooting state information (SQ101). If the shooting conditions have not changed, it determines whether shooting has ended (SQ106); if not, it likewise returns to SQ101 and acquires the captured image again, though at this point the shooting state information need not be re-acquired.
  • The operations in this flow may be performed continuously at a predetermined sampling rate; a sketch of the loop follows.
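  • As a rough illustration of this SQ101-SQ106 loop (the patent describes no code; the terminal, renderer, and compositor interfaces below are hypothetical), the re-rendering logic might be structured as follows in Python:

```python
import time

SAMPLING_INTERVAL_S = 1 / 30  # assumed value; the patent only says "predetermined sampling rate"

def display_loop(terminal, renderer, compositor):
    frame = terminal.capture()                 # SQ101: captured image S
    state = terminal.shooting_state()          # SQ101: shooting state information
    while True:
        route_image = renderer.render(state)   # SQ102: flight path image H from the VR space
        composite = compositor.overlay(frame, route_image)  # SQ103: composite image G
        terminal.show(composite)               # SQ104: display on the user terminal
        if terminal.conditions_changed():      # SQ105: shooting conditions changed?
            frame = terminal.capture()         # back to SQ101 (image and state)
            state = terminal.shooting_state()
            continue
        if terminal.shooting_finished():       # SQ106: shooting ended?
            break
        frame = terminal.capture()             # SQ101 again; the state may be reused here
        time.sleep(SAMPLING_INTERVAL_S)
```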
  • In this way, a flight path and the like are displayed over the image captured by the user terminal 2, functioning as so-called AR (augmented reality). When an imaging object exists, for example when the imaging object is being photographed, a composite image in which the flight path is superimposed on the acquired still image or moving image may be obtained.
  • <Composite image G example 1> As illustrated in FIG. 8, the flight path image H includes, for example, the flight path FR, waypoints WP, information related to them (for example, waypoint numbers), and a virtual flying object VD. Then, as illustrated in FIG. 9, the captured image S and the flight path image H are combined to obtain the composite image G. As a result, the flight path FR and the like within the shooting range of the captured image S are virtually superimposed on real-world space, making it easy to confirm the suitability of the flight path FR.
  • Furthermore, by drawing the virtual flying object VD in the VR space using the shooting date/time information as well, it becomes easy to confirm at which position the flying object 4 will be flying at a time specified by the user.
  • The portion already flown (the solid line portion of the flight path FR in FIGS. 8 and 9) and the portion still to be flown (the dotted line portion of the flight path FR in FIGS. 8 and 9) may also be drawn so that they are easy to distinguish, for example by changing the shape or thickness of the lines or by using different colors; a sketch of this follows.
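  • Purely as an illustration of one such rendering choice (not taken from the patent), the sketch below draws the flown portion as a solid line and the planned portion as a dashed line on a transparent overlay, splitting the projected waypoint polyline at the aircraft's current position; Pillow is an assumed library choice:

```python
from PIL import Image, ImageDraw

def draw_route(size, pixels, current_idx):
    """Draw the flight path FR on a transparent layer: solid up to the
    current waypoint (already flown), dashed afterwards (planned)."""
    layer = Image.new("RGBA", size, (0, 0, 0, 0))
    d = ImageDraw.Draw(layer)
    d.line(pixels[: current_idx + 1], fill=(0, 160, 255, 255), width=4)  # flown: solid
    for a, b in zip(pixels[current_idx:], pixels[current_idx + 1:]):     # planned: dashed
        for i in range(0, 10, 2):                                        # 5 dashes per segment
            t0, t1 = i / 10, (i + 1) / 10
            p = (a[0] + (b[0] - a[0]) * t0, a[1] + (b[1] - a[1]) * t0)
            q = (a[0] + (b[0] - a[0]) * t1, a[1] + (b[1] - a[1]) * t1)
            d.line([p, q], fill=(0, 160, 255, 255), width=4)
    return layer
```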
  • <Composite image G example 2> As illustrated in FIG. 10, a captured image S including an imaging object T may be acquired. In this case, as illustrated in FIG. 11, the positional relationship between the imaging object T and the flight path FR can be confirmed visually, making it easy to judge the suitability of the flight path FR.
  • <Composite image G example 3> As illustrated in FIG. 12, a captured image S including the imaging object T and the flying object 4 may be acquired. In this case, as illustrated in FIG. 13, the positional relationship between the flying object 4 and the flight path FR can be confirmed visually, making it easy to judge the suitability of the flight path FR from the present time onward. Furthermore, as illustrated in FIG. 14, a display area W showing the video acquired from the flying object 4 may be provided alongside the composite image G, so that the suitability of the flight path can be judged together with the suitability of the imaging. In FIG. 14 the composite image G is displayed larger than the display area W, but the display area W may instead be displayed larger than the composite image G, and it may be possible to switch between the two layouts.
  • Further, a flight path image H including a virtual imaging object VT may be acquired. In this case, the virtual imaging object VT and the actual imaging object T can be superimposed and compared, making it easy to notice, for example, that an unexpected structure O is present on the imaging object T.
  • The flight path FR may also be edited while the composite image G is displayed on the user terminal 2. This makes it easy to set a more appropriate flight path FR while checking the path against the composite image G.
  • In the present embodiment, the imaging object is illustrated as a steel tower, but it is not limited to this; any object that the camera 42 can photograph may be used, for example a structure such as a high-rise apartment building, a house, a chimney, an antenna tower, a lighthouse, a windmill, a tree, or a Kannon statue, a living creature such as a person or an animal, or smoke from a fire. Moreover, the imaging object is not limited to something that actually exists; an arbitrary three-dimensional model, such as a building planned for construction, can also be placed.
  • In the present embodiment, the user terminal 2 is exemplified by a mobile communication terminal such as a smartphone, but it is not limited to this and may be, for example, a wearable device having an AR function (such as AR glasses).

Abstract

[Problem] To provide a flight path display method and an information processing device with which the suitability of a flight path of an aircraft can be confirmed by comparing the actual conditions of the flight range against the flight path. [Solution] The present invention provides a flight path display method for displaying a flight path of an aircraft on a user terminal, the method including: a step of acquiring shooting state information including position information, shooting angle-of-view information, shooting angle information, and shooting direction information of the user terminal; a step of generating a flight path image including the flight path depicted in a virtual space on the basis of flight path information regarding the flight path; and a step of generating a composite image obtained by superimposing the flight path image on a captured image acquired on the basis of the shooting state information.

Description

Flight path display method and information processing device for an aircraft

The present invention relates to a flight path display method for an aircraft and to an information processing device.

In recent years, flying objects such as drones and unmanned aerial vehicles (UAVs) (hereinafter collectively referred to as "aircraft") have begun to be used in industry. Against this background, Patent Document 1 discloses a flight path display method that displays a flight path for purposes such as inspection.

[Patent Document 1] Japanese Unexamined Patent Publication No. 2015-058758
However, although the technique disclosed in Patent Document 1 shows the flight path R from a bird's-eye view on a map M, it does not allow the user to confirm, from a three-dimensional viewpoint, what flight path the aircraft will fly. In particular, when the object of inspection or the like is a three-dimensional object, the inability to confirm from a three-dimensional viewpoint what path the aircraft will fly around that object makes it difficult to judge the appropriateness of the flight path.

Furthermore, it was possible to check a flight path either by acquiring images of the flight range from various viewpoints in advance and comparing them against the path, or by actually flying the aircraft on site. In the former case, however, the flight path is not superimposed on real space, so the check lacks accuracy and it is difficult to account for changes that occur after the images were acquired. In the latter case, there are the user's labor of flying on site and the risk of flying while the suitability of the flight path is still unconfirmed.

The present invention has been made in view of this background, and its purpose is to provide a flight path display method and an information processing device with which the suitability of a flight path can be confirmed by comparing the actual conditions of the flight range against the flight path of the aircraft.

The main invention of the present invention for solving the above problems is a flight path display method for displaying a flight path of an aircraft on a user terminal, the method comprising: a step of acquiring shooting state information including position information, shooting angle-of-view information, shooting angle information, and shooting direction information of the user terminal; a step of generating a flight path image including the flight path drawn in a virtual space based on flight path information about the flight path; and a step of generating a composite image obtained by superimposing the flight path image on a captured image acquired based on the shooting state information.

According to the present invention, it is possible to provide a flight path display method and a server with which the suitability of a flight path can be confirmed by comparing the actual conditions of the flight range against the flight path of the aircraft.
FIG. 1 shows the configuration of a management system according to an embodiment of the present invention. FIG. 2 is a block diagram showing the hardware configuration of the management server of FIG. 1. FIG. 3 is a block diagram showing the hardware configuration of the user terminal of FIG. 1. FIG. 4 is a block diagram showing the hardware configuration of the aircraft of FIG. 1. FIG. 5 is a block diagram showing the functions of the management server of FIG. 1. FIG. 6 is a block diagram showing the structure of the parameter information storage unit of FIG. 5. FIG. 7 is a flowchart of a flight path display method according to an embodiment of the present invention. FIG. 8 shows an example of a captured image and a flight path image according to an embodiment of the present invention. FIG. 9 shows an example of a composite image G according to an embodiment of the present invention. FIG. 10 shows an example of a captured image and a flight path image according to an embodiment of the present invention. FIG. 11 shows an example of a composite image G according to an embodiment of the present invention. FIG. 12 shows an example of a captured image and a flight path image according to an embodiment of the present invention. FIG. 13 shows an example of a composite image G according to an embodiment of the present invention. FIG. 14 shows an example of a display screen of a user terminal according to an embodiment of the present invention. FIG. 15 shows an example of a captured image and a flight path image according to an embodiment of the present invention. FIG. 16 shows an example of a composite image G according to an embodiment of the present invention.
The contents of the embodiments of the present invention will be listed and described. A flight path display method and an information processing device according to embodiments of the present invention have the following configurations.

[Item 1]
A flight path display method for displaying a flight path of an aircraft on a user terminal, comprising:
a step of acquiring shooting state information including position information, shooting angle-of-view information, shooting angle information, and shooting direction information of the user terminal;
a step of generating a flight path image including the flight path drawn in a virtual space based on flight path information about the flight path; and
a step of generating a composite image obtained by superimposing the flight path image on a captured image acquired based on the shooting state information.

[Item 2]
The flight path display method according to Item 1, wherein the image showing the flight path further includes information about waypoints.

[Item 3]
The flight path display method according to Item 1 or 2, wherein the image showing the flight path further includes a three-dimensional model of a virtual aircraft.

[Item 4]
The flight path display method according to any one of Items 1 to 3, wherein the image showing the flight path further includes a three-dimensional model of a virtual imaging object.

[Item 5]
The flight path display method according to any one of Items 1 to 4, wherein the user terminal is a wearable device having an AR function.

[Item 6]
The flight path display method according to any one of Items 1 to 5, further comprising a step of editing the flight path while displaying the composite image.

[Item 7]
The flight path display method according to any one of Items 1 to 6, further comprising a step of displaying video acquired from the aircraft.

[Item 8]
The flight path display method according to Item 7, wherein the display can be switched so that the video acquired from the aircraft is shown larger than the composite image.

[Item 9]
An information processing device for displaying a flight path of an aircraft on a user terminal, comprising:
a shooting state information acquisition unit that acquires shooting state information including position information, shooting angle-of-view information, shooting angle information, and shooting direction information of the user terminal;
a flight path image generation unit that generates a flight path image including the flight path drawn in a virtual space based on flight path information about the flight path; and
a composite image generation unit that generates a composite image obtained by superimposing the flight path image on a captured image acquired based on the shooting state information.
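Read as software, the device of Item 9 above decomposes into three cooperating components. The following skeletal Python sketch (all class and method names are hypothetical; the patent defines no API) shows one way the units could be wired together:

```python
class FlightPathDisplayDevice:
    """Skeleton of the Item 9 information processing device (illustrative only)."""

    def __init__(self, acquisition_unit, route_renderer, compositor):
        self.acquisition_unit = acquisition_unit  # shooting state information acquisition unit
        self.route_renderer = route_renderer      # flight path image generation unit
        self.compositor = compositor              # composite image generation unit

    def composite_frame(self, captured_image, flight_path_info):
        # Acquire position, angle of view, angle, and direction of the user terminal.
        state = self.acquisition_unit.acquire()
        # Draw the flight path in a virtual space and photograph it with a
        # virtual camera matching the shooting state.
        route_image = self.route_renderer.render(flight_path_info, state)
        # Superimpose the flight path image on the captured image.
        return self.compositor.overlay(captured_image, route_image)
```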
<Details of the embodiments>

Hereinafter, embodiments of a flight path display method for an aircraft and an information processing device according to the present invention will be described. In the accompanying drawings, the same or similar elements are given the same or similar reference signs and names, and duplicate descriptions of the same or similar elements may be omitted in the description of each embodiment. The features shown in each embodiment can also be applied to other embodiments as long as they do not contradict one another.
<Configuration>

As shown in FIG. 1, the management system according to the present embodiment includes a management server 1, one or more user terminals 2, one or more aircraft 4, and one or more aircraft storage devices 5. The management server 1, the user terminals 2, the aircraft 4, and the aircraft storage devices 5 are communicably connected to one another via a network. The illustrated configuration is an example and is not limiting; for example, a configuration without the aircraft storage device 5, in which the aircraft is carried by the user, may also be used.
<Management server 1>

FIG. 2 is a diagram showing the hardware configuration of the management server 1. The illustrated configuration is an example, and other configurations are possible.

As illustrated, the management server 1 is connected to a plurality of user terminals 2, the aircraft 4, and the aircraft storage device 5, and constitutes a part of this system. The management server 1 may be a general-purpose computer such as a workstation or a personal computer, or may be realized logically by cloud computing.

The management server 1 includes at least a processor 10, a memory 11, a storage 12, a transmission/reception unit 13, and an input/output unit 14, which are electrically connected to one another through a bus 15.

The processor 10 is an arithmetic device that controls the operation of the entire management server 1, controls the transmission and reception of data between the elements, and performs the information processing necessary for application execution and authentication processing. For example, the processor 10 is a CPU (Central Processing Unit) and/or a GPU (Graphics Processing Unit), and carries out each information process by executing programs for this system that are stored in the storage 12 and loaded into the memory 11.

The memory 11 includes main storage composed of a volatile storage device such as DRAM (Dynamic Random Access Memory) and auxiliary storage composed of a non-volatile storage device such as flash memory or an HDD (Hard Disk Drive). The memory 11 is used as a work area of the processor 10 and stores the BIOS (Basic Input/Output System) executed when the management server 1 starts up, various setting information, and the like.

The storage 12 stores various programs such as application programs. A database storing the data used in each process may be built in the storage 12.

The transmission/reception unit 13 connects the management server 1 to the network and to a blockchain network. The transmission/reception unit 13 may also include short-range communication interfaces for Bluetooth (registered trademark) and BLE (Bluetooth Low Energy).

The input/output unit 14 comprises information input devices such as a keyboard and mouse and output devices such as a display.

The bus 15 is connected in common to each of the above elements and carries, for example, address signals, data signals, and various control signals.
<User terminal 2>

The user terminal 2 shown in FIG. 3 likewise includes a processor 20, a memory 21, a storage 22, a transmission/reception unit 23, an input/output unit 24, a shooting unit 26, and a shooting state information acquisition unit 27, which are electrically connected to one another through a bus 25. Since each element can be configured in the same manner as in the management server 1 described above, detailed description of the same configurations is omitted.

The shooting unit 26 is, for example, a camera, and acquires captured images. The shooting state information acquisition unit 27 includes sensors such as a GPS receiver, a gyro sensor, a barometric pressure sensor, and a temperature sensor, as well as a storage area (which may be part of the memory 21 or the storage 22) holding information about the terminal's shooting unit 26 (for example, its angle of view). It acquires, as shooting state information, the shooting angle-of-view information, shooting angle information, shooting direction information, and user terminal position information (for example, latitude/longitude information and altitude information) at the time a captured image is acquired. The acquired images and information are transmitted to the management server 1 and stored in the storage unit 160 described later. The altitude information included in the user terminal position information may be altitude calculated from the above-mentioned barometric pressure sensor or temperature sensor, or altitude set by the user. Alternatively, it may be a value obtained by offsetting height information set by the user, or average height information according to the user's gender, vertically by a predetermined amount corresponding to the assumed position of the user terminal.
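As a minimal sketch of this shooting state information and the height-based altitude offset (the data layout, the average heights, and the hand-held offset below are all assumptions, not values from the patent), one might write:

```python
from dataclasses import dataclass

@dataclass
class ShootingState:
    """Shooting state information of the user terminal (hypothetical layout)."""
    latitude: float       # user terminal position, degrees
    longitude: float      # user terminal position, degrees
    altitude_m: float     # estimated camera altitude, metres
    fov_deg: float        # shooting angle-of-view information
    pitch_deg: float      # shooting angle information (camera tilt)
    heading_deg: float    # shooting direction information (compass azimuth)

# Assumed average heights by gender, used only when the user sets no height.
AVERAGE_HEIGHT_M = {"male": 1.71, "female": 1.58}

def terminal_altitude(user_height_m, gender="male", hold_offset_m=-0.15):
    """Offset the user's height (or a gender-based average) vertically by a
    predetermined amount for the assumed hand-held position of the terminal."""
    height = user_height_m if user_height_m is not None else AVERAGE_HEIGHT_M[gender]
    return height + hold_offset_m

state = ShootingState(35.6581, 139.7414, terminal_altitude(None, "female"),
                      fov_deg=68.0, pitch_deg=12.0, heading_deg=231.0)
```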
<Aircraft 4>

FIG. 4 is a block diagram showing the hardware configuration of the aircraft 4. The flight controller 41 can have one or more processors, such as a programmable processor (for example, a central processing unit (CPU)).

The flight controller 41 also has a memory 411 and can access that memory. The memory 411 stores logic, code, and/or program instructions that the flight controller can execute in order to perform one or more steps. The flight controller 41 may further include sensors 412 such as inertial sensors (acceleration sensors, gyro sensors), a GPS sensor, and proximity sensors (for example, lidar).

The memory 411 may include, for example, a separable medium such as an SD card or random access memory (RAM), or an external storage device. Data acquired from the camera/sensors 42 may be transmitted directly to the memory 411 and stored there. For example, still image and moving image data captured by a camera or the like may be recorded in internal or external memory; alternatively, without being limited to this, the data may be recorded from the camera/sensors 42 or the internal memory, via the network NW, in at least one of the management server 1, the user terminal 2, and the aircraft storage device 5. The camera 42 is mounted on the aircraft 4 via a gimbal 43.

The flight controller 41 includes a control module (not shown) configured to control the state of the aircraft. For example, in order to adjust the spatial arrangement, velocity, and/or acceleration of the aircraft, which has six degrees of freedom (translational motions x, y, and z, and rotational motions θx, θy, and θz), the control module controls the propulsion mechanism of the aircraft (motors 45 and the like) via an ESC 44 (Electric Speed Controller). The propellers 46 are rotated by the motors 45, powered by the battery 48, to generate lift for the aircraft. The control module can also control one or more of the states of the mounted units and the sensors.

The flight controller 41 can communicate with a transmission/reception unit 47 configured to transmit and/or receive data to and from one or more external devices (for example, a transmitter ("propo") 49, a terminal, a display device, or another remote controller). The transmitter 49 can use any suitable communication means, such as wired or wireless communication.

For example, the transmission/reception unit 47 can use one or more of a local area network (LAN), a wide area network (WAN), infrared, radio, WiFi, a point-to-point (P2P) network, a telecommunications network, cloud communication, and the like.

The transmission/reception unit 47 can transmit and/or receive one or more of the data acquired by the sensors 42, the processing results generated by the flight controller 41, predetermined control data, user commands from a terminal or remote controller, and the like.

The sensors 42 according to the present embodiment may include inertial sensors (acceleration sensors, gyro sensors), a GPS sensor, proximity sensors (for example, lidar), or vision/image sensors (for example, cameras).
<Functions of the management server>

FIG. 5 is a block diagram illustrating the functions implemented in the management server 1. In the present embodiment, the management server 1 includes a communication unit 110, a flight mission generation unit 120, a captured image/shooting state information receiving unit 130, a flight path image generation unit 140, a composite image generation unit 150, and a storage unit 160. The flight mission generation unit 120 includes a flight path generation unit 121. The storage unit 160 includes a flight path information storage unit 162, a flight log storage unit 164, and a shooting information storage unit 166. The storage unit 160 may further include storage units (not shown) for the information necessary for imaging, for example information on flight conditions (such as flight speed and waypoint spacing), information on the imaging object (such as position coordinates and height), and information on the object's surrounding environment (such as terrain and nearby structures).

The communication unit 110 communicates with the user terminal 2, the aircraft 4, and the aircraft storage device 5. The communication unit 110 also functions as a reception unit that accepts flight requests from the user terminal 2.

The flight mission generation unit 120 generates flight missions. A flight mission is information including at least a flight path made up of imaging point information (so-called waypoints, including, for example, latitude/longitude information and flight altitude information), imaging direction information, and imaging date/time information. The flight path may be set by a known method: for example, by referring to manually set imaging points and imaging directions, or by having the flight path generation unit 121 automatically calculate and set the imaging points and imaging directions from, for example, the position coordinates of the imaging object and the shooting distance from that object; a sketch of the latter idea follows below. Information on the generated flight path may be stored in the flight path information storage unit 162.

The flight path may be configured, for example, without the aircraft storage device 5, with the position to which the user carries the aircraft serving as the flight start position and with the user recovering the aircraft at the flight end position. Alternatively, based on the information on the aircraft storage devices 5 managed by the management server 1 (for example, position information, storage state information, and stored-aircraft information), the flight path may be generated so as to include the position of an aircraft storage device 5 selected as the flight start position, an intermediate stop, or the flight end position.
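The patent does not prescribe how the flight path generation unit 121 computes waypoints; as one plausible illustration under those inputs (object position plus shooting distance), the sketch below places evenly spaced waypoints on a circle around the object, each with an imaging direction facing the object:

```python
import math
from dataclasses import dataclass

@dataclass
class Waypoint:
    x: float            # local east coordinate, metres
    y: float            # local north coordinate, metres
    alt_m: float        # flight altitude information
    heading_deg: float  # imaging direction (compass bearing towards the object)

def orbit_waypoints(cx, cy, alt_m, distance_m, count):
    """Evenly spaced waypoints on a circle of radius distance_m around the
    imaging object at (cx, cy); purely illustrative, not the patent's method."""
    waypoints = []
    for i in range(count):
        a = 2 * math.pi * i / count
        x = cx + distance_m * math.cos(a)
        y = cy + distance_m * math.sin(a)
        heading = math.degrees(math.atan2(cx - x, cy - y)) % 360  # bearing to object
        waypoints.append(Waypoint(x, y, alt_m, heading))
    return waypoints

path = orbit_waypoints(cx=0.0, cy=0.0, alt_m=30.0, distance_m=15.0, count=8)
```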
 撮影画像・撮影状態情報受信部130は、ユーザ端末2から送信された撮影画像及び撮影状態情報を受信し、必要に応じて撮影情報記憶部166へ記憶する。 The photographed image / photographed state information receiving unit 130 receives the photographed image and the photographed state information transmitted from the user terminal 2 and stores the photographed image / photographed state information in the photographed information storage unit 166 as needed.
The flight path image generation unit 140, for example, constructs a virtual three-dimensional coordinate space (a so-called VR space) and draws the flight path in the VR space based on the flight path information stored in the flight path information storage unit 162 or elsewhere. Then, based on the shooting state information, a virtual camera in the VR space photographs the drawn flight path under the same shooting state as the shooting unit 26 (for example, the same shooting angle of view, shooting angle, shooting direction, and shooting position) to obtain a flight path image (see, for example, the left part of FIG. 8). The VR space need not contain the flight path alone; a three-dimensional model of the imaging target, a three-dimensional model of the aerial vehicle 4, three-dimensional models of the surrounding environment (for example, buildings and trees), and flight path annotations such as waypoint numbers may also be drawn, and these drawn objects may appear in the flight path image.
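The correspondence between the terminal's shooting state and the virtual camera can be illustrated with a standard pinhole projection, sketched below in Python. The axis conventions, the parameter names, and the use of a bare projection in place of a full rendering engine are simplifying assumptions, not details taken from the disclosure.

    import numpy as np

    def project_path(points_world, cam_pos, yaw_deg, pitch_deg,
                     fov_deg, img_w, img_h):
        # Project 3D flight-path points into pixel coordinates for a pinhole
        # camera whose pose mirrors the shooting state (position, direction,
        # angle, angle of view). Camera frame: x = right, y = forward, z = up.
        yaw, pitch = np.radians(yaw_deg), np.radians(pitch_deg)
        Rz = np.array([[ np.cos(yaw), np.sin(yaw), 0],
                       [-np.sin(yaw), np.cos(yaw), 0],
                       [0, 0, 1]])                       # yaw about the up axis
        Rx = np.array([[1, 0, 0],
                       [0,  np.cos(pitch), np.sin(pitch)],
                       [0, -np.sin(pitch), np.cos(pitch)]])  # pitch about x
        R = Rx @ Rz
        f = (img_w / 2) / np.tan(np.radians(fov_deg) / 2)  # focal in pixels
        pixels = []
        for p in points_world:
            pc = R @ (np.asarray(p, dtype=float) - cam_pos)
            x_cam, depth, up = pc[0], pc[1], pc[2]
            if depth <= 0:
                pixels.append(None)   # behind the camera: not drawn
                continue
            u = img_w / 2 + f * x_cam / depth
            v = img_h / 2 - f * up / depth
            pixels.append((u, v))
        return pixels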
The composite image generation unit 150 may generate a composite image by, for example, acquiring the captured image of the shooting unit 26 and the flight path image and superimposing the flight path image on the captured image.
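A minimal sketch of this superimposition, assuming the flight path image is rendered as an RGBA array that is transparent wherever nothing was drawn, is the standard alpha blend below; the array names are placeholders.

    import numpy as np

    def composite(captured_rgb, path_rgba):
        # Blend the flight path image over the captured image pixel-wise,
        # weighting by the path image's alpha channel.
        alpha = path_rgba[..., 3:4].astype(np.float32) / 255.0
        blended = (path_rgba[..., :3].astype(np.float32) * alpha
                   + captured_rgb.astype(np.float32) * (1.0 - alpha))
        return blended.astype(np.uint8)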
The flight path information storage unit 162 stores the imaging point information, imaging direction information, and imaging date/time information of the aerial vehicle generated by the flight mission generation unit 120. The flight log storage unit 164 stores, for example, information acquired by the aerial vehicle 4 along the flight path set in the flight mission (for example, information on the positions passed from takeoff to landing, still images, moving images, audio, and other information).
As shown in FIG. 6, the shooting information storage unit 166 includes at least a shooting angle-of-view information storage unit 1661, a shooting angle information storage unit 1662, a shooting direction information storage unit 1663, and a user terminal position information storage unit 1664, and stores the respective items of shooting state information acquired by the captured image / shooting state information receiving unit 130.
In the present embodiment, the description is based on an example in which the captured image / shooting state information receiving unit 130, the flight path image generation unit 140, the composite image generation unit 150, and the storage unit 160 are provided as functions of the information processing device on the management server 1. The present invention is not limited to this, however; these components (with the shooting unit 26 and the shooting state information acquisition unit 27 in place of the captured image / shooting state information receiving unit 130) may instead be provided as functions of the information processing device of the user terminal 2 itself, which then acquires the flight path information set on the user terminal 2 or generated by the management server 1 and generates the above-described composite image. This increases the processing load on the user terminal 2 but reduces the amount and frequency of data communication with the management server 1.
<Example of flight path display method>
 The flight path display method according to the present embodiment will be described with reference to FIGS. 7-15. FIG. 7 illustrates a flowchart of the flight path display method according to the present embodiment. FIGS. 8-13 illustrate the generation of composite images for flight path display according to the embodiment of the present invention.
First, the information processing device acquires the captured image S taken by the shooting unit 26 of the user terminal and the shooting state information acquired by the shooting state information acquisition unit 27 (SQ101).
Next, the information processing device draws the flight path in the VR space based on the flight path information generated by the management server 1 or the user terminal 2 and, based on the shooting state information, photographs the drawn flight path with a virtual camera in the VR space to acquire the flight path image H (SQ102).
Next, the information processing device superimposes the flight path image H on the captured image S using the composite image generation unit 150 to generate the composite image G (SQ103), and displays the composite image G on the screen of the user terminal 2 (SQ104).
Next, the information processing device determines whether the shooting conditions have been changed (SQ105); if they have, processing starts over from the acquisition of the captured image and shooting state information (SQ101). If the shooting conditions have not been changed, the device determines whether shooting has ended (SQ106); if it has not, processing likewise starts over from SQ101, although in this case the shooting state information need not be reacquired.
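The flow of SQ101-SQ106 can be pictured as the loop sketched below; the terminal object and its method names (capture_frame, conditions_changed, and so on) are hypothetical placeholders, render_path stands in for the virtual-camera step of SQ102, and composite is the blending sketch given earlier.

    def display_loop(terminal, flight_path_info):
        # SQ101-SQ106: acquire, render, composite, display, repeat.
        state = terminal.get_shooting_state()                # SQ101 (state)
        while True:
            frame = terminal.capture_frame()                 # SQ101 (image)
            path_img = render_path(flight_path_info, state)  # SQ102
            terminal.display(composite(frame, path_img))     # SQ103-SQ104
            if terminal.conditions_changed():                # SQ105
                state = terminal.get_shooting_state()        # reacquire state
            elif terminal.shooting_ended():                  # SQ106
                break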
The operations in this flow may, as one example, be performed continuously at a predetermined sampling rate; the flight path and related information are then displayed over the live view captured by the user terminal 2, functioning as so-called AR. Further, when an imaging target exists, the system may be configured so that, for example, photographing the target yields a composite image in which the flight path is superimposed on the acquired still image or moving image.
<Composite image G example 1>
 As illustrated in FIG. 8, the flight path image H includes, for example, the flight path FR, waypoints WP with related information (for example, waypoint numbers), and a virtual aerial vehicle VD. Then, as illustrated in FIG. 9, the captured image S and the flight path image H are combined to obtain the composite image G. Because the flight path FR and related elements within the shooting range of the captured image S are thereby virtually superimposed on real-world space, the suitability of the flight path FR becomes easier to confirm.
Further, by also using the imaging date/time information when placing the virtual aerial vehicle VD in the VR space, it becomes easy to confirm where the aerial vehicle 4 will be at a time specified by the user. Here, the line shape, thickness, or color may be varied so that the portion already flown (the solid portion of the flight path FR in FIGS. 8 and 9) and the portion still to be flown (the dotted portion of the flight path FR in FIGS. 8 and 9) are easy to distinguish.
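Reusing the hypothetical Waypoint sketch given earlier, the division into a flown portion and a planned portion could, for example, be computed by timestamp as follows; the disclosure leaves the concrete styling (line shape, thickness, color) open.

    def split_path_by_time(waypoints, now):
        # Waypoints already passed are drawn as a solid line,
        # waypoints still ahead as a dotted line.
        flown = [wp for wp in waypoints if wp.capture_at <= now]
        planned = [wp for wp in waypoints if wp.capture_at > now]
        return flown, planned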
<Composite image G example 2>
 As illustrated in FIG. 10, the captured image S may include the imaging target T. In this case, as illustrated in FIG. 11, the positional relationship between the imaging target T and the flight path FR can be seen directly, making the suitability of the flight path FR easier to confirm.
<Composite image G example 3>
 As illustrated in FIG. 12, the captured image S may include both the imaging target T and the aerial vehicle 4. In this case, as illustrated in FIG. 13, the positional relationship between the aerial vehicle 4 and the flight path FR can be seen directly, making it particularly easy to confirm the suitability of the flight path FR from the present moment onward. Further, as illustrated in FIG. 14, a display area W for showing video acquired from the aerial vehicle 4 may be provided when the composite image G is displayed, so that the suitability of the flight path can be confirmed in light of the suitability of the imaging as well. In FIG. 14 the composite image G is displayed larger than the display area W, but the display area W may instead be displayed larger than the composite image G, and the two layouts may be made switchable, as sketched after this paragraph.
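The switchable layout of FIG. 14 can be pictured as a picture-in-picture composition, sketched below; the use of OpenCV for resizing and all names are assumptions for illustration only.

    import numpy as np
    import cv2  # OpenCV, assumed available

    def with_inset(main_img, inset_img, scale=0.3, margin=10):
        # Draw inset_img as a small display area W in the top-right corner
        # of main_img. Calling with the arguments swapped realizes the
        # switched layout in which the vehicle's video is the larger view.
        h, w = main_img.shape[:2]
        iw, ih = int(w * scale), int(h * scale)
        small = cv2.resize(inset_img, (iw, ih))
        out = main_img.copy()
        out[margin:margin + ih, w - margin - iw:w - margin] = small
        return out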
<Composite image G example 4>
 As illustrated in FIG. 15, the flight path image H may include a virtual imaging target VT. In this case, as illustrated in FIG. 16, the virtual imaging target VT and the actual imaging target T can be superimposed and compared; when, for example, an unexpected structure O is present on the imaging target T, the planned flight path FR is easily compared against the actual configuration of the target, making the suitability of the flight path FR still easier to confirm.
Further, the flight path FR may be made editable on the user terminal 2 while the composite image G is displayed. This makes it easier to set a more appropriate flight path FR while confirming the current one in the composite image G.
Although the imaging target is illustrated and described here as a steel tower, it is not limited to this; it may be anything that can be photographed by the camera 42, for example a high-rise apartment building, a house, a chimney, an antenna tower, a lighthouse, a windmill, a tree, a sculpted object such as a Kannon statue, or even living things such as people and animals, or smoke from a fire. Moreover, the imaging target need not actually exist: an arbitrary three-dimensional model, such as a structure planned for construction, may be placed instead.
Further, although the user terminal 2 is exemplified as a mobile communication terminal such as a smartphone, it is not limited to this; it may be, for example, a wearable device having an AR function (such as AR glasses). This improves the visibility of the flight path and makes its suitability still easier to confirm.
In this way, by comparing the actual conditions of the flight range against the flight path of the aerial vehicle, the suitability of that flight path can be confirmed.
The embodiment described above is merely an example for facilitating understanding of the present invention and is not intended to limit its interpretation. The present invention may be modified and improved without departing from its spirit, and it goes without saying that the present invention includes equivalents thereof.
1  Management server
2  User terminal
4  Aerial vehicle

Claims (9)

1. A flight path display method for displaying a flight path of an aerial vehicle on a user terminal, the method comprising:
   a step of acquiring shooting state information including position information of the user terminal, shooting angle-of-view information, shooting angle information, and shooting direction information;
   a step of generating, based on flight path information relating to the flight path, a flight path image including the flight path rendered in a virtual space; and
   a step of generating a composite image obtained by superimposing the flight path image on a captured image acquired based on the shooting state information.
2. The flight path display method according to claim 1, wherein the image showing the flight path further includes information about waypoints.
3. The flight path display method according to claim 1 or 2, wherein the image showing the flight path further includes a three-dimensional model of a virtual aerial vehicle.
4. The flight path display method according to any one of claims 1 to 3, wherein the image showing the flight path further includes a three-dimensional model of a virtual imaging target.
5. The flight path display method according to any one of claims 1 to 4, wherein the user terminal is a wearable device having an AR function.
6. The flight path display method according to any one of claims 1 to 5, further comprising a step of editing the flight path while displaying the composite image.
7. The flight path display method according to any one of claims 1 to 6, further comprising a step of displaying video acquired from the aerial vehicle.
8. The flight path display method according to claim 7, wherein the display is switchable so that the video acquired from the aerial vehicle is shown larger than the composite image.
9. An information processing device for displaying a flight path of an aerial vehicle on a user terminal, comprising:
   a shooting state information acquisition unit that acquires shooting state information including position information of the user terminal, shooting angle-of-view information, shooting angle information, and shooting direction information;
   a flight path image generation unit that generates, based on flight path information relating to the flight path, a flight path image including the flight path rendered in a virtual space; and
   a composite image generation unit that generates a composite image obtained by superimposing the flight path image on a captured image acquired based on the shooting state information.

PCT/JP2019/051213 2019-12-26 2019-12-26 Aircraft flight path display method and information processing device WO2021130980A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2020511544A JP6730764B1 (en) 2019-12-26 2019-12-26 Flight route display method and information processing apparatus
PCT/JP2019/051213 WO2021130980A1 (en) 2019-12-26 2019-12-26 Aircraft flight path display method and information processing device
JP2020112346A JP2021104802A (en) 2019-12-26 2020-06-30 Flight route display device of flight body and information processing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/051213 WO2021130980A1 (en) 2019-12-26 2019-12-26 Aircraft flight path display method and information processing device

Publications (1)

Publication Number Publication Date
WO2021130980A1 true WO2021130980A1 (en) 2021-07-01

Family

ID=71738538

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/051213 WO2021130980A1 (en) 2019-12-26 2019-12-26 Aircraft flight path display method and information processing device

Country Status (2)

Country Link
JP (2) JP6730764B1 (en)
WO (1) WO2021130980A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022155628A (en) * 2021-03-31 2022-10-14 住友重機械建機クレーン株式会社 Display device and route display program

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017078704A (en) * 2015-07-17 2017-04-27 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Flight route creation method, flight route creation program, and flight route display device
WO2018123747A1 (en) * 2016-12-28 2018-07-05 Necソリューションイノベータ株式会社 Drone maneuvering system, maneuvering signal transmitter set, and drone maneuvering method
JP2018165930A (en) * 2017-03-28 2018-10-25 株式会社ゼンリンデータコム Drone navigation device, drone navigation method and drone navigation program
JP2019032234A (en) * 2017-08-08 2019-02-28 株式会社プロドローン Display device
US20190077504A1 (en) * 2017-09-11 2019-03-14 Disney Enterprises, Inc. Augmented reality travel route planning
JP2019114036A (en) * 2017-12-22 2019-07-11 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Information processing apparatus, flight control instruction method, program, and recording medium

Also Published As

Publication number Publication date
JPWO2021130980A1 (en) 2021-12-23
JP2021104802A (en) 2021-07-26
JP6730764B1 (en) 2020-07-29

Similar Documents

Publication Publication Date Title
CN108351653B (en) System and method for UAV flight control
JP6765512B2 (en) Flight path generation method, information processing device, flight path generation system, program and recording medium
JP6829513B1 (en) Position calculation method and information processing system
US11082639B2 (en) Image display method, image display system, flying object, program, and recording medium
JP6583840B1 (en) Inspection system
WO2021251441A1 (en) Method, system, and program
JP6966810B2 (en) Management server and management system, display information generation method, program
JP6730764B1 (en) Flight route display method and information processing apparatus
WO2021079516A1 (en) Flight route creation method for flying body and management server
CN110799922A (en) Shooting control method and unmanned aerial vehicle
JP2020036163A (en) Information processing apparatus, photographing control method, program, and recording medium
JP6818379B1 (en) Flight route creation method and management server for aircraft
WO2021064982A1 (en) Information processing device and information processing method
JP2021100234A (en) Aircraft imaging method and information processing device
WO2021124579A1 (en) Image capturing method of flight vehicle and information processing device
JP6800505B1 (en) Aircraft management server and management system
JP7370045B2 (en) Dimension display system and method
JP6786138B1 (en) Aircraft management server and management system
JP6715541B1 (en) Aircraft management server and management system
JP6771253B1 (en) Aircraft management server and management system
JP6810497B1 (en) Flight route creation method and management server for aircraft
JP6810498B1 (en) Flight route creation method and management server for aircraft
JP6934646B1 (en) Flight restriction area setting method, waypoint setting method and management server, information processing system, program
JP6604681B1 (en) Dimension display system and dimension display method
JP2023083072A (en) Method, system and program

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2020511544

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19957762

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19957762

Country of ref document: EP

Kind code of ref document: A1