WO2021124579A1 - Flying vehicle image capture method and information processing device - Google Patents

Flying vehicle image capture method and information processing device

Info

Publication number
WO2021124579A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
shooting
light source
imaging
date
Prior art date
Application number
PCT/JP2019/050206
Other languages
English (en)
Japanese (ja)
Inventor
西本 晋也
兼太郎 深見
Original Assignee
株式会社センシンロボティクス
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社センシンロボティクス filed Critical 株式会社センシンロボティクス
Priority to JP2020519143A priority Critical patent/JPWO2021124579A1/ja
Priority to PCT/JP2019/050206 priority patent/WO2021124579A1/fr
Publication of WO2021124579A1 publication Critical patent/WO2021124579A1/fr

Links

Images

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U50/00 Propulsion; Power supply
    • B64U50/10 Propulsion
    • B64U50/19 Propulsion using electrically powered motors
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00 Constructional aspects of UAVs
    • B64U20/80 Arrangement of on-board electronics, e.g. avionics systems or wiring
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00 Type of UAV
    • B64U10/10 Rotorcrafts
    • B64U10/13 Flying platforms

Definitions

  • The present invention relates to an imaging method for a flying object and an information processing device.
  • Patent Document 1 discloses an imaging method in which, when imaging would be backlit, an unmanned aerial vehicle is controlled to fly to a position where the imaging is not backlit.
  • In Patent Document 1, however, when the entire object is to be imaged, no image can be taken from the side on which shooting is backlit. To image the entire object, the user therefore has to make adjustments such as changing the date and time and shooting again, which places a burden on the user.
  • The present invention has been made against this background, and its object is to provide an imaging method and an information processing device that, by confirming whether or not the light source is located within the shooting range, enable imaging without the light source appearing in the image, even in the case of backlit imaging.
  • The main aspect of the present invention for solving the above problem is a method of imaging an object with a flying object connected to an information processing device that controls the flying object. The method comprises: an information acquisition step of acquiring shooting condition information including at least shooting point information and shooting direction information of the flying object, shooting angle-of-view information and shooting angle information of an imaging device mounted on the flying object, and date and time information associated with the shooting point information; a calculation step of calculating the direction and the altitude of the light source with respect to the flying object, based on the shooting point information and the date and time information, or on the shooting point information and position information of the light source acquired in advance; and a determination step of comparing a shooting range derived from the shooting point information, the date and time information, the shooting direction information, and at least one of the shooting angle-of-view information and the shooting angle information with at least one of the direction and the altitude at which the light source is located, thereby determining whether or not the light source is located within the shooting range.
  • According to the present invention, an imaging method and an information processing device can be provided that enable imaging without the light source appearing in the image, by confirming whether or not the light source is located within the shooting range, particularly even in the case of imaging against the sun.
  • FIG. 1 is a diagram showing the configuration of the management system according to an embodiment of the present invention. FIG. 2 is a block diagram showing the hardware configuration of the management server of FIG. 1. FIG. 3 is a block diagram showing the hardware configuration of the user terminal of FIG. 1. FIG. 4 is a block diagram showing the hardware configuration of the flying object of FIG. 1. FIG. 5 is a block diagram showing the functions of the management server of FIG. 1. FIG. 6 is a block diagram showing the structure of the parameter information storage unit of FIG. 5. FIG. 7 is a flowchart of the imaging method according to the embodiment of the present invention. FIG. 8 is a diagram showing an example of the shooting range according to the embodiment of the present invention. FIG. 9 is a diagram showing an example of the shooting range according to the embodiment of the present invention.
  • The flight management server and flight management system have the following configurations.
  • [Item 1] A method of imaging an object with a flying object connected to an information processing device that controls the flying object, the method comprising: an information acquisition step of acquiring shooting condition information including at least shooting point information and shooting direction information of the flying object, shooting angle-of-view information and shooting angle information of an imaging device mounted on the flying object, and date and time information associated with the shooting point information; a calculation step of calculating the direction and the altitude of the light source with respect to the flying object, based on the shooting point information and the date and time information, or on the shooting point information and position information of the light source acquired in advance; and a determination step of comparing a shooting range derived from the shooting point information, the date and time information, the shooting direction information, and at least one of the shooting angle-of-view information and the shooting angle information with at least one of the direction and the altitude at which the light source is located, thereby determining whether or not the light source is located within the shooting range.
  • [Item 2] The imaging method according to item 1, wherein the determination step determines whether or not the light source is located within at least one of the vertical and horizontal shooting ranges.
  • [Item 3] The imaging method according to item 1 or 2, wherein, when it is determined that the light source is located within the shooting range, the management server outputs a warning to the user, and the method further comprises a shooting condition changing step of changing at least a part of the shooting condition information.
  • The imaging method may be one wherein the light source is an artificial object that emits light.
  • The imaging method may be one wherein the light source is the sun.
  • The present invention also provides an information processing device comprising: an information acquisition unit that acquires shooting condition information including at least shooting point information and shooting direction information of the flying object, shooting angle-of-view information and shooting angle information of an imaging device mounted on the flying object, and date and time information associated with the shooting point information; a calculation unit that calculates the direction and the altitude of the light source with respect to the flying object based on the shooting point information and the date and time information, or on position information of the light source acquired in advance; and a determination unit that compares a shooting range derived from the shooting point information, the date and time information, the shooting direction information, and at least one of the shooting angle-of-view information and the shooting angle information with at least one of the direction and the altitude of the light source, and determines whether or not the light source is located within the shooting range.
  • The management system includes a management server 1, one or more user terminals 2, one or more flying objects 4, and one or more flying object storage devices 5.
  • The management server 1, the user terminal 2, the flying object 4, and the flying object storage device 5 are communicably connected to each other via a network.
  • The illustrated configuration is an example and is not limiting; for example, the flying object may be carried by the user, without providing the flying object storage device 5.
  • FIG. 2 is a diagram showing a hardware configuration of the management server 1.
  • The illustrated configuration is an example and may be configured otherwise.
  • The management server 1 is connected to the plurality of user terminals 2, the flying object 4, and the flying object storage device 5, and constitutes a part of this system.
  • The management server 1 may be a general-purpose computer such as a workstation or a personal computer, or may be logically realized by cloud computing.
  • The management server 1 includes at least a processor 10, a memory 11, a storage 12, a transmission/reception unit 13, and an input/output unit 14, and these are electrically connected to each other through a bus 15.
  • The processor 10 is an arithmetic unit that controls the operation of the entire management server 1, controls the transmission and reception of data between the elements, and performs the information processing necessary for application execution and authentication processing.
  • The processor 10 is a CPU (Central Processing Unit) and/or a GPU (Graphics Processing Unit), and carries out each information process by executing programs for this system that are stored in the storage 12 and loaded into the memory 11.
  • The memory 11 includes a main memory composed of a volatile storage device such as a DRAM (Dynamic Random Access Memory) and an auxiliary storage composed of a non-volatile storage device such as a flash memory or an HDD (Hard Disk Drive).
  • The memory 11 is used as a work area of the processor 10, and also stores the BIOS (Basic Input/Output System) executed when the management server 1 is started, various setting information, and the like.
  • The storage 12 stores various programs such as application programs.
  • A database storing the data used for each process may be built in the storage 12.
  • The transmission/reception unit 13 connects the management server 1 to the network and to the blockchain network.
  • The transmission/reception unit 13 may also be provided with a short-range communication interface such as Bluetooth (registered trademark) or BLE (Bluetooth Low Energy).
  • The input/output unit 14 includes information input devices such as a keyboard and a mouse, and output devices such as a display.
  • The bus 15 is commonly connected to each of the above elements and transmits, for example, address signals, data signals, and various control signals.
  • The user terminal 2 shown in FIG. 3 likewise includes a processor 20, a memory 21, a storage 22, a transmission/reception unit 23, an input/output unit 24, and the like, electrically connected to each other through a bus 25. Since the functions of these elements can be configured in the same manner as those of the management server 1 described above, their detailed description is omitted.
  • FIG. 4 is a block diagram showing a hardware configuration of the flying object 4.
  • The flight controller 41 can have one or more processors, such as a programmable processor (for example, a central processing unit (CPU)).
  • The flight controller 41 has a memory 411 and can access the memory.
  • The memory 411 stores logic, code, and/or program instructions that the flight controller can execute to perform one or more steps.
  • The flight controller 41 may include sensors 412 such as an inertial sensor (accelerometer, gyro sensor), a GPS sensor, and a proximity sensor (for example, lidar).
  • The memory 411 may include, for example, a separable medium such as an SD card or random access memory (RAM), or an external storage device.
  • The data acquired from the cameras/sensors 42 may be directly transmitted to and stored in the memory 411.
  • Still image and moving image data captured by the camera or the like may be recorded in the built-in memory or an external memory; however, the present invention is not limited to this, and the data may be transferred from the camera/sensors 42 or the built-in memory via the network NW and recorded in at least one of the management server 1, the user terminal 2, and the flying object storage device 5.
  • The camera 42 is installed on the flying object 4 via a gimbal 43.
  • The flight controller 41 includes a control module (not shown) configured to control the state of the flying object.
  • The control module adjusts the spatial placement, velocity, and/or acceleration of the flying object with six degrees of freedom (translational motion x, y, and z, and rotational motion θx, θy, and θz).
  • The control module can control the propulsion mechanism (motor 45 and the like) of the flying object via an ESC 44 (Electric Speed Controller) in order to adjust the state of the flying object.
  • The propeller 46 is rotated by the motor 45, which is supplied with power from the battery 48, and thereby generates lift for the flying object.
  • The control module can control one or more of the states of the mounting unit and the sensors.
  • The flight controller 41 can communicate, via the transmission/reception unit 47, with one or more external devices (for example, a transmitter/receiver (propo) 49, a terminal, a display device, or another remote controller) configured to transmit and/or receive data.
  • The transmitter/receiver 49 can use any suitable communication means, such as wired communication or wireless communication.
  • The transmission/reception unit 47 can use one or more of a local area network (LAN), a wide area network (WAN), infrared, wireless, WiFi, a point-to-point (P2P) network, a telecommunications network, cloud communication, and the like.
  • The transmission/reception unit 47 can transmit and/or receive one or more of the data acquired by the sensors 42, the processing results generated by the flight controller 41, predetermined control data, user commands from a terminal or a remote controller, and the like.
  • The sensors 42 may include an inertial sensor (accelerometer, gyro sensor), a GPS sensor, a proximity sensor (for example, lidar), or a vision/image sensor (for example, a camera).
  • FIG. 5 is a block diagram illustrating the functions implemented in the management server 1.
  • The management server 1 includes a communication unit 110, a flight mission generation unit 120, an information acquisition unit 130, a light source position calculation unit 140, a determination unit 150, a storage unit 160, and a report generation unit 170.
  • The flight mission generation unit 120 includes a flight path generation unit 121.
  • The storage unit 160 includes a flight path information storage unit 162, a flight log storage unit 164, and a shooting condition parameter storage unit 166.
  • The storage unit 160 may further include storage units (not shown) that store information necessary for performing imaging, for example, information on flight conditions (for example, flight speed and waypoint intervals), information on the object to be photographed (for example, position coordinates and height information), and information on the surrounding environment of the object (for example, information on terrain and surrounding structures).
  • The communication unit 110 communicates with the user terminal 2, the flying object 4, and the flying object storage device 5.
  • The communication unit 110 also functions as a reception unit that receives flight requests from the user terminal 2.
  • The flight mission generation unit 120 generates flight missions.
  • The flight mission is information including at least a flight path that includes shooting point information (so-called waypoints, each including latitude/longitude information and flight altitude information), shooting direction information, and shooting date/time information.
  • The flight path may be set by a known method, for example by referring to manually set shooting points and shooting directions, or by setting the position coordinates of the object to be imaged and the shooting distance from the object so that the shooting points and shooting directions are automatically calculated and set.
  • The flight path may be configured, for example, without the flying object storage device 5, with the position where the user carries the aircraft set as the flight start position and with the user collecting the aircraft at the flight end position. Alternatively, based on the information on the flying object storage devices 5 managed by the management server 1 (for example, position information, storage state information, and stored aircraft information), the flight path may be generated so as to include the position of a flying object storage device 5 selected as the flight start position, an intermediate stopover, or the flight end position.
  • The information acquisition unit 130 acquires the shooting condition information of the flying object.
  • The acquisition of the shooting condition information referred to here is the acquisition of shooting condition parameters including at least flight path information, date and time information, shooting angle-of-view information, and shooting angle information.
  • As the shooting condition parameters, for example, default values stored in advance in the storage unit 160 may be acquired, or values may be automatically calculated and acquired according to the size of the object to be photographed.
  • The light source position calculation unit 140 calculates the direction and the altitude of the light source with respect to the flying object (hereinafter, the "light source position"), based on, for example, the shooting point information and the shooting date/time information.
  • When the light source is the sun, the solar altitude and direction (hereinafter, the "position of the sun") can be calculated by a known method: since the position of the sun is obtained by substituting the date/time and the latitude/longitude into a known formula for the solar position, the position of the sun can be calculated directly by this known method (see the sketch below).
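As a concrete illustration of such a known method, the following minimal Python sketch computes the solar azimuth and elevation from a UTC date/time and a latitude/longitude, using the low-precision formulas of the Astronomical Almanac; the function name and interface are assumptions made for this example, not part of the original disclosure.

```python
import math
from datetime import datetime, timezone

def solar_position(lat_deg: float, lon_deg: float, when_utc: datetime):
    """Approximate solar azimuth and elevation (degrees) for a UTC datetime.

    Low-precision Astronomical Almanac formulas (roughly 0.01 deg accuracy
    between 1950 and 2050). Azimuth is measured clockwise from true north."""
    # Days since the J2000.0 epoch (2000-01-01 12:00 UTC).
    d = (when_utc - datetime(2000, 1, 1, 12, tzinfo=timezone.utc)).total_seconds() / 86400.0
    g = math.radians((357.529 + 0.98560028 * d) % 360.0)       # mean anomaly of the sun
    q = (280.459 + 0.98564736 * d) % 360.0                     # mean longitude (degrees)
    lam = math.radians((q + 1.915 * math.sin(g)
                        + 0.020 * math.sin(2.0 * g)) % 360.0)  # ecliptic longitude
    eps = math.radians(23.439 - 0.00000036 * d)                # obliquity of the ecliptic
    ra = math.atan2(math.cos(eps) * math.sin(lam), math.cos(lam))  # right ascension
    dec = math.asin(math.sin(eps) * math.sin(lam))                 # declination
    gmst_h = (18.697374558 + 24.06570982441908 * d) % 24.0     # Greenwich sidereal time (h)
    ha = math.radians((gmst_h * 15.0 + lon_deg) % 360.0) - ra  # local hour angle
    lat = math.radians(lat_deg)
    elev = math.asin(math.sin(lat) * math.sin(dec)
                     + math.cos(lat) * math.cos(dec) * math.cos(ha))
    az = math.atan2(-math.cos(dec) * math.sin(ha),
                    math.sin(dec) * math.cos(lat)
                    - math.cos(dec) * math.sin(lat) * math.cos(ha))
    return math.degrees(az) % 360.0, math.degrees(elev)
```

For example, `solar_position(35.68, 139.77, datetime(2019, 12, 20, 3, 0, tzinfo=timezone.utc))` gives the sun's direction and altitude over Tokyo around local noon.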
  • Alternatively, the light source position calculation unit 140 calculates the light source position with respect to the flying object based on, for example, the shooting point information and position information of the light source acquired in advance.
  • For an artificial light source, information storing the position from which the light is emitted (for example, latitude/longitude information and height position information) can be acquired, and the position information of the light source with respect to the shooting point can be calculated from the relative relationship between the position information of the light source and the shooting point information, as sketched below.
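Such a relative light-source position can be sketched as follows, assuming a local flat-earth approximation (adequate over the short ranges a flying object covers); the function and parameter names are illustrative assumptions.

```python
import math

def artificial_source_bearing(shoot_lat_deg: float, shoot_lon_deg: float, shoot_alt_m: float,
                              src_lat_deg: float, src_lon_deg: float, src_alt_m: float):
    """Azimuth and elevation (degrees) of a fixed light source as seen from the
    shooting point, using a local flat-earth approximation."""
    r_earth = 6371000.0  # mean Earth radius in metres
    d_north = math.radians(src_lat_deg - shoot_lat_deg) * r_earth
    d_east = (math.radians(src_lon_deg - shoot_lon_deg)
              * r_earth * math.cos(math.radians(shoot_lat_deg)))
    azimuth = math.degrees(math.atan2(d_east, d_north)) % 360.0  # clockwise from north
    horizontal_dist = math.hypot(d_north, d_east)
    elevation = math.degrees(math.atan2(src_alt_m - shoot_alt_m, horizontal_dist))
    return azimuth, elevation
```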
  • The determination unit 150 determines whether or not the light source is within the shooting range. This determination compares the shooting condition information (in particular, the shooting point information and the shooting direction information on the flight path, and the shooting angle-of-view and shooting angle information) with the above-described position information of the light source, and thereby determines whether or not the light source is located within the shooting range from the shooting point. As a more specific example of the determination method, in the horizontal direction it is determined whether or not the direction in which the light source is located falls within the shooting range formed by the horizontal shooting angle of view centered on the shooting direction from the shooting point. In the vertical direction, it is determined whether or not the altitude at which the light source is located falls within the shooting range formed by the vertical shooting angle of view centered on the shooting angle from the shooting point.
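Both comparisons reduce to checking whether an angle lies inside a field of view centered on the shooting direction (horizontally) or on the shooting angle (vertically). A minimal sketch, assuming the light-source direction and altitude come from one of the calculations above (all names are illustrative):

```python
def within_fov(center_deg: float, fov_deg: float, target_deg: float) -> bool:
    """True if the target angle lies within +/- fov/2 of the center (wrap-safe)."""
    diff = (target_deg - center_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0

def light_source_in_shot(shoot_azimuth_deg: float, shoot_pitch_deg: float,
                         h_fov_deg: float, v_fov_deg: float,
                         src_azimuth_deg: float, src_elev_deg: float) -> bool:
    """The source counts as 'in shot' only if it falls inside both the
    horizontal and the vertical shooting range."""
    return (within_fov(shoot_azimuth_deg, h_fov_deg, src_azimuth_deg)
            and within_fov(shoot_pitch_deg, v_fov_deg, src_elev_deg))
```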
  • The flight path information storage unit 162 stores the shooting point information, the shooting direction information, and the shooting date/time information of the flying object generated by the flight mission generation unit 120.
  • The flight log storage unit 164 stores, for example, the information acquired by the flying object 4 on the flight path set in the flight mission (for example, position information from takeoff to landing, still images, moving images, audio, and other information).
  • The shooting condition parameter storage unit 166 includes at least a shooting angle-of-view information storage unit 1661, a shooting angle information storage unit 1662, and a light source position information storage unit 1663.
  • The report generation unit 170 generates the report information displayed on the user terminal 2 based on the flight log storage unit 164.
  • FIG. 7 illustrates a flowchart of the imaging method according to the present embodiment.
  • This flowchart assumes a configuration in which an application is run on the user terminal 2, but the present invention is not limited to this; for example, the management server 1 or the flying object storage device 5 may be provided with a processor and an input/output device capable of running the application and making the various settings.
  • FIGS. 8 to 13 are examples for explaining the shooting range in the imaging method according to the embodiment of the present invention.
  • The information acquisition unit 130 acquires the shooting condition information of the flying object from the storage unit 160 (SQ101).
  • The acquisition of the shooting condition information referred to here is the acquisition of shooting condition parameters including at least flight path information, date and time information, shooting angle-of-view information, and shooting angle information.
  • As the shooting condition parameters, for example, default values may be acquired in advance, or values may be automatically acquired according to the size of the object to be photographed.
  • In the present embodiment, the object to be photographed is illustrated as a steel tower, but the object is not limited to this; any object that can be photographed by the camera 42 may be used, for example, a structure such as a high-rise apartment building, a house, a chimney, an antenna tower, a lighthouse, a windmill, a tree, or a Kannon statue, a living creature such as a person or an animal, or smoke from a fire.
  • Next, the management server 1 calculates the position information of the light source by the light source position calculation unit 140 (SQ102).
  • The light source is not limited to one that emits light by itself (for example, the sun, the moon, or lighting); it also includes light produced by the reflection of light emitted from another object (for example, light reflected by glass or a window).
  • Next, the management server 1 determines, by the determination unit 150, whether or not the light source is within the shooting range (SQ103). If it is determined that the light source is not located within the shooting range from the shooting point, the flying object starts the flight based on the shooting condition information (SQ104).
  • Even if the light source is included in the horizontal shooting range as illustrated in FIG. 9, if the light source is not included in the vertical shooting range as illustrated in FIG. 8, it may be determined that the light source is not located within the shooting range.
  • Similarly, even if the light source is included in the vertical shooting range as illustrated in FIG. 11, if the light source is not included in the horizontal shooting range as illustrated in FIG. 12, it may be determined that the light source is not located within the shooting range.
  • When the light source is not included in the shooting range in either the vertical direction or the horizontal direction, it may of course be determined that the light source is not located within the shooting range.
  • The change of the shooting condition information may be made to any item of the shooting condition information, as long as the change results in a determination that the light source is not located within the shooting range from the shooting point.
  • One item of the shooting condition information to be changed may be the shooting angle. For example, when the light source is located within the shooting range while the object is shot from above pointing downward as shown in FIG. 11, the shooting conditions can be changed by pointing the shooting angle upward instead of downward. At this time, the traveling direction of the flight path may be changed to run from bottom to top instead of from top to bottom. Conversely, when the light source is above and is located within the shooting range under the shooting conditions shown in FIG. 10, the situation can be handled by changing the shooting angle as shown in FIG. 11.
  • Alternatively, when the light source is above, the light source may be located within the shooting range at the shooting angle shown in FIG. 8; in that case, by changing the shooting altitude in addition to the shooting angle, the shooting conditions can be changed so that the light source is not located within the shooting range.
  • One item of the shooting condition information to be changed may be the shooting point and the shooting direction. For example, when the light source is located within the shooting range in the positional relationship shown in FIG. 9, the shooting conditions can be changed so that the light source is not located within the shooting range by changing the shooting point and the shooting direction as shown in FIG. 12. At this time, instead of changing the shooting direction of the aircraft, the shooting angle of the imaging device may be changed.
  • One item of the shooting condition information to be changed may be the shooting date and time. For example, when the light source is at the position shown in FIG. 11 at the set date and time but moves to the position shown in FIG. 8 when the shooting date and time is changed, the situation can be handled without changing the other shooting conditions; one way to search for such a time is sketched below.
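When the light source is the sun, one illustrative approach is to scan the shooting time forward until the sun leaves the shooting range. This sketch reuses `solar_position` and `light_source_in_shot` from the earlier examples; the step size and search window are arbitrary assumptions.

```python
from datetime import datetime, timedelta

def find_clear_shooting_time(start_utc: datetime, lat_deg: float, lon_deg: float,
                             shoot_azimuth_deg: float, shoot_pitch_deg: float,
                             h_fov_deg: float, v_fov_deg: float,
                             step_minutes: int = 30, max_hours: int = 12):
    """Return the first time from start_utc at which the sun is outside the
    shooting range, or None if no such time is found within max_hours."""
    t = start_utc
    for _ in range(max_hours * 60 // step_minutes):
        az, elev = solar_position(lat_deg, lon_deg, t)
        if not light_source_in_shot(shoot_azimuth_deg, shoot_pitch_deg,
                                    h_fov_deg, v_fov_deg, az, elev):
            return t
        t += timedelta(minutes=step_minutes)
    return None
```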
  • When the light source is an artificial object, time-zone information indicating when the artificial object emits light and when it does not may be stored in the storage unit 160 and compared with the date and time information. It can thereby be determined whether the light source emits light at the set date and time, and by changing the shooting date and time to a time when the light source does not emit light, shooting may be enabled even if the light source is located within both shooting ranges (a sketch of such an emission-window check follows).
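Assuming the storage unit holds a list of (start_hour, end_hour) emission windows per artificial source (the data layout is an assumption made for this example), the check could look like this:

```python
from datetime import datetime

def source_emitting(when_local: datetime, on_windows) -> bool:
    """True if an artificial light source is lit at the given local time.

    on_windows: iterable of (start_hour, end_hour) emission windows; a window
    may wrap past midnight, e.g. (18, 6) for dusk-to-dawn lighting."""
    h = when_local.hour + when_local.minute / 60.0
    for start, end in on_windows:
        if start <= end:
            if start <= h < end:
                return True
        elif h >= start or h < end:  # window wraps past midnight
            return True
    return False
```

For example, `source_emitting(datetime(2019, 12, 20, 22, 0), [(18, 6)])` returns True for a floodlight lit from 18:00 to 06:00, so the shooting date and time would be moved into the daytime window.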
  • Similarly, when the light source is the sun, whether the sun appears or does not appear at the set date and time may be determined from the weather; when the sun does not appear, shooting may be enabled even if the light source is located within both shooting ranges.
  • After the flight ends, a report is output using the information (for example, captured images) acquired by the flying object during the flight (SQ106).
  • After the determination step of SQ103, the management server may output a warning to the user that the light source is located within the shooting range. In response to the warning, the user may then select, for example, whether to cancel the flight itself or to proceed to the step of changing the shooting conditions. This allows the user to set the flight plan flexibly.
  • In the above description, the management server 1 as the information processing device executes the processing for the imaging method; instead, an information processing device mounted on the flying object 4 itself, or an information processing device mounted on the flying object storage device 5, may execute the above-described processing.
  • The above-described determination is particularly useful when the object is one, such as a steel tower, through which a light source located on the far side can be seen through gaps or the like. When the object is, for example, a high-rise condominium, a light source located on the other side is hidden by the object; in that case, shooting may be set to be possible even if the light source is located within both shooting ranges, as long as the light source lies within a predetermined angle range centered on the shooting direction or the shooting angle from the shooting point, the range being set based on the width information and the height information of the object (one way to sketch this condition is shown below).
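One way to sketch this hidden-source condition is to test whether the light-source direction falls inside the horizontal angular span that the object subtends from the shooting point (a corresponding vertical test could use the height information); the simple geometry and all names are assumptions for illustration.

```python
import math

def source_hidden_by_object(src_azimuth_deg: float, obj_azimuth_deg: float,
                            obj_width_m: float, obj_distance_m: float) -> bool:
    """True if the light-source direction lies within the horizontal angular
    span subtended by an opaque object as seen from the shooting point."""
    half_span = math.degrees(math.atan2(obj_width_m / 2.0, obj_distance_m))
    diff = abs((src_azimuth_deg - obj_azimuth_deg + 180.0) % 360.0 - 180.0)
    return diff <= half_span
```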

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Mechanical Engineering (AREA)
  • Studio Devices (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The problem addressed by the present invention is to provide an image capture method and an information processing device with which, in particular, even in the case of capturing images against the light, the image can be captured without the light source appearing in it. The solution according to the present invention is an image capture method characterized by comprising: an information acquisition step of acquiring imaging condition information including at least imaging point information and imaging direction information of a flying vehicle, imaging angle-of-view information and imaging angle information of an imaging device mounted on the flying vehicle, and date and time information associated with the imaging point information; a calculation step of calculating the direction in which and the height at which the light source is positioned with respect to the flying vehicle, on the basis of the imaging point information and the date and time information, or of the imaging point information and light source position information acquired in advance; and a determination step in which an imaging range derived from the imaging point information, the date and time information, the imaging direction information, and at least one of the imaging angle-of-view information and the imaging angle information is compared with at least one of the direction in which and the height at which the light source is positioned, thereby determining whether the light source is positioned within the imaging range.
PCT/JP2019/050206 2019-12-20 2019-12-20 Flying vehicle image capture method and information processing device WO2021124579A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2020519143A JPWO2021124579A1 (ja) 2019-12-20 2019-12-20 Imaging method for flying object and information processing device
PCT/JP2019/050206 WO2021124579A1 (fr) 2019-12-20 2019-12-20 Flying vehicle image capture method and information processing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/050206 WO2021124579A1 (fr) 2019-12-20 2019-12-20 Flying vehicle image capture method and information processing device

Publications (1)

Publication Number Publication Date
WO2021124579A1 (fr)

Family

ID=76476743

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/050206 WO2021124579A1 (fr) 2019-12-20 2019-12-20 Flying vehicle image capture method and information processing device

Country Status (2)

Country Link
JP (1) JPWO2021124579A1 (fr)
WO (1) WO2021124579A1 (fr)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4678649B2 (ja) * 2005-10-31 2011-04-27 富士通株式会社 Image processing device
JPWO2013136399A1 (ja) * 2012-03-12 2015-07-30 パナソニックIpマネジメント株式会社 Information providing system, information providing device, imaging device, and computer program
KR20150106719A (ko) * 2014-03-12 2015-09-22 삼성전자주식회사 Method for guiding a shooting position of an electronic device, and electronic device using the same

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011099049A1 (fr) * 2010-02-09 2011-08-18 トヨタ自動車株式会社 Imaging system
JP2013054545A (ja) * 2011-09-05 2013-03-21 Mitsubishi Motors Corp Driving support device
JP2017068639A (ja) * 2015-09-30 2017-04-06 セコム株式会社 Autonomous mobile robot
JP2018092237A (ja) * 2016-11-30 2018-06-14 キヤノンマーケティングジャパン株式会社 Unmanned aerial vehicle control system, control method of unmanned aerial vehicle control system, and program

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114750972A (zh) * 2022-04-27 2022-07-15 西安理工大学 Multi-rotor unmanned aerial vehicle recovery auxiliary navigation device and method
CN114750972B (zh) * 2022-04-27 2024-05-14 西安理工大学 Multi-rotor unmanned aerial vehicle recovery auxiliary navigation device and method

Also Published As

Publication number Publication date
JPWO2021124579A1 (ja) 2021-12-23

Similar Documents

Publication Publication Date Title
US20200026720A1 (en) Construction and update of elevation maps
CN107209514B Selective processing of sensor data
JP6878567B2 Three-dimensional shape estimation method, flying object, mobile platform, program, and recording medium
WO2021199449A1 Position calculation method and information processing system
JP2020140726A Flight management server and flight management system for unmanned aerial vehicles
WO2021124579A1 Flying vehicle image capture method and information processing device
JP6966810B2 Management server and management system, display information generation method, and program
JP2021100234A Imaging method for flying object and information processing device
WO2020225979A1 Information processing device, information processing method, program, and information processing system
JPWO2021079516A1 Flight path creation method for flying object and management server
JP6730764B1 Flight path display method for flying object and information processing device
JP7004374B1 Movement path generation method and program for moving object, management server, and management system
JP2020036163A Information processing device, imaging control method, program, and recording medium
JP6818379B1 Flight path creation method for flying object and management server
JP6684012B1 Information processing device and information processing method
JP6800505B1 Management server and management system for flying object
WO2020204200A1 Work plan generation system
JP6899108B1 Instrument indicated value reading method, management server, instrument indicated value reading system, and program
JP6934646B1 Flight restricted area setting method, waypoint setting method, management server, information processing system, and program
JP7370045B2 Dimension display system and dimension display method
WO2022113482A1 Information processing device, method, and program
JP6810498B1 Flight path creation method for flying object and management server
JP6810497B1 Flight path creation method for flying object and management server
JP6978026B1 Waypoint setting method, waypoint correction method, management server, information processing system, and program
WO2021049508A1 Dimension display system and dimension display method

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2020519143

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19956741

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19956741

Country of ref document: EP

Kind code of ref document: A1