WO2021124579A1 - Image capturing method of flight vehicle and information processing device - Google Patents

Image capturing method of flight vehicle and information processing device Download PDF

Info

Publication number
WO2021124579A1
WO2021124579A1 · PCT/JP2019/050206 (JP2019050206W)
Authority
WO
WIPO (PCT)
Prior art keywords
information
shooting
light source
imaging
date
Prior art date
Application number
PCT/JP2019/050206
Other languages
French (fr)
Japanese (ja)
Inventor
西本 晋也
兼太郎 深見
Original Assignee
株式会社センシンロボティクス
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社センシンロボティクス
Priority to PCT/JP2019/050206 (WO2021124579A1)
Priority to JP2020519143A (JPWO2021124579A1)
Publication of WO2021124579A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U50/00Propulsion; Power supply
    • B64U50/10Propulsion
    • B64U50/19Propulsion using electrically powered motors
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00Constructional aspects of UAVs
    • B64U20/80Arrangement of on-board electronics, e.g. avionics systems or wiring
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00Type of UAV
    • B64U10/10Rotorcrafts
    • B64U10/13Flying platforms

Definitions

  • The present invention relates to an imaging method for a flight vehicle and an information processing device.
  • Patent Document 1 discloses an imaging method in which, when imaging would be performed against backlight, an unmanned aerial vehicle is controlled to fly to a position where the imaging is not backlit.
  • However, with the technique of Patent Document 1, when capturing a complete image of the object, it is not possible to shoot from the side where the image would be backlit; to capture the complete image, the user therefore bears the burden of adjustments such as re-shooting on a different date and time.
  • The present invention has been made in view of this background, and its object is to provide an imaging method and an information processing device that, even when shooting against the light, confirm whether the light source is located within the shooting range so that imaging can be performed without the light source appearing in the image.
  • The main invention of the present invention for solving the above problems is a method of imaging with a flight vehicle connected to an information processing device that controls the flight vehicle in order to image an object, the method comprising: an information acquisition step of acquiring shooting condition information including at least shooting point information and shooting direction information of the flying object, shooting angle-of-view information and shooting angle information of an imaging device mounted on the flying object, and date and time information related to the shooting point information; a calculation step of calculating the direction and altitude at which the light source is located with respect to the flying object, based on the shooting point information and the date and time information, or on the shooting point information and position information of the light source acquired in advance; and a determination step of comparing the shooting range derived from the shooting point information, the date and time information, the shooting direction information, and at least one of the shooting angle-of-view information or the shooting angle information with at least one of the direction or the altitude at which the light source is located, to determine whether the light source is located within the shooting range.
  • According to the present invention, it is possible to provide an imaging method and an information processing device that enable imaging without the light source appearing in the image by confirming whether the light source is located within the shooting range, particularly even in the case of shooting against the sun.
  • FIG. 1 is a diagram showing the structure of the management system according to an embodiment of the present invention. The remaining figures show: a block diagram of the hardware configuration of the management server; a block diagram of the hardware configuration of the user terminal; a block diagram of the hardware configuration of the flying object; a block diagram of the functions of the management server; a block diagram of the structure of the parameter information storage unit; a flowchart of the imaging method according to the embodiment; and diagrams showing examples of the shooting range according to the embodiment.
  • The flight management server and flight management system according to the present embodiment have the following configurations.
  • [Item 1] A method of imaging with a flight vehicle connected to an information processing device that controls the flight vehicle in order to image an object, the method comprising: an information acquisition step of acquiring shooting condition information including at least shooting point information and shooting direction information of the flying object, shooting angle-of-view information and shooting angle information of an imaging device mounted on the flying object, and date and time information related to the shooting point information; a calculation step of calculating the direction and altitude at which the light source is located with respect to the flying object, based on the shooting point information and the date and time information, or on the shooting point information and position information of the light source acquired in advance; and a determination step of comparing the shooting range derived from the shooting point information, the date and time information, the shooting direction information, and at least one of the shooting angle-of-view information or the shooting angle information with at least one of the direction or the altitude at which the light source is located, to determine whether the light source is located within the shooting range.
  • [Item 2] The imaging method according to Item 1, wherein the determination step determines whether the light source is located within at least one of the vertical or horizontal shooting ranges.
  • [Item 3] The imaging method according to Item 1 or 2, wherein, when the light source is determined to be located within the shooting range, the management server outputs a warning to the user.
  • [Item 4] The imaging method according to any one of Items 1 to 3, further comprising a shooting condition changing step of changing at least a part of the shooting condition information.
  • [Item 5] The imaging method according to any one of the above items, wherein the light source is an artificial object that emits light.
  • [Item 6] The imaging method according to any one of the above items, wherein the light source is the sun.
  • [Item 7] An information processing device that controls a flight vehicle in order to image an object, the device comprising: an information acquisition unit that acquires shooting condition information including at least shooting point information and shooting direction information of the flying object, shooting angle-of-view information and shooting angle information of an imaging device mounted on the flying object, and date and time information related to the shooting point information; a calculation unit that calculates the direction and altitude at which the light source is located with respect to the flying object, based on the shooting point information and the date and time information, or on the shooting point information and position information of the light source acquired in advance; and a determination unit that compares the shooting range derived from the shooting point information, the date and time information, the shooting direction information, and at least one of the shooting angle-of-view information or the shooting angle information with at least one of the direction or the altitude of the light source, to determine whether the light source is located within the shooting range.
  • As shown in FIG. 1, the management system according to the present embodiment includes a management server 1, one or more user terminals 2, one or more flying objects 4, and one or more flying object storage devices 5.
  • the management server 1, the user terminal 2, the flying object 4, and the flying object storage device 5 are connected to each other so as to be able to communicate with each other via a network.
  • The illustrated configuration is an example and is not limited to this; for example, a configuration in which the flying object is carried by the user, without the flying object storage device 5, may be used.
  • FIG. 2 is a diagram showing a hardware configuration of the management server 1.
  • the illustrated configuration is an example, and may have other configurations.
  • the management server 1 is connected to a plurality of user terminals 2, an air vehicle 4, and an air vehicle storage device 5 to form a part of this system.
  • the management server 1 may be a general-purpose computer such as a workstation or a personal computer, or may be logically realized by cloud computing.
  • the management server 1 includes at least a processor 10, a memory 11, a storage 12, a transmission / reception unit 13, an input / output unit 14, and the like, and these are electrically connected to each other through a bus 15.
  • the processor 10 is an arithmetic unit that controls the operation of the entire management server 1, controls the transmission and reception of data between each element, and performs information processing necessary for application execution and authentication processing.
  • The processor 10 is a CPU (Central Processing Unit) and/or a GPU (Graphics Processing Unit), and executes each information processing by running programs for the system that are stored in the storage 12 and loaded into the memory 11.
  • The memory 11 includes a main memory composed of a volatile storage device such as a DRAM (Dynamic Random Access Memory), and an auxiliary storage composed of a non-volatile storage device such as a flash memory or an HDD (Hard Disk Drive).
  • the memory 11 is used as a work area of the processor 10, and also stores a BIOS (Basic Input / Output System) executed when the management server 1 is started, various setting information, and the like.
  • the storage 12 stores various programs such as application programs.
  • a database storing data used for each process may be built in the storage 12.
  • the transmission / reception unit 13 connects the management server 1 to the network and the blockchain network.
  • The transmission / reception unit 13 may also be provided with short-range communication interfaces such as Bluetooth (registered trademark) and BLE (Bluetooth Low Energy).
  • the input / output unit 14 is an information input device such as a keyboard and a mouse, and an output device such as a display.
  • the bus 15 is commonly connected to each of the above elements and transmits, for example, an address signal, a data signal, and various control signals.
  • the user terminal 2 shown in FIG. 3 also includes a processor 20, a memory 21, a storage 22, a transmission / reception unit 23, an input / output unit 24, and the like, which are electrically connected to each other through a bus 25. Since the functions of each element can be configured in the same manner as the management server 1 described above, detailed description of each element will be omitted.
  • FIG. 4 is a block diagram showing a hardware configuration of the air vehicle 4.
  • the flight controller 41 can have one or more processors such as a programmable processor (eg, central processing unit (CPU)).
  • the flight controller 41 has a memory 411 and can access the memory.
  • Memory 411 stores logic, code, and / or program instructions that the flight controller can execute to perform one or more steps.
  • The flight controller 41 may include sensors 412 such as an inertial sensor (accelerometer, gyro sensor), a GPS sensor, and a proximity sensor (for example, lidar).
  • The memory 411 may include, for example, a removable medium such as an SD card, a random access memory (RAM), or an external storage device.
  • the data acquired from the cameras / sensors 42 may be directly transmitted and stored in the memory 411.
  • Still image / moving image data taken by the camera or the like may be recorded in the built-in memory or an external memory; however, the present invention is not limited to this, and the data may be recorded from the camera / sensors 42 or the built-in memory, via the network NW, in at least one of the management server 1, the user terminal 2, or the flying object storage device 5.
  • the camera 42 is installed on the flying object 4 via the gimbal 43.
  • the flight controller 41 includes a control module (not shown) configured to control the state of the flying object.
  • The control module adjusts the spatial disposition, velocity, and/or acceleration of the flying object, which has six degrees of freedom (translational motions x, y and z, and rotational motions θx, θy and θz).
  • To that end, the control module controls the propulsion mechanism (motor 45 and the like) of the flying object via an ESC 44 (Electric Speed Controller).
  • The propeller 46 is rotated by the motor 45, which is supplied with power from the battery 48, to generate lift for the flying object.
  • the control module can control one or more of the states of the mounting unit and the sensors.
  • The flight controller 41 can communicate with a transmission / reception unit 47 configured to transmit and/or receive data to and from one or more external devices (for example, a transmitter/receiver (propo) 49, a terminal, a display device, or another remote controller).
  • the transmitter / receiver 49 can use any suitable communication means such as wired communication or wireless communication.
  • The transmission / reception unit 47 can use one or more of a local area network (LAN), a wide area network (WAN), infrared, wireless, WiFi, a point-to-point (P2P) network, a telecommunications network, cloud communication, and the like.
  • The transmission / reception unit 47 can transmit and/or receive one or more of the data acquired by the sensors 42, processing results generated by the flight controller 41, predetermined control data, user commands from a terminal or remote controller, and the like.
  • The sensors 42 may include an inertial sensor (accelerometer, gyro sensor), a GPS sensor, a proximity sensor (for example, lidar), or a vision/image sensor (for example, a camera).
  • FIG. 5 is a block diagram illustrating the functions implemented in the management server 1.
  • the management server 1 includes a communication unit 110, a flight mission generation unit 120, an information acquisition unit 130, a light source position calculation unit 140, a determination unit 150, a storage unit 160, and a report generation unit 170.
  • the flight mission generation unit 120 includes a flight path generation unit 121.
  • the storage unit 160 includes a flight path information storage unit 162, a flight log storage unit 164, and a shooting condition parameter storage unit 166.
  • The storage unit 160 may further have storage units (not shown) that store information necessary for imaging, for example information on flight conditions (such as flight speed and waypoint interval), information on the object to be photographed (such as position coordinates and height information), and information on the surrounding environment of the object (such as terrain and surrounding structures).
  • the communication unit 110 communicates with the user terminal 2, the flying object 4, and the flying object storage device 5.
  • the communication unit 110 also functions as a reception unit that receives flight requests from the user terminal 2.
  • Flight mission generation unit 120 generates flight missions.
  • The flight mission is information including at least a flight path comprising shooting point information (so-called waypoints, each including latitude/longitude information and flight altitude information), shooting direction information, and shooting date/time information.
  • The flight path may be set by a known method, for example by referring to manually set shooting points and shooting directions, or by setting the position coordinates of the imaging target and the shooting distance from the target, from which the shooting points and shooting directions may be automatically calculated and set.
  • The flight path may be configured, for example, without the flying object storage device 5, so that the position where the aircraft is carried by the user is set as the flight start position and the user collects the aircraft at the flight end position. Alternatively, based on information on the flying object storage devices 5 managed by the management server 1 (for example, position information, storage state information, and stored-aircraft information), the flight path may be generated so as to include the position of a flying object storage device 5 selected as the flight start position, an intermediate stopover, or the flight end position.
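As a concrete illustration of the flight mission described above, the shooting points making up the flight path could be represented as follows (a minimal Python sketch; the class and field names are assumptions for illustration and are not taken from the patent):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Waypoint:
    """One shooting point on the flight path."""
    latitude: float            # shooting point latitude
    longitude: float           # shooting point longitude
    altitude_m: float          # flight altitude in metres
    shoot_azimuth_deg: float   # shooting direction, compass degrees (0 = north)
    shoot_angle_deg: float     # vertical camera angle (0 = level, + = up)
    shoot_at: datetime         # shooting date/time for this waypoint

@dataclass
class FlightMission:
    """A flight mission: the ordered waypoints making up the flight path."""
    waypoints: list            # list of Waypoint, in flight order
```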
  • the information acquisition unit 130 acquires the shooting condition information of the flying object.
  • the acquisition of the shooting condition information referred to here is the acquisition of shooting condition parameters including at least flight path information, date and time information, shooting angle of view information, and shooting angle information.
  • As the shooting condition parameters, for example, default values stored in advance in the storage unit 160 may be acquired, or values may be automatically calculated and acquired according to the size of the object to be photographed.
  • the light source position calculation unit 140 calculates the direction and altitude position of the light source with respect to the flying object (hereinafter referred to as the light source position) based on, for example, the shooting point information and the shooting date / time information.
  • When the light source is the sun, the solar altitude and direction (hereinafter, the "position of the sun") can be calculated by a known method. More specifically, since the position of the sun can be obtained by substituting the date and time and the latitude/longitude into known formulas for the solar position, the light source position calculation unit 140 can calculate it directly by such a known method.
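The patent does not reproduce the "known mathematical formula" for the position of the sun; the following is one common simplified astronomical approximation, sketched for illustration (accuracy of roughly one degree, which is ample for a field-of-view check; the function name is an assumption):

```python
import math
from datetime import datetime, timezone

def solar_position(lat_deg, lon_deg, when_utc):
    """Approximate solar elevation and azimuth (both in degrees, azimuth
    clockwise from north) for a UTC datetime and observer latitude/longitude.
    Uses standard low-precision formulas (~1 degree accuracy)."""
    # Days since the J2000.0 epoch (2000-01-01 12:00 UTC)
    d = (when_utc - datetime(2000, 1, 1, 12, tzinfo=timezone.utc)).total_seconds() / 86400.0
    # Mean anomaly and mean longitude of the sun (degrees)
    g = math.radians((357.529 + 0.98560028 * d) % 360.0)
    q = (280.459 + 0.98564736 * d) % 360.0
    # Apparent ecliptic longitude and obliquity of the ecliptic
    lam = math.radians((q + 1.915 * math.sin(g) + 0.020 * math.sin(2.0 * g)) % 360.0)
    eps = math.radians(23.439 - 0.00000036 * d)
    # Declination and right ascension of the sun
    dec = math.asin(math.sin(eps) * math.sin(lam))
    ra = math.atan2(math.cos(eps) * math.sin(lam), math.cos(lam))
    # Greenwich mean sidereal time (hours) -> local hour angle (radians)
    gmst = (18.697374558 + 24.06570982441908 * d) % 24.0
    lha = math.radians(gmst * 15.0 + lon_deg) - ra
    lat = math.radians(lat_deg)
    elevation = math.asin(math.sin(lat) * math.sin(dec)
                          + math.cos(lat) * math.cos(dec) * math.cos(lha))
    azimuth = math.atan2(-math.sin(lha),
                         math.tan(dec) * math.cos(lat) - math.sin(lat) * math.cos(lha))
    return math.degrees(elevation), math.degrees(azimuth) % 360.0
```

The returned pair corresponds directly to the "altitude" and "direction" of the light source used by the determination step.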
  • the light source position calculation unit 140 calculates the light source position with respect to the flying object based on, for example, the shooting point information and the position information of the light source acquired in advance.
  • When the light source is an artificial object, its position information can be acquired from stored information on the position from which the light is emitted (for example, latitude/longitude information and height position information), and the position of the light source with respect to the shooting point can be calculated from the relative relationship between the light source position information and the shooting point information.
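The relative relationship mentioned above can be sketched as follows (illustrative Python, not from the patent; it uses a flat-earth approximation, which is adequate for the short distances involved in structure inspection, and the names are assumptions):

```python
import math

def bearing_and_elevation(shoot_lat, shoot_lon, shoot_alt_m,
                          light_lat, light_lon, light_alt_m):
    """Compass bearing and elevation angle (degrees) of a fixed light
    source as seen from the shooting point, derived from stored
    latitude/longitude and height values."""
    # Metres per degree of latitude, and of longitude at this latitude
    dlat_m = (light_lat - shoot_lat) * 111_320.0
    dlon_m = (light_lon - shoot_lon) * 111_320.0 * math.cos(math.radians(shoot_lat))
    bearing = math.degrees(math.atan2(dlon_m, dlat_m)) % 360.0
    horizontal_m = math.hypot(dlat_m, dlon_m)
    elevation = math.degrees(math.atan2(light_alt_m - shoot_alt_m, horizontal_m))
    return bearing, elevation
```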
  • The determination unit 150 determines whether the light source is within the shooting range. This determination compares the shooting condition information (in particular, the shooting point information and shooting direction information on the flight path, and the shooting angle-of-view and shooting angle information) with the light source position information described above, so as to determine whether the light source is located within the shooting range from the shooting point. As a more specific example, in the horizontal direction, it is determined by comparing whether the direction in which the light source is located falls within the shooting range formed by the horizontal angle of view centered on the shooting direction from the shooting point. Likewise, in the vertical direction, it is determined by comparing whether the altitude at which the light source is located falls within the shooting range formed by the vertical angle of view centered on the shooting angle from the shooting point.
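The horizontal and vertical comparisons described above can be sketched as follows (illustrative Python; the patent does not specify an implementation, and the function and parameter names are assumptions):

```python
def light_source_in_range(shoot_azimuth, shoot_angle, h_fov, v_fov,
                          light_azimuth, light_elevation):
    """Return True if the light source lies inside the shooting range.

    shoot_azimuth  -- shooting direction, compass degrees (0 = north)
    shoot_angle    -- vertical camera angle, degrees (0 = level, + = up)
    h_fov, v_fov   -- horizontal / vertical angles of view, degrees
    light_azimuth, light_elevation -- direction / altitude of the light source
    """
    # Signed horizontal offset folded into (-180, 180] so that a range
    # straddling north (e.g. shooting at 350 degrees) is handled correctly
    d_az = (light_azimuth - shoot_azimuth + 180.0) % 360.0 - 180.0
    in_horizontal = abs(d_az) <= h_fov / 2.0
    in_vertical = abs(light_elevation - shoot_angle) <= v_fov / 2.0
    # Only a source inside BOTH ranges would appear in the image
    return in_horizontal and in_vertical
```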
  • the flight path information storage unit 162 stores the shooting point information, the shooting direction information, and the shooting date / time information of the flying object generated by the flight mission generation unit 120.
  • The flight log storage unit 164 stores, for example, information acquired by the flying object 4 on the flight path set in the flight mission (for example, position information from takeoff to landing, still images, moving images, sound, and other information).
  • The shooting condition parameter storage unit 166 includes at least a shooting angle-of-view information storage unit 1661, a shooting angle information storage unit 1662, and a light source position information storage unit 1663.
  • the report generation unit 170 generates the report information displayed on the user terminal 2 based on the flight log storage unit 164.
  • FIG. 7 illustrates a flowchart of the imaging method according to the present embodiment.
  • This flowchart illustrates a configuration in which an application is started on the user terminal 2, but the present invention is not limited to this; for example, the management server 1 and the flying object storage device 5 may be provided with a processor and input/output devices capable of starting the application, so that various settings and the like can also be made there.
  • FIGS. 8 to 13 are examples for explaining the shooting range in the imaging method according to the embodiment of the present invention.
  • the information acquisition unit 130 acquires the shooting condition information of the flying object from the storage unit 160 (SQ101).
  • the acquisition of the shooting condition information referred to here is the acquisition of shooting condition parameters including at least flight path information, date and time information, shooting angle of view information, and shooting angle information.
  • As the shooting condition parameters, for example, default values may be acquired in advance, or they may be automatically acquired according to the size of the object to be photographed.
  • In the present embodiment, the object to be photographed is illustrated as a steel tower, but the object is not limited to this; any object that can be photographed by the camera 42 may be used, for example a structure such as a high-rise apartment building, a house, a chimney, an antenna tower, a lighthouse, a windmill, a tree, or a Kannon statue, a creature such as a person or an animal, or smoke from a fire.
  • the management server 1 calculates the position information of the light source by the light source position calculation unit 140 (SQ102).
  • The light source is not limited to one that emits light by itself (for example, the sun, the moon, or lighting); it also includes light produced by reflection of light emitted from another object (for example, light reflected on glass or a window).
  • the management server 1 determines whether or not the light source is within the shooting range by the determination unit 150 (SQ103). Here, if it is determined that the light source is not located within the shooting range from the shooting point, the flying object starts the flight based on the shooting condition information (SQ104).
  • For example, even if the light source is included in the horizontal shooting range as illustrated in FIG. 9, it may be determined that the light source is not located within the shooting range if the light source is not included in the vertical shooting range as illustrated in FIG. 8.
  • Likewise, even if the light source is included in the vertical shooting range as illustrated in FIG. 11, it may be determined that the light source is not located within the shooting range if the light source is not included in the horizontal shooting range as illustrated in FIG. 12.
  • Alternatively, it may be determined that the light source is not located within the shooting range only when the light source is included in neither the vertical nor the horizontal shooting range.
  • On the other hand, when the light source is determined to be within the shooting range, any item of the shooting condition information may be changed, as long as the change results in a determination that the light source is no longer located within the shooting range from the shooting point.
  • One item of the shooting condition information to be changed may be the shooting angle. For example, when the light source is located within the shooting range while an object is shot pointing downward from above as shown in FIG. 11, the situation can be handled by changing the shooting angle to point upward instead of downward. At this time, the traveling direction along the flight path may also be changed from bottom-to-top instead of top-to-bottom. Conversely, when the light source is above and is located within the shooting range under shooting conditions such as those of FIG. 10, the situation can be handled by changing the shooting angle as shown in FIG. 11.
  • Further, when the light source is above, the light source may be located within the shooting range at a shooting angle such as that of FIG. 8. In that case, by changing the shooting altitude in addition to the shooting angle, the shooting conditions can be changed so that the light source is not located within the shooting range.
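One way to realize the shooting-angle change described above is to tilt the camera away from the light source until it leaves the vertical shooting range (an illustrative sketch under the assumption of a light source at a fixed elevation; the step size and function name are assumptions, not from the patent):

```python
def tilt_away_from_light(shoot_angle, v_fov, light_elevation, step=5.0):
    """Tilt the camera away from a light source at a fixed elevation,
    in `step`-degree increments, until the source leaves the vertical
    shooting range; returns the adjusted shooting angle (degrees)."""
    while abs(light_elevation - shoot_angle) <= v_fov / 2.0:
        # Light above the current angle -> tilt down; otherwise tilt up
        shoot_angle += -step if light_elevation > shoot_angle else step
    return shoot_angle
```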
  • One item of the shooting condition information to be changed may also be the shooting point and the shooting direction. For example, although the light source is located within the shooting range in the positional relationship of FIG. 9, the shooting conditions can be changed so that the light source is not located within the shooting range by changing the shooting point and the shooting direction as shown in FIG. 12. At this time, instead of changing the shooting direction of the aircraft, the shooting angle of the imaging device may be changed.
  • One item of the shooting condition information to be changed may also be the shooting date and time. For example, while the light source is at the position of FIG. 11 at the originally set date and time, changing the shooting date and time may place the light source at the position of FIG. 8, so the situation can be handled without changing any other shooting conditions.
  • Further, when the light source is an artificial object, time-zone information indicating when the artificial object does and does not emit light may be stored in the storage unit 160 and compared with the date and time information. Since it can thereby be determined whether the light source emits light at the set date and time, shooting may be enabled, even if the light source is located within both shooting ranges, by changing the shooting date and time to a time during which the light source does not emit light.
  • Similarly, when the light source is the sun, by referring to whether the weather at the set date and time is such that the sun appears or does not appear, shooting may be enabled even if the light source is located within both shooting ranges.
  • the report is output using the information (for example, captured image) acquired by the flying object during the flight (SQ106).
  • Instead of the shooting condition changing step, the management server may output a warning to the user that the light source is located within the shooting range after the confirmation step of SQ103. In response to the warning, the user may then be allowed to select whether to proceed to a step of canceling the flight itself, a step of changing the shooting conditions, and the like. This allows the user to set the flight plan flexibly.
  • In the above description, the management server 1 as the information processing device executes the processing for the imaging method; instead, however, an information processing device mounted on the flying object 4 itself, or an information processing device mounted on the flying object storage device 5, may be configured to execute the above processing.
  • When the object is something like a steel tower, through which a light source located on the far side can be seen through gaps or the like, the above determination may be applied as is. When the object is, for example, a high-rise condominium, the light source located on the far side is hidden by the object; in that case, shooting may be set to be possible, even if the light source is located within both shooting ranges, within a predetermined angle range centered on the shooting direction or shooting angle from the shooting point, set based on the width information and height information of the object.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Mechanical Engineering (AREA)
  • Studio Devices (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

[Problem] To provide an image capturing method and an information processing device with which, even when capturing images against the light, the image capture can be performed without the light source appearing in the image. [Solution] An image capturing method according to the present invention is characterized by comprising: an information acquisition step of acquiring imaging condition information including at least imaging point information and imaging direction information of a flight vehicle, imaging view-angle information and imaging angle information of an imaging device mounted on the flight vehicle, and date and time information related to the imaging point information; a calculation step of calculating the direction in which and the height at which the light source is positioned relative to the flight vehicle, on the basis of the imaging point information and the date and time information, or of the imaging point information and position information of the light source acquired in advance; and a determination step in which an imaging range derived from the imaging point information, the date and time information, the imaging direction information, and at least one of the imaging view-angle information or the imaging angle information is compared with at least one of the direction or the height at which the light source is positioned, thereby determining whether the light source is positioned within the imaging range.

Description

Image capturing method of flight vehicle and information processing device
The present invention relates to an imaging method for a flight vehicle and an information processing device.
In recent years, flight vehicles such as drones and unmanned aerial vehicles (UAVs) (hereinafter collectively referred to as "flight vehicles") have begun to be used in industry. Against this background, Patent Document 1 discloses an imaging method in which, when an image would be captured against the light, an unmanned aerial vehicle is controlled to fly to a position where the image is no longer backlit.
JP-A-2018-092237
However, with the technique disclosed in Patent Document 1, when capturing a full view of the imaging target, shooting from the side on which the image would be backlit is not possible. To capture the full view, the user therefore bears the burden of adjustments such as rescheduling the shoot for another date and time.
Further, software correction techniques for backlighting have advanced in recent years, making it possible to capture images even against the light; however, such correction remains difficult particularly when the light source itself appears within the shooting range.
The present invention has been made in view of this background, and its object is to provide an imaging method and an information processing device that, by checking whether a light source lies within the shooting range, enable image capture without the light source appearing in the frame, even when shooting against the light.
The main aspect of the present invention for solving the above problems is a method of imaging with a flight vehicle connected to an information processing device that controls the flight vehicle in order to image an object, the method comprising: an information acquisition step of acquiring shooting condition information including at least shooting point information and shooting direction information for the flight vehicle, shooting angle-of-view information and shooting angle information for an imaging device mounted on the flight vehicle, and date and time information associated with the shooting point information; a calculation step of calculating the azimuth and altitude at which the light source is located relative to the flight vehicle, on the basis of the shooting point information and the date and time information, or of the shooting point information and previously acquired position information of the light source; and a determination step of comparing a shooting range derived from the shooting point information, the date and time information, the shooting direction information, and at least one of the shooting angle-of-view information and the shooting angle information, with at least one of the azimuth and the altitude at which the light source is located, to determine whether the light source lies within the shooting range.
According to the present invention, it is possible to provide an imaging method and an information processing device that, by checking whether a light source lies within the shooting range, enable image capture without the light source appearing in the frame, particularly even when shooting against the light.
FIG. 1 is a diagram showing the configuration of a management system according to an embodiment of the present invention.
FIG. 2 is a block diagram showing the hardware configuration of the management server of FIG. 1.
FIG. 3 is a block diagram showing the hardware configuration of the user terminal of FIG. 1.
FIG. 4 is a block diagram showing the hardware configuration of the flight vehicle of FIG. 1.
FIG. 5 is a block diagram showing the functions of the management server of FIG. 1.
FIG. 6 is a block diagram showing the structure of the parameter information storage unit of FIG. 5.
FIG. 7 is a flowchart of an imaging method according to an embodiment of the present invention.
FIGS. 8 to 13 are diagrams each showing an example of a shooting range according to an embodiment of the present invention.
The contents of the embodiments of the present invention are listed and described below. An image capturing method and an information processing device according to embodiments of the present invention have the following configurations.
[Item 1]
A method of imaging with a flight vehicle connected to an information processing device that controls the flight vehicle in order to image an object, the method comprising:
an information acquisition step of acquiring shooting condition information including at least shooting point information and shooting direction information for the flight vehicle, shooting angle-of-view information and shooting angle information for an imaging device mounted on the flight vehicle, and date and time information associated with the shooting point information;
a calculation step of calculating the azimuth and altitude at which the light source is located relative to the flight vehicle, on the basis of the shooting point information and the date and time information, or of the shooting point information and previously acquired position information of the light source; and
a determination step of comparing a shooting range derived from the shooting point information, the date and time information, the shooting direction information, and at least one of the shooting angle-of-view information and the shooting angle information, with at least one of the azimuth and the altitude at which the light source is located, to determine whether the light source lies within the shooting range.
[Item 2]
The imaging method according to Item 1, wherein
the determination step determines whether the light source lies within the shooting range in at least one of the vertical direction and the horizontal direction.
[Item 3]
The imaging method according to Item 1 or 2, wherein,
when the determination step determines that the light source lies within the shooting range, the management server outputs a warning to the user.
[Item 4]
The imaging method according to any one of Items 1 to 3, further comprising
a shooting condition changing step of changing at least part of the shooting condition information when the determination step determines that the light source lies within the shooting range.
[Item 5]
The imaging method according to any one of Items 1 to 4, wherein
the light source is an artificial object that emits light, the method further comprising:
a step of acquiring time zone information indicating the time zones in which the artificial object does or does not emit light; and
a step of comparing the date and time information with the time zone information to determine whether the time zone in which the flight vehicle flies falls within a time zone in which the artificial object does or does not emit light.
[Item 6]
The imaging method according to any one of Items 1 to 4, wherein
the light source is the sun, the method further comprising:
a step of acquiring weather information for the area in which the object is located;
a step of acquiring, on the basis of the weather information, the time zones of weather in which the sun does or does not appear; and
a step of comparing the date and time information with the time zone information to determine whether the time zone in which the flight vehicle flies falls within a time zone of weather in which the sun does or does not appear.
[Item 7]
An information processing device that controls a flight vehicle in order to image an object, the information processing device comprising:
an information acquisition unit that acquires shooting condition information including at least shooting point information and shooting direction information for the flight vehicle, shooting angle-of-view information and shooting angle information for an imaging device mounted on the flight vehicle, and date and time information associated with the shooting point information;
a calculation unit that calculates the azimuth and altitude at which the light source is located relative to the flight vehicle, on the basis of the shooting point information and the date and time information, or of previously acquired position information of the light source; and
a determination unit that compares a shooting range derived from the shooting point information, the date and time information, the shooting direction information, and at least one of the shooting angle-of-view information and the shooting angle information, with at least one of the azimuth and the altitude at which the light source is located, to determine whether the light source lies within the shooting range.
<Details of the embodiments>
Hereinafter, embodiments of an imaging method for a flight vehicle and an information processing device according to the present invention will be described. In the accompanying drawings, identical or similar elements are given identical or similar reference numerals and names, and duplicate descriptions of identical or similar elements may be omitted in the description of each embodiment. The features shown in each embodiment can also be applied to other embodiments as long as they do not contradict one another.
<Structure>
As shown in FIG. 1, the management system according to the present embodiment includes a management server 1, one or more user terminals 2, one or more flight vehicles 4, and one or more flight vehicle storage devices 5. The management server 1, the user terminals 2, the flight vehicles 4, and the flight vehicle storage devices 5 are communicably connected to one another via a network. The illustrated configuration is an example and is not limiting; for example, a configuration without the flight vehicle storage device 5, in which the vehicle is carried by the user, may also be used.
<Management server 1>
FIG. 2 is a diagram showing the hardware configuration of the management server 1. The illustrated configuration is an example, and other configurations may be used.
As illustrated, the management server 1 is connected to a plurality of user terminals 2, a flight vehicle 4, and a flight vehicle storage device 5, and forms part of the present system. The management server 1 may be a general-purpose computer such as a workstation or a personal computer, or may be logically realized by cloud computing.
The management server 1 includes at least a processor 10, a memory 11, a storage 12, a transmission/reception unit 13, and an input/output unit 14, which are electrically connected to one another through a bus 15.
The processor 10 is an arithmetic unit that controls the overall operation of the management server 1, controls the transmission and reception of data between elements, and performs the information processing required for application execution and authentication. For example, the processor 10 is a CPU (Central Processing Unit) and/or a GPU (Graphics Processing Unit), and carries out each information process by executing programs for the present system that are stored in the storage 12 and loaded into the memory 11.
The memory 11 includes a main memory composed of a volatile storage device such as a DRAM (Dynamic Random Access Memory), and an auxiliary memory composed of a non-volatile storage device such as a flash memory or an HDD (Hard Disk Drive). The memory 11 is used as a work area for the processor 10, and also stores the BIOS (Basic Input/Output System) executed when the management server 1 starts up, various setting information, and the like.
The storage 12 stores various programs such as application programs. A database storing the data used in each process may be built in the storage 12.
The transmission/reception unit 13 connects the management server 1 to the network and to a blockchain network. The transmission/reception unit 13 may also be provided with short-range communication interfaces such as Bluetooth (registered trademark) and BLE (Bluetooth Low Energy).
The input/output unit 14 comprises information input devices such as a keyboard and mouse, and output devices such as a display.
The bus 15 is connected in common to the above elements and transmits, for example, address signals, data signals, and various control signals.
<User terminal 2>
The user terminal 2 shown in FIG. 3 likewise includes a processor 20, a memory 21, a storage 22, a transmission/reception unit 23, and an input/output unit 24, which are electrically connected to one another through a bus 25. Since the functions of each element can be configured in the same manner as in the management server 1 described above, detailed descriptions of each element are omitted.
<Flight vehicle 4>
FIG. 4 is a block diagram showing the hardware configuration of the flight vehicle 4. The flight controller 41 can have one or more processors, such as a programmable processor (e.g., a central processing unit (CPU)).
The flight controller 41 also has a memory 411 and can access that memory. The memory 411 stores logic, code, and/or program instructions that the flight controller can execute to perform one or more steps. The flight controller 41 may also include sensors 412 such as inertial sensors (acceleration sensors, gyro sensors), a GPS sensor, and proximity sensors (e.g., LiDAR).
The memory 411 may include, for example, a removable medium such as an SD card or random-access memory (RAM), or an external storage device. Data acquired from the cameras/sensors 42 may be transmitted directly to, and stored in, the memory 411. For example, still-image and video data captured by a camera may be recorded in internal or external memory; alternatively, such data may be recorded from the camera/sensors 42 or the internal memory, via the network NW, to at least one of the management server 1, the user terminal 2, and the flight vehicle storage device 5. The camera 42 is mounted on the flight vehicle 4 via a gimbal 43.
The flight controller 41 includes a control module (not shown) configured to control the state of the flight vehicle. For example, the control module controls the propulsion mechanism (motors 45 and the like) of the flight vehicle via an ESC 44 (Electric Speed Controller) in order to adjust the spatial arrangement, velocity, and/or acceleration of the flight vehicle, which has six degrees of freedom (translational motions x, y, and z, and rotational motions θx, θy, and θz). Lift is generated by the propellers 46, which are rotated by the motors 45 powered by the battery 48. The control module can also control one or more of the states of the mounted components and sensors.
The flight controller 41 can communicate with a transmission/reception unit 47 configured to transmit and/or receive data from one or more external devices (e.g., a transmitter (radio controller) 49, terminals, display devices, or other remote controllers). The transmitter 49 can use any suitable means of communication, such as wired or wireless communication.
For example, the transmission/reception unit 47 can use one or more of a local area network (LAN), a wide area network (WAN), infrared, radio, Wi-Fi, a point-to-point (P2P) network, a telecommunications network, cloud communication, and the like.
The transmission/reception unit 47 can transmit and/or receive one or more of the data acquired by the sensors 42, processing results generated by the flight controller 41, predetermined control data, user commands from a terminal or remote controller, and the like.
The sensors 42 according to the present embodiment may include inertial sensors (acceleration sensors, gyro sensors), a GPS sensor, proximity sensors (e.g., LiDAR), or vision/image sensors (e.g., cameras).
<Management server functions>
FIG. 5 is a block diagram illustrating the functions implemented in the management server 1. In the present embodiment, the management server 1 includes a communication unit 110, a flight mission generation unit 120, an information acquisition unit 130, a light source position calculation unit 140, a determination unit 150, a storage unit 160, and a report generation unit 170. The flight mission generation unit 120 includes a flight path generation unit 121. The storage unit 160 includes a flight path information storage unit 162, a flight log storage unit 164, and a shooting condition parameter storage unit 166. The storage unit 160 may further include storage units (not shown) that store other information necessary for imaging, for example, information on flight conditions (e.g., flight speed and waypoint spacing), information on the object to be photographed (e.g., position coordinates and height), and information on the environment around the object (e.g., information on terrain and surrounding structures).
The communication unit 110 communicates with the user terminal 2, the flight vehicle 4, and the flight vehicle storage device 5. The communication unit 110 also functions as a reception unit that accepts flight requests from the user terminal 2.
The flight mission generation unit 120 generates flight missions. A flight mission is information including a flight path that contains at least shooting point information (so-called waypoints, including, for example, latitude/longitude information and flight altitude information), shooting direction information, and shooting date and time information. The flight path may be set by a known method: for example, by referring to manually set shooting points and shooting directions, or by automatically calculating and setting the shooting points and shooting directions from, for example, the position coordinates of the imaging target and the shooting distance from the target.
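As a rough illustration of the automatic setting mentioned above (deriving shooting points and shooting directions from the target's position coordinates and a shooting distance), the following Python sketch places shooting points on a circle around the target, each facing inward. The function name, the flat-earth approximation, and the circular layout are illustrative assumptions and are not specified by the patent.

```python
import math

def circular_shooting_points(target_lat, target_lon, distance_m, n_points):
    """Hypothetical sketch: place n_points shooting points on a circle of
    radius distance_m around a target, each with a shooting direction that
    faces the target. Flat-earth approximation; returns a list of
    (lat, lon, shooting_azimuth_deg) tuples."""
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(target_lat))
    points = []
    for i in range(n_points):
        bearing = 360.0 * i / n_points            # position on circle, from north
        dn = distance_m * math.cos(math.radians(bearing))  # north offset (m)
        de = distance_m * math.sin(math.radians(bearing))  # east offset (m)
        lat = target_lat + dn / m_per_deg_lat
        lon = target_lon + de / m_per_deg_lon
        # The camera looks back at the target: opposite of the outward bearing
        shoot_az = (bearing + 180.0) % 360.0
        points.append((lat, lon, shoot_az))
    return points
```

In practice the waypoints would also carry flight altitude and shooting date/time information, as described above.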
The flight path may also be configured, for example, without the flight vehicle storage device 5, so that the position to which the user carries the vehicle serves as the flight start position, or so that the user retrieves the vehicle at the flight end position. Alternatively, the flight path may be generated so as to include the position of a flight vehicle storage device 5 selected as the flight start position, an intermediate stop, or the flight end position, on the basis of information on the flight vehicle storage devices 5 managed by the management server 1 (e.g., position information, storage state information, and stored vehicle information).
The information acquisition unit 130 acquires the shooting condition information of the flight vehicle. Acquisition of the shooting condition information here means acquisition of shooting condition parameters including at least flight path information, date and time information, shooting angle-of-view information, and shooting angle information. Other shooting condition parameters may, for example, be acquired as default values stored in advance in the storage unit 160, or be automatically calculated and acquired according to, for example, the size of the object to be photographed.
The light source position calculation unit 140 calculates, for example, the azimuth and altitude at which the light source is located relative to the flight vehicle (hereinafter referred to as the light source position) on the basis of the shooting point information and the shooting date and time information. When the light source is the sun, for example, the solar altitude and azimuth (hereinafter, the "position of the sun") can be calculated by a known method. More specifically, since the position of the sun can be obtained by substituting the date and time and the latitude/longitude into known formulas for the solar position, it can be calculated either directly by such a known solar position calculation method, or by using a solar position data table (not shown) in which the calculation results are tabulated and the correspondences recorded. Therefore, the position of the sun relative to a shooting point can be calculated by using the above-described shooting condition information (in particular, the shooting point information on the flight path and the shooting date and time information) together with the solar position calculation method or the solar position data table.
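The patent refers only to "known formulas" for the solar position; as a minimal sketch of that kind of calculation, the following Python function uses a simplified declination/hour-angle model (Cooper's declination approximation, local solar time, no equation-of-time or longitude correction). The function name and the level of approximation are assumptions for illustration only.

```python
import math

def solar_position(lat_deg, day_of_year, solar_hour):
    """Approximate solar elevation and azimuth (both in degrees) for a
    given latitude, day of year, and local solar time. Simplified model:
    ignores longitude and equation-of-time corrections."""
    lat = math.radians(lat_deg)
    # Solar declination (Cooper's approximation)
    decl = math.radians(
        23.44 * math.sin(math.radians(360.0 / 365.0 * (284 + day_of_year))))
    # Hour angle: 15 degrees per hour away from solar noon
    ha = math.radians(15.0 * (solar_hour - 12.0))

    sin_el = (math.sin(lat) * math.sin(decl)
              + math.cos(lat) * math.cos(decl) * math.cos(ha))
    el = math.asin(sin_el)

    cos_az = ((math.sin(decl) - math.sin(el) * math.sin(lat))
              / (math.cos(el) * math.cos(lat)))
    az = math.degrees(math.acos(max(-1.0, min(1.0, cos_az))))
    if solar_hour > 12.0:       # afternoon: sun lies west of due north
        az = 360.0 - az
    return math.degrees(el), az
```

For example, near Tokyo's latitude (35°N) at solar noon on the summer solstice, this returns an elevation of roughly 78° with the sun due south (azimuth 180°). An implementation as described in the text could equally look these values up from a precomputed solar position data table.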
The light source position calculation unit 140 also calculates the light source position relative to the flight vehicle on the basis of, for example, the shooting point information and previously acquired position information of the light source. For example, when the light source is an artificial object that emits light (e.g., a lighting fixture), the position of the light source can be obtained from stored information on the position from which the light is emitted (e.g., latitude/longitude information and height information), and the light source position relative to the shooting point can be calculated from the relative relationship between that position information and the shooting point information.
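The relative relationship described above can be sketched as follows: given the shooting point and the stored light source position, compute the azimuth and elevation of the source as seen from the shooting point. This Python sketch uses a flat-earth approximation, which is reasonable over the short ranges typical of drone photography; all names are hypothetical.

```python
import math

def light_source_bearing(shoot_lat, shoot_lon, shoot_alt,
                         src_lat, src_lon, src_alt):
    """Approximate azimuth (degrees clockwise from north) and elevation
    (degrees above horizontal) of a fixed light source as seen from a
    shooting point, using a local flat-earth approximation."""
    # Metres per degree of latitude/longitude near the shooting point
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(shoot_lat))

    dn = (src_lat - shoot_lat) * m_per_deg_lat   # north offset (m)
    de = (src_lon - shoot_lon) * m_per_deg_lon   # east offset (m)
    dz = src_alt - shoot_alt                     # height offset (m)

    azimuth = math.degrees(math.atan2(de, dn)) % 360.0
    elevation = math.degrees(math.atan2(dz, math.hypot(dn, de)))
    return azimuth, elevation
```

The resulting (azimuth, elevation) pair plays the same role as the sun's (azimuth, altitude) in the sun case, so the downstream determination can treat both kinds of light source uniformly.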
The determination unit 150 determines whether the light source falls within the shooting range. This determination can be made by comparing the shooting condition information (in particular, the shooting point information and shooting direction information on the flight path, and the information on the shooting angle of view and shooting angle) with the above-described light source position information, to determine whether the light source is located within the shooting range from the shooting point. As a more specific example of the determination method: in the horizontal direction, it is determined by comparison whether the azimuth at which the light source is located falls within the shooting range formed by the horizontal angle of view centered on the shooting direction from the shooting point; in the vertical direction, it is determined by comparison whether the altitude at which the light source is located falls within the shooting range formed by the vertical angle of view centered on the shooting angle from the shooting point.
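A minimal sketch of the horizontal and vertical comparisons described above, in Python. The function and parameter names are illustrative assumptions; the patent does not prescribe a specific implementation.

```python
def light_in_frame(shoot_azimuth, shoot_elevation,
                   h_fov, v_fov,
                   light_azimuth, light_elevation):
    """Return True if a light source at (light_azimuth, light_elevation)
    lies inside the shooting range, given the shooting direction (azimuth),
    the shooting angle (camera pitch/elevation), and the horizontal and
    vertical angles of view (h_fov, v_fov). All values in degrees."""
    # Horizontal: signed angular difference folded into [-180, 180)
    d_az = (light_azimuth - shoot_azimuth + 180.0) % 360.0 - 180.0
    in_horizontal = abs(d_az) <= h_fov / 2.0
    # Vertical: difference between the light's altitude and the camera pitch
    d_el = light_elevation - shoot_elevation
    in_vertical = abs(d_el) <= v_fov / 2.0
    return in_horizontal and in_vertical
```

For example, with a camera facing east (azimuth 90°), pitched up 10°, and an 84° x 53° field of view, a light source at azimuth 100° and elevation 20° is judged to be inside the frame, while one behind the vehicle at azimuth 250° is not. A judgment on only one of the two directions, as permitted by the claims, would use just one of the two comparisons.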
The flight path information storage unit 162 stores the shooting point information, shooting direction information, and shooting date and time information of the flight vehicle generated by the flight mission generation unit 120. The flight log storage unit 164 stores, for example, information acquired by the flight vehicle 4 along the flight path set in the flight mission (e.g., information on the positions passed from takeoff to landing, still images, moving images, audio, and other information).
 As shown in FIG. 6, the shooting condition parameter storage unit 166 includes at least a shooting angle-of-view information storage unit 1661, a shooting angle information storage unit 1662, and a light source position information storage unit 1663.
 The report generation unit 170 generates the report information displayed on the user terminal 2 based on the contents of the flight log storage unit 164.
<Example of imaging method>
 The imaging method according to the present embodiment will be described with reference to FIGS. 7 to 13. FIG. 7 illustrates a flowchart of the imaging method according to the present embodiment. This flowchart illustrates, by way of example, a configuration in which an application is launched on the user terminal 2; however, the configuration is not limited to this, and, for example, the management server 1 or the flight vehicle storage device 5 may have a processor capable of launching the application and an input/output device, allowing the various settings to be made there. FIGS. 8 to 13 are examples for explaining the shooting range in the imaging method according to the embodiment of the present invention.
 First, the information acquisition unit 130 acquires the shooting condition information of the flight vehicle from the storage unit 160 (SQ101). Acquiring the shooting condition information here means acquiring shooting condition parameters including at least flight path information, date/time information, shooting angle-of-view information, and shooting angle information. The other shooting condition parameters may, for example, be given default values in advance, or be acquired automatically according to, for example, the size of the object to be photographed.
 In this example, the object to be photographed is illustrated and described as a steel tower, but the object is not limited to this and may be anything that can be photographed by the camera 42: for example, a structure such as a high-rise apartment building, a house, a chimney, an antenna tower, a lighthouse, a windmill, a tree, or a Kannon statue, or even a living creature such as a person or an animal, or smoke from a fire.
 Next, the management server 1 calculates the position information of the light source using the light source position calculation unit 140 (SQ102). The light source is not limited to one that emits light itself (for example, the sun, the moon, or artificial lighting), and also includes light sources created by reflection of light emitted from something else (for example, light reflected by glass or a window).
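When the light source is the sun, its elevation for a given shooting point and date/time can be approximated with standard solar-geometry formulas. The sketch below is deliberately simplified (it ignores the equation of time, atmospheric refraction, and longitude/time-zone conversion, and assumes the input is local solar time); a production system would use a vetted ephemeris library instead.

```python
import math

def solar_elevation(latitude_deg, day_of_year, solar_hour):
    """Approximate solar elevation angle (degrees) for a given latitude,
    day of year (1-365), and local solar time in hours (12 = solar noon)."""
    # Approximate solar declination (degrees) from the day of year.
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    # Hour angle: the sun moves 15 degrees per hour away from solar noon.
    hour_angle = 15.0 * (solar_hour - 12.0)
    lat, dec, ha = map(math.radians, (latitude_deg, decl, hour_angle))
    sin_elev = (math.sin(lat) * math.sin(dec)
                + math.cos(lat) * math.cos(dec) * math.cos(ha))
    return math.degrees(math.asin(sin_elev))

# Near the equator at solar noon around the March equinox (day 80),
# the sun is close to the zenith.
print(round(solar_elevation(0.0, 80, 12.0), 1))  # 89.5
```

The computed elevation (and, with an analogous azimuth formula, the direction) would then be compared against the shooting ranges as described in the determination step.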
 Next, the management server 1 determines, using the determination unit 150, whether the light source falls within the shooting range (SQ103). If it is determined that the light source is not located within the shooting range from the shooting point, the flight vehicle starts the flight based on the shooting condition information (SQ104).
 For example, in the case of the positional relationship illustrated in FIGS. 8 and 9, it may be determined that the light source is not located within the shooting range. That is, even if the light source falls within the horizontal shooting range as illustrated in FIG. 9, it may be determined that the light source is not located within the shooting range as long as it does not fall within the vertical shooting range as illustrated in FIG. 8.
 For example, instead of the positional relationship of FIG. 8, it may likewise be determined in the case of the positional relationship illustrated in FIG. 10 that the light source is not located within the shooting range.
 For example, in the case of the positional relationship illustrated in FIGS. 11 and 12, it may be determined that the light source is not located within the shooting range. That is, even if the light source falls within the vertical shooting range as illustrated in FIG. 11, it may be determined that the light source is not located within the shooting range as long as it does not fall within the horizontal shooting range as illustrated in FIG. 12.
 Further, for example, although not illustrated, it may be determined that the light source is not located within the shooting range when the light source falls within neither the vertical nor the horizontal shooting range.
 Next, if it is determined in SQ103 that the light source is located within the shooting range from the shooting point, at least a part of the shooting condition information is changed (SQ105).
 Here, any of the shooting condition information may be changed, as long as the change results in a determination that the light source is no longer located within the shooting range from the shooting point.
 For example, one piece of shooting condition information that may be changed is the shooting angle. That is, when the light source falls within the shooting range while the object is being photographed from above looking downward as in FIG. 11, the situation can be handled by changing the shooting angle from downward to upward, resulting in, for example, the shooting conditions of FIG. 10. At this time, the traveling direction along the flight path may also be changed from top-down to bottom-up. Conversely, when the light source is above and falls within the shooting range under the shooting conditions of FIG. 10, the situation can be handled by changing to the shooting angle of FIG. 8.
 Also, when the light source is above, it may still fall within the shooting range even at the shooting angle of FIG. 8; in that case, by changing the shooting altitude in addition to the shooting angle, the shooting conditions can be changed so that the light source is not located within the shooting range, as shown in FIG. 13.
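One way to realize the shooting angle change described above is to try candidate camera pitches, nearest to the current pitch first, until the light source leaves the vertical shooting range. This is a hypothetical sketch; the one-degree step size and the pitch limits are illustrative assumptions, not part of the disclosure.

```python
def adjust_pitch(current_pitch, fov_v, source_elevation,
                 min_pitch=-90.0, max_pitch=90.0):
    """Return a camera pitch (degrees) at which the light source elevation
    falls outside the vertical shooting range, or None if no pitch works.
    Candidates are tried in order of increasing deviation from the
    current pitch, so the flight plan changes as little as possible."""
    half_fov = fov_v / 2.0
    for step in range(0, 181):
        for candidate in (current_pitch + step, current_pitch - step):
            if (min_pitch <= candidate <= max_pitch
                    and abs(source_elevation - candidate) > half_fov):
                return candidate
    return None

# The sun at 40 deg elevation sits inside a 62 deg vertical FOV centered
# at pitch 30 deg; tilting down to 8 deg puts it just outside the range.
print(adjust_pitch(30.0, 62.0, 40.0))  # 8.0
```

A real planner would additionally re-run the full range check (and possibly adjust altitude, as in FIG. 13) after any such change.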
 For example, another piece of shooting condition information that may be changed is the shooting point and shooting direction. That is, while the light source falls within the shooting range in the positional relationship of FIG. 9, the shooting conditions can be changed so that the light source is not located within the shooting range by changing the shooting point and shooting direction as in FIG. 12. At this time, instead of changing the shooting direction of the airframe, the shooting angle of the imaging device may be changed.
 For example, another piece of shooting condition information that may be changed is the shooting date and time. That is, while the light source is at the position of FIG. 11 at the originally set date and time, changing the shooting date and time moves the light source to the position of FIG. 8, so the situation can be handled without changing any other shooting conditions.
 When the light source is an artificial object that emits light, time zone information indicating when the artificial object emits or does not emit light may be stored in the storage unit 160. By comparing the date/time information with this time zone information, it can be determined whether the light source emits light at the set date and time. By changing the shooting date and time to a period in which the light source does not emit light, shooting may be permitted even when the light source is located within both shooting ranges.
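The comparison between the shooting time and the light-emission time zone of an artificial light source can be sketched as a simple interval check. The window boundaries below are illustrative assumptions; a real system would store them per light source in the storage unit.

```python
from datetime import time

def emits_light_at(shooting_time, on_time, off_time):
    """Return True if the artificial light source is lit at `shooting_time`.
    Supports windows that cross midnight (e.g. on at 18:00, off at 06:00)."""
    if on_time <= off_time:
        return on_time <= shooting_time < off_time
    # Window wraps past midnight: lit in the evening or early morning.
    return shooting_time >= on_time or shooting_time < off_time

# A floodlight lit from 18:00 to 06:00: a 22:00 shot would include it,
# so the mission could be rescheduled to daytime.
print(emits_light_at(time(22, 0), time(18, 0), time(6, 0)))  # True
print(emits_light_at(time(13, 0), time(18, 0), time(6, 0)))  # False
```

The planner would then pick a shooting date/time for which this check returns False before finalizing the flight mission.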
 Further, when the light source is the sun, weather information for the area where the object is located may be acquired from within the management server 1 or from a network. It can then be determined whether, at the set date and time, the weather is such that the sun appears or does not appear, and by changing the shooting date and time to, for example, a cloudy time when the sun does not appear, shooting may be permitted even when the light source is located within both shooting ranges.
 Next, a report is output using the information acquired by the flight vehicle during the flight (for example, captured images) (SQ106).
 Although not illustrated, after the confirmation step of SQ103, the management server may output a warning to the user indicating that the light source is located within the shooting range. In response to the warning, the user may then be allowed to choose, for example, whether to proceed to a step of canceling the flight itself or to a step of changing the shooting conditions, and, further, which shooting condition to change. This allows the user to set the flight plan flexibly.
 In the embodiment described above, the management server 1 serving as the information processing device executes the processing for the imaging method; alternatively, an information processing device mounted on the flight vehicle 4 itself, or an information processing device mounted on the flight vehicle storage device 5, may be configured to execute the processing described above.
 Furthermore, this example describes the case where the object, like a steel tower, has gaps through which a light source located on its far side is visible from the near side. For an object, such as a high-rise apartment building, that hides a light source located on its far side, it may be determined whether the light source is located within a predetermined angle range centered on the shooting direction or shooting angle from the shooting point, the range being set based on, for example, the width information and height information of the object; shooting may then be permitted even when the light source is located within both shooting ranges.
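The widened criterion for opaque objects can be sketched by treating the angular span the object subtends at the shooting point as a region in which light sources are screened. Deriving that span from the object's width and distance, as below, is an illustrative assumption; the disclosure only states that the angle range is set from the object's width and height information.

```python
import math

def blocked_by_object(shoot_azimuth, source_azimuth, object_width_m, distance_m):
    """Return True when the light source azimuth falls inside the angular
    span that an opaque object of the given width subtends at the shooting
    point, i.e. the object screens the light source from the camera."""
    # Half-angle subtended by the object, centered on the shooting direction.
    half_span = math.degrees(math.atan2(object_width_m / 2.0, distance_m))
    # Smallest angular difference, handling the 0/360 wrap-around.
    diff = abs((source_azimuth - shoot_azimuth + 180.0) % 360.0 - 180.0)
    return diff <= half_span

# A 40 m-wide building 30 m away subtends about +/- 33.7 deg; a light
# source 20 deg off the shooting direction is hidden behind it, so
# shooting can proceed even though it is nominally "in range".
print(blocked_by_object(90.0, 110.0, 40.0, 30.0))  # True
```

An analogous check using the object's height would cover the vertical direction.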
 In this way, even in situations where the image would otherwise be shot against the light, checking whether the light source is located within the shooting range makes it possible to capture images without the light source appearing in them.
 The embodiment described above is merely an example for facilitating understanding of the present invention and is not intended to limit the interpretation of the present invention. The present invention can be modified and improved without departing from its spirit, and it goes without saying that the present invention includes equivalents thereof.
 1    Management server
 2    User terminal
 4    Flight vehicle

Claims (7)

  1.  An imaging method for a flight vehicle connected to an information processing device that controls the flight vehicle to image an object, the method comprising:
     an information acquisition step of acquiring shooting condition information including at least shooting point information and shooting direction information of the flight vehicle, shooting angle-of-view information and shooting angle information of an imaging device mounted on the flight vehicle, and date/time information related to the shooting point information;
     a calculation step of calculating an azimuth and an altitude at which a light source is located with respect to the flight vehicle, based on the shooting point information and the date/time information, or on the shooting point information and position information of the light source acquired in advance; and
     a determination step of comparing a shooting range derived from the shooting point information, the date/time information, the shooting direction information, and at least one of the shooting angle-of-view information and the shooting angle information with at least one of the azimuth and the altitude at which the light source is located, and determining whether the light source is located within the shooting range.
  2.  The imaging method according to claim 1, wherein
     the determination step determines whether the light source is located within at least one of a vertical shooting range and a horizontal shooting range.
  3.  The imaging method according to claim 1 or 2, wherein
     when the determination step determines that the light source is located within the shooting range, the information processing device outputs a warning to a user.
  4.  The imaging method according to any one of claims 1 to 3, further comprising
     a shooting condition changing step of changing at least a part of the shooting condition information when the determination step determines that the light source is located within the shooting range.
  5.  The imaging method according to any one of claims 1 to 4, wherein
     the light source is an artificial object that emits light, and
     the method further comprises:
     a step of acquiring time zone information indicating a time zone in which the artificial object emits light or does not emit light; and
     a step of comparing the date/time information with the time zone information and determining whether the time zone in which the flight vehicle flies is included in the time zone in which the artificial object emits light or does not emit light.
  6.  The imaging method according to any one of claims 1 to 4, wherein
     the light source is the sun, and
     the method further comprises:
     a step of acquiring weather information for an area where the object is located;
     a step of acquiring, based on the weather information, a time zone of weather in which the sun appears or weather in which the sun does not appear; and
     a step of comparing the date/time information with the time zone information and determining whether the time zone in which the flight vehicle flies is included in the time zone of the weather in which the sun appears or the weather in which the sun does not appear.
  7.  An information processing device that controls a flight vehicle to image an object, the information processing device comprising:
     an information acquisition unit that acquires shooting condition information including at least shooting point information and shooting direction information of the flight vehicle, shooting angle-of-view information and shooting angle information of an imaging device mounted on the flight vehicle, and date/time information related to the shooting point information;
     a light source position calculation unit that calculates an azimuth and an altitude at which a light source is located with respect to the flight vehicle, based on the shooting point information and the date/time information, or on the shooting point information and position information of the light source acquired in advance; and
     a determination unit that compares a shooting range derived from the shooting point information, the date/time information, the shooting direction information, and at least one of the shooting angle-of-view information and the shooting angle information with at least one of the azimuth and the altitude at which the light source is located, and determines whether the light source is located within the shooting range.

PCT/JP2019/050206 2019-12-20 2019-12-20 Image capturing method of flight vehicle and information processing device WO2021124579A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2019/050206 WO2021124579A1 (en) 2019-12-20 2019-12-20 Image capturing method of flight vehicle and information processing device
JP2020519143A JPWO2021124579A1 (en) 2019-12-20 2019-12-20 Aircraft imaging method and information processing equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/050206 WO2021124579A1 (en) 2019-12-20 2019-12-20 Image capturing method of flight vehicle and information processing device

Publications (1)

Publication Number Publication Date
WO2021124579A1 true WO2021124579A1 (en) 2021-06-24

Family

ID=76476743

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/050206 WO2021124579A1 (en) 2019-12-20 2019-12-20 Image capturing method of flight vehicle and information processing device

Country Status (2)

Country Link
JP (1) JPWO2021124579A1 (en)
WO (1) WO2021124579A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114750972A (en) * 2022-04-27 2022-07-15 西安理工大学 Multi-rotor unmanned aerial vehicle recovery auxiliary navigation device and method

Citations (4)

Publication number Priority date Publication date Assignee Title
WO2011099049A1 (en) * 2010-02-09 2011-08-18 トヨタ自動車株式会社 Imaging system
JP2013054545A (en) * 2011-09-05 2013-03-21 Mitsubishi Motors Corp Driving support device
JP2017068639A (en) * 2015-09-30 2017-04-06 セコム株式会社 Autonomous Mobile Robot
JP2018092237A (en) * 2016-11-30 2018-06-14 キヤノンマーケティングジャパン株式会社 Unmanned aircraft control system, and control method and program of the same

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
JP4678649B2 (en) * 2005-10-31 2011-04-27 富士通株式会社 Image processing device
CN103416050A (en) * 2012-03-12 2013-11-27 松下电器产业株式会社 Information provision system, information provision device, photographing device, and computer program
KR20150106719A (en) * 2014-03-12 2015-09-22 삼성전자주식회사 Method for informing shooting location of electronic device and electronic device implementing the same


Cited By (2)

Publication number Priority date Publication date Assignee Title
CN114750972A (en) * 2022-04-27 2022-07-15 西安理工大学 Multi-rotor unmanned aerial vehicle recovery auxiliary navigation device and method
CN114750972B (en) * 2022-04-27 2024-05-14 西安理工大学 Multi-rotor unmanned aerial vehicle recycling auxiliary navigation device and method

Also Published As

Publication number Publication date
JPWO2021124579A1 (en) 2021-12-23

Similar Documents

Publication Publication Date Title
US20200026720A1 (en) Construction and update of elevation maps
CN107209514B (en) Selective processing of sensor data
JP6878567B2 (en) 3D shape estimation methods, flying objects, mobile platforms, programs and recording media
WO2021199449A1 (en) Position calculation method and information processing system
WO2018045538A1 (en) Unmanned aerial vehicle, obstacle avoidance method for same, and obstacle avoidance system thereof
JP2020140726A (en) Flight management server of unmanned flying object and flight management system
WO2021124579A1 (en) Image capturing method of flight vehicle and information processing device
JP2021100234A (en) Aircraft imaging method and information processing device
WO2020225979A1 (en) Information processing device, information processing method, program, and information processing system
WO2021079516A1 (en) Flight route creation method for flying body and management server
JP6661187B1 (en) Aircraft management server and management system
JP6730764B1 (en) Flight route display method and information processing apparatus
JP7004374B1 (en) Movement route generation method and program of moving object, management server, management system
JP2020036163A (en) Information processing apparatus, photographing control method, program, and recording medium
JP6818379B1 (en) Flight route creation method and management server for aircraft
JP6684012B1 (en) Information processing apparatus and information processing method
JP6800505B1 (en) Aircraft management server and management system
WO2020204200A1 (en) Work plan generation system
JP6899108B1 (en) Instrument reading method and management server, instrument reading system, program
JP6934646B1 (en) Flight restriction area setting method, waypoint setting method and management server, information processing system, program
JP7370045B2 (en) Dimension display system and method
WO2022113482A1 (en) Information processing device, method, and program
JP6810498B1 (en) Flight route creation method and management server for aircraft
JP6810497B1 (en) Flight route creation method and management server for aircraft
JP6978026B1 (en) Waypoint setting method and waypoint correction method, management server, information processing system, program

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2020519143

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19956741

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19956741

Country of ref document: EP

Kind code of ref document: A1