WO2018163300A1 - Control device, imaging device, imaging system, moving body, control method, and program - Google Patents

Control device, imaging device, imaging system, moving body, control method, and program Download PDF

Info

Publication number
WO2018163300A1
WO2018163300A1 · PCT/JP2017/009077
Authority
WO
WIPO (PCT)
Prior art keywords
imaging device
imaging
exposure
region
altitude
Prior art date
Application number
PCT/JP2017/009077
Other languages
French (fr)
Japanese (ja)
Inventor
園宏 鄭
本庄 謙一
Original Assignee
SZ DJI Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co., Ltd.
Priority to PCT/JP2017/009077 priority Critical patent/WO2018163300A1/en
Priority to JP2017559620A priority patent/JP6547984B2/en
Publication of WO2018163300A1 publication Critical patent/WO2018163300A1/en

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B7/00Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B7/08Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
    • G03B7/091Digital circuits
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene

Definitions

  • the present invention relates to a control device, an imaging device, an imaging system, a moving body, a control method, and a program.
  • Patent Document 1 discloses that a scene image of a camera is divided into a plurality of areas, and the luminance value of the entire scene is calculated based on the luminance value of each divided area.
  • Patent Document 1 Japanese Unexamined Patent Application Publication No. 2009-25727
  • When the imaging device is used in an environment where the brightness changes relatively quickly, such as when the imaging device is mounted on a moving body such as an unmanned aircraft, automatic exposure control by the imaging device may not be performed properly.
  • The control device may include a dividing unit that divides an image captured by the imaging device into a plurality of regions based on the angle of view of the imaging device, the altitude of the imaging device, and the imaging direction of the imaging device.
  • the control device may include a control unit that controls the exposure of the imaging device for each of a plurality of regions.
  • the control device may include a determination unit that determines an exposure control value of the imaging device for each of a plurality of regions.
  • the control unit may control the exposure of the imaging device for each of a plurality of regions based on the exposure control value of the imaging device.
  • the control device may include a derivation unit that derives an evaluation value of brightness of an image captured by the imaging device for each of a plurality of regions.
  • the determining unit may determine an exposure control value for each of the plurality of regions based on the evaluation values of the plurality of regions.
  • the dividing unit may divide the image into an upper region and a lower region based on the angle of view of the imaging device, the altitude of the imaging device, and the imaging direction of the imaging device.
  • the deriving unit may derive an evaluation value of the brightness of each of the upper region and the lower region.
  • the determination unit may determine the exposure control values of the upper region and the lower region based on the brightness evaluation values of the upper region and the lower region.
  • The dividing unit may divide a first image captured by the imaging device at a first time point into a first upper region and a first lower region based on the angle of view, altitude, and imaging direction of the imaging device at the first time point.
  • the deriving unit may derive brightness evaluation values of the first upper region and the first lower region based on the first image.
  • The dividing unit may divide a second image captured by the imaging device at a second time point after the first time point into a second upper region and a second lower region based on the angle of view, altitude, and imaging direction of the imaging device at the second time point.
  • The determining unit may determine an exposure control value for the second upper region based on the evaluation value of the first upper region, and determine an exposure control value for the second lower region based on the evaluation value of the first lower region.
  • the angle of view, altitude, and imaging direction of the imaging device at the second time point may be the angle of view, altitude, and imaging direction of the imaging device at the second time point specified before the second time point.
  • the imaging device may be mounted on a moving body.
  • the altitude of the imaging device at the second time point may correspond to the altitude of the moving body at the second time point specified before the second time point.
  • the dividing unit may divide the image into an upper region, a lower region, and at least one intermediate region between the upper region and the lower region.
  • the determining unit may determine the exposure control value of the at least one intermediate region to a value between the exposure control value of the upper region and the exposure control value of the lower region.
  • the imaging device may have an image sensor.
  • the control unit may control the exposure of the imaging device for each of a plurality of regions by controlling a gain of an electric signal output from the image sensor according to an exposure amount.
  • the imaging device may have an image sensor.
  • the imaging device may include an optical filter provided in front of the image sensor and capable of changing the light transmittance for each of a plurality of predetermined regions.
  • the control unit may control the exposure of the imaging device for each of the plurality of regions by controlling the light transmittance for each of the plurality of predetermined regions of the optical filter.
  • An imaging device may include the control device.
  • the imaging device may include an image sensor.
  • An imaging system may include the imaging device.
  • the imaging system may include a support mechanism that supports the imaging device.
  • the moving body according to one embodiment of the present invention may move by mounting the imaging system.
  • a control method includes a step of dividing an image captured by an imaging device into a plurality of regions based on an angle of view of the imaging device, an altitude of the imaging device, and an imaging direction of the imaging device. It's okay.
  • the control method may include a step of controlling the exposure of the imaging device for each of a plurality of regions.
  • a program provides a computer with a step of dividing an image captured by an imaging device into a plurality of regions based on an angle of view of the imaging device, an altitude of the imaging device, and an imaging direction of the imaging device. May be executed.
  • the program may cause the computer to execute a step of controlling the exposure of the imaging device for each of a plurality of regions.
  • exposure control by the imaging device can be more appropriately executed even when the imaging device is used in an environment where the brightness changes relatively quickly.
  • UAV unmanned aerial vehicle
  • In the flowcharts and block diagrams, a block may represent (1) a stage of a process in which an operation is performed or (2) a "unit" of an apparatus responsible for performing the operation.
  • Certain stages and “units” may be implemented by programmable circuits and / or processors.
  • Dedicated circuitry may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits.
  • The programmable circuit may include a reconfigurable hardware circuit.
  • The reconfigurable hardware circuit may include logical operations such as logical AND, logical OR, logical XOR, logical NAND, and logical NOR, as well as memory elements such as flip-flops and registers, field programmable gate arrays (FPGA), programmable logic arrays (PLA), and the like.
  • the computer readable medium may include any tangible device capable of storing instructions to be executed by a suitable device.
  • A computer readable medium having instructions stored thereon constitutes a product that includes instructions which can be executed to create means for performing the operations specified in the flowcharts or block diagrams.
  • Examples of computer readable media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like.
  • Computer readable media may include floppy disks, diskettes, hard disks, random access memory (RAM), read only memory (ROM), erasable programmable read only memory (EPROM or flash memory), electrically erasable programmable read only memory (EEPROM), static random access memory (SRAM), compact disc read only memory (CD-ROM), digital versatile disc (DVD), Blu-ray (RTM) disc, memory stick, integrated circuit card, and the like.
  • RAM random access memory
  • ROM read only memory
  • EPROM or flash memory erasable programmable read only memory
  • EEPROM Electrically erasable programmable read only memory
  • SRAM static random access memory
  • CD-ROM compact disc read only memory
  • DVD digital versatile disc
  • RTM Blu-ray
  • The computer readable instructions may include either source code or object code written in any combination of one or more programming languages.
  • The source code or object code may include assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, or state-setting data.
  • The programming languages may include an object-oriented programming language such as Smalltalk, JAVA, or C++, and a conventional procedural programming language such as the "C" programming language or similar programming languages.
  • Computer readable instructions may be provided to a processor or programmable circuit of a general purpose computer, special purpose computer, or other programmable data processing apparatus, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet.
  • WAN wide area network
  • LAN local area network
  • the processor or programmable circuit may execute computer readable instructions to create a means for performing the operations specified in the flowcharts or block diagrams.
  • Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
  • FIG. 1 shows an example of the external appearance of an unmanned aerial vehicle (UAV) 10 and a remote control device 300.
  • the UAV 10 includes a UAV main body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging device 100.
  • the gimbal 50 and the imaging device 100 are an example of an imaging system.
  • the UAV 10 is an example of a moving body propelled by a propulsion unit.
  • The moving body is a concept that includes, in addition to the UAV, a flying body such as another aircraft moving in the air, a vehicle moving on the ground, a ship moving on the water, and the like.
  • the UAV main body 20 includes a plurality of rotor blades.
  • the plurality of rotor blades is an example of a propulsion unit.
  • the UAV main body 20 causes the UAV 10 to fly by controlling the rotation of a plurality of rotor blades.
  • the UAV main body 20 causes the UAV 10 to fly using four rotary wings.
  • the number of rotor blades is not limited to four.
  • the UAV 10 may be a fixed wing machine that does not have a rotating wing.
  • the imaging apparatus 100 is an imaging camera that images a subject included in a desired imaging range.
  • the gimbal 50 supports the imaging device 100 in a rotatable manner.
  • the gimbal 50 is an example of a support mechanism.
  • the gimbal 50 supports the imaging device 100 so as to be rotatable about the pitch axis using an actuator.
  • the gimbal 50 further supports the imaging device 100 using an actuator so as to be rotatable about the roll axis and the yaw axis.
  • the gimbal 50 may change the posture of the imaging device 100 by rotating the imaging device 100 about at least one of the yaw axis, the pitch axis, and the roll axis.
  • the plurality of imaging devices 60 are sensing cameras that image the surroundings of the UAV 10 in order to control the flight of the UAV 10.
  • Two imaging devices 60 may be provided on the front, which is the nose, of the UAV 10.
  • Two other imaging devices 60 may be provided on the bottom surface of the UAV 10.
  • the two imaging devices 60 on the front side may be paired and function as a so-called stereo camera.
  • the two imaging devices 60 on the bottom side may also be paired and function as a stereo camera. Based on images picked up by a plurality of image pickup devices 60, three-dimensional spatial data around the UAV 10 may be generated.
  • the number of imaging devices 60 included in the UAV 10 is not limited to four.
  • the UAV 10 only needs to include at least one imaging device 60.
  • the UAV 10 may include at least one imaging device 60 on each of the nose, the tail, the side surface, the bottom surface, and the ceiling surface of the UAV 10.
  • the angle of view that can be set by the imaging device 60 may be wider than the angle of view that can be set by the imaging device 100.
  • the imaging device 60 may have a single focus lens or a fisheye lens.
  • the remote operation device 300 communicates with the UAV 10 to remotely operate the UAV 10.
  • the remote operation device 300 may communicate with the UAV 10 wirelessly.
  • the remote control device 300 transmits to the UAV 10 instruction information indicating various commands related to movement of the UAV 10 such as ascending, descending, accelerating, decelerating, moving forward, moving backward, and rotating.
  • the instruction information includes, for example, instruction information for raising the altitude of the UAV 10.
  • the instruction information may indicate the altitude at which the UAV 10 should be located.
  • the UAV 10 moves so as to be located at an altitude indicated by the instruction information received from the remote operation device 300.
  • the instruction information may include an ascending command that raises the UAV 10.
  • The UAV 10 rises while receiving the ascent command. Even while receiving the ascent command, the UAV 10 may limit its ascent when the altitude of the UAV 10 has reached the upper limit altitude.
  • FIG. 2 shows an example of functional blocks of the UAV10.
  • the UAV 10 includes a UAV control unit 30, a memory 32, a communication interface 34, a propulsion unit 40, a GPS receiver 41, an inertial measurement device 42, a magnetic compass 43, a barometric altimeter 44, a gimbal 50, and the imaging device 100.
  • the communication interface 34 communicates with other devices such as the remote operation device 300.
  • the communication interface 34 may receive instruction information including various commands for the UAV control unit 30 from the remote operation device 300.
  • the UAV control unit 30 controls the propulsion unit 40, the GPS receiver 41, the inertial measurement device (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the gimbal 50, the imaging device 60, and the imaging device 100.
  • the memory 32 may be a computer-readable recording medium and may include at least one of flash memory such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
  • The memory 32 may be provided inside the UAV main body 20, or may be provided so as to be removable from the UAV main body 20.
  • the UAV control unit 30 controls the flight and imaging of the UAV 10 according to a program stored in the memory 32.
  • the UAV control unit 30 may be configured by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.
  • the UAV control unit 30 controls the flight and imaging of the UAV 10 according to a command received from the remote control device 300 via the communication interface 34.
  • the propulsion unit 40 propels the UAV 10.
  • the propulsion unit 40 includes a plurality of rotating blades and a plurality of drive motors that rotate the plurality of rotating blades.
  • the propulsion unit 40 causes the UAV 10 to fly by rotating a plurality of rotor blades via a plurality of drive motors in accordance with a command from the UAV control unit 30.
  • the GPS receiver 41 receives a plurality of signals indicating times transmitted from a plurality of GPS satellites.
  • the GPS receiver 41 calculates the position of the GPS receiver 41, that is, the position of the UAV 10 based on the received signals.
  • the IMU 42 detects the posture of the UAV 10.
  • the IMU 42 detects, as the posture of the UAV 10, acceleration in the three axial directions of the front, rear, left, and right of the UAV 10, and angular velocity in the three axial directions of pitch, roll, and yaw.
  • the magnetic compass 43 detects the heading of the UAV 10.
  • the barometric altimeter 44 detects the altitude at which the UAV 10 flies.
  • the barometric altimeter 44 detects the atmospheric pressure around the UAV 10, converts the detected atmospheric pressure into an altitude, and detects the altitude.
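The patent does not specify which pressure-to-altitude conversion the barometric altimeter 44 uses; one common choice is the international barometric formula, shown here as an illustration:

```latex
h \;\approx\; 44330\left[\,1-\left(\frac{P}{P_0}\right)^{1/5.255}\right]\ \text{m},
\qquad P_0 = 1013.25\ \text{hPa (sea-level reference pressure)}
```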
  • the imaging apparatus 100 includes an imaging unit 102 and a lens unit 200.
  • the lens unit 200 is an example of a lens device.
  • the imaging unit 102 includes an image sensor 120, an imaging control unit 110, and a memory 130.
  • the image sensor 120 may be configured by a CCD or a CMOS.
  • the image sensor 120 outputs image data of an optical image formed through the plurality of lenses 210 to the imaging control unit 110.
  • the imaging control unit 110 may be configured by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.
  • the imaging control unit 110 may control the imaging device 100 in accordance with an operation command for the imaging device 100 from the UAV control unit 30.
  • the memory 130 may be a computer-readable recording medium and may include at least one of flash memory such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
  • the memory 130 stores a program and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like.
  • the memory 130 may be provided inside the housing of the imaging device 100.
  • the memory 130 may be provided so as to be removable from the housing of the imaging apparatus 100.
  • the lens unit 200 includes a plurality of lenses 210, a lens moving mechanism 212, and a lens control unit 220.
  • the plurality of lenses 210 may function as a zoom lens, a varifocal lens, and a focus lens. At least some or all of the plurality of lenses 210 are arranged to be movable along the optical axis.
  • the lens unit 200 may be an interchangeable lens that is detachably attached to the imaging unit 102.
  • the lens moving mechanism 212 moves at least some or all of the plurality of lenses 210 along the optical axis.
  • the lens control unit 220 drives the lens moving mechanism 212 in accordance with a lens control command from the imaging unit 102 to move one or a plurality of lenses 210 along the optical axis direction.
  • the lens control command is, for example, a zoom control command and a focus control command.
  • The imaging device 100 mounted on the UAV 10 configured as described above may capture an image including the ground or the sea surface and the sky. Due to the movement of the UAV 10 or the attitude control of the imaging device 100 by the gimbal 50, the ratio of the ground, the sea surface, and the sky in the screen can change greatly. Automatic exposure control by the imaging apparatus 100 may fail to follow this change, and the image captured by the imaging apparatus 100 may temporarily become too bright or too dark. Therefore, in the present embodiment, the horizontal line that forms the boundary between the ground or the sea surface and the sky is estimated based on the altitude, the angle of view, and the imaging direction of the imaging apparatus 100. An image captured by the imaging apparatus 100 is divided into an upper region above the horizontal line and a lower region below it. Then, a brightness evaluation value is derived for each of the upper region and the lower region, and exposure control values for the two regions are determined based on the respective evaluation values. Thereby, the imaging device 100 performs exposure control appropriately.
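The control loop just described can be summarized in pseudocode. This is a minimal sketch of the flow only; the helper names (`capture`, `estimate_horizon_row`, `luminance_to_ev`, `set_region_exposure`) are hypothetical and not part of the patent:

```python
def auto_exposure_step(camera, uav):
    """One iteration of the per-region automatic exposure loop:
    estimate the horizon, split the frame, meter each side, and
    apply a separate exposure control value per side."""
    frame = camera.capture()                          # current image
    split = estimate_horizon_row(uav.altitude,
                                 camera.angle_of_view,
                                 camera.imaging_direction)  # dividing unit
    ev_upper = luminance_to_ev(frame[:split].mean())  # sky-side evaluation
    ev_lower = luminance_to_ev(frame[split:].mean())  # ground-side evaluation
    camera.set_region_exposure(upper=ev_upper, lower=ev_lower)  # control unit
```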
  • the imaging control unit 110 includes a dividing unit 112, a derivation unit 114, a determination unit 116, and an exposure control unit 118.
  • the dividing unit 112 divides an image captured by the imaging apparatus 100 into a plurality of regions based on the angle of view of the imaging apparatus 100, the altitude of the imaging apparatus 100, and the imaging direction of the imaging apparatus 100.
  • the dividing unit 112 may divide the image into an upper area and a lower area based on the angle of view of the imaging apparatus 100, the altitude of the imaging apparatus 100, and the imaging direction of the imaging apparatus 100.
  • the dividing unit 112 estimates the position of the horizontal line in the image based on the angle of view of the imaging device 100, the altitude of the imaging device 100, and the imaging direction of the imaging device 100.
  • the dividing unit 112 may divide the image with the upper area above the estimated horizontal line as the upper area and the lower area below the estimated horizontal line as the lower area.
  • the upper region and the lower region in the image differ depending on the posture of the imaging apparatus 100 at the time when the image is captured.
  • the upper region may be a region on the upper side in the vertical direction in the image
  • the lower region may be a region on the lower side in the vertical direction in the image.
  • the upper region may be a region on the sky side in the image including the ground or sea surface and the sky.
  • the lower region may be a region on the ground or sea surface side in the image including the ground or sea surface and the sky.
  • the deriving unit 114 derives an evaluation value of the brightness of an image captured by the imaging device 100 for each of a plurality of areas.
  • the deriving unit 114 may derive the luminance value of the image captured by the imaging apparatus 100 as a brightness evaluation value for each of a plurality of regions.
  • the deriving unit 114 may derive the luminance value of each region based on the luminance value of each pixel included in each region.
  • the deriving unit 114 may derive the average value of the luminance values of the pixels included in each area as the luminance value of each area.
  • The deriving unit 114 may perform predetermined weighting on the luminance value of each pixel included in each region, and derive the luminance value of each region based on the weighted luminance values, as in the sketch below.
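For illustration, a brightness evaluation value of this kind could be computed as follows. This is a sketch under assumptions not stated in the patent (NumPy arrays, a plain mean as the unweighted fallback):

```python
import numpy as np

def region_luminance(region, weights=None):
    """Evaluation value of one region: the mean of the per-pixel
    luminance values, optionally with a predetermined weight map."""
    region = region.astype(np.float32)
    if weights is None:
        return float(region.mean())                          # plain average
    weights = weights.astype(np.float32)
    return float((region * weights).sum() / weights.sum())   # weighted average
```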
  • the deriving unit 114 may derive the brightness evaluation values of the upper region and the lower region divided by the dividing unit 112.
  • the determining unit 116 determines the exposure control value of the imaging apparatus 100 for each of a plurality of regions.
  • the determination unit 116 may determine an exposure control value for each of the plurality of regions based on the evaluation values of the plurality of regions.
  • the determination unit 116 may determine an exposure control value for each of the plurality of regions based on the luminance values of the plurality of regions.
  • The determination unit 116 may determine an exposure value (EV value) for each of the plurality of regions as the exposure control value for each of the plurality of regions.
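The patent does not define the EV scale; under the standard photographic definition (referenced to ISO 100), the exposure value relates the f-number N and the exposure time t in seconds by:

```latex
\mathrm{EV} \;=\; \log_2\!\frac{N^2}{t}
```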
  • the determination unit 116 may determine the exposure control values of the upper region and the lower region based on the brightness evaluation values of the upper region and the lower region.
  • the exposure control unit 118 controls the exposure of the imaging apparatus 100 for each of a plurality of areas.
  • the exposure control unit 118 may control the exposure of the imaging device 100 for each of the plurality of regions based on the exposure control values of the plurality of regions.
  • the exposure control unit 118 is an example of a control unit.
  • the exposure control unit 118 may control the exposure of the imaging device 100 for each of a plurality of regions by controlling the gain of an electric signal output by the image sensor 120 according to the exposure amount.
  • The exposure control unit 118 may control the exposure of the imaging device 100 for each of the plurality of regions by controlling the gain of the image sensor 120 for each of the plurality of regions based on the exposure control values of the plurality of regions.
  • the exposure control unit 118 may control the exposure of the imaging device 100 for each of the plurality of regions by controlling the exposure time for each of the plurality of regions.
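As a rough illustration of gain-based per-region control, the sketch below applies a digital gain to the rows above and below the estimated horizon of a grayscale frame. This is an assumption for clarity, not the sensor's actual API; a real sensor would apply analog gain at readout:

```python
import numpy as np

def apply_region_gains(image, split_row, gain_upper, gain_lower):
    """Apply separate gains to the rows above and below the
    estimated horizon line of a grayscale frame."""
    out = image.astype(np.float32)
    out[:split_row] *= gain_upper   # sky-side region
    out[split_row:] *= gain_lower   # ground/sea-side region
    return np.clip(out, 0.0, 255.0).astype(np.uint8)
```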
  • Based on the angle of view, altitude, and imaging direction of the imaging device 100 at the first time point, the dividing unit 112 may divide the first image captured by the imaging device 100 at the first time point into a first upper region and a first lower region.
  • the deriving unit 114 may derive brightness evaluation values of the first upper area and the first lower area based on the first image.
  • Based on the angle of view, altitude, and imaging direction of the imaging device 100 at the second time point after the first time point, the dividing unit 112 may divide the second image captured by the imaging device 100 at the second time point into a second upper region and a second lower region.
  • the angle of view, altitude, and imaging direction of the imaging device 100 at the second time point may be the angle of view, altitude, and imaging direction of the imaging device 100 at the second time point specified before the second time point.
  • the angle of view and imaging direction of the imaging device 100 at the second time point may correspond to the target angle of view and imaging direction to be set in the imaging device 100 at the second time point.
  • the altitude of the imaging device 100 at the second time point may correspond to the target altitude at which the imaging device 100 should be located at the second time point.
  • the altitude of the imaging device 100 at the second time point may correspond to the altitude of the UAV 10 at the second time point specified before the second time point.
  • While receiving the ascending command from the remote operation device 300, the UAV 10 may ascend at a speed corresponding to the ascending command.
  • the dividing unit 112 may specify the current rising speed of the UAV 10 based on the rising command.
  • The dividing unit 112 may specify, before the second time point, the altitude of the UAV 10 at the second time point after the current time, based on the current ascending speed and the current altitude, as in the relation below.
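A minimal form of this prediction, assuming a constant ascending speed v between the current time t₁ and the second time point t₂:

```latex
h(t_2) \;=\; h(t_1) + v_{\text{ascend}}\,(t_2 - t_1)
```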
  • the altitude of the imaging device 100 at the second time point may correspond to the target altitude at which the UAV 10 should be located at the second time point.
  • the target altitude may correspond to the altitude at which the UAV 10 indicated in the instruction information for the UAV 10 should be located.
  • the target altitude may correspond to the altitude at which the UAV 10 should be located after a predetermined time has elapsed since the present time.
  • the target angle of view and imaging direction may be the angle of view and imaging direction to be set when the imaging apparatus 100 captures images at a height at which the UAV 10 should be located after a predetermined time has elapsed since the present time.
  • the imaging direction of the imaging apparatus 100 may be determined based on a control value for the gimbal 50 to control the attitude of the imaging apparatus 100 and a control value for the UAV control unit 30 to control the attitude of the UAV 10.
  • the determination unit 116 may determine a control value for the exposure of the second upper region at the second time point based on the evaluation value of the first upper region at the first time point.
  • the determination unit 116 may determine a control value for the exposure of the second lower region at the second time point based on the evaluation value of the first lower region at the first time point.
  • the dividing unit 112 may divide the image into an upper region, a lower region, and at least one intermediate region between the upper region and the lower region.
  • the determination unit 116 may determine the exposure control value of at least one intermediate region as a value between the exposure control value of the upper region and the exposure control value of the lower region.
  • The determination unit 116 determines that the upper region 502 above the horizontal line is the sky and that the lower region 504 below the horizontal line is the ground or the sea surface. Then, the determination unit 116 may determine the exposure values of the upper region 502 and the lower region 504 so that the exposure value of the upper region 502 is higher than the exposure value of the lower region 504.
  • Let θ be the angle of view of the imaging device 100, determined by the focal length f of the imaging device 100 and the size (width) S of the image sensor 120.
  • Let R be the radius of the earth.
  • Let h be the altitude of the UAV 10.
  • Let γ be the angle formed between a virtual line 404 connecting the UAV 10 and the position of the horizontal line visible from the UAV 10 at the altitude h, and a virtual line 406 extending vertically downward from the UAV 10 at the altitude h.
  • The angle range in which the horizontal line is included in the image captured by the imaging apparatus 100 is from γ − θ/2 to γ + θ/2 with respect to the virtual line 406, as shown in FIG. 5.
  • The exposure control unit 118 may control the exposure for each region divided by the dividing unit 112 when the posture of the imaging apparatus 100 satisfies a predetermined condition. For example, when the imaging control unit 110 detects an obstacle based on an image captured by the imaging device 60, the exposure control unit 118 may refrain from executing exposure control for each of the regions divided by the dividing unit 112.
  • FIG. 7 is a flowchart illustrating an example of an image division procedure executed by the dividing unit 112.
  • FIG. 8 is a diagram for describing each parameter used for estimating the position of the horizontal line included in the image when the imaging apparatus 100 is oriented in the horizontal direction.
  • FIG. 9 is a diagram for describing each parameter used for estimating the position of the horizontal line included in the image when the imaging apparatus 100 is inclined upward by an angle ⁇ with respect to the horizontal direction.
  • the dividing unit 112 acquires the flight altitude h of the UAV 10 (S100).
  • the dividing unit 112 derives a distance d from the UAV 10 to the horizontal line (S102).
  • The distance d is derived from the radius R of the earth and the altitude h by the following equation: d = √((R + h)² − R²).
  • The dividing unit 112 derives a horizontal distance D indicating the distance between the UAV 10 and an intersection 414 of a virtual line 410 extending in the horizontal direction from the UAV 10 and a virtual line 412 passing through the horizontal line 420 and perpendicular to the virtual line 410 (S104).
  • the dividing unit 112 derives a horizontal angle ⁇ that is an angle formed by the virtual line 410 and the virtual line 416 along the distance d (S106).
  • From the geometry of FIG. 8, the horizontal distance D is derived by the following equation: D = R·d / (R + h).
  • The horizontal angle γ is derived by the following equation: γ = arctan(d / R).
  • The dividing unit 112 derives the projection range W when the imaging apparatus 100 captures an image with the angle of view θ (S108), and derives the height V from the lower portion 418 of the projection range W to the horizontal line 420 (S110).
  • When the imaging apparatus 100 is oriented in the horizontal direction, the projection range W follows from the geometry of FIG. 8 as: W = 2D·tan(θ/2).
  • In this case, the height V from the lower portion 418 of the projection range W to the horizontal line 420 is: V = W/2 − D·tan γ.
  • When the imaging apparatus 100 is inclined upward with respect to the horizontal direction, the posture angle α is a positive value, and the projection range W follows from the geometry of FIG. 9 as: W = D·[tan(α + θ/2) − tan(α − θ/2)].
  • In this case, the height V from the lower portion 418 of the projection range W to the horizontal line 420 is: V = D·[tan(θ/2 − α) − tan γ].
  • The dividing unit 112 identifies the upper region and the lower region based on the derived height V (S112 and S114). For example, as illustrated in FIG. 10, when the number of lines of an image 510 captured by the image sensor 120 is M [pixels], the dividing unit 112 sets the upper region to M × (W − V) / W [pixels] on the upper side in the vertical direction of the image, and sets the lower region to M × V / W [pixels] on the lower side in the vertical direction of the image.
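Putting steps S100 through S114 together, a sketch of the row split might look as follows. The planar-projection equations are reconstructions from the definitions around FIGS. 8 and 9, not the patent's verbatim formulas:

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean radius R of the earth [m]

def lower_region_rows(h, theta, alpha, num_rows):
    """Estimate how many of the M image rows lie below the horizontal line.

    h: flight altitude [m]; theta: vertical angle of view [rad];
    alpha: upward posture angle of the optical axis [rad]; num_rows: M.
    """
    R = EARTH_RADIUS_M
    d = math.sqrt((R + h) ** 2 - R ** 2)        # S102: distance to the horizon
    D = R * d / (R + h)                          # S104: horizontal distance
    gamma = math.atan2(d, R)                     # S106: dip angle of the horizon
    top = D * math.tan(alpha + theta / 2.0)      # upper edge of projection range
    bottom = D * math.tan(alpha - theta / 2.0)   # lower portion 418
    W = top - bottom                             # S108: projection range W
    V = -D * math.tan(gamma) - bottom            # S110: height from 418 to 420
    V = min(max(V, 0.0), W)                      # clamp if horizon is out of frame
    return round(num_rows * V / W)               # S114: lower region [pixels]
```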
  • the determination unit 116 determines the exposure value of the upper region to be, for example, 10 [EV] based on the luminance value of the upper region.
  • the determination unit 116 determines the exposure value of the lower region to be, for example, 6 [EV] based on the luminance value of the lower region.
  • As described above, the dividing unit 112 estimates the position of the horizontal line in the image based on the angle of view, the altitude, and the imaging direction of the imaging apparatus 100, and can divide the image with the region above the estimated horizontal line as the upper region and the region below it as the lower region.
  • FIG. 11 shows the relationship between the ratio of the lower region to the entire image and the flight altitude h. Even at the same flight altitude h, the longer the focal length f of the imaging apparatus 100, that is, the narrower the angle of view θ, the smaller the proportion of the lower region.
  • FIG. 12 shows the relationship between the ratio of the lower region to the entire image and the posture angle α. Even at the same posture angle α, the longer the focal length f of the imaging apparatus 100, that is, the narrower the angle of view θ, the smaller the proportion of the lower region.
  • FIG. 13 is a flowchart showing an example of a procedure for exposure control by the imaging apparatus 100.
  • the imaging apparatus 100 is turned on by turning on the main power supply of the UAV 10 (S300).
  • the UAV 10 does not fly immediately after the imaging apparatus 100 is turned on.
  • the exposure control unit 118 controls the exposure of the imaging apparatus 100 with the entire image as a single region as usual without dividing the image (S302).
  • the imaging control unit 110 determines whether or not the UAV 10 has received flight instruction information (S304).
  • Upon receiving the flight instruction information, the UAV 10 starts flying (S306).
  • The dividing unit 112 specifies the current altitude of the UAV 10, the angle of view of the imaging device 100, and the imaging direction of the imaging device 100 (S308). Based on these, the dividing unit 112 identifies the current upper region and lower region of the image captured by the imaging device 100 (S310).
  • the deriving unit 114 derives, for example, a luminance value as the brightness evaluation value of each of the current upper region and lower region.
  • the determination unit 116 determines the current upper region exposure value and the current lower region exposure value based on the current upper region luminance value and the current lower region luminance value (S312).
  • The exposure control unit 118 controls the exposure of the imaging device 100 for each of the upper region and the lower region based on the determined exposure values of the upper region and the lower region (S314).
  • The dividing unit 112 specifies the altitude of the UAV 10, the angle of view of the imaging device 100, and the imaging direction of the imaging device 100 at the next time point based on the instruction information (S316). Based on the instruction information, the dividing unit 112 may specify a target altitude at which the UAV 10 should be located after a predetermined period from the current time, and specify that target altitude as the altitude of the UAV 10 at the next time point. The dividing unit 112 may specify the current ascending speed of the UAV 10 based on the instruction information.
  • The dividing unit 112 may identify the target altitude at which the UAV 10 should be located after a predetermined period from the current time based on the current ascending speed and the current altitude, and specify that target altitude as the altitude of the UAV 10 at the next time point.
  • the dividing unit 112 may specify the angle of view and the imaging direction at the next time point based on the imaging condition indicated in the instruction information.
  • the dividing unit 112 specifies the upper region and the lower region at the next time point based on the altitude, the angle of view, and the imaging direction at the next time point (S318).
  • the dividing unit 112 may specify the upper region and the lower region according to the procedure illustrated in FIG. 7 based on the altitude, the angle of view, and the imaging direction at the next time point.
  • The determination unit 116 determines the exposure values of the upper region and the lower region at the next time point based on the previous exposure values of the upper region and the lower region (S320). The determination unit 116 may determine the exposure values of the upper region and the lower region at the next time point based on the current exposure values determined for the current upper region and lower region from the current image captured by the imaging apparatus 100.
  • the determination unit 116 may determine the exposure values of the upper region and the lower region at the next time point based on the respective luminance values derived for the current upper region and the lower region.
  • The exposure control unit 118 controls the exposure of the imaging apparatus 100 for each of the upper and lower regions based on the exposure values of the upper and lower regions at the next time point (S322).
  • the imaging control unit 110 repeats the processing after step S316. If the UAV 10 is not in flight, the processing after step S316 is temporarily interrupted, and if the UAV 10 is powered off (S326), the series of processing ends. If the UAV 10 is powered on, the processing from step S302 is repeated.
  • the imaging control unit 110 may execute the processing after step S308 when the altitude of the UAV 10 becomes equal to or higher than a predetermined altitude.
  • the predetermined altitude may be a lower altitude that is likely to include a horizontal line in an image captured by the imaging apparatus 100.
  • The imaging control unit 110 may temporarily interrupt the processing after step S316 when the altitude of the UAV 10 becomes equal to or lower than a predetermined altitude.
  • As described above, during the flight of the UAV 10, the dividing unit 112 identifies the target altitude of the UAV 10, the target angle of view of the imaging apparatus 100, and the target imaging direction at a future time point after a predetermined period from the current time. Based on the target altitude of the UAV 10, the target angle of view of the imaging apparatus 100, and the target imaging direction at the future time point, the dividing unit 112 identifies the upper region and the lower region of the image to be captured by the imaging apparatus 100 at the future time point. Then, the determination unit 116 determines in advance the exposure values of the upper region and the lower region at the future time point based on the exposure values or luminance values of the upper region and the lower region of the current image.
  • The example in which the exposure control unit 118 controls exposure for each of a plurality of regions by controlling the gain of the electric signal output by the image sensor 120 according to the exposure amount has been mainly described above. However, the exposure control unit 118 may control the exposure for each of the plurality of regions by other methods.
  • FIG. 14 shows another example of the functional block of the UAV10.
  • the UAV 10 illustrated in FIG. 14 is different from the UAV 10 illustrated in FIG. 2 in that the imaging unit 102 includes an optical filter 140.
  • the optical filter 140 is an optical filter provided in front of the image sensor 120 and capable of changing the light transmittance for each of a plurality of predetermined regions.
  • the optical filter 140 may be a variable ND filter capable of changing the light transmittance for each predetermined region.
  • the optical filter 140 may be a variable ND filter capable of changing the light transmittance for each of the plurality of horizontal regions 600 as shown in FIG.
  • the optical filter 140 may have a pair of electrodes for each of the plurality of horizontal regions 600.
  • The exposure control unit 118 controls the voltage applied to the pair of electrodes for each of the plurality of horizontal regions 600 based on the exposure value for each of the plurality of regions. Thereby, the exposure control unit 118 can control the exposure for each of the plurality of regions.
  • the exposure control unit 118 applies a voltage that can realize an exposure value of 10 [EV] to a plurality of horizontal regions 602 corresponding to the upper region of the image.
  • the exposure control unit 118 applies voltages capable of realizing exposure values 9 [EV], 8 [EV], and 7 [EV] to a plurality of horizontal regions 606 corresponding to the intermediate region of the image, respectively.
  • The exposure control unit 118 applies a voltage that can realize an exposure value of 6 [EV] to the plurality of horizontal regions 604 corresponding to the lower region of the image. Thereby, the exposure control unit 118 can control the exposure of the upper region, the lower region, and the at least one intermediate region.
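The intermediate band values 9, 8, and 7 [EV] in this example follow from a simple linear interpolation between the upper and lower exposure values; a minimal sketch:

```python
def intermediate_evs(ev_upper, ev_lower, n_bands):
    """Evenly spaced EVs for the filter bands between the upper
    and lower regions, e.g. (10, 6, 3) -> [9.0, 8.0, 7.0]."""
    step = (ev_upper - ev_lower) / (n_bands + 1)
    return [ev_upper - step * (i + 1) for i in range(n_bands)]
```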
  • FIG. 16 illustrates an example of a computer 1200 in which aspects of the present invention may be embodied in whole or in part.
  • A program installed in the computer 1200 can cause the computer 1200 to perform operations associated with the apparatus according to the embodiment of the present invention, or to function as one or more "units" of the apparatus.
  • the program can cause the computer 1200 to execute the operation or the one or more “units”.
  • the program can cause the computer 1200 to execute a process according to an embodiment of the present invention or a stage of the process.
  • Such a program may be executed by CPU 1212 to cause computer 1200 to perform certain operations associated with some or all of the blocks in the flowcharts and block diagrams described herein.
  • the computer 1200 includes a CPU 1212 and a RAM 1214, which are connected to each other by a host controller 1210.
  • the computer 1200 also includes a communication interface 1222 and an input / output unit, which are connected to the host controller 1210 via the input / output controller 1220.
  • Computer 1200 also includes ROM 1230.
  • the CPU 1212 operates according to programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
  • the communication interface 1222 communicates with other electronic devices via a network.
  • a hard disk drive may store programs and data used by the CPU 1212 in the computer 1200.
  • the ROM 1230 stores therein a boot program executed by the computer 1200 at the time of activation and / or a program depending on the hardware of the computer 1200.
  • The program is provided via a network or a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card.
  • the program is installed in the RAM 1214 or the ROM 1230 that is also an example of a computer-readable recording medium, and is executed by the CPU 1212.
  • Information processing described in these programs is read by the computer 1200 to bring about cooperation between the programs and the various types of hardware resources.
  • An apparatus or method may be configured by implementing information operations or processing in accordance with the use of computer 1200.
  • The CPU 1212 executes a communication program loaded in the RAM 1214, and may instruct the communication interface 1222 to perform communication processing based on the processing described in the communication program.
  • Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer area provided in the RAM 1214 or in a recording medium such as a USB memory and transmits the read transmission data to a network, or writes reception data received from the network into a reception buffer area provided on the recording medium.
  • The CPU 1212 may cause the RAM 1214 to read all or a necessary portion of a file or database stored in an external recording medium such as a USB memory, and may execute various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.
  • The CPU 1212 may perform, on data read from the RAM 1214, various types of processing described throughout the present disclosure and specified by the instruction sequence of the program, including various types of operations, information processing, conditional judgment, conditional branching, unconditional branching, information search/replacement, and the like, and may write the result back to the RAM 1214.
  • The CPU 1212 may search for information in a file, a database, or the like in the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may search the plurality of entries for an entry whose attribute value of the first attribute matches a specified condition, read the attribute value of the second attribute stored in that entry, and thereby acquire the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
  • the program or software module described above may be stored in a computer-readable storage medium on the computer 1200 or in the vicinity of the computer 1200.
  • A recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the Internet can be used as the computer-readable storage medium, whereby the program is provided to the computer 1200 via the network.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Studio Devices (AREA)
  • Exposure Control For Cameras (AREA)

Abstract

This control device may be provided with a division unit that divides an image captured by an imaging device into multiple regions on the basis of the field angle of the imaging device, the altitude of the imaging device, and the imaging direction of the imaging device. The control device may be provided with a control unit that controls exposure of the imaging device for each of the multiple regions.

Description

Control device, imaging device, imaging system, moving body, control method, and program
The present invention relates to a control device, an imaging device, an imaging system, a moving body, a control method, and a program.
Patent Document 1 discloses that a scene image of a camera is divided into a plurality of areas, and the luminance value of the entire scene is calculated based on the luminance value of each divided area.
Patent Document 1: Japanese Unexamined Patent Application Publication No. 2009-25727
Challenges to be solved
When the imaging device is used in an environment where the brightness changes relatively quickly, such as when the imaging device is mounted on a moving body such as an unmanned aircraft, automatic exposure control by the imaging device may not be performed properly.
General disclosure
A control device according to one aspect of the present invention may include a dividing unit that divides an image captured by an imaging device into a plurality of regions based on the angle of view of the imaging device, the altitude of the imaging device, and the imaging direction of the imaging device. The control device may include a control unit that controls the exposure of the imaging device for each of the plurality of regions.
The control device may include a determination unit that determines an exposure control value of the imaging device for each of the plurality of regions. The control unit may control the exposure of the imaging device for each of the plurality of regions based on the exposure control values.
The control device may include a derivation unit that derives a brightness evaluation value of an image captured by the imaging device for each of the plurality of regions. The determining unit may determine an exposure control value for each of the plurality of regions based on the evaluation values of the plurality of regions.
The dividing unit may divide the image into an upper region and a lower region based on the angle of view of the imaging device, the altitude of the imaging device, and the imaging direction of the imaging device. The deriving unit may derive a brightness evaluation value for each of the upper region and the lower region. The determination unit may determine exposure control values for the upper region and the lower region based on the brightness evaluation values of the upper region and the lower region.
The dividing unit may divide a first image captured by the imaging device at a first time point into a first upper region and a first lower region based on the angle of view, altitude, and imaging direction of the imaging device at the first time point. The deriving unit may derive brightness evaluation values of the first upper region and the first lower region based on the first image. The dividing unit may divide a second image captured by the imaging device at a second time point after the first time point into a second upper region and a second lower region based on the angle of view, altitude, and imaging direction of the imaging device at the second time point. The determining unit may determine an exposure control value for the second upper region based on the evaluation value of the first upper region, and determine an exposure control value for the second lower region based on the evaluation value of the first lower region.
The angle of view, altitude, and imaging direction of the imaging device at the second time point may be the angle of view, altitude, and imaging direction of the imaging device at the second time point specified before the second time point.
The imaging device may be mounted on a moving body. The altitude of the imaging device at the second time point may correspond to the altitude of the moving body at the second time point specified before the second time point.
The dividing unit may divide the image into an upper region, a lower region, and at least one intermediate region between the upper region and the lower region. The determining unit may determine the exposure control value of the at least one intermediate region to be a value between the exposure control value of the upper region and the exposure control value of the lower region.
The imaging device may have an image sensor. The control unit may control the exposure of the imaging device for each of the plurality of regions by controlling the gain of the electric signal output from the image sensor according to the exposure amount.
The imaging device may have an image sensor, and an optical filter provided in front of the image sensor and capable of changing the light transmittance for each of a plurality of predetermined regions. The control unit may control the exposure of the imaging device for each of the plurality of regions by controlling the light transmittance for each of the plurality of predetermined regions of the optical filter.
An imaging device according to one aspect of the present invention may include the above control device. The imaging device may include an image sensor.
An imaging system according to one aspect of the present invention may include the above imaging device. The imaging system may include a support mechanism that supports the imaging device.
A moving body according to one aspect of the present invention may carry the above imaging system and move.
A control method according to one aspect of the present invention may include a step of dividing an image captured by an imaging device into a plurality of regions based on the angle of view of the imaging device, the altitude of the imaging device, and the imaging direction of the imaging device. The control method may include a step of controlling the exposure of the imaging device for each of the plurality of regions.
 A program according to one aspect of the present invention may cause a computer to divide an image captured by an imaging device into a plurality of regions based on the angle of view, the altitude, and the imaging direction of the imaging device. The program may cause the computer to control the exposure of the imaging device for each of the plurality of regions.
 According to one aspect of the present invention, exposure control by an imaging device can be executed more appropriately even when the imaging device is used in an environment where the brightness changes relatively quickly.
 The above summary of the invention does not enumerate all of the features of the present invention. Sub-combinations of these feature groups can also constitute inventions.
FIG. 1 shows an example of the external appearance of an unmanned aerial vehicle (UAV) and a remote control device.
FIG. 2 shows an example of the functional blocks of the UAV.
FIG. 3 is a diagram for explaining the upper region and the lower region of an image.
FIG. 4 is a diagram for explaining the angle of view of the imaging device.
FIG. 5 is a diagram for explaining the angle range in which the horizon is included in an image.
FIG. 6 is a diagram for explaining the angle range in which the horizon is included in an image.
FIG. 7 is a flowchart showing an example of the procedure for dividing an image.
FIG. 8 is a diagram for explaining the parameters used to estimate the position of the horizon in an image when the imaging device faces the horizontal direction.
FIG. 9 is a diagram for explaining the parameters used to estimate the position of the horizon in an image when the imaging device is tilted upward from the horizontal by an angle α.
FIG. 10 is a diagram for explaining the upper region and the lower region of an image.
FIG. 11 shows the relationship between the proportion of the lower region in the whole image and the flight altitude.
FIG. 12 shows the relationship between the proportion of the lower region in the whole image and the attitude angle α.
FIG. 13 is a flowchart showing an example of the procedure for exposure control by the imaging device.
FIG. 14 shows another example of the functional blocks of the UAV.
FIG. 15 is a diagram for explaining the optical filter.
FIG. 16 shows an example of a hardware configuration.
 Hereinafter, the present invention will be described through embodiments, but the following embodiments do not limit the claimed invention. Not all combinations of the features described in the embodiments are necessarily essential to the solution of the invention. It will be apparent to those skilled in the art that various changes and improvements can be made to the following embodiments, and it is apparent from the claims that embodiments incorporating such changes or improvements can also fall within the technical scope of the present invention.
 The claims, the description, the drawings, and the abstract contain matter subject to copyright protection. The copyright owner does not object to reproduction of these documents by any person as long as the reproduction is as they appear in the Patent Office file or records. In all other respects, all copyrights are reserved.
 Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, in which a block may represent (1) a stage of a process in which an operation is performed or (2) a "unit" of an apparatus responsible for performing the operation. Certain stages and "units" may be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits, which may include memory elements and the like, such as logical AND, logical OR, logical XOR, logical NAND, logical NOR and other logical operations, flip-flops, registers, field programmable gate arrays (FPGAs), and programmable logic arrays (PLAs).
 A computer-readable medium may include any tangible device capable of storing instructions to be executed by a suitable device. As a result, a computer-readable medium having instructions stored thereon constitutes an article of manufacture including instructions that can be executed to create means for performing the operations specified in the flowcharts or block diagrams. Examples of computer-readable media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, and semiconductor storage media. More specific examples of computer-readable media may include floppy (registered trademark) disks, diskettes, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile discs (DVD), Blu-ray (RTM) discs, memory sticks, and integrated circuit cards.
 Computer-readable instructions may include either source code or object code written in any combination of one or more programming languages. The source code or object code may include assembler instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or code in an object-oriented programming language such as Smalltalk, JAVA (registered trademark), or C++, or in a conventional procedural programming language such as the "C" programming language or a similar language. The computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing apparatus, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet. The processor or programmable circuit may execute the computer-readable instructions to create means for performing the operations specified in the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, and microcontrollers.
 FIG. 1 shows an example of the external appearance of an unmanned aerial vehicle (UAV) 10 and a remote control device 300. The UAV 10 includes a UAV main body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging device 100. The gimbal 50 and the imaging device 100 are an example of an imaging system. The UAV 10 is an example of a moving body propelled by a propulsion unit. The concept of a moving body includes, in addition to UAVs, other aircraft that move through the air, vehicles that move on the ground, ships that move on water, and the like.
 The UAV main body 20 includes a plurality of rotor blades, which are an example of a propulsion unit. The UAV main body 20 causes the UAV 10 to fly by controlling the rotation of the rotor blades; for example, it flies the UAV 10 using four rotor blades. The number of rotor blades is not limited to four. The UAV 10 may also be a fixed-wing aircraft without rotor blades.
 The imaging device 100 is a camera that images a subject included in a desired imaging range. The gimbal 50 rotatably supports the imaging device 100 and is an example of a support mechanism. For example, the gimbal 50 uses an actuator to support the imaging device 100 rotatably about the pitch axis, and further uses actuators to support it rotatably about each of the roll axis and the yaw axis. The gimbal 50 may change the attitude of the imaging device 100 by rotating it about at least one of the yaw, pitch, and roll axes.
 The plurality of imaging devices 60 are sensing cameras that image the surroundings of the UAV 10 in order to control its flight. Two imaging devices 60 may be provided on the front, that is, the nose, of the UAV 10, and two more imaging devices 60 may be provided on its bottom surface. The two imaging devices 60 on the front side may be paired to function as a so-called stereo camera, and the two on the bottom side may likewise be paired to function as a stereo camera. Three-dimensional spatial data around the UAV 10 may be generated based on the images captured by the plurality of imaging devices 60. The number of imaging devices 60 included in the UAV 10 is not limited to four; the UAV 10 only needs to include at least one imaging device 60. The UAV 10 may include at least one imaging device 60 on each of its nose, tail, sides, bottom surface, and top surface. The angle of view settable on the imaging devices 60 may be wider than that settable on the imaging device 100. The imaging devices 60 may have single-focus lenses or fisheye lenses.
 The remote control device 300 communicates with the UAV 10 to operate it remotely, and may do so wirelessly. The remote control device 300 transmits to the UAV 10 instruction information indicating various commands related to the movement of the UAV 10, such as ascending, descending, accelerating, decelerating, moving forward, moving backward, and rotating. The instruction information includes, for example, instruction information for raising the altitude of the UAV 10. The instruction information may indicate the altitude at which the UAV 10 should be located, and the UAV 10 moves so as to be located at the altitude indicated by the instruction information received from the remote control device 300. The instruction information may include an ascent command that causes the UAV 10 to rise; the UAV 10 rises while it is accepting the ascent command. Even while accepting an ascent command, the UAV 10 may limit its ascent if its altitude has reached an upper limit.
 FIG. 2 shows an example of the functional blocks of the UAV 10. The UAV 10 includes a UAV control unit 30, a memory 32, a communication interface 34, a propulsion unit 40, a GPS receiver 41, an inertial measurement unit 42, a magnetic compass 43, a barometric altimeter 44, the gimbal 50, and the imaging device 100.
 The communication interface 34 communicates with other devices such as the remote control device 300, and may receive from the remote control device 300 instruction information including various commands for the UAV control unit 30. The memory 32 stores the programs and other data that the UAV control unit 30 needs to control the propulsion unit 40, the GPS receiver 41, the inertial measurement unit (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the gimbal 50, the imaging devices 60, and the imaging device 100. The memory 32 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory. The memory 32 may be provided inside the UAV main body 20, and may be provided so as to be removable from the UAV main body 20.
 The UAV control unit 30 controls the flight and imaging of the UAV 10 in accordance with a program stored in the memory 32. The UAV control unit 30 may be configured by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like, and controls the flight and imaging of the UAV 10 in accordance with commands received from the remote control device 300 via the communication interface 34. The propulsion unit 40 propels the UAV 10 and includes a plurality of rotor blades and a plurality of drive motors that rotate them. The propulsion unit 40 causes the UAV 10 to fly by rotating the rotor blades via the drive motors in accordance with commands from the UAV control unit 30.
 The GPS receiver 41 receives a plurality of signals indicating the times transmitted from a plurality of GPS satellites and, based on the received signals, calculates its own position, that is, the position of the UAV 10. The IMU 42 detects the attitude of the UAV 10; specifically, it detects accelerations in the three axial directions of front-back, left-right, and up-down, and angular velocities about the three axes of pitch, roll, and yaw. The magnetic compass 43 detects the heading of the nose of the UAV 10. The barometric altimeter 44 detects the altitude at which the UAV 10 flies by detecting the atmospheric pressure around the UAV 10 and converting the detected pressure into an altitude.
 The imaging device 100 includes an imaging unit 102 and a lens unit 200. The lens unit 200 is an example of a lens device. The imaging unit 102 has an image sensor 120, an imaging control unit 110, and a memory 130. The image sensor 120 may be a CCD or CMOS sensor and outputs image data of the optical image formed through the plurality of lenses 210 to the imaging control unit 110. The imaging control unit 110 may be configured by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like, and may control the imaging device 100 in accordance with operation commands for the imaging device 100 from the UAV control unit 30. The memory 130 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory. The memory 130 stores the programs and other data that the imaging control unit 110 needs to control the image sensor 120 and the like. The memory 130 may be provided inside the housing of the imaging device 100, and may be provided so as to be removable from the housing.
 The lens unit 200 has a plurality of lenses 210, a lens moving mechanism 212, and a lens control unit 220. The plurality of lenses 210 may function as a zoom lens, a varifocal lens, and a focus lens, and at least some or all of them are arranged to be movable along the optical axis. The lens unit 200 may be an interchangeable lens that is detachably attached to the imaging unit 102. The lens moving mechanism 212 moves at least some or all of the lenses 210 along the optical axis. The lens control unit 220 drives the lens moving mechanism 212 in accordance with lens control commands from the imaging unit 102, such as zoom control commands and focus control commands, to move one or more of the lenses 210 along the optical axis.
 The imaging device 100 mounted on the UAV 10 configured as described above may capture images that include both the ground or the sea surface and the sky. As the UAV 10 moves or the gimbal 50 changes the attitude of the imaging device 100, the proportions of the frame occupied by the ground or sea surface and by the sky change greatly. Automatic exposure control by the imaging device 100 may fail to follow this change, so that the captured image temporarily becomes too bright or too dark. In the present embodiment, therefore, the horizon, that is, the boundary between the ground or sea surface and the sky, is estimated based on the altitude, the angle of view, and the imaging direction of the imaging device 100. The image captured by the imaging device 100 is divided into an upper region above the horizon and a lower region below it. Brightness evaluation values are then derived for the upper and lower regions separately, and exposure control values for the two regions are determined based on the respective evaluation values. The imaging device 100 thereby executes exposure control appropriately.
 To realize such exposure control, the imaging control unit 110 includes a dividing unit 112, a deriving unit 114, a determining unit 116, and an exposure control unit 118. The dividing unit 112 divides the image captured by the imaging device 100 into a plurality of regions based on the angle of view, the altitude, and the imaging direction of the imaging device 100. The dividing unit 112 may divide the image into an upper region and a lower region based on these three quantities: it estimates the position of the horizon in the image from them, and may divide the image with the area above the estimated horizon as the upper region and the area below it as the lower region. Which parts of the image form the upper and lower regions depends on the attitude of the imaging device 100 at the time the image is captured. For example, when the imaging device 100 is level, the upper region may be the vertically upper part of the image and the lower region the vertically lower part. In an image containing the ground or sea surface and the sky, the upper region may be the sky side and the lower region the ground or sea-surface side.
 The deriving unit 114 derives a brightness evaluation value of the image captured by the imaging device 100 for each of the plurality of regions. The deriving unit 114 may derive the luminance value of the image as the brightness evaluation value for each region, based on the luminance values of the pixels included in that region. It may derive the average of the pixel luminance values in each region as the luminance value of that region, or it may apply predetermined weights to the pixel luminance values and derive the region's luminance value from the weighted values. The deriving unit 114 may derive the brightness evaluation values of the upper and lower regions produced by the dividing unit 112.
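 A minimal sketch of this evaluation step in Python (the function name, the row-range representation of a region, and the optional weight map are illustrative assumptions; the description does not fix an implementation):

```python
import numpy as np

def region_luminance(image, row_range, weights=None):
    """Mean (optionally weighted) luminance of one region of the image.

    image:     H x W array of pixel luminance values
    row_range: (start, stop) rows belonging to the region
    weights:   optional per-pixel weight map with the region's shape
    """
    region = image[row_range[0]:row_range[1], :].astype(np.float64)
    if weights is None:
        return float(region.mean())                          # plain average
    return float((region * weights).sum() / weights.sum())   # weighted average
```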
 The determining unit 116 determines an exposure control value of the imaging device 100 for each of the plurality of regions, based on the evaluation value, for example the luminance value, of each region. The determining unit 116 may determine an exposure value (EV) for each region as its control value. The determining unit 116 may determine the exposure control values of the upper and lower regions based on their respective brightness evaluation values.
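 The description does not specify how a luminance value maps to an EV; the following is only one plausible rule, shown as an assumed example: shift the region's current EV by the base-2 logarithm of the ratio of the metered mean luminance to a mid-gray target.

```python
import math

def decide_region_ev(current_ev, metered_luma, target_luma=118.0):
    """Raise EV (less exposure) when the region meters brighter than the target.

    target_luma = 118, mid-gray on an 8-bit scale, is an assumed constant.
    """
    return current_ev + math.log2(metered_luma / target_luma)
```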
 The exposure control unit 118 controls the exposure of the imaging device 100 for each of the plurality of regions, based on the exposure control value of each region. The exposure control unit 118 is an example of a control unit. The exposure control unit 118 may control the exposure per region by controlling the gain of the electric signal that the image sensor 120 outputs according to the amount of exposure; that is, it may control the gain of the image sensor 120 region by region based on the exposure control values of the regions. The exposure control unit 118 may also control the exposure per region by controlling the exposure time for each region.
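 As a rough illustration of the gain-based control (the idea that the sensor accepts one gain per row region is a simplification for this sketch; actual sensor interfaces differ), each 1 EV a region sits below the base exposure corresponds to doubling its readout gain:

```python
def region_gain(base_ev, region_ev):
    """Gain factor for a region relative to the exposure set for base_ev."""
    return 2.0 ** (base_ev - region_ev)

# With the sensor exposed for the sky at EV 10, a ground region metered at
# EV 6 would be read out with region_gain(10, 6) == 16x gain.
```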
 The dividing unit 112 may divide a first image captured by the imaging device 100 at a first time point into a first upper region and a first lower region based on the angle of view, altitude, and imaging direction of the imaging device 100 at the first time point. The deriving unit 114 may derive brightness evaluation values of the first upper and lower regions based on the first image. The dividing unit 112 may divide a second image captured by the imaging device 100 at a second time point after the first into a second upper region and a second lower region based on the angle of view, altitude, and imaging direction of the imaging device 100 at the second time point. The angle of view, altitude, and imaging direction at the second time point may be those specified before the second time point. The angle of view and imaging direction at the second time point may correspond to the target angle of view and imaging direction that should be set on the imaging device 100 at the second time point, and the altitude at the second time point may correspond to the target altitude at which the imaging device 100 should be located then.
 The altitude of the imaging device 100 at the second time point may correspond to the altitude of the UAV 10 at the second time point, specified before the second time point. While the UAV 10 is accepting an ascent command from the remote control device 300, it may rise at a speed corresponding to the command. In this case, the dividing unit 112 may identify the current ascent speed of the UAV 10 from the ascent command and, based on the current ascent speed and the current altitude, specify the altitude of the UAV 10 at the later second time point before that time point arrives. The altitude of the imaging device 100 at the second time point may correspond to the target altitude at which the UAV 10 should be located at the second time point. The target altitude may correspond to the altitude at which the UAV 10 should be located as indicated in the instruction information for the UAV 10, or to the altitude at which the UAV 10 should be located after a predetermined time has elapsed from the present. The target angle of view and imaging direction may be those to be set when the imaging device 100 captures an image at the altitude at which the UAV 10 should be located after that predetermined time. The imaging direction of the imaging device 100 may be determined based on the control values with which the gimbal 50 controls the attitude of the imaging device 100 and with which the UAV control unit 30 controls the attitude of the UAV 10. The determining unit 116 may determine the exposure control value of the second upper region at the second time point based on the evaluation value of the first upper region at the first time point, and the exposure control value of the second lower region based on the evaluation value of the first lower region.
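 The extrapolation itself is left open here; a minimal sketch under the assumption of a constant climb rate over the look-ahead interval (the function name and the numbers in the usage comment are illustrative):

```python
def predicted_altitude(current_h, climb_rate, dt):
    """Altitude the UAV should reach dt seconds ahead at its current climb rate."""
    return current_h + climb_rate * dt

# E.g., at h = 30 m climbing at 2 m/s, the division for a time point 5 s ahead
# would use predicted_altitude(30.0, 2.0, 5.0) == 40.0 m.
```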
 The dividing unit 112 may divide the image into an upper region, a lower region, and at least one intermediate region between them. The determining unit 116 may set the exposure control value of the at least one intermediate region to a value between the exposure control values of the upper and lower regions. If the exposure control values of the upper and lower regions differ greatly, linear noise may appear near the boundary between the two regions in the captured image. An intermediate region is therefore provided between the upper and lower regions so that the exposure control value changes gradually, which prevents linear noise from appearing in the image.
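 For instance, with one upper region, one lower region, and n intermediate regions, the intermediate control values could be spaced evenly between the two (the linear spacing is an assumption; the text only requires intermediate values):

```python
def intermediate_evs(upper_ev, lower_ev, n_intermediate):
    """EVs stepping evenly from the upper-region EV toward the lower-region EV."""
    step = (lower_ev - upper_ev) / (n_intermediate + 1)
    return [upper_ev + step * (i + 1) for i in range(n_intermediate)]

# intermediate_evs(10, 6, 3) -> [9.0, 8.0, 7.0], matching the EV ladder used
# in the FIG. 15 example later in this description.
```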
 As shown in FIG. 3, when an image 500 captured by the imaging device 100 contains the horizon, the determining unit 116 judges the upper region 502 above the horizon to be sky and the lower region 504 below the horizon to be ground or sea surface. The determining unit 116 may then determine the exposure values of the upper region 502 and the lower region 504 so that the exposure value of the upper region 502 is higher than that of the lower region 504.
 Here, an example of the procedure by which the dividing unit 112 divides the image captured by the imaging device 100 into an upper region and a lower region will be described more specifically.
 As shown in FIG. 4, let θ be the angle of view of the imaging device 100, determined by its focal length f and the size (width) S of the image sensor 120. As shown in FIG. 5, let R be the radius of the earth and h the altitude of the UAV 10. Let γ be the angle formed between a virtual line 404, which connects the UAV 10 at altitude h to the position of the horizon visible from it, and a virtual line 406 extending vertically from the UAV 10. When the UAV 10 is in such a position, the angle range in which the horizon is included in the image captured by the imaging device 100 is, as shown in FIG. 6, from γ−θ/2 to γ+θ/2 with respect to the virtual line 406. When the angle between the optical axis of the imaging device 100 and the virtual line 406 falls within the range from γ−θ/2 to γ+θ/2, the captured image contains the horizon. The exposure control unit 118 may control the exposure for each of the regions produced by the dividing unit 112 when the attitude of the imaging device 100 satisfies this condition. When the imaging control unit 110 detects an obstacle, for example based on images captured by the imaging devices 60, it need not cause the exposure control unit 118 to execute the per-region exposure control.
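 Since the line 404 to the horizon is tangent to the earth, sin γ = R/(R+h), and the condition above can be checked directly; a sketch (the Earth-radius constant and the function signature are assumptions):

```python
import math

R = 6_371_000.0  # mean Earth radius [m]; an assumed constant

def horizon_in_frame(h, theta, alpha):
    """True if the horizon lies inside the vertical angle of view theta.

    h: altitude [m]; alpha: pitch of the optical axis above horizontal [rad].
    The axis-to-vertical angle must lie in [gamma - theta/2, gamma + theta/2].
    """
    gamma = math.asin(R / (R + h))            # angle of the horizon ray from vertical
    axis_from_vertical = math.pi / 2 + alpha  # optical axis, measured from line 406
    return gamma - theta / 2 <= axis_from_vertical <= gamma + theta / 2
```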
 FIG. 7 is a flowchart showing an example of the image division procedure executed by the dividing unit 112. FIG. 8 is a diagram for explaining the parameters used to estimate the position of the horizon in the image when the imaging device 100 faces the horizontal direction. FIG. 9 is a diagram for explaining the parameters used to estimate the position of the horizon in the image when the imaging device 100 is tilted upward from the horizontal by an angle α.
 The dividing unit 112 acquires the flight altitude h of the UAV 10 (S100). The dividing unit 112 derives the distance d from the UAV 10 to the horizon (S102). The distance d is derived from the radius R of the earth and the altitude h by the following equation:

d = \sqrt{(R+h)^2 - R^2} = \sqrt{2Rh + h^2}
 The dividing unit 112 derives the horizontal distance D, which indicates the distance between the UAV 10 and the intersection 414 of a virtual line 410 extending horizontally from the UAV 10 with a virtual line 412 that passes through the horizon 420 and is perpendicular to the virtual line 410 (S104). The dividing unit 112 also derives the horizontal angle β, which is the angle formed by the virtual line 410 and the virtual line 416 along the distance d (S106). The horizontal distance D is derived by the following equation:

D = \frac{R}{R+h}\, d
 The horizontal angle β is derived by the following equation:

\beta = \arccos\frac{D}{d}
 Subsequently, the dividing unit 112 derives the projection range W captured by the imaging device 100 at the angle of view θ (S108), and then derives the height V from the lower edge 418 of the projection range W to the horizon 420.
 As shown in FIG. 8, when the imaging device 100 faces the horizontal direction, the projection range W is derived by the following equation:

W = 2D \tan\frac{\theta}{2}
 In this case, the height V from the lower edge 418 of the projection range W to the horizon 420 is derived by the following equation:

V = \frac{W}{2} - D \tan\beta
 On the other hand, as shown in FIG. 9, when the imaging device 100 is tilted from the horizontal by an angle (attitude angle) α, the projection range W is derived by the following equation, where the attitude angle α is positive when the imaging device 100 is tilted upward from the horizontal:

W = D\left\{\tan\left(\alpha + \frac{\theta}{2}\right) - \tan\left(\alpha - \frac{\theta}{2}\right)\right\}
 In this case, the height V from the lower edge 418 of the projection range W to the horizon 420 is derived by the following equation:

V = D\left\{\tan\left(\frac{\theta}{2} - \alpha\right) - \tan\beta\right\}
 The dividing unit 112 identifies the upper and lower regions based on the derived height V (S112 and S114). For example, as shown in FIG. 10, when the number of lines of the image 510 captured by the image sensor 120 is M [pixel], the dividing unit 112 sets the upper region to the M×(W−V)/W [pixel] on the vertically upper side of the image, and the lower region to the M×V/W [pixel] on the vertically lower side.
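 Gathering the equations above, a sketch of the whole division computation of steps S100 to S114 (the Earth-radius value, the clamping of V when the horizon leaves the frame, and the function signature are assumptions made for this sketch):

```python
import math

R = 6_371_000.0  # mean Earth radius [m]; an assumed constant

def split_rows(h, theta, alpha, M):
    """Split an M-line image into upper (sky) and lower (ground/sea) line counts.

    h: altitude [m]; theta: vertical angle of view [rad];
    alpha: attitude angle above horizontal [rad]; M: number of image lines.
    """
    d = math.sqrt((R + h) ** 2 - R ** 2)   # S102: distance to the horizon
    D = d * R / (R + h)                    # S104: horizontal distance
    beta = math.acos(D / d)                # S106: horizontal angle of the horizon ray
    # S108: projection range on the vertical plane at distance D
    W = D * (math.tan(alpha + theta / 2) - math.tan(alpha - theta / 2))
    # height of the horizon 420 above the lower edge 418 of the projection range
    V = D * (math.tan(theta / 2 - alpha) - math.tan(beta))
    V = min(max(V, 0.0), W)                # clamp if the horizon is out of frame
    lower = round(M * V / W)               # S114: lower region in lines
    return M - lower, lower                # (upper lines, lower lines)
```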
 The determining unit 116 determines the exposure value of the upper region, for example 10 [EV], based on the luminance value of the upper region, and the exposure value of the lower region, for example 6 [EV], based on the luminance value of the lower region.
 As described above, through this procedure the dividing unit 112 estimates the position of the horizon in the image based on the angle of view, altitude, and imaging direction of the imaging device 100, and can divide the image with the area above the estimated horizon as the upper region and the area below it as the lower region.
 FIG. 11 shows the relationship between the proportion of the lower region in the whole image and the flight altitude h. Even at the same flight altitude h, the longer the focal length f of the imaging device 100, that is, the narrower the angle of view θ, the smaller the proportion of the lower region.
 FIG. 12 shows the relationship between the proportion of the lower region in the whole image and the attitude angle α. Even at the same attitude angle α, the longer the focal length f of the imaging device 100, that is, the narrower the angle of view θ, the smaller the proportion of the lower region.
 FIG. 13 is a flowchart showing an example of the exposure control procedure of the imaging device 100. First, turning on the main power supply of the UAV 10 turns on the imaging device 100 (S300). Immediately after the imaging device 100 is powered on, the UAV 10 is not yet flying, so the exposure control unit 118 controls the exposure of the imaging device 100 in the usual way, treating the whole image as a single region without dividing it (S302). The imaging control unit 110 then determines whether the UAV 10 has received flight instruction information (S304).
 Upon receiving the flight instruction information, the UAV 10 starts flying (S306). The dividing unit 112 identifies the current altitude of the UAV 10 and the angle of view and imaging direction of the imaging device 100 (S308) and, based on these, identifies the current upper and lower regions of the image captured by the imaging device 100 (S310). The deriving unit 114 derives, for example, a luminance value as the brightness evaluation value of each of the current upper and lower regions. The determining unit 116 determines the exposure values of the current upper and lower regions based on their respective luminance values (S312).
 The exposure control unit 118 controls the exposure of the imaging device 100 separately for the upper and lower regions based on the determined exposure values of the current upper and lower regions (S314).
 Next, the dividing unit 112 identifies, based on the instruction information, the altitude of the UAV 10 and the angle of view and imaging direction of the imaging device 100 at the next time point (S316). Based on the instruction information, the dividing unit 112 may identify the target altitude at which the UAV 10 should be located after a predetermined period from the present, and take that target altitude as the altitude of the UAV 10 at the next time point. The dividing unit 112 may also identify the current ascent speed of the UAV 10 from the instruction information and, based on the current ascent speed and the current altitude, identify the target altitude at which the UAV 10 should be located after the predetermined period, taking that target altitude as the altitude of the UAV 10 at the next time point. The dividing unit 112 may identify the angle of view and imaging direction at the next time point based on the imaging conditions indicated in the instruction information.
 Subsequently, the dividing unit 112 identifies the upper and lower regions at the next time point based on the altitude, angle of view, and imaging direction at the next time point (S318); it may do so following the procedure shown in FIG. 7. Next, the determining unit 116 determines the exposure values of the upper and lower regions at the next time point based on the previous exposure values of the upper and lower regions (S320). The determining unit 116 may determine the next exposure values based on the exposure values determined for the current upper and lower regions from the image currently being captured by the imaging device 100, or based on the luminance values derived for the current upper and lower regions. While the UAV 10 rises to the target altitude by the next time point, the exposure control unit 118 controls the exposure of the imaging device 100 separately for the upper and lower regions based on the exposure values for the next time point (S322).
 If the UAV 10 is in flight (S324), the imaging control unit 110 repeats the processing from step S316. If the UAV 10 is not in flight, the processing from step S316 is suspended, and if the UAV 10 has been powered off (S326), the series of processes ends. If the UAV 10 is still powered on, the processing from step S302 is repeated.
 The imaging control unit 110 may execute the processing from step S308 onward once the altitude of the UAV 10 reaches or exceeds a predetermined altitude. The predetermined altitude may be the lowest altitude at which the image captured by the imaging device 100 is likely to contain the horizon. The imaging control unit 110 may also suspend the processing from step S316 onward once the altitude of the UAV 10 falls to or below a predetermined altitude.
 As described above, while the UAV 10 is in flight, the dividing unit 112 identifies the target altitude of the UAV 10 and the target angle of view and imaging direction of the imaging device 100 at a future time point after a predetermined period from the present. Based on these, the dividing unit 112 identifies the upper and lower regions of the image that the imaging device 100 will capture at the future time point. The determining unit 116 then determines in advance the exposure values of the upper and lower regions at the future time point based on the exposure values or luminance values of the upper and lower regions of the current image. This prevents the situation in which, while the UAV 10 is ascending or descending, the proportions of the ground or sea-surface area and the sky area change greatly, the brightness of the captured image changes greatly, and the imaging device 100 can no longer perform exposure control appropriately.
 The description above has centered on the example in which the exposure control unit 118 controls the exposure for each of the plurality of regions by controlling the gain of the electric signal that the image sensor 120 outputs according to the amount of exposure. However, the exposure control unit 118 may control the exposure for each region by other methods.
 FIG. 14 shows another example of the functional blocks of the UAV 10. The UAV 10 shown in FIG. 14 differs from the UAV 10 shown in FIG. 2 in that the imaging unit 102 includes an optical filter 140. The optical filter 140 is provided in front of the image sensor 120 and can change its light transmittance for each of a plurality of predetermined regions. The optical filter 140 may be a variable ND filter whose light transmittance can be changed for each predetermined region.
 For example, as shown in FIG. 15, the optical filter 140 may be a variable ND filter whose light transmittance can be changed for each of a plurality of horizontal regions 600, with a pair of electrodes for each horizontal region 600. The exposure control unit 118 controls the voltage applied to the pair of electrodes of each horizontal region 600 based on the exposure value of each image region, and can thereby control the exposure region by region. For example, the exposure control unit 118 applies to the horizontal regions 602 corresponding to the upper region of the image a voltage that realizes an exposure value of 10 [EV], to the horizontal regions 606 corresponding to the intermediate region voltages that realize exposure values of 9 [EV], 8 [EV], and 7 [EV] respectively, and to the horizontal regions 604 corresponding to the lower region a voltage that realizes an exposure value of 6 [EV]. In this way, the exposure control unit 118 can control the exposure of each of the upper region, the lower region, and at least one intermediate region.
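 As an illustrative sketch only (real variable ND filters expose vendor-specific drive interfaces, and the EV-to-transmittance mapping below is an assumption), the per-region exposure values could be converted into per-band transmittances relative to the band that needs the most light:

```python
def band_transmittances(band_evs):
    """Transmittance per horizontal band; 1.0 for the band needing most light.

    A band metered k EV above the minimum passes 2**-k of the light, so the
    bright sky bands are darkened rather than the ground bands being starved.
    """
    min_ev = min(band_evs)
    return [2.0 ** (min_ev - ev) for ev in band_evs]

# band_transmittances([10, 9, 8, 7, 6]) -> [0.0625, 0.125, 0.25, 0.5, 1.0]
```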
 FIG. 16 shows an example of a computer 1200 in which aspects of the present invention may be embodied in whole or in part. A program installed in the computer 1200 can cause the computer 1200 to function as the operations associated with an apparatus according to an embodiment of the present invention, or as one or more "units" of that apparatus, or can cause the computer 1200 to execute those operations or those one or more "units". The program can cause the computer 1200 to execute a process according to an embodiment of the present invention or the stages of that process. Such a program may be executed by a CPU 1212 to cause the computer 1200 to perform the specific operations associated with some or all of the blocks of the flowcharts and block diagrams described herein.
 The computer 1200 according to the present embodiment includes the CPU 1212 and a RAM 1214, which are connected to each other by a host controller 1210. The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220. The computer 1200 also includes a ROM 1230. The CPU 1212 operates in accordance with programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
 The communication interface 1222 communicates with other electronic devices via a network. A hard disk drive may store the programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores a boot program and the like executed by the computer 1200 at activation, and/or programs that depend on the hardware of the computer 1200. Programs are provided via a computer-readable recording medium such as a CD-ROM, USB memory, or IC card, or via a network. The programs are installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and are executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. An apparatus or method may be constituted by realizing operations or processing of information in accordance with the use of the computer 1200.
 For example, when communication is performed between the computer 1200 and an external device, the CPU 1212 may execute a communication program loaded in the RAM 1214 and, based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing. Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer area provided in a recording medium such as the RAM 1214 or a USB memory, transmits the read transmission data to the network, and writes reception data received from the network into a reception buffer area or the like provided on the recording medium.
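 As a rough illustration of this buffered exchange, the sketch below pairs a transmission buffer with a reception buffer around a single network connection. This is an assumption-laden example, not the embodiment's firmware: the host, port, and buffer objects are hypothetical stand-ins for the transmission and reception buffer areas provided on the recording medium.

```python
import socket

def exchange(host: str, port: int, tx_buffer: bytes) -> bytes:
    """Send the contents of a transmission buffer and collect the reply
    into a reception buffer, mirroring the CPU/communication-interface
    division of labor described in the text (illustrative only)."""
    rx_buffer = bytearray()
    with socket.create_connection((host, port), timeout=5.0) as sock:
        sock.sendall(tx_buffer)           # read tx buffer, send to the network
        sock.shutdown(socket.SHUT_WR)     # signal end of transmission
        while chunk := sock.recv(4096):   # write network data into rx buffer
            rx_buffer.extend(chunk)
    return bytes(rx_buffer)

# Hypothetical usage: reply = exchange("192.0.2.1", 9077, b"telemetry request")
```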
 The CPU 1212 may also cause all or a necessary portion of a file or a database stored in an external recording medium such as a USB memory to be read into the RAM 1214, and may execute various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.
 Various types of information, such as various types of programs, data, tables, and databases, may be stored in a recording medium and subjected to information processing. On the data read from the RAM 1214, the CPU 1212 may execute various types of processing described throughout this disclosure and designated by the instruction sequences of the programs, including various types of operations, information processing, condition judgment, conditional branching, unconditional branching, information retrieval/replacement, and the like, and writes the results back to the RAM 1214. The CPU 1212 may also retrieve information in a file, a database, or the like in the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may retrieve, from among the plurality of entries, an entry matching a condition by which the attribute value of the first attribute is designated, read the attribute value of the second attribute stored in that entry, and thereby acquire the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
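 The following is a minimal sketch of the entry retrieval just described, assuming each entry is a flat record associating a first attribute with a second attribute; the record layout, function name, and the altitude/EV example table are hypothetical and chosen only for illustration.

```python
from typing import Iterable, Optional

def lookup_second_attribute(
    entries: Iterable[dict],
    first_attr: str,
    second_attr: str,
    wanted_value: object,
) -> Optional[object]:
    """Search the entries for one whose first attribute matches the
    designated value, then return the associated second attribute."""
    for entry in entries:
        if entry.get(first_attr) == wanted_value:
            return entry.get(second_attr)
    return None  # no entry satisfied the condition

# Hypothetical usage: each entry associates an altitude band with an EV offset.
table = [
    {"altitude_band": "0-50m", "ev_offset": 0.0},
    {"altitude_band": "50-150m", "ev_offset": 1.0},
]
print(lookup_second_attribute(table, "altitude_band", "ev_offset", "50-150m"))  # 1.0
```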
 The programs or software modules described above may be stored on the computer 1200 or in a computer-readable storage medium in the vicinity of the computer 1200. A recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the Internet can also be used as a computer-readable storage medium, thereby providing the programs to the computer 1200 via the network.
 It should be noted that the execution order of processes such as operations, procedures, steps, and stages in the apparatuses, systems, programs, and methods shown in the claims, the specification, and the drawings may be realized in any order, unless explicitly indicated by "before", "prior to", or the like, and unless the output of a preceding process is used in a subsequent process. Even if the operation flows in the claims, the specification, and the drawings are described using "first", "next", and the like for convenience, this does not mean that implementation in this order is essential.
10 UAV
20 UAV body
30 UAV control unit
32 Memory
34 Communication interface
40 Propulsion unit
41 Receiver
42 Inertial measurement device
43 Magnetic compass
44 Barometric altimeter
50 Gimbal
60 Imaging device
100 Imaging device
102 Imaging unit
110 Imaging control unit
112 Dividing unit
114 Deriving unit
116 Determination unit
118 Exposure control unit
120 Image sensor
130 Memory
140 Optical filter
200 Lens unit
210 Lens
212 Lens moving mechanism
220 Lens control unit
300 Remote operation device
1200 Computer
1210 Host controller
1212 CPU
1214 RAM
1220 Input/output controller
1222 Communication interface
1230 ROM

Claims (15)

  1.  A control device comprising:
     a dividing unit that divides an image captured by an imaging device into a plurality of regions based on an angle of view of the imaging device, an altitude of the imaging device, and an imaging direction of the imaging device; and
     a control unit that controls exposure of the imaging device for each of the plurality of regions.
  2.  The control device according to claim 1, further comprising a determination unit that determines an exposure control value of the imaging device for each of the plurality of regions,
     wherein the control unit controls the exposure of the imaging device for each of the plurality of regions based on the exposure control values of the imaging device.
  3.  The control device according to claim 2, further comprising a deriving unit that derives a brightness evaluation value of the image captured by the imaging device for each of the plurality of regions,
     wherein the determination unit determines the exposure control value for each of the plurality of regions based on the evaluation values of the plurality of regions.
  4.  The control device according to claim 3, wherein
     the dividing unit divides the image into an upper region and a lower region based on the angle of view of the imaging device, the altitude of the imaging device, and the imaging direction of the imaging device,
     the deriving unit derives a brightness evaluation value of each of the upper region and the lower region, and
     the determination unit determines an exposure control value of each of the upper region and the lower region based on the brightness evaluation values of the upper region and the lower region.
  5.  The control device according to claim 4, wherein
     the dividing unit divides a first image captured by the imaging device at a first time point into a first upper region and a first lower region based on the angle of view, altitude, and imaging direction of the imaging device at the first time point,
     the deriving unit derives brightness evaluation values of the first upper region and the first lower region based on the first image,
     the dividing unit divides a second image captured by the imaging device at a second time point after the first time point into a second upper region and a second lower region based on the angle of view, altitude, and imaging direction of the imaging device at the second time point, and
     the determination unit determines an exposure control value of the second upper region based on the evaluation value of the first upper region, and determines an exposure control value of the second lower region based on the evaluation value of the first lower region.
  6.  The control device according to claim 5, wherein the angle of view, altitude, and imaging direction of the imaging device at the second time point correspond to the angle of view, altitude, and imaging direction of the imaging device at the second time point as specified before the second time point.
  7.  The control device according to claim 6, wherein
     the imaging device is mounted on a moving body, and
     the altitude of the imaging device corresponds to the altitude of the moving body at the second time point as specified before the second time point.
  8.  The control device according to claim 4, wherein
     the dividing unit divides the image into the upper region, the lower region, and at least one intermediate region between the upper region and the lower region, and
     the determination unit determines the exposure control value of the at least one intermediate region to be a value between the exposure control value of the upper region and the exposure control value of the lower region.
  9.  The control device according to claim 1, wherein
     the imaging device has an image sensor, and
     the control unit controls the exposure of the imaging device for each of the plurality of regions by controlling a gain of an electric signal that the image sensor outputs in accordance with an exposure amount.
  10.  The control device according to claim 1, wherein
     the imaging device has:
     an image sensor; and
     an optical filter that is provided in front of the image sensor and is capable of changing light transmittance for each of a plurality of predetermined regions, and
     the control unit controls the exposure of the imaging device for each of the plurality of regions by controlling the light transmittance of each of the plurality of predetermined regions of the optical filter.
  11.  An imaging device comprising:
     the control device according to any one of claims 1 to 10; and
     an image sensor.
  12.  An imaging system comprising:
     the imaging device according to claim 11; and
     a support mechanism that supports the imaging device.
  13.  A moving body that moves with the imaging system according to claim 12 mounted thereon.
  14.  A control method comprising:
     dividing an image captured by an imaging device into a plurality of regions based on an angle of view of the imaging device, an altitude of the imaging device, and an imaging direction of the imaging device; and
     controlling exposure of the imaging device for each of the plurality of regions.
  15.  A program for causing a computer to execute:
     dividing an image captured by an imaging device into a plurality of regions based on an angle of view of the imaging device, an altitude of the imaging device, and an imaging direction of the imaging device; and
     controlling exposure of the imaging device for each of the plurality of regions.
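 For readers who want to see the claimed division made concrete, the following is a minimal geometric sketch of how an image might be split into an upper region and a lower region from the angle of view, altitude, and imaging direction recited in claims 1 and 4. It assumes a pinhole camera and the standard horizon-dip approximation; the function names and parameters are illustrative, and the embodiment does not prescribe this particular computation.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def horizon_row(image_height_px: int, altitude_m: float,
                pitch_down_deg: float, vertical_fov_deg: float) -> int:
    """Estimate the image row (0 = top) where the horizon falls for a
    pinhole camera. pitch_down_deg is the depression of the optical
    axis below horizontal (0 = level, positive = looking down)."""
    # The apparent horizon dips below horizontal as altitude increases.
    dip_rad = math.acos(EARTH_RADIUS_M / (EARTH_RADIUS_M + altitude_m))
    # Angle from the optical axis up to the horizon direction.
    axis_to_horizon_rad = math.radians(pitch_down_deg) - dip_rad
    half_fov_rad = math.radians(vertical_fov_deg) / 2.0
    # Pinhole projection: pixel offset of the horizon above image center.
    offset_px = (image_height_px / 2.0) * (math.tan(axis_to_horizon_rad)
                                           / math.tan(half_fov_rad))
    row = int(round(image_height_px / 2.0 - offset_px))
    return max(0, min(image_height_px - 1, row))

def split_upper_lower(image_height_px: int, altitude_m: float,
                      pitch_down_deg: float, vertical_fov_deg: float):
    """Return (upper_rows, lower_rows) split at the estimated horizon."""
    r = horizon_row(image_height_px, altitude_m, pitch_down_deg, vertical_fov_deg)
    return range(0, r), range(r, image_height_px)

# Hypothetical usage: 1080-row frame, 100 m altitude,
# camera pitched 10 degrees down, 60 degrees vertical field of view.
upper, lower = split_upper_lower(1080, 100.0, 10.0, 60.0)
print(len(upper), len(lower))  # horizon falls near row 380 for these values
```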
PCT/JP2017/009077 2017-03-07 2017-03-07 Control device, imaging device, imaging system, moving body, control method, and program WO2018163300A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2017/009077 WO2018163300A1 (en) 2017-03-07 2017-03-07 Control device, imaging device, imaging system, moving body, control method, and program
JP2017559620A JP6547984B2 (en) 2017-03-07 2017-03-07 CONTROL DEVICE, IMAGING DEVICE, IMAGING SYSTEM, MOBILE OBJECT, CONTROL METHOD, AND PROGRAM

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/009077 WO2018163300A1 (en) 2017-03-07 2017-03-07 Control device, imaging device, imaging system, moving body, control method, and program

Publications (1)

Publication Number Publication Date
WO2018163300A1 true WO2018163300A1 (en) 2018-09-13

Family

ID=63447426

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/009077 WO2018163300A1 (en) 2017-03-07 2017-03-07 Control device, imaging device, imaging system, moving body, control method, and program

Country Status (2)

Country Link
JP (1) JP6547984B2 (en)
WO (1) WO2018163300A1 (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009025727A (en) * 2007-07-23 2009-02-05 Nikon Corp Photometric device and camera

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS60136480A (en) * 1983-12-24 1985-07-19 Sony Corp Controller for image pickup light quantity
JPH0879604A (en) * 1994-08-30 1996-03-22 Mitsubishi Electric Corp Infrared ray image pickup device
JPH08240833A (en) * 1995-03-02 1996-09-17 Mitsubishi Electric Corp Exposure controller of camera for vehicle
JP2010283631A (en) * 2009-06-05 2010-12-16 Toyota Industries Corp Image sensing device and method for processing image in the same
JP2016131367A (en) * 2015-01-09 2016-07-21 株式会社リコー Moving body system

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021161638A1 (en) * 2020-02-13 2021-08-19 株式会社村上開明堂 Camera system

Also Published As

Publication number Publication date
JPWO2018163300A1 (en) 2019-03-22
JP6547984B2 (en) 2019-07-24

Similar Documents

Publication Publication Date Title
JP6496955B1 (en) Control device, system, control method, and program
CN111567032B (en) Specifying device, moving body, specifying method, and computer-readable recording medium
JP6384000B1 (en) Control device, imaging device, imaging system, moving object, control method, and program
JP2020012878A (en) Controller, moving object, control method, and program
JP6630939B2 (en) Control device, imaging device, moving object, control method, and program
JP6587006B2 (en) Moving body detection device, control device, moving body, moving body detection method, and program
JP6501091B1 (en) CONTROL DEVICE, IMAGING DEVICE, MOBILE OBJECT, CONTROL METHOD, AND PROGRAM
JP6481228B1 (en) Determination device, control device, imaging system, flying object, determination method, and program
JP6565072B2 (en) Control device, lens device, flying object, control method, and program
WO2018163300A1 (en) Control device, imaging device, imaging system, moving body, control method, and program
WO2018185940A1 (en) Imaging control device, imaging device, imaging system, mobile body, imaging control method and program
JP6801161B1 (en) Image processing equipment, imaging equipment, moving objects, image processing methods, and programs
JP6651693B2 (en) Control device, moving object, control method, and program
JP6543859B1 (en) IMAGE PROCESSING DEVICE, IMAGING DEVICE, MOBILE OBJECT, IMAGE PROCESSING METHOD, AND PROGRAM
JP6565071B2 (en) Control device, imaging device, flying object, control method, and program
JP6696092B2 (en) Control device, moving body, control method, and program
JP6714802B2 (en) Control device, flying body, control method, and program
JP2019205047A (en) Controller, imaging apparatus, mobile body, control method and program
JP6459012B1 (en) Control device, imaging device, flying object, control method, and program
JP6818987B1 (en) Image processing equipment, imaging equipment, moving objects, image processing methods, and programs
JP6569157B1 (en) Control device, imaging device, moving object, control method, and program
JP6413170B1 (en) Determination apparatus, imaging apparatus, imaging system, moving object, determination method, and program
JP6710863B2 (en) Aircraft, control method, and program
JP6888215B2 (en) Control devices, imaging devices, imaging systems, moving objects, control methods, and programs

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2017559620

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17900033

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 18/12/2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17900033

Country of ref document: EP

Kind code of ref document: A1