WO2018185940A1 - Imaging control device, imaging device, imaging system, mobile body, imaging control method and program


Info

Publication number: WO2018185940A1
Application number: PCT/JP2017/014560
Authority: WIPO (PCT)
Prior art keywords: imaging, lens, spatial frequency, frequency band, image
Other languages: French (fr), Japanese (ja)
Inventors: 明 邵, 本庄 謙一
Original assignee: SZ DJI Technology Co., Ltd.
Application filed by SZ DJI Technology Co., Ltd.
Priority to JP2017559465A (patent JP6503607B2)
Priority to PCT/JP2017/014560 (WO2018185940A1)
Publication of WO2018185940A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28 Systems for automatic generation of focusing signals
    • G02B7/34 Systems for automatic generation of focusing signals using different areas in a pupil plane
    • G02B7/36 Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32 Means for focusing
    • G03B13/34 Power focusing
    • G03B13/36 Autofocus systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00 Type of UAV
    • B64U10/10 Rotorcrafts
    • B64U10/13 Flying platforms

Definitions

  • the present invention relates to an imaging control device, an imaging device, an imaging system, a moving body, an imaging control method, and a program.
  • Patent Document 1: Japanese Patent No. 5932476
  • in AF processing that uses the blur amount of an image, at least two images must be captured, so the AF processing time may be long.
  • in AF processing that uses two parallax images, the accuracy of the AF processing may be reduced when the spatial frequency of the image contains a specific frequency component, such as a high frequency component.
  • the imaging control apparatus may include a derivation unit that derives a spatial frequency band of a captured image.
  • the imaging control device may include a control unit that controls the positional relationship between the imaging surface and the lens, based on the spatial frequency band of the image derived by the derivation unit, by selecting either a first AF process, which executes autofocus processing based on the blur amounts of a plurality of images captured with different positional relationships between the imaging surface and the lens, or a second AF process, which executes autofocus processing by a phase difference method, or by combining both.
  • the control unit may control the positional relationship between the imaging surface and the lens by selecting either the first AF process or the second AF process, or by combining both, based on the spatial frequency band of the image derived by the derivation unit and a spatial frequency band predetermined for the second AF process.
  • the control unit may control the positional relationship between the imaging surface and the lens by selecting either the first AF process or the second AF process, or by combining both, based on the ratio of the spatial frequency band of the image derived by the derivation unit that falls within the predetermined spatial frequency band.
  • the control unit may control the positional relationship between the imaging surface and the lens by selecting the second AF process when the ratio is equal to or greater than the threshold and selecting the first AF process when the ratio is smaller than the threshold.
  • the control unit may control the positional relationship between the imaging surface and the lens by selecting the second AF process when the ratio is equal to or greater than a first threshold, combining the first AF process and the second AF process when the ratio is smaller than the first threshold and equal to or greater than a second threshold, and selecting the first AF process when the ratio is smaller than the second threshold.
  • the control unit may control the positional relationship between the imaging surface and the lens by selecting either the first AF process or the second AF process, or by combining both, by comparing the spatial frequency band of the image derived by the derivation unit with a predetermined spatial frequency band that can be reproduced by the plurality of pixels that generate the phase difference detection signal.
  • the second AF process may be executed based on a phase difference detection signal output from an image sensor in which the plurality of pixels and a plurality of other pixels that generate color component signals are arranged in a predetermined arrangement pattern.
  • An imaging device may include the imaging control device.
  • the imaging device may include an image sensor having an imaging surface.
  • the imaging device may include a lens.
  • An imaging system may include the imaging device.
  • the imaging system may include a support mechanism that supports the imaging device.
  • the moving body according to one embodiment of the present invention may move by mounting the imaging system.
  • the imaging control method may include a step of deriving a spatial frequency band of a captured image.
  • the imaging control method may include a step of controlling the positional relationship between the imaging surface and the lens, based on the spatial frequency band of the derived image, by selecting either a first AF process, which executes autofocus processing based on the blur amounts of a plurality of images captured with different positional relationships between the imaging surface and the lens, or a second AF process, which executes autofocus processing by the phase difference method, or by combining both.
  • the program according to an aspect of the present invention may cause a computer to execute a step of deriving a spatial frequency band of a captured image.
  • the program may cause the computer to execute a step of controlling the positional relationship between the imaging surface and the lens, based on the derived spatial frequency band of the image, by selecting either a first AF process, which executes autofocus processing based on the blur amounts of a plurality of images captured with different positional relationships between the imaging surface and the lens, or a second AF process, which executes autofocus processing by the phase difference method, or by combining both.
  • in the flowcharts and block diagrams referenced in this disclosure, a block may represent (1) a stage of a process in which an operation is performed or (2) a “part” of an apparatus responsible for performing the operation.
  • Certain stages and “units” may be implemented by programmable circuits and / or processors.
  • dedicated circuitry may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits.
  • the programmable circuit may include a reconfigurable hardware circuit.
  • reconfigurable hardware circuits may include logic operations such as logical AND, logical OR, logical XOR, logical NAND, and logical NOR, as well as flip-flops, registers, and memory elements such as field programmable gate arrays (FPGA) and programmable logic arrays (PLA).
  • the computer readable medium may include any tangible device capable of storing instructions to be executed by a suitable device.
  • a computer readable medium having instructions stored thereon comprises a product that includes instructions that can be executed to create a means for performing the operations specified in the flowcharts or block diagrams.
  • Examples of computer readable media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like.
  • computer readable media may include floppy (RTM) disks, diskettes, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile discs (DVD), Blu-ray (RTM) discs, memory sticks, integrated circuit cards, and the like.
  • the computer readable instructions may include either source code or object code written in any combination of one or more programming languages.
  • the source code or object code may be written in a conventional procedural programming language, such as assembler instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, or state-setting data, in an object-oriented programming language such as Smalltalk, JAVA (RTM), or C++, or in the “C” programming language or a similar programming language.
  • computer readable instructions may be provided to a processor or programmable circuit of a general purpose computer, special purpose computer, or other programmable data processing device, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet.
  • the processor or programmable circuit may execute computer readable instructions to create a means for performing the operations specified in the flowcharts or block diagrams.
  • Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
  • FIG. 1 shows an example of the external appearance of an unmanned aerial vehicle (UAV) 10 and a remote control device 300.
  • the UAV 10 includes a UAV main body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging device 100.
  • the gimbal 50 and the imaging device 100 are an example of an imaging system.
  • the UAV 10 is an example of a moving body propelled by a propulsion unit.
  • the moving body is a concept including a flying body such as another aircraft moving in the air, a vehicle moving on the ground, a ship moving on the water, etc. in addition to the UAV.
  • the UAV main body 20 includes a plurality of rotor blades.
  • the plurality of rotor blades is an example of a propulsion unit.
  • the UAV main body 20 causes the UAV 10 to fly by controlling the rotation of a plurality of rotor blades.
  • the UAV main body 20 causes the UAV 10 to fly using four rotary wings.
  • the number of rotor blades is not limited to four.
  • the UAV 10 may be a fixed wing machine that does not have a rotating wing.
  • the imaging apparatus 100 is an imaging camera that images a subject included in a desired imaging range.
  • the gimbal 50 supports the imaging device 100 in a rotatable manner.
  • the gimbal 50 is an example of a support mechanism.
  • the gimbal 50 supports the imaging device 100 so as to be rotatable about the pitch axis using an actuator.
  • the gimbal 50 further supports the imaging device 100 using an actuator so as to be rotatable about the roll axis and the yaw axis.
  • the gimbal 50 may change the posture of the imaging device 100 by rotating the imaging device 100 about at least one of the yaw axis, the pitch axis, and the roll axis.
  • the plurality of imaging devices 60 are sensing cameras that image the surroundings of the UAV 10 in order to control the flight of the UAV 10.
  • two imaging devices 60 may be provided on the front, that is, the nose, of the UAV 10.
  • Two other imaging devices 60 may be provided on the bottom surface of the UAV 10.
  • the two imaging devices 60 on the front side may be paired and function as a so-called stereo camera.
  • the two imaging devices 60 on the bottom side may also be paired and function as a stereo camera. Based on images picked up by a plurality of image pickup devices 60, three-dimensional spatial data around the UAV 10 may be generated.
  • the number of imaging devices 60 included in the UAV 10 is not limited to four.
  • the UAV 10 only needs to include at least one imaging device 60.
  • the UAV 10 may include at least one imaging device 60 on each of the nose, the tail, the side surface, the bottom surface, and the ceiling surface of the UAV 10.
  • the angle of view that can be set by the imaging device 60 may be wider than the angle of view that can be set by the imaging device 100.
  • the imaging device 60 may have a single focus lens or a fisheye lens.
  • the remote operation device 300 communicates with the UAV 10 to remotely operate the UAV 10.
  • the remote operation device 300 may communicate with the UAV 10 wirelessly.
  • the remote control device 300 transmits to the UAV 10 instruction information indicating various commands related to movement of the UAV 10 such as ascending, descending, accelerating, decelerating, moving forward, moving backward, and rotating.
  • the instruction information includes, for example, instruction information for raising the altitude of the UAV 10.
  • the instruction information may indicate the altitude at which the UAV 10 should be located.
  • the UAV 10 moves so as to be located at an altitude indicated by the instruction information received from the remote operation device 300.
  • the instruction information may include an ascending command that raises the UAV 10.
  • the UAV 10 rises while accepting the ascent command. Even if the UAV 10 receives the ascending command, the UAV 10 may limit the ascent when the altitude of the UAV 10 has reached the upper limit altitude.
  • FIG. 2 shows an example of functional blocks of the UAV10.
  • the UAV 10 includes a UAV control unit 30, a memory 32, a communication interface 34, a propulsion unit 40, a GPS receiver 41, an inertial measurement device 42, a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, a gimbal 50, and the imaging device 100.
  • the communication interface 34 communicates with other devices such as the remote operation device 300.
  • the communication interface 34 may receive instruction information including various commands for the UAV control unit 30 from the remote operation device 300.
  • the memory 32 stores programs and the like necessary for the UAV control unit 30 to control the propulsion unit 40, the GPS receiver 41, the inertial measurement device (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the gimbal 50, the imaging devices 60, and the imaging device 100.
  • the memory 32 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • the memory 32 may be provided inside the UAV main body 20, or may be provided so as to be removable from the UAV main body 20.
  • the UAV control unit 30 controls the flight and imaging of the UAV 10 according to a program stored in the memory 32.
  • the UAV control unit 30 may be configured by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.
  • the UAV control unit 30 controls the flight and imaging of the UAV 10 according to a command received from the remote control device 300 via the communication interface 34.
  • the propulsion unit 40 propels the UAV 10.
  • the propulsion unit 40 includes a plurality of rotating blades and a plurality of drive motors that rotate the plurality of rotating blades.
  • the propulsion unit 40 causes the UAV 10 to fly by rotating a plurality of rotor blades via a plurality of drive motors in accordance with a command from the UAV control unit 30.
  • the GPS receiver 41 receives a plurality of signals indicating times transmitted from a plurality of GPS satellites.
  • the GPS receiver 41 calculates the position of the GPS receiver 41, that is, the position of the UAV 10 based on the received signals.
  • the IMU 42 detects the posture of the UAV 10.
  • the IMU 42 detects, as the attitude of the UAV 10, the accelerations of the UAV 10 in the three axial directions of front-back, left-right, and up-down, and the angular velocities about the three axes of pitch, roll, and yaw.
  • the magnetic compass 43 detects the heading of the UAV 10.
  • the barometric altimeter 44 detects the altitude at which the UAV 10 flies.
  • the barometric altimeter 44 detects the atmospheric pressure around the UAV 10, converts the detected atmospheric pressure into an altitude, and detects the altitude.
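  • the patent does not state the pressure-to-altitude conversion used by the barometric altimeter 44; a minimal sketch assuming the standard international barometric formula (with sea-level pressure taken as 1013.25 hPa) is:

```python
def pressure_to_altitude_m(pressure_hpa: float, p0_hpa: float = 1013.25) -> float:
    """Convert atmospheric pressure (hPa) to altitude (m) using the
    international barometric formula; p0_hpa is an assumed sea-level pressure."""
    return 44330.0 * (1.0 - (pressure_hpa / p0_hpa) ** (1.0 / 5.255))

# Example: about 111 m above sea level at 1000 hPa.
print(pressure_to_altitude_m(1000.0))
```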
  • the temperature sensor 45 detects the temperature around the UAV 10.
  • the imaging apparatus 100 includes an imaging unit 102 and a lens unit 200.
  • the lens unit 200 is an example of a lens device.
  • the imaging unit 102 includes an image sensor 120, an imaging control unit 110, and a memory 130.
  • the image sensor 120 may be configured by a CCD or a CMOS.
  • the image sensor 120 outputs image data of an optical image formed through the plurality of lenses 210 to the imaging control unit 110.
  • the imaging control unit 110 may be configured by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.
  • the imaging control unit 110 may control the imaging device 100 in accordance with an operation command for the imaging device 100 from the UAV control unit 30.
  • the memory 130 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • the memory 130 stores a program and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like.
  • the memory 130 may be provided inside the housing of the imaging device 100.
  • the memory 130 may be provided so as to be removable from the housing of the imaging apparatus 100.
  • the lens unit 200 includes a plurality of lenses 210, a lens moving mechanism 212, and a lens control unit 220.
  • the plurality of lenses 210 may function as a zoom lens, a varifocal lens, and a focus lens. At least some or all of the plurality of lenses 210 are arranged to be movable along the optical axis.
  • the lens unit 200 may be an interchangeable lens that is detachably attached to the imaging unit 102.
  • the lens moving mechanism 212 moves at least some or all of the plurality of lenses 210 along the optical axis.
  • the lens control unit 220 drives the lens moving mechanism 212 in accordance with a lens control command from the imaging unit 102 to move one or a plurality of lenses 210 along the optical axis direction.
  • the lens control command is, for example, a zoom control command and a focus control command.
  • the imaging apparatus 100 configured in this manner performs an autofocus process (AF process) and images a desired subject.
  • the imaging apparatus 100 determines the distance from the lens to the subject (subject distance) in order to execute the AF process.
  • one method of determining the subject distance is to determine it based on the blur amounts of a plurality of images captured with different positional relationships between the lens and the imaging surface.
  • this method is referred to as a blur detection auto focus (BDAF) method.
  • the blur amount (Cost) of the image can be expressed by the following equation (1) using a Gaussian function.
  • x indicates a pixel position in the horizontal direction.
  • σ represents the standard deviation.
  • FIG. 3 shows an example of a curve represented by Equation (1).
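  • equation (1) itself is reproduced only as an image in the source; a Gaussian of the following general shape, with x the horizontal pixel position and σ the standard deviation, is consistent with the surrounding description (the amplitude a and center b are assumptions):

```latex
\mathrm{Cost}(x) = a \exp\!\left(-\frac{(x - b)^{2}}{2\sigma^{2}}\right) \tag{1}
```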
  • FIG. 4 is a flowchart illustrating an example of a distance calculation procedure of the BDAF method.
  • the imaging apparatus 100 captures the first image I1 with the lens and the imaging surface in a first positional relationship, and stores it in the memory 130.
  • the focus lens or the imaging surface of the image sensor 120 is then moved in the optical axis direction so that the lens and the imaging surface are in a second positional relationship, and the imaging apparatus 100 captures the second image I2 and stores it in the memory 130 (S101).
  • the focus lens or the imaging surface of the image sensor 120 is moved in the optical axis direction so as not to pass beyond the in-focus point.
  • the amount by which the focus lens or the imaging surface of the image sensor 120 is moved may be, for example, 10 μm.
  • the imaging apparatus 100 divides the image I1 into a plurality of regions (S102). A feature amount may be calculated for each pixel in the image, and groups of pixels having similar feature amounts may each be treated as one region; alternatively, only the pixel group within the range set as the AF processing frame in the image I1 may be divided into regions. The imaging apparatus 100 divides the image I2 into regions corresponding to the regions of the image I1. Based on the blur amount of each of the regions of the image I1 and the blur amount of each of the regions of the image I2, the imaging apparatus 100 calculates the distance to the object included in each of the regions (S103). A sketch of this flow, with a stand-in blur metric, follows below.
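  • a minimal sketch of the S101-S103 flow, in Python. The patent does not disclose its concrete blur metric (that is the role of equation (1)), so the inverse of the Laplacian variance is used here only as a stand-in, and the 4x4 region grid is likewise an assumption:

```python
import numpy as np
from scipy import ndimage

def blur_amount(region: np.ndarray) -> float:
    # Stand-in blur metric: lower Laplacian variance -> more blur.
    return 1.0 / (np.var(ndimage.laplace(region.astype(float))) + 1e-9)

def bdaf_region_blur_pairs(i1: np.ndarray, i2: np.ndarray, grid=(4, 4)):
    """Divide images I1 and I2 into corresponding regions (S102) and return
    the per-region blur-amount pairs used to estimate distance (S103)."""
    h, w = i1.shape
    gh, gw = h // grid[0], w // grid[1]
    pairs = []
    for r in range(grid[0]):
        for c in range(grid[1]):
            sl = (slice(r * gh, (r + 1) * gh), slice(c * gw, (c + 1) * gw))
            pairs.append((blur_amount(i1[sl]), blur_amount(i2[sl])))
    return pairs
```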
  • the distance calculation procedure will be further described with reference to FIG.
  • the distance from the lens L (principal point) to the object 510 (object surface) is A
  • the distance from the lens L (principal point) to the position (image plane) where the object 510 forms an image on the imaging surface is B
  • the focal length is F.
  • the relationship between the distance A, the distance B, and the focal length F can be expressed by the following formula (2), derived from the lens formula.
  • the focal length F is specified by the lens position. Therefore, if the distance B at which the object 510 forms an image on the imaging surface can be specified, the distance A from the lens L to the object 510 can be specified using Expression (2).
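  • formula (2) also appears only as an image in the source; since the text derives it from the lens formula, it is presumably the standard thin-lens relation, from which A follows once B and F are known:

```latex
\frac{1}{A} + \frac{1}{B} = \frac{1}{F} \tag{2}
\qquad\Longrightarrow\qquad
A = \frac{BF}{B - F}
```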
  • the distance B can be specified by calculating the position at which the object 510 forms an image, from the sizes of the blur (the circles of confusion 512 and 514) of the object 510 projected on the imaging surface; once B is specified, A can be specified.
  • that is, the imaging position can be specified by using the fact that the size of the blur (the blur amount) is proportional to the distance between the imaging surface and the imaging position.
  • let D1 be the distance from the lens L to the position of the image I1, and let D2 be the distance from the lens L to the position of the image I2, which is farther from the image plane.
  • Each image is blurred.
  • the point spread function at this time is PSF, and the images at D 1 and D 2 are I d1 and I d2 , respectively.
  • the image I 1 can be expressed by the following equation (3) by a convolution operation.
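  • equation (3) is likewise reproduced only as an image; assuming I0 denotes the ideal (in-focus) image, a convolution of the following form is consistent with the text:

```latex
I_{d1}(x) = \left(\mathrm{PSF}_{d1} * I_{0}\right)(x) \tag{3}
```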
  • the value C shown in equation (4) corresponds to the amount of change between the blur amounts of the images Id1 and Id2; that is, the value C corresponds to the difference between the blur amount of the image Id1 and the blur amount of the image Id2.
  • in the BDAF method, AF processing may therefore take time.
  • a phase difference AF processing method that performs AF processing using two parallax images is known as a method that can shorten the AF processing time relatively.
  • the phase difference AF processing method includes an image plane phase difference AF (PDAF) method.
  • in the PDAF method, an image sensor in which a plurality of pixels are arranged as shown in FIG. 6 is used.
  • the image sensor includes a pixel column 600 including a Bayer array pixel block that outputs a color component detection signal, and a pixel column 602 and a pixel column 604 including pixel blocks including pixels that output a phase difference detection signal.
  • the pixel P1 that outputs the phase difference detection signal included in the pixel column 602 and the pixel P2 that outputs the phase difference detection signal included in the pixel column 604 have different light incident directions. Therefore, an image having a different phase is obtained from the image obtained from the pixel P1 and the image obtained from the pixel P2. Then, the distance to the subject can be derived based on the amount and direction of deviation between the image included in the image obtained from the pixel P1 and the image included in the image obtained from the pixel P2.
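  • as a rough illustration of deriving that shift: the displacement between the signals from the P1 and P2 pixel columns can be estimated, for example, with a normalized correlation search over one-dimensional lines. The patent does not specify the actual matching algorithm; this is a sketch under that assumption:

```python
import numpy as np

def phase_shift(line_p1: np.ndarray, line_p2: np.ndarray, max_shift: int = 16) -> int:
    """Estimate the signed pixel shift between two phase-detection lines;
    the sign gives the defocus direction, the magnitude its amount."""
    a = (line_p1 - line_p1.mean()) / (line_p1.std() + 1e-9)
    b = (line_p2 - line_p2.mean()) / (line_p2.std() + 1e-9)
    best_shift, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        # Overlap a[i] with b[i - s] and score the normalized correlation.
        ov = np.dot(a[max(0, s):len(a) + min(0, s)],
                    b[max(0, -s):len(b) + min(0, -s)])
        score = ov / (len(a) - abs(s))
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift
```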
  • the PDAF method can execute AF processing faster than the BDAF method.
  • however, the pixels for phase difference detection are spaced relatively far apart, so the image quality of the images obtained from the pixel P1 and the pixel P2 is relatively low. For this reason, if a high frequency component is included in the spatial frequency band of the image captured by the image sensor, the accuracy of the AF processing may decrease.
  • in the present embodiment, therefore, the imaging apparatus 100 executes AF processing by selecting either the BDAF-method AF processing or the PDAF-method AF processing based on the spatial frequency band of the image, or by combining both.
  • the imaging control unit 110 includes a derivation unit 112 and a focusing control unit 114.
  • the deriving unit 112 derives the spatial frequency band of the image captured by the image sensor 120.
  • the deriving unit 112 may derive the spatial frequency band of the image by performing Fourier transform on the image and decomposing the image for each spatial frequency component.
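  • a minimal sketch of such a derivation, assuming a single-channel image: the 2-D FFT decomposes the image into spatial frequency components, and the fraction of spectral energy below an assumed cutoff frequency (the band reproducible by the phase-difference pixels, region 702 in FIG. 7) yields the ratio B used in the selection steps below:

```python
import numpy as np

def band_ratio(image: np.ndarray, pdaf_cutoff: float) -> float:
    """Fourier-transform the image (cf. derivation unit 112) and return the
    fraction of spectral energy at radial spatial frequencies (cycles/pixel)
    at or below pdaf_cutoff, an assumed band edge for the PDAF pixels."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image.astype(float)))) ** 2
    h, w = image.shape
    fy = np.fft.fftshift(np.fft.fftfreq(h))[:, None]
    fx = np.fft.fftshift(np.fft.fftfreq(w))[None, :]
    radius = np.hypot(fy, fx)
    return float(spectrum[radius <= pdaf_cutoff].sum() / (spectrum.sum() + 1e-12))
```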
  • the focusing control unit 114 controls the positional relationship between the imaging surface of the image sensor 120 and the lens 210 by selecting either the BDAF process, which executes autofocus processing based on the blur amounts of a plurality of images captured with different positional relationships between the imaging surface of the image sensor 120 and the lens 210, or the PDAF process, which executes autofocus processing by the phase difference method, or by combining both.
  • the focus control unit 114 may control the position of the focus lens by selecting one of BDAF processing and PDAF processing based on the spatial frequency band of the image, or by combining both.
  • the focus control unit 114 is an example of a control unit.
  • the BDAF process is an example of a first AF process.
  • the PDAF process is an example of a second AF process.
  • the focusing control unit 114 may control the positional relationship between the imaging surface of the image sensor 120 and the lens 210 by selecting either the BDAF process or the PDAF process, or by combining both, based on the spatial frequency band of the image derived by the deriving unit 112 and the spatial frequency band predetermined for the PDAF process. The focusing control unit 114 may also control the positional relationship between the imaging surface of the image sensor 120 and the lens 210 by selecting either the BDAF process or the PDAF process, or by combining both, based on the ratio of the spatial frequency band of the image derived by the deriving unit 112 that falls within the predetermined spatial frequency band.
  • the focusing control unit 114 may control the positional relationship between the imaging surface of the image sensor 120 and the lens 210 by selecting the PDAF process when the ratio is equal to or greater than the threshold, and selecting the BDAF process when the ratio is smaller than the threshold.
  • the focusing control unit 114 may also control the positional relationship between the imaging surface of the image sensor 120 and the lens 210 by selecting the PDAF process when the ratio is equal to or greater than the first threshold, combining the BDAF process and the PDAF process when the ratio is smaller than the first threshold and equal to or greater than the second threshold, and selecting the BDAF process when the ratio is smaller than the second threshold.
  • the focusing control unit 114 may weight the distance to the subject specified by the BDAF process and the distance to the subject specified by the PDAF process, and control the position of the focus lens based on the weighted distances.
  • the focus control unit 114 may change the weighting of the BDAF process and the PDAF process depending on the ratio.
  • the focusing control unit 114 may control the positional relationship between the imaging surface of the image sensor 120 and the lens 210 by selecting either the BDAF process or the PDAF process, or by combining both, by comparing the spatial frequency band of the image derived by the deriving unit 112 with a predetermined spatial frequency band that can be reproduced by the plurality of pixels that generate the phase difference detection signal.
  • the focusing control unit 114 may execute the PDAF process based on the phase difference detection signal output from the image sensor 120, in which a plurality of pixels that generate the phase difference detection signal and a plurality of other pixels that generate color component signals are arranged in a predetermined arrangement pattern.
  • the plurality of pixels that generate the phase difference detection signal are, for example, the pixel P1 and the pixel P2 illustrated in FIG.
  • the plurality of other pixels that generate the color component signal are, for example, the pixel R, the pixel G, and the pixel B illustrated in FIG.
  • the predetermined spatial frequency band may be a spatial frequency band that can be reproduced by the pixel P1 and the pixel P2, for example.
  • the spatial frequency band that can be reproduced by the pixel R, the pixel G, and the pixel B is, for example, a region 700.
  • the spatial frequency band that can be reproduced by the pixel P1 and the pixel P2 is, for example, a region 702.
  • Region 700 and region 702 each contain a low frequency component.
  • the frequency component included in the region 702 has a lower proportion of high frequency components than the frequency component included in the region 700. Therefore, when the image picked up by the image sensor 120 includes many high-frequency components, the spatial frequency components that can be reproduced by the pixels P1 and P2 are small, and the focus control unit 114 cannot execute the AF processing with high accuracy by the PDAF processing. There is a case.
  • the BDAF process is executed based on the color component signal. That is, the focus control unit 114 can perform BDAF processing with high accuracy on an image including a spatial frequency band in the range of the region 700.
  • a region 710 is a region corresponding to the spatial frequency band of the image derived by the deriving unit 112.
  • the region 712 is a region corresponding to the spatial frequency band of the image included in the region 702 in the spatial frequency band of the image.
  • the focusing control unit 114 may control the position of the focus lens by selecting either the BDAF process or the PDAF process, or by combining both, based on the ratio of the area of the region 712 to the area of the region 710.
  • the focus control unit 114 may select the PDAF process and control the position of the focus lens if the ratio of the region 712 is equal to or greater than a predetermined threshold.
  • the focus control unit 114 may select the BDAF process and control the position of the focus lens when the ratio of the area 712 is smaller than a predetermined threshold.
  • alternatively, the focusing control unit 114 may select the PDAF process when the maximum spatial frequency component of the image is equal to or smaller than a predetermined spatial frequency component, and select the BDAF process when it is larger, to control the position of the focus lens.
  • the predetermined spatial frequency component may be determined based on, for example, the maximum spatial frequency component in the spatial frequency band that can be reproduced by the pixel P1 and the pixel P2.
  • FIG. 8 is a flowchart showing an example of the procedure of AF processing.
  • the deriving unit 112 acquires an image captured by the image sensor 120.
  • the deriving unit 112 derives a spatial frequency band of the image.
  • the deriving unit 112 derives a ratio B of the spatial frequency band of the image that occupies a predetermined spatial frequency band for the PDAF processing (S200).
  • the focus control unit 114 determines whether or not the derived ratio B is equal to or greater than a predetermined first threshold Th1 (S202). If the ratio B is equal to or greater than the first threshold Th1, the focus control unit 114 selects the PDAF process and controls the positional relationship between the imaging surface of the image sensor 120 and the lens 210 (S204). On the other hand, if the ratio B is smaller than the first threshold Th1, the focusing control unit 114 selects the BDAF process and controls the positional relationship between the imaging surface of the image sensor 120 and the lens 210 (S206).
  • in this way, the focusing control unit 114 switches between the PDAF process and the BDAF process.
  • when high frequency components are included in the image, selecting the BDAF process avoids the loss of AF accuracy that executing the PDAF process would cause.
  • when high frequency components are not included in the image, selecting the PDAF process avoids the long processing time that executing the BDAF process would entail. Therefore, according to the imaging device 100 of the present embodiment, an increase in AF processing time and a decrease in AF processing accuracy can be suppressed in a balanced manner. Low frequency components and high frequency components may be distinguished using the first threshold Th1 as a boundary.
  • FIG. 9 is a flowchart showing another example of the AF processing procedure.
  • the deriving unit 112 derives a spatial frequency band of the image. Further, the deriving unit 112 derives a ratio B of the spatial frequency band of the image that occupies a predetermined spatial frequency band for the PDAF processing (S300).
  • the focus control unit 114 determines whether or not the derived ratio B is equal to or greater than a predetermined first threshold Th1 (S302). If the ratio B is equal to or greater than the first threshold Th1, the focus control unit 114 selects the PDAF process and controls the positional relationship between the imaging surface of the image sensor 120 and the lens 210 (S304). On the other hand, if the ratio B is smaller than the first threshold value Th1, the focusing control unit 114 determines whether the ratio B is smaller than the first threshold value Th1 and is equal to or larger than a predetermined second threshold value Th2 (S306).
  • if the ratio B is smaller than the first threshold Th1 and equal to or greater than the second threshold Th2, the focusing control unit 114 combines the PDAF process and the BDAF process to control the positional relationship between the imaging surface of the image sensor 120 and the lens 210 (S308).
  • the focus control unit 114 may control the position of the focus lens based on the position of the focus lens specified by the PDAF process and the position of the focus lens specified by the BDAF process.
  • the focusing control unit 114 may weight each focus lens position specified by the PDAF process and the BDAF process according to the ratio B, and control the position of the focus lens based on the weighted positions.
  • the focusing control unit 114 may weight each distance so that the larger the ratio B is, the larger the weight of the distance specified by the PDAF process becomes relative to the weight of the distance specified by the BDAF process.
  • if the ratio B is smaller than the second threshold Th2, the focusing control unit 114 selects the BDAF process and controls the positional relationship between the imaging surface of the image sensor 120 and the lens 210 (S310). A sketch of this selection logic follows below.
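  • a minimal sketch of the S302-S310 selection logic, in Python. The focus positions that the PDAF and BDAF processes would each command are hypothetical inputs here, and the linear weighting between Th2 and Th1 is an assumption consistent with the ratio-dependent weighting described above:

```python
def select_focus_position(ratio_b: float, th1: float, th2: float,
                          pdaf_pos: float, bdaf_pos: float) -> float:
    """Choose a focus-lens position following the FIG. 9 flow (S302-S310)."""
    if ratio_b >= th1:
        return pdaf_pos                            # S304: fast PDAF suffices
    if ratio_b >= th2:
        w_pdaf = (ratio_b - th2) / (th1 - th2)     # S308: blend, favoring PDAF as B grows
        return w_pdaf * pdaf_pos + (1.0 - w_pdaf) * bdaf_pos
    return bdaf_pos                                # S310: accurate BDAF
```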
  • in this way, the focusing control unit 114 executes the AF process by switching between the PDAF process and the BDAF process, or by combining the PDAF process and the BDAF process.
  • when the ratio B is near the boundary, for example around 50%, executing the AF process by combining the PDAF process and the BDAF process can improve the accuracy of the AF process compared with executing it by only one of the two. Therefore, according to the imaging device 100 of the present embodiment, an increase in AF processing time and a decrease in AF processing accuracy can be suppressed in a balanced manner.
  • FIG. 10 illustrates an example of a computer 1200 in which aspects of the present invention may be embodied in whole or in part.
  • a program installed in the computer 1200 can cause the computer 1200 to function as one or more “units” of the apparatus according to the embodiment of the present invention, or to execute operations associated with that apparatus.
  • the program can cause the computer 1200 to execute the operation or the one or more “units”.
  • the program can cause the computer 1200 to execute a process according to an embodiment of the present invention or a stage of the process.
  • Such a program may be executed by CPU 1212 to cause computer 1200 to perform certain operations associated with some or all of the blocks in the flowcharts and block diagrams described herein.
  • the computer 1200 includes a CPU 1212 and a RAM 1214, which are connected to each other by a host controller 1210.
  • the computer 1200 also includes a communication interface 1222 and an input / output unit, which are connected to the host controller 1210 via the input / output controller 1220.
  • Computer 1200 also includes ROM 1230.
  • the CPU 1212 operates according to programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
  • the communication interface 1222 communicates with other electronic devices via a network.
  • a hard disk drive may store programs and data used by the CPU 1212 in the computer 1200.
  • the ROM 1230 stores therein a boot program executed by the computer 1200 at the time of activation and / or a program depending on the hardware of the computer 1200.
  • the program is provided via a network or via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card.
  • the program is installed in the RAM 1214 or the ROM 1230 that is also an example of a computer-readable recording medium, and is executed by the CPU 1212.
  • Information processing described in these programs is read by the computer 1200 to bring about cooperation between the programs and the various types of hardware resources.
  • an apparatus or method may thereby be constituted by realizing operations or processing of information in accordance with the use of the computer 1200.
  • for example, the CPU 1212 may execute a communication program loaded in the RAM 1214 and, based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing.
  • under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer area provided in the RAM 1214 or in a recording medium such as a USB memory and transmits the read data to the network, or writes reception data received from the network into a reception buffer area provided on the recording medium.
  • the CPU 1212 may cause the RAM 1214 to read all or a necessary portion of a file or database stored in an external recording medium such as a USB memory, and execute various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.
  • the CPU 1212 may perform, on data read from the RAM 1214, various types of processing described throughout the present disclosure and specified by the instruction sequences of programs, including various types of operations, information processing, condition judgment, conditional branching, unconditional branching, and information search/replacement, and write the results back to the RAM 1214.
  • the CPU 1212 may search for information in a file, a database, or the like in the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may search the plurality of entries for an entry whose first-attribute value matches a specified condition, read the second-attribute value stored in that entry, and thereby acquire the second-attribute value associated with the first attribute satisfying the predetermined condition.
  • the program or software module described above may be stored in a computer-readable storage medium on the computer 1200 or in the vicinity of the computer 1200.
  • a recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, thereby providing the program to the computer 1200 via the network.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Automatic Focus Adjustment (AREA)
  • Focusing (AREA)

Abstract

This imaging control device may be provided with a derivation unit which derives the spatial frequency band of a captured image. The imaging control device may be provided with a control unit which, on the basis of the spatial frequency band of the image derived by the derivation unit, controls the positional relation between the imaging surface and the lens by selecting one of, or by combining both, first AF processing, in which autofocus processing is performed on the basis of the amount of blur in multiple images captured in states with different positional relations between the imaging surface and the lens, and second AF processing, in which autofocus processing is performed with a phase difference method.

Description

Imaging control apparatus, imaging apparatus, imaging system, moving object, imaging control method, and program

The present invention relates to an imaging control device, an imaging device, an imaging system, a moving body, an imaging control method, and a program.

An autofocus process (AF process) using two parallax images, and an AF process using a plurality of images with different amounts of blur captured with different shooting parameters, as in Patent Document 1, are known.

Patent Document 1: Japanese Patent No. 5932476

Challenges to be solved

In AF processing that uses the blur amount of an image, at least two images must be captured, so the AF processing time may be long. In AF processing that uses two parallax images, the accuracy of the AF processing may be reduced when the spatial frequency of the image contains a specific frequency component, such as a high frequency component.
General disclosure

The imaging control apparatus according to an aspect of the present invention may include a derivation unit that derives the spatial frequency band of a captured image. The imaging control apparatus may include a control unit that controls the positional relationship between the imaging surface and the lens, based on the spatial frequency band of the image derived by the derivation unit, by selecting either a first AF process, which executes autofocus processing based on the blur amounts of a plurality of images captured with different positional relationships between the imaging surface and the lens, or a second AF process, which executes autofocus processing by a phase difference method, or by combining both.

The control unit may control the positional relationship between the imaging surface and the lens by selecting either the first AF process or the second AF process, or by combining both, based on the spatial frequency band of the image derived by the derivation unit and a spatial frequency band predetermined for the second AF process.

The control unit may control the positional relationship between the imaging surface and the lens by selecting either the first AF process or the second AF process, or by combining both, based on the ratio of the spatial frequency band of the image derived by the derivation unit that falls within the predetermined spatial frequency band.

The control unit may control the positional relationship between the imaging surface and the lens by selecting the second AF process when the ratio is equal to or greater than a threshold, and selecting the first AF process when the ratio is smaller than the threshold.

The control unit may control the positional relationship between the imaging surface and the lens by selecting the second AF process when the ratio is equal to or greater than a first threshold, combining the first AF process and the second AF process when the ratio is smaller than the first threshold and equal to or greater than a second threshold, and selecting the first AF process when the ratio is smaller than the second threshold.

The control unit may control the positional relationship between the imaging surface and the lens by selecting either the first AF process or the second AF process, or by combining both, by comparing the spatial frequency band of the image derived by the derivation unit with a predetermined spatial frequency band that can be reproduced by the plurality of pixels that generate the phase difference detection signal.

The second AF process may be executed based on a phase difference detection signal output from an image sensor in which the plurality of pixels and a plurality of other pixels that generate color component signals are arranged in a predetermined arrangement pattern.

An imaging device according to an aspect of the present invention may include the above imaging control device. The imaging device may include an image sensor having an imaging surface. The imaging device may include a lens.

An imaging system according to an aspect of the present invention may include the above imaging device. The imaging system may include a support mechanism that supports the imaging device.

A moving body according to an aspect of the present invention may carry the above imaging system and move.

The imaging control method according to an aspect of the present invention may include a step of deriving the spatial frequency band of a captured image. The imaging control method may include a step of controlling the positional relationship between the imaging surface and the lens, based on the spatial frequency band of the derived image, by selecting either a first AF process, which executes autofocus processing based on the blur amounts of a plurality of images captured with different positional relationships between the imaging surface and the lens, or a second AF process, which executes autofocus processing by the phase difference method, or by combining both.

The program according to an aspect of the present invention may cause a computer to execute a step of deriving the spatial frequency band of a captured image. The program may cause the computer to execute a step of controlling the positional relationship between the imaging surface and the lens, based on the derived spatial frequency band of the image, by selecting either a first AF process, which executes autofocus processing based on the blur amounts of a plurality of images captured with different positional relationships between the imaging surface and the lens, or a second AF process, which executes autofocus processing by the phase difference method, or by combining both.

According to an aspect of the present invention, an increase in AF processing time and a decrease in AF processing accuracy can be suppressed in a balanced manner.

The above summary of the invention does not enumerate all the features of the present invention. Sub-combinations of these feature groups can also constitute inventions.
Brief description of the drawings:
FIG. 1 shows an example of the external appearance of an unmanned aerial vehicle and a remote control device.
FIG. 2 shows an example of the functional blocks of the unmanned aerial vehicle.
FIG. 3 shows an example of a curve representing the relationship between blur amount and lens position.
FIG. 4 shows an example of a procedure for calculating the distance to an object based on blur amount.
FIG. 5 illustrates the relationship between the position of an object, the position of the lens, and the focal length.
FIG. 6 shows an example of the pixel array of an image sensor.
FIG. 7 illustrates the ratio of the spatial frequency band of an image.
FIG. 8 is a flowchart showing an example of the AF processing procedure.
FIG. 9 is a flowchart showing another example of the AF processing procedure.
FIG. 10 shows an example of a hardware configuration.
 以下、発明の実施の形態を通じて本発明を説明するが、以下の実施の形態は請求の範囲に係る発明を限定するものではない。また、実施の形態の中で説明されている特徴の組み合わせの全てが発明の解決手段に必須であるとは限らない。以下の実施の形態に、多様な変更または改良を加えることが可能であることが当業者に明らかである。その様な変更または改良を加えた形態も本発明の技術的範囲に含まれ得ることが、請求の範囲の記載から明らかである。 Hereinafter, the present invention will be described through embodiments of the invention, but the following embodiments do not limit the claimed invention. Moreover, not all the combinations of features described in the embodiments are essential for the solution means of the invention. It will be apparent to those skilled in the art that various modifications or improvements can be made to the following embodiments. It is apparent from the scope of the claims that the embodiments added with such changes or improvements can be included in the technical scope of the present invention.
 請求の範囲、明細書、図面、及び要約書には、著作権による保護の対象となる事項が含まれる。著作権者は、これらの書類の何人による複製に対しても、特許庁のファイルまたはレコードに表示される通りであれば異議を唱えない。ただし、それ以外の場合、一切の著作権を留保する。 The claims, the description, the drawings, and the abstract include matters that are subject to copyright protection. The copyright owner will not object to any number of copies of these documents as they appear in the JPO file or record. However, in other cases, all copyrights are reserved.
 本発明の様々な実施形態は、フローチャート及びブロック図を参照して記載されてよく、ここにおいてブロックは、(1)操作が実行されるプロセスの段階または(2)操作を実行する役割を持つ装置の「部」を表わしてよい。特定の段階及び「部」が、プログラマブル回路、及び/またはプロセッサによって実装されてよい。専用回路は、デジタル及び/またはアナログハードウェア回路を含んでよい。集積回路(IC)及び/またはディスクリート回路を含んでよい。プログラマブル回路は、再構成可能なハードウェア回路を含んでよい。再構成可能なハードウェア回路は、論理AND、論理OR、論理XOR、論理NAND、論理NOR、及び他の論理操作、フリップフロップ、レジスタ、フィールドプログラマブルゲートアレイ(FPGA)、プログラマブルロジックアレイ(PLA)等のようなメモリ要素等を含んでよい。 Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block is either (1) a stage in a process in which an operation is performed or (2) an apparatus responsible for performing the operation. May represent a “part”. Certain stages and “units” may be implemented by programmable circuits and / or processors. Dedicated circuitry may include digital and / or analog hardware circuitry. Integrated circuits (ICs) and / or discrete circuits may be included. The programmable circuit may include a reconfigurable hardware circuit. Reconfigurable hardware circuits include logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, flip-flops, registers, field programmable gate arrays (FPGA), programmable logic arrays (PLA), etc. The memory element or the like may be included.
 コンピュータ可読媒体は、適切なデバイスによって実行される命令を格納可能な任意の有形なデバイスを含んでよい。その結果、そこに格納される命令を有するコンピュータ可読媒体は、フローチャートまたはブロック図で指定された操作を実行するための手段を作成すべく実行され得る命令を含む、製品を備えることになる。コンピュータ可読媒体の例としては、電子記憶媒体、磁気記憶媒体、光記憶媒体、電磁記憶媒体、半導体記憶媒体等が含まれてよい。コンピュータ可読媒体のより具体的な例としては、フロッピー(登録商標)ディスク、ディスケット、ハードディスク、ランダムアクセスメモリ(RAM)、リードオンリメモリ(ROM)、消去可能プログラマブルリードオンリメモリ(EPROMまたはフラッシュメモリ)、電気的消去可能プログラマブルリードオンリメモリ(EEPROM)、静的ランダムアクセスメモリ(SRAM)、コンパクトディスクリードオンリメモリ(CD-ROM)、デジタル多用途ディスク(DVD)、ブルーレイ(RTM)ディスク、メモリスティック、集積回路カード等が含まれてよい。 The computer readable medium may include any tangible device capable of storing instructions to be executed by a suitable device. As a result, a computer readable medium having instructions stored thereon comprises a product that includes instructions that can be executed to create a means for performing the operations specified in the flowcharts or block diagrams. Examples of computer readable media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like. More specific examples of computer readable media include floppy disks, diskettes, hard disks, random access memory (RAM), read only memory (ROM), erasable programmable read only memory (EPROM or flash memory), Electrically erasable programmable read only memory (EEPROM), static random access memory (SRAM), compact disc read only memory (CD-ROM), digital versatile disc (DVD), Blu-ray (RTM) disc, memory stick, integrated A circuit card or the like may be included.
Computer-readable instructions may include either source code or object code written in any combination of one or more programming languages. The source code or object code may include assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or code written in an object-oriented programming language such as Smalltalk, JAVA (registered trademark) or C++, or in a conventional procedural programming language such as the "C" programming language or a similar programming language. The computer-readable instructions may be provided to a processor or a programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing apparatus, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet. The processor or the programmable circuit may execute the computer-readable instructions to create means for performing the operations specified in the flowcharts or block diagrams. Examples of the processor include a computer processor, a processing unit, a microprocessor, a digital signal processor, a controller, a microcontroller, and the like.
FIG. 1 shows an example of the external appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300. The UAV 10 includes a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging device 100. The gimbal 50 and the imaging device 100 are an example of an imaging system. The UAV 10 is an example of a movable body propelled by a propulsion unit. The movable body is a concept that includes, in addition to a UAV, other aircraft that move through the air, vehicles that move on the ground, ships that move on water, and the like.
The UAV body 20 includes a plurality of rotor blades. The plurality of rotor blades is an example of the propulsion unit. The UAV body 20 causes the UAV 10 to fly by controlling the rotation of the plurality of rotor blades. For example, the UAV body 20 causes the UAV 10 to fly using four rotor blades. The number of rotor blades is not limited to four. The UAV 10 may also be a fixed-wing aircraft having no rotor blades.
The imaging device 100 is an imaging camera that captures a subject included in a desired imaging range. The gimbal 50 rotatably supports the imaging device 100. The gimbal 50 is an example of a support mechanism. For example, the gimbal 50 uses an actuator to support the imaging device 100 rotatably about the pitch axis. The gimbal 50 further uses actuators to support the imaging device 100 rotatably about each of the roll axis and the yaw axis. The gimbal 50 may change the attitude of the imaging device 100 by rotating the imaging device 100 about at least one of the yaw axis, the pitch axis, and the roll axis.
The plurality of imaging devices 60 are sensing cameras that image the surroundings of the UAV 10 in order to control the flight of the UAV 10. Two imaging devices 60 may be provided on the front, that is, the nose, of the UAV 10. Two other imaging devices 60 may be provided on the bottom surface of the UAV 10. The two imaging devices 60 on the front side may be paired and function as a so-called stereo camera. The two imaging devices 60 on the bottom side may also be paired and function as a stereo camera. Three-dimensional spatial data around the UAV 10 may be generated based on the images captured by the plurality of imaging devices 60. The number of imaging devices 60 included in the UAV 10 is not limited to four; it is sufficient for the UAV 10 to include at least one imaging device 60. The UAV 10 may include at least one imaging device 60 on each of the nose, the tail, the sides, the bottom surface, and the top surface of the UAV 10. The angle of view settable on the imaging devices 60 may be wider than the angle of view settable on the imaging device 100. The imaging devices 60 may have a prime (single-focal-length) lens or a fisheye lens.
The remote operation device 300 communicates with the UAV 10 to operate the UAV 10 remotely. The remote operation device 300 may communicate with the UAV 10 wirelessly. The remote operation device 300 transmits to the UAV 10 instruction information indicating various commands related to the movement of the UAV 10, such as ascending, descending, accelerating, decelerating, moving forward, moving backward, and rotating. The instruction information includes, for example, instruction information that raises the altitude of the UAV 10. The instruction information may indicate the altitude at which the UAV 10 should be located. The UAV 10 moves so as to be located at the altitude indicated by the instruction information received from the remote operation device 300. The instruction information may include an ascend command that causes the UAV 10 to ascend. The UAV 10 ascends while it is accepting the ascend command. Even while accepting the ascend command, the UAV 10 may limit its ascent when the altitude of the UAV 10 has reached an upper-limit altitude.
FIG. 2 shows an example of functional blocks of the UAV 10. The UAV 10 includes a UAV control unit 30, a memory 32, a communication interface 34, a propulsion unit 40, a GPS receiver 41, an inertial measurement unit 42, a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, the gimbal 50, and the imaging device 100.
The communication interface 34 communicates with other devices such as the remote operation device 300. The communication interface 34 may receive, from the remote operation device 300, instruction information including various commands for the UAV control unit 30. The memory 32 stores the programs and the like that the UAV control unit 30 needs to control the propulsion unit 40, the GPS receiver 41, the inertial measurement unit (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the gimbal 50, the imaging devices 60, and the imaging device 100. The memory 32 may be a computer-readable recording medium and may include at least one of an SRAM, a DRAM, an EPROM, an EEPROM, and a flash memory such as a USB memory. The memory 32 may be provided inside the UAV body 20, or may be provided so as to be removable from the UAV body 20.
The UAV control unit 30 controls the flight and imaging of the UAV 10 in accordance with a program stored in the memory 32. The UAV control unit 30 may be configured by a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like. The UAV control unit 30 controls the flight and imaging of the UAV 10 in accordance with commands received from the remote operation device 300 via the communication interface 34. The propulsion unit 40 propels the UAV 10. The propulsion unit 40 includes a plurality of rotor blades and a plurality of drive motors that rotate the plurality of rotor blades. The propulsion unit 40 causes the UAV 10 to fly by rotating the plurality of rotor blades via the plurality of drive motors in accordance with commands from the UAV control unit 30.
The GPS receiver 41 receives a plurality of signals indicating times transmitted from a plurality of GPS satellites. The GPS receiver 41 calculates the position of the GPS receiver 41, that is, the position of the UAV 10, based on the plurality of received signals. The IMU 42 detects the attitude of the UAV 10. As the attitude of the UAV 10, the IMU 42 detects accelerations in the three axial directions of front-back, left-right, and up-down of the UAV 10, and angular velocities about the three axes of pitch, roll, and yaw. The magnetic compass 43 detects the heading of the nose of the UAV 10. The barometric altimeter 44 detects the altitude at which the UAV 10 flies. The barometric altimeter 44 detects the atmospheric pressure around the UAV 10, converts the detected atmospheric pressure into an altitude, and thereby detects the altitude. The temperature sensor 45 detects the temperature around the UAV 10.
The imaging device 100 includes an imaging unit 102 and a lens unit 200. The lens unit 200 is an example of a lens device. The imaging unit 102 includes an image sensor 120, an imaging control unit 110, and a memory 130. The image sensor 120 may be configured by a CCD or a CMOS. The image sensor 120 outputs image data of an optical image formed through a plurality of lenses 210 to the imaging control unit 110. The imaging control unit 110 may be configured by a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like. The imaging control unit 110 may control the imaging device 100 in accordance with operation commands for the imaging device 100 from the UAV control unit 30. The memory 130 may be a computer-readable recording medium and may include at least one of an SRAM, a DRAM, an EPROM, an EEPROM, and a flash memory such as a USB memory. The memory 130 stores the programs and the like that the imaging control unit 110 needs to control the image sensor 120 and the like. The memory 130 may be provided inside the housing of the imaging device 100, or may be provided so as to be removable from the housing of the imaging device 100.
The lens unit 200 includes the plurality of lenses 210, a lens moving mechanism 212, and a lens control unit 220. The plurality of lenses 210 may function as a zoom lens, a varifocal lens, and a focus lens. At least some or all of the plurality of lenses 210 are arranged so as to be movable along the optical axis. The lens unit 200 may be an interchangeable lens provided so as to be attachable to and detachable from the imaging unit 102. The lens moving mechanism 212 moves at least some or all of the plurality of lenses 210 along the optical axis. The lens control unit 220 drives the lens moving mechanism 212 in accordance with lens control commands from the imaging unit 102 to move one or more of the lenses 210 along the optical-axis direction. The lens control commands are, for example, a zoom control command and a focus control command.
The imaging device 100 configured in this manner executes autofocus processing (AF processing) to image a desired subject.
To execute the AF processing, the imaging device 100 determines the distance from the lens to the subject (the subject distance). One method for determining the subject distance is to determine it based on the blur amounts of a plurality of images captured in states where the positional relationship between the lens and the imaging surface differs. Here, this method is referred to as the bokeh detection autofocus (Bokeh Detection Auto Focus: BDAF) method.
For example, the blur amount (Cost) of an image can be expressed by the following equation (1) using a Gaussian function, where x indicates the pixel position in the horizontal direction and σ indicates the standard deviation.

(Equation (1) is reproduced only as an image in the original publication.)
FIG. 3 shows an example of the curve expressed by equation (1). By setting the focus lens to the lens position corresponding to the local minimum point 502 of the curve 500, the object included in the image I can be brought into focus.
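As a non-limiting illustration (not part of the original disclosure), the search for the minimum point 502 can be sketched in Python; the blur-cost metric below is an assumed stand-in for equation (1), using inverse gradient energy:

    import numpy as np

    def blur_cost(image: np.ndarray) -> float:
        # Hypothetical cost: the lower the high-frequency (gradient)
        # energy, the larger the blur cost, mimicking curve 500.
        gy, gx = np.gradient(image.astype(float))
        sharpness = np.mean(gx ** 2 + gy ** 2)
        return 1.0 / (sharpness + 1e-12)

    def best_lens_position(images_by_position: dict[int, np.ndarray]) -> int:
        # images_by_position maps a lens position to the image captured there;
        # the in-focus position is the one minimizing the cost (point 502).
        costs = {pos: blur_cost(img) for pos, img in images_by_position.items()}
        return min(costs, key=costs.get)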
FIG. 4 is a flowchart showing an example of the distance calculation procedure of the BDAF method. First, with the lens and the imaging surface in a first positional relationship, the imaging device 100 captures a first image I1 and stores it in the memory 130. Next, the focus lens or the imaging surface of the image sensor 120 is moved in the optical-axis direction to put the lens and the imaging surface in a second positional relationship, and the imaging device 100 captures a second image I2 and stores it in the memory 130 (S101). For example, as in so-called hill-climbing AF, the focus lens or the imaging surface of the image sensor 120 is moved in the optical-axis direction so as not to pass beyond the in-focus point. The amount of movement of the focus lens or the imaging surface of the image sensor 120 may be, for example, 10 μm.
Next, the imaging device 100 divides the image I1 into a plurality of regions (S102). A feature amount may be calculated for each pixel in the image I1, and the image I1 may be divided into a plurality of regions by treating a group of pixels having similar feature amounts as one region. Alternatively, only the group of pixels in the range set as the AF processing frame in the image I1 may be divided into a plurality of regions. The imaging device 100 divides the image I2 into a plurality of regions corresponding to the plurality of regions of the image I1. Based on the blur amount of each of the plurality of regions of the image I1 and the blur amount of each of the plurality of regions of the image I2, the imaging device 100 calculates, for each of the plurality of regions, the distance to the object included in that region (S103).
The distance calculation procedure will be further described with reference to FIG. 5. Let A be the distance from the lens L (principal point) to the object 510 (object plane), B the distance from the lens L (principal point) to the position (image plane) at which the object 510 forms an image on the imaging surface, and F the focal length. In this case, the relationship between the distance A, the distance B, and the focal length F can be expressed by the following equation (2) from the lens formula:

    1/A + 1/B = 1/F    (2)
The focal length F is specified by the lens position. Therefore, if the distance B at which the object 510 forms an image on the imaging surface can be specified, the distance A from the lens L to the object 510 can be specified using equation (2).
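As a minimal sketch (an illustrative helper, not from the original disclosure), equation (2) can be solved for the subject distance A as follows:

    def subject_distance(b: float, f: float) -> float:
        # Rearranging 1/A + 1/B = 1/F gives A = 1 / (1/F - 1/B).
        # b: lens-to-image distance B; f: focal length F (same units).
        return 1.0 / (1.0 / f - 1.0 / b)

    # Example: with F = 50 mm and B = 52 mm, A = 1300 mm.
    print(subject_distance(52.0, 50.0))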
As shown in FIG. 5, the distance B can be specified by calculating, from the size of the blur of the object 510 projected on the imaging surface (the circles of confusion 512 and 514), the position at which the object 510 forms an image; the distance A can then be specified as well. That is, the image-forming position can be specified by taking into account that the size of the blur (the blur amount) is proportional to the distance between the imaging surface and the image-forming position.
Here, let D1 be the distance from the image I1, which is closer to the imaging surface, to the lens L, and let D2 be the distance from the image I2, which is farther from the image plane, to the lens L. Each image is blurred. Let the point spread functions at this time be PSF1 and PSF2, and let Id1 and Id2 be the images at D1 and D2, respectively. In this case, for example, the image I1 can be expressed by a convolution operation as in the following equation (3).

(Equation (3), a convolution expression, is reproduced only as an image in the original publication.)
Further, let f be the Fourier transform function of the image data Id1 and Id2, and let OTF1 and OTF2 be the optical transfer functions obtained by Fourier-transforming the point spread functions PSF1 and PSF2 of the images Id1 and Id2. A ratio is then taken as in the following equation (4).

(Equation (4), which takes this ratio, is reproduced only as an image in the original publication.)
The value C shown in equation (4) corresponds to the amount of change between the respective blur amounts of the images Id1 and Id2; that is, the value C corresponds to the difference between the blur amount of the image Id1 and the blur amount of the image Id2.
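As an illustrative reading of equation (4) (whose exact form is reproduced only as an image here, so the following is an assumption): dividing the Fourier transforms of the two blurred images cancels the scene spectrum f and leaves the ratio OTF2/OTF1, a frequency-domain measure of the change in blur:

    import numpy as np

    def blur_change(img_d1: np.ndarray, img_d2: np.ndarray) -> np.ndarray:
        # F(Id1) = f * OTF1 and F(Id2) = f * OTF2, so F(Id2) / F(Id1)
        # approximates OTF2 / OTF1 wherever the scene spectrum f is nonzero.
        F1 = np.fft.fft2(img_d1)
        F2 = np.fft.fft2(img_d2)
        eps = 1e-9  # avoid division by zero at weak frequencies
        return F2 / (F1 + eps)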
According to the BDAF method described above, at least two images need to be acquired, so the AF processing may take time.
As a method that can make the AF processing time relatively short, a phase difference AF processing method that performs AF processing using two parallax images is known. Phase difference AF processing methods include the image-plane phase difference AF (PDAF) method. In the PDAF method, for example, an image sensor in which a plurality of pixels are arranged as shown in FIG. 6 is used. The image sensor includes a pixel row 600 composed of pixel blocks in a Bayer array that output color-component detection signals, and pixel rows 602 and 604 composed of pixel blocks including pixels that output phase-difference detection signals. The pixels P1, which are included in the pixel row 602 and output phase-difference detection signals, and the pixels P2, which are included in the pixel row 604 and output phase-difference detection signals, differ in the direction from which light is incident on them. Accordingly, images having different phases are obtained from the pixels P1 and from the pixels P2. The distance to the subject can then be derived based on the amount and direction of the shift between the image obtained from the pixels P1 and the image obtained from the pixels P2.
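A minimal sketch (an assumed implementation, not taken from the original disclosure) of deriving that shift: cross-correlate the two phase-detection line signals and take the lag of maximum correlation, whose magnitude and sign give the amount and direction of the shift:

    import numpy as np

    def phase_shift(line_p1: np.ndarray, line_p2: np.ndarray) -> int:
        # Zero-mean the two 1-D phase-detection signals, cross-correlate,
        # and return the lag with the highest correlation.
        a = line_p1 - line_p1.mean()
        b = line_p2 - line_p2.mean()
        corr = np.correlate(a, b, mode="full")
        return int(np.argmax(corr)) - (len(b) - 1)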
The PDAF method can often execute AF processing faster than the BDAF method. However, the phase-difference detection pixels are spaced relatively far apart, so the image quality of the images obtained from the pixels P1 and the pixels P2 is relatively low. Consequently, when the spatial frequency band of the image captured by the image sensor contains many high-frequency components, the accuracy of the AF processing may decrease.
Therefore, the imaging device 100 according to the present embodiment executes AF processing by selecting one of the BDAF-method AF processing and the PDAF-method AF processing, or by combining both, based on the spatial frequency band of the image.
The imaging control unit 110 includes a deriving unit 112 and a focus control unit 114. The deriving unit 112 derives the spatial frequency band of the image captured by the image sensor 120. The deriving unit 112 may derive the spatial frequency band of the image by applying a Fourier transform to the image and decomposing the image into its spatial frequency components.
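A minimal sketch of such a decomposition (assumed, numpy-based): apply a two-dimensional FFT and collapse the magnitude spectrum into a gain per radial spatial frequency, comparable to the gain curves of FIGS. 7A and 7B:

    import numpy as np

    def radial_gain(image: np.ndarray, n_bins: int = 64) -> np.ndarray:
        # 2-D FFT magnitude, shifted so spatial frequency grows with radius.
        spec = np.abs(np.fft.fftshift(np.fft.fft2(image.astype(float))))
        h, w = spec.shape
        yy, xx = np.mgrid[0:h, 0:w]
        r = np.hypot(yy - h / 2, xx - w / 2)
        bins = (r / r.max() * (n_bins - 1)).astype(int)
        # Average magnitude per radial-frequency bin = gain per band.
        sums = np.bincount(bins.ravel(), weights=spec.ravel(), minlength=n_bins)
        counts = np.maximum(np.bincount(bins.ravel(), minlength=n_bins), 1)
        return sums / counts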
Based on the spatial frequency band of the image derived by the deriving unit 112, the focus control unit 114 controls the positional relationship between the imaging surface of the image sensor 120 and the lenses 210 by selecting one of, or combining both of, BDAF processing, which executes autofocus processing based on the blur amounts of a plurality of images captured in states where the positional relationship between the imaging surface of the image sensor 120 and the lenses 210 differs, and PDAF processing, which executes autofocus processing by the phase difference method. The focus control unit 114 may control the position of the focus lens by selecting one of, or combining both of, the BDAF processing and the PDAF processing based on the spatial frequency band of the image. The focus control unit 114 is an example of a control unit. The BDAF processing is an example of first AF processing. The PDAF processing is an example of second AF processing.
The focus control unit 114 may control the positional relationship between the imaging surface of the image sensor 120 and the lenses 210 by selecting one of, or combining both of, the BDAF processing and the PDAF processing based on the spatial frequency band of the image derived by the deriving unit 112 and a spatial frequency band predetermined for the PDAF processing. The focus control unit 114 may control the positional relationship between the imaging surface of the image sensor 120 and the lenses 210 by selecting one of, or combining both of, the BDAF processing and the PDAF processing based on the proportion of the spatial frequency band of the image derived by the deriving unit 112 that is included in the predetermined spatial frequency band.
The focus control unit 114 may control the positional relationship between the imaging surface of the image sensor 120 and the lenses 210 by selecting the PDAF processing when the proportion is equal to or greater than a threshold, and selecting the BDAF processing when the proportion is smaller than the threshold. The focus control unit 114 may control the positional relationship between the imaging surface of the image sensor 120 and the lenses 210 by selecting the PDAF processing when the proportion is equal to or greater than a first threshold, combining the BDAF processing and the PDAF processing when the proportion is smaller than the first threshold and equal to or greater than a second threshold, and selecting the BDAF processing when the proportion is smaller than the second threshold. For example, the focus control unit 114 may weight the distance to the subject specified by the BDAF processing and the distance to the subject specified by the PDAF processing, and control the position of the focus lens based on the weighted distances. The focus control unit 114 may change the magnitudes of the weights given to the BDAF processing and the PDAF processing according to the magnitude of the proportion.
The focus control unit 114 may control the positional relationship between the imaging surface of the image sensor 120 and the lenses 210 by comparing the spatial frequency band of the image derived by the deriving unit 112 with a predetermined spatial frequency band reproducible by the plurality of pixels that generate the phase-difference detection signals, and thereby selecting one of, or combining both of, the BDAF processing and the PDAF processing.
The focus control unit 114 may execute the PDAF processing based on the phase-difference detection signals output from the image sensor 120, in which the plurality of pixels that generate the phase-difference detection signals and a plurality of other pixels that generate color-component signals are arranged in a predetermined array pattern. The plurality of pixels that generate the phase-difference detection signals are, for example, the pixels P1 and the pixels P2 shown in FIG. 6. The plurality of other pixels that generate the color-component signals are, for example, the pixels R, the pixels G, and the pixels B shown in FIG. 6. The predetermined spatial frequency band may be, for example, the spatial frequency band reproducible by the pixels P1 and the pixels P2.
As shown in FIGS. 7A and 7B, the spatial frequency band reproducible by the pixels R, G, and B is, for example, a region 700. On the other hand, the spatial frequency band reproducible by the pixels P1 and P2 is, for example, a region 702. The regions 700 and 702 each contain low-frequency components. The frequency components included in the region 702 have a smaller proportion of high-frequency components than the frequency components included in the region 700. Therefore, when the image captured by the image sensor 120 contains many high-frequency components, few of its spatial frequency components are reproducible by the pixels P1 and P2, and the focus control unit 114 may be unable to execute AF processing accurately with the PDAF processing.
On the other hand, the BDAF processing is executed based on the color-component signals. That is, the focus control unit 114 can execute the BDAF processing accurately on an image containing spatial frequency bands within the range of the region 700.
FIGS. 7A and 7B show the relationship between spatial frequency and the gain of each spatial frequency. The larger the gain, the more of the spatial frequency component corresponding to that gain the image contains. In FIGS. 7A and 7B, a region 710 is the region corresponding to the spatial frequency band of the image derived by the deriving unit 112. A region 712 is the region corresponding to the part of the spatial frequency band of the image that is included in the region 702. The focus control unit 114 may control the position of the focus lens by selecting one of, or combining both of, the BDAF processing and the PDAF processing based on the ratio of the area of the region 712 to the area of the region 710.
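Continuing the sketch above (same assumptions), the area ratio of the region 712 to the region 710 can be approximated from the radial gain curve, given an assumed cutoff bin for the band reproducible by the pixels P1 and P2:

    import numpy as np

    def pdaf_band_ratio(gain: np.ndarray, cutoff_bin: int) -> float:
        # Region 712: spectral area up to the PDAF-reproducible cutoff;
        # region 710: total area under the gain curve.
        total = float(gain.sum())
        return float(gain[:cutoff_bin].sum()) / total if total > 0 else 0.0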
For example, as shown in FIG. 7A, the focus control unit 114 may select the PDAF processing and control the position of the focus lens when the proportion of the region 712 is equal to or greater than a predetermined threshold. On the other hand, as shown in FIG. 7B, the focus control unit 114 may select the BDAF processing and control the position of the focus lens when the proportion of the region 712 is smaller than the predetermined threshold.
The focus control unit 114 may control the position of the focus lens by selecting the PDAF processing if the maximum spatial frequency component included in the spatial frequency band of the image derived by the deriving unit 112 is equal to or lower than a predetermined spatial frequency component, and selecting the BDAF processing if the maximum spatial frequency component is higher than the predetermined spatial frequency component. The predetermined spatial frequency component may be determined based on, for example, the maximum spatial frequency component of the spatial frequency band reproducible by the pixels P1 and P2.
FIG. 8 is a flowchart showing an example of the AF processing procedure. First, the deriving unit 112 acquires an image captured by the image sensor 120. The deriving unit 112 derives the spatial frequency band of the image. Further, the deriving unit 112 derives the proportion B of the spatial frequency band of the image that falls within the spatial frequency band predetermined for the PDAF processing (S200).
The focus control unit 114 determines whether the derived proportion B is equal to or greater than a predetermined first threshold Th1 (S202). If the proportion B is equal to or greater than the first threshold Th1, the focus control unit 114 selects the PDAF processing and controls the positional relationship between the imaging surface of the image sensor 120 and the lenses 210 (S204). On the other hand, if the proportion B is smaller than the first threshold Th1, the focus control unit 114 selects the BDAF processing and controls the positional relationship between the imaging surface of the image sensor 120 and the lenses 210 (S206).
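The branch of FIG. 8 reduces to a single comparison; a minimal sketch (the threshold value is an assumption):

    def select_af_method(ratio_b: float, th1: float = 0.5) -> str:
        # S202: compare the proportion B against the first threshold Th1.
        return "PDAF" if ratio_b >= th1 else "BDAF"  # S204 / S206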
In this way, the focus control unit 114 switches between the PDAF processing and the BDAF processing based on the spatial frequency components included in the image used for the AF processing. This makes it possible to prevent the AF accuracy from decreasing because the focus control unit 114 executes the PDAF processing on an image that contains many high-frequency components, and to prevent the AF processing from taking a long time because the focus control unit 114 executes the BDAF processing on an image that does not contain many high-frequency components. Therefore, the imaging device 100 according to the present embodiment can suppress, in a well-balanced manner, both an increase in AF processing time and a decrease in AF processing accuracy. Low-frequency components and high-frequency components may be distinguished using the first threshold Th1 as the boundary.
FIG. 9 is a flowchart showing another example of the AF processing procedure. First, the deriving unit 112 derives the spatial frequency band of the image. Further, the deriving unit 112 derives the proportion B of the spatial frequency band of the image that falls within the spatial frequency band predetermined for the PDAF processing (S300).
The focus control unit 114 determines whether the derived proportion B is equal to or greater than the predetermined first threshold Th1 (S302). If the proportion B is equal to or greater than the first threshold Th1, the focus control unit 114 selects the PDAF processing and controls the positional relationship between the imaging surface of the image sensor 120 and the lenses 210 (S304). On the other hand, if the proportion B is smaller than the first threshold Th1, the focus control unit 114 determines whether the proportion B is smaller than the first threshold Th1 and equal to or greater than a predetermined second threshold Th2 (S306).
If the proportion B is smaller than the first threshold Th1 and equal to or greater than the predetermined second threshold Th2, the focus control unit 114 combines the PDAF processing and the BDAF processing to control the positional relationship between the imaging surface of the image sensor 120 and the lenses 210 (S308). For example, the focus control unit 114 may control the position of the focus lens based on the focus lens position specified by the PDAF processing and the focus lens position specified by the BDAF processing. The focus control unit 114 may weight each of the focus lens positions specified by the PDAF processing and the BDAF processing according to the proportion B, and control the position of the focus lens based on the weighted values. The focus control unit 114 may apply the weights so that the larger the proportion B is, the larger the weight given to the result of the PDAF processing becomes relative to the weight given to the result of the BDAF processing.
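A minimal sketch of this three-way branch and the blending step (the threshold values and the linear weighting are assumptions; the disclosure only requires that the weight of the PDAF result grow with the proportion B):

    def control_lens_position(ratio_b: float, pos_pdaf: float, pos_bdaf: float,
                              th1: float = 0.7, th2: float = 0.3) -> float:
        if ratio_b >= th1:           # S304: PDAF only
            return pos_pdaf
        if ratio_b >= th2:           # S308: blend; PDAF weight grows with B
            w = (ratio_b - th2) / (th1 - th2)
            return w * pos_pdaf + (1.0 - w) * pos_bdaf
        return pos_bdaf              # S310: BDAF only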
If the proportion B is smaller than the second threshold Th2, the focus control unit 114 selects the BDAF processing and controls the positional relationship between the imaging surface of the image sensor 120 and the lenses 210 (S310).
As described above, the focus control unit 114 executes AF processing by switching between the PDAF processing and the BDAF processing, or by combining the PDAF processing and the BDAF processing, based on the spatial frequency components included in the image. This makes it possible to prevent the AF accuracy from decreasing because the focus control unit 114 executes the PDAF processing on an image that contains many high-frequency components, and to prevent the AF processing from taking a long time because the focus control unit 114 executes the BDAF processing on an image that does not contain many high-frequency components. When the proportion B is, for example, around 50%, the AF processing is executed by combining the PDAF processing and the BDAF processing instead of selecting only one of them. Accordingly, when the proportion B is around 50%, the accuracy of the AF processing can be improved compared with executing the AF processing with only one of the PDAF processing and the BDAF processing. Therefore, the imaging device 100 according to the present embodiment can suppress, in a well-balanced manner, both an increase in AF processing time and a decrease in AF processing accuracy.
FIG. 10 shows an example of a computer 1200 in which a plurality of aspects of the present invention may be embodied in whole or in part. A program installed on the computer 1200 can cause the computer 1200 to function as operations associated with an apparatus according to an embodiment of the present invention, or as one or more "units" of the apparatus. Alternatively, the program can cause the computer 1200 to execute the operations or the one or more "units". The program can cause the computer 1200 to execute a process according to an embodiment of the present invention or stages of the process. Such a program may be executed by a CPU 1212 to cause the computer 1200 to execute particular operations associated with some or all of the blocks of the flowcharts and block diagrams described herein.
The computer 1200 according to the present embodiment includes the CPU 1212 and a RAM 1214, which are connected to each other by a host controller 1210. The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220. The computer 1200 also includes a ROM 1230. The CPU 1212 operates in accordance with programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
The communication interface 1222 communicates with other electronic devices via a network. A hard disk drive may store the programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores therein a boot program and the like executed by the computer 1200 at activation, and/or programs that depend on the hardware of the computer 1200. The programs are provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network. The programs are installed in the RAM 1214 or the ROM 1230, which are also examples of the computer-readable recording medium, and are executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. An apparatus or a method may be configured by realizing operations or processing on information in accordance with the use of the computer 1200.
For example, when communication is executed between the computer 1200 and an external device, the CPU 1212 may execute a communication program loaded into the RAM 1214 and instruct the communication interface 1222 to perform communication processing based on the processing described in the communication program. Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer area provided in a recording medium such as the RAM 1214 or a USB memory and transmits the read transmission data to the network, or writes reception data received from the network into a reception buffer area or the like provided in the recording medium.
Further, the CPU 1212 may cause all or a necessary portion of a file or a database stored in an external recording medium such as a USB memory to be read into the RAM 1214, and may execute various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.
Various types of information, such as various types of programs, data, tables, and databases, may be stored in a recording medium and subjected to information processing. On the data read from the RAM 1214, the CPU 1212 may execute various types of processing described throughout the present disclosure and specified by the instruction sequences of the programs, including various types of operations, information processing, condition determination, conditional branching, unconditional branching, information search/replacement, and the like, and writes the results back to the RAM 1214. The CPU 1212 may also search for information in files, databases, and the like in the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may search the plurality of entries for an entry matching a condition in which the attribute value of the first attribute is specified, read the attribute value of the second attribute stored in that entry, and thereby acquire the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
The programs or software modules described above may be stored on the computer 1200 or in a computer-readable storage medium near the computer 1200. A recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the Internet can also be used as the computer-readable storage medium, whereby the programs are provided to the computer 1200 via the network.
It should be noted that the order of execution of operations, procedures, steps, stages, and the like in the apparatuses, systems, programs, and methods shown in the claims, the specification, and the drawings can be realized in any order, unless the order is explicitly indicated by "before", "prior to", or the like, and unless the output of a previous process is used in a subsequent process. Even if an operation flow in the claims, the specification, or the drawings is described using "first", "next", or the like for convenience, this does not mean that the flow must be carried out in that order.
10 UAV
20 UAV body
30 UAV control unit
32 Memory
34 Communication interface
40 Propulsion unit
41 GPS receiver
42 Inertial measurement unit
43 Magnetic compass
44 Barometric altimeter
45 Temperature sensor
50 Gimbal
60 Imaging device
100 Imaging device
102 Imaging unit
110 Imaging control unit
112 Deriving unit
114 Focus control unit
120 Image sensor
130 Memory
200 Lens unit
210 Lens
212 Lens moving mechanism
220 Lens control unit
300 Remote operation device
1200 Computer
1210 Host controller
1212 CPU
1214 RAM
1220 Input/output controller
1222 Communication interface
1230 ROM

Claims (12)

1.  An imaging control device comprising:
     a deriving unit that derives a spatial frequency band of a captured image; and
     a control unit that, based on the spatial frequency band of the image derived by the deriving unit, controls a positional relationship between an imaging surface and a lens by selecting one of, or combining both of, first AF processing, which executes autofocus processing based on blur amounts of a plurality of images captured in states where the positional relationship between the imaging surface and the lens differs, and second AF processing, which executes autofocus processing by a phase difference method.
2.  The imaging control device according to claim 1, wherein the control unit controls the positional relationship between the imaging surface and the lens by selecting one of, or combining both of, the first AF processing and the second AF processing based on the spatial frequency band of the image derived by the deriving unit and a spatial frequency band predetermined for the second AF processing.
3.  The imaging control device according to claim 2, wherein the control unit controls the positional relationship between the imaging surface and the lens by selecting one of, or combining both of, the first AF processing and the second AF processing based on a proportion of the spatial frequency band of the image derived by the deriving unit that is included in the predetermined spatial frequency band.
4.  The imaging control device according to claim 3, wherein the control unit controls the positional relationship between the imaging surface and the lens by selecting the second AF processing when the proportion is equal to or greater than a threshold, and selecting the first AF processing when the proportion is smaller than the threshold.
5.  The imaging control device according to claim 3, wherein the control unit controls the positional relationship between the imaging surface and the lens by selecting the second AF processing when the proportion is equal to or greater than a first threshold, combining the first AF processing and the second AF processing when the proportion is smaller than the first threshold and equal to or greater than a second threshold, and selecting the first AF processing when the proportion is smaller than the second threshold.
6.  The imaging control device according to claim 1, wherein the control unit controls the positional relationship between the imaging surface and the lens by comparing the spatial frequency band of the image derived by the deriving unit with a predetermined spatial frequency band reproducible by a plurality of pixels that generate phase-difference detection signals, and thereby selecting one of, or combining both of, the first AF processing and the second AF processing.
7.  The imaging control device according to claim 6, wherein the second AF processing is executed based on the phase-difference detection signals output from an image sensor in which the plurality of pixels and a plurality of other pixels that generate color-component signals are arranged in a predetermined array pattern.
8.  An imaging device comprising:
     the imaging control device according to any one of claims 1 to 7;
     an image sensor having the imaging surface; and
     the lens.
9.  An imaging system comprising:
     the imaging device according to claim 8; and
     a support mechanism that supports the imaging device.
10.  A movable body that moves with the imaging system according to claim 9 mounted thereon.
11.  An imaging control method comprising:
     deriving a spatial frequency band of a captured image; and
     controlling a positional relationship between an imaging surface and a lens, based on the derived spatial frequency band of the image, by selecting one of, or combining both of, first AF processing, which executes autofocus processing based on blur amounts of a plurality of images captured in states where the positional relationship between the imaging surface and the lens differs, and second AF processing, which executes autofocus processing by a phase difference method.
12.  A program for causing a computer to execute:
     deriving a spatial frequency band of a captured image; and
     controlling a positional relationship between an imaging surface and a lens, based on the derived spatial frequency band of the image, by selecting one of, or combining both of, first AF processing, which executes autofocus processing based on blur amounts of a plurality of images captured in states where the positional relationship between the imaging surface and the lens differs, and second AF processing, which executes autofocus processing by a phase difference method.
PCT/JP2017/014560 2017-04-07 2017-04-07 Imaging control device, imaging device, imaging system, mobile body, imaging control method and program WO2018185940A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2017559465A JP6503607B2 (en) 2017-04-07 2017-04-07 Imaging control apparatus, imaging apparatus, imaging system, moving object, imaging control method, and program
PCT/JP2017/014560 WO2018185940A1 (en) 2017-04-07 2017-04-07 Imaging control device, imaging device, imaging system, mobile body, imaging control method and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/014560 WO2018185940A1 (en) 2017-04-07 2017-04-07 Imaging control device, imaging device, imaging system, mobile body, imaging control method and program

Publications (1)

Publication Number Publication Date
WO2018185940A1

Family

ID=63712736

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/014560 WO2018185940A1 (en) 2017-04-07 2017-04-07 Imaging control device, imaging device, imaging system, mobile body, imaging control method and program

Country Status (2)

Country Link
JP (1) JP6503607B2 (en)
WO (1) WO2018185940A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021193412A 2020-06-08 2021-12-23 SZ DJI Technology Co., Ltd. Device, imaging device, imaging system, and mobile object

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021092846A1 * 2019-11-14 2021-05-20 SZ DJI Technology Co., Ltd. Zoom tracking method and system, lens, imaging apparatus and unmanned aerial vehicle

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009175279A (en) * 2008-01-22 2009-08-06 Olympus Imaging Corp Camera system
JP2014063142A (en) * 2012-08-31 2014-04-10 Canon Inc Distance detection device, imaging apparatus, program, recording medium, and distance detection method
WO2014083914A1 * 2012-11-29 2014-06-05 Fujifilm Corporation Image capture device and focus control method
JP6026695B1 * 2016-06-14 2016-11-16 SZ DJI Technology Co., Ltd. Control device, moving body, control method, and program


Also Published As

Publication number Publication date
JP6503607B2 (en) 2019-04-24
JPWO2018185940A1 (en) 2019-04-11

Similar Documents

Publication Publication Date Title
WO2018185939A1 (en) Imaging control device, imaging device, imaging system, mobile body, imaging control method and program
JP6496955B1 (en) Control device, system, control method, and program
US20210120171A1 (en) Determination device, movable body, determination method, and program
JP6878736B2 (en) Controls, mobiles, control methods, and programs
US20210014427A1 (en) Control device, imaging device, mobile object, control method and program
WO2018185940A1 (en) Imaging control device, imaging device, imaging system, mobile body, imaging control method and program
JP6790318B2 (en) Unmanned aerial vehicles, control methods, and programs
JP6587006B2 (en) Moving body detection device, control device, moving body, moving body detection method, and program
JP6543875B2 (en) Control device, imaging device, flying object, control method, program
JP6565072B2 (en) Control device, lens device, flying object, control method, and program
JP2020085918A (en) Determination device, imaging device, imaging system, moving body, determination method and program
JP6641574B1 (en) Determination device, moving object, determination method, and program
JP6696092B2 (en) Control device, moving body, control method, and program
WO2018123013A1 (en) Controller, mobile entity, control method, and program
JP2019083390A (en) Control device, imaging device, mobile body, control method, and program
JP6569157B1 (en) Control device, imaging device, moving object, control method, and program
WO2018163300A1 (en) Control device, imaging device, imaging system, moving body, control method, and program
JP2019205047A (en) Controller, imaging apparatus, mobile body, control method and program
JP6413170B1 (en) Determination apparatus, imaging apparatus, imaging system, moving object, determination method, and program
JP2019169810A (en) Image processing apparatus, imaging apparatus, mobile object, image processing method, and program
JP6818987B1 (en) Image processing equipment, imaging equipment, moving objects, image processing methods, and programs
JP2020016703A (en) Control device, moving body, control method, and program
JP6798072B2 (en) Controls, mobiles, control methods, and programs
WO2021249245A1 (en) Device, camera device, camera system, and movable member
JP6459012B1 (en) Control device, imaging device, flying object, control method, and program

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2017559465

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17904898

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17904898

Country of ref document: EP

Kind code of ref document: A1