WO2021013143A1 - Apparatus, photographic apparatus, movable body, method, and program - Google Patents

Apparatus, photographic apparatus, movable body, method, and program Download PDF

Info

Publication number
WO2021013143A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
positional relationship
amount
blur
value
Prior art date
Application number
PCT/CN2020/103227
Other languages
French (fr)
Chinese (zh)
Inventor
高宫诚
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to CN202080003348.XA priority Critical patent/CN112292712A/en
Publication of WO2021013143A1 publication Critical patent/WO2021013143A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G06T 7/55 Depth or shape recovery from multiple images
    • G06T 7/571 Depth or shape recovery from multiple images from focus
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C 3/02 Details
    • G01C 3/06 Use of electric means to obtain final indication
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/28 Systems for automatic generation of focusing signals
    • G02B 7/36 Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B 13/32 Means for focusing
    • G03B 13/34 Power focusing
    • G03B 13/36 Autofocus systems
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 15/00 Special procedures for taking photographs; Apparatus therefor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/207 Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N 13/221 Image signal generators using stereoscopic image cameras using a single 2D image sensor using the relative movement between cameras and objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/296 Synchronisation thereof; Control thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/222 Studio circuitry; Studio devices; Studio equipment
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 2013/0074 Stereoscopic image analysis
    • H04N 2013/0081 Depth or disparity estimation from stereoscopic image signals

Definitions

  • the present invention relates to a device, a camera device, a mobile body, a method, and a program.
  • depending on the subject, the detection accuracy of the subject distance may decrease.
  • the device according to one aspect of the present invention includes a circuit configured to acquire the blur amount of a first image included in a first captured image captured in a state where the imaging surface of the imaging device and the focus lens are in a first positional relationship, and the blur amount of a second image included in a second captured image captured in a state where the imaging surface and the focus lens are in a second positional relationship.
  • the circuit is configured to obtain a first predicted value indicating the positional relationship between the imaging surface and the focus lens when focusing on the subject based on the blur amount of the first image and the blur amount of the second image.
  • the circuit is configured to acquire the blur amount of the third image included in the third captured image captured in the state where the imaging surface and the focus lens are in the third positional relationship.
  • the circuit is configured to obtain a second predicted value representing the positional relationship between the imaging surface and the focus lens when focusing on the subject based on the blur amount of the second image and the blur amount of the third image.
  • the circuit is configured to determine a target value representing the positional relationship between the imaging surface and the focus lens when focusing on the subject based on the difference between the first predicted value and the second predicted value.
  • the first positional relationship, the second positional relationship, and the third positional relationship may respectively indicate the position of the focus lens relative to the imaging surface.
  • the first predicted value may indicate the first target position of the focus lens relative to the imaging surface when focusing on the subject.
  • the second predicted value may indicate the second target position of the focus lens relative to the imaging surface when focusing on the subject.
  • the circuit may be configured to use the difference between the first target position and the second target position to determine a target value representing the target position of the focus lens relative to the imaging surface when focusing on the subject.
  • the position of the focus lens determined according to at least one of the first positional relationship and the second positional relationship may be set as a first reference position, and the position of the focus lens determined according to at least one of the second positional relationship and the third positional relationship may be set as a second reference position.
  • the circuit may be configured to determine the target value, which represents the target position of the focus lens relative to the imaging surface when focusing on the subject, by using the difference between the first target position and the second target position relative to the difference between the first reference position and the second reference position.
  • the circuit may be configured to calculate the ratio of the difference between the first target position and the second target position relative to the difference between the first reference position and the second reference position as the change information.
  • the circuit may be configured to calculate the correction value of the second target position by dividing the second target position by the ratio.
  • the circuit may be configured to calculate the position indicated by the correction value relative to the second reference position as the target value.
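The three bullets above describe the ratio-based correction only in words. The following Python sketch shows one possible reading of that calculation, interpreting each predicted value as a movement amount from its reference position; the function name, the variable names, and that interpretation are assumptions for illustration, not taken verbatim from the patent.

    def corrected_target_value(ref1, predicted1, ref2, predicted2):
        """Sketch of the ratio-based correction described above (assumed reading).

        ref1, ref2: first and second reference positions of the focus lens
        predicted1, predicted2: predicted movement amounts from ref1 and ref2
        to the in-focus position (the first and second predicted values)
        """
        # Change information: ratio of the difference between the two predicted
        # values to the difference between the two reference positions.
        ratio = (predicted1 - predicted2) / (ref1 - ref2)

        # Correction value of the second predicted value, obtained by dividing
        # it by the magnitude of the ratio (sign convention is an assumption).
        correction = predicted2 / abs(ratio)

        # Target value: the position indicated by the correction value relative
        # to the second reference position.
        return ref2 + correction

    # Hypothetical lens positions in pulses, for illustration only.
    print(corrected_target_value(ref1=1300, predicted1=518, ref2=1580, predicted2=311))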
  • the circuit may be configured to perform focus control of the imaging device according to the target value.
  • the circuit may be configured to obtain the blur amount of the first image and the blur amount of the second image.
  • the circuit may be configured to obtain the first predicted value according to the blur amount of the first image and the blur amount of the second image.
  • the circuit may be configured to perform focus control of the imaging device according to the first predicted value when the reliability value of the first predicted value is greater than or equal to a preset first value.
  • the circuit may be configured to, when the reliability value of the first predicted value is lower than the preset first value, obtain the blur amount of the third image, obtain the second predicted value according to the blur amount of the second image and the blur amount of the third image, and perform focus control of the imaging device according to the target value.
  • the circuit may be configured to, when the reliability value of the first predicted value is lower than a second value that is lower than the first value, neither acquire the third image nor perform focus control of the imaging device based on the first predicted value.
  • the circuit may be configured to, when the reliability value of the first predicted value is lower than the first value and greater than or equal to the second value, obtain the blur amount of the third image, obtain the second predicted value according to the blur amount of the second image and the blur amount of the third image, and perform focus control of the imaging device according to the target value.
  • the device according to another aspect of the present invention includes a circuit configured to acquire the blur amount of a first image included in a first captured image captured in a state where the imaging surface of the imaging device and the focus lens are in a first positional relationship, and the blur amount of a second image included in a second captured image captured in a state where the imaging surface and the focus lens are in a second positional relationship.
  • the circuit may be configured to obtain a first predicted value representing the positional relationship between the imaging surface and the focus lens when focusing on the subject based on the blur amount of the first image and the blur amount of the second image.
  • the circuit may be configured to acquire the blur amount of a third image included in a third captured image taken with the imaging surface and the focus lens in a third positional relationship, and the blur amount of a fourth image included in a fourth captured image taken with the imaging surface and the focus lens in a fourth positional relationship.
  • the circuit may be configured to obtain a second predicted value representing the positional relationship between the imaging surface and the focus lens when focusing on the subject based on the blur amount of the third image and the blur amount of the fourth image.
  • the circuit may be configured to determine a target value representing the positional relationship between the imaging surface and the focus lens when focusing on the subject based on the difference between the first predicted value and the second predicted value.
  • the imaging device may include the above-mentioned device and an image sensor having an imaging surface.
  • An imaging system may include the above-mentioned imaging device and a support mechanism that supports the imaging device in a manner that can control the posture of the imaging device.
  • the mobile body according to an aspect of the present invention can be moved by mounting the aforementioned imaging device.
  • the method involved in one aspect of the present invention includes: acquiring the blur amount of a first image contained in a first captured image captured in a state where the imaging surface of the imaging device and the focus lens are in a first positional relationship, and the blur amount of a second image contained in a second captured image captured in a state where the imaging surface and the focus lens are in a second positional relationship.
  • the method may include: obtaining a first predicted value based on the blur amount of the first image and the blur amount of the second image, the first predicted value representing the positional relationship between the imaging surface and the focusing lens when focusing on the subject.
  • the method may include: acquiring the blur amount of a third image included in a third captured image captured in a state where the imaging surface of the imaging device and the focus lens are in a third positional relationship.
  • the method may include: obtaining a second predicted value based on the blur amount of the second image and the blur amount of the third image, the second predicted value representing the positional relationship between the imaging plane and the focusing lens when focusing on the subject.
  • the method may include: determining a target value representing the positional relationship between the imaging surface and the focus lens when focusing on the subject based on the difference between the first predicted value and the second predicted value.
  • the program related to one aspect of the present invention may be a program for causing a computer to execute the above-mentioned method.
  • FIG. 1 is a diagram showing an example of an external perspective view of the imaging device 100.
  • FIG. 2 is a diagram showing functional blocks of the imaging device 100.
  • FIG. 3 shows an example of a curve representing the relationship between the amount of image blur (Cost) and the position of the focus lens.
  • Fig. 4 is a flowchart showing an example of a distance calculation process in the BDAF method.
  • FIG. 5 is a diagram illustrating the calculation process of the subject distance.
  • FIG. 6 shows the relationship between the DFD calculation value and the defocus amount for different subjects.
  • FIG. 7 shows the principle of focus control based on two DFD calculations performed by the imaging control unit 110.
  • FIG. 8 is a flowchart showing the processing procedure of the focus control executed by the imaging control unit 110.
  • FIG. 9 shows an example of an unmanned aerial vehicle (UAV).
  • FIG. 10 shows an example of a computer 1200 that embodies aspects of the present invention in whole or in part.
  • the blocks can represent (1) the stages of the process of performing operations or (2) the "parts" of the device that perform operations. Specific stages and “parts” can be implemented by programmable circuits and/or processors.
  • dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits.
  • the programmable circuit may include a reconfigurable hardware circuit.
  • reconfigurable hardware circuits may include logical AND, logical OR, logical exclusive OR, logical NAND, logical NOR, and other logical operations, as well as memory elements such as flip-flops, registers, field programmable gate arrays (FPGA), and programmable logic arrays (PLA).
  • the computer-readable medium may include any tangible device that can store instructions to be executed by a suitable device.
  • the computer-readable medium having instructions stored thereon includes a product including instructions that can be executed to create means for performing operations specified by the flowchart or block diagram.
  • electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like may be included.
  • the computer-readable medium may include a floppy (registered trademark) disk, a diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an electrically erasable programmable read-only memory (EEPROM), a static random access memory (SRAM), a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a Blu-ray (registered trademark) disc, a memory stick, an integrated circuit card, and the like.
  • the computer-readable instructions may include any one of source code or object code described in any combination of one or more programming languages.
  • the computer-readable instructions may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in an object-oriented programming language such as Smalltalk (registered trademark), JAVA (registered trademark), or C++, or in a conventional procedural programming language such as the "C" programming language or a similar programming language.
  • the computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing device, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet.
  • the processor or programmable circuit can execute computer-readable instructions to create means for performing the operations specified in the flowchart or block diagram. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, etc.
  • FIG. 1 is a diagram showing an example of an external perspective view of an imaging device 100 according to this embodiment.
  • FIG. 2 is a diagram showing functional blocks of the imaging device 100 according to this embodiment.
  • the imaging device 100 includes an imaging unit 102 and a lens unit 200.
  • the imaging unit 102 includes an image sensor 120, an imaging control unit 110, a memory 130, an instruction unit 162, and a display unit 160.
  • the image sensor 120 may be composed of CCD or CMOS.
  • the image sensor 120 receives light through the lens 210 included in the lens unit 200.
  • the image sensor 120 outputs image data of the optical image captured by the lens 210 to the imaging control unit 110.
  • the imaging control unit 110 may be constituted by a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
  • the memory 130 may be a computer-readable recording medium, and may also include at least one of flash memory such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
  • the imaging control unit 110 corresponds to an electric circuit.
  • the memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like.
  • the memory 130 may be provided inside the housing of the imaging device 100.
  • the memory 130 may be configured to be detachable from the housing of the imaging device 100.
  • the instruction unit 162 is a user interface that accepts instructions to the imaging device 100 from the user.
  • the display unit 160 displays images captured by the image sensor 120, images processed by the imaging control unit 110, various setting information of the imaging device 100, and the like.
  • the display unit 160 may be composed of a touch panel.
  • the imaging control unit 110 controls the lens unit 200 and the image sensor 120. For example, the imaging control unit 110 controls the focal position and focal length of the lens 210.
  • the imaging control unit 110 outputs a control command to the lens control unit 220 included in the lens unit 200 based on the information indicating the instruction from the user, thereby controlling the lens unit 200.
  • the lens unit 200 has one or more lenses 210, a lens driving unit 212, a lens control unit 220, and a memory 222.
  • one or more lenses 210 are collectively referred to as “lens 210”.
  • the lens 210 may include a focus lens and a zoom lens. At least a part or all of the lenses included in the lens 210 are arranged to be movable along the optical axis of the lens 210.
  • the lens unit 200 may be an interchangeable lens detachably provided in the imaging unit 102.
  • the lens driving unit 212 moves at least a part or all of the lens 210 along the optical axis of the lens 210.
  • the lens control unit 220 drives the lens drive unit 212 in accordance with a lens control command from the imaging unit 102 to move the entire lens 210, or the zoom lens or the focus lens included in the lens 210, along the optical axis, thereby performing at least one of the zoom operation and the focus operation.
  • the lens control commands are, for example, zoom control commands and focus control commands.
  • the lens driving part 212 may include a voice coil motor (VCM) that moves at least a part or all of the plurality of lenses 210 in the optical axis direction.
  • the lens driving part 212 may include a motor such as a DC motor, a coreless motor, or an ultrasonic motor.
  • the lens driving unit 212 can transmit the power from the motor to at least a part or all of the plurality of lenses 210 via mechanism components such as cam rings and guide shafts, so as to move at least a part or all of the lenses 210 along the optical axis.
  • the memory 222 stores control values for the focus lens and zoom lens that are moved by the lens drive unit 212.
  • the memory 222 may include at least one of flash memory such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
  • the imaging control unit 110 outputs a control command to the image sensor 120 based on the information indicating the user's instruction acquired by the instruction unit 162 and the like to perform control including imaging operation control on the image sensor 120.
  • the imaging control unit 110 acquires an image captured by the image sensor 120.
  • the imaging control unit 110 performs image processing on the image acquired from the image sensor 120 and stores it in the memory 130.
  • the imaging control unit 110 acquires the blur amount of the first image contained in the first captured image captured in the state where the imaging surface of the imaging device 100 and the focus lens are in the first positional relationship, and the blur amount of the second image contained in the second captured image captured in the state where the imaging surface and the focus lens are in the second positional relationship.
  • the imaging control unit 110 acquires a first predicted value indicating the positional relationship between the imaging surface and the focus lens when focusing on the subject based on the blur amount of the first image and the blur amount of the second image.
  • the imaging control unit 110 acquires the amount of blurring of the third image included in the third captured image captured with the imaging surface and the focus lens in the third positional relationship.
  • the imaging control unit 110 obtains a second predicted value indicating the positional relationship between the imaging surface and the focus lens when focusing on the subject based on the blur amount of the second image and the blur amount of the third image.
  • the imaging control unit 110 determines a target value indicating the positional relationship between the imaging surface and the focus lens when focusing on the subject based on the difference between the first predicted value and the second predicted value.
  • the first positional relationship, the second positional relationship, and the third positional relationship may respectively indicate the position of the focus lens relative to the imaging surface.
  • the first predicted value may indicate the first target position of the focus lens relative to the imaging surface when focusing on the subject.
  • the second predicted value may indicate the second target position of the focus lens relative to the imaging surface when focusing on the subject.
  • the imaging control unit 110 may use the difference between the first target position and the second target position to determine a target value indicating the target position of the focus lens relative to the imaging surface when focusing on the subject.
  • the imaging control unit 110 may set the position of the focus lens determined based on at least one of the first positional relationship and the second positional relationship as the first reference position, and set the position of the focus lens determined based on at least one of the second positional relationship and the third positional relationship as the second reference position.
  • the imaging control unit 110 may use the difference between the first target position and the second target position relative to the difference between the first reference position and the second reference position to determine the target value indicating the target position of the focus lens relative to the imaging surface when focusing on the subject.
  • the imaging control unit 110 may calculate the ratio of the difference between the first target position and the second target position to the difference between the first reference position and the second reference position as the change information.
  • the imaging control unit 110 may calculate the correction value of the second target position by dividing the second target position by the ratio.
  • the imaging control unit 110 may calculate the position indicated by the correction value relative to the second reference position as the target value.
  • the imaging control unit 110 may perform focus control of the imaging device 100 based on the target value.
  • the imaging control unit 110 may acquire the blur amount of the first image and the blur amount of the second image.
  • the imaging control unit 110 may obtain the first predicted value based on the blur amount of the first image and the blur amount of the second image.
  • the imaging control unit 110 may perform focus control of the imaging device 100 according to the first predicted value when the reliability value of the first predicted value is greater than or equal to a preset first value.
  • when the reliability value of the first predicted value is lower than the preset first value, the imaging control unit 110 obtains the blur amount of the third image, obtains the second predicted value according to the blur amount of the second image and the blur amount of the third image, and executes the focus control of the imaging device 100 according to the target value.
  • when the reliability value of the first predicted value is lower than a second value that is lower than the first value, the imaging control unit 110 does not acquire the third image and does not perform focus control of the imaging device 100 based on the first predicted value.
  • when the reliability value of the first predicted value is lower than the first value and greater than or equal to the second value, the imaging control unit 110 obtains the blur amount of the third image, obtains the second predicted value based on the blur amount of the second image and the blur amount of the third image, and performs the focus control of the imaging device 100 according to the target value.
  • the imaging control unit 110 can acquire the blur amount of the first image contained in the first captured image captured in the state where the imaging surface of the imaging device 100 and the focus lens are in the first positional relationship, and the blur amount of the second image contained in the second captured image captured in the state where the imaging surface and the focus lens are in the second positional relationship.
  • the imaging control unit 110 may obtain the first predicted value indicating the positional relationship between the imaging surface and the focus lens when focusing on the subject based on the blur amount of the first image and the blur amount of the second image.
  • the imaging control unit 110 can acquire the blur amount of the third image included in the third captured image taken with the imaging surface and the focus lens in the third positional relationship, and the blur amount of the fourth image included in the fourth captured image taken with the imaging surface and the focus lens in the fourth positional relationship.
  • the imaging control unit 110 may obtain a second predicted value indicating the positional relationship between the imaging surface and the focus lens when focusing on the subject based on the blur amount of the third image and the blur amount of the fourth image.
  • the imaging control unit 110 may determine a target value indicating the positional relationship between the imaging surface and the focus lens when focusing on the subject based on the difference between the first predicted value and the second predicted value.
  • the AF method executed by the imaging device 100 will be described.
  • the imaging device 100 determines the distance from the lens 210 to the subject (subject distance).
  • as a method of determining the subject distance, there is a method of determining it based on the blur amounts of a plurality of images captured in states where the positional relationship between the focus lens and the light-receiving surface of the image sensor 120 differs, which is achieved by moving the focus lens.
  • the AF using this method is called the blur detection auto focus (Bokeh Detection Auto Focus: BDAF) method.
  • the BDAF method uses a DFD (Depth From Defocus) calculation, which estimates the defocus amount from the blur amounts of images captured at different focus lens positions.
  • the amount of blurring (Cost) of the image can be expressed by the following formula (1) using a Gaussian function.
  • in formula (1), x represents the pixel position in the horizontal direction, and σ represents the standard deviation.
  • FIG. 3 shows an example of a curve representing the relationship between the amount of image blur (Cost) and the position of the focus lens.
  • C1 is the amount of image blur obtained when the focus lens is located at x1.
  • C2 is the amount of image blur obtained when the focus lens is located at x2.
  • Fig. 4 is a flowchart showing an example of a distance calculation process in the BDAF method.
  • the imaging control unit 110 takes a first image and stores it in the memory 130 when the lens 210 and the imaging surface of the image sensor 120 are in a first positional relationship.
  • the imaging control unit 110 moves the lens 210 in the optical axis direction to place the lens 210 and the imaging surface in a second positional relationship, causes the imaging device 100 to capture a second image, and stores it in the memory 130 (S201).
  • the imaging control unit 110 changes the positional relationship between the lens 210 and the imaging surface from the first positional relationship to the second positional relationship by moving the focus lens along the optical axis direction.
  • the amount of movement of the lens may be, for example, about 10 ⁇ m.
  • the imaging control unit 110 divides the first image into a plurality of regions (S202).
  • the imaging control unit 110 may calculate a feature amount for each pixel in the first image, and divide the first image into a plurality of regions by using a group of pixels having similar feature amounts as one region.
  • the imaging control unit 110 may divide the pixel group set as the range of the AF processing frame in the first image into a plurality of regions.
  • the imaging control unit 110 divides the second image into a plurality of regions corresponding to the plurality of regions of the first image.
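As a rough illustration of the region division described above, the sketch below splits the first and second images into the same fixed grid of regions so that corresponding regions can be compared. The fixed grid, the image sizes, and the function name are assumptions; the patent also allows grouping pixels by feature similarity or restricting the division to the AF processing frame.

    import numpy as np

    def split_into_regions(image, rows=4, cols=4):
        # Split a 2-D image array into a rows x cols grid of regions.
        h, w = image.shape
        regions = []
        for r in range(rows):
            for c in range(cols):
                regions.append(image[r * h // rows:(r + 1) * h // rows,
                                     c * w // cols:(c + 1) * w // cols])
        return regions

    # Dividing the second image with the same grid gives regions that
    # correspond one-to-one to the regions of the first image.
    first_image = np.random.rand(480, 640)
    second_image = np.random.rand(480, 640)
    regions_1 = split_into_regions(first_image)
    regions_2 = split_into_regions(second_image)
    assert len(regions_1) == len(regions_2)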
  • the imaging control unit 110 calculates, for each of the plurality of regions, the distance to the subject corresponding to the object contained in that region, based on the blur amount of each of the plurality of regions of the first image and the blur amount of each of the plurality of regions of the second image (S203).
  • the method of changing the positional relationship between the lens 210 and the imaging surface of the image sensor 120 is not limited to the method of moving the focus lens included in the lens 210.
  • the imaging control unit 110 may move the entire lens 210 in the optical axis direction.
  • the imaging control unit 110 can move the imaging surface of the image sensor 120 in the optical axis direction.
  • the imaging control unit 110 can move at least a part of the lens included in the lens 210 and the imaging surface of the image sensor 120 in the optical axis direction.
  • the imaging control unit 110 may adopt any method as long as the relative positional relationship between the focal point of the lens 210 and the imaging surface of the image sensor 120 is optically changed.
  • the calculation process of the subject distance will be further explained with reference to FIG. 5.
  • let the distance from the principal point of the lens L to the subject 510 (object plane) be A, the distance from the principal point of the lens L to the position where the light beam from the subject 510 forms an image (image plane) be B, and the focal length of the lens L be F.
  • according to the lens formula, the relationship between the distance A, the distance B, and the focal length F can be expressed by the following formula (2): 1/A + 1/B = 1/F.
  • the focal length F is determined by the position of each lens included in the lens L. Therefore, if it is possible to determine the distance B imaged according to the light beam of the subject 510, the equation (2) can be used to determine the distance A from the principal point of the lens L to the subject 510.
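Since formula (2) is the standard thin-lens equation, once the distance B has been determined from the blur amounts, the distance A follows directly. A minimal sketch of that last step, with made-up numbers:

    def subject_distance(focal_length_f, image_distance_b):
        # Solve 1/A + 1/B = 1/F (formula (2)) for the subject distance A.
        return 1.0 / (1.0 / focal_length_f - 1.0 / image_distance_b)

    # Hypothetical values in millimetres, for illustration only.
    print(subject_distance(focal_length_f=50.0, image_distance_b=52.0))  # approximately 1300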
  • when the imaging surface of the image sensor is moved toward the lens L side, the positional relationship between the lens L and the imaging surface changes, and the image of the subject 510 projected on the imaging surface becomes blurred.
  • the imaged position of the subject 510 can be calculated from the blur sizes (circles of confusion 512 and 514) of the images of the subject 510 projected on the imaging surface; the distance B can thereby be determined, and the distance A can be further determined. That is, since the size of the blur (blur amount) changes in proportion to the distance between the imaging surface and the imaging position, the imaging position can be determined from the difference in the blur amounts.
  • each of the image I1 at a position at distance D1 from the imaging surface and the image I2 at a position at distance D2 from the imaging surface is blurred.
  • for the image I1, assuming that the point spread function (Point Spread Function) is PSF1 and the subject image is Id1, the image I1 can be expressed by the following equation (3) as a convolution: I1 = PSF1 * Id1 (where * denotes convolution).
  • the value C shown in equation (4) corresponds to the amount of change between the blur of the image at a position at distance D1 from the principal point of the lens L and the blur of the image at a position at distance D2 from the principal point of the lens L; that is, the value C corresponds to the difference between the blur amounts of the image at a position at distance D1 from the principal point of the lens L and the image at a position at distance D2 from the principal point of the lens L.
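The sketch below illustrates the idea behind equations (3) and (4): each observed image can be modelled as the subject image convolved with a point spread function, and the difference between the blur amounts of the two images (the value C) carries the depth information. Gaussian PSFs and a gradient-based blur measure are used here as stand-ins; the exact forms of equations (3) and (4) are not reproduced from the patent.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(0)
    subject = rng.random((128, 128))          # stand-in for the subject image Id

    # Equation (3): each observed image is the subject image convolved with a
    # point spread function; Gaussian PSFs are assumed here for illustration.
    i1 = gaussian_filter(subject, sigma=1.0)  # image at distance D1 (less blurred)
    i2 = gaussian_filter(subject, sigma=2.5)  # image at distance D2 (more blurred)

    def blur_amount(img):
        # Stand-in blur measure: the blurrier the image, the lower its gradient
        # energy, so use the reciprocal of the total gradient variance.
        gy, gx = np.gradient(img)
        return 1.0 / (np.var(gx) + np.var(gy))

    # The value C of equation (4) corresponds to the difference between the
    # blur amounts of the two images.
    c = blur_amount(i2) - blur_amount(i1)
    print(c)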
  • Fig. 6 shows the relationship between the DFD calculation value and the defocus amount for different subjects.
  • the horizontal axis of the graph 610, the graph 620, and the graph 630 is the defocus amount, and the vertical axis is the DFD calculation value.
  • the unit of the defocus amount and of the DFD calculation value is Fδ.
  • the circular marks of the graph 610, the graph 620, and the graph 630 respectively indicate a DFD calculation value obtained by the DFD calculation of the two images obtained by changing the position of the focus lens.
  • the solid lines of the graph 610, the graph 620, and the graph 630 represent the ideal value of the DFD calculation.
  • the graph 610 shows the relationship between the DFD calculation value and the actual defocus amount for the image with the specific object A as the subject.
  • the graph 620 shows the relationship between the DFD calculation value and the defocus amount of an image with a specific object B different from the object A as the subject.
  • the graph 630 shows the relationship between the DFD calculation value and the defocus amount for an image with a specific object C different from the object A and the object B as the subject.
  • the data point when it is judged to be in the most focused state from the actually captured image is taken as the origin of the graphs of graphs 610, 620, and 630.
  • the graph 650 shows the deviation of the DFD calculation value from its ideal value for the image of the object A.
  • the graph 660 shows the deviation of the DFD calculation value from its ideal value for the image of the object B.
  • the graph 670 shows the deviation of the DFD calculation value from its ideal value for the image of the object C.
  • object A is the object with the highest contrast, and object C is the object with the lowest contrast.
  • the object C is, for example, a cloud. It can be seen from graphs 610, 620, 630, 650, 660, and 670 that, for the image of object A, the error of the DFD calculation value is extremely small, and the accuracy of the DFD calculation value is high even when the defocus amount is large. On the other hand, for the image of the object C, the error of the DFD calculation value is relatively large; in particular, the greater the defocus amount, the lower the accuracy of the DFD calculation value.
  • in some cases, the amount of change in the DFD calculation value with respect to the amount of change in the defocus amount is smaller than that of the ideal DFD calculation, and the DFD calculation value deviates from the ideal value.
  • FIG. 7 shows the principle of focus control based on two DFD calculations performed by the imaging control unit 110.
  • the vertical axis of the graph 710 is the DFD calculation value, and the horizontal axis is the defocus amount.
  • the graph 710 is a graph showing two data points 711 and 712 used for focus control in the graph 630 of FIG. 6.
  • the graph 720 expresses the data of the graph 710 in units of the number of lens pulses corresponding to the defocus amount.
  • the number of lens pulses indicates the number of pulses provided to the stepping motor driving the lens 210 for focusing control.
  • the number of lens pulses indicates the position of the lens 210 relative to a predetermined reference point in the optical axis direction.
  • the data point 711 represents the DFD operation value and the number of lens pulses obtained by the first DFD operation.
  • the data point 712 represents the DFD operation value and the number of lens pulses obtained by the second DFD operation.
  • for each DFD calculation, the number of lens pulses indicates the reference position of the lens at which the images used for that calculation were taken.
  • the DFD calculation value obtained by the DFD calculation indicates the amount of defocus at the reference position.
  • the DFD calculation value represents the target movement amount of the lens 210 from its reference position to the focused state.
  • the imaging control section 110 performs the first DFD calculation and the second DFD calculation by making the lens reference positions different from each other.
  • the imaging control unit 110 calculates the amount of change in the DFD calculation value with respect to the amount of change in the lens reference position. This value represents the slope of the straight line 722 connecting the data point 711 and the data point 712. By dividing the second DFD calculation value by the slope value of the straight line 722, the second DFD calculation value is corrected and the lens 210 is moved accordingly. Thus, the imaging control unit 110 can bring the lens 210 closer to the in-focus state (a lens pulse count of 2000) than when the second DFD calculation value is not corrected.
  • the imaging control unit 110 captures three images that are image P1, image P2, and image P3 in a state where the lens positions are different from each other, and performs DFD calculation twice using the three images, image P1, image P2, and image P3. Specifically, the imaging control unit 110 performs a first DFD calculation based on the images P1 and P2, and performs a second DFD calculation based on the images P2 and P3.
  • the lens position when the image P1 is captured is LensP1, the lens position when the image P2 is captured is LensP2, and the lens position when the image P3 is captured is LensP3. In this example, LensP1 = 1160, LensP2 = 1440, and LensP3 = 1720 (in lens pulses).
  • the data point 711 of the graph 720 of FIG. 7 represents coordinates (DFDLensP1, DFD1).
  • the data point 712 of the graph 720 of FIG. 7 represents (DFDLensP2, DFD2).
  • the target position (Peak2) of the lens 210 based only on the second DFD calculation is calculated by the following formula.
  • since the lens position in the focused state is 2000, an error of -109 pulses has occurred. Therefore, even if the position of the lens 210 is moved to Peak2, the subject may remain insufficiently focused.
  • the imaging control unit 110 calculates the slope of the straight line 722 passing through the data point 711 and the data point 712 by the following formula.
  • the imaging control unit 110 uses the value of slope to correct DFD2 as follows to calculate the target position PeakCorrection of the lens 210.
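With the lens pulse values of this example and hypothetical DFD calculation values, the correction can be illustrated as follows. The DFD values (518 and 311), the reference positions 1300 and 1580, and the sign convention are assumptions chosen only so that Peak2 reproduces the -109 pulse error mentioned above; they are not taken from the patent.

    # Reference positions and DFD calculation values (hypothetical).
    dfd_lens_p1, dfd1 = 1300, 518   # first DFD calculation
    dfd_lens_p2, dfd2 = 1580, 311   # second DFD calculation

    # Target position based only on the second DFD calculation (Peak2):
    peak2 = dfd_lens_p2 + dfd2                       # 1891, i.e. -109 from 2000

    # Slope of the straight line 722 through the two data points (formula (5)):
    slope = (dfd2 - dfd1) / (dfd_lens_p2 - dfd_lens_p1)

    # Correct DFD2 by dividing it by the slope magnitude and add the result to
    # the second reference position (PeakCorrection):
    peak_correction = dfd_lens_p2 + dfd2 / abs(slope)

    print(peak2, round(peak_correction, 1))          # 1891 vs approximately 2000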
  • FIG. 8 is a flowchart showing the processing procedure of the focus control executed by the imaging control unit 110.
  • the imaging control unit 110 executes a focus control process 800 and a DFD process 850 in parallel, where the former is implemented by an algorithm for focus control, and the latter is implemented by an algorithm for DFD calculation.
  • the imaging control unit 110 causes the image sensor 120 to capture an image at the current lens position, and acquires image data of the first image (S802).
  • the imaging control unit 110 processes the image data of the first image into information necessary for DFD calculation (S852).
  • the imaging control unit 110 moves the position of the focus lens of the lens 210 by a preset movement amount (S804), causes the image sensor 120 to take an image, and acquires image data of the second image (S806).
  • the imaging control unit 110 processes the image data of the second image into information required for DFD calculation, performs the first DFD calculation based on the data of the first image and the second image, and calculates the current defocus amount as the DFD calculation value as well as the reliability value of the DFD calculation value (S852).
  • the imaging control unit 110 calculates the amount of movement of the focus lens based on the DFD calculation value obtained by the first DFD calculation (S854).
  • the imaging control unit 110 may calculate the reliability value of the DFD calculation value based on the blur amount of the subject in the image.
  • the imaging control unit 110 may calculate the reliability value of the DFD calculation value based on the amount of blur (Cost) of the image represented by the equation (1), for example.
  • the imaging control unit 110 may set the reliability value such that the smaller the amount of blur, the smaller the reliability value.
  • the imaging control unit 110 compares the reliability value obtained by the first DFD calculation with a first threshold value and with a second threshold value lower than the first threshold value.
  • specifically, if the reliability value obtained by the first DFD calculation is greater than or equal to the first threshold value, the imaging control unit 110 transfers the process to S812 and drives the lens to the lens target position indicated by the DFD calculation value. If the reliability value obtained by the first DFD calculation is lower than the preset first threshold value and greater than or equal to the second threshold value, the process proceeds to S808 to move the lens position.
  • the imaging control unit 110 may determine the amount of movement of the focus lens so that it does not exceed the focus position. In addition, the imaging control unit 110 processes the image data of the second image into information necessary for DFD calculation (S858).
  • when the reliability value obtained by the first DFD calculation is lower than the second threshold value, the focus control of the BDAF method is stopped.
  • when the focus control of the BDAF method is stopped, the imaging control unit 110 may perform focus control using a method other than BDAF, such as contrast AF. If the imaging device 100 is shooting a moving image when the BDAF focus control is stopped, the imaging control unit 110 may set the lens 210 to an infinity-focused state. If the imaging device 100 is about to capture a still image when the BDAF focus control is stopped, the imaging control unit 110 may notify the user of a focus error.
  • the imaging control unit 110 causes the image sensor 120 to take an image, and acquires image data of the third image (S810).
  • the imaging control unit 110 processes the image data of the third image into information required for DFD calculation, performs the second DFD calculation based on the data of the second image and the third image, and calculates the current defocus amount as the DFD calculation value as well as the reliability value of the DFD calculation value (S858).
  • based on the DFD calculation value obtained by the first DFD calculation in S852, the DFD calculation value obtained by the second DFD calculation in S858, and the lens positions at which the first image, the second image, and the third image were captured, the imaging control unit 110 calculates the slope according to formula (5), corrects the DFD calculation value using the slope, and calculates the movement amount of the focus lens (S860).
  • the imaging control unit 110 determines whether the reliability value calculated by the second DFD calculation is greater than or equal to the second threshold value (S862). When the reliability value calculated by the second DFD calculation is lower than the second threshold value, the imaging control unit 110 stops the focus control of the BDAF method. When the reliability value obtained by the second DFD operation is greater than or equal to the second threshold value, the imaging control section 110 moves the process to S812, and moves the focus lens according to the movement amount calculated by S860 (S812). In addition, the imaging control unit 110 processes the image data of the third image into information necessary for DFD calculation (S864).
  • after the lens is moved in S812, the imaging control unit 110 causes the image sensor 120 to capture an image and acquires image data of an image for focus checking (S814). In the DFD process 850, the imaging control unit 110 performs a DFD calculation based on the image data of the third image and the focus-check image, and calculates the current defocus amount obtained as the DFD calculation value and the reliability value of the DFD calculation (S864).
  • the imaging control unit 110 determines whether the subject is in focus based on the defocus amount represented by the DFD calculation value obtained by the DFD calculation in S864 and the reliability value of that DFD calculation (S866). When it is determined that the subject is in focus, the AF operation of the BDAF method is completed; when it is determined that the subject is not in focus, the process moves to S812, and the focus lens is moved to the target position indicated by the DFD calculation value obtained by the DFD calculation of S864. After that, the operations of moving the focus lens, capturing an image for DFD calculation, performing the DFD calculation, and determining focus are repeated until it is determined that the in-focus state has been reached.
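A schematic sketch of the threshold-based branching described in this flow is given below. The threshold values and the helper callables (capture_image, dfd_calculation, drive_lens, fallback_af) are placeholders introduced for illustration; they are not APIs defined by the patent.

    FIRST_THRESHOLD = 0.8    # hypothetical reliability thresholds
    SECOND_THRESHOLD = 0.4

    def bdaf_step(first_image, second_image, capture_image, dfd_calculation,
                  drive_lens, fallback_af):
        # Decision flow after the first DFD calculation (S852/S854).
        dfd1, reliability1, ref1 = dfd_calculation(first_image, second_image)

        if reliability1 >= FIRST_THRESHOLD:
            # Reliable enough: drive the lens directly to the predicted target.
            drive_lens(ref1 + dfd1)
        elif reliability1 >= SECOND_THRESHOLD:
            # Moderately reliable: capture a third image, run a second DFD
            # calculation, and correct the result using the slope of the two
            # DFD values (see the numeric sketch above).
            third_image = capture_image()
            dfd2, reliability2, ref2 = dfd_calculation(second_image, third_image)
            if reliability2 < SECOND_THRESHOLD:
                fallback_af()        # stop BDAF, e.g. fall back to contrast AF
                return
            slope = (dfd2 - dfd1) / (ref2 - ref1)
            drive_lens(ref2 + dfd2 / abs(slope))
        else:
            # Too unreliable: do not capture a third image or drive the lens;
            # stop the BDAF focus control (e.g. notify a focus error).
            fallback_af()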
  • the imaging control unit 110 calculates the amount of movement of the focus lens based on the difference between the DFD calculation value obtained by the first DFD calculation and the DFD calculation value obtained by the second DFD calculation. As a result, even when an object with low DFD arithmetic sensitivity is used as the subject, the AF operation can be performed with high accuracy.
  • the first DFD operation may be performed based on the first image and the second image with different lens positions
  • the second DFD operation may be performed based on the third image and the fourth image with different lens positions.
  • the slope is calculated using the results of two DFD operations.
  • the slope can also be calculated using the results of more than three DFD operations.
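One straightforward way to use the results of three or more DFD calculations, as suggested above, is a least-squares fit of the DFD calculation value against the lens reference position; this particular fitting choice and the numbers below are assumptions for illustration, not taken from the patent.

    import numpy as np

    def fitted_slope(reference_positions, dfd_values):
        # Least-squares slope of DFD calculation value vs. lens reference
        # position; works with two or more DFD calculation results.
        slope, _intercept = np.polyfit(reference_positions, dfd_values, deg=1)
        return slope

    # Hypothetical results of three DFD calculations (lens pulses / DFD values).
    refs = [1300, 1580, 1860]
    dfds = [518, 311, 104]
    print(fitted_slope(refs, dfds))   # approximately -0.74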
  • the aforementioned imaging device 100 may be mounted on a mobile body.
  • the camera device 100 may also be mounted on an unmanned aerial vehicle (UAV) as shown in FIG. 9.
  • the UAV 10 may include a UAV main body 20, a gimbal 50, a plurality of imaging devices 60, and the imaging device 100.
  • the gimbal 50 and the imaging device 100 are an example of an imaging system.
  • the UAV 10 is an example of a mobile body propelled by a propulsion section.
  • the concept of moving objects also includes flying objects such as other airplanes that move in the air, vehicles that move on the ground, and ships that move on the water.
  • the UAV main body 20 includes a plurality of rotors. Multiple rotors are an example of a propulsion section.
  • the UAV main body 20 makes the UAV 10 fly by controlling the rotation of a plurality of rotors.
  • the UAV main body 20 uses, for example, four rotors to fly the UAV 10. The number of rotors is not limited to four.
  • the UAV 10 may also be a fixed-wing aircraft without rotors.
  • the imaging device 100 is an imaging camera that captures a subject included in a desired imaging range.
  • the gimbal 50 rotatably supports the imaging device 100.
  • the gimbal 50 is an example of a supporting mechanism.
  • the gimbal 50 uses an actuator to rotatably support the imaging device 100 around the pitch axis.
  • the gimbal 50 uses actuators to further rotatably support the imaging device 100 around the roll axis and the yaw axis, respectively.
  • the gimbal 50 can change the posture of the camera device 100 by rotating the camera device 100 around at least one of the yaw axis, the pitch axis, and the roll axis.
  • the plurality of imaging devices 60 are sensing cameras that photograph the surroundings of the UAV 10 in order to control the flight of the UAV 10.
  • the two camera devices 60 can be installed on the nose of the UAV 10, that is, on the front.
  • the other two camera devices 60 may be provided on the bottom surface of the UAV 10.
  • the two imaging devices 60 on the front side may be paired to function as a so-called stereo camera.
  • the two imaging devices 60 on the bottom side may also be paired to function as a stereo camera.
  • the three-dimensional spatial data around the UAV 10 can be generated based on the images taken by the plurality of camera devices 60.
  • the number of imaging devices 60 included in the UAV 10 is not limited to four.
  • the UAV 10 may include at least one camera device 60.
  • the UAV 10 may be equipped with at least one imaging device 60 on each of the nose, the tail, the sides, the bottom surface, and the top surface of the UAV 10.
  • the viewing angle that can be set in the imaging device 60 may be larger than the viewing angle that can be set in the imaging device 100.
  • the imaging device 60 may have a single focus lens or a fisheye lens.
  • the remote operation device 300 communicates with the UAV 10 to remotely operate the UAV 10.
  • the remote operation device 300 can wirelessly communicate with the UAV 10.
  • the remote operation device 300 transmits to the UAV 10 instruction information indicating various commands related to the movement of the UAV 10 such as ascending, descending, accelerating, decelerating, forwarding, retreating, and rotating.
  • the instruction information includes, for example, instruction information for raising the height of the UAV 10.
  • the indication information may indicate the height at which the UAV 10 should be located.
  • the UAV 10 moves to be located at the height indicated by the instruction information received from the remote operation device 300.
  • the instruction information may include an ascending instruction to raise the UAV 10. The UAV 10 ascends while it is receiving the ascending instruction. When the height of the UAV 10 has reached its upper limit, the ascent of the UAV 10 may be restricted even if the ascending instruction is accepted.
  • FIG. 10 shows an example of a computer 1200 that can embody aspects of the present invention in whole or in part.
  • the program installed on the computer 1200 can make the computer 1200 function as an operation associated with the device according to the embodiment of the present invention or one or more "parts" of the device.
  • a program installed on the computer 1200 can make the computer 1200 function as the imaging control unit 110.
  • the program can enable the computer 1200 to perform related operations or related functions of one or more "parts".
  • This program enables the computer 1200 to execute the process or stages of the process involved in the embodiment of the present invention.
  • Such a program may be executed by the CPU 1212, so that the computer 1200 executes specified operations associated with some or all blocks in the flowcharts and block diagrams described in this specification.
  • the computer 1200 of this embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other through a host controller 1210.
  • the computer 1200 further includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 through the input/output controller 1220.
  • the computer 1200 also includes a ROM 1230.
  • the CPU 1212 operates in accordance with programs stored in the ROM 1230 and RAM 1214 to control each unit.
  • the communication interface 1222 communicates with other electronic devices through a network.
  • the hard disk drive can store programs and data used by the CPU 1212 in the computer 1200.
  • the ROM 1230 stores therein a boot program executed by the computer 1200 during operation, and/or a program that depends on the hardware of the computer 1200.
  • the program is provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network.
  • the program is installed in RAM 1214 or ROM 1230 which is also an example of a computer-readable recording medium, and is executed by CPU 1212.
  • the information processing described in these programs is read by the computer 1200 and causes cooperation between the programs and the various types of hardware resources described above.
  • the apparatus or method may be constituted by implementing operations or processing of information according to the use of the computer 1200.
  • the CPU 1212 can execute a communication program loaded in the RAM 1214, and based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing.
  • under the control of the CPU 1212, the communication interface 1222 reads the transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and sends the read transmission data to the network, or writes the data received from the network into a reception buffer provided in the recording medium.
  • the CPU 1212 can make the RAM 1214 read all or necessary parts of files or databases stored in an external recording medium such as a USB memory, and perform various types of processing on the data on the RAM 1214. Then, the CPU 1212 can write the processed data back to the external recording medium.
  • the CPU 1212 can perform various types of operations specified by the instruction sequences of the programs and described throughout this disclosure, including various types of information processing, conditional judgment, conditional branching, unconditional branching, and information retrieval/replacement, and write the results back to the RAM 1214.
  • the CPU 1212 can search for information in files, databases, and the like in the recording medium. For example, when a plurality of entries each having an attribute value of a first attribute associated with an attribute value of a second attribute are stored in the recording medium, the CPU 1212 may retrieve, from the plurality of entries, an entry whose first-attribute value matches a specified condition, read the second-attribute value stored in that entry, and thereby obtain the second-attribute value associated with the first attribute that satisfies the predetermined condition (see the sketch after this list).
  • the programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200.
  • a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium so that the program can be provided to the computer 1200 via the network.
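As a minimal illustration only (not part of the patent text) of the entry-retrieval behavior described in the list above: the sketch below assumes entries stored as a simple list of dictionaries, and all names in it are hypothetical.

```python
# Hypothetical sketch: retrieve the second-attribute value associated with a
# first-attribute value that satisfies a condition, mirroring the entry-search
# behavior described above. All names and data here are illustrative.
entries = [
    {"first_attr": 10, "second_attr": "red"},
    {"first_attr": 25, "second_attr": "green"},
    {"first_attr": 40, "second_attr": "blue"},
]

def lookup_second_attr(entries, predicate):
    """Return second-attribute values of entries whose first attribute matches."""
    return [e["second_attr"] for e in entries if predicate(e["first_attr"])]

# Example: entries whose first attribute is at least 20.
print(lookup_second_attr(entries, lambda v: v >= 20))  # ['green', 'blue']
```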


Abstract

The detection accuracy of the distance to a photographed subject can sometimes decrease depending on the subject. Provided is an apparatus comprising a circuit configured to: acquire, on the basis of the amount of blurring of a first image included in a first photographed image taken when the imaging surface and the focusing lens of the photographic apparatus are in a first positional relationship and the amount of blurring of a second image included in a second photographed image taken when the imaging surface and the focusing lens are in a second positional relationship, a first predicted value representing the positional relationship between the imaging surface and the focusing lens when the photographed subject is in focus; acquire, on the basis of the amount of blurring of a third image included in a third photographed image taken when the imaging surface and the focusing lens are in a third positional relationship, a second predicted value representing the positional relationship between the imaging surface and the focusing lens when the photographed subject is in focus; and determine, on the basis of the difference between the first predicted value and the second predicted value, a target value representing the positional relationship between the imaging surface and the focusing lens when the photographed subject is in focus.

Description

Device, Imaging Device, Movable Body, Method, and Program
Technical Field
The present invention relates to a device, an imaging device, a movable body, a method, and a program.
Background Art
There is described a technique of measuring a subject distance using the DFD method.
[Prior Art Documents]
[Patent Documents]
[Patent Document 1] JP 2013-242617 A
Summary of the Invention
Depending on the subject, the detection accuracy of the subject distance may decrease.
A device according to one aspect of the present invention includes a circuit configured to acquire the blur amount of a first image included in a first captured image taken in a state where the imaging surface of an imaging device and the focus lens are in a first positional relationship, and the blur amount of a second image included in a second captured image taken in a state where the imaging surface and the focus lens are in a second positional relationship. The circuit is configured to acquire, based on the blur amount of the first image and the blur amount of the second image, a first predicted value representing the positional relationship between the imaging surface and the focus lens when the subject is in focus. The circuit is configured to acquire the blur amount of a third image included in a third captured image taken in a state where the imaging surface and the focus lens are in a third positional relationship. The circuit is configured to acquire, based on the blur amount of the second image and the blur amount of the third image, a second predicted value representing the positional relationship between the imaging surface and the focus lens when the subject is in focus. The circuit is configured to determine, based on the difference between the first predicted value and the second predicted value, a target value representing the positional relationship between the imaging surface and the focus lens when the subject is in focus.
The first positional relationship, the second positional relationship, and the third positional relationship may each represent the position of the focus lens relative to the imaging surface. The first predicted value may represent a first target position of the focus lens relative to the imaging surface when the subject is in focus. The second predicted value may represent a second target position of the focus lens relative to the imaging surface when the subject is in focus. The circuit may be configured to determine, using the difference between the first target position and the second target position, a target value representing the target position of the focus lens relative to the imaging surface when the subject is in focus.
With the position of the focus lens determined from at least one of the first positional relationship and the second positional relationship taken as a first reference position, and the position of the focus lens determined from at least one of the second positional relationship and the third positional relationship taken as a second reference position, the circuit may be configured to determine the target value representing the target position of the focus lens relative to the imaging surface when the subject is in focus, using the difference between the first target position and the second target position relative to the difference between the first reference position and the second reference position.
The circuit may be configured to calculate, as change information, the ratio of the difference between the first target position and the second target position to the difference between the first reference position and the second reference position. The circuit may be configured to calculate a correction value of the second target position by dividing the second target position by this ratio. The circuit may be configured to calculate, as the target value, the position indicated by the correction value relative to the second reference position.
The circuit may be configured to perform focus control of the imaging device according to the target value.
The circuit may be configured to acquire the blur amount of the first image and the blur amount of the second image, and to acquire the first predicted value based on these blur amounts. The circuit may be configured to perform focus control of the imaging device according to the first predicted value when the reliability value of the first predicted value is greater than or equal to a preset first value. The circuit may be configured, when the reliability value of the first predicted value is lower than the preset first value, to acquire the blur amount of the third image, acquire the second predicted value based on the blur amount of the second image and the blur amount of the third image, and perform focus control of the imaging device according to the target value.
The circuit may be configured, when the reliability value of the first predicted value is lower than a second value that is lower than the first value, not to acquire the third image and not to perform focus control of the imaging device based on the first predicted value. The circuit may be configured, when the reliability value of the first predicted value is lower than the first value and greater than or equal to the second value, to acquire the blur amount of the third image, acquire the second predicted value based on the blur amount of the second image and the blur amount of the third image, and perform focus control of the imaging device according to the target value.
A device according to one aspect of the present invention includes a circuit configured to acquire the blur amount of a first image included in a first captured image taken in a state where the imaging surface of an imaging device and the focus lens are in a first positional relationship, and the blur amount of a second image included in a second captured image taken in a state where the imaging surface and the focus lens are in a second positional relationship. The circuit may be configured to acquire, based on the blur amount of the first image and the blur amount of the second image, a first predicted value representing the positional relationship between the imaging surface and the focus lens when the subject is in focus. The circuit may be configured to acquire the blur amount of a third image included in a third captured image taken in a state where the imaging surface and the focus lens are in a third positional relationship, and the blur amount of a fourth image included in a fourth captured image taken in a state where the imaging surface and the focus lens are in a fourth positional relationship. The circuit may be configured to acquire, based on the blur amount of the third image and the blur amount of the fourth image, a second predicted value representing the positional relationship between the imaging surface and the focus lens when the subject is in focus. The circuit may be configured to determine, based on the difference between the first predicted value and the second predicted value, a target value representing the positional relationship between the imaging surface and the focus lens when the subject is in focus.
An imaging device according to one aspect of the present invention may include the above-described device and an image sensor having the imaging surface.
An imaging system according to one aspect of the present invention may include the above-described imaging device and a support mechanism that supports the imaging device so that the attitude of the imaging device can be controlled.
A movable body according to one aspect of the present invention may carry the above-described imaging device and move.
A method according to one aspect of the present invention includes acquiring the blur amount of a first image included in a first captured image taken in a state where the imaging surface of an imaging device and the focus lens are in a first positional relationship, and the blur amount of a second image included in a second captured image taken in a state where the imaging surface and the focus lens are in a second positional relationship. The method may include acquiring, based on the blur amount of the first image and the blur amount of the second image, a first predicted value representing the positional relationship between the imaging surface and the focus lens when the subject is in focus. The method may include acquiring the blur amount of a third image included in a third captured image taken in a state where the imaging surface of the imaging device and the focus lens are in a third positional relationship. The method may include acquiring, based on the blur amount of the second image and the blur amount of the third image, a second predicted value representing the positional relationship between the imaging surface and the focus lens when the subject is in focus. The method may include determining, based on the difference between the first predicted value and the second predicted value, a target value representing the positional relationship between the imaging surface and the focus lens when the subject is in focus.
A program according to one aspect of the present invention may be a program for causing a computer to execute the above-described method.
According to one aspect of the present invention, it is sometimes possible to suppress a decrease in the detection accuracy of the subject distance caused by differences between subjects.
The above summary does not enumerate all of the essential features of the present invention. Sub-combinations of these feature groups may also constitute inventions.
Brief Description of the Drawings
FIG. 1 is a diagram showing an example of an external perspective view of the imaging device 100.
FIG. 2 is a diagram showing functional blocks of the imaging device 100.
FIG. 3 shows an example of a curve representing the relationship between the blur amount (cost) of an image and the position of the focus lens.
FIG. 4 is a flowchart showing an example of the distance calculation procedure in the BDAF method.
FIG. 5 is a diagram illustrating the calculation procedure of the subject distance.
FIG. 6 shows the relationship between DFD calculation values and the defocus amount for different subjects.
FIG. 7 shows the principle of focus control based on two DFD calculations performed by the imaging control unit 110.
FIG. 8 is a flowchart showing the processing procedure of the focus control executed by the imaging control unit 110.
FIG. 9 shows an example of an unmanned aerial vehicle (UAV).
FIG. 10 shows an example of a computer 1200 that embodies aspects of the present invention in whole or in part.
Detailed Description
Hereinafter, the present invention will be described through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. In addition, not all combinations of the features described in the embodiments are necessarily essential to the solution of the invention. It is obvious to a person skilled in the art that various changes or improvements can be made to the following embodiments. It is apparent from the description of the claims that embodiments incorporating such changes or improvements can also be included in the technical scope of the present invention.
The claims, the description, the drawings, and the abstract include matters subject to copyright protection. The copyright owner will not object to the reproduction of these documents by anyone as long as it is done as indicated in the files or records of the patent office. In all other cases, all copyrights are reserved.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams. Here, a block may represent (1) a stage of a process in which an operation is performed or (2) a "part" of a device that serves to perform the operation. Specific stages and "parts" may be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits. Reconfigurable hardware circuits may include logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as memory elements such as flip-flops, registers, field-programmable gate arrays (FPGA), and programmable logic arrays (PLA).
A computer-readable medium may include any tangible device capable of storing instructions to be executed by a suitable device. As a result, a computer-readable medium having instructions stored thereon constitutes a product that includes instructions which can be executed to create means for performing the operations specified in the flowcharts or block diagrams. Examples of computer-readable media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like. More specific examples of computer-readable media may include floppy (registered trademark) disks, diskettes, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile discs (DVD), Blu-ray (registered trademark) discs, memory sticks, integrated circuit cards, and the like.
Computer-readable instructions may include either source code or object code described in any combination of one or more programming languages. The source code or object code includes conventional procedural programming languages. Conventional procedural programming languages may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, object-oriented programming languages such as Smalltalk (registered trademark), JAVA (registered trademark), and C++, or the "C" programming language or similar programming languages. The computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing apparatus, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet. The processor or programmable circuit can execute the computer-readable instructions to create means for performing the operations specified in the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
FIG. 1 is a diagram showing an example of an external perspective view of the imaging device 100 according to this embodiment. FIG. 2 is a diagram showing functional blocks of the imaging device 100 according to this embodiment.
The imaging device 100 includes an imaging unit 102 and a lens unit 200. The imaging unit 102 includes an image sensor 120, an imaging control unit 110, a memory 130, an instruction unit 162, and a display unit 160.
The image sensor 120 may be composed of a CCD or a CMOS sensor. The image sensor 120 receives light through the lens 210 included in the lens unit 200 and outputs image data of the optical image formed by the lens 210 to the imaging control unit 110.
The imaging control unit 110 may be constituted by a microprocessor such as a CPU or an MPU, or a microcontroller such as an MCU. The memory 130 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as a USB memory. The imaging control unit 110 corresponds to a circuit. The memory 130 stores the programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and other components. The memory 130 may be provided inside the housing of the imaging device 100, or may be provided so as to be detachable from the housing of the imaging device 100.
The instruction unit 162 is a user interface that accepts instructions for the imaging device 100 from the user. The display unit 160 displays images captured by the image sensor 120 and processed by the imaging control unit 110, various setting information of the imaging device 100, and the like. The display unit 160 may be composed of a touch panel.
The imaging control unit 110 controls the lens unit 200 and the image sensor 120. For example, the imaging control unit 110 controls the focal position and the focal length of the lens 210. The imaging control unit 110 controls the lens unit 200 by outputting control commands to the lens control unit 220 included in the lens unit 200 based on information indicating instructions from the user.
The lens unit 200 has one or more lenses 210, a lens driving unit 212, a lens control unit 220, and a memory 222. In this embodiment, the one or more lenses 210 are collectively referred to as the "lens 210". The lens 210 may include a focus lens and a zoom lens. At least some or all of the lenses included in the lens 210 are arranged so as to be movable along the optical axis of the lens 210. The lens unit 200 may be an interchangeable lens detachably attached to the imaging unit 102.
The lens driving unit 212 moves at least some or all of the lenses of the lens 210 along the optical axis of the lens 210. The lens control unit 220 drives the lens driving unit 212 in accordance with lens control commands from the imaging unit 102 to move the lens 210 as a whole, or the zoom lens or focus lens included in the lens 210, along the optical axis, thereby performing at least one of a zoom operation and a focus operation. The lens control commands are, for example, zoom control commands and focus control commands.
The lens driving unit 212 may include a voice coil motor (VCM) that moves at least some or all of the plurality of lenses 210 in the optical axis direction. The lens driving unit 212 may include an electric motor such as a DC motor, a coreless motor, or an ultrasonic motor. The lens driving unit 212 may transmit the power of the motor to at least some or all of the plurality of lenses 210 via mechanical components such as cam rings and guide shafts, thereby moving at least some or all of the lenses 210 along the optical axis.
The memory 222 stores control values for the focus lens and the zoom lens moved by the lens driving unit 212. The memory 222 may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as a USB memory.
The imaging control unit 110 performs control, including control of the imaging operation, on the image sensor 120 by outputting control commands to the image sensor 120 based on information indicating user instructions acquired through the instruction unit 162 and the like. The imaging control unit 110 acquires images captured by the image sensor 120, performs image processing on the images acquired from the image sensor 120, and stores them in the memory 130.
The operation of the imaging control unit 110 in this embodiment will now be described. The imaging control unit 110 acquires the blur amount of a first image included in a first captured image taken in a state where the imaging surface of the imaging device 100 and the focus lens are in a first positional relationship, and the blur amount of a second image included in a second captured image taken in a state where the imaging surface and the focus lens are in a second positional relationship. The imaging control unit 110 acquires, based on the blur amount of the first image and the blur amount of the second image, a first predicted value representing the positional relationship between the imaging surface and the focus lens when the subject is in focus. The imaging control unit 110 acquires the blur amount of a third image included in a third captured image taken in a state where the imaging surface and the focus lens are in a third positional relationship. The imaging control unit 110 acquires, based on the blur amount of the second image and the blur amount of the third image, a second predicted value representing the positional relationship between the imaging surface and the focus lens when the subject is in focus. The imaging control unit 110 determines, based on the difference between the first predicted value and the second predicted value, a target value representing the positional relationship between the imaging surface and the focus lens when the subject is in focus.
The first positional relationship, the second positional relationship, and the third positional relationship may each represent the position of the focus lens relative to the imaging surface. The first predicted value may represent a first target position of the focus lens relative to the imaging surface when the subject is in focus. The second predicted value may represent a second target position of the focus lens relative to the imaging surface when the subject is in focus. In this case, the imaging control unit 110 may determine, using the difference between the first target position and the second target position, a target value representing the target position of the focus lens relative to the imaging surface when the subject is in focus.
With the position of the focus lens determined from at least one of the first positional relationship and the second positional relationship taken as a first reference position, and the position of the focus lens determined from at least one of the second positional relationship and the third positional relationship taken as a second reference position, the imaging control unit 110 determines the target value representing the target position of the focus lens relative to the imaging surface when the subject is in focus, using the difference between the first target position and the second target position relative to the difference between the first reference position and the second reference position.
The imaging control unit 110 may calculate, as change information, the ratio of the difference between the first target position and the second target position to the difference between the first reference position and the second reference position. The imaging control unit 110 may calculate a correction value of the second target position by dividing the second target position by this ratio. The imaging control unit 110 may calculate, as the target value, the position indicated by the correction value relative to the second reference position. The imaging control unit 110 may perform focus control of the imaging device 100 according to the target value.
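As a rough, non-authoritative sketch of the correction described above: when the DFD calculation values and reference positions are expressed in lens-pulse units, as in the numerical example given later with FIG. 7, the target-value calculation reduces to a few lines. The function and variable names below are not from the patent.

```python
# Hedged sketch of the target-value correction, following the formulation that
# appears with FIG. 7 (lens positions in pulse units). Names are assumed.
# dfd1, dfd2: DFD calculation values (predicted target movement amounts) from
#             the first and second DFD calculations.
# ref1, ref2: lens reference positions at which those values were obtained.
def corrected_target_position(dfd1: float, ref1: float,
                              dfd2: float, ref2: float) -> float:
    # Change of the DFD calculation value per unit change of the reference
    # position (the "change information": slope of the line through the points).
    slope = (dfd2 - dfd1) / (ref2 - ref1)
    # Correct the second DFD value by the slope, then express the result
    # relative to the second reference position to obtain the target value.
    return ref2 - dfd2 / slope
```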
The imaging control unit 110 may acquire the blur amount of the first image and the blur amount of the second image, and acquire the first predicted value based on these blur amounts. When the reliability value of the first predicted value is greater than or equal to a preset first value, the imaging control unit 110 may perform focus control of the imaging device 100 according to the first predicted value. When the reliability value of the first predicted value is lower than the preset first value, the imaging control unit 110 acquires the blur amount of the third image, acquires the second predicted value based on the blur amount of the second image and the blur amount of the third image, and performs focus control of the imaging device 100 according to the target value.
When the reliability value of the first predicted value is lower than a second value that is lower than the first value, the imaging control unit 110 does not acquire the third image and does not perform focus control of the imaging device 100 based on the first predicted value. When the reliability value of the first predicted value is lower than the first value and greater than or equal to the second value, the imaging control unit 110 acquires the blur amount of the third image, acquires the second predicted value based on the blur amount of the second image and the blur amount of the third image, and performs focus control of the imaging device 100 according to the target value.
Alternatively, the imaging control unit 110 may acquire the blur amount of a first image included in a first captured image taken in a state where the imaging surface of the imaging device 100 and the focus lens are in a first positional relationship, and the blur amount of a second image included in a second captured image taken in a state where the imaging surface and the focus lens are in a second positional relationship. The imaging control unit 110 may acquire, based on the blur amount of the first image and the blur amount of the second image, a first predicted value representing the positional relationship between the imaging surface and the focus lens when the subject is in focus. The imaging control unit 110 may acquire the blur amount of a third image included in a third captured image taken in a state where the imaging surface and the focus lens are in a third positional relationship, and the blur amount of a fourth image included in a fourth captured image taken in a state where the imaging surface and the focus lens are in a fourth positional relationship. The imaging control unit 110 may acquire, based on the blur amount of the third image and the blur amount of the fourth image, a second predicted value representing the positional relationship between the imaging surface and the focus lens when the subject is in focus. The imaging control unit 110 may determine, based on the difference between the first predicted value and the second predicted value, a target value representing the positional relationship between the imaging surface and the focus lens when the subject is in focus.
The AF method executed by the imaging device 100 will now be described. To perform AF processing, the imaging device 100 determines the distance from the lens 210 to the subject (the subject distance). One way to determine the subject distance is to move the focus lens and determine the distance from the blur amounts of a plurality of images captured in states where the positional relationship between the focus lens and the light-receiving surface of the image sensor 120 differs. AF using this method is referred to here as the blur detection auto focus (Bokeh Detection Auto Focus: BDAF) method. Specifically, in BDAF, a DFD (Depth From Defocus) calculation is performed to carry out AF.
For example, the blur amount (cost) of an image can be expressed by the following formula (1) using a Gaussian function. In formula (1), x represents the pixel position in the horizontal direction, and σ represents the standard deviation.
[Formula 1]
[Equation (1), which expresses the blur amount (Cost) as a Gaussian function of the pixel position x with standard deviation σ, is rendered as an image in the original publication and is not reproduced here.]
FIG. 3 shows an example of a curve representing the relationship between the blur amount (cost) of an image and the position of the focus lens. C1 is the blur amount obtained when the focus lens is located at x1, and C2 is the blur amount obtained when the focus lens is located at x2. The subject can be brought into focus by moving the focus lens to the lens position x0, which corresponds to the minimum point 502 of a curve 500 determined from the blur amounts C1 and C2 in consideration of the optical characteristics of the lens 210.
FIG. 4 is a flowchart showing an example of the distance calculation procedure in the BDAF method. The imaging control unit 110 captures a first image with the lens 210 and the imaging surface of the image sensor 120 in a first positional relationship and stores it in the memory 130. The imaging control unit 110 then moves the lens 210 in the optical axis direction to bring the lens 210 and the imaging surface into a second positional relationship, captures a second image with the imaging device 100, and stores it in the memory 130 (S201). For example, the imaging control unit 110 changes the positional relationship between the lens 210 and the imaging surface from the first positional relationship to the second positional relationship by moving the focus lens along the optical axis. The amount of lens movement may be, for example, about 10 μm.
Next, the imaging control unit 110 divides the first image into a plurality of regions (S202). The imaging control unit 110 may calculate a feature amount for each pixel in the first image and divide the first image into a plurality of regions by treating groups of pixels with similar feature amounts as one region. The imaging control unit 110 may also divide the pixel group set as the range of the AF processing frame in the first image into a plurality of regions. The imaging control unit 110 divides the second image into a plurality of regions corresponding to the plurality of regions of the first image. The imaging control unit 110 calculates, for each of the plurality of regions, the distance to the subject corresponding to the object contained in that region, based on the blur amount of each region of the first image and the blur amount of each corresponding region of the second image (S203).
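The following is a schematic sketch of the region-wise processing of S202/S203 under stated assumptions: the images are 2D grayscale arrays split on a fixed grid, and the blur metric and distance function are placeholders that are not defined by the patent text.

```python
import numpy as np

# Schematic sketch (assumptions, not the patent's implementation): split the
# first image into regions, pair each with the corresponding region of the
# second image, and estimate a per-region value from the two blur amounts.
def blur_amount(region: np.ndarray) -> float:
    # Placeholder: local contrast stands in for the blur metric (Cost).
    return float(np.std(region))

def distance_from_blur(c1: float, c2: float) -> float:
    # Placeholder for the DFD-based distance estimate; illustrative only.
    return abs(c1 - c2)

def per_region_distances(img1: np.ndarray, img2: np.ndarray, grid=(4, 4)):
    h, w = img1.shape
    gh, gw = h // grid[0], w // grid[1]
    distances = {}
    for i in range(grid[0]):
        for j in range(grid[1]):
            r1 = img1[i*gh:(i+1)*gh, j*gw:(j+1)*gw]
            r2 = img2[i*gh:(i+1)*gh, j*gw:(j+1)*gw]
            distances[(i, j)] = distance_from_blur(blur_amount(r1), blur_amount(r2))
    return distances
```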
The method of changing the positional relationship between the lens 210 and the imaging surface of the image sensor 120 is not limited to moving the focus lens included in the lens 210. For example, the imaging control unit 110 may move the entire lens 210 in the optical axis direction, may move the imaging surface of the image sensor 120 in the optical axis direction, or may move both at least some of the lenses included in the lens 210 and the imaging surface of the image sensor 120 in the optical axis direction. The imaging control unit 110 may adopt any method as long as the relative positional relationship between the focal point of the lens 210 and the imaging surface of the image sensor 120 is optically changed.
The calculation procedure of the subject distance will be further explained with reference to FIG. 5. Let A be the distance from the principal point of the lens L to the subject 510 (object plane), let B be the distance from the principal point of the lens L to the position (image plane) where the light beam from the subject 510 forms an image, and let F be the focal length of the lens L. In this case, the relationship between the distance A, the distance B, and the focal length F can be expressed by the following equation (2) according to the lens formula.
[Formula 2]
1/A + 1/B = 1/F … (2)
The focal length F is determined by the positions of the individual lenses included in the lens L. Therefore, if the distance B at which the light beam from the subject 510 forms an image can be determined, the distance A from the principal point of the lens L to the subject 510 can be determined using equation (2).
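Equation (2) can be rearranged to give the subject distance A directly from B and F. The short sketch below shows the computation; the numerical values are illustrative and not taken from the patent.

```python
# Subject distance A from the lens formula 1/A + 1/B = 1/F (equation (2)).
# Units are arbitrary but must be consistent (e.g., millimetres).
def subject_distance(B: float, F: float) -> float:
    return 1.0 / (1.0 / F - 1.0 / B)

# Illustrative example (values assumed): F = 50 mm, B = 52 mm.
print(subject_distance(B=52.0, F=50.0))  # 1300.0 mm
```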
Here, assume that the positional relationship between the lens L and the imaging surface is changed by moving the imaging surface of the image sensor toward the lens L. As shown in FIG. 5, if the imaging surface is located at the distance D1 or the distance D2 from the principal point of the lens L, the image of the subject 510 projected on the imaging surface is blurred. The position at which the subject 510 forms an image is calculated from the blur sizes (circles of confusion 512 and 514) of the image of the subject 510 projected on the imaging surface; the distance B can thereby be determined, and the distance A can then be determined. In other words, since the size of the blur (blur amount) is proportional to the distance between the imaging surface and the image-forming position, the image-forming position can be determined from the difference in blur amounts.
Here, the image I1 at the position at the distance D1 and the image I2 at the position at the distance D2 are each blurred. For the image I1, if the point spread function is PSF1 and the subject image is Id1, the image I1 can be expressed by a convolution as in the following equation (3).
[Formula 3]
I1 = PSF1 * Id1 … (3)
The image I2 is likewise expressed by a convolution with PSF2. Let the Fourier transform of the subject image be f, and let the optical transfer functions obtained by Fourier-transforming the point spread functions PSF1 and PSF2 be OTF1 and OTF2. A ratio is then obtained as shown in the following equation (4).
[Formula 4]
[Equation (4), which forms the ratio C from the Fourier-transformed blurred images (the optical transfer functions OTF1 and OTF2 applied to f), is rendered as an image in the original publication and is not reproduced here.]
The value C shown in equation (4) corresponds to the amount of change between the blur amounts of the image at the position at the distance D1 from the principal point of the lens L and the image at the position at the distance D2 from the principal point of the lens L; that is, the value C corresponds to the difference between the blur amounts of those two images.
In FIG. 5, the case where the positional relationship between the lens L and the imaging surface is changed by moving the imaging surface toward the lens L has been described. A difference in blur amount also arises when the focus lens is moved relative to the imaging surface, thereby changing the positional relationship between the focal position of the lens L and the imaging surface. In this embodiment, images with different blur amounts are obtained mainly by moving the focus lens relative to the imaging surface; a DFD calculation is performed on the acquired images to obtain a DFD calculation value representing the defocus amount, and the target value of the focus lens position for focusing on the subject is calculated from the DFD calculation value.
FIG. 6 shows the relationship between DFD calculation values and the defocus amount for different subjects. The horizontal axes of graphs 610, 620, and 630 are the defocus amount, and the vertical axes are the DFD calculation value. The unit of both the defocus amount and the DFD calculation value is fδ. Each circular mark in graphs 610, 620, and 630 represents one DFD calculation value obtained by a DFD calculation on two images captured while changing the position of the focus lens. The solid lines in graphs 610, 620, and 630 represent the ideal values of the DFD calculation.
Graph 610 shows the relationship between the DFD calculation value and the actual defocus amount for images of a specific object A as the subject. Graph 620 shows the same relationship for images of a specific object B different from object A, and graph 630 for images of a specific object C different from objects A and B. The data point judged from the actually captured images to be in the best-focused state is taken as the origin of graphs 610, 620, and 630.
Graph 650 shows the deviation of the DFD calculation values from the ideal values for the images of object A. Graph 660 shows the same deviation for the images of object B, and graph 670 for the images of object C.
Among objects A, B, and C, object A has the highest contrast and object C has the lowest contrast; object C is, for example, a cloud. As can be seen from graphs 610, 620, 630, 650, 660, and 670, for the images of object A the error of the DFD calculation value is extremely small, and the accuracy of the DFD calculation value remains high even when the defocus amount is large. For the images of object C, on the other hand, the error of the DFD calculation value is relatively large; in particular, the larger the defocus amount, the lower the accuracy of the DFD calculation value. Thus, depending on the object serving as the subject, the change in the DFD calculation value relative to the change in the defocus amount is sometimes smaller than the ideal value of the DFD calculation, and the DFD calculation value sometimes deviates from its ideal value.
FIG. 7 shows the principle of focus control based on two DFD calculations performed by the imaging control unit 110. The vertical axis of graph 710 is the DFD calculation value and the horizontal axis is the defocus amount. Graph 710 shows the two data points 711 and 712 of graph 630 in FIG. 6 that are used for focus control. Graph 720 expresses the data of graph 710 in units of the number of lens pulses corresponding to the defocus amount. The number of lens pulses is the number of pulses supplied to the stepping motor that drives the lens 210 for focus control, and it indicates the position of the lens 210 relative to a predetermined reference point in the optical axis direction.
In FIG. 7, data point 711 represents the DFD calculation value and the number of lens pulses obtained by the first DFD calculation, and data point 712 represents those obtained by the second DFD calculation. The number of lens pulses indicates the reference position of the lens when capturing the images used for each DFD calculation. Since one DFD calculation uses two images captured at different lens positions, the average of the lens positions at which the images used in that DFD calculation were captured is taken as the lens reference position. The DFD calculation value obtained by a DFD calculation represents the defocus amount at that reference position, that is, the target movement amount of the lens 210 from the reference position to the in-focus state.
The imaging control unit 110 performs the first DFD calculation and the second DFD calculation with mutually different lens reference positions. The imaging control unit 110 calculates the amount of change in the DFD calculation value relative to the amount of change in the lens reference position. This value is the slope of the straight line 722 connecting data point 711 and data point 712. The second DFD calculation value is corrected by dividing it by the slope of the straight line 722, and the lens 210 is moved accordingly. The imaging control unit 110 can thereby bring the lens closer to the in-focus state (lens pulse number 2000) than when the second DFD calculation value is not corrected.
The operation of the imaging control unit 110 will now be explained using specific numerical values. The imaging control unit 110 captures three images P1, P2, and P3 at mutually different lens positions and performs two DFD calculations using these three images. Specifically, the imaging control unit 110 performs the first DFD calculation based on images P1 and P2, and the second DFD calculation based on images P2 and P3.
The lens position when image P1 is captured is LensP1, the lens position when image P2 is captured is LensP2, and the lens position when image P3 is captured is LensP3. Specifically, LensP1 = 1160, LensP2 = 1440, and LensP3 = 1720.
Let DFD1 be the first DFD calculation value obtained by the first DFD calculation based on images P1 and P2; specifically, DFD1 = -514. If the average of LensP1 and LensP2 is taken as DFDLensP1, the lens reference position DFDLensP1 is 1300. Data point 711 of graph 720 in FIG. 7 represents the coordinates (DFDLensP1, DFD1).
Let DFD2 be the second DFD calculation value based on images P2 and P3; specifically, DFD2 = -311. If the average of LensP2 and LensP3 is taken as DFDLensP2, then DFDLensP2 = 1580. Data point 712 of graph 720 in FIG. 7 represents (DFDLensP2, DFD2).
First, the target position (Peak2) of the lens 210 based only on the second DFD calculation is calculated by the following formula.
Peak2 = DFDLensP2 - DFD2
This gives Peak2 = 1580 - (-311) = 1891. In FIG. 7, since the lens position in the in-focus state is 2000, an error of -109 pulses occurs. Therefore, even if the lens 210 is moved to Peak2, the subject may not be sufficiently in focus.
Next, the focus control performed by the imaging control unit 110 will be described. The imaging control unit 110 calculates the slope of the straight line 722 passing through data points 711 and 712 by the following formula.
slope = (DFD2 - DFD1) / (DFDLensP2 - DFDLensP1) … (5)
This gives slope = 0.725.
Using the value of slope, the imaging control unit 110 corrects DFD2 as in the following formula and calculates the target position PeakCorrection of the lens 210.
PeakCorrection = DFDLensP2 - DFD2/slope
This gives PeakCorrection = 1580 - (-311)/0.725 ≈ 2009. Since the lens position in the in-focus state is 2000, the error is 9 pulses. Compared with Peak2, which is calculated based only on the second DFD calculation, the error can thus be kept below one tenth.
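The arithmetic in this numerical example can be checked directly; the short script below simply reproduces the values given above (variable names follow the description, everything else is plain arithmetic).

```python
# Reproduces the numerical example above (values taken from the description).
LensP1, LensP2, LensP3 = 1160, 1440, 1720
DFD1, DFD2 = -514, -311

DFDLensP1 = (LensP1 + LensP2) / 2    # 1300: reference position of the first DFD
DFDLensP2 = (LensP2 + LensP3) / 2    # 1580: reference position of the second DFD

Peak2 = DFDLensP2 - DFD2             # 1891: uncorrected target, -109 pulse error

slope = (DFD2 - DFD1) / (DFDLensP2 - DFDLensP1)   # 0.725, equation (5)
PeakCorrection = DFDLensP2 - DFD2 / slope          # ~2009, about 9 pulse error

print(DFDLensP1, DFDLensP2, Peak2, round(slope, 3), round(PeakCorrection))
```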
FIG. 8 is a flowchart showing the focus control procedure executed by the imaging control unit 110. The imaging control unit 110 executes a focus control process 800 and a DFD process 850 in parallel, the former implemented by an algorithm for focus control and the latter by an algorithm for DFD calculation.
First, in the focus control process 800, the imaging control unit 110 causes the image sensor 120 to capture an image at the current lens position and acquires image data of a first image (S802). In the DFD process 850, the imaging control unit 110 processes the image data of the first image into the information required for DFD calculation (S852).
In the focus control process 800, the imaging control unit 110 moves the focus lens of the lens 210 by a preset movement amount (S804), causes the image sensor 120 to capture an image, and acquires image data of a second image (S806). In the DFD process 850, the imaging control unit 110 processes the image data of the second image into the information required for DFD calculation, performs the first DFD calculation based on the data of the first and second images, and calculates the current defocus amount as the DFD calculation value together with a reliability value of the DFD calculation value (S852). The imaging control unit 110 then calculates the movement amount of the focus lens based on the DFD calculation value obtained by the first DFD calculation (S854). The imaging control unit 110 may calculate the reliability value of the DFD calculation value based on the blur amount of the subject in the image, for example based on the blur amount (Cost) of the image expressed by equation (1). The imaging control unit 110 may make the reliability value smaller as the blur amount becomes smaller.
In S856, the imaging control unit 110 compares the reliability value obtained by the first DFD calculation with a first threshold and with a second threshold lower than the first threshold. Specifically, if the reliability value obtained by the first DFD calculation is greater than or equal to the first threshold, the imaging control unit 110 moves the process to S812 and drives the lens to the lens target position indicated by the DFD calculation value. If the reliability value obtained by the first DFD calculation is lower than the preset first threshold but greater than or equal to the second threshold, the process proceeds to S808 and the lens position is moved. When moving the lens in S808, the imaging control unit 110 may determine the movement amount of the focus lens so that it does not pass the in-focus position, taking into account the DFD calculation value computed in S852. The imaging control unit 110 also processes the image data of the second image into the information required for DFD calculation (S858).
When the reliability value obtained by the first DFD calculation is lower than the second threshold, the BDAF focus control is stopped. When the BDAF focus control is stopped, the imaging control unit 110 may perform focus control by a method other than BDAF, such as contrast AF. If the imaging device 100 is in the middle of recording a moving image when the BDAF focus control is stopped, the imaging control unit 110 may place the lens 210 in the infinity focus state. If the imaging device 100 is about to capture a still image and the BDAF focus control is stopped, the imaging control unit 110 may notify the user of a focus error.
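The two-threshold branching of S856 and the fallback just described can be summarized in a short sketch. The threshold values and the callable names below are assumptions chosen only for illustration and are not taken from the specification.

```python
# Hypothetical sketch of the reliability branching at S856 (not the actual firmware).

FIRST_THRESHOLD = 0.8    # assumed value
SECOND_THRESHOLD = 0.4   # assumed value, lower than the first

def handle_first_dfd(reliability, drive_to_target, move_lens_and_retry, fallback_af):
    if reliability >= FIRST_THRESHOLD:
        drive_to_target()        # S812: drive the lens to the position indicated by the DFD value
    elif reliability >= SECOND_THRESHOLD:
        move_lens_and_retry()    # S808/S810: move the lens and perform the second DFD calculation
    else:
        fallback_af()            # stop BDAF; e.g. contrast AF, infinity focus, or a focus-error notice
```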
After the focus lens has been moved in S808, the imaging control unit 110 causes the image sensor 120 to capture an image and acquires image data of a third image (S810). In the DFD process 850, the imaging control unit 110 processes the image data of the third image into the information required for DFD calculation, performs the second DFD calculation based on the data of the second and third images, and calculates the current defocus amount as the DFD calculation value together with its reliability value (S858). Based on the DFD calculation value obtained by the first DFD calculation in S852, the DFD calculation value obtained by the second DFD calculation in S858, and the lens positions at which the first, second, and third images were captured, the imaging control unit 110 calculates the slope according to formula (5), corrects the DFD calculation value using the slope, and calculates the movement amount of the focus lens (S860).
The imaging control unit 110 determines whether the reliability value calculated by the second DFD calculation is greater than or equal to the second threshold (S862). When that reliability value is lower than the second threshold, the imaging control unit 110 stops the BDAF focus control. When it is greater than or equal to the second threshold, the imaging control unit 110 moves the process to S812 and moves the focus lens by the movement amount calculated in S860 (S812). The imaging control unit 110 also processes the image data of the third image into the information required for DFD calculation (S864).
After the lens has been moved in S812, the imaging control unit 110 causes the image sensor 120 to capture an image and acquires image data of a focus confirmation image (S814). In the DFD process 850, the imaging control unit 110 performs a DFD calculation based on the image data of the third image and the focus confirmation image, and calculates the current defocus amount obtained as the DFD calculation value and the reliability value of the DFD calculation (S864). Based on the defocus amount indicated by the DFD calculation value obtained in S864 and the reliability value of that calculation, the imaging control unit 110 determines whether the subject is in focus (S866). When it determines that the subject is in focus, the BDAF AF operation is complete; when it determines that the subject is not in focus, the process returns to S812 and the focus lens is moved to the target position indicated by the DFD calculation value obtained in S864. Thereafter, the operations of moving the focus lens, capturing an image for DFD calculation, performing the DFD calculation, and judging the focus state are repeated until the subject is determined to be in focus.
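A schematic sketch of this confirmation loop (S814, S864, S866, and the return to S812) is given below, under the assumption of simple callable helper steps; capture_image, dfd, move_lens_to, and is_in_focus are hypothetical names, not part of the described apparatus.

```python
# Schematic of the focus-confirmation loop, with hypothetical helper callables.

def confirm_focus_loop(capture_image, dfd, move_lens_to, is_in_focus,
                       previous_image, max_iterations=10):
    """Repeat capture -> DFD -> focus judgment -> lens move until in focus."""
    for _ in range(max_iterations):
        confirmation = capture_image()                            # S814: focus confirmation image
        defocus, reliability = dfd(previous_image, confirmation)  # S864: DFD on the last two images
        if is_in_focus(defocus, reliability):                     # S866: focus judgment
            return True                                           # BDAF AF operation complete
        move_lens_to(defocus)                                     # S812: move toward the indicated target
        previous_image = confirmation
    return False
```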
As described above, the imaging control unit 110 calculates the movement amount of the focus lens based on the difference between the DFD calculation value obtained by the first DFD calculation and the DFD calculation value obtained by the second DFD calculation. As a result, the AF operation can be performed with high accuracy even when the subject is an object with low DFD calculation sensitivity.
In the above description, three images are captured in order to perform the two DFD calculations. However, four images may instead be captured for the two DFD calculations. For example, the first DFD calculation may be performed based on a first image and a second image captured at different lens positions, and the second DFD calculation may be performed based on a third image and a fourth image captured at different lens positions.
Also, in the above description, the slope is calculated using the results of two DFD calculations. However, the slope may also be calculated using the results of three or more DFD calculations.
The imaging device 100 described above may be mounted on a movable body. The imaging device 100 may also be mounted on an unmanned aerial vehicle (UAV) as shown in FIG. 9. The UAV 10 may include a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and the imaging device 100. The gimbal 50 and the imaging device 100 are an example of an imaging system. The UAV 10 is an example of a movable body propelled by a propulsion unit. In addition to UAVs, the concept of a movable body includes other aircraft that move through the air, vehicles that move on the ground, ships that move on the water, and the like.
The UAV body 20 includes a plurality of rotors, which are an example of a propulsion unit. The UAV body 20 flies the UAV 10 by controlling the rotation of the rotors, using, for example, four rotors. The number of rotors is not limited to four. The UAV 10 may also be a fixed-wing aircraft without rotors.
The imaging device 100 is an imaging camera that photographs a subject included in a desired imaging range. The gimbal 50 rotatably supports the imaging device 100 and is an example of a support mechanism. For example, the gimbal 50 uses an actuator to rotatably support the imaging device 100 about the pitch axis, and further uses actuators to rotatably support it about the roll axis and the yaw axis, respectively. The gimbal 50 can change the attitude of the imaging device 100 by rotating it about at least one of the yaw axis, the pitch axis, and the roll axis.
The plurality of imaging devices 60 are sensing cameras that photograph the surroundings of the UAV 10 in order to control its flight. Two imaging devices 60 may be installed on the nose, that is, the front, of the UAV 10, and the other two imaging devices 60 may be installed on its bottom surface. The two front imaging devices 60 may be paired to function as a so-called stereo camera, and the two bottom imaging devices 60 may likewise be paired to function as a stereo camera. Three-dimensional spatial data of the surroundings of the UAV 10 can be generated from the images captured by the plurality of imaging devices 60. The number of imaging devices 60 included in the UAV 10 is not limited to four; it is sufficient for the UAV 10 to include at least one imaging device 60. The UAV 10 may also include at least one imaging device 60 on each of its nose, tail, sides, bottom surface, and top surface. The angle of view settable on the imaging device 60 may be wider than the angle of view settable on the imaging device 100. The imaging device 60 may have a single-focal-length lens or a fisheye lens.
The remote operation device 300 communicates with the UAV 10 to operate the UAV 10 remotely, and may communicate with the UAV 10 wirelessly. The remote operation device 300 transmits to the UAV 10 instruction information indicating various commands related to the movement of the UAV 10, such as ascending, descending, accelerating, decelerating, moving forward, moving backward, and rotating. The instruction information includes, for example, instruction information for raising the altitude of the UAV 10, and may indicate the altitude at which the UAV 10 should be located. The UAV 10 moves so as to be located at the altitude indicated by the instruction information received from the remote operation device 300. The instruction information may include an ascend command that causes the UAV 10 to rise, and the UAV 10 ascends while it is receiving the ascend command. When the altitude of the UAV 10 has reached its upper limit, the ascent of the UAV 10 may be restricted even if an ascend command is received.
FIG. 10 shows an example of a computer 1200 that may embody aspects of the present invention in whole or in part. A program installed on the computer 1200 can cause the computer 1200 to function as the operations associated with the apparatus according to the embodiment of the present invention, or as one or more "units" of that apparatus. Alternatively, the program can cause the computer 1200 to execute those operations or the functions of the one or more "units". For example, a program installed on the computer 1200 can cause the computer 1200 to function as the imaging control unit 110. The program can cause the computer 1200 to execute the process, or stages of the process, according to the embodiment of the present invention. Such a program may be executed by the CPU 1212 to cause the computer 1200 to execute the specified operations associated with some or all of the blocks in the flowcharts and block diagrams described in this specification.
The computer 1200 of this embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other by a host controller 1210. The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220. The computer 1200 also includes a ROM 1230. The CPU 1212 operates in accordance with the programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
The communication interface 1222 communicates with other electronic devices via a network. A hard disk drive may store the programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores a boot program executed by the computer 1200 at startup and/or programs that depend on the hardware of the computer 1200. The programs are provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network. The programs are installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and are executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. An apparatus or a method may be constituted by realizing operations or processing of information in accordance with the use of the computer 1200.
For example, when communication is performed between the computer 1200 and an external device, the CPU 1212 may execute a communication program loaded in the RAM 1214 and, based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing. Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and sends the read transmission data to the network, or writes reception data received from the network into a reception buffer provided in the recording medium.
In addition, the CPU 1212 may cause the RAM 1214 to read all or a necessary portion of a file or database stored in an external recording medium such as a USB memory, and perform various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.
Various types of information, such as various types of programs, data, tables, and databases, may be stored in a recording medium and subjected to information processing. For the data read from the RAM 1214, the CPU 1212 may execute the various types of processing described throughout this disclosure and specified by the instruction sequences of the programs, including various types of operations, information processing, conditional judgment, conditional branching, unconditional branching, and information retrieval/replacement, and write the results back to the RAM 1214. The CPU 1212 may also retrieve information in files, databases, and the like in the recording medium. For example, when a plurality of entries each having an attribute value of a first attribute associated with an attribute value of a second attribute are stored in the recording medium, the CPU 1212 may retrieve from the plurality of entries an entry that matches a condition specifying the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute that satisfies the predetermined condition.
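As a hypothetical illustration of the entry retrieval just described, the following sketch returns the second-attribute value associated with a first attribute that matches a specified condition; the data, the condition, and the function name are invented for this sketch and are not taken from the specification.

```python
# Hypothetical illustration of looking up the second attribute associated with
# a first attribute that satisfies a specified condition.

entries = [
    {"first": "lens_a", "second": 1160},
    {"first": "lens_b", "second": 1440},
    {"first": "lens_c", "second": 1720},
]

def lookup_second(entries, condition):
    for entry in entries:                  # search the stored entries
        if condition(entry["first"]):      # condition on the first attribute
            return entry["second"]         # associated second-attribute value
    return None

print(lookup_second(entries, lambda name: name == "lens_b"))  # 1440
```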
The programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200. A recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can also be used as a computer-readable storage medium, so that the programs can be provided to the computer 1200 via the network.
The present invention has been described above using embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It is apparent to those skilled in the art that various changes or improvements can be made to the above embodiments. It is apparent from the description of the claims that embodiments to which such changes or improvements are made can also be included in the technical scope of the present invention.
It should be noted that the order of execution of the operations, procedures, steps, stages, and the like of the apparatuses, systems, programs, and methods shown in the claims, the specification, and the drawings may be realized in any order, as long as "before", "prior to", or the like is not explicitly indicated and as long as the output of a preceding process is not used in a subsequent process. Even where the operational flows in the claims, the specification, and the drawings are described using "first", "next", and the like for convenience, this does not mean that they must be carried out in this order.
[Symbol Description]
10 UAV
20 UAV body
50 Gimbal
60 Imaging device
100 Imaging device
102 Imaging unit
110 Imaging control unit
120 Image sensor
130 Memory
152 Lens drive unit
160 Display unit
162 Instruction unit
200 Lens unit
210 Lens
212 Lens drive unit
220 Lens control unit
222 Memory
300 Remote operation device
500 Curve
502 Minimum point
510 Subject
512 Circle of confusion
610, 620, 630, 650, 660, 670 Graphs
710 Graph
711 Data point
712 Data point
720 Graph
722 Straight line
800 Focus control process
850 DFD process
1200 Computer
1210 Host controller
1212 CPU
1214 RAM
1220 Input/output controller
1222 Communication interface
1230 ROM

Claims (13)

  1. An apparatus, characterized in that it comprises a circuit, the circuit being configured to: acquire a blur amount of a first image included in a first captured image taken in a state where an imaging surface of an imaging device and a focus lens are in a first positional relationship, and a blur amount of a second image included in a second captured image taken in a state where the imaging surface and the focus lens are in a second positional relationship;
    acquire, based on the blur amount of the first image and the blur amount of the second image, a first predicted value representing the positional relationship between the imaging surface and the focus lens when a subject is in focus;
    acquire a blur amount of a third image included in a third captured image taken in a state where the imaging surface and the focus lens are in a third positional relationship;
    acquire, based on the blur amount of the second image and the blur amount of the third image, a second predicted value representing the positional relationship between the imaging surface and the focus lens when the subject is in focus; and
    determine, based on a difference between the first predicted value and the second predicted value, a target value representing the positional relationship between the imaging surface and the focus lens when the subject is in focus.
  2. The apparatus according to claim 1, wherein
    the first positional relationship, the second positional relationship, and the third positional relationship each represent a position of the focus lens relative to the imaging surface,
    the first predicted value represents a first target position of the focus lens relative to the imaging surface when the subject is in focus,
    the second predicted value represents a second target position of the focus lens relative to the imaging surface when the subject is in focus, and
    the circuit is configured to determine, using a difference between the first target position and the second target position, the target value representing a target position of the focus lens relative to the imaging surface when the subject is in focus.
  3. The apparatus according to claim 2, wherein
    with a position of the focus lens determined from at least one of the first positional relationship and the second positional relationship set as a first reference position, and a position of the focus lens determined from at least one of the second positional relationship and the third positional relationship set as a second reference position, the circuit is configured to determine the target value representing the target position of the focus lens relative to the imaging surface when the subject is in focus, using the difference between the first target position and the second target position relative to a difference between the first reference position and the second reference position.
  4. The apparatus according to claim 3, wherein the circuit is configured to:
    calculate, as change information, a ratio of the difference between the first target position and the second target position to the difference between the first reference position and the second reference position,
    calculate a correction value of the second target position by dividing the second target position by the ratio, and
    calculate, as the target value, a position indicated by the correction value relative to the second reference position.
  5. The apparatus according to any one of claims 1 to 4, wherein
    the circuit is configured to execute focus control of the imaging device based on the target value.
  6. The apparatus according to claim 5, wherein the circuit is configured to:
    acquire the blur amount of the first image and the blur amount of the second image;
    acquire the first predicted value based on the blur amount of the first image and the blur amount of the second image;
    when a reliability value of the first predicted value is greater than or equal to a preset first value, execute the focus control of the imaging device based on the first predicted value; and
    when the reliability value of the first predicted value is lower than the preset first value, acquire the blur amount of the third image, acquire the second predicted value based on the blur amount of the second image and the blur amount of the third image, and execute the focus control of the imaging device based on the target value.
  7. The apparatus according to claim 6, wherein the circuit is configured to:
    when the reliability value of the first predicted value is lower than a second value that is lower than the first value, neither acquire the third image nor perform the focus control of the imaging device based on the first predicted value; and
    when the reliability value of the first predicted value is lower than the first value and greater than or equal to the second value, acquire the blur amount of the third image, acquire the second predicted value based on the blur amount of the second image and the blur amount of the third image, and execute the focus control of the imaging device based on the target value.
  8. An apparatus, characterized in that it comprises a circuit, the circuit being configured to: acquire a blur amount of a first image included in a first captured image taken in a state where an imaging surface of an imaging device and a focus lens are in a first positional relationship, and a blur amount of a second image included in a second captured image taken in a state where the imaging surface and the focus lens are in a second positional relationship;
    acquire, based on the blur amount of the first image and the blur amount of the second image, a first predicted value representing the positional relationship between the imaging surface and the focus lens when a subject is in focus;
    acquire a blur amount of a third image included in a third captured image taken in a state where the imaging surface and the focus lens are in a third positional relationship, and a blur amount of a fourth image included in a fourth captured image taken in a state where the imaging surface and the focus lens are in a fourth positional relationship;
    acquire, based on the blur amount of the third image and the blur amount of the fourth image, a second predicted value representing the positional relationship between the imaging surface and the focus lens when the subject is in focus; and
    determine, based on a difference between the first predicted value and the second predicted value, a target value representing the positional relationship between the imaging surface and the focus lens when the subject is in focus.
  9. An imaging device, characterized by comprising: the apparatus according to any one of claims 1 to 4 and 8; and
    an image sensor having the imaging surface.
  10. An imaging system, characterized by comprising: the imaging device according to claim 9; and
    a support mechanism that supports the imaging device so that an attitude of the imaging device can be controlled.
  11. A movable body, characterized in that it carries the imaging device according to claim 9 and moves.
  12. A method, characterized by comprising: acquiring a blur amount of a first image included in a first captured image taken in a state where an imaging surface of an imaging device and a focus lens are in a first positional relationship, and a blur amount of a second image included in a second captured image taken in a state where the imaging surface and the focus lens are in a second positional relationship;
    acquiring, based on the blur amount of the first image and the blur amount of the second image, a first predicted value representing the positional relationship between the imaging surface and the focus lens when a subject is in focus;
    acquiring a blur amount of a third image included in a third captured image taken in a state where the imaging surface of the imaging device and the focus lens are in a third positional relationship;
    acquiring, based on the blur amount of the second image and the blur amount of the third image, a second predicted value representing the positional relationship between the imaging surface and the focus lens when the subject is in focus; and
    determining, based on a difference between the first predicted value and the second predicted value, a target value representing the positional relationship between the imaging surface and the focus lens when the subject is in focus.
  13. A program for causing a computer to execute the method according to claim 12.
