CN112335227A - Control device, imaging system, control method, and program - Google Patents

Control device, imaging system, control method, and program

Info

Publication number
CN112335227A
CN112335227A (application CN202080003361.5A)
Authority
CN
China
Prior art keywords
image
image pickup
positional relationship
focus lens
focus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202080003361.5A
Other languages
Chinese (zh)
Inventor
高宫诚
永山佳范
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2019151539A (published as JP2021032990A)
Application filed by SZ DJI Technology Co Ltd
Publication of CN112335227A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H04N23/671 Focus control based on electronic image sensor signals in combination with active ranging signals, e.g. using light or sound signals emitted toward objects
    • H04N23/673 Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

A distance measuring sensor that measures the distance to an object from the reflected light of a light pulse may receive external light other than that reflected light, such as sunlight, and may therefore be unable to measure the distance to the object accurately. The control device includes a circuit configured to: when the signal satisfies a first condition indicating the reliability of the distance, execute focus control for focusing on the object based on a first target positional relationship between the image pickup surface of the image pickup device and the focus lens, determined according to the distance; and, when the signal does not satisfy the first condition, execute focus control based on a second target positional relationship between the image pickup surface and the focus lens, determined from the respective blur amounts of a first image captured by the image pickup device when the image pickup surface and the focus lens are in a first positional relationship and a second image captured when they are in a second positional relationship.

Description

Control device, imaging system, control method, and program
[ Technical Field ]
The invention relates to a control device, an imaging system, a control method, and a program.
[ background of the invention ]
Patent document 1 describes measuring a distance to a target object from reflected light of a light pulse.
[ Prior art documents ]
[ patent document ]
[ patent document 1] Japanese patent application laid-open No. 2006-79074
[ summary of the invention ]
[ technical problem to be solved by the invention ]
A distance measuring sensor that measures the distance to an object from the reflected light of a light pulse may receive external light other than that reflected light, such as sunlight, and may therefore be unable to measure the distance to the object accurately.
[ Means for Solving the Problems ]
The control device according to one aspect of the present invention may be a control device that controls an imaging device including a distance measuring sensor and a focus lens. The distance measuring sensor includes a light emitting element that emits pulsed light and a light receiving element that receives light including reflected light of the pulsed light from an object and outputs a signal corresponding to the amount of received light, and measures the distance to the object based on the signal. The control device may include a circuit configured to: when the signal satisfies a first condition indicating the reliability of the distance, execute focus control for focusing on the object based on a first target positional relationship between the image pickup surface of the imaging device and the focus lens, determined according to the distance. The circuit may be configured to: when the signal does not satisfy the first condition, execute focus control based on a second target positional relationship between the image pickup surface and the focus lens, determined from the respective blur amounts of a first image captured by the imaging device when the image pickup surface and the focus lens are in a first positional relationship and a second image captured when they are in a second positional relationship.
The circuit may be configured to: when a first degree of difference between the first image and the second image satisfies a second condition indicating the reliability of the second target positional relationship, perform focus control in accordance with the second target positional relationship.
The circuit may be configured to: when the first degree of difference does not satisfy the second condition, perform focus control by contrast AF.
The circuit may be configured to: when the first degree of difference does not satisfy the second condition, change the positional relationship between the image pickup surface and the focus lens from the second positional relationship to a third positional relationship; when a second degree of difference, between a third image captured by the imaging device with the image pickup surface and the focus lens in the third positional relationship and the first or second image, satisfies the second condition, perform focus control based on a third target positional relationship between the image pickup surface and the focus lens determined from the respective blur amounts of the third image and the first or second image; and, when the second degree of difference does not satisfy the second condition, perform focus control by contrast AF.
The signal may satisfy the first condition when the amount of light it indicates is included in a preset range.
The circuit may be configured to: when the signal does not satisfy the first condition, acquire the first image, captured by the imaging device with the image pickup surface and the focus lens in the first positional relationship, then change the positional relationship between the image pickup surface and the focus lens to the second positional relationship and acquire the second image captured in that state.
The circuit may be configured to: when the signal does not satisfy the first condition, acquire the first image captured with the image pickup surface and the focus lens in the first positional relationship, and then move the focus lens by a preset distance in a direction determined according to the first target positional relationship, thereby bringing the image pickup surface and the focus lens into the second positional relationship.
An imaging system according to an aspect of the present invention may include the control device, the distance measuring sensor, and the imaging device.
A control method according to one aspect of the present invention may be a method of controlling an imaging device including a distance measuring sensor and a focus lens, the distance measuring sensor including a light emitting element that emits pulsed light and a light receiving element that receives light including reflected light of the pulsed light from an object and outputs a signal corresponding to the amount of received light, and measuring the distance to the object based on the signal. The control method may include: when the signal satisfies a first condition indicating the reliability of the distance, executing focus control for focusing on the object based on a first target positional relationship between the image pickup surface of the imaging device and the focus lens, determined according to the distance. The control method may include: when the signal does not satisfy the first condition, executing focus control based on a second target positional relationship between the image pickup surface and the focus lens, determined from the respective blur amounts of a first image captured when the image pickup surface and the focus lens are in a first positional relationship and a second image captured when they are in a second positional relationship.
The program according to one aspect of the present invention may be a program for causing a computer to function as the control device.
According to an aspect of the present invention, unnecessary driving of the focus lens can be avoided and focus control can be performed in a more suitable manner.
The above summary does not list all the necessary features of the present invention; sub-combinations of these features may also constitute the invention.
[ description of the drawings ]
Fig. 1 is an example of an external perspective view of a camera system.
Fig. 2 is an example of an external perspective view showing another form of the image pickup system.
Fig. 3 is a diagram showing functional blocks of the image pickup system.
Fig. 4 is a diagram showing one example of a curve representing the relationship between the blur amount and the lens position.
Fig. 5 is a diagram showing one example of a process of calculating the distance to the object based on the blur amount.
Fig. 6 is a diagram for explaining the relationship among the subject position, the lens position, and the focal length.
Fig. 7 is a flowchart showing one example of the AF processing procedure of the image pickup apparatus.
Fig. 8 is a diagram showing a relationship between the TOF operation result and the focus lens position.
Fig. 9 is a diagram illustrating one example of the action of the focus lens when AF is performed.
Fig. 10 is a diagram illustrating an example of the action of the focus lens when AF is performed.
Fig. 11 is a diagram illustrating an example of the action of the focus lens when AF is performed.
Fig. 12 is a diagram illustrating an example of the action of the focus lens when AF is performed.
Fig. 13 is a diagram showing an example of the external appearance of the unmanned aerial vehicle and the remote operation device.
Fig. 14 is a diagram showing an example of the hardware configuration.
[ Detailed Description ]
The present invention will be described below with reference to embodiments, but the following embodiments do not limit the invention as claimed. Not all combinations of features described in the embodiments are necessarily essential to the solution of the invention. It will be apparent to those skilled in the art that various changes and modifications can be made to the following embodiments, and it is apparent from the claims that modes incorporating such changes or improvements are included in the technical scope of the present invention.
The claims, specification, drawings, and abstract contain matter subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of these documents as they appear in the patent office files or records, but otherwise reserves all copyright rights.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is performed or (2) a "section" of a device having a role in performing the operation. Certain stages and "sections" may be implemented by dedicated circuitry, by programmable circuitry supplied with computer-readable instructions, and/or by a processor supplied with computer-readable instructions. Dedicated circuitry may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuitry may include reconfigurable hardware circuits comprising logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as memory elements such as flip-flops, registers, field programmable gate arrays (FPGAs), and programmable logic arrays (PLAs).
A computer-readable medium may comprise any tangible device that can store instructions for execution by a suitable device. Consequently, a computer-readable medium having instructions stored thereon constitutes an article of manufacture that includes instructions which can be executed to create means for implementing the operations specified in the flowchart or block diagrams. Examples of the computer-readable medium may include an electronic storage medium, a magnetic storage medium, an optical storage medium, an electromagnetic storage medium, a semiconductor storage medium, and the like. More specific examples may include a floppy (registered trademark) disk, a flexible disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an electrically erasable programmable read-only memory (EEPROM), a static random access memory (SRAM), a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a Blu-ray (registered trademark) disc, a memory stick, an integrated circuit card, and the like.
Computer-readable instructions may include either source code or object code described in any combination of one or more programming languages, including conventional procedural programming languages. The source or object code may be assembly instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, an object-oriented programming language such as Smalltalk (registered trademark), JAVA (registered trademark), or C++, or a conventional procedural programming language such as the "C" programming language or similar programming languages. The computer-readable instructions may be provided to a processor or programmable circuitry of a general-purpose computer, a special-purpose computer, or another programmable data processing apparatus, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet. The processor or programmable circuitry may execute the computer-readable instructions to create means for implementing the operations specified in the flowchart or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
Fig. 1 is an example of an external perspective view of an imaging system 10 according to the present embodiment. The imaging system 10 includes an imaging device 100, a support mechanism 200, and a grip 300. The image pickup apparatus 100 includes a TOF sensor 160. The support mechanism 200 supports the imaging apparatus 100 rotatably about a roll axis, a pitch axis, and a yaw axis, respectively, using actuators. The support mechanism 200 can change or maintain the attitude of the imaging apparatus 100 by rotating the imaging apparatus 100 about at least one of the roll axis, the pitch axis, and the yaw axis. The support mechanism 200 includes a roll axis drive mechanism 201, a pitch axis drive mechanism 202, and a yaw axis drive mechanism 203. The support mechanism 200 further comprises a base 204 to which the yaw axis drive mechanism 203 is fixed. The grip 300 is fixed to the base 204. The grip 300 includes an operation interface 301 and a display 302. The imaging apparatus 100 is fixed to the pitch axis drive mechanism 202.
The operation interface 301 receives instructions from the user for operating the imaging device 100 and the support mechanism 200. The operation interface 301 may include a shutter/recording button that instructs the imaging device 100 to shoot or record, and a power/function button that instructs the imaging system 10 to power on or off and the imaging device 100 to switch between the still image and moving image shooting modes.
The display section 302 can display an image captured by the image capturing apparatus 100. The display unit 302 can display a menu screen for operating the image pickup apparatus 100 and the support mechanism 200. The display unit 302 may be a touch panel display that receives instructions for operating the imaging apparatus 100 and the support mechanism 200.
Fig. 2 is an example of an external perspective view showing another form of the imaging system 10. As shown in fig. 2, the imaging system 10 can be used with a mobile terminal that includes a display, such as a smartphone 400, fixed to the side of the grip 300. The user holds the grip 300 and captures still images or moving images with the imaging device 100; the display of the smartphone 400 or similar terminal shows the captured still image or moving image.
Fig. 3 is a diagram showing functional blocks of the image pickup system 10. The imaging apparatus 100 includes an imaging control section 110, an image sensor 120, a memory 130, a lens control section 150, a lens driving section 152, a plurality of lenses 154, and a TOF sensor 160.
The image sensor 120 may be formed of a CCD or a CMOS. The image sensor 120 is one example of an image sensor for photographing. The image sensor 120 outputs image data of the optical image formed by the plurality of lenses 154 to the image pickup control section 110. The imaging control unit 110 may be configured by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.
In accordance with operation commands for the imaging device 100 from the grip 300, the imaging control unit 110 generates image data by performing demosaicing processing on the image signal output from the image sensor 120. The imaging control unit 110 stores the image data in the memory 130. The imaging control unit 110 also controls the TOF sensor 160, and is an example of a circuit. The TOF sensor 160 is a time-of-flight sensor that measures the distance to a target object. The imaging device 100 performs focus control by adjusting the position of the focus lens based on the distance measured by the TOF sensor 160.
The memory 130 may be a computer-readable storage medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, USB memory, and other flash memories. The memory 130 stores the programs and other data necessary for the imaging control unit 110 to control the image sensor 120 and other components. The memory 130 may be provided inside the housing of the imaging device 100. The grip 300 may include separate memory for storing image data captured by the imaging device 100, and may have a slot for a memory that is detachable from the housing of the grip 300.
The plurality of lenses 154 may function as a zoom lens, a varifocal lens, and a focus lens. At least some or all of the plurality of lenses 154 are configured to be movable along the optical axis. The lens control unit 150 drives the lens driving unit 152 in accordance with lens control commands from the imaging control unit 110, such as zoom control commands and focus control commands, to move one or more lenses 154 in the optical axis direction. The lens driving unit 152 may include a voice coil motor (VCM) that moves at least some or all of the plurality of lenses 154 in the optical axis direction, and may include a motor such as a DC motor, a coreless motor, or an ultrasonic motor. The lens driving unit 152 can transmit power from the motor to at least some or all of the plurality of lenses 154 via mechanical members such as a cam ring or a guide shaft, moving them along the optical axis.
The imaging apparatus 100 further includes an attitude control unit 210, an angular velocity sensor 212, and an acceleration sensor 214. The angular velocity sensor 212 detects the angular velocity of the image pickup apparatus 100. The angular velocity sensor 212 detects the respective angular velocities of the image pickup apparatus 100 around the roll axis, the pitch axis, and the yaw axis. The attitude control unit 210 acquires angular velocity information on the angular velocity of the imaging apparatus 100 from the angular velocity sensor 212. The angular velocity information may show respective angular velocities of the image pickup apparatus 100 about the roll axis, the pitch axis, and the yaw axis. The attitude control unit 210 acquires acceleration information on the acceleration of the image pickup apparatus 100 from the acceleration sensor 214. The acceleration information may also show the acceleration of the imaging apparatus 100 in each direction of the roll axis, pitch axis, and yaw axis.
The angular velocity sensor 212 and the acceleration sensor 214 may be provided in a housing that accommodates the image sensor 120, the lens 154, and the like. In the present embodiment, a description will be given of a configuration in which the imaging device 100 and the support mechanism 200 are integrated. However, the support mechanism 200 may include a base that removably fixes the image pickup apparatus 100. In this case, the angular velocity sensor 212 and the acceleration sensor 214 may be provided outside the housing of the image pickup apparatus 100 such as a base.
The attitude control section 210 controls the support mechanism 200 based on the angular velocity information and the acceleration information to maintain or change the attitude of the image pickup apparatus 100. The attitude control section 210 controls the support mechanism 200 to maintain or change the attitude of the image pickup apparatus 100 in accordance with the operation mode of the support mechanism 200 for controlling the attitude of the image pickup apparatus 100.
The operation modes include: a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 of the support mechanism 200 is operated so that the attitude change of the imaging device 100 follows the attitude change of the base 204 of the support mechanism 200; a mode in which each of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 is operated to that end; a mode in which each of the pitch axis drive mechanism 202 and the yaw axis drive mechanism 203 is operated to that end; and a mode in which only the yaw axis drive mechanism 203 is operated so that the attitude change of the imaging device 100 follows the attitude change of the base 204.
The operation modes may include: an FPV (First Person View) mode in which the support mechanism 200 is operated so that the attitude change of the image pickup apparatus 100 follows the attitude change of the base 204 of the support mechanism 200; and a fixed mode in which the support mechanism 200 is operated to maintain the posture of the image pickup apparatus 100.
The FPV mode is a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 is operated in order to make the attitude change of the image pickup apparatus 100 follow the attitude change of the base 204 of the support mechanism 200. The fixed mode is a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 is operated to maintain the current posture of the imaging apparatus 100.
The TOF sensor 160 includes a light emitting unit 162, a light receiving unit 164, a light emission control unit 166, a light receiving control unit 167, and a memory 168. The TOF sensor 160 is one example of a ranging sensor.
The light emitting unit 162 includes at least one light emitting element 163. The light emitting element 163 is a device that repeatedly emits pulsed light modulated at high speed, such as an LED or laser. The light emitting element 163 may emit pulsed light that is infrared light. The light emission control unit 166 controls light emission of the light emitting element 163. The light emission control section 166 may control the pulse width of the pulsed light emitted from the light emitting element 163.
The light receiving unit 164 includes a plurality of light receiving elements 165 that measure distances to the object associated with each of a plurality of regions. The light receiving unit 164 is an example of an image sensor for ranging, and each light receiving element 165 corresponds to one of the plurality of regions. Each light receiving element 165 repeatedly receives light including the reflected light of the pulsed light from the object and outputs a signal corresponding to the amount of received light. The light reception control unit 167 controls the light receiving elements 165 and measures the distance to the object associated with each region based on the signals they output, that is, based on the amount of reflected light repeatedly received by each light receiving element 165 within a preset light reception period. The light reception control unit 167 may measure the distance to the object by determining the phase difference between the pulsed light and the reflected light from that amount of received light. Alternatively, the light receiving unit 164 can measure the distance to the object by reading the frequency change of the reflected wave; this is called the FMCW (Frequency Modulated Continuous Wave) method.
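As an illustration of the phase-difference measurement just described, the following is a minimal sketch of one common indirect-TOF computation. The four-bucket sampling scheme, the function and parameter names, and the modulation frequency are assumptions for illustration, not details taken from this patent.

```python
import math

C_LIGHT = 299_792_458.0  # speed of light, m/s


def tof_distance(q0: float, q90: float, q180: float, q270: float,
                 f_mod: float) -> float:
    """Estimate the object distance from four phase-shifted charge samples.

    q0..q270 are amounts of received light accumulated in four receive
    windows offset by 0/90/180/270 degrees relative to the emitted pulse
    train (an assumed four-bucket scheme). f_mod is the pulse modulation
    frequency in Hz.
    """
    # Phase delay of the reflected light relative to the emitted pulses.
    phase = math.atan2(q90 - q270, q0 - q180) % (2.0 * math.pi)
    # The light travels to the object and back, hence the factor of two
    # folded into the 4*pi denominator.
    return (C_LIGHT * phase) / (4.0 * math.pi * f_mod)
```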
The memory 168 may be a computer-readable recording medium, and may include at least one of SRAM, DRAM, EPROM, and EEPROM. The memory 168 stores a program required for the light emission control section 166 to control the light emitting section 162, a program required for the light reception control section 167 to control the light receiving section 164, and the like.
An Autofocus (AF) method performed by the imaging apparatus 100 will be described. The image pickup apparatus 100 can control the positional relationship of the focus lens and the image pickup surface of the image sensor 120 by moving the focus lens in accordance with the distance from the image pickup apparatus 100 to the object (object distance) measured by the TOF sensor 160.
The image pickup control section 110 can perform contrast AF by deriving a contrast evaluation value of an image picked up by the image pickup apparatus 100 while moving the focus lens, and determining the position of the focus lens at which the contrast evaluation value reaches a peak. The image capture control section 110 may apply a contrast evaluation filter to the image captured by the image capture device 100, thereby deriving a contrast evaluation value of the image. The image pickup control section 110 may determine the position of the focus lens focused on a specified object based on the contrast evaluation value to perform contrast AF.
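The text does not specify the contrast evaluation filter. As a minimal sketch under that caveat, the function below uses the sum of squared responses of a discrete Laplacian, one common choice; the kernel and the names are assumptions.

```python
import numpy as np


def contrast_evaluation_value(image: np.ndarray) -> float:
    """Sum of squared responses of a 5-point discrete Laplacian.

    `image` is a 2-D grayscale array. A sharply focused image has strong
    high-frequency content, so this value peaks near the in-focus lens
    position.
    """
    img = image.astype(np.float64)
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(np.sum(lap ** 2))
```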
As another AF method, there is a method in which the in-focus position is determined from the blur amounts of a plurality of images captured, while moving the focus lens, in states where the positional relationship between the focus lens and the imaging surface of the image sensor 120 differs. AF using this method is referred to here as the Blur Detection Autofocus (BDAF) method. Specifically, in BDAF, AF is performed by means of a DFD (Depth From Defocus) operation.
For example, the blur amount (Cost) of an image can be expressed using a Gaussian function as in the following equation (1), where x is the pixel position in the horizontal direction and σ is the standard deviation.

[ Formula 1 ]

$$C(x)=\frac{1}{\sqrt{2\pi}\,\sigma}\exp\!\left(-\frac{x^{2}}{2\sigma^{2}}\right)\qquad(1)$$
Fig. 4 shows one example of a curve representing the relationship between the blur amount (Cost) of an image and the focus lens position. C1 is the blur amount of the image acquired with the focus lens at x1, and C2 is the blur amount of the image acquired with the focus lens at x2. The object can be brought into focus by moving the focus lens to the lens position x0 corresponding to the minimum point 502 of the curve 500, which is determined from the blur amounts C1 and C2 in consideration of the optical characteristics of the lens 154.
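As a worked illustration of locating the minimum point 502 from only two samples, the sketch below assumes the cost curve of fig. 4 is locally a parabola C(x) = a(x - x0)^2 + c whose curvature a is known in advance from the optical characteristics of the lens 154. The parabolic model and the names are assumptions, not the patent's stated curve.

```python
def minimum_position(x1: float, c1: float, x2: float, c2: float,
                     a: float) -> float:
    """Lens position x0 at the minimum of an assumed parabolic blur-cost
    curve C(x) = a*(x - x0)**2 + c, given two samples (x1, c1), (x2, c2).

    Subtracting the two samples eliminates c:
        c1 - c2 = a*(x1 - x2)*(x1 + x2 - 2*x0)
    which can be solved for x0.
    """
    if x1 == x2:
        raise ValueError("the two lens positions must differ")
    return 0.5 * (x1 + x2) - (c1 - c2) / (2.0 * a * (x1 - x2))
```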
Fig. 5 is a flowchart showing one example of a distance calculation process in the BDAF mode. The imaging control unit 110 captures a first image and stores the first image in the memory 130 in a state where the lens 154 and the imaging surface of the image sensor 120 are in the first positional relationship. The imaging control unit 110 moves the lens 154 in the optical axis direction, so that the lens 154 and the imaging surface are in the second positional relationship, and a second image is captured by the imaging device 100 and stored in the memory 130 (S201). For example, the imaging control unit 110 changes the positional relationship between the lens 154 and the imaging surface from the first positional relationship to the second positional relationship by moving the focus lens in the optical axis direction. The movement amount of the lens may be, for example, about 10 μm.
Then, the imaging control unit 110 divides the first image into a plurality of regions (S202). The imaging control unit 110 may calculate a feature amount for each pixel in the first image, and divide the first image into a plurality of regions with a pixel group having a similar feature amount as one region. The imaging control unit 110 may divide a pixel group set as a range of the AF processing frame in the first image into a plurality of regions. The imaging control unit 110 divides the second image into a plurality of regions corresponding to the plurality of regions of the first image. The imaging control unit 110 calculates a distance to the object corresponding to the object included in each of the plurality of regions based on the blur amount of each of the plurality of regions of the first image and the blur amount of each of the plurality of regions of the second image (S203).
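A minimal sketch of the S201 to S203 flow might look as follows. A fixed grid stands in for the feature-based segmentation described above, and the per-region DFD operation is supplied by the caller; both simplifications, and all names, are assumptions.

```python
from typing import Callable

import numpy as np


def bdaf_distances(first: np.ndarray, second: np.ndarray,
                   dfd_distance: Callable[[np.ndarray, np.ndarray], float],
                   grid: int = 4) -> np.ndarray:
    """Split two differently blurred frames into a grid of regions and
    estimate an object distance for each region pair.

    `first` and `second` are 2-D grayscale arrays captured in the first
    and second positional relationships; `dfd_distance` is a stand-in
    for the DFD operation on one region pair.
    """
    h, w = first.shape
    rh, rw = h // grid, w // grid
    out = np.empty((grid, grid))
    for i in range(grid):
        for j in range(grid):
            r1 = first[i * rh:(i + 1) * rh, j * rw:(j + 1) * rw]
            r2 = second[i * rh:(i + 1) * rh, j * rw:(j + 1) * rw]
            out[i, j] = dfd_distance(r1, r2)
    return out
```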
The method of changing the positional relationship between the lens 154 and the imaging surface of the image sensor 120 is not limited to the method of moving the focus lens provided in the lens 154. For example, the imaging control unit 110 may move the entire lens 154 in the optical axis direction. The imaging control unit 110 can move the imaging surface of the image sensor 120 in the optical axis direction. The imaging control unit 110 may move at least a part of the lenses included in the lens 154 and the imaging surface of the image sensor 120 in the optical axis direction. The imaging control section 110 may adopt any method for optically changing the relative positional relationship of the focal point of the lens 154 and the imaging surface of the image sensor 120.
The calculation process of the object distance will be further explained with reference to fig. 6. Let A be the distance from the principal point of the lens L to the object 510 (object plane), B the distance from the principal point of the lens L to the position where light from the object 510 forms an image (image plane), and F the focal length of the lens L. The relationship among the distance A, the distance B, and the focal length F is then given by the lens formula as the following equation (2).
[ Formula 2 ]

$$\frac{1}{A}+\frac{1}{B}=\frac{1}{F}\qquad(2)$$
The focal length F is determined by the positions of the lenses included in the lens L. Therefore, if the distance B at which light from the object 510 forms an image can be determined, the distance A from the principal point of the lens L to the object 510 can be determined using equation (2).
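For example, rearranging equation (2) gives A = 1/(1/F - 1/B). A minimal sketch with illustrative numbers (the values are assumptions, not from the patent):

```python
def object_distance(b: float, f: float) -> float:
    """Distance A from the lens principal point to the object, from the
    lens formula 1/A + 1/B = 1/F (equation (2)).

    `b` is the image distance B and `f` the focal length F, in the same
    unit (e.g., millimetres).
    """
    if b <= f:
        raise ValueError("B must exceed F for a real object")
    return 1.0 / (1.0 / f - 1.0 / b)


# With F = 50 mm and B = 52 mm: A = 1/(1/50 - 1/52) = 1300 mm.
print(object_distance(b=52.0, f=50.0))
```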
Here, suppose the positional relationship between the lens L and the image pickup surface is changed by moving the image pickup surface of the image sensor toward the lens L. As shown in fig. 6, if the image pickup surface is at the position at distance D1 from the principal point of the lens L, or at the position at distance D2, the image of the object 510 projected onto the image pickup surface is blurred. The position at which the object 510 forms an image can be calculated from the sizes of the blur (circles of confusion 512 and 514) of the images projected onto the image pickup surface, which determines the distance B and, in turn, the distance A. That is, since the size of the blur (the blur amount) is proportional to the distance between the image pickup surface and the imaging position, the imaging position can be determined from the difference in blur amounts.
Here, let I1 be the image captured at the position at distance D1 and I2 the image captured at the position at distance D2; both images are blurred. For image I1, let PSF1 be the point spread function and Id1 the object image; I1 is then given by the following convolution, equation (3).

[ Formula 3 ]

$$I_{1}=\mathrm{PSF}_{1}*I_{d1}\qquad(3)$$
Image I2 can likewise be expressed as a convolution with PSF2. Let f be the Fourier transform of the object image, and let OTF1 and OTF2 be the optical transfer functions obtained by Fourier-transforming PSF1 and PSF2. Their ratio is given by the following equation (4).

[ Formula 4 ]

$$\frac{\mathrm{OTF}_{2}\cdot f}{\mathrm{OTF}_{1}\cdot f}=\frac{\mathrm{OTF}_{2}}{\mathrm{OTF}_{1}}=C\qquad(4)$$
The value C in equation (4) represents the change in blur amount between the image captured at distance D1 from the principal point of the lens L and the image captured at distance D2; that is, it is the difference between the blur amounts of the two images.
Fig. 6 illustrates the case where the positional relationship between the lens L and the image pickup surface is changed by moving the image pickup surface toward the lens L; the blur amount also changes when the positional relationship between the focal position of the lens L and the image pickup surface is changed by moving the focus lens relative to the image pickup surface. In the present embodiment, images with different blur amounts are acquired mainly by moving the focus lens relative to the image pickup surface; a DFD operation on the acquired images yields a DFD calculation value indicating the defocus amount, and from this value a position target value of the focus lens for focusing on the object is calculated as the target positional relationship between the image pickup surface and the focus lens.
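As an illustration of forming the ratio in equation (4) directly from the two captured images: since the object spectrum f cancels, dividing the Fourier transforms of I2 and I1 yields C per spatial frequency. The regularized division below is a practical assumption to avoid dividing by near-zero spectral values.

```python
import numpy as np


def blur_change_ratio(i1: np.ndarray, i2: np.ndarray,
                      eps: float = 1e-6) -> np.ndarray:
    """Per-frequency ratio OTF2/OTF1 = C from equation (4).

    Since I1 = PSF1 * Id1 and I2 = PSF2 * Id1, the Fourier transforms
    satisfy F(I2)/F(I1) = OTF2/OTF1; the unknown object image cancels.
    """
    f1 = np.fft.fft2(i1.astype(np.float64))
    f2 = np.fft.fft2(i2.astype(np.float64))
    # Regularized division: multiply by the conjugate and normalize.
    return f2 * np.conj(f1) / (np.abs(f1) ** 2 + eps)
```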
As described above, the image pickup apparatus 100 performs AF in any one of AF (TOF method) based on ranging by the TOF sensor 160, AF (DFD method) based on DFD operation, and AF (contrast method) based on a contrast evaluation value.
According to the TOF method, the imaging device 100 can derive the target value of the focus lens position for focusing on the object from a single image, and it can do so accurately even when the captured image has low brightness and low contrast. However, in environments that sunlight readily enters, external light other than the reflected pulsed light may enter the TOF sensor 160, so the target value may not be derived accurately.
According to the DFD method, the imaging apparatus 100 can derive a target value of the position of the focus lens for focusing on the subject from two or more images captured at different positions of the focus lens. However, when the imaging apparatus 100 captures an image with a low contrast such as a cloud, the difference between the blur amounts of two or more images may be small, and the target value may not be accurately derived.
According to the contrast method, the imaging device 100 needs to capture a plurality of images to determine the position of the focus lens at which the contrast evaluation value peaks, so deriving the target value can take a long time. On the other hand, the method is robust against disturbances such as temperature-induced mechanical changes and changes in optical components such as lenses, so the accuracy of the target value is high.
As described above, the optimal AF method differs depending on the environment of the subject imaged by the imaging device 100. Further, performing AF with as little focus lens movement as possible avoids generating image blur and the like during AF processing and reduces discomfort to the user. Therefore, according to the present embodiment, the imaging device 100 preferentially executes the AF method that avoids unnecessary movement of the focus lens as far as possible. More specifically, the imaging device 100 performs AF by the TOF method when the reliability of the object distance calculated by the TOF method is high, and by the DFD method when it is not. If the reliability is insufficient in both the TOF and DFD methods, AF is performed by the contrast method.
The imaging control unit 110 determines whether or not a signal corresponding to the amount of light output from the light receiving element 165 satisfies a first condition indicating the reliability of the object distance. When the signal shows that the light amount is included in the preset range, the image pickup control section 110 determines that the signal satisfies the first condition. When the signal continuously shows that the light amount is included in the preset range for a preset period, the image pickup control section 110 may determine that the signal satisfies the first condition. The imaging control unit 110 may determine that the signal satisfies the first condition when the amplitude of the light received by the light receiving element 165 indicated by the signal is equal to or higher than the lower threshold and the value of the signal after a/D conversion is equal to or lower than the upper threshold. The imaging control unit 110 may determine that the signal satisfies the first condition when the amplitude of the light received by the light receiving element 165 indicated by the signal is equal to or greater than a threshold value and the a/D converted value of the signal is not saturated. That is, when the light received by the light receiving element 165 is not too weak and not too strong with respect to the pulsed light emitted from the light emitting element 163, the imaging control unit 110 determines that the signal satisfies the first condition.
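A minimal sketch of the first-condition test described above; the threshold values and the 12-bit ADC range are assumptions for illustration.

```python
def satisfies_first_condition(amplitude: float, adc_value: int,
                              amp_lower: float = 50.0,
                              adc_max: int = 4095) -> bool:
    """Reliability check on the TOF signal (the 'first condition').

    The received-light amplitude must not be too weak (at or above a
    lower threshold) and the A/D converted value must not be saturated.
    """
    strong_enough = amplitude >= amp_lower
    not_saturated = adc_value < adc_max
    return strong_enough and not_saturated
```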
When the signal satisfies the first condition, the imaging control unit 110 performs focus control for focusing on the object based on the first target positional relationship between the image pickup surface of the imaging device 100 and the focus lens, determined from the distance to the object measured by the TOF sensor 160. When the signal does not satisfy the first condition, the imaging control unit 110 performs focus control based on the second target positional relationship between the image pickup surface and the focus lens, determined from the respective blur amounts of the first image, captured by the imaging device 100 with the image pickup surface and the focus lens in the first positional relationship, and the second image, captured with them in the second positional relationship. That is, when the signal does not satisfy the first condition, the imaging control unit 110 performs focus control in the DFD manner.
When the signal does not satisfy the first condition, the imaging control unit 110 may acquire the first image, captured by the imaging device 100 with the image pickup surface and the focus lens in the first positional relationship, then change the positional relationship to the second positional relationship and acquire the second image captured in that state.
When the signal does not satisfy the first condition, the imaging control unit 110 may, after acquiring the first image captured in the first positional relationship, move the focus lens by a preset distance in a direction determined from the first target positional relationship, thereby bringing the image pickup surface and the focus lens into the second positional relationship. Even when the accuracy of the object distance calculated by the TOF method is low, the imaging control unit 110 can often still correctly determine from that distance the direction in which the focus lens should move. Therefore, when the signal does not satisfy the first condition, the imaging control unit 110 can determine the moving direction of the focus lens from the object distance calculated by the TOF method.
Before performing focus control in the DFD manner, the imaging control unit 110 may determine whether a first degree of difference between the first image and the second image satisfies a second condition indicating the reliability of the second target positional relationship. It may make this determination based on the similarity between the two images: when the similarity is below a preset threshold, when the difference between the contrasts of the two images exceeds a preset threshold, or when the difference between their blur amounts exceeds a preset threshold, the imaging control unit 110 may determine that the first degree of difference satisfies the second condition. When the first degree of difference satisfies the second condition, the imaging control unit 110 may perform focus control for focusing on the object based on the second target positional relationship derived by the DFD method.
When the first difference degree does not satisfy the second condition, the imaging control section 110 may execute focus control for focusing on the object by contrast AF.
When the first degree of difference does not satisfy the second condition, the imaging control unit 110 may further shift the positional relationship between the image pickup surface and the focus lens from the second positional relationship to a third positional relationship. Then, when a second degree of difference, between a third image captured by the imaging device 100 with the image pickup surface and the focus lens in the third positional relationship and the first or second image, satisfies the second condition, the imaging control unit 110 may execute focus control for focusing on the object based on a third target positional relationship between the image pickup surface and the focus lens determined from the respective blur amounts of the third image and the first or second image. When the second degree of difference does not satisfy the second condition, the imaging control unit 110 may perform focus control for focusing on the object by contrast AF.
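Putting the fallback chain together, the control flow might be sketched as follows. `camera` and every method on it are hypothetical placeholders for the operations named in the text, not an actual API.

```python
def autofocus(camera) -> None:
    """TOF -> DFD -> contrast fallback, as described above."""
    if camera.tof_signal_reliable():                   # first condition
        camera.drive_focus_to(camera.tof_in_focus_position())
        return
    first = camera.capture()                           # first image
    camera.nudge_focus_toward(camera.tof_in_focus_position())
    second = camera.capture()                          # second image
    if camera.difference_reliable(first, second):      # second condition
        camera.drive_focus_to(camera.dfd_in_focus_position(first, second))
        return
    camera.nudge_focus_toward(camera.tof_in_focus_position())
    third = camera.capture()                           # third image
    if camera.difference_reliable(second, third):
        camera.drive_focus_to(camera.dfd_in_focus_position(second, third))
        return
    camera.contrast_af()                               # final fallback
```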
Fig. 7 is a flowchart showing one example of the AF processing procedure of the imaging device 100. When AF processing starts, the imaging control unit 110 causes the imaging device 100 to capture a first image for DFD with the focus lens at the first position, and acquires that image (S100). The imaging control unit 110 then determines whether the reliability of the object distance measured by the TOF sensor 160 is high (S102). The imaging control unit 110 may determine that the reliability is high if the amount of light indicated by the signal output from the light receiving element 165 is within a preset range.
When the reliability of the object distance measured by the TOF sensor 160 is high, the imaging control unit 110 calculates the in-focus position, i.e., the position of the focus lens that focuses on the object, based on the TOF result (S104). The imaging control unit 110 can calculate the in-focus position by referring to data showing the relationship between the TOF calculation result and the focus lens position, as shown in fig. 8. The imaging control unit 110 then drives the focus lens to the in-focus position determined by the TOF method (S106). When the reliability of the TOF result is high, the imaging control unit 110 drives the focus lens from the AF start position to the in-focus position as shown in fig. 9.
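A minimal sketch of looking up the in-focus position from data of the kind shown in fig. 8, using linear interpolation; the table values are illustrative assumptions, not calibration data from the patent.

```python
import bisect

# Hypothetical fig. 8 style table: object distance in metres mapped to
# a focus lens position in arbitrary encoder units.
TOF_TABLE = [(0.5, 120.0), (1.0, 200.0), (2.0, 260.0),
             (5.0, 300.0), (10.0, 320.0)]


def in_focus_position(distance_m: float) -> float:
    """Linearly interpolate the table; clamp outside its range."""
    dists = [d for d, _ in TOF_TABLE]
    i = bisect.bisect_left(dists, distance_m)
    if i <= 0:
        return TOF_TABLE[0][1]
    if i >= len(TOF_TABLE):
        return TOF_TABLE[-1][1]
    (d0, p0), (d1, p1) = TOF_TABLE[i - 1], TOF_TABLE[i]
    t = (distance_m - d0) / (d1 - d0)
    return p0 + t * (p1 - p0)
```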
When the reliability of the TOF result is low, the imaging control unit 110 drives the focus lens by no more than one depth in the focusing direction, i.e., the direction toward the in-focus position calculated from the TOF result (S108). The imaging control unit 110 thereby moves the focus lens from the first position to the second position. One depth corresponds to the minimum focus lens movement distance that can be used with the point spread function in performing the DFD operation; for example, one depth may be 0.003 mm. By driving the focus lens by no more than one depth, the change in the degree of image blur caused by the drive is difficult for the user to perceive. This prevents the degree of blur of the image shown on the display from varying greatly due to the driving of the focus lens and causing a sense of incongruity to the user.
Then, the image pickup control unit 110 causes the image pickup apparatus 100 to pick up the second image for DFD when the focus lens is at the second position, and acquires the second image for DFD (S110). The imaging control unit 110 calculates the focus position by performing DFD calculation using the first image and the second image (S112). The imaging control unit 110 determines whether or not the reliability of the focus position derived by the DFD calculation using the first image and the second image is high (S114). When the similarity between the first image and the second image is equal to or less than the preset threshold, the imaging control unit 110 may determine that the reliability of the focus position derived by the DFD operation is high. When the difference between the blur amount of the first image and the blur amount of the second image is equal to or greater than the preset threshold, the imaging control unit 110 may determine that the reliability of the focus position derived by the DFD calculation is high.
When it is determined that the reliability of the in-focus position derived by the DFD calculation is high, the imaging control unit 110 drives the focus lens to the in-focus position determined by the DFD method (S116). In that case, as shown in fig. 10, the imaging control unit 110 drives the focus lens from the AF start position (first position) to the second position and then to the in-focus position calculated by the DFD method.
When it is determined that the reliability of the focus position derived by the DFD calculation is low, the image pickup control unit 110 further drives the focus lens in the focus direction (S118). The imaging control unit 110 may drive the focus lens by only one depth or less. Alternatively, the image pickup control section 110 may drive the focus lens to one boundary position in the contrast AF search range determined by the in-focus position calculated by the TOF method or the DFD method to perform contrast AF. The imaging control unit 110 drives the focus lens from the second position to the third position. Then, the image pickup control unit 110 causes the image pickup device 100 to pick up a third image for DFD when the focus lens is at the third position, and acquires the third image for DFD (S120). The imaging control unit 110 determines whether or not the reliability of the focused position derived by the DFD calculation using the second image and the third image is high (S124). The imaging control unit 110 may determine whether or not the reliability of the focused position derived by the DFD calculation using the first image and the third image is high.
When the reliability of the in-focus position derived by the DFD calculation is high, the imaging control unit 110 drives the focus lens to the in-focus position determined by the DFD method (S116). As shown in fig. 11, the imaging control unit 110 drives the focus lens from the AF start position (first position) to the second position, then to the third position, and finally to the in-focus position derived by the DFD calculation using the second image and the third image.
When the reliability of the in-focus position derived by the DFD operation is low, the imaging control unit 110 acquires a contrast evaluation value of the third image in order to perform contrast AF (S126). The imaging control unit 110 drives the focus lens in minute steps to search for the position of the focus lens at which the contrast evaluation value peaks (S128).
The imaging control unit 110 determines whether the position of the focus lens at which the contrast evaluation value peaks has been found (S130). If the peak has not been found, the imaging control unit 110 repeats from step S126. If the peak has been found, the imaging control unit 110 drives the focus lens to the in-focus position determined by the contrast method (S132). As shown in fig. 12, if the reliability of the in-focus position is low in both the TOF method and the DFD method, the imaging control unit 110 drives the focus lens in a hill-climbing search for the peak of the contrast evaluation value: it drives the focus lens past the peak and then drives it in the reverse direction to the contrast-based in-focus position.
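A minimal sketch of the hill-climbing search of S126 to S132. `camera` and its methods are hypothetical placeholders, and the single-step reversal is a simplification of driving past the peak and then reversing.

```python
def hill_climb_contrast_af(camera, step: float) -> None:
    """Drive the focus lens in minute steps until the contrast
    evaluation value falls, then back up to the peak position.
    """
    prev = camera.eval_contrast()
    while True:
        camera.move_focus(step)
        cur = camera.eval_contrast()
        if cur < prev:                 # drove past the peak ...
            camera.move_focus(-step)   # ... reverse to the in-focus position
            return
        prev = cur
```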
As described above, according to the present embodiment, the imaging device 100 performs AF by the TOF method when the reliability of the object distance calculated by the TOF method is high, and by the DFD method when it is not. If the reliability is insufficient in both the TOF and DFD methods, AF is performed by the contrast method. In this way, the imaging device 100 preferentially executes the AF method that avoids unnecessary movement of the focus lens as far as possible.
The imaging device 100 may be mounted on a mobile body, for example on an unmanned aerial vehicle (UAV) as shown in fig. 13. The UAV1000 may include a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and the imaging device 100. The gimbal 50 and the imaging device 100 are one example of an imaging system. The UAV1000 is one example of a mobile body propelled by a propulsion section. The concept of a mobile body includes, in addition to a UAV, flying bodies such as airplanes moving in the air, vehicles moving on the ground, ships moving on water, and the like.
The UAV body 20 includes a plurality of rotors. Multiple rotors are one example of a propulsion section. The UAV body 20 flies the UAV1000 by controlling the rotation of the plurality of rotors. UAV body 20 employs, for example, four rotors to fly UAV 1000. The number of rotors is not limited to four. In addition, UAV1000 may also be a fixed-wing aircraft without a rotor.
The imaging device 100 is an imaging camera that images an object included in a desired imaging range. The gimbal 50 rotatably supports the imaging device 100 and is an example of a support mechanism. For example, the gimbal 50 uses actuators to rotatably support the imaging device 100 about the pitch axis, and likewise about the roll axis and the yaw axis. The gimbal 50 can change the attitude of the imaging device 100 by rotating it about at least one of the yaw axis, the pitch axis, and the roll axis.
The plurality of image pickup devices 60 are sensing cameras that capture images of the surroundings of the UAV 1000 in order to control its flight. Two image pickup devices 60 may be provided at the nose, that is, the front, of the UAV 1000, and two other image pickup devices 60 may be provided on the bottom surface of the UAV 1000. The two image pickup devices 60 on the front side may be paired to function as a so-called stereo camera, and the two image pickup devices 60 on the bottom surface side may likewise be paired to function as a stereo camera. Three-dimensional spatial data around the UAV 1000 may be generated from the images captured by the plurality of image pickup devices 60. The number of image pickup devices 60 included in the UAV 1000 is not limited to four; the UAV 1000 may include at least one image pickup device 60, and may include at least one image pickup device 60 at each of the nose, tail, sides, bottom surface, and top surface of the UAV 1000. The angle of view settable in the image pickup device 60 may be larger than the angle of view settable in the image pickup apparatus 100. The image pickup device 60 may also have a single focus lens or a fisheye lens.
The remote operation device 600 communicates with the UAV 1000 to operate it remotely, and may communicate with the UAV 1000 wirelessly. The remote operation device 600 transmits to the UAV 1000 instruction information showing various instructions related to the movement of the UAV 1000, such as ascending, descending, accelerating, decelerating, advancing, retreating, and rotating. The instruction information includes, for example, instruction information for raising the altitude of the UAV 1000. The instruction information may show the altitude at which the UAV 1000 should be located, and the UAV 1000 moves so as to be located at the altitude shown by the instruction information received from the remote operation device 600. The instruction information may include an ascending instruction that makes the UAV 1000 ascend; the UAV 1000 ascends while receiving the ascending instruction. When the altitude of the UAV 1000 has reached its upper limit, the UAV 1000 may be restricted from ascending even if it receives an ascending instruction.
FIG. 14 shows one example of a computer 1200 in which aspects of the present invention may be embodied in whole or in part. A program installed on the computer 1200 can cause the computer 1200 to function as one or more "sections" of the apparatus according to the embodiments of the present invention, or to perform operations associated with the apparatus or with the one or more "sections". The program enables the computer 1200 to execute the processes, or the stages of the processes, according to the embodiments of the present invention. Such a program may be executed by the CPU 1212 to cause the computer 1200 to perform the specified operations associated with some or all of the blocks in the flowcharts and block diagrams described herein.
The computer 1200 of the present embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other through a host controller 1210. The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 through an input/output controller 1220. The computer 1200 also includes a ROM 1230. The CPU 1212 operates in accordance with programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
The communication interface 1222 communicates with other electronic devices through a network. A hard disk drive may store the programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores therein a boot program or the like executed by the computer 1200 at startup, and/or a program that depends on the hardware of the computer 1200. The program is provided through a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or through a network. The program is installed in the RAM 1214 or the ROM 1230, which are also examples of a computer-readable recording medium, and is executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. An apparatus or method may be constituted by realizing operations or processing of information in accordance with the use of the computer 1200.
For example, when communication is performed between the computer 1200 and an external device, the CPU 1212 may execute a communication program loaded in the RAM 1214 and instruct the communication interface 1222 to perform communication processing based on the processing described in the communication program. Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and transmits the read transmission data to the network, or writes reception data received from the network into a reception buffer or the like provided in the recording medium.
Further, the CPU 1212 may cause the RAM 1214 to read all or a necessary portion of a file or database stored in an external recording medium such as a USB memory, and perform various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.
Various types of information, such as various types of programs, data, tables, and databases, may be stored in the recording medium and subjected to information processing. With respect to the data read from the RAM 1214, the CPU 1212 may execute the various types of processing described throughout the present disclosure and specified by the instruction sequences of the programs, including various types of operations, information processing, condition judgment, conditional branching, unconditional branching, information retrieval/replacement, and the like, and write the results back to the RAM 1214. Further, the CPU 1212 may retrieve information in a file, a database, or the like in the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may retrieve from the plurality of entries an entry matching a condition that specifies the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby acquire the attribute value of the second attribute associated with the first attribute satisfying a preset condition.
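As a concrete illustration of the entry retrieval described above, the following Python sketch (with hypothetical data) selects the entries whose first attribute matches a specified condition and reads the associated second attribute:

    entries = [
        {"first": 10, "second": "a"},
        {"first": 25, "second": "b"},
        {"first": 40, "second": "c"},
    ]

    # Retrieve the second-attribute values of all entries whose first
    # attribute satisfies the preset condition (here: at least 20).
    matches = [e["second"] for e in entries if e["first"] >= 20]
    print(matches)  # ['b', 'c']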
The programs or software modules described above may be stored on the computer 1200 or in a computer-readable storage medium near the computer 1200. In addition, a recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the Internet may be used as the computer-readable storage medium, whereby the program can be provided to the computer 1200 via the network.
The present invention has been described above using embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It will be apparent to those skilled in the art that various changes and improvements can be made to the above embodiments. It is apparent from the description of the claims that modes incorporating such changes or improvements can also be included in the technical scope of the present invention.
It should be noted that the execution order of the operations, procedures, steps, stages, and the like in the devices, systems, programs, and methods shown in the claims, the specification, and the drawings can be realized in any order, as long as it is not explicitly indicated by "before", "prior to", or the like, and as long as the output of a preceding process is not used in a subsequent process. Even where the operational flow in the claims, the specification, and the drawings is described using "first", "next", and the like for convenience, it does not mean that the flow must be performed in this order.
[Description of reference numerals]
10 image pickup system
20 UAV body
50 gimbal
60 image pickup device
100 image pickup device
110 image pickup control unit
120 image sensor
130 memory
150 lens control part
152 lens driving unit
154 lens
160 sensor
162 light emitting part
163 light emitting element
164 light receiving part
165 light-receiving element
166 light emission control unit
167 light receiving control part
168 memory
200 supporting mechanism
201 rolling shaft driving mechanism
202 pitch axis drive mechanism
203 yaw axis driving mechanism
204 base part
210 attitude control section
212 angular velocity sensor
214 acceleration sensor
300 grip part
301 operating interface
302 display unit
400 smart phone
600 remote operation device
1200 computer
1210 host controller
1212 CPU
1214 RAM
1220 input/output controller
1222 communication interface
1230 ROM

Claims (10)

1. A control device that controls an image pickup apparatus including a distance measuring sensor and a focus lens, the distance measuring sensor including a light emitting element that emits pulsed light and a light receiving element that receives light including reflected light of the pulsed light from an object and outputs a signal according to an amount of the received light, the distance measuring sensor measuring a distance to the object based on the signal, the control device comprising a circuit configured to:
executing focus control for focusing on the object based on a first target positional relationship between an image pickup surface of the image pickup apparatus and the focus lens determined according to the distance, when the signal satisfies a first condition showing reliability of the distance;
when the signal does not satisfy the first condition, the focus control is executed based on a second target positional relationship of the image pickup surface and the focus lens determined from respective blur amounts of a first image captured by the image pickup device when the positional relationship of the image pickup surface and the focus lens is a first positional relationship and a second image captured by the image pickup device when the positional relationship of the image pickup surface and the focus lens is a second positional relationship.
2. The control device of claim 1, wherein the circuit is configured to: performing the focus control according to the second target positional relationship when a first degree of difference showing a degree of difference between the first image and the second image satisfies a second condition showing reliability of the second target positional relationship.
3. The control device of claim 2, wherein the circuit is configured to: performing the focus control by contrast AF when the first degree of difference does not satisfy the second condition.
4. The control device of claim 2, wherein the circuit is configured to:
when the first degree of difference does not satisfy the second condition, changing the positional relationship between the image pickup surface and the focus lens from the second positional relationship to a third positional relationship;
executing the focus control based on a third target positional relationship of the image pickup surface and the focus lens determined from respective blur amounts of a third image captured by the image pickup device when the positional relationship of the image pickup surface and the focus lens is the third positional relationship and the first image or the second image, when a second degree of difference showing a degree of difference between the third image and the first image or the second image satisfies the second condition;
performing the focus control by contrast AF when the second degree of difference does not satisfy the second condition.
5. The control device according to claim 1, wherein the signal satisfies the first condition when the signal shows that the amount of received light is within a preset range.
6. The control device of claim 1, wherein the circuit is configured to: when the signal does not satisfy the first condition, acquiring the first image captured by the image pickup device when the positional relationship between the image pickup surface and the focus lens is the first positional relationship, then changing the positional relationship between the image pickup surface and the focus lens to the second positional relationship, and acquiring the second image captured by the image pickup device when the positional relationship between the image pickup surface and the focus lens is the second positional relationship.
7. The control device of claim 6, wherein the circuit is configured to: when the signal does not satisfy the first condition, acquiring the first image captured by the image pickup device when the positional relationship between the image pickup surface and the focus lens is the first positional relationship, and then moving the focus lens by a preset distance in a direction determined based on the first target positional relationship, thereby changing the positional relationship between the image pickup surface and the focus lens to the second positional relationship.
8. An image pickup system, comprising: the control device according to any one of claims 1 to 7;
the distance measuring sensor; and
the image pickup apparatus.
9. A control method of controlling an image pickup apparatus including a distance measuring sensor and a focus lens, the distance measuring sensor including a light emitting element that emits pulsed light and a light receiving element that receives light including reflected light of the pulsed light from an object and outputs a signal corresponding to an amount of the received light, the distance measuring sensor measuring a distance to the object based on the signal, the control method comprising:
executing focus control for focusing on the object based on a first target positional relationship between an image pickup surface of the image pickup apparatus and the focus lens determined according to the distance, when the signal satisfies a first condition showing reliability of the distance;
when the signal does not satisfy the first condition, the focus control is executed based on a second target positional relationship of the image pickup surface and the focus lens that is determined from respective blur amounts of a first image picked up by the image pickup device when the positional relationship of the image pickup surface and the focus lens is a first positional relationship and a second image picked up by the image pickup device when the positional relationship of the image pickup surface and the focus lens is a second positional relationship.
10. A program for causing a computer to function as the control device according to any one of claims 1 to 7.
CN202080003361.5A 2019-08-21 2020-08-03 Control device, imaging system, control method, and program Pending CN112335227A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2019-151539 2019-08-21
JP2019151539A JP2021032990A (en) 2019-08-21 2019-08-21 Control device, imaging system, control method and program
PCT/CN2020/106579 WO2021031833A1 (en) 2019-08-21 2020-08-03 Control device, photographing system, control method, and program

Publications (1)

Publication Number Publication Date
CN112335227A (en)

Family

ID=74301443

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080003361.5A Pending CN112335227A (en) 2019-08-21 2020-08-03 Control device, imaging system, control method, and program

Country Status (1)

Country Link
CN (1) CN112335227A (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6442639A (en) * 1987-08-08 1989-02-14 Olympus Optical Co Automatic focusing system
US6222996B1 (en) * 1998-07-15 2001-04-24 Olympus Optical Co., Ltd. Camera with distance measuring apparatus for preferentially controlling passive and active type AF system
JP2002010126A (en) * 2000-06-19 2002-01-11 Olympus Optical Co Ltd Image pickup device
CN103801824A (en) * 2014-02-25 2014-05-21 哈尔滨工业大学(威海) Automatic-focusing high-precision large-stroke precision positioning workbench
CN105403983A (en) * 2014-09-05 2016-03-16 三星电子株式会社 Inner Focusing Telephoto Lens System And Photographing Apparatus Including The Same
CN104954681A (en) * 2015-06-16 2015-09-30 广东欧珀移动通信有限公司 Method for switching off laser focusing mode and terminal
CN105847664A (en) * 2015-07-31 2016-08-10 维沃移动通信有限公司 Shooting method and device for mobile terminal
CN108139562A (en) * 2015-09-30 2018-06-08 富士胶片株式会社 Focusing control apparatus, focusing control method, focusing control program, lens assembly, photographic device
CN108235815A (en) * 2017-04-07 2018-06-29 深圳市大疆创新科技有限公司 Video camera controller, photographic device, camera system, moving body, camera shooting control method and program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HAN Zhenyu: "Automatic focusing laser-induced breakdown spectroscopy remote measurement system", Spectroscopy and Spectral Analysis *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115242939A (en) * 2021-03-24 2022-10-25 维克多哈苏有限公司 Distance detection device and imaging device
CN113163086A (en) * 2021-04-07 2021-07-23 Tcl通讯(宁波)有限公司 Intelligent photographing accessory structure applied to a display device

Similar Documents

Publication Publication Date Title
CN111567032B (en) Specifying device, moving body, specifying method, and computer-readable recording medium
CN110383812B (en) Control device, system, control method, and program
CN111356954B (en) Control device, mobile body, control method, and program
CN112335227A (en) Control device, imaging system, control method, and program
CN110809746A (en) Control device, imaging device, mobile body, control method, and program
CN112292712A (en) Device, imaging device, moving object, method, and program
CN110337609B (en) Control device, lens device, imaging device, flying object, and control method
WO2021031833A1 (en) Control device, photographing system, control method, and program
CN109844634B (en) Control device, imaging device, flight object, control method, and program
US20220188993A1 (en) Control apparatus, photographing apparatus, control method, and program
JP2021085893A (en) Control device, image capturing device, control method, and program
CN111264055A (en) Specifying device, imaging system, moving object, synthesizing system, specifying method, and program
US20220046177A1 (en) Control device, camera device, movable object, control method, and program
CN110785997B (en) Control device, imaging device, mobile body, and control method
CN111357271B (en) Control device, mobile body, and control method
CN111602385B (en) Specifying device, moving body, specifying method, and computer-readable recording medium
CN111226170A (en) Control device, mobile body, control method, and program
JP7173657B2 (en) Control device, imaging device, control method, and program
CN112313943A (en) Device, imaging device, moving object, method, and program
JP6805448B2 (en) Control devices, imaging systems, moving objects, control methods, and programs
WO2022001561A1 (en) Control device, camera device, control method, and program
CN111213369B (en) Control device, control method, imaging device, mobile object, and computer-readable storage medium
CN112166374B (en) Control device, imaging device, mobile body, and control method
JP6746856B2 (en) Control device, imaging system, moving body, control method, and program
CN114600024A (en) Device, imaging system, and moving object

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
AD01 Patent right deemed abandoned

Effective date of abandoning: 20230228