US20160182809A1 - Hybrid auto-focus mechanism - Google Patents

Hybrid auto-focus mechanism

Info

Publication number
US20160182809A1
Authority
US
United States
Prior art keywords
autofocus
distance
ranging device
lens
estimation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US14/842,238
Other versions
US9420163B2
Inventor
Laurent Plaza
Olivier Lemarchand
Paul Varillon
Francesco CASCIO
Duncan Hall
Sam Lee
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
STMicroelectronics Grenoble 2 SAS
STMicroelectronics Research and Development Ltd
STMicroelectronics Asia Pacific Pte Ltd
Original Assignee
STMicroelectronics Grenoble 2 SAS
STMicroelectronics Research and Development Ltd
STMicroelectronics Asia Pacific Pte Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by STMicroelectronics Grenoble 2 SAS, STMicroelectronics Research and Development Ltd, and STMicroelectronics Asia Pacific Pte Ltd
Assigned to STMICROELECTRONICS (GRENOBLE 2) SAS. Assignors: CASCIO, Francesco; PLAZA, Laurent; LEMARCHAND, Olivier
Assigned to STMICROELECTRONICS (RESEARCH & DEVELOPMENT) LIMITED. Assignors: HALL, Duncan; LEE, Sam
Assigned to STMICROELECTRONICS ASIA PACIFIC PTE LTD. Assignors: VARILLON, Paul
Publication of US20160182809A1
Application granted
Publication of US9420163B2
Legal status: Active

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32Means for focusing
    • G03B13/34Power focusing
    • G03B13/36Autofocus systems
    • H04N5/23212
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/673Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/71Circuitry for evaluating the brightness variation
    • H04N5/2257
    • H04N5/2351
    • H04N9/045
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/36Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G02B7/365Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals by analysis of the spatial frequency components of the image
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/40Systems for automatic generation of focusing signals using time delay of the reflected waves, e.g. of ultrasonic waves

Definitions

  • an autofocus system comprising: a digital camera comprising at least one lens controllable to have N unique focusing positions; a ranging device adapted to estimate a distance to an object; and a processing device adapted to: determine that the ranging device failed in an attempt to provide a distance estimation; receive from the ranging device one or more parameters indicating conditions related to the failure of the ranging device to provide the distance estimation; and perform an autofocus operation based on the one or more parameters.
  • the processing device is adapted to perform the autofocus operation by performing an iterative search covering a subset starting at a lens position selected based on the one or more parameters.
  • the digital camera comprises an image sensor on which the at least one lens forms an image of an image scene, and the processing device is adapted to process images captured by the image sensor to determine focus measures and to perform the iterative autofocus search based on the focus measures.
  • the autofocus system further comprises a memory storing a table indicating a mapping between distances in the image scene and corresponding lens positions of the at least one lens for focusing at said distances.
  • the one or more parameters comprise a distance value representing an estimation of the maximum distance for which the ranging device is capable of determining a distance estimation, and the ranging device is adapted to generate the distance value based on an ambient light measure.
  • FIG. 1 schematically illustrates an image capture device having a ranging device according to an embodiment of the present disclosure
  • FIG. 2 schematically illustrates the image capture device of FIG. 1 in more detail according to an embodiment of the present disclosure
  • FIG. 3 is a graph representing a mapping between lens positions and object distances according to an embodiment of the present disclosure
  • FIG. 4 is a flow diagram illustrating operations in an autofocus method according to an embodiment of the present disclosure
  • FIG. 5 is a flow diagram illustrating operations in an autofocus method in more detail according to an embodiment of the present disclosure
  • FIG. 6 is a graph illustrating an iterative full autofocus search according to an embodiment of the present disclosure.
  • FIG. 7 is a graph illustrating an iterative fine autofocus search according to an embodiment of the present disclosure.
  • FIG. 1 schematically illustrates an image capture device 102 .
  • the device 102 is for example a digital camera, or another portable electronics device equipped with a digital camera.
  • for example, the device 102 is a mobile telephone, a laptop or tablet computer, a portable media player, or the like.
  • the device 102 comprises a camera unit 104 , for example comprising a lens unit 105 , and an image sensor 106 on which the lens unit 105 is adapted to form an image of the image scene.
  • the image capture device 102 also comprises a ranging device 108 .
  • the ranging device 108 is for example adapted to estimate the distance D to a target object, which is for example the closest object detected in the image scene.
  • the ranging device 108 operates by transmitting a signal, as represented by a beam 110 in FIG. 1 , and by detecting reflections from the image scene.
  • the target object is a person located at a distance DT from the image capture device 102.
  • the ranging device 108 may use any of a range of technologies for estimating the distance to an object.
  • the ranging device transmits light, such as infrared light, and comprises a light sensitive cell for detecting photons returning from the image scene.
  • the distance is estimated by calculating the time of flight of these photons.
  • other technologies could be used, such as ultrasound or capacitive sensing.
  • there is a maximum object distance, referred to herein as Dmax, above which the ranging device is not capable of estimating the distance.
  • Dmax is shown as being a little further from the camera than the distance DT, and is for example in the range 0.5 to 2 m. If the closest object in the field of view of the ranging device 108 is at a distance greater than Dmax, the ranging device 108 will not be capable of detecting the presence of this object and estimating the distance to this object.
  • a hybrid autofocus solution is for example adopted to use both a ranging device and a passive autofocus technique.
  • the passive autofocus technique is employed. Furthermore, in such a case the ranging device 108 is for example capable of generating error codes and/or of calculating an estimation of Dmax based on at least the ambient light levels.
  • FIG. 2 schematically illustrates the image capture device 102 of FIG. 1 in more detail according to an example embodiment.
  • the device 102 for example comprises a processing device (P) 202 , which for example comprises one or more processors under control of software instructions stored in a memory (MEM) 204 .
  • the processing device 202 may alternatively be at least partially implemented in hardware, such as by an ASIC (application specific integrated circuit).
  • the processing device 202 is coupled to the ranging device (RD) 108 , for example via a suitable interface such as a parallel or serial bus.
  • the processing device 202 provides an enable signal EN to the ranging device 108 when a distance measurement is required, and the distance measurement D is provided by the ranging device 108 back to the processing device 202 , for example in the form of an 8-bit distance value.
  • an estimation of the distance Dmax is for example generated by the ranging device 108 and provided to the processing device 202 .
  • one or more error codes may be provided by the ranging device 108 indicating conditions relating to the failure of the ranging device 108 to provide the distance estimation.
  • the processing device 202 for example provides a control signal CTRL to the lens unit 105 for controlling the position of at least one lens, and thus controlling the focus of the camera.
  • the lens unit 105 comprises a VCM (voice coil motor, not illustrated) for positioning at least one lens of the lens unit 105, and a control module (also not illustrated).
  • the lens unit 105 could comprise other types of mechanisms for adjusting the focus of the camera 104 , such as a stepper motor.
  • the control signal CTRL for example comprises one of N values for controlling at least one lens of the lens unit to have one of N unique positions, providing N unique focusing positions.
  • the image sensor 106 of the camera 104 for example provides an image I to the processing device 202 for processing.
  • the processing device 202 performs a hybrid autofocusing sequence in which the ranging device 108 is used by preference, but if a valid distance reading cannot be obtained, an iterative autofocus search is performed by processing one or more images I captured by the image sensor 106 to determine when a focused image has been obtained.
  • FIG. 3 is a graph representing an example of the position of a focusing lens of the lens unit 105 , expressed as the distance from a reference focus position, for a range of object distances.
  • each lens position corresponds to a focused distance range, such that an object positioned in this range will be considered to be focused on the image sensor.
  • One example of a curve is shown in FIG. 3 , corresponding to a specific camera. Indeed, the shape of the curve will depend on the particular camera system, and in particular on aspects such as the number of pixels, the pixel size, the field of view and the lens F number.
  • a first lens position corresponds to a focused distance of 3 m, and is for example a hyperfocal distance suitable for any object distance between 2.25 m and infinity.
  • a second lens position for example corresponds to an object distance of 1.5 m, a third lens position to an object distance of 1 m, a fourth lens position to an object distance of 0.75 m, etc.
  • the object distance range for each lens position is for example stored in a lookup table T stored in the memory 204 of the image capture device 102 .
  • the lens positions are for example calibrated for a given device by positioning an object at each of the distances, and adjusting the lens positioning until focusing has been achieved.
  • the lookup table T allows a distance detected by the ranging device 108 to be converted directly into an appropriate lens position.
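
The lookup-table conversion just described can be sketched as follows. This is an illustrative sketch only: the distance bands are loosely modelled on the FIG. 3 example (hyperfocal at 3 m, then 1.5 m, 1 m, 0.75 m), not taken from a real calibration, and the function name is invented.

```python
# Hypothetical table T: (lower bound of object distance in metres, lens
# position index). A real table would come from per-device calibration.
LOOKUP_TABLE = [
    (2.25, 1),   # beyond 2.25 m -> hyperfocal position (focused at 3 m)
    (1.25, 2),   # 1.25 m .. 2.25 m -> position focused at 1.5 m
    (0.875, 3),  # 0.875 m .. 1.25 m -> position focused at 1 m
    (0.0, 4),    # anything closer -> position focused at 0.75 m
]

def lens_position_for_distance(d):
    """Convert a ranging-device distance reading into a lens position."""
    for lower_bound, position in LOOKUP_TABLE:
        if d >= lower_bound:
            return position
    return LOOKUP_TABLE[-1][1]

print(lens_position_for_distance(3.0))  # distant object -> position 1
print(lens_position_for_distance(1.0))  # -> position 3
```

In a real device the table would map each of the N control-signal values to its focused distance range, so a valid ranging result converts to a lens position in one step rather than by an iterative search.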
  • the memory 204 also for example stores distances D1 and D2, which will be described in more detail below.
  • the distance D1 for example corresponds to the object distance above which the first lens position is to be used, and is for example around halfway between the first and second lens positions, for example equal to 2.25 m in the example of FIG. 3.
  • the distance D2 is for example the object distance to be used for the sixth lens position, counting from the hyperfocal lens position. Indeed, as will be described in more detail below, there are at least six lens movements in a typical HCS (Hill Climbing Search) autofocus sequence, three for a coarse search, and three for a fine search.
  • by choosing the distance D2 such that there are only six remaining potential lens positions, using only a fine autofocus search based on these remaining lens positions will always be equal to or faster than performing a full coarse and fine HCS autofocus sequence.
  • a different choice of the distance D2 would be possible, for example if a different type of search algorithm is employed.
  • FIG. 4 is a flow diagram illustrating an example of operations in an autofocusing method according to an example embodiment. These operations are for example performed by the processing device 202 during an autofocusing sequence, and following a command by the processing device 202 to the ranging device 108 to provide a distance estimation. For example, a user has pressed a button or a touch screen of the image capture device 102 to indicate that they wish to capture a still image or a video sequence, and the autofocusing sequence is launched.
  • in an operation 400, it is determined whether the ranging device 108 has successfully estimated the distance D to an object in its field of view. If so, in an operation 401, focusing is performed based on this distance. For example, the distance D is converted, using the lookup table T stored in the memory 204, into the control signal CTRL corresponding to an appropriate lens position, and the control signal is applied to the lens unit 105. In some embodiments, a fine autofocus search algorithm may additionally be applied, as described in more detail below. Alternatively, if in operation 400 it is determined that the ranging device 108 failed to estimate a distance to any object, the next operation is 402.
  • the processing device 202 receives, from the ranging device 108 , one or more parameters indicating conditions related to the failure of the ranging device 108 to provide the distance estimation.
  • for example, these parameters comprise one or more error codes EC and/or the distance Dmax.
  • an autofocus operation is performed by the processing device 202 based on the one or more parameters received from the ranging device 108 . For example, in some embodiments, this may involve performing an iterative autofocus search based on the value of Dmax, or bringing the lens position directly to the hyperfocal position.
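
The three operations of FIG. 4 amount to a simple decision: use the ranging distance when it is valid, otherwise fall back on a search guided by the failure parameters. A minimal sketch (the class, function names, and toy strategies are invented for illustration):

```python
class RangingResult:
    """Outcome of one ranging attempt (illustrative container)."""
    def __init__(self, valid, distance=None, failure_params=None):
        self.valid = valid
        self.distance = distance
        self.failure_params = failure_params or {}

def hybrid_autofocus(result, distance_to_position, guided_search):
    """FIG. 4 flow: operation 400 (check the ranging result), then either
    operation 401 (one-shot focus from the distance) or operation 402
    (search guided by the failure parameters). Returns a lens position."""
    if result.valid:
        return distance_to_position(result.distance)   # operation 401
    return guided_search(result.failure_params)        # operation 402

# Toy usage: a fixed table hit on success, hyperfocal fallback on failure.
ok = hybrid_autofocus(RangingResult(True, 1.0), lambda d: 3, lambda p: 1)
ko = hybrid_autofocus(RangingResult(False), lambda d: 3, lambda p: 1)
print(ok, ko)
```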
  • FIG. 5 illustrates an autofocusing sequence in more detail based on the estimation Dmax provided by the ranging device 108 .
  • the host, which is for example the processing device 202, requests a new autofocus sequence.
  • the distance measurement to be performed by the ranging device is initiated, for example by asserting the enable signal EN shown in FIG. 2 .
  • if a valid distance D is obtained, this distance is converted into a lens position, and then in an operation 505, the lens is moved to the focus position, for example in one shot, followed optionally by a further fine search.
  • otherwise, the next operation is 506, in which the distance Dmax is obtained from the ranging device 108.
  • Dmax is for example calculated by the ranging device 108 based on an ambient light level.
  • Dmax is for example estimated based on the ambient light levels at the transmission wavelength of the ranging device. Indeed, high ambient levels at wavelengths interfering with the transmission wavelength are likely to add noise and reduce the performance of the ranging device 108, thereby reducing Dmax.
  • the ranging device 108 may measure the ambient light by detecting the amount of light received by its photosensitive cell when no light is emitted.
  • the value Dmax is estimated assuming a specific reflectance of the target, for example equal to 17% grey.
  • the value of Dmax may be adjusted based on an estimation of the actual color and thus the actual reflectance of the target object, for example by evaluating pixel values close to the center of a captured image. Thus, for example, if the color is white, reflectance may be assumed to be at around 88%, whereas if the color is black, reflectance may be assumed to be only 5%.
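
One plausible way to apply such a reflectance correction is to scale Dmax by the square root of the reflectance ratio, on the assumption that the return signal varies with reflectance over distance squared. Both this scaling law and the function name are assumptions made for illustration; the patent does not give the adjustment formula.

```python
import math

def adjust_dmax(dmax_at_17_grey, estimated_reflectance, assumed_reflectance=0.17):
    """Scale a Dmax estimate (computed for a 17% grey target) to the
    target's estimated reflectance. Assumes return signal ~ reflectance
    / d^2, so usable range ~ sqrt(reflectance) -- an assumption, not a
    formula taken from the patent."""
    return dmax_at_17_grey * math.sqrt(estimated_reflectance / assumed_reflectance)

print(round(adjust_dmax(1.0, 0.88), 2))  # white target: range extends
print(round(adjust_dmax(1.0, 0.05), 2))  # black target: range shrinks
```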
  • Dmax is calculated based on the following equation:
  • RetAmbient is the ambient light level, for example measured by the ranging device 108
  • MaxAmbientRate is a maximum level of light before the ranging device 108 is saturated and/or blinded
  • RetSignalAt0mm is an estimation of the return signal rate when a target is placed at 0 mm
  • MinSignalNeeded is the minimum signal rate for obtaining a valid distance.
  • RetSignalAt0mm is determined based on the following equation:
  • ReturnRate is the return signal rate calibration value, with a target at the calibration distance DistanceCalibration.
  • MinSignalNeeded is for example determined based on the following equation:
  • MinSignalNeeded = ThresholdLimit / MaxConvTime × 1000
  • ThresholdLimit is the minimum number of events, in other words photon detections, for obtaining a valid distance, expressed in Mcps (mega photon counts per second), and MaxConvTime is the maximum convergence time in ms for providing a distance measurement.
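
Reading the formula above as a rate-budget check, MinSignalNeeded converts the minimum event count per convergence window into a per-second rate, with the factor 1000 converting milliseconds to seconds. A sketch, with invented numeric values:

```python
def min_signal_needed(threshold_limit, max_conv_time_ms):
    """Minimum return-signal rate needed for a valid distance reading.

    threshold_limit: minimum number of detection events for a valid
    reading; max_conv_time_ms: maximum convergence time in ms. The
    factor 1000 converts the per-millisecond budget to a per-second rate.
    """
    return threshold_limit / max_conv_time_ms * 1000

# e.g. at least 10 events within a 20 ms convergence window
print(min_signal_needed(10, 20))  # -> 500.0 events per second
```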
  • an operation 507 is performed in which Dmax is for example compared to the distance D1. If Dmax is greater than D1, then in a next operation 508, the lens is for example moved to the infinity position, for example corresponding to the hyperfocal distance. This is for example performed in one shot.
  • Dmax is compared to a value D2. If Dmax is greater than D2, in a next operation 510, a fine HCS autofocus search is for example performed over a reduced range of lens positions. For example, the search is performed between Dmax and infinity. Due to the choice of D2 as described above in relation to FIG. 3, the search for example takes six steps or fewer. However, in other embodiments, other types of search sequence could be implemented over the reduced range of lens positions. If in operation 509 Dmax is not found to be greater than D2, the next operation is 511.
  • a full iterative autofocus search is for example performed from macro to infinity. This operation for example takes at least six steps, three for a coarse search followed by three for a fine search.
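
The branching of operations 507 to 511 can be summarised as below. The threshold defaults and strategy labels are illustrative only; in the described system D1 and D2 would be read from the memory 204.

```python
def choose_af_strategy(dmax, d1=2.25, d2=1.0):
    """Select the autofocus strategy from the Dmax estimate (FIG. 5).

    d1, d2: stored distance thresholds (these default values are invented).
    """
    if dmax > d1:
        return "one_shot_infinity"             # operation 508
    if dmax > d2:
        return "fine_search_dmax_to_infinity"  # operation 510
    return "full_search_macro_to_infinity"     # operation 511

print(choose_af_strategy(3.0))  # one_shot_infinity
print(choose_af_strategy(1.5))  # fine_search_dmax_to_infinity
print(choose_af_strategy(0.5))  # full_search_macro_to_infinity
```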
  • FIG. 6 is a graph illustrating an example of the full autofocus search performed in operation 511 of FIG. 5 .
  • a focus measure which is for example based on a contrast detection technique applied to a captured image, is provided as an input.
  • the lens position is modified in relatively large increments towards the hyperfocal position until the focus measure starts to fall.
  • a fine search is then performed as represented by solid dots, by moving the lens position by smaller increments.
  • the hill climbing search (HCS) algorithm is for example applied.
  • the HCS algorithm for example involves such a coarse search in relatively large increments until the focus measure falls, followed by a fine search in smaller increments around the detected peak.
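
This coarse-then-fine hill climb might be sketched as follows. It is a simplified illustration over lens position indices, not the patent's exact procedure:

```python
def hill_climbing_search(focus_measure, n_positions, coarse_step=4):
    """Simplified HCS: coarse steps until the focus measure drops, then a
    fine single-step scan around the best coarse position."""
    best = 0
    best_score = focus_measure(0)
    pos = 0
    # Coarse phase: large increments until the focus measure starts to fall.
    while pos + coarse_step < n_positions:
        pos += coarse_step
        score = focus_measure(pos)
        if score < best_score:
            break
        best, best_score = pos, score
    # Fine phase: single steps around the best coarse position.
    for p in range(max(0, best - coarse_step),
                   min(n_positions, best + coarse_step + 1)):
        score = focus_measure(p)
        if score > best_score:
            best, best_score = p, score
    return best

# Toy focus curve (e.g. a contrast measure) peaking at position 6.
print(hill_climbing_search(lambda p: -(p - 6) ** 2, 12))  # -> 6
```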
  • FIG. 7 is a graph illustrating an example of the fine autofocus search for example performed in step 510 of FIG. 5 . As illustrated, initially the lens is for example taken to the infinity lens position, and then brought back until the focus measure starts to fall.
  • one or more steps in the autofocus search sequence can thus be saved by directly performing a fine autofocus search over a reduced range of lens positions, rather than performing a full autofocus search over the whole range.
  • the ranging device 108 in addition to or rather than using the value of Dmax, is capable of generating error codes when a valid distance measurement cannot be generated. Such error codes can be used to reduce the number of steps of the iterative autofocus search sequence. By way of example, one or more of the following error types may be indicated by the ranging device 108 using appropriate error codes:
  • Error type Raw ranging underflow. This type of error occurs when one or more internal registers in the ranging device are set to a negative value during the ranging operation. This is an error that generally results from the target being either very close to the ranging device, or from the target being particularly bright and being relatively far from the ranging device. These two cases can be differentiated based on the return signal rate. For example, if the return rate is low, the target is determined to be far, whereas if the return rate is high, the target is determined to be close. In the case of a close target, the lens position is for example taken to macro, in other words the lens position corresponding to the shortest focused distance. In the case of a far target, an autofocus search is for example performed between Dmax and infinity, or the hyperfocal position is used directly.
  • Error type Raw ranging overflow. This means that the buffer of the ranging device 108 for storing the distance measurement has overflowed. Generally, this means that the target must be close to the maximum range of the ranging device 108, and thus a search can be started from Dmax.
  • Error type low SNR, meaning that there is high ambient light compared to the signal. In view of the bright conditions, the likelihood is that the camera is outside, and thus the hyperfocal lens position is for example used. Alternatively, an autofocus search is performed from Dmax to infinity.
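
The handling of these three error cases can be summarised as a simple dispatch. The error names, return-rate threshold, and strategy labels below are invented for illustration; the patent describes the behaviour but not concrete codes.

```python
def strategy_for_error(error_type, return_rate=0.0, high_rate=1.0):
    """Map a ranging-device error code to a fallback focus strategy."""
    if error_type == "raw_ranging_underflow":
        # High return rate -> very close target; low -> bright, far target.
        return "move_to_macro" if return_rate > high_rate else "search_dmax_to_infinity"
    if error_type == "raw_ranging_overflow":
        # Measurement near the device's maximum range: start from Dmax.
        return "search_from_dmax"
    if error_type == "low_snr":
        # High ambient light: probably outdoors, use the hyperfocal position.
        return "move_to_hyperfocal"
    return "full_search"  # unknown error: fall back to a full HCS search

print(strategy_for_error("raw_ranging_underflow", return_rate=5.0))
print(strategy_for_error("low_snr"))
```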
  • An advantage of the embodiments described herein is that the speed of the autofocus sequence can be increased by using parameters provided by a ranging device.


Abstract

An autofocus method determines that a ranging device of a digital camera has failed in an attempt to provide a distance estimation. The ranging device provides one or more parameters indicating conditions related to the failure of the ranging device to provide the distance estimation. An autofocus sequence based on the one or more parameters is then performed.

Description

    PRIORITY CLAIM
  • This application claims priority from French Application for Patent No. 1463242 filed Dec. 23, 2014, the disclosure of which is incorporated by reference.
  • TECHNICAL FIELD
  • The present disclosure relates to the field of systems and methods for autofocusing, and in particular to a system and method of autofocusing using a ranging device.
  • BACKGROUND
  • It has become standard in recent years to equip most types of mobile devices, such as mobile telephones, with cameras for capturing still images and/or video. Due to a demand for high quality images, the cameras are becoming more sophisticated and generally comprise a lens system with an autofocus mechanism to automatically perform a focusing operation.
  • Among existing autofocus mechanisms, some are termed “passive” autofocus systems, which generally involve processing images captured by the image sensor to detect when focusing has been achieved, for example by performing contrast or phase detection. Other autofocus mechanisms, termed “active” autofocus systems, rely on a dedicated ranging device to estimate the distance to an object in the image scene, allowing a rapid convergence to an appropriate lens position.
  • A drawback with passive autofocus mechanisms is that they tend to be relatively slow in providing the optimum lens position. However, while it would be desirable to rely solely on an active autofocus mechanism, in certain cases the ranging device is unable to provide a distance reading.
  • Hybrid autofocus systems use a combination of passive and active autofocus methods. If the active autofocus mechanism fails, the passive autofocus sequence is triggered to provide the autofocus function. Therefore, while such a hybrid system can provide a shorter focusing time using the active autofocus method, in the case that this method fails, focusing is still likely to be slow.
  • There is thus a need in the art for an autofocus method and system permitting faster focusing.
  • SUMMARY
  • It is an aim of embodiments of the present description to at least partially address one or more needs in the prior art.
  • According to one aspect, there is provided an autofocus method comprising: determining, by a processing device of a digital camera having a ranging device, that the ranging device failed in an attempt to provide a distance estimation; receiving by the processing device from the ranging device one or more parameters indicating conditions related to the failure of the ranging device to provide the distance estimation; and performing, by the processing device, an autofocus sequence based on the one or more parameters.
  • According to one embodiment, the one or more parameters comprise one or more of: an error code from the ranging device; and a distance value representing an estimation of the maximum distance for which the ranging device is capable of determining a distance estimation.
  • According to one embodiment, the digital camera comprises at least one lens controllable to have N unique focusing positions, and based on the one or more parameters, the autofocus sequence comprises an iterative search covering a subset of the N unique focusing positions, an initial lens position of the autofocus sequence being selected based on the one or more parameters.
  • According to one embodiment, the digital camera comprises an image sensor on which the at least one lens forms an image of an image scene, the method further comprising: processing, by the processing device, images captured by the image sensor to determine focus measures; and performing the iterative autofocus search based on the focus measures.
  • According to one embodiment, the one or more parameters comprise a distance value representing an estimation of the maximum distance for which the ranging device is capable of determining a distance estimation.
  • According to one embodiment, the distance value is determined based on an ambient light measure.
  • According to one embodiment, the autofocus method further comprises estimating a reflectance of a target object of the autofocus method, and determining the distance value based also on the estimated reflectance of the target object.
  • According to one embodiment, estimating the reflectance of the target comprises detecting a color of the target object.
  • According to one embodiment, the processing device is adapted to: compare the distance value with a first distance threshold, and if the distance value is higher than the first distance threshold, performing the autofocus operation comprises selecting a lens position corresponding to infinity.
  • According to one embodiment, the first distance threshold is a distance corresponding to less than one lens position from the infinity lens position.
  • According to one embodiment, the processing device is adapted to: compare the distance value with a second distance threshold, and if the distance value is higher than the second distance threshold, performing the autofocus operation comprises performing an iterative autofocus operation with fine steps for a range of lens positions corresponding to distances in the range Dmax to infinity.
  • According to one embodiment, the second distance threshold is equal to a focusing distance corresponding to six or less lens positions from the lens position corresponding to infinity.
  • According to a further aspect, there is provided an autofocus system comprising: a digital camera comprising at least one lens controllable to have N unique focusing positions; a ranging device adapted to estimate a distance to an object; and a processing device adapted to: determine that the ranging device failed in an attempt to provide a distance estimation; receive from the ranging device one or more parameters indicating conditions related to the failure of the ranging device to provide the distance estimation; and perform an autofocus operation based on the one or more parameters.
  • According to one embodiment, the processing device is adapted to perform the autofocus operation by performing an iterative search covering a subset of the N unique focusing positions, starting at a lens position selected based on the one or more parameters.
  • According to one embodiment, the digital camera comprises an image sensor on which the at least one lens forms an image of an image scene, and the processing device is adapted to process images captured by the image sensor to determine focus measures and to perform the iterative autofocus search based on the focus measures.
  • According to one embodiment, the autofocus system further comprises a memory storing a table indicating a mapping between distances in the image scene and corresponding lens positions of the at least one lens for focusing at said distances.
  • According to one embodiment, the one or more parameters comprise a distance value representing an estimation of the maximum distance for which the ranging device is capable of determining a distance estimation, and the ranging device is adapted to generate the distance value based on an ambient light measure.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The foregoing and other features and advantages will become apparent from the following detailed description of embodiments, given by way of illustration and not limitation with reference to the accompanying drawings, in which:
  • FIG. 1 schematically illustrates an image capture device having a ranging device according to an embodiment of the present disclosure;
  • FIG. 2 schematically illustrates the image capture device of FIG. 1 in more detail according to an embodiment of the present disclosure;
  • FIG. 3 is a graph representing a mapping between lens positions and object distances according to an embodiment of the present disclosure;
  • FIG. 4 is a flow diagram illustrating operations in an autofocus method according to an embodiment of the present disclosure;
  • FIG. 5 is a flow diagram illustrating operations in an autofocus method in more detail according to an embodiment of the present disclosure;
  • FIG. 6 is a graph illustrating an iterative full autofocus search according to an embodiment of the present disclosure; and
  • FIG. 7 is a graph illustrating an iterative fine autofocus search according to an embodiment of the present disclosure.
  • DETAILED DESCRIPTION
  • FIG. 1 schematically illustrates an image capture device 102. The device 102 is for example a digital camera, or another portable electronic device equipped with a digital camera. For example, the device 102 is a mobile telephone, a laptop or tablet computer, a portable media player, or the like. The device 102 comprises a camera unit 104, for example comprising a lens unit 105, and an image sensor 106 on which the lens unit 105 is adapted to form an image of the image scene.
  • The image capture device 102 also comprises a ranging device 108. The ranging device 108 is for example adapted to estimate the distance D to a target object, which is for example the closest object detected in the image scene. The ranging device 108 operates by transmitting a signal, as represented by a beam 110 in FIG. 1, and by detecting reflections from the image scene. In the example of FIG. 1, the target object is a person located at a distance DT from the image capturing device 102. The ranging device 108 may use any of a range of technologies for estimating the distance to an object. For example, in some embodiments, the ranging device transmits light, such as infrared light, and comprises a light sensitive cell for detecting photons returning from the image scene. For example, the distance is estimated by calculating the time of flight of these photons. Alternatively, other technologies could be used, such as ultrasound or capacitive sensing.
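  • By way of illustration, the time-of-flight principle mentioned above converts the measured round-trip time of the emitted photons into a one-way distance using the speed of light. The following sketch is illustrative only and does not reflect the ranging device's internal implementation:

```python
# Illustrative time-of-flight conversion: the photon round-trip time is
# halved and multiplied by the speed of light to give the target distance.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_distance_m(round_trip_time_s: float) -> float:
    """Estimated one-way distance to the target, in metres."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0
```

For example, a round trip of about 6.67 ns corresponds to a target roughly 1 m away, which is the order of magnitude of the Dmax range quoted below.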
  • Whatever the type of ranging device 108, there is a maximum object distance, referred to herein as Dmax, above which the ranging device is not capable of estimating the distance. In the example of FIG. 1, Dmax is shown as being a little further from the camera than the distance DT, and is for example in the range 0.5 to 2 m. If the closest object in the field of view of the ranging device 108 is at a distance greater than Dmax, the ranging device 108 will not be capable of detecting the presence of this object and estimating the distance to this object. As will be described in more detail below, a hybrid autofocus solution is for example adopted to use both a ranging device and a passive autofocus technique. In the case that the ranging device 108 attempts to capture a distance reading but fails, the passive autofocus technique is employed. Furthermore, in such a case the ranging device 108 is for example capable of generating error codes and/or of calculating an estimation of Dmax based on at least the ambient light levels.
  • FIG. 2 schematically illustrates the image capture device 102 of FIG. 1 in more detail according to an example embodiment. As illustrated, the device 102 for example comprises a processing device (P) 202, which for example comprises one or more processors under control of software instructions stored in a memory (MEM) 204. The processing device 202 may alternatively be at least partially implemented in hardware, such as by an ASIC (application specific integrated circuit).
  • The processing device 202 is coupled to the ranging device (RD) 108, for example via a suitable interface such as a parallel or serial bus. For example, the processing device 202 provides an enable signal EN to the ranging device 108 when a distance measurement is required, and the distance measurement D is provided by the ranging device 108 back to the processing device 202, for example in the form of an 8-bit distance value. Alternatively, if the attempt by the ranging device 108 to make a distance measurement fails, an estimation of the distance Dmax is for example generated by the ranging device 108 and provided to the processing device 202. Additionally or alternatively, one or more error codes (EC) may be provided by the ranging device 108 indicating conditions relating to the failure of the ranging device 108 to provide the distance estimation.
  • The processing device 202 for example provides a control signal CTRL to the lens unit 105 for controlling the position of at least one lens, and thus controlling the focus of the camera. For example, in some embodiments, the lens unit 105 comprises a VCM (voice coil motor—not illustrated) for positioning at least one lens of the lens unit 105, and a control module (also not illustrated) is provided in the lens unit 105 for converting the digital control signal CTRL into an analog voltage level for driving the VCM. Alternatively, the lens unit 105 could comprise other types of mechanisms for adjusting the focus of the camera 104, such as a stepper motor. In any case, the control signal CTRL for example comprises one of N values for controlling at least one lens of the lens unit to have one of N unique positions, providing N unique focusing positions.
  • The image sensor 106 of the camera 104 for example provides an image I to the processing device 202 for processing. For example, the processing device 202 performs a hybrid autofocusing sequence in which the ranging device 108 is used by preference, but if a valid distance reading cannot be obtained, an iterative autofocus search is performed by processing one or more images I captured by the image sensor 106 to determine when a focused image has been obtained.
  • FIG. 3 is a graph representing an example of the position of a focusing lens of the lens unit 105, expressed as the distance from a reference focus position, for a range of object distances. In particular, for each lens position, there will be a focused distance range for which an object positioned in this range will be considered to be focused on the image sensor. One example of a curve is shown in FIG. 3, corresponding to a specific camera. Indeed, the shape of the curve will depend on the particular camera system, and in particular on aspects such as the number of pixels, the pixel size, the field of view and the lens F number. In the example of FIG. 3, a first lens position, labeled 1, corresponds to a focused distance of 3 m, and is for example a hyperfocal distance suitable for any object distance between 2.25 m and infinity. A second lens position for example corresponds to an object distance of 1.5 m, a third lens position to an object distance of 1 m, a fourth lens position to an object distance of 0.75 m, etc.
  • The object distance range for each lens position is for example stored in a lookup table T stored in the memory 204 of the image capture device 102. The lens positions are for example calibrated for a given device by positioning an object at each of the distances, and adjusting the lens position until focusing has been achieved. Thus the lookup table T allows a distance detected by the ranging device 108 to be converted directly into an appropriate lens position.
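  • The table lookup can be sketched as follows; the boundary values are illustrative, loosely following the FIG. 3 example, and the function and variable names are hypothetical:

```python
import bisect

# Hypothetical contents of the lookup table T: the upper edge of the
# focused distance range (in metres) for each lens position, loosely
# following the FIG. 3 example (position 1 is the hyperfocal position).
RANGE_UPPER_EDGES_M = [0.75, 1.0, 1.5, float("inf")]  # positions 4, 3, 2, 1
LENS_POSITIONS = [4, 3, 2, 1]

def lens_position_for_distance(d_m: float) -> int:
    """Convert a distance from the ranging device into a lens position."""
    i = bisect.bisect_left(RANGE_UPPER_EDGES_M, d_m)
    return LENS_POSITIONS[i]
```

Any distance beyond the last finite boundary maps to the hyperfocal position 1, mirroring the "2.25 m to infinity" range of FIG. 3.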
  • The memory 204 also for example stores distances D1 and D2, which will be described in more detail below. The distance D1 for example corresponds to the object distance above which the first lens position is to be used; it is for example around halfway between the object distances of the first and second lens positions, and is equal to 2.25 m in the example of FIG. 3. The distance D2 is for example the object distance to be used for the sixth lens position, counting from the hyperfocal lens position. Indeed, as will be described in more detail below, there are at least six lens movements in a typical HCS (Hill Climbing Search) autofocus sequence: three for a coarse search, and three for a fine search. Therefore, by choosing the distance D2 such that there are only six remaining potential lens positions, using only a fine autofocus search based on these remaining lens positions will always be equal to or faster than performing a full coarse and fine HCS autofocus sequence. In alternative embodiments, a different choice of the distance D2 would be possible, for example if a different type of search algorithm is employed.
  • FIG. 4 is a flow diagram illustrating an example of operations in an autofocusing method according to an example embodiment. These operations are for example performed by the processing device 202 during an autofocusing sequence, and following a command by the processing device 202 to the ranging device 108 to provide a distance estimation. For example, a user has pressed a button or a touch screen of the image capture device 102 to indicate that they wish to capture a still image or a video sequence, and the autofocusing sequence is launched.
  • From a start point (START), in an operation 400, it is determined whether the ranging device 108 has successfully estimated the distance D to an object in its field of view. If so, in an operation 401, focusing is performed based on this distance. For example, the distance D is converted, using the lookup table T stored in the memory 204, into the control signal CTRL corresponding to an appropriate lens position, and the control signal is applied to the lens unit 105. In some embodiments, a fine autofocus search algorithm may additionally be applied, as described in more detail below. Alternatively, if in operation 400 it is determined that the ranging device 108 failed to estimate a distance to any object, the next operation is 402.
  • In operation 402, the processing device 202 receives, from the ranging device 108, one or more parameters indicating conditions related to the failure of the ranging device 108 to provide the distance estimation. For example, the processing device 202 receives one or more error codes EC and/or the distance Dmax.
  • After operation 402, in an operation 403, an autofocus operation is performed by the processing device 202 based on the one or more parameters received from the ranging device 108. For example, in some embodiments, this may involve performing an iterative autofocus search based on the value of Dmax, or bringing the lens position directly to the hyperfocal position.
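  • The overall flow of FIG. 4 can be summarized as follows; the (distance, params) result shape and the strategy labels are assumptions made for illustration:

```python
def autofocus_dispatch(distance, params):
    """Top-level hybrid autofocus decision (sketch of FIG. 4).

    distance is None when the ranging device failed (operation 400);
    params then carries the error codes EC and/or the Dmax estimate.
    """
    if distance is not None:
        # Operation 401: focus directly from the measured distance.
        return ("direct_focus", distance)
    # Operations 402-403: autofocus guided by the failure parameters.
    return ("guided_search", params)
```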
  • FIG. 5 illustrates an autofocusing sequence in more detail based on the estimation Dmax provided by the ranging device 108.
  • In an operation 501, the host, which is for example the processing device 202, requests a new autofocus sequence.
  • After operation 501, in an operation 502, the distance measurement to be performed by the ranging device is initiated, for example by asserting the enable signal EN shown in FIG. 2.
  • After operation 502, in an operation 503, it is determined whether or not a valid distance has been obtained from the ranging device 108, or whether the distance estimation failed.
  • In the case that a valid distance was obtained in operation 503, in a next operation 504 this distance is converted into a lens position, and then in an operation 505, the lens is moved to the focus position, for example in one shot, followed optionally by a further fine search.
  • Alternatively, in the case that no valid distance was obtained in operation 503, the next operation is 506, in which the distance Dmax is obtained from the ranging device 108. For example, in the case that the ranging device 108 performed distance estimation based on the transmission and detection of light, Dmax is for example calculated by the ranging device 108 based on an ambient light level. In the case of other forms of transmission such as ultrasound, Dmax is for example estimated based on the ambient levels of this transmission frequency. Indeed, high ambient levels at wavelengths interfering with the transmission wavelength are likely to add noise and reduce the performance of the ranging device 108, thereby reducing Dmax. The ranging device 108 may measure the ambient light by detecting the amount of light received by its photosensitive cell when no light is emitted.
  • In some embodiments, the value Dmax is estimated assuming a specific reflectance of the target, for example equal to 17% grey. In some cases, the value of Dmax may be adjusted based on an estimation of the actual color and thus the actual reflectance of the target object, for example by evaluating pixel values close to the center of a captured image. Thus, for example, if the color is white, reflectance may be assumed to be at around 88%, whereas if the color is black, reflectance may be assumed to be only 5%.
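  • One plausible form of this adjustment (an assumption; the description does not give the exact scaling) follows from the signal model used below: if the return signal scales linearly with target reflectance and falls off as the square of distance, then Dmax scales with the square root of reflectance:

```python
import math

ASSUMED_REFLECTANCE = 0.17  # 17% grey, as assumed by the ranging device

def adjust_dmax_for_reflectance(dmax_m: float, reflectance: float) -> float:
    """Rescale Dmax from the assumed 17% grey target to an estimated
    target reflectance (e.g. 0.88 for white, 0.05 for black)."""
    return dmax_m * math.sqrt(reflectance / ASSUMED_REFLECTANCE)
```

Under this model, a white target would extend Dmax by a factor of about 2.3, while a black target would reduce it to roughly half.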
  • In one embodiment, Dmax is calculated based on the following equation:
  • Dmax = sqrt( RetSignalAt0mm × ( 1 − RetAmbient / MaxAmbientRate ) / MinSignalNeeded )
  • where RetAmbient is the ambient light level, for example measured by the ranging device 108, MaxAmbientRate is a maximum level of light before the ranging device 108 is saturated and/or blinded, RetSignalAt0mm is an estimation of the return signal rate when a target is placed at 0 mm, and MinSignalNeeded is the minimum signal rate for obtaining a valid distance.
  • For example, RetSignalAt0mm is determined based on the following equation:
  • RetSignalAt0mm = ReturnRate × DistanceCalibration²
  • where ReturnRate is the return signal rate calibration value, with a target at the calibration distance DistanceCalibration.
  • MinSignalNeeded is for example determined based on the following equation:
  • MinSignalNeeded = ThresholdLimit / MaxConvTime × 1000
  • where ThresholdLimit is the minimum number of events, in other words photon detections, for obtaining a valid distance, expressed in Mcps (mega photon counts per second), and MaxConvTime is the maximum convergence time in ms for providing a distance measurement.
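  • Putting the three equations together, a Dmax estimate can be sketched as below. The square root reflects the square-law distance dependence implied by the RetSignalAt0mm calibration; the function and parameter names are illustrative:

```python
import math

def estimate_dmax_mm(return_rate, distance_calibration_mm,
                     ret_ambient, max_ambient_rate,
                     threshold_limit, max_conv_time_ms):
    """Estimate the maximum ranging distance in mm from calibration and
    ambient-light values, combining the three equations above."""
    ret_signal_at_0mm = return_rate * distance_calibration_mm ** 2
    min_signal_needed = threshold_limit / max_conv_time_ms * 1000.0
    ambient_factor = 1.0 - ret_ambient / max_ambient_rate
    return math.sqrt(ret_signal_at_0mm * ambient_factor / min_signal_needed)
```

A higher ambient level, or a shorter maximum convergence time, shrinks the estimate, consistent with the noise argument given above.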
  • Referring again to FIG. 5, after operation 506, an operation 507 is performed in which Dmax is for example compared to the distance D1. If Dmax is greater than D1, then in a next operation 508, the lens is for example moved to the infinity position, for example corresponding to the hyperfocal distance. This is for example performed in one shot.
  • Alternatively, if in operation 507 Dmax is found to be lower than D1, the next operation is 509.
  • In operation 509, Dmax is compared to a value D2. If Dmax is greater than D2, in a next operation 510, a fine HCS autofocus search is for example performed over a reduced range of lens positions. For example, the search is performed between Dmax and infinity. Due to the choice of D2 as described above in relation to FIG. 3, the search for example takes six steps or fewer. However, in other embodiments, other types of search sequence could be implemented over the reduced range of lens positions. If in operation 509 Dmax is not found to be greater than D2, the next operation is 511.
  • In operation 511, a full iterative autofocus search is for example performed from macro to infinity. This operation for example takes at least six steps, three for a coarse search followed by three for a fine search.
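  • The three-way decision of operations 507 to 511 can be sketched as follows. D1 follows the FIG. 3 example; the numeric value of D2 is hypothetical, since the description defines it only as the object distance of the sixth lens position from the hyperfocal position:

```python
D1_M = 2.25  # hyperfocal threshold, per the FIG. 3 example
D2_M = 0.9   # hypothetical distance of the sixth lens position

def choose_search_strategy(dmax_m: float) -> str:
    """Select the autofocus strategy from the Dmax estimate."""
    if dmax_m > D1_M:
        return "hyperfocal_one_shot"       # operation 508
    if dmax_m > D2_M:
        return "fine_search_dmax_to_inf"   # operation 510
    return "full_coarse_and_fine_search"   # operation 511
```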
  • FIG. 6 is a graph illustrating an example of the full autofocus search performed in operation 511 of FIG. 5. A focus measure, which is for example based on a contrast detection technique applied to a captured image, is provided as an input. As illustrated, during a coarse search represented by hollow dots, the lens position is modified in relatively large increments towards the hyperfocal position until the focus measure starts to fall. A fine search is then performed as represented by solid dots, by moving the lens position by smaller increments.
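  • A focus measure of the kind used here can be computed in many ways; one common contrast-detection choice (an assumption, since the description does not specify one) is the mean squared gradient between neighbouring pixels, which grows as the image gets sharper:

```python
def focus_measure(image):
    """Mean squared horizontal gradient of a grayscale image, given as a
    list of rows of pixel values. Sharper images score higher."""
    total = count = 0
    for row in image:
        for a, b in zip(row, row[1:]):
            total += (b - a) ** 2
            count += 1
    return total / count if count else 0.0
```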
  • In both the coarse and fine search phases, the hill climbing search (HCS) algorithm is for example applied. The HCS algorithm for example involves:
  • i) taking a first focus measure for a first lens position;
  • ii) moving the lens by one increment in a first direction to a second position, where the increment is larger for the coarse search than for the fine search;
  • iii) taking a second focus measure for the second lens position; and either:
  • iv) if the second focus measure increased with respect to the first focus measure, moving the lens position again in the first direction, and repeating this step until the focus measurement decreases; or
  • v) if the second focus measure decreased with respect to the first focus measure, checking on the other side of the first lens position, if any.
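  • The steps i) to v) above can be sketched as follows, for a single search phase with a fixed increment (coarse or fine); the function is illustrative and omits details such as chaining the coarse and fine phases:

```python
def hill_climbing_search(measure, start, step, n_positions):
    """Climb in one direction while the focus measure improves; if the
    first move makes it worse, try the other side of the start position.
    measure(pos) returns the focus measure at lens position pos."""
    pos, best = start, measure(start)
    direction = step
    probe = pos + direction
    if 0 <= probe < n_positions and measure(probe) < best:
        direction = -direction  # step v): check the other side
    nxt = pos + direction
    while 0 <= nxt < n_positions:
        m = measure(nxt)
        if m <= best:           # focus measure fell: the peak is passed
            break
        pos, best = nxt, m
        nxt = pos + direction
    return pos
```

A coarse pass with a large step lands near the peak, after which the same routine can be rerun from that position with a step of one.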
  • FIG. 7 is a graph illustrating an example of the fine autofocus search for example performed in step 510 of FIG. 5. As illustrated, initially the lens is for example taken to the infinity lens position, and then brought back until the focus measure starts to fall.
  • Thus it can be seen from FIGS. 6 and 7 that one or more steps in the autofocus search sequence can be economized by directly performing a fine autofocus search over a reduced range of lens positions, rather than performing a full autofocus search over the whole range.
  • In some embodiments, in addition to, or instead of, using the value of Dmax, the ranging device 108 is capable of generating error codes when a valid distance measurement cannot be generated. Such error codes can be used to reduce the number of steps of the iterative autofocus search sequence. By way of example, one or more of the following error types may be indicated by the ranging device 108 using appropriate error codes:
  • Error type: Raw ranging underflow. This type of error occurs when one or more internal registers in the ranging device are set to a negative value during the ranging operation. This is an error that generally results from the target being either very close to the ranging device, or from the target being particularly bright and being relatively far from the ranging device. These two cases can be differentiated based on the return signal rate. For example, if the return rate is low, the target is determined to be far, whereas if the return rate is high, the target is determined to be close. In the case of a close target, the lens position is for example taken to macro, in other words the lens position corresponding to the shortest focused distance. In the case of a far target, an autofocus search is for example performed between Dmax and infinity, or the hyperfocal position is used directly.
  • Error type: Raw ranging overflow. This means that the buffer of the ranging device 108 for storing the distance measurement has overflowed. Generally, this means that the ranging device 108 must be close to its maximum range, and thus a search can be started from Dmax.
  • Error type: low SNR, meaning that there is high ambient light compared to the signal. In view of the bright conditions, the likelihood is that the camera is outside, and thus the hyperfocal lens position is for example used. Alternatively, an autofocus search is performed from Dmax to infinity.
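  • The error handling above could be dispatched as in the sketch below. The code names and the return-rate threshold are hypothetical; the description specifies only the conditions that each error reports:

```python
def handle_ranging_error(error_code, return_rate, high_rate_threshold):
    """Map a ranging error code to an autofocus strategy."""
    if error_code == "RAW_RANGING_UNDERFLOW":
        # High return rate suggests a very close target; a low rate
        # suggests a bright but distant one.
        if return_rate >= high_rate_threshold:
            return "move_to_macro"
        return "search_dmax_to_infinity"
    if error_code == "RAW_RANGING_OVERFLOW":
        return "search_from_dmax"  # likely near the maximum range
    if error_code == "LOW_SNR":
        return "hyperfocal"        # bright conditions, probably outdoors
    return "full_search"           # unrecognized condition: fall back
```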
  • An advantage of the embodiments described herein is that the speed of the autofocus sequence can be increased by using parameters provided by a ranging device.
  • Having thus described at least one illustrative embodiment, various alterations, modifications and improvements will readily occur to those skilled in the art.
  • For example, while embodiments have been described in which the ranging device is based on photo detection, it will be apparent to those skilled in the art that the principles described herein could be equally applied to other types of ranging detectors.

Claims (17)

1. An autofocus method, comprising:
determining by a processing device of a digital camera having a ranging device that the ranging device failed in an attempt to provide a distance estimation;
receiving by the processing device from the ranging device one or more parameters indicating conditions related to the failure of the ranging device to provide the distance estimation; and
performing by the processing device an autofocus sequence based on the one or more parameters.
2. The autofocus method of claim 1, wherein the one or more parameters comprise one or more of:
an error code from the ranging device; and
a distance value representing an estimation of the maximum distance for which the ranging device is capable of determining a distance estimation.
3. The autofocus method of claim 1, wherein the digital camera comprises at least one lens controllable to have N unique focusing positions, and wherein, based on the one or more parameters, the autofocus sequence comprises an iterative search covering a subset of the N unique focusing positions, an initial lens position of the autofocus sequence being selected based on the one or more parameters.
4. The autofocus method of claim 3, wherein the digital camera comprises an image sensor on which the at least one lens forms an image of an image scene, the method further comprising:
processing by the processing device of images captured by the image sensor to determine focus measures; and
performing the iterative autofocus search based on the focus measures.
5. The autofocus method of claim 3, wherein the one or more parameters comprise a distance value representing an estimation of the maximum distance for which the ranging device is capable of determining a distance estimation.
6. The autofocus method of claim 5, wherein the distance value is determined based on an ambient light measure.
7. The autofocus method of claim 6, further comprising estimating a reflectance of a target object of the autofocus method, and determining the distance value based also on the estimated reflectance of the target object.
8. The autofocus method of claim 7, wherein estimating the reflectance of the target comprises detecting a color of the target object.
9. The autofocus method of claim 5, wherein processing comprises:
comparing the distance value with a first distance threshold; and
if the distance value is higher than the first distance threshold, performing the autofocus operation by selecting a lens position corresponding to infinity.
10. The autofocus method of claim 9, wherein the first distance threshold is a distance corresponding to less than one lens position from the infinity lens position.
11. The autofocus method of claim 5, wherein processing comprises:
comparing the distance value with a second distance threshold, and
if the distance value is higher than the second distance threshold, performing the autofocus operation by performing an iterative autofocus operation with fine steps for a range of lens positions corresponding to distances in the range Dmax to infinity.
12. The autofocus method of claim 11, wherein the second distance threshold is equal to a focusing distance corresponding to six or less lens positions from the lens position corresponding to infinity.
13. An autofocus system, comprising:
a digital camera comprising at least one lens controllable to have N unique focusing positions;
a ranging device configured to estimate a distance to an object; and
a processing device configured to:
determine that the ranging device failed in an attempt to provide a distance estimation;
receive from the ranging device one or more parameters indicating conditions related to the failure of the ranging device to provide the distance estimation; and
perform an autofocus operation based on the one or more parameters.
14. The autofocus system of claim 13, wherein the processing device is configured to perform the autofocus operation by performing an iterative search starting at a lens position selected based on the one or more parameters.
15. The autofocus system of claim 13, wherein the digital camera comprises an image sensor on which the at least one lens forms an image of an image scene, and wherein the processing device is configured to process images captured by the image sensor to determine focus measures and to perform the iterative autofocus search based on the focus measures.
16. The autofocus system of claim 13, further comprising a memory storing a table indicating a mapping between distances in the image scene and corresponding lens positions of the at least one lens for focusing at said distances.
17. The autofocus system of claim 13, wherein the one or more parameters comprise a distance value representing an estimation of the maximum distance for which the ranging device is capable of determining a distance estimation, and wherein the ranging device is configured to generate the distance value based on an ambient light measure.
US14/842,238 2014-12-23 2015-09-01 Hybrid auto-focus mechanism Active US9420163B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1463242A FR3030791A1 (en) 2014-12-23 2014-12-23
FR1463242 2014-12-23

Publications (2)

Publication Number Publication Date
US20160182809A1 true US20160182809A1 (en) 2016-06-23
US9420163B2 US9420163B2 (en) 2016-08-16

Family

ID=53177574


Country Status (2)

Country Link
US (1) US9420163B2 (en)
FR (1) FR3030791A1 (en)


Similar Documents

Publication Publication Date Title
US9420163B2 (en) Hybrid auto-focus mechanism
US7764321B2 (en) Distance measuring apparatus and method
CN107257934B (en) Search range extension for depth-assisted autofocus
US8818055B2 (en) Image processing apparatus, and method, and image capturing apparatus with determination of priority of a detected subject and updating the priority
CN109903324B (en) Depth image acquisition method and device
US9491349B2 (en) Method and apparatus for performing auto focus with multiple images having different exposure times
US20220124252A1 (en) Methods and apparatus for defocus reduction using laser autofocus
US20080239136A1 (en) Focal Length Detecting For Image Capture Device
US9716824B2 (en) Focus detection apparatus and focus detection method
US9918004B2 (en) Camera body capable of driving an image sensor along an optical axis in response to a change in an optical state of an object image
US20150035855A1 (en) Electronic apparatus, method of controlling the same, and image reproducing apparatus and method
US8755600B2 (en) Method and apparatus for determining the light direction
CN106226976A (en) Dual-camera image capturing method, system and terminal
EP2230837B1 (en) Method and apparatus for motion compensation
WO2016062083A1 (en) Focusing method, device and terminal
US9247124B2 (en) Imaging apparatus, semiconductor integrated circuit, and imaging method
JP3761383B2 (en) Automatic focusing device, camera, portable information input device, focusing position detection method, and computer-readable recording medium
US20100086292A1 (en) Device and method for automatically controlling continuous auto focus
US9912858B2 (en) Image capturing apparatus and control method thereof
JP2007328360A (en) Automatic focusing camera and photographing method
US9906724B2 (en) Method and device for setting a focus of a camera
US10582111B2 (en) Systems and methods for autofocus and depth map generation
US20190253607A1 (en) Object tracking autofocus
US8698948B2 (en) Image pickup apparatus and control method configured to provide exposure control
JP4612512B2 (en) Automatic focusing device, camera, portable information input device, focusing position detection method, and computer-readable recording medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: STMICROELECTRONICS ASIA PACIFIC PTE LTD, SINGAPORE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:VARILLON, PAUL;REEL/FRAME:036469/0336

Effective date: 20150831

Owner name: STMICROELECTRONICS (GRENOBLE 2) SAS, FRANCE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PLAZA, LAURENT;LEMARCHAND, OLIVIER;CASCIO, FRANCESCO;SIGNING DATES FROM 20150826 TO 20150827;REEL/FRAME:036469/0229

Owner name: STMICROELECTRONICS (RESEARCH & DEVELOPMENT) LIMITED

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HALL, DUNCAN;LEE, SAM;SIGNING DATES FROM 20150827 TO 20150831;REEL/FRAME:036469/0273

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8