US20160050361A1 - Apparatus and method for automatically adjusting focus in image capturing device - Google Patents

Apparatus and method for automatically adjusting focus in image capturing device

Info

Publication number
US20160050361A1
Authority
US
United States
Prior art keywords
area
image
controller
focus
selecting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/923,971
Inventor
Yong-gu Lee
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd
Priority to US14/923,971
Publication of US20160050361A1
Current status: Abandoned

Classifications

    • H04N5/23212
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/18 Focusing aids
    • G03B13/24 Focusing screens
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H04N23/675 Focus control based on electronic image sensor signals comprising setting of focusing regions
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00 Optical elements other than lenses
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32 Means for focusing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/71 Circuitry for evaluating the brightness variation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/74 Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60 Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N25/61 Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"
    • H04N5/2351
    • H04N5/2354

Abstract

An apparatus and a method for automatically adjusting a focus in an image capturing device are provided. The method includes generating an image by capturing a subject through a lens when a flash unit is enabled, dividing the image into a plurality of windows, determining a brightest window from among the plurality of windows, and setting an Auto Focus (AF) detection area for performing an AF function centering on the determined window.

Description

    CROSS-REFERENCE TO RELATED APPLICATION(S)
  • This application is a continuation application of prior application Ser. No. 13/553,127, filed on Jul. 19, 2012, which issued as U.S. Pat. No. 9,185,283 on Nov. 10, 2015, and which claimed the benefit under 35 U.S.C. §119(a) of a Korean patent application filed on Jul. 20, 2011 in the Korean Intellectual Property Office and assigned Serial number 10-2011-0072179, the entire disclosure of which is hereby incorporated by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates generally to an image capturing device. More particularly, the present invention relates to an apparatus and method for automatically adjusting a focus in an image capturing device.
  • 2. Description of the Related Art
  • Recently, the rapid development of image devices has accelerated the development of image capturing devices, such as cameras and camcorders equipped with an image sensor. These image capturing devices perform an Auto Focus (AF) function by which a subject is automatically in focus. However, since the AF function is performed centering on the central area of an image, an image capturing device cannot perform the AF function on a subject when the subject is not located in the central area of an image.
  • SUMMARY OF THE INVENTION
  • Aspects of the present invention are to address the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present invention is to provide an apparatus and method for performing an AF function on a subject even when the subject is not located in the central area of an image.
  • Another aspect of the present invention is to provide an apparatus and method for perceiving a position of a subject when a flash unit is enabled in a dark environment with a low level of illumination and performing an AF function centering on the perceived position.
  • In accordance with an aspect of the present invention, an apparatus for automatically adjusting a focus in an image capturing device is provided. The apparatus includes an image sensor for generating an image by capturing a subject through a lens, a flash unit for generating a flash of light while the image sensor is capturing the image, and a controller for dividing the image into a plurality of windows, for determining a brightest window from among the plurality of windows, and for setting an Auto Focus (AF) detection area for performing an AF function centering on the determined window.
  • In accordance with another aspect of the present invention, a method of automatically adjusting a focus in an image capturing device is provided. The method includes generating an image by capturing a subject through a lens when a flash unit is enabled, dividing the image into a plurality of windows, determining a brightest window from among the plurality of windows, and setting an Auto Focus (AF) detection area for performing an AF function centering on the determined window.
  • Other aspects, advantages, and salient features of the invention will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses exemplary embodiments of the invention.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other aspects, features, and advantages of certain exemplary embodiments of the present invention will become more apparent from the following description when taken in conjunction with the accompanying drawings, in which:
  • FIG. 1 is a block diagram of an image capturing device according to an exemplary embodiment of the present invention;
  • FIG. 2 illustrates an image divided into a plurality of windows, according to an exemplary embodiment of the present invention;
  • FIG. 3 illustrates an image to which a weight and a brightness value per window are allocated, according to an exemplary embodiment of the present invention;
  • FIG. 4 illustrates an AF detection area set by the image capturing device according to an exemplary embodiment of the present invention;
  • FIG. 5 is a flowchart illustrating a process of setting an Auto Focus (AF) detection area in the image capturing device, according to an exemplary embodiment of the present invention; and
  • FIG. 6 illustrates an AF detection area set centering on a subject by the image capturing device according to an exemplary embodiment of the present invention.
  • Throughout the drawings, it should be noted that like reference numbers are used to depict the same or similar elements, features, and structures.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of exemplary embodiments of the invention as defined by the claims and their equivalents. It includes various specific details to assist in that understanding, but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
  • The terms and words used in the following description and claims are not limited to the bibliographical meanings, but are merely used by the inventor to enable a clear and consistent understanding of the invention. Accordingly, it should be apparent to those skilled in the art that the following description of exemplary embodiments of the present invention is provided for illustration purposes only and not for the purpose of limiting the invention as defined by the appended claims and their equivalents.
  • It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
  • Exemplary embodiments of the present invention are described below with reference to the accompanying drawings. In the following description and the accompanying drawings, well-known functions or constructions are not described in detail since they would obscure the invention with unnecessary detail.
  • According to exemplary embodiments of the present invention, an image capturing device may be a camera, a camcorder, a web camera, a monitoring camera, a medical camera, a high-speed camera, or a 3-dimensional camera or may be included in wireless terminals. The wireless terminals may be portable electronic devices, such as videophones, cellular phones, smart phones, International Mobile Telecommunication 2000 (IMT-2000) terminals, Wideband Code Division Multiple Access (WCDMA) terminals, Universal Mobile Telecommunication Service (UMTS) terminals, Personal Digital Assistants (PDAs), Portable Multimedia Players (PMPs), Digital Multimedia Broadcasting (DMB) terminals, laptop computers, and tablet PCs.
  • FIG. 1 is a block diagram of an image capturing device according to an exemplary embodiment of the present invention.
  • Referring to FIG. 1, the image capturing device may include a controller 101, a flash unit 103, an image sensor 105, a black-level correction unit 107, a digital gain correction unit 109, and a lens shading correction unit 111. The image capturing device may include additional units not described here for the sake of convenience. Such additional components may vary depending on the nature of the image capturing device; for example, a cellular phone may include a communication unit for transmitting and receiving data. Similarly, the units of the image capturing device may be integrated into a single component.
  • The flash unit 103 is a portable lighting device that generates a flash of light during capture so that an image can be captured brightly in a dark environment; it fires under control of the controller 101. Examples of the flash unit 103 include a Light Emitting Diode (LED) flash unit, which generates a flash of light using an LED, and a xenon flash unit, which generates a flash of light using a xenon lamp. The dark environment is an environment with a low level of illumination, for example, at night.
  • The image sensor 105 captures an image of a subject through a lens and outputs the image to the black-level correction unit 107 on a frame basis. Examples of the image sensor 105 include a Complementary Metal Oxide Semiconductor (CMOS) sensor, a Charge Coupled Device (CCD) sensor, a Foveon sensor, and a complementary image sensor. The black-level correction unit 107 receives the image from the image sensor 105, corrects a black level of the image, and outputs the black-level-corrected image to the digital gain correction unit 109.
  • The digital gain correction unit 109 receives the black-level-corrected image from the black-level correction unit 107, corrects a digital gain of the black-level-corrected image to automatically adjust brightness under control of the controller 101, and outputs the gain-corrected image to the lens shading correction unit 111. The lens shading correction unit 111 receives the gain-corrected image from the digital gain correction unit 109, corrects lens shading on the gain-corrected image, and outputs the lens-shading-corrected image to the controller 101. The lens shading correction unit 111 may perform the lens shading correction along a predetermined shading correction curve when brightness in an edge area of an image is lower (darker) than that in the central area according to characteristics of the image sensor 105. Lens shading refers to a phenomenon in which the sensitivity across an image is non-uniform because light intensity falls off toward the periphery of the lens system in the image capturing device.
  • As described above, the digital gain correction unit 109 for correcting a digital gain of an image and the lens shading correction unit 111 for correcting lens shading on an image are components for performing pre-processing of an image output from the image sensor 105.
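As a concrete illustration of this pre-processing chain, the sketch below applies black-level, digital-gain, and lens-shading corrections to a single frame. The parameter values, the 8-bit output range, and the use of a precomputed shading-gain map are assumptions for illustration; the patent does not specify how units 107, 109, and 111 are implemented.

```python
import numpy as np

def preprocess_frame(raw, black_level=16, digital_gain=1.5, shading_gain=None):
    """Illustrative pre-processing of one frame: black-level correction (unit 107),
    digital gain correction (unit 109), and lens shading correction (unit 111).
    All numeric values are assumptions, not taken from the patent."""
    frame = raw.astype(np.float32)
    frame = np.clip(frame - black_level, 0, None)    # subtract the sensor black level
    frame = frame * digital_gain                     # digital gain to adjust brightness
    if shading_gain is not None:
        frame = frame * shading_gain                 # per-pixel gain map brightening darker edges
    return np.clip(frame, 0, 255).astype(np.uint8)   # assume an 8-bit output range
```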
  • The controller 101 controls the general operation of the image capturing device. For example, the controller 101 controls an exposure time and an analog gain of the image sensor 105, controls a digital gain of the digital gain correction unit 109, and controls ON/OFF of the flash unit 103.
  • FIG. 2 illustrates an image divided into a plurality of windows, according to an exemplary embodiment of the present invention.
  • Referring to FIG. 2, when the flash unit 103 is turned on (or enabled), the controller 101 receives an image from the lens shading correction unit 111 after the flash unit 103 is turned on and divides the image into a plurality of windows. For example, the controller 101 may divide an image 201 into M×N windows, wherein M and N are integers.
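One way to realize this division, assuming the image is available as a 2-D luminance array, is sketched below; the grid dimensions M and N and the tiling scheme are illustrative choices, not values fixed by the patent.

```python
import numpy as np

def split_into_windows(image, m, n):
    """Divide a 2-D luminance image into an m x n grid of windows (illustrative)."""
    h, w = image.shape
    ys = np.linspace(0, h, m + 1, dtype=int)   # row boundaries of the windows
    xs = np.linspace(0, w, n + 1, dtype=int)   # column boundaries of the windows
    return [[image[ys[i]:ys[i + 1], xs[j]:xs[j + 1]] for j in range(n)]
            for i in range(m)]
```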
  • FIG. 3 illustrates an image to which a weight and a brightness value per window are allocated, according to an exemplary embodiment of the present invention.
  • The controller 101 calculates brightness values corresponding to the plurality of windows. The brightness values may be calculated using any one of illumination, luminosity, and luminous intensity. For example, the controller 101 may calculate brightness values Yij of the M×N windows of an image 303.
  • The controller 101 searches for a weight pre-set by a user from among window-based weights allocated to brightness values, which are stored in a memory unit (not shown). The window-based weights are weights previously stored in the memory unit, including a first weight for adding a brightness weight to windows located in the central area of an image, a second weight for adding a brightness weight to windows located in left and right areas of the image except for the central area, and a third weight for adding a brightness weight to windows located in a plurality of areas of the image. For example, the controller 101 may search for window-based weights Wij of the M×N windows of the image 301.
  • The controller 101 generates window-based weighted brightness values based on the calculated window-based brightness values and the pre-set window-based weights. The controller 101 calculates window-based weighted brightness values by multiplying the calculated window-based brightness values by the pre-set window-based weights on a window-to-window basis. For example, the controller 101 may calculate M×N window-based weighted brightness values by multiplying the M×N window-based brightness values Yij by the M×N window-based weights Wij, respectively.
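Building on the windowing sketch above, the weighted brightness values could be computed as follows. Using the per-window mean luminance as Yij is an assumption; the patent only states that brightness may be derived from illumination, luminosity, or luminous intensity.

```python
import numpy as np

def weighted_brightness(image, weights, m, n):
    """Compute Y_ij per window and multiply by the pre-set weights W_ij.
    Mean luminance as the brightness measure is an illustrative choice."""
    windows = split_into_windows(image, m, n)          # helper from the sketch above
    y = np.array([[win.mean() for win in row] for row in windows],
                 dtype=np.float32)                     # Y_ij, shape (m, n)
    w = np.asarray(weights, dtype=np.float32)          # W_ij, shape (m, n)
    return y * w                                       # weighted values Y_ij * W_ij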
  • The controller 101 selects a maximum weighted brightness value from among the weighted brightness values corresponding to the plurality of windows and compares the selected maximum weighted brightness value with a predetermined reference brightness value YTH. If the selected maximum weighted brightness value is greater than the predetermined reference brightness value YTH, the controller 101 compares the selected maximum weighted brightness value with a predetermined saturated brightness value.
  • If the selected maximum weighted brightness value is less than the predetermined saturated brightness value, the controller 101 sets an Auto Focus (AF) detection area centering on a window corresponding to the selected maximum weighted brightness value. If the selected maximum weighted brightness value is equal to or greater than the predetermined saturated brightness value, the controller 101 excludes the selected maximum weighted brightness value from the weighted brightness values corresponding to the plurality of windows, reselects a maximum weighted brightness value from among the weighted brightness values except for the selected maximum weighted brightness value, and compares the reselected maximum weighted brightness value with the predetermined reference brightness value YTH.
  • The predetermined saturated brightness value is a reference value to determine whether brightness of any window is saturated, and may be defined by Equation 1.

  • Ysaturation = 95% × maximum brightness level   (1)
  • Ysaturation denotes the predetermined saturated brightness value, and the maximum brightness level indicates the highest weighted brightness value which can be calculated as a maximum weighted brightness value of any window. If necessary, when the maximum brightness level is changed, the predetermined saturated brightness value may also be changed.
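For example, if the weighted brightness values are kept in an 8-bit range so that the maximum brightness level is 255 (an assumption made only for illustration), Equation 1 gives Ysaturation = 95% × 255 ≈ 242; a window whose weighted brightness reaches that level would be treated as saturated and excluded from selection.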
  • If the selected maximum weighted brightness value is equal to or less than the predetermined reference brightness value YTH, the controller 101 sets an AF detection area centering on a window located at the center of the image. The AF detection area indicates an area for performing an AF function in the image capturing device, and the AF function indicates a function by which a subject is automatically in focus.
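The selection logic described in the preceding paragraphs can be summarized by the following sketch; the function name and the representation of the weighted values as a NumPy array are assumptions, and the controller may realize the same decisions differently.

```python
import numpy as np

def pick_af_window(weighted, y_th, y_saturation):
    """Pick the window on which to center the AF detection area (sketch).
    Falls back to the central window when no value exceeds the reference y_th."""
    values = np.asarray(weighted, dtype=np.float32).copy()
    while True:
        idx = np.unravel_index(np.argmax(values), values.shape)
        best = values[idx]
        if best <= y_th:                                   # nothing bright enough
            return (values.shape[0] // 2, values.shape[1] // 2)
        if best < y_saturation:                            # bright but not saturated
            return idx
        values[idx] = -np.inf                              # saturated: exclude and reselect
```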
  • FIG. 4 illustrates an AF detection area set by the image capturing device according to an exemplary embodiment of the present invention.
  • Referring to FIG. 4, if a window 403 from among M×N windows 401 corresponds to a maximum weighted brightness value, the controller 101 may set an AF detection area 407 centering on the window 403 in an image 405. The controller 101 captures an image by considering the set AF detection area 407 and stores the captured image in the memory unit.
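To make the geometry explicit, a window index can be mapped to a pixel rectangle centered on that window, as in the sketch below. The size of the AF detection area relative to a window is an assumption; the patent only states that the area is centered on the selected window.

```python
def af_area_for_window(i, j, image_shape, m, n, area_size=None):
    """Return (left, top, right, bottom) of an AF detection area centered on
    window (i, j) of an m x n grid; the default area size (one window) is assumed."""
    h, w = image_shape
    cy, cx = int((i + 0.5) * h / m), int((j + 0.5) * w / n)   # window center in pixels
    ah, aw = area_size if area_size else (h // m, w // n)
    top, left = max(cy - ah // 2, 0), max(cx - aw // 2, 0)
    return (left, top, min(left + aw, w), min(top + ah, h))
```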
  • FIG. 5 is a flowchart illustrating a process of setting an AF detection area in the image capturing device, according to an exemplary embodiment of the present invention.
  • Referring to FIG. 5, in step 501, the controller 101 determines whether the flash unit 103 is enabled. If the flash unit 103 is enabled, the controller 101 proceeds to step 503; if the flash unit 103 is not enabled, the controller 101 repeatedly performs step 501. The controller 101 enables the flash unit 103 at a user's request or according to a predetermined condition. The predetermined condition may be that the flash unit 103 is automatically enabled if a brightness value of the environment outside the image capturing device is equal to or less than a predetermined reference value, or that the flash unit 103 is enabled at a predetermined time.
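Read literally, the flash-enable decision of step 501 could be expressed as below; the threshold comparison and the time-based rule are one possible interpretation of the predetermined conditions, not the patent's implementation.

```python
def should_enable_flash(user_requested, ambient_brightness, reference_value,
                        current_time=None, flash_time=None):
    """Decide whether the flash unit should be enabled (illustrative)."""
    if user_requested:                          # enabled by a request of the user
        return True
    if ambient_brightness <= reference_value:   # dark environment: enable automatically
        return True
    return flash_time is not None and current_time == flash_time  # predetermined time
```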
  • In step 503, the controller 101 receives an image generated after enabling of the flash unit 103 from the lens shading correction unit 111, divides the received image into a plurality of windows, and proceeds to step 505.
  • In step 505, the controller 101 calculates weighted brightness values corresponding to the plurality of windows. The controller 101 calculates brightness values corresponding to the plurality of windows and searches for weights pre-set by the user from among window-based weights for brightness values, which are previously stored in the memory unit. The brightness values may be calculated using any one of illumination, luminosity, and luminous intensity. The controller 101 calculates window-based weighted brightness values by multiplying the calculated window-based brightness values by the pre-set window-based weights on a window-to-window basis.
  • In step 507, the controller 101 selects a maximum weighted brightness value from among the weighted brightness values corresponding to the plurality of windows.
  • In step 509, the controller 101 compares the selected maximum weighted brightness value with a predetermined reference brightness value. If the selected maximum weighted brightness value is greater than the predetermined reference brightness value, the controller 101 proceeds to step 511. If the selected maximum weighted brightness value is equal to or less than the predetermined reference brightness value, the controller 101 proceeds to step 513.
  • In step 511, the controller 101 compares the selected maximum weighted brightness value with a predetermined saturated brightness value. If the selected maximum weighted brightness value is less than the predetermined saturated brightness value, the controller 101 proceeds to step 515. If the selected maximum weighted brightness value is equal to or greater than the predetermined saturated brightness value, the controller 101 proceeds to step 517. The predetermined saturated brightness value is a reference value to determine whether brightness of any window is saturated and may be defined by Equation 1.
  • In step 517, the controller 101 excludes the current (selected) maximum weighted brightness value from the weighted brightness values corresponding to the plurality of windows, reselects a maximum weighted brightness value from among the weighted brightness values except for the current maximum weighted brightness value, and returns to step 509.
  • In step 515, the controller 101 sets an AF detection area centering on a window corresponding to the maximum weighted brightness value. The AF detection area indicates an area for performing an AF function in the image capturing device, and the AF function indicates a function by which a subject is automatically in focus.
  • In step 513, the controller 101 sets an AF detection area centering on a window located at the center of the image. In step 519, the controller 101 captures an image by considering the set AF detection area and stores the captured image. By performing the foregoing steps, the image capturing device may capture an image by automatically focusing on a subject located in a bright area of the image.
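Putting the pieces together, the overall flow of FIG. 5 might look like the driver sketch below. The `camera` object and its methods (flash_enabled, preview_frame, set_af_area, capture) are hypothetical names introduced only to tie the earlier sketches together.

```python
def set_af_area_and_capture(camera, weights, m, n, y_th, y_saturation):
    """Sketch of the flow of FIG. 5 using the helper sketches above; the
    camera interface is hypothetical."""
    while not camera.flash_enabled():                                # step 501
        pass
    image = camera.preview_frame()                                   # step 503: frame after flash on
    weighted = weighted_brightness(image, weights, m, n)             # step 505
    i, j = pick_af_window(weighted, y_th, y_saturation)              # steps 507-517
    camera.set_af_area(af_area_for_window(i, j, image.shape, m, n))  # steps 513/515
    return camera.capture()                                          # step 519: capture and store
```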
  • FIG. 6 illustrates an AF detection area set centering on a subject by the image capturing device according to an exemplary embodiment of the present invention.
  • An image 601 indicates an image before an AF detection area is set, and an image 603 indicates an image after an AF detection area 605 is set. If a subject in the image 601 is located in the brightest area in the image 601, the controller 101 may set the AF detection area 605 centering on the subject located in the image 603.
  • According to exemplary embodiments of the present invention, if the flash unit 103 is enabled in a dark environment, the image capturing device perceives a position of a subject using brightness values of windows in an image and sets an AF detection area centering on the perceived position. The image capturing device perceives that the subject is located at the brightest window from among the windows.
  • As is apparent from the foregoing description, the AF function can be performed on a subject even when the subject is not located in the central area of an image.
  • In addition, when a flash unit is enabled in a dark environment with a low level of illumination, a position of a subject can be perceived, and an AF function can be performed centering on the perceived position.
  • While the invention has been shown and described with reference to certain exemplary embodiments thereof, such as a mobile communication terminal, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims and their equivalents.

Claims (20)

What is claimed is:
1. An apparatus comprising:
a flash unit;
an image sensor; and
a controller configured to:
obtain an image corresponding to one or more external objects using the image sensor, the obtaining including flashing a light using the flash unit;
divide at least one portion of the image into a plurality of areas;
select at least one area from the plurality of areas based at least in part on a characteristic of the at least one area, the characteristic to be determined based at least in part on the light; and
set at least one portion of the at least one area as a focus area for the image.
2. The apparatus of claim 1, wherein the controller is further configured to:
adjust a size or a shape of the focus area based at least in part on the at least one area.
3. The apparatus of claim 1, wherein the controller is further configured to:
identify a brightness, a contrast, or an illumination corresponding to the at least one area as part of the characteristic.
4. The apparatus of claim 3, wherein the controller is further configured to:
select the at least one area based at least in part on a weight value determined according to an outcome of a mathematical function applied to the brightness, the contrast, or the illumination.
5. The apparatus of claim 4, wherein the controller is further configured to:
select the at least one area based at least in part on a determination that the weight value satisfies a specified reference value.
6. The apparatus of claim 1, wherein the controller is further configured to:
select the at least one area based at least in part on a location of the at least one area.
7. The apparatus of claim 1, wherein the controller is further configured to:
select the at least one area based at least in part on a distance between a center of the image and the at least one area.
8. The apparatus of claim 1, wherein the controller is further configured to:
perform an auto-focus function with respect to at least one external object of the one or more external objects in the at least one area.
9. The apparatus of claim 8, wherein the controller is further configured to:
capture the image as the auto-focus function is being performed with respect to the at least one area.
10. The apparatus of claim 1, wherein the controller is further configured to:
adjust an exposure time corresponding to the one or more external objects based at least in part on the light.
11. A method comprising:
obtaining an image corresponding to one or more external objects using an image sensor, wherein obtaining the image includes flashing a light using a flash unit;
dividing at least one portion of the image into a plurality of areas;
selecting at least one area from the plurality of areas based at least in part on a characteristic of the at least one area, the characteristic to be determined based at least in part on the light;
setting at least one portion of the at least one area as a focus area for the image; and
performing an autofocus function based on the focus area.
12. The method of claim 11, wherein setting the at least one portion comprises:
adjusting a size or a shape of the focus area based at least in part on the at least one area.
13. The method of claim 11, wherein selecting the at least one area comprises:
identifying a brightness, a contrast, or an illumination corresponding to the at least one area as part of the characteristic.
14. The method of claim 13, wherein selecting the at least one area comprises:
selecting the at least one area based at least in part on a weight value determined according to an outcome of a mathematical function applied to the brightness, the contrast, or the illumination.
15. The method of claim 14, wherein the selecting the at least one area comprises:
selecting the at least one area based at least in part on a determination that the weight value satisfies a specified reference value.
16. The method of claim 11, wherein the selecting the at least one area comprises:
selecting the at least one area based at least in part on a distance between a center of the image and the at least one area.
17. The method of claim 11, wherein performing the autofocus function comprises:
performing an auto-focus function with respect to at least one external object of the one or more external objects in the at least one area.
18. The method of claim 17, wherein performing the autofocus function comprises:
capturing the image as the auto-focus function is being performed with respect to the at least one area.
19. The method of claim 11, wherein obtaining the image comprises:
adjusting an exposure time corresponding to the one or more external objects based at least in part on an amount, an intensity, or a duration of the light.
20. A non-transitory machine-readable storage device storing instructions that, when executed by one or more computer processors, cause the one or more computer processors to perform operations comprising:
obtaining an image corresponding to one or more external objects using an image sensor, the obtaining including flashing a light using a flash unit;
dividing at least one portion of the image into a plurality of areas;
selecting at least one area from the plurality of areas based at least in part on a characteristic of the at least one area, the characteristic to be determined based at least in part on the light;
setting at least one portion of the at least one area as a focus area for the image; and
performing an autofocus function based on the focus area.
US14/923,971 2011-07-20 2015-10-27 Apparatus and method for automatically adjusting focus in image capturing device Abandoned US20160050361A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/923,971 US20160050361A1 (en) 2011-07-20 2015-10-27 Apparatus and method for automatically adjusting focus in image capturing device

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2011-0072179 2011-07-20
KR1020110072179A KR101971535B1 (en) 2011-07-20 2011-07-20 Apparatus and method for adjusting auto focus in image taking device
US13/553,127 US9185283B2 (en) 2011-07-20 2012-07-19 Apparatus and method for automatically adjusting focus in image capturing device
US14/923,971 US20160050361A1 (en) 2011-07-20 2015-10-27 Apparatus and method for automatically adjusting focus in image capturing device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/553,127 Continuation US9185283B2 (en) 2011-07-20 2012-07-19 Apparatus and method for automatically adjusting focus in image capturing device

Publications (1)

Publication Number Publication Date
US20160050361A1 true US20160050361A1 (en) 2016-02-18

Family

ID=46679108

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/553,127 Active US9185283B2 (en) 2011-07-20 2012-07-19 Apparatus and method for automatically adjusting focus in image capturing device
US14/923,971 Abandoned US20160050361A1 (en) 2011-07-20 2015-10-27 Apparatus and method for automatically adjusting focus in image capturing device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/553,127 Active US9185283B2 (en) 2011-07-20 2012-07-19 Apparatus and method for automatically adjusting focus in image capturing device

Country Status (3)

Country Link
US (2) US9185283B2 (en)
EP (1) EP2549740B1 (en)
KR (1) KR101971535B1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105306803A (en) * 2014-07-31 2016-02-03 深圳市爵影科技有限公司 Mobile equipment wireless flash triggering method
FR3041134B1 (en) 2015-09-10 2017-09-29 Parrot DRONE WITH FRONTAL VIEW CAMERA WHOSE PARAMETERS OF CONTROL, IN PARTICULAR SELF-EXPOSURE, ARE MADE INDEPENDENT OF THE ATTITUDE.
JP6525829B2 (en) * 2015-09-11 2019-06-05 キヤノン株式会社 Control device, imaging device, control method, program, and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5343268A (en) * 1992-11-09 1994-08-30 Konica Corporation Camera with strobe capable of adjusting light emitting amount
JP2010107862A (en) * 2008-10-31 2010-05-13 Hoya Corp Af-bracket photographic method and device

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4196997A (en) * 1978-01-13 1980-04-08 Nippon Kogaku K.K. Interchangeable lens for a camera
US5526088A (en) * 1992-03-05 1996-06-11 Nikon Corporation Focus detection device
JP4063418B2 (en) * 1998-09-11 2008-03-19 イーストマン コダック カンパニー Auto white balance device
US7342609B2 (en) * 2000-05-09 2008-03-11 Eastman Kodak Company Exposure adjustment in an imaging apparatus
JP4203380B2 (en) * 2003-08-29 2008-12-24 富士フイルム株式会社 Portable device with camera function and imaging method thereof
JP4431532B2 (en) * 2005-09-16 2010-03-17 富士フイルム株式会社 Target image position detecting device and method, and program for controlling target image position detecting device
TWI386042B (en) * 2008-12-17 2013-02-11 Altek Corp Digital camera device and its brightness correction method
US8989436B2 (en) * 2010-03-30 2015-03-24 Nikon Corporation Image processing method, computer-readable storage medium, image processing apparatus, and imaging apparatus
US8773577B2 (en) * 2010-10-27 2014-07-08 Qualcomm Incorporated Region of interest extraction
US20120177353A1 (en) * 2011-01-11 2012-07-12 Michael Dowell Laser Point of View and Camera Focus Assist Device and Method of Use

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5343268A (en) * 1992-11-09 1994-08-30 Konica Corporation Camera with strobe capable of adjusting light emitting amount
JP2010107862A (en) * 2008-10-31 2010-05-13 Hoya Corp Af-bracket photographic method and device

Also Published As

Publication number Publication date
KR101971535B1 (en) 2019-08-14
US9185283B2 (en) 2015-11-10
KR20130011201A (en) 2013-01-30
EP2549740A1 (en) 2013-01-23
US20130021520A1 (en) 2013-01-24
EP2549740B1 (en) 2019-09-04

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION