US7916207B2 - Apparatus and method for generating focus data in an image sensor - Google Patents

Apparatus and method for generating focus data in an image sensor

Info

Publication number
US7916207B2
US7916207B2
Authority
US
United States
Prior art keywords
green
focus data
plane
green plane
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US11/319,554
Other versions
US20060146151A1 (en)
Inventor
Ji-Hye Moon
Jang-Sik Moon
Jeong-Guk Lee
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intellectual Ventures II LLC
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Assigned to MAGNA CHIP SEMICONDUCTOR, LTD. reassignment MAGNA CHIP SEMICONDUCTOR, LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LEE, JEONG-GUK, MOON, JANG-SIK, MOON, JI-HYE
Publication of US20060146151A1 publication Critical patent/US20060146151A1/en
Assigned to Crosstek Capital, LLC reassignment Crosstek Capital, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MAGNACHIP SEMICONDUCTOR, LTD.
Application granted granted Critical
Publication of US7916207B2 publication Critical patent/US7916207B2/en
Assigned to INTELLECTUAL VENTURES II LLC reassignment INTELLECTUAL VENTURES II LLC MERGER (SEE DOCUMENT FOR DETAILS). Assignors: Crosstek Capital, LLC
Expired - Fee Related legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H04N23/673 Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H04N23/675 Focus control based on electronic image sensor signals comprising setting of focusing regions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/84 Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/843 Demosaicing, e.g. interpolating colour pixel values


Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Color Television Image Signal Generators (AREA)
  • Studio Devices (AREA)

Abstract

An apparatus for generating focus data in an image sensor includes a green interpolation unit for generating an M×N green plane from the RGB Bayer pattern of a predetermined image window through a green interpolation, and a focus data generation unit for extracting the focus data from the M×N green plane, wherein M and N are positive integers.

Description

The present application contains subject matter related to the Korean patent application No. KR 2004-115989, filed in the Korean Patent Office on Dec. 30, 2004, the entire contents of which are incorporated herein by reference.
FIELD OF THE INVENTION
The present invention relates to auto focus control of an image sensor and, more particularly, to an apparatus for generating focus data in an image sensor having an RGB Bayer color pattern by using only green information, and a method for generating the same.
DESCRIPTION OF RELATED ARTS
Recently, a complementary metal oxide semiconductor (CMOS) image sensor has been widely used in devices such as mobile phones, personal computer cameras, and other electronic devices. The CMOS image sensor has several merits: its driving method is simpler than that of the charge coupled device (CCD) conventionally used as the image sensor, and its signal processing circuit can be integrated on a single chip, enabling further miniaturization of the module as a system on chip (SOC).
In addition, since established CMOS technology can be used compatibly in fabricating the CMOS image sensor, manufacturing cost can be reduced.
Meanwhile, auto focus control has become an essential function of modern image sensor systems. How sharply the focus can be adjusted under various environments has therefore become a criterion for evaluating an image sensor, and the number of image sensor systems providing auto focus control has increased accordingly.
FIG. 1 is a block diagram setting forth a conventional apparatus for generating a focus data in an image sensor.
Referring to FIG. 1, the conventional apparatus for generating the focus data includes an RGB interpolation unit 100 for performing an RGB interpolation using an RGB Bayer pattern, a color space converter 101 for performing a color space conversion to extract a luminance (‘Y’) component from the interpolated RGB data, and a focus data generation unit 102 for generating the focus data using the ‘Y’ value extracted through the color space conversion.
In the conventional apparatus having the above constitution, the focus data is generated by extracting the ‘Y’ luminance value from the RGB domain after the RGB interpolation unit 100 interpolates the Bayer pattern so that every pixel has RGB data. In this case, the conventional apparatus requires the interpolation unit 100 to generate the focus data and the color space converter 101 to extract the ‘Y’ value.
Therefore, additional blocks for the interpolation and the color space conversion must be installed in the conventional apparatus for generating the focus data, so additional hardware resources are needed, which in turn increases power consumption.
SUMMARY OF THE INVENTION
It is, therefore, an object of the present invention to provide an apparatus for generating focus data in an image sensor that avoids increasing hardware resources and power consumption when extracting the focus data, and a method for generating the same.
In accordance with an aspect of the present invention, there is provided an apparatus for generating focus data in an image sensor, the apparatus including: a green interpolation unit for generating an M×N green plane from the RGB Bayer pattern of a predetermined image window through a green interpolation; and a focus data generation unit for extracting the focus data from the M×N green plane, wherein M and N are positive integers.
In accordance with another aspect of the present invention, there is provided a method for generating focus data in an image sensor, the method including: selecting an image window for obtaining the focus data from an RGB Bayer pattern; performing a green interpolation for generating an M×N green plane using the RGB Bayer pattern in the selected window; and extracting the focus data from the M×N green plane if no bad pixel exists, wherein M and N are positive integers.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and other objects and features of the present invention will become better understood with respect to the following description of the preferred embodiments given in conjunction with the accompanying drawings, in which:
FIG. 1 is a block diagram setting forth a conventional apparatus for generating a focus data in an image sensor;
FIG. 2 is a block diagram setting forth an apparatus for generating a focus data in an image sensor in accordance with an embodiment of the present invention;
FIG. 3 is a flow chart setting forth a procedure for generating the focus data of the image sensor in accordance with an embodiment of the present invention;
FIG. 4 is a graph showing spectral characteristic of red (R), green (G), and blue (B);
FIG. 5 is a drawing illustrating a green interpolation; and
FIG. 6 is a drawing setting forth a convolution of a green plane and a Laplacian mask for generating a boundary value.
DETAILED DESCRIPTION OF THE INVENTION
An apparatus for generating a focus data in an image sensor and a method for generating the same in accordance with exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings.
FIG. 2 is a block diagram setting forth an apparatus for generating a focus data in an image sensor in accordance with an embodiment of the present invention.
Referring to FIG. 2, the apparatus for generating the focus data in accordance with the present invention includes a green interpolation unit 200 and a focus data generation unit 201. The green interpolation unit 200 selects a window for obtaining the focus data from an RGB Bayer pattern and generates an M×N green plane (M and N are positive integers) from the Bayer pattern in the selected window for use in generating the focus data. The focus data generation unit 201 extracts the focus data by extracting a boundary value having a high frequency component from the M×N green plane.
FIG. 3 is a flow chart setting forth a procedure for generating the focus data of the image sensor in accordance with an embodiment of the present invention.
The procedure for generating the focus data will now be described in detail with reference to FIGS. 2 and 3.
First, after the window for generating the focus data is selected from the RGB Bayer pattern (S301), an interpolation is performed using information from the peripheral green pixels of the RGB Bayer pattern in the selected window (S302).
At this time, the M×N green plane may be obtained from an (M+1)×(N+1) Bayer pattern or from an M×N Bayer pattern.
FIG. 4 is a graph showing spectral characteristics of red (R), green (G), and blue (B).
In detail, FIG. 4 shows the relative response versus wavelength of R, G, and B. It can be seen that green (G) carries information related to both red (R) and blue (B) because its wavelength band lies between those of red (R) and blue (B).
The following equation 1 represents the luminance value extracted from the RGB domain.
Y=0.299×R+0.587×G+0.114×B  [Eq. 1]
From equation 1 above, it can be seen that the weight for green information is relatively high compared to the weights for red and blue. Therefore, in the present invention, it is possible to generate accurate focus data using only the green information, without an additional process for extracting a luminance component, as will be set forth more fully later.
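As a quick numerical illustration of this point, the short Python snippet below (sample values are hypothetical, not taken from the patent) evaluates Eq. 1 and notes that the green weight exceeds the red and blue weights combined:

```python
def luminance(r: float, g: float, b: float) -> float:
    """Eq. 1: Y = 0.299*R + 0.587*G + 0.114*B."""
    return 0.299 * r + 0.587 * g + 0.114 * b

# Green's weight (0.587) is larger than red's and blue's combined (0.413),
# which is the rationale for measuring focus on green information alone.
r, g, b = 120.0, 200.0, 90.0          # hypothetical 8-bit sample values
print(luminance(r, g, b))             # ~163.54
```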
FIG. 5 is a drawing illustrating a green interpolation.
Herein, for illustrative purposes, a 3×3 green plane is obtained through the green interpolation using a window having a 3×3 Bayer pattern.
Referring to FIG. 5(a), the locations (1,1), (1,3), (2,2), (3,1), and (3,3) have green information, whereas the other locations do not. Thus, the green interpolation is performed over the locations having no green information using the following equation 2.
G12=(G11+G22+G13)/3
G21=(G11+G22+G31)/3
G23=(G13+G22+G33)/3
G32=(G22+G31+G33)/3  [Eq. 2]
That is, each pixel having no green information, i.e., G12, G21, G23, and G32, takes the mean value of the green values of its nearest-neighboring pixels that have green information.
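A minimal Python sketch of this 3×3 case follows (the function name and numpy array layout are illustrative, not from the patent); it assumes the window of FIG. 5, in which the corners and the center carry the original green samples, and fills the remaining locations according to Eq. 2:

```python
import numpy as np

def interpolate_green_3x3(bayer_window: np.ndarray) -> np.ndarray:
    """Build a 3x3 green plane from a 3x3 Bayer window whose corner and
    center locations hold green samples, following Eq. 2 (1-indexed G11..G33
    map to 0-indexed array positions)."""
    g = bayer_window.astype(float).copy()
    g11, g13 = g[0, 0], g[0, 2]
    g22 = g[1, 1]
    g31, g33 = g[2, 0], g[2, 2]
    # Each location without green takes the mean of its three nearest green samples.
    g[0, 1] = (g11 + g22 + g13) / 3.0   # G12
    g[1, 0] = (g11 + g22 + g31) / 3.0   # G21
    g[1, 2] = (g13 + g22 + g33) / 3.0   # G23
    g[2, 1] = (g22 + g31 + g33) / 3.0   # G32
    return g

# Example with hypothetical green samples at the corners and center:
window = np.array([[10, 0, 14],
                   [ 0, 12, 0],
                   [11, 0, 13]])
print(interpolate_green_3x3(window))
```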
The Bayer data may already have undergone a bad pixel cancellation process. If it has not, it is first determined whether any bad pixel exists, in order to reduce the effect of bad pixels on the focus data (S303); if no bad pixel exists, the focus data is generated using an edge detection filter (S304).
If, as a result of the determination, a bad pixel exists, the green interpolation is performed again (S302) after a bad pixel compensation is performed (S305).
The bad pixel determination and the bad pixel compensation will be described with reference to the following equation 3.
G22=(G11+G13+G31+G33)/4  [Eq. 3]
In detail, if the G22 pixel satisfies the following inequality conditions, i.e., |G22−G11|>Th (threshold value), |G22−G13|>Th, |G22−G31|>Th, and |G22−G33|>Th, the G22 pixel proves to be a bad pixel, so the bad pixel compensation is performed and the G22 pixel takes the green value given by equation 3 above. On the contrary, if the G22 pixel does not satisfy these inequality conditions, it is not a bad pixel and the bad pixel compensation is not performed.
That is, if all the absolute values of the differences between the corresponding pixel G22 and the other pixels G11, G13, G31, and G33 having original green information are greater than the predetermined threshold value Th, the pixel G22 is regarded as a bad pixel; otherwise, it is not. In addition, if the pixel G22 proves to be a bad pixel, it is interpolated to have the mean value of the green values of the pixels G11, G13, G31, and G33, each of which has its original green information.
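The following sketch applies exactly this rule (the helper name is hypothetical, and it assumes the 3×3 green plane above together with an externally chosen threshold Th): G22 is flagged as bad only when all four absolute differences against the originally green pixels exceed Th, and it is then replaced by their mean per Eq. 3.

```python
import numpy as np

def compensate_bad_center(green: np.ndarray, threshold: float) -> np.ndarray:
    """Bad-pixel check and compensation for the center pixel G22 of a 3x3
    green plane (Eq. 3). The corners hold the original green samples."""
    g = green.astype(float).copy()
    g22 = g[1, 1]
    originals = [g[0, 0], g[0, 2], g[2, 0], g[2, 2]]   # G11, G13, G31, G33
    if all(abs(g22 - v) > threshold for v in originals):
        # G22 is a bad pixel: replace it with the mean of the original greens.
        g[1, 1] = sum(originals) / 4.0
    return g
```

Per the flow of FIG. 3, the green interpolation of Eq. 2 would then be repeated on the compensated data (S302).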
After the green interpolation, the bad pixel determination, and the bad pixel compensation are completed through the above processes, the focus data is generated. The procedure for generating the focus data is described in more detail below.
In order to generate the focus data, the energy or a boundary value of the high frequency component should be computed. For the high frequency energy, the focus data may be generated by transforming the image information from the spatial domain into the frequency domain through an M×N Fourier transform or discrete cosine transform and then taking only the image information of the high frequency region.
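For illustration only, the sketch below computes such a high frequency energy with a 2-D FFT; the cutoff parameter and function name are assumptions rather than details from the patent, and, as explained next, the patent favors a cheaper edge detection filter in hardware.

```python
import numpy as np

def high_frequency_energy(green_plane: np.ndarray, cutoff: int = 1) -> float:
    """Sum the spectral energy of the green plane outside a low-frequency
    block around DC, as one possible frequency-domain focus measure."""
    spectrum = np.fft.fftshift(np.fft.fft2(green_plane.astype(float)))
    magnitude = np.abs(spectrum)
    rows, cols = magnitude.shape
    cy, cx = rows // 2, cols // 2
    # Discard the DC term and its immediate low-frequency neighbourhood.
    r0, r1 = max(cy - cutoff, 0), cy + cutoff + 1
    c0, c1 = max(cx - cutoff, 0), cx + cutoff + 1
    magnitude[r0:r1, c0:c1] = 0.0
    return float(np.sum(magnitude ** 2))
```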
However, since frequency transformation of the M×N block considerably enlarges the hardware area, because it requires a line memory or the like, an M×N edge detection filter may instead be applied to generate the focus data, thereby keeping the hardware area from growing.
FIG. 6 is a drawing setting forth a convolution of the green plane and a Laplacian mask for generating a boundary value.
The boundary value of each pixel is obtained by multiplying the pixel and its neighbors by the coefficients of an M×N edge detection filter, such as a Sobel or Laplacian mask, and summing the products. The boundary values of the pixels belonging to the focus region are then successively accumulated to obtain the focus value.
The result of the following equation 4 is obtained through the convolution of the green plane and the Laplacian mask of FIG. 6.
E=G22−(G11+G12+G13+G21+G23+G31+G32+G33)  [Eq. 4]
Through the above processes, the focus value can be generated by adding all the boundary values of the pixels, or all the high frequency energies of the respective blocks, that belong to the region to be computed.
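A sketch of this spatial-domain accumulation is given below. The exact mask coefficients of FIG. 6 are not reproduced here, so the common 8-connected Laplacian kernel is used as a stand-in, and the boundary values are accumulated as absolute values (an assumption) so that positive and negative edge responses do not cancel.

```python
import numpy as np

# Stand-in 3x3 Laplacian mask; the actual coefficients are those of FIG. 6.
LAPLACIAN = np.array([[-1, -1, -1],
                      [-1,  8, -1],
                      [-1, -1, -1]], dtype=float)

def focus_value(green_plane: np.ndarray, kernel: np.ndarray = LAPLACIAN) -> float:
    """Convolve the green plane with a 3x3 edge detection mask and accumulate
    the per-pixel boundary values over the focus region."""
    plane = green_plane.astype(float)
    rows, cols = plane.shape
    total = 0.0
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            window = plane[r - 1:r + 2, c - 1:c + 2]
            boundary = np.sum(window * kernel)   # boundary value of pixel (r, c)
            total += abs(boundary)               # accumulate (absolute value assumed)
    return total
```

A larger accumulated value indicates more high frequency content in the window, so an auto focus controller would typically drive the lens toward the position that maximizes this focus value.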
As stated above, the present invention provides a method for generating the focus data for auto focus control in a system in which an image sensor having an RGB Bayer pattern is mounted. In accordance with the present invention, the focus data can be generated accurately without the R/B interpolation and color space conversion processes that are essential for generating the focus data in the prior art, which reduces the hardware area and power consumption.
While the present invention has been described with respect to certain preferred embodiments, it will be apparent to those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention as defined in the following claims.

Claims (12)

1. An apparatus comprising:
a green interpolation unit configured to:
receive an RGB bayer pattern from an image sensor;
interpolate the RGB bayer pattern by replacing a value of an originally red or blue pixel with an average value of a first number of originally green pixels to form an original M×N green plane;
determine a bad pixel if an evaluation pixel in the original M×N green plane satisfies an inequality condition;
compensate the bad pixel to form a compensated M×N green plane by replacing the bad pixel value with an average of a second number of originally green pixels in the original M×N green plane; and
interpolate the compensated M×N green plane by replacing a value of an originally red or blue pixel with an average value of a third number of originally green pixels in the compensated M×N green plane to form a final green plane; and
a focus data generation unit configured to extract focus data from the final green plane,
wherein the first number is at least three, the second number is at least four, and the third number is at least three, and
wherein the first number is not equal to the second number and the first number is equal to the third number.
2. The apparatus of claim 1, wherein the inequality condition comprises:
a computation of four absolute value differences between the evaluation pixel value and the four originally green pixels in the original M×N plane; and
a comparison of a threshold value against four absolute value differences.
3. A method for generating focus data from an image sensor, the method comprising:
receiving an RGB bayer pattern from the image sensor;
interpolating the RGB bayer pattern by replacing a value of an originally red or blue pixel with an average value of a first number of originally green pixels to form an original M×N green plane;
determining a bad pixel if an evaluation pixel in the original M×N green plane satisfies an inequality condition;
compensating the bad pixel to form a compensated M×N green plane by replacing the bad pixel value with an average value of a second number of originally green pixels in the original M×N green plane;
interpolating the compensated M×N green plane by replacing a value of an originally red or blue pixel with an average value of a third number of originally green pixels in the compensated M×N green plane to form a final green plane; and
extracting the focus data by determining a high frequency energy of the final green plane,
wherein the first number is at least three, the second number is at least four, and the third number is at least three, and
wherein the first number is not equal to the second number and the first number is equal to the third number.
4. The method of claim 3, wherein the inequality condition comprises:
a computation of four absolute value differences between the evaluation pixel value and the four originally green pixels in the original M×N plane; and
a comparison of a threshold value against four absolute value differences.
5. The method of claim 3, wherein said extracting the focus data comprises convolving the final green plane with an edge detection filter.
6. The method of claim 5, wherein said extracting the focus data further comprises adding boundary values of pixels in the final green plane.
7. The apparatus of claim 1, wherein the focus data generation unit is further configured to convolve the final green plane with an edge detection filter to extract the focus data.
8. The apparatus of claim 7, wherein the focus data generation unit is further configured to add boundary values of pixels in the final green plane to extract the focus data.
9. The apparatus of claim 7, wherein the focus data generation unit is further configured to convolve the final green plane in a spatial domain with the edge detection filter.
10. The method of claim 3, wherein said extracting the focus data further comprises performing a convolution of the final green plane with an edge detection filter.
11. The method of claim 10, wherein said extracting the focus data further comprises adding boundary values of pixels in the final green plane.
12. The method of claim 10, wherein said extracting the focus data further comprises performing a convolution of the final green plane in a spatial domain with the edge detection filter.
US11/319,554 2004-12-30 2005-12-29 Apparatus and method for generating focus data in an image sensor Expired - Fee Related US7916207B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
KR1020040115989A KR100636971B1 (en) 2004-12-30 2004-12-30 Apparatus for generation of focus data in image sensor and method for generation the same
KR10-2004-0115989 2004-12-30
KR2004-0115989 2004-12-30

Publications (2)

Publication Number Publication Date
US20060146151A1 (en) 2006-07-06
US7916207B2 (en) 2011-03-29

Family

ID=36639914

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/319,554 Expired - Fee Related US7916207B2 (en) 2004-12-30 2005-12-29 Apparatus and method for generating focus data in an image sensor

Country Status (2)

Country Link
US (1) US7916207B2 (en)
KR (1) KR100636971B1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8619182B2 (en) 2012-03-06 2013-12-31 Csr Technology Inc. Fast auto focus techniques for digital cameras
US9392236B2 (en) 2012-10-31 2016-07-12 Samsung Electronics Co., Ltd. Image processing method, image signal processor, and image processing system including the same
US10148926B2 (en) 2015-12-07 2018-12-04 Samsung Electronics Co., Ltd. Imaging apparatus and image processing method of thereof

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102007001010A1 (en) * 2007-01-02 2008-07-10 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Method and image acquisition system for achromatized image acquisition of objects
US9179060B2 (en) * 2007-09-27 2015-11-03 Qualcomm Incorporated Method and apparatus for camera shake effect image stabilization
US20090086068A1 (en) * 2007-09-28 2009-04-02 Tatsuya Hagiwara Solid-state image pick-up device and image pick-up apparatus
JP5180795B2 (en) * 2007-12-10 2013-04-10 キヤノン株式会社 Imaging apparatus and control method thereof
JP5200955B2 (en) * 2008-02-14 2013-06-05 株式会社ニコン Image processing apparatus, imaging apparatus, and image processing program
US10353190B2 (en) 2009-12-30 2019-07-16 Koninklijke Philips N.V. Sensor for microscopy
CN104202583B (en) * 2014-08-07 2017-01-11 华为技术有限公司 Image processing device and method
CN116055698B (en) * 2022-12-30 2024-04-12 爱芯元智半导体(宁波)有限公司 Color adjustment method, color adjustment device and electronic equipment

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5832143A (en) * 1996-01-17 1998-11-03 Sharp Kabushiki Kaisha Image data interpolating apparatus
KR19990081799A (en) 1997-12-08 1999-11-15 이데이 노부유끼 Image Processing System, Image Processing Method and Camera_
US20030136907A1 (en) * 1999-07-09 2003-07-24 Hitachi, Ltd. Charged particle beam apparatus
US6636630B1 (en) * 1999-05-28 2003-10-21 Sharp Kabushiki Kaisha Image-processing apparatus
US20030218677A1 (en) * 2002-05-27 2003-11-27 Tomoyuki Nishimura Auto white balance controlling method and electronic camera
US20030218679A1 (en) * 2001-12-24 2003-11-27 Alfio Castorina Method for improving the quality of a digital image
US6683643B1 (en) * 1997-03-19 2004-01-27 Konica Minolta Holdings, Inc. Electronic camera capable of detecting defective pixel
US20040032516A1 (en) * 2002-08-16 2004-02-19 Ramakrishna Kakarala Digital image system and method for combining demosaicing and bad pixel correction
US7110612B1 (en) * 2001-10-11 2006-09-19 Pixelworks, Inc. Weighted absolute difference based noise reduction method and apparatus
US7253836B1 (en) * 1998-06-30 2007-08-07 Nikon Corporation Digital camera, storage medium for image signal processing, carrier wave and electronic camera


Also Published As

Publication number Publication date
KR20060077188A (en) 2006-07-05
US20060146151A1 (en) 2006-07-06
KR100636971B1 (en) 2006-10-19


Legal Events

Date Code Title Description
AS Assignment

Owner name: MAGNA CHIP SEMICONDUCTOR, LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MOON, JI-HYE;MOON, JANG-SIK;LEE, JEONG-GUK;REEL/FRAME:017429/0918

Effective date: 20051226

AS Assignment

Owner name: CROSSTEK CAPITAL, LLC, DELAWARE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MAGNACHIP SEMICONDUCTOR, LTD.;REEL/FRAME:022764/0270

Effective date: 20090514

Owner name: CROSSTEK CAPITAL, LLC,DELAWARE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MAGNACHIP SEMICONDUCTOR, LTD.;REEL/FRAME:022764/0270

Effective date: 20090514

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: INTELLECTUAL VENTURES II LLC, DELAWARE

Free format text: MERGER;ASSIGNOR:CROSSTEK CAPITAL, LLC;REEL/FRAME:026637/0632

Effective date: 20110718

FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20230329