WO2022178684A1 - Camera system, method for hdr composition, and computer usable medium storing software for implementing method - Google Patents

Publication number: WO2022178684A1
Authority: WIPO (PCT)
Prior art keywords: color, image, pixels, filter, camera system
Application number: PCT/CN2021/077520
Other languages: French (fr)
Inventor: Tsuyoshi Okuzaki
Original Assignee: Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority to PCT/CN2021/077520 (WO2022178684A1)
Priority to CN202180084392.2A (CN116648925A)
Publication of WO2022178684A1

Classifications

    • H: ELECTRICITY
      • H04: ELECTRIC COMMUNICATION TECHNIQUE
        • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N 25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
            • H04N 25/50: Control of the SSIS exposure
              • H04N 25/57: Control of the dynamic range
          • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
            • H04N 23/10: Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
              • H04N 23/12: Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with one sensor only
          • H04N 25/10: Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
            • H04N 25/11: Arrangement of colour filter arrays [CFA]; Filter mosaics
              • H04N 25/13: Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
                • H04N 25/133: Filter elements including elements passing panchromatic light, e.g. filters passing white light
                • H04N 25/135: Filter elements based on four or more different wavelength filter elements

Definitions

  • the present disclosure relates to a camera system, a method for HDR composition, and a computer usable medium storing software for implementing the method, and more specifically, to a system, method and medium which achieve improved sensitivity, resolution and SNR.
  • a color filter having an RCCB arrangement, which uses white pixels instead of green pixels in the Bayer array, has also become mainstream in image sensors for in-vehicle devices.
  • the present disclosure has been made in consideration of the above situation, and thus provides a camera system, a method and a medium for improving sensitivity, resolution and SNR.
  • a camera system comprising:
  • an image sensor comprising a plurality of image sensor pixels and outputting first color signals, second color signals, third color signals and fourth color signals;
  • a color filter arranged on the plurality of the image sensor pixels of the image sensor, the color filter including first color pixels, second color pixels, third color pixels and fourth color pixels;
  • a processor receiving a first image generated by the first color signals, the second color signals and the third color signals, and a second image generated by the fourth color signals, performing pattern matching on the second image, generating a fourth color signal from the first image and performing HDR composition using the pattern-matched second image and the fourth color signal generated from the first image, generating a luminance signal from the second image, generating a chromatic signal from the first image, and generating image data using the luminance signal and the chromatic signal.
  • the processor may add the fourth color signals to generate the second image and, when there are a plurality of filter pixels of the first color pixels, the second color pixels and/or the third color pixels, may add the first color signals, the second color signals and/or the third color signals of a same color to each other so as to generate the first image.
  • the processor may perform demosaicing on the first image.
  • the processor may perform noise reduction on the first image using the pattern-matched second image.
  • the processor may perform gain adjustment on the first image to adjust a difference between the number of the first color pixel(s), the second color pixel(s) and the third color pixel(s), and the number of the fourth color pixels.
  • the processor may compensate a portion of overexposure in the second image by using the fourth color signal generated from the first image during the HDR composition.
  • noise may be reduced by using the noise-reduced first image, even when the HDR composition is performed over a wide range, from a low output level to a high output level.
  • the color filter may include a plurality of units, each of the units of the color filter may comprise the filter pixels arrayed in accordance with a square array pattern having N*N (N is equal to or greater than two) filter pixels.
  • the first color may be green (G), red (R) or blue (B), the second color may be red (R), blue (B) or green (G), the third color may be blue (B), green (G) or red (R), and the fourth color may be white (W).
  • the square array pattern may have 2*2 filter pixels, including one filter pixel of the first color, the second color or the third color, and three filter pixels of the fourth color.
  • four units adjacent to each other may comprise two filter pixels of the first color, the second color or the third color, one filter pixel of the second color, the third color or the first color, one filter pixel of the third color, the first color or the second color, and twelve filter pixels of the fourth color.
  • the square array pattern may have 3*3 filter pixels, including three filter pixels of the first color, the second color or the third color, and six filter pixels of the fourth color.
  • four units adjacent to each other may comprise six filter pixels of the first color, the second color or the third color, three filter pixels of the second color, the third color or the first color, three filter pixels of the third color, the first color or the second color, and twenty-four filter pixels of the fourth color.
  • the three filter pixels of the first color, the second color or the third color may be provided in a central position, and at a left and right of the central position or upper and lower positions of the central position, and the six filter pixels of the fourth color may be provided in the remaining positions.
  • the square array pattern may have 3*3 filter pixels, including one filter pixel of the first color, the second color or the third color, and eight filter pixels of the fourth color.
  • four units adjacent to each other may comprise two filter pixels of the first color, the second color or the third color, one filter pixel of the second color, the third color or the first color, one filter pixel of the third color, the first color or the second color, and thirty-two filter pixels of the fourth color.
  • the one filter pixel of the first color, the second color or the third color may be provided in a central position, and the eight filter pixels of the fourth color may be provided in the remaining positions.
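The pixel-count statements above can be sanity-checked with a short sketch. The position of the single color pixel inside each 2x2 unit, and the Bayer-like choice of two green units, one red unit and one blue unit in a four-unit tile, are assumptions for illustration; only the per-unit counts follow the text.

```python
import numpy as np

# Illustrative 2x2 unit: one color pixel plus three white pixels.
# The color pixel's corner position is an assumption.
def unit_2x2(c):
    return np.array([[c, "W"],
                     ["W", "W"]])

# Four adjacent units tiled Bayer-like: two G units, one R, one B.
quad = np.block([[unit_2x2("G"), unit_2x2("R")],
                 [unit_2x2("B"), unit_2x2("G")]])

# Count each filter color over the 4x4 tile.
counts = {c: int((quad == c).sum()) for c in "RGBW"}
print(counts)  # {'R': 1, 'G': 2, 'B': 1, 'W': 12}
```

The same check applied to the 3x3 layouts gives twenty-four and thirty-two white pixels per four-unit tile, respectively.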
  • a method for HDR composition comprising:
  • a portion of overexposure in the second image may be compensated by using the fourth color signal generated from the first image during the HDR composition.
  • a computer usable medium storing software for implementing any one of the above methods.
  • Fig. 1A is an explanatory plan view of a color filter of a 2x2 unit arrangement according to a comparative example;
  • Fig. 1B is an explanatory plan view of a color filter of a 3x3 unit arrangement according to another comparative example;
  • Fig. 2 is a graph illustrating spectroscopic sensitivity properties of white (W) pixels, red (R) pixels, green (G) pixels and blue (B) pixels;
  • Fig. 3 is a graph illustrating output values of white (W) pixels, red (R) pixels, green (G) pixels and blue (B) pixels;
  • Fig. 4 is a photograph illustrating an image acquired using a color filter of a Bayer arrangement under a low illuminance condition;
  • Fig. 5 is a photograph illustrating an image acquired using a color filter of white pixels under a low illuminance condition;
  • Fig. 6A is a block diagram schematically showing an overall arrangement of a camera system according to a first embodiment of the present disclosure;
  • Fig. 6B is a block diagram schematically showing an overall arrangement of a camera system according to a second embodiment of the present disclosure;
  • Fig. 7 is a block diagram schematically showing a method including processes (functional configuration) for HDR composition according to a third embodiment of the present disclosure;
  • Fig. 8 is an explanatory plan view showing a process for reading out charges from white (W) pixels, red (R) pixels, green (G) pixels and blue (B) pixels of a first color filter of a 2x2 unit arrangement;
  • Fig. 9 is an explanatory plan view showing a process for reading out charges from white (W) pixels, red (R) pixels, green (G) pixels and blue (B) pixels of a second color filter of a 3x3 unit arrangement;
  • Fig. 10 is an explanatory plan view showing a process for reading out charges from white (W) pixels, red (R) pixels, green (G) pixels and blue (B) pixels of a third color filter of a 3x3 unit arrangement;
  • Fig. 11 shows photographs of an example of an image acquired using a color filter of white pixels, an image acquired using a color filter of red pixels, green pixels and blue pixels, and an image acquired using HDR composition;
  • Fig. 12A is a photograph of an example of an image acquired by a color filter of red pixels, green pixels and blue pixels; and
  • Fig. 12B is a photograph of an example of an image acquired by a color filter of red pixels, green pixels, blue pixels and white pixels.
  • Image sensors using divided pixels have become mainstream in small devices, such as smartphones.
  • a color filter including the divided pixels of a Bayer arrangement (RGGB) having a four-times density or a nine-times density, is generally used.
  • Fig. 1A illustrates a color filter CF1 of a Bayer arrangement including a plurality of units, one unit including 2x2 pixels.
  • the color filter CF1 includes four units arrayed along the horizontal direction and along the vertical direction as a fundamental arrangement; two units are each composed of four green (G) pixels, one unit is composed of four red (R) pixels and the remaining unit is composed of four blue (B) pixels.
  • Fig. 1B illustrates a color filter CF2 of a Bayer arrangement including a plurality of units, one unit having 3x3 pixels.
  • the color filter CF2 includes four units arrayed along the horizontal direction and along the vertical direction as a fundamental arrangement; two units are each composed of nine green (G) pixels, one unit is composed of nine red (R) pixels and the remaining unit is composed of nine blue (B) pixels.
  • a camera system having such a color filter has a mechanism inside its image sensor that adds a plurality of pixels (such as 2x2 or 3x3) of a same color to each other and reads out charges as one pixel.
  • most of the divided pixels are white (W) pixels, and some of the divided pixels are color pixels, such as red (R), green (G) and blue (B).
  • Such a color filter having the above arrangement gives almost twice the sensitivity of the conventional color filter CF1 or CF2 with the Bayer arrangement.
  • the white (W) pixel can receive visible light over a wider range of wavelengths compared with the red (R) pixel, green (G) pixel and blue (B) pixel.
  • the white (W) pixel can output a signal of higher level compared with the red (R) , green (G) and blue (B) pixels.
  • as Fig. 5, illustrating an image acquired by a color filter of white pixels under a low illuminance condition, shows, higher sensitivity can be acquired compared with the image acquired by a color filter of a Bayer arrangement under a low illuminance condition illustrated in Fig. 4.
  • a camera system 1A according to a first embodiment of the present disclosure comprises a camera module 10 including an optical system 1, an imaging device 2 and an image sensor driver 3, a processor 20, a memory 30 and a display unit 40.
  • the optical system 1 includes at least one optical lens, preferably a lens group having a plurality of optical lenses, a diaphragm adjustment mechanism, a zoom mechanism, and an auto focusing mechanism.
  • the imaging device 2 includes an image sensor, such as a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) image sensor, having a plurality of photoelectric conversion elements arrayed along a horizontal direction and along a vertical direction, and a color filter arranged on the image sensor.
  • the imaging device 2, driven by the image sensor driver 3, receives light from the optical system 1 and outputs color signals of four colors, red (R), green (G), blue (B) and white (W).
  • the processor 20 receives the color signals from the imaging device 2 and performs processes described later in detail.
  • the memory 30 receives and outputs data, from/to the camera module 10, the processor 20 and the display unit 40.
  • the display unit 40 receives data from the processor 20, and displays an image acquired by the camera module 10 and processed by the processor 20.
  • the camera system shown in Fig. 6A comprises one processor 20, the processor 20 performing not only image processes but also system management processes.
  • the camera system shown in Fig. 6B comprises an image signal processor 21 and a main processor 22.
  • the image signal processor 21 performs image processes
  • the main processor 22 performs mostly system management processes, and may partially perform image processes.
  • the following processes are performed by the processor 20 in the camera system 1A, or the image signal processor 21, or the image signal processor 21 and the main processor 22 in the camera system 1B.
  • color signals, namely red (R), green (G), blue (B) and white (W) signals, are generated by the image sensor through a color filter CF11, CF12 or CF13.
  • Fig. 8 shows a process for reading out charges from the color filter CF11 (2x2 units) .
  • Each unit includes one pixel of red (R), green (G) or blue (B), and three pixels of white (W).
  • Fig. 9 shows a process for reading out charges through the color filter CF12 (3x3 units) .
  • Each unit includes three pixels of red (R), green (G) or blue (B), and six pixels of white (W).
  • Fig. 10 shows a process for reading out charges through the color filter CF13 (3x3 units) .
  • Each unit includes one pixel of red (R), green (G) or blue (B), and eight pixels of white (W).
  • plural charges of white (W) are added to each other, and one charge of red (R), green (G) or blue (B) is read out without addition, or plural charges of red (R), green (G) or blue (B) of the same color are read out and added to each other, from the image sensor through the color filter CF11, CF12 or CF13. Charges of red (R), green (G) or blue (B), and charges of white (W), are read out independently from each other.
  • red (R) , green (G) , blue (B) and white (W) are read out, respectively, as if there were two image sensors, namely a monochrome image sensor including only white (W) pixels and a color image sensor including red (R) , green (G) and blue (B) of a Bayer arrangement (RGGB) .
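As a concrete illustration of this dual readout, the sketch below splits a raw mosaic under the 2x2 filter CF11 into a binned white image and a quarter-resolution color mosaic. The function name and the simple summation model are assumptions for illustration; a real sensor performs the addition in the charge domain before readout.

```python
import numpy as np

def read_out_cf11(raw, cfa):
    """raw: HxW array of sensor charges; cfa: HxW array of 'R','G','B','W'.

    Returns a (H/2 x W/2) binned white image and a (H/2 x W/2) color
    mosaic, mimicking the 'two image sensors' view described above."""
    h, w = raw.shape
    w_img = np.zeros((h // 2, w // 2))
    color = np.zeros((h // 2, w // 2))
    for i in range(0, h, 2):
        for j in range(0, w, 2):
            unit_raw = raw[i:i + 2, j:j + 2]
            white = cfa[i:i + 2, j:j + 2] == "W"
            w_img[i // 2, j // 2] = unit_raw[white].sum()  # bin the 3 W charges
            color[i // 2, j // 2] = unit_raw[~white][0]    # lone color charge
    return w_img, color
```

With uniform unit charges, each white sample is three times the corresponding color sample, which is the sensitivity gap the later HDR gain adjustment has to bridge.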
  • the processor 20, or the image signal processor 21 and the main processor 22, receives the signals of red (R), green (G) and blue (B), and the signals of white (W).
  • the signals of white (W) are generated by adding the charges of white (W) pixels at this stage, as described above.
  • the sensitivity of the signals of white (W) is higher than that of the signals of red (R) , green (G) and blue (B) . This is an important matter when performing HDR composition with the signals of white (W) and the signals of red (R) , green (G) and blue (B) , as described later.
  • demosaicing is performed for the signals of red (R) , green (G) and blue (B) , since the numbers of the pixels of red (R) , green (G) and blue (B) are small. It is not necessary to perform demosaicing for the signals of white (W) , since the pixels of white (W) are arrayed over almost the entire area of the color filter.
  • W-RGB cooperation noise reduction including pattern matching is performed using both the signals of white (W) and the signals of red (R) , green (G) and blue (B) .
  • the bright image of white (W) and the dark image of red (R) , green (G) and blue (B) are acquired at this stage.
  • in step S3, it is necessary to improve the SNR and to interpolate the quantization error by performing noise reduction including pattern matching over a wide range.
  • since the signals of white (W) have high sensitivity and a high SNR, it is possible to perform pattern matching on the signals of white (W) with higher accuracy compared with pattern matching on the signals of red (R), green (G) and blue (B), as long as the pixels of white (W) are not saturated. Pattern matching on signals including a lot of noise and/or quantization error results in low accuracy.
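The idea of guiding the color denoising with the higher-SNR white signal can be sketched as below. This bilateral-style filter, in which patch similarity is measured on the white image and the resulting weights average the noisy color samples, is an illustrative stand-in for the W-RGB cooperation noise reduction, not the patented pattern-matching algorithm itself; the parameter values are arbitrary.

```python
import numpy as np

def w_guided_denoise(color, white, radius=2, h=10.0):
    """Average color samples using similarity weights from the W image."""
    out = np.zeros_like(color, dtype=float)
    rows, cols = color.shape
    for y in range(rows):
        for x in range(cols):
            y0, y1 = max(0, y - radius), min(rows, y + radius + 1)
            x0, x1 = max(0, x - radius), min(cols, x + radius + 1)
            # Match on the white image: similar W values get high weight.
            d = white[y0:y1, x0:x1] - white[y, x]
            wgt = np.exp(-(d ** 2) / (h ** 2))
            out[y, x] = (wgt * color[y0:y1, x0:x1]).sum() / wgt.sum()
    return out
```

Because the weights come from the clean white image, edges that are invisible in the noisy color channels can still be preserved.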
  • the HDR gain adjustment is performed on the noise-reduced signals of red (R) , green (G) and blue (B) , in order to match sensitivities of the signals of red (R) , green (G) and blue (B) , and the signals of white (W) .
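A minimal sketch of this gain adjustment follows. The binned W image sums three white charges per 2x2 unit (CF11), and a white pixel is assumed here to be roughly twice as sensitive as a color pixel, a round number suggested by spectral curves like Fig. 2 rather than a value given in the text.

```python
import numpy as np

W_PIXELS_PER_UNIT = 3      # CF11: three W charges summed per unit
W_REL_SENSITIVITY = 2.0    # assumed W-vs-color sensitivity ratio

def match_rgb_to_w(rgb):
    """Scale the color image so its signal level matches the binned W image."""
    return rgb * (W_PIXELS_PER_UNIT * W_REL_SENSITIVITY)
```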
  • a signal of white (W) is generated from the signals of red (R), green (G) and blue (B).
  • the HDR composition is performed on the signals of white (W) with the signal of white (W) generated from the signals of red (R) , green (G) and blue (B) .
  • the signal of white (W) generated from the signals of red (R) , green (G) and blue (B) has low sensitivity, thus there is sufficient margin before the signal of white (W) reaches a point of saturation.
  • the signal of white (W) can be acquired by back-calculation using color reproduction improvement parameters of red (R) , green (G) and blue (B) , and parameters for generating brightness generated from the signals of white (W) .
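This back-calculation can be sketched as a weighted sum of the color channels. The BT.601 luma weights below are an assumed stand-in for the patent's color reproduction improvement parameters and brightness-generation parameters, which would come from the camera's calibration.

```python
import numpy as np

LUMA_WEIGHTS = np.array([0.299, 0.587, 0.114])  # assumed R, G, B weights

def w_from_rgb(rgb):
    """rgb: (..., 3) array of color samples -> synthesized white/luma signal."""
    return rgb @ LUMA_WEIGHTS
```

Since the weights sum to one, a neutral gray maps to the same level, and the synthesized signal inherits the low sensitivity of the color pixels, leaving the saturation margin described above.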
  • the HDR composition in the step S6 is performed by α-blending according to brightness of the pixels of white (W).
  • the signal of white (W) generated from the signals of red (R) , green (G) and blue (B) , which is used for the HDR composition, has low quantization error. Therefore, it is possible to perform HDR composition over a wide range of levels, from a low to high output level of signals.
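The blending step can be sketched as below: while the native W image is well below saturation it is used as-is, and as it approaches saturation the weight shifts to the W signal synthesized from R, G and B, which still has headroom. The saturation level and knee position are illustrative values, not figures from the text.

```python
import numpy as np

def hdr_compose(w_native, w_synth, sat=4095.0, knee=0.8):
    """Alpha-blend the native W image with the W signal synthesized from RGB."""
    t = np.clip((w_native / sat - knee) / (1.0 - knee), 0.0, 1.0)
    alpha = 1.0 - t                  # weight of the high-sensitivity native W
    return alpha * w_native + (1.0 - alpha) * w_synth
```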
  • in step S6, since the HDR composition is performed using the noise-reduced (step S3) white signal generated from the signals of red (R), green (G) and blue (B), noise is suppressed.
  • in step S7, when generating a luminance signal (Y) from the signals of white (W), the signals of red (R) and the signals of blue (B) are used to improve luminance reproducibility.
  • since a luminance signal (Y) is generated from the signals of white (W), on which demosaicing is not performed, the resolution is very high.
  • in step S8, calculation for improvement of color reproduction of red (R), green (G) and blue (B) is performed by using a matrix for improvement of color reproduction.
  • the color-difference signals CbCr are generated from the signals of red (R) , green (G) and blue (B) .
  • HDR image signals are generated using the luminance signal (Y) from the signals of white (W) and the color-difference signals CbCr from the signals of red (R) , green (G) and blue (B) .
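The final assembly, full-resolution luminance from the HDR white signal plus chroma from the demosaiced color signals, can be sketched as below. The BT.601 chroma coefficients are an assumed stand-in for the camera's actual conversion matrix.

```python
import numpy as np

def assemble_ycbcr(y_from_w, rgb):
    """y_from_w: (...,) HDR luminance; rgb: (..., 3) demosaiced color."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    cb = -0.169 * r - 0.331 * g + 0.500 * b   # color-difference signals
    cr = 0.500 * r - 0.419 * g - 0.081 * b
    return np.stack([y_from_w, cb, cr], axis=-1)
```

Note that the luminance plane keeps the white image's resolution even though the chroma planes come from the sparser, demosaiced color pixels.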
  • the dynamic range can be expanded by up to 15 dB in the case of using the color filter CF11 shown in Fig. 8.
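One plausible reading of the 15 dB figure, an assumption on our part rather than a derivation given in the text: with CF11, three summed white charges, each assumed to be about twice as sensitive as a color pixel, put the native W image roughly six times above the W signal synthesized from R, G and B.

```python
import math

# ~6x level ratio between native W and synthesized W (assumed numbers).
ratio = 3 * 2.0                     # 3 binned W pixels x ~2x sensitivity
headroom_db = 20 * math.log10(ratio)
print(round(headroom_db, 1))        # 15.6
```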
  • the processes of Fig. 7 can be realized by a circuit configuration provided in the imaging device 2, or by software installed in the camera system 1A or 1B.
  • two kinds of images, namely a first image with high sensitivity and achromatic color generated from the signals of white (W), and a second image with low sensitivity and chromatic color generated from the signals of red (R), green (G) and blue (B), are acquired.
  • the entire image is bright; however, the bright portions, such as the area with the light, are blown-out highlights due to overexposure.
  • the entire image is dark; however, the bright portions including the light are not blown-out highlights.
  • An HDR image is acquired by performing HDR composition and adaptive tone mapping correction with the first and second images.
  • the HDR image is overall bright, resolution and SNR are improved, and the area of overexposure including the light is compensated and thus there is no portion of blown-out highlights.
  • an image acquired by the camera system including the conventional color filter CF2 is shown in Fig. 12A, and an image acquired by the camera system including the color filter CF13 according to the first embodiment of the present disclosure is shown in Fig. 12B.
  • Fig. 12B shows, in comparison with the image shown in Fig. 12A, that the resolution of the camera system including the color filter CF13 according to the first embodiment of the present disclosure is improved.
  • a one-image sensor has both functions of color and monochrome sensors and thus there is no misalignment between plural image sensors.
  • a color camera and a monochrome camera can be combined into one dual camera.
  • two kinds of images, namely one image composed of charges of white pixels and another image composed of charges of red (R), green (G) and blue (B) pixels, are generated, and HDR composition is performed using these two kinds of images while maintaining the high resolution and high SNR of white (W).
  • the terms "first" and "second" are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features.
  • the feature defined with “first” and “second” may comprise one or more of this feature.
  • a plurality of means two or greater than two, unless specified otherwise.
  • the terms "mounted", "connected", "coupled" and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electrical connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements, which can be understood by those skilled in the art according to specific situations.
  • a structure in which a first feature is "on" or "below" a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are contacted via an additional feature formed therebetween.
  • a first feature "on", "above" or "on top of" a second feature may include an embodiment in which the first feature is right or obliquely "on", "above" or "on top of" the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature "below", "under" or "on bottom of" a second feature may include an embodiment in which the first feature is right or obliquely "below", "under" or "on bottom of" the second feature, or just means that the first feature is at a height lower than that of the second feature.

Abstract

The camera system is provided to achieve high-resolution HDR images, the system including an image sensor having a plurality of image sensor pixels and outputting first color signals, second color signals, third color signals and fourth color signals; a color filter arranged on the plurality of the image sensor pixels of the image sensor, the color filter including first color pixels, second color pixels, third color pixels and fourth color pixels; and a processor receiving a first image generated by the first color signals, the second color signals and the third color signals, and a second image generated by the fourth color signals, performing pattern matching on the second image, generating a fourth color signal from the first image and performing HDR composition using the pattern-matched second image and the fourth color signal generated from the first image, generating a luminance signal from the second image, generating a chromatic signal from the first image, and generating image data using the luminance signal and the chromatic signal.

Description

[Title established by the ISA under Rule 37.2] CAMERA SYSTEM, METHOD FOR HDR COMPOSITION, AND COMPUTER USABLE MEDIUM STORING SOFTWARE FOR IMPLEMENTING METHOD TECHNICAL FIELD
The present disclosure relates to a camera system, a method for HDR composition, and a computer usable medium storing software for implementing the method, and more specifically, to a system, method and medium which achieve improved sensitivity, resolution and SNR.
BACKGROUND
In recent years, camera systems have been developed and widely used, for example, in mobile phones. Along with progress in high-speed, large-capacity data communication and high-speed processors, high resolution is required.
In order to achieve high sensitivity, a camera system with a color filter including white (transparent) pixels has been proposed.
However, when divided pixels are not used, resolution is lowered, and when the divided pixels are used, the resolution and the SNR (signal to noise ratio) are lowered.
For this reason, various proposals of color arrangement of a color filter have been made.
As described in international publications such as No. WO2015/045913, even if sensitivity is increased by using white pixels, resolution and/or SNR are deteriorated compared with the conventional Bayer array of a color filter with no white pixels.
A color filter having an RCCB arrangement, which uses white pixels instead of green pixels in the Bayer array, has also become mainstream in image sensors for in-vehicle devices.
However, in such a color filter, the resolution is deteriorated due to optical blur unless the chromatic aberration of the lens is suppressed. Further, since the white pixels pass the entire color bandwidth, they do not contribute to color reproduction. As a result, when trying to improve color reproduction, the SNR is deteriorated.
As explained above, conventional camera systems cannot provide improved sensitivity, improved resolution and improved SNR at the same time.
SUMMARY
The present disclosure has been made in consideration of the above situation, and thus provides a camera system, a method and a medium for improving sensitivity, resolution and SNR.
According to one aspect of the present disclosure, there is provided a camera system, comprising:
an image sensor comprising a plurality of image sensor pixels and outputting first color signals, second color signals, third color signals and fourth color signals;
a color filter arranged on the plurality of the image sensor pixels of the image sensor, the color filter including first color pixels, second color pixels, third color pixels and fourth color pixels; and
a processor receiving a first image generated by the first color signals, the second color signals and the third color signals, and a second image generated by the fourth color signals, performing pattern matching on the second image, generating a fourth color signal from the first image and performing HDR composition using the pattern-matched second image and the fourth color signal generated from the first image, generating a luminance signal from the second image, generating a chromatic signal from the first image, and generating image data using the luminance signal and the chromatic signal.
In some embodiments, the processor may add the fourth color signals to generate the second image and, when there are a plurality of filter pixels of the first color pixels, the second color pixels and/or the third color pixels, may add the first color signals, the second color signals and/or the third color signals of a same color to each other so as to generate the first image.
In some embodiments, the processor may perform demosaicing on the first image.
In some embodiments, the processor may perform noise reduction on the first image using the pattern-matched second image.
In some embodiments, the processor may perform gain adjustment on the first image to adjust a difference between the number of the first color pixel(s), the second color pixel(s) and the third color pixel(s), and the number of the fourth color pixels.
In some embodiments, the processor may compensate a portion of overexposure in the second image by using the fourth color signal generated from the first image during the HDR composition.
In some embodiments, during the HDR composition, noise may be reduced by using the noise-reduced first image, even when the HDR composition is performed over a wide range, from a low output level to a high output level.
In some embodiments, the color filter may include a plurality of units, each of the units of the color filter may comprise the filter pixels arrayed in accordance with a square array pattern having N*N (N is equal to or greater than two) filter pixels.
In some embodiments, the first color may be green (G), red (R) or blue (B), the second color may be red (R), blue (B) or green (G), the third color may be blue (B), green (G) or red (R), and the fourth color may be white (W).
In some embodiments, the square array pattern may have 2*2 filter pixels, including one filter pixel of the first color, the second color or the third color, and three filter pixels of the fourth color.
In some embodiments, four units adjacent to each other may comprise two filter pixels of the first color, the second color or the third color, one filter pixel of the second color, the third color or the first color, one filter pixel of the third color, the first color or the second color, and twelve filter pixels of the fourth color.
In some embodiments, the square array pattern may have 3*3 filter pixels, including three filter pixels of the first color, the second color or the third color, and six filter pixels of the fourth color.
In some embodiments, four units adjacent to each other may comprise six filter pixels of the first color, the second color or the third color, three filter pixels of the second color, the third color or the first color, three filter pixels of the third color, the first color or the second color, and twenty-four filter pixels of the fourth color.
In some embodiments, the three filter pixels of the first color, the second color or the third color may be provided in a central position, and at a left and right of the central position or upper and lower positions of the central position, and the six filter pixels of the fourth color may be provided in the remaining positions.
In some embodiments, the square array pattern may have 3*3 filter pixels, including one filter pixel of the first color, the second color or the third color, and eight filter pixels of the fourth color.
In some embodiments, four units adjacent to each other may comprise one filter pixel of the first color, the second color or the third color, one filter pixel of the second color, the third color or the first color, one filter pixel of the third color, the first color or the second color, and thirty-two filter pixels of the fourth color.
In some embodiments, the one filter pixel of the first color, the second color or the third color may be provided in a central position, and the eight filter pixels of the fourth color may be provided in the remaining positions.
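As an informal illustration (not part of the claimed subject matter), the three unit layouts described above can be written out explicitly. Here "C" stands for the unit's single chromatic color (red, green or blue), and all function names are invented for this sketch:

```python
def unit_2x2():
    # 2*2 unit: one color pixel and three white pixels.
    return [["C", "W"],
            ["W", "W"]]

def unit_3x3_three_color():
    # 3*3 unit: three color pixels in a row through the central position,
    # six white pixels in the remaining positions.
    return [["W", "W", "W"],
            ["C", "C", "C"],
            ["W", "W", "W"]]

def unit_3x3_one_color():
    # 3*3 unit: one color pixel in the central position, eight white pixels.
    return [["W", "W", "W"],
            ["W", "C", "W"],
            ["W", "W", "W"]]

def count(unit, value):
    # Count how many positions of a unit carry the given filter kind.
    return sum(row.count(value) for row in unit)
```

Counting over four adjacent units reproduces the totals recited above: 4*3 = 12, 4*6 = 24 and 4*8 = 32 white filter pixels, respectively.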
According to one aspect of the present disclosure, there is provided a method for HDR composition, the method comprising:
acquiring a first image of a first color, a second color and a third color, and a second image of a fourth color;
performing pattern matching on the second image;
generating a fourth color signal from the first image and performing HDR composition using the pattern-matched second image and the fourth color signal generated from the first image;
generating a luminance signal from the second image;
generating a chromatic signal from the first image; and
generating image data using the luminance signal and the chromatic signal.
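The steps above can be sketched per pixel as follows. This is a rough model only: signals are assumed normalized to [0, 1] with W saturating at 1.0, the BT.601 luma/chroma weights stand in for the disclosure's color-reproduction parameters, and the knee value is an arbitrary choice for this illustration:

```python
def white_from_rgb(r, g, b):
    # Back-calculated fourth-color (white) signal from the first (RGB) image.
    return 0.299 * r + 0.587 * g + 0.114 * b

def hdr_pixel(r, g, b, w, knee=0.8, w_sat=1.0):
    w_rgb = white_from_rgb(r, g, b)
    # HDR composition by alpha-blending according to the brightness of W:
    # below the knee, trust the high-sensitivity W signal; approaching
    # saturation, fall back to the RGB-derived white.
    alpha = min(max((w - knee) / (w_sat - knee), 0.0), 1.0)
    y = (1.0 - alpha) * w + alpha * w_rgb  # luminance signal
    cb = 0.564 * (b - w_rgb)               # chromatic (color-difference)
    cr = 0.713 * (r - w_rgb)               # signals from the RGB image
    return y, cb, cr
```

A gray input yields zero chroma, and a saturated W value is replaced entirely by the white value back-calculated from RGB, which is the compensation of overexposure described below.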
In some embodiments, a portion of overexposure in the second image may be compensated by using the fourth color signal generated from the first image during the HDR composition.
According to one aspect of the present disclosure, there is provided a computer usable medium storing software for implementing any one of the above methods.
BRIEF DESCRIPTION OF THE DRAWINGS
These and/or other aspects and advantages of embodiments of the present disclosure will become apparent and more readily appreciated from the following descriptions made with reference to the drawings, in which:
Fig. 1A is an explanatory plan view of a color filter of a 2x2 unit arrangement according to a comparative example;
Fig. 1B is an explanatory plan view of a color filter of a 3x3 unit arrangement according to another comparative example;
Fig. 2 is a graph illustrating spectroscopic sensitivity properties of white (W) pixels, red (R) pixels, green (G) pixels and blue (B) pixels;
Fig. 3 is a graph illustrating output values of white (W) pixels, red (R) pixels, green (G) pixels and blue (B) pixels;
Fig. 4 is a photograph illustrating an image acquired using a color filter of a Bayer arrangement under a low illuminance condition;
Fig. 5 is a photograph illustrating an image acquired using a color filter of white pixels under a low illuminance condition;
Fig. 6A is a block diagram schematically showing an overall arrangement of a camera system according to a first embodiment of the present disclosure;
Fig. 6B is a block diagram schematically showing an overall arrangement of a camera system according to a second embodiment of the present disclosure;
Fig. 7 is a block diagram schematically showing a method including processes (functional configuration) for HDR composition according to a third embodiment of the present disclosure;
Fig. 8 is an explanatory plan view showing a process for reading out charges from white (W) pixels, red (R) pixels, green (G) pixels and blue (B) pixels of a first color filter of a 2x2 unit arrangement;
Fig. 9 is an explanatory plan view showing a process for reading out charges from white (W) pixels, red (R) pixels, green (G) pixels and blue (B) pixels of a second color filter of a 3x3 unit arrangement;
Fig. 10 is an explanatory plan view showing a process for reading out charges from white (W) pixels, red (R) pixels, green (G) pixels and blue (B) pixels of a third color filter of a 3x3 unit arrangement;
Fig. 11 shows photographs of an example of an image acquired using a color filter of white pixels, an image acquired using a color filter of red pixels, green pixels and blue pixels, and an image acquired using HDR composition;
Fig. 12A is a photograph of an example of an image acquired by a color filter of red pixels, green pixels and blue pixels; and
Fig. 12B is a photograph of an example of an image acquired by a color filter of red pixels, green pixels, blue pixels and white pixels.
DETAILED DESCRIPTION
Embodiments of the present disclosure will be described in detail and examples of the embodiments of the present disclosure will be illustrated in the accompanying drawings. The same or similar elements and elements having same or similar functions are denoted by like reference numerals throughout the descriptions. The embodiments described herein with reference to the drawings are explanatory, and aim to illustrate the present disclosure, but shall not be construed to limit the present disclosure.
Before explaining a camera system according to an embodiment of the present disclosure, a conventional camera system according to a comparative example, which has a color filter of a Bayer arrangement, is explained for the sake of understanding the present disclosure.
Image sensors using divided pixels have become mainstream in small devices, such as smartphones.
In such an image sensor, the number of pixels is large, and charges are read out from all pixels. In normal use, it is possible to achieve the characteristics of an image sensor with relatively large pixels by adding the charges from the divided pixels.
A color filter including the divided pixels of a Bayer arrangement (RGGB) , having a four-times density or a nine-times density, is generally used.
Fig. 1A illustrates a color filter CF1 of a Bayer arrangement including a plurality of units, one unit including 2x2 pixels.
More specifically, the color filter CF1 includes, as a fundamental arrangement, four units arrayed along the horizontal direction and along the vertical direction: two units are composed of four green (G) pixels each, one unit is composed of four red (R) pixels, and the remaining unit is composed of four blue (B) pixels.
Fig. 1B illustrates a color filter CF2 of a Bayer arrangement including a plurality of units, one unit having 3x3 pixels.
The color filter CF2 includes, as a fundamental arrangement, four units arrayed along the horizontal direction and along the vertical direction: two units are composed of nine green (G) pixels each, one unit is composed of nine red (R) pixels, and the remaining unit is composed of nine blue (B) pixels.
A camera system having such a color filter has a mechanism inside its image sensor that adds charges from a plurality of pixels (such as 2x2 or 3x3) of the same color to each other and reads them out as one pixel.
In the camera system according to the present disclosure, which is described in detail hereinafter, almost all pixels are white (W) pixels, and some of the divided pixels are color pixels, such as red (R), green (G) and blue (B).
Such a color filter having the above arrangement gives almost twice the sensitivity of the conventional color filter CF1 or CF2 with the Bayer arrangement.
As shown in the graph of Fig. 2, which illustrates spectroscopic sensitivity properties of a white (transparent) pixel, a red pixel, a green pixel and a blue pixel, the white (W) pixel can receive a visible light having a wider range of wavelengths compared with the red (R) pixel, green (G) pixel and blue (B) pixel.
Further, as shown in the graph of Fig. 3, which illustrates output values of the white (W) pixel, the red (R) pixel, the green (G) pixel and the blue (B) pixel, the white (W) pixel can output a signal of higher level compared with the red (R) , green (G) and blue (B) pixels.
Therefore, a signal of high level can be acquired by the white (W) pixel even under a low illuminance condition.
As shown in Fig. 5, illustrating an image acquired by a color filter of white pixels under a low illuminance condition, higher sensitivity can be acquired compared with the image acquired by a color filter of a Bayer arrangement under a low illuminance condition illustrated in Fig. 4.
The camera system according to the following embodiment of the present disclosure, which achieves improved sensitivity, resolution and SNR, will be described hereinafter.
Referring to Fig. 6A, a camera system 1A according to a first embodiment of the present disclosure comprises a camera module 10 including an optical system 1, an imaging device 2 and an image sensor driver 3, a processor 20, a memory 30 and a display unit 40.
The optical system 1 includes at least one optical lens, preferably a lens group having a plurality of optical lenses, a diaphragm adjustment mechanism, a zoom mechanism, and an auto focusing mechanism.
The imaging device 2 includes an image sensor, such as a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) image sensor, having a plurality of photoelectric conversion elements arrayed along a horizontal direction and along a vertical direction, and a color filter arranged on the image sensor. The imaging device 2, driven by the image sensor driver 3, receives light from the optical system 1 and outputs color signals of four colors, red (R), green (G), blue (B) and white (W).
The processor 20 receives the color signals from the imaging device 2 and performs processes described later in detail.
The memory 30 receives data from and outputs data to the camera module 10, the processor 20 and the display unit 40.
The display unit 40 receives data from the processor 20, and displays an image acquired by the camera module 10 and processed by the processor 20.
The camera system shown in Fig. 6A comprises one processor 20, the processor 20 performing not only image processes but also system management processes.
The camera system shown in Fig. 6B comprises an image signal processor 21 and a main processor 22. The image signal processor 21 performs image processes, and the main processor 22 performs mostly system management processes, and may partially perform image processes.
The method for HDR composition according to the third embodiment of the present disclosure is described in detail, referring to Fig. 7, which illustrates a functional configuration.
The following processes are performed by the processor 20 in the camera system 1A, or by the image signal processor 21 (alone or together with the main processor 22) in the camera system 1B.
In the step S1, color signals, namely red (R) , green (G) , blue (B) and white (W) signals are generated by the image sensor through a color filter CF11, CF12 or CF13.
Fig. 8 shows a process for reading out charges from the color filter CF11 (2x2 units) . Each unit includes one pixel of red (R) , green (G) or blue (B) , and three pixels of white (W) .
One charge of red (R) , green (G) or blue (B) is read out from the image sensor through the color filter CF11, and three charges of white (W) are read out therefrom and added to each other.
Fig. 9 shows a process for reading out charges through the color filter CF12 (3x3 units) . Each unit includes three pixels of red (R) , green (G) or blue (B) , and six pixels of white (W) .
Three charges of red (R), green (G) or blue (B) are read out from the image sensor through the color filter CF12 and added to each other for each color, and six charges of white (W) are read out therefrom and added to each other.
Fig. 10 shows a process for reading out charges through the color filter CF13 (3x3 units) . Each unit includes one pixel of red (R) , green (G) or blue (B) , and eight pixels of white (W) .
One charge of red (R) , green (G) or blue (B) is read out from the image sensor through the color filter CF13, and eight charges of white (W) are read out therefrom and added to each other.
As described above, plural charges of white (W) are added to each other, and one charge of red (R) , green (G) or blue (B) is read out without addition, or plural charges of red (R) , green (G) or blue (B) are read out and added to each pixel of the same color, from the image sensor through the color filter CF11, CF12 or CF13. Charges of red (R) , green (G) or blue (B) , and charges of white (W) are read out independently from each other.
As a result, charges of red (R) , green (G) , blue (B) and white (W) are read out, respectively, as if there were two image sensors, namely a monochrome image sensor including only white (W) pixels and a color image sensor including red (R) , green (G) and blue (B) of a Bayer arrangement (RGGB) .
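A minimal model of this readout, assuming per-unit charge values and the unit layouts described above (the helper function and its arguments are invented for illustration):

```python
def read_unit(charges, layout):
    """Return (summed W charge, color letter, summed color charge) for one
    unit; `layout` marks each position with "W" or a color letter."""
    w_sum, c_sum, color = 0.0, 0.0, None
    for charge_row, layout_row in zip(charges, layout):
        for charge, kind in zip(charge_row, layout_row):
            if kind == "W":
                w_sum += charge  # white charges are added to each other
            else:
                # charges of the unit's single chromatic color are summed
                # (a single charge passes through unchanged)
                color, c_sum = kind, c_sum + charge
    return w_sum, color, c_sum
```

The white sum and the color sum are returned independently, mirroring the independent readout of the monochrome-like and color-like signals.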
The processor 20, or the image signal processor 21 and the main processor 22 receives the signals of red (R) , green (G) and blue (B) , and the signals of white (W) .
The signals of white (W) are generated by adding the charges of white (W) pixels at this stage, as described above. Thus, the sensitivity of the signals of white (W) is higher than that of the signals of red (R) , green (G) and blue (B) . This is an important matter when performing HDR composition with the signals of white (W) and the signals of red (R) , green (G) and blue (B) , as described later.
Though the sensitivities of the signals of red (R), green (G) and blue (B) are lower than that of the signals of white (W), these signals have sufficient margin before they saturate. Therefore, even after the signals of white (W) reach a point of saturation, the signals of red (R), green (G) and blue (B) will not saturate immediately.
In the step S2, demosaicing is performed for the signals of red (R), green (G) and blue (B), since the numbers of the pixels of red (R), green (G) and blue (B) are small. It is not necessary to perform demosaicing for the signals of white (W), since the pixels of white (W) are arrayed over almost the entire area of the color filter.
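As a toy one-dimensional stand-in for this demosaicing (real demosaicing is two-dimensional and typically edge-aware), the sparse color samples, one per unit, can be linearly interpolated to every pixel position; all names and values here are illustrative:

```python
def demosaic_1d(samples, positions, width):
    # Linearly interpolate known samples (at `positions`) to all `width`
    # pixel positions; positions outside the sample range clamp to the
    # nearest known sample.
    out = []
    for x in range(width):
        left = max((p for p in positions if p <= x), default=positions[0])
        right = min((p for p in positions if p >= x), default=positions[-1])
        if left == right:
            out.append(samples[positions.index(left)])
        else:
            t = (x - left) / (right - left)
            li, ri = positions.index(left), positions.index(right)
            out.append((1 - t) * samples[li] + t * samples[ri])
    return out
```

The dense white (W) samples need no such step, which is why the luminance channel retains full resolution.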
In the step S3, W-RGB cooperation noise reduction including pattern matching is performed using both the signals of white (W) and the signals of red (R), green (G) and blue (B).
As described above, the bright image of white (W) and the dark image of red (R) , green (G) and blue (B) are acquired at this stage.
If subsequent processes are performed directly on the dark image of red (R) , green (G) and blue (B) , the SNR becomes low and the quantization error becomes large.
Therefore, in the step S3, it is necessary to improve the SNR and to interpolate the quantization error by performing noise reduction including pattern matching in a wide range.
Since the signals of white (W) have high sensitivity and high SNR, it is possible to perform pattern matching on the signals of white (W) with higher accuracy compared with pattern matching on the signals of red (R) , green (G) and blue (B) , as long as the pixels of white (W) are not saturated. Pattern matching on the signals including a lot of noise and/or quantization error results in low accuracy.
It is possible to perform noise reduction on the signals of red (R) , green (G) and blue (B) with high SNR by using the pattern-matched signals of white (W) .
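One simple way to realize such W-guided noise reduction (a sketch, not the disclosure's actual algorithm): select neighbours by pattern-matching on the high-SNR W values, then average the corresponding color values. The matching threshold is an invented parameter:

```python
def guided_denoise(color, white, index, threshold=0.1):
    # Pattern matching on W: keep positions whose W value is close to the
    # W value at the centre pixel, then average their colour values.
    matches = [c for c, w in zip(color, white)
               if abs(w - white[index]) <= threshold]
    return sum(matches) / len(matches)
```

Because the selection is driven by the clean W signal, a bright outlier is excluded from the average even when the noisy color values alone could not distinguish it.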
In the step S4, the HDR gain adjustment is performed on the noise-reduced signals of red (R) , green (G) and blue (B) , in order to match sensitivities of the signals of red (R) , green (G) and blue (B) , and the signals of white (W) .
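As a sketch of this gain adjustment, the mismatch can be modelled as a single multiplicative gain derived from the pixel-count ratio and an assumed per-pixel sensitivity ratio (the 2.0 figure below is illustrative, not from the disclosure):

```python
def hdr_gain(n_white, n_color, sensitivity_ratio=2.0):
    # Gain applied to the colour signals so their scale matches the binned
    # W signal; sensitivity_ratio is the assumed per-pixel W-vs-colour ratio.
    return (n_white / n_color) * sensitivity_ratio

# e.g. a CF11-style unit: 3 binned W pixels vs. 1 colour pixel
rgb_matched = [v * hdr_gain(3, 1) for v in [0.01, 0.02, 0.05]]
```

For a CF12-style unit (6 W pixels vs. 3 color pixels) the count ratio is smaller, so the required gain is correspondingly lower.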
In the step S5, a signal of white (W) is generated from the signals of red (R) , green (G) and blue (B) , and in the step S6, the HDR composition is performed on the signals of white (W) with the signal of white (W) generated from the signals of red (R) , green (G) and blue (B) .
The signal of white (W) generated from the signals of red (R) , green (G) and blue (B) has low sensitivity, thus there is sufficient margin before the signal of white (W) reaches a point of saturation. As a result, it is possible to supplement a saturated portion (blown-out highlights due to overexposure) included in the image of white (W) by using the signal of white (W) generated from the signals of red (R) , green (G) and blue (B) .
Incidentally, the signal of white (W) can be acquired by back-calculation using color reproduction improvement parameters of red (R) , green (G) and blue (B) , and parameters for generating brightness generated from the signals of white (W) .
The HDR composition in the step S6 is performed by α-blending according to brightness of the pixels of white (W) . The signal of white (W) generated from the signals of red (R) , green (G) and blue (B) , which is used for the HDR composition, has low quantization error. Therefore, it is possible to perform HDR composition over a wide range of levels, from a low to high output level of signals.
The linearity and signal characteristics of white generated from the signals of white (W) and those of white generated from the signals of red (R) , green (G) and blue (B) do not exactly match with each other.
However, as described above, it is possible to minimize a problem due to the mismatch between these signals by performing HDR composition over a wide range of levels, from a low to a high output level of the signals. Further, when performing HDR composition of low-level signals, there is a possibility that noise increases. In the step S6, since the HDR composition is performed using the noise-reduced (step S3) white signal generated from the signals of red (R), green (G) and blue (B), noise is suppressed.
In the step S7, when generating a luminance signal (Y) from the signals of white (W) , the signals of red (R) and the signals of blue (B) are used to improve luminance reproducibility.
Since a luminance signal (Y) is generated from the signals of white (W) on which demosaicing is not performed, the resolution is very high.
In the step S8, calculation for improvement of color reproduction of red (R) , green (G) and blue (B) is performed by using a matrix for improvement of color reproduction.
In the step S9, the color-difference signals CbCr are generated from the signals of red (R) , green (G) and blue (B) .
In the step S10, HDR image signals (YCbCr) are generated using the luminance signal (Y) from the signals of white (W) and the color-difference signals CbCr from the signals of red (R) , green (G) and blue (B) .
The dynamic range can be expanded by up to 15 dB in the case of using the color filter CF11 shown in Fig. 8.
The above functions shown in Fig. 7 can be realized by a circuit configuration provided in the imaging device 2, or software installed in the camera system 1A or 1B.
The technical effect which can be obtained by using the camera systems according to the first embodiment and the second embodiment, and the method for HDR composition according to the third embodiment of the present disclosure is described below.
Referring to Fig. 11, two kinds of images, namely a first image with a high sensitivity and achromatic color, generated from the signals of white (W) , and a second image with a low sensitivity and chromatic color, generated from the signals of red (R) , green (G) and blue (B) can be acquired by one image sensor.
In the first image, the entire image is bright; however, the bright portions, such as the area with the light, are blown-out highlights due to overexposure. In the second image, the entire image is dark; however, the bright portions including the light are not blown-out highlights.
An HDR image is acquired by performing HDR composition and adaptive tone mapping correction with the first and second images. The HDR image is overall bright, resolution and SNR are improved, and the area of overexposure including the light is compensated and thus there is no portion of blown-out highlights.
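The adaptive tone mapping correction is not detailed in the disclosure; as one common example of the idea (not necessarily the operator used here), a Reinhard-style global operator compresses the composited HDR luminance back into display range:

```python
def tone_map(y, white_point=4.0):
    # Reinhard-style global operator: roughly linear at low luminance,
    # compressive at high luminance, with the white point mapped to 1.0.
    return y * (1.0 + y / white_point ** 2) / (1.0 + y)
```

The white_point parameter (an assumed value) sets which HDR luminance is mapped to full output, so the compensated highlight region can be rendered without clipping.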
An image acquired by the camera system including the conventional color filter CF2 is shown in Fig. 12A, and an image acquired by the camera system including the color filter CF13 according to the first embodiment of the present disclosure is shown in Fig. 12B.
Fig. 12B shows, in comparison with the image shown in Fig. 12A, that the resolution of the camera system including the color filter CF13 according to the first embodiment of the present disclosure is improved.
In contrast with a conventional method of capturing plural images by plural exposures, it is possible to acquire one image by performing HDR composition with no time difference between the two constituent images.
Further, in contrast with another conventional method of capturing plural images by plural image sensors, such as a color image sensor and a monochrome sensor, one image sensor has the functions of both color and monochrome sensors, and thus there is no misalignment between plural image sensors.
Thus, according to the present disclosure, the functions of a color camera and a monochrome camera, conventionally provided as a dual camera, can be combined into one camera.
It is possible to improve the fusion accuracy of the color and monochrome signals, and thus an HDR image can be acquired without artifacts.
Further, since pixels of white (W) are arrayed over almost the entire area of the color filters CF11, CF12 or CF13, demosaicing is not necessary for the signals of white (W) and thus an image with higher resolution can be acquired, compared with the conventional color filters which require demosaicing.
As described above, according to the camera system and the method for HDR composition of the present disclosure, two kinds of images, namely one image composed of charges of white (W) pixels and another image composed of charges of red (R), green (G) and blue (B) pixels, are generated, and HDR composition is performed using these two kinds of images while maintaining the high resolution and high SNR of the white (W) image.
In addition, since the dynamic range is increased by using one image sensor, it is possible to reduce the number of shots and the exposure time compared to the conventional method.
In the description of embodiments of the present disclosure, it is to be understood that terms such as "central" , "longitudinal" , "transverse" , "length" , "width" , "thickness" , "upper" , "lower" , "front" , "rear" , "left" , "right" , "vertical" , "horizontal" , "top" , "bottom" , "inner" , "outer" , "clockwise" and "counterclockwise" should be construed to refer to the orientation or the position as described or as shown in the drawings under discussion. These relative terms are only used to simplify the description of the present disclosure, and do not indicate or imply that the device or element referred to must have a particular orientation, or constructed or operated in a particular orientation. Thus, these terms cannot be construed to limit the present disclosure.
In addition, terms such as "first" and "second" are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features. Thus, the feature defined with "first" and "second" may comprise one or more of this feature. In the description of the present disclosure, "a plurality of" means two or greater than two, unless specified otherwise.
In the description of embodiments of the present disclosure, unless specified or limited otherwise, the terms "mounted" , "connected" , "coupled" and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electrical connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements, which can be understood by those skilled in the art according to specific situations.
In the embodiments of the present disclosure, unless specified or limited otherwise, a structure in which a first feature is "on" or "below" a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are contacted via an additional feature formed therebetween. Furthermore, a first feature "on" , "above" or "on top of" a second feature may include an embodiment in which the first feature is right or obliquely "on" , "above" or "on top of" the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature  "below" , "under" or "on bottom of" a second feature may include an embodiment in which the first feature is right or obliquely "below" , "under" or "on bottom of" the second feature, or just means that the first feature is at a height lower than that of the second feature.
Various embodiments and examples are provided in the above description to implement different structures of the present disclosure. In order to simplify the present disclosure, certain elements and settings are described in the above. However, these elements and settings are only by way of example and are not intended to limit the present disclosure. In addition, reference numbers and/or reference letters may be repeated in different examples in the present disclosure. This repetition is for the purpose of simplification and clarity and does not refer to relations between different embodiments and/or settings. Furthermore, examples of different processes and materials are provided in the present disclosure. However, it would be appreciated by those skilled in the art that other processes and/or materials may be also applied.
Reference throughout this specification to "an embodiment" , "some embodiments" , "an exemplary embodiment" , "an example" , "a specific example" or "some examples" means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present disclosure. Thus, the appearances of the above phrases throughout this specification are not necessarily referring to the same embodiment or example of the present disclosure. Furthermore, the particular features, structures, materials, or characteristics may be combined in any suitable manner in one or more embodiments or examples.

Claims (20)

  1. A camera system, comprising:
    an image sensor comprising a plurality of image sensor pixels and outputting first color signals, second color signals, third color signals and fourth color signals;
    a color filter arranged on the plurality of the image sensor pixels of the image sensor, the color filter including first color pixels, second color pixels, third color pixels and fourth color pixels; and
    a processor receiving a first image generated by the first color signals, the second color signals and the third color signals, and a second image generated by the fourth color signals, performing pattern matching on the second image, generating a fourth color signal from the first image and performing HDR composition using the pattern-matched second image and the fourth color signal generated from the first image, generating a luminance signal from the second image, generating a chromatic signal from the first image, and generating image data using the luminance signal and the chromatic signal.
  2. The camera system according to claim 1, wherein the processor adds the fourth color signals to generate the second image and, when there are a plurality of filter pixels of the first color pixels, the second color pixels and/or the third color pixels, adds the first color signals, the second color signals and/or the third color signals of a same color to each other so as to generate the first image.
  3. The camera system according to claim 1 or 2, wherein the processor performs demosaicing on the first image.
  4. The camera system according to any one of claims 1 to 3, wherein the processor performs noise reduction on the first image using the pattern-matched second image.
  5. The camera system according to any one of claims 1 to 4, wherein the processor performs gain adjustment on the first image to adjust a difference between a number of the first color pixel (s) , the second color pixel (s) and the third color pixel (s) , and a number of the fourth color pixels.
  6. The camera system according to any one of claims 1 to 5, wherein the processor compensates a portion of overexposure in the second image by using the fourth color signal generated from the first image during the HDR composition.
  7. The camera system according to any one of claims 1 to 6, wherein during the HDR composition, noise is reduced by using the noise-reduced first image, even when the HDR composition is performed over a wide range, from a low output level to a high output level.
  8. The camera system according to any one of claims 1 to 7, wherein the color filter includes a plurality of units, each of the units of the color filter comprises the filter pixels arrayed in accordance with a square array pattern having N*N (N is equal to or greater than two) filter pixels.
  9. The camera system according to any one of claims 1 to 8, wherein the first color is green (G) , red (R) or blue (B) , the second color is red (R) , blue (B) or green (G) , the third color is blue (B) , green (G) or red (R) , and the fourth color is white (W) .
  10. The camera system according to claim 8 or 9, wherein the square array pattern has 2*2 filter pixels, including one filter pixel of the first color, the second color or the third color, and three filter pixels of the fourth color.
  11. The camera system according to claim 10, wherein four units adjacent to each other comprise three filter pixels of the first color, the second color or the third color, one filter pixel of the second color, the third color or the first color, one filter pixel of the third color, the first color or the second color, and twelve filter pixels of the fourth color.
  12. The camera system according to claim 8 or 9, wherein the square array pattern has 3*3 filter pixels, including three filter pixels of the first color, the second color or the third color, and six filter pixels of the fourth color.
  13. The camera system according to claim 12, wherein four units adjacent to each other comprise six filter pixels of the first color, the second color or the third color, three filter pixels of the second color, the third color or the first color, three filter pixels of the third color, the first color or the second color, and twenty-four filter pixels of the fourth color.
  14. The camera system according to claim 12, wherein the three filter pixels of the first color, the second color or the third color are provided in a central position, and at a left and right of the central position or upper and lower positions of the central position, and the six filter pixels of the fourth color are provided in the remaining positions.
  15. The camera system according to claim 8 or 9, wherein the square array pattern has 3*3 filter pixels, including one filter pixel of the first color, the second color or the third color, and eight filter pixels of the fourth color.
  16. The camera system according to claim 15, wherein four units adjacent to each other comprise one filter pixel of the first color, the second color or the third color, one filter pixel of the second color, the third color or the first color, one filter pixel of the third color, the first color or the second color, and thirty-two filter pixels of the fourth color.
  17. The camera system according to claim 15, wherein the one filter pixel of the first color, the second color or the third color is provided in a central position, and the eight filter pixels of the fourth color are provided in the remaining positions.
  18. A method for HDR composition, the method comprising:
    acquiring a first image of a first color, a second color and a third color, and a second image of a fourth color;
    performing pattern matching on the second image;
    generating a fourth color signal from the first image and performing HDR composition using the pattern-matched second image and the fourth color signal generated from the first image;
    generating a luminance signal from the second image;
    generating a chromatic signal from the first image; and
    generating image data using the luminance signal and the chromatic signal.
  19. The method for HDR composition according to claim 18, wherein an overexposed portion of the second image is compensated using the fourth color signal generated from the first image during the HDR composition.
  20. A computer usable medium storing software for implementing the method according to claim 18 or 19.
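The 3*3 filter units of claims 12-17 can be sketched in code. This is an illustrative model only: the patent names the colors abstractly as first through fourth, so the concrete assignment R/G/B for the three chromatic colors, "W" (white) for the fourth color, and the Bayer-like G/R/B/G ordering of four adjacent units are all assumptions made here to make the pixel counts of claims 13 and 16 come out.

```python
import numpy as np

def unit_3x3(color, layout="center"):
    """One 3*3 filter unit. 'center' places a single color pixel in the
    middle with eight fourth-color pixels around it (claims 15-17);
    'cross-h' / 'cross-v' place three color pixels at the center plus its
    horizontal or vertical neighbours, with six fourth-color pixels in
    the remaining positions (claims 12-14)."""
    u = np.full((3, 3), "W", dtype=object)  # fourth color everywhere
    u[1, 1] = color                          # central position
    if layout == "cross-h":
        u[1, 0] = u[1, 2] = color            # left and right of center
    elif layout == "cross-v":
        u[0, 1] = u[2, 1] = color            # above and below center
    return u

def four_units(colors, layout="center"):
    """Tile four adjacent units in a 2x2 arrangement. With the assumed
    G/R/B/G color order, the 'center' layout yields thirty-two
    fourth-color pixels (claim 16) and the cross layouts yield
    twenty-four (claim 13)."""
    a, b, c, d = (unit_3x3(col, layout) for col in colors)
    return np.vstack([np.hstack([a, b]), np.hstack([c, d])])

tile = four_units(("G", "R", "B", "G"), layout="center")
```

The resulting 6*6 tile makes the counting in the claims easy to verify: each "center" unit contributes eight fourth-color pixels, so four adjacent units contribute thirty-two.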
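The method of claims 18-19 can likewise be sketched. The patent does not specify how the fourth color signal is derived from the first image, how overexposure is detected, or how luminance and chroma are recombined; the Rec. 601 luma weights, the fixed clipping threshold, and the additive recombination below are all assumptions chosen to illustrate the claimed flow, not the claimed implementation. Pattern matching (aligning the second image to the first) is elided.

```python
import numpy as np

def hdr_compose(rgb, w, clip=0.95):
    """Sketch of the HDR composition of claims 18-19.

    rgb  : (H, W, 3) float array in [0, 1] -- first image
           (first, second and third colors)
    w    : (H, W) float array in [0, 1]    -- second image (fourth color)
    clip : threshold above which a fourth-color pixel is treated as
           overexposed (an assumption; the patent does not give one)
    """
    # Generate a fourth-color signal from the first image (claim 18);
    # the Rec. 601 weights are an assumed conversion.
    w_from_rgb = rgb @ np.array([0.299, 0.587, 0.114])

    # HDR composition: overexposed portions of the second image are
    # compensated with the signal generated from the first image
    # (claim 19).
    overexposed = w >= clip
    w_hdr = np.where(overexposed, w_from_rgb, w)

    # Luminance from the composed second image, chroma from the first.
    y = w_hdr
    chroma = rgb - w_from_rgb[..., None]

    # Generate image data from the luminance and chromatic signals.
    return np.clip(y[..., None] + chroma, 0.0, 1.0)
```

A quick usage example: for a gray scene where one fourth-color pixel has saturated, the saturated pixel is replaced by the value reconstructed from the first image, so the output stays consistent across the frame.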
PCT/CN2021/077520 2021-02-23 2021-02-23 Camera system, method for hdr composition, and computer usable medium storing software for implementing method WO2022178684A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2021/077520 WO2022178684A1 (en) 2021-02-23 2021-02-23 Camera system, method for hdr composition, and computer usable medium storing software for implementing method
CN202180084392.2A CN116648925A (en) 2021-02-23 2021-02-23 Camera system, method for HDR synthesis, and computer usable medium storing software for implementing the method

Publications (1)

Publication Number Publication Date
WO2022178684A1 true WO2022178684A1 (en) 2022-09-01

Family

ID=83048621

Country Status (2)

Country Link
CN (1) CN116648925A (en)
WO (1) WO2022178684A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090153699A1 (en) * 2007-12-18 2009-06-18 Sony Corporation Imaging apparatus, imaging processing method, and imaging control program
US20140240548A1 (en) * 2013-02-22 2014-08-28 Broadcom Corporation Image Processing Based on Moving Lens with Chromatic Aberration and An Image Sensor Having a Color Filter Mosaic
US20160255289A1 (en) * 2015-02-27 2016-09-01 Semiconductor Components Industries, Llc High dynamic range imaging systems having differential photodiode exposures
US20200280704A1 (en) * 2019-02-28 2020-09-03 Qualcomm Incorporated Quad color filter array image sensor with aperture simulation and phase detection

Also Published As

Publication number Publication date
CN116648925A (en) 2023-08-25

Similar Documents

Publication Publication Date Title
US9287316B2 (en) Systems and methods for mitigating image sensor pixel value clipping
CN112261391B (en) Image processing method, camera assembly and mobile terminal
JP5173493B2 (en) Imaging apparatus and imaging system
WO2021212763A1 (en) High-dynamic-range image processing system and method, electronic device and readable storage medium
JP4025207B2 (en) Solid-state image sensor and digital camera
WO2010089830A1 (en) Image pick-up device
JP2018196083A (en) Image processing system
JP2006148931A (en) SoC CAMERA SYSTEM EMPLOYING COMPLEMENTARY COLOR FILTER
US8111298B2 (en) Imaging circuit and image pickup device
CN109922283B (en) Image pickup apparatus and control method of image pickup apparatus
WO2020177123A1 (en) Color imaging system
JP5526673B2 (en) Solid-state imaging device and electronic device
JP2008141658A (en) Electronic camera and image processor
JP2003179819A (en) Image pickup device
US20040125226A1 (en) Apparatus for compensating for shading on a picture picked up by a solid-state image sensor over a broad dynamic range
US20230247308A1 (en) Image processing method, camera assembly and mobile terminal
JP5917160B2 (en) Imaging apparatus, image processing apparatus, image processing method, and program
WO2022178684A1 (en) Camera system, method for hdr composition, and computer usable medium storing software for implementing method
JP2001016598A (en) Color imaging device and image pickup device
JPH10189930A (en) Solid-state image sensor
JP4309505B2 (en) Imaging device and imaging apparatus
JPH0378388A (en) Color solid state image pickup element
WO2022109802A1 (en) Color imaging system
KR20220051240A (en) Image capture method, camera assembly and mobile terminal
WO2022257084A1 (en) Imaging apparatus, imaging system and method for enhancing expected color, and medium storing software for implementing the method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 21927129; Country of ref document: EP; Kind code of ref document: A1
WWE Wipo information: entry into national phase
    Ref document number: 202180084392.2; Country of ref document: CN
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 21927129; Country of ref document: EP; Kind code of ref document: A1