WO2023044856A1 - Method for improving quality of image captured by image sensor having image plane phase-difference pixels, electronic device, computer-readable storage medium and terminal device - Google Patents


Info

Publication number: WO2023044856A1
Authority: WO (WIPO (PCT))
Prior art keywords: type, light source, image, focus position, color
Application number: PCT/CN2021/120677
Other languages: French (fr)
Inventor: Tsuyoshi Okuzaki
Original Assignee: Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority to PCT/CN2021/120677
Priority to CN202180098729.5A (published as CN117396734A)
Publication of WO2023044856A1

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 - Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/70 - SSIS architectures; Circuits associated therewith
    • H04N 25/703 - SSIS architectures incorporating pixels for producing signals other than image signals
    • H04N 25/704 - Pixels specially adapted for focusing, e.g. phase difference pixel sets
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/10 - Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N 23/12 - Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with one sensor only
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 - Camera processing pipelines; Components thereof
    • H04N 23/84 - Camera processing pipelines; Components thereof for processing colour signals
    • H04N 23/88 - Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/90 - Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums

Definitions

  • the present disclosure relates to a method for improving quality of an image captured by an image sensor having image plane phase-difference pixels, an electronic device performing the method, a computer-readable storage medium storing a program to implement the method, and a terminal device performing the method.
  • An electronic device such as a smartphone has a camera module which includes a lens and an image sensor.
  • the image sensor converts light transmitted through the lens into colored light by using a color filter attached to a photodiode of a pixel.
  • the photodiode receives the colored light to output a corresponding color signal. Red, Green and Blue color signals make up an image signal or an image data of the image captured by the camera module.
  • In recent years, an image sensor having new pixels that can detect phase differences for the autofocus operation has been developed. Such a pixel is referred to as an “image plane phase-difference pixel”.
  • a subject to be imaged is usually illuminated by a light source such as sunlight or a room light such as a fluorescent lamp or an LED lamp.
  • the subject itself may include a light source.
  • the term “type” of light source includes not only the kind of the light source, such as sunlight, fluorescent lamp, LED lamp or OLED lamp, but also its color temperature.
  • a type of the light source is estimated by using the image signal. Specifically, the type of the light source is estimated from the integrated value for each color. The first integrated value is obtained by integrating the output values of the pixels for red. The second integrated value is obtained by integrating the output values of the pixels for green. The third integrated value is obtained by integrating the output values of the pixels for blue. The type of light source is estimated based on the first integrated value, the second integrated value and the third integrated value.
  • after the type of the light source is estimated, color reproduction processing (e.g., white balance processing and color matrix processing) is performed on the captured image based on the estimated type.
  • the color reproduction processing may not be performed properly.
  • One of the reasons is that it is difficult to accurately estimate the type of the light source due to the non-ideal characteristics of color filters. As shown in FIG. 15, the transmission characteristics of a color filter actually allow light having wavelengths other than the desired wavelength to be transmitted (see the areas A1 and A2). Therefore, the image signal does not indicate the exact intensity for each color.
  • the present disclosure aims to solve at least one of the technical problems mentioned above. Accordingly, the present disclosure aims to provide a method for improving the quality of an image captured by an image sensor having image plane phase-difference pixels, an electronic device performing the method, and a computer-readable storage medium storing a program to implement the method.
  • a method for improving quality of an image captured by an image sensor having image plane phase-difference pixels includes acquiring an image signal and a phase difference signal of the image when the image sensor is exposed, determining a type of a light source of the image based on the phase difference signal, and performing a color reproduction processing on the image based on the type of the light source.
  • the determining a type of a light source of the image based on the phase difference signal may include obtaining a first focus position for a first color based on the phase difference signal, obtaining a second focus position for a second color based on the phase difference signal, and estimating a first type of the light source based on the first focus position and the second focus position.
  • the estimating the first type of the light source based on the first focus position and the second focus position may include calculating a ratio of the first focus position to the second focus position, and estimating the type of the light source based on the ratio.
  • the first color may be red and the second color may be green.
  • the first color may be blue and the second color may be green.
  • the method may further include estimating a second type of the light source based on the image signal.
  • the determining a type of a light source of the image based on the phase difference signal may include determining the type of the light source to be the first type when the first type and the second type are different from each other and the first type is a predetermined type.
  • the predetermined type may be a white LED.
  • the determining a type of a light source of the image based on the phase difference signal may include obtaining a first focus position for a first color of the image sensor based on the phase difference signal, obtaining a second focus position for a second color of the image sensor based on the phase difference signal, obtaining a third focus position for a third color of the image sensor based on the phase difference signal, calculating a first ratio of the first focus position to the second focus position, calculating a second ratio of the third focus position to the second focus position, and estimating a first type of the light source based on the first ratio and the second ratio.
  • the performing a color reproduction processing on the image based on the type of the light source may include performing white balance processing on the image based on a spectrum of the type of the light source.
  • the performing a color reproduction processing on the image based on the type of the light source may include selecting a linear matrix corresponding to the type of the light source, and converting color in the image based on the linear matrix.
  • an electronic device includes a processor and a memory for storing instructions.
  • the instructions when executed by the processor, cause the processor to perform the method according to the present disclosure.
  • a computer-readable storage medium on which a computer program is stored, is provided.
  • the computer program is executed by a computer to implement the method according to the present disclosure.
  • a terminal device includes one or more processors, a memory and one or more programs.
  • the one or more programs include instructions and are stored in the memory.
  • the one or more programs are configured to be executed by the one or more processors to perform the method according to the present disclosure.
  • FIG. 1 is a functional block diagram illustrating a configuration of an electronic device according to an embodiment of the present disclosure.
  • FIG. 2 is a plan view showing a part of an image sensor in the electronic device.
  • FIG. 3 is a cross-sectional view taken along a line I-I of FIG. 2.
  • FIG. 4 is a functional block diagram of an image signal processor in the electronic device according to an embodiment of the present disclosure.
  • FIG. 5 is a flowchart for improving color reproduction for an image acquired by the image sensor according to an embodiment of the present disclosure.
  • FIG. 6 is a flowchart for deciding a type of light source of the image according to an embodiment of the present disclosure.
  • FIG. 7 is a flowchart for estimating a type of the light source based on a phase difference signal according to an embodiment of the present disclosure.
  • FIG. 8 is a diagram for explaining how to calculate a focus position according to an embodiment of the present disclosure.
  • FIG. 9 is an example of a graph for calculating the focus position according to an embodiment of the present disclosure.
  • FIG. 10 is an example of a graph showing an average value of wavelengths of light received by the photodiode for each color filter when the light source is DL5000.
  • FIG. 11 is an example of a graph showing an average value of wavelengths of light received by the photodiode for each color filter when the light source is LED5000K.
  • FIG. 12 is an example of a graph showing a ratio (R/G) of the focus positions for angles of the chromaticity diagram.
  • FIG. 13 is an example of a graph showing a ratio (B/G) of the focus positions for angles of the chromaticity diagram.
  • FIG. 14 is an example of an image after color reproduction processing in the conventional case (left) and the case of the present disclosure (right).
  • FIG. 15 is an example of a graph showing the characteristics of actual color filters.
  • FIG. 16 is an example of a graph showing spectra of different types of light sources.
  • FIG. 1 is a functional block diagram illustrating an example of a configuration of the electronic device 100 according to an embodiment of the present disclosure.
  • the electronic device 100 is a mobile device such as a smartphone, a tablet terminal or a mobile phone, but may be other types of electronic device equipped with a camera module.
  • the electronic device 100 includes a camera module 10, a range sensor module 20, an image signal processor (ISP) 30, a global navigation satellite system (GNSS) module 40, a wireless communication module 41, a CODEC 42, a speaker 43, a microphone 44, a display module 45, an input module 46, an inertial measurement unit (IMU) 47, a main processor 48 and a memory 49.
  • the camera module 10 is configured to capture an image.
  • the module 10 can also take a video.
  • the camera module 10 includes a lens unit 11 that is capable of focusing on a subject, an image sensor 12 that detects an image inputted via the lens unit 11, and an image sensor driver 13 that drives the image sensor 12.
  • the lens unit 11 may include a plurality of lenses. At least one of the lenses can move along the optical axis direction for focusing operation.
  • the camera module 10 may be a stereo camera module for binocular stereo viewing.
  • the stereo camera module comprises a master camera module and a slave camera module.
  • FIG. 2 is a plan view showing a part of the image sensor 12.
  • FIG. 3 is a cross-sectional view taken along a line I-I of FIG. 2.
  • the image sensor 12 has a plurality of pixels arranged in a grid pattern.
  • the image sensor 12 has three types of pixels, that is, a pixel 12R for red, a pixel 12G for green and a pixel 12B for blue. As shown in FIG. 2, the pixels 12R, 12G and 12B are arranged according to the Bayer layout.
  • the image sensor 12 is not limited to the RGB pixels arranged according to the Bayer layout, but it may contain pixels for white, i.e., it may consist of RGBW pixels.
  • the pixels 12R, 12G and 12B are image plane phase-difference pixels so that the phase detection autofocus operation can be performed.
  • the pixel 12R has an on-chip lens (micro lens) 121, a color filter 122R and two photodiodes 123L and 123R.
  • the on-chip lens 121 efficiently collects the incident light on the photodiodes 123L and 123R.
  • the color filter 122R is configured to transmit red light, and it is provided so as to cover the photodiodes 123L and 123R.
  • the photodiodes 123L and 123R are located on the left and right sides of the pixel 12R, respectively.
  • the photodiode 123L receives the light transmitted through the right side of the lens unit 11.
  • the photodiode 123R receives the light transmitted through the left side of the lens unit 11.
  • the pixels 12G and 12B have the same configuration as the pixel 12R. That is, the pixel 12G has an on-chip lens 121, a color filter 122G that transmits green light, and two photodiodes 123L and 123R. The pixel 12B has an on-chip lens 121, a color filter that transmits blue light, and two photodiodes 123L and 123R.
  • the photodiodes of the pixel are not limited to ones that divide the pixel into two parts, left and right; the photodiodes may divide the pixel into four parts: top, bottom, left and right.
  • the color filters are not limited to RGB, and they may be CMY (cyan, magenta, yellow).
  • the pixels 12R, 12G and 12B may be arranged continuously or at predetermined intervals over the entire surface of the image sensor 12, or may be arranged on a part of the surface.
  • the image signal can be obtained by adding an output of the photodiode 123L and an output of the photodiode 123R for each pixel of each color.
  • the integration of the outputs may be made for the entire image or for a portion of the image.
  • the phase difference signal can be obtained by adding an output of either the photodiode 123L or the photodiode 123R of the same color pixels in a predetermined pixel block.
  • the pixel block contains four pixels of the same color, for example. As shown in FIG. 2, each of the pixel blocks PG1, PG2 and PG3 has four pixels 12R. By using an added value or average value of the outputs of the four pixels, the influence of noise can be reduced.
  • the pixel block may contain a plurality of same-color pixels, the plurality being other than four (e.g., two, six or nine).
  • the range sensor module 20 includes a lens unit 21, a range sensor 22, a range sensor driver 23 and a projector 24 that emits pulsed light toward a subject.
  • the range sensor module 20 can measure the distance between the electronic device 100 and the subject.
  • the range sensor module 20 is a ToF camera and captures a time-of-flight depth map or a ToF depth map by emitting pulsed light toward the subject and detecting light reflected from the subject.
  • the range sensor module 20 may be omitted.
  • the image signal processor 30 sends instructions to the image sensor driver 13 and the range sensor driver 23 to control the camera module 10 and the range sensor module 20, respectively.
  • the ISP 30 acquires data of an image captured by the camera module 10.
  • the data contain an image signal and a phase difference signal.
  • alternatively, the data may be a RAW signal.
  • the image signal and the phase difference signal can be obtained from the RAW signal.
  • the functions of the ISP 30 are described in detail with reference to FIG. 4.
  • the ISP 30 includes an acquiring unit 31, a determining unit 32 and a performing unit 33.
  • the acquiring unit 31 is configured to acquire the data of an image when the camera module 10 captures the image. Specifically, the acquiring unit 31 acquires an image signal and a phase difference signal of the image when the image sensor 12 is exposed. For example, the acquiring unit 31 acquires the image signal and the phase difference signal stored in the memory 49.
  • the acquiring unit 31 may acquire the RAW signal of the image and obtain the image signal and the phase difference signal from the RAW signal.
  • the determining unit 32 is configured to determine a type of light source of the captured image based on the phase difference signal. The details of how to determine the type of light source will be explained later with reference to FIGs. 7 to 9.
  • the performing unit 33 is configured to perform a color reproduction processing on the image based on the type of the light source.
  • the color reproduction processing is white balance processing and/or color matrix processing.
  • the GNSS module 40 measures a current position of the electronic device 100.
  • the wireless communication module 41 performs wireless communications with the Internet.
  • the CODEC 42 bi-directionally performs encoding and decoding, using a predetermined encoding/decoding method.
  • the speaker 43 outputs a sound in accordance with sound data decoded by the CODEC 42.
  • the microphone 44 outputs sound data to the CODEC 42 based on inputted sound.
  • the display module 45 displays various information such as an image captured by the camera module 10 in real-time, a User Interface (UI), and a color-reproduced image which is created by the ISP 30.
  • the input module 46 receives information via a user’s operation.
  • the input module 46 is, for example, a touch panel or a keyboard.
  • the input module 46 inputs an instruction to capture and store an image displayed on the display module 45.
  • the IMU 47 detects an angular velocity and an acceleration of the electronic device 100. For example, a posture of the electronic device 100 can be determined from the measurement results of the IMU 47.
  • the main processor 48 controls the GNSS module 40, the wireless communication module 41, the CODEC 42, the speaker 43, the microphone 44, the display module 45, the input module 46, and the IMU 47.
  • the memory 49 stores data of an image, data of depth map, various camera parameters to be used in image processing, and a program which runs on the image signal processor 30 and/or the main processor 48.
  • a method for improving the quality of an image captured by the image sensor 12 consisting of the pixels 12R, 12G and 12B according to an embodiment of the present disclosure will be described with reference to the flowchart shown in FIG. 5.
  • the camera module 10 captures an image.
  • the data of the image are stored in the memory 49.
  • the data may be an image signal and a phase difference signal, or a RAW signal.
  • the acquiring unit 31 acquires the data of the image.
  • the determining unit 32 determines a type of light source of the image captured in the step S1.
  • the type of the light source is daylight or a white LED, for example.
  • the type of the light source may be an LED of other color such as blue, green or red, a fluorescent light, or sunset light.
  • the performing unit 33 performs white balance processing on the image based on the type of the light source determined in the step S2. Specifically, the performing unit 33 performs white balance processing on the image based on a spectrum of the type of the light source according to various known methods.
  • the performing unit 33 performs color matrix processing on the image based on the type of the light source determined in the step S2. Specifically, the performing unit 33 selects a color matrix or a linear matrix which corresponds to the type of the light source.
  • a correspondence table between a type of light source and a color matrix may be stored in the memory 49. The performing unit 33 may select the linear matrix by referring to the correspondence table. After that, the performing unit 33 converts colors of the image based on the selected linear matrix.
  • the step S3 and the step S4 may be performed in reverse order or in parallel.
  • the image captured in the step S1 may be divided into a plurality of regions, and the steps S2 to S4 may be performed for each of the regions.
  • At least one of the steps S1 to S4 may be performed by other processor(s) such as the main processor 48.
  • the details of the step S2 will be described with reference to the flowchart of FIG. 6.
  • the determining unit 32 estimates a type of the light source based on the image signal.
  • the estimated type is referred to as the first type.
  • the type is estimated by integrating the outputs of the photodiodes 123L and 123R for each color (e.g., red, green and blue) according to a known method. That is, the integrated values can be obtained by integrating the image signal for each color.
  • the determining unit 32 estimates a type of the light source based on the phase difference signal.
  • the estimated type is referred to as the second type. The details of this step will be explained later.
  • the determining unit 32 determines whether the first type and the second type are different from each other. If the result is “No”, the process proceeds to the step S24; if it is “Yes”, the process proceeds to the step S25.
  • the determining unit 32 determines the first type as the type of the light source.
  • the determining unit 32 determines whether the second type is specific or not. If the result is “Yes”, the process proceeds to the step S26; if it is “No”, the process proceeds to the step S24.
  • the determining unit 32 determines the second type as the type of the light source.
  • the type of the light source can be determined as described above.
  • the image may be divided into a plurality of regions, each of which includes one of the light sources, and the processing flow above may be performed for each region.
  • the type of light source for each region can be obtained.
  • the division of the image may be performed mechanically so that each area has the same size.
  • the division may be performed based on the difference in properties. For example, the image is divided into two regions, one with a subject and the other with the background.
  • the details of the step S22 will be described with reference to the flowchart of FIG. 7.
  • the determining unit 32 obtains a focus position (first focus position) for a first color of the image sensor 12 based on the phase difference signal.
  • the focus position can also be understood as a focal length.
  • the determining unit 32 calculates the first focus position based on the phase difference signal obtained from the outputs of photodiodes that receive light transmitted through the red color filter 122R.
  • the determining unit 32 calculates phase differences for a plurality of positional relationships with different spacing between image plane phase-difference pixels. For example, as shown in FIG. 8, the phase difference is calculated for each of the positional relationships related to a pair of photodiodes 123L and 123R (i.e., -2, -1, ±0, +1, +2).
  • the pixel consisting of the pair of photodiodes 123L and 123R in FIG. 8 corresponds to the pixel 12R in FIG. 3.
  • the photodiodes whose output value is sampled are hatched.
  • the phase differences are obtained by calculating the difference between the outputs of a pair of the photodiodes 123L and 123R.
  • the difference between an output of the photodiode 123R of the second red pixel from the left and an output of the photodiode 123L of the red pixel in the center is calculated.
  • the sum (or average value) of the outputs of a plurality of photodiodes in the pixel block may be calculated and used to obtain the difference.
  • FIG. 9 is a graph in which the points P1, P2, P3, P4 and P5 are plotted.
  • the x-coordinate value of each point indicates the positional relationship. That is to say, the x-coordinate values of the points P1, P2, P3, P4 and P5 are -2, -1, ±0, +1 and +2, respectively.
  • the y-coordinate value of each point indicates the phase difference calculated as described above.
  • the determining unit 32 finds a point where a straight line based on the points P1, P2, P3, P4 and P5 intersects the x-axis.
  • the intersection indicates a focus position for red (the first color) .
  • the straight line is a polygonal line consisting of the points P1, P2, P3, P4 and P5, as shown in FIG. 9. In this example, the intersection of the x-axis and the straight line connecting the points P2 and P3 is found.
  • the straight line may be a line obtained by linear interpolation of the points P1, P2, P3, P4 and P5.
  • the number of points is not limited to five; it may be arbitrary. A minimal sketch of this zero-crossing calculation is given below.
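As an illustration of this step, the following Python sketch finds where the polygonal line through the (positional relationship, phase difference) points crosses the x-axis. It is a minimal sketch with placeholder values; it is not code from the patent.

```python
# Hypothetical sketch: the focus position is the point where the polygonal
# line through (positional relationship, phase difference) points crosses
# the x-axis, as in FIG. 9. The sample values below are placeholders.

def focus_position(offsets, phase_diffs):
    """Return the x value where the polyline through the points crosses zero."""
    points = list(zip(offsets, phase_diffs))
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if y0 == 0.0:
            return float(x0)
        if y0 * y1 < 0:  # sign change: the crossing lies between x0 and x1
            return x0 - y0 * (x1 - x0) / (y1 - y0)
    raise ValueError("phase differences do not cross zero in the sampled range")

# Example with five points P1..P5 at offsets -2..+2 (placeholder values).
# Here the crossing lies between P2 and P3, as in the example of FIG. 9.
print(focus_position([-2, -1, 0, 1, 2], [-3.0, -1.0, 0.5, 2.0, 3.5]))
```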
  • the determining unit 32 obtains a focus position (second focus position) for a second color of the image sensor 12 based on the phase difference signal.
  • the second color is green, for example.
  • the determining unit 32 calculates the second focus position based on the phase difference signal obtained from the outputs of photodiodes that receive light transmitted through the green color filter 122G.
  • the first focus position and the second focus position are different from each other due to the chromatic aberration of the lens unit 11.
  • the determining unit 32 estimates a type of the light source based on the first focus position and the second focus position.
  • the determining unit 32 calculates a ratio of the first focus position to the second focus position.
  • the ratio R/G is calculated since the first color is red and the second color is green in the present disclosure.
  • the determining unit 32 estimates a type of the light source based on the ratio.
  • the determining unit 32 may refer to a correspondence table to estimate the type of the light source.
  • the table shows a correspondence between a type of light source and a ratio of focus positions, and the table may be stored in the memory 49. A sketch of this lookup is given below.
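The following hedged sketch computes the R/G ratio of the focus positions and looks up the nearest entry in a correspondence table. The table values are invented placeholders, since the disclosure does not publish actual numbers.

```python
# Hypothetical correspondence table between light source types and R/G focus
# position ratios. The numeric values are illustrative placeholders only.
RATIO_TABLE = {
    "DL5000": 1.020,
    "LED5000K": 1.035,
}

def estimate_type_from_focus_ratio(focus_r: float, focus_g: float) -> str:
    """Estimate the light source type from the ratio of focus positions (R/G)."""
    ratio = focus_r / focus_g
    # Choose the table entry whose stored ratio is closest to the measured one.
    return min(RATIO_TABLE, key=lambda t: abs(RATIO_TABLE[t] - ratio))
```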
  • the first focus position reflects an average value of the wavelengths of light that has passed through the red color filter 122R.
  • the second focus position reflects an average value of the wavelengths of light that has passed through the green color filter 122G. There is a little deviation between the first focus position and the second focus position due to the chromatic aberration of the lens of the lens unit 11.
  • when the light source is DL5000, the average values W1, W2 and W3 of the wavelengths of light received by the blue color filter, the green color filter and the red color filter are near the peak of the transmittance of each color filter.
  • when the light source is LED5000K, the average values W2’ and W3’ of the wavelengths of light received by the green color filter and the red color filter deviate from the peaks of the transmittance of these color filters. Such deviations suggest the type of the light source.
  • the step S22 is performed to estimate a type of light source based on the phase difference signal.
  • FIG. 12 is a graph in which the ratios R/G obtained for the two types of light sources (i.e., DL5000 and LED5000K) are plotted in association with the angle of the YIQ chromaticity diagram. From this figure, it can be seen that at all angles of the chromaticity diagram, the difference between the two ratios is large enough to determine the type of light source.
  • FIG. 13 shows a graph in which the ratios B/G obtained for the two types of light sources (i.e., DL5000 and LED5000K) are plotted in association with the angle of the YIQ chromaticity diagram. From this figure, it can be seen that at all angles of the chromaticity diagram, the difference between the two ratios is large enough to determine the type of light source.
  • the first color is not limited to red and the second color is not limited to green.
  • the first color may be blue and the second color may be green.
  • the ratio may be B/G or G/B.
  • the first color may be red and the second color may be blue. In this case, the ratio may be B/R or R/B.
  • the type of light source may be estimated from a plurality of ratios related to the focus position. For example, both the first ratio R/G and the second ratio B/G may be calculated to estimate the type of light source. This can further improve the accuracy of estimating the type of light source.
  • the determining unit 32 obtains a first focus position for a first color (e.g., Red) of the image sensor 12 based on the phase difference signal.
  • the determining unit 32 obtains a second focus position for a second color (e.g., Green) of the image sensor 12 based on the phase difference signal.
  • the determining unit 32 obtains a third focus position for a third color (e.g., Blue) of the image sensor based on the phase difference signal.
  • the determining unit 32 calculates a first ratio R/G and a second ratio B/G. After that, the determining unit 32 estimates a type of the light source based on the first ratio and the second ratio, as sketched below.
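Extending the single-ratio sketch above, the two-ratio variant can be illustrated with a two-dimensional nearest-entry lookup; the table values are again placeholders.

```python
# Hypothetical two-ratio table: (R/G, B/G) focus position ratios per type.
RATIO_TABLE_2D = {
    "DL5000": (1.020, 0.985),
    "LED5000K": (1.035, 0.970),
}

def estimate_type_from_two_ratios(focus_r, focus_g, focus_b):
    """Estimate the type from the first ratio R/G and the second ratio B/G."""
    rg, bg = focus_r / focus_g, focus_b / focus_g
    return min(RATIO_TABLE_2D,
               key=lambda t: (RATIO_TABLE_2D[t][0] - rg) ** 2
                           + (RATIO_TABLE_2D[t][1] - bg) ** 2)
```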
  • the focus positions may be converted by performing inverse calculation of the chromatic aberration correction based on the design information of the lens unit 11.
  • the color reproducibility of the captured image can be improved by using the type of light source estimated according to the present disclosure. That is, the performance of the white balance processing and/or the color matrix processing can be improved because the accuracy of estimating the type of light source is improved.
  • a type of light source can be estimated with high accuracy without new hardware being required, and the performance of the color reproduction processing can be improved.
  • FIG. 14 shows an example of the image after the color reproduction processing.
  • the left side of FIG. 14 shows an example of an image I1 on which the color reproduction processing is performed when the light source type is estimated only by the image signal (that is, the case where the step S2 consists of only the step S21) .
  • the right side of FIG. 14 shows an example of an image I2 on which color reproduction processing is performed when the above-described method of the present disclosure is performed.
  • the images I1 and I2 contain the two light sources LS1 and LS2, both of which are LEDs.
  • the color reproduction of the image I2 has been improved in the newspaper on the table (indicated by an area A) because the type of the light sources LS1 and LS2 is correctly estimated.
  • FIG. 16 shows an example of the spectra of light sources with a color temperature of 5000K.
  • the DL5000 is a light source that imitates sunlight and has a broad spectrum.
  • LED5000K is a white LED consisting of a blue LED and a fluorescent material that absorbs blue light and emits yellow light.
  • the white LED has a complicated spectrum with a plurality of peaks.
  • the electronic device 100 performs the method according to the present disclosure. It can be understood that the method may be performed by a terminal device which does not have a camera module.
  • the terminal device has one or more processors, a memory and one or more programs which include instructions and are stored in the memory. The one or more programs are configured to be executed by the one or more processors for performing the method according to the present disclosure.
  • the terminal device may be an image processing terminal, for example.
  • the terms “first” and “second” are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features.
  • a feature defined as “first” or “second” may comprise one or more of this feature.
  • “a plurality of” means “two or more than two”, unless otherwise specified.
  • the terms “mounted”, “connected”, “coupled” and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electrical connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements, which can be understood by those skilled in the art according to specific situations.
  • a structure in which a first feature is “on” or “below” a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are in contact via an additional feature formed therebetween.
  • a first feature “on”, “above” or “on top of” a second feature may include an embodiment in which the first feature is orthogonally or obliquely “on”, “above” or “on top of” the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature “below”, “under” or “on bottom of” a second feature may include an embodiment in which the first feature is orthogonally or obliquely “below”, “under” or “on bottom of” the second feature, or just means that the first feature is at a height lower than that of the second feature.
  • Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of codes of executable instructions for achieving specific logical functions or steps in the process, and the scope of a preferred embodiment of the present disclosure includes other implementations, in which it should be understood by those skilled in the art that functions may be implemented in a sequence other than the sequences shown or discussed, including in a substantially identical sequence or in an opposite sequence.
  • the logic and/or steps described in other manners herein or shown in the flow chart may be specifically achieved in any computer readable medium to be used by an instruction execution system, device or equipment (such as a system based on computers, a system comprising processors or other systems capable of obtaining the instructions from the instruction execution system, device and equipment and executing the instructions), or to be used in combination with the instruction execution system, device and equipment.
  • the computer readable medium may be any device adapted to include, store, communicate, propagate or transfer programs to be used by or in combination with the instruction execution system, device or equipment.
  • the computer readable medium comprises, but is not limited to: an electronic connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device), a random access memory (RAM), a read only memory (ROM), an erasable programmable read-only memory (EPROM or a flash memory), an optical fiber device and a portable compact disk read-only memory (CDROM).
  • the computer readable medium may even be paper or another appropriate medium capable of having the programs printed thereon, because the paper or other medium may be optically scanned and then edited, decrypted or processed with other appropriate methods when necessary to obtain the programs electronically, which may then be stored in computer memories.
  • each part of the present disclosure may be realized by the hardware, software, firmware or their combination.
  • a plurality of steps or methods may be realized by the software or firmware stored in the memory and executed by the appropriate instructions execution system.
  • the steps or methods may be realized by one or a combination of the following techniques known in the art: a discrete logic circuit having a logic gate circuit for realizing a logic function of a data signal, an application-specific integrated circuit having an appropriate combinational logic gate circuit, a programmable gate array (PGA), a field programmable gate array (FPGA), etc.
  • each function cell of the embodiments of the present disclosure may be integrated in a processing module, or the cells may exist as separate physical entities, or two or more cells may be integrated in one processing module.
  • the integrated module may be realized in a form of hardware or in a form of software function modules. When the integrated module is realized in a form of software function module and is sold or used as a standalone product, the integrated module may be stored in a computer readable storage medium.
  • the storage medium mentioned above may be a read-only memory, a magnetic disk, a CD, etc.
  • the storage medium may be transitory or non-transitory.

Abstract

Disclosed is a method for improving quality of an image captured by an image sensor having image plane phase-difference pixels. The method includes acquiring an image signal and a phase difference signal of the image when the image sensor is exposed, determining a type of a light source of the image based on the phase difference signal, and performing a color reproduction processing on the image based on the type of the light source.

Description

METHOD FOR IMPROVING QUALITY OF IMAGE CAPTURED BY IMAGE SENSOR HAVING IMAGE PLANE PHASE-DIFFERENCE PIXELS, ELECTRONIC DEVICE, COMPUTER-READABLE STORAGE MEDIUM AND TERMINAL DEVICE
TECHNICAL FIELD
The present disclosure relates to a method for improving quality of an image captured by an image sensor having image plane phase-difference pixels, an electronic device performing the method, a computer-readable storage medium storing a program to implement the method, and a terminal device performing the method.
BACKGROUND
An electronic device such as a smartphone has a camera module which includes a lens and an image sensor. The image sensor converts light transmitted through the lens into colored light by using a color filter attached to a photodiode of a pixel. The photodiode receives the colored light to output a corresponding color signal. Red, Green and Blue color signals make up an image signal or an image data of the image captured by the camera module.
In recent years, an image sensor having new pixels that can detect phase differences for the autofocus operation has been developed. Such a pixel is referred to as an “image plane phase-difference pixel”.
A subject to be imaged is usually illuminated by a light source such as sunlight or a room light such as a fluorescent lamp or an LED lamp. The subject itself may include a light source. There are various types of light sources. In the present application, the term “type” of light source includes not only the kind of the light source, such as sunlight, fluorescent lamp, LED lamp or OLED lamp, but also its color temperature.
Conventionally, a type of the light source is estimated by using the image signal. Specifically, the type of the light source is estimated from the integrated value for each color. The first integrated value is obtained by integrating the output values of the pixels for red. The second integrated value is obtained by integrating the output values of the pixels for green. The third integrated value is obtained by integrating the output values of the pixels for blue. The type of light source is estimated based on the first integrated value, the second integrated value and the third integrated value.
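As a concrete illustration of this conventional flow, here is a minimal Python sketch. The reference ratios and the nearest-entry matching rule are assumptions for illustration only; they are not taken from the patent.

```python
import numpy as np

# Hypothetical reference (R/G, B/G) ratios of integrated values per light source.
REFERENCE_RATIOS = {
    "daylight": (0.90, 0.75),
    "fluorescent": (0.80, 0.65),
    "white_led": (0.85, 0.70),
}

def estimate_type_from_image_signal(r, g, b):
    """Estimate the light source type from the first, second and third
    integrated values (sums of the red, green and blue pixel outputs)."""
    first, second, third = np.sum(r), np.sum(g), np.sum(b)
    ratios = np.array([first / second, third / second])
    return min(REFERENCE_RATIOS,
               key=lambda t: np.linalg.norm(ratios - np.array(REFERENCE_RATIOS[t])))
```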
After estimating the type of the light source, color reproduction processing on the captured image (e.g., white balance processing, color matrix processing) is performed based on the estimated type of the light source.
However, the color reproduction processing may not be performed properly. One of the reasons is that it is difficult to accurately estimate the type of the light source due to the non-ideal characteristics of color filters. As shown in FIG. 15, the transmission characteristics of a color filter actually allow light having wavelengths other than the desired wavelength to be transmitted (see the areas A1 and A2). Therefore, the image signal does not indicate the exact intensity for each color.
SUMMARY
The present disclosure aims to solve at least one of the technical problems mentioned above. Accordingly, the present disclosure aims to provide a method for improving the quality of an image captured by an image sensor having image plane phase-difference pixels, an electronic device performing the method, and a computer-readable storage medium storing a program to implement the method.
In accordance with the present disclosure, a method for improving quality of an image captured by an image sensor having image plane phase-difference pixels is provided. The method includes acquiring an image signal and a phase difference signal of the image when the image sensor is exposed, determining a type of a light source of the image based on the phase difference signal, and performing a color reproduction processing on the image based on the type of the light source.
In some embodiments, the determining a type of a light source of the image based on the phase difference signal may include obtaining a first focus position for a first color based on the phase difference signal, obtaining a second focus position for a second color based on the phase difference signal, and estimating a first type of the light source based on the first focus position and the second focus position.
In some embodiments, the estimating the first type of the light source based on the first focus position and the second focus position may include calculating a ratio of the first focus position to the second focus position, and estimating the type of the light source based on the ratio.
In some embodiments, the first color may be red and the second color may be green.
In some embodiments, the first color may be blue and the second color may be green.
In some embodiments, the method may further include estimating a second type of the light source based on the image signal. The determining a type of a light source of the image based on the phase difference signal may include determining the type of the light source to be the first type when the first type and the second type are different from each other and the first type is a predetermined type.
In some embodiments, the predetermined type may be a white LED.
In some embodiments, the determining a type of a light source of the image based on the phase difference signal may include obtaining a first focus position for a first color of the image sensor based on the phase difference signal, obtaining a second focus position for a second color of the image sensor based on the phase difference signal, obtaining a third focus position for a third color of the image sensor based on the phase difference signal, calculating a first ratio of the first focus position to the second focus position, calculating a second ratio of the third focus position to the second focus position, and estimating a first type of the light source based on the first ratio and the second ratio.
In some embodiments, the performing a color reproduction processing on the image based on the type of the light source may include performing white balance processing on the image based on a spectrum of the type of the light source.
In some embodiments, the performing a color reproduction processing on the image based on the type of the light source may include selecting a linear matrix corresponding to the type of the light source, and converting color in the image based on the linear matrix.
In accordance with the present disclosure, an electronic device includes a processor and a memory for storing instructions. The instructions, when executed by the processor, cause the processor to perform the method according to the present disclosure.
In accordance with the present disclosure, a computer-readable storage medium, on which a computer program is stored, is provided. The computer program is executed by a computer to implement the method according to the present disclosure.
In accordance with the present disclosure, a terminal device is provided. The terminal device includes one or more processors, a memory and one or more programs. The one or more programs include instructions and are stored in the memory. The one or more programs are configured to be executed by the one or more processors for:
acquiring an image signal and a phase difference signal of the image when the image sensor is exposed;
determining a type of a light source of the image based on the phase difference signal; and
performing a color reproduction processing on the image based on the type of the light source.
BRIEF DESCRIPTION OF THE DRAWINGS
These and/or other aspects and advantages of embodiments of the present disclosure will become apparent and more readily appreciated from the following descriptions made with reference to the drawings below.
FIG. 1 is a functional block diagram illustrating a configuration of an electronic device according to an embodiment of the present disclosure.
FIG. 2 is a plan view showing a part of an image sensor in the electronic device.
FIG. 3 is a cross-sectional view taken along a line I-I of FIG. 2.
FIG. 4 is a functional block diagram of an image signal processor in the electronic device according to an embodiment of the present disclosure.
FIG. 5 is a flowchart for improving color reproduction for an image acquired by the image sensor according to an embodiment of the present disclosure.
FIG. 6 is a flowchart for deciding a type of light source of the image according to an embodiment of the present disclosure.
FIG. 7 is a flowchart for estimating a type of the light source based on a phase difference signal according to an embodiment of the present disclosure.
FIG. 8 is a diagram for explaining how to calculate a focus position according to an embodiment of the present disclosure.
FIG. 9 is an example of a graph for calculating the focus position according to an embodiment of the present disclosure.
FIG. 10 is an example of a graph showing an average value of wavelengths of light received by the photodiode for each color filter when the light source is DL5000.
FIG. 11 is an example of a graph showing an average value of wavelengths of light received by the photodiode for each color filter when the light source is LED5000K.
FIG. 12 is an example of a graph showing a ratio (R/G) of the focus positions for angles of the chromaticity diagram.
FIG. 13 is an example of a graph showing a ratio (B/G) of the focus positions for angles of the chromaticity diagram.
FIG. 14 is an example of an image after color reproduction processing in the conventional case (left) and the case of the present disclosure (right).
FIG. 15 is an example of a graph showing the characteristics of actual color filters.
FIG. 16 is an example of a graph showing spectra of different types of light sources.
DETAILED DESCRIPTION
Embodiments of the present disclosure will be described in detail and examples of the embodiments will be illustrated in the accompanying drawings. The same or similar elements and elements having same or similar functions are denoted by like reference numerals throughout the descriptions. The embodiments described herein with reference to the drawings are explanatory and aim to illustrate the present disclosure, but shall not be construed to limit the present disclosure.
<Electronic device 100>
An electronic device 100 will be described with reference to FIG. 1. FIG. 1 is a functional block diagram illustrating an example of a configuration of the electronic device 100 according to an embodiment of the present disclosure.
The electronic device 100 is a mobile device such as a smartphone, a tablet terminal or a mobile phone, but may be other types of electronic device equipped with a camera module.
As shown in FIG. 1, the electronic device 100 includes a camera module 10, a range sensor module 20, an image signal processor (ISP) 30, a global navigation satellite system (GNSS) module 40, a wireless communication module 41, a CODEC 42, a speaker 43, a microphone 44, a display module 45, an input module 46, an inertial measurement unit (IMU) 47, a main processor 48 and a memory 49.
The camera module 10 is configured to capture an image. The module 10 can also take a video.
The camera module 10 includes a lens unit 11 that is capable of focusing on a subject, an image sensor 12 that detects an image inputted via the lens unit 11, and an image sensor driver 13 that drives the image sensor 12. The lens unit 11 may include a plurality of lenses. At least one of the lenses can move along the optical axis direction for focusing operation.
Optionally, the camera module 10 may be a stereo camera module for binocular stereo viewing. The stereo camera module comprises a master camera module and a slave camera module.
An example of the image sensor 12 is described in detail with reference to FIGs. 2 and 3. FIG. 2 is a plan view showing a part of the image sensor 12. FIG. 3 is a cross-sectional view taken along a line I-I of FIG. 2.
The image sensor 12 has a plurality of pixels arranged in a grid pattern. The image sensor 12 has three types of pixels, that is, a pixel 12R for red, a pixel 12G for green and a pixel 12B for blue. As shown in FIG. 2, the pixels 12R, 12G and 12B are arranged according to the Bayer layout. The image sensor 12 is not limited to the RGB pixels arranged according to the Bayer layout, but it may contain pixels for white, i.e., it may consist of RGBW pixels.
The pixels 12R, 12G and 12B are image plane phase-difference pixels so that the phase detection autofocus operation can be performed. As shown in FIG. 3, the pixel 12R has an on-chip lens (micro lens) 121, a color filter 122R and two photodiodes 123L and 123R. The on-chip lens 121 efficiently collects the incident light on the photodiodes 123L and 123R. The color filter 122R is configured to transmit red light, and it is provided so as to cover the photodiodes 123L and 123R.
The photodiodes 123L and 123R are located on the left and right sides of the pixel 12R, respectively. The photodiode 123L receives the light transmitted through the right side of the lens unit 11. The photodiode 123R receives the light transmitted through the left side of the lens unit 11.
The pixels 12G and 12B have the same configuration as the pixel 12R. That is, the pixel 12G has an on-chip lens 121, a color filter 122G that transmits green light, and two photodiodes 123L and 123R. The pixel 12B has an on-chip lens 121, a color filter that transmits blue light, and two photodiodes 123L and 123R.
The photodiodes of the pixel are not limited to ones that divide the pixel into two parts, left and right; the photodiodes may divide the pixel into four parts: top, bottom, left and right.
The color filters are not limited to RGB, and they may be CMY (cyan, magenta, yellow).
The pixels 12R, 12G and 12B may be arranged continuously or at predetermined intervals over the entire surface of the image sensor 12, or may be arranged on a part of the surface.
The image signal can be obtained by adding an output of the photodiode 123L and an output of the photodiode 123R for each pixel of each color. The integration of the outputs may be made for the entire image or for a portion of the image.
The phase difference signal can be obtained by adding an output of either the photodiode 123L or the photodiode 123R of the same color pixels in a predetermined pixel block. The pixel block contains four pixels of the same color, for example. As shown in FIG. 2, each of the pixel blocks PG1, PG2 and PG3 has four pixels 12R. By using an added value or average value of the outputs of the four pixels, the influence of noise can be reduced. The pixel block may contain a plurality of same-color pixels, the plurality being other than four (e.g., two, six or nine).
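The relationship between the photodiode outputs and the two signals can be sketched as follows. The array shapes and block coordinates are assumptions for illustration; they do not reproduce the actual Bayer addressing of the sensor.

```python
import numpy as np

def image_signal(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Image signal: per-pixel sum of the 123L and 123R photodiode outputs."""
    return left + right

def block_phase_sample(side: np.ndarray, rows, cols) -> float:
    """Sample for one pixel block: average the one-side (123L or 123R) outputs
    of the same-color pixels in the block to reduce the influence of noise."""
    return float(np.mean(side[np.ix_(rows, cols)]))

# Example: a block of four same-color pixels at placeholder coordinates,
# in the spirit of the block PG1 of FIG. 2.
left, right = np.random.rand(8, 8), np.random.rand(8, 8)
img = image_signal(left, right)
pg1_left = block_phase_sample(left, rows=[0, 2], cols=[0, 2])
pg1_right = block_phase_sample(right, rows=[0, 2], cols=[0, 2])
phase_difference = pg1_left - pg1_right
```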
The range sensor module 20 includes a lens unit 21, a range sensor 22, a range sensor driver 23 and a projector 24 that emits pulsed light toward a subject. The range sensor module 20 can measure the distance between the electronic device 100 and the subject. For example, the range sensor module 20 is a ToF camera and captures a time-of-flight depth map or a ToF depth map by emitting pulsed light toward the subject and detecting light reflected from the subject. The range sensor module 20 may be omitted.
The image signal processor 30 sends instructions to the image sensor driver 13 and the range sensor driver 23 to control the camera module 10 and the range sensor module 20, respectively.
The ISP 30 acquires data of an image captured by the camera module 10. The data contain an image signal and a phase difference signal. Alternatively, the data may be a RAW signal. The image signal and the phase difference signal can be obtained from the RAW signal.
The functions of the ISP 30 are described in detail with reference to FIG. 4. The ISP 30 includes an acquiring unit 31, a determining unit 32 and a performing unit 33.
The acquiring unit 31 is configured to acquire the data of an image when the camera module 10 captures the image. Specifically, the acquiring unit 31 acquires an image signal and a phase difference signal of the image when the image sensor 12 is exposed. For example, the acquiring unit 31 acquires the image signal and the phase difference signal stored in the memory 49.
Alternatively, the acquiring unit 31 may acquire the RAW signal of the image and obtain the image signal and the phase difference signal from the RAW signal.
The determining unit 32 is configured to determine a type of light source of the captured image based on the phase difference signal. The details of how to determine the type of light source will be explained later with reference to FIGs. 7 to 9.
The performing unit 33 is configured to perform a color reproduction processing on the image based on the type of the light source. The color reproduction processing is white balance processing and/or color matrix processing.
Other components of the electronic device 100 are described below.
The GNSS module 40 measures a current position of the electronic device 100. The wireless communication module 41 performs wireless communications with the Internet. The CODEC 42 bi-directionally performs encoding and decoding, using a predetermined encoding/decoding method. The speaker 43 outputs a sound in accordance with sound data decoded by the CODEC 42. The microphone 44 outputs sound data to the CODEC 42 based on inputted sound.
The display module 45 displays various information such as an image captured by the camera module 10 in real-time, a User Interface (UI), and a color-reproduced image which is created by the ISP 30.
The input module 46 receives information via a user’s operation. The input module 46 is, for example, a touch panel or a keyboard. For example, the input module 46 receives an instruction to capture and store an image displayed on the display module 45.
The IMU 47 detects an angular velocity and an acceleration of the electronic device 100. For example, a posture of the electronic device 100 can be determined from the measurement results of the IMU 47.
The main processor 48 controls the GNSS module 40, the wireless communication module 41, the CODEC 42, the speaker 43, the microphone 44, the display module 45, the input module 46, and the IMU 47.
The memory 49 stores data of an image, data of a depth map, various camera parameters to be used in image processing, and a program which runs on the image signal processor 30 and/or the main processor 48.
< Method for improving quality of the captured image >
A method for improving the quality of an image captured by the image sensor 12 consisting of the pixels 12R, 12G and 12B according to an embodiment of the present disclosure will be described with reference to the flowchart shown in FIG. 5.
In the step S1, the camera module 10 captures an image. The data of the image are stored in the memory 49. The data may be an image signal and a phase difference signal, or a RAW signal. The acquiring unit 31 acquires the data of the image.
In the step S2, the determining unit 32 determines a type of light source of the image captured in the step S1. The type of the light source is daylight or a white LED, for example. The type of the light source may also be an LED of another color such as blue, green or red, a fluorescent light, or sunset light.
In the step S3, the performing unit 33 performs white balance processing on the image based on the type of the light source determined in the step S2. Specifically, the performing unit 33 performs white balance processing on the image based on a spectrum of the type of the light source according to various known methods.
In the step S4, the performing unit 33 performs color matrix processing on the image based on the type of the light source determined in the step S2. Specifically, the performing unit 33 selects a color matrix or a linear matrix which corresponds to the type of the light source. A correspondence table between a type of light source and a color matrix may be stored in the memory 49. The performing unit 33 may select the linear matrix by referring to the correspondence table. After that, the performing unit 33 converts colors of the image based on the selected linear matrix.
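As a minimal sketch of the steps S3 and S4 in Python, assuming per-light-source white-balance gains and 3×3 linear matrices calibrated in advance, the color reproduction processing may look as follows; all numeric values and table entries below are invented for illustration only.

import numpy as np

# Hypothetical correspondence tables; real entries would be calibrated per
# light source and stored in the memory 49.
WB_GAINS = {"DL5000": (1.9, 1.0, 1.6), "LED5000K": (2.1, 1.0, 1.4)}
LINEAR_MATRIX = {
    "DL5000":   np.array([[ 1.5, -0.3, -0.2],
                          [-0.2,  1.4, -0.2],
                          [-0.1, -0.4,  1.5]]),
    "LED5000K": np.array([[ 1.7, -0.5, -0.2],
                          [-0.3,  1.6, -0.3],
                          [-0.1, -0.5,  1.6]]),
}

def color_reproduction(rgb, light_source):
    # rgb: H x W x 3 array; light_source: type determined in the step S2.
    balanced = rgb * np.array(WB_GAINS[light_source])  # step S3: white balance
    return balanced @ LINEAR_MATRIX[light_source].T    # step S4: linear matrix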
Optionally, the step S3 and the step S4 may be performed in reverse order or in parallel.
Optionally, the image captured in the step S1 may be divided into a plurality of regions, and the steps S2 to S4 may be performed for each of the regions.
Optionally, at least one of the steps S1 to S4 may be performed by one or more other processors such as the main processor 48.
Next, the details of the step S2 will be described with reference to the flowchart of FIG. 6.
In the step S21, the determining unit 32 estimates a type of the light source based on the image signal. The estimated type is referred to as the first type. The type is estimated from values obtained by integrating the outputs of the photodiodes 123L and 123R for each color (e.g., red, green, blue) according to a known method. That is, the values can be obtained by integrating the image signal for each color.
In the step S22, the determining unit 32 estimates a type of the light source based on the phase difference signal. The estimated type is referred to as the second type. The details of this step will be explained in detail later.
In the step S23, the determining unit 32 determines whether the first type and the second type are different from each other. If the result is “No”, the process proceeds to the step S24; if it is “Yes”, the process proceeds to the step S25.
In the step S24, the determining unit 32 determines the first type as the type of the light source.
In the step S25, the determining unit 32 determines whether the second type is a specific type or not. If the result is “Yes”, the process proceeds to the step S26; if it is “No”, the process proceeds to the step S24.
In the step S26, the determining unit 32 determines the second type as the type of the light source.
The type of the light source can be determined as described above.
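The decision flow of the steps S23 to S26 can be sketched in Python as follows; which types count as “specific” in the step S25 is a design choice, and a white LED is assumed here only as an example (cf. claim 7 below).

def determine_light_source(first_type, second_type,
                           specific_types=("white LED",)):
    # Steps S23 to S26 of FIG. 6.
    if first_type == second_type:       # step S23: result "No"
        return first_type               # step S24
    if second_type in specific_types:   # step S25: result "Yes"
        return second_type              # step S26
    return first_type                   # step S25: result "No" -> step S24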
When the image contains a plurality of light sources, the image may be divided into a plurality of regions, each of which includes one of the light sources, and the processing flow above may be performed for each region. As a result, the type of light source for each region can be obtained. The division may be performed mechanically so that each region has the same size, or based on differences in properties; for example, the image is divided into two regions, one containing a subject and the other containing the background. A sketch of the mechanical division follows.
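A minimal sketch of the mechanical division, assuming a 2×2 grid of equally sized regions; the grid size and the function name are illustrative assumptions.

import numpy as np

def split_into_regions(image, n_rows=2, n_cols=2):
    # Divide the image mechanically into equally sized regions; each region
    # is then processed by the steps S2 to S4 independently.
    return [region
            for band in np.array_split(image, n_rows, axis=0)
            for region in np.array_split(band, n_cols, axis=1)]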
Next, the details of the step S22 will be described with reference to the flowchart of FIG. 7.
In the step S221, the determining unit 32 obtains a focus position (first focus position) for a first color of the image sensor 12 based on the phase difference signal. The focus position can also be understood as a focal length.
For example, when the first color is red, the determining unit 32 calculates the first focus position based on the phase difference signal obtained from the outputs of photodiodes that receive light transmitted through the red color filter 122R.
The calculation method of the focus position will be explained in detail.
First, the determining unit 32 calculates phase differences for a plurality of positional relationships with different spacing between image plane phase-difference pixels. For example, as shown in FIG. 8, the phase difference is calculated for each of the positional relationships related to a pair of photodiodes 123L and 123R (i.e., -2, -1, ±0, +1, +2). The pixel consisting of the pair of photodiodes 123L and 123R in FIG. 8 corresponds to the pixel 12R in FIG. 3. In FIG. 8, in each positional relationship, the photodiodes whose output value is sampled are hatched.
The phase differences are obtained by calculating the difference between the outputs of a pair of the photodiodes 123L and 123R. In the case of the positional relationship “-2”, the difference between an output of the photodiode 123R of the second red pixel from the left and an output of the photodiode 123L of the red pixel in the center is calculated. Please note that, as the output, the sum (or average value) of the outputs of a plurality of photodiodes in the pixel block may be calculated and used to obtain the difference.
FIG. 9 is a graph in which the points P1, P2, P3, P4 and P5 are plotted. The x-coordinate value of each point indicates the positional relationship. That is to say, the x-coordinate values of the points P1, P2, P3, P4 and P5 are -2, -1, ±0, +1 and +2, respectively. The y-coordinate value of each point indicates the phase difference calculated as described above.
The determining unit 32 finds the point where a line based on the points P1, P2, P3, P4 and P5 intersects the x-axis. The intersection indicates the focus position for red (the first color). The line is a polygonal line connecting the points P1, P2, P3, P4 and P5, as shown in FIG. 9; in this example, the intersection of the x-axis with the segment connecting the points P2 and P3 is found. Alternatively, the line may be a single straight line obtained by linearly approximating the points P1, P2, P3, P4 and P5. The number of points is not limited to five and may be set arbitrarily.
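A sketch of the zero-crossing search on the polygonal line, in Python; the example input values are invented and the function name is an assumption.

import numpy as np

def focus_position(shifts, phase_diffs):
    # shifts: positional relationships, e.g. [-2, -1, 0, +1, +2];
    # phase_diffs: phase difference measured at each relationship.
    x = np.asarray(shifts, dtype=float)
    y = np.asarray(phase_diffs, dtype=float)
    for i in range(len(x) - 1):
        if y[i] == 0.0:              # a sample lies exactly on the x-axis
            return x[i]
        if y[i] * y[i + 1] < 0.0:    # sign change between P_i and P_(i+1)
            # linear interpolation between the two bracketing points
            return x[i] - y[i] * (x[i + 1] - x[i]) / (y[i + 1] - y[i])
    return None                      # no crossing in the sampled range

# e.g. focus_position([-2, -1, 0, 1, 2], [3.0, 1.0, -0.5, -2.0, -3.5])
# returns -1/3, the crossing of the segment connecting P2 and P3.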
In the step S222, the determining unit 32 obtains a focus position (second focus position) for a second color of the image sensor 12 based on the phase difference signal. For example, the second color is green. By the same method as described in the step S221, the determining unit 32 calculates the second focus position based on the phase difference signal obtained from the outputs of photodiodes that receive light transmitted through the green color filter 122G. The first focus position and the second focus position are different from each other due to the chromatic aberration of the lens unit 11.
In the step S223, the determining unit 32 estimates a type of the light source based on the first focus position and the second focus position.
First, the determining unit 32 calculates a ratio of the first focus position to the second focus position. In the present embodiment, the ratio R/G is calculated since the first color is red and the second color is green.
Next, the determining unit 32 estimates a type of the light source based on the ratio. The determining unit 32 may refer to a correspondence table to estimate the type of the light source. The table shows a correspondence between a type of light source and a ratio of focus positions, and the table may be stored in the memory 49.
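A minimal sketch of the table lookup in Python, assuming the correspondence table maps each light-source type to a calibrated focus-position ratio and that the nearest entry is selected; the numeric entries are invented for illustration.

# Hypothetical calibrated ratios; real values would be measured per lens
# unit and stored in the memory 49.
RATIO_TABLE = {"DL5000": 1.002, "LED5000K": 1.010}

def estimate_type_from_ratio(first_pos, second_pos, table=RATIO_TABLE):
    # Ratio of the first focus position (e.g., R) to the second (e.g., G),
    # matched against the correspondence table by nearest value.
    ratio = first_pos / second_pos
    return min(table, key=lambda t: abs(table[t] - ratio))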
Hereinafter, the reason why the type of light source can be estimated based on the focus positions will be described. The first focus position reflects the average value of the wavelengths of light that has passed through the red color filter 122R. The second focus position reflects the average value of the wavelengths of light that has passed through the green color filter 122G. There is a slight deviation between the first focus position and the second focus position due to the chromatic aberration of the lens of the lens unit 11.
As shown in FIG. 10, when the light source is DL5000, the average values W1, W2 and W3 of the wavelengths of light received by the blue, green and red color filters are near the peak of the transmittance of each color filter. On the other hand, as shown in FIG. 11, when the light source is LED5000K, the average values W2’ and W3’ of the wavelengths of light received by the green and red color filters deviate from the peaks of the transmittance of the respective color filters. These deviations are indicative of the type of the light source.
As described above, the step S22 is performed to estimate a type of light source based on the phase difference signal.
FIG. 12 is a graph in which the ratios R/G obtained for the two types of light sources (i.e., DL5000 and LED5000K) are plotted in association with the angle of the YIQ chromaticity diagram. From this figure, it can be seen that at all angles of the chromaticity diagram, the difference between the two ratios is large enough to determine the type of light source.
The same applies to the ratio B/G. FIG. 13 shows a graph in which the ratios B/G obtained for the two types of light sources (i.e., DL5000 and LED5000K) are plotted in association with the angle of the YIQ chromaticity diagram. From this figure, it can be seen that at all angles of the chromaticity diagram, the difference between the two ratios is large enough to determine the type of light source.
Please note that the first color is not limited to red and the second color is not limited to green. As an example, the first color may be blue and the second color may be green. In this case, the ratio may be B/G or G/B. As another example, the first color may be red and the second color may be blue. In this case, the ratio may be B/R or R/B.
Alternatively, the type of light source may be estimated from a plurality of ratios related to the focus position. For example, both the first ratio R/G and the second ratio B/G may be calculated to estimate the type of light source. This can further improve the accuracy of estimating the type of light source.
More specifically, the determining unit 32 obtains a first focus position for a first color (e.g., Red) of the image sensor 12 based on the phase difference signal. Next, the determining unit 32 obtains a second focus position for a second color (e.g., Green) of the image sensor 12 based on the phase difference signal. The determining unit 32 obtains a third focus position for a third color (e.g., Blue) of the image sensor based on the phase difference signal. The determining unit 32 calculates a first ratio R/G and a second ratio B/G. After that, the determining unit 32 estimates a type of the light source based on the first ratio and the second ratio.
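Extending the sketch above to two ratios, the type may be estimated by a nearest-neighbour match in the (R/G, B/G) plane; the table entries remain invented for illustration.

import math

TWO_RATIO_TABLE = {"DL5000": (1.002, 0.997), "LED5000K": (1.010, 0.991)}

def estimate_type_two_ratios(r_pos, g_pos, b_pos, table=TWO_RATIO_TABLE):
    # First ratio R/G and second ratio B/G, matched jointly against the
    # hypothetical calibrated pairs in the table.
    rg, bg = r_pos / g_pos, b_pos / g_pos
    return min(table,
               key=lambda t: math.hypot(table[t][0] - rg, table[t][1] - bg))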
Optionally, in order to increase the difference in the focus positions calculated in the steps S221 and S222, the focus positions may be converted by performing inverse calculation of the chromatic aberration correction based on the design information of the lens unit 11.
As described above, by estimating the type of light source based on the phase difference signal read from the image plane phase-difference pixels, it is possible to accurately discriminate between light sources having different spectral shapes, and thus the accuracy of estimating the type of light source can be improved.
As a result, the color reproducibility of the captured image can be improved by using the type of light source estimated according to the present disclosure. That is, the performance of the white balance processing and/or the color matrix processing can be improved because the accuracy of estimating the type of light source is improved.
Further, according to the method of the present disclosure, a type of light source can be estimated with high accuracy without new hardware being required, and the performance of the color reproduction processing can be improved.
FIG. 14 shows an example of the image after the color reproduction processing. The left side of FIG. 14 shows an example of an image I1 on which the color reproduction processing is performed when the light source type is estimated only from the image signal (that is, the case where the step S2 consists only of the step S21). The right side of FIG. 14 shows an example of an image I2 on which the color reproduction processing is performed according to the above-described method of the present disclosure. The images I1 and I2 contain the two light sources LS1 and LS2, both of which are LEDs.
As can be seen by comparing the image I1 and the image I2, the color reproduction of the image I2 has been improved in the newspaper on the table (indicated by an area A) because the type of the light sources LS1 and LS2 is correctly estimated.
In the conventional method of estimating the type of light source from the image signal, it has been difficult to distinguish between light sources with different spectral shapes because the image signal is just the sum of the intensities of the light that has passed through the color filter. In fact, light sources with the same color temperature may have different spectral shapes. FIG. 16 shows an example of the spectra of light sources with a color temperature of 5000K. The DL5000 is a light source that imitates sunlight and has a broad spectrum. The LED5000K is a white LED consisting of a blue LED and a fluorescent material that absorbs blue light and emits yellow light; this white LED has a complicated spectrum with a plurality of peaks. As described above, according to the method of the present disclosure, it is possible to distinguish between light sources with different spectral shapes by estimating the type of light source based on the phase difference signal read from the image plane phase-difference pixels.
As mentioned above, the electronic device 100 performs the method according to the present disclosure. It can be understood that the method may also be performed by a terminal device which does not have a camera module. The terminal device has one or more processors, a memory, and one or more programs which include instructions and are stored in the memory. The one or more programs are configured to be executed by the one or more processors to perform the method according to the present disclosure. The terminal device may be an image processing terminal, for example.
In the description of embodiments of the present disclosure, it is to be understood that terms such as "central", "longitudinal", "transverse", "length", "width", "thickness", "upper", "lower", "front", "rear", "back", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise" and "counterclockwise" should be construed to refer to the orientation or the position as described or as shown in the drawings under discussion. These relative terms are only used to simplify the description of the present disclosure, and do not indicate or imply that the device or element referred to must have a particular orientation, or must be constructed or operated in a particular orientation. Thus, these terms cannot be construed to limit the present disclosure.
In addition, terms such as "first" and "second" are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features. Thus, a feature defined as "first" and "second" may comprise one or more of this feature. In the description of the present disclosure, "a plurality of" means “two or more than two”, unless otherwise specified.
In the description of embodiments of the present disclosure, unless specified or limited otherwise, the terms "mounted", "connected", "coupled" and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electrical connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements, which can be understood by those skilled in the art according to specific situations.
In the embodiments of the present disclosure, unless specified or limited otherwise, a structure in which a first feature is "on" or "below" a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are in contact via an additional feature formed therebetween. Furthermore, a first feature "on", "above" or "on top of" a second feature may include an embodiment in which the first feature is orthogonally or obliquely "on", "above" or "on top of" the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature "below", "under" or "on bottom of" a second feature may include an embodiment in which the first feature is orthogonally or obliquely "below", "under" or "on bottom of" the second feature, or just means that the first feature is at a height lower than that of the second feature.
Various embodiments and examples are provided in the above description to implement different structures of the present disclosure. In order to simplify the present disclosure, certain elements and settings are described above. However, these elements and settings are only by way of example and are not intended to limit the present disclosure. In addition, reference numbers and/or reference letters may be repeated in different examples in the present disclosure. This repetition is for the purpose of simplification and clarity and does not refer to relations between different embodiments and/or settings. Furthermore, examples of different processes and materials are provided in the present disclosure. However, it would be appreciated by those skilled in the art that other processes and/or materials may also be applied.
Reference throughout this specification to "an embodiment" , "some embodiments" , "an exemplary embodiment" , "an example" , "a specific example" or "some examples" means that a particular feature, structure, material, or characteristics described in connection with the embodiment or example is included in at least one embodiment or example of the present disclosure. Thus, the appearances of the above phrases throughout this specification are not necessarily referring to the same embodiment or example of the present disclosure. Furthermore, the particular features, structures, materials, or characteristics may be combined in any suitable manner in one or more embodiments or examples.
Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of codes of executable instructions for achieving specific logical functions or steps in the process, and the scope of a preferred embodiment of the present disclosure includes other implementations, in which it should be understood by those skilled in the art that functions may be implemented in a sequence other than the sequences shown or discussed, including in a substantially identical sequence or in an opposite sequence.
The logic and/or steps described in other manners herein or shown in the flow chart, for example, a particular sequence table of executable instructions for realizing the logical function, may be specifically achieved in any computer readable medium to be used by an instruction execution system, device or equipment (such as a system based on computers, a system comprising processors or other systems capable of obtaining instructions from the instruction execution system, device or equipment and executing the instructions), or to be used in combination with the instruction execution system, device or equipment. In this specification, a "computer readable medium" may be any device capable of including, storing, communicating, propagating or transferring programs to be used by or in combination with the instruction execution system, device or equipment. More specific examples of the computer readable medium comprise but are not limited to: an electronic connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or a flash memory), an optical fiber device and a portable compact disk read-only memory (CD-ROM). In addition, the computer readable medium may even be paper or another appropriate medium capable of having the programs printed thereon, because the paper or other medium may be optically scanned, and then edited, decrypted or otherwise processed when necessary, to obtain the programs electronically, which may then be stored in computer memories.
It should be understood that each part of the present disclosure may be realized by hardware, software, firmware or a combination thereof. In the above embodiments, a plurality of steps or methods may be realized by software or firmware stored in the memory and executed by an appropriate instruction execution system. For example, if realized by hardware, as in another embodiment, the steps or methods may be realized by one or a combination of the following techniques known in the art: a discrete logic circuit having a logic gate circuit for realizing a logic function upon a data signal, an application-specific integrated circuit having an appropriate combinational logic gate circuit, a programmable gate array (PGA), a field programmable gate array (FPGA), etc.
Those skilled in the art shall understand that all or part of the steps in the above exemplary method of the present disclosure may be achieved by commanding the related hardware with programs. The programs may be stored in a computer readable storage medium, and the programs, when run on a computer, perform one or a combination of the steps in the method embodiments of the present disclosure.
In addition, each functional unit of the embodiments of the present disclosure may be integrated in a processing module, or the units may exist as separate physical entities, or two or more units may be integrated in one processing module. The integrated module may be realized in a form of hardware or in a form of software function modules. When the integrated module is realized in a form of a software function module and is sold or used as a standalone product, the integrated module may be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, a CD, etc. The storage medium may be transitory or non-transitory.
Although embodiments of the present disclosure have been shown and described, it would be appreciated by those skilled in the art that the embodiments are explanatory and cannot be construed to limit the present disclosure, and changes, modifications, alternatives and variations can be made in the embodiments without departing from the scope of the present disclosure.

Claims (20)

  1. A method for improving quality of an image captured by an image sensor having image plane phase-difference pixels, the method comprising:
    acquiring an image signal and a phase difference signal of the image when the image sensor is exposed;
    determining a type of a light source of the image based on the phase difference signal; and
    performing a color reproduction processing on the image based on the type of the light source.
  2. The method of claim 1, wherein the determining a type of a light source of the image based on the phase difference signal comprises:
    obtaining a first focus position for a first color based on the phase difference signal;
    obtaining a second focus position for a second color based on the phase difference signal; and
    estimating a first type of the light source based on the first focus position and the second focus position.
  3. The method of claim 2, wherein the estimating the first type of the light source based on the first focus position and the second focus position comprises:
    calculating a ratio of the first focus position to the second focus position; and
    estimating the type of the light source based on the ratio.
  4. The method of claim 2 or 3, wherein the first color is red and the second color is green.
  5. The method of claim 2 or 3, wherein the first color is blue and the second color is green.
  6. The method of any one of claims 2 to 5, further comprising estimating a second type of the light source based on the image signal,
    wherein the determining a type of a light source of the image based on the phase difference signal comprises determining the type of the light source to be the first type when the first type and the second type are different from each other and the first type is a predetermined type.
  7. The method of claim 6, wherein the predetermined type is a white LED.
  8. The method of claim 1, wherein the determining a type of a light source of the image based on the phase difference signal comprises:
    obtaining a first focus position for a first color of the image sensor based on the phase difference signal;
    obtaining a second focus position for a second color of the image sensor based on the phase difference signal;
    obtaining a third focus position for a third color of the image sensor based on the phase difference signal;
    calculating a first ratio of the first focus position to the second focus position;
    calculating a second ratio of the third focus position to the second focus position; and
    estimating a first type of the light source based on the first ratio and the second ratio.
  9. The method of any one of claims 1 to 8, wherein the performing a color reproduction processing on the image based on the type of the light source comprises:
    performing white balance processing on the image based on a spectrum of the type of the light source.
  10. The method of any one of claims 1 to 9, wherein the performing a color reproduction processing on the image based on the type of the light source comprises:
    selecting a linear matrix corresponding to the type of the light source; and
    converting color in the image based on the linear matrix.
  11. An electronic device for improving color reproduction for an image acquired by an image sensor having image plane phase-difference pixels, the device comprising:
    an acquiring unit configured to acquire an image signal and a phase difference signal;
    a determining unit configured to determine a type of a light source of the image based on the phase difference signal; and
    a performing unit configured to perform a color reproduction processing on the image based on the type of the light source.
  12. A computer-readable storage medium, on which a computer program is stored, wherein the computer program is executed by a computer to implement the method according to any one of claims 1 to 10.
  13. A terminal device, comprising:
    one or more processors;
    a memory; and
    one or more programs, wherein the one or more programs including instructions are stored in the memory and configured to be executed by the one or more processors for:
    acquiring an image signal and a phase difference signal of the image when the image sensor is exposed;
    determining a type of a light source of the image based on the phase difference signal; and
    performing a color reproduction processing on the image based on the type of the light source.
  14. The terminal device of claim 13, wherein the determining a type of a light source of the image based on the phase difference signal comprises:
    obtaining a first focus position for a first color based on the phase difference signal;
    obtaining a second focus position for a second color based on the phase difference signal; and
    estimating a first type of the light source based on the first focus position and the second focus position.
  15. The terminal device of claim 14, wherein the estimating the first type of the light source based on the first focus position and the second focus position comprises:
    calculating a ratio of the first focus position to the second focus position; and
    estimating the type of the light source based on the ratio.
  16. The terminal device of claim 14 or 15, further comprising estimating a second type of the light source based on the image signal,
    wherein the determining a type of a light source of the image based on the phase difference signal comprises determining the type of the light source to be the first type when the first type and the second type are different from each other and the first type is a predetermined type.
  17. The terminal device of claim 16, wherein the predetermined type is a white LED.
  18. The terminal device of claim 13, wherein the determining a type of a light source of the image based on the phase difference signal comprises:
    obtaining a first focus position for a first color of the image sensor based on the phase difference signal;
    obtaining a second focus position for a second color of the image sensor based on the phase difference signal;
    obtaining a third focus position for a third color of the image sensor based on the phase difference signal;
    calculating a first ratio of the first focus position to the second focus position;
    calculating a second ratio of the third focus position to the second focus position; and
    estimating a first type of the light source based on the first ratio and the second ratio.
  19. The terminal device of any one of claims 13 to 18, wherein the performing a color reproduction processing on the image based on the type of the light source comprises:
    performing white balance processing on the image based on a spectrum of the type of the light source.
  20. The terminal device of any one of claims 13 to 19, wherein the performing a color reproduction processing on the image based on the type of the light source comprises:
    selecting a linear matrix corresponding to the type of the light source; and
    converting color in the image based on the linear matrix.