CN112118378A - Image acquisition method and device, terminal and computer readable storage medium - Google Patents

Image acquisition method and device, terminal and computer readable storage medium

Publication number
CN112118378A
Authority
CN
China
Prior art keywords
image
filter
output mode
pixel
panchromatic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011073863.3A
Other languages
Chinese (zh)
Inventor
唐城
李龙佳
张弓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202011073863.3A priority Critical patent/CN112118378A/en
Publication of CN112118378A publication Critical patent/CN112118378A/en
Priority to EP21876886.9A priority patent/EP4216534A4/en
Priority to PCT/CN2021/105464 priority patent/WO2022073364A1/en
Priority to US18/193,134 priority patent/US20230254553A1/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/667 Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/71 Circuitry for evaluating the brightness variation
    • H04N23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N23/75 Circuitry for compensating brightness variation in the scene by influencing optical camera components

Abstract

The application discloses an image acquisition method. The method comprises the following steps: outputting an image through at least one of a plurality of image output modes, the plurality of image output modes including a full-resolution output mode, a first merged output mode, and a second merged output mode. The application also discloses an image acquisition device, a non-volatile computer-readable storage medium, and a computer device. Because an image can be output through at least one of multiple image output modes, different image output modes can be used for different scenes, so the adaptability to different scenes is strong, a better balance can be obtained between definition and signal-to-noise ratio, and the imaging effect in different scenes is improved. In addition, the image sensor comprises a panchromatic optical filter, which can increase the amount of light entering the pixels and thereby improve the imaging effect in dim light.

Description

Image acquisition method and device, terminal and computer readable storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image acquisition method, an image acquisition apparatus, a terminal, and a non-volatile computer-readable storage medium.
Background
At present, when a camera shoots, the output mode of the image is generally fixed, so the adaptability to different scenes is poor. Some compensation can be performed by adjusting exposure parameters to improve the imaging quality in different scenes; however, the improvement achievable by adjusting exposure parameters is limited, and the imaging effect remains poor.
Disclosure of Invention
Embodiments of the present application provide an image acquisition method, an image acquisition apparatus, a terminal, and a non-volatile computer-readable storage medium.
The image acquisition method of the embodiment of the application is applied to an image sensor. The image sensor comprises an optical filter array and a pixel array. The optical filter array comprises a minimum repeating unit, the minimum repeating unit comprises a plurality of filter sets, and each filter set comprises a color filter and a panchromatic filter, wherein the width of the wavelength band of light transmitted by the color filter is smaller than the width of the wavelength band of light transmitted by the panchromatic filter, and each of the color filter and the panchromatic filter comprises a plurality of sub-filters. The pixel array comprises a plurality of pixels, each pixel corresponds to one sub-filter of the filter array, and each pixel is configured to receive light passing through the corresponding sub-filter to generate an electrical signal. The image acquisition method includes outputting an image in at least one of a plurality of image output modes. The plurality of image output modes include a full-resolution output mode in which a first pixel value is read out from each pixel to obtain a first image; a first combined output mode in which a second pixel value is read out by combining the plurality of pixels corresponding to one panchromatic filter and a third pixel value is read out by combining the plurality of pixels corresponding to one color filter, to obtain a second image; and a second combined output mode in which a fourth pixel value is read out by combining the pixels corresponding to all the panchromatic filters in a filter set and a fifth pixel value is read out by combining the pixels corresponding to all the color filters in the filter set, to obtain a third image.
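The three image output modes described above can be illustrated with a short simulation. The sketch below is not part of the patent disclosure; it assumes a single 4x4-pixel filter set laid out as in FIG. 7a, assumes averaging as the combining operation, and all names are illustrative.

```python
import numpy as np

# Mask marking which pixels sit under panchromatic (True) vs color (False)
# sub-filters, assuming the diagonal layout of FIG. 7a: each filter is a
# 2x2 block of identical sub-filters.
W_MASK = np.array([
    [False, False, True,  True ],
    [False, False, True,  True ],
    [True,  True,  False, False],
    [True,  True,  False, False],
])

def full_resolution(block):
    """Full-resolution mode: read a first pixel value from every pixel."""
    return block.copy()                                  # 4x4 -> 4x4

def first_combined(block):
    """First combined mode: combine the 2x2 pixels under each filter."""
    # Average each 2x2 quadrant (one filter = 2x2 sub-filters = 2x2 pixels).
    return block.reshape(2, 2, 2, 2).mean(axis=(1, 3))   # 4x4 -> 2x2

def second_combined(block):
    """Second combined mode: combine all panchromatic pixels of the filter
    set into a fourth value and all color pixels into a fifth value."""
    return block[W_MASK].mean(), block[~W_MASK].mean()   # 4x4 -> 2 values

block = np.arange(16, dtype=float).reshape(4, 4)
print(full_resolution(block).shape)   # (4, 4)
print(first_combined(block))
print(second_combined(block))
```

With averaging as the assumed combining operation, the first combined mode quarters the pixel count and the second combined mode reduces the whole filter set to one panchromatic value and one color value, matching the definition versus signal-to-noise trade-off described above.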
The image acquisition device of the embodiment of the application comprises an output module. The output module is used for outputting an image through at least one of a plurality of image output modes, wherein the plurality of image output modes comprise a full-resolution output mode for acquiring a first image according to the first pixel value read out by each pixel, a first combined output mode for acquiring a second image according to the second pixel value read out by combining the plurality of pixels corresponding to the panchromatic filter and the third pixel value read out by combining the plurality of pixels corresponding to the color filter, and a second combined output mode for acquiring a third image according to the fourth pixel value read out by combining the plurality of pixels corresponding to all the panchromatic filters in the filter set and the fifth pixel value read out by combining the plurality of pixels corresponding to all the color filters.
The terminal of the embodiment of the application comprises an image sensor and a processor. The image sensor comprises an optical filter array and a pixel array. The optical filter array comprises a minimum repeating unit, the minimum repeating unit comprises a plurality of filter sets, and each filter set comprises a color filter and a panchromatic filter, wherein the width of the wavelength band of light transmitted by the color filter is smaller than the width of the wavelength band of light transmitted by the panchromatic filter, and each of the color filter and the panchromatic filter comprises a plurality of sub-filters. The pixel array comprises a plurality of pixels, each pixel corresponds to one sub-filter of the filter array, and each pixel is configured to receive light passing through the corresponding sub-filter to generate an electrical signal. The processor is configured to output an image through at least one of a plurality of image output modes. The plurality of image output modes include a full-resolution output mode in which a first pixel value is read out from each pixel to obtain a first image; a first combined output mode in which a second pixel value is read out by combining the plurality of pixels corresponding to one panchromatic filter and a third pixel value is read out by combining the plurality of pixels corresponding to one color filter, to obtain a second image; and a second combined output mode in which a fourth pixel value is read out by combining the pixels corresponding to all the panchromatic filters in a filter set and a fifth pixel value is read out by combining the pixels corresponding to all the color filters in the filter set, to obtain a third image.
One or more non-transitory computer-readable storage media embodying a computer program that, when executed by one or more processors, causes the processors to perform an image acquisition method. The image acquisition method is applied to an image sensor. The image sensor comprises an optical filter array and a pixel array. The optical filter array comprises a minimum repeating unit, the minimum repeating unit comprises a plurality of filter sets, and each filter set comprises a color filter and a panchromatic filter, wherein the width of the wavelength band of light transmitted by the color filter is smaller than the width of the wavelength band of light transmitted by the panchromatic filter, and each of the color filter and the panchromatic filter comprises a plurality of sub-filters. The pixel array comprises a plurality of pixels, each pixel corresponds to one sub-filter of the filter array, and each pixel is configured to receive light passing through the corresponding sub-filter to generate an electrical signal. The image acquisition method includes outputting an image in at least one of a plurality of image output modes. The plurality of image output modes include a full-resolution output mode in which a first pixel value is read out from each pixel to obtain a first image; a first combined output mode in which a second pixel value is read out by combining the plurality of pixels corresponding to one panchromatic filter and a third pixel value is read out by combining the plurality of pixels corresponding to one color filter, to obtain a second image; and a second combined output mode in which a fourth pixel value is read out by combining the pixels corresponding to all the panchromatic filters in a filter set and a fifth pixel value is read out by combining the pixels corresponding to all the color filters in the filter set, to obtain a third image.
The image acquisition method, the image acquisition device, the terminal, and the non-volatile computer-readable storage medium of the embodiments of the application can output images through at least one of multiple image output modes, and different image output modes can be used for different scenes. The adaptability to different scenes is therefore strong, a better balance can be obtained between definition and signal-to-noise ratio, and the imaging effect in different scenes is improved. In addition, the image sensor comprises a panchromatic filter, which can increase the amount of light entering the pixels and improve the imaging effect in dim light.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic flow chart diagram of an image acquisition method according to some embodiments of the present application.
FIG. 2 is a block diagram of an image capture device according to some embodiments of the present application.
Fig. 3 is a schematic structural diagram of a terminal according to some embodiments of the present application.
FIG. 4 is an exploded schematic view of an image sensor according to some embodiments of the present application.
Fig. 5 is a schematic diagram of the connection of a pixel array and readout circuitry according to some embodiments of the present application.
Fig. 6 is a schematic plan view of a filter array according to some embodiments of the present application.
Fig. 7a is a schematic plan view of a minimal repeating unit of a filter array according to some embodiments of the present disclosure.
Fig. 7b is a schematic plan view of a minimal repeating unit of a filter array according to some embodiments of the present disclosure.
Fig. 7c is a schematic plan view of a minimal repeating unit of a filter array according to some embodiments of the present disclosure.
Fig. 7d is a schematic plan view of a minimal repeating unit of a filter array according to some embodiments of the present disclosure.
Fig. 8 is a schematic plan view of a pixel array according to some embodiments of the present application.
Fig. 9 is a schematic plan view of a minimal repeating unit of a pixel array according to some embodiments of the present application.
FIG. 10 is a schematic flow chart diagram of an image acquisition method according to some embodiments of the present application.
FIG. 11 is a schematic flow chart diagram of an image acquisition method according to some embodiments of the present application.
FIG. 12 is a schematic flow chart diagram of an image acquisition method according to some embodiments of the present application.
FIG. 13 is a schematic flow chart diagram of an image acquisition method according to some embodiments of the present application.
FIG. 14 is a schematic illustration of an image acquisition method according to certain embodiments of the present application.
FIG. 15 is a schematic illustration of an image acquisition method according to certain embodiments of the present application.
FIG. 16 is a schematic illustration of an image acquisition method according to certain embodiments of the present application.
FIG. 17 is a schematic diagram of a connection between a readable storage medium and a processor according to some embodiments of the present application.
FIG. 18 is a block diagram of an image processing circuit according to some embodiments of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
Referring to fig. 1, the image capturing method according to the embodiment of the present disclosure is applied to an image sensor 21, the image sensor 21 includes a filter array 22 and a pixel array 23, the filter array 22 includes a minimum repeating unit 221, the minimum repeating unit 221 includes a plurality of filter sets 222, each filter set 222 includes a color filter 223 and a panchromatic filter 224, a width of a wavelength band of light transmitted by the color filter 223 is smaller than a width of a wavelength band of light transmitted by the panchromatic filter 224, and each of the color filter 223 and the panchromatic filter 224 includes a plurality of sub-filters 225; the pixel array 23 includes a plurality of pixels 231, each pixel 231 corresponds to one sub-filter 225 of the filter array 22, and the pixels 231 are configured to receive light passing through the corresponding sub-filter 225 to generate an electrical signal; the image acquisition method comprises the following steps:
011: an image is output through at least one of a plurality of image output modes. The plurality of image output modes include a full-resolution output mode in which a first pixel value is read out from each pixel 231 to obtain a first image; a first combined output mode in which a second pixel value is read out by combining the plurality of pixels 231 corresponding to one panchromatic filter 224 and a third pixel value is read out by combining the plurality of pixels 231 corresponding to one color filter 223, to obtain a second image; and a second combined output mode in which a fourth pixel value is read out by combining the pixels 231 corresponding to all the panchromatic filters 224 in a filter set 222 and a fifth pixel value is read out by combining the pixels 231 corresponding to all the color filters 223 in the filter set 222, to obtain a third image.
Referring to fig. 2, the image capturing apparatus 10 according to the embodiment of the present disclosure is applied to an image sensor 21, and the image capturing apparatus 10 includes an output module 11. The output module 11 is configured to execute step 011. That is, the output module 11 is configured to output an image through at least one of a plurality of image output modes.
Referring to fig. 3 to 5, the terminal 100 of the present embodiment includes an image sensor 21 and a processor 30. The image sensor 21 comprises a filter array 22 and a pixel array 23, and the processor 30 is configured to perform step 011. That is, the processor 30 is configured to output an image through at least one of a plurality of image output modes.
The terminal 100 includes a mobile phone, a tablet computer, a notebook computer, a teller machine, a gate, a smart watch, a head display device, and the like, and it is understood that the terminal 100 may also be any other device with image processing function. The following description will be made by taking the terminal 100 as a mobile phone, but the terminal 100 is not limited to a mobile phone. The terminal 100 includes a camera 20, a processor 30, and a housing 40. The camera 20 and the processor 30 are disposed in the housing 40, and the housing 40 can also be used to mount functional modules of the terminal 100, such as a power supply device and a communication device, so that the housing 40 provides protection for the functional modules, such as dust prevention, drop prevention, and water prevention.
The camera 20 may be a front camera, a rear camera, a side camera, an off-screen camera, etc., without limitation. The camera 20 includes a lens and an image sensor 21, when the camera 20 takes an image, light passes through the lens and reaches the image sensor 21, and the image sensor 21 is used for converting an optical signal irradiated onto the image sensor 21 into an electrical signal.
Referring to fig. 4 and 5, the image sensor 21 includes a microlens array 25, a filter array 22, a pixel array 23, and a readout circuit 24.
The microlens array 25 includes a plurality of microlenses 251. The microlenses 251, the sub-filters 225, and the pixels 231 are arranged in one-to-one correspondence. The microlenses 251 are configured to converge incident light; the converged light passes through the corresponding sub-filters 225 and is received by the corresponding pixels 231, and the pixels 231 generate electrical signals according to the received light.
For convenience of description, the minimum repeating unit 221 of the filter array 22 is referred to as a first minimum repeating unit, and the minimum repeating unit 232 of the pixel array 23 is referred to as a second minimum repeating unit.
The first minimum repeating unit includes a plurality of filter sets 222, for example, 2, 3, 4, 5, or 6 filter sets 222. In this embodiment, the first minimum repeating unit includes 4 filter sets 222, and the 4 filter sets 222 are arranged in a matrix.
Referring to fig. 6, each filter set 222 includes a color filter 223 (e.g., a rectangular portion composed of 4 sub-filters 225 with a filling pattern in fig. 6) and a panchromatic filter 224 (e.g., a rectangular portion composed of 4 sub-filters 225 without a filling pattern in fig. 6). The width of the wavelength band of light transmitted by the color filter 223 is smaller than that of the panchromatic filter 224. For example, the wavelength band of light transmitted by the color filter 223 may correspond to the wavelength band of red light, green light, or blue light, while the wavelength band of light transmitted by the panchromatic filter 224 covers all visible light; that is, the color filter 223 only allows light of a specific color to pass, while the panchromatic filter 224 passes light of all colors. Of course, the wavelength band of light transmitted by the color filter 223 may also correspond to other color bands, such as magenta, purple, cyan, or yellow light, and is not limited herein.
The sum of the number of color filters 223 and the number of panchromatic filters 224 in the filter set 222 is 4, 9, 16, 25, and the like, which may be arranged in a matrix. In the present embodiment, the sum of the number of color filters 223 and the number of full-color filters 224 in the filter group 222 is 4.
The ratio of the number of color filters 223 to the number of panchromatic filters 224 may be 1:3, 1:1, or 3:1. For example, if the ratio is 1:3, there is 1 color filter 223 and 3 panchromatic filters 224; in this case the number of panchromatic filters 224 is large, and the imaging quality in dim light is better. If the ratio is 1:1, there are 2 color filters 223 and 2 panchromatic filters 224; in this case, better color representation can be obtained and the imaging quality in dim light is also good. If the ratio is 3:1, there are 3 color filters 223 and 1 panchromatic filter 224; in this case better color representation can be obtained while the imaging quality in dim light is still improved. In the embodiment of the present application, as shown in fig. 7a, the number of color filters 223 is 2 and the number of panchromatic filters 224 is 2. The 2 color filters 223 and the 2 panchromatic filters 224 are arranged in a matrix: the 2 color filters 223 are located in the direction of a first diagonal D1 (specifically, on the first diagonal D1) of the rectangle corresponding to the matrix, the 2 panchromatic filters 224 are located in the direction of a second diagonal D2 (specifically, on the second diagonal D2) of the rectangle, and the direction of the first diagonal D1 is different from the direction of the second diagonal D2 (for example, the direction of the first diagonal D1 is not parallel to the direction of the second diagonal D2), so that both color representation and dim-light imaging quality are achieved.
In other embodiments, one color filter 223 and one panchromatic filter 224 are located at a first diagonal D1, and the other color filter 223 and the other panchromatic filter 224 are located at a second diagonal D2.
For example, the colors corresponding to the wavelength bands of light transmitted by the color filters 223 of the filter sets 222 in the first minimum repeating unit may include color a, color b, and color c; or only color a, color b, or color c; or color a and color b; or color b and color c; or color a and color c. For example, color a is red, color b is green, and color c is blue; or color a is magenta, color b is cyan, and color c is yellow; this is not limited herein. In the embodiment of the present application, the colors corresponding to the wavelength bands of light transmitted by the color filters 223 of the filter sets 222 in the first minimum repeating unit include color a, color b, and color c, where color a, color b, and color c are green, red, and blue, respectively. Specifically, as shown in fig. 7a, the colors corresponding to the color filters 223 of the 4 filter sets 222 of the first minimum repeating unit are red, green, blue, and green, respectively, so as to form an arrangement similar to a Bayer array. Of course, the colors corresponding to the first filter set 2221, the second filter set 2222, the third filter set 2223, and the fourth filter set 2224 may also be green, red, green, and blue; or blue, green, red, and green; etc., which are not intended to be limiting herein.
The color filter 223 and the panchromatic filter 224 each include a plurality of sub-filters 225, for example, 2, 3, 4, 5, or 6 sub-filters 225. The sub-filters 225 in the same color filter 223 (or the same panchromatic filter 224) transmit light in the same wavelength band.
Referring to fig. 7a, in an example, the sum of the number of color filters 223 and the number of panchromatic filters 224 in the filter set 222 is 4, and the ratio of the number of color filters 223 and the number of panchromatic filters 224 is 1:1, then the first minimum repeating unit is 8 rows and 8 columns and includes 64 sub-filters 225, and the arrangement may be:
[Arrangement matrix of the first minimum repeating unit (8 rows by 8 columns of sub-filters); shown as an image in the original document]
where w denotes a panchromatic sub-filter and a, b, and c denote color sub-filters. A panchromatic sub-filter is a sub-filter 225 that filters out all light outside the visible band; the color sub-filters include a red sub-filter, a green sub-filter, a blue sub-filter, a magenta sub-filter, a cyan sub-filter, and a yellow sub-filter. The red sub-filter 225 filters out all light except red light, the green sub-filter 225 filters out all light except green light, the blue sub-filter 225 filters out all light except blue light, the magenta sub-filter 225 filters out all light except magenta light, the cyan sub-filter 225 filters out all light except cyan light, and the yellow sub-filter 225 filters out all light except yellow light.
a, b, and c may each be a red, green, blue, magenta, cyan, or yellow sub-filter. For example, b is a red sub-filter, a is a green sub-filter, and c is a blue sub-filter; or c is a red sub-filter, a is a green sub-filter, and b is a blue sub-filter; or a is a red sub-filter, b is a blue sub-filter, and c is a green sub-filter; etc., and the disclosure is not limited herein. For another example, b is a magenta sub-filter, a is a cyan sub-filter, and c is a yellow sub-filter. In other embodiments, the color filter may further include sub-filters of other colors, such as an orange sub-filter, a violet sub-filter, and the like, which is not limited herein.
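The 8-row, 8-column first minimum repeating unit can be constructed programmatically under stated assumptions. Since the patent's arrangement matrix is reproduced only as an image, the sketch below assumes the diagonal placement of FIG. 7a (color filters of each set on the main diagonal, panchromatic filters on the other) and a Bayer-like b/a/a/c ordering of the four sets, with b red, a green, and c blue; the exact placement in the original figure may differ.

```python
import numpy as np

def filter_set(color):
    """4x4 sub-filter block: each filter is a 2x2 block of identical
    sub-filters, color on one diagonal and panchromatic 'w' on the other
    (assumed layout)."""
    c = np.full((2, 2), color)
    w = np.full((2, 2), 'w')
    return np.block([[c, w], [w, c]])

# Bayer-like ordering of the four filter sets: red, green / green, blue.
unit = np.block([[filter_set('b'), filter_set('a')],
                 [filter_set('a'), filter_set('c')]])
print(unit.shape)               # (8, 8): 64 sub-filters
print((unit == 'w').sum())      # half the sub-filters are panchromatic
```

Tiling this unit across the sensor reproduces the 1:1 color-to-panchromatic ratio described above, with 32 of the 64 sub-filters panchromatic.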
Referring to fig. 7b, in another example, the sum of the number of color filters 223 and the number of panchromatic filters 224 in the filter set 222 is 4, and the ratio of the number of color filters 223 and the number of panchromatic filters 224 is 1:1, then the first minimum repeating unit is 8 rows and 8 columns and includes 64 sub-filters 225, and the arrangement may also be:
[Alternative arrangement matrix of the first minimum repeating unit (8 rows by 8 columns of sub-filters); shown as an image in the original document]
Referring to fig. 7c, in yet another example, the sum of the number of color filters 223 and the number of panchromatic filters 224 in the filter set 222 is 9, the color filters 223 and the panchromatic filters 224 in the filter set 222 are arranged in a matrix, and the ratio of the number of color filters 223 to the number of panchromatic filters 224 is 4:5, so that the number of color filters 223 is 4 and the number of panchromatic filters 224 is 5. Since the number of panchromatic filters 224 is larger, the imaging quality in dim light is better. The panchromatic filters 224 are located on a third diagonal D3 and a fourth diagonal D4 of the rectangle corresponding to the filter set 222, the third diagonal D3 and the fourth diagonal D4 being diagonals of the rectangle; the color filters 223 are located in the direction of the third diagonal D3 or the direction of the fourth diagonal D4 but not on the third diagonal D3 or the fourth diagonal D4; and the direction of the third diagonal D3 is different from the direction of the fourth diagonal D4 (for example, the direction of the third diagonal D3 is not parallel to the direction of the fourth diagonal D4). Specifically, the first minimum repeating unit is 12 rows and 12 columns and includes 144 sub-filters 225, and the arrangement may be:
[Arrangement matrix of the first minimum repeating unit (12 rows by 12 columns of sub-filters); shown as an image in the original document]
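The 3-by-3 filter-level layout of the filter set just described (panchromatic filters on both diagonals of the matrix, color filters in the remaining positions) can be sketched as follows; this is an illustration of the stated 4:5 ratio, not a reproduction of the patent's figure:

```python
import numpy as np

# Panchromatic ('w') filters on both diagonals of the 3x3 filter matrix,
# color filters ('a') in the off-diagonal positions.
diag = np.eye(3, dtype=bool)
fset = np.where(diag | np.fliplr(diag), 'w', 'a')
print(fset)
# [['w' 'a' 'w']
#  ['a' 'w' 'a']
#  ['w' 'a' 'w']]
print((fset == 'a').sum(), (fset == 'w').sum())   # 4 color, 5 panchromatic
```

The two diagonals of a 3-by-3 matrix share the center cell, which is why 5 of the 9 filters end up panchromatic, giving the 4:5 color-to-panchromatic ratio.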
Referring to fig. 7d, in another example, the sum of the number of color filters 223 and the number of panchromatic filters 224 in the filter set 222 is 9, the color filters 223 and the panchromatic filters 224 in the filter set 222 are arranged in a matrix, and the ratio of the number of color filters 223 to the number of panchromatic filters 224 is 5:4, so that the number of color filters 223 is 5 and the number of panchromatic filters 224 is 4. In this case the number of color filters 223 is large, so better color representation can be obtained while the imaging quality in dim light is still improved. The color filters 223 are located on a fifth diagonal D5 and a sixth diagonal D6 of the rectangle corresponding to the filter set 222, the fifth diagonal D5 and the sixth diagonal D6 being diagonals of the rectangle; the panchromatic filters 224 are located in the direction of the fifth diagonal D5 or the direction of the sixth diagonal D6 but not on the fifth diagonal D5 or the sixth diagonal D6; and the direction of the fifth diagonal D5 is different from the direction of the sixth diagonal D6 (for example, the direction of the fifth diagonal D5 is not parallel to the direction of the sixth diagonal D6). Specifically, the first minimum repeating unit is 12 rows and 12 columns and includes 144 sub-filters 225, and the arrangement may also be:
[The 12 x 12 sub-filter arrangement matrix is shown as a figure in the original publication and is not reproduced here.]
The image sensor 21, the camera 20 and the terminal 100 of this embodiment include the panchromatic filters 224, so the image sensor 21 can acquire a larger amount of light during shooting without the shooting parameters needing to be adjusted. The imaging quality under dim light is therefore improved without affecting the shooting stability, so that both the stability and the quality of imaging under dim light are high. In addition, each panchromatic filter 224 and each color filter 223 is composed of 4 sub-filters 225: under dark light, the pixels 231 corresponding to the 4 sub-filters 225 can be combined and output to obtain an image with a high signal-to-noise ratio, while in a scene with sufficient light, the pixel 231 corresponding to each sub-filter 225 can be output independently to obtain an image with high definition and a high signal-to-noise ratio.
Referring to fig. 4 and 8, the pixel array 23 includes a plurality of pixels 231, each pixel 231 corresponds to one of the sub-filters 225, the pixels 231 are configured to receive light passing through the corresponding sub-filter 225 to generate electrical signals, and the processor 30 processes the electrical signals to obtain pixel values of the pixels 231.
The second minimal repeating unit includes a plurality of pixel groups 233 corresponding to the filter groups 222 in the first minimal repeating unit; specifically, the second minimal repeating unit includes 4 pixel groups 233 arranged in a matrix, and each pixel group 233 corresponds to one filter group 222. As shown in fig. 9, the 4 pixel groups 233 include a first pixel group 2331, a second pixel group 2332, a third pixel group 2333 and a fourth pixel group 2334, which are arranged corresponding to the first filter group 2221, the second filter group 2222, the third filter group 2223 and the fourth filter group 2224, respectively.
The pixel group 233 includes color pixel units 234 and panchromatic pixel units 235, the color pixel units 234 and the panchromatic pixel units 235 being disposed in one-to-one correspondence with the color filters 223 and the panchromatic filters 224, respectively. In this embodiment, the number of the color pixel units 234 and the number of the panchromatic pixel units 235 are each 2; the 2 color pixel units 234 and the 2 panchromatic pixel units 235 are arranged in a matrix, with the 2 color pixel units 234 located on a seventh diagonal D7 of the rectangle corresponding to the matrix and the 2 panchromatic pixel units 235 located on an eighth diagonal D8 of the rectangle.
The color pixel unit 234 includes color pixels 2341, and the panchromatic pixel unit 235 includes panchromatic pixels 2311. The color pixels 2341 are disposed in one-to-one correspondence with the sub-filters 225 of the color filter 223 (hereinafter referred to as color sub-filters), and the panchromatic pixels 2311 are disposed in one-to-one correspondence with the sub-filters 225 of the panchromatic filter 224 (hereinafter referred to as panchromatic sub-filters). Since the color filter 223 and the panchromatic filter 224 include 4 color sub-filters and 4 panchromatic sub-filters, respectively, the color pixel unit 234 and the panchromatic pixel unit 235 also include 4 color pixels 2341 and 4 panchromatic pixels 2311, respectively. Each color pixel 2341 receives light of a specific color (e.g., red, green, or blue) transmitted by the corresponding color sub-filter to generate an electrical signal, each panchromatic pixel 2311 receives light of all colors transmitted by the corresponding panchromatic sub-filter to generate an electrical signal, and the processor 30 obtains the pixel values of the panchromatic pixels 2311 and the color pixels 2341 from these electrical signals.
The color of each color pixel 2341 corresponds to the wavelength band of light transmitted by the correspondingly arranged color sub-filter, so the color pixels 2341 in the second minimal repeating unit also include the colors a, b and c. For example, if the wavelength bands of light transmitted by the color sub-filters in the first minimal repeating unit include the red, green and blue wavelength bands, the color pixels 2341 include red, green and blue. Corresponding to the colors of the 4 filter sets 222, the colors of the color pixels 2341 of the color pixel units 234 in the 4 pixel groups 233 (i.e., the first pixel group 2331, the second pixel group 2332, the third pixel group 2333 and the fourth pixel group 2334) are red, green, blue and green, respectively; that is, the color a is green, the color b is red, and the color c is blue. It is understood that the color of a color pixel 2341 is not the color of the pixel itself, but the color corresponding to the wavelength band of the light transmitted by its corresponding color sub-filter.
The panchromatic pixel 2311 in the second minimal repeating unit has a color corresponding to a wavelength band of light transmitted by the panchromatic sub-filter in the first minimal repeating unit, for example, the panchromatic pixel 2311 includes a color W, and the panchromatic sub-filter transmits light in a visible light wavelength band, so that the color W is white. It will be appreciated that the panchromatic pixels 2311 include colors that are not the colors of the panchromatic pixels 2311 themselves, but rather colors corresponding to the wavelength bands of light transmitted by the panchromatic sub-filters corresponding to the panchromatic pixels 2311.
Referring to fig. 5, the readout circuit 24 is electrically connected to the pixel array 23 and is used for controlling the exposure of the pixel array 23 and the reading and outputting of the pixel values of the pixels 231.
The readout circuit 24 includes a vertical driving unit 241, a control unit 242, a column processing unit 243, and a horizontal driving unit 244.
The vertical driving unit 241 includes a shift register and an address decoder, and has readout scanning and reset scanning functions. Readout scanning refers to sequentially scanning the pixels 231 row by row and reading signals from these pixels 231 row by row; for example, a signal output by each pixel 231 in the selected and scanned pixel row is transmitted to the column processing unit 243. Reset scanning resets charges: the photocharges of the photoelectric conversion elements of the pixels 231 are discarded, so that accumulation of new photocharges can start.
The signal processing performed by the column processing unit 243 is correlated double sampling (CDS) processing. In CDS processing, the reset level and the signal level output from each pixel 231 in the selected pixel row are taken out, and their difference is calculated, thereby obtaining the signals of the pixels 231 in one row. The column processing unit 243 may also have an analog-to-digital (A/D) conversion function for converting analog pixel signals into a digital format.
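As a hedged sketch of the CDS arithmetic described above (the array contents are made-up illustration values, not measured data):

```python
import numpy as np

def correlated_double_sampling(reset_levels, signal_levels):
    # CDS: subtract each pixel's reset level from its signal level,
    # removing the per-pixel offset so only the photo-signal remains.
    return np.asarray(signal_levels, dtype=np.int64) - np.asarray(reset_levels, dtype=np.int64)

# One selected pixel row: reset levels, then signal levels.
reset = [12, 15, 11, 14]
signal = [112, 215, 61, 140]
print(correlated_double_sampling(reset, signal).tolist())  # [100, 200, 50, 126]
```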
The horizontal driving unit 244 includes a shift register and an address decoder. The horizontal driving unit 244 sequentially scans the pixel array 23 column by column. Each pixel column is sequentially processed by the column processing unit 243 and sequentially output by a selective scanning operation performed by the horizontal driving unit 244.
The control unit 242 configures timing signals according to an operation mode, and controls the vertical driving unit 241, the column processing unit 243, and the horizontal driving unit 244 to cooperatively operate using various timing signals.
Specifically, the processor 30 may select at least one of a plurality of image output modes to output an image for the current scene. For example, to acquire the highest-definition image, the user may select the full-resolution output mode among the plurality of image output modes. In the full-resolution output mode, each pixel 231 outputs a first pixel value, so as to generate an image with a resolution equal to that of the image sensor 21; for example, if the resolution of the image sensor 21 is 48 million pixels, a first image with a size of 48 million pixels can be generated;
as another example, when the current ambient brightness is insufficient, the user may select the first combined output mode among the plurality of image output modes in order to improve the signal-to-noise ratio of the image. In the first combined output mode, the electrical signals of the 4 panchromatic pixels 2311 in the panchromatic pixel unit 235 corresponding to each panchromatic filter 224 are combined and read out to obtain a second pixel value, and the electrical signals of the 4 color pixels 2341 in the color pixel unit 234 corresponding to each color filter 223 are combined and read out to obtain a third pixel value. An image with a resolution equal to 1/4 of that of the image sensor 21 can then be generated from all the second pixel values and third pixel values; for example, if the resolution of the image sensor 21 is 48 million pixels, a 12-million-pixel second image can be generated;
for another example, when the current ambient brightness is severely insufficient, the user may select the second combined output mode among the plurality of image output modes in order to maximize the signal-to-noise ratio of the image. In the second combined output mode, the electrical signals of the 8 panchromatic pixels 2311 in the panchromatic pixel units 235 corresponding to all the panchromatic filters 224 in each filter set 222 are combined and read out to obtain a fourth pixel value, and the electrical signals of the 8 color pixels 2341 in the color pixel units 234 corresponding to all the color filters 223 in each filter set 222 are combined and read out to obtain a fifth pixel value. An intermediate image is generated from all the fourth pixel values and another from all the fifth pixel values, and the two intermediate images are combined to generate an image with a resolution equal to 1/16 of that of the image sensor 21; for example, if the resolution of the image sensor 21 is 48 million pixels, a third image with a size of 3 million pixels can be generated.
The combined readout of the electrical signals may accumulate the electrical signals of the plurality of pixels 231 to obtain an accumulated electrical signal and then determine the corresponding pixel value from the accumulated signal; alternatively, the pixel value of each pixel 231 may first be read out, and the plurality of pixel values then accumulated to obtain the pixel value of one pixel.
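The second (digital) variant of the combined readout can be sketched as follows; the 2x2 unit values are illustrative, and summing rather than averaging is an assumption:

```python
import numpy as np

def combine_readout(unit_values):
    # Digital combining: the pixel value of each pixel in a 2x2 pixel
    # unit is read out first, and the four values are then accumulated
    # into the single pixel value of the combined pixel.
    return int(np.sum(unit_values))

# A 2x2 panchromatic pixel unit read out under dim light.
unit = [[60, 62],
        [58, 64]]
print(combine_readout(unit))  # 244
```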
Of course, the processor 30 may simultaneously select a plurality of image output modes to output the first image, the second image, and/or the third image. For example, the processor 30 simultaneously outputs the first image and the second image, or the second image and the third image, or the first image, the second image, and the third image. The user can select a satisfactory image from a plurality of images output in a plurality of image output modes.
The image acquisition method, the image acquisition device and the terminal 100 of the embodiment of the application can output images through at least one of the multiple image output modes, can use different image output modes aiming at different scenes, have strong adaptability to different scenes, can obtain better balance between definition and signal to noise ratio, and improve the imaging effect under different scenes.
Referring again to fig. 1, in some embodiments, the image acquisition method includes:
012: acquiring shooting information, wherein the shooting information comprises at least one of ambient brightness and shooting parameters;
013: determining the image output mode adapted to the photographing information.
Referring again to fig. 2, in some embodiments, the image processing apparatus 10 further includes an obtaining module 12 and a determining module 13. The obtaining module 12 and the determining module 13 are configured to perform step 012 and step 013, respectively. Namely, the acquisition module 12 is used for acquiring shooting information; the determining module 13 is configured to determine the image output mode adapted to the shooting information.
Referring again to fig. 3, in some embodiments, processor 30 is further configured to perform step 012 and step 013. That is, the processor 30 is also used to acquire photographing information and determine the image output mode adapted to the photographing information.
Specifically, the processor 30 first obtains shooting information, which includes at least one of ambient brightness and shooting parameters; for example, the shooting information includes the ambient brightness, or the shooting parameters, or both, where the shooting parameters may include the shooting mode, exposure parameters, and the like. This embodiment is described by taking as an example the case in which the shooting information includes both the ambient brightness and the shooting parameters (the shooting parameters including the shooting mode).
The processor 30 may acquire the current photographing mode and an ambient light intensity signal collected by the light sensor 50 (shown in fig. 3) of the terminal 100, and then determine the ambient brightness according to the light intensity signal; or the processor 30 may control the camera 20 to capture an image and then determine the ambient brightness according to the gray value distribution of the captured image; or when the image is captured, in order to obtain a better capturing effect under different ambient brightness, the exposure parameters, such as aperture size, sensitivity, etc., are generally automatically adjusted, the ambient brightness and the exposure parameters have a mapping relationship, and the processor 30 can determine the ambient brightness according to the exposure parameters when the image is captured.
After acquiring the ambient brightness and the photographing parameters, the processor 30 may determine an image output mode adapted to the ambient brightness and/or the photographing parameters. For example, the processor 30 may determine an image output mode that is adapted to the photographing mode and the ambient brightness.
Since the shooting mode generally requires the user to actively select, the processor 30 may preferentially determine the image output mode according to the shooting mode, for example, when the shooting mode is the full resolution mode, the processor 30 determines that the adapted image output mode is the full resolution output mode; for another example, if the shooting mode is the high resolution mode, the processor 30 determines that the adapted image output mode is the first merged output mode; for another example, if the photographing mode is the low resolution mode, the processor 30 determines the adapted image output mode as the second combined output mode.
When the photographing mode is not selected, the processor 30 may determine an image output mode adapted to the ambient brightness.
For example, when the ambient brightness is high (e.g., the ambient brightness is above the first ambient brightness threshold), the processor 30 may determine that the adapted image output mode is the full resolution output mode; when the ambient brightness is normal (e.g., the ambient brightness is higher than the second ambient brightness threshold and is less than the first ambient brightness threshold), the processor 30 may determine that the adapted image output mode is the first merged output mode; when the ambient brightness is low (e.g., the ambient brightness is less than the second ambient brightness threshold), the processor 30 may determine the adapted image output mode as the second merged output mode. Therefore, adaptive image output modes are selected according to different environment brightness, the definition and the signal-to-noise ratio are well balanced, the definition and the signal-to-noise ratio are not too low, and the imaging quality is improved.
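The brightness-to-mode mapping described above can be sketched as a small selection function; the concrete threshold values here are placeholders, since the text defines the thresholds only by their ordering (first > second):

```python
def select_output_mode(ambient_brightness,
                       first_threshold=200.0,
                       second_threshold=150.0):
    # Higher brightness -> less binning needed: pick the mode that
    # best balances definition against signal-to-noise ratio.
    if ambient_brightness > first_threshold:
        return "full_resolution"   # bright scene: highest definition
    if ambient_brightness > second_threshold:
        return "first_combined"    # normal scene: 4:1 binning
    return "second_combined"       # dark scene: 16:1 binning

print(select_output_mode(230))  # full_resolution
print(select_output_mode(90))   # second_combined
```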
After determining the image output mode, the processor 30 may control the image sensor 21 to output the corresponding image according to the adapted image output mode. The image output mode can be changed in real time along with the change of the shooting information, and the processor 30 acquires the shooting information in real time and determines the image output mode at intervals of preset time, so that the real-time adaptation of the image output mode and the current shooting information is ensured. And the image sensor 21 includes a panchromatic filter 224, which can increase the light entering amount of the pixel and improve the imaging effect under dark light.
The corresponding image output mode can be determined according to the shooting information, so that when the scene with different shooting information such as environment brightness and shooting parameters is responded, the proper image output mode is selected, better balance is obtained between definition and signal to noise ratio, adaptability to different scenes is stronger, and imaging effects under different scenes can be improved.
Referring to fig. 10, in some embodiments, step 013 (specifically, determining the image output mode adapted to the ambient brightness) includes the following steps:
0131: when the ambient brightness is larger than a first ambient brightness threshold value, determining that the image output mode is a full-resolution output mode;
0132: when the ambient brightness is greater than the second ambient brightness threshold and less than the first ambient brightness threshold, determining that the image output mode is a first combined output mode; and
0133: and when the ambient brightness is smaller than a second ambient brightness threshold value, determining that the image output mode is a second combined output mode, wherein the first ambient brightness threshold value is larger than the second ambient brightness threshold value.
Referring again to fig. 2, in some embodiments, the determination module 13 is further configured to perform steps 0131, 0132, and 0133. That is, the determining module 13 is further configured to determine that the image output mode is the full-resolution output mode when the ambient brightness is greater than the first ambient brightness threshold; when the ambient brightness is greater than the second ambient brightness threshold and less than the first ambient brightness threshold, determining that the image output mode is a first combined output mode; and when the ambient brightness is smaller than the second ambient brightness threshold value, determining the image output mode as a second combined output mode.
Referring again to FIG. 3, in certain embodiments, processor 30 is also configured to perform steps 0131, steps 0132, and steps 0133. That is, the processor 30 is further configured to determine that the image output mode is the full resolution output mode when the ambient brightness is greater than the first ambient brightness threshold; when the ambient brightness is greater than the second ambient brightness threshold and less than the first ambient brightness threshold, determining that the image output mode is a first combined output mode; and when the ambient brightness is smaller than the second ambient brightness threshold value, determining the image output mode as a second combined output mode.
Specifically, the shooting information acquired by the processor 30 may only include the ambient brightness, and when the image output mode adapted to the shooting information is determined, the image output mode adapted to the ambient brightness is determined.
When the terminal 100 leaves the factory, a first ambient brightness threshold and a second ambient brightness threshold that decrease in sequence may be preset. The two thresholds may be determined from empirical values or obtained by testing the terminal 100. For example, the terminal 100 is placed in an environment with adjustable ambient brightness, and by adjusting the ambient brightness, the electrical signals of the pixels of the image sensor 21 corresponding to each ambient brightness are obtained, so as to establish a mapping relationship between the average value of the electrical signals of the pixels of the image sensor 21 and the ambient brightness. When the pixel value corresponding to the average value is 200, the corresponding ambient brightness is taken as the first ambient brightness threshold; when the pixel value corresponding to the average value is 150, the corresponding ambient brightness is taken as the second ambient brightness threshold. Because the thresholds are obtained by testing the image sensor 21 of the terminal 100 itself, they are better adapted to the terminal 100 and more accurate.
When the ambient brightness is greater than the first ambient brightness threshold (hereinafter referred to as "high-brightness environment"), the ambient light is sufficient, and the amount of light that can be obtained by each pixel is large, the processor 30 may determine that the adapted image output mode is the full-resolution output mode, so as to obtain the first image with high definition and signal-to-noise ratio; when the ambient brightness is greater than the second ambient brightness threshold and less than or equal to the first ambient brightness threshold (hereinafter referred to as "medium-bright environment"), the ambient light is still more, but the amount of light that can be obtained by each pixel is reduced compared to the high-bright environment, and the processor 30 may determine that the adapted image output mode is the first merged output mode, so as to obtain the second image with slightly reduced definition but improved signal-to-noise ratio; when the ambient brightness is less than or equal to the second ambient brightness threshold (hereinafter referred to as a low-brightness environment), the ambient light is less, and the amount of light that can be obtained by each pixel is less, the processor 30 may determine that the adapted image output mode is the second merged output mode, so as to obtain a third image with reduced definition but significantly improved signal-to-noise ratio. Therefore, adaptive image output modes are selected according to different environment brightness, the definition and the signal-to-noise ratio are well balanced, the definition and the signal-to-noise ratio are not too low, and the imaging quality is improved.
Referring to fig. 11, in some embodiments, the capturing parameters include exposure parameters, and the step 013 (specifically, determining the image output mode adapted to the ambient brightness and the capturing parameters) further includes the following steps:
0134: determining the light inlet quantity according to the environment brightness and the exposure parameters;
0135: when the light entering amount is larger than a first light entering amount threshold value, determining that the image output mode is a full-resolution output mode;
0136: when the light entering amount is greater than a second light entering amount threshold and less than the first light entering amount threshold, determining that the image output mode is a first combined output mode; and
0137: when the light entering amount is less than the second light entering amount threshold, determining that the image output mode is a second combined output mode.
Referring again to fig. 2, in some embodiments, the determination module 13 is further configured to perform steps 0134, 0135, 0136, and 0137. That is, the determining module 13 is further configured to determine the light entering amount according to the ambient brightness and the exposure parameters; when the light entering amount is greater than the first light entering amount threshold, determine that the image output mode is the full-resolution output mode; when the light entering amount is greater than the second light entering amount threshold and less than the first light entering amount threshold, determine that the image output mode is the first combined output mode; and when the light entering amount is less than the second light entering amount threshold, determine that the image output mode is the second combined output mode.
Referring again to fig. 3, in some embodiments, processor 30 is further configured to perform steps 0134, 0135, 0136, and 0137. That is, the processor 30 is further configured to determine the light entering amount according to the ambient brightness and the exposure parameters; when the light entering amount is greater than the first light entering amount threshold, determine that the image output mode is the full-resolution output mode; when the light entering amount is greater than the second light entering amount threshold and less than the first light entering amount threshold, determine that the image output mode is the first combined output mode; and when the light entering amount is less than the second light entering amount threshold, determine that the image output mode is the second combined output mode.
Specifically, since the camera 20 can adjust exposure parameters such as the aperture size, shutter time and sensitivity during shooting, the pixel values of the pixels under different exposure parameters differ significantly even at the same ambient brightness. For example, at constant ambient brightness, the larger the aperture, the larger the light entering amount, the more light each pixel can obtain, and the larger the pixel value; likewise, the longer the shutter time, the larger the light entering amount and the larger the pixel value; and the higher the sensitivity, the larger the electrical signal generated by the same light entering amount, which is equivalent to a larger light entering amount and a larger pixel value. Therefore, in addition to the ambient brightness, the exposure parameters also affect the selection of the image output mode. Taking the aperture size as an example, the light entering amount with a small aperture in a high-brightness environment may be smaller than the light entering amount with a large aperture in a medium-brightness environment. The processor 30 may therefore determine the light entering amount according to the ambient brightness and the exposure parameters, and then determine the image output mode according to the light entering amount.
Specifically, when the amount of light entering is greater than the first light entering amount threshold, the amount of light that can be obtained by each pixel is large, and the processor 30 may determine that the adapted image output mode is the full resolution output mode, so as to obtain the first image with high definition and signal-to-noise ratio; when the amount of light entering is greater than the second light entering threshold and less than or equal to the first light entering threshold, the amount of light available for each pixel decreases, and the processor 30 may determine the adapted image output mode as the first combined output mode to obtain the second image with slightly decreased sharpness but increased signal-to-noise ratio; when the amount of light entering is less than or equal to the second threshold amount of light entering, and the amount of light available for each pixel is also less, the processor 30 may determine the adapted image output mode as the second combined output mode, so as to obtain a third image with reduced sharpness and significantly improved signal-to-noise ratio. Therefore, adaptive image output modes are selected according to different environment brightness and exposure parameters, the definition and the signal-to-noise ratio are well balanced, the definition and the signal-to-noise ratio are not too low, and the imaging quality is improved.
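The light-entering-amount logic above can be sketched as follows. The formula is only an illustrative proxy (brightness scaled by aperture area, exposure time and sensitivity); the patent does not give a concrete formula, and all constants and thresholds here are assumptions:

```python
def light_entering_amount(ambient_brightness, f_number, shutter_s, iso):
    # Illustrative proxy: brightness scaled by aperture area (~1/N^2),
    # exposure time, and sensitivity normalized to ISO 100.
    return ambient_brightness * (1.0 / f_number ** 2) * shutter_s * (iso / 100.0)

def select_mode_by_light(amount, first_threshold=1.0, second_threshold=0.25):
    if amount > first_threshold:
        return "full_resolution"
    if amount > second_threshold:
        return "first_combined"
    return "second_combined"

# A fairly bright scene shot through a small aperture can still
# end up in a combined (binned) output mode.
amount = light_entering_amount(400, f_number=4.0, shutter_s=0.01, iso=200)
print(select_mode_by_light(amount))  # first_combined
```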
Referring to FIG. 12, in some embodiments, step 011 includes the steps of:
0111: outputting a first image through a full resolution output mode; and/or
0112: outputting a second image through a first merging output mode; and/or
0113: and outputting the third image through the second combined output mode.
Referring again to fig. 2, in some embodiments, the determining module 13 is further configured to perform step 0111, step 0112, and step 0113. That is, the determining module 13 is further configured to output the first image in the full resolution output mode; and/or outputting the second image through the first combined output mode; and/or outputting the third image through the second merged output mode.
Referring again to fig. 3, in some embodiments, the processor 30 is further configured to perform step 0111, step 0112, and step 0113. That is, the processor 30 is configured to output the first image in the full resolution output mode; and/or outputting the second image through the first combined output mode; and/or outputting the third image through the second merged output mode.
Specifically, when the image output mode is the full resolution output mode, the processor 30 controls the image sensor 21 to output the first image in the full resolution output mode; when the image output mode is the first merging output mode, the processor 30 controls the image sensor 21 to output the second image in the first merging output mode; when the image output mode is the second merged output mode, the processor 30 controls the image sensor 21 to output the third image in the second merged output mode.
The processor 30 may also control the image sensor 21 to simultaneously output the first image and the second image in the full resolution output mode and the first merged output mode, or the processor 30 may control the image sensor 21 to simultaneously output the first image and the third image in the full resolution output mode and the second merged output mode, or the processor 30 may control the image sensor 21 to simultaneously output the second image and the third image in the first merged output mode and the second merged output mode, or the processor 30 may control the image sensor 21 to simultaneously output the first image, the second image, and the third image in the full resolution output mode, the first merged output mode, and the second merged output mode.
After the image sensor 21 outputs the first image and the second image, or the second image and the third image, or the first image, the second image and the third image at the same time, the user can select the target image according to the preference of the user and save the target image.
It is understood that the image sensor 21 may output a plurality of images simultaneously in either of two ways: the image sensor 21 may rapidly output several times according to the different image output modes to obtain the plurality of images; or the image sensor 21 may output the pixel value of each pixel (i.e., output the first image in the full-resolution mode), after which the processor 30 performs combining processing on the pixel values to output the first image, the second image and/or the third image, respectively.
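The second approach, deriving the lower-resolution images digitally from one full-resolution readout, can be sketched with block sums; this is a simplification that ignores the color/panchromatic unit boundaries of the real mosaic:

```python
import numpy as np

def binned_image(full_res, factor):
    # Sum factor x factor blocks of pixel values: the digital
    # equivalent of the combined output modes.
    h, w = full_res.shape
    return full_res.reshape(h // factor, factor, w // factor, factor).sum(axis=(1, 3))

full = np.arange(16).reshape(4, 4)   # stand-in for the full-resolution first image
second = binned_image(full, 2)       # 1/4 resolution (first combined output mode)
third = binned_image(full, 4)        # 1/16 resolution (second combined output mode)
print(second.shape, third.shape)     # (2, 2) (1, 1)
```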
In this way, the processor 30 may control the image sensor 21 to output a corresponding image through the adapted image output mode.
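The second output path above, in which a single full-resolution readout is combined in software into lower-resolution images, can be sketched as follows. This is only an illustration of how merged modes reduce pixel count (the `bin_image` helper and array sizes are hypothetical, and it ignores the separation into panchromatic and color channels for simplicity):

```python
import numpy as np

def bin_image(full, factor):
    """Combine factor x factor pixel blocks by averaging, emulating a
    merged (binned) image derived from one full-resolution readout."""
    h, w = full.shape
    return full.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# Stand-in for a full-resolution readout (the first image's pixel values).
full = np.arange(64, dtype=float).reshape(8, 8)

second = bin_image(full, 2)  # 1/4 the pixel count (first merged output mode)
third = bin_image(full, 4)   # 1/16 the pixel count (second merged output mode)

print(full.shape, second.shape, third.shape)  # (8, 8) (4, 4) (2, 2)
```

Each merged pixel here is the mean of a square block, so all three images can be produced from one readout, matching the software-combining alternative described above.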
Referring to fig. 13, in some embodiments, step 0111 includes the following steps:
01111: each first pixel value is interpolated based on a predetermined first interpolation algorithm to obtain a first image arranged in a bayer array.
Referring again to fig. 2, in some embodiments, the determining module 13 is further configured to perform step 01111. That is, the determining module 13 is further configured to interpolate each first pixel value based on a predetermined first interpolation algorithm to obtain the first image in the bayer array arrangement.
Referring again to fig. 3, in some embodiments, the processor 30 is further configured to execute step 01111. That is, the processor 30 is further configured to interpolate each first pixel value based on a predetermined first interpolation algorithm to obtain a first image arranged in a bayer array.
Referring to fig. 14, specifically, when the image output mode is determined to be the full-resolution output mode, the image sensor 21 acquires the first pixel value of each pixel to generate an original image P0, in which each pixel P01 corresponds one-to-one to a pixel 231 in the pixel array 23 (shown in fig. 8). The processor 30 then interpolates the first pixel value of each pixel P01 in the original image P0 based on the preset first interpolation algorithm, so that each first pixel value is converted into the pixel value of the corresponding target pixel P11 in the first image P1; the pixels P11 of the first image P1 correspond one-to-one to the pixels P01 of the original image P0, and the pixel at the position corresponding to a pixel to be interpolated in the first image P1 is its target pixel. As shown in fig. 14, according to the color of each pixel in the bayer-array first image P1 to be generated (color a is green, color b is red, and color c is blue), the first pixel value of a pixel P01 in the original image P0 is converted into the target pixel value of the color of the target pixel P11 in the first image P1. For example, if the first target pixel P11 in the upper left corner of the first image P1 (the target pixel of the pixel to be interpolated) is a red pixel, the processor 30 performs interpolation processing (such as averaging) using the first pixel value of the pixel to be interpolated together with the first pixel values of the surrounding red pixels P01 in the original image P0, thereby converting the first pixel value of the pixel to be interpolated into the target pixel value of the target pixel P11. In this manner, each pixel P01 in the original image P0 can be interpolated to the corresponding target pixel P11 in the first image P1 to generate the first image P1 in a bayer array arrangement.
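The averaging step of this interpolation can be illustrated with a small sketch. This is a simplified stand-in for the preset first interpolation algorithm, which the specification does not detail beyond "such as averaging"; the function name, window size, and mask layout below are all hypothetical:

```python
import numpy as np

def interpolate_to_color(raw, same_color_mask, y, x, radius=1):
    """Estimate the target-color value at (y, x) by averaging the pixel's
    own readout with neighboring pixels that already carry the target
    color (a simplified 'averaging' interpolation)."""
    h, w = raw.shape
    values = [raw[y, x]]
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            ny, nx = y + dy, x + dx
            if (dy or dx) and 0 <= ny < h and 0 <= nx < w and same_color_mask[ny, nx]:
                values.append(raw[ny, nx])
    return sum(values) / len(values)

# Hypothetical 2x2 readout and a mask marking where red is directly sampled.
raw = np.array([[10.0, 20.0],
                [30.0, 40.0]])
red_mask = np.array([[False, True],
                     [True, False]])

# Interpolate a red target value at (0, 0): own value plus two red neighbors.
print(interpolate_to_color(raw, red_mask, 0, 0))  # (10 + 20 + 30) / 3 = 20.0
```

A production remosaic algorithm would weight neighbors by distance and handle edges more carefully; the point here is only that each target pixel value is derived from the pixel's own readout plus surrounding same-color readouts.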
Referring again to fig. 13, in some embodiments, step 0112 includes:
01121: each of the second pixel values and the third pixel values is interpolated based on a predetermined second interpolation algorithm to obtain a second image arranged in a bayer array.
Referring again to fig. 2, in some embodiments, the determining module 13 is further configured to perform step 01121. That is, the determining module 13 is further configured to interpolate each of the second pixel values and the third pixel values based on a predetermined second interpolation algorithm to obtain the second image in the bayer array arrangement.
Referring again to fig. 3, in some embodiments, the processor 30 is further configured to perform step 01121. That is, the processor 30 is further configured to interpolate each of the second pixel values and the third pixel values based on a predetermined second interpolation algorithm to obtain the second image in the bayer array arrangement.
Referring to fig. 15, specifically, when the image output mode is determined to be the first merged output mode, the image sensor 21 combines and reads out the electrical signals of the 4 panchromatic pixels 2351 in the panchromatic pixel unit 235 corresponding to each panchromatic filter 224 to obtain a second pixel value, and combines and reads out the electrical signals of the 4 color pixels 2341 in the color pixel unit corresponding to each color filter 223 to obtain a third pixel value. The image sensor 21 then outputs an original image P0' according to the second pixel values and the third pixel values; the number of pixels of the original image P0' is 1/4 that of the original image P0. The processor 30 interpolates the second pixel values and the third pixel values in the original image P0' based on a preset second interpolation algorithm to obtain a second image P2 arranged in a bayer array, wherein the pixels P21 of the second image P2 correspond one-to-one to the pixels P01' of the original image P0', and the pixel at the position corresponding to a pixel to be interpolated in the second image P2 is its target pixel P21.
According to the color of each pixel in the bayer-array second image P2 to be generated (color a is green, color b is red, and color c is blue), the processor 30 may convert the second pixel value or the third pixel value of a pixel P01' in the original image P0' into the target pixel value of the color of the target pixel P21 in the second image P2. For example, if the first pixel P21 in the upper left corner of the second image P2 (the target pixel of the pixel to be interpolated) is a red pixel, the processor 30 interpolates the pixel to be interpolated according to the second pixel value of the first pixel P01' in the upper left corner of the original image P0' (i.e., the pixel to be interpolated) and the third pixel values of the surrounding red pixels P01', thereby converting the second pixel value of the pixel to be interpolated into the target pixel value of the target pixel P21. In this manner, each pixel P01' in the original image P0' can be interpolated to the corresponding target pixel P21 in the second image P2 to generate the second image P2 in a bayer array arrangement.
Referring again to fig. 13, in some embodiments, step 0113 includes:
01131: interpolating each of the fourth pixel value and the fifth pixel value based on a predetermined third interpolation algorithm to obtain a third image in a bayer array arrangement.
Referring again to fig. 2, in certain embodiments, the determining module 13 is further configured to perform step 01131. That is, the determining module 13 is further configured to interpolate each of the fourth pixel values and the fifth pixel values based on a predetermined third interpolation algorithm to obtain the third image in a bayer array arrangement.
Referring again to fig. 3, in certain embodiments, the processor 30 is further configured to perform step 01131. That is, the processor 30 is further configured to interpolate each of the fourth pixel values and the fifth pixel values based on a predetermined third interpolation algorithm to obtain the third image in a bayer array arrangement.
Specifically, when the image output mode is determined to be the second merged output mode, the image sensor 21 combines and reads out the electrical signals of the 8 panchromatic pixels 2351 in the panchromatic pixel units 235 corresponding to all the panchromatic filters 224 in each filter set 222 to obtain a fourth pixel value, and combines and reads out the electrical signals of the 8 color pixels 2341 in the color pixel units 234 corresponding to all the color filters 223 in each filter set 222 to obtain a fifth pixel value. The image sensor 21 then outputs a first intermediate image B1 and a second intermediate image B2 according to the fourth pixel values and the fifth pixel values, respectively. The processor 30 interpolates the first intermediate image B1 and the second intermediate image B2 based on a preset third interpolation algorithm to obtain a third image P3 in a bayer array arrangement. For example, the pixel values of position-corresponding pixels in the first intermediate image B1 and the second intermediate image B2 may be weighted and summed (for example, with both weights equal to 0.5) to serve as the target pixel value of the position-corresponding target pixel P31 in the third image P3: the fourth pixel value x1 of the first pixel B11 in the upper left corner of the first intermediate image B1 and the fifth pixel value x2 of the first pixel B21 in the upper left corner of the second intermediate image B2 are weighted and summed to obtain the target pixel value 0.5x1 + 0.5x2 of the first pixel P31 in the upper left corner of the third image P3. In this way, the third image P3 arranged in a bayer array is interpolated from the first intermediate image B1 and the second intermediate image B2.
It can be understood that, in the above embodiments, "pixels at corresponding positions" in different images means pixels having the same coordinates when the first pixel in the upper left corner of each image is taken as the origin of coordinates.
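The weighted summation of position-corresponding pixels described above is straightforward to express in code. The sketch below assumes equal weights of 0.5, matching the example; the intermediate-image contents are hypothetical:

```python
import numpy as np

def fuse(panchromatic, color, w_pan=0.5, w_col=0.5):
    """Weighted sum of position-matched pixels from the panchromatic and
    color intermediate images (weights of 0.5 each, as in the example)."""
    return w_pan * panchromatic + w_col * color

b1 = np.array([[100.0, 120.0],
               [140.0, 160.0]])  # first intermediate image (panchromatic)
b2 = np.array([[60.0, 80.0],
               [100.0, 120.0]])  # second intermediate image (color)

p3 = fuse(b1, b2)
print(p3[0, 0])  # 0.5 * 100 + 0.5 * 60 = 80.0
```

Because NumPy broadcasts element-wise, every target pixel P31 is computed from the pixels at the same coordinates in B1 and B2, which is exactly the position-correspondence convention stated above.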
Referring to fig. 1 and 17, an embodiment of the present application further provides one or more non-transitory computer-readable storage media 200 storing a computer program 201 that, when executed by one or more processors 300, causes the processors 300 to perform the following steps:
011: the image is output through at least one of a plurality of image output modes.
Referring to fig. 10, further, when the computer program 201 is executed by the one or more processors 300, the processors 300 may further perform the following steps:
0131: when the ambient brightness is greater than a first ambient brightness threshold, determining that the image output mode is the full-resolution output mode;
0132: when the ambient brightness is greater than a second ambient brightness threshold and less than the first ambient brightness threshold, determining that the image output mode is the first merged output mode; and
0133: when the ambient brightness is less than the second ambient brightness threshold, determining that the image output mode is the second merged output mode, wherein the first ambient brightness threshold is greater than the second ambient brightness threshold.
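Steps 0131 through 0133 amount to a simple threshold comparison, which can be sketched as follows (the threshold values and mode-name strings are illustrative, not values from this application):

```python
FULL_RESOLUTION = "full_resolution"
FIRST_MERGED = "first_merged"
SECOND_MERGED = "second_merged"

def select_output_mode(ambient_brightness, first_threshold, second_threshold):
    """Pick the image output mode from ambient brightness, following
    steps 0131-0133 (first_threshold > second_threshold). Behavior at
    exact equality is not specified in the source; here equality falls
    through to the darker-scene mode."""
    if ambient_brightness > first_threshold:
        return FULL_RESOLUTION
    if ambient_brightness > second_threshold:
        return FIRST_MERGED
    return SECOND_MERGED

# Hypothetical thresholds (e.g., in lux).
print(select_output_mode(900, 500, 100))  # full_resolution
print(select_output_mode(300, 500, 100))  # first_merged
print(select_output_mode(50, 500, 100))   # second_merged
```

The ordering reflects the rationale elsewhere in the specification: bright scenes can afford full resolution, while darker scenes trade resolution for the higher sensitivity of the merged readouts.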
Referring to fig. 18, the processor 30 of the embodiments of the present application may be an image processing circuit 80. The image processing circuit 80 may be implemented by hardware and/or software components, and includes various processing units defining an ISP (image signal processing) pipeline. FIG. 18 is a diagram of the image processing circuit 80 in one embodiment. As shown in fig. 18, for convenience of explanation, only the aspects of the image processing technique related to the embodiments of the present application are shown.
As shown in fig. 18, the image processing circuit 80 includes an ISP processor 81 and a control logic 82. The image data captured by the camera 83 is first processed by the ISP processor 81, and the ISP processor 81 analyzes the image data to capture image statistics that may be used to determine one or more control parameters of the camera 83. Camera 83 (camera 83 may be camera 20 of terminal 100 as shown in fig. 3) may include one or more lenses 832 and an image sensor 834 (image sensor 834 may be image sensor 21 of camera 20 as shown in fig. 3). Image sensor 834 may comprise a color filter array (which may be filter array 22 as shown in fig. 6), and image sensor 834 may acquire light intensity and wavelength information captured by each imaging pixel and provide a set of raw image data that may be processed by ISP processor 81. The sensor 84 (e.g., a gyroscope) may provide parameters of the acquired image processing (e.g., anti-shake parameters) to the ISP processor 81 based on the type of sensor 84 interface. The sensor 84 interface may be a SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interface, or a combination of the above.
In addition, the image sensor 834 may also send raw image data to the sensor 84, the sensor 84 may provide raw image data to the ISP processor 81 based on the sensor 84 interface type, or the sensor 84 may store raw image data in the image memory 85.
The ISP processor 81 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 81 may perform one or more image processing operations on the raw image data, gathering statistical information about the image data. Wherein the image processing operations may be performed with the same or different bit depth precision.
The ISP processor 81 may also receive image data from an image memory 85. For example, the sensor 84 interface sends raw image data to the image memory 85, and the raw image data in the image memory 85 is then provided to the ISP processor 81 for processing. The image memory 85 may be the memory 53, a portion of the memory 53, a storage device, or a separate dedicated memory within the electronic device, and may include a DMA (Direct Memory Access) feature.
Upon receiving raw image data from the image sensor 834 interface, from the sensor 84 interface, or from the image memory 85, the ISP processor 81 may perform one or more image processing operations, such as interpolation processing, median filtering, and bilateral smoothing filtering. The processed image data may be sent to the image memory 85 for additional processing before being displayed. The ISP processor 81 receives the processed data from the image memory 85 and performs image data processing on the processed data in the raw domain and in the RGB and YCbCr color spaces. The image data processed by the ISP processor 81 may be output to the display 87 (the display 87 may be the display screen 60 of the terminal 100 as shown in fig. 3) for viewing by a user and/or further processed by a Graphics Processing Unit (GPU). Further, the output of the ISP processor 81 may also be sent to the image memory 85, and the display 87 may read image data from the image memory 85. In one embodiment, the image memory 85 may be configured to implement one or more frame buffers. In addition, the output of the ISP processor 81 may be sent to an encoder/decoder 86 for encoding/decoding the image data. The encoded image data may be saved and decompressed before being displayed on the display 87. The encoder/decoder 86 may be implemented by a CPU, a GPU, or a coprocessor.
The statistical data determined by ISP processor 81 may be sent to control logic 82 unit. For example, the statistical data may include image sensor 834 statistics such as image output mode, auto-exposure, auto-white balance, auto-focus, flicker detection, black level compensation, lens 832 shading correction, and the like. Control logic 82 may include a processing element and/or microcontroller that executes one or more routines (e.g., firmware) that may determine control parameters of camera 83 and control parameters of ISP processor 81 based on the received statistical data. For example, the control parameters of camera 83 may include sensor 84 control parameters (e.g., gain, integration time for exposure control, anti-shake parameters, etc.), camera flash control parameters, lens 832 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), as well as lens 832 shading correction parameters.
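The feedback loop from ISP statistics to camera control parameters can be sketched with a toy auto-exposure update. This illustrates only the statistics-to-control data flow; the function, constants, and update rule are hypothetical and are not the actual algorithm of the control logic 82:

```python
def update_exposure(stats_mean, target_mean=128, gain=1.0, step=0.1):
    """Toy auto-exposure update: nudge the sensor gain toward the target
    mean brightness reported in the ISP statistics. All names and values
    are illustrative, not from the specification."""
    if stats_mean < target_mean:
        return gain * (1 + step)  # frame too dark: raise gain
    if stats_mean > target_mean:
        return gain * (1 - step)  # frame too bright: lower gain
    return gain                   # on target: keep gain unchanged

print(update_exposure(100))  # dark frame -> gain increases
print(update_exposure(200))  # bright frame -> gain decreases
```

A real control loop would combine several statistics (exposure, white balance, focus, flicker) and update multiple parameters per frame, but the round trip is the same: statistics out of the ISP, parameters back into the sensor and lens.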
Referring to fig. 1, the image processing circuit 80 (specifically, the ISP processor 81) performs the following steps to implement the image acquisition method:
011: the image is output through at least one of a plurality of image output modes.
Referring to fig. 10, further, the image processing circuit 80 (specifically, the ISP processor 81) may further perform the following steps:
0131: when the ambient brightness is greater than a first ambient brightness threshold, determining that the image output mode is the full-resolution output mode;
0132: when the ambient brightness is greater than a second ambient brightness threshold and less than the first ambient brightness threshold, determining that the image output mode is the first merged output mode; and
0133: when the ambient brightness is less than the second ambient brightness threshold, determining that the image output mode is the second merged output mode, wherein the first ambient brightness threshold is greater than the second ambient brightness threshold.

It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing related hardware. The program can be stored in a non-transitory computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), or the like.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the present application. It should be noted that, for those skilled in the art, several variations and improvements can be made without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (25)

1. An image acquisition method applied to an image sensor, wherein the image sensor comprises a filter array and a pixel array, the filter array comprises a minimum repeating unit, the minimum repeating unit comprises a plurality of filter sets, each filter set comprises a color filter and a panchromatic filter, the width of the wave band of the transmitted light of the color filter is smaller than the width of the wave band of the transmitted light of the panchromatic filter, each color filter and the panchromatic filter comprise a plurality of sub-filters, the pixel array comprises a plurality of pixels, each pixel corresponds to one sub-filter of the filter array, and the pixel is used for receiving the light which passes through the corresponding sub-filter to generate an electric signal; the image acquisition method comprises the following steps:
outputting an image through at least one of a plurality of image output modes including a full-resolution output mode in which a first pixel value is read out from each pixel to obtain a first image, a first combined output mode in which a second pixel value read out is combined from a plurality of pixels corresponding to the panchromatic filter and a third pixel value read out is combined from a plurality of pixels corresponding to the color filter to obtain a second image, and a second combined output mode in which a fourth pixel value read out is combined from a plurality of pixels corresponding to all of the panchromatic filters in the filter group and a fifth pixel value read out is combined from a plurality of pixels corresponding to all of the color filters to obtain a third image.
2. The image acquisition method according to claim 1, further comprising:
acquiring shooting information, wherein the shooting information comprises at least one of ambient brightness and shooting parameters; and
determining the image output mode adapted to the photographing information.
3. The image acquisition method according to claim 2, wherein the shooting parameters include exposure parameters, and the acquiring the shooting information includes:
determining the ambient brightness according to an ambient light intensity signal acquired by the optical sensor; or
And determining the ambient brightness according to the exposure parameters.
4. The image acquisition method according to claim 2, wherein the determining the image output mode adapted to the photographing information includes:
determining the image output mode adapted to the ambient brightness and/or the photographing parameters.
5. The image acquisition method according to claim 4, wherein said determining the image output mode adapted to the ambient brightness comprises:
when the ambient brightness is greater than a first ambient brightness threshold, determining that the image output mode is the full-resolution output mode;
when the ambient brightness is greater than a second ambient brightness threshold and less than the first ambient brightness threshold, determining that the image output mode is the first merged output mode; and
when the ambient brightness is less than the second ambient brightness threshold, determining that the image output mode is the second merged output mode.
6. The image acquisition method according to claim 4, wherein the shooting parameters include exposure parameters, and the determining the image output mode adapted to the ambient brightness and the shooting parameters includes:
determining a light entering amount according to the ambient brightness and the exposure parameters;
when the light entering amount is greater than a first light entering amount threshold, determining that the image output mode is the full-resolution output mode;
when the light entering amount is greater than a second light entering amount threshold and less than the first light entering amount threshold, determining that the image output mode is the first merged output mode; and
when the light entering amount is less than the second light entering amount threshold, determining that the image output mode is the second merged output mode.
7. The image acquisition method according to claim 1, wherein the outputting the image in at least one of the preset plurality of image output modes comprises:
outputting the first image in the full resolution output mode; and/or
Outputting the second image through the first merged output mode; and/or
Outputting the third image through the second merged output mode.
8. The image acquisition method according to claim 7, wherein said outputting the first image in the full resolution output mode comprises:
interpolating each of the first pixel values based on a predetermined first interpolation algorithm to obtain the first image in a bayer array arrangement.
9. The image acquisition method according to claim 7, wherein said outputting the second image in the first merged output mode comprises:
interpolating each of the second pixel value and the third pixel value based on a predetermined second interpolation algorithm to obtain the second image in a bayer array arrangement.
10. The image acquisition method according to claim 7, wherein said outputting a third image in the second merged output mode comprises:
interpolating each of the fourth pixel value and the fifth pixel value based on a predetermined third interpolation algorithm to obtain the third image in a bayer array arrangement.
11. The image capturing method as claimed in claim 1, wherein the number of the filter sets is 4, 4 filter sets are arranged in a matrix, and each of the color filter and the panchromatic filter includes 4 sub-filters.
12. The method of claim 1, wherein the filter set comprises 2 color filters and 2 panchromatic filters, wherein the 2 color filters and the 2 panchromatic filters are arranged in a matrix, wherein the 2 color filters are arranged in a first diagonal direction, and wherein the 2 panchromatic filters are arranged in a second diagonal direction, and wherein the first diagonal direction and the second diagonal direction are different.
13. The image capturing method of claim 12, wherein the minimum repeating unit is 64 sub-filters arranged in 8 rows and 8 columns in the following manner:
Figure FDA0002716061230000021
where w denotes a panchromatic sub-filter, and a, b, and c each denote a color sub-filter.
14. The image capturing method of claim 12, wherein the minimum repeating unit is 64 sub-filters arranged in 8 rows and 8 columns in the following manner:
Figure FDA0002716061230000022
where w denotes a panchromatic sub-filter, and a, b, and c each denote a color sub-filter.
15. The image capturing method according to claim 1, wherein in each of the filter sets, the panchromatic filter is disposed on a third diagonal line and a fourth diagonal line, the color filter is disposed in a third diagonal direction or a fourth diagonal direction, and the third diagonal direction is different from the fourth diagonal direction.
16. The image acquisition method of claim 15, wherein the minimum repeating unit is 144 sub-filters arranged in 12 rows and 12 columns in the following manner:
Figure FDA0002716061230000023
Figure FDA0002716061230000031
where w denotes a panchromatic sub-filter, and a, b, and c each denote a color sub-filter.
17. The image acquisition method of claim 1, wherein in each of the filter sets, the color filters are disposed on a fifth diagonal and a sixth diagonal, and the panchromatic filter is disposed in a fifth diagonal direction or a sixth diagonal direction, the fifth diagonal direction being different from the sixth diagonal direction.
18. The image acquisition method of claim 17, wherein the minimum repeating unit is 144 sub-filters arranged in 12 rows and 12 columns in the following manner:
Figure FDA0002716061230000032
where w denotes a panchromatic sub-filter, and a, b, and c each denote a color sub-filter.
19. An image acquisition device, applied to an image sensor, wherein the image sensor comprises a filter array and a pixel array, the filter array comprises a minimum repeating unit, the minimum repeating unit comprises a plurality of filter sets, each filter set comprises a color filter and a panchromatic filter, the width of the wave band of the transmitted light of the color filter is smaller than that of the wave band of the transmitted light of the panchromatic filter, each color filter and the panchromatic filter comprise a plurality of sub-filters, the pixel array comprises a plurality of pixels, each pixel corresponds to one sub-filter of the filter array, and the pixel is used for receiving the light passing through the corresponding sub-filter to generate an electric signal; the image acquisition apparatus includes:
the device comprises an acquisition module, a processing module and a display module, wherein the acquisition module is used for acquiring shooting information which comprises at least one of ambient brightness and shooting parameters;
a first determination module configured to determine an image output mode adapted to the shooting information, the image output mode including a full-resolution output mode in which a first pixel value read out from each pixel is used to obtain a first image, a first merged output mode in which a second pixel value read out from a plurality of pixels corresponding to the panchromatic filter and a third pixel value read out from a plurality of pixels corresponding to the color filter are merged to obtain a second image, and a second merged output mode in which a fourth pixel value read out from a plurality of pixels corresponding to all the panchromatic filters in the filter group and a fifth pixel value read out from a plurality of pixels corresponding to all the color filters are merged to obtain a third image; and
a second determining module for outputting an image according to the adapted image output mode.
20. A terminal comprising an image sensor and a processor, the image sensor comprising a filter array, a pixel array, and a readout circuit, the filter array comprising a minimum repeating unit, the minimum repeating unit comprising a plurality of filter sets, each filter set comprising a color filter and a panchromatic filter, the band of wavelengths of light transmitted by the color filter being narrower than the band of wavelengths of light transmitted by the panchromatic filter, the color filter and the panchromatic filter each comprising a plurality of sub-filters; the pixel array comprises a plurality of pixels, each pixel corresponds to one sub-filter of the filter array, and the pixels are used for receiving light rays passing through the corresponding sub-filter to generate electric signals; the processor is configured to:
acquiring shooting information, wherein the shooting information comprises at least one of ambient brightness and shooting parameters;
determining an image output mode adapted to the photographing information, the image output mode including a full-resolution output mode in which a first pixel value read out from each pixel is used to obtain a first image, a first combined output mode in which a second pixel value read out from a plurality of pixels corresponding to the panchromatic filter and a third pixel value read out from a plurality of pixels corresponding to the color filter are combined to obtain a second image, and a second combined output mode in which a fourth pixel value read out from a plurality of pixels corresponding to all the panchromatic filters in the filter group and a fifth pixel value read out from a plurality of pixels corresponding to all the color filters are combined to obtain a third image; and
and controlling the readout circuit to output an image according to the adapted image output mode.
21. The terminal of claim 20, wherein the processor is further configured to determine the image output mode adapted to the ambient brightness and/or the photographing parameters.
22. The terminal of claim 21, wherein the processor is further configured to:
when the ambient brightness is greater than a first ambient brightness threshold, determining that the image output mode is the full-resolution output mode;
when the ambient brightness is greater than a second ambient brightness threshold and less than the first ambient brightness threshold, determining that the image output mode is the first merged output mode; and
when the ambient brightness is less than the second ambient brightness threshold, determining that the image output mode is the second merged output mode.
23. The terminal of claim 21, wherein the processor is further configured to:
determining a light entering amount according to the ambient brightness and the exposure parameters;
when the light entering amount is greater than a first light entering amount threshold, determining that the image output mode is the full-resolution output mode;
when the light entering amount is greater than a second light entering amount threshold and less than the first light entering amount threshold, determining that the image output mode is the first merged output mode; and
when the light entering amount is less than the second light entering amount threshold, determining that the image output mode is the second merged output mode.
24. The terminal of claim 20, wherein the processor is further configured to:
outputting the first image according to the full resolution output mode; and/or
Outputting the second image according to the first merged output mode; and/or
Outputting the third image according to the second merged output mode.
25. A non-transitory computer readable storage medium containing a computer program which, when executed by one or more processors, causes the processors to perform the image acquisition method of any one of claims 1 to 18.
CN202011073863.3A 2020-10-09 2020-10-09 Image acquisition method and device, terminal and computer readable storage medium Pending CN112118378A (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN202011073863.3A CN112118378A (en) 2020-10-09 2020-10-09 Image acquisition method and device, terminal and computer readable storage medium
EP21876886.9A EP4216534A4 (en) 2020-10-09 2021-07-09 Image obtaining method and apparatus, terminal, and computer readable storage medium
PCT/CN2021/105464 WO2022073364A1 (en) 2020-10-09 2021-07-09 Image obtaining method and apparatus, terminal, and computer readable storage medium
US18/193,134 US20230254553A1 (en) 2020-10-09 2023-03-30 Image obtaining method and apparatus, terminal, and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011073863.3A CN112118378A (en) 2020-10-09 2020-10-09 Image acquisition method and device, terminal and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN112118378A (en) 2020-12-22

Family

ID=73797410

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011073863.3A Pending CN112118378A (en) 2020-10-09 2020-10-09 Image acquisition method and device, terminal and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN112118378A (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113556519A (en) * 2021-07-01 2021-10-26 Oppo广东移动通信有限公司 Image processing method, electronic device, and non-volatile computer-readable storage medium
CN113676708A (en) * 2021-07-01 2021-11-19 Oppo广东移动通信有限公司 Image generation method and device, electronic equipment and computer-readable storage medium
CN113676675A (en) * 2021-08-16 2021-11-19 Oppo广东移动通信有限公司 Image generation method and device, electronic equipment and computer-readable storage medium
CN113840067A (en) * 2021-09-10 2021-12-24 Oppo广东移动通信有限公司 Image sensor, image generation method and device and electronic equipment
CN114222047A (en) * 2021-12-27 2022-03-22 Oppo广东移动通信有限公司 Focus control method, apparatus, image sensor, electronic device, and computer-readable storage medium
WO2022226858A1 (en) * 2021-04-28 2022-11-03 江苏南大五维电子科技有限公司 Dim light color camera
WO2023082766A1 (en) * 2021-11-12 2023-05-19 Oppo广东移动通信有限公司 Image sensor, camera module, electronic device, and image generation method and apparatus
WO2023087908A1 (en) * 2021-11-22 2023-05-25 Oppo广东移动通信有限公司 Focusing control method and apparatus, image sensor, electronic device, and computer readable storage medium
WO2023098284A1 (en) * 2021-12-01 2023-06-08 Oppo广东移动通信有限公司 Image sensor, camera module, electronic device, and image generation method and apparatus
WO2023098282A1 (en) * 2021-12-01 2023-06-08 Oppo广东移动通信有限公司 Image sensor, camera module, electronic device, and image generation method and apparatus
WO2023109265A1 (en) * 2021-12-14 2023-06-22 Oppo广东移动通信有限公司 Image sensor, photographing module, electronic device, and image generation method and apparatus
WO2023109264A1 (en) * 2021-12-14 2023-06-22 Oppo广东移动通信有限公司 Image sensor, camera module, electronic device, and image generation method and apparatus

Similar Documents

Publication Publication Date Title
CN213279832U (en) Image sensor, camera and terminal
CN112118378A (en) Image acquisition method and device, terminal and computer readable storage medium
CN108322669B (en) Image acquisition method and apparatus, imaging apparatus, and readable storage medium
CN108712608B (en) Terminal equipment shooting method and device
CN108989700B (en) Imaging control method, imaging control device, electronic device, and computer-readable storage medium
US8355074B2 (en) Exposing pixel groups in producing digital images
US6982756B2 (en) Digital camera, image signal processing method and recording medium for the same
CN111491110B (en) High dynamic range image processing system and method, electronic device, and storage medium
CN111711755B (en) Image processing method and device, terminal and computer readable storage medium
CN104038702B (en) Picture pick-up device and its control method
US20090219419A1 (en) Peripheral Light Amount Correction Apparatus, Peripheral Light Amount Correction Method, Electronic Information Device, Control Program and Readable Recording Medium
CN107509044B (en) Image synthesis method, image synthesis device, computer-readable storage medium and computer equipment
US9160937B2 (en) Signal processing apparatus and signal processing method, solid-state imaging apparatus, electronic information device, signal processing program, and computer readable storage medium
CN108683863B (en) Imaging control method, imaging control device, electronic equipment and readable storage medium
CN108419022A (en) Control method, control device, computer readable storage medium and computer equipment
CN113676675B (en) Image generation method, device, electronic equipment and computer readable storage medium
CN113170061B (en) Image sensor, imaging device, electronic apparatus, image processing system, and signal processing method
CN110166707A (en) Image processing method, device, electronic equipment and storage medium
CN113840067B (en) Image sensor, image generation method and device and electronic equipment
CN110166706A (en) Image processing method, device, electronic equipment and storage medium
CN114125242A (en) Image sensor, camera module, electronic equipment, image generation method and device
CN107194901B (en) Image processing method, image processing device, computer equipment and computer readable storage medium
CN111711766B (en) Image processing method and device, terminal and computer readable storage medium
CN107341782B (en) Image processing method, image processing device, computer equipment and computer readable storage medium
US20230247308A1 (en) Image processing method, camera assembly and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination