CN114125319A - Image sensor, camera module, image processing method and device and electronic equipment - Google Patents

Image sensor, camera module, image processing method and device and electronic equipment

Info

Publication number
CN114125319A
Authority
CN
China
Prior art keywords: image, sub-pixel unit, pixel, blue
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111445496.XA
Other languages
Chinese (zh)
Inventor
袁旺程
何循洁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202111445496.XA priority Critical patent/CN114125319A/en
Publication of CN114125319A publication Critical patent/CN114125319A/en
Pending legal-status Critical Current

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90: Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H04N23/50: Constructional details

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

The application discloses an image sensor, which belongs to the technical field of image processing. The image sensor comprises a plurality of color pixel units and a plurality of brightness pixel units. Each color pixel unit comprises a blue sub-pixel unit, a green sub-pixel unit and a red sub-pixel unit which are stacked in a preset order; each brightness pixel unit comprises at least one of a visible photon pixel unit and an infrared photon pixel unit, the visible photon pixel unit being used for collecting visible light signals in target light and the infrared photon pixel unit being used for collecting near infrared light signals in the target light. The color pixel units and the brightness pixel units are arranged at intervals according to a preset proportion.

Description

Image sensor, camera module, image processing method and device and electronic equipment
Technical Field
The application belongs to the technical field of image processing, and particularly relates to an image sensor, a camera module, an image processing method and device and electronic equipment.
Background
The imaging experience of electronic devices is increasingly important: users are ever more perceptive of image quality, and the demands and requirements on photographing keep rising. Technologies that enhance the photographing experience under low illumination by increasing photosensitive performance are currently developed on the basis of the Bayer format.
In an image sensor based on the RGBW-Kodak format, each pixel unit corresponds to one color, and red R, green G, blue B and white W are arranged in a ratio of 1:2:1:4; among the color pixels, 50% collect green signals and 25% each collect red and blue signals, as shown in Fig. 1. Each pixel unit can only capture the intensity of one of the red, green, blue and white light waves, so 3/4 of the green light and 7/8 of the red and blue light are lost before the three primary color data are output to the image processor for processing. Since each pixel unit records only one color signal, the other two missing color components must be estimated by an algorithm in order to restore the true color, which may cause false-color problems.
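For background only, the per-channel sampling arithmetic above can be checked with a short sketch (the 1:2:1:4 ratio is taken from the text; the code itself is illustrative and not part of the application):

```python
# Illustrative check of per-channel sampling in an RGBW-Kodak layout.
# Ratios R:G:B:W = 1:2:1:4 over an 8-pixel repeating group.
ratios = {"R": 1, "G": 2, "B": 1, "W": 4}
total = sum(ratios.values())

for channel, count in ratios.items():
    sampled = count / total        # fraction of pixels that record this channel
    lost = 1.0 - sampled           # fraction of positions that must be interpolated
    print(f"{channel}: sampled at {sampled:.1%}, interpolated at {lost:.1%}")

# G is sampled at 25.0% of pixels, so 3/4 of the green signal positions are estimated;
# R and B are sampled at 12.5% each, so 7/8 of the red and blue positions are estimated.
```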
Disclosure of Invention
The embodiment of the application aims to provide an image sensor, a camera module, an image processing method, an image processing device and electronic equipment, which can restore the truest color information of a shot object, maximize the low-illumination light capturing capability of the image sensor, realize the best signal-to-noise ratio and definition, and improve the image quality level and the user experience of a final image.
In a first aspect, an embodiment of the present application provides an image sensor, including a plurality of color pixel units and a plurality of brightness pixel units: the color pixel unit comprises a blue sub-pixel unit, a green sub-pixel unit and a red sub-pixel unit which are stacked and arranged according to a preset sequence; the brightness pixel unit comprises at least one of a visible photon pixel unit and an infrared photon pixel unit; the visible photon pixel unit is used for collecting visible light signals in target light, and the infrared photon pixel unit is used for collecting near infrared light signals in the target light; the color pixel units and the brightness pixel units are arranged at intervals according to a preset proportion.
In a second aspect, an embodiment of the present application provides a camera module, which includes the image sensor according to the first aspect.
In a third aspect, an embodiment of the present application provides an electronic device, including the image pickup module according to the second aspect.
In a fourth aspect, an embodiment of the present application provides an image processing method applied to an image processing apparatus including the image sensor according to the first aspect, the method including:
the method comprises the steps of obtaining a blue sub-image, a green sub-image, a red sub-image and a brightness sub-image, wherein the blue sub-image is obtained based on a blue light signal collected by a blue sub-pixel unit, the green sub-image is obtained based on a green light signal collected by a green sub-pixel unit, the red sub-image is obtained based on a red light signal collected by a red sub-pixel unit, and the brightness sub-image is obtained based on a brightness light signal collected by a brightness pixel unit; and carrying out image synthesis on the blue sub-image, the green sub-image, the red sub-image and the brightness sub-image, and outputting a target color image.
In a fifth aspect, an embodiment of the present application provides an image processing apparatus, including an analog-to-digital conversion module, an image processing module, and the image sensor according to the first aspect,
the image sensor is used for respectively collecting a blue light signal, a green light signal, a red light signal and a brightness light signal in the target light, and correspondingly converting the blue light signal into a blue electric signal, the green light signal into a green electric signal, the red light signal into a red electric signal and the brightness light signal into a brightness electric signal;
the analog-to-digital conversion module is connected with the image sensor and converts the blue electric signal output by the image sensor into a blue digital signal, the green electric signal into a green digital signal, the red electric signal into a red digital signal and the brightness electric signal into a brightness digital signal;
the image processing module is connected with the analog-to-digital conversion module, and is used for carrying out image synthesis on the blue sub-image of the blue digital signal, the green sub-image of the green digital signal, the red sub-image of the red digital signal and the brightness sub-image of the brightness digital signal output by the analog-to-digital conversion module and outputting a target color image.
In a sixth aspect, the present application provides an electronic device, which includes a processor and a memory, where the memory stores a program or instructions executable on the processor, and the program or instructions, when executed by the processor, implement the steps of the method according to the fourth aspect.
In a seventh aspect, the present application provides a readable storage medium, on which a program or instructions are stored, and when executed by a processor, the program or instructions implement the steps of the method according to the fourth aspect.
In an eighth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the fourth aspect.
In a ninth aspect, the present application provides a computer program product, which is stored in a storage medium and executed by at least one processor to implement the method according to the fourth aspect.
In the embodiment of the application, the image sensor comprises a plurality of color pixel units and a plurality of brightness pixel units, wherein the color pixel units comprise a blue sub-pixel unit, a green sub-pixel unit and a red sub-pixel unit which are stacked and arranged according to a preset sequence, the brightness pixel units comprise at least one of a visible photon pixel unit and an infrared photon pixel unit, and the color pixel units and the brightness pixel units are arranged at intervals according to a preset proportion. Therefore, each color pixel unit can acquire R/G/B three-color signals at the same time, and the truest color information of the object is restored. Each brightness pixel unit can capture the full band of visible light by using the visible photon pixel unit and/or capture an infrared-band signal by using the infrared photon pixel unit, so that the low-illumination light capturing capability of the image sensor is maximized, the optimal signal-to-noise ratio and sharpness are realized, and the image quality level and the user experience of the final image are improved. In addition, the signal sampling rate of the present application is greatly improved compared with a traditional image sensor.
Drawings
Fig. 1 is a schematic diagram of an arrangement layout of pixel units of an image sensor.
Fig. 2A and 2B are schematic structural diagrams of a pixel unit of an image sensor according to an embodiment of the present application.
Fig. 3 is a schematic diagram of optical wave filtering of a pixel unit of an image sensor according to an embodiment of the present application.
Fig. 4 is a schematic layout diagram of an arrangement of pixel units of an image sensor according to an embodiment of the present application.
Fig. 5 is a flowchart illustrating an image processing method according to an embodiment of the present application.
Fig. 6A is a diagram illustrating an example of an arrangement matrix of each sub-pixel unit of the image sensor according to the embodiment of the present application.
Fig. 6B is an exemplary diagram of a captured image corresponding to each sub-pixel unit of the image sensor according to the embodiment of the present application.
Fig. 6C is an exemplary diagram of an image after a pixel completion process corresponding to each sub-pixel unit of the image sensor according to the embodiment of the present application.
Fig. 6D is an exemplary image after image fusion processing corresponding to each sub-pixel unit of the image sensor according to the embodiment of the present application.
Fig. 7 is a block diagram showing the configuration of an image processing apparatus according to an embodiment of the present application.
Fig. 8 is a block diagram of an electronic device according to an embodiment of the present application.
Fig. 9 is a schematic hardware structure diagram of an electronic device implementing an embodiment of the present application.
Detailed Description
Technical solutions in the embodiments of the present application will be clearly described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present disclosure.
The terms "first", "second" and the like in the description and in the claims of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that the embodiments of the application may be practiced in sequences other than those illustrated or described herein. In addition, the terms "first", "second" and the like are generally used in a generic sense and do not limit the number of the objects they modify; for example, the first object may be one or more than one. Furthermore, "and/or" in the specification and claims means at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and following objects.
The image sensor, the camera module, the image processing method, the image processing device, and the electronic apparatus provided in the embodiments of the present application are described in detail below with reference to the accompanying drawings.
The embodiment of the application provides an image sensor, which comprises a plurality of color pixel units and a plurality of brightness pixel units, wherein the color pixel units comprise a blue sub-pixel unit, a green sub-pixel unit and a red sub-pixel unit which are stacked and arranged according to a preset sequence; the brightness pixel unit comprises at least one of a visible photon pixel unit and an infrared photon pixel unit; the visible photon pixel unit is used for collecting visible light signals in target light rays, and the infrared photon pixel unit is used for collecting near infrared light signals in the target light rays; the color pixel units and the brightness pixel units are arranged at intervals according to a preset proportion.
The blue sub-pixel unit collects blue signals in target light, the green sub-pixel unit collects green signals in the target light, and the red sub-pixel unit collects red signals in the target light.
Fig. 2A and 2B are schematic structural diagrams of a pixel unit of an image sensor according to an embodiment of the present disclosure. As shown in Fig. 2A, a color pixel unit includes a blue (B) sub-pixel unit 12, a green (G) sub-pixel unit 14 and a red (R) sub-pixel unit 16, where the B sub-pixel unit 12, the G sub-pixel unit 14 and the R sub-pixel unit 16 are stacked in order from top to bottom.
In one embodiment, the luminance pixel unit may include only a visible light (W) sub-pixel unit 22. In other embodiments, the luminance pixel unit may include only an infrared light (NIR) sub-pixel unit 24.
Optionally, the luminance pixel unit includes a visible photon pixel unit and an infrared photon pixel unit, and the visible photon pixel unit and the infrared photon pixel unit are stacked in a preset order.
As shown in fig. 2B, in the embodiment where the luminance pixel unit includes both the W sub-pixel unit 22 and the NIR sub-pixel unit 24, the W sub-pixel unit 22 and the NIR sub-pixel unit 24 are stacked in order from top to bottom.
Optionally, at least one of the red sub-pixel unit, the green sub-pixel unit, the blue sub-pixel unit, the visible photon pixel unit and the infrared photon pixel unit is an organic photoconductive film.
In the embodiments of the present application, the organic photoconductive film has both light filtering and light sensing functions. The blue sub-pixel unit can absorb the short-wavelength blue light in the target light (white light), thereby capturing and collecting the blue light while allowing the mid-wavelength green light and the long-wavelength red light to pass through. The green sub-pixel unit can absorb the mid-wavelength green light in the target light, thereby capturing and collecting the green light while allowing the long-wavelength red light to pass through. The red sub-pixel unit can absorb the long-wavelength red light in the target light, thereby capturing and collecting the red light. The visible photon pixel unit can absorb visible light of the whole band in the target light, thereby capturing and collecting the visible light while allowing the longer-wavelength infrared and near-infrared light to pass through. The infrared photon pixel unit can absorb the near-infrared light in the target light, thereby capturing and collecting the near-infrared light.
Therefore, by adopting the organic photoconductive film in at least one of the red sub-pixel unit, the green sub-pixel unit, the blue sub-pixel unit, the visible photon pixel unit and the infrared photon pixel unit, light filtering and light sensing can be realized simultaneously, avoiding the high manufacturing cost and complexity caused by using a separate optical filter and light sensing element.
Fig. 3 shows the light wave filtering process of a pixel unit of an image sensor according to an embodiment of the present application. As shown in Fig. 3, the B sub-pixel unit 12, the G sub-pixel unit 14 and the R sub-pixel unit 16 are stacked in sequence. After the target light enters the B sub-pixel unit 12, the blue light 2 therein is absorbed and collected by the B sub-pixel unit 12, while the green light 4 and the red light 6 pass through the B sub-pixel unit 12 and enter the G sub-pixel unit 14. The G sub-pixel unit 14 absorbs and collects the green light 4 and allows the red light 6 to pass through. The red light 6 then enters the R sub-pixel unit 16 and is absorbed and collected by the R sub-pixel unit 16.
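A minimal sketch of the stacked filtering just described, assuming idealized, non-overlapping absorption bands (the band edges in nanometres are illustrative assumptions, not values from the application):

```python
# Hypothetical band edges (nm) for the stacked B/G/R layers; real cut-offs depend
# on the organic photoconductive films and are not specified in the application.
LAYERS = [
    ("B", 400, 500),   # top layer absorbs short-wavelength blue light
    ("G", 500, 600),   # middle layer absorbs mid-wavelength green light
    ("R", 600, 700),   # bottom layer absorbs long-wavelength red light
]

def stacked_capture(wavelengths_nm):
    """Return which layer of one stacked color pixel unit captures each wavelength."""
    captured = {name: [] for name, _, _ in LAYERS}
    remaining = list(wavelengths_nm)
    for name, lo, hi in LAYERS:                                   # light traverses top to bottom
        captured[name] = [w for w in remaining if lo <= w < hi]   # absorbed in this layer
        remaining = [w for w in remaining if not (lo <= w < hi)]  # passes to the next layer
    return captured

print(stacked_capture([450, 550, 650]))  # {'B': [450], 'G': [550], 'R': [650]}
```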
As described above, the color pixel units and the brightness pixel units are arranged at intervals according to a preset proportion. Optionally, the preset proportion is 0.3-3.
Therefore, the image sensor obtained by arranging the color pixel unit and the brightness pixel unit at intervals can better balance the color of the shot object and the brightness under low-illumination light, and the most real color information of the shot object can be restored, and the signal-to-noise ratio and the definition of an image under low-illumination light can be realized.
Optionally, the preset ratio comprises 1:1, 1:2 or 2:1.
In order to achieve a better color effect of the image, the number of the arranged color pixel units 10 may be larger than the number of the arranged brightness pixel units 20, for example, a preset ratio of 2:1. Conversely, in order to achieve a better brightness effect of the image, the number of the arranged brightness pixel units 20 may be larger than the number of the arranged color pixel units 10, for example, a preset ratio of 1:2.
To better balance the color and brightness effects, the preset ratio is optionally 1:1.
As shown in fig. 4, the color pixel cells 10 and the luminance pixel cells 20 of the image sensor in this embodiment are arranged at 1:1 proportional intervals.
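As an illustrative sketch only (the actual layout is the one shown in Fig. 4; the checkerboard pattern below is just one possible way to realize a 1:1 interval arrangement):

```python
import numpy as np

def interleaved_layout(rows, cols):
    """Mark each position 'C' for a stacked color pixel unit (B/G/R layers)
    or 'L' for a luminance pixel unit (W and/or NIR), alternating 1:1."""
    layout = np.empty((rows, cols), dtype="<U1")
    for r in range(rows):
        for c in range(cols):
            layout[r, c] = "C" if (r + c) % 2 == 0 else "L"
    return layout

print(interleaved_layout(4, 4))
# [['C' 'L' 'C' 'L']
#  ['L' 'C' 'L' 'C']
#  ['C' 'L' 'C' 'L']
#  ['L' 'C' 'L' 'C']]
```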
In the embodiment of the application, the image sensor comprises a plurality of color pixel units and a plurality of brightness pixel units, wherein the color pixel units comprise a blue sub-pixel unit, a green sub-pixel unit and a red sub-pixel unit which are stacked and arranged according to a preset sequence, the brightness pixel units comprise at least one of a visible photon pixel unit and an infrared photon pixel unit, and the color pixel units and the brightness pixel units are arranged at intervals according to a preset proportion. Therefore, each color pixel unit can acquire R/G/B three-color signals at the same time, and the truest color information of the object is restored. Each brightness pixel unit can capture the full wave band of visible light by using the visible photon pixel unit and/or capture the infrared wave band signal by using the infrared photon pixel unit, so that the low-illumination light capturing capability of the image sensor is maximized, and the optimal signal-to-noise ratio and definition are realized.
In addition, the signal sampling rate of the present application is greatly improved compared with a traditional image sensor, so that problems such as false color caused by traditional image sensors can also be greatly reduced. Compared with a traditional sensor, the finally output color image has sharper and more realistic colors and better color resolution.
In an embodiment, the present application further provides a camera module including the image sensor according to any of the embodiments of Fig. 2A to Fig. 4.
The camera module of the embodiment of the present application includes: an image sensor, a circuit board and a lens. The image sensor is electrically connected with the circuit board; the lens is arranged on the side of the image sensor facing away from the circuit board.
In one embodiment, the lens may include one or more lenses.
By including the image sensor of the embodiment of the present application in the camera module, the images captured by the camera module can restore the truest color information of the photographed object and have a high signal-to-noise ratio and sharpness even when shot under low-illumination light.
Optionally, as shown in fig. 5, an embodiment of the present application further provides an image processing method, which is applied to an image processing apparatus including the image sensor of any of the above embodiments.
Fig. 5 is a schematic flowchart of an image processing method according to an embodiment of the present application, where the method includes the following steps:
Step 102: obtain a blue sub-image, a green sub-image, a red sub-image and a brightness sub-image, wherein the blue sub-image is obtained based on a blue light signal collected by a blue sub-pixel unit, the green sub-image is obtained based on a green light signal collected by a green sub-pixel unit, the red sub-image is obtained based on a red light signal collected by a red sub-pixel unit, and the brightness sub-image is obtained based on a brightness light signal collected by a brightness pixel unit.
Step 104: perform image synthesis on the blue sub-image, the green sub-image, the red sub-image and the brightness sub-image, and output a target color image.
Each sub-pixel unit of the image sensor has both functions of filtering light and sensing light, and in step 102, each sub-pixel unit acquires a corresponding light signal and obtains a corresponding sub-image through signal conversion. In step 104, the sub-images are combined to obtain a color image.
Optionally, in step 102, the acquiring a blue sub-image, a green sub-image, a red sub-image, and a luminance sub-image includes: acquiring a blue light signal collected by the blue sub-pixel unit, a green light signal collected by the green sub-pixel unit, a red light signal collected by the red sub-pixel unit and a brightness light signal collected by the brightness pixel unit; obtaining a blue electric signal obtained by converting the blue light signal by the blue sub-pixel unit, a green electric signal obtained by converting the green light signal by the green sub-pixel unit, a red electric signal obtained by converting the red light signal by the red sub-pixel unit, and a brightness electric signal obtained by converting the brightness light signal by the brightness pixel unit; acquiring a blue digital signal obtained by performing analog-to-digital conversion on the blue electric signal, a green digital signal obtained by performing analog-to-digital conversion on the green electric signal, a red digital signal obtained by performing analog-to-digital conversion on the red electric signal, and a brightness digital signal obtained by performing analog-to-digital conversion on the brightness electric signal; and acquiring a blue sub-image of the blue digital signal, a green sub-image of the green digital signal, a red sub-image of the red digital signal and a brightness sub-image of the brightness digital signal.
Corresponding light signals can be collected by the color pixel units of each color of the image sensor and converted into electric signals, thereby realizing the filtering and sensing of red light, green light, blue light and luminance light. An analog-to-digital conversion circuit may then be used to convert the electric signals obtained by each sub-pixel unit into digital signals in a one-to-one correspondence, so as to form a digital signal matrix, i.e. an image, corresponding to each sub-pixel unit.
The light signals collected by the various pixel units of the image sensor are correspondingly converted into electric signals, realizing the filtering and sensing of red light, green light, blue light and luminance light; the electric signals are further converted into corresponding digital signals by analog-to-digital conversion, and the blue sub-image, green sub-image, red sub-image and luminance sub-image corresponding to the collected light signals are then obtained, which improves the efficiency of image acquisition and processing.
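A hedged sketch of this acquisition chain for a single pixel value (the responsivity, gain and 10-bit ADC depth are illustrative assumptions, not parameters from the application):

```python
import numpy as np

BIT_DEPTH = 10                       # assumed ADC bit depth
FULL_SCALE = 2 ** BIT_DEPTH - 1

def photo_convert(light, responsivity=0.8):
    """Sub-pixel unit: optical signal -> analog electric signal (assumed linear)."""
    return responsivity * light

def adc(analog, gain=1.0):
    """Analog-to-digital conversion: electric signal -> digital code."""
    code = np.round(gain * analog * FULL_SCALE)
    return int(np.clip(code, 0, FULL_SCALE))

# One digital value per channel; the light levels are normalized to [0, 1].
light = {"B": 0.20, "G": 0.55, "R": 0.30, "W": 0.80}
digital = {ch: adc(photo_convert(v)) for ch, v in light.items()}
print(digital)  # {'B': 164, 'G': 450, 'R': 246, 'W': 655} under the assumed gain
```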
Optionally, in step 104, the image synthesizing the blue sub-image, the green sub-image, the red sub-image, and the luminance sub-image to output a target color image includes: performing pixel completion processing on blank pixel units in the blue sub-image, the green sub-image, the red sub-image and the brightness sub-image respectively; and carrying out image fusion processing on the blue sub-image, the green sub-image, the red sub-image and the brightness sub-image after the pixel completion processing to obtain a target color image.
Because the color pixel units and the luminance pixel units are arranged at intervals according to a preset proportion, the blue sub-image, the green sub-image, the red sub-image and the luminance sub-image obtained in step 102 each contain a corresponding proportion of blank pixel units, that is, pixel units without pixel values. Therefore, pixel completion is performed on each sub-image to restore the real color of the photographed object corresponding to that sub-image. Image fusion processing is then performed on the completed sub-images to improve the signal-to-noise ratio and sharpness of images shot under low-illumination light.
Referring now to Fig. 6A to 6D: as shown in Fig. 6A, the sub-pixel unit arrangement matrix C of the image sensor contains color pixel units R/G/B, each including a blue sub-pixel unit, a green sub-pixel unit and a red sub-pixel unit stacked in a preset order, and luminance pixel units W/NIR, each including a visible photon pixel unit and an infrared photon pixel unit stacked in a preset order; the color pixel units R/G/B and the luminance pixel units W/NIR are arranged at a 1:1 ratio interval.
By collecting the optical signals, converting the optical signals into electric signals and converting the electric signals into digital signals, the sub-pixel units in the arrangement matrix C respectively yield a red sub-image R corresponding to the red sub-pixel units, a green sub-image G corresponding to the green sub-pixel units, a blue sub-image B corresponding to the blue sub-pixel units, a visible photon image W corresponding to the visible photon pixel units, and a near-infrared sub-image N corresponding to the infrared photon pixel units, as shown in Fig. 6B.
Since the color pixel units and the luminance pixel units are arranged at a 1:1 ratio interval, half of the positions in each obtained sub-image are blank pixel units. Therefore, pixel completion processing needs to be performed on the blank pixel units in each sub-image, as sketched below.
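A small illustrative sketch of this splitting step (the 4x4 size, the checkerboard layout and the use of NaN to mark blank pixel units are assumptions for demonstration, not details from Fig. 6A/6B):

```python
import numpy as np

def split_sub_images(color_stack, lum_stack):
    """color_stack: HxWx3 readout of the stacked color pixel units (R, G, B layers).
    lum_stack: HxWx2 readout of the luminance pixel units (W, NIR layers).
    Returns one sub-image per channel, with NaN at the blank positions."""
    h, w, _ = color_stack.shape
    color_mask = np.indices((h, w)).sum(axis=0) % 2 == 0   # assumed 1:1 interval layout
    subs = {}
    for i, name in enumerate(("R", "G", "B")):
        plane = np.full((h, w), np.nan)
        plane[color_mask] = color_stack[..., i][color_mask]
        subs[name] = plane
    for i, name in enumerate(("W", "N")):
        plane = np.full((h, w), np.nan)
        plane[~color_mask] = lum_stack[..., i][~color_mask]
        subs[name] = plane
    return subs

subs = split_sub_images(np.random.rand(4, 4, 3), np.random.rand(4, 4, 2))
print({k: int(np.isnan(v).sum()) for k, v in subs.items()})  # 8 blank cells per 4x4 sub-image
```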
Optionally, performing pixel completion processing on each blank pixel unit in the blue sub-image, the green sub-image, the red sub-image, and the luminance sub-image respectively, includes: acquiring pixel values of color pixel units adjacent to each blank pixel unit, wherein each color pixel unit comprises a blue pixel unit, a green pixel unit or a red pixel unit; the pixel value of each blank pixel cell is determined based on the pixel values of adjacent color pixel cells.
For example, when performing pixel completion processing on the blue sub-image, pixel completion may be performed according to pixel values of a predetermined number of blue pixel cells corresponding to the periphery of each blank pixel cell in the blue sub-image. Similarly, when the pixel completion processing is performed on the visible photon image, the pixel completion can be performed according to the pixel values of a predetermined number of visible light pixel units corresponding to the periphery of each blank area in the visible photon image.
In one embodiment, the interpolation may be performed with a demosaicing (Demosaic) algorithm: the blank pixel units in the R/G/B color sub-images and the blank pixel units in the W/N luminance sub-images are respectively completed by Demosaic interpolation, in preparation for subsequent image processing.
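A minimal sketch of such pixel completion, using a plain average of valid 4-neighbours as a stand-in for a full demosaicing algorithm (the application does not specify the interpolation kernel, so this is an assumption for illustration):

```python
import numpy as np

def complete_channel(sub_image, valid_mask):
    """Fill blank cells of one sub-image by averaging its valid 4-neighbours.
    sub_image: 2-D array (values in blank cells are ignored).
    valid_mask: boolean array, True where the channel was actually sampled."""
    filled = sub_image.astype(float).copy()
    rows, cols = sub_image.shape
    for r in range(rows):
        for c in range(cols):
            if valid_mask[r, c]:
                continue
            neighbours = [
                float(sub_image[nr, nc])
                for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                if 0 <= nr < rows and 0 <= nc < cols and valid_mask[nr, nc]
            ]
            if neighbours:                       # leave the cell untouched if isolated
                filled[r, c] = sum(neighbours) / len(neighbours)
    return filled

# Example: a 4x4 blue sub-image sampled on the assumed 1:1 checkerboard layout.
mask = np.indices((4, 4)).sum(axis=0) % 2 == 0
blue = np.where(mask, 100, 0)                    # 100 where sampled, 0 in blank cells
print(complete_channel(blue, mask))              # blanks become the mean of their neighbours
```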
After the red sub-image R, the green sub-image G, the blue sub-image B, the visible photon image W and the near-infrared sub-image N shown in Fig. 6B are subjected to pixel completion processing, the red sub-image R', the green sub-image G', the blue sub-image B', the visible photon image W' and the near-infrared sub-image N' are correspondingly obtained, as shown in Fig. 6C.
As described above, the image fusion processing is performed on each sub-image after the pixel completion processing to obtain a target color image, such as an RGB image.
In one embodiment, the performing image fusion processing on the blue sub-image, the green sub-image, the red sub-image, and the luminance sub-image after the pixel completion processing includes: based on the brightness subimages, carrying out image fusion processing on the blue subimages after the pixel completion to obtain image fused blue subimages; performing image fusion processing on the green sub-image subjected to pixel completion based on the brightness sub-image to obtain an image-fused green sub-image; performing image fusion processing on the red sub-image after the completion of the pixels based on the brightness sub-image to obtain an image-fused red sub-image; and obtaining the target color image based on the blue sub-image, the green sub-image and the red sub-image after image fusion.
In this embodiment, the luminance sub-image may be used to perform image fusion processing on each color sub-image: the pixel values of the blue sub-image, the green sub-image and the red sub-image are each adjusted with reference to the pixel values of the visible photon image W' and/or the near-infrared sub-image N', so that the black-and-white or gray-scale image corresponding to the luminance sub-image is fused into each color sub-image, greatly improving the sharpness and reducing the noise of each color sub-image. The color sub-images after image fusion processing are output as a finally displayable color image in a format such as JPG or PNG, so that even images shot under low light have better sharpness and signal-to-noise ratio.
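One hedged way to sketch this per-channel fusion (the application does not give the exact weighting; the luminance-scaling rule below, including the strength parameter, is an assumption for illustration):

```python
import numpy as np

def fuse_with_luminance(color_plane, luminance_plane, strength=0.5, eps=1e-6):
    """Pull one completed color sub-image toward the structure of the completed
    luminance sub-image (W' and/or N'), keeping its overall chroma.
    Inputs are float arrays normalized to [0, 1]."""
    base = np.clip(color_plane, 0.0, 1.0)
    lum = np.clip(luminance_plane, 0.0, 1.0)
    # Scale each pixel by how much brighter or darker the luminance plane is
    # than the color plane's average level, then blend with the original.
    ratio = (lum + eps) / (base.mean() + eps)
    fused = (1.0 - strength) * base + strength * base * ratio
    return np.clip(fused, 0.0, 1.0)

# Fuse each completed color plane against the completed W' plane, then stack
# the three fused planes into the target RGB image.
rng = np.random.default_rng(0)
planes = {ch: rng.random((4, 4)) for ch in ("R", "G", "B")}
w_plane = rng.random((4, 4))
rgb = np.dstack([fuse_with_luminance(planes[ch], w_plane) for ch in ("R", "G", "B")])
print(rgb.shape)  # (4, 4, 3)
```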
In another embodiment, the performing image fusion processing on the blue sub-image, the green sub-image, the red sub-image, and the luminance sub-image after the pixel completion processing includes: superposing the blue sub-image, the green sub-image and the red sub-image after pixel completion to generate a color image; and carrying out image fusion on the brightness sub-image and the color image to obtain the target color image.
In this embodiment, the color sub-images may first be superimposed to produce a visually full-color image. The pixel values of the synthesized color image are then adjusted with reference to the pixel values of the visible photon image W' and/or the near-infrared sub-image N', so that the black-and-white or gray-scale image corresponding to the luminance sub-image is fused into the synthesized color image, greatly improving its sharpness and reducing its noise; the finally displayable color image in a format such as JPG or PNG is output, and even images shot under low light have better sharpness and signal-to-noise ratio. Through the fusion processing of these different embodiments, the color image is adjusted by using the visible photon image W' and the near-infrared sub-image N', so that the sharpness and noise of the finally output color image are greatly improved.
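A corresponding sketch of this second variant, where the completed color planes are stacked into a color image first and the luminance plane is blended in afterwards (again, the specific blend is an illustrative assumption, not the formula used in the application):

```python
import numpy as np

def fuse_color_then_luminance(r, g, b, luminance, strength=0.5, eps=1e-6):
    """Variant order: superimpose the completed R/G/B sub-images into a color
    image first, then fuse the completed luminance sub-image into it."""
    rgb = np.dstack([r, g, b])                      # step 1: compose the color image
    current = rgb.mean(axis=2, keepdims=True)       # its current per-pixel brightness
    target = luminance[..., None]                   # brightness from the W'/N' plane
    scale = (target + eps) / (current + eps)        # re-scale toward the luminance target
    fused = (1.0 - strength) * rgb + strength * rgb * scale
    return np.clip(fused, 0.0, 1.0)

rng = np.random.default_rng(1)
out = fuse_color_then_luminance(*(rng.random((4, 4)) for _ in range(4)))
print(out.shape)  # (4, 4, 3)
```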
Fig. 6D is an exemplary diagram of an image after image fusion processing corresponding to each sub-pixel unit of the image sensor in the embodiment of the present application: a color image P is obtained after image fusion processing is performed on the red sub-image R', the green sub-image G', the blue sub-image B', the visible photon image W' and the near-infrared sub-image N' shown in Fig. 6C, as shown in Fig. 6D.
In the embodiment of the application, a blue sub-image obtained based on a blue light signal acquired by a blue sub-pixel unit, a green sub-image obtained based on a green light signal acquired by a green sub-pixel unit, a red sub-image obtained based on a red light signal acquired by a red sub-pixel unit, and a luminance sub-image obtained based on a luminance light signal acquired by a luminance pixel unit are acquired, and image synthesis is performed to output a color image, so that the finally output color image can restore the truest color information of a shot object, and meanwhile, the low-illumination light capturing capability of an image sensor can be remarkably improved, and therefore, the optimal signal-to-noise ratio and definition are achieved.
In the image processing method provided by the embodiment of the application, the execution subject may be an image processing apparatus. In the embodiments of the present application, the image processing apparatus provided herein is described by taking as an example the case in which the image processing apparatus executes the image processing method.
Optionally, as shown in fig. 7, the image processing apparatus 800 of the embodiment of the present application includes an image sensor 820, an image acquisition module 840, and an image synthesis module 860.
An image obtaining module 840, configured to obtain a blue sub-image, a green sub-image, a red sub-image, and a luminance sub-image, where the blue sub-image is obtained based on a blue light signal collected by a blue sub-pixel unit of the image sensor, the green sub-image is obtained based on a green light signal collected by a green sub-pixel unit of the image sensor, the red sub-image is obtained based on a red light signal collected by a red sub-pixel unit of the image sensor, and the luminance sub-image is obtained based on a luminance light signal collected by a luminance pixel unit of the image sensor;
the image synthesis module 860 is configured to perform image synthesis on the blue sub-image, the green sub-image, the red sub-image, and the luminance sub-image, and output a target color image.
Optionally, the image obtaining module 840 specifically includes:
the optical signal acquisition sub-module is used for acquiring a blue light signal collected by the blue sub-pixel unit, a green light signal collected by the green sub-pixel unit, a red light signal collected by the red sub-pixel unit and a brightness optical signal collected by the brightness pixel unit;
an electric signal obtaining sub-module, configured to obtain a blue electric signal obtained by converting the blue light signal by the blue sub-pixel unit, a green electric signal obtained by converting the green light signal by the green sub-pixel unit, a red electric signal obtained by converting the red light signal by the red sub-pixel unit, and a luminance electric signal obtained by converting the luminance light signal by the luminance pixel unit;
the digital signal acquisition sub-module is used for acquiring a blue digital signal obtained by performing analog-to-digital conversion on the blue electric signal, a green digital signal obtained by performing analog-to-digital conversion on the green electric signal, a red digital signal obtained by performing analog-to-digital conversion on the red electric signal and a brightness digital signal obtained by performing analog-to-digital conversion on the brightness electric signal;
and the image acquisition sub-module is used for acquiring a blue sub-image of the blue digital signal, a green sub-image of the green digital signal, a red sub-image of the red digital signal and a brightness sub-image of the brightness digital signal.
Optionally, the image synthesizing module 860 specifically includes: the pixel completion submodule is used for performing pixel completion processing on blank pixel units in the blue sub-image, the green sub-image, the red sub-image and the luminance sub-image respectively; and the image fusion sub-module is used for carrying out image fusion processing on the blue sub-image, the green sub-image, the red sub-image and the brightness sub-image after the pixel completion processing to obtain a target color image.
Optionally, the pixel completion submodule is specifically configured to: acquiring pixel values of color pixel units adjacent to each blank pixel unit, wherein each color pixel unit comprises a blue pixel unit, a green pixel unit or a red pixel unit; the pixel value of each blank pixel cell is determined based on the pixel values of adjacent color pixel cells.
Optionally, the image fusion sub-module is specifically configured to: based on the brightness subimages, carrying out image fusion processing on the blue subimages after the pixel completion to obtain image-fused blue subimages; performing image fusion processing on the green sub-image subjected to pixel completion based on the brightness sub-image to obtain an image-fused green sub-image; performing image fusion processing on the red sub-image subjected to pixel completion based on the brightness sub-image to obtain an image-fused red sub-image; and obtaining the target color image based on the blue sub-image, the green sub-image and the red sub-image after image fusion.
Optionally, the image fusion sub-module is specifically configured to: superposing the blue sub-image, the green sub-image and the red sub-image after pixel completion to generate a color image; and carrying out image fusion on the brightness sub-image and the color image to obtain the target color image.
In the embodiment of the present application, a blue sub-image obtained based on a blue light signal acquired by a blue sub-pixel unit of an image sensor according to the embodiment of the present application, a green sub-image obtained based on a green light signal acquired by a green sub-pixel unit, a red sub-image obtained based on a red light signal acquired by a red sub-pixel unit, and a luminance sub-image obtained based on a luminance light signal acquired by a luminance pixel unit are obtained, and image synthesis is performed to output a color image, so that the finally output color image can restore the truest color information of a subject, and meanwhile, the low-illumination light capturing capability of the image sensor can be significantly improved, thereby realizing the optimal signal-to-noise ratio and definition.
The image processing apparatus in the embodiment of the present application may be an electronic device, or may be a component in the electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. By way of example, the electronic Device may be a Mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic Device, a Mobile Internet Device (MID), an Augmented Reality (AR)/Virtual Reality (VR) Device, a robot, a wearable Device, an ultra-Mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and may also be a personal computer (personal computer, PC), a Television (TV), a teller machine, a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The image processing apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The image processing apparatus provided in the embodiment of the present application can implement each process implemented in the method embodiments of fig. 5 to 6D, and is not described here again to avoid repetition.
Optionally, as shown in fig. 8, an electronic device 900 is further provided in this embodiment of the present application, and includes a processor 940 and a memory 920, where the memory 920 stores a program or an instruction that can be executed on the processor 940, and when the program or the instruction is executed by the processor 940, the program or the instruction implements each process of the foregoing embodiment of the image processing method, and can achieve the same technical effect, and details are not repeated here to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 9 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 1000 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, and a processor 1010.
Those skilled in the art will appreciate that the electronic device 1000 may further comprise a power source (e.g., a battery) for supplying power to various components, and the power source may be logically connected to the processor 1010 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 9 does not constitute a limitation of the electronic device, and the electronic device may include more or fewer components than those shown, or combine some components, or arrange different components, and thus, the description thereof is omitted.
It should be understood that in the embodiment of the present application, the input Unit 1004 may include a Graphics Processing Unit (GPU) 10041 and a microphone 10042, and the Graphics Processing Unit 10041 processes image data of still pictures or videos obtained by an image capturing device, such as a camera module including the image sensor structure of the embodiment of the present application, in a video capturing mode or an image capturing mode.
The image sensor comprises a plurality of color pixel units and a plurality of brightness pixel units, wherein the color pixel units comprise a blue sub-pixel unit, a green sub-pixel unit and a red sub-pixel unit which are stacked and arranged according to a preset sequence; the brightness pixel unit comprises at least one of a visible photon pixel unit and an infrared photon pixel unit; the visible photon pixel unit is used for collecting visible light signals in target light rays, and the infrared photon pixel unit is used for collecting near infrared light signals in the target light rays; the color pixel units and the brightness pixel units are arranged at intervals according to a preset proportion.
Optionally, the luminance pixel unit includes a visible photon pixel unit and an infrared photon pixel unit, and the visible photon pixel unit and the infrared photon pixel unit are stacked in a preset order.
Optionally, at least one of the red sub-pixel unit, the green sub-pixel unit, the blue sub-pixel unit, the visible photon pixel unit and the infrared photon pixel unit is an organic photoconductive film.
Optionally, the preset proportion is 0.3-3.
Optionally, the preset ratio comprises 1:1, 1:2 or 2:1.
In an embodiment of the present application, an image sensor of the image capturing device includes a plurality of color pixel units and a plurality of luminance pixel units; the color pixel units include a blue sub-pixel unit, a green sub-pixel unit and a red sub-pixel unit which are stacked in a preset order, the luminance pixel units include at least one of a visible photon pixel unit and an infrared photon pixel unit, and the color pixel units and the luminance pixel units are arranged at intervals according to a preset proportion. Therefore, each color pixel unit can acquire R/G/B three-color signals at the same time, and the truest color information of the photographed object is restored. Each luminance pixel unit can capture the full band of visible light by using the visible photon pixel unit and/or capture an infrared-band signal by using the infrared photon pixel unit, so that the low-illumination light capturing capability of the image sensor is maximized, and the optimal signal-to-noise ratio and sharpness are realized.
The display unit 1006 may include a display panel 10061, and the display panel 10061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 1007 includes at least one of a touch panel 10071 and other input devices 10072. The touch panel 10071 is also referred to as a touch screen. The touch panel 10071 can include two portions, a touch detection device and a touch controller. Other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
The memory 1009 may be used to store software programs as well as various data. The memory 1009 may mainly include a first storage area storing programs or instructions and a second storage area storing data, wherein the first storage area may store an operating system, and application programs or instructions required for at least one function (such as a sound playing function, an image playing function, and the like). Further, the memory 1009 may include volatile memory or non-volatile memory, or the memory 1009 may include both volatile and non-volatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Dynamic Random Access Memory (DRAM), a Synchronous Dynamic Random Access Memory (SDRAM), a Double Data Rate Synchronous Dynamic Random Access Memory (DDR SDRAM), an Enhanced Synchronous DRAM (ESDRAM), a Synchronous Link DRAM (SLDRAM), or a Direct Rambus RAM (DRRAM). The memory 1009 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
Processor 1010 may include one or more processing units; optionally, the processor 1010 integrates an application processor, which primarily handles operations related to the operating system, user interface, and applications, and a modem processor, which primarily handles wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may not be integrated into processor 1010.
The processor 1010 is configured to obtain a blue sub-image, a green sub-image, a red sub-image and a brightness sub-image, where the blue sub-image is obtained based on a blue light signal acquired by a blue sub-pixel unit of an image sensor of the camera module, the green sub-image is obtained based on a green light signal acquired by a green sub-pixel unit of the image sensor of the camera module, the red sub-image is obtained based on a red light signal acquired by a red sub-pixel unit of the image sensor of the camera module, and the brightness sub-image is obtained based on a brightness light signal acquired by a brightness pixel unit of the image sensor of the camera module; and carrying out image synthesis on the blue sub-image, the green sub-image, the red sub-image and the brightness sub-image, and outputting a target color image.
Optionally, the processor 1010 is configured to: acquire a blue light signal collected by the blue sub-pixel unit, a green light signal collected by the green sub-pixel unit, a red light signal collected by the red sub-pixel unit, and a luminance light signal collected by the luminance pixel unit; acquire a blue electric signal obtained by converting the blue light signal by the blue sub-pixel unit, a green electric signal obtained by converting the green light signal by the green sub-pixel unit, a red electric signal obtained by converting the red light signal by the red sub-pixel unit, and a brightness electric signal obtained by converting the brightness light signal by the brightness pixel unit; acquire a blue digital signal obtained by performing analog-to-digital conversion on the blue electric signal, a green digital signal obtained by performing analog-to-digital conversion on the green electric signal, a red digital signal obtained by performing analog-to-digital conversion on the red electric signal, and a brightness digital signal obtained by performing analog-to-digital conversion on the brightness electric signal; and acquire a blue sub-image of the blue digital signal, a green sub-image of the green digital signal, a red sub-image of the red digital signal and a brightness sub-image of the brightness digital signal.
Optionally, the processor 1010 is configured to perform pixel completion processing on blank pixel units in the blue sub-image, the green sub-image, the red sub-image, and the luminance sub-image respectively; and carrying out image fusion processing on the blue sub-image, the green sub-image, the red sub-image and the brightness sub-image after the pixel completion processing to obtain a target color image.
Optionally, the processor 1010 is configured to obtain pixel values of color pixel units adjacent to each blank pixel unit, where the color pixel units include a blue pixel unit, a green pixel unit, or a red pixel unit; the pixel value of each blank pixel cell is determined based on the pixel values of adjacent color pixel cells.
Optionally, the processor 1010 is configured to perform image fusion processing on the pixel-complemented blue sub-image based on the luminance sub-image, so as to obtain an image-fused blue sub-image; performing image fusion processing on the green sub-image subjected to pixel completion based on the brightness sub-image to obtain an image-fused green sub-image; performing image fusion processing on the red sub-image subjected to pixel completion based on the brightness sub-image to obtain an image-fused red sub-image; and obtaining the target color image based on the blue sub-image, the green sub-image and the red sub-image after image fusion.
Optionally, the processor 1010 is configured to superimpose the pixel-complemented blue sub-image, green sub-image, and red sub-image to generate a color image; and carrying out image fusion on the brightness sub-image and the color image to obtain the target color image.
In the embodiment of the present application, the processor 1010 obtains a blue sub-image obtained based on a blue light signal collected by a blue sub-pixel unit of the image sensor of the embodiment of the present application, a green sub-image obtained based on a green light signal collected by a green sub-pixel unit, a red sub-image obtained based on a red light signal collected by a red sub-pixel unit, and a luminance sub-image obtained based on a luminance light signal collected by a luminance pixel unit, and performs image synthesis to output a color image, so that the finally output color image can restore the truest color information of a subject, and meanwhile, the low-illumination light capturing capability of the image sensor can be significantly improved, thereby achieving the best signal-to-noise ratio and sharpness.
An embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements the processes of the embodiment of the image processing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a computer read only memory ROM, a random access memory RAM, a magnetic or optical disk, and the like.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the above image processing method embodiment, and can achieve the same technical effect, and in order to avoid repetition, the description is omitted here.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as a system-on-chip, or a system-on-chip.
The present application provides a computer program product, where the program product is stored in a storage medium, and the program product is executed by at least one processor to implement the processes of the above image processing method embodiment, and can achieve the same technical effects; to avoid repetition, details are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the above embodiment method can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better embodiment. Based on such understanding, the technical solutions of the present application may be substantially or partially embodied in the form of a software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk), and including instructions for enabling a terminal (e.g., a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the embodiments of the present application have been described above with reference to the accompanying drawings, the application is not limited to the specific embodiments described, which are intended to be illustrative rather than restrictive. Those skilled in the art may make various changes without departing from the scope of the application as defined by the appended claims.

Claims (15)

1. An image sensor, comprising a plurality of color pixel units and a plurality of brightness pixel units, wherein:
the color pixel unit comprises a blue sub-pixel unit, a green sub-pixel unit and a red sub-pixel unit which are stacked in a preset order;
the brightness pixel unit comprises at least one of a visible photon pixel unit and an infrared photon pixel unit, wherein the visible photon pixel unit is used for collecting a visible light signal in target light, and the infrared photon pixel unit is used for collecting a near-infrared light signal in the target light; and
the color pixel units and the brightness pixel units are arranged at intervals according to a preset ratio.
2. The image sensor of claim 1, wherein the brightness pixel unit comprises a visible photon pixel unit and an infrared photon pixel unit, and the visible photon pixel unit and the infrared photon pixel unit are stacked in a preset order.
3. The image sensor of claim 1 or 2, wherein at least one of the red sub-pixel unit, the green sub-pixel unit, the blue sub-pixel unit, the visible photon pixel unit and the infrared photon pixel unit is an organic light guiding film.
4. The image sensor of claim 1, wherein the preset ratio ranges from 0.3 to 3.
5. The image sensor of claim 4, wherein the preset ratio comprises 1:1, 1:2, or 2:1.
6. A camera module comprising the image sensor according to any one of claims 1 to 5.
7. The camera module of claim 6, further comprising:
a circuit board to which the image sensor is electrically connected; and
a lens arranged on a side of the image sensor facing away from the circuit board.
8. An electronic apparatus comprising the camera module according to claim 6 or 7.
9. An image processing method applied to an image processing apparatus comprising the image sensor according to any one of claims 1 to 5, the method comprising:
the method comprises the steps of obtaining a blue sub-image, a green sub-image, a red sub-image and a brightness sub-image, wherein the blue sub-image is obtained based on a blue light signal collected by a blue sub-pixel unit, the green sub-image is obtained based on a green light signal collected by a green sub-pixel unit, the red sub-image is obtained based on a red light signal collected by a red sub-pixel unit, and the brightness sub-image is obtained based on a brightness light signal collected by a brightness pixel unit;
and carrying out image synthesis on the blue sub-image, the green sub-image, the red sub-image and the brightness sub-image, and outputting a target color image.
10. The method of claim 9, wherein obtaining the blue sub-image, the green sub-image, the red sub-image and the brightness sub-image comprises:
acquiring a blue light signal collected by a blue sub-pixel unit, a green light signal collected by a green sub-pixel unit, a red light signal collected by a red sub-pixel unit and a brightness light signal collected by a brightness pixel unit;
acquiring a blue electrical signal obtained by converting the blue light signal by the blue sub-pixel unit, a green electrical signal obtained by converting the green light signal by the green sub-pixel unit, a red electrical signal obtained by converting the red light signal by the red sub-pixel unit, and a brightness electrical signal obtained by converting the brightness light signal by the brightness pixel unit;
acquiring a blue digital signal obtained by performing analog-to-digital conversion on the blue electrical signal, a green digital signal obtained by performing analog-to-digital conversion on the green electrical signal, a red digital signal obtained by performing analog-to-digital conversion on the red electrical signal, and a brightness digital signal obtained by performing analog-to-digital conversion on the brightness electrical signal; and
acquiring a blue sub-image corresponding to the blue digital signal, a green sub-image corresponding to the green digital signal, a red sub-image corresponding to the red digital signal, and a brightness sub-image corresponding to the brightness digital signal.
11. The method of claim 9, wherein performing image synthesis on the blue sub-image, the green sub-image, the red sub-image and the brightness sub-image and outputting a target color image comprises:
performing pixel completion processing on blank pixel units in the blue sub-image, the green sub-image, the red sub-image and the brightness sub-image respectively;
and carrying out image fusion processing on the blue sub-image, the green sub-image, the red sub-image and the brightness sub-image after the pixel completion processing to obtain the target color image.
12. The method according to claim 11, wherein performing pixel completion processing on blank pixel units in the blue sub-image, the green sub-image, the red sub-image and the brightness sub-image respectively comprises:
acquiring pixel values of color pixel units adjacent to each blank pixel unit, wherein the color pixel units comprise blue pixel units, green pixel units or red pixel units;
and determining the pixel value of each blank pixel unit based on the pixel values of the adjacent color pixel units.
13. The method according to claim 11 or 12, wherein performing image fusion processing on the blue sub-image, the green sub-image, the red sub-image and the brightness sub-image after the pixel completion processing comprises:
performing image fusion processing on the blue sub-image subjected to pixel completion based on the brightness sub-image to obtain an image-fused blue sub-image;
performing image fusion processing on the green sub-image subjected to pixel completion based on the brightness sub-image to obtain an image-fused green sub-image;
performing image fusion processing on the red sub-image subjected to pixel completion based on the brightness sub-image to obtain an image-fused red sub-image;
and obtaining the target color image based on the blue sub-image, the green sub-image and the red sub-image after image fusion.
14. The method according to claim 11 or 12, wherein performing image fusion processing on the blue sub-image, the green sub-image, the red sub-image and the brightness sub-image after the pixel completion processing comprises:
superposing the blue sub-image, the green sub-image and the red sub-image after pixel completion to generate a color image;
and carrying out image fusion on the brightness sub-image and the color image to obtain the target color image.
15. An electronic device comprising the image sensor according to any one of claims 1 to 5, a processor and a memory, the memory storing a program or instructions executable on the processor, wherein the program or instructions, when executed by the processor, implement the steps of the image processing method according to any one of claims 9 to 14.
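For reference, the alternate arrangement of color pixel units and brightness pixel units at a preset ratio recited in claims 1, 4 and 5 can be visualized with the short Python sketch below; the row-staggered tiling and the "C"/"W" labels are illustrative assumptions for demonstration only, not a layout prescribed by the claims.

def build_pixel_layout(rows, cols, color_count=1, brightness_count=1):
    # Build a rows x cols grid of labels, repeating a pattern of color_count
    # color pixel units ("C") followed by brightness_count brightness pixel
    # units ("W"), i.e. a preset ratio of color_count : brightness_count
    # (for example 1:1, 1:2 or 2:1).
    pattern = ["C"] * color_count + ["W"] * brightness_count
    return [[pattern[(r + c) % len(pattern)] for c in range(cols)]
            for r in range(rows)]

if __name__ == "__main__":
    # Preset ratio 1:2 -> twice as many brightness units as color units.
    for row in build_pixel_layout(4, 8, color_count=1, brightness_count=2):
        print(" ".join(row))

Printing the grid shows the color and brightness units interleaved and staggered from row to row, which is one of many tilings consistent with an interval arrangement at the stated ratio.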
CN202111445496.XA 2021-11-30 2021-11-30 Image sensor, camera module, image processing method and device and electronic equipment Pending CN114125319A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111445496.XA CN114125319A (en) 2021-11-30 2021-11-30 Image sensor, camera module, image processing method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111445496.XA CN114125319A (en) 2021-11-30 2021-11-30 Image sensor, camera module, image processing method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN114125319A true CN114125319A (en) 2022-03-01

Family

ID=80368469

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111445496.XA Pending CN114125319A (en) 2021-11-30 2021-11-30 Image sensor, camera module, image processing method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN114125319A (en)

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003332551A (en) * 2002-05-08 2003-11-21 Canon Inc Color image pickup device and color photoreceptor device
CN102036599A (en) * 2008-03-18 2011-04-27 诺瓦达克技术公司 Imaging system for combined full-color reflectance and near-infrared imaging
KR20130020435A (en) * 2011-08-19 2013-02-27 한경대학교 산학협력단 Apparatus and method for reconstructing color image based on multi-spectrum using bayer color filter array camera
JP2014011722A (en) * 2012-07-02 2014-01-20 Sigma Corp Imaging device
CN204720451U (en) * 2014-06-03 2015-10-21 半导体元件工业有限责任公司 Imaging system and processor system
US20170048500A1 (en) * 2015-08-10 2017-02-16 Lilong SHI Rgb-rwb dual images by multi-layer sensors towards better image quality
CN105611136A (en) * 2016-02-26 2016-05-25 联想(北京)有限公司 Image sensor and electronic equipment
CN106791734A (en) * 2016-12-27 2017-05-31 珠海市魅族科技有限公司 The method of device, electronic installation and IMAQ for IMAQ
CN111432099A (en) * 2020-03-30 2020-07-17 Oppo广东移动通信有限公司 Image sensor, processing system and method, electronic device, and storage medium
CN112331684A (en) * 2020-11-20 2021-02-05 联合微电子中心有限责任公司 Image sensor and forming method thereof
CN112822466A (en) * 2020-12-28 2021-05-18 维沃移动通信有限公司 Image sensor, camera module and electronic equipment
CN113037980A (en) * 2021-03-23 2021-06-25 北京灵汐科技有限公司 Pixel sensing array and vision sensor
CN113556520A (en) * 2021-07-16 2021-10-26 京东方科技集团股份有限公司 Image processing method, image processing device and system
CN113570532A (en) * 2021-07-28 2021-10-29 Oppo广东移动通信有限公司 Image processing method, device, terminal and readable storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114650377A (en) * 2022-03-22 2022-06-21 维沃移动通信有限公司 Camera module, control method of camera module and electronic equipment
WO2023179527A1 (en) * 2022-03-22 2023-09-28 维沃移动通信有限公司 Camera module, control method for camera module, and electronic device

Similar Documents

Publication Publication Date Title
EP4024323A1 (en) Image processing method and apparatus
CN107431760B (en) The image processing method and storage medium of photographic device, photographic device
CN104995911B (en) Image processing device, image capture device, filter generating device, image restoration method, and program
CN105409211B (en) For the automatic white balance positive with skin-color adjustment of image procossing
CN103327342B (en) There is the imaging system of opaque filter pixel
CN113012081B (en) Image processing method, device and electronic system
CN105049718A (en) Image processing method and terminal
CN103563350A (en) Image processing device, image processing method, and digital camera
CN104662463A (en) Image processing apparatus, imaging system, and image processing system
CN114693580B (en) Image processing method and related device
WO2024027287A9 (en) Image processing system and method, and computer-readable medium and electronic device
CN113014803A (en) Filter adding method and device and electronic equipment
CN105933616A (en) Image processing method and equipment
CN105791793A (en) Image processing method and electronic device
JP2010193037A (en) Apparatus and program for creating pseudo-color image
Song et al. Real-scene reflection removal with raw-rgb image pairs
CN114298889A (en) Image processing circuit and image processing method
CN114125319A (en) Image sensor, camera module, image processing method and device and electronic equipment
CN115835034A (en) White balance processing method and electronic equipment
CN104010134B (en) For forming the system and method with wide dynamic range
JP2003199119A (en) Method of processing digital cfa image
Lukac Single-sensor imaging in consumer digital cameras: a survey of recent advances and future directions
CN110460783B (en) Array camera module, image processing system, image processing method and electronic equipment
Guo et al. Low-light color imaging via cross-camera synthesis
CN116055891A (en) Image processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination