CN112822466A - Image sensor, camera module and electronic equipment - Google Patents


Publication number
CN112822466A
Authority
CN
China
Prior art keywords
pixel
image sensor
pixel region
color
pixels
Prior art date
Legal status
Pending
Application number
CN202011587866.9A
Other languages
Chinese (zh)
Inventor
罗轶
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202011587866.9A
Publication of CN112822466A
Priority to PCT/CN2021/139889 (published as WO2022143280A1)
Legal status: Pending

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 — Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 — Camera processing pipelines; Components thereof
    • H04N23/84 — Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/57 — Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices

Abstract

The application provides an image sensor, a camera module, and an electronic device, belonging to the technical field of image processing. The image sensor includes a color filter array comprising a plurality of sampling pixel groups; each sampling pixel group includes a plurality of pixel regions, including color pixel regions and a white pixel region, with the color pixel regions surrounding the white pixel region. The color pixel regions include at least two first pixel regions, at least one second pixel region, and at least one third pixel region, the first, second, and third pixel regions being configured to receive visible light in different wavelength ranges. This forms a five-in-one pixel array arrangement in which the RGB signals and the Mono signal can be read simultaneously, improving the light-sensing capability of the sensor while preserving the sharpness of the color image, and reducing the space occupied by the image sensor and its power consumption.

Description

Image sensor, camera module and electronic equipment
Technical Field
The application belongs to the technical field of image processing, and particularly relates to an image sensor, a camera module and electronic equipment.
Background
In a complementary metal-oxide-semiconductor (CMOS) image sensor, commonly used color filter arrays (CFAs) include the RGB (red, green, blue) array, the RGB Bayer array, the CMY (cyan, magenta, yellow) array, the RGBW (red, green, blue, white) array, and the RYYB (red, yellow, yellow, blue) array. After a CMOS image sensor (CIS) outputs a raw image, a demosaicing process is required to generate the final RGB color image.
In the related art, under a demosaicing interpolation algorithm, a color image generated from an RGB Bayer array has good sharpness, but the per-pixel signal-to-noise ratio is not ideal, so the image brightness is low. Although an RGBW array can balance color and noise, the number of effective RGB pixels is limited, so a full-resolution RGB image can only "guess" the missing pixels through interpolation, and must be interpolated many times; if the interpolation algorithm is weak, sharpness is easily lost and parts of the scene are overexposed.
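As an illustrative sketch of the interpolation step just described (a bilinear estimate under an assumed RGGB layout, not the patent's algorithm), the missing green values at red/blue sites can be filled from their four neighbours:

```python
def interpolate_green(bayer):
    """Bilinear estimate of the green channel from an RGGB Bayer mosaic
    (a sketch of the interpolation discussed in the background, not the
    patent's method).  In RGGB, green samples sit where (row + col) is odd;
    at red/blue sites the missing green value is taken as the mean of the
    four edge-clamped green neighbours.
    """
    h, w = len(bayer), len(bayer[0])

    def at(y, x):  # clamp coordinates at the mosaic border
        return bayer[min(max(y, 0), h - 1)][min(max(x, 0), w - 1)]

    green = [row[:] for row in bayer]
    for y in range(h):
        for x in range(w):
            if (y + x) % 2 == 0:  # red or blue site: interpolate green
                green[y][x] = (at(y - 1, x) + at(y + 1, x) +
                               at(y, x - 1) + at(y, x + 1)) / 4.0
    return green
```

A stronger interpolator (edge-directed, gradient-corrected) would reduce the sharpness loss mentioned above; this minimal version only illustrates why repeated interpolation degrades detail.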
Disclosure of Invention
The embodiments of the application provide an image sensor, a camera module, and an electronic device that can balance the color sharpness of an image with noise reduction, avoiding problems such as blurring and color error, while requiring no additional image sensor chips, which effectively reduces the volume and power consumption of the image sensor.
In a first aspect, an embodiment of the present application provides an image sensor, including:
the color filter array comprises a plurality of sampling pixel groups, each sampling pixel group comprises a plurality of pixel areas, the plurality of pixel areas comprise color pixel areas and white pixel areas, and the color pixel areas surround the white pixel areas;
the color pixel regions comprise at least two first pixel regions, at least one second pixel region and at least one third pixel region, and the first pixel regions, the second pixel regions and the third pixel regions are used for receiving visible light in different wavelength ranges.
In a second aspect, an embodiment of the present application provides a camera module, including:
a circuit board;
the image sensor provided in the embodiment of the first aspect, electrically connected to the circuit board;
and a lens arranged on the side of the image sensor facing away from the circuit board.
In a third aspect, an embodiment of the present application provides an electronic device, including the camera module provided in the embodiment of the second aspect.
In an embodiment of the present application, an image sensor includes a color filter array comprising a plurality of sampling pixel groups. Each sampling pixel group includes a plurality of pixel regions, including color pixel regions and a white pixel region, with the color pixel regions surrounding the white pixel region; the color pixel regions include at least two first pixel regions, at least one second pixel region, and at least one third pixel region, which receive visible light in different wavelength ranges. By dividing the pixel regions into color pixel regions and a white pixel region, on the one hand, each sampling pixel group has five regions and can read color (RGB) signals and black-and-white (Mono) signals simultaneously; while the sharpness of the color image is preserved, the signal-to-noise ratio of the image sensor is increased and the light sensitivity of the sensor is improved, so that higher-quality pictures can be taken even in low-light environments, effectively improving imaging quality. On the other hand, the functions of a Mono image sensor and an RGB image sensor are combined on a single image sensor chip, which not only reduces the manufacturing cost of the image sensor but also favors its miniaturization, reduces the space it occupies, and lightens the image-processing load on the electronic device.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a top view of a color filter array of an image sensor according to one embodiment of the present application;
FIG. 2 is a top view of a sampling pixel group of an image sensor according to one embodiment of the present application;
FIG. 3 is a top view of a sampling pixel group of an image sensor according to yet another embodiment of the present application;
FIG. 4 is a schematic cross-sectional view of the sampling pixel group of FIG. 2;
FIG. 5 is a top view of a composite pixel group of an image sensor according to one embodiment of the present application;
FIG. 6 is a top view of a composite pixel group of an image sensor according to yet another embodiment of the present application;
FIG. 7 is a top view of a composite pixel group of an image sensor according to yet another embodiment of the present application;
FIG. 8 is a top view of a color filter array of an image sensor according to yet another embodiment of the present application;
fig. 9 is a block diagram of a hardware configuration of an electronic device according to an embodiment of the present application.
Reference numerals:
1 color filter array, 10 sampling pixel groups, 110 color pixel regions, 112 first pixel regions, 114 second pixel regions, 116 third pixel regions, 120 white pixel regions, 130 first deep trench isolations, 140 sub-pixels, 150 second deep trench isolations, 20 photo-sensitive circuits, 22 photoelectric conversion elements, 24 filter elements, 26 micro-lenses.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary only for the purpose of explaining the present application and are not to be construed as limiting the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Features qualified by the terms "first" and "second" in the description and claims of the present application may explicitly or implicitly include one or more of such features. In the description of the present application, "a plurality" means two or more unless otherwise specified. In addition, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and following objects.
In the description of the present application, it is to be understood that the terms "upper", "lower", "front", "rear", "left", "right", etc. indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience in describing the present application and simplifying the description, but do not indicate or imply that the referred device or element must have a specific orientation, be constructed in a specific orientation, and be operated, and thus should not be construed as limiting the present application.
In the description of the present application, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly, e.g., as a fixed connection, a removable connection, or an integral connection; as a mechanical or an electrical connection; as a direct connection, an indirect connection through an intermediate medium, or an internal communication between two elements. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art according to the specific circumstances.
An image sensor, a camera module, and an electronic apparatus according to embodiments of the present application are described below with reference to fig. 1 to 9.
As shown in fig. 1 to 3, according to an embodiment of a first aspect of the present application, there is provided an image sensor including: a color filter array 1, the color filter array 1 including a plurality of sampling pixel groups 10, each sampling pixel group 10 including a plurality of pixel regions, the plurality of pixel regions including a color pixel region 110 and a white pixel region 120, the color pixel region 110 surrounding the white pixel region 120; the color pixel region 110 includes at least two first pixel regions 112, at least one second pixel region 114, and at least one third pixel region 116, where the first pixel region 112, the second pixel region 114, and the third pixel region 116 are configured to receive visible light in different wavelength ranges.
In this embodiment, the image sensor includes a color filter array (CFA) 1 capable of photoelectrically converting received visible light of a plurality of different colors, thereby generating electrical signals and outputting a color image. The color filter array 1 includes a plurality of sampling pixel groups 10, each comprising a plurality of pixel regions; specifically, the plurality of pixel regions include color pixel regions 110 and a white pixel region 120. Each color pixel region 110 is configured to receive visible light of the color corresponding to that region. The color pixel regions 110 include at least two first pixel regions 112, at least one second pixel region 114, and at least one third pixel region 116, configured to receive visible light of different colors, each color having a different wavelength range, so that the color information of a pixel can be obtained through the color pixel regions 110. The white pixel region 120 is configured to receive visible light of all colors; through it, detail information about the sharpness of the color information is obtained and the received color information is denoised. Dividing the pixel regions into color pixel regions 110 and a white pixel region 120 has two effects.
On the one hand, each sampling pixel group has five regions, so color signals and black-and-white (Mono) signals can be read simultaneously; while the sharpness of the color image is preserved, the signal-to-noise ratio of the image sensor is increased, the bandwidth of the collected light is expanded, and the light sensitivity of the sensor is improved, so that higher-quality pictures can be taken even in low-light environments, effectively improving the imaging quality of the image sensor. On the other hand, the Mono image sensor and the color image sensor are combined on a single image sensor chip, so the color and Mono techniques can be realized simultaneously, which effectively reduces the manufacturing cost of the image sensor, favors its miniaturization, reduces the space it occupies, and lightens the image-processing load on the electronic device.
The color pixel regions 110 surround the white pixel region 120; that is, the white pixel region 120 is located in the middle of the plurality of pixel regions. For example, on the basis of an RGB Bayer array, part of each color region is replaced with a white region.
Further, a plurality of sampling pixel groups 10 are connected and arranged in an array.
In one possible embodiment, as shown in fig. 1 and 2, the first pixel region 112 is a green pixel region; the second pixel region 114 is a red pixel region; the third pixel region 116 is a blue pixel region.
In this embodiment, the first pixel region 112 is a green pixel region for receiving visible light in the wavelength range corresponding to green; the second pixel region 114 is a red pixel region for receiving visible light in the wavelength range corresponding to red; and the third pixel region 116 is a blue pixel region for receiving visible light in the wavelength range corresponding to blue. By mimicking the human eye's sensitivity to color, the 1-red, 2-green, 1-blue arrangement is adopted in the color pixel regions 110, which guarantees the number of effective RGB pixels, avoids repeated interpolation, improves the accuracy of the demosaicing interpolation algorithm, raises the resolution of the image sensor, yields a sharper generated color image, and improves the imaging quality of the image sensor. Meanwhile, the four-in-one RGGB (Bayer) pixel array arrangement can be improved into a five-in-one RGGBW pixel array arrangement: the RGB signal and the Mono signal can be read simultaneously, and the light-sensing capability of the sensor is improved while the sharpness of the color image is preserved, effectively improving the imaging quality of the image sensor. Moreover, the functions of a Mono image sensor and an RGB image sensor are combined on a single chip, so the RGB and Mono techniques can be realized simultaneously, which effectively reduces the manufacturing cost of the image sensor, favors its miniaturization, reduces the space it occupies, and lightens the image-processing load on the electronic device.
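As an illustrative sketch (region labels are symbolic, not from the patent text), the five-in-one RGGBW grouping just described — one red, two green, and one blue region surrounding one white region — can be encoded and checked as follows:

```python
# Hypothetical encoding of one five-in-one RGGBW sampling group: the four
# colour regions in Bayer order, plus the white region embedded at the centre.
SAMPLING_GROUP = {"surround": ["G", "R", "B", "G"], "center": "W"}

def region_counts(group):
    """Tally the regions: 1 red, 2 green, 1 blue surrounding 1 white."""
    counts = {}
    for label in group["surround"] + [group["center"]]:
        counts[label] = counts.get(label, 0) + 1
    return counts
```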
The wavelength range of visible light is about 390 nm to 700 nm: red light is about 620 nm to 700 nm, orange light about 590 nm to 620 nm, yellow light about 570 nm to 590 nm, green light about 490 nm to 570 nm, cyan light about 450 nm to 490 nm, blue light about 430 nm to 450 nm, and purple light about 390 nm to 450 nm.
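For illustration only, these approximate bands can be encoded as a first-match lookup; the boundary values are the text's approximate figures, with the violet band clipped at 430 nm where the blue band begins:

```python
# Approximate visible-light bands (nm) from the description above; boundaries
# are the text's approximate values, ordered for a first-match lookup
# (the violet band is clipped at 430 nm where blue begins).
BANDS = [
    ("red",    620, 700),
    ("orange", 590, 620),
    ("yellow", 570, 590),
    ("green",  490, 570),
    ("cyan",   450, 490),
    ("blue",   430, 450),
    ("violet", 390, 430),
]

def colour_of(wavelength_nm):
    """Return the colour band containing the given wavelength, or None."""
    for name, lo, hi in BANDS:
        if lo <= wavelength_nm <= hi:
            return name
    return None
```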
It is understood that the first pixel region 112, the second pixel region 114, and the third pixel region 116 may correspond to RGB color regions, respectively, or may correspond to other color regions, such as CMY color regions.
In addition, the shape, position, and size of each region in the sampling pixel group 10 may be changed according to the actual situation. For example, as shown in fig. 2, the sampling pixel group 10 includes two green (G) pixel regions, one red (R) pixel region, one blue (B) pixel region, and one white (W) pixel region. The area of the white pixel region 120 is 1/4 of the area of the sampling pixel group 10, the areas of the red and blue pixel regions are each 3/16 of the area of the sampling pixel group 10, the combined area of the two green pixel regions is 3/8 of the area of the sampling pixel group 10, and the white pixel region 120 is located at the center of the sampling pixel group 10, which improves the applicability of the image sensor. As shown in fig. 3, the area of the white pixel region 120 is 1/8 of the area of the sampling pixel group 10, the areas of the red and blue pixel regions are each 1/4 of the area of the sampling pixel group 10, and the areas of the two green pixel regions are each 3/16 of the area of the sampling pixel group 10; that is, the white region is embedded only in the green pixel regions, so that RGB + Mono can still be realized while eliminating any effect on the light received by the red and blue pixel regions.
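A quick sanity check on the two layouts (a sketch using the stated area fractions, nothing more): in both the FIG. 2 and FIG. 3 variants the five regions must tile the whole sampling group exactly:

```python
from fractions import Fraction as F

# Area fractions stated above: FIG. 2 puts a 1/4-area white region at the
# centre; FIG. 3 shrinks white to 1/8 and embeds it only in the green regions.
# "G" is the combined area of the two green regions in each variant.
FIG2 = {"W": F(1, 4), "R": F(3, 16), "B": F(3, 16), "G": F(3, 8)}
FIG3 = {"W": F(1, 8), "R": F(1, 4), "B": F(1, 4), "G": F(3, 8)}

def tiles_whole_group(fractions):
    """The region areas must sum to exactly 1 (the full sampling group)."""
    return sum(fractions.values()) == 1
```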
In one possible embodiment, as shown in fig. 1 to 4, the sampling pixel group 10 further includes: the first deep trench isolation 130 is disposed between two adjacent pixel regions.
In this embodiment, a first deep trench isolation (DTI) 130 is disposed between two adjacent pixel regions to isolate them, so that the signal of one pixel region cannot affect other pixel regions. This avoids crosstalk between two adjacent pixel regions and prevents the pixel regions from producing erroneous image information, while helping each pixel region gather more light.
In a specific application, the first deep trench isolation 130 includes a deep trench in the semiconductor layer of the color filter array 1 and a liner layer on the bottom and sidewalls of the deep trench. The material of the liner layer can be a metal material, such as tungsten, and a good light blocking effect can be achieved. The liner layer may also be an oxide or oxynitride, such as silicon dioxide, making the first deep trench isolation 130 process simple and easy to implement. The depth range of the deep trench can be set reasonably according to the thickness of the semiconductor layer of the color filter array 1, for example, the depth range of the deep trench is 2 μm to 3 μm.
In one possible embodiment, as shown in fig. 5 to 8, the sampling pixel group 10 is a composite pixel group; the color pixel regions or the white pixel region 120 include a plurality of sub-pixels 140; and a second deep trench isolation 150 is disposed between two adjacent sub-pixels 140.
In this embodiment, to meet the requirement of high resolution, the pixel size is reduced; but when a single pixel is too small, its optical and electrical performance degrades considerably compared with a large pixel. For example, consider the charge storage capability of a pixel: when the pixel size drops to the 0.6 μm level, a single pixel can store only about 5000 electrons. For this reason, the sampling pixel groups 10 are configured as composite pixel groups; that is, the color pixel regions or the white pixel region 120 in each sampling pixel group 10 are composed of a plurality of sub-pixels 140. On the one hand, this improves the charge storage capability of the pixel, reduces noise, and improves image quality; on the other hand, since each pixel region is built from sub-pixels 140, the white pixel region 120 among the plurality of pixel regions can be flexibly allocated by arranging the sub-pixels 140, which reduces the design difficulty caused by irregularly shaped pixel regions and lowers the manufacturing cost of the color filter array 1.
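The signal-to-noise benefit of binning sub-pixels into one composite pixel can be sketched numerically (an illustration under a Gaussian shot-noise approximation, not a claim from the patent): summing n sub-pixels multiplies the mean signal by n but the noise only by √n.

```python
import random
import statistics

def binned_snr(signal_per_subpixel, n_subpixels, trials=20000, seed=0):
    """Monte-Carlo SNR estimate for an n-sub-pixel binned (composite) pixel.

    Shot noise is approximated as Gaussian with sigma = sqrt(signal).
    Summing n sub-pixels multiplies the mean signal by n but the noise
    only by sqrt(n), so the SNR grows roughly as sqrt(n).
    """
    rng = random.Random(seed)
    sigma = signal_per_subpixel ** 0.5
    sums = [sum(rng.gauss(signal_per_subpixel, sigma) for _ in range(n_subpixels))
            for _ in range(trials)]
    return statistics.fmean(sums) / statistics.stdev(sums)
```

With 100 electrons per sub-pixel, a 4-sub-pixel composite pixel roughly doubles the SNR of a single sub-pixel, consistent with the noise-reduction argument above.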
Furthermore, two adjacent sub-pixels 140 are isolated from each other by the second deep trench isolation 150, so that photoelectrons generated by any sub-pixel 140 cannot enter an adjacent sub-pixel 140 and adjacent sub-pixels 140 are not affected; crosstalk noise between the sub-pixels 140 can thus be greatly reduced while helping the sub-pixels 140 to gather light.
Specifically, in the 2 × 2 composite pixel shown in fig. 5 and fig. 8, four sub-pixels 140 constitute one pixel point. One sub-pixel 140 is taken from each pixel point to serve as a white sub-pixel, and the other three serve as color sub-pixels: the three color sub-pixels constitute one color pixel region, and all the white sub-pixels in one composite pixel group together constitute the white pixel region 120. This avoids the design and manufacturing difficulties brought by an L-shaped color pixel region.
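A sketch of that arrangement (the exact sub-pixel placement is assumed, modelled loosely on FIGS. 5 and 8): each 2 × 2 pixel point contributes its inner-corner sub-pixel to white, so the four white sub-pixels form a central 2 × 2 block:

```python
# Hypothetical 4x4 sub-pixel grid for one composite group: each 2x2
# quadrant is a pixel point of three colour sub-pixels plus one white
# sub-pixel at its inner corner, so the W sub-pixels form a 2x2 centre.
COMPOSITE_GROUP = [
    ["G", "G", "R", "R"],
    ["G", "W", "W", "R"],
    ["B", "W", "W", "G"],
    ["B", "B", "G", "G"],
]

def pixel_points(grid):
    """Split the 4x4 sub-pixel grid into its four 2x2 pixel points."""
    return [[grid[qy + dy][qx + dx] for dy in (0, 1) for dx in (0, 1)]
            for qy in (0, 2) for qx in (0, 2)]
```

Each pixel point then holds exactly one white sub-pixel and three sub-pixels of a single colour, which is the property that avoids the L-shaped colour regions.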
Specifically, the positions of the white sub-pixels are not fixed and their number is not limited; they can be chosen according to the actual requirements of the image sensor, which is not limited here. For example, as shown in fig. 6, only the color pixel regions are composed of a plurality of sub-pixels 140. As shown in fig. 7, the green, red, and blue pixel regions are each composed of three sub-pixels 140, and the white pixel region is composed of two sub-pixels 140.
In one possible embodiment, as shown in fig. 4, each pixel region or each sub-pixel is provided with a photosensitive circuit 20, a photoelectric conversion element 22, and a filter element 24, which are stacked in this order.
In this embodiment, each pixel region or each sub-pixel is provided with a photosensitive circuit 20, a photoelectric conversion element 22, and a filter element 24, stacked in this order from bottom to top. The filter element 24 filters out superfluous infrared and ultraviolet light and passes the visible light of the color corresponding to the pixel region, to which the photoelectric conversion element 22 responds; unneeded light in the color filter array is thus filtered out, preventing the photoelectric sensor from producing false color or moiré during shooting, and improving the effective resolution of the image and its color fidelity. The light signal received by the pixel region is converted into an electrical signal by the photosensitive circuit 20 and the photoelectric conversion element 22 and then output, thereby realizing imaging by the image sensor.
Further, each pixel region or each sub-pixel may have its own photosensitive circuit 20, photoelectric conversion element 22, and filter element 24, so that RGB signals can be read from the RGGB regions while Mono signals are read from the W region, achieving the RGB + Mono effect and avoiding the blurring and color-misregistration problems of a conventional RGBW array.
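The simultaneous read-out can be sketched as a trivial split of one group's samples into a colour stream and a Mono stream (labels and sample values are hypothetical, for illustration only):

```python
def split_rgb_mono(samples):
    """Split one sampling group's read-out into colour (RGGB) samples and
    black-and-white (Mono) samples read in the same pass.

    samples: list of (region_label, value) pairs from one group.
    """
    rgb = [(label, value) for label, value in samples if label != "W"]
    mono = [value for label, value in samples if label == "W"]
    return rgb, mono
```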
It should be noted that one filter element may be shared for sub-pixels that receive visible light of the same color.
In a particular application, the photoelectric conversion element 22 is a photodiode and the filter element 24 is a color filter. A color filter of a corresponding color is provided for each pixel region. That is, a red color filter is provided for a red pixel region, a green color filter is provided for a green pixel region, a blue color filter is provided for a blue pixel region, and a visible light filter is provided for a white pixel region. Each color filter is formed by resin with organic pigment added inside, and the thickness of the color filter can be reasonably set according to actual requirements, such as 400 nm-1000 nm.
It can be understood that, the first deep trench isolation 130 is configured to be a U-shaped structure, and is used to completely isolate each photoelectric conversion element 22 in the sampling pixel group 10, so as to shield light of the photoelectric conversion element 22, thereby preventing the image sensor from optical crosstalk, ensuring the usability of the image sensor, and improving the imaging quality of the image sensor.
In one possible embodiment, as shown in fig. 1, 4 and 8, the image sensor further includes: a microlens layer disposed on the filter element 24; the microlens layer includes one or more microlenses 26, with the plurality of microlenses 26 being located on different pixel regions or on different sub-pixels 140.
In this embodiment, a microlens layer is disposed on the filter element 24, i.e., the photosensitive surface of the sampling pixel group 10, and the microlens 26 is disposed to focus light onto the pixel region or sub-pixel, thereby improving the photosensitive efficiency of the pixel region or sub-pixel.
Further, as shown in fig. 1 and 4, all pixel regions in one sampling pixel group 10 may share one microlens 26. Alternatively, as shown in fig. 8, independent microlenses 26 may be provided for different pixel regions or different sub-pixels 140; this helps each region gather light while reducing crosstalk between pixel regions, effectively suppressing noise in the image sensor.
Specifically, in a top view of the color filter array, the bottom contour of each microlens 26 may be circular.
In one possible embodiment, the color filter array further comprises: and a semiconductor layer for mounting at least one of the photosensitive circuit, the photoelectric conversion element, and the filter element.
In this embodiment, a mounting space is provided in the semiconductor layer, in which at least one of the photosensitive circuit, the photoelectric conversion element, and the filter element can be mounted; the first deep trench isolation and the second deep trench isolation are also provided in the semiconductor layer, ensuring the assembly stability of the color filter array.
Specifically, the semiconductor layer may be a silicon substrate, but is not limited thereto. Different pixel regions or different sub-pixels may share a common silicon substrate.
In one possible embodiment, the image sensor is a complementary metal oxide semiconductor image sensor.
In this embodiment, the image sensor is a complementary metal-oxide-semiconductor image sensor (CIS) with a high-dynamic-range (HDR) mode. A CMOS image sensor has the advantages of a simple process, easy integration with other devices, small size, light weight, low power consumption, and low cost, and can be widely applied in various electronic devices, for example digital cameras, camera phones, digital video cameras, medical imaging devices (such as gastroscopes), and vehicle-mounted imaging devices. When the CMOS image sensor outputs a raw image, a demosaicing interpolation algorithm is required to calculate the missing pixels before the final color image can be generated. By dividing the pixel regions into color pixel regions 110 and a white pixel region 120, the number of effective pixels is increased, the accuracy of the demosaicing interpolation algorithm is improved, and the resolution and imaging quality of the image sensor are improved.
According to an embodiment of a second aspect of the present application, there is provided a camera module, including: a circuit board; the image sensor according to any one of the first aspect, electrically connected to a circuit board; and the lens is arranged on one side of the image sensor, which is deviated from the circuit board.
In this embodiment, the camera module includes a circuit board and a lens, the lens being disposed on the side of the image sensor facing away from the circuit board to transmit external light to the image sensor. After the image sensor generates the corresponding electrical signals based on the received light, the circuit board electrically connected to it turns the converted electrical signals into an image, thereby realizing imaging by the camera module. The camera module also has all the advantages of the image sensor of any of the above embodiments, which are not repeated here.
According to an embodiment of a third aspect of the present application, an electronic device is provided, which includes the camera module in the second aspect of the present application. Therefore, the electronic device also includes all the advantages of the camera module in the above embodiments, which are not described herein again.
Specifically, the electronic device is a mobile phone, a tablet computer, a smart wristband, a notebook computer, a digital camera, or another device with an image-capture function.
The electronic device in the embodiment of the present application may be an apparatus, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The electronic device in the embodiments of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system; the embodiments of the present application are not specifically limited in this respect.
Fig. 9 is a block diagram of a hardware structure of an electronic device 500 implementing an embodiment of the present application. The electronic device 500 includes, but is not limited to: a radio frequency unit 502, a network module 504, an audio output unit 506, an input unit 508, a sensor 510, a display unit 512, a user input unit 514, an interface unit 516, a memory 518, a processor 520, and the like.
Those skilled in the art will appreciate that the electronic device 500 may further include a power source (e.g., a battery) for supplying power to the various components; the power source may be logically connected to the processor 520 through a power management system, so that charging, discharging, and power consumption are managed through the power management system. The electronic device structure shown in fig. 9 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than those shown, combine some components, or arrange the components differently. In the embodiments of the present application, the electronic device includes, but is not limited to, a mobile terminal, a tablet computer, a notebook computer, a handheld computer, a vehicle-mounted electronic device, a wearable device, a pedometer, and the like.
It should be understood that, in the embodiments of the present application, the radio frequency unit 502 may be used to transmit and receive information, or to transmit and receive signals during a call; in particular, it may receive downlink data from a base station or send uplink data to the base station. The radio frequency unit 502 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
The network module 504 provides wireless broadband internet access to the user, such as assisting the user in emailing, browsing web pages, and accessing streaming media.
The audio output unit 506 may convert audio data received by the radio frequency unit 502 or the network module 504, or stored in the memory 518, into an audio signal and output it as sound. The audio output unit 506 may also provide audio output related to a specific function performed by the electronic device 500 (e.g., a call signal reception sound or a message reception sound). The audio output unit 506 includes a speaker, a buzzer, a receiver, and the like.
The input unit 508 is used to receive audio or video signals. The input unit 508 may include a graphics processing unit (GPU) 5082 and a microphone 5084; the graphics processor 5082 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 512, stored in the memory 518 (or another storage medium), or transmitted via the radio frequency unit 502 or the network module 504. The microphone 5084 can receive sound and process it into audio data; in a telephone call mode, the processed audio data can be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 502.
The electronic device 500 also includes at least one sensor 510, such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, a light sensor, a motion sensor, and others.
The display unit 512 is used to display information input by the user or information provided to the user. The display unit 512 may include a display panel 5122, and the display panel 5122 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
The user input unit 514 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 514 includes a touch panel 5142 and other input devices 5144. The touch panel 5142, also referred to as a touch screen, can collect touch operations performed by a user on or near it. The touch panel 5142 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 520, and receives and executes commands sent by the processor 520. The other input devices 5144 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail here.
Further, the touch panel 5142 can be overlaid on the display panel 5122; when the touch panel 5142 detects a touch operation on or near it, the operation is transmitted to the processor 520 to determine the type of the touch event, and the processor 520 then provides a corresponding visual output on the display panel 5122 according to the type of the touch event. The touch panel 5142 and the display panel 5122 can be provided as two separate components or integrated into one component.
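The touch-event flow described above (touch detection device → touch controller → touch point coordinates → processor → visual output) can be sketched roughly as follows; all of the function names and the event-type-to-output mapping are hypothetical illustrations, not an API from the application:

```python
# Rough sketch of the described touch-event flow. The names and the
# mapping from event type to visual output are assumptions for
# illustration only.
def touch_controller(raw_signal):
    """Convert raw touch-detection info into touch point coordinates.

    Assumes the raw signal carries normalized panel coordinates (nx, ny),
    the panel dimensions, and an event type ("down", "move", "up").
    """
    return {
        "x": raw_signal["nx"] * raw_signal["panel_w"],
        "y": raw_signal["ny"] * raw_signal["panel_h"],
        "type": raw_signal["type"],
    }

def processor_dispatch(event):
    """Choose a visual output on the display panel by touch-event type."""
    outputs = {"down": "highlight", "move": "drag", "up": "activate"}
    return outputs.get(event["type"], "ignore")

# One pass through the pipeline: raw signal -> coordinates -> output.
event = touch_controller(
    {"nx": 0.5, "ny": 0.25, "panel_w": 1080, "panel_h": 2400, "type": "down"})
action = processor_dispatch(event)
```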
The interface unit 516 is an interface for connecting an external device to the electronic apparatus 500. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 516 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 500 or may be used to transmit data between the electronic apparatus 500 and the external device.
The memory 518 may be used to store application programs and various data. The memory 518 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like, and the data storage area may store data created according to the use of the electronic device (such as audio data or a phonebook). Further, the memory 518 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 520 implements various functions of the electronic device 500 and processes data by running or executing applications and/or modules stored in the memory 518 and by invoking data stored in the memory 518, thereby performing overall monitoring of the electronic device 500. The processor 520 may include one or more processing units; the processor 520 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs, and the like, and the modem processor mainly handles wireless communication.
In the description herein, reference to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example," or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. An image sensor, comprising:
a color filter array including a plurality of sampling pixel groups, each of the sampling pixel groups including a plurality of pixel regions including a color pixel region and a white pixel region, the color pixel region surrounding the white pixel region;
the color pixel regions comprise at least two first pixel regions, at least one second pixel region and at least one third pixel region, wherein the first pixel region, the second pixel region and the third pixel region are configured to receive visible light in different wavelength ranges.
2. The image sensor of claim 1,
the first pixel region is a green pixel region;
the second pixel region is a red pixel region;
the third pixel region is a blue pixel region.
3. The image sensor of claim 1, wherein the sampling pixel group further comprises:
a first deep trench isolation arranged between two adjacent pixel regions.
4. The image sensor according to any one of claims 1 to 3,
wherein the plurality of sampling pixel groups are arranged contiguously in an array.
5. The image sensor of any of claims 1-3, wherein the sampling pixel group is a combined pixel group;
the color pixel region or the white pixel region includes a plurality of sub-pixels;
and a second deep trench isolation is arranged between every two adjacent sub-pixels.
6. The image sensor of claim 5,
each of the pixel regions or each of the sub-pixels is provided with a photosensitive circuit, a photoelectric conversion element, and a filter element, which are stacked in this order.
7. The image sensor of claim 6, further comprising:
a microlens layer disposed on the filter element;
wherein the microlens layer comprises one or more microlenses, and the plurality of microlenses are located over different pixel regions or different sub-pixels.
8. The image sensor of claim 6, wherein the color filter array further comprises:
a semiconductor layer for mounting at least one of the photosensitive circuit, the photoelectric conversion element, and the filter element.
9. A camera module, comprising:
a circuit board;
the image sensor of any of claims 1 to 8, electrically connected with the circuit board;
and a lens arranged on a side of the image sensor facing away from the circuit board.
10. An electronic device, comprising the camera module according to claim 9.
CN202011587866.9A 2020-12-28 2020-12-28 Image sensor, camera module and electronic equipment Pending CN112822466A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011587866.9A CN112822466A (en) 2020-12-28 2020-12-28 Image sensor, camera module and electronic equipment
PCT/CN2021/139889 WO2022143280A1 (en) 2020-12-28 2021-12-21 Image sensor, camera module, and electronic device


Publications (1)

Publication Number Publication Date
CN112822466A true CN112822466A (en) 2021-05-18

Family

ID=75854517

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011587866.9A Pending CN112822466A (en) 2020-12-28 2020-12-28 Image sensor, camera module and electronic equipment

Country Status (2)

Country Link
CN (1) CN112822466A (en)
WO (1) WO2022143280A1 (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101978498A (en) * 2008-02-08 2011-02-16 美商豪威科技股份有限公司 Backside illuminated image sensor having deep light reflective trenches
CN205984987U (en) * 2015-09-11 2017-02-22 半导体元件工业有限责任公司 Image sensor pel array, image sensor and imaging device
CN108281438A (en) * 2018-01-18 2018-07-13 德淮半导体有限公司 Imaging sensor and forming method thereof
CN111614886A (en) * 2020-05-15 2020-09-01 深圳市汇顶科技股份有限公司 Image sensor and electronic device
CN111726549A (en) * 2020-06-29 2020-09-29 深圳市汇顶科技股份有限公司 Image sensor, electronic device, and chip

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105611125B (en) * 2015-12-18 2018-04-10 广东欧珀移动通信有限公司 Imaging method, imaging device and electronic installation
CN112822466A (en) * 2020-12-28 2021-05-18 维沃移动通信有限公司 Image sensor, camera module and electronic equipment


Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022143280A1 (en) * 2020-12-28 2022-07-07 维沃移动通信有限公司 Image sensor, camera module, and electronic device
CN113676652A (en) * 2021-08-25 2021-11-19 维沃移动通信有限公司 Image sensor, control method, control device, electronic apparatus, and storage medium
CN113676651A (en) * 2021-08-25 2021-11-19 维沃移动通信有限公司 Image sensor, control method, control device, electronic apparatus, and storage medium
CN113674685A (en) * 2021-08-25 2021-11-19 维沃移动通信有限公司 Control method and device of pixel array, electronic equipment and readable storage medium
WO2023025229A1 (en) * 2021-08-25 2023-03-02 维沃移动通信有限公司 Image sensor, control method, control apparatus, electronic device, and storage medium
WO2023025232A1 (en) * 2021-08-25 2023-03-02 维沃移动通信有限公司 Control method and apparatus for pixel array, and electronic device and readable storage medium
WO2023025080A1 (en) * 2021-08-25 2023-03-02 维沃移动通信有限公司 Image sensor, control method, control apparatus, electronic device and storage medium
CN113674685B (en) * 2021-08-25 2023-02-24 维沃移动通信有限公司 Pixel array control method and device, electronic equipment and readable storage medium
CN113691716A (en) * 2021-08-26 2021-11-23 维沃移动通信有限公司 Image sensor, image processing method, image processing apparatus, electronic device, and storage medium
CN113691716B (en) * 2021-08-26 2023-07-11 维沃移动通信有限公司 Image sensor, image processing method, image processing device, electronic apparatus, and storage medium
CN113900307A (en) * 2021-10-21 2022-01-07 福建华佳彩有限公司 Camera LCD screen under screen
CN113973197A (en) * 2021-11-29 2022-01-25 维沃移动通信有限公司 Pixel structure, pixel array, image sensor and electronic equipment
CN113973197B (en) * 2021-11-29 2023-09-12 维沃移动通信有限公司 Pixel structure, pixel array, image sensor and electronic equipment
CN114205497A (en) * 2021-11-30 2022-03-18 维沃移动通信有限公司 Image sensor, camera module and electronic equipment
CN114125319A (en) * 2021-11-30 2022-03-01 维沃移动通信有限公司 Image sensor, camera module, image processing method and device and electronic equipment
CN114071035A (en) * 2021-11-30 2022-02-18 维沃移动通信有限公司 Image sensor, signal processing method and device, camera module and electronic equipment
CN113992862A (en) * 2021-11-30 2022-01-28 维沃移动通信有限公司 Image sensor, camera module, electronic equipment and pixel information acquisition method
WO2023179522A1 (en) * 2022-03-22 2023-09-28 维沃移动通信有限公司 Camera module and electronic device

Also Published As

Publication number Publication date
WO2022143280A1 (en) 2022-07-07


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination