CN114422730B - Image sensor and electronic device - Google Patents


Info

Publication number
CN114422730B
CN114422730B (granted publication; application CN202210050802.8A / CN202210050802A)
Authority
CN
China
Prior art keywords
light
image sensor
pixel unit
monochromatic
photodiode
Prior art date
Legal status
Active
Application number
CN202210050802.8A
Other languages
Chinese (zh)
Other versions
CN114422730A (en)
Inventor
刘义
何硕
Current Assignee
Huizhou TCL Mobile Communication Co Ltd
Original Assignee
Huizhou TCL Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Huizhou TCL Mobile Communication Co Ltd
Priority to CN202210050802.8A
Publication of CN114422730A
Application granted
Publication of CN114422730B


Classifications

    • H ELECTRICITY › H04 ELECTRIC COMMUNICATION TECHNIQUE › H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
        • H04N25/70 SSIS architectures; Circuits associated therewith
            • H04N25/76 Addressed sensors, e.g. MOS or CMOS sensors
            • H04N25/71 Charge-coupled device [CCD] sensors; Charge-transfer registers specially adapted for CCD sensors
    • H04N9/00 Details of colour television systems
        • H04N9/64 Circuits for processing colour signals
            • H04N9/73 Colour balance circuits, e.g. white balance circuits or colour temperature control
        • H04N9/77 Circuits for processing the brightness signal and the chrominance signal relative to each other, e.g. adjusting the phase of the brightness signal relative to the colour signal, correcting differential gain or differential phase

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Color Television Image Signal Generators (AREA)
  • Solid State Image Pick-Up Elements (AREA)

Abstract

The application discloses an image sensor and an electronic device. The image sensor includes a white pixel unit in which a beam-splitting prism, a gap layer, and a photodiode layer are sequentially arranged in a first direction. The beam-splitting prism includes a light incident surface and a light-emitting surface, and the photodiode layer includes a plurality of independent first photodiodes arranged in a second direction. Light enters from a first microlens, enters the prism through the light incident surface, and exits through the light-emitting surface; the prism disperses the light into monochromatic lights of different wavelengths, which separate while crossing the gap layer and are received by the first photodiodes arranged in the second direction. The photosensitivity of the image sensor is thereby greatly improved, along with the accuracy of automatic white balance and color restoration.

Description

Image sensor and electronic device
Technical Field
The present disclosure relates to the field of image sensors, and more particularly to an image sensor and an electronic device.
Background
An image sensor is a device that converts an optical signal into an electrical signal. By working principle, image sensors can be classified into CCD (Charge-Coupled Device) image sensors and CMOS (Complementary Metal-Oxide-Semiconductor) image sensors. Because a CMOS image sensor is manufactured with a conventional CMOS circuit process, it can be integrated together with the peripheral circuits it requires, so the CMOS image sensor has the wider application prospect.
At present, CMOS image sensors are widely used in electronic devices. To keep improving shooting performance in low light, the pixel size of the CMOS image sensor has been continuously increased to raise the amount of sensed light; however, an ever-larger pixel size is unfavorable for the overall design and stacking of the electronic device. Meanwhile, the accuracy of automatic white balance and color restoration remains one of the aspects of photography that needs improvement.
Disclosure of Invention
The embodiments of the application provide an image sensor and an electronic device that can greatly increase the amount of sensed light while improving the accuracy of automatic white balance and color restoration.
An embodiment of the application provides an image sensor comprising at least one white pixel unit and a first microlens located on the white pixel unit, wherein the white pixel unit is sequentially provided, in a first direction, with:
the light splitting prism comprises a light incident surface and a light emergent surface, wherein the light incident surface is positioned at one side close to the first micro lens, and the light emergent surface is positioned at one side far away from the first micro lens;
a gap layer;
a photodiode layer comprising a plurality of individual first photodiodes disposed in a second direction, the first direction being different from the second direction.
Wherein the light incident surface is a concave surface.
The light incident surface is rotatably connected to the beam-splitting prism body;
when the incident angles of the light entering the first microlens differ, the light incident surface is rotated so that the direction of the rays after the first refraction at the light incident surface remains unchanged.
The light-emitting surface is disposed obliquely, and its tilt angle is related to the wavelengths of the plurality of monochromatic lights that the beam-splitting prism can separate.
The light-emitting surface is disposed obliquely and is movably arranged on the beam-splitting prism body;
when the incident angles of the light entering the first microlens differ, the light-emitting surface is moved to adjust its tilt angle so that the emergent angle of any given monochromatic light after the second refraction at the light-emitting surface remains the same.
The cross-sectional area of the gap layer on the side close to the beam-splitting prism is smaller than its cross-sectional area on the side close to the photodiode layer, the two cross sections being parallel.
Wherein each of the plurality of independent first photodiodes is configured to receive a corresponding monochromatic light, and the wavelength of the monochromatic light that each first photodiode can receive increases or decreases monotonically along the second direction.
The image sensor further comprises at least one second pixel unit and a second micro lens positioned on the second pixel unit, wherein the first micro lens and the second micro lens are arranged on the same layer, the size of the white pixel unit is the same as that of the second pixel unit, and the second pixel unit is at least one of a red pixel unit, a green pixel unit or a blue pixel unit.
The second pixel unit comprises a filter of the corresponding color and a second photodiode arranged sequentially in the first direction, and the distance between the second photodiode and the second microlens is smaller than the distance between the gap layer and the first microlens.
An embodiment of the application further provides an electronic device comprising a camera, the camera comprising the above image sensor; the camera senses light with the image sensor to capture an image or a video.
The application provides an image sensor and an electronic device. The image sensor includes at least one white pixel unit and a first microlens located on the white pixel unit; the white pixel unit is sequentially provided, in a first direction, with a beam-splitting prism, a gap layer, and a photodiode layer. The beam-splitting prism includes a light incident surface and a light-emitting surface, and the photodiode layer includes a plurality of independent first photodiodes arranged in a second direction. Light enters from the first microlens through the light incident surface of the beam-splitting prism and exits through its light-emitting surface; the prism disperses the light into monochromatic lights of different wavelengths, which are received, after separation, by the first photodiodes arranged in the second direction. In this way, the white pixel unit in the image sensor of the embodiments can retain light of multiple spectra/wavelengths, greatly improving the photosensitivity of the image sensor and the accuracy of automatic white balance and color restoration.
Drawings
Technical solutions and other advantageous effects of the present application will be made apparent from the following detailed description of specific embodiments of the present application with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of an image sensor in the prior art.
Fig. 2 is a schematic diagram of a pixel arrangement of an image sensor in the prior art.
Fig. 3 is a schematic diagram of pixel arrangement of an image sensor according to an embodiment of the present application.
Fig. 4 is a schematic structural diagram of an image sensor according to an embodiment of the present application.
Fig. 5 is a schematic diagram of a beam-splitting prism of a white pixel unit according to an embodiment of the present application.
Fig. 6 is a schematic diagram of a gap layer and a photodiode layer provided in an embodiment of the present application.
Fig. 7 is a schematic view of refraction and reflection of light in a beam-splitting prism.
Fig. 8 is a schematic view of light after being emitted from the beam splitting prism.
FIG. 9 is a schematic diagram of a photodiode layer receiving corresponding monochromatic light.
Fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 11 is another schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The embodiments of the application provide an image sensor and an electronic device, the image sensor being applied to the electronic device. The electronic device may be a smartphone, a tablet, a wearable device, a PC, a robot, a dashcam, a smart peephole viewer, and the like. The electronic device comprises a camera in which the image sensor is integrated, and the camera shoots using the integrated image sensor.
Before describing the image sensor in the embodiment of the present application in detail, the principle of the current CMOS image sensor will be briefly described so as to facilitate understanding of the scheme in the embodiment of the present application. Fig. 1 is a schematic diagram of a structure of an image sensor in the prior art, and fig. 2 is a schematic diagram of a pixel arrangement in the prior art.
In fig. 1, the image sensor 100 includes a plurality of pixel units and a microlens 110 disposed over each pixel unit, each pixel unit including a color filter and a photodiode 120, the photodiode 120 being located at a side of the corresponding pixel unit away from the microlens 110. The filters may be a red filter 131, a green filter 132, or a blue filter 133, and the pixel units include a red pixel unit (R), a green pixel unit (G), and a blue pixel unit (B). The plurality of pixel units may be arranged as in fig. 2, and the arrangement shown in fig. 2 is also referred to as bayer (bayer) arrangement. A metal line 130 is also provided on the side of the photodiode remote from the microlens.
After passing through the microlens, the incident light is filtered by the filter of the corresponding color, and the photodiode then converts the received monochromatic light of that color into an electrical signal. A single pixel unit in fig. 1 can therefore receive only blue, green, or red light of one wavelength band; the color of the photographed object is finally restored by interpolating across the pixels of adjacent pixel units.
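The adjacent-pixel interpolation described here ("demosaicing") can be sketched in a few lines of Python. This is a minimal illustrative sketch, not the pipeline of any particular sensor; the function name `bayer_channel` and the tiny 4×4 mosaic values are invented for the example.

```python
def bayer_channel(mosaic, pattern, colour):
    """Reconstruct one colour plane from a Bayer mosaic.

    mosaic  : 2-D list of raw sensor values (one colour sensed per site)
    pattern : 2-D list of the same shape with 'R'/'G'/'B' per site
    colour  : which channel to reconstruct
    """
    h, w = len(mosaic), len(mosaic[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if pattern[y][x] == colour:
                out[y][x] = float(mosaic[y][x])  # sensed directly
                continue
            # average the samples of the wanted colour in the 3x3 neighbourhood
            vals = [mosaic[j][i]
                    for j in range(max(0, y - 1), min(h, y + 2))
                    for i in range(max(0, x - 1), min(w, x + 2))
                    if pattern[j][i] == colour]
            out[y][x] = sum(vals) / len(vals)
    return out

# 2x2 Bayer tile repeated: G R / B G
pattern = [list("GRGR"), list("BGBG"), list("GRGR"), list("BGBG")]
mosaic = [[10, 200, 10, 200],
          [50, 10, 50, 10],
          [10, 200, 10, 200],
          [50, 10, 50, 10]]
red = bayer_channel(mosaic, pattern, "R")
print(red[0][0])  # red at a green site, estimated from neighbours -> 200.0
```

Real demosaicing algorithms are edge-aware and far more elaborate, but the principle, estimating each pixel's missing colors from neighbors that sensed them, is the one described above.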
An embodiment of the application provides an image sensor comprising at least one white pixel unit and a first microlens located on the white pixel unit. The image sensor further comprises at least one second pixel unit and a second microlens located on the second pixel unit. The first and second microlenses are arranged in the same layer, and the white pixel unit has the same size as the second pixel unit, for example the same size in the second direction (horizontal direction). The second pixel unit can be at least one of a red, green, or blue pixel unit. The improvement of the embodiments lies mainly in the white pixel unit, which is described in detail below; the second pixel unit is identical in structure to the existing second pixel unit.
Note that the white pixel unit has no filter, so light of all bands/wavelengths in the incident light can enter it.
In an embodiment, the white pixel units and the second pixel units may be arranged alternately, e.g. one white pixel unit, then one second pixel unit, and so on. For example, one white pixel unit (W), one red pixel unit (R), one white pixel unit (W), one green pixel unit (G), one white pixel unit (W), one blue pixel unit (B), etc. may be provided.
In one embodiment, the second pixel unit may be a green pixel unit, because the human eye is relatively sensitive to green. Fig. 3 shows an arrangement of white pixel units and second pixel units (green pixel units) according to an embodiment of the present application, in which the green and white pixel units are arranged alternately.
It should be noted that the image sensor of the embodiments may adopt any other layout that includes at least one white pixel unit and at least one second pixel unit, the second pixel unit being at least one of a red, green, or blue pixel unit, and the pixel units may be arranged in any order among themselves. Since the second pixel unit can be any of the red, green, and blue pixel units, many combinations and arrangements are possible, which are not enumerated here.
The white pixel unit in the image sensor of the embodiment of the present application will be described in detail below.
Fig. 4 is a schematic structural diagram of an image sensor provided in an embodiment of the present application, fig. 5 is a schematic diagram of the beam-splitting prism of a white pixel unit, fig. 6 is a schematic diagram of the gap layer and the photodiode layer of the white pixel unit, fig. 7 is a schematic diagram of refraction and reflection of light in the beam-splitting prism, fig. 8 is a schematic diagram of light after being emitted from the beam-splitting prism, and fig. 9 is a schematic diagram of the photodiode layer receiving the corresponding monochromatic light.
Referring to fig. 4 to 6, the image sensor 200 includes at least one white pixel unit 210 and a first microlens 220 disposed on the white pixel unit. Wherein, the white pixel unit 210 is sequentially disposed in the first direction: a beam splitting prism 211, a gap layer 212, and a photodiode layer 213. Wherein the first direction may be a vertical direction as shown in fig. 4.
The beam-splitting prism 211 may be a glass prism or any other object that can disperse light. The beam-splitting prism 211 is configured to receive the incident light from the first microlens 220 (for example, natural light); after refraction, reflection, and splitting in the prism, monochromatic lights of multiple different wavelengths are obtained, i.e. the light is dispersed and emitted from the beam-splitting prism 211.
In one embodiment, the beam-splitting prism 211 includes a light incident surface 2111 and a light-emitting surface 2112. The light incident surface 2111 is located on the side close to the first microlens 220, and the light-emitting surface 2112 on the side far from the first microlens 220. The light incident surface 2111 receives the incident light from the first microlens 220; after refraction, reflection, and splitting, the resulting monochromatic lights of multiple different wavelengths exit from the light-emitting surface 2112.
The light entrance surface 2111 may be provided as a concave surface or a plane surface. In one embodiment, the light incident surface 2111 is provided as a concave surface. Since the first microlenses 220 have a condensing effect, light can be scattered by the concave surface.
In one embodiment, the curvature of the concave surface is related to the incident angle of the light entering the first microlens 220. For example, the curvature of the concave surface may be set such that, after the first refraction at the concave surface, the direction of the rays matches the incident angle at which the light originally entered the first microlens 220.
In an embodiment, the curvature of the concave surface may be the same as the curvature of the first microlens 220. In this way, the concave surface may be made to counteract the condensing action of the first microlenses 220.
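The cancellation can be illustrated with the single-surface refraction power P = (n₂ − n₁)/R: surfaces of equal and opposite radius contribute equal and opposite optical power. The index n = 1.5 and radius ±0.5 mm below are assumed illustrative values, not figures from the patent.

```python
def surface_power(n1, n2, radius_mm):
    """Optical power in dioptres of a single refracting surface.

    P = (n2 - n1) / R, with the radius converted from mm to metres.
    """
    return (n2 - n1) / (radius_mm / 1000.0)

# Convex microlens surface followed by a concave prism entry of equal |radius|:
p_lens = surface_power(1.0, 1.5, 0.5)    # converging, +1000 D
p_entry = surface_power(1.0, 1.5, -0.5)  # diverging, same curvature magnitude
print(p_lens + p_entry)  # -> 0.0 : the concave entry cancels the condensing
```

This thin-surface view ignores the spacing between the surfaces, but it captures why a concave entry with the same curvature as the microlens can neutralize its condensing action.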
In an embodiment, the concave curvatures of the light incident surfaces 2111 in different white pixel units may differ. It will be appreciated that the curvature of the concave light incident surface 2111 can be adjusted as required.
In one embodiment, the light incident surface 2111 is rotatably connected to the prism body. When the incident angles of the light incident on the first micro lens 220 are different, the light incident surface 2111 is rotated, so that the angle of the light after the light is refracted for the first time through the light incident surface 2111 remains unchanged. Please understand in conjunction with fig. 7.
It can be appreciated that, whatever the incident angle, the light incident surface 2111 is rotated so that the direction of the rays after the first refraction remains unchanged for incident light arriving at different angles. Since that direction is unchanged, the directions of the monochromatic lights leaving the light-emitting surface 2112 also remain unchanged, so neither the height of the beam-splitting prism nor the gap layer needs to be changed.
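A small numerical sketch of this idea, under simplifying assumptions (a single flat refracting face, Snell's law in 2-D, an assumed glass index n = 1.5): for each incidence direction we solve for the face tilt that keeps the refracted direction fixed. All names and numbers are illustrative, not values from the patent.

```python
import math

def refracted_direction(alpha, phi, n):
    """Direction (radians, measured from the sensor normal) of the ray after
    refraction at a face whose normal is tilted by `phi`."""
    theta_i = alpha - phi                       # angle of incidence to the face normal
    theta_t = math.asin(math.sin(theta_i) / n)  # Snell: sin(i) = n * sin(t)
    return phi + theta_t                        # back to the sensor frame

def face_tilt_for(alpha, n, target, lo=-1.0, hi=1.0):
    """Bisection for the face tilt that makes the refracted direction equal
    `target` for incidence direction `alpha` (the function is monotone in phi)."""
    for _ in range(60):
        mid = (lo + hi) / 2
        if refracted_direction(alpha, mid, n) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

n = 1.5
target = refracted_direction(math.radians(10), 0.0, n)  # reference geometry
phi = face_tilt_for(math.radians(25), n, target)        # face rotated for 25 deg light
print(math.degrees(refracted_direction(math.radians(25), phi, n)))  # matches target
```

The real entry face is concave rather than flat, but the mechanism is the same: rotating the face changes the local angle of incidence so the refracted direction inside the prism stays put.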
In an embodiment, the light-emitting surface 2112 is disposed obliquely, as shown in figs. 4 and 5, so that the monochromatic light emitted from it (after the second refraction) can reach the photodiode layer 213 perpendicularly or nearly perpendicularly. In this way, the photodiode layer 213 can be placed centrally, or nearly centrally, under the first microlens 220, which eases the overall design of the image sensor and avoids affecting the second pixel unit.
In one embodiment, the tilt angle of the light-emitting surface 2112 is related to the wavelengths of the monochromatic lights that the beam-splitting prism 211 can separate. For example, assuming the incident light is natural light, the tilt angle may be set such that, after refraction at the light-emitting surface 2112, the monochromatic light of the center wavelength exits perpendicular to the plane of the first microlens 220, i.e. perpendicular to the bottom photodiode layer 213. The center wavelength may be, for example, 550 nm, corresponding to green light.
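The wavelength dependence of the second refraction can be illustrated with Snell's law plus a Cauchy dispersion model. The coefficients below only roughly imitate a crown glass and the 30° face tilt is arbitrary; the point is simply that shorter wavelengths leave the exit face at larger angles, which is what spreads the colors.

```python
import math

def n_glass(wavelength_nm):
    """Cauchy dispersion model, n(lambda) = A + B / lambda^2.
    Coefficients are illustrative, roughly crown-glass-like."""
    lam_um = wavelength_nm / 1000.0
    return 1.5046 + 0.00420 / lam_um**2

def exit_angle(wavelength_nm, tilt_deg):
    """Refraction angle (degrees, from the exit-face normal) when the
    in-prism ray meets an exit face tilted by `tilt_deg` from the ray."""
    theta_i = math.radians(tilt_deg)
    # Snell at glass -> air: n * sin(theta_i) = sin(theta_t)
    return math.degrees(math.asin(n_glass(wavelength_nm) * math.sin(theta_i)))

for lam in (450, 550, 650):  # blue, green (centre), red
    print(lam, round(exit_angle(lam, 30.0), 3))
```

Blue (450 nm) sees a higher index than red (650 nm), so it is bent through a larger angle; the tilt of the real surface would then be chosen so the 550 nm ray heads straight down to the photodiode layer.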
In an embodiment, the light-emitting surface 2112 is disposed obliquely and is movably arranged on the prism body. When the incident angles of the light entering the first microlens 220 differ, the light-emitting surface 2112 is moved to adjust its tilt angle so that the emergent angle of any given monochromatic light after the second refraction remains the same. Because that emergent angle is unchanged, neither the height of the beam-splitting prism nor the gap layer needs to be changed.
It can be appreciated that the embodiments provide two ways to keep the emergent angle of a given monochromatic light after the second refraction the same: the first is to rotate the light incident surface 2111, the second is to move the light-emitting surface 2112. In an embodiment, both surfaces may be adjusted simultaneously to the same effect.
The light incident surface 2111 and the light-emitting surface 2112 are disposed opposite each other, and the height of the beam-splitting prism 211 is set such that the incident light from the first microlens 220, after the first refraction at the light incident surface 2111 and total internal reflection at one side surface of the prism 211, undergoes exactly the second refraction at the light-emitting surface 2112 and then exits.
It should be noted that in all the above adjustments, consideration is also given to avoiding the problem of total reflection on the light-emitting surface 2112.
Referring to fig. 4, the gap layer 212 may include an air gap layer or a vacuum gap layer, or may be another gap layer with a uniform medium for light transmission. The plurality of monochromatic lights emitted from the light-emitting surface 2112 are transmitted through the gap layer 212, and after a certain distance, the monochromatic lights of different wavelengths/wavelength bands/spectrums are separated.
It will be appreciated that several kinds of monochromatic light leave every point of the light-emitting surface 2112 at which refraction occurs; the refraction angle differs between kinds of monochromatic light but is the same for any one kind. After the light has traveled a certain distance, the monochromatic lights leaving each refraction point therefore separate: for example, red and green monochromatic light separate, and the red light emitted from every point converges into the same region. The gap layer 212 thus transmits the monochromatic lights over a distance sufficient for them to be separated when they reach the bottom photodiode layer 213, so that the photodiodes in the photodiode layer 213 can each independently receive the corresponding monochromatic light. See figs. 8 and 9.
The cross-sectional area of the gap layer 212 on the side close to the beam-splitting prism 211 is smaller than its cross-sectional area on the side close to the photodiode layer 213, the two cross sections being parallel. As can be seen from fig. 4, the gap layer 212 widens from the side near the beam-splitting prism 211 to the side near the photodiode layer 213.
Wherein the height of the gap layer 212 is set such that a plurality of monochromatic lights propagating in the gap layer are separated at a side of the gap layer 212 near the photodiode layer 213.
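How tall the gap layer must be follows from simple geometry: a ray drifts sideways by h·tan(θ) while crossing a gap of height h, so two rays whose exit angles differ need h ≥ pitch / Δ(tan θ) before they land on different diodes. The numbers below (1 µm diode pitch, exit angles of 20° and 22°) are invented for illustration, not taken from the patent.

```python
import math

def lateral_offset(gap_height_um, exit_angle_deg):
    """Horizontal displacement accumulated while a ray crosses the gap."""
    return gap_height_um * math.tan(math.radians(exit_angle_deg))

def min_gap_height(pitch_um, angle_a_deg, angle_b_deg):
    """Smallest gap height at which two monochromatic rays leaving the same
    point diverge by one photodiode pitch, i.e. land on different diodes."""
    spread = abs(math.tan(math.radians(angle_a_deg)) -
                 math.tan(math.radians(angle_b_deg)))
    return pitch_um / spread

# e.g. two monochromatic rays leave the exit face 2 degrees apart; diodes 1 um wide
h = min_gap_height(1.0, 20.0, 22.0)
print(round(h, 1))  # gap height in micrometres
```

A larger angular spread at the exit face (stronger dispersion or a steeper tilt) permits a shorter gap layer, which is the trade-off the gap-height setting resolves.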
The photodiode layer 213 comprises a plurality of independent, spaced first photodiodes arranged in a second direction, as shown in fig. 8, the first direction being different from the second direction. For example, the first direction is the vertical direction shown in fig. 4 and the second direction the horizontal direction shown in fig. 4.
The length of the photodiode layer 213 in the second direction is greater than the length of the side of the gap layer 212 close to the photodiode layer 213, which ensures that all the monochromatic lights from the gap layer 212 are received by the first photodiodes in the photodiode layer 213 rather than escaping from it and affecting other devices of the image sensor.
Each of the plurality of independent first photodiodes in the photodiode layer 213 is configured to receive a corresponding monochromatic light, and the wavelength of the monochromatic light that each first photodiode can receive increases or decreases monotonically along the second direction: if the second direction points horizontally to the right, the receivable wavelength increases diode by diode, and if it points horizontally to the left, the wavelength decreases. For example, from left to right in fig. 8, the individual first photodiodes may receive violet, blue, cyan, green, yellow, orange, and red light respectively, in which case there are seven first photodiodes. In other embodiments the light may be divided into more or fewer bands, with correspondingly more or fewer first photodiodes. As shown in fig. 8, the leftmost two rays are blue light, the middle two rays are yellow light, and the rightmost two rays are red light.
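The wavelength-ordered assignment of diodes can be modeled as a nearest-band lookup. The seven band centers below are illustrative guesses for the example, not values from the patent:

```python
# Seven photodiodes, band-centre wavelength (nm) increasing along the second direction
BANDS = [("violet", 400), ("blue", 460), ("cyan", 500), ("green", 550),
         ("yellow", 580), ("orange", 610), ("red", 660)]

def diode_for_wavelength(wavelength_nm):
    """Index of the first photodiode whose band centre is nearest the wavelength."""
    return min(range(len(BANDS)),
               key=lambda i: abs(BANDS[i][1] - wavelength_nm))

i = diode_for_wavelength(545)
print(i, BANDS[i][0])  # 545 nm lands on the green diode
```

Because the band centers are sorted, the diode index is itself monotone in wavelength, matching the "sequentially increasing or decreasing" arrangement described above.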
Thus, after passing through the first microlens, incident light is refracted on entering the beam-splitting prism of the white pixel unit, totally reflected at a side wall of the prism, and then enters the gap layer from the bottom of the prism. After traveling a certain distance in the gap layer, the monochromatic lights of different spectra/wavelengths/bands separate before reaching the bottom photodiode layer, and each first photodiode in the photodiode layer independently receives its corresponding monochromatic light. In this way, the white pixel unit can sense full-band monochromatic light for back-end image processing. Because the white pixel unit used by the image sensor of the application senses monochromatic light across the full band/full spectrum, the photosensitivity of the image sensor is greatly enhanced at the same pixel size, while the precision of color, the accuracy of color restoration, and the precision of automatic white balance are all improved.
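As a hint of how the back end can exploit richer spectral data, here is the classic grey-world automatic white balance in a few lines. The per-channel scene means are invented example numbers, and this sketch is not the method claimed by the patent:

```python
def gray_world_gains(means_rgb):
    """Per-channel gains that equalise the channel means (grey-world AWB):
    assume the scene averages to grey, and scale each channel to the mean."""
    target = sum(means_rgb) / 3.0
    return [target / m for m in means_rgb]

def apply_gains(pixel, gains):
    """Scale one RGB pixel by the per-channel gains."""
    return [v * g for v, g in zip(pixel, gains)]

gains = gray_world_gains([120.0, 100.0, 80.0])   # scene averages per channel
print([round(g, 3) for g in gains])              # -> [0.833, 1.0, 1.25]
balanced = apply_gains([120.0, 100.0, 80.0], gains)
print([round(v, 6) for v in balanced])           # the average becomes grey
```

With full-spectrum samples from the white pixel units, the illuminant estimate feeding such gains can be made per-band rather than per-RGB-channel, which is where the claimed accuracy gain would come from.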
As shown in fig. 4, the image sensor in the embodiment of the present application includes at least one second pixel unit 230 and a second microlens 240 located above the second pixel unit 230, in addition to the white pixel unit 210 and the first microlens 220 located above the white pixel unit 210. The first microlenses 220 and the second microlenses 240 have the same size and the same function, and the first microlenses 220 and the second microlenses 240 are arranged in the same layer. Correspondingly, the white pixel unit 210 and the second pixel unit 230 have the same size. The second pixel unit is at least one of a red pixel unit, a green pixel unit or a blue pixel unit.
In fig. 4, the left and right sides of the white pixel unit 210 are both second pixel units, such as both green pixel units.
Each of the second pixel units 230 includes a filter 231 of a corresponding color, such as a green filter, and a second photodiode 232.
In an embodiment, the distance between the second photodiode 232 and the second microlens 240 is smaller than the distance between the gap layer 212 and the first microlens 220, so that the portion of the white pixel unit 210 below the gap layer does not affect the second pixel unit 230.
In an embodiment, the distance between the second photodiode 232 and the second microlens 240 is smaller than the distance between the photodiode layer 213 and the first microlens 220, so that the photodiode layer 213 of the white pixel unit 210 does not affect the second pixel unit 230.
As shown in fig. 9, an embodiment of the present application further provides an electronic device, where the electronic device 300 includes a camera 310, and the camera 310 includes the image sensor described in any of the foregoing embodiments. The camera 310 captures an image or video using the light sensed by the image sensor.
In one embodiment, as shown in FIG. 9, the electronic device further includes a processor 320 and a memory 330, with the processor 320 electrically connected to the memory 330 and the camera 310. The processor 320 is the control center of the electronic device 300: it connects the various parts of the device through various interfaces and lines, and performs the device's functions and processes its data by running or loading applications stored in the memory 330 and invoking data stored in the memory 330, thereby monitoring the device as a whole. The memory 330 also stores the images or video data captured by the camera 310. The memory 330 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 330 may further include memory located remotely from the processor 320, connected to the electronic device 300 via a network.
In one embodiment, as shown in fig. 10, in addition to the camera 410 (which includes the image sensor described in any of the above embodiments), the processor 420, and the memory 430, the electronic device 400 may include the following modules and/or units.
The radio frequency (RF) circuit 440 is configured to receive and transmit electromagnetic waves and to convert between electromagnetic waves and electrical signals, so as to communicate with a communication network or other devices. The RF circuit 440 may include various circuit elements for performing these functions, such as an antenna, a radio frequency transceiver, a digital signal processor, an encryption/decryption chip, a subscriber identity module (SIM) card, memory, and the like. The RF circuit 440 may communicate with networks such as the Internet, an intranet, or a wireless network, or with other devices over a wireless network. The wireless network may include a cellular telephone network, a wireless local area network, or a metropolitan area network, and may employ various communication standards, protocols, and technologies, including ones not yet developed at the time of filing.
The input unit 450 may be used to receive input numeric or character information and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control. In particular, the input unit 450 may include a touch-sensitive surface 451 as well as other input devices 452. The touch-sensitive surface 451, also referred to as a touch display screen (touch screen) or a touch pad, can collect touch operations by a user on or near it (such as operations performed on or near the touch-sensitive surface 451 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connected devices according to a preset program. The other input devices 452 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys), a trackball, a mouse, a joystick, and the like.
The display unit 460 may be used to display information entered by or provided to the user, as well as the various graphical user interfaces of the electronic device 400, which may be composed of graphics, text, icons, video, and any combination thereof. The display unit 460 may include a display panel 461; optionally, the display panel 461 may be configured as an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode) display, or the like. Further, the touch-sensitive surface 451 may cover the display panel 461; when the touch-sensitive surface 451 detects a touch operation on or near it, the operation is passed to the processor 420 to determine the type of touch event, and the processor 420 then provides a corresponding visual output on the display panel 461 according to that type. Although in the figures the touch-sensitive surface 451 and the display panel 461 are shown as two separate components implementing the input and output functions, the touch-sensitive surface 451 may instead be integrated with the display panel 461 to implement both functions.
The electronic device 400 may also include at least one sensor 470, such as a light sensor, an orientation sensor, a proximity sensor, and other sensors. As one kind of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally along three axes) and, when the device is stationary, the magnitude and direction of gravity. It can be used in applications that recognize the attitude of the phone (such as landscape/portrait switching, related games, and magnetometer attitude calibration) and in vibration-recognition functions (such as a pedometer or tap detection). Other sensors that may be configured in the electronic device 400, such as a gyroscope, barometer, hygrometer, thermometer, or infrared sensor, are not described in detail here.
The audio circuit 480, speaker 481, and microphone 482 may provide an audio interface between the user and the electronic device 400. On one hand, the audio circuit 480 may transmit the electrical signal converted from received audio data to the speaker 481, which converts it into a sound signal for output. On the other hand, the microphone 482 converts collected sound signals into electrical signals, which the audio circuit 480 receives and converts into audio data; the audio data is processed by the processor 420 and then transmitted, for example, to another electronic device via the RF circuit 440, or output to the memory 430 for further processing. The audio circuit 480 may also include an earphone jack to allow a peripheral earphone to communicate with the electronic device 400.
The electronic device 400 may help the user receive and send information via the transmission module 490 (e.g., a Wi-Fi module), which provides the user with wireless broadband Internet access. Although the transmission module 490 is illustrated, it is not an essential component of the electronic device 400 and may be omitted as desired without changing the essence of the invention.
The electronic device 400 also includes a power supply 491 (e.g., a battery) that powers the various components. In some embodiments, the power supply 491 may be logically connected to the processor 420 through a power management system, so that charging, discharging, and power-consumption management are handled by the power management system. The power supply 491 may also include one or more of a direct current or alternating current source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
The electronic device 400 may further include a bluetooth module, an infrared module, etc., which are not described herein.
The image sensor and the electronic device provided by the embodiments of the present application have been described in detail above. Specific examples are used herein to illustrate the principles and implementations of the present application, and the description of the above embodiments is intended only to help understand the method of the present application and its core idea. Meanwhile, those skilled in the art may make changes to the specific implementations and the scope of application according to the idea of the present application. In summary, the content of this description should not be construed as limiting the present application.

Claims (8)

1. An image sensor, characterized by comprising at least one white pixel unit and a first micro lens located on the white pixel unit, wherein the white pixel unit is sequentially provided, along a first direction, with:
a light-splitting prism, comprising a light entrance surface and a light exit surface, wherein the light entrance surface is located on the side close to the first micro lens, the light exit surface is located on the side far from the first micro lens, and the light entrance surface is rotatably connected to the light-splitting prism body; when the incident angles of the incident light entering the first micro lens differ, the light entrance surface is rotated so that the angle of the light rays after the incident light is refracted for the first time at the light entrance surface remains unchanged; the light entrance surface is a concave surface, and the curvature of the concave surface is the same as that of the first micro lens;
a gap layer, which is a light-transmitting gap layer of a uniform medium; the monochromatic lights emitted from the light exit surface propagate in the gap layer, and after propagating a certain distance, monochromatic lights of different wavelengths/bands/spectra separate from one another;
a photodiode layer comprising a plurality of individual first photodiodes arranged along a second direction, the first direction being different from the second direction.
2. The image sensor of claim 1, wherein the light exit surface is disposed obliquely, and the inclination angle of the light exit surface is related to the wavelengths of the plurality of monochromatic lights that the light-splitting prism can separate.
3. The image sensor of claim 1, wherein the light exit surface is disposed obliquely, and the light exit surface is movably disposed on the prism body;
when the incident angles of the incident light entering the first micro lens differ, the light exit surface is moved to adjust its inclination angle so that the exit angle of the same monochromatic light after the second refraction at the light exit surface remains the same.
4. The image sensor of claim 1, wherein an area of a first cross section of the gap layer corresponding to a side near the beam splitting prism is smaller than an area of a second cross section corresponding to a side near the photodiode layer, the first cross section and the second cross section being parallel.
5. The image sensor of claim 1, wherein each of the plurality of individual first photodiodes is configured to receive a corresponding one of the monochromatic lights, and the wavelengths of the monochromatic lights received by the first photodiodes increase or decrease sequentially along the second direction.
6. The image sensor of claim 1, further comprising at least one second pixel cell and a second microlens on the second pixel cell, the first and second microlenses being co-layered, the white pixel cell being the same size as the second pixel cell, the second pixel cell being at least one of a red pixel cell, a green pixel cell, or a blue pixel cell.
7. The image sensor of claim 6, wherein the second pixel unit includes a filter of a corresponding color and a second photodiode sequentially disposed in a first direction, the second photodiode being spaced from the second microlens by a distance less than a distance of the gap layer from the first microlens.
8. An electronic device, characterized by comprising a camera, wherein the camera comprises the image sensor according to any one of claims 1 to 7, and the camera captures an image or a video using light sensed by the image sensor.
CN202210050802.8A 2022-01-17 2022-01-17 Image sensor and electronic device Active CN114422730B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210050802.8A CN114422730B (en) 2022-01-17 2022-01-17 Image sensor and electronic device

Publications (2)

Publication Number Publication Date
CN114422730A CN114422730A (en) 2022-04-29
CN114422730B true CN114422730B (en) 2024-03-19

Family

ID=81274002

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210050802.8A Active CN114422730B (en) 2022-01-17 2022-01-17 Image sensor and electronic device

Country Status (1)

Country Link
CN (1) CN114422730B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115278127A (en) * 2022-07-25 2022-11-01 Oppo广东移动通信有限公司 Image sensor, camera and electronic device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11313334A (en) * 1998-04-27 1999-11-09 Nippon Hoso Kyokai <Nhk> Solid-state image pickup device
KR20100052838A (en) * 2008-11-11 2010-05-20 주식회사 동부하이텍 Cmos image sensor and method for fabricating of the same
CN102510447A (en) * 2011-09-28 2012-06-20 上海宏力半导体制造有限公司 Image sensor
CN107613182A (en) * 2017-10-27 2018-01-19 北京小米移动软件有限公司 Camera photosensory assembly, camera and camera shooting terminal
CN111614878A (en) * 2020-05-26 2020-09-01 维沃移动通信(杭州)有限公司 Pixel unit, photoelectric sensor, camera module and electronic equipment
CN111739900A (en) * 2020-07-28 2020-10-02 深圳市汇顶科技股份有限公司 Image sensor, image sensitization method, chip and electronic equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20210028808A (en) * 2019-09-04 2021-03-15 삼성전자주식회사 Image sensor and imaging apparatus having the same

Also Published As

Publication number Publication date
CN114422730A (en) 2022-04-29


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant